But behind the shine, the purpose of Facebook’s experiment is anything but genuine accountability. The board is a vehicle that gives Facebook cover to pursue profitable but socially harmful outcomes, outcomes that would face public reversal if they were reached more transparently.
The trick is simple. Facebook faces a two-sided incentive problem: dangerous and socially objectionable content is genuinely valuable to its bottom line, yet the company is publicly committed to maintaining a safe and socially responsible community. To escape this double bind, it constructed the Oversight Board. Overseen by a quasi-judicial body with the appearance of neutrality, Facebook earns public goodwill and deflects criticism of lax content moderation. But in designing the body’s structure, Facebook has all but guaranteed the financially beneficial outcome: maximum content, even dangerous and harmful content, left online. The result is a win-win for Facebook. The platform offloads controversial decisions onto the external body, shedding social culpability. Already, the board is showing its true colors.
“Engagement” is the holy grail of social media and digital advertising. For Facebook, more engagement means more advertising dollars, which is how it makes most of its money. But false and/or hateful content often drives the most clicks. Among political content, far-right content draws far more engagement than centrist material; within the far-right category, misinformation draws the greatest engagement of all. The most extreme, and most deceitful, content is the most valuable to Facebook. Nor is this unique to political content; pornographic, abusive, or inaccurate content of other kinds likewise appeals to our distracted, social-media-addled brains. Mark Zuckerberg himself once made a handy chart admitting as much.
The public already recognizes this and is demanding stricter moderation. Even before the 2020 election’s “big lie” and the violent insurrection that followed, 78% of American adults held the platforms wholly or partially responsible for the dissemination of false and objectionable content on their sites, and 87% felt the platforms had a duty to remove false content at least some of the time (65% said “always”). Reflecting this mandate, a clear majority considers the platforms “not tough enough” on content moderation. Most critically, this is not cheap talk from the public; it has already begun to hit Facebook’s bottom line in a big way: advertiser boycotts, user attrition, and regulatory and legal scrutiny.
The board answers both incentives. It offers the appearance of independence, yet it is constructed to produce a predictable result: pushing Facebook to leave more problematic content online. This structural tilt is already evident in the board’s early decisions.
The board’s operation mimics an Anglo-American appellate court and imports the principles of public law. Nearly all of its members are constitutional or human rights lawyers. Three of its four co-chairs are constitutional lawyers; two are from the US, currently home to the most speech-protective jurisprudence in the history of the world. Conspicuously absent are scientists or economists; Facebook wants the benefit of speech-protective legal principles, not a sober accounting of the externalized harms of dangerous speech.
Most critically, the board’s jurisdiction makes it a one-way ratchet: it can review Facebook’s decisions to take content down, but not decisions to leave content up. In effect, the only affirmative action the board can take is to order the restoration of content that Facebook has already deemed objectionable.
The effects of this asymmetry could not be more obvious. Many observers track how often the board disagrees with Facebook, treating disagreement as a key indicator of independence. But if it reviews only content Facebook has already removed, the board can demonstrate its “independence” only by restoring content, that is, by pushing Facebook toward its own financial interests. So we should not be surprised that Facebook has been receptive to the board’s early assertions of independence. Accepting them is the ultimate win-win for Facebook: it restores valuable content while reinforcing the narrative that it is committed to independent oversight.
The rulings read like a caricature of US constitutional law, particularly the medical-misinformation opinion. It offers two main justifications, both drawn from First Amendment principles: vagueness in the relevant Facebook policy, and the insufficiently “imminent” harm posed by the misinformation. No surprise there: constitutional lawyers convened in a court-like institution are employing familiar legal norms.
But their application here is dubious. “Imminence” is a speech-protective standard developed to keep political dissidents out of prison for criticizing the government; yet the drug at issue, touted as an off-label COVID-19 treatment without FDA endorsement, is already responsible for deaths and serious injuries. And vagueness challenges, which exist to prevent arbitrary enforcement, can become so sweeping that the Supreme Court recently confirmed that speakers whose own conduct can constitutionally be regulated, like the poster here, cannot raise them. Nor does the board place any weight on public sentiment: 85% of Americans believe platforms should never allow misleading medical information.
As designed, the Oversight Board is dangerous. It shields Facebook from the public criticism that might force meaningful progress on content moderation, and it hands political actors a powerful answer to calls for regulation: let our “legal experiment” play out before considering state action. Worse still, it will force regression in Facebook’s already lax moderation policies, even as the US and the world edge closer to what Barack Obama recently warned of: a post-truth dystopia.