By: Roman Leal, YLS ‘22

A recent decision from the Israeli High Court of Justice highlights how the newly created Facebook Oversight Board might empower governments to censor speech while avoiding accountability. The High Court’s decision, handed down on April 12th, considered the legality of a practice by the Cyber Unit of the Israel State Attorney’s Office. When the Cyber Unit identifies social media content it considers illegal under Israeli law, it sends a takedown request to the platform. Content is usually referred to the Cyber Unit by Israeli intelligence agencies, and platforms comply with most Cyber Unit requests. Civil rights groups brought a challenge to the takedown requests, claiming that they infringe the constitutional right of free expression. In rejecting the civil rights groups’ claim, the High Court emphasized that the decision to remove content was ultimately made by the platforms themselves. The Court also suggested that the Facebook Oversight Board was an alternative forum to bring the claim. It noted, “the ability to appeal takedowns before the board provides a remedy for rights infringements that might have occurred.” But by issuing its decision, the Court not only closed its doors to the claim—it also likely precluded the Oversight Board’s review of the matter. The Oversight Board is barred from reviewing content takedowns required by local law. Because the Oversight Board does not have the power to authoritatively determine when local law requires content to be removed, governments may shape the Board’s docket in pernicious and invisible ways.

Background on the Facebook Oversight Board

In November 2018, Mark Zuckerberg shared his vision for an “independent body” which would review Facebook’s content moderation practices. Facebook was emerging from its “hardest year”—the Cambridge Analytica scandal had transformed the company from a Silicon Valley darling into a dystopian threat to democracy. Amidst calls for regulatory intervention, Zuckerberg announced a reviewing body that would “create a new way for people to appeal content decisions” and “give people a voice” in the way Facebook balances competing principles of “free expression” and “keeping people safe.” Zuckerberg imagined a review board “almost like a Supreme Court” which could “make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”

Zuckerberg’s vision became a reality in 2019 when Facebook released a charter for the Facebook Oversight Board. Today, the Facebook Oversight Board hears appeals from Facebook users around the world. A star-studded group of Board members with diverse areas of expertise and multinational backgrounds sit in judgement of Facebook’s decisions to remove content from the platform. The Board’s judgements are binding on Facebook; if the Board orders Facebook to restore content, Facebook must comply. Some commentators have seized on limitations in the Board’s power to review Facebook’s algorithms in describing it as “toothless” and a “fig leaf for Facebook’s failures.” Others have cautiously described the Board as a small but positive step in the right direction. Recently, the Oversight Board announced that it will also review Facebook’s decisions to let content remain on the platform—a significant expansion of the Board’s power.

Limits to the Oversight Board’s Jurisdiction

The jurisdiction of the Facebook Oversight Board is fundamentally shaped by the laws of countries in which Facebook operates. In its bylaws, the Board is precluded from reviewing Facebook takedown decisions “made on reports…pursuant to legal obligations.” And cases are not available for Board review “where the underlying content is unlawful in a jurisdiction with a connection to the content (such as the jurisdiction of the posting party and/or the reporting party) and where a board decision to allow the content on the platform could lead to adverse governmental action against Facebook.”

That is, if someone in a country with restrictive speech laws simply reports a post, it may be precluded from Board review if Facebook’s legal department believes that keeping the post up creates a risk of “adverse governmental action.” And the “legal specialists” who review these cases are employees of Facebook—not the Board. Further, even if a Facebook takedown decision does make it before the Board, the Facebook legal department can still block a Board decision to reinstate content. If Facebook’s lawyers think Facebook is under a “legal obligation” to keep blocking the content, the Board cannot force Facebook to put the content back in place. So, Facebook’s lawyers have a veto over (1) whether certain removal decisions can come before the Board in the first place, and (2) whether the Board’s opinion will be binding on Facebook. And Facebook is not under any obligation to explain its legal opinions in these matters.

Implications of the Israeli High Court’s Decision

Following the High Court’s decision, it is unlikely that the legal determinations of the Cyber Unit will be subject to any independent review. A takedown request from the Cyber Unit would go to the Facebook legal department, which would determine whether Facebook was bound to comply with the request. Facebook’s lawyers have a duty to Facebook—not to the user who posted the content and not to the Oversight Board. Accordingly, their objective is to avoid liability for Facebook. And, as legal scholars have noted, “a platform facing liability will predictably protect itself by removing too much content: It throws the baby out with the bathwater.” Facebook has little incentive to resist a Cyber Unit request; doing so would mean risking a violation of Israeli law. This inclination will be even sharper if Facebook’s legal department lacks expertise in Israeli law. And once Facebook has determined that it has a legal obligation to remove content, the Oversight Board is entirely preempted.

The High Court determined that there was not a sufficient connection between the Cyber Unit’s requests and Facebook’s removal of protected expression. Essentially, it reasoned that Facebook was ultimately responsible for taking down posts. However, the Court suggested the case might have come out differently if Facebook itself had been a party to the proceeding. The Cyber Unit claimed that Facebook retains full discretion to determine whether the content at issue is illegal under Israeli law. Facebook was not there to dispute that claim. But it’s not clear that Facebook would do so if it had the opportunity. Making that argument would require Facebook to admit that it over-complies with government requests to avoid liability. And if Facebook doesn’t make that claim, users would likely be incapable of making it themselves. A user would be unlikely to know that their post had been removed pursuant to a Cyber Unit request. Facebook promises to explain takedown decisions based on its community guidelines, but it guarantees no such explanation for content removed for illegality.

Conclusion

Governments coopting private platforms to censor content isn’t new. But the Israeli High Court’s opinion suggests that the Facebook Oversight Board may further obscure that cooptation. The Oversight Board lends a degree of legitimacy to Facebook’s moderation decisions. The Board appears to make Facebook accountable through a transparent review process which relieves public pressure for the government to intervene with overt regulation. However, because Facebook’s Supreme Court can’t say what the law is, its power is shaped by governments in invisible ways. Ultimately, the Oversight Board may provide a distraction from government attempts to censor speech while avoiding accountability.