
How to Make Facebook’s ‘Supreme Court’ Work

The idea of a body that will decide what kind of content is allowed on the site is promising — but only if it’s done right.


Kate Klonick and Thomas Kadri

Dr. Klonick and Mr. Kadri are lawyers.

Facebook’s chief executive, Mark Zuckerberg, has announced a plan to create an independent body to make decisions about what kinds of content his site’s users will be allowed to post. It’s a development that he hinted at when he said on a podcast in April, “You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”

It appears that this “Supreme Court” idea is now going to be carried out — and the independence, transparency, accountability and oversight Mr. Zuckerberg described in announcing on Thursday that such a body would be created within a year all sound promising. But details so far are scarce, and what this really means for free speech and fair process on the internet will depend on the answer to one key question: How much will the “Supreme Court of Facebook” be like the Supreme Court of the United States?

In theory, a court has at least three virtues. The first is due process: such a court can allow people to argue that mistakes have been made, and the court can then publicly explain its final decision. The second virtue is representation: the justices can represent different segments of society, bringing diverse perspectives and expertise to the difficult questions that they must answer. The third virtue is independence: while a legislature debates and passes laws, a court can be insulated from this political process when it interprets those laws and resolves competing legal claims.

There are good reasons to doubt whether even the United States Supreme Court is an institution that lives up to these ideals. But at the very least there are structures in place that aim to promote these values on the court. On Thursday, Mr. Zuckerberg admitted that Facebook is still figuring out how its panel will function. “Starting today,” he said, “we’re beginning a consultation period to address the hardest questions” about “how this will work in practice.” As Facebook — which operates in many ways like a government when it determines what kinds of speech are allowed — does so, it should pay attention to the government that already exists.

Questions abound about how this new tribunal will protect due process. For one, in trying to sniff out when mistakes have been made, what “record” of evidence will Facebook’s justices consider? In our legal system, the record is developed in the lower courts long before the Supreme Court hears the appeal, and the justices don’t get to do their own fact-finding to figure out what happened in the case before them.

But one of the greatest challenges posed by moderation of speech on Facebook is how heavily these decisions depend on context; the difference between a racist slur and a rap lyric, for example, might turn on the speaker’s identity, her motivations, her audience. These challenges become even more complex in a global context in which moderators must account for different languages and slang; for different historical, cultural and political divides; and for different power structures — all of which might color the social meaning of the speech.

What does this mean for Facebook’s new oversight council? In short, you need real-world evidence to get real-world context. The Silicon Valley “justices” adjudicating questions of online speech will need fact-finding powers that we don’t give to their counterparts in Washington. If all that they review is the same lifeless screenshot seen by Facebook’s own moderators, it’s tough to see how they’ll be in a much better position to correct mistakes. That may give Facebook’s users more process, but is that really the process they are due?

Then there’s the issue of representation: Who will Facebook’s justices be, and how will they be chosen? As we saw with the recent confirmation battle over Brett Kavanaugh, the identity of our judges matters a great deal. We subject nominees to extreme scrutiny, and the Constitution splits the authority to appoint justices between two branches of government to offer at least some check on that awesome power.

Given the diversity of Facebook’s community, the body should be international, represent multiple stakeholders, and include voices from groups that are targets of hate speech and harassment on social media. But that still leaves the problem of “who.” A 100-person panel might represent Facebook’s diverse community better than the nine-justice Supreme Court represents America, but such a vast institution would surely struggle to deliberate and decide on the questions before it. Still, Facebook’s council can’t hope to gain legitimacy if it doesn’t represent a broad array of viewpoints and perspectives.

Finally, in considering the issue of independence, Facebook can learn a few things from our nation’s highest court. The Supreme Court is part of something greater — it is the third branch in a system that creates a separation of powers. Congress makes laws, the court interprets them, and the executive enforces them. Here, Facebook would, in a sense, still be playing the role of both Congress and the executive. But the court’s role in checking congressional and executive actions is crucial, and it works only because the court is independent from the other branches. The justices have broad discretion in deciding which cases to hear, and justices are appointed for life in an attempt to shield them from the pressures of outside politics.

The new council imagined by Facebook feels like an attempt to create this type of independent body. Indeed, Mr. Zuckerberg says that Facebook is creating this organization to “prevent the concentration of too much decision-making within our teams” and to “provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.” But such a tribunal is also in Facebook’s own interests: it would be a convenient scapegoat for contentious decisions. “Don’t like how we dealt with the takedown of the Alex Jones pages? Don’t blame us! It was the Council!”

For Facebook’s new appeals body to be more than an empty gesture, it must be independent from its creators. It’s obvious that such an oversight group must be independently funded — Facebook can’t control the purse strings — but that’s not enough. There are two other critical features that can give a court independence from other governmental branches. One is the court’s ability to decide which appeals it hears, and Facebook’s tribunal should also have broad discretion to pick its “cases.”

The second, and perhaps more powerful, comes from a court’s adherence to a constitution. A constitution plays many roles, but one of them is its ability to be stalwart on values in the face of societal change: Constitutions are difficult to amend and difficult to reinterpret. Whatever direction the wind blows, or whatever Sirens may call, a constitution will be the proverbial binding that tethers Odysseus to the mast and ensures that the ship continues to sail true.

With that in mind, Facebook should consider — especially if it continues to act as a type of governing body — adopting something like a constitution that is harder to amend than its ever-shifting content-moderation rules, which it could otherwise alter mercurially to get around decisions issued by its court that it doesn’t like.

The idea of a Supreme Court of Facebook is promising in theory. But how all this will function ultimately rests on choices that Mr. Zuckerberg has yet to make. All we can do is hope that he chooses wisely — and therein lies the perilous relationship we have with Facebook.

Kate Klonick is an assistant professor at St. John’s Law School and the author of “The New Governors: The People, Rules, and Processes Governing Online Speech.” Thomas Kadri is a resident fellow at Yale Law School’s Information Society Project and a doctoral candidate at Yale Law School.

