Facebook Inc.’s de facto Supreme Court of content is on the verge of ruling on Donald Trump’s return to the world’s largest social network, a closely watched decision that’s likely to be just as consequential for the social media giant as it is for the former U.S. president.
The company suspended Trump on Jan. 6, forbidding him to share content with his 35 million followers after the violent attack on the U.S. Capitol.
The ban was extended indefinitely, meaning Trump's Facebook page, where he often posted more than a dozen times a day, has been frozen for more than three months. The freeze has intensified long-running accusations of political bias against conservatives and added fuel to the debate over how social media companies referee speech.
The decision in the coming days by the Oversight Board, an independent group of lawyers, academics, and journalists, will be binding, and marks the most significant test since the panel’s formation last year. If the board restores Trump’s account, Facebook will contend with fresh criticism that it doesn’t do enough to stop dangerous and false information from spreading on its platform. Banning Trump permanently is apt to spark even more backlash from conservatives who favor a more hands-off approach to content moderation.
Whatever the board concludes, its ruling will set a precedent for how Facebook handles future posts from other political leaders around the world who rely on the platform to make or break public policy, win elections, and influence social movements.
“For a company like Facebook it is putting them back at the center, the epicenter of a lot of these debates around where the line is around legal but harmful content,” said Katie Harbath, a former public policy director at Facebook who left the company in March. “There is both what’s at stake for Facebook, but then also what’s at stake I think more broadly in terms of the debate and the questions around how world leaders should be held accountable for the things that they say and push on the internet.”
Facebook wasn’t the only company to take the extraordinary step of suspending Trump’s accounts after Jan. 6.
Snapchat parent Snap Inc. and Alphabet Inc.’s YouTube were among companies that also blocked Trump’s accounts. Twitter Inc., Trump’s favorite social network, banned him permanently, and executives say it has no plans to reverse that decision.
But Facebook is alone in its plan to outsource the decision about Trump’s future on the platform. The ruling will increase scrutiny of both the company and the Oversight Board, which was born out of a longstanding argument by Chief Executive Officer Mark Zuckerberg that Facebook shouldn’t be making so many decisions about what speech is acceptable on its service.
He first announced plans for an “independent body” to review content decisions in late 2018, but the board didn’t start operating for almost two years. During that time, Zuckerberg repeatedly defended his company’s responsibility to protect freedom of speech and his desire to keep Facebook from becoming an arbiter of truth.
“We are of course aware of the fact that those decisions will not please all stakeholders all the time,” Thomas Hughes, director of the Oversight Board administration, said in an interview about the board’s role. “That is a simple reality of the type of work the board is engaged in.”
To some, the idea of an independent board comes across as a cop-out — a chance for Zuckerberg and Menlo Park, California-based Facebook to avoid taking responsibility for the tough decisions that come with building a global platform.
“At this point, @Facebook’s Oversight Board is another attempt to appear accountable even while FB’s leadership has set the board up for failure,” Facebook critic and Color of Change President Rashad Robinson tweeted earlier this year.
Facebook’s argument is that the board is a form of checks and balances, challenging its power as the company responsible for policing the world’s speech and holding it accountable when it missteps.
“This is exactly the type of thing that the board was meant to do and why it was created to help think through these thorny questions,” Harbath said.
The board's initial 20-member roster, unveiled in mid-2020, includes an array of academics, human rights activists, and Tawakkol Karman, a Nobel Peace Prize laureate. On April 13, the board said it would expand its mandate to let users appeal posts that Facebook allows to remain on its platform, not just those it removes. To handle the increased caseload, the panel will roughly double to about 40 members in the coming months.
Of the seven cases the board has reviewed and ruled on so far, it overturned Facebook's initial decision to take down content in five, including one involving a post about COVID-19 cures that the company had deemed a threat to user safety.
That track record has led some outside observers to expect the Oversight Board to reinstate Trump. A five-person panel randomly selected from the larger board is reviewing the former president's case, though the identities of those members won't be disclosed; that information is kept confidential for privacy and security reasons. The members, who have been meeting via videoconference, will also weigh input from the public. An open comment period for Trump's case elicited more than 9,000 responses, including one from Trump himself, Hughes said.
The Oversight Board is essentially weighing two competing arguments. One camp says Trump should be held to the same rules and standards as any other user. The other counters that political speech, even the offensive kind, should be afforded greater protection in a democratic society. The board will also take into account international human rights principles and the local cultural context of the case, one of the challenges of applying a single set of rules globally.
“You can either say political speech is uniquely important and should be protected,” said board member Alan Rusbridger, or “the counterargument is that political speech ought to be uniquely responsible because it has so much power and therefore there is an extra burden on it to be extra responsible.”
One of the key arguments in favor of upholding the ban is that the First Amendment affords companies such as Facebook a lot of freedom to moderate content as they see fit. Policing its service, the argument goes, is a business decision.
“We hope [The board] will be mindful that Facebook is not a government — and that the platform’s decisions denying active accounts or taking down posts pose no threat of loss of liberty to any person,” wrote Harvard Law School professors Vicki Jackson and Martha Minow in Lawfare in early March.
Twitter CEO Jack Dorsey made a similar argument before Congress last month when discussing his own company’s decision to ban Trump. “Ultimately we’re running a business,” Dorsey said, “and a business wants to grow the number of customers it serves. Enforcing policy is a business decision.”
The Oversight Board doesn't consider Facebook's business interests when making its decisions, according to Hughes.
Trump has downplayed his reliance on Facebook in recent weeks, saying he plans to start his own social network to reach his followers while avoiding big tech companies that have penalized him.
The board could also recommend that Facebook change its policies, an approach it took in January, when it recommended the company "create a new community standard on health misinformation." Facebook has promised to uphold the board's decision on Trump but has not committed to implementing its policy advice. It's possible Facebook could find a middle ground between suspending world leaders' accounts and letting them break the rules without consequence. Twitter, for instance, routinely left Trump's rule-violating tweets up but hid them behind a warning label.
“In the cases we have been looking at so far, not including Trump, we have had discussions about saying, well, ‘sometimes it can seem a little too prude to just say take it up or leave it down,’” Rusbridger said.
If the Trump ban is upheld, Facebook may come under more pressure to take action against other world leaders who use social media to spread misinformation, lies, or hateful content to further their authoritarian aims. In February, Facebook kicked Myanmar’s military off its platform after the country’s democratic government was overthrown in a military coup.
Other world leaders Facebook could act against include France's far-right leader Marine Le Pen, Indian Prime Minister Narendra Modi, and Brazilian President Jair Bolsonaro, according to Harbath. Bolsonaro's posts were removed last year by Facebook and other platforms for violating the companies' coronavirus misinformation policies.
“The decision on Trump will be a very significant precedent for how the speech of political leaders around the world is treated,” said Nate Persily, a Stanford Law School professor and elections expert who has closely followed Facebook’s Oversight Board.
Taking more aggressive action against other political leaders risks prompting greater backlash from around the globe. German Chancellor Angela Merkel and French Junior Minister for European Union Affairs Clement Beaune both criticized Twitter’s move to permanently ban Trump, arguing lawmakers should be the ones to set the rules governing free speech, not private technology companies.
“The chancellor sees the complete closing down of the account of an elected president as problematic,” Steffen Seibert, Merkel’s chief spokesman, said in January. Rights like freedom of speech “can be interfered with, but by law and within the framework defined by the legislature — not according to a corporate decision.”
Merkel has a surprising ally in that argument: Mark Zuckerberg.
“Many people are concerned that platforms can ban elected leaders,” Zuckerberg said during a March 25 hearing before Congress. “I am, too. I don’t think private companies should make so many decisions like this alone.”
Source: Newsmax