Facebook wants a faux regulator for Internet speech. It won't happen . . .
In a recent opinion article, Facebook CEO Mark Zuckerberg said that he agreed with the growing consensus that Facebook - and other social media companies - should be subject to more regulation. The article, published in four countries and three languages, was fated to be misunderstood from the start.
His first suggestion was to create an independent body so users could appeal against Facebook's moderation decisions. Over the past few years, Facebook has come under fire from all sides for its content moderation. Some say hate speech should be censored more aggressively. At the same time, the company has been accused of censoring conservative viewpoints. And for years, it has been roundly criticised for its puritanical ban on female nipples.
But if Facebook had its way, the ultimate authority would no longer lie with Facebook - or Twitter or YouTube or other competitors. That job would fall to unspecified "regulators". "Regulation could set baselines for what's prohibited and require companies to build systems for keeping harmful content to a bare minimum," Mr Zuckerberg wrote. For a company exhausted by a year of scandal, a regulatory scapegoat is just what the doctor ordered. If you don't like what we do, why don't you try it for a change?
US legal experts were incredulous. Daphne Keller of the Stanford Center for Internet and Society accused Facebook of proposing an unconstitutional system, knowing it was impossible. In an initial statement, Ben Wizner of the American Civil Liberties Union said it was a violation of the First Amendment. The Electronic Frontier Foundation claimed it would violate the freedom of expression guaranteed by the Universal Declaration of Human Rights.
Facebook's head of public policy, Kevin Martin, explained that while Mr Zuckerberg's reference to "regulation" might mean actual government intervention in France, Germany and Ireland, it meant only private sector self-regulation in the United States.
This kind of faux regulation is nothing new. Among the examples cited were FINRA, a nongovernmental financial industry "regulator"; the Motion Picture Association of America, which rates films; and the Entertainment Software Rating Board, which rates video games. (After this clarification, the ACLU's Wizner agreed that independent bodies like the MPAA are not unconstitutional.)
But none of these examples deal with directly regulating speech. Social media content moderation is a different beast entirely. Slapping a label on a video game isn't the same as banning distribution of the video game.
Facebook avoided bringing up the Hays Code, the closest analogue to what it proposes. The MPAA's rating system is a pale shadow of Hollywood's old Hays Code, the now-laughable list of rules that for years had on-screen husbands and wives sleeping in separate beds.
The Code was developed voluntarily by the studios in hopes of avoiding government censorship. It zealously policed depictions of romance, crime, law enforcement and the clergy. When the Supreme Court held that motion pictures were protected by the First Amendment in 1952, enforcement of the code diminished.
Facebook's proposal is a bow to public opinion. In 2018, a coalition of advocacy groups published the Santa Clara Principles - new baseline rules for how content moderation should work. The principles focus most heavily on the right to appeal against decisions - particularly in conjunction with "new independent self-regulatory mechanisms" created in collaboration with industry.
All this sounds like what The Verge's Casey Newton calls "a Facebook Supreme Court". It's almost as if the Santa Clara Principles were developed by a room full of lawyers. Hammer, meet nail.
Due process would be much welcomed in a world where people simultaneously believe that Facebook takes down too little content and too much. But due process is costly, even after removing high-billing lawyers from the equation. Consider that the Supreme Court, with a budget of nearly US$90 million, receives 8,000 petitions a year - most of which are rejected. Meanwhile, according to a class-action lawsuit filed by an ex-Facebook moderator, "moderators are asked to review more than 10 million potentially rule-breaking posts per week". No wonder content moderation on the big platforms resembles not so much an unpleasant visit to the Department of Motor Vehicles as a re-enactment of the horror film The Purge. Due process is a luxury good.
We're not likely to see a Facebook Supreme Court - not an American one, in any event. The Hays Code died after the First Amendment was extended to movies; a Hays Code for the Internet will probably be dead on arrival.
In a confused, fractured world, Facebook would be glad to stick to a single global standard. This is perhaps why Mr Zuckerberg offers full-throated praise of the European Union's privacy standard, the General Data Protection Regulation, in his op-ed. The fracture of the Internet into different spheres of influence would be bad for his business, and to that end, the company would much rather impose European sensibilities on the US Internet than deal with multiple standards.
So, while the US government has its hands tied behind its back by the Constitution, the French, the Germans and the Irish will set their own bar for online speech. In the future, US speech - at least online - may be governed by Europe. NYTIMES