Scouring hate off Facebook in Germany
SECURITY is tight at this brick building on the western edge of Berlin. Inside, a sign warns: "Everybody without a badge is a potential spy!" Spread over five floors, hundreds of men and women sit in rows of six scanning their computer screens. All have signed non-disclosure agreements. Four trauma specialists are at their disposal seven days a week.
They are the agents of Facebook. And they have the power to decide what is free speech and what is hate speech.
This is a deletion centre, one of Facebook's largest, with more than 1,200 content moderators. They are cleaning up content - from terrorist propaganda to Nazi symbols to child abuse - that violates the law or the company's community standards.
Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues for governments today: how and whether to regulate the world's biggest social network.
Around the world, Facebook and other social networking platforms are facing a backlash over their failure to safeguard privacy, to counter disinformation campaigns and to curb the digital reach of hate groups.
In India, seven people were beaten to death after a false viral message on the Facebook subsidiary WhatsApp. In Myanmar, violence against the Rohingya minority was fuelled, in part, by misinformation spread on Facebook.
Europe, and Germany in particular, has emerged as the de facto regulator of the industry, exerting influence beyond its own borders. Berlin's digital crackdown on hate speech, which took effect on Jan 1, is being closely watched by other countries. And German officials are playing a major role behind one of Europe's most aggressive moves to rein in technology companies: strict data privacy rules that take effect across the European Union on May 25 and are prompting global changes.
"For them, data is the raw material that makes them money," said Gerd Billen, Secretary of State in Germany's Ministry of Justice and Consumer Protection. "For us, data protection is a fundamental right that underpins our democratic institutions."
Germany's troubled history has placed it on the front line of a modern tug-of-war between democracies and digital platforms. In the country of the Holocaust, the commitment against hate speech is as fierce as the commitment to free speech. Hitler's "Mein Kampf" is available only in an annotated version. Swastikas are illegal. Inciting hatred is punishable by up to five years in jail.
But banned posts, pictures and videos have routinely lingered on Facebook and other social media platforms. Now companies that systematically fail to remove "obviously illegal" content within 24 hours face fines of up to 50 million euros (S$79 million).
The deletion centre predates the legislation, but its efforts have taken on new urgency. Every day content moderators in Berlin, hired by a third-party firm and working exclusively on Facebook, pore over thousands of posts flagged by users as upsetting or potentially illegal and make a judgment: Ignore, delete or, in particularly tricky cases, "escalate" to a global team of Facebook lawyers with expertise in German regulation.
Some decisions to delete are easy. Posts denying the Holocaust or genocidal rants against particular groups, such as refugees, are obvious candidates for removal.
Others are less so. On Dec 31, the day before the new law took effect, a far-right lawmaker reacted to an Arabic New Year's tweet from the Cologne police, accusing them of appeasing "barbaric, Muslim, gang-raping groups of men". The request to block a screenshot of the lawmaker's post wound up in the queue of Nils, a 35-year-old agent in the Berlin deletion centre. His judgment was to let it stand. A colleague thought it should come down. Ultimately, the post was sent to lawyers in Dublin, London, Silicon Valley and Hamburg. By the afternoon it had been deleted, prompting a storm of criticism about the new legislation, known here as the "Facebook Law."
The far-right Alternative for Germany party has been quick to proclaim "the end of free speech". Human rights organisations have warned that the legislation is inspiring authoritarian governments to copy it.
Others argue that the law gives a private company too much authority to decide what constitutes illegal hate speech in a democracy - an argument that Facebook, which favoured voluntary guidelines, itself made against the law.
Richard Allan, Facebook's vice-president for public policy in Europe, put it more simply: "We don't want to be the arbiters of free speech." German officials counter that social media platforms are the arbiters anyway.
It all boils down to one question, said Mr Billen, who helped draw up the new legislation: "Who is sovereign? Parliament or Facebook?" NYTIMES