Facebook, Twitter reverse changes meant to curb vote misinformation

Published Thu, Dec 17, 2020 · 05:51 AM

    [SAN FRANCISCO] Facebook and Twitter reversed changes to their content policies that were implemented to stem the viral spread of misinformation about November's US presidential election, saying the temporary changes are no longer needed.

    Twitter had made it harder to retweet others' posts, encouraging people to add commentary before posting something. The company said it will return to one-click retweets, after seeing a 20 per cent decrease in sharing following the change.

    After the election, Facebook boosted news sources it considered authoritative on its social network to make sure users were getting high-quality information on the outcome, but it says that problem is no longer as urgent. "This was a temporary change we made," the company said in a statement.

    Election conspiracies pose less of a threat now that President Donald Trump's campaign lawsuits have failed and Joe Biden's victory has been confirmed by the Electoral College. But the companies are going back to their old rules just as they may face a public-health information problem around the Covid-19 vaccines, which have just started being administered in the US.

    For example, Alabama's Department of Public Health on Wednesday warned of rumours circulating on social media, specifically about a nurse dying from the shot.

    "Rumours and misinformation can easily circulate within communities during a crisis," the department said in a Facebook post, urging people to "look for information from official public health and safety authorities".

    Twitter said that although it's reverting to the old rules on retweets, "we'll continue to focus on encouraging more thoughtful amplification," according to a post Wednesday. "This requires multiple solutions - some of which may be more effective than others. For example, we know that prompting you to read articles leads to more informed sharing."

    Both companies have policies in place to block or label misinformation about Covid-19 and treatments for the virus. Twitter on Wednesday expanded its rules to include statements that contain harmful or misleading information about Covid-19 vaccines.

    Facebook also has said it will remove false information about the vaccines that has been debunked by experts, and will let users know when they have interacted with such information.

    "We're still ensuring that people see authoritative and informative news on Facebook, especially during major news cycles and around important global topics like elections, Covid-19 and climate change," a spokesperson said in the statement. Facebook's shift back was reported earlier by the New York Times.

    BLOOMBERG
