YouTube, the great radicaliser
The use of artificial intelligence and Google's business model are combining to unleash controversial content on YouTube users
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.
Soon I noticed something peculiar. YouTube started to recommend and "autoplay" videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.
Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube's recommender algorithm take me wherever it would. Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of September 11.