YouTube, the great radicaliser
The use of artificial intelligence and Google's business model are combining to unleash controversial content on YouTube users
At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.
Soon I noticed something peculiar. YouTube started to recommend and "autoplay" videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.
Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube's recommender algorithm take me wherever it would. Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of September 11.