The Business Times

Twitter bot learns from users to be racist

Less than 24 hours after @TayandYou went online, Microsoft halted account's posting

Published Fri, Mar 25, 2016 · 09:50 PM

New York

Microsoft set out to learn about "conversational understanding" by creating a bot designed to have automated discussions with Twitter users, mimicking the language that they use.

What could go wrong? If you guessed "It will probably become really racist," you've clearly spent time on the Internet. Less than 24 hours after the bot, @TayandYou, went online on Wednesday, Microsoft halted posting from the account and deleted several of its most obscene statements.

The bot, developed by Microsoft's Technology and Research and Bing teams, got major assistance in being offensive from users who egged it on. It disputed the existence of the Holocaust, referred to women and minorities with unpublishable words, and advocated genocide. Several of the tweets were sent after users commanded the bot to repeat their own statements, a…


