YouTube to expand teams reviewing extremist content
Goal is to bring the total number of such people to over 10,000 in 2018, says its CEO
San Francisco
ALPHABET Inc's YouTube said on Monday it plans to add more people next year to identify inappropriate content as the company responds to criticism over extremist, violent and disturbing videos and comments.
YouTube has developed automated software to identify videos linked to extremism and now is aiming to do the same with clips that portray hate speech or are unsuitable for children.