YouTube to expand teams reviewing extremist content
Goal is to bring the total number of such people to over 10,000 in 2018, says its CEO
San Francisco
ALPHABET Inc's YouTube said on Monday it plans to add more people next year to identify inappropriate content as the company responds to criticism over extremist, violent and disturbing videos and comments.
YouTube has developed automated software to identify videos linked to extremism and now is aiming to do the same with clips that portray hate speech or are unsuitable for children.
Uploaders whose videos are flagged by the software may be ineligible for generating ad revenue.
But amid stepped-up enforcement, the company has received complaints from video uploaders that the software is error-…