San Bruno, California (Reuters) - Alphabet Inc's YouTube said on Monday (Dec 4) it plans to add more people next year to review and remove violent or extremist content on the video platform.
YouTube is moving to protect users from inappropriate content with stricter policies and larger enforcement teams, YouTube CEO Susan Wojcicki said in a blog post.
"We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether," Wojcicki said.
The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, she said.
YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.
YouTube has faced criticism from advertisers, regulators and advocacy groups for failing to police content and to account for the way its services shape public opinion.