YouTube Steps Up Takedowns as Concerns About Kids' Videos Grow

YouTube stepped up enforcement of its guidelines for videos aimed at children, the unit of Alphabet's Google said on Wednesday, responding to criticism that it has failed to protect children from adult content.

The streaming video service removed more than 50 user channels in the last week and stopped running ads on over 3.5 million videos since June, YouTube vice president Johanna Wright wrote in a blog post.

"Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies," Wright said. "These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge."

YouTube has become one of Google's fastest-growing operations in terms of sales by simplifying the process of distributing video online but putting in place few limits on content.

Parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service. They have contended that Google must do more to remove or restrict access to inappropriate videos, whether propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned.

Concerns about children's videos gained new force in the last two weeks after reports in BuzzFeed and the New York Times and an online essay by British writer James Bridle pointed out questionable clips.

A forum on the Reddit internet platform dubbed ElsaGate, named after the Walt Disney Co princess Elsa, also became a repository of problematic videos.

Several forum posts Wednesday showed support for YouTube's actions while noting that vetting must expand even further.

Common Sense Media, an organisation that monitors children's content online, did not immediately respond to a request to comment about YouTube's announcement.

YouTube's Wright cited "a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not" for the new efforts "to remove them from YouTube."

The company relies on review requests from users, a panel of experts and an automated computer programme to help its moderators identify material possibly worth removing.

Moderators now are instructed to delete videos "featuring minors that may be endangering a child, even if that was not the uploader's intent," Wright said. Videos with popular characters "but containing mature themes or adult humor" will be restricted to adults, she said.

In addition, commenting functionality will be disabled on any videos where comments refer to children in a "sexual or predatory" manner.

© Thomson Reuters 2017
