YouTube, fighting child exploitation, will ban comments on videos featuring kids
YouTube will block users from commenting on most videos that feature minors, the video-streaming platform said Thursday, responding to reports that pedophiles had used comments to find, track and exploit children.
Under YouTube’s new policy, users will no longer be able to comment on videos that prominently feature kids under age 13. YouTube said it also intends to disable comments on videos featuring children ages 13 through 17 if the content risks attracting predatory behavior.
YouTube, which is part of Alphabet Inc.’s Google, said the new rules would take several months to implement and that it would identify minors in videos using software.
“We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience. At the same time, the important steps we’re sharing today are critical for keeping young people safe,” the company said in a blog post.
YouTube’s move comes two weeks after a video blogger documented how the site had enabled what he called a “soft-core pedophile ring.” In many cases, apparent pedophiles took advantage of YouTube’s comments system, where they would post time stamps so others could skip ahead to moments when the video showed kids in compromising positions. Users who viewed videos of minors would then be served additional videos featuring children through YouTube’s recommendation engine.
YouTube initially responded last week by removing tens of millions of comments, along with more than 400 channels, on videos involving minors. The revelations also triggered a sharp backlash among major brands that advertise on YouTube, including Nestle and Disney, which suspended their ad spending on the site.
On Thursday, YouTube said it had accelerated its work on new software that could spot and remove predatory comments more effectively, adding that it had terminated additional channels that put children at risk. The company said it would grant an exception to its new comment ban for a “small number of channels that actively moderate their comments and take additional steps to protect children.”
YouTube has long struggled to monitor and remove problematic content from its massive platform, where users upload 400 hours of content every minute. In recent years, it has faced controversies over militant extremist content, hateful conspiracy-theory videos and violent, sexually suggestive clips that were reaching children. A coalition of consumer and privacy groups filed a complaint last year with the Federal Trade Commission alleging that YouTube also is violating the nation’s child privacy law by collecting data on kids under the age of 13.
In the weeks since the video blogger documented how pedophiles shared time stamps of sexually suggestive moments, a parent in Florida found that a clip explaining how to commit suicide had been spliced into children’s videos on YouTube and on YouTube Kids, an app designed specifically for children.