YouTube struggles to prevent alarming and violent videos that target children
By Kari Paul, MarketWatch
Parents have come across cartoon characters being tortured that appear to target young children
Children watching videos on YouTube have been regularly stumbling upon disturbing content disguised as cartoons, and the company says it is cracking down -- but critics say it isn't enough.
After several reports (https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html?_r=0) on videos that feature beloved cartoon characters in alarming situations being served to children, YouTube (GOOGL) is making changes to how it deals with questionable content. YouTube announced Friday it is in the process of implementing a new policy that age-restricts flagged content in the YouTube app. These videos include childhood characters like Elsa from "Frozen" and Spider-Man being buried alive or tortured. Often they play automatically after children watch legitimate videos on the site.
"Age-restricted content is automatically not allowed in YouTube Kids," Juniper Downs, director of policy at YouTube, said in a statement. "The YouTube team is made up of parents who are committed to improving our apps and getting this right."
Some critics say YouTube is being targeted by nefarious groups or individuals. James Bridle wrote on Medium (https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2) that they are using YouTube "to systematically frighten, traumatize, and abuse children." He added, "It forces me to question my own beliefs about the internet, at every level."
YouTube said it has been developing tools to address negative content targeting children that it will release soon. This month, the company introduced YouTube Kids profiles (http://www.marketwatch.com/story/youtube-is-introducing-profiles-for-children-why-you-should-pay-attention-2017-11-03), which let parents personalize content based on their children's ages and interests. The accounts are for children ages 13 and under, who are not allowed to have their own accounts on the site under YouTube's terms and conditions.
Members of the public can only do so much to help
Critics say YouTube should not rely on its community to flag inappropriate videos and should dedicate more resources to addressing the problem. Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, told MarketWatch the organization alerted YouTube to these issues two years ago, but the company did not respond until the recent backlash prompted by media reports of the inappropriate children's videos.
"YouTube should have the resources to police its own site and not rely on unpaid labor from the community of people who watch videos," he said. "Anything that is on YouTube Kids should have already been previewed by someone who works at YouTube."
Jill Murphy, editor in chief of media nonprofit Common Sense (commonsense.org), also said relying on an algorithm to filter out inappropriate content is not enough, and that a combination of methods should be used to maintain the safety of the service.
"Deciding what content is served up to kids, the quality of that content, and the balance of watching content versus participating in other activities is still a role adults need to be heavily involved in, and is not something an algorithm can replace," she said.
-Kari Paul; 415-439-6400; AskNewswires@dowjones.com
RELATED: 5 unpretentious ways to be an ethical traveler this holiday season (http://www.marketwatch.com/story/5-ways-to-make-the-world-a-better-place-while-you-travel-2017-06-09)
RELATED: Facebook now wants to help find your apartment (http://www.marketwatch.com/story/facebook-now-wants-to-help-find-your-apartment-2017-11-09)
RELATED: Would you go on a vacation with a complete stranger? There are dating sites for that (http://www.marketwatch.com/story/would-you-go-on-a-vacation-with-a-complete-stranger-there-are-dating-sites-for-that-2017-11-09).
(END) Dow Jones Newswires
11-10-17 1511ET
Copyright (c) 2017 Dow Jones & Company, Inc.