A group of parents and child safety organizations has asked the Federal Trade Commission to investigate Google’s practices when it comes to the YouTube Kids platform.
The organizations, which include the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy and 21 others, said that YouTube Kids violates child protection laws because it collects data on its child users and then uses that data to target advertising to children.
The groups said in a complaint to the FTC that the Children’s Online Privacy Protection Act includes strict guidelines for companies collecting data from kids.
According to Engadget, the act requires companies that run websites directed at children “to inform parents what information they plan to gather, and get permission to do so.”
However, YouTube is accused of avoiding this rule by stating that only people age 13 or older may use the app, when in reality it is popular among children 6 to 12 years old, according to Engadget.
“Google has acted duplicitously by falsely claiming in its terms of service that YouTube is only for those who are age 13 or older, while it deliberately lured young people into an ad-filled digital playground,” said Jeff Chester of the Center for Digital Democracy, according to The Guardian. “Just like Facebook, Google has focused its huge resources on generating profits instead of protecting privacy.”
YouTube said in a statement to The Guardian that it plans to "read the complaint thoroughly and evaluate if there are things we can do to improve."
The group urges the FTC to investigate and sanction Google for collecting data on children.
“For years, Google has abdicated its responsibility to kids and families by disingenuously claiming YouTube — a site rife with popular cartoons, nursery rhymes, and toy ads — is not for children under 13,” said Josh Golin, executive director of the CCFC, according to The Guardian. “Google profits immensely by delivering ads to kids and must comply with Coppa. It’s time for the FTC to hold Google accountable for its illegal data collection and advertising practices.”
YouTube previously drew criticism for hosting conspiracy theory videos on its YouTube Kids app, according to Business Insider. The videos promoted claims that the Earth is flat and that the U.S. faked its moon landings.
YouTube similarly came under fire last year when parents discovered that many of the videos on the app contained violent and mature content. One popular example featured cartoon characters from “Peppa Pig” breaking each other's bones.
“In another featuring Doc McStuffins, children are turned into zombies with bleeding wounds on their faces and chase their mother around the house,” the Deseret News reported.
In response, YouTube said it planned to hire 10,000 people to review all of its videos on the Kids app to make sure they didn’t contain mature material.
The Parents Television Council approved the decision, but still asked YouTube and Google to do more.
“We applaud Google’s decision to increase monitoring of violent and extreme content — both the videos and comments — on YouTube,” PTC President Tim Winter said in a statement. “This is a great first step towards not only protecting advertisers, but also protecting the health and safety of young viewers who may be watching. Our past research on YouTube found that children entering ‘child-friendly’ search terms were confronted with highly offensive content in the text commentary posted by other site users. YouTube continues to be a site that needs constant monitoring, and today’s announcement should assist with that goal.”