Jenny Kane, AP
FILE - In this April 25, 2018, file photo, the YouTube app and YouTube Kids app are displayed on an iPhone in New York.

REXBURG, Idaho — Nestle, Disney, and Epic Games are pulling their ads from YouTube amid a controversy surrounding child videos, Bloomberg reports.

  • The decision to pull ads comes after a blogger explained how YouTube comments were being used to help build a "soft-core pedophilia ring." The videos involved in the controversy carried pre-roll ads (those that play before the video begins) from Disney and Nestle, among others, according to Bloomberg.

Context: Video blogger Matt Watson posted a video Sunday that explained how predators used comments on YouTube "to identify certain videos in which young girls were in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics," according to Bloomberg.

  • For example, if a user clicked on one of the videos, YouTube would recommend similar ones. Searching for a term like "bikini haul" can also lead to exploitative videos of children, The Verge reports.
  • "The comment sections are often full of predators timestamping certain parts of a video that sexualizes the child or children in the scene, although the videos themselves aren't pornographic in nature," according to The Verge.

Epic Games also paused its YouTube ad spending after its ads appeared before such videos.

  • "We have paused all pre-roll advertising," an Epic Games spokesperson told The Verge. "Through our advertising agency, we have reached out to Google/YouTube to determine actions they'll take to eliminate this type of content from their service."

All Nestle companies have pulled their ads from YouTube, CNN reports. "We have received reassuring information from Google on temporary actions being taken and will revise our decision upon confirmation of those actions and our investigation being completed," according to Nestle.

Disney is also pulling its ads from YouTube, one source told Bloomberg.

German food giant Dr. Oetker also pulled its ads from the service, CNN reports.

YouTube reaction: "Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments," a spokeswoman for YouTube told Bloomberg.

Watson details: Watson elaborated on his findings in a Reddit post.

  • "YouTube's recommended algorithm is facilitating pedophiles' ability to connect with each-other, trade contact info, and link to actual child pornography in the comments," Watson wrote on Reddit. "I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than 10 minutes, in sometimes less than five clicks."


Big picture: This could have an impact on YouTube's ad revenue. Disney is currently one of YouTube's top advertisers, accounting for about 4 percent of the platform's annual ad revenue, according to Marketing Dive.

  • Similarly, one of the reasons "Fortnite" rose to popularity was because of its reach on YouTube, especially among younger generations, Marketing Dive reported.

What's next: Dr. Oetker is asking YouTube "to explain how it could happen that advertising of our company was placed in an environment that we strictly reject and consider highly reprehensible ethically," according to a statement provided to CNN.

As CNN reported, the company wants YouTube to "immediately remove from its site any contributions that threaten the integrity and protection of minors."

The National Center on Sexual Exploitation (NCOSE) has called on YouTube to remove all pornography from its platform and create policies to avoid these situations in the future.

  • "YouTube deserves to lose advertising money from Disney, and more, just as it deserves to lose the trust of its users," said Haley Halverson, vice president of Advocacy and Outreach at the National Center on Sexual Exploitation, in a statement emailed to the Deseret News.
  • "Why doesn't Google's YouTube take sexual exploitation and abuse seriously?" Halverson asked. "On YouTube, children are eroticized, pornographic content with disguised titles are shared, and predatory pedophiles use video comment sections to network and share illicit content. This has been happening for years. Why isn't it YouTube's #1 priority to create sustained solutions, instead of carrying on with its current whack-a-mole approach?"