
YouTube pushes children's videos to pedophiles through content recommendation engine

Monday, June 3, 2019, 05:15 PM, from BoingBoing
A mom in Brazil became concerned as she watched the view count on an innocent backyard clip her daughter posted to YouTube suddenly climb to hundreds of thousands. The child had posted a video of herself and a friend playing in the family pool. YouTube's recommendation engine had been suggesting the video to viewers who'd just watched other sexually oriented video content. YouTube's AI sexualized her kid and pushed her image to pedophiles. This happens a lot, apparently.
“YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids,” tweeted Max Fisher at the New York Times.
“YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation.”
“I asked YouTube— why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically,” says Fisher.
“The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.”
YouTube's CEO is Susan Wojcicki.
From Max Fisher and Amanda Taub at the New York Times:
YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.
YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.
The result was a catalog of videos that experts say sexualizes children.
“It’s YouTube’s algorithm that connects these channels,” said Jonas Kaiser, one of three researchers at Harvard’s Berkman Klein Center for Internet and Society who stumbled onto the videos while looking into YouTube’s impact in Brazil. “That’s the scary thing.”
The video of Christiane’s daughter was promoted by YouTube’s systems months after the company was alerted that it had a pedophile problem. In February, Wired and other news outlets reported that predators were using the comment section of YouTube videos with children to guide other pedophiles.
That month, calling the problem “deeply concerning,” YouTube disabled comments on many videos with children in them.
But the recommendation system, which remains in place, has gathered dozens of such videos into a new and easily viewable repository, and pushed them out to a vast audience.
YouTube never set out to serve users with sexual interests in children — but in the end, Mr. Kaiser said, its automated system managed to keep them watching with recommendations that he called “disturbingly on point.”

And here is YouTube's response today: An update on our efforts to protect minors and families.
There's a lot of yada yada in there. They're pushing an update on the same day the New York Times is publishing this story. Here's a snip from the YouTube blog post about the changes they say they're making to fix this horrible oversight:
Over the last 2+ years, we’ve been making regular improvements to the machine learning classifier that helps us protect minors and families. We rolled out our most recent improvement earlier this month. With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections, including those described above, across even more videos.
More from observers on Twitter and reporters, below.

We talked to child psychologists, sexual trauma specialists, psychologists who work with pedophiles, academic experts on pedophilia, network analysts. They all said YouTube has built a vast audience — maybe unprecedented — for child sexual exploitation, with grave risks for kids.
— Max Fisher (@Max_Fisher) June 3, 2019

YouTube, to its credit, said it has been working nonstop on this issue since a similar issue was first reported in February.
YT also removed some of the videos immediately after we alerted the company, though not others that we did not specifically flag.
— Max Fisher (@Max_Fisher) June 3, 2019

YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together.
Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
— Max Fisher (@Max_Fisher) June 3, 2019

I asked YouTube— why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically.
The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.
— Max Fisher (@Max_Fisher) June 3, 2019

Initially, YouTube gave me comment saying that they were trending in that direction. Experts were thrilled, calling it potentially a hugely positive step.
Then YouTube “clarified” their comment. Creators rely on recommendations to drive traffic, they said, so would stay on.
https://boingboing.net/2019/06/03/youtube-sexualizes-children.html