The Question
https://komonews.com/news/nation-world/20m-child-sexual-abuse-images-on-facebook-in-2020-report-says
KOMO News said: WASHINGTON (SBG) - Facebook had more than 20 million child sexual abuse images on its platform in 2020, according to a report by the National Center for Missing and Exploited Children.
“You don’t want to talk to your kids about sexual exploitation, you don’t want to talk to the kids about sex or pornography,” said Rezsaun Lewis, a father of five. But Lewis says he has already talked to his kids about pornography, fearing exploitative images of children may pop up on their feeds.
“To be sexually abused and then for it to be put on camera and for millions of strangers to watch it for enjoyment, that’s sick,” said Lewis.
In 2020, 1,400 companies reported over 21 million incidents related to child pornography. Google had over 546,000 incidents; TikTok over 22,000.
I can see why that would be a problem for Google -- they crawl just about everything online and have a huge "Images" category of the results. That's populated basically via automation. But the shit on Facebook is uploaded to Facebook by users. People put that shit on there on purpose.
And, y'know, it's weird -- I was just relating to Jack how I got a 30-day ban for "bullying" a totally fictitious person. Most likely that was done by a bot; yet, somehow, Facebook doesn't have a bot looking out for and deleting these images, and handing out bans to the people who upload them?
Tells you where Facebook's priorities are. And that, for those priorities being where they are, Facebook needs to crash and fucking burn.