I got baited into posting a picture of a child eating popcorn on Discord, not knowing it was associated with CSAM. The account got banned, but I care less about that than about the legal consequences. Has anyone heard of legal action against people who posted it?
Helpful video: https://www.youtube.com/watch?v=Kyc_ysVgBMs
But basically, the picture was a cropped frame from CSAM material, so their systems concluded you were posting CSAM content when you were not.
About the legal consequences, I am not a lawyer, but I don't think you will be visited by the police anytime soon, since the picture you posted isn't CSAM by itself, just a cropped portion which does not contain the abusive material itself.
Edit: as someone said, it goes through multiple human reviews.
Can someone explain this to me, cus huh?
No. I see no way to prosecute over popcorn. If a country actually did I would find the exit. Fast!
Not condoning child abuse
But popcorn?
Websites have false positives all the time, and while it sucks, it's infeasible for them to have human reviewers checking everything, and it's better to have false positives than false negatives… What isn't acceptable is that the appeals process uses the exact same models as the flagging process, so it gets the exact same false positives and false negatives…
Pic related, as it was one of the first cases to reveal how broken the appeals process on most social media platforms is.
How does eating popcorn == CSAM?
From what other people have said, and from the occasional video that's popped up on YouTube, Discord has a library of CSAM content that its automated systems match against. Certain individuals try to bait people into posting seemingly innocent pictures that are actually frames from said videos. Discord's systems see that the image is a frame from such material and will auto-ban the account.
This is fascinating and I have a bunch of questions, basically all centered around the fact that possession of such content is outlawed. I don't expect OP to know, but maybe someone else does:
Isn’t it illegal to have a library of such content? Is there a legal carveout for that, like Coca-Cola importing cocaine?
How is the library compiled, maintained, and added to?
Is the library specific to Discord or is it a shared library maintained by some centralized “authority” or developer? If it’s specific to Discord then can we assume there are many different libraries of illegally produced and possessed content compiled and maintained by various social media companies? Who’s got that job? Do they get therapy in their benefits package?
Do they get therapy in their benefits package?
This kind of moderation is generally outsourced to people in the global south paid pennies. And no, they don’t get therapy.
As far as I understand they use a tool called PhotoDNA (an image-hashing tool developed by Microsoft, not something Discord built) to scan pictures.
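PhotoDNA itself is proprietary, so this is just a toy sketch of the general idea behind perceptual hashing: unlike a cryptographic hash, visually similar images (re-encoded, brightness-shifted, lightly edited copies) produce hashes that differ in only a few bits, so a scanner can match them against a database of known-bad hashes without storing the images themselves. The "average hash" below is a well-known simple variant, not Discord's or Microsoft's actual algorithm, and the 8x8 synthetic "images" are made up for illustration.

```python
# Toy perceptual hash ("average hash") sketch. Not PhotoDNA -- just the
# general principle: similar images -> small Hamming distance between hashes.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. an 8x8 thumbnail.
    Returns one bit per pixel: 1 if brighter than the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes of equal length."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Two synthetic "images": the second is the first with a slight uniform
# brightness shift, standing in for a re-encoded or lightly edited copy.
img = [[(x * 31 + y * 17) % 256 for x in range(8)] for y in range(8)]
noisy = [[min(255, p + 3) for p in row] for row in img]

h = average_hash(img)
print(hamming(h, average_hash(img)))    # identical copy: distance 0
print(hamming(h, average_hash(noisy)))  # edited copy: distance stays tiny
# An unrelated image would typically differ in roughly half of its 64 bits,
# so a threshold on the Hamming distance separates matches from non-matches.
```

Real systems like PhotoDNA use far more robust transforms (and, per the earlier comments, flagged matches are supposed to go through human review), but the match-against-a-hash-database flow is the same shape.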