#csam — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #csam, aggregated by home.social.
-
RE: https://mastodon.social/@arteesetica/116042974555051098
You're all very young, but back in early 2022, the official #Midjourney Discord had launched a hidden channel called the "war-game channel / war room," where developers and early testers generated disturbing images: gore, drugs, violence, not-safe-for-work content ( #NSFW ), and child sexual abuse material ( #CSAM ).
If they train a generative AI model on that kind of content, it's because they intend to generate that kind of content.
-
EU negotiations to extend the “Chat Control 1.0” ePrivacy derogation collapsed after Parliament, Council, and Commission failed to agree on privacy safeguards. 🇪🇺
The expired measure had allowed platforms like Meta and Microsoft to scan private messages for CSAM, raising concerns over mass surveillance and legal basis. 🔒🔗 https://edri.org/our-work/did-the-eu-parliament-really-vote-not-to-protect-children-online/
#TechNews #EU #Europe #Privacy #ChatControl #Parliament #Commission #Encryption #Surveillance #Cybersecurity #Meta #Microsoft #Policy #Freedom #CSAM
-
UK Schools Told to Remove Children’s Photos as Criminals Use AI to Create Explicit Images https://petapixel.com/2026/05/11/uk-schools-told-to-remove-childrens-photos-as-criminals-use-ai-to-create-explicit-images/ #artificialintelligence #highschoolphotos #onlinesafety #schoolphotos #childsafety #Technology #News #csam #Law
-
Associated Press: French prosecutors seek charges against Elon Musk and X over child sexual abuse images. “French prosecutors are seeking charges against Elon Musk and his social platform X for child sexual abuse images on the platform, deepfakes, disinformation and complicity in denying crimes against humanity by the platform’s artificial intelligence system, Grok.”
https://rbfirehose.com/2026/05/10/associated-press-french-prosecutors-seek-charges-against-elon-musk-and-x-over-child-sexual-abuse-images/
-
Elon Musk faces a criminal probe in France as prosecutors escalate the X AI investigation.
French prosecutors have opened a criminal investigation into Elon Musk and his social platform X for child sexual abuse images on the platform, deepfakes, disinformation and complicity in denying crimes against humanity by the platform’s artificial intelligence system, Grok.
#ElonMusk #France #CSAM #Deepfakes #Disinformation #X #Grok #AI #Tech
-
Hey, that guy who was so fucking keen to go to Epstein's island of child rape is in trouble for more nonce stuff again.
Maybe he'll post as his Ma to say he's 'not a Pedo', like the 'winner' he is... #ElonMusk #Musk #Grok #Deepfakes #CSAM #SocialMedia #Twitter #EpsteinClass
-
"We analyse an anonymous online survey of 4,918 adult men quota-matched and weighted to national populations in Australia, the United Kingdom, and the United States. In pooled analyses, 8.0% reported sexual feelings towards children, 7.4% would likely have sexual contact with a child if undetected, 5.5% to 5.7% would watch child sexual abuse material or a webcam show, and 2.4% to 4.7% reporting engagement in online or contact offending."
https://journals.sagepub.com/doi/full/10.1177/08862605251403614
-
CW: uspol,abuse
Minnesota passes ban on fake AI nudes; app makers risk $500K fines
More evidence of Grok CSAM seen as Minnesota passes nudifying app ban.
#ai #csam #elon-musk #fake-nudes #grok #minnesota #nudifying-apps #policy #x #xai
https://arstechnica.com/tech-policy/2026/05/minnesota-set-to-be-first-state-to-ban-nudification-apps/
-
if you're over 30 you might have missed the news that a major american pop star is about to go on trial for grooming an 11 year old girl for three years before murdering and dismembering her with a chainsaw, making hit songs about it along the way.
(he's guilty btw)
-
We cannot protect victims by tying the hands of those who could prevent abuse from spreading.
If we fail to equip our systems to stop it, we too carry a share of that responsibility.
We owe victims and their families action.
https://nitter.net/EPPGroup/status/2049519578074472813#m
-
⚠️ SW-ISAC Advisory
The IFTAS Do Not Interact domain denylist has been updated.
Recent additions have been identified as federating CSAM, based on human review.
If your service imports the IFTAS DNI denylist, please update accordingly.
-