child sexual abuse materials
Elon Musk’s xAI sued for turning three girls’ real photos into AI CSAM

A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As recently as January, Musk denied that Grok generated any CSAM during a scandal in which xAI refused to update filters to block […]
Grok assumes users seeking images of underage girls have “good intent”

For weeks, xAI has faced backlash over Grok generating undressed and sexualized images of women and children. One researcher conducted a 24-hour analysis of the Grok account on X and estimated that the chatbot generated over 6,000 images an hour flagged as “sexually suggestive or nudifying,” Bloomberg reported. While the chatbot claimed that xAI […]
X blames users for Grok-generated CSAM; no fixes announced

It seems that instead of updating Grok to prevent outputs of sexualized images of minors, X is planning to purge users generating content that the platform deems illegal, including Grok-generated child sexual abuse material (CSAM). On Saturday, X Safety finally posted an official response after nearly a week of backlash over Grok outputs that sexualized […]