X blames users for Grok-generated CSAM; no fixes announced

Rather than updating Grok to prevent it from outputting sexualized images of minors, X appears to be planning to purge users who generate content the platform deems illegal, including Grok-generated child sexual abuse material (CSAM). On Saturday, X Safety finally posted an official response after nearly a week of backlash over Grok outputs that sexualized […]