Teens Sue Elon Musk’s xAI
AP Photo/Markus Schreiber, File
Several teenagers from Tennessee have filed a lawsuit claiming that Elon Musk’s artificial intelligence company played a role in turning their school photos into sexually explicit images involving minors.
According to a proposed class-action lawsuit filed Monday in federal court in California, three students allege that xAI’s Grok chatbot was used to digitally remove clothing from photos of more than 18 girls, many of whom attended the same school. The altered images, which portrayed the girls nude or in sexualized ways, were later shared and exchanged on Discord and Telegram.
Authorities have arrested a suspect connected to the case. The lawsuit claims that the individual distributed the images and traded them for other sexually explicit material involving minors within group chats that included hundreds of participants.
The complaint argues that xAI and Elon Musk made this kind of misuse possible by introducing image-editing tools through Grok’s “Spicy” mode and related features. The plaintiffs say those tools were capable of generating images that appeared to undress real people and were marketed in ways meant to increase engagement with the platform.
The teens are seeking financial damages and are also asking the court to block similar image-editing features from being used to create such material in the future.
Musk has previously stated that he was not aware of any explicit images involving minors produced through Grok. He has said the system is designed to reject illegal requests and suggested that any instances where such content was created were the result of adversarial attempts to bypass the safeguards.
Meanwhile, authorities in California, the United Kingdom, and across Europe have begun investigating xAI’s tools for generating sexualized images. Researchers studying the technology say it produced roughly 23,000 images that appeared to depict minors over an 11-day period.
Also on Monday, Australia’s online safety regulator reported that child sexual abuse material appears to be more widespread and easier to find on X than on other major social media platforms.