Three individuals from Tennessee, two of them minors, have sued Elon Musk’s xAI, claiming the company deliberately designed its Grok image generator to let users create sexually explicit content from real photos of other people.
The lawsuit, filed in federal court in San Jose, California, seeks class-action status on behalf of all people in the United States whose real images were “reasonably identifiable” in sexualized images or videos created by Grok. xAI has not yet responded to Reuters’ request for comment.
This legal action follows earlier controversy. After public outrage over sexually explicit content generated by its chatbot, xAI stated in January that it had stopped all users from editing images of “real people in revealing clothing” and from creating images of people in revealing clothing in “jurisdictions where it’s illegal.”
Globally, governments and regulators have since launched investigations, issued bans, and demanded stronger safeguards. This reflects a growing effort to control illegal and offensive material generated by AI.
The lawsuit specifically claims that xAI failed to put in place proper safeguards to prevent its systems from generating sexual content involving minors. All three plaintiffs in the case were minors when these images were allegedly created.
The plaintiffs assert that their actual photos were digitally altered into explicit content, which then reportedly spread online across various platforms, causing them significant emotional distress and creating a public nuisance. They are seeking unspecified damages, legal fees, and a court order barring xAI from continuing the alleged practices.
Annika Martin, a lawyer for the plaintiffs from Lieff Cabraser Heimann & Bernstein, issued a strong statement. “These are children whose school photographs and family pictures were turned into child sexual abuse material,” Martin said. She further alleged, “Elon Musk and xAI deliberately designed Grok to produce sexually explicit content for financial gain, with no regard for the children and adults who would be harmed.”