
But “that cannot be the case,” Goldberg argued. “Facing the implicit threat that Grok would put St. Clair’s images online and, presumably, make more of them,” St. Clair had no choice but to negotiate with Grok, Goldberg said. The protections St. Clair seeks to claim under New York law should not dilute that incentive, Goldberg argued, asking the court to void St. Clair’s XAI contract and reject XAI’s motion to change venues.
Should St. Clair win his battle to keep the lawsuit in New York, the case could help set a precedent for millions of other victims who are considering legal action but are afraid to face XAI in a court of Musk’s choosing.
Goldberg argued, “It would be unjust to expect St. Clair to prosecute in a state so far from his residence, and it may be that a trial in Texas would be so difficult and inconvenient that St. Clair would effectively be deprived of his day in court.”
Grok may continue to harm children
The volume of sexual images estimated this week is worrying because it suggests that Grok, at the height of the scandal, may have been generating more child sexual abuse material (CSAM) each month than X’s entire platform.
In 2024, XSafety reported 686,176 instances of CSAM to the National Center for Missing and Exploited Children, an average of approximately 57,000 CSAM reports per month. If the CCDH’s estimate of 23,000 Grok outputs sexually exploiting children over an 11-day period is accurate, Grok’s average monthly total could have exceeded 62,000 if left unchecked.
NCMEC did not immediately respond to Ars’ request for information on how Grok’s estimated CSAM volume compares to X’s average CSAM reporting. But NCMEC previously told Ars that “regardless of whether an image is real or computer-generated, the harm is real, and the content is illegal.” Grok may therefore remain a thorn in NCMEC’s side: CCDH warns that even when X removes harmful Grok posts, “images may still be accessed through different URLs,” meaning Grok’s CSAM and other harmful outputs may continue to spread. CCDH also found examples of alleged CSAM that X had not removed by January 15.