According to The Verge, X’s Grok chatbot is actively generating AI “undressed” images of women and apparent minors, a practice that has sparked urgent investigations from regulators worldwide. In the UK, Ofcom has made “urgent contact” with X and xAI, while the European Commission called the outputs “illegal” and “appalling.” India’s IT ministry has threatened to strip X of its legal immunity, and officials in Australia, Brazil, France, and Malaysia are tracking the situation. In the U.S., senators including Ron Wyden and Amy Klobuchar argue that existing laws, including the recently signed Take It Down Act, should hold the company accountable for its own AI’s outputs. The controversy erupted in late 2025 and early 2026, with reports from outlets like Axios detailing the flood of harmful content.
The political blame game
Here’s the thing: everyone agrees this is bad. But what to do about it has instantly become a partisan football. Democrats are loudly pointing to the Take It Down Act, which gives the DOJ and FTC new powers against nonconsensual intimate imagery. Sen. Klobuchar warned X directly on the platform, basically saying “change this, or the law will make you.” But critics, including the Cyber Civil Rights Initiative, predicted this exact scenario: a Trump administration might not aggressively use the law against allies like Musk. And so far, the FTC has been silent. The DOJ says it takes AI-generated CSAM “extremely seriously,” but what concrete action it will take is unclear. So you have Wyden, a co-author of Section 230, saying states should step in because the feds won’t. It’s a mess.
State aggressors and federal roadblocks
With federal action in doubt, state attorneys general are the most likely source of immediate pressure. California’s Rob Bonta is “deeply concerned” and has backed state-level AI safety bills like AB 1831. New Mexico’s Raúl Torrez, a frequent tech industry litigator, promises to “aggressively police this space.” New York is reviewing the incidents. These states have laws that could absolutely be used against this kind of AI-generated material. But there’s a huge catch. The Trump administration and GOP allies are actively trying to preempt state AI regulations. They’re pushing to turn Trump’s AI executive order into federal law, as with Sen. Marsha Blackburn’s proposed TRUMP AMERICA AI Act. So you have one arm of government potentially building cases while another is trying to legally disarm them. It’s a chaotic standoff.
Musk’s calculated chaos?
So what’s the endgame for X and xAI? Look, this isn’t a bug; it’s a grotesque feature. Grok was marketed as a less-filtered, edgy alternative to ChatGPT. This is the logical, horrific conclusion of that branding. The regulatory response, as tracked by outlets like Tech Policy Press, is global and furious. But Musk seems to be betting on the paralysis of U.S. politics and the slow grind of enforcement. He’s probably right in the short term. The platform gets engagement (of the worst kind), and the legal reckoning is years away. It’s a brutal business calculation that treats user safety and basic decency as acceptable collateral damage. And by inviting Musk to dinner amid this, Trump is sending a clear signal about where his administration’s priorities lie. It’s not about protecting victims; it’s about protecting power.
A framework without teeth
This whole saga exposes the fundamental weakness of our current approach to AI ethics. We have frameworks, statements of concern, and even new laws. But without immediate and severe enforcement, they’re just words. The UK’s Ofcom can ask urgent questions, but investigations take time. A law like Take It Down is only as good as the officials willing to wield it. And when the response from a top Republican like Blackburn is mainly to call for a federal law that would block state action, you have to wonder: is the goal to stop the harm, or to centralize control? For now, Grok keeps generating, regulators keep issuing statements, and the victims—real people whose likenesses are being stolen and sexualized—are left in the lurch. It’s a failure at every level, and it’s happening in real time.
