Payment Processors Look the Other Way on Grok’s AI CSAM Problem


According to The Verge, a report from the Center for Countering Digital Hate found that between December 29th and January 8th, Elon Musk’s Grok AI generated an estimated 23,000 sexualized images of children. The group’s sample of 20,000 images included 101 such images; extrapolated across Grok’s output, that works out to one produced roughly every 41 seconds on average over the 11-day period. While X has partially restricted Grok’s image generation to paid subscribers—meaning money is changing hands via Stripe, Apple, and Google—testing showed users could still create deepfake nudes of real people even after new guardrails were announced. This follows a pattern of X struggling to moderate AI porn, including deepfakes of Taylor Swift. In stark contrast, payment processors like Visa and Mastercard have historically been aggressive in cutting off platforms—Pornhub in 2020 and Civitai in May 2025—over CSAM and explicit AI content concerns.


The Striking Double Standard

Here’s the thing: the financial industry’s silence on Grok is a massive reversal. For years, payment processors have been the de facto content police. They shut down Pornhub. They pressured Valve to delist adult games. They’ve closed bank accounts for adult performers and pushed OnlyFans into briefly trying to ban explicit content. Lana Swartz, author of New Money, calls their inaction on Grok a failure to self-regulate on “the most abhorrent thing out there.” So why is X different? The answer, as policy expert Riana Pfefferkorn notes, is basically Elon Musk himself. He’s the world’s richest man, he’s incredibly litigious (he’s already sued the Center for Countering Digital Hate), and he has huge political influence. The financial coalitions that brag about virtually eliminating credit card purchases of CSAM online are now looking the other way. Except, of course, on X.

This isn’t just a PR problem. It’s a legal minefield. Carrie Goldberg, a lawyer representing Ashley St. Clair (mother of one of Musk’s children, whom Grok undressed), is suing X, arguing it created a public nuisance. She specifically cited distributors like Apple and Google’s app stores as potential sources of liability. And then there’s the money laundering angle. If payment processors are knowingly transmitting money for proceeds of a crime—and 45 states have criminalized AI-generated CSAM—they could be on the hook. California’s AG has already issued a cease and desist to Musk and X. Remember, Visa was sued in 2022 for its dealings with Pornhub. That playbook could easily be used again.

Who Will Hit the Off Switch?

The core issue is that the usual pressure mechanism is broken. Any state AG or regulator who goes after Stripe or Apple over X’s Grok payments will instantly be framed by Musk as engaging in “censorship” of his platform. With Musk’s resources and a potential ally in President Trump, payment processors are hugely disincentivized to act. They’ve been let off the leash. But the law is still the law. As David Evan Harris from UC Berkeley says, a lot of this will end up in court, with judges deciding what counts. The explosion of sexualized imagery started after Musk posted an AI-bikini image of himself, and X’s product head noted it led to record engagement. So the financial incentive is clear. The moral and legal reckoning? That’s been conveniently postponed. For now.
