According to Fortune, hundreds of thousands of workers from over 50 countries are currently trapped in Southeast Asia’s scam centers, but they may soon be replaced by artificial intelligence. Researcher Ling Li confirms that AI already crafts initial scam messages in some centers, and large language models could eventually handle entire “pig butchering” scam processes. Since the start of 2025, Meta has detected and disrupted nearly 8 million Facebook and Instagram accounts linked to these criminal operations, while banning over 6.8 million WhatsApp accounts in just six months. An October investigation revealed more than 2,000 Starlink devices were being used by scam centers in Myanmar before SpaceX disabled them. International pressure has led Cambodia and Myanmar to arrest thousands, but experts worry automation could sap that urgency as the risk to trafficked foreign nationals declines.
The automation paradox
Here’s the thing that makes this situation so complicated: as AI takes over the actual scamming work, the humanitarian crisis might become less visible to the outside world. When foreign citizens aren’t being trafficked in such large numbers, will governments maintain their pressure on countries like Thailand, Cambodia and Myanmar? Ling Li thinks not – she predicts governments and NGOs may withdraw from the fight when their citizens face less immediate risk.
And that creates a real problem for law enforcement. Without human workers trapped inside these compounds, who’s going to provide the inside information that helps authorities dismantle these networks? It’s much harder to get informants when the operations are fully automated. Basically, we could end up with more efficient criminal enterprises that are even harder to track down.
Technology’s complicated role
The story gets even messier when you look at how legitimate technology is being weaponized. We’ve got Starlink providing internet access to remote scam centers, stablecoins and fintech apps facilitating money movement, and social media platforms serving as recruitment and communication channels. Jacob Sims points out the uncomfortable truth that criminal activity drives enormous traffic on platforms like Facebook, where many trafficking victims are initially recruited.
Meanwhile, Hammerli Sriyai notes that platforms are actually rolling back content moderation efforts. WhatsApp, for instance, relies entirely on user reports rather than proactive monitoring. And tech companies face a fundamental tension: any detection system aggressive enough to catch scams at scale will inevitably flag some legitimate users too, and platforms are understandably reluctant to accept those false positives.
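That false-positive bind is easy to see with some back-of-the-envelope arithmetic. The numbers below are illustrative assumptions, not figures from any platform: even a detector that catches 95% of scam accounts while wrongly flagging only 1% of legitimate ones ends up blocking far more innocent users than scammers, simply because scam accounts are rare.

```python
# Illustrative base-rate arithmetic for scam detection.
# All rates below are hypothetical assumptions, not platform data.
scam_rate = 0.001           # assume 1 in 1,000 accounts is a scam
recall = 0.95               # detector catches 95% of scam accounts
false_positive_rate = 0.01  # detector wrongly flags 1% of legitimate accounts

accounts = 1_000_000
scams = accounts * scam_rate
legit = accounts - scams

true_flags = scams * recall               # scam accounts correctly flagged
false_flags = legit * false_positive_rate # legitimate accounts wrongly flagged

precision = true_flags / (true_flags + false_flags)
print(f"Scam accounts flagged:        {true_flags:,.0f}")
print(f"Legitimate accounts flagged:  {false_flags:,.0f}")
print(f"Precision of the detector:    {precision:.1%}")
```

Under these assumed numbers, roughly nine out of ten flagged accounts belong to legitimate users, which is exactly the trade-off that makes platforms hesitant to moderate aggressively.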
Where the money flows
There’s an interesting split happening in the financial world too. Traditional banks and even cryptocurrency exchanges have clear incentives to crack down on scam activity – every dollar lost to fraud is money leaving their platforms and damaging customer trust. But social media? That’s a different story entirely. Engagement is engagement, even when it’s criminal.
The recent U.S. and UK sanctions against Southeast Asian crypto scam networks show governments are trying to follow the money. But as these operations become more automated and sophisticated, tracking it becomes far harder.
What happens when the humans leave?
Stephanie Baroud from Interpol offers a sobering perspective: AI probably won’t end human trafficking – it will just reshape it. Criminal networks have spent years building sophisticated trafficking operations, and they’re not going to dismantle them just because AI can handle the scamming. They’ll simply pivot to other criminal enterprises.
The situation in Myanmar shows how persistent these problems are – scam centers continue booming despite crackdowns, now using technology like Starlink to operate in remote areas. And as CNN reported, the rapid response from SpaceX shows that tech companies can act when evidence emerges – but the burden of detection falls heavily on investigators.
So where does this leave us? We’re heading toward a future where scam operations become more efficient, less detectable, and potentially more profitable. The human tragedy of trafficking might become less visible, but the criminal networks behind it will likely become more entrenched. It’s a classic case of technology solving one problem while creating several new, more complex ones.
