According to CNET, the controversial Neon app that pays users for recording their phone calls has quietly returned to both the iOS App Store and the Google Play Store after abruptly going offline in September. Founder Alex Kiam’s company is currently offering users 30 cents per minute, up to $30, through 5 p.m. ET on Thursday, November 6, with the regular rate after that window remaining unclear. The app previously soared into the top five of the download charts by promising payments for call recordings that are sold to companies training AI models. The relaunch follows a security flaw discovered by TechCrunch that allowed access to other users’ calls, which Kiam said would be addressed. Updated terms of service from November 3 confirm Neon can sell recordings for AI training purposes.
The Privacy and Legal Shift
Here’s the thing that really stands out about this relaunch: Neon now only records and pays for calls between Neon app users. That’s a massive change from its original model, and it basically transforms this from a potentially sketchy surveillance operation into something closer to an opt-in service. Before, recording calls with non-users raised serious legal questions, since many states require notification and consent from everyone on the call – including people who didn’t even know they were being recorded. I mean, think about it: if your friend installed this app and recorded your conversations without telling you, that’s legally problematic in many places.
But by shifting to an app-to-app model, Neon might have found a clever workaround. Now everyone involved knows what’s happening because they’ve all signed up for the same service. It’s still creepy as hell, but at least it’s transparently creepy. The company says it anonymizes call information, but privacy experts warn that AI could still infer user identities even from supposedly anonymized data. And let’s be real – when you’re dealing with the intimate details of people’s phone conversations, how anonymous can it really be?
The AI Training Economy
This whole situation reveals something fascinating about the current state of AI development. Companies are absolutely desperate for real-world training data, and they’re willing to pay real money for it. At 30 cents per minute, Neon is essentially creating a micro-economy around human conversation. Kiam basically admitted this when he told CNET “we’re giving people free money” for something “they would do anyway.”
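For a sense of the economics, here’s a quick back-of-the-envelope sketch (the rate and cap come from the CNET report; the function itself is just my illustration). At 30 cents per minute with a $30 cap, the promotion tops out at 100 minutes of recorded calls:

```python
RATE_PER_MINUTE = 0.30   # promotional rate reported by CNET
PAYOUT_CAP = 30.00       # maximum payout during the promotional window

def payout(minutes: float) -> float:
    """Earnings for a given number of recorded call minutes, capped at $30."""
    return min(minutes * RATE_PER_MINUTE, PAYOUT_CAP)

print(payout(20))   # 20 minutes of calls → $6.00
print(payout(150))  # cap already reached at 100 minutes → $30.00
```

In other words, even a heavy caller hits the ceiling fast – which tells you the promotion is about acquiring users, not about the per-minute value of any one conversation.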
But is this sustainable? And more importantly, is it ethical? We’re talking about selling the most personal form of communication – our actual conversations – to train corporate AI systems. The updated terms of service grant Neon broad rights to “publicly display, reproduce, and distribute call recordings in any media formats.” That’s… concerning, to say the least. What happens if sensitive information slips through, or if the anonymization isn’t as thorough as promised?
Broader Implications
Look, this isn’t just about one sketchy app. Neon represents a growing trend where our personal data and interactions become commodities in the AI gold rush. The hunger for real-world training data is driving innovation in some areas and raising serious ethical questions in others – and consumer apps like this one are just the most visible edge of it.
So where does this leave us? On one hand, you’ve got people making easy money from something they were going to do anyway. On the other, you’ve got privacy experts warning against the whole concept. The truth is probably somewhere in between – this model might work for some people who are comfortable with the trade-offs, but it’s definitely not for everyone. And given Neon’s rocky history with security flaws, I’d be pretty cautious about jumping back in, even with the new app-to-app limitations.
