When AI monetization meets basic security failures

Neon, a call-recording app that pays users to sell their audio to AI companies, went dark this week after TechCrunch discovered a security flaw that exposed every user's phone numbers, call recordings, and transcripts to anyone else using the app. The vulnerability was staggering in its simplicity: the app's servers enforced no access controls, so any logged-in user could manipulate routine API requests and read the entire database. TechCrunch found users earning money by secretly recording real-world conversations with people who had no idea they were being recorded and monetized. When founder Alex Kiam took the app offline, he sent users an email about "extra layers of security" but never mentioned the massive data exposure.
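What TechCrunch describes is the textbook broken-access-control (IDOR) pattern: the server returns whatever record the client asks for without checking who owns it. The sketch below is illustrative only, not Neon's actual backend; it assumes a toy Express/TypeScript API with a hypothetical in-memory store and header-based caller identification, and contrasts an endpoint that skips the ownership check with one that enforces it.

```typescript
import express, { Request, Response } from "express";

interface Recording {
  id: string;
  ownerId: string;
  phoneNumber: string;
  transcript: string;
}

// Hypothetical in-memory store standing in for the real database.
const recordings = new Map<string, Recording>([
  ["rec-1", { id: "rec-1", ownerId: "user-a", phoneNumber: "+15550100", transcript: "hi there" }],
  ["rec-2", { id: "rec-2", ownerId: "user-b", phoneNumber: "+15550101", transcript: "call me back" }],
]);

// Stand-in for real session/token auth: read the caller's id from a header.
function callerId(req: Request): string {
  return String(req.header("x-user-id") ?? "anonymous");
}

const app = express();

// BROKEN (the pattern described above): the server looks up whatever
// recording id the client asks for and returns it with no ownership check,
// so any logged-in user can walk the ids and read everyone's data.
app.get("/v1/recordings/:id", (req: Request, res: Response) => {
  const recording = recordings.get(req.params.id);
  if (!recording) {
    res.status(404).json({ error: "not found" });
    return;
  }
  res.json(recording); // leaks other users' numbers and transcripts
});

// FIXED: same endpoint, plus a server-side check that the requested
// recording actually belongs to the authenticated caller.
app.get("/v2/recordings/:id", (req: Request, res: Response) => {
  const recording = recordings.get(req.params.id);
  if (!recording || recording.ownerId !== callerId(req)) {
    res.status(404).json({ error: "not found" }); // don't reveal existence either
    return;
  }
  res.json(recording);
});

app.listen(3000);
```

The fix is deliberately boring: every data-returning endpoint re-checks authorization on the server, because anything enforced only in the mobile client can be bypassed by replaying API calls.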

For product teams building AI partnerships, this shows how monetization incentives can turn users into unwitting privacy violators while basic security gets ignored in the rush to scale. The app hit #2 in social networking downloads before anyone caught the glaring access-control failure, which means app store review processes aren't catching these fundamental issues either.

Neon turned call recording into a gig economy hustle, then leaked everyone's private conversations to strangers.

Source: TechCrunch, "Exclusive: Neon takes down app after exposing users’ phone numbers, call recordings, and transcripts."