A new app that promises to pay people for recordings of their phone calls, which are then used to train AI models, has been disabled after a major security flaw was reported.
Neon is still in the top 10 of iOS free app downloads, but after TechCrunch on Thursday reported a security flaw it found in the service, the app's servers have apparently been taken offline.
The app can still be downloaded, but it’s no longer functioning. It’s unclear whether the service will return or how long it will take.
Emails to Neon Mobile, the company behind the app, have not been returned.
The company says it records only one side of the phone conversation, the caller's, which appears to be a way of skirting state laws that prohibit recording phone calls without all parties' consent.
The app doesn’t record regular phone app calls, only those made within the Neon app or received from another person using Neon.
Training AI using your data
According to the company’s FAQ, the call data is anonymized and used to train AI voice assistants. “This helps train their systems to understand diverse, real-world speech,” it says.
AI companies need increasing amounts of data to train their models, which may be why Neon is offering the monetary incentive.
“The industry is hungry for real conversations because they capture timing, filler words, interruptions and emotions that synthetic data misses, which improves quality of AI models,” said Zahra Timsah, CEO of i-Gentic AI, which works in AI compliance.
“But that doesn’t give apps a pass on privacy or consent,” Timsah said.
Neon could be pushing its luck with privacy and intellectual-property laws and regulations, which vary across states and countries, depending on how it handles consent and where the data ends up.
Howden said that even if the data is anonymized, AI tools could have little trouble retroactively identifying who is on the line in a Neon conversation.