Not so long ago, the idea that your phone calls might be recorded without your knowledge was the stuff of scandal. We're talking tabloid exposés about journalists hacking voicemail, governments running covert surveillance programmes or whistleblowers unmasking corporate eavesdropping. Call recording was synonymous with secrecy, privacy violations and public outrage, and in the age of AI, it looked set to become one of the technology's biggest privacy concerns.
Fast forward to today, and a new app has turned that entire narrative on its head. Neon, currently the second most popular social app on Apple's App Store, is paying users to do what was once unthinkable: openly record their phone calls, hand them over and sell the data to AI firms. What was once treated as a violation is now being packaged as an opportunity.
It’s a side hustle for the smartphone generation.
So, how did we get here? And what does this shift say about our changing relationship with privacy, technology and money?
From Privacy Breach To Business Model
The UK press has a long history of treating phone tapping and call recording as scandalous. The News of the World phone-hacking scandal in the 2000s is one of the most notorious examples, a saga that destroyed reputations, shuttered a newspaper and reinforced the idea that secretly accessing private conversations was deeply unethical.
The cultural memory of those years still lingers, and yet, somewhere along the way, our attitudes have softened. Social media platforms normalised the idea that our personal data (messages, photos and even our locations) could be shared and monetised in exchange for free access to services. Slowly, the outrage gave way to a kind of resigned acceptance.
Apps like Neon mark the latest phase in this evolution – privacy isn't just something we give away in the background anymore, it's something we can actively sell.
The Allure of a Side Hustle
What makes Neon so compelling is that it taps directly into the booming side-hustle economy. For many people, particularly younger generations struggling with rising costs of living, the idea of earning money from everyday activities is irresistible. If you're already making phone calls, why not get paid for it? It's the easiest passive income you could possibly dream of.
The app positions itself less as surveillance and more as empowerment. You're not being spied on, you're actively choosing to share. You're not a passive victim, you're a participant in a new kind of data marketplace. In a world where gig economy apps already pay people to deliver food, rent out spare bedrooms or sell second-hand clothes, Neon extends that model into personal conversations.
But the implications are very different. You're not selling your time or your old belongings – rather, you're selling your voice, your thoughts, your conversations. That's a shift with huge social and ethical consequences.
AI's Insatiable Appetite for Data
The real driver behind this trend is artificial intelligence. Large language models and conversational AI systems need vast amounts of real-world speech data to function, improve and sound natural. Traditional datasets (scripted conversations and curated dialogues, for instance) only go so far. What AI firms really want is authentic, unscripted human interaction.
This is where apps like Neon come in. They provide raw, unfiltered conversational data at scale, directly from ordinary users. For AI firms, it's a goldmine. For individuals, it's a way to cash in on something that, until now, was largely extracted for free without their knowledge.
But, at the same time, the value of this data raises uncomfortable questions – do users really understand what they're giving up? And once their voices are uploaded into an AI system, who controls how they're used?
Changing Attitudes Towards Privacy
The rise of Neon suggests a profound cultural shift in how we think about privacy. Once, it was seen as a right to be defended, but now, increasingly, it's treated as a commodity to be traded.
For younger users, who have grown up with social media, the boundary between public and private life has always been more fluid. Posting personal updates online, sharing photos, even broadcasting live video – all of this has been normalised. Call recording feels like the next logical step.
For older generations, particularly those who remember the days when phone hacking was a national scandal, the trend may feel unsettling. Yet even among sceptics, the logic of "getting paid for what's already happening anyway" has a certain pull.
A Side Hustle Or a Slippery Slope?
The arrival of Neon raises bigger questions about the direction of the digital economy. If call recording can be turned into a side hustle, what other forms of personal data could soon be openly monetised? Text messages? Video chats? Health information from wearable devices?
For startups, this could spark a wave of innovation in the "data-for-cash" sector. For regulators, particularly in the UK and Europe where data protection laws are stringent, it will ignite urgent debates about consent, ownership and exploitation.
And for users, it could mark a turning point in how we perceive our relationship with technology. The narrative has flipped – the scandal isn't that someone is recording your calls without permission, but that you might be missing out on getting paid to do it yourself.
Ultimately, Neon represents both an opportunity and a warning. On one hand, it democratises the data economy by letting individuals profit directly from their own information. On the other, it risks normalising the idea that privacy is something to be casually sold, rather than carefully protected.
What was once a scandal is now a business model. And as AI's appetite for human data only grows, we may find ourselves rethinking not just what we're willing to share, but whether we can afford not to.