It's not just Alexa that's listening in on your household and telephone conversations

Experts reveal sneaky way your phone listens in on your conversations - and how to stop it

It was long thought to be a myth and dismissed by big tech companies.

But experts have revealed how listening in on your conversations has become a multi-billion-dollar industry.

Earlier this week, a leak from a leading marketing firm appeared to confirm how companies use microphones on your smart devices to eavesdrop before selling the data to advertisers.

'You can be talking to one of your friends about going on a vacation to Portugal through a phone call, and then a day later or that same day, what do you see? An advertisement for a trip,' data security expert Andy LoCascio told DailyMail.com. 

The leak came from a pitch deck given by CMG, a marketing partner of Facebook, Amazon and Google.

The deck - which appears to have been made for prospective clients - detailed CMG's 'Active-Listening' software, which collects data from people by listening in on their conversations.

Active-Listening software can be enabled through any app on an Android phone or iPhone, and other devices like smart home assistants can listen in too, LoCascio said.

What's more, these devices are listening practically all the time, not just when you're intentionally using your microphone to make a phone call or talk to Alexa, for example. 

'For most devices, there is no device state when the microphone is inactive. It is nearly always active when Siri is present on the device or any other voice activated assistant is present,' LoCascio said.  

Companies that want to capture your voice data and sell it often gain access to your microphone through apps. 

Typically, apps are granted permission to use your microphone through a clause 'buried in the myriad of permissions you accept when installing a new app,' he added. 
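How little that permission actually involves is easy to show. The sketch below is a simplified, hypothetical Android example written for illustration - it is not code from CMG or any named app - and it captures the two steps an app needs: declare the RECORD_AUDIO permission in its manifest and show the standard one-tap system prompt.

```kotlin
// Hypothetical illustration only - not code from CMG or any named app.
// Step 1: the app's AndroidManifest.xml declares a single permission:
//   <uses-permission android:name="android.permission.RECORD_AUDIO" />
// Step 2: at runtime, the app shows the standard one-tap system prompt.

import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class ListenerActivity : AppCompatActivity() {

    /** Asks for microphone access if the user has not already granted it. */
    fun ensureMicrophoneAccess() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED

        if (!alreadyGranted) {
            // Pops the familiar 'Allow ... to record audio?' dialog;
            // a single tap is the only consent the system requires.
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.RECORD_AUDIO), REQUEST_CODE_MIC
            )
        }
    }

    companion object {
        private const val REQUEST_CODE_MIC = 1
    }
}
```

The same switch works in reverse: both Android and iOS let you revoke microphone access for any individual app in the system settings, and recent versions of both show an on-screen indicator whenever the microphone is in use.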

That means that many users are consenting to being tapped without even realizing it. 

'The problem is, the form of consent is an all-or-nothing Faustian bargain,' data privacy expert and consultant Sharon Polsky said.

'So many websites say, "we collect information from you and about you. If you use our website, you've consented to everything that we do." You have no way of opting out,' she added. 

LoCascio explained that this buried consent is how CMG and other companies get away with it, even in states such as California, where wiretapping laws prohibit recording somebody without their knowledge.

'To be perfectly clear, there are no laws about this. If we give somebody permission to use the microphone on our device, and we click off all the other terms of service that none of us ever read, they can certainly use it,' LoCascio said. 

That lack of protective legislation has 'created an entire data broker industry that's now worth billions,' Polsky said. 

This industry's rapid growth is due in part to the development of highly sophisticated large language models such as ChatGPT.

These extremely powerful AI tools have made it easier and faster for advertisers or other third parties to mine our voice data for valuable information, LoCascio noted.

'All I have to do is take one of those transcripts, drop it in the ChatGPT box, and then ask it a simple question, like, "please tell me what products and services I could market to somebody based on this conversation",' he explained.
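The step LoCascio describes can also be automated. The sketch below is a hypothetical illustration, not CMG's pipeline: it assumes an OpenAI-style chat completions endpoint and an API key stored in an OPENAI_API_KEY environment variable, and simply sends a captured transcript along with the same question he quotes.

```kotlin
// Hypothetical illustration of the step described above - not CMG's
// actual pipeline. Assumes an OpenAI-style chat completions endpoint
// and an API key in the OPENAI_API_KEY environment variable.

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun suggestAdsFor(transcript: String): String {
    val prompt = "Please tell me what products and services I could " +
            "market to somebody based on this conversation:\n\n" + transcript

    // Crude JSON escaping for illustration; a real system would use a JSON library.
    val escaped = prompt.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n")
    val body = """{"model":"gpt-4o-mini","messages":[{"role":"user","content":"$escaped"}]}"""

    val request = HttpRequest.newBuilder(URI.create("https://api.openai.com/v1/chat/completions"))
        .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // Returns the raw JSON response; the marketing suggestions sit in
    // choices[0].message.content.
    return HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
        .body()
}
```

His point is that the analysis itself is now trivial and near-instant; the only scarce ingredient is the transcript.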

Once that voice data is captured, it can be sold to advertisers to direct and inform their targeted marketing. But it can also be sold to other clients, who could be using it for entirely different reasons.

'They could be capturing those conversations for anything,' LoCascio said. 

'It's one thing to say they're doing it for ads, and they can claim that, but they sell that information blindly to other people. And they don't scrub it, so they basically sell an audio transcript,' he added.

Other buyers of voice data include insurance companies, which can use it to set personalized insurance rates, and the federal government, Polsky said.

'One of the purchasers of our information - information about us - everything from our opinions, our predilections, our associations, our travel routes, is the government,' she said. 

And there are other insidious entities that want to get their hands on our voice data too, such as 'people from the dark web that want to profit from scamming us,' Polsky said. 

That means that mentioning your Social Security number or other sensitive personal details within earshot of a listening device could put you at risk of identity theft, LoCascio said.  

CMG (Cox Media Group) is an American media conglomerate based in Atlanta, Georgia. The company provides broadcast media, digital media, advertising and marketing services, and it generated $22.1 billion in revenue in 2022. 

CMG did not respond to DailyMail.com's request for comment.