Apple’s Siri is eavesdropping on your conversations, putting users at risk: Report

If you own an iPhone, it might be time to stop relying on Siri and turn the listening assistant off.

Apple's Siri can be activated by something as mundane as the sound of a zipper, leaving any conversation open to surveillance, complete with user data such as location and contact details. Whether you're talking to your doctor or having an intimate conversation with your partner, Siri may be listening in on you, according to a new report from The Guardian.

“We should learn how they actually work. They are always listening for what’s called a ‘watchword’: they’re listening for the words Siri or Alexa, or they’re listening for ‘OK Google,’ so that means they’re actually listening all the time," Lifewire.com Editor-in-Chief Lance Ulanoff said on "Mornings with Maria" Monday. "It’s not people. It’s a program that is trying to match up what you said with a command that it understands so it can do something. There’s nobody sitting there trying to figure it out."
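In rough terms, the wake-word flow Ulanoff describes can be sketched as a loop that ignores everything until a trigger phrase is heard, then hands the rest to a command matcher. The snippet below is an illustrative Python sketch only; the trigger phrases, the text-based snippets and the command handler are assumptions made for demonstration, not Apple's, Amazon's or Google's actual implementation, which runs acoustic models over live audio on the device.

```python
# Illustrative sketch only: a simplified wake-word loop in the spirit of
# Ulanoff's description, not any company's actual code. Real assistants run
# on-device acoustic models over a live audio buffer; here, already-transcribed
# text snippets stand in for that audio.

WAKE_WORDS = ("siri", "alexa", "ok google")  # hypothetical trigger phrases

def handle_command(command: str) -> str:
    """Stand-in for the command matcher: a program, not a person, as Ulanoff notes."""
    if "weather" in command:
        return "Fetching the forecast..."
    return "Sorry, I didn't catch that."

def listen_loop(snippets):
    """Scan each incoming snippet and act only when a wake word starts it."""
    for snippet in snippets:
        text = snippet.lower()
        match = next((w for w in WAKE_WORDS if text.startswith(w)), None)
        if match:
            command = text[len(match):].strip()
            print(handle_command(command))
        # Snippets with no wake word are simply dropped in this sketch.

if __name__ == "__main__":
    listen_loop([
        "Siri, what's the weather today?",
        "just a private conversation",  # ignored: no wake word
    ])
```

The Guardian report's point is that the trigger step is imperfect: sounds like a zipper can be mistaken for the wake word, so audio that was never meant for the assistant can end up being recorded.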

Apple contractors are said to have listened in on users' medical information and even recordings of personal conversations between couples, according to the report.

Ulanoff rejects the notion that Apple’s iPhones and gadgets are listening in on their customers.

“Your phones aren’t listening to you when you think they’re not. There are so many different ways they can triangulate what you’re thinking about, even with the relationships you have. We are a web of connections on social media, and through search, and through all kinds of data. That’s how it happens. It’s not that they’re listening,” he said.

Lou Basenese, Disruptive Tech Research founder and chief analyst, believes otherwise, saying, “We are giving them unfettered access to our data, and look, they can’t resist the temptation: Data is a drug to big tech. It allows Google and Facebook to serve you better ads, and to think that they’re not going to use this data, that it’s just going into the cloud being crunched by algorithms and not being put into their business, I think is just a little bit naïve.”

Apple has responded to the claims, saying that only 1 percent of recordings is used to improve Siri's responses to user requests and to measure when the device is activated accidentally. However, even 1 percent isn’t a small number, with 500 million Siri-enabled devices reportedly in use.

Siri is not the only assistant whose user recordings have been listened to. In April, it was revealed that Amazon’s voice assistant Alexa was at times recording private conversations. In July, it also emerged that Google’s Assistant was doing the same.

“Amazon, Apple will take samples of anonymized data, samples of audio, to try and improve the speech-to-text and speech-understanding technology over time. If they don’t do that, Siri won’t understand you in the future. It won’t improve,” Ulanoff said.
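The grading process Ulanoff and Apple describe amounts to pulling a small, anonymized sample of clips for human review. The sketch below is a hypothetical Python illustration of that sampling step; the field names, the anonymization rule and the 1 percent rate are assumptions based on the figure Apple cites, not a description of Apple's actual pipeline.

```python
import random

# Hypothetical sketch of sampling ~1 percent of anonymized clips for human
# grading, as described in the article. Not Apple's actual review pipeline.

SAMPLE_RATE = 0.01  # roughly the "1 percent" figure Apple cites

def anonymize(recording: dict) -> dict:
    """Drop identifying fields (assumed here to be just 'user_id') before review."""
    return {"audio": recording["audio"], "transcript": recording["transcript"]}

def sample_for_review(recordings: list) -> list:
    """Randomly select about 1 percent of clips and anonymize them for grading."""
    return [anonymize(r) for r in recordings if random.random() < SAMPLE_RATE]

if __name__ == "__main__":
    fake_recordings = [
        {"audio": f"clip_{i}.wav", "transcript": "set a timer", "user_id": i}
        for i in range(10_000)
    ]
    reviewed = sample_for_review(fake_recordings)
    print(f"{len(reviewed)} of {len(fake_recordings)} clips selected for grading")
```

Even at that rate, the article's point stands: across hundreds of millions of active devices, a 1 percent sample still represents a very large number of recordings reaching human reviewers.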