Turns Out Personal Assistants Siri and Cortana Aren't As Personal As You Think

Whatever you say to Apple and Microsoft's personal assistants Siri and Cortana may not be as private as you think.

Both companies store data input into their voice assistants, including voice commands, location information, and, well, basically everything you say. Apple explains its recording and storage policy in its privacy policy for iOS 8.1. The policy, which is easy enough to find if you're looking for it but not something anyone would stumble across otherwise, spells out every way your interactions with your phone are used by Apple, and it bolds the key language so that it stands out.

Microsoft isn't quite as blatant in spelling out how it records data input into Cortana in its Windows Phone 8.1 privacy policy, but it does point out repeatedly that the company stores your data.

Both companies also make it fairly easy to shut off these features, but doing so makes the personal assistants more or less useless.

Why are Apple and Microsoft doing this?

Apple and Microsoft spell out that they record and store data input into their voice assistants to make the experience of using them better. Both do that on a personal level -- getting to know what you mean when you verbalize certain commands -- and on a broader level among all users of the app.

It's not as scary as it seems

Apple addressed consumer fears about its data handling to Wired in an April 2013 article. In that story, Apple spokeswoman Trudy Muller told the magazine that the company does store data for two years, but it becomes anonymous after six months.

"Once the voice recording is six months old, Apple 'disassociates' your user number from the clip, deleting the number from the voice file," she said in a call to the magazine. "But it keeps these disassociated files for up to 18 more months for testing and product improvement purposes."

Apple also makes it clear in another privacy policy section on its website that it needs to access data in order to improve your experience, while emphasizing that the data is encrypted and tied to a random identifier rather than to your Apple ID.

That random identifier can be reset at any time by turning Siri and Dictation off and back on, "effectively restarting your relationship with Siri and Dictation." Turning Siri and Dictation off deletes the user data associated with your random identifier, and "the learning process will start all over again."

Microsoft does not say how long it stores data from customers who allow it to do so, but the company does offer detailed instructions on how to turn Cortana's data storage off on its "Cortana and my privacy" page. It's possible to have a fairly strong level of control over what information Microsoft receives. You could, for example, allow Cortana access to your location but not to dictated emails.

It's also possible to delete your Cortana data saved both on your phone and in the cloud, and Microsoft provides instructions for doing that as well.

Apple and Microsoft need to do this (sort of)

It's reasonable for Apple and Microsoft to use data input into Siri and Cortana to improve their voice assistants. But consumers should understand that this is happening and consider that digital privacy is somewhat of an illusion -- at least if you don't take steps to manage your digital profile.

For example, the following search is now sitting on my iPhone and stored on Apple's servers, associated with me for the next six months.

Siri screenshot. Source: Author.

That would be bad if I planned to commit a crime, but it's no different from browser history or any other data that can be extracted from a computer, a phone, or even the electronic tags used to pay tolls.

Be careful, but be reasonable

Yes, it's a little disturbing to know that Apple and Microsoft have records of your requests to find stores selling hemorrhoid cream or late-night queries for White Castle locations, but it's mostly harmless. The info is not being used to do anything other than improve your experience with Siri or Cortana.

While there is more public info on Siri than on Cortana, both companies have gone to great lengths to explain that any storage of data is meant not to embarrass customers but to make the voice assistants more responsive. Knowing that this data is kept should make you a little more cautious, in the same way the Sony hacking incident made people realize that deleted email may not actually be gone.

So, do be a bit careful with what you confess to Siri or Cortana, but don't let privacy concerns stop you from using either one.

The article Turns Out Personal Assistants Siri and Cortana Aren't As Personal As You Think originally appeared on Fool.com.

Daniel Kline owns shares of Apple and Microsoft. He is surprised that it turns out Rockwell was right. The Motley Fool recommends Apple. The Motley Fool owns shares of Apple. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.

Copyright 1995 - 2015 The Motley Fool, LLC. All rights reserved.