Enkey Magazine

Apple stops the listening of Siri recordings


Many of us talk to virtual assistants through our smartphones. Artificial intelligence plays an ever more prevalent role in our lives, making everything easier, from sending a message to playing a playlist. One example among many is Siri, which is used more and more widely. But Apple has now stopped the listening of users' recordings; let's see why.

The revelation to the Guardian

The Guardian, the famous English newspaper, published on 26 July an article containing the anonymous account of a worker on Siri's quality program. Few people ever wonder what happens to the recordings of their conversations with virtual assistants. This revelation will surely open our eyes to an unknown world.

In fact, millions of conversations with Siri and the other virtual assistants are constantly listened to by contractors for quality control. The employees of those companies therefore have access to conversations that often concern private situations. The employee who spoke anonymously to the Guardian recounts how often they come across sexual, medical, or even criminal conversations.

Apple stops the listening of Siri recordings: they were activated accidentally

It seems that many of these conversations were recorded by accident. Siri sometimes switches itself on, activating geolocation and gaining access to contacts and other personal data. The riskiest device in this respect is surely the Apple Watch: the smartwatch activates Siri's recording every time the owner raises a wrist toward the face. For this reason the recorded conversations were often truly private; the users were actually talking to other people and did not know that Siri was listening.

Apple's response

After these revelations, Apple promptly stated that only 1% of the daily conversations held with the voice assistant are analysed. Those conversations are used solely to improve the performance of the dictation software. Furthermore, no conversation that is extracted and analysed can be linked to an Apple ID, and therefore to its owner.

Apple concluded by saying that the conversations are analysed in secure facilities and that employees must follow strict security rules. But the employee who gave the interview to the Guardian said they were worried all the same, because staff turnover inside those contractors is very high and not everyone follows the rules, especially after leaving the job.


Apple temporarily stops Siri's grading program to introduce new privacy features.

Apple's privacy settings are actually quite strict: if we do not want to share this information, we cannot use Siri at all. The virtual assistants of Amazon and Google, by contrast, allow users to limit how their information is shared.

Apple stops the recordings

Just one week after the article with the revelations about the recordings, Apple decided to stop all listening operations on the conversations. The company has therefore temporarily suspended Siri's grading program, in order to introduce a software update and give users the choice of whether or not to participate in the grading program.

This sudden change in the name of privacy surely strengthens the credibility of the Cupertino company. It promptly found a concrete and decisive solution to the problem, unlike Amazon and Google, which have ended up in the crosshairs of privacy watchdogs without, so far, remedying these faults.

For users not much will change: for now, the only difference is the assurance that no one will listen to the recorded conversations any more. In the future, a disclaimer will be added asking users whether or not they want to participate in the program.

