Are Apple subcontractors listening to and transcribing our conversations, recorded without our knowledge by the voice assistant Siri? That is what is suggested by a report and a complaint filed in France by the Human Rights League (LDH) on Thursday, February 13, against the Cupertino giant. Apple is accused of violating privacy, unlawfully processing personal data and engaging in misleading commercial practices, according to revelations by Le Monde and Radio France's investigative unit this Friday, February 14.
At the origin of the proceedings is Thomas Le Bonniec, a former employee who explains that he was hired in Ireland in 2019 by Globe Technical Services, an Apple subcontractor, as a “data operations annotation analyst.” In practice, he checked and, where necessary, corrected the transcriptions of conversations of Siri users in his native language, French. These were users “talking to their Siri assistant” or “recordings captured without their knowledge when the machine was triggered by mistake,” explains the whistleblower, quoted by our colleagues.
Recordings made in the most intimate moments of private life, according to the LDH
“There are recordings that are made while you are in the car, with several people present, which means that no one knows what is being recorded. But it can also happen in the most intimate moments of private life,” said LDH president Nathalie Tehio, speaking on France Inter this Friday.
Thomas Le Bonniec explains that he had to process nearly 1,300 recordings every day, while other colleagues labelled the conversations with keywords. The complaint refers to several tens or even hundreds of millions of recordings, though a precise figure is hard to establish for French users, the only ones covered by the complaint filed in France.
"We are asking the public prosecutor to investigate, because we do not have the means (...). What makes us believe this whistleblower is that he took screenshots and that as a result, we have a certain amount of evidence that there are transcriptions where, clearly, people do not know that they are being recorded", specifies the president of the LDH.
In 2019, The Guardian had already revealed that conversations were being recorded and transcribed, including when Siri had not been intentionally activated. At the time, Apple suspended its Siri grading program.
For the president of the Human Rights League, the practices exposed in 2019 simply continued. In each case, the authors of the complaint argue, no “informed” consent was requested or obtained from users before the recordings were made, recordings which in themselves amount to collections of personal data. Yet such consent is a prerequisite imposed by the General Data Protection Regulation (GDPR) within the European Union, the LDH explains.
Users do not realize "that there will be (...) hundreds of employees who will listen and who will transcribe and verify the transcriptions"
In 2019, Apple explained that the data recorded by Siri was not used to build marketing profiles for targeted advertising, but only to improve Siri.
Apple then explained, in a blog post, that it had made changes to ensure that Siri lived up to its privacy commitments. Among the new measures, the group committed to ensuring that “only Apple employees are authorized to listen to audio samples of interactions with Siri,” and only when customers have chosen to participate. Having subcontractors listen to and correct transcriptions therefore does not appear to be part of the plan. The American giant also said it was working “to delete any recording that is considered to be an unintentional trigger of Siri.”
Contacted by 01net.com, Apple confirmed to us this Friday, with reference to another, class-action procedure under way in the United States over similar allegations, that the group does not keep audio recordings of Siri interactions unless the user explicitly consents. In that case, the recordings are used only to improve Siri, the American company specifies, adding that users can reverse their decision at any time.
However, the recordings contain “a lot of very personal data: people talk about their illnesses, their political opinions, their union membership, their sexuality,” explains the whistleblower, quoted by our colleagues.
“Even when you use Siri consciously (…), I am not convinced that simply saying ‘it is to improve the service’ makes people realize that there will be (…) hundreds of employees who will listen and who will transcribe and verify the transcriptions,” emphasizes the president of the LDH on France Inter.
Another decision on the same subject is expected this Friday, this time in the United States, where numerous plaintiffs have joined a class action denouncing eavesdropping and transcriptions via Siri as a violation of privacy.
To end the proceedings, the Cupertino giant has proposed a settlement of 95 million dollars, on which a US court must rule this Friday. Last January, Apple told us that it “settled this case to avoid additional litigation and to move forward”. Siri data is used to "improve" the tool, Apple repeated, with the group stating that it is "constantly developing technologies to make Siri even more respectful of privacy."
Editor's note: This article was amended after its publication on Friday, February 14, to include comments received from Apple.
