Apple has come under fire again after a whistleblower raised concerns about the privacy and confidentiality of its voice assistant, Siri, in a recent Guardian report.
Siri is available across all Apple devices including iPhones, iPads, Apple Watches, HomePods and Apple TV. Questions about the privacy features of the voice assistant have been raised before, but now an Apple whistleblower has come forward to discuss the quality control checks that are carried out on Siri data.
The Guardian reported that a small selection of Siri recordings are passed on to Apple contractors around the world to be analysed for ways to improve the service. Apple responded to the report by stating that less than 1% of Siri activations are reviewed by humans, and that the recordings are not associated with users’ Apple IDs.
The contractors grade the snippets of audio and note down whether the AI handled the request correctly or if there were any errors, such as Siri being ‘woken up’ accidentally. Apple has said that its goal is to improve Siri’s ability to understand and help users, and that the vetting is a vital part of the process.
The Siri recordings are said to have included private discussions, overheard doctors’ appointments, business deals and even illicit activity. These recordings have personal user data attached to them, including names, locations and contact details, which could be misused by the contractors.
Siri can also be activated accidentally, meaning that words or sounds other than the usual “Hey Siri” command can trigger the system.
Apple’s privacy documentation does not explicitly state that humans may be listening to recordings made while users interact with the Siri voice assistant.
Users can turn off location data for Siri on their Apple devices, but this does not mean that their interactions are exempt from the vetting process. Some users have also opted to install a ‘prevent server-side logging of Siri commands’ iOS profile from GitHub, published by an information security researcher.
Many users are now calling for an opt-out procedure to be put in place so that they can prevent their Siri recordings from being saved to Apple’s servers. Unfortunately, this is yet another case of a company not taking its users’ data seriously enough, and it is unclear whether Apple will respond to the requests and negative press. However, this instance does serve as a solid reminder that your personal data is at risk whenever you use any voice assistant software.
Do you regularly use Siri? Does it concern you to know that people could be listening in? Let us know your thoughts and tweet us @Hyve!