Amazon, Apple, and Google hold most of the market share for voice assistants. With recent coverage of their human review processes, many consumers are concerned about their privacy. That's why we put together this infographic, so people like you can more easily control your privacy settings.
Share the Graphic
Hi,
If you have a voice assistant in your home or on your phone, have you ever been concerned that someone from the company could listen to your voice recordings?
Recent news coverage confirms that suspicion.
At the end of July, The Guardian reported that people at Apple were regularly listening to recordings of deeply personal events such as conversations with doctors, sexual encounters, and other moments. While the effort was designed as a quality control measure, users likely had no idea that some of their utterances were being recorded and reviewed by humans.1
Since then, Apple has temporarily suspended its human review program.2 Google has been forced to pause its own review program in the EU, and Amazon now gives users the ability to opt out.
Mozilla has put together a guide to help you change the privacy settings on your voice assistant.
Click here to learn how to change your settings and to share the graphic.
Even with these additional privacy controls, these programs still raise a number of concerns that haven't been resolved, including:
- For users who don't opt out, workers at Amazon and Google are still listening to a small segment of recordings from people's smart voice assistants, and despite efforts to anonymize that data, recordings can contain sensitive and personally identifiable information.3
- In many cases, recordings were made even though no one said the wake word ("Hey Google"), often because someone said something that merely sounded like it (such as "Syria," which can trigger Apple's Siri). People may not have known the device had been triggered and was recording them.4,5
- Until recent reporting on this issue, these review programs were not clearly disclosed to users, and some, like Amazon's, did not give users the ability to opt in or out. What's more, news continues to break that other companies, like Facebook, have also employed human review of other types of users' voice content without prior disclosure. This raises questions about what meaningful consent should look like when people's data is used to train models that improve a product.6,7
We will keep monitoring developments on this issue and, of course, continue to advocate for disclosure and stronger privacy protections in publications like our Privacy Not Included guide and beyond.
In the meantime, it's important that consumers like you know how to adjust the privacy settings on your own voice assistant. Will you share this graphic to spread the word?
Thanks,
The Mozilla Team
References:
1. Alex Hern, “Apple contractors 'regularly hear confidential details' on Siri rec...,” The Guardian, July 26, 2019.
2. Taylor Mahlandt, “How to Stop Amazon, Apple, or Google From Listening to Your Smart S...,” Slate, Aug. 6, 2019.
3. Sarah Perez, “41% of voice assistant users have concerns about trust and privacy...,” TechCrunch, May 2019.
4. James Vincent, “Yep, human workers are listening to recordings from Google Assistan...,” The Verge, July 11, 2019.
5. Jeremy Horwitz, “Apple Siri contractors often hear up to 30 seconds of accidental re...,” VentureBeat, July 26, 2019.
6. Sarah Frier, “Facebook Paid Contractors to Transcribe Users' Audio Chats,” Bloomberg, Aug. 13, 2019.
7. Paul Sawers, “Apple and Google halt human voice-data reviews over privacy backlas...,” VentureBeat, Aug. 2, 2019.