How to Delete Siri Audio History and Opt Out of Siri Audio Sharing on HomePod

This article explains how to delete your Siri audio interaction history and opt out of sharing audio recordings with Apple on HomePod.

Earlier this year, it was discovered that Apple hired contractors to listen to a small percentage of anonymized ‌Siri‌ recordings to evaluate the virtual assistant's responses and improve its accuracy and reliability.

The Guardian revealed that Apple employees working on ‌Siri‌ often heard confidential details while listening to the audio recordings. Apple was subsequently criticized for not making it clear to customers that some of their ‌Siri‌ recordings were being used to improve the service.

Soon after the report, Apple suspended its ‌Siri‌ grading practices and promised users that it would introduce tools in a forthcoming update that would allow them to opt out of sharing their audio recordings.

With the release of iOS 13.2 in October, those new tools arrived on iPhone and ‌iPad‌, allowing users to delete their ‌Siri‌ and Dictation history and opt out of sharing audio recordings. With the release of the 13.2.1 software update for HomePod, the same tools are also available for Apple's smart speaker.

It's important to note that ‌HomePod‌'s ‌Siri‌ settings are independent of your iOS device's ‌Siri‌ settings, so if you want to opt out of ‌Siri‌ Audio Sharing and delete your ‌Siri‌ audio history completely, you'll have to disable them separately on each device.

The following steps show you how to access these settings on ‌HomePod‌. To learn how to disable them on iPhone, ‌iPad‌, and ‌iPod touch‌, see the next article below.

How to Opt Out of ‌Siri‌ Audio Sharing on ‌HomePod‌


  1. Launch the Home app on your iPhone, ‌iPad‌, or ‌iPod touch‌.

  2. Press and hold the ‌HomePod‌ button in your Favorite Accessories. If it's not in your Favorites, tap the Rooms icon at the bottom of the screen and select the Room where your ‌HomePod‌ is located using the room selector in the top-left corner of the screen.
  3. Tap the cog icon in the bottom-right corner of the ‌HomePod‌ card to open the device's settings.

  4. Tap Analytics & Improvements.

  5. If you don't want to let Apple review your recordings, toggle off the switch next to Improve ‌Siri‌ & Dictation.
Note that you can tap the link under the toggle for more information relating to Apple's ‌Siri‌ analytics policy.

How to Delete Your ‌Siri‌ Audio History on ‌HomePod‌


  1. Launch the Home app on your iPhone, ‌iPad‌, or ‌iPod touch‌.

  2. Press and hold the ‌HomePod‌ button in your Favorite Accessories. If it's not in your Favorites, tap the Rooms icon at the bottom of the screen and select the Room where your ‌HomePod‌ is located using the room selector in the top-left corner of the screen.
  3. Tap the cog icon in the bottom-right corner of the ‌HomePod‌ card to open the device's settings.

  4. Tap ‌Siri‌ History.
  5. Tap Delete ‌Siri‌ History.
Apple will inform you that your request was received and that your ‌Siri‌ and dictation history will be deleted. That's all there is to it.

In addition to these new ‌Siri‌ and Dictation-related privacy features, Apple also says it is making further changes to its human grading process that will minimize the amount of data that reviewers have access to.


How to Delete Your Siri Audio History and Opt Out of Siri Audio Sharing

This article explains how to delete your Siri audio interaction history and opt out of sharing audio recordings with Apple on iPhone, iPad, and iPod touch.

Earlier this year, it was discovered that Apple hired contractors to listen to a small percentage of anonymized ‌Siri‌ recordings to evaluate the virtual assistant's responses and improve its accuracy and reliability.

The Guardian revealed that Apple employees working on ‌Siri‌ often heard confidential details while listening to the audio recordings. Apple was subsequently criticized for not making it clear to customers that some of their ‌Siri‌ recordings were being used to improve the service.

Soon after the report, Apple suspended its ‌Siri‌ grading practices and promised users that it would introduce tools in a forthcoming update that would allow them to opt out of sharing their audio recordings.

With the release of iOS 13.2 in October, those new tools arrived. Apple now includes an option on iPhone and ‌iPad‌ that allows users to delete their ‌Siri‌ and Dictation history and opt out of sharing audio recordings. The following steps show you how to do both.

How to Opt Out of ‌Siri‌ Audio Sharing


  1. Launch the Settings app on your iPhone, ‌iPad‌, or ‌iPod touch‌.

  2. Scroll down and tap Privacy.
  3. Scroll to the bottom of the Privacy screen and tap Analytics & Improvements.

  4. If you don't want to let Apple review your recordings, toggle off the switch next to Improve ‌Siri‌ & Dictation.
Note that you can tap the link under the toggle for more information relating to Apple's ‌Siri‌ analytics policy.

How to Delete Your ‌Siri‌ Audio History


  1. Launch the Settings app on your iPhone, ‌iPad‌, or ‌iPod touch‌.

  2. Scroll down and tap ‌Siri‌ & Search.
  3. Tap ‌Siri‌ & Dictation History.

  4. Tap Delete ‌Siri‌ & Dictation History.
Apple will inform you that your request was received and that your ‌Siri‌ and dictation history will be deleted. That's all there is to it.

In addition to these new ‌Siri‌ and Dictation-related privacy features, Apple also says it is making further changes to its human grading process that will minimize the amount of data that reviewers have access to.


Apple Plans to Allow Siri to Default to Frequently-Used Third-Party Messaging Apps Later This Year

Apple plans to release a software update later this year that will enable third-party messaging apps like WhatsApp, Skype, and Facebook Messenger to work better with Siri, the company told Bloomberg's Mark Gurman.


Specifically, the update will enable Siri to default to the messaging app that a person uses most frequently to communicate with a given contact. For example, if an iPhone user almost always messages a friend via WhatsApp, Siri will automatically launch WhatsApp rather than Apple's own iMessage.

It will still not be possible to set third-party apps as default on an iPhone. Instead, the report claims Siri will decide which messaging app to use based on interactions with specific contacts. App Store developers will need to enable the new Siri functionality in their apps when available.
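For context, the hook that third-party messaging apps already use to surface themselves to Siri is SiriKit's INSendMessageIntent, which the new defaulting behavior would presumably build on. Below is a minimal sketch of such a handler; the class name and the simplified recipient matching are illustrative only, not Apple's or any particular app's actual implementation.

```swift
import Intents

// A sketch of the existing SiriKit hook for third-party messaging.
// Class name and the simplified recipient matching are illustrative.
final class SendMessageIntentHandler: NSObject, INSendMessageIntentHandling {

    // Siri calls this to match the spoken recipient ("John") against
    // the app's own contact list before composing the message.
    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([.needsValue()])  // ask the user who to message
            return
        }
        // A real app would look each INPerson up in its own user database.
        completion(recipients.map { .success(with: $0) })
    }

    // Called once the message content and recipients are resolved.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        let activity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        // Report success so Siri confirms the message was sent.
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}
```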

Currently, users must specify the third-party app they wish to use to message someone. Following the software update, a user could simply say "message John" and Siri might automatically send the message via WhatsApp or whichever app that user contacts John with most often.

This functionality will later be expanded to third-party phone apps for calls as well, but no timeframe was specified.

In a statement, Apple also defended the competitive landscape of the App Store:
Apple offers our users an experience that is only possible from the integration of hardware, software, and services. From the very first iPhone, we have included apps to provide customers with a great experience right out of the box for making phone calls, playing music, surfing the web, and more. With every generation of iPhone we have advanced the built in capabilities for our customers with a few default apps designed for great performance, long battery life, seamless integration, and industry-leading protections for security and privacy. We have also created the App Store, the safest place to get apps, so customers can choose from millions of apps to find the ones that further enhance their iPhone. In the few categories where Apple also has an app, we have many successful competitors and we're proud that their success is responsible for almost 2 million U.S. jobs in a thriving multibillion dollar market for developers. Our North Star is always to create the best products for our customers and that is why iPhone has the highest customer satisfaction in the industry.
More details to follow…



Spotify Testing Siri Support on iOS 13

Spotify has added Siri support to the latest beta version of its iOS app, allowing users to ask Siri to play songs, albums, and playlists in Spotify on an iPhone running iOS 13 or later, as noted by The Verge's Tom Warren.


Apple opened up its SiriKit framework to third-party music, podcasts, audiobooks, and radio apps in iOS 13 and iPadOS, enabling users to use Siri to control audio playback in supported apps. It is now up to third-party apps to take advantage of this functionality, with Spotify and Pandora among the first to do so.
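For developers, taking advantage of this typically means handling SiriKit's INPlayMediaIntent in an Intents extension. The following is a rough sketch under that assumption; the class name is hypothetical and catalog lookup is omitted.

```swift
import Intents

// A sketch of an iOS 13 SiriKit media-intent handler of the kind
// music apps adopt for "play X on Spotify" style requests.
// Class name is illustrative; resolving the media item is omitted.
final class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the background,
        // where application(_:handle:completionHandler:) starts playback.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```

Responding with .handleInApp hands playback off to the main app running in the background, rather than requiring the extension itself to play audio, which suits streaming apps whose playback engines can't run inside an extension.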

When asking Siri to play a song, album, or playlist, users must specify "on Spotify" or else the feature defaults to Apple Music.


Spotify's inability to offer the same Siri integration as Apple Music was one of the tentpoles of the antitrust complaint it filed against Apple with the European Commission earlier this year.

Spotify has not provided a timeframe for rolling out Siri support to all users, but we will provide an update when that happens.


Apple Working on Siri Feature Allowing Back-and-Forth Conversations About Health Problems

Apple is working on a new Siri feature, planned for iOS 15 in fall 2021, that will allow users to have a back-and-forth conversation about health problems, according to internal documentation obtained by The Guardian.


The report does not offer any further details about the feature, but Siri will presumably become more capable of responding to physical and possibly mental health questions. Apple CEO Tim Cook has repeatedly said that Apple's health-related initiatives will be the company's "greatest contribution to mankind."

Apple has increased its presence in the health and fitness space over the past few years. In 2018, for example, it launched an ECG app for the Apple Watch that can detect signs of atrial fibrillation, a condition that can lead to potentially life-threatening complications such as stroke and cardiac arrest.

Also in 2018, Apple rolled out Health Records, a feature that allows patients to view medical records from multiple hospitals and clinics directly in the Health app on the iPhone, including allergies, vital signs, conditions, immunizations, lab results, medications, procedures, and other information.

Apple's Health app in iOS 13

Apple's internal documentation, which The Guardian obtained from a former Siri grader, also reveals the company's efforts to ensure that Siri responds as neutrally as possible to sensitive topics such as feminism:
In explaining why the service should deflect questions about feminism, Apple's guidelines explain that "Siri should be guarded when dealing with potentially controversial content." When questions are directed at Siri, "they can be deflected … however, care must be taken here to be neutral".

For those feminism-related questions where Siri does not reply with deflections about "treating humans equally", the document suggests the best outcome should be neutrally presenting the "feminism" entry in Siri's "knowledge graph", which pulls information from Wikipedia and the iPhone's dictionary.
In a statement, Apple said it aims for Siri to be "factual with inclusive responses rather than offer opinions":
Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.

Apple Will Continue to Review Computer-Generated Siri Transcripts Regardless of Opt-In Status

Apple has published a new support document with several questions and answers about its Siri quality evaluation process, also known as grading, to address any privacy concerns that customers may have.


As a refresher, it was recently discovered that Apple hired contractors to listen to a small percentage of anonymized Siri recordings — and review their corresponding computer-generated transcripts — to measure how well Siri was responding and to improve the assistant's accuracy and reliability.

The human review process likely existed for quite some time, but it was never mentioned in Apple's privacy policy, and it only became the subject of controversy last month after The Guardian reported that contractors "regularly" heard "confidential details" while listening to the Siri audio recordings.

Following that report, Apple quickly suspended its grading program and conducted a review of its policies. Apple has since apologized over the matter and says it will resume the evaluation process in the fall on an opt-in basis with improved privacy measures, including no longer retaining audio recordings.

In its FAQ, however, Apple says it will continue to review computer-generated transcripts of Siri interactions, even from users who do not opt in. The only way to avoid this will be to disable Siri entirely:
Is the only way for Siri not to retain my audio recordings and transcripts to disable Siri?

By default, Apple will no longer retain audio of your Siri requests, starting with a future software release in fall 2019. Computer-generated transcriptions of your audio requests may be used to improve Siri. These transcriptions are associated with a random identifier, not your Apple ID, for up to six months. If you do not want transcriptions of your Siri audio recordings to be retained, you can disable Siri and Dictation in Settings.
Prior to suspending grading, Apple says it reviewed less than 0.2 percent of Siri interactions and corresponding computer-generated transcripts.

As for users who do opt in, Apple says it has updated its review process to limit graders' exposure to audio recordings that are determined to have resulted from Siri being triggered inadvertently. Apple is also making changes to minimize the amount of data that graders have access to:
When you say you are minimizing the amount of data reviewers have access to, what does that mean? What will they still be able to hear?

We are making changes to the human grading process to further minimize the amount of data reviewers have access to, so that they see only the data necessary to effectively do their work. For example, the names of the devices and rooms you set up in the Home app will only be accessible by the reviewer if the request being graded involves controlling devices in the home.
Apple says it will work to delete any recording which is determined to have resulted from Siri being triggered inadvertently.

The changes to Siri will be implemented in a future iOS update released this fall, which will likely introduce a toggle switch for grading. For more details, read Apple's support document and its related press release.


Apple Apologizes Over Siri Privacy Concerns, Will Resume Grading Program in Fall With Several Changes

Apple today announced that it will resume its Siri quality evaluation process in the fall with several privacy-focused changes.


Going forward, Apple will only gather audio samples from users who opt in to the grading program, and those who participate will be able to opt out at any time. And when a customer does opt in, only Apple employees will be allowed to listen to the audio samples, and the recordings will no longer be retained.

Apple says it will work to delete any recording which is determined to have resulted from Siri being triggered inadvertently.

These changes come after The Guardian reported that Apple contractors "regularly" heard confidential information while grading anonymized Siri audio samples. Following the report, Apple suspended the grading program and began conducting a review of its process, and it has now apologized over the matter.
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

• First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

• Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

• Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
Apple has also shared a new support document with further details about Siri privacy and grading.




Apple Contractors Listened to 1,000+ Siri Recordings Per Shift

Apple has suspended the grading program that used contractors to listen to Siri recordings for quality control purposes, but in a new report, The Irish Examiner (via The Verge) gives some additional insight into how it worked.

According to one of the contractors who worked on Siri grading in Cork, Ireland, employees were expected to listen to more than 1,000 Siri recordings per shift. Most recordings were a few seconds in length, and "occasionally" employees would hear personal data or snippets of conversation. Contractors primarily heard Siri commands, though.


Each recording was "graded" based on different factors, such as whether or not a Siri activation was accidental or if the query was something the personal assistant could or couldn't help with.

The employee said that Siri user details were kept anonymous, and that he or she mostly heard recordings with Canadian, Australian, and UK accents.
"I understood the reasons why the company was doing it but I could see why people would feel it was a breach of privacy because they weren't telling people. I think the lack of consent was the issue."
Data analysts who worked with Globetech, a Cork-based firm, were told this week that their work with Apple has been terminated. Apple and Globetech have not commented on how many employees were let go, but The Irish Examiner says that more than 300 contractors working on transcription and grading for Apple may have lost their jobs.

Apple last week told Globetech that it would be ending all transcription and voice grading work, and Globetech has confirmed that it will no longer be providing these services to Apple.

Prior to Apple's decision to end all grading and transcription work with Globetech, Apple prohibited employees from bringing their cell phones to work after the original story from The Guardian hit. In that report, an anonymous contractor said that employees working on Siri often heard private data including confidential medical information, drug deals, recordings of couples having sex, and more.

Following that story, where the employee also called out Apple for not properly disclosing human-based Siri grading to its customers, Apple announced that it would temporarily suspend the program worldwide.

Apple said it would review the process that's currently used, and also add a feature to let people opt out of allowing their Siri recordings to be used for quality control purposes. In a statement to The Irish Examiner, Apple said that it is still evaluating its grading processes and is "working closely" with partners to reach the "best possible outcome" for all involved.
"We believe that everyone should be treated with the dignity and respect they deserve -- this includes our own employees and the suppliers we work with in Ireland and around the world. Apple is committed to customer privacy and made the decision to suspend Siri grading while we conduct a thorough review of our processes. We're working closely with our partners as we do this to ensure the best possible outcome for our suppliers, their employees and our customers around the world."
It's not clear if and when Siri grading will resume, but it's likely to remain suspended until Apple is able to release a software update that adds a toggle allowing customers to opt out.

Apple is facing a class action lawsuit over the issue, which claims Apple did not inform consumers that they are regularly being recorded without consent.


Former Siri Chief Bill Stasior Joins Microsoft to Lead AI Team

Bill Stasior, Apple's former head of Siri development, has joined Microsoft as corporate VP of technology, reports The Information.


Starting this month, Stasior will lead an artificial intelligence group at Microsoft and will be reporting to Microsoft's chief technology officer, Kevin Scott.

A Ph.D. graduate in computer science from MIT, Stasior was head of Apple's Siri team for seven years, following the departure of Siri co-founders Adam Cheyer and Dag Kittlaus in 2012. Cheyer and Kittlaus had joined Apple when the company originally purchased Siri in 2010, but didn't stay long.

Stasior stepped down from his role as leader of Apple's voice assistant group in February, as part of a restructuring effort by John Giannandrea, Apple's senior vice president of machine learning and AI strategy.

Giannandrea was a prominent Google executive before being hired by Apple last year. With Giannandrea taking over the Siri team, Stasior was said to be stepping away from day-to-day management of Siri, yet remaining at the company. However, according to The Information, Stasior cut all ties with Apple in May.

Giannandrea's hiring came amid widespread criticism of Siri, which has shortcomings in comparison to AI offerings from the likes of Microsoft, Amazon, and Google. Apple made strides to improve Siri in 2018 under Giannandrea's leadership, with features like Siri Shortcuts in iOS 12.



Apple Facing Lawsuit for ‘Unlawful and Intentional’ Recording of Confidential Siri Requests Without User Consent

Apple is facing a class action lawsuit [PDF] for employing contractors to listen to and grade some anonymized Siri conversations for the purpose of quality control and product improvement.

Apple's Siri practices were highlighted in a recent report where one of the contractors claimed that Apple employees evaluating Siri recordings often hear confidential medical information, drug deals, and other private information when Siri is activated accidentally.


The lawsuit, filed in a Northern California court today (and shared by CNBC's Kif Leswing), accuses Apple of "unlawful and intentional recording of individuals' confidential communications without their consent," violating California privacy laws when accidental Siri activations are recorded and evaluated by humans.
Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.
As outlined in its privacy policies, Apple collects some anonymized Siri recordings for the purpose of improving Siri and, presumably, cutting down on accidental Siri activations. These recordings are analyzed by humans and can include details recorded when Siri mishears a "Hey Siri" trigger word.

The lawsuit claims that Apple has not informed consumers that they are "regularly being recorded without consent," though it also highlights Apple's privacy policy where Apple does state that such data can be used for improving its services.

The plaintiffs in the case, one of whom is a minor, claim to own an iPhone XR and an iPhone 6 that they would not have purchased had they known that their Siri recordings were stored for evaluation. The plaintiffs are seeking class action status for all individuals who were recorded by a Siri device without their consent from October 12, 2011 to the present.

The lawsuit asks for Apple to obtain consent before recording a minor's Siri interactions, to delete all existing recordings, and to prevent unauthorized recordings in the future. It also asks for $5,000 in damages per violation.

Apple has suspended its Siri evaluation program while it reviews the processes that are in place in light of the contractor's claims. Prior to the suspension, Apple said that a small, random subset (less than 1%) of daily Siri requests was analyzed to improve Siri and dictation, with requests not associated with a user's Apple ID.

Apple in the future plans to release a software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not possible at the current time. All collected Siri data can be cleared from an iOS device by turning Siri off and then on again, while accidental recordings can be stopped by disabling "Hey Siri."
