Apple Working on Siri Feature Allowing Back-and-Forth Conversations About Health Problems

Apple is working on a new Siri feature, slated for release with iOS 15 in fall 2021, that will allow users to have a back-and-forth conversation about health problems, according to internal documentation obtained by The Guardian.


The report does not offer any further details about the feature, but Siri will presumably become more capable of responding to physical and possibly mental health questions. Apple CEO Tim Cook has repeatedly said that Apple's health-related initiatives will be the company's "greatest contribution to mankind."

Apple has increased its presence in the health and fitness space over the past few years. In 2018, for example, it launched an ECG app for the Apple Watch that can detect signs of atrial fibrillation, a condition that can lead to potentially life-threatening complications such as stroke and cardiac arrest.

Also in 2018, Apple rolled out Health Records, a feature that allows patients to view medical records from multiple hospitals and clinics directly in the Health app on the iPhone, including allergies, vital signs, conditions, immunizations, lab results, medications, procedures, and other information.

Apple's Health app in iOS 13

Apple's internal documentation, which The Guardian obtained from a former Siri grader, also reveals the company's efforts to ensure that Siri responds as neutrally as possible to sensitive topics such as feminism:
In explaining why the service should deflect questions about feminism, Apple's guidelines explain that "Siri should be guarded when dealing with potentially controversial content." When questions are directed at Siri, "they can be deflected … however, care must be taken here to be neutral".

For those feminism-related questions where Siri does not reply with deflections about "treating humans equally", the document suggests the best outcome should be neutrally presenting the "feminism" entry in Siri's "knowledge graph", which pulls information from Wikipedia and the iPhone's dictionary.
In a statement, Apple said it aims for Siri to be "factual with inclusive responses rather than offer opinions":
Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.

Apple Will Continue to Review Computer-Generated Siri Transcripts Regardless of Opt-In Status

Apple has published a new support document with several questions and answers about its Siri quality evaluation process, also known as grading, to address any privacy concerns that customers may have.


As a refresher, it was recently discovered that Apple hired contractors to listen to a small percentage of anonymized Siri recordings — and review their corresponding computer-generated transcripts — to measure how well Siri was responding and to improve the assistant's accuracy and reliability.

The human review process likely existed for quite some time, but it was never mentioned in Apple's privacy policy, and it only became the subject of controversy last month after The Guardian reported that contractors "regularly" heard "confidential details" while listening to the Siri audio recordings.

Following that report, Apple quickly suspended its grading program and conducted a review of its policies. Apple has since apologized over the matter and says it will resume the evaluation process in the fall on an opt-in basis with improved privacy measures, including no longer retaining audio recordings.

In its FAQ, however, Apple says it will continue to review computer-generated transcripts of Siri interactions, even from users who do not opt in. The only way to avoid this will be to disable Siri entirely:
Is the only way for Siri not to retain my audio recordings and transcripts to disable Siri?

By default, Apple will no longer retain audio of your Siri requests, starting with a future software release in fall 2019. Computer-generated transcriptions of your audio requests may be used to improve Siri. These transcriptions are associated with a random identifier, not your Apple ID, for up to six months. If you do not want transcriptions of your Siri audio recordings to be retained, you can disable Siri and Dictation in Settings.
Prior to suspending grading, Apple says it reviewed less than 0.2 percent of Siri interactions and corresponding computer-generated transcripts.

As for users who do opt in, Apple says it has updated its review process to limit graders' exposure to audio recordings that are determined to have resulted from Siri being triggered inadvertently. Apple is also making changes to minimize the amount of data that graders have access to:
When you say you are minimizing the amount of data reviewers have access to, what does that mean? What will they still be able to hear?

We are making changes to the human grading process to further minimize the amount of data reviewers have access to, so that they see only the data necessary to effectively do their work. For example, the names of the devices and rooms you set up in the Home app will only be accessible by the reviewer if the request being graded involves controlling devices in the home.
Apple says it will work to delete any recording which is determined to have resulted from Siri being triggered inadvertently.

The changes to Siri will be implemented in a future iOS update released this fall, which will likely introduce a toggle switch for grading. For more details, read Apple's support document and its related press release.


Apple Apologizes Over Siri Privacy Concerns, Will Resume Grading Program in Fall With Several Changes

Apple today announced that it will resume its Siri quality evaluation process in the fall with several privacy-focused changes.


Going forward, Apple will only gather audio samples from users who opt in to the grading program, and those who participate will be able to opt out at any time. And when a customer does opt in, only Apple employees will be allowed to listen to the audio samples, and the recordings will no longer be retained.

Apple says it will work to delete any recording which is determined to have resulted from Siri being triggered inadvertently.

These changes come after The Guardian reported that Apple contractors "regularly" heard confidential information while grading anonymized Siri audio samples. Following the report, Apple suspended the grading program and began conducting a review of its process, and it has now apologized over the matter.
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

• First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

• Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

• Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
Apple has also shared a new support document with further details about Siri privacy and grading.




Apple Contractors Listened to 1,000+ Siri Recordings Per Shift

Apple has suspended the grading program that used contractors to listen to Siri recordings for quality control purposes, but in a new report, The Irish Examiner (via The Verge) gives some additional insight into how it worked.

According to one of the contractors who worked on Siri grading in Cork, Ireland, employees were expected to listen to more than 1,000 Siri recordings per shift. Most recordings were a few seconds in length, and "occasionally" employees would hear personal data or snippets of conversation. Contractors primarily heard Siri commands, though.


Each recording was "graded" based on different factors, such as whether or not a Siri activation was accidental or if the query was something the personal assistant could or couldn't help with.

The employee said that Siri user details were kept anonymous, and that he or she mostly heard recordings with Canadian, Australian, and UK accents.
"I understood the reasons why the company was doing it but I could see why people would feel it was a breach of privacy because they weren't telling people. I think the lack of consent was the issue."
Data analysts who worked with Globetech, a Cork-based firm, were told this week that their work with Apple has been terminated. Apple and Globetech have not commented on how many employees were let go, but The Irish Examiner says that more than 300 contractors working on transcription and grading for Apple may have lost their jobs.

Apple last week told Globetech that it would be ending all transcription and voice grading work, and Globetech has confirmed that it will no longer be providing these services to Apple.

Prior to ending all grading and transcription work with Globetech, Apple had prohibited employees from bringing their cell phones to work after the original story from The Guardian broke. In that report, an anonymous contractor said that employees working on Siri often heard private data, including confidential medical information, drug deals, recordings of couples having sex, and more.

Following that story, in which the employee also called out Apple for not properly disclosing human-based Siri grading to its customers, Apple announced that it would temporarily suspend the program worldwide.

Apple said it would review the process that's currently used and add a feature that lets people opt out of allowing their Siri recordings to be used for quality control purposes. In a statement to The Irish Examiner, Apple said that it is still evaluating its grading processes and is "working closely" with partners to reach the "best possible outcome" for all involved.
"We believe that everyone should be treated with the dignity and respect they deserve -- this includes our own employees and the suppliers we work with in Ireland and around the world. Apple is committed to customer privacy and made the decision to suspend Siri grading while we conduct a thorough review of our processes. We're working closely with our partners as we do this to ensure the best possible outcome for our suppliers, their employees and our customers around the world."
It's not clear if and when Siri grading will resume, but it will likely remain suspended until Apple is able to release a software update that adds a toggle allowing customers to opt out.

Apple is facing a class action lawsuit over the issue, which claims Apple did not inform consumers that they are "regularly being recorded without consent."


Former Siri Chief Bill Stasior Joins Microsoft to Lead AI Team

Bill Stasior, Apple's former head of Siri development, has joined Microsoft as corporate VP of technology, reports The Information.


Starting this month, Stasior will lead an artificial intelligence group at Microsoft and will be reporting to Microsoft's chief technology officer, Kevin Scott.

A Ph.D. graduate in computer science from MIT, Stasior was head of Apple's Siri team for seven years, following the departure of Siri co-founders Adam Cheyer and Dag Kittlaus in 2012. Cheyer and Kittlaus had joined Apple when the company originally purchased Siri in 2010, but didn't stay long.

Stasior stepped down from his role as leader of Apple's voice assistant group in February, as part of a restructuring effort by John Giannandrea, Apple's senior vice president of machine learning and AI strategy.

Giannandrea was a prominent Google executive before being hired by Apple last year. With Giannandrea taking over the Siri team, Stasior was said to be stepping away from day-to-day management of Siri, yet remaining at the company. However, according to The Information, Stasior cut all ties with Apple in May.

Giannandrea's hiring came amid widespread criticism of Siri, which has shortcomings in comparison to AI offerings from the likes of Microsoft, Amazon, and Google. Apple made strides to improve Siri in 2018 under Giannandrea's leadership, with features like Siri Shortcuts in iOS 12.



Apple Facing Lawsuit for ‘Unlawful and Intentional’ Recording of Confidential Siri Requests Without User Consent

Apple is facing a class action lawsuit [PDF] for employing contractors to listen to and grade some anonymized Siri conversations for the purpose of quality control and product improvement.

Apple's Siri practices were highlighted in a recent report where one of the contractors claimed that Apple employees evaluating Siri recordings often hear confidential medical information, drug deals, and other private information when Siri is activated accidentally.


The lawsuit, filed in a Northern California court today (and shared by CNBC's Kif Leswing), accuses Apple of "unlawful and intentional recording of individuals' confidential communications without their consent," violating California privacy laws when accidental Siri activations are recorded and evaluated by humans.
Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.
As outlined in its privacy policies, Apple collects some anonymized Siri recordings for the purpose of improving Siri and, presumably, cutting down on accidental Siri activations. These recordings are analyzed by humans and can include details recorded when Siri mishears a "Hey Siri" trigger word.

The lawsuit claims that Apple has not informed consumers that they are "regularly being recorded without consent," though it also highlights Apple's privacy policy where Apple does state that such data can be used for improving its services.

The plaintiffs in the case, one of whom is a minor, claim to own an iPhone XR and an iPhone 6 that they would not have purchased had they known that their Siri recordings were stored for evaluation. The plaintiffs are seeking class action status for all individuals who were recorded by a Siri device without their consent from October 12, 2011 to the present.

The lawsuit asks for Apple to obtain consent before recording a minor's Siri interactions, to delete all existing recordings, and to prevent unauthorized recordings in the future. It also asks for $5,000 in damages per violation.

Apple has suspended its Siri evaluation program while it reviews the processes that are in place in light of the contractor's claims. Prior to the suspension of the program, Apple said that a small, random subset (less than 1%) of daily Siri requests is analyzed to improve Siri and dictation, with requests not associated with a user's Apple ID.

Apple in the future plans to release a software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not possible at the current time. All collected Siri data can be cleared from an iOS device by turning Siri off and then on again, while accidental recordings can be stopped by disabling "Hey Siri."


Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming

Apple is suspending a Siri program that allows employees to listen to Siri recordings for quality control purposes, reports TechCrunch.

Apple is going to review the process that's currently used, in which workers listen to anonymized Siri recordings to determine whether Siri is hearing questions correctly or being activated by accident.


Apple in the future also plans to release a software update that will let Siri users opt out of having their Siri queries included in this evaluation process, called grading.
"We are committed to delivering a great Siri experience while protecting user privacy," Apple said in a statement to TechCrunch. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
The decision to suspend the program and offer an opt-out option comes following a report from The Guardian that shared details gleaned from one of the contractors working on evaluating Siri queries.

The employee expressed concern with Apple's lack of disclosure about the human oversight and said that contractors who work on the program have overheard confidential medical information, drug deals, recordings of couples having sex, and other private details from accidental Siri activations.

When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.


Contractors Working on Siri ‘Regularly’ Hear Recordings of Drug Deals, Private Medical Info and More Claims Apple Employee

Contractors working on Siri regularly hear confidential medical information, drug deals, recordings of couples having sex, and other private information, according to a report from The Guardian based on details shared by a contractor who works on one of Apple's Siri teams.

The employee who shared the info is one of many contractors around the world who listen to Siri voice data collected from customers to improve the Siri voice experience and help Siri better understand incoming commands and queries.


According to The Guardian, the employee shared the information because he or she was concerned with Apple's lack of disclosure about the human oversight, though Apple has several times in the past confirmed that this takes place, and the practice has been outlined in past reports as well.
The whistleblower said: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."
In a statement, Apple confirmed to The Guardian that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri. A small, random subset (less than 1 percent) of daily Siri activations are used for grading, with each clip only lasting for a few seconds.
"A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
Apple has not made its human-based Siri analysis a secret, but its extensive privacy terms don't appear to explicitly state that Siri information is listened to by humans. The employee said that Apple should "reveal to users" that human oversight exists.

The contractor who spoke to The Guardian said that "the regularity of accidental triggers on the watch is incredibly high," and that some snippets were up to 30 seconds in length. Employees listening to Siri recordings are encouraged to report accidental activations as a technical problem, but aren't instructed to report anything about the content itself.

Apple has an extensive privacy policy related to Siri and says it anonymizes all incoming data so that it's not linked to an Apple ID and provides no information about the user. Still, the contractor claims that user data showing location, contact details, and app data is shared, and that names and addresses are sometimes disclosed when they're spoken aloud. To be clear, Apple says that all Siri data is assigned a random identifier and does not include location or contact details, contrary to the contractor's claims.
As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. "There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad. It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers - addresses, names and so on."
While Apple's Siri privacy policy and security documents do not mention human oversight specifically, they are detailed and provide information on how Siri recordings are used.

As stated in Apple's security white paper, for example, user voice data is saved for a six-month period so that the recognition system can use it to better understand a person's voice. The voice data that's saved is identified using a random identifier that's assigned when Siri is turned on, and it is never linked to an Apple ID. After six months, a second copy is saved without any identifier and is used by Apple to improve Siri for up to two years. A small number of recordings, transcripts, and associated data without identifying information are sometimes used by Apple for ongoing improvement of Siri beyond two years.

Apple's privacy website has a Siri section that offers up more info, explaining that all Siri queries are assigned a random identifier not associated with an Apple ID. The identifier is reset whenever Siri is turned off and then on again, and turning Siri off deletes all user data associated with a Siri identifier.
When we do send information to a server, we protect your privacy by using anonymized rotating identifiers so that searches and locations can't be traced to you personally. And you can disable Location Services, our proactive features, or the proactive features' use of your location at any time.
Those concerned about Siri triggering accidentally on devices like the iPhone, Apple Watch, and HomePod can turn off the "Hey Siri" feature and can instead activate Siri manually, and Siri can also be turned off entirely.


Spotify and Other Music and Podcasts Apps Can Choose to Support Siri in iOS 13

Hey Siri, play Old Town Road on Spotify.

Ask that now and Siri will tell you that it cannot play songs from Spotify, but that could change soon. Apple is opening up its SiriKit framework to third-party music, podcasts, audiobooks, and radio apps in iOS 13 and iPadOS, letting users control audio playback in supported apps with Siri.

Mockup of Siri support for Spotify

It will be up to developers to enable this functionality in their apps. We've reached out to Spotify, Amazon, Google, Pandora, Tidal, Overcast, Castro, and several other popular music and podcasts app developers to see if they have plans to support Siri, and we'll update this story if we hear back.
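
If an app like Spotify does adopt the new media domain, its handler would presumably look something like the sketch below: an Intents extension class that resolves the spoken request against the app's own catalog and then hands playback off to the app. This is a minimal illustration against the iOS 13 SiriKit media intents; the class name and the catalog lookup are hypothetical, not any particular app's code.

```swift
import Intents

// Hypothetical Intents-extension handler for a third-party audio app
// adopting SiriKit's media domain in iOS 13.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {

    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // Map the spoken search term ("Old Town Road") to a media item.
        // A real app would search its own catalog; here we simply echo
        // the query back as a song with a placeholder identifier.
        let query = intent.mediaSearch?.mediaName ?? ""
        let item = INMediaItem(identifier: "track-123",
                               title: query,
                               type: .song,
                               artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp launches the host app in the background to start
        // playback, so audio begins without bringing up the app's UI.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```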

Spotify recently accused Apple of anticompetitive business practices, and its inability to integrate with Siri was one of its complaints. "Apple won't allow us to be on HomePod and they definitely won't let us connect with Siri to play your jams," said Spotify. Going forward, the latter is no longer the case.

The first betas of iOS 13 and iPadOS were seeded to developers on Monday, with public betas to follow in July. The software updates will be widely released in the fall, likely alongside new iPhones in September as usual.


WWDC 2019: Siri Expected to Become More Useful With Third-Party Apps on iOS 13 and More

iOS 13 will enable developers to integrate Siri into their apps for several new use cases, including media playback, search, voice calling, event ticketing, message attachments, flights, train trips, and airport gate and seat information, according to 9to5Mac's Guilherme Rambo.


In a report today, Rambo detailed several other developer-focused features that he expects Apple to announce at WWDC in June, including the ability for iOS apps ported to the Mac to use Mac-specific features such as the Touch Bar and keyboard shortcuts along with support for multiple windows.

Rambo says enabling Mac support for an existing iOS app is "as easy as checking a checkbox" in Xcode, akin to adding iPad support to an iPhone app.

Apple's augmented reality platform ARKit is said to gain "significant improvements" this year, including a brand new Swift-only framework for augmented reality and a companion app that lets developers create augmented reality experiences visually. ARKit is also said to gain the ability to detect human poses.
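
If the pose detection rumor pans out, adopting it could look roughly like the sketch below. The report names no API, so the identifiers here (ARBodyTrackingConfiguration, ARBodyAnchor) are an educated guess at how ARKit might surface body tracking, not confirmed classes.

```swift
import ARKit

// Speculative sketch: run an ARKit session with a body-tracking
// configuration and read joint positions as body anchors update.
class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking would likely be limited to newer hardware.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // The detected skeleton exposes per-joint transforms.
            if let head = body.skeleton.modelTransform(for: .head) {
                print("Head position: \(head.columns.3)")
            }
        }
    }
}
```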

Developers are also expected to gain access to a handful of new frameworks that allow for expanded use of the Taptic Engine, document scanning in third-party apps, and the ability to capture photos from external devices such as cameras and SD cards without having to go through Apple's Photos app.
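
The report doesn't say what the Taptic Engine framework will be called, but expanded haptics access would plausibly work along the lines of the sketch below, written against a Core Haptics-style API (CHHapticEngine, CHHapticPattern); treat the specific names as assumptions rather than confirmed details.

```swift
import CoreHaptics

// Speculative sketch of finer-grained Taptic Engine control: build a
// one-event haptic pattern and play it through a haptic engine.
func playSharpTap() throws {
    // Haptics support would presumably vary by device.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A single transient event, roughly a crisp "tap".
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```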

Lastly, on the Mac, apps will supposedly be able to offer file provider extensions, improving the integration of apps like Dropbox with the Finder.
