Apple’s Craig Federighi Highlights iOS 14 and macOS Big Sur Privacy Updates in New Interview

iOS 14, iPadOS 14, and macOS Big Sur include notable privacy updates that offer useful new protections for those who use Apple's iPhones, iPads, and Macs.


In an interview with Fast Company, Apple software chief Craig Federighi highlighted the new privacy features users can look forward to when the software updates arrive this fall, and offered some insight into Apple's privacy philosophy.

According to Federighi, privacy is an important part of every software update because Apple wants to show customers that they can "demand more" and "expect more" from the industry when it comes to privacy protections. "We can help move the industry into building things that better protect privacy," said Federighi.
"I think that there are many instances where we started providing privacy protections of some sort, and then we then saw others in the industry-some of whom have different business models than we do-adopt those practices because users came to expect them," he says. "That's happening all over the place. I mean, look at whether it's apps protecting customer messaging with end-to-end encryption. Or some of the kinds of location protections we're talking about. Or some of our protections, like requiring apps to ask before they access your photo libraries, and so forth. You see those protections being added to other operating systems, inspired by our work and based on the fact that users demand them."
Privacy at Apple is guided by four core principles: data minimization, on-device intelligence, security, and transparency and control. All four of those principles were in play when Apple designed ‌iOS 14‌, ‌iPadOS‌ 14, and ‌macOS Big Sur‌, and all of the updates include significant privacy features, as outlined below.

  • Approximate Location - You can now choose to give apps your approximate location rather than your precise location, a useful protection for location privacy. Apps that provide weather, news, or restaurant recommendations don't need exact coordinates, so approximate location is usually all they require (a sketch of how an app can adopt this appears after this list).

  • App Tracking Permission - Apps in iOS 14 and iPadOS 14 won't be able to track you across apps and websites owned by other companies without your consent. You'll be able to see which apps you've granted tracking permission and revoke it at any time, and the rule applies to Apple's own apps as well (the consent flow is sketched in the second example after this list).

  • App Store Privacy Details - ‌App Store‌ listings for apps will include an easy-to-read list of privacy details so you know what data is collected before you download an app. Internally, Apple is referring to this as a "nutrition label for apps," and it will include details on the user data an app wants across 31 categories. This won't be available when ‌iOS 14‌ ships, but it is coming before the end of the year.

  • Clipboard Restrictions - Apps no longer have silent access to the clipboard. Previously, any app could read the last data you copied without your knowledge; in iOS 14 and iPadOS 14, a banner notification appears whenever an app reads the clipboard, so you can spot apps accessing copied data they don't need.

  • Compromised Passwords - Apple's new software updates will notify you if a password stored in iCloud Keychain has been compromised in a data breach.

  • Microphone and Camera Indicators - When an app is accessing the camera or the microphone on an iPhone or iPad, an indicator dot appears in the status bar next to the cellular signal bars: green for the camera, orange for the microphone.
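For developers, the Approximate Location option above surfaces through Core Location's accuracy authorization added in iOS 14. Below is a minimal sketch of how an app that only needs coarse data, such as a weather app, might handle it; the WeatherLocationProvider class is a hypothetical name used for this example, and the app would need an NSLocationWhenInUseUsageDescription entry in its Info.plist.

```swift
import CoreLocation

// Illustrative sketch (not Apple sample code) of an app that only needs
// coarse location data, such as a weather or news app.
final class WeatherLocationProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // City-level accuracy is enough for weather, so ask for reduced accuracy up front.
        manager.desiredAccuracy = kCLLocationAccuracyReduced
        manager.requestWhenInUseAuthorization()
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        guard manager.authorizationStatus == .authorizedWhenInUse ||
              manager.authorizationStatus == .authorizedAlways else { return }

        switch manager.accuracyAuthorization {
        case .reducedAccuracy:
            // The user chose "Approximate": precision of a few kilometers,
            // which is all a weather or news app needs.
            manager.requestLocation()
        case .fullAccuracy:
            manager.requestLocation()
        @unknown default:
            break
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let coordinate = locations.last?.coordinate else { return }
        print("Coarse position: \(coordinate.latitude), \(coordinate.longitude)")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("Location request failed: \(error)")
    }
}
```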

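The App Tracking Permission item above is exposed to developers through the AppTrackingTransparency framework. Here is a minimal, hedged sketch of the consent flow; it assumes the app has declared an NSUserTrackingUsageDescription string in its Info.plist, and requestTrackingConsentIfNeeded is an illustrative function name rather than an Apple API.

```swift
import AppTrackingTransparency
import AdSupport

// Illustrative sketch of the App Tracking Transparency consent flow.
// Assumes the app declares an NSUserTrackingUsageDescription string in Info.plist.
func requestTrackingConsentIfNeeded() {
    switch ATTrackingManager.trackingAuthorizationStatus {
    case .notDetermined:
        // Shows the system prompt; the explanatory text comes from Info.plist.
        ATTrackingManager.requestTrackingAuthorization { status in
            if status == .authorized {
                // Only now is the advertising identifier meaningful;
                // without consent it comes back as all zeros.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking allowed, IDFA: \(idfa)")
            } else {
                print("Tracking declined or restricted; no cross-app tracking.")
            }
        }
    case .authorized:
        print("User previously granted tracking permission.")
    case .denied, .restricted:
        print("Tracking is not permitted for this app.")
    @unknown default:
        break
    }
}
```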

Federighi told Fast Company that many of the new privacy features added to iOS each year are based on customer feedback and emails.
"I get emails from customers saying to me, 'I am sure this popular app I downloaded is secretly listening to me. I was just talking about this thing, and this ad came up that was just about what I was talking about. I'm sure it was listening to me,'" he says.

"Now, in many cases, this, in fact, was not happening," says Federighi of concerns over ‌iPhone‌ users being unwittingly recorded. "We know it was not happening. But they believe it is. And so, providing that peace of mind through a recording indicator that will always let you know whether an app, at that moment, is accessing your camera or accessing your microphone is important."
He wrapped up the conversation by saying that he believes Apple's work on privacy protections will be one of the things the company is remembered for centuries from now.

Federighi's full interview on privacy can be read over on the Fast Company website, and it's well worth reading because it provides a look at Apple's efforts to improve privacy protections for users over time as well as Apple's thoughts on how developers perceive its privacy features.

Signal Encrypted Messenger Rolling Out New Face-Blurring Tool for Shared Images

Encrypted messaging app Signal is rolling out a new face-blurring feature that automatically locates and blurs faces in images shared over the platform.


In a blog post announcing the update, Signal co-founder Moxie Marlinspike explained that the tool is a response to a surge in traffic spurred by ongoing protests around the world against racism and police brutality.
Many of the people and groups who are organizing for that change are using Signal to communicate, and we're working hard to keep up with the increased traffic. We've also been working to figure out additional ways we can support everyone in the street right now.

One immediate thing seems clear: 2020 is a pretty good year to cover your face.

The latest version of Signal for Android and iOS introduces a new blur feature in the image editor that can help protect the privacy of the people in the photos you share. Now it's easy to give every face a hiding place, or draw a fuzzy trace over something you want to erase. Simply tap on the new blur tool icon to get started.
According to Marlinspike, all processing involved in the new blur feature happens locally on the device to maintain privacy. He also cautions that the feature isn't perfect and won't detect every face every time. To compensate, Signal's image editor also includes a blur brush for manually obscuring faces and other areas of a photo.
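Signal hasn't published the implementation details of the feature beyond the blog post, but the general approach, detecting faces on-device and compositing an obscured region over each one, can be sketched on iOS using Apple's Vision and Core Image frameworks. The blurFaces function below is purely illustrative and is not Signal's code.

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Illustrative sketch of on-device face blurring; not Signal's actual code.
// Detects faces with Vision, then composites a pixellated version of the
// image over each detected face rectangle using Core Image.
func blurFaces(in image: UIImage) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }

    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
    guard let faces = request.results as? [VNFaceObservation], !faces.isEmpty else { return image }

    let ciImage = CIImage(cgImage: cgImage)
    let pixellate = CIFilter.pixellate()
    pixellate.inputImage = ciImage
    pixellate.scale = 40
    guard let obscured = pixellate.outputImage else { return image }

    var output = ciImage
    for face in faces {
        // Vision returns normalized coordinates; convert to pixel space.
        let rect = VNImageRectForNormalizedRect(face.boundingBox, cgImage.width, cgImage.height)
        output = obscured.cropped(to: rect).composited(over: output)
    }

    let context = CIContext()
    guard let result = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: result)
}
```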


Since May 25, Signal has been setting daily download records in the U.S., according to Apptopia (via Recode). The encrypted chat app was the eighth most downloaded social networking app on Tuesday, for example, and ranked within the top 100 across all apps.

Signal Private Messenger is a free download for iPhone and iPad, available on the App Store.


Apple and Google in ‘Standoff’ With UK Health Service Over COVID-19 Contact Tracing App

Apple and Google are said to be in a "standoff" with the UK's health service over its plans to build an app that alerts users when they have been in contact with someone with coronavirus.


Apple and Google announced on Friday that they are working together on Bluetooth technology to help governments and health agencies reduce the spread of COVID-19 around the world.

Apple says that user privacy and security will be central to the design of the project, which will use a decentralized API to prevent governments from building a surveillance-style centralized database of contacts.
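The actual Exposure Notification specification is considerably more detailed, but the decentralized idea it rests on can be sketched in a few lines: a secret daily key never leaves the phone, only short-lived identifiers derived from it are broadcast over Bluetooth, and matching against the keys of diagnosed users happens on the device rather than on a central server. The type names, key size, and interval scheme below are illustrative rather than taken from the spec.

```swift
import CryptoKit
import Foundation

// Conceptual sketch of a decentralized tracing scheme, loosely modeled on the
// Apple/Google approach; names and intervals are illustrative, not the actual
// Exposure Notification specification. The daily key never leaves the device;
// only short-lived, unlinkable identifiers are broadcast over Bluetooth.
struct DailyTracingKey {
    let key = SymmetricKey(size: .bits128)    // generated and stored on-device

    // Derive the rolling identifier broadcast during one 10-minute interval.
    func rollingIdentifier(forInterval interval: Int) -> Data {
        let message = withUnsafeBytes(of: UInt32(interval).littleEndian) { Data($0) }
        let mac = HMAC<SHA256>.authenticationCode(for: message, using: key)
        return Data(mac).prefix(16)           // broadcast this, never the key
    }
}

// If a user tests positive, they upload only their daily keys. Every other
// phone re-derives the identifiers locally and checks them against what it
// overheard, so no central server ever sees a contact graph.
func locallyDetectExposure(diagnosedKeys: [DailyTracingKey], overheard: Set<Data>) -> Bool {
    for key in diagnosedKeys {
        for interval in 0..<144 {             // 144 ten-minute intervals per day
            if overheard.contains(key.rollingIdentifier(forInterval: interval)) {
                return true
            }
        }
    }
    return false
}
```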

However, according to The Guardian, this decentralized design means that if the NHS goes ahead with its original plans, its app will face severe limitations in its operation.

NHSX – the British health service's digital innovation unit – reportedly wasn't aware of Apple and Google's project before it was announced, and it now looks like the usefulness of its own app will be severely hampered or even rendered non-functional if it doesn't implement the protocol.

That's because without adhering to the Apple and Google API, a contact tracing app won't be able to access Bluetooth while running in the background, and will only work when the app is open and the phone is unlocked.

Similar limitations have been demonstrated in Singapore's contact tracing app, TraceTogether, which requires the user to leave their phone unlocked to work properly. The app has a three-star rating on the App Store and has been installed by just 12 percent of the country's population.

For its part, a spokesperson for NHSX denied claims of a "standoff," telling The Guardian: "This suggestion is completely wrong. Everyone is in agreement that user privacy is paramount, and while our app is not dependent on the changes they are making, we believe they will be helpful and complementary."
This article, "Apple and Google in 'Standoff' With UK Health Service Over COVID-19 Contact Tracing App" first appeared on MacRumors.com

Discuss this article in our forums

Zoom Accused of Misleading Users With ‘End-to-End Encryption’ Claims

Zoom is facing fresh scrutiny today following a report that the videoconferencing app's encryption claims are misleading.


Zoom states on its website and in its security white paper that the app supports end-to-end encryption, a term that refers to a way of protecting user content so that the company has no access to it whatsoever.

However, an investigation by The Intercept reveals that Zoom secures video calls using TLS encryption, the same technology that web servers use to secure HTTPS websites:
This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won't stay private from the company.
As the report makes clear, for a Zoom meeting to be end-to-end encrypted, the call would need to be encrypted in such a way that ensures only the participants in the meeting have the ability to decrypt it through the use of local encryption keys. But that level of security is not what the service offers.
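The distinction is easiest to see in code. In a genuinely end-to-end design, the participants derive a shared key from key pairs that never leave their devices, and the relay server only ever sees ciphertext. The sketch below uses Apple's CryptoKit purely to illustrate that concept; it is not a description of how Zoom or any real conferencing product is built.

```swift
import CryptoKit
import Foundation

// Conceptual sketch of end-to-end encryption between two meeting participants.
// The relay server never holds a key, so it cannot read the content; that is
// what separates true end-to-end encryption from transport (TLS) encryption,
// where the service operator can see the plaintext.

// Each participant generates a key pair locally and shares only the public key.
let alice = Curve25519.KeyAgreement.PrivateKey()
let bob = Curve25519.KeyAgreement.PrivateKey()

// Both sides derive the same symmetric key from the key agreement.
func sessionKey(myKey: Curve25519.KeyAgreement.PrivateKey,
                theirPublicKey: Curve25519.KeyAgreement.PublicKey) throws -> SymmetricKey {
    let shared = try myKey.sharedSecretFromKeyAgreement(with: theirPublicKey)
    return shared.hkdfDerivedSymmetricKey(using: SHA256.self,
                                          salt: Data(),
                                          sharedInfo: Data("meeting".utf8),
                                          outputByteCount: 32)
}

let aliceKey = try sessionKey(myKey: alice, theirPublicKey: bob.publicKey)
let bobKey = try sessionKey(myKey: bob, theirPublicKey: alice.publicKey)

// Alice encrypts a frame; only ciphertext crosses the relay server.
let frame = Data("video frame bytes".utf8)
let sealed = try ChaChaPoly.seal(frame, using: aliceKey)
let relayedOverServer = sealed.combined   // opaque to the server

// Bob decrypts locally with the key only the participants hold.
let received = try ChaChaPoly.SealedBox(combined: relayedOverServer)
let plaintext = try ChaChaPoly.open(received, using: bobKey)
print(String(data: plaintext, encoding: .utf8)!)
```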

When asked by The Intercept to comment on the finding, a spokesperson for Zoom denied that the company was misleading users:
"When we use the phrase 'End to End' in our other literature, it is in reference to the connection being encrypted from Zoom end point to Zoom end point… The content is not decrypted as it transfers across the Zoom cloud."
In-meeting text chat appears to be the only Zoom feature that is actually end-to-end encrypted. Because video and audio are not, the service could in theory access the content of private meetings, and it could be compelled to hand over recordings to governments or law enforcement in response to legal requests.

Zoom told The Intercept that it only collects the user data it needs to improve its service, such as IP addresses, OS details, and device details, and that it doesn't allow employees to access the content of meetings.

Last week, Zoom's data sharing practices came under criticism after it emerged that the service was sending data to Facebook without disclosing the fact to customers. The company subsequently updated the app to remove its Facebook login feature and stop the data transfer.
This article, "Zoom Accused of Misleading Users With 'End-to-End Encryption' Claims" first appeared on MacRumors.com

Discuss this article in our forums

U.S. Government Using Mobile Ad Location Data to Track Compliance With Curbs on Movement

The U.S. government is using smartphone location data from the mobile ad industry to track people's movements amid the coronavirus outbreak, according to a Wall Street Journal report.


Local governments and the Centers for Disease Control and Prevention have received the anonymized data about people in areas of "geographic interest," with the aim of creating a portal of geolocation information covering 500 cities across the country.

The information will be used to learn how well people are complying with stay-at-home orders, according to WSJ. Citing an example, the report says researchers discovered large numbers of people were gathering in a New York City park, which led them to notify local authorities.

Even though the data is anonymized, WSJ says that privacy advocates want "strong legal safeguards" to limit how it can be used and to prevent it from being repurposed. Cellular carriers told the news outlet they have not been asked by the government to provide location data.

The development follows reports of other countries using cellphone data to monitor citizens and see if they are complying with curbs on movement to defeat the viral outbreak.

European mobile carriers have reportedly been sharing data with health authorities in Italy, Germany and Austria, while at the same time respecting Europe's privacy laws. Earlier this month, Israel passed emergency measures that allow security agencies to track the smartphone data of people with suspected COVID-19 and find others they may have come into contact with.


Israel Passes Emergency Law to Track and Trace Mobile Users With Suspected COVID-19

Israel has passed emergency measures that will allow security agencies to track the smartphone data of people with suspected COVID-19 and find others they may have come into contact with (via BBC News).


The Israeli government said the new powers will be used to identify people infected with coronavirus and make sure they're following quarantine rules.

On Monday, an Israeli parliamentary subcommittee discussed a government request to authorize the security service to assist in a national campaign to stop the spread of COVID-19, but the group decided to delay voting on the request, arguing that it needed more time to assess it.

The emergency law was passed on Tuesday during an overnight sitting of the cabinet, effectively bypassing parliamentary approval.

The government has yet to explain how the mobile tracking will work, but the BBC reports that it is understood the location data collected through telecommunication companies by Shin Bet, the domestic security agency, will be shared with health officials.

Israeli prime minister Benjamin Netanyahu last week announced his intention to bypass parliamentary oversight in order to push through the emergency regulations. Netanyahu says the new powers will last for 30 days only. Civil liberties campaigners in Israel called the move "a dangerous precedent and a slippery slope."

Israel is still in the relatively early stages of the pandemic. It had 200 confirmed cases of the coronavirus as of Tuesday morning. On Wednesday, the country's health ministry reported that cases had risen to 427.


MI5 Argues for ‘Exceptional Access’ to Encrypted Messages

The director general of Britain's Security Service is arguing for "exceptional access" to encrypted messages, in the ongoing battle between authorities and technology companies, reports The Guardian.

MI5 head Andrew Parker
MI5's director general has called on technology companies to find a way to allow spy agencies "exceptional access" to encrypted messages, amid fears they cannot otherwise access such communications.

Sir Andrew Parker is understood to be particularly concerned about Facebook, which announced plans to introduce powerful end-to-end encryption last March across all the social media firm's services.

In an ITV interview to be broadcast on Thursday, Sir Andrew Parker says he has found it "increasingly mystifying" that intelligence agencies like his are not able to easily read secret messages of terror suspects they are monitoring.
Parker goes on to say that cyberspace has become an unregulated "Wild West" that is largely inaccessible to authorities, and calls on tech firms to answer the question: "Can you provide end-to-end encryption but on an exceptional basis – exceptional basis – where there is a legal warrant and a compelling case to do it, provide access to stop the most serious forms of harm happening?"

The U.K. government has long argued that encrypted online channels such as WhatsApp and Telegram provide a "safe haven" for terrorists because governments and even the companies that host the services cannot read them.

Tech companies have pushed back against various attempts by authorities to weaken encryption methods, such as the FBI's request that Apple help it hack into the iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino.

Apple famously refused to comply with the request, and has since consistently argued against laws that would require tech companies to build so-called "back doors" into their software. The company maintains that such a move would weaken security for everyone and simply push terrorists and criminals toward open-source encryption methods for their digital communications.

On the opposing side of the debate, Britain's cybersecurity agency has proposed that tech companies could, when served with a warrant, hand authorities a copy of encrypted messages along with the keys needed to unscramble them, arguing that this would stop terrorists and criminals from operating out of sight without compromising encryption methods.

However, given that encrypted communication services like WhatsApp and Signal do not hold the private keys needed to decrypt messages, building a back door would seem to be the only way for them to comply.

A spokesperson for Privacy International, a technology human rights group, told The Guardian that strong encryption kept communications safe from criminals and hostile governments.

"The reality is that these big tech platforms are international companies: providing access to UK police would mean establishing a precedent that police around the world could use to compel the platforms to monitor activists and opposition, from Hong Kong to Honduras," the spokesperson added.


Apple’s Privacy Officer Jane Horvath Uses CES Appearance to Defend Company Stance on Encryption and Software Backdoors

Apple's chief privacy officer attended a discussion panel at the Consumer Electronics Show in Las Vegas on Tuesday to debate the state of consumer privacy, marking the first time in 28 years that Apple has been at CES in an official capacity.

Apple's privacy officer at CES 2020 panel (Image: Parker Ortolani)

Jane Horvath, Apple's senior director for global privacy, joined an all-female panel consisting of representatives from Facebook, Procter & Gamble and the Federal Trade Commission. During the discussion, Horvath defended Apple's use of encryption to protect customer data on mobile devices.
"Our phones are relatively small and they get lost and stolen," Horvath said. "If we're going to be able to rely on our health data and finance data on our devices, we need to make sure that if you misplace that device, you're not losing your sensitive data."
Apple has held a consistent position regarding its use of encryption, even if that means it has limited ability to help law enforcement access data on devices involved in criminal investigations.

Just this week, the FBI asked Apple to help unlock two iPhones that investigators believe were owned by Mohammed Saeed Alshamrani, who carried out a mass shooting at a Naval Air Station in Florida last month. Apple said that it had already given the FBI all of the data in its possession.

Apple's response suggests it will maintain the same stance it took in 2016, when the FBI demanded that Apple provide a so-called "backdoor" into iPhones following the December 2015 shooting in San Bernardino. Apple refused, and the FBI eventually backed down after finding an alternate way to access the data on the iPhone.

Horvath took the same tack by saying that Apple has a team working around the clock to respond to requests from law enforcement, but that building backdoors into software to give law enforcement access to private data is something she doesn't support.
"Building backdoors into encryption is not the way we are going to solve those issues," Horvath said.
Horvath went on to talk up Apple's "privacy by design" technologies like differential privacy, user randomization in native apps and services, the on-device facial recognition in the Photos app, and minimal data retrieval for Siri. Horvath also confirmed that Apple scans for child sexual abuse content uploaded to iCloud. "We are utilizing some technologies to help screen for child sexual abuse material," she said.
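Differential privacy is the most concrete of those techniques, and its classic building block, randomized response, is simple enough to sketch: each device flips coins before reporting, so any individual answer is deniable, yet the aggregate statistic can still be estimated. Apple's production system is more elaborate (noise is added on-device to encoded values before anything is uploaded), so treat the following as a textbook illustration rather than Apple's implementation.

```swift
import Foundation

// Textbook randomized response: report the true answer only half the time,
// otherwise report a fair coin flip. Any single report is deniable, but the
// population-level rate can be recovered from many reports.
func randomizedResponse(truth: Bool) -> Bool {
    if Bool.random() {          // first coin: tell the truth?
        return truth
    } else {
        return Bool.random()    // second coin: random answer
    }
}

// Recover an unbiased estimate of the true "yes" rate from noisy reports.
// Observed rate = 0.5 * trueRate + 0.25, so trueRate = 2 * observed - 0.5.
func estimatedTrueRate(reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return 2 * observed - 0.5
}

// Example: 100,000 simulated users, 30% of whom truly answer "yes".
let truths = (0..<100_000).map { _ in Double.random(in: 0..<1) < 0.3 }
let reports = truths.map(randomizedResponse)
print("Estimated yes-rate:", estimatedTrueRate(reports: reports))  // close to 0.3
```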

Horvath became Apple's chief privacy officer in September 2011. Prior to her work at Apple, Horvath was global privacy counsel at Google and chief privacy counsel at the Department of Justice.


NYT Investigation Reveals How Easily Smartphone Location Data Can Be Used to Identify and Track Individuals

The New York Times today claimed that it has obtained a file with the precise location of over 12 million smartphones over a period of several months in 2016 and 2017. While this data is technically anonymized, the report details how easy it is to associate specific data points with specific individuals.


With the help of publicly available information, like home addresses, The New York Times said it easily identified and then tracked military officials, law enforcement officers, lawyers, tech employees, and others:
In one case, we observed a change in the regular movements of a Microsoft engineer. He made a visit one Tuesday afternoon to the main Seattle campus of a Microsoft competitor, Amazon. The following month, he started a new job at Amazon. It took minutes to identify him as Ben Broili, a manager now for Amazon Prime Air, a drone delivery service.
The report explains that location data is collected from third-party smartphone apps that have integrated SDKs from location data companies like Gimbal, NinthDecimal, Reveal Mobile, Skyhook, PlaceIQ, and others, adding that it is currently legal to collect and sell all this information in the United States.
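The re-identification step the Times describes requires nothing more sophisticated than grouping a device's pings by where it spends its nights; whichever grid cell dominates is, in practice, a home address that can be matched against public records. The sketch below illustrates that idea with made-up type and field names; it is not the Times' methodology.

```swift
import Foundation

// Illustrative sketch of why "anonymized" location pings are easy to
// re-identify: a device's most frequent overnight grid cell is usually
// its owner's home. Type and field names here are made up for the example.
struct LocationPing {
    let deviceID: String      // pseudonymous advertising identifier
    let latitude: Double
    let longitude: Double
    let timestamp: Date
}

// Round coordinates to roughly 100 m grid cells so nearby pings group together.
func gridCell(_ ping: LocationPing) -> String {
    String(format: "%.3f,%.3f", ping.latitude, ping.longitude)
}

// For one device, find the cell where it most often appears between
// midnight and 6 a.m., a strong proxy for a home address.
func likelyHomeCell(for pings: [LocationPing], calendar: Calendar = .current) -> String? {
    let overnight = pings.filter { calendar.component(.hour, from: $0.timestamp) < 6 }
    let counts = Dictionary(grouping: overnight, by: gridCell).mapValues(\.count)
    return counts.max(by: { $0.value < $1.value })?.key
}
```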

Apple continues to take steps to protect the privacy of its users. In iOS 13, for example, the permission prompt shown when a third-party app requests your location no longer offers an "Always Allow" option. If a user wants to grant an app continuous access to location data, they must do so in Settings > Privacy > Location Services.

Apple also requires that apps provide users with a detailed explanation as to how location data is being used when prompted.

iPhone users who are concerned about their privacy can better protect themselves by navigating to Settings > Privacy > Location Services and disabling access to location data for unessential apps, or choosing the "while using the app" option at a minimum. We also recommend reviewing the privacy policies of apps.

When contacted by MacRumors, an Apple spokesperson said the company had no comment on The New York Times report.


This article, "NYT Investigation Reveals How Easily Smartphone Location Data Can Be Used to Identify and Track Individuals" first appeared on MacRumors.com

Discuss this article in our forums

DuckDuckGo’s Safari Privacy Browser Extension Now Available for macOS Catalina

Privacy-oriented search engine DuckDuckGo today released an updated version of its browser extension for desktop Safari users running macOS Catalina.


The launch comes after DuckDuckGo Privacy Essentials had to be removed from the Safari extensions gallery following major changes introduced in Safari 12 that made the extension incompatible. From the DuckDuckGo website:
As you may be aware, major structural changes in Safari 12 meant that we had to remove DuckDuckGo Privacy Essentials from the Safari extensions gallery. With Safari 13, new functionality was thankfully added that enabled us to put it back. Consequently, you'll need Safari 13+ on macOS 10.15 (Catalina) or newer to install the updated version.
DuckDuckGo Privacy Essentials blocks hidden third-party trackers on websites and features a Privacy Dashboard, which assigns each site a Privacy Grade rating (A-F) shown on an information card whenever a user visits it. The rating lets users see at a glance how protected they are, while providing options to dig deeper into the details of blocked tracking attempts.

While the extension doesn't include private search, DuckDuckGo Search is built into Safari as a default search option, and they work together to help users search and browse privately.

DuckDuckGo Privacy Essentials is only available for desktop browsers; however, the DuckDuckGo Privacy Browser app for iOS uses the same privacy protection technology.


This article, "DuckDuckGo's Safari Privacy Browser Extension Now Available for macOS Catalina" first appeared on MacRumors.com

Discuss this article in our forums