Victory! Apple pledges to encrypt iCloud, drops phone scanning plans

Today Apple announced that it will provide fully encrypted iCloud backups, meeting a long-standing request from the EFF and other privacy-focused organizations.

We commend Apple for listening to experts, child advocates, and users who want to protect their most sensitive data. Encryption is one of the most important tools we have for maintaining privacy and security online. That’s why we included the demand that Apple let users encrypt their iCloud backups in our Fix It Already campaign, which we launched in 2019.

Apple’s on-device encryption is strong, but some especially sensitive iCloud data, such as photos and backups, has remained vulnerable to government demands and hackers. Users who opt in to Apple’s new feature, which the company calls Advanced Data Protection for iCloud, will be protected even in the event of a cloud data breach, a government demand, or a breach from within Apple (such as a rogue employee). Apple said today that the feature will be available to US users by the end of the year, and will roll out to the rest of the world in “early 2023.”

We are also pleased to learn that Apple has officially dropped its plans to install photo-scanning software on its devices, which would have inspected users’ private photos in iCloud and iMessage. This software, a version of what’s called “client-side scanning,” was intended to locate child abuse images and report them to authorities. When a user’s information is end-to-end encrypted and there is no device scanning, the user has true control over who has access to that data.

Apple’s image-scanning plans were announced in 2021, but deferred after EFF supporters protested and delivered a petition containing more than 60,000 signatures to Apple executives. While Apple quietly shelved those scanning plans later that year, today’s announcement makes the decision official.

In a statement distributed to Wired and other journalists, Apple said:

We have further decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the Internet a safer place for children and for all of us.

The company said it would instead focus on “parent-enabling tools” and “privacy-preserving solutions to combat child sexual abuse and protect children, while addressing the confidentiality needs of personal communications and data storage.”

Constant scanning for child abuse images can lead to unwarranted investigations and false positives. Earlier this year, the New York Times reported on how faulty scans at Google led to false accusations of child abuse against fathers in Texas and California. The men were cleared by police, but Google permanently deleted their accounts anyway.

Companies should stop trying to square the circle by putting bugs in our pockets at the demand of governments, and instead focus on protecting their users and human rights. Today Apple took a big step forward on both fronts. There are a number of implementation choices that can affect the overall security of the new feature, and we’ll be pushing Apple to make sure the encryption is as strong as possible. Finally, we’d like Apple to go a step further: enabling these privacy features by default would mean that all users have their rights protected.
