Apple has officially killed off one of its most controversial proposals: a plan to analyze iCloud images for signs of child sexual abuse material (or CSAM).
Yes, last summer, Apple announced that it would roll out on-device scanning, a new feature in iOS that used cutting-edge technology to quietly scan individual users’ photos for signs of such material. The new feature was designed so that, if the scanner found evidence of CSAM, it would alert human technicians, who would then presumably alert the police.
The plan immediately inspired a torrent of backlash from privacy and security experts, with critics arguing that the scanning feature could ultimately be repurposed to hunt for other kinds of content. Even having such scanning capabilities in iOS was a slippery slope toward broader surveillance abuses, critics alleged, and the general consensus was that the tool would quickly become a back door for police.
At the time, Apple fought hard against these criticisms, but the company eventually relented, and shortly after initially announcing the new feature, it said it would “postpone” implementation until a later date.
Now it looks like that date will never come. On Wednesday, amid announcements about a slew of new iCloud security features, the company also revealed that it will not be moving forward with its on-device scanning plans. In a statement shared with Wired magazine, Apple made it clear that it had decided to take a different route:
After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
Apple’s plans seemed well intentioned. The digital proliferation of CSAM is a major problem, and experts say it has only gotten worse in recent years. An effort to solve this problem was obviously a good thing. That said, the underlying technology Apple proposed to use, and the surveillance dangers it posed, made it seem like it just wasn’t the right tool for the job.