Apple’s plan to scan for child abuse images ‘tears at the heart of privacy’

The world’s leading security and cryptography experts have warned that technology like that recommended by Apple to search for images of child sexual abuse on iPhones would open the door to mass surveillance and leave devices vulnerable to abuse.

Client-side scanning (CSS), which gives access to data on users’ devices, including stored data, “takes surveillance to a new level”, according to an analysis by academics at the Harvard Kennedy School, the Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.

They write that the technology, which puts monitoring software on users’ devices, not only “tears at the heart of privacy of individual citizens” but is also fallible, can be evaded by those it is intended to target, and is open to abuse.

In Bugs in Our Pockets: The Risks of Client-Side Scanning, a 46-page analysis of CSS published on the open-access site arXiv on Friday, the authors said: “In reality, CSS is bulk intercept, albeit automated and distributed… CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale.

“Plainly put, it is a dangerous technology. Even if deployed initially to scan for child sexual abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”

Apple’s plans, announced this year, involve a technique known as perceptual hashing to compare photos with known images of child abuse when users choose to upload them to the cloud. If the company detects enough matches, it would manually review the flagged images before reporting the user’s account to law enforcement.
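Apple’s own NeuralHash algorithm is proprietary, but the family of techniques is easy to illustrate. Below is a minimal, hypothetical sketch of one classic perceptual hash, the “difference hash”, written in Python using the Pillow imaging library; it is not Apple’s algorithm, only an example of how visually similar images can be matched by comparing fingerprints.

```python
from PIL import Image

def dhash(image_path, hash_size=8):
    """Difference hash, a simple form of perceptual hashing.

    The image is shrunk to (hash_size+1) x hash_size greyscale
    pixels, and each bit records whether a pixel is brighter than
    its right-hand neighbour. Visually similar photos produce
    hashes that differ in only a few bits.
    """
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(left > right)
    return sum(1 << i for i, bit in enumerate(bits) if bit)

def hamming_distance(h1, h2):
    """Bits that differ between two hashes; small means similar images."""
    return bin(h1 ^ h2).count("1")

# A photo would be flagged when its hash sits close enough to one on
# a blocklist of known-image hashes (the threshold is illustrative):
# if hamming_distance(dhash("upload.jpg"), blocklisted_hash) <= 4: ...
```

Unlike a cryptographic hash, a perceptual hash is deliberately tolerant of small changes to an image, which is exactly the property the attacks described below exploit.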

Apple paused implementation of the plan last month after a backlash from privacy campaigners, but not before researchers succeeded in creating wildly different images that produced the same fingerprint, and so would look identical to Apple’s scanning system, generating false positives.
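The published collisions against NeuralHash were found by gradient search on the neural network itself, but the idea is easier to see against a toy hash such as the sketch above: because the fingerprint records only 64 brightness comparisons, an image can be constructed to carry any desired hash. A hypothetical illustration, reusing the dhash() sketch:

```python
from PIL import Image

def image_with_dhash(target_hash, hash_size=8, scale=32):
    """Build a blocky image whose difference hash equals target_hash,
    so two completely different-looking images share a fingerprint
    (a false positive against the toy hash, not Apple's system)."""
    img = Image.new("L", (hash_size + 1, hash_size))
    px = img.load()
    for row in range(hash_size):
        px[0, row] = 128  # arbitrary starting brightness per row
        for col in range(hash_size):
            bit = (target_hash >> (row * hash_size + col)) & 1
            # Make each pixel brighter or darker than its left
            # neighbour so the recorded comparison comes out as required
            # (a set bit means left > right).
            step = -8 if bit else 8
            px[col + 1, row] = px[col, row] + step
    # Upscale for display; margins of 8 grey levels are wide enough
    # that the hash usually survives the resampling round trip.
    return img.resize(((hash_size + 1) * scale, hash_size * scale),
                      Image.NEAREST)
```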

Others have succeeded in doing the opposite: changing the mathematical output of an image without altering its appearance, creating false negatives.
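Against the toy hash above, that direction is simpler still: find the brightness comparison with the narrowest margin and nudge the corresponding patch of the photo just past it. The sketch below is hypothetical and flips a single bit for clarity, where a real evasion would have to push the hash past the matcher’s whole distance threshold:

```python
from PIL import Image

def flip_one_hash_bit(image_path, out_path, hash_size=8):
    """Make an imperceptible edit that changes the image's dhash.

    Finds the adjacent pixel pair in the downscaled image whose
    brightness gap is smallest, then brightens the matching region
    of the full-size photo just enough to reverse that comparison:
    the photo looks the same, but the hash no longer matches.
    """
    img = Image.open(image_path).convert("L")
    small = img.resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(small.getdata())

    # Locate the comparison bit with the narrowest margin.
    best = (256, 0, 0, False)
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            if abs(left - right) < best[0]:
                best = (abs(left - right), row, col, left > right)
    margin, row, col, left_brighter = best

    # Brighten the darker side of that pair by just past the margin.
    cell_w = img.width / (hash_size + 1)
    cell_h = img.height / hash_size
    tcol = col + 1 if left_brighter else col
    px = img.load()
    for y in range(int(row * cell_h), int((row + 1) * cell_h)):
        for x in range(int(tcol * cell_w), int((tcol + 1) * cell_w)):
            px[x, y] = min(255, px[x, y] + margin + 2)
    img.save(out_path)
```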

The report’s authors say people may try to disable the scanners, or avoid using devices such as iPhones that carry CSS. They add: “The software provider, the infrastructure operator and the targeting curator must all be trusted. If any of them – or their key employees – misbehave, or are corrupted, hacked or coerced, the security of the system may fail.”

While CSS can arguably be targeted at specific content, the report warns: “Come the next terror scare, a little coercion will be all that is needed to curtail or remove the current protections.”

It points out that Apple appears to have yielded to government pressure before, for example by moving the iCloud data of Chinese users to data centres controlled by a Chinese state-owned company, and by removing the tactical voting app of the jailed Russian opposition leader Alexei Navalny from its Russian app store.

Ross Anderson, one of the report’s co-authors and professor of security engineering at the University of Cambridge, said: “It’s a very small step from there [targeting child sexual abuse material] to various governments saying: ‘Here is a list of other pictures we want added to the list of naughty pictures for iPhones in our country.’”

Apple, approached for comment, pointed the Guardian to its earlier statement: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
