If you’re unfamiliar with it, you can brush up on Apple’s CSAM proposal here (scroll down to the “CSAM detection” heading): https://www.apple.com/child-safety/
As an iPhone user since 2009, this move by Apple is very concerning. I can’t say it any better than the good folks here have already: https://appleprivacyletter.com
But Apple Already Scans Photos Uploaded to iCloud…
Yes, they do. And they have every right to scan their servers for illegal and/or abusive content. But this is a new technology we’re talking about, one that moves scanning to users’ devices, so it deserves its own discussion.
But Apple Says It Won’t Let This System Be Abused…
Are we really going to believe that, long-term, this functionality will just sit on our phones and do nothing, even though it could scan photos as they're added to the library? Or messages as they're being typed?
The problem is that once you add this functionality, the cat's out of the bag: Apple's reply to requests to further invade users' privacy on-device changes from "we can't" to "we won't", a position Apple can't possibly hope to maintain, especially against the likes of China or the US government.
But Don’t We Already Trust Apple, Especially with Data Uploaded to iCloud?
Yes, we do. And at some point you have to trust someone, whether it’s your phone’s manufacturer, your telco, or the developers of the apps you use.
But when there's a flagrant disregard for users and the potential for abuse that a system like CSAM scanning has (see "Apple photo-scanning plan faces global backlash from 90 rights groups"), to me that crosses a line and means the company pushing such a policy is no longer trustworthy.
If a company actively screws its users in broad daylight, then what is it willing to do behind closed doors?
At least previously Apple had the veneer of a privacy- and user-centric company. No more, if this goes through.
But the Technology is Solid…
No, it's not. Experts who previously built and researched a CSAM-scanning system have said:
A foreign government could, for example, compel a service to out people sharing disfavored political speech. That’s no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.
We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.
We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides…

From "University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology" (https://www.macrumors.com/2021/08/20/university-researchers-csam-dangerous/)
Also, check out this post by respected technologist Bruce Schneier: Apple’s NeuralHash Algorithm Has Been Reverse-Engineered.
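To see why the researchers' false-positive and collision warnings apply, it helps to look at how perceptual hashing works in general. The toy "difference hash" below is NOT Apple's NeuralHash (which uses a neural network); it's just a minimal sketch of the same family of techniques, showing that near-duplicate images are deliberately hashed to the same value, which is exactly the property that makes false matches and adversarial collisions possible.

```python
# Toy "dHash" (difference hash), a simple perceptual hash. Illustrative
# only: Apple's NeuralHash is a neural network, not this grid comparison.

def dhash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) by comparing
    each pixel to its right-hand neighbor: bit 1 if brighter, else 0."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return int("".join(map(str, bits)), 2)

# Two "images": the second has small pixel-level differences (think
# compression noise or resizing), yet both produce the identical hash.
original = [
    [200, 100, 50, 25],
    [30, 60, 120, 240],
]
perturbed = [
    [198, 103, 48, 26],  # slightly altered pixel values
    [28, 63, 118, 243],
]

print(dhash(original) == dhash(perturbed))  # -> True
```

Matching near-duplicates is the whole point of a perceptual hash, but the same tolerance means an unrelated or maliciously crafted image can be made to collide with a flagged one, which is what the reverse-engineering work on NeuralHash demonstrated in practice.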
OK but CSAM Scanning is Only for Photos Uploaded to iCloud, So Just Opt Out
This is where it starts, not where it ends.
The long-term goal here is probably to allow the scanning of Signal, WhatsApp, etc. messages on-device. No need for encryption backdoors then. And better to have this capability device-wide vs. app-by-app like China (WeChat) and others are currently doing.
But I Have Nothing to Hide…
The problem is that the definition of what is "illegal" can shift very quickly, especially in certain countries, so moving the ability to scan for "illegal" content onto users' devices is absolutely disastrous for privacy and freedom.
This technology exposes all users of iOS, past and present, to the whims of the current definition of “legal”.
But we Have to Stop the Bad Guys…
If you add CSAM scanning to phones, sure, bad guys probably won't use iPhones to do bad things anymore; they'll simply move to other technology. Meanwhile, all users are now exposed to the potential abuses of the scanning system. And if history has taught us anything, it's that systems like this will be abused and/or co-opted.
In the end, aren't we just trading away the iPhone's privacy for the much greater abuses that CSAM scanning will enable?
These are my concerns. I respect the opposite position though, and definitely have no judgment for anyone who continues to use Apple devices should this feature be rolled out. I’m just not comfortable with where this path will take us…and it makes me sad because I freakin’ love my iPhone.
If you’re concerned too, please sign the EFF’s petition here: https://www.nospyphone.com