My Take on Apple's CSAM Detection

Here is my take on Apple's just-announced client-side CSAM detection.

To start with, let's review a few facts. The term "client-side" is a bit misleading, because Apple will only scan photos that are about to be uploaded to your iCloud Photo Library.

In other words, your photos will not be scanned if they stay strictly on your iPhone. So in practice, there is little difference from what $MSFT and $GOOG have been doing on their cloud services, except that $MSFT and $GOOG perform the scanning in the cloud.

They also use hashes to compare photos against the fingerprints of known CSAM, so a photo of your own kid sent to a doctor will not be flagged.
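To make the matching concrete, here is a minimal sketch of perceptual-hash comparison, assuming a toy average-hash and a made-up fingerprint database. Apple's actual system uses a neural-network-based perceptual hash (NeuralHash) with cryptographic protections layered on top, so everything below is illustrative only.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: downscale, grayscale, threshold on mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical fingerprints of known images (made-up values, not real data).
KNOWN_FINGERPRINTS = {0x8F3A5C7E91D24B60}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag a photo only if it is near-identical to a known fingerprint."""
    h = average_hash(path)
    return any(hamming(h, f) <= max_distance for f in KNOWN_FINGERPRINTS)
```

Because the comparison is against fingerprints of specific known images, a novel photo, however sensitive, simply has no entry to match. Apple's design additionally requires a threshold number of matches before an account is flagged for human review.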

However, the EFF (https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life) does have several good points about privacy. One of the biggest is the "slippery slope" argument: a government could easily censor other content by supplying hashes of arbitrary material it wants suppressed.

And that is because the hashes supplied to Apple are not auditable by the public: no one outside the supplying agencies can verify that they truly correspond to CSAM. I fully agree that this is a dangerous slippery slope. But there is always a trade-off between privacy and the public good.
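Here is a tiny illustration of why auditing is impossible, using SHA-256 for simplicity (Apple's perceptual NeuralHash has the same property): every fingerprint is equally opaque, so an auditor cannot tell a legitimate entry from a censorship entry. The input strings below are placeholders, not real data.

```python
import hashlib

# Hypothetical database entries (placeholder inputs for illustration).
supplied = [
    hashlib.sha256(b"bytes of a known CSAM image ...").hexdigest(),
    hashlib.sha256(b"bytes of a banned protest poster ...").hexdigest(),
]

for digest in supplied:
    print(digest)  # both print as indistinguishable 64-char hex strings

# Hashes are one-way: without the original images, which cannot legally be
# redistributed for auditing, no digest can be traced back to its content.
```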

The real underlying issue is that we do not have a transparent, neutral, universally agreed-upon authority that we can trust, and such an authority is almost impossible to establish. If the hashes were auditable, Apple's implementation would be much less of a concern.

Another elephant in the room is the power of governments over companies. Big tech has constantly been under pressure from the FBI and CIA to unlock phones and decrypt data. Many of these requests have legitimate reasons, but I don't know how many serve other purposes.

If this pressure is universal, do you really think Android or even Windows will be immune? In other words, the slippery slope is not just Apple's. It's everyone's, unless you use a Linux phone that you compiled yourself.

I think Apple's solution is the most balanced among the existing approaches. I imagine the reason for client-side scanning is to fulfill their slogan, "What happens on your iPhone, stays on your iPhone." You can refuse the scan entirely by declining iCloud Photos.

If they wanted to cooperate with government censorship that extends beyond CSAM, why not straight up perform cloud-based scanning like $MSFT and $GOOG? That would have been technically easier.

I also doubt the effectiveness of scanning for CSAM on cloud storage, because alternative encrypted channels for spreading it are readily available. Law enforcement will only catch the few criminals careless enough to use cloud storage. Is the privacy trade-off worth it?

But at the same time, if you are concerned about privacy, alternative communication channels are also readily available. To spread protest information, you can use Signal instead of iMessage. You can use an encrypted storage app instead of iCloud.

The true slippery slope begins when those alternative channels are no longer available, which, unfortunately, is also the only way to ultimately stop CSAM.

When that happens, it's a deal-breaker for me. I will switch back to Linux from macOS, which I have used for eight years since graduating from high school, and perhaps also switch from the iPhone to an open-source phone. Until then, Apple has served me well.

Last updated: August 6, 2021
