On 8 Aug 2021, I signed my name on appleprivacyletter.com, an open letter voicing the community’s deep concern over Apple’s proposal to include a backdoor that scans all of a user’s photos, both on their personal Apple devices and in their iCloud Photo Library.
You are invited to read the letter for a more comprehensive summary of the developments and an accessible description of the technical details. I will focus my discussion here on the societal issues.
Between the time I began writing this essay and its eventual publication, Apple’s move came under fresh criticism: the hashing algorithm was reverse-engineered and confirmed to have rather obvious flaws. Further reading resources will be added.
Imagine the following scenario.
You view a house for purchase, and the property developer tells you that all houses now come with a drug-detection dog. You must keep it; it will bark the moment it thinks you possess drugs, and summon the police to your location.
The developer would say something to assure you, perhaps: “See, this is just a dog. It won’t comprehend your private information even if it sees it (Data Hashing), and even if it does comprehend it, it won’t spread it (On-device Tests and Token Storage). Furthermore, the dog is so well trained that it only barks when it is extremely sure you have drugs (Match Threshold).”
Sounds good? The dog is ready to move in with you tonight.
Ah, also, you feed it out of your own pocket.
I hope the allegory captures the unease any technologically versed, law-abiding modern citizen should feel about Apple’s new policy, and why I am strongly opposed to the existence of any content-scanning backdoor at all.
Let’s go back a bit in time. For me, a sense of unease and uncertainty about personal technology started with the iOS 15 announcement in June 2021.
Back then, Apple proudly proclaimed that optical character recognition (OCR) would be automatically enabled system-wide on modern iPhones and iPads. So far, on my beta-testing devices, I haven’t found a switch to disable it, either.
They say they will safeguard the function so that it only ever helps the user.
But the fact is, all the old photos of yourself holding your passport that you were supposed to delete, the shopping receipts kept for personal records, and even things as mundane as a screenshot of a private chat, are now just plain text that somebody can sift through automatically.
I was beyond disappointed to see Apple finally reveal its true intentions behind the gradual sophistication of on-device machine-learning hardware and AI integration… None of these was a groundbreaking new feature, but each takes the compromise of user privacy to a whole new level. One thing was clear to me then: the June updates would not be the end of it, and I should keep an eye on how low they would go.
Before long I would have my answer.
You won’t be surprised that I am a SETI@home user who already donates my workstations’ idle time to astrophysical data analysis, and I cannot count how many times I have dreamed that my spare computing power could be put to vigilante use, joining forces with some movement that brings better justice and welfare to the world.
As such, this time, in a crowd-sourced battle against Child Sexual Abuse Material (CSAM), I can’t deny that Apple is fighting for a good cause. Part of me wants to see news reports of the rollout doing something, catching someone, and to know that my iPhone’s extra CPU usage helped protect vulnerable children.
The matching algorithm that Apple says it employs is more sophisticated than normal cryptographic hashing, where a single-pixel modification would completely change the resulting checksum. Instead, Apple claims that its neural hashing algorithm is robust against typical basic image manipulations, such as rotation, cropping, and colour filtering. And when a user’s questionable photo is indeed flagged for manual review, only a monochrome, low-resolution copy is provided to the human reviewer.
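The contrast can be sketched with a toy example. Apple’s neural hash is a proprietary network, so the difference hash (dHash) below is only a classical stand-in for the general idea of a perceptual hash, and the gradient “image” is made up purely for illustration: a one-pixel edit scrambles a cryptographic hash entirely, while the perceptual hash barely moves.

```python
import hashlib

def dhash(pixels):
    """Toy perceptual (difference) hash: one bit per horizontally
    adjacent pixel pair, set when brightness decreases left-to-right."""
    bits = [int(left > right)
            for row in pixels
            for left, right in zip(row, row[1:])]
    return sum(bit << i for i, bit in enumerate(bits))

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def flat(img):
    """Serialise the pixel grid for cryptographic hashing."""
    return bytes(v for row in img for v in row)

# An 8x9 grayscale "image": a smooth brightness gradient.
image = [[(row * 9 + col) * 3 % 256 for col in range(9)] for row in range(8)]

# The same image with a single pixel nudged by one brightness level.
tweaked = [row[:] for row in image]
tweaked[0][0] += 1

# A cryptographic hash changes completely after the 1-pixel edit...
print(hashlib.sha256(flat(image)).hexdigest() ==
      hashlib.sha256(flat(tweaked)).hexdigest())   # False

# ...while the perceptual hash barely moves (Hamming distance 0 here).
print(hamming(dhash(image), dhash(tweaked)))       # 0
```

This robustness is exactly what makes the tool general-purpose: a hash that survives rotation, cropping, and recolouring is a hash that can recognise *any* target imagery, not just CSAM.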
Is this a self-consistent solution perfectly suited for the problem of CSAM? Potentially so.
Is it suitable, however, for just this one problem?
I believe it is just as important to realise that Apple is solving one problem with a much more general-purpose tool, one that will irreversibly and permanently change the relationship between individuals and their technological property.
Furthermore, I would argue that the proponents of this tool have clearly demonstrated that they do not know, or are complacent about not knowing, the limitations and serious implications of the weapon they wield.
Given the generality, versatility, and robustness of their method, and the (of course) opaqueness of the AI training data set, too much control is forcibly relinquished by every owner. This seems like a direct reversal of decades of progress, and it will lead to numerous hassles (or worse) that most of us do not deserve to face.
Imagine Apple in the future apologising that, “due to an incorrect entry in the training data”, the system barks at every sight of naked people in your photographs; imagine Apple sharing your media’s neural hash values (and the recognised text discussed above) with advertisers or third-party reviewers; imagine the CIA asking Apple to match a rival country’s scientists’ faces against the people in every photo on every single iPhone, to trace their activities.
Not only does Apple rob users of several fundamental privacy rights; it also strips away their control. All of the procedures described here monitor your content using additional local resources that you provide: the comparison algorithm runs in the background on your CPU; the comparison results are stored on your disk; and when they actually become useful, additional traffic between your device and iCloud is incurred.
Is this a price you are willing to pay? To be treated as a criminal by default, and to fundamentally compromise your own privacy and digital life to prove otherwise? And to become a cooperative hostage when such mechanisms are inevitably turned into instruments of degradation, violation, and chaos?
Incidentally, an alleged leaked internal email between Apple and a US-funded NGO behind this push dismissed voices from the wider computing, cryptology, and technology communities as “the screeching voices of the minority”. I am proud to be one such voice, and, if the email is indeed authentic, I unironically wish to keep a record of their words verbatim.
I am phasing out Apple products, and will migrate my Lux photo library away from iCloud.
Apple, if you are somehow reading this: I am too innocent to deserve being treated as a criminal by default. Have fun scanning my stars while it lasts.