1984 SpyVeillance: Apple Adding Mandatory NeuralMatch Scanner and Full-Take Infrastructure, Pegasus

grarpamp grarpamp at gmail.com
Thu Aug 5 13:47:56 PDT 2021

Apple just committed "suicide by consumer" on behalf of Governments.
Amid a huge internet backlash, Apple, caught off guard, is spinning
out its PR drop early. Fight back against SpyVeillance 1984!

"It is an absolutely appalling idea, because it is going to lead
to distributed bulk surveillance of...our phones and laptops," said
Ross Anderson, professor of security engineering at the University
of Cambridge.

Alec Muffett, a security researcher and privacy campaigner who
formerly worked at Facebook and Deliveroo, said Apple's move was
"tectonic" and a "huge and regressive step for individual privacy".
"Apple are walking back privacy to enable 1984."

I've had independent confirmation from multiple people that Apple
is releasing a client-side tool for CSAM scanning tomorrow. This
is a really bad idea. These tools will allow Apple to scan your
iPhone photos for photos that match a specific perceptual hash, and
report them to Apple servers if too many appear.

Initially I understand this will be used to perform client-side
scanning of cloud-stored photos. Eventually it could be a key
ingredient in adding surveillance to encrypted messaging systems.
The ability to add scanning systems like this to E2E messaging
systems has been a major ask by law enforcement the world over.
Here's an open letter to that effect signed by former AG William
Barr and other western governments.

This sort of tool can be a boon for finding child pornography on
people's phones. But imagine what it could do in the hands of an
authoritarian government. The way Apple is doing this launch, they're
going to start with non-E2E photos that people have already shared
with the cloud, so it doesn't hurt anyone's privacy. But you have
to ask why anyone would develop a system like this if scanning E2E
photos wasn't the goal.

Even if you believe Apple won't allow these tools to be misused,
there's still a lot to be concerned about. These systems rely on a
database of problematic media hashes that you, as a consumer, can't
review. The hashes use a new and proprietary neural hashing algorithm
Apple has developed and gotten NCMEC to agree to use. We don't know
much about this algorithm. What if someone can make collisions?
Imagine someone sends you a perfectly harmless political media file
that you share with a friend, but that file shares a hash with some
known child-porn file. An earlier investigation, using a much simpler
hash function than the one Apple is developing, showed how machine
learning can be used to find such collisions.

The idea that Apple is a privacy company has bought them a lot of
good press. But it's important to remember that this is the same
company that won't encrypt your iCloud backups because the FBI put
pressure on them.
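To make the mechanics concrete: perceptual hashes are matched by closeness rather than equality, and a report fires only once enough photos match. Here is a minimal sketch of that general idea, using a toy 64-bit difference hash ("dHash") rather than Apple's proprietary NeuralHash, whose algorithm is not public. The function names and both thresholds are illustrative assumptions, not Apple's design:

```python
import random

def dhash(pixels):
    """Hash an 8x9 grayscale grid: one bit per adjacent-pixel comparison."""
    bits = 0
    for row in pixels:                      # 8 rows of 9 pixels -> 64 bits
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def flagged(photo_hashes, database, distance=4, report_threshold=10):
    """Count near-matches against an opaque hash database; 'report'
    only once enough photos match. Both thresholds are made up here."""
    hits = sum(1 for h in photo_hashes
               if any(hamming(h, known) <= distance for known in database))
    return hits >= report_threshold

# A brightened copy of an image produces the exact same hash, because
# every adjacent-pixel comparison is preserved:
random.seed(0)
img = [[random.randrange(256) for _ in range(9)] for _ in range(8)]
brighter = [[p + 20 for p in row] for row in img]
assert hamming(dhash(img), dhash(brighter)) == 0
```

That robustness to edits is the whole point of perceptual hashing, and it is the same property that makes adversarial collisions feasible: any sufficiently "close" file matches, whether or not it is the file in the database.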

Dumbed-down explanation: Apple's iPhones will soon start secretly
calling the police if they find photos on your phone that match
fingerprints of photos depicting child abuse, or of any content
eventually deemed objectionable. This can/will be generalized to
secure messaging. Removing "Apple enthusiast" from my Twitter bio.
What a joke. If Apple doesn't walk back this decision, nobody has
any business calling them a privacy company anymore.





More information about the cypherpunks mailing list