1984: Thread

grarpamp grarpamp at gmail.com
Thu May 12 23:59:28 PDT 2022


> Make no mistake... digital is a kill switch meant to shut *you* down

Your Face Is Now A Weapon Of War

by Stephanie Hare via National Interest

https://www.washingtonpost.com/technology/2022/02/16/clearview-expansion-facial-recognition/
https://news.yahoo.com/facial-recognition-fake-identities-and-digital-surveillance-tools-inside-the-post-offices-covert-internet-operations-program-214234762.html
https://www.bbc.co.uk/news/technology-60738204
https://www.nytimes.com/2022/04/07/technology/facial-recognition-ukraine-clearview.html
https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57
https://www.eff.org/deeplinks/2022/02/victory-another-lawsuit-proceeds-against-clearviews-face-surveillance
https://nationalinterest.org/blog/buzz/after-75-years-five-eyes-intelligence-alliance-remains-mystery-180852
https://www.nytimes.com/2021/02/03/technology/clearview-ai-illegal-canada.html
https://www.bbc.com/news/technology-57268121
https://ainowinstitute.org/regulatingbiometrics.pdf
https://www.reuters.com/article/clearview-ai-investigation-idINKBN24A0ZB
https://www.bbc.co.uk/news/business-59466803
https://www.bbc.co.uk/news/technology-59149236
https://gpdp.it/home/docweb/-/docweb-display/docweb/9751323
https://www.bloomberg.com/news/articles/2021-05-27/clearview-ai-hit-by-wave-of-european-privacy-complaints

Ukraine is using Clearview AI’s facial recognition technology to
identify Russians, dead and alive, drawing on a database of billions
of face images that it scraped from the internet without anyone’s
consent.

Who owns your face? You might think that you do, but consider that
Clearview AI, an American company that sells facial recognition
technology, has amassed a database of ten billion images since 2020.
By the end of the year, it plans to have scraped 100 billion facial
images from the internet. It is difficult to assess the company’s
claims, but if we take Clearview AI at face value, it has enough data
to identify almost everyone on earth and end privacy and anonymity
everywhere.

As you read these words, your face is making money for people whom
you’ve never met and who never sought your consent when they took your
faceprint from your social media profiles and online photo albums.
Today, Clearview AI’s technology is used by over 3,100 U.S. law
enforcement agencies, as well as the U.S. Postal Service. In Ukraine,
it is being used as a weapon of war. The company has offered its tools
free of charge to the Ukrainian government, which is using them to
identify dead and living Russian soldiers and then contact their
mothers.

It would be easy to shrug this off. After all, we voluntarily
surrendered our privacy the moment we began sharing photos online, and
millions of us continue to use websites and apps that fail to protect
our data, despite warnings from privacy campaigners and Western
security services. As so many of us sympathize with Ukraine and are
appalled by Russia’s brutality, it is tempting to overlook the fact
that Ukraine is not using Clearview AI to identify dead Ukrainians,
which suggests that we are witnessing the use of facial recognition
technology for psychological warfare, not identification. Some people
will be fine with the implications of this: if Russian mothers have to
receive disturbing photos of their dead sons, so be it.

To understand why we might want to rethink the use of facial
recognition technology in conflict, consider the following thought
experiments. First, imagine that it was Russia that had scraped
Ukrainian biometric data from the internet to build a facial
recognition technology tool which it was using to identify dead
Ukrainians and contact their mothers. Liberal democracies would likely
condemn these actions and add them to the growing list of Russia’s
barbaric actions. Second, imagine a conflict in which the United
States was fighting against an opponent who had taken American
faceprints to train its facial recognition technology and was using it
to identify dead American soldiers and contact their mothers. This
would almost certainly cause howls of protest across the United
States. Technology executives would be vilified in the press and
hauled before Congress, where lawmakers might finally pass a law to
protect Americans’ biometric data.

We do not need to wait for these scenarios to occur; Congress could
act now to protect Americans’ biometric data. If taking inspiration
from the European Union (EU) General Data Protection Regulation (GDPR)
seems a step too far, Congress only needs to look to Illinois, whose
Biometric Information Privacy Act (BIPA) requires that companies
obtain people’s opt-in consent before capturing facial images and
other biometrics. Clearview AI is currently fighting multiple lawsuits
in federal and state courts in Illinois for failing to obtain users’
consent. These lawsuits highlight a troubling aspect of facial
recognition technology in the United States: Americans’ privacy, civil
liberties, and rights over their biometric data vary from state to
state, and even within states, and are not protected by federal law.

To remedy this problem, Congress could also look to several U.S.
allies, including three of its Five Eyes intelligence-sharing
partners. In 2021, Canada’s privacy commissioner identified Clearview
AI as a tool of mass surveillance and declared it illegal, while
British and Australian data regulators fined Clearview AI and ordered
it to delete their citizens’ data. In the EU, Italy recently fined
Clearview €20 million, ordered it to delete all of the data that it
had collected from Italian citizens, and prohibited Clearview AI from
collecting more data. Last year, Sweden fined police authorities for
using Clearview AI’s technology to identify people, and a number of
privacy, civil liberties, and human rights groups filed complaints
against the company with data regulators in France, Austria, and
Greece. Unsurprisingly, in May 2021, Clearview AI said it had no
EU-based customers.

For all of Clearview AI’s many flaws, the challenge free societies
face is about more than the actions of one company. Many companies and
governments are using similar means to create similar tools, such as
PimEyes, FindClone, and TrueFace. Liberal democracies can regulate
them, but currently, there is nothing preventing adversaries from
capturing our faces and other biometric data. Failing to act could
endanger soldiers, security personnel, and law enforcement officers,
as well as civilian populations. It is time to confront this challenge
head-on.
