This work is released under a Creative Commons Attribution-NoDerivatives 4.0 International License.
"OpenPGP" refers to the OpenPGP protocol, in much the same way that
HTML refers to the standard that specifies how to write a web page.
"GnuPG", "SequoiaPGP", "OpenPGP.js", and others are implementations of
the OpenPGP protocol in the same way that Mozilla Firefox, Google
Chromium, and Microsoft Edge refer to software packages that process
HTML data.
Robert J. Hansen <rjh@sixdemonbag.org>.
I maintain the GnuPG FAQ and unofficially hold the position of crisis
communicator. This is not an official statement of the GnuPG project,
but does come from someone with commit access to the GnuPG git repo.
In the last week of June 2019 unknown actors deployed a certificate
spamming attack against two high-profile contributors in the OpenPGP
community (Robert J. Hansen and Daniel Kahn Gillmor, better known in the
community as "rjh" and "dkg"). This attack exploited a defect in the
OpenPGP protocol itself in order to "poison" rjh and dkg's OpenPGP
certificates. Anyone who attempts to import a poisoned certificate into
a vulnerable OpenPGP installation will very likely break their
installation in hard-to-debug ways. Poisoned certificates are already
on the SKS keyserver network. There is no reason to believe the
attacker will stop at just poisoning two certificates. Further, given
the ease of the attack and the highly publicized success of the attack,
it is prudent to believe other certificates will soon be poisoned.
This attack cannot be mitigated by the SKS keyserver network in any
reasonable time period. It is unlikely to be mitigated by the OpenPGP
Working Group in any reasonable time period. Future releases of OpenPGP
software will likely have some sort of mitigation, but there is no time
frame. The best mitigation that can be applied at present is simple:
stop retrieving data from the SKS keyserver network.
When Phil Zimmermann first developed PGP ("Pretty Good Privacy") in
the early 1990s there was a clear chicken-and-egg problem. Public key
cryptography could revolutionize communications, but it required that
individuals possess each other's public keys. Over time terminology has shifted:
now public key cryptography is mostly called "asymmetric cryptography"
and public keys are more often called "public certificates", but the
chicken-and-egg problem remains. To communicate privately, each party
must have a small piece of public data with which to bootstrap a private
communication channel.
Special software was written to facilitate the discovery and
distribution of public certificates. Called "keyserver software", it
can be thought of as analogous to a telephone directory. Users can
search the keyserver by a variety of different criteria to discover
public certificates which claim to belong to the desired user. The
keyserver network does not attest to the accuracy of the information,
however: that's left for each user to ascertain according to their own
criteria.
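For the curious, here is a rough sketch of what such a search looks like in
practice. SKS keyservers expose a simple web interface (HKP) for lookups; the
hostname below is a placeholder and the snippet is illustrative rather than
production code.

    # A minimal sketch of a keyserver lookup over the HKP web interface.
    # "keyserver.example" is a placeholder hostname; the /pks/lookup endpoint
    # and the op/search/options parameters follow the HKP convention that SKS
    # keyservers expose.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def search_certificates(query, host="keyserver.example"):
        """Ask a keyserver for an index of certificates matching `query`."""
        params = urlencode({"op": "index", "search": query, "options": "mr"})
        url = f"https://{host}/pks/lookup?{params}"
        with urlopen(url) as response:
            # The machine-readable index is plain text, one record per line.
            return response.read().decode("utf-8", errors="replace")

    # print(search_certificates("someone@example.org"))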
Once a user has verified that a certificate really belongs to the
person in question, they can affix an affidavit to the certificate
attesting that they have reason to believe it genuinely belongs to
that person.
For instance: John Hawley (john@example.org) and I (rjh@example.org)
are good friends in real life. We have sat down face-to-face and
confirmed certificates. I know with complete certainty a specific
public certificate belongs to him; he knows with complete certainty a
different one belongs to me. John also knows H. Peter Anvin (hpa@example.org)
and has done the same with him. If I need to communicate privately
with Peter, I can look him up in the keyserver. Whichever certificate
bears an attestation by John, I can trust really belongs to Peter.
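To make that selection logic concrete, here is a toy sketch (with made-up
names and fingerprints, not real data) of how a client might prefer the
candidate certificate that carries an attestation from someone already
verified in person.

    # Toy model: among several certificates claiming the same email address,
    # prefer the one attested by someone whose key I verified face to face.
    candidates = [
        {"fingerprint": "AAAA...", "uid": "hpa@example.org", "attested_by": set()},
        {"fingerprint": "BBBB...", "uid": "hpa@example.org", "attested_by": {"john@example.org"}},
    ]

    trusted_attesters = {"john@example.org"}  # keys I verified in person

    def pick_certificate(candidates, trusted_attesters):
        for cert in candidates:
            if cert["attested_by"] & trusted_attesters:
                return cert
        return None  # nobody I trust vouches for any candidate

    print(pick_certificate(candidates, trusted_attesters)["fingerprint"])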
In the early 1990s we were concerned repressive regimes would attempt
to force keyserver operators to replace certificates with different
ones of the government's choosing. (I speak from firsthand experience.
I've been involved in the PGP community since 1992. I was there for
these discussions.) We made a quick decision that keyservers would
never, ever, ever, delete information. Keyservers could add information
to existing certificates but could never, ever, ever, delete either a
certificate or information about a certificate.
To meet this goal, we started running an international network of
keyservers. Keyservers around the world would regularly communicate
with each other to compare directories. If a government forced a
keyserver operator to delete or modify a certificate, that would be
discovered in the comparison step. The maimed keyserver would update
itself with the content in the good keyserver's directory. This was a
simple and effective solution to the problem of government censorship.
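As a toy sketch of that reconciliation rule (with invented certificate names;
real SKS uses a far more efficient set-reconciliation algorithm), every server
ends up holding the union of what it and its peers have seen, so a censored
entry simply reappears:

    # Directories are only ever added to, never shrunk.
    server_a = {"cert-1", "cert-2"}            # after a forced deletion
    server_b = {"cert-1", "cert-2", "cert-3"}  # an untampered peer

    def reconcile(a, b):
        merged = a | b  # union: information is added, never removed
        return merged, merged

    server_a, server_b = reconcile(server_a, server_b)
    assert "cert-3" in server_a  # the censored entry is restored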
In the early 1990s this design seemed sound. It is not sound in 2019. We've known it has problems for well over a decade.
There are powerful technical and social factors inhibiting further keyserver development.
The software is Byzantine. The standard keyserver
software is called SKS, for "Synchronizing Key Server". A bright fellow
named Yaron Minsky devised a brilliant algorithm that could do
reconciliations very quickly. It became the keystone of his Ph.D.
thesis, and he wrote SKS originally as a proof of concept of his idea.
It's written in an unusual programming language called OCaml, and in a
fairly idiosyncratic dialect of it at that. This is of course no
problem for a proof of concept meant to support a Ph.D. thesis, but for
software that's deployed in the field it makes maintenance quite
difficult. Not only do we need to be bright enough to understand an
algorithm that's literally someone's Ph.D. thesis, but we also need
expertise in obscure programming languages and strange programming customs.
The software is unmaintained. Due to the above,
there is literally no one in the keyserver community who feels qualified
to do a serious overhaul on the codebase.
Changing a design goal is not the same as fixing a bug. The design goal of the keyserver network is "baked into" essentially
every part of the infrastructure. This isn't a case where there's a bug
that's inhibiting the keyserver network from functioning correctly.
Bugs are generally speaking fairly easy to fix once you know where the
problem is. Changing design goals often requires an overhaul of such
magnitude it may be better to just start over with a fresh sheet of
paper.
There is no centralized authority in the keyserver network. The lack of centralized authority was a feature, not a bug. If there
is no keyserver that controls the others, there is no single point of
failure for a government to go after. On the other hand it also means
that even after the software is overhauled and/or rewritten, each
keyserver operator has to commit to making the upgrade and stomping out
the difficulties that inevitably arise when new software is fielded.
The confederated nature of the keyserver network makes changing the
design goals even harder than it would normally be—and rest assured, it
would normally be very hard!
The keyserver network is susceptible to a variety of attacks as a
consequence of its append-only design. The keyserver network can be
thought of as an extremely large, extremely reliable, extremely
censorship-resistant distributed filesystem which anyone can write to.
Imagine if Dropbox allowed any Tom, Dick, or Harry to not only put
information in your public Dropbox folder, but made it impossible for
you to delete it. How would everyone from spammers to child
pornographers abuse this?
Many of the same attacks are possible on the keyserver network. We
have known about these vulnerabilities for well over a decade. Fixing
the keyserver network is, however, problematic for the reasons listed
above.
To limit the scope of this document, only one such vulnerability will be broken down in detail below.
Consider public certificates. In order to make them easier to use,
they have a list of attestations: statements from other people,
represented by their own public certificates, that this certificate
really belongs to the individual in question. In my example from
before, John Hawley attested to H. Peter Anvin's certificate. When I
looked for H. Peter Anvin's certificate I checked all the certificates
which claimed to belong to him and selected the one John attested as
being really his.
These attestations — what we call certificate signatures —
can be made by anyone for any purpose. And once made, they never go
away. Ever. Even when a certificate signature gets revoked the
original remains on the certificate: all that happens is a second
signature is affixed saying "don't trust the previous one I made".
The OpenPGP specification puts no limitation on how many signatures
can be attached to a certificate. The keyserver network handles
certificates with up to about 150,000 signatures.
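As a rough illustration of why that matters (this models the structure, not
the real OpenPGP packet format), think of a certificate as a key plus an
ever-growing list of third-party signatures that anyone may append to:

    class Certificate:
        def __init__(self, owner):
            self.owner = owner
            self.signatures = []  # grows without bound; the spec sets no limit

        def add_signature(self, signer, note=""):
            self.signatures.append((signer, note))

    cert = Certificate("rjh@example.org")
    cert.add_signature("john@example.org", "verified in person")
    cert.add_signature("john@example.org", "revokes my earlier signature")  # the old one stays

    # An attacker can do this too, as many times as they like:
    for i in range(150_000):
        cert.add_signature(f"attacker-{i}", "junk")

    print(len(cert.signatures))  # every importing client must wade through all of these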
GnuPG, on the other hand … doesn't. Any time GnuPG has to deal with
such a spammed certificate, GnuPG grinds to a halt. It doesn't stop,
per se, but it gets wedged for so long it is for all intents and
purposes completely unusable.
My public certificate as found on the keyserver network now has just short of 150,000 signatures on it.
Further, pay attention to the phrase "any time GnuPG has to deal with such a spammed certificate." If John were to ask GnuPG to verify my signature on H. Peter Anvin's
certificate, GnuPG would attempt to comply and in the course of business
would have to deal with my now-spammed certificate.
We've known for a decade this attack is possible. It's now here and
it's devastating. There are a few major takeaways and all of them are
bad.
One takeaway in particular requires explanation: any certificate may be poisoned at any time, and the poisoning is unlikely to be discovered until it breaks an OpenPGP installation.
The number one use of OpenPGP today is to verify downloaded packages
for Linux-based operating systems, usually using a software tool called
GnuPG. If someone were to poison a vendor's public certificate and
upload it to the keyserver network, the next time a system administrator
refreshed their keyring from the keyserver network the vendor's
now-poisoned certificate would be downloaded. At that point upgrades
become impossible because the authenticity of downloaded packages cannot
be verified. Even downloading the vendor's certificate and
re-importing it would be of no use, because GnuPG would choke trying to
import the new certificate. It is not hard to imagine how motivated
adversaries could employ this against a Linux-based computer network.
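For illustration, here is a minimal sketch of that verification workflow,
shelling out to gpg; the file names are placeholders. Both the refresh and
the verify step have to parse the vendor's certificate, which is exactly
where a poisoned certificate wedges the process.

    import subprocess

    def refresh_keys():
        # Pulls fresh copies of every certificate in the keyring -- the step
        # that would fetch a poisoned vendor certificate.
        return subprocess.run(["gpg", "--refresh-keys"], check=False)

    def verify_package(signature_file, package_file):
        result = subprocess.run(["gpg", "--verify", signature_file, package_file],
                                capture_output=True, text=True)
        return result.returncode == 0

    # if not verify_package("package.tar.gz.sig", "package.tar.gz"):
    #     raise SystemExit("signature did not verify; refusing to install")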
At present I (speaking only for myself) do not believe the global keyserver network is salvageable. High-risk users should stop using the keyserver network immediately.
Users who are comfortable editing their GnuPG configuration files should follow this process:
1. Open gpg.conf in a text editor. Ensure there is no line starting
with "keyserver". If there is, remove it.

2. Open dirmngr.conf in a text editor. Add the line
"keyserver hkps://keys.openpgp.org" to the end of it.

keys.openpgp.org is a new experimental keyserver which
is a new experimental keyserver which
is not part of the keyserver network and has some features which make it
resistant to this sort of attack. It is not a drop-in replacement: it
has some limitations (for instance, its search functionality is sharply
constrained). However, once you make this change you will be able to
run gpg --refresh-keys
with confidence.
If you know which certificate is likely poisoned, try deleting it:
this normally goes pretty quickly. If your OpenPGP installation becomes
usable again, congratulations. Acquire a new unpoisoned copy of the
certificate and import that.
If you don't know which certificate is poisoned, your best bet is to
get a list of all your certificate IDs, delete your keyrings completely,
and rebuild from scratch using known-good copies of the public
certificates.
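Here is a rough sketch of that recovery process; the paths and file names are
placeholders, and moving the old keyring aside is left as a manual step.

    import subprocess

    def list_key_ids():
        # --with-colons gives a stable, machine-parsable listing.
        out = subprocess.run(["gpg", "--list-keys", "--with-colons"],
                             capture_output=True, text=True).stdout
        return [line.split(":")[4] for line in out.splitlines() if line.startswith("pub:")]

    def import_known_good(paths):
        for path in paths:
            subprocess.run(["gpg", "--import", path], check=False)

    # ids = list_key_ids()            # save this list somewhere safe
    # (move aside ~/.gnupg/pubring.kbx by hand, then:)
    # import_known_good(["vendor-key.asc", "john.asc"])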
dkg wrote a blog post about this. He sums up my feelings pretty well, so I'm going to quote him liberally with only a trivial correction.
I've spent a significant amount of time over the years trying to push the ecosystem into a more responsible posture with respect to OpenPGP certificates, and have clearly not been as successful at it or as fast as I wanted to be. Complex ecosystems can take time to move.
To have my own certificate directly spammed in this way felt surprisingly personal, as though someone was trying to attack or punish me, specifically. I can't know whether that's actually the case, of course, nor do I really want to. And the fact that Robert J. Hansen's certificate was also spammed makes me feel a little less like a singular or unique target, but I also don't feel particularly proud of feeling relieved that someone else is also being "punished" in addition to me.
But this report wouldn't be complete if I didn't mention that I've felt disheartened and demotivated by this situation. I'm a stubborn person, and I'm trying to make the best of the situation by being constructive about at least documenting the places that are most severely broken by this. But I've also found myself tempted to walk away from this ecosystem entirely because of this incident. I don't want to be too dramatic about this, but whoever did this basically experimented on me (and Rob) directly, and it's a pretty shitty thing to do.
If you're reading this, and you set this off, and you selected me specifically because of my role in the OpenPGP ecosystem, or because I wrote the abuse-resistant-keystore draft, or because I'm part of the Autocrypt project, then you should know that I care about making this stuff work for people. If you'd reached out to me to describe what you were planning to do, we could have done all of the above bug reporting and triage using demonstration certificates, and worked on it together. I would have happily helped. I still might! But because of the way this was done, I'm not feeling particularly happy right now. I hope that someone is, somewhere.
To which I'd like to add: I have never in my adult life wished
violence on any human being. I have witnessed too much of it and its
barbaric effects, stood by the graves of too many people cut down too
young. I do not hate you and I do not wish any harm to befall you.
But if you get hit by a bus while crossing the street, I'll tell the driver everyone deserves a mulligan once in a while.
You fool. You absolute, unmitigated, unadulterated, complete and utter, fool.
Peace to everyone — including you, you son of a bitch.
— Rob