If You Want to Protect A Security Secret, Make Sure It's Public
<http://online.wsj.com/article_print/0,,SB107930573476054980,00.html>

The Wall Street Journal
March 15, 2004
PORTALS
By LEE GOMES

If You Want to Protect A Security Secret, Make Sure It's Public

Here is some news that is shocking but true: The most sensitive, most highly classified secrets of the U.S. government will soon be in the hands of two foreigners, both of them self-described "Linux hackers."

It's nothing to be alarmed about, though. Joan Daemen and Vincent Rijmen, two Belgian mathematicians, won a U.S.-sponsored global competition in 2000 to design the encryption system that will henceforth encode the secret communications of the U.S. government. The contest was an entirely open affair, and the winners were selected after a lengthy public process. You can go online yourself and test the Daemen-Rijmen Advanced Encryption Standard, assuming you're handy with the likes of matrix multiplication.

It seems that the world's cryptographers, while dealing with keeping secrets, do most of their work in public. That's worth remembering as the country moves to electronic voting.

The connection between cryptography and voting may not be immediately apparent. But in both fields, the integrity of something secret must be maintained, often in very hostile circumstances.

After the Florida recount debacle, there is now a big push in the U.S. toward electronic-voting systems; 50 million people are expected to be using them this November. The problem is that most of the systems being purchased by local election officials are proprietary, "black box" solutions sold by companies who, citing trade-secret issues, won't let others look inside them.

It's not just conspiracy theorists who are worried about this, but leading computer scientists. Proprietary balloting software leaked by corporate insiders has been discovered by outside evaluators to be full of security holes.

Thus, the good folks working to guarantee secret ballots should learn something from the people who work to guarantee secret messages. They never trust anyone who says "trust us."

The basic approach in modern cryptography is to keep the pattern of your specific key a secret, but not to worry if the overall design of your lock gets out. It's called Kerckhoffs' Principle, after Auguste Kerckhoffs, a 19th-century cryptographer who, like Messrs. Daemen and Rijmen, was Flemish. He listed six guidelines for a reliable encryption system. No. 2 was, "It must not be required to be secret, and it must be able to fall into the hands of the enemy without inconvenience."

The idea is counterintuitive, and for most of the long history of secret codes, it was ignored. But with the rise of computer-assisted cryptography in the past 50 years or so, there has been a sea change in the working assumptions of cryptographers. Now, "you can't get good cryptography by designing in secret," says Whitfield Diffie, co-inventor of the "public key" encryption system that revolutionized the field, and currently chief security officer at Sun Microsystems.

If you use the Internet, you are using an alphabet soup of different encoding methods, all available for public inspection: RSA, SSL and more. Many security problems exist on the Internet, but none involve these algorithms.

Why make this stuff public? Because even the smartest people make mistakes. David Kahn, author of "The Codebreakers," says that hubris is something of an occupational hazard among code makers.
"One of the patterns in cryptographic history is how people always believe the system they just created is unbreakable," he says. "Someone very clever will create a cipher, but then someone even cleverer will come along and find a flaw in it." Mr. Kahn notes that the German businessmen who began selling the famed Enigma machine in the 1920s thought they had an unbreakable system. They marketed the device by boasting that even if someone else had an Enigma, he couldn't read your messages. Lucky for us, they were wrong. Polish, and later British, cryptographers were able to defeat Enigma, in part because at least in the early years, it gave away a clue by repeating the first three characters of a transmission twice in a row. These days, tens of thousands of cryptographers use the Internet as a kind of global Bletchley Park, the famed World War II site where the British cracked Enigma. Indeed, cryptographer Paul Kocher notes a pattern: Cryptographic systems developed in public tend to stand up; those developed in secret, like those for DVD systems or European-style GMS phones, often get broken. But if the entire world can see your encryption method, couldn't some smart bad guy find a flaw in it and quietly use the information against you? In theory, yes. But the real world doesn't work that way. Think of all the graduate students eager to make a name for themselves by pointing out someone else's mistake. Mr. Kocher, for instance, is a cryptocelebrity because as a student, he found a subtle but serious theoretical flaw in the widely used RSA encryption method. The system could then be repaired. You get the point by now. Cryptography is developed in public. If it's good enough for eBay, isn't it good enough for the ballot box? -- ----------------- R. A. Hettinga <mailto: rah@ibuc.com> The Internet Bearer Underwriting Corporation <http://www.ibuc.com/> 44 Farquhar Street, Boston, MA 02131 USA "... however it may deserve respect for its usefulness and antiquity, [predicting the end of the world] has not been found agreeable to experience." -- Edward Gibbon, 'Decline and Fall of the Roman Empire'
R. A. Hettinga (2004-03-15 02:07Z) wrote:
<http://online.wsj.com/article_print/0,,SB107930573476054980,00.html>
If You Want to Protect A Security Secret, Make Sure It's Public
What is "terrible article titles for $500, Alex"? -- That woman deserves her revenge... and... we deserve to die. -- Budd, "Kill Bill"
Despite the long-lived argument that public review of crypto assures its reliability, no national infosec agency -- in any country worldwide -- follows that practice for its most secure systems. NSA's support for AES notwithstanding, the agency does not disclose its military and high-level systems.

It is likely that these agencies are willing to go along with the notion of public review to lull users into depending on the systems made public. If any are breakable, the review will show that, and if the agencies can break them they need not say squat, merely reap the benefits of public ignorance and trust in seemingly unbreakable systems, as with Enigma, Crypto AG, and numerous other historical examples David Kahn describes.

Cryptome's FOI request for NSA documents on when and what it learned about public key (non-secret) crypto from the Brits is now 3 1/2 years old. The agency has said it has relevant documents but has not yet released anything, though some $4,000 has been paid for the search. (Last response from NSA: May 23, 2003, a telephone call from Pamela Philips, FOIA Chief, saying that the request was in the "easy queue," number 45 from the top.)

Whit Diffie has said he got hints of PK, or something like it, at NSA. It is not clear from his account whether information on PK was deliberately leaked to him, with or without a restriction on disclosure, or whether the breakthrough was truly a phenomenal private effort of Diffie-Hellman-Merkle.

Consider that intelligence agencies are known to run years-, even decades-long deception operations, especially around top secret infosec operations, with the goal of deceiving about the strength of infosec systems so that they will be sufficiently trusted to be widely used. Again, Kahn cites numerous examples of such deceptions. The reputations of witting and unwitting participants and institutions are often used to gain trust in these breakable systems. The weaknesses of vaunted systems are considered to be more valuable than their strengths.

It is imaginable that if AES did not exist it would have to be invented for such a purpose. The same goes for PK, PGP, and the notion that public review of crypto is the hen's teeth of assurance.

Until national infosec agencies reveal what they know, it does not seem prudent to believe the conventional wisdom, no matter how often it is repeated -- indeed, especially because it is so often repeated.

A claim of a 100% safe crypto system is never to be believed -- isn't that the caveat that always accompanies cryptographers' assurances? They know better than anyone that snake oil is their No. 1 tool. Snake oil = crypto, which accounts for why the charge is so often hurled, and why snake oil is used to camouflage what is occurring beneath its contemptible obviousness.
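[For readers who have not met the term, the "public key (non-secret) crypto" mentioned above refers to schemes such as Diffie-Hellman key agreement, in which two parties derive a shared secret over a channel anyone can read. Below is a toy Python sketch; the prime, the generator, and the variable names are purely illustrative, and real deployments use groups of 2048 bits or more.]

```python
# Toy Diffie-Hellman key agreement -- illustration only, with a deliberately
# small prime. The modulus p and generator g are public; only the private
# exponents a and b are secret, and the shared key is never transmitted.
import secrets

p = 4294967291          # small prime for demonstration; NOT a secure group size
g = 5

a = secrets.randbelow(p - 2) + 2    # Alice's private exponent
b = secrets.randbelow(p - 2) + 2    # Bob's private exponent

A = pow(g, a, p)        # Alice sends A in the clear
B = pow(g, b, p)        # Bob sends B in the clear

# Each side combines the other's public value with its own secret exponent.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # same key on both ends, never sent on the wire
```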
John Young <jya@pipeline.com> wrote:
Despite the long-lived argument that public review of crypto assures its reliability, no national infosec agency -- in any country worldwide -- follows that practice for the most secure systems. NSA's support for AES notwithstanding, the agency does not disclose its military and high level systems.
Nevertheless, given that the public has two options (disclosure or non-), it seems public review is as good as it gets. You're right, of course -- don't put 100% trust in anything -- but I think it's still reasonable to trust a publicly reviewed system more than a closed one.

--
Riad Wahby                rsw@jfet.org
MIT VI-2 M.Eng
Riad S. Wahby wrote:
John Young <jya@pipeline.com> wrote:
Despite the long-lived argument that public review of crypto assures its reliability, no national infosec agency -- in any country worldwide -- follows that practice for the most secure systems. NSA's support for AES notwithstanding, the agency does not disclose its military and high level systems.
Nevertheless, given that the public has two options (disclosure or non-), it seems public review is as good as it gets.

I also can't see an alternative; yes, we are giving military organizations the "crown jewels" of our efforts for no cost (although at least in theory they should pay for anything that is copyrighted or patented :), but no large company can afford to spend a fraction of what the NSA does every day on analysis -- it comes down to relying on the community, or relying on a handful of staff who may or may not be able to code their way out of a paper bag (and if there is no community to confer peer status on a cryptographer, how can you tell good from bad when you hire one?).

Almost always, closed-source systems are either snake oil, or based on publicly accepted algorithms with just a few extra valueless steps thrown in so that the vendor can claim the design is different (VME, for example, can be very secure indeed provided you combine it with something else -- explicitly mentioned as an option in the patent document -- but the combined system is still patented because their silly variant on a classic cipher is used at some point).
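[A minimal sketch of the pattern described above: a standard public algorithm wrapped in a "proprietary" extra step. The vendor, the function names, and the XOR constant below are invented for illustration; the point is that once the wrapper is known, as Kerckhoffs' Principle assumes it will be, the extra step contributes nothing and all of the real strength comes from the underlying public cipher and its secret key.]

```python
# Sketch of "public algorithm plus a valueless proprietary step".
# A hypothetical vendor wraps standard AES-GCM in a fixed XOR "whitening"
# pass and markets the result as a new cipher.
# Assumes the third-party "cryptography" package: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MAGIC = 0x5A  # the "secret sauce" -- a constant baked into every shipped copy

def vendor_encrypt(key: bytes, nonce: bytes, msg: bytes) -> bytes:
    ct = AESGCM(key).encrypt(nonce, msg, None)   # all real security lives here
    return bytes(b ^ MAGIC for b in ct)          # the valueless extra step

def strip_wrapper(blob: bytes) -> bytes:
    # Once the design is reverse-engineered or leaked, anyone can undo the
    # wrapper; the system is exactly as strong, and as weak, as plain AES-GCM.
    return bytes(b ^ MAGIC for b in blob)

key, nonce = AESGCM.generate_key(bit_length=128), os.urandom(12)
blob = vendor_encrypt(key, nonce, b"proprietary is not stronger")
assert AESGCM(key).decrypt(nonce, strip_wrapper(blob), None) == b"proprietary is not stronger"
```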
participants (5)
- Dave Howe
- John Young
- Justin
- R. A. Hettinga
- Riad S. Wahby