On 9/04/13 03:43 AM, Jeffrey Goldberg wrote:
> On Apr 8, 2013, at 7:38 AM, ianG <iang@iang.org> wrote:
>> We all know stories. DES is now revealed as interfered with, yet for decades we told each other it was just parity bits.
> But it turned out that the interference was to make it *stronger* against attacks (differential cryptanalysis) that only the NSA and IBM knew about at the time.
That's what we all believed. From Wikipedia (I haven't checked the primary references):

======================
In contrast, a declassified NSA book on cryptologic history states:

    In 1973 NBS solicited private industry for a data encryption standard (DES). The first offerings were disappointing, so NSA began working on its own algorithm. Then Howard Rosenblum, deputy director for research and engineering, discovered that Walter Tuchman of IBM was working on a modification to Lucifer for general use. NSA gave Tuchman a clearance and brought him in to work jointly with the Agency on his Lucifer modification. [8]

and:

    NSA worked closely with IBM to strengthen the algorithm against all except brute force attacks and to strengthen substitution tables, called S-boxes. Conversely, NSA tried to convince IBM to reduce the length of the key from 64 to 48 bits. Ultimately they compromised on a 56-bit key. [9]
======================

http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27s_involvement_in...

Conclusion? We wuz tricked! In their own words, they managed the entire process, and succeeded in convincing everyone that they did not. And they made it weaker where they held the advantage: budget to crunch. Last two sentences, above.
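A quick sketch of what "just parity bits" means for the keyspace. This is my own illustration, not from any of the sources; it assumes the standard DES convention that the low bit of each key byte is an odd-parity check bit, and the helper name is mine:

# Illustration only; set_odd_parity is my own toy helper.
def set_odd_parity(key: bytes) -> bytes:
    """Force each byte of an 8-byte DES key to odd parity (low bit is the check bit)."""
    out = bytearray()
    for b in key:
        data = b & 0xFE                      # top 7 bits carry the key material
        ones = bin(data).count("1")
        out.append(data | ((ones + 1) % 2))  # set low bit so the byte has odd parity
    return bytes(out)

print(set_odd_parity(bytes(range(8))).hex())  # a parity-adjusted 64-bit key
print(2 ** 56)             # the keyspace that actually varies
print(2 ** 64 // 2 ** 56)  # 64 -> 56 bits: brute force cheapens by a factor of 256
print(2 ** 56 // 2 ** 48)  # 56 -> 48 would have cheapened it by another 256

So the 48-versus-64 tug-of-war was a fight over two factors of 256 in crunch cost, and the compromise kept exactly one of them.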
> If history is a guide, weaknesses that TLAs insist on are transparent. They are about (effective) key size.
Indeed. Notice the subtlety of their attack: it is brutally simple. The rest of us focus on elegance and dismiss the simplistic. Cognitive dissonance? They focussed on mission, and exploited the asymmetry in crunch strength. Recall that, in the old days, no other country could muster the budget and technology that they could.
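To make the asymmetry concrete, a toy calculation. The two search rates are pure assumptions of mine, chosen only to show the shape of the gap, not to estimate anyone's actual capability:

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Both rates are invented for illustration; only the ratio between them matters.
rates = {
    "a big-budget agency (assumed 1e12 keys/s)": 1e12,
    "everyone else (assumed 1e6 keys/s)": 1e6,
}

for bits in (48, 56, 64, 80, 128):
    expected = 2 ** (bits - 1)  # on average, half the keyspace is searched
    for who, rate in rates.items():
        years = expected / rate / SECONDS_PER_YEAR
        print(f"{bits:3d}-bit key, {who}: ~{years:.2e} years")

At those made-up rates, 56 bits falls to the big budget in hours but to nobody else within a lifetime; at 128 bits neither column finishes. The point is the cliff between the columns, not the absolute numbers.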
> We have no way to know whether this will continue to be the case, but I'd imagine that the gap in knowledge between the NSA and the academic community diminishes over time; that makes me think they'd be even more reluctant to try to slip in a hidden weakness today than in 1975.
Possibly. In terms of algorithms, I don't think there has been a case where they've deliberately weakened the algorithm. OTOH, in terms of key strength, they have been very, very finessed.

Remember Skipjack? The comment at the time was that the key strength was beautifully aligned, right at the edge: 80-bit keys where the open community had already concluded 128 was the target. Which meant that if there was to be an advantage, all that was left was: budget in crunching.

iang
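PS. The margin conceded there, in raw arithmetic (my own back-of-envelope, nothing from the thread):

print(2 ** 80)          # Skipjack's keyspace: ~1.2e24
print(2 ** 128)         # the open community's target: ~3.4e38
print(2 ** (128 - 80))  # the margin given up: a factor of 2^48, ~2.8e14

Everything short of that 2^48 factor is invisible to analysis; it shows up only as somebody's crunch budget.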