cypherpunks
10 Sep '13
----- Forwarded message from "John S. Quarterman" <jsqnanog(a)quarterman.com> -----
Date: Fri, 06 Sep 2013 06:47:26 -0400
From: "John S. Quarterman" <jsqnanog(a)quarterman.com>
To: sam(a)circlenet.us, "John S. Quarterman" <jsq(a)quarterman.com>, nanog(a)nanog.org
Subject: Re: The US government has betrayed the Internet. We need to take it back
> On 2013-09-06 05:57, Roland Dobbins wrote:
> > There are no purely technical solutions to social ills. Schneier of
> > all people should know this.
Schneier does know this, and explicitly said this.
-jsq
http://www.theguardian.com/commentisfree/2013/sep/05/government-betrayed-in…
Three, we can influence governance. I have resisted saying this up to now,
and I am saddened to say it, but the US has proved to be an unethical
steward of the internet. The UK is no better. The NSA's actions are
legitimizing the internet abuses by China, Russia, Iran and others. We
need to figure out new means of internet governance, ones that make it
harder for powerful tech countries to monitor everything. For example,
we need to demand transparency, oversight, and accountability from our
governments and corporations.
Unfortunately, this is going to play directly into the hands of totalitarian
governments that want to control their country's internet for even more
extreme forms of surveillance. We need to figure out how to prevent that,
too. We need to avoid the mistakes of the International Telecommunications
Union, which has become a forum to legitimize bad government behavior,
and create truly international governance that can't be dominated or
abused by any one country.
Generations from now, when people look back on these early decades of
the internet, I hope they will not be disappointed in us. We can ensure
that they don't only if each of us makes this a priority, and engages in
the debate. We have a moral duty to do this, and we have no time to lose.
Dismantling the surveillance state won't be easy. Has any country that
engaged in mass surveillance of its own citizens voluntarily given up
that capability? Has any mass surveillance country avoided becoming
totalitarian? Whatever happens, we're going to be breaking new ground.
Again, the politics of this is a bigger task than the engineering, but
the engineering is critical. We need to demand that real technologists
be involved in any key government decision making on these issues. We've
had enough of lawyers and politicians not fully understanding technology;
we need technologists at the table when we build tech policy.
To the engineers, I say this: we built the internet, and some of us have
helped to subvert it. Now, those of us who love liberty have to fix it.
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
Thanks for the replies guys! I'm going through the links provided.
Meanwhile I got a follow up question...
Creating hardware RNGs for individual PCs, phones, or similar devices isn't really hard. We don't need to rely on a multibillion-dollar American corporation like Intel to produce some state-of-the-art circuitry. There are applications that need a fast stream of random numbers, but those are not the applications end users run on their devices for security purposes - did I get the general idea right?
J.
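On the "not really hard" point: even a simple, biased hardware noise source can be cleaned up in software. As a toy sketch (not a vetted RNG design), here is the classic von Neumann extractor, which turns independent-but-biased bits into unbiased output bits:

```python
def von_neumann(bits):
    """Von Neumann extractor: consume raw bits in pairs.
    (0, 1) emits 0, (1, 0) emits 1; (0, 0) and (1, 1) are discarded.
    Assumes the input bits are independent with a fixed (unknown) bias."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

# A biased stream still yields unbiased output bits (at a reduced rate):
print(von_neumann([0, 1, 1, 0, 0, 0, 1, 1]))  # [0, 1]
```

Note the trade-off: it throws away a large fraction of the raw stream, and the independence assumption is exactly what cheap physical sources may fail to satisfy.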
----- Forwarded message from Ben Laurie <ben(a)links.org> -----
Date: Mon, 9 Sep 2013 17:29:24 +0100
From: Ben Laurie <ben(a)links.org>
To: Cryptography Mailing List <cryptography(a)metzdowd.com>
Subject: [Cryptography] What TLS ciphersuites are still OK?
Perry asked me to summarise the status of TLS a while back ... luckily I
don't have to because someone else has:
http://tools.ietf.org/html/draft-sheffer-tls-bcp-00
In short, I agree with that draft. And the brief summary is: there's only
one ciphersuite left that's good, and unfortunately it's only available in
TLS 1.2:
TLS_DHE_RSA_WITH_AES_128_GCM_SHA256
_______________________________________________
The cryptography mailing list
cryptography(a)metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography
----- End forwarded message -----
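For readers wanting to act on the draft's advice: in OpenSSL-based stacks that suite is named DHE-RSA-AES128-GCM-SHA256. A minimal sketch (assuming Python's ssl module on an OpenSSL build that still ships the suite) of pinning a client context to it:

```python
import ssl

# Restrict a client context to TLS 1.2+ and the one recommended suite.
# "DHE-RSA-AES128-GCM-SHA256" is OpenSSL's name for
# TLS_DHE_RSA_WITH_AES_128_GCM_SHA256.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("DHE-RSA-AES128-GCM-SHA256")

print([c["name"] for c in ctx.get_ciphers()])
```

On modern builds `get_ciphers()` will also list the TLS 1.3 suites (which postdate this thread); `set_ciphers` only governs the TLS 1.2 list.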
----- Forwarded message from rob.golding(a)astutium.com -----
Date: Sun, 08 Sep 2013 04:56:17 +0100
From: rob.golding(a)astutium.com
To: bitcoin-development(a)lists.sourceforge.net
Subject: Re: [Bitcoin-development] Blockchain archival
User-Agent: Roundcube Webmail/0.8.5
> (there's no way to be completely trust-free without this).
Not quite true, as I said balance-at-point-in-time would solve that
(and make the storage requirements much lower)
>> If going that route, then solutions to the 'consolidate
>> addresses/wallets'
>> question and formal 'discard' of addresses could get addressed.
>
> Not sure what you mean here. Addresses and wallets are two completely
> different things. Addresses are single-use destinations that point to
> a wallet
> (which is itself private and unknown to the network).
For bitcoin to grow beyond an interesting experiment into global everyday
use, a number of things would have to happen, not least of which is
taking the 'average punter' into account. Whilst new ideas can filter into
the general consciousness over time, sometimes concepts have to go with
'what already works' :)
People's concept of money hasn't really changed in over 1,000 years -
it remains 'something of known value I can exchange for something else'.
No-one outside of bitcoin devs and early adopters really gets the
one-shot concept of addresses - possibly rightly so - keeping issues of
it lowering levels of anonymity etc out of the discussion - it doesn't
fit with the mindset people have. It's difficult enough getting
merchants to set up separate addresses for each client; one per
transaction is simply a waste (of addresses, storage, blockchain size,
number of inputs|outputs when spending etc)
I'm sure the wife would love a new handbag every time she gets some
money, but the real world just isn't like that ;)
Addresses are perceived as the equivalent of a jar you stick your coins
in. You can have lots of jars. Each jar can be for a specific reason or
whatever, but the analogy is there.
Wallets are like a box you keep some of your jars in. With the added
interesting concept that a jar can be in multiple boxes at the same
time. Only the person with the right 'key' can open the jar and take the
contents.
However, unlike the 3 money boxes I have behind me right now - from which
I can take a single penny out of one and put it into another - if I want
to move bitcoins from one address (jar) to another *of my own* I have
to pay a fee. Worse still, if the jar doesn't have much in it I'm denied
that ability.
End users will neither understand nor want to pay a fee for
dealing with their own coins.
If a jar breaks I can just tip the contents into a new one - unless I'm
very careless, the amount in the new one = the amount in the old one -
people will want/need it to work like that.
Similarly, if you do have all these addresses around, you may want (as
good housekeeping) to discard some of them (after moving the cash).
So having the ability to specify the address to send from is essential
(and a sadly missing feature of the QT client).
'intra-wallet' transfers with an 'also discard the sending address'
option would be a way of (once confirmed) stopping any further use of that
address (denied any further transactions by miners?) and, when
balance-at-point-in-time is implemented, a way of shrinking the storage
for all other bitcoin users (who choose not to have a full transaction
set).
If I send Luke 10, and Luke sends me back 3, I have 3, Luke has 7.
If Luke sends me 2, and I send Luke 1, I have 4 and Luke has 6.
To verify my ability to send Jeff 4, all that is needed is to know that
I have 4, not all the transactions that led to that state - that's how
it's done now, and that's not necessarily efficient as bitcoin grows.
If Luke sends me 4 more, I now have 4 again, Luke has 3.
If I send 1 to each of the children, they have 1 each (*4).
Having a 'family' wallet means when on holiday they can have that
rental of quad-bikes - to send the rental company 4, the client only
needs to know that those addresses now have 1 each in them, not all the
previous transactions - if they didn't exist at the point-in-time
balance, then yes, it would need to know about the luke>rob>kids
transactions, but that's all.
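The worked example above can be sketched in a few lines (a toy model, nothing like real Bitcoin code): a snapshot of balances at a fixed point replaces the history that produced it.

```python
def replay(initial, transfers):
    """Replay a transaction history onto initial balances."""
    bal = dict(initial)
    for src, dst, amt in transfers:
        bal[src] -= amt
        bal[dst] = bal.get(dst, 0) + amt
    return bal

# The rob/luke exchanges from the example above:
snapshot = replay({"rob": 10, "luke": 0}, [
    ("rob", "luke", 10), ("luke", "rob", 3),
    ("luke", "rob", 2), ("rob", "luke", 1),
])
print(snapshot)  # {'rob': 4, 'luke': 6}
# To verify rob can send jeff 4, a client needs only snapshot["rob"] >= 4,
# not the four transfers that produced it.
```

The snapshot is what "balance-at-point-in-time" would persist; only transactions after the chosen fixed point would still need replaying.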
I moved to a new netbook recently - it took 140 *hours* to download and
process the blockchain (yes, the wifi was that bad). I heard from one of
our clients that (although they only had the client running during
working hours) it was over 9 days before their desktop had caught up.
If all I was downloading were the transactions since the last difficulty
change (as one example of a fixed point), plus the remaining balance on
any not-discarded address as at that point, it would have been much, much
quicker, and not be shagging my shiny new hard drive.
There's more, but it's 4.45 in the morning and I can't think coherently
until after a few hours' kip and some good coffee :)
Rob
_______________________________________________
Bitcoin-development mailing list
Bitcoin-development(a)lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bitcoin-development
----- End forwarded message -----
Scott Aaronson: NSA: Possibly breaking US laws, but still bound by laws of computational complexity
by Eugen Leitl 09 Sep '13
http://www.scottaaronson.com/blog/?p=1517
NSA: Possibly breaking US laws, but still bound by laws of computational
complexity
Last week, I got an email from a journalist with the following inquiry. The
recent Snowden revelations, which made public for the first time the US
government’s “black budget,” contained the following enigmatic line from the
Director of National Intelligence: “We are investing in groundbreaking
cryptanalytic capabilities to defeat adversarial cryptography and exploit
internet traffic.” So, the journalist wanted to know, what could these
“groundbreaking” capabilities be? And in particular, was it possible that
the NSA was buying quantum computers from D-Wave, and using them to run
Shor’s algorithm to break the RSA cryptosystem?
I replied that, yes, that’s “possible,” but only in the same sense that it’s
“possible” that the NSA is using the Easter Bunny for the same purpose. (For
one thing, D-Wave themselves have said repeatedly that they have no interest
in Shor’s algorithm or factoring. Admittedly, I guess that’s what D-Wave
would say, were they making deals with NSA on the sly! But it’s also what
the Easter Bunny would say.) More generally, I said that if the open
scientific world’s understanding is anywhere close to correct, then quantum
computing might someday become a practical threat to cryptographic security,
but it isn’t one yet.
That, of course, raised the extremely interesting question of what
“groundbreaking capabilities” the Director of National Intelligence was
referring to. I said my personal guess was that, with ~99% probability, he
meant various implementation vulnerabilities and side-channel attacks—the
sort of thing that we know has compromised deployed cryptosystems many times
in the past, but where it’s very easy to believe that the NSA is ahead of the
open world. With ~1% probability, I guessed, the NSA made some sort of big
improvement in classical algorithms for factoring, discrete log, or other
number-theoretic problems. (I would’ve guessed even less than 1% probability
for the latter, before the recent breakthrough by Joux solving discrete log
in fields of small characteristic in quasipolynomial time.)
Then, on Thursday, a big New York Times article appeared, based on 50,000 or
so documents that Snowden leaked to the Guardian and that still aren’t
public. (See also an important Guardian piece by security expert Bruce
Schneier, and accompanying Q&A.) While a lot remains vague, there might be
more public information right now about current NSA cryptanalytic
capabilities than there’s ever been.
So, how did my uninformed, armchair guesses fare? It’s only halfway into the
NYT article that we start getting some hints:
The files show that the agency is still stymied by some encryption, as Mr.
Snowden suggested in a question-and-answer session on The Guardian’s Web site
in June.
“Properly implemented strong crypto systems are one of the few things that
you can rely on,” he said, though cautioning that the N.S.A. often bypasses
the encryption altogether by targeting the computers at one end or the other
and grabbing text before it is encrypted or after it is decrypted…
Because strong encryption can be so effective, classified N.S.A. documents
make clear, the agency’s success depends on working with Internet companies —
by getting their voluntary collaboration, forcing their cooperation with
court orders or surreptitiously stealing their encryption keys or altering
their software or hardware…
Simultaneously, the N.S.A. has been deliberately weakening the international
encryption standards adopted by developers. One goal in the agency’s 2013
budget request was to “influence policies, standards and specifications for
commercial public key technologies,” the most common encryption method.
Cryptographers have long suspected that the agency planted vulnerabilities in
a standard adopted in 2006 by the National Institute of Standards and
Technology and later by the International Organization for Standardization,
which has 163 countries as members.
Classified N.S.A. memos appear to confirm that the fatal weakness, discovered
by two Microsoft cryptographers in 2007, was engineered by the agency. The
N.S.A. wrote the standard and aggressively pushed it on the international
group, privately calling the effort “a challenge in finesse.”
So, in pointing to implementation vulnerabilities as the most likely
possibility for an NSA “breakthrough,” I might have actually erred a bit too
far on the side of technological interestingness. It seems that a large part
of what the NSA has been doing has simply been strong-arming Internet
companies and standards bodies into giving it backdoors. To put it bluntly:
sure, if it wants to, the NSA can probably read your email. But that isn’t
mathematical cryptography’s fault—any more than it would be mathematical
crypto’s fault if goons broke into your house and carted away your laptop.
On the contrary, properly-implemented, backdoor-less strong crypto is
something that apparently scares the NSA enough that they go to some lengths
to keep it from being widely used.
I should add that, regardless of how NSA collects all the private information
it does—by “beating crypto in a fair fight” (!) or, more likely, by
exploiting backdoors that it itself installed—the mere fact that it collects
so much is of course unsettling enough from a civil-liberties perspective.
So I’m glad that the Snowden revelations have sparked a public debate in the
US about how much surveillance we as a society want (i.e., “the balance
between preventing 9/11 and preventing Orwell”), what safeguards are in place
to prevent abuses, and whether those safeguards actually work. Such a public
debate is essential if we’re serious about calling ourselves a democracy.
At the same time, to me, perhaps the most shocking feature of the Snowden
revelations is just how unshocking they’ve been. So far, I haven’t seen
anything that shows the extent of NSA’s surveillance to be greater than what
I would’ve considered plausible a priori. Indeed, the following could serve
as a one-sentence summary of what we’ve learned from Snowden:
Yes, the NSA is, in fact, doing the questionable things that anyone not
living in a cave had long assumed they were doing—that assumption being so
ingrained in nerd culture that countless jokes are based around it.
(Come to think of it, people living in caves might have been even more
certain that the NSA was doing those things. Maybe that’s why they moved to
caves.)
So, rather than dwelling on civil liberties, national security, yadda yadda
yadda, let me move on to discuss the implications of the Snowden revelations
for something that really matters: a 6-year-old storm in theoretical computer
science’s academic teacup. As many readers of this blog might know, Neal
Koblitz—a respected mathematician and pioneer of elliptic curve cryptography,
who (from numerous allusions in his writings) appears to have some
connections at the NSA—published a series of scathing articles, in the
Notices of the American Mathematical Society and elsewhere, attacking the
theoretical computer science approach to cryptography. Koblitz’s criticisms
were varied and entertainingly-expressed: the computer scientists are too
sloppy, deadline-driven, self-promoting, and corporate-influenced; overly
trusting of so-called “security proofs” (a term they shouldn’t even use,
given how many errors and exaggerated claims they make); absurdly overreliant
on asymptotic analysis; “bodacious” in introducing dubious new hardness
assumptions that they then declare to be “standard”; and woefully out of
touch with cryptographic realities. Koblitz seemed to suggest that, rather
than demanding the security reductions so beloved by theoretical computer
scientists, people would do better to rest the security of their
cryptosystems on two alternative pillars: first, standards set by
organizations like the NSA with actual real-world experience; and second, the
judgments of mathematicians with … taste and experience, who can just see
what’s likely to be vulnerable and what isn’t.
Back in 2007, my mathematician friend Greg Kuperberg pointed out the irony to
me: here we had a mathematician, lambasting computer scientists for trying to
do for cryptography what mathematics itself has sought to do for everything
since Euclid! That is, when you see an unruly mess of insights, related to
each other in some tangled way, systematize and organize it. Turn the tangle
into a hierarchical tree (or dag). Isolate the minimal assumptions (one-way
functions? decisional Diffie-Hellman?) on which each conclusion can be
based, and spell out all the logical steps needed to get from here to
there—even if the steps seem obvious or boring. Any time anyone has tried to
do that, it’s been easy for the natives of the unruly wilderness to laugh at
the systematizing newcomers: the latter often know the terrain less well, and
take ten times as long to reach conclusions that are ten times less
interesting. And yet, in case after case, the clarity and rigor of the
systematizing approach has eventually won out. So it seems weird for a
mathematician, of all people, to bet against the systematizing approach when
applied to cryptography.
The reason I’m dredging up this old dispute now, is that I think the recent
NSA revelations might put it in a slightly new light. In his article—whose
main purpose is to offer practical advice on how to safeguard one’s
communications against eavesdropping by NSA or others—Bruce Schneier offers
the following tip:
Prefer conventional discrete-log-based systems over elliptic-curve systems;
the latter have constants that the NSA influences when they can.
Here Schneier is pointing out a specific issue with ECC, which would be
solved if we could “merely” ensure that NSA or other interested parties
weren’t providing input into which elliptic curves to use. But I think
there’s also a broader issue: that, in cryptography, it’s unwise to trust any
standard because of the prestige, real-world experience, mathematical good
taste, or whatever else of the people or organizations proposing it. What
was long a plausible conjecture—that the NSA covertly influences
cryptographic standards to give itself backdoors, and that
otherwise-inexplicable vulnerabilities in deployed cryptosystems are
sometimes there because the NSA wanted them there—now looks close to an
established fact. In cryptography, then, it’s not just for idle academic
reasons that you’d like a publicly-available trail of research papers and
source code, open to criticism and improvement by anyone, that takes you all
the way from the presumed hardness of an underlying mathematical problem to
the security of your system under whichever class of attacks is relevant to
you.
Schneier’s final piece of advice is this: “Trust the math. Encryption is
your friend.”
“Trust the math.” On that note, here’s a slightly-embarrassing confession.
When I’m watching a suspense movie (or a TV show like Homeland), and I reach
one of those nail-biting scenes where the protagonist discovers that
everything she ever believed is a lie, I sometimes mentally recite the proof
of the Karp-Lipton Theorem. It always calms me down. Even if the entire
universe turned out to be a cruel illusion, it would still be the case that
NP ⊂ P/poly would collapse the polynomial hierarchy, and I can tell you
exactly why. It would likewise be the case that you couldn’t break the GGM
pseudorandom function without also breaking the underlying pseudorandom
generator on which it’s based. Math could be defined as that which can still
be trusted, even when you can’t trust anything else.
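The GGM result mentioned above is constructive enough to sketch. Below is a toy version (SHA-256 stands in for a length-doubling PRG purely for illustration; it carries no proof of pseudorandomness): the PRF evaluates one path through a binary tree, keeping the left or right half of the PRG output per input bit.

```python
import hashlib

def prg(seed: bytes) -> bytes:
    """Toy length-doubling PRG: 32 bytes in, 64 bytes out.
    (SHA-256 is an illustrative stand-in, not a proven PRG.)"""
    return (hashlib.sha256(seed + b"0").digest() +
            hashlib.sha256(seed + b"1").digest())

def ggm_prf(key: bytes, x: str) -> bytes:
    """GGM PRF: walk the PRG tree, taking the left half of the output
    for each '0' bit of the input x and the right half for each '1'."""
    s = key
    for bit in x:
        out = prg(s)
        s = out[:32] if bit == "0" else out[32:]
    return s
```

The security reduction is a hybrid argument over the tree levels: any distinguisher against the PRF yields a distinguisher against the PRG, which is exactly the kind of "trust the math" statement the paragraph above appeals to.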
This entry was posted on Sunday, September 8th, 2013 at 11:31 am and
is filed under Complexity, Nerd Interest. You can follow any responses to
this entry through the RSS 2.0 feed. You can leave a response, or trackback
from your own site.
24 Responses to “NSA: Possibly breaking US laws, but still bound by laws of
computational complexity”
Aaronson on crypto. Schneier “elliptic-curve systems; the latter have
constants that the NSA influences when they can.” | Gordon's shares Says:
Comment #1 September 8th, 2013 at 1:22 pm
[…] Link. Trust math, but not NSA mathematicians. […]
Douglas Knight Says:
Comment #2 September 8th, 2013 at 1:35 pm
Could you be more specific about what you mean by the hypothetical “big
improvement” on number theory algorithms that is covered by your 1%?
Do elliptic curve algorithms count? Does an L(1/4) algorithm count, or only
quasi-polynomial? What if they can’t break all instances, but, as has
repeatedly happened, they discovered bad primes or bad exponents that make
particular keys weak? Breaking a random half of all keys is almost as good as
breaking all of them. Schneier’s condemnation of ECC seems to require more
than 1% chance NSA knows something special about ECC.
PS – David Jao, commenting on Schneier’s blog, says that we can and do use
cryptography to prevent NSA from meddling with mystery constants. He says
that the ECC standard curves are generated by SHA-1, so to meddle, NSA would
have to break the hash function. (But if half of curves are bad, that’s easy.)
Anonymous Says:
Comment #3 September 8th, 2013 at 1:45 pm
You are making good and interesting points. However, Koblitz also has some
valid criticisms of TCS even if his conclusions are not valid. The
mathematical models we build in TCS are useless if they don’t relate to
practice, and we know many of our standard models are not good enough
approximations of reality, and arguably there isn’t enough effort to deal
with these issues. Technical heavy lifting is used as the ultimate
criterion for judging the value of research projects inside the community.
Also, I think you are exaggerating what most cryptographers expected the NSA
was doing. I have heard several famous crypto experts quite surprised by
these revelations, and it has shaken their trust in government
institutions. I never understood why some people presume that government is
a benevolent entity; such belief in government institutions seems like
ideology to me.
Daniel Armak Says:
Comment #4 September 8th, 2013 at 2:06 pm
You can trust the math itself, and so can Bruce Schneier and a few tens of
thousands of other people. But everyone else who can’t grok the entire
mathematical arguments for each cryptographical system, or doesn’t want to
spend a long time studying it, must trust the word of people like you. And
since the NSA can and does subvert people like you, who do original work and
analyze others’ work and sit on standards committees, not to mention the
programmers who implement it in code, what are we to do?
Daniel W. Says:
Comment #5 September 8th, 2013 at 2:33 pm
In my mind, the best circumstantial evidence that the NSA has not practically
broken any of the major cryptosystems is the following: if they had, they
would most likely keep this as a highly guarded secret to be used only
against high value targets rather than as a means of monitoring potential
terrorists. It would most likely be contained within a small circle and not
mentioned in power-point presentations to low-level analysts.
Of course, the above argument may be flawed in assuming the NSA has too
high a level of competence.
T H Ray Says:
Comment #6 September 8th, 2013 at 2:43 pm
Scott,
” … the clarity and rigor of the systematizing approach has eventually won
out.”
No doubt. In Euclid’s time as well as the present, though, it is helpful to
have something to systematize. Making that assumption available and
convenient is what mathematicians do.
Scott Says:
Comment #7 September 8th, 2013 at 3:02 pm
Daniel Armak #4:
You can trust the math itself, and so can Bruce Schneier and a few tens of
thousands of other people. But everyone else … must trust the word of people
like you.

You raise an excellent point, which I think applies even more
broadly than you say. For one thing, I merely understand some of the general
ideas: I haven’t gone through every detail of the math used by the crypto in
my web browser, and I dare say that most professional cryptographers haven’t
either.
For another, the point is much broader than cryptography: how can you trust
quantum mechanics, if you haven’t done the requisite experiments yourself?
The physicists could’ve all been bought off by some anti-realist cabal. :-)
Or how can you trust that the government isn’t putting mind-control drugs
into the fruit you buy in the supermarket, etc. etc.
So we’re extremely lucky that science hit on a solution to these problems—the
only workable solution, really—back in the 17th century. The solution is to
open up every question to scrutiny, discussion, and challenge by any
interested person. Assertions gain credibility by surviving public
criticism—and that’s just as true in math as it is in experimental sciences.
I believe many theorems even though I haven’t checked the proofs myself,
because I know that if there were an error, then someone else could’ve made a
name for themselves by finding it.
Now, for this Popperian dynamic to work, the whole process has to be carried
out in the open: if I thought someone who found a fatal flaw in a proof would
only tell their friends, then that doesn’t do me any good. That’s why the
dividing line between “crypto as black art” and “modern crypto” happened
precisely when new discoveries started being published in the open
literature, rather than being filed in a drawer at NSA or GCHQ.
wolfgang Says:
Comment #8 September 8th, 2013 at 3:20 pm
Unfortunately, xkcd.com/538/ had it right, imho.
Scott Says:
Comment #9 September 8th, 2013 at 3:20 pm
Daniel W. #5: If the NSA had really broken strong cryptosystems, then why
would they have resorted to so many covert tactics (or, in the case of the
Clipper Chip, overt attempts) to prevent people from using strong crypto,
unless NSA has a backdoor? I suppose it’s all elaborate psychological
warfare, to prevent us from discovering the fact that these cryptosystems
were broken? And that even Snowden himself is part of the NSA’s master plan?
:-)
At least in my book, every time you claim that what looks on its face like
evidence for X, is really evidence for a powerful cabal trying to prevent
everyone from discovering not(X), the plausibility of your theory gets cut by
a factor of maybe 50,000. This is directly related to the fact that I don’t
believe any conspiracy theories—as in zero, not one.
Scott Says:
Comment #10 September 8th, 2013 at 3:32 pm
Douglas Knight #2: Sure, dramatic improvements in elliptic-curve algorithms
would certainly count—as would “merely” subexponential algorithms, were the
improvements large enough to threaten key sizes that the academic
cryptographers considered safe.
More broadly, though, you’re entirely right that there’s not a sharp line
between “improved number-theory algorithms” and “implementation
vulnerabilities.” Often, what’s happened in practice is that an
implementation vulnerability has opened the way for an attack that still
requires interesting and nontrivial number theory. But I suppose that sort of
thing would still belong to the “99%” part of my probability estimate. In the
“1%” part, I really had in mind “something that would give theoretical
cryptographers a heart attack” (like, I dunno, factoring in L(1/10), or
elliptic curve discrete log in quasipolynomial time).
Scott Says:
Comment #11 September 8th, 2013 at 5:03 pm
Anonymous #3:
You are making good and interesting points. However, Koblitz also has some
valid criticisms of TCS even if his conclusions are not valid.

I completely agree that Koblitz has some valid criticisms.
However, I’ve read pretty much all of his and Menezes’s anti-TCS screeds, and
to me what he’s doing seems, if you like, too easy to be helpful. Koblitz’s
favorite M.O. is to recount various slip-ups by people in the “Goldreich
school of crypto” and laugh at them: “haha, they talk about ‘provable
security,’ but there was a bug in their proof! or their security definition
left out an important class of side-channel attacks!” Then, with even more
glee, Koblitz relates how the hapless computer scientists put out a new paper
supposedly fixing the problem, but that paper had its own problems, and so
on.
The trouble is, that is indeed what a bunch of incompetent buffoons would
look like, but it’s also what science looks like! :-) Koblitz never seems to
want to acknowledge that the end result of the process is better scientific
understanding and more secure cryptosystems than before (even if still not
perfect).
Also, of course, Koblitz almost defiantly refuses to suggest any better
mathematical foundations for cryptography to replace the reduction-based
foundations that were built up over the last 30 years. I.e., it’s not that
instead of adaptive chosen ciphertext attack, he has a better definition to
propose, or that instead of “bodacious” new hardness assumptions, he can give
a single assumption that suffices for everything. Instead, what he appears to
want is simply a return to the “black art” era of cryptography, when security
arguments boiled down to “we tried to break it and failed” or “trust us, we
have better mathematical taste than you.”
The trouble is, I can’t think of a single case in the history of science when
mathematical foundations as well-developed as cryptography’s now are, were
simply abandoned wholesale without better mathematical foundations to replace
them. So intellectually, Koblitz strikes me as someone who’s throwing spears
at battle-tanks. Being the excellent marksman that he is, he actually scores
some hits—but the reduction-encrusted battle-tanks are still going to win in
the end.
“The mathematical models we built in TCS are useless if they don’t relate to
practice, and we know many of our standard models are not good enough
approximations of reality; arguably there isn’t enough effort to deal with
these issues.”
Would one also say that the mathematical foundations of
topology—open sets, Urysohn’s Lemma, etc.—are useless if they don’t relate to
the practice of tying and untying knots? I think that’s a pretty close
analogy for the relationship between what, say, Goldreich or Goldwasser or
Micali do, and the actual practice of cryptography. In both cases, yes,
there’s some relation between the intellectual foundations on the bottom and
the beautiful ornaments on top, but not surprisingly there are many floors in
between. Starting from a one-way function, for example, you first have to
construct a quasi-regular one-way function, then a pseudoentropy generator,
then a pseudorandom generator, then a pseudorandom function, and then maybe
you can start to think about building (say) a rudimentary private-key
cryptosystem or signature scheme.
“Also I think you are exaggerating what most cryptographers expected the NSA
was doing. I have heard several famous crypto experts quite surprised by
these revelations, and it has shaken their trust in government institutions.
I never understood why some people presume that government is a benevolent
entity; such beliefs in government institutions seem like ideology to me.”
My situation is different: I never had any real doubt that
NSA was doing such things; the thing I genuinely don’t know is whether they
have good reasons to be doing them. I consider it conceivable that the NSA
has indeed stopped many terrorist attacks or other international disasters
that we never hear about—in which case, the strongest case in their favor
might be stronger than the strongest case that can ever be made publicly. The
fact that President Obama, who’s so reasonable on so many issues, has implied
as much is evidence for that view from my perspective. On the other hand, I
also consider it conceivable that the current eavesdropping regime is purely
a result of the universal tendency of bureaucracies to expand, justify
themselves, and zealously guard their power and privileges. Or it could be
some combination of the two.
For me, though, the deciding consideration is that, even in a fantasy world
where the NSA’s actions had always been 100% justified, I’d still want them
to be more accountable to the public than they are now. “Trust that we have
our reasons, even though we can’t tell you what they are” simply doesn’t work
over the long term in a democracy, even if the trust is justified at any
particular time or in any particular case (and of course, often it hasn’t
been).
Anonymous Says:
Comment #12 September 8th, 2013 at 8:05 pm
I agree with you that his attitude is not constructive criticism. I would
even go further than you and say it would be stupid to forget the science of
crypto and go back to treating it as a purely engineering art.
Regarding the reasonableness of what the NSA does, the NSA and its backers
would of course claim these tools are useful. To be honest, security was a
weak point of Obama’s campaign; he is not really knowledgeable about these
issues, and he has not gone and will not go against his advisers if they tell
him these tools are necessary to fight terrorism. However, as far as I have
heard, they have had a hard time convincing anyone outside the executive
branch that these tools have been as useful as they claim. How many major
terrorist plots have been uncovered and prevented using these tools? It seems
that they are using these tools for a very wide range of activities,
including industrial and political espionage on foreign governments and
companies, to gain political and commercial advantage (what they call US
national interests, not just securing Americans against terrorists). Does
anyone really believe that the EU, Brazil, or liberal NGOs will launch a
terrorist attack on the US? The FBI’s actions against Dr. King show how far
they would go. They use the fear of possible terrorist attacks to justify
these actions to the public; meanwhile the laws allow them to do whatever
they want, and when there are restrictions (like the Fourth Amendment) they
find ways to circumvent them (e.g., by colluding with foreign intelligence
services like GCHQ to spy on American citizens) or to change the
interpretations of those laws. We are very lucky that many influential
Americans in previous generations had a negative view of the federal
government and wanted to restrict its powers as much as possible,
restrictions which are now being removed in practice (partly because some
people want to settle the country’s sociopolitical disputes using the
government’s power). I don’t see why so much power should be vested in a
single authority with almost no real public supervision and scrutiny (a role
the media played to some extent in previous decades, but which is coming
under heavy pressure from government, as the Manning, Swartz, Snowden, …
cases demonstrate). And even when courts find that someone in the government
has seriously violated the law, the president forgives them and they avoid
real punishment (as the Scooter Libby case demonstrates).
It is not just the US government; there is a trend across western liberal
democracies. It is simply unbelievable that UK security forces used a law
passed to fight terrorism to hold the partner of a Guardian journalist for 9
hours without a lawyer and without the protection of Miranda rights against
self-incrimination. Anyone who thinks that security forces will use the
authority and tools they obtain only to the limited extent of the original
goal suffers from extreme naivety. They will use any tools at their disposal
to the fullest extent they can to achieve what they perceive to be the goals
of their institution. When they perceive journalists like Greenwald as a
threat to the national interest, they use these tools to fight them, which
includes intimidating a journalist’s partner using terrorism-fighting powers.
I still find it really hard to believe that we have gone so far in the
direction of an Orwellian society.
What can theoretical computer science offer biology? | Theory, Evolution, and
Games Group Says:
Comment #13 September 9th, 2013 at 2:16 am
[…] the aid that cstheory can offer to biological understanding. In
yesterday’s post on the NSA and computational complexity, Aaronson — with
attribution to mathematician Greg Kuperberg — provided the following […]
Paul Beame Says:
Comment #14 September 9th, 2013 at 2:45 am
Some of the NSA revelations have been no surprise at all. It was well known
in the 1980s, particularly after the publication of The Puzzle Palace, that
the NSA was tapping all the trans-Atlantic telephone cables; gathering up of
all e-mail to foreign addresses seems like more of the same.
The relationship of the NSA with TCS cryptographers has been pretty shaky. I
recall attending a theory of cryptography workshop at MIT’s Endicott House in
June 1985 with one or two official NSA attendees. At the time, there were one
or two TCS attendees known to have NSA funding and the NSA people wanted to
recruit more. In announcing their desire to sponsor more TCS cryptographers,
one of the NSA people cast a pall over the meeting by saying: “If you are
interested, just mention it in a phone conversation with one of your friends
and we’ll get back to you.” This didn’t exactly endear them to anyone.
J Says:
Comment #15 September 9th, 2013 at 2:51 am
“Math could be defined as that which can still be trusted, even when you
can’t trust anything else”
Wait till someone shows multiplication and addition have the same complexity,
or Voevodsky’s/Nelson’s worst nightmare possibly comes true.
Refer:
http://mathoverflow.net/questions/40920/what-if-current-foundations-of-math…
http://mathoverflow.net/questions/36693/nelsons-program-to-show-inconsisten…
Scott Says:
Comment #16 September 9th, 2013 at 4:20 am
J #15: Multiplication and addition having the same complexity (and yes, it’s
conceivable that there’s a linear-time multiplication algorithm) wouldn’t do
anything whatsoever to undermine my trust in math—why would it?
Also, even if ZF set theory were shown to be inconsistent (and it won’t be
:-) ), that wouldn’t do anything whatsoever to undermine my trust in theorems
about (say) finite groups, or low-dimensional topology, or theoretical
computer science—in fact, about anything that doesn’t involve transfinite
sets. It would “merely” tell me that there was a need (and, of course, an
exciting opportunity) to rethink the foundations. That’s something that
already happened 100+ years ago (the renovations causing virtually no damage
to the higher floors), and that could conceivably happen again.
Vitruvius Says:
Comment #17 September 9th, 2013 at 4:58 am
I agree, Scott, with your general position that any time one claims that
“evidence for x is really evidence for a powerful cabal trying to prevent
everyone from discovering not(x)” one’s credibility drops by an irrecoverably
large factor, and I agree with you that “math can be defined as that which
can still be trusted, even when you can’t trust anything else” (as you put
it), yet that still begs the question of how we the people decide what to
trust to be valid math.
Similarly, while your suggestion to “open up every question to scrutiny,
discussion, and challenge by any interested person” may be necessary in order
to establish public trust, it isn’t sufficient because we still have the
problem of deciding which such interested persons to trust, and which to
write off as conspiracy theorists in their own right. How do we feasibly
decide, in effect, whether Ehrenhaft is a crackpot (as it were), and whether
“Snowden himself is part of the NSA’s master plan” (as you playfully alluded
to)?
To that end you may be interested in Why Doesn’t the Public Trust
Scientists?, a lecture by The Right Honourable Professor The Baroness O’Neill
of Bengarve, Emeritus Professor of Philosophy at the University of Cambridge
and past Principal of Newnham College, Cambridge, which she presented in 2005
as part of the Science Futures series by the San Diego Science and Technology
Council’s Center for Ethics in Science and Technology.
Note that while “scientists” are the titular and exemplary referent matter in
that lecture, Baroness O’Neill’s talk actually considers a range of questions
in regard of public trust, including the roles of professional organizations,
trustworthiness (which can’t replace trust because of the quis custodiet
ipsos custodes problem), statutory regulation, post hoc accountability, &c,
which apply more broadly to the matters of public trust in any and every
profession and institution, including politics and the law.
O’Neill argues, if I may be so bold as to suggest a précis, that going back
through the 17th century (as you noted) western liberal democracies have
indeed evolved a multipartite methodology that does tend to work in practice
and that may well be the best we can get in principle, though it remains unclear
to me how well we are applying those techniques to matters of state security
in general, and how effectively you folks in the United States of America are
applying those techniques to your vaunted Agency in particular.
Scott Says:
Comment #18 September 9th, 2013 at 5:01 am
Paul Beame #14: I’ve actually heard that joke many times, in other variants.
(“Interested in career opportunities at the NSA? Call your mom and let her
know!”) I didn’t know that NSA people themselves used the joke at
conferences, but it doesn’t surprise me at all.
J Says:
Comment #19 September 9th, 2013 at 6:39 am
“Multiplication and addition having the same complexity (and yes, it’s
conceivable that there’s a linear-time multiplication algorithm) wouldn’t do
anything whatsoever to undermine my trust in math—why would it?”
I thought I read somewhere that if addition and multiplication turn out to be
similar in complexity, then it would imply something is wrong with
mathematics.
In the same vein, think of the generalization of scheme theory that Mochizuki
claims to have undertaken, to take apart + and × in the ring structure.
I would think something fundamental would have changed in our picture if they
turn out to be similar in complexity.
J Says:
Comment #20 September 9th, 2013 at 6:47 am
At least for computational purposes, the multiplicative group structure and
additive group structure of $\Bbb Z$ seem to be coinciding. This seems wrong.
I cannot directly relate this to $\Bbb Z \bmod p$, but it seems to have
implications for Discrete Log. An implication may not be beyond reach for at
least a few other rings as well.
Scott Says:
Comment #21 September 9th, 2013 at 7:02 am
J #19: Well, we already have a remarkable O(n log n log log n) multiplication
algorithm (due to Fürer, and building on many previous works), and it hasn’t
created any problem for the foundations of mathematics that I know about.
Meanwhile, just like for most problems, we currently have no lower bound for
multiplication better than the trivial Ω(n). I suppose I’d guess that
Ω(n log n) is some sort of barrier, but not with any strength of conviction:
if a linear-time algorithm were discovered, it certainly wouldn’t cause me to
doubt the consistency of ZF set theory. :-)
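For context, the jump from the schoolbook O(n²) method to subquadratic multiplication is already visible in Karatsuba’s algorithm, which replaces one of the four half-size products with a few additions. A minimal sketch (this is not Fürer’s algorithm, just the first step down that road):

```python
def karatsuba(x: int, y: int) -> int:
    # Base case: fall back to direct multiplication for small inputs.
    if x < 10 or y < 10:
        return x * y
    # Split both numbers at half the larger bit length.
    half = max(x.bit_length(), y.bit_length()) // 2
    mask = (1 << half) - 1
    x_hi, x_lo = x >> half, x & mask
    y_hi, y_lo = y >> half, y & mask
    a = karatsuba(x_hi, y_hi)                         # high product
    b = karatsuba(x_lo, y_lo)                         # low product
    c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b   # cross terms, 1 mult
    return (a << (2 * half)) + (c << half) + b
```

Three recursive multiplications instead of four gives O(n^log2(3)) ≈ O(n^1.585); Toom–Cook, Schönhage–Strassen, and Fürer push the exponent down further by the same divide-and-recombine spirit.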
Scott Says:
Comment #22 September 9th, 2013 at 7:16 am
Vitruvius #17:
it remains unclear to me … how effectively you folks in the United States of
America are applying those techniques to your vaunted Agency in particular.
As long as we’re trading mild national barbs, you’re Canadian? You guys do
have the Communications Security Establishment, which according to the NYT
article is one of only four foreign agencies (along with Britain’s,
Australia’s, and New Zealand’s) that “knows the full extent” of the NSA’s
decoding capabilities and is cleared for its “Bullrun” program. Though I
confess that, when I try to imagine Canada’s CSE, I come up with something
like the following:
Read this gentleman’s private email? Ooo, nooo, that doesn’t sound terribly
polite, eh?
J Says:
Comment #23 September 9th, 2013 at 7:21 am
Professor, I am well aware of all the $n^{1+\epsilon}$ algorithms and
Schönhage’s $O(n)$ algorithm on multitape machines. I cannot find the
reference I am thinking of. It was written by a TCS theorist. I would
seriously think that the standard ring structure on $\Bbb Z$ could be modeled
differently. I do not know if ZF would be affected. However, the question of
treating × and + differently for computational purposes, compared to
mathematical purposes, arises, making things murky.
I am not implicating ZF with $O(n)$ algorithms for the standard × operation
on the standard structure of $\Bbb Z$. The ZFC comment was a second piece of
mathematical conundrum that some reputed folks have raised awareness about,
as needing to be better grounded, and it rang well with your statement on
truth in math as we know it. (Unrelated, but worth bringing in: $\Bbb Z$ has
been a puzzle before as well; it is the simplest ring with a spectrum of
prime ideals whose dimension is unclear how to interpret in a standard way.)
Scott Says:
Comment #24 September 9th, 2013 at 7:23 am
Wolfgang #8:
Unfortunately, this xkcd.com/538/ had it right imho.
YES! I especially liked the mouseover text (“Actual actual reality: nobody
cares about his secrets”).
Re: [cryptography] New NSA Slides and Details Released last night via Fantastico (BR)
by Eugen Leitl 09 Sep '13
----- Forwarded message from David D <david(a)7tele.com> -----
Date: Mon, 9 Sep 2013 12:56:17 +0200
From: David D <david(a)7tele.com>
To: 'Crypto discussion list' <cryptography(a)randombit.net>
Subject: Re: [cryptography] New NSA Slides and Details Released last night via Fantastico (BR)
X-Mailer: Microsoft Office Outlook 12.0
http://g1.globo.com/fantastico/noticia/2013/09/nsa-documents-show-united-states-spied-brazilian-oil-giant.html
No millisecond counter:
1:49 US-983 Stormbrew - Fiber connections
1:49 US-983 Stormbrew - "KEY CORPORATE PARTNER WITH ACCESS TO INTERNATIONAL
CABLES, ROUTERS, AND SWITCHES". (# traceroute google.com)
2:07 - "QUERY BY CERTIFICATE META DATA"
2:07 - "Private keys of Diginotar stolen by hacker" FLYING PIG ...
Launch a MITM attack.
2:08 - mail.ru and server IP: 94.100.104.14
This site has broken out some of the screenshots from the video:
http://leaksource.wordpress.com/2013/09/09/economic-espionage-nsa-spies-on-brazil-oil-giant-petrobras/
"How the attack was done:" image is most interesting.
http://leaksource.files.wordpress.com/2013/09/nsa-brazil-5.png
Based on this slide, it appears that the bandwidth providers are dumping the
traffic at core routers directly to the NSA.
-----Original Message-----
From: cryptography [mailto:cryptography-bounces@randombit.net] On Behalf Of
David D
Sent: Monday, September 09, 2013 12:07 PM
To: 'Crypto discussion list'
Subject: Re: [cryptography] New NSA Slides and Details Released last night
via Fantastico (BR)
Lots of gems in this video:
http://g1.globo.com/fantastico/noticia/2013/09/nsa-documents-show-united-states-spied-brazilian-oil-giant.html
_______________________________________________
cryptography mailing list
cryptography(a)randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
On Mon, Sep 09, 2013 at 12:50:49PM +0200, phryk wrote:
http://cryptome.org/2013/09/nsa-cowboy.htm
9 September 2013
The Cowboy of the NSA Keith Alexander
--------------------------------------------------------------------------------
http://www.foreignpolicy.com/articles/2013/09/08/the_cowboy_of_the_nsa_keit…
Foreign Policy Magazine
The Cowboy of the NSA
Inside Gen. Keith Alexander's all-out, barely-legal drive to build the
ultimate spy machine.
BY SHANE HARRIS | SEPTEMBER 9, 2013
Shane Harris is a senior writer for Foreign Policy and author of The
Watchers: The Rise of America's Surveillance State.
On Aug. 1, 2005, Lt. Gen. Keith Alexander reported for duty as the 16th
director of the National Security Agency, the United States' largest
intelligence organization. He seemed perfect for the job. Alexander was a
decorated Army intelligence officer and a West Point graduate with master's
degrees in systems technology and physics. He had run intelligence operations
in combat and had held successive senior-level positions, most recently as
the director of an Army intelligence organization and then as the service's
overall chief of intelligence. He was both a soldier and a spy, and he had
the heart of a tech geek. Many of his peers thought Alexander would make a
perfect NSA director. But one prominent person thought otherwise: the prior
occupant of that office.
Air Force Gen. Michael Hayden had been running the NSA since 1999, through
the 9/11 terrorist attacks and into a new era that found the global
eavesdropping agency increasingly focused on Americans' communications inside
the United States. At times, Hayden had found himself swimming in the
murkiest depths of the law, overseeing programs that other senior officials
in government thought violated the Constitution. Now Hayden of all people was
worried that Alexander didn't understand the legal sensitivities of that new
mission.
"Alexander tended to be a bit of a cowboy: 'Let's not worry about the law.
Let's just figure out how to get the job done,'" says a former intelligence
official who has worked with both men. "That caused General Hayden some
heartburn."
The heartburn first flared up not long after the 2001 terrorist attacks.
Alexander was the general in charge of the Army's Intelligence and Security
Command (INSCOM) at Fort Belvoir, Virginia. He began insisting that the NSA
give him raw, unanalyzed data about suspected terrorists from the agency's
massive digital cache, according to three former intelligence officials.
Alexander had been building advanced data-mining software and analytic tools,
and now he wanted to run them against the NSA's intelligence caches to try to
find terrorists who were in the United States or planning attacks on the
homeland.
By law, the NSA had to scrub intercepted communications of most references to
U.S. citizens before those communications could be shared with other agencies.
But Alexander wanted the NSA "to bend the pipe towards him," says one of the
former officials, so that he could siphon off metadata, the digital records
of phone calls and email traffic that can be used to map out a terrorist
organization based on its members' communications patterns.
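The kind of pattern analysis described here can be sketched in a few lines: from bare call records, metadata alone already yields a weighted contact graph whose heaviest edges expose an organization’s structure. The names and records below are invented purely for illustration.

```python
from collections import defaultdict

# Invented call-metadata records: (caller, callee) pairs.
records = [
    ("alice", "bob"), ("alice", "carol"),
    ("bob", "carol"), ("dave", "alice"), ("alice", "bob"),
]

def contact_graph(records):
    # Build an undirected weighted contact graph:
    # edge weight = number of calls between each pair.
    graph = defaultdict(int)
    for a, b in records:
        graph[frozenset((a, b))] += 1
    return dict(graph)

def top_contacts(graph, n=2):
    # Rank pairs by call volume -- the communications-pattern
    # analysis that makes metadata so revealing without any content.
    return sorted(graph.items(), key=lambda kv: -kv[1])[:n]
```

Running `top_contacts(contact_graph(records))` surfaces the most frequently communicating pairs; scaled up to national call records, the same trivial aggregation maps out who talks to whom, how often, and through which intermediaries.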
"Keith wanted his hands on the raw data. And he bridled at the fact that NSA
didn't want to release the information until it was properly reviewed and in
a report," says a former national security official. "He felt that from a
tactical point of view, that was often too late to be useful."
Hayden thought Alexander was out of bounds. INSCOM was supposed to provide
battlefield intelligence for troops and special operations forces overseas,
not use raw intelligence to find terrorists within U.S. borders. But
Alexander had a more expansive view of what military intelligence agencies
could do under the law.
"He said at one point that a lot of things aren't clearly legal, but that
doesn't make them illegal," says a former military intelligence officer who
served under Alexander at INSCOM.
In November 2001, the general in charge of all Army intelligence had informed
his personnel, including Alexander, that the military had broad authority to
collect and share information about Americans, so long as they were
"reasonably believed to be engaged" in terrorist activities, the general
wrote in a widely distributed memo.
The general didn't say how exactly to make this determination, but it was all
the justification Alexander needed. "Hayden's attitude was 'Yes, we have the
technological capability, but should we use it?' Keith's was 'We have the
capability, so let's use it,'" says the former intelligence official who
worked with both men.
Hayden denied Alexander's request for NSA data. And there was some irony in
that decision. At the same time, Hayden was overseeing a highly classified
program to monitor Americans' phone records and Internet communications
without permission from a court. At least one component of that secret
domestic spying program would later prompt senior Justice Department
officials to threaten resignation because they thought it was illegal.
But that was a presidentially authorized program run by a top-tier national
intelligence agency. Alexander was a midlevel general who seemed to want his
own domestic spying operation. Hayden was so troubled that he reported
Alexander to his commanding general, a former colleague says. "He didn't use
that atomic word -- 'insubordination' -- but he danced around it."
The showdown over bending the NSA's pipes was emblematic of Alexander's
approach to intelligence, one he has honed over the course of a 39-year
military career and deploys today as the director of the country's most
powerful spy agency.
Alexander wants as much data as he can get. And he wants to hang on to it for
as long as he can. To prevent the next terrorist attack, he thinks he needs
to be able to see entire networks of communications and also go "back in
time," as he has said publicly, to study how terrorists and their networks
evolve. To find the needle in the haystack, he needs the entire haystack.
"Alexander's strategy is the same as Google's: I need to get all of the
data," says a former administration official who worked with the general. "If
he becomes the repository for all that data, he thinks the resources and
authorities will follow."
That strategy has worked well for Alexander. He has served longer than any
director in the NSA's history, and today he stands atop a U.S. surveillance
empire in which signals intelligence, the agency's specialty, is the coin of
the realm. In 2010, he became the first commander of the newly created U.S.
Cyber Command, making him responsible for defending military computer
networks against spies, hackers, and foreign armed forces -- and for fielding
a new generation of cyberwarriors trained to penetrate adversaries' networks.
Fueled by a series of relentless and increasingly revealing leaks from former
NSA contractor Edward Snowden, the full scope of Alexander's master plan is
coming to light.
Today, the agency is routinely scooping up and storing Americans' phone
records. It is screening their emails and text messages, even though the spy
agency can't always tell the difference between an innocent American and a
foreign terrorist. The NSA uses corporate proxies to monitor up to 75 percent
of Internet traffic inside the United States. And it has spent billions of
dollars on a secret campaign to foil encryption technologies that
individuals, corporations, and governments around the world had long thought
protected the privacy of their communications from U.S. intelligence
agencies.
The NSA was already a data behemoth when Alexander took over. But under his
watch, the breadth, scale, and ambition of its mission have expanded beyond
anything ever contemplated by his predecessors. In 2007, the NSA began
collecting information from Internet and technology companies under the
so-called PRISM program. In essence, it was a pipes-bending operation. The
NSA gets access to the companies' raw data--including e-mails, video chats,
and messages sent through social media--and analysts then mine it for clues
about terrorists and other foreign intelligence subjects. Similar to how
Alexander wanted the NSA to feed him with intelligence at INSCOM, now some of
the world's biggest technology companies -- including Google, Microsoft,
Facebook, and Apple -- are feeding the NSA. But unlike Hayden, the companies
cannot refuse Alexander's advances. The PRISM program operates under a legal
regime, put in place a few years after Alexander arrived at the NSA, that
allows the agency to demand broad categories of information from technology
companies.
Never in history has one agency of the U.S. government had the capacity, as
well as the legal authority, to collect and store so much electronic
information. Leaked NSA documents show the agency sucking up data from
approximately 150 collection sites on six continents. The agency estimates
that 1.6 percent of all data on the Internet flows through its systems on a
given day -- an amount of information about 50 percent larger than what
Google processes in the same period.
When Alexander arrived, the NSA was secretly investing in experimental
databases to store these oceans of electronic signals and give analysts
access to it all in as close to real time as possible. Under his direction,
it has helped pioneer new methods of massive storage and retrieval. That has
led to a data glut. The agency has collected so much information that it ran
out of storage capacity at its 350-acre headquarters at Fort Meade, Maryland,
outside Washington, D.C. At a cost of more than $2 billion, it has built a
new processing facility in the Utah desert, and it recently broke ground on a
complex in Maryland. There is a line item in the NSA's budget just for
research on "coping with information overload."
Yet it's still not enough for Alexander, who has proposed installing the
NSA's surveillance equipment on the networks of defense contractors, banks,
and other organizations deemed essential to the U.S. economy or national
security. Never has this intelligence agency -- whose primary mission is
espionage, stealing secrets from other governments -- proposed to become the
electronic watchman of American businesses.
This kind of radical expansion shouldn't come as a surprise. In fact, it's a
hallmark of Alexander's career. During the Iraq war, for example, he
pioneered a suite of real-time intelligence analysis tools that aimed to
scoop up every phone call, email, and text message in the country in a search
for terrorists and insurgents. Military and intelligence officials say it
provided valuable insights that helped turn the tide of the war. It was also
unprecedented in its scope and scale. He has transferred that architecture to
a global scale now, and with his responsibilities at Cyber Command, he is
expanding his writ into the world of computer network defense and cyber
warfare.
As a result, the NSA has never been more powerful, more pervasive, and more
politically imperiled. The same philosophy that turned Alexander into a giant
-- acquire as much data from as many sources as possible -- is now
threatening to undo him. Alexander today finds himself in the unusual
position of having to publicly defend once-secret programs and reassure
Americans that the growth of his agency, which employs more than 35,000
people, is not a cause for alarm. In July, the House of Representatives
almost approved a law to constrain the NSA's authorities -- the closest
Congress has come to reining in the agency since the 9/11 attacks. That
narrow defeat for surveillance opponents has set the stage for a Supreme
Court ruling on whether metadata -- the information Alexander has most often
sought about Americans -- should be afforded protection under the Fourth
Amendment's prohibition against "unreasonable searches and seizures," which
would make metadata harder for the government to acquire.
Alexander declined Foreign Policy's request for an interview, but in response
to questions about his leadership, his respect for civil liberties, and the
Snowden leaks, he provided a written statement.
"The missions of NSA and USCYBERCOM are conducted in a manner that is lawful,
appropriate, and effective, and under the oversight of all three branches of
the U.S. government," Alexander stated. "Our mission is to protect our people
and defend the nation within the authorities granted by Congress, the courts
and the president. There is an ongoing investigation into the damage
sustained by our nation and our allies because of the recent unauthorized
disclosure of classified material. Based on what we know to date, we believe
these disclosures have caused significant and irreversible harm to the
security of the nation."
In lieu of an interview about his career, Alexander's spokesperson
recommended a laudatory profile about him that appeared in West Point
magazine. It begins: "At key moments throughout its history, the United
States has been fortunate to have the right leader -- someone with an ideal
combination of rare talent and strong character -- rise to a position of
great responsibility in public service. With General Keith B. Alexander ...
Americans are again experiencing this auspicious state of affairs."
Lawmakers and the public are increasingly taking a different view. They are
skeptical about what Alexander has been doing with all the data he's
collecting -- and why he's been willing to push the bounds of the law to get
it. If he's going to preserve his empire, he'll have to mount the biggest
charm offensive of his career. Fortunately for him, Alexander has spent as
much time building a political base of power as a technological one.
* * *
Those who know Alexander say he is introspective, self-effacing, and even
folksy. He's fond of corny jokes and puns and likes to play pool, golf, and
Bejeweled Blitz, the addictive puzzle game, on which he says he routinely
scores more than 1 million points.
Alexander is also as skilled a Washington knife fighter as they come. To get
the NSA job, he allied himself with the Pentagon brass, most notably Donald
Rumsfeld, who distrusted Hayden and thought he had been trying to buck the
Pentagon's control of the NSA. Alexander also called on all the right
committee members on Capitol Hill, the overseers and appropriators who hold
the NSA's future in their hands.
When he was running the Army's Intelligence and Security Command, Alexander
brought many of his future allies down to Fort Belvoir for a tour of his base
of operations, a facility known as the Information Dominance Center. It had
been designed by a Hollywood set designer to mimic the bridge of the starship
Enterprise from Star Trek, complete with chrome panels, computer stations, a
huge TV monitor on the forward wall, and doors that made a "whoosh" sound
when they slid open and closed. Lawmakers and other important officials took
turns sitting in a leather "captain's chair" in the center of the room and
watched as Alexander, a lover of science-fiction movies, showed off his data
tools on the big screen.
"Everybody wanted to sit in the chair at least once to pretend he was
Jean-Luc Picard," says a retired officer in charge of VIP visits.
Alexander wowed members of Congress with his eye-popping command center. And
he took time to sit with them in their offices and explain the intricacies of
modern technology in simple, plain-spoken language. He demonstrated a command
of the subject without intimidating those who had none.
"Alexander is 10 times the political general as David Petraeus," says the
former administration official, comparing the NSA director to a man who was
once considered a White House contender. "He could charm the paint off a
wall."
Alexander has had to muster every ounce of that political savvy since the
Snowden leaks started coming in June. In closed-door briefings, members of
Congress have accused him of deceiving them about how much information he has
been collecting on Americans. Even when lawmakers have screamed at him from
across the table, Alexander has remained "unflappable," says a congressional
staffer who has sat in on numerous private briefings since the Snowden leaks.
Instead of screaming back, he reminds lawmakers about all the terrorism plots
that the NSA has claimed to help foil.
"He is well aware that he will be criticized if there's another attack," the
staffer says. "He has said many times, 'My job is to protect the American
people. And I have to be perfect.'"
There's an implied threat in that statement. If Alexander doesn't get all the
information he wants, he cannot do his job. "He never says it explicitly, but
the message is, 'You don't want to be the one to make me miss,'" says the
former administration official. "You don't want to be the one that denied me
these capabilities before the next attack."
Alexander has a distinct advantage over most, if not all, intelligence chiefs
in the government today: He actually understands the multibillion-dollar
technical systems that he's running.
"When he would talk to our engineers, he would get down in the weeds as far
as they were. And he'd understand what they were talking about," says a
former NSA official. In that respect, he had a leg up on Hayden, who
colleagues say is a good big-picture thinker but lacks the geek gene that
Alexander was apparently born with.
"He looked at the technical aspects of the agency more so than any director
I've known," says Richard "Dickie" George, who spent 41 years at the NSA and
retired as the technical director of the Information Assurance Directorate.
"I get the impression he would have been happy being one of those guys
working down in the noise," George said, referring to the front-line
technicians and analysts working to pluck signals out of the network.
Alexander, 61, has been a techno-spy since the beginning of his military
career. After graduating from West Point in 1974, he went to West Germany,
where he was initiated in the dark arts of signals intelligence. Alexander
spent his time eavesdropping on military communications emanating from East
Germany and Czechoslovakia. He was interested in the mechanics that supported
this brand of espionage. He rose quickly through the ranks.
"It's rare to get a commander who understands technology," says a former Army
officer who served with Alexander in 1995, when Alexander was in charge of
the 525th Military Intelligence Brigade at Fort Bragg, North Carolina. "Even
then he was into big data. You think of the wizards as the guys who are in
their 20s." Alexander was 42 at the time.
At the turn of the century, Alexander took the big-data approach to
counterterrorism. How well that method worked continues to be a matter of
intense debate. Surely discrete interceptions of terrorists' phone calls and
emails have helped disrupt plots and prevent attacks. But huge volumes of
data don't always help catch potential plotters. Sometimes, the drive for
more data just means capturing more ordinary people in the surveillance
driftnet.
When he ran INSCOM and was horning in on the NSA's turf, Alexander was fond
of building charts that showed how a suspected terrorist was connected to a
much broader network of people via his communications or the contacts in his
phone or email account.
"He had all these diagrams showing how this guy was connected to that guy and
to that guy," says a former NSA official who heard Alexander give briefings
on the floor of the Information Dominance Center. "Some of my colleagues and
I were skeptical. Later, we had a chance to review the information. It turns
out that all [that] those guys were connected to were pizza shops."
A retired military officer who worked with Alexander also describes a
"massive network chart" that was purportedly about al Qaeda and its
connections in Afghanistan. Upon closer examination, the retired officer
says, "We found there was no data behind the links. No verifiable sources. We
later found out that a quarter of the guys named on the chart had already
been killed in Afghanistan."
Those network charts have become more massive now that Alexander is running
the NSA. When analysts try to determine if a particular person is engaged in
terrorist activity, they may look at the communications of people who are as
many as three steps, or "hops," removed from the original target. This means
that even when the NSA is focused on just one individual, the number of
people who are being caught up in the agency's electronic nets could easily
be in the tens of millions.
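The fan-out behind that estimate is simple exponential arithmetic, which can be sketched in a few lines of Python. The figure of 190 contacts per person is an illustrative assumption, not a number from the article; the point is only that three hops from a single target reaches millions of people:

```python
def reachable(contacts_per_person: int, hops: int) -> int:
    """Rough upper bound on people swept in within a given hop depth,
    assuming each person contributes the same number of new contacts."""
    return sum(contacts_per_person ** h for h in range(1, hops + 1))

# With an assumed ~190 contacts per person, three hops covers
# 190 + 36,100 + 6,859,000 people:
print(reachable(190, 3))  # 6895290
```

Real contact graphs overlap heavily, so the true count is lower, but the growth rate is why even one target can pull millions into the net.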
According to an internal audit, the agency's surveillance operations have
been beset by human error and fooled by moving targets. After the NSA's legal
authorities were expanded and the PRISM program was implemented, the agency
inadvertently collected Americans' communications thousands of times each
year, between 2008 and 2012, in violation of privacy rules and the law.
Yet the NSA still pursued a counterterrorism strategy that relies on
ever-bigger data sets. Under Alexander's leadership, one of the agency's
signature analysis tools was a digital graph that showed how hundreds,
sometimes thousands, of people, places, and events were connected to each
other. They were displayed as a tangle of dots and lines. Critics called it
the BAG -- for "big ass graph" -- and said it produced very few useful leads.
CIA officials in charge of tracking overseas terrorist cells were
particularly unimpressed by it. "I don't need this," a senior CIA officer
working on the agency's drone program once told an NSA analyst who showed up
with a big, nebulous graph. "I just need you to tell me whose ass to put a
Hellfire missile on."
Given his pedigree, it's unsurprising that Alexander is a devotee of big
data. "It was taken as a given for him, as a career intelligence officer,
that more information is better," says another retired military officer.
"That was ingrained."
But Alexander was never alone in his obsession. An obscure civilian engineer
named James Heath has been a constant companion for a significant portion of
Alexander's career. More than any one person, Heath influenced how the
general went about building an information empire.
Several former intelligence officials who worked with Heath described him as
Alexander's "mad scientist." Another called him the NSA director's "evil
genius." For years, Heath, a brilliant but abrasive technologist, has been in
charge of making Alexander's most ambitious ideas a reality; many of the
controversial data-mining tools that Alexander wanted to use against the
NSA's raw intelligence were developed by Heath, for example. "He's smart,
crazy, and dangerous. He'll push the technology to the limits to get it to do
what he wants," says a former intelligence official.
Heath has followed Alexander from post to post, but he almost always stays in
the shadows. Heath recently retired from government service as the senior
science advisor to the NSA director -- Alexander's personal tech guru. "The
general really looked to him for advice," says George, the former technical
director. "Jim didn't mind breaking some eggs to make an omelet. He couldn't
do that on his own, but General Alexander could. They brought a sense of
needing to get things done. They were a dynamic duo."
Precisely where Alexander met Heath is unclear. They have worked together
since at least 1995, when Alexander commanded the 525th Military Intelligence
Brigade and Heath was his scientific sidekick. "That's where Heath took his
first runs at what he called 'data visualization,' which is now called 'big
data,'" says a retired military intelligence officer. Heath was building
tools that helped commanders on the field integrate information from
different sensors -- reconnaissance planes, satellites, signals intercepts --
and "see" it on their screens. Later, Heath would work with tools that showed
how words in a document or pages on the Internet were linked together,
displaying those connections in the form of three-dimensional maps and
graphs.
At the Information Dominance Center, Heath built a program called the
"automatic ingestion manager." It was a search engine for massive sets of
data, and in 1999, he started taking it for test runs on the Internet.
In one experiment, the retired officer says, the ingestion manager searched
for all web pages linked to the website of the Defense Intelligence Agency
(DIA). Those included every page on the DIA's site, and the tool scoured and
copied them so aggressively that it was mistaken for a hostile cyberattack.
The site's automated defenses kicked in and shut it down.
On another occasion, the searching tool landed on an anti-war website while
searching for information about the conflict in Kosovo. "We immediately got a
letter from the owner of the site wanting to know why was the military spying
on him," the retired officer says. As far as he knows, the owner took no
legal action against the Army, and the test run was stopped.
Those experiments with "bleeding-edge" technology, as the denizens of the
Information Dominance Center liked to call it, shaped Heath and Alexander's
approach to technology in spy craft. And when they ascended to the NSA in
2005, their influence was broad and profound. "These guys have propelled the
intelligence community into big data," says the retired officer.
Heath was at Alexander's side for the expansion of Internet surveillance
under the PRISM program. Colleagues say it fell largely to him to design
technologies that tried to make sense of all the new information the NSA was
gobbling up. But Heath had developed a reputation for building expensive
systems that never really work as promised and then leaving them half-baked
in order to follow Alexander on to some new mission.
"He moved fairly fast and loose with money and spent a lot of it," the
retired officer says. "He doubled the size of the Information Dominance
Center and then built another facility right next door to it. They didn't
need it. It's just what Heath and Alexander wanted to do." The Information
Operations Center, as it was called, was underused and spent too much money,
says the retired officer. "It's a center in search of a customer."
Heath's reputation followed him to the NSA. In early 2010, weeks after a
young al Qaeda terrorist with a bomb sewn into his underwear tried to bring
down a U.S. airliner over Detroit on Christmas Day, the director of national
intelligence, Dennis Blair, called for a new tool that would help the
disparate intelligence agencies better connect the dots about terrorism
plots. The NSA, the State Department, and the CIA each had possessed
fragments of information about the so-called underwear bomber's intentions,
but there had been no dependable mechanism for integrating them all and
providing what one former national security official described as "a
quick-reaction capability" so that U.S. security agencies would be warned
about the bomber before he got on the plane.
Blair put the NSA in charge of building this new capability, and the task
eventually fell to Heath. "It was a complete disaster," says the former
national security official, who was briefed on the project. "Heath's approach
was all based on signals intelligence [the kind the NSA routinely collects]
rather than taking into account all the other data coming in from the CIA and
other sources. That's typical of Heath. He's got a very narrow viewpoint to
solve a problem."
Like other projects of Heath's, the former official says, this one was never
fully implemented. As a result, the intelligence community still didn't have
a way to stitch together clues from different databases in time to stop the
next would-be bomber. Heath -- and Alexander -- moved on to the next big
project.
"There's two ways of looking at these guys," the retired military officer
says. "Two visionaries who took risks and pushed the intelligence community
forward. Or as two guys who blew a monumental amount of money."
As immense as the NSA's mission has become -- patrolling the world's data
fields in search of terrorists, spies, and computer hackers -- it is merely
one phase of Alexander's plan. The NSA's primary mission is to protect
government systems and information. But under his leadership, the agency is
also extending its reach into the private sector in unprecedented ways.
Toward the end of George W. Bush's administration, Alexander helped persuade
Defense Department officials to set up a computer network defense project to
prevent foreign intelligence agencies -- mainly China's -- from stealing
weapons plans and other national secrets from government contractors'
computers.
Under the Defense Industrial Base initiative, also known as the DIB, the NSA
provides the companies with intelligence about the cyberthreats it's
tracking. In return, the companies report back about what they see on their
networks and share intelligence with each other.
Pentagon officials say the program has helped stop some cyber-espionage. But
many corporate participants say Alexander's primary motive has not been to
share what the NSA knows about hackers. It's to get intelligence from the
companies -- to make them the NSA's digital scouts. What is billed as an
information-sharing arrangement has sometimes seemed more like a one-way
street, leading straight to the NSA's headquarters at Fort Meade.
"We wanted companies to be able to share information with each other," says
the former administration official, "to create a picture about the threats
against them. The NSA wanted the picture."
After the DIB was up and running, Alexander proposed going further. "He
wanted to create a wall around other sensitive institutions in America, to
include financial institutions, and to install equipment to monitor their
networks," says the former administration official. "He wanted this to be
running in every Wall Street bank."
That aspect of the plan has never been fully implemented, largely due to
legal concerns. If a company allowed the government to install monitoring
equipment on its systems, a court could decide that the company was acting as
an agent of the government. And if surveillance were conducted without a
warrant or legitimate connection to an investigation, the company could be
accused of violating the Fourth Amendment. Warrantless surveillance can be
unconstitutional regardless of whether the NSA or Google or Goldman Sachs is
doing it.
"That's a subtle point, and that subtlety was often lost on NSA," says the
former administration official. "Alexander has ignored that Fourth Amendment
concern."
The DIB experiment was a first step toward Alexander's taking more control
over the country's cyberdefenses, and it was illustrative of his assertive
approach to the problem. "He was always challenging us on the defensive side
to be more aware and to try and find and counter the threat," says Tony
Sager, who was the chief operating officer for the NSA's Information
Assurance Directorate, which protects classified government information and
computers. "He wanted to know, 'Who are the bad guys? How do we go after
them?'"
While it's a given that the NSA cannot monitor the entire Internet on its own
and that it needs intelligence from companies, Alexander has questioned
whether companies have the capacity to protect themselves. "What we see is an
increasing level of activity on the networks," he said recently at a security
conference in Canada. "I am concerned that this is going to break a threshold
where the private sector can no longer handle it and the government is going
to have to step in."
* * *
Now, for the first time in Alexander's career, Congress and the general
public are expressing deep misgivings about sharing information with the NSA
or letting it install surveillance equipment. A Rasmussen poll of likely
voters taken in June found that 68 percent believe it's likely the government
is listening to their communications, despite repeated assurances from
Alexander and President Barack Obama that the NSA is only collecting
anonymous metadata about Americans' phone calls. In another Rasmussen poll,
57 percent of respondents said they think it's likely that the government
will use NSA intelligence "to harass political opponents."
Some who know Alexander say he doesn't appreciate the depth of public
mistrust and cynicism about the NSA's mission. "People in the intelligence
community in general, and certainly Alexander, don't understand the strategic
value of having a largely unified country and a long-term trust in the
intelligence business," says a former intelligence official, who has worked
with Alexander. Another adds, "There's a feeling within the NSA that they're
all patriotic citizens interested in protecting privacy, but they lose sight
of the fact that people don't trust the government."
Even Alexander's strongest critics don't doubt his good intentions. "He's not
a nefarious guy," says the former administration official. "I really do feel
like he believes he's doing this for the right reasons." Two of the retired
military officers who have worked with him say Alexander was seared by the
bombing of the USS Cole in 2000 and later the 9/11 attacks, a pair of major
intelligence failures that occurred while he was serving in senior-level
positions in military intelligence. They said he vowed to do all he could to
prevent another attack that could take the lives of Americans and military
service members.
But those who've worked closely with Alexander say he has become blinded by
the power of technology. "He believes they have enough technical safeguards
in place at the NSA to protect civil liberties and perform their mission,"
the former administration official says. "They do have a very robust
capability -- probably better than any other agency. But he doesn't get that
this power can still be abused. Americans want introspection. Transparency is
a good thing. He doesn't understand that. In his mind it's 'You should trust
me, and in exchange, I give you protection.'"
On July 30 in Las Vegas, Alexander sat down for dinner with a group of civil
liberties activists and Internet security researchers. He was in town to give
a keynote address the next day at the Black Hat security conference. The mood
at the table was chilly, according to people who were in attendance. In 2012,
Alexander had won plaudits for his speech at Black Hat's sister conference,
Def Con, in which he'd implored the assembled community of experts to join
him in their mutual cause: protecting the Internet as a safe space for
speech, communications, and commerce. Now, however, nearly two months after
the first leaks from Snowden, the people around the table wondered whether
they could still trust the NSA director.
His dinner companions questioned Alexander about the NSA's legal authority to
conduct massive electronic surveillance. Two guests had recently written a
New York Times op-ed calling the NSA's activities "criminal." Alexander was
quick to debate the finer points of the law and defend his agency's programs
-- at least the ones that have been revealed -- as closely monitored and
focused solely on terrorists' information.
But he also tried to convince his audience that they should help keep the
NSA's surveillance system running. In so many words, Alexander told them: The
terrorists only have to succeed once to kill thousands of people. And if they
do, all of the rules we have in place to protect people's privacy will go out
the window.
Alexander cast himself as the ultimate defender of civil liberties, as a man
who needs to spy on some people in order to protect everyone. He knows that
in the wake of another major terrorist attack on U.S. soil, the NSA will be
unleashed to find the perpetrators and stop the next assault. Random searches
of metadata, broad surveillance of purely domestic communications,
warrantless seizure of stored communications -- presumably these and other
extraordinary measures would be on the table. Alexander may not have spelled
out just what the NSA would do after another homeland strike, but the message
was clear: We don't want to find out.
Alexander was asking his dinner companions to trust him. But his credibility
has been badly damaged. Alexander was heckled at his speech the next day at
Black Hat. He had been slated to talk at Def Con too, but the organizers
rescinded their invitation after the Snowden leaks. And even among
Alexander's cohort, trust is flagging.
"You'll never find evidence that Keith sits in his office at lunch listening
to tapes of U.S. conversations," says a former NSA official. "But I think he
has a little bit of naiveté about this controversy. He thinks, 'What's the
problem? I wouldn't abuse this power. Aren't we all honorable people?' People
get into these insular worlds out there at NSA. I think Keith fits right in."
One of the retired military officers, who worked with Alexander on several
big-data projects, said he was shaken by revelations that the agency is
collecting all Americans' phone records and examining enormous amounts of
Internet traffic. "I've not changed my opinion on the right balance between
security versus privacy, but what the NSA is doing bothers me," he says.
"It's the massive amount of information they're collecting. I know they're
not listening to everyone's phone calls. No one has time for that. But
speaking as an analyst who has used metadata, I do not sleep well at night
knowing these guys can see everything. That trust has been lost."
Looks paywalled. Can someone liberate the document, and repost
it here?
----- Forwarded message from Noah Shachtman <noah.shachtman(a)gmail.com> -----
Date: Sun, 8 Sep 2013 21:36:04 -0400
From: Noah Shachtman <noah.shachtman(a)gmail.com>
To: liberationtech(a)lists.stanford.edu
Subject: [liberationtech] Meet the 'cowboy' in charge of the NSA
X-Mailer: Apple Mail (2.1085)
Reply-To: liberationtech <liberationtech(a)lists.stanford.edu>
All:
Sorry if this is considered spamming the list - if it is, it won't happen again.
At Foreign Policy, we just published what I believe is the first major profile of NSA chief Keith Alexander. It is not a particularly flattering one.
One scooplet among many in Shane Harris' nearly 6,000-word story: Even his fellow spies consider Keith Alexander to be a "cowboy" who's barely concerned with law.
Anyway, take a look. Let me know what you think.
http://www.foreignpolicy.com/articles/2013/09/08/the_cowboy_of_the_nsa_keit…
All the best,
nms
--
Noah Shachtman
Executive Editor for News | Foreign Policy
917-690-0716
noah.shachtman(a)gmail.com
http://www.foreignpolicy.com/author/NoahShachtman
encrypted phone: 415-463-4956
--
Liberationtech is a public list whose archives are searchable on Google. Violations of list guidelines will get you moderated: https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change password by emailing moderator at companys(a)stanford.edu.
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
----- Forwarded message from coderman <coderman(a)gmail.com> -----
Date: Sun, 8 Sep 2013 22:35:02 -0700
From: coderman <coderman(a)gmail.com>
To: David Johnston <dj(a)deadhat.com>
Cc: Discussion of cryptography and related <cryptography(a)randombit.net>
Subject: Re: [cryptography] urandom vs random
On Sun, Sep 8, 2013 at 9:57 PM, David Johnston <dj(a)deadhat.com> wrote:
> ...
> I've argued in private (and now here) that a large entropy pool is a natural
> response to entropy famine and uneven supply, just like a large grain depot
> guards against food shortages and uneven supply.
this is a good analogy :)
> ... The natural size for the state
> shrinks to the block size of the crypto function being used for entropy
> extraction
for best effective performance, it seems the memory bus(es) constrain
the optimal transmission unit size: 4k extended instructions provide
more throughput than repeated instructions at 512-bit chunks.
in the worst case scenarios, you're passing entropy directly into AES
native instructions, and/or onward to PCIe lanes...
> This is one of the things that drove the design decisions in the RdRand
> DRNG. With 2.5Gbps of 95% entropic data, there is no value in stirring the
> data into a huge pool (E.G. like Linux)
you keep coming back to this assumption that RDRAND is entirely
trusted and always available.
consider adding additional entropy sources like USB keys, scavengers
like Dakarand or Haveged, and so forth.
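One hedged sketch of what mixing several such sources can look like (the function and source choices here are illustrative assumptions, not coderman's setup): hash the independent inputs together, so the combined seed is no weaker than the strongest single input.

```python
import hashlib
import os
import time

def mix_sources(*sources: bytes) -> bytes:
    """Combine independent entropy inputs into one 32-byte seed.

    Each input is length-prefixed so source boundaries stay
    unambiguous: mix_sources(b"ab", b"c") != mix_sources(b"a", b"bc").
    """
    h = hashlib.sha256()
    for s in sources:
        h.update(len(s).to_bytes(8, "big"))
        h.update(s)
    return h.digest()

seed = mix_sources(
    os.urandom(32),                             # kernel pool
    time.perf_counter_ns().to_bytes(8, "big"),  # timing jitter (weak alone)
    b"\x00" * 32,  # stand-in for a hardware RNG output such as RDRAND
)
print(len(seed))  # 32
```

Even if one source turns out to be backdoored or constant, the hash output remains unpredictable as long as any one input is.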
conversely to your argument, there is no harm in aggressively mixing a
large pool with a high rate hardware entropy source. if you are one of
the worst case scenarios, like seeding an entire new volume for full
disk encryption with entropy, then you can manage accordingly and cut
out the OS level, kernel pool middle man, system call boundary, and
other overhead accordingly.
> A consequence of Linux having a big pool is that the stirring algorithm is
> expensive because it has to operate over many bits.
but not effectively expensive!
again, i find very few situations in which my modern processor is
unable to keep a properly refilled aggressively reseeded /dev/random
up to any demanded rate of consumption for high speed network
services, common client side uses, most key generation, and so forth.
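A quick back-of-the-envelope check of that "not effectively expensive" claim (the pool size and hash choice here are assumptions for illustration, not Linux's actual stirring algorithm): hashing even a 16 MiB pool with SHA-256 takes a modern CPU a small fraction of a second.

```python
import hashlib
import time

POOL_SIZE = 16 * 1024 * 1024  # 16 MiB -- far larger than the kernel's pool
pool = bytes(POOL_SIZE)

start = time.perf_counter()
digest = hashlib.sha256(pool).digest()
elapsed = time.perf_counter() - start

# On typical hardware this completes in tens of milliseconds or less.
print(f"hashed {POOL_SIZE // (1024 * 1024)} MiB in {elapsed:.3f}s")
```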
> When I count my raw data in bits per second, rather than gigabits per
> second, I am of course going to use them efficiently and mix up a large pot
> of state, so I can get maximum utility. With the RdRand DRNG, the bus is the
> limiting factor, not the supply or the pool size.
fair enough, but consider the inverse, particularly for a skeptical
audience knowing what we do now:
why not mix aggressively with multiple sources if you have the CPU budget?
why not provide access to the raw, un-mixed, un-encrypted,
un-whitened, un-obfuscated state of the raw entropy bits for those so
inclined to use it in such a manner?
efforts to drive RDRAND into direct use instead of the kernel entropy
pool in the linux kernel,
efforts to steadfastly refuse access to the raw entropy stream,
are thus viewed with suspicion and undermine credibility.
even with all of these concerns, i have publicly said and will
continue to assert, using RDRAND is better than nothing. the current
state of entropy on most operating systems and especially virtual
machine environments on these operating systems, is very poor.
it is just a shame this resource cannot be used to greater utility and
confidence, as it could be were raw access available.
best regards,
_______________________________________________
cryptography mailing list
cryptography(a)randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
----- End forwarded message -----
----- Forwarded message from Collin Anderson <collin(a)averysmallbird.com> -----
Date: Mon, 9 Sep 2013 01:24:15 -0400
From: Collin Anderson <collin(a)averysmallbird.com>
To: "liberationtech(a)lists.stanford.edu" <liberationtech(a)lists.stanford.edu>
Subject: [liberationtech] Pew: Anonymity, Privacy, and Security Online
Reply-To: liberationtech <liberationtech(a)lists.stanford.edu>
This was linked to in the FP piece on Alexander, and should hopefully be of
interest to many here in privacy and CFAA work (14% have used VPNs, Tor,
etc). - Collin
---
http://pewinternet.org/Reports/2013/Anonymity-online/Summary-of-Findings.as…
Most internet users would like to be anonymous online at least
occasionally, but many think it is not possible to be completely anonymous
online. New findings in a national survey show:
- 86% of internet users have taken steps online to remove or mask their
digital footprints—ranging from clearing cookies to encrypting their email,
from avoiding using their name to using virtual networks that mask their
internet protocol (IP) address.
- 55% of internet users have taken steps to avoid observation by
specific people, organizations, or the government.
- Still, 59% of internet users do not believe it is possible to be
completely anonymous online, while 37% of them believe it is possible.
A section of the survey looking at various security-related issues finds
that notable numbers of internet users say they have experienced problems
because others stole their personal information or otherwise took advantage
of their visibility online—including hijacked email and social media
accounts, stolen information such as Social Security numbers or credit card
information, stalking or harassment, loss of reputation, or victimization
by scammers.
- 21% of internet users have had an email or social networking account
compromised or taken over by someone else without permission.
- 13% of internet users have experienced trouble in a relationship
between them and a family member or a friend because of something the user
posted online.
- 12% of internet users have been stalked or harassed online.
- 11% of internet users have had important personal information stolen
such as their Social Security Number, credit card, or bank account
information.
- 6% of internet users have been the victim of an online scam and lost
money.
- 6% of internet users have had their reputation damaged because of
something that happened online.
- 4% of internet users have been led into physical danger because of
something that happened online.
- 1% of internet users have lost a job opportunity or educational
opportunity because of something they posted online or someone posted about
them.
Some 68% of internet users believe current laws are not good enough in
protecting people’s privacy online and 24% believe current laws provide
reasonable protections.
Most internet users know that key pieces of personal information about them
are available online—such as photos and videos of them, their email
addresses, birth dates, phone numbers, home addresses, and the groups to
which they belong. And growing numbers of internet users (50%) say they are
worried about the amount of personal information about them that is
online—a figure that has jumped from 33% who expressed such worry in 2009.
People would like control over their information, saying in many cases it
is very important to them that only they or the people they authorize
should be given access to such things as the content of their emails, the
people to whom they are sending emails, the place where they are when they
are online, and the content of the files they download.
--
*Collin David Anderson*
averysmallbird.com | @cda | Washington, D.C.
--
Liberationtech is a public list whose archives are searchable on Google. Violations of list guidelines will get you moderated: https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change password by emailing moderator at companys(a)stanford.edu.
----- End forwarded message -----