cypherpunks

Re: [Freedombox-discuss] Which mesh system should be included in the Freedombox?
by Eugen Leitl 12 Oct '13
----- Forwarded message from Paul Gardner-Stephen <paul(a)servalproject.org> -----
Date: Sat, 12 Oct 2013 20:23:27 +1030
From: Paul Gardner-Stephen <paul(a)servalproject.org>
To: Petter Reinholdtsen <pere(a)hungry.com>
Cc: freedombox-discuss <freedombox-discuss(a)lists.alioth.debian.org>
Subject: Re: [Freedombox-discuss] Which mesh system should be included in the Freedombox?
Message-ID: <CA+_T8-C5m-2+1snwQZpz4GUdHinqy1z19ixnuk=we6px3ghEiQ(a)mail.gmail.com>
Hello all,
On Sat, Oct 12, 2013 at 6:06 PM, Petter Reinholdtsen <pere(a)hungry.com>wrote:
>
> [Sandy Harris]
> > As I see it, security has to be the first consideration for any Box
> > component, including a mesh system. Given the stated project goals we
> > should not even consider anything unless we have good reason to
> > consider it secure.
>
> Well, I believe that is putting the cart before the horse, given the
> current number of people involved. I believe we first need to get
> something useful that can be located in the privacy of the users' homes
> to get that legal protection, and then we can continue improving it to
> make it more and more "secure" -- a word that means different things to
> different people and is thus too fuzzy to serve as a goal.
>
> This means to me that we pick solutions already in common use,
> integrate them into the Freedombox, and depend on the rest of the free
> software community to audit them (with our help, if someone in the
> Freedombox project wants to spend time on it).
>
> > If something looks desirable but has not had an audit for security,
> > then auditing it and contributing fixes if needed is more important
> > for the Box than things like getting it into Debian or making it run
> > on a Dreamplug.
>
> I am happy to hear that you want to focus on that area, and suggest you
> have a look at the batman-adv mesh routing system when you find time to
> audit mesh systems.
>
> I've concluded I will focus on batman-adv for now, as it provides layer 2
> mesh networking (meaning both IPv4 and IPv6 will work) and is used by the
> Serval project, which provides a peer-to-peer phone system that allows
> phone calls and "SMS" messaging without central infrastructure. If the
> Freedombox provides mesh nodes compatible with the Serval project, we get
> free software phone support for free. :)
>
So some clarification here:
Serval used to use the original layer-3 batman, and can still coexist with
batman, batman-adv, babel, olsrd etc. But Serval now includes its own mesh
routing protocol, for many of the reasons that are stimulating discussion
here.
Some of those reasons include the difficulty of making a secure fully
distributed network, especially a mesh network. Indeed, this was a major
reason for us side-stepping IP, and creating our own mesh-oriented network
layer.
We started from the ground up by using public cryptography keys as network
addresses. This means that we promiscuously share and exchange public keys
on the network as part of its inherent operation. It also means that
end-to-end encryption is trivial, requiring no key exchange, centralised
authority or other complication. Indeed, encryption is so simple in the
Serval network layer that we enable it by default: you need to set flags on
a packet if you *don't* want it signed and encrypted.
Careful choice of crypto system means that it is still fast, and doesn't
need huge keys. We also added an address abbreviation scheme that means
that we typically have smaller network headers than IPv4, let alone IPv6.
That leaves only key verification to ensure private, man-in-the-middle-free
communications with any party on the network -- a problem that the
open-source community has largely solved with the web of trust.
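The keys-as-addresses idea described above can be sketched in a few lines.
This is only a toy illustration of the general pattern, not Serval's actual
SID format, wire protocol, or abbreviation scheme; the hashing step here is
invented for the sketch:

```python
import hashlib

def key_address(public_key: bytes, abbrev_bytes: int = 8) -> str:
    """Toy sketch: derive a compact network address directly from a
    public key, with a short 'abbreviated' form for small headers.
    (In Serval the address effectively *is* the public key; this hash
    is just a stand-in for illustration.)"""
    full = hashlib.sha256(public_key).hexdigest()
    return full[: abbrev_bytes * 2]  # hex-encoded abbreviation

# Distinct keys yield distinct addresses; the same key is stable.
alice = key_address(b"alice-public-key")
bob = key_address(b"bob-public-key")
assert alice != bob
assert len(alice) == 16  # 8-byte abbreviation, smaller than an IPv6 address
```

Because the address is derived from the key itself, any node that can route
to a peer already holds the material needed to encrypt to that peer; only
out-of-band key verification remains, as the message notes.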
This security platform was recently recognised at the Global Security
Challenge grand-final in London where we received an Honourable Mention,
coming a close second in the entire competition -- against entrants from
the USA, UK, Israel and other major players in the security space.
We do not rest on our laurels, nor do we take the praise of men as meaning
that we have a perfect or vulnerability-free system. But we do believe
that we have created something that has great potential in the open-source
world, and especially for projects like Freedom Box where private
correspondence (text, voice and data) on a fully-distributed
self-organising network is a major objective.
As mentioned, because all Serval services operate in parallel to IP, you
can mix and match Serval services with your favourite traditional mesh
routing protocols should you wish to use them.
It also means that we can use interesting radio platforms that are too slow
to be useful on IP, e.g., ~100kbit/sec ISM band radios that have ranges 10x
to 100x that of Wi-Fi. We already have a working example of this in our
Serval Mesh Extender hardware device, which also shares many common
objectives with the Freedom Box.
We think that we have some interesting technologies that are of use to this
community, and of course since we develop them as free and open-source
software, we encourage this community to take whatever they find useful,
and perhaps even open a conversation for us to work out what activities and
efforts are in the intersection of our needs and objectives, and apply some
combined energy that will accelerate our mutual progress towards our goals.
Paul.
> See my blog post from yesterday,
> <URL:
> http://people.skolelinux.org/pere/blog/Oslo_community_mesh_network___with_N…
> >,
> for more details of what I have found out so far.
>
> --
> Happy hacking
> Petter Reinholdtsen
>
> _______________________________________________
> Freedombox-discuss mailing list
> Freedombox-discuss(a)lists.alioth.debian.org
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/freedombox-discuss
>
_______________________________________________
Freedombox-discuss mailing list
Freedombox-discuss(a)lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/freedombox-discuss
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
----- Forwarded message from Jim Thompson <jim(a)netgate.com> -----
Date: Sat, 12 Oct 2013 11:59:33 -0500
From: Jim Thompson <jim(a)netgate.com>
To: pfSense support and discussion <list(a)lists.pfsense.org>
Subject: Re: [pfSense] naive suggestion: conform to US laws
Message-Id: <9D8BFE6E-48A9-42E2-A494-99892FA27C90(a)netgate.com>
X-Mailer: Apple Mail (2.1812)
Reply-To: pfSense support and discussion <list(a)lists.pfsense.org>
On Oct 12, 2013, at 7:20 AM, Thinker Rix <thinkerix(a)rocketmail.com> wrote:
> On 2013-10-11 22:33, Walter Parker wrote:
>> Yes, you have been informed correctly. There are more than 2. According to the World Atlas (http://www.worldatlas.com/nations.htm#.UlhOHVFDsnY) the number is somewhere between 189 and 196.
>
> No kidding! ;-)
>
>> But you did not answer the question asked: Name the country that you would move the project to and why you believe that country would do a better job?
>
> Why should *I* name it, and why should I present ready solutions for an idea another community member brought up? Why should anybody be in a position to present ready solutions at this point? How about having a fruitful discussion and finding solutions together?
There is no reason to build a house on sand.
There is no fruitful discussion to be had when the premise is patently false.
>> Then because the USA can't be trusted, who is going to replace the Americans on the project?
>
> You are mixing things up here. Just because the USA invented its tyrannous "Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act", for which it perversely coined the euphemistic term "Patriot Act", and therefore cannot be trusted anymore for hosting anything there, why should the Americans be replaced?!
>
>> The name and logo are owned by an American company.
>
> I guess that is true, i.e. that ESF registered pfSense and its logo as a brand name.
You seem upset at this. Why?
Instead of some kooky conspiracy theory that ESF could be tortured or pressured to weaken pfSense, is this the *real* issue you have?
>> I doubt they want to give them up to a foreign company owned by non-Americans
>
> Nobody suggested that. Try thinking a bit more outside the box!
> For instance: A non-profit foundation could be founded in a country outside the USA, and the brand, hosting of the project, etc. could be transferred to that foundation. A board would be elected for the foundation to handle the few basic things needed annually to keep it running.
> ESF, on the other side, would be relieved of a great threat! They could continue offering their pfSense services to their customers as usual, but from then on nobody could come and force them to do things to pfSense, since "they have nothing to do with it".
You seem upset that ESF controls the project. Why?
>> just to make it harder for the American government to pressure the project.
>
> Incorporating pfSense and bringing it out of the reach of US-domestic jurisdiction would not "make it harder" but "impossible" to pressure the project.
You have provided no explanation (other than “rubber hoses”) for what form that “pressure” would take.
>> If the rest of world wants to fork the project because of concerns about the US government, fine, but I don't think you will get buy in from ESF [the American company that owns the rights to the name pfSense].
>
> Why fork the code base?! No one suggested that - and no one suggested doing things without - or even against - the key people of ESF. Quite the opposite: it would even protect ESF!
>
>> Once again, name some names. Who do you consider more trustworthy?
>
> I am not Jesus to hand solutions to the community on a silver platter
Though, point of fact, Jesus didn't hand anyone a solution.
> (but surely would be available for a *constructive* and *well-disposed*, *amicable* discussion to find solutions together!). I know of quite a lot of countries that seem interesting for a closer analysis for this cause and surely would propose one or another in such a constructive discussion.
>
> Generally, what Adrian proposed makes only sense, if the community - including ESF - understands the threat and decides to act proactively to fight this threat.
“The community” doesn’t own the copyright on the code, nor the trademarks to the names used. Those belong to ESF.
Further, you’ve hypothesized about a ‘threat’ without providing any factual basis for same. The term for this form of argument is “conspiracy theory”.
Since pfSense is open source (specifically, the BSD license), "the community" (or rather "a community") could take the decision to fork the code and create their own solution. It's been attempted a couple of times, but none of those forks have flourished. While I don't encourage forks (a fork is typically not good for either project), occasionally they work out (at least for a while), and I don't go out of my way to inhibit those who wish to fork.
However, in any case, such a community would be prohibited from naming the result “pfSense”.
> But since 33% of the ESF - namely Jim Thompson
You greatly inflate my ownership interest here.
> - prefers bullying, insulting, frightening and muzzling anybody who brings up the threat that we are facing, trying to strike dead any thought as soon as it comes up (strange, isn't it?),
Not as strange as someone randomly showing up one day, hiding under a pseudonym, having never posted to a pfSense list before, making accusations. You started throwing accusations, and yes, I got hostile.
Mostly I got hostile because your accusations are baseless, and despite my challenge, you refuse to drop it. Since your activities are not furthering the project (find bugs, or at least make proposals), you’re wasting everyone’s time. (I’d quote Spock here, but…)
Goodness man, you don’t even understand what happened with Lavabit, or why the situation would be different if a three letter agency were to show up on the doorstep one morning and demand that we weaken the project.
Despite my challenges (“name the law that they would use”), you refuse to respond, instead ducking for cover in your empty, baseless accusations that “it might happen”.
Specifically, Lavabit ran afoul of the Stored Communications Act (http://en.wikipedia.org/wiki/Stored_Communications_Act) "a law that addresses voluntary and compelled disclosure of "stored wire and electronic communications and transactional records" held by third-party internet service providers (ISPs)."
ESF is not an ISP. The SCA does not apply.
CALEA (http://en.wikipedia.org/wiki/Communications_Assistance_for_Law_Enforcement_…) obliges telecommunications companies to make it possible for law enforcement agencies to tap any phone conversations carried out over their networks, as well as to make call detail records available. Common carriers, facilities-based broadband Internet access providers, and providers of interconnected Voice over Internet Protocol (VoIP) service are all defined to be "telecommunications carriers" and must meet the requirements of CALEA.
Since ESF is not a “telecommunications carrier”, CALEA does not apply to your proposed “FBI/NSA on the doorstep” scenario.
Even the various provisions of the PATRIOT Act of 2001 (and its follow-ons) do not apply. The most abusive of these, the so-called "NSLs", are really demand letters issued to a particular entity or organization to turn over various records and data pertaining to individuals, with an accompanying "gag order". Since pfSense has no reason to store any records, there is nothing to hand over. You could *perhaps* make the case that the config backup service could be attacked this way, but it was specifically designed such that ESF (or before January, BSDP) doesn't have access to the plaintext configuration. It is encrypted by the remote user, and we store the result. We don't know the keys.
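The design Jim describes -- the client encrypts its config locally, the
server stores only ciphertext and never holds the key -- can be sketched as
follows. This is not pfSense's actual backup format; the cipher here is a
deliberately simple SHA-256-in-counter-mode toy for illustration, and a
real deployment would use an audited authenticated cipher instead:

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    # Toy construction: derive a keystream by hashing key || counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, _keystream(key, len(data))))

# Client side: encrypt locally, upload only the ciphertext.
config = b"<pfsense><lan><ipaddr>192.168.1.1</ipaddr></lan></pfsense>"
ciphertext = xor_cipher(b"user-held-secret", config)

# Server side: stores ciphertext, holds no key, cannot read the config.
assert ciphertext != config
# Client side, later: the same user-held key recovers the plaintext.
assert xor_cipher(b"user-held-secret", ciphertext) == config
```

The point of the design is that a records demand served on the operator
yields only ciphertext, because the key never leaves the user's machine.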
Thus, my challenge stands. You have yet to offer ANY legal authority under which the NSA (or any other agency of the US government) could demand that ESF make changes to pfSense.
Some here in the "community" seem upset that I've been so abrasive with you. If you had an actual argument that made sense, you and they would see a different side ("Oh, you're right. We should find a way to close that loophole."). Instead, you stood on your accusations without any factual basis. Your "culture of fear" argument was roughly equivalent to the meme of a couple of years ago:
"Did Glenn Beck Rape And Murder A Young Girl In 1990?”
This hoax began as a parody of public perception of Glenn Beck’s over-the-top interview antics on his self-titled television show Glenn Beck, wherein he frequently asks his guests to disprove highly speculative and often outrageous assertions.
(Just like you did.)
About.com published an article titled “Internet Hoax Says Glenn Beck Raped, Murdered Young Girl in 1990”, which called the hoax a textbook example of “…how to construct Internet smear campaigns…” (http://urbanlegends.about.com/b/2009/09/03/internet-hoax-says-glenn-beck-ra…)
So yes, I went after you, because the correct response here is to not let the attempt at a smear campaign stand. People love to take silence as assent. Placating you would have been a mistake of the first order.
In the past, I’ve stood up to AT&T. It took a decade, and was both expensive and exhausting. I won. Fnord.
You and those in the community who are upset with my behavior (whilst I was defending ESF and pfSense from your smear tactics) can bet your last Euro/Dollar/Yen that I'll be 10X more abrasive with the US Government if they attempt what you accuse.
Were I to seek a country that was at least outwardly opposed to the behavior of the US security apparatus (and its related apparatus in other countries), I might consider Brazil. That time is not now.
What you probably don’t appreciate is that the actual “we write code before breakfast” people employed by ESF to work on pfSense are already outside the US(*). One of them lives in Brazil, another in Albania. Perhaps of interest. Perhaps not. At the very least, they’re not subject to US law, so it would be difficult to get them to “go quiet” about any attempt to weaken the codebase of pfSense.
Jim
(*) Jim Pingle does some, but not as much as the others. He does, however, carry most of the support load.
_______________________________________________
List mailing list
List(a)lists.pfsense.org
http://lists.pfsense.org/mailman/listinfo/list
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
// absolutely wild Jim Henson AT&T technology parables, via digg...
Tech Time Warp of the Week: Jim Henson’s Muppet Computer, 1963
By Daniela Hernandez
http://www.wired.com/wiredenterprise/2013/10/tech-time-warp-of-the-week-jim…
Thanks to everyone for sharing their views regarding my forward-leaning
speculation about the NSA facility...
Two additional observations:
1) If the NSA Utah data centre is not involved in quantum computing in any
way -- that is, if it is synchronized with the larger society and with the
tools and technologies manufactured and standardized in the mainstream --
it is curious to what extent non-quantum computing technologies could
exist in such a scenario yet remain secret and off-limits.
Consider the basic technology research and scientific advances that are
often in the news on physics sites: a university or corporate RD&D lab
achieves its latest milestone, and a given improvement may make it to
market in the next 10-20 years. Between that frontier of development and
the present lies a gap -- what appears to be an 80s computing paradigm,
largely unchanged, mostly in terms of the OS itself and how data is
managed on these electronic filing-cabinet devices that collectively
become the worldwide junk drawer (aka the Internet and WWW), the epitome
of "bureaucracy" as networked furniture.
For instance, what if the facade of a computing server in such an
installation hides advanced technology within it that changes the
equations for how data is processed? 'Spintronics' is one such area;
various approaches to memory are others. These could have larger combined
effects if built out at scale: perhaps the equations would differ, or the
methodology and techniques employed to discern and sift through the data.
And thus it is still wondered, in the gap between where research is
"bounded" within a secret government context and where the public,
commercial mainstream sits, what may actually be going on in the inner
workings of the surveillance society and its capacity to process vast
amounts of data. That capacity would seem to require more 'organizational
capacity' than brute-force computing alone -- meaning the capacity to
compute "unknowns" in a running, looping equation. To me, that appears to
be what serial processors are not good at; they must instead arrive at an
answer. Even parallelism within serial (non-quantum) processing -- from my
truly naive, completely limited and perhaps errored viewpoint -- would be
best served by quantum nonlinear or multilinear processing across vast
data sets that may not have known correlations to start with, building
those correlations up and testing them against models. Can equations or
algorithms do what the hardware cannot, given its logic-based, biased
processing via digitalism? Or might other transistors, and other ways of
balancing these computations toward weighted, analog, looping means, exist
by which to 'naturally approach' this situation from the big picture down
to local instances, versus modeling localized instances outward via
rationalization?
That is, in conceptual terms: what if coherence could be decided upon or
determined not by a software framework alone, but with additional capacity
in advanced hardware components, to both assist and allow such a
nonlinear, massively dynamic (ecological) capacity for global computation
across all interconnected sets? Such data sets could shrink the example of
weather modeling for supercomputing to a minor issue by comparison, given
a potentially wider range of knowns that are relationally unknown in their
effects, given empirical evaluation, and the massive floating unknown that
may never be computed beyond a vague, nebulous threat condition -- some
threshold that suddenly grounds in the modeling and triggers alerts or
forecasts based on statistics, as this relates to recorded data, live
surveillance, the entirety of this, in situ.
I think what would be impossible about a traditional computing paradigm
for this situation is the incapacity of 'knowing' what to model, given the
vast amounts of data and their instability in terms of the limits to what
can be known and understood, versus what is probabilistic and reevaluated
again and again and again -- perhaps tens of thousands of such equations
in various computations, and then the 'meta-algorithm' that may govern the
whole situation. Is it a man's judgement that decides how this data is
interpreted, so that the situation is managed as if it were an office
computer being queried? Or may 'computational reasoning' exist in the form
of an AI that runs models -- both theories and competing hypotheses, in
parallel -- built of past events and future statistical graphs,
potentially revealing *constellations* or patterns that begin to match a
previous condition or sequence, so that the computer determines the
direction of what is tapped based on its inferences and its deductive,
inductive, ~intuitive reasoning?
Now, I totally believe it is technologically possible to do this -- just
not with today's limited equipment and software paradigm; not a chance.
That paradigm is too heavily biased, in both processing and computer
architecture, to allow for open-ended questioning. Everything must be
rationalized in advance; that is the way the logic gates work on the PNP
and NPN transistors. There is no 'unknown' state within the circuitry, and
whatever may be achieved via software would be unnatural to the hardware,
bounded in its capacity, functioning against the direction of current, as
if an aberration. Or such is the foolish explanation I conjure from the
abstract: that it goes against the grain of the electrical current,
essentially.
It is like telling someone entirely vested in a binary mindset to consider
a paradoxical situation: for its resolution, the paradox must be bounded
within that worldview and made to fit within its sampling (1/0), else the
model or worldview fails. The paradox would need to fit within the bit,
and not exist beyond or outside it, else it would not exist or be real. So
how can an entire paradoxical situation that is actively parsed exist in a
running model of 'rationalization', in terms of computation, if the charge
itself is biased to an on/off state for any given consideration, when the
majority likely remain gray-area computations, empirically unknown except
within a sliding-scale, analog-like weighting that shifts the modeling?
Perhaps parallel software could "model" this, yet how effectively, if such
data can only be run one way and must be restarted and recalculated,
versus allowing the data to live in ambiguity and continuously compute
itself? Such a realm is too highly constrained by binary modeling,
determined by it; it would bound any such computation to only what the
technology allows in its bias, and thus limit how much can be questioned,
asked, and allowed to remain unknown, ambiguous, and computed as such.
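For what it is worth, the 'unknown' state the passage says binary circuitry
lacks is routinely emulated in software as three-valued (Kleene) logic,
where a third UNKNOWN value propagates through AND/OR (SQL's NULL works
this way). A minimal sketch, offered only to make the contrast concrete:

```python
from enum import Enum

class Tri(Enum):
    FALSE = -1
    UNKNOWN = 0
    TRUE = 1

def tri_and(a: Tri, b: Tri) -> Tri:
    # Kleene strong conjunction: FALSE dominates, UNKNOWN absorbs the rest.
    return Tri(min(a.value, b.value))

def tri_or(a: Tri, b: Tri) -> Tri:
    # Kleene strong disjunction: TRUE dominates.
    return Tri(max(a.value, b.value))

# UNKNOWN survives unless a definite value decides the outcome:
assert tri_and(Tri.TRUE, Tri.UNKNOWN) == Tri.UNKNOWN
assert tri_and(Tri.FALSE, Tri.UNKNOWN) == Tri.FALSE
assert tri_or(Tri.TRUE, Tri.UNKNOWN) == Tri.TRUE
```

Whether such software emulation remains "unnatural to the hardware", as
argued above, is a separate question; the representation itself is routine.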
Is it not possible that stuxnet-like influences could arise from making a
continuous brute-force calculation upon a massively changing real-time
model? It could essentially burn out the serial processors linked in
parallel via networking: they would be performing at peak rates and could
overheat, or face limits on what kinds of data could be processed and at
what fidelity. And could such fidelity with the world of real-time
surveillance and state threat modeling, in its vastness and nth-degree
unknowns, run within the parameters of petaflops without considering all
the other issues that may limit such computations, such as how the
processing itself disallows certain considerations? Those considerations
appear solved, or actively processed -somehow-, and to me the explanation
is that there is a highly advanced system that is naturally capable -- not
by brute force or by a chess-computer's linear, one-point perspective, but
by utilizing knowledge of nature and of how quantum mechanics relates
directly to paradox. In this context, it is probable that only a quantum
computer could perform this level of computation, which is integrative and
a basis for threat analysis -- albeit perhaps guided by a too-simple
binary interface, outlook, or way of managing the situation, due to a lack
of previous experience for guidance.
That is: it is likely impossible that existing technology could carry the
computational and data load, in terms of its organization and coherence,
as a computing model of N dimensions or interrelated degrees. It likely
does not involve a simple script or algorithm that rationalizes the
entirety of what is going on; instead, a pattern would likely emerge from
within the churning cauldron of data -- well beyond meta-data, which is
only one ingredient amongst the vast many others spanning any relevant,
archivable, referenceable 'known' form that could establish a model from
any given perspective. That pattern could then be evaluated in terms of
computer reasoning versus human reasoning, gauging insight into
correlations that may lie beyond the human limit of consideration as
shared sets, versus what such computing power could compare and consider.
(Perhaps the AI develops its own algorithms and has its own working
hypotheses -- yet this would inherently require grounding with the
hardware, allowing this without translation or other lossy
middle-management or mediation that skews, warps, or limits the running
iterative computation or nonlinear computational analysis.)
So while a step can be taken back from the assumption of a fielded quantum
computer in a seemingly mundane, gargantuan data centre/warehouse, what is
occurring within the existing surveillance regime does not realistically
appear to be grounded in such a facility either. It does not add up: the
capacity for storing records versus making judgments based on such data --
and not only that data, but everything that is known and modeled in terms
of the issues and the society's economic, social, and political
dimensions, locally and globally, as it all fits into a whole model and
local instances. Where might that computation occur? It is my view that,
in terms of efficiency, it is impossible for this to occur outside a
quantum context. Lacking this capacity would involve constantly fighting
the equipment and being limited in approaching 'questioning itself',
though this is also the majority situation. It is an entirely different
approach than the internet, desktop, and workstation equipment of today.
It is beyond the binary.
There is nothing to bet on, either: either the government has a viable
working model of the state in all of its dimensions, or it does not have
this capacity. There is every indication it does have it today, and yet
meta-data is like a binary worldview of the existing situation -- too
limited to get at the boundaries involved in terms of citizens and law and
representation and rules, including for those in government. If it does
involve a realm of meta-computing, and yet computing is the limit to what
can be reasoned, then, as with physics and metaphysics, it is in that gap
that the reality is lost: another world exists outside the first, and
could even be beyond accountability, as a result of plausible deniability.
It is implausible that a non-quantum computer could model state data in
its ubiquity, in a running simulation, given existing binary technology.
2) Google and NASA just announced a video of their quantum computer, which
is actually an installation of a quantum processor whose 'number
crunching' will apparently help optimize algorithms for Google Glass blink
apps. A question encountered in this application of the D-Wave quantum
chip is what to do with the technology -- effectively a new kind of
calculator, it seems, given the approach.
A first look inside Google's futuristic quantum lab // video
http://www.theverge.com/2013/10/10/4824026/a-first-look-inside-googles-secr…
In other words, it is not a 'quantum computer' that is installed, and thus
there are no other quantum-specific devices connected to it; it seems the
pins in the chip lead directly to software that runs queries, all
sustained within a sealed, supercold environment to allow this to occur.
So it is more like a micro-controller or integrated circuit than a
multi-use processor, in the sense that it sits outside a motherboard
context or larger connected circuitry -- or so it appears by my naive
account.
And so there is a disjunction between what data processing today must be
capable of and this 'first steps' approach fielded in the commercial and
scientific realm of Google and NASA. It is as if the computation of the
'state mind' and the situation in the body of the state were mismatched,
and there is dissonance between what is said and what is being done --
which invites mistrust, if not fear, the deity state, etc.
So the question asked in the video is: what can a quantum processor do?
What is it capable of? I tend to imagine it is precisely this realm of the
paradoxical and the N-dimensional, beginning in the gray area with very
limited models and AI and, cosmic-soup-like, allowing this data situation
to computationally bubble, bubble, while on the lookout for toil and
trouble.
the random number generator (RNG) itself seems most closely aligned with
the paradigm of quantum mechanics - as a field of superposition and of
potentiality -- and this is where paradoxical unknowns would emerge from as
contingent patterns in terms of their grounded interconnectivity within the
ecologically and dynamically linked empirical models.
even more the case, or so it is proposed, are random event generators in
relation to spooky action at a distance. if these purposeful RNGs are
somehow linked and-or grounded, as if quantum-tuned sensor-boxes even,
that extrasensory aspect of birds knowing navigation or animals knowing of
earthquakes via uncanny abilities could also exist naturally within the
computer hardware model at the physical level, if sensitive to various
indicators and 'aware' or capable of making-sense of the chaotic input.
humans on the internet who may gauge a larger situation via interactions
without this being voiced, is similar to a distributed sensor network in
its parallel computation, grounded locally, and its predictive capacity in
that something may not feel right or a particular momentum may be sensed
and thus serve as a meridian or guideline or boundary via lines of force
like intuitive calculation. so too, computer processing whereby the logic
is capable of assessing existing and potential N-dimensional connections
and boundaries and considering the dynamics involved, hunches, intuitions.
for instance, a software program could be written to look at bird call data
as it relates to air quality as this relates to news reports and events. if
you know what you are looking for, it could be rationalized-- find these
criteria and in matching those patterns x=y.
yet what if the computer was able to take any data, and consider any event
from a largely unbounded context, and thus it could begin to process the
migration of birds in real time with pollution levels as it relates to,
say, periods of heavy traffic due to sporting events in a given sector. and
the algorithm would find such correlations, yet not stop at this, and keep
looking and evaluating. and perhaps there are 'infinitely more variables
than equations', and thus quadratic approaches are beyond the bounds of a
limited viewpoint or rationalization, where a kind of perspective too
small to accurately model the situation it seeks to view then becomes
the sign of the thing it seeks to understand, where the model replaces the
entity itself. (note: this as it relates to conservation issues, politics,
saving one species to jeopardize another; else, technology and wildlife and
ecosystems, wind turbines and bird and bat and eagle deaths, etc)
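a bounded version of that pattern search can be sketched; everything here
(the data, the variable names, the 0.7 threshold) is hypothetical
illustration, not a real dataset or a fixed criterion:

```python
# hedged sketch: correlate two hypothetical time series (bird-call counts
# vs. an air-quality index) with a plain Pearson coefficient, then flag
# strongly correlated pairs for further modeling.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical hourly observations in one sector
bird_calls = [30, 28, 22, 15, 12, 10, 9, 14]
air_quality_index = [40, 45, 60, 85, 95, 110, 120, 90]

r = pearson(bird_calls, air_quality_index)
# a strong correlation (sign either way) flags the pair, x=y style
flagged = abs(r) > 0.7
```

the unbounded version would run this over every pair of available series
rather than a pre-chosen pair, which is where the combinatorics explode.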
in a binary approach, the 'good enough' model allows the extraneous data to
be ignored or kept out of the analysis -- yet this does not work at scale,
because the bias and warping and skew and distortion only increases with
each further reliance upon inaccuracies in the false framework. you could
not have accuracy at the level of the state via such an approach when in a
context of ubiquitous information; there would be a total one-sided madness.
and it does not appear this is actually the case. again, beyond binarism.
thus highly-dimensional modeling may begin with inaccurate models and require
constant analysis and refinement that only a computer could sustain amid the vast
data relations required. for instance, taking all sensor data from street
infrastructure regarding pollution and toxins, all weather, traffic, news,
academic research, epistemological models, social, economic, anything of
any import that can exist in an equation, -- essentially a noise field --
and then allow whatever potential interconnection exists to be modeled as a
possibility, whether activated as a structural scaffold of a hypothesis or
not, and that through educational, career, taxes, health records, and other
indicators - across all known statistics viewed as an empirical framework
for pattern matching past, present, and future events -- to then see what
stands-out in a given moment or in a particular context or perspective, and
that like the extrasensory 'knowing' or intuitive understanding, somehow
the model could suddenly achieve a strange coherence in the data modeled,
and this could occur from the ground-up from the particulate to the whole
and entirely or mostly in a reference-based AI computational approach, yet
requiring a capacity of parallel linked quantum supercomputers to achieve.
there is no time-sharing on a system like this, it would always be on and
running and like the birth of a universe, would grow and grow and grow, and
while accessed, ultimately it would be best at modeling these chaotic and
unknown dynamics itself -- allowed to run and compute, reason autonomously
so as to consider these dynamics, while humans could input models and help
evaluate models and assumptions. yet ultimately its accuracy is not within
the 1 and 0 of binary digits as representations of on/off switches, instead
it would be this 1 and 0 mapping directly to truth, the basis for grounding
the software model in the hardware physical world as it relates to matter,
energy, and information. it would not inherently involve an extra layer or
distancing from this, seemingly, an additional language and translation.
if allowed, the paradoxical processing of gray-area conditions by a quantum
computer installation could - in accordance with AI and 3-value and N-value
logical reasoning - achieve this 'holistic' approach yet with attributes of
analog characteristic of shifting parameters and sliding scale analyses. in
this way the rigidity of binary 2-value computation that does not allow the
truth to exist would instead not allow the assumption of truth within the
modeling of representational signs (programming, models themselves,
beliefs) to become the truth by default of this interaction (true belief)
and instead, this truth would need to be earned by massive reasoning that
is tied into facts and physics and human knowledge, from the beginning, and
not some ideological bubble-universe that has control over the equipment.
grounded sensing, in other words. if a linear supercomputer can take input
from every weather station and model weather systems worldwide, given what
is known about cloud system formation, wind and temperature and humidity,
and various weather systems and events- what can it do beyond this same
seemingly *fixed* boundary that could involve birds or wildlife or sounds
or resonance or people who have broken bones that ache before a storm or
who immediately know tornado weather minutes ahead of warning sirens.
if they are outside the model, that is not computed. yet what if that data
somehow could be connected to, yet the software and hardware model do not
allow it, because it is a deterministic approach, rationalized within a
given set of parameters where it controls the knowns via leaving out the
unknowns. what if the unknowns are where the discovery is for new
knowledge. what if the gathered flight of seagulls indicates a thunderstorm
two days out with a degree of certainty before computer models or at least
could be correlated with such observations, tested as empirical evidence as
part of a larger hypothesis of integrated systems. and what if this was
only one aspect of one question of a given scenario in a given discipline
of which there are trillions to consider. the boundaries kill off the
ideas, certainties replace and stand-in for uncertainty, controlling what
is interpreted and allowed to be considered within modeling that then
limits and bounds knowledge to only what can be known from a given view,
and that its inaccuracies become structural, beyond questioning, dogma.
say, a non-electromagnetic view of the universe as this relates to weather
and cloud systems as giant charge-based entities (electric weather).
The Electric Universe / Electrical Weather
http://www.holoscience.com/wp/synopsis/synopsis-9-electrical-weather/
what seems most likely in my naive estimation is that the quantum computer
in its capacity for paradoxical computation and looping heuristic methods
for meta-modeling across multiple interrelated scales simultaneously, is
that this ecological condition of reality and this dimensional capacity of
massive quantum installations linked together in parallel would allow for
this evaluation by default of a natural affinity for the physics involved,
and this goes beyond just the processor itself and into its grounding at
the macro-level with material entanglement, whereby sensor networks that
count microns of chemicals could potentially be remotely entangled in their
data sets this way, so that a trigger upon one variable may affect others
at a distance likewise, in terms of latent superposition potentialities.
in this way, the grounded antenna or the ground wire, as with the sensor
connected to a proposed LED signaling display influencing or becoming an
RNG input, could have an autonomic nervous system alongside the brain-based
computational reasoning, whereby ~sensing may remain ambiguous yet also be
highly connected in other hidden or missing or gray area dimensionality
that could like junk DNA recombine in a given instance as a pattern or
allow other frameworks to exist and this seems inherent or unique to the
quantum situation, in that the grounding of the computer also would be a
grounding of the data model in terms of its realism, that the information
modeled accurately maps into the world and is harmonized with it, and that
this diagnostic evaluation occurs if not also in terms of error-correction
or tabulation or computational processing. perhaps there is intelligence
within the current itself, in terms of entanglement, and so perhaps the
entangling of computers in a monitoring sense of environment or various
"dimensions" would also be involved in how data is effectively processed.
this versus having society serve a flawed computer model, subservient to
it, versus the ability to question the model and test it against the truth
of all combined data within the given hypotheses, and the issue of going
against nature. the microns of chemicals cannot simply be ignored. the
poison in the air, toxins everywhere, sick and inhuman approaches as this
relates to ethics and morality. essentially-- the society is schizophrenic
and allowed to be this way in the binary ideology and its computer model,
required even, enforced, yet denied by those who benefit via onesidedness,
the inequality, exploitativeness, the epic swindle for fake immortal gain.
thus it is proposed that the quantum computer as a device would have this
RNG/REG aspect that relates to grounding data, and this could connect with
sensors or various other inputs (as if peripherals perhaps).
in this way, a quantum computer installation at massive scale could parse
all traffic cams, all weather info, all news and knowledge and reference
all books and texts in all languages, and build up models within these as
frameworks for evaluating issues of concern as tools for state management
and governance - or tyranny. and thus the short-circuit if this were amiss
or something was off about it, sparking loose-ends that need to be pulled
to find out there is binarism at the helm of such computational power and
that its output is skewed towards the ideological, due to boundaries that
are retained from a given mindset or too narrow belief system, etc. and
that this could likely be expected as a result of not knowing the questions to
ask when faced with the physics involved, in their metaphysical dimensions.
forcing the data versus listening to it. forcing answers via biased views
versus allowing questions to exist, coherence discerned via logical reasoning.
this as it relates to private mankind and public humanity, as referenced to
the US Constitution or ignoring and replacing it via its substitution as a
hollowed-out digital sign. the state as empty set. any tyranny is possible.
11 Oct '13
----- Forwarded message from morlockelloi(a)yahoo.com -----
Date: Fri, 11 Oct 2013 10:27:50 -0700
From: morlockelloi(a)yahoo.com
To: nettime-l(a)kein.org
Subject: Re: <nettime> Pascal Zachary: Rules for the Digital Panopticon (IEEE)
Reply-To: a moderated mailing list for net criticism <nettime-l(a)mail.kein.org>
This realization per se is pretty much useless, as are endless
ruminations regarding how free we were, once upon time. The old
Marxist postulate that awareness will save the species is blatantly
false - look around you.
These technologies came to rule the world because their proponents
made coherent efforts to make it so. The only way to do something
about it is to actively develop other technologies which tilt the
balance in the direction you like better. Countering technology with
words, laws and general awareness will get you nowhere. See 'bronze
age'.
The corollary is that the future belongs to the few, not to the
masses, because high tech is centralized by nature, as it requires
understanding, and those capabilities are scarce. The rest are fucked
... I mean 'users'.
There are only competing elites.
> NSA at all. It is about the dawning realization that we all now live
> inside a "virtual" system that compels us to *control* ourselves,
> since all the details of our lives are being "remembered," in a way
> that no *human* civilization has EVER even imagined it could do!
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: http://mx.kein.org/mailman/listinfo/nettime-l
# archive: http://www.nettime.org contact: nettime(a)kein.org
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
----- Forwarded message from elijah <elijah(a)riseup.net> -----
Date: Thu, 10 Oct 2013 14:17:01 -0700
From: elijah <elijah(a)riseup.net>
To: liberationtech(a)lists.stanford.edu
Subject: Re: [liberationtech] 10 reasons not to start using PGP
Message-ID: <5257194D.1050202(a)riseup.net>
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20100101 Thunderbird/24.0
Reply-To: liberationtech <liberationtech(a)lists.stanford.edu>
On 10/10/2013 12:23 PM, carlo von lynX wrote:
> 1. Downgrade Attack: The risk of using it wrong.
Fixed in the new generation of clients (mailpile, LEAP, etc).
> 2. The OpenPGP Format: You might aswell run around the city naked.
Fixed by using StartTLS with DANE (supported in the new version of
postfix). Admittedly, this makes sysadmin's job more challenging, but
LEAP is working to automate the hard stuff (https://leap.se/platform)
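(For reference, the outbound DANE knobs in that new postfix (2.11+) look
roughly like this in main.cf; treat this as a sketch, and check the postfix
documentation for the exact semantics:)

```
# main.cf sketch: opportunistic DANE for outbound SMTP (postfix >= 2.11)
smtp_dns_support_level = dnssec   # requires a DNSSEC-validating resolver
smtp_tls_security_level = dane    # use the recipient domain's TLSA records
```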
> 3. Transaction Data: He knows who you are talking to.
Fixed in the short term by using StartTLS with DANE. Fixed in the long
term by adopting one of these approaches: https://leap.se/en/routing
> 4. No Forward Secrecy: It makes sense to collect it all.
Imperfectly fixed in the short term using StartTLS with only PFS ciphers
enabled. This could be fixed in the long term by using Trevor Perrin's
scheme for triple EC Diffie-Hellman exchange. This has been implemented
by moxie for SMS, and could be for SMTP
(https://whispersystems.org/blog/simplifying-otr-deniability/)
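The triple-DH structure referenced there can be sketched as follows. This is
a toy illustration only, not Perrin's actual scheme: a small prime-field
group stands in for the elliptic curve, and sha256 stands in for a proper
KDF; all names are hypothetical.

```python
# sketch of the triple Diffie-Hellman (3DH) key agreement structure:
# three DH secrets (identity<->ephemeral both ways, ephemeral<->ephemeral)
# are mixed into one session key. no identity<->identity DH and no
# signatures, so the transcript stays deniable yet authenticated.
import hashlib
import secrets

P = 2**127 - 1   # toy Mersenne prime; a real system would use X25519
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def dh(priv, pub):
    return pow(pub, priv, P)

# each side holds a long-term identity key and a fresh ephemeral key
a_id, A_id = keypair(); a_eph, A_eph = keypair()   # Alice
b_id, B_id = keypair(); b_eph, B_eph = keypair()   # Bob

def triple_dh(id_priv, eph_priv, their_id_pub, their_eph_pub, initiator):
    s1 = dh(id_priv, their_eph_pub)
    s2 = dh(eph_priv, their_id_pub)
    s3 = dh(eph_priv, their_eph_pub)
    if not initiator:
        s1, s2 = s2, s1    # order the id/eph pair the same on both sides
    mix = b"".join(s.to_bytes(16, "big") for s in (s1, s2, s3))
    return hashlib.sha256(mix).hexdigest()

k_alice = triple_dh(a_id, a_eph, B_id, B_eph, initiator=True)
k_bob   = triple_dh(b_id, b_eph, A_id, A_eph, initiator=False)
```

Forward secrecy comes from deleting the ephemeral private keys once the
session key is derived.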
> 5. Cryptogeddon: Time to upgrade cryptography itself?
New version of GPG supports ECC, but of course nothing in the snowden
leaks suggests we need to abandon RSA of sufficient key length (just the
ECC curves that have *always* been suspicious).
> 6. Federation: Get off the inter-server super-highway.
Federated transport with spool-then-forward time delay is likely a much
more feasible way to thwart traffic analysis than attempting to lay down
a high degree of cover traffic for direct peer to peer transport. This
is, of course, an area of active academic research and it would be
irresponsible to say that we definitively know how to prevent traffic
analysis, either with p2p or federation.
> 7. Statistical Analysis: Guessing on the size of messages.
Easily fixed.
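Presumably the easy fix is padding messages into fixed-size buckets before
encryption; a minimal sketch, with the bucket size an illustrative choice:

```python
# sketch: pad messages up to fixed-size buckets so that ciphertext length
# reveals only a coarse bucket, never the exact message size.
BUCKET = 4096  # bytes; illustrative, not a standard value

def pad(msg: bytes) -> bytes:
    # 4-byte big-endian length header, then the message, then zero fill
    framed_len = len(msg) + 4
    target = ((framed_len // BUCKET) + 1) * BUCKET
    return len(msg).to_bytes(4, "big") + msg + b"\x00" * (target - framed_len)

def unpad(blob: bytes) -> bytes:
    n = int.from_bytes(blob[:4], "big")
    return blob[4:4 + n]
```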
> 8. Workflow: Group messaging with PGP is impractical.
No one anywhere has solved the problem of asynchronous, forward-secret
group cryptography. There are, however, working models of group
cryptography using OpenPGP, such as SELS
(http://sels.ncsa.illinois.edu/) This approach makes key management
more difficult, but we need to automate key management anyway for
OpenPGP to be usable enough for wider adoption.
> 9. TL;DR: I don't care. I've got nothing to hide.
This critique rests on the assumption that the problems with email are
unfixable.
> 10. The Bootstrap Fallacy: But my friends already have e-mail!
Email remains one of the two killer apps of the internet, and is
unlikely to vanish any time soon. Simple steps we can take to make it
much better seem like a wise investment in energy.
There are two approaches to addressing the problems with email:
(1) assert that email is hopeless and must be killed off.
(2) identify areas where we can fix email to bring it into the 21st century.
I think that approach #1 is irresponsible: regardless of one's personal
feelings about email, it is certainly not a lost cause, and asserting
that it is will make it more difficult to build support for fixing it.
Approach #2 is certainly an uphill battle, but there are a growing
number of organizations working on it. LEAP's (free software) efforts
are outlined here: https://leap.se/email. We have it working, we just
need to get it mature enough for production use.
-elijah
--
Liberationtech is public & archives are searchable on Google. Violations of list guidelines will get you moderated: https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change password by emailing moderator at companys(a)stanford.edu.
----- End forwarded message -----

11 Oct '13
----- Forwarded message from Patrice Riemens <patrice(a)xs4all.nl> -----
Date: Thu, 10 Oct 2013 20:41:55 +0200
From: Patrice Riemens <patrice(a)xs4all.nl>
To: nettime-l(a)kein.org
Subject: <nettime> Pascal Zachary: Rules for the Digital Panopticon (IEEE)
Message-ID: <0adf8f7abff38f778a06f8b776729759.squirrel(a)webmail.xs4all.nl>
User-Agent: SquirrelMail/1.4.18
Reply-To: a moderated mailing list for net criticism <nettime-l(a)mail.kein.org>
original to:
http://spectrum.ieee.org/computing/software/rules-for-the-digital-panopticon
Rules for the Digital Panopticon
The technologies of persistent surveillance can protect us only if certain
boundaries are respected
By G. Pascal Zachary
(Posted 20 Sep 2013)
For centuries, we humans have lacked the all-knowing, all-seeing
mechanisms to credibly predict and prevent bad actions by others. Now
these very powers of preemption are perhaps within our grasp, thanks
to a confluence of technologies.
In the foreseeable future, governments, and perhaps some for-profit
corporations and civil-society groups, will design, construct, and
deploy surveillance systems that aim to predict and prevent bad
actions, and to identify, track, and neutralize people who commit them.
And when contemplating these systems, let's broadly agree that we
should prevent the slaughter of children at school and the abduction,
rape, and imprisonment of women. And let's also agree that we should
thwart lethal attacks against lawful government.
Of late, the U. S. government gets most of the attention in this
arena, and for good reason. The National Security Agency, through its
vast capacity to track virtually every phone call, e-mail, and text
message, promises new forms of preemption through a system security
experts call persistent surveillance.
The Boston Marathon bombing, in April, reinforced the impression
that guaranteed prevention against unwanted harm is elusive, if not
impossible. Yet the mere chance of stopping the next mass shooting
or terror attack persuades many people of the benefits of creating
a high-tech version of the omniscient surveillance construct that,
in 1787, the British philosopher Jeremy Bentham conceived as a
panopticon: a prison with a central viewing station for watching all
the inmates at once.
Some activists complain about the potential of such a system to
violate basic freedoms, including the right to privacy. But others
will be seduced by the lure of techno fixes. For example, how could
anyone object to a digital net that protects a school from abusive
predators?
Ad hoc surveillance will inevitably proliferate. Dropcam and other
cheap surveillance programs, already popular among the tech-savvy,
will spread widely. DIY and vigilante panopticons will complicate
matters. Imagine someone like George Zimmerman, the Florida
neighborhood watchman, equipped not with a gun but with a digital
surveillance net, allowing him to track pretty much anything on his
smartphone.
With data multiplying exponentially and technology inexorably
advancing, the question is not whether all-encompassing
surveillance systems will be deployed. The question is how, when, and
how many.
In the absence of settled laws and norms, the role of engineers looms
large. They will shoulder much of the burden of designing systems in
ways that limit the damage to innocents while maximizing the pressures
brought to bear on bad guys.
But where do the responsibilities of engineers begin and end?
It is too early to answer conclusively, but engineers would do well to
keep a few fundamental principles in mind:
Keep humans in the loop, but insist they follow the rules of the
road. Compiling and analyzing data can be done by machines. But it
would be best to design these surveillance systems so that a human
reviews and ponders the data before any irreversible actions are
taken. If citizens want to spy on one another, as they inevitably
will, impose binding rules on how they do so.
Design self-correcting systems that eject tainted or wrong
information fast and inexpensively. Create a professional ethos
and explicit standards of behavior for engineers, code writers,
and designers who contribute significantly to the creation of
panopticon-like systems.
Delete the old stuff routinely. Systems should mainly contain
real-time data. They should not become archives tracing the lives of
innocents.
Engineers acting responsibly are no guarantee that panopticons will
not come to control us. But they can be part of getting this brave new
world right.
About the Author
G. Pascal Zachary is the author of Endless Frontier: Vannevar Bush,
Engineer of the American Century (Free Press, 1997). He teaches at
Arizona State University.
----- End forwarded message -----
----- Forwarded message from carlo von lynX <lynX(a)time.to.get.psyced.org> -----
Date: Fri, 11 Oct 2013 02:14:29 +0200
From: carlo von lynX <lynX(a)time.to.get.psyced.org>
To: liberationtech <liberationtech(a)mailman.stanford.edu>
Subject: Re: [liberationtech] 10 reasons not to start using PGP
Message-ID: <20131011001429.GI22105(a)lo.psyced.org>
User-Agent: Mutt/1.5.20 (2009-06-14)
Reply-To: liberationtech <liberationtech(a)lists.stanford.edu>
Next collection of answers to replies.
Expect yours to be somewhere in here.
Thanks for all the feedback!
I actually expected harsher religious replies! :)
On 10/10/2013 10:55 PM, Enrique Piracés wrote:
> I think this is a good topic for debate among those who can or are
> currently developing security tools/protocols, and it is one way to
> further discuss usability as a security feature in communities like
> this one. That said, I think it is really bad advice and I encourage
> you to refrain from providing this as a suggestion for users who may
> put themselves or others at risk as a result of it.
The opening sentence says
"Pretty Good Privacy is better than no encryption at all ..."
> Also, I think the title is misleading, as most of the article is about
> why PGP is not an ideal solution for the future (a point where I think
> you would find significant agreement). Again, suggesting not to use
> PGP without providing a functional alternative is irresponsible.
I am suggesting four alternatives and urging that we work harder
to make them viable tools for everyone, as we should no longer postpone
replacing PGP and e-mail. Of course I would also appreciate attention
regarding the fifth, secushare.
On 10/10/2013 10:57 PM, Jonathan Wilkes wrote:
> Bitmessage doesn't have forward secrecy, and AFAICT there's no
> way to easily add it later on.
If I understood the principle correctly it allows you to generate
new "accounts" freely, so you can put your *next* account name into
a message. If both sides do this, they can obfuscate their identities
a bit. And you can automate it. You could also re-key at each
message with PGP, but I presume it would make your implementation
incompatible with everybody else's.
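That rolling-address idea can be sketched like this; the random hex tokens
stand in for bitmessage accounts, and all names here are hypothetical:

```python
# sketch: every message carries the sender's *next* one-time address, so a
# long-lived identifier never repeats on the wire; the peer rotates its
# record of us on each message received.
import secrets

def new_address() -> str:
    # stand-in for generating a fresh bitmessage account
    return secrets.token_hex(16)

class RollingPeer:
    """Track one correspondent whose address rotates per message."""
    def __init__(self, first_address: str):
        self.current = first_address

    def receive(self, envelope: dict) -> str:
        assert envelope["from"] == self.current, "unknown sender address"
        self.current = envelope["next"]   # rotate to their announced next id
        return envelope["body"]

def send(body: str, my_current: str):
    nxt = new_address()
    return {"from": my_current, "body": body, "next": nxt}, nxt

# usage: alice keeps rotating; bob tracks her announced next address
a0 = new_address()
bob_view = RollingPeer(a0)
env, a1 = send("hello", a0)
assert bob_view.receive(env) == "hello"
env2, a2 = send("again", a1)
assert bob_view.receive(env2) == "again"
```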
On 10/10/2013 11:08 PM, Gregory Maxwell wrote:
> I'm surprised to see this list has missed the thing that bugs me most
> about PGP: It conflates non-repudiation and authentication.
>
> I send Bob an encrypted message that we should meet to discuss the
> suppression of free speech in our country. Bob obviously wants to be
> sure that the message is coming from me, but maybe Bob is a spy ...
> and with PGP the only way the message can easily be authenticated as
> being from me is if I cryptographically sign the message, creating
> persistent evidence of my words not just to Bob but to Everyone!
I kind-of lumped it mentally together with forward secrecy, because
for both problems the answer is Diffie-Hellman. But you are right, it
is the eleventh reason.
> My other big technical complaint about PGP is (3) in the post, that
> every encrypted message discloses what key you're communicating with.
> PGP easily _undoes_ the privacy that an anonymity network like tor can
> provide. It's possible to use --hidden-recipient but almost no one
> does.
Guess what, none of the alternative messaging tools would dream of
putting the recipient address close to the message. They just make
sure that it somehow gets there.
> Its also easy to produce a litany of non-technical complaints: PGP is
> almost universally misused (even by people whos lives may depend on
> its correct use), the WOT leaks tons of data, etc.
Oh yes, I completely forgot to link that long article that recently
came out criticizing the PGP web of trust.
> In my view the use of PGP is more appropriately seen as a statement
> about the kind of world we want to have, one where encryption is
> lawful, widely used, and uncontroversial, and less of a practical way
> to achieve security against many threats that exist today.
It is not enough for the purpose of protecting democracy, therefore
it's one of those statements that backfire: The adversary doesn't
care about you making that statement and can use it against you.
On 10/11/2013 12:17 AM, Jillian C. York wrote:
> Just replying to this bit of your reply to me; the rest made sense
Grrreat.
> On Thu, Oct 10, 2013 at 3:08 PM, carlo von lynX
> <lynX(a)time.to.get.psyced.org <mailto:lynX@time.to.get.psyced.org>> wrote:
>
> > If this is still jargony to you, hmmm... you are unlikely to understand
> > the risks you are exposed to by using the Internet from day to day.
> > These are concepts that anyone in the circumvention business must
> > be aware of. You can choose to not read the Guardian article and not
> > try to understand what's going on, but then you should better just
> > trust that the conclusion is not made up:
>
> No, see that's the thing: /I /get it, but I don't think I'm totally your
> target audience (I've been using PGP for years, you're talking to people
> who haven't started yet, right?)
No, not really. It is for the multipliers and activists. The ones that
carry the torch to the people. The Luciphers. You have been carrying
PGP to the people and I am suggesting you should consider giving them
other tools, and educating them to question those tools and look out
for even newer tools. And help make these tools safe, reviewed and usable.
Then again I wouldn't mind if normal people /get/ it, too, but I wouldn't
want them to opt out the easy way by ceasing to use cryptography.
> You want criticism? There it is. Your writing does not work for the
> general public. You write in a way that feels condescending and assumes
> that the reader already has a full grasp of why those things are issues.
I tried to hide the depth in the links so that it's still readable for
someone who already knows all that stuff.
> On the one hand, you're telling people that PGP is too hard/broken,
> while with the other you're expecting them to already understand it/the
> threat model.
>
> Also, I have no idea what is meant by the "bull run" comment in that
> sentence. If you want your piece to have any reach beyond the English
> language, consider tightening up your writing.
It is mentioned in the article. It's the NSA program that enables them
to hijack any TLS connection on the fly. It was mentioned in television
news some weeks ago, too. The way I put it in that text is a hint saying
"if you don't understand this, you should seriousy consider reading the
linked articles..." ;-)
On Thu, Oct 10, 2013 at 02:17:01PM -0700, elijah wrote:
> On 10/10/2013 12:23 PM, carlo von lynX wrote:
>
> > 1. Downgrade Attack: The risk of using it wrong.
>
> Fixed in the new generation of clients (mailpile, LEAP, etc).
Except for the fact that you are still using a mail address, thus
it can ALWAYS be used without encryption -> FAIL.
> > 2. The OpenPGP Format: You might aswell run around the city naked.
>
> Fixed by using StartTLS with DANE (supported in the new version of
> postfix). Admittedly, this makes sysadmin's job more challenging, but
> LEAP is working to automate the hard stuff (https://leap.se/platform)
Are you alluding to
https://datatracker.ietf.org/doc/draft-ietf-dane-smtp-with-dane/ ?
> > 3. Transaction Data: He knows who you are talking to.
>
> Fixed in the short term by using StartTLS with DANE. Fixed in the long
> term by adopting one of these approaches: https://leap.se/en/routing
Hm, all of the approaches presume that there is something like a server
that a dissident can trust.
> > 4. No Forward Secrecy: It makes sense to collect it all.
>
> Imperfectly fixed in the short term using StartTLS with only PFS ciphers
> enabled. This could be fixed in the long term by using Trevor Perrin's
> scheme for triple EC Diffie-Hellman exchange. This has been implemented
> by moxie for SMS, and could be for SMTP
> (https://whispersystems.org/blog/simplifying-otr-deniability/)
You are slowly turning the email network into some sort of Tor.
Hehe.
> > 5. Cryptogeddon: Time to upgrade cryptography itself?
>
> New version of GPG supports ECC, but of course nothing in the snowden
> leaks suggest we need to abandon RSA of sufficient key length (just the
> ECC curves that have *always* been suspicious).
OK, how does the sender figure out whether the recipient can handle ECC?
> > 6. Federation: Get off the inter-server super-highway.
>
> Federated transport with spool-then-forward time delay is likely a much
> more feasible way to thwart traffic analysis than attempting to lay down
> a high degree of cover traffic for direct peer to peer transport. This
Feasible? Such tools already exist. File sharing happens. Tor, too.
Whereas obfuscation over mail servers needs to be deployed first.
> is, of course, an area of active academic research and it would be
> irresponsible to say that we definitively know how to prevent traffic
> analysis, either with p2p or federation.
I think tools should do both spool-then-forward and play with
cover traffic. If GNUnet as an academic project is working so much
on the cover traffic bit, some academic results maybe exist.
The terms P2P and federation are starting to get confusing.
So-called P2P tools sometimes actually employ dumb relay servers,
which rather defeats the original definition of P2P. And you are
talking about federation servers that, although they use plaintext
email addresses, do not actually know where they are sending things.
That goes beyond the traditional notion of federation. So in a way
both are converging toward a similar strategy. The difference that
remains is that P2P uses DHT resolution strategies like GNS to
address any node, be it at home or in a server rack, while federation
sticks to domain names and therefore cannot easily include user
endpoints. Also, as you pointed out, federation needs a whole lot of
administration work, whereas a DHT works out of the box.
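The "out of the box" part is the routing rule itself: hash names into the same ID space as the nodes and ask whoever is closest. A Kademlia-style sketch with hypothetical node names (GNS/GADS layers lookup privacy on top of this idea, which is not shown here):

```python
import hashlib

def node_id(name: str) -> int:
    """Map names and keys into one flat ID space."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

# Hypothetical participants: home boxes and rack servers are addressed
# exactly the same way, which is the point made above.
nodes = {node_id(n): n for n in ("home-box", "rack-17", "laptop", "vps-9")}

def responsible_node(key: str) -> str:
    """Kademlia rule: the node whose ID has the smallest XOR distance
    to the key's ID is responsible for storing/resolving it."""
    k = node_id(key)
    return nodes[min(nodes, key=lambda nid: nid ^ k)]

owner = responsible_node("alice@example")
assert owner in nodes.values()
```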
And then there are also social approaches to discovery...
Still I have a feeling the DHT approach, especially with
built-in lookup privacy like GNS/GADS has it, is superior.
On the other hand, maintaining the domain name hell is
backwards compatible to current e-mail. The question is if
that is actually doing anyone any good. Maybe if you can
convince spammers to use LEAP they will provide not only
for nuisance but also for cover traffic. :)
> > 7. Statistical Analysis: Guessing on the size of messages.
>
> Easily fixed.
>
> > 8. Workflow: Group messaging with PGP is impractical.
>
> No one anywhere has solved the problem of asynchronous, forward-secret
I think you have to be a bit opportunistic about it.
Briar does it somehow, if I understood correctly.
> group cryptography. There are, however, working models of group
> cryptography using OpenPGP, such as SELS
> (http://sels.ncsa.illinois.edu/) This approach makes key management
> more difficult, but we need to automate key management anyway for
> OpenPGP to be usable enough for wider adoption.
Yes. Key management is an API, not a user interface.
Automatic import of embedded secret keys sounds like a major
cultural revolution for good ole PGP. No surprise none of the
list clients supports that yet. Interesting developments.
Not enough to consider this path worth pursuing, but it falls in the
category of better-than-nothing.
> > 9. TL;DR: I don't care. I've got nothing to hide.
>
> This critique rests on the assumption that the problems with email are
> unfixable.
Yes. Even if all that effort is made, you will still receive
unencrypted mail because you have a mail address. You will still have
a multitude of hosts that remain "unfixed." You will still carry a
dependency on DNS and X.509 around your neck just to stay backward
compatible with an e-mail system from which you hope never to send or
receive any messages, since they would damage your privacy. So what is
this terrific effort to stay backward compatible good for? I don't see
it as a worthwhile goal. There is so much broken about it, while a
fresh start, where every participant is safe by definition, is so much
more useful. Especially, you avoid the usability challenge of having
to explain to your users that some addresses are superduper safe while
others lack a solid degree of privacy.
And I still haven't understood where I get my trustworthy server
from. I know I can rent one, but even if I have a root shell on
it, it doesn't mean it is safe.
So yes, I can't find a way to believe that those fixes actually
can fix the entire architecture.
> > 10. The Bootstrap Fallacy: But my friends already have e-mail!
>
> Email remains one of the two killer apps of the internet, and is
> unlikely to vanish any time soon. Simple steps we can take to make it
> much better seem like a wise investment in energy.
I've read that claim before and I am sure Facebook has already proven
us wrong. Wasn't it in the news a year ago that e-mail was losing
users to Facebook messaging?
And I don't see a use in maintaining e-mail if I have to rebuild my
trust network, anyway.
> There are two approaches to addressing the problems with email:
>
> (1) assert that email is hopeless and must be killed off.
> (2) identify areas where we can fix email to bring it into the 21st century.
>
> I think that approach #1 is irresponsible: regardless of one's personal
> feelings about email, it is certainly not a lost cause, and asserting
> that it is will make it more difficult to build support for fixing it.
I think I have laid out why it is indeed a lost cause.
> Approach #2 is certainly an uphill battle, but there are a growing
> number of organizations working on it. LEAP's (free software) efforts
> are outlined here: https://leap.se/email. We have it working, we just
> need to get it mature enough for production use.
You didn't actually address the "bootstrap fallacy" that I pointed out.
--
Liberationtech is public & archives are searchable on Google. Violations of list guidelines will get you moderated: https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change password by emailing moderator at companys(a)stanford.edu.
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
----- Forwarded message from Tim Retout <diocles(a)debian.org> -----
Date: Thu, 10 Oct 2013 23:25:18 +0100
From: Tim Retout <diocles(a)debian.org>
To: freedombox-discuss(a)lists.alioth.debian.org
Subject: Re: [Freedombox-discuss] Tor
Message-ID: <1381443918.9831.69.camel@air>
X-Mailer: Evolution 3.8.5-2
On Tue, 2013-10-08 at 11:04 +0200, Petter Reinholdtsen wrote:
> So to me, it seems like routing all traffic through Tor brings the
> advantage of making it harder to track your location, while changing
> the set of people that can perform MITM attacks on you. It is not
> like using Tor for everything introduces some new threat. It is
> already known that the NSA and China routinely perform MITM attacks
> on non-Tor traffic, and I assume others do as well. So we are left
> with probability calculations instead to evaluate the threat.
I agree to some extent, but my assessment of the probabilities is still
that using Tor unencrypted is going to cause you new and interesting
security problems.
Privacy and anonymity are different things, and actually I am more
worried about privacy first. There's no point using Tor to access a
cloud-based email service. I want to focus on getting everyone's data
decentralized, and their communications encrypted.
> While talking about these topics with a friend, I just got a tip
> about PORTALofPi, which is an Arch-based Raspberry Pi setup to force
> all traffic over Tor. See <URL: https://github.com/grugq/PORTALofPi/ >
> for that recipe.
Grugq's writing is very interesting: http://grugq.github.io/
He recommends using a VPN over Tor to avoid monitoring by malicious exit
nodes (which of course won't avoid monitoring by the VPN provider):
http://grugq.github.io/blog/2013/06/14/you-cant-get-there-from-here/
http://www.slideshare.net/grugq/opsec-for-hackers (NSFW, slide 137
onwards)
--
Tim Retout <diocles(a)debian.org>
_______________________________________________
Freedombox-discuss mailing list
Freedombox-discuss(a)lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/freedombox-discuss
----- End forwarded message -----
----- Forwarded message from carlo von lynX <lynX(a)time.to.get.psyced.org> -----
Date: Fri, 11 Oct 2013 00:08:47 +0200
From: carlo von lynX <lynX(a)time.to.get.psyced.org>
To: liberationtech <liberationtech(a)mailman.stanford.edu>
Subject: Re: [liberationtech] 10 reasons not to start using PGP
Message-ID: <20131010220847.GH22105(a)lo.psyced.org>
User-Agent: Mutt/1.5.20 (2009-06-14)
Reply-To: liberationtech <liberationtech(a)lists.stanford.edu>
Hello again. I will answer most comments in a single mail to avoid
clogging libtech. While I wrote this, another ten mails slipped in,
so expect another large reply to those. :-)
On 10/10/2013 10:00 PM, Richard Brooks wrote:
> 10 reasons to give up, stop trying, hide in a corner, and die.
Sorry if I start talking about the alternatives only at the very end
of the document. This is about becoming aware of how serious the
problem is and to start directing some energy into fueling the
alternatives which are popping up like mushrooms just recently.
For the obvious reasons. And I specifically mention peer reviewing
them. So the message is: go get yourself new tools and teach your
peers to use the new tool of the day.
On 10/10/2013 10:11 PM, Pranesh Prakash wrote:
> Interesting. But someone should also write a piece called "1 reason not
> to criticise security tech without clearly stating threat model which
> serves as basis for that criticism". What if Mallory isn't a
> well-funded governmental organization but is the admin who runs your
> employer's email servers?
That's a good point. The reason why I don't pay attention to lesser
threat models is that the loss in quality of democracy we are currently
experiencing is large enough that I don't see much use for a distinction
of threat models - especially since alternatives that work better than
PGP exist, so they are obviously also better for lesser threat models.
For example, I don't think that a dissident in Irya (a fictitious
country) is better off if no one but Google Mail knows that he is a
dissident. Should someone with access to that data find it useful, at
any later time in his life, to use it against the dissident, he can
still do so. And who knows what the world will look like twenty years
from now?
Not saying give up and die. Saying if you can opt for better security,
don't postpone learning about it. If you can invest money in making
it a safe option, don't waste time with yet another PGP GUI project.
> This should actually be two lists: reasons not to use e-mail, and
> reasons not to use OpenPGP over e-mail.
Fine with me. I don't think it makes much difference for the end
user whether SMTP federation or actual PGP is failing her.
> Only reasons 2, 3, 4, 5, 7, 8 are really about OpenPGP (you should've
> stuck to "6 reasons not to use PGP"), and at least three of them are
> really good reasons to look for alternatives. There are no good
> alternatives over e-mail: S/MIME unfortunately suffers from many of the
> same issues as OpenPGP, and then some more.
I don't find S/MIME worth mentioning anymore. It has so failed us.
But maybe I should for completeness?
> And reason #1 is something that the client should take care of (ideally
> with default settings), and not the encryption protocol. Why are you
> attacking OpenPGP and OTR for this?
Because it's not true that the client can handle it. The fact that an
email address exists implies that some folks will send unencrypted
stuff to it. I have experienced this: just yesterday a friend changed
his life plans because of an unencrypted message. Yes, you could
enforce PGP once it's configured - but you can't opt out of e-mail.
That is evil.
Look at any of the alternatives instead. None of them allow you to
transmit an unencrypted message. In fact all the modern systems use
the public key for addressing, so you can't do it wrong.
> And thank you so much for the comparative chart. It is *very* useful.
My pleasure. I felt the need to do this since I get asked for
recommendations frequently - and I don't like to say.. wait until
secushare is ready. I don't want to wait for it myself.
> Why doesn't telephony have SIP?
It should. What would the icons be that you would put there?
I'm not familiar with end-to-end encryption over SIP for instance.
On 10/10/2013 10:33 PM, Marcin de Kaminski wrote:
> Agreed. The threat model discussion clearly is too often lost in all
> the current post-Snowden debates. We need to remember that a lot of
> solutions might not be enough to protect anyone against NSAish
> authorities but more than enough against other, more real, threats
> to people's personal safety. Regular employers, schools, parents, skiddies, whatever.
I think if employers, schools, parents, skiddies can find out who
you are exchanging encrypted messages with, that can be a very real
threat to you. Using a tool that looks like it does something
totally different.. on your screen, over the network and even on
your hard disk.. can save your physical integrity.
On 10/10/2013 09:55 PM, adrelanos wrote:
> Thank you for doing this work!
> The world needs someone facing the truth, explaining why gpg isn't the
> solution, advocating positive change. It's a communicative task, a very
> difficult one. As long there is gpg, most geeks don't see need to create
> better alternatives.
Glad someone understands the positivity in awareness and the will to
move forward. Ignoring threats just because they are depressing is a
bit like burying your head in the sand.
> I'd say, gpg's development slowed down. They're qualified but standing
> in their own way. They should break compatibility with commercial PGP
> (not because thats good, just because it's easier to implement better
> solutions), also break compatibility with RFCs, implement better
> solutions and standardize later. The current "first standardize, then
> maybe implement, and don't implement if it's not standardized" approach
> is much too slow and can't keep up with real developments in the real world.
The whole architecture is wrong. There is hardly anything worth keeping
in the old PGP approach except for the cryptographic basics. All the
modern alternatives use a completely different approach.
> (Still don't even have mail subject encryption.) If Bitmessage succeeds
> (I haven't learned much about it yet), and actually provides better
> protection than gpg, I am happy with that also if there isn't a RFC. If
> Bitmessage gets really popular, I am sure they'll somehow work things
> out and happen to standardize it later.
Thanks for reminding me to look at Bitmessage. I was postponing that
unnecessarily. I read the whitepaper and have added it to the comparison
table according to the claims in it. The architecture sounds a bit
like the one of IRC, but without multicast routing - so I expect it
to run into serious scalability issues. It will probably have to split
into several incompatible networks as it grows. It will also probably
keep your computer a lot more busy than you expect from a communication
tool. But for the time being it is a crypto-strategically much safer
approach than PGP.
Concerning standardization: It is a VERY BAD development that it has
become en vogue to require standardization from projects that haven't
even started functioning. It has been detrimental to the social tool
scene: None of them work well enough to actually scale and replace
Facebook, but the scalability problems are already being cemented into
"open standards," ensuring that they never will.
You must ALWAYS have a working pioneer tool FIRST, then dissect the
way it works and derive a standard out of it. Bittorrent is a good
example for that. It's one of the few things that actually works.
Imagine if Napster and Soulseek had developed an open standard. It
would only have delayed the introduction of Bittorrent, promoting
an inferior technology by standardization.
Open standards are part of the problem, not the solution.
> Sometimes I even think that if there wasn't gpg, new approaches would
> have had better chances of reaching critical mass.
Good point. libtech is the place where people put time and money
into these things. Figuring out the ultimate UX fix for PGP won't
solve the underlying problems. The number of PGP critics is growing.
On Thu, Oct 10, 2013 at 12:40:55PM -0700, Jillian C. York wrote:
> In my opinion, this makes about as much sense as telling people who are
> already having sex not to use condoms.
I am saying to use condoms that don't slip off during intercourse
and explaining why the old condom technology has a tendency to break.
> Consider mine a critique of why this post makes almost no sense to and
> won't convince any member of the public. I'm sure some of the geeks here
> will have a field day with it, but some of it is barely in my realm of
> understanding (and while I'm admittedly not a 'geek', I've been working in
> this field for a long time, which puts me at the top rung of your 'average
> user' base).
Well, maybe we can find wordings that make it more understandable.
Of course the links are meant to be clicked on when necessary.
> TL;DR: This may well be a solid argument for convincing developers to
> implement better UIs, etc, but it doesn't work for its intended purpose,
> which seems to be convincing n00bs not to use PGP.
No, it is exactly about not trying to fix on the UI level what is
fundamentally beyond repair.
> > 2. The OpenPGP Format: You might as well run around the city naked.
> >
> > As Stf pointed out at CTS, thanks to its easily detectable [06]OpenPGP
> > Message Format it is an easy exercise for any manufacturer of [07]Deep
> > Packet Inspection hardware to offer a detection capability for
> > PGP-encrypted messages anywhere in the flow of Internet communications,
> > not only within SMTP. So by using PGP you are making yourself visible.
> >
> > Stf has been suggesting to use a non-detectable wrapping format. That's
> > something, but it doesn't handle all the other problems with PGP.
>
> Okay, this part requires more explanation for the layman, methinks. It's
> not intuitive for a non-tech to understand.
I didn't feel like including an explanation of Deep Packet Inspection
and elaborating on the recognizable characteristics of the OpenPGP
format, as it would explode the paragraph a bit - but maybe that's
wrong. It depends on who my target audience is. Somebody like you
could be. Does it work by following the links, or does it get too
abstract from there... in the sense that you can read the Wikipedia
pages but fail to connect the dots?
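For the record, the recognizability comes from OpenPGP's packet framing (RFC 4880): every binary packet starts with a tag byte whose high bit is set, and armored messages carry a fixed banner. A trivial matcher of the kind DPI vendors can ship - a heuristic sketch, with false positives:

```python
def looks_like_openpgp(data: bytes) -> bool:
    """Heuristic sketch of what DPI gear can key on (RFC 4880 framing)."""
    if data.startswith(b"-----BEGIN PGP"):  # ASCII-armored messages
        return True
    if not data:
        return False
    # Binary packets: bit 7 of the first byte is always set; bit 6
    # selects new- vs old-format packet headers.
    return bool(data[0] & 0x80)

assert looks_like_openpgp(b"-----BEGIN PGP MESSAGE-----\n...")
assert looks_like_openpgp(bytes([0xC3, 0x04]))  # new-format packet header
assert not looks_like_openpgp(b"GET / HTTP/1.1")
```

A real classifier would parse further fields to cut the false positives, but the armor banner alone already flags every armored mail.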
> > 3. Transaction Data: He knows who you are talking to.
> >
> > Should Mallory not [08]possess the private keys to your mail provider's
> > TLS connection yet, he can simply intercept the communication by means
> > of a [11]man-in-the-middle attack, using a valid fake certificate that
> > he can make for himself on the fly. It's a bull run, you know?
>
> You're not going to convince anyone with jargony talk.
If this is still jargony to you, hmmm... you are unlikely to understand
the risks you are exposed to by using the Internet from day to day.
These are concepts that anyone in the circumvention business must be
aware of. You can choose not to read the Guardian article and not try
to understand what's going on, but then you should just trust that the
conclusion is not made up:
> > Even if you employ PGP, Mallory can trace who you are talking to, when
> > and how long. He can guess at what you are talking about, especially
> > since some of you will put something meaningful in the unencrypted
> > Subject header.
>
> Again, this is a call for better education around email practices, not for
> people to stop using PGP.
There is nothing you can do with email that saves you from this
happening. Thus, it's not a problem of practices. It's a question of
throwing away the broken condom and learning about new contraceptive
technology.
> > Should Mallory have been distracted, he can still recover your mails by
> > visiting your provider's server. Something to do with a PRISM, I heard.
> > On top of that, TLS itself is being recklessly deployed without forward
> > secrecy most of the time.
> >
> > 4. No Forward Secrecy: It makes sense to collect it all.
> >
> > As Eddie has told us, Mallory is keeping a complete collection of all
> > PGP mails being sent over the Internet, just in case the necessary
> > private keys may one day fall into his hands. This makes sense because
> > PGP lacks [12]forward secrecy: the characteristic by which encryption
> > keys are frequently refreshed, so that the private key matching a
> > message is soon destroyed. Technically PGP is capable of refreshing
> > subkeys, but it is so tedious that it is not being practiced - let
> > alone practiced the way it should be: at least daily.
>
> Again: Fair criticism, but unclear why this should convince one NOT to use
> PGP. Rather, it should convince us to improve mechanisms and add forward
> secrecy.
You mean I should explain why it is impossible to add forward secrecy
to PGP over e-mail by design? I thought that was going to be clear.
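To make the design point concrete: forward-secret protocols ratchet a chain key forward through a one-way function after every message and delete the old state, whereas PGP encrypts everything to one long-lived key. A schematic sketch (generic labels, not any specific protocol's KDF):

```python
import hashlib
import hmac

def advance(chain_key: bytes):
    """Derive a one-time message key, then ratchet the chain key forward.
    Because the hash is one-way, stealing today's chain key does not
    let an attacker recompute yesterday's message keys."""
    message_key = hmac.new(chain_key, b"msg", hashlib.sha256).digest()
    next_chain  = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    return message_key, next_chain

ck = hashlib.sha256(b"initial shared secret").digest()
k1, ck = advance(ck)   # encrypt message 1 with k1, then delete k1
k2, ck = advance(ck)   # encrypt message 2 with k2
assert k1 != k2
```

With a static PGP key there is no such step to insert: the same private key decrypts every message ever recorded, which is exactly why collect-it-all makes sense for Mallory.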
> > 6. Federation: Get off the inter-server super-highway.
> >
> > NSA officials have been reported saying that NSA does not keep track of
> > all the peer-to-peer traffic as it is just large amounts of mostly
> > irrelevant copyright infringement. It is thus a very good idea to
> > develop a communications tool that embeds its ECC- encrypted
> > information into plenty of P2P cover traffic.
> >
> > Although this information is only given by hearsay, it is a reasonable
> > consideration to make. By travelling the well-established and
> > surveilled paths of e-mail, PGP is unnecessarily superexposed. It would
> > be much better if the same PGP messages were handed from computer to
> > computer directly. Maybe even embedded into a picture, movie or piece
> > of music using [17]steganography.
>
> Steganography, really? Sigh.
One of the options that are safer than PGP is steganography, yes.
> > 7. Statistical Analysis: Guessing on the size of messages.
> >
> > Especially for chats and remote computer administration it is known
> > that the size and frequency of small encrypted snippets can be observed
> > long enough to guess the contents. This is a problem with SSH and OTR
> > more than with PGP, but also PGP would be smarter if the messages were
> > padded to certain standard sizes, making them look all uniform.
>
> It would be great, yes. Still doesn't convince me that using PGP isn't
> worthwhile.
This one alone, not necessarily - it's the least bad of them all.
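Concretely, the fix is to round every ciphertext up to one of a few standard sizes before sending, so an observer sees only the bucket. A sketch with assumed bucket sizes:

```python
import struct

BUCKETS = (1024, 4096, 16384, 65536)  # assumed standard sizes

def pad(msg: bytes) -> bytes:
    """Length-prefix the message, then zero-fill up to the next bucket."""
    framed = struct.pack(">I", len(msg)) + msg
    for size in BUCKETS:
        if len(framed) <= size:
            return framed + b"\x00" * (size - len(framed))
    raise ValueError("message larger than the biggest bucket")

def unpad(blob: bytes) -> bytes:
    (n,) = struct.unpack(">I", blob[:4])
    return blob[4:4 + n]

padded = pad(b"meet at noon")
assert len(padded) == 1024          # the wire only reveals the bucket
assert unpad(padded) == b"meet at noon"
```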
> > 8. Workflow: Group messaging with PGP is impractical.
> >
> > Have you tried making a mailing list with people sharing private
> > messages? It's a cumbersome configuration procedure and inefficient
> > since each copy is re-encrypted. You can alternatively all share the
> > same key, but that's a different cumbersome configuration procedure.
> >
> > Modern communication tools automate the generation and distribution of
> > group session keys so you don't need to worry. You just open up a
> > working group and invite the people to work with.
>
> Okay, yes, you've got me here. PGP sucks for group discussion, although I
> fail to see why group discussion is an imperative. But what, do you
> suggest, is an immediate alternative? Nothing? Right, okay...still using
> PGP.
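What the quoted paragraph calls automating group session keys is the usual hybrid fan-out: encrypt the body once under a fresh random key, and wrap only that small key for each member. A toy sketch with XOR stand-ins for the real public-key and symmetric primitives (NOT real cryptography):

```python
import hashlib
import secrets

def wrap_for_member(member_pub: bytes, session_key: bytes) -> bytes:
    """Toy key-wrap: XOR against a hash of the member's key material.
    A real tool would use actual public-key encryption here."""
    mask = hashlib.sha256(member_pub).digest()
    return bytes(a ^ b for a, b in zip(mask, session_key))

def unwrap_as_member(member_pub: bytes, wrapped: bytes) -> bytes:
    return wrap_for_member(member_pub, wrapped)  # XOR is its own inverse

members = [b"alice-pub", b"bob-pub", b"carol-pub"]
session_key = secrets.token_bytes(32)

# The message body is encrypted ONCE under session_key; only the small
# wrapped key differs per member, unlike PGP's full per-recipient copies.
header = {m: wrap_for_member(m, session_key) for m in members}

for m in members:
    assert unwrap_as_member(m, header[m]) == session_key
```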
The article is not meant to be an advertisement for the alternatives.
My working group is currently exchanging materials by means of
RetroShare. It takes a bit of getting used to, as you would not expect
a file sharing app to be used for by-the-way features like its
built-in homebrew mail system and forum messaging - or expect it to
be safer than regular e-mail. But RetroShare does indeed solve some of
the things listed in this long list (see the comparison for details).
The downside is that nobody is willing to put her hands in the fire to
guarantee it is a safe choice of software, and the source code, as is
frequent with file sharing tools, is too complex to be an easy read.
So it is an amazing feature beast pending peer review.
Briar is expected to be a better solution, but it is in alpha stage.
Both should probably be used over Tor for obfuscation, as they don't
provide for that themselves.
And then there is Pond, which is technologically a work of art, but it
doesn't facilitate group communication (yet).
> > 9. TL;DR: I don't care. I've got nothing to hide.
> >
> > So you think PGP is enough for you since you aren't saying anything
> > reaaally confidential? Nobody actually cares how much you want to lie
> > to yourself by stating you have nothing to hide. If that were the case,
> > why don't you do it on the street, as John Lennon used to ask?
> >
> > It's not about you, it's about your civic duty not to be a member of a
> > predictable populace. If somebody is able to know all your preferences,
> > habits and political views, you are causing severe damage to democratic
> > society. That's why it is not enough that you are covering naughty
> > parts of yourself with a bit of PGP, if all the rest of it is still in
> > the nude. Start feeling guilty. Now.
>
> Again: This is merely a reason to convince people to use encryption MORE
> OFTEN (which EFF does and which I fully support).
I agree that you should use encryption MORE.. but use it BETTER!
> > 10. The Bootstrap Fallacy: But my friends already have e-mail!
> >
> > But everyone I know already has e-mail, so it is much easier to teach
> > them to use PGP. Why would I want to teach them a new software!?
> >
> > That's a fallacy. Truth is, all people that want to start improving
> > their privacy have to install new software. Be it on top of
> > super-surveilled e-mail or safely independent from it. In any case you
> > will have to make a [18]safe exchange of the public keys, and e-mail
> > won't be very helpful at that. In fact you make it easy for Mallory to
> > connect your identity to your public key for all future times.
> >
> > If you really think your e-mail consumption set-up is so amazing and
> > you absolutely don't want to start all over with a completely different
> > kind of software, look out for upcoming tools that let you use mail
> > clients on top. Not the other way around.
>
> I don't even get what you're saying here. What, do you suggest, is the new
> software to teach people if not PGP?
I am saying that teaching people PGP is MORE work than getting them
to install any of:
- Pond
- Briar
- RetroShare
- Bitmessage
And that I hope that we will have more projects to list and that we will
not feel guilty for doing so. RetroShare has a terribly confusing UI (but
the developers are just waiting for some UX designer to tell them what to
do) and I bet the others need a hand on that front, too.
> > But what should I do then!??
> >
> > So that now we know 10 reasons not to use PGP over e-mail, let's first
> > acknowledge that there is no easy answer. Electronic privacy is a crime
> > zone with blood freshly spilled all over. None of the existing tools
> > are fully good enough. We have to get used to the fact that new tools
> > will come out twice a year.
>
> Cop-out. "Don't use PGP but I can't suggest anything for you." Silly.
Where does it say that?
Here comes the part that you were missing:
> > In the [09]comparison we have listed a few currently existing
> > technologies, that provide a safer messaging experience than PGP. The
> > problem with those frequently is, that they haven't been peer reviewed.
> > You may want to invest time or money in having projects peer reviewed
> > for safety.
> >
> > Pond is currently among the most interesting projects for mail privacy,
> > hiding its padded undetectable crypto in the general noise of Tor. Tor
> > is a good place to hide private communication since the bulk of Tor
> > traffic seems to be anonymized transactions with Facebook and the like.
> > An even better source of cover traffic is file sharing; that's why
> > RetroShare and GNUnet both have solid file sharing functionality to
> > let you hide your communications in.
You gave up reading just a few paragraphs too early....
--
----- End forwarded message -----