On Sun, 31 Dec 2000, Eric Cordian wrote:
I see some interesting science here. Permit me to explain.
Cool. I'll take the liberty of adding some commentary myself.
One of the unchallenged inerrant doctrines of crypto-anarchy has been that highly redundant widely distributed services are immune to attack.
I'd say a more realistic view is that the cost of an attack goes up as a function of distribution. Hence, if you want to hit one 'unit' then it's reasonable that even an individual might do it. But if you start trying to hit many units, the combinatorics take over and cost goes up in a factorial fashion.

This is the reason I want to see key management stay at the 'unit' level. The cost for me (or any party) to readily identify another party in my immediate environs is much lower than, say, having Tim May do it for us. The best person to evaluate my 'trust requirements' is me.

In addition, layering the protocols this way adds another level of protection. The two parties share a point-to-point link key that they exchange in their own way. Each node takes whatever traffic it gets and routes it according to the address; everything 'below' this layer is still encrypted. But I don't have access to the keys, nor do I necessarily know what algorithm was used to encrypt them. Those choices were made by the appropriate parties at each level.
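A minimal sketch of the layering described above: each hop shares a link key with the sender, and each layer of the packet names only the next hop, so a node can route without being able to read anything 'below' its own layer. The cipher here is a toy SHA-256 keystream XOR for self-containedness; the function names and packet format are illustrative, and a real system would use a vetted AEAD cipher.

```python
import hashlib
import json

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR against a SHA-256 counter keystream.
    # Symmetric, so the same call encrypts and decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def wrap(message: bytes, hops) -> bytes:
    # hops = [(link_key, forward_to), ...] in path order.  Each layer
    # tells that hop where to forward the still-encrypted inner packet.
    packet = message
    for key, forward_to in reversed(hops):
        layer = json.dumps({"next": forward_to,
                            "body": packet.hex()}).encode()
        packet = keystream_xor(key, layer)
    return packet

def unwrap(packet: bytes, key: bytes):
    # A node peels exactly one layer with its own link key; the inner
    # packet stays opaque because it is encrypted under someone else's key.
    layer = json.loads(keystream_xor(key, packet))
    return layer["next"], bytes.fromhex(layer["body"])
```

Each party along the path calls `unwrap` once with its own key and forwards the result; only the final hop recovers the plaintext.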
Indeed, things like BlackNet are made possible because they can use such services (e.g. alt.anonymous.message) as their underlying transport mechanism.
I like Plan 9 because it allows this at the network layer as part of the OS, though I'd like to see DES replaced with something more secure. When coupled with its distributed file/work space, some really interesting possibilities open up.
Now we see a network of 33 servers being assimilated to a new way of doing things. How could this be? Perhaps there are some flaws in our analysis of highly redundant widely distributed networks. Perhaps by looking at Efnext, we might see what they are.
I wouldn't say '33' qualifies as 'highly redundant' or 'widely distributed'. I wouldn't trust any system that didn't have at least several hundred, and preferably several thousand, widely dispersed nodes involved. 33 machines is within the scope of a couple of individuals to hack over a year.
Flaw number one is that the servers in most networks are not equal. Most Networks are star networks, and most of the nodes are leaf nodes. Leaf nodes are at the mercy of their hubs. Where the hubs go, the leaves will follow.
I would say this observation is reason enough to conclude that our original description of the network (and, as a result, our expectations) was at fault, rather than how the particular system ended up being implemented or how it performed.
Flaw number two is that it is far more prestigious to run a hub than a leaf. Given the choice of having ones own Enamelware Factory under the new Reich, or being reduced to a delinked leaf, most server operators will swallow their pride and go with the herd.
The really(!) interesting observation is that the one choice that almost never(?) comes up is a real distributed system. If these people would put the same energy into sending out a few notices in the appropriate forums and starting a real distributed, content-blind network, we'd all be better off. But it doesn't happen. Why? People don't like direct and immediate confrontation; they hate to burn bridges. It's a psychology thing, in my opinion.
Flaw number three is that once the herd starts moving, it is very difficult for individual sheep to make their views known, and almost impossible for them to push the herd in a different direction.
Again, this is that face to face confrontation thingy.
Also, the trading of privacy and autonomy for convenience is a new threat model we have not considered in the context of highly redundant widely distributed networks.
But a distributed system maximizes convenience; it has the minimum level of regulation and operating cost at the individual level. At the same time it maximizes autonomy. Obliquely, a couple of weeks ago there was a post on /. about 'why freedom'. One of the issues was Franklin's security/freedom quote. The point which nobody gets is that Franklin was saying security IS freedom.

In a fully distributed network with inherent privacy management we'd have something like this:

- My ISP and I would share a key. This would give me the pipe to route my traffic through.

- My friends and I would share private 'public' keys that we'd use for day-to-day chit-chat. (This implies the package must allow multi-key selection for each recipient.)

- We'd have real public keys we'd share on our web pages as a matter of course. They should also be in our .sigs.

- Any service I worked with would share a key. We'd use the real public keys only to initiate the dialog. Nearly the first thing we'd do prior to any exchange would be to generate a (potentially session-sensitive) specific key pair for us to use. It should be easy to sign these keys with a public key for third-party authentication. Smart folks would have key generation and exchange done on a nearly session-by-session basis.
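The last point above, per-session key generation authenticated by a longer-term key, can be sketched roughly as follows. This is a toy finite-field Diffie-Hellman exchange with a deliberately tiny group, and the HMAC stands in for the public-key signature the post proposes; all names and parameters here are illustrative assumptions, not any particular system's API.

```python
import hashlib
import hmac
import secrets

# Toy Diffie-Hellman parameters -- far too small for real use; a real
# deployment would use a standard group (e.g. RFC 3526) or X25519.
P = 2**127 - 1   # a Mersenne prime, toy-sized
G = 3

def ephemeral_keypair():
    # A fresh key pair per session, as suggested above.
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def session_key(my_priv: int, their_pub: int) -> bytes:
    # Both sides compute the same shared secret; hash it down to a
    # fixed-length session key.
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(shared).encode()).digest()

def sign_pub(long_term_key: bytes, pub: int) -> bytes:
    # HMAC under a long-term key stands in here for the public-key
    # signature that would let a third party authenticate the exchange.
    return hmac.new(long_term_key, str(pub).encode(), hashlib.sha256).digest()
```

Two parties each call `ephemeral_keypair`, exchange the public halves (tagged with `sign_pub`), and then both arrive at the same `session_key` without ever sending it over the wire.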
Here we have EFNet en masse giving up the old way of doing things. En masse. "Voluntarily." And what is their motivation?
Since it's voluntary, is that even an issue? Isn't part of respecting individuals simply accepting that it's their decision, and that to pressure them about it only weakens our own arguments and our desire for individuality? That people have a means to opt out is enough.
Impending government legislation?
<shrug>Influence through abduction by space aliens?
Janet Reno's tanks rolling on the locations of all 33 IRC Servers? A court order, which threatens indefinite jailing for non-compliance?
No, it's none of these things. It's some people who have gone off and written some mods to ircd which make running a server less of a headache.
But isn't this exactly what individuals are supposed to do? Irrespective of free market requirements, isn't this a requirement of a real world market? We bitch and moan about sheeple and then when a group goes off and does it we bitch and moan...
So the lesson here is that there is a "better software" attack on highly redundant widely distributed server networks, and that entire networks will trade control of their servers and allow changes to fundamental protocols, in return for new "singing and dancing" code.
I'd say the lesson is that we went into this with one set of expectations and we applied those in an environment where they weren't valid. We need to identify our process errors and repair them and try again.
Certainly, Usenet is also vulnerable to such an attack. Most news admins I know would give their left nut for a life free of spam.
The question is how to do it. And who will do it? Isn't this the point of a market? If "Do I compete?" then "Differentiate!" else "Use what somebody else brung to the party." This is a perfect opportunity for a real difference to be made. I think this is a facet of Lessig's "Code" that many don't seem to get. As I've said before, this is the real power of Open Source development. The trick is to get the right mix of capability and personality in some core members. This is a nearly completely unstudied aspect of Open Source (and no, I don't think the things that make Open Source succeed are the same things that make closed source succeed).
Much in the same sense that it is "voluntary" for an individual in the top 1 percentile on IQ and Achievement Tests to get a high school diploma.
However, try being allowed to flip burgers without one, regardless of your actual talent.
Well, you won't just walk in and get hired as president of a company, but getting a job is not impossible without a diploma. You'll have to be content to start at the bottom washing plates, delivering boxes, etc. The opportunities are still there if at some point you change your mind and decide you do need something more to achieve. You can always start your own business, too.
Making people "part of the process" is one of the first things one learns in management. How to simultaneously make sure they have zero chance of actually altering what you have planned for them is the second thing.
People ARE the process; everything else is a tool for their success. The way I manage my team is with clear goals, a well-described process and resource utility (and by staying out of their way).

 ____________________________________________________________________

       Before a larger group can see the virtue of an idea, a
       smaller group must first understand it.

                                           "Stranger Suns"
                                           George Zebrowski

     The Armadillo Group       ,::////;::-.          James Choate
     Austin, Tx               /:'///// ``::>/|/      ravage@ssz.com
     www.ssz.com             .',  ||||    `/( e\     512-451-7087
                          -====~~mm-'`-```-mm --'-

 --------------------------------------------------------------------