On Thu, Sep 18, 2003 at 11:17:00AM -0400, Tyler Close wrote:
> On Tuesday 16 September 2003 11:38, Morlock Elloi wrote:
> > That is the problem when a centralized technical solution relies on the legal system (and they almost always do).
> > What is important is how, and whether, this will accelerate alternate solutions for namespace management.

Machines can handle numerical addresses; as a stop-gap measure, search engines (hardcoded into browsers) obviate the need to memorize URIs. Though there are several competing search engines, this is of course still mostly a single point of failure.

We here all probably agree that the days of open online publishing are numbered, and that traffic-remixing P2P networks (which, by tweaking parameters, could be used to implement a BlackNet) will rapidly displace the WWW once a usable system appears on the scene. The publisher knows the cryptographic hash of the document, and can submit it to full-text indexing search services. Each P2P node should come with a search engine which uses part of the store space to keep an index (a rough sketch of what such a node-local index could look like is appended at the end of this mail).

Denial of service can be counteracted by agoric load levelling and prestige accounting. If you provide shitty service, your node gets consulted less and less, and your requests are processed with lower and lower priority (see the second sketch below). If you push out documents, you have to provide store, bandwidth and crunch, building an impeccable prestige over a long period of time.

Given the recent history, it looks hard to develop a usable system which gets all of the above right, so it will obviously take a while.

I haven't spent much time reading up on YURLs, so I can't comment on that. What's the local consensus on the Waterken feller?
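
To make the node-local index idea concrete, here is a rough sketch in Python. Everything in it (the names, the choice of SHA-256, the toy tokenizer) is my own illustration, not any existing system: the publisher derives the document's name from its cryptographic hash, and each node spends part of its store on an inverted index over the documents it holds.

import collections
import hashlib

def content_address(document: bytes) -> str:
    """Name the document by its SHA-256 hash, so the name is self-certifying."""
    return hashlib.sha256(document).hexdigest()

class LocalIndex:
    """Toy inverted index kept in part of a node's storage."""

    def __init__(self):
        self.postings = collections.defaultdict(set)  # term -> {document hashes}
        self.store = {}                                # hash -> document bytes

    def publish(self, document: bytes) -> str:
        """Store a document under its hash and index its words."""
        doc_hash = content_address(document)
        self.store[doc_hash] = document
        for term in document.decode(errors="ignore").lower().split():
            self.postings[term].add(doc_hash)
        return doc_hash

    def search(self, term: str) -> set:
        """Return hashes of locally stored documents containing the term."""
        return self.postings.get(term.lower(), set())

# Example: publish, look up by keyword, and verify integrity on retrieval.
node = LocalIndex()
h = node.publish(b"open publishing without a central name authority")
assert h in node.search("publishing")
assert content_address(node.store[h]) == h

The point of the hash-as-name is that whoever serves the document cannot tamper with it without the mismatch being detectable by the requester.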
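
And a similarly hypothetical sketch of the prestige accounting / agoric load levelling, again with made-up names and weights: each node keeps a per-peer prestige score, good service nudges it up, bad service knocks it down, and incoming requests are served in order of the requester's prestige, so flooders and freeloaders drift to the back of the queue.

import heapq
import itertools

class PrestigeScheduler:
    """Serve requests in order of the requesting peer's prestige."""

    def __init__(self):
        self.prestige = {}            # peer id -> score
        self.queue = []               # min-heap of (-score, seq, peer, request)
        self.seq = itertools.count()  # tie-breaker keeps FIFO order per score

    def record_service(self, peer: str, ok: bool):
        """Raise prestige for good service, lower it for bad (weights arbitrary)."""
        self.prestige[peer] = self.prestige.get(peer, 0.0) + (1.0 if ok else -2.0)

    def submit(self, peer: str, request: str):
        """Enqueue a request, prioritized by the requester's current prestige."""
        score = self.prestige.get(peer, 0.0)
        heapq.heappush(self.queue, (-score, next(self.seq), peer, request))

    def next_request(self):
        """Pop the request of the highest-prestige requester."""
        _, _, peer, request = heapq.heappop(self.queue)
        return peer, request

# Example: the peer with a record of bad service gets served last.
sched = PrestigeScheduler()
sched.record_service("good-peer", ok=True)
sched.record_service("flaky-peer", ok=False)
sched.submit("flaky-peer", "GET <hash>")
sched.submit("good-peer", "GET <hash>")
print(sched.next_request())  # ('good-peer', 'GET <hash>')

Building prestige slowly and losing it quickly is what makes it expensive to burn a node's identity on an attack; the exact weights and decay would need tuning in practice.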