Re: Hollywood Hackers
On Tue, 30 Jul 2002 20:51:24 -0700, you wrote:
> When we approve a file, all the people who approved it before us get added to our trust list, thus helping us select files, and we are told that so-and-so got added to our list of people who recommend good files. This gives people an incentive to rate files, since rating files gives them the ability to take advantage of other people's ratings.
> If one discommends a file, those who also discommended it are added to our trust list, and those who commended it to our distrust list. When, as will frequently happen, there is a conflict, we are told that so-and-so commended so many files we like and so many files we dislike, and asked how future commendations and discommendations from him should be handled.
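The rating-driven trust-list update described above can be sketched as follows. This is an illustrative reconstruction, not code from any actual client; all names (`TrustBook`, `record_my_rating`, etc.) are assumptions.

```python
from collections import defaultdict

class TrustBook:
    """Track how often each signer's ratings agreed with our own."""

    def __init__(self):
        # signer -> count of agreements / disagreements with our ratings
        self.agree = defaultdict(int)
        self.disagree = defaultdict(int)

    def record_my_rating(self, liked, commenders, discommenders):
        """Called when we rate a file we have sampled ourselves.

        commenders/discommenders: people whose prior ratings we saw."""
        for who in commenders:
            if liked:
                self.agree[who] += 1     # they steered us to a good file
            else:
                self.disagree[who] += 1  # they commended a file we dislike
        for who in discommenders:
            if liked:
                self.disagree[who] += 1  # they warned us off a good file
            else:
                self.agree[who] += 1     # their warning was justified

    def weight(self, who):
        """Net score used to weigh future commendations from `who`."""
        return self.agree[who] - self.disagree[who]
```

The "conflict" case the author mentions falls out naturally: a signer with both agreements and disagreements simply ends up with a small net weight, rather than a binary trusted/untrusted status.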
Such an approach suffers from the "bad guy" occasionally signing a good file, thus placing himself on the trusted-signer list. A better approach is for the downloader to create his own trusted list, along the lines of the PGP web of trust, which is ideal for exactly this application. The downloader can add to and subtract from the trusted-signer list at will, with no central control. Since one must expect some trusted signers to get busted and move to the dark side under court order, such downloader control is necessary. A remaining problem is that mp3 and other compression processes do not generate bit-identical files: two perfect mp3 encodings of the same source may have different MD5 hashes, for example. A tool for making bit-identical mp3 files from the same digital input is needed, so that a single signed hash can verify the same file from multiple origins.
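A downloader-controlled trusted-signer list might look like the following minimal sketch, assuming each file circulates with (signer, hash) attestations. The class and its methods are hypothetical names for illustration.

```python
import hashlib

class TrustedSigners:
    """A personal trusted-signer list under the downloader's sole control."""

    def __init__(self):
        self.signers = set()

    def trust(self, signer_id):
        self.signers.add(signer_id)

    def revoke(self, signer_id):
        # e.g. when a signer gets busted and starts signing decoys
        self.signers.discard(signer_id)

    def accept(self, data: bytes, attestations):
        """attestations: iterable of (signer_id, sha256_hex) pairs.

        Accept the file only if a currently-trusted signer vouches
        for exactly these bytes."""
        digest = hashlib.sha256(data).hexdigest()
        return any(s in self.signers and d == digest
                   for s, d in attestations)
```

Note how revocation is purely local: no central authority needs to agree that a signer has "moved to the dark side" before the downloader stops honoring his signatures.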
On Wed, 31 Jul 2002, Anonymous wrote:
> Such an approach suffers from the "bad guy" occasionally signing a good file, thus placing himself on the trusted signer list.
This assumes a boolean trust metric. What you need is a trust scalar, and a mechanism to prevent Mallory from poisoning it. It should use scarce resources (e.g. crunch) to generate a trust currency in each node, a kind of decentralized mint (nothing crunches quite like a few million boxes on the Net). Clearly there will be some inflation, as systems tend to get faster these days. The algorithm should resist FPGAzation, too (Mallory is inventive).
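The "crunch as trust currency" idea resembles hashcash-style proof of work: each node burns CPU to mint tokens, so Mallory cannot fabricate reputation for free. The sketch below is an assumption-laden illustration, not a real protocol; in practice a memory-hard function (e.g. scrypt) would replace SHA-256 to resist the FPGAzation the author worries about, and the difficulty would be raised over time to counter the inflation he predicts.

```python
import hashlib
import itertools

DIFFICULTY = 12  # leading zero bits required; illustrative value only

def mint_token(node_id: str) -> str:
    """Burn CPU to find a nonce such that sha256(node_id:nonce)
    has DIFFICULTY leading zero bits."""
    for nonce in itertools.count():
        token = f"{node_id}:{nonce}"
        digest = int.from_bytes(hashlib.sha256(token.encode()).digest(), "big")
        if digest >> (256 - DIFFICULTY) == 0:
            return token

def verify_token(token: str) -> bool:
    """Verification is cheap; only minting is expensive."""
    digest = int.from_bytes(hashlib.sha256(token.encode()).digest(), "big")
    return digest >> (256 - DIFFICULTY) == 0
```

The asymmetry is the point: minting takes roughly 2^DIFFICULTY hashes on average, while anyone can verify a token with one hash.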
> A better approach is for the downloader to create his own trusted list, along the lines of PGP web of trust. Ideal for exactly this
The infrastructure needs to be hidden from view. If you query the net for a specific document, those signed by the most trusted parties should come up first. And when you download and sample a document, the GUI should offer positive/negative karma buttons for easy grading.
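The trust-ranked search plus karma buttons could be wired together as in this small sketch. Function names and data shapes are assumptions for illustration.

```python
def rank_results(results, trust):
    """Sort search results so documents signed by the most trusted
    parties come up first.

    results: list of (uri, signer) pairs; trust: signer -> score."""
    return sorted(results, key=lambda r: trust.get(r[1], 0), reverse=True)

def on_karma_click(trust, signer, positive):
    """Handler for the GUI's positive/negative karma buttons: a click
    nudges the signer's trust scalar up or down."""
    trust[signer] = trust.get(signer, 0) + (1 if positive else -1)
```

This keeps the trust machinery invisible to the user, who only ever sees a ranked result list and two buttons.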
> application. The downloader can add and subtract from the trusted signer list at will, with no central control. Since one must expect some trusted signers to get busted and move to the dark side under court order, such downloader control is necessary.
> Problematic is that mp3 and other compression processes do not generate bit-identical files. Two perfect mp3 files may have different md5 hashes, for example. A tool for making bit-identical mp3 files
Doesn't matter, as long as a single good copy gets out and gets amplified. Plus, you can get different cryptohash URIs for minor variations on the content, as long as they're published by somebody trusted.
> from the same digital input is needed, so that a single signed hash can verify the same file from multiple origins.
Anonymous wrote:
> On Tue, 30 Jul 2002 20:51:24 -0700, you wrote:
> > When we approve a file, all the people who approved it before us get added to our trust list, thus helping us select files, and we are told that so-and-so got added to our list of people who recommend good files. This gives people an incentive to rate files, since rating files gives them the ability to take advantage of other people's ratings.
> > [...]
> A better approach is for the downloader to create his own trusted list, along the lines of the PGP web of trust, which is ideal for exactly this application. The downloader can add to and subtract from the trusted-signer list at will, with no central control. Since one must expect some trusted signers to get busted and move to the dark side under court order, such downloader control is necessary.
One practical method that has been, and it seems still remains, popular is a trusted-hub approach. DirectConnect, as a more recent example, allows anyone to set up a central hub and then filter the people connecting to it (e.g. by amount of files shared, or by personal acquaintance), forming a very "localised" peer-to-peer group. This is the same tactic adopted by pre-Napster set-ups such as IRC channels, et al.

The obvious downside is reduced immediate choice: obscurity is naturally exaggerated in comparison to a completely open network. However, smaller groups tend to encourage increased validity of the files being offered, especially when only a small number of people are offering them. This obscurity can be countered in a number of ways - chained networking, in that one person can be in many groups and thus has access to a wider range, coupled with an anonymous request/barter-driven facility, would decrease obscurity without losing much of the validity implicit in trusted groups. History suggests that even in such fragmented environments, content can travel to as many people in as short a time as on an open network.

Under this scenario, the opportunities to spread false files are much more limited, as their scope from origin would be more contained, probably averaging 2 or 3 interlinked groups at most. Not perfect, clearly. But it does seem to be the surviving philosophy.
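A hub-operator admission policy of the kind described could be sketched like this. The threshold and field names are illustrative assumptions, not DirectConnect's actual protocol.

```python
# Sketch of a trusted-hub admission filter: the operator admits peers
# either by personal acquaintance (allowlist) or by minimum share size.

MIN_SHARE_BYTES = 5 * 2**30  # e.g. require 5 GiB shared (arbitrary choice)

def admit(user, allowlist):
    """user: dict with 'nick' and 'share_bytes' keys.

    Personal acquaintances get in regardless of share size; strangers
    must demonstrate commitment by sharing enough files."""
    if user["nick"] in allowlist:
        return True
    return user["share_bytes"] >= MIN_SHARE_BYTES
```

Because each hub applies its own local policy, a poisoned file admitted to one hub does not automatically propagate: it must pass a fresh admission and reputation barrier at each interlinked group.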
participants (3):
- Anonymous
- Eugen Leitl
- Graham Lally