Has anyone tried to create a strong encryption algorithm that cannot be broken by brute force?
Brute force = exhaustive search. Therefore, if there is a solution and the search terminates, the answer will be found. Your point was: can an encryption system be designed such that an exhaustive search yields multiple equally good, different (preferably contradictory) decryptions for any given encrypted message?

In "Communication Theory of Secrecy Systems" (Bell System Technical Journal, vol. 28, pp. 656-715), Shannon measures the efficacy of an encryption system by the average number of plaintext messages that map to an arbitrary ciphertext (via different keys). Later, Hellman (in "An Extension of the Shannon Theory Approach to Cryptography", IEEE Transactions on Information Theory, vol. 23, no. 3, pp. 289-294) emphasized Shannon's point about using compression with encryption so that decryption will yield more false positives. Ross Williams discusses this in his book "Adaptive Data Compression".

Note the limited definition of "meaningful" in these papers as "makes words". Given sufficient context, a correct decryption would not be able to hide in a forest of "meaningful" false positives. (e.g. "Hmmmm, do you think it's 'cats often enjoy', or 'be ready by tuesday'?")

Of course, a one-time pad affords a very large space of meaningful (much more meaningful than just "makes words") decryptions for each encryption, hence its information-theoretically perfect security. A system which provides arbitrary mappings at the message level and no derivable component context enjoys this property as well. (e.g. a code book: 1 --> be ready by tuesday; 2 --> expect a guest. What does the message '1' mean? It can mean any message in the world, exactly as, when using a one-time pad, the 17th character might mean any character in the world.)

So in answer to your question: yes, a one-time pad is just such a system.

--
Scott Collins | catalyst@netcom.com
"Few people realize what tremendous power there is in one of these things." -- Willy Wonka
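Scott's one-time-pad point can be made concrete in a few lines of Python (a sketch added here for illustration; the messages and helper names are my own, not from the post): for any ciphertext, *every* plaintext of the same length is reachable under some key, so an exhaustive key search learns nothing.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"be ready by tuesday"
key = os.urandom(len(plaintext))          # the one-time pad
ciphertext = xor_bytes(plaintext, key)

# An exhaustive search over keys is futile: for any candidate message
# of the same length, some key "decrypts" the ciphertext to it.
candidate = b"cats often enjoy it"        # same length as the real message
bogus_key = xor_bytes(ciphertext, candidate)

assert xor_bytes(ciphertext, key) == plaintext
assert xor_bytes(ciphertext, bogus_key) == candidate
```

Both decryptions are equally consistent with the ciphertext, which is exactly the "forest of meaningful false positives" property.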
Just thought of something (I hope it gives someone a business idea; I have plenty to spare at the moment).

OK: compression, simplified, works (in several of its manifestations, at least) by replacing redundant parts with a single part that represents 1) what the replaced parts are, and 2) how many there are. Thus "feed" could be compressed as "f!d" where ! = "2 e's". I know, I know, this is a terrible oversimplification, but that's the juice of the fruit, no?

OK, well, if you encrypt a compressed file, there are bound to be lots more new redundancies created in the encryption process (unless it is something like ROT-13). Why not compress this AGAIN, squeezing more space out of the data? Sure, you can do this manually, but things like DES are slow. What I am thinking is: have something like zip or compress that compresses, encrypts, then recompresses, and repeats this process until it can compress no more. Compression/extraction time will slow down, but for those that NEED heavy-duty compression, big deal. It shouldn't really be TOO bad, since this almost half-assed encryption need not be secure in any way; it could have a very short key.

Any ideas? What is wrong with this idea? (Something must be, or it would've been done by now, I am guessing.) I don't know the math, so I suspect I must've erred gravely somewhere.

--
Testes saxi solidi! Podex opacus gravedinosus est!
Stanton McCandlish, SysOp: Noise in the Void Data Center BBS
IndraNet: 369:1/1  FidoNet: 1:301/2  Internet: anton@hydra.unm.edu
Snail: 8020 Central SE #405, Albuquerque, New Mexico 87108 USA
Data phone: +1-505-246-8515 (24hr, 1200-14400 v32bis, N-8-1)
Vox phone: +1-505-247-3402 (bps rate varies, depends on if you woke me up... :)
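The "feed" -> "f!d" simplification above is essentially run-length encoding. A toy version in Python (my illustration of the idea, not any real compressor's format):

```python
import itertools

def rle(s):
    """Toy run-length encoding: each (char, count) pair plays the
    role of '!' = "2 e's" in the example above."""
    return [(ch, len(list(group))) for ch, group in itertools.groupby(s)]

print(rle("feed"))  # -> [('f', 1), ('e', 2), ('d', 1)]
```

Real compressors are far more sophisticated, but the core bet is the same: the input contains runs or repeats that can be described more briefly than they are spelled out.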
> Any ideas? What is wrong with this idea? (Something must be, or it
> would've been done by now, I am guessing.) I don't know the math, so I
> suspect I must've erred gravely somewhere.
You have indeed erred gravely :-)

One of the information-theoretical concepts we are dealing with here is that of information density. The whole reason compression works is that in most files, the information density is not "perfect"; that is, there is repeated information in the file. This reflects what we see when we compress a file: the more which is repeated, the better the compression. Graphics compress much better than executables.

Well, one of the reasons encryption works is because I can't tell from the encrypted text what kind of patterns exist. Consider a letter-substitution cipher. If I were to apply one to this message, you could probably decrypt it, because much of the structure is still there: common English words, letter frequencies, etc. This makes letter substitution a pretty poor cipher.

What about DES? Well, this is interesting. Without the key, the information density of an encrypted file looks the same as the density of a compressed file, or of noise. This is why you could claim something was just noise, not encrypted data. It's also why a common "good" PRNG is formed by feeding the numbers through some crypto algorithm, because it makes the numbers appear random.

It is because encrypted data appears to have a very high information density that it will not compress much, if at all. Compressing encrypted data, from some standpoints, is tantamount to actually decrypting it.

Examples: A is a file with 1000 lines of 79 "A"'s followed by a newline. A.Z is the file, compressed. A.x is the file, encrypted (unix crypt, lame, I know). A.x.Z is the encrypted file, compressed with the -f option.

    -rw-rw-r--  1 marc     80000 May 21 17:26 A
    -rw-rw-r--  1 marc      1466 May 21 17:26 A.Z
    -rw-rw-r--  1 marc     80000 May 21 17:47 A.x
    -rw-rw-r--  1 marc    106577 May 21 17:47 A.x.Z

Note that A.x doesn't compress at all. In fact, it grows!

Marc
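Marc's experiment is easy to reproduce. Here is a rough Python equivalent (my sketch: zlib stands in for unix compress, and os.urandom stands in for the encrypted file, since good ciphertext is statistically indistinguishable from random bytes):

```python
import os
import zlib

# Marc's file "A": 1000 lines of 79 "A"'s plus a newline (80000 bytes).
repetitive = (b"A" * 79 + b"\n") * 1000

# Stand-in for "A.x": bytes with no exploitable pattern.
random_ish = os.urandom(len(repetitive))

# The repetitive file shrinks enormously; the "encrypted" one does not
# shrink at all -- format overhead actually makes it slightly larger.
print(len(repetitive), len(zlib.compress(repetitive)))   # huge shrink
print(len(random_ish), len(zlib.compress(random_ish)))   # slight growth
```

The second print is the whole rebuttal to the compress-encrypt-recompress loop: after one round of encryption there is no residual redundancy left for a second compression pass to exploit.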
Stanton McCandlish wrote:

> OK, well, if you encrypt a compressed file, there are bound to be lots
> more new redundancies created in the encryption process

In fact there are not. You can test this out; use PGP to encrypt any file you please, and then use any compression software you like to compress it. You will get no significant compression.

--
Eric Watt Forste  arkuat@joes.garage.com  1800 Market St #243, San Francisco CA 94102
"Expectation foils perception." -- Pamela C. Dean
> > OK, well, if you encrypt a compressed file, there are bound to be lots
> > more new redundancies created in the encryption process
>
> In fact there are not. You can test this out; use PGP to encrypt any
> file you please, and then use any compression software you like to
> compress it. You will get no significant compression.
Isn't encrypted data supposed to be random, and thus not compressible? You might be able to create some redundancies by decrypting it, though.

--
Ian S. Nelson  I speak for only myself.  Finger for my PGP key.
If you are a beautiful woman, it is mandatory that you reply to this message.
participants (5)
- catalyst@netcom.com
- Eric Watt Forste
- Ian S. Nelson
- Marc Horowitz
- Stanton McCandlish