Random Data Compressed 100:1 (Guffaw)

Michael Motyka mmotyka at lsil.com
Tue Jan 8 08:51:09 PST 2002


Eric Cordian <emc at artifact.psychedelic.net> wrote:
>Someone else needs to read the comp.compression FAQ.
>
>http://www.reuters.com/news_article.jhtml?type=technologynews&StoryID=498720
>
>-----
>
>NEW YORK (Reuters) - A Florida research start-up working with a team of
>renowned mathematicians said on Monday it had achieved a breakthrough that
>overcomes the previously known limits of compression used to store and
>transmit data.
>...
>ZeoSync said its scientific team had succeeded on a small scale in
>compressing random information sequences in such a way as to allow the
>same data to be compressed more than 100 times over -- with no data loss.
>That would be at least an order of magnitude beyond current known
>algorithms for compacting data.
>...
>-- 
>Eric Michael Cordian 0+
>
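The objection behind the comp.compression FAQ pointer is the plain counting
argument: a lossless compressor has to be invertible, and there simply are
not enough shorter strings for every block of truly random data to map to
one. A minimal sketch of the arithmetic in Python (the 128-bit block size is
purely illustrative, not anything ZeoSync has described):

    # Counting argument: no lossless scheme can shorten every n-bit block,
    # let alone shrink random blocks 100:1 on average.
    n = 128                      # illustrative block size in bits
    inputs = 2 ** n              # number of distinct n-bit blocks
    shorter = 2 ** n - 1         # strings of 0..n-1 bits: 2^0 + ... + 2^(n-1)
    assert shorter < inputs      # pigeonhole: some block gets no shorter codeword

    # At 100:1 the shortfall is astronomical: roughly 2^(n//100) codewords
    # would have to cover 2^n equally likely blocks.
    print(f"{inputs / 2 ** (n // 100):.3e} random blocks per 100:1 codeword")
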
There may be a way to derive and patch together pseudorandom sequence
generators; who knows. If there is anything real about this, I wonder how
long it would take to compress large blocks of arbitrary input data.
Geological time scales, anyone?
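
For what it's worth, the experiment is cheap to run: hand genuinely random
bytes to an ordinary lossless compressor and you get back slightly more data
than you put in, not one-hundredth of it. A quick sketch using Python's zlib,
chosen just as a convenient stand-in for any real compressor:

    import os
    import zlib

    block = os.urandom(1 << 20)        # 1 MiB of OS-supplied random bytes
    packed = zlib.compress(block, 9)   # DEFLATE at maximum effort

    ratio = len(block) / len(packed)
    print(f"in={len(block)} out={len(packed)} ratio={ratio:.4f}")
    # Typical result: ratio just under 1.0 -- the "compressed" block is a
    # few bytes bigger than the original, nowhere near 100:1.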

Mike




