[algorithm to] store data the same way the human brain does. [stored data would] take up only [0.5%] of the original space.
Who said the human brain stores data compressed to 0.5% of its original size, and what is its "original size" anyway? Paul Baclace says:
Sounds like Bugajsky creates a generative grammar and then stores a list of productions that specifies a walk on the derivation tree to extract the data. This is a form of Kolmogorov-complexity compression, which has been expanded upon most notably by Chaitin.
I agree. The description sounds more like this than anything else I'm familiar with. Paul Baclace goes on to say:
I wonder whether [he] includes the size of his grammar in [the claim].
0.5% is a questionable claim. If it includes the grammar, then the grammar must be very simple, and the data of very low entropy with respect to it -- in which case 0.5% would be an uninteresting experimental result. If the claim does _not_ include the size of the grammar, then the claim is useless for evaluating this scheme.

Scott Collins          | "Few people realize what tremendous power there
                       |  is in one of these things." -- Willy Wonka
.......................|...............................................
BUSINESS. voice:408.862.0540 fax:974.6094 collins@newton.apple.com
Apple Computer, Inc. 5 Infinite Loop, MS 305-2B Cupertino, CA 95014
.......................................................................
PERSONAL. voice/fax:408.257.1746 1024:669687 catalyst@netcom.com
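For concreteness, here is a toy Python sketch of the kind of scheme Paul describes -- my own illustration, not Bugajsky's algorithm, whose details we don't have. The grammar, the sample string, and every name in it are invented. It stores a string as the sequence of production choices in a leftmost derivation (a "walk" on the derivation tree) and then compares sizes with and without the grammar, which is exactly the accounting question above.

# Toy grammar-walk compression sketch (illustrative only; not Bugajsky's
# method).  A string is encoded as the list of production indices used in a
# leftmost derivation from "S"; decoding replays that walk.  The grammar and
# data below are invented, and a real system would also have to build the
# grammar from the data and decide whether to count its size.

from typing import Dict, List, Tuple

# Nonterminal -> list of alternative right-hand sides (tuples of symbols).
# Single lowercase characters are terminals.
GRAMMAR: Dict[str, List[Tuple[str, ...]]] = {
    "S": [("A", "A"), ("A", "B")],
    "A": [("a", "b"), ("a", "A", "b")],
    "B": [("c",), ("c", "B")],
}

def encode(data: str) -> List[int]:
    """Search (with backtracking) for a leftmost derivation of `data` from
    "S" and return the indices of the productions used, in order."""
    walk: List[int] = []

    def derive(sentential: List[str], pos: int, choices: List[int]) -> bool:
        if not sentential:
            if pos == len(data):
                walk.extend(choices)
                return True
            return False
        head, rest = sentential[0], sentential[1:]
        if head not in GRAMMAR:                  # terminal: must match input
            return pos < len(data) and data[pos] == head \
                and derive(rest, pos + 1, choices)
        for i, rhs in enumerate(GRAMMAR[head]):  # nonterminal: try each rule
            if derive(list(rhs) + rest, pos, choices + [i]):
                return True
        return False

    if not derive(["S"], 0, []):
        raise ValueError("data is not derivable from this grammar")
    return walk

def decode(walk: List[int]) -> str:
    """Replay the production choices leftmost-first to regenerate the data."""
    out: List[str] = []
    stack, k = ["S"], 0
    while stack:
        sym = stack.pop(0)
        if sym in GRAMMAR:
            stack = list(GRAMMAR[sym][walk[k]]) + stack
            k += 1
        else:
            out.append(sym)
    return "".join(out)

if __name__ == "__main__":
    data = "aabbccc"
    walk = encode(data)
    assert decode(walk) == data

    # The accounting question: is the grammar part of the compressed output?
    walk_bytes = len(bytes(walk))                 # one byte per choice here
    grammar_bytes = len(repr(GRAMMAR).encode())   # crude proxy for grammar size
    print("original:       ", len(data.encode()), "bytes")
    print("walk only:      ", walk_bytes, "bytes")
    print("walk + grammar: ", walk_bytes + grammar_bytes, "bytes")

On this tiny example the walk alone is smaller than the input, but walk-plus-grammar is far larger; any headline ratio like 0.5% only means something once you know which of those two totals is being reported.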