On Tue, 26 Jul 1994, Ray Cromwell wrote:

> 1) continuum phenomena are real and space is not merely quantized at a
>    level which is undetectable by experiment (just because physics
>    models it as a continuum doesn't mean it is so)

true.

> 2) all of this precision actually makes a difference

true.
> For instance, at the level of brain chemistry, who cares about quantum
> precision when thermal noise will swamp it anyway? (The Penrose
> argument even goes as far as assuming quantum gravity, a pitifully
> weak force, as a significant factor.)

What does that have to do with the above?
> One of the reasons digital manipulation became popular was because
> analog data was too prone to error. Why will a quantum computer, which
> seems even more sensitive to external perturbation, be any different?

Are you trying to say that things have to be digital to have noise
immunity? If so, you are totally wrong. Examples abound from analog
electronics, specifically in transmission.
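One classic transmission example of analog noise immunity is putting the
information in frequency rather than amplitude. The sketch below (my own
illustration, not from the original post; all names and parameters are
invented) encodes bits as two analog tones, binary FSK style, corrupts
the waveform with heavy amplitude noise, and still decodes every bit by
correlating each bit period against the two reference tones:

```python
import math
import random

def fsk_demo(bits, f0=1000.0, f1=2000.0, fs=16000.0,
             noise_std=0.4, seed=1):
    """Encode bits as analog tones (binary FSK), add amplitude noise,
    then decode by correlating each bit period against reference tones.
    The information lives in the frequency, so amplitude noise that
    would destroy an amplitude-coded signal is rejected."""
    rng = random.Random(seed)
    n = int(fs / 100)  # 10 ms per bit -> 160 samples per symbol
    decoded = []
    for bit in bits:
        f = f1 if bit else f0
        # noisy analog waveform for this bit period
        x = [math.sin(2 * math.pi * f * t / fs) + rng.gauss(0, noise_std)
             for t in range(n)]

        def energy(fc):
            # matched-filter energy at candidate frequency fc
            s = sum(x[t] * math.sin(2 * math.pi * fc * t / fs)
                    for t in range(n))
            c = sum(x[t] * math.cos(2 * math.pi * fc * t / fs)
                    for t in range(n))
            return s * s + c * c

        decoded.append(1 if energy(f1) > energy(f0) else 0)
    return decoded

bits = [1, 0, 1, 1, 0, 0, 1, 0]
print(fsk_demo(bits) == bits)
```

With ten full cycles of each tone per symbol the two references are
orthogonal, so even per-sample noise comparable to the signal amplitude
leaves the decision untouched.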
> And regardless of whether quantum computers work or not, they are
> still algorithmic if they can be simulated (however slowly) by a
> turing machine. It's a rigorous mathematical definition. Claiming
> otherwise uses "algorithm" in a manner different than was intended.

Sure, I never said otherwise, just that it is conceivable that some
continuum phenomena can't be described algorithmically AT ALL.

> It's like the way Ludwig Plutonium solves all those famous problems in
> sci.math by assuming different definitions of primality, etc. Quantum
> computers might be faster than classical computers, but
> non-algorithmic, I don't think so.

Hmmm, argument by plutonium? Try again.
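The definition the quoted paragraph leans on, "algorithmic" meaning
"simulable by a Turing machine", is easy to make concrete. A minimal
sketch (names like run_tm and INC are my own, not anyone's standard
library): a transition table mapping (state, symbol) to (new state,
symbol to write, head move), run until no rule applies. The example
machine increments a binary number:

```python
def run_tm(rules, start, tape_str, blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine. `rules` maps
    (state, symbol) -> (new_state, write_symbol, move), move in {L, R}.
    Halts when no rule applies; returns the non-blank tape contents."""
    tape = {i: ch for i, ch in enumerate(tape_str)}
    state, head, steps = start, 0, 0
    while (state, tape.get(head, blank)) in rules:
        if steps > max_steps:
            raise RuntimeError("step limit exceeded")
        state, write, move = rules[(state, tape.get(head, blank))]
        tape[head] = write
        head += 1 if move == "R" else -1
        steps += 1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Binary increment: scan right to the end, then propagate the carry left.
INC = {
    ("scan", "0"): ("scan", "0", "R"),
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("done", "1", "L"),
    ("carry", "_"): ("done", "1", "L"),
}
print(run_tm(INC, "scan", "1011"))  # -> 1100
```

Anything expressible as such a table, however slowly it runs, is
algorithmic in the rigorous sense; the open question in this thread is
whether continuum physics can always be captured by one.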
Berzerk.