James A. Donald writes:
I was referring to the proposed quantum computers.
Since such machines do not operate algorithmically
This statement is exactly wrong. Such machines *define* a class of algorithms.
I recommend that you read the following paper.
E. Bernstein and U. Vazirani, "Quantum Complexity Theory", Proc. 25th ACM Symposium on Theory of Computing, pp. 11-20 (1993).
James, without reading the paper, can you tell me why the following argument is incorrect?

1) By definition, if something can be computed by a Turing machine, then it is an algorithm (Lewis and Papadimitriou).

2) A quantum computer can be simulated by a TM with exponential slowdown. (Claimed by you on the Extropians list, but also claimed by Feynman, I believe, not about quantum computers, but quantum systems in general.)

Then by (1) and (2), it follows that

3) quantum computers are algorithmic (if not, it would contradict 2, and possibly 1).

It doesn't matter how slow the Turing machine runs the simulation, because we allow an arbitrary time along with the infinite tape to complete the computation. -Ray
Ray writes
1) By definition, if something can be computed by a Turing machine, then it is an algorithm (Lewis and Papadimitriou).
Suppose we have a spatial transform performed by light flowing through a grid. Is that an algorithm? Perhaps it is, but I am about to describe a case that will stretch your definition of algorithm rather more drastically.
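For concreteness, the transform the light performs in one shot is, on paper, just a discrete Fourier transform, and that can be written out as an explicit step-by-step procedure. A minimal sketch, assuming the one-dimensional case for simplicity:

    # Minimal sketch: the discrete Fourier transform written as an explicit
    # step-by-step procedure (1-D case assumed for simplicity). A grid or lens
    # performs this "all at once" with light; the loops below perform it one
    # multiply-add at a time.
    import cmath

    def dft(signal):
        n = len(signal)
        out = []
        for k in range(n):                  # one output frequency bin at a time
            total = 0j
            for t, x in enumerate(signal):  # sum over all input samples
                total += x * cmath.exp(-2j * cmath.pi * k * t / n)
            out.append(total)
        return out

    print(dft([1, 0, 0, 0]))   # impulse in -> flat spectrum out: [1, 1, 1, 1]

The physics computes all the sums simultaneously; the loops compute them one multiply-add at a time, which is the sense in which the same transform is, or is not, "an algorithm."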
2) A quantum computer can be simulated by a TM with exponential slowdown. (Claimed by you on the Extropians list, but also claimed by Feynman, I believe, not about quantum computers, but quantum systems in general.)
True.
then by (1) and (2), it follows that 3) quantum computers are algorithmic (if not, it would contradict 2) and possibly 1)
Suppose our quantum system has thirty-two bytes. Then a classical simulation of our quantum system would require 2^257 words of memory. The computer would require more matter than exists in the universe. Each step of the simulation would require 2^514 steps by the computer, which even for a computer constructed of very tiny components out of all the matter in the universe would still take vastly longer than the entire lifetime of the universe.
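To make the blow-up concrete: a classical simulation has to track one complex amplitude per basis state, so n qubits need 2^n amplitudes, and every gate touches all of them. A minimal sketch (the single-gate example and names are illustrative only, not any particular simulator's API):

    # Minimal sketch of why classical simulation blows up: an n-qubit state
    # is a vector of 2^n complex amplitudes, and every gate touches all of
    # them. 32 bytes = 256 qubits would need 2^256 amplitudes.

    def hadamard_on_qubit(state, qubit, n):
        """Apply a Hadamard gate to one qubit of an n-qubit statevector."""
        s = 2 ** 0.5
        new = [0j] * (1 << n)                 # 2^n amplitudes, for one gate
        for i, amp in enumerate(state):
            j = i ^ (1 << qubit)              # basis state with that qubit flipped
            if i & (1 << qubit) == 0:
                new[i] += amp / s
                new[j] += amp / s
            else:
                new[j] += amp / s
                new[i] -= amp / s
        return new

    n = 3                                     # fine for 3 qubits; hopeless for 256
    state = [0j] * (1 << n)
    state[0] = 1 + 0j                         # start in |000>
    state = hadamard_on_qubit(state, 0, n)
    print(state[:2])                          # amplitude spreads: ~0.707 in each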
It doesn't matter how slow the Turing machine runs the simulation, because we allow an arbitrary time along with the infinite tape to complete the computation. -Ray
It does not sound like a very useful algorithm, nor is it one that is easy to describe. The difference is like the difference in my example of light flowing through a grid, as against a Fourier transform, etc., but the difference is enormously greater. You say it makes no difference by definition. I say such definitions are misleading when we discuss how problems are to be solved.

--
---------------------------------------------------------------------
We have the right to defend ourselves and our property, because of the
kind of animals that we are. True law derives from this right, not from
the arbitrary power of the omnipotent state.

James A. Donald
jamesd@netcom.com
James A. Donald says:
Ray writes
1) By definition, if something can be computed by a Turing machine, then it is an algorithm (Lewis and Papadimitriou).
Suppose we have a spatial transform performed by light flowing through a grid. Is that an algorithm? Perhaps it is, but I am about to describe a case that will stretch your definition of algorithm rather more drastically.
Suppose I have a frog. Is that an algorithm? Obviously not. On the other hand, suppose I define something that takes an input tape and turns it into an output tape. Is that something in the space of things we are talking about? Yes.

The Church-Turing thesis is that if you are talking about the space of "things that turn input tapes into output tapes and end in particular states", Turing machines are capable of doing any sort of transformation other things can, although perhaps taking longer to do so. I can believe that (possibly) quantum computers are faster, but it would be truly shocking to discover that they did some things that Turing machines couldn't, given enough time.

Perry
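A minimal sketch of that tape-to-tape framing, with a toy machine whose transition table (illustrative only, not from any source) inverts a binary string and halts in a particular state:

    # Minimal sketch of "something that turns an input tape into an output
    # tape": a toy Turing machine that inverts a binary string. Any
    # tape-to-tape transformation fits this same skeleton.

    def run_tm(tape, rules, state="start", blank="_"):
        tape = list(tape)
        pos = 0
        while state != "halt":
            symbol = tape[pos] if 0 <= pos < len(tape) else blank
            state, write, move = rules[(state, symbol)]
            if pos == len(tape):
                tape.append(blank)            # extend the tape on demand
            tape[pos] = write
            pos += 1 if move == "R" else -1
        return "".join(tape)

    # Flip every bit, then halt on the blank after the input.
    rules = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }

    print(run_tm("10110", rules))   # -> "01001_"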
James A. Donald writes:
Ray writes
1) By definition, if something can be computed by a Turing machine, then it is an algorithm (Lewis and Papadimitriou). 2) A quantum computer can be simulated by a TM with exponential slowdown. (Claimed by you on the Extropians list, but also claimed by Feynman, I believe, not about quantum computers, but quantum systems in general.)
True.
Therefore it is an algorithm.
then by (1) and (2), it follows that 3) quantum computers are algorithmic (if not, it would contradict 2) and possibly 1)
Suppose our quantum system has thirty-two bytes.
Then a classical simulation of our quantum system would require 2^257 words of memory.
The computer would require more matter than exists in the universe.
Each step of the simulation would require 2^514 steps by the computer, which even for a computer constructed of very tiny components out of all the matter in the universe would still take vastly longer than the entire lifetime of the universe.
We are not talking about physical computers; we are talking about Turing machines. If there is some *finite* deterministic process to get from the initial data to the final result, no matter how long it takes, it is an algorithm. I'm sure I could hand you a composite number that would require a computer larger and older than the universe to factor. Does that prove that none of our current factoring algorithms are actually algorithms, or that brute force isn't an algorithm? If you have a different definition of "algorithm" then perhaps your argument is right, but to me, an algorithm is a process to get from A to B, regardless of how long it takes.
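To make the brute-force point concrete, a minimal sketch of trial division, a finite deterministic procedure, hence an algorithm, no matter how absurd its running time on a large composite:

    # Minimal sketch: trial-division factoring. A finite, deterministic
    # procedure -- hence an algorithm -- even though for a few-hundred-digit
    # composite it would outlive the universe.

    def factor(n):
        """Return the prime factorization of n as a list of primes."""
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)       # whatever remains is prime
        return factors

    print(factor(2**16 + 1))        # 65537 is prime: [65537]
    print(factor(3 * 5 * 7 * 11))   # [3, 5, 7, 11]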
It doesn't matter how slow the Turing machine runs the simulation, because we allow an arbitrary time along with the infinite tape to complete the computation. -Ray
It does not sound like a very useful algorithm, nor is it one that is easy to describe.
Usefulness is a matter of time complexity, not a condition for membership in the set of algorithms.
The difference is like the difference in my example of light flowing through a grid, as against a Fourier transform, etc., but the difference is enormously greater.
You say it makes no difference by definition. I say such definitions are misleading when we discuss how problems are to be solved.
Those definitions were invented to solve problems in the first place. I can't think of a single thing which is non-algorithmic except true randomness or non-determinism. Since no finite axiom system can prove whether a string is truly random, no algorithm is possible for generating or proving them. (Anything with infinite logical depth would also probably suffice.) Err, I may be mistaken, since I recall that Chaitin said that you need N bits of formal axioms to prove that an N-bit string is "elegant" (the smallest representation), but I also recall somewhere that a truly random string needs an infinite set of axioms. Perhaps Tim can shed some light.

Perhaps another example is a physical process able to solve the halting problem. Imagine a time-traveling UTM. Call it as a subroutine. All it does is run your program and wait. If the program ever halts, it sends a signal back in time; otherwise it runs forever. Thus, you feed the TT-UTM the algorithm you want to check. If the program halts, the signal travels back in time from the far future to arrive during the next "tick" of your current program. If you receive no such signal, then either the universe died before the algorithm halted, the machine broke down, or the algorithm doesn't halt.

The traditional "proof by contradiction" of the insolubility of the halting problem doesn't work here. The algorithm used to test the contradiction simply doesn't halt. It calls the TT-UTM recursively forever, and creates an infinite number of them. In fact, this questions the validity of the halting proof itself, since the contradiction derived isn't a valid input to the halt-checking machine in the first place; or else the halting proof logically disproves the existence of time travel! Inputting an algorithm to the halt checker which itself calls the halt checker should be considered an exception like "division by zero", in which case the halt-checking TT-UTM returns "exception: input algorithm recurses forever."

Thus, two new classes of algorithms are developed: those checkable by a TT-UTM and those which are not. Those which are not should be left up to an even more powerful machine. ;-) (This violates the conditions of Church's thesis, since the machine can perform an infinity of calculation at each step. Oh well.)

-Ray "Everything is an algorithm, even you!"
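For reference, the classical contradiction being attacked here looks like the following sketch, where halts() stands for the hypothetical oracle (TT-UTM or otherwise) and is, of course, not implementable:

    # Sketch of the classical halting-problem contradiction. halts(p, d) is
    # the hypothetical oracle -- TT-UTM or otherwise -- not anything
    # implementable; it is assumed to always answer True or False.

    def halts(program, data):
        raise NotImplementedError("hypothetical oracle")

    def contrary(program):
        # Do the opposite of whatever the oracle predicts about
        # running `program` on its own source.
        if halts(program, program):
            while True:             # predicted to halt? then loop forever
                pass
        else:
            return                  # predicted to loop? then halt at once

    # The contradiction: feed `contrary` its own source. If the oracle says
    # it halts, it loops; if the oracle says it loops, it halts. The
    # objection above is that `contrary` is not a valid input in the first
    # place, because it invokes the checker on itself.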
We are not talking about physical computers; we are talking about Turing machines. If there is some *finite* deterministic process to get from the initial data to the final result, no matter how long it takes, it is an algorithm.
I don't see the need for determinism; it depends on the underlying computational model.
I can't think of a single thing which is non-algorithmic except true randomness or non-determinism.
The "essence" of nondeterminism may not be algorithmic, but I don't see why that's important. If nondeterminism can be sufficiently characterized that I can express an algorithmic process involving it (and of course we can; that's how NP problems are expressed) then my boat floats.