The late Nobel laureate Richard Feynman became very interested in the physics of computation towards the end of his life. My understanding is that he concluded there is no apparent limit on the amount of computation that can be done with a given amount of free energy. Computation may indeed always dissipate energy, but Feynman's conclusion was that this dissipation can be made arbitrarily small -- that there is no fundamental quantum limit on the amount of computation that can be performed at a given mass-energy scale.
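For what it's worth, the standard result this discussion circles around (not mentioned in either post, so take this as my own gloss) is Landauer's principle: erasing one bit of information in an *irreversible* computation costs at least k_B * T * ln 2 of dissipated energy. Feynman's point, as I understand it, is that logically reversible computation erases nothing and so can, in principle, beat even this bound. A quick back-of-the-envelope sketch:

```python
import math

# Boltzmann constant in J/K (CODATA value)
k_B = 1.380649e-23

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules dissipated when one bit is erased,
    per Landauer's principle: E >= k_B * T * ln 2.
    Reversible logic avoids erasure and thus is not bound by this."""
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the bound is minuscule:
print(landauer_limit(300.0))  # roughly 2.9e-21 J per bit erased
```

Note the bound scales linearly with temperature, so it only bites at all for extraordinarily dense or cold computers; today's hardware dissipates many orders of magnitude more than this per logic operation.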
Actually, I _think_ I've read an article in a pop-science magazine about some work of Hawking's indicating that there is a minimum amount of energy necessary to perform some sort of quantum of computation. (Is there such a thing? I don't know enough about the math, I'm afraid. Information theory?) If my memory serves, he used this to hint at a solution to the question "why does time only flow in one direction, when the mathematics is perfectly symmetric both ways?" But I could be wrong. Sorry I don't have any better info than you.