Wednesday, June 17, 2009

Algorithm Efficiency

I made a first pass through Charles Bennett's paper "The Thermodynamics of Computation - A Review." There are a number of interesting parts.

One that is just cool is a description of how to make a computer out of billiard balls and rigid walls that theoretically dissipates no energy for computation, while still making measurable progress on the computation. It has a few disadvantages. It is vulnerable to outside influence, so it must be thermally isolated, or the computations cease to be accurate after a handful of iterations. It is in fact so sensitive that the tidal forces from the atmospheric turbulence of nearby extra-solar stars would destroy the algorithm within a few hundred computations.
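To get a feel for that sensitivity, here is a toy model of my own (not from the paper): assume each collision roughly doubles any small error in a ball's position, so an initial perturbation of size epsilon overwhelms a ball of radius r after only about log2(r/epsilon) collisions.

    import math

    # Toy model of trajectory divergence in a billiard-ball computer.
    # Assumption (mine, not Bennett's): each collision amplifies a small
    # positional error by a fixed factor, so the error grows exponentially.
    def collisions_until_failure(initial_error_m, ball_radius_m, gain_per_collision=2.0):
        """Number of collisions before the error is as large as the ball itself."""
        return math.log(ball_radius_m / initial_error_m) / math.log(gain_per_collision)

    # A perturbation of 1e-30 m on a 1 cm ball survives only ~90 collisions.
    print(collisions_until_failure(1e-30, 1e-2))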

Bennett and others have also proposed a Brownian motion computer (a Turing machine). In the presence of a weak driving force, the algorithm drifts toward its conclusion. The faster it is driven, the more energy the computer dissipates per step, but in the limit of zero speed it is a lossless computer. The chemical steps of RNA synthesis by RNA polymerase can act as the elements of a Brownian motion Turing machine.
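A minimal sketch of the speed/dissipation trade-off, under my own simplification that the Brownian computer is a biased random walk: a driving energy of d (in units of kT) per step biases the forward rate over the backward rate, so the drift speed grows with d, and the dissipation per completed step is simply d. As d goes to zero, both the speed and the dissipation per step vanish together.

    import math

    # Biased-random-walk picture of a Brownian computer (my simplification).
    # Forward and backward rates obey detailed balance with a driving energy
    # `drive_kT` per step; dissipation per net forward step is drive_kT.
    def drift_speed(drive_kT, attempt_rate=1.0):
        """Net forward steps per unit time for a given driving energy (in kT)."""
        forward = attempt_rate * math.exp(+drive_kT / 2)
        backward = attempt_rate * math.exp(-drive_kT / 2)
        return forward - backward

    for drive in (0.01, 0.1, 1.0, 10.0):
        print(f"drive = {drive:5.2f} kT  speed = {drift_speed(drive):8.3f}  dissipation/step = {drive} kT")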

Bennett measures the energy dissipated by a computation in units of kT, roughly the thermal energy of a single molecule of the system. At room temperature, kT ~ 10^-21 J. RNA synthesis by RNA polymerase dissipates about 20kT per step, while a digital computer dissipates about 10^11 kT per operation (at least as of 1982).
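Just to put numbers on those figures (my own arithmetic, using the Boltzmann constant and an assumed room temperature of 300 K):

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # assumed room temperature, K

    kT = k_B * T
    print(f"kT at 300 K       = {kT:.2e} J")            # ~4e-21 J, i.e. ~10^-21 J
    print(f"RNA polymerase    = {20 * kT:.2e} J/step")   # ~20 kT
    print(f"1982-era computer = {1e11 * kT:.2e} J/op")   # ~10^11 kT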

Bennett's paper demonstrates that the measurement required by Maxwell's Demon does not increase the entropy of the system (or its environment), provided the measuring device starts in a known state. The entropy change in the system arises from the necessity of returning the measurement device to a known state. That erasure increases entropy by k ln(2) per bit, which at temperature T costs an energy of kT ln(2), exactly matching the work extracted by Maxwell's Demon.
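A quick check of that bookkeeping (my own worked example, assuming 300 K): the work the Demon can extract by letting a one-molecule gas expand isothermally to twice its volume is the integral of kT/V dV from V to 2V, which is kT ln(2), and the Landauer cost of resetting the Demon's one-bit memory is the same kT ln(2), so there is no net gain.

    import math

    k_B = 1.380649e-23  # J/K
    T = 300.0           # assumed temperature, K

    # Work extracted: isothermal expansion of a one-molecule gas, V -> 2V.
    work_extracted = k_B * T * math.log(2)

    # Cost of resetting the Demon's one-bit memory (Landauer limit).
    erasure_cost = k_B * T * math.log(2)

    print(f"work extracted = {work_extracted:.2e} J")
    print(f"erasure cost   = {erasure_cost:.2e} J")  # identical, so the Demon breaks even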

I need to read through the article more carefully, but I believe we can take the minimum entropy added by a computation to be the change in entropy of all of the information encoded by the algorithm, the computer's memory, and the input stream feeding the algorithm, from the time the algorithm starts until it finishes. We may not actually have to consider the entropy of the input stream separately: the complexity of the algorithm required to extract the answer from input streams of varying entropy may already account for this value. It may also be that the complexity of the algorithm can be predicted from the entropy change in going from the input data stream to the output data stream. This is probably related to why programs that handle free-form data entry are much more complicated than programs that handle data entered in a rigid format.
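As a rough illustration of that last point (my own sketch, using per-character Shannon entropy as a stand-in for the entropy of an input stream): a rigidly formatted record carries noticeably less entropy per character than a free-form one, and the parser it needs is correspondingly simpler.

    import math
    from collections import Counter

    def shannon_entropy_per_char(text):
        """Empirical Shannon entropy (bits per character) of a string."""
        counts = Counter(text)
        total = len(text)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    # Hypothetical example inputs: the same dates in a rigid and a free-form encoding.
    rigid_format = "1985-06-17;1985-07-04;1986-01-01;1986-12-25"
    free_form    = "June 17th '85, the 4th of July, New Year's Day 1986, Xmas"

    print(f"rigid format: {shannon_entropy_per_char(rigid_format):.2f} bits/char")
    print(f"free form:    {shannon_entropy_per_char(free_form):.2f} bits/char")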

We can look at the efficiency of an algorithm by determining how much more entropy is created in executing it than is strictly required to solve the problem. The efficiency of an algorithm should be directly measurable, since there should be a correspondence between the entropy created by a processor and the entropy of the algorithm it is executing. It also appears that the speed of computation affects the entropy dissipated. Certainly, for the Brownian motion computers, the speed of computation is related to the amount of energy expended per computation.
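One crude way to estimate the "strictly required" part (my own sketch, counting logically erased bits rather than physical entropy): an irreversible gate like AND maps two input bits to one output bit and so must erase at least one bit, while a reversible construction such as a Toffoli gate keeps enough of its inputs around that nothing has to be erased until the scratch bits are cleaned up.

    import math

    k_B = 1.380649e-23  # J/K
    T = 300.0           # assumed temperature, K

    def min_erasure_bits(input_bits, output_bits):
        """Crude lower bound on bits erased by a logically irreversible map (uniform inputs)."""
        return max(input_bits - output_bits, 0)

    # Irreversible AND: 2 bits in, 1 bit out -> at least 1 bit erased.
    and_bits = min_erasure_bits(2, 1)
    # Toffoli (reversible AND): 3 bits in, 3 bits out -> nothing erased yet.
    toffoli_bits = min_erasure_bits(3, 3)

    print(f"AND gate:     >= {and_bits} bit erased, >= {and_bits * k_B * T * math.log(2):.2e} J")
    print(f"Toffoli gate: >= {toffoli_bits} bits erased")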

I wonder whether ultra-low temperature computers would be slow. The kT term is small, but molecular interaction rates and velocities should be correspondingly small.
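A back-of-the-envelope sketch of that intuition (my own, not from the paper): thermal velocities scale as the square root of kT/m, and thermally activated reaction rates fall off as exp(-E_a/kT), so both drop sharply as the temperature drops.

    import math

    k_B = 1.380649e-23    # J/K
    m = 5e-26             # assumed molecular mass, kg (roughly a small molecule)
    E_a = 50 * k_B * 300  # assumed activation energy, ~50 kT at room temperature

    def thermal_speed(T):
        """Characteristic thermal speed, sqrt(kT/m), in m/s."""
        return math.sqrt(k_B * T / m)

    def relative_rate(T):
        """Arrhenius factor exp(-E_a/kT), normalized to 1 at 300 K."""
        return math.exp(-E_a / (k_B * T)) / math.exp(-E_a / (k_B * 300))

    for T in (300, 77, 4):
        print(f"T = {T:4d} K  speed = {thermal_speed(T):7.1f} m/s  relative rate = {relative_rate(T):.2e}")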
