Friday, June 12, 2009

Algorithm Efficiency

Is there a theoretically most efficient algorithm for performing a particular task?

I began wondering about this question while working on a signal processing project using a processor with very limited computation and memory. Inefficiency in design limits the system's ability to perform other tasks, so it would be useful to know how close to ideal an algorithm is. Furthermore, such a measure would bound the tasks a system is capable of performing.

Driving home last night, I thought about Maxwell's Demon. I remember reading that the missing work in the description of the system is contained in the intelligence of the demon and its measurements of the environment. I am also vaguely aware that information and its transmission have associated entropy measures.
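For the information side, the usual measure is Shannon entropy: the average number of bits per symbol needed to encode a message, given its symbol frequencies. A quick sketch of my own (illustration only, estimating entropy from empirical frequencies):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy in bits per symbol, estimated from the
    empirical symbol frequencies of the message."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits per symbol: fully predictable
print(shannon_entropy("abab"))  # 1.0 bit per symbol: a fair coin flip
```

A lower-entropy message is more predictable, which is exactly the sense in which the demon's sorting "creates order."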

Would it be possible to express a task as a Maxwell's Demon lowering the entropy of information? The optimal efficiency in performing the task would then be related to the minimal work required to produce that change in entropy.

If I recall my statistical physics properly, the work required to change the entropy of a system also depends upon the temperature of the system. In statistical physics, the temperature is related to the speed of the gas molecules making up the system. Is it possible to define an analogous notion for information related to the rate of information arriving or the speed of computation?
