There are several ways in which one might try to build a demon. The simplest of these are purely mechanical. Consider two gas-filled chambers, A and B, connected by a tunnel and at equilibrium, so that their pressures and temperatures are equal. If a pressure difference could be created, it could be used to do work by allowing one chamber to expand into the other. The demon here is a one-way valve in the tunnel, which allows molecules to pass through it in one direction but not the other. This could be a trap door held closed either by a spring or by a weight, as in Figure 1.
Molecules hitting the trap door from the right would be deflected, because the trap door can only be opened from the left. If a particle hitting it from the left has enough energy, though, it may knock the door open and pass into the right chamber. One might suppose that a pressure difference could be built up in this manner. Marian Smoluchowski, who first discussed this mechanical demon, argued that it would not actually work, because the entropy of the door itself would rise by enough to offset the reduction in the entropy of the gas. If the trap door is built in such a way that it shuts after allowing an appropriate molecule through, and stays shut until another appropriate molecule arrives, then it must somehow dissipate the energy it acquired while opening. A macroscopic spring-loaded door does this by heating itself and the door frame when it slams shut, thereby raising the entropy of the door and frame. An elastic microscopic trap door, by contrast, would continue to bounce wildly, making it ineffective as a demon: molecules moving from A to B might be bounced back by the door, while molecules from B could slip through to A while the door was bounced open. The mechanical demon therefore cannot be used to reduce the entropy of the gas; beyond the first few collisions, any drop in the entropy of the gas is offset by a rise in the entropy of the door. That this is the case has been demonstrated recently by P. A. Skordos and W. H. Zurek using a computer simulation of a two-dimensional gas in which the ``molecules'' collide as if they were hard disks.
A more complex way in which one might try to build a demon is to have an ``intelligently acting'' device observe the gas and open or close the door appropriately. Again, consider two chambers at equilibrium, with a tunnel running between them. This time, we replace the trap door with a pair of sliding doors, one at each end of the tunnel. These doors may be opened or closed by a computer; this computer is also equipped with sensors with which it can observe whether or not there is a molecule in the tunnel. The computer-demon checks the tunnel at regular time intervals. Initially, the door on the left is open and the door on the right is closed, allowing molecules to enter the tunnel from the left but not the right, as in Figure 2.
In the next time step, the demon looks to see if there is a molecule within the tunnel. If not, the doors are left unchanged. If one or more molecules are in the tunnel, the demon closes the door on the left and opens the one on the right, allowing the molecules in the tunnel to enter the chamber on the right while preventing molecules from entering the left-hand chamber, as in Figure 3.
The next time the demon looks, if there are still one or more molecules in the tunnel, the doors are left in their prior positions until the next observation. If the tunnel is empty, the doors are returned to their original positions, allowing the demon to begin again, as in Figure 4.
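The door-control cycle above can be sketched as a toy simulation. This is only an illustrative sketch under simplifying assumptions (molecule counts instead of dynamics, an arbitrary entry probability, a hypothetical `run_demon` helper), not the hard-disk simulation discussed elsewhere in the text:

```python
import random

def run_demon(steps=10000, n_left=500, n_right=500, p_enter=0.01, seed=0):
    """Toy model of the two-door demon acting on molecule counts."""
    rng = random.Random(seed)
    in_tunnel = 0
    left_open = True          # initial configuration: left door open, right closed
    for _ in range(steps):
        if left_open:
            # molecules can enter the tunnel only from the left chamber
            if n_left > 0 and rng.random() < p_enter * n_left:
                n_left -= 1
                in_tunnel += 1
            if in_tunnel > 0:
                left_open = False   # close left, open right (the Figure 3 state)
        else:
            # right door open: tunnel molecules drain into the right chamber
            if in_tunnel > 0 and rng.random() < 0.5:
                in_tunnel -= 1
                n_right += 1
            if in_tunnel == 0:
                left_open = True    # reset to the Figure 2 configuration
    return n_left, n_right, in_tunnel

left, right, tunnel = run_demon()
print(left, right, tunnel)  # the right chamber ends up with more molecules
```

Because the doors form a ratchet, molecules only ever move left to right in this sketch, and a pressure difference accumulates without work being done on the gas.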
In the process, the demon has created a pressure difference between the two chambers without doing work on the gas. If the doors are nearly frictionless, the entropy increase associated with the mechanical operation of the demon can be made negligibly small. The total entropy of the system appears to decrease.
There is, however, some entropy which has been overlooked in this analysis. In order to determine when and how to move the doors, the demon must collect information and perform computations. In this case, the computations are very simple. The entropy of the computer's memory must be taken into account when measuring the entropy of the demon.
Recall that the entropy of a system is related to the logarithm of the number of states accessible to the system. If there are no constraints on the state of the memory of the demon, this memory has many accessible states. In fact, if the demon's memory is N unrestricted bits in length, then the memory has $2^N$ accessible states. This ``phase space volume'' must be taken into account. A revision of Boltzmann's definition which reflects this contribution to the entropy has been proposed by W. H. Zurek,
$$ S = H + I , $$
where S is the total entropy, including the demon, H is the entropy of the gas, and I is the entropy of the information stored in the demon's memory. Substituting in reasonable expressions for the entropy of the gas and demon,
$$ S = k \ln \Omega_{\rm gas} + k \ln \Omega_{\rm mem} , $$
where k is a constant, $\Omega_{\rm gas}$ is the number of accessible states of the gas, and $\Omega_{\rm mem}$ is the number of accessible states of the demon's memory.
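The memory contribution can be made concrete. If the demon's memory consists of $N$ unrestricted bits, it has $2^N$ accessible states, so its entropy is

```latex
I \;=\; k \ln 2^{N} \;=\; N k \ln 2 .
```

For a single bit, this is $k \ln 2 \approx 9.6 \times 10^{-24}~\mathrm{J/K}$, taking $k$ to be Boltzmann's constant.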
Taking into account the demon's memory, the system could progress as follows. The demon begins with an empty memory ($I = 0$; the memory is known to be empty), and the gas is at equilibrium.
When the demon checks the tunnel and calculates where the walls need to be, it must store this information in its memory. Different demons may store the information more or less efficiently, but any computing demon must use some memory. Because there are many different values which could be stored in the memory in use, $I$ increases.
As the demon operates, it will occasionally move a molecule from the chamber on the left to the chamber on the right, decreasing the entropy of the gas. No longer at equilibrium, the gas can now be used to do work.
Eventually, the demon has moved all the molecules from one chamber to the other. The entropy of the gas is much smaller, but the demon has consumed a great deal of memory. When the contribution of the demon's memory is taken into account, the total entropy of the demon-gas system should increase.
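This bookkeeping can be illustrated numerically. The sketch below is a toy model, not a derivation: it treats the gas entropy as the configurational term $H = k \ln \binom{n}{n_{\rm left}}$, and it *assumes*, purely for illustration, that each transferred molecule costs the demon four recorded observations (most checks of the tunnel find nothing new). The constant and the observation rate are arbitrary choices:

```python
from math import comb, log

K = 1.0  # Boltzmann's constant in arbitrary units

def gas_entropy(n_total, n_left):
    # configurational entropy: H = k ln(number of ways to place
    # n_left of the n_total molecules in the left chamber)
    return K * log(comb(n_total, n_left))

def memory_entropy(bits_used):
    # an unrestricted n-bit memory has 2^n accessible states: I = k n ln 2
    return K * bits_used * log(2)

n, n_left, bits = 20, 10, 0
s_start = gas_entropy(n, n_left) + memory_entropy(bits)

# Move every molecule to the right; each transfer is assumed to cost
# the demon four recorded observations (a hypothetical rate).
obs_per_transfer = 4
while n_left > 0:
    n_left -= 1
    bits += obs_per_transfer

s_end = gas_entropy(n, n_left) + memory_entropy(bits)
print(s_start, s_end)  # the gas term falls, but the total S = H + I rises
```

With these numbers the gas entropy drops by $k \ln \binom{20}{10} \approx 12.1\,k$ while the memory entropy grows by $40\,k \ln 2 \approx 27.7\,k$, so the total increases, as the argument in the text requires.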
One might object that, after the demon has completed its task, its memory could be erased, reducing the entropy of the memory to zero without increasing the entropy of the gas; after the erasure had taken place, the system's total entropy would have decreased. This is not the case, however. Landauer has shown that a computer must increase the entropy of its surroundings in order to perform irreversible computations. For each bit of information ``lost'' in an irreversible operation, the demon increases the entropy by at least $k \ln 2$. C. H. Bennett has shown rigorously that any computation that can be done at all can be done reversibly, except for the erasure of memory, because the information erased is, by definition, lost. The restriction to reversible operation therefore does not restrict the range of operation of the demon, provided it has sufficient memory. It does, however, prevent the demon from erasing information to reduce the entropy of the system.
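Landauer's bound is easy to evaluate numerically. The snippet below just computes $\Delta S \geq N k \ln 2$ and the corresponding minimum heat $T\,\Delta S$; the function names are ad hoc:

```python
from math import log

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_entropy(bits):
    # minimum entropy increase of the surroundings when `bits` bits are erased
    return bits * K_B * log(2)

def landauer_heat(bits, temperature):
    # corresponding minimum heat dissipated at the given temperature, in joules
    return temperature * landauer_entropy(bits)

print(landauer_entropy(1))            # ~9.57e-24 J/K per erased bit
print(landauer_heat(8 * 2**30, 300))  # erasing 1 GiB at room temperature
```

Erasing a full gigabyte of demon memory at room temperature dissipates only on the order of $10^{-11}$ J, which is why the bound matters in principle rather than in everyday engineering, yet it is exactly what rescues the second law here.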
Another scheme to avoid the increase in entropy due to the demon's memory might be to begin with the demon's memory filled with random bits. This way, the entropy of the demon's memory would be no higher at the end than at the beginning. This would not work either, because the original contents of the demon's memory would either have to be erased, increasing the entropy of the system at the outset, or would have to be stored elsewhere so that the demon's computations could remain reversible.
This is not to say that there is nothing the demon can do to reduce its entropy. After the demon has completed its operation, it may be able to reduce the amount of memory needed to store all the information it collected. If there is correlation between the different bits of the demon's memory, the memory taken up by that information can be reduced through reversible compression of the data. If, for example, the demon examines the tunnel at time intervals so short that changes from one observation to the next are unlikely, the demon's memory would contain long strings of identical observations. If observations of a molecule were recorded with a 1 and observations of an empty tunnel with a 0, then part of the string might look like this:
00000001111111111100000000000000
This memory could be compressed reversibly to take up much less space. For example, instead of storing the actual observations, the demon could record the number of time steps spent at each value, like this:
7, 11, 14
Of course, the demon would store the numbers in binary form, and a scheme would have to be worked out for separating each number from the next. Considerable work has been done on devising schemes to code redundant data efficiently; strings of bits with a high degree of correlation between bits can be coded very compactly.
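The run-length scheme described above can be written out explicitly. This is a minimal sketch (the function names are ad hoc, and it assumes a nonempty string of '0'/'1' characters); the essential point is that the encoding is exactly invertible, i.e. reversible:

```python
def rle_encode(bits):
    # record the first symbol and the length of each run of identical symbols
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def rle_decode(first, runs):
    # rebuild the original string; runs alternate between the two symbols
    out = []
    symbol = first
    for n in runs:
        out.append(symbol * n)
        symbol = '1' if symbol == '0' else '0'
    return ''.join(out)

observations = '0' * 7 + '1' * 11 + '0' * 14
first, runs = rle_encode(observations)
print(first, runs)                              # prints: 0 [7, 11, 14]
assert rle_decode(first, runs) == observations  # round trip: nothing is lost
```

Because decoding recovers the original string exactly, the demon can perform this compression without erasing any information, and so without paying the Landauer cost.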
Determining the minimum possible increase in the demon's memory is complicated by this possibility of compression. Rather than merely counting the number of times the demon has to look, one must calculate the minimum amount of memory into which the information can be compressed. This is difficult because different compression schemes compress the same data with different efficiencies: where one scheme might cut the amount of memory necessary in half, another might reduce it by only a quarter.
The measure of information proposed by Zurek avoids this problem. Rather than discussing the minimum number of bits required by some arbitrarily chosen coding scheme, one uses the algorithmic entropy: the length of the shortest program that causes a universal Turing machine to produce the binary string and then halt. Because any universal Turing machine can, given a finite emulation program, run any program that any other computing machine can, the algorithmic entropies measured by different Turing machines differ by at most the length of the program needed for one machine to emulate the other. Furthermore, any coding scheme that can be used to compress the data can be carried out by a Turing machine, so the effectiveness of every possible coding scheme is also taken into account.
Although the algorithmic entropy is well defined mathematically, it is in general not a computable function. If one considers a data set whose bits were generated randomly, the algorithmic entropy can be approximated by the number of bits in the original string; that is, very little compression can be performed on random data. To ensure that this approximation is valid, one can consider demons for which there is little correlation in the original data they collect.
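Any practical compressor gives a computable *upper bound* on the algorithmic entropy, and comparing random with correlated data makes the point above concrete. The sketch below uses the general-purpose deflate compressor from Python's standard `zlib` module (the particular data sizes and patterns are arbitrary choices for illustration):

```python
import random
import zlib

rng = random.Random(42)

# 64 KiB of pseudo-random bytes versus 64 KiB of highly correlated data
# (long runs of identical values, like the demon's observation log)
random_data = bytes(rng.randrange(256) for _ in range(65536))
correlated_data = (b'\x00' * 1000 + b'\x01' * 1000) * 32 + b'\x00' * 1536

r_len = len(zlib.compress(random_data, 9))
c_len = len(zlib.compress(correlated_data, 9))
print(r_len, c_len)  # random data barely shrinks; correlated data collapses
```

The random stream compresses to essentially its original size, consistent with its algorithmic entropy being close to its raw length, while the correlated stream shrinks by orders of magnitude, just as the demon's run-length-encoded observation log does.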