13. How To Measure Disorder
       THE STATISTICAL MEANING OF ENTROPY

Boltzmann made the crucial connection between thermodynamic entropy and disorder. Any situation that is so definite that it can be put together only in one or a small number of ways is recognized by our minds as orderly. Any situation that could be reproduced in thousands or millions of different but equivalent ways is disorderly.
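To make the counting concrete, here is a small sketch in Python; the ten-coin scenario is my own illustration, not one from the text. A row of coins that must all be heads-up can be put together in only one way, while a row with any five heads is realized by hundreds of equivalent arrangements.

    # Counting the equivalent arrangements behind "order" and "disorder".
    # The ten-coin example is purely illustrative.
    from math import comb

    n_coins = 10

    # A definite, orderly situation: every coin heads-up -- only one arrangement.
    ways_all_heads = comb(n_coins, 10)      # = 1

    # A disorderly situation: any five of the ten coins heads-up.
    ways_half_heads = comb(n_coins, 5)      # = 252 equivalent arrangements

    print(ways_all_heads, ways_half_heads)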

Boltzmann's law tells us that the most perfect, orderly object conceivable in the universe would be a perfect crystal at absolute zero. Anything else (a crystal at any temperature above 0 K, a liquid, a gas, or a mixture of substances) is more disordered and has a positive entropy. The higher the entropy, the greater the disorder.
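The formula behind these statements, which the text does not write out, is Boltzmann's relation S = k ln W, where W is the number of equivalent ways the state can be realized and k is Boltzmann's constant. A minimal sketch:

    # Boltzmann's relation S = k * ln(W): one arrangement means zero entropy,
    # many arrangements mean positive entropy.
    from math import log

    k_B = 1.380649e-23   # Boltzmann's constant, J/K

    def boltzmann_entropy(ways):
        """Entropy of a state realizable in `ways` equivalent arrangements."""
        return k_B * log(ways)

    print(boltzmann_entropy(1))     # perfect crystal at absolute zero: 0.0 J/K
    print(boltzmann_entropy(252))   # the disordered coin row above: > 0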

When we combine Boltzmann's ideas with thermodynamics, we arrive at one of the most important principles of science: In any real, spontaneous process, including chemical reactions, the disorder of the universe always increases. In any isolated system, in which the total energy does not change, a spontaneous reaction is one in which entropy (and disorder) increases.
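One standard illustration of this principle (my example, not the book's): let one mole of an ideal gas expand freely into an evacuated bulb of equal volume inside an isolated container. No energy enters or leaves, yet each molecule now has twice as many places to be, so the number of arrangements grows by a factor of 2 raised to the number of molecules, and the entropy rises.

    # Entropy increase for the free expansion of one mole of ideal gas
    # into double its original volume (isolated system, no energy change).
    from math import log

    R = 8.314   # gas constant, J/(mol*K)
    n = 1.0     # moles

    # W grows by 2**N, so per mole S grows by N_A * k_B * ln 2 = R * ln 2.
    delta_S = n * R * log(2)
    print(f"Delta S = {delta_S:.2f} J/K  (positive, so the expansion is spontaneous)")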

Any process that produces order, or lowers the entropy, cannot occur without outside help. If we supply enough energy, we can make a reaction occur even though the entropy decreases in the process. If we do not supply enough energy, a reaction leading to increased order will not take place.
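To put a number on "enough energy" (the figures below are hypothetical, chosen only to show the bookkeeping): suppose a reaction would lower the system's entropy by 50 J/K at 298 K. The energy we supply ends up as heat in the surroundings, raising their entropy by q/T, and the process can occur only if that gain at least offsets the system's loss.

    # Minimum energy needed to drive an entropy-lowering process.
    # All numbers are hypothetical illustrations.
    T = 298.0                 # temperature of the surroundings, K
    dS_system = -50.0         # assumed entropy decrease of the system, J/K

    q_min = T * abs(dS_system)        # heat to surroundings that just balances the loss
    dS_surroundings = q_min / T

    print(f"Energy to supply: at least {q_min:.0f} J")
    print(f"Total entropy change at that minimum: {dS_system + dS_surroundings:.1f} J/K")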

The two ways of looking at entropy, thermodynamic and statistical, are contrasted below.

The thermodynamicist measures the heats of processes and calculates from them a numerical value for the third-law entropy, S°₂₉₈, of the substance.
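In outline, that measurement amounts to integrating the heat capacity divided by temperature from near absolute zero up to 298 K, and adding ΔH/T for every phase change crossed on the way. The sketch below uses an invented heat-capacity table simply to show the arithmetic; real calorimetric data would take its place.

    # Third-law entropy from (hypothetical) heat-capacity data:
    # S = integral of Cp/T dT from ~0 K to 298 K, via the trapezoidal rule.
    # A phase change crossed along the way would add delta_H / T_transition.
    cp_data = [(10, 0.4), (50, 8.0), (100, 18.0), (150, 24.0),
               (200, 27.0), (250, 29.0), (298.15, 30.0)]   # (T in K, Cp in J/(mol*K))

    S = 0.0
    for (T1, Cp1), (T2, Cp2) in zip(cp_data, cp_data[1:]):
        S += 0.5 * (Cp1 / T1 + Cp2 / T2) * (T2 - T1)       # trapezoid on Cp/T

    print(f"S°(298 K) ≈ {S:.1f} J/(mol*K) from the invented data above")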

The theoretician can compute from the known amount of disorder in a substance what its entropy should be. If his estimate of disorder and his subsequent calculations are valid, he will arrive at a final number that agrees with the value that was measured from heat experiments.
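One concrete version of that calculation (my choice of example; the text names no particular method) is the Sackur-Tetrode equation for an ideal monatomic gas. For argon at 298.15 K and 1 bar it reproduces the calorimetric third-law entropy, about 154.8 J per mole per kelvin, from molecular constants alone.

    # Statistical-mechanical entropy of argon gas from the Sackur-Tetrode
    # equation, for comparison with the measured third-law value (~154.8 J/(mol*K)).
    from math import pi, log, sqrt

    k_B = 1.380649e-23        # Boltzmann's constant, J/K
    h = 6.62607015e-34        # Planck's constant, J*s
    N_A = 6.02214076e23       # Avogadro's number, 1/mol
    R = k_B * N_A             # gas constant, J/(mol*K)

    m = 39.948 * 1.66054e-27  # mass of one argon atom, kg
    T = 298.15                # temperature, K
    P = 1.0e5                 # pressure, Pa (1 bar)

    v = k_B * T / P                           # volume per molecule of ideal gas
    lam = h / sqrt(2 * pi * m * k_B * T)      # thermal de Broglie wavelength, m

    S_molar = R * (log(v / lam**3) + 2.5)     # Sackur-Tetrode entropy
    print(f"Statistical entropy of Ar: {S_molar:.1f} J/(mol*K)")   # about 154.8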
