21 March 2023

What Is Entropy? Beyond Disorder

Estimated reading time: 4 minutes

There are certain words that can embellish any speech or quotation. “Entropy” is one of them. We find it in the unintelligible phrases of some famous spiritual guru, and even in self-help advice and motivational coaching. And, of course, we all know what entropy means: disorder. If we don’t get our house in order, we are told, entropy will eat us alive. Except that, in reality, it won’t: physicists are constantly explaining that no, entropy does not mean disorder. And yet, with this approximate meaning, it has almost become part of the lexicon of everyday life. But what does entropy actually mean?

The Austrian Ludwig Boltzmann introduced the current formulation of entropy, giving it a statistical meaning, understood as the probability distribution between the different possible microstates. Credit: Alamy Stock Photo

When the Prussian physicist Rudolf Clausius defined entropy in 1865, the idea of disorder was nowhere to be found. Clausius was seeking to explain mathematically the workings of energy in the Carnot heat engine, an idealised model of the heat engine (one on which the original diesel engine was later based) proposed four decades earlier by the French engineer Sadi Carnot. To express the heat lost as unusable, he defined entropy (a term he coined from the Greek word for transformation, deliberately echoing the word "energy"), which measures how spontaneously a hot body gives up heat to a cold body as the system tends towards equilibrium, unless something interferes to prevent it. This is why "entropy in a thermodynamic sense is an energy divided by a temperature," as physical chemist Emil Roduner, professor emeritus at the University of Stuttgart (Germany), summarises to OpenMind.
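In modern notation, Clausius' definition relates a small change in entropy to the heat exchanged reversibly at absolute temperature T, which makes those units (joules per kelvin) explicit:

$$ dS = \frac{\delta Q_{\mathrm{rev}}}{T} $$

Summed over a complete process, this quantity can never decrease for an isolated system, which is the mathematical content of the second law.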

This spontaneous behaviour of a system is the basic foundation of the second law of thermodynamics, which Clausius had intuited years before he defined entropy. In coining the term, the physicist ended his work by summarising the first two laws of thermodynamics in this way: "The energy of the universe is constant," and "The entropy of the universe tends to a maximum." But Clausius' choice of term has not made the concept any easier to understand; as physicist and Nobel laureate Leon Cooper wrote in 1968, by choosing the term entropy, "rather than extracting a name from the body of the current language (say: lost heat), he [Clausius] succeeded in coining a word that meant the same thing to everybody: nothing." In 1904, the mathematician and thermodynamics specialist George H. Bryan wrote in Nature that entropy is "that most difficult of all physical conceptions."

The formulation of entropy

Years after Clausius' definition, the Austrian physicist and philosopher Ludwig Boltzmann introduced the current formulation of entropy, giving it a statistical meaning that relates the microstates of matter (the arrangements of its atoms and molecules) to the observable macrostate of the system. Boltzmann's definition makes entropy a statistical measure, understood in terms of the probability distribution over the different possible microstates; it is this reading that is popularly shortened to "disorder." "Disorder is technically wrong," theoretical physicist Peter Watson, professor emeritus at Carleton University (Canada), tells OpenMind; "but the idea is correct, and if you replace it by probability, I don't have any issue," he adds.
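Boltzmann's formulation, famously engraved on his tombstone in Vienna, writes the entropy S of a macrostate in terms of the number W of microstates compatible with it, where k_B is Boltzmann's constant:

$$ S = k_B \ln W $$

The more microstates that realise a given macrostate, the larger its entropy, and, if all microstates are equally likely, the more probable that macrostate is.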

Dan Styer explains entropy with the analogy of the bottle of salad dressing, which has separate, highly ordered layers of oil and vinegar, yet it is in thermal equilibrium, in a state of maximum entropy. Credit: Lisa Top/Getty Images

Watson gives an example: imagine six atoms of gas in a room. It is unlikely that all six will be found on the same side, a low-entropy situation; a distribution of three on each side is far more probable. This tendency towards equilibrium is related to the arrow of time, a concept associated with entropy because it deals with irreversible processes that move in only one direction in time. This is one of the reasons why, according to Watson, we probably cannot travel back in time: doing so would violate the second law of thermodynamics, the increase in entropy.
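A minimal Python sketch makes Watson's example concrete, under the assumption that each atom independently occupies the left or right half of the room with equal probability, so that every microstate is equally likely:

```python
from math import comb, log

N = 6  # six gas atoms, each on the left or right side of the room

# For each macrostate "k atoms on the left", count the compatible microstates
for k in range(N + 1):
    W = comb(N, k)   # microstates realising this macrostate
    p = W / 2**N     # probability of the macrostate
    S = log(W)       # Boltzmann entropy in units of k_B
    print(f"{k} left / {N - k} right: W = {W:2d}, p = {p:.3f}, S/k_B = {S:.2f}")
```

The all-on-one-side macrostate is realised by a single microstate (W = 1, entropy zero) and occurs with a probability of about 1.6%, while the three-three split is realised by twenty microstates, making it both the most probable and the highest-entropy arrangement.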

Theoretical physicist Dan Styer, of Oberlin College (Ohio, USA), offers another analogy that helps to dismantle the idea of disorder: a bottle of Italian salad dressing has separate, highly ordered layers of oil and vinegar, yet it is in thermal equilibrium, in a state of maximum entropy. Styer prefers another word to explain entropy: freedom. A club (macrostate) with more permissive rules than another allows its members (microstates) a greater variety of choices. "If there are more microstates corresponding to the macrostate, the macrostate has a larger entropy," he summarises to OpenMind. "So a club granting its members more freedom is analogous to a macrostate with higher entropy." Microstates, Styer clarifies, do not have entropy, only macrostates.

The measure of uncertainty

So the idea of disorder is misleading, except where it happens to coincide with the real meaning of entropy, as in the case of gases, its original field. And, according to Roduner, in other examples too, such as the transition from solid to liquid to gas, or when we pour a drop of ink into a glass of water. But "nature is much more complex," he warns. "Many things seem to occur spontaneously, but in the wrong direction." The formation of a snowflake, or of any kind of living matter, is a phenomenon that seems to go against the spontaneous tendency of natural processes towards increasing entropy.

The formation of a snowflake, for example, is a phenomenon that seems to go against the spontaneous tendency of natural processes towards increasing entropy. Credit: Martin Siepmann/Westend/Getty Images

“This is one of the most common misunderstandings of the second law of thermodynamics,” says Roduner. “Here we do not have an isolated system; we have a system of interest, and its environment.” In these cases, there is no equilibrium, but a gradient guiding the direction of the process; the entropy of a system can spontaneously decrease if at the same time the entropy of its surroundings increases to a greater degree. It was an understanding of this idea that in the 1960s led the Russian-born Belgian chemist Ilya Prigogine to explain how it was thermodynamically possible for life to have arisen from its elementary components, which in turn had a major influence on the development of chaos theory.
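In equation form, the second law constrains only the total entropy of a system together with its surroundings:

$$ \Delta S_{\mathrm{total}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{surroundings}} \geq 0 $$

Water freezing into a snowflake lowers the entropy of the water itself, but the latent heat released raises the entropy of the colder surrounding air by a greater amount, so the total still increases.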

This application of entropy to biology is an example of how the concept has spread to other fields. In 1948, mathematician and engineer Claude Shannon, considered the father of information theory, applied the idea of entropy to information in telecommunications, as a measure of the uncertainty in a transmitted message. As physicist Kevin Knuth of the University at Albany and editor of the scientific journal Entropy tells OpenMind, in this sense "entropy is a measure of uncertainty, and as such, it has broad applicability to any problem in which one is making inferences." In 1957, physicist Edwin Jaynes brought the concept of entropy that Shannon had introduced for information back into thermodynamics. "This is how it gets confused with disorder, because a disordered system results in a lot of uncertainty," says Knuth, "but uncertainty is the main concept, not disorder."
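A short Python sketch illustrates Shannon's entropy as a measure of uncertainty (the function and the example distributions are illustrative, not taken from Shannon's paper):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over a distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit, maximal uncertainty
print(shannon_entropy([0.99, 0.01]))  # biased coin: ~0.08 bits, nearly predictable
print(shannon_entropy([1/8] * 8))     # 8 equally likely outcomes: 3.0 bits
```

The more evenly the probabilities are spread, the greater the uncertainty and the higher the entropy, which is precisely the sense in which a "disordered" system carries high entropy.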

BBVA-OpenMind-Yanes-La entropia no es desorden_4 La entropía, señala Kevin Knuth, es una medida de la incertidumbre, por eso se confunde con el desorden, porque un sistema desordenado resulta en una gran incertidumbre. Crédito: Aaron Amat/Getty Images
Entropy, Kevin Knuth points out, is a measure of uncertainty, which is why it is confused with disorder, because a disordered system results in a lot of uncertainty. Credit: Aaron Amat/Getty Images

And it is such an important concept, Watson stresses, one so well expressed in the words of English playwright Tom Stoppard in his 1993 play Arcadia: "Heat goes to cold. It's a one-way street. Your tea will end up at room temperature. What's happening to your tea is happening to everything everywhere. The sun and the stars. It'll take a while but we're all going to end up at room temperature."

Javier Yanes

@yanes68

 
