Entropy, Meaninglessness and Miracles

Sand has high entropy, because shuffling the grains doesn’t change its sandiness. Seaweed has low entropy, because if you shuffle its parts, chances are it loses its seaweediness. A crab would have been more dramatic than seaweed, but you get the point. Photo by “Emily.”

August 18, 2023. My curmudgeonly pal Richard Gaylord goaded me into writing this column. Richard is a chemist/physicist who sends me science-y articles and videos along with his pithy judgements, which range from “what an idiot” to “not bad.”

I’ve already written about two of Richard’s most insistent assertions: (1) to understand things, you must describe them mathematically (I disagree); and (2) things don’t necessarily have a single, “true” mathematical description (I agree).

Richard has strong feelings about the second law of thermodynamics. I do too. In this column, I’ll try to explain the second law, to myself if no one else. Then I’ll riff on its implications, which I find both depressing and exhilarating.

Actually, I’ll start with the first law of thermodynamics, which, like the second, dates back to the 19th century. Scientists then were figuring out how heat and other forms of energy are transformed by steam engines and chemistry experiments.

The first law says energy is conserved, no matter how much it changes. Potential energy morphs into kinetic/mechanical/acoustic energy when you drop your iPhone and it shatters on a rock, making you go “D’oh!” Energy can even turn into matter and vice versa. But the total energy of a “closed” system, sealed off from its environment, remains constant.

The second law of thermodynamics, in its original formulation, says the energy of a closed system tends to become uniformly distributed. That means, usually, that everything becomes the same temperature. A mug of hot tea cools off if you don’t drink it; a scoop of ice cream melts.

The second law says useful energy, meaning energy that can flex a muscle or push a piston, tends to dissipate as waste heat. No matter how efficiently your engine or metabolism recycles energy, you need to keep fueling or feeding it, or it stops working.

The more general, modern definition of the second law says the entropy of a closed system tends to increase over time. And what is entropy again? Before I answer that question, let me remind you of a well-established corollary of the second law: no matter how you define entropy, experts will disagree with you.

Physicists often equate low entropy with “order” and high entropy with “disorder.” My pal Richard dislikes these equivalences, and I do too. They might lead you to guess, wrongly, that an empty classroom has less entropy than a classroom crammed with rambunctious college kids. I prefer to think of low-entropy systems as “interesting” or “weird” and high-entropy ones as “boring” or “bland.”

But let me offer a more precise definition of entropy, pieced together from videos and articles Richard has sent me and other stuff I’ve found on my own. Entropy is a measure of the ways in which a system’s parts can be randomly rearranged, or shuffled, without significantly changing the system’s overall properties.

The system can be very small or very big. It can be an amoeba, ant colony, pond, classroom, biosphere, solar system, galaxy, universe. The fewer ways there are to shuffle a system’s parts without substantially changing it, the lower its entropy. The more ways there are to shuffle the parts, the higher the entropy.

A bumblebee or college kid or any living thing has low entropy, because if you shuffle its parts, such as its cells or organs, you alter its macroscopic state, which is a nice way of saying that you disable or kill it. The air in an empty classroom is a high-entropy system, because there are many ways to shuffle the air molecules without changing the air’s overall temperature and density.
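To make the counting concrete, here’s a toy sketch in Python (a made-up example of mine, not anything from Richard’s videos): treat a system as 100 coin flips, call the number of heads its macroscopic property, and count how many shufflings, or microstates, produce each macrostate.

from math import comb, log

N = 100  # flips in our toy "system"

for heads in (100, 75, 50):
    # Number of distinct shufflings (microstates) that yield this macrostate;
    # entropy, Boltzmann-style, grows with the log of that count.
    W = comb(N, heads)
    print(f"{heads} heads: {W:.2e} arrangements, entropy ~ log W = {log(W):.1f}")

The all-heads state is like the bumblebee: one arrangement, easily ruined. The half-heads state is like the classroom air: roughly 10^29 interchangeable arrangements.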

Information theory, invented by mathematician Claude Shannon in the 1940s, links entropy to the realm of symbols, such as numbers and letters. Shannon defines information as the capacity of a system of symbols to surprise you, that is, to tell you something you don’t know. And that capacity is proportional to the system’s entropy, which is a measure of all possible arrangements of the symbols. Shannon once told me that he thought of entropy as “shuffle-ability.”
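Shannon’s formula is simpler than its reputation suggests. Here’s a minimal sketch in Python (my example, not Shannon’s): entropy is the average surprise, in bits per symbol, and a repetitive message can’t surprise you.

from collections import Counter
from math import log2

def shannon_entropy(text):
    """Average surprise in bits per symbol: H = -sum(p * log2(p))."""
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in Counter(text).values())

print(shannon_entropy("aaaaaaaaaaaaaaaa"))  # 0.0 bits: totally predictable
print(shannon_entropy("abcdefghijklmnop"))  # 4.0 bits: every symbol a surprise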

Too abstract? Here’s a simpler way to think about entropy: high entropy = high probability; low entropy = low probability. The following thought experiment gets at what I mean: Put your system, whether a universe, planet or pond, in a box and shake it hard to shuffle its components. What do you get? Chances are overwhelming that you won’t get anything interesting, like spiral galaxies, the United Nations or frogs.
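To put a number on “overwhelming,” try a deck of cards as a stand-in for the box (a back-of-the-envelope calculation of mine, not from any of my sources):

from math import factorial

# Ways a shuffled 52-card deck can come out, and the chance of landing on
# one particular "interesting" ordering (say, sorted by suit and rank).
arrangements = factorial(52)
print(f"{arrangements:.2e} possible orderings")                    # ~8.07e67
print(f"chance of one specific ordering: {1 / arrangements:.2e}")  # ~1.24e-68

A classroom has vastly more molecules than a deck has cards, so the odds against any particular “interesting” arrangement are correspondingly more ridiculous.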

Life seems, superficially, like a blatant violation of the second law. Low-entropy organisms spawn more low-entropy organisms, and one insanely ambitious species invents alphabets, poetry, trigonometry, money, steam engines, thermonuclear bombs and so on. Physicists assure us that life and civilization don’t violate the second law, because our biosphere is not a closed system; it gets massive infusions of energy from the Sun.

The second law nonetheless underscores the terrible fragility of our existence. There are many more ways for us to be sick or dead or non-existent than to be alive and healthy. The second law makes a mockery of our dreams of progress. Global warming is an all-too-predictable consequence of the second law. Burning fossil fuels has helped us become healthier and wealthier--or some of us, anyway. But the waste products of all that burning, carbon dioxide above all, are trapping heat, warming the atmosphere and oceans and threatening civilization.

Let’s say we survive climate change and other self-created threats, and even the demise of the Sun, which astronomers expect to swell up like a diabetic old drunk in 5 billion years or so. As the cosmos keeps expanding, stars and other hot, shiny things will radiate away their mass and energy, and the universe will become increasingly cold, dark and empty. This void will have maximal entropy, because every part will be identical to, and hence interchangeable with, every other part. Nothing has more entropy than nothing.

“Heat death,” coined in the 19th century by scientists brooding over the second law, refers to a state of terminal blandness, which lacks the useful energy needed for anything to happen. You could also call this state “time death,” because if nothing happens, time ceases. “Meaning death” also works. The cosmos becomes eternally boring, except there’s no one to be bored.

In the late 1990s, astronomers discovered that the expansion of the universe is speeding up. That means we’re accelerating toward heat/time/meaning death. We won’t get there for a while, 10 to the googol years, according to one estimate, but still. If you don’t believe in a benign, eternal God, or even if you do, the prospect of heat/time/meaning death might freak you out; it might make all our striving seem pointless.

That’s the “depressing” implication of the second law. As for the “exhilarating” implication, which I mentioned above, here’s how I try to convey it to my students. After blathering about the second law, entropy and heat/time/meaning death, I say to them:

Now imagine taking all the molecules in this room at this very instant and putting them in a box. Shake the box really hard. What are the odds that you would get this classroom, with me standing here yammering and you sitting there looking at me? The odds are infinitesimal that you’d get this.

When I say “this,” I wave my hands around to indicate the classroom, my students and me. Then I continue:

When something infinitely improbable happens, what do you call it? A miracle, right? Yeah, I’m telling you that this moment, in this required humanities class, which you’d probably rather not take, is a miracle! And so is this moment! Every moment of every life is a goddamn miracle! [See Postscript.]

I’m not trying to sneak God into the classroom by saying “miracle.” The cruelty and injustice of the world keep me from believing in a creator who cares about us in any remotely human sense. I’m just trying to get my students to see their existence in a positive light. Sometimes, instead of saying this moment is a miracle, I say this moment is weird, by which I mean infinitely improbable and inexplicable. We shouldn’t be here, and yet here we are.

Contemplating the second law, and even the first, amplifies my sense of life’s weirdness. The big bang, which got everything started, flies in the face of both laws. Out of nothing springs an extremely high-energy, low-entropy something, which eventually produces us. How did that happen? Faced with this question, physicists can only wave their hands and mutter quantum incantations.

The second law can make life seem meaningless--or miraculous. I choose to see it as miraculous. I wonder how my buddy Richard sees it.

Richard responds: Of course, life is meaningless.

Postscript: I only say “goddamn” when my students seem distracted.

Further Reading:

See the chapter titled “Entropy” in my free online book My Quantum Experiment.

For expert takes on the second law, see riffs by Sabine Hossenfelder, Sean Carroll, Philip Ball, Zhigang Suo, George Musser and Stephen Wolfram. I also found Wikipedia helpful, as well as this 2013 article in Evolution: Education and Outreach, which Richard Gaylord sent me. See also the wonderful little essay “Poetry and Entropy” in Kenyon Review.

A Facebook friend, Brian Joseph Gates, says this column reminds him of the song “Holy Now” by Peter Mayer: https://m.youtube.com/watch?v=KiypaURysz4
