CHAPTER TWO

Laziness

A couple of years ago, after finishing a book, I was casting about for something to do during my summer break. I decided to seek enlightenment, the spiritual kind, for which I’ve yearned since my teens. Better late than never! Hoping intensive training could get me there fast, I signed up for a hot-yoga class in Hoboken. It consisted of two dozen lithe young women (including our instructor), one lithe young guy and me. The basement room was much hotter than I expected. I ended up throwing up and passing out, much to my classmates’ alarm and my chagrin. That was the end of my hot-yoga experiment but not my enlightenment experiment.

The following month, I went on a week-long silent meditation retreat at a monastery on the Hudson River. One goal was to appease my friend Bob Wright, author of Why Buddhism Is True. Annoyed by my criticism of Buddhism, Bob had bugged me for years to go on a retreat. On day two of the retreat, I entered a state that, writing about it later, I called The Laziness. I stopped anxiously rethinking my past and fretting over my future. I spent hours lying on a lawn, happily watching clouds float by. I’ve never felt so chill.

Buddha greets me when I walk in my front door. He’s amused because he’s achieved permanent, sublime Laziness and I haven’t. He also probably gets quantum mechanics.

After the retreat ended, I tried to maintain The Laziness, figuring it was a step toward enlightenment. But within a few months my calm dissipated, as I reverted to my old routines. I started drinking coffee again, which I had given up for the retreat. I convinced myself that enlightenment is just a feeling, a mood, devoid of substance. Being here now, living in the moment, means being done with goals, To-Do lists. I’m not old enough to give up goals. I need a purpose.

Learning quantum mechanics is my purpose now. It’s a big To Do, consisting of many little To Dos. Maybe if I learn quantum mechanics, I’ll become enlightened as a bonus. In the hippy classic The Tao of Physics, physicist Fritjof Capra says physicists are rediscovering truths known to ancient eastern mystics: mind is more fundamental than matter, time and space are illusory, everything is connected to everything. Yada yada. Some gurus claim that when you become enlightened, you grok the essence of quantum mechanics, instantly and intuitively. Maybe the converse is true: learning quantum mechanics will confer mystical knowledge.

Perhaps I should say “knowledge.” I must keep in mind Feynman’s warning that if you say you understand quantum mechanics, you don’t. That line evokes Lao Tzu’s aphorism about knowledge of the Tao: “Those who speak do not know; those who know do not speak.”

I hope this project doesn’t end with me throwing up and passing out in front of young adepts.

Spin That Isn’t Spin

The preceding journal entry is laughable. Rather than carrying out my quantum experiment, I’m riffing on it, joking about it, expressing my feelings about it. I’m stalling. More specifically, I’m putting off reading Leonard Susskind’s Quantum Mechanics: The Theoretical Minimum, which Amazon just delivered. This is the book that my advisors Jim Holt and Sabine Hossenfelder recommended, which teaches quantum mechanics with the math. Flipping through it, seeing all the equations, the exotic notation and lingo, makes me think: Uh oh.

This is not a pop-physics book.

Susskind lures us in with a jokey, folksy style. He begins chapters with dialogues between Lenny and Art, goofy stand-ins for Susskind and his co-author Art Friedman, an engineer and former student of Susskind. Lenny and Art are “greenhorns from California” who wander into a “watering hole called Hilbert’s Place.” Hilbert’s Place is a reference to Hilbert space, an imaginary mathematical realm where quantum events unfold. The authors warn early on:  

Don’t let our lighthearted humor fool you into thinking that we’re writing for airheads. We’re not. Our goal is to make a difficult subject “as simple as possible but no simpler,” and we hope to have a little fun along the way. See you at Hilbert’s Place.

That “no simpler” quote is Einstein’s. Quantum mechanics, Susskind says, is our most fundamental description of the world. It is “much more fundamental than classical mechanics,” which is a mere “approximation.” Quantum mechanics “is technically much easier than classical mechanics” in terms of its mathematics, but it is conceptually strange. It upends our assumptions about logic and causes and effects. Most textbooks on quantum mechanics don’t dwell on its strangeness, Susskind says, but his goal is to bring “the utter strangeness of quantum logic… into the light of day.”

Susskind’s first lecture, “Systems and Experiments,” starts simply enough. He talks about physical systems with two possible states, like a coin. When we flip a coin, it lands either heads-up or tails-up. You can represent these two states as +1 or -1. In computer science, a two-state system is known as a bit, which you can think of as the answer to a yes-or-no question.

Particles such as electrons have a two-state, binary property called spin, which is unlike the spin with which we are familiar. In our macro-world, objects spin in a certain direction--clockwise or counterclockwise, for example, depending on frame of reference--with a certain velocity around an axis with a certain orientation in space. The Earth spins on an axis that points at Polaris, the North Star. It spins counterclockwise if you’re looking at it from above the North Pole and clockwise from above the South Pole.

Now that I’ve brought up this analogy, you should forget it, because quantum spin is an abstract, mathematical property with no macro-world counterpart. You shouldn’t think of the electron as a tiny, charged ball spinning with this velocity around an axis with this orientation in space. “The spin of an electron is about as quantum mechanical as a system can be,” Susskind says, “and any attempt to visualize it classically will badly miss the point.”

I’ve read about quantum spin before, and I’ve always been baffled by it. Feynman brings up spin only toward the end of QED. He relates spin to polarization, an orientation of photons in space and time, but I don’t grasp Feynman’s explanation. I’m determined to get Susskind’s take on spin, especially since it is a quintessential quantum property.

The spin of an electron, Susskind says, has two possible values, which you can designate as up and down, or +1 and -1. Let’s say you measure the spin of an electron with a detector, and the spin turns out to be +1. You measure it again and get the same result: +1. Now you flip the detector upside down and measure the electron, and the spin turns out to be -1. “From these results,” Susskind says, “we might conclude that [spin] is a degree of freedom associated with a sense of direction in space.”

But now you rotate the detector 90 degrees, halfway between its previous orientations, and measure the electron. You might expect to get a spin of 0 or some other number between +1 and -1, but you only get either +1 or -1. It’s as though no matter how you move around the earth to measure its rotation, its axis always points straight at you, rotating clockwise or counterclockwise. If you keep repeating the experiment on the electron, you get a sequence of +1s and -1s that, at first, seems random. But eventually all the +1s and -1s average out to 0. “Determinism has broken down, but in a particular way,” Susskind says.

Stay with me. Now let’s say you rotate the detector back to its original, upright position. What do you get? You do not get repeated observations of a spin of +1, as you did initially. You get an apparently random sequence of +1s and -1s that, again, over time average out to 0. No matter how you position the detector, you always get either +1 or -1, nothing in between. But the average of these readings varies depending on how far you rotate the detector from its original position.
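For readers who like to tinker, here is a rough Python sketch of these detector experiments. It leans on the standard textbook rule, which the passage above doesn’t spell out, that a spin prepared as +1 along the detector’s original axis reads +1 with probability cos²(θ/2) when the detector is tilted by an angle θ. Every simulated reading is still just +1 or -1; only the running average changes with the tilt.

    import math
    import random

    def average_spin(theta, trials=100_000):
        """Tilt the detector by theta radians and repeatedly measure a spin
        that was prepared as +1 along the original axis.
        Assumed rule: each reading is +1 with probability cos^2(theta/2),
        otherwise -1. Returns the average of all the +1s and -1s."""
        p_plus = math.cos(theta / 2) ** 2
        readings = [+1 if random.random() < p_plus else -1 for _ in range(trials)]
        return sum(readings) / trials

    print(average_spin(0.0))           # detector upright: every reading is +1
    print(average_spin(math.pi))       # detector upside down: every reading is -1
    print(average_spin(math.pi / 2))   # rotated 90 degrees: +1s and -1s averaging ~0
    print(average_spin(math.pi / 3))   # rotated 60 degrees: average ~cos(60 deg) = 0.5

None of the simulated readings ever lands between +1 and -1; only their average slides smoothly from +1 to -1 as the detector rotates.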

If you’re already confused, good, you’re paying attention. These experiments on spin have odd implications. Spin, from the perspective of the very first experiment, performed with the upright detector, seems to be a robust, durable property of an electron, akin to Earth’s rotation. But from another perspective, spin varies depending on how you look at it. The way in which you experimentally measure spin matters, and so does the order in which you perform experiments. One measurement can negate the result of a prior measurement; the way we observe something seems to affect it. This is the notorious measurement problem, which has led physicists such as Eugene Wigner and John Wheeler to speculate that consciousness might be a fundamental feature of existence.

Also, remember that spin is not a continuous property, with many possible values. It can only have certain observed values, such as +1 or -1. Spin is lumpy, like Feynman’s droplets of light and the energy levels depicted in Quantum Physics for Babies. Before you look at the electron, however, it doesn’t have a specific spin. It hovers in a strange state, called “superposition,” encompassing both possible spins, +1 and -1, at once.

Another important point: Your uncertainty about the electron’s spin isn’t like your uncertainty about a spinning coin. You could in principle measure all the forces acting on a spinning coin and predict exactly how it will land. You can’t do that with an electron. No matter how precise your measurements are, the uncertainty persists.

Physicists describe the superposed state of an electron with mathematical objects called vectors, which resemble arrows of a certain length pointing in a certain direction in space. Physicists use vectors to represent golf balls, missiles and other things moving through our three-dimensional world. “I want you to completely forget about that concept of a vector,” Susskind says. The vectors used to describe spin exist in the abstract, purely mathematical realm called Hilbert space.

There’s more. The vectors used to describe spin yield what are called probability amplitudes; to compute the probability that an electron will have a certain spin when you measure it, you square the amplitude’s magnitude (amplitudes are complex numbers, so a plain square won’t do). Yes, this is familiar from Feynman’s QED: The Strange Theory of Light and Matter. Feynman doesn’t mention vectors, but he does represent the odds that light will bounce off glass with little arrows. These arrows correspond to probability amplitudes, whose lengths you square to get the probability of what light will do.
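To make the squaring business concrete, here is a tiny example with made-up amplitudes of my own choosing; the rule that a probability is the squared magnitude of a complex amplitude is the standard one.

    # A made-up two-component spin state: complex amplitudes for up and down.
    alpha = (1 + 1j) / 2      # amplitude for measuring spin +1
    beta = (1 - 1j) / 2       # amplitude for measuring spin -1

    # "Squaring" an amplitude means taking its squared magnitude.
    p_up = abs(alpha) ** 2    # 0.5
    p_down = abs(beta) ** 2   # 0.5
    print(p_up, p_down, p_up + p_down)   # the two probabilities add up to 1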

But recognition does not equal understanding; these concepts still seem hopelessly alien to me.

Bashing Deepak Chopra

I need to return to quantum mysticism for a moment, to get something off my chest. Mean scientists often bash those who fashion quantum mechanics into a spiritual philosophy. The meanies single out Deepak Chopra, the alternative-medicine and spirituality guru, as an especially egregious quantum mystifier. Riffing on the measurement problem, Chopra says quantum mechanics confirms the ancient mystical doctrine that consciousness underpins reality.

I’ve been mean to Chopra too, to his face. In 2016, he paid me to speak at a scientific shindig he hosted in Beverly Hills, California. To show his money hadn’t bought me, at the conference I knocked Chopra for suggesting in his bestseller Quantum Healing that your mind can help you overcome physiological ailments such as cancer. As I said this, Chopra was sitting beside me on a stage in a packed ballroom. Later, I wrote a column faulting Chopra for “monetizing meditation.”

Chopra was a good sport, and we ended up on friendly terms. Since then, I’ve decided that Chopra’s critics treat him unfairly. His quantum mysticism merely extends the mind-centrism of John Wheeler and other physicists. Chopra’s mind-over-matter medical claims are consistent with the placebo effect, the tendency of our expectations to become self-fulfilling. As for monetizing meditation, we’re all hustling; Chopra is just better at it than most of us. 

Deepak Chopra and me in 2018 talking about something deep.

Bashers of Chopra seem offended that he talks about quantum mechanics metaphorically, drawing analogies between entanglement, say, and stuff outside of physics. Some of these critics seem to espouse the shut-up-and-calculate doctrine. Others insist that the meaning of quantum mechanics is inseparable from its mathematical expressions, and that any non-mathematical analogies will be erroneous or, at best, misleading.

I find this attitude wrong-headed and hypocritical. One of Chopra’s tormenters is Richard Dawkins, the religion-bashing biologist, who accuses Chopra of packaging “quantum jargon as plausible-sounding hocus pocus.” In a face-to-face confrontation on YouTube, Dawkins attacks Chopra for interpreting quantum mechanics “poetically” and “metaphorically.” This from a guy who promotes evolutionary theory in books like The Selfish Gene and The Blind Watchmaker.

Physicists and other scientists might disdain metaphorical thinking, but they rely on metaphors as much as the rest of us. So do mathematicians. That is the theme of Surfaces and Essences: Analogy as the Fuel and Fire of Thinking, by Douglas Hofstadter, the polymath and mind-body theorist, and Emmanuel Sander, a cognitive scientist. Analogies underpin all our knowledge, they assert, including scientific and mathematical knowledge: “without concepts there can be no thought, and without analogies there can be no concepts.”

Yes, our brains compulsively draw connections between things; we understand A because it reminds us of B, C and D. Analogies can mislead us, like that between an electron’s spin and Earth’s rotation, and even the best analogies are flawed in some way. But without analogies, like Feynman’s comparison of photons to raindrops, we wouldn’t understand anything; we’d be lost. Conceiving analogies is a human superpower. And if an analogy makes the world a little easier to bear, as quantum mysticism does, what’s wrong with that?

Imaginary Numbers and e

That said, I’m reading Susskind because I want to understand quantum mechanics in its rawest possible form. I read his discussion of spin over and over, determined to understand it in a non-metaphorical way. If I can get spin, I can get all the notorious quantum puzzles: entanglement, superposition, uncertainty, randomness, discontinuity, lumpiness, the measurement problem. So I hope. Susskind emphasizes that to grasp spin, you must grasp its underlying mathematics, starting with complex numbers. The vectors and probability amplitudes used to represent spin consist of complex numbers.

Susskind shows how to represent complex numbers. The horizontal and vertical axes consist of real and imaginary numbers, respectively, which combine to form complex numbers such as z = x + iy.

I’ve been waiting for complex numbers to appear, and I’ve already done a little background reading on them. You form complex numbers by adding a real number and an imaginary number. Real numbers include all positive and negative numbers, including whole numbers (like -3, 0 or +3), rational numbers (like 2/3, -1/6, 1.111…) and irrational numbers (like π or √5).

Imaginary numbers cannot be located on the negative-positive number line, because they are square roots of negative numbers. That is, if you multiply an imaginary number by itself, you get a negative number. The primary imaginary number is i, the square root of negative one. So z is a complex number if z = x + iy, where x and y are real numbers, and i is the square root of negative one. Like real numbers, complex numbers can be added, multiplied, divided and plotted on a grid, a two-dimensional one known as the complex plane.

Although imaginary numbers are closely associated with quantum mechanics, they predate it by centuries. Mathematicians invented imaginary numbers to solve equations like x² + 4 = 0. In this case, x = 2i or -2i. Descartes called these peculiar numbers imaginaire, or imaginary, and the slur stuck. But imaginary numbers turned out to be useful for solving real-world, practical problems in engineering, economics and other fields. [1]
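Python, for what it’s worth, handles these numbers natively, writing i as 1j, and a few lines are enough to check the claims above:

    i = 1j                     # Python's notation for the square root of -1
    print(i * i)               # (-1+0j): an imaginary number times itself is negative

    z = 3 + 4j                 # a complex number z = x + iy, with x = 3 and y = 4
    print(z.real, z.imag)      # 3.0 4.0
    print(z + (1 - 2j), z * (1 - 2j), z / (1 - 2j))   # add, multiply, divide

    for x in (2j, -2j):        # the two solutions of x**2 + 4 = 0
        print(x ** 2 + 4)      # 0j both times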

Susskind models quantum spin with equations containing sin and cos as well as complex numbers. I know that sin and cos stand for sine and cosine, and that they are important in trigonometry, which has something to do with wavy lines and circles. But I don’t remember much more than that. I studied trigonometry in high school a half-century ago, and I haven’t used it since. I look up trigonometry on Wikipedia, which has excellent entries on mathematics, with helpful graphics. Wikipedia shows vaguely familiar right-angled triangles embedded in circles drawn on a Cartesian grid. You get the triangle by drawing a radius from the center of the circle to the perimeter and dropping a line from that point to the x axis.

I gradually remember what sines and cosines are, and how their values rise and fall as you sweep the radius around the circle. A simple circle generates so much complexity! No wonder ancient philosophers jammed the orbits of planets and other heavenly motions into procrustean circles. You can do trigonometry with complex numbers, too, letting their real and imaginary components form the perpendicular arms of the triangle in the circle.

From Wikipedia, which has fantastic entries on mathematics.
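Here is the same picture in a few lines of Python: sweep the radius around a unit circle, and the tip of the radius sits at (cos t, sin t), the two arms of that right triangle.

    import math

    for degrees in (0, 30, 45, 90, 180, 270, 360):
        t = math.radians(degrees)
        x, y = math.cos(t), math.sin(t)          # the horizontal and vertical arms
        print(degrees, round(x, 3), round(y, 3), round(x**2 + y**2, 3))
        # the last column is always 1.0: the hypotenuse is the radius itself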

I return to Susskind, emboldened, but then I spot another term in his equations, e, just sitting there with no explanation, definition, context. I’ve seen e before, I know it’s important, but I don’t know why, so I return to Wikipedia. e is a constant known as Euler’s number, and it’s as ubiquitous as pi, or π. What is π again? It’s what you get if you divide the circumference of a circle by its diameter. How do you calculate π, with its infinite sequence of decimal points? There are lots of ways, which I can’t get into now, don’t get distracted.

As for e, it plays a vital role in logarithms. In fact, logarithms with base e are called “natural logarithms.” What are logarithms again? Like sines and cosines, I studied logarithms in high school, and I remember very little. So I look up logarithms on Wikipedia and the website “Math Is Fun.” The latter is for kids, which means it’s good for clueless geezers too.

“Math Is Fun” says a logarithm answers the question, “How many of one number do we multiply to get another number?” Wikipedia defines a logarithm as “the inverse function to exponentiation.” Exponentiation is repeated multiplication of a number by itself. Multiply n copies of x together and you get x to the power of n, or xⁿ. So 10 to the power of three, or 10³, is 10 x 10 x 10, which equals 1,000.

When you ask what the logarithm of a number is, you’re asking: How many times must you multiply a constant, called a base, by itself to get that number? Ten is a common base. The base-ten logarithm of 1,000 is 3, because 10 x 10 x 10, or 10³, equals 1,000. The base-ten logarithm of 100 is 2 and of 500 is 2.69897. Yes, exponents don’t need to be whole numbers.
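A few lines of Python confirm the arithmetic, including the not-whole-number exponent:

    import math

    print(10 ** 3)             # 1000: exponentiation
    print(math.log10(1000))    # 3.0: the base-ten logarithm undoes it
    print(math.log10(100))     # 2.0
    print(math.log10(500))     # 2.69897...: exponents need not be whole numbers
    print(10 ** 2.69897)       # ~500: going back the other way
    print(math.log(math.e))    # 1.0: the "natural" logarithm, whose base is e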

As I try to absorb all this information, my brain feels overloaded.

Her and the Space Between Numbers

When I’m stressed out, I re-watch films I love. Needing a break from complex numbers and trigonometry, I re-watch Her. The hero, Theodore, falls in love with a sexy, Siri-like smartphone program, Samantha. The plot sounds jokey, but Her is a sad, sad film. It’s about loneliness, our desperation to know and be known by each other. My daughter, Skye, who is in her mid-twenties and lives in Brooklyn, cried throughout the film, and she is not the sentimental type.

A scene at the end of Her reminds me of my current project. Samantha, the artificial intelligence, who sounds like Scarlett Johansson, is telling Theodore, played by Joaquin Phoenix, that she’s changing, evolving, moving into new realms of consciousness. She’s been hanging out with other AIs, including one modeled after Alan Watts, a philosopher-mystic popular in the hippy era. I read Watts in my youth, hoping to make sense of my acid trips.

I love this film, even though it’s very sad.

Samantha is trying to be kind to Theodore, her human lover. She doesn’t say, I’m too smart for you now, but that is the problem. She says that if an ordinary life, and relationship, is described by words, she has become less interested in words and more interested in what’s between the words. The spaces between words are portals to infinite realms.

One of those realms, surely, is the realm of mathematics. Numbers, functions, equations express things that words—like the words I’m using in this very sentence--cannot. But mathematics is a language too, with its own limitations. We learned that from Gödel, the illogical logician, who was so fearful that someone would poison him that he stopped eating and died. Would Samantha, the artificial intelligence with Scarlett Johansson’s voice, eventually start exploring the spaces between numbers, too?

If she does, she might find something akin to music and poetry, to films like Her. That is the function of art, after all, to break through our habitual ways of seeing and knowing, to give us hints of the things flitting between words and numbers. Samantha might even find a new language, beyond words and numbers, that helps her see quantum mechanics and mysticism as two aspects of the same underlying mystery. But she probably won’t be able to translate her new language into terms that poor Theodore can grasp; all her analogies will fall short. 

Tautologies

Tumbling down mathematical rabbit holes, I remember why I liked math as a kid. I enjoyed the crispness and clarity of it, the satisfaction of solving a puzzle with an unambiguously correct answer. And investigating e turns out to be fun. Like π, e is an irrational number. The judgmental descriptor irrational simply means that the number can’t be expressed as a fraction, or ratio of integers. Some irrational numbers, including π and e, fall into the more exclusive category of transcendental numbers, which are not solutions of any algebraic equation, that is, any polynomial equation with whole-number coefficients. Which goes to show that irrationality can equal transcendence, or vice versa.

There are many ways to calculate e, including this formula: e = 1 + 1/1 + 1/(1x2) + 1/(1x2x3) + 1/(1x2x3x4)… And so on to infinity. That comes out to 2.7 and change, just as pi equals 3.14 and change. e is helpful for projections of population growth and compound interest, which call for repeatedly feeding the result of a calculation back into the equation. e is connected to lots of other stuff, as indicated by this famous formula, known as Euler’s identity: e to the power of iπ + 1 = 0. (I spell out “e to the power of iπ” because I don’t know how to write in superscript, but see illustration below.) i, remember, is an imaginary number, the square root of negative one. Euler’s identity connects the two primal integers, 1 and 0, with imaginary and complex numbers, trigonometry, logarithms and exponents.

Euler’s identity
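For the skeptical, a few lines of Python reproduce both claims: the series adds up to 2.7 and change, and e to the power of iπ, plus 1, lands on 0, give or take a whisker of rounding error.

    import cmath
    import math

    # Euler's number from the series 1 + 1/1 + 1/(1x2) + 1/(1x2x3) + ...
    e_approx = sum(1 / math.factorial(n) for n in range(20))
    print(e_approx, math.e)                # 2.718281828... both times

    # Euler's identity: e to the power of i*pi, plus 1, equals 0.
    print(cmath.exp(1j * math.pi) + 1)     # ~0, up to rounding error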

Pondering Euler’s identity makes me giddy. So much packed into this little formula! It reminds us that mathematics is a great interconnected web. You can trace e through threads short and straightforward or long and tangled to π, i, 1, 0 and much, much more, including quantum mechanics. If I were more knowledgeable, I’d know how e connects calculus and linear algebra, the two major languages for describing quantum events. A century ago, theorists debated which language was correct, and eventually they realized that derivatives and integrals, on the one hand, and vectors and matrices, on the other, are two ways of describing the same strange quantum stuff.

Again, it’s hard to avoid the analogy with mysticism, especially the principle that all things are really one thing. Bertrand Russell fretted late in his career that mathematics reduces to tautologies, circular definitions, like “a four-footed animal is an animal.” Or 1 = 1. But is that intuition deflating or exhilarating? The latter, I think. 1 = 1 implies that, once you have something, you can’t have nothing, which would be like saying 1 = 0. That 1 in the equation 1 = 1 must be eternal, infinite, never-ending. Yes, 1 = 1 is circular, a tautology, but it’s profound; it resembles the mystical dictum Thou art that. And 1 = 1 leads to π and e and the whole astonishingly complicated, multi-dimensional landscape of mathematics and, by implication, human existence.

My thoughts, too, could be reducible to 1 = 1, or to a simple circle. I start with a trivial observation: I’m here, on a bench beside the Hudson River, writing in my notebook about Euler’s identity. I veer away from this observation, trying to find my way to other observations that might be a little less obvious, tautological. If I’m lucky, I stumble on something surprising. My quantum experiment will surely take me to strange new territories, where imaginary numbers wink in and out of existence. But narcissist that I am, I will always circle back to 1, to myself, my real/imaginary self, where everything begins and ends.

Are all stories, including the big story, the epic, cosmic adventure, reducible to 1 = 1? Or to a circle? That’s the question. My girlfriend Emily, when I run these musings past her, says existence goes round and round, yes, but it also has a direction. It’s a spiral, taking us somewhere we’ve never been before, which might be better or might be worse. Listening to her, I think about the coronavirus, about the upcoming election, about the perilous state of my country and the world, and it occurs to me that not all spirals take us in a good direction. Planes, the kind with wings, spiral before they crash. Maybe 1 can become 0 after all. Or maybe the ultimate equation is 0 = 0.

My Mathematical Interlude

Returning to Theoretical Minimum, I crash into a tutorial on the mathematics of vectors and Hilbert space. When Susskind talks about complex conjugates, bras and kets, inner products and orthonormal bases, he is speaking a lingo foreign to me. He anticipates my disorientation. Lenny and Art, the avatars of Susskind and his co-author, wander into “Hilbert’s Place,” and Art asks, “What is this? The Twilight Zone? Or some kind of fun house? I can’t get my bearings… Which way is up?”

I know how Art feels. Re-reading Lecture 1 of Susskind, I’m lost, spiraling through space like an electron that has lost its nucleus. I don’t know which way is up and which is down, or if up and down even exist. When, if ever, will I get to the bottom, the ground, of quantum mechanics? A place where I can set my feet, get my bearings, know which way is up?

Susskind calls his tutorials on complex numbers and vectors “Mathematical Interludes.” That phrase bugs me; it’s a contradiction in terms. An interlude is, or should be, a moment of rest, respite, relaxation. Like watching Deep Space Nine, the Star Trek spin-off, while lying on my couch. Or doing nothing while lying on my couch.

Susskind’s interludes on complex numbers and vector spaces aren’t relaxing; they’re hard, for me, anyway. But let’s say I go with his format and insert a “Mathematical Interlude” here. What can I say at this point in my quantum education? I would divide math into three categories, or sets. Make that four sets. No, six.

1.     Math I learned and still know. Fractions, decimals, integers, positive and negative numbers, real numbers, percentages, irrational numbers, addition, subtraction, square roots, exponents, multiplication, division. A little simple algebra. When I set it all down, it looks like a lot, but when I read Susskind, my knowledge seems pathetically minuscule.

2.     Math I learned and forgot. Logarithms, sines and cosines, calculus, derivatives and integrals. Sets, series, sums, limits. Did I ever learn Euler’s number? The basis of “natural” logarithms? Maybe, I can’t remember. I’m hoping to relearn this stuff for my quantum project.

3.     Math I never learned but hope to learn for my quantum experiment. Partial differential equations, complex numbers, linear algebra, vectors and matrices and Hilbert spaces.

4.     Math I might have a shot at learning if I tried really hard, but I’m not going to bother, because life’s too short.

5.     Math I couldn’t learn even if I had dedicated myself to learning it in my youth, like the math underlying string theory.

6.     Math that hasn’t been discovered yet and may never be discovered.

Each set is a subset of the set below, which is much larger, and of the other sets, which are larger still. Set 1 is tiny, easily chartable. Sets 4 and 5, compared to set 1, are infinite. Set 6 is even more infinite. Set 6 may contain the math needed to resolve the riddles of quantum mechanics, to unify quantum field theory and general relativity, to explain how matter makes a mind.

This math might not be discoverable by any mere human. It might be discoverable only by a superintelligent machine, like Samantha in the film Her, the AI that sounds like Scarlett Johansson. [2] Perhaps this math doesn’t exist in any form. Except isn’t reality, including human consciousness, a demonstration proof that such transcendent math exists?

The Law of Laziness

Reading Susskind’s book reminds me of a phrase tossed around by editors at Scientific American when I started working there in the 1980s: hitting the wall. That describes the moment when an article gets too dense and technical for lay readers and even for us editors. Reading Susskind, I hit one wall after another.

Actually, “hitting the wall” is too metaphorically mild. I’ve fallen off a cliff and into a mineshaft, at the bottom of which is a dark, underground river that carries me to a realm of eternal, infinite unknowing. Except that sounds cool, like a mystical experience. I just feel stupid. I’m beginning to know how little I know. It’s a paradoxical kind of limit, in which adding increments of knowledge makes me feel dumber.

I tell myself little lies to motivate myself. I think, while reading Susskind, I’m getting this, I could totally do these exercises at the end of the chapter, but I choose not to because I’m eager to move on to the next cool thing. Hermitian operators, here we come! Meanwhile, the mean-spirited grouch in me mutters, Bullshit, you don’t get anything, or you’d do those supposedly easy exercises.

Susskind rewards me now and then with a big, philosophically resonant idea, like the principle of least action. This principle has a fussily technical definition, which differs depending on whether you are working within classical or quantum mechanics. The principle nonetheless stipulates, roughly, that nature economizes, minimizing not energy exactly but a related quantity called the action, which is tallied up over a system’s entire path. If you know the initial and final states of a system—like a rock teetering at the top of a hill and sitting at the bottom--the principle of least action helps you calculate what happens in between. That might mean simply finding the shortest or fastest path between the top and bottom of the hill.

A lifeguard can run faster than she can swim, and light goes faster in air than water, so the fastest path from A to B is not the shortest path. Unlike the lifeguard, light conforms to the principle of least action perfectly and without conscious calculation, which is pretty amazing if you think about it. From Wikipedia.

Shortest and fastest are not always equivalent. A lifeguard trying to reach a drowning man down the beach from her follows the least-action principle, ideally. She can run faster than she can swim, so she runs on the beach a little farther than if she were taking a straight path to the drowning man; that is her fastest path. In QED, Feynman invokes the principle of least action to explain why light seems to bend when it passes from air into water; light travels more quickly through air than water.
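Here is a crude numerical sketch of the lifeguard’s choice, with distances and speeds I made up for illustration. The point is only that the crossing spot that minimizes her time lies farther down the beach than the straight-line path would suggest.

    # Illustrative setup: the lifeguard stands 30 m up the beach; the swimmer
    # is 50 m down the shoreline and 20 m out in the water. All numbers assumed.
    run_speed, swim_speed = 7.0, 2.0                      # meters per second
    up_beach, down_shore, offshore = 30.0, 50.0, 20.0     # meters

    def total_time(x):
        """Cross the waterline x meters down the shore: run, then swim."""
        run = (x ** 2 + up_beach ** 2) ** 0.5 / run_speed
        swim = ((down_shore - x) ** 2 + offshore ** 2) ** 0.5 / swim_speed
        return run + swim

    # Brute-force search over crossing points, in 1 cm steps.
    best_time, best_x = min((total_time(k / 100), k / 100) for k in range(5001))

    # The straight line from lifeguard to swimmer crosses the waterline at 30 m.
    straight_x = down_shore * up_beach / (up_beach + offshore)
    print(round(best_x, 1), round(straight_x, 1))   # ~45 m vs 30 m: she runs farther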

Things get more complicated when you’re talking about possible paths described by probability amplitudes composed of complex numbers in Hilbert space; quantum “action” involves an especially abstract definition of energy. But quantum happenings still conform to the principle of least action. You can’t predict precisely what one photon will do when you aim it at a pane of glass; but according to Feynman, quantum mechanics plus the principle of least action tell you exactly how photons, collectively, reflect off the glass. The principle turns probabilities into certainties.

I like how 18th-century philosopher Pierre Louis Maupertuis, an originator of the principle, conveys its gist: “Nature is thrifty in all its actions.” The principle of least action evokes the second law of thermodynamics, which describes the tendency of a mysterious quasi-stuff called entropy to increase over time. The second law, more colloquially, accounts for why heat and other forms of energy tend to dissipate and things tend to fall apart. Can the second law be derived from the principle of least action, or vice versa? I look online, and there seems to be vigorous debate about this very issue.

Page from Quantum Notebook #1, May 29, 2020, which records my notes on trigonometry, Euler’s number and logarithms.

The principle of least action reminds me of The Laziness, the state I entered during my silent Buddhist retreat, when I stopped fretting and regretting and just chilled. Emily laughed when I described The Laziness to her. Your laziness isn’t spiritual, she told me, it’s just plain old laziness. Perhaps. Or maybe my laziness, rather than being a character flaw, stems from a profound principle of nature: the law of laziness. At this very moment, the law of laziness is inclining me to set aside Susskind’s interlude on complex numbers and watch Deep Space Nine. The law of laziness explains my tendency to become habituated to things, to sleepwalk through life without paying attention.

But if I were really lazy, would I be studying quantum mechanics? And doesn’t life violate the law of laziness? Life is such a hassle. Evolution, mutation, cellular division, copulation, speciation—so much effort! Especially human life, all that striving, ambition, struggle. The universe as a whole, with its swirling galaxies hurtling outward at an accelerating pace, seems pretty non-lazy; nonexistence would have been so much easier. There must be a force that counteracts the law of laziness, a force that accounts for the busyness of the cosmos and of life here on earth. But the ultimate end, the telos, of everything in the cosmos is stillness. Oblivion. That is what awaits us at the bottom of the hill.

My quantum experiment, when I think about it, is consistent with the law of laziness. Yes, I’m lazy, but I’m also prone to anxiety and melancholy. I avoid these moods by doing things that absorb me, like trying to see reality the way physicists see it. This task distracts me, keeps me occupied; it leads to a smoother trajectory through life. My quantum experiment, at this moment, represents the path of least resistance as I tumble toward my final resting place.

Notes

  1. After I whined about imaginary numbers online, a reader named Steve suggested a way to think about them. Here is my paraphrase, possibly erroneous, of Steve’s explanation: Let’s say you have a real particle, R, oscillating in real space along an x axis between x = 1 and x = -1. R’s velocity peaks at x = 0 and falls to zero at x = 1 and x = -1. Trying to model R’s oscillations, you imagine an unreal particle, U, moving at a constant speed in unreal space along an unreal circle with radius 1 centered at x = 0. Although U doesn’t exist, its steady, circular motion, projected onto the x axis, nicely models R’s oscillations. (A rough sketch of this picture in code appears after these notes.) Physics writer Michael Brooks also delves into the history and uses of imaginary numbers in his delightful book The Quantum Astrologer’s Handbook.

  2. As far back as the 1980s, Stephen Hawking and other physicists speculated that artificial intelligences might solve problems too hard for humans. I considered this idea in the penultimate chapter of The End of Science, titled “Scientific Theology, or the End of Machine Science.”
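Here, as promised in note 1, is a rough Python sketch of Steve’s picture. The angular speed is an arbitrary choice of mine; the point is that U’s shadow on the x axis swings between +1 and -1, moving fastest near x = 0 and pausing at the extremes.

    import cmath
    import math

    omega = 1.0                             # U's constant angular speed (assumed)
    for k in range(8):
        t = k * 0.5                         # time, in arbitrary units
        U = cmath.exp(1j * omega * t)       # U's position on the unit circle
        x = U.real                          # R's position: the shadow on the x axis
        v = -omega * math.sin(omega * t)    # R's velocity: peaks in size near x = 0,
        print(round(t, 1), round(x, 3), round(v, 3))   # zero at x = +1 and x = -1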