Myth 1: Nobody Understands This Quantum Stuff

Chris Ferrie
14 min read · May 27, 2024

“Nobody understands quantum physics.” — Richard Feynman

What a quote! But, to be fair, there are so many to choose from!

“Those who are not shocked when they first come across quantum mechanics cannot possibly have understood it.” — Niels Bohr

“Quantum mechanics makes absolutely no sense.” — Roger Penrose

“If it is correct, it signifies the end of physics as a science.” — Albert Einstein

“I do not like it, and I am sorry I ever had anything to do with it.” — Erwin Schrödinger

Cute. But no one would guide their philosophical attitudes and technology predictions based on what a bunch of dead guys didn’t understand, right? I mean, who would do that? Everyone, that’s who!

Here’s how the argument might go. While the principles we do understand have been enough to begin building quantum computers, it’s possible that as we push the boundaries of our understanding, we might discover new principles or limitations that make quantum computing impossible or impractical. But this is just plain wrong. We understand quantum physics exceptionally well — so much so that we have built our entire modern society through the exploitation of our understanding of it.

You are already using quantum technology

Don’t think that all this hype about quantum technology is merely going to lead to a societal revolution — because it already has! Every piece of modern technology has the fingerprints of quantum physics on it. Quantum computers seem less miraculous, and their inevitability becomes more acceptable when this is understood. So, let’s understand it! But first, there is something important to keep in mind.

While our precision engineering and control at the microscopic scale improved through a continuous process of refinement, there is still a clear distinction between first-generation quantum technologies, which you will learn about in this chapter, and second-generation quantum technologies, which include quantum computers. Once we understood the fine structure of light and matter, many new paths in understanding and engineering opened up. Yet these did not require the manipulation of individual atoms or photons to discover, nor did they need access to the fundamental constituents to exploit. Now we can control the world down to individual atoms. It’s still quantum physics, but it provides us with more possibilities — most of which we probably don’t even know about yet!

Consider the following analogy. When you bake a cake, you start with a collection of individual ingredients: flour, sugar, eggs, and so on. As we gained the ability to refine and perfect the quality of these ingredients, our culinary achievements evolved from avoiding starvation to celebrity bake-offs. But we can still only mix these ingredients together to form a general batter. This is similar to how quantum physics informed us of the constituents of matter while we remained limited to exploiting them in bulk quantities. Now, imagine being able to alter each grain of flour or crystal of sugar, much as we can now manipulate individual atoms. Gordon Ramsay could demand customization of reality TV cakes to an unprecedented degree, with requirements on texture, taste, and appearance that push them beyond recognition as food. Controlling individual atoms could allow us to design the world from scratch, producing things beyond our imagination and unrecognizable to intuitions trained in a classical world.

Although that is beginning to sound like hyped-up science fiction, our ability to manipulate the world at the most fundamental scale is better seen as the natural evolution of technological progress. But what will it evolve from? Let’s look at some examples.


Transistors

Arguably, the most important technological consequence of quantum physics is the mighty transistor. If you are sitting on your mobile phone right now, you are sitting on a billion of these now-tiny devices. But they weren’t always tiny, and the story of this technology is at least as old as quantum theory.

In the 19th and early 20th centuries, physicists discovered that silicon and some other materials had electrical properties that fit between conductors and insulators — they could conduct electricity under certain conditions but not others. Classical physics could not explain this behavior, nor could it aid in exploiting these properties. Quantum physics provided the needed theoretical framework to understand these materials. The discrete energy levels of Bohr’s atomic theory generalize to the so-called band theory of solids. Instead of the specific energy levels for electrons in a single atom, bulk materials have “bands” and “gaps” in how electron energy can be organized in materials. Band theory suggests that conductors have many energy levels for electrons to move into, which allows electric current to flow easily. Insulators, on the other hand, have large “band gaps,” preventing electron movement. Semiconductors have a small band gap that electrons can cross under certain conditions.
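
The three cases above can be sketched numerically. In the snippet below, the band-gap values are approximate room-temperature textbook figures, and the classification thresholds are conventional choices for illustration, not taken from the text:

```python
# Crude band-theory classification: materials with no gap conduct,
# small gaps make semiconductors, large gaps make insulators.
# Band-gap values are approximate room-temperature textbook figures.
BAND_GAP_EV = {
    "copper": 0.0,      # conductor: overlapping bands, no gap
    "germanium": 0.66,  # semiconductor
    "silicon": 1.12,    # semiconductor: the workhorse of electronics
    "diamond": 5.47,    # insulator: gap too large for electrons to cross
}

def classify(gap_ev: float) -> str:
    """Classify a material by its band gap (thresholds are conventional)."""
    if gap_ev == 0.0:
        return "conductor"
    if gap_ev < 4.0:
        return "semiconductor"
    return "insulator"

for material, gap in BAND_GAP_EV.items():
    print(f"{material}: {gap} eV -> {classify(gap)}")
```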

The detailed band structure of any material can, in principle, be deduced by solving Schrödinger’s equation. In practice, this is an impossible task for all but very contrived scenarios. Thus, most of the art and science of the quantum physics of solid materials lies in crafting useful approximations. Indeed, the sole purpose of the largest subfield of physics — called condensed matter physics — is approximating quantum physics for large systems of atoms. Basically, any person or company that has sourced a product, component, or ingredient with specific properties has at least indirectly benefited from quantum physics research. A modern transistor, for example, demands a long list of properties to function correctly.

The transistor was invented in 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Labs. While “big” compared to today’s transistors, it was still only the size of a coin. Its purpose was to amplify electrical signals, replacing the bulky and much less efficient vacuum tube. By using two closely spaced metal contacts on the surface of a small chunk of semiconductor material, the transistor could regulate electric current by exploiting the band structure of the material. After the initial demonstration, progress was rapid. In addition to replacing vacuum tubes as amplifiers in electronic circuits, transistors replaced them as the switches inside computers. By 1954, the first “all-transistor” computer was built, and the rest, as they say, is history.


Lasers

A laser used to be a LASER (Light Amplification by Stimulated Emission of Radiation), but it is now so ubiquitous that it’s both a noun and a verb. While you don’t want to be arbitrarily lasered, whenever you scan a barcode at the supermarket, for example, you are harnessing the power of lasers.

At the heart of laser operation is the phenomenon called “stimulated emission.” This process was first described by Einstein in 1917, using the early ideas of quantum physics. Recall that an electron in an atom sits on discrete energy levels. When the electron changes levels, it absorbs or emits a photon. The energy of the photon is exactly the difference in the energy levels. Spontaneous emission happens when atoms in high-energy states randomly drop to low-energy states. All the light you see, and all the light ever seen before the 20th century, was due to atoms randomly changing energy levels. Einstein suggested that an atom in an excited state (with its electron at a high energy level) could be stimulated to drop to a lower energy state with a photon matching the energy level difference, thereby creating two identical photons.
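
The bookkeeping in stimulated emission is simple arithmetic: the photon’s energy is exactly the gap between the two levels, E = hf. A minimal sketch using exact SI constants; the 1.9 eV gap is a made-up example, not a value from the text:

```python
# A photon emitted in an atomic transition carries exactly the
# energy-level difference: E = h * f, and wavelength = c / f.
H = 6.62607015e-34    # Planck constant, J*s (exact SI value)
C = 299_792_458       # speed of light, m/s (exact SI value)
EV = 1.602176634e-19  # one electronvolt in joules (exact SI value)

def photon_wavelength_nm(gap_ev: float) -> float:
    """Wavelength of a photon whose energy equals the level gap."""
    frequency_hz = gap_ev * EV / H  # E = h*f  =>  f = E/h
    return C / frequency_hz * 1e9   # lambda = c/f, converted to nm

# A hypothetical ~1.9 eV transition emits red light:
print(f"{photon_wavelength_nm(1.9):.0f} nm")  # about 650 nm
```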

Despite Einstein’s theoretical work, it wasn’t until the 1950s that scientists were able to build devices that took advantage of stimulated emission. The first was developed by Charles Townes and his colleagues at Columbia University in 1954, though it wasn’t yet practical and amplified only microwave frequencies rather than visible light. A proper laser was first demonstrated in 1960 by Theodore H. Maiman at Hughes Research Laboratories. It was red, by the way. But while impressive, it was still famously described as a “solution looking for a problem.” Interestingly, most applications of lasers today solve problems no one even dreamed of having in 1960.

Just like the transistor, the laser has found many uses. Today, lasers are used in a wide variety of fields, from medicine and telecommunications to manufacturing and entertainment. They are used to cut steel, perform delicate eye surgeries, carry information over fiber-optic cables, read barcodes, read and write music and video onto plastic discs, and create mesmerizing light shows. Our modern world could not exist without the laser, which will continue to pay dividends when applied to second-generation quantum technology.

Atomic clocks

If you’ve ever used GPS (Global Positioning System) to navigate, which is nearly impossible to avoid unless you carry no devices and drive yourself around in a twenty-year-old vehicle, then you’ve indirectly used an atomic clock. Since it has the word “atom” right in it, you know it has something to do with quantum physics.

The development of atomic clocks was made possible by understanding the detailed internal quantum nature of atoms. Electrons in atoms occupy discrete energy levels and can transition between these levels by absorbing or emitting specific frequencies of light. That’s old news to us now. What’s new is that these levels can be manipulated with electric and magnetic fields; in particular, a single energy level can be split into several. This splitting can even come from electric and magnetic fields generated within the atom itself — electrons carry an electric charge, after all! The natural splittings are subtle but reveal what came to be called the fine and hyperfine structure of atoms. In the 1930s, Isidor Rabi developed the technique of magnetic resonance, enabling the precise measurement of these features.

Since the energy level structure is a fixed property of atoms, and the frequency of the light they emit would create a very precise “ticking,” Rabi later suggested that atoms could be used to define an extremely stable clock. In 1949, the United States National Bureau of Standards built the first atomic clock using ammonia. But it was the cesium-based clock, developed in 1955 at the National Physical Laboratory in England, that truly revolutionized timekeeping.

Cesium-based atomic clocks work by measuring the frequency of the light involved when cesium atoms transition between two specific energy levels. The radiation from cesium’s lowest-energy hyperfine transition oscillates exactly 9,192,631,770 times per second, and this frequency proved so constant and reliable that in 1967 it was adopted as the new standard for the second, replacing the previous standard based on the Earth’s orbit. Our definition of time itself is based on quantum physics.
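
Since the definition is just a count of oscillations, the arithmetic is easy to sketch. The off-by-1 Hz oscillator below is a hypothetical, used only to show how stable the standard is:

```python
# One SI second is exactly 9,192,631,770 cycles of cesium's ground-state
# hyperfine transition, so timekeeping reduces to counting cycles.
CS_FREQ_HZ = 9_192_631_770

def elapsed_seconds(cycles: int) -> float:
    """Convert a cycle count into elapsed time."""
    return cycles / CS_FREQ_HZ

print(elapsed_seconds(CS_FREQ_HZ))  # exactly 1.0 second

# A hypothetical oscillator off by a full 1 Hz would accumulate only
# microseconds of error per day:
fractional_error = 1 / CS_FREQ_HZ
print(f"{fractional_error * 86_400 * 1e6:.1f} microseconds per day")
```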

These incredibly precise timekeepers have been used in a variety of applications, from synchronizing telecommunications networks like the internet to testing the predictions of Einstein’s theory of relativity. However, perhaps their most well-known application is in the Global Positioning System. Each of the satellites in the GPS constellation, at least 24 of which are operational at any time, carries multiple atomic clocks on board. The tiny differences in the time signals sent by these clocks, caused by their different distances from the receiver, allow the receiver’s position to be determined to within a few meters. Measuring distances directly is hard, but since light has a constant speed, distance can be inferred from how long a signal takes to travel — provided you can accurately measure time.
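
The inference at the end works out to d = c × t. A minimal sketch; the 67-millisecond travel time is an approximate figure for a satellite at GPS altitude (about 20,000 km), not a number from the text:

```python
# Distance from light travel time: d = c * t. A GPS receiver turns
# measured signal delays into distances this way, which is why
# nanosecond-level timing from atomic clocks matters.
C = 299_792_458  # speed of light, m/s (exact)

def distance_m(travel_time_s: float) -> float:
    """Distance covered by light in the given time."""
    return C * travel_time_s

# Roughly 67 ms of travel time puts the satellite ~20,000 km away:
print(f"{distance_m(0.067) / 1000:,.0f} km")

# A mere 1-nanosecond timing error shifts the inferred distance by ~30 cm:
print(f"{distance_m(1e-9) * 100:.0f} cm")
```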

The atomic clock is a prime example of a quantum technology that has become indispensable in our modern world. Whether it’s enabling global navigation or the internet, the atomic clock demonstrates that we are living in the quantum technology revolution — the first one, anyway.

Magnetic resonance imaging (MRI)

Magnetic Resonance Imaging, commonly known as MRI, is a powerful tool in the medical world, providing detailed and non-invasive images of soft tissues in the body — something traditional X-rays struggle with. The technology behind MRI is rooted in quantum physics, drawing directly on the principles of magnetic resonance mentioned in the context of atomic clocks.

One of the hallmarks of quantum physics was the discovery of spin, a property internal to subatomic particles that forces them to act like tiny magnets. Individually, they are very weak and, when surrounded by others aligned in random directions, are impossible to detect. However, they will align themselves with a strong enough magnet, and that’s where the giant superconducting coil magnets of an MRI machine come in.

In MRI, the focus is mainly on the spin of hydrogen nuclei, due to the abundance of water in our bodies. When placed inside the MRI machine’s strong magnet, these spins align with it. A radio-frequency signal is then applied perpendicular to this magnetic field, causing the spins to tip away from their aligned state. When the signal is removed, the spins return to their original alignment, emitting signals of their own in the process. These signals, from the collective relaxation of billions upon billions of tiny magnets deep within hydrogen atoms, are what the MRI machine captures and converts into images.

Different densities and types of tissue result in varied relaxation times, meaning the signals received from various tissues differ. This difference provides contrast in the images and enables the detailed visualization of organs, tumors, and other structures.
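
This contrast mechanism can be caricatured with a simple exponential decay model. The time constants below are rough illustrative values, not clinical parameters:

```python
import math

def signal_remaining(t_ms: float, t2_ms: float) -> float:
    """Fraction of transverse signal left after time t (simplified model)."""
    return math.exp(-t_ms / t2_ms)

# Hypothetical relaxation time constants (rough illustrative values):
tissues = {"muscle": 45.0, "fat": 85.0, "fluid": 250.0}

# At an intermediate readout time, the tissues separate cleanly,
# which is the source of image contrast:
readout_ms = 90.0
for name, t2 in tissues.items():
    print(f"{name}: {signal_remaining(readout_ms, t2):.2f}")
```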

Without understanding the spin properties of atomic nuclei, the development of MRI would not have been possible. Today, MRI is used globally to diagnose a myriad of conditions, from brain tumors to joint injuries, showcasing yet another practical application of quantum physics in our everyday lives.

And many more…

While we have covered some of the most significant applications of quantum physics in technology, there are, of course, many more. For instance, nuclear energy and, unfortunately, weapons are widespread applications relying heavily on the principles of quantum physics via nuclear physics. Without knowledge of the internal workings of atoms, we would not have the over 400 nuclear reactors providing 10% of the world’s electricity, nor would we have the over 200 research-grade reactors that also produce radioactive material for industrial and medical purposes.

Speaking of which, beyond MRI, quantum physics has had a profound impact on medical technology. Positron Emission Tomography (PET), for example, uses the positron, the obscure-sounding antiparticle of the electron, to construct images of the body’s internal processes. A PET scan introduces a small amount of radioactive material into the body, which emits positrons. When an emitted positron meets an electron, the two annihilate and produce two gamma rays traveling in opposite directions. The PET scanner detects these gamma rays, infers their source, and hence maps out the journey of the radioactive material within the body.
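
The energy of those gamma rays follows directly from E = mc²: each photon carries the rest energy of one electron. A quick check with standard constants:

```python
# Electron-positron annihilation converts the rest mass of each particle
# into one photon apiece, so each gamma ray carries E = m_e * c^2.
M_E = 9.1093837015e-31  # electron mass, kg
C = 299_792_458         # speed of light, m/s
EV = 1.602176634e-19    # joules per electronvolt

rest_energy_kev = M_E * C**2 / EV / 1e3
print(f"each gamma ray: {rest_energy_kev:.0f} keV")  # 511 keV
```

This fixed, known energy is exactly what makes the gamma-ray pairs easy for the scanner to identify.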

Semiconductor technology has led to numerous advances over the years beyond the ubiquitous transistor. While all of it relies on quantum physics for the reasons mentioned above, quantum knowledge does double duty in tunnel diodes. In these devices, electrons can “tunnel” through energy barriers instead of having to go over them, as classical physics would demand. Various versions of these devices are used to improve everything from Christmas lights and solar cells to lasers and thermometers.
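
How strongly tunneling depends on barrier width can be estimated with the standard WKB approximation, in which transmission falls off as exp(−2κL). The 1 eV barrier and nanometer-scale widths below are illustrative numbers, not parameters of any real device:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def transmission(barrier_ev: float, width_nm: float) -> float:
    """WKB estimate of an electron's tunneling probability through a
    rectangular barrier of the given height above its energy."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Classically the electron never crosses; quantum mechanically it leaks
# through, and halving the width helps by orders of magnitude:
for width in (1.0, 0.5):
    print(f"{width} nm barrier: {transmission(1.0, width):.1e}")
```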

Moreover, quantum concepts and mathematics have begun to find applications in fields far removed from traditional physics. Quantum biology, for instance, applies quantum principles to biological processes, investigating phenomena like photosynthesis and bird navigation, where a classical description alone may be insufficient. “Quantum” finance borrows mathematical techniques from quantum mechanics to model financial markets and to understand their seemingly random fluctuations better.

Quantum physics is far more embedded in our everyday lives than we might initially realize. From the electronic devices we use to navigate our world, to the medical technologies that help diagnose and treat illnesses, to the mathematical models that power our financial systems — the principles of quantum mechanics are an integral part of our intentionally engineered modern world. And this doesn’t even touch on the second-generation quantum technologies on the horizon. As our control of quantum systems continues to improve, more applications will come. All of which raises the question: what exactly don’t we understand about quantum physics?

Quantum teleology

Quantum physics is almost always taught chronologically. Indeed, I just did that in the previous chapter. You read about a long list of 20th-century scientific heroes who uncovered the wild and untamed world behind our fingertips. The story had modest roots in Planck’s 1900 hypothesis that energy is discrete. Though we didn’t need to make it that far for the purpose of introducing quantum computers, the standard tale of quantum physics usually crescendos with John Bell’s work on “spooky” entanglement in the 1960s. Today, as the story goes, we are on the cusp of the yet-to-be-written second quantum revolution.

Along the way, the standard story tells of a piece of great machinery, called quantum mechanics, that was being built alongside quantum physics and that allowed graduate students to blindly turn the mathematical crank to make predictions about newer and more extreme experiments. It is often said that generations of physicists would “shut up and calculate” to earn their degrees and professorships, eventually to repeat the program with the next cohort. The wild horse of quantum physics had apparently been stabled but not tamed.

When Richard Feynman made his casual remark about “understanding,” he was not only opening his now-famous lecture on quantum physics but also inadvertently venturing into the realm of philosophy, a field he famously derided. In epistemology — the study of knowledge itself — understanding is not just mechanical mastery but a rich concept that probes the underlying meaning of, and connection between, ideas. This branch of philosophy seeks to clarify the very nature of knowing and being. As you might expect from such a grandiose task, there’s more disagreement than agreement amongst philosophers on even the definition of the word.

On the other hand, we all have some intuitive notion of understanding. In the everyday world, understanding might be likened to knowing how to ride a bike, but in physics, it’s about grasping the forces that make the bike move. You might not know the physics behind balance and motion, but you come to “understand” how to ride it through feel and experience.

When we talk about physicists’ understanding, we’re stepping into a workshop where the universe’s machinery is laid bare. A physicist strives to see the gears and levers behind phenomena, aiming for an intuitive grasp of why things happen as they do. This is not merely knowing the equations but being able to feel them, like instinctively leaning into a turn.

However, in the realm of quantum physics, the rules of the game seem to change. Understanding here is like trying to catch smoke with your bare hands — it slips and dances between your fingers. The intuitive mechanics known in classical physics fade, and in their place are abstract mathematical objects and formulas. You can follow them, learn them, even use them, but a concrete mechanical understanding of them appears impossible. Imagine riding a bike — built by no one — with all its mechanisms hidden and impossible to reveal.

If all the mixed messages about quantum physics confuse you, I want you to erase everything you know about it and memorize the next paragraph.

Quantum physics is a branch of science that describes highly isolated systems — things that don’t interact randomly with other stuff around them. Traditionally, these are small, like atoms, but now we can engineer artificial systems under high isolation. Anything that is extremely isolated requires quantum physics to be described accurately. The information such things encode is quantum information. If you attempt to use classical physics or classical information to make predictions or statements about such things, you may end up being very wrong.

Quantum physics does not tell us what reality lies beyond our experience. It only tells us that, whatever it is, reality is nothing like the mechanical worldview we have come to take for granted in realms where it works. Thus, if “understanding” demands such an explanation of the world — displaying the causes and effects that make it go — indeed, no one yet has it.

But this myth is not about philosophy because this is a book about technology. So, we will stick to the everyday use of words. That is, using a tool is exactly how you come to understand how it works. Generally, people stop demanding explanations of things they are familiar with. We will eventually become so familiar with quantum technology that understanding will be a word reserved for one’s ability to successfully navigate its use. In much the same way that my inability to “understand” the point of TikTok does not stand in the way of it being a successful app, our lack of “understanding” quantum physics does not stand in the way of building quantum computers.

If Feynman were alive today, I think he’d contextualize this quote better. He might say, “Quantum mechanics cannot be understood using classical physics and information — it must be understood in the language of quantum information.” Or, perhaps, he’d be pithier and say, “Nobody speaks quantum information.” Indeed, nearing the end of his career, he said, “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly, it’s a wonderful problem, because it doesn’t look so easy.” The 1981 workshop at which he said this is often considered the birthplace of quantum information and computation.



Chris Ferrie

Quantum theorist by day, father by night. Occasionally moonlighting as an author.