The world is made for you, and it’s built with information. In this epic journey of a 5-minute read, we’ll explore the concept of information, its relationship to physics and entropy, and how our understanding of the universe is nothing but a process of acquiring and deciphering information.
What is information?
Information comes in various forms and definitions. In the age of data and technology, however, it’s often seen as the bits and bytes stored in hard drives or on cloud servers. But even this now ubiquitous form of information has its humble roots in an idea conjured by Claude Shannon in the late 1940s: information is surprise. Surprised? See, you just learned something.
I want you to pause for a moment and consider this because it is one of the most prophetic ideas devised by humanity. Information is clearly a thorny concept, but Shannon stripped it down to its core elements. He stripped it of meaning, context, and interpretation. By doing so, he made information measurable. He wrote in his seminal paper:
The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point.
Shannon was an engineer at Bell Labs, so this was a very practical problem for him. People wanted to send messages, and Shannon wanted to find the best way possible to do that. At the time, messages were sent via physical wires, but the principles remain in the wireless age. There is a source and a receiver. The source selects symbols, like the dots and dashes of Morse code, or which emojis to send via text. You might think the receiver is just a stand-in for something on the other end of the line, but Shannon’s great insight was that the receiver is the most important part of communicating information.
I am — or more abstractly, this text is — a source of information. The device you are reading this on can display about 150,000 different characters, including numbers, letters from various alphabets, and emojis. Here are ten random characters from that set: ᰠᒂ͢☟⡝ȧೂകᓤᡁ. But you, the receiver, already have some expectations for what letters will come next. You are not surprised when the letter ‘u’ follows the letter ‘q,’ for example.
Once you accept this idea of information, things start to get counterintuitive. Streaming a movie you’ve already seen downloads data, but it is a source of zero information for you since you know exactly what will happen. Meanwhile, the successive outcomes of a coin toss are a maximal source of information since the outcome of every toss is a surprise. One way to think about sources of information is how much you stand to learn from them. When put this way, it is obvious that information is relative to who the receiver is.
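Shannon made "surprise" precise: an outcome with probability p carries −log₂(p) bits of information. A minimal Python sketch of this idea, using the article's examples (the 150,000-character figure is the estimate given above, not an exact count):

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's self-information: how surprising an outcome with probability p is."""
    return -math.log2(p)

# A fair coin toss: each outcome is a genuine surprise worth 1 bit.
print(surprisal_bits(0.5))                    # 1.0

# A movie you have already seen: the next scene has probability 1
# for you, so it carries 0 bits of information.
print(abs(surprisal_bits(1.0)))               # 0.0

# One character drawn uniformly from ~150,000 possibilities
# carries about 17.2 bits of surprise.
print(round(surprisal_bits(1 / 150_000), 1))  # 17.2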
In summary, you are a receiver facing multiple sources of information that can provide you with answers selected from a fixed set of possibilities. If you don’t know what answers will be provided, that source “contains” information. If the source contains no information, then you must already possess it.
Information and physics
In science, Nature is the information source, and we are the receivers. But we don’t download raw data from Nature. The bits and bytes we receive are filtered through various lenses and interpreted in the context of neatly arranged models.
In a simplified view, the universe is composed of nothing but fundamental particles and forces that act between them. What we call the laws of physics dictate the rules that define the possible states of the universe. That’s all very boring. The universe as we actually experience it is determined by the information we have about which of those states are realized.
When you look out into the night sky, you don’t see particles and forces — you see a dazzling display of stars, planets, and galaxies. The information that Nature provides is encoded in the arrangement and behavior of these objects. And the deeper we peer into the cosmos, the more surprised we are and the more information we acquire.
But we can’t gain information without some interaction with the source. However, we can lose information. Returning to the streaming movie analogy: you may have forgotten the plot of the movie, in which case the source again provides surprises. This is an intuitive way to understand one of the most fundamental laws of physics we have.
Entropy and information
Entropy is often explained as a measure of “disorder” in a physical system. It was first introduced in the 19th century by the German physicist Rudolf Clausius and later given its statistical interpretation by Ludwig Boltzmann; it is a key concept in thermodynamics. The concept of entropy has a long history of confusion and misunderstanding, primarily due to its abstract nature. Initially, entropy was considered only in relation to heat transfer and thermodynamics, but later, its significance expanded to encompass a broader range of applications, including statistical mechanics and information theory.
The Second Law of Thermodynamics states that in any natural process, the total entropy of an isolated system can only increase or remain constant over time. In other words, systems tend to move from a state of order to a state of disorder, and this progression is irreversible. This principle has profound implications in various fields of science and engineering, from predicting the heat death of the universe to explaining why time flows in a single direction. Throughout their history, entropy and the Second Law have been a source of confusion and debate, but they make much more sense when considered from the point of view of information.
First, we equate entropy and information. When a system has high entropy, it is disordered, and we have few expectations about it. In other words, we can extract a lot of information from it. So high entropy naturally means high information: disordered systems offer more surprises and more potential for learning. In contrast, an orderly system offers little information because we already possess most of the information about it. The Second Law of Thermodynamics, in the language of information, states that the total information contained within a system can only increase or remain constant over time. This makes sense: we can’t increase our information without extracting it, and if we don’t extract it, the information contained within the system must remain the same. If we modeled the system incorrectly, or have simply forgotten something about it, the information we can gain from it goes up.
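The equivalence sketched above can be made concrete with Shannon entropy, the average surprise of a source. A small illustrative sketch (the four-state system here is a toy example, not anything from the article):

```python
import math

def entropy_bits(probs):
    """Shannon entropy: the average surprise of a source, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A perfectly ordered system: one state is certain,
# so there is nothing left to learn from it.
ordered = [1.0, 0.0, 0.0, 0.0]
print(entropy_bits(ordered))     # 0.0

# A maximally disordered system: four equally likely states,
# and every observation yields a full 2 bits of surprise.
disordered = [0.25, 0.25, 0.25, 0.25]
print(entropy_bits(disordered))  # 2.0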
The physical world as information
Everything we observe and interact with is a manifestation of information. From the smallest subatomic particle to the vast expanse of the universe, the fundamental nature of reality is encoded in the form of information. Taking this perspective to heart eases the understanding of otherwise complicated ideas in physics and gives us insights into the nature of reality and our place in the cosmos.
From an information standpoint, the idea of an observer-independent reality becomes untenable. Since our understanding of the universe is fundamentally based on the information we acquire and interpret, the notion of reality is inherently tied to our experiences as observers. The act of acquiring information (observing or measuring) alters the information content of the system. This suggests that our perception of reality is not only shaped by the information we receive but also by our active engagement with the world as observers. Thus, an observer-independent notion of reality is impossible.
Now, you may be concerned that this is starting to sound solipsistic. However, it is crucial to recognize that while our understanding of the universe is inherently tied to the information we receive and interpret, this does not imply that reality exists solely in our minds. Instead, our collective experience and intersubjective understanding of the world help to establish a shared sense of reality that transcends individual perceptions.
Intersubjectivity refers to the shared understanding and interpretation of reality by multiple observers. In the context of information, intersubjectivity allows us to create a common sense of reality by collectively agreeing upon the significance and meaning of information. As we experience and gather information from the world around us, we compare, contrast, and reconcile our individual interpretations with those of other people. This process helps to refine and solidify our collective understanding of reality, allowing us to build a consistent and coherent model of the universe we can all share and agree upon.
Through the process of gathering, sharing, and refining information, we can construct a coherent and consistent model of the universe. The emphasis on information and its role in shaping our understanding of the cosmos should not be seen as solipsistic but rather as an acknowledgment of the inextricable link between our perception of reality and the information we acquire. Reality is our shared knowledge of the world around us. Without us, necessarily including you, there is no reality to speak of.