ChatGPT’s explanation of quantum computing is bullsh*t

Chris Ferrie
Jan 25, 2023

When you fire up ChatGPT, a large AI-based language model developed by OpenAI, the prime example it offers of its own capabilities is the prompt “Explain quantum computing in simple terms.” So, how does it fare? Well, you’ve already read the title of this article…

Of course, ChatGPT doesn’t give the same answer every time, but for this prompt, it is relatively consistent. Here is a typical response:

Quantum computing is a type of computing that uses quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. These operations can be performed much faster and more efficiently than with traditional computers, which use classical bits to represent data. In simple terms, a quantum computer uses quantum bits or “qubits” instead of classical bits, which can exist in multiple states simultaneously, allowing for much more complex and powerful processing.

Almost all of this is wrong. Let’s break it down. First, quantum computing does (obviously) involve quantum mechanics. But quantum computers don’t use superposition and entanglement as if they were some high-tech fuel. Quantum computers manipulate quantum data using the rules of quantum mechanics. Superposition and entanglement are words that describe things in quantum mechanics, but they are not ingredients that get extracted and burned to enable a new kind of computation. They are related to quantum computing only insofar as they are related to quantum mechanics. Invoking these concepts will not help you understand quantum computing.
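
If you want something concrete, here is what the word “superposition” actually refers to, in a toy NumPy sketch of my own (nothing here comes from ChatGPT or from any quantum software package). A qubit’s state is just a normalized pair of complex numbers, and “superposition” merely means that more than one of those numbers is nonzero.

```python
import numpy as np

# A qubit's state is a normalized vector of two complex amplitudes.
ket0 = np.array([1, 0], dtype=complex)   # the basis state |0>
ket1 = np.array([0, 1], dtype=complex)   # the basis state |1>

# "Superposition" just means more than one amplitude is nonzero.
psi = (ket0 + ket1) / np.sqrt(2)         # the state (|0> + |1>)/sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(psi) ** 2)   # [0.5 0.5] -- a fair coin flip, not two answers at once
```

Entanglement is the same kind of word: it describes how the amplitudes of several qubits are correlated, not a fuel you pour into the machine.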

Next is the idea that quantum computers are “faster” than classical computers. I decided to challenge ChatGPT on this. After all, the gate times of both kinds of computers are easy to look up on the internet. So I said, “Quantum computations are not faster. The gate times of quantum computers are hundreds of times slower than classical computers.” ChatGPT replied with some repetitive contradiction about slower gates but faster computation, so I asked, “How can the gate times be slower, but the computation be faster?” This was the reply:

The gate times of a quantum computer refer to the time it takes for the computer to perform a basic operation, such as a quantum logic gate, on one or more qubits. These gate times are currently slower than those of classical computers. However, the power of a quantum computer comes from its ability to perform many operations in parallel, which is a direct result of the properties of quantum mechanics, such as superposition and entanglement.

For example, a quantum computer can perform a calculation on all possible inputs at the same time, a classical computer would have to perform the calculation one input at a time. These parallel computations can result in an exponential speedup for certain types of problems, such as factoring large integers or searching large databases. While the gate times are slower, the number of operations that can be performed in parallel allows quantum computers to perform some calculations much faster than classical computers.

The claim that quantum computers “do computations in parallel” is a very common misconception, repeated even by practicing quantum physicists and used as a bullshit justification for the viability of quantum computing start-ups. It’s not at all how quantum computers work. I’ve written a simple explanation of quantum computing elsewhere, so I won’t dwell on it here beyond the toy sketch below.
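
To see why “evaluating all inputs at once” buys you nothing by itself, here is a little simulation in plain NumPy. It is my own toy illustration, not how anyone actually programs a quantum computer, and the function f is just a made-up stand-in. Put three qubits into a uniform superposition, attach f(x) to every branch, and then measure: all you ever read out is a single random input-output pair, which a classical computer could have produced by guessing a random input.

```python
import numpy as np

rng = np.random.default_rng()

n = 3           # number of "input" qubits
N = 2 ** n


def f(x):
    """A made-up placeholder for the function we 'evaluate on every input'."""
    return (3 * x + 1) % N


# Uniform superposition over all n-bit inputs: amplitude 1/sqrt(N) on each branch.
amps = np.full(N, 1 / np.sqrt(N), dtype=complex)

# "Evaluating f in parallel" just pairs f(x) with each basis state x;
# the amplitudes are untouched, so no branch is singled out.
branches = [(x, f(x)) for x in range(N)]

# Measurement returns ONE branch at random, with probability |amplitude|^2.
probs = np.abs(amps) ** 2
print("quantum readout:", branches[rng.choice(N, p=probs)])

# A classical computer gets the same kind of answer by guessing a random input.
x = int(rng.integers(N))
print("classical guess:", (x, f(x)))
```

The exponentially many “parallel branches” do nothing for you on their own. A genuine quantum algorithm has to arrange interference between the branches so that wrong answers cancel and right answers reinforce, and that trick only works for problems with special structure, which is why the known speedups cover things like factoring rather than computation in general. Anyway, let’s get to the point.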

ChatGPT acts like a professional bullshitter. But is it a bullshitter? No. ChatGPT is a machine learning model that has been trained on a vast amount of data from that totally not objectionable place we call the internet. This training leads it to generate text that resembles human-written text (human-written for now, anyway). You may be surprised to find out that most of the human-written text about quantum computing (and quantum physics more generally) is bullshit. Hence, ChatGPT appears to be a quantum bullshitter because the writing it mimics comes mostly from quantum bullshitters.

(Perhaps this suggests a good way to avoid your writing job being automated by AI — just ensure most of the training data is bullshit!)

But, even if the human-written text in its training data were correct, the training would not give it a deep understanding of the underlying concepts. In other words, ChatGPT can generate text that appears to be an explanation of quantum computing, but it does not truly understand the subject. It will get better at convincing us that it does, and the fact that we will be convinced is the surprising bit. Meanwhile, if you want to prepare yourself for the quantum bullshit AI bot takeover, read my book Quantum Bullsh*t.


Chris Ferrie

Quantum theorist by day, father by night. Occasionally moonlighting as an author. csferrie.com