The promises of quantum computing

Reading time: 14 minutes

Quantum computing for beginners (and regulars of this notebook …).
Translation by AB – September 13, 2020


Scientifically proven facts remain immutable but their explanation varies with the progress of knowledge. The theories of Darwin and Pasteur are already outdated. The atom, once a miracle of simplicity, has become a miracle of complexity.

Gustave Le Bon – Les incertitudes de l’heure présente

Google and others are reportedly not far from being able to demonstrate "quantum supremacy" [1]. This expression refers to a technical configuration, based on the "quantum" properties of matter at the atomic scale, that allows calculations no classic computer could materially finish before the extinction of the universe. This configuration remains to be achieved, and it is at this very moment the object of a bitter struggle. It will then have to be made to carry out "interesting" calculations, that is to say economically profitable or strategically necessary ones. One of the calculations achievable by such a machine is indeed interesting, but we will see that it poses a big problem.

Classic computing

The digital environment is emerging (Emergence of the digital natural environment) and unleashing economic, social and political powers… which we, as citizens and consumers, are still struggling to grasp. We are nevertheless aware that since the mid-1990s we have been going through an extraordinary transition period determined, to a large extent, by information technologies (other factors such as ecology and demography are obviously fundamental but do not directly concern us here).

So-called “classic” computer science, born in the middle of the 20th century, is the result of the convergence of theoretical work in many fields: mathematics, physics, chemistry, ergonomics, cognitive psychology… Computer science, which now serves as a social and natural environment, is also of interest to medicine, morals, law, education… and is now shaping our culture. It has taken nearly 80 years of technical progress and transdisciplinary evolution to get to this point: Information Technology (“IT”) is a mature, almost transparent technology, fully integrated into our socio-economic and political models. Even more: it shapes them. From then on, we become in some ways what it offers us.

Reductionism

One of the most fascinating aspects of this sudden historical movement is that it has become impossible to conceive of the (causal) relationship between the boundless manifestations of the digital environment in our daily lives and the "basic physics" of the digital world, founded on the mathematical "bit". Let us recall that a bit is the smallest possible quantity of information in classic computing: it is "worth" 0 or 1, it is "on" or "off". There is today the same kind of "conceptual distance" between one of the bits that animate, for example, the software of our favorite video game, and one of the atoms that make up our strawberry ice cream (fortunately, we no more need to know atomic physics to enjoy an ice cream than we need to know Boolean logic to play Crash Team Racing Nitro-Fueled).

There is, however, one major difference between the bit and the atom: the former is artificial. So everything that happens on the scale of successive orders of magnitude, from the bit to the most general manifestations of digital (such as a computer virus and its global effects or, better still, the phenomenon of human replacement by robots) is technically defined by humans. It is this rise in successive orders of magnitude that leads to the gradual loss of the causal and explanatory link with the bits.

Orders of magnitude

A smartphone contains around 10¹¹ bits (1 followed by 11 zeros). Every second, the world creates around 10¹⁵ bits (a million billion), or roughly 10²² bits every year. It's huge, but a strawberry ice cream contains 10²⁷ atoms, each atom being infinitely more "complex" than a bit. Despite the extraordinary global inflation of the bits that govern our societies and the enormous electricity consumption needed to animate them (About this delusion of a "green" digital world), they live in a "microscopic" order of magnitude: the "atomic" substrate of the digital world is ridiculously small and simple. Compared to the physical world, it is only an infinitesimal froth.
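For readers who like to check figures, here is a small back-of-the-envelope calculation in Python; the input values are simply the rough estimates quoted above, nothing more:

import math

# Back-of-the-envelope check of the article's orders of magnitude.
# All input figures are the article's own rough estimates.
bits_created_per_sec = 10**15   # ~a million billion bits created worldwide each second
atoms_in_ice_cream   = 10**27   # order of magnitude for ~100 g of matter

seconds_per_year = 365 * 24 * 3600                       # ~3.15e7
bits_per_year = bits_created_per_sec * seconds_per_year  # ~3e22, hence "10^22 bits every year"

print(f"bits created worldwide per year: about 10^{math.log10(bits_per_year):.0f}")
print(f"one ice cream holds roughly {atoms_in_ice_cream / bits_per_year:,.0f} times more atoms")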

The nature of the problems that we can practically solve is therefore, whatever we may think (artificial intelligence, etc.), extremely limited. The computer protection methods we use rely on these intrinsic limits: good encryption guarantees that decoding it would take far more computing resources than there are atoms in a strawberry ice cream…

So-called “quantum” computing promises to radically change the order of magnitude and unleash something else. Let’s briefly explain what it is all about.

Qubit

The bit (0 or 1, on or off) is therefore the atom of classic computing. Quantum computing uses another atom: the "qubit" (quantum bit), which can occupy an infinite number of intermediate states between 0 and 1. For now, it is enough to understand this: the qubit is infinitely "subtler" than the bit.

The most visual and accurate representation of the state (the "value") of a qubit is a geographical point on a sphere: at the North Pole sits the classic value 0, at the South Pole the classic value 1. A classic bit can thus occupy only these two positions, but the qubit can be "placed" anywhere else on the sphere:

[Figure: the possible states of a qubit on the sphere]

It is neither really “on” nor really “off”, as Schrödinger’s famous cat would neither be really alive nor really dead.
Where does this strange idea that a qubit would be useful for our calculations come from? Not from mathematics, like the classic bit, but from atomic physics. In a way, it’s a question of taking advantage of the laws of nature.

Richard Feynman

Richard Feynman (1918-1988) was one of the most brilliant and influential physicists of his generation. In 1965, he won the Nobel Prize in Physics for his work on quantum electrodynamics ("QED"). This theory describes the behavior of elementary particles (electrons, photons, etc.) with extraordinary precision. At this very small scale (a billionth of a meter at most), the physical rules have nothing to do with those we know from everyday sense experience. We need a mathematical "translation" and an interpretation (a way of speaking) that allows our intuition to play its imaginative and creative role. This is very difficult [2]:

Many years later [ after the Nobel Prize ], in a lecture, [ Richard Feynman ] said he didn’t expect the audience to understand QED, because he didn’t either – the rules are screwy, and what he found particularly galling was that the theory relied on probability, rather than certainty.

Let's remember this: quantum physics is "screwy", difficult to understand (it deals with phenomena of which we can make no intuitive sense) and speaks in probabilities…

Computers were beginning to develop in Feynman's day, and physicists used them to calculate trajectories, simulate the evolution of small systems, and so on. But these computers proved incapable of simulating quantum systems, which were too complex. So, in 1982, the "inventor" Richard Feynman published an article entitled "Simulating Physics with Computers", in which he explains that to simulate quantum systems (electrons, photons… at the atomic and subatomic scale), you need… quantum computers, that is, computers that work like the objects they seek to simulate. It's a bit like simulating the brain with real neurons rather than with mathematical simulacra…

It is therefore a physicist who, almost 40 years ago, inspired quantum computing. This is quite telling: a mathematician like Alan Turing, one of the leading theorists of classic computer science (Turing's Body), could never have imagined it.

Quantum Strangenesses

The atom of quantum computing is therefore the qubit. It obeys the laws of quantum physics and therefore behaves "like" an elementary particle. These laws are truly strange, difficult to explain and to represent; physicists have deduced them from their observations. We explain here one of these principles, which is not the most important one for quantum computing but which is enough to grasp the radical difference in nature between a bit and a qubit (for a few other details, see the end of the article).

The state of the qubit (its location on the sphere) is governed by mathematical equations. It is not located just anywhere and does not "move" just any way. Yet we will never be able to observe it anywhere other than at the North or South Pole, i.e. in state 1 or state 0, like a classical bit! In other words, isolated from the outside world, "on the sly", the qubit comes to life on the sphere: it swings between 0 and 1 according to the laws of quantum physics. But if we want to know what state it is in, we have to observe it, hence exchange energy with it, and this exchange then freezes it in one of the states 0 or 1. The laws of quantum physics only give us, at each instant, the probability of observing it in one of the two classic states.

Heads or tails: the qubit is like a coin thrown in the air. During its flight, the coin obeys the precise laws of classical Newtonian mechanics. According to these laws, it describes a parabolic trajectory and spins on itself. But given the imprecision of our knowledge of the whole experiment (initial speed, state of the ground at the point of impact, wind, etc.), it is as likely to land on heads as on tails. Only once it is on the ground do we know the result.

Likewise, it is only once observed that a qubit reveals itself… as a bit. This is why we can still speak of computing: the solution to a problem, the result of a quantum algorithm that animated the qubits during their "flight", is in fact of a binary nature (heads or tails), i.e. classic.
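For the curious, here is a minimal sketch in ordinary Python of these two ideas. It is not a real quantum simulator: the function names and the latitude/longitude parametrization of the sphere are ours, purely for illustration. It only shows that the state of a qubit lives on a sphere, yet observing it only ever yields 0 or 1, with probabilities fixed by that state.

import cmath
import math
import random

def qubit_from_sphere(theta, phi):
    """Latitude theta (0 = North Pole = state 0, pi = South Pole = state 1)
    and longitude phi give the two amplitudes of the qubit,
    with |alpha|^2 + |beta|^2 = 1."""
    alpha = math.cos(theta / 2)
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

def measure(alpha, beta):
    """Observing the qubit 'freezes' it: we only ever read 0 or 1,
    with probabilities |alpha|^2 and |beta|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# A qubit placed on the equator: equal chances of reading 0 or 1.
alpha, beta = qubit_from_sphere(math.pi / 2, 0.0)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(f"frequency of 1 observed: {sum(samples) / len(samples):.3f}  (theory: 0.500)")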

Quantum risks

Since Feynman's article, researchers have been trying to build a quantum computer. This is proving extremely difficult (what should a real qubit be made of?) and requires considerable resources that only the largest government agencies or the richest digital players (Google, IBM, Microsoft, etc.) can gather. But what's the point of spending billions if it's just a matter of simulating quantum systems? How does any of this concern us?

At the same time, other researchers have been working on quantum algorithms: given a system of qubits and the quantum laws that drive them, can we use this system to perform interesting calculations?

Peter Shor


It was not until 1994 that the American mathematician Peter Shor made a major breakthrough by inventing an interesting quantum algorithm, based on a system of qubits, which makes it possible to achieve what no classical algorithm can do in a reasonable time: find the prime factors of a large enough integer. For example, given the number 576460752303423487, find that 576460752303423487 = 179951 × 3203431780337.

One can imagine that, in principle, one has to try all possible combinations until finding the right one, which quickly becomes impractical for much larger numbers. The difficulty of carrying out this decomposition is at the core of all our cryptographic systems and thus of the protection of the information transiting, in particular, on the Internet. And without a tamper-proof cryptographic system, no e-commerce, for example. So, if a quantum computer running Shor's algorithm were to come into existence, the entire global digital environment would instantly become unusable. Such a computer is probably feasible. So there is no choice: it has to be built. No nation can afford to wait for another to develop this digital atomic weapon first. But how much time do we have?
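To make the difficulty concrete, here is a naive classical factoring sketch in Python (trial division). It has nothing to do with how Shor's algorithm actually works; it only illustrates why the brute-force approach stops scaling long before the numbers used in real cryptography:

def trial_factor(n):
    """Naive classical factoring by trial division: try every candidate divisor.
    The work grows roughly with sqrt(n), i.e. exponentially in the number of digits,
    which is what makes large factorizations impractical for classical machines."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

# The article's example still yields to trial division in a fraction of a second...
print(trial_factor(576460752303423487))   # (179951, 3203431780337)
# ...but a cryptographic modulus hundreds of digits long has no small factor to find.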

17 years

An American report published in early 2019 [3] provides a comprehensive, serious and well-documented 272-page overview of quantum computing research, based on publicly available information (any work by a particular company or state may be kept secret). One of its conclusions is that the time needed to create a large quantum computer capable of running Shor's algorithm or other useful applications is probably more than a decade.

That is actually quite short. It is roughly the time needed to deploy, on a global scale and on all computer systems, new classical cryptographic protocols resistant to Shor's algorithm (the "migration time"). Moreover, once these protocols are deployed, they must be given time to encrypt enough information so that the bulk of the world's sensitive data can rest in peace in the data centers (the "security shelf life") before a large quantum computer arrives. This period is estimated at an average of 7 years.

So we only have about 17 years of protection ahead of us from the moment we have a classic post-quantum encryption algorithm (there are now several serious leads, but nothing definitive). If such an algorithm is proposed in, say, 2020, a large quantum computer must not become operational on a large scale before 2037. Otherwise, the entire security of the digital environment, on which our economy rests, would be compromised.

Is a large quantum computer possible?

By "large", we mean a machine with a few dozen to a few hundred qubits, and that will not be easy to achieve…

One of the main difficulties is that qubits must be completely isolated from their environment to perform their "free" flight (on the sly), a purely quantum evolution on their state sphere. At the slightest exchange of energy (heat, radiation…), their quantum behavior ceases or is riddled with errors. It's a bit like throwing a coin in the air, equations in hand, and suddenly the wind picks up and disrupts the theoretical trajectory, changing the final probabilities of "heads" and "tails". Some have even compared building a quantum computer to building a house of cards in a hurricane…

But it is impossible to completely isolate a system from the rest of the universe (there is always "some wind"). The more qubits there are and the longer the "calculations" take, the more small disturbances accumulate, despite the vacuum, darkness and near-absolute cold that prevail inside quantum machines. There are partial solutions to this problem, but they remain difficult to implement. Some researchers even express serious doubts about the possibility of a useful quantum machine [4]. But the means deployed are considerable…

Quantum supremacy

The fantastic advances in computer science (Moore's Law, now running out of steam in its classic version) are of course the result of tremendous technical progress. But above all, these advances were gradually absorbed by the economic system, turned into gigantic revenues, and therefore largely financed. In the same way, quantum computing will have to prove its practical (economic) usefulness quickly in order to move out of the military and research spheres and gain access to the funding needed for its large-scale deployment over the coming decades.

The first step is to build a computer capable of unambiguously demonstrating "quantum supremacy", i.e. capable of performing a calculation that no conventional bit-based computer will ever be able to perform (within a reasonable time). On that day, the announcement will be widely publicized, and one can imagine the huge image stakes for the first team or the first commercial company to achieve it. As long as this quantum supremacy has not been demonstrated, research organizations are investing public or private funds, often in the greatest secrecy. IBM, Intel, Microsoft, Google… and China are spending billions of dollars on it. As this is a breakthrough technology on which the security of States depends, Europe is trying to stay in the race [5]:

Launched by the European Commission on October 29 [2018] in Vienna, the Quantum Technologies Flagship program is divided into four areas: quantum computing, quantum simulation, quantum communication and quantum metrology. This distribution is quite usual. It is found in the plans of countries like the USA or China.

This program is funded with 1 billion euros over 10 years. In other words, not much… As usual, Europe remains terribly under-invested and divided.

Tomorrow…

The quantum computer already exists, but it is not yet "large" and reliable enough to reach quantum supremacy, that is, to do better than a classical computer. When that happens (tomorrow, in a few months, in a few years?), the strategic battle will officially begin and the countdown for the classic digital world will start.

It will then take decades for this new atom, the qubit, to deploy new tools and paradigms, order of magnitude after order of magnitude, until our economic, social and political systems are transformed, and therefore maybe our culture itself.


Going a little bit further …

If you are a bit of a mathematician, programmer, physicist… or just very curious, here are two quick additions on the subject of quantum computing.

First, the specific power of the quantum computer, the one that lets it finish calculations "before the end of the universe", is linked to a quantum phenomenon called "entanglement". Several qubits can be in a linked state (position on the sphere), regardless of the distance between them. This instantaneous codependency allows a quantum system to occupy simultaneously all the possible states of a problem of exponential dimension. This is why a quantum computer can be exponentially faster than a classical computer for certain types of computation. This is the case, for example, of the decomposition of an integer into prime factors by Shor's algorithm.
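As an illustration, here is a minimal state-vector sketch using NumPy (our own toy code, not a quantum framework) that entangles two qubits into the Bell state (|00⟩ + |11⟩)/√2. Note that the state vector of n qubits has 2^n entries, which is precisely the "exponential dimension" mentioned above:

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # flips the second qubit if the first is 1

state = np.array([1, 0, 0, 0], dtype=complex)   # both qubits start in state 0
state = np.kron(H, I) @ state                   # put the first qubit in superposition
state = CNOT @ state                            # entangle it with the second

probs = np.abs(state) ** 2
for basis, p in zip(["00", "01", "10", "11"], probs):
    print(f"P({basis}) = {p:.2f}")
# Only 00 and 11 appear: measuring one qubit instantly fixes the other,
# however far apart they are.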

Second, we explained that the slightest noise, the slightest breeze, disturbs the qubit and distorts the calculation. In physics, this means that the qubit must not exchange energy with its environment; for example, it must not "heat up". This absolute condition explains why we cannot simply transpose classical calculations onto qubits. Take multiplication:

2 x 12 = 24

When the classical computer has finished its calculation, the numbers 2 and 12 have been "replaced" by the solution: 24. Information has therefore been lost: given 24, it is impossible to know whether it is the result of 2 x 12, 3 x 8 or 4 x 6… But when information is lost, thermodynamic entropy increases and heat is produced: classic multiplication heats up! Quantum multiplication must therefore proceed differently. It must be thermodynamically neutral and therefore reversible ("adiabatic"): it must be possible to run the calculation backwards. The intuitive solution is obvious: keep at least one of the two numbers involved in the calculation, for example the "2":

2 x 12 = 24 [2]

So, from 24 [2] we can go back to 2 x 12. In quantum computing, all algorithms must therefore be reversible. It’s possible but not always easy.
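As a toy illustration (ordinary Python, not a quantum circuit), here is the "24 [2]" idea: keep one operand alongside the product so that the calculation can be run backwards without losing information:

def multiply_reversible(x, y):
    """Reversible version of multiplication: keep one of the operands
    alongside the product, so no information is erased."""
    return x, x * y

def unmultiply(x, product):
    """Run the calculation backwards: from (x, x*y) we recover (x, y)."""
    return x, product // x

forward = multiply_reversible(2, 12)    # (2, 24) -- nothing is lost
print(forward)
print(unmultiply(*forward))             # (2, 12) -- the calculation has been reversed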

But then, does quantum computing really give off no heat? Would it help the climate? Unfortunately not: it takes a lot of energy to isolate the computer from the outside world, to shelter the house of cards from the hurricane.


1. Kevin Hartnett / Quanta Magazine – July 18, 2019 – Quantum Supremacy Is Coming: Here’s What You Should Know
2. Dick Selwood / Electronic Engineering Journal – May 24, 2018 – Richard Feynman and Quantum Computing
