
The High Stakes of Quantum Computing

Quantum computing, which exploits the entanglement of particles that so infuriated Einstein (“spooky action at a distance”), is not a receding mirage like controlled fusion, feasible in theory but not in practice. It is already with us, as Google, IBM, IonQ, Rigetti, and Honeywell have assembled working specimens of suitably otherworldly appearance.

We are thus entering the qubit era. In succession to the familiar bits, the ones or zeros produced by the off-or-on switches of our present computers, qubits exist in a superposition of the states one and zero. Each additional qubit doubles the number of states a machine can hold in superposition, so its capacity grows exponentially, which is the key to the enormous computing capabilities of quantum machines.
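
To make the scaling concrete, a minimal, purely illustrative Python sketch: an n-qubit register is described by 2^n complex amplitudes, so every added qubit doubles what a classical simulator must keep track of.

```python
# Minimal illustration: an n-qubit register is described by 2**n complex
# amplitudes, so each added qubit doubles the state a simulator must track.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Return the state vector with every basis state equally weighted."""
    dim = 2 ** n_qubits                      # one amplitude per basis state
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

print(uniform_superposition(2))              # four equal amplitudes of 0.5
for n in (1, 2, 10, 30):
    print(f"{n:2d} qubits -> {2 ** n:,} amplitudes")
# Thirty qubits already require over a billion amplitudes to simulate classically.
```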

Obstacles to the Adoption of New Technology

Normally, when a major new technology arrives, its potential benefits and any nasty by-products are both delayed and moderated by the practicalities of the introduction process. The high unit costs of anything really new, whose developmental debts and facility expenses are distributed over very few units at first, are the most obvious obstacle to rapid diffusion. Digital computers were at first—and indeed for two decades—acquired only by a few military and other governmental bodies insensitive to unit costs, and by the major commercial banks and other large firms that could afford to buy them.

Further, implementing really new technology is frequently hampered by a shortage of operators who can use the new machines in productive ways. The lack of skilled operators for the new digital computers was initially just as much of an obstacle as cost. The nasty adage that those who can, do, and those who can’t, teach, often held true in the computer field: academics focused on mathematical representations of computational processes, while operators focused on data preparation and presentation procedures, as well as on work-around techniques and tricks, such as unscheduled cooling-off pauses in the earlier vacuum-tube period, and the switch-off-and-start-again alternative to the patient resolution of programming glitches later on.

Thus almost twenty years passed between the operation of the first hand-assembled vacuum-tube computers of the Second World War and the 1964 introduction of the IBM System/360 series, the first computers that were readily programmable in actual practice. Only then was the promise of the very first programmable computer—invented and built by Konrad Zuse in Nazi Germany—finally realized. Zuse’s government-funded “Turing-complete” (rule-recognizing) Z3 became operational in May 1941, long before Turing’s colleagues at Bletchley Park received the first Colossus to decode the messages generated by German Lorenz teleprinter machines. Zuse also designed the first high-level programming language, Plankalkül, and because he patented his inventions—and because IBM was a wide-awake organization at that time—its employees arrived at Zuse’s doorstep amidst the ruins of a defeated Germany to buy options on his inventions.

Because what follows is a matter of timing more than anything else, it is worth noting that Zuse was still alive and well in the later 1970s, when the programmable computer he had invented was sufficiently diffused across the United States and other countries to really make it the age of the digital computer. Its arrival induced all manner of structural changes, including the disappearance of most clerical jobs, which in turn had drastic social and even political consequences, as less-educated office workers were cast out into the low-paying end of the service economy.

In fact, Zuse lived to see the introduction and wide diffusion of the personal computer before he died in 1995, which is another way of saying that when it came to the digital computer, its diffusion across the economy and its impact on society were fast enough to occur within a lifetime. (It was as if Henry Ford had seen the introduction of electric cars.) But at the same time the transition was slow enough to make its disruptive effects manageable.

This Time Is Different

This is not the case with quantum computing. First, high unit costs are irrelevant because what is offered to the market is not the machines themselves but their output: computational services via the cloud. Second, for the same reason, the staffing problem is initially irrelevant, even if a different lack of expertise is likely to be manifest in the slowness of management—and, crucially, of government—to react to the implications of quantum computing.

The most obvious issue is simply a matter of scale: perfectly feasible quantum computers could out-calculate today’s very largest computers not by a factor of ten, or a hundred, or a thousand, but by a ratio too high to be worth estimating. For example, the world’s most powerful supercomputer is (or was on July 8, 2020) the so-called Summit, or more formally the OLCF‑4, developed by IBM for the Oak Ridge National Laboratory. Summit has a peak capability of 200 petaflops, while its problem-solving LINPACK benchmark is clocked at 148.6 petaflops.

It might seem, therefore, that the calculation potential of quantum computing could be usefully understood by using Summit as a benchmark. But that is not so—not at all—because a petaflop corresponds to only one quadrillion operations per second (a one followed by fifteen zeros), while the performance of a fair-sized quantum computer might be measured in zettaflops (twenty-one zeros) or, more likely, yottaflops (twenty-four zeros). Even that may well be deemed an inadequate benchmark for an ambitious quantum computer.
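
A short back-of-the-envelope calculation, using the figures cited above and treating the zetta- and yotta-scale numbers as purely hypothetical yardsticks rather than measurements of any existing quantum machine, makes the gap explicit:

```python
# Back-of-the-envelope comparison using the figures cited in the text.
# The zetta- and yotta-scale machines are hypothetical yardsticks only.
PETA, ZETTA, YOTTA = 10**15, 10**21, 10**24

summit_linpack = 148.6 * PETA          # Summit's LINPACK result, in flops
zetta_machine = 1 * ZETTA              # a hypothetical zettaflop machine
yotta_machine = 1 * YOTTA              # a hypothetical yottaflop machine

print(f"zettaflop machine vs. Summit: {zetta_machine / summit_linpack:,.0f}x")
print(f"yottaflop machine vs. Summit: {yotta_machine / summit_linpack:,.0f}x")
# Roughly 6,700 times Summit and 6.7 million times Summit, respectively:
# far beyond "a factor of ten, or a hundred, or a thousand."
```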

Numbers always matter of course, but they are only really important when they change matters qualitatively. As in the old Josh White song: the poor little man might read the menu as offering one meatball for fifteen cents; “but you get no bread,” shouts the waiter at anyone who orders just one meatball. Bread might be had with two meatballs at thirty cents, while three might come with spaghetti as well—a real dinner.

When it comes to quantum computing, we are in million- or trillion-meatball territory, which of course means that it is not the quantities that matter but their qualitative consequences. As Hegel explained in his 1812 Science of Logic, sometimes change is “not only a transition from one proportion to another, but . . . a sudden leap into a qualitatively different thing.”

Quantum computing is certainly such a leap. Of course it will make no difference at all for scientific discovery: Einstein did not need even an abacus to discover how and why the universe deviated from Newtonian calculations, any more than Fleming needed a high-powered microscope to discover that the Penicillium mold defends itself against staphylococcal bacteria by producing penicillin, thereby inaugurating the antibiotic era for humanity. No computing machine can ever produce even one such hypothesis.

But the trillion-meatball effect will be revolutionary for those scientific fields that remain computationally constrained, such as fluid dynamics, once quantum algorithms can be run on quantum computers. There are many other scientific endeavors, including the modeling of pharmaceuticals in a pandemic, that might be advanced by the calculation of every possible variable in every possible parametric permutation—which gamblers might also find very useful when playing constrained card games such as twenty-one.
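
As a toy illustration of what calculating “every possible variable in every possible parametric permutation” implies, the following Python sketch (the parameter names and ranges are invented for the example) simply counts an exhaustive sweep; every added parameter multiplies the workload again:

```python
# Toy illustration of an exhaustive parameter sweep; the parameters and
# their ranges are invented for the example.
from itertools import product

# Hypothetical model parameters, each sampled at ten levels.
grid = {f"param_{i}": range(10) for i in range(6)}

combinations = product(*grid.values())
total = 10 ** len(grid)                        # levels ** parameters
print(f"{len(grid)} parameters x 10 levels = {total:,} combinations")

# Evaluating the model once per combination is the brute-force approach that
# quickly exhausts classical machines: each extra parameter multiplies the
# work by another factor of ten.
first = next(combinations)
print("first combination:", dict(zip(grid, first)))
```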

Cracking the Code

That brings us directly to a kindred application of quantum computing: cryptanalysis, which could bring about consequences as unforeseen as the negative political effects of the computer-driven destruction of the clerical class, but potentially far more catastrophic.

There are still a few face-to-face cash transactions (at least in non-epidemic times) even in the most advanced societies, in Europe’s Christmas markets for example. But for the rest, the security and thus the practical value of both wholesale and retail commerce—as well as all forms of banking and finance—depend on encryption produced with public-key cryptography and its supporting public key infrastructure (PKI), supplemented in some limited cases by voiceprint verification.

Furthermore, governments rely on encryption for the electronic transmission of their confidential information, as do the armed forces, often on a wholesale basis and for a very good reason: even a depleted laundry report from a warship could provide “actionable” information (e.g., a resupply rendezvous, an imminent return to port).

The immediate problem is that all extant variants of PKI cryptography are vulnerable to quantum cryptanalysis, as are all forms of digitally encrypted and electronically transmitted government, military, and intelligence communications. The one very impractical exception is messages sent by one-time pad enciphered with true random numbers, which is useful enough to order ballistic-missile submarines to execute a preplanned attack, an event that might occur not even once in a lifetime, one must hope.
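
A minimal sketch of that one-time-pad exception, for the curious; in it, Python’s os.urandom merely stands in for a true hardware random source, and the message is invented:

```python
# Minimal one-time-pad sketch. os.urandom stands in for a true hardware
# random source; a real pad demands genuinely random key material, at least
# as long as the message and never reused.
import os

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) >= len(plaintext), "pad must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

otp_decrypt = otp_encrypt                     # XOR is its own inverse

message = b"an illustrative secret message"  # invented example text
pad = os.urandom(len(message))               # one-time key material
ciphertext = otp_encrypt(message, pad)

assert otp_decrypt(ciphertext, pad) == message
print(ciphertext.hex())
```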

And then of course there is still the alternative of sending hand-written or mechanically typed (yes, mechanical typewriters are still on sale) letters by courier. Once, as a British army cadet, I saw the arrival of an officer bearing a “by hand of officer only” envelope for the CO: it was the order to fly to North Borneo. And of course diplomatic couriers are used all the time to personally carry the most sensitive documents back and forth between foreign ministries and embassies; they provide small but steady business to airlines in non-epidemic times.

The first response to the immediate problem might be to increase key size by adding digits. But that is certainly a losing bet against quantum computers that will undoubtedly be scaled up very quickly to exploit the abundance offered by superposition, all those alternatives to one or zero. Indeed, increasing key size is a losing bet even against current computers. In 1977, the eminent mathematician and cryptologist Ronald Linn Rivest, a Turing Award winner, offered a $100 prize to whoever could break a ciphertext, enciphered under a 129-digit semiprime key, of the phrase “The Magic Words are Squeamish Ossifrage [a vulture],” predicting that—with the computers of those days—the factoring would take forty quadrillion years, by which time, presumably, inflation would have made the $100 no great loss for him. A volunteer gathering of academics successfully read the message by 1994, however. Rivest duly paid up, though by then his $100 was worth only $39 in 1977 dollars because of inflation.
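
A toy-scale sketch, with numbers vastly smaller than the 129-digit challenge, shows why factoring the public semiprime gives away the private key, and hence why ever-larger keys are the only classical defense:

```python
# Toy-scale RSA, far smaller than the 129-digit challenge, showing why
# factoring the public modulus reveals the private key.
def trial_factor(n: int) -> int:
    """Return a nontrivial factor of n by brute force (cost explodes with size)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    raise ValueError("n is prime")

p, q = 2003, 2677                  # the "secret" primes
n, e = p * q, 65537                # the public key: modulus and exponent
message = 42
ciphertext = pow(message, e, n)

# An attacker who can factor n rebuilds the private exponent directly.
p_found = trial_factor(n)
q_found = n // p_found
phi = (p_found - 1) * (q_found - 1)
d = pow(e, -1, phi)                # modular inverse (Python 3.8+)
print(pow(ciphertext, d, n))       # prints 42: the message is recovered
```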

Much less amusing are three conjoined facts. First, quantum computing exists, and its computational capacity is increasing by the day, ensuring the obsolescence of all current encryption technologies used in both commerce and government, with the highly circumscribed exceptions noted above, which are even less practical in commerce and finance.

Second, governments and also some private bad actors have been harvesting encrypted data for a very long time. Even if a quantum computing alert is issued, and all who can do so immediately switch to hand-of-officer-only messaging and its equivalents for the very few communications for which that is feasible, adversaries will still be able to obtain access to an immense accumulation of encrypted financial, technological, military, diplomatic, and intelligence information—a true cryptographic apocalypse.

If complacency is pierced by the imminence of quantum computing—it is already advertised online—a remedy is available: post-quantum cryptography, which does not require quantum computing. Pure random numbers in large quantities, which would enable the routine use of one-time-pad communications and archival storage, can be produced by inherently unpredictable physical processes, such as the operation of lasers. As of now, as far as I know, two companies—one based in the United Kingdom and one in the United States—are offering the necessary supply of pure random (not the no-longer-secure pseudo-random) numbers.
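
A small sketch of that distinction: a seeded pseudo-random generator can be replayed exactly by anyone who learns its seed, whereas an unpredictable physical source cannot be. Here Python’s os.urandom stands in for hardware devices such as the laser-based generators mentioned above, and the seed value is invented for the example:

```python
# Why "pseudo-random" will not do for pad material: a deterministic generator
# can be replayed exactly by anyone who recovers its seed. os.urandom stands
# in here for a hardware entropy source; the seed is invented for the example.
import os
import random

seed = 20200708                        # imagine this value leaks or is guessed
generator = random.Random(seed)
attackers_copy = random.Random(seed)   # the adversary's identical generator

stream = [generator.getrandbits(8) for _ in range(16)]
replay = [attackers_copy.getrandbits(8) for _ in range(16)]
print(stream == replay)                # True: the whole key stream is reproduced

hardware_like = os.urandom(16)         # not reproducible from any seed
print(hardware_like.hex())
```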

The retroactive re-encryption of all important data archives with quantum-resistant methods, as well as of all new reserved communications, is an immediate priority that does not advertise itself, and which requires creative minds and determined action. Because of the vast amounts of previously harvested data, there will be vast damage regardless; but because of the greater reliance on secrecy among bad actors, whether private or governmental, the balance might work in our favor—so long as no time is wasted in preparing before—and not after—the advent of quantum cryptanalysis.

This article originally appeared in American Affairs Volume IV, Number 3 (Fall 2020): 136–41.
