computing. The current standard is 2048-bit RSA (Rivest–Shamir–Adleman) encryption, which is widely used for activities such as securely sending credit card details over the internet. Predictions vary as to when it may become possible to break the current standard, that is, to factor the 2048-bit integers used by the RSA method. Although the threat is not imminent, methods are constantly improving, and readying cryptographic systems for the quantum era is recommended.
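To make the link between RSA and factoring concrete, the following sketch (illustrative only, written in Python with deliberately tiny primes rather than the roughly 1024-bit primes used in practice) shows how an attacker who can factor the public modulus immediately recovers the private key.

```python
# Toy illustration of why factoring breaks RSA: the public modulus n = p * q
# hides the private key only as long as n cannot be factored. With 2048-bit n
# this is classically infeasible; with the tiny primes below it is trivial.
from math import gcd

p, q = 61, 53                      # illustrative small primes (real RSA uses ~1024-bit primes)
n, phi = p * q, (p - 1) * (q - 1)  # public modulus and Euler's totient
e = 17                             # public exponent
assert gcd(e, phi) == 1            # e must be coprime with phi
d = pow(e, -1, phi)                # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encryption: m^e mod n

# An attacker who factors n recovers p and q, hence phi and the private key d:
recovered_p = next(f for f in range(2, n) if n % f == 0)
recovered_q = n // recovered_p
recovered_d = pow(e, -1, (recovered_p - 1) * (recovered_q - 1))
assert pow(ciphertext, recovered_d, n) == message  # plaintext recovered
```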
A 2019 report published by the US National Academies of Sciences predicts that breaking RSA encryption is unlikely within the next decade. The report indicates that any serious applications of quantum computing are at least 10 years away (Grumbling & Horowitz, 2019). Given the current state of quantum computing and the recent rates of progress, “it is highly unexpected that a quantum computer that can compromise RSA 2048 or comparable discrete logarithm-based public key cryptosystems will be built within the next decade” (Grumbling & Horowitz, 2019, 157). The report’s stance on error correction is that “The average error rate of qubits in today’s larger devices would need to be reduced by a factor of 10 to 100 before a computation could be robust enough to support [the required] error correction at scale” (Grumbling & Horowitz, 2019). The report further highlights that “at this error rate, the number of physical qubits held by these devices would need to increase at least by a factor of 10⁵ in order to create a useful number of effective logical qubits” (Grumbling & Horowitz, 2019). A 2016 National Institute of Standards and Technology (NIST) report reaches a similar conclusion, noting that some of the most aggressive estimates predict that quantum computers might be powerful enough to break 2048-bit RSA by 2030, at a potential cost of a billion dollars (Chen et al., 2016, 6).
One complication in making predictions is that the number of qubits (processing power) required to factor 2048-bit RSA integers varies by method; different algorithms need different numbers of qubits (Mavroeidis et al., 2018). Although difficult to estimate, “current estimates range from tens of millions to a billion physical qubits” (Mosca, 2018, 39). Newer estimates offer more granularity, indicating in more detail how a quantum computer might perform the calculation with 20 million noisy physical qubits in just 8 hours (Gidney & Ekera, 2019). (The method relies on modular exponentiation, which is the most computationally expensive operation in Shor’s algorithm for factoring.) In 2012, a 4-qubit quantum computer factored the number 143, and in 2014, a similar device factored the number 56,153. However, scaling up quickly is not straightforward because the extent of error correction required is unknown. The recent result from Gidney & Ekera (2019), suggesting that 2048-bit RSA might be factored with 20 million noisy physical qubits, is potentially a landmark step.
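As a rough illustration of the role of modular exponentiation, the sketch below shows the classical, number-theoretic skeleton of Shor’s algorithm on a toy modulus: factoring N is reduced to finding the order r of a random base a modulo N. A quantum computer would find r efficiently using modular exponentiation and the quantum Fourier transform; here the order is found by brute force, so the example conveys the reduction rather than the quantum speedup.

```python
# Classical skeleton of Shor's algorithm on a toy modulus. The quantum speedup
# lies entirely in order finding (dominated by modular exponentiation); here
# the order is found by brute force instead.
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_skeleton(n):
    """Reduce factoring n to order finding, as Shor's algorithm does."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:                                  # lucky guess shares a factor with n
            return g, n // g
        r = find_order(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < p < n:
                return p, n // p                   # nontrivial factors found

print(shor_classical_skeleton(143))                # e.g. (11, 13), the number factored in 2012
```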
One of the nearest-term remedies for post-quantum security is quantum cryptography, in the form of quantum key distribution (QKD), which has been foreseen in quantum information science roadmaps for some time (Los Alamos National Laboratory (LANL), 2004). QKD is the idea of issuing cryptographic keys generated with quantum methods and distributing them via global communications networks, both satellite-based and terrestrial. QKD has been demonstrated experimentally but has yet to be widely commercialized. The market for QKD is estimated to grow from $85 million in 2019 to $980 million by 2024 (Gasman, 2019).
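As a concrete illustration of the key distribution idea, the following simplified simulation follows the BB84 protocol (named here as an assumption; the passage above does not specify a particular QKD scheme). The sender encodes random bits in randomly chosen bases, the receiver measures in randomly chosen bases, and the positions where the bases match form the sifted shared key.

```python
# Simplified classical simulation of BB84-style quantum key distribution.
# Alice encodes random bits in random bases; Bob measures in random bases;
# after the bases are publicly compared, matching positions form the key.
import random

N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]    # '+' rectilinear, 'x' diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                   # same basis: deterministic result
    else:
        bob_results.append(random.randint(0, 1))  # mismatched basis: random outcome

# Public comparison of bases (the bases are announced, the bits are not):
sifted_key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print("sifted key:", "".join(map(str, sifted_key)))
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the transmitted states, which the two parties can detect by comparing a sample of the sifted key.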
3.2 Basic Concepts: Bit and Qubit
Quantum computing springs from Nobel physicist Richard Feynman’s intuition that it should be possible to perform very powerful computations by using quantum building blocks. He suggested the idea of simulating physics with computers using a universal quantum simulator.
Feynman worked on many problems at the small scales of the quantum mechanical domain. He famously announced that “There’s plenty of room at the bottom” (Feynman, 1960), referring to the idea of miniaturization at the atomic scale, such that the entire 24-volume set of the Encyclopedia Britannica might be printed on the head of a pin. These ideas helped to launch the nanotechnology industry in the 1990s and are likewise motivating the current development of quantum computing. To get an idea of scale, the nanometer scale is 10⁻⁹ m, the atomic scale (in terms of the Bohr radius, the most probable distance from the nucleus to the electron) is 10⁻¹¹ m, and the electron scale (the size of the electron) is 10⁻¹⁵ m.
Feynman suggested that a quantum computer could be an efficient universal simulator of quantum mechanics. Such a “universal quantum simulator” (Feynman, 1982, 474) would be a different kind of computer, not a traditional Turing machine. He posited two ways to simulate quantum mechanics with computers. One is to reconceive the notion of the computer and build computers out of quantum mechanical elements that obey quantum mechanical laws. The other is to try to imitate quantum mechanical systems with classical systems.
Feynman’s key thought is that the more closely computing systems can be built to match the structure of nature, the better they can simulate nature. He says that “the various field theories have the same kind of behavior, and can be simulated in every way, apparently, with little latticeworks of spins and other things” (Feynman, 1982, 474–5). Assuming that the world is made as a discrete lattice, “the phenomena of field theories can be well imitated by many phenomena in solid state theory (which is simply the analysis of a latticework of crystal atoms, and in the case of the kind of solid state I mean each atom is just a point which has numbers associated with it, with quantum mechanical rules)” (Feynman, 1982, 475).
Feynman proposed the idea of the universal quantum simulator in 1982, and other theoretical developments followed. In 1985, David Deutsch proposed a universal quantum computer based on the idea that quantum gates could function in a similar fashion to the binary logic gates of traditional digital computing (Deutsch, 1985). In 2000, Charles Bennett showed that it is possible to efficiently simulate any classical computation using a quantum computer (Bennett & DiVincenzo, 2000).
Other advances in recent decades have brought quantum computers closer to practical realizability. The first, in the 1990s, was the discovery of quantum error correction. Unlike classical bits, which persist in a 1 or 0 state, quantum bits are extremely sensitive to environmental noise and may decohere before they can be used to perform a computation. Quantum error correction overcomes some of the challenges of working in the quantum mechanical domain.
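The flavor of error correction can be conveyed with a classical analogue of the three-qubit bit-flip code: one logical bit is encoded redundantly and a single flip is corrected by majority vote. This is only a sketch; real quantum codes must also correct phase errors and measure error syndromes without directly reading out the encoded state.

```python
# Classical analogue of the 3-qubit bit-flip repetition code: one logical bit
# is encoded in three physical bits, and a single bit-flip error is corrected
# by majority vote, suppressing the error rate from p to roughly 3p^2.
import random

def encode(bit):
    return [bit, bit, bit]

def apply_noise(code, flip_prob):
    return [b ^ (random.random() < flip_prob) for b in code]  # flip each bit with probability p

def decode(code):
    return int(sum(code) >= 2)                                 # majority vote

trials, flip_prob = 10_000, 0.05
errors = sum(decode(apply_noise(encode(0), flip_prob)) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f} vs physical rate {flip_prob}")
# Expected logical rate is about 3p^2 - 2p^3 ≈ 0.007, well below the physical rate 0.05.
```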
Second, since 2012, there have been advances in superconducting materials and a proliferation of methods for making qubits, such that quantum systems have grown from 1–2 qubits to 50–100 qubits. A current research goal is demonstrating quantum advantage, meaning specific cases in which quantum computing confers an advantage over classical computing.
3.2.1 Quantum computing and classical computing
Quantum information processing is not only a potentially faster means of computing but also a new paradigm in that information is conceived and managed in a completely different way due to the different properties of quantum objects. According to W.D. Phillips, 1997 Nobel Prize winner in physics and NIST scientist, “Quantum information is a radical departure in information technology, more fundamentally different from current technology than the digital computer is from the abacus” (Williams, 2007).
Some of the special properties of quantum objects (be they atoms, ions, or photons) are superposition, entanglement, and interference (SEI properties). Superposition means that a particle can exist in a combination of all of its possible states simultaneously; this is known as a superposition of states. For example, an electron may exist in two possible spin states simultaneously, referred to as 1 and 0, or spin-up and spin-down. Entanglement is the situation in which groups of particles are correlated and interact in such a way that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance. Across large distances, this is referred to as Bell pair entanglement or nonlocality. Interference relates to the wave-like behavior of particles: interference can be constructive or destructive, in that when two waves come together they are either reinforced or diminished.
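These properties can be sketched in the linear-algebra picture described below (an illustrative sketch using NumPy, not tied to any particular quantum hardware or software library): a qubit is a length-2 complex vector, a Hadamard gate creates an equal superposition, and a CNOT gate applied to a superposed control qubit creates an entangled Bell pair.

```python
# Superposition and entanglement in the linear-algebra picture: a qubit is a
# length-2 complex vector, gates are unitary matrices, and a two-qubit state
# lives in the 4-dimensional tensor-product space.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the |0> state

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)               # controlled-NOT gate

plus = H @ ket0                        # equal superposition (|0> + |1>) / sqrt(2)
bell = CNOT @ np.kron(plus, ket0)      # Bell pair (|00> + |11>) / sqrt(2): entangled

print("superposition amplitudes:", plus)
print("Bell state amplitudes:", bell)  # approximately [0.707, 0, 0, 0.707]
```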
Classical computing is based on electrical conductivity, using Boolean algebra (namely expressions evaluating as true/false, and/or, etc.) to manipulate bits. Quantum computing is based on quantum mechanics, using vectors and linear algebra to manipulate matrices of complex numbers. Aiming toward a universal