“We show that Shor’s algorithm, the most complex quantum algorithm known to date, is realizable in a way where, yes, all you have to do is go in the lab, apply more technology, and you should be able to make a bigger quantum computer,” says Isaac Chuang, professor of physics and professor of electrical engineering and computer science at MIT. “It might still cost an enormous amount of money to build — you won’t be building a quantum computer and putting it on your desktop anytime soon — but now it’s much more an engineering effort, and not a basic physics question.”
The news is part of a much broader story: the decades-long development of quantum computing, which has gained speed in recent years. Promises of incredible computing power delivered via quantum cloud computing tantalize today’s technologists. As with most hardware-level advancements, most people will remain unaware of how the technology works, or even that it has arrived, save for hearing the term “quantum” more often in ads and the news. And that’s just as well, because most people don’t need to understand why or how their phones and computers do the things they do.
Those orchestrating a government’s long-term plans, policy and infrastructure, however, should at the very least understand what new products might become available within two presidential terms, and how those technologies might nourish and sculpt the govtech landscape.
The Nitty Gritty of Quantum Computing
In a Reddit Ask Me Anything, Bill Gates predicted that quantum computers may arrive soon. “Microsoft and others are working on quantum computing,” Gates wrote. “It isn't clear when it will work or become mainstream. There is a chance that within 6-10 years that cloud computing will offer super-computation by using quantum. It could help us solve some very important science problems including materials and catalyst design.”
In this case, the term quantum refers to quantum physics, the branch of physics concerned with the behavior of atoms and photons. At such a small scale, the laws compiled in classical physics break down, and small objects have been observed behaving in unexpected ways. Scientists are beginning to understand how to exploit these strange behaviors to take shortcuts that wouldn’t be possible with traditional computers. The outcome is computers that are exponentially faster and more efficient for some kinds of computing.
Scientists began looking to quantum computing in anticipation of the expected end of Moore’s Law, which is not a physical law but an observation by computer scientist Gordon E. Moore that the number of transistors in an integrated circuit doubles every two years. The law has since been tweaked to an observed doubling of about every 18 months, and applied more broadly to things like microprocessor prices, memory capacity, sensors, video resolution and digital camera megapixels. Consumers especially have enjoyed the realization of Moore’s Law in recent years, as the decreasing cost and increasing power of computers have made incredible technologies seem prosaic. But scientists expect the law’s progress to wane once computer components can’t be made any smaller.
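To put that doubling in concrete terms, here is a minimal sketch in Python. The 1971 baseline (the roughly 2,300 transistors of Intel’s first microprocessor) is used only for illustration; the point is how quickly repeated doubling compounds:

```python
# Illustrative only: project transistor counts under Moore's Law,
# assuming one doubling every two years from a 1971 baseline.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300  # roughly Intel's first microprocessor

def projected_transistors(year, doubling_period_years=2):
    """Projected transistor count for a given year under Moore's Law."""
    elapsed = year - BASELINE_YEAR
    return BASELINE_TRANSISTORS * 2 ** (elapsed / doubling_period_years)

for year in (1971, 1991, 2011):
    print(year, f"{projected_transistors(year):,.0f}")
# 1971 -> 2,300; 1991 -> ~2.4 million; 2011 -> ~2.4 billion
```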
Today’s semiconductor transistors are built at the 14-nanometer scale, about 8.5 times smaller than the HIV virus. The 5-nanometer scale is the predicted end of Moore’s Law. Electronics with transistors at the 5-nanometer scale have been created, but none have been produced commercially, though some predictions place a commercial Intel release around 2020. Quantum computing may provide the innovation needed to circumvent these physical limits on design. Some suspect quantum computing may even let humans overcome problems previously thought unsolvable, while others believe quantum computing will remain a specialized tool.
Quantum computers work by exploiting the multiplicity of concurrent states in small objects. Whereas traditional computers use bits (binary data) to store and relay information, quantum computers use quantum bits, or qubits (pronounced cue-bits). Qubits also hold binary values, set by properties like the horizontal or vertical polarization of a photon, but with a key difference: a photon can be in varying proportions of both binary states simultaneously, a phenomenon called superposition.
Superposition allows a single qubit to hold much more information than a regular bit. A regular bit is either a 1 or a 0, but a qubit is varying degrees of both until it is measured, at which point it resolves to one state or the other. This is called the observer effect. In quantum mechanics, the observer effect has nothing to do with some metaphysical influence of human consciousness or some philosophical tree-falling-in-the-woods phenomenon, but rather with wave function math and a measuring instrument’s unintended influence on the action. Stated simply, quantum bits allow a computer to perform multiple parts of a calculation simultaneously, making it far more powerful.
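For readers who think in code, a single qubit can be simulated classically as a pair of amplitudes whose squared magnitudes give the odds of each measurement outcome. The sketch below (in Python, with arbitrarily chosen amplitudes) is purely illustrative; it is not how real quantum hardware is programmed:

```python
import numpy as np

# A qubit in superposition: one amplitude each for the 0 and 1 states.
# The squared magnitudes are probabilities and must sum to 1.
qubit = np.array([np.sqrt(0.36), np.sqrt(0.64)])  # 36% zero, 64% one

probabilities = np.abs(qubit) ** 2

# "Measuring" collapses the superposition to a single classical bit.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probabilities)
print(f"P(0)={probabilities[0]:.2f}, P(1)={probabilities[1]:.2f}, measured: {outcome}")
```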
Another property powering quantum computing is called entanglement, a connection between qubits wherein measuring one instantly determines what a measurement of the other will show, regardless of the distance between them. Scientists can exploit this property by measuring just one qubit and deducing the state of its entangled partners. Like superposition, this allows more information to be stored and referenced with fewer operations.
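The simplest entangled pair, known as a Bell state, can also be simulated with amplitudes, this time over the four possible states of two qubits. In this illustrative sketch (again a classical simulation for intuition only), the two qubits always agree, so reading one reveals the other:

```python
import numpy as np

# Amplitudes over the two-qubit states 00, 01, 10 and 11.
# This Bell state allows only two possible outcomes: 00 or 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

probabilities = np.abs(bell) ** 2  # [0.5, 0, 0, 0.5]

rng = np.random.default_rng()
outcome = rng.choice(["00", "01", "10", "11"], p=probabilities)
# Only "00" or "11" ever occurs: the first digit tells you the second.
print("measured:", outcome)
```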
Bell’s Theorem, named after physicist John Stewart Bell, addresses this phenomenon by ruling out the possibility that the probabilistic features of quantum mechanics can be explained by underlying, inaccessible values, so-called local hidden variables.
Bell stated: “No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics. … In a theory in which parameters are added to quantum mechanics to determine the results of individual measurements, without changing the statistical predictions, there must be a mechanism whereby the setting of one measuring device can influence the reading of another instrument, however remote. Moreover, the signal involved must propagate instantaneously, so that such a theory could not be Lorentz invariant.”
Here, this explanation is not intended to illuminate, but rather to serve as a first illustration of why quantum mechanics is so difficult to understand. A famous aphorism most frequently attributed to physicist Richard Feynman states, “If you think you understand quantum mechanics, you don't understand quantum mechanics.”
In other words, physicists have a lot of ideas about why objects behave so differently at a small scale, but no one understands exactly what’s happening yet, and that’s why no one knows exactly what quantum computers will be capable of or precisely when they will arrive.
When it comes time to process, quantum computers use quantum gates, the quantum equivalent of regular semiconductor logic gates. Quantum gates manipulate superpositioned qubits and output differently superpositioned qubits. The heart of why quantum computers are so powerful is that their basic units, qubits, are so dense with information. The uncertainty involved in the output requires some extra checking, but the amount of concurrent data being processed drastically reduces computation time. Processing large swaths of information simultaneously, rather than serially, makes for computers that are exponentially faster and more efficient.
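In a classical simulation, a quantum gate is simply a matrix multiplied against the vector of amplitudes. This sketch applies the Hadamard gate, a standard single-qubit gate that turns a definite 0 into an equal superposition of 0 and 1 (illustration only):

```python
import numpy as np

# The Hadamard gate: a 2x2 matrix acting on one qubit's amplitudes.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

qubit = np.array([1, 0])  # a qubit definitely in the 0 state

superposed = H @ qubit            # amplitudes [0.707..., 0.707...]
print(np.abs(superposed) ** 2)    # [0.5, 0.5]: a 50/50 superposition
```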
So What Is Quantum Computing Good For?
It’s not yet clear whether quantum computers will be a niche technology or universally applicable. But some in government clearly are aware of quantum computing's potential, given the Nov. 8, 2015, announcement by IBM that the Intelligence Advanced Research Projects Activity (IARPA), an organization within the Office of the Director of National Intelligence, awarded the company a multi-year grant to continue its research into building quantum computers. And that same month, quantum computing company D-Wave Systems announced that Los Alamos National Laboratory would acquire and install its latest quantum computer. Before that, in September 2015, the company announced plans to install a unit at NASA's Ames Research Center as part of a collaboration with Google to study the role quantum computing could play in artificial intelligence.
And there are a few problems that quantum computers are already known to be adept at solving, such as searching unsorted databases: Grover’s algorithm finds a marked entry in roughly the square root of the number of steps a classical search needs, as the sketch below illustrates.
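Back-of-the-envelope arithmetic for that speedup (illustrative only; real implementations carry additional overhead):

```python
import math

# Searching an unsorted list of N items for one marked entry:
# a classical scan needs about N/2 lookups on average, while
# Grover's algorithm needs on the order of (pi/4) * sqrt(N) queries.
N = 1_000_000
classical_avg = N / 2                              # ~500,000 lookups
grover = math.floor((math.pi / 4) * math.sqrt(N))  # ~785 queries
print(f"classical: ~{classical_avg:,.0f} lookups, Grover: ~{grover:,} queries")
```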
When it comes to encryption, the controversial standoff between Apple and the FBI over creating a backdoor into its iPhone devices may not matter for much longer: Computers are getting so powerful, according to Motherboard, that they will eventually be able to break any encryption.
The National Institute of Standards and Technology released a report in February predicting that by 2030, quantum computers will be able to break the popular public key encryption technology known as RSA at a cost of $1 billion.
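Why would quantum machines threaten RSA in particular? RSA’s security rests on how hard it is to factor a large number back into the two primes it was built from, and factoring is exactly what Shor’s algorithm, the one Chuang’s team demonstrated, does efficiently. A toy-sized sketch of the idea, with deliberately tiny primes:

```python
# Toy RSA, illustrative only: real keys use primes hundreds of digits long.
p, q = 61, 53             # the two secret primes
n = p * q                 # 3233, the public modulus
e = 17                    # the public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key d
assert recovered == message
# Anyone who can factor n back into p and q can recompute d and read
# the message; Shor's algorithm makes that factoring step tractable.
```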
It’s also thought that quantum computers will enable breakthroughs in fields of science that depend heavily on modeling and simulation, such as chemistry and biology. That extra computational power could help scientists find cures and develop medicines.
But beyond breaking encryption and making strides in artificial intelligence lies the fact that today’s computers are nearing the limits of their computational capacity.
“As conventional computers reach their limits in terms of scaling and performance per watt, we need to investigate new technologies to support our mission,” said Mark Anderson of the Los Alamos National Laboratory's Weapons Physics Directorate. “Researching and evaluating quantum annealing as the basis for new approaches to address intractable problems is an essential and powerful step, and will enable a new generation of forward thinkers to influence its evolution in a direction most beneficial to the nation.”