FREEDOM AND SAFETY
Fifty years ago, smartphones would have been the ultimate computing wizardry. Just as classical computers were almost unimaginable to previous generations, we’re now facing the birth of an entirely new type of computation, something so mystical it may as well be magic: quantum computing.
If the word “quantum” makes your head spin, you’re not alone. The world of the very small, very cold, very sensitive and very weird may seem like an improbable system to build a commercial computing machine on, yet that is exactly what IBM, Google, Rigetti Computing, and other companies are working towards.
In January at the Consumer Electronics Show, IBM Q, a pioneering industry initiative trying to bring quantum computers from labs to the real world, unveiled System One: a dazzling, delicate, and chandelier-like machine that’s now the first integrated universal quantum computing system for commercial use, available for anyone to play with.
You’ve probably heard of quantum computers’ potential: the properties of quantum physics blast open massively parallel computing schemes that will likely deliver huge leaps in computing power, potentially outstripping any transistor-based supercomputers we can come up with - today and tomorrow. Quantum computing may revolutionize chemistry, pharmaceuticals, materials science, and machine learning.
But what exactly makes quantum computers so powerful? To delve into this mysterious field, I briefly chatted with Jeff Welser, vice president and lab director at IBM Research - Almaden, for his expert take on the potentially disruptive technology. The interview can be found at the end of the article.
First, some basics on how quantum computers work - and it definitely helps if you keep classical computers in mind.
The secret to quantum computers’ prowess is that they manipulate qubits. Everything a classical computer processes - text, images, videos, and so on - relies on large strings of 0s and 1s, or bits. At its core, a bit represents one state or another, such as whether a light bulb is on or off, or if an electrical circuit is connected or not. In modern computers, a bit is generally represented by an electrical voltage or current pulse.
Quantum computers, in contrast, rely on qubits. Just like binary bits, qubits form the basis of computing, with one giant difference: qubits are generally superconducting electrons or other types of subatomic particles. Unsurprisingly, managing qubits is a huge scientific and engineering challenge. IBM, for example, relies on multiple layers of superconducting circuits sequestered in a controlled environment and cooled step-wise to temperatures colder than deep space - near absolute zero.
Because qubits live in the quantum realm, they have some crazy quantum properties.
If a bit is a coin that’s either heads (0) or tails (1), a qubit is a spinning coin: in a way, it’s simultaneously heads and tails, with each state having a different probability. Scientists use calibrated microwave pulses to put qubits into superposition; varying the frequency and duration of these pulses can flip the qubit into a slightly different state (one that’s still a superposition).
Because of superposition, a single qubit can represent far more information than a binary bit. This is partly how, given an initial input, qubits can brute-force through a vast array of potential outcomes simultaneously. The final answer is only available once scientists measure the qubits - also using microwave signals - which causes them to “collapse” into a binary state. Scientists often have to run a problem multiple times to double-check the answer.
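To make the spinning-coin picture concrete, here’s a toy simulation in plain Python. It has nothing to do with real microwave hardware - it’s just the bookkeeping of amplitudes and measurement, and the names `measure` and `plus` are mine:

```python
import random

# Toy model: a qubit is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring returns 0 with probability |alpha|^2
# and 1 otherwise, collapsing the superposition into an ordinary bit.
def measure(state, rng):
    alpha, _beta = state
    return 0 if rng.random() < abs(alpha) ** 2 else 1

# The "spinning coin": an equal superposition of 0 and 1.
plus = (2 ** -0.5, 2 ** -0.5)

# As the article notes, the experiment is run many times to read off
# probabilities rather than a single answer.
rng = random.Random(42)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(plus, rng)] += 1

print(counts)  # roughly 5,000 each
```

Notice that a single run tells you almost nothing; only the statistics over many runs reveal the underlying state - which is why scientists repeat a problem to double-check the answer.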
Entanglement is even more mind-blowing. Applying microwave pulses between a pair of qubits can entangle them so that they always exist in the same quantum state. This allows scientists to manipulate pairs of entangled qubits by just changing the state of one of them, even if they’re physically separated by a long distance - hence, “spooky action at a distance.” Because of the predictive nature of entanglement, adding qubits exponentially increases a quantum computer’s computing power.
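A sketch of what “always exist in the same quantum state” means, again as a toy amplitude simulation (the state shown is the standard Bell state; `measure_pair` is an illustrative helper, not any real API):

```python
import random

# A two-qubit register has four amplitudes, one per outcome 00, 01, 10, 11.
# The Bell state (|00> + |11>)/sqrt(2) is entangled: neither qubit alone has
# a definite value, yet the pair is perfectly correlated.
bell = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def measure_pair(state, rng):
    """Sample one joint outcome with probability |amplitude|^2."""
    r = rng.random()
    total = 0.0
    for outcome, amp in state.items():
        total += abs(amp) ** 2
        if r < total:
            return outcome
    return outcome  # guard against floating-point round-off

rng = random.Random(0)
results = [measure_pair(bell, rng) for _ in range(1_000)]

# The two bits always agree: "01" and "10" never appear.
print(set(results))
```

Measuring one qubit of the pair fixes what the other will read, no matter how far apart they are - that correlation is the “spooky” part.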
Interference is the final property, and it’s what makes quantum algorithms work. It helps to picture rolling waves: sometimes they boost each other up (constructive interference), other times they cancel each other out (destructive interference). In a nutshell, interference lets scientists steer a quantum state by amplifying the signals that lead toward the right answer and canceling those that lead to wrong ones.
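The simplest place to see interference is the Hadamard gate applied twice. The amplitude arithmetic below is the standard definition of the gate; the surrounding scaffolding is, again, just a toy:

```python
# One-qubit interference: the Hadamard gate sends amplitudes
# (a, b) -> ((a + b)/sqrt(2), (a - b)/sqrt(2)).
def hadamard(state):
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

start = (1.0, 0.0)       # the qubit definitely reads 0
once = hadamard(start)   # equal superposition: a 50/50 coin
twice = hadamard(once)   # the two paths to 1 cancel (destructive),
                         # the two paths to 0 reinforce (constructive)

print(twice)  # (1.0, 0.0) up to round-off: certain to read 0 again
```

One gate makes the outcome random; a second gate doesn’t make it more random, it makes it certain again. Quantum algorithms choreograph exactly this kind of cancellation at scale.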
The general goal here is to encode parts of a problem into a complex quantum state using qubits, and then manipulate the state to drive it towards something that will eventually represent the solution, which can be measured after collapsing the superpositions into deterministic sequences of 0s and 1s.
As with classical programming, scientists are now working to move from low-level assembly-like languages, which the machine understands best, to high-level languages and graphical interfaces better suited to the human mind. IBM’s Qiskit, for example, lets experimenters set up problems and drag and drop logic gates.
So why aren’t quantum computers more common already? In a way, scientists are trying to build perfect machines out of imperfect parts. Quantum computers are extremely sensitive to perturbations, noise, and other environmental effects, which cause their quantum state to waver and disappear, an effect called decoherence.
To some experts, decoherence is the challenge holding quantum computing back. Even with the utmost care, noise can slip into calculations. Scientists can only preserve quantum information for so long before it loses fidelity, which limits the number of calculations that can be run in a row before everything collapses.
The delicate nature of quantum computing is also why blindly adding qubits to a system doesn’t necessarily make it more powerful. Fault tolerance is an active research area in quantum computing: logically, adding qubits could compensate for some of the problems, but millions of error-correcting qubits may be needed to create a single, reliable, data-carrying qubit (today’s devices have around 128 qubits in total). Smarter algorithms, while still in their infancy, could also help.
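The intuition behind error-correcting qubits has a classical ancestor: spend extra, redundant bits to protect one logical bit. Real quantum error correction is far subtler - you can’t simply copy a qubit (the no-cloning theorem) - but the redundancy payoff looks like this classical sketch, with a noise model I’ve assumed for illustration (each copy flips independently with probability p):

```python
import random

# Classical repetition code: encode one bit as three copies,
# decode by majority vote.
def noisy(bit, p, rng):
    # Assumed noise model: the bit flips with probability p.
    return bit ^ (rng.random() < p)

def noisy_encoded(bit, p, rng):
    copies = [noisy(bit, p, rng) for _ in range(3)]
    return int(sum(copies) >= 2)  # majority vote

rng = random.Random(7)
p, trials = 0.1, 100_000
raw_rate = sum(noisy(0, p, rng) for _ in range(trials)) / trials
enc_rate = sum(noisy_encoded(0, p, rng) for _ in range(trials)) / trials

print(raw_rate)  # close to p = 0.1
print(enc_rate)  # close to 3p^2 - 2p^3 = 0.028: redundancy suppresses errors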
Because big data is all the rage, you might expect quantum computers to better handle large datasets than classical ones. Not so.
Rather, quantum computers are especially good at simulating nature. For example, in drug development quantum computing can more efficiently design drug molecules because it operates on the same quantum rules as the molecules it’s trying to simulate. Computing a molecule’s quantum state is incredibly computationally challenging with our current computers, but it’s right up a quantum computer’s alley.
Similarly, quantum computing could also revolutionize materials science or information transfer. Thanks to entanglement, qubits physically separated by long distances could potentially create a channel for information transfer that’s provably more secure than our current channels. A quantum internet isn’t a wild dream.
But perhaps most exciting is this: we don’t even know all the exciting questions quantum computers can tackle. Just by having a commercial quantum computer available and having people play with it (awesome interaction at a distance), we may be able to identify other exciting areas well-suited for this mind-blowing emerging tech.
Still with me? Here’s Jeff Welser on the current state and future of quantum computing.
Shelly Fan: Why did IBM decide to build and release IBM Q?
Jeff Welser: Think of this early stage of quantum computing like classical computing in the 1950s, and the beginnings of the mainframe. There is one big difference: the cloud gives us a way to share access to this nascent technology with the world. It’s like getting years to prepare for next-generation technology while it’s still a prototype. It’s why in 2016 we made publicly available the IBM Q Experience, which includes the public, no-charge 5-qubit and 16-qubit devices. In 2017 we launched the commercial IBM Q Network, which has access to our 20-qubit systems.
To date, more than 110,000 users have run more than 7 million experiments on the public IBM Q Experience devices, publishing more than 145 third-party research papers based on experiments run on the devices. The IBM Q Network has grown to 45 organizations all over the world, including Fortune 500 companies, research labs, academic institutions, and startups. This goal of helping industries and individuals get “quantum ready” with real quantum hardware is what makes IBM Q stand out.
SF: What are the main technological hurdles that still need to be resolved before quantum computing goes mainstream?
JW: Today’s approximate or noisy quantum computers have a coherence time of about 100 microseconds. That’s the time in which an experiment can be run on a quantum processor before errors take over. Error mitigation and error correction will need to be resolved before we have a fault-tolerant quantum computer.
We are in a stage of rapid evolution of quantum computing. It makes much more sense to put them on the cloud so we can maintain, upgrade, and scale them in our cloud data centers.
SF: Do you envision quantum computing as the future of computation?
JW: Quantum computers and classical computers will work together for the foreseeable future. Quantum processors have to be kept at near absolute zero to operate, and we need classical computers to interface with quantum systems: send signals to the processors, interpret the results of those signals, etc.
What we can imagine are classical computers tapping into quantum computers for tasks too unwieldy or intractable for them, like molecular simulation. That’s how quantum computers will provide a “quantum advantage” for certain tasks.