Since 2015, interest in quantum computing has steadily increased, with total venture capital flows into quantum projects exceeding $1.2 billion over the last six years. With computing giants Google, Microsoft and IBM all investing heavily in the technology, many have heralded the dawn of the quantum computing age: the birth of a new realm of hi-tech which promises to revolutionise entire industries, from drug development and novel materials to finance.
But it is not just venture capital investors who have a vested interest in quantum: governments around the world, as well as a wealth of new startups, all have stakes in this fledgling space. A global race is now underway to unveil the first commercially available quantum computer, a target many believe is now tantalisingly within reach. We at Dawn call this quest Not Completely Useless Quantum Advantage (NoCoQuAd).
We’re having a great number of discussions around quantum computing, with those building and others interested in the space. We wanted to take the opportunity, therefore, to dive into the topic, explore where the market is now, and where we can expect it to go.
So what is a quantum computer?
In October 2019, scientists at Google made headlines by revealing that they had developed a quantum computer capable of solving a problem which would take the most powerful of classical supercomputers 10,000 years to complete.
The key to how quantum computers can perform calculations way beyond the limits of their classical equivalents lies in their ability to create and manipulate quantum bits, otherwise known as qubits.
While conventional computers store all their information as long strings of 0s and 1s, known as bits, quantum computers generate qubits which have a special property known as superposition. This property means they can represent multiple possible combinations of 0 and 1 at the same time, enabling the quantum computer to explore vast numbers of potential outcomes simultaneously. Qubits can also be entangled, meaning that the measurement outcomes of two qubits remain correlated no matter how far apart they are; entanglement cannot be used to transmit information faster than light, but it is a key resource behind the power of quantum algorithms. The final result of a calculation only emerges once the qubits have been measured, causing their quantum state to collapse to either 0 or 1.
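The superposition-and-measurement behaviour described above can be made concrete with a toy single-qubit simulator. This is a minimal illustrative sketch, not how real quantum hardware or libraries work: a qubit is represented as a pair of amplitudes, a Hadamard gate creates an equal superposition, and measurement collapses it probabilistically.

```python
import random

# Toy single-qubit simulator (illustrative only): a qubit is a pair of
# amplitudes (a, b) with |a|^2 + |b|^2 = 1.
SQRT_HALF = 0.5 ** 0.5

def hadamard(state):
    """Put a basis state into an equal superposition of |0> and |1>."""
    a, b = state
    return ((a + b) * SQRT_HALF, (a - b) * SQRT_HALF)

def measure(state):
    """Measurement collapses the superposition: 0 with prob |a|^2, else 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)        # start in |0>
qubit = hadamard(qubit)   # now an equal superposition of |0> and |1>
samples = [measure(qubit) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~0.5: each outcome with 50% probability
```

Note that each measurement yields a definite 0 or 1; the superposition is only visible statistically, across many repeated runs, which is exactly how real quantum programs extract answers.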
Steady progress has been made in scaling up these devices: Google's quantum processor contained 53 qubits, while Honeywell has announced a machine with a quantum volume of 64, a benchmark that combines qubit count with gate fidelity and connectivity.
What are the major challenges of building a quantum computer?
The process of generating and managing qubits is an enormous scientific and engineering challenge. Quantum states are exceptionally fragile, and the slightest vibration, temperature fluctuation, or interaction with the outside environment can cause them to collapse unexpectedly before the program has run to completion, making the quantum computer unreliable and error-prone.
This problem, known as decoherence, has been described as the difference between a 10,000-qubit quantum computer being a random noise generator and being the most powerful machine on the planet. As a result, scientists around the world, from the tech industry to national laboratories, have been exploring different methods of preventing or reducing such errors. Companies such as IBM, Honeywell, Google and Intel are looking at ways of creating a controlled quantum state, using tiny superconducting circuits cooled to temperatures as low as 4 kelvin (-269.15 degrees Celsius), comparable to deep space, or electromagnetic fields in ultra-high-vacuum chambers, although so far all of these methods still yield errors.
Because fully fault-tolerant quantum devices are thought to be at least a decade or two away, other scientists are looking at ways of managing errors, both through error mitigation techniques — such as estimating the error-free result of a computation from runs at several deliberately amplified noise levels — and through quantum error correction codes. The current challenge with error correction, however, is that these codes consume such a large number of physical qubits per logical qubit that relatively few remain for actual computation, greatly reducing the potential power of the quantum computer.
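The redundancy cost of error correction can be illustrated with the simplest possible code: a classical three-bit repetition code with majority-vote decoding. This is only a stand-in sketch — real quantum codes such as the surface code must also handle phase errors and cannot simply copy states, and they need far more than three physical qubits per logical qubit — but it shows the core trade-off the paragraph above describes: spending extra bits buys a lower logical error rate.

```python
import random

def noisy(bit, p):
    """Flip a physical bit with probability p (a crude noise model)."""
    return bit ^ (random.random() < p)

def logical_error_rate(p, trials=100_000):
    """Encode one logical bit into three physical bits, apply noise,
    and decode by majority vote; return how often decoding fails."""
    failures = 0
    for _ in range(trials):
        bits = [noisy(0, p) for _ in range(3)]  # encode logical 0 as 000
        decoded = 1 if sum(bits) >= 2 else 0    # majority vote
        failures += decoded != 0
    return failures / trials

p = 0.05
print(logical_error_rate(p))  # ~0.007, well below the physical rate of 0.05
```

Three physical bits per logical bit cuts the error rate roughly from p to 3p², but only when p is already small — which is why noisy hardware and heavy qubit overheads go hand in hand.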
What is ultimately driving this quantum race?
Geopolitical competition has a major role in driving quantum computing forwards, as governments have become increasingly aware that the potential power of such machines could pose new existential risks regarding encryption and national security.
Because a quantum computer running Shor's algorithm could perform prime factorisation — the hard mathematical problem underpinning almost all modern public-key encryption — exponentially faster than any known classical method, a quantum computer operating at full capacity could theoretically break even the most secure military communication channels. The US has already responded to this potential threat by introducing the National Quantum Initiative Act, a ten-year, $1.2 billion program to try and ensure that it remains ahead of the curve. That being said, the number of qubits needed is very large, meaning that RSA encryption remains safe for some time.
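The reason factoring is the target is worth unpacking: Shor's algorithm is mostly classical number theory, and the quantum computer's only job is one subroutine, finding the period of a^x mod N efficiently. The sketch below runs that reduction end to end for a toy modulus, with the period found by brute force (the step that is exponential classically and fast quantumly):

```python
from math import gcd

def factor_via_period(N, a):
    """Classical core of Shor's algorithm: given the period r of
    a^x mod N, recover non-trivial factors of N via gcd."""
    # Find the period by brute force. This step is exponential in general;
    # it is the one piece a quantum computer would do efficiently.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2:  # odd period: the reduction fails for this choice of a
        return None
    half = pow(a, r // 2, N)
    return gcd(half - 1, N), gcd(half + 1, N)

print(factor_via_period(15, 7))  # (3, 5)
```

For N = 15 and a = 7, the powers of 7 mod 15 cycle as 7, 4, 13, 1, so r = 4 and the gcd step recovers the factors 3 and 5. Breaking real RSA moduli (2048+ bits) this way is what would demand the very large qubit counts mentioned above.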
In the corporate world, quantum hardware is seen to represent the future of the cloud computing business model pioneered by Amazon Web Services over the last two decades. Google, Microsoft, Intel, IBM, as well as Chinese tech giants Alibaba and Tencent, all know that if they can become the first to develop commercially viable quantum hardware architecture, and make it accessible to corporations around the globe via APIs, they are unlocking the keys to a novel revenue stream worth billions.
Such are the potential applications of quantum computing — from financial predictions to resource optimisation in healthcare — even corporates not directly involved in building computing hardware have begun to invest in developing layers of the quantum stack, such as software programs that might run on a future quantum computer. Both Goldman Sachs and JP Morgan have built teams of quantum computing researchers to work on future applications ranging from portfolio optimisation to simulating the stochastic processes involved in derivative pricing.
So, where are we today?
Because of the many practical challenges surrounding decoherence and quantum error correction, the first wave of commercial applications will be powered by hybrid classical-quantum machines, known as NISQ (Noisy Intermediate-Scale Quantum) devices. These systems work by running only the most specialised and performance-critical sections of a software program on a quantum computer, with the bulk of the program being run on a more stable classical machine.
Many experts predict that the first NISQ devices could become available within the next three years, and their arrival will kickstart an entire industry of new startups, designing novel algorithms and software programs which can run on these devices.
The pharmaceutical industry is particularly well primed to be one of the main beneficiaries of these early quantum machines, which could impact every phase of the drug development pipeline. Many of the problems in computer-aided drug discovery, such as protein-structure prediction and molecule docking, are expected to be tackled by the variational quantum eigensolver (VQE) algorithm, which is designed to run on NISQ machines.
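What makes VQE a good fit for NISQ hardware is its hybrid structure: a classical optimiser repeatedly proposes circuit parameters, and the quantum device only runs a short circuit to estimate the energy of a candidate state. The sketch below is a deliberately minimal toy — a single qubit with Hamiltonian H = Z, whose ground-state energy is -1, and the expectation value computed exactly rather than sampled from hardware — but the loop has the same shape as a real VQE run:

```python
import math

def energy(theta):
    """'Quantum' subroutine: expectation value <Z> for the ansatz state
    Ry(theta)|0>, which equals cos(theta). On real hardware this number
    would be estimated from many repeated measurements."""
    return math.cos(theta)

def vqe(steps=200, lr=0.1):
    """Classical outer loop: gradient descent on the circuit parameter."""
    theta = 0.5  # arbitrary starting guess
    for _ in range(steps):
        # Parameter-shift rule: the gradient comes from two circuit runs.
        grad = 0.5 * (energy(theta + math.pi / 2) - energy(theta - math.pi / 2))
        theta -= lr * grad
    return theta, energy(theta)

theta, e = vqe()
print(round(e, 4))  # converges close to the true ground-state energy of -1
```

In chemistry applications the Hamiltonian encodes a molecule's electronic structure and the ansatz circuit spans many qubits, but the division of labour is the same: short quantum circuits inside a classical optimisation loop, which is exactly why the algorithm tolerates noisy, shallow hardware.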
Companies such as Finland-based Algorithmiq.fi, which is developing quantum algorithms to speed up variational routines in quantum-chemical simulations by many orders of magnitude over incumbents such as Google or IBM; ProteinQure, which uses machine learning and computational biology to perform structure-based drug design on NISQ devices; and Menten.ai, which claims to have developed the first protein design algorithm for current and near-term quantum computers, could all lead this initial wave. Because hybrid devices have the potential to increase, by orders of magnitude, the number of molecular interactions a computer can process in drug discovery simulations, pharmaceutical executives hope they can play a major role in slashing the cost and timeframe of bringing new drugs to market. It currently takes a new chemical entity around $2 billion and a decade of development to reach the clinic, so shortcutting this would be game-changing and yield big rewards.
And finally, what should we expect to come?
While the exact predictions vary from one report to the next, market forecasts broadly agree that the quantum computing market will grow substantially over the next five years. One report earlier this year predicted that the market value will swell from $472 million to more than $1.7 billion by 2026, at a compound annual growth rate of 30.2%.
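Those two figures are mutually consistent, as a quick compounding check shows (assuming, as the report implies, a base year of 2021):

```python
# Sanity-check the quoted forecast: a $472M market growing at a 30.2% CAGR.
value = 472.0          # market size in $M, assumed 2021 base year
for _ in range(5):     # compound annually from 2021 to 2026
    value *= 1.302
print(round(value))    # ~1766, i.e. "more than $1.7 billion" by 2026
```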
As NISQ devices become available in the next few years, we can expect more companies to start investing in being ‘quantum ready.’ The NISQ market alone is predicted to be worth $5 billion by 2024, with startups like QC Ware and Zapata advising organisations on using quantum-ready applications and simulations for high-value problems. Many companies will attempt to position themselves by paying for access to early quantum hardware in order to be ahead of the curve, even if those systems are not yet capable of accurate or complete computations.
Meanwhile, as the quantum hardware race between the global tech giants intensifies, we will see an increasing number of M&A deals involving early startup leaders, as it becomes evident that certain approaches are superior to others and large conglomerates realise they need to either abandon or diversify their current architectural approaches.
Longer term, quantum experts are predicting the start of a quantum advantage phase in 10 to 20 years’ time — an era marked by continued advantages in speed, cost, and quality, with quantum computers achieving steadily superior performance in industrial tasks. And, beyond that, full-scale, fault-tolerant quantum computers will become available, running algorithms capable of transforming every industry, from materials design to global healthcare to cybersecurity.
If you’re building in the space, I’d be delighted to chat and compare notes: email@example.com