Consider this a word to the wise. Just as recent developments in artificial intelligence have impacted, and will continue to impact, us all, quantum computing will amplify that effect and take us to places we can only imagine. This is the next truly disruptive technology on the horizon. It is not just about incremental improvements in computer hardware but a breakthrough in the way computers work, in how data is processed and handled, and in the ability to fulfill the aspirations and dreams of AI. My goal is to provide as many details as practical and introduce the fundamental “ingredients” of what is to come.

In short, quantum computing is an innovation that applies the laws of quantum mechanics to simulate and solve complex problems that are too difficult for the current generation of classical computers. It is more about science than hardware technology per se. It is still so new that most of us can’t define or properly understand it, but it might (many experts say will) be the next (currently obscure) technology to have a seismic effect on businesses of all types, including those in medicine, science and engineering.

## Importance of Quantum Computing

A key turning point in the history of quantum computing was reached in 2019 with the first claim of “quantum supremacy.” This refers to the point at which a quantum computer performs a task that would take even the most powerful classical supercomputer an impractically long time to complete. This achievement shows how quantum computers have the ability to tackle certain complicated problems orders of magnitude faster than classical computers.

Let me say at the outset that the current field of quantum computers isn’t quite ready for prime time. The technology is in its early stages. McKinsey, among others, has estimated that there will be 5,000 quantum computers operational by 2030 and that the hardware and software necessary for handling the most complex problems will follow after that.

At first glance, this appears to give us all some breathing room to acclimate, but keep in mind that in today’s world prognosticators and pundits don’t have a very good track record when it comes to forecasting technological advances, especially ones with such a huge upside. For one example, compare the forecasts with the reality of the expansion of AI.

## Understanding Moore’s Law

The concept of forecasting the future of technological advances in computing dates back to 1965, when Gordon E. Moore, the co-founder of Intel, made an observation that became popularly known as Moore’s Law. Moore’s Law states that the number of transistors on a microchip doubles about every two years with a minimal cost increase; in other words, the growth of microprocessor power is exponential. Over the years it has been all about making the chips smaller and packing more transistors into a given space. Many have expanded this “axiom” to say that technology writ large “doubles” every two years and grows exponentially.

Today, the general consensus is that we are reaching the physical limits of Moore’s Law and that they will be fully reached at some point in the 2020s. This is not new thinking. In a 2005 interview, Moore himself admitted that “…the fact that materials are made of atoms is the fundamental limitation and it’s not that far away…We’re pushing up against some fairly fundamental limits so one of these days we’re going to have to stop making things smaller.” This means that if we want to progress beyond current limitations, we will need to find ways to do things that lie outside the existing technological paradigms.

## Why Quantum Computing Matters

Since we are approaching the currently known limits of traditional computing, this is where necessity becomes the mother of invention. If we look at AI and its various iterations, we need more (pun intended) computing power to take it to the next level. That need and demand will fuel developments in quantum computing that will exceed forecasts. The takeaway is that organizations need to start thinking now about where they might leverage the technology to solve real-world business problems.

Fear not, quantum computing is not going to totally replace conventional computers. Subject matter experts and researchers alike explain that typical businesses with common, small- to moderate-sized problems will not benefit from quantum computing. However, those trying to solve large problems with exponential algorithmic gains and those that need to process very large datasets will derive significant advantages. As one pioneer in the field opines, “Quantum computing is not going to be better for everything, just for some things.” I would add that the range of applications where it is “better for some things” is huge. Now, let’s get to the point of what it is, how it works and where it stands.

### Technology for the Future

Quantum computing is a cutting-edge field of computer science that harnesses quantum mechanics to solve problems orders of magnitude faster than, and beyond the reach of, even the most powerful classical computers. When fully developed, a quantum computer will be able to address computational challenges in a matter of minutes that might take a classical computer thousands of years to complete. Yes, you heard that right. The technology includes both quantum hardware and quantum algorithms. Fair warning: here comes the tutorial part, with an informational assist from IBM, Google, Amazon and Microsoft plus a few others to keep us on track.

So take a deep breath and let us dive in!

## Principles of Quantum Mechanics

Quantum mechanics involves the study of subatomic particles and the unique, fundamental natural principles that govern them. Quantum computers harness these principles to compute “probabilistically and quantum mechanically.” Think in terms of sorting huge amounts of data and cross-referencing it to find the most probable outcomes. When discussing quantum computers, it is important to understand that quantum mechanics is not like traditional physics. As one physicist points out, “The behaviors of quantum particles often appear to be bizarre, counterintuitive or even impossible yet the laws of quantum mechanics dictate the order of the natural world.” Understanding quantum computing requires familiarity with four overriding principles of quantum mechanics:

- **Superposition**: Superposition is the state in which a quantum particle or system can represent not just one possibility, but a combination of multiple possibilities.
- **Entanglement**: Entanglement is the process in which multiple quantum particles become correlated more strongly than regular probability allows.
- **Decoherence**: Decoherence is the process in which quantum particles and systems can decay, collapse or change, converting into single states measurable by classical physics.
- **Interference**: Interference is the phenomenon in which entangled quantum states can interact and produce more and less likely probabilities.

Classical computers generally perform calculations sequentially, storing data by using binary bits of information with a discrete number of possible states, 0 or 1. When combined into binary code and manipulated by using logic operations, these bits can be used to create everything from simple operating systems to the most advanced supercomputing calculations. The limitation lies in those two discrete states. We can compute whatever we want in this binary world, but the downside is that the answer can take a lot of time to arrive. Think about that end game, and consider the demand for and value of time, when you think about the push to get quantum computers where they need to be as soon as possible.
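To make the time cost of sequential binary computation concrete, here is a toy sketch in plain Python (illustrative only; `classical_search` is a hypothetical example, not from any real system). A classical n-bit register holds exactly one of 2^n possible states at a time, so in the worst case a brute-force search must examine every one of them:

```python
def classical_search(n_bits: int, target: int) -> int:
    """Scan all 2**n_bits states in sequence until the target is found.

    Returns the number of states examined; in the worst case
    (the target is the last state) that is all 2**n_bits of them.
    """
    checked = 0
    for state in range(2 ** n_bits):
        checked += 1
        if state == target:
            break
    return checked

# Worst case for a 3-bit register: all 8 states must be examined.
print(classical_search(3, 0b111))  # prints 8
```

Every additional bit doubles the worst case, which is exactly the time pressure described above.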

## Enhancing Quantum Computers

Quantum computers function similarly to classical computers, but instead of single bits of information, they use qubits. Qubits are physical systems that behave quantum mechanically (atoms, subatomic particles, superconducting electric circuits or other systems) and carry a set of amplitudes applied to both 0 and 1, rather than just the two discrete states (0 or 1). This complicated quantum mechanical concept is called superposition, as noted earlier. Through a process called quantum entanglement, those amplitudes can apply to multiple qubits simultaneously.

A qubit can be “weighted” as a combination of zero and one at the same time. When combined, qubits in superposition can literally scale exponentially. Two qubits can represent four states, three can represent eight, four can represent sixteen, and so on, doubling with each qubit added. Think in terms of subatomic particles, the smallest known building blocks of the physical universe. Generally, qubits are created by manipulating and measuring these quantum particles, such as photons, electrons, trapped ions, and atoms.
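That doubling can be sketched in a couple of lines of Python (an illustration of the arithmetic, not a quantum program; `amplitude_count` is a hypothetical helper name):

```python
def amplitude_count(n_qubits: int) -> int:
    """A register of n qubits in superposition is described by 2**n amplitudes."""
    return 2 ** n_qubits

# Each added qubit doubles the number of states the register can represent.
for n in (1, 2, 3, 4):
    print(n, "qubits ->", amplitude_count(n), "states")  # 2, 4, 8, 16
```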

### Types of Qubits

To manipulate such particles, qubits must be kept extremely cold to minimize noise and to prevent inaccurate results or errors caused by unintended decoherence. There are many different types of qubits used in quantum computing today, with some better suited for different types of tasks. One type does not fit all applications. According to experts involved in the technology, a few of the more common types of qubits already in use today are as follows:

- **Superconducting qubits**: Made from superconducting materials operating at extremely low temperatures, these qubits are favored for their speed in performing computations and fine-tuned control.
- **Trapped ion qubits**: Trapped ion particles can also be used as qubits and are noted for long coherence times and high-fidelity measurements.
- **Quantum dots**: Quantum dots are small semiconductors that capture a single electron and use it as a qubit, offering promising potential for scalability and compatibility with existing semiconductor technology.
- **Photons**: Photons are individual light particles used to send quantum information across long distances through optical fiber cables and are currently being used in quantum communication and quantum cryptography.
- **Neutral atoms**: Commonly occurring neutral atoms charged with lasers are well suited for scaling and performing operations.

### Considerations for Qubit Technologies

The team at Innovation and Research points out that no qubit technology has yet emerged as the clear path to a fully capable quantum computer, nor is there a settled way to evaluate and compare the candidates. However, they did identify six key considerations and challenges for evaluating them:

- **Fidelity to scale**: This is closely related to the defining constraint of quantum computing, which is to increase the number of qubits and the computational power for complex algorithms while maintaining high levels of quality.
- **Computational speed**: Individual qubits can retain their quantum state (coherence) only for a limited time, so to compensate, gating operations should occur fast enough to make complex computations possible before qubits lose coherence.
- **Multi-qubit networks**: The more qubits that can be linked together, the more easily quantum computing algorithms can be implemented and the more powerful the resulting computer will be.
- **Control over individual qubits at scale**: This control is fundamental, but as the number of qubits in a quantum computing system increases, control over individual qubits becomes increasingly complex.
- **Cooling and environmental control**: The required scale of cooling equipment in terms of size and power is beyond the feasibility of currently available equipment.
- **Manufacturing**: Some qubit designs use existing production technology, while others require new manufacturing techniques. Therefore, such production will require automated manufacturing and testing of components at scale.

## Quantum Computers for a Broad Range of Datasets

Quantum processors do not perform calculations the same way classical computers do. Unlike classical computers, which must compute every step of a complicated calculation, quantum circuits made from logical qubits can explore enormous numbers of possibilities simultaneously with different operations, improving efficiency by many orders of magnitude for certain problems.

Quantum computers operate probabilistically and find the most likely solution to a problem, while traditional computers are deterministic, requiring laborious computations to determine a specific, singular outcome for any given inputs.

One of the leaders in quantum computing, IBM, says that “Describing the behaviors of quantum particles presents a unique challenge. Most common-sense paradigms for the natural world lack the vocabulary to communicate the surprising behaviors of quantum particles.” They offer the following to help us connect the dots.

### Principles of Quantum Mechanics in Quantum Computing

To understand quantum computing, it is important to understand how the key principles of quantum mechanics mentioned earlier work in quantum computing:

**Superposition**
- A qubit itself isn’t very useful. But it can place the quantum information it holds into a state of superposition, which represents a combination of all possible configurations of the qubit. Groups of qubits in superposition can create complex, multidimensional computational spaces. Complex problems can be represented in new ways in these spaces.
- This superposition of qubits gives quantum computers their inherent parallelism, allowing them to process many inputs simultaneously.
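As a rough intuition pump, superposition can be mimicked with a tiny state-vector sketch in plain Python (my own illustration, not a real quantum SDK): a single qubit is a pair of amplitudes, and the Hadamard gate places a definite 0 into an equal combination of 0 and 1.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

def probabilities(state):
    """Measurement probability of each outcome is the squared amplitude."""
    return [abs(a) ** 2 for a in state]

qubit = [1.0, 0.0]           # a definite 0, like a classical bit
qubit = hadamard(qubit)      # now a superposition of 0 and 1
print(probabilities(qubit))  # roughly [0.5, 0.5]: both outcomes equally likely
```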

**Entanglement**
- Entanglement is the ability of qubits to correlate their state with other qubits. Entangled systems are so intrinsically linked that when quantum processors measure a single entangled qubit, they can immediately determine information about other qubits in the entangled system.
- When a quantum system is measured, its state collapses from a superposition of possibilities into a binary state, which can be registered like binary code as either a zero or a one.
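That “measure one, know the other” behavior can be sketched with the simplest entangled state, the two-qubit Bell state (again a plain-Python illustration of the math, not a quantum simulator): its only possible measurement outcomes are 00 and 11, so the two qubits always agree.

```python
import math
import random

# Amplitudes for outcomes |00>, |01>, |10>, |11>: only 00 and 11 are possible.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

def measure(state, rng):
    """Collapse the state: pick one outcome with probability |amplitude|**2."""
    weights = [abs(a) ** 2 for a in state]
    return rng.choices(range(len(state)), weights=weights)[0]

rng = random.Random(0)
for _ in range(1000):
    outcome = measure(bell, rng)
    qubit0, qubit1 = (outcome >> 1) & 1, outcome & 1
    assert qubit0 == qubit1  # always 00 or 11, never 01 or 10
print("both qubits agreed in all 1000 measurements")
```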

**Decoherence**
- Decoherence is the process in which a system in a quantum state collapses into a nonquantum state. It can be triggered intentionally, by measuring a quantum system, or unintentionally, by environmental factors. Decoherence is what allows quantum computers to provide measurements and interact with classical computers.

**Interference**
- An environment of entangled qubits placed into a state of collective superposition structures information in a way that looks like waves, with amplitudes associated with each outcome. These amplitudes become the probabilities of the outcomes of a measurement of the system. The waves can build on each other when many of them peak at a particular outcome, or cancel each other out when peaks and troughs interact. Amplifying some probabilities and canceling out others are both forms of interference.
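Interference can also be sketched in a few lines of plain Python (illustrative only, using the same toy state-vector idea as above): applying the Hadamard gate twice returns a qubit to its starting state, because the amplitude paths leading to 1 cancel while the paths leading to 0 reinforce.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

state = [1.0, 0.0]       # start in a definite 0
state = hadamard(state)  # equal superposition: both amplitudes positive
state = hadamard(state)  # the two paths to 1 cancel; the paths to 0 reinforce
print(round(state[0], 6), round(state[1], 6))  # 1.0 0.0
```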

Quantum computers are expensive and require a deep understanding of quantum mechanics, computer science, and engineering. Finding professionals with expertise in all three is difficult, but the gap between the availability of and demand for this talent will be filled. Also, the environments in which they operate are “extreme,” so there are challenges to overcome, but there are no apparent “deal breakers.”

## Strategic Investment in Quantum Computing

Despite the challenges facing quantum computing, its future looks promising. It will become a fundamental tool for scientific research, making it easier to solve problems that were previously impossible. The battles for supremacy in this science and technology are heating up. Technology giants such as IBM, Google, Microsoft, Intel and Amazon (with AWS Braket), along with overseas companies like the Alibaba Group and Baidu out of China and Eviden out of France, are investing large sums of money in the field. Governments are also beginning to see the strategic importance of quantum computing, resulting in increased funding and collaborative efforts.

This is complex, to say the least, and a lot to absorb even on the surface, but for those who take the time, it provides a view into the not-too-distant future. Whether we understand the science or not, most of you will immediately connect the dots and see where the interests of AI and quantum computing intersect and where this is headed. Yes, there are still some challenges to overcome, but experts agree there is nothing that cannot be overcome…so stay tuned and buckle up!

*Alan C. Brawn, CTS, DSCE, DSDE, DSNE, DSSP, ISF-C, is principal at Brawn Consulting.*