One Million Qubit Quantum Computer

By Craig D

When will we see a 1 Million Qubit Quantum Computer?

For several decades, researchers, big tech companies, and governments around the world have poured time, talent, and resources into extensive research to advance quantum computing (QC) technology.

With D-Wave still at its 5,000-qubit Advantage system and IBM having just announced its 433-qubit Osprey processor, many tech pundits believe we still have a long way to go before anyone builds a one-million-qubit quantum computer. As much as there is a palpable sense of excitement in the QC industry over the progress made so far, there is no denying that scaling these disruptive machines demands enormous amounts of time, expertise, research, and money. So the big question is: when will a one-million-qubit quantum computer be available?

[Image: a qubit with a bright light at its centre, representing the goal of a one-million-qubit quantum computer]

Today, big industry players such as Google, IBM, Microsoft, and Intel are racing to build what would become the world's most powerful computer. Several reports claim that a QC machine of just 100 high-quality qubits could, for certain problems, outperform all the classical computers in the world combined.

Each qubit added to a system doubles the size of the state space it can represent, so a machine's potential power grows exponentially. Now imagine computer scientists building 1,000,000 qubits into one chip. That gives you an idea of what quantum supremacy is all about.
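To make the doubling concrete, here is a minimal sketch (the function name is ours, for illustration only) of why state space grows exponentially with qubit count: describing an n-qubit register classically requires 2^n complex amplitudes.

```python
# Illustrative only: an n-qubit register is described by 2**n complex
# amplitudes, so each added qubit doubles the classical memory needed
# to simulate it.
def state_space_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (1, 10, 50, 100):
    print(f"{n} qubits -> 2^{n} = {state_space_size(n)} amplitudes")
```

Even at 50 qubits the count already exceeds 10^15 amplitudes, which is why simulating modest quantum processors strains classical supercomputers.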

The Obstacles to Scaling Quantum Computers 

At the last count, a plethora of companies have invested billions of dollars in exhaustive research to scale QC systems. Despite all the milestones achieved so far, QC architectures and models remain relatively unstable and difficult to produce in commercial quantities.

The primary issues with today's QC machines are cost, heat management, readout, and control. Building a qubit that can withstand heat remains a major challenge, because today's qubits must be kept at near-absolute-zero temperatures to function optimally. In other words, adding more qubits to scale today's QC machines to a mind-blowing processing capacity will require ever more resources to keep their superconducting materials at operating temperature.

[Image: a stylized atom, or qubit, with purple rings around central and side spheres]

In fact, computer scientists have expressed frustration at how difficult it is to keep today's QC machines at those near-absolute-zero temperatures, let alone doing the same for one million qubits. To paint a clearer picture: a single qubit costs about $10,000, and keeping it working requires microwave controller electronics, coaxial cabling, and other hardware housed in large, controlled rooms.
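A naive extrapolation of that $10,000-per-qubit figure hints at the scale of the problem. This is a rough lower bound under an assumed linear cost model; in reality, control electronics, cabling, and cryogenics would not scale linearly.

```python
# Back-of-the-envelope only: linearly extrapolate the ~$10,000-per-qubit
# figure cited above to a one-million-qubit machine. Treat this as a
# rough lower bound, since overheads grow faster than linearly.
COST_PER_QUBIT_USD = 10_000  # figure cited in the article

def naive_system_cost(n_qubits: int) -> int:
    """Hardware cost assuming a constant per-qubit price."""
    return n_qubits * COST_PER_QUBIT_USD

print(f"${naive_system_cost(1_000_000):,}")  # prints $10,000,000,000
```

Ten billion dollars in qubits alone, before cooling and control infrastructure, makes clear why only large corporations can currently play in this space.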

It simply means that these computers are only available to large corporations that can afford them, thus making these next-gen machines unavailable to much of the world.  

What's more, today's QC systems also suffer from noise and interference. There are strong indications, however, that we will see superconducting QC systems with less noise in 2023 and beyond. Indeed, eliminating or at least minimizing noise and interference is critical to the success of QC technology. Plus, as we mentioned in our previous article, QC machines still depend on classical computers to display and simulate their results, which is another setback.

Nevertheless, once developers move past the NISQ era (noisy intermediate-scale quantum: small processors of roughly 50 to 100 qubits without fault tolerance), much higher levels of coherence will follow.

Building One Million Qubits

Let's be honest: scaling QC machines is no child's play, but QC scientists have taken the bull by the horns, and it is only fair to appreciate the remarkable feats they have achieved thus far. Over the past three to five years we have learned a great deal about this rapidly evolving technology, as big tech companies have taken turns unveiling new QC systems.

For instance, IBM has advanced from Falcon (27 qubits) to Hummingbird (65 qubits), Eagle (127 qubits), and now Osprey (433 qubits). The jump from Eagle to Osprey makes the progress obvious, but it also suggests that reaching the one-million-qubit target will take years, if not decades, of further research and development.

In a similar vein, Google AI, the artificial intelligence division of Google, has unveiled Bristlecone (72 qubits) and Sycamore (53 qubits). While those two companies research and develop gate-based QC systems, D-Wave has invested heavily in quantum annealing. As a result, the Canadian tech company has unveiled a series of quantum annealers, including a 2,048-qubit machine and the 5,000-qubit Advantage system in mid-2020. With even IBM, the undisputed quantum technology leader, still far from the one-million-qubit finish line, you may well suspect it will take ages to hit the target.

Well, Google is not resting on its laurels: it is committed to building an error-corrected QC machine and meeting the one-million-qubit benchmark. But like IBM, the search giant has admitted it has a long way to go before such ideas come to fruition.
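Error correction is also why the million-qubit figure keeps coming up. A common rule of thumb for the surface code is that a distance-d patch consumes roughly 2d² physical qubits per logical qubit (d² data qubits plus about as many measurement qubits). The sketch below uses that approximation with an assumed code distance of 27; both the formula's simplification and the parameter choice are our illustrative assumptions, not figures from the source.

```python
# Rough surface-code overhead estimate. A distance-d patch uses about
# 2*d**2 physical qubits per logical qubit (d**2 data + ~d**2 measure
# qubits). The distance of 27 below is an assumed, illustrative value.
def physical_per_logical(distance: int) -> int:
    """Approximate physical qubits consumed by one logical qubit."""
    return 2 * distance ** 2

def logical_qubits(physical_budget: int, distance: int) -> int:
    """How many logical qubits a physical-qubit budget can support."""
    return physical_budget // physical_per_logical(distance)

# A one-million-physical-qubit machine at distance 27 yields only a few
# hundred logical qubits.
print(logical_qubits(1_000_000, 27))  # prints 685
```

In other words, a million physical qubits buys on the order of hundreds of fault-tolerant logical qubits, which is why the target is a milestone rather than an endpoint.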

A Paradigm Shift 

One fact we know for certain is that the big tech companies are taking baby steps toward building one million qubits into a single system. On the flip side, PsiQuantum is charting a unique, faster course in a bid to get there first. The startup plans to have its production process in place by 2025.

Using a photonics-powered approach called fusion-based quantum computing, PsiQuantum aims to introduce a paradigm shift in the QC industry. Fusion-based quantum computing requires precise manufacturing tolerances to scale up. Furthermore, the firm plans to leverage standard semiconductor-manufacturing processes and network vast numbers of qubits into one physical unit to reach the threshold.

The PsiQuantum team believes they can build much larger QC systems than those that exist today. Once they accomplish that feat, they can use the systems to learn more about the feasibility of the target and continue to shoot for the moon. 

Before long, they will build a powerful machine that allows for the validation of control electronics, system integration, cryogenics, networking, etc. With the new approach, they won’t be programming for NISQ technology. The company embraced the photonics approach because it has numerous benefits, including manufacturability, scalability, a less frigid operating environment, and ease of networking and error correction.

The photonics technique is a form of linear optical quantum computing in which photons serve as qubits. PsiQuantum has published papers and released videos explaining its approach, but it still keeps the specific intricacies to itself.


In truth, it remains unclear when these tech firms will launch one-million-qubit quantum computers at commercial scale. What is rather clear is that if IBM sticks to its present strategy of incrementally scaling NISQ computers, reaching the threshold will take much longer. Hence, all eyes should be on the early-stage QC startups trying to upset the apple cart by introducing a paradigm shift.

In all honesty, no matter what concept or strategy a company employs, nobody should expect the launch of a one-million-qubit QC system before 2031. All things being equal, building one million superconducting qubits into a single system is still roughly a decade away (2032, to be more precise).
