The Devil in the Details: A Survey of Current Approaches to Building a Quantum Computer

Quantum computing has the potential to change the world. Its computational power could dwarf that of today's supercomputers, enabling radical disruptions and advances in a variety of applications. Fully realized quantum computers could break modern encryption, enable the discovery of new drugs and materials, facilitate unprecedented efficiency in financial networks and global commerce, and support powerful AI models. Related but distinct quantum technologies, such as quantum sensing and quantum communication, have already supported exciting developments in defense, medicine, and secure communications.

At present, quantum technologies are in use by companies and researchers around the world and have already demonstrated advantages in certain applications. However, the construction and scaling of these machines pose formidable theoretical and engineering challenges. Their computations are carried out via fragile, subatomic manipulations that are vulnerable to many forms of interference or “noise.” Improvements are constantly being made to overcome these obstacles, but quantum computers will need to grow in complexity and error tolerance before their more ambitious goals can be realized. For example, while Boeing has seen early success in working with IBM Quantum to theoretically optimize the design of airplane wing materials, a much more complex quantum computer than those currently available would be necessary to design a complete airplane wing. A McKinsey poll of tech executives, investors, and academics in the quantum computing space found that “72 percent expect to see a fully fault-tolerant quantum computer by 2035,” but current machines already offer some commercial utility.

At the heart of any effort to build a quantum computer is the need to construct qubits (short for “quantum bits”). Whereas conventional computer bits can represent either 0 or 1, qubits can represent 0, 1, or a combination of the two. This capability relies on the quantum property of superposition, wherein certain quantum systems can exist in multiple states at the same time. Superposition allows quantum computers to handle exponentially more complexity than conventional computers, which is the source of their anticipated computational superiority. Quantum processors need gates that manipulate the values of single qubits as well as pairs of qubits jointly; the latter operation relies on a property known as quantum entanglement. Importantly, qubits must also be scalable. Experts have estimated that a general-purpose quantum computer might require one million qubits or more, but world-leading quantum processors have thus far demonstrated only a thousand or so qubits. While qubit counts are easy to compare and therefore likely to make headlines, not all qubits are created equal: the ability to build complex entanglements between qubits and to guard against the many sources of error is just as important a measure of a quantum computer’s overall quality. For example, researchers have worked to build qubits that resist decoherence, the process by which stored quantum information degrades over time and becomes unusable.
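To make these ideas slightly more concrete, the short Python sketch below is a minimal illustration (hypothetical and not tied to any particular hardware approach) of the math behind the hardware: it represents a qubit as a two-element vector of complex numbers, uses a Hadamard gate to place it in superposition, and uses a CNOT gate to entangle two qubits into a Bell state. It also shows why the description of n qubits grows exponentially, since a full state vector requires 2^n complex amplitudes.

```python
import numpy as np

# Single-qubit basis states
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# A Hadamard gate puts a qubit into an equal superposition of 0 and 1
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero                    # (|0> + |1>) / sqrt(2)
print(superposed)                        # approx [0.707, 0.707]

# A CNOT gate acting on two qubits creates entanglement: starting from
# |00> with a Hadamard on the first qubit, it yields the Bell state
# (|00> + |11>) / sqrt(2), so measuring one qubit fixes the other
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, zero)
print(bell)                              # approx [0.707, 0, 0, 0.707]

# Describing n qubits classically takes 2**n complex amplitudes,
# which is the exponential growth referred to above
for n in (10, 50, 300):
    print(f"{n} qubits -> 2^{n} = {2**n} amplitudes")
```

Real quantum hardware does not store these amplitudes explicitly; the exponential bookkeeping is the cost a classical simulation pays and a quantum processor, in principle, avoids.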

At present, at least six main competing visions for qubit construction are being pursued by tech giants, startups, and academics around the world. While each approach has its supporters and detractors, there is currently no clear answer as to which form, or forms, of quantum hardware will ultimately prove superior. For each approach, this blog offers a high-level description of the method, its current progress, the groups pursuing it, and any notable advantages or disadvantages in scaling toward general-purpose quantum computers.

Superconductors

Just as laptops may overheat when performing intensive computation, a small portion of the electricity flowing through nearly any material is constantly converted to heat through a phenomenon known as resistance. Superconductors are materials (most commonly the metals niobium and aluminum) that conduct electricity without any resistance or energy loss, though only at near-absolute-zero temperatures close to -460 degrees Fahrenheit. Using superconductors, engineers can construct circuits that never lose their energy, allowing them to make durable qubits by manipulating the electrical states of these circuits with magnetism and microwaves.

The popular image of a quantum computer is a hanging jumble of gold and wiring that some have likened to a chandelier, but the majority of this infrastructure exists to maintain the requisite low temperatures; the actual processing occurs in a small chip embedded at the bottom of the structure. These chips can be fabricated using existing microchip technology, an advantage for scalability. It takes one to two days to cool IBM’s “chandelier” dilution refrigerator down, and this cooling process is far more energy-intensive than any computation the chip may perform. The need for supercooling may hamper efforts to scale the technology or make it more mobile. The discovery of a room-temperature superconducting material would remove these burdens, but research efforts have thus far been unsuccessful.

In December 2023, IBM introduced the largest quantum chip yet built, with 1,121 qubits, along with a smaller 133-qubit chip designed to be combined with other chips. Several competitors are also in the mix: Google was the first company to claim “quantum supremacy,” in 2019, when its quantum computer outperformed the world’s best conventional supercomputer (although these results have been disputed by rivals), and it currently boasts a 70-qubit machine. U.S.-based Rigetti Computing has an 80-qubit system, and Japan’s Fujitsu has developed a 64-qubit machine in collaboration with the RIKEN national research institute. Significantly, the Chinese Academy of Sciences has built a 176-qubit superconducting computer despite U.S. sanctions on the import of critical chipmaking and dilution refrigeration technologies. Baidu has also built a 36-qubit machine.

Photonics

This approach is centered on photons, the massless particles that make up light. Photons are quite stable and can travel long distances without decoherence, making them an ideal basis for a qubit. Internal architectures vary, but the core of the approach is sending precise pulses of laser light through specially designed crystals. Each pulse contains a pair of entangled photons that constitute a qubit, and special loops of fiber-optic cable allow for further entanglement and the performance of calculations while these photons are in transit. Highly sensitive sensors capture the results of these light pulses and translate them into signals that conventional computer hardware can read. Importantly, most of this process is performed at room temperature, though leading approaches currently rely on superconducting photon sensors that must be kept near absolute zero. There is hope that these sensors could be replaced by a non-superconducting alternative, enabling full computation at room temperature. Removing reliance on expensive and energy-intensive supercooling machinery could significantly improve access to quantum computers and support mobile use cases. Proponents of photonics have also argued that its reliance on well-developed technologies such as lasers and fiber optics will make it easier and more cost-effective to build. They further contend that the use of light to communicate quantum information will become essential to other approaches as they look to scale, including superconductors and atom-based methods.

Photonics is the only approach besides superconducting circuits that has thus far demonstrated “quantum supremacy” by outperforming conventional computers on well-defined tasks. In 2020, a team from the University of Science and Technology of China first achieved this feat with photons, solving in 200 seconds a physics problem that would have taken the world’s best supercomputers an estimated 2.5 billion years, and in 2022 Canada’s Xanadu solved the same type of problem in a fraction of a second. Xanadu’s machine boasts 216 photonic qubits, while the leading Chinese effort now has 255 and has since broken earlier speed records. Other companies working on photonic computing include the U.K.’s ORCA Computing, as well as the U.S.’s Quantum Computing Inc. and PsiQuantum.

Trapped Ions

In this approach, ions (atoms carrying a positive or negative electric charge) are trapped within an electromagnetic field, with quantum information stored in the electronic states of the ions. Targeted laser bursts can alter these energy levels and induce superposition and entanglement. Only a small subset of ions is well suited to this task, including calcium-40, beryllium-9, barium-138, and the rare earth element ytterbium-171. This approach can be performed entirely at room temperature, and proponents claim it outperforms superconducting qubits in the quality of its entangling operations. It is also one of the oldest approaches, dating back to the mid-1990s.

Several companies have developed trapped-ion computers for research purposes, though qubit numbers are more modest than in some other approaches. Quantinuum has a 32-qubit computer that has demonstrated the best fidelity rate (99.8%) for two-qubit operations of any commercial quantum product, meaning it can perform these operations with fewer errors than other machines. IonQ has developed a 35-qubit machine, and Austria’s Alpine Quantum Technologies offers 20 qubits; both firms’ computers can fit in a standard server rack found in any data center. However, the laser control system becomes increasingly complex to manage as more qubits are added, potentially limiting the scalability of ion-trap computers. Furthermore, charged ions repel each other, making it difficult to place large numbers of ions in close proximity.
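To see why a fidelity figure like the 99.8% quoted above matters as much as raw qubit counts, consider a rough back-of-the-envelope calculation (the circuit sizes below are hypothetical and chosen only for illustration, and the independence assumption is a simplification): if each two-qubit gate succeeds with probability p, a circuit that chains N such gates runs error-free with probability of roughly p^N in the absence of error correction.

```python
# Back-of-the-envelope look at how gate errors compound (no error correction).
# If each two-qubit gate succeeds with probability p, a circuit chaining
# N such gates runs error-free with probability roughly p**N.
for p in (0.998, 0.99):                      # e.g., 99.8% vs. 99% fidelity
    for n_gates in (100, 1_000, 10_000):     # hypothetical circuit sizes
        print(f"fidelity {p:.3f}, {n_gates} gates -> "
              f"~{p ** n_gates:.2%} chance of an error-free run")
```

Even a seemingly small difference in gate fidelity therefore compounds into a large difference in how deep a useful circuit can be, which is why error rates and error correction feature so prominently in the fault-tolerance timelines discussed earlier.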

Neutral Atoms

This approach is similar to ion traps, but it relies on atoms with no charge. Unlike ions, neutral atoms can be held in close proximity to one another. They are commonly arranged in a neat two-dimensional grid inside a vacuum chamber, though leading researchers may eventually pursue the more difficult task of constructing three-dimensional arrays for additional complexity. Instead of electromagnetic fields, the atoms are trapped by highly focused laser beams sometimes referred to as “optical tweezers.” The atoms themselves are metals such as strontium, ytterbium, and rubidium, heated into a gaseous form that the lasers can manipulate. The results of the calculations are then read out by special cameras. Like ion traps, this process can be performed entirely at room temperature.

In October 2023, Atom Computing announced that it had used neutral atoms to produce the first gate-based quantum computer with more than 1,000 qubits, with a total of 1,180. The Berkeley-based company also claims to have demonstrated a record coherence time of 40 seconds. Other companies have also had success scaling neutral-atom computers, including France’s Pasqal with a 324-qubit machine and Boston-based QuEra with 256 qubits.

Further Approaches and Futures

While the approaches above have demonstrated the most scale, it is possible that future quantum computers will run on different technologies that are still in earlier stages of development. Chief among these are silicon quantum processors, which aim to use leading chipmaking technologies to fabricate qubits as small as conventional transistors (up to one million times smaller than current qubit technologies). These chips could be mass-produced and would potentially lend themselves to scaling, though they would need to be supercooled. Intel is working on this technology and has developed a 12-qubit chip. Another growing approach seeks to exploit natural defects in the structure of diamonds, known as nitrogen-vacancy centers, to trap electrons that store quantum information; this method has so far yielded a 5-qubit machine in Australia that can operate at room temperature.

Innovation is often unpredictable in this space, so any of these approaches, or others not mentioned here, may eventually form the cutting edge of quantum computing. Some initially promising approaches, such as nuclear magnetic resonance computers, have already fallen out of favor with the research community, and some of the proposals above could meet the same fate due to cost barriers or physical limitations. A future quantum computing landscape could also feature several different qubit hardware implementations, each suited to different use cases. Countries and tech companies should continue to invest in a broad portfolio of quantum computing innovation until the playing field becomes clearer.

Evan Brown
Program Coordinator and Research Assistant, Economics Program and Scholl Chair in International Business