An Introduction to Quantum Computing
Overview
Quantum computing is often described as the deepest of the deep-tech moonshots, but I think that is precisely why it is so attractive – “We choose to go to the Moon... not because they are easy, but because they are hard” – John F. Kennedy.
As a clear example of deep tech, quantum is a scientifically demanding, capital-intensive field with potentially transformative long-term impact. But deep tech companies have a very different risk profile from a B2B SaaS company, notably because of the engineering risk and the difficult path from lab to market.
What is a quantum computer? Arguably, it is the most powerful kind of machine that the laws of physics allow. It is not, however, a general-purpose machine that will replace classical computing. Instead, it is a new computing paradigm that uses controllable quantum-mechanical systems to solve certain structured problems more efficiently than classical machines. In practice, the most important concepts are superposition, entanglement, and interference. Of these, interference is especially important from a computational perspective: quantum algorithms do not simply “try every answer at once”, but are designed so that probability amplitudes for useful answers reinforce one another while bad answers cancel out.
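Interference can be seen in a simulation small enough to fit in a few lines. The sketch below (mine, not from any vendor toolkit) represents a single qubit as a two-component amplitude vector and applies the Hadamard gate twice: the first application creates an equal superposition, and the second makes the amplitudes for |1⟩ cancel while those for |0⟩ reinforce.

```python
import numpy as np

# A qubit is a vector of two complex amplitudes; |0> is (1, 0).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0
print(np.abs(superposed) ** 2)   # [0.5 0.5] -- both outcomes equally likely

# A second Hadamard: the two paths to |1> carry opposite signs and cancel
# (destructive interference), while the paths to |0> add up (constructive).
back = H @ superposed
print(np.abs(back) ** 2)         # [1. 0.] -- measuring now gives |0> with certainty
```

The point is that the probabilities did not mix randomly: the signs of the amplitudes were arranged so that one outcome was amplified and the other suppressed, which is exactly the resource quantum algorithms exploit.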
Today’s quantum computers are still noisy and fragile; decoherence, error correction, and scaling constraints remain fundamental challenges. As a result, near-term commercial activity often comes through cloud access, experimentation, government initiatives, and frontier companies’ R&D contracts rather than repeatable product or service economics at scale. The shift towards meaningful revenue generation is still to come; development is steady, but the timeline is uncertain, however much one hopes it is foreseeable. The sector has progressed strongly across academic work, engineering, and funding, yet it is still at an early stage of development.
The DiVincenzo Criteria
A useful framework for assessing any quantum computing platform is the DiVincenzo criteria, formalised by David DiVincenzo in 2000. These set out five core requirements for a viable quantum computer:
• A scalable physical system with well-characterised qubits
• The ability to initialise qubits into a known starting state
• Coherence times long enough to perform computation
• A universal set of quantum gates
• Qubit-specific measurement
They remain a useful way to distinguish scientific promise from engineering maturity.
Modalities
Quantum-computing modalities are the different physical approaches used to build and control qubits. Modalities matter because there is no consensus winner yet; each platform is really a different bundle of engineering trade-offs across fidelity, coherence, connectivity, manufacturability, speed, infrastructure burden, and path to error correction.
• Superconducting circuits are currently among the most established modalities, used prominently by IBM and Google. They offer fast gate operations and have benefited from strong ecosystem development, but they require cryogenic infrastructure and still face scaling and error-correction challenges.
• Trapped ions encode information in individual charged atoms held in electromagnetic traps and manipulated with lasers. They are known for high-fidelity operations, long coherence times, and strong qubit connectivity, though scaling and system engineering remain demanding.
• Neutral atoms use lasers and optical tweezers to trap and arrange uncharged atoms in programmable arrays. This modality is attractive because atoms are naturally identical and can be arranged flexibly in 2D or 3D layouts, giving it a potentially strong scaling story, though the technology is still evolving towards robust fault-tolerant operation.
• Photonic quantum computing uses photons to encode and process information. Photons are naturally attractive for networking and can operate in architectures that leverage existing semiconductor and fibre-optic supply chains, but engineering reliable photon generation, interaction, loss tolerance, and large-scale error correction remains difficult.
• Semiconductor spin qubits encode information in electron or nuclear spins, often in silicon-based devices. The strategic attraction here is potential compatibility with mature semiconductor fabrication, though the platform still faces significant scaling and control challenges.
• Topological qubits remain one of the most ambitious approaches because, in principle, they could offer stronger intrinsic error protection. Microsoft’s current effort is centred on its Majorana 1 announcement and topological-qubit architecture, but this remains an emerging and closely watched area.
Potential Use Cases
Perhaps it makes more sense to think about quantum use cases in terms of the underlying problem classes. The most credible long-term opportunities are in areas where quantum mechanics is intrinsic to the problem or where specific mathematical structure creates a genuine algorithmic advantage.
The first major class is cryptography-relevant mathematics, especially factoring and discrete logarithms. A sufficiently capable fault-tolerant quantum computer could break widely used public-key cryptosystems, which is why governments and enterprises are already moving towards post-quantum cryptography. This is a real and important use case.
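Why factoring in particular? Shor’s algorithm reduces factoring to finding the period of a^x mod N, and only that period-finding step needs a quantum computer; the rest is classical number theory. The sketch below (function names my own) finds the period by brute force, which is exponentially slow classically and is precisely the step a fault-tolerant machine would accelerate, then converts it into a factor.

```python
from math import gcd

def period(a: int, n: int) -> int:
    """Order of a modulo n, found by brute force. This is the step
    Shor's algorithm replaces with quantum period-finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_period(n: int, a: int) -> int:
    """Classical post-processing of Shor's algorithm: turn the period
    of a^x mod n into a non-trivial factor of n. Assumes a is coprime
    to n, the period is even, and a^(r/2) is not -1 mod n."""
    assert gcd(a, n) == 1
    r = period(a, n)
    assert r % 2 == 0 and pow(a, r // 2, n) != n - 1
    return gcd(pow(a, r // 2) - 1, n)

print(factor_via_period(15, 7))  # 7 has period 4 mod 15, yielding the factor 3
```

RSA’s security rests on the classical period() step being intractable for 2048-bit numbers; a machine that performs it efficiently breaks the scheme, which is what motivates the migration to post-quantum cryptography.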
The second major class is quantum simulation, especially in chemistry, materials, catalysts, batteries, and drug discovery. The value proposition is less about “faster computing” and more about “new capability”, and this remains one of the most important long-term commercial theses in quantum computing.
A third category is optimisation and sampling, relevant in sectors such as logistics, manufacturing, energy, and finance. This area is commercially attractive but also more contested: the theoretical case is often weaker or more problem-specific than in cryptography and simulation, and classical heuristics remain strong. This is precisely why many near-term use cases should be treated cautiously from an investment perspective.
The key challenges facing the industry will be explored in the next article.
Author: a physics girlie
Selected Sources
• DiVincenzo (2000): https://arxiv.org/abs/quant-ph/0002077
• IBM Quantum overview: https://www.ibm.com/think/topics/quantum-computing
• NIST quantum computing explainer: https://www.nist.gov/quantum-information-science/quantum-computing-explained
• NIST post-quantum cryptography: https://csrc.nist.gov/projects/post-quantum-cryptography
• Microsoft Majorana 1: https://azure.microsoft.com/en-us/blog/quantum/2025/02/19/microsoft-unveils-majorana-1-the-worlds-first-quantum-processor-powered-by-topological-qubits/
• McKinsey Quantum Monitor 2025: https://www.mckinsey.com/~/media/mckinsey/business%20functions/mckinsey%20digital/our%20insights/the%20year%20of%20quantum%20from%20concept%20to%20reality%20in%202025/quantum-monitor-2025.pdf