The quantum leap: Why 2025 became the year everything changed

Posted: February 13, 2026

Whether you're organizing a supply chain for the best combination of speed and cost, or assigning jobs to machines to minimize downtime, computer programs can help find solutions. But here's the catch: with every new variable, the time required to calculate the optimal answer to a complex problem can grow exponentially. Take the problem of optimizing a national power grid, for example. Fully accounting for every generator, transmission line, storage unit, and real-time fluctuation in demand and weather patterns is what mathematicians call an NP-hard problem: as the grid grows, the number of possible configurations can increase so rapidly that finding the truly optimal solution becomes practically impossible for classical computers.
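To make that combinatorial explosion concrete, here is a toy sketch (illustrative numbers, not taken from any real grid): choosing which generators to switch on so that capacity covers demand at minimum cost. A brute-force search must examine every on/off combination, and there are 2^n of them, so each added generator doubles the search space.

```python
from itertools import product

# Toy "unit commitment" problem: pick a set of generators whose combined
# capacity covers demand, at minimum total cost. Brute force checks every
# on/off combination -- 2**n of them -- so each new generator doubles
# the work. All numbers below are made up for illustration.
generators = [  # (capacity in MW, running cost)
    (100, 50), (80, 30), (120, 70), (60, 20), (90, 45),
]
demand = 200

best_cost, best_choice = None, None
for choice in product([0, 1], repeat=len(generators)):
    capacity = sum(g[0] for g, on in zip(generators, choice) if on)
    cost = sum(g[1] for g, on in zip(generators, choice) if on)
    if capacity >= demand and (best_cost is None or cost < best_cost):
        best_cost, best_choice = cost, choice

print(best_choice, best_cost)  # cheapest feasible on/off combination
print(2 ** len(generators))    # 32 configurations for just 5 generators
# At grid scale -- thousands of units plus timing, storage, and weather
# constraints -- 2**n is astronomically large, which is why exact answers
# become impractical for classical machines.
```

Real grid optimizers use clever heuristics rather than brute force, but the underlying exponential growth in configurations is exactly what makes the exact problem intractable at scale.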

Complex problems like these are a daily reality facing grid operators, logistics managers, pharmaceutical researchers, and manufacturing engineers around the world. As grids modernize and add distributed energy resources like rooftop solar panels and battery storage, the problem will only compound. Similarly, a pharmaceutical company screening millions of molecular compounds for a new drug candidate faces a combinatorial explosion of possible interactions. Global shipping companies routing thousands of containers across ports, warehouses, and delivery trucks must weigh countless combinations of cost, timing, weather, and capacity constraints. What these challenges share is sheer computational scale: even with modern parallel processing, classical computers struggle to keep up as the problems grow. In theory, a quantum computer could tackle them more efficiently.

The strange world of qubits

It can be tempting to think of a quantum computer as some kind of souped-up regular computer operating at hyper speeds. This is not the case. “A fighter jet is not a faster Ferrari because it has wings,” Sridhar Tayur, a professor with Carnegie Mellon University’s Tepper School of Business, told CNN last November. “Quantum computing is not just a faster classical computer…because it works on a different principle.”

Classical computers operate using bits: switches that are, effectively, either on or off, with a value of 0 or 1. A quantum bit, or qubit, can instead exist in a superposition, a probabilistic state described by a wave function that represents the likelihood of measuring 0 or 1. In other words, a qubit doesn't commit to being 0 or 1 until it's measured; until then, it exists as a blend of probabilities that can be manipulated to solve problems. Qubits can also entangle, creating strong correlations with one another, or interfere, tipping the scales of the likelihood that certain combinations of outcomes will occur. For certain types of problems, these interactions let quantum computation vastly outperform classical computation. As Josh Schneider and Ian Smalley explain in a post for IBM, “Quantum mechanics is a bit like the operating system of the universe. A computer that uses quantum mechanical principles to process information has certain advantages in modeling physical systems.”
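The superposition and interference ideas above can be sketched in a few lines of ordinary code. This is a minimal classical simulation of a single qubit (a pair of complex amplitudes), not how a real quantum device works; the gate and function names are ours, chosen for illustration.

```python
import math
import random

# A qubit's state is a pair of complex "amplitudes" (a, b) for the
# outcomes 0 and 1, with |a|^2 + |b|^2 = 1. Measuring yields 0 with
# probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Map a definite state into an equal superposition (and back again)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

def measure(state):
    """Collapse the blend of probabilities into a definite 0 or 1."""
    p0, _ = probabilities(state)
    return 0 if random.random() < p0 else 1

state = (1.0 + 0j, 0.0 + 0j)      # definitely 0
state = hadamard(state)           # now a 50/50 blend of 0 and 1
print(probabilities(state))       # (0.5, 0.5)

# Interference: applying the same gate again makes the two paths to
# outcome 1 cancel, returning the qubit to a definite 0.
state = hadamard(state)
print(probabilities(state))       # back to (1.0, 0.0)
```

The second application of the gate is the interference step: the amplitudes for outcome 1 arrive with opposite signs and cancel, which is the kind of probability bookkeeping quantum algorithms exploit. Note that simulating n qubits this way requires tracking 2^n amplitudes, which is precisely why classical machines can't keep up.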

The hope is that quantum computers could eventually help solve the kinds of complex challenges industry is most interested in: simulating power markets or biological systems for drug discovery, optimizing logistics schedules, or modeling quantum systems to discover new materials.

But building quantum computers in practice is extraordinarily difficult. All it takes is a passing subatomic particle, slight vibration, or minute temperature fluctuation to throw off a qubit's state and introduce errors. "If I just vibrate a table, I'll kill our quantum computers," IBM's Jay Gambetta has noted. Because every qubit is error-prone, scaling up has historically meant that errors compound and calculations go off the rails faster.

What changed in 2025: Chips, architectures, error correction

In December 2024, Google announced its Willow chip, which, the company claimed, reduced errors rather than compounding them as more qubits were added. The chip performed a benchmark calculation in five minutes that Google claimed would take supercomputers ten septillion years. In a post announcing the chip, Hartmut Neven, Founder and Lead of Google Quantum AI, called the system “the most convincing prototype for a scalable logical qubit built to date. It’s a strong sign that useful, very large quantum computers can indeed be built.”
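The core idea behind a "logical qubit" is error correction: encode one value redundantly so that occasional failures can be outvoted. The sketch below is a toy classical analogy (a repetition code with majority vote), not Google's actual surface-code scheme, but it shows the key property Willow demonstrated: below an error threshold, adding redundancy reduces the logical error rate instead of compounding it.

```python
import random

def logical_error_rate(p, copies=3, trials=100_000, seed=42):
    """Monte Carlo estimate of how often majority voting fails.

    Each of `copies` redundant copies of a bit flips independently with
    probability p; the decoded value is wrong only when a majority flip.
    """
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(copies))
        if flips > copies // 2:  # majority of copies corrupted
            errors += 1
    return errors / trials

p = 0.05  # per-copy ("physical") error rate
print(logical_error_rate(p, copies=3))  # far below the raw rate of 0.05
print(logical_error_rate(p, copies=5))  # more redundancy, fewer errors still
```

With 3 copies the failure rate is roughly 3p², so at p = 0.05 the encoded bit fails far less often than any single copy, and 5 copies do better still. Quantum error correction is much harder (qubits can't simply be copied), but the scaling logic that Willow exhibited is the same: more redundancy, fewer logical errors.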

Michael Cuthbert, director of the UK's National Quantum Computing Centre, on the other hand, called it a "milestone rather than a breakthrough"—but "clearly a highly impressive piece of work."

Just two months later, in February of last year, Microsoft unveiled its Majorana 1 chip, which uses a new type of material called “topoconductors” to produce more reliable and scalable qubits. The company explained, “The topoconductor, or topological superconductor, is a special category of material that can create an entirely new state of matter—not a solid, liquid or gas but a topological state. This is harnessed to produce a more stable qubit that is fast, small and can be digitally controlled.”

From there, the announcements just kept coming. Later that same month, Amazon followed in Microsoft’s footsteps, announcing its own first quantum chip, Ocelot.

In August, the Chinese Jiuzhang 4.0 quantum computer set a new record when it used 3,090 particles to achieve “quantum advantage,” carrying out a specific benchmark task that would take classical supercomputers trillions of times the age of the universe to complete.

Then, in September, another record. Researchers at the California Institute of Technology created the largest qubit array ever assembled: 6,100 individual qubits, where previous arrays contained only hundreds. They kept the qubits in superposition for about 13 seconds, nearly 10 times longer than was possible in previous arrays, and were able to manipulate individual qubits with 99.98% accuracy.

In November, IBM revealed two new quantum computers: Nighthawk and Loon, both featuring a modular design and new ways of connecting qubits across computing units. Each qubit in Loon is connected to six others and isn’t just confined to horizontal movement across the chip—qubits can move vertically as well, a capability not seen before in quantum computing. Nighthawk, meanwhile, has four-way connected qubits. IBM hopes the increased connectivity will help further increase computational power and decrease errors, the major issue that continues to haunt all current quantum computing models.

From laboratory to factory floor

Despite these breakthroughs, practical quantum computing at an industrial scale remains years away. But that’s not stopping industry from courting promising collaborations.

Boehringer Ingelheim has been partnering with Google since 2021. In 2022, the two developed quantum algorithms to analyze the chemistry of cytochrome P450, a key enzyme family important for drug metabolism, making such electronic-structure calculations 234 to 278 times faster than previous methods.

In June of 2025, quantum computing company IonQ, in partnership with AstraZeneca, announced it had achieved quantum advantage using a hybrid quantum-classical computing model in drug discovery applications.

In August of last year, IBM and RIKEN, Japan’s National Research and Development Agency, used quantum-classical hybrid computing to simulate the energy states of a complex molecule beyond classical capabilities alone. The setup used 77 qubits, more than any real-world quantum chemistry problem has used before.

Recent investment in quantum computing reflects high expectations. Private investment has surged dramatically, with quantum computing startups raising more than $1.25 billion in Q1 2025—a 128% increase compared to the $550 million raised during the same period in 2024. Through the first three quarters of 2025, the sector secured $3.77 billion in total equity funding, establishing quantum computing as one of the fastest-growing areas in deep tech.

Government spending has accelerated even faster. By April 2025, global public funding had topped $10 billion, with Japan placing a $7.4 billion bet on the technology, accounting for 75% of all public investment in quantum that year. Governments in the U.K., Germany, the U.S. and South Korea have all been investing heavily.

Consulting firm McKinsey & Company said that 72% of the tech executives, investors and academics it spoke with believe a fully fault-tolerant quantum computer could be available as soon as 2035.

As one analyst at Constellation Research observed: "Boardrooms are going to start asking about your quantum computing plans in 3, 2, 1."

