Asian American Daily

The Man Set to Take Quantum Computing Out of the Laboratory
By Goldsea Staff | 08 Jan, 2026

Systems-level quantum engineer Jerry Chow is leading IBM's push to turn quantum computing into products reliable enough to release into the real world.

For decades quantum computing has lived in the twilight realm of laboratories, white papers, and carefully staged demonstrations. The machines work, but demand extreme conditions, heroic engineering, and armies of PhDs to coax out results that still struggle to beat classical computers at most real-world tasks. 

What quantum computing has lacked isn't brilliance, ambition, or capital, but focus on turning fragile experiments into machines that can be deployed, scaled, and trusted in the real world.

Jerry Chow is a rarity in the field: rather than chasing theoretical milestones, he has spent years working at the unforgiving boundary between physics and engineering, where ideas either become hardware or quietly fail.  He is now emerging as one of the clearest representatives of quantum computing's transition from scientific curiosity to industrial technology, and he is on the verge of enabling that transition from the laboratory to the real world.

At its core quantum computing promises a radically different way of processing information.  Classical computers use bits that are either 0 or 1. Quantum computers use qubits that can exist in superpositions of states, entangle with one another, and exploit interference in ways that allow certain calculations to scale exponentially faster.  In theory, this enables breakthroughs in cryptography, materials science, chemistry, optimization, and machine learning.
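The difference between bits and qubits can be sketched in a few lines of ordinary code. The snippet below is a purely illustrative simulation (it has nothing to do with IBM's actual software stack): a qubit's state is tracked as two amplitudes, and a Hadamard gate places it in an equal superposition of 0 and 1.

```python
import math

# A qubit's state is a pair of complex amplitudes for |0> and |1>.
# A classical bit would be exactly one of these, with certainty.
amp0, amp1 = 1.0, 0.0  # start in the definite state |0>

# Apply a Hadamard gate: it rotates |0> into an equal superposition.
h = 1 / math.sqrt(2)
amp0, amp1 = h * (amp0 + amp1), h * (amp0 - amp1)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
print(p0, p1)  # 0.5 0.5 -- equal chance of reading 0 or 1
```

Simulating even a few dozen qubits this way becomes intractable on classical hardware, which is exactly why real quantum processors are interesting.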

In practice, however, qubits are maddeningly fragile. They decohere when exposed to heat, vibration, electromagnetic noise, or even stray cosmic rays. They require cryogenic temperatures colder than outer space, exquisite control electronics, and layers of error correction just to stay coherent long enough to compute.  Building a useful quantum computer is less like designing a chip and more like constructing a cathedral that must remain perfectly still during an earthquake.

Jerry Chow’s work has been focused on making that cathedral stable.

Chow is best known for his leadership in superconducting quantum systems, one of the most promising approaches to building scalable quantum computers. Superconducting qubits are fabricated using techniques similar to those used in semiconductor manufacturing, offering a path toward mass production. But translating that promise into reliable systems has required deep advances in materials, device design, control software, calibration, and error mitigation.

Chow’s defining contribution has been his insistence on systems thinking. Quantum computing is often discussed in terms of qubit counts, coherence times, or error rates, but Chow has argued that no single metric matters in isolation.  A quantum computer is a tightly coupled stack: physics at the bottom, electronics and firmware in the middle, compilers and algorithms at the top.  Weakness in any layer can erase progress in the others.

This perspective has shaped much of the field’s most practical progress. Instead of celebrating record-breaking but brittle qubits, Chow’s work has prioritized repeatability, manufacturability, and integration. These are the qualities that allow quantum devices to leave the lab bench and enter data centers, where uptime, reproducibility, and scalability matter as much as raw performance.

A major obstacle to practical quantum computing is error.  Quantum error correction is theoretically well understood, but implementing it at scale requires large numbers of physical qubits to create a single logical qubit robust enough to run useful algorithms. Chow has pushed error mitigation and error correction techniques from theory toward implementation, focusing on architectures that can realistically support them.

Equally important has been his work on calibration and automation. Early quantum systems required constant manual tuning by experts, making them more like scientific instruments than computers.  Chow recognized early that quantum machines would never scale if each system required skilled human attention.  He helped pioneer automated calibration routines and control systems that allow quantum devices to maintain performance over time without continuous human intervention.
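The shape of such an automated calibration routine can be sketched schematically. Everything below is a stand-in, not any real device API: a measurement function with a hidden optimum plays the role of the hardware, and a simple hill-climbing loop plays the role of the tune-up that once required a physicist at the bench.

```python
def measured_fidelity(drive_amplitude, optimum=0.73):
    # Stand-in for a hardware measurement: gate fidelity peaks at some
    # unknown optimal control setting and falls off around it.
    return 1.0 - (drive_amplitude - optimum) ** 2

def calibrate(param, step=0.1, tolerance=1e-4):
    # Hill-climbing: nudge the parameter toward higher measured fidelity,
    # shrinking the step whenever no neighbor improves on the current point.
    while step > tolerance:
        for candidate in (param + step, param - step):
            if measured_fidelity(candidate) > measured_fidelity(param):
                param = candidate
                break
        else:
            step /= 2
    return param

tuned = calibrate(0.0)
print(tuned)  # converges near the hidden optimum, 0.73, with no human input
```

Production systems run many such loops continuously across hundreds of parameters, which is why automation, not any single measurement, is the scaling bottleneck.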

This may sound mundane compared to breakthroughs in quantum supremacy or exotic algorithms, but it is precisely this kind of work that turns promising technologies into products.

Background in Physics and Engineering

Jerry Chow was born in the United States, though he hasn't disclosed details about his birthplace, parents, or early childhood.  

Chow completed his undergraduate education at Harvard University, where he earned a Bachelor of Arts in physics and a Master of Science in applied mathematics, graduating magna cum laude in 2005. His training at Harvard emphasized both theoretical rigor and quantitative problem-solving, providing a strong foundation for experimental and systems-level physics.

He went on to pursue doctoral studies at Yale, earning a PhD in physics in 2010. At Yale, Chow specialized in experimental physics, with research that directly informed his later work in superconducting and solid-state quantum systems.

After completing his doctorate, Chow joined IBM in 2010 as a Research Staff Member at IBM Research’s Thomas J. Watson Research Center. There, he became a central figure in IBM’s superconducting quantum computing program at a time when the field was transitioning from proof-of-concept experiments to early system prototypes.

Over the following decade, Chow played a key role in multiple milestones, including advances in superconducting qubit design, control electronics, automated calibration, and system-level integration. He contributed to IBM’s early multi-qubit processors and to the development of architectures capable of supporting error mitigation and, eventually, quantum error correction.

As IBM accelerated its push toward cloud-accessible quantum computers, Chow rose into senior leadership roles, helping guide the strategy that led to regularly upgraded quantum processors, public hardware roadmaps, and the deployment of quantum systems outside traditional laboratory environments. His work emphasized reliability, reproducibility, and scalability—qualities essential for turning quantum machines into usable computing platforms.

Today, Chow is widely regarded within the field as a systems-level leader whose influence lies not in hype or theory alone, but in the disciplined engineering that is required to move quantum computing from experimental physics into practical technology.

Colleagues describe Chow as unusually pragmatic for a quantum physicist.  He is known for asking uncomfortable questions about scalability, yield, and reliability long before these issues become fashionable.  Rather than being dazzled by elegant experiments, he gravitates toward designs that can survive contact with manufacturing lines and operational constraints.

On a personal level, Chow is often characterized as understated and methodical. He isn't a flamboyant evangelist for quantum computing, nor does he indulge in science fiction visions of instant disruption.  Instead, he tends to speak in measured terms about timelines, tradeoffs, and engineering bottlenecks.  This restraint has earned him credibility in both scientific and industrial circles.

Despite working at the frontier of a highly abstract field, Chow maintains a strong interest in real-world applications.  He has spoken about the importance of identifying near-term problems where quantum advantage might emerge gradually rather than all at once. Chemistry simulations, materials discovery, and hybrid classical-quantum workflows are areas where he sees realistic opportunities for progress.

Outside of work, Chow is known to value balance, an attitude that quietly informs his professional philosophy. Quantum computing, he often implies, is a marathon rather than a sprint. Breakthroughs will come not from heroic all-nighters, but from sustained, disciplined effort across many years. This long view sets him apart in an industry prone to boom-and-bust cycles of expectation.

Taking Quantum Out of the Lab

Jerry Chow’s real impact isn't a single breakthrough but a cumulative shift in how quantum computing is approached.  He represents a generation of leaders who understand that the success of quantum technology will depend less on dazzling demonstrations and more on boring virtues: reliability, standardization, and integration.

As quantum hardware matures, the center of gravity in the field is shifting. The questions are no longer simply whether qubits can be built, but whether quantum computers can be operated day after day, upgraded systematically, and programmed by people who are not quantum physicists. Chow’s work sits squarely at this transition point.

In many ways, his role mirrors that of early computing pioneers who transformed room-sized experiments into dependable machines. The first electronic computers were astonishing but impractical. Only when engineers focused on reliability, abstraction, and manufacturing did computing escape the laboratory and reshape society.

In 2026, Jerry Chow's work at IBM will likely result in the release of modular, data-center-oriented quantum systems, along with the software and control tooling that makes them usable without constant human babysitting.

First, IBM’s roadmap points to IBM Quantum Kookaburra—a modular processor aimed at storing and processing encoded information (a practical step toward error-corrected, fault-tolerant operation rather than one-off demos). 

Second, expect expanded IBM Quantum System Two–style modular deployments (more “system engineering” and packaging that looks like infrastructure, not a lab rig), plus runtime and Quantum/HPC integration tools that let developers run larger, more structured workloads through the IBM Quantum Platform. 

Third, a key enabler for real-world use is faster real-time error-correction decoding on conventional hardware (e.g., FPGAs)—the kind of behind-the-scenes productization that turns fragile qubits into repeatable computations.

(Image by ChatGPT)