For the first time, a machine that runs on the mind-boggling physics of quantum mechanics reportedly has solved a problem that would stump the world's top supercomputers - a breakthrough known as "quantum supremacy."
If validated, the report by Google's AI Quantum team and University of California at Santa Barbara physicist John Martinis constitutes a major leap for quantum computing, a technology that relies on the bizarre behavior of tiny particles to encode huge amounts of information. According to a paper published Wednesday in the journal Nature, Google's Sycamore processor performed in less than 3 1/2 minutes a calculation that would take the most powerful classical computer on the planet 10,000 years to complete.
The achievement has been compared to the Wright brothers' 12-second first flight at Kitty Hawk - an early, aspirational glimpse at a revolution to come. By providing exponentially greater calculation power than the machines we use today, quantum computers could one day transform the way we communicate ideas, conceal data and comprehend the universe.
The result is also a feather in the cap for both Google and the United States, because quantum technology is expected to confer huge economic and national security advantages on whoever masters it first.
The technology community has been abuzz about the breakthrough ever since a leaked version of the study was published on (and then removed from) a NASA website last month.
"For those of us who work on the theory," said Ashley Montanaro, an expert in quantum algorithms at the University of Bristol, "it's a point where it really seems that things that were only theoretical in the past are now becoming reality."
Writing in the magazine Quanta, Caltech theoretical physicist John Preskill called the result "a remarkable achievement in experimental physics and a testament to the brisk pace of progress in quantum computing hardware."
But the claim also has prompted skepticism from competitors. Researchers at IBM, which has been working on its own quantum machines, reported this week that a classical computer system would in fact take two and a half days to perform the calculation in Google's report - and would make fewer mistakes in the process. (That paper has not yet been published in a peer-reviewed journal.)
In a blog post, the IBM scientists also questioned the use of the James Bond-esque term "quantum supremacy," which seems to imply that classical computers are about to become obsolete.
Whoever turns out to be right, quantum supremacy is a largely symbolic achievement; the specific task assigned to the Google computer - checking outputs from a random number generator - has few practical applications.
In a statement Wednesday, Google Chief Executive Sundar Pichai called this a "hello world" milestone (the simple phrase is often the first program written by people learning to code), representing "a moment of possibility."
His words echo what Preskill wrote in his piece for Quanta. The Caltech physicist, who coined the term "quantum supremacy" in 2012, said he aimed to convey the notion that "this is a privileged time in the history of our planet," when the most arcane laws of physics might be harnessed for human ambitions.
Scientists have known for a century that the predictable laws of Newtonian physics - objects fall down; matter can be in only one place at one time - fall apart at the atomic and subatomic level.
In this quantum realm, electrons appear to leap from one energy state to another. Particles can exist in multiple states at the same time, a phenomenon known as "superposition." They can also stay connected across large distances, which Albert Einstein called "spooky" and modern physicists call "entanglement."
With quantum computing, scientists can put these weird, wild particles to work.
Classical computers encode information in "bits" - electrical or optical pulses, each representing either a 0 or a 1. Eight bits constitute a "byte," which can typically store one character - for example, the letter A or a dollar sign. The first eight-inch floppy disk held 242,944 bytes. Apple's new iPhone 11 comes with 64 billion bytes.
The Summit system at Oak Ridge National Lab, a classical supercomputer that takes up two tennis courts' worth of floor space and can perform 200 quadrillion calculations per second, boasts a whopping 250 petabytes of storage - in bytes, that number comes out to about 250,000,000,000,000,000.
But superposition means that a quantum bit, or qubit, isn't confined to being just 0 or 1 - it can hold a blend of both at once. Each qubit added to a system doubles the number of states it can represent simultaneously, a power that grows exponentially: Two qubits span four possible states; three span eight; four span 16.
By the time you get up to 53 qubits - the size of both Google's Sycamore processor and a similar machine being built at IBM - you're approaching the potential of supercomputers like Summit. By harnessing quantum physics, a small machine can hold vast amounts of information and perform multiple calculations at once.
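To make that doubling concrete, here is a minimal back-of-the-envelope sketch in Python (an illustration only, not code from Google's experiment): the number of basis states an n-qubit register can hold in superposition is 2 to the power n, and at 53 qubits that figure lands in the quadrillions - the same order as Summit's storage, counted in bytes.

```python
def qubit_states(n: int) -> int:
    """Number of basis states an n-qubit register can hold in superposition."""
    return 2 ** n

# Each added qubit doubles the count: 2, 3, 4 qubits -> 4, 8, 16 states.
for n in (2, 3, 4):
    print(f"{n} qubits -> {qubit_states(n)} states")

# Sycamore's 53 qubits correspond to roughly 9 quadrillion states,
# comparable in scale to Summit's ~250-petabyte storage measured in bytes.
print(f"53 qubits -> {qubit_states(53):,} states")
```

Running it shows 53 qubits yielding 9,007,199,254,740,992 states - about nine quadrillion, which is why a chip the size of a fingernail can, in principle, encode what a warehouse-scale classical machine stores.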
That is, if the quantum computer works. A faint noise or a glimmer of heat can alter a superposition, leading to errors. Measuring a particle, or disturbing it in any way, will cause the superposition to "decohere," or collapse. The qubit becomes an ordinary bit. Add more qubits to a system, and decoherence becomes even harder to control.
That's what stands between researchers such as those at Google and the quantum world they hope to attain. To build an effective quantum computer, scientists must figure out how to create and manipulate entangled qubits that last long enough to do something interesting with them.
Wednesday's announcement is the product of half a decade of collaboration between Google researchers and Martinis's team at UCSB. Their first task was to build their machine, a futuristic tower of coiled wires and gleaming silicon and steel.
The Sycamore processor itself is just a tiny silicon chip comprising 54 qubits laid out in a crosshatch pattern. This chip is bonded to a superconducting circuit board and then enclosed in a refrigerator so powerful that its temperature approaches absolute zero. This ensures that nothing can affect the qubits except the electronic signals sent by the scientists themselves.
Then they had to come up with a calculation complex enough to test their computer, and myriad tiny fixes for the errors that arose. One of the qubits in the processor failed to function, so the scientists had to cut it out of their experiments. The team developed an error-correcting process to ensure that the results coming out of each component were 99.99 percent accurate.
Pichai, the Google chief executive, described how every component of the processor had to be invented and built by the scientists themselves.
"If it didn't work - and often, it didn't - they had to redesign and build it again," he said. "The thing about building something that hasn't been proven yet is that there is no playbook."
In October 2018, severe wildfires in Southern California forced the experimenters to temporarily close down their Santa Barbara laboratory - right at a moment when their progress had started to stall, Pichai said. The forced break may have helped them; three months later, they achieved their breakthrough.
In a commentary for Nature, MIT physicist William Oliver wrote that the new results will help combat some of the criticisms of quantum computers: that they are too difficult to control and won't work on large scales. But even a computer like Google's is too "noisy," or error-prone, to be viable long-term.
Speaking Wednesday to reporters, Martinis said the team's next step would be to improve its error correction process. Then he hopes to scale up the processor to 1,000 qubits, creating a system capable of processing more parallel computations than there are atoms in the observable universe.
Addressing IBM's criticism of the quantum supremacy claim, Martinis said he anticipated that other researchers would find more efficient ways to simulate the quantum machine's calculation with classical computers. But 2 1/2 days is still a lot longer than three minutes - and the process demanded the entire capacity of a supercomputer plus an additional storage mechanism.
But no matter how good algorithms for classical computers get, he said, quantum computers are improving exponentially faster.
"We are in the quantum supremacy regime," Martinis said. The Nature paper "is a very strong statement towards that, and it's going to get stronger and stronger in the future."
Computer scientists refer to systems like Sycamore as "Noisy Intermediate-Scale Quantum technology," or NISQ. This term (yet another Preskill invention) describes machines that can perform some calculations but remain too small and error-prone to be fully functional.
The true revolution will happen slowly. Montanaro guessed it will be 10 to 15 years before a resilient, large-scale quantum computer comes online.
But in the meantime, scientists can start to design more modest algorithms for NISQ machines, using them to model solar panels, superconductors, and other systems where quantum effects play an important role.
Still, it's the grandest possibilities presented by quantum computing that entice companies and governments to keep plugging away at it.
Several technology companies are competing to create quantum machines; IBM has even made its prototype available online for anyone to use. Last year, President Donald Trump signed into law the National Quantum Initiative Act, which establishes research centers to focus on quantum information science. Meanwhile, China has spent billions on quantum technology development.
The most obvious potential applications of this research are in the realm of national security. Entangled particles could one day be used for "quantum communication" - a means of sending super-secure messages that doesn't rely on cables or wireless signals. The tremendous processing power of quantum computers might be used to break previously unbreakable codes.
Then again, some experts worry this capacity will be used to steal people's passwords - potentially imperiling online systems from email to banking.
But spying and hacking are far from the technology's only uses. Biologists might use quantum computers to understand natural processes far too complex for classical machines to simulate. Pharmaceutical researchers could employ them to discover new drugs. Quantum computing promises to generate better artificial intelligence and more-effective nanotechnologies.
"In many ways quantum brings computing full circle," Pichai said, "giving us another way to speak the language of the universe and understand the world and humanity not just in 1s and 0s but in all of its states: beautiful, complex, and with limitless possibility."
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)