Top Academics: Here’s How We Facilitate the Next Big Leap in Quantum Computing – PCMag AU


In advance of the ribbon-cutting for its new IBM System One quantum computer, the first one on a college campus, Rensselaer Polytechnic Institute (RPI) last week hosted a quantum computing day which featured several prominent speakers who together provided a snapshot of where the field is now. I've been writing about quantum computing for a long time, and have noted some big improvements, but there are also a host of challenges that still need to be overcome.

Here are some highlights.

The first plenary speaker was Jay M. Gambetta, Vice President of Quantum Computing at IBM, who gave an overview of the history and progress of quantum computing, as well as the challenges and opportunities ahead. He explained that quantum computing is based on exploiting the quantum mechanical properties of qubits, such as superposition and entanglement, to perform computations that are impossible or intractable for classical computers. He talked about watching the development of superconducting qubits, as they moved from single qubit systems in 2007, to 3-qubit systems in 2011, and now with IBM's Eagle chip, which has 127 qubits and is the heart of the Quantum System One.
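The properties Gambetta named can be made concrete in a few lines of linear algebra. The sketch below is my illustration, not something shown at the event, and uses plain NumPy rather than any IBM SDK: a Hadamard gate puts one qubit into superposition, and a CNOT then entangles two qubits into a Bell state.

```python
import numpy as np

# Single-qubit computational basis states |0> and |1>.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero            # amplitudes ~0.707 on both basis states

# CNOT applied to (H|0>) tensor |0> yields the entangled Bell state
# (|00> + |11>)/sqrt(2): measuring either qubit fixes the other.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, zero)
print(np.round(bell, 3))         # amplitude 0.707 on |00> and on |11>, 0 elsewhere
```

The point of the Bell state is that neither qubit has a definite value on its own, which is exactly the resource classical bits lack.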

He then asked how we could make quantum computing useful. His answer: We need to keep building larger and larger systems and we need to improve error correction.

"There are very strong reasons to believe there are problems that are going to be easy for a quantum computer but hard for a classical computer, and this is why we're all excited," Gambetta said. He discussed the development of quantum circuits and that while the number of qubits was important, equally important was the "depth," detailing how many operations you can do and the accuracy of the results. Key to solving this are larger and larger systems, and also error mitigation, a topic that would be discussed in much greater detail later in the day.
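"Depth" has a precise meaning here: the number of layers of gates that must execute in sequence, where gates acting on disjoint qubits can share a layer. A minimal sketch (my own hypothetical helper, not IBM's API) makes the bookkeeping concrete:

```python
# Compute circuit depth by greedy layering: each gate lands one layer
# after the latest layer any of its qubits was last busy in.
def circuit_depth(gates):
    """gates: list of tuples of qubit indices, in program order."""
    layer_of_qubit = {}          # last busy layer for each qubit
    depth = 0
    for qubits in gates:
        layer = 1 + max(layer_of_qubit.get(q, 0) for q in qubits)
        for q in qubits:
            layer_of_qubit[q] = layer
        depth = max(depth, layer)
    return depth

# Four gates, but depth 2: (0,) and (1,2) share layer 1,
# while (0,1) and (2,3) share layer 2.
print(circuit_depth([(0,), (1, 2), (0, 1), (2, 3)]))  # 2
```

Because errors accumulate with every layer, a deep circuit on noisy hardware can fail even when the qubit count is generous, which is why Gambetta weighted depth equally with width.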

To get to "quantum utility" (which he said would be reached when a quantum computer is better than a brute-force simulation of a quantum computer on a classical machine) you would need larger systems with at least 1,000 gates, along with improved accuracy and depth, and new efficient algorithms.
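Beating brute-force simulation is a meaningful bar because that simulation cost grows exponentially: an n-qubit state vector holds 2^n complex amplitudes. A back-of-the-envelope sketch (my illustrative numbers, not from the talk):

```python
# Memory needed to hold a full n-qubit state vector: 2**n complex
# amplitudes at 16 bytes each. Real simulators use tricks (tensor
# networks, sparsity) but hit the same exponential wall.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:.3g} GiB")
```

Around 30 qubits the vector already needs 16 GiB; somewhere near 50 qubits it exceeds any supercomputer's memory, which is the regime where a quantum machine can no longer be checked by brute force.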

He talked about quantum algorithmic discovery, which means finding new and efficient ways to map problems to quantum circuits. He cited, for instance, a new variation on Shor's algorithm, which allows factoring numbers far faster than is possible on a classical computer. "The future of running error-mitigated circuits and mixing classical and quantum circuits sets us up to explore this space," he said.

In a panel discussion that followed, James Misewich from Brookhaven National Laboratory discussed his interest in using quantum computing to understand quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. QCD is a hard problem whose demands scale with both the number of qubits and the circuit depth, and he is looking at entanglement between jets coming out of particle collisions as a possible avenue for exploring quantum advantage.

Jian Shi and Ravishankar Sundararaman from RPI's Materials Science and Engineering faculty talked about computational materials science, and applying quantum computing to discover new materials and properties. Shi noted there was a huge community now doing quantum chemistry, but there is a gap between that and quantum computing. He stressed that a partnership between the two groups will be important, so each learns the language of the other and can approach the problems from a different perspective.

One of the most interesting talks was given by Steve M. Girvin, Eugene Higgins Professor of Physics at Yale University, who discussed the challenges of creating an error-corrected quantum computer.

Girvin described how the first quantum revolution was the development of things like the transistor, the laser, and the atomic clock, while the second quantum revolution is based on a new understanding of how quantum mechanics works. He said he usually tells his students that they do the things Einstein said were impossible, just to make sure they have a quantum computer and not a classical one.

He thought there was a bit too much hype around quantum computing today: quantum computing is going to be revolutionary and do absolutely amazing things, but its time hasn't come yet. We still have massive problems to solve.

He noted that quantum systems are extremely sensitive to external perturbations and noise, which is great for building sensors but bad for building computers. Therefore, error correction is essential.

Among the issues Girvin discussed was making measurements to detect errors, but he said we also need computations to decide whether something truly is an error, where it is located, and what kind of error it is, and then to decide what signals to send to correct it. Beyond that, there is the challenge of putting these pieces together into a system that reduces overall errors, perhaps borrowing from the flow-control techniques used in fields like telephony.

In addition to quantum error detection, Girvin said there are "grand challenges all up and down the stack," from materials to measurement to machine models and algorithms. We need to know how to make each layer of the stack more efficient, using less energy and fewer qubits, and get to higher performance so people can use these to solve science problems or economically interesting problems.

Then there are the algorithms. Girvin noted that there were algorithms long before there were computers, but it took time to settle on the best ones for classical computing. For quantum computing, this is just the beginning: over time, we need people to figure out how to build up algorithms and heuristics, to discover why quantum computers are so hard to program, and to build clever tools to solve those problems.

Another challenge he described was routing quantum information. He noted that two quantum computers that can only communicate classically are exponentially less powerful than two quantum computers that can exchange quantum information and entangle with each other.

He talked about fault tolerance, which is the ability to correct errors even when your error correction circuit makes errors. He believes the fact that this is possible in a quantum system, at least in principle, is even more amazing than the fact that a perfect quantum computer could do interesting quantum calculations.

Girvin described the difficulty of correcting errors: you have an unknown quantum state, and you're not allowed to know what it is, because it comes from the middle of a quantum computation. (If you know what it is, you've destroyed the superposition, and if you measure it to check for an error, it will randomly change due to state collapse.) Your job: if it develops an error, fix it anyway.

"That's pretty hard, but miraculously it can be done in principle, and it's even been done in practice," he said. We're just entering the era of being able to do it. The basic idea is to build in redundancy, such as constructing a logical qubit out of multiple physical qubits, perhaps nine. Then you have two giant entangled states corresponding to a logical zero and a logical one. Note that the one and zero don't live in any single physical qubit; each is spread across an entangled state of multiple qubits.
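The nine-qubit scheme Girvin alluded to is Shor's code; a simpler cousin, the three-qubit repetition code, shows the same idea in a short sketch. This is my simplification (it handles only bit-flip errors, and stands in for what real hardware does with parity measurements):

```python
import numpy as np

# Logical |0> = |000>, logical |1> = |111>. An arbitrary logical state
# a|000> + b|111> lives in the 8-dimensional space of three qubits.
a, b = 0.6, 0.8
state = np.zeros(8, dtype=complex)
state[0b000] = a
state[0b111] = b

def flip(psi, qubit):
    """Apply a bit-flip (X) error to one qubit (qubit 0 = leftmost)."""
    out = np.zeros_like(psi)
    for idx, amp in enumerate(psi):
        out[idx ^ (1 << (2 - qubit))] = amp
    return out

def syndrome(psi):
    """Parity checks Z0Z1 and Z1Z2: they locate a flipped qubit without
    revealing (and destroying) the logical amplitudes a and b."""
    idx = int(np.argmax(np.abs(psi) > 0))   # any basis state with support
    bits = [(idx >> k) & 1 for k in (2, 1, 0)]
    return bits[0] ^ bits[1], bits[1] ^ bits[2]

corrupted = flip(state, 1)          # environment flips the middle qubit
s = syndrome(corrupted)             # syndrome (1, 1) pinpoints qubit 1
recovered = flip(corrupted, {(1, 0): 0, (1, 1): 1, (0, 1): 2}[s])
assert np.allclose(recovered, state)  # amplitudes a and b survive intact
```

The crucial property is that both basis components of the corrupted state give the same syndrome, so the parity measurement learns where the error is without learning whether the state leans toward logical zero or one.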

In that case, Girvin says, if the environment reaches in and measures one of those qubits, it doesn't actually learn the logical state. There's an error, but the environment doesn't know which state you're in, so there's still a chance you haven't totally collapsed anything and lost the information.

He then discussed measuring the probability of errors and then seeing whether it exceeds some threshold value, with some complex math. Then correcting the errors, hopefully quickly, something that should improve with new error correction methods and better, more precise physical qubits.

All this is still theoretical, which is why Girvin sees fault tolerance as a journey, with improvements made continuously. (This contrasted with Gambetta, who said systems are either fault tolerant or they aren't.) Overall, Girvin said, "We still have a long way to go, but we're moving in the right direction."

Later in the morning, Austin Minnich, Professor of Mechanical Engineering and Applied Physics at Caltech, described "mid-circuit measurement" and the need for hybrid circuits as a way of finding, and thus mitigating, errors.

In a discussion that followed, Kerstin Kleese van Dam, Director of the Computational Science Initiative at Brookhaven National Laboratory, explained that her team was looking for answers to problems, whether solved on traditional or quantum machines. She said there were problems they can't solve accurately on a traditional computer, but there remains the question of whether the accuracy will matter. There are areas, such as machine learning, where quantum computers can do things accurately. She predicts that quantum advantage will come when we have systems that are large enough. But she also wondered about energy consumption, noting that a lot of power is going into today's AI models, and if quantum can be more efficient.

Shekhar Garde, Dean of the School of Engineering, RPI, who moderated this part of the discussion, compared the status of quantum computing today to where traditional computing was in the late 70s or early 80s. He asked what the next 10 years would bring.

Kleese van Dam said that within 10 years, we would see hybrid systems that combine quantum and classical computing, but she also hoped we would see libraries transferred from high-performance computing to quantum systems, so a programmer could use them without having to understand how the gates work. Aparna Gupta, Professor and Associate Dean of RPI's Lally School of Management, would bet on the hybrid approach offering easier access and cost-effectiveness, as well as "taking away the intrigue and the spooky aspects of quantum, so it is becoming real for all of us."

Antonio Corcoles, Principal Research Scientist, IBM Quantum, said he hoped users who don't know quantum will be able to use the system because the complexity will become more transparent, but that can take a long time. In between, they can develop quantum error correction in a way that is not as disruptive as current methods. Minnich talked about "blind quantum computing" where many smaller machines might be linked together.

One of the most interesting talks came from Lin Lin, Professor of Mathematics at the University of California, Berkeley, who discussed the theoretical aspects and challenges of achieving quantum advantage for scientific computation. He defined quantum advantage as the ability to solve problems that are quantumly easy but classically hard, and proposed a hierarchy of four levels of problems.

Lin said that for the first two levels, a lot of people think quantum advantage will be achieved, as the methods are generally understood. But on the next two levels, there needs to be a lot of work on the algorithms to see if it will work. That's why this is an exciting time for mathematicians as well as physicists, chemists, and computer scientists.

This talk was followed by a panel during which Lin said that he is interested in solving quantum many-body problems, as well as applying quantum computing to other areas of mathematics, such as numerical analysis and linear algebra.

Like Garde above, Lin compared where quantum is today to the past, going even further to say it's where classical computing was 60 or 70 years ago, when error correction was still a central concern. Quantum computing will need to be a very interdisciplinary field: it will require people who are very good at building the machines, but the machines will always produce errors, so correcting them will require both mathematical and engineering approaches.

Ryan Sweke from IBM Research noted that one of the things that has allowed classical computing to develop to the point it is at is the various levels of abstraction, so if you want to work on developing algorithms, you don't have to understand how the compiler works. If you want to understand how the compiler works, you don't have to understand how the hardware works.

The interesting thing in the quantum regime, as seen in error mitigation for example, is that people who come out of the top level of abstraction have to interact with people who are developing the devices. This is an exciting aspect of the time we're in.

Di Fang, Assistant Professor of Mathematics, Duke University, said now was a "golden time for people who work on proving algorithms." She talked about the varying levels of complexity, and the need to see where new algorithms can solve theoretical problems, then look at the hardware and solve practical problems.

Brian McDermott, Principal R&D Engineer at the Naval Nuclear Laboratory, said he was looking at this in reverse, seeing what the problems are and then working backward toward the quantum hardware and software. His job involved matching applications of new and emerging computing architectures to the types of engineering problems that are important to the lab's mission for new nuclear propulsion.

The panelists discussed where quantum algorithms could have the most impact. McDermott talked about things like finite elements and computational fluid dynamics, going up to materials science. As a nuclear engineer, he was first attracted to the field by the quantum properties of the nucleus itself, moving from predicting behaviors in astrophysics, such as the synthesis of nuclei in a supernova, to engineering applications in nuclear reactors and fusion. Lin discussed the possibilities for studying molecular dynamics.

Olivia Lanes, Global Lead and Manager for IBM Quantum Learning and Education gave the final talk of the day, where she discussed the need for workforce development in the quantum field.

Already the US is projected to face a shortfall of nearly two million STEM workers by next year. She quoted Carl Sagan, who said "We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology," and agreed with him that this is a recipe for disaster.

She noted that not only do very few people understand quantum computing, very few actually understand how classical computers work. She cited a McKinsey study which found that there are three open jobs in quantum for every person qualified to fill them, a gap that will probably only widen through 2026.

She focused on upskilling, saying it was unrealistic to expect that we'll make everyone into experts in quantum computing. But there are many other jobs in the quantum ecosystem that will need filling, and she urged students to focus on the areas that particularly interest them.

In general, she recommended getting a college degree (not surprising, since she was talking at a college), considering graduate school, or finding some other way to get relevant experience in the field, and building up rare skills. "Find the one thing that you can do better than anybody else and market that thing. You can make that thing applicable to any career that you really want for the most part," she said. "Stop letting the physicists hog quantum; they've had a monopoly here for too long and that needs to change."

Similar concepts were voiced in a panel that followed. Anastasia Marchenkova, Quantum Researcher, Bleximo Corporation, said that there was lots of pop science, and lots of research, but not much in the middle. She said we need to teach people enough so they can use quantum computing, even if they aren't computer scientists.

Richard Plotka, Director of Information Technology and Web Science, RPI, said it was important to create middleware tools that can be applied to quantum so that the existing workforce can take advantage of these computers. He also said it was important to prepare students for a career in the future, with foundational knowledge, so they have the ability to adapt because quantum in five or ten years won't look like it does today.

All told, it was a fascinating day of speakers. I was intrigued by software developers explaining the challenge in writing languages, compilers, and libraries for quantum. One explained that you can't use traditional structures such as "if-then" because you won't know "if." Parts of it were beyond my understanding, and I remain skeptical about how quickly quantum will become practical and how broad the applications may be.

Still, it's an important and interesting technology that is sure to get even more attention in the coming years, as researchers meet some of the challenges. It's good to see students getting a chance to try out the technology and discover what they can do with it.
