Category Archives: Quantum Physics

Strontium Unlocks the Quantum Secrets of Superconductivity – SciTechDaily

Researchers observed the dynamic phases of BCS superconductor interactions in a Cavity QED by measuring the light leakage from the cavity. Credit: Steven Burrows/Rey and Thompson Groups

Superconductivity makes physics seem like magic. At cold temperatures, superconducting materials allow electricity to flow indefinitely while expelling outside magnetic fields, causing them to levitate above magnets. MRIs, maglev trains, and high-energy particle accelerators use superconductivity, which also plays a crucial role in quantum computing, quantum sensors, and quantum measurement science. Someday, superconducting electric grids might deliver power with unprecedented efficiency.

Yet scientists lack full control over conventional superconductors. These solid materials often comprise multiple kinds of atoms in complicated structures that are difficult to manipulate in the lab. It's even harder to study what happens when there's a sudden change, such as a spike in temperature or pressure, that throws the superconductor out of equilibrium.

Quantum theory has predicted intriguing behaviors when a superconductor is driven out of equilibrium. However, it has been challenging to perturb these materials in the lab without disrupting their delicate superconducting properties, leaving these predictions untested.

However, scientists can obtain surprisingly deep insights into superconductivity by studying it with fully controllable arrays of atoms in a gas. That is the approach of a research collaboration at JILA, a joint institute of the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

In their latest work, JILA researchers caused a gas of strontium atoms to act like a superconductor. Even though the strontium atoms themselves are not superconducting, they follow the same rules of quantum physics. The researchers could make atoms in a gas interact in a way that preserves the sorts of interactions responsible for superconductivity while suppressing other competing, complex interactions. By throwing the atoms out of equilibrium, the researchers saw changes in atomic interactions that would affect the properties of actual superconductors.

With their strontium gas acting as a quantum simulator, the researchers were able to observe a behavior of superconductors that has been predicted to exist for years. This study, published in Nature, offers new insight into how superconductors work when appropriately driven out of equilibrium, and sheds light on how to make superconductors more robust, and how to use their unique properties in other quantum technologies.

In a normal material, electrons move in an incoherent way, bumping into one another constantly; normally, electrons repel each other. As they move, they collide, losing energy and generating heat; that's why electric currents dissipate when electrons flow in a metallic wire. In a superconductor, however, electrons join up into weakly bonded pairs, called Cooper pairs. When these pairs form, they all tend to move coherently, and that is why they flow through the material with no resistance.

"The physics is simple in some sense," explains theoretical physicist Ana Maria Rey, a NIST and JILA Fellow. Cooper pairs exist in a low-energy state because vibrations in the material's crystalline structure pull the electrons together. When formed, Cooper pairs prefer to act coherently and lock together. "The Cooper pairs are kind of like arrows that want to line up in the same direction. To unlock them or make one of the arrows point along a different direction, you need to add extra energy to break the Cooper pairs," Rey explains. The energy that you need to add to unlock them is called an energy gap. Stronger interactions between the atoms create a larger energy gap because the attraction that keeps the Cooper pairs locked is so strong. Overcoming that energy gap takes a lot of energy away from the Cooper pairs. So this energy gap acts as a buffer, letting the Cooper pairs remain happily locked in phase.
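(For readers who want the textbook quantity behind this picture, the standard BCS relation, which is general background rather than a formula from this study, ties the minimum excitation energy directly to the gap \(\Delta\): breaking a Cooper pair creates two quasiparticles, so it costs at least \(2\Delta\).)

\[
E_k = \sqrt{\xi_k^2 + |\Delta|^2} \;\ge\; |\Delta|, \qquad E_{\text{pair breaking}} \ge 2|\Delta|,
\]

where \(\xi_k\) is the electron energy measured from the Fermi level and \(\Delta\) is the superconducting gap.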

This all works when the system is in equilibrium. But when you introduce a sudden, rapid change, the superconductor falls out of equilibrium, or becomes "quenched." For decades, scientists have wanted to know what happens to superconductivity following a quench that is abrupt but not so strong as to completely break the Cooper pairs, said JILA physicist James Thompson.

"In other words, how robust are these things?" Thompson said.

Theorists predicted three different possibilities, or phases, that could occur when the superconductor is quenched. "Think of it like a big group of square dancers," Thompson says. At first everyone is in sync, keeping to the beat of the music. Then some people get a little tired, or others start moving a little too fast; they crash into each other, and it turns into a mosh pit. That's Phase I, when superconductivity collapses. In Phase II, the dancers get off the beat but manage to stay in sync. Superconductivity survives the quench. Scientists have been able to observe and study these two phases.

But they have never seen a long-predicted third phase, in which the superconductivity of the system oscillates over time. In this phase, our dancers will move a bit faster or a bit slower at times, but no one crashes. That means sometimes it's a weaker superconductor, and sometimes it's a stronger superconductor. Until now, no one had been able to observe that third phase.
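(In the theoretical literature on quenched BCS superconductors, these three phases are commonly summarized by the long-time behavior of the order parameter, i.e., the gap \(\Delta(t)\). The compact form below is a standard summary from that literature, not a formula taken from the paper itself.)

\[
\Delta(t) \xrightarrow[t \to \infty]{}
\begin{cases}
0 & \text{Phase I: superconductivity collapses,}\\
\Delta_\infty > 0 \ \text{(constant)} & \text{Phase II: superconductivity survives,}\\
\text{persistent oscillations} & \text{Phase III: the gap oscillates in time.}
\end{cases}
\]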

Working with Rey's theory group, Thompson's team at JILA laser-cooled and loaded strontium atoms into an optical cavity, a space with highly reflective mirrors at either end. Laser light bounces back and forth millions of times before some light leaks out at one end.

The light in the cavity mediated interactions between the atoms, causing them to align into a superposition state (meaning they are in both the excited and ground state at the same time) and to lock in phase, like Cooper pairs do, Rey explains.

Using lasers, scientists can quench the system, and by measuring the light that leaks out, they learn how the energy gap has changed over time. With this quantum superconductor simulation, they were able to observe all three dynamic phases for the first time.

They found that in the third phase the energy gap can keep superconductivity going even when the system is out of equilibrium. Using quantum simulators like this could help scientists engineer unconventional or more robust superconductors, and better understand the physics of superconductors in general.

It's also a counterintuitive way for scientists who work in measurement science to see atomic interactions, like the ones that cause the energy gap, as a benefit, not a curse.

"In measurement science, interactions are usually bad. But here, when interactions are strong, they can help you. The gap protects the system; everything flows," Rey says. "At the heart of this idea, you could have something that oscillates forever."

Having something that oscillates forever is a dream for quantum technology, Thompson adds, because it would let sensors work better for longer. Much like the superconductors, groups of atoms, photons, and electrons in quantum sensors need to stay in sync, or coherent, to work, and we don't want them to turn into a quantum mosh pit, or dephase.

"I am stoked that one of the dynamical phases that we observe can be used to protect quantum optical coherence against dephasing. For instance, this may one day allow an optical atomic clock to tick for longer," Thompson said. "It represents a whole new way to increase the precision and sensitivity of quantum sensors, a topic that is at the frontier of quantum metrology, or measurement, science. We want to harness the many atoms and take advantage of the interactions to build a better sensor."

Reference: "Observing dynamical phases of BCS superconductors in a cavity QED simulator" by Dylan J. Young, Anjun Chu, Eric Yilun Song, Diego Barberena, David Wellnitz, Zhijing Niu, Vera M. Schäfer, Robert J. Lewis-Swan, Ana Maria Rey and James K. Thompson, 24 January 2024, Nature. DOI: 10.1038/s41586-023-06911-x

Physicists Just Measured Gravity on the Smallest Scale Ever By Using a Promising New Technique – Inverse

Just over a week ago, European physicists announced they had measured the strength of gravity on the smallest scale ever.

In a clever tabletop experiment, researchers at Leiden University in the Netherlands, the University of Southampton in the UK, and the Institute for Photonics and Nanotechnologies in Italy measured a force of around 30 attonewtons on a particle with just under half a milligram of mass. An attonewton is a billionth of a billionth of a newton, the standard unit of force.
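To get a feel for the scale, Newton's law of gravitation reproduces a force of this order for a half-milligram particle. The source mass and separation in the sketch below are illustrative assumptions, not the actual geometry of the experiment.

```python
# Order-of-magnitude check using Newton's law F = G * m1 * m2 / r**2.
# The source mass and separation are illustrative assumptions, not the
# experiment's actual geometry.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
m_particle = 0.43e-6   # just under half a milligram, in kilograms
m_source = 1.0         # hypothetical 1 kg source mass
r = 1.0                # hypothetical 1 m separation

force = G * m_particle * m_source / r**2
print(f"{force:.2e} N  =  {force / 1e-18:.0f} attonewtons")
# ~2.9e-17 N, i.e. roughly 29 attonewtons -- the same order as the measured signal
```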

The researchers say the work could unlock more secrets about the universe's very fabric and may be an important step toward the next big revolution in physics.

But why is that? It's not just the result: it's the method and what it says about a path forward for a branch of science critics say may be trapped in a loop of rising costs and diminishing returns.

From a physicist's point of view, gravity is an extremely weak force. This might seem like an odd thing to say. It doesn't feel weak when you're trying to get out of bed in the morning!

Still, compared with the other forces that we know about, such as the electromagnetic force (responsible for binding atoms together and for generating light) and the strong nuclear force (which binds the cores of atoms), gravity exerts a relatively weak attraction between objects.

And on smaller scales, the effects of gravity get weaker and weaker.

Its easy to see the effects of gravity on objects the size of a star or planet, but it is much harder to detect gravitational effects for small, light objects.

Despite the difficulty, physicists really want to test gravity at small scales. This is because it could help resolve a century-old mystery in current physics.

Physics is dominated by two extremely successful theories.

The first is general relativity, which describes gravity and spacetime at large scales. The second is quantum mechanics, which is a theory of particles and fields (the basic building blocks of matter) at small scales.

These two theories are in some ways contradictory, and physicists don't understand what happens in situations where both should apply. One goal of modern physics is to combine general relativity and quantum mechanics into a theory of quantum gravity.

One example of a situation where quantum gravity is needed is to fully understand black holes. These are predicted by general relativity and we have observed huge ones in space but tiny black holes may also arise at the quantum scale.

At present, however, we don't know how to bring general relativity and quantum mechanics together to give an account of how gravity, and thus black holes, work in the quantum realm.

A number of approaches to a potential theory of quantum gravity have been developed, including string theory, loop quantum gravity, and causal set theory.

However, these approaches are entirely theoretical. We currently don't have any way to test them via experiments.

To empirically test these theories, we'd need a way to measure gravity at very small scales where quantum effects dominate.

Until recently, performing such tests was out of reach. It seemed we would need very large pieces of equipment, even bigger than the world's largest particle accelerator, the Large Hadron Collider, which sends high-energy particles zooming around a 27-kilometer loop before smashing them together.

This is why the recent small-scale measurement of gravity is so important.

The experiment conducted jointly between the Netherlands and the UK is a tabletop experiment. It didn't require massive machinery.

The experiment works by floating a particle in a magnetic field and then swinging a weight past it to see how it wiggles in response.

This is analogous to the way one planet wiggles when it swings past another.

Levitating the particle with magnets isolates it from many of the influences that make weak gravitational effects so hard to detect.

The beauty of tabletop experiments like this is they don't cost billions of dollars, which removes one of the main barriers to conducting small-scale gravity experiments, and potentially to making progress in physics. (The latest proposal for a bigger successor to the Large Hadron Collider would cost US$17 billion.)

Tabletop experiments are very promising, but there is still work to do.

The recent experiment comes close to the quantum domain but doesn't quite get there. The masses and forces involved will need to be even smaller to find out how gravity acts at this scale.

We also need to be prepared for the possibility that it may not be possible to push tabletop experiments this far.

There may yet be some technological limitation that prevents us from conducting experiments of gravity at quantum scales, pushing us back toward building bigger colliders.

Its also worth noting some of the theories of quantum gravity that might be tested using tabletop experiments are very radical.

Some theories, such as loop quantum gravity, suggest space and time may disappear at very small scales or high energies. If that's right, it may not be possible to carry out experiments at these scales.

After all, experiments as we know them are the kind of thing that happen at a particular place, across a particular interval of time. If theories like this are correct, we may need to rethink the very nature of experimentation so we can make sense of it in situations where space and time are absent.

On the other hand, the very fact we can perform straightforward experiments involving gravity at small scales may suggest that space and time are present after all.

Which will prove true? The best way to find out is to keep going with tabletop experiments, and to push them as far as they can go.

This article was originally published on The Conversation by Sam Baron at The University of Melbourne. Read the original article here.

Design rules and synthesis of quantum memory candidates – EurekAlert

Image: The double perovskite crystal structure of Cs2NaEuF6 synthesized in this research. Credit: The Grainger College of Engineering at the University of Illinois Urbana-Champaign

In the quest to develop quantum computers and networks, there are many components that are fundamentally different than those used today. Like a modern computer, each of these components has different constraints. However, it is currently unclear what materials can be used to construct those components for the transmission and storage of quantum information.

In new research published in the Journal of the American Chemical Society, University of Illinois Urbana-Champaign materials science and engineering professor Daniel Shoemaker and graduate student Zachary Riedel used density functional theory (DFT) calculations to identify possible europium (Eu) compounds to serve as a new quantum memory platform. They also synthesized one of the predicted compounds, a brand-new, air-stable material that is a strong candidate for use in quantum memory, a system for storing quantum states of photons or other entangled particles without destroying the information held by that particle.

"The problem that we are trying to tackle here is finding a material that can store that quantum information for a long time. One way to do this is to use ions of rare earth metals," says Shoemaker.

Found at the very bottom of the periodic table, rare earth elements, such as europium, have shown promise for use in quantum information devices due to their unique atomic structures. Specifically, rare earth ions have many electrons densely clustered close to the nucleus of the atom. The excitation of these electrons from the resting state can live for a long time: seconds or possibly even hours, an eternity in the world of computing. Such long-lived states are crucial to avoid the loss of quantum information and position rare earth ions as strong candidates for qubits, the fundamental units of quantum information.

"Normally in materials engineering, you can go to a database and find what known material should work for a particular application," Shoemaker explains. "For example, people have worked for over 200 years to find proper lightweight, high-strength materials for different vehicles. But in quantum information, we have only been working at this for a decade or two, so the population of materials is actually very small, and you quickly find yourself in unknown chemical territory."

Shoemaker and Riedel imposed a few rules in their search for possible new materials. First, they wanted to use the ionic configuration Eu3+ (as opposed to the other possible configuration, Eu2+) because it operates at the right optical wavelength; to be written optically, the materials should be transparent. Second, they wanted a material made of other elements that have only one stable isotope. Elements with more than one isotope yield a mixture of different nuclear masses that vibrate at slightly different frequencies, scrambling the information being stored. Third, they wanted a large separation between individual europium ions to limit unintended interactions. Without separation, the large clouds of europium electrons would act like a canopy of leaves in a forest, where the rustling of leaves from one tree gently interacts with leaves from another, rather than like well-spaced trees in a suburban neighborhood.
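As a rough illustration of how rules like these can be screened programmatically (this is not the authors' actual DFT workflow; the candidate list, the distance threshold, and the partial set of single-stable-isotope elements below are illustrative placeholders), a filter might look like this:

```python
# Toy screening sketch inspired by the three design rules described above.
# Not the authors' actual DFT workflow; candidates, the distance threshold,
# and the (partial) monoisotopic-element set are illustrative placeholders.

MONOISOTOPIC = {"F", "Na", "Al", "P", "Co", "Y", "Nb", "I", "Cs", "Au"}  # partial list

# (formula, non-Eu elements, Eu oxidation state, nearest Eu-Eu distance in angstroms)
candidates = [
    ("Cs2NaEuF6", {"Cs", "Na", "F"}, 3, 8.0),  # distance is a placeholder value
    ("EuCl3",     {"Cl"},            3, 4.0),  # Cl has two stable isotopes -> rejected
    ("EuO",       {"O"},             2, 3.6),  # Eu2+ configuration -> rejected
]

def passes_rules(elements, eu_oxidation, eu_distance, min_distance=6.0):
    """Apply simplified versions of the three rules from the article."""
    if eu_oxidation != 3:                 # Rule 1: Eu3+ only
        return False
    if not elements <= MONOISOTOPIC:      # Rule 2: other elements must be monoisotopic
        return False
    return eu_distance >= min_distance    # Rule 3: well-separated Eu ions

for formula, elements, oxidation, distance in candidates:
    verdict = "keep" if passes_rules(elements, oxidation, distance) else "reject"
    print(f"{formula}: {verdict}")
```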

With those rules in place, Riedel composed a DFT computational screening to predict which materials could form. Following this screening, Riedel was able to identify new Eu compound candidates, and further, he was able to synthesize the top suggestion from the list, the double perovskite halide Cs2NaEuF6. This new compound is air stable, which means it can be integrated with other components, a critical property in scalable quantum computing. DFT calculations also predicted several other possible compounds that have yet to be synthesized.

"We have shown that there are a lot of unknown materials left to be made that are good candidates for quantum information storage," Shoemaker says. "And we have shown that we can make them efficiently and predict which ones are going to be stable."

*

Daniel Shoemaker is also an affiliate of the Materials Research Laboratory (MRL) and the Illinois Quantum Information Science and Technology Center (IQUIST) at UIUC.

Zachary Riedel is currently a postdoctoral researcher at Los Alamos National Laboratory.

This research was supported by the U.S. Department of Energy, Office of Science, National Quantum Information Science Research Center Q-NEXT. The National Science Foundation through the University of Illinois Materials Research Science and Engineering Center supported the use of facilities and instrumentation.

Journal of the American Chemical Society: "Design Rules, Accurate Enthalpy Prediction, and Synthesis of Stoichiometric Eu3+ Quantum Memory Candidates," 12 January 2024.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

Quantum physics and the end of naturalism | Bruce L. Gordon – IAI

Naturalism, the idea that there are no gods, spirits, or transcendent meanings, is the leading theory of our time. However, in this instalment of our idealism series, in partnership with the Essentia Foundation, Bruce Gordon argues that quantum mechanics not only beckons the end of naturalism, but also points towards the existence of a transcendent mind.

Naturalism remains a popular philosophy in the academic world. Its articulation varies, so let's be clear what we mean. Theoretical physicist and philosopher Sean Carroll's definition will suffice: "Naturalism is a philosophy according to which there is only one world, the natural world, which exhibits unbroken patterns (the laws of nature), and which we can learn about through hypothesis testing and observation. In particular, there is no supernatural world: no gods, no spirits, no transcendent meanings." Advocates of naturalism tend to regard it as the inevitable accompaniment of a scientific mindset. It seems appropriate, therefore, to undermine it using the most fundamental of sciences: quantum physics.

Given its scientific pretensions, it's appropriate that the doctrine that the natural world is self-contained, self-explanatory, and exceptionless is at least falsifiable. All we need is one counterexample to the idea that nature is a closed system of causes and effects, or one clear example of nature's non-self-sufficiency, to be justified in rejecting naturalism, yet contrary evidence and considerations abound. Rather than trying to cover the gamut of cosmological fine-tuning, the origin of biological information, the origin and nature of consciousness, and the evidentiary value of near-death experiences, let's focus on the implications of quantum physics as a less familiar aspect of naturalism's failure.

Quantum physics sets aside classical conceptions of motion and the interaction of bodies and introduces acts of measurement and probabilities for observational outcomes in an irreducible way not ameliorated by appealing to our limited knowledge. The state of a quantum system is described by an abstract mathematical object called a wave function that only specifies the probability that various observables will have a particular value when measured. These probabilities can't all equal zero or one, and measurement results are irreducibly probabilistic, so no sufficient physical reason exists for one outcome being observed rather than another. This absence of sufficient material causality in quantum physics has experimentally confirmed consequences that, as we shall see, put an end to naturalist conceits.
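(Concretely, this is the textbook Born rule: a state written as a superposition over the eigenstates of the measured observable yields each outcome with a probability given by the squared amplitude.)

\[
|\psi\rangle = \sum_i c_i\,|a_i\rangle, \qquad P(a_i) = |c_i|^2, \qquad \sum_i |c_i|^2 = 1 .
\]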

The delayed-choice quantum eraser experiment provides a good example with which to start. This experiment measures which path a particle took after wave function interference inconsistent with particle behavior has already been created. The interference can be turned off or on by choosing whether or not to measure which way the particle went after the interference already exists. Choosing to look erases wave function interference and gives the system a particle history. The fact that we can make a causally disconnected choice whether wave or particle phenomena manifest in a quantum system demonstrates that no measurement-independent causally-connected substantial material reality exists at the microphysical level.

We see this in other ways too. First, the physically reasonable assumptions that an individual particle, like an electron, cannot serve as an infinite source of energy or be in two places at once entail that quantum particles have zero probability of existing in any bounded spatial region, no matter how large. Unobserved electrons (for example) don't exist anywhere in space, and thus have no reality apart from measurement. In short, there is no intelligible notion of microscopic material objects: particle talk has pragmatic utility in relation to measurement results and macroscopic appearances, but no basis in unobserved (mind-independent) reality.

Secondly, microphysical properties do not require a physical substrate. Reminiscent of Alice in Wonderland, quantum physics has its own Cheshire Cat, in which quantum systems behave as if their properties are spatially separated from their positions. For example, an experiment using a neutron interferometer has sent neutrons along one path while their spins follow another. In macroscopic terms, this would be like still having the spin once the top is taken away, having a dance without any dancer, or having a water wave without any water. Under appropriate experimental conditions, quantum systems are decomposable into disembodied properties, a collection of Cheshire Cat grins.

But how, then, should we understand the transition between the microscopic and macroscopic worlds? Every quantum wave function is expressible as a superposition of different possibilities (states) in which the thing it describes fails to possess the properties those possibilities specify. No quantum system, microscopic or macroscopic, ever has simultaneously determinate values for all its associated properties. You could think of it this way: imagine a house that, if you were looking at the front, didn't have a back, and vice versa. Everything we experience with our senses, if we take it to be a mind-independent object rather than just a phenomenological appearance, is metaphysically incomplete. What is more, under special laboratory conditions, we can create macroscopic superpositions of properties that are, classically speaking, inconsistent; for instance, a single object appearing in more than one location simultaneously. Large organic molecules have been put into such superpositions, and Superconducting Quantum Interference Devices (SQUIDs) have superposed a billion electrons moving clockwise around a superconducting ring with another billion electrons moving anticlockwise, so that two incompatible macroscopic currents are in superposition.

What this reveals is that the macroscopic stability we normally observe is the product of what physicists call environmental decoherence: the destructive interference of probability waves as quantum systems interact. You can imagine this as two water waves of the same size meeting each other from opposite directions. When the crest of one wave meets the trough of the other, there is destructive interference as the waves cancel out and the water's surface is momentarily flat and calm. The quantum realm behaves analogously: our experiential world of appearances is cloaked in an illusory stability, while underneath, innumerable probability waves are destructively interfering in a roiling quantum sea.

It is important to keep in mind that, while this quantum sea is the basis of our experiential reality, none of the mathematical-structural components of interacting quantum wave functions are materially real. They are mathematical abstractions, a hollow and merely quantitative informational architecture. Speaking of the mathematical framework of physical theory, Robert Adams remarks that "[it] is a framework that, by its very nature, needs to be filled in by something less purely formal. It can only be a structure of something of some not merely structural sort ... it participates in the incompleteness of abstractions ... [whereas] the reality of a substance must include something intrinsic and qualitative over and above any formal or structural features it may possess." Our experiential reality rests on a quantum-informational construct that is not materially substantial.

As a final observation before nailing the coffin of naturalism shut, in the case of laboratory-created macroscopic superpositions, our conscious self is not in the superposition but rather observing it. We are substantial, but the world of our experience is not. Our mental life transcends quantum reality. While this reality is given to us and not produced by our own consciousness, it is merely phenomenological: it goes no deeper than the perceptual possibilities across all five of our sensory modalities decohering (destructively interfering) to produce our world.

But why should this be so? When there is no sufficient physical reason why one observation occurs rather than another, why should mere perceptions cohere across our sensory modalities, and why should all of us inhabit the same world? Saying that since no physical explanation is possible, no explanation is required, would be a mistake of disastrous proportions. If there were no reason why we observe one thing rather than another, if the regularities of nature were metaphysically ungrounded, then our current perception of reality and its accompanying memories might be happening for no reason at all. How could we know? No objective probability and hence no likelihood is assignable to something for which there is no explanation, so we couldn't even say this possibility is unlikely.

Let's be perfectly clear. If we affirm brute chance by saying that some things can happen for no reason at all, we have deprived ourselves of any basis for deciding which things these are, and they could well include all of the perceptions and beliefs we currently take ourselves to have. This means we don't even know whether we're in touch with reality. We're stuck with an irremediable skepticism that deprives our experience of any credibility, not only destroying any basis for doing science, but eliminating the very possibility of our knowing anything at all! Embracing brute chance by denying that every contingent event must have an explanation is the pathway to epistemic nihilism. An explanation must exist.

But what could the explanation be? The laws of nature, specifically those of quantum physics, won't suffice. They're neither logically nor metaphysically necessary. The reality they describe did not need to exist, and they certainly didn't cause its existence; in short, they are in need of explanation themselves. Clearly, naturalism is inadequate: it cannot meet the ineluctable explanatory demand. A proper ultimate explanation must terminate upon something that transcends contingent reality and has self-contained existence as its very essence.

The required conclusion is obvious: since every contingent state of affairs requires an explanation, there must exist a transcendent, independent, necessarily existent being, the existence of which is explained by its intrinsic necessity. This being is unique, not just because two or more necessary beings would be overkill, but because their mutual dependence would create unexplainable contingency. Furthermore, since spacetime and mass-energy are contingent phenomena, this transcendent being must be incorporeal. Finally, in explaining why any reality exists, especially in the absence of a uniquely best reality, a non-arbitrary, self-determined decision based on a perfectly ranked and complete set of reasons known to this necessarily existent being must be made. This means the necessary ground for the phenomenological reality of our experience is a transcendent, omniscient Mind. Given such considerations, quantum physics not only shows the falsity of naturalism, it leads to a transcendent form of idealism. Goodbye, Richard Dawkins, and hello, Bishop Berkeley!

Quantum computing and finance | UDaily – University of Delaware

Article adapted with permission by Karen B. Roberts | Photo illustration by Jeffrey C. Chase | March 11, 2024

Over the next decade, quantum computers are expected to have a transformative impact on numerous industry sectors, as they surpass the computational capabilities of classical computers. In finance, for example, quantum computing will one day be used to speed banking, make financial predictions and analyze financial patterns and risks.

The technology, however, is still in its infancy.

The University of Delaware's Ilya Safro, associate professor and associate chair for graduate studies and research in the Department of Computer and Information Sciences, is part of a team of researchers from industry, academia and the U.S. Department of Energy's Argonne National Laboratory that recently published a primer on quantum computing and finance.

The paper, published in Nature Reviews Physics, summarizes the current state of the art in quantum computing for financial applications and outlines advantages and limitations over classical computing techniques used by the financial industry. It also sheds light on some of the challenges that still need to be addressed for quantum computing to be used in this way.

It's an important topic across the world and at UD, which is among the first institutions in the United States to offer an interdisciplinary graduate degree program in quantum science and engineering.

The work, facilitated by the Chicago Quantum Exchange (CQE) and led by a team that includes UD, Argonne, JPMorgan Chase and University of Chicago scientists, lays groundwork for future applications and highlights the need for cross-sector collaboration. Other partners include Fujitsu Research of America, Inc., and Menten AI.

The hope, Safro said, is to bring the financial world to what is called "practical quantum advantage," where processes are faster, more accurate and more energy efficient.

"Finance is an area where even a small improvement will be felt literally in dollars," said Safro. "Even a tiny improvement at the level of economy will be very significant. For example, it may amplify efficiencies across entire industries, leading to substantial cost reductions, enhanced productivity or more sustainable practices."

This is one reason finance is considered a major beneficiary of quantum computing.

The team views the primer, written for researchers who aren't necessarily quantum computing experts, as a one-stop resource on the use of quantum computers to accelerate solutions for the finance sector. The paper discusses challenges in three categories at the intersection of finance and computing: optimization, machine learning and stochastic modeling.

"We got together as a group of researchers from different institutions to better understand the state of the art of quantum computing for financial applications," said Marco Pistoia, head of Global Technology Applied Research at JPMorgan Chase. "We wanted this to be appreciated by a larger audience. Our paper can be the starting point for researchers to better understand the landscape and then dive deeper in the areas that they're interested in."

Quantum computing harnesses features of physics at the level of the atom to perform computations at speeds that leave traditional computing in the dust. In some cases, a quantum computer will be able to calculate in a few minutes what it would take a supercomputer 10,000 years to run.

"The upside of quantum computing is absolutely humongous," said Argonne scientist Yuri Alexeev, one of the report's co-authors. "We're talking about a potential speedup of millions of times for solving certain problems."

It is precisely the advantage of supersonic speed that finance experts are interested in.

"In the financial world, time and accuracy are of the essence," Alexeev said. "Getting solutions quickly can have huge benefits."

This could apply to everything from improving portfolio management to optimizing investment strategies to advancing the speedy detection of credit card fraud, to name just a few examples.

"All these problems sound very general, but in fact, they are mathematical problems. Moreover, many of them are mathematical optimization problems," said Safro, whose expertise is in algorithms and models for quantum computing, machine learning and artificial intelligence systems, with a focus on natural language processing.

The three categories of challenges the paper discusses (optimization, machine learning and stochastic modeling) are at the intersection of finance and computing.

Optimization refers to methods for rapidly obtaining the best solution to a problem. For example, financial companies could use quantum computers to rapidly select assets that would provide the maximum return on a customer's investment with minimal risk.
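The kind of objective involved can be sketched classically in a few lines. The toy example below brute-forces a binary asset selection that trades expected return against a risk penalty; all numbers are made up, and a quantum optimizer would target the same sort of objective at scales where brute force fails.

```python
# Toy asset-selection problem: pick a subset of assets maximizing expected
# return minus a risk penalty. Brute force over all subsets; returns and the
# covariance matrix are made-up illustrative numbers.
from itertools import product
import numpy as np

returns = np.array([0.08, 0.12, 0.10, 0.05])            # expected returns
cov = np.array([[0.10, 0.02, 0.01, 0.00],
                [0.02, 0.15, 0.03, 0.01],
                [0.01, 0.03, 0.12, 0.02],
                [0.00, 0.01, 0.02, 0.05]])               # covariance (risk) matrix
risk_aversion = 0.5

best_score, best_choice = float("-inf"), None
for bits in product([0, 1], repeat=len(returns)):        # every possible subset
    x = np.array(bits)
    score = returns @ x - risk_aversion * (x @ cov @ x)  # return minus risk penalty
    if score > best_score:
        best_score, best_choice = score, x

print("selected assets:", best_choice, "objective:", round(best_score, 4))
```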

The second category, machine learning, is already a part of many financial institutions' toolkits. In machine learning, computers draw on massive data sets to make predictions about various behaviors, such as patterns in the stock market. Combining quantum algorithms with machine learning can massively speed up those predictions.

The third category, stochastic modeling, is used across the sciences to predict the spread of disease, the evolution of a chemical reaction, or weather patterns. The mathematical technique models complex processes by making random changes to a variable and observing how the process responds to the changes. The method is used in finance, for instance, to describe the evolution of stock prices and interest rates. With the power of quantum computing behind it, stochastic modeling can provide faster and more accurate predictions about the market.
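As a concrete, purely classical illustration of such a model, the sketch below runs a Monte Carlo simulation of geometric Brownian motion, a standard toy model for a stock price; the drift, volatility and horizon are arbitrary illustrative values, not calibrated to any market.

```python
# Monte Carlo simulation of geometric Brownian motion, a standard toy model
# for stock prices. Parameters are illustrative, not calibrated to any market.
import numpy as np

rng = np.random.default_rng(0)
s0, mu, sigma = 100.0, 0.05, 0.2       # initial price, drift, volatility (per year)
T, steps, paths = 1.0, 252, 10_000     # one year of daily steps, 10k sample paths
dt = T / steps

z = rng.standard_normal((paths, steps))
log_steps = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
prices = s0 * np.exp(np.cumsum(log_steps, axis=1))

print("mean simulated final price:", prices[:, -1].mean())
# Should be close to the analytic expectation s0 * exp(mu * T), about 105.1
```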

As the Nature Reviews Physics report makes clear, there is no shortage of finance challenges for quantum computers to tackle. Training the future workforce is a key part of solving future challenges.

UD is among the first higher-education institutions in the country to offer graduate and doctorate-level degrees in quantum science and engineering. Launched in 2023, the UD program offers three tracks: quantum nanotechnology, quantum theory or quantum algorithms and computation. Housed within the University's Graduate College, the interdisciplinary program is designed to equip next-generation quantum experts with skills expected to be in high demand, including algorithms, theoretical physics, engineering and nanoscale science.

According to Safro, one of the things that makes the field and ongoing research in this area exciting is the unknown.

"Currently, there is no one specific quantum technology that we know for sure will take over the market," he said.

This means that multiple quantum technologies and vendors must compete to scale up quantum hardware, making it more powerful, reliable and accessible for broad use in scientific research to practical applications for a variety of industries.

"Once researchers demonstrate practical scalability of quantum computing devices in one of these technologies, we will have an exact roadmap of how we build bigger and bigger quantum computers to tackle very large, real-world problems and how to hybridize them with classical supercomputers," Safro continued. "With this literal breakthrough, I think the number of available jobs in quantum computing will explode, similar to what we have seen today with artificial intelligence."

This research was facilitated by the Chicago Quantum Exchange (CQE), an intellectual hub that brings together academia, government, and industry to advance quantum research, train the future quantum workforce and drive the quantum economy. Argonne and UChicago are founding members of the CQE, which is also anchored by DOE's Fermi National Accelerator Laboratory, the University of Illinois Urbana-Champaign, the University of Wisconsin-Madison and Northwestern University. JPMorgan Chase is a CQE corporate partner.

Protecting quantum computers from adversarial attacks – Innovation News Network

The solution, Quantum Noise Injection for Adversarial Defence (QNAD), counteracts the impact of adversarial attacks designed to disrupt AI inference on quantum computers, that is, the AI's ability to make decisions or solve tasks.

"Adversarial attacks designed to disrupt AI inference have the potential for serious consequences," said Dr Kanad Basu, assistant professor of electrical and computer engineering at the Erik Jonsson School of Engineering and Computer Science.

The work will be presented at the IEEE International Symposium on Hardware Oriented Security and Trust on 6-9 May in Washington, DC.

Quantum computers can solve several complex problems exponentially faster than classical computers. The emerging technology uses quantum mechanics and is expected to improve AI applications and solve complex computational problems.

Qubits represent the fundamental unit of information in quantum computers, like bits in traditional computers.

In classical computers, bits represent 1 or 0. However, qubits take advantage of the principle of superposition and can, therefore, be in a combination of the 0 and 1 states. By representing both states at once, quantum computers can achieve greater speed than traditional computers.
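(In standard notation, which is general background rather than detail from this article, a single qubit state is a normalized superposition of the two basis states, and a register of n qubits carries 2^n such amplitudes at once.)

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]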

For example, quantum computers have the potential to break highly secure encryption systems due to their computing power.

Despite their advantages, quantum computers are vulnerable to adversarial attacks.

Due to factors such as temperature fluctuations, magnetic fields, and imperfections in hardware components, quantum computers are susceptible to noise or interference.

Quantum computers are also prone to unintended interactions between qubits.

These challenges can cause computing errors.

The researchers leveraged intrinsic quantum noise and crosstalk to counteract adversarial attacks.

The method introduced crosstalk into the quantum neural network (QNN), a form of machine learning in which datasets train computers to perform tasks, such as detecting objects like stop signs and other computer vision tasks.

"The noisy behaviour of quantum computers actually reduces the impact of attacks," said Basu, who is senior author of the study. "We believe this is a first-of-its-kind approach that can supplement other defences against adversarial attacks."

The researchers revealed that during an adversarial attack, the AI application was 268% more accurate with QNAD than without it.

The approach is designed to supplement other techniques to protect quantum computer security.

"In case of a crash, if we do not wear the seat belt, the impact of the accident is much greater," Shamik Kundu, a computer engineering doctoral student and a first co-author, said.

"On the other hand, if we wear the seat belt, even if there is an accident, the impact of the crash is lessened. The QNAD framework operates akin to a seat belt, diminishing the impact of adversarial attacks, which symbolise the accident, for a QNN model."

The research was funded by the National Science Foundation.

How one theory ties together everything we know about the universe – New Scientist

Image: You can sip your coffee thanks to the fundamental forces of nature. Credit: Vladimir Arndt / Alamy

The following is an extract from our Lost in Space-Time newsletter. Each month, we hand over the keyboard to a physicist or mathematician to tell you about fascinating ideas from their corner of the universe. You can sign up for Lost in Space-Time for free here.

Though our world is bewildering in its diversity, all known natural phenomena can be classified into just a few categories. Four of these (gravitational, electromagnetic, strong nuclear and weak nuclear) are ...

Quantum Gravity Unveiled: Scientists Crack the Cosmic Code That Baffled Einstein – SciTechDaily

Researchers have developed a method to measure gravity at a microscopic level, marking a significant advancement in understanding quantum gravity. Credit: SciTechDaily.com

Physicists successfully measure gravity in the quantum world, detecting weak gravitational pull on a tiny particle with a new technique that uses levitating magnets, putting scientists closer to solving mysteries of the universe.

Scientists are a step closer to unraveling the mysterious forces of the universe after working out how to measure gravity on a microscopic level.

Experts have never fully understood how the force discovered by Isaac Newton works in the tiny quantum world.

Even Einstein was baffled by quantum gravity and, in his theory of general relativity, said there is no realistic experiment that could show a quantum version of gravity.

However, physicists at the University of Southampton, working with scientists in Europe, have now successfully detected a weak gravitational pull on a tiny particle using a new technique.

They claim it could pave the way to finding the elusive quantum gravity theory.

The experiment, published in the Science Advances journal, used levitating magnets to detect gravity on microscopic particles small enough to border on the quantum realm.

Artist's impression of the quantum experiment. Credit: University of Southampton

Lead author Tim Fuchs, from the University of Southampton, said the results could help experts find the missing puzzle piece in our picture of reality.

He added: "For a century, scientists have tried and failed to understand how gravity and quantum mechanics work together.

"Now we have successfully measured gravitational signals at the smallest mass ever recorded, it means we are one step closer to finally realizing how it works in tandem.

"From here we will start scaling the source down using this technique until we reach the quantum world on both sides.

"By understanding quantum gravity, we could solve some of the mysteries of our universe, like how it began, what happens inside black holes, or uniting all forces into one big theory."

The rules of the quantum realm are still not fully understood by science but it is believed that particles and forces at a microscopic scale interact differently than regular-sized objects.

Academics from Southampton conducted the experiment with scientists at Leiden University in the Netherlands and the Institute for Photonics and Nanotechnologies in Italy, with funding from the EU Horizon Europe EIC Pathfinder grant (QuCoM).

Their study used a sophisticated setup involving superconducting devices, known as traps, with magnetic fields, sensitive detectors, and advanced vibration isolation.

It measured a weak pull, just 30 attonewtons (aN), on a tiny particle 0.43 mg in mass by levitating it at freezing temperatures a hundredth of a degree above absolute zero, about minus 273 degrees Celsius.
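(For readers unfamiliar with the units, those figures in SI terms are:)

\[
30~\text{aN} = 3.0 \times 10^{-17}~\text{N}, \qquad 0.43~\text{mg} = 4.3 \times 10^{-7}~\text{kg}.
\]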

"The results open the door for future experiments between even smaller objects and forces," said Professor of Physics Hendrik Ulbricht, also at the University of Southampton.

He added: "We are pushing the boundaries of science that could lead to new discoveries about gravity and the quantum world.

"Our new technique that uses extremely cold temperatures and devices to isolate the vibration of the particle will likely prove the way forward for measuring quantum gravity.

"Unravelling these mysteries will help us unlock more secrets about the universe's very fabric, from the tiniest particles to the grandest cosmic structures."

Reference: "Measuring gravity with milligram levitated masses" by Tim M. Fuchs, Dennis G. Uitenbroek, Jaimy Plugge, Noud van Halteren, Jean-Paul van Soest, Andrea Vinante, Hendrik Ulbricht and Tjerk H. Oosterkamp, 23 February 2024, Science Advances. DOI: 10.1126/sciadv.adk2949

Enabling state-of-the-art quantum algorithms with Qedma's error mitigation and IonQ, using Braket Direct | Amazon … – AWS Blog

This post was contributed by Eyal Leviatan, Barak Katzir, Eyal Bairey, Omri Golan, and Netanel Lindner from Qedma, Joshua Goings from IonQ, and Daniela Becker from AWS.

Quantum computing is an exciting, fast-paced field. And especially in these early days, unfettered access to the right set of resources is critical in order to accelerate experimentation and innovation. Amazon Braket provides customers access to a choice of quantum hardware and the tooling they need to experiment, while also enabling them to engage directly with experts across the field from scientists to device manufacturers.

In this post, the team from Qedma, a quantum software company, dives into how they used Braket Direct to accomplish a milestone demonstration of their error mitigation software on IonQ's Aria device. Leveraging dedicated access to quantum hardware capacity using reservations and collaborating with IonQ scientists for expert guidance directly via AWS, Qedma was able to successfully execute some of the most challenging Variational Quantum Eigensolver (VQE) circuits on a quantum processor to date.

In today's quantum processing units (QPUs), the susceptibility to various forms of noise results in errors that corrupt the quantum program and eventually render the results useless. The accumulation of errors over time limits the duration, and therefore the performance, of quantum algorithms. Thus, achieving quantum advantage, the ability to perform computations on quantum computers significantly faster than with classical supercomputers, requires a solution that mitigates the detrimental impact of these errors and enables algorithms to scale.

Error mitigation aims to reduce the effect of errors on the outputs of circuits executed on noisy quantum devices. However, these improvements come at the cost of runtime overhead that increases with the number of two-qubit gates (circuit volume) in the circuit. To overcome this, Qedma's novel approach to error mitigation, and the Qedma Error Suppression and Error Mitigation (QESEM) product, requires exponentially less overhead compared to other methods and suppresses errors at the hardware level to run longer programs while maintaining reasonable runtimes, potentially accelerating the path to quantum advantage.
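As a generic illustration of why this overhead matters (this is a textbook scaling for probabilistic-error-cancellation-style mitigation, not Qedma's specific formula), the number of circuit repetitions needed to reach a fixed accuracy typically grows exponentially with the number of noisy two-qubit gates:

\[
N_{\text{samples}} \;\propto\; \gamma_{\text{tot}}^{2}, \qquad \gamma_{\text{tot}} = \prod_{j=1}^{N_{2q}} \gamma_j \;\sim\; e^{\,c\,N_{2q}},
\]

where each \(\gamma_j > 1\) quantifies the noise on gate \(j\) and \(N_{2q}\) is the number of two-qubit gates.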

Below we detail how QESEM was used in conjunction with IonQ's Aria device via Braket Direct to produce high-accuracy results for a variety of quantum chemistry and quantum materials applications. We also show how Braket Direct provided us with dedicated QPU access, ideally suited for QESEM's interactive workflow, as well as the ability to connect directly with IonQ's hardware experts. Scientific guidance from IonQ was important for tailoring QESEM to make the best use of Aria, and for constructing novel quantum chemistry circuits for the demonstration. These included VQE and Hamiltonian simulation circuits on 12 qubits, leveraging the high connectivity of IonQ's devices. The results presented in this blog post demonstrate how users can push the boundaries of quantum chemistry and materials applications accessible on IonQ's devices with Qedma's error mitigation, powered by Braket Direct.

QESEM can be used with any quantum program. When applied, QESEM first carries out a hardware-specific characterization protocol. According to the deduced error model, QESEM recompiles the input quantum circuit to a set of circuits that are sent to the device; the measurement outcomes are then classically post-processed, returning high-accuracy outputs, as we demonstrate below. The characterization process underlying QESEM ensures that its results are unbiased for any circuit. This means that QESEM provides results whose accuracy is only limited by the QPU time allocated for execution. In contrast, many error mitigation methods are algorithm-specific or heuristic. Algorithm-specific methods are not designed to mitigate generic errors across any quantum circuit, whereas heuristic methods generically converge to an incorrect (biased) output [1]. Relative to the leading unbiased and algorithm-agnostic methods, QESEM's QPU time is exponentially shorter as a function of circuit volume, as shown below.

We applied QESEM to three circuits from various applications and with a range of structural circuit properties (see Table 1). Specifically, we created a reservation via Braket Direct to get dedicated device access to IonQ's Aria device. The reservation enabled the entire QESEM workflow to execute within a single working session, where exclusive QPU access avoided the need to wait in line, and optimized throughput resulted in the shortest possible runtime. Along with the inherent stability of the physical properties of IonQ's Aria, the reduced runtime ensured minimal drift of the system parameters during our experiments. This allowed QESEM to obtain an efficient description of the noise model during the execution.

Table 1: Properties of the circuits we demonstrated QESEM on.

Compared to the number of qubits they employ, all three circuits are comprised of a relatively high number of unique two-qubit gates between different pairs of qubits. This is made possible by the all-to-all qubit connectivity of IonQ's hardware, which can calibrate an entangling gate between any pair of qubits; each of those gates is uniquely facilitated through the vibrational modes of the ion chain encoding the qubits. On the one hand, high qubit connectivity allows the compilation of complex circuits without incurring significant depth overhead. In contrast, on devices with lower connectivity, e.g., a square lattice, applying a two-qubit gate to qubits that are not connected requires additional SWAP gates. On the other hand, the ability to run a large number of two-qubit gates poses a challenge for any characterization-based error mitigation method, since the noise model becomes very complicated. To address this challenge, QESEM used a characterization model specifically tailored to trapped ions, efficiently describing the errors of trapped-ion devices using a tractable noise model.

The first two circuits are examples of the VQE algorithm, which aims to find the ground state energy of a quantum many-body system, e.g., a molecule [1]. The specific examples we ran were designed to find the ground states of the NaH and O2 molecules. The third circuit realized a Hamiltonian simulation algorithm, implementing the time evolution of a quantum spin lattice. We first describe the VQE circuits and focus on the oxygen molecule O2. Our efforts concentrated there due to its relevance to industrial and biological processes, while striking a balance between complexity and tractability, making it a robust test for today's quantum devices. Moreover, the O2 experiment used a circuit volume of 99 two-qubit gates, larger than all VQE circuits featured in a recent experimental survey [3].
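(The principle behind VQE, stated generically rather than as this experiment's specific implementation, is the variational bound: the energy of any parameterized trial state upper-bounds the true ground-state energy, so the circuit parameters are tuned to minimize the measured energy.)

\[
E(\vec{\theta}) = \langle \psi(\vec{\theta})\,|\,\hat{H}\,|\,\psi(\vec{\theta})\rangle \;\ge\; E_0, \qquad E_{\text{VQE}} = \min_{\vec{\theta}} E(\vec{\theta}).
\]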

Typically, the presence of errors severely limits the size of VQE circuits because of the need for particularly accurate results. The ability to leverage the all-to-all connectivity of trapped-ion devices to reduce gate overhead is therefore well suited to this type of algorithm. With Braket Direct, we were able to incorporate expert guidance from IonQ on how to maximize the benefit of using their high connectivity and compile directly to their native gates to optimize the VQE circuits for the Aria device and produce the best results.

IonQ brought their quantum chemistry expertise to the table, equipping Qedma with circuits precisely crafted for the O2 molecule. Designed to mirror full configuration interaction results [4], these circuits included a chemistry-inspired Ansatz [5] supplemented by particle-conserving unitaries, which reflects the underlying molecular electronic structure. Additionally, IonQ undertook the classical optimization of the circuit parameters, setting the groundwork for Qedma to apply QESEM effectively during the final energy assessment.

QESEM significantly enhanced the accuracy of the ground-state energy of the O2 molecule. Running this VQE circuit on Aria without error mitigation and measuring the ground state energy yields the result shown in red in Figure 1. This unmitigated result, i.e. executed without error mitigation, misses its mark by roughly 30%. In black, we show the exact energy, as it would have been obtained from the VQE circuit had it been run on a noise-free, i.e., ideal device. Using QESEM, the error mitigated energy (blue) closely matches the exact result up to the statistical error bar corresponding to the finite mitigation time. Moreover, the error bar accompanying the mitigated result is small enough to indicate a very clear statistical separation from the unmitigated result.

Figure 1: The ground state energy of the O2 molecule as obtained from running the VQE circuit on IonQ Aria without error mitigation (red) and with QESEM (blue) compared to the exact result that would be obtained on an ideal, i.e., noise-free, device.

Aside from the ground-state energy, this VQE circuit also allows us to learn about the electronic structure of the O2 molecule. The states of individual qubits encode the electronic occupations of the molecule's orbitals. A qubit in the 0 state signifies an empty orbital, whereas the 1 state corresponds to occupation by a single electron. Moreover, from the correlations between pairs of qubits, we can extract the correlations between occupations. Some examples of occupations and their correlations can be seen in Figure 2. Again, all mitigated values match the ideal values up to the statistical error bars, while the noisy results are, in most cases, far off.
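
As a rough illustration of this kind of post-processing, the sketch below estimates orbital occupations and their connected correlations from measured bitstrings. The bitstring source and the qubit-to-orbital mapping here are illustrative assumptions, not the blog's actual analysis pipeline.

```python
# Estimate <n_i> and the connected correlations <n_i n_j> - <n_i><n_j>
# from an array of measurement outcomes.
import numpy as np

def occupations_and_correlations(bitstrings):
    """bitstrings: array of shape (shots, n_qubits) with 0/1 entries."""
    samples = np.asarray(bitstrings, dtype=float)
    occ = samples.mean(axis=0)                  # <n_i> for each orbital
    joint = samples.T @ samples / len(samples)  # <n_i n_j>
    corr = joint - np.outer(occ, occ)           # connected correlations
    return occ, corr

# Example with fake measurement data: 4 orbitals, 1000 shots.
rng = np.random.default_rng(0)
fake_shots = rng.integers(0, 2, size=(1000, 4))
occ, corr = occupations_and_correlations(fake_shots)
print(occ, corr[0, 1])
```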

Figure 2: Ideal, noisy, and mitigated values for example orbital occupations and their correlations.

Similar results for the NaH VQE circuit are shown in Figure 3. While the NaH circuit is narrower, i.e., involves fewer qubits, it requires a full qubit-connectivity graph and is of comparable depth. Since this circuit only makes use of 6 qubits, there are just 2^6 = 64 possible measurement outcomes, allowing the depiction of the full probability distribution of outcomes (see Figure 3). Excellent agreement of the mitigated results with the ideal outcome can be seen for all bitstrings, demonstrating QESEM's capability to provide an unbiased estimate for any output observable of interest.

Figure 3: Results for the NaH VQE circuit. Left: The probability distribution of all possible measurement outcomes. Right: Observables of interest, e.g., the ground state energy. QESEM results (blue) reproduce the ideal values (black) up to statistical accuracy while the unmitigated results (red) are off.

In the study of quantum materials, there are two fundamental questions of interest: energetics and dynamics. The VQE algorithm presented above addresses the question of energetics. In contrast, the Hamiltonian simulation algorithm computes the time evolution of the quantum state of the material, i.e., its dynamics. The quantum circuit approximates the continuous dynamics by small discrete time evolution steps [6].

Spin Hamiltonians are widely used as models for quantum materials in which the electrons sit at fixed positions but interact magnetically. For this demonstration, we chose a canonical Hamiltonian, the so-called XY model with a perpendicular magnetic field [7]. The 12 spins, encoded by 12 qubits, reside on the sites of a three-by-four triangular lattice with periodic boundary conditions (see Figure 4). Under these conditions, the Hamiltonian simulation circuit requires high connectivity between the qubits to be compiled compactly. Beyond being a highly demanding benchmark, the Hamiltonian we simulated also illustrates rich quantum physical phenomena. The XY model describes strongly interacting bosons, as in a Josephson junction array. On a triangular lattice, this type of system can form an exotic phase of matter called a supersolid [8].
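
To show what a discrete time-evolution step looks like in practice, here is a sketch of a single first-order Trotter step for an XY model with a perpendicular field, built with Braket's XX/YY Ising gates. The bond list, coupling J, field h, and time step dt are illustrative assumptions; this is not the 12-qubit triangular-lattice circuit run in the experiment.

```python
# One first-order Trotter step of H = J * sum_<ij> (X_i X_j + Y_i Y_j) + h * sum_i Z_i.
from braket.circuits import Circuit

def trotter_step(bonds, n_qubits, J=1.0, h=0.5, dt=0.1):
    circ = Circuit()
    for (i, j) in bonds:
        # exp(-i J dt (X_i X_j + Y_i Y_j)), up to gate-angle conventions
        circ.xx(i, j, 2 * J * dt)
        circ.yy(i, j, 2 * J * dt)
    for q in range(n_qubits):
        circ.rz(q, 2 * h * dt)      # perpendicular-field rotation
    return circ

# Example: a small triangular patch of 4 sites (hypothetical bond list).
bonds = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
print(trotter_step(bonds, n_qubits=4))
```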

Figure 4: Hamiltonian simulation. Left: the simulated triangular spin lattice. Colors represent different observables of interest: the magnetization of individual spins (gray) and correlations between magnetizations of different spin patterns. Right: ideal, noisy, and mitigated values for the different observables.

Figure 4 shows the values of various observables of physical interest after one time step (consisting of 72 two-qubit gates) is applied to an initial state in which all spins, i.e., qubits, are oriented along the X direction. From left to right, these observables are the projections onto the X direction of the magnetization of single spins, and correlations of spin magnetizations along interaction bonds, lattice plaquettes, and strings of spins that wrap around the lattice in one of its directions. Examples of each appear on the top panel in matching colors. These observables indicate the strength of various magnetic properties of the model. For each observable, we present the exact expectation values in black, the noisy unmitigated values in red, and the error-mitigated results using QESEM in blue. Again, the QESEM results reproduce the ideal values up to statistical accuracy, while the unmitigated results are statistically well separated from both.

While we presented only a few specific examples, QESEM can be applied to any quantum circuit for which error-free results are desired. It is meticulously designed to optimize the accuracy-to-runtime tradeoff inherent to error mitigation methods. In particular, QESEM's QPU time, at a given statistical accuracy, scales exponentially better as a function of the volume of the target circuit compared to competing unbiased error mitigation protocols. For instance, a circuit with 120 two-qubit gates, run on a trapped-ion device with 99% two-qubit gate fidelity, would take 90 minutes to execute to 90% accuracy using QESEM, which can easily be completed within a two-hour device reservation using Braket Direct. The same circuit, executed with the leading competing unbiased and algorithm-generic error mitigation technique, Probabilistic Error Cancellation [9, 10], would take over a month.
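
To see why unbiased mitigation costs grow so quickly with circuit volume, the back-of-the-envelope sketch below evaluates the standard PEC shot-overhead scaling, gamma**(2*N) for N mitigated gates with per-gate quasi-probability norm gamma. The value of gamma used here is an assumption for illustration, not a measured IonQ or QESEM figure, and actual wall-clock runtimes also depend on device shot rates.

```python
# Illustrate the exponential growth of PEC sampling overhead with gate count.
gamma = 1.02   # assumed per-two-qubit-gate quasi-probability norm (illustrative)

for n_gates in (30, 60, 120, 240):
    overhead = gamma ** (2 * n_gates)   # multiplicative shot overhead vs. noiseless sampling
    print(f"{n_gates:4d} gates -> ~{overhead:8.1f}x more shots")
```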

Error mitigation is essential for executing cutting-edge applications on near-term quantum devices [1]. While the problems discussed in this blog can be simulated classically, QESEM enables accurate, error-free execution of large circuits, increasing the number of two-qubit gates that can be utilized by more than an order of magnitude compared to unmitigated execution at the same level of accuracy.

Figure 5 shows the circuit volumes accessible with QESEM on trapped-ion devices. With expected near-future improvements in hardware fidelities and qubit counts, QESEM could enable executing generic quantum circuits faster than a supercomputer performing a state-vector simulation of the same circuit. Achieving this milestone will spur further exploration of applications requiring simulations of quantum systems, such as the design of novel materials.

Figure 5: Accessible circuit volumes with QESEM on ion traps, assuming a desired accuracy of 90%. Active volume denotes the number of two-qubit gates within the circuit that affect the observable of interest; here it is measured in terms of IonQ's Mølmer–Sørensen (MS) entangling gates. The black line estimates the time it would take a supercomputer to perform a state-vector simulation of a square circuit with the corresponding circuit volume. A square circuit consists of a sequence of layers in which each qubit participates in an MS gate, and the number of layers equals the number of qubits (width = depth).

To learn more about Qedma and QESEM, visit Qedma's website. To further accelerate your research with dedicated access to quantum hardware, including IonQ's latest Forte QPU, check out the Braket Direct documentation or navigate to the AWS Management Console.

The content and opinions in this blog are those of the third-party authors and AWS is not responsible for the content or accuracy of this blog.

[1] Quantum Error Mitigation, https://arxiv.org/abs/2210.00921 (2022)
[2] A variational eigenvalue solver on a photonic quantum processor, https://www.nature.com/articles/ncomms5213 (2014)
[3] Orbital-optimized pair-correlated electron simulations on trapped-ion quantum computers, https://www.nature.com/articles/s41534-023-00730-8 (2023)
[4] Molecular Electronic-Structure Theory, John Wiley & Sons (2014)
[5] Universal quantum circuits for quantum chemistry, https://doi.org/10.22331/q-2022-06-20-742 (2022)
[6] Universal Quantum Simulators, https://www.science.org/doi/10.1126/science.273.5278.1073 (1996)
[7] Boson localization and the superfluid-insulator transition, https://journals.aps.org/prb/abstract/10.1103/PhysRevB.40.546 (1989)
[8] Superfluids and supersolids on frustrated two-dimensional lattices, https://journals.aps.org/prb/abstract/10.1103/PhysRevB.55.3104 (1997)
[9] Probabilistic error cancellation with sparse Pauli–Lindblad models on noisy quantum processors, https://www.nature.com/articles/s41567-023-02042-2 (2023)
[10] Efficiently improving the performance of noisy quantum computers, https://arxiv.org/abs/2201.10672 (2022)


The new system allows researchers to look at phenomena that occur in special topological materials by video recording the … – EurekAlert

Image: Left to right: Prof. Yair Shokef, Dr. Izhar Neder & Chaviva Sirote-Katz. Credit: Tel Aviv University

Classifying Quantum Secrets: Pendulum Experiment Reveals Insights into Topological Materials

The new system allows researchers to look at phenomena that occur in special topological materials by video recording the motion of pendula.

A recent study conducted at Tel Aviv University has devised a large mechanical system that operates under dynamical rules akin to those found in quantum systems. The dynamics of quantum systems, composed of microscopic particles like atoms or electrons, are notoriously difficult, if not impossible, to observe directly. However, this new system allows researchers to visualize phenomena occurring in specialized topological materials through the movement of a system of coupled pendula. The research is a collaboration between Dr. Izhar Neder of the Soreq Nuclear Research Center, Chaviva Sirote-Katz of the Department of Biomedical Engineering, Dr. Meital Geva and Prof. Yair Shokef of the School of Mechanical Engineering, and Prof. Yoav Lahini and Prof. Roni Ilan of the School of Physics and Astronomy at Tel Aviv University and was recently published in the Proceedings of the National Academy of Sciences of the USA (PNAS).

A video of the pendula's motion in the experiment may be found here: https://youtu.be/HGheBTLtLxM?si=d8qt0akAnnRvKo4d

Quantum mechanics governs the microscopic world of electrons, atoms and molecules. An electron, which is a particle that moves in an atom or in a solid, may have properties that give rise to wave-like phenomena. For instance, it may demonstrate a probability of dispersing in space similar to waves spreading out in a pool after a stone is thrown in, or the capability to exist simultaneously in more than one place.

Such wave-like properties lead to a unique phenomenon that appears in some solid insulators: even though no electric current flows through them and the electrons do not move in response to an external electric voltage, the internal arrangement of the material shows up in a state referred to as topological. This means that the wave of electrons possesses a quantity that can close on itself in different ways, somewhat like the difference between a cylinder and a Möbius strip. This topological state of the electrons, for which the 2016 Nobel Prize in Physics was awarded, is considered a new state of matter and attracts much current research.

Despite the theoretical interest, there is a limitation in measuring these phenomena in quantum systems. Due to the nature of quantum mechanics, one cannot directly measure the electron's wave function and its dynamical evolution. Instead, researchers indirectly measure the wave-like and topological properties of electrons in materials, for instance by measuring the electrical conductivity at the edges of solids.

In the current study, the researchers considered the possibility of constructing a sufficiently large mechanical system that would adhere to dynamical rules akin to those found in quantum systems, and in which they could directly measure everything. To this end, they built an array of 50 pendula, with string lengths that varied slightly from one pendulum to the next. The strings of each neighboring pair of pendula were connected at a controlled height, such that each one's motion affects its neighbors' motion.

On one hand, the system obeyed Newton's laws of motion, which govern the physics of our everyday lives; on the other hand, the precise lengths of the pendula and the connections between them created a magical phenomenon: Newton's laws caused the wave of the pendula's motion to approximately obey Schrödinger's equation, the fundamental equation of quantum mechanics, which governs the motion of electrons in atoms and in solids. Therefore, the motion of the pendula, which is visible in the macroscopic world, reproduced behaviors of electrons in periodic systems such as crystals.
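
The following is a minimal numerical sketch (not the Tel Aviv group's analysis code) of this idea: an array of coupled pendula evolved purely with Newton's laws behaves, in the weak-coupling limit, like a discrete Schrödinger (tight-binding) equation for the slowly varying envelope of the motion. Uniform string lengths and all parameter values are illustrative assumptions; the real experiment slightly detuned the lengths to engineer the band structure.

```python
# Integrate Newton's equations for 50 weakly coupled pendula and watch a
# localized excitation spread like a quantum wave packet on a lattice.
import numpy as np

g, length, k = 9.81, 1.0, 0.5        # gravity, string length, weak coupling (assumed)
n = 50
omega2 = g / length                  # on-site (pendulum) frequency squared
sites = np.arange(n)

def accelerations(x):
    a = -omega2 * x
    a[:-1] += k * (x[1:] - x[:-1])   # coupling to the right neighbor
    a[1:]  += k * (x[:-1] - x[1:])   # coupling to the left neighbor
    return a

def envelope_width(x):
    w = x**2 / (x**2).sum()          # rough proxy for the envelope
    c = (sites * w).sum()
    return np.sqrt((((sites - c) ** 2) * w).sum())

# Velocity-Verlet integration of a localized initial push.
x = np.exp(-0.5 * ((sites - 25) / 3.0) ** 2)
v = np.zeros(n)
dt, steps = 1e-3, 200_000            # 200 seconds of simulated time

print("initial envelope width:", envelope_width(x))
for _ in range(steps):
    a = accelerations(x)
    v += 0.5 * dt * a
    x += dt * v
    v += 0.5 * dt * accelerations(x)
print("final envelope width:  ", envelope_width(x))   # the packet has spread, wave-like
```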

The researchers pushed a few pendula and then released them. This generated a wave that propagated freely along the chain of pendula, and the researchers could directly measure the evolution of this wave, an impossible mission for the motion of electrons. This enabled direct measurement of three phenomena. The first phenomenon, known as Bloch oscillations, occurs when electrons within a crystal are influenced by an electric voltage pulling them in a specific direction. In contrast to what one would expect, the electrons do not simply move along the direction of the field; they oscillate back and forth due to the periodic structure of the crystal. This phenomenon is predicted to appear in ultra-clean solids, which are very hard to find in nature. In the pendula system, the wave periodically moved back and forth, exactly according to Bloch's prediction.
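
Bloch oscillations can be reproduced in a few lines from the discrete Schrödinger (tight-binding) equation that the pendula approximately realize: a constant force tilts the lattice, and a wave packet oscillates instead of accelerating away. The hopping J, tilt F, and packet parameters below are illustrative assumptions, not the experimental values.

```python
# Tight-binding chain with a linear tilt: the packet's mean position is
# periodic with the Bloch period T = 2*pi / F (hbar = 1, lattice spacing 1).
import numpy as np

n, J, F = 60, 1.0, 0.5
sites = np.arange(n)
H = -J * (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)) + F * np.diag(sites)

# Gaussian packet with a small initial momentum kick (assumed shape).
psi = np.exp(-0.5 * ((sites - n / 2) / 5.0) ** 2) * np.exp(1j * 0.3 * sites)
psi /= np.linalg.norm(psi)

# Evolve with the exact propagator and track the mean position.
vals, vecs = np.linalg.eigh(H)
for t in np.linspace(0.0, 2 * np.pi / F, 5):
    psi_t = vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi))
    print(f"t = {t:6.2f}  <position> = {(sites * np.abs(psi_t)**2).sum():.2f}")
```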

The second phenomenon that was directly measured in the pendula system is called Zener tunneling. Tunneling is a unique quantum phenomenon that allows particles to pass through barriers, in contrast to classical intuition. In Zener tunneling, this appears as the splitting of a wave, the two parts of which then move in opposite directions. One part of the wave returns as in Bloch oscillations, while the other part tunnels through a forbidden state and continues to propagate. This splitting, and specifically its connection to the motion of the wave in either direction, is a clear characteristic of the Schrödinger equation. In fact, such a phenomenon is what disturbed Schrödinger and is the main reason he suggested his famous paradox: according to Schrödinger's equation, the wave of an entire cat can split between a live-cat state and a dead-cat state. The researchers analyzed the pendula motion and extracted the parameters of the dynamics, for instance the ratio between the amplitudes of the two parts of the split wave, which is equivalent to the quantum Zener tunneling probability. The experimental results showed excellent agreement with the predictions of Schrödinger's equation.
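
The splitting ratio can be illustrated with the textbook two-level (Landau-Zener) model: a state swept through an avoided crossing splits between the two branches, and the numerically integrated Schrödinger equation can be compared against the standard formula P = exp(-pi * Delta**2 / (2 * v)) (hbar = 1). The gap Delta, sweep rate v, and time window below are assumptions for illustration, unrelated to the pendula parameters.

```python
# Sweep a two-level system through an avoided crossing and compare the
# numerical "tunneling" probability with the Landau-Zener formula.
import numpy as np
from scipy.linalg import expm

Delta, v = 0.5, 1.0        # gap at closest approach, sweep rate (assumed)
dt, T = 0.01, 40.0

psi = np.array([1.0 + 0j, 0.0])            # start in one diabatic state
for t in np.arange(-T, T, dt):
    H = np.array([[-v * t / 2, Delta / 2],
                  [Delta / 2,  v * t / 2]])
    psi = expm(-1j * H * dt) @ psi

print("numerical tunneling probability:", abs(psi[0]) ** 2)
print("Landau-Zener formula:           ", np.exp(-np.pi * Delta**2 / (2 * v)))
```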

The pendula system is governed by classical physics and therefore cannot mimic the full richness of quantum systems. For instance, in quantum systems, the measurement can influence the system's behavior (and cause Schrödinger's cat to eventually be dead or alive when it is viewed). In the classical system of macroscopic pendula there is no counterpart to this phenomenon. However, even with these limitations, the pendula array allows observing interesting and non-trivial properties of quantum systems that may not be directly measured in the latter.

The third phenomenon that was directly observed in the pendula experiment was the wave evolution in a topological medium. Here, the researchers found a way to directly measure the topological characteristic from the wave dynamics in the system, a task that is almost impossible in quantum materials. To this end, the pendula array was tuned twice, so that it would mimic Schrödinger's equation for the electrons once in a topological state and once in a trivial (i.e., standard) state. By comparing small differences in the pendula motion between the two experiments, the researchers could classify the two states. The classification required a very delicate measurement: a difference between the two experiments of exactly half a period of oscillation of a single pendulum after 400 full oscillations lasting 12 minutes. This small difference was found to be consistent with the theoretical prediction.

The experiment opens the door to realizing further situations that are even more interesting and complex, such as the effects of noise and impurities, or of energy leakage, on wave dynamics in Schrödinger's equation. These effects can be easily realized and observed in this system by deliberately perturbing the pendula motion in a controlled manner.

Link to the article:

https://www.pnas.org/doi/abs/10.1073/pnas.2310715121

Proceedings of the National Academy of Sciences

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.
