Category Archives: Quantum Physics
Uniting Einstein’s gravity with Quantum mechanics – Open Access Government
The ongoing challenge in physics has been to harmonise these two theories, each governing different aspects of the universe.
Quantum theory deals with the microscopic realm, while general relativity explains gravity through the curvature of spacetime.
Previous approaches focused on modifying Einstein's theory to align with quantum mechanics, but UCL's new theory challenges this consensus.
Led by Professor Jonathan Oppenheim, the theory proposes an alternative perspective, suggesting that spacetime may remain classical, untouched by quantum principles.
Named "a postquantum theory of classical gravity," it argues that the breakdown in predictability lies within quantum theory itself, leading to random and violent fluctuations in spacetime.
These fluctuations, larger than those predicted by quantum theory, would make an object's weight unpredictable if measured precisely enough.
To put their theory to the test, UCL researchers proposed an experiment. The experiment involves measuring a mass with extreme precision to observe potential fluctuations in its weight over time.
The outcome of this experiment, or any evidence confirming the nature of spacetime, is the subject of an ongoing debate between Professor Oppenheim and proponents of other quantum gravity theories, such as string theory and loop quantum gravity.
The UCL research group has tested the theory's implications over the past five years.
Professor Oppenheim emphasizes the significance of resolving the mathematical incompatibility between quantum theory and general relativity, stating, "Now that we have a consistent fundamental theory in which spacetime does not get quantized, it's anybody's guess."
The proposed experiment aims to determine whether spacetime has classical attributes by detecting random fluctuations in mass, offering an alternative to experiments that seek to verify the quantum nature of spacetime through gravitationally mediated entanglement.
The theory also extends beyond gravity, challenging the conventional measurement postulate of quantum theory.
While acknowledging the experimental challenges, the researchers express optimism about the potential to unveil the fundamental nature of spacetime within the next two decades.
The UCL team's postquantum theory opens new avenues for exploring the interplay between quantum particles and classical spacetime, pushing the boundaries of our understanding of the laws governing the universe.
Infleqtion Launches Oqtant, the World's First Quantum Matter Service to Accelerate the Transition to the – PRUnderground
Infleqtion, the world's leading quantum information company, proudly announces the launch of Oqtant, the world's first quantum innovation platform as a service that provides groundbreaking access to quantum matter for researchers, innovators, and students working on next-generation quantum applications. Oqtant will be used to build more powerful and versatile solutions for new and better sensors, atomtronic circuits, and signal processing, providing the next leap in technology innovation. With access to quantum matter typically out of reach to many, Oqtant will fundamentally democratize quantum discovery and invention. Oqtant provides the core capabilities to create and manipulate matter, anywhere, anytime, by anyone with internet access.
Oqtant provides an unprecedented path to innovation through ultracold atom quantum technology; a path that begins by allowing users to create and interact with Bose-Einstein condensates (BECs), a unique state of matter where quantum effects command the collective behavior of a myriad of atoms. Oqtant represents a leap forward, enabling users to directly create, manipulate, and study quantum phenomena previously inaccessible outside of specialized research institutions. Users can seamlessly explore and unlock unparalleled insights into the rich world of quantum physics and technology. BECs are the hallmark of the quantum world, as atoms in this state exhibit remarkable properties that do not exist in the classical world.
"For the first time, everyone now has the tools to interact with and harness the building blocks of our universe at their fingertips, putting us on the precipice of a new age of discovery and exploration," said Dana Anderson, Founder and Chief Strategy Officer at Infleqtion. "BECs provide the doorway to the quantum era; they are as fundamental to quantum innovation as electrons are to electronics innovation. Harnessing quantum matter opens a realm of possibilities and solutions that were previously unimaginable. As we accelerate the adoption of quantum technology, we unlock the potential to transform the world, empowering innovators to address the most pressing challenges we face by leveraging the quantum advantage."
"This is a historic moment for the industry. Oqtant is a critical capability for quantum innovation, research, and education," said Anjul Loiacono, Vice President of Quantum Matter Platforms at Infleqtion. "We know that quantum technology will bring about new levels of precision, greater resolution, more capable signal processing, and AI-enabled quantum applications. To realize this potential, we need to expand access to the fundamental building blocks of our quantum future for innovators and give hands-on experiences to the next-generation quantum workforce."
"We are very excited to have access to Oqtant. Oqtant allows us to further our research and advance the education of our students working with next-generation ultracold atom technology," said Dr. Carrie Weidner, Lecturer in Quantum Engineering, University of Bristol. "Having remote access to a BEC system will expand our capability to explore quantum physics while building students' skills and confidence much more quickly than we can do on our own."
Oqtant will serve the research and educational communities while playing a pivotal role in the commercial quantum sensor design and development market, providing essential tools to quantum-era innovators. Oqtant was selected as one of Time Magazine's Best Inventions of 2022 and was the 2022 Prism Awards winner in Quantum.
Learn more at Q2B 2023 Silicon Valley this week by attending "Making Quantum Matter" on Wednesday, December 6th, where Paul Lipman, our Chief Commercial Officer, will present an overview of Oqtant. For a hands-on experience, visit our booth, #D4, where you can explore Oqtant, or embark on your quantum matter journey at oqtant.infleqtion.com.
About Infleqtion
Infleqtion delivers high-value quantum information precisely where it is needed. By operating at the Edge, our software-configured, quantum-enabled products deliver unmatched levels of precision and power, generating streams of high-value information for commercial organizations, the United States, and allied governments. With 16 years of ColdQuanta's pioneering quantum research as our foundation, our hardware products and AI-powered solutions address critical market needs in positioning, navigation, and timing; global communication security and efficiency; resilient energy distribution; and accelerated quantum computing. We have offices in Austin, TX; Boulder, CO; Chicago, IL; Madison, WI; Melbourne, AU; and Oxford, UK. Learn how Infleqtion is revolutionizing how we communicate, navigate, and discover at http://www.Infleqtion.com and connect with us on LinkedIn.
Stretching the Limits: How Diamond Manipulation Enhances Quantum Bits – SciTechDaily
Advancements in quantum networking have been made by stretching diamond films, enabling quantum bits to function more effectively and with less expense, marking a significant step towards practical quantum networks.
Breakthrough by Argonne, UChicago researchers could help pave the way for quantum infrastructure.
In work supported by the Q-NEXT quantum research center, scientists stretch thin films of diamond to create more cost-effective and controllable qubits.
A future quantum network may become less of a stretch thanks to researchers at the University of Chicago, Argonne National Laboratory and Cambridge University.
A team of researchers announced a breakthrough in quantum network engineering: By stretching thin films of diamond, they created quantum bits that can operate with significantly reduced equipment and expense. The change also makes the bits easier to control.
The researchers hope the findings, published on November 29 in the journal Physical Review X, can make future quantum networks more feasible.
"This technique lets you dramatically raise the operating temperature of these systems, to the point where it's much less resource-intensive to operate them," said Alex High, assistant professor with the Pritzker School of Molecular Engineering, whose lab led the study.
By stretching thin films of diamond, researchers have created quantum bits that can operate with significantly reduced equipment and expense. Credit: Peter Allen
Quantum bits, or qubits, have unique properties that make them of interest to scientists searching for the future of computing networks; for example, they could be made virtually impervious to hacking attempts. However, there are significant challenges to work out before the technology could become widespread and everyday.
One of the chief issues lies within the nodes that would relay information along a quantum network. The qubits that make up these nodes are very sensitive to heat and vibrations, so scientists must cool them to extremely low temperatures for them to work.
"Most qubits today require a special fridge the size of a room and a team of highly trained people to run it, so if you're picturing an industrial quantum network where you'd have to build one every five or 10 kilometers, now you're talking about quite a bit of infrastructure and labor," explained High.
High's lab worked with researchers from Argonne National Laboratory, a U.S. Department of Energy national lab affiliated with UChicago, to experiment with the materials these qubits are made from to see if they could improve the technology.
One of the most promising types of qubits is made from diamonds. Known as Group IV color centers, these qubits are known for their ability to maintain quantum entanglement for relatively long periods, but to do so they must be cooled down to just a smidge above absolute zero.
The team wanted to tinker with the structure of the material to see what improvements they could make, a difficult task given how hard diamond is. However, the scientists found that they could stretch the diamond at a molecular level if they laid a thin film of diamond over hot glass. As the glass cools, it shrinks at a slower rate than the diamond, slightly stretching the diamond's atomic structure, much as pavement expands or contracts as the earth warms or cools beneath it, High explained.
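The size of that stretch can be estimated from the mismatch in thermal contraction between the two materials. A back-of-envelope sketch in Python (the expansion coefficients and temperatures below are rough, illustrative assumptions, not values from the study):

```python
# Estimate the tensile strain left in a diamond film pinned to a glass
# substrate as the pair cools. Coefficients are rough room-temperature
# values chosen for illustration only.

alpha_diamond = 1.0e-6   # /K, diamond's thermal expansion coefficient
alpha_glass = 0.5e-6     # /K, a fused-silica-like glass shrinks even less
T_bond, T_operate = 300.0, 4.0   # K, assumed bonding and operating temperatures

delta_T = T_bond - T_operate
# On cooling, the diamond "wants" to shrink more than the glass it is
# bonded to, so the film is held stretched (tensile strain):
mismatch_strain = (alpha_diamond - alpha_glass) * delta_T
print(f"residual tensile strain ~ {mismatch_strain:.2e}")
```

Even a strain at this roughly 1-in-10,000 level is enough to shift a color center's energy levels, which is why such a small stretch can change the qubits' operating requirements so dramatically.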
This stretching, though it only moves the atoms apart an infinitesimal amount, has a dramatic effect on how the material behaves.
First, the qubits could now hold their coherence at temperatures up to 4 Kelvin (about -452 °F). That's still very cold, but it can be achieved with less specialized equipment. "It's an order of magnitude difference in infrastructure and operating cost," High said.
Second, the change also makes it possible to control the qubits with microwaves. Previous versions had to use light in the optical wavelengths to enter information and manipulate the system, which introduced noise and meant the reliability wasn't perfect. With the new system and microwave control, however, the fidelity rose to 99%.
"It's unusual to see improvements in both these areas simultaneously," explained Xinghan Guo, a Ph.D. student in physics in High's lab and first author on the paper.
"Usually if a system has a longer coherence lifetime, it's because it's good at ignoring outside interference, which means it is harder to control, because it's resisting that interference," he said. "It's very exciting that by making a very fundamental innovation in materials science, we were able to bridge this dilemma."
"By understanding the physics at play for Group IV color centers in diamond, we successfully tailored their properties to the needs of quantum applications," said Argonne National Laboratory scientist Benjamin Pingault, also a co-author on the study. "With the combination of prolonged coherence time and feasible quantum control via microwaves, the path to developing diamond-based devices for quantum networks is clear for tin-vacancy centers," added Mete Atatüre, a professor of physics at the University of Cambridge and a co-author on the study.
Reference: "Microwave-Based Quantum Control and Coherence Protection of Tin-Vacancy Spin Qubits in a Strain-Tuned Diamond-Membrane Heterostructure" by Xinghan Guo, Alexander M. Stramma, Zixi Li, William G. Roth, Benchen Huang, Yu Jin, Ryan A. Parker, Jesús Arjona Martínez, Noah Shofer, Cathryn P. Michaels, Carola P. Purser, Martin H. Appel, Evgeny M. Alexeev, Tianle Liu, Andrea C. Ferrari, David D. Awschalom, Nazar Delegan, Benjamin Pingault, Giulia Galli, F. Joseph Heremans, Mete Atatüre and Alexander A. High, 29 November 2023, Physical Review X. DOI: 10.1103/PhysRevX.13.041037
The researchers used the Pritzker Nanofabrication Facility and Materials Research Science and Engineering Center at UChicago.
Other study authors included Zixi Li, Benchen Huang, Yu Jin, Tianle Liu, Prof. Giulia Galli, and Prof. David Awschalom with the University of Chicago; Nazar Delegan and Benjamin Pingault with Argonne National Laboratory; and Alexander Stramma (co-first author), William Roth, Ryan Parker, Jesús Arjona Martínez, Noah Shofer, Cathryn Michaels, Carola Purser, Martin Appel, Evgeny Alexeev, and Andrea Ferrari with the University of Cambridge.
Funding: Air Force Office of Scientific Research, U.S. Department of Energy Q-NEXT National Quantum Information Science Research Center, ERC Advanced Grant PEDASTAL, EU Quantum Flagship, National Science Foundation, EPSRC/NQIT, General Sir John Monash Foundation and G-research, Winton Programme and EPSRC DTP, EU Horizon 2020 Marie Sklodowska-Curie Grant.
New theory seeks to unite gravity and quantum mechanics – ZME Science
Physicists at University College London (UCL) have introduced a theory that may resolve a longstanding paradox in modern physics. This theory bridges the gap between Einstein's general relativity and quantum mechanics, two foundational yet contradictory pillars of physics.
Einstein's theory, which describes gravity through the curvature of spacetime, stands in stark opposition to quantum mechanics, the rules governing the universe's smallest entities, where particles dance to the tune of probabilistic wave equations.
These two theories, while individually robust, clash over the fundamental nature of spacetime. Quantum mechanics treats spacetime as a fixed stage, but general relativity insists it's a dynamic actor, responding to the presence of mass.
For over a century, scientists believed that to harmonize these theories, gravity must be quantized. This belief spurred the development of string theory and loop quantum gravity.
However, the new UCL theory, spearheaded by Professor Jonathan Oppenheim, proposes an alternative route, suggesting that spacetime might remain classical. By treating gravity classically, the scientists propose merging it with quantum mechanics through a probabilistic mechanism.
Published in Physical Review X, Oppenheim's theory introduces a "postquantum theory of classical gravity." It suggests a radical shift, modifying quantum theory instead of spacetime. This model predicts that spacetime's interaction with quantum particles leads to unpredictable, violent fluctuations, challenging the precision of weight measurements (which is how the theory could be validated experimentally).
Oppenheim's innovation lies in rejecting a key assumption put forth by critics of such hybrid models: that the interaction between classical gravity and quantum matter must be reversible. He proposes a stochastic (probabilistic) model where future states are uncertain, contrasting with deterministic models where the future can be precisely predicted from the present.
Extending his theory, Oppenheim explores coupling quantum field theory (QFT) with general relativity. Here, quantum fields on curved spacetime interact with the classical metric of general relativity through his stochastic equation. This setup allows quantum fields to influence spacetimes curvature, a feature absent in existing QFT approaches.
Oppenheim's theory is radical yet conservative. It retains the classical nature of general relativity, sidestepping the conceptual challenges of quantizing spacetime. However, it also suggests that quantum information could be lost in black holes, a contentious implication.
A second study, published in Nature Communications, explores this theory's implications and outlines an experimental test. The proposed experiment involves precisely measuring a mass, such as the standard 1 kg mass at the International Bureau of Weights and Measures in France, to detect any fluctuation in its weight. If the fluctuations in measurements of this 1 kg mass are smaller than the new theory predicts, the theory can be ruled out.
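In spirit, this is a variance test: does the scatter of repeated weighings exceed what instrument noise alone explains? A toy Python sketch of the logic (the noise figure and the theory's fluctuation floor are invented numbers for illustration, not values from the paper):

```python
import random

random.seed(0)
TRUE_MASS = 1.0            # kg, the standard mass being weighed
instrument_noise = 1e-9    # kg, assumed std dev of the balance alone
theory_floor = 5e-9        # kg, hypothetical minimum scatter if spacetime is classical

# Simulate repeated weighings with only instrument noise (no spacetime effect)
readings = [TRUE_MASS + random.gauss(0.0, instrument_noise) for _ in range(10_000)]
mean = sum(readings) / len(readings)
observed_std = (sum((r - mean) ** 2 for r in readings) / (len(readings) - 1)) ** 0.5

# Observing scatter below the theory's floor would rule the theory out
ruled_out = observed_std < theory_floor
print(f"observed std {observed_std:.2e} kg vs floor {theory_floor:.0e} kg -> ruled out: {ruled_out}")
```

The real analysis would of course have to subtract every known source of instrument noise before attributing any residual scatter to spacetime itself.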
"We have shown that if spacetime doesn't have a quantum nature, then there must be random fluctuations in the curvature of spacetime which have a particular signature that can be verified experimentally," says co-author Zach Weller-Davies, who is a Ph.D. student at UCL and a key member of the team behind the new theory.
"In both quantum gravity and classical gravity, spacetime must be undergoing violent and random fluctuations all around us, but on a scale which we haven't yet been able to detect. But if spacetime is classical, the fluctuations have to be larger than a certain scale, and this scale can be determined by another experiment where we test how long we can put a heavy atom in superposition of being in two different locations."
This should be an interesting experiment once it's finally completed, since it is the subject of a bet with 5000:1 odds between Oppenheim and Professor Carlo Rovelli and Dr. Geoff Penington, the leading proponents of loop quantum gravity and string theory, respectively.
"Quantum theory and Einstein's theory of general relativity are mathematically incompatible with each other, so it's important to understand how this contradiction is resolved. Should spacetime be quantized, or should we modify quantum theory, or is it something else entirely? Now that we have a consistent fundamental theory in which spacetime does not get quantized, it's anybody's guess," said Professor Oppenheim.
In the realm of fundamental physics, Oppenheim's proposal is a daring departure from seven decades of established thought. It opens a new frontier in our quest to understand the universe's deepest mysteries, standing at the threshold of potentially reshaping our conception of reality.
But don't call the Nobel Prize committee just yet. The ultimate test for Oppenheim's theory lies in empirical validation, one we'll be following with great interest and anticipation.
Unveiling the Mysteries of Quantum Entanglement: Harnessing the Power of Spooky Action for the Future | by Tariq … – Medium
Quantum entanglement, the phenomenon famously dubbed "spooky action at a distance" by Albert Einstein, has long been a source of fascination and intrigue in the world of physics. As we delve into the intricate realm of quantum mechanics, we uncover the mesmerizing characteristics of entangled particles and the tantalizing possibilities they hold for the future of technology.
Einstein's skepticism about quantum entanglement was rooted in his discomfort with the idea that particles could instantaneously influence each other's states, regardless of the vast distances that separated them. He famously referred to this as "spooky action at a distance." However, despite Einstein's reservations, experiments conducted since then have consistently demonstrated the reality of quantum entanglement, paving the way for groundbreaking discoveries.
Characteristics of Quantum Entanglement
Instantaneous Connection: Quantum entanglement defies our classical intuition, as particles become interconnected in such a way that the state of one particle instantaneously influences the state of its entangled partner, no matter the distance between them.
Non-locality: The entanglement phenomenon exhibits non-locality, where the properties of one particle depend on measurements made on its entangled partner, even when the particles are separated by vast distances. This seemingly faster-than-light connection challenges our understanding of causality.
Quantum Superposition: Entangled particles exist in a state of superposition, meaning they can exist in multiple states simultaneously until a measurement is made. This unique characteristic forms the basis for quantum computing and quantum communication.
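These characteristics can be seen in a tiny simulation. The sketch below (plain NumPy, no quantum SDK) samples measurements of the two-qubit Bell state (|00⟩ + |11⟩)/√2: each qubit's outcome on its own looks random, yet the pair always agrees:

```python
import numpy as np

rng = np.random.default_rng(42)

# Amplitudes of the Bell state over the basis |00>, |01>, |10>, |11>
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule: measurement probabilities

samples = rng.choice(4, size=1_000, p=probs)   # simulated joint measurements
qubit_a, qubit_b = samples // 2, samples % 2   # decode the two bits

print("A's marginal (roughly 50/50):", qubit_a.mean())
print("A and B always agree:", bool((qubit_a == qubit_b).all()))
```

Note that this classical sampling reproduces the correlations of a single fixed basis; the genuinely quantum part of entanglement, violation of Bell inequalities across different measurement bases, has no such classical shortcut.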
Potential Applications
Quantum Computing: One of the most promising applications of quantum entanglement lies in the realm of quantum computing. Entangled qubits can tackle certain problems exponentially faster than classical bits, potentially revolutionizing fields such as cryptography, optimization, and simulation.
Quantum Communication: Harnessing entangled particles for quantum communication enables the creation of unbreakable quantum key distribution (QKD) systems. These systems leverage the properties of entanglement to secure communication channels, offering a new era of ultra-secure communication.
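For concreteness, here is a toy of QKD's core mechanics using the textbook BB84 prepare-and-measure protocol (entanglement-based variants such as E91 rest on the same sift-and-compare principle). It is an idealized, noiseless simulation, not a secure implementation:

```python
import random

random.seed(7)
n = 64

alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("ZX") for _ in range(n)]   # random encoding bases
bob_bases = [random.choice("ZX") for _ in range(n)]     # random measurement bases

# Without noise or eavesdropping, Bob recovers Alice's bit exactly when
# their bases match; in a mismatched basis his outcome is a coin flip.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: the two publicly compare bases (never bits) and keep matches.
matches = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in matches]
key_bob = [bob_bits[i] for i in matches]

print(f"sifted key: {len(key_alice)} of {n} raw bits, identical: {key_alice == key_bob}")
```

An eavesdropper who measures in the wrong basis disturbs the qubits, so in the full protocol Alice and Bob also sacrifice a random subset of the sifted key to check for an elevated error rate.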
Quantum Sensing: Quantum entanglement can be harnessed for ultra-precise sensing and measurement devices. Quantum sensors based on entanglement could revolutionize fields like navigation, imaging, and gravitational wave detection.
Realizing the full potential of quantum entanglement requires overcoming significant challenges. Scientists are actively working on improving the fidelity and distance over which entanglement can be maintained. Additionally, mitigating the effects of environmental factors that can disrupt entanglement is crucial for practical applications.
As we continue to unravel the mysteries of quantum entanglement, we stand on the brink of a technological revolution. The characteristics that once perplexed Einstein may soon power the next generation of computing, communication, and sensing technologies. Embracing "spooky action at a distance" opens the door to a future where the unimaginable becomes achievable, and the quantum world becomes a playground for innovation and discovery.
IBM Offers Quantum Error Suppression Out Of The Box – IEEE Spectrum
The error-prone nature of today's quantum computers can make doing any useful computation on them a major headache. IBM has announced that, as of this past week, it has integrated error-suppression technology from Q-CTRL into IBM cloud quantum services, letting users slash error rates by simply flicking a switch.
Computers that harness the unusual properties of quantum mechanics will ultimately be capable of computational feats beyond even the most powerful supercomputers. But the quantum states that make this possible are incredibly fragile and susceptible to noise, which means carrying out operations before they are overwhelmed by errors is a significant challenge.
It's widely accepted that large-scale quantum computers will require some form of error correction. The leading schemes involve spreading information over a large number of physical qubits to create more robust logical qubits. But this can require as many as a thousand physical qubits for each logical one. Given that today's largest processors feature just hundreds of qubits, error-corrected quantum computing is still a distant prospect.
In the meantime, the start-up Q-CTRL, based in Sydney, Australia, says the best way to tame unruly, near-term quantum processors is error suppression, which involves altering how you operate the underlying hardware to reduce the likelihood of errors. Using a combination of techniques, the company says its software can boost the chances of an algorithm running successfully by several orders of magnitude. And now IBM has integrated the technology into its quantum cloud offerings.
"It's a bit of a dirty secret in our sector that the typical user experience is very rarely at the limit of what the hardware could provide," says Q-CTRL CEO and founder Michael Biercuk. "That's because the performance is effectively buried by all the sources of noise and interference and error. We, through our performance management solution, suppress that error in a way that allows a user to immediately access just about the best the hardware can theoretically deliver."
The company's software requires no configuration by IBM's end users. Customers accessing Big Blue's quantum hardware over the cloud via its Pay-As-You-Go plan will simply see a performance-management option that can be toggled on and off. Flicking the switch engages an automated set of several different software modules that run in the background to optimize the way the user's algorithm runs on the hardware.
According to Biercuk, Q-CTRL's quantum compiler mathematically optimizes the number of logic gates required to run an algorithm before subjecting this minimal circuit to several further error-suppressing steps. For a start, the gates are mapped onto the hardware's qubits in such a way as to avoid the most error-prone layouts, based on a catalog of pre-performed test measurements. The circuit is also interleaved with operations that use a technique known as dynamical decoupling, in which qubits are subjected to control pulses designed to cancel out the effects of crosstalk from other nearby qubits.
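The simplest instance of dynamical decoupling is the Hahn spin echo: a π pulse halfway through the evolution flips the qubit so that phase picked up from a static stray field in the first half cancels in the second. A toy ensemble average (arbitrary units, parameters chosen for illustration) shows the effect:

```python
import numpy as np

rng = np.random.default_rng(1)
deltas = rng.normal(0.0, 1.0, size=5_000)   # random static detunings, one per shot
t = 2.0                                     # total free-evolution time

# Free evolution: each shot accumulates phase delta * t; averaging the
# phases over the ensemble washes out the coherence.
coh_free = abs(np.mean(np.exp(-1j * deltas * t)))

# Spin echo: a pi pulse at t/2 negates the phase accumulated so far, so
# the second half of the evolution exactly cancels the first (for static
# detunings the net phase is zero on every shot).
phase_echo = -(deltas * t / 2) + (deltas * t / 2)
coh_echo = abs(np.mean(np.exp(-1j * phase_echo)))

print(f"coherence without echo: {coh_free:.2f}, with echo: {coh_echo:.2f}")
```

Real crosstalk is not perfectly static, which is why production sequences use trains of pulses (CPMG, XY8, and similar) rather than a single echo, but the cancellation principle is the same.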
Separately, Biercuk says, AI designed by Q-CTRL regularly redefines the machine language used to implement circuits on the hardware. Every six to twelve hours, he says, the AI runs through multiple test circuits to check how various potential gate implementations contribute to errors. The results are then compiled in a lookup table that the compiler uses to build the circuit.
Finally, once the circuit has run, the software carries out a final post-processing step on the results designed to catch measurement errors. "One of the ways that quantum computers fail is that the algorithm actually runs correctly, but when you look at the answer to read it out, that process is faulty," says Biercuk, who is also a professor of quantum physics at the University of Sydney. Q-CTRL's software uses a neural network to learn patterns in the way measurement errors occur in the hardware. The system then uses these patterns to offset any errors in the readout.
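A linear cousin of this idea, standard readout-error mitigation, conveys the principle: characterize how often each true state is misread, then invert that map on the observed counts. (Q-CTRL's production system uses a learned neural-network model rather than this simple matrix inversion, and the numbers below are made up.)

```python
import numpy as np

# Assumed calibration: M[i, j] = P(read outcome i | qubit prepared in state j),
# measured beforehand by preparing known basis states and recording readouts.
M = np.array([[0.97, 0.05],
              [0.03, 0.95]])

# Raw counts observed from a circuit whose ideal output is 50/50
observed = np.array([520.0, 480.0])

# Undo the readout map by solving M @ true_counts = observed
mitigated = np.linalg.solve(M, observed)
print("mitigated counts:", np.round(mitigated, 1))
```

In practice the inverted counts can come out slightly negative due to statistical noise, so real implementations constrain the solution to a valid probability distribution.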
With all these error-mitigation efforts working in concert, the different modules boost the chances that an algorithm will run successfully, which is not a surefire thing when it comes to quantum computers. In peer-reviewed research published in Physical Review Applied in August, the company tested its error-suppression technology on several popular quantum algorithms and showed that it could boost performance by as much as 1,000 times. (The quantum circuits Q-CTRL tested its error-suppression protocols on in the August paper were more limited in size. Contacted by Spectrum about the discrepancy between the size of the quantum circuits described in the graph above and those depicted in the August paper, Biercuk emailed back: "It wasn't published in the previous paper, because at the time those larger devices didn't exist yet.")
The company's software works with any kind of quantum computing hardware, says Biercuk, be that trapped ions, superconducting qubits, or cold atoms. And while configuring the software for a particular processor takes some time and effort, there is no extra computational cost for the user at runtime.
Biercuk thinks an out-of-the-box solution for error suppression could be a boon for many of the companies interested in quantum computing. Typically, they have focused on algorithm development, but error-suppression requires expertise in low-level hardware manipulation.
"It's a little bit like asking a web developer who's building Salesforce or Facebook, can you please start programming at the level of the voltages on the transistors," says Biercuk. "We break the link between the underlying quantum processor and the way it has to be manipulated, and what the application-focused end-user programs. That's a huge change."
Quantum hardware is still some way from being able to compete with classical computers on practical problems, admits Biercuk, but a number of companies have been testing out the technology ahead of the integration with IBM.
"We have previously explored Q-CTRL's performance-management capabilities and were impressed by the order-of-magnitude improvement seen across both the inverse quantum Fourier transform and quantum phase estimation," Julian van Velzen, CTIO and Head of Quantum Lab at Capgemini in Utrecht, the Netherlands, said in a statement. "With this technology natively embedded within IBM Quantum services, we can get more value from current hardware and push our applications further."
Radical theory could unite Albert Einstein’s concept of gravity with quantum mechanics – Study Finds
LONDON – Albert Einstein once said, "Logic will get you from A to B. Imagination will take you everywhere." Now, thanks to the creative power of University College London physicists, a new theory is challenging the longstanding discrepancy between Einstein's theory of general relativity and quantum theory. This new theory reconciles the concepts of gravity and quantum mechanics, while still holding onto the famous physicist's understanding of spacetime.
At the core of modern physics lie two fundamental pillars: quantum theory, governing the behavior of the tiniest particles, and Einstein's theory of general relativity, explaining gravity through the bending of spacetime. However, these theories clash, presenting a challenge that has persisted for over a century.
Scientists traditionally believed that reconciling these theories required modifying Einstein's theory to fit within quantum theory, as attempted by leading approaches like string theory and loop quantum gravity. However, the new theory challenges this consensus.
In the theory, termed "a postquantum theory of classical gravity," quantum theory, rather than spacetime, undergoes modification. This suggests that spacetime might not be governed by quantum theory at all. Instead, the theory predicts unpredictable fluctuations in spacetime larger than those envisaged under quantum theory, causing objects' apparent weights to become unpredictable under sufficiently precise measurement.
A companion paper explores the theory's implications and proposes an experiment to test it. This experiment involves precisely measuring the weight of a one-kilogram mass over time at the International Bureau of Weights and Measures in France. If the measured fluctuations fall below the scale the theory predicts, the theory would be refuted.
For five years, the UCL research group has scrutinized and tested this theory.
"Quantum theory and Einstein's theory of general relativity are mathematically incompatible with each other, so it's important to understand how this contradiction is resolved. Should spacetime be quantized, or should we modify quantum theory, or is it something else entirely?" asks study author Jonathan Oppenheim, professor of physics and astronomy at UCL, in a university release. "Now that we have a consistent fundamental theory in which spacetime does not get quantized, it's anybody's guess."
Study co-author Zach Weller-Davies, who as a PhD student at UCL helped develop the experimental proposal and made key contributions to the theory itself, emphasized the experimental implications.
"In both quantum gravity and classical gravity, spacetime must be undergoing violent and random fluctuations all around us, but on a scale which we haven't yet been able to detect," explains Weller-Davies. "But if spacetime is classical, the fluctuations have to be larger than a certain scale, and this scale can be determined by another experiment where we test how long we can put a heavy atom in a superposition of being in two different locations."
Study co-author Dr. Barbara Šoda expressed hope that these experiments could settle the debate about pursuing a quantum theory of gravity.
"Because gravity is made manifest through the bending of space and time, we can think of the question in terms of whether the rate at which time flows has a quantum nature, or classical nature," notes Dr. Šoda. "And testing this is almost as simple as testing whether the weight of a mass is constant, or appears to fluctuate in a particular way."
The theory's origins lie in resolving the black hole information problem: it allows for information destruction due to a breakdown in predictability, a departure from standard quantum theory.
This novel theory not only challenges current paradigms but also offers new perspectives for experimental testing, potentially reshaping our understanding of gravity and spacetime.
The study is published in the journal Physical Review X.
Superconductors’ Secret: Old Physics Law Stands the Test of Time in Quantum Material Conundrum – SciTechDaily
A new study argues that the Wiedemann-Franz law, linking electronic and thermal conductivity, is still valid for copper oxide superconductors. The research suggests that discrepancies in quantum materials stem from non-electronic factors like lattice vibrations. This finding is significant for understanding unconventional superconductors and may lead to advancements in the field.
This surprising result is important for understanding unconventional superconductors and other materials where electrons band together to act collectively.
Long before researchers discovered the electron and its role in generating electrical current, they knew about electricity and were exploring its potential. One thing they learned early on was that metals were great conductors of both electricity and heat.
In 1853, two scientists showed that those two admirable properties of metals were somehow related: at any given temperature, the ratio of electronic conductivity to thermal conductivity was roughly the same in any metal they tested. This so-called Wiedemann-Franz law has held ever since, except in quantum materials, where electrons stop behaving as individual particles and glom together into a sort of electron soup. Experimental measurements have indicated that the 170-year-old law breaks down in these quantum materials, and by quite a bit.
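The constant of proportionality the law fixes is the Lorenz number, which ties thermal conductivity to electrical conductivity and temperature. A quick sanity check of the textbook value; the copper conductivity used here is a standard handbook figure, included purely for illustration:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
E_CHARGE = 1.602176634e-19  # elementary charge, C

# Sommerfeld value of the Lorenz number: L0 = (pi^2 / 3) * (k_B / e)^2
L0 = (math.pi**2 / 3) * (K_B / E_CHARGE) ** 2
print(f"L0 = {L0:.4e} W*Ohm/K^2")  # approximately 2.44e-8

def wf_thermal_conductivity(sigma, temperature):
    """Thermal conductivity predicted by the Wiedemann-Franz law
    for a metal with electrical conductivity sigma (S/m) at the
    given temperature (K): kappa = L0 * sigma * T."""
    return L0 * sigma * temperature

# Copper at room temperature (sigma ~ 5.96e7 S/m, T = 300 K);
# the measured value is roughly 400 W/(m*K), in the same ballpark.
kappa = wf_thermal_conductivity(5.96e7, 300.0)
print(f"kappa = {kappa:.0f} W/(m*K)")
```

Deviations from this ratio, far beyond such back-of-the-envelope agreement, are what experiments on quantum materials have reported.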
An illustration shows strongly interacting electrons carrying heat and charge from warmer to cooler regions of a quantum material. A theoretical study by SLAC, Stanford, and the University of Illinois found that the ratio of heat transport to charge transport in cuprates (quantum materials like this one, where electrons glom together and act cooperatively) should be similar to the ratio in normal metals, where electrons behave as individuals. This surprising result overturns the idea that the 170-year-old Wiedemann-Franz law does not apply to quantum materials. Credit: Greg Stewart/SLAC National Accelerator Laboratory
Now, a theoretical argument put forth by physicists at the Department of Energy's SLAC National Accelerator Laboratory, Stanford University, and the University of Illinois suggests that the law should, in fact, approximately hold for one type of quantum material: the copper oxide superconductors, or cuprates, which conduct electricity with no loss at relatively high temperatures.
In a paper published in the journal Science on November 30, they propose that the Wiedemann-Franz law should still roughly hold if one considers only the electrons in cuprates. They suggest that other factors, such as vibrations in the material's atomic latticework, must account for experimental results that make it look like the law does not apply.
"This surprising result is important to understanding unconventional superconductors and other quantum materials," said Wen Wang, lead author of the paper and a PhD student with the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC.
"The original law was developed for materials where electrons interact with each other weakly and behave like little balls that bounce off defects in the material's lattice," Wang said. "We wanted to test the law theoretically in systems where neither of these things was true."
Superconducting materials, which carry electric current without resistance, were discovered in 1911. But they operated at such extremely low temperatures that their usefulness was quite limited.
That changed in 1986, when the first family of so-called high-temperature or unconventional superconductors, the cuprates, was discovered. Although cuprates still require extremely cold conditions to work their magic, their discovery raised hopes that superconductors could someday work at much closer to room temperature, making revolutionary technologies like no-loss power lines possible.
After nearly four decades of research, that goal is still elusive, although a lot of progress has been made in understanding the conditions in which superconducting states flip in and out of existence.
Theoretical studies, performed with the help of powerful supercomputers, have been essential for interpreting the results of experiments on these materials and for understanding and predicting phenomena that are out of experimental reach.
For this study, the SIMES team ran simulations based on what's known as the Hubbard model, which has become an essential tool for simulating and describing systems where electrons stop acting independently and join forces to produce unexpected phenomena.
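To give a sense of what the Hubbard model looks like, here is the smallest possible instance: two sites at half filling, with hopping amplitude t and on-site repulsion U, solved by exact diagonalization. The SIMES study simulated far larger clusters on supercomputers; this sketch only shows the model's structure and checks the well-known closed-form ground-state energy.

```python
import numpy as np

def two_site_hubbard(t, U):
    """Half-filled two-site Hubbard Hamiltonian in the Sz = 0 sector.
    Basis: |up, down>, |down, up>, |updown, 0>, |0, updown>,
    where 'updown' marks a doubly occupied site (energy cost U)."""
    return np.array([
        [0.0, 0.0,  -t,  -t],
        [0.0, 0.0,   t,   t],
        [ -t,   t,   U, 0.0],
        [ -t,   t, 0.0,   U],
    ])

t, U = 1.0, 4.0
energies = np.linalg.eigvalsh(two_site_hubbard(t, U))

# Known closed-form ground-state energy: (U - sqrt(U^2 + 16 t^2)) / 2
e0_exact = (U - np.hypot(U, 4 * t)) / 2
print(energies[0], e0_exact)
```

Even this toy shows the competition the full model captures: hopping t delocalizes electrons while U penalizes double occupancy, and the real many-site version requires heavy numerics.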
"The results show that when you only take electron transport into account, the ratio of electronic conductivity to thermal conductivity approaches what the Wiedemann-Franz law predicts," Wang said. "So, the discrepancies that have been seen in experiments should be coming from other things, like phonons, or lattice vibrations, that are not in the Hubbard model," she said.
SIMES staff scientist and paper co-author Brian Moritz said that although the study did not investigate how vibrations cause the discrepancies, "somehow the system still knows that there is this correspondence between charge and heat transport amongst the electrons. That was the most surprising result."
"From here," he added, "maybe we can peel the onion to understand a little bit more."
Reference: "The Wiedemann-Franz law in doped Mott insulators without quasiparticles" by Wen O. Wang, Jixun K. Ding, Yoni Schattner, Edwin W. Huang, Brian Moritz and Thomas P. Devereaux, 30 November 2023, Science. DOI: 10.1126/science.ade3232
Major funding for this study came from the DOE Office of Science. Computational work was carried out at Stanford University and on resources of the National Energy Research Scientific Computing Center, which is a DOE Office of Science user facility.
Unlocking the Quantum Realm: A New Tool for Uncharted Phenomena – SciTechDaily
The temperature profiles obtained by the researchers show that particles that interact strongly with the environment are hot (red) and those that interact little are cold (blue). Entanglement is therefore large where the interaction between particles is strong. Credit: Helene Hainzer
Predictions of quantum field theory experimentally confirmed for the first time.
Entanglement is a quantum phenomenon where the properties of two or more particles become interconnected in such a way that one cannot assign a definite state to each individual particle anymore. Rather, we have to consider all particles at once that share a certain state. The entanglement of the particles ultimately determines the properties of a material.
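A minimal numerical illustration of this definition, using the generic textbook example of a two-qubit Bell state rather than the Innsbruck system: tracing out one particle of an entangled pair leaves the other in a maximally mixed state, i.e., with no definite individual state.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2): two entangled qubits.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)        # density matrix of the pair

# Trace out qubit B to get the reduced state of qubit A alone.
rho4 = rho.reshape(2, 2, 2, 2)  # index order: (a, b, a', b')
rho_A = np.einsum('abcb->ac', rho4)  # sum over b = b'
print(rho_A)  # 0.5 * identity: qubit A alone has no definite state
```

The reduced state is half the identity matrix: every measurement on qubit A alone is completely random, even though the pair as a whole is in a perfectly definite pure state.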
"Entanglement of many particles is the feature that makes the difference," emphasizes Christian Kokail, one of the first authors of the paper now published in Nature. "At the same time, however, it is very difficult to determine."
The researchers led by Peter Zoller at the University of Innsbruck and the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences (ÖAW) now provide a new approach that can significantly improve the study and understanding of entanglement in quantum materials. To describe large quantum systems and extract information about their entanglement, one would naively need to perform an impossibly large number of measurements.
"We have developed a more efficient description that allows us to extract entanglement information from the system with drastically fewer measurements," explains theoretical physicist Rick van Bijnen.
In an ion-trap quantum simulator with 51 particles, the scientists imitated a real material by recreating it particle by particle and studying it in a controlled laboratory environment. Few research groups worldwide have the control over so many particles that the Innsbruck experimental physicists led by Christian Roos and Rainer Blatt do.
"The main technical challenge we face here is how to maintain low error rates while controlling 51 ions trapped in our trap and ensuring the feasibility of individual qubit control and readout," explains experimentalist Manoj Joshi. In the process, the scientists witnessed for the first time effects in the experiment that had previously only been described theoretically.
"Here we have combined knowledge and methods that we have painstakingly worked out together over the past years. It's impressive to see that you can do these things with the resources available today," says an excited Christian Kokail, who recently joined the Institute for Theoretical Atomic Molecular and Optical Physics at Harvard.
In a quantum material, particles can be more or less strongly entangled. Measurements on a strongly entangled particle yield only random results. If the results of the measurements fluctuate very much, i.e., if they are purely random, then scientists refer to this as "hot." If the probability of a certain result increases, it is a "cold" quantum object. Only the measurement of all entangled objects reveals the exact state.
In systems consisting of very many particles, the effort for the measurement increases enormously. Quantum field theory has predicted that subregions of a system of many entangled particles can be assigned a temperature profile. These profiles can be used to derive the degree of entanglement of the particles.
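One standard way to quantify the entanglement of a subregion is its von Neumann entropy, computed from the Schmidt decomposition across the cut. The sketch below uses an 8-qubit toy state, not the 51-ion simulator or the feedback-loop method the Innsbruck team developed, purely to show the quantity involved.

```python
import numpy as np

def subregion_entropy(psi, n_sites, region_size):
    """Von Neumann entanglement entropy of the first `region_size`
    qubits of an n_sites-qubit pure state psi (in nats)."""
    m = psi.reshape(2**region_size, 2**(n_sites - region_size))
    s = np.linalg.svd(m, compute_uv=False)  # Schmidt coefficients
    p = s**2
    p = p[p > 1e-12]                        # drop numerical zeros
    return float(-(p * np.log(p)).sum())

n = 8
rng = np.random.default_rng(1)

# A product state: zero entanglement across any bipartition.
product = np.zeros(2**n)
product[0] = 1.0
e_prod = subregion_entropy(product, n, n // 2)

# A random state: substantial half-chain entanglement, approaching
# the maximum of 4 * ln 2 for a 4-qubit subregion.
psi = rng.normal(size=2**n)
psi /= np.linalg.norm(psi)
e_rand = subregion_entropy(psi, n, n // 2)
print(e_prod, e_rand)
```

The measurement cost of estimating such entropies directly is what blows up with system size, motivating the more efficient temperature-profile description above.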
In the Innsbruck quantum simulator, these temperature profiles are determined via a feedback loop between a computer and the quantum system, with the computer constantly generating new profiles and comparing them with the actual measurements in the experiment. The temperature profiles obtained by the researchers show that particles that interact strongly with the environment are hot and those that interact little are cold. "This is exactly in line with expectations that entanglement is particularly large where the interaction between particles is strong," says Christian Kokail.
"The methods we have developed provide a powerful tool for studying large-scale entanglement in correlated quantum matter. This opens the door to the study of a new class of physical phenomena with quantum simulators that already are available today," says quantum mastermind Peter Zoller. "With classical computers, such simulations can no longer be computed with reasonable effort."
The methods developed in Innsbruck will also be used to test new theories on such platforms.
The results have been published in Nature.
Reference: "Exploring large-scale entanglement in quantum simulation" by Manoj K. Joshi, Christian Kokail, Rick van Bijnen, Florian Kranzl, Torsten V. Zache, Rainer Blatt, Christian F. Roos and Peter Zoller, 29 November 2023, Nature. DOI: 10.1038/s41586-023-06768-0
Financial support for the research was provided by the Austrian Science Fund FWF, the Austrian Research Promotion Agency FFG, the European Union, the Federation of Austrian Industries Tyrol and others.
IBM Quantum Debuts 133-Qubit Processor and Quantum System Two – High-Performance Computing News Analysis – insideHPC
Today, at its annual IBM Quantum Summit in New York, the company debuted the 133-qubit Quantum Heron, the first in what IBM said will be a series of utility-scale quantum processors whose architecture has been engineered over the past four years. IBM said the processor "deliver(s) IBM's highest performance metrics and lowest error rates of any IBM Quantum processor to date."
IBM also unveiled IBM Quantum System Two, a modular quantum computer and part of IBM's quantum-centric supercomputing architecture. The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.
The consensus among industry analysts is that in its current stage of development, quantum has attracted attention from organizations exploring its potential for real workloads. Touting demonstrations of its 127-qubit Quantum Eagle processor earlier this year, IBM said more organizations are looking at quantum as a scientific tool for utility-scale classes of problems in chemistry, physics, and materials beyond brute-force classical simulation of quantum mechanics. Those organizations include the U.S. Department of Energy's Argonne National Lab, the universities of Tokyo, Washington, and Cologne, Harvard, Qedma, Algorithmiq, UC Berkeley, Q-CTRL, Fundacion Ikerbasque, Donostia International Physics Center, and the University of the Basque Country, as well as IBM.
IBM Quantum System Two
This includes experiments already running on the new IBM Quantum Heron, which IBM is making available for users today via the cloud. IBM said Heron has significantly improved error rates, offering a five-times improvement over the previous best records set by IBM Eagle.
"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM SVP and Director of Research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems."
The company said System Two is the foundation of IBM's next-generation quantum system architecture, combining scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics. The architecture brings together quantum communication and computation, assisted by classical computing resources, and leverages a middleware layer to appropriately integrate quantum and classical workflows, according to the company.
IBM also announced an update to its 10-year quantum development roadmap, in which Quantum System Two will house IBM's future generations of quantum processors, adding that these future processors are intended to gradually improve the quality of operations they can run to significantly extend the complexity and size of workloads they are capable of handling.
The company also released details of a new generation of its software stack, Qiskit 1.0, which will include Qiskit Patterns, designed to serve as a mechanism to allow quantum developers to more easily create code. It is based on a collection of tools intended to map classical problems to quantum circuits, optimize those circuits using Qiskit, execute them using Qiskit Runtime, and then post-process the results.
With Qiskit Patterns, combined with Quantum Serverless, users will be able to build and deploy workflows integrating classical and quantum computation in different environments, such as cloud or on-prem scenarios, according to IBM.
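The four steps described for Qiskit Patterns (map, optimize, execute, post-process) can be pictured as a simple pipeline. The sketch below is purely structural: the step names follow IBM's description, but the types and stand-in functions are illustrative inventions, not the actual Qiskit API.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Pattern:
    """Illustrative four-stage workflow, not a real Qiskit class."""
    map_problem: Callable[[Any], Any]   # classical problem -> circuits
    optimize: Callable[[Any], Any]      # transpile/optimize circuits
    execute: Callable[[Any], Any]       # run on hardware or a simulator
    post_process: Callable[[Any], Any]  # raw results -> classical answer

    def run(self, problem: Any) -> Any:
        circuits = self.map_problem(problem)
        circuits = self.optimize(circuits)
        results = self.execute(circuits)
        return self.post_process(results)

# Toy instantiation: each stage is a stand-in for real Qiskit calls.
pattern = Pattern(
    map_problem=lambda p: [f"circuit({p})"],
    optimize=lambda cs: [c + ":optimized" for c in cs],
    execute=lambda cs: {c: {"00": 512, "11": 512} for c in cs},
    post_process=lambda res: next(iter(res.values())),
)
print(pattern.run("H2 ground state"))
```

The appeal of the pattern is that only the execute stage touches quantum hardware; the surrounding stages are ordinary classical code, which is what makes cloud and on-prem deployment via Quantum Serverless plausible.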
IBM also said it is developing the use of generative AI for quantum code programming through watsonx, IBM's enterprise AI platform, achieved through the fine-tuning of the IBM Granite model series.
"Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration," said Jay Gambetta, Vice President and IBM Fellow at IBM. "This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration."