Category Archives: Quantum Computing
IBM's Juan Bernabé-Moreno: "Understanding nature using traditional computers is impossible" – EL PAÍS USA
IBM's Juan Bernabé-Moreno in front of an image of one of the company's quantum computers. IBM
Juan Bernabé-Moreno is IBM's director of research for Ireland and the United Kingdom. The Spanish computer scientist is also responsible for IBM's climate and sustainability strategy, which is being developed by seven global laboratories using artificial intelligence (AI) and quantum computing. He believes quantum computing is better suited than classical or traditional computers to understanding nature and matter.
Question. Is artificial intelligence a threat to humanity?
Answer. Artificial intelligence can be used to cause harm, but it's crucial to distinguish between intentional, malicious use of AI and unintended behavior due to a lack of data control or governance rigor.
Q. IBM has developed governance tools, but malicious people presumably won't use them.
A. How do we prevent computer viruses? It's challenging because of people who don't share certain values. However, one effective approach is to promote open education within the community. By making AI benefits accessible to all and encouraging community involvement, we can prevent misuse. This approach is more scalable than relying on a few individuals, as we have observed in the field of cybersecurity. Our tools are open, and the community actively contributes to their development.
Q. What does IBM's announcement about the beginning of the quantum utility era mean?
A. This year marks a significant milestone in quantum computing. Until now, quantum advantage had only been demonstrated theoretically, without practical implementation. However, this year, we have successfully developed a quantum computer that outperforms classical or traditional systems. It was able to solve a magnetization problem with the help of error mitigation routines. It may be possible to develop a perfect quantum computer in seven years, but I'm not setting any hard and fast dates.
Q. So what will we have seven years from now?
A. It's called quantum-centric supercomputing: a supercomputer focused on quantum capabilities while also incorporating classical components. This includes the latest advancements in hardware. Our goal is to combine the power of quantum and classical computing. For instance, when it comes to machine learning, classical computing is needed for data processing in bits. Quantum computing is used for the actual processing, but the final interpretation of the data requires classical computing again. So, classical computing remains an integral part of the equation alongside quantum computing. In terms of hardware, the current Eagle processor has undergone multiple iterations. However, effective communication between units is crucial. We've learned from classical computing that monolithic systems are difficult to control. Instead, we rely on smaller units that we connect using classical links.
Q. Could quantum computing and artificial intelligence together pose a threat to humanity?
A. Let's use the example of factoring in cybersecurity. Quantum computing has the potential to render current cryptography systems obsolete. To mitigate this risk, we are developing quantum security algorithms [quantum-safe cryptography] that cannot be breached. We already have three such algorithms at NIST [the National Institute of Standards and Technology] that are contributing to alternative cryptography methods. Embracing innovative technology opens up many new opportunities.
Q. But what if it's used by communities that don't share our values?
A. We have been working for 40 years at IBM to develop the science of quantum information and have a strong foundation in chip development. But not many people have access to the advanced technology required to achieve quantum utility.
Q. What does quantum computing enable?
A. One of the most promising aspects that drew me to quantum, which I teach at the university, is its ability to tackle subjects beyond the reach of classical computing methods. Quantum enables us to discover intricate data relationships that are otherwise elusive. Artificial intelligence is also very helpful in revealing data relationships. When combined with quantum computing, it becomes a powerful tool. However, the true breakthrough lies in simulation. Understanding nature and the behavior of particles using traditional computers is impossible; quantum is the key to understanding matter.
Q. And this enables us to understand the origin of the universe as well as diseases?
A. Targeted therapies can benefit greatly from its application. By delving into the molecular and cellular levels, we gain a deeper understanding of how these therapies work. Quantum research in this field is already underway, offering promising advancements. These small steps are paving the way for a future where we can comprehend nature and matter like never before. This newfound understanding will have significant implications for designing new materials and advancements in the biological field.
Q. Does it require significant energy consumption, like artificial intelligence?
A. Quantum requires less data and energy to comprehend new relationships and for training. It has low consumption, enabling faster and more extensive calculations.
Recent Research Advances Likely to Boost Government Support of Quantum – The Quantum Insider
Insider Brief
As US legislation to fund quantum hangs in the balance, research advances may boost the chances of quantum funding in government, insiders tell Politico.
Earlier this month, researchers from Harvard University, along with QuEra Computing, MIT, and NIST/UMD, tackled errors, one of the thorniest challenges in quantum information processing, when they executed large-scale algorithms on an error-corrected quantum computer with 48 logical qubits and hundreds of entangling logical operations. For extremely sensitive quantum calculations that can be thrown off by a range of environmental interferences, this was a huge leap in error correction and a step toward creating quantum devices that can handle meaningful tasks that bog down classical computers.
IBM also reported on a quantum processor that improves error management and touted a big push toward designing and building quantum computers with more than 1,000 qubits.
Politico reports the news has not gone unnoticed in Washington D.C.
"For that legislation to pass, the goal is for it to be as uncontroversial and as proven as possible, and a recent breakthrough certainly provides a useful talking point," Adam Kovacevich, founder and CEO of the Chamber of Progress, told Politico.
These reports may be critical because the federal government is debating the extension of the reauthorization of the National Quantum Initiative Act, which is currently up for its first five-year extension since the original bill was signed into law in 2018, Politico reports. The legislation partially funded the Harvard study, the political news site added.
Despite these advances, there's no guarantee the legislation to fund quantum will navigate the perilous journey through Congress. While the House Committee on Science, Space and Technology passed the bill in November, the House of Representatives did not attach it to the National Defense Authorization Act. Politico reports that representatives could attach quantum spending to another spending bill next month.
A couple of hurdles seem to be blocking the passing of the reauthorization, according to Politico: partisan disagreements and the growing obsession with artificial intelligence.
"Garnering attention for quantum in the current tech policy landscape dominated by AI remains an uphill battle, and navigating this environment to secure sufficient recognition and resources for quantum is proving a difficult task," Hodan Omaar, a senior policy analyst at the Information Technology and Innovation Foundation, told Politico, but added, "These sorts of breakthroughs don't hurt."
GQI's Take on Harvard's Logical Qubits Demonstration: Executive Summary – Quantum Computing Report
GQI has developed a full analysis of the recent technical paper that describes a Harvard-led experiment on error correction. The article below is a summary of our conclusions. To obtain a copy of the full analysis, contact us at info@global-qi.com.
A Harvard-led experiment with up to 48 logical qubits is the most eye-catching demonstration of quantum error correcting codes to date. However, GQI believes this is not yet a Sputnik moment for fault tolerant quantum computing.
Sputnik: On 4 October 1957, the Soviet Union launched Sputnik 1 into orbit. This grabbed headlines around the world, radically raised the profile of this important new technology, and kicked off the Space Race.
Key points about the Harvard-led experiment, with collaborators from MIT, QuICS, and QuEra: it involved up to 48 logical qubits to demonstrate quantum error correction codes, used 280 physical qubits to form the logical qubits along with up to 200 two-qubit (2Q) logical gates, and leveraged a technique of flexible qubit shuttling for implementation.
The demonstration builds on a body of work performed by many researchers on this team starting in 2017. This includes technologies like Magneto-Optical Traps (MOT), Spatial Light Modulators (SLM), and Acousto-Optical Deflectors (AOD) that they developed for analog quantum simulators. The team moved on to demonstrate gate-model operations in 2019, a zoned architecture and Raman optical system refinement in 2022, and high-fidelity 2Q gates in 2023. They also created a blueprint for a zoned neutral-atom architecture leveraging the high-rate quantum LDPC codes they introduced in 2023.
The architecture leverages neutral atoms' strengths, including long-lived hyperfine states for memory qubits. This enables high-rate quantum LDPC codes for memory qubits and planar codes for Rydberg-mediated gates. And the AOD technology enables the use of a zoned processor architecture.
Although we regard this demonstration as a great step forward, there are still several challenges that need to be overcome to make this approach truly useful for achieving Fault Tolerant Quantum Computing (FTQC). These include:
This requires ensuring that as the codes get bigger (i.e., provide greater distance between the codewords), the logical error rates continue to improve. This needs to work across multiple rounds of error correction (so far, Harvard has only demonstrated error detection over a single round). It requires continued improvement in the physical error rate to meet the error correction thresholds and make the error correction circuits more effective. And on neutral-atom platforms, it requires an efficient means of reloading and resetting the atoms to refresh them during long calculations.
It is not sufficient for achieving quantum advantage if an error correction architecture only implements the set of Clifford gates (gates including H, S, CNOT, and others that can be generated by combining these together). A universal gate set that can perform any quantum calculation requires an additional gate, such as the T gate. These are typically created using a process called magic state distillation, which is very resource intensive and often a dominant source of overhead. The team needs to do further work to show how this, or an alternative scheme, can be achieved robustly and efficiently.
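The gate distinction above can be made concrete with the standard matrix forms of these gates. The sketch below is an illustration (not part of the GQI analysis): it checks numerically that the non-Clifford T gate is a "square root" of the Clifford S gate, which is why adding T completes a universal gate set.

```python
import cmath

def matmul(a, b):
    """Multiply two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def close(a, b, tol=1e-12):
    """Element-wise comparison of two 2x2 matrices."""
    return all(abs(a[i][j] - b[i][j]) < tol for i in range(2) for j in range(2))

# The Clifford phase gate S and the non-Clifford T gate from the text.
S = [[1, 0], [0, 1j]]
T = [[1, 0], [0, cmath.exp(1j * cmath.pi / 4)]]
I = [[1, 0], [0, 1]]

# Applying T twice gives S, so T is a "square root" of S...
assert close(matmul(T, T), S)

# ...and eight applications of T complete a full 2*pi phase rotation,
# returning to the identity.
T8 = I
for _ in range(8):
    T8 = matmul(T8, T)
assert close(T8, I)
print("T*T == S and T^8 == I verified")
```

Because T applies a pi/4 phase that no combination of Clifford gates can reproduce, circuits containing it cannot be efficiently simulated classically, which is what makes it (and magic state distillation) so central to fault tolerance.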
Although an error correction demonstration of 48 logical qubits is the largest we have seen so far, it is still not enough to achieve a useful quantum advantage over classical processors using the best available algorithms. So, the number of logical qubits needs to continue to increase. This will always be a key challenge for the hardware designers because they need to do it while still maintaining or even improving the physical error rates. This includes not introducing overwhelming crosstalk, complexity in qubit calibration or an overload of environmental isolation (e.g. cooling power, or vacuum cycle time).
But it also introduces two additional challenges. Larger module sizes can negatively impact overall speed, particularly in architectures that utilize qubit shuttling. So further work is needed to develop ways of increasing module size without ending up with excessive run times for a user's calculations.
But there will still be limits to how large a single module can be. It might be as large as 10,000 physical qubits, but even that would not be enough to perform the types of large-scale quantum computations that users will want to do. Many researchers are pursuing multi-module architectures, with the modules linked by either optical or microwave connections, in order to scale up. This will require some form of transducer to convert a neutral-atom qubit's state to a photon so that entangled qubits can be sent to a neighboring module. This research is still in its very early stages and much more will be required.
Nonetheless, this demonstration is an impressive step forward that builds upon the progress shown by others working on error correction. But we look forward to seeing more progress in 2024 and beyond as researchers build upon this effort.
QuEra will be hosting a webinar on January 9, 2024, where they will provide additional details about their roadmap. You can register for it at https://quera.link/roadmap.
December 19, 2023
3 Ways Quantum Computing Will Affect Artificial Intelligence Applications in the Next Decade – AMBCrypto Blog
Quantum computing is on the cusp of reshaping the landscape of artificial intelligence (AI) in profound ways, ushering in a new era of capabilities and possibilities. Over the next decade, we can expect to witness a transformation in AI applications fueled by the quantum revolution. If you're looking to know how quantum computing will affect artificial intelligence applications, stay tuned.
In this brief exploration, we will delve into three key ways in which quantum computing is poised to reshape AI, offering a glimpse into the future of these groundbreaking technologies.
We have listed three ways in which quantum computing affects artificial intelligence applications. Take a look:
Quantum computing holds immense potential to revolutionize ML and AI algorithms in several profound ways. In this extended examination, we will delve deeper into these enhancements, providing additional case studies and examples of potential applications, and also discuss some of the challenges and limitations of quantum machine learning (QML).
Quantum computers leverage the principles of superposition and entanglement to process data exponentially faster than classical computers. This speed is especially critical in ML, where tasks such as training complex models and performing large-scale simulations can be highly time-consuming.
Grover's algorithm, for instance, showcases quantum computing's potential by searching unsorted databases quadratically faster than classical algorithms. This capability could significantly speed up data retrieval tasks in AI systems.
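The quadratic speedup is easy to put in numbers. The back-of-envelope sketch below (illustrative arithmetic only, not an actual quantum execution) compares the expected number of lookups for a classical linear scan, about n/2, with the roughly (pi/4)*sqrt(n) oracle calls Grover's algorithm needs for the same unstructured search.

```python
import math

def classical_queries(n):
    """Expected lookups to find one marked item among n unsorted
    items by checking candidates one at a time: about n / 2."""
    return n / 2

def grover_queries(n):
    """Approximate oracle calls Grover's algorithm needs for the
    same unstructured search: about (pi / 4) * sqrt(n)."""
    return (math.pi / 4) * math.sqrt(n)

# The gap widens dramatically as the database grows.
for n in (10**4, 10**6, 10**8):
    print(f"n = {n:>11,}: classical ~{classical_queries(n):>12,.0f} queries, "
          f"Grover ~{grover_queries(n):>8,.0f}")
```

For a million-item search, the classical scan needs about 500,000 lookups on average, while Grover's algorithm needs on the order of 800 oracle calls.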
Quantum computing has the potential to efficiently manage models that are currently too complex for classical computers to handle. This includes deep learning networks with an unprecedented number of layers and nodes.
This breakthrough could lead to the development of more advanced AI models capable of learning from even larger datasets and making highly accurate predictions. These models might be particularly advantageous in areas like image recognition, natural language understanding, and autonomous decision-making systems.
Two critical aspects of quantum physics, superposition and entanglement, contribute significantly to the enhanced computing power of quantum computers. Superposition allows a quantum bit (qubit) to be in a combination of states at once, while entanglement keeps particles' measurement outcomes correlated regardless of the distance between them.
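Both ideas can be seen in a few lines of state-vector arithmetic. This is a toy illustration (plain Python lists standing in for quantum states, not a quantum framework): the Born rule, probability = |amplitude|^2, turns amplitudes into measurement statistics, and the Bell state shows entangled qubits whose outcomes always agree.

```python
# A single qubit in equal superposition of |0> and |1>:
# amplitudes (1/sqrt(2), 1/sqrt(2)). The Born rule gives the
# probability of each measurement outcome as |amplitude|^2.
amp = 2 ** -0.5
plus = [amp, amp]
probs = [abs(a) ** 2 for a in plus]
print(probs)   # both outcomes equally likely (~0.5 each)

# A two-qubit entangled Bell state (|00> + |11>) / sqrt(2),
# with amplitudes ordered |00>, |01>, |10>, |11>.
bell = [amp, 0, 0, amp]
bell_probs = [abs(a) ** 2 for a in bell]
print(bell_probs)   # only 00 and 11 ever occur: the qubits always agree
```

Note that the Bell state cannot be written as two independent single-qubit states; that inseparability is exactly what "entanglement" means.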
Quantum computing and ML can significantly boost each other. It is not just about speed; it's also about the ability to create models that reflect complex conditions far better than current models. This is especially beneficial in areas like financial portfolio optimization, fluid dynamics simulations, and material design.
Quantum computing holds the potential to revolutionize the way complex optimization problems are solved across various industries. This is primarily due to its ability to process and analyze large datasets much faster than classical computers, as well as its unique approach to handling data through quantum mechanics principles.
Quantum entanglement is a phenomenon where pairs or groups of qubits interact in such a way that the state of one qubit is directly related to the state of another, regardless of the distance between them. This interconnectedness enables high-level correlation and parallelism in computations.
In optimization problems, entanglement allows for more efficient coordination of information across different parts of a system, enabling the quantum computer to find correlations and patterns that are not easily discernible with classical computing methods.
Quantum computers can perform certain types of search operations much faster than classical computers, thanks to algorithms like Grover's algorithm. This is particularly beneficial in optimization problems, where the solution involves searching through a vast number of possibilities.
Quantum computers are well-suited for dealing with problems involving a large number of variables and high-dimensional spaces, which are common in complex optimization tasks. Their ability to handle these types of problems can lead to more efficient solutions than those achievable with classical techniques.
Quantum algorithms, in some cases, can provide exponential speedups over their classical counterparts. This is particularly relevant in optimization problems where traditional algorithms may struggle with the complexity or size of the data.
Quantum annealing, a quantum computing technique, is used specifically for solving optimization problems. It works by encoding the problem into a quantum system and then gradually evolving this system to its lowest energy state, which corresponds to the optimal solution.
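The evolve-to-lowest-energy idea has a well-known classical cousin, simulated annealing, which can be sketched in a few lines. This is a toy illustration, not quantum annealing itself; the linear cooling schedule, the cost function, and the starting point are all arbitrary choices made for the example.

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, steps=5000, t0=2.0):
    """Classical cousin of the annealing idea: gradually 'cool' the
    system so it settles into a low-energy (near-optimal) state."""
    rng = random.Random(0)
    x, e = x0, energy(x0)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9      # linear cooling schedule
        cand = neighbor(x, rng)
        e_cand = energy(cand)
        # Always accept improvements; accept uphill moves with a
        # probability that shrinks as the temperature drops.
        if e_cand < e or rng.random() < math.exp((e - e_cand) / t):
            x, e = cand, e_cand
    return x, e

# Toy cost function: minimize (x - 3)^2 over the integers, starting far away.
best_x, best_e = simulated_annealing(
    energy=lambda x: (x - 3) ** 2,
    neighbor=lambda x, rng: x + rng.choice([-1, 1]),
    x0=50,
)
print(best_x, best_e)
```

The quantum version replaces thermal fluctuations with quantum tunneling through the energy landscape, which is where the hoped-for advantage on rugged optimization problems comes from.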
Quantum computing significantly impacts artificial intelligence applications in RL by enhancing the learning process's speed and efficiency. The unique capabilities of quantum computing, such as superposition and entanglement, allow RL algorithms to explore multiple solutions simultaneously, leading to faster learning compared to traditional methods.
A significant advancement in quantum computing for reinforcement learning is the development of hybrid systems that combine quantum and classical computing. These systems use quantum computing for trial-and-error exploration in RL, while a classical computer provides feedback or rewards based on the AI agent's performance. This approach has been shown to speed up the learning process of an AI agent by over 60% in certain scenarios.
Quantum reinforcement learning (QRL) can potentially require fewer training steps to reach convergence compared to traditional RL methods. This efficiency is particularly beneficial in applications where data collection is resource-intensive or difficult, such as in autonomous systems and healthcare.
Research has focused on finding the most suitable quantum architectures and learning algorithms for RL tasks. For instance, selecting a hardware-efficient parametrized quantum circuit (PQC) and refining classical algorithms like REINFORCE to work with quantum platforms have been crucial steps. These advancements contribute to successfully training RL agents on quantum devices, albeit with some adaptations to fit the constraints of current quantum hardware.
Let's delve into each of these real-life applications in greater depth to appreciate the profound impact quantum computing is set to exert:
Quantum computing offers a quantum leap in the field of pharmaceuticals. By leveraging quantum algorithms, it can significantly accelerate the complex process of molecular simulation, expediting the analysis of intricate molecular structures and interactions essential for drug discovery.
This revolutionary speed-up enables researchers to explore a vast chemical space, leading to the rapid identification and development of new drugs and treatments for a multitude of diseases, ultimately improving global healthcare.
Financial institutions worldwide stand to gain immensely from quantum computings computational prowess. Its capacity to process vast volumes of financial data with unprecedented speed and precision can transform asset allocation strategies, revolutionize risk assessment models, and fortify fraud detection mechanisms.
This quantum advantage empowers financial professionals with more accurate insights into market trends, enhancing decision-making and risk management in the financial sector.
The intricacies of modern transportation systems and logistics networks demand sophisticated optimization solutions. Classical computers often struggle to tackle the complexity of these problems efficiently.
Quantum algorithms, however, excel in handling numerous variables simultaneously, making them indispensable for solving intricate route optimization challenges. This, in turn, leads to reduced congestion, improved urban mobility, and enhanced supply chain efficiency, benefitting both cities and businesses.
Quantum computing offers a ray of hope in addressing the pressing issue of climate change. Its capacity to simulate the complexities of environmental systems with high precision promises the development of more accurate climate models.
These advanced models provide invaluable insights into climate dynamics, empowering policymakers to formulate more effective strategies for mitigating the effects of climate change and preserving the environment for future generations.
Quantum computings prowess extends to the realm of energy optimization. It can simulate and optimize energy distribution networks, paving the way for increased energy efficiency and reduced operational costs across various industries and power grids.
This transformative application has far-reaching implications for sustainable energy practices, contributing to a greener and more environmentally conscious future.
Quantum computing plays a pivotal role in the discovery of groundbreaking materials with tailor-made properties. Through precise atomic-level simulations, it facilitates the development of materials optimized for specific applications.
These include high-performance batteries for renewable energy storage, advanced materials for cutting-edge electronics, and innovative materials for aerospace and engineering, driving innovation across a wide array of industries.
Quantum computing presents both challenges and opportunities in the realm of cybersecurity. While its immense processing power threatens existing encryption methods, it also necessitates the development of quantum-resistant encryption techniques.
Quantum computing can also contribute to the creation of more robust cryptographic protocols, bolstering cybersecurity across industries, protecting sensitive data, and ensuring the integrity of digital systems.
In essence, these applications underscore the extraordinary potential of quantum computing to catalyze innovation and transformation across diverse sectors of our global economy.
Although quantum computing is affecting artificial intelligence applications in positive ways, it also faces several limitations that are important to consider:
One of the major challenges of quantum computing is the current limitations in hardware. Quantum computers are limited in terms of their size and the number of qubits they can support.
This constrains the complexity of the problems that can be solved and makes it difficult to scale the technology. As quantum computing evolves, addressing these hardware limitations will be crucial for its broader application in AI and other fields.
Quantum computing is more prone to errors than classical computing due to the delicate nature of qubits. Developing robust error correction techniques is therefore critical for creating reliable quantum computing systems.
This is a significant area of research in quantum computing, as the effectiveness of quantum algorithms depends heavily on the accuracy of quantum operations.
Quantum computing requires a completely different set of algorithms and programming techniques compared to classical computing. This complexity makes it challenging for developers to adapt and limits the number of professionals who can effectively work with quantum technology.
The distinct nature of quantum programming, involving quantum gates and the probabilistic nature of qubit states, requires a new mindset and understanding.
Quantum computing poses potential risks to cybersecurity: its ability to run complex algorithms at high speed could break many of the cryptographic algorithms currently used to secure data and communications.
The development of new quantum-resistant cryptographic algorithms is therefore essential.
The absence of universally accepted standards in quantum computing hinders collaboration and idea-sharing among researchers and developers. Establishing such standards will be essential for the development of a robust and scalable quantum computing industry.
Despite these challenges, quantum computing holds the promise of transformative outcomes: increased computational power, improved efficiency, new scientific discoveries, enhanced cybersecurity, and greater innovation.
The future developments in quantum computing are expected to focus on several key areas, each addressing current limitations and expanding the potential applications of this technology:
As quantum computing continues to advance at a rapid pace, its impact on artificial intelligence applications in the coming decade cannot be overstated. The synergy between quantum computing and AI is set to drive monumental changes, ranging from supercharged data analysis and optimization to the development of more sophisticated AI models.
With each passing year, we edge closer to realizing the full potential of this transformative partnership, which promises breakthroughs that will not only reshape industries but also broaden the horizons of what AI can achieve. The next decade promises to be an exciting journey as we witness the evolution of quantum-powered AI, opening doors to solutions for some of humanity's most complex challenges.
Where Will IonQ Stock Be in a Year? – The Motley Fool
Since ChatGPT was released a year ago, the tech world has been grappling with massive disruption from artificial intelligence (AI) chatbots. In fact, OpenAI's ChatGPT has the potential to disrupt some of Alphabet's search business -- itself a massively disruptive product introduced just 25 years ago.
So now the question is, what will be the next thing that could disrupt ChatGPT?
Given recent strides in quantum computing, the answer may be coming sooner than you think. The leader in the field, IonQ (IONQ -1.09%), has a big year coming up, during which its technology could make some key commercial breakthroughs.
For those who don't know about quantum computing, the technology is pretty crazy. Basically, whereas traditional computing stores information in bits signified by 1 or 0, quantum computing stores information in qubits, which occupy varying degrees of 1 or 0 states simultaneously in what is called superposition.
To achieve this, a laser stimulates an ion in an isolated electromagnetic or vacuum chamber, causing the ion to enter a probabilistic dual state with properties of both a particle and a wave simultaneously. Extremely precise lasers then measure the ion, calculating the different probabilities of its dual states.
By opening up the 1-0 binary code of classical computing to infinitely more permutations, the upshot is that quantum computing has the potential to do massive calculations that classical computers cannot, at least not within a reasonable time frame. These use cases include modeling new materials for materials science and pharmaceutical applications, cryptography applications, and other optimization modeling involving a massive number of factors and possibilities.
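The scale gap behind those use cases can be put in numbers: simulating n qubits on a classical machine requires tracking 2^n complex amplitudes. A back-of-envelope illustration (the 16 bytes per amplitude assumes double-precision complex numbers):

```python
# Each qubit doubles the number of amplitudes a classical simulator
# must track, at 16 bytes per complex amplitude (double precision).
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30   # memory required, in GiB
    print(f"{n} qubits: {amplitudes:>20,} amplitudes (~{gib:,.0f} GiB)")
```

Thirty qubits already need about 16 GiB of memory to simulate exactly, and fifty qubits need roughly 16 million GiB, which is why classical simulation hits a wall long before quantum hardware does.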
Quantum computing hasn't become mainstream because, well, the science is extremely difficult to pull off and isn't mature yet. But it could be very close: IonQ believes it has made a number of breakthroughs now, moving it from the research stage to the commercialization stage as soon as next year. If the company reaches a few key milestones it has laid out for itself, the quantum industry -- and IonQ, specifically -- could take off in a big way.
In his letter to shareholders accompanying the 2022 annual report, CEO Peter Chapman wrote that he expects quantum computing to have its own ChatGPT-like moment within the next two years. That would, of course, be a massive event.
There has already been tangible progress toward that goal. In 2023, IonQ achieved a system with 29 algorithmic qubits, or #AQ 29, a shorthand IonQ invented to track the power of its quantum computing systems. But each AQ number is more than just a linear improvement; it's an exponential improvement. For instance, if #AQ 5 is the quantum computing equivalent of the tip of a magic marker, #AQ 29 is the size of a basketball court.
And #AQ 64, which the company hopes to achieve in 2025, would be equivalent to the size of the land mass of the United States!
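The magic-marker-to-basketball-court analogy is just exponential doubling. A quick back-of-envelope calculation (an illustration of the doubling, not an IonQ figure) shows the factors involved:

```python
# Each additional algorithmic qubit doubles the usable computational
# state space, so the jumps between #AQ milestones are powers of two.
ratio_5_to_29 = 2 ** (29 - 5)
ratio_29_to_64 = 2 ** (64 - 29)
print(f"#AQ 29 vs #AQ 5:  {ratio_5_to_29:,}x")    # ~16.8 million times larger
print(f"#AQ 64 vs #AQ 29: {ratio_29_to_64:,}x")   # ~34 billion times larger again
```

That each milestone multiplies, rather than adds to, the previous one is what gives the roadmap from #AQ 29 to #AQ 64 (and eventually #AQ 1024) its dramatic character.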
So far this year, the company has sold four systems, logging $58.4 million in year-to-date bookings. These systems are mainly for research purposes, with two large customers being the U.S. Air Force Research Lab and Switzerland's QuantumBasel, which serves enterprises and governments in Europe.
The next year will be so momentous because IonQ believes it will commercialize its #AQ 35 Forte systems in 2024. This will be an important milestone, as the #AQ 35 machine will be the first to exceed the capabilities of classical computing for machine learning. Management believes that the first display of commercial advantage over traditional computing will spur more commercial uses for the technology, essentially making the leap from the research stage to the commercial stage.
In 2025, the company hopes to achieve #AQ 64, which will use a new type of process to reduce the errors inherent in quantum computing systems. Because quantum computing is such a delicate process, errors inevitably pop up due to quantum "noise." Quantum systems get around this through a process called error correction. But that involves a lot of redundancy, storing the same information among many qubits.
If there are discrepancies in the information, the error correction process takes a "majority vote" to filter out errors in qubits, combining many error-filled qubits into one optimized "logical qubit." But the numbers get enormous as some approaches require 1,000 or even 100,000 qubits to form one logical qubit.
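The "majority vote" described above is the core idea of a repetition code. A toy Python sketch (illustrative only, not IonQ's actual scheme) shows how redundancy filters out errors:

```python
import random

def encode(bit, n=5):
    """Repetition code: store the same logical bit in n physical copies."""
    return [bit] * n

def add_noise(copies, p=0.1):
    """Flip each copy independently with probability p (simulated noise)."""
    return [b ^ (random.random() < p) for b in copies]

def decode(copies):
    """Majority vote: the most common value wins, filtering out minority errors."""
    return 1 if sum(copies) > len(copies) // 2 else 0

random.seed(0)                                # reproducible noise
noisy = add_noise(encode(1, n=101), p=0.1)    # roughly 10 of 101 copies get flipped
print(decode(noisy))                          # -> 1: the logical bit survives
```

As long as the per-copy error rate stays well below one half, adding more copies makes the vote exponentially more reliable, which is exactly why the qubit counts in the article balloon so quickly.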
To scale to #AQ 64, IonQ will use a new software-based technique it developed, called error mitigation, rather than full error correction. This modeling technique requires far fewer physical qubits to form a logical qubit than full error correction does. That innovation will allow IonQ to achieve #AQ 64 with less qubit "overhead," which is important for commercialization. The new error mitigation technique should also allow IonQ to scale even higher, to its ultimate goal of #AQ 1024 by 2028.
IonQ won't be making any profits anytime soon. So, the main things for investors to look out for in the year ahead are technical achievements, such as error mitigation, and getting the #AQ 35 to market. It's also possible that by the end of the year, a customer will begin using the #AQ 35 in a commercial use case, which could spur interest in the technology and, of course, the stock.
The possibility for a ChatGPT-like moment with the introduction of #AQ 35 in 2024 makes IonQ an interesting speculative play. There could be a lot of upside if IonQ is first to market with commercial quantum computing.
However, investors should remember it is still very early in the quantum industry, with more significant technological hurdles to clear. So, be sure to size any IonQ position in your portfolio according to your risk appetite, which likely means keeping it small. Nevertheless, IonQ is a name every technology investor should at least be tracking in 2024.
See more here:
Where Will IonQ Stock Be in a Year? - The Motley Fool
Argonne’s New Research Unlocks Potential of Switchable Superconducting Circuits in Industrial Computing – HPCwire
Dec. 20, 2023: As industrial computing needs grow, the size and energy consumption of the hardware needed to keep up with those needs grows as well. A possible solution to this dilemma could be found in superconducting materials, which can reduce that energy consumption exponentially. Imagine cooling a giant data center full of constantly running servers down to nearly absolute zero, enabling large-scale computation with incredible energy efficiency.
Physicists at the University of Washington and the U.S. Department of Energy's (DOE) Argonne National Laboratory have made a discovery that could help enable this more efficient future. Researchers have found a superconducting material that is uniquely sensitive to outside stimuli, enabling the superconducting properties to be enhanced or suppressed at will. This enables new opportunities for energy-efficient switchable superconducting circuits. The paper was published in Science Advances.
Superconductivity is a quantum mechanical phase of matter in which an electrical current can flow through a material with zero resistance. This leads to perfect electronic transport efficiency. Superconductors are used in the most powerful electromagnets for advanced technologies such as magnetic resonance imaging, particle accelerators, fusion reactors and even levitating trains. Superconductors have also found uses in quantum computing.
Today's electronics use semiconducting transistors to quickly switch electric currents on and off, creating the binary ones and zeroes used in information processing. As these currents must flow through materials with finite electrical resistance, some of the energy is wasted as heat. This is why your computer heats up over time. The low temperatures needed for superconductivity, usually more than 200 degrees Fahrenheit below freezing, make those materials impractical for hand-held devices. However, they could conceivably be useful on an industrial scale.
The research team, led by Shua Sanchez of the University of Washington (now at the Massachusetts Institute of Technology), examined an unusual superconducting material with exceptional tunability. This crystal is made of flat sheets of ferromagnetic europium atoms sandwiched between superconducting layers of iron, cobalt and arsenic atoms. Finding ferromagnetism and superconductivity together in nature is extremely rare, according to Sanchez, as one phase usually overpowers the other.
"It is actually a very uncomfortable situation for the superconducting layers, as they are pierced by the magnetic fields from the surrounding europium atoms," Sanchez said. "This weakens the superconductivity and results in a finite electrical resistance."
To understand the interaction of these phases, Sanchez spent a year as a resident at one of the nation's leading X-ray light sources, the Advanced Photon Source (APS), a DOE Office of Science user facility at Argonne. While there, he was supported by DOE's Science Graduate Student Research program. Working with physicists at APS beamlines 4-ID and 6-ID, Sanchez developed a comprehensive characterization platform capable of probing microscopic details of complex materials.
Using a combination of X-ray techniques, Sanchez and his collaborators were able to show that applying a magnetic field to the crystal can reorient the europium magnetic field lines to run parallel to the superconducting layers. This removes their antagonistic effects and causes a zero-resistance state to emerge. Using electrical measurements and X-ray scattering techniques, scientists were able to confirm that they could control the behavior of the material.
"The nature of independent parameters controlling superconductivity is quite fascinating, as one could map out a complete method of controlling this effect," said Argonne's Philip Ryan, a co-author on the paper. "This potential posits several fascinating ideas, including the ability to regulate field sensitivity for quantum devices."
The team then applied stresses to the crystal, with interesting results. They found the superconductivity could be either boosted enough to overcome the magnetism, even without re-orienting the field, or weakened enough that the magnetic reorientation could no longer produce the zero-resistance state. This additional parameter allows the material's sensitivity to magnetism to be controlled and customized.
"This material is exciting because you have a close competition between multiple phases, and by applying a small stress or magnetic field, you can boost one phase over the other to turn the superconductivity on and off," Sanchez said. "The vast majority of superconductors aren't nearly as easily switchable."
About the Advanced Photon Source
The U.S. Department of Energy Office of Science's Advanced Photon Source (APS) at Argonne National Laboratory is one of the world's most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic and electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation's economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.
This research used resources of the Advanced Photon Source, a U.S. DOE Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357.
About Argonne National Laboratory
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
Source: Shua Sanchez And Andre Salles, Argonne
See the original post:
Argonne's New Research Unlocks Potential of Switchable Superconducting Circuits in Industrial Computing - HPCwire
Alice & Bob Takes Another Step Toward Releasing Error-Corrected Logical Qubit – The Quantum Insider
Insider Brief
PRESS RELEASE: Alice & Bob, a leading hardware developer in the race to fault-tolerant quantum computers, today announced the tape-out of a new chip expected to improve error rates with every qubit added, making it a prototype for the company's first error-corrected, logical qubit.
The 16-qubit quantum processing unit (QPU), Helium 1, is the first chip in Alice & Bob's roadmap combining cat qubits to run an error correction code. The company will be able to use this platform to create its first logical qubit with error rates lower than any existing single physical qubit. With the tape-out complete, the chip enters a characterization and calibration phase that will be followed by a release on the cloud.
The quantum industry is at the dawn of demonstrating logical qubits showing significant advantages over any existing physical qubit. Such logical qubits are the only way to achieve the extremely low error rates required by fault tolerant quantum computing.
Errors in quantum computers are caused by bit flips and phase flips. Protected from bit flips by design, cat qubits are hardware efficient and enable logical qubit designs using significantly fewer qubits. Helium 1 will run an error correction code that actively suppresses the remaining phase flips, effectively addressing both error types.
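For intuition, the two error types can be written as 2×2 matrices acting on a single-qubit state vector. This numpy sketch is a generic textbook illustration, not a model of Alice & Bob's cat-qubit hardware:

```python
import numpy as np

# Basis states |0> and |1>, plus the superposition (|0> + |1>) / sqrt(2)
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]])   # bit flip: swaps |0> and |1>
Z = np.array([[1, 0], [0, -1]])  # phase flip: negates the |1> amplitude

print(X @ ket0)   # -> |1>: a bit flip turned 0 into 1
print(Z @ plus)   # -> (|0> - |1>) / sqrt(2): same outcome probabilities, flipped phase
```

A cat qubit suppresses the X-type errors in hardware, so the correction code only has to chase the Z-type errors, which is why fewer physical qubits suffice.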
"Our cat qubit technology already holds world records in addressing bit flips," said Thau Peronnin, CEO of Alice & Bob. "Helium 1 is our new platform to exponentially suppress the remaining errors as we add more depth, enabling us to deliver on our clear roadmap to reach the full computational potential of quantum computers."
Helium 1 is the first prototype that will serve as the basis of Alice & Bob's six-nines logical qubit (one with a logical error rate of 10⁻⁶ or lower).
Continued here:
Alice & Bob Takes Another Step Toward Releasing Error-Corrected Logical Qubit - The Quantum Insider
Quantum computing will make online heists likelier than we can fathom | Mint – Mint
As your credit card is scanned one final time this holiday season, say thanks to prime numbers for keeping the checkout queues short and your money safe. Well, most of the time, anyway. Much of the cryptography that goes into beating credit-card fraud comes down to 3, 5, 7 and so on: integers that can only be divided by themselves and 1. Banks randomly generate two huge primes, say 150 digits long, and multiply them to encrypt payment authorization from the microchip of your card to the point-of-sale terminal.
Even supercomputers can't easily decipher the original numbers, because the time required to run any of the known algorithms increases exponentially with their length. A 250-digit number that was part of a 1991 factorization challenge was finally broken down into a product of two primes in 2020. On a single advanced computer running non-stop, the calculations would take 2,700 years.
Rivest-Shamir-Adleman (RSA) private keys generated with the help of large numbers let banks affix a unique, tamper-proof digital signature via microchips embedded in cards. With these replacing magnetic strips, the menace of counterfeiting has gone down. Payment scams are now more likely in e-commerce deals. However, a new challenge is emerging that could begin to erode the protection offered by prime numbers, perhaps by decade-end.
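The RSA scheme behind those signatures fits in a few lines of Python. This is a textbook-scale sketch with tiny illustrative primes, not anything resembling a bank's actual implementation (real keys use primes hundreds of digits long):

```python
# Toy RSA key generation, signing, and verification.
p, q = 61, 53
n = p * q                        # public modulus: 3233
phi = (p - 1) * (q - 1)          # 3120
e = 17                           # public exponent, coprime to phi
d = pow(e, -1, phi)              # private exponent: the modular inverse of e

message = 65                     # stand-in for a hash of the payment data
signature = pow(message, d, n)   # sign with the private key
assert pow(signature, e, n) == message   # anyone holding (e, n) can verify
```

The security rests entirely on the fact that recovering p and q from n is hard; anyone who can factor n can recompute d and forge signatures, which is exactly the threat quantum factoring poses.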
The first hint of trouble came at the start of the millennium. A team of scientists at IBM exploited the mysterious interplay of subatomic matter and energy to run calculations that would be impossible on a classical computer. Their primitive quantum computer figured out that 15 was a product of 3 and 5. A subsequent experiment in 2012 split 21 into 3 and 7. While every middle-schooler knows how to break down small integers like 15 and 21, these were the first demonstrations of Shor's algorithm, a method of quantum factorization that does not get exponentially more time-consuming as the number gets bigger.

In August, Oded Regev of New York University proposed what many consider the first big improvement to Peter Shor's 1994 technique. If it works, the time taken to decode complex ciphers may shorten. By 2030, a $1 billion quantum computer may be able to break RSA Laboratories' widely used 2048-bit encryption by factoring a 617-digit number in a few hours, according to a 2016 estimate by the National Institute of Standards and Technology in Maryland. NIST has come up with new protocols that will be resistant to quantum computers, but what if the threat arrives before the weaponry to ward it off has been adopted?
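The reduction at the heart of Shor's algorithm, from factoring to period finding, can be sketched classically. Only the period-finding step, done here by brute force, is what a quantum computer speeds up exponentially:

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a mod N: the smallest r with a**r = 1 (mod N).
    This brute-force search is the step a quantum computer does exponentially faster."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_reduction(N, a):
    """Classical skeleton of Shor's reduction from factoring to period finding."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess already shares a factor with N
    r = order(a, N)
    if r % 2:
        return None               # odd period: retry with a different a
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if 1 < p < N else None

print(shor_reduction(15, 7))      # -> (3, 5), the factorization IBM demonstrated
```

For N = 15 and a = 7, the powers of 7 mod 15 cycle as 7, 4, 13, 1, giving period r = 4; the two gcd computations then pull out 3 and 5.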
Retail losses may be kept to manageable levels, at least for a while. But a wholesale quantum heist would be catastrophic. The security of worldwide interbank payments may be compromised if the digital signatures authorizing release of funds lose their sanctity. Scammers already have a blueprint in the $81 million theft from Bangladesh's accounts with the Federal Reserve Bank of New York in 2016.
It isn't just the security of existing products that's at stake; innovative new payment instruments will be affected, too. Monetary authorities are experimenting with paperless cash. The Bank for International Settlements estimates that by 2030, there may be 15 central bank digital currencies (CBDCs) in retail circulation.
Tourbillon, a recent BIS project, has shown that the cash-like privacy that users will expect of these instruments may be realizable. The so-called blind signature protocol, invented by privacy pioneer David Chaum in 1982, may be enough to ensure that payers don't reveal their identities to anyone. Even the central bank will only check that the eCash that comes to it for verification bears its signature and has not been spent before. Payments made using such highly private CBDCs may also be fast and able to handle peak demand. However, this is only true as long as the RSA encryption technology is strong enough to keep malicious actors at bay. Introduce quantum-safe cryptography into the equation, and the cost to users increases: a one-second payment cycle stretches to five; transactions per second drop by a factor of 200. More experimentation is needed. If it turns out that a central bank's virtual signature on CBDC tokens is not impervious to quantum counterfeiting, or can only be foolproofed by slowing payments down, then ordinary people are going to reject them.
As the ChatGPT-induced revolution in generative AI has shown, meaningful breakthroughs can elude a field for decades. But once they do occur, they can multiply at an overwhelming speed. Quantum computing may be no different. Prime numbers have served the internet age well, but the private sector and public authorities cannot take their continued guardianship for granted. Bloomberg
See the rest here:
Quantum computing will make online heists likelier than we can fathom | Mint - Mint
Why Quantum Computing Will Change the World | by Noah Graham | Dec, 2023 – Medium
The elusive cure for cancer, the eradication of car accidents, and a sustainable future free from fossil fuels: these formidable challenges, once thought to be centuries away from resolution, might be resolved sooner than anticipated. The driving force behind this big change? Quantum computing. This cutting-edge technology transcends traditional computing by processing multiple outcomes simultaneously, significantly outpacing even the most advanced supercomputers of our era.
To understand how quantum computing could revolutionize our world, it's crucial to first grasp what sets it apart from classical computing. Traditional computers use bits as the basic unit of information, which can either be a 0 or a 1. Quantum computers, however, utilize quantum bits, or qubits. These qubits can exist in multiple states at once, enabling them to perform complex calculations at unprecedented speeds.
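The "multiple states at once" claim can be made concrete with a few lines of numpy: putting n qubits into equal superposition yields a state vector with 2**n amplitudes, one per bit string. This is a generic illustration, not any vendor's API:

```python
from functools import reduce
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1.0, 0.0])                    # a qubit initialized to 0
qubit = H @ ket0                               # equal superposition of 0 and 1

n = 3
state = reduce(np.kron, [qubit] * n)           # joint state of n such qubits
print(len(state))                              # -> 8: one amplitude per 3-bit string
print(np.abs(state) ** 2)                      # each outcome has probability 1/8
```

With 50 qubits the same construction would need 2**50 amplitudes, which is why classical simulation of quantum states becomes infeasible so quickly.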
One of the most promising applications of quantum computing lies in healthcare, particularly in the fight against cancer. Quantum computers can analyze vast datasets of genetic information, environmental factors, and treatment outcomes to identify potential cures and personalized treatments. This approach could dramatically accelerate the development of effective therapies, potentially unlocking the secrets to curing cancer. This could also allow us to find the cure for many other diseases because of the ability to test chemical compounds so quickly.
Quantum computing also has the potential to revolutionize transportation. By processing enormous amounts of data from sensors, traffic patterns, and environmental conditions in real-time, quantum-powered AI systems could drastically reduce, if not eliminate, car accidents. This would not only save lives but also pave the way for more efficient, autonomous vehicles.
The energy sector stands on the brink of a quantum revolution. Quantum computing could optimize renewable energy systems, enhance battery storage capacities, and improve energy distribution networks. These advancements could lead to more effective use of renewable resources, reducing our reliance on fossil fuels and mitigating the effects of climate change.
Despite its potential, quantum computing faces significant challenges. The technology is still in its infancy, and developing stable, large-scale quantum computers remains a daunting task. Moreover, with great power comes great responsibility. Ensuring ethical use and preventing misuse of quantum computing in areas like surveillance and cybersecurity is paramount.
While we may not have all the answers yet, quantum computing promises a future where some of today's most daunting problems could be effectively addressed. From healthcare to transportation and energy sustainability, the quantum leap could be closer than we think, heralding a new era of innovation and problem-solving.
Read the original here:
Why Quantum Computing Will Change the World | by Noah Graham | Dec, 2023 - Medium
The future is quantum: universities look to train engineers for an emerging industry – Nature.com
IBM physicist Olivia Lanes says quantum tech needs workers from various educational levels.Credit: IBM
The first year of university is always an opportunity to explore, but William Papantoniou really took the plunge. From the start of his studies in 2021 at the University of New South Wales (UNSW) in Sydney, Australia, he signed up for the university's latest offering: an undergraduate degree in quantum engineering.
Now a third-year student, Papantoniou chose the programme because he wanted to learn more about quantum computers and the physics that makes them run. He first heard of the devices in a programming class during secondary school. "It was presented as the future of computing," he says. "They described how quantum computing makes complex problems simpler."
The programme prepares students to enter the emerging quantum-technology industry, which has begun to develop devices that use individual atoms, electrons, photons and other components exhibiting quantum properties. These distinctive properties allow quantum computers to execute types of algorithm that are not easily accessed by conventional computers.
Quantum technology includes magnetic sensors and atomic clocks, as well as quantum computers, which some specialists project will take at least a decade to become commercially useful. Proponents tout these devices as a technological paradigm shift, in which quantum mechanics enables extremely precise measurements and a fresh way for computers to crunch numbers.
William Papantoniou explores quantum devices in a practical class as part of his degree.Credit: William Papantoniou/The UNSW Quantum Engineering Society
Many industries are betting that they will benefit from the anticipated quantum-computing revolution. Pharmaceutical companies and electric-vehicle manufacturers have begun to explore the use of quantum computers in chemistry simulations for drug discovery or battery development. Compared with state-of-the-art supercomputers, quantum computers are thought to more efficiently and accurately simulate molecules, which are inherently quantum mechanical in nature.
From software developers to biologists and chemists, users are now investigating whether quantum technology can bolster their fields. But there is still lively debate about how the technology will pan out, says physicist Olivia Lanes, a researcher at IBM in Yorktown Heights, New York. "A lot of people don't want to enter the industry until they see the technology is robust, but can we make it robust without them?"
The UNSW's undergraduate degree begins to fill a void in quantum education outside PhD programmes. Since its development in the early twentieth century, quantum mechanics has largely been the province of basic research, falling within the purview of graduate studies. When quantum technologies began to be commercialized in the 2010s, the industry predominantly hired researchers with physics PhDs.
But in the past decade, governments including those in Australia, the United States, the United Kingdom, China and the European Union have collectively pledged billions of dollars to develop the quantum-technology industry. That's aside from the commercial investment by technology companies such as Google, Microsoft, IBM and smaller start-ups. As the industry grows, experts have already started to bemoan a lack of qualified job candidates, and the shortfall looks likely to expand.
Andrea Morello instructs students in the quantum-engineering teaching laboratory at the University of New South Wales in Sydney, Australia.Credit: UNSW Sydney
For example, one estimate suggests that Australia's quantum-technology industry could provide 19,400 jobs by 2045 (see go.nature.com/3ubxvac), yet a 2016 survey tallied only about 5,000 PhD physicists in the entire country (see go.nature.com/46rgpuu). With a physics graduate degree often taking five years or longer, "we simply cannot produce PhDs fast enough to satisfy the needs of this booming industry," says physicist Andrea Morello, who helped to start the UNSW's undergraduate programme in quantum engineering. Instead, the industry will predominantly need engineers with undergraduate training in relevant quantum topics, such as how the hardware components work and how to write relevant software.

The evolution of the quantum industry parallels that of the computer-science industry over the past 50 years. Jobs in computing in the United States grew by more than tenfold between 1970 and 2014, according to the US Census Bureau. In the early 1970s, many universities established and expanded their computer-science undergraduate programmes in anticipation.
The quantum-tech industry will need workers with various educational backgrounds to benefit society. "A technology can't succeed if the only people who know how to use it are PhDs," says Lanes.
In response to this demand, some universities are starting quantum-training programmes at both the bachelor's and master's levels. In 2019, Saarland University in Saarbrücken, Germany, introduced an undergraduate quantum-engineering degree similar to the UNSW's and launched a master's programme a year later. Bachelor's students at Virginia Tech in Blacksburg can opt for quantum and information science as a secondary specialization, which was introduced in 2022. "Pretty much every week, I'll learn about a new programme somewhere," says quantum experimentalist Abraham Asfaw, who leads education and outreach efforts for Google's quantum team in Santa Barbara, California.
Google quantum experimentalist Abraham Asfaw works on a dilution refrigerator in his laboratory in Santa Barbara, California.Credit: Erik Lucero, Google Quantum AI
Undergraduate degree programmes aim to train engineers who work directly with quantum devices and require a relatively deep understanding of quantum mechanics. The industry also needs engineers to work with conventional technology, such as the cryogenics systems that keep quantum computers cold enough to operate, or the optical fibres that link multiple quantum devices. These engineers could perhaps learn the necessary quantum mechanics in an undergraduate course or two that are incorporated into a conventional engineering degree or vocational programme, says Asfaw.
Morello and his colleagues built the UNSW's quantum-engineering programme on the framework of a conventional electrical-engineering degree. Students take largely the same courses as non-quantum engineers, but with extra, quantum-specific classes. Morello says they designed the programme so that its graduates could still choose to work as conventional electrical engineers. "It's really important to choose a degree that gives you a solid basis while providing you options," says Morello.
UNSW's quantum courses originate from master's classes that Morello and his colleagues deliver. These have required academics to rethink how they teach quantum mechanics. The conventional approach comes from a theoretical physics perspective, which centres on understanding the behaviour of idealized quantum objects, such as a single confined particle. "In traditional quantum mechanics courses, you [might] spend a day talking about applications, but it's not the focus of the course," says physicist Lex Kemper, who is developing an undergraduate quantum-engineering course at North Carolina State University in Raleigh.
For example, undergraduate physics students typically learn about quantized energy levels, in which quantum objects can lose or gain energy only in discrete amounts, or quanta, through physicist Niels Bohr's quantum model of hydrogen, the simplest atom. Bohr's model depicts hydrogen as a positively charged nucleus with an orbiting negatively charged electron, and the atom can lose or gain a quantum of energy by emitting or absorbing a photon, a particle of light. Instead, Morello uses a real-world example in his teaching: a material called a quantum dot, which is used in some LEDs and in some television screens. "I can now teach quantum mechanics in a way that is far more engaging than the way I was taught quantum mechanics when I was an undergrad in the 1990s," he says.
Morello also teaches the mathematics behind quantum mechanics in a more computer-friendly way. His students learn to solve problems using matrices, which they can represent in code written in the Python programming language, rather than with conventional differential equations on paper.
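The article doesn't reproduce Morello's exercises, but the matrix-based style it describes looks roughly like this hypothetical example: writing a spin-1/2 energy operator as a 2×2 matrix and reading the quantized energy levels off its eigenvalues, with no differential equations in sight:

```python
import numpy as np

# Pauli-Z as a 2x2 matrix: the energy operator of a spin-1/2 in a magnetic
# field is proportional to it (units chosen here so the constant is 1).
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = sigma_z

energies, states = np.linalg.eigh(H)   # eigenvalues = the allowed energy levels
print(energies)                        # -> [-1.  1.]: two discrete levels
```

The same recipe scales to any finite-dimensional quantum system: build the matrix, diagonalize it, read off the spectrum.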
His colleagues at the UNSW are also developing laboratory courses to give students hands-on experience with the hardware in quantum technologies. For example, they designed a teaching lab to convey the fundamental concept of quantum spin, a property of electrons and some other quantum particles, using commercially available synthetic diamonds containing defects known as nitrogen-vacancy centres (V. K. Sewani et al. Preprint at https://arxiv.org/abs/2004.02643; 2020). Students can use magnets and a laser to observe and measure effects resulting from the diamond's quantum spin.
During his second trimester, Papantoniou started the Quantum Engineering Student Society. "It's a difficult degree. There's a lot of physics, a lot of maths and a lot of engineering, all of it combined together," he says. "I realized straight away that there would be a need for study groups and social events to bring us together." The group invites people working at quantum-technology companies to give talks, and organizes tours of academic labs.
Asfaw thinks of these academic programmes as experiments. The quantum-technology community still needs to work out how to evaluate their success, and how various programmes can share their experiences, says Asfaw, who has helped to organize the quantum-education community. In 2020, he worked with a group of academics to identify the key concepts needed to prepare students for entering the quantum industry. These include the idea of a quantum bit, or qubit, which is the fundamental unit of information; and of a quantum state, which is a mathematical representation of a quantum object. In 2021, Asfaw worked with academics and quantum-industry specialists to publish an undergraduate curriculum in quantum engineering (A. Asfaw et al. Preprint at https://arxiv.org/abs/2108.01311; 2021).
Quantum-computing companies are helping to develop quantum education directly. The industry's overall objectives are to build quantum computers and work out how to use them, says Asfaw. It will require a large and diverse workforce to achieve those objectives, so it is in the companies' interests to help to train that workforce.
IBM quantum researcher Abby Mitchell.Credit: IBM
Companies are offering teaching resources for undergraduate educators. Kemper has logged into IBM's small prototype quantum computers through the cloud to teach his undergraduates the basics. Both IBM's Qiskit and Google's Cirq are open-source software packages that anyone can use and build on. For those who have left university, contributing to this software offers a path into a quantum-computing-related job, if they're willing to put in the time. Abby Mitchell, who works for IBM's quantum team in Yorktown Heights and who studied arts and sciences as an undergraduate, learnt quantum computing on the job by writing and debugging code for Qiskit. "I managed to transfer from my old job at IBM doing web development into a full-time member of the Qiskit community team," she says.
It's still unclear how quantum technology will bring commercial value. In many ways, it is a solution looking for a problem. Quantum communication, such as creating and delivering encryption keys encoded in single photons, is theoretically more secure than current cryptography techniques. But these technologies have delivered mixed results in practice, and require buy-in from institutions such as banks and governments. Existing quantum computers still make too many errors to be able to execute commercially valuable algorithms, and researchers have not worked out whether they can do anything useful with these adolescent machines. "It's a chicken-and-egg problem in some ways," says Lanes.
But Papantoniou sees the uncertain future of quantum technologies as an opportunity. Even if quantum computing doesn't become commercially successful in the next few years, he says he can use the skills in nearer-term technologies, such as quantum sensing.
He has two more years before he graduates with two bachelor's degrees, in quantum engineering and computer science. He plans to enter the quantum-technology industry after graduation, and is particularly interested in the development of algorithms for quantum computers. "I have to do a lot of explaining to my parents [about] what I study," says Papantoniou. "At this point, nobody really knows what a quantum engineer is. But in ten years' time, they will."
Link:
The future is quantum: universities look to train engineers for an emerging industry - Nature.com