Category Archives: Quantum Computer
BBVA runs successful trial of distributed quantum simulation in the cloud – Finextra
BBVA has completed a successful trial of the execution of quantum algorithms across multiple conventional servers in the AWS cloud.
Javier Recuenco Andrés, head of the Technical Architecture Innovation area at BBVA CIB and leader of the pilot project, says: "With these trials, we have shown that at BBVA we can have a proprietary architecture for executing quantum algorithms, which would help further our exploration of their use in complex financial tasks."
To run its test, BBVA worked with the Quantum Computing team of the digital transformation company VASS and AWS, using Qiskit software to distribute the execution of quantum algorithms across multiple classical compute servers located in the AWS cloud, and created a platform to automate and streamline the distribution process.
During the tests, BBVA was able to run quantum algorithms scaling up to a total computing power of 38 qubits, a scale that is difficult to reach with the use of a single classical computer. The higher the number of qubits, the more complex the problems the system can tackle.
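A back-of-the-envelope calculation shows why 38 qubits strains a single machine (this assumes a dense state-vector simulator storing one complex128 amplitude per basis state, which is the standard memory model for this kind of simulation):

```python
# Memory needed to hold the full state vector of an n-qubit simulation:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 34, 38):
    print(f"{n} qubits: {statevector_bytes(n) / 1e9:,.0f} GB")
# 30 qubits fits in ~17 GB of RAM; 38 qubits needs ~4,398 GB (~4.4 TB),
# which is why the workload was spread across multiple servers.
```

Every added qubit doubles the memory requirement, so distributing the state across many classical servers buys only a handful of extra qubits, but those last few qubits are exactly the hard ones.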
Alongside its own in-house trials, BBVA is also a founding member of the Quantum Safe Financial Forum (QSFF), a safe space for collaboration between European and US financial firms promoted by Europol's European Cybercrime Centre (EC3). The alliance aims to foster the creation of new technological systems within the financial industry that are safe, secure and resilient to malicious attacks that rely on quantum computing.
"At BBVA we explore the potential of quantum computing for two main reasons: to try to find better solutions to business problems and to strengthen the security of our communications and data to counteract the malicious use of quantum computing by third parties," explains Escolástico Sánchez, leader of the Quantum discipline at BBVA. "The distributed quantum simulation pilot we have successfully completed is a further step in this exploration, which could enable different business units of the bank to leverage this technology."
Unveiling the Potential of Hole Spin Qubits in Quantum Computing – AZoQuantum
Scientists from the University of Basel and the NCCR SPIN have achieved the first controllable interaction between two hole spin qubits in a typical silicon transistor. The discovery makes it possible to use established manufacturing techniques to combine millions of these qubits on a single chip. The research was published in the journal Nature Physics.
The race to build a practical quantum computer is well underway, with researchers around the world working on a huge variety of qubit technologies. Up until now, there has been no consensus on what type of qubit is most suitable for maximizing the potential of quantum information science.
A quantum computer's qubits, which manage data processing, transport, and storage, are its fundamental components. They need to process information quickly and store it accurately to function properly. Stable and quick interactions among several qubits whose states are reliably controllable externally constitute the foundation for fast information processing.
Millions of qubits need to fit on a single chip for a quantum computer to be useful. With only a few hundred qubits, the most sophisticated quantum computers available today are limited to performing tasks that can already be completed (and frequently done more quickly) by conventional computers.
Researchers at the University of Basel and the NCCR SPIN are addressing the challenge of arranging and linking thousands of qubits by utilizing a type of qubit that exploits the spin (intrinsic angular momentum) of either an electron or a hole.
A hole is essentially a missing electron in a semiconductor. Both holes and electrons possess spin, which can adopt one of two states: up or down, analogous to 0 and 1 in classical bits. An advantage of a hole spin over an electron spin is that it can be controlled entirely through electrical means, eliminating the need for additional components such as micromagnets on the chip.
As early as 2022, physicists from Basel demonstrated that hole spins could be trapped and utilized as qubits in existing electronic devices. These devices, known as "FinFETs" (fin field-effect transistors), are integral components of modern smartphones and are manufactured through widespread industrial processes.
Recently, a team led by Dr. Andreas Kuhlmann achieved a breakthrough by successfully facilitating a controllable interaction between two qubits within this setup for the first time.
Quantum computers require "quantum gates" to perform calculations; these gates are operations that manipulate qubits and link them together. As detailed in the journal Nature Physics, researchers have successfully coupled two qubits and achieved a controlled flip of one qubit's spin based on the state of the other's spin, a process referred to as a controlled spin-flip.
"Hole spins allow us to create two-qubit gates that are both fast and high-fidelity. This principle now also makes it possible to couple a larger number of qubit pairs."
Dr. Andreas Kuhlmann, Department of Physics, University of Basel
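In the ideal-gate picture, the controlled spin-flip described above is the textbook CNOT operation (this sketch shows only the abstract matrix, not the pulse-level exchange interaction the Basel team engineered in hardware):

```python
import numpy as np

# CNOT: flip the target qubit only when the control qubit is |1>.
# Basis ordering |control, target>: |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket_00 = np.array([1, 0, 0, 0], dtype=complex)  # control=0: nothing happens
ket_10 = np.array([0, 0, 1, 0], dtype=complex)  # control=1: target flips

print(CNOT @ ket_00)  # stays |00>
print(CNOT @ ket_10)  # becomes |11>
```

Combined with arbitrary single-qubit rotations, this one two-qubit gate is enough for universal quantum computation, which is why demonstrating it controllably in a silicon transistor matters.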
The exchange interaction between two indistinguishable particles that interact electrostatically provides the basis for the coupling of two spin qubits.
Surprisingly, the exchange energy of holes is not only electrically controllable but also strongly anisotropic. This is due to the spin-orbit coupling, meaning that the spin state of a hole is influenced by its motion through space.
Experimental and theoretical physicists from the NCCR SPIN and the University of Basel joined forces to describe this observation in a model.
"The anisotropy makes two-qubit gates possible without the usual trade-off between speed and fidelity. Qubits based on hole spins not only leverage the tried-and-tested fabrication of silicon chips, they are also highly scalable and have proven to be fast and robust in experiments."
Dr. Andreas Kuhlmann, Department of Physics, University of Basel
The study emphasizes how promising this strategy is for creating a large-scale quantum computer.
Geyer, S., et al. (2024) Anisotropic exchange interaction of two hole-spin qubits. Nature Physics. doi.org/10.1038/s41567-024-02481-5.
Source: https://www.unibas.ch/en.htm
BBVA runs successful trial of distributed quantum simulation in the cloud – BBVA
To run its test, BBVA worked with the Quantum Computing team of the digital transformation company VASS and AWS, using Qiskit software to distribute the execution of quantum algorithms across multiple classical compute servers located in the AWS cloud, and created a platform to automate and streamline the distribution process.
With this distributed quantum simulation, one of the first of its kind in the financial sector, BBVA was able to run quantum algorithms scaling up to a total computing power of 38 qubits, a scale that is difficult to reach with the use of a single classical computer. The higher the number of qubits, the more complex the problems the system can tackle.
BBVA is a founding member of the Quantum Safe Financial Forum (QSFF), a safe space for collaboration between European and US financial firms promoted by Europol's European Cybercrime Centre (EC3). The alliance aims to foster the creation of new technological systems within the financial industry that are safe, secure and resilient to malicious attacks that rely on quantum computing.
The trial also served to demonstrate that classical computers can be used to test quantum algorithms at scale and in an ideal computing environment. Quantum computing is an emerging technology and today's hardware is highly susceptible to noise. Running large-scale simulations allows BBVA to explore potential applications in a noise-free environment, with the potential to bring these applications to larger, more fault-tolerant quantum hardware as it matures.
"The results were exactly what we expect to obtain in a fault-tolerant quantum computer," said Javier Recuenco Andrés, head of the Technical Architecture Innovation area at BBVA CIB and leader of the pilot project. "With these trials, we have shown that at BBVA we can have a proprietary architecture for executing quantum algorithms, which would help further our exploration of their use in complex financial tasks."
"At BBVA we explore the potential of quantum computing for two main reasons: to try to find better solutions to business problems and to strengthen the security of our communications and data to counteract the malicious use of quantum computing by third parties," explained Escolástico Sánchez, leader of the Quantum discipline at BBVA. "The distributed quantum simulation pilot we have successfully completed is a further step in this exploration, which could enable different business units of the bank to leverage this technology."
Trolling IBM’s Quantum Processor Advantage With A Commodore 64 – Hackaday
The memory map of the implementation, as laid out within the address space of the Commodore 64: about 15 kB of the accessible 64 kB of RAM is used.
There's been a lot of fuss about the "quantum advantage" that would arise from the use of quantum processors and quantum systems in general. Yet in this high-noise, high-uncertainty era of quantum computing it seems fair to say that the "advantage" part is a bit of a stretch. Most recently an anonymous paper (PDF, starts at page 199) takes IBM's claims for its 127-qubit Eagle quantum processor to their ludicrous conclusion by running the same Trotterized Ising model on the ~1 MHz MOS 6510 processor in a Commodore 64. (Worth noting: this paper was submitted to SIGBOVIK, the conference of the Association for Computational Heresy.)
We previously covered the same claims by IBM already getting walloped by another group of researchers (Tindall et al., 2024) using a tensor network on a classical computer. The anonymous submitter of the SIGBOVIK paper based their experiment on a January 2024 research paper by [Tomislav Begušić] and colleagues as published in Science Advances. These researchers also used a classical tensor network to run the IBM experiment many times faster and more accurately, which the anonymous researcher(s) took as the basis for a version that runs on the C64 in a mere 15 kB of RAM, with the code put on an Atmel AT28C256 ROM inside a cartridge which the C64 then ran from.
The same sparse Pauli dynamics algorithm was used as by [Tomislav Begušić] et al., with some limitations due to the limited amount of RAM, implemented in 6502 assembly. Although the C64 is ~300,000x slower per datapoint than a modern laptop, it does this much more efficiently than the quantum processor, and without the high error rate. Yes, that means that a compute cluster of Commodore 64s can likely outperform a "please call us for a quote" quantum system, depending on which linear algebra problem you're trying to solve. Quantum computers may yet have their application, but this isn't it, yet.
Thanks to [Stephen Walters] and [Pio] for the tip.
French quantum computing powerhouses Pasqal and Welinq announce partnership – Tech.eu
Today, two French quantum computing companies, Pasqal and Welinq, announced a partnership set to bring new standards to the quantum computing industry.
Pasqal builds quantum processors from ordered neutral atoms in 2D and 3D arrays to give its customers a practical quantum advantage and address real-world problems. It was founded in 2019 out of the Institut d'Optique by Georges-Olivier Reymond, Christophe Jurczak, Professor Dr Alain Aspect (Nobel Prize Laureate in Physics, 2022), Dr Antoine Browaeys, and Dr Thierry Lahaye. To date, Pasqal has secured more than €140 million in financing.
Welinq develops and commercialises quantum links based on laser-cooled neutral atom quantum memories to interconnect quantum computers, drastically increasing their computational power and ensuring their deployment in clusters on customer premises.
The company spun out from Sorbonne Université, CNRS and PSL-University and was founded in 2022 by Tom Darras, Prof Julien Laurat, Dr Eleni Diamanti and Jean Lautier-Gaud.
Next-generation Quantum Processing Units (QPUs) are expected to execute quantum algorithms relying on a large number of qubits while applying error correction, which would necessitate an even more significant number.
Welinq harnesses a unique solution to interconnect multiple QPUs, significantly enhancing computational power. This facilitates scaling up the number of qubits, optimises QPU deployment and establishes the foundation for expansive quantum networks. Welinq's world-leading quantum memories, which are essential in creating these pivotal quantum links, are central to this breakthrough.
The two companies aim to push the boundaries of quantum processing unit (QPU) interconnectivity. Welinq brings to the partnership its full-stack, turnkey quantum links and the world's most efficient quantum memories based on cold neutral atoms, promising to provide the scalability necessary for achieving fault-tolerant quantum computing.
Pasqal offers expertise in quantum computing with neutral atoms, featuring full-stack capabilities from hardware design and development to software solutions.
By the end of 2024, Welinq targets an industrial prototype of its neutral atom quantum memory with cutting-edge efficiency, storage time, and fidelity. Pasqal aims for a breakthrough in 2024 with 1000-qubit QPUs. The roadmap peaks in the 2026-2027 horizon with projected 10,000-qubit QPUs and high-fidelity two-qubit gates.
By 2030, they aim to foster a thriving quantum computing ecosystem, driving significant scientific and commercial advancements.
Multiple Pasqal neutral atom quantum processors will be interconnected for the first time, significantly boosting computing power. This represents a substantial step toward developing a complete, fault-tolerant quantum computing architecture that supports distributed computing.
Georges-Olivier Reymond, CEO and co-founder of Pasqal, commented:
"The partnership between Pasqal and Welinq is a strategic step towards practical quantum computing.
Our collaboration is centred on creating tangible solutions by integrating Pasqal's precision in quantum processing with Welinq's innovative networking and quantum memory systems.
This is quantum advancement with real-world application in mind, striving to solve complex problems with greater efficiency and reliability."
According to Tom Darras, CEO & Co-founder of Welinq:
"I am delighted to see that Welinq's unique vision for the scale-up of quantum computing is in alignment with that of quantum computing leaders like Pasqal.
This is a landmark for boosting the global quantum community towards achieving practical quantum computing in networked quantum computer architectures."
Quantum Computing Could be the Next Revolution – Fair Observer
Every few decades, the world witnesses technological revolutions that profoundly change our lives. This happened when we first invented computers, when we created the Internet and most recently when artificial intelligence (AI) emerged.
Today, experts frequently speculate that the next revolution will involve technologies grounded in the principles of quantum mechanics. One such technology is quantum computing. Harnessing the unique properties of quantum mechanics, quantum computers promise to achieve superior computational power, solving certain tasks that are beyond the reach of classical computers.
Quantum computers can potentially transform many sectors, from defense and finance to education, logistics and medicine. However, we are currently in a quantum age reminiscent of the pre-silicon era of classical computers. Back then, state-of-the-art computers like ENIAC ran on vacuum tubes, which were large, clunky, and required a lot of power. During the 1950s, experts investigated various platforms to develop the most efficient and effective computing systems. This journey eventually led to the widespread adoption of silicon semiconductors, which we still use today.
Similarly, today's quantum quest involves evaluating different potential platforms to produce what the industry commonly calls a fault-tolerant quantum computer: a quantum computer able to perform reliable operations despite the presence of errors in its hardware.
Tech giants, including Google and IBM, are adapting superconductors (materials that have zero resistance to electrical current) to build their quantum computers, claiming that they might be able to build a reasonably large quantum computer by 2030. Other companies and startups dedicated to quantum computing, such as QuEra, PsiQuantum and Alice & Bob, are experimenting with other platforms and even occasionally declaring that they might be able to build one before 2030.
Until the so-called fault-tolerant quantum computer is built, the industry needs to go through an era commonly referred to as the Noisy Intermediate-Scale Quantum (NISQ) era. NISQ quantum devices contain a few hundred quantum bits (qubits) and are typically prone to errors due to various quantum phenomena.
NISQ devices serve as early prototypes of fault-tolerant quantum computers and showcase their potential. However, they are not expected to clearly demonstrate practical advantages, such as solving large-scale optimization problems or simulating sufficiently complex chemical molecules.
Researchers attribute the difficulty of building such devices to the significant amount of errors (or noise) NISQ devices suffer from. Nevertheless, this is not surprising. The basic computational units of quantum computers, the qubits, are highly sensitive quantum particles easily influenced by their environment. This is why one way to build a quantum computer is to cool these machines to near zero kelvin, a temperature colder than outer space. This reduces the interaction between qubits and the surrounding environment, thus producing less noise.
Another approach is to accept that such levels of noise are inevitable and instead focus on mitigating, suppressing or correcting any errors produced by such noise. This constitutes a substantial area of research that must advance significantly if we are to facilitate the construction of fault-tolerant quantum computers.
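The flavor of error correction can be conveyed with its simplest classical analogue, a three-bit repetition code with majority-vote decoding (quantum codes must additionally handle phase errors without directly measuring the data, but the redundancy-plus-voting intuition carries over):

```python
import random

def encode(bit):
    return [bit, bit, bit]          # one logical bit -> three physical bits

def noisy_channel(bits, p_flip, rng):
    # Each bit is independently flipped with probability p_flip.
    return [b ^ (rng.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)      # majority vote

rng = random.Random(0)
p, trials = 0.05, 10_000

raw_errors = sum(rng.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), p, rng)) != 0
                   for _ in range(trials))

# Encoding turns a per-bit error rate p into roughly 3*p**2: a big win for small p.
print(f"unencoded error rate ~{raw_errors / trials:.3f}, "
      f"encoded ~{coded_errors / trials:.4f}")
```

The catch, in both the classical and quantum settings, is that correction only helps when the physical error rate is already below a threshold; above it, adding redundancy adds more errors than it removes.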
As the construction of quantum devices progresses, research advances rapidly to explore potential applications, not just for future fault-tolerant computers, but also possibly for todays NISQ devices. Recent advances show promising results in specialized applications, such as optimization, artificial intelligence and simulation.
Many speculate that the first practical quantum computer may appear in the field of optimization. Theoretical demonstrations have shown that quantum computers will be capable of solving optimization problems more efficiently than classical computers. Performing optimization tasks efficiently could have a profound impact on a broad range of problems. This is especially the case where the search for an optimized solution would usually require an astronomical number of trials.
Examples of such optimization problems are almost countless and can be found in major sectors such as finance (portfolio optimization and credit risk analysis), logistics (route optimization and supply chain optimization) and aviation (flight gate optimization and flight path optimization).
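To get a sense of the "astronomical number of trials" involved, consider an illustrative count (not any bank's actual workload): the number of candidate portfolios in a simple selection problem explodes combinatorially with the number of assets.

```python
from math import comb

# Number of ways to pick a portfolio of n//2 assets out of n candidates.
for n in (20, 50, 100):
    print(f"{n} assets: {comb(n, n // 2):,} candidate portfolios")
# At 100 assets there are ~1e29 half-sized portfolios, an astronomical
# number of trials for any exhaustive search.
```

Classical heuristics tackle this by exploring only a tiny corner of the space; the hope for quantum optimization is that quantum effects let the search cover that space more efficiently.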
AI is another field in which experts anticipate quantum computers will make significant advances. By leveraging quantum phenomena, such as superposition, entanglement and interference which have no counterparts in classical computing quantum computers may offer advantages in training and optimizing machine learning models.
However, we still do not have concrete evidence supporting such claimed advantages as this would necessitate larger quantum devices, which we do not have today. That said, early indications of these potential advantages are rapidly emerging within the research community.
Simulating quantum systems was the original application that motivated the idea of building quantum computers. Efficient simulations will likely drastically impact many essential applications, such as material science (finding new material with superior properties, like for better batteries) and drug discovery (development of new drugs by more accurately simulating quantum interactions between molecules).
Unfortunately, with the current NISQ devices, only simple molecules can be simulated. More complex molecules will need to wait for the advent of large fault-tolerant computers.
There is uncertainty surrounding the timeline and applications of quantum computers, but we should remember that the killer application for classical computers was not even remotely envisioned by their inventors. A killer application is the single application that contributed the most to the widespread use of a certain technology. For classical computers, the killer application, surprisingly, turned out to be spreadsheets.
For quantum computers, speculation often centers around simulation and optimization being the potential killer applications of this technology, but a definite winner is still far from certain. In fact, the quantum killer application may be something entirely unknown to us at this time and it may even arise from completely uncharted territories.
[Will Sherriff edited this piece.]
The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.
Exploring the Power of Quantum AI: What You Need to Know – Scioto Valley Guardian
Quantum AI is a fascinating field that combines the power of quantum computing with artificial intelligence to unlock new possibilities and revolutionize industries. In this article, we will delve into the basics of quantum computing, explore the next frontier of quantum AI algorithms, examine cutting-edge applications across various industries, discuss the challenges and opportunities that come with this technology, and speculate on the future of quantum AI. Whether you're a seasoned tech enthusiast or simply intrigued by the potential of groundbreaking innovations, the emergence of platforms like quantumai.co underscores the growing importance and accessibility of quantum AI in shaping the technological landscape of tomorrow.
The field of quantum computing represents a fundamental change in computational approach, moving away from classical computing's binary logic and towards the probabilistic domain of quantum mechanics. Fundamentally, quantum computing uses quantum bits, or qubits, to alter data by utilizing the laws of superposition and entanglement. Qubits are different from classical bits in that they can exist in more than one state at once. This allows for the processing of information in parallel and exponentially increases computer capacity. Shor's algorithm for integer factorization and Grover's algorithm for database search are two examples of algorithms that highlight the revolutionary potential of quantum computing in resolving intricate issues that are beyond the scope of classical systems.
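The scale of Grover's advantage for unstructured search can be sketched numerically: a classical linear scan checks about N/2 items on average, while Grover's algorithm needs roughly (pi/4) * sqrt(N) quantum queries.

```python
from math import pi, sqrt

def classical_queries(n_items):
    return n_items // 2                       # expected checks for a linear scan

def grover_queries(n_items):
    return round((pi / 4) * sqrt(n_items))    # optimal number of Grover iterations

for n_items in (10 ** 6, 10 ** 12):
    print(f"N={n_items:.0e}: classical ~{classical_queries(n_items):,}, "
          f"Grover ~{grover_queries(n_items):,}")
# For a trillion items: ~500 billion classical checks vs. ~785,398 Grover queries.
```

Note that this is a quadratic speedup, not an exponential one: Shor's factoring speedup is the exponential case, which is why it dominates the cryptography discussions elsewhere on this page.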
The cutting edge of computational innovation is embodied in AI algorithms, which combine the intelligence of artificial neural networks with the capabilities of quantum computing. Equipped with the concepts of quantum parallelism and superposition, these algorithms go beyond the limitations of traditional machine learning models by enabling quick data processing and improved optimization methods. The field of these artificial intelligence (AI) algorithms is expanding at an unprecedented rate, with ground-breaking developments ranging from quantum-inspired optimization algorithms such as the Quantum Approximate Optimisation Algorithm (QAOA) to quantum neural networks for pattern recognition and classification. These algorithms present unmatched prospects for propelling scientific research and expanding technological boundaries.
Quantum AI has the potential to revolutionize a wide range of sectors by spurring creativity and altering long-standing paradigms. By utilizing these optimization algorithms to negotiate complicated market dynamics and open up new paths for profit maximization, quantum AI in finance empowers algorithmic trading tactics, risk management protocols and portfolio optimization procedures. Quantum AI in healthcare heralds a new era of precision medicine and tailored medicines by streamlining drug discovery pipelines, facilitating genome sequencing and analysis, and enabling personalized treatment regimens. This AI also improves inventory management systems, expedites route optimization, and boosts demand forecasting skills in logistics and supply chain management, all of which maximize operational effectiveness and resource utilization.
Although quantum AI holds enormous potential, there are many difficulties and barriers in the way of its actualization. One of the biggest obstacles in the way of effective computing is still the search for fault-tolerant quantum hardware that can maintain stable qubits and reduce quantum decoherence. In addition, interdisciplinary cooperation and coordinated research efforts are required for the development of scalable quantum algorithms and error correction codes to overcome current obstacles and realize the full potential of quantum AI. However, these difficulties also present previously unheard-of chances for creativity, teamwork, and societal effect, highlighting the revolutionary potential of quantum AI in reshaping both technology and humankind.
Quantum AI is expected to evolve through a trajectory of rapid innovation, revolutionary breakthroughs, and paradigm shifts in computational approaches as we move towards a future driven by quantum energy. From ground-breaking studies in quantum information theory to industrial applications in quantum computing and artificial intelligence, the field of quantum AI is changing at a rate never seen before, changing entire industries, transforming scientific research, and advancing humankind to new heights of comprehension. The potential of quantum AI to surpass imagination and usher in an unprecedented era of technical growth and societal upheaval is contingent upon sustained investment, collaboration, and inventiveness.
To sum up, quantum AI is a cutting-edge technical advancement that embodies a stunning combination of artificial intelligence and quantum computing, with the potential to redefine human achievement. By exploring the complexities of artificial intelligence with quantum mechanics, we can open up new possibilities outside the scope of traditional computing paradigms. We set out on a voyage of exploration and invention as we negotiate the difficulties of quantum AI, driven by the unquenchable quest for knowledge and advancement.
The field of quantum AI is constantly growing, offering numerous chances for groundbreaking discoveries, cross-disciplinary cooperation and societal influence. Quantum AI is a progress accelerator that will lead us to a future filled with limitless potential and unimaginable possibilities, revolutionizing everything from industries to scientific frontiers to solving urgent global concerns. As we approach the dawn of a quantum-powered era, let us seize the opportunity presented by quantum AI and use its revolutionary potential to create a better, more promising future for coming generations.
‘Almost very close’ to nuclear weapon: Federal cyber officials brace for quantum computing surprise – Washington Times
Federal cybersecurity officials are preparing for a quantum computing surprise that requires the largest change in encryption ever to safeguard Americans' data from foreign hackers.
The Cybersecurity and Infrastructure Security Agency's Garfield Jones said Tuesday that the emergence of a cryptanalytically relevant quantum computer will upend digital security in unprecedented ways and that people need to prepare immediately.
Such a device, dubbed a CRQC, would be capable of breaking encryption to expose government secrets and people's personal information to anyone who uses the machine, according to cyber officials.
Nations will rush to develop the tech and keep it hidden from public view in order to steal their enemies data while upending information security in the process, according to Mr. Jones, CISA associate chief of strategic technology.
"When it drops, it's not going to be, I don't think it's going to be a slow drop," Mr. Jones told cyber officials assembled at the U.S. General Services Administration. "I think once someone gets this CRQC, none of us will know."
Quantum computers promise speeds and efficiency that today's fastest supercomputers cannot match, according to the National Science Foundation. Classical computers have more commercial value now because quantum computers have not yet proven capable of correcting errors involving encoded data.
A cryptanalytically relevant quantum computer, the CRQC, will be capable of correcting errors, according to Mr. Jones, and of performing tasks that other computers cannot approach.
Preparations for defense against such technology are underway across the federal government.
Art Fuller, who is leading the Justice Department's post-quantum cryptography efforts, said developing secure systems presents a huge challenge that cannot be solved by flipping a switch.
"This is the largest cryptographic migration in history," Mr. Fuller told officials at Tuesday's event.
Estimates on the timing of the creation of such a quantum computer vary, but Mr. Jones said large-scale quantum computers remain in the early stages of research and development and could still be a ways off.
Regardless, Mr. Jones cautioned digital defenders against delaying preparation for the arrival of such technology.
He described the environment surrounding the development of the CRQC as "almost very close" to a nuclear weapon, with nations competing to obtain the machine and keep it top secret.
"You never know, three years from now, you might have a CRQC, but I think planning and getting that preparation in place will help you protect that data," Mr. Jones said.
The National Security Agency similarly fears the arrival of a CRQC in the hands of America's enemies.
NSA Director of Research Gil Herrera said last month that teams around the world are building with different technologies and could develop something representing a black swan event, an extremely unexpected occurrence with harsh consequences.
"If this black swan event happens, then we're really screwed," Mr. Herrera said, citing potential damage to everything from financial transactions to sensitive communications for nuclear weapons.
Mr. Herrera did not forecast precisely when a nation could develop such a device in remarks at the Intelligence and National Security Alliance event but indicated it may take a long time to achieve.
Future quantum computers will be no match for ‘space encryption’ that uses light to beam data around with the 1st … – Space.com
By converting data into light particles and beaming them around the world using satellites, we could prevent encrypted messages from being intercepted by a superpowerful quantum computer, scientists claim.
Currently, messaging technology relies on mathematical, or cryptographic, methods of protection, including end-to-end encryption. This technology is used in WhatsApp as well as by corporations, the government and the military to protect sensitive data from being intercepted.
Encryption works by scrambling data or text into what appears to be nonsense, using an algorithm and a key that only the sender and recipient can use to unlock the data. These algorithms can, in theory, be cracked. But they are designed to be so complex that even the fastest supercomputers would take millions of years to translate the data into something readable.
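To make the algorithm-plus-key idea concrete, here is a minimal toy cipher in Python. This is a sketch for illustration only (the keystream construction and the sample key are invented for this example; real systems use vetted designs such as AES): the sender and recipient share a key, the algorithm scrambles the message into apparent nonsense, and the same key unscrambles it.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the shared key (toy construction)."""
    out = b""
    for counter in count():
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the keystream, so the same call inverts itself."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"a secret only sender and recipient share"
ciphertext = xor_cipher(key, b"meet at noon")
print(ciphertext)                   # appears to be nonsense without the key
print(xor_cipher(key, ciphertext))  # b'meet at noon'
```

Brute-forcing such a scheme means trying keys until the output reads sensibly; with a modern 128- or 256-bit keyspace, that exhaustive search is what would keep classical supercomputers busy for millions of years.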
Related: World's 1st fault-tolerant quantum computer launching this year ahead of a 10,000-qubit machine in 2026
Quantum computers change the equation. Although the field is young, scientists predict that such machines will someday be powerful enough to break encryption algorithms with ease. This is because qubits can exist in superposition, allowing a quantum computer to explore exponentially many possibilities at once (the number grows with the qubit count), whereas a classical computer must work through calculations in sequence.
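That exponential growth is also why classical machines struggle to even simulate quantum ones: describing n qubits takes 2**n complex amplitudes. A minimal NumPy sketch (an illustration, not a full simulator) of putting every qubit into superposition shows the blow-up.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def uniform_superposition(n: int) -> np.ndarray:
    """Apply a Hadamard to each of n qubits starting from |0...0>,
    yielding an equal superposition over all 2**n basis states."""
    state = np.array([1.0])
    q0 = np.array([1.0, 0.0])                 # a single qubit in |0>
    for _ in range(n):
        state = np.kron(state, H @ q0)        # tensor in one more qubit
    return state

state = uniform_superposition(10)
print(state.size)   # 1024 amplitudes: the vector doubles with every qubit
print(state[0])     # each amplitude is 1/sqrt(1024), approximately 0.03125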
Fearing that quantum computers will render encryption obsolete someday, scientists are proposing new technologies to protect sensitive communications. One field, known as "quantum cryptography," involves building systems that can protect data from encryption-beating quantum computers.
Unlike classical cryptography, which relies on algorithms to scramble data and keep it safe, quantum cryptography would be secure thanks to the weird quirks of quantum mechanics, according to IBM.
Breaking space news, the latest updates on rocket launches, skywatching events and more!
For example, in a paper published Jan. 21 in the journal Advanced Quantum Technologies, scientists describe a mission called "Quick3," which uses photons (particles of light) to transmit data through a massive satellite network.
"Security will be based on the information being encoded into individual light particles and then transmitted," Tobias Vogl, professor of quantum communication systems engineering at the Technical University of Munich (TUM) and co-author of the paper, said in a statement. "The laws of physics do not permit this information to be extracted or copied."
That's because the very act of measuring a quantum system changes its state.
"When the information is intercepted, the light particles change their characteristics," he added. "Because we can measure these state changes, any attempt to intercept the transmitted data will be recognized immediately, regardless of future advances in technology."
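The detection mechanism Vogl describes can be caricatured with a classical Monte Carlo toy. This is a simplified BB84-style sketch, not the Quick3 protocol itself; the two bases ("+" and "x") and the 25% figure are standard textbook values. Measuring a photon in the wrong basis randomizes it, so an eavesdropper who intercepts and re-sends photons leaves a detectable error rate behind.

```python
import random

def send_photon(bit: int, basis: str) -> tuple:
    """A photon polarized to encode one bit in the chosen basis."""
    return (bit, basis)

def measure(photon: tuple, basis: str) -> int:
    bit, prep_basis = photon
    if basis == prep_basis:
        return bit                 # matching basis: faithful readout
    return random.randint(0, 1)    # wrong basis: result is random

random.seed(0)
n, errors = 2000, 0
for _ in range(n):
    bit = random.randint(0, 1)
    basis = random.choice("+x")
    photon = send_photon(bit, basis)
    # Eavesdropper measures in a random basis, then re-sends what she saw
    eve_basis = random.choice("+x")
    resent = send_photon(measure(photon, eve_basis), eve_basis)
    # Receiver measures in the sender's basis (the rounds kept in BB84)
    if measure(resent, basis) != bit:
        errors += 1
print(errors / n)  # roughly 0.25: interception shows up as a ~25% error rate
```

Without the eavesdropper, the error rate in these rounds would be zero, so any significant error rate tells the legitimate parties the channel has been tapped, regardless of the interceptor's technology.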
The challenge with traditional Earth-based quantum cryptography, however, lies in transmitting data over long distances, with a maximum range of just a few hundred miles, the TUM scientists said in the statement. This is because light tends to scatter as it travels, and there's no easy way to copy or amplify these light signals through fiber optic cables.
Scientists have also experimented with storing encryption keys in entangled particles, meaning the data is intrinsically shared between the two particles no matter how far apart they are. A project in 2020, for example, demonstrated "quantum key distribution" (QKD) between two ground stations 700 miles (1,120 kilometers) apart.
At altitudes above 6 miles (10 kilometers), however, the atmosphere is so thin that light is barely scattered or absorbed, so photon signals can be extended over much longer distances.
Quick3 would encompass the entire system for transmitting data this way, including the components needed to build the satellites. The team has already tested each component on Earth. The next step will be to test the system in space, with a satellite launch scheduled for 2025.
They will probably need hundreds, or perhaps even thousands, of satellites for a fully working quantum communications system, the team said.
Top Academics: Here’s How We Facilitate the Next Big Leap in Quantum Computing – PCMag Middle East
Table of Contents: From Quantum Physics to Quantum Computing; Grand Challenges and Error Correction; The Road to Quantum Advantage; Education and Workforce Development; The Quantum Bottom Line
In advance of the ribbon-cutting for its new IBM System One quantum computer, the first one on a college campus, Rensselaer Polytechnic Institute (RPI) last week hosted a quantum computing day which featured several prominent speakers who together provided a snapshot of where the field is now. I've been writing about quantum computing for a long time, and have noted some big improvements, but there are also a host of challenges that still need to be overcome.
Here are some highlights.
The first plenary speaker was Jay M. Gambetta, Vice President of Quantum Computing at IBM, who gave an overview of the history and progress of quantum computing, as well as the challenges and opportunities ahead. He explained that quantum computing is based on exploiting the quantum mechanical properties of qubits, such as superposition and entanglement, to perform computations that are impossible or intractable for classical computers. He talked about watching the development of superconducting qubits, as they moved from single qubit systems in 2007, to 3-qubit systems in 2011, and now with IBM's Eagle chip, which has 127 qubits and is the heart of the Quantum System One.
He then asked how we could make quantum computing useful. His answer: We need to keep building larger and larger systems and we need to improve error correction.
"There are very strong reasons to believe there are problems that are going to be easy for a quantum computer but hard for a classical computer, and this is why we're all excited," Gambetta said. He discussed the development of quantum circuits and said that while the number of qubits is important, equally important is the "depth": how many operations you can do and the accuracy of the results. Key to solving this are larger and larger systems, along with error mitigation, a topic that would be discussed in much greater detail later in the day.
To get to "quantum utility," which he said would be reached when a quantum computer is better than a brute-force simulation of a quantum computer on a classical machine, you would need larger systems with at least 1,000 gates, along with improved accuracy and depth, and new efficient algorithms.
He talked about quantum algorithmic discovery, which means finding new and efficient ways to map problems to quantum circuits. One example is a new variation on Shor's algorithm that allows for factorization much faster than would be possible on a classical computer. "The future of running error-mitigated circuits and mixing classical and quantum circuits sets us up to explore this space," he said.
In a panel discussion that followed, James Misewich from Brookhaven National Laboratory discussed his interest in using quantum computing to understand quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. QCD is a hard problem that scales with both the number of qubits and the depth of the circuits, and he is looking at entanglement between jets coming out of particle collisions as a possible avenue to explore quantum advantage.
Jian Shi and Ravishankar Sundararaman from RPI's Materials Science and Engineering faculty talked about computational materials science, and applying quantum computing to discover new materials and properties. Shi noted there was a huge community now doing quantum chemistry, but there is a gap between that and quantum computing. He stressed that a partnership between the two groups will be important, so each learns the language of the other and can approach the problems from a different perspective.
One of the most interesting talks was given by Steve M. Girvin, Eugene Higgins Professor of Physics at Yale University, who discussed the challenges of creating an error-corrected quantum computer.
Girvin described how the first quantum revolution produced things like the transistor, the laser, and the atomic clock, while the second quantum revolution is based on a new understanding of how quantum mechanics works. He said he usually tells his students to do the things Einstein said were impossible, just to make sure they have a quantum computer and not a classical one.
He thought there was a bit too much hype around quantum computing today: quantum is going to be revolutionary and do absolutely amazing things, he said, but it's not its time yet. We still have massive problems to solve.
He noted that quantum systems are extremely sensitive to external perturbations and noise, which is great for making sensors but bad for building computers. That is why error correction is so important.
Among the issues Girvin discussed was making measurements to detect errors, but he said we also need calculations to decide whether something truly is an error, where it is located, and what kind of error it is. Then there is the issue of deciding what signals to send to correct those errors. Beyond that, there is the challenge of putting these together in a system that reduces overall errors, perhaps borrowing from the flow-control techniques used in things like telephony.
In addition to quantum error detection, Girvin said there are "grand challenges all up and down the stack," from materials to measurement to machine models and algorithms. We need to know how to make each layer of the stack more efficient, using less energy and fewer qubits, and get to higher performance so people can use these to solve science problems or economically interesting problems.
Then there are the algorithms. Girvin noted that there were algorithms long before there were computers, but it took time to settle on the best ones for classical computing. For quantum computing, this is just the beginning; over time, we need people to figure out how to build up their algorithms and how to do heuristics. They need to discover why quantum computers are so hard to program and to develop clever tools to solve these problems.
Another challenge he described was routing quantum information. He noted that having two quantum computers that can communicate classically is exponentially less good than having two quantum computers that can communicate with quantum information, entangling with each other.
He talked about fault tolerance, which is the ability to correct errors even when your error-correction circuit itself makes errors. He believes the fact that it's possible to do that in a quantum system, at least in principle, is even more amazing than the fact that a perfect quantum computer could do interesting quantum calculations.
Girvin described the difficulty of correcting errors: you have an unknown quantum state, and you're not allowed to know what it is, because it comes from the middle of a quantum computation. (If you knew what it was, you would have destroyed the superposition, and if you measure it to check for an error, it will randomly change due to state collapse.) Your job, if the state develops an error, is to fix it anyway.
"That's pretty hard, but miraculously it can be done in principle, and it's even been done in practice," he said. We're just entering the era of being able to do it. The basic idea is to build in redundancy, such as a logical qubit that consists of multiple physical qubits, perhaps nine. You then have two possible giant entangled states corresponding to a logical zero and a logical one. Note that the one and zero don't live in any single physical qubit; each is a superposition spread across multiple physical qubits.
In that case, Girvin said, if the environment reaches in and measures one of those qubits, it doesn't actually learn the encoded logical state. There's an error, but the environment doesn't know which state it disturbed, so there's still a chance that you haven't totally collapsed anything and lost the information.
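The redundancy idea Girvin describes can be illustrated with the classical three-bit repetition code. This is only a toy: real quantum codes protect superpositions and must also catch phase errors, which this sketch omits. What it does show is the key trick of syndrome measurement, where parity checks locate a flipped bit without ever reading the encoded value.

```python
import random

def encode(bit: int) -> list:
    """A logical bit stored redundantly across three physical bits."""
    return [bit, bit, bit]

def syndrome(block: list) -> tuple:
    """Parity checks compare neighbors; they reveal where bits disagree,
    but never the logical value itself."""
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block: list) -> list:
    """Use the syndrome to locate and flip back the corrupted bit."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
    if flip is not None:
        block[flip] ^= 1
    return block

random.seed(1)
block = encode(1)
block[random.randrange(3)] ^= 1     # the environment flips one bit
assert syndrome(block) != (0, 0)    # error detected without reading the value
print(correct(block))               # [1, 1, 1]: the logical value is restored
```

In a real device, the analogous parity checks are measured quantum-mechanically so the logical superposition survives; that is the sense in which the correction circuit learns where the error is without learning the state.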
He then discussed measuring the probability of errors and seeing whether it exceeds some threshold value, with some complex math. Then comes correcting the errors, hopefully quickly, something that should improve with new error-correction methods and better, more precise physical qubits.
All this is still theoretical. That's why fault tolerance is a journey with improvements being made continuously. (This was in opposition to Gambetta, who said systems are either fault tolerant or they aren't). Overall, Girvin said, "We still have a long way to go, but we're moving in the right direction."
Later in the morning, Austin Minnich, Professor of Mechanical Engineering and Applied Physics at Caltech, described "mid-circuit measurement" and the need for hybrid circuits as a way of finding, and thus mitigating, errors.
In a discussion that followed, Kerstin Kleese van Dam, Director of the Computational Science Initiative at Brookhaven National Laboratory, explained that her team was looking for answers to problems, whether solved on traditional or quantum machines. She said there were problems they can't solve accurately on a traditional computer, but there remains the question of whether the accuracy will matter. There are areas, such as machine learning, where quantum computers can do things accurately. She predicts that quantum advantage will come when we have systems that are large enough. But she also wondered about energy consumption, noting that a lot of power is going into today's AI models, and if quantum can be more efficient.
Shekhar Garde, Dean of the School of Engineering, RPI, who moderated this part of the discussion, compared the status of quantum computing today to where traditional computing was in the late 70s or early 80s. He asked what the next 10 years would bring.
Kleese van Dam said that within 10 years we would see hybrid systems that combine quantum and classical computing; she also hoped we would see libraries transferred from high-performance computing to quantum systems, so a programmer could use them without having to understand how the gates work. Aparna Gupta, Professor and Associate Dean of RPI's Lally School of Management, would bet on the hybrid approach offering easier access and cost-effectiveness, as well as "taking away the intrigue and the spooky aspects of quantum, so it is becoming real for all of us."
Antonio Corcoles, Principal Research Scientist at IBM Quantum, said he hoped users who don't know quantum will eventually be able to use the system because the complexity will be abstracted away, but that could take a long time. In the meantime, researchers can develop quantum error correction in ways that are less disruptive than current methods. Minnich talked about "blind quantum computing," where many smaller machines might be linked together.
One of the most interesting talks came from Lin Lin, Professor of Mathematics at the University of California, Berkeley, who discussed the theoretical aspects and challenges of achieving quantum advantage for scientific computation. He defined quantum advantage as the ability to solve problems that are quantumly easy but classically hard, and proposed a hierarchy of four levels of problems.
Lin said that for the first two levels, a lot of people think quantum advantage will be achieved, as the methods are generally understood. But on the next two levels, there needs to be a lot of work on the algorithms to see if it will work. That's why this is an exciting time for mathematicians as well as physicists, chemists, and computer scientists.
This talk was followed by a panel during which Lin said that he is interested in solving quantum many-body problems, as well as applying quantum computing to other areas of mathematics, such as numerical analysis and linear algebra.
Like Garde, Lin compared where quantum is today to the past, going even further to say it's where classical computing was 60 or 70 years ago, when error correction was still a central concern. Quantum computing will need to be a very interdisciplinary field: it requires people who are very good at building the machines, but the machines will always produce errors, so it also requires both mathematical and engineering ways to correct them.
Ryan Sweke from IBM Research noted that one of the things that has allowed classical computing to develop to the point it is at is the various levels of abstraction, so if you want to work on developing algorithms, you don't have to understand how the compiler works. If you want to understand how the compiler works, you don't have to understand how the hardware works.
The interesting thing in the quantum regime, as seen in error mitigation for example, is that people who work at the top level of abstraction have to interact with people who are developing the devices. This is an exciting aspect of the time we're in.
Di Fang, Assistant Professor of Mathematics, Duke University, said now was a "golden time for people who work on proving algorithms." She talked about the varying levels of complexity, and the need to see where new algorithms can solve theoretical problems, then look at the hardware and solve practical problems.
Brian McDermott, Principal R&D Engineer at the Naval Nuclear Laboratory, said he was looking at this in reverse, seeing what the problems are and then working backward toward the quantum hardware and software. His job involved matching applications of new and emerging computing architectures to the types of engineering problems that are important to the lab's mission for new nuclear propulsion.
The panelists discussed where quantum algorithms could have the most impact. McDermott talked about things like finite elements and computational fluid dynamics, going up to materials science. As a nuclear engineer, he was first attracted to the field because of the quantum properties of the nucleus itself, moving from predicting behaviors in astrophysics and the synthesis of nuclei in a supernova to, with engineering, nuclear reactors and things like fusion. Lin discussed the possibilities for studying molecular dynamics.
Olivia Lanes, Global Lead and Manager for IBM Quantum Learning and Education, gave the final talk of the day, in which she discussed the need for workforce development in the quantum field.
Already the US is projected to face a shortfall of nearly two million STEM workers by next year. She quoted Carl Sagan, who said "We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology," and agreed with him that this is a recipe for disaster.
She noted that not only do very few people understand quantum computing, very few actually understand how classical computers work. She cited a McKinsey study which found that there are three open jobs in quantum for every person qualified to fill those positions. It's probably just going to get worse from here to 2026.
She focused on upskilling and said it was unrealistic to expect that we'll make everyone into an expert in quantum computing. But, she said, the quantum ecosystem will require many other kinds of jobs, and she urged students to focus on the areas that particularly interest them.
In general, she recommended getting a college degree (not surprising, since she was talking at a college), considering graduate school, or finding some other way to get relevant experience in the field, and building up rare skills. "Find the one thing that you can do better than anybody else and market that thing. You can make that thing applicable to any career that you really want for the most part," she said. "Stop letting the physicists hog quantum; they've had a monopoly here for too long and that needs to change."
Similar concepts were voiced in a panel that followed. Anastasia Marchenkova, Quantum Researcher, Bleximo Corporation, said that there was lots of pop science, and lots of research, but not much in the middle. She said we need to teach people enough so they can use quantum computing, even if they aren't computer scientists.
Richard Plotka, Director of Information Technology and Web Science, RPI, said it was important to create middleware tools that can be applied to quantum so that the existing workforce can take advantage of these computers. He also said it was important to prepare students for a career in the future, with foundational knowledge, so they have the ability to adapt because quantum in five or ten years won't look like it does today.
All told, it was a fascinating day of speakers. I was intrigued by software developers explaining the challenge of writing languages, compilers, and libraries for quantum. One explained that you can't use traditional structures such as "if-then" because you won't know the "if." Parts of it were beyond my understanding, and I remain skeptical about how quickly quantum will become practical and how broad the applications may be.
Still, it's an important and interesting technology that is sure to get even more attention in the coming years, as researchers meet some of the challenges. It's good to see students getting a chance to try out the technology and discover what they can do with it.