Category Archives: Quantum Computing

Innovative Butterfly-Shaped Nanographene May Propel Quantum Computing Forward – yTech

Summary: A research team from the National University of Singapore, led by Associate Professor Lu Jiong, has created a new butterfly-shaped magnetic nanographene. This quantum material shows promise for use in quantum computing due to its unique structure hosting correlated spins, a critical aspect for qubits in quantum computers. Their ground-breaking work is pivotal in advancing technologies related to information processing and high-density storage.

Researchers at the National University of Singapore have charted a course towards the potential future of quantum computing with the synthesis of a novel, butterfly-shaped magnetic nanographene. Boasting intertwined magnetic properties, this quantum material breaks conventional barriers by combining elements of both ferromagnetism and antiferromagnetism within its design. The structure's innovative configuration comes from four symmetrical wing-like extensions centered around a rhombus, an arrangement that allows the magnetic behaviour of the spins to be controlled with greater precision than ever before.

Beyond being a mere scientific curiosity, the nanographene's 3-nanometer size conceals an intricate dance of π-electrons. These particles play a pivotal role in the magnetic characteristics of graphene and, when expertly arranged, could revolutionize the fundamental components of quantum computers, particularly the critical quantum bits, or qubits.

The remarkable breakthrough represents a collective triumph of interdisciplinary collaboration, bringing together chemists, material scientists, and physicists with notable contributions from international teams, including experts from the Czech Academy of Sciences. Published in Nature Chemistry, this development not only signifies a major step in quantum materials research but also lays a foundation for monumental shifts in how future technology processes and stores information.

The Quantum Computing Industry

Quantum computing represents a revolutionary leap forward from traditional computing, offering the potential to solve complex problems that are beyond the reach of classical computers. Unlike classical bits, which represent data as either 0s or 1s, quantum bits, or qubits, such as those potentially created from magnetic nanographene, can exist in multiple states simultaneously, thanks to the principles of quantum mechanics. This property, known as superposition, coupled with quantum entanglement, enables quantum computers to perform immense numbers of calculations at once.
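The superposition and entanglement properties described above can be illustrated with a few lines of linear algebra; this is a generic sketch of the standard qubit formalism, not tied to the nanographene work:

```python
import numpy as np

# A single qubit |psi> = a|0> + b|1> is a normalized 2-vector of amplitudes.
# Equal superposition: a measurement yields 0 or 1 with probability 1/2 each.
psi = np.array([1.0, 1.0]) / np.sqrt(2)

probs = np.abs(psi) ** 2                 # Born rule: outcome probabilities
assert np.isclose(probs.sum(), 1.0)      # amplitudes stay normalized

# Two entangled qubits (a Bell state): 4 amplitudes, and the outcomes of the
# two qubits are perfectly correlated — measuring one fixes the other.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
```

An n-qubit register needs 2**n amplitudes, which is why classical simulation becomes intractable as qubit counts grow.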

Market Forecasts

The market for quantum computing is expected to grow exponentially in the coming years. Advances like the newly developed magnetic nanographene signal the materials industry's maturation, which substantially impacts the quantum computing market. Market research reports forecast that the global quantum computing market could be worth billions of dollars by the late 2020s, with a compound annual growth rate (CAGR) that underscores the keen interest and substantial investments in the sector. Key players in the industry include multinational corporations such as IBM, Google, Microsoft, and a number of startups, all competing to achieve quantum supremacy.

Industry Challenges and Issues

However, the quantum computing industry faces several challenges. Maintaining coherence of quantum states, error correction, and developing user-friendly quantum programming languages are just a few of the hurdles researchers and engineers need to overcome. Furthermore, the development of materials that can effectively function as qubits, such as the magnetic nanographene created by the National University of Singapore, is crucial for the advancement of quantum computing technology.

The production and manipulation of such materials at an industrial scale are also complex and costly. Scale-up processes must maintain the properties that make these materials valuable for quantum computing, which is no small feat. Establishing reliable supply chains and manufacturing processes will be essential for the industry's growth.

For those interested in following the trajectory of this emerging field, keeping an eye on institutions like the National University of Singapore and collaborations with organizations such as the Czech Academy of Sciences will be crucial. To find out more about the evolving quantum computing industry and market trends, reputable sources like Nature Chemistry, the journal that published the research on butterfly-shaped magnetic nanographene, provide valuable insights and updates on the latest scientific advancements.

As technology evolves and quantum computing moves closer to practical application, groundbreaking materials like magnetic nanographene will play a pivotal role in shaping the future landscape of computing and information processing.

Roman Perkowski is a distinguished name in the field of space exploration technology, specifically known for his work on propulsion systems for interplanetary travel. His innovative research and designs have been crucial in advancing the efficiency and reliability of spacecraft engines. Perkowskis contributions are particularly significant in the development of sustainable and powerful propulsion methods, which are vital for long-duration space missions. His work not only pushes the boundaries of current space travel capabilities but also inspires future generations of scientists and engineers in the quest to explore the far reaches of our solar system and beyond.

Read more:
Innovative Butterfly-Shaped Nanographene May Propel Quantum Computing Forward - yTech

Oxford scientists achieve breakthrough in quantum computing security – Techerati


Originally posted here:
Oxford scientists achieve breakthrough in quantum computing security - Techerati

Google’s Sycamore and the Quantum Supremacy Milestone – yTech

Summary: Google's quantum computer, Sycamore, represents a significant breakthrough in computing, having demonstrated quantum supremacy by performing a calculation far beyond the capability of classical computers. This article explores the specifics of quantum computing technology, its current challenges, and potential future impacts, including energy sustainability and security implications.

Quantum computing is entering the spotlight as a powerful technology poised to outstrip traditional computing methods. Google's Sycamore quantum computer has catalyzed this movement by demonstrating quantum supremacy, completing a complex task in mere minutes versus the millennia it would take the best classical supercomputers.

Differing from traditional computers that process bits as zeros or ones, Sycamore operates using qubits. These qubits can exist in a state of superposition, where they can be in multiple states at once, dramatically increasing computational power and speed. Sycamore capitalized on this advantage with its 53 functioning qubits to make history.

While quantum computing is groundbreaking, it is not without its hurdles. Quantum machines are highly sensitive, requiring extremely cold environments for operation to prevent quantum decoherence, an event that disrupts the state necessary for quantum calculations. Moreover, maintaining low error rates in quantum gate operations is crucial to preserve accurate results.

The promises of quantum computing extend to energy efficiency since these machines consume drastically less power than their classical counterparts. Only a small fraction of energy is needed for the calculations themselves, with the rest dedicated to maintaining the conditions necessary for the qubits to function.

The roadmap ahead for quantum computing is filled with both opportunities and challenges. Immediate benefits may be seen in fields like material science and complex simulations, but longer-term considerations must center around cybersecurity, ethical use, and international regulations that foster safe and beneficial advancement of quantum technology. Google's Sycamore is therefore not just a stride in computational capability but also a step into a future that demands careful management of powerful new technology.

Quantum Computing's Industry and Market Forecast

Quantum computing is rapidly transforming from a theoretical concept to a market of vast potential. By leveraging the principles of quantum mechanics, this technology is poised to revolutionize industries that depend on computational power. Industries such as cryptography, pharmaceuticals, financial services, and materials science are eagerly awaiting the advancements that quantum computers promise, especially in the realms of drug discovery, financial modeling, and optimizing complex systems.

The market for quantum computing is on an upward trajectory, with significant investments from both public and private sectors. Market research forecasts project that the quantum computing market could be worth billions of dollars in the next decade as technology matures and becomes commercially viable. The applications for quantum computing are extensive, with potential to disrupt almost every industry by enabling them to solve complex problems much more efficiently than classical computers.

Key Challenges and Issues

Despite the optimism, quantum computing faces substantial challenges. As indicated by the article, quantum computers operate under delicate conditions that are challenging to maintain. The susceptibility to quantum decoherence and the need for error correction mechanisms make scalability and reliability immediate concerns for the industry.

On top of technical challenges, there are also significant issues regarding data security. Quantum computers hold the power to break many of the current encryption methods that protect essential communications globally, including in the realms of government and finance. This has led to an increased focus on developing quantum-resistant encryption methods, a pursuit that is now just as crucial as the development of quantum computers themselves.

Additionally, the ethical implications of quantum computing and the consequences of such computational power require attention. The proliferation of quantum technology raises questions about the balance of power, potential weapons development, and the exclusivity of access to such resources.

As the industry evolves, so will the regulations and international policies aimed at governing the use of quantum technologies. It's imperative for the global community to establish a framework to ensure that advances benefit society as a whole and that security risks are mitigated.

For continuous updates and information regarding quantum computing, please visit the official website of Google or the IBM main domain, which are engaged in research and development in this cutting-edge field.

In conclusion, quantum computing promises a future of unparalleled computational potential. The industry is poised to navigate a complex landscape of opportunities and challenges, with market forecasts indicating significant growth and the potential for transformative impacts across a myriad of sectors. Google's Sycamore serves as both a beacon of possibility and a reminder of the responsibilities inherent in ushering in such a profound technological evolution.


Excerpt from:
Google's Sycamore and the Quantum Supremacy Milestone - yTech

Breaking the Limits: Overcoming Heisenberg’s Uncertainty in Quantum Measurements – SciTechDaily

An artistic illustration shows how microscopic bolometers (depicted on the right) can be used to sense very weak radiation emitted from qubits (depicted on the left). Credit: Aleksandr Käkinen/Aalto University

Aalto University researchers are the first in the world to measure qubits with ultrasensitive thermal detectors, thus evading the Heisenberg uncertainty principle.

Chasing ever-higher qubit counts in near-term quantum computers constantly demands new feats of engineering.

Among the troublesome hurdles of this scaling-up race is refining how qubits are measured. Devices called parametric amplifiers are traditionally used to do these measurements. But as the name suggests, the device amplifies weak signals picked up from the qubits to conduct the readout, which causes unwanted noise and can lead to decoherence of the qubits if not protected by additional large components. More importantly, the bulky size of the amplification chain becomes technically challenging to work around as qubit counts increase in size-limited refrigerators.

Cue the Aalto University research group Quantum Computing and Devices (QCD). They have a hefty track record of showing how thermal bolometers can be used as ultrasensitive detectors, and they just demonstrated in an April 10 Nature Electronics paper that bolometer measurements can be accurate enough for single-shot qubit readout.

To the chagrin of many physicists, the Heisenberg uncertainty principle dictates that one cannot simultaneously know a signal's position and momentum, or voltage and current, with perfect accuracy. So it goes with qubit measurements conducted with parametric voltage-current amplifiers. But bolometric energy-sensing is a fundamentally different kind of measurement, serving as a means of evading Heisenberg's infamous rule. Since a bolometer measures power, or photon number, it is not bound to add quantum noise stemming from the Heisenberg uncertainty principle in the way that parametric amplifiers are.
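The distinction drawn here can be made precise with a textbook result that the article does not spell out (this is standard quantum-limit material, not derived from the paper): conjugate quadratures of a microwave signal obey an uncertainty relation, and any phase-preserving linear amplifier must therefore add noise (Caves' bound), whereas a power detector reads out only a single observable:

```latex
[\hat{X}, \hat{P}] = \tfrac{i}{2}
\quad\Longrightarrow\quad
\Delta X \,\Delta P \ge \tfrac{1}{4},
\qquad
A_{\mathrm{added}} \;\ge\; \tfrac{1}{2}\left|\,1 - \tfrac{1}{G}\,\right|
\;\xrightarrow{\;G \gg 1\;}\; \tfrac{1}{2}\ \text{photon},
```

where $G$ is the amplifier's power gain. A bolometer measures photon number rather than both quadratures at once, so this added-noise floor simply does not apply to it.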

Unlike amplifiers, bolometers very subtly sense microwave photons emitted from the qubit via a minimally invasive detection interface. This form factor is roughly 100 times smaller than its amplifier counterpart, making it extremely attractive as a measurement device.

"When thinking of a quantum-supreme future, it is easy to imagine high qubit counts in the thousands or even millions could be commonplace. A careful evaluation of the footprint of each component is absolutely necessary for this massive scale-up. We have shown in the Nature Electronics paper that our nanobolometers could seriously be considered as an alternative to conventional amplifiers," says Aalto University Professor Mikko Möttönen, who heads the QCD research group.

"In our very first experiments, we found these bolometers accurate enough for single-shot readout, free of added quantum noise, and they consume 10,000 times less power than the typical amplifiers, all in a tiny bolometer, the temperature-sensitive part of which can fit inside a single bacterium," Möttönen explains.

Single-shot fidelity is an important metric physicists use to determine how accurately a device can detect a qubit's state in just one measurement, as opposed to an average of multiple measurements. In the case of the QCD group's experiments, they were able to obtain a single-shot fidelity of 61.8% with a readout duration of roughly 14 microseconds. When correcting for the qubit's energy relaxation time, the fidelity jumps to 92.7%.
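To make the idea of single-shot (assignment) fidelity concrete, here is a toy readout model: prepare the qubit in a known state, record one noisy detector value, and classify it against a threshold. All numbers here are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # repetitions per prepared state

# Toy model: the detector output for each qubit state is Gaussian noise
# around a state-dependent mean; readout thresholds at the midpoint.
signal_0 = rng.normal(0.0, 1.0, n)   # qubit prepared in |0>
signal_1 = rng.normal(2.0, 1.0, n)   # qubit prepared in |1>
threshold = 1.0

# Fraction of shots assigned to the correct state, averaged over both states.
fidelity = ((signal_0 < threshold).mean() + (signal_1 >= threshold).mean()) / 2
```

With two-sigma separation between the state signals, this toy model lands around 84% fidelity; real readout chains improve the number by increasing the signal separation or lowering the noise, which is exactly what faster, quieter detectors buy you.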

"With minor modifications, we could expect to see bolometers approaching the desired 99.9% single-shot fidelity in 200 nanoseconds. For example, we can swap the bolometer material from metal to graphene, which has a lower heat capacity and can detect very small changes in its energy quickly. And by removing other unnecessary components between the bolometer and the chip itself, we can not only make even greater improvements on the readout fidelity, but we can achieve a smaller and simpler measurement device that makes scaling up to higher qubit counts more feasible," says András Gunyhó, the first author on the paper and a doctoral researcher in the QCD group.

Prior to demonstrating the high single-shot readout fidelity of bolometers in their most recent paper, the QCD research group first showed that bolometers can be used for ultrasensitive, real-time microwave measurements in 2019. They then published in 2020 a paper in Nature showing how bolometers made of graphene can shorten readout times to well below a microsecond.

Reference: "Single-shot readout of a superconducting qubit using a thermal detector" by András M. Gunyhó, Suman Kundu, Jian Ma, Wei Liu, Sakari Niemelä, Giacomo Catto, Vasilii Vadimov, Visa Vesterinen, Priyank Singh, Qiming Chen and Mikko Möttönen, 10 April 2024, Nature Electronics. DOI: 10.1038/s41928-024-01147-7

The work was carried out in the Research Council of Finland Centre of Excellence for Quantum Technology (QTF) using OtaNano research infrastructure in collaboration with VTT Technical Research Centre of Finland and IQM Quantum Computers. It was primarily funded by the European Research Council Advanced Grant ConceptQ and the Future Makers Program of the Jane and Aatos Erkko Foundation and the Technology Industries of Finland Centennial Foundation.

View original post here:
Breaking the Limits: Overcoming Heisenberg's Uncertainty in Quantum Measurements - SciTechDaily

Is Nvidia Also the Best Bet in Quantum Computing Right Now? – sharewise

In a discussion with analysts and media members following his keynote address at GTC 2024 (a leading conference on artificial intelligence), Nvidia (NASDAQ: NVDA) CEO Jensen Huang outlined a list of top technologies his company is at the forefront of developing: industrial robotics, 6G (move over, 5G), weather prediction, particle physics analysis, biology, and so on.

Among those massive developments Huang mentioned was quantum computing.

In fact, as researchers and developers race to make a breakthrough, Nvidia stock might be the best bet on quantum computers right now. Here's what you need to know.

Continue reading

Source Fool.com

Go here to see the original:
Is Nvidia Also the Best Bet in Quantum Computing Right Now? - sharewise

Next-Generation Quantum Computing Secured by Oxford Innovation – yTech

Research scientists at Oxford University have made significant strides in securing the future of quantum computing, focusing on how users can safely leverage this profound technology through cloud services using fiber optics. Their cutting-edge approach may enable individuals to perform quantum computing tasks without compromising their data's privacy and integrity.

**Summary:** Oxford University has broken new ground in quantum computing by developing a way to ensure data privacy and security when using quantum computers through cloud services. This innovation represents a key milestone in addressing security concerns and making quantum computing accessible to individual users.

The flagship breakthrough, reported in the journal Physical Review Letters, involves a strategy termed blind quantum computing, which establishes a secure link between personal devices and quantum servers. The team, led by Professor David Lucas, has showcased a model that not only preserves data privacy but also enables users to verify computational accuracy, a critical contribution to the growing field of quantum computation.

Quantum computing stands at the forefront of a technological revolution, with industry titans pouring resources into this sphere in anticipation of its ability to tackle complexities unfathomable to classical supercomputers. Spearheading these advances, Oxford's research introduces an innovative photonic detection system, assuring secure data interaction across networks.

This progress charts possible new territories for telecommunications providers, as these breakthroughs necessitate robust infrastructure to accommodate quantum networks. Of pivotal importance, these advancements push the boundaries of data processing and may profoundly affect sectors reliant on computational power.

The article illustrates a burgeoning opportunity that quantum computing represents, pinning down the crucial element of security that will define the technology's practical application. This comes at a critical juncture where quantum computing's rapid expansion underscores the need for enhanced data protection measures and infrastructural expansions to realize its full potential.

Quantum Computing Industry Outlook: The quantum computing industry is poised for significant growth, with some market forecasts projecting that it will reach billions of dollars in value within the next decade. Leading corporations in tech and defense, such as IBM, Google, Microsoft, and numerous startups, are competing to achieve quantum supremacy and to bring practical quantum computing solutions to the market. This technology promises to revolutionize industries ranging from pharmaceuticals, where it could expedite drug discovery, to finance, providing powerful tools for risk analysis and optimization.

Market Forecasts: According to research firms, the global quantum computing market is expected to witness a compound annual growth rate (CAGR) that might exceed 20-30% in the upcoming years. Factors such as increased investment in quantum computing ventures, partnerships between academic institutions and the private sector, and government funding in quantum technologies are fueling this growth. As quantum computing's potential applications expand and as the technology becomes more accessible, even more sectors could begin to explore its advantages.
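Because a CAGR compounds multiplicatively, even rates in the 20-30% range imply a severalfold expansion within a decade. A minimal sketch of the arithmetic (the starting value and rate below are illustrative, not from any cited forecast):

```python
def project(value: float, cagr: float, years: int) -> float:
    """Compound a starting market value forward at a fixed annual growth rate."""
    return value * (1 + cagr) ** years

# e.g. a $1B market growing at 25% per year for a decade
size = project(1.0, 0.25, 10)   # roughly $9.3B — more than a ninefold increase
```

Conversely, given start and end values, the implied rate is `(end / start) ** (1 / years) - 1`.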

Industry Issues: However, the quantum computing industry faces several issues, notably in the aspects of security and technology standardization. The very principles that make quantum computing powerful also pose risks, such as the potential to break current encryption standards. As a result, there is a parallel pursuit for quantum-safe cryptography. Additionally, the industry is confronted with the technical challenge of achieving and maintaining quantum coherence to realize practical and reliable quantum machines.

Infrastructure and Telecommunications: The development of quantum networks for secure communication is vital. Quantum Key Distribution (QKD) and other quantum-safe communication methodologies are likely to become industry standards. Telecommunications providers will play a pivotal role in creating the necessary infrastructure. These providers need to be ready to make considerable investments to future-proof their networks against the forthcoming surge in quantum data traffic.

Research such as the work conducted by the scientists at Oxford University is essential to address these issues. Their innovation in blind quantum computing provides a new layer of security in cloud-accessible quantum systems, possibly paving the way for consumer-level quantum computing services. As this field develops, telecommunications companies and service providers would likely seek to integrate these advancements into their offerings to gain a competitive edge.

For additional information about quantum computing and the industry's developments, check out the following related resources:

IBM Quantum, Google Quantum AI, Microsoft Quantum

These links represent some of the industry leaders who are actively investing in and developing quantum computing technologies that are shaping the future of the sector.

Michał Rogucki is a pioneering figure in the field of renewable energy, particularly known for his work on solar power innovations. His research and development efforts have significantly advanced solar panel efficiency and sustainability. Rogucki's commitment to green energy solutions is also evident in his advocacy for integrating renewable sources into national power grids. His groundbreaking work not only contributes to the scientific community but also plays a crucial role in promoting environmental sustainability and energy independence. Rogucki's influence extends beyond academia, impacting industry practices and public policy regarding renewable energy.

Read more from the original source:
Next-Generation Quantum Computing Secured by Oxford Innovation - yTech

Advancement in Quantum Computing Security through Oxford University’s Breakthrough – yTech

Researchers at Oxford University have developed a new system that could transform the security of quantum computing when performed via cloud services. By employing a technique called blind quantum computing, the team has successfully enabled computations to be done on remote servers without compromising the privacy of the data involved. This method, which utilizes fiber optics for secure communication, is poised to usher in an era where quantum computations can be safely conducted from any distance.

In an age where digital privacy is under scrutiny, the advancement by Oxford scientists represents a notable step forward. The protocol not only protects user data but also verifies the accuracy of the quantum computations being performed remotely. This means that individuals and organizations could utilize quantum computings exceptional capabilities without risking the exposure of sensitive information.

Quantum computing presents a revolutionary opportunity across various sectors, from cryptography and pharmaceuticals to artificial intelligence and finance. By leveraging quantum mechanics, it promises to execute certain calculations vastly faster than any traditional computing system.

However, as the technology nears a more practical application, issues such as data security and the need for quantum-resistant encryption are becoming more pressing. The pioneering work by the Oxford University team is a response to such concerns and is crucial for the industry to overcome security challenges associated with quantum networks.

With government investments and academic-industry collaborations fueling its growth, quantum computing is projected to experience a significant market expansion, maintaining a CAGR of over 20% in the ensuing years. Nonetheless, the sector continues to contend with obstacles like error correction, hardware stability, scaling quantum systems, and establishing universal standards. The community anticipates that with the maturation of technologies, regulations will guide the ethical and secure deployment of quantum computing.

For those interested in this cutting-edge field, resources such as IBM's research division and market analytics from Bloomberg offer comprehensive insights. Scholarly contributions, for instance in Nature journals, provide a wealth of information on the dynamic progress and implications of quantum advancements. As investments and innovations persist, quantum computing will likely redefine the horizon of digital computation and fortify data security in unprecedented ways.

Quantum computing is a topic enveloped in both excitement and complexity, as researchers and industry specialists explore its vast potential to transform computing as we know it. With its ability to perform complex calculations at speeds unmatchable by classical computers, this technology holds the key to monumental strides in myriad industries.

The implications of quantum computing ripple across numerous fields, standing to reimagine the landscape of data encryption, drug discovery, financial modeling, and weather forecasting, among others. The technology operates using the principles of quantum mechanics, such as superposition and entanglement, to process information in ways that traditional computers cannot.

However, this leap in computational capabilities brings to the forefront various challenges, including those of data privacy and cybersecurity. The advancements at Oxford University mark a significant response to these hurdles, offering enhanced protection for data during quantum computations, particularly in scenarios where such operations must be performed remotely.

As the quantum computing industry readies itself for wider adoption, the market forecasts are highly optimistic. Estimates suggest that the quantum computing market could reach billions of dollars in the coming decade, with both public and private entities increasing their stakes in this high-potential space.

Yet, this forecasted growth does not obscure the ongoing challenges facing the industry. The precision demanded by quantum computing means that developers need to solve problems related to quantum error correction, coherence times, and the development of fault-tolerant systems. Furthermore, scaling up quantum computers to be commercially viable is an intricate process that's still under intensive research.

The field of quantum computing is underpinned by significant investments from government bodies and the private sector. Companies like IBM are spearheading research, and market analytics available from entities like Bloomberg can offer well-defined projections and analyses.

Likewise, academic publishing in well-respected journals such as those found within the Nature portfolio continues to supply vital findings and discussions on quantum developments. These resources, together with the breakthroughs from institutions like Oxford University, indicate a future wherein quantum computing is not only viable but also secureholding vast potential to reshape industries and protect data in ways previously unimagined.

Leokadia Gogulska is an emerging figure in the field of environmental technology, known for her groundbreaking work in developing sustainable urban infrastructure solutions. Her research focuses on integrating green technologies in urban planning, aiming to reduce environmental impact while enhancing livability in cities. Gogulska's innovative approaches to renewable energy usage, waste management, and eco-friendly transportation systems have garnered attention for their practicality and effectiveness. Her contributions are increasingly influential in shaping policies and practices towards more sustainable and resilient urban environments.

Continue reading here:
Advancement in Quantum Computing Security through Oxford University's Breakthrough - yTech

Top Academics: Here’s How We Facilitate the Next Big Leap in Quantum Computing – PCMag AU

Table of Contents:
From Quantum Physics to Quantum Computing
Grand Challenges and Error Correction
The Road to Quantum Advantage
Education and Workforce Development
The Quantum Bottom Line

In advance of the ribbon-cutting for its new IBM Quantum System One computer, the first on a college campus, Rensselaer Polytechnic Institute (RPI) last week hosted a quantum computing day which featured several prominent speakers who together provided a snapshot of where the field is now. I've been writing about quantum computing for a long time, and have noted some big improvements, but there is also a host of challenges that still need to be overcome.

Here are some highlights.

The first plenary speaker was Jay M. Gambetta, Vice President of Quantum Computing at IBM, who gave an overview of the history and progress of quantum computing, as well as the challenges and opportunities ahead. He explained that quantum computing is based on exploiting the quantum mechanical properties of qubits, such as superposition and entanglement, to perform computations that are impossible or intractable for classical computers. He talked about watching the development of superconducting qubits, as they moved from single qubit systems in 2007, to 3-qubit systems in 2011, and now with IBM's Eagle chip, which has 127 qubits and is the heart of the Quantum System One.

He then asked how we could make quantum computing useful. His answer: We need to keep building larger and larger systems and we need to improve error correction.

"There are very strong reasons to believe there are problems that are going to be easy for a quantum computer but hard for a classical computer, and this is why we're all excited," Gambetta said. He discussed the development of quantum circuits and noted that while the number of qubits is important, equally important is the "depth": how many operations you can do and the accuracy of the results. Key to solving this are larger and larger systems, along with error mitigation, a topic that would be discussed in much greater detail later in the day.

To get to "quantum utility", which he said would be reached when a quantum computer is better than a brute-force simulation of a quantum computer on a classical machine, you would need larger systems with at least 1000 gates, along with improved accuracy and depth, and new efficient algorithms.
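The "brute force simulation" yardstick is easy to quantify: an n-qubit state needs 2^n complex amplitudes, so memory alone rules out exact classical simulation well before 100 qubits. A back-of-the-envelope sketch, assuming 16 bytes per complex amplitude:

```python
# Memory for a full n-qubit statevector at complex128 (16 bytes per amplitude).
# The exponential blow-up is why brute-force classical simulation of a quantum
# computer runs out of memory somewhere around 45-50 qubits.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n:>2} qubits -> {statevector_bytes(n):,} bytes")
# 30 qubits already needs 16 GiB; 50 qubits needs roughly 16 PiB.
```

Clever tensor-network and stabilizer methods push past this naive bound for some circuits, which is why the utility threshold depends on circuit depth and structure, not qubit count alone.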

He talked about quantum algorithmic discovery, which means finding new and efficient ways to map problems to quantum circuits. One example is Shor's algorithm and its newer variations, which allow factorization far faster than is possible on a classical computer. "The future of running error-mitigated circuits and mixing classical and quantum circuits sets us up to explore this space," he said.
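As a rough illustration of what sits behind Shor-style factoring, here is a classical toy in Python: it factors N by finding the period of a^x mod N by brute force. This is not the quantum algorithm itself (a quantum computer finds the period exponentially faster via the quantum Fourier transform), just the number-theoretic core.

```python
from math import gcd

# Classical toy version of the core of Shor's algorithm: factor N by finding
# the period r of f(x) = a^x mod N, then taking gcds with a^(r/2) +/- 1.
def find_period(a: int, N: int) -> int:
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N: int, a: int):
    assert gcd(a, N) == 1, "a must be coprime to N"
    r = find_period(a, N)
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
    return None  # unlucky choice of a; pick another

print(factor_via_period(15, 7))  # factors 15 into (3, 5)
```

The classical period search takes time exponential in the number of digits of N, which is exactly the step the quantum circuit replaces.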

In a panel discussion that followed, James Misewich from Brookhaven National Laboratory discussed his interest in using quantum computing to understand quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. QCD is a hard problem that maps well onto growing qubit counts and circuit depths, and he is looking at entanglement between jets coming out of particle collisions as a possible avenue for exploring quantum advantage.

Jian Shi and Ravishankar Sundararaman from RPI's Materials Science and Engineering faculty talked about computational materials science, and applying quantum computing to discover new materials and properties. Shi noted there was a huge community now doing quantum chemistry, but there is a gap between that and quantum computing. He stressed that a partnership between the two groups will be important, so each learns the language of the other and can approach the problems from a different perspective.

One of the most interesting talks was given by Steve M. Girvin, Eugene Higgins Professor of Physics at Yale University, who discussed the challenges of creating an error-corrected quantum computer.

Girvin described how the first quantum revolution produced things like the transistor, the laser, and the atomic clock, while the second quantum revolution is based on a new understanding of how quantum mechanics works. He said he usually tells his students that they do the things Einstein said were impossible, just to make sure they have a quantum computer and not a classical computer.

He thought there was a bit too much hype around quantum computing today: "Quantum is going to be revolutionary and do absolutely amazing things, but it's not its time yet. We still have massive problems to solve."

He noted that quantum systems are extremely sensitive to external perturbations and noise, which is great for making sensors but bad for building computers. Therefore, error correction is important.

Among the issues Girvin discussed was making measurements to detect errors; he said we also need calculations to decide whether something truly is an error, where it is located, and what kind of error it is. Then there is the issue of deciding what signals to send to correct those errors. Beyond that, there is the issue of putting these together in a system that reduces overall errors, perhaps borrowing from the flow-control techniques used in things like telephony.

In addition to quantum error detection, Girvin said there are "grand challenges all up and down the stack," from materials to measurement to machine models and algorithms. We need to know how to make each layer of the stack more efficient, using less energy and fewer qubits, and get to higher performance so people can use these to solve science problems or economically interesting problems.

Then there are the algorithms. Girvin noted that there were algorithms long before there were computers, but it took time to settle on the best ones for classical computing. For quantum computing, this is just the beginning, and over time, we need people to figure out how to build up their algorithms and how to do heuristics. They need to work out why quantum computers are so hard to program and develop clever tools to solve these problems.

Another challenge he described was routing quantum information. He noted that having two quantum computers that can communicate classically is exponentially less good than having two quantum computers that can communicate with quantum information, entangling with each other.

He talked about fault tolerance, the ability to correct errors even when your error-correction circuit itself makes errors. He believes the fact that this is possible in a quantum system, at least in principle, is even more amazing than the fact that a perfect quantum computer could do interesting quantum calculations.

Girvin described the difficulty of correcting errors: you have an unknown quantum state, and you're not allowed to know what it is, because it comes from the middle of a quantum computation. (If you knew what it was, you would have destroyed the superposition, and if you measure it to check for an error, it will randomly change due to state collapse.) Your job is that if it develops an error, you fix it anyway.

"That's pretty hard, but miraculously it can be done in principle, and it's even been done in practice," he said. We're just entering the era of being able to do it. The basic idea is to build in redundancy, such as a logical qubit made up of multiple physical qubits, perhaps nine. You then have two possible giant entangled states corresponding to a logical zero and a logical one. Note that the one and the zero don't live in any single physical qubit; each exists only as a superposition across multiple physical qubits.
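The nine-qubit code Girvin alludes to is likely the Shor code, and its logic is easiest to see in the classical bit-flip repetition code that underlies it. The sketch below (illustrative Python, not a quantum simulation) shows how parity checks locate a single flipped copy without ever reading the encoded value, which is the property that lets a quantum code fix errors without collapsing the superposition:

```python
import random

# Toy three-bit repetition code: the classical skeleton of quantum codes like
# the nine-qubit Shor code. Parity checks ("syndromes") reveal *where* the
# copies disagree, never *what* the encoded value is.
def encode(bit: int) -> list:
    return [bit, bit, bit]

def syndrome(block: list) -> tuple:
    # Pairwise parities: (0,0) means no single-bit error detected.
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block: list) -> list:
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
    if flip is not None:
        block[flip] ^= 1  # undo the single bit-flip the syndrome located
    return block

block = encode(1)
block[random.randrange(3)] ^= 1      # environment flips one copy at random
assert correct(block) == [1, 1, 1]   # recovered, whichever copy was hit
```

A real quantum code must additionally handle phase-flip errors and measure the syndromes with ancilla qubits, but the decoding step, inferring the error location from parities alone, works on the same principle.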

In that case, Girvin says, if the environment reaches in and measures one of those qubits, it doesn't actually learn the logical state. There's an error, but because the environment doesn't know the state, there's still a chance you haven't totally collapsed anything and lost the information.

He then discussed measuring the probability of errors and seeing whether it exceeds some threshold value, with some complex math. Then comes correcting the errors, hopefully quickly, something that should improve with new error-correction methods and better, more precise physical qubits.

All this is still theoretical. That's why fault tolerance is a journey with improvements being made continuously. (This was in opposition to Gambetta, who said systems are either fault tolerant or they aren't). Overall, Girvin said, "We still have a long way to go, but we're moving in the right direction."

Later in the morning, Austin Minnich, Professor of Mechanical Engineering and Applied Physics at Caltech, described "mid-circuit measurement" and the need for hybrid circuits as a way of finding, and thus mitigating, errors.

In a discussion that followed, Kerstin Kleese van Dam, Director of the Computational Science Initiative at Brookhaven National Laboratory, explained that her team was looking for answers to problems, whether solved on traditional or quantum machines. She said there were problems they can't solve accurately on a traditional computer, but there remains the question of whether the accuracy will matter. There are areas, such as machine learning, where quantum computers can do things accurately. She predicts that quantum advantage will come when we have systems that are large enough. But she also raised the issue of energy consumption, noting that a lot of power is going into today's AI models, and asked whether quantum could be more efficient.

Shekhar Garde, Dean of the School of Engineering, RPI, who moderated this part of the discussion, compared the status of quantum computing today to where traditional computing was in the late 70s or early 80s. He asked what the next 10 years would bring.

Kleese van Dam said that within 10 years, we would see hybrid systems that combine quantum and classical computing, but she also hoped we would see libraries transferred from high-performance computing to quantum systems, so a programmer could use them without having to understand how the gates work. Aparna Gupta, Professor and Associate Dean of RPI's Lally School of Management, would bet on the hybrid approach offering easier access and cost-effectiveness, as well as "taking away the intrigue and the spooky aspects of quantum, so it is becoming real for all of us."

Antonio Corcoles, Principal Research Scientist, IBM Quantum, said he hoped users who don't know quantum will be able to use the system because the complexity will become more transparent, but that can take a long time. In between, they can develop quantum error correction in a way that is not as disruptive as current methods. Minnich talked about "blind quantum computing" where many smaller machines might be linked together.

One of the most interesting talks came from Lin Lin, Professor of Mathematics at the University of California, Berkeley, who discussed the theoretical aspects and challenges of achieving quantum advantage for scientific computation. He defined quantum advantage as the ability to solve problems that are quantumly easy but classically hard, and proposed a hierarchy of four levels of problems.

Lin said that for the first two levels, a lot of people think quantum advantage will be achieved, as the methods are generally understood. But on the next two levels, there needs to be a lot of work on the algorithms to see if it will work. That's why this is an exciting time for mathematicians as well as physicists, chemists, and computer scientists.

This talk was followed by a panel during which Lin said that he is interested in solving quantum many-body problems, as well as applying quantum computing to other areas of mathematics, such as numerical analysis and linear algebra.

Like Garde above, Lin compared where quantum is today to the past, going even further to say it's where classical computing was 60 or 70 years ago, when error correction was still a central concern. Quantum computing will need to be a very interdisciplinary field: it requires people who are very good at building the machines, but the machines will always produce errors, so it will also require both mathematical and engineering ways to correct them.

Ryan Sweke from IBM Research noted that one of the things that has allowed classical computing to develop to the point it is at is the various levels of abstraction, so if you want to work on developing algorithms, you don't have to understand how the compiler works. If you want to understand how the compiler works, you don't have to understand how the hardware works.

The interesting thing in the quantum regime, as seen in error mitigation for example, is that people who come out of the top level of abstraction have to interact with people who are developing the devices. This is an exciting aspect of the time we're in.

Di Fang, Assistant Professor of Mathematics, Duke University, said now was a "golden time for people who work on proving algorithms." She talked about the varying levels of complexity, and the need to see where new algorithms can solve theoretical problems, then look at the hardware and solve practical problems.

Brian McDermott, Principal R&D Engineer at the Naval Nuclear Laboratory, said he was looking at this in reverse, seeing what the problems are and then working backward toward the quantum hardware and software. His job involved matching applications of new and emerging computing architectures to the types of engineering problems that are important to the lab's mission for new nuclear propulsion.

The panelists discussed where quantum algorithms could have the most impact. McDermott talked about things like finite elements and computational fluid dynamics, going up to materials science. As a nuclear engineer, he was first attracted to the field by the quantum properties of the nucleus itself, moving to predicting behaviors in astrophysics and the synthesis of nuclei in a supernova, and then, with engineering, into nuclear reactors and things like fusion. Lin discussed the possibilities for studying molecular dynamics.

Olivia Lanes, Global Lead and Manager for IBM Quantum Learning and Education, gave the final talk of the day, discussing the need for workforce development in the quantum field.

Already the US is projected to face a shortfall of nearly two million STEM workers by next year. She quoted Carl Sagan, who said "We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology," and agreed with him that this is a recipe for disaster.

She noted that not only do very few people understand quantum computing, very few actually understand how classical computers work. She cited a McKinsey study which found that there are three open jobs in quantum for every person qualified to fill those positions. It's probably just going to get worse from here to 2026.

She focused on upskilling and said it was unrealistic to expect that we'll make everyone into experts in quantum computing. But there are a lot of other jobs in the quantum ecosystem that will be required, and she urged students to focus on the areas that particularly interest them.

In general, she recommended getting a college degree (not surprising, since she was talking at a college), considering graduate school, or finding some other way to get relevant experience in the field, and building up rare skills. "Find the one thing that you can do better than anybody else and market that thing. You can make that thing applicable to any career that you really want for the most part," she said. "Stop letting the physicists hog quantum; they've had a monopoly here for too long and that needs to change."

Similar concepts were voiced in a panel that followed. Anastasia Marchenkova, Quantum Researcher, Bleximo Corporation, said that there was lots of pop science, and lots of research, but not much in the middle. She said we need to teach people enough so they can use quantum computing, even if they aren't computer scientists.

Richard Plotka, Director of Information Technology and Web Science, RPI, said it was important to create middleware tools that can be applied to quantum so that the existing workforce can take advantage of these computers. He also said it was important to prepare students for a career in the future, with foundational knowledge, so they have the ability to adapt because quantum in five or ten years won't look like it does today.

All told, it was a fascinating day of speakers. I was intrigued by software developers explaining the challenges of writing languages, compilers, and libraries for quantum. One explained that you can't use traditional structures such as "if...then" because you won't know "if." Parts of it were beyond my understanding, and I remain skeptical about how quickly quantum will become practical and how broad the applications may be.

Still, it's an important and interesting technology that is sure to get even more attention in the coming years, as researchers meet some of the challenges. It's good to see students getting a chance to try out the technology and discover what they can do with it.

Read more here:
Top Academics: Here's How We Facilitate the Next Big Leap in Quantum Computing - PCMag AU

Japan Embarks on Quantum Computing Leap with NVIDIA-powered ABCI-Q Supercomputer – yTech

Summary: Japan aims to bolster its quantum computing prowess with the deployment of a new world-class supercomputer, ABCI-Q. To be situated at the ABCI supercomputing center, it will utilize NVIDIA's cutting-edge platforms specialized in accelerated and quantum computing to conduct high-fidelity quantum simulations aiding various industry sectors.

The realm of quantum computing is set to receive a substantial boost in Japan as the country's latest supercomputer, named ABCI-Q, draws upon NVIDIA technology to lay the groundwork for significant research advancements. This strategic move is poised to escalate Japanese capabilities in computational science and underlines the nation's commitment to its quantum computing initiative.

Powered by an array of over 2,000 NVIDIA H100 Tensor Core GPUs, the ABCI-Q machine will sport more than 500 nodes, all seamlessly connected through the NVIDIA Quantum-2 InfiniBand networking platform. NVIDIA CUDA-Q, an innovative open-source hybrid framework, will be a pivotal aspect of the system, enabling the programming of sophisticated quantum-classical systems and serving as a robust simulation toolkit.

The strategic collaboration between NVIDIA and the Global Research and Development center for Business by Quantum-AI Technology (G-QuAT) at the National Institute of Advanced Industrial Science and Technology (AIST) facilitated the creation of ABCI-Q. This supercomputer is not just an investment in hardware; it also represents a long-term vision for Japan's scientific community, supporting quantum machine learning, quantum circuit simulation, and the pioneering development of algorithms inspired by quantum principles.

Expected to launch in the coming year, ABCI-Q's role in accelerating Japan's quantum technology supports a national strategy aimed at deriving societal and economic benefits from quantum advancements. Furthermore, this initiative is set to propel industrial applications in conjunction with G-QuAT/AIST and NVIDIA, laying the foundation for future hybrid quantum-classical systems and breakthroughs in a variety of fields, including AI, energy, and biosciences.

Japan's Quantum Computing Ambitions with ABCI-Q

Japan's commitment to advancing its position in quantum computing is epitomized by the ABCI-Q supercomputer. This latest quantum initiative aligns with the broader industry's trajectory towards creating computers that leverage the principles of quantum mechanics to process information at speeds unachievable by classical computers.

Quantum Computing Industry and Market Forecasts

Quantum computing is not only a scientific endeavor but also a burgeoning industry with significant commercial potential. Global market forecasts suggest that quantum computing could evolve into a multi-billion-dollar industry over the next decade. According to market experts, this growth is likely to be driven by increasing investments in quantum technologies by nations and private entities, in sectors ranging from cryptography and optimization to drug discovery and material science.

The market expectations indicate that while we may still be in the early stages of quantum computing commercialization, the acquisition of quantum capabilities is set to be a strategic focus for countries and corporations aiming to secure a competitive edge in the high-tech landscape.

Industry and Research Applications

The ABCI-Q supercomputer's powerful NVIDIA H100 Tensor Core GPUs and Quantum-2 InfiniBand networking capabilities place it at the forefront of quantum simulation. Such simulations are crucial for advancing research into complex molecular interactions, material properties, and energy-efficient solutions, thereby aiding industries like pharmaceuticals, automotive, and clean energy.

Furthermore, the ability to perform quantum machine learning and circuit simulations is expected to open new avenues for artificial intelligence, allowing for the exploration of quantum-inspired algorithms that could dramatically enhance AI's problem-solving abilities.

Issues and Challenges

Despite the promise of quantum computing, the industry faces significant challenges. The technology is still in development, and practical, scalable quantum computers are yet to become mainstream. Issues such as error correction, qubit coherence, and the integration of quantum processors with classical systems need to be resolved before the full potential of quantum computing can be realized.

Additionally, there is a looming need for quantum-skilled workforce development. Education and training programs will be crucial to prepare the next generation of scientists and engineers, equipping them with the skills necessary to foster quantum innovation.

As quantum technology advances, there are implications for cybersecurity, as traditional encryption methods may become vulnerable to quantum attacks. This has led to a growing field of quantum cryptography, which aims to develop security protocols resistant to quantum computing threats.

Conclusion

Japan's investment in ABCI-Q exemplifies a strategic approach to harnessing quantum computing's potential. The collaboration with NVIDIA and the focus on building quantum-classical hybrid systems indicate that Japan is positioning itself as a global leader in the quantum race.

For further information on the state of the global quantum computing industry and market trends, respected sources can be consulted, such as IBM, a pioneer in the field of quantum computing, and NVIDIA, a leader in accelerated computing platforms. These resources offer insights into the latest developments, research, and forecasts for quantum computing's evolution, shaping the future across diverse sectors.

Roman Perkowski is a distinguished name in the field of space exploration technology, specifically known for his work on propulsion systems for interplanetary travel. His innovative research and designs have been crucial in advancing the efficiency and reliability of spacecraft engines. Perkowski's contributions are particularly significant in the development of sustainable and powerful propulsion methods, which are vital for long-duration space missions. His work not only pushes the boundaries of current space travel capabilities but also inspires future generations of scientists and engineers in the quest to explore the far reaches of our solar system and beyond.

Excerpt from:
Japan Embarks on Quantum Computing Leap with NVIDIA-powered ABCI-Q Supercomputer - yTech

Breakthrough promises secure quantum computing at home – University of Oxford

The full power of next-generation quantum computing could soon be harnessed by millions of individuals and companies, thanks to a breakthrough by scientists at Oxford University Physics guaranteeing security and privacy. This advance promises to unlock the transformative potential of cloud-based quantum computing and is detailed in a new study published in the influential U.S. scientific journal Physical Review Letters.

"Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence. As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect," the researchers said.

Quantum computing is developing rapidly, paving the way for new applications which could transform services in many areas like healthcare and financial services. It works in a fundamentally different way to conventional computing and is potentially far more powerful. However, it currently requires controlled conditions to remain stable and there are concerns around data authenticity and the effectiveness of current security and encryption systems.

Several leading providers of cloud-based services, like Google, Amazon, and IBM, already separately offer some elements of quantum computing. Safeguarding the privacy and security of customer data is a vital precursor to scaling up and expanding its use, and for the development of new applications as the technology advances. The new study by researchers at Oxford University Physics addresses these challenges.

"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.

In the new study, the researchers use an approach dubbed "blind quantum computing", which connects two totally separate quantum computing entities, potentially an individual at home or in an office accessing a cloud server, in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations.

"Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr Peter Drmota of Oxford University Physics.

The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.

Researchers exploring quantum computing and technologies at Oxford University Physics have access to the state-of-the-art Beecroft laboratory facility, specially constructed to create stable and secure conditions including eliminating vibration.

Funding for the research came from the UK Quantum Computing and Simulation (QCS) Hub, with scientists from the UK National Quantum Computing Centre, the Paris-Sorbonne University, the University of Edinburgh, and the University of Maryland, collaborating on the work.

The study, "Verifiable blind quantum computing with trapped ions and single photons", has been published in Physical Review Letters.

See the rest here:
Breakthrough promises secure quantum computing at home - University of Oxford