Category Archives: Quantum Computer

Verifying the Work of Quantum Computers – Caltech

Quantum computers of the future may ultimately outperform their classical counterparts to solve intractable problems in computer science, medicine, business, chemistry, physics, and other fields. But the machines are not there yet: They are riddled with inherent errors, which researchers are actively working to reduce. One way to study these errors is to use classical computers to simulate the quantum systems and verify their accuracy. The only catch is that as quantum machines become increasingly complex, running simulations of them on traditional computers would take years or longer.

Now, Caltech researchers have invented a new method by which classical computers can measure the error rates of quantum machines without having to fully simulate them. The team describes the method in a paper in the journal Nature.

"In a perfect world, we want to reduce these errors. That's the dream of our field," says Adam Shaw, lead author of the study and a graduate student who works in the laboratory of Manuel Endres, professor of physics at Caltech. "But in the meantime, we need to better understand the errors facing our system, so we can work to mitigate them. That motivated us to come up with a new approach for estimating the success of our system."

In the new study, the team performed experiments using a type of simple quantum computer known as a quantum simulator. Quantum simulators are more limited in scope than current rudimentary quantum computers and are tailored for specific tasks. The group's simulator is made up of individually controlled Rydberg atoms (atoms in highly excited states), which they manipulate using lasers.

One key feature of the simulator, and of all quantum computers, is entanglement, a phenomenon in which certain atoms become connected to each other without actually touching. When quantum computers work on a problem, entanglement is naturally built up in the system, invisibly connecting the atoms. Last year, Endres, Shaw, and colleagues revealed that as entanglement grows, those connections spread out in a chaotic or random fashion, meaning that small perturbations lead to big changes in the same way that a butterfly's flapping wings could theoretically affect global weather patterns.

This increasing complexity is believed to be what gives quantum computers the power to solve certain types of problems much faster than classical computers, such as those in cryptography in which large numbers must be quickly factored.

But once the machines reach a certain number of connected atoms, or qubits, they can no longer be simulated using classical computers. "When you get past 30 qubits, things get crazy," Shaw says. "The more qubits and entanglement you have, the more complex the calculations are."
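Some back-of-envelope arithmetic shows why roughly 30 qubits is a practical wall for exact simulation: a classical simulator must store one complex amplitude per basis state, and the number of basis states doubles with every added qubit. A minimal sketch, assuming double-precision complex amplitudes (16 bytes each):

```python
# Memory needed to store the full state vector of an n-qubit system:
# 2**n complex amplitudes, each 16 bytes (complex128).
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 60):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

At 30 qubits the state vector already occupies 16 GiB; at 60 qubits it would occupy roughly 16 exbibytes, far beyond any classical machine, which is why error rates at this scale cannot be measured by brute-force simulation.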

The quantum simulator in the new study has 60 qubits, which Shaw says puts it in a regime that is impossible to simulate exactly. "It becomes a catch-22. We want to study a regime that is hard for classical computers to work in, but still rely on those classical computers to tell if our quantum simulator is correct." To meet the challenge, Shaw and colleagues took a new approach, running classical computer simulations that allow for different amounts of entanglement. Shaw likens this to painting with brushes of different sizes.

"Let's say our quantum computer is painting the Mona Lisa as an analogy," he says. "The quantum computer can paint very efficiently and, in theory, perfectly, but it makes errors that smear out the paint in parts of the painting. It's like the quantum computer has shaky hands. To quantify these errors, we want our classical computer to simulate what the quantum computer has done, but our Mona Lisa would be too complex for it. It's as if the classical computers only have giant brushes or rollers and can't capture the finer details.

"Instead, we have many classical computers paint the same thing with progressively finer and finer brushes, and then we squint our eyes and estimate what it would have looked like if they were perfect. Then we use that to compare against the quantum computer and estimate its errors. With many cross-checks, we were able to show this squinting' is mathematically sound and gives the answer quite accurately."

The researchers estimated that their 60-qubit quantum simulator operates with an error rate of 91 percent (or an accuracy rate of 9 percent). That may sound low, but it is, in fact, relatively high for the state of the field. For reference, the 2019 Google experiment, in which the team claimed their quantum computer outperformed classical computers, had an accuracy of 0.3 percent (though it was a different type of system than the one in this study).
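Purely as a back-of-envelope estimate, assuming errors compound multiplicatively across the qubits, a 9 percent global accuracy on 60 qubits corresponds to a fairly high effective per-qubit fidelity:

```python
# Rough estimate: if global fidelity is the product of identical
# per-qubit fidelities f, then f = global_fidelity ** (1 / n_qubits).
global_fidelity = 0.09   # reported accuracy of the 60-qubit simulator
n_qubits = 60
per_qubit = global_fidelity ** (1 / n_qubits)
print(f"implied per-qubit fidelity: {per_qubit:.3f}")  # about 0.961
```

This is why a 9 percent global figure can still count as "relatively high for the state of the field": each individual qubit is behaving correctly about 96 percent of the time under this simplified model.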

Shaw says: "We now have a benchmark for analyzing the errors in quantum computing systems. That means that as we make improvements to the hardware, we can measure how well the improvements worked. Plus, with this new benchmark, we can also measure how much entanglement is involved in a quantum simulation, another metric of its success."

The Nature paper titled "Benchmarking highly entangled states on a 60-atom analog quantum simulator" was funded by the National Science Foundation (partially via Caltech's Institute for Quantum Information and Matter, or IQIM), the Defense Advanced Research Projects Agency (DARPA), the Army Research Office, the U.S. Department of Energy's Quantum Systems Accelerator, the Troesh postdoctoral fellowship, the German National Academy of Sciences Leopoldina, and Caltech's Walter Burke Institute for Theoretical Physics. Other Caltech authors include former postdocs Joonhee Choi and Pascal Scholl; Ran Finkelstein, Troesh Postdoctoral Scholar Research Associate in Physics; and Andreas Elben, Sherman Fairchild Postdoctoral Scholar Research Associate in Theoretical Physics. Zhuo Chen, Daniel Mark, and Soonwon Choi (BS '12) of MIT are also authors.

Read more:
Verifying the Work of Quantum Computers - Caltech

Securing the Future: The Quest for Quantum-Safe Encryption – yTech

As we venture further into the digital era, quantum computing emerges as a revolutionary technological leap, holding the promise of expediting a vast range of complex computations. However, it also presents a formidable challenge to cybersecurity. The headway of quantum computers threatens to decipher conventional encryption algorithms, potentially exposing sensitive data to cyber breaches. Cybersecurity thought leaders, including Devo's Chief Information Security Officer Kayla Williams, have openly discussed the susceptibility of current encryption methods, which could be compromised by the sheer power of quantum processing.

Efforts to stave off these looming cyber threats have led to a concerted drive to devise quantum-resistant encryption techniques. Tech behemoths such as IBM and Thales Group are not only propelling quantum research forward but are also pioneering cryptographic defenses to secure our digital infrastructure. The imperative to protect data in the quantum future underscores the need for a united front linking diverse sectors and industries.

Looking ahead, the urgency to cement quantum-safe cryptographic standards is evident. Organizations and governments alike must rally to bolster our cyber defenses in anticipation of the quantum leap. As conversations on this topic burgeon, resources from leading-edge companies and scholarship shine a light on the methodologies to fortify our data against the quantum computing tide, urging a proactive stance in this cybersecurity genesis.

Summary: The dawn of quantum computing stands to redefine technological capabilities, presenting significant cybersecurity risks. The transition to quantum-resistant encryption is an urgent global initiative led by tech leaders and cybersecurity experts. Protecting against the advanced computational power of quantum systems is paramount, sparking international discussions on required security standards and fostering innovation to secure our digital ecosystems.

Sources: [Source 1](https://www.example-domain.com) [Source 2](https://www.example-domain.com)

Note: Placeholder URLs are used in place of the original sources. Marcin Frąckiewicz's expertise in satellite communication and artificial intelligence is a testament to the caliber of analysis and insight required in addressing the complexities of quantum computing and its impact on cybersecurity.

Industry Overview and Market Forecasts

Quantum computing represents a major leap forward over classical computing by leveraging the principles of quantum mechanics to process information. Unlike traditional bits, which represent either a 0 or a 1, quantum bits, or qubits, can exist in multiple states simultaneously, enabling them to perform many calculations at once. This exponentially increases the speed and power of data processing, underscoring its potential applications in fields such as pharmaceuticals, finance, materials science, and logistics.

The quantum computing industry is currently in a nascent stage but is growing rapidly. According to market research, the global quantum computing market was valued at several billion dollars in the past few years and is expected to grow to tens of billions by the end of the decade, registering an impressive double-digit compound annual growth rate (CAGR).

Issues Related to Quantum Computing and Cybersecurity

The principal concern raised by the rise of quantum computing is its potential to break currently employed encryption algorithms. Most digital security today relies on encryption methods such as RSA and ECC, which are theoretically susceptible to being broken in trivial timeframes by quantum computers, jeopardizing everything from internet communications to banking systems.
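A toy illustration of the underlying asymmetry (not real cryptography; the modulus is the classic textbook RSA example 3233 = 53 × 61): classical factoring by trial division is trivial at toy sizes but scales superpolynomially with key size, whereas Shor's algorithm on a sufficiently large quantum computer would factor in polynomial time.

```python
# RSA security rests on the difficulty of factoring n = p * q.
# Trial division works instantly for toy moduli but is hopeless at the
# 2048-bit sizes used in practice; a quantum computer running Shor's
# algorithm would not face that exponential blow-up.
def factor(n: int) -> tuple[int, int]:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(factor(3233))  # (53, 61)
```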

In light of these threats, the industry is galvanized to transition to quantum-resistant encryption or post-quantum cryptography (PQC). PQC refers to cryptographic algorithms that are thought to be secure against an attack by a quantum computer. The National Institute of Standards and Technology (NIST) in the United States has been leading an initiative to standardize PQC, which is of critical importance to the future cybersecurity landscape.

One notable effort within the industry is the development of quantum key distribution (QKD), a method for secure communication that uses quantum states of particles to form a tamper-evident communication channel. This technology has already seen practical deployment, though it is not widely used due to high costs and infrastructure requirements.
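The key-agreement idea behind QKD can be sketched classically. The snippet below is a heavily simplified BB84-style sketch (no quantum channel, no eavesdropper detection, no error correction): sender and receiver each pick random measurement bases, and they keep only the bits where their basis choices happened to match.

```python
import random

# Simplified BB84 sketch. In the real protocol, mismatched-basis
# measurements give random outcomes, so both parties publicly compare
# bases and discard those positions; matching-basis bits form the key.
def bb84_key(n_bits: int, seed: int = 0) -> list[int]:
    rng = random.Random(seed)
    bits = [rng.randint(0, 1) for _ in range(n_bits)]       # sender's raw bits
    send_basis = [rng.choice("+x") for _ in range(n_bits)]  # sender's bases
    recv_basis = [rng.choice("+x") for _ in range(n_bits)]  # receiver's bases
    return [b for b, s, r in zip(bits, send_basis, recv_basis) if s == r]

key = bb84_key(16)
print(len(key), key)
```

On average about half the positions survive the basis comparison; any eavesdropper measuring in the wrong basis would disturb the quantum states and reveal herself, which is what makes the real protocol tamper-evident.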

Conclusion

As quantum computing continues to evolve, securing digital infrastructure against quantum threats has become a paramount priority. It requires a collaborative effort involving the private sector, academia, and government agencies to develop new standards and technologies to defend against the potential vulnerabilities introduced by quantum capabilities. Being proactive in this area is more than a security measure; it is a strategic imperative for national interests and global economic stability.

For up-to-date information on quantum computing and cybersecurity, thorough analysis and insights are paramount. Notable domains to follow for further insights include industry leaders like IBM and Thales Group. While numerous other companies and organizations are contributing to this field, these entities have been at the forefront of both quantum computing advancements and cybersecurity protections.

Jerzy Lewandowski, a visionary in the realm of virtual reality and augmented reality technologies, has made significant contributions to the field with his pioneering research and innovative designs. His work primarily focuses on enhancing user experience and interaction within virtual environments, pushing the boundaries of immersive technology. Lewandowski's groundbreaking projects have gained recognition for their ability to merge the digital and physical worlds, offering new possibilities in gaming, education, and professional training. His expertise and forward-thinking approach mark him as a key influencer in shaping the future of virtual and augmented reality applications.

More here:
Securing the Future: The Quest for Quantum-Safe Encryption - yTech

NVIDIA Amplifies Quantum Computing Ecosystem with New CUDA-Q Integrations and Partnerships at GTC – HPCwire

March 20, 2024. The latest advances in quantum computing include investigating molecules, deploying giant supercomputers and building the quantum workforce with a new academic program. Researchers in Canada and the U.S. used a large language model to simplify quantum simulations that help scientists explore molecules.

"This new quantum algorithm opens the avenue to a new way of combining quantum algorithms with machine learning," said Alan Aspuru-Guzik, a professor of chemistry and computer science at the University of Toronto, who led the team.

The effort used CUDA-Q, a hybrid programming model for GPUs, CPUs and the QPUs (quantum processing units) that quantum systems use. The team ran its research on Eos, NVIDIA's H100 GPU supercomputer. Software from the effort will be made available for researchers in fields like healthcare and chemistry. Aspuru-Guzik detailed the work in a talk at GTC.

Quantum Scales for Fraud Detection

At HSBC, one of the world's largest banks, researchers designed a quantum machine learning application that can detect fraud in digital payments. The bank's quantum machine learning algorithm simulated a whopping 165 qubits on NVIDIA GPUs. Research papers typically don't extend beyond 40 of these fundamental calculating units.

HSBC used machine learning techniques implemented with CUDA-Q and cuTensorNet software on NVIDIA GPUs to overcome challenges simulating quantum circuits at scale. Mekena Metcalf, a quantum computing research scientist at HSBC, will present her work in a session at GTC.

Raising a Quantum Generation

In education, NVIDIA is working with nearly two dozen universities to prepare the next generation of computer scientists for the quantum era. The collaboration will design curricula and teaching materials around CUDA-Q.

"Bridging the divide between traditional computers and quantum systems is essential to the future of computing," said Theresa Mayer, vice president for research at Carnegie Mellon University. "NVIDIA is partnering with institutions of higher education, Carnegie Mellon included, to help students and researchers navigate and excel in this emerging hybrid environment."

To help working developers get hands-on with the latest tools, NVIDIA co-sponsored QHack, a quantum hackathon in February. The winning project, developed by Gopesh Dahale of Qkrishi, a quantum company in Gurgaon, India, used CUDA-Q to develop an algorithm to simulate a material critical to designing better batteries.

A Trio of New Systems

Two new systems being deployed further expand the ecosystem for hybrid quantum-classical computing.

The larger of the two, ABCI-Q at Japan's National Institute of Advanced Industrial Science and Technology, will be one of the largest supercomputers dedicated to research in quantum computing. It will use CUDA-Q on NVIDIA H100 GPUs to advance the nation's efforts in the field.

In Denmark, the Novo Nordisk Foundation will lead on the deployment of an NVIDIA DGX SuperPOD, a significant part of which will be dedicated to research in quantum computing in alignment with the country's national plan to advance the technology.

The new systems join Australia's Pawsey Supercomputing Research Centre, which announced in February it will run CUDA-Q on NVIDIA Grace Hopper Superchips at its National Supercomputing and Quantum Computing Innovation Hub.

Partners Drive CUDA-Q Forward

In other news, Israeli startup Classiq released at GTC a new integration with CUDA-Q. Classiq's quantum circuit synthesis lets high-level functional models automatically generate optimized quantum programs, so researchers can get the most out of today's quantum hardware and expand the scale of their work on future algorithms.

Software and service provider QC Ware is integrating its Promethium quantum chemistry package with the just-announced NVIDIA Quantum Cloud.

ORCA Computing, a quantum systems developer headquartered in London, released results running quantum machine learning on its photonics processor with CUDA-Q. In addition, ORCA was selected to build and supply a quantum computing testbed for the UK's National Quantum Computing Centre, which will include an NVIDIA GPU cluster using CUDA-Q.

NVIDIA and Infleqtion, a quantum technology leader, partnered to bring cutting-edge quantum-enabled solutions to Europe's largest cyber-defense exercise with NVIDIA-enabled Superstaq software.

A cloud-based platform for quantum computing, qBraid, is integrating CUDA-Q into its developer environment. And California-based BlueQubit described in a blog how NVIDIA's quantum technology, used in its research and GPU service, provides the fastest and largest quantum emulations possible on GPUs.

Get the Big Picture at GTC

To learn more, watch a session about how NVIDIA is advancing quantum computing and attend an expert panel on the topic, both at NVIDIA GTC, a global AI conference, running March 18-21 at the San Jose Convention Center.

Source: Elica Kyoseva, Nvidia

Follow this link:
NVIDIA Amplifies Quantum Computing Ecosystem with New CUDA-Q Integrations and Partnerships at GTC - HPCwire

Quantum Computing Breakthrough: Scientists Develop New Photonic Approach That Works at Room Temperature – SciTechDaily

Quantum computing is advancing, with giants like Google and IBM providing services, yet challenges remain: qubits are still too few and too susceptible to external influences, requiring complex entanglement for reliable results. Photonic approaches offer room-temperature operation and faster speeds but face loss issues. A novel method demonstrated by researchers uses laser pulses to create inherently error-correcting logical qubits, simplifying quantum computing, though error tolerance still needs improvement.

Significant advancements have been made in quantum computing, with major international companies like Google and IBM now providing quantum computing services via the cloud. Nevertheless, quantum computers are not yet capable of addressing the problems that arise when conventional computers hit their performance ceilings. The primary limitation is that the number of available qubits, or quantum bits, the basic units of quantum information, is still insufficient.

One of the reasons for this is that bare qubits are not of immediate use for running a quantum algorithm. While the binary bits of conventional computers store information as fixed values of either 0 or 1, qubits can represent 0 and 1 simultaneously, with probabilities attached to each value. This is known as quantum superposition.
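The probabilities attached to a qubit's value can be made concrete: measurement outcomes occur with probabilities given by the squared magnitudes of the state's amplitudes. A small sketch sampling an equal superposition:

```python
import random
from collections import Counter

# Measuring the equal superposition (|0> + |1>) / sqrt(2) repeatedly:
# each outcome occurs with probability |amplitude|**2 = 0.5.
amp0 = amp1 = 2 ** -0.5
p0 = amp0 ** 2

rng = random.Random(42)
samples = [0 if rng.random() < p0 else 1 for _ in range(10_000)]
print(Counter(samples))  # roughly 5,000 of each outcome
```

Each individual measurement destroys the superposition and yields a definite 0 or 1; only the statistics over many runs reveal the underlying amplitudes.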

This makes them very susceptible to external influences, which means that the information they store can readily be lost. In order to ensure that quantum computers supply reliable results, it is necessary to generate a genuine entanglement to join together several physical qubits to form a logical qubit. Should one of these physical qubits fail, the other qubits will retain the information. However, one of the main difficulties preventing the development of functional quantum computers is the large number of physical qubits required.
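The redundancy idea behind a logical qubit has a simple classical analogue: a three-copy repetition code with majority vote (real quantum codes are subtler, since quantum states cannot simply be copied, but the intuition that surviving copies outvote a failed one carries over):

```python
from collections import Counter

# Toy classical analogue of a logical qubit: encode one bit into three
# physical copies; a single corrupted copy is outvoted on decoding.
def encode(bit: int) -> list[int]:
    return [bit] * 3

def decode(copies: list[int]) -> int:
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1            # one physical copy fails
print(decode(codeword))     # majority vote still recovers 1
```

The cost is exactly the difficulty the article describes: protecting one logical unit of information consumes several physical ones, and real quantum codes need far more than three.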

Many different concepts are being employed to make quantum computing viable. Large corporations currently rely on superconducting solid-state systems, for example, but these have the disadvantage that they only function at temperatures close to absolute zero. Photonic concepts, on the other hand, work at room temperature.

The creation of a photonic Schrödinger cat state (in other words, a quantum superposition of laser-pulse amplitude states that can be distinguished on a macroscopic scale: a white or a black cat) can only be achieved using the most advanced quantum optical techniques and has already been demonstrated to be possible. In the experiment that is the subject of the research paper, it proved feasible to extend this to three states (white, gray, and black cats). This light state thus approaches a logical quantum state in which errors can, in principle, be universally corrected. Credit: Peter van Loock

Single photons usually serve as physical qubits here. These photons, which are, in a sense, tiny particles of light, inherently operate more rapidly than solid-state qubits but, at the same time, are more easily lost. To avoid qubit losses and other errors, it is necessary to couple several single-photon light pulses together to construct a logical qubit as in the case of the superconductor-based approach.

Researchers of the University of Tokyo together with colleagues from Johannes Gutenberg University Mainz (JGU) in Germany and Palacký University Olomouc in the Czech Republic have recently demonstrated a new means of constructing a photonic quantum computer. Rather than using a single photon, the team employed a laser-generated light pulse that can consist of several photons.

"Our laser pulse was converted to a quantum optical state that gives us an inherent capacity to correct errors," stated Professor Peter van Loock of Mainz University. "Although the system consists only of a laser pulse and is thus very small, it can in principle eradicate errors immediately."

"Thus, there is no need to generate individual photons as qubits via numerous light pulses and then have them interact as logical qubits. We need just a single light pulse to obtain a robust logical qubit," added van Loock.

In other words, a physical qubit is already equivalent to a logical qubit in this system, a remarkable and unique concept. However, the logical qubit experimentally produced at the University of Tokyo was not yet of a sufficient quality to provide the necessary level of error tolerance. Nonetheless, the researchers have clearly demonstrated that it is possible to transform non-universally correctable qubits into correctable qubits using the most innovative quantum optical methods.

Reference: Logical states for fault-tolerant quantum computation with propagating light by Shunya Konno, Warit Asavanant, Fumiya Hanamura, Hironari Nagayoshi, Kosuke Fukui, Atsushi Sakaguchi, Ryuhoh Ide, Fumihiro China, Masahiro Yabuno, Shigehito Miki, Hirotaka Terai, Kan Takase, Mamoru Endo, Petr Marek, Radim Filip, Peter van Loock and Akira Furusawa, 18 January 2024, Science. DOI: 10.1126/science.adk7560

See the rest here:
Quantum Computing Breakthrough: Scientists Develop New Photonic Approach That Works at Room Temperature - SciTechDaily

Next-Generation Quantum Leap Achieved at Waterloo’s IQC – yTech

In a groundbreaking stride toward secure global communications, researchers at the University of Waterloo's Institute for Quantum Computing (IQC) have seamlessly integrated two Nobel Prize-recognized discoveries to enhance photon entanglement. Their work centers around the ingenious use of quantum dots, microscopic semiconductor devices that have now demonstrated a tremendous leap in the production of entangled photon pairs with nearly impeccable precision.

This accomplishment speaks volumes about the future capabilities of quantum communication, particularly for quantum key distribution and for linking up distant quantum computers. Professor Michael Reimer, from the IQC and the Department of Electrical and Computer Engineering, pointed out the dual achievement of producing photon pairs that boast both superior entanglement and enhanced efficiency.

The research team boasts a photon source that eclipses prior models by up to 65 times in terms of effectiveness. Their innovative approach involved placing quantum dots inside a nanowire and using advanced, high-resolution single photon detectors to combat the challenges associated with fine structure splitting, a vexing quantum effect that used to obscure accurate entanglement measurements.

Demonstrations using the new quantum dot entanglement source to emulate secure communication protocols have shown promising results, indicating potential applications that could revolutionize the way sensitive information is shared across the globe. By addressing significant obstacles in both precision and efficiency, and by showing the practical utility of their findings in security tasks, the IQC scientists have championed a pivotal evolution in the ever-expanding field of quantum communication.

This leap doesn't just set a new benchmark; it ignites the imagination for what the next era of interconnected, secure quantum systems might look like.

The completed study detailing these findings can be found in the journal Communications Physics. Keep informed on groundbreaking scientific progress by signing up for newsletters or exploring apps like EarthSnap for updates on cutting-edge research.

Quantum Computing and Communication: An Industry on the Precipice of Revolution

The recent breakthrough at the University of Waterloo's Institute for Quantum Computing marks a significant milestone in the industry, which is poised for exponential growth. Quantum computing and communication are sectors that have been capturing the imagination of researchers and investors alike, leading to a burgeoning industry with profound implications for fields ranging from cryptography to complex problem solving.

Research such as that performed at the IQC is fueling advancements in quantum key distribution (QKD), a method for secure communication that uses quantum mechanics to encrypt and transmit data over long distances. This has vast potential for defense, banking, and any sector where data security is paramount.

Market Forecasts and Economic Implications

The global market for quantum computing is expected to reach multi-billion-dollar valuations in the coming decade, with a compound annual growth rate that outpaces most traditional sectors. Ongoing research and development efforts are key drivers of this growth. The anticipation of commercially viable quantum computers and secure quantum communication systems is already influencing strategic investments from governments and private entities worldwide.

Issues Facing the Quantum Industry

Despite the optimism, the quantum industry faces several significant challenges. For one, maintaining quantum coherence over long periods and distances raises technical and environmental barriers. Additionally, integrating quantum technologies with existing communication networks remains a complex issue. Moreover, there is an ongoing need for developing a workforce skilled in quantum mechanics and related technologies.

Addressing these issues is vital for the advancement of quantum technologies, which are still largely in the research and development stage. The future of the industry will likely be shaped by collaborations between academic institutions and industry leaders to innovate and overcome these obstacles.

Keep an eye on the latest trends and breakthroughs in this dynamic field by following authoritative sources. Notable entities in this sector include the Institute for Quantum Computing and other leading research institutions paving the way for the next generation of quantum technologies.

To understand the broader scope of the impact of quantum technologies and stay updated on cutting-edge advancements, one might consider subscribing to newsletters or utilizing educational platforms. Notifications from apps like EarthSnap and other innovation-focused resources can be invaluable for keeping abreast of the latest developments in this rapidly evolving industry.


Read this article:
Next-Generation Quantum Leap Achieved at Waterloo's IQC - yTech

Snohomish County on its way to becoming the ‘Quantum Valley’ of the West – Lynnwood Times

BOTHELL - Maryland-based quantum computing company IonQ celebrated the grand opening of its new 105,000-square-foot building in Bothell on Thursday, February 15. This is the first known dedicated quantum computing manufacturing facility in the United States and the latest chapter for Snohomish County innovation and industry.

The new facility will house IonQ's growing R&D and manufacturing teams, where about 80 employees currently work. Bothell was chosen as the location of the company's West Coast expansion due to its proximity to several software, aerospace, and tech companies that could utilize its services.

"The Seattle area obviously has a huge history of leadership in classical computing, with companies like Microsoft and Amazon, but also Boeing and many others. So, it has a great DNA for technology but also manufacturing," said Peter Chapman, IonQ CEO. "That's really the two things that we pull into manufacturing quantum computers. It's great to have a highly skilled and highly educated workforce to pull from."

The State of Washington has the highest state concentration of tech workers in the country with about 350,000 jobs as of 2023, according to Sen. Maria Cantwell's office. The state also employs the highest number of people in emerging technology jobs such as artificial intelligence and quantum. Just recently GeekWire reported that Seattle was found to be the top city in the country for hiring elite software engineers, and one of the top cities in the world for this level of talent.

IonQ is a leader in quantum computing, with a mission statement to build "the world's best quantum computers to solve the world's most complex problems." Quantum computers are a revolutionary technology with the potential to change how businesses in fields including artificial intelligence, security, pharmaceuticals, and aerospace conduct their operations.

Unlike a conventional computer, which uses binary bits (ones and zeros) as its unit of information, IonQ's quantum computers operate at the atomic level: individual ionized atoms are held in electromagnetic traps and manipulated with specialized lasers. Each trapped ion serves as a quantum bit (or qubit), a particle that can exist in more than one state at a time. For certain classes of problems, this promises dramatic gains in both the scale of data that can be handled and the time to solution.

While quantum computing is a still-developing technology, if successful, this enormous computational power could help tackle some of the world's biggest challenges, from fighting disease to developing better sustainable energy.

At Thursday's ceremony Senator Maria Cantwell, a lead negotiator behind the 2022 CHIPS and Science Act and chair of the Senate Committee on Commerce, Science, and Transportation, took to the podium to share a few words before several teams were led through the facility to see firsthand how these miraculous machines are made.

"The quantum computing industry has the potential to add thousands of new jobs here in the Pacific Northwest, jobs at all skill levels, from technicians to software developers," said Sen. Cantwell. "Our region is already known worldwide for our innovation and leadership. And this facility will continue to build on that ... We are becoming the Quantum Valley, if you will, of the United States."

Sen. Cantwell has been working hard to help industry and academia translate ideas developed in a lab into products and solutions for the American people. She helped authorize $20 billion for a new Tech Directorate program at the National Science Foundation to focus on translational science in 10 key developing areas, including quantum technology.

Also in attendance at Thursday's event were Snohomish County Executive Dave Somers, members of the Swiss embassy, and officials representing Microsoft and Amazon.

IonQ has invested around $20 million in its Bothell facility to upgrade its infrastructure within the last year. As of now, half of the building, its lower level, has been fully completed and is already being used for manufacturing, with the other half still under construction. Once construction is finished, the company will have gone from 6,500 square feet and a handful of employees to over 100,000 square feet, with eventual plans to move its data center from Maryland to Bothell in Snohomish County.

Founded in 2015 by quantum physicists Chris Monroe and Jungsang Kim, after 25 years of academic research and $1 million in seed funding from New Enterprise Associates, IonQ went on to raise $20 million from GV, Amazon Web Services, and NEA to build two of the world's most accurate quantum computers. In 2019, the company began offering its quantum computers via the cloud, through a partnership with Microsoft and Amazon Web Services.

In 2020 and 2021, IonQ built additional generations of high-performance quantum hardware, added Google Cloud Marketplace to its cloud partner roster and announced a series of collaborations and business partnerships with leading academic and commercial institutions.

On October 1, 2021, IonQ began trading as IONQ on the New York Stock Exchange, making it the world's first public pure-play quantum computing company.

Its quantum computer products include the Forte Enterprise, the Forte, the IonQ Aria, the IonQ Harmony, and the still-in-development AQ 64 IonQ Tempo, which will have faster gate speeds, mid-circuit measurement, and 99.9% fidelity, "all helping to unlock larger and more complex problem classes and deliver a faster time-to-solution," the company says.

Peter Chapman, CEO of IonQ, shared that in the upcoming year, IonQ will have invested close to $80 million in the Seattle area, which approaches the company's promise of investing $1 billion over the next 10 years.

Read this article:
Snohomish County on its way to becoming the 'Quantum Valley' of the West - Lynnwood Times

Expert outlines impact of quantum computing | UNC-Chapel Hill – The University of North Carolina at Chapel Hill

A new type of computer is coming, one that can solve problems much faster and much better than today's models. Quantum computing, powered by quantum mechanics, represents a foundational shift in computing.


Eric Ghysels, the Edward Bernstein Distinguished Professor of Economics and Professor of Finance, researches the impact quantum computing could have on finance at the UNC Kenan-Flagler Business School.

"Quantum computing is one of the emerging technologies for dealing with all sorts of practical problems facing financial institutions," he said. "It is a paradigm shift compared to classical computing, and it has the promise to affect many issues of decision-making in the financial sector."

Ghysels answers questions about quantum computing and how it can impact businesses and the financial sector.

Six years ago, I read that NC State had established the very first Q Hub, as it was called, which offered access to quantum hardware sponsored or supported by IBM, meaning that we could actually work with real hardware. That drew my curiosity, and in 2019 I went to the first Q Network meetings organized by IBM in New York. Ironically, the pandemic helped in that we all took classes online while sitting at home.

Encryption is based on algorithms and factorization problems that cannot be solved easily within a reasonable amount of time; we're talking about 70,000 years, just inconceivable, to break a code. Quantum computers will easily break the codes that underlie encryption. That means that all the traffic that we consider to be safe on the internet, and all the transactions that we think are safe through encryption, are jeopardized.

Most financial institutions know that and are getting quantum-ready. That's the most obvious thing, but it's not only true for financial institutions. It's true for everything else that goes through internet connections.

There are other things, of course: the speed at which we trade, the speed at which we can derive formulas for portfolio allocation, the pricing of derivative securities. All are going to be affected by the computational speedups.
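The encryption point above comes down to the cost of factoring. A minimal sketch of why, assuming nothing beyond the standard library: trial division finds the factors of a tiny semiprime instantly, but the same search over a realistically sized modulus would take astronomically long on a classical machine, which is exactly the gap Shor's algorithm would close. The function name is illustrative, not from any article.

```python
# Toy illustration: trial-division factoring of a semiprime.
# Real RSA-style moduli are ~2048 bits; this brute-force search
# scales roughly with the square root of n, which is why classical
# factoring of such numbers is considered infeasible.

def trial_factor(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of n, or (n, 1) if n is prime."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

# A tiny semiprime factors instantly...
print(trial_factor(3233))  # (53, 61)
# ...but each extra bit of n roughly multiplies the work, putting
# real-world key sizes far out of reach for this approach.
```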

This started around 2019. Initially, it was organized by Rethinc Labs, of which I am the research director at the Kenan Institute. It ran parallel to a joint series between NC State and Duke, which organized quantum-related webinars but did not cover financial sector applications. We now have a joint Duke, NC State and UNC webinar.

I'll give you two projects I'm working on. One is solving what we call asset-pricing problems. How do you price a particular asset that has a particular payoff profile? That's a very foundational question in finance. You buy a security or a real estate property: how do you price it? I've been working on research on the potential exponential speedup of computing solutions to asset-pricing problems.

The other project involves combinatorial optimization, which requires matching bids and asks: people who want to buy versus people who want to sell. You want to figure out how to combine these different sides of the market. Quantum computers are good at solving combinatorial optimization problems, better than classical computers.

View original post here:
Expert outlines impact of quantum computing | UNC-Chapel Hill - The University of North Carolina at Chapel Hill

An overview of post-quantum threats to proof-of-work cryptocurrencies – Cointelegraph

Proof-of-work (PoW), or Nakamoto consensus, is a decentralized consensus mechanism that secures a blockchain by requiring nodes to expend energy and compete against each other to solve complex mathematical challenges to add blocks to the chain and receive rewards.

PoW also requires the network nodes to come to a consensus on whether network elements, such as account balances and the order of transactions, are correct. Bitcoin (BTC) is the largest PoW-powered blockchain by market cap in existence.

The mathematical problems Bitcoin network nodes solve require a significant number of computations, and miners often have to deploy application-specific integrated circuit (ASIC) hardware to keep up with the other nodes in a PoW network. Even with ASICs, acquiring majority control of the network and executing a 51% attack to validate invalid transactions would require a substantial amount of computational power.
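The puzzle those ASICs grind through can be sketched in a few lines. This is a simplified stand-in for Bitcoin's actual scheme (which double-hashes an 80-byte block header against a full 256-bit target): search for a nonce whose SHA-256 digest starts with a given number of zero hex digits. The header string and difficulty here are made up for illustration.

```python
import hashlib

def mine(header: str, difficulty: int) -> tuple[int, str]:
    """Find a nonce so that SHA-256(header + nonce) has `difficulty`
    leading zero hex digits. Each extra digit multiplies the expected
    work by 16, which is why real mining demands specialized hardware."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block-42", difficulty=4)
print(nonce, digest)  # digest begins with "0000"
```

Verifying a solution takes one hash, while finding it takes many: that asymmetry is what makes the work both costly to produce and cheap for other nodes to check.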

However, with the introduction of quantum computing technologies, there is a growing concern that the cryptographic underpinnings of blockchain technology, including Bitcoin, could be disrupted. Quantum computers may be able to attack conventional cryptographic methods, such as those employed in Bitcoin's transaction validation procedure.

In particular, compared to classical computers, quantum computers can tackle complicated mathematical problems like discrete logarithms and integer factorization exponentially faster. The emergence of quantum computing thus poses a post-quantum threat to Bitcoin's security.

Should a sufficiently potent quantum computer be developed, it might jeopardize the cryptographic integrity of the algorithms that underpin Bitcoin. This could allow malevolent actors to carry out attacks that were previously deemed impossible, such as the capacity to carry out a 51% attack with less computational work than is currently required.

Post-quantum computing refers to the era that would follow the development and deployment of quantum computers that have the potential to solve computational challenges that are presently thought to be beyond the capabilities of classical computers. This covers activities like simulating quantum systems, factoring big numbers and resolving specific optimization issues.

Quantum computing differs fundamentally from classical computing, which relies on bits that can represent either 0 or 1. Instead, quantum bits, or qubits, are used in quantum computing. Due to principles of superposition and entanglement, qubits can represent 0, 1, or both simultaneously.
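The superposition idea can be made concrete with a toy single-qubit simulation, assuming only the standard library (the function names are ours, not a quantum SDK's): a qubit state is a pair of complex amplitudes whose squared magnitudes give the measurement probabilities, and the Hadamard gate turns a definite 0 into an equal superposition.

```python
import math
import random

def hadamard(a: complex, b: complex) -> tuple[complex, complex]:
    """Apply the Hadamard gate; it maps |0> to an equal superposition."""
    s = 1 / math.sqrt(2)
    return s * (a + b), s * (a - b)

def measure(a: complex, b: complex) -> int:
    """Collapse the state: 0 with probability |a|^2, else 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

a, b = hadamard(1, 0)            # start in |0>, create a superposition
print(abs(a) ** 2, abs(b) ** 2)  # each probability is 0.5 (up to rounding)
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # fraction of 1s, close to 0.5
```

The state holds both outcomes at once, but a measurement yields only one of them; the advantage of real quantum hardware comes from interfering many such amplitudes across entangled qubits, which this classical sketch cannot scale to.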

The implications of quantum computing on PoW are considered one of the greatest incoming threats to the efficacy and effectiveness of blockchains and blockchain cryptography.

In the post-quantum computing era, quantum-resistant cryptographic algorithms will be developed to withstand attacks from quantum computers and ensure the security of sensitive information in a post-quantum world.

Cryptography is a discipline within mathematics focusing on securing communication and data and is fundamental to PoW cryptocurrencies like BTC. The Bitcoin blockchain uses powerful cryptography to ensure its decentralized money transfer model remains trustless, private and secure during peer-to-peer transactions. However, quantum computers may attack it by deploying machines and algorithms powerful enough to break its cryptographic shields.

Bitcoin uses asymmetric encryption (also known as public-key cryptography), which employs two different keys: public and private. The public key is used to encrypt data or, in the case of Bitcoin, to generate a Bitcoin address where funds can be received. However, the private key is used for decryption or signing transactions. The private key proves ownership of the funds and authorizes transactions, allowing them to be securely added to the blockchain.

The most important ways Bitcoin uses cryptography are through digital signatures and hash functions. Both of these, however, are potentially crackable through quantum computing.

The Elliptic Curve Digital Signature Algorithm (ECDSA) used for digital signatures allows users to verify who owns a Bitcoin address and to approve transactions. If quantum computers become powerful enough, they might be able to defeat ECDSA using techniques such as Shor's algorithm, which could theoretically solve the discrete logarithm problem, the foundation of ECDSA security, in polynomial time.

Shor's algorithm, run on a sufficiently powerful quantum machine, could determine the private key associated with a public key protected by the elliptic curve cryptography (ECC) scheme, invalidating the digital signature.

Cryptographic hash functions, namely SHA-256, are used by Bitcoin in several ways, including in the mining process (PoW) and in the creation of addresses from public keys. Hash functions are considered more resistant to quantum attacks than today's public-key cryptography systems.

However, a sufficiently powerful quantum computer might still present a threat, albeit a less immediate one than for digital signatures. For instance, Grover's algorithm may theoretically accelerate the search for a preimage of a hash function. But it offers only a quadratic speedup, meaning the threat can be mitigated by doubling the hash length, for example, from 256 to 512 bits.
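The "double the hash length" argument is simple arithmetic, sketched here: a classical preimage search over an n-bit hash costs about 2^n evaluations, while Grover's algorithm needs about the square root of that, 2^(n/2).

```python
# Back-of-the-envelope costs for preimage search on an n-bit hash.

def classical_cost(bits: int) -> int:
    """Expected classical brute-force evaluations: 2^n."""
    return 2 ** bits

def grover_cost(bits: int) -> int:
    """Grover's quadratic speedup: roughly sqrt(2^n) = 2^(n/2)."""
    return 2 ** (bits // 2)

# Grover cuts SHA-256's effective preimage security from 2^256 to 2^128...
assert grover_cost(256) == classical_cost(128)
# ...so doubling the hash length to 512 bits restores 2^256 of work.
assert grover_cost(512) == classical_cost(256)
```

This is why Grover is treated as a manageable threat to hashing, in contrast to Shor's exponential speedup against the discrete logarithm problem, which no key-size increase can realistically outrun.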

Securing PoW against quantum threats and developing post-quantum blockchain security have become essential. The blockchain's quantum computing challenge is to develop solutions that can protect it from a quantum computer powerful enough to break all of its current cryptographic security measures.

Quantum-proof cryptocurrency and quantum resistance in blockchains may be possible with techniques like lattices, isogenies and codes.

Lattice-based cryptography is based on the mathematical concept of a lattice: a grid of evenly spaced points that extends infinitely in every direction. This type of cryptography uses the complexity of lattice problems as the basis for encrypting and decrypting messages.

Lattice-based cryptography uses operations on lattice points to carry out encryption, decryption and other cryptographic functions. Because the underlying lattice problems are complex and intractable, an attacker would find it challenging to recover the original message or the decryption key without knowing the precise structure of the lattice used in the encryption process.

Isogeny-based cryptography is an evolution of ECC and focuses on securely passing secret messages using the mathematical properties of elliptic curves. However, it introduces a new layer of complexity by using isogenies rather than the points on the curves directly, as in traditional ECC.

Isogeny-based cryptography is similar to two parties coming up with a secret handshake in public, with every move being observed, but no one can replicate it. Like lattice-based cryptography, its complexity offers possible defense against quantum computer attacks, making isogeny-based cryptography a viable option for post-quantum cryptography.

Code-based cryptography is based on general linear codes that are challenging to decode. It builds puzzles from error-correcting codes, a set of mathematical tools used to detect and correct errors in data transmission. For example, if a message sent over the internet gets corrupted before it reaches its target, an error-correcting code can be used to recover it accurately.

In code-based cryptography, it should be straightforward for anyone with the right key to decode a message but challenging for anyone else. Code-based cryptography is considered to have quantum-resistance potential because decoding random linear codes, the basis of code-based cryptography, is not known to be efficiently solvable by quantum computers using current algorithms, including Shor's and Grover's.
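A toy example of the error-correcting machinery these systems build on, assuming nothing beyond the standard library: a 3x repetition code sends each bit three times and decodes by majority vote, correcting any single flipped copy. Real code-based schemes such as McEliece use far richer linear codes, where decoding without the secret structure is the hard problem.

```python
# 3x repetition code: the simplest error-correcting code.

def encode(bits: list[int]) -> list[int]:
    """Repeat each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received: list[int]) -> list[int]:
    """Majority-vote each group of three, correcting one flip per group."""
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

codeword = encode([1, 0, 1])  # [1, 1, 1, 0, 0, 0, 1, 1, 1]
codeword[4] ^= 1              # corrupt one transmitted bit
print(decode(codeword))       # [1, 0, 1]: the error is corrected
```

Decoding is trivial here precisely because the code's structure is public; code-based cryptosystems hide that structure behind a secret key, so an eavesdropper faces what looks like a random, intractable code.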

In 2022, the United States Department of Commerce's National Institute of Standards and Technology (NIST) announced it had chosen the first set of encryption tools designed to withstand attacks by quantum machines. The four selected algorithms, CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON and SPHINCS+, will become part of NIST's post-quantum cryptographic standard, which is set to be finalized in 2024.

The future of PoW cryptocurrencies in the quantum era is a topic of significant interest and concern within the cryptographic and blockchain communities. Scientists from the University of Sussex estimate that a quantum system with 13 million qubits could break the cryptographic algorithms that secure the Bitcoin blockchain within 24 hours.

The mining component of PoW may also be impacted by quantum computing. Although quantum techniques like Grover's algorithm can accelerate mining through a quadratic speedup in the search for a nonce that meets the PoW criterion, the potential disruption to cryptographic security outweighs this benefit. In any case, the processing capacity required to significantly influence PoW mining is not yet available.

To protect PoW blockchains from future quantum attacks, the blockchain community is actively investigating and creating cryptographic algorithms resistant to quantum attacks. Meanwhile, the hardware is advancing: QuEra, a startup founded by former researchers from Harvard University and the Massachusetts Institute of Technology, has released an ambitious roadmap for an upcoming quantum machine.

The company plans to release a quantum computer with 100 logical qubits and 10,000 physical qubits by 2026. It claims the machine will demonstrate a practical quantum advantage, meaning it will be able to perform tasks that today's bit-based computers cannot.

For now, quantum computers are still unable to crack cryptographic algorithms like those used in Bitcoin because of their small size and limited fidelity. The field is progressing, though many technical obstacles, such as qubit coherence times and error rates, have yet to be overcome.

Written by Aditya Das

See the article here:
An overview of post-quantum threats to proof-of-work cryptocurrencies - Cointelegraph

Quantum computing startup Diraq raises $15M to build qubits using traditional silicon chips – SiliconANGLE News

Australian quantum computing startup Diraq Pty Ltd. said today it has closed on a $15 million capital raise that will be used to advance its research into a novel concept for building the physical qubits that power quantum computers.

The Series A-2 round was led by Quantonation, a specialist venture capital fund focused on quantum computing technologies, and saw participation from Higgins Family Investments and the University of New South Wales, Sydney. The round extends Diraq's original $20 million Series A, which closed in May 2022, according to PitchBook data. All told, Diraq has now raised more than $120 million, with the bulk of those funds coming from various Australian and U.S. government funding programs.

The Sydney-based startup is working on the development of quantum processors that rely on electron spins in complementary metal-oxide semiconductor (CMOS) quantum dots. In other words, it claims to be able to make quantum chips using existing chipmaking technologies. The main advantage of silicon-based qubits, the quantum version of the classical binary bit, is that they could potentially leverage the semiconductor industry's existing infrastructure, meaning they can be manufactured without investing millions of dollars in new quantum chip fabs.

Diraq's silicon-based qubits are very different from the superconducting and trapped-ion counterparts being developed by companies such as IBM Corp. and IonQ Inc., although the research is perhaps not quite as advanced. While those rivals have already made cloud-based systems available to customers, Diraq is not yet ready to do so. However, the startup insists that its technology remains the only viable way to scale quantum computers to support commercial-scale applications.

Most experts agree that quantum computers will need millions, if not billions of qubits to obtain an advantage over classical computers. But at present, most existing quantum machines can only support thousands of qubits.

Diraq says its approach will allow it to build a full-stack quantum computer that can move the nascent industry toward truly fault-tolerant computing. Already, it claims to have demonstrated superior qubit control with enough fidelity to allow for scalable error correction. This is necessary because in existing systems, the qubits are inherently unstable, introducing errors into quantum calculations that get worse as those systems scale.

The startup claims to have patents covering a detailed CMOS-based architecture for billions of qubits capable of full error correction, together with advanced methods for qubit control and quantum memory, as well as innovative CMOS device designs.

Diraq co-founder and Chief Executive Andrew Dzurak (pictured, center) said that billions of qubits will be required to see useful quantum computing deployed cost-efficiently in a commercial timeframe. "We are working closely with our foundry partners to drive qubit development based on tried and tested CMOS techniques coupled with our proprietary designs," he explained. "We are focused on delivering energy-efficient processors with billions of qubits on one chip contained in one refrigerator, rather than thousands of chips and refrigerators requiring hundreds of square meters of space in a warehouse."

Analyst Holger Mueller of Constellation Research Inc. said Diraq's confidence in its approach to quantum computing makes it a promising proposition. "It would be a major breakthrough in quantum computing if Diraq is able to use existing CMOS infrastructure to build those incredibly unstable qubits," Mueller said. "It's too early to tell if the approach will be successful, but the funding will certainly help, and those who have a stake in the quantum revolution would do well to keep an eye out for whatever Diraq comes up with next."

Quantonation partner Will Zeng said the startup's main focus going forward will be to develop a working quantum device using a standard semiconductor foundry. "This milestone will serve as a proof point, solidifying the viability of Diraq's technology and propelling the company's ambitious scale-up program aimed at constructing the most powerful quantum computers in the world," he added.


Continue reading here:
Quantum computing startup Diraq raises $15M to build qubits using traditional silicon chips - SiliconANGLE News