Category Archives: Quantum Computer

Quantum information and quantum field theory: Study found a new connection between them – Tech Explorist

Quantum information plays a vital role in connecting several branches of physics. Specifically, the theory of quantum error correction, which explains how to protect and recover quantum information in quantum computers and other complex interacting systems, has become fundamental to the modern understanding of quantum gravity.

Anatoly Dymarsky, Associate Professor at the Skoltech Center for Energy Science and Technology (CEST), said, "Normally, information stored in physical systems is localized. Say, a computer file occupies a particular small area of the hard drive. By error we mean any unforeseen or undesired interaction which scrambles information over an extended area.

"In our example, pieces of the computer file would be scattered over different hard drive areas. Error-correcting codes are mathematical protocols that allow collecting these pieces together to recover the original information. They are in heavy use in data storage and communication systems. Quantum error-correcting codes play a similar role in cases when the quantum nature of the physical system is important."

In a rather unexpected twist, scientists realized not too long ago that quantum gravity, the theory describing the quantum dynamics of space and time, relies on similar mathematical protocols to exchange information between different parts of space. The locality of information within quantum gravity remains one of the few open fundamental problems in theoretical physics. That is why the appearance of well-studied mathematical structures such as quantum error-correcting codes is intriguing.

Yet the role of codes has so far been understood only schematically, and the explicit mechanism behind the locality of information remains elusive.

In a new study, scientists have discovered a new connection between quantum information and quantum field theory. The study offers clear evidence of the growing role of quantum information theory across various areas of physics.

In the study from Skoltech and the University of Kentucky, scientists established a novel connection between quantum error-correcting codes and two-dimensional conformal field theories. Conformal field theories describe the interactions of quantum particles and provide standard theoretical tools for describing many different phenomena, from fundamental elementary particles to quasi-particles emerging in quantum materials.

Dymarsky concludes, "Now we have a new playground to study the role of quantum error-correcting codes in the context of quantum field theory. We hope this is a first step in understanding how locality of information works and what hides behind all this beautiful mathematics."


Global AI Chipsets for Wireless Networks and Devices, Cloud and Next Generation Computing, IoT, and Big Data Analytics to 2026 -…

DUBLIN--(BUSINESS WIRE)--The "AI Chipsets for Wireless Networks and Devices, Cloud and Next Generation Computing, IoT, and Big Data Analytics 2021 - 2026" report has been added to ResearchAndMarkets.com's offering.

This report evaluates leading market players across the AI chipsets ecosystem, technology strategies, and solution plans. This includes leveraging AI chipsets for support of various emerging and disintermediating technology areas such as edge computing, 5G, and blockchain systems. Additional areas addressed include AI support of emerging computing technologies including edge platforms and servers.

This report also assesses applications and service support scenarios for AI chipsets across almost all major industry verticals. The report provides forecasts for AI chipset hardware, embedded software, professional service, deployment platforms, and applications for every major industry vertical as well as regional and country forecasts for 2021 to 2026. The report also provides exclusive recommendations for stakeholders within the AI chipsets ecosystem.


The AI chipset marketplace is poised to transform the entire embedded system ecosystem with a multitude of AI capabilities such as deep machine learning, image detection, and many others. The report projects that, by 2026, 86% of all chipsets shipping globally will be AI-equipped, and over 59% of all electronics will have some form of embedded intelligence. This will also be transformational for existing critical business functions such as identity management, authentication, and cybersecurity.

Multi-processor AI chipsets learn from the environment, users, and machines to uncover hidden patterns among data, predict actionable insights, and perform actions based on specific situations. AI chipsets will become an integral part of AI software and systems, as well as critical support for any data-intensive operation, as they drastically improve processing for various functions and enhance overall computing performance. This will be a boon for many aspects of ICT, ranging from decision support and data analytics to product safety and system optimization.

Consumers will realize benefits indirectly through improved product and service performance, such as device and cloud-based gaming. Enterprise and industrial users will benefit through general improvements in automated decision-making, especially in the areas of robotic process automation, decision support systems, and overall data management. AI chipsets will be particularly useful in business edge equipment for real-time data analytics and store-versus-process decisions.

Key Topics Covered:

1.0 Executive Summary

2.0 Research Overview

3.0 AI Chipsets Introduction

3.1 AI Chipsets

3.1.1 Chipset Components

3.1.2 General Purpose Applications

3.2 AI Systems

3.3 Market Dynamics Analysis

3.4 AI Investments

3.5 Competitive Market

4.0 Technologies, Solutions, and Markets

4.1 Chipsets Technology and Products

4.2 AI Technology

4.2.1 Machine Learning

4.2.2 Machine Learning APIs

4.2.3 Deep Machine Learning

4.2.4 Natural Language Processing

4.2.5 Computer Vision

4.2.6 Voice Recognition

4.2.7 Context Awareness Computing

4.2.8 Neural Networks

4.2.9 Facial Recognition

4.3 Deployment Platform

4.4 IoT Sector

4.5 Applications in Industry Verticals

4.6 Regional Markets

4.7 Value Chain

4.8 5G Network and Edge Computing

4.9 Cloud Computing and Data Analytics

4.10 Industry 4.0 and Factory Automation

4.11 Autonomous Networks

4.12 Blockchain Networks

4.13 Quantum Computing

4.14 Machine Intelligence

4.15 Nanoscale Technology

4.16 Mobile Network Operators

5.0 Company Analysis

6.0 AI Chipsets Market Analysis and Forecasts 2021 - 2026

6.1 Global AI Chipsets Market 2021 - 2026

6.2 Regional AI Chipsets Market 2021 - 2026

6.3 AI Chipsets Deployment Forecast 2021 - 2026

7.0 Conclusions and Recommendations

For more information about this report visit https://www.researchandmarkets.com/r/c4pcmr


What is quantum computing? Everything you need to know about the strange world of quantum computers – ZDNet

Quantum computing exploits the puzzling behavior that scientists have been observing for decades in nature's smallest particles: think atoms, photons, or electrons. At this scale, the classical laws of physics cease to apply, and instead we shift to quantum rules.

While researchers don't understand everything about the quantum world, what they do know is that quantum particles hold immense potential, in particular to hold and process large amounts of information. Successfully bringing those particles under control in a quantum computer could trigger an explosion of compute power that would phenomenally advance innovation in many fields that require complex calculations, like drug discovery, climate modelling, financial optimization or logistics.

"Quantum computing is our way of emulating nature to solve extraordinarily difficult problems and make them tractable," Bob Sutor, chief quantum exponent at IBM, tells ZDNet.

Quantum computers come in various shapes and forms, but they are all built on the same principle: they host a quantum processor where quantum particles can be isolated for engineers to manipulate.

The nature of those quantum particles, as well as the method employed to control them, varies from one quantum computing approach to another. Some methods require the processor to be cooled down to freezing temperatures; others manipulate quantum particles using lasers. All, however, share the goal of finding out how best to exploit the value of quantum physics.

The systems we have been using since the 1940s in various shapes and forms (laptops, smartphones, cloud servers, supercomputers) are known as classical computers. These are based on bits, a unit of information that powers every computation that happens in the device.

In a classical computer, each bit can take on either a value of one or zero to represent and transmit the information that is used to carry out computations. Using bits, developers can write programs, which are sets of instructions that are read and executed by the computer.

Classical computers have been indispensable tools in the last few decades, but the inflexibility of bits is limiting. As an analogy, if tasked with looking for a needle in a haystack, a classical computer would have to be programmed to look through every single piece of hay straw until it reached the needle.

There are still many large problems, therefore, that classical devices can't solve. "There are calculations that could be done on a classical system, but they might take millions of years or use more computer memory than exists in total on Earth," says Sutor. "These problems are intractable today."

At the heart of any quantum computer are qubits, also known as quantum bits, which can loosely be compared to the bits that process information in classical computers.

Qubits, however, have very different properties from bits, because they are made of the quantum particles found in nature: the very particles that have been obsessing scientists for many years.

One of the properties of quantum particles that is most useful for quantum computing is known as superposition, which allows quantum particles to exist in several states at the same time. The best way to imagine superposition is to compare it to tossing a coin: instead of being heads or tails, quantum particles are the coin while it is still spinning.

By controlling quantum particles, researchers can load them with data to create qubits and thanks to superposition, a single qubit doesn't have to be either a one or a zero, but can be both at the same time. In other words, while a classical bit can only be heads or tails, a qubit can be, at once, heads and tails.
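
To make this less abstract, here is a minimal superposition sketch written with IBM's open-source Qiskit toolkit (mentioned later in this article) and its Aer simulator. The package names and the use of a simulator rather than real hardware are illustrative assumptions, not details from the article:

    # Minimal superposition sketch (assumes: pip install qiskit qiskit-aer)
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(1, 1)
    qc.h(0)            # Hadamard gate: the "spinning coin", 0 and 1 at once
    qc.measure(0, 0)   # measuring forces the coin to land on heads or tails

    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)      # roughly {'0': 500, '1': 500}

Run repeatedly, the measurement comes up 0 about half the time and 1 the other half: the qubit held both values until it was observed.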

This means that, when asked to solve a problem, a quantum computer can use qubits to run several calculations at once to find an answer, exploring many different avenues in parallel.

So in the needle-in-a-haystack scenario above, unlike a classical machine, a quantum computer could in principle browse through all hay straws at the same time, finding the needle in a matter of seconds rather than looking for years, even centuries, before it found what it was searching for.

What's more: qubits can be physically linked together thanks to another quantum property called entanglement, meaning that with every qubit that is added to a system, the device's capabilities increase exponentially, whereas adding more bits only generates a linear improvement.

Every time we use another qubit in a quantum computer, we double the amount of information and processing ability available for solving problems. So by the time we get to 275 qubits, we can compute with more pieces of information than there are atoms in the observable universe. And the compression of computing time that this could generate could have big implications in many use cases.
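
That 275-qubit claim is easy to sanity-check in a few lines of Python; the figure of roughly 10^80 atoms in the observable universe is a commonly cited estimate, not a number taken from this article:

    # n qubits span 2^n basis states; compare with ~10^80 atoms in the universe
    n = 275
    states = 2 ** n
    print(f"2^{n} is about {states:.2e}")   # ~6.1e82
    print(states > 10 ** 80)                # True: more states than atoms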

"There are a number of cases where time is money. Being able to do things more quickly will have a material impact in business," Scott Buchholz, managing director at Deloitte Consulting, tells ZDNet.

The gains in time that researchers are anticipating as a result of quantum computing are not of the order of hours or even days. We're rather talking about potentially being capable of calculating, in just a few minutes, the answer to problems that today's most powerful supercomputers couldn't resolve in thousands of years, ranging from modelling hurricanes all the way to cracking the cryptography keys protecting the most sensitive government secrets.

And businesses have a lot to gain, too. According to recent research by Boston Consulting Group (BCG), the advances that quantum computing will enable could create value of up to $850 billion in the next 15 to 30 years, $5 to $10 billion of which will be generated in the next five years if key vendors deliver on the technology as they have promised.

Programmers write problems in the form of algorithms for classical computers to resolve; similarly, quantum computers will carry out calculations based on quantum algorithms. Researchers have already identified some quantum algorithms that would be particularly suited to the enhanced capabilities of quantum computers.

For example, quantum systems could tackle optimization algorithms, which help identify the best solution among many feasible options, and could be applied in a wide range of scenarios ranging from supply chain administration to traffic management. ExxonMobil and IBM, for instance, are working together to find quantum algorithms that could one day manage the 50,000 merchant ships crossing the oceans each day to deliver goods, to reduce the distance and time traveled by fleets.

Quantum simulation algorithms are also expected to deliver unprecedented results, as qubits enable researchers to handle the simulation and prediction of complex interactions between molecules in larger systems, which could lead to faster breakthroughs in fields like materials science and drug discovery.

With quantum computers capable of handling and processing much larger datasets, AI and machine learning applications are set to benefit hugely, with faster training times and more capable algorithms. And researchers have also demonstrated that quantum algorithms have the potential to crack traditional cryptography keys, which for now are too mathematically difficult for classical computers to break.

To create qubits, which are the building blocks of quantum computers, scientists have to find and manipulate the smallest particles of nature: tiny parts of the universe that can be harnessed through different mediums. This is why there are currently many types of quantum processors being developed by a range of companies.

One of the most advanced approaches consists of using superconducting qubits, which are made of electrons, and come in the form of the familiar chandelier-like quantum computers. Both IBM and Google have developed superconducting processors.

Another approach that is gaining momentum is trapped ions, which Honeywell and IonQ are leading the way on, and in which qubits are housed in arrays of ions that are trapped in electric fields and then controlled with lasers.

Companies like Xanadu and PsiQuantum, for their part, are investing in yet another method that relies on quantum particles of light, called photons, to encode data and create qubits. Qubits can also be created from silicon spin qubits (an approach Intel is focusing on), as well as from cold atoms or even diamonds.

Quantum annealing, the approach chosen by D-Wave, is a different category of computing altogether: it doesn't rely on the same paradigm as other quantum processors, known as the gate model. Quantum annealing processors are much easier to control and operate, which is why D-Wave has already developed devices that can manipulate thousands of qubits, while virtually every other quantum hardware company is working with about 100 qubits or fewer. On the other hand, the annealing approach is only suitable for a specific set of optimization problems, which limits its capabilities.

What can you do with a quantum computer today?

Right now, with a mere 100 qubits being the state of the art, there is very little that can actually be done with quantum computers. For qubits to start carrying out meaningful calculations, they will have to be counted in the thousands, and even millions.

"While there is a tremendous amount of promise and excitement about what quantum computers can do one day, I think what they can do today is relatively underwhelming," says Buchholz.

Increasing the qubit count in gate-model processors, however, is incredibly challenging. This is because keeping the particles that make up qubits in their quantum state is difficult: a little bit like trying to keep a coin spinning without falling on one side or the other, except much harder.

Keeping qubits spinning requires isolating them from any environmental disturbance that might cause them to lose their quantum state. Google and IBM, for example, do this by placing their superconducting processors in temperatures that are colder than outer space, which in turn require sophisticated cryogenic technologies that are currently near-impossible to scale up.

In addition, the instability of qubits means that they are unreliable, and still likely to cause computation errors. This has given rise to a branch of quantum computing dedicated to developing error-correction methods.

Although research is advancing at pace, therefore, quantum computers are for now stuck in what is known as the NISQ era: noisy, intermediate-scale quantum computing. The end goal, however, is to build a fault-tolerant, universal quantum computer.

As Buchholz explains, it is hard to tell when this is likely to happen. "I would guess we are a handful of years from production use cases, but the real challenge is that this is a little like trying to predict research breakthroughs," he says. "It's hard to put a timeline on genius."

In 2019, Google claimed that its 54-qubit superconducting processor called Sycamore had achieved quantum supremacy: the point at which a quantum computer can solve a computational task that is impossible to run on a classical device in any realistic amount of time.

Google said that Sycamore had calculated, in only 200 seconds, the answer to a problem that would have taken the world's biggest supercomputers 10,000 years to complete.

More recently, researchers from the University of Science and Technology of China claimed a similar breakthrough, saying that their quantum processor had taken 200 seconds to achieve a task that would have taken 600 million years to complete with classical devices.

This is far from saying that either of those quantum computers is now capable of outstripping any classical computer at any task. In both cases, the devices were programmed to run very specific problems, with little usefulness aside from proving that they could compute the task significantly faster than classical systems.

Without a higher qubit count and better error correction, proving quantum supremacy for useful problems is still some way off.

Organizations that are investing in quantum resources see this as the preparation stage: their scientists are doing the groundwork to be ready for the day that a universal and fault-tolerant quantum computer is ready.

In practice, this means that they are trying to discover the quantum algorithms that are most likely to show an advantage over classical algorithms once they can be run on large-scale quantum systems. To do so, researchers typically try to prove that quantum algorithms perform comparably to classical ones on very small use cases, and theorize that as quantum hardware improves, and the size of the problem can be grown, the quantum approach will inevitably show some significant speed-ups.

For example, scientists at Japanese steel manufacturer Nippon Steel recently came up with a quantum optimization algorithm that could compete against its classical counterpart for a small problem that was run on a 10-qubit quantum computer. In principle, this means that the same algorithm equipped with thousands or millions of error-corrected qubits could eventually optimize the company's entire supply chain, complete with the management of dozens of raw materials, processes and tight deadlines, generating huge cost savings.

The work that quantum scientists are carrying out for businesses is therefore highly experimental, and so far there are fewer than 100 quantum algorithms that have been shown to compete against their classical equivalents which only points to how emergent the field still is.

With most use cases requiring a fully error-corrected quantum computer, just who will deliver one first is the question on everyone's lips in the quantum industry, and it is impossible to know the exact answer.

All quantum hardware companies are keen to stress that their approach will be the first one to crack the quantum revolution, making it even harder to discern noise from reality. "The challenge at the moment is that it's like looking at a group of toddlers in a playground and trying to figure out which one of them is going to win the Nobel Prize," says Buchholz.

"I have seen the smartest people in the field say they're not really sure which one of these is the right answer. There are more than half a dozen different competing technologies and it's still not clear which one will wind up being the best, or if there will be a best one," he continues.

In general, experts agree that the technology will not reach its full potential until after 2030. The next five years, however, may start bringing some early use cases as error correction improves and qubit counts start reaching numbers that allow for small problems to be programmed.

IBM is one of the rare companies that has committed to a specific quantum roadmap, which defines the ultimate objective of realizing a million-qubit quantum computer. In the nearer term, Big Blue anticipates that it will release a 1,121-qubit system in 2023, which might mark the start of the first experimentations with real-world use cases.

Developing quantum hardware is a huge part of the challenge, and arguably the most significant bottleneck in the ecosystem. But even a universal fault-tolerant quantum computer would be of little use without the matching quantum software.

"Of course, none of these online facilities are much use without knowing how to 'speak' quantum," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet.

Creating quantum algorithms is not as easy as taking a classical algorithm and adapting it to the quantum world. Quantum computing, rather, requires a brand-new programming paradigm that can only be run on a brand-new software stack.

Of course, some hardware providers also develop software tools, the most established of which is IBM's open-source quantum software development kit Qiskit. But on top of that, the quantum ecosystem is expanding to include companies dedicated exclusively to creating quantum software. Familiar names include Zapata, QC Ware or 1QBit, which all specialize in providing businesses with the tools to understand the language of quantum.
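
As a flavor of what programming against such a stack looks like, here is a hedged sketch of the entanglement experiment described earlier, written with Qiskit's circuit API; it assumes the qiskit and qiskit-aer packages are installed and is an illustration, not code drawn from the article:

    # Entangling two qubits into a Bell state
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # put qubit 0 into superposition
    qc.cx(0, 1)                  # CNOT ties qubit 1 to qubit 0: entanglement
    qc.measure([0, 1], [0, 1])

    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)                # only '00' and '11': the outcomes are linked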

And increasingly, promising partnerships are forming to bring together different parts of the ecosystem. For example, the recent alliance between Honeywell, which is building trapped-ion quantum computers, and quantum software company Cambridge Quantum Computing (CQC) has got analysts predicting that a new player could be taking a lead in the quantum race.

The complexity of building a quantum computer (think ultra-high vacuum chambers, cryogenic control systems, and other exotic quantum instruments) means that the vast majority of quantum systems are currently firmly sitting in lab environments, rather than being sent out to customers' data centers.

To let users access the devices to start running their experiments, therefore, quantum companies have launched commercial quantum computing cloud services, making the technology accessible to a wider range of customers.

The four largest providers of public cloud computing services currently offer access to quantum computers on their platforms. IBM and Google have both put their own quantum processors on the cloud, while Microsoft's Azure Quantum and AWS's Braket service let customers access computers from third-party quantum hardware providers.

The jury remains out on which technology will win the race, if any at all, but one thing is for certain: the quantum computing industry is developing fast, and investors are generously funding the ecosystem. Equity investments in quantum computing nearly tripled in 2020, and according to BCG, they are set to rise even more in 2021 to reach $800 million.

Government investment is even more significant: the US has unlocked $1.2 billion for quantum information science over the next five years, while the EU announced a €1 billion ($1.20 billion) quantum flagship. The UK also recently reached the £1 billion ($1.37 billion) budget milestone for quantum technologies, and while official numbers are not known in China, the government has made no secret of its desire to aggressively compete in the quantum race.

This has caused the quantum ecosystem to flourish over the past years, with new start-ups increasing from a handful in 2013 to nearly 200 in 2020. The appeal of quantum computing is also increasing among potential customers: according to analysis firm Gartner, while only 1% of companies were budgeting for quantum in 2018, 20% are expected to do so by 2023.

Although not all businesses need to be preparing themselves to keep up with quantum-ready competitors, there are some industries where quantum algorithms are expected to generate huge value, and where leading companies are already getting ready.

Goldman Sachs and JP Morgan are two examples of financial behemoths investing in quantum computing. That's because in banking, quantum optimization algorithms could give a boost to portfolio optimization, by better picking which stocks to buy and sell for maximum return.

In pharmaceuticals, where the drug discovery process is on average a $2 billion, ten-year-long undertaking that largely relies on trial and error, quantum simulation algorithms are also expected to make waves. This is also the case in materials science: companies like OTI Lumionics, for example, are exploring the use of quantum computers to design more efficient OLED displays.

Leading automotive companies including Volkswagen and BMW are also keeping a close eye on the technology, which could impact the sector in various ways, ranging from designing more efficient batteries to optimizing the supply chain, through to better management of traffic and mobility. Volkswagen, for example, pioneered the use of a quantum algorithm that optimized bus routes in real time by dodging traffic bottlenecks.

As the technology matures, however, it is unlikely that quantum computing will be limited to a select few. Rather, analysts anticipate that virtually all industries have the potential to benefit from the computational speedup that qubits will unlock.

Quantum computers are expected to be phenomenal at solving a certain class of problems, but that doesn't mean that they will be a better tool than classical computers for every single application. Particularly, quantum systems aren't a good fit for fundamental computations like arithmetic, or for executing commands.

"Quantum computers are great constraint optimizers, but that's not what you need to run Microsoft Excel or Office," says Buchholz. "That's what classical technology is for: for doing lots of maths, calculations and sequential operations."

In other words, there will always be a place for the way that we compute today. It is unlikely, for example, that you will be streaming a Netflix series on a quantum computer anytime soon. Rather, the two technologies will be used in conjunction, with quantum computers being called for only where they can dramatically accelerate a specific calculation.

Buchholz predicts that, as classical and quantum computing start working alongside each other, access will look like a configuration option. Data scientists currently have a choice of using CPUs or GPUs when running their workloads, and it might be that quantum processing units (QPUs) join the list at some point. It will be up to researchers to decide which configuration to choose, based on the nature of their computation.

Although the precise way that users will access quantum computing in the future remains to be defined, one thing is certain: they are unlikely to be required to understand the fundamental laws of quantum computing in order to use the technology.

"People get confused because the way we lead into quantum computing is by talking about technical details," says Buchholz. "But you don't need to understand how your cellphone works to use it."

"People sometimes forget that when you log into a server somewhere, you have no idea what physical location the server is in or even if it exists physically at all anymore. The important question really becomes what it is going to look like to access it."

And as fascinating as qubits, superposition, entanglement and other quantum phenomena might be, for most of us this will come as welcome news.


Will the NSA Finally Build Its Superconducting Spy Computer? – IEEE Spectrum

Today, silicon microchips underlie every aspect of digital computing. But their dominance was never a foregone conclusion. Throughout the 1950s, electrical engineers and other researchers explored many alternatives to making digital computers.

One of them seized the imagination of the U.S. National Security Agency (NSA): a superconducting supercomputer. Such a machine would take advantage of superconducting materials that, when chilled to nearly the temperature of deep space (just a few degrees above absolute zero), exhibit no electrical resistance whatsoever. This extraordinary property held the promise of computers that could crunch numbers and crack codes faster than transistor-based systems while consuming far less power.

For six decades, from the mid-1950s to the present, the NSA has repeatedly pursued this dream, in partnership with industrial and academic researchers. Time and again, the agency sponsored significant projects to build a superconducting computer. Each time, the effort was abandoned in the face of the unrelenting pace of Moore's Law and the astonishing increase in performance and decrease in cost of silicon microchips.

Now Moore's Law is stuttering, and the world's supercomputer builders are confronting an energy crisis. Nuclear weapon simulators, cryptographers, and others want exascale supercomputers, capable of 1,000 petaflops (1 million trillion floating-point operations per second) or greater. The world's fastest known supercomputer today, China's 34-petaflop Tianhe-2, consumes some 18 megawatts of power. That's roughly the amount of electricity drawn instantaneously by 14,000 average U.S. households. Projections vary depending on the type of computer architecture used, but an exascale machine built with today's best silicon microchips could require hundreds of megawatts.
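
A back-of-the-envelope check, using only the figures quoted above, shows where the "hundreds of megawatts" worry comes from:

    # Exascale power draw extrapolated from Tianhe-2's efficiency
    tianhe2_petaflops = 34.0
    tianhe2_megawatts = 18.0
    mw_per_petaflop = tianhe2_megawatts / tianhe2_petaflops  # ~0.53 MW/petaflop
    exascale_mw = 1000 * mw_per_petaflop   # exascale = 1,000 petaflops
    print(round(exascale_mw), "MW")        # ~529 MW, i.e. hundreds of megawatts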

The exascale push may be superconducting computing's opening. And the Intelligence Advanced Research Projects Activity, the U.S. intelligence community's arm for high-risk R&D, is making the most of it. With new forms of superconducting logic and memory in development, IARPA has launched an ambitious program to create the fundamental building blocks of a superconducting supercomputer. In the next few years, the effort could finally show whether the technology really can beat silicon when given the chance.

Cold Calling: In the 1950s, Dudley Buck envisioned speedy, energy-efficient computers. These would be driven by his superconducting switch, the cryotron. Photo: Gjon Mili/The LIFE Picture Collection/Getty Images

The NSA's dream of superconducting supercomputers was first inspired by the electrical engineer Dudley Buck. Buck worked for the agency's immediate predecessor on an early digital computer. When he moved to MIT in 1950, he remained a military consultant, keeping the Armed Forces Security Agency, which quickly became the NSA, abreast of new computing developments in Cambridge.

Buck soon reported on his own work: a novel superconducting switch he named the cryotron. The device works by switching a material between its superconducting state, where electrons couple up and flow as a supercurrent with no resistance at all, and its normal state, where electrons flow with some resistance. A number of superconducting metallic elements and alloys reach that state when they are cooled below a critical temperature near absolute zero. Once the material becomes superconducting, a sufficiently strong magnetic field can drive the material back to its normal state.

In this, Buck saw a digital switch. He coiled a tiny control wire around a gate wire, and plunged the pair into liquid helium. When current ran through the control, the magnetic field it created pushed the superconducting gate into its normal resistive state. When the control current was turned off, the gate became superconducting again.

Buck thought miniature cryotrons could be used to fashion powerful, fast, and energy-efficient digital computers. The NSA funded work by him and engineer Albert Slade on cryotron memory circuits at the firm A.D. Little, as well as a broader project on digital cryotron circuitry at IBM. Quickly, GE, RCA, and others launched their own cryotron efforts.

Engineers continued developing cryotron circuits into the early 1960s, despite Buck's sudden and premature death in 1959. But liquid-helium temperatures made cryotrons challenging to work with, and the time required for materials to transition from a superconducting to a resistive state limited switching speeds. The NSA eventually pulled back on funding, and many researchers abandoned superconducting electronics for silicon.

Even as these efforts faded, a big change was under way. In 1962 British physicist Brian Josephson made a provocative prediction about quantum tunneling in superconductors. In typical quantum-mechanical tunneling, electrons sneak across an insulating barrier, assisted by a voltage push; the electrons' progress occurs with some resistance. But Josephson predicted that if the insulating barrier between two superconductors is thin enough, a supercurrent of paired electrons could flow across with zero resistance, as if the barrier were not there at all. This became known as the Josephson effect, and a switch based on the effect, the Josephson junction, soon followed.

Junction Exploration: 1970s-era Josephson circuitry. Image: IBM

IBM researchers developed a version of this switch in the mid-1960s. The active part of the device was a line of superconducting metal, interrupted by a thin oxide barrier cutting across it. A supercurrent would freely tunnel across the barrier, but only up to a point; if the current rose above a certain threshold, the device would saturate and unpaired electrons would trickle across the junction with some resistance. The threshold could be tuned by a magnetic field, created by running current through a nearby superconducting control line. If the device operated close to the threshold current, a small current in the control could shift the threshold and switch the gate out of its supercurrent-tunneling state. Unlike in Buck's cryotron, the materials in this device always remained superconducting, making it a much faster electronic switch.

As explored by historian Cyrus Mody, by 1973 IBM was working on building a superconducting supercomputer based on Josephson junctions. The basic building block of its circuits was a superconducting loop with Josephson junctions in it, known as a superconducting quantum interference device, or SQUID. The NSA covered a substantial fraction of the costs, and IBM expected the agency to be its first superconducting-supercomputer customer, with other government and industry buyers to follow.

IBM's superconducting supercomputer program ran for more than 10 years, at a cost of about US $250 million in today's dollars. It mainly pursued Josephson junctions made from lead alloy and lead oxide. Late in the project, engineers switched to a niobium oxide barrier, sandwiched between a lead alloy and a niobium film, an arrangement that produced more-reliable devices. But while the project made great strides, company executives were not convinced that an eventual supercomputer based on the technology could compete with the ones expected to emerge with advanced silicon microchips. In 1983, IBM shut down the program without ever finishing a Josephson-junction-based computer, super or otherwise.

Japan persisted where IBM had not. Inspired by IBM's project, Japan's industrial ministry, MITI, launched a superconducting computer effort in 1981. The research partnership, which included Fujitsu, Hitachi, and NEC, lasted for eight years and produced an actual working Josephson-junction computer: the ETL-JC1. It was a tiny, 4-bit machine, with just 1,000 bits of RAM, but it could actually run a program. In the end, however, MITI came to share IBM's opinion about the prospect of scaling up the technology, and the project was abandoned.

Critical new developments emerged outside these larger superconducting-computer programs. In 1983, Bell Telephone Laboratories researchers formed Josephson junctions out of niobium separated by thin aluminum oxide layers. The new superconducting switches were extraordinarily reliable and could be fabricated using a simplified patterning process much in the same way silicon microchips were.

On The Move: Magnetic flux ejected from a superconducting loop through a Josephson junction can take the form of tiny voltage pulses. The presence or absence of a pulse in a given period of time can be used to perform computations. Image: Hypres

Then in 1985, researchers at Moscow State University proposed [PDF] a new kind of digital superconducting logic. Originally dubbed resistive, then renamed rapid single-flux-quantum logic, or RSFQ, it took advantage of the fact that a Josephson junction in a loop of superconducting material can emit minuscule voltage pulses. Integrated over time, they take on only a quantized, integer multiple of a tiny value called the flux quantum, measured in microvolt-picoseconds.

By using such ephemeral voltage pulses, each lasting a picosecond or so, RSFQ promised to boost clock speeds to greater than 100 gigahertz. What's more, a Josephson junction in such a configuration would expend energy in the range of just a millionth of a picojoule, considerably less than that consumed by today's silicon transistors.
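
The size of that quantum is fixed by fundamental constants. As a quick check (the relation Φ0 = h/2e is textbook superconductivity, not something stated in this article), a few lines of Python show why the pulses are measured in microvolt-picoseconds:

    # Magnetic flux quantum: phi0 = h / (2e), in SI units
    h = 6.62607015e-34       # Planck constant, J*s
    e = 1.602176634e-19      # elementary charge, C
    phi0 = h / (2 * e)       # ~2.068e-15 volt-seconds (webers)
    print(phi0 / 1e-18)      # ~2068 microvolt-picoseconds per pulse (~2 mV*ps)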

Together, Bell Labs' Josephson junctions and Moscow State University's RSFQ rekindled interest in superconducting electronics. By 1997, the U.S. had launched the Hybrid Technology Multi-Threaded (HTMT) project, which was supported by the National Science Foundation, the NSA, and other agencies. HTMT's goal was to beat conventional silicon to petaflop-level supercomputing, using RSFQ integrated circuits among other technologies.

It was an ambitious program that faced a number of challenges. The RSFQ circuits themselves limited potential computing efficiency. To achieve tremendous speed, RSFQ used resistors to provide electrical biases to the Josephson junctions in order to keep them close to the switching threshold. In experimental RSFQ circuitry with several thousand biased Josephson junctions, the static power dissipation was negligible. But in a petaflop-scale supercomputer, with possibly many billions of such devices, it would have added up to significant power consumption.

The HTMT project ended in 2000. Eight years later, a conventional silicon supercomputer, IBM's Roadrunner, was touted as the first to reach petaflop operation. It contained nearly 20,000 silicon microprocessors and consumed 2.3 megawatts.

For many researchers working on superconducting electronics, the period around 2000 marked a shift to an entirely different direction: quantum computing. This new direction was inspired by the 1994 work of mathematician Peter Shor, then at Bell Labs, which suggested that a quantum computer could be a powerful cryptanalytical tool, able to rapidly decipher encrypted communications. Soon, projects in superconducting quantum computing and superconducting digital circuitry were being sponsored by the NSA and the U.S. Defense Advanced Research Projects Agency. They were later joined by IARPA, which was created in 2006 by the Office of the Director of National Intelligence to sponsor intelligence-related R&D programs, collaborating across a community that includes the NSA, the Central Intelligence Agency, and the National Geospatial-Intelligence Agency.

Single-Flux Quantum: Current in a superconducting loop containing a Josephson junction (a nonsuperconducting barrier) generates a magnetic field with a tiny, quantized value.

Nobody knew how to build a quantum computer, of course, but lots of people had ideas. At IBM and elsewhere, engineers and scientists turned to the mainstays of superconducting electronics, SQUIDs and Josephson junctions, to craft the building blocks. A SQUID exhibits quantum effects under normal operation, and it was fairly straightforward to configure it to operate as a quantum bit, or qubit.

One of the centers of this work was the NSA's Laboratory for Physical Sciences. Built near the University of Maryland, College Park, outside the fence of NSA headquarters in Fort Meade, the laboratory is a space where the NSA and outside researchers can collaborate on work relevant to the agency's insatiable thirst for computing power.

In the early 2010s, Marc Manheimer was head of quantum computing at the laboratory. As he recently recalled in an interview, he saw an acute need for conventional digital circuits that could physically surround quantum bits in order to control them and correct errors on very short timescales. The easiest way to do this, he thought, would be with superconducting computer elements, which could operate with voltage and current levels that were similar to those of the qubit circuitry they would be controlling. Optical links could be used to connect this cooled-down, hybrid system to the outside worldand to conventional silicon computers.

At the same time, Manheimer says, he became aware of the growing power problem in high-performance silicon computing, for supercomputers as well as the large banks of servers in commercial data centers. "The closer I looked at superconducting logic," he says, "the more that it became clear that it had value for supercomputing in its own right."

Manheimer proposed a new direct attack on the superconducting supercomputer. Initially, he encountered skepticism. "There's this history of failure," he says. Past pursuers of superconducting supercomputers had gotten burned, so people were very cautious. But by early 2013, he says, he had convinced IARPA to fund a multisite industrial and academic R&D program, dubbed the Cryogenic Computing Complexity (C3) program. He moved to IARPA to lead it.

The first phase of C3 (its budget is not public) calls for the creation and evaluation of superconducting logic circuits and memory systems. These will be fabricated at MIT Lincoln Laboratory, the same lab where Dudley Buck once worked.

Manheimer says one thing that helped sell his C3 idea was recent progress in the field, which is reflected in IARPA's selection of performers, publicly disclosed in December 2014.

One of those teams is led by the defense giant Northrop Grumman Corp. The company participated in the late-1990s HTMT project, which employed fairly power-hungry RSFQ logic. In 2011, Northrop Grumman's Quentin Herr reported an exciting alternative, a different form of single-flux quantum logic called reciprocal quantum logic. RQL replaces RSFQ's DC resistors with AC inductors, which bias the circuit without constantly drawing power. An RQL circuit, says Northrop Grumman team leader Marc Sherwin, consumes 1/100,000 the power of the best equivalent CMOS circuit and far less power than the equivalent RSFQ circuit.

A similarly energy-efficient logic called ERSFQ has been developed by superconducting electronics manufacturer Hypres, whose CTO, Oleg Mukhanov, is the coinventor of RSFQ. Hypres is working with IBM, which continued its fundamental superconducting device work even after canceling its Josephson-junction supercomputer project and was also chosen to work on logic for the program.

Hypres is also collaborating with a C3 team led by a Raytheon BBN Technologies laboratory that has been active in quantum computing research for several years. There, physicist Thomas Ohki and colleagues have been working on a cryogenic memory system that uses low-power superconducting logic to control, read, and write to high-density, low-power magnetoresistive RAM. This sort of memory is another change for superconducting computing. RSFQ memory cells were fairly large. Today's more compact nanomagnetic memories, originally developed to help extend Moore's Law, can also work well at low temperatures.

The world's most advanced superconducting circuitry uses devices based on niobium. Although such devices operate at temperatures of about 4 kelvins, or 4 degrees above absolute zero, Manheimer says supplying the refrigeration is now a trivial matter. That's thanks in large part to the multibillion-dollar industry based on magnetic resonance imaging machines, which rely on superconducting electromagnets and high-quality cryogenic refrigerators.

One big question has been how much the energy needed for cooling will increase a superconducting computer's energy budget. But advocates suggest it might not be much. The power drawn by commercial cryocoolers "leaves considerable room for improvement," Elie Track and Alan Kadin of the IEEE's Rebooting Computing initiative recently wrote. Even so, they say, the power dissipated in a superconducting computer is so small that it "remains 100 times more efficient than a comparable silicon computer, even after taking into account the present inefficient cryocooler."

For now, C3's focus is on the fundamental components. This first phase, which will run through 2017, aims to demonstrate core components of a computer system: a set of key 64-bit logic circuits capable of running at a 10-GHz clock rate, and cryogenic memory arrays with capacities up to about 250 megabytes. If this effort is successful, a second, two-year phase will integrate these components into a working cryogenic computer of as-yet-unspecified size. If that prototype is deemed promising, Manheimer estimates it should be possible to create a true superconducting supercomputer in another 5 to 10 years.

Go For Power: Performance demands power. Today's most powerful supercomputers consume multiple megawatts (red), not including cooling. Superconducting computers, cryocoolers included, are projected to dramatically drop those power requirements (blue). Source: IEEE Transactions on Applied Superconductivity, vol. 23, #1701610; Marc Manheimer

Such a system would be much smaller than CMOS-based supercomputers and require far less power. Manheimer projects that a superconducting supercomputer produced in a follow-up to C3 could run at 100 petaflops and consume 200 kilowatts, including the cryocooling. It would be 1/20 the size of Titan, currently the fastest supercomputer in the United States, but deliver more than five times the performance for 1/40 of the power.

A supercomputer with those capabilities would obviously represent a big jump. But as before, the fate of superconducting supercomputing strongly depends on what happens with silicon. While an exascale computer made from today's silicon chips may not be practical, great effort and billions of dollars are now being expended on continuing to shrink silicon transistors as well as on developing on-chip optical links and 3-D stacking. Such technologies could make a big difference, says Thomas Theis, who directs nanoelectronics research at the nonprofit Semiconductor Research Corp. In July 2015, President Barack Obama announced the National Strategic Computing Initiative and called for the creation of an exascale supercomputer. IARPA's work on alternatives to silicon is part of this initiative, but so is conventional silicon. The mid-2020s has been targeted for the first silicon-based exascale machine. If that goal is met, the arrival of a superconducting supercomputer would likely be pushed out still further.

But it's too early to count out superconducting computing just yet. Compared with the massive, continuous investment in silicon over the decades, superconducting computing has had meager and intermittent support. Yet even with this subsistence diet, physicists and engineers have produced an impressive string of advances. The support of the C3 program, along with the wider attention of the computing community, could push the technology forward significantly. If all goes well, superconducting computers might finally come in from the cold.

This article appears in the March 2016 print issue as "The NSA's Frozen Dream."

A historian of science and technology, David C. Brock recently became director of the Center for Software History at the Computer History Museum. A few years back, while looking into the history of microcircuitry, he stumbled across the work of Dudley Buck, a pioneer of speedy cryogenic logic. He wrote about Buck in our April 2014 issue. Here he explores what happened after Buck, including a new effort to build a superconducting computer. This time, he says, the draw is energy efficiency, not performance.


Is Bitcoin (BTC) Safe from Grover’s Algorithm? – Yahoo Finance

When crypto investors discuss quantum computing, they invariably worry about its potential to undermine encryption. Quantum computers alone do not pose such a mortal threat, however. It's their capacity to exploit Shor's algorithm that makes them formidable.

That's because Shor's algorithm can efficiently factor the large numbers whose prime factorization is the security behind asymmetric encryption.

Another quantum algorithm can potentially undermine the blockchain as well. Grover's algorithm helps facilitate quantum search capabilities, enabling users to quickly find values among billions of unstructured data points at once.

Unlike Shor's algorithm, Grover's algorithm is more of a threat to cryptographic hashing than to encryption. When cryptographic hashes are compromised, both blockchain integrity and block mining suffer.

Collision Attacks

One-way hash functions help to make a blockchain cryptographically secure. Classical computers cannot easily reverse-engineer them. They would have to find the correct arbitrary input that maps to a specific hash value.

Using Grover's algorithm, a quantum attacker could hypothetically find two inputs that produce the same hash value. This phenomenon is known as a hash collision.
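
To make the idea concrete, here is a toy classical collision search against a deliberately truncated SHA-256. This is an ordinary birthday-style search, not Grover's algorithm, and on a full 256-bit hash it would be hopeless; that exponential wall is exactly what a quantum search would erode:

    # Toy collision hunt on a truncated hash (illustration only)
    import hashlib

    def find_collision(truncate_to=3):      # 3 bytes = a 24-bit toy hash
        seen = {}
        i = 0
        while True:
            h = hashlib.sha256(str(i).encode()).digest()[:truncate_to]
            if h in seen:
                return seen[h], i           # two inputs, one truncated hash
            seen[h] = i
            i += 1

    print(find_collision())                 # succeeds after ~2^12 attempts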

By solving this search, a blockchain attacker could surreptitiously replace a valid block with a falsified one. That's because, in a Proof-of-Work system, the current block's hash can verify the authenticity of all past blocks.

This kind of attack remains a distant threat, however. Indeed, achieving a cryptographic collision is far more challenging than breaking asymmetric encryption.

Mining Threats

A somewhat easier attack to pull off using Grover's algorithm involves Proof-of-Work mining.

Using Grover's search algorithm, a quantum miner can mine at a much faster rate than a traditional miner. This miner could generate as much Proof-of-Work as the rest of the network combined. Consequently, the attacker could effectively take over the blockchain and force consensus on any block they selected.

A quantum miner might also use Grover's search algorithm to help facilitate the guessing of a nonce. The nonce is the number that blockchain miners are solving for in order to receive cryptocurrency. That's because Grover's algorithm provides a quadratic speedup over a classical computer (for now, ASIC-based mining remains considerably faster).
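
In code, the nonce search is nothing more than a brute-force loop over hash outputs. The sketch below is a simplified stand-in for real mining (Bitcoin actually uses double SHA-256, and real difficulty targets are far harsher); the sequential guessing in the while loop is the step a Grover-equipped miner could complete in quadratically fewer tries:

    # Simplified Proof-of-Work: find a nonce whose hash falls below a target
    import hashlib

    def mine(header: bytes, difficulty_bits: int = 16) -> int:
        target = 2 ** (256 - difficulty_bits)
        nonce = 0
        while True:   # the brute-force loop a Grover miner would shortcut
            digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce                # ~2^16 guesses on average here
            nonce += 1

    print(mine(b"toy block header"))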

How fast is a quadratic speedup? Roughly stated, if a classical computer can solve a complex problem in time T, Grover's algorithm will be able to solve the problem in the square root of that time (√T).
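
Plugging in an illustrative search space of 2^32 possibilities (a number chosen for this example, not taken from the article) shows how large the gap is:

    # Quadratic speedup in concrete numbers
    import math

    N = 2 ** 32
    classical_guesses = N / 2                        # expected brute-force tries
    grover_calls = (math.pi / 4) * math.sqrt(N)      # Grover oracle iterations
    print(f"{classical_guesses:.1e} vs {grover_calls:.1e}")  # 2.1e+09 vs 5.1e+04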

Thus, any miner who can solve the nonce faster than other miners will be able to mine the blockchain faster as well.

Grover's algorithm could also be used to speed up the generation of nonces. This capability would allow an attacker to quickly reconstruct the chain from a previously modified block, faster than the true chain grows. In the end, a savvy attacker could substitute this reconstructed chain for the true chain.

Grover's algorithm may ultimately help make Proof-of-Work obsolete. That's because there is no possible PoW system that is not susceptible to Grover speed-up. In the end, quantum actors will always have an advantage over classical ones in PoW-based blockchains, allowing them either to mine more effectively or to instigate an attack.

Proof-of-Work Weaknesses

As bitcoin matures, the weaknesses inherent in PoW become ever more evident. Miners are pitted against each other as if in a never-ending arms race. This arms race is incentivized by the ability of larger mining pools to achieve economies of scale, a cost advantage that quickly erodes the capacity of individual miners to survive.

Of course, Proof-of-Stake is not without flaws. For instance, critics assert that it favors larger stakeholders (hence the claim that it enables the rich to get richer). These critics neglect to note that PoW is amenable to the same strategy (albeit with miners).

As this arms race comes to a head, any miner with the resources to do so will use quantum computing to achieve a competitive advantage. Combined with Grover's algorithm, a quantum-based miner would outperform other miners (most likely, small- and medium-sized ones).

With access to quadratic speedup, any PoW coin will inevitably fall under the control of mega-cap institutions and governments. If so, regular investors and mid to large-cap enterprises risk getting priced out of the market. In particular, their devices will be either too expensive or prone to excessive regulation (much the same way that PGP encryption once was).

Summary

Shor's algorithm undoubtedly poses the most immediate threat to bitcoin (namely, the potential to break ECDSA, its digital signature algorithm). Grover's algorithm is a distant second in this respect.

Grover's algorithm may someday pose a formidable challenge to PoW mining, however. And it could conceivably threaten cryptographic hashing as well. Any algorithm powerful enough to reverse-engineer hash values would invariably undermine PoW itself.

Quantum Resistant Ledger (QRL) will ultimately offer protection against both.

For instance, a quantum-safe digital signature scheme named XMSS safeguards the coin from Shor's algorithm.

Likewise, the QRL team will rely on Proof-of-Stake to head off mining-based attacks using Grover's search algorithm.

As you can see, the QRL team is thoroughly preparing for a post-quantum future. Their mission is an increasingly urgent one, as quantum computing continues to advance by leaps and bounds.

2021 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

View original post here:
Is Bitcoin (BTC) Safe from Grover's Algorithm? - Yahoo Finance

IBM’s newest quantum computer is now up-and-running: Here’s what it’s going to be used for – ZDNet

IBM has unveiled a brand-new quantum computer in Japan, thousands of miles away from the company's quantum computation center in Poughkeepsie, New York, in another step towards bringing quantum technologies out of Big Blue's labs and directly to partners around the world.

A Quantum System One, IBM's flagship integrated superconducting quantum computer, is now available on-premises in the Kawasaki Business Incubation Center in Kawasaki City, for Japanese researchers to run their quantum experiments in fields ranging from chemistry to finance.

Most customers to date can only access IBM's System One over the cloud, by connecting to the company's quantum computation center in Poughkeepsie.

Recently, the company unveiled the very first quantum computer physically built outside of the computation center's data centers, when the Fraunhofer Institute in Germany acquired a System One. The system now deployed to Japan is therefore IBM's second quantum computer located outside of the US.

The announcement comes as part of a long-standing relationship with Japanese organizations. In 2019, IBM and the University of Tokyo inaugurated the Japan-IBM Quantum Partnership, a national agreement inviting universities and businesses across the country to engage in quantum research. It was agreed then that a Quantum System One would eventually be installed at an IBM facility in Japan.

Building on the partnership, Big Blue and the University of Tokyo launched the Quantum Innovation Initiative Consortium last year to further bring together organizations working in the field of quantum. With this, the Japanese government has made it clear that it is keen to be at the forefront of the promising developments that quantum technologies are expected to bring about.

Leveraging physical properties that are specific to quantum mechanics, quantum computers could one day be capable of carrying out calculations that are impossible to run on the devices used today, known as classical computers.

In some industries, this could have big implications; and as part of the consortium, together with IBM researchers, some Japanese companies have already identified promising use cases. Mitsubishi Chemical's research team, for example, has developed quantum algorithms capable of understanding the complex behavior of industrial chemical compounds with the goal of improving OLED displays.

A recent research paper published by the scientists highlighted the potential of quantum computers when it comes to predicting the properties of OLED materials, which could eventually lead to more efficient displays with lower power consumption.

Similarly, researchers from Mizuho Financial Group and Mitsubishi Financial Group have been developing quantum algorithms that could speed up financial operations like Monte Carlo simulations, which could allow for optimized portfolio management thanks to better risk analysis and option pricing.

With access to IBM's Quantum System One, research in those fields is now expected to accelerate. Other industry leaders exploring quantum technologies as part of the partnership range from Sony and Toyota to Hitachi, Toshiba and JSR.

Quantum computing is still in its very early stages, and it is not yet possible to use quantum computers to perform computations that are of any value to a business. Rather, scientists are currently carrying out proofs-of-concept, by attempting to identify promising applications and testing them at a very small scale, to be prepared for the moment that the hardware is fully ready.

This is still some way off. Building and controlling the components of quantum computers is a huge challenge, which has so far been limited to the confines of specialist laboratories such as IBM's Poughkeepsie computation center.

It is significant, therefore, that IBM's Quantum System One is now mature enough to be deployed outside of the company's lab.

"Thousands of meticulously engineered components have to work together flawlessly in extreme temperatures within astonishing tolerances," said IBM in a blog post.

Back in the US, too, quantum customers are showing interest in building quantum hardware in their own facilities. The Cleveland Clinic, for example, recently invested $500 million for Big Blue to build quantum hardware on-premises.

Read the original post:
IBM's newest quantum computer is now up-and-running: Here's what it's going to be used for - ZDNet

URI to host international experts for conference on future of quantum computing – URI Today

KINGSTON, R.I. - July 22, 2021 - The University of Rhode Island will host more than a dozen international experts in the growing field of quantum information science in October for the inaugural Frontiers in Quantum Computing conference, celebrating the launch of URI's new master's degree program in quantum computing.

The conference, which will take place Oct. 18-20 on URI's picturesque Kingston Campus amid fall foliage, will feature daily talks and a panel focusing on the future of quantum computing, current developments and research, and educational initiatives for future leaders in the field.

"We are bringing key quantum information science leaders from all sectors (government, academia and industry) to URI to present the latest developments and frontiers in the theoretical and experimental aspects of quantum computing," said Vanita Srinivasa, assistant professor of physics and director of URI's new Quantum Information Science Program. "This is not only the inaugural event for our new degree. This will mark the first major conference on quantum computing for the region, and we want to establish Rhode Island as a center of quantum information science."

"URI's new Master of Science in Quantum Computing is a meaningful step not only for our region's quantum workforce development, but also for the U.S. and global quantum ecosystem," said Christopher Savoie '92, founder and chief executive officer of Zapata Computing, who is a member of the conference steering committee and will be one of the speakers at the conference. "This inaugural conference is a pioneering effort on the University's part to launch a regional center for quantum computing, and it's amazing to return to campus with some of our field's greatest contributors."

The list of speakers includes U.S. Sen. Jack Reed, who will deliver an address on the opening day of the conference, along with pioneers from across the U.S. as well as from Canada, Europe, and Australia. They will provide a look into different areas of quantum computing. They include:

With the launch of URI's master's program in quantum computing, education will be a focus of the conference, and college and high school students are encouraged to attend to learn more about opportunities in the field and to make professional connections. Among the speakers focusing on education will be Professor Chandralekha Singh, president of the American Association of Physics Teachers. Students and postdoctoral scientists are also invited to present their own work through research posters.

Leonard Kahn, chair of the URI Physics Department, says that, given the interdisciplinary nature of quantum computing, the conference will be of interest to a wide audience.

"The future of quantum computing will impact us all as we go forward. People will come away from this conference with a far broader understanding of the topic," said Kahn. "We hope to encourage the full URI community and all the high schools to attend. We'll have talks on education, and our round-table discussion on the future of quantum computing will present a diverse group of people describing their views of the future. I think that will be enlightening to the general community. This is not just for people in physics. This not only affects the STEM disciplines, but also philosophy and the humanities."

The conference is planned as a primarily in-person event. But given the uncertainty of the pandemic and its impact on travel, some speakers may be virtual, and plans for a hybrid format are in place.

Along with Zapata Computing, sponsors for Frontiers in Quantum Computing include IBM and D-Wave, both pioneers in the development of quantum computers. Sponsorship opportunities are still available. For more information, contact Leonard Kahn at lenkahn@uri.edu.

New master's program

The 38-credit Master of Science in Quantum Computing program, which starts in the fall as part of the Physics Department's new Quantum Information Science Program, aligns with the goals of the National Quantum Initiative Act. The act highlights the need for researchers, educators and students, and for the growth of multidisciplinary curricula and research opportunities in quantum-based technologies.

The program, which builds on URI's strengths in quantum information science, optics and nanophysics, is geared toward students with a bachelor's degree in physics or a closely related discipline and a basic understanding of quantum mechanics.

"What makes the program unique in the U.S. is that a student can come in as a freshman and in five years graduate with both a bachelor's degree in physics and a Master of Science in Quantum Computing," Kahn said.

Students graduating from the program can either use the degree as a direct route into the quantum computing industry or as preparation for doctoral studies. The non-thesis program includes specialized coursework in quantum computing and complementary graduate-level coursework in physics. The program also allows students the flexibility to tailor their studies to pursue interests in mathematics, computer science, and engineering through the programs interdisciplinary curriculum.

In addition, the program has established partnerships with industrial firms, such as Zapata Computing, as well as national labs and institutes that will help students take part in collaborative research projects and internships to ensure they are prepared to enter the workforce or to pursue further studies in the rapidly advancing field.

"There are several major quantum computing efforts in New England, and students will have access to that," said Srinivasa. "They will have access to industry and government internships that will give them hands-on, experiential learning, along with opportunities to pursue research with University faculty and their collaborators in this field."

Frontiers in Quantum Computing, its lectures and the panel discussion on the future of quantum computing are free and open to the public, but registration is required. To learn more about the conference or to register, go to the Frontiers in Quantum Computing webpage. For more information on the master's program, email Leonard Kahn at lenkahn@uri.edu or Vanita Srinivasa at vsriniv@uri.edu.

The rest is here:
URI to host international experts for conference on future of quantum computing - URI Today

Research by University of Surrey and Arqit reveals Quantum Threat to Digital Assets – Business Wire

LONDON--(BUSINESS WIRE)--A University of Surrey report released today identifies the definitive threat posed to digital assets unless urgent changes are made to their cryptography. The report is co-authored by Stephen Holmes, Chief Product Officer at Arqit Limited ("Arqit"), a global leader in quantum encryption technology, and Liqun Chen, Professor in Secure Systems at the University of Surrey.

With increasing global investment, quantum computing technology is developing quickly towards the point where it will have sufficient power to break the digital signatures used in digital assets. As central banks and large enterprises are now seriously considering the large-scale use of digital assets, this is more important than ever. This research assesses the attack mechanisms employed by a quantum computer and when they will arise. It covers:

Arqit's Founder, Chairman and CEO, David Williams said, "The University of Surrey research paper highlights some critical issues that must be addressed and prioritised to ensure all digital assets are secure against quantum computer attacks. With mass digitisation of almost every aspect of our society already underway, and a number of governments considering the launch of their own digital currencies, this is now a global issue. The world needs stronger, simpler encryption to counter the major attacks seen daily today, and the quantum attack of tomorrow. We welcome this research, which elaborates well the things that the digital assets community must consider to mitigate these cryptographic threats."

About the researchers: Stephen Holmes is Chief Product Officer at Arqit. He has 30 years of experience in IT, including appointments as CTO, consultant, enterprise architect, and product manager of disruptive innovation. He conducted his PhD research in post-quantum cryptography applied to blockchain at the University of Surrey. Stephen has a wide range of experience in commercialising disruptive technologies; he has worked at IBM laboratories and Hewlett Packard, and was co-founder and CTO of Virtusa Xlabs. Stephen holds an MBA, specialising in marketing innovative products and services. He has a passion for building secure systems and protecting privacy, and writes extensively about security, privacy, and quantum technologies. He represents the UK as a subject matter expert on ISO TC307 blockchain and DLT systems.

Prof Liqun Chen joined the Department of Computer Science at the University of Surrey as Professor in Secure Systems in 2016. Prior to this appointment, she was a Principal Research Scientist at Hewlett Packard Laboratories in Bristol, UK, which she joined in 1997. Before that, she worked at Royal Holloway, University of London, and the University of Oxford. Her research interests include applied cryptography, trusted computing, and network security.

-ends-

About Arqit Limited: Arqit supplies a unique quantum encryption Platform-as-a-Service which secures the communications links of any networked device against current and future forms of attack, even from a quantum computer. Arqit's product, QuantumCloud, enables any device to download a lightweight software agent of less than 200 lines of code, which can create keys in partnership with any other device. The keys are computationally secure, don't exist until the moment they are needed, and can never be known to a third party. QuantumCloud can create limitless volumes of keys in limitless group sizes and can regulate the secure entrance and exit of a device in a group. The addressable market for QuantumCloud is every connected device. QuantumCloud 1.0 will launch to the first cohort of customers in the second half of 2021, with $130M in contracts already committed*.

*As of release date

On May 12, 2021, Arqit entered into a definitive agreement to combine with Centricus Acquisition Corp (NASDAQ: CENHU, CENH, CENHUW), a special purpose acquisition company, which would result in Arqit becoming a publicly listed company on the NASDAQ Stock Market under the name Arqit Quantum Inc.

Additional Information: This communication is being made in respect of the proposed transaction involving Arqit Limited ("Arqit"), Centricus Acquisition Corp. ("Centricus") and Arqit Quantum Inc. ("Pubco"), a newly formed Cayman holding company. This communication does not constitute an offer to sell or the solicitation of an offer to buy any securities or a solicitation of any vote or approval, nor shall there be any sale of securities in any jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of such jurisdiction. In connection with the proposed transaction, Pubco has filed with the Securities and Exchange Commission ("SEC") a registration statement on Form F-4 that includes a proxy statement of Centricus in connection with Centricus's solicitation of proxies for the vote by Centricus's shareholders with respect to the proposed transaction and other matters as may be described in the registration statement. Pubco and Centricus also plan to file other documents with the SEC regarding the proposed transaction, and a proxy statement/prospectus will be mailed to all holders of Centricus's Class A ordinary shares. BEFORE MAKING ANY VOTING OR INVESTMENT DECISION, INVESTORS AND SECURITY HOLDERS ARE URGED TO READ THE FORM F-4 AND THE PROXY STATEMENT/PROSPECTUS REGARDING THE PROPOSED TRANSACTION AND ANY OTHER RELEVANT DOCUMENTS FILED WITH THE SEC IN CONNECTION WITH THE PROPOSED TRANSACTION CAREFULLY IN THEIR ENTIRETY WHEN THEY BECOME AVAILABLE BECAUSE THEY WILL CONTAIN IMPORTANT INFORMATION ABOUT THE PROPOSED TRANSACTION. The proxy statement/prospectus, as well as other filings containing information about Arqit and Centricus, will be available without charge at the SEC's Internet site (http://www.sec.gov). Copies of the proxy statement/prospectus can also be obtained, when available, without charge, from Arqit's website at http://www.arqit.uk, or by directing a request to: Centricus Acquisition Corp., PO Box 309, Ugland House, Grand Cayman, KY1-1104, Cayman Islands.

Participants in the Solicitations: Arqit, Centricus and certain of their respective directors, executive officers and other members of management and employees may, under SEC rules, be deemed to be participants in the solicitation of proxies from Centricus's shareholders in connection with the proposed transaction. Information about Centricus's directors and executive officers and their ownership of Centricus's securities will be set forth in the proxy statement/prospectus when available. Additional information regarding the participants in the proxy solicitation and a description of their direct and indirect interests will be included in the proxy statement/prospectus when it becomes available. Shareholders, potential investors and other interested persons should read the proxy statement/prospectus carefully when it becomes available before making any voting or investment decisions. You may obtain free copies of these documents from the sources indicated above.

No Offer or Solicitation: This communication does not constitute an offer to sell or the solicitation of an offer to buy any securities, or a solicitation of any vote or approval, nor shall there be any sale of securities in any jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of any such jurisdiction. No offering of securities shall be made except by means of a prospectus meeting the requirements of section 10 of the Securities Act, or an exemption therefrom.

Caution About Forward-Looking Statements: This communication includes forward-looking statements. These forward-looking statements are based on Arqit's and Centricus's expectations and beliefs concerning future events and involve risks and uncertainties that may cause actual results to differ materially from current expectations. These factors are difficult to predict accurately and may be beyond Arqit's and Centricus's control. Forward-looking statements in this communication or elsewhere speak only as of the date made. New uncertainties and risks arise from time to time, and it is impossible for Arqit and Centricus to predict these events or how they may affect Arqit and Centricus. Except as required by law, neither Arqit nor Centricus has any duty to, and does not intend to, update or revise the forward-looking statements in this communication or elsewhere after the date this communication is issued. In light of these risks and uncertainties, investors should keep in mind that results, events or developments discussed in any forward-looking statement made in this communication may not occur. Uncertainties and risk factors that could affect Arqit's and Centricus's future performance and cause results to differ from the forward-looking statements in this release include, but are not limited to: (i) that the business combination may not be completed in a timely manner or at all, which may adversely affect the price of Centricus's securities, (ii) the risk that the business combination may not be completed by Centricus's business combination deadline and the potential failure to obtain an extension of the business combination deadline if sought by Centricus, (iii) the failure to satisfy the conditions to the consummation of the business combination, including the approval of the Business Combination Agreement by the shareholders of Centricus and the satisfaction of the minimum trust account amount following any redemptions by Centricus's public shareholders, (iv) the lack of a third-party valuation in determining whether or not to pursue the business combination, (v) the occurrence of any event, change or other circumstance that could give rise to the termination of the Business Combination Agreement, (vi) the effect of the announcement or pendency of the business combination on the Company's business relationships, operating results, and business generally, (vii) risks that the business combination disrupts current plans and operations of the Company, (viii) the outcome of any legal proceedings that may be instituted against the Company or against Centricus related to the Business Combination Agreement or the business combination, (ix) the ability to maintain the listing of Centricus's securities on a national securities exchange, (x) changes in the competitive and regulated industries in which the Company operates, variations in operating performance across competitors, changes in laws and regulations affecting the Company's business and changes in the combined capital structure, (xi) the ability to implement business plans, forecasts, and other expectations after the completion of the business combination, and identify and realize additional opportunities, (xii) the potential inability of the Company to convert its pipeline or orders in backlog into revenue, (xiii) the potential inability of the Company to successfully deliver its operational technology which is still in development, (xiv) the potential delay of the commercial launch of the Company's products, (xv) the risk of interruption or failure of the Company's information technology and communications system and (xvi) the enforceability of the Company's intellectual property.

Go here to read the rest:
Research by University of Surrey and Arqit reveals Quantum Threat to Digital Assets - Business Wire

A Roadmap On The Geopolitical Impact Of Emerging Technologies By Chuck Brooks And Dr. David Bray – Forbes

Earlier this summer, the Atlantic Council's GeoTech Center published a new bipartisan report of the Commission on the Geopolitical Impacts of New Technologies and Data (www.atlanticcouncil.org/geotechreport). Fourteen months in the making, the bipartisan recommendations highlight that the technological revolution is advancing at such speed and enormity that it is reshaping both societies globally and the geopolitical landscape.

We are now entering a new era of emerging connected technologies that blend engineering, computing algorithms, and culture. Along with connected computing come new capabilities enabled by machine learning and artificial intelligence. This technology convergence will be immensely impactful: human/computer interfaces will extend our brain capacities, memories, and capabilities, while the power of computing doubles, on average, every two years. These connected technology tools can be stepping-stones to a new world in diverse areas such as genetic engineering, augmented reality, robotics, renewable energies, big data, digital security, quantum computing and artificial intelligence.

If you think about the transformative role that technology is already playing in our lives, it is easy to envision how emerging technologies can proceed in effecting societal change. We are in the early stages of profound technological innovation, namely the digital revolution, the Internet of Things, health and medicine, and manufacturing. It is no exaggeration to say we are on the cusp of scientific and technological advancements that will change the human condition. Incredibly, technology continues to evolve at a pace we could not have envisioned even two decades ago, and these advancements can greatly improve our lives if we harness them properly.

The GeoTech Commission Report poignantly states that the world must now start to understand how technology and data interact with society and how to implement solutions that address these challenges and grasp these opportunities. For example, data analytics has the potential to improve healthcare by identifying the best pathways in treatments and administration of patient medicines, as well as predicting the spread of the flu. In commerce and trade, data analytics can predict when and what consumers are buying. The mathematical applications used in analyzing large data sets can be used to predict societal change at almost every level of human interaction.

Collaboration is Key:

A thematic takeaway from the Commission, co-chaired by John Goodman and Teresa Carlson, was the urgent need for collaboration: governments, industries and other stakeholders must work together to remain economically competitive, sustain social welfare and public safety, protect human rights and democratic processes, and preserve global peace and stability.

Success in these cooperative public-private partnerships depends on information sharing, planning, investment in emerging technologies, and allocation of resources coordinated by both the public and private sectors in special working partnerships.

The result of such collaboration will both keep us apprised of new paradigms and contribute to a seismic shift in breakthrough discoveries. Such cooperation could speed up the realization of the next industrial revolution and bring benefits beyond our expectations.

Technologies can impact society across many industry verticals. For example:

Health & Medicine

*Implantable devices (bionic eyes, limbs)

*DNA nanomedicines

*Genomic techniques and gene therapy (to enhance strength, endurance, lifespan, and human intelligence)

*Remote sensing tech (Wearables)

*Medicine for longevity, enhancement

*Real-time biomarker tracking and monitoring

*Artificially grown organs

*Human regeneration; human cells interfaced with nanotech

*Cybernetics

*Exoskeletons for mobility

*Telemedicine

Transportation:

*Sustainability of infrastructure

* Converged transportation ecosystems and monitoring

*Autonomous and connected cars

*Predictive analytics (parking, traffic patterns)

* Smart Cities

Energy & Environment:

*New Materials for stronger construction and resilience

*Solar power

*Converting waste to biofuels

*Protecting the Electrical Grid

*Batteries (long lasting)

*Renewables

*Energy efficiency.

Public Safety:

*Instrumenting the planet to respond to crises faster

*Better situational awareness for emergency management (chemical and bio sensors, cameras, drones)

*License plate readers

*Non-lethal public safety technologies

*Advanced forensics, both physical and digital

*Interoperable and secure communications

Please also see: "GovCon Expert Chuck Brooks: A Guide to Emerging Technologies Impacting Government in 2021 and Beyond" (GovCon Wire) and "Four Emerging Technology Areas Impacting Industry 4.0: Advanced Computing, Artificial Intelligence, Big Data & Materials Science" (Cognitive World).

In the GeoTech Commission report, several areas of specialized cooperation get attention, including two of the biggest threats that society faces today: pandemics and cyber-attacks.

Mitigating Pandemics:

As we are now acutely aware of the implications of a global pandemic from Covid-19, the Commission suggested that government should launch a global pandemic surveillance and warning system and develop rapid and automated treatments for unknown pathogens. It should also build a digital infrastructure that includes, among other systems, emerging biosensors and autonomous sequencers deployed in water systems, air filtration systems and other public infrastructure, to integrate their diverse data for analysis and modeling, with protocols for activating rapid analysis of new pathogens, including new strains of extant pathogens, to evaluate ongoing vaccine efficacy.

Implementing Cybersecurity:

Cybersecurity is a big focus area of the report. Cybersecurity, information assurance, and resilience are the glue that will keep our world of converged sensors and algorithms operational. This has become one of the largest areas of both public and private sector spending and is consistently ranked the top priority among CIOs, CTOs, and CISOs.

The commission urged strengthening the National Cyber Strategy Implementation Plan, developing a geopolitical cyber deterrence strategy for critical digital assets, and hardening the security of commercial space capabilities. To advance security, the government should increase its oversight of supply chain assurance, organize data on critical resources and, with allied partners, evaluate the physical and digital information technology supply chain.

Furthermore, the report highlighted: "Governments, especially democratic governments, must work to build and sustain the trust in the algorithms, infrastructures and systems that could underpin society... The world must now start to understand how technology and data interact with society and how to implement solutions that address these challenges and grasp these opportunities. Maintaining both economic and national security and resiliency requires new ways to develop and deploy critical and emerging technologies, cultivate the needed human capital, build trust in the digital fabric with which our world will be woven and establish norms for international cooperation."

We have entered a new renaissance of accelerated technological development that is exponentially transforming our civilization. Yet with these benefits come risks. The Atlantic Council Commission Report provides a working roadmap on the Geopolitical Impact of emerging technology and the correct paths to follow.

Please see the Report of the Commission on the Geopolitical Impacts of New Technologies and Data at: Report of the Commission on the Geopolitical Impacts of New Technologies and Data - Atlantic Council

Dr. David A. Bray has served in a variety of leadership roles in turbulent environments, including bioterrorism preparedness and response from 2000-2005, time on the ground in Afghanistan in 2009, serving as the non-partisan Executive Director for a bipartisan National Commission on R&D, and providing leadership as a non-partisan federal agency Senior Executive. He accepted a leadership role in December 2019 to incubate a new global Center with the Atlantic Council.

David also provides strategy to both boards and start-ups, espousing human-centric principles for technology-enabled decision making in complex environments. Business Insider named him one of the top "24 Americans Who Are Changing the World" under 40, and he was named a Young Global Leader by the World Economic Forum for 2016-2021. From 2017 to the start of 2020, David served as Executive Director for the People-Centered Internet coalition chaired by Internet co-originator Vint Cerf, focused on providing support and expertise for community-focused projects that measurably improve people's lives using the internet. He also was named a Marshall Memorial Fellow and traveled to Europe in 2018 to discuss trans-Atlantic issues of common concern, including exponential technologies and the global future ahead. Later in 2018, he was invited to work with the U.S. Navy and Marines on improving organizational adaptability and to work with U.S. Special Operations Command's J5 Directorate on the challenges of countering misinformation and disinformation online. He has received both the Joint Civilian Service Commendation Award and the National Intelligence Exceptional Achievement Medal.

Dr. David Bray LinkedIn Profile:

AC GeoTech Center on Twitter: @ACGeoTech

Chuck Brooks, President of Brooks Consulting International, is a globally recognized thought leader and subject matter expert in Cybersecurity and Emerging Technologies. LinkedIn named Chuck as one of "The Top 5 Tech People to Follow on LinkedIn." He was named by Thompson Reuters as a "Top 50 Global Influencer in Risk, Compliance," and by IFSEC as the "#2 Global Cybersecurity Influencer." He was featured in the 2020 Onalytica "Who's Who in Cybersecurity" as one of the top influencers for cybersecurity issues. He was also named one of the Top 5 Executives to Follow on Cybersecurity by Executive Mosaic. He is also a Cybersecurity Expert for "The Network" at the Washington Post, Visiting Editor at Homeland Security Today, Expert for Executive Mosaic/GovCon, and a Contributor to FORBES. He has also been a featured author in technology and cybersecurity blogs and events by IBM, AT&T, Microsoft, Cylance, Xerox, Malwarebytes, General Dynamics Mission Systems, and many others.

Chuck is on the faculty of Georgetown University, where he teaches in the Graduate Applied Intelligence and Cybersecurity Risk programs. In government, Chuck was a plank holder at the Department of Homeland Security (DHS), serving as the first Legislative Director of the Science & Technology Directorate. He also served as a top advisor to the late Senator Arlen Specter, covering security and technology issues on Capitol Hill. He has an M.A. from the University of Chicago and a B.A. from DePauw University.

Chuck Brooks LinkedIn Profile:

Chuck Brooks on Twitter: @ChuckDBrooks

Read this article:
A Roadmap On The Geopolitical Impact Of Emerging Technologies By Chuck Brooks And Dr. David Bray - Forbes

IBM and CERN on quantum computing to track the elusive Higgs boson – Tech News Inc

The capabilities of quantum computers are currently under discussion in contexts ranging from banks to merchant ships. Now this technology, which is still in development, has been taken further, or rather, lower: a hundred meters below the French-Swiss border, where the largest machine in the world, the Large Hadron Collider (LHC), is operated by the European particle physics laboratory, CERN. Faced with technical difficulties in understanding the mountains of data produced by such a massive system, scientists at CERN have just called on the IBM quantum team for help.

This collaboration on the data produced by the LHC could have major implications for the state of our knowledge of matter, antimatter, and even dark matter, especially since the LHC is one of CERN's most important tools for understanding the fundamental laws that govern the particles and forces that make up the universe.

Shaped like a 27-kilometre loop, the system accelerates beams of particles such as protons and electrons to just below the speed of light, before smashing them together in collisions that scientists observe with eight high-resolution detectors housed inside the accelerator. Every second, particles collide about a billion times within the LHC, producing petabytes of data that are currently processed by a million CPUs in 170 locations around the world; the data is distributed because such volumes cannot be stored in any single place.

CERN's job isn't just to store data; quite the opposite. All information generated by the LHC is made available to be processed and analyzed, so that scientists can develop hypotheses, evidence and discoveries. By observing particles colliding, CERN researchers in 2012 discovered the existence of an elementary particle called the Higgs boson, which gives mass to all other fundamental particles. The discovery was hailed as a major breakthrough in physics.

Scientists do this using sophisticated machine learning algorithms that can scan data produced by the LHC to distinguish useful collisions, such as those that produce Higgs bosons, from others. "So far, scientists have used classic machine learning techniques to analyze raw data captured by particle detectors, and automatically select the best candidate events," the IBM researchers explained in a blog post.

This is where Big Blue comes in. "We believe we can greatly improve this selection process by enhancing quantum computing-based machine learning," they say. As the volume of data grows, traditional machine learning models are rapidly approaching the limits of their capabilities, and this is where quantum computers are likely to play a useful role.

The qubits that make up quantum computers can hold much more information than conventional bits, meaning they can represent and process many more dimensions than classical devices. Therefore, a quantum computer equipped with a sufficient number of qubits can, in principle, perform very complex calculations that would take classical computers centuries to solve. With that in mind, CERN partnered with the IBM quantum team in 2018, with the goal of discovering exactly how to apply quantum technologies to advance scientific discovery.
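
The scaling behind that claim: the joint state of n qubits is a vector in a complex space whose dimension grows exponentially with n.

```latex
% An n-qubit register is described by amplitudes over 2^n basis states:
|\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x |x\rangle,
\qquad \sum_{x} |\alpha_x|^2 = 1
% so even 20 qubits already span 2^{20} \approx 10^6 dimensions.
```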

Quantum machine learning soon emerged as a solution to the data-analysis problems burdening CERN teams. The approach is to harness the capabilities of qubits to expand what is known as the feature space: the set of features on which an algorithm bases its classification decision. Using a larger feature space, a quantum computer would be able to see patterns and perform classification tasks even in a huge dataset where a classical computer would only see random noise.
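
This is the familiar kernel trick from classical support vector machines, where a kernel implicitly lifts data into a richer feature space so a simple boundary can separate it; a quantum kernel replaces that function with one evaluated by a quantum circuit. A minimal classical illustration using scikit-learn (synthetic data; the library is assumed available):

```python
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Concentric circles: not linearly separable in the raw 2-D input space.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear = SVC(kernel="linear").fit(X_tr, y_tr)   # no feature-space expansion
rbf = SVC(kernel="rbf").fit(X_tr, y_tr)         # implicit high-dimensional lift

print("linear kernel accuracy:", linear.score(X_te, y_te))  # near chance (~0.5)
print("RBF kernel accuracy:   ", rbf.score(X_te, y_te))     # near perfect (~1.0)
```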

Applied to CERN research, a quantum machine learning algorithm could examine the raw data produced by the LHC and identify occurrences of the Higgs boson's behavior, for example, where classical computers would struggle to see anything.

To assist CERN scientists with their research, the IBM team created a quantum algorithm called the Quantum Support Vector Machine (QSVM), designed to identify collisions that produce Higgs bosons. The algorithm was trained using a test dataset based on information generated by one of the LHC detectors, and it was run on both quantum simulators and physical quantum devices.

In both cases, the results were promising. The simulation study, conducted on Google TensorFlow Quantum, IBM Quantum, and Amazon Braket, used up to 20 qubits and a dataset of 50,000 events, and performed as well as, if not better than, its classical counterparts on the same problem. The hardware experiment was performed on IBM's quantum machines using 15 qubits and a 100-event dataset. The results showed that, despite the noise affecting quantum computations, the quality of the classification remained comparable to the best results of classical simulations.

"This once again confirms the potential of the quantum algorithm for this class of problems," IBM says. "The quality of our results indicates a possible demonstration of a quantum advantage of data classification using quantum vector machines in the near future." This does not mean, however, that a quantum advantage has actually been demonstrated: the quantum algorithm developed by IBM produced results similar to those of traditional methods on today's finite quantum processors, and these systems are still in their infancy.

With only a few qubits each, quantum computers today are unable to perform useful calculations. They are also hampered by the fragility of qubits, which are very sensitive to environmental changes and still prone to errors. For now, both IBM and CERN are counting on future improvements in quantum hardware to demonstrate that quantum algorithms have a tangible advantage, not just in theory.

"Our results show that quantum machine-learning algorithms for classifying data can be as accurate as classical algorithms on noisy quantum computers, paving the way for demonstrating a quantum advantage in the near future," the IBM research team emphasized. That is something that gives hope to CERN scientists.

The Large Hadron Collider is currently undergoing an upgrade, and the next iteration of the system, scheduled to enter service in 2027, is expected to produce 10 times more collisions than the current device. The volume of data generated only goes in one direction, and it won't be long before traditional processors can no longer handle it all.

Source: ZDNet.com

Read the original post:
IBM and CERN on quantum computing to track the elusive Higgs boson - Tech News Inc