Category Archives: Quantum Computer

Call for Participation in Workshop on Potential NSF CISE Quantum Initiative – HPCwire

Editor's Note: Next month there will be a workshop to discuss what a quantum initiative led by NSF's Computer, Information Science and Engineering (CISE) directorate could entail. The details are posted below in a Call for Participation announcement. A key contact for interested quantum community members is Frank Mueller, N.C. State University.

Call for Participation: Planning Workshop on Quantum Computing (PlanQC 2024) April 28, 2024, https://www.asplos-conference.org/asplos2024/ in conjunction with: ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2024) San Diego, CA (USA)

Funding for quantum computing has come from a variety of programs at the National Science Foundation (NSF), which have been multi-disciplinary and cut across multiple NSF divisions. However, no NSF quantum initiatives have been led by the Computer, Information Science and Engineering (CISE) directorate within NSF. Undoubtedly, there is a surge in demand driven by open positions in academia and industry focused on the computing side of quantum. There is arguably a need for a focused program on quantum computing led by CISE, in cooperation with other directorates, to enable the next generation of quantum algorithms, quantum architectures, quantum communication, quantum systems, quantum software and compilers. The objective of this workshop is to identify areas of quantum computing that can enable new discoveries in quantum science and engineering, and to meaningfully contribute to creating a quantum-computing-ready workforce.

To articulate this need and to develop a plan for a new CISE-led quantum program, we plan to bring together several leading senior and some junior researchers in quantum computing for a planning workshop, complemented by selected participants from beyond the CISE community to foster interdisciplinary interactions.

This workshop will lead to a comprehensive report that will provide a detailed assessment of the need for a new CISE program and will provide a description of the research areas that such a program should focus on. With such a new program, NSF would be able to focus its efforts on the computing aspects of quantum science and greatly enhance its ability to sustain the leadership role of the United States in this area of strategic interest. This workshop and report will be the first stepping stone in bootstrapping such a program.

Call for participation

We invite researchers, both those already active in quantum computing and those aspiring to become active, to participate in a one-day workshop that will also include some participants from quantum application domains. The focus of the workshop will be to engage in discussions and summarize the findings in writing, creating a report on open quantum problems and motivating workforce development in the area. The workshop format will alternate plenary and break-out sessions to provide a unifying vision while identifying research challenges in a diverse set of subareas.

Quantum Topics

Algorithms

Architectures

Communication

Compilers/Languages

Simulation

Software

Theory

Applications

Classical control and peripheral hardware

Workshop Chairs and Organizers

Frank Mueller North Carolina State University

Fred Chong University of Chicago

Vipin Chaudhary Case Western Reserve University

Samee Khan Mississippi State University

Gokul Ravi University of Michigan

Read the original post:
Call for Participation in Workshop on Potential NSF CISE Quantum Initiative - HPCwire

Quantum computing progress: Higher temps, better error correction – Ars Technica

There's a strong consensus that tackling most useful problems with a quantum computer will require that the computer be capable of error correction. There is absolutely no consensus, however, about what technology will allow us to achieve that. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have all committed to different technologies to get there, while a collection of startups are exploring an even wider range of potential solutions.

We probably won't have a clearer picture of what's likely to work for a few years. But there's going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we're going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

Error correction will require connecting multiple hardware qubits to act as a single unit termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error-correction schemes require over a hundred hardware qubits for each logical qubit, meaning we'd need tens of thousands of hardware qubits before we could do anything practical.

A number of companies have looked at that problem and decided we already know how to create hardware on that scale: just look at any silicon chip. So, if we could etch useful qubits through the same processes we use to make current processors, then scaling wouldn't be an issue. Typically, this has meant fabricating quantum dots on the surface of silicon chips and using these to store single electrons that can hold a qubit in their spin. The rest of the chip holds more traditional circuitry that performs the initiation, control, and readout of the qubit.

This creates a notable problem. Like many other qubit technologies, quantum dots need to be kept below 1 Kelvin in order to keep the environment from interfering with the qubit. And, as anyone who has ever owned an x86-based laptop knows, all the other circuitry on the silicon generates heat. So, there's the very real prospect that trying to control the qubits will raise the temperature to the point that the qubits can't hold onto their state.

That might not be the problem we thought it was, according to some work published in Wednesday's Nature. A large international team that includes people from the startup Diraq has shown that a silicon quantum dot processor can work well at the relatively toasty temperature of 1 Kelvin, up from the usual millikelvin temperatures at which these processors normally operate.

The work was done on a two-qubit prototype made with materials that were specifically chosen to improve noise tolerance; the experimental procedure was also optimized to limit errors. The team then performed normal operations starting at 0.1 K and gradually ramped up the temperatures to 1.5 K, checking performance as they did so. They found that a major source of errors, state preparation and measurement (SPAM), didn't change dramatically in this temperature range: "SPAM around 1 K is comparable to that at millikelvin temperatures and remains workable at least until 1.4 K."

The error rates they did see depended on the state they were preparing. One particular state (both spin-up) had a fidelity of over 99 percent, while the rest were less constrained, at somewhere above 95 percent. States had a lifetime of over a millisecond, which qualifies as long-lived in the quantum world.

All of which is pretty good and suggests that the chips can tolerate reasonable operating temperatures, meaning on-chip control circuitry can be used without causing problems. The error rates of the hardware qubits are still well above those that would be needed for error correction to work. However, the researchers suggest that they've identified error processes that can potentially be compensated for. They expect that the ability to do industrial-scale manufacturing will ultimately lead to working hardware.

Read this article:
Quantum computing progress: Higher temps, better error correction - Ars Technica

Quantum computing just got hotter: 1 degree above absolute zero – The Conversation

For decades, the pursuit of quantum computing has struggled with the need for extremely low temperatures, mere fractions of a degree above absolute zero (0 Kelvin, or −273.15°C). That's because the quantum phenomena that grant quantum computers their unique computational abilities can only be harnessed by isolating them from the warmth of the familiar classical world we inhabit.

A single quantum bit, or qubit, the equivalent of the binary zero-or-one bit at the heart of classical computing, requires a large refrigeration apparatus to function. However, in many areas where we expect quantum computers to deliver breakthroughs, such as designing new materials or medicines, we will need large numbers of qubits or even whole quantum computers working in parallel.

Quantum computers that can manage errors and self-correct, essential for reliable computations, are anticipated to be gargantuan in scale. Companies like Google, IBM and PsiQuantum are preparing for a future of entire warehouses filled with cooling systems and consuming vast amounts of power to run a single quantum computer.

But if quantum computers could function at even slightly higher temperatures, they could be much easier to operate and much more widely available. In new research published in Nature, our team has shown that a certain kind of qubit, the spins of individual electrons, can operate at temperatures around 1 Kelvin, far hotter than earlier examples.

Cooling systems become less efficient at lower temperatures. To make it worse, the systems we use today to control the qubits are intertwining messes of wires reminiscent of ENIAC and other huge computers of the 1940s. These systems increase heating and create physical bottlenecks to making qubits work together.


The more qubits we try to cram in, the more difficult the problem becomes. At a certain point the wiring problem becomes insurmountable.

After that, the control systems need to be built into the same chips as the qubits. However, these integrated electronics use even more power and dissipate more heat than the big mess of wires.

Our new research may offer a way forward. We have demonstrated that a particular kind of qubit, one made with a quantum dot printed with metal electrodes on silicon, using technology much like that in existing microchip production, can operate at temperatures around 1 Kelvin.

This is only one degree above absolute zero, so it's still extremely cold. However, it's significantly warmer than previously thought possible. This breakthrough could condense the sprawling refrigeration infrastructure into a more manageable, single system. It would drastically reduce operational costs and power consumption.

The necessity for such technological advancements isn't merely academic. The stakes are high in fields like drug design, where quantum computing promises to revolutionise how we understand and interact with molecular structures.

The research and development expenses in these industries, running into billions of dollars, underscore the potential cost savings and efficiency gains from more accessible quantum computing technologies.

Hotter qubits offer new possibilities, but they will also introduce new challenges in error correction and control. Higher temperatures may well mean an increase in the rate of measurement errors, which will create further difficulties in keeping the computer functional.

It is still early days in the development of quantum computers. Quantum computers may one day be as ubiquitous as today's silicon chips, but the path to that future will be filled with technical hurdles.


Our recent progress in operating qubits at higher temperatures is a key step towards simplifying the requirements of the system.

It offers hope that quantum computing may break free from the confines of specialised labs into the broader scientific community, industry and commercial data centres.

Follow this link:
Quantum computing just got hotter: 1 degree above absolute zero - The Conversation

IBM Quantum Computing Blog | Landmark IBM error correction paper on Nature cover – IBM

Today, the paper detailing those results was published as the cover story of the scientific journal Nature.

Last year, we demonstrated that quantum computers had entered the era of utility, where they are now capable of running quantum circuits better than classical computers can. Over the next few years, we expect to find speedups over classical computing and extract business value from these systems. But there are also algorithms with mathematically proven speedups over leading classical methods that require tuning quantum circuits with hundreds of millions to billions of gates. Expanding our quantum computing toolkit to include those algorithms requires us to find a way to compute that corrects the errors inherent to quantum systems: what we call quantum error correction.

Read how a paper from IBM and UC Berkeley shows a path toward useful quantum computing

Quantum error correction requires that we encode quantum information into more qubits than we would otherwise need. However, achieving quantum error correction in a scalable and fault-tolerant way has, to this point, been out of reach without considering scales of one million or more physical qubits. Our new result published today greatly reduces that overhead, and shows that error correction is within reach.

While quantum error correction theory dates back three decades, theoretical error correction techniques capable of running valuable quantum circuits on real hardware have been too impractical to deploy on real quantum systems. In our new paper, we introduce a new code, which we call the gross code, that overcomes that limitation.

This code is part of our broader strategy to bring useful quantum computing to the world.

While error correction is not a solved problem, this new code makes clear the path toward running quantum circuits with a billion gates or more on our superconducting transmon qubit hardware.

Quantum information is fragile and susceptible to noise: environmental noise, noise from the control electronics, hardware imperfections, state preparation and measurement errors, and more. In order to run quantum circuits with millions to billions of gates, quantum error correction will be required.

Error correction works by building redundancy into quantum circuits. Many qubits work together to protect a piece of quantum information that a single qubit might lose to errors and noise.

On classical computers, the concept of redundancy is pretty straightforward. Classical error correction involves storing the same piece of information across multiple bits. Instead of storing a 1 as a 1 or a 0 as a 0, the computer might record 11111 or 00000. That way, if an error flips a minority of bits, the computer can treat 11001 as 1, or 10001 as 0. It's fairly easy to build in more redundancy as needed to introduce finer error correction.
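To make the redundancy idea concrete, here is a minimal majority-vote sketch in Python (an illustration of the classical scheme described above, not code from IBM):

```python
from collections import Counter

def encode(bit: int, copies: int = 5) -> str:
    """Repetition code: store the same bit several times."""
    return str(bit) * copies

def decode(word: str) -> int:
    """Majority vote: whichever value most bits hold wins."""
    return int(Counter(word).most_common(1)[0][0])

# A minority of flipped bits is corrected transparently,
# matching the 11001 -> 1 and 10001 -> 0 examples above.
assert decode("11001") == 1
assert decode("10001") == 0
```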

Things are more complicated on quantum computers. Quantum information cannot be copied and pasted like classical information, and the information stored in quantum bits is more complicated than classical data. And of course, qubits can decohere quickly, forgetting their stored information.

Research has shown that quantum fault tolerance is possible, and there are many error correcting schemes on the books. The most popular one is called the surface code, where qubits are arranged on a two-dimensional lattice and units of information are encoded into sub-units of the lattice.

But these schemes have problems.

First, they only work if the hardware's error rates are better than some threshold determined by the specific scheme and the properties of the noise itself, and beating those thresholds can be a challenge.

Second, many of those schemes scale inefficiently: as you build larger quantum computers, the number of extra qubits needed for error correction far outpaces the number of qubits the code can store.

At practical code sizes where many errors can be corrected, the surface code uses hundreds of physical qubits per encoded qubit worth of quantum information, or more. So, while the surface code is useful for benchmarking and learning about error correction, it's probably not the end of the story for fault-tolerant quantum computers.

The field of error correction buzzed with excitement in 2022 when Pavel Panteleev and Gleb Kalachev at Moscow State University published a landmark paper proving that there exist asymptotically good codes: codes where the number of extra qubits needed levels off as the quality of the code increases.

This has spurred a lot of new work in error correction, especially in the same family of codes that the surface code hails from, called quantum low-density parity check, or qLDPC codes. These qLDPC codes are quantum error correcting codes where the operations responsible for checking whether or not an error has occurred only have to act on a few qubits, and each qubit only has to participate in a few checks.
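The "low-density" property is easiest to see in a code's parity-check matrix: each check (row) involves only a few bits, and each bit (column) participates in only a few checks. Here is a toy classical analogue in Python, for intuition only (the quantum codes discussed here have more structure):

```python
import numpy as np

# Each row (check) touches exactly 3 bits; each column (bit)
# appears in at most 2 checks -- the matrix stays sparse.
H = np.array([
    [1, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
    [1, 0, 0, 0, 1, 1],
])

received = np.array([1, 0, 1, 1, 0, 0])
syndrome = H @ received % 2   # nonzero entries flag violated checks
print(syndrome)
```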

But this work was highly theoretical, focused on proving the possibility of this kind of error correction. It didn't take into account the real constraints of building quantum computers. Most importantly, some qLDPC codes would require many qubits in a system to be physically linked to high numbers of other qubits. In practice, that would require quantum processors folded in on themselves in psychedelic hyper-dimensional origami, or entombed in wildly complex rat's nests of wires.


Bravyi, S., Cross, A., Gambetta, J., et al. High-threshold and low-overhead fault-tolerant quantum memory. Nature (2024). https://doi.org/10.1038/s41586-024-07107-7

In our Nature paper, we specifically looked for fault-tolerant quantum memory with a low qubit overhead, high error threshold, and a large code distance.

Let's break that down:

Fault-tolerant: The circuits used to detect errors won't spread those errors around too badly in the process, and errors can be corrected faster than they occur.

Quantum memory: In this paper, we are only encoding and storing quantum information. We are not yet doing calculations on the encoded quantum information.

High error threshold: The higher the threshold, the higher the hardware error rate the code can tolerate while remaining fault-tolerant. We were looking for a code that allowed us to operate the memory reliably at physical error rates as high as 0.001, so we wanted a threshold close to 1 percent.

Large code distance: Distance is a measure of how robust the code is: how many errors it takes to completely flip the value from 0 to 1 and vice versa. In the case of 00000 and 11111, the distance is 5 (see the sketch after this list). We wanted a code with a large distance that corrects more than just a couple of errors. Large-distance codes can suppress noise by orders of magnitude even if the hardware quality is only marginally better than the code threshold. In contrast, codes with a small distance become useful only if the hardware quality is significantly better than the code threshold.

Low qubit overhead: Overhead is the number of extra qubits required for correcting errors. We want the number of qubits required to do error correction to be far less than we need for a surface code of the same quality, or distance.
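Distance is just the Hamming distance between codewords, which is easy to compute. A minimal sketch using the classical 00000/11111 example from the list above:

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of positions in which two equal-length codewords differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

# The two codewords differ in every position, so the distance is 5,
# and up to (5 - 1) // 2 = 2 bit flips can be corrected.
print(hamming_distance("00000", "11111"))  # 5
```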

We're excited to report that our team's mathematical analysis found concrete examples of qLDPC codes that met all of these required conditions. These fall into a family of codes called Bivariate Bicycle (BB) codes. And they are going to shape not only our research going forward, but how we architect physical quantum systems.

While many qLDPC code families show great promise for advancing error correction theory, most aren't necessarily pragmatic for real-world application. Our new codes lend themselves better to practical implementation because each qubit needs only to connect to six others, and the connections can be routed on just two layers.

To get an idea of how the qubits are connected, imagine they are put onto a square grid, like a piece of graph paper. Curl up this piece of graph paper so that it forms a tube, and connect the ends of the tube to make a donut. On this donut, each qubit is connected to its four neighbors and two qubits that are farther away on the surface of the donut. No more connections needed.

The good news is we don't actually have to embed our qubits onto a donut to make these codes work; we can accomplish this by folding the surface differently and adding a few other long-range connectors to satisfy the mathematical requirements of the code. It's an engineering challenge, but much more feasible than a hyper-dimensional shape.
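A rough sketch of that connectivity, assuming qubits on an L x L grid with wrap-around edges. The two long-range offsets below are placeholders for illustration only; the real BB-code connections are fixed by the code's defining polynomials:

```python
L = 12  # grid side, chosen only for illustration

def neighbors(i: int, j: int, long_range=(3, 0)):
    """Four nearest neighbors on a torus plus two illustrative long-range links."""
    di, dj = long_range
    near = [((i - 1) % L, j), ((i + 1) % L, j),
            (i, (j - 1) % L), (i, (j + 1) % L)]
    far = [((i + di) % L, (j + dj) % L), ((i - di) % L, (j - dj) % L)]
    return near + far

print(neighbors(0, 0))  # six connections per qubit, as described above
```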

We explored some codes that have this architecture and focused on a particular [[144,12,12]] code. We call this code the gross code because 144 is a gross (or a dozen dozen). It requires 144 qubits to store data, but in our specific implementation it also uses another 144 qubits to check for errors, so this instance of the code uses 288 qubits. It stores 12 logical qubits with distance 12, meaning any error affecting fewer than 12 qubits can be detected. Thus: [[144,12,12]].

Using the gross code, you can protect 12 logical qubits for roughly a million cycles of error checks using 288 qubits. Doing roughly the same task with the surface code would require nearly 3,000 qubits.
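Putting those figures side by side (simple arithmetic on the numbers quoted above; the paper's full comparison depends on error rates and cycle counts):

```python
# Gross code: 144 data qubits + 144 check qubits protect 12 logical qubits.
gross_total = 144 + 144                    # 288 physical qubits
gross_per_logical = gross_total / 12       # 24 physical qubits per logical qubit

# Surface-code figure quoted above for the same task: ~3,000 physical qubits.
surface_per_logical = 3000 / 12            # 250 physical qubits per logical qubit

print(gross_per_logical, surface_per_logical)  # 24.0 vs 250.0 -> roughly 10x savings
```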

This is a milestone. We are still looking for qLDPC codes with even more efficient architectures, and our research on performing error-corrected calculations using these codes is ongoing. But with this publication, the future of error correction looks bright.


Fig. 1 | Tanner graphs of surface and BB codes. a, Tanner graph of a surface code, for comparison. b, Tanner graph of a BB code with parameters [[144, 12, 12]] embedded into a torus. Any edge of the Tanner graph connects a data vertex and a check vertex. Data qubits associated with the registers q(L) and q(R) are shown by blue and orange circles. Each vertex has six incident edges, including four short-range edges (pointing north, south, east and west) and two long-range edges. We only show a few long-range edges to avoid clutter. Dashed and solid edges indicate two planar subgraphs spanning the Tanner graph (see the Methods). c, Sketch of a Tanner graph extension for measuring the logical operators Z̄ and X̄ following ref. 50, attached to a surface code. The ancilla corresponding to the X̄ measurement can be connected to a surface code, enabling load-store operations for all logical qubits by means of quantum teleportation and some logical unitaries. This extended Tanner graph also has an implementation in a thickness-2 architecture through the A and B edges (Methods).


Fig. 2 | Syndrome measurement circuit. Full cycle of syndrome measurements relying on seven layers of CNOTs. We provide a local view of the circuit that only includes one data qubit from each register q(L) and q(R). The circuit is symmetric under horizontal and vertical shifts of the Tanner graph. Each data qubit is coupled by CNOTs to three X-check and three Z-check qubits; see the Methods for more details.


Today, our users benefit from novel error mitigation techniques: methods for reducing or eliminating the effect of noise when calculating observables, alongside our work suppressing errors at the hardware level. This work brought us into the era of quantum utility. IBM researchers and partners all over the world are exploring practical applications of quantum computing today with existing quantum systems. Error mitigation lets users begin looking for quantum advantage on real quantum hardware.

But error mitigation comes with its own overhead, requiring running the same executions repeatedly so that classical computers can use statistical methods to extract an accurate result. This limits the scale of the programs you can run, and increasing that scale requires tools beyond error mitigation like error correction.

Last year, we debuted a new roadmap laying out our plan to continuously improve quantum computers over the next decade. This new paper is an important example of how we plan to continuously increase the complexity (number of gates) of the quantum circuits that can be run on our hardware. It will allow us to transition from running circuits with 15,000 gates to 100 million, or even 1 billion, gates.

Read the rest here:
IBM Quantum Computing Blog | Landmark IBM error correction paper on Nature cover - IBM

3 Quantum Computing Stocks to Buy on the Dip: March 2024 – InvestorPlace

While classical computers have enjoyed tremendous capacity gains over the past few decades, it's time for a paradigm shift, which brings the discussion to quantum computing stocks to buy. Here, we're not just talking about shifting gears but moving from a race car to a rocket ship.

To be sure, it's difficult to explain the various intricacies that help propel quantum computers over their traditional counterparts. But in a nutshell, it comes down to exponentially quicker processing. An attribute called superposition enables quantum computers to evaluate multiple possibilities simultaneously. That makes the new innovation run circles around classical processes.

Further, you can't argue with the numbers. In 2022, the quantum market reached a valuation of $1.9 billion. By 2032, this sector could jump to $42.1 billion, representing a compound annual growth rate of 36.4%.
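Those two figures are mutually consistent, as a quick back-of-the-envelope check shows (illustrative arithmetic, not from the article):

```python
start, end, years = 1.9, 42.1, 10        # $B in 2022 -> projected $B in 2032
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")                      # ~36.3%, matching the quoted 36.4%
```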

Who knows? That might end up being a conservative estimate. With so much anticipation, these are the quantum computing stocks to buy for speculators.


One of the top names in the tech ecosystem, Intel (NASDAQ:INTC) could be one of the underappreciated quantum computing stocks to buy. According to its public profile, it designs, develops, manufactures, markets and sells computing and related products and services worldwide. It operates through the Client Computing Group, Data Center and AI [artificial intelligence], Network and Edge, Mobileye and Intel Foundry Services segments.

Last year, Intel manufactured a quantum chip, making it available to university and federal research labs to grow the underlying community. While it might not be the most exciting play among quantum computing stocks to buy, its continued research and development makes it a worthy idea to consider.

Financially, the company has performed quite well against expected bottom-line targets. Specifically, Intel posted a narrower-than-expected loss per share in the first quarter of 2023 while delivering positive earnings in Q2 through Q3. Overall, the average positive surprise came out to 177.65% in the past four quarters.

For fiscal 2024, analysts anticipate earnings per share to land at $1.24 on sales of $53.1 billion. That's a solid improvement over last year's 97 cents per share on sales of $50.18 billion.


Falling under the computer hardware segment of the broader tech ecosystem, IonQ (NASDAQ:IONQ) engages in the development of general-purpose quantum computing systems. Per its corporate profile, the company sells access to quantum computers of various qubit capacities. It provides access to its quantum computers through cloud platforms offered by enterprises like Amazon (NASDAQ:AMZN), Microsoft (NASDAQ:MSFT) and Alphabet (NASDAQ:GOOGL).

Since the start of the year, IONQ has slipped 25%. However, in the past 52 weeks, it has gained 78%. Therefore, those who are willing to tolerate near-term volatility may benefit from a possible discounted opportunity. On the financials, the company has started to improve its performance.

For example, in Q2 last year, IonQ incurred a negative surprise of 69.2%. In Q3, the metric was 22.2% in the red. However, in Q4, the company met the expected loss per share of 20 cents.

For fiscal 2024, analysts believe that the tech firm could generate revenue of $38.93 million. If so, that would represent a 76.6% increase from last year's print of $22 million. Thus, it's one of the exciting ideas for quantum computing stocks to buy.


Another name within the computer hardware subsector, Rigetti Computing (NASDAQ:RGTI), through its subsidiaries, builds quantum computers and superconducting quantum processors. Per its public profile, Rigetti offers cloud services in the form of quantum processing units. It also sells access to its quantum systems via a Quantum Computing as a Service business model.

Now, RGTI generates plenty of attention regarding quantum computing stocks to buy because of its tremendous performance. Since the beginning of the year, Rigetti shares have popped more than 64%. In the trailing 52 weeks, it's up almost 175%. However, RGTI is also down 15% in the trailing five sessions, potentially providing speculators with a discount.

Interestingly, Rigetti has delivered some hits and misses in its quarterly disclosures. In Q2 and Q4, the company beat per-share expectations while missing in Q1 and Q3. For fiscal 2024, Rigetti could generate $16.1 million in revenue. If so, that would be 34.1% higher than last year's print of $12.01 million.

It's no wonder, then, that analysts rate RGTI a unanimous strong buy with a $3.25 price target. That implies 115% upside potential.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare. Tweet him at @EnomotoMedia.

Read the original:
3 Quantum Computing Stocks to Buy on the Dip: March 2024 - InvestorPlace

First Business PCs to Protect Firmware Against Quantum Computer Hacks – Manufacturing.net

The potential introduction of quantum computers capable of breaking asymmetric cryptography could place the entire digital world at risk, and this risk is becoming increasingly plausible every day. This led HP to unveil PCs designed to protect firmware against quantum computer attacks. Features include:

Research shows that 27 percent of experts think there is a 50 percent likelihood of a cryptographically relevant quantum computer (CRQC) by 2033. When that day comes, the security of existing digital signatures on firmware and software will be in question. (Read more on anticipating the quantum threat to cryptography here.)

Migrating the entire digital world to a new cryptographic standard is a huge undertaking, and while software can be updated, hardware can't. That includes some of the cryptography that protects PC firmware. With no cryptographic protections in place, no device would be safe: attackers could access and modify the underlying firmware and gain total control.
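Hash-based signatures are one family of quantum-resistant schemes commonly proposed for firmware signing (NIST's LMS and XMSS standards build on the idea). The article doesn't say which scheme HP uses; purely as background, here is a minimal sketch of a Lamport one-time signature, the simplest hash-based construction:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Two random secrets per digest bit; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret per digest bit.
    # NOTE: each key pair may safely sign only ONE message.
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits_of(msg)))

sk, pk = keygen()
sig = sign(b"firmware image v1.0", sk)
assert verify(b"firmware image v1.0", sig, pk)
```

Security here rests only on the hash function's one-wayness, which quantum computers are not known to break efficiently; that is the core appeal of hash-based firmware signatures.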

HP also recommends that customers start to assess how and when to start migrating all other aspects of their information systems to quantum-resistant cryptography. This includes three steps to begin planning:

For further information on our fifth-generation ESC chip availability, click here.

See the article here:
First Business PCs to Protect Firmware Against Quantum Computer Hacks - Manufacturing.net

Revolutionizing Quantum Computing: Breakthroughs in Quantum Error Correction – AZoQuantum

Despite their great potential, quantum computers are delicate devices. Unlike classical bits, qubits (the quantum version of bits) are prone to errors from noise and decoherence. Addressing this challenge, Quantum Error Correction (QEC) is a crucial division of quantum computing development that focuses on resolving qubit errors.


The world of atoms and subatomic particles is governed by the laws of quantum mechanics. Quantum computing harnesses these principles, performing calculations in a completely different way from traditional computers.

Regular computers use bits, which can be either 0 or 1. Quantum computers, however, exploit the bizarre property of superposition, allowing qubits to be 0, 1, or both at the same time. The ability to be in multiple states simultaneously enhances the processing power of quantum computers.

Qubits are made from quantum particles like electrons or photons. By controlling properties like electrical charge or spin, data can be represented as 0, 1, or a combination of both. To unlock the true power of quantum computers, scientists rely on two unique properties: superposition and entanglement.
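To make "a combination of both" slightly more precise: a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal illustration, independent of any particular hardware:

```python
import numpy as np

# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta], dtype=complex)

probs = np.abs(state) ** 2
print(probs)   # [0.5 0.5]: measurement yields 0 or 1 with equal probability
```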

There is no preferred qubit technology; instead, a range of physical systems, such as photons, trapped ions, superconducting circuits, and semiconductor spins, are being investigated for use as qubits.

All these methods face the common challenge of isolating qubits from external noise, making errors during quantum computation inevitable. In contrast, classical computer bits, realized by the on/off states of transistor switches involving billions of electrons, have error margins substantial enough to make errors from physical defects vanishingly rare.

There is no equivalent built-in error protection for quantum computers, where qubits are realized as fragile physical systems. Thus, active error correction is necessary for any quantum computer relying on qubit technology.

In 1995, Peter Shor introduced the first quantum error-correcting method. Shor's approach demonstrated how quantum information could be redundantly encoded by entangling it across a larger system of qubits.

Subsequent findings then showed that, if specific physical requirements on the qubits themselves are satisfied, extensions to this technique can in principle be used to lower the quantum error rate arbitrarily.

While diverse efforts are being undertaken in the field of QEC, the fundamental approach to QEC implementation involves the following steps.

Quantum information is encoded across several physical, distributed qubits. These qubits act as 'information holders' for a 'logical qubit,' which is more robust and contains the data used for computation.

Additional physical qubits are then entangled with the information holders according to a specific QEC code; these ancilla qubits serve as sentinels for the logical qubit.

QEC identifies errors in the encoded data by measuring the sentinel qubits in a way that does not directly disturb the data in the logical qubit. This measurement produces a pattern of results, the syndrome, that indicates the type and location of the error.

Different QEC codes are available for the various types of errors that could occur. Based on the detected error, the chosen QEC system applies an operation to correct the error in the data qubits.

Error correction itself has the potential to generate noise. Therefore, additional physical qubits are required to maintain the delicate balance of correcting errors and limiting the introduction of new ones.
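The full loop above can be illustrated with the classic three-qubit bit-flip code, simulated classically for clarity (a deliberately simplified sketch; real QEC codes must also handle phase errors and noisy measurements):

```python
import random

def encode(bit: int) -> list:
    return [bit, bit, bit]          # one logical bit spread over 3 carriers

def add_noise(codeword: list, p: float = 0.1) -> list:
    return [b ^ (random.random() < p) for b in codeword]  # independent flips

def syndrome(codeword: list) -> tuple:
    # Pairwise parity checks, analogous to measuring the sentinel qubits:
    # they locate an error without reading out the logical value itself.
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword: list) -> list:
    which = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(codeword))
    if which is not None:
        codeword[which] ^= 1        # apply the recovery operation
    return codeword

word = add_noise(encode(1))
print(word, "->", correct(word))    # any single flip is repaired
```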

To realize the full potential of a quantum computer, the number of logical qubits has to be increased. However, since each logical qubit requires several physical qubits for error correction, the complexity and resources needed to isolate and manage high-quality qubits become considerable obstacles to scalability.

In recent years, quantum error correction has seen significant advancements, and the community's focus has shifted from noisy applications to the potential uses of early error-corrected quantum computers. Though research on superconducting circuits, reconfigurable atom arrays, and trapped ions has made significant strides, several platform-specific technological obstacles remain to be solved.

Some notable recent advancements in QEC include:

Despite the challenges, QEC is essential for building large-scale, fault-tolerant quantum computers. Researchers are constantly developing new and improved QEC codes and techniques.

As quantum technology progresses, QEC will play a critical role in unlocking the true potential of this revolutionary field.

More from AZoQuantum: Harnessing Quantum Computing for Breakthroughs in Artificial Intelligence

Disclaimer: The views expressed here are those of the author expressed in their private capacity and do not necessarily represent the views of AZoM.com Limited T/A AZoNetwork the owner and operator of this website. This disclaimer forms part of the Terms and conditions of use of this website.

View original post here:
Revolutionizing Quantum Computing: Breakthroughs in Quantum Error Correction - AZoQuantum

Alice & Bob and partners awarded €16.5M to slash quantum computing costs – Tech.eu

Quantum computing hardware developer Alice & Bob and academic partners ENS de Lyon and Mines Paris-PSL today announced the receipt of a €16.5 million innovation grant, a France 2030 initiative operated by Bpifrance, France's public investment bank.

Alice & Bob is based in Paris and Boston and is working to create the first universal, fault-tolerant quantum computer. Founded in 2020, the company has raised €30 million in funding, hired over 95 employees and demonstrated experimental results surpassing those of technology giants such as Google or IBM.

Alice & Bob specialises in cat qubits, a pioneering technology developed by the company's founders and later adopted by Amazon.

The funded project, called "Cat Factory," brings together industry, government, and academic partners to tackle quantum computing's critical issues across various enabling technologies, including nanofabrication, chip design and validation, digital tools and electronic control.

The research from the three partners aims to accelerate quantum computing by enhancing the efficiency of the entire stack, reducing costs, and accelerating market readiness.

The goal is to develop a new optimised architecture for fault-tolerant quantum computing by 2027 that will allow the following:

Reduction of the number of control lines per cat qubit from 4.5 to 2

Reduction of the readout lines per cat qubit from 1 to 0.2

To achieve this degree of optimisation, the infrastructure of enabling technologies surrounding the Quantum Processing Unit (QPU) will be updated to:

Increase the number of analogue ports per rack from 60 to 180, dividing the footprint of control electronics by three

Increase the number of control lines per cryostat from 200 to 2000 using next-generation cabling technology

According to Theau Peronnin, CEO of Alice & Bob:

"We are honoured to be entrusted with making quantum computing useful earlier. Our plan, centred around cat qubits, addresses the real challenges of quantum computing headfirst, enabling massive savings in energy and end-user costs."

The projected cost of quantum computation due to cryogenics and the control of large sets of qubits presents a barrier to widespread adoption. Alice and Bob will use the funding to optimise quantum computation, from design to manufacturing and infrastructure, to make quantum computers ten times cheaper to build and ready for market three years earlier.

"Quantum computing algorithms require hundreds of logical qubits, which translates to thousands to millions of physical qubits," said Florent Di Meglio, the project's lead at Mines ParisPSL.

"Cat Factory aims to reach 100 logical qubits with only three cryostats, a dramatic reduction in the hardware needed for running a useful quantum computer."

To achieve this goal, the partners will work on the whole quantum computer architecture and the infrastructure of enabling technologies surrounding it. The project's cornerstone, the cat qubit, already reduces the number of physical qubits required to build a logical one by a factor of 60.

Paul-François Fournier, Executive Director of Innovation at Bpifrance, shared:

"We are delighted to support Alice & Bob in its development, which aims to accelerate quantum computing's progress. This support reflects Bpifrance's strong ambitions in terms of disruptive innovation."

Lead image: Alice & Bob.

Read the original post:
Alice & Bob and partners awarded 16.5M to slash quantum computing costs - Tech.eu

Chattanooga State Takes A Quantum Leap With Expert Insights Into The Future Of Computing – Chattanooga Pulse

Chattanooga State Community College will take a deep dive into the world of quantum computing alongside physics and computer science expert Dr. Shohini Ghose.

On April 3, Dr. Ghose will join ChattState students, faculty, and members of the Chattanooga community virtually to discuss the emerging field of quantum computing. The event will kick off at 9:30 a.m. with coffee and pastries followed by Dr. Ghose's presentation at 10 a.m. The lecture is titled "Preparing for Quantum 2.0."

The reception and lecture will be held in BRANCH 30 on the ChattState main campus. A live stream of the lecture will also be accessible on YouTube.

Dr. Ghose is a professor of Physics and Computer Science at Wilfrid Laurier University. Her areas of expertise include quantum computing, quantum communication, and classical and quantum chaos.

The event is hosted by ChattState's Honors Program, said organizer Dr. Angie Wood, professor of social and behavioral sciences and director of ChattState Honors.

Dr. Wood said she challenged honors students last year to research the field of quantum computing and find an expert to speak on campus. They ultimately chose to invite Dr. Ghose after viewing her TED Talk titled "A Beginner's Guide to Quantum Computing."

Dr. Wood said offering educational opportunities outside of the classroom is one way ChattState gives students opportunities to further their futures and broaden their horizons.

"College is about more than just going to class. It's also about networking and the contacts that you gain outside of the classroom," she said. "You never know when you will walk into an event like this and realize it's what you're passionate about."

The event is presented alongside the Chattanooga Quantum Collaborative, a local organization that aims to use quantum technology to improve the local workforce, economy, and infrastructure.

The organization is supported by several founding members: EPB, the City of Chattanooga, TVA, UTC, Oak Ridge National Lab, Hamilton County, ChattState, Chattanooga Area Chamber of Commerce, Company Lab, Hamilton County Schools, and Qubitekk.

Originally posted here:
Chattanooga State Takes A Quantum Leap With Expert Insights Into The Future Of Computing - Chattanooga Pulse

Quantum startup Alice & Bob receives innovation grant to make quantum computing cheaper – DatacenterDynamics

Quantum startup Alice & Bob has received a €16.5 million ($17.8 million) innovation grant from France's public investment bank, Bpifrance.

The French startup is the first company to receive all four innovation grants from Bpifrance and the funds will be used to finance a 36-month project that will see the company enhance the efficiency of its quantum stack, reducing manufacturing costs and time to market.

The deal has been endorsed by the office of Prime Minister Gabriel Attal.

Dubbed "Cat Factory" after the company's cat qubit technology, the project will bring together industry, government, and academic partners from ENS de Lyon and Mines Paris (PSL) to address a number of critical issues related to quantum computing, including nanofabrication, chip design and validation, digital tools, and electronic control.

The team is aiming to develop a new optimized architecture for fault-tolerant quantum computing by 2027 that will allow a reduction in the number of control lines per cat qubit from 4.5 to 2, and a reduction in the readout lines per cat qubit from 1 to 0.2.

In order to achieve this, the Quantum Processing Unit (QPU) architecture will be updated to increase the number of analog ports per rack from 60 to 180, and to increase the number of control lines per cryostat from 200 to 2000.

"Quantum computing algorithms require hundreds of logical qubits, which translates to thousands to millions of physical qubits," said Florent Di Meglio, lead on the project at Mines Paris (PSL). "Cat Factory aims to reach 100 logical qubits with only 3 cryostats, a dramatic reduction in the hardware needs for running a useful quantum computer."

Theau Peronnin, CEO of Alice & Bob, added: "We are honored to be entrusted with the task of making quantum computing useful earlier. Our plan, centered around cat qubits, addresses the real challenges of quantum computing headfirst, enabling massive savings in energy and end-user costs."

Founded in 2020, Alice & Bob has already raised €30 million across six funding rounds to develop a fault-tolerant quantum processor.

Earlier this month, the company was given access to €500 million in funding after being invited by the French Ministry for Armed Forces to participate in the country's PROQCIMA initiative.

The project's participants will work to deliver a universal fault-tolerant quantum computer demonstrator with 128 logical qubits by 2030, and its industrialization into a 2,048-logical-qubit computer by 2035.

Follow this link:
Quantum startup Alice & Bob receives innovation grant to make quantum computing cheaper - DatacenterDynamics