Category Archives: Quantum Computer

Are quantum computers good at picking stocks? This project tried to find out – ZDNet

The researchers ran a model for portfolio optimization on Canadian company D-Wave's 2,000-qubit quantum annealing processor.

Consultancy firm KPMG, together with a team of researchers from the Technical University of Denmark (DTU) and a yet-to-be-named European bank, has been piloting the use of quantum computing to determine which stocks to buy and sell for maximum return, an age-old banking operation known as portfolio optimization.

The researchers ran a model for portfolio optimization on Canadian company D-Wave's 2,000-qubit quantum annealing processor, comparing the results to those obtained with classical means. They found that the quantum annealer performed better and faster than other methods and was capable of resolving larger problems, although the study also indicated that D-Wave's technology still has some issues with ease of programming and scalability.

The smart distribution of portfolio assets is a problem that stands at the very heart of banking. Theorized by economist Harry Markowitz as early as 1952, it consists of allocating a fixed budget to a collection of financial assets in a way that will produce as much return as possible over time. In other words, it is an optimization problem: an investor should look to maximize gain and minimize risk for a given financial portfolio.

As the number of assets in the portfolio multiplies, the difficulty of the calculation exponentially increases, and the problem can quickly become intractable, even to the world's largest supercomputers. Quantum computing, on the other hand, offers the possibility of running multiple calculations at once thanks to a special quantum state that is adopted by quantum bits, or qubits.

Quantum systems, for now, cannot support enough qubits to have a real-world impact. But in principle, large-scale quantum computers could one day solve complex portfolio optimization problems in a matter of minutes, which is why the world's largest banks are already putting their research teams to work on developing quantum algorithms.

To translate Markowitz's classical model for the portfolio selection problem into a quantum algorithm, the DTU's researchers formulated the equation into a quantum model called a quadratic unconstrained binary optimization (QUBO) problem, which they based on the usual criteria used for the operation such as budget and expected return.
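
The general shape of such a formulation can be sketched in a few lines. The snippet below is a generic illustration only, not the DTU/KPMG model: it builds a QUBO matrix from made-up expected returns, a made-up covariance matrix, and a penalty that enforces a budget on the number of assets selected, then checks it by brute force (which an annealer would replace by sampling low-energy configurations).

```python
import numpy as np

# Illustrative sketch: cast a Markowitz-style asset-selection problem as a QUBO.
# x_i in {0, 1} decides whether asset i is included. The matrix Q encodes risk
# (covariance), expected return, and a penalty enforcing the budget.
# All numbers below are invented for illustration.
n_assets = 5
rng = np.random.default_rng(0)
mu = rng.uniform(0.01, 0.10, n_assets)          # expected returns (made up)
A = rng.normal(size=(n_assets, n_assets))
sigma = A @ A.T / n_assets                      # a positive semi-definite covariance
budget = 2                                      # how many assets to select
risk_aversion, penalty = 1.0, 10.0

# Objective: minimize  risk_aversion * x^T Sigma x - mu^T x + penalty * (sum(x) - budget)^2
Q = risk_aversion * sigma
Q -= np.diag(mu)                                # linear return term on the diagonal
Q += penalty * (np.ones((n_assets, n_assets)) - 2 * budget * np.eye(n_assets))

def qubo_energy(x, Q):
    return x @ Q @ x

# Brute-force check, feasible only for tiny n; an annealer samples low-energy x instead.
best = min((tuple(map(int, np.binary_repr(i, n_assets))) for i in range(2 ** n_assets)),
           key=lambda x: qubo_energy(np.array(x), Q))
print("lowest-energy selection:", best)
```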

When deciding which quantum hardware to pick to test their model, the team was faced with a number of options: IBM and Google are both working on a superconducting quantum computer, while Honeywell and IonQ are building trapped-ion devices; Xanadu is looking at photonic quantum technologies, and Microsoft is creating a topological quantum system.

D-Wave's quantum annealing processor is yet another approach to quantum computing. Unlike other systems, which are gate-based quantum computers, it is not possible to control the qubits in a quantum annealer; instead, D-Wave's technology consists of manipulating the environment surrounding the system, and letting the device find a "ground state". In this case, the ground state corresponds to the most optimal portfolio selection.

This approach, while limiting the scope of the problems that a quantum annealer can resolve, also enables D-Wave to work with many more qubits than other devices. The company's latest device counts 5,000 qubits, while IBM's quantum computer, for example, supports fewer than 100 qubits.

The researchers explained that the maturity of D-Wave's technology prompted them to pick quantum annealing to trial the algorithm; and equipped with the processor, they were able to embed and run the problem for up to 65 assets.

To benchmark the performance of the processor, they also ran the Markowitz equation with classical means, called brute force. With the computational resources at their disposal, brute force could only be used for up to 25 assets, after which the problem became intractable for the method.
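
The rough arithmetic behind that ceiling is easy to reproduce: exhaustive search over n assets means evaluating every one of the 2^n possible selections, which is manageable at 15 assets, already tens of millions at 25, and astronomically large at 65.

```python
# Why brute force stops around 25 assets: checking every subset of n assets
# means evaluating 2**n candidate portfolios.
for n in (15, 25, 65):
    print(f"{n} assets -> {2**n:.3e} portfolios to evaluate")
# 15 assets -> 3.277e+04
# 25 assets -> 3.355e+07
# 65 assets -> 3.689e+19   (hopeless for exhaustive search)
```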

Comparing the two methods, the scientists found that the quality of the results provided by D-Wave's processor was equal to that delivered by brute force, showing that quantum annealing can reliably be used to solve the problem. In addition, as the number of assets grew, the quantum processor overtook brute force as the fastest method.

From 15 assets onwards, D-Wave's processor effectively started showing significant speed-up over brute force, as the problem got closer to becoming intractable for the classical computer.

To benchmark the performance of the quantum annealer for more than 25 assets, which is beyond the capability of brute force, the researchers compared the results obtained with D-Wave's processor to those obtained with a method called simulated annealing. There again, the study shows, the quantum processor provided high-quality results.

Although the experiment suggests that quantum annealing might show a computational advantage over classical devices, Ulrich Busk Hoff, a researcher at DTU who participated in the project, warns against hasty conclusions.

"For small-sized problems, the D-Wave quantum annealer is indeed competitive, as it offers a speed-up and solutions of high quality," he tells ZDNet. "That said, I believe that the study is premature for making any claims about an actual quantum advantage, and I would refrain from doing that. That would require a more rigorous comparison between D-Wave and classical methods and using the best possible classical computational resources, which was far beyond the scope of the project."

DTU's team also flagged some scalability issues, highlighting that as the portfolio size increased, there was a need to fine-tune the quantum model's parameters in order to prevent a drop in results quality. "As the portfolio size was increased, a degradation in the quality of the solutions found by quantum annealing was indeed observed," says Hoff. "But after optimization, the solutions were still competitive and were more often than not able to beat simulated annealing."

In addition, with the quantum industry still largely in its infancy, the researchers pointed to the technical difficulties that still come with using quantum technologies. Implementing quantum models, they explained, requires a new way of thinking; translating classical problems into quantum algorithms is not straightforward, and even D-Wave's fairly accessible software development kit cannot be described yet as "plug-and-play".

The Canadian company's quantum processor nevertheless shows a lot of promise for solving problems such as portfolio optimization. Although the researchers shared doubts that quantum annealing would have as much of an impact as large-scale gate-based quantum computers, they pledged to continue to explore the capabilities of the technology in other fields.

"I think it's fair to say that D-Wave is a competitive candidate for solving this type of problem and it is certainly worthwhile further investigation," says Hoff.

KPMG, DTU's researchers and large banks are far from alone in experimenting with D-Wave's technology for near-term applications of quantum computing. For example, researchers from pharmaceutical company GlaxoSmithKline (GSK) recently trialed the use of different quantum methods to sequence gene expression, and found that quantum annealing could already compete against classical computers to start addressing life-sized problems.

Read the original post:
Are quantum computers good at picking stocks? This project tried to find out - ZDNet

Quantum computing is finally having something of a moment – World Finance

Author: David Orrell, Author and Economist

March 16, 2021

In 2019, Google announced that they had achieved quantum supremacy by showing they could run a particular task much faster on their quantum device than on any classical computer. Research teams around the world are competing to find the first real-world applications and finance is at the very top of this list.

However, quantum computing may do more than change the way that quantitative analysts run their algorithms. It may also profoundly alter our perception of the financial system, and the economy in general. The reason for this is that classical and quantum computers handle probability in a different way.

The quantum coin

In classical probability, a statement can be either true or false, but not both at the same time. In mathematics-speak, the rule for determining the size of some quantity is called the norm. In classical probability, the norm, denoted the 1-norm, is just the magnitude: if the probability is 0.5, then that is the size.

The next-simplest norm, known as the 2-norm, works for a pair of numbers, and is the square root of the sum of squares. The 2-norm therefore corresponds to the distance between two points on a 2-dimensional plane, instead of a 1-dimensional line, hence the name. Since mathematicians love to extend a theory, a natural question to ask is what rules for probability would look like if they were based on this 2-norm.

For one thing, we could denote the state of something like a coin toss by a 2-D diagonal ray of length 1. The probability of heads is given by the square of the horizontal extent, while the probability of tails is given by the square of the vertical extent. By the Pythagorean theorem, the sum of these two numbers equals 1, as expected for a probability. If the coin is perfectly balanced, then the line should be at 45 degrees, so the chances of getting a heads or tails are identical. When we toss the coin and observe the outcome, the ambiguous state collapses to either heads or tails.

Because the norm of a quantum probability depends on the square, one could also imagine cases where the probabilities were negative. In classical probability, negative probabilities don't make sense: if a forecaster announced a negative 30 percent chance of rain tomorrow, we would think they were crazy. However, in a 2-norm, there is nothing to prevent negative probabilities occurring. It is only in the final step, when we take the magnitude into account, that negative probabilities are forced to become positive. If we're going to allow negative numbers, then for mathematical consistency we should also permit complex numbers, which involve the square root of negative one. Now it's possible we'll end up with a complex number for a probability; however, the 2-norm of a complex number is a positive number (or zero).

To summarise, classical probability is the simplest kind of probability, which is based on the 1-norm and involves positive numbers. The next-simplest kind of probability uses the 2-norm, and includes complex numbers. This kind of probability is called quantum probability.
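
A tiny numerical illustration of that rule, with made-up amplitudes: the state of the "quantum coin" is a pair of (possibly negative or complex) amplitudes, and probabilities are obtained as their squared magnitudes, which always come out non-negative and sum to one.

```python
import numpy as np

# Illustrative only: a "quantum coin" state is a length-1 ray of two amplitudes.
# Amplitudes may be negative or complex; probabilities are their squared magnitudes.
state = np.array([1 / np.sqrt(2), -1j / np.sqrt(2)])   # made-up amplitudes

probs = np.abs(state) ** 2            # the 2-norm rule turns amplitudes into probabilities
print(probs)                          # [0.5 0.5]
print(np.isclose(probs.sum(), 1.0))   # the ray has length 1, so probabilities sum to 1
```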

Quantum logic

In a classical computer, a bit can take the value of 0 or 1. In a quantum computer, the state is represented by a qubit, which in mathematical terms describes a ray of length 1. Only when the qubit is measured does it give a 0 or 1. But prior to measurement, a quantum computer can work in the superposed state, which is what makes them so powerful.

So what does this have to do with finance? Well, it turns out that quantum algorithms behave in a very different way from their classical counterparts. For example, many of the algorithms used by quantitative analysts are based on the concept of a random walk. This assumes that the price of an asset such as a stock varies in a random way, taking a random step up or down at each time step. It turns out that the magnitude of the expected change increases with the square-root of time.

Quantum computing has its own version of the random walk, which is known as the quantum walk. One difference is the expected magnitude of change, which grows much faster (linearly with time). This feature matches the way that most people think about financial markets. After all, if we think a stock will go up by eight percent in a year then we will probably extend that into the future as well, so the next year it will grow by another eight percent. We don't think in square-roots.
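
A rough numerical sketch of that contrast, assuming the textbook Hadamard-coined discrete-time quantum walk on a line (one standard formulation, not tied to any particular financial model): the standard deviation of the classical walk grows like the square root of the number of steps, while the quantum walk spreads roughly linearly.

```python
import numpy as np

def classical_spread(steps, trials=20000, seed=0):
    # Standard deviation of a +/-1 random walk after `steps` steps (~ sqrt(steps)).
    rng = np.random.default_rng(seed)
    return rng.choice([-1, 1], size=(trials, steps)).sum(axis=1).std()

def quantum_spread(steps):
    # Hadamard-coined discrete-time quantum walk on a line (~ linear spread).
    size = 2 * steps + 1                              # positions -steps..steps
    amp = np.zeros((size, 2), dtype=complex)          # amplitude[position, coin]
    amp[steps] = [1 / np.sqrt(2), 1j / np.sqrt(2)]    # symmetric initial coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard coin flip
    for _ in range(steps):
        amp = amp @ H.T                               # flip the coin at every position
        shifted = np.zeros_like(amp)
        shifted[1:, 0] = amp[:-1, 0]                  # coin 0 moves right
        shifted[:-1, 1] = amp[1:, 1]                  # coin 1 moves left
        amp = shifted
    probs = (np.abs(amp) ** 2).sum(axis=1)            # Born rule over positions
    x = np.arange(size) - steps
    mean = (probs * x).sum()
    return np.sqrt((probs * (x - mean) ** 2).sum())

for t in (25, 100):
    print(t, round(classical_spread(t), 1), round(quantum_spread(t), 1))
# The classical spread grows like sqrt(t); the quantum-walk spread grows roughly linearly in t.
```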

This is just one way in which quantum models seem a better fit to human thought processes than classical ones. The field of quantum cognition shows that many of what behavioural economists call paradoxes of human decision-making actually make perfect sense when we switch to quantum probability. Once quantum computers become established in finance, expect quantum algorithms to get more attention, not for their ability to improve processing times, but because they are a better match for human behaviour.

View post:
Quantum computing is finally having something of a moment - World Finance

How and when quantum computers will improve machine learning? – Medium

The different strategies toward quantum machine learning

They say you should start an article with a cool fancy image. Google 72-qubit chip Sycamore. Photo: Google

There is a strong hope (and hype) that Quantum Computers will help machine learning in many ways. Research in Quantum Machine Learning (QML) is a very active domain, and many small and noisy quantum computers are now available. Different approaches exist, for both the long term and the short term, and we may wonder what their respective hopes and limitations are, both in theory and in practice.

It all started in 2009 with the publication of the HHL algorithm [1], proving an exponential acceleration for matrix multiplication and inversion, which triggered exciting applications in all linear algebra-based science, hence machine learning. Since then, many algorithms have been proposed to speed up tasks such as classification [2], dimensionality reduction [3], clustering [4], recommendation systems [5], neural networks [6], kernel methods [7], SVM [8], reinforcement learning [9], and more generally optimization [10].

These algorithms are what I call Long Term or Algorithmic QML. They are usually carefully detailed, with guarantees proven as mathematical theorems. We can (theoretically) know the amount of speedup compared to the classical algorithms they reproduce, which is often polynomial or even exponential with respect to the number of input data points. They come with precise bounds on the results' probability, randomness, and accuracy, as usual in computer science research.

While they constitute theoretical proof that a universal and fault-tolerant quantum computer would provide impressive benefits in ML, early warnings [11] showed that some underlying assumptions were very constraining.

These algorithms often require loading the data with a Quantum Random Access Memory, or QRAM [12], a bottleneck part without which exponential speedups are much more complex to obtain. Besides, they sometimes need long quantum circuits and many logical qubits (which, due to error correction, are themselves composed of many more physical qubits), that might not be arriving soon enough.

When exactly? When we reach the Universal Fault-Tolerant Quantum Computer, predicted by Google for 2029, or by IonQ in only five years. More conservative opinions claim this will not happen for 20+ years, and some even say we will never reach that point. The future will tell!

More recently, a mini earthquake amplified by scientific media has cast doubt on the efficiency of Algorithmic QML: the so-called dequantization papers [13], which introduced classical algorithms inspired by the quantum ones to obtain similar exponential speedups, in the field of QML at least. This impressive result was then tempered by the fact that the equivalent speedup only concerns the number of data points, and comes at the cost of a terrible polynomial slowdown with respect to other parameters, for now. This makes these quantum-inspired classical algorithms currently unusable in practice [14].

In the meantime, something very exciting happened: actual quantum computers were built and became accessible. You can play with noisy devices made of 5 to 20 qubits, and soon more. Quite recently, Google performed a quantum circuit with 53 qubits [15], the first that could not be efficiently simulated by a classical computer.

Researchers have since been looking at new models that these noisy intermediate-scale quantum (NISQ) computers could actually run [16]. They are all based on the same idea: variational quantum circuits (VQC), inspired by classical machine learning.

The main difference with algorithmic QML is that the circuit is not implementing a known classical ML algorithm. One simply hopes that the chosen circuit will converge to successfully classify data or predict values. For now, there are several types of circuits in the literature [17], and we are starting to see interesting patterns in their successes. The problem itself is often encoded in the loss function we try to decrease: we sum the errors made compared to the true values or labels, or compared to the quantum states we aim for, or to the energy levels, and so on, depending on the task. Active research tries to understand why some circuits work better than others on certain tasks, and why quantumness would help.
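
Schematically, and without reference to any particular framework, the training setup looks like the toy sketch below: a parameterized "circuit" (here a single, made-up one-qubit rotation) produces a prediction, and the loss sums the squared errors against the labels.

```python
import numpy as np

# Schematic sketch (not a specific framework or published model): a tiny
# "variational circuit" with one parameter theta acts as the model, and the
# loss sums its errors over labelled data. Data and encoding are hypothetical.
def circuit_prediction(theta, x):
    # Hypothetical encoding: rotate a single qubit by (x + theta) and measure <Z>.
    state = np.array([np.cos((x + theta) / 2), np.sin((x + theta) / 2)])
    return state[0] ** 2 - state[1] ** 2          # expectation of Pauli-Z, in [-1, 1]

def loss(theta, data, labels):
    # The problem is encoded in the loss: sum of squared errors against true labels.
    return sum((circuit_prediction(theta, x) - y) ** 2 for x, y in zip(data, labels))

data, labels = [0.1, 2.9], [1.0, -1.0]            # made-up training points
thetas = np.linspace(-np.pi, np.pi, 200)
best = min(thetas, key=lambda t: loss(t, data, labels))
print(best, loss(best, data, labels))             # crude grid search stands in for a real optimizer
```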

Another core difference is that many providers [18, 19, 20] allow you to program these VQC so you can play and test them on actual quantum computers!

In recent years, researchers have tried to find use cases where Variational QML would succeed at classical problems, or even outperform the classical solutions [21, 22]. Some hope that the variational nature of the training confers some resilience to hardware noise. If this happens to be the case, it would be beneficial not to wait for Error Correction models that require many qubits. One would only need Error Mitigation techniques to post-process the measurements.

On the theoretical side, researchers hope that quantum superposition and entangling quantum gates would project data into a much bigger space (the Hilbert space of n qubits has dimension 2^n) where some classically inaccessible correlations or separations can be achieved. Said differently, some believe that the quantum model will be more expressive.

It is important to notice that research on Variational QML is less focused on proving computational speedups. The main interest is to reach a more expressive or complex state of information processing. The two approaches are related but they represent two different strategies. Unfortunately, less is proven compared to Algorithmic QML, and we are far from understanding the theoretical reasons that would prove the advantage of these quantum computations.

Of course, due to the limitations of current quantum devices, experiments are often made on a small number of qubits (4 qubits in the above graph) or on simulators, which are often idealized or limited to around 30 qubits. It is hard to predict what will happen when the number of qubits grows.

Despite the excitement, VQCs also suffer from theoretical obstacles. It is proven that when the number of qubits or the number of gates becomes too big, the optimization landscape becomes flat, hindering the ability to optimize the circuit. Many efforts are made to circumvent this issue, called Barren Plateaus [23], by using specific circuits [24] or smart initialization of the parameters [25].

But Barren Plateaus are not the only caveat. In many optimization methods, one must compute the gradient of the cost function with respect to each parameter. Said differently, we want to know how much the model improves when each parameter is modified. In classical neural networks, computing the gradients is usually done using backpropagation, because we analytically understand the operations. With VQCs, the operations become too complex, and we cannot access intermediate quantum states (without measuring, and therefore destroying, them).

The current state-of-the-art solution is called the parameter-shift rule [27, 28] and requires applying the circuit and measuring its result two times for each parameter. By comparison, in classical deep learning, the network is applied just once forward and once backward to obtain all the thousands or millions of gradients. We could parallelize the parameter-shift rule over many simulators or quantum devices, but this could be limited for a large number of parameters.
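
A minimal numerical check of the rule for a single rotation parameter, assuming the standard pi/2 shift: two evaluations of the expectation value reproduce the analytic derivative exactly.

```python
import numpy as np

# Minimal illustration of the parameter-shift rule for one rotation parameter:
# for an expectation value f(theta) = <Z> after an RY(theta) rotation on |0>,
#   df/dtheta = [ f(theta + pi/2) - f(theta - pi/2) ] / 2,
# i.e. two extra circuit evaluations per parameter, instead of backpropagation.
def expectation(theta):
    return np.cos(theta)          # <Z> after RY(theta) applied to |0>

theta = 0.7
shift = np.pi / 2
grad_shift = (expectation(theta + shift) - expectation(theta - shift)) / 2
grad_exact = -np.sin(theta)       # analytic derivative, for comparison
print(grad_shift, grad_exact)     # the two values agree
```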

Finally, researchers tend to focus more and more on the importance of loading data into a quantum state [29], also called a feature map [30]. Without the ideal amplitude encoding obtained with a QRAM, there are doubts that we will be able to load and process high-dimensional classical data with an exponential, or even high polynomial, advantage. Some hope remains for data-independent tasks such as generative models [21, 31] or solving partial differential equations.

Note that the expression Quantum Neural Networks has been used to highlight the similarities with classical Neural Network (NN) training. However, they are not equivalent, since VQCs don't have the same hidden-layer architecture, nor natural non-linearities, unless a measurement is performed. And there's no simple rule to convert any NN to a VQC or vice versa. Some now prefer to compare VQCs to kernel methods [30].

We now have a better understanding of the advantages and weaknesses of the two main strategies towards quantum machine learning. Current research is now focused on two aspects:

Finally, and most importantly, improve the quantum devices! We all hope for constant incremental improvements, or a paradigm shift, in the quality of the qubits, their number, and the error-correction process, in order to reach powerful enough machines. Please, physicists, can you hurry?

PS: let's not forget to use all this amazing science to do good things that will benefit everyone.

Jonas Landman is a Ph.D. student at the University of Paris under the supervision of Prof. Iordanis Kerenidis. He is Technical Advisor at QC Ware and member of QuantX. He has previously studied at Ecole Polytechnique and UC Berkeley.

See more here:
How and when quantum computers will improve machine learning? - Medium

After the Govt’s Big Allocation on Quantum Technologies in 2020, What Next? – The Wire Science

Photograph of a quantum computing chip that a Google team used in their claimed quantum computer. Photo: Nature 574, 505-510 (2019).

The Union finance ministry presented the national budget for 2021 one and a half months ago. One of the prime motivations of a nationalist government should be cyber-security, and it is high time we revisited this technological space in the context of this budget and the last one.

One of the highlights of the 2020 budget was the government's new investment in quantum computing. Finance minister Nirmala Sitharaman's words then turned the heads of researchers and developers working in this area: "It is proposed to provide an outlay of 8,000 crore rupees over a period of five years for the National Mission on Quantum Technologies and Applications."

Thanks to the pandemic, it is not clear how much funding the government transferred in the first year. The 2021 budget speech made no reference to quantum technologies.

It's important we discuss this topic from a technological perspective. Around four decades ago, physicist Richard Feynman pointed out the possibility of devices like quantum computers in a famous speech. In the 1990s, Peter Shor and others proved that such computers could easily factor the product of two large prime numbers, a task deemed very difficult for the classical computers we are familiar with. This problem, of prime factorisation, underlies the utility of public-key crypto-systems, used to secure digital transactions, sensitive information, etc. online.

If we have a practicable quantum computer, the digital security systems currently in use around the world will break down quickly, including that of financial institutions. But commercial quantum computers are still many years away.

On this count, the economically developed nations are on average far ahead of others. Countries like the US, Canada, Australia and China have already made many advancements towards building usable quantum computers with meaningful capabilities. Against this background, the present government's decision in February 2020 to invest such a large sum in quantum technologies was an outstanding development.

The problem now lies with distributing the money and achieving the actual technological advances. So far, there is no clear evidence of this in the public domain.

A logical step in this direction would be to re-invest a large share of the allocation in indigenous development. This is also where the problems lie. One must understand that India has never been successful in fabricating advanced electronic equipment. While we have very good software engineers and theoretical computer scientists, there is no proven expertise in producing chips and circuits. We might have some limited exposure in assembling and testing but nothing beyond that.

So while Atmanirbhar Bharat is an interesting idea, it will surely take a very long time before we find ourselves able to compete with developed nations vis-à-vis seizing on this extremely sophisticated technology involving quantum physics. In the meantime, just as we import classical computers and networking equipment, so should we proceed by importing quantum equipment, until our indigenous capability in this field matures to a certain extent.

For example, demonstrating a four-qubit quantum system or designing a proof-of-concept quantum key distribution (QKD) circuit might be a nice textbook assignment. However, the outcome will not be nearly competitive with products already available in the international arena. IBM and Google have demonstrated the use of machines with more than 50 qubits. (These groups have participation from Indian scientists working abroad.) IBM has promised a thousand-qubit machine by 2023. ID Quantique has been producing commercial QKD equipment for more than five years.

India must procure such finished products and start testing them for security trapdoors before deploying them at home. Doing so requires us to train our engineers with state-of-the-art equipment as soon as possible.

In sum, indigenous development shouldn't be discontinued, but allocating a large sum of money for indigenous development alone may not bring the desired results at this point.

By drafting a plan in the 2020 Union budget to spend Rs 8,000 crore, the government showed that it was farsighted. While the COVID-19 pandemic has made it hard to assess how much of this money has already been allocated, we can hope that there will be renewed interest in the matter as the pandemic fades.

This said, such a huge allocation going to academic institutes and research laboratories for trivial demonstrations might be imprudent. In addition, we must begin by analysing commercially available products, made by international developers, so we can secure Indias security infrastructure against quantum adversaries.

Serious science requires deep political thought, people with strong academic commitment in the government and productive short- as well as long-term planning. I hope the people in power will enable the Indian community of researchers to make this quantum leap.

Subhamoy Maitra is a senior professor at the Indian Statistical Institute, Kolkata. His research interests are cryptology and quantum computing.

Continued here:
After the Govt's Big Allocation on Quantum Technologies in 2020, What Next? - The Wire Science

Quantum computing: Honeywell just quadrupled the power of its computer – ZDNet

The System Model H1, a ten-qubit quantum computer, has reached a quantum volume of 512.

Honeywell's quantum scientists have quadrupled the capabilities of the company's quantum computer, with the device achieving record levels of performance less than a year after the first generation of the system was released.

The System Model H1, a ten-qubit quantum computer, effectively reached a quantum volume of 512, four times as much as was attained in the previous tweak of the system, which saw the H1 reach a quantum volume of 128.

Released commercially last June (at the same time as the System Model H0), the H1 makes use of trapped ions, unlike IBM's and Google's devices, which are built with superconducting qubits. Honeywell's new record is eight times as much as was achieved with the System Model H0, which launched with a quantum volume of 64.

Quantum volume is a concept that IBM developed in 2017 as a way of measuring various aspects of a quantum computer's performance; in simple terms, the higher the quantum volume, the higher the potential for resolving real-world problems across industry and research. Designed to be independent of the architecture of any given quantum computer, quantum volume can measure any system that runs quantum circuits.

For example, one measurement that is indicative of a quantum computer's capabilities is qubit fidelity, which is critical to understanding how well a device can implement quantum code. According to Honeywell, the average single-qubit gate fidelity in the latest version of the H1 was 99.991%.

The final number that determines quantum volume is an aggregate of many other measurements and tests of a single quantum system's operations: they include the number of physical qubits in the quantum computer, but also the device's error rate, and connectivity, which reflects the extent to which qubits can be fully connected to each other within the device.
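
In rough outline (and with invented numbers), the published protocol runs random "square" circuits of width and depth n, counts a width as passing if its heavy-output probability exceeds two-thirds, and reports 2^n for the largest passing n:

```python
# Rough sketch of how a quantum-volume figure is read off benchmark results,
# loosely following the published protocol: square random circuits of width n and
# depth n "pass" if their heavy-output probability exceeds 2/3 (with statistical
# confidence, omitted here). The success probabilities below are invented.
heavy_output_prob = {2: 0.85, 4: 0.80, 6: 0.76, 8: 0.71, 9: 0.69, 10: 0.64}

passing = [n for n, p in heavy_output_prob.items() if p > 2 / 3]
quantum_volume = 2 ** max(passing)
print(quantum_volume)   # 512 for these made-up results (largest passing width n = 9)
```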

This is why it is possible for a quantum system to reach a high quantum volume, even with few qubits. Despite having only ten qubits, for instance, Honeywell's System Model H1 performs well when it comes to error rates and connectivity, which has earned the device a top spot for its overall capabilities. In comparison, last year IBM's 27-qubit client-deployed system achieved a quantum volume of 64.

The new milestone, therefore, has prompted Honeywell's president of quantum solutions, Tony Uttley, to describe the System Model H1 as "the highest performing quantum computing system in the world."

Honeywell has made no secret of its strategy, which consists of focusing on qubit fidelity and connectedness, before attempting to scale up the number of qubits. "When you hear about fidelity and error, that's about the quality of the quantum operation," Uttley told ZDNet. "It's about knowing how often you get the right answer when you run these quantum algorithms."

"We have taken one approach that is very unique when it comes to how to get the most out of these near-term systems," he continued. "Nobody is talking about millions of qubits right now we're talking about tens of qubits. To get the most out of these tens of qubits, you have to have super-high fidelity, fully-connected and highly-controlled systems. That's our approach."

Making these highly reliable systems available to Honeywell's customers now enables businesses to test and trial with small-scale applications while waiting for the company to design and build new generations of more capable quantum computers, according to Uttley.

Honeywell recently introduced the first subscription-based plan for usage of the H1, which grants paying customers monthly access to the machine.

With only ten qubits, there is little that the device can achieve on top of proofs of concept, designed to be implemented at full scale once a larger computer is available; but high-profile customers are nevertheless flocking to Honeywell's services.

J.P. Morgan Chase, for example, is investigating how the company's quantum computer might improve operations in banking; and BMW is piloting the use of Honeywell's hardware to optimize supply chains for car manufacturing.

Read this article:
Quantum computing: Honeywell just quadrupled the power of its computer - ZDNet

After merger, College Park startup IonQ plans to go public with $2 billion valuation – The Diamondback

IonQ, a quantum computing startup born in College Park, announced Monday that it would likely soon become the first publicly traded company to specialize in commercialized quantum computing.

The company plans to file paperwork with the Securities and Exchange Commission in the next week, which will allow it to go public on the New York Stock Exchange through an acquisition deal that would set the valuation of the combined entity at nearly $2 billion.

"The ability to become a public company gives us access to a huge capital base, and that will allow us to spend more time building our system, deploying them for useful application," said Chris Monroe, IonQ's founder and a physics professor at the University of Maryland. "We can start to do our own research and development ... We can do more risky things."

Monroe and co-founder Junsang Kim formed IonQ with the goal of taking quantum computing into the market. They initially received $2 million in seed funding from New Enterprise Associates, giving them a license to lab technology from the University of Maryland and Duke University. From there, they were able to raise tens of millions of dollars in funding from companies like Samsung and Mubadala, and partnered with Amazon Web Services and Microsoft.

[Gov. Hogan names College Park quantum computing company one of top state start-ups]

The company going public was made possible by a planned merger with a blank-check firm, dMY Technology Group Inc. III.

If it goes through, the merger will result in over $650 million in gross proceeds, including $350 million from private investors, according to a press release from IonQ. Combined with the $84 million the company has raised in venture capital funding, the deal would place IonQ's total funds raised at about $734 million.

The transition to quantum computing is unprecedented, Monroe said, and it will allow people to solve problems that a regular computer often can't.

Some problems, like optimizing a fleet of trucks or discovering medicines, have too many variables to solve with regular computing. But at the quantum level, more information can be handled, Monroe said, making it radically different from today's computing.

University President Darryll Pines, formerly the dean of the engineering school, explained that classical computing uses a stream of electrical pulses called bits, which represent 1s and 0s, to store information. However, on the quantum scale, subatomic particles known as qubits are used to store information, greatly increasing the speed of computing.

IonQ's approach to researching quantum computing has been rooted in university-led research. Quantum physics has strange rules that aren't always accepted in the engineering world, Monroe said, so many of these laws have become the domain of research at universities and national laboratories.

And this university especially, with its proximity to Washington, D.C., has one of the biggest communities of quantum scientists, Monroe said.

"We have students and postdocs and all kinds of researchers on Maryland's campus studying the field, and at IonQ, we've hired many of them," Monroe said. "And that's a huge advantage for us."

As a company with about 60 employees, some of whom attended this university, IonQ has become a pioneer in quantum computing. In October, Peter Chapman, IonQ's CEO and president, announced the company's newest 32-qubit computer, the most powerful quantum computer on the market.

And in November, Maryland Gov. Larry Hogan named IonQ one of the state's top 20 startup companies.

[Women of color in UMD community are making it as entrepreneurs despite challenges]

The biggest advantage for IonQ has been its technology, Monroe said. Companies like IBM, Google and Microsoft use silicon to build their computers, but IonQ uses individual atoms, which, unlike silicon, float over a chip in a vacuum chamber.

That technology has been perfected at this university, Monroe said, and IonQ has a concrete plan over the next five years to manufacture quantum computer modules and wire them together.

By 2030, 20 percent of global organizations, whether in the public or private sector, are expected to budget for quantum-computing projects, according to Gartner Inc., a global research and advisory firm. That number is up from less than 1 percent in 2018, according to Gartner.

Niccolo de Masi, CEO of dMY, said in IonQ's press release that he expects the quantum computing industry to grow immensely in the next ten years, with a market opportunity of approximately $65 billion by 2030.

Pines expressed his excitement at seeing a university startup make strides in computing.

"We're happy for building the ecosystem from science, to translation, to startup, to possibly developing a product and adding value to society and growing jobs in the state of Maryland," Pines said.

See the article here:
After merger, College Park startup IonQ plans to go public with $2 billion valuation - The Diamondback

Quantum computing company D-Wave Systems secures $40M in government funding – IT World Canada

Burnaby, B.C.-based D-Wave Systems is getting $40 million from the federal government to help advance its efforts in the development of quantum computing.

The funding comes from Ottawa's Strategic Innovation Fund to support a $120 million project to advance D-Wave's hardware and software.

"Quantum will help us quickly solve problems that would have otherwise taken decades," François-Philippe Champagne, Minister of Innovation, Science and Industry, told reporters during a virtual press briefing for the announcement.

In a separate release, he added that the funding will help place Canada at the forefront of quantum technology development and create new jobs and opportunities to help Canadians and advance the economy.

D-Wave is the first company to offer a commercially available quantum computer but is still only in the early stages of building a sustainable business after 20 years of development and more than USD $300 million in funds raised.

D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer last year, replacing Vern Brownell. The company also experienced other changes at the top of the corporate ladder and has parted ways with long-time board members.

Jim Love, Chief Content Officer, IT World Canada

See the original post here:
Quantum computing company D-Wave Systems secures $40M in government funding - IT World Canada

Europe moves to exclude neighbors from its quantum and space research – Science Magazine

A department overseen by European Union research commissioner Mariya Gabriel wants to safeguard strategic research by barring non-EU researchers.

By Nicholas Wallace, Mar. 11, 2021, 4:25 PM

In a sign of growing national tensions over the control of strategic research, the European Commission is trying to block countries outside the European Union from participating in quantum computing and space projects under Horizon Europe, its new research funding program.

The proposed calls, which must still be approved by delegates from the 27 EU member states in the coming weeks, would shut out researchers in countries accustomed to full access to European research programs, including Switzerland, the United Kingdom, and Israel. European Economic Area (EEA) countries Norway, Liechtenstein, and Iceland would be barred from space research calls while remaining eligible for quantum computing projects.

Research advocates see the proposed restrictions as self-defeating for all parties, including the European Union. "It would be a classic lose-lose," with researchers in all countries having to work harder, and spend more, to make progress in these fields, says Vivienne Stern, director of UK Universities International. The unexpected news has upset some leaders of existing collaborations and left them scrambling to find out whether they will need to exclude partners, or even drop out themselves, if they want their projects to be eligible for further funding. "It is really a pity because we have a tight and fruitful relationship with our partners in the U.K.," says Sandro Mengali, director of the Italian research nonprofit Consorzio C.R.E.O. and coordinator of an EU-funded project developing heat shields for spacecraft.

In 2018, when the European Commission first announced plans for the €85 billion, 7-year Horizon Europe program, it said it would be "open to the world." Switzerland, Israel, the EEA nations, and other countries have long paid to associate with EU funding programs like Horizon Europe, giving their researchers the right to apply for grants, just like those in EU member states. After leaving the European Union, the United Kingdom struck a deal in December 2020 to join Horizon Europe, which put out its first grant calls last month through the European Research Council.

But more recently, "strategic autonomy" and "technological sovereignty" have become watchwords among policymakers in Brussels, who argue the European Union should domestically produce components in key technologies, such as quantum computers and space technology. Those views influenced the Commission's research policy department, overseen by EU research commissioner Mariya Gabriel, which drafted the calls and their eligibility rules, first revealed by Science|Business. The draft says the restrictions are necessary to "safeguard the Union's strategic assets, interests, autonomy, or security."

"It's a bit of a contradiction," says a Swiss government official who asked to remain anonymous because of the sensitivity of forthcoming discussions. "You want to open the program to the world and work with the best. But the core group of associated countries with whom you're used to working, suddenly you exclude them and force them to work with the competitors." The official says the Commission gave no warnings the proposal was coming but believes the combination of Brexit and the COVID-19 crisis, in which Europe has struggled to secure access to vaccines, masks, and other equipment, may have further spurred Europe to guard its technologies. Negotiations on Swiss membership in Horizon Europe have not begun, but the country intends to join.

The restrictions affect €170 million in funding that could be available in the next few months. The affected areas include quantum computing, quantum communications, satellite communications, space transport, launchers, and "space technologies for European non-dependence and competitiveness." Projects relating to the Copernicus Earth-observation system and the Galileo satellite navigation programs would remain largely open to associated countries.

Shutting out the associated countries would be a "lost opportunity" and could slow progress in quantum computing, says Lieven Vandersypen, a quantum nanoscientist at the Delft University of Technology. "To me, it doesn't make sense." Vandersypen contributes to an EU-funded project that is investigating how to create the basic bits of a quantum computer from cheap and readily available silicon. The project includes U.K. and Swiss researchers at University College London and the University of Basel. "They are in there for a good reason," Vandersypen says. "They bring in really valuable expertise." With a few years left on the grant, the project isn't in any immediate danger. But the exclusions are bad for long-term planning, Vandersypen says.

Non-EU researchers working on a €150 million European quantum flagship initiative set up in 2018 are also upset by the sudden reversal and wonder about their future status. "We discuss with our partners in Europe, they ask us, 'Can you join?' And we don't know; that's probably the worst thing," says Hugo Zbinden, a quantum physicist at the University of Geneva and coordinator of one of these flagship projects, QRANGE, which is investigating how a quantum random number generator can be used to improve encryption.

The restrictions are not yet set in stone; national delegates could reject the draft calls and ask the Commission to open them up. But member states accepted the legal basis for the restrictions last year, when they agreed to the Horizon Europe legislation. "Of course, you hope that we will be in," Zbinden says. "For the time being, we are waiting for some news."

More here:
Europe moves to exclude neighbors from its quantum and space research - Science Magazine

After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing – The Globe and Mail

D-Wave is the first company to offer a commercially available quantum computer.

Reuters

One of Canada's most heavily financed technology development companies, quantum computer maker D-Wave Systems Inc., has secured a $40-million financial contribution from the federal government.

The funding, through Ottawa's Strategic Innovation Fund, follows a year of reset expectations for D-Wave, a leader in the global race to develop computers whose chips draw their power by harnessing natural properties of subatomic particles to perform complex calculations faster than conventional computers.

Burnaby, B.C.-based D-Wave is the first company to offer a commercially available quantum computer, but after 20-plus years of development and more than US$300-million in funds raised, it is still in the early stages of building a sustainable business.

Last year D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer, replacing Vern Brownell, to step up commercialization efforts. The company also parted ways with other top executives and long-time board members.

Mr. Baratz, who led Sun Microsystems Inc.'s effort in the 1990s to transform Java from a nascent programming language into the internet's main software-writing platform, directed D-Wave to stop selling its shed-sized computers, which listed for US$15-million and had just a handful of customers, including NASA, Google, Lockheed Martin and the U.S. Los Alamos National Laboratory.

Instead, D-Wave has focused on selling online access to the technology and expanded its software applications, which Mr. Baratz had started developing after joining as chief product officer in 2017. Customers including Volkswagen and biotechnology startups have used D-Wave's technology to find answers to dense optimization problems, such as improving traffic flows in big cities, identifying proteins that could become breakthrough drugs and improving the efficiency of painting operations on vehicle production assembly lines.

D-Wave also completed a costly US$40-million refinancing last year that wiped out most of the value of some long-time investors, including the U.S. Central Intelligence Agency's venture capital arm, Amazon CEO Jeff Bezos and fund giant Fidelity Investments. The capital restructuring cut D-Wave's valuation to less than US$170-million, down from US$450-million, The Globe reported in October. Investors that ponied up, including Public Sector Pension Investment Board, D-Wave's top shareholder, BDC Capital and Goldman Sachs, maintained their relative stakes, limiting their writedowns.

"Over the years [D-Wave has] had to raise money and more money and more money ... and as such you end up getting diluted over time because every third quarter it seems like you run out of the $50-million that you raised," Kevin Rendino, CEO and portfolio manager of D-Wave investor 180 Degree Capital Corp., told his investors last November. "D-Wave has been a source of bitter disappointment for all of us."

Meanwhile, D-Wave faces years and tens of millions of dollars more in costs to continue developing its core technology. The government aid will support a $120-million project to advance D-Wave's hardware and software and will "help place Canada at the forefront of quantum technology development, and will create new jobs and opportunities to help Canadians and advance the economy," François-Philippe Champagne, Minister of Innovation, Science and Industry, said in a release.

During a press conference to discuss the funding, the minister was asked if the government would review potential takeovers of quantum computing companies, as the U.S. government is considering doing. Mr. Champagne provided a non-committal response, saying "I'm sure you would expect us to be eyes wide open when it comes to whatever we would need to take in terms of steps to protect ... [intellectual property] that has been developed in Canada."

"We're always out there looking at how we can improve to make sure that new technologies and inventions and improvements and IP that has been developed in Canada stays in Canada."

D-Wave faces a slew of competitors, including Google, Microsoft, Intel, IBM and Honeywell, that are also trying to build the first quantum machine that can outperform classical or conventional computers. In addition, a new class of startups, including Toronto's Xanadu Quantum Technologies Inc. and College Park, Md.-based IonQ Inc., believe they can build quantum chips that don't have to be supercooled to function, as D-Wave's system and others in development do. IonQ said this week it would go public through a special purpose acquisition company to become the first publicly traded quantum computing-focused company.

Mr. Baratz said in an emailed statement that since D-Wave's launch last September of its latest quantum chip and expanded efforts to sell online access to its computers, "we've been encouraged by the positive customer response to the value delivered by a quantum system designed for practical, in-production business-scale applications. We're eager to see even more developers, academics, and companies leverage it to solve larger, more complex problems."

View post:
After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing - The Globe and Mail

Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? – Analytics India Magazine

As per OpenAI data, the amount of computational power needed to train large AI models has grown massively, doubling every three and a half months since 2012. GPT-3, which requires 3.14E23 FLOPS of computing for training, is a good case in point.
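
The arithmetic behind that doubling rate is worth spelling out: a doubling every three and a half months compounds to roughly an eleven-fold increase in training compute per year.

```python
# Quick arithmetic behind the doubling claim: a doubling every ~3.5 months
# compounds to roughly an order-of-magnitude increase in compute per year.
months_per_doubling = 3.5
growth_per_year = 2 ** (12 / months_per_doubling)
print(round(growth_per_year, 1))   # about 10.8, i.e. roughly 11x per year
```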

Typically, to carry out high-performance computing tasks, conventional AI chips are equipped with transistors that work with electrons. Although they perform a wide array of complex, high-performance tasks, energy consumption and engineering glitches pose a challenge. Thus, the growing need for computing power has set researchers on a quest to find a workaround to boost these chips' power without increasing energy consumption.

And that's when experts turned to photons, the particles of light, which can easily substitute for electrons in AI chips to reduce heat, leading to a massive reduction in energy consumption and a dramatic upgrade in processor speed.

While electronic chips perform calculations by reducing information to a series of 1s and 0s, photonic chips split and mix beams of light within tiny channels to carry out the tasks. Compared to regular AI chips, photonic chips are designed to perform only certain kinds of mathematical calculations, which are critical for running large AI models.

Lightmatter, an MIT-backed startup, last year developed an AI chip, Envise, that leverages photons (light particles) to perform computing tasks.

Light travels faster than electrons. The concept of using light as a substitute for carrying out heavy tasks (aka photonic computing, or optical computing) dates back to the 1980s, when Nokia Bell Labs, an American industrial research and scientific development company, tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn't take off.

We experience optical technology in cameras, CDs, and even in Blu-ray discs. But these photons are usually converted into electrons for use in chips. Four decades later, photonic computing gained momentum when IBM and researchers from the universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI model-based computations.

Alongside, Lightmatter's new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times higher inferences/second than the Nvidia DGX-A100, with seven times the inferences/second/Watt on BERT-Base with the SQuAD dataset.

Japan-based NTT has also been developing an optical computer believed to outpace quantum computing in solving optimisation problems. Last year, Chinese quantum physicist Chao-Yang Lu also announced light-based quantum computing.

Other companies like US-based Honeywell and IonQ have also been working around the issue by using trapped ions.

Such developments have led experts to believe photonic computing will gain ground once the big tech companies throw their weight behind it and understand the importance of using light for their AI chips.

On the other hand, like any other remarkable technology, photonic computing also comes with certain challenges. Despite their lower energy consumption, photonic chips are considered less accurate and precise than electron-based chips. Much of this can be attributed to their analogue-based calculations, which makes them best suited to running pre-trained models and deep neural networks.

On the design side, silicon-based computer chips don't pair well with photons, which limits their usage in computing.

The cost issues and environmental impact of digital chips might set the stage for photonic computing to rise as a substitute. With startups like Lightmatter and giants like IBM committing resources to this computing paradigm, AI might get a photonic boost.

Read the rest here:
Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? - Analytics India Magazine