Category Archives: Quantum Computing

Cracking the Quantum Black Box: A New Benchmarking Tool From … – SciTechDaily

A team of international experts has developed a new method for benchmarking quantum computers, using mathematical physics to derive meaningful performance metrics from random data sequences. This tool can characterize quantum operations and compare them to traditional computing, requiring logarithmically less data for greater insights.

The field of quantum computing is rapidly advancing, but as quantum computers increase in size and complexity, they become less like a tool and more like a mysterious black box. A team utilizing mathematical physics has now cracked this box open, managing to extract concrete metrics from seemingly random data sequences. These metrics serve as benchmarks for assessing quantum computer performance.

Experts from Helmholtz-Zentrum Berlin, Freie Universität Berlin, the QuSoft research centre in Amsterdam, the University of Copenhagen, and the Technology Innovation Institute Abu Dhabi were involved in the work, which has now been published in Nature Communications.

Quantum computers can be used to calculate quantum systems much more efficiently and solve problems in materials research, for example. However, the larger and more complex quantum computers become, the less transparent the processes that lead to the result. Suitable tools are therefore needed to characterize such quantum operations and to fairly compare the capabilities of quantum computers with classical computing power for the same tasks. Such a tool with surprising talents has now been developed by a team led by Prof. Jens Eisert and Ingo Roth.

Quantum computers (here, an experiment at the Technology Innovation Institute in Abu Dhabi) work at very low temperatures to minimize noise and unwanted disturbances. With a newly developed mathematical tool, it is now possible to evaluate the performance of a quantum computer from random test data and to diagnose possible bugs. Credit: Roth/Quantum Research Center, TII

Roth, who is currently setting up a group at the Technology Innovation Institute in Abu Dhabi, explains: "From the results of random test sequences, we can now extract different numbers that show how close the operations are, on statistical average, to the desired operations. This allows us to learn much more from the same data than before. And what is crucial: the amount of data needed does not grow linearly but only logarithmically."

"This means: to learn a hundred times as much, only twice as much data is needed. An enormous improvement." The team was able to prove this using methods from mathematical physics.
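The scaling claim can be illustrated with toy arithmetic. This is a generic sketch of logarithmic sample complexity, not the paper's actual estimator; `samples_per_bit` is an illustrative constant:

```python
import math

def samples_needed(num_properties, samples_per_bit=1.0):
    """Toy model: data required grows with the logarithm of how much you learn."""
    return samples_per_bit * math.log2(num_properties)

# Squaring the number of properties to learn (e.g. 100 -> 10,000, a
# hundredfold increase) only doubles the data, since log2(n**2) == 2*log2(n).
baseline = samples_needed(100)
hundredfold = samples_needed(100 ** 2)
print(hundredfold / baseline)  # ratio of data needed
```

With linear scaling the same hundredfold increase would need a hundred times the data, which is the contrast the researchers are drawing.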

"This is about benchmarking quantum computers," says Eisert, who heads a joint research group on theoretical physics at Helmholtz-Zentrum Berlin and Freie Universität Berlin. "We have shown how randomized data can be used to calibrate such systems. This work is important for the development of quantum computers."

Reference: "Shadow estimation of gate-set properties from random sequences" by J. Helsen, M. Ioannou, J. Kitzinger, E. Onorati, A. H. Werner, J. Eisert and I. Roth, 19 August 2023, Nature Communications. DOI: 10.1038/s41467-023-39382-9


B.C. quantum computer maker Photonic emerges from ‘stealth mode’ with $100-million and Microsoft deal – The Globe and Mail


Founder and chief quantum officer Stephanie Simmons poses for a photograph at the Photonic Inc. lab in Coquitlam, B.C. Tijana Martin/The Globe and Mail

Canada's third entrant in the global race to build a quantum computer has emerged from stealth mode to reveal its technology, announcing US$140-million ($193-million) in funding and unveiling a partnership with software giant Microsoft Corp.

Vancouver-based Photonic Inc. said Wednesday it plans to build a quantum computer using silicon chips that are networked with light, a relatively new approach that the seven-year-old startup said would enable the creation of marketable machines within five years.

"What we're bringing to the table is the fact that the network is the computer," Photonic founder and chief quantum officer Stephanie Simmons said in an interview.

The 120-person company said its collaboration with Microsoft would allow users to access its quantum system through Microsoft's Azure cloud computing network. Krysta Svore, Microsoft's vice-president of advanced quantum development, said that unlike commercial agreements with other quantum computer makers operating on Azure, the Photonic deal is a co-innovation collaboration to promote quantum networking. Microsoft will offer Photonic as a preferred hardware provider for customers doing computational chemistry and materials-science discovery.

Microsoft MSFT-Q has also backed a US$100-million ($138-million) venture capital financing of Photonic, also announced Wednesday, alongside British Columbia Investment Management Corp., the British government's National Security Strategic Investment Fund, Inovia Capital, Yaletown Partners and Amadeus Capital Partners. Photonic previously raised US$40-million ($55-million) from investors including veteran technology executive Paul Terry, who became chief executive officer in 2019, and former Microsoft president Don Mattrick.

Inovia partner Shawn Abbott said he'd watched the quantum computing space for 20 years before deciding to back Photonic. "I've felt others were too early for the 10-year life of a venture fund; they were still science projects. Photonic is the first I've seen with the potential to scale quickly into a full platform."

Photonic's networking model is in keeping with what many in the field regard as a promising direction for scaling up quantum computers to commercial relevance.

"I think everybody in the industry has realized by now that networking is needed no matter what platform you think about," said Prem Kumar, a professor of electrical and computer engineering at Northwestern University in Evanston, Ill.

At stake is the prospect of a new kind of device that can easily outperform conventional computers at certain kinds of calculations. In principle, a quantum computer could break encryption codes used to protect financial information while providing a new form of impenetrable encryption. Quantum systems could also be used to predict the behaviour of molecules and help discover materials and drugs or optimize decision making in dynamic situations, from traffic grids to financial markets.

Quantum computers achieve such feats by replacing a conventional computer's bits, its 1s and 0s, with qubits that have an indeterminate value until they are measured. When qubits are linked together through a phenomenon known as entanglement, these uncertainties can be harnessed to solve in mere seconds calculations that could tie up a regular computer for eons.
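The superposition-plus-entanglement picture can be made concrete with a two-qubit toy simulation. This is a from-scratch state-vector sketch, not any vendor's actual API or hardware:

```python
from math import sqrt

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]           # start in |00>

def apply(gate, vec):
    """Multiply a 4x4 gate matrix into the state vector."""
    return [sum(gate[r][c] * vec[c] for c in range(4)) for r in range(4)]

h = 1 / sqrt(2)
H0 = [[h, 0,  h, 0],                   # Hadamard on the first qubit: puts it
      [0, h,  0, h],                   # into an equal superposition of 0 and 1
      [h, 0, -h, 0],
      [0, h,  0, -h]]
CNOT = [[1, 0, 0, 0],                  # flips the second qubit iff the first
        [0, 1, 0, 0],                  # is |1>, entangling the pair
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = apply(CNOT, apply(H0, state))  # Bell state (|00> + |11>) / sqrt(2)
probs = [a * a for a in state]
# Each qubit alone is a 50/50 coin flip, yet the two outcomes always agree:
# only 00 or 11 is ever observed, never 01 or 10.
print(probs)
```

The "indeterminate until measured" value is the superposition created by the Hadamard gate; the perfect 00/11 correlation is the entanglement created by the CNOT.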

While some quantum systems operating today have reached the level of hundreds to more than 1,000 qubits, commercial quantum systems are expected to require millions.

Developers have explored a range of design options for creating such computers, but all come with technical hurdles. Those based on the physical properties of subatomic particles are easy to disturb, and their systems require extreme cooling to reduce vibrations. Those that use entangled particles of light, or photons, have the problem that light cannot be stored, and that photons can be lost while travelling through a fibre optic network.
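The photon-loss problem the paragraph above describes is quantifiable with standard fiber-attenuation arithmetic (the 0.2 dB/km figure is the typical loss of telecom fiber at 1,550 nm, an assumption, not a number from the article):

```python
def photon_survival(length_km, atten_db_per_km=0.2):
    """Fraction of photons that survive a fiber run of the given length."""
    return 10 ** (-atten_db_per_km * length_km / 10)

# At 50 km only ~10% of photons arrive; at 500 km essentially none do.
# Quantum states can't be copied, so classical amplifiers don't help --
# hence the need for quantum memories or repeaters.
print(photon_survival(50), photon_survival(500))
```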

Despite the challenges, startups and tech giants alike are in a global race to create a commercial quantum computer. A few companies, including Google and Toronto's Xanadu Quantum Technologies, have proven their machines can achieve "quantum advantage" by performing certain theoretical operations faster than existing computers. But while such demonstrations are regarded as milestones, they fall well short of the goal of building a practical quantum computer, in part because they lack fault tolerance: the ability of a quantum system to dedicate the majority of its qubits to correcting errors so it can provide reliable answers. They also aren't close to performing tasks commercial customers would pay for.

Some quantum computing startups, including D-Wave Quantum Inc. of Burnaby, B.C., the first company to commercialize a limited form of quantum computer, have tested the public markets, although demand has been limited. D-Wave, which went public last year, generated just US$3.2-million ($4.4-million) in revenue in the first half and racked up US$46.7-million ($64-million) in operating expenses. Its stock trades for pennies a share.

Photonic is the brainchild of Dr. Simmons, who grew up in Kitchener, Ont., and decided at 16 to devote her life to the field after learning of the creation of the Institute for Quantum Computing close by. "I said, 'This has to be it, this must be the next wave, it will be so fun,'" the 38-year-old said.

She decided to build her own quantum computer while studying math and physics at the University of Waterloo, after learning that the technology was still in its infancy. First she earned a PhD in materials science at Oxford University, then studied electrical engineering at the University of New South Wales in Sydney. She moved to B.C. in 2015, believing Vancouver was the best place to recruit talent. She taught physics at Simon Fraser University and founded Photonic in 2016.

Dr. Simmons felt early quantum computer attempts weren't working backwards from the long-term solution, "which I thought was going to be a horizontally scalable supercomputer."

To achieve scalability, she opted to work with silicon chips, a well-understood material in the computer industry. The chips are cooled to about one degree above absolute zero (roughly -272 C), colder than deep space but a less demanding threshold than some kinds of quantum computers, whose qubits must be kept even colder.

The Photonic system's qubits consist of tiny flaws within the silicon material whose quantum properties can be transmitted and manipulated using light. This opens the possibility of building up a distributed network of chips connected by optical fibres to perform quantum calculations, instead of a single, large processor, as other developers have done.

Dr. Simmons said such a system would be able to exploit new approaches to error correction and produce a fault-tolerant quantum computer. Bringing together the networking and computational sides of quantum technology has won support from investors in part because it addresses both how to do calculations reliably and how to convey information securely.

"With Stef's architecture you get a 90-per-cent-plus efficiency of transferring the quantum state," Amadeus co-founder Hermann Hauser said. "That's why I think it will become the dominant quantum computing architecture."


US lawmakers want to spend billions on quantum computers – The Register

The US House Committee on Science, Space, and Technology is concerned the United States could fall behind Russia and China if something isn't done to accelerate development of quantum computing systems.

As such, the leaders of the panel, chairman Frank Lucas (R-OK) and ranking member Zoe Lofgren (D-CA), have introduced a bill to spend its way out of the problem.

Various technologists believe quantum computing has the potential to accelerate a variety of complex workloads ranging from the simulation of chemicals to advanced networking, route optimization, and advanced logistics. Some fear the technology could eventually render modern encryption useless though the jury is still out on that one. And other folk just think it'll never live up to the hype.

"Quantum technologies are actively changing our landscape, and we must ensure we are at the forefront, breaking down quantum barriers while leading with our democratic values," committee chairman Lucas declared in a statement warning of Chinese and Russian developments in this arena.

As we've previously reported, Russian and Chinese researchers are actively developing quantum computing test beds for research and development. Earlier this year, a Chinese group claimed to have brought a 176-qubit quantum computer online. The Russians' latest system is a fair bit smaller at 16 qubits, though scientists claim to have used the system to model simple molecules.

According to lawmakers, while the US has maintained its lead in the theoretical physics underpinning quantum systems, rivals like China have managed to take the lead in quantum communications and are closing the gap in other areas. The committee contends that unless steps are taken to fuel US development of quantum systems, the Land of the Free™ could fall behind.

The proposed House bill, HR 6213 [PDF], reauthorizes the National Quantum Initiative Act signed into law by President Donald Trump in 2018.

It calls for the US government to take a number of specific steps to accelerate US development of quantum systems. Some are rather straightforward, like working with allies to develop more advanced quantum systems, establishing a pipeline of skilled workers, roping NASA into quantum research efforts, and promoting commercialization of these technologies.

Other initiatives would see the creation of research and development centers under the National Institute of Standards and Technology (NIST), the creation of new quantum testbeds, and supply chains under agencies like the Department of Energy (DoE). The latter makes sense, as quantum computers are increasingly being paired with conventional supercomputers, of which the DoE operates some of the most powerful in the world.

To promote these initiatives, lawmakers are calling for some serious funding. Adding up the allowances for each of the initiatives [PDF], the bill calls for more than $3 billion between 2024 and 2028. A sizable chunk of that cash would be pulled from the $280 billion CHIPS and Science Act signed into law last year.

However, there's no guarantee that the bill will be passed by Congress in its current form.

To be clear, it's not like the US has been standing still on quantum development. Last month the Biden administration unveiled 31 regional tech hubs across the US to advance a variety of technologies and supply chains including quantum computing.

Back in February, the Defense Advanced Research Projects Agency (DARPA) announced the Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program. The initiative, launched in collaboration with Microsoft, Atom Computing, and PsiQuantum, aims to further the development of utility-scale quantum system designs.

Meanwhile, in the private sector we've seen a flurry of interest around quantum computing over the past few months. In May, IBM announced its plans to spend $100 million to build a 100,000 qubit "quantum-centric supercomputer" within a decade.

Last month, Atom Computing touted the creation of a 1,180-qubit system. That may sound small compared to Big Blue's ambitions, but as Gartner analyst Matthew Brisse recently told The Register, not all qubits are created equal. Factors like decoherence and the quality of the qubits themselves are often more significant in determining the power of a quantum system.


3 Quantum Computing Stocks To Make You The Millionaire Next Door – InvestorPlace

Have you ever wished you could go back in time and invest in trailblazing companies like Apple (NASDAQ:AAPL), Amazon (NASDAQ:AMZN), or Tesla (NASDAQ:TSLA) before they hit it big? Well, you may just have that chance again today with quantum computing stocks.

The futuristic field of quantum computing has faced some bumps on its road to mainstream adoption lately. The recent Nasdaq correction has hit many once-hot quantum computing stocks hard. But this correction also presents a golden buying opportunity for investors who take the long view.

Quantum computing may sound like science fiction, but it's likely to become a commercial reality sooner than you think. Leading experts predict quantum computers will reach the tipping point of usefulness within this decade. When that happens, early investors could be richly rewarded.

The problem is, quantum computing technology is highly complex, and many companies trading in this space have speculative business models. We're in the early innings of this technological revolution, so plenty of quantum stocks carry substantial risk.

However, the long-term growth prospects in quantum computing are too great to ignore. Quantum computing could drive massive progress in fields like AI, materials science, cryptography, and more. It could fundamentally reshape our digital infrastructure and lead to innovations we can't even imagine yet.

Scooping up promising quantum computing stocks now, while they're under pressure and trading at a discount, could put you firmly on the path to long-term wealth creation. Let's dive in!


Of all the quantum computing stocks, Honeywell (NASDAQ:HON) strikes me as one of the safer yet potentially lucrative options for investors. Naturally, the quantum computing market is highly volatile and speculative at this early stage. However, Honeywell has stood out from the pack thanks to its ties to the defense and aerospace industries. The company has been a prime beneficiary of higher defense spending in cutting-edge areas like aerospace and intelligence. With geopolitical tensions rising, Honeywell's government business seems poised for further growth.

That said, Honeywell also has an intriguing quantum computing venture: Quantinuum, formed in 2021 when Honeywell Quantum Solutions merged with Cambridge Quantum. Quantinuum has quickly become a quantum computing leader; it recently became the first company to integrate quantum-computing-hardened encryption keys into smart meters for gas, water and electric utilities.

While HON stock has sagged recently amid a broader market downturn (and a revenue miss in its latest quarter), I view this as a buying opportunity. At 20 times forward earnings, Honeywell seems attractively priced, given its leading positions in must-have aerospace and defense technologies. The company generates mountains of recurring revenue and cash flow to support future R&D and growth initiatives like quantum computing; its quantum venture launched its latest trapped-ion system, System Model H2, in 2023.

For a relatively low-risk play on the quantum computing revolution, Honeywell fits the bill nicely. The company offers stability from its government-contracted businesses, while also providing upside from emerging technologies like quantum computing.


If you desire a pure-play quantum computing stock, look no further than IonQ (NYSE:IONQ). Since going public, IonQ has emerged as one of the frontrunners in the quantum computing race. While very speculative, IonQ offers tantalizing growth potential.

IonQ has pioneered the use of trapped ions to construct quantum computers, an approach that aims to minimize error rates and heating effects compared with rival technologies. The company uses ytterbium atoms suspended in electromagnetic fields as its qubits. Thus far, the results look highly promising.

Earlier this year, IonQ unveiled its next-generation quantum computer, Forte. With a planned 32 algorithmic qubits, it would be the world's most powerful trapped-ion quantum computer on the market. Previously, IonQ's systems were only accessible via the cloud; Forte, however, will also be available as an on-premise solution for select partners. This quarter, IonQ achieved a major milestone by reaching #AQ 29 (29 algorithmic qubits, the company's preferred performance metric).

Let's put the good news on the sidelines for now. There are a lot of caveats with this company, but my biggest problem with IonQ is unprofitability. Its balance sheet holds over $509 million in cash, but the company posted a $44 million net loss in the most recent quarter. Profitability remains years away, as product development costs weigh heavily on its bottom line.

Another big caveat for me is the stock's massive volatility. Unless you are okay with that sort of risk and are willing to hold for years, I would recommend you avoid pure plays like this one and focus on the other two stocks on this list.


Back to the established giants on this list, IBM (NYSE:IBM) looks ideally positioned to capitalize on the commercialization of quantum computing. Indeed, if any company can make quantum computing mainstream, IBM seems like the top contender.

While much less speculative than pure plays, IBM still offers substantial upside potential, in my view. The company operates the IBM Quantum Network, which allows customers to access IBM's advanced quantum computing systems. Over 210 Fortune 500 companies leverage this network for research and education. As quantum computing grows more practical, IBM's massive customer base gives it an enormous head start over rivals.

Additionally, IBM continues pushing the boundaries of quantum computing performance. Last year, it unveiled its 433-qubit Osprey processor, more than tripling the qubit count of its previous 127-qubit Eagle chip. IBM aims to launch a 1,121-qubit quantum computer, Condor, this year, more than doubling its qubit count again. Indeed, IBM may very well become the first to develop quantum computers powerful enough for mainstream business and scientific use.

Besides its growth upside, IBM also offers safety. At just 16 times earnings, IBM is cheap compared to other large tech firms. It generates prodigious free cash flow to support both growing the quantum business and rewarding shareholders.

On the date of publication, Omor Ibne Ehsan did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Omor Ibne Ehsan is a writer at InvestorPlace. He is a self-taught investor with a focus on growth and cyclical stocks that have strong fundamentals, value, and long-term potential. He also has an interest in high-risk, high-reward investments such as cryptocurrencies and penny stocks. You can follow him on LinkedIn.


IonQ Announces Third Quarter 2023 Financial Results – Yahoo Finance

Third Quarter Results of $6.1 Million in Revenue, Above High End of Range

2023 Full Year Revenue and Bookings Outlooks Increased Again

Third Quarter Bookings of $26.3 Million Bring Bookings to $58.4 Million Year-to-Date as of Q3

Announces $25.5 Million Quantum Networking System Sale to AFRL

Robust Commercial Pipeline Growth and Visibility

Achieves $100 Million in Cumulative Bookings Within First Three Years of Commercialization Efforts

Technical Momentum Continues Towards Commercial Advantage

COLLEGE PARK, Md., November 08, 2023--(BUSINESS WIRE)--IonQ (NYSE: IONQ), a leader in the quantum computing industry, today announced financial results for the quarter ended September 30, 2023.

"Our third quarter represents another massive step in a pivotal year for IonQ as we usher in the enterprise era of quantum computing. I am pleased to report we have achieved our goal of $100 million in cumulative bookings within the first three years of commercialization, since 2021, and are on track to exceed that goal by the end of 2023. IonQ's commercial pipeline is bigger and better than ever and our technical momentum, while always arduous, continues to be ahead of schedule," said Peter Chapman, President and CEO of IonQ.

"This quarter, we saw further validation of our technology with another two systems sold in a $25.5 million deal with the US Air Force Research Lab (AFRL) to further explore quantum networking. We also unveiled our next two generations of quantum computers: IonQ Forte Enterprise and IonQ Tempo. IonQ Forte Enterprise will bring #AQ 35 to customers in a form factor that integrates seamlessly into existing data centers. IonQ Tempo will deliver #AQ 64 and quantum advantage for certain applications."

Third Quarter 2023 Financial Highlights

IonQ recognized revenue of $6.1 million for the third quarter, which is above the high end of the previously provided range, and represents 122% growth compared to $2.8 million in the prior year period.
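The reported growth rate can be checked with simple arithmetic on the figures from the release (the exact revenue amounts appear in the income statement later in the release):

```python
q3_2023_revenue = 6.136   # $ millions, from the condensed income statement
q3_2022_revenue = 2.763   # $ millions, prior-year period
growth_pct = (q3_2023_revenue / q3_2022_revenue - 1) * 100
print(round(growth_pct))  # percentage growth year over year
```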

IonQ achieved $26.3 million in new bookings for the third quarter, and $58.4 million year-to-date as of September 30, 2023.

IonQ achieved its previously stated goal of $100 million in cumulative bookings within the first three years of its commercialization efforts, starting in 2021, a full quarter ahead of the initial prediction.

Cash, cash equivalents and investments were $485.1 million as of September 30, 2023.

Net loss was $44.8 million and Adjusted EBITDA loss was $22.4 million for the third quarter.* Exclusions from Adjusted EBITDA include a non-cash loss of $7.6 million related to the change in the fair value of IonQ's warrant liabilities.


*Adjusted EBITDA is a non-GAAP financial measure defined under "Non-GAAP Financial Measures," and is reconciled to net loss, its closest comparable GAAP measure, at the end of this release.

Commercial Highlights

Technical Highlights

IonQ announced at its September Analyst Day that it now believes it will be able to reach #AQ 64 and commercial advantage using error mitigation, the same technique the company is already employing today, rather than needing to implement full error correction. Error mitigation requires fewer qubits than error correction, which makes IonQ even more confident in the company's ability to reach #AQ 64 in the near term.

IonQ achieved #AQ 29 on its next-generation barium qubits, delivering industry-leading performance and marking a key milestone in the company's journey toward developing scalable, reliable systems capable of commercial quantum advantage.

IonQ intends to achieve an #AQ 64 system by the end of 2025. The company believes that in reaching this #AQ milestone, its systems will deliver quantum advantage for certain use cases and classical computers will no longer be able to fully simulate an IonQ system.

2023 Financial Outlook

For the full year 2023, IonQ is increasing its revenue outlook range to $21.2 million to $22.0 million.

For the fourth quarter of 2023, IonQ is expecting revenue of between $5.3 million and $6.1 million.

For the full year 2023, IonQ is increasing its bookings expectation range to between $60.0 million and $63.0 million.

Announcing Filing of S-3 Shelf Registration Statement

Tomorrow, the company intends to file a universal shelf registration statement on Form S-3 with the Securities and Exchange Commission (the "SEC").*

The company has no foreseeable need to raise and use additional capital at this time; however, it would like to maintain the optionality to raise additional capital to fund strategic growth and M&A opportunities in the near to medium term.

*Any offer, solicitation or sale of any of the securities registered under the registration statement will be made only by means of the prospectus and the accompanying prospectus supplement once the registration statement is declared effective by the SEC. This announcement does not constitute an offer to sell or a solicitation of an offer to buy securities, nor may there be any sale of IonQ's common stock or other securities in any state or jurisdiction in which such an offer, solicitation or sale would be unlawful prior to the effectiveness of the registration statement with the SEC and registration or qualification under the securities law of any state or jurisdiction.

Third Quarter 2023 Conference Call

IonQ will host a conference call today at 4:30 p.m. Eastern time to review the Company's financial results for the third quarter ended September 30, 2023 and to provide a business update. The call will be accessible by telephone at 877-300-8521 (domestic) or 412-317-6026 (international). The call will also be available live via webcast on the Company's website here, or directly here. A telephone replay of the conference call will be available approximately two hours after its conclusion at 844-512-2921 (domestic) or 412-317-6671 (international) with access code 10183201 and will be available until 11:59 p.m. Eastern time, November 22, 2023. An archive of the webcast will also be available here shortly after the call and will remain available for one year.

Non-GAAP Financial Measures

To supplement IonQ's condensed consolidated financial statements presented in accordance with GAAP, IonQ uses non-GAAP measures of certain components of financial performance. Adjusted EBITDA is a financial measure that is not required by or presented in accordance with GAAP. Management believes that this measure provides investors an additional meaningful method to evaluate certain aspects of the Company's results period over period. Adjusted EBITDA is defined as net loss before interest income, net, interest expense, income tax expense, depreciation and amortization expense, stock-based compensation, change in fair value of assumed warrant liabilities, and other non-recurring non-operating income and expenses. IonQ uses Adjusted EBITDA to measure the operating performance of its business, excluding specifically identified items that it does not believe directly reflect its core operations and may not be indicative of recurring operations. The presentation of non-GAAP financial measures is not meant to be considered in isolation or as a substitute for the financial results prepared in accordance with GAAP, and IonQ's non-GAAP measures may be different from non-GAAP measures used by other companies. For IonQ's investors to be better able to compare the Company's current results with those of previous periods, IonQ shows a reconciliation of GAAP to non-GAAP financial measures at the end of this release.
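The definition above translates directly into a reconciliation formula. A minimal sketch, with illustrative numbers only (this is not IonQ's actual reconciliation, and the sign conventions are an interpretation of the definition: income items are removed, expense and non-cash loss items are added back):

```python
def adjusted_ebitda(net_loss, interest_income_net=0.0, interest_expense=0.0,
                    income_tax=0.0, depreciation_amortization=0.0,
                    stock_based_comp=0.0, warrant_fv_change=0.0,
                    other_nonrecurring=0.0):
    """Net loss with the items listed in the release's definition added back."""
    return (net_loss
            - interest_income_net        # income: remove its benefit
            + interest_expense           # expenses and non-cash charges:
            + income_tax                 # add them back
            + depreciation_amortization
            + stock_based_comp
            + warrant_fv_change          # non-cash warrant fair-value loss
            + other_nonrecurring)

# Illustrative only: a $10M net loss with $2M D&A and $3M stock comp
# reconciles to a $5M Adjusted EBITDA loss.
print(adjusted_ebitda(-10.0, depreciation_amortization=2.0,
                      stock_based_comp=3.0))
```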

About IonQ

IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's current-generation quantum computer, IonQ Forte, is the latest in a line of cutting-edge systems, boasting an industry-leading 29 algorithmic qubits. Along with record performance, IonQ has defined what it believes is the best path forward to scale.

IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. To learn more, visit http://www.ionq.com.

IonQ Forward-Looking Statements

This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions are intended to identify forward-looking statements. These statements include those related to the company's technology driving commercial advantage in the future, the company's future financial and operating performance, including our outlook and guidance, the ability for third parties to implement IonQ's offerings to increase their quantum computing capabilities, the effect of increased availability of customer support functions, IonQ's quantum computing capabilities and plans, access to IonQ's quantum computers, increases in algorithmic qubit achievement, and the scalability and reliability of IonQ's quantum computing offerings. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: changes in the competitive industries in which IonQ operates, including development of competing technologies; changes in laws and regulations affecting IonQ's business; IonQ's ability to implement its business plans, forecasts and other expectations, identify and realize partnerships and opportunities, and to engage new and existing customers, and risks associated with U.S. government sales, including provisions that allow the government to unilaterally terminate or modify contracts for convenience and the uncertain scope and impact of a possible U.S. government shutdown or operation under a continuing resolution. You should carefully consider the foregoing factors and the other risks and uncertainties disclosed in the Company's filings, including but not limited to those described in the "Risk Factors" section of IonQ's most recent Quarterly Report on Form 10-Q and other documents filed by IonQ from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and IonQ assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. IonQ does not give any assurance that it will achieve its expectations.

IonQ, Inc.

Condensed Consolidated Statements of Operations

(unaudited)

(in thousands, except share and per share data)

                                                   Three Months Ended      Nine Months Ended
                                                      September 30,          September 30,
                                                     2023       2022        2023       2022
  Revenue                                        $  6,136   $  2,763    $ 15,936   $  7,324
  Costs and expenses:
    Cost of revenue (excluding depreciation
      and amortization)                             2,008        733       4,945      2,043
    Research and development                       24,599     13,292      60,701     30,282
    Sales and marketing                             5,047      1,969      11,289      5,971
    General and administrative                     13,927     10,149      35,438     26,901
    Depreciation and amortization                   2,749      1,531       6,869      4,248
  Total operating costs and expenses               48,330     27,674     119,242     69,445
  Loss from operations                            (42,194)   (24,911)   (103,306)   (62,121)
  Change in fair value of warrant liabilities

Read more from the original source:
IonQ Announces Third Quarter 2023 Financial Results - Yahoo Finance

Microsoft and Photonic join forces on the path to quantum at scale – Microsoft Azure Quantum Blog – Microsoft

We are excited to announce a strategic co-innovation collaboration with Photonic Inc., a company focused on building scalable, fault tolerant, and distributed quantum technologies. Our shared mission is to unlock the next stages in quantum networking and empower the quantum computing ecosystem with new capabilities enabled by our unique and complementary approaches to scalable quantum infrastructure.

By combining Photonic's novel spin-photon architecture that natively supports quantum communication over standard telecom wavelengths with the global scale and state-of-the-art infrastructure of Azure, we will work together to integrate quantum networking capabilities into everyday operating environments. Together, we aim to deliver new technologies that will enable reliable quantum communication over long distances and accelerate scientific research and development with quantum computing devices to be integrated into Azure Quantum Elements.

"We are thrilled about joining forces with Photonic in improving the world through quantum technologies. There is an opportunity to ignite new capabilities across the quantum ecosystem extending beyond computing, such as networking and sensing, and unlocking applications and scientific discovery at scale across chemistry, materials science, metrology, communications, and many other fields. The capabilities we aim to deliver with Photonic can enable this vision and bring about quantum's impact far more quickly than otherwise possible." – Jason Zander, Executive Vice President of Strategic Missions and Technologies, Microsoft.

Realizing this vision requires a fundamental capability: entanglement distribution over long distances. Photonic's unique architecture is based on highly connected silicon spin qubits with a spin-photon interface. By using a qubit with a photon interface, this novel approach communicates over ultralow-loss standard telecom fibers and wavelengths. When paired with Microsoft's global infrastructure, platforms, and the scale of the Azure cloud, this technology will integrate new quantum networking capabilities into everyday operating environments.

Together, Microsoft and Photonic will address three stages of quantum networking.

"It will take a global ecosystem to unlock the full promise of quantum computing. No company or country can do it alone. That's why we're incredibly excited to be partnering with Microsoft to bring forth these new quantum capabilities. Their extensive global infrastructure, proven platforms, and the remarkable scale of the Azure cloud make them the ideal partner to unleash the transformative potential of quantum computing and accelerate innovation across the quantum computing ecosystem." – Dr. Stephanie Simmons, founder and Chief Quantum Officer of Photonic, and Co-Chair of Canada's National Quantum Strategy Advisory Board.

It is only through global collaboration and co-innovation that we will be able to empower people to unlock solutions to the biggest challenges facing our industries and our world. Just as the cloud democratized access to supercomputers, once available only to governments, research universities, and the most resourced corporations, we are on a mission to engineer a fault-tolerant quantum supercomputing ecosystem at scale on Azure. We announced last June our roadmap to a Level 3 quantum supercomputer, along with peer-reviewed research demonstrating that we've achieved our first milestone.

Scientific discovery is crucial to our global future, and we want to empower scientists today with the best available offerings in the ecosystem. That is why, as part of our co-innovation collaboration, we plan to integrate Photonic's unique quantum hardware into our Azure Quantum Elements offering as it becomes available. Our collaboration with Photonic seeks to enable scientific exploration at Level 1, foundational quantum computing, with a firm commitment to reach higher levels of resilience and scale on the path to quantum supercomputing in the future.

With Azure Quantum Elements, your quantum solutions will be completely integrated with high-value advancements in high-performance computing (HPC) and AI, so you can transform your research and development processes today with the certainty that you will be ready to adopt quantum supercomputing at scale seamlessly in the future. You can sign up for our Private Preview of Azure Quantum Elements now.

To learn more about how Microsoft and Photonic will be working together to advance the next stages of quantum networking and empower the quantum ecosystem with new capabilities, register for the January episode of the Quantum Innovator Series.

Photonic is building a scalable, fault-tolerant, and unified quantum computing and networking platform, uniquely based on proven spin qubits in silicon. Photonic's platform offers a native telecom networking interface and the manufacturability of silicon. Headquartered in Vancouver, Canada, Photonic also has offices in the United States and the United Kingdom. To learn more about the company, visit their website.

Continue reading here:
Microsoft and Photonic join forces on the path to quantum at scale - Microsoft Azure Quantum Blog - Microsoft

Security in the impending age of quantum computers – Help Net Security

Quantum computing is poised to be one of the most important technologies of the 21st century. With global governments having collectively pledged more than $38 billion in public funds for quantum technologies and $2.1 billion of new private capital flowing to quantum companies in 2022, quantum technologies, particularly quantum computers, are rapidly moving from the lab to the commercial marketplace.

By leveraging the principles of quantum mechanics, quantum computers have the potential to perform certain computations exponentially faster than classical computers. From drug discovery and modeling chemical reactions to optimization problems and emissions reduction, quantum computers are poised to revolutionize various industries and accelerate global scientific progress.

In addition to these use cases, quantum computing is particularly important to the cybersecurity community. That's because a future quantum computer of sufficient size and efficiency could crack current encryption schemes and jeopardize all information and communications currently protected by such schemes (public-key encryption enables more than 4.5 billion internet users to securely access 200 million websites and engage in $3 trillion of retail ecommerce each year).

On the opposite side of that coin, certain quantum technologies can strengthen existing encryption and power an estimated $30 billion quantum cybersecurity market by the end of the decade.

Quantum technologies are, therefore, both a sword and a shield for cybersecurity. Thus, organizations and governments must invest in quantum offensively and defensively to adequately protect our data from the threat that quantum computers pose to current encryption schemes and to take advantage of extremely strong cybersecurity solutions over the long term.

Today's public-key cryptography, which secures much of our communications and data, relies upon math problems that are extraordinarily difficult for classical computers to solve. But a quantum computer running a dedicated algorithm (such as Shor's algorithm) will likely be able to extract the encryption key and decrypt the underlying data in a reasonable time interval. Thus, any system using public-key encryption will be vulnerable to an attack by a quantum computer.

The timeline for developing a cryptographically relevant quantum computer is highly contested, with estimates often ranging between 5 and 15 years. Although the date when such a quantum computer exists remains in the future, this does not mean it is a problem only for future CIOs and IT professionals. The threat is live today because of "harvest now, decrypt later" attacks, whereby an adversary stores encrypted communications and data gleaned through classical cyberattacks and waits until a cryptographically relevant quantum computer is available to decrypt the information. To further highlight this threat, the encrypted data could be decrypted long before a cryptographically relevant quantum computer is available if the data is secured via weak encryption keys.

While some data clearly loses its value in the short term, social security numbers, health and financial data, national security information, and intellectual property retain value for decades and the decryption of such data on a large scale could be catastrophic for governments and companies alike.

To address this threat, the National Institute of Standards and Technology (NIST) has been working since 2016 to identify and select a set of cryptographic algorithms that are theoretically resistant to attacks from quantum computers and classical computers. NIST released draft standards for the first three algorithms in August 2023 and is currently accepting comments until November 22, 2023, before finalizing the standards in 2024.

We have seen significant action from the Legislative and Executive branches of the US government to push the public sector to migrate to post-quantum cryptography (PQC) algorithms as soon as they are standardized.

In May of 2022, President Biden released a national security memorandum (NSM-10) outlining efforts to protect US government assets from the quantum threat. NSM-10 states that the goal for the US federal government is to move the maximum number of systems off quantum-vulnerable cryptography within a decade of the publication of the initial set of standards.

In December of 2022, President Biden signed the Quantum Computing Cybersecurity Preparedness Act requiring federal agencies to migrate information technology systems to post-quantum cryptography. This will be impactful for government agencies as well as companies that do business with the federal government, especially those providing IT services.

Both government actions seek to align the federal government to the NIST PQC algorithms with a goal of completing as much of the migration as possible by 2035.

This is an aggressive timeline. Historically, major cryptographic transitions can take years and even decades to complete. Starting the migration process now gives organizations the chance to put in place protections before cryptographically relevant quantum computers become available. The PQC migration is likely to be a long and resource-intensive exercise and will require cryptographic agility to shift IT systems to the final standards, provide flexibility among the algorithms, and protect data with minimal disruption.

Although NIST's algorithms are not yet standardized and we likely remain years away from a cryptographically relevant quantum computer, the time is now to:

1) Begin inventorying cryptography systems that will be vulnerable to future quantum attacks;
2) Develop "Quantum IQ" across your organization by exploring the benefits and risks that quantum technologies will pose for your business;
3) Review the NIST post-quantum algorithms and create a strategy for cryptographic agility that will allow you to shift your systems to the final standards and protect your data with minimal disruption; and
4) Identify partners established in the quantum ecosystem who can guide you through the transition to quantum-resilient cybersecurity while protecting data from both classical and quantum cyberattacks.

At Quantum World Congress, James Kushmerick, director of the Physical Measurement Laboratory at NIST, stated: "the sooner we get this out, the better off we'll be whenever a cryptographically relevant quantum computer is developed." This will be a long process, and government agencies and the private sector must begin the hard work of inventorying data and putting together a plan for PQC migration and cryptographic agility now to adequately prepare for the threat.

Cryptographic agility is particularly important given that one of the then-leading PQC candidates, Rainbow, was broken in 2022 using a laptop. While migrating to PQC algorithms is extremely important (and will likely be required for government agencies and their private sector contractors), these algorithms are still based on math problems, and there is a chance that they may be cracked in the future, requiring new algorithms to be integrated. The ability to implement seamless updates will be important for organizations to maintain trust and ensure the integrity of cryptography against threats both classical and quantum.

Additionally, cryptographic agility will give organizations the freedom to assess and test quantum-leveraged cybersecurity solutions, such as quantum-hardened keys and quantum key distribution (QKD), as part of an "all of the above" approach to cybersecurity in the quantum age.

As discussed above, PQC is a necessary start for cybersecurity in the quantum age. However, it is not a panacea to the threat posed by quantum computers.

Given the scale of the threat that future quantum computers are likely to pose to encryption, a defense in depth approach will be necessary. This layered approach will deploy PQC algorithms alongside tools that leverage quantum mechanics to offer stronger security guarantees. Such technologies include quantum computing-hardened and non-deterministic processes for encryption key generation and QKD.

Encryption keys are the basis of all cryptography and classical and PQC algorithms both rely on properly secured keys. One of the benefits of quantum is that it is fundamentally unpredictable and applying the power of quantum computers to harden encryption key generation can protect sensitive encrypted data and communications against current and emerging threats.

Furthermore, these provably unpredictable keys can maximize the resilience and lifetime value of existing critical technology infrastructures. Such quantum-computing-hardened keys are available today and offer a stronger alternative for key generation.

Rather than relying on math problems, QKD relies upon quantum mechanics to protect communications. With QKD it is possible to exchange a key and prove that it has not been intercepted by anyone as the attempt to measure the communication by a potential eavesdropper alters the quantum state. This guarantee will last forever, meaning that once a key is exchanged, it is safe, even if encryption algorithms are broken. Importantly, QKD protects against classical attacks such as man-in-the-middle attacks as well as attacks from a future cryptographically relevant quantum computer.

QKD is a complex technology that requires a significant amount of hardware and infrastructure to adequately generate the quantum states that make it valuable for the future of our cybersecurity. Researchers and industry are working to overcome these challenges and further commercialize and deploy QKD in the field, but its range and commercial implementation remains somewhat limited.

The National Security Agency (NSA) has issued guidance that does not recommend QKD technologies for securing the transmission of data in national security systems today. However, as the technology continues to advance, QKD has the potential to offer the next generation of cybersecurity and provide a uniquely quantum method for securing communications that will stand to greatly benefit military and civilian networks alike. When combined with PQC algorithms, these quantum-derived technologies can provide a layered approach to cybersecurity that further protects data and communications.

By considering how quantum-hardened encryption keys and QKD can fit into their quantum readiness strategy alongside PQC, organizations will be better prepared for that unknown day in the future when a cryptographically relevant quantum computer becomes available. These technologies are complementary rather than opposed to one another and are all critical pieces of the cybersecurity puzzle in the quantum age.

Read more here:
Security in the impending age of quantum computers - Help Net Security

Why Enterprises and Governments Must Prepare for Q-Day Now – Infosecurity Magazine

In today's hyperconnected world, enterprises and governments are accelerating digital transformations to revolutionize the way businesses operate: improving efficiency and productivity, creating new revenue streams, and delivering value to customers. As part of this effort, governments and businesses alike are investing in quantum computing to help address societal challenges and improve efficiencies and insights.

While quantum computing has huge potential for good, when in the wrong hands it also has the ability to cause tremendous harm. So what are the quantum threats facing enterprises and governments and why should they start preparing for the impending Q-Day now?

Across the world, we are seeing governments and enterprises increase investments in quantum computing to tackle issues around sustainability, defense and climate change. Organizations that invest early in quantum computing are more likely to reap significant benefits.

For example, we are already seeing quantum computing help banks run more advanced financial computations and help companies like Mercedes-Benz shape the future of electric cars. As the potential to use quantum computing for good appears to be limitless, we can expect that companies and countries will continue to leverage its capabilities.

Just as we are able to leverage quantum-speed problem solving for good, it can also be used to wield quantum-speed cyber-attacks. However, this will require a cryptographically relevant quantum computer (CRQC), which does not yet exist. So, if quantum computers haven't yet arrived, you may ask: why do we need to worry now?

With technology advances, Q-Day – the day a viable CRQC will be able to break most of today's public-key encryption algorithms – is moving ever closer. Some experts predict it could be as soon as 2030, so the sooner we can prepare ourselves the better. In fact, governments and enterprises are already at risk; even today, bad actors are employing "harvest now, decrypt later" tactics, collecting encrypted data to decrypt for a mass Q-Day attack.

In the hands of bad actors, quantum computers carry the potential to impact economies, disrupt critical research or, worse, endanger lives. Human-critical networks, such as power grids, healthcare networks, utilities, public safety and financial systems, are particularly vulnerable given the potential for financial gain associated with interference in these sectors. The diversity of cyber-attacks that we're currently seeing across industries indicates that cybercriminals are targeting multiple sectors to find vulnerable systems and victims; no sector is exempt.

To cause disruption, bad actors will use a CRQC to unravel current data encryption protocols that protect sensitive data, making current public key cryptography methods obsolete. They could hijack millions of connected IoT devices to create distributed denial of service (DDoS) botnets that flood IP and optical networks with terabits of data and hundreds of millions of packets per second.

With public citizen data, national security, financial records, intellectual property and critical infrastructure all at risk, we must prepare our enterprise and critical networks for the possibility of quantum computing threats now. This will involve network modernization, including the updating and upgrading of network infrastructure and protocols, as well as implementing security measures to ensure the safety of communications. A multi-layer approach from the optical core to the IP edge and application layer will be essential in effectively encrypting in-flight network data, according to the transmission and network infrastructure.

Quantum computers are here and becoming accessible around the globe, so now is the time to build quantum-safe networks with advanced cybersecurity protection and post-quantum era encryption. Critical enterprises and governments need to protect themselves and their critical infrastructure against these attacks, so they are ready for Q-Day.

Link:
Why Enterprises and Governments Must Prepare for Q-Day Now - Infosecurity Magazine

Best practices for portfolio optimization by quantum computing … – Nature.com

Dataset

The data are collected from Yahoo! Finance29 using yfinance30, an open-source tool that uses Yahoo's publicly available APIs. This tool, according to its creator, is intended for research and educational purposes.

To explore the efficiency of the proposed approach, small-sized examples are considered by extracting at most \(N=4\) different assets: Apple, IBM, Netflix, and Tesla. These are representative global assets with interesting dynamics influenced by financial and social events. For each asset i, with \(1 \le i \le N\), the temporal range between 2011/12/23 and 2022/10/21 is considered. For each day t in this range (\(0 \le t \le T\)), the performance of an asset is well represented by its closing price \(p^t_i\). A sub-interval of the dates considered is shown in Table 1. Additional experiments, performed on different datasets falling within the same time interval considered here, are available in the supplementary information.

The first information extracted from this data set consists of the list P of current prices \(P_i\) of the considered assets:

$$\begin{aligned} P_{i}=p^T_i. \end{aligned}$$

(1)

Moreover, for each asset, the return \(r^t_i\) between the days \(t-1\) and t can be calculated:

$$\begin{aligned} r^t_i=\frac{p^t_i-p^{t-1}_i}{p^{t-1}_i} \end{aligned}$$

(2)

These returns, calculated for days when both the initial and final prices are known, cannot be used directly for inference. Instead, it is convenient to define the expected return of an asset as an educated guess of its future performance. Assuming a normal distribution of the returns, the average of their values over the set of historical observations is a good estimator of the expected return. Therefore, given the entire historical data set, the expected return \(\mu_i\) of each asset is calculated by:

$$\begin{aligned} \mu_i=E[r_i]=\frac{1}{T}\sum_{t=1}^{T}r^t_i. \end{aligned}$$

(3)

Following the same principle, the variance of each asset return and the covariance between returns of different assets over the historical series can be calculated as follows:

$$\begin{aligned}&\sigma^{2}_{i}=E[(r_{i}-\mu_{i})^2]=\frac{1}{T-1}\sum_{t=1}^{T}(r^{t}_{i}-\mu_{i})^{2}, \\&\sigma_{ij}=E[(r_{i}-\mu_{i})(r_{j}-\mu_{j})]=\frac{1}{T-1}\sum_{t=1}^{T}(r^{t}_{i}-\mu_{i})(r^{t}_{j}-\mu_{j}). \end{aligned}$$

(4)
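As a concrete illustration of Eqs. (1)–(4), the price-to-statistics pipeline can be sketched in NumPy. The price matrix below is synthetic and purely illustrative, not the Yahoo! Finance data used in the paper.

```python
import numpy as np

# Synthetic closing prices p[t][i] for N = 2 assets over 4 days
# (illustrative numbers only, not real market data).
prices = np.array([
    [100.0, 50.0],
    [102.0, 49.0],
    [101.0, 51.0],
    [105.0, 52.0],
])

# Eq. (1): current prices P_i = p^T_i (closing prices of the last day).
P = prices[-1]

# Eq. (2): daily returns r^t_i = (p^t_i - p^{t-1}_i) / p^{t-1}_i.
returns = np.diff(prices, axis=0) / prices[:-1]

# Eq. (3): expected returns mu_i as the historical mean of the returns.
mu = returns.mean(axis=0)

# Eq. (4): sample covariance matrix with 1/(T-1) normalization
# (np.cov uses exactly this normalization by default).
Sigma = np.cov(returns, rowvar=False)
```

With real data, `prices` would hold one row per trading day and one column per asset; the rest of the pipeline is unchanged.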

The traditional theory of PO was initially formulated by Markowitz1. There are multiple possible formulations of PO, all embodying different degrees of approximation of the real-life problem. This work deals with multi-objective portfolio optimization: this approach tries to simultaneously maximize the return and minimize the risk while investing the available budget. Even if other formulations include more objectives, the aim is still the solution of a constrained quadratic optimization problem; therefore, the formulation considered here is general enough to test the performance of the proposed approach.

A portfolio is defined as the set of investments \(x_{i}\) (measured as a fraction of the budget or number of asset units) allocated for each ith asset of the market. Therefore, the portfolio consists of a vector of real or integer numbers with dimension equal to the number of assets considered. An optimal strategy for portfolio allocation aims to achieve the maximum portfolio return \(\mu^{\text{T}} x\) while minimizing risk, defined as the portfolio variance \(x^{\text{T}}\Sigma x\) (whose square root is the portfolio volatility), where \(\mu\) is the vector of mean asset returns calculated by (3), \(\Sigma\) is the covariance matrix calculated by (4), and x is the vector of investments measured as fractions of budget. Hence, the task of finding the optimal portfolio aims at finding the x vector that maximizes the following objective function:

$$\begin{aligned} \mathscr{L}(x): \mu^{\text{T}} x - qx^{\text{T}}\Sigma x, \end{aligned}$$

(5)

where the risk aversion parameter q expresses the propensity to risk of the investor (a trade-off weight between the risk and the return).

In a realistic scenario, the available budget B is fixed. Therefore, the constraint that the sum of the \(x_i\) equals 1 must hold. Moreover, if only buying is allowed, each \(x_i \ge 0\); this constraint does not hold if both buying and selling are possible. As a consequence, in the general case, the problem can be stated as follows:

$$\begin{aligned}&\underset{x}{\max}\,\mathscr{L}(x): \underset{x}{\max}\,(\mu^{\text{T}} x - qx^{\text{T}}\Sigma x),\\&\text{s.t.}\quad \sum^{N}_{i=1}x_i=1 \end{aligned}$$

(6)

However, if x is a possible solution to the problem with continuous variables, each product \(x_i B\) must be an integer multiple of the corresponding price \(P_i\) calculated by (1), since only an integer number of units of each asset can be exchanged. Therefore, only the subset of possible solutions corresponding to integer units is acceptable, and the problem is better stated as follows:

$$\begin{aligned}&\underset{n}{\max}\,\mathcal{L}(n): \underset{n}{\max}\,(\mu'^{\text{T}} n - qn^{\text{T}}\Sigma' n),\\&\text{s.t.}\quad P'^{\text{T}} n = 1 \end{aligned}$$

(7)

where n is the vector of \(n_i\) integer units of each asset, while \(P'=P/B\), \(\mu'=P'\circ \mu\) and \(\Sigma'=(P'\circ \Sigma)^{\text{T}}\circ P'\) are appropriate transformations of \(\mu\) and \(\Sigma\). The latter formulation (7) is an integer constrained quadratic optimization problem.
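The rescaling into problem (7) can be sketched in a few lines of NumPy. All numerical inputs below (prices, budget, returns, covariance, risk aversion) are assumed illustrative values, not the paper's dataset.

```python
import numpy as np

# Assumed inputs (illustrative only).
P = np.array([105.0, 52.0])               # current prices, Eq. (1)
B = 200.0                                 # available budget
mu = np.array([0.0166, 0.0135])           # expected returns, Eq. (3)
Sigma = np.array([[2.0e-4, 0.5e-4],
                  [0.5e-4, 4.0e-4]])      # covariance matrix, Eq. (4)
q = 0.5                                   # risk aversion parameter

# Transformations of Eq. (7): P' = P/B, mu' = P' o mu,
# Sigma' = (P' o Sigma)^T o P', with o the element-wise (Hadamard) product.
P1 = P / B
mu1 = P1 * mu
Sigma1 = (P1 * Sigma).T * P1              # Sigma'_{ij} = P'_i P'_j Sigma_{ij}

# Objective of Eq. (7) evaluated at some integer allocation n:
n = np.array([1, 1])
value = mu1 @ n - q * n @ Sigma1 @ n
```

The budget constraint \(P'^{\text{T}} n = 1\) then selects, among integer vectors n, those that spend exactly the budget B.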

Possible solutions to the problem (6) are those satisfying the constraint. Among them, some correspond to possible solutions to problem (7). The collection of possible solutions corresponding to portfolios with maximum return for any given risk is called the Markowitz efficient frontier. The solution of the constrained quadratic optimization problem lies on the efficient frontier, and its distance from the minimum-risk point depends on q.

The general problem, if regarded in terms of continuous variables, can be solved exactly by Lagrange multipliers in the case of equality constraints, or by the Karush–Kuhn–Tucker conditions, which generalize the method of Lagrange multipliers to include inequality constraints31, as the covariance matrix is positive semi-definite32. Optimizing a quadratic function subject to linear constraints leads to a linear system of equations, solvable by Cholesky decomposition33 of the symmetric covariance matrix. The exact solution involves the computation of the inverse of an \(N \times N\) matrix, where N is the number of assets, thus requiring about \(O(N^3)\) floating-point operations34.

As long as integer or binary variables are considered, the problem turns into combinatorial optimization. The computational complexity is known to be high, since the optimization problem is NP-hard35,36, while the decision version is NP-complete37. Indeed, a search approach must find the optimum among a number of possible solutions that increases exponentially with the number of assets (e.g., \(2^b\) possible solutions for b binary variables, and \((n_{\max}+1)^N\) possible solutions for N integer variables ranging from 0 to \(n_{\max}\)).

In practice, various methods are currently employed, either based on geometric assumptions, such as the branch-and-bound method2,3, or on heuristic algorithms4,5,6, such as Particle Swarms, Genetic Algorithms, and Simulated Annealing. These have some limitations but allow one to obtain approximate solutions. However, in all cases, the exact or approximate solution is feasible only for a few hundred assets on current classical computers.

Using quantum mechanical effects, like interference and entanglement, quantum computers can perform computational operations within the Bounded-error Quantum Polynomial (BQP) complexity class, which is the quantum analogue of the Bounded-error Probabilistic Polynomial (BPP) class. Even if there is no NP problem for which there is a provable quantum/classical separation, it is widely believed that BQP \(\not\subset\) BPP; hence, when considering time complexity, quantum computers are more powerful than classical computers. More generally, it is conjectured that P is a proper subset of BQP. Therefore, while all problems that can be efficiently solved classically are efficiently solvable by quantum computers as well, some problems exist that are, to date, considered intractable for classical computers in polynomial time, yet could be solved with quantum machines. These facts are still a matter of investigation, but there are good reasons to believe that some problems are solvable by QC more efficiently than by classical computers; thus QC will have a disruptive potential over some hard problems38, among which are constrained quadratic optimization problems, including PO.

The branch-and-bound method3,7 is used in this work as a classical benchmark against which to compare the results of the proposed approach. It is based on a Lagrangian dual relaxation and continuous relaxation for a discrete multi-factor portfolio selection model, which leads to an integer quadratic programming problem. The separable structure of the model is exploited using Lagrangian relaxation and dual search. This algorithm is capable of solving portfolio problems with up to 120 assets.

Specifically, the CPLEX library, freely available for Python, provides a robust implementation of the aforementioned classical solving scheme.

As formulated in Eq. (7), the PO problem lies within the class of quadratic optimization problems. To be quantum-native, it has to be converted into a Quadratic Unconstrained Binary Optimization (QUBO) problem, i.e., the target vector to be found has to be expressed as a vector of zeros and ones, and constraints have to be avoided.

Therefore, the binary conversion matrix C is constructed with a number of binarizing elements \(d_i\) for each asset i, depending on its price \(P_i\). Hence

$$ n^{\max}_{i}=\mathrm{Int}\left( \frac{B}{P_i}\right), $$

(8)

where the operation Int stands for the integer part, and

$$ d_i=\mathrm{Int}\left( \log_{2} n^{\max}_i \right), $$

(9)

such that

$$ n_{i}=\sum_{j=0}^{d_i} 2^{j} b_{i,j}. $$

(10)
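As an illustrative sketch (not code from the paper; the helper names are hypothetical), the encoding of Eqs. (8)-(10) can be written in Python as:

```python
import math

def binary_encoding_sizes(prices, budget):
    """Per asset i: maximum purchasable number of shares
    n_max_i = Int(B / P_i) (Eq. 8) and number of extra binary
    variables d_i = Int(log2(n_max_i)) (Eq. 9)."""
    n_max = [int(budget / p) for p in prices]
    d = [int(math.log2(n)) for n in n_max]
    return n_max, d

def decode(bits, d):
    """Recover the integer allocations n_i from the flattened binary
    vector b (Eq. 10): n_i = sum_j 2^j * b_{i,j}, j = 0..d_i."""
    n, k = [], 0
    for d_i in d:
        n.append(sum(2**j * bits[k + j] for j in range(d_i + 1)))
        k += d_i + 1
    return n
```

For example, with a budget B = 100 and prices (10, 25), asset 1 gets \(n^{\max}_1 = 10\) and \(d_1 = 3\), i.e., four binary variables.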

In this way, the overall dimension of the binarized target vector, \(b=\left[ b_{1,0},\dots ,b_{1,d_1},\dots ,b_{N,0},\dots ,b_{N,d_N}\right]\), is \(\dim(b) =\sum_{i=1}^{N}\left( d_i+1\right)\), which is lower than that used in the implementation available in Qiskit15. Conveniently, the encoding matrix C is defined as follows:

$$ C= \begin{pmatrix} 2^{0} & \dots & 2^{d_1} & 0 & \dots & 0 & \dots & 0 & \dots & 0 \\ 0 & \dots & 0 & 2^{0} & \dots & 2^{d_2} & \dots & 0 & \dots & 0 \\ \vdots & \ddots & \vdots & \vdots & \ddots & \vdots & \ddots & \vdots & \ddots & \vdots \\ 0 & \dots & 0 & 0 & \dots & 0 & \dots & 2^{0} & \dots & 2^{d_N} \end{pmatrix}, $$

(11)

and thus the conversion can be written in short notation as \(n = Cb\). Problem (7) can then be redefined in terms of the binary vector b by applying the encoding matrix: \(\mu''=C^{\text{T}}\mu'\), \(\Sigma''=C^{\text{T}}\Sigma' C\), and \(P''=C^{\text{T}}P'\):

$$ \begin{gathered} \max_{b}\, \mathcal{L}(b): \max_{b} \left( \mu''^{\text{T}} b - q\, b^{\text{T}} \Sigma'' b \right), \\ \text{s.t.}\quad P''^{\text{T}} b = 1, \\ b_{i} \in \{0,1\} \quad \forall i \in \left[ 1,\dots ,\dim(b)\right]. \end{gathered} $$

(12)
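Under the same conventions, the block-diagonal encoding matrix of Eq. (11) can be sketched as follows (an illustrative implementation, not the paper's code):

```python
import numpy as np

def encoding_matrix(d):
    """Block-diagonal binary encoding matrix C of Eq. (11), so that
    the integer allocation vector is n = C b. Row i carries the powers
    of two 2^0 .. 2^{d_i} for asset i; all other entries are zero."""
    cols = sum(di + 1 for di in d)
    C = np.zeros((len(d), cols))
    k = 0
    for i, di in enumerate(d):
        C[i, k:k + di + 1] = [2**j for j in range(di + 1)]
        k += di + 1
    return C
```

For d = [1, 2] this yields a 2x5 matrix, matching \(\dim(b)=\sum_i(d_i+1)=5\).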

The problem (12) falls into the wide set of binary quadratic optimization problems with a constraint, given by the total budget. In this form, the problem cannot be cast directly into a suitable set of quantum operators to run on quantum hardware: the constraint, in particular, is troublesome, as it poses a hard limitation on the sector of Hilbert space that the algorithm must explore to find a solution. It is thus necessary to convert the problem into a QUBO (Quadratic Unconstrained Binary Optimization) by transforming the constraint into a penalty term in the objective function. Each kind of constraint can be converted into a specific penalty term39; the one considered in (12), an equality constraint linear in the target variable, maps into \(\lambda (P''^{\text{T}} b-1)^{2}\), so that (12) can be written as the following QUBO problem:

$$ \max_{b}\, \mathcal{L}(b): \max_{b} \left( \mu''^{\text{T}} b - q\, b^{\text{T}} \Sigma'' b - \lambda (P''^{\text{T}} b - 1)^{2} \right). $$

(13)

The penalty coefficient \(\lambda\) is a key hyperparameter in casting the problem as the QUBO with objective function (13).

There is a strong connection, technically an isomorphism, between the QUBO formulation and the Ising Hamiltonian40. The Ising Hamiltonian was originally constructed to describe the microscopic behavior of magnetic materials, in particular the conditions that lead to a phase transition. However, its relative simplicity and natural mapping onto QUBO have made the Ising model a fundamental benchmark well beyond quantum physics. To convert (13) into Ising form, it is convenient to expand it in components:

$$ \mathcal{L}(b): \sum_{i} \mu''_{i} b_{i} - q \sum_{i,j} \Sigma''_{i,j} b_{i} b_{j} - \lambda \left( \sum_{i} P''_{i} b_{i} - 1 \right)^{2}, $$

(14)

where \(\mu''_{i}, \Sigma''_{i,j}, P''_{i}\) are the components of the transformed return, covariance, and price, respectively, and \(i,j\in \left[ 1,\dim(b)\right]\). Since the Ising model is written in spin variables \(s_{i}\), taking values \(\{-1,1\}\), the transformation \(b_{i}\rightarrow \frac{1+s_{i}}{2}\) is applied and the coefficients are rearranged to obtain the Ising objective function to minimize:

$$ \begin{aligned} &\min_{s}\, \mathcal{L}(s): \min_{s}\left( \sum_{i}h_{i}s_{i}+ \sum_{i,j} J_{i,j}s_{i}s_{j}+\lambda \Big(\sum_{i}\pi_{i}s_{i}-\beta \Big)^{2}\right), \\ &\text{s.t.} \quad s_{i}\in \{-1,1\} \quad \forall i, \end{aligned} $$

(15)

with \(J_{i,j}\) being the coupling between two spin variables. It is now straightforward to obtain the corresponding quantum Hamiltonian, whose eigenvector of minimum eigenvalue encodes the solution: indeed, the eigenvalues of the Pauli operator Z are \(\pm 1\), so it is suitable for describing the classical spin variables \(s_{i}\). Furthermore, the two-body interaction term can be modeled with the tensor product of two Pauli operators, i.e., \(Z_{i}\otimes Z_{j}\). The quantum Ising Hamiltonian reads:

$$ H= \sum_{i}h_{i}Z_{i} + \sum_{i,j} J_{i,j}Z_{i}\otimes Z_{j}+\lambda \Big(\sum_{i}\pi_{i}Z_{i}-\beta \Big)^{2}. $$

(16)

With the procedure described above, the integer quadratic optimization problem of portfolio allocation with a budget constraint is first expressed as a binary problem via the binary encoding, then translated into a QUBO by transforming the constraint into a penalty term with the chosen penalty coefficient, and finally into a quantum Hamiltonian written in terms of Pauli gates. Hence, the PO problem (7) is now formulated as the search for the ground state, i.e., the minimum-energy eigenstate, of the Hamiltonian (16), which corresponds to the optimal portfolio. Therefore, it is possible to use the VQE, on real quantum hardware, to iteratively approximate such a state, as described in the following section.

The VQE is a hybrid quantum-classical algorithm41 based on the variational principle: it estimates an upper bound on the lowest eigenvalue of a given observable with respect to a parameterized wave-function (ansatz). Specifically, given a Hamiltonian H representing the observable and a parameterized wave-function \(|\psi(\theta)\rangle\), the ground-state energy \(E_{0}\) satisfies

$$ E_{0}\le \frac{\langle \psi(\theta)| H |\psi(\theta)\rangle}{\langle \psi(\theta)|\psi(\theta)\rangle}, \quad \forall\, \theta. $$

(17)

Hence, the task of the VQE is to find the optimal set of parameters such that the energy associated with the ansatz state is nearly indistinguishable from the ground-state energy, i.e., finding the set of parameters \(\theta\), corresponding to energy \(E_{\min}\), for which \(|E_{\min}-E_{0}|\) is smaller than a given tolerance:

$$ E_{\min}=\min_{\theta}\, \langle \mathbf{0}| U^{\dagger}(\theta) H U(\theta) |\mathbf{0}\rangle, $$

(18)

where \(U(\theta)\) is the parameterized unitary operator that produces the ansatz wave-function when applied to the initial state, and \(E_{\min}\) is the energy associated with the parameterized ansatz. The Hamiltonian H, defined for the specific problem and here corresponding to (16), can be written in an operator basis that makes it naturally measurable on a quantum computer; this choice depends on the architecture considered. In this work, given the extensive use of the IBM Quantum Experience42, it is convenient to map the Hamiltonian onto the spin-operator basis, formed by tensor products of Pauli matrices (Pauli strings): \(P_{l}\in \{I,X,Y,Z\}^{\otimes N}\). In this basis the Hamiltonian can always be written in the general form \(H=\sum_{l}^{D}c_{l}P_{l}\), where D is the number of Pauli strings that define the Hamiltonian and \(c_{l}\) is a suitable set of weights. It follows that the VQE in Eq. (18) can be written as:

$$ E_{\min}=\min_{\theta} \sum_{l}^{D}c_{l}\, \langle \mathbf{0}| U^{\dagger}(\theta) P_{l}\, U(\theta) |\mathbf{0}\rangle. $$

(19)

Each term in Eq. (19) corresponds to the expectation value of the string \(P_{l}\) and is computed on quantum hardware (or a simulator). The summation and the optimization of the parameters are performed on a classical computer with an ad-hoc optimizer. The eigenvector corresponding to the ground state is the solution of problem (13), i.e., the optimal portfolio.

Schematic of the VQE algorithm. The ansatz wave-function \(|\psi(\theta)\rangle\) is initialized with random parameters and encoded in a given set of quantum gates. The PO problem is translated into an Ising Hamiltonian and encoded into a set of Pauli gates. The collection of output measurements allows the reconstruction of the expectation value of the Hamiltonian H, which is the energy to be minimized. A classical optimization algorithm provides an update rule for the parameters of the wave-function, which ideally moves iteratively towards the ground state of the problem, thus providing an estimation of the corresponding eigenstate. This corresponds to the solution of the original PO problem.

In light of the above, the complete VQE estimation process can be decomposed into a series of steps, as depicted in Fig. 1. First, a trial wave-function (ansatz), on which the expectation value is evaluated, must be prepared and realized via a parameterized quantum circuit. Then, the Hamiltonian (16), whose ground state is the solution of the problem to be addressed, must be defined and converted into the Pauli basis, so that the observable can be measured on the quantum computer. Finally, the parameters are trained using a classical optimizer. This hybrid system ideally converges to a state compatible with the ground state of the Hamiltonian.
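As a toy illustration of this loop (a purely classical sketch, not the Qiskit implementation used in the paper: the Hamiltonian is diagonal, as any Z-only Ising Hamiltonian is; the ansatz is a simple product of RY rotations; and the "optimizer" is a coordinate-wise grid search):

```python
import numpy as np

def ry_product_state(thetas):
    """Product ansatz: qubit k is in cos(t_k/2)|0> + sin(t_k/2)|1>."""
    state = np.array([1.0])
    for t in thetas:
        state = np.kron(state, [np.cos(t / 2), np.sin(t / 2)])
    return state

def energy(thetas, diag_H):
    """<psi|H|psi> for a diagonal (Z-only Ising) Hamiltonian."""
    psi = ry_product_state(thetas)
    return float(np.sum(psi ** 2 * diag_H))

def minimize_vqe(diag_H, n, sweeps=10):
    """Toy classical loop: coordinate-wise grid search over the angles."""
    rng = np.random.default_rng(0)
    thetas = rng.uniform(0.0, np.pi, n)
    grid = np.linspace(0.0, np.pi, 21)
    for _ in range(sweeps):
        for i in range(n):
            trials = []
            for g in grid:
                t = thetas.copy()
                t[i] = g
                trials.append((energy(t, diag_H), g))
            thetas[i] = min(trials)[1]
    return energy(thetas, diag_H), thetas
```

For a diagonal Hamiltonian the ground state is a computational basis state, which this product ansatz can represent exactly; real VQE runs of course replace each piece with a quantum circuit, hardware measurements, and an optimizer such as COBYLA or SPSA.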

This procedure includes two hyperparameters that have to be set, i.e., the type of ansatz and the optimizer. When defining the ansatz, two main features have to be taken into account: its expressivity, i.e., the set of states that can be spanned by the ansatz itself, and its trainability, i.e., the ability of the ansatz to be optimized efficiently with available techniques. It is worth pointing out the barren-plateau problem43: cost-function gradients can vanish exponentially, depending on the specific characteristics of the problem to be solved. Barren plateaus depend on the number of qubits, the expressivity of the ansatz wave-function, the degree of entanglement, and the quantum noise44. There are several methods to avoid or mitigate barren plateaus, especially in the context of VQE, most of which consist in finding a trade-off between the expressivity of the ansatz and its trainability, and in reducing the effective size of the Hilbert space of the problem formulation45.

The following ansatzes, available in Qiskit, are analyzed in this work: the Two Local ansatz, where qubits are coupled in pairs; the Real Amplitudes ansatz, which assumes a real-valued amplitude for each basis element of the wave-function; and the Pauli Two-Design ansatz, used mainly in quantum machine learning to mitigate barren plateaus46. Although other ansatzes are provided in Qiskit, they are generally unsuitable for a PO problem. For instance, the Excitation Preserving ansatz preserves the ratio between basis-vector components and hence does not allow, in principle, any weight imbalance in the output distribution while moving towards the solution of the problem.

For all the ansatzes considered, convergence is checked under four different assumptions on the entanglement structure of the wave-function: full, linear, circular, and pairwise entanglement. In the full case, every qubit is entangled pairwise with all the others. In the linear case, entanglement is built between consecutive pairs of qubits. The circular case is equivalent to the linear one, with an additional entangling layer connecting the first and the last qubit before the linear sector. Finally, in the pairwise construction, in one layer the i-th qubit is entangled with qubit (i+1) for all even i, and in a second layer qubit i is entangled with qubit (i+1) for odd values of i.
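The four entanglement maps described above can be sketched as pair lists (following the description in the text; Qiskit's own conventions may differ in detail):

```python
def entanglement_pairs(n, scheme):
    """Qubit pairs entangled in one ansatz layer, for n qubits."""
    if scheme == "full":        # every qubit with every other, pairwise
        return [(i, j) for i in range(n) for j in range(i + 1, n)]
    if scheme == "linear":      # consecutive pairs only
        return [(i, i + 1) for i in range(n - 1)]
    if scheme == "circular":    # linear plus a last-to-first link first
        return [(n - 1, 0)] + [(i, i + 1) for i in range(n - 1)]
    if scheme == "pairwise":    # even-i pairs in one layer, odd-i in the next
        return ([(i, i + 1) for i in range(0, n - 1, 2)]
                + [(i, i + 1) for i in range(1, n - 1, 2)])
    raise ValueError(f"unknown scheme: {scheme}")
```

For n = 4 qubits, "full" yields 6 pairs while "linear" yields only 3, illustrating the expressivity/depth trade-off discussed above.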

Once the ansatz is defined, its parameters must be optimized classically until convergence is reached. The choice of the optimizer is crucial because it affects the number of measurements needed to complete the optimization cycle: properly chosen, it can mitigate the barren-plateau problem and minimize the number of iterations required to reach convergence. In this work, dealing with the PO problem, different optimizers among those available in Qiskit, i.e., COBYLA, SPSA, and NFT47, are tested to select the one that fulfills the task fastest.

The experimental results presented in this work are obtained on real quantum hardware, specifically the IBM superconducting quantum computers. These machines belong to the class of NISQ (Noisy Intermediate-Scale Quantum) devices, i.e., hardware with a limited number of qubits and where noise is not suppressed. Noise in quantum computers comes from various sources: decoherence, imperfect gate fidelities, and measurement errors. Decoherence is the process that most quantum mechanical systems undergo when interacting with an external environment48. It causes the loss of virtually all the quantum properties of the qubits, which then collapse into classical bits. Gate fidelity measures the ability to physically implement the desired quantum gates: in IBM's superconducting-qubit hardware, gates are constructed via pulses, which are shaped and designed to control the superconductors. Given the limited ability to strictly control these pulses, a perfect gate implementation is highly non-trivial and subject to imperfections. Last, measurement errors are caused by the limits of the measurement apparatus, improper calibration, and imperfect readout techniques. Hence, NISQ devices do not always provide reliable results due to the lack of fault tolerance. However, they provide a good benchmark for testing the possibilities of quantum computing. Furthermore, research is ongoing on using NISQ devices in practical applications, such as machine learning and optimization problems.

In this work, both simulators and real quantum computers are used. Even though error mitigation techniques49 could be applied, the main goal of this paper is to test the performance of quantum computers on a QUBO problem such as PO, without error mitigation, with the binary encoding strategies and the budget constraint described in the previous sections. Therefore, no error mitigation is used in any computation, the aim being an indirect but comprehensive analysis of the hardware limitations and of the improvement in result quality afforded by a proper selection of the hyperparameters. This will provide a solid benchmark for the following experimental stages, which will be enabled in the coming years by large and nearly fault-tolerant quantum computers.

Hence, the experiments run on (noiseless) simulators are also repeated with noise mimicking real hardware: this can readily be implemented in Qiskit by inserting a noise model containing the decoherence parameters and gate error rates of real quantum hardware.

Moreover, experiments are run on IBM NISQ devices with up to 25 qubits. Specifically, a substantial subset of the quantum computers available in the IBM Quantum Experience was employed: IBM Guadalupe, Toronto, Geneva, Cairo, Auckland, Montreal, Mumbai, Kolkata, and Hanoi. These machines have either 16 or 27 qubits, but different quantum volumes (QV) and Circuit Layer Operations Per Second (CLOPS). QV and CLOPS are useful metrics for characterizing the performance of a quantum computation pipeline50. Generally, a larger QV means that the hardware can sustain deeper circuits at a relatively small cost in performance, while CLOPS quantifies the number of circuit-layer operations the hardware can handle per unit of time. Together, they qualify the quality and speed of quantum computation.

Link:
Best practices for portfolio optimization by quantum computing ... - Nature.com

GENCI/CEA, FZJ, and PASQAL Announce Significant Milestone in … – HPCwire

DENVER, Nov. 9, 2023. In the context of the SuperComputing 2023 conference in Denver (SC23), Grand Equipement National de Calcul Intensif (GENCI), Commissariat à l'énergie atomique et aux énergies alternatives (CEA), Forschungszentrum Jülich (FZJ), and PASQAL are demonstrating progress in the framework of the European project High-Performance Computer and Quantum Simulator hybrid (HPCQS).

HPC-Quantum Computing applications in finance, pharma, and energy are leveraging the upcoming quantum computers currently being installed at the supercomputing centers CEA/TGCC (France) and FZJ/JSC (Germany), and are already providing concrete results.

Now, PASQAL is delivering two 100+-qubit quantum computers to its first customers in France (GENCI/CEA) and Germany (FZJ). These devices, acquired in the framework of the European project HPCQS, and co-funded by the EuroHPC Joint Undertaking, France and Germany, will be coupled respectively with the Joliot-Curie and JURECA DC supercomputers.

Over the past months, several HPC-Quantum Computing and Simulation (HPC-QCS) applications have been studied on the targeted 100+-qubit quantum computing platform based on neutral atoms. These explorations have involved several industrial partners from various fields, who provided practical use cases that, with the support of the PASQAL team, were ported to the quantum system, enabling the development of more efficient drugs, more efficient electricity consumption, and competitive advantages in risk management.

A significant illustration of this is the development of a novel quantum algorithm to accelerate drug discovery. A joint collaboration between PASQAL and the startup Qubit Pharmaceuticals was launched at the end of 2021, co-funded by the Pack Quantique (PAQ) initiative of the Région Île-de-France for an 18-month project. This collaboration aims at improving the understanding of protein hydration, a crucial element in determining how a medicine candidate can inhibit the toxic behavior of the targeted protein. A preliminary version of the algorithm for identifying the presence of water molecules in the pockets of a protein has been implemented on PASQAL's analog quantum computer, validating theoretical predictions with an impressive match. The follow-up of this project is being co-funded by the Wellcome Trust Quantum for Bio program.

PASQAL will showcase this exploration of commercial and strategic advantages through live demos on the booths of both CEA and FZJ/JSC at the SuperComputing 2023 conference in Denver.

The two PASQAL quantum computers will be accessible to a wide range of European users in 2024. They are the first building blocks of a federated European HPC-QCS infrastructure that will also consist of the six quantum computers acquired by the EuroHPC JU and hosted in France (GENCI/CEA), Germany (LRZ), Czech Republic (IT4I @ VSB), Poland (PSNC), Spain (BSC-CNS) and Italy (CINECA).

HPCQS users are already able to validate their use cases through various entry points, such as the Pulser environment deployed on the Joliot-Curie and JURECA DC environments, as well as through remote access to a 100+-qubit device hosted on PASQAL's premises in Massy, France. Currently, some HPCQS users from JSC are performing remote simulations on this device to benchmark it and to demonstrate quantum many-body scarring, a phenomenon that has recently attracted much interest for the foundations of quantum statistical physics and for potential quantum information processing applications. European end-users will also soon have access to a more scalable, tensor-network-based emulator from PASQAL, called EMU-TN, which will also be deployed on both the French and German environments.

About HPCQS

HPCQS is an open and evolutionary infrastructure that aims at expanding in the future by including a diversity of quantum computing platforms at different technology readiness levels and by allowing the integration of other European quantum nodes. After the Jülich UNified Infrastructure for Quantum computing (JUNIQ), the HPCQS infrastructure realizes a second step towards a European Quantum Computing and Simulation Infrastructure (EuroQCS), as advocated in the Strategic Research Agenda of the European Quantum Flagship of 2020. At FZJ, HPCQS is fully integrated in JUNIQ. During the preparations for the Strategic Research and Industry Agenda (SRIA 2030) for Quantum Technologies in the European Union, the name of the EuroQCS infrastructure was changed to EuroHPC-QCS to emphasize the involvement of HPC as well.

Source: Grand Equipement National de Calcul Intensif

Read the rest here:
GENCI/CEA, FZJ, and PASQAL Announce Significant Milestone in ... - HPCwire