Category Archives: Quantum Computer
Valuation of quantum computer maker D-Wave slashed by more than half after company struggles to raise financing – The Globe and Mail
D-Wave has fallen short of financial targets and only had a handful of buyers for its expensive, shed-sized computers.
One of Canada's most heavily funded early-stage technology companies, quantum computer developer D-Wave Systems, undertook a costly refinancing this year that wiped out most of the value of some long-time investors, including the U.S. Central Intelligence Agency's venture capital arm, Amazon CEO Jeff Bezos and fund giant Fidelity Investments, according to public filings and three sources.
The US$40-million financing, including US$10-million from new investor NEC Corp., came as part of a capital restructuring that cut D-Wave's valuation to less than US$170-million before the receipt of funds, down from about US$450-million, the sources familiar with the company said. The Globe and Mail is not disclosing their identities because they are not authorized to speak on the matter.
Existing investors who participated, including Montreal-based Public Sector Pension Investment Board (D-Wave's top shareholder, with $100-million-plus invested to date), Business Development Bank of Canada and Goldman Sachs, maintained their relative stakes, limiting the writedown of their holdings. Those that didn't, including the CIA's In-Q-Tel arm, Mr. Bezos and Fidelity, saw their stakes significantly devalued, by upward of 85 per cent in some cases.
The funding was undertaken during a transformational year for Burnaby, B.C.-based D-Wave, the leader in a global race to develop computers whose chips draw their power by harnessing natural properties of subatomic particles to perform complex calculations faster than conventional computers.
Despite raising more than US$300-million from investors to date, D-Wave has fallen short of financial targets and only had a handful of buyers for its expensive, shed-sized computers, which were mainly used by researchers to tinker with cutting-edge technology. D-Wave has generated just more than US$75-million of customer contracts in its 21 years.
Meanwhile, newer quantum computer startups are attracting buzz and financing, while giants including IBM, Intel and Google continue to develop their own quantum computers.
This year, D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer, replacing Vern Brownell, to step up efforts to commercialize its technology. It parted ways with several other top executives, including its chief financial officer and senior vice-president responsible for applications and technology. Long-time board members Steve Jurvetson, a Silicon Valley venture capitalist, and ex-Cisco executive Don Listwin also left.
Mr. Baratz has earned praise from investors for shifting D-Wave's strategy away from selling computers, which listed for US$15-million, in favour of offering internet access to the machines. With the recent launch of its latest processor and updated software and service offerings, D-Wave says it can help companies solve real-world business problems and deliver business value.
"The company has never looked better," said Rick Nathan, managing director with Kensington Capital Partners, a D-Wave investor. "After all this time, it now feels like [D-Wave] has achieved real product-market fit and is scaling with its customers. This is new, and as a long-time investor, it is great to see."
But D-Wave is still not financially self-sustaining. It was on the verge of raising significant funds from a Chinese investor that backed out when Canadian authorities arrested a top executive of China's Huawei Technologies in 2018. Existing investors subsequently injected US$30-million in 2019 to tide D-Wave over, but after it struggled to find new backers other than NEC, they stepped up again at the much-reduced valuation.
D-Wave declined to comment. But evidence of the financial hit surfaced in recent public disclosures by some investors.
Regulatory filings from PenderFund Capital Management's Working Opportunity Fund show D-Wave dropped to its 11th-largest investment as of June 30 from its second-largest before. The fund didn't say why, but separately disclosed it had cut the carrying value of one of its private holdings, likely D-Wave, by $22.6-million after the unidentified company did a significant equity financing "at a lower [valuation] level" than prior financings.
Fidelity's annual report shows three funds that had invested $18.3-million in D-Wave valued their combined stakes at $3.1-million as of June 30, down 85.6 per cent from a year earlier. A Fidelity spokesman declined to comment.
Meanwhile, U.S. fund manager 180 Degree Capital cut the value of its D-Wave stake to US$1.2-million from US$7.7-million on Dec. 31. The fund manager paid US$5.7-million for its stake from 2008 to 2014. In a call with 180 Degree investors on Aug. 11, president Daniel Wolfe said the writedown was directly related to a financing event that repriced the company. 180 Degree further revealed that D-Wave in April consolidated all outstanding shares into one class of preferred stock in a new holding company, DWSI Holdings Inc., and completed a 1-for-5 reverse stock split.
Mr. Wolfe said D-Wave "has not performed, and its value has dropped materially" and that "fundraising has been very difficult. Sales have been difficult." But he added he was "very bullish" on Mr. Baratz's ability "to lead [D-Wave] to the next level."
A team of Australian quantum theorists has shown how to break a bound that had been believed, for 60 years, to fundamentally limit the coherence of lasers.
The coherence of a laser beam can be thought of as the number of photons (particles of light) emitted consecutively into the beam with the same phase (all waving together). It determines how well it can perform a wide variety of precision tasks, such as controlling all the components of a quantum computer.
Now, in a paper published in Nature Physics, the researchers from Griffith University and Macquarie University have shown that new quantum technologies open the possibility of making this coherence vastly larger than was thought possible.
"The conventional wisdom dates back to a famous 1958 paper by American physicists Arthur Schawlow and Charles Townes," said Professor Howard Wiseman, project leader and Director of Griffith's Centre for Quantum Dynamics.
Each of them went on to win a Nobel prize for their laser work.
"They showed theoretically that the coherence of the beam cannot be greater than the square of the number of photons stored in the laser," he said.
"But they made assumptions about how energy is added to the laser and how it is released to form the beam."
"The assumptions made sense at the time, and still apply to most lasers today, but they are not required by quantum mechanics."
"In our paper, we have shown that the true limit imposed by quantum mechanics is that the coherence cannot be greater than the fourth power of the number of photons stored in the laser," said Associate Professor Dominic Berry, from Macquarie University.
"When the stored number of photons is large, as is typically the case, our new upper bound is much bigger than the old one."
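In symbols (a paraphrase of the bounds described above, with notation of our own choosing: $\mathcal{C}$ for the beam coherence and $\mu$ for the mean number of photons stored in the laser):

```latex
% Schawlow--Townes-era bound (1958 assumptions about gain and output coupling):
\mathcal{C}_{\text{old}} = O\!\left(\mu^{2}\right)
% True quantum-mechanical limit reported in the Nature Physics paper:
\mathcal{C}_{\text{new}} = O\!\left(\mu^{4}\right)
```

For a laser storing, say, $\mu \sim 10^{6}$ photons, the new bound exceeds the old one by a factor on the order of $\mu^{2} \sim 10^{12}$, which is why the authors describe the gap as vast.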
But can this new bound on coherence be achieved?
Yes, says Dr Nariman Saadatmand, a researcher in Professor Wiseman's group.
"By numerical simulation we have found a quantum mechanical model for a laser which achieves the theoretical upper bound for coherence, in a beam that is otherwise indistinguishable from that of a conventional laser."
So, when will we see these new super-lasers?
Probably not for a while, says Mr Travis Baker, the PhD student on the project at Griffith University.
"But we do prove that it would be possible to construct our truly quantum-limited laser using superconducting technology. This is the same technology used in the current best quantum computers, and our proposed device may have applications in that field."
"Our work raises many interesting questions, such as whether it could allow more energy-efficient lasers," Professor Wiseman said.
"That would also be a great benefit, so we hope to be able to investigate that in the future."
Amid the economic upheaval caused by Covid-19, technology-driven disruption continues to transform nearly every business at an accelerating pace, from entertainment to shopping to how we work and go to school. Though the crisis may be temporary, many changes in consumer behavior are likely permanent.
Well before the pandemic, however, industries and their supply chains were already being revolutionized by several emerging technologies, including 5G networks, artificial intelligence, and advanced robotics, all of which make possible new products and services that are both better and cheaper than current offerings. That kind of big bang disruption can quickly and repeatedly rewrite the rules of engagement for incumbents and new entrants alike. But is the world changing too fast? And, if so, are governments capable of regulating the pace and trajectory of disruption?
The answers to those questions vary by industry, of course. That's because the innovations driving what many refer to as the Fourth Industrial Revolution are as varied as the enterprises affected. In my recent book, Pivot to the Future, my co-authors and I identified ten transformative technologies with the greatest potential to generate new value for consumers, which is the only measure of progress that really matters. They are: extended reality, cloud computing, 3D printing, advanced human-computer interactions, quantum computing, edge and fog computing, artificial intelligence, the Internet of Things, blockchain, and smart robotics.
Some of these disruptors, such as blockchain, robotics, 3D printing and the Internet of Things, are already in early commercial use. For others, the potential applications may be even more compelling, though the business cases for reaching them are less obvious. Today, for example, only the least risk-averse investors are funding development in virtual reality, edge computing, and new user interface technologies that interpret and respond to brainwaves.
Complicating both investment and adoption of transformative technologies is the fact that the applications with the biggest potential to change the world will almost certainly be built on unanticipated combinations of several novel and mature innovations. Think of the way ride-sharing services require existing GPS services, mobile networks, and devices, or how video conferencing relies on home broadband networks and high-definition displays. Looking at just a few of the most exciting examples of things to come makes clear just how unusual the next generation of disruptive combinations will be, and how widespread their potential impact on business-as-usual.
Unfortunately, not every application of transformational technology is as obviously beneficial to individuals or society as a whole. Every one of the emerging technologies we identified (and plenty of those already in mainstream use) comes with potential negative side effects that may, in some cases, outweigh the benefits. Often, these costs are both hard to predict and difficult to measure.
As disruption accelerates, so too does anxiety about its unintended consequences, feeding what futurist Alvin Toffler first referred to half a century ago as Future Shock. Tech boosters and critics alike are increasingly appealing to governments to intervene, both to promote the most promising innovations and, at the same time, to solve messy social and political conflicts aggravated by the technology revolution.
On the plus side, governments continue to support research and development of emerging technologies, serving as trial users of the most novel applications. The White House, for example, recently committed over $1 billion for continued exploration of leading-edge innovation in artificial intelligence and quantum computing. The Federal Communications Commission has just concluded one of its most successful auctions yet for mobile radio frequencies, clearing bandwidth once considered useless for commercial use but now seen as central to nationwide 5G deployments. Palantir, a data analytics company that works closely with governments to assess terrorism and other complex risks, has just filed for a public offering that values the start-up at over $40 billion.
At the same time, a regulatory backlash against technology continues to gain momentum, with concerns about surveillance, the digital divide, privacy, and disinformation leading lawmakers to consider restricting or even banning some of the most popular applications. And the increasingly strategic importance of continued innovation to global competitiveness and national security has fueled increasingly nasty trade disputes, including some between the U.S., China, and the European Union.
Together with on-going antitrust inquiries into the competitive behavior of leading technology providers, these negative reactions underscore what author Adam Thierer sees as the growing prevalence of "techno-panics": generalized fears about personal autonomy, the fate of democratic government, and perhaps even apocalyptic outcomes from letting some emerging technologies run free.
Disruptive innovation is not a panacea, but nor is it a poison. As technology transforms more industries and becomes the dominant driver of the global economy, it is inevitable both that users will grow more ambivalent and, as a result, that regulators will become more involved. If, as a popular metaphor of the 1990s had it, the digital economy began as a lawless frontier akin to the American West, it's no surprise that as settlements grow socially complex and economically powerful, the law will continue to play catch-up, likely for better and for worse.
But rather than panic, regulators need to step back and balance costs and benefits rationally. That's the only way we'll achieve the exciting promise of today's transformational technologies while still avoiding the dystopias.
Source: A Measured Approach to Regulating Fast-Changing Tech - Harvard Business Review
Let's get subatomic. In philanthropic circles, arcane topics such as theoretical physics and quantum mechanics have a tough time attracting significant funding. Grantseekers can find it challenging to convey to potential donors the importance of subjects that are not only outside the ken of most non-scientists, but which may not seem as pressing as emergencies like global pandemics, poverty or climate change. Even within science funding, public and private, the life sciences dominate.
But the Perimeter Institute, a center for theoretical physics based in Waterloo, Ontario, has been successfully attracting funding through a pioneering public-private funding model. We wrote about Perimeter and its approach last year in the wake of the 20-year-old institute's contribution to developing the world's first image of a black hole.
In short, Perimeter draws a blend of support from government, industry and private funders, and has become a worldwide leader in advancing talent and new discoveries in theoretical physics.
Just last week, Perimeter announced its new Clay Riddell Centre for Quantum Matter, a research hub where scientists will study the subatomic world of quantum mechanics to understand and discover new states of matter: you know, states of matter other than the familiar solid, liquid, gas and plasma that you learned about in high school. (Don't ask us to explain plasma.)
The new center is the culmination of a 10-year, $25 million investment in quantum matter research, made possible by a $10 million founding donation from the Riddell Family Charitable Foundation. Clay Riddell, who died in 2018, was a Canadian entrepreneur and philanthropist. Physicists believe that study of quantum science and matter will eventually lead to useful technologies and abilities that stretch the imagination.
That the theoretical science of today leads to the technologies of tomorrow is a key message in basic science, and especially in funding for basic science, explained Greg Dick, Perimeter's executive director of advancement and senior director of public engagement. Consider the theory of special relativity and curved space: one hundred years after Einstein proposed it, Dick said, special relativity is a necessary element of GPS navigation systems in cars and other settings. The theories of quantum mechanics led in just a few decades to the computer age. And before all that, the theories of magnetism and electricity eventually translated into practically every single thing we use every day.
"When electricity and magnetism were discovered, the problem of the day was air pollution in New York City from the manure that horse hoofs pulverized into dust," said Dick. "But fortunately, people were thinking about esoteric questions of electricity and magnetism, and that changed society."
In other words, society can ill afford to stop funding basic and theoretical science. "The exciting thing is that the time from new theory to useful technology is getting shorter," Dick said. Perhaps in a decade, the study of quantum matter could lead to solutions for next-generation quantum computers, medical diagnostics, transportation, superconductors for energy grids and cryptography for data security and communications.
But just as likely, said Dick, the study of quantum matter will enable the creation of exotic materials and technologies no one currently expects or imagines.
And this brings us to why the coronavirus pandemic, which has demanded so much of the world's attention, is helping science grantseekers connect with funders.
"Obviously, when COVID started, there was a pause [in fundraising], but interestingly, COVID has also moved the relevance and value of foundational science to the forefront of people's minds," said Dick. "Yes, the theoretical physics that we do is nuanced, but COVID has put science on a pedestal. It's actually easier to have that conversation about the value of science."
Whatever their understanding of physics, prospective donors can easily grasp the importance of the basic research that has enabled today's search for treatments and vaccines for COVID-19.
In a related manner, the COVID-19 pandemic has changed the nature of social interactions with potential donors, said Dick. "In the past, we'd host big events and parties, but now, the pivot to digital communication has really opened up new ways to connect with supporters." Those person-to-person video calls can actually enable more personal and deeper conversations, he said.
Perimeter was established in 1999, seeded with $100 million from Mike Lazaridis, founder of the BlackBerry smartphone pioneer Research In Motion. Bringing the public along as enthusiastic partners was always a requirement, said Dick: "Mike's vision right at the beginning was world-class research, for sure, but he also wanted that message of foundational science baked into Perimeter from the very beginning."
As a result, Perimeter also offers classroom-ready educational resources used by teachers around the world, reaching millions of students.
Source: The Importance of Funding Quantum Physics, Even in a Pandemic - Inside Philanthropy
The waste chips of paint you strip off the walls might not be so useless after all.
Image credit: Sandia National Laboratories
For the next generation of computer processors, one persistent challenge for researchers is finding novel ways to make non-volatile memory on an ever-smaller scale. As smaller processors inevitably approach a finite limit on space, and therefore on processing power, quantum computing or new materials that move away from traditional silicon chips are thought to be ways to overcome this barrier.
Now, researchers at Sandia National Laboratories in California and the University of Michigan, publishing in Advanced Materials, have taken a step toward a solution: a new material for processing chips aimed at machine-learning applications that gives these computers more processing power than conventional chips. The specific obstacle the authors wanted to overcome was the limitations of filamentary resistive random access memory (RRAM), in which defects occur within the nanosized filaments. The team instead set out to create filament-free bulk RRAM cells.
The material the authors use, titanium dioxide or TiO2, may sound like a rather mundane inorganic substance to readers unfamiliar with it, but it is in fact a lot more common than most people realize. If you ever watched Bob Ross's wonderful The Joy of Painting, you may be more familiar with TiO2 as titanium white, the name it is given when used as a pigment in paints. In fact, TiO2 is ubiquitous in paints, not just on the landscape artist's palette, but in house paints, varnishes, and other coatings. It is also found in sunscreen and toothpaste.
The point is, TiO2 is cheap and easy to make, which is one of the reasons this new-found application in computer technology is so exciting.
A. Alec Talin of Sandia National Laboratories, lead author of the paper, explained why this cheap, nontoxic substance is ideal for his team's novel processing chip: "It's an oxide, there's already oxygen there. But if you take a few out, you create what are called oxygen vacancies. It turns out that when you create oxygen vacancies, you make this material electrically conductive."
These vacancies can also store electrical data, a key ingredient of computing power. The vacancies are created by heating a computer chip with a titanium dioxide coating to 150 °C; through basic electrochemistry, some of the oxygen in the TiO2 coating can be removed, creating oxygen vacancies.
"When it cools off, it stores any information you program it with," Talin said.
Furthermore, their TiO2-based processor not only offers a new way of processing digital information, it also has the potential to fundamentally alter the way computers operate. Currently, computers work by storing data in one place and processing that same data in another place. In other words, energy is wasted in moving data from one place to another before it can be processed.
"What we've done is make the processing and the storage at the same place," said Yiyang Li of the University of Michigan and first author of the paper. "What's new is that we've been able to do it in a predictable and repeatable manner."
This is particularly important for machine-learning and deep neural network applications, where most of the computing power is needed for processing data rather than moving it.
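The compute-where-the-data-lives idea can be sketched in a few lines (an illustrative NumPy model of a generic analog crossbar, not the authors' TiO2 device; all values are made up): weights stored as conductances multiply an input voltage vector in place, so the weights never travel to a separate processor.

```python
import numpy as np

# Illustrative analog in-memory multiply-accumulate.
# G[i, j]: conductance stored at crossbar cell (i, j) -- the "memory".
# v[j]:    input voltage applied to column j          -- the "data".
# Output currents I = G @ v appear where the weights are stored,
# so there is no separate fetch-compute-writeback cycle.
G = np.array([[1.0, 0.5],
              [0.2, 2.0]])          # stored weights (siemens)
v = np.array([0.3, 0.1])            # applied voltages (volts)
currents = G @ v                    # Kirchhoff current summation (amps)
print(currents)                     # [0.35 0.26]
```

This matrix-vector product is exactly the inner loop of a neural-network layer, which is why in-memory devices are attractive for machine learning.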
Li explained: "If you have autonomous vehicles, making decisions about driving consumes a large amount of energy to process all the inputs. If we can create an alternative material for computer chips, they will be able to process information more efficiently, saving energy and processing a lot more data."
Talin also sees applications in everyday devices that are already ubiquitous. "Think about your cell phone," he said. "If you want to give it a voice command, you need to be connected to a network that transfers the command to a central hub of computers that listen to your voice and then send a signal back telling your phone what to do." With in-memory processing, voice recognition and other functions could instead happen right in your phone.
In an age where digital privacy is important, it may be attractive to consumers to know that sensitive data, such as the sound of their own voice, stays in their phone, rather than being sent to the cloud first, where accountability and control are less clear-cut.
Like many advances in science, the discovery of this technological application of TiO2 is, as Bob Ross would call it, yet another "happy accident," one with real-world, positive applications.
Reference: Yiyang Li et al., Filament-Free Bulk Resistive Memory Enables Deterministic Analogue Switching, Advanced Materials (2020). DOI: 10.1002/adma.202003984
Quotes adapted from the Sandia National Laboratory press release.
The term quantum computing gained momentum in the late 20th century. These systems aim to harness quantum-mechanical effects to become highly efficient. They use quantum bits, or qubits, instead of the simple manipulation of ones and zeros in existing binary-based computers. Qubits can also be in a third state, called superposition, that represents a one and a zero simultaneously. Instead of analyzing a one or a zero sequentially, superposition allows two qubits to represent four scenarios at the same time. So we are at the cusp of a computing revolution where future systems will have capabilities beyond those of today's calculations and algorithms.
Quantum computers also follow the principle of entanglement, which Albert Einstein referred to as "spooky action at a distance." Entanglement refers to the observation that the states of particles from the same quantum system cannot be described independently of each other. Even when they are separated by great distances, they are still part of the same system.
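Entanglement can be illustrated with a tiny state-vector calculation (a classical NumPy sketch of the textbook two-qubit Bell state, not tied to any particular hardware): the state assigns zero probability to the outcomes 01 and 10, so the two qubits' measurement results are perfectly correlated no matter how far apart they are.

```python
import numpy as np

# Two-qubit Bell state (|00> + |11>) / sqrt(2), stored as four
# complex amplitudes ordered |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Born rule: the probability of each measurement outcome is the
# squared magnitude of its amplitude.
probs = np.abs(bell) ** 2
print(probs)   # 50% for 00, 50% for 11, never 01 or 10
```

Because the only possible outcomes are "both zero" or "both one," measuring one qubit immediately tells you the other's result, which is the correlation Einstein found so spooky.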
Several nations, giant tech firms, universities, and startups are currently exploring quantum computing and its range of potential applications. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Google and UCSB have a partnership to develop a 50-qubit computer; the state of such a machine spans 2^50 (roughly 1.1 quadrillion) values, which would take petabyte-scale memory for a modern classical computer to store. A petabyte is the unit above a terabyte and represents 1,024 terabytes. Meanwhile, names like Rigetti Computing, D-Wave Systems, 1Qbit Information Technologies, Inc., Quantum Circuits, Inc., QC Ware, and Zapata Computing, Inc. are emerging as notable players in quantum computing.
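The petabyte-scale claim is easy to check with back-of-envelope arithmetic (assuming the usual dense representation: one complex amplitude per basis state, stored as two double-precision floats):

```python
# Classical cost of storing an n-qubit state vector:
# 2**n complex amplitudes, each 16 bytes (two float64 values).
n = 50
amplitudes = 2 ** n                  # 1,125,899,906,842,624 basis states
bytes_needed = amplitudes * 16
pebibytes = bytes_needed / 2 ** 50   # 1 PiB = 2**50 bytes
print(f"{amplitudes:,} amplitudes -> {pebibytes:.0f} PiB")
```

At 16 PiB for just 50 qubits, and a doubling with every added qubit, brute-force classical simulation hits a wall very quickly.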
The IEEE Standards Association's Quantum Computing Working Group is developing two technical standards for quantum computing. One covers quantum computing definitions and nomenclature, so we can all speak the same language. The other addresses performance metrics and benchmarking to measure quantum computers' performance against classical computers and, ultimately, against each other. New standards will be added over time if required.
The rapid growth in the quantum tech sector over the past five years has been exciting, because quantum computing presents immense potential. For instance, a quantum system can be useful to scientists for conducting virtual experiments and sifting through vast amounts of data. Quantum parallelism allows a large number of computations to be performed simultaneously, while quantum interference combines their results into something meaningful that can be measured according to the laws of quantum mechanics. Chinese scientists are also looking to develop a quantum internet, which would be a more secure communication system in which information is stored and transmitted with advanced cryptography.
Researchers at Case Western Reserve University used quantum algorithms to transform MRI scans for cancer, allowing the scans to be performed three times faster and to improve their quality by 30%. In practice, this can mean patients wont need to be sedated to stay still for the length of an MRI, and physicians could track the success of chemotherapy at the earliest stages of treatment.
The Laboratoire de Photonique Numérique et Nanosciences in France has built a hybrid device that pairs a quantum accelerometer with a classical one and uses a high-pass filter to subtract the classical data from the quantum data. This has the potential to offer a highly precise quantum compass that would eliminate the bias and scale-factor drifts commonly associated with gyroscopic components. Meanwhile, the University of Bristol has developed a quantum solution to growing security threats. Researchers at the University of Virginia School of Medicine are working to uncover the potential quantum computers hold to help understand genetic diseases. Scientists are also using quantum computing in the search for vaccines for COVID-19 and other life-threatening diseases.
In July 2017, in collaboration with commercial photonics tools provider M Squared, QuantIC demonstrated how a quantum gravimeter detects the presence of deeply hidden objects by measuring disturbances in the gravitational field. If such a device becomes practical and portable, the team believes it could become invaluable in an early warning system for predicting seismic events and tsunamis.
Source: What is Quantum Computing, and How does it Help Us? - Analytics Insight
The IEEE Quantum Week (QCE20) is a conference where academics, newcomers, and enthusiasts alike come together to discuss new developments and challenges in the field of quantum computing and engineering. Due to COVID-19 restrictions, this year's conference will be held virtually, starting today and running till October 16.
Throughout the course of the event, QCE20 will host parallel tracks of workshops, tutorials, keynotes, and networking sessions by industry front-runners like Intel, Microsoft, IBM, and Zapata. From the pack, today we'll peek into what Intel has in store for the IEEE Quantum Week. In particular, we'll be previewing Intel's array of new papers on developing commercial-grade quantum systems.
Starting off, Intel will be presenting a paper in which researchers have employed a deep learning framework to simulate and design high-fidelity multi-qubit gates for quantum dot qubit systems. This research is interesting because quantum dot silicon qubits can potentially improve the scalability of quantum computers due to their small size. This paper also indicates that machine learning is a powerful technique in optimizing the design and implementation of quantum gates. A similar insight was used by another team at the University of Melbourne back in March in which the researchers used machine learning to pinpoint the spatial locations of phosphorus atoms in a silicon lattice to design better quantum chips and subsequently reduce errors in computations.
Next up, Intel's second paper proposes an algorithm that optimizes the loading of certain classes of functions, e.g., Gaussian and other probability distributions, which are frequently used for mapping real-world problems to quantum computers. By loading data into a quantum computer faster and increasing throughput, the researchers believe we can save time and better leverage the exponential compute power quantum computers offer in practical applications.
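As a rough illustration of what "loading a function" means (a generic amplitude-encoding sketch with made-up parameters, not Intel's algorithm): the 2**n points of a discretized Gaussian become the amplitudes of an n-qubit state, normalized so that the squared amplitudes, i.e. the measurement probabilities, sum to one.

```python
import numpy as np

n = 3                                 # qubits -> 2**n = 8 grid points
x = np.linspace(-2.0, 2.0, 2 ** n)    # discretization grid (illustrative)
mu, sigma = 0.0, 0.7                  # illustrative Gaussian parameters

# Amplitudes proportional to the square root of a Gaussian density,
# so that measurement probabilities |a_i|^2 follow the Gaussian.
amps = np.exp(-((x - mu) ** 2) / (4 * sigma ** 2))
amps /= np.linalg.norm(amps)          # enforce sum(|a_i|^2) == 1

probs = amps ** 2                     # probability of each basis state
print(probs.round(3))
```

The hard part, which the paper addresses, is preparing such a state on real hardware with as few gates as possible; the classical arithmetic above only defines the target.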
One of the earliest and most useful applications of quantum computers is simulating a quantum system of particles. Consider the scenario where the ground state of a system is to be calculated to study a certain chemical process. Traditionally, this task involves obtaining the lowest eigenvalue of the matrix that represents the system's energy, known as the Hamiltonian. But this deceptively simple task grows exponentially harder for larger systems with many particles. Naturally, researchers have devised quantum algorithms for it. Intel's paper highlights the development and research requirements of running such algorithms on small qubit systems. The firm believes the insight garnered from these findings can have implications for designing future qubit chips while simultaneously making quantum computing more accessible.
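To make the "lowest eigenvalue" task concrete, here is the classical version on a toy, made-up Hamiltonian (a single-qubit example chosen only because its answer is known exactly; real chemistry Hamiltonians are 2**n-dimensional and quickly become intractable, which is the whole motivation for quantum algorithms):

```python
import numpy as np

# Toy Hamiltonian H = Z + X built from Pauli matrices; its exact
# eigenvalues are +sqrt(2) and -sqrt(2), so the ground-state
# energy (the lowest eigenvalue) is -sqrt(2).
Z = np.array([[1.0, 0.0],
              [0.0, -1.0]])
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])
H = Z + X

ground_energy = np.linalg.eigvalsh(H).min()   # eigvalsh: Hermitian solver
print(ground_energy)                          # approximately -1.41421
```

For n particles the matrix dimension is 2**n, so direct diagonalization like this stops being feasible long before chemically interesting sizes are reached.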
While we're still in the NISQ (Noisy Intermediate-Scale Quantum) era of quantum computers, meaning that perfect quantum computers with thousands of qubits running Shor's algorithm are still a thing of the future, firms have already started preparing for a quantum-safe future. One of the foreseeable threats posed by quantum computers is the ease with which they can factor large numbers, and hence break our existing standards of encryption. In this paper, researchers at Intel aim to address this concern by presenting a design for a BIKE (Bit-flipping Key Encapsulation) hardware accelerator that could make today's cryptosystems resilient to quantum attacks. Notably, BIKE is also currently under consideration by the National Institute of Standards and Technology (NIST), so a degree of adoption and standardization might be on the cards in the future.
Addressing the prevalent issues of the NISQ era once again, this paper debuts a novel technique that helps quantum-classical hybrid algorithms run efficiently on small qubit systems. The technique can be handy in this era because most practical uses of quantum computers involve a hybrid setup in which a quantum computer is paired with a classical one. To illustrate, the aforementioned problem of finding the ground state of a quantum system can be solved by a Variational Quantum Eigensolver (VQE), which uses both classical and quantum routines to estimate the lowest eigenvalue of a Hamiltonian. Running such hybrid algorithms efficiently is difficult, but the new method for engineering cost functions outlined in this paper could allow small qubit systems to do so.
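The VQE loop can be sketched entirely classically: a parametrized trial state, a cost function equal to the energy expectation value, and a classical optimizer. The toy below assumes a 2x2 Hamiltonian and a one-parameter ansatz of my own choosing; it shows the hybrid structure, not the paper's actual cost-function construction.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2x2 Hamiltonian (illustrative values only).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    # One-parameter trial state; on real hardware this would be a
    # parametrized quantum circuit.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def cost(params):
    # Energy expectation <psi|H|psi>; on hardware it is estimated from
    # repeated noisy measurements, which is what makes cost functions
    # hard to engineer in practice.
    psi = ansatz(params[0])
    return psi @ H @ psi

# The classical optimizer tunes the parameter to minimize the energy.
result = minimize(cost, x0=[0.1], method="Nelder-Mead")
# result.fun approaches the true ground-state energy (about -1.118).
```

The division of labor is the point: the quantum device only evaluates the cost, while the classical machine drives the optimization.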
Finally, on the penultimate day of the conference, Dr. Anne Matsuura, the Director of Quantum Applications and Architecture at Intel Labs, will deliver a keynote titled "Quantum Computing: A Scalable, Systems Approach." In it, Dr. Matsuura will underscore Intel's strategy of taking a systems-oriented, workload-driven view of quantum computing to commercialize quantum computers in the NISQ era:
"Quantum computing is steadily transitioning from the physics lab into the domain of engineering as we prepare to focus on useful, nearer-term applications for this disruptive technology. Quantum research within Intel Labs is making solid advances in every layer of the quantum computing stack: from spin qubit hardware and cryo-CMOS technologies for qubit control to software and algorithms research that will put us on the path to a scalable quantum architecture for useful commercial applications. Taking this systems-level approach to quantum is critical in order to achieve quantum practicality."
The research works outlined above underscore Intel's efforts to develop useful applications that are ready to run on near-term, smaller qubit quantum machines. They also place the tech giant among the ranks of IBM and Zapata, which are likewise working on the commercialization of quantum computers.
Read the original here:
QCE20: Here's what you can expect from Intel's new quantum computing research this week - Neowin
In a bid to accelerate this country's efforts in quantum computing, 24 Canadian hardware and software companies specializing in the field are launching an association this week to help their work get commercialized.
Called Quantum Industry Canada, the group says it represents Canada's most commercial-ready technologies, covering applications in quantum computing, sensing, communications, and quantum-safe cryptography.
The group includes Burnaby, B.C., manufacturer D-Wave Systems; Vancouver software developer 1Qbit; Toronto's photonic quantum computer maker Xanadu Quantum Technologies; the Canadian division of software maker Zapata Computing; Waterloo, Ont.-based ISARA, which makes quantum-safe solutions; and others.
"The quantum opportunity has been brewing for many years," association co-chair Michele Mosca of the University of Waterloo's Institute for Quantum Computing and the co-founder of two quantum startups said in an interview, explaining why the new group is starting now. "Canada's been a global leader at building up the global opportunity, the science, the workforce, and we didn't want this chance to pass. We've got over 24 innovative companies, and we wanted to work together to make these companies a commercial success globally."
It's also important to get Canada known as a leader in quantum-related products and services, he added. "This will help assure a strong domestic quantum industry as we enter the final stages of quantum readiness."
And while quantum computing is a fundamentally new tool, Mosca said, it's also important for Canadian organizations to start planning for a quantum computing future, even if the real business value isn't obvious yet. "We don't know exactly when you'll get the real business advantage; you want to be ready for when quantum computers can give you an advantage."
Adib Ghubril, research director at Toronto-based Info-Tech Research Group, said in an interview that the creation of such a group is needed. "When you want to foster innovation you want to gain critical mass, a certain number of people working in different disciplines; it will help motivate them, even maybe compete."
Researchers from startups and even giants like Google, Microsoft, Honeywell and IBM have been throwing billions at creating quantum computers. So are countries, especially China, but also Australia, the U.K., Germany and Switzerland. Many big-name firms are touting projects with experimental equipment, or hybrid hardware that does accelerated computations but doesn't meet the standard definition of a quantum computer.
True quantum computers may be a decade off, some suggest. Ghubril thinks we're 15 years from what he calls reliable, effective quantum computing. Still, last December IDC predicted that by 2023, one-quarter of the Fortune Global 500 will gain a competitive advantage from emerging quantum computing solutions.
Briefly, quantum computers apply the theory of quantum mechanics to move beyond traditional computation with bits represented by zeros and ones. In a quantum computer the basic elements are called qubits, and a qubit can be a zero and a one at the same time. With their expected ability to do astonishingly fast computations, quantum computers may be able to help pharmaceutical companies create new drugs and nation-states break encryption protecting government secrets.
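The "zero and one at the same time" idea has a compact mathematical form: a qubit is a two-component state vector, and measurement probabilities come from the squared magnitudes of its entries. A small NumPy illustration (a textbook toy, not tied to any vendor's hardware):

```python
import numpy as np

# Classical bit states written as 2-element vectors.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# A qubit can be an equal superposition of both at once.
qubit = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes: 50/50 here.
probabilities = np.abs(qubit) ** 2
print(probabilities)  # [0.5 0.5]
```

A classical bit is always exactly one of the two basis vectors; the superposition state above is what has no classical counterpart.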
Companies are taking different approaches. D-Wave uses a quantum annealing process to make machines it says are suited to solving real-world computing problems today. Xanadu uses what Mosca calls a more circuit-type computing architecture. "There's certainly the potential that some of the nearer-term technologies will offer businesses advantage, especially as they scale."
"We know the road towards a full-fledged quantum computer is long. But there are amazing milestones in that direction."
Ghubril says Canada is in the leading pack of countries working on quantum computing. "The momentum out of China is enormous," he said, but it looks like the country will focus on using quantum for telecommunications rather than business solutions.
From his point of view, companies are taking two approaches to quantum computers. Some, like D-Wave, are trying to use quantum ideas to optimize solving modelling problems. "The problem is not every problem is an optimization problem," he said. Other companies are trying for the Grand Poobah: the real (quantum) computer. "So the IBMs of the world are going for the gusto. They want the real deal. They want to solve the material chemistry and biosynthesis and so on. They've gone big, but by doing so they've gone slower. You can't do much on the IBM platform. You can learn a lot, but you can't do much. You can do more on a D-Wave, but you can only do one thing."
Ghubril encourages companies to dabble in the emerging technology.
That's Info-Tech's recommendation: "Just learn about it. Join a forum, open an account, try a few things. Nobody is going to gain a (financial) competitive advantage. It's a learning advantage."
See the original post here:
Canadian quantum computing firms partner to spread the technology - IT World Canada
Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade – WFMZ Allentown
DUBLIN, Oct. 12, 2020 /PRNewswire/ -- The "Quantum Networking: A Ten-year Forecast and Opportunity Analysis" report has been added to ResearchAndMarkets.com's offering.
This report presents detailed ten-year forecasts for quantum networking opportunities and deployments over the coming decade.
Today there is increasing talk about the Quantum Internet. This network will have the same geographical breadth of coverage as today's Internet, but where the Internet carries bits, the Quantum Internet will carry qubits, represented by quantum states. The Quantum Internet will provide a powerful platform for communications among quantum computers and other quantum devices. It will also further enable a quantum version of the Internet-of-Things. Finally, quantum networks can be the most secure networks ever built, completely invulnerable if constructed properly.
Already there are sophisticated roadmaps showing how the Quantum Internet will come to be. At present, however, real-world quantum networking consists of three research and commercialization efforts: Quantum Key Distribution (QKD), which adds unbreakable key distribution to public-key encryption; cloud/network access to quantum computers, which is core to the business strategies of leading quantum computer companies; and quantum sensor networks, which promise enhanced navigation and positioning, more sensitive medical imaging modalities, and more. This report provides ten-year forecasts for all three of these sectors.
This report provides a detailed quantitative analysis of where the emerging opportunities can be found today and how they will emerge in the future:
With regard to the scope of the report, the focus is, of course, on quantum networking opportunities of all kinds. It focuses especially, however, on three areas: quantum key distribution (QKD), quantum computer networking/quantum clouds, and quantum sensor networks. The report also includes forecast breakouts by all the end-user segments of this market, including military and intelligence, law enforcement, banking and financial services, and general business applications, as well as niche applications. There are also breakouts by hardware, software and services as appropriate.
In addition, there is also some discussion of the latest research into quantum networking, including the critical work on quantum repeaters. Quantum repeaters allow entanglement between quantum devices over long distances. Most experts predict repeaters will start to prototype in real-world applications in about five years, but this is far from certain.
This report will be essential reading for equipment companies, service providers, telephone companies, data center managers, cybersecurity firms, IT companies and investors of various kinds.
Key Topics Covered:
Executive Summary
E.1 Goals, Scope and Methodology of this Report
E.1.1 A Definition of Quantum Networking
E.2 Quantum Networks Today: QKD, Quantum Clouds and Quantum Networked Sensors
E.2.1 Towards the Quantum Internet: Possible Business Opportunities
E.2.2 Quantum Key Distribution
E.2.3 Quantum Computer Networks/Quantum Clouds
E.2.4 Quantum Sensor Networks
E.3 Summary of Quantum Networking Market by Type of Network
E.4 The Need for Quantum Repeaters to Realize Quantum Networking's Potential
E.5 Plan of this Report
Chapter One: Ten-year Forecast of Quantum Key Distribution
1.1 Opportunities and Drivers for Quantum Key Distribution Networks
1.1.1 QKD vs. PQC
1.1.2 Evolution of QKD
1.1.3 Technology Assessment
1.2 Ten-year Forecasts of QKD Markets
1.2.1 QKD Equipment and Services
1.2.2 A Note on Mobile QKD
1.3 Key Takeaways from this Chapter
Chapter Two: Ten-Year Forecast of Quantum Computing Clouds
2.1 Quantum Computing: State of the Art
2.2 Current State of Quantum Clouds and Networks
2.3 Commercialization of Cloud Access to Quantum Computers
2.4 Ten-Year Forecast for Cloud Access to Quantum Computers
2.4.1 Penetration of Clouds in the Quantum Computing Space
2.4.2 Revenue from Network Equipment for Quantum Computer Networks by End-User Industry
2.4.3 Revenue from Network Equipment Software by End-User Industry
2.5 Key Takeaways from this Chapter
Chapter Three: Ten-Year Forecast of Quantum Sensor Networks
3.1 The Emergence of Networked Sensors
3.1.1 The Demand for Quantum Sensors Seems to be Real
3.2 The Future of Networked Sensors
3.3 Forecasts for Networked Quantum Sensors
3.4 Five Companies that will Shape the Future of the Quantum Sensor Business: Some Speculations
Chapter Four: Towards the Quantum Internet
4.1 A Roadmap for the Quantum Internet
4.1.1 The Quantum Internet in Europe
4.1.2 The Quantum Internet in China
4.1.3 The Quantum Internet in the U.S.
4.2 Evolution of Repeater Technology: Ten-year Forecast
4.3 Evolution of the Quantum Network
4.4 About the Analyst
4.5 Acronyms and Abbreviations Used In this Report
For more information about this report visit https://www.researchandmarkets.com/r/rksyxu
About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.
Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.
Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T. Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716
Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.
For more than 50 years, the annual R&D 100 Awards have recognized the 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by parent company WTWH Media LLC, is available at the R&D World website.
Berkeley Lab's award-winning technologies are described below.
A Tool to Accelerate Electrochemical and Solid-State Innovation
(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)
Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.
Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.
The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.
Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)
Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)
Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.
The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.
A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.
The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.
Solid Lithium Battery Using Hard and Soft Solid Electrolytes
(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)
The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.
In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than that of graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.
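The "10 times higher" figure is consistent with textbook specific capacities for the two anode materials; the numbers below are standard reference values, not taken from the article.

```python
# Theoretical specific capacities of anode materials (textbook values).
graphite_mah_per_g = 372        # graphite fully lithiated as LiC6
lithium_metal_mah_per_g = 3860  # pure lithium metal

ratio = lithium_metal_mah_per_g / graphite_mah_per_g
print(round(ratio, 1))  # 10.4, i.e. roughly 10x graphite
```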
The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.
Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries
High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)
The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.
In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.
PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both much higher than those of commercial inorganic cathodes and known organic cathodes.
The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.
Building Efficiency Targeting Tool for Energy Retrofits (BETTER)
The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.
(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)
The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.
It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
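The seedling equivalence can be roughly cross-checked against the U.S. EPA's commonly used factor of about 0.06 metric tons of CO2 sequestered per tree seedling grown for 10 years; the factor is an approximation on my part, not a figure from the article.

```python
# Rough cross-check of "165.8 MtCO2e ~ 2.7 billion seedlings over 10 years".
co2_per_seedling_t = 0.06  # approx. tCO2 per seedling over 10 years (EPA factor)
seedlings = 2.7e9

total_mt = seedlings * co2_per_seedling_t / 1e6  # convert tons to megatons
print(round(total_mt))  # 162 Mt, close to the stated 165.8 Mt
```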
The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.
AmanziATS: Modeling Environmental Systems Across Scales
Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)
Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.
"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."
The code is open-source and capable of running on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed by researchers from Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab, including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.
Institute for the Design of Advanced Energy Systems (IDAES)
The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.
IDAES Project Team (Credit: Berkeley Lab)
By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nation's energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.
Led by National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.
# # #
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
View original post here:
Berkeley Lab Technologies Honored With 7 R&D 100 Awards - Lawrence Berkeley National Laboratory