Category Archives: Quantum Computing

Microsoft and KPMG will try out quantum algorithms on real-world problems – GeekWire

Krysta Svore, general manager of Microsoft Quantum, explains how quantum computing hardware works during a Seattle science conference in 2020. (GeekWire Photo / Alan Boyle)

Microsoft and KPMG are getting set to test Azure Quantum's capabilities on the sorts of real-world problems that should give quantum computing an edge over traditional approaches.

Such problems have to do with optimizing systems and networks, such as where best to place cellular phone towers or how to allocate investments to match a client's priorities relating to risks vs. rewards.

"Optimization problems are found in many industries and are often difficult to solve using traditional methods," Krysta Svore, general manager of Microsoft Quantum, explained today in a blog post. Quantum effects, by contrast, can accelerate optimization, and emulating these quantum effects on classical computers has led to the development of quantum-inspired optimization (QIO) algorithms that run on classical hardware.

Such algorithms reflect the quantum perspective, in which information doesn't necessarily take the form of rigid ones and zeroes but can instead reflect a range of values simultaneously during processing. The beauty of QIO algorithms is that they don't need to run on honest-to-goodness quantum processors, which are still in their infancy.
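To make the idea concrete, here is a minimal sketch of one classical technique often grouped under the quantum-inspired umbrella: simulated annealing on a tiny quadratic unconstrained binary optimization (QUBO) problem. The three-variable problem matrix and all parameters are invented for illustration, and Azure Quantum's production solvers are considerably more sophisticated than this.

```python
# Simulated annealing on a toy QUBO: minimize x^T Q x over binary vectors x.
# Illustrative only; not Microsoft's Azure Quantum solver.
import math
import random

Q = [[-2.0,  1.0,  0.0],
     [ 1.0, -2.0,  1.0],
     [ 0.0,  1.0, -3.0]]   # invented symmetric problem matrix

def energy(x):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(n, steps=5000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n)]
    e = energy(x)
    for step in range(steps):
        # Geometric cooling: high temperature explores, low temperature settles.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)   # propose flipping one bit
        x[i] ^= 1
        e_new = energy(x)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            x[i] ^= 1             # reject the move: undo the flip
    return x, e

print(anneal(3))   # e.g. ([1, 0, 1], -5.0) for this toy matrix
```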

The Microsoft-KPMG partnership gives both companies a chance to tweak the algorithms and how they're used to maximize Azure Quantum's QIO capabilities.

"The Azure Quantum platform allows us to explore numerous different solver approaches utilizing the same code, helping to minimize re-work and improve efficiency," said Bent Dalager, global head of KPMG's Quantum Hub. "The shared goal for these initial projects is to build solution blueprints for common industry optimization problems using Azure Quantum, which we can then provide to more clients at scale."

This isn't the first time Microsoft's quantum computing team has experimented with real-world optimization challenges: A couple of years ago, researchers at Microsoft and Ford used QIO algorithms to analyze strategies for smoothing out the Seattle area's traffic snarls. Preliminary studies showed a decrease of more than 70% in congestion and an 8% reduction in average travel time.

Last year, Toyota Tsusho and a Japanese startup called Jij used Azure Quantum to optimize the timing of traffic signals. They found that QIO algorithms could reduce the waiting time for drivers stopped at red lights by about 20%, saving an average of about 5 seconds for each car. And California-based Trimble turned to Azure Quantum to identify the most efficient routes for fleets of vehicles, ensuring that fewer trucks run empty.

The Microsoft-KPMG project will start out focusing on benchmarking solutions for optimizing financial services portfolios and telecommunications operations.

Portfolio optimization has to do with balancing the mix of investments to minimize risk and maximize profit while staying within a given budget. As financial options get more complex, it becomes difficult to assess them using a brute-force analytical approach, but QIO algorithms are well-suited to take on the challenge.
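As a rough illustration of how a portfolio problem is typically cast for a QIO solver, here is a standard mean-variance formulation as a QUBO; this is a textbook sketch, not necessarily the model Microsoft and KPMG are benchmarking.

```latex
% x_i = 1 if asset i is bought. mu_i is expected return, sigma_ij the return
% covariance, c_i the cost of asset i, B the budget; gamma and lambda weight
% risk aversion and budget enforcement. (Illustrative formulation.)
\min_{x \in \{0,1\}^n} \;
  -\sum_{i=1}^{n} \mu_i x_i
  \;+\; \gamma \sum_{i=1}^{n} \sum_{j=1}^{n} \sigma_{ij}\, x_i x_j
  \;+\; \lambda \Bigl( \sum_{i=1}^{n} c_i x_i - B \Bigr)^{2}
```

Expanding the squared budget penalty leaves only linear and pairwise terms in the binary variables, which is exactly the form that QIO solvers, and eventually quantum optimizers, accept.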

Quantum-inspired optimization could also increase the efficiency of voice-over-LTE telecom networks, leading to a better user experience for customers. Down the line, the project could look into optimization strategies for cell tower placement, mobile handover between cell towers, and staff scheduling for call centers.

The teams plan to share results in the coming months, Svore said.


Honeywell Superpositions Itself in the Quantum Computing Industry With New Company Quantinuum – TECHdotMN

The concept of quantum superposition is mind melting. Explained in its most basic form (that is to say, with no math, an area of school which was definitely not my strong suit), quantum superposition means that instead of something being X or Y, it can be X and Y simultaneously (until observed, which is a whole different can of quantum worms). The concept allows a wave to be a particle, a particle to be a wave, and a cat to be alive and dead at the same time.

Quantum superposition is also the driving force behind quantum computing. While a normal computer operates in binary bits of information, ones and zeroes, a quantum computer uses quantum bits or, to put it more adorably, qubits. Qubits are bits of data that, due to the uncertainty of their quantum state, exist as both a one and a zero simultaneously. This allows for much more flexibility in the storage and transfer of data, allowing quantum computers to blaze through extremely complex equations.
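For readers who want to see the math move, here is a tiny numerical sketch of superposition: a single qubit is a two-component complex vector, and measurement probabilities are the squared magnitudes of its amplitudes. This is purely illustrative and has nothing to do with Quantinuum's actual hardware stack.

```python
# One qubit as a 2-component complex state vector (Born rule for measurement).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Equal superposition (|0> + |1>)/sqrt(2): "both one and zero" until observed.
psi = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(psi) ** 2                 # -> [0.5, 0.5]
samples = np.random.choice([0, 1], size=10, p=probs)
print(probs, samples)                    # each observation snaps to a 0 or a 1
```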

Instead of using superconductors, Quantinuum's quantum computers are driven by ion traps, specialized chips designed to line up ions using an RF electromagnetic field. Photo: Quantinuum

What does all this have to do with Minnesota? Well, one of the most powerful quantum computers on Earth just so happens to be superposition-ing its quantum butt off in good old Golden Valley. Honeywell Quantum Solutions and Cambridge Quantum Computing have merged to form Quantinuum. The new company will be headquartered in Colorado and the U.K., but a 40-person team will also be working on quantum computing projects in Minnesota. In an interview with the Star Tribune, Quantinuum President and COO Tony Uttley said the team will continue to grow.

In a release detailing the new company, Quantinuum outlined its plans for a cybersecurity product to be launched this month (December 2021), and an "enterprise software package that applies quantum computing to solve complex scientific problems in pharmaceuticals, materials science, specialty chemicals and agrochemicals."

"I am thrilled to help lead our new company, which will positively change the world through the application of quantum computing. Our scientists continue to work hard to develop the best quantum software and hardware technologies available and I am excited to be able to offer these to customers on an on-going basis," Uttley said in the release. "The next few weeks and months are going to be extremely active for Quantinuum as we increase the pace in deriving unique value from today's quantum computers, especially in cybersecurity. However, in addition to cybersecurity, our products will include solutions for drug discovery and drug delivery, materials science, finance, natural language processing, as well as optimization, pattern recognition, and supply chain and logistics management."

Out of the gate, Honeywell is the primary bankroller behind the venture, with a 54 percent ownership stake and an upfront investment of $300 million. Honeywell will also supply the ion traps that serve as the backbone of the company's quantum computing technology.

Quantinuum also appears to be fast-tracking its path to the public market. In a CNET article, Uttley said he expects the company to be a publicly listed entity in the next 12 months.

While quantum states might be unsure, the financial outlook for the quantum computing sector is not. Constellation Research has estimated the current market to be worth $174 billion, and Honeywell projects the industry to crack $1 trillion eventually. But whatever the fate of the quantum computing market may be, it's cool to know one of the industry's powerhouse devices is chugging away right in Golden Valley.

Minnesota: land of ten thousand lakes, land of superpositioned states.


Where does EU stand in the quantum computing race with China and US? – TechHQ

The race to qubit superiority (the qubit being the basic unit of quantum computing) has always been dominated by the US and China. The competition between the superpowers has been ramping up as the quantum research arena has flourished in recent years, despite quantum computing still being a pretty nascent technology. But while the efforts in both the Far East and the West draw headlines, an often-overlooked region in the quantum conversation has been Europe.

After all, the Europeans have had to catch up with initiatives in the US and China, or risk being left behind in the quantum computing maturity race altogether. With so many research and development breakthroughs emerging, there is a global race underway to be the first to create and conquer the market surrounding this key future tech. The US, for instance, is investing in excess of US$1.2 billion in quantum R&D between 2019 and 2028, and China is building a US$10 billion National Laboratory for Quantum Information Sciences.

To kick-start a continent-wide quantum-driven industry and accelerate market take-up, Europe launched the Quantum Flagship back in 2018, an ambitious €1 billion, 10-year endeavor. According to the European Commission, the Quantum Technologies Flagship is "a long-term research and innovation initiative that aims to put Europe at the forefront of the second quantum revolution."

In May 2021, the German government announced that it would spend billions of euros to support the development of the country's first quantum computer. The aim is to build a competitive quantum computer in just five years, while nurturing a network of companies to develop applications.

Just one month later, it was announced that researchers at the Institute for Experimental Physics of the University of Innsbruck, Austria, had built a prototype for a compact quantum computer. In essence, the project aims to fit quantum computing experiments into the smallest space possible.

"It is European born-and-bred. It is built with European parts and has demonstrated a world-class ability to entangle 24 qubits, a necessary condition for genuine quantum computations," an article by the European Commission reads, adding that it is made for the benefit of European industry and academia.

The quantum computer is available online to interested parties, from individual to corporate users, via AQT Cloud Access. With that, it offers a competitive European alternative to the traditional big tech giants such as Google, IBM, or China's Alibaba. It also represents a great step forward in ensuring Europe's technological sovereignty and reducing dependence on foreign technology, the commission said.

A notable feature of the compact quantum computer is its low power consumption, which stands at 1.5 kilowatts, about the same power draw as a kettle. Indeed, so low is its power consumption that the researchers at the University of Innsbruck are exploring how to power the device using solar panels.

Another decisive factor for the industrial use of quantum computers is the number of available qubits. The Innsbruck physicists were able to run the quantum computer with 24 fully functional qubits, individually controlling and entangling 24 trapped ions with their device, meeting a recent target set by the German government with surprising speed.

The university wants to be able to provide a device with up to 50 individually controllable quantum bits by next year, as per its press release. To recall, in 2019, Google engineers published a paper stating that they had achieved quantum supremacy with a 54-qubit quantum computer.

The following year, a team from the University of Science and Technology of China (USTC) managed to build the Zuchongzhi, capable of surpassing Google's best efforts by a mind-boggling factor of 10 billion. Then again this year, physicists in China claimed that they've come up with two quantum computers that have the sheer computing capability to surpass virtually any other system in the world.

In papers published in the journals Physical Review Letters and Science Bulletin, the physicists named their superconducting machine Zuchongzhi 2. The Zuchongzhi 2 is an upgrade from an earlier machine released in July 2021 and can run a calculation task one million times more complex than Google's Sycamore, according to lead researcher Pan Jianwei. At this point, China seemingly took pole position in the unofficial quantum computing race.

Elsewhere in Europe, France and the Netherlands signed a memorandum of understanding in August this year to intensify synergies for the research and development of quantum technologies, joining the race in building high-performance supercomputers, according to EURACTIV France.

The agreement between both nations would mean more collaboration in research, greater cooperation among large tech companies, investments to develop the ecosystem, the acceleration of existing European initiatives, and the creation of jobs in the field. So it is safe to say that Europe may have had a slow start, but is coming on strong from behind and is definitely not one to be taken lightly.


Pistoia Alliance predicts a focus on the fight against antimicrobial resistance and a surge in quantum computing research for 2022 – Bio-IT World

Boston, US, 22 November 2021: The Pistoia Alliance, a global, not-for-profit alliance that advocates for greater collaboration in life sciences R&D, has today outlined predictions for the life sciences industry in 2022. The predictions come from three experts recently appointed by the Alliance to drive collaboration efforts across its three key themes. Their insights span the urgent fight against antimicrobial resistance, the potential of quantum computing and commercial space travel, and autonomous laboratories. Throughout 2021, digital transformation has continued to accelerate and the pharmaceutical industry has further embraced collaboration, both of which will underpin success in emerging areas in the next 12 months.

Linda Kasim, Empowering the Patient theme lead, Pistoia Alliance: In 2022, the renewed focus on the fight against superbugs and antimicrobial resistance (AMR) will be prioritized. This will be primarily driven by public-private partnerships and by funding from philanthropic organizations, governments and international bodies to incentivize research. The public sector must quickly increase investment into AMR research, or the cost to national economies and public health could be devastating. mRNA technologies will represent a rapid and valuable platform to be further exploited for vaccines against AMR infections.

Digital health platforms will also become more integrated, seeking efficiency through harmonized data generation. The use of Self-Sovereign Identities within healthcare solutions will expand. For these breakthroughs to happen, regulatory authorities must catch up with the pace of research and innovation in health systems in 2022 by updating legal frameworks.

Imran Haq, Emerging Science and Technology theme lead, Pistoia Alliance: Driven by macro geopolitical trends and Big Tech, emerging technologies are being developed increasingly rapidly. Reflecting this, deal-making in the quantum space will continue to grow apace in 2022. As the buzz around the sector increases, will this be the year we finally start to see translation of this buzz into early versions of applications and use cases in the pharma industry? A likely quantum use case could be to improve supply chain efficiency. Big promises have been made during COP26, and large organizations, including pharma companies, must have net-zero strategies. This is also an area we would like to explore with the Pistoia Alliance's Quantum Computing Community of Interest.

Pharma is also going to play an increasingly critical role in space exploration. As plans to launch a commercial space station from companies like Blue Origin accelerate, pharma should be engaged to ensure humans are healthy and can survive in the long term in extreme environments. 2022 is the time to think about how we could be molding and driving forward health in space.

Anca Ciobanu, Improving the Efficiency and Effectiveness of R&D theme lead, Pistoia Alliance: Efficiency in R&D is on an exponential growth path as more pharma and biotech organizations partner with AI and robotics companies, enabling a more automated drug discovery process. In 2022, the major tech players will increase their focus on the life sciences and will play an important role in developing new products and initiatives. The application of new technologies will not only empower scientists to conduct experiments more efficiently, but will also help them make more breakthrough discoveries. As companies continue to invest resources in launching or improving their autonomous labs, researchers will need upskilling in data science to be able to program and interact with the machines.

The Pistoia Alliance has more than 150 member companies including major life science companies, technology and service providers, academic groups, publishers, and patient research groups. Members collaborate as equals on projects that generate value for the worldwide life sciences and healthcare ecosystem. To find out more about the Alliance and its portfolio of projects, click here: https://www.pistoiaalliance.org/category/projects/.

--ENDS

About The Pistoia Alliance:

The Pistoia Alliance is a global, not-for-profit members organization made up of life science companies, technology and service providers, publishers, and academic groups working to lower barriers to innovation in life science and healthcare R&D. It was conceived in 2007 and incorporated in 2009 by representatives of AstraZeneca, GSK, Novartis and Pfizer who met at a conference in Pistoia, Italy. Its projects transform R&D through pre-competitive collaboration. It overcomes common R&D obstacles by identifying the root causes, developing standards and best practices, sharing pre-competitive data and knowledge, and implementing technology pilots. There are currently over 150 member companies; members collaborate on projects that generate significant value for the worldwide life sciences R&D community, using The Pistoia Alliance's proven framework for open innovation.



Creating Dynamic Symmetry in Diamond Crystals To Improve Qubits for Quantum Computing – SciTechDaily

By Matthew Hutson, MIT Department of Nuclear Science and Engineering, November 15, 2021

Instrumentation setup in the Quantum Engineering Group at MIT to study dynamical symmetries with qubits in diamond crystals. Credit: Guoqing Wang/MIT

MIT researchers develop a new way to control and measure energy levels in a diamond crystal; could improve qubits in quantum computers.

Physicists and engineers have long been interested in creating new forms of matter, those not typically found in nature. Such materials might find use someday in, for example, novel computer chips. Beyond applications, they also reveal elusive insights about the fundamental workings of the universe. Recent work at MIT both created and characterized new quantum systems demonstrating dynamical symmetry: particular kinds of behavior that repeat periodically, like a shape folded and reflected through time.

"There are two problems we needed to solve," says Changhao Li, a graduate student in the lab of Paola Cappellaro, a professor of nuclear science and engineering. Li published the work recently in Physical Review Letters, together with Cappellaro and fellow graduate student Guoqing Wang. "The first problem was that we needed to engineer such a system. And second, how do we characterize it? How do we observe this symmetry?"

Concretely, the quantum system consisted of a diamond crystal about a millimeter across. The crystal contains many imperfections caused by a nitrogen atom next to a gap in the lattice, a so-called nitrogen-vacancy center. Just like an electron, each center has a quantum property called a spin, with two discrete energy levels. Because the system is a quantum system, the spins can be found not only in one of the levels, but also in a combination of both energy levels, like Schrödinger's theoretical cat, which can be both alive and dead at the same time.

Dynamical symmetries, which play an essential role in physics, are engineered and characterized by a cutting-edge quantum information processing toolkit. Credit: Courtesy of the researchers

The energy levels of the system are defined by its Hamiltonian, whose periodic time dependence the researchers engineered via microwave control. The system was said to have dynamical symmetry if its Hamiltonian was the same not only after every time period T but also after, for example, every T/2 or T/3, like folding a piece of paper in half or in thirds so that no part sticks out. Georg Engelhardt, a postdoc at the Beijing Computational Science Research Center, who was not involved in this work but whose own theoretical work served as a foundation, likens the symmetry to guitar harmonics, in which a string might vibrate at both 100 Hz and 50 Hz.
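In symbols (writing T for the drive period and paraphrasing the condition rather than reproducing the paper's exact notation), an order-n dynamical symmetry means the periodically driven Hamiltonian already repeats on a fraction of its period:

```latex
H(t + T) = H(t)
\quad\text{and, for a dynamical symmetry of order } n,\quad
H\!\left(t + \tfrac{T}{n}\right) = H(t), \qquad n = 2, 3, \ldots
```

Folding the paper in half or in thirds, in the article's image, corresponds to n = 2 or n = 3.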

To induce and observe such dynamical symmetry, the MIT team first initialized the system using a laser pulse. Then they directed various selected frequencies of microwave radiation at it and let it evolve, allowing it to absorb and emit the energy. "What's amazing is that when you add such driving, it can exhibit some very fancy phenomena," Li says. "It will have some periodic shake." Finally, they shot another laser pulse at it and measured the visible light that it fluoresced, in order to measure its state. The measurement was only a snapshot, so they repeated the experiment many times to piece together a kind of flip book that characterized its behavior across time.

"What is very impressive is that they can show that they have this incredible control over the quantum system," Engelhardt says. "It's quite easy to solve the equation, but realizing this in an experiment is quite difficult."

Critically, the researchers observed that the dynamical symmetry of the Hamiltonian, the harmonics of the system's energy levels, dictated which transitions could occur between one state and another. "And the novelty of this work," Wang says, "is also that we introduce a tool that can be used to characterize any quantum information platform, not just nitrogen-vacancy centers in diamonds. It's broadly applicable." Li notes that their technique is simpler than previous methods, those that require constant laser pulses to drive and measure the system's periodic movement.

One engineering application is in quantum computers, systems that manipulate qubits, bits that can be not only 0 or 1, but a combination of 0 and 1. A diamond's spin can encode one qubit in its two energy levels.

Qubits are delicate: they easily break down into a simple bit, a 1 or a 0. Or the qubit might become the wrong combination of 0 and 1. These tools for measuring dynamical symmetries, Engelhardt says, "can be used as a sanity check that your experiment is tuned correctly, and with a very high precision." He notes the problem of outside perturbations in quantum computers, which he likens to a de-tuned guitar. By tuning the tension of the strings, adjusting the microwave radiation so that the harmonics match some theoretical symmetry requirements, one can be sure that the experiment is perfectly calibrated.

The MIT team already has their sights set on extensions to this work. "The next step is to apply our method to more complex systems and study more interesting physics," Li says. They aim for more than two energy levels: three, or 10, or more. With more energy levels they can represent more qubits. "When you have more qubits, you have more complex symmetries," Li says. "And you can characterize them using our method here."

Reference: "Observation of Symmetry-Protected Selection Rules in Periodically Driven Quantum Systems" by Guoqing Wang, Changhao Li and Paola Cappellaro, 29 September 2021, Physical Review Letters. DOI: 10.1103/PhysRevLett.127.140604

This research was funded, in part, by the National Science Foundation.


Atom Computing: A Quantum Computing Startup That Believes It Can Ultimately Win The Qubit Race – Forbes


Atom Computing describes itself as a company obsessed with building the world's most scalable quantum computers out of optically trapped neutral atoms. The company recently revealed it had spent the past two years secretly building a quantum computer using Strontium atoms as its units of computation.

Headquartered in Berkeley, California, Benjamin Bloom and Jonathan King founded the company in 2018 with $5M in seed funds. Bloom received his PhD in physics from the University of Colorado, while King received a PhD in chemical engineering from the University of California, Berkeley.

Atom Computing received $15M in Series A funding from investors Venrock, Innovation Endeavors, and Prelude Ventures earlier this year. The company also received three grants from the National Science Foundation.

Atom Staff

Rob Hays, a former Intel and Lenovo executive, was recently named CEO of the company. Atom Computing's staff of quantum physicists and design engineers fully complements quantum-related disciplines and applications. This month Atom Computing signaled its continued momentum by adding two quantum veterans to key positions within the company.

Qubit technologies

While traditional computers use magnetic bits to represent a one or a zero for computation, quantum computers use quantum bits, or qubits, to represent a one, a zero, or, simultaneously, any combination in between.

Today's quantum computers use several different technologies for qubits. But regardless of the technology, a common requirement for all quantum computing qubits is that they must be scalable, high quality, and capable of fast quantum interaction with each other.

IBM uses superconducting qubits on its huge fleet of about twenty quantum computers. Although Amazon doesn't yet have a quantum computer, it plans to build one using superconducting hardware. Honeywell and IonQ both use trapped-ion qubits made from the rare earth metal ytterbium. In contrast, PsiQuantum and Xanadu use photons of light.

Atom Computing chose to use a different technology: nuclear-spin qubits made from neutral atoms. Phoenix, the name of Atom's first-generation, gate-based quantum computer platform, uses 100 optically trapped qubits.

Atom Computing's quantum platform

First-Generation Quantum Computer, Phoenix, Berkeley, California

The Phoenix platform uses a specific type of nuclear-spin qubit created from an isotope of Strontium, a naturally occurring element. Strontium is a neutral atom: it has equal numbers of protons and electrons. However, isotopes of Strontium have varying numbers of neutrons, and these differences produce different energy levels in the atom. Atom Computing uses the isotope Strontium-87 and takes advantage of its unique energy levels to create spin qubits.

Qubits need to remain in a quantum state long enough to complete computations. The length of time that a qubit can retain its quantum state is its coherence time. Since Atom Computing's neutral atom qubits are natural rather than manufactured, no adjustments are needed to compensate for differences between qubits. That contributes to their stability and relatively long coherence time, in a range greater than 40 seconds, compared to a millisecond for superconducting systems or a few seconds for ion-trapping systems. Moreover, a neutral atom has little affinity for other atoms, making the qubits less susceptible to noise.

Neutral atom qubits offer many other advantages as well that make them suitable for quantum computing.

How neutral atom quantum processors work


The Phoenix quantum platform uses lasers as proxies for high-precision, wireless control of the Strontium-87 qubits. Atoms are trapped in a vacuum chamber using optical tweezers controlled by lasers at very specific wavelengths, creating an array of highly stable qubits captured in free space.

First, a beam of hot strontium moves the atoms into the vacuum chamber. Next, multiple lasers bombard each atom with photons to slow its momentum to a near-motionless state, causing its temperature to fall to near absolute zero. This process, called laser cooling, eliminates the requirement for cryogenics and makes it easier to scale qubits.

Then, optical tweezers are formed in a glass vacuum chamber, where qubits are assembled and optically trapped in an array. One advantage of neutral atoms is that the processor's array is not limited to any specific shape, and it can be either 2D or 3D. Additional lasers create a quantum interaction between the atoms (called entanglement) in preparation for the actual computation. After initial quantum states are set and circuits are established, the computation is performed.

The heart of Phoenix, showing where Atom Computing's qubits entangle. (First-Generation Quantum Computer, Phoenix, Berkeley, California)

Going forward

Atom Computing is working with several technology partners. It is also running tests with a small number of undisclosed customers. The Series A funding has allowed it to expand its research and begin working on the second generation of its quantum platform. It's a good sign that Rob Hays, CEO, believes Atom Computing will begin generating revenue in mid-2023.

Atom Computing is a young and aggressive company with promising technology. I spoke with Denise Ruffner shortly after she joined Atom. Her remarks seem to reflect the optimism of the entire company:

"I am joining the dream team - a dynamic CEO with experience in computer development and sales, including an incredible Chief Product Officer, as well as a great scientific team. I am amazed at how many corporations have already reached out to us to try our hardware. This is a team to bet on."



Don Kahle: Quantum quandaries are emerging – The Register-Guard

Don Kahle | Register-Guard

The challenges and opportunities ahead as computers become ever more powerful are coming more clearly into view. Count me among those who are not surprised at how well novelist and humorist Douglas Adams anticipated them. Artists and comics often speak the truth before anyone else.

The bandwagon is getting fuller by the day. Former Google CEO Eric Schmidt has partnered with Henry Kissinger to author a book about the challenges. Kissinger is 98 years old. He's strategizing the coming clash with artificial intelligence as if it will be the last war any of us will ever wage.

He might be right.

I've written almost every year about how we are leaving behind the Age of Enlightenment, without any confidence about what will replace it. Kissinger, Schmidt, Elon Musk and others are warning us that AI will sneak up on us if we're not careful, rewriting the rules for civilization without our consent.

I hope the next epoch is organized around empathy, a decidedly human trait that's beyond the ken of calculations. As futurists become realists, it's beginning to look like emergent properties may be the frontier we're entering. It's very like what Adams anticipated in his Hitchhiker's Guide to the Galaxy in 1978.

Leave aside artificial intelligence for a moment. Consider the consequences of quantum computing. IBM announced this week it has built a quantum computer that couldn't be matched with a conventional computer unless that computer was larger than our planet.

Although Adams anticipated that factoid quite accurately, it's not the interesting part. According to IBM CEO Arvind Krishna, this super-duper-computer is not adept at computations in the traditional sense. That would be too easy. This quantum computer won't calculate as much as ruminate.

We've used computers to solve problems but not to wonder how a problem could be solved. Big difference!

To review, Adams's characters asked the most powerful computer to give them the answer to life, the universe, and everything. The answer was 42. Understanding the question was exponentially more complicated, keeping his characters busy for three more volumes.

Kissinger and Schmidt posit that computers soon will give us answers before we understand the questions. Alexander Fleming discovered penicillin before anyone understood how microbes and cells operate inside the human body. We learned what works before we learned why.

With that in mind, mysteries abound. How do starlings execute their mesmerizing murmurations, flying like a three-dimensional marching band producing an amazing halftime show, but without a conductor? How can humans reverse or adapt to global warming? And everything in between.

Finding answers to unimaginably complex problems will be the easy part. Thoroughly understanding the questions being posed will be new. If quantum computing fulfills its promise, we may soon send it searching for the emergent properties behind self-organizing cities, coordinated starling flight patterns, and human consciousness.

Each is beyond the scope of calculations. The results emerge as if by magic. The whole is literally greater than the sum of its (calculated) parts. We may soon be envisioning the most hopeful future for our planet since 1650, and terrifyingly so.

Don Kahle (fridays@dksez.com) writes a column each Friday for The Register-Guard and archives past columns at www.dksez.com.


4 key threats to the new central bank digital currencies – World Economic Forum

With G7 officials recently endorsing principles for central bank digital currencies (CBDC), and over 80 countries launching some form of initiative related to CBDC, it seems their widespread deployment is a matter of time. CBDC is a digital form of central bank money that can be accessible to the general public; essentially, it consists of individuals and firms having access to transaction and savings accounts with their home country's central bank. The central banks of the Bahamas, China and Nigeria have all implemented early CBDC programmes, with more expected in the future. If successful, CBDC could help policy-makers achieve goals around payment efficiency, financial inclusion, banking and payment competitiveness, access to safe central bank money in the era of digital payments, and more.

Yet like any digital payment system, CBDC is vulnerable to cybersecurity attack, account and data breaches and theft, counterfeiting, and even farther-off challenges related to quantum computing. For citizens to be comfortable adopting CBDC, they will need to be confident in its security. Ultimately, it will not be successful if it does not carefully consider and invest in a robust cybersecurity strategy. Decision-makers should look to cybersecurity best practices such as those published by the US National Institute of Standards and Technology (NIST) and the Microsoft STRIDE model. This article, which summarizes key points from the World Economic Forum's new white paper on CBDC Technology Considerations, lays out additional imperative considerations for CBDC cybersecurity.

How can we make sure CBDC is secure for decades to come? We discuss four major dimensions to its cybersecurity below:

CBDC access credentials are needed for accessing and transferring funds. Such credentials could be given in the form of a passphrase that could be easily communicated even on paper, or a hardware token that stores the private keys. Regardless of the form, the threat of theft and credential loss is significant, meaning account funds and data could be compromised.

Theft can be physical or virtual, especially in the case of passphrases. Given the arsenal of modern attackers, techniques such as social engineering, side-channel attacks and malware could be used to extract credentials from a CBDC user's device. Moreover, if passphrases or hardware tokens are lost/damaged due to fire/water or natural calamities, CBDC users should not simply lose all their funds and data. Therefore, the system should have built-in credential recovery mechanisms.

If a CBDC is based on blockchain technology, it might use a multi-signature (multi-sig) wallet where at least two other trusted parties hold credentials to the same wallet (this could be the central bank itself and/or family members or other contacts of the end users). The drawback of multi-sig wallets is that they are less user-friendly, since for any transfer one needs to coordinate with at least one other party. Such security-usability trade-offs already exist today in internet banking, where two-factor authentication (2FA) is extremely common. If CBDC is based on traditional technology, a privileged authority could simply update a database entry with new credentials.
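As a rough sketch of that multi-signature recovery idea, here is a toy threshold-authorization check. The party names and the whole interface are hypothetical; a real wallet would verify cryptographic signatures rather than compare identifiers.

```python
# Toy 2-of-3 multisig authorization (illustrative only, not a real CBDC API).
from dataclasses import dataclass

@dataclass
class MultisigWallet:
    holders: tuple       # e.g. ("user", "central_bank", "family_member")
    threshold: int = 2   # 2-of-3 means losing any single credential is recoverable

    def authorize(self, approvals: set) -> bool:
        # Count only approvals from registered holders; ignore unknown parties.
        return len(approvals & set(self.holders)) >= self.threshold

wallet = MultisigWallet(holders=("user", "central_bank", "family_member"))
print(wallet.authorize({"user", "family_member"}))  # True: threshold met
print(wallet.authorize({"user"}))                   # False: one approval short
```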

Over 80 countries are launching some form of initiative related to CBDC

Image: BIS

One concern is that central bank or government insiders, law enforcement and other agents may have roles that allow privileged actions, such as the freezing or withdrawal of funds in CBDC accounts without the user's consent. These capabilities are in line with today's compliance procedures in regulated payment systems. Though such roles are likely to be a functional requirement of a CBDC, it is possible for them to enable malicious insiders to abuse the system. As with other types of information security, the central bank and any intermediaries involved should have and execute a cybersecurity risk-management plan covering such privileges. Multi-party mechanisms, such as those employed by multi-signature wallets or other protections, could increase the difficulty of such attacks.

If the CBDC operates on blockchain technology, where nodes include non-central bank entities that have powers to validate or invalidate transactions, malicious validator nodes can pose security threats. They could also undermine the central bank's monetary authority and independence by virtue of accepting or rejecting transactions that are contrary to the central bank's intention. Thus, it is generally not recommended for non-central bank nodes to have transaction validation powers unless absolutely necessary.

Depending on the consensus protocol used, non-central bank nodes with privileged power could declare transactions as invalid, essentially blocking them from being accepted by the network and creating a denial-of-service attack for CBDC users and censorship of their transactions.

Collusion by non-central bank nodes could also enable double-spending attacks, a form of counterfeiting where the CBDC is spent multiple times illegitimately. The nodes may also decide to fork the distributed ledger, creating a different track and view of the ledger of transactions that disagrees with the central bank's. CBDC end users could try to spend funds from their wallets in multiple places, also constituting digital counterfeiting. Risk of double-spend is higher if the CBDC in question has offline capability, depending on the technology with which it operates; in this scenario, double-spend transactions could be sent to offline entities without the high-security validation process that would normally occur online.

Imposing limits on spending amounts and transaction frequency while the CBDC user is offline would reduce the impact of such attacks. Further, once a device that is conducting transactions comes back online, compliance software could sync any transactions that occurred during the offline period.
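A sketch of what those offline safeguards might look like in code, with invented limit values and a hypothetical `ledger` interface rather than any real CBDC design:

```python
# Offline spending caps plus reconciliation on reconnect (illustrative only).
class OfflineWallet:
    MAX_OFFLINE_TXNS = 5        # frequency limit while disconnected
    MAX_OFFLINE_TOTAL = 200.0   # total amount spendable without online checks

    def __init__(self):
        self.pending = []       # payments made while offline

    def pay_offline(self, amount: float) -> bool:
        if len(self.pending) >= self.MAX_OFFLINE_TXNS:
            return False        # too many offline transactions
        if sum(self.pending) + amount > self.MAX_OFFLINE_TOTAL:
            return False        # offline spending limit reached
        self.pending.append(amount)
        return True

    def sync(self, ledger) -> None:
        # Back online: submit queued payments for full validation, then reset.
        for amount in self.pending:
            ledger.validate_and_settle(amount)
        self.pending.clear()
```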

Quantum computing will ultimately impact all financial services, as it compromises major data encryption methodologies and cryptographic primitives used for protecting the access, confidentiality and integrity of data stored and transmitted. CBDC is no exception. Therefore, the threat of emerging quantum computers, which could compromise the cryptography employed to secure CBDC accounts, must be taken into account during technology design. For instance, central banks should consider the vulnerability of certain primitives to forthcoming quantum computing. Moreover, quantum computers in the future might be able to break the cryptography in the CBDC system without detection.


Cybersecurity, along with technical resilience and sound technical governance, is the most important element of CBDC technical design. Failure to implement a robust cybersecurity strategy and consider the risks introduced above could compromise citizen data and funds, the success of the CBDC programme, the central bank's reputation and broader opinions of the new currency. Based on past experiences of cybersecurity failures, the bar for security is not only about keeping the bad guys out or minimizing unauthorized account access. It must be comprehensive and consider the full spectrum of risks, ensuring that the system works as it was designed and that its integrity remains intact. Only then will CBDC be successful in achieving its goals.

Written by

Sebastian Banescu, Senior Research Engineer / Security Auditor, Quantstamp

Ben Borodach, Vice-President, Strategy and Operations, Team8

Ashley Lannquist, Project Lead, Blockchain and Distributed Ledger Technology, World Economic Forum

The views expressed in this article are those of the author alone and not the World Economic Forum.


Quantum computing breakthrough may help us learn about the earliest moments of the universe – TechRadar

The latest breakthrough in the field of quantum computing could pave the way for complex simulations that tell us about the earliest moments of the universe and more.

A team of researchers from the University of Waterloo, Canada, claims to have performed the first ever simulation of baryons (a highly complex type of subatomic particle) on a quantum computer.

To achieve this goal, the researchers paired a traditional computer with a quantum machine in the cloud, and developed from scratch a quantum algorithm that was resource-efficient enough to allow the system to shoulder the workload.

Until now, computers have only been able to simulate the composite elements of baryons (which are made up of three quarks), but the research paper shows it's possible to perform detailed quantum simulations with many baryons.

Although the science is complex, the broad significance is this: scientists will be able to simulate aspects of physics completely out of reach for traditional supercomputers.

According to the researchers, the breakthrough represents a landmark step towards overcoming the limitations of classical computing and allowing the massive potential of quantum computers to be realized.

"This is an important step forward: it is the first simulation of baryons on a quantum computer ever," said Christine Muschik, faculty member at the Institute for Quantum Computing (IQC). "Instead of smashing particles in an accelerator, a quantum computer may one day allow us to simulate these interactions that we use to study the origins of the universe and so much more."

More specifically, researchers will be able to simulate complex lattice gauge theories, which describe the physics of reality. So-called non-Abelian gauge theories are said to be particularly attractive candidates for quantum simulation, as they relate to the stability of matter in the universe.

While the most powerful traditional computers are able to simulate simple non-Abelian gauge theories, only a quantum computer (as has now been proven) can perform the complex simulations necessary to unpack the inner workings of the universe.

"What's exciting about these results for us is that the theory can be made so much more complicated," added Jinglei Zhang, another researcher at the IQC. "We can consider simulating matter at higher densities, which is beyond the capability of classical computers."


IBM creates largest ever superconducting quantum computer – New Scientist

IBM claims it has produced a 127-qubit quantum computer. This is over double the size of comparable machines made by Google and the University of Science and Technology of China

By Matthew Sparkes

IBM has created the world's largest superconducting quantum computer, surpassing the size of state-of-the-art machines from Google and from Chinese researchers. Previous devices have demonstrated up to 60 superconducting qubits, or quantum bits, working together to solve problems, but IBM's new Eagle processor more than doubles that by stringing together 127.

Several approaches are being pursued by teams around the world to create a practical quantum computer, including superconductors and entangled photons, and it remains unclear which will become the equivalent of the transistor that powered the classical computing revolution.

In 2019, Google announced that its Sycamore processor, which uses the same superconducting architecture that IBM is working with, had achieved quantum supremacy, the name given to the point at which quantum computers can solve a problem that a classical computer would find impossible. That processor used 54 qubits, but it has since been surpassed by a 56-qubit and then a 60-qubit demonstration with the Zuchongzhi superconducting processor from the University of Science and Technology of China (USTC) in Hefei.

IBM's 127-qubit Eagle processor now takes the top spot as the largest, and therefore theoretically most powerful, superconducting quantum computer to be demonstrated. Each additional qubit represents a significant step forward in ability: unlike classical computers, which rise in power in a linear fashion as they grow, one additional qubit effectively doubles a quantum processor's potential power.
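A quick back-of-the-envelope calculation shows why: brute-force simulation of an n-qubit state on a classical machine needs 2^n complex amplitudes, so memory doubles with every added qubit. (Cleverer classical methods exist for particular circuits; the exponential wall is the point.)

```python
# Memory needed to store a full n-qubit state vector at 16 bytes per amplitude
# (complex128: 8 bytes real + 8 bytes imaginary).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 54, 127):
    print(n, f"{statevector_bytes(n):.3e} bytes")
# 30 qubits  -> ~1.7e+10 bytes (tens of gigabytes: a workstation)
# 54 qubits  -> ~2.9e+17 bytes (hundreds of petabytes)
# 127 qubits -> ~2.7e+39 bytes (far beyond any conceivable classical machine)
```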

Canadian company D-Wave Systems has sold machines for some years that consist of thousands of qubits, but they are widely considered to be very specific machines tailored towards a certain algorithm called quantum annealing rather than fully programmable quantum computers. In recent years, much progress in quantum computing has focused on superconducting qubits, which is one of the main technologies that Google, USTC and IBM are backing.

Bob Sutor at IBM says that breaking the 100-qubit barrier is more psychological than physical, but that it shows the technology can grow. "With Eagle, we're demonstrating that we can scale, that we can start to generate enough qubits to get on a path to have enough computation capacity to do the interesting problems. It's a stepping stone to bigger machines," he says.

However, it is difficult to compare the power of the IBM chip with previous processors. Both Google and USTC used a common test to assess such chips, which was to simulate a quantum circuit and sample random numbers from its output. IBM claims to have created a more programmable and adaptable processor, but has yet to publish an academic paper setting out its performance or abilities.

Peter Leek at the University of Oxford says it is tempting to assess performance entirely on the qubit count, but that there are other metrics that need to be looked at, none of which has yet been released for Eagle. "It's definitely positive, it's good that they're making something with more qubits, but ultimately it only becomes useful when the processor performs really well," he says.

Scott Aaronson at the University of Texas at Austin has similar reservations about judging the importance of the new processor at this stage, saying that more detail is needed. "I hope that information will be forthcoming," he says.

IBM has said that it hopes to demonstrate a 400-qubit processor next year and to break the 1000-qubit barrier the following year with a chip called Condor. At that point, it is expected that a limit on expansion will be reached that requires quantum computers to be created from networks of these processors strung together by fibre-optic links.

