Category Archives: Quantum Computing

Atom Computing Plans To Build A Bigger And Better High-Tech Quantum Computer With Its Latest $60 Million Series B Funding – Forbes

Atom Computing

Atom Computing, a quantum computing company headquartered in Berkeley, California, seems to be on the fast track for funding.

This week Atom announced it had secured a $60M Series B round of financing led by Third Point Ventures. The round also included Prime Movers Lab and insiders Innovation Endeavors, Venrock, and Prelude Ventures.

Atom was founded in 2018 with $5M in seed funds by Benjamin Bloom and Jonathan King. Over two years, the duo used those funds to secretly staff and build a quantum computer with a unique technology. What set Atom's computer apart from other quantum machines was that it was the first quantum computer to use nuclear-spin qubits created from optically trapped neutral atoms.

First-Generation Quantum Computer, Phoenix

In July 2021, Atom Computing received an additional $15M in Series A funding from investors Venrock, Innovation Endeavors, and Prelude Ventures, plus three grants from the National Science Foundation.

According to Rob Hays, Atom Computing's president and CEO, there was no shortage of investment interest. "We've seen a tremendous amount of investor interest in what many are starting to believe is a more promising way to scale quantum computers: neutral atoms," he said in the company's press release. "Our technology advancements and this investment give us the runway to continue our focus on delivering the most scalable and reliable quantum computers."

What's different about its technology

Most of today's quantum computers use one of two types of qubits: superconducting (IBM and Google) or trapped-ion (Quantinuum and IonQ). Amazon doesn't yet have a quantum computer, but it plans to build one using superconducting hardware. In contrast, PsiQuantum and Xanadu use photons of light that act as qubits.

Atom Computing chose to use a different technology: nuclear-spin qubits made from neutral atoms. Phoenix, the name of Atom's first-generation, gate-based quantum computer platform, uses 100 optically trapped qubits.

These qubits are created from an isotope of strontium, a naturally occurring element whose atoms are electrically neutral. Going deeper, neutral atoms have equal numbers of protons and electrons. However, isotopes of strontium have varying numbers of neutrons, and these differences produce different energy levels in the atom that allow spin qubits to be created. Atom Computing uses the isotope strontium-87 and takes advantage of its unique energy levels to create spin qubits.

It is important for qubits to remain in a quantum state long enough to complete running the quantum circuits. The time that a qubit retains its quantum state is called its coherence time. Neutral atom qubits have a longer coherence time than most other qubit technologies.

Lasers, rather than wires, are used for precision control of the strontium-87 qubits. This eliminates wiring, which can create radiation and noise that negatively affect coherence.

There are many other technical reasons for using neutral-atom spin qubits, but they are beyond the scope of this article.

Second generation plans

Artist's rendering of Atom Computing's second-generation quantum computer

With its latest $60M Series B funding, Atom Computing plans to build a larger, second-generation neutral-atom quantum computer. Many additional qubits will give the system increased computational ability. Atom Computing likely has undisclosed customer trials and use cases in progress; we expect new and more significant use cases to be publicly announced once the new quantum system is operational.

Patrick Moorhead, president and chief analyst of Moor Insights and Strategy, said, "Qubit coherence, fidelity, and scalability are essential factors for achieving quantum advantage. Atom Computing has already demonstrated that Phoenix, its first-generation 100+ nuclear-spin qubit quantum processor, has the potential to check all those boxes. With the additional $60M Series B funding, I believe Atom could build a large-qubit, second-generation quantum system that either brings it to the edge of quantum advantage or possibly even achieves it."

Analyst notes:

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, or speaking sponsorships. The company has had or currently has paid business relationships with 88, A10 Networks, Advanced Micro Devices, Amazon, Ambient Scientific, Anuta Networks, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesosphere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak (now Aruba-HPE), SONY Optical Storage, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, Tenstorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zoho, and Zscaler.

Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is a personal investor in technology companies dMY Technology Group Inc. VI and Dreamium Labs.

Continued here:
Atom Computing Plans To Build A Bigger And Better High-Tech Quantum Computer With Its Latest $60 Million Series B Funding - Forbes

Going beyond the exascale | symmetry magazine – Symmetry magazine

After years of speculation, quantum computing is here. Sort of.

Physicists are beginning to consider how quantum computing could provide answers to the deepest questions in the field. But most aren't getting caught up in the hype. Instead, they are taking what for them is a familiar tack: planning for a future that is still decades out, while making room for pivots, turns, and potential breakthroughs along the way.

"When we're working on building a new particle collider, that sort of project can take 40 years," says Hank Lamm, an associate scientist at the US Department of Energy's Fermi National Accelerator Laboratory. "This is on the same timeline. I hope to start seeing quantum computing provide big answers for particle physics before I die. But that doesn't mean there isn't interesting physics to do along the way."

Classical computers have been central to physics research for decades, and simulations that run on classical computers have guided many breakthroughs. Fermilab, for example, has used classical computing to simulate lattice quantum chromodynamics. Lattice QCD is a set of equations that describe the interactions of quarks and gluons via the strong force.

Theorists developed lattice QCD in the 1970s. But applying its equations proved extremely difficult. "Even back in the 1980s, many people said that even if they had an exascale computer [a computer that can perform a billion billion calculations per second], they still couldn't calculate lattice QCD," Lamm says.

But that turned out not to be true.

Within the past 10 to 15 years, researchers have discovered the algorithms needed to make their calculations more manageable, while learning to understand theoretical errors and how to ameliorate them. These advances have allowed them to use lattice simulations, which substitute a finite grid of points in space and time for the continuous vastness of reality.

Lattice simulations have allowed physicists to calculate the mass of the proton, a particle made up of quarks and gluons all interacting via the strong force, and find that the theoretical prediction lines up well with the experimental result. The simulations have also allowed them to accurately predict the temperature at which quarks should detach from one another in a quark-gluon plasma.

The limit of these calculations? Along with being approximate, since they are confined to a hypothetical volume of space, they can compute only certain properties efficiently. Try to look at more than that, and even the biggest high-performance computer cannot handle all of the possibilities.

Enter quantum computers.

Quantum computers are all about possibilities. Classical computers don't have the memory to compute the many possible outcomes of lattice QCD problems, but quantum computers take advantage of quantum mechanics to calculate differently.
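That memory limit is easy to make concrete. Storing the full state of n qubits on a classical machine takes 2^n complex amplitudes, so the requirement doubles with every qubit added. A back-of-the-envelope sketch (assuming 16 bytes per double-precision complex amplitude):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to store a full quantum state: 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 30 qubits fit in about 17 GB of RAM; 50 qubits already demand petabytes
assert statevector_bytes(30) == 2**30 * 16
assert statevector_bytes(50) == 2**50 * 16
```

Exponential growth like this, not raw speed, is why exact classical simulation runs out of road so quickly.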

Quantum computing isn't an easy answer, though. Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms.

Using a classical computer, when you program code, you can look at its state at all times. You can check a classical computer's work before it's done and troubleshoot if things go wrong. But under the laws of quantum mechanics, you cannot observe any intermediate step of a quantum computation without corrupting the computation; you can observe only the final state.

That means you can't store any information in an intermediate state and bring it back later, and you cannot clone information from one set of qubits into another, which makes error correction difficult.
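A toy single-qubit simulator (an illustration only, not any lab's software) makes the observation rule concrete: measuring a superposition collapses it, so a second look can only repeat the first.

```python
import math
import random

def measure(state):
    """Measure a single-qubit statevector [a, b]; collapses the state in place."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    # collapse: the post-measurement state is the observed basis state
    state[0], state[1] = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    return outcome

# an equal superposition, as produced by a Hadamard gate on |0>
s = [1 / math.sqrt(2), 1 / math.sqrt(2)]
first = measure(s)
# the superposition is gone: a second measurement always repeats the first
assert measure(s) == first
```

Looking at `s` mid-computation destroys exactly the amplitudes a quantum algorithm needs, which is the nightmare Lamm describes.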

"It can be a nightmare designing an algorithm for quantum computation," says Lamm, who spends his days trying to figure out how to do quantum simulations for high-energy physics. "Everything has to be redesigned from the ground up. We are right at the beginning of understanding how to do this."

Quantum computers have already proved useful in basic research. Condensed matter physicists, whose research relates to phases of matter, have spent much more time than particle physicists thinking about how quantum computers and simulators can help them. They have used quantum simulators to explore quantum spin liquid states and to observe a previously unobserved phase of matter called a prethermal time crystal.

"The biggest place where quantum simulators will have an impact is in discovery science, in discovering new phenomena like this that exist in nature," says Norman Yao, an assistant professor at the University of California, Berkeley, and co-author on the time crystal paper.

Quantum computers are showing promise in particle physics and astrophysics. Many physics and astrophysics researchers are using quantum computers to simulate toy problems: small, simple versions of much more complicated problems. They have, for example, used quantum computing to test parts of theories of quantum gravity or create proof-of-principle models, like models of the parton showers emitted in particle colliders such as the Large Hadron Collider.

"Physicists are taking on the small problems, ones that they can solve in other ways, to try to understand how quantum computing can have an advantage," says Roni Harnik, a scientist at Fermilab. "Learning from this, they can build a ladder of simulations, through trial and error, to more difficult problems."

But just which approaches will succeed, and which will lead to dead ends, remains to be seen. Estimates of how many qubits will be needed to simulate big enough problems in physics to get breakthroughs range from thousands to (more likely) millions. Many in the field expect this to be possible in the 2030s or 2040s.

"In high-energy physics, problems like these are clearly a regime in which quantum computers will have an advantage," says Ning Bao, associate computational scientist at Brookhaven National Laboratory. "The problem is that quantum computers are still too limited in what they can do."

Some physicists are coming at things from a different perspective: They're looking to physics to better understand quantum computing.

John Preskill is a physics professor at Caltech and an early leader in the field of quantum computing. A few years ago, he and Patrick Hayden, professor of physics at Stanford University, showed that if you entangled two photons and threw one into a black hole, decoding the information that eventually came back out via Hawking radiation would be significantly easier than if you had used non-entangled particles. Physicists Beni Yoshida and Alexei Kitaev then came up with an explicit protocol for such decoding, and Yao went a step further, showing that protocol could also be a powerful tool in characterizing quantum computers.

"We took something that was thought about in terms of high-energy physics and quantum information science, then thought of it as a tool that could be used in quantum computing," Yao says.

That sort of cross-disciplinary thinking will be key to moving the field forward, physicists say.

"Everyone is coming into this field with different expertise," Bao says. "From computing, or physics, or quantum information theory, everyone gets together to bring different perspectives and figure out problems. There are probably many ways of using quantum computing to study physics that we can't predict right now, and it will just be a matter of getting the right two people in a room together."

Read the original:
Going beyond the exascale | symmetry magazine - Symmetry magazine

Arqit Quantum Is Terrestrial For Now, But Will Go To Space – Seeking Alpha

mviamonte/iStock via Getty Images

The last time we covered Arqit Quantum (NASDAQ:ARQQ) was when it was still Centricus Acquisition Corp, the SPAC in which we took a small, speculative position. The thesis remains mostly unchanged. Symmetric keys are secure provided trust can be established between the parties sharing the key, and they are essentially the only tenable defense against quantum computing, which threatens any encryption whose security rests on hard mathematical problems. The commercial value of the product is backed up by early adopters with mission-critical data needs who are signing long-term commitments with Arqit. Arqit will have gone from pre- to post-revenue as of the end of 2021, and we believe that as an end-of-the-world hedge, unlike crypto, which could be dismantled wholesale by quantum computing in moments, Arqit fits the bill perfectly.

The whole space of cybersecurity tends to be wholly underinvested. Not until a data breach occurs do most companies really think about their cybersecurity needs. Indeed, it's a segment that still has plenty of room to grow.

Arqit Quantum

Companies that do understand the importance of cybersecurity are the ones that make up Arqit's already developing pipeline.

Arqit Quantum

The UK government, major Japanese conglomerates, and IoT, telco, and defense companies all make up the current pipeline. Companies like Babcock (OTCPK:BCKIF) and Northrop Grumman (NYSE:NOC) have already signed agreements with Arqit both to use its courier-like model for symmetric keys and to collaborate in developing use cases for it. The companies share a common need to protect data communications for mission-critical uses. In particular, the defense companies are a vote of confidence in the use cases and necessity of symmetric-key courier infrastructure for data communication.

One of Arqit's key selling points is that, instead of the dramatic infrastructure shifts or arms race of encryption algorithms that might otherwise be required to defend against the quantum threat, it pairs the immutable properties of photon transmission with symmetric encryption algorithms of the kind banks and governments have long used, historically distributed by literal couriers. Arqit is an outsourced, courier-like service that for now runs its platform on the cloud and terrestrial data centers, but it will eventually launch satellites carrying small but exceptionally powerful computers able to root keys generated between senders and recipients of data using random numbers and a proprietary protocol. The addressable market is therefore every networked device between which safe transmission of data is desirable, and the system makes symmetric keys, which traditionally traded high levels of interparty trust for security, effectively trustless.
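For readers unfamiliar with the term, a symmetric key is a single secret that both encrypts and decrypts, so security reduces to distributing that key safely, which is the problem Arqit's service addresses. The sketch below is a generic standard-library illustration of that principle, not Arqit's protocol:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the shared key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """With a symmetric key, encryption and decryption are the same XOR operation."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)    # the shared symmetric key: both sides must hold it
nonce = secrets.token_bytes(16)
ct = xor_crypt(key, nonce, b"mission-critical data")
assert xor_crypt(key, nonce, ct) == b"mission-critical data"
```

Anyone without `key` sees only pseudorandom bytes; anyone with it reads everything, which is why key distribution, the service Arqit sells, is the whole game.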

The revenue and EBITDA projections, which are based on these recurring revenue contracts with the customers in the pipeline, amount to the following and imply the following multiples on forward earnings.

The Value Lab

While revenues and profits are only getting started in FY 2022, 2023 is when we start to see more meaningful EBITDA. With an addressable market that includes all interconnected devices where privacy is important, even at the 2025 forecast levels we are of course still at a very nascent stage for this market.

The quantum threat is still a while away. No one is close to a working quantum computer yet, but the point is that one is inevitable, and if you aren't prepared the moment a quantum computer comes online, perhaps in the hands of a hostile government or entity, it will take only an instant for that computer to decrypt your data for whatever purposes its owners have in mind. With the core market of mission-critical use cases already forming a nice revenue base for a recurring model with strong theoretical economics, and further commercialization possible into the broader addressable market, the current valuation, while already acknowledging the uniqueness and timeliness of the Arqit offer, is probably a long way off from where it could be in the next 10 to 15 years, when all networks are threatened by quantum attacks.

See the article here:
Arqit Quantum Is Terrestrial For Now, But Will Go To Space - Seeking Alpha

AI-flavored predictions for 2022 from Baidu The Register – The Register

Baidu Research's AI-centric "Top 10 Tech Trends in 2022" report has outlined the Middle Kingdom megacorp's predictions for technology over the coming year.

Baidu CTO Haifeng Wang describes AI as a "key driving force of innovation and development," thanks to rapidly evolving core technologies, cross-domain connectivity, and expanding applications.

It's no surprise that the list focuses on AI given Baidu's business domain. The Beijing-based company's search engine captures over 70 per cent of the Chinese market while also developing other products, particularly AI research and cloud computing. The research arm takes a deeper look at its associated technologies. Think Google but Chinese.

A similar list was produced by Alibaba's DAMO Academy earlier this month. Alibaba's business features e-commerce, cloud computing, AI, and fintech products among others. DAMO is its research arm.

Although DAMO's list features AI, it does so with less focus than Baidu Research, resulting in both overlapping predictions and completely different takes.

Not mentioned by Baidu Research but included in DAMO's list were the use of silicon photonic chips for high-speed data transmission and as replacements for electronic chips in computing, and soft-bodied robotics, whose flexibility and enhanced perception could potentially change the course of the manufacturing industry.

Three areas Baidu Research covered that Alibaba did not are autonomous driving, inclusivity, and the integration of quantum hardware and software. The pair had similar thoughts when it comes to foundation models, AI's effect on science, and privacy computing. Space exploration, humans merging with robots, and sustainability factors made both companies' lists, albeit with different approaches or outlooks.

Given that Baidu operates an L4 autonomous taxi service in a 60 km² area of Beijing, it's not surprising that the list might refer to robocars as "the next generation of vehicles that will give rise to a revolution in transportation," although the company doesn't give too much detail on what that will look like.

Baidu said AI will become more inclusive, but in a convoluted way: small and medium-sized enterprises will be able to access open-source platforms to improve their businesses, thus supporting more diverse sources of innovation.

Baidu Research thinks quantum computing, already becoming widely implemented, should continue to improve by scaling up qubit counts and by reducing or adapting to noise.

Users will start having access to more back-end options on cloud-native quantum computing platforms, gradually proving commercial potential for the integration of quantum hardware and software.

"Government agencies, academia and the industry will collaborate more closely to build high-quality quantum devices and cultivate quantum technology talent, taking the first steps toward creating a quantum computing industry chain," Baidu Research predicted.

Both the trend lists featured the self-supervised large-scale models trained on big data known as foundation models, and both predicted the world will see smaller ones as research and development shifts from attempts to increase the model size toward practical deployment.

"For example, built upon cross-modal large models, AI will drive the adoption of AIGC (AI Generated Content), which promises to spur creative cognition, improve content diversity and reduce production costs," said Baidu Research.

AI for science and medical uses is already pervasive. Baidu Research reckons it will further fuse data-driven methods with theoretical deduction and play a greater role in furthering fundamental discoveries.

In particular, the research arm expects AI-powered computational biology, spurred on by the success of mRNA vaccine technology, to be on the rise. Precision medicine will see greater uptake as costs decline.

In an environment where security and compliance have become critical, thanks in part to global regulations, Baidu Research said privacy computing is not far off from development in areas like computational biology, financial analysis, and data regulation. Additionally, in the long run, privacy-computing technology may "push encrypted data circulation and computing by default," thereby creating an "infrastructure for user confidence."

Similarly, Alibaba said advanced algorithms for homomorphic encryption, which enables calculations on data without decrypting it, will hit a critical point reducing the need for computing power to support the encryption. The emergence of data trust entities to assist in sharing the sets across organizations was also mentioned.
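A toy example shows what "calculations on data without decrypting it" means. Textbook RSA, with insecure toy parameters and used purely as an illustration rather than as the advanced schemes Alibaba refers to, is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product.

```python
# textbook RSA with toy primes (insecure; for illustration only)
p, q = 61, 53
n = p * q                 # modulus: 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent via modular inverse (Python 3.8+)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 6, 7
# multiply the ciphertexts without ever decrypting them...
product_ct = (enc(a) * enc(b)) % n
# ...and the result decrypts to the product of the plaintexts
assert dec(product_ct) == a * b
```

Fully homomorphic schemes extend this idea to arbitrary computations, which is why reducing their heavy computational cost is the "critical point" Alibaba anticipates.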

Baidu Research also thinks AI is going to take hold in outer space, with sensor-laden, autonomous decision-making spacecraft roaming extraterrestrial planets. Carrying robotic arms, they will not only conduct scientific studies but also mine moons and planets for resources, drawing on lessons learned in construction-machinery automation to excavate continuously 24 hours a day. Oh yeah, and these spacebots are self-repairing as they build space laboratories and analyse big data. Really, they do everything.

Alibaba's space predictions of the future were less imaginative, focusing on the very real demand for increased satellite connectivity and how they can improve through increased interconnectivity.

With social distancing kickstarting human digital presence, and continuing to be a necessity for many for the foreseeable future, Baidu Research said humans will continue to morph into avatars and robots. Of course, AI will play a role as it infiltrates and replicates human senses and XR (extended reality) improves.

As cyborg tech becomes the norm, more platforms will want in on the market. As a result, the world will see more virtual and real mashup experiences, both in consumer situations and professional life, not only reinforcing the digital economy, but also merging with the "real" economy.

Alibaba addressed the concept of human and machine mashups, but kept it simple, with XR glasses being what drives the success of technologies such as cloud-edge computing, network communications, and digital twins.

As for going green, Baidu painted a picture of AI equipment operating in a more sustainable manner. For instance, AI processors with higher computing power and lower energy consumption will continue to be invented, and AI technology can also be applied to tasks like improving energy-efficiency ratios.

Meanwhile, Alibaba's green predictions centre around how AI can help connect the dots of things like renewable energy. For instance, "intelligent scheduling using deep learning techniques could optimise scheduling policies across energy sources like wind, solar, and hydroelectric."

Read more here:
AI-flavored predictions for 2022 from Baidu The Register - The Register

Two Rochester researchers named AAAS fellows : NewsCenter – University of Rochester

January 26, 2022

Two University of Rochester faculty members have been elected fellows of the American Association for the Advancement of Science (AAAS). Nicholas Bigelow, the Lee A. DuBridge Professor of Physics and a professor of optics, and Michael Scott, the Arthur Gould Yates Professor of Engineering and also a professor in and chair of the computer science department, are among 564 members of the association recognized this year for their scientifically or socially distinguished efforts on behalf of the advancement of science or its applications.

Bigelow has helped advance the understanding of quantum physics and quantum optics through his pioneering research on the interactions between light and matter. His lab uses laser light to cool atoms to nearly absolute zero temperatures to better manipulate and study them.

Bigelow's current projects include creating and manipulating Bose-Einstein condensates, a quantum state of matter made from an atomic gas cooled to temperatures close to absolute zero, and investigating the quantum nature of atom-photon interactions. This research has important applications in areas of quantum mechanics such as quantum computing and sensing. He is also director of the NASA-funded Consortium for Ultracold Atoms in Space and the principal investigator of cold atom experiments running aboard the International Space Station.

Bigelow joined the faculty of the University of Rochester in 1992 and served as chair of the Department of Physics and Astronomy from 2008 to 2014.

He has twice received the University's Society of Physics Students Award for Excellence in Undergraduate Teaching (in 1998 and 2006) and has held various positions in University governance and leadership, including serving as chair of the Board on Academic Honesty for the College from 1998 to 2004, chair of the University of Rochester Presidential Search Committee in 2004, cochair of the University's Middle States Accreditation Committee, and chair of the Faculty Senate.

Bigelow is a fellow of the American Physical Society and of Optica (formerly OSA, or the Optical Society of America).

Scott's widely cited research focuses primarily on systems software for parallel and distributed computing, including developing new ways to share data among concurrent activities, to automate its movement and placement, and to protect it from accidental loss or corruption.

He is best known as a cocreator of the MCS mutual exclusion lock and as the author of Programming Language Pragmatics, one of the definitive and most widely used textbooks on programming language design and implementation. Several algorithms from Scotts research group have been incorporated into the standard library of the Java programming language.
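For context, the MCS lock's key idea is that each waiting thread spins on a flag in its own queue node rather than on one shared variable, avoiding memory contention under heavy load. Below is a rough Python sketch of that structure; the hardware atomic swap and compare-and-swap are emulated with an internal mutex, so this illustrates the algorithm's shape, not its performance characteristics.

```python
import threading
import time

class Node:
    """Per-thread queue node; each waiter spins only on its own `locked` flag."""
    __slots__ = ("locked", "next")
    def __init__(self):
        self.locked = False
        self.next = None

class MCSLock:
    def __init__(self):
        self.tail = None
        self._atomic = threading.Lock()  # stand-in for hardware swap/CAS

    def _swap_tail(self, node):
        with self._atomic:
            prev, self.tail = self.tail, node
        return prev

    def _cas_tail(self, expected, new):
        with self._atomic:
            if self.tail is expected:
                self.tail = new
                return True
        return False

    def acquire(self, node):
        node.next = None
        prev = self._swap_tail(node)      # join the tail of the waiters' queue
        if prev is not None:              # someone holds the lock
            node.locked = True
            prev.next = node              # link in behind the predecessor
            while node.locked:            # local spinning: watch our own node
                time.sleep(0)             # yield; real MCS spins on a cache line

    def release(self, node):
        if node.next is None:             # no successor visible yet
            if self._cas_tail(node, None):
                return                    # queue empty: lock is now free
            while node.next is None:      # a successor is mid-linking; wait
                time.sleep(0)
        node.next.locked = False          # hand the lock to the successor

# usage: each thread supplies its own node
lock, total = MCSLock(), [0]
def worker():
    node = Node()
    for _ in range(500):
        lock.acquire(node)
        total[0] += 1                     # critical section
        lock.release(node)
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
assert total[0] == 2000
```

The design point is local spinning: each thread waits on its own node, so releasing the lock touches only the successor's flag instead of invalidating a shared location in every waiter's cache.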

He is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). In 2006, he shared the Edsger W. Dijkstra Prize in Distributed Computing.

Scott, who joined the faculty in 1985, chaired the Department of Computer Science from 1996 to 1999 and was interim chair for six months in 2007 and again in 2017. He received the University's Robert and Pamela Goergen Award for Distinguished Achievement and Artistry in Undergraduate Teaching in 2001, the William H. Riker Award for Graduate Teaching in 2020, and the Lifetime Achievement Award from the Hajim School of Engineering & Applied Sciences in 2018.

He has played an active role in University governance, including serving as cochair of the Faculty Advisory Committee for the presidential search in 2018.

Ultimate vacuum chamber creates nothing

Nicholas Bigelow's lab conducts experiments using a "box of nothing," an ultimate vacuum chamber that allows researchers to interact with and manipulate atoms. But is nothing ever truly possible? How have scientists, philosophers, and mathematicians thought about the concept of nothing throughout history and up to the present?

Tags: Arts and Sciences, award, Department of Computer Science, Department of Physics and Astronomy, Hajim School of Engineering and Applied Sciences, Institute of Optics, Michael Scott, Nicholas Bigelow

Category: Science & Technology

See the original post here:
Two Rochester researchers named AAAS fellows : NewsCenter - University of Rochester

Quantum Computing in Silicon Breaks a Crucial Threshold for the First Time – Singularity Hub

Quantum computers made from the same raw materials as standard computer chips hold obvious promise, but so far they've struggled with high error rates. That seems set to change after new research showed silicon qubits are now accurate enough to run a popular error-correcting code.

The quantum computers that garner all the headlines today tend to be made using superconducting qubits, such as those from Google and IBM, or trapped ions, such as those from IonQ and Honeywell. But despite their impressive feats, they take up entire rooms and have to be painstakingly handcrafted by some of the world's brightest minds.

That's why others are keen to piggyback on the miniaturization and fabrication breakthroughs we've made with conventional computer chips by building quantum processors out of silicon. Research has been going on in this area for years, and it's unsurprisingly the route that Intel is taking in the quantum race. But despite progress, silicon qubits have been plagued by high error rates that have limited their usefulness.

The delicate nature of quantum states means that errors are a problem for all of these technologies, and error-correction schemes will be required for any of them to reach significant scale. But these schemes will only work if the error rates can be kept sufficiently low; essentially, you need to be able to correct errors faster than they appear.

The most promising family of error-correction schemes today are known as surface codes, and they require operations on, or between, qubits to run with a fidelity above 99 percent. That has long eluded silicon qubits, but in the latest issue of Nature, three separate groups report breaking this crucial threshold.
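The intuition behind such a threshold can be seen in the simplest error-correcting code, the three-bit repetition code (a toy relative of a surface code, not the real thing): copying a bit three times and taking a majority vote helps only when the physical error rate p is low enough that the residual logical error rate, roughly 3p², falls below p itself.

```python
import random

def logical_error_rate(p, trials=20000):
    """Monte Carlo estimate: 3-bit repetition code under independent bit flips.

    A logical error occurs when 2 or 3 of the copies flip, defeating the
    majority vote. Analytically this is 3p^2(1-p) + p^3, about 3p^2 for small p.
    """
    fails = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:            # majority corrupted -> uncorrectable
            fails += 1
    return fails / trials

# below threshold, encoding suppresses errors rather than amplifying them
assert logical_error_rate(0.05) < 0.05
```

Surface codes behave analogously but with a much stricter break-even point, which is why pushing every operation above 99 percent fidelity matters so much.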

The first two papers, from researchers at RIKEN in Japan and QuTech, a collaboration between Delft University of Technology and the Netherlands Organization for Applied Scientific Research, use quantum dots for qubits. These are tiny traps made out of semiconductors that house a single electron. Information can be encoded into the qubits by manipulating the electron's spin, a fundamental property of elementary particles.

Both groups' breakthroughs came down primarily to careful engineering of the qubits and control systems. But the QuTech group also used a diagnostic tool developed by researchers at Sandia National Laboratories to debug and fine-tune its system, while the RIKEN team discovered that speeding up its operations boosted fidelity.

A third group, from the University of New South Wales, took a slightly different approach, using phosphorus atoms embedded in a silicon lattice as their qubits. These atoms can hold their quantum state for extremely long times compared to most other qubits, but the tradeoff is that it's hard to get them to interact. The group's solution was to entangle two of these phosphorus atoms with an electron, which enables them to talk to each other.

All three groups achieved fidelities above 99 percent for both single-qubit and two-qubit operations, crossing the error-correction threshold. They even managed to carry out some basic proof-of-principle calculations using their systems. Nonetheless, they are still a long way from making a fault-tolerant quantum processor out of silicon.

Achieving high-fidelity qubit operations is only one of the requirements for effective error correction. The other is having a large number of spare qubits that can be dedicated to that task while the remaining qubits work on whatever problem the processor has been set.
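To give a sense of scale (an illustrative sketch using the standard surface-code layout, not figures from the article): one logical qubit of distance d needs roughly 2d² − 1 physical qubits, d² holding data plus d² − 1 performing the repeated error-check measurements.

```python
# Illustrative sketch (standard surface-code layout, not from the article):
# a distance-d logical qubit uses d*d data qubits plus d*d - 1 ancilla
# (measurement) qubits.
def physical_qubits_per_logical(d: int) -> int:
    return 2 * d * d - 1

for d in (3, 5, 11, 25):
    print(d, physical_qubits_per_logical(d))
```

Even a modest distance-25 code would consume well over a thousand physical qubits per logical qubit, which is why "a large number of spare qubits" is the second half of the error-correction bargain.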

As an accompanying analysis in Nature notes, adding more qubits to these systems is certain to complicate things, and maintaining the same fidelities in larger systems will be tough. Finding ways to connect qubits across large systems will also be a challenge.

However, the promise of being able to build compact quantum computers using the same tried-and-true technology as existing computers suggests these are problems worth trying to solve.

Image Credit: UNSW/Tony Melov

Read this article:
Quantum Computing in Silicon Breaks a Crucial Threshold for the First Time - Singularity Hub

Microsoft Quantum Computing Executive Sees Progress After Lagging Rivals – The Information

For more than 15 years, researchers at Microsoft have been chasing the idea of building a better quantum computer, an ultrapowerful machine that promises to accelerate everything from drug discovery to the development of electric batteries.

Along the way, the company has suffered setbacks as it takes a novel, and risky, approach to developing the technology. Last year, for instance, a group of physicists that included a Microsoft researcher had to retract a paper related to quantum computing, an embarrassment that dampened morale. And even as it made progress with one method of controlling the building blocks of quantum computers, it turned away from its work on another, shutting down a project that had been underway for several years. The move resulted in several hundred people having to find new jobs within Microsoft, said a person with direct knowledge of the change.

Still, Chetan Nayak, a distinguished engineer at the company and one of Microsoft's top quantum computing researchers, told The Information in an interview that the company is making technical progress. A breakthrough just before the holidays gave him more faith in its efforts, he said. While Nayak wouldn't go into much detail, he said Microsoft conducted simulations that made him confident the company's approach wasn't a dead end. He was so enthusiastic about the results that he told his wife he didn't need a Christmas gift.

US Army visits Brazil in search of technological partnership for Security and Defense areas – Dialogo-Americas.com

By Brazilian National Service for Industrial Learning January 24, 2022

In mid-January, representatives from the U.S. Army Combat Capabilities Development Command (DEVCOM) visited Salvador, in Bahia state, Brazil, to discuss the possibility of cooperation in technological innovation projects with the Brazilian National Service for Industrial Learning (SENAI).

DEVCOM Americas Director Harry DuRette and Rosa Santoni, director of the DEVCOM Technology and Innovation Office in Brazil, visited the facilities of SENAI CIMATEC (Integrated Manufacturing and Technology Center, in Portuguese) on January 18. At the headquarters, located in the Piatã neighborhood of Salvador, DuRette and Santoni visited the Defense and Security Center, the Robotics and Quantum Computing Centers, the Aeronautics Research Laboratory, and the Ballistic Testing Laboratory.

DuRette said: "What is most impressive is the size of the facilities, which encompass many competencies. It is a good demonstration of Brazilian science and technology. And the capacity for growth in these areas is what I see as the most promising aspect for the advancement of our partnership."

Quantum computing

This is the second visit by DEVCOM representatives to SENAI CIMATEC. The Brazilian Ministry of Defense and the Brazilian Army took part in the first visit, in 2020, and support the advancing partnership between SENAI CIMATEC and the U.S. Army research center. This second meeting marks a step forward in academic and technological cooperation between the U.S. Army Research Laboratory and SENAI CIMATEC.

Milton Deir, a researcher at SENAI CIMATEC's Defense and Security Center, said that some areas of common interest have already been mapped, among them quantum computing. "They have part of the expertise, we have part of the expertise, and this joining of forces comes at an extremely important moment for us to advance in the development of these technologies on both sides."

The U.S. Army representatives also visited CIMATEC Park, SENAI CIMATEC's technological and industrial complex, which spreads over a 4-million-square-meter area in the Camaçari Industrial Complex. Tarso Nogueira, manager of the SENAI CIMATEC Defense and Security Center, said that six project proposals have already been submitted to DEVCOM for evaluation "so that we can, in the next few months, conduct some of these projects with them. And, who knows, maybe soon we will have an exchange of teams here at SENAI CIMATEC with the laboratories that form the network within the U.S. military."


LG Electronics Joins the IBM Quantum Network – Database Trends and Applications

IBM has announced that LG Electronics has joined the IBM Quantum Network to advance industry applications of quantum computing. As a member of the network, LG Electronics gains access to IBM's quantum computing systems, as well as to IBM's quantum expertise and Qiskit, IBM's open-source quantum information software development kit.

LG Electronics aims to explore industry applications of quantum computing in support of big data, artificial intelligence, connected cars, digital transformation, IoT, and robotics, all of which require processing large amounts of data.

With IBM Quantum, IBM says, LG can leverage quantum computing hardware and software advances and applications as they emerge, in accordance with IBM's quantum roadmap. LG will also use IBM Quantum technology to train its workforce, enabling it to investigate how potential breakthroughs can be applied to its industry.

"We're happy to welcome LG Electronics to a growing quantum computing ecosystem in Korea at an exciting time for the region," saidJay Gambetta, IBM fellow and VP, quantum computing, at IBM. "The relationship between IBM and LG Electronics will permit LG to explore new types of problems associated with emerging technologies and will help strengthen the quantum capabilities in Korea."

At the IBM Quantum Summit in November 2021, IBM unveiled its new 'Eagle' quantum computing processor with 127 qubits, a major step forward in IBM's roadmap to reach Quantum Advantage.

More than 170 clients, including LG Electronics, Fortune 500 companies, start-ups, academic institutions, and research labs, are working with IBM Quantum technology to advance quantum computing and explore practical applications.

For more information about IBM's quantum computing efforts, visit www.ibm.com/quantum-computing.

Hyperion Research Says 2021 Will Be an Exceptional Growth Year for the Global HPC Market – HPCwire

With Fourth-Quarter Figures Still Due from Vendors, 2021 HPC Server Revenue Is Projected to Grow More Than 12% and Push the Overall On-Premises Market Past $31 Billion

ST. PAUL, Minn., January 24, 2022 – Although many vendor-reported figures for the fourth quarter aren't available yet, high performance computing (HPC) market analyst firm Hyperion Research says it's clear 2021 was an exceptionally strong growth year for the worldwide HPC market, according to CEO Earl Joseph.

Based on 2021 growth over the prior-year quarters of 13.3% (first quarter), 23.9% (second quarter), and 18.6% (third quarter), the company projects full-year 2021 revenue for on-premises HPC server systems will grow at least 12% to reach over $15 billion. Growth in servers plus related segments, including software, storage, and technical support services, will lift revenue for the overall on-premises HPC market above $31 billion. This compares with overall market revenue of $27.3 billion in 2020.

Adding revenue for cloud-based HPC usage raises the projected 2021 HPC market total to more than $36 billion. Joseph said Hyperion Research forecasts that by 2025, the on-premises HPC market will pass $38 billion and cloud-based HPC usage will be worth nearly $9 billion, elevating the combined on-premises and cloud-based total to about $47 billion.
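As a quick sanity check, the figures quoted above hang together; the short script below recomputes the growth rates they imply, using only numbers from the article. The ~5.2% compound annual growth rate is a derived figure, not one Hyperion states directly.

```python
# Sanity check of the article's figures (all in $ billions).
on_prem_2020, on_prem_2021 = 27.3, 31.0
on_prem_2025, cloud_2025 = 38.0, 9.0

growth_2021 = on_prem_2021 / on_prem_2020 - 1        # overall on-prem growth, ~13.6%
cagr = (on_prem_2025 / on_prem_2021) ** (1 / 4) - 1  # implied 2021-2025 CAGR, ~5.2%
combined_2025 = on_prem_2025 + cloud_2025            # matches the ~$47B forecast

print(f"{growth_2021:.1%} {cagr:.1%} {combined_2025:.0f}")
```

Note that the 12% figure in the headline applies to HPC servers alone; the overall on-premises market ($27.3B to $31B) implies a somewhat higher ~13.6% growth rate.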

"Exceptional HPC market growth in 2021 was driven by much stronger adoption of HPC in mainstream markets, pre-exascale systems and other high-end supercomputers, along with external storage and cloud use," Joseph said. "We expect our forecast growth to be uneven quarter to quarter, depending heavily on when pre-exascale and exascale supercomputers are accepted for payment by customers, but increasing demand for HPC in established and newer areas, especially artificial intelligence, cloud, quantum and edge computing, promises robust growth in this transformational technology market."

About Hyperion Research Holdings

Hyperion Research helps organizations make effective decisions and seize growth opportunities by providing research and recommendations in both high performance computing and emerging technology areas. For more information, visit http://www.hyperionresearch.com and http://www.hpcuserforum.com.

Source: Hyperion Research
