Category Archives: Quantum Computing
IonQ and University of Maryland Researchers Demonstrate Fault-Tolerant Error Correction, Critical for Unlocking the Full Potential of Quantum…
COLLEGE PARK, Md.--(BUSINESS WIRE)--Researchers from The University of Maryland and IonQ, Inc. (IonQ) (NYSE: IONQ), a leader in trapped-ion quantum computing, on Monday published results in the journal Nature that show a significant breakthrough in error correction technology for quantum computers. In collaboration with scientists from Duke University and the Georgia Institute of Technology, this work demonstrates for the first time how quantum computers can overcome quantum computing errors, a key technical obstacle to large-scale use cases like financial market prediction or drug discovery.
Quantum computers suffer from errors when qubits encounter environmental interference. Quantum error correction works by combining multiple qubits together to form a logical qubit that more securely stores quantum information. But storing information by itself is not enough; quantum algorithms also need to access and manipulate the information. To interact with information in a logical qubit without creating more errors, the logical qubit needs to be fault-tolerant.
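The intuition behind combining qubits for redundancy has a simple classical analogue. The sketch below (not from the Nature paper, and a deliberate simplification: real quantum error correction must also handle phase errors, and it measures error syndromes without reading out the encoded data) shows how a three-copy repetition code with majority voting turns a per-bit error rate p into a logical error rate of roughly 3p², a big improvement when p is small:

```python
import random

def encode(bit):
    """Encode one logical bit redundantly across three 'physical' bits."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each bit independently with probability p (a bit-flip channel)."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: correct as long as at most one copy flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
p = 0.05          # per-bit error probability
trials = 100_000

raw_errors = sum(apply_noise([1], p)[0] != 1 for _ in range(trials))
enc_errors = sum(decode(apply_noise(encode(1), p)) != 1 for _ in range(trials))

print(raw_errors / trials)  # close to p = 0.05
print(enc_errors / trials)  # close to 3*p**2 = 0.0075, far lower
```

The fault-tolerance result described above goes a step further: it ensures that the very operations used to encode, manipulate, and check the logical qubit cannot themselves spread a single component failure into an uncorrectable error.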
The study, completed at the University of Maryland, peer-reviewed, and published in the journal Nature, demonstrates how trapped-ion systems like IonQ's can soon deploy fault-tolerant logical qubits to overcome the problem of error correction at scale. By successfully creating the first fault-tolerant logical qubit, a qubit that is resilient to a failure in any one component, the team has laid the foundation for quantum computers that are both reliable and large enough for practical uses such as risk modeling or shipping route optimization. The team demonstrated that this could be achieved with minimal overhead, requiring only nine physical qubits to encode one logical qubit. This will allow IonQ to apply error correction only when needed, in the amount needed, while minimizing qubit cost.
"This is about significantly reducing the overhead in computational power that is typically required for error correction in quantum computers," said Peter Chapman, President and CEO of IonQ. "If a computer spends all its time and power correcting errors, that's not a useful computer. What this paper shows is how the trapped ion approach used in IonQ systems can leapfrog others to fault tolerance by taking small, unreliable parts and turning them into a very reliable device. Competitors are likely to need orders of magnitude more qubits to achieve similar error correction results."
Behind today's study are recently graduated UMD PhD students and current IonQ quantum engineers Laird Egan and Daiwei Zhu, IonQ co-founder Chris Monroe, as well as IonQ technical advisor and Duke Professor Ken Brown. Coauthors of the paper include: UMD and Joint Quantum Institute (JQI) research scientist Marko Cetina; postdoctoral researcher Crystal Noel; graduate students Andrew Risinger and Debopriyo Biswas; Duke University graduate student Dripto M. Debroy and postdoctoral researcher Michael Newman; and Georgia Institute of Technology graduate student Muyuan Li.
The news follows on the heels of other significant technological developments from IonQ. The company recently demonstrated the industry's first Reconfigurable Multicore Quantum Architecture (RMQA) technology, which can dynamically configure 4 chains of 16 ions into quantum computing cores. The company also recently debuted patent-pending evaporated glass traps: technology that lays the foundation for continual improvements to IonQ's hardware and supports a significant increase in the number of ions that can be trapped in IonQ's quantum computers. Furthermore, it recently became the first quantum computer company whose systems are available for use via all major cloud providers. Last week, IonQ also became the first publicly-traded, pure-play quantum computing company.
About IonQ
IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's next-generation quantum computer is the world's most powerful trapped-ion quantum computer, and IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.
About the University of Maryland
The University of Maryland, College Park is the state's flagship university and one of the nation's preeminent public research universities. A global leader in research, entrepreneurship and innovation, the university is home to more than 40,000 students, 10,000 faculty and staff, and 297 academic programs. As one of the nation's top producers of Fulbright scholars, its faculty includes two Nobel laureates, three Pulitzer Prize winners and 58 members of the national academies. The institution has a $2.2 billion operating budget and secures more than $1 billion annually in research funding together with the University of Maryland, Baltimore. For more information about the University of Maryland, College Park, visit http://www.umd.edu.
Forward-Looking Statements
This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions are intended to identify forward-looking statements. These statements include those related to the Company's ability to further develop and advance its quantum computers and achieve scale; and the ability of competitors to achieve similar error correction results. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: market adoption of quantum computing solutions and the Company's products, services and solutions; the ability of the Company to protect its intellectual property; changes in the competitive industries in which the Company operates; changes in laws and regulations affecting the Company's business; the Company's ability to implement its business plans, forecasts and other expectations, and identify and realize additional partnerships and opportunities; and the risk of downturns in the market and the technology industry including, but not limited to, as a result of the COVID-19 pandemic. The foregoing list of factors is not exhaustive.
You should carefully consider the foregoing factors and the other risks and uncertainties described in the Risk Factors section of the registration statement on Form S-4 and other documents filed by the Company from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and the Company assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. The Company does not give any assurance that it will achieve its expectations.
Quantum computing startups pull in millions as VCs rush to get ahead of the game – The Register
Venture capital firms are pouring billions into quantum computing companies, hedging bets that the technology will pay off big time some day.
Rigetti, which makes quantum hardware, announced a $1.5bn merger with Supernova Partners Acquisition Company II, a finance house focusing on strategic acquisitions. Rigetti, which was valued at $1.04bn before the deal, will now be publicly traded.
Before Rigetti's deal, quantum computer hardware and software companies raked in close to $1.02bn from venture capital investments this year, according to numbers provided to The Register by financial research firm PitchBook. That was a significant increase from $684m invested by VC firms in 2020, and $188m in 2019.
Prior to the Rigetti transaction, the biggest deal was a $450m investment in PsiQuantum, which was valued at $3.15bn, in a round led by venture capital firm BlackRock on July 27.
Quantum computers process information differently than classical computers. They encode information in qubits, which can represent 1, 0, or a superposition of both, so a register of qubits can carry exponentially more information than the same number of classical bits. Simply put, these computers can evaluate many possibilities simultaneously, while classical computers evaluate them sequentially.
Theoretically, that makes quantum computers significantly more powerful, and enables applications like drug discovery, which are limited by the constraints of classical computers.
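The superposition idea above can be made concrete with a few lines of linear algebra. This is an illustrative state-vector sketch, not any vendor's API: a qubit is a length-2 complex vector of amplitudes, and describing n qubits takes 2**n amplitudes, which is where the exponential capacity comes from:

```python
import numpy as np

# A qubit state is a length-2 complex vector; |amplitude|^2 gives
# the probability of measuring 0 or 1.
zero = np.array([1.0, 0.0])

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = H @ zero
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]: equal chance of measuring 0 or 1

# Two qubits live in a 4-dimensional space (outcomes 00, 01, 10, 11);
# n qubits need 2**n amplitudes to describe.
two_qubits = np.kron(state, state)
print(np.abs(two_qubits) ** 2)  # four outcomes, each with probability 0.25
```

Note that a measurement yields only one outcome, so the art of quantum algorithm design lies in arranging interference so that useful answers come out with high probability.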
Rigetti and PsiQuantum are startups in a growing field of quantum computer makers that includes heavyweights IBM and Google, which are building superconducting quantum systems based on transmon qubits. D-Wave offers a quantum-annealing system based on flux qubits to solve problems of limited size, but this week said it was building a new superconducting system to solve larger problems.
Quantum computers show promise but are still immature, with open questions around stability, said Linley Gwennap, president of Linley Group, in a research note last month.
"Solving the error-rate problem will require substantially new approaches. If researchers can meet that challenge, quantum processors will provide an excellent complement to classical processors," Gwennap wrote.
If quantum ever works, there could be a huge market, hence the VC interest, but the technology is years away from significant revenue, Gwennap told The Register.
Deals by SPACs (special purpose acquisition companies) like Supernova Partners tend to be highly speculative, but the venture firm's due diligence on Rigetti focused more on the possible rewards if quantum computers live up to their hype.
Rigetti's quantum technology is scalable, practical and manufacturable, said Supernova's chief financial officer Michael Clifton, in a press conference this week related to the deal.
"Quantum is expected to be as important as mobile and cloud have been over the last two decades," Clifton said, adding, "we were focused on large addressable markets, differentiated technologies and excellent management teams."
Rigetti's quantum computer is modular and scalable, with qubit systems linked through faster interconnects. The company's introductory system in 2018 had 8 qubits, and the company plans to scale up to an 80-qubit multi-chip system with high-density I/O and 3D signalling. The company's roadmap includes a 1,000-qubit system in 2024 that is "error mitigating," and a 4,000-qubit system in 2026 with full error correction features.
Rigetti designs and makes its quantum computer chips in its own fabrication plant, which helps accelerate the delivery of chips. Amazon offers access to Rigetti's quantum hardware through AWS.
IT leaders in non-tech companies are taking quantum computing seriously, IDC said in May.
A survey by the analyst house in April revealed companies would allocate more than 19 per cent of their annual IT budgets to quantum computing in 2023, up from 7 per cent in 2021. Investments would be aimed at quantum algorithms and systems available through the cloud to boost AI and cybersecurity.
IBM and Raytheon Technologies collaborate on AI, cryptography and quantum technologies – Scientific Computing World
IBM and Raytheon Technologies have announced a collaboration to jointly develop advanced artificial intelligence (AI), cryptographic and quantum solutions for the aerospace, defence and intelligence industries, including the federal government, as part of a strategic collaboration agreement.
Artificial intelligence and quantum technologies give aerospace and government customers the ability to design systems more quickly, better secure their communications networks and improve decision-making processes. By combining IBM's breakthrough commercial research with Raytheon Technologies' own research, plus aerospace and defence expertise, the companies will be able to crack once-unsolvable challenges.
Dario Gil, senior vice president, IBM, and director of research, comments: "The rapid advancement of quantum computing and its exponential capabilities has spawned one of the greatest technological races in recent history, one that demands unprecedented agility and speed. Our new collaboration with Raytheon Technologies will be a catalyst in advancing these state-of-the-art technologies, combining their expertise in aerospace, defence and intelligence with IBM's next-generation technologies to make discovery faster, and the scope of that discovery larger than ever."
In addition to artificial intelligence and quantum, the companies will jointly research and develop advanced cryptographic technologies that lie at the heart of some of the toughest problems faced by the aerospace industry and government agencies.
Mark Russell, Raytheon Technologies' chief technology officer, added: "Take something as fundamental as encrypted communications. As computing and quantum technologies advance, existing cybersecurity and cryptography methods are at risk of becoming vulnerable. IBM and Raytheon Technologies will now be able to collaboratively help customers maintain secure communications and defend their networks better than previously possible."
The companies are building a technical collaboration team to quickly insert IBM's commercial technologies into active aerospace, defence and intelligence programs. The same team will also identify promising technologies for jointly developing long-term system solutions by investing research dollars and talent.
Digital Wealth Management Fees to Increase Threefold to $12.6 Billion By 2026 – Yahoo Finance
JP Morgan Identified as Leading Champion in Wealth Management Market
LONDON, October 12, 2021--(BUSINESS WIRE)--Total fees generated from digital assets under management will reach $12.6 billion by 2025, a more than threefold increase on the 2020 figure of $4 billion, according to fintech and payments research specialists Kaleido Intelligence.
According to Kaleido's new report, Digital Wealth Management Strategies & Forecasts 2021, US digital wealth managers currently dominate the space (the five largest players all cater exclusively for US clients) and, with AUM (assets under management) exceeding $700 billion in 2020, account for nearly 85% of all digitally managed assets. This dominance is expected to continue for the foreseeable future: even by 2025 the US is expected to account for around 71% of robo-managed funds, with over $2.1 trillion under management.
VIEW THE FULL REPORT HERE
ACCESS OUR FREE MARKET SIZING INFOGRAPHIC HERE
JP Morgan Heads Digital Wealth Readiness Rankings
The report assessed the digital readiness of the 15 leading wealth management companies by AUM, scoring the players against indicators including the breadth of their digital portfolios, the scale of their fintech acquisitions and investment and the activities in key areas including artificial intelligence, blockchain and quantum computing.
Four companies were identified as "Champion" companies, which have embraced the need for digital transformation, have committed to significant R&D in innovative technologies and have already deployed them at the backend or in customer-facing products, and have reached out to third-party partners across the ecosystem. JP Morgan headed the rankings, followed by Fidelity Investments, with Blackrock and Vanguard in joint third place.
Changing Audience, Changing Priorities
The report emphasised the need for wealth managers to understand, and respond to, the fact that not only will millennials seek to interact with them increasingly via digital channels, but that their expectations and demands from funds will differ significantly from their predecessors.
According to report author Dr Windsor Holden, Principal Research Consultant at Kaleido Intelligence, "There is a far greater awareness of environmental issues and a related desire to invest in "green" or socially responsible funds, as well as in technology funds. Similarly, there has been a paradigm shift in goals, needs and lifestyles, which will need to be recognised by the wealth managers so that they can present appropriate investment options."
Investment Soaring into Wealth Management Fintechs
Kaleido estimates that over the past decade, approximately $4.6 billion has been invested by VC firms into robo-advisory fintech platforms, of which $1.8 billion has been invested since June 2020. One company, the Canadian startup Wealthsimple, accounts for 19% of all VC investment in that time, including $610 million earlier this year.
VIEW THE FULL PRESS RELEASE & INFOGRAPHIC ON OUR WEBSITE
About Kaleido Intelligence
Kaleido Intelligence is a specialist fintech and payments consulting and market research firm with a proven track record of delivering fintech and payments research at the highest level. Research is led by expert analysts, each with significant experience delivering fintech research and insights that matter.
View source version on businesswire.com: https://www.businesswire.com/news/home/20211012005470/en/
Contacts
Contact our Press Team for interviews and research access: Jon King, Chief Commercial Officer, fintech@kaleidointelligence.com, +44 (0) 2039839843
How science and diplomacy inform each other – SWI swissinfo.ch – swissinfo.ch
The potential of quantum computing is one of the focuses of a summit in Geneva that aims to improve the dialogue between diplomats and the scientific community to safeguard our collective welfare. Two researchers explain the rewards and risks of quantum computing.
Dorian Burkhalter
The scientists, diplomats, captains of industry and investors gathering in Geneva for the first-ever summit of the Science and Diplomacy Anticipator (GESDA) will, among other lofty goals, discuss how policymakers should prepare for quantum computing, provide governance for it, and ensure that it is accessible to all. But what are quantum computers, and what will they be able to do?
Quantum computers perform calculations by exploiting the properties of quantum mechanics, which describes the behaviour of atoms and particles at a subatomic scale, for example, how electrons interact with each other. As quantum computers operate on the same set of rules as molecules do, they are, for instance, much better suited to simulate them than classical computers are.
Today, quantum computers are small and unreliable. They are not yet able to solve problems classical computers cannot.
"There is still some uncertainty, but I don't see any reason to not be able to develop such a quantum computer, although it's a huge engineering challenge," says Nicolas Gisin, professor emeritus at the University of Geneva and at the Schaffhausen Institute of Technology, and an expert in quantum technologies.
Quantum computers could help solve some of the world's most pressing problems. They could accelerate the discovery of materials for longer-lasting batteries, better solar panels, and new medical treatments. They could also break current encryption methods, meaning that information secure today may become at risk tomorrow.
For private companies, winning the race to develop reliable and powerful quantum computers means reaping large economic rewards. For countries, it means gaining a significant national security advantage.
Gisin says quantum computers capable of simulating new molecules could be 5-10 years away, while more powerful quantum computers that can break encryption could become a reality in 10-20 years.
The pace at which these technologies develop will depend on the level of investments made. Large technology firms such as IBM, Microsoft, and Google are all developing quantum computers, while the US, China, and Europe are investing heavily in quantum technologies.
"Anticipating the arrival of these technologies is important, because you play through different scenarios, and some you may like, some you may not like," says Heike Riel, IBM Fellow at IBM Research in Zurich. "Then you can also think of what type of regulations you may need, or what type of research you need to foster."
The Swiss government is a supporter of the GESDA foundation, which organised its first summit in Geneva from October 7-9. The conference brings together scientists, diplomats, and other stakeholders to discuss future scientific developments and to anticipate their impact on society.
To work well, scientists need favourable frameworks. "There is definitely a back and forth between science and diplomacy, and science and politics, because diplomacy can also advance science," Riel says.
Politicians and diplomats are responsible for creating opportunities for researchers to collaborate across borders. Initiatives and funding aimed at addressing specific technical problems influence the direction of research efforts.
"The fact that Switzerland is outside of the European research framework is an absurdity for everyone because this is just going to harm both Switzerland and Europe," Gisin says. "It would be really important that Europe and Switzerland understand that we will both benefit if we talk together more and collaborate more."
Since July 2021, Switzerland has had limited access to Horizon Europe, the European Union's flagship funding programme for research and innovation, due to a breakdown in negotiations on regulating bilateral relations.
Many of our problems today, such as climate change or the Covid-19 pandemic, are global in nature. Getting governments across the world to agree to work together on solutions is not easy, but researchers can help.
"The research community likes to work together globally, and this collaboration has helped historically to overcome certain barriers," Riel says, emphasising the importance of communication in this regard.
Researchers working together on a global scale during the pandemic led to vaccines being developed at a record-breaking speed. During the Cold War, Soviet scientists remained involved in projects at the European Organization for Nuclear Research (CERN) in Geneva, which allowed for some communication to take place.
"In science, we have a common ground and it's kind of universal; the scientists in the United States, Canada, Australia, Europe and China, they all work on the same problems, they all try to solve the same technical issues," Riel says.
Scientists also have an important role to play in informing and sharing facts with both policymakers and the public, even if politicians cannot rely solely on scientific evidence when making decisions. The challenges of communicating fact-based evidence have been laid bare during the pandemic.
"I think it's very important that we also inform the society of what we are doing, so that it's not a mystery that scares people," Riel says.
Ultimately, to successfully address global challenges, scientists, diplomats and politicians will have to work together.
"It's really a cooperation between the global collaboration of the scientists and the global collaboration of the diplomats to solve the problems together," Riel says.
Is Neuromorphic Computing The Answer For Autonomous Driving And Personal Robotics? – Forbes
Intel's Loihi 2 is the company's second-generation neuromorphic computing chip, a technology that's designed to function like a digital representation of a biological brain, complete with neurons and synapses.
If you follow the latest trends in the tech industry, you probably know that there's been a fair amount of debate about what the next big thing is going to be. Odds-on favorite for many has been augmented reality (AR) glasses, while others point to fully autonomous cars, and a few are clinging to the potential of 5G. With the surprise debut of Amazon's Astro a few weeks back, personal robotic devices and digital companions have also thrown their hat into the ring.
However, while there has been little agreement on exactly what the next thing is, there seems to be little disagreement that whatever it turns out to be, it will be somehow powered, enabled, or enhanced by artificial intelligence (AI). Indeed, the fact that AI and machine learning (ML) are our future seems to be a foregone conclusion.
Yet, if we do an honest assessment of where some of these technologies actually stand on a functionality basis versus initial expectations, it's fair to argue that the results have been disappointing on many levels. In fact, if we extend that thought process out to what AI/ML were supposed to do for us overall, then we start to come to a similarly disappointing conclusion.
To be clear, we've seen some incredible advancements in many areas that AI has powered. Advanced analytics, neural network training, and other related fields (where large chunks of data are used to find patterns, learn rules, and then apply them) have been huge beneficiaries of existing AI approaches.
At the same time, if we look at an application like autonomous driving, it seems increasingly clear that just pushing more and more data into algorithms that crank out ever refined, yet still flawed, ML models isn't really working. We're still years away from true Level 5 autonomy, and, given the number of accidents and even deaths that efforts like Tesla's AutoPilot have led to, it's probably time to consider another approach.
Similarly, though we are still at the dawn of the personal robotics age, it's easy to imagine how the conceptual similarities between autonomous cars and robots will lead to conceptually similar problems in this new field. The problem, ultimately, is that there is simply no way to feed every potential scenario into an AI training model and create a predetermined answer on how to react for any given situation. Randomness and unexpected surprises are simply too strong an influence.
What's needed is a type of computing that can really think and learn on its own and then adapt its learning to those unexpected scenarios. As crazy and potentially controversial as that may sound, that's essentially what researchers in the field of neuromorphic computing are attempting to do. The basic idea is to replicate the structure and function of the most adaptable computing/thinking device we know of, the human brain, in digital form. Following the principles of basic biology, neuromorphic chips attempt to re-create a series of connected neurons using digital synapses that send electrical pulses between them, much as biological brains do.
It's an area of academic research that's been around for a few decades now, but only recently has it started to make real progress and gain more attention. In fact, buried in the wave of tech industry announcements that have been made over the last few weeks was news that Intel had released the second generation of its neuromorphic chip, named Loihi 2, along with a new open-source software framework for it that they've dubbed Lava.
To put realistic expectations around all of this, Loihi 2 is not going to be made commercially available (it's termed a research chip), and the latest version offers 1 million neurons, a far cry from the approximately 100 billion found in a human brain. Still, it's an extremely impressive, ambitious project that offers 10x the performance and 15x the density of its 2018-era predecessor (it's built on the company's new Intel 4 chip manufacturing process technology), along with improved energy efficiency. In addition, it also provides better (and easier) means of interconnecting its unique architecture with other more traditional chips.
Intel clearly learned a great deal from the first Loihi, and one of the biggest realizations was that software development for this radically new architecture is extremely hard. As a result, another essential part of the company's news was the debut of Lava, an open-source software framework and set of tools that can be used to write applications for Loihi. The company is also offering tools that can simulate its operation on traditional CPUs and GPUs so that developers can create code without having access to the chips.
What's particularly fascinating about how neuromorphic chips operate is that, despite the fact they function in a dramatically different fashion from both traditional CPU computing and parallel GPU-like computing models, they can be used to achieve some of the same goals. In other words, neuromorphic chips like Loihi 2 can provide the desired outcomes that traditional AI is shooting for, but in a significantly faster, more energy-efficient, and less data-intensive way. Through a series of event-based spikes that occur asynchronously and trigger digital neurons to respond in various ways, much as a human brain operates (vs. the synchronous, structured processing in CPUs and GPUs), a neuromorphic chip can essentially learn things on the fly. As a result, it's ideally suited for devices that must react to new stimuli in real time.
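The event-driven behavior described above can be illustrated with the classic leaky integrate-and-fire neuron model, a standard textbook abstraction of a spiking neuron (this sketch and its parameter values are illustrative, not Loihi 2's actual neuron model): the membrane potential decays between events, jumps when an input spike arrives, and emits an output spike (then resets) when it crosses a threshold.

```python
import numpy as np

def lif_neuron(spike_times, steps=100, dt=1.0, tau=20.0,
               weight=0.6, threshold=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward zero, jumps on each input spike, and fires (then resets)
    when it reaches the threshold. Returns the output spike times."""
    v = 0.0
    out = []
    inputs = set(spike_times)
    for t in range(steps):
        v *= np.exp(-dt / tau)   # leak: exponential decay between events
        if t in inputs:
            v += weight          # event-based input, no clocked polling
        if v >= threshold:
            out.append(t)        # emit an output spike
            v = 0.0              # reset after firing
    return out

# Spikes arriving close together sum before they leak away and drive
# the neuron to fire; widely spaced spikes decay and never reach threshold.
print(lif_neuron([5, 6, 10, 11]))  # -> [6, 11]
print(lif_neuron([5, 40, 75]))     # -> []
```

Because computation only happens when spikes arrive, large networks of such neurons spend most of their time idle, which is the source of the energy-efficiency claims for neuromorphic hardware.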
These capabilities are why these chips are so appealing to those designing and building robots and robotic-like systems, which autonomous driving cars essentially are. Bottom line is that it could take commercially available neuromorphic chips to power the kind of autonomous cars and personal robots of our science fiction-inspired dreams.
Of course, neuromorphic computing isn't the only new approach to advancing the world of technology. There's also a great deal of work being done in the more widely discussed world of quantum computing. Like quantum computing, the inner workings of neuromorphic computing are extraordinarily complex and, for now, primarily seen as research projects for corporate R&D labs and academic research. Unlike quantum, however, neuromorphic computing doesn't require the extreme physical challenges (temperatures near absolute zero) and power requirements that quantum currently does. In fact, one of the many appealing aspects of neuromorphic architectures is that they're designed to be extremely low power, making them suitable for a variety of mobile or other battery-powered applications (like autonomous cars and robots).
Despite recent advancements, it's important to remember that commercial application of neuromorphic chips is still several years away. However, it's hard not to get excited and intrigued by a technology that has the potential to make AI-powered devices truly intelligent, instead of simply very well-trained. The distinction may seem subtle, but ultimately, it's that kind of new smarts that we'll likely need in order to make some of the next big things really happen in a way that we can all appreciate and imagine.
Disclosure: TECHnalysis Research is a tech industry market research and consulting firm and, like all companies in that field, works with many technology vendors as clients, some of whom may be listed in this article.
View original post here:
Is Neuromorphic Computing The Answer For Autonomous Driving And Personal Robotics? - Forbes
IonQ is set to make its public trading debut. Here’s a look at the quantum computing company’s 2021 highlights – Technical.ly DC
This week, College Park, Maryland-based quantum computing company IonQ is officially going public.
Following a merger with dMY Technology Group Inc. III, which is a special purpose acquisition company based in Las Vegas, the firm will begin trading on the New York Stock Exchange on Friday, Oct. 1. The merger was officially approved on Tuesday by dMY III stockholders.
The company will be trading under the symbol IONQ, and CEO Peter Chapman said it is expected to raise $635 million, with an additional $132 million in outstanding warrants. Of this, $350 million will be raised through private investment in public equity (PIPE) funding from investors including Fidelity Management & Research Company, Silver Lake, Breakthrough Energy Ventures, MSD Partners, Hyundai and Kia.
Founded in 2015 by University of Maryland College Park professor Dr. Chris Monroe and Duke University professor Dr. Jungsang Kim, IonQ specializes in trapped ion quantum computing. Drawing on two decades of research, the company is working to create more powerful computers than those currently available, and apply the technology to solving foundational problems in new ways.
IonQ first announced plans to go public earlier this year, estimating that the company would be valued at $2 billion when the deal closed. Chapman told Technical.ly that the IPO will make IonQ more competitive in talent recruiting and help it to reach the manufacturing stage with its products, particularly in quantum networking.
"This was not actually a liquidity event for us," Chapman said. "Most people, when they get to an IPO, they're thinking about how can they cash out there. But there isn't anyone actually cashing out. We just thought of this as a means to an end on how to raise money."
Going forward, Chapman said the company expects to double its 90-person team, which is spread across offices in College Park, Seattle and Boston.
Since it announced the IPO in March, 2021 has been a banner year for IonQ. It has landed partnerships that will help to further explore real-world applications of quantum computing with GE Research, the Fidelity Center for Applied Technology, Goldman Sachs and QC Ware, Google, Accenture and SoftBank. It is teaming with the University of Maryland on a new lab in College Park.
When it comes to tech advances, the company launched what it says is the industry's first reconfigurable multicore quantum architecture, as well as designed and launched a chipset known as Evaporated Glass Traps. This year also brought its second research credit program cohort, which offers free credits to academics building novel quantum algorithms. (Want to know more about quantum's rise out of the lab? Check out our explainer here.)
"[Going public] will lift all the boats in quantum computing in this sense that we can show that it can be done in quantum now, and that's probably good for the entire industry," Chapman said.
Nir Minerbi, CEO and cofounder of Classiq, a fellow quantum company, agrees, although he thinks there's still more work to be done in the industry.
"Organizations understand that the ability to extract true business value from quantum computing grows as more qubits with higher quality are available," said Minerbi in a statement. "IonQ's funding is good news for the industry and their quantum roadmap is encouraging as well."
As the company moves into the new year, Chapman said IonQ will be expanding into the drug discovery, materials science and battery industries. But, he noted, the possibilities with quantum computing offer plenty of new, yet-to-be-discovered options, as well.
"Every day at the company is fun. You have a customer that's doing something that has never been done before," Chapman said. "It's a pretty exciting place to be."
Read the original here:
IonQ is set to make its public trading debut. Here's a look at the quantum computing company's 2021 highlights - Technical.ly DC
Connecting the Dots Between Material Properties and Superconducting Qubit Performance – SciTechDaily
Scientists performed transmission electron microscopy and x-ray photoelectron spectroscopy (XPS) at Brookhaven Lab's Center for Functional Nanomaterials and National Synchrotron Light Source II to characterize the properties of niobium thin films made into superconducting qubit devices at Princeton University. A transmission electron microscope image of one of these films is shown in the background; overlaid on this image are XPS spectra (colored lines representing the relative concentrations of niobium metal and various niobium oxides as a function of film depth) and an illustration of a qubit device. Through these and other microscopy and spectroscopy studies, the team identified atomic-scale structural and surface chemistry defects that may be causing loss of quantum information, a hurdle to enabling practical quantum computers. Credit: Brookhaven National Laboratory
Brookhaven Lab and Princeton scientists team up to identify sources of loss of quantum information at the atomic scale.
Engineers and materials scientists studying superconducting quantum information bits (qubits), a leading quantum computing material platform based on the frictionless flow of paired electrons, have collected clues hinting at the microscopic sources of qubit information loss. This loss is one of the major obstacles in realizing quantum computers capable of stringing together millions of qubits to run demanding computations. Such large-scale, fault-tolerant systems could simulate complicated molecules for drug development, accelerate the discovery of new materials for clean energy, and perform other tasks that would be impossible or take an impractical amount of time (millions of years) for today's most powerful supercomputers.
An understanding of the nature of atomic-scale defects that contribute to qubit information loss is still largely lacking. The team helped bridge this gap between material properties and qubit performance by using state-of-the-art characterization capabilities at the Center for Functional Nanomaterials (CFN) and National Synchrotron Light Source II (NSLS-II), both U.S. Department of Energy (DOE) Office of Science User Facilities at Brookhaven National Laboratory. Their results pinpointed structural and surface chemistry defects in superconducting niobium qubits that may be causing loss.
Anjali Premkumar
"Superconducting qubits are a promising quantum computing platform because we can engineer their properties and make them using the same tools used to make regular computers," said Anjali Premkumar, a fourth-year graduate student in the Houck Lab at Princeton University and first author on the Communications Materials paper describing the research. "However, they have shorter coherence times than other platforms."
In other words, they cant hold onto information very long before they lose it. Though coherence times have recently improved from microseconds to milliseconds for single qubits, these times significantly decrease when multiple qubits are strung together.
"Qubit coherence is limited by the quality of the superconductors and the oxides that will inevitably grow on them as the metal comes into contact with oxygen in the air," continued Premkumar. "But, as qubit engineers, we haven't characterized our materials in great depth. Here, for the first time, we collaborated with materials experts who can carefully look at the structure and chemistry of our materials with sophisticated tools."
This collaboration was a prequel to the Co-design Center for Quantum Advantage (C2QA), one of five National Quantum Information Science Centers established in 2020 in support of the National Quantum Initiative. Led by Brookhaven Lab, C2QA brings together hardware and software engineers, physicists, materials scientists, theorists, and other experts across national labs, universities, and industry to resolve performance issues with quantum hardware and software. Through materials, devices, and software co-design efforts, the C2QA team seeks to understand and ultimately control material properties to extend coherence times, design devices to generate more robust qubits, optimize algorithms to target specific scientific applications, and develop error-correction solutions.
Andrew Houck
In this study, the team fabricated thin films of niobium metal through three different sputtering techniques. In sputtering, energetic particles are fired at a target containing the desired material; atoms are ejected from the target material and land on a nearby substrate. Members of the Houck Lab performed standard (direct current) sputtering, while Angstrom Engineering applied a new form of sputtering they specialize in (high-power impulse magnetron sputtering, or HiPIMS), where the target is struck with short bursts of high-voltage energy. Angstrom carried out two variations of HiPIMS: normal and with an optimized power and target-substrate geometry.
Back at Princeton, Premkumar made transmon qubit devices from the three sputtered films and placed them in a dilution refrigerator. Inside this refrigerator, temperatures can plunge to near absolute zero (minus 459.67 degrees Fahrenheit), turning qubits superconducting. In these devices, superconducting pairs of electrons tunnel across an insulating barrier of aluminum oxide (Josephson junction) sandwiched between superconducting aluminum layers, which are coupled to capacitor pads of niobium on sapphire. The qubit state changes as the electron pairs go from one side of the barrier to the other. Transmon qubits, co-invented by Houck Lab principal investigator and C2QA Director Andrew Houck, are a leading kind of superconducting qubit because they are highly insensitive to fluctuations in electric and magnetic fields in the surrounding environment; such fluctuations can cause qubit information loss.
For each of the three device types, Premkumar measured the energy relaxation time, a quantity related to the robustness of the qubit state.
"The energy relaxation time corresponds to how long the qubit stays in the first excited state and encodes information before it decays to the ground state and loses its information," explained Ignace Jarrige, formerly a physicist at NSLS-II and now a quantum research scientist at Amazon, who led the Brookhaven team for this study.
Ignace Jarrige
Each device had different relaxation times. To understand these differences, the team performed microscopy and spectroscopy at the CFN and NSLS-II.
NSLS-II beamline scientists determined the oxidation states of niobium through x-ray photoemission spectroscopy with soft x-rays at the In situ and Operando Soft X-ray Spectroscopy (IOS) beamline and hard x-rays at the Spectroscopy Soft and Tender (SST-2) beamline. Through these spectroscopy studies, they identified various suboxides located between the metal and the surface oxide layer and containing a smaller amount of oxygen relative to niobium.
"We needed the high energy resolution at NSLS-II to distinguish the five different oxidation states of niobium, and both hard and soft x-rays, which have different energy levels, to profile these states as a function of depth," explained Jarrige. "Photoelectrons generated by soft x-rays only escape from the first few nanometers of the surface, while those generated by hard x-rays can escape from deeper in the films."
At the NSLS-II Soft Inelastic X-ray Scattering (SIX) beamline, the team identified spots with missing oxygen atoms through resonant inelastic x-ray scattering (RIXS). Such oxygen vacancies are defects, which can absorb energy from qubits.
At the CFN, the team visualized film morphology using transmission electron microscopy and atomic force microscopy, and characterized the local chemical makeup near the film surface through electron energy-loss spectroscopy.
Sooyeon Hwang
"The microscope images showed grains, pieces of individual crystals with atoms arranged in the same orientation, sized larger or smaller depending on the sputtering technique," explained coauthor Sooyeon Hwang, a staff scientist in the CFN Electron Microscopy Group. "The smaller the grains, the more grain boundaries, or interfaces where different crystal orientations meet." According to the electron energy-loss spectra, one film had oxides not just on the surface but also within the film itself, with oxygen diffused into the grain boundaries.
Their experimental findings at the CFN and NSLS-II revealed correlations between qubit relaxation times and the number and width of grain boundaries and concentration of suboxides near the surface.
"Grain boundaries are defects that can dissipate energy, so having too many of them can affect electron transport and thus the ability of qubits to perform computations," said Premkumar. "Oxide quality is another potentially important parameter. Suboxides are bad because electrons are not happily paired together."
Going forward, the team will continue their partnership to understand qubit coherence through C2QA. One research direction is to explore whether relaxation times can be improved by optimizing fabrication processes to generate films with larger grain sizes (i.e., minimal grain boundaries) and a single oxidation state. They will also explore other superconductors, including tantalum, whose surface oxides are known to be more chemically uniform.
"From this study, we now have a blueprint for how scientists who make qubits and scientists who characterize them can collaborate to understand the microscopic mechanisms limiting qubit performance," said Premkumar. "We hope other groups will leverage our collaborative approach to drive the field of superconducting qubits forward."
Reference: "Microscopic relaxation channels in materials for superconducting qubits" by Anjali Premkumar, Conan Weiland, Sooyeon Hwang, Berthold Jäck, Alexander P. M. Place, Iradwikanari Waluyo, Adrian Hunt, Valentina Bisogni, Jonathan Pelliciari, Andi Barbour, Mike S. Miller, Paola Russo, Fernando Camino, Kim Kisslinger, Xiao Tong, Mark S. Hybertsen, Andrew A. Houck and Ignace Jarrige, 1 July 2021, Communications Materials. DOI: 10.1038/s43246-021-00174-7
This work was supported by the DOE Office of Science, National Science Foundation Graduate Research Fellowship, Humboldt Foundation, National Defense Science and Engineering Graduate Fellowship, Materials Research Science and Engineering Center, and Army Research Office. This research used resources of the Electron Microscopy, Proximal Probes, and Theory and Computation Facilities at the CFN, a DOE Nanoscale Science Research Center. The SST-2 beamline at NSLS-II is operated by the National Institute of Standards and Technology.
The rest is here:
Connecting the Dots Between Material Properties and Superconducting Qubit Performance - SciTechDaily
Quantum Computing in Agriculture Market to Witness Stellar CAGR During the Forecast Period 2021 -2026 – Northwest Diamond Notes
Executive Summary:
The data presented in the research report on Quantum Computing in Agriculture market is compiled with an aim to provide readers with a thorough understanding of industry dynamics in this business sphere. It entails a detailed analysis of growth stimulants, opportunities, challenges, and all other critical factors that will dictate the growth trajectory of this domain.
Proceeding further, the research document analyzes past records and recent data to predict market valuation and business expansion trends between 2021 and 2026. It also elaborates on the segmentations of this vertical and their respective contributions to overall growth. Besides, the competitive landscape of this industry is included to help stakeholders make the right decisions for the future.
Influence of the Quantum Computing in Agriculture Market report:
The huge assortment of tables, graphs, diagrams, and charts obtained in this market research report generates a strong niche for an in-depth analysis of the ongoing trends in the Quantum Computing in Agriculture Market. The report also looks at the latest developments and advancement among the key players in the market such as mergers, partnerships, and achievements.
In short, the Global Quantum Computing in Agriculture Market report offers a one-stop solution to all the key players covering various aspects of the industry like growth statistics, development history, industry share, Quantum Computing in Agriculture Market presence, potential buyers, consumption forecast, data sources, and beneficial conclusion.
Go here to see the original:
Quantum Computing in Agriculture Market to Witness Stellar CAGR During the Forecast Period 2021 -2026 - Northwest Diamond Notes
What is quantum computing?
Quantum computing is an area of study focused on the development of computer-based technologies centered around the principles of quantum theory. Quantum theory explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses a combination of bits to perform specific computational tasks, all at a much higher efficiency than their classical counterparts. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases; for example, quantum computing excels at tasks like simulations.
The quantum computer gains much of its processing power through the ability for bits to be in multiple states at one time. They can perform tasks using a combination of 1s, 0s and both a 1 and 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.
Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, publishing a breakthrough paper a few months later.
Quantum Theory
Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which Planck introduced the idea that energy and matter exist in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.
Further Developments of Quantum Theory
Niels Bohr proposed the Copenhagen interpretation of quantum theory. This interpretation asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition: when we do not know the state of a given object, it is actually in all possible states simultaneously, as long as we don't look to check.
To illustrate this theory, we can use the famous analogy of Schrödinger's Cat. First, we have a living cat and place it in a lead box. At this stage, there is no question that the cat is alive. Then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law, in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.
The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.
A Comparison of Classical and Quantum Computing
Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3- or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply.
The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elemental particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
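The quantum logic described above can be made concrete with a small linear-algebra sketch (a hypothetical NumPy example, not tied to any particular quantum hardware or SDK; the `basis` helper is a name invented here for illustration). The CNOT gate, the quantum analogue of XOR, is a 4x4 matrix that flips a target qubit only when the control qubit is 1:

```python
import numpy as np

# CNOT ("controlled-XOR") gate: flips the target qubit when the control is |1>.
# Basis ordering: |00>, |01>, |10>, |11> (control qubit written first).
CNOT = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
])

def basis(label):
    """Return the statevector for a computational basis state like '10'."""
    state = np.zeros(2 ** len(label))
    state[int(label, 2)] = 1.0
    return state

# |10> -> |11>: the control is 1, so the target flips.
out = CNOT @ basis("10")
print(out)  # amplitude 1 on |11>
```

Because gates are just matrices acting on statevectors, composing them is matrix multiplication, and applying CNOT twice returns the original state, reflecting the reversibility of quantum logic.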
Superposition
Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used, and the particle is isolated from all external influences, the particle then enters a superposition of states, behaving as if it were in both states simultaneously.
Each qubit utilized can occupy a superposition of both 0 and 1, meaning the number of states a quantum computer can represent simultaneously is 2^n, where n is the number of qubits used. A quantum computer comprised of 500 qubits would have the potential to do 2^500 calculations in a single step. For reference, 2^500 is far more than the number of atoms in the known universe. These particles all interact with each other via quantum entanglement.
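The 2^n scaling can be checked directly with a toy statevector simulation (a minimal NumPy sketch; `uniform_superposition` is a name assumed here for illustration): applying a Hadamard gate to each of n qubits turns |0...0> into a vector of 2^n equal amplitudes.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate on a single qubit

def uniform_superposition(n):
    """Build the n-qubit state H|0> ⊗ ... ⊗ H|0> as a flat statevector."""
    state = np.array([1.0])
    for _ in range(n):
        # Tensor (Kronecker) product grows the state space by a factor of 2.
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

for n in (1, 2, 10):
    print(n, len(uniform_superposition(n)))  # statevector holds 2**n amplitudes
```

Even this classical simulation makes the exponential blow-up obvious: simulating 500 qubits this way would require a vector of 2^500 numbers, which is exactly why a classical machine cannot track what a large quantum computer does natively.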
In comparison to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; in classical computing, parallel processing simply means using two or more processors.

Entanglement

Particles (like qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle, up or down, gives away the spin of the other in the opposite direction. In addition, due to superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement, and the correlated particle simultaneously assumes the opposite spin direction. The reason why is not yet explained.
Quantum entanglement means that qubits separated by large distances remain correlated: measuring one instantly determines the outcome of measuring the other, although this cannot be used to transmit information faster than light. No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. As more qubits are added, the capacity expands exponentially.
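The 2-qubit register described above can be sketched numerically: four amplitudes, one per configuration, with measurement sampling a single outcome according to the Born rule. This is a hedged NumPy illustration, not a real quantum API; the variable names are assumptions.

```python
import numpy as np

# A 2-qubit register in equal superposition holds one amplitude per configuration.
amps = np.full(4, 0.5)           # |00>, |01>, |10>, |11>, each with amplitude 1/2
probs = np.abs(amps) ** 2        # Born rule: each outcome has probability 1/4

# Measurement collapses the register to a single configuration.
rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=8, p=probs)
print(samples)
```

Note the catch this illustrates: although the register "stores" all four configurations at once, a measurement returns only one of them, which is why quantum algorithms must be designed so the answer can be read out from a single collapsed result.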
Quantum Programming
Quantum computing offers the ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers.
The first major quantum algorithm appeared in 1994, when Peter Shor developed a quantum algorithm that could efficiently factorize large numbers.
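Shor's speedup comes from finding the period r of a^x mod N; once r is known, the factors of N follow by classical arithmetic. The sketch below brute-forces the period classically for tiny numbers (the function names are hypothetical, and this loop is exponentially slow, which is precisely the step a quantum computer accelerates):

```python
from math import gcd

def find_period(a, N):
    """Find the order r of a modulo N: the smallest r with a**r % N == 1.
    This is the step Shor's algorithm performs efficiently on a quantum computer."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Factor N using the period of a, assuming gcd(a, N) == 1."""
    r = find_period(a, N)
    if r % 2 != 0:
        return None  # odd period: retry with a different a
    guess = pow(a, r // 2, N)
    p, q = gcd(guess - 1, N), gcd(guess + 1, N)
    factors = sorted({p, q} - {1, N})
    return factors or None

print(shor_classical(15, 7))  # period of 7 mod 15 is 4, giving factors [3, 5]
```

For N = 15 and a = 7 the powers cycle 7, 4, 13, 1, so r = 4, and gcd(7^2 ± 1, 15) yields the factors 3 and 5; a quantum computer finds r for cryptographically large N, where this classical loop becomes hopeless.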
The Problems - And Some Solutions
The benefits of quantum computing are promising, but there are still huge obstacles to overcome.
There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years and in the recent past have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis.
Read more here:
What is quantum computing?