Category Archives: Quantum Computing
View: It's the Spacetime to Quantum – Economic Times
In July, the European Organisation for Nuclear Research (CERN) announced it would deploy quantum computers (QCs) to power its search for fundamental particles. Unlike a decade ago, QCs are no longer tentative prototypes but are fast emerging as a viable tool for niche practical applications, ranging from designing novel materials to enabling drug discovery.
QCs are now available as a cloud-based service to anyone with an internet connection. We will see the unveiling of more powerful QCs over the next five years. How prepared is India to ride the quantum technology wave?
Introduced as an idea by Nobel-winning physicist Richard Feynman in the early 1980s, QCs are not merely faster versions of the computers we use but are machines based on the laws of quantum physics. A typical QC hardware computes by manipulating electrons and nuclei using electromagnetic radiation from lasers. The technology is complex as precise control over these delicate manipulation schemes is necessary to perform calculations. If this technology can be mastered, QCs promise, at least for a certain class of problems, unprecedented computational speeds not attainable even by the fastest supercomputers available today.
Barring a few premier institutions, quantum computing is not yet part of the curriculum in most Indian universities and colleges. This issue must be addressed through a programme to skill faculty, enabling them to teach engineering and science undergraduates. By 2024, India's software developer community is expected to be the largest in the world. By training this community, India can create a quantum workforce for itself and the world.
GoI and the industry must support interdisciplinary research and development in quantum science and technologies. As part of the National Mission on Quantum Technologies and Applications (NM-QTA), the 2020 budget had committed ₹8,000 crore. Also, a Technology Innovation Hub (TIH) for quantum technologies has been set up at the Indian Institute of Science Education and Research (IISER), Pune, focused on translating research into products and services. These investments must increase. At present, private investments are lacking. Industry and PSUs must be incentivised to evaluate and work on applications relevant to their domains.
Quantum technologies include a whole gamut of interrelated technologies: quantum cryptography, quantum sensors, quantum materials, quantum metrology, etc. Products based on quantum cryptography for secure communications are already available in the market. However, unambiguous evidence of the societal benefits of QCs is still lacking. Demonstrating a few showcase applications is critical to persuading industry to invest in quantum technologies. These applications could be in drug discovery, logistics and optimisation, new materials, fintech, machine learning and defence. This will have the cascading effect of seeding a vibrant quantum startup ecosystem, leading to job creation and economic growth.
India must build its own competitively sized QC in mission mode by pooling its existing academic expertise. A few indigenous QCs will give India a voice in shaping the future of quantum computing. With the right policy framework and incentives, India has the potential to become a key player in a global quantum technology market anticipated to reach $31.57 billion (₹2.32 lakh crore) by 2026. This will generate more technical jobs in the coming decades. India must move fast to respond to the fast-evolving quantum landscape.
See the article here:
View: It's the Spacetime to Quantum - Economic Times
How Horizon Plans To Bring Quantum Computing Out Of The Shadows – Forbes
Breakthroughs in quantum computing keep coming: the latest quantum processor designed by Google has solved a complex mathematical calculation in less than four minutes; the most advanced conventional computers would require 10,000 years to get to an answer. Here's the problem, though: even as scientists perfect the quantum computing hardware, there aren't many people with the expertise to make use of it, particularly in real-life settings.
Joe Fitzsimons, the founder of Horizon Quantum Computing, believes he is well placed to help here. Fitzsimons left academia in 2018 following years of research at Oxford University and the Quantum Information and Theory group in Singapore, spotting an opportunity. "We're building the tools that will help people take advantage of these advances in the real world," he explains.
Understanding Horizon's unique selling point does not require a crash course in quantum computing. The key point is that while conventional computing uses a binary processing technique, a world reduced to 0 or 1, quantum computing operates on many combinations of these digits simultaneously; that means it can get results far more quickly.
The problem for anyone wanting to take advantage of this speed and power is that conventional computer programs won't run on quantum computers. And not only do you need a different language to tell your quantum computer what to do, the program also needs to be able to work out the best way for the machine to achieve a given outcome; not every possible route will secure an advantage.
A further difficulty is that quantum computer programmers are in short supply. And quantum computer programmers who also understand the intricacies of the commercial problems that need solving, in financial services, pharmaceuticals or energy, say, are all but non-existent.
Horizon aims to fill this gap. "Our role is to make quantum computing accessible by building the tools with which people can use it in the real world," he explains. "If there is a problem that can be addressed by quantum computing, we need to make it more straightforward to do so."
Think of Horizon as offering a translation service. If you have written a programme to deliver a particular outcome on a conventional computer, Horizon's translation tool will turn it into a programme that can deliver the same outcome from a quantum processor. Even better, the tool will work out the best possible way to make that translation so that it optimises the power of quantum computing to deliver your outcome more speedily.
Horizon's Joe Fitzsimons wants to drive access to quantum computing
In the absence of such tools, real-life applications for quantum computing have been developing slowly. One alternative is to use one of the libraries of programmes that already exist for quantum computing, assuming there is one for your particular use case. Another is to hire a team of experts or buy expertise in from a consultant to build your application for you, but this requires time and money, even if talent with the right skills for your outcome is actually out there.
"Instead, we are trying to automate what someone with that expertise would do," adds Fitzsimons. "If you're an expert in your particular field, we provide the quantum computing expertise so that you don't need it."
We are not quite at the stage of bringing quantum computing to the masses. For one thing, hardware developers are still trying to perfect the machines themselves. For another, we don't yet have a clear picture of where quantum computing will deliver the greatest benefits, though it is increasingly clear that the most promising commercial use cases lie in industries that generate huge amounts of data and require complex analytics to draw insight from that information.
Nevertheless, Fitzsimons believes widescale adoption of quantum computing is coming closer by the day. He points to the huge volumes of funding now going into the industry, not least private sector investment, which is doubling each year, and the continuing technical breakthroughs.
From a commercial perspective, the forecasts are impressive. The consulting group BCG thinks the quantum computing sector could create $5bn-$10bn worth of value in the next three to five years and $450bn to $850bn in the next 15 to 30 years. And Horizon is convinced it can help bring those paydays forward.
Originally posted here:
How Horizon Plans To Bring Quantum Computing Out Of The Shadows - Forbes
Quantum Computing Breakthrough: Entanglement of Three Spin Qubits Achieved in Silicon – SciTechDaily
Figure 1: False-colored scanning electron micrograph of the device. The purple and green structures represent the aluminum gates. Six RIKEN physicists succeeded in entangling three silicon-based spin qubits using the device. Credit: 2021 RIKEN Center for Emergent Matter Science
A three-qubit entangled state has been realized in a fully controllable array of spin qubits in silicon.
An all-RIKEN team has increased the number of silicon-based spin qubits that can be entangled from two to three, highlighting the potential of spin qubits for realizing multi-qubit quantum algorithms.
Quantum computers have the potential to leave conventional computers in the dust when performing certain types of calculations. They are based on quantum bits, or qubits, the quantum equivalent of the bits that conventional computers use.
Although less mature than some other qubit technologies, tiny blobs of silicon known as silicon quantum dots have several properties that make them highly attractive for realizing qubits. These include long coherence times, high-fidelity electrical control, high-temperature operation, and great potential for scalability. However, to usefully connect several silicon-based spin qubits, it is crucial to be able to entangle more than two qubits, an achievement that had evaded physicists until now.
Seigo Tarucha (second from right) and his co-workers have realized a three-qubit entangled state in a fully controllable array of spin qubits in silicon. Credit: 2021 RIKEN
Seigo Tarucha and five colleagues, all at the RIKEN Center for Emergent Matter Science, have now initialized and measured a three-qubit array in silicon with high fidelity (the probability that a qubit is in the expected state). They also combined the three entangled qubits in a single device.
This demonstration is a first step toward extending the capabilities of quantum systems based on spin qubits. "Two-qubit operation is good enough to perform fundamental logical calculations," explains Tarucha. "But a three-qubit system is the minimum unit for scaling up and implementing error correction."
The team's device consisted of a triple quantum dot on a silicon/silicon-germanium heterostructure and is controlled through aluminum gates. Each quantum dot can host one electron, whose spin-up and spin-down states encode a qubit. An on-chip magnet generates a magnetic-field gradient that separates the resonance frequencies of the three qubits, so that they can be individually addressed.
The researchers first entangled two of the qubits by implementing a two-qubit gate, a small quantum circuit that constitutes the building block of quantum-computing devices. They then realized three-qubit entanglement by combining the third qubit and the gate. The resulting three-qubit state had a remarkably high state fidelity of 88%, and was in an entangled state that could be used for error correction.
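For readers who want a concrete picture of what "entangling two qubits with a two-qubit gate and then combining the third qubit" looks like at the circuit level, here is a minimal, idealized sketch using Qiskit. It builds the textbook three-qubit GHZ-type entangled state from a Hadamard and two CNOT gates; this is an illustrative gate-model analogue, not the RIKEN team's actual pulse-level control code for silicon spin qubits.

```python
# Minimal gate-level sketch of three-qubit entanglement (GHZ state).
# Illustrative only: the RIKEN device drives spin qubits with microwave
# pulses rather than this idealized circuit model.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)
qc.h(0)        # put qubit 0 into a superposition
qc.cx(0, 1)    # two-qubit gate: entangle qubits 0 and 1
qc.cx(1, 2)    # bring the third qubit into the entangled state

state = Statevector.from_instruction(qc)
print(state)   # amplitudes concentrated on |000> and |111>
```

In an ideal simulation the fidelity of this state is 1; the 88% figure reported by the team reflects the noise and control imperfections of real hardware.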
This demonstration is just the beginning of an ambitious course of research leading to a large-scale quantum computer. "We plan to demonstrate primitive error correction using the three-qubit device and to fabricate devices with ten or more qubits," says Tarucha. "We then plan to develop 50 to 100 qubits and implement more sophisticated error-correction protocols, paving the way to a large-scale quantum computer within a decade."
Reference: "Quantum tomography of an entangled three-qubit state in silicon" by Kenta Takeda, Akito Noiri, Takashi Nakajima, Jun Yoneda, Takashi Kobayashi and Seigo Tarucha, 7 June 2021, Nature Nanotechnology. DOI: 10.1038/s41565-021-00925-0
Read the original post:
Quantum Computing Breakthrough: Entanglement of Three Spin Qubits Achieved in Silicon - SciTechDaily
IonQ Scores Quantum Computing Deal With University Of Maryland And Announces It's Tripling 2021 Bookings – Forbes
The relationship between higher education and the tech companies I cover as an analyst is close and mutually beneficial. The private sector often provides technology resources, capital, expertise, and knowledge of industry needs and challenges to research institutions, the sandbox of tomorrow's tech innovators and leaders.
Quantum technology is at an exciting crossroads now, where it is beginning to migrate out of the realm of research and academia to seek out early commercialization opportunities. Much quicker and more powerful than traditional computing, quantum technology promises to revolutionize everything from medicine to climate science. It could very well change the world as we know it within our lifetimes.
So naturally, I immediately perked up at this week's news of the University of Maryland's (UMD) $20 million, three-year investment in quantum computing, the majority of which will go to IonQ to co-develop a groundbreaking quantum laboratory on the university's College Park campus.
The National Quantum Lab at Maryland, or Q-Lab for short, looks to be an ambitious project that could pay significant dividends in the effort to advance and commercialize quantum technology. While I had initially viewed the word "investment" as a balance-sheet impact, versus revenue, IonQ announced today that it has tripled its bookings forecast for 2021, suggesting the UMD deal is very much a revenue event. To be clear, the tripling of bookings isn't only UMD, but includes other customers, too.
Let's look at the players, the deal and what it includes.
Something is happening in College Park
Based in College Park, MD, IonQ was founded in 2015 by Christopher Monroe, a professor at the University of Maryland, and Jungsang Kim, a professor at Duke University (a great example of higher ed's interconnectivity with the private sector). Built on its founders' 25 years of academic quantum research, IonQ's bread and butter is a subcategory of quantum computing known as trapped ion quantum computing. While a full explanation of trapped ion computing is well beyond the scope of this blog (and more in the wheelhouse of Moor Insights & Strategy's quantum principal analyst, Paul Smith-Goodson), know that it is one of the more promising proposed approaches to achieving a large-scale quantum computer.
UMD College Park, for its part, is known as a leading public research university, particularly in the field of quantum computing. Maryland's flagship university has invested approximately $300 million in quantum science over the last 30-plus years and currently hosts over 200 quantum researchers and seven quantum facilities. The campus is already home to the Quantum Startup Foundry and the Mid-Atlantic Quantum Alliance, two organizations committed to advancing the nascent quantum ecosystem.
The Q-Lab promises to be the world's first on-campus, commercial-grade quantum user facility. The stated goal of the Q-Lab is to significantly democratize access to IonQ's state-of-the-art technology, giving students, faculty and researchers hands-on experience with technology such as the company's 32-qubit trapped-ion quantum computer (the most performant quantum computer in operation). Lab users also stand to benefit from the opportunity to collaborate with IonQ's quantum scientists and engineering experts, who will co-locate within the lab (which will be located next door to IonQ's College Park headquarters).
IonQ's market momentum
The announcement of the Q-lab comes along with a flurry of other exciting activity at IonQ. Last month, the company demonstrated its 4X16 Reconfigurable Multicore Quantum Architecture (RMQA), an industry first. IonQ says this breakthrough could enable it to boost its qubit count up to the triple digits on a single chip, also laying the groundwork for theoretical future Parallel Multicore Quantum Processing Units.
Another significant recent announcement from IonQ was that it will now offer its quantum systems on Google Cloud (the first quantum player to do so). For that matter, it is now the only quantum provider available via all three of the major cloud platforms (Microsoft Azure, Google Cloud and AWS) and through direct API access. I see this as another crucial way in which IonQ is democratizing access to quantum computers.
Additionally, the company recently announced a strategic integration with IBM Qiskit. This quantum software development kit will make it easier for quantum programmers to get up and running with IonQs systems. Rounding out the new developments was the announcement of a partnership with SoftBank Investment Advisors to facilitate enterprise deployment of quantum solutions worldwide.
With all of these developments, including the Q-Lab, considered, it's no wonder IonQ recently tripled its expectations for its 2021 contract bookings, from an original goal of $5 million to an ambitious $15 million. To be clear, the tripling of bookings isn't only UMD, but includes other customers, too. All of this must look good to investors, who will soon get a crack at the quantum company when it goes public via a special purpose acquisition company (SPAC) merger with dMY Technology Group, Inc. later this month under the ticker $DMYI.
Wrapping up
With both a preeminent quantum research school and a private sector quantum leader located in College Park, the Maryland city could soon be a (if not the) veritable epicenter of quantum technology in the United States. The Q-lab has the potential to produce the next generation of quantum innovators, generate new quantum IP and draw even more quantum startups and scientific and engineering talent to College Park.
We're likely a bit away from realizing quantum computing's full potential as a paradigm shift. However, IonQ's moves this summer demonstrate that the technology is entering a new, exciting phase of commercialization, which should only accelerate the pace of innovation at research locations such as the new Q-Lab. I'll be watching with interest.
From the business point of view, it is great to see IonQ drive orders and, subsequently, revenue. I hear from some of the uninformed that there's no money in quantum. I think the doubters are wrong, and when we all get a closer look at IonQ's financials, I believe there will be some surprises.
Quantum computing breakthrough achieved, road to the future begins now – TweakTown
A team of researchers has achieved what is being described as a "breakthrough" in quantum computing.
The achievement comes from a team of researchers at the RIKEN Center for Emergent Matter Science, who have been able to entangle a three-qubit array in silicon with high fidelity, the probability of correctly predicting the state a qubit is in. For those who don't know, instead of using bits to make calculations and perform tasks like a typical computer does, quantum computers use quantum bits, or qubits.
The device the researchers created used three very small blobs of silicon called quantum dots, and each of these dots can hold one electron. The direction of the electron's spin encodes the qubit. With that in mind, it should be noted that "[t]wo-qubit operation is good enough to perform fundamental logical calculations. But a three-qubit system is the minimum unit for scaling up and implementing error correction," explains Tarucha.
False-colored scanning electron micrograph of the device. The purple and green structures represent the aluminum gates, per scitechdaily.com.
After successfully entangling two qubits, the team of researchers introduced the third qubit and achieved a three-qubit entangled state with a high fidelity of 88%. Tarucha added, "We plan to demonstrate primitive error correction using the three-qubit device and to fabricate devices with ten or more qubits. We then plan to develop 50 to 100 qubits and implement more sophisticated error-correction protocols, paving the way to a large-scale quantum computer within a decade."
For more information on this story, check out this link here.
See more here:
Quantum computing breakthrough achieved, road to the future begins now - TweakTown
Quantum Computing Theorist Vojtech Vlcek Receives Research Award from DOE – HPCwire
Sept. 8, 2021 - How can one predict a material's behavior on the molecular and atomic levels, at the shortest timescales? What's the best way to design materials to make use of their quantum properties for electronics and information science?
These broad, difficult questions are the type of inquiries that UC Santa Barbara theorist Vojtech Vlcek and his lab will investigate as part of a select group of scientists chosen by the U.S. Department of Energy (DOE) to develop new operating frameworks for some of the worlds most powerful computers. Vlcek will be leading one of five DOE-funded projects to the tune of $28 million overall that will focus on computational methods, algorithms and software to further chemical and materials research, specifically for simulating quantum phenomena and chemical reactions.
"It's really exciting," said Vlcek, an assistant professor in the Department of Chemistry and Biochemistry and one of the youngest researchers, if not the youngest, to lead such a major endeavor. "We believe we will for the first time be able to not only really describe realistic systems, but also provide this whole framework for ultrafast and driven phenomena that will actually set the scene for future developments."
"I congratulate Vojtech Vlcek on being selected for this prestigious grant," said Pierre Wiltzius, dean of mathematical, physical and life sciences at UC Santa Barbara. "It's especially impressive and unusual for an assistant professor to lead this type of complex, multi-institution research project. Vojtech is in a league of his own, and I look forward to future insights that will come from the team's discoveries."
A Multilayer Framework
As part of the DOE's efforts toward clean energy technologies, scientists across the nation study matter and energy at their most fundamental levels. The goal is to design and discover new materials and processes that can generate, manipulate and store energy, techniques that have applications in a wide variety of areas, including energy, environment and national security.
Uncovering these potentially beneficial phenomena and connecting them to the atoms they come from is hard work, work that could be assisted by the supercomputers housed in the DOE's national laboratories.
"DOE's national labs are home to some of the world's fastest supercomputers, and with more advanced software programs we can fully harness the power of these supercomputers to make breakthrough discoveries and solve the world's hardest-to-crack problems," said U.S. Secretary of Energy Jennifer M. Granholm. "These investments will help sustain U.S. leadership in science, accelerate basic energy research and advance solutions to the nation's clean energy priorities."
Among these hard-to-crack problems is the issue of many interacting particles. Interactions are more easily predicted in a system of a few atoms or molecules, or in very regular, periodic systems. But add more bodies or use more elaborate systems and the complexity skyrockets, because the characteristics and behaviors of, and interactions between, every particle have to be accounted for. In some cases, their collective behaviors can produce interesting phenomena that can't be predicted from the behavior of individual particles.
"People have been working with small molecules, or characterizing perfectly periodic systems, or looking at just a few atoms," Vlcek said, "and more or less extending their dynamics to try to approximate the behaviors of larger, more complex systems."
"This is not necessarily realistic," he continued. "We want to simulate surfaces. We want to simulate systems that have large-scale periodicity. And in these cases you need to consider systems that are not on nanometer scales, but on the scale of thousands of atoms."
Add to that complexity non-equilibrium processes, which are the focus of Vlcek's particular project. He will be leading an effort that involves an additional seven co-principal investigators from UC Berkeley, UCLA, Rutgers University, the University of Michigan and Lawrence Berkeley National Laboratory.
"Essentially these systems are driven by some strong external stimuli, like from lasers or other driving fields," he said. "These processes are relevant for many applications, such as electronics and quantum information sciences."
The goal, according to Vlcek, is to develop algorithms and software based on a multilayer framework with successive layers of embedding theories to capture non-equilibrium dynamics. The team, in partnership with two DOE-supported Scientific Discovery through Advanced Computing (SciDAC) Institutes at Lawrence Berkeley and Argonne National Laboratories, begins with the most fundamental assumptions of quantum theory. That foundation is followed by layers that incorporate novel numerical techniques and neural network approaches to take advantage of the intensive computing the supercomputers can perform.
"We still stay with the first-principles approach, but we're making successive levels of approximations," Vlcek explained. "And with this approach we'll be able to treat extremely large systems." Among the many advantages of the methodology will be the ability, for the first time, to describe experimental systems in real time, as they are driven by external forces.
"The outcome of the project will be bigger than the sum of its parts," said Vlcek. Not only will it provide a method of studying and designing a wide variety of present and future novel materials, the algorithms are also meant for future supercomputers.
"One interesting outcome will be that we will also try to connect to future computational platforms, which could possibly be quantum computers," he said. "So this framework will actually allow future research on present and future novel materials as well as new theoretical research."
Source: UC Santa Barbara
Read more:
Quantum Computing Theorist Vojtech Vlcek Receives Research Award from DOE - HPCwire
UMD, IonQ join forces to create the nation’s first quantum computing lab in College Park – The Diamondback
The University of Maryland and IonQ, a College Park-based quantum computing company, announced Wednesday that they will join forces to develop a facility that will give students, faculty, staff and researchers access to a commercial-grade quantum computer.
The new facility, which will be known as the National Quantum Lab at Maryland, or Q-Lab for short, is the product of a nearly $20 million investment from this university. As the nation's first facility of its kind, it will also provide training related to IonQ's hardware and allow visitors to collaborate with the company's scientists and engineers, according to a news release.
"No other university in the United States is able to provide students and researchers this level of hands-on contact with commercial-grade quantum computing technology and insights from experts working in this emerging field," university President Darryll Pines said in the news release.
The Q-Lab will be located in the Discovery District next to IonQ's headquarters by the College Park Airport, the news release stated.
Quantum computing attempts to evolve computer technology, striving to create a machine that can solve more problems at a faster rate.
Around the time IonQ announced its plans to go public earlier this year, Pines explained that classical computing uses a stream of electrical pulses called bits, which represent 1s and 0s, to store information. However, on the quantum scale, subatomic particles known as qubits are used to store information, greatly increasing computing speed.
"Most importantly, we wanted to put our scientists at the cutting edge of quantum computers because we know that we already use supercomputers," Pines said Wednesday. "But why not use the best computers that are right in our backyard?"
Recent advancements in quantum computing also support research in areas such as biology, medicine, climate science and materials development, the release noted, adding that the creation of the Q-Lab may also attract additional entrepreneurs and startups to College Park.
"We could not be more proud of IonQ's success and we are excited to establish this strategic partnership, further solidifying UMD and the surrounding region as the Quantum Capital of the world," Pines added.
The development of the Q-Lab builds upon the university's $300 million investment in quantum science and more than 30-year history of advancements in the field, according to the news release. The university also currently houses more than 200 researchers and seven centers specializing in quantum-related work.
"We are very proud that the nation's leading center of academic excellence in quantum research chose IonQ's hardware for this trailblazing partnership," said Peter Chapman, the president and CEO of IonQ.
Chris Monroe, a professor in this university's physics department, and Jungsang Kim co-founded IonQ, which is set to become the first publicly traded commercialized quantum computing company. The company is estimated to go public with a valuation of nearly $2 billion.
The company recently became the first quantum computer supplier whose products are available on all major cloud services providers such as Google Cloud, Microsoft Azure and Amazon Web Services, according to the release.
Monroe and Kim also joined the White House's National Quantum Initiative Advisory Committee in an effort to accelerate the development of the national strategic technological imperative, the news release stated.
"UMD has been at the vanguard of this field since quantum computing was in its infancy, and has been a true partner to IonQ as we step out of the lab and into commerce, industry, and the public markets," Chapman said in the news release.
Senior staff writer Clara Niel contributed to this report.
Here is the original post:
UMD, IonQ join forces to create the nation's first quantum computing lab in College Park - The Diamondback
Leading Chinese researchers are looking at the coming quantum revolution – The Press Stories
Quantum technologies refer to engineered systems that use the quantum properties of photons, electrons, atoms or molecules. For example, Radio China reports that lasers, magnetic resonance imaging and the global positioning system are closely linked to quantum technology.
Zhao Fuan, a recent graduate of the Chinese University of Science and Technology and a doctoral candidate in optical engineering, left a well-paying job with an annual salary of 600,000 yuan ($92,300) and decided to start his own business built on the foundation of quantum computing technology. "For example, when a new object is discovered, we can assess whether the equipment is working properly by detecting any change in the electromagnetic field around it," he explains.
Currently, common methods for checking the quality of electrical equipment, such as temperature measurement and ultraviolet radiation, are good for detecting large thermal defects, but are not as effective for small ones or for predicting potential problems that may develop rapidly.
According to Zhao Bowen, in terms of quantum technology China already has advantages over relatively advanced countries, but it lags behind in some areas, such as the sensors used for facial recognition.
Another young expert who has linked his career with quantum technology is Ha Yu from Sichuan Province. Before completing his doctorate, he launched a new product in the field of quantum sensors, which has great potential for detecting microscopic structures such as cells and protein molecules. In 2016, he founded CIQTEK, a company specializing in quantum computing that is attracting considerable interest from investors. "We have investors like iFlytek and Hillhouse Capital behind us, and the government has given us some support," he said.
Zhao Bowen and Ha Yu are part of a growing army of experts in quantum physics, quantum computing and related industries in China. According to statistics, more than 4,000 companies related to quantum technology were created in the first half of this year, an increase of 652% year on year.
Follow this link:
Leading Chinese researchers are looking at the coming quantum revolution - The Press Stories
Top 10 Data Center Stories of the Month: August 2021 – Data Center Knowledge
It May Be Too Early to Prepare Your Data Center for Quantum Computing: Before some fundamental questions are answered, it's hard to predict what shape quantum computing will take at scale.
Google, Amazon, Microsoft Share New Security Efforts After White House Summit: The news arrives after tech company leaders met with President Biden to discuss the public-private partnership needed to address security threats.
What Has to Happen for Quantum Computing to Hit Mainstream? Data Center World keynote: It's still early days for quantum computing, where the fundamental technology remains unsettled, and the nature of workloads is fuzzy.
Open Compute Project: Redefining Open Source for the Data Center: OCP expanded the meaning of "open source" beyond software to address the same problems open source software is meant to address.
Taking a Close Look at the $2B for Cybersecurity in the $1T US Infrastructure Bill: The $1 trillion spending package includes funds for bolstering cybersecurity posture in critical digital infrastructure.
The Intersection of Colocation and Hybrid Cloud Remains in Flux: All colo providers recognize a business opportunity in the hybrid cloud trend. How they're going after it differs widely.
How Much Does Hard Disk Temperature Matter? Tracking hard disk temperature can help avoid disk failure, and the consequences of disk failure.
Digital Realty's Hybrid Cloud Strategy Rests On Connectivity, Partnerships: The company's focus is on making connectivity easier for customers, while partners enable hybrid architecture solutions.
Pilot in Austin to Offer Early Look at Edge Computing at Scale: A group is deploying dozens of nodes that combine compute, connectivity, and sensors in a uniform fashion.
Nvidia Gives Upbeat Forecast Even as Supplies Remain Tight: Its data center unit, which sells GPU accelerators for supercomputers and AI, had sales of $2.37 billion in the quarter, up 35% from a year earlier.
Will Cloudflare's Zero-Carbon Pledge Make a Real Impact? Its commitment to 100% renewable energy operations and removing historic emissions is laudable, but complex challenges limit its ambitions compared with hyperscalers.
Read the original post:
Top 10 Data Center Stories of the Month: August 2021 - Data Center Knowledge
Large-Scale Simulations Of The Brain May Need To Wait For Quantum Computers – Forbes
Will quantum computer simulations crack open our understanding of the biological brain?
Looking back at the history of computers, it's hard to overstate the rate at which computing power has scaled in the course of just a single human lifetime. And yet, existing classical computers have fundamental limits. If quantum computers are successfully built and eventually fully come online, they will be able to tackle certain classes of problems that elude classical computers. And they may be the computational tool needed to fully understand and simulate the brain.
As of this writing, the fastest supercomputer in the world is Japan's Fugaku supercomputer, developed jointly by Riken and Fujitsu. It can perform 442 petaFLOPS, or 442 quadrillion floating-point operations per second.
Let's break that number down in order to arrive at an intuitive (as much as possible) grasp of what it means.
A floating-point number is a way to express, or write down, a real number - real in a mathematical sense - with a fixed amount of precision. Real numbers are all the continuous numbers from the number line. 5, -23, 7/8, and numbers like pi (3.1415926...) that go on forever are all real numbers. The problem is that a computer, which is digital, has a hard time internally representing continuous numbers. So one way around this is to specify a limited number of digits, and then specify how big or small the actual number is by some base power. For example, the number 234 can be written as 2.34 x 10^2, because 2.34 x 100 equals 234. Floating-point numbers specify a fixed number of significant digits the computer must store in its memory. This fixes the accuracy of the number, which is important because if you do any mathematical operation (e.g. addition, subtraction, division or multiplication) with the fixed-accuracy version of a real number, small errors in your results will be generated that propagate (and can grow) throughout other calculations. But as long as the errors remain small, it's okay.
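A short, self-contained Python illustration of the fixed-precision trade-off described above; the specific numbers are only examples, not values from the article.

```python
# Floating-point numbers have fixed precision, so small rounding errors
# appear and can accumulate across many operations.
total = 0.0
for _ in range(10):
    total += 0.1          # 0.1 has no exact binary representation

print(total)              # 0.9999999999999999, not exactly 1.0
print(total == 1.0)       # False

# Scientific notation with a fixed number of significant digits:
x = 2.34e2                # 2.34 x 10^2
print(x)                  # 234.0
```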
A floating-point operation, then, is any arithmetic operation between two floating-point numbers (abbreviated as FLOP). Computer scientists and engineers use the number of FLOPs per second - or FLOPS - as a benchmark to compare the speed and computing power of different computers.
One petaFLOP is equivalent to 1,000,000,000,000,000 - or one quadrillion - mathematical operations. A supercomputer with a computing speed of one petaFLOPS is therefore performing one quadrillion operations per second! The Fugaku supercomputer is 442 times faster than that.
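As a rough, hands-on way to see what a FLOPS figure means, the sketch below times a dense matrix multiplication and estimates the achieved floating-point rate on an ordinary machine. The matrix size and the 2*n^3 operation count are standard approximations chosen for illustration; the result will vary widely by hardware and is many orders of magnitude below Fugaku.

```python
# Rough FLOPS estimate for one machine: time a dense matrix multiply.
import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flop = 2 * n**3  # ~2*n^3 multiply-adds for an n x n matrix multiplication
print(f"achieved: {flop / elapsed / 1e9:.1f} GFLOPS")
print(f"Fugaku:   {442e15:.0e} FLOPS (442 petaFLOPS)")
```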
For many types of important scientific and technological problems, however, even the fastest supercomputer isn't fast enough. In fact, it never will be. This is because for certain classes of problems, the number of possible combinations of solutions that need to be checked grows so fast, compared to the number of things that need to be ordered, that it becomes essentially impossible to compute and check them all.
Here's a version of a classic example. Say you have a group of people with differing political views, and you want to seat them around a table in order to maximize constructive dialogue while minimizing potential conflict. The rules you decide to use don't matter here, just that some set of rules exists. For example, maybe you always want to seat a moderate between a conservative and a liberal in order to act as a bit of a buffer.
This is what scientists and engineers call an optimization problem. How many possible combinations of seating arrangements are there? Well, if you only have two people, there are only two possible arrangements: one individual on each side of a table, and then the reverse, where the two individuals change seats. But if you have five people, the number of possible combinations jumps to 120. Ten people? Well, now you're looking at 3,628,800 different combinations. And that's just for ten people, or more generally, any ten objects. If you had 100 objects, the number of combinations is so huge that it's a number with 158 digits (roughly 9 x 10^157). By comparison, there are only about 10^21 stars in the observable universe.
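Here is a small sketch of the factorial blow-up behind the seating example. The specific "rule" used (no two non-moderates sit next to each other) is just a stand-in for whatever constraints a real optimizer would enforce; the point is that brute-force checking is fine for a handful of guests and hopeless beyond that.

```python
# Brute-force seating search: feasible for a handful of guests,
# impossible once the factorial count explodes.
import math
from itertools import permutations

for n in (2, 5, 10, 100):
    print(n, "guests ->", math.factorial(n), "possible orderings")

guests = ["liberal", "moderate", "conservative", "moderate", "liberal"]

def acceptable(order):
    # toy rule: every adjacent pair must include at least one moderate
    return all("moderate" in (a, b) for a, b in zip(order, order[1:]))

good = [p for p in permutations(guests) if acceptable(p)]
print(len(good), "acceptable arrangements out of", math.factorial(len(guests)))
```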
Imagine now if you were trying to do a biophysics simulation of a protein, in order to develop a new drug, that had millions or billions of individual molecules interacting with each other. The number of possible combinations that would need to be computed and checked far exceeds the capability of any computer that exists today. Because of how they're designed, even the fastest supercomputer is forced to check each combination sequentially - one after another. No matter how fast a classical computer is or can be, given the literally greater-than-astronomical number of combinations, many of these problems would take a practical eternity to solve. It just becomes impossible.
Relatedly, the other problem classical computers face is that it's impossible to build one with sufficient memory to store each of the combinations, even if all the combinations could be computed.
The details of how a quantum computer and quantum computing algorithms work are well beyond the scope or intent of this article, but we can briefly introduce one of the key ideas in order to understand how they can overcome the combinatorial limitations of classical computers.
Classical computers represent information - all information - as numbers. And all numbers can be represented as binary combinations of 1s and 0s. The 1 and 0 each represent a bit of information, the fundamental unit of classical information. Or put another way, information is represented by combinations of two possible states. For example, the number 24 in binary notation is 11000. The number 13 is 1101. You can do all arithmetic in binary as well. This is convenient, because physically, at the very heart of classical computers is the transistor, which is just an on-off electrical switch. When it's on it encodes a 1, and when it's off it encodes a 0. Computers do all their math by combining billions of tiny transistors that very quickly switch back and forth as needed. Yet, as fast as this can occur, it still takes finite amounts of time, and all calculations need to be done in an appropriately ordered sequence. If the number of necessary calculations becomes big enough, as is the case with the combinatorial problems discussed above, you run into an unfeasible computational wall.
Quantum computers are fundamentally different. They overcome the classical limitations by being able to represent information internally not just as a function of two discrete states, but as a continuous probabilistic mixing of states. This allows quantum bits, or qubits, to have many more possible states they can represent at once, and so many more possible combinations of arrangements of objects at once. Put another way, the state space and computational space that a quantum computer has access to is much larger than that of a classical computer. And because of the wave nature of quantum mechanics and superposition (concepts we will not explore here), the internal mixing and probabilistic representation of states and information eventually converge to one dominant solution that the computer outputs. You can't actually observe that internal mixing, but you can observe the final computed output. In essence, as the number of qubits in the quantum computer increases, you can exponentially do more calculations in parallel.
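To make the "exponentially larger state space" concrete, here is a small sketch of how the classical description of an n-qubit state grows: a classical simulator must store 2^n complex amplitudes, which is exactly the memory wall that quantum hardware sidesteps by holding the state physically rather than writing it down. The 16-bytes-per-amplitude figure assumes double-precision complex numbers.

```python
# Classical memory needed just to *store* an n-qubit state vector:
# 2**n complex amplitudes at 16 bytes each (double-precision complex).
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30
    print(f"{n:2d} qubits: {amplitudes:>16,d} amplitudes  ~{gib:,.1f} GiB")
```

By 50 qubits the table already reaches roughly 16 million GiB, which is why even modest qubit counts are out of reach for exact classical simulation.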
The key concept here is not that quantum computers will necessarily be able to solve new and exotic classes of problems that classical computers can't - although computer scientists have discovered a theoretical class of problem that only quantum computers can solve - but rather that they will be able to solve classes of problems that are - and always will be - beyond the reach of classical computers.
And this isn't to say that quantum computers will replace classical computers. That is not likely to happen anytime in the foreseeable future. For most classes of computational problems classical computers will still work just fine and probably continue being the tool of choice. But for certain classes of problems, quantum computers will far exceed anything possible today.
Well, it depends on the scale at which the dynamics of the brain are being simulated. For sure, there has been much work within the field of computational neuroscience over many decades successfully carrying out computer simulations of the brain and brain activity. But it's important to understand the scale at which any given simulation is done.
The brain is exceedingly structurally and functionally hierarchical - from genes, to molecules, cells, networks of cells and networks of brain regions. Any simulation of the brain needs to begin with an appropriate mathematical model, a set of equations that captures the chosen scale being modeled and specifies a set of rules to simulate on a computer. It's like a map of a city. The mapmaker needs to make a decision about the scale of the map - how much detail to include and how much to ignore. Why? Because the structural and computational complexity of the brain is so vast and huge that it's impossible, given existing classical computers, to carry out simulations that cut across the many scales with any significant amount of detail.
Even though a wide range of mathematical models about the molecular and cell biology and physiology exist across this huge structural and computational landscape, it is impossible to simulate with any accuracy because of the sheer size of the combinatorial space this landscape presents. It is the same class of problem as that of optimizing people with different political views around a table. But on a much larger scale.
Once again, it in part depends on how you choose to look at it. There is an exquisite amount of detail and structure to the brain across many scales of organization. Here's a more in-depth article on this topic.
But if you just consider the number of cells that make up the brain and the number of connections between them as a proxy for the computational complexity - the combinatorial space - of the brain, then it is staggeringly large. In fact, it defies any intuitive grasp.
The brain is a massive network of densely interconnected cells consisting of about 171 billion brain cells - 86 billion neurons, the main class of brain cell involved in information processing, and another 85 billion non-neuronal cells. There are approximately 10 quadrillion connections between neurons - that is, a 1 followed by 16 zeros. And of the 85 billion other non-neuronal cells in the brain, one major type, called astrocyte glial cells, has the ability to both listen in on and modulate neuronal signaling and information processing. Astrocytes form a massive network unto themselves, while also cross-talking with the network of neurons. So the brain actually has two distinct networks of cells, each carrying out different physiological and communication functions, but at the same time overlapping and interacting with each other.
The computational size of the human brain in numbers.
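A back-of-the-envelope sketch of what these numbers imply for classical simulation. The cell and connection counts come from the article; the 4-bytes-per-connection figure is an assumption for illustration (one 32-bit weight per synapse), not a number from the source, and it ignores all dynamics, just the storage of the wiring.

```python
# Back-of-the-envelope storage estimate for the brain's wiring alone.
neurons = 86e9          # neurons (from the article)
glia = 85e9             # non-neuronal cells (from the article)
synapses = 1e16         # connections between neurons (from the article)

bytes_per_connection = 4      # assumption: one 32-bit weight per connection
total_bytes = synapses * bytes_per_connection

print(f"total cells: {neurons + glia:.3e}")
print(f"storage for connection weights alone: {total_bytes / 1e15:.0f} petabytes")
```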
On top of all that structure, there are billions upon billions upon billions of discrete electrical impulses, called action potentials, that act as messages between connected neurons. Astrocytes, unlike neurons, don't use electrical signals. They rely on a different form of biochemical signaling to communicate with each other and with neurons. So there is an entire other molecularly based information-signaling mechanism at play in the brain.
Somehow, in ways neuroscientists still do not fully understand, the interactions of all these electrical and chemical signals carry out all the computations that produce everything the brain is capable of.
Now pause for a moment, and think about the uncountable number of dynamic and ever changing combinations that the state of the brain can take on given this incredible complexity. Yet, it is this combinatorial space, the computations produced by trillions of signals and billions of cells in a hierarchy of networks, that result in everything your brain is capable of doing, learning, experiencing, and perceiving.
So any computer simulation of the brain is ultimately going to be very limited. At least on a classical computer.
How big and complete are the biggest simulations of the brain done to date? And how much impact have they had on scientists' understanding of the brain? The answer critically depends on what's being simulated. In other words, at what scale - or scales - and with how much detail, given the myriad of combinatorial processes. There certainly continue to be impressive attempts from various research groups around the world, but the number of cells and amount of brain being simulated, the level of detail, and the amount of time being simulated remain rather limited. This is why headlines and claims that tout ground-breaking large-scale simulations of the brain can be misleading, sometimes resulting in controversy and backlash.
The challenges of doing large multi-scale simulations of the brain are significant. So in the end, the answer to how big and complete the biggest simulations of the brain done to date are, and how much impact they have had on scientists' understanding of the brain, is: not much.
First, by their very nature, given a sufficient number of qubits quantum computers will excel at solving and optimizing very large combinatorial problems. Its an inherent consequence of the physics of quantum mechanics and the design of the computers.
Second, given the sheer size and computational complexity of the human brain, any attempt at a large multi-scale simulation with sufficient detail will have to contend with the combinatorial space of the problem.
Third, how a potential quantum computer neural simulation is set up might be able to take advantage of the physics the brain is subject to. Despite its computational power, the brain is still a physical object, and so physical constraints could be used to design and guide simulation rules (quantum computing algorithms) that are inherently combinatorial and parallelizable, thereby taking advantage of what quantum computers do best.
For example, local rules, such as the computational rules of individual neurons, can be used to calculate aspects of the emergent dynamics of networks of neurons in a decentralized way. Each neuron does its own thing and contributes to the larger whole, in this case the functions of the whole brain itself, all acting at the same time and without realizing what they are contributing to.
In the end, the goal will be to understand the emergent functions of the brain that give rise to cognitive properties. For example, large scale quantum computer simulations might discover latent (hidden) properties and states that are only observable at the whole brain scale, but not computable without a sufficient level of detail and simulation from the scales below it.
If these simulations and research are successful, one can only speculate about what as-yet-unknown brain algorithms remain to be discovered and understood. It's possible that such future discoveries will have a significant impact on related topics such as artificial quantum neural networks, or on specially designed hardware that some day may challenge the boundaries of existing computational systems. For example, just published yesterday, an international team of scientists and engineers announced a computational hardware device composed of a molecular-chemical network capable of energy-efficient, rapidly reconfigurable states, somewhat similar to the reconfigurable nature of biological neurons.
One final comment regarding quantum computers and the brain: This discussion has focused on the potential use of future quantum computers to carry out simulations of the brain that are not currently possible. While some authors and researchers have proposed that neurons themselves might be tiny quantum computers, that is completely different and unrelated to the material here.
It may be that quantum computers will usher in a new era for neuroscience and the understanding of the brain. It may even be the only real way forward. But as of now, actually building workable quantum computers with sufficient stable qubits that outperform classical computers at even modest tasks remains a work in progress. While a handful of commercial efforts exist and have claimed various degrees of success, many difficult hardware and technological challenges remain. Some experts argue that quantum computers may in the end never be built due to technical reasons. But there is much research across the world both in academic labs and in industry attempting to overcome these engineering challenges. Neuroscientists will just have to be patient a bit longer.
Excerpt from:
Large-Scale Simulations Of The Brain May Need To Wait For Quantum Computers - Forbes