Category Archives: Quantum Computing
MIT researchers have introduced a quantum computing architecture that can perform low-error quantum computations while also rapidly sharing quantum information between processors. The work represents a key advance toward a complete quantum computing platform.
Before this work, small-scale quantum processors had successfully performed tasks at a rate exponentially faster than that of classical computers. However, it has been difficult to controllably communicate quantum information between distant parts of a processor. In classical computers, wired interconnects route information back and forth throughout a processor during the course of a computation. In a quantum computer, however, the information itself is quantum mechanical and fragile, requiring fundamentally new strategies to simultaneously process and communicate quantum information on a chip.
"One of the main challenges in scaling quantum computers is to enable quantum bits to interact with each other when they are not co-located," says William Oliver, an associate professor of electrical engineering and computer science, MIT Lincoln Laboratory fellow, and associate director of the Research Laboratory of Electronics. "For example, nearest-neighbor qubits can easily interact, but how do I make quantum interconnects that connect qubits at distant locations?"
The answer lies in going beyond conventional light-matter interactions.
Natural atoms are small and point-like relative to the wavelength of the light they interact with. But in a paper published today in the journal Nature, the researchers show that this need not be the case for superconducting artificial atoms. Instead, they have constructed giant atoms from superconducting quantum bits, or qubits, connected in a tunable configuration to a microwave transmission line, or waveguide.
This allows the researchers to adjust the strength of the qubit-waveguide interactions so the fragile qubits can be protected from decoherence, or a kind of natural decay that would otherwise be hastened by the waveguide, while they perform high-fidelity operations. Once those computations are carried out, the strength of the qubit-waveguide couplings is readjusted, and the qubits are able to release quantum data into the waveguide in the form of photons, or light particles.
"Coupling a qubit to a waveguide is usually quite bad for qubit operations, since doing so can significantly reduce the lifetime of the qubit," says Bharath Kannan, MIT graduate fellow and first author of the paper. "However, the waveguide is necessary in order to release and route quantum information throughout the processor. Here, we've shown that it's possible to preserve the coherence of the qubit even though it's strongly coupled to a waveguide. We then have the ability to determine when we want to release the information stored in the qubit. We have shown how giant atoms can be used to turn the interaction with the waveguide on and off."
The system realized by the researchers represents a new regime of light-matter interactions, the researchers say. Unlike models that treat atoms as point-like objects smaller than the wavelength of the light they interact with, the superconducting qubits, or artificial atoms, are essentially large electrical circuits. When coupled with the waveguide, they create a structure as large as the wavelength of the microwave light with which they interact.
The giant atom emits its information as microwave photons at multiple locations along the waveguide, such that the photons interfere with each other. This process can be tuned to complete destructive interference, meaning the information in the qubit is protected. Furthermore, even when no photons are actually released from the giant atom, multiple qubits along the waveguide are still able to interact with each other to perform operations. Throughout, the qubits remain strongly coupled to the waveguide, but because of this type of quantum interference, they can remain unaffected by it and be protected from decoherence, while single- and two-qubit operations are performed with high fidelity.
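This kind of interference can be illustrated with a back-of-the-envelope sketch (a simplification for intuition, not the actual device model from the Nature paper): a giant atom coupled to the waveguide at two points emits along two paths whose amplitudes add coherently, so the relative phase between the coupling points sets the effective decay rate.

```python
import numpy as np

# A giant atom couples to the waveguide at two points. An emitted
# photon's two paths pick up a relative phase phi that depends on the
# separation between the coupling points. Assuming equal coupling
# strengths, the total emission amplitude is the coherent sum of the
# two paths.

def relative_emission(phi):
    """Emission probability relative to maximal (in-phase) emission."""
    amplitude = 1 + np.exp(1j * phi)   # two interfering emission paths
    return abs(amplitude) ** 2 / 4     # normalized to 1 at phi = 0

print(relative_emission(0))       # 1.0: constructive, maximal decay
print(relative_emission(np.pi))   # ~0: destructive, decay suppressed
```

Tuning the effective phase toward pi is what "turns off" the qubit-waveguide interaction and protects the qubit from decoherence; tuning it away releases the stored photon.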
"We use the quantum interference effects enabled by the giant atoms to prevent the qubits from emitting their quantum information to the waveguide until we need it," says Oliver.
"This allows us to experimentally probe a novel regime of physics that is difficult to access with natural atoms," says Kannan. "The effects of the giant atom are extremely clean and easy to observe and understand."
The work appears to have much potential for further research, Kannan adds.
"I think one of the surprises is actually the relative ease by which superconducting qubits are able to enter this giant atom regime," he says. "The tricks we employed are relatively simple and, as such, one can imagine using this for further applications without a great deal of additional overhead."
Andreas Wallraff, professor of solid-state physics at ETH Zurich, says the research "investigates a piece of quantum physics that is hard or even impossible to fathom for microscopic objects such as electrons or atoms, but that can be studied with macroscopic engineered superconducting quantum circuits. With these circuits, using a clever trick, they are able both to protect their giant atom from decay and simultaneously to allow for coupling two of them coherently. This is very nice work exploring waveguide quantum electrodynamics."
The coherence time of the qubits incorporated into the giant atoms, meaning the time they remained in a quantum state, was approximately 30 microseconds, nearly the same as for qubits not coupled to a waveguide, which typically range from 10 to 100 microseconds, according to the researchers.
Additionally, the research demonstrates two-qubit entangling operations with 94 percent fidelity. This represents the first time researchers have quoted a two-qubit fidelity for qubits that were strongly coupled to a waveguide, because the fidelity of such operations using conventional small atoms is often low in such an architecture. With more calibration, operation tune-up procedures and optimized hardware design, Kannan says, the fidelity can be further improved.
Read more from the original source:
Giant atoms enable quantum processing and communication in one - MIT News
Computer Scientist Don Towsley Named to Team Developing the Quantum Internet – UMass News and Media Relations
The National Science Foundation this week announced a five-year, $26 million initial grant to a consortium led by the University of Arizona to form a new Engineering Research Center, the Center for Quantum Networks (CQN), with partners at Harvard, Yale and MIT, among others, and including quantum networking researcher Don Towsley of the College of Information and Computer Sciences.
One of the center's goals is to develop quantum communications and the quantum internet, where bits of information are encoded in photons of infrared light, for example, instead of radio waves when beaming from a satellite. Once developed fully, quantum communication is expected to be more secure than existing systems.
Towsley says his role in the multi-institution team will be to co-lead one of three research thrusts focused on quantum network architecture. In addition to leadership responsibilities, Towsley will perform research on fundamental performance limits of quantum networks and on the right protocols to achieve these limits.
The University of Arizona leadership team says they hope the new CQN will lay the foundations for a quantum internet by transforming computing, communication and sensing. It will connect quantum computers, data centers and gadgets using the information states of quantum bits, known as qubits. These will someday offer greater processing capacity than the classical data system, which relies on storing and processing data in the 0 or 1 bit state; qubits allow a superposition of both states at the same time.
The CQN also is charged with investigating the effects of a future quantum internet on education, workforce development, innovation and society. Organizers also say CQN has a mandate not only to develop the technology, but to address concurrent topics in science, law, policy and society as they emerge, within a strong culture of inclusion. It will also emphasize engineering workforce development and raise student awareness with curriculum and projects involving policy, law and society.
The new CQN will also create Master of Science programs in quantum information science and engineering, believed to be among the first of their kind. Another major focus of the CQN team will be research to advance key underlying technologies, including fundamental quantum materials and devices, the quantum and classical processing required at a network node, and quantum network protocols and architectures.
Other universities involved are Brigham Young University, Howard University, Northern Arizona University and the universities of Chicago and Oregon.
COVID-19 Impact on Quantum Computing Market Research, Growth, Industry Analysis, Size and Share 2025 | IBM Corporation, Google – My Kids Health
The global Quantum Computing Market is carefully researched in the report, which concentrates on top players and their business tactics, geographical expansion, market segments, competitive landscape, manufacturing, and pricing and cost structures. Each section of the research study is specially prepared to explore key aspects of the global Quantum Computing market. For instance, the market dynamics section digs deep into the drivers, restraints, trends, and opportunities of the market. With qualitative and quantitative analysis, we help you with thorough and comprehensive research on the global Quantum Computing market. We have also focused on SWOT, PESTLE, and Porter's Five Forces analyses of the market.
Leading players of the global Quantum Computing market are analyzed taking into account their market share, recent developments, new product launches, partnerships, mergers or acquisitions, and markets served. We also provide an exhaustive analysis of their product portfolios to explore the products and applications they concentrate on when operating in the global Quantum Computing market. Furthermore, the report offers two separate market forecasts: one for the production side and another for the consumption side of the global Quantum Computing market. It also provides useful recommendations for new as well as established players in the market.
Request for Sample Copy of This Report:https://www.limraglobalmarketresearch.com/request-a-sample/?pid=8818
Major Players: IBM Corporation, Google
Regions and Countries:U.S, Canada, France, Germany, UK, Italy, Rest of Europe, India, China, Japan, Singapore, South Korea, Australia, Rest of APAC, Brazil, Mexico, Argentina, Rest of LATAM, Saudi Arabia, South Africa, UAE.
Enquiry For Customization: https://www.limraglobalmarketresearch.com/send-an-enquiry/?pid=8818
Table of Contents
Report Overview: It includes the major players of the global Quantum Computing market covered in the research study, research scope, market segments by type, market segments by application, years considered for the research study, and objectives of the report.
Global Growth Trends: This section focuses on industry trends, shedding light on market drivers and top market trends. It also provides growth rates of key producers operating in the global Quantum Computing market, along with production and capacity analysis covering marketing pricing trends, capacity, production, and production value.
Market Share by Manufacturers: Here, the report provides details about revenue, production and capacity, and price by manufacturers, as well as expansion plans, mergers and acquisitions, products, market entry dates, distribution, and market areas of key manufacturers.
Market Size by Type: This section concentrates on product type segments, discussing production value market share, price, and production market share by product type.
Market Size by Application: Besides an overview of the global Quantum Computing market by application, it gives a study of consumption in the market by application.
Production by Region: Here, the production value growth rate, production growth rate, import and export, and key players of each regional market are provided.
Consumption by Region: This section provides information on consumption in each regional market studied in the report, discussed on the basis of country, application, and product type.
Company Profiles: Almost all leading players of the global Quantum Computing market are profiled in this section, with information about their recent developments, products, revenue, production, and business.
Market Forecast by Production: The production and production value forecasts included in this section are for the global Quantum Computing market as well as for key regional markets.
Market Forecast by Consumption: The consumption and consumption value forecasts included in this section are for the global Quantum Computing market as well as for key regional markets.
Value Chain and Sales Analysis: It deeply analyzes the customers, distributors, sales channels, and value chain of the global Quantum Computing market.
Key Findings: This section gives a quick look at important findings of the research study.
About Us: Limra Global Market Research is a one-stop shop with a wide database of market research reports for all domains. Our market research reports provide the best study of the market, its growth, and the latest trends, with high accuracy and great reliability. Limra Global Market Research caters to major corporations as well as non-profit organizations. The information provided in the reports will not only help our clients enhance their knowledge and decision-making skills, but will also provide enormous opportunities for growth by covering the newest trends in the market.
Contact Us: Limra Global Market Research, E-mail: [emailprotected], Web: https://www.limraglobalmarketresearch.com/
IBM and the University of Tokyo Unveil the Quantum Innovation Initiative Consortium to Accelerate Japan’s Quantum Research and Development Leadership…
TOKYO, July 30, 2020 /PRNewswire/ -- Today, IBM (NYSE: IBM) and the University of Tokyo unveiled a landmark collaboration with the launch of the Quantum Innovation Initiative Consortium (QIIC). Expanding from the December 2019 Japan-IBM Quantum Partnership initiative, QIIC aims to accelerate collaboration between industry, academia, and government to advance Japan's leadership in quantum science, business, and education.
QIIC's main goal is to strategically accelerate quantum computing R&D activities in Japan by bringing together academic talent from across the country's universities, prominent research associations, and large-scale industry. The consortium plans to further develop quantum computing technology in Japan and build an ecosystem to improve student skills and expertise, opening doors to future scientific discoveries and practical quantum applications.
Headquartered at the University of Tokyo, the member organizations of QIIC will collaborate to engage students, faculty, and industry researchers through seminars, workshops, and events to foster new quantum business opportunities in Japan. Organizations in agreement to join the consortium include Keio University, Toshiba, Hitachi, Mizuho, MUFG, JSR, DIC, Toyota, Mitsubishi Chemicals and IBM Japan.
These consortium organizations will also be part of the IBM Q Network, the world's first community of Fortune 500 companies, startups, academic institutions and research labs working to advance quantum computing and the development of practical applications for it. As part of the network, they will have access to IBM's expertise, resources, and cloud development environment, as well as cloud-based access to the IBM Quantum Computation Center, which includes IBM's most advanced quantum computers.
In addition to cloud-based access to IBM's fleet of quantum systems, the QIIC will also have access to an IBM Q System One, a dedicated system planned for installation in Japan in 2021. The first of its kind in the region, and only the second such installation outside the US, this system, along with a separate testbed system that will be part of a system technology development lab, will support the consortium's goals of next-generation quantum hardware research and development, including cryogenic components, room-temperature electronics, and micro-signal generators.
According to Professor Makoto Gonokami, President of the University of Tokyo:
"Society 5.0 is the concept of a better future with an inclusive, sustainable and knowledge-intensive society, where information and services create value underpinned by digital innovation. The key to realizing this society is to utilize real data in real time. In order to achieve this, it is necessary to protect and nurture the global environment, an entity of physical space and cyberspace as one, by taking it as a global commons (a concept that encompasses global resources and ecosystems) which is sustainable and reliable, while the fusion of physical space and cyberspace progresses.
"Quantum technology and quantum computers are indispensable technologies to make that happen. I believe that Japan will play an important role in implementing quantum computing technology in society ahead of the rest of the world, and that industry-academia-government collaboration is necessary for this. The QIIC will accelerate quantum technology research and its implementation toward Society 5.0 while firmly sharing each other's wisdom and promoting the close sharing of information."
"Today, I am extremely excited and proud to launch this new consortium that will help foster economic growth and quantum technology leadership in Japan. The QIIC will greatly advance Japan's entire quantum computing ecosystem, bringing experts from industry, government and academia together to collaborate on research and development," said Dario Gil, Director of IBM Research. "Quantum computing has the potential to tackle some of the world's greatest challenges in the future. We expect that it will help us accelerate scientific discovery so that we can develop vaccines more quickly and accurately, create new materials to address climate change or design better energy storage technologies. The potential is massive, and we will only reach this future if we work together, uniting the best minds from the public and private sectors. Universities, businesses and governments have to collaborate so that we can unleash the full potential of quantum computing."
QIIC's members are forging a path for Japan's discovery of practical quantum applications for the benefit of society. The cooperation between industry, academia, and government aims to create a new community for quantum computation research and use cases.
About IBM Quantum
IBM Quantum is an industry-first initiative to build quantum systems for business and science applications. For more information about IBM's quantum computing efforts, please visit www.ibm.com/ibmq.
For more information about the IBM Q Network, as well as a full list of all partners, members, and hubs, visit https://www.research.ibm.com/ibm-q/network/
About The University of Tokyo
The University of Tokyo was established in 1877 as the first national university in Japan. As a leading research university, the University of Tokyo is conducting academic research in almost all fields at both undergraduate and graduate schools. The University aims to provide its students with a rich and varied academic environment that ensures opportunities for acquiring both academic and professional knowledge and skills.
Chris Nay [emailprotected]
Miri Yasuhara IBM Japan +81 50 3150 7967 [emailprotected]
This month, Insights & Outcomes will turn your head with spinning electrons, prolific plankton, and the biology of sex.
As always, you can find more science and medicine research news on the YaleNews Science & Technology and Health & Medicine pages.
The group of single-celled marine organisms known as planktic foraminifera are among the most prolific shell producers in the open ocean. They leave behind one of the most extensive fossil records on the planet, and they allow scientists to reconstruct Earth's climate history. Yet little was known about their life history until now. A research team led by Yale paleontologist Catherine Davis grew a generation of planktic foraminifera in the lab and documented the organisms' full life cycle. The team confirmed the organisms' apparent ability to reproduce both sexually and asexually, and found that the shells of cloned siblings grown together in the laboratory can look strikingly different from each other. "These results have broad impacts on how foraminifera fit into food webs, how vanishingly small populations can rapidly respond to their environment, and perhaps even their long-lived success as a group," said Davis, a postdoctoral associate in the lab of Pincelli Hull, assistant professor in the Department of Earth and Planetary Sciences and co-author of the study. The study appears in Science Advances.
Since 2003, the lab of Yale's Mark Gerstein has played a major role in an international effort to catalog data on the complex interactions between genes and the segments of DNA and RNA that regulate their functions. The latest findings of the ENCODE project were published July 29 in 30 papers, four spearheaded by Gerstein's lab, in a variety of scientific journals. Jing Zhang and Donghoon Lee from Gerstein's lab have created a video illustrating science's evolving understanding of the complex regulatory networks that can contribute to cancer and other diseases. The latest findings by the Gerstein lab and other major ENCODE contributors can be found on the Gerstein lab website.
Yale's Nina Stachenfeld believes that to understand disease, scientists must understand the biology of sex. So she is helping to launch a series of papers for publication in The FASEB Journal that explores the systemic role sex plays in human physiology. Stachenfeld, a fellow at the John B. Pierce Laboratory and professor of obstetrics, gynecology, and reproductive sciences, has enlisted contributions from half a dozen scientists to explore a variety of topics, including the role sex plays in addiction and the biology of high blood pressure in people of different races. The series, "Sex as a Variable in Human Research: A Systems Approach," will appear over the next few months in The FASEB Journal.
A research result by Yale physicists lends credibility to an exotic proposal for safeguarding quantum information called topological quantum protection. Topological quantum protection is an alternative to Yale's primary approach to fault-tolerant quantum computing, which is based on active error correction. Rather, it involves a theoretically proposed entity called a Majorana quasiparticle, which has not yet been directly observed. A team led by Michel Devoret, the F.W. Beinecke Professor of Applied Physics and Physics, has applied the tools of circuit quantum electrodynamics to achieve the continuous monitoring of a quasiparticle's spin, a promising step toward detection of Majorana quasiparticles. The Yale team includes Max Hays, Valla Fatemi, Kyle Serniak, and Spencer Diamond. The study appears in Nature Physics.
When pathogens or cancer cells develop resistance to drug treatment, researchers usually try to develop new drugs. But a new study by Yale researchers helps bolster a different strategy: taking advantage of evolutionary processes to combat drug resistance through drug-sensitive pathogenic cells. The new approach, known as adaptive therapy, offers an alternative to prolonged and high-dose drug treatment for cancer or infections. Adaptive therapy calls for an intermittent series of lower-dose treatments that kill fewer disease-causing cells but also decrease the chances that those cells develop resistance to the drugs. In other words, as long as a pathogen or cancer remains responsive to a drug, it may be wiser, in some instances, to manage a disease rather than trying to eradicate it at the expense of an elevated risk of drug resistance evolution, said Sergey Melnikov, lead author of the new study. It is based on his work in the lab of Yale's Dieter Soll, Sterling Professor of Molecular Biophysics and Biochemistry and professor of chemistry. In a laboratory experiment, Melnikov and Soll gave adaptive therapy a boost by adding the amino acid norvaline to the antibiotic tavaborole to combat drug-resistant E. coli. Norvaline impairs the ability of E. coli cells to produce cells resistant to tavaborole by hindering their ability to mutate, allowing antibiotic-sensitive cells to outcompete antibiotic-resistant ones. "By integrating Darwinian principles of natural selection into therapeutic treatment of a disease, we can significantly prolong the effectiveness of drugs or give a second life to drugs that are currently abandoned due to rapid evolution of resistance," said Melnikov, now a group leader at Newcastle University. The study was published in the Proceedings of the National Academy of Sciences.
Many videos describing quantum computers try to distill and oversimplify everything. Thoughty2's video takes its time and gives more historical and theoretical context than most.
Because it does take a while to get into the subject, here's a shorter explainer by MIT:
Today's computers use bits, a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.
Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.
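The bit-versus-qubit distinction can be made concrete with a few lines of linear algebra (an illustrative sketch only; real hardware involves far more than a state vector):

```python
import numpy as np

# A classical bit is 0 or 1. A qubit's state is a normalized complex
# vector a|0> + b|1>; measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2.

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition of |0> and |1> (e.g. a Hadamard gate applied
# to |0>): both measurement outcomes are equally likely.
plus = (ket0 + ket1) / np.sqrt(2)
probs = np.abs(plus) ** 2
print(probs)            # [0.5 0.5]

# n qubits require a 2**n-dimensional state vector, which is why
# simulating large quantum systems classically becomes intractable.
two_qubits = np.kron(plus, plus)
print(two_qubits.shape)  # (4,)
```

The exponential growth of that state space, together with interference between its amplitudes, is the source of the speedups quantum algorithms promise.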
The processing power possible through these controlled qubits will make today's fastest computers look positively archaic.
Image: YouTube / Thoughty2
Here is the original post:
This simple explainer tackles the complexity of quantum computing - Boing Boing
UK firm reaches final stages of the NIST quest for quantum-proof encryption algorithms – www.computing.co.uk
Post Quantum CEO Andersen Cheng
London-based encryption specialist Post Quantum has reached the final stage of the NIST competition to find practical encryption standards capable of withstanding attacks by a quantum computer.
The US National Institute of Standards and Technology (NIST) launched its competition for Public-Key Post-Quantum Cryptographic Algorithms in 2016, with the aim of arriving at quantum-safe standards by 2024. Successful candidates will enhance or replace the three standards considered most vulnerable to quantum attack: the digital signature standard FIPS 186-4 and the public-key cryptography standards NIST SP 800-56A and NIST SP 800-56B.
Many current encryption algorithms rely on one-way mathematical functions to derive encryption/decryption key pairs: operations, such as multiplying very large primes, that are easy to perform but hard to reverse by factorising the product back into primes. This method is used by the general-purpose RSA algorithms that form the basis of the secure internet protocols SSL and TLS. Elliptic curve cryptography, often preferred in IoT and mobile devices, also uses a one-way mathematical function. Unfortunately, both are vulnerable to attack by quantum computers.
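That one-way structure can be illustrated with a toy RSA round trip using the classic textbook values (deliberately tiny and utterly insecure, but it shows why security rests on the difficulty of factoring):

```python
# Toy RSA with textbook-example primes. Multiplying p and q is easy;
# an attacker who can factor n back into p and q recovers the private
# key, which is exactly what Shor's algorithm on a large quantum
# computer could do efficiently even for 2048-bit moduli.

p, q = 61, 53                 # private primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent (modular inverse)

message = 42
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
print(recovered)              # 42: decryption inverts encryption

# Brute-force factoring succeeds here only because n is tiny.
factor = next(f for f in range(2, n) if n % f == 0)
print(factor)                 # 53, the smallest prime factor
```

Post-quantum candidates replace this factoring-based trapdoor with problems, such as decoding a general linear code, for which no efficient quantum algorithm is known.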
Last year NIST whittled down the original 69 candidates to 26, and in a third round announced last week it reduced this number to 15: seven finalists "most likely to be ready for standardisation soon after the end of the third round", and eight alternate candidates "regarded as potential candidates for future standardisation". Candidates fall into three functional categories: code-based, multivariate and lattice-based cryptography, which cover the variety of use cases for which post-quantum (PQ) encryption will be required. In addition, some candidates are suitable for public key exchange while others are better suited to digital signatures.
The only remaining candidate in the code-based category is Classic McEliece, a merger of Post Quantum's Never-The-Same Key Encapsulation Mechanism (NTS-KEM) and work done in the same area by a team led by Professor Daniel Bernstein of the University of Illinois at Chicago. The joint candidate, known as 'Classic McEliece', is based on the McEliece cryptosystem first proposed in the 1970s.
It works by injecting random errors into the ciphertext. Error-correction codes allow the recipient of the encrypted message to strip out the random noise added to the message when decrypting it, a facility not available to any eavesdropper intercepting the message.
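The principle can be sketched with a deliberately simple stand-in code. Real McEliece uses binary Goppa codes behind a disguised generator matrix; the 3x repetition code below only illustrates how a receiver who knows the code's structure can strip injected errors by majority vote:

```python
import random

# Sketch of error-correction-based decryption: errors are deliberately
# injected into the codeword, and the legitimate receiver removes them.
# A trivial 3x repetition code stands in for the Goppa codes that the
# actual McEliece cryptosystem uses.

def encode(bits):
    return [b for b in bits for _ in range(3)]   # repeat each bit 3 times

def add_errors(codeword):
    # Flip one randomly chosen bit in each block of three; majority
    # voting can still recover every original bit.
    word = codeword[:]
    for start in range(0, len(word), 3):
        word[start + random.randrange(3)] ^= 1
    return word

def decode(word):
    # Majority vote over each group of three corrects single flips.
    return [int(sum(word[i:i + 3]) >= 2) for i in range(0, len(word), 3)]

message = [1, 0, 1, 1, 0, 0]
noisy = add_errors(encode(message))
print(decode(noisy) == message)   # True: receiver strips the errors
```

An eavesdropper without the code's structure sees only the noisy word; in real McEliece, recovering the hidden structure from the public key is believed to be hard even for a quantum computer.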
"Classic McEliece has a somewhat unusual performance profile: it has a very large public key but the smallest ciphertexts of all competing KEMs [key-encapsulation mechanisms]. This is not a good fit for general use in internet protocols as they are currently specified, but in some applications, the very small ciphertext size could make Classic McEliece an appealing choice," NIST says, offering a possible use case as protecting VPNs.
Cheng said he was pleased to join forces with Bernstein's team, adding that the need for viable PQ encryption is urgent.
"The entire world needs to upgrade its encryption, and we last did that in 1978, when RSA came in. The stakes couldn't be higher with record levels of cyber-attack and heightened nation state activity - if China or Russia is the first to crack RSA then cyber Armageddon will begin," Cheng said.
"This isn't an academic exercise for us, we are already several years down the commercialisation path with real-world quantum-safe products for identity authentication and VPN. If you work for an organisation with intellectual property or critical data with a long shelf life, and you're working from home during lockdown, you should already be using a quantum-safe VPN."
An age of unbelievably fast quantum computers is only a stone's throw away, promising machines that will forever transform the way we solve problems, communicate and compute.
However, such powerful machines in the wrong hands could spell major trouble for the cyber security community, as many experts fear that quantum computers could also effectively break even the strongest encryption we have today.
So when can we expect to see these quantum machines in action? "There's a chance that it has already happened, by either the US NSA (National Security Agency) or China, but we don't publicly know about it yet," says Roger Grimes, Data-Driven Defence evangelist at KnowBe4, who will be speaking on 'Quantum reckoning: The coming day when quantum computers break cryptography' at ITWeb Security Summit 2020, to be held as a virtual event from 25 to 28 August this year.
According to Grimes, if it hasn't happened already, many people believe it will happen within the next two years.
Speaking of how this quantum reckoning could impact information security, Grimes says any secret protected by traditional asymmetric ciphers will no longer be protected. This includes RSA, Diffie-Hellman and Elliptic Curve Cryptography, which are used in HTTPS, TLS, WiFi, FIDO keys, PKI, digital certificates, digital signatures and banking networks. Essentially, it would impact about 95% of our digital world.
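The reason those asymmetric ciphers fall is that their security rests on problems, such as integer factoring, that Shor's algorithm solves efficiently on a large quantum computer. The toy sketch below, with a deliberately tiny modulus and trial-division standing in for Shor's algorithm, shows why efficient factoring is equivalent to recovering an RSA private key; real keys use moduli of 2048 bits or more, where classical factoring is infeasible.

```python
# Toy RSA key: real deployments use primes hundreds of digits long.
p, q = 61, 53
n, e = p * q, 17                      # public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (Python 3.8+)

secret = 42
ciphertext = pow(secret, e, n)

# An attacker who can factor n recovers the private key directly.
# Shor's algorithm does exactly this in polynomial time on a
# sufficiently large quantum computer; trial division stands in here.
def factor(n):
    f = next(i for i in range(2, n) if n % i == 0)
    return f, n // f

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(ciphertext, d2, n) == secret   # plaintext recovered without the key
```

Symmetric ciphers and hashes are hit less hard: Grover's algorithm only halves their effective key length, which is why the acute risk is concentrated in the asymmetric algorithms listed above.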
It's not all bad news, though. He says along with the dangers, quantum computing will bring us many wonderful inventions we cannot even begin to imagine right now, much as the Internet did, but on an even greater scale.
There is a glimmer of hope in that post-quantum cryptography, or cryptographic algorithms that are believed to be secure against an attack by a quantum computer, might save the day.
Grimes says it's a race, but that dozens of good quantum-resistant cryptography standards are being tested right now and there are likely to be some good standards in place by the time the quantum reckoning becomes public and widespread.
But once the new cryptography standards are in place, how long will it take every person and company in the world to switch over to the new quantum-resistant standards? That is the real problem, he adds.
Delegates attending Grimes' talk will learn exactly what they need to start doing now in order to prepare for the quantum reckoning.
See original here:
Quantum reckoning: The day when computers will break cryptography - ITWeb
This stunning image captured last year by physicists at the University of Glasgow in Scotland is the first-ever photo of quantum entanglement - a phenomenon so strange, physicist Albert Einstein famously described it as 'spooky action at a distance'.
It might not look like much, but just stop and think about it for a second: this fuzzy grey image was the first time we'd seen the particle interaction that underpins the strange science of quantum mechanics and forms the basis of quantum computing.
Quantum entanglement occurs when two particles become inextricably linked, and whatever happens to one immediately affects the other, regardless of how far apart they are. Hence the 'spooky action at a distance' description.
This particular photo shows entanglement between two photons - two light particles. They're interacting and - for a brief moment - sharing physical states.
Paul-Antoine Moreau, first author of the paper wherein the image was unveiled back in July 2019, told the BBC the image was "an elegant demonstration of a fundamental property of nature".
To capture the incredible photo, Moreau and a team of physicists created a system that blasted out streams of entangled photons at what they described as 'non-conventional objects'.
The experiment actually involved capturing four images of the photons under four different phase transitions. You can see the full image below:
(Moreau et al., Science Advances, 2019)
What you're looking at here is actually a composite of multiple images of the photons as they go through a series of four phase transitions.
The physicists split the entangled photons up and ran one beam through a liquid crystal material known as β-barium borate, triggering four phase transitions.
At the same time they captured photos of the entangled pair going through the same phase transitions, even though it hadn't passed through the liquid crystal.
You can see the setup below: the entangled beam of photons comes from the bottom left, one half of each entangled pair splits off to the left and passes through the four phase filters. The others travel straight ahead and never pass through the filters, yet underwent the same phase changes.
(Moreau et al., Science Advances, 2019)
The camera was able to capture images of these at the same time, showing that they'd both shifted the same way despite being split. In other words, they were entangled.
While Einstein made quantum entanglement famous, the late physicist John Stewart Bell helped define quantum entanglement and established a test based on what is known as a 'Bell inequality'. Basically, if your measurements violate a Bell inequality, you can confirm true quantum entanglement.
"Here, we report an experiment demonstrating the violation of a Bell inequality within observed images," the team wrote in Science Advances.
"This result both opens the way to new quantum imaging schemes ... and suggests promise for quantum information schemes based on spatial variables."
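The textbook form of the test Bell established is the CHSH inequality: any local hidden-variable theory keeps a particular combination of correlations, S, within |S| ≤ 2, while entangled particles can reach 2√2. The Glasgow team ran a spatial-variable version of such a test on their images; the sketch below just evaluates the standard CHSH combination for a maximally entangled (singlet) pair, whose quantum correlation at analyser angles a and b is -cos(a - b).

```python
import math

def E(a, b):
    """Quantum correlation for a singlet pair measured at angles a, b."""
    return -math.cos(a - b)

# Standard CHSH measurement angles (radians) that maximise the violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Local hidden-variable theories obey |S| <= 2; entanglement reaches 2*sqrt(2).
assert abs(S) > 2
assert abs(abs(S) - 2 * math.sqrt(2)) < 1e-9
```

Measuring |S| above 2, as the imaging experiment did, is what rules out any classical explanation for the correlations in the photos.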
The research was published in Science Advances.
A version of this article was first published in July 2019.
Read the original here:
Looking Back on The First-Ever Photo of Quantum Entanglement - ScienceAlert
In an episode of the Modern CTO podcast, Ripple's CTO, David Schwartz, expressed concerns about the development of quantum computers. Ripple's CTO believes this technology is a threat to the security of Bitcoin, XRP, and cryptocurrencies. This is primarily because the consensus algorithms behind cryptocurrencies rely on conventional cryptography, as Schwartz stated:
"From the point of view of someone who is building systems based on conventional cryptography, quantum computing is a risk. We are not solving problems that need powerful computing, like payments and liquidity; the work that the computers do is not that incredibly complicated. But because it relies on conventional cryptography, very fast computers present a risk to the security model that we use inside the ledger.
Algorithms like SHA-2 and ECDSA (elliptic curve cryptography) are sort of esoteric things deep in the plumbing, but if they were to fail, the whole system would collapse. The system's ability to say who owns Bitcoin or who owns XRP, or whether or not a particular transaction is authorized, would be compromised."
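The "plumbing" role Schwartz describes is easy to see for SHA-2: ledgers identify transactions by their hash, so ownership records depend on no two different transactions ever hashing to the same value. The snippet below is a simplified illustration using Python's standard library (Bitcoin in practice applies SHA-256 twice and adds further encoding, which is omitted here).

```python
import hashlib

# A Bitcoin-style transaction ID is (roughly) the SHA-256 hash of the
# serialised transaction; the ledger trusts that distinct transactions
# always produce distinct hashes.
tx1 = b"pay 1.0 XRP from Alice to Bob"
tx2 = b"pay 9.0 XRP from Alice to Bob"   # a single character changed

h1 = hashlib.sha256(tx1).hexdigest()
h2 = hashlib.sha256(tx2).hexdigest()

assert h1 != h2   # any tampering yields a completely different 256-bit hash
```

If an attacker could find two transactions with the same hash, or forge the ECDSA signatures that authorise them, the ledger could no longer say which record is genuine, which is exactly the collapse Schwartz is warning about.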
Ripple's CTO said that Ripple is trying to prepare for the emergence of quantum computers. Therefore, they are determining when the algorithms mentioned will no longer be reliable. Ripple's CTO estimates that in the next 8-10 years, quantum computers will begin to pose a threat, as Schwartz further stated:
"I think we have at least eight years. I have very high confidence that it's at least a decade before quantum computing presents a threat, but you never know when there could be a breakthrough. I'm a cautious and concerned observer, I would say.
The other fear would be if some bad actor, some foreign government, secretly had quantum computing way ahead of what's known to the public. Depending on your threat model, you could also say: what if the NSA has quantum computing? Are you worried about the NSA breaking your payment system?"
Despite the above, Ripple's CTO came to an optimistic conclusion, stating that even if a malicious actor has this technology, it is unlikely to be used against the average person. Therefore, Schwartz believes that most users have nothing to worry about:
"While some people might really be concerned, it depends on your threat model. If you're just an average person or an average company, you're probably not going to be a victim of this. Let's say, hypothetically, some bad actor had quantum computing that was powerful enough to break things; they're probably not going to go after you unless you are a target of that type of actor.
As soon as it's clear that there's a problem, these systems will probably be frozen until they can be fixed or improved. So, most people don't have to worry about it."