Category Archives: Quantum Computing
NSA: We ‘don’t know when or even if’ a quantum computer will ever be able to break today’s public-key encryption – The Register
America's National Security Agency has published an FAQ about quantum cryptography, saying it does not know "when or even if" a quantum computer will ever exist to "exploit" public-key cryptography.
In the document, titled Quantum Computing and Post-Quantum Cryptography, the NSA said it "has to produce requirements today for systems that will be used for many decades in the future." With that in mind, the agency came up with some predictions [PDF] for the near future of quantum computing and its impact on encryption.
Is the NSA worried about the threat posed by a "cryptographically relevant quantum computer" (CRQC)? Apparently not too much.
"NSA does not know when or even if a quantum computer of sufficient size and power to exploit public key cryptography (a CRQC) will exist," it stated, which sounds fairly conclusive though in 2014 the agency splurged $80m looking for a quantum computer that could smash current encryption in a program titled Owning the Net, so the candor of the paper's statements is perhaps open to debate.
What the super-surveillance agency seems to be saying is that it's not a given that a CRQC capable of breaking today's public-key algorithms will ever emerge, though it wouldn't be a bad idea to consider coming up with and using new techniques that could defeat a future CRQC, should one be built.
It's almost like the NSA is dropping a not-so-subtle hint, though why it would is debatable. If it has a CRQC, or is on the path to one, it might want to warn allies, vendors, and citizens to think about using quantum-resistant technologies in case bad people develop a CRQC too. But why would the spies tip their hand so? It's all very curious.
Eric Trexler, VP of global governments at security shop Forcepoint, told The Register: "Progress on quantum computers has been steadily made over the past few years, and while they may not ever replace our standard, classical computing, they are very effective at solving certain problems. This includes public-key asymmetric cryptography, one of the two different types of cryptosystems in use today."
Public-key cryptography is what the world relies on for strong encryption, such as TLS and SSL that underpin the HTTPS standard used to help protect your browser data from third-party snooping.
In the NSA's summary, a CRQC, should one ever exist, "would be capable of undermining the widely deployed public key algorithms used for asymmetric key exchanges and digital signatures" and what a relief it is that no one has one of these machines yet. The post-quantum encryption industry has long sought to portray quantum computing as an immediate threat to today's encryption, as El Reg detailed in 2019.
"The current widely used cryptography and hashing algorithms are based on certain mathematical calculations taking an impractical amount of time to solve," explained Martin Lee, a technical lead at Cisco's Talos infosec arm. "With the advent of quantum computers, we risk that these calculations will become easy to perform, and that our cryptographic software will no longer protect systems."
Given that nations and labs are working toward building crypto-busting quantum computers, the NSA said it was working on "quantum-resistant public key" algorithms for private suppliers to the US government to use, having had its Post-Quantum Standardization Effort running since 2016. However, the agency said there are no such algos that commercial vendors should adopt right now, "with the exception of stateful hash signatures for firmware."
Smart cookies will be glad to hear that the NSA considers AES-256 and SHA-384 "safe against attack by a large quantum computer."
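A standard back-of-the-envelope argument (ours, not spelled out in the NSA FAQ) explains why symmetric primitives fare so much better than public-key ones: Grover's algorithm offers only a quadratic speed-up for brute-force key search, so a k-bit symmetric key retains roughly k/2 bits of quantum security,

$$\text{classical search} \sim 2^{k} \text{ operations}, \qquad \text{Grover search} \sim \sqrt{2^{k}} = 2^{k/2},$$

leaving AES-256 with on the order of $2^{128}$ quantum operations of resistance, far beyond any foreseeable machine, whereas Shor's algorithm would break RSA and elliptic-curve keys outright.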
Jason Soroko, CTO of Sectigo, a vendor that advertises "quantum safe cryptography," said the NSA report wasn't conclusive proof that current encryption algos were safe from innovation.
"Quantum computers alone do not crack public key cryptography," he said, adding that such a beast would need to execute an implementation of Shors algorithm. That algo was first described in 1994 by an MIT maths professor and allows for the calculation of prime factors of very large numbers; a vital step towards speeding up the decryption of the product of current encryption algorithms.
"Work on quantum resistant cryptographic algorithms is pushing forward based on the risk that Universal quantum computers will eventually have enough stable qubits to eventually implement Shors algorithm," continued Soroko. "I think its important to assume that innovation in both math and engineering will potentially surprise us."
While advances in cryptography are of more than merely academic interest to the infosec world, it is worth remembering that security (and data) breaches occur primarily because of human factors. Ransomware, currently the largest threat to enterprises, typically spreads because someone's forgotten to patch or decommission a machine on a corporate network, or because somebody opens an attachment from a malicious email.
Or there's the old joke about rubber hose cryptanalysis, referring to beating the passwords out of a captured sysadmin.
Talos' Lee concluded: "In a world where users will divulge their passwords in return for chocolate or in response to an enticing phishing email, the risk of quantum computers might not be our biggest threat."
Read the original:
NSA: We 'don't know when or even if' a quantum computer will ever be able to break today's public-key encryption - The Register
Quantum Computing in Manufacturing Market Rising Trends-Microsoft, D-Wave Solutions, Rigetti Computing, Intel UNLV The Rebel Yell – UNLV The Rebel…
This report studies the Quantum Computing in Manufacturing Market across many aspects of the industry, such as the market size, status, trends and forecast. It also provides brief information on the competitors and the specific growth opportunities with key market drivers. Find the complete Quantum Computing in Manufacturing Market analysis segmented by companies, region, type and applications in the report.
The report offers valuable insight into the Quantum Computing in Manufacturing market progress and approaches related to the Quantum Computing in Manufacturing market with an analysis of each region. The report goes on to talk about the dominant aspects of the market and examine each segment.
Top Companies in the Global Quantum Computing in Manufacturing Market: IBM, Google, Microsoft, D-Wave Solutions, Rigetti Computing, Intel, Origin Quantum Computing Technology, Anyon Systems Inc., Cambridge Quantum Computing Limited.
This report segments the global Quantum Computing in Manufacturing market on the basis of Type:
Manufacturing
Industry Chain Service
On the basis of Application, the Global Quantum Computing in Manufacturing market is segmented into:
Car
Mechanical
Electronic
Chemical Industry
Other
SPECIAL OFFER (Avail an Up-to 25% discount on this report, please fill the form and mention the code: MID25 in the comments section)
Influence of the Quantum Computing in Manufacturing market report:
-Comprehensive assessment of all opportunities and risks in the Quantum Computing in Manufacturing market.
-Recent innovations and major events in the Quantum Computing in Manufacturing market.
-Detailed study of the business strategies for growth of the market's leading players.
-Conclusive study of the growth trajectory of the Quantum Computing in Manufacturing market for the forthcoming years.
-In-depth understanding of the market's specific drivers, constraints and major micro markets.
-Insight into the key technological and market trends shaping the Quantum Computing in Manufacturing market.
The report has 150 tables and figures; browse the report description and TOC for details.
What are the market factors that are explained in the report?
Key Strategic Developments: The study also includes the key strategic developments of the market, comprising R&D, new product launch, M&A, agreements, collaborations, partnerships, joint ventures, and regional growth of the leading competitors operating in the market on a global and regional scale.
Key Market Features: The report evaluated key market features, including revenue, price, capacity, capacity utilization rate, gross, production, production rate, consumption, import/export, supply/demand, cost, market share, CAGR, and gross margin. In addition, the study offers a comprehensive study of the key market dynamics and their latest trends, along with pertinent market segments and sub-segments.
Analytical Tools: The Global Quantum Computing in Manufacturing Market report includes the accurately studied and assessed data of the key industry players and their scope in the market by means of a number of analytical tools. Analytical tools such as Porter's five forces analysis, SWOT analysis, feasibility study, and investment return analysis have been used to analyze the growth of the key players operating in the market.
We also offer customization of reports based on specific client requirements:
1- Free country-level analysis for any 5 countries of your choice.
2- Free competitive analysis of any 5 key market players.
3- Free 40 analyst hours to cover any other data points.
Media contact:
Irfan Tamboli (Head of Sales), Market Insights Reports | Phone: +1 704 266 3234 | Mob: +91-750-707-8687 | sales@marketinsightsreports.com | irfan@marketinsightsreports.com
IBM quantum computing: From healthcare to automotive to energy, real use cases are in play – TechRepublic
Companies including Anthem, Daimler-Benz and ExxonMobil have big plans to deploy IBM quantum computers this decade.
Quantum computers have been receiving a lot of attention because of their potential to solve computationally difficult problems that classical computers cannot. Among those problems: helping companies reduce their carbon footprint and protecting the world from the next pandemic.
SEE: The CIO's guide to quantum computing (free PDF) (TechRepublic)
Since announcing the IBM Quantum Network in 2017 with 12 initial organizations, IBM said today, its commercial quantum computing program has grown to more than 150 Fortune 500 companies, academic institutions, startups and national research labs. Some 260,000 users have run nearly a trillion circuits, according to the company.
Last spring, IBM rolled out Qiskit Runtime, with the ability to speed up quantum programs on the cloud by 120x, alongside its roadmap to deliver a 1,000+ qubit system by 2023.
In addition to increasing speed, Qiskit Runtime changes how IBM is able to offer quantum computing to clients and make it more widely available, the company said.
"Before we got to Runtime, clients were doing research using simulators," and now they can investigate applications for finance, machine learning and chemistry using real hardware, said Jay Gambetta, IBM fellow and vice president of quantum computing.
"To me, this is fundamentally important because a simulator can never mimic quantum computing, so you need to do your research and development on the hardware and that's what's getting enabled," Gambetta said. "I see this year as when this fully comes out of beta and will be the new way of using quantum" to ask questions such as whether quantum will scale in the way clients can use apps."
In the meantime, customers are incorporating quantum into their plans for the future. At healthcare provider Anthem, quantum computing is "an integral part of our digital platform for health," and is being used for "computationally intense and expensive tasks such as identifying anomalies, where there's tons of data and interactions," said John Utz, staff vice president of digital product management.
Quantum computers are better at that than classical computers, Utz said. Anthem is running different models on IBM's quantum cloud. Right now, company officials are building a roadmap around how Anthem wants to deliver its platform using quantum technology, so "I can't say quantum is ready for primetime yet," Utz said. "The plan is to get there over the next year or so and have something working in production."
SEE: Expert: Now is the time to prepare for the quantum computing revolution (TechRepublic)
A good place to start with anomaly detection is in finding fraud, he said. "Classical computers will tap out at some point and can't get to the same place as quantum computers."
Other use cases are around longitudinal population health modeling, meaning that as Anthem looks at providing more of a digital platform for health, one of the challenges is that there is "almost an infinite number of relationships," he said. This includes different health conditions, providers patients see, outcomes and figuring out where there are outliers, he said.
"There's only so much a classical system can do there, so we're looking for more opportunities to improve healthcare for our members and the population at large," and the ability to proactively predict risk, Utz said. Quantum computers are better at driving outcomes from the models Anthem is building, he said.
Daimler AG, the parent company of Mercedes-Benz, is studying how to develop energy-dense batteries such as the lithium-sulfur battery. But going from the drawing board to a commercially viable Li-S battery is "essentially a mammoth chemistry experiment," the company said.
Engineers are testing quantum systems to distill some very abstract physics theory into a new kind of computing power that can handle what IBM calls "once-insoluble complexity." Using quantum bits known as qubits, whose combined computing capacity doubles with each qubit added, gives a substantial boost to the ability to run algorithms that speed the simulation process and test the feasibility of the battery, the company said.
Energy challenges are expected to increase as the global population grows from 7.5 billion today to a projected 9.2 billion by 2040, according to ExxonMobil. This has created what the company refers to as the "dual challenge" of providing reliable and affordable energy to a rising population while also reducing environmental impacts and the risks of climate change.
One way to tackle that challenge in the near term is to use natural gas, which emits up to 60% less greenhouse gases than coal, according to Dr. Vijay Swarup, vice president of research and development at ExxonMobil, in a statement. This creates issues with production and transportation, he said.
SEE: Startup claims new "quantum analog computer" solved the traveling salesman problem for 128 cities (TechRepublic)
It requires efficient liquified natural gas shipping, but finding optimal routes for a fleet of LNG ships to transport critical fuel supplies is a "mind-bendingly complex optimization problem." It involves accounting for each ship's position every day of the year along with the LNG requirements of each delivery site.
This type of problem cannot be solved exactly with classical computing, IBM said. So ExxonMobil, in tandem with IBM Research, is using a combination of classical and quantum computers to address the complexity. Teams are modeling maritime inventory routing on quantum devices, analyzing the strengths and tradeoffs of different strategies for vehicle and inventory routing, and laying the foundation for constructing practical solutions for their operations, IBM said.
Swarup said ExxonMobil's goal is to increase its ability to tackle more complex optimizations and previously insoluble routing problems as IBM's quantum hardware scales from small prototype systems to larger devices.
See the original post:
IBM quantum computing: From healthcare to automotive to energy, real use cases are in play - TechRepublic
Quantum computers could read all your encrypted data. This ‘quantum-safe’ VPN aims to stop that – ZDNet
The trial successfully demonstrated, according to Verizon, that it is possible to replace current security processes with protocols that are quantum-proof.
To protect our private communications from future attacks by quantum computers, Verizon is trialing the use of next-generation cryptography keys to protect the virtual private networks (VPNs) that are used every day by companies around the world to prevent hacking.
Verizon implemented what it describes as a "quantum-safe" VPN between one of the company's labs in London in the UK and a US-based center in Ashburn, Virginia, using encryption keys that were generated with post-quantum cryptography methods, meaning that they are robust enough to withstand attacks from a quantum computer.
According to Verizon, the trial successfully demonstrated that it is possible to replace current security processes with protocols that are quantum-proof.
VPNs are a common security tool used to protect connections made over the internet, by creating a private network from a public internet connection. When a user browses the web with a VPN, all of their data is redirected through a specifically configured remote server run by the VPN host, which acts as a filter that encrypts the information.
This means that the user's IP address and any of their online activities, from sending emails to paying bills, come out as gibberish to potential hackers even on insecure networks like public WiFi, where eavesdropping is much easier.
Especially in the last few months, which have seen many employees switching to full-time working from home, VPNs have become an increasingly popular tool to ensure privacy and security on the internet.
The technology, however, is based on cryptography protocols that are not un-hackable. To encrypt data, VPN hosts use encryption keys that are generated by well-established algorithms such as RSA (Rivest-Shamir-Adleman). The difficulty of cracking the key, and therefore of reading the data, is directly linked to the algorithm's ability to create as complicated a key as possible.
In other words, encryption protocols as we know them are essentially a huge math problem for hackers to solve. With existing computers, cracking the equation is extremely difficult, which is why VPNs, for now, are still a secure solution. But quantum computers are expected to bring about huge amounts of extra computing power and with that, the ability to hack any cryptography key in minutes.
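As a worked illustration of that "huge math problem" (a textbook toy, not any production parameters), here is RSA with absurdly small primes; its security rests entirely on the difficulty of factoring the public modulus n back into p and q.

```python
# Toy RSA with tiny primes -- purely illustrative and wildly insecure.
p, q = 61, 53                 # real RSA primes are hundreds of digits long
n = p * q                     # 3233: the public modulus
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (requires Python 3.8+): 2753

m = 65                        # a "message" encoded as a number
c = pow(m, e, n)              # encrypt with the public key (e, n)
assert pow(c, d, n) == m      # decrypt with the private key d

# Anyone who can factor n recovers p and q, hence phi and the private key d.
# Classically that is infeasible for 2048-bit moduli; Shor's algorithm on a
# large quantum computer would make the factoring step fast, breaking the scheme.
```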
"A lot of secure communications rely on algorithms which have been very successful in offering secure cryptography keys for decades," Venkata Josyula, the director of technology at Verizon, tells ZDNet. "But there is enough research out there saying that these can be broken when there is a quantum computer available at a certain capacity. When that is available, you want to be protecting your entire VPN infrastructure."
One approach that researchers are working on consists of developing algorithms that can generate keys that are too difficult to hack, even with a quantum computer. This area of research is known as post-quantum cryptography, and is particularly sought after by governments around the world.
In the US, for example, the National Institute of Standards and Technology (NIST) launched a global research effort in 2016 calling on researchers to submit ideas for algorithms that would be less susceptible to a quantum attack. A few months ago, the organization selected a group of 15 algorithms that showed the most promise.
"NIST is leading a standardization process, but we didn't want to wait for that to be complete because getting cryptography to change across the globe is a pretty daunting task," says Josyula. "It could take 10 or even 20 years, so we wanted to get into this early to figure out the implications."
Verizon has significant amounts of VPN infrastructure and the company sells VPN products, which is why the team started investigating how to start enabling post-quantum cryptography right now and in existing services, Josyula adds.
One of the 15 algorithms identified by NIST, called Saber, was selected for the test. Saber generated quantum-safe cryptography keys that were delivered to the endpoints in London and Ashburn of a typical IPsec VPN through an extra layer of infrastructure, which was provided by a third-party vendor.
Whether Saber makes it to the final rounds of NIST's standardization process, in this case, doesn't matter, explains Josyula. "We tried Saber here, but we will be trying others. We are able to switch from one algorithm to the other. We want to have that flexibility, to be able to adapt in line with the process of standardization."
In other words, Verizon's test has shown that it is possible to implement post-quantum cryptography candidates on infrastructure links now, with the ability to migrate as needed between different candidates for quantum-proof algorithms.
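To make the idea of drop-in, swappable key establishment concrete, here is a minimal sketch of the keygen/encapsulate/decapsulate interface that key-encapsulation mechanisms (KEMs) such as Saber expose. The ToyKEM class below is a placeholder of our own: it is deliberately insecure and only illustrates the shape of the API and the crypto-agility point, not Verizon's setup or the real Saber construction.

```python
import os, hashlib

class ToyKEM:
    """Illustrates the interface shared by post-quantum KEMs such as Saber or Kyber.
    NOT a real KEM: the 'ciphertext' can be opened by anyone holding the public key."""

    def keygen(self):
        sk = os.urandom(32)
        pk = hashlib.sha256(sk).digest()
        return pk, sk

    def encapsulate(self, pk):
        shared = os.urandom(32)                          # fresh session secret
        ct = bytes(a ^ b for a, b in zip(shared, pk))    # toy "encryption" under pk
        return ct, shared

    def decapsulate(self, sk, ct):
        pk = hashlib.sha256(sk).digest()
        return bytes(a ^ b for a, b in zip(ct, pk))

kem = ToyKEM()                    # swapping in another KEM here is the crypto-agility step
pk, sk = kem.keygen()             # VPN endpoint A publishes its public key
ct, key_b = kem.encapsulate(pk)   # endpoint B derives a session secret and sends ct
key_a = kem.decapsulate(sk, ct)   # endpoint A recovers the same secret
assert key_a == key_b             # this shared secret would seed the tunnel's symmetric cipher
```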
This is important because, although a large-scale quantum computer could be more than a decade away, there is still a chance that the data that is currently encrypted with existing cryptography protocols is at risk.
The threat is known as "harvest now, decrypt later" and refers to the possibility that hackers could collect huge amounts of encrypted data and sit on it while they wait for a quantum computer to come along that could read all the information.
"If it's your Amazon shopping cart, you may not care if someone gets to see it in ten years," says Josyula. "But you can extend this to your bank account, personal number, and all the way to government secrets. It's about how far into the future you see value for the data that you own and some of these have very long lifetimes."
For this type of data, it is important to start thinking about long-term security now, which includes the risk posed by quantum computers.
A quantum-safe VPN could be a good start even though, as Josyula explains, many elements still need to be smoothed out. For example, Verizon still relied on standard mechanisms in its trial to deliver quantum-proof keys to the VPN end-points. This might be a sticking point, if it turns out that this phase of the process is not invulnerable to quantum attack.
The idea, however, is to take proactive steps to prepare, instead of waiting for the worst-case scenario to happen. Connecting London to Ashburn was a first step, and Verizon is now looking at extending its quantum-safe VPN to other locations.
Continue reading here:
Quantum computers could read all your encrypted data. This 'quantum-safe' VPN aims to stop that - ZDNet
Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium -…
TOKYO--(BUSINESS WIRE)--Sumitomo Corporation Quantum Transformation (QX) Project will present at the IEEE Quantum AI Sustainability Symposium on September 1st, 2021. The QX Project was launched in March 2021 by Sumitomo Corporation, a global Fortune 500 trading and investment company, with the intent to provide new value to society by applying quantum computing technology to the wide-ranging industries in which the company operates. This is the world's first project that defines Quantum Transformation (QX) as the next social paradigm shift, beyond Digital Transformation (DX).
The founder and head of the QX Project, Masayoshi Terabe, will present the vision and activities of QX at the IEEE Quantum AI Sustainability Symposium. The organizer, IEEE, is the world's largest technical professional organization for the advancement of technology. In this talk, he will show how quantum computing can contribute to sustainability. For example, he will introduce the Quantum Sky project, a pilot experiment for developing flight routes for numerous air mobility vehicles using quantum computing. Other concepts, such as Quantum Smart City and Quantum Energy Management, will also be covered.
The objective of the QX Project is to create new value for society by combining Sumitomo Corporation's vast business fields, spanning more than 900 consolidated companies from underground to space, with an extensive number of business partners around the world.
A broad and deep ecosystem is necessary to achieve QX, because it requires combining a wide range of technologies, not limited to quantum, and working across various industries. If you are interested in this project, let's take on the challenge of creating a new business, and a new society, together!
The rest is here:
Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium -...
Life, the universe and everything Physics seeks the future – The Economist
Aug 25th 2021
A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus people's ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.
In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.
This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics's eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.
String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model, the further particles predicted by Susy, which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or sparticle, and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.
But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.
They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.
The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies' favourite for unifying relativity and the Standard Model is something called entropic gravity.
Entropy is a measure of a system's disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.
In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
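For the record, the quantities being plugged in are standard textbook formulas (not quoted in the article): a black hole of mass $M$ has Hawking temperature and Bekenstein-Hawking entropy

$$T_H = \frac{\hbar c^3}{8\pi G M k_B}, \qquad S_{BH} = \frac{k_B c^3 A}{4 G \hbar}, \qquad A = \frac{16\pi G^2 M^2}{c^4},$$

where $A$ is the area of the event horizon; it is these thermodynamic quantities that, fed back through the first law, reproduce the relativistic field equations.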
That relationship was discovered in 2010 by Erik Verlinde of Amsterdam University. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.
String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time, or, to use Einsteinian terminology, spacetime. It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.
The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated (entangled) in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.
As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because although the influence of one particle on another can be instantaneous there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.
Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time, that are difficult or impossible for a regular computer.
And because of their entanglement qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic-gravity theory's claims.
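As a rough numerical illustration of how "amount of entanglement" can be quantified (our sketch, not the researchers' code), the von Neumann entropy of one qubit's reduced density matrix is zero for an unentangled pair and maximal for a Bell pair:

```python
import numpy as np

def entanglement_entropy(psi):
    """Von Neumann entropy of qubit A for a two-qubit pure state psi (4-vector)."""
    rho = np.outer(psi, psi.conj())                            # full density matrix
    rho_a = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)    # trace out qubit B
    evals = np.linalg.eigvalsh(rho_a)
    evals = evals[evals > 1e-12]                               # drop numerical zeros
    return float(-(evals * np.log2(evals)).sum())

product = np.array([1, 0, 0, 0], dtype=complex)                # |00>: no entanglement
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)      # (|00> + |11>)/sqrt(2)
print(entanglement_entropy(product))  # 0.0
print(entanglement_entropy(bell))     # 1.0 (maximally entangled)
```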
All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.
In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements "A causes B" and "B causes A" are both true.
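The geometry in question is usually written (a textbook convention, not something from the article) as the Minkowski interval between two nearby events,

$$ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2,$$

and the sign of $ds^2$ is what fixes causal order: timelike separations ($ds^2 < 0$) permit a cause-and-effect relation, spacelike ones ($ds^2 > 0$) do not. Superposing different geometries therefore superposes different causal orderings, which is the ambiguity described above.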
This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.
However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.
But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.
Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity but, rather, has a structure: either elementary loops or triangles, according to which of the two theories you support.
A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be "asymptotically safe") if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.
One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study at Princeton and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.
Most attempts at a theory of everything try to fit gravity, which Einstein describes geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.
That space, time and even causality are emergent rather than fundamental properties of the cosmos are radical ideas. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.
No theory, though, is worth a damn without data. That, after all, is the problem with Supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.
On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.
In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual rather than real particles; that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.
The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.
There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.
Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physics's darkest periods is about to brighten into a new morning.
This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"
Read the original:
Life, the universe and everything Physics seeks the future - The Economist
This Exotic Particle Had an Out-of-Body Experience These Surprised Scientists Took a Picture of It – SciTechDaily
Artist's illustration of ghost particles moving through a quantum spin liquid. Credit: Jenny Nuss/Berkeley Lab
An unexpected finding by scientists at Berkeley Lab and UC Berkeley could advance quantum computers and high-temperature superconductors.
Scientists have taken the clearest picture yet of electronic particles that make up a mysterious magnetic state called a quantum spin liquid (QSL).
The achievement could facilitate the development of superfast quantum computers and energy-efficient superconductors.
The scientists are the first to capture an image of how electrons in a QSL decompose into spin-like particles called spinons and charge-like particles called chargons.
"Other studies have seen various footprints of this phenomenon, but we have an actual picture of the state in which the spinon lives. This is something new," said study leader Mike Crommie, a senior faculty scientist at Lawrence Berkeley National Laboratory (Berkeley Lab) and physics professor at UC Berkeley.
"Spinons are like ghost particles. They are like the Big Foot of quantum physics: people say that they've seen them, but it's hard to prove that they exist," said co-author Sung-Kwan Mo, a staff scientist at Berkeley Lab's Advanced Light Source. "With our method we've provided some of the best evidence to date."
In a QSL, spinons freely move about carrying heat and spin but no electrical charge. To detect them, most researchers have relied on techniques that look for their heat signatures.
Now, as reported in the journal Nature Physics, Crommie, Mo, and their research teams have demonstrated how to characterize spinons in QSLs by directly imaging how they are distributed in a material.
Schematic of the triangular spin lattice and star-of-David charge density wave pattern in a monolayer of tantalum diselenide. Each star consists of 13 tantalum atoms. Localized spins are represented by a blue arrow at the star center. The wavefunction of the localized electrons is represented by gray shading. Credit: Mike Crommie et al./Berkeley Lab
To begin the study, Mo's group at Berkeley Lab's Advanced Light Source (ALS) grew single-layer samples of tantalum diselenide (1T-TaSe2) that are only three atoms thick. This material is part of a class of materials called transition metal dichalcogenides (TMDCs). The researchers in Mo's team are experts in molecular beam epitaxy, a technique for synthesizing atomically thin TMDC crystals from their constituent elements.
Mo's team then characterized the thin films through angle-resolved photoemission spectroscopy, a technique that uses X-rays generated at the ALS.
Scanning tunneling microscopy image of a tantalum diselenide sample that is just 3 atoms thick. Credit: Mike Crommie et al./Berkeley Lab
Using a microscopy technique called scanning tunneling microscopy (STM), researchers in the Crommie lab, including co-first authors Wei Ruan, a postdoctoral fellow at the time, and Yi Chen, then a UC Berkeley graduate student, injected electrons from a metal needle into the tantalum diselenide TMDC sample.
Images gathered by scanning tunneling spectroscopy (STS), an imaging technique that measures how particles arrange themselves at a particular energy, revealed something quite unexpected: a layer of mysterious waves having wavelengths larger than one nanometer (1 billionth of a meter) blanketing the material's surface.
"The long wavelengths we saw didn't correspond to any known behavior of the crystal," Crommie said. "We scratched our heads for a long time. What could cause such long wavelength modulations in the crystal? We ruled out the conventional explanations one by one. Little did we know that this was the signature of spinon ghost particles."
With help from a theoretical collaborator at MIT, the researchers realized that when an electron is injected into a QSL from the tip of an STM, it breaks apart into two different particles inside the QSL: spinons (also known as ghost particles) and chargons. This is due to the peculiar way in which spin and charge in a QSL collectively interact with each other. The spinon ghost particles end up separately carrying the spin while the chargons separately bear the electrical charge.
Illustration of an electron breaking apart into spinon ghost particles and chargons inside a quantum spin liquid. Credit: Mike Crommie et al./Berkeley Lab
In the current study, STM/STS images show that the chargons freeze in place, forming what scientists call a star-of-David charge-density wave. Meanwhile, the spinons undergo an "out-of-body experience" as they separate from the immobilized chargons and move freely through the material, Crommie said. "This is unusual since in a conventional material, electrons carry both the spin and charge combined into one particle as they move about," he explained. "They don't usually break apart in this funny way."
Crommie added that QSLs might one day form the basis of robust quantum bits (qubits) used for quantum computing. In conventional computing a bit encodes information either as a zero or a one, but a qubit can hold both zero and one at the same time, thus potentially speeding up certain types of calculations. Understanding how spinons and chargons behave in QSLs could help advance research in this area of next-gen computing.
Another motivation for understanding the inner workings of QSLs is that they have been predicted to be a precursor to exotic superconductivity. Crommie plans to test that prediction with Mos help at the ALS.
"Part of the beauty of this topic is that all the complex interactions within a QSL somehow combine to form a simple ghost particle that just bounces around inside the crystal," he said. "Seeing this behavior was pretty surprising, especially since we weren't even looking for it."
Reference: "Evidence for quantum spin liquid behaviour in single-layer 1T-TaSe2 from scanning tunnelling microscopy" by Wei Ruan, Yi Chen, Shujie Tang, Jinwoong Hwang, Hsin-Zon Tsai, Ryan L. Lee, Meng Wu, Hyejin Ryu, Salman Kahn, Franklin Liou, Caihong Jia, Andrew Aikawa, Choongyu Hwang, Feng Wang, Yongseong Choi, Steven G. Louie, Patrick A. Lee, Zhi-Xun Shen, Sung-Kwan Mo & Michael F. Crommie, 19 August 2021, Nature Physics. DOI: 10.1038/s41567-021-01321-0
Researchers from SLAC National Accelerator Laboratory; Stanford University; Argonne National Laboratory; the Massachusetts Institute of Technology; the Chinese Academy of Sciences, Shanghai Tech University, Shenzhen University, Henan University of China; and the Korea Institute of Science and Technology and Pusan National University of Korea contributed to this study. (Co-first author Wei Ruan is now an assistant professor of physics at Fudan University in China; co-first author Yi Chen is currently a postdoctoral fellow at the Center for Quantum Nanoscience, Institute for Basic Science of Korea.)
This work was supported by the DOE Office of Science, and used resources at Berkeley Lab's Advanced Light Source and Argonne National Laboratory's Advanced Photon Source. The Advanced Light Source and Advanced Photon Source are DOE Office of Science user facilities.
Additional support was provided by the National Science Foundation.
Read the rest here:
This Exotic Particle Had an Out-of-Body Experience These Surprised Scientists Took a Picture of It - SciTechDaily
Deloitte’s quantum computing leader on the technology’s healthcare future – Healthcare IT News
Quantum computing has enormous potential in healthcare and has started to impact the industry in various ways.
For example, quantum computing offers the ability to track and diagnose disease. Using sensors, quantum technology has the ability to track the progress of cancer treatments and diagnose and monitor such degenerative diseases as multiple sclerosis.
The tech also can help modernize supply chains. Quantum technology can solve routing issues in real time using live data such as weather and traffic updates to help determine the most efficient method of delivery. This would have been particularly helpful during the pandemic since many states had issues with vaccine deliveries.
Elsewhere, quantum technology can impact early-stage drug discovery. Pharmaceuticals can take a decade or longer to bring to market. Quantum computing could lower the costs and reduce the time.
"In the simplest terms, quantum computing harnesses the mysterious properties of quantum mechanics to solve problems using individual atoms and subatomic particles," explained Scott Buchholz, emerging technology research director and government and public services CTO at Deloitte Consulting. "Quantum computers can be thought of as akin to supercomputers.
"However, today's supercomputers solve problems by performing trillions of math calculations very quickly to predict the weather, study air flow over wings, etc.," he continued. "Quantum computers work very differently they perform calculations all at once, limited by the number of qubitsof information that they currently hold."
Because of how differently they work, they aren't well suited for all problems, but they're a fit for certain types of problems, such as molecular simulation, optimization and machine learning.
"What's important to note is that today's most advanced quantum computers still aren't especially powerful," Buchholz noted.
"Many calculations they currently can do can be performed on a laptop computer. However, if quantum computers continue to scale exponentially that is, the number of qubitsthey use for computation continues to double every year or so they will become dramatically more powerful in years to come.
"Because quantum computers can simulate atoms and other molecules much better than classical computers, researchers are investigating the future feasibility of doing drug discovery, target protein matching, calculating protein folding and more," he continued.
"That is, during the drug discovery process, they could be useful to dramatically reduce the time required to sort through existing databases of molecules to look for targets, identify potential new drugs with novel properties, identify potential new targets and more."
Researchers also are investigating the possibility of simulating or optimizing manufacturing processes for molecules, which potentially could help make scaling up manufacturing easier over time. While these advances won't eliminate the lengthy testing process, they may well accelerate the initial discovery process for interesting molecules.
"Quantum computing may also directly and indirectly lead to the ability to diagnose disease," Buchholz said. "Given future machines' ability to sort through complex problems quickly, they may be able to accelerate the processing of some of the techniques that are being developed today, say those that are designed to identify harmful genetic mutations or combinations.
"Indirectly, some of the materials that were investigated for quantum computers turned out to be better as sensors," he added. "Researchers are investigating quantum-based technologies to make smaller, more sensitive, lower-power sensors. In the future, these sensors and exotic materials may be combined in clever ways to help with disease identification and diagnosis."
Quantum computers will improve the ability to optimize logistics and routing, potentially easing bottlenecks in supply chains or identifying areas of improvement, Buchholz said.
Perhaps more interestingly, due to their ability to simulate molecular interactions, researchers are looking at their ability to optimize manufacturing processes to be quicker, use less energy and produce less waste, he added. That could lead to alternative manufacturing techniques that could simplify healthcare supply chains, he noted.
"Ultimately, the promise of quantum computers is to make some things faster like optimization and machine learning and make some things practical like large scale molecular and process simulation," he said.
"While the technology to solve the 'at scale' problems is still several years in the future, researchers currently are working hard today to put the foundations in place to tackle these problems as the hardware capacity of quantum computers advances.
"Should the hardware researchers achieve some of the sought after scalability breakthroughs, that promise could accelerate," he concluded.
Go here to read the rest:
Deloitte's quantum computing leader on the technology's healthcare future - Healthcare IT News
Experiments Prove Quantum Computing Errors Correlated, Tied to Cosmic Rays – SciTechDaily
In experiments performed at the University of Wisconsin-Madison, researchers found that fluctuations in the electrical charge of multiple quantum bits, or qubits, can be highly correlated, as opposed to completely random and independent. The team also linked tiny error-causing perturbations in the qubits' charge state to the absorption of cosmic rays. Credit: Illustration courtesy of UW-Madison
Research by a Lawrence Livermore National Laboratory (LLNL) physicist and a host of collaborators is shedding new light on one of the major challenges to realizing the promise and potential of quantum computing: error correction.
In a new paper published in Nature and co-authored by LLNL physicist Jonathan DuBois, scientists examined quantum computing stability, particularly what causes errors and how quantum circuits react to them. This must be understood in order to build a functioning quantum system. Other co-authors included researchers at the University of Wisconsin-Madison, Fermi National Accelerator Laboratory, Google, Stanford University and international universities.
In experiments performed at UW-Madison, the research team characterized a quantum testbed device, finding that fluctuations in the electrical charge of multiple quantum bits, or qubits (the basic unit of a quantum computer), can be highly correlated, as opposed to completely random and independent. When a disruptive event occurs, such as a burst of energy coming from outside the system, it can affect every qubit in the vicinity of the event simultaneously, resulting in correlated errors that can span the entire system, the researchers found. Additionally, the team linked tiny error-causing perturbations in the qubits' charge state to the absorption of cosmic rays, a finding that already is impacting how quantum computers are designed.
"For the most part, schemes designed to correct errors in quantum computers assume that the errors across qubits are uncorrelated, that they're random. Correlated errors are very difficult to correct," said co-author DuBois, who heads LLNL's Quantum Coherent Device Physics (QCDP) Group. "Essentially, what this paper is showing is that if a high-energy cosmic ray hits the device somewhere, it has the potential to affect everything in the device at once. Unless you can prevent that from happening, you can't perform error correction efficiently, and you'll never be able to build a working system without that."
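DuBois's point can be made concrete with a deliberately simple toy model, sketched below. This is our illustration, not anything from the paper: a classical simulation of a three-qubit repetition code with majority-vote decoding, comparing independent bit-flip errors with a fully correlated event that flips all three qubits at once, as a particle impact might. With independent errors the encoded value fails only when two or more qubits flip (roughly 3p² for small p), but a correlated event defeats the vote entirely, leaving the code no better off than a single unprotected qubit.

```python
# Toy comparison (illustrative only, not the paper's model): a 3-qubit
# repetition code with majority-vote decoding under independent versus
# fully correlated bit-flip errors.
import random

def logical_error_rate(p, correlated, trials=100_000):
    failures = 0
    for _ in range(trials):
        if correlated:
            # A single disruptive event (e.g. a particle impact) flips all qubits at once.
            flips = [1, 1, 1] if random.random() < p else [0, 0, 0]
        else:
            # Each qubit flips independently with probability p.
            flips = [1 if random.random() < p else 0 for _ in range(3)]
        # Majority-vote decoding fails when two or more qubits have flipped.
        if sum(flips) >= 2:
            failures += 1
    return failures / trials

p = 0.05
print("independent errors:", logical_error_rate(p, correlated=False))  # ~0.007, well below p
print("correlated errors: ", logical_error_rate(p, correlated=True))   # ~0.05, no protection gained
```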
Unlike the bits found in classical computers, which can exist only in binary states (zeroes or ones), the qubits that make up a quantum computer can exist in superpositions. For a few hundred microseconds, the data in a qubit can be both a one and a zero at once before being projected into a classical binary state. And whereas bits are susceptible to only one type of error, the delicate qubits, while in this temporary quantum state, are susceptible to two types of error, stemming from changes that can occur in their environment.
Charged impulses, even minute ones like those from cosmic rays absorbed by the system, can create a blast of (relatively) high-energy electrons that heat up the quantum device's substrate just long enough to disrupt the qubits and disturb their quantum states, the researchers found. "When a particle impact occurs, it produces a wake of electrons in the device. These charged particles zoom through the materials in the device, scattering off atoms and producing high-energy vibrations and heat. This alters the electric field as well as the thermal and vibrational environment around the qubits, resulting in errors," DuBois explained.
"We've always known this was possible and a potential effect, one of many that can influence the behavior of a qubit," DuBois added. "We even joked when we saw bad performance that maybe it's because of cosmic rays. The significance of this research is that, given that sort of architecture, it puts some quantitative bounds on what you can expect in terms of performance for current device designs in the presence of environmental radiation."
To view the disruptions, researchers sent radio-frequency signals into a four-qubit system and, by measuring the excitation spectrum and performing spectroscopy on the qubits, were able to see the qubits flip from one quantum state to another, observing that they all shifted in energy at the same time in response to changes in the charge environment.
"If our model about particle impacts is correct, then we would expect that most of the energy is converted into vibrations in the chip that propagate over long distances," said UW-Madison graduate student Chris Wilen, the paper's lead author. "As the energy spreads, the disturbance would lead to qubit flips that are correlated across the entire chip."
Using the method, researchers also examined the lifetimes of qubits (the length of time a qubit can remain in its superposition of one and zero) and correlated changes in the charge state with a reduction in the lifetime of all the qubits in the system.
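For readers unfamiliar with lifetime measurements, the sketch below shows the standard idea in simplified form; it is not the paper's analysis pipeline. A qubit is prepared in its excited state, the system waits a delay t, the excited-state population P(t) is measured, and the exponential decay P(t) = exp(-t/T1) is fitted to extract the lifetime T1. The data here are synthetic, and the fitting routine (SciPy's curve_fit) is just one reasonable choice.

```python
# Simplified T1 extraction on synthetic data (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def decay(t, t1):
    # Excited-state population after a delay t, for lifetime t1.
    return np.exp(-t / t1)

true_t1_us = 80.0                           # assumed "true" lifetime, in microseconds
t = np.linspace(0, 400, 40)                 # delay times, in microseconds
rng = np.random.default_rng(0)
p_excited = decay(t, true_t1_us) + rng.normal(0.0, 0.02, t.size)  # noisy measurements

(fit_t1,), _ = curve_fit(decay, t, p_excited, p0=[50.0])
print(f"fitted T1 of roughly {fit_t1:.1f} microseconds")
```

A drop in the fitted T1 that coincides with a jump in the charge state, seen on every qubit at once, is the kind of correlated signature the team reports.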
The team concluded that quantum error correction will require development of mitigation strategies to protect quantum systems from correlated errors due to cosmic rays and other particle impacts.
"I think people have been approaching the problem of error correction in an overly optimistic way, blindly making the assumption that errors are not correlated," said UW-Madison physics professor Robert McDermott, senior author on the study. "Our experiments show absolutely that errors are correlated, but as we identify problems and develop a deep physical understanding, we're going to find ways to work around them."
DuBois said that, though long theorized, the effect had never before been experimentally demonstrated in a multi-qubit device. The results will likely shape future quantum system architecture, for example by putting quantum computers behind lead shielding or underground, introducing heat sinks or dampers to quickly absorb energy and isolate qubits, and altering the types of materials used in quantum systems.
LLNL currently has a quantum computing testbed system, designed and built with funding from a Laboratory Directed Research and Development (LDRD) Strategic Initiative that began in 2016. It is being developed with continued support from the National Nuclear Security Administration's Advanced Simulation & Computing program and its Beyond Moore's Law project.
In related follow-on work, DuBois and his team in the QCDP group are studying a quantum device that is significantly less sensitive to the charge environment. At the extremely cold temperatures required by quantum computers (systems are kept at temperatures colder than outer space), DuBois said, researchers observe that thermal and coherent energy transport is qualitatively different from what it is at room temperature. For example, instead of diffusing, thermal energy can bounce around in the system like sound waves.
DuBois said he and his team are focused on understanding the dynamics of the microscopic explosion that occurs inside quantum computing devices when they interact with high-energy particles, and on developing ways to absorb the energy before it can disrupt the delicate quantum states stored in the device.
"There are potentially ways to design the system so it's as insensitive as possible to these kinds of events, and in order to do that you need to have a really good understanding of how it heats up, how it cools down and what exactly is happening through the whole process when exposed to background radiation," DuBois said. "The physics of what's going on is quite interesting. It's a frontier, even aside from the quantum applications, because of the oddities of how energy is transported at those low temperatures. It makes it a physics challenge."
DuBois has been working with the paper's principal investigator, McDermott (UW-Madison), and his group to develop ways to use qubits as detectors to measure charge bias, the method the team used in the paper to conduct their experiments.
Reference: "Correlated charge noise and relaxation errors in superconducting qubits" by C. D. Wilen, S. Abdullah, N. A. Kurinsky, C. Stanford, L. Cardani, G. D'Imperio, C. Tomei, L. Faoro, L. B. Ioffe, C. H. Liu, A. Opremcak, B. G. Christensen, J. L. DuBois and R. McDermott, 16 June 2021, Nature. DOI: 10.1038/s41586-021-03557-5
The featured work, including DuBois' contribution, was funded by a collaborative grant between LLNL and UW-Madison from the U.S. Department of Energy's Office of Science.
The paper included co-authors from UW-Madison, the Fermi National Accelerator Laboratory, the Kavli Institute for Cosmological Physics at the University of Chicago, Stanford University, INFN Sezione di Roma, Sorbonne Université's Laboratoire de Physique Théorique et Hautes Énergies and Google.
Follow this link:
Experiments Prove Quantum Computing Errors Correlated, Tied to Cosmic Rays - SciTechDaily
Urgent Warning Issued Over The Future Of Bitcoin Even As The Crypto Market Price Smashes Past $2 Trillion – Forbes
Bitcoin and cryptocurrencies have seen a huge resurgence over the last year following the brutal so-called crypto winter that began in 2018.
The bitcoin price has this year climbed to never-before-seen highs, topping $60,000 per bitcoin before falling back slightly. Other smaller cryptocurrencies have risen at an even faster clip than bitcoin, with many making percentage gains into the thousands.
Now, as bitcoin and cryptocurrencies begin to carve out a place among traditional assets in investor portfolios, technologists have warned that advances in quantum computing could mean the encryption that underpins bitcoin is "fundamentally" undermined as soon as 2026, unless the software is updated.
The bitcoin price has risen many hundreds of percent over the last few years, but quantum computing could spell the end of bitcoin and cryptocurrencies unless urgent action is taken.
"Quantum computers, expected to be operational by around 2026, will easily undermine any blockchain security systems because of their power," says the founder of quantum encryption company Arqit, David Williams, speaking over the phone. Arqit is gearing up for a September SPAC listing in New York.
"There needs to be rather more urgency," Williams adds.
Quantum computing, which sees traditional computer "bits" replaced with quantum bits, or qubits, capable of performing certain calculations at vastly increased speed, has been in development since the 1990s. Researchers at universities around the world are now on the verge of creating a working quantum computer, with search giant Google and scientists from the University of New South Wales in Sydney, Australia, recently making headlines with breakthroughs.
Williams, pointing to problems previously identified by Charles Hoskinson, the ethereum cofounder and creator of cardano, warns that upgrading to post-quantum algorithms will "dramatically slow blockchains down" and calls for blockchain developers to adopt so-called quantum encryption keys.
"Blockchains are effectively fundamentally flawed if they dont address the oncoming quantum age. The grownups in the room know what's coming."
Others have also begun working on getting bitcoin and other blockchains ahead of quantum computing.
"If this isn't addressed before quantum computers pose a threat, the impact would be massive," says Duncan Jones, head of quantum cybersecurity at Cambridge Quantum Computing, speaking via email. "Attackers could create fraudulent transactions and steal currency, as well as potentially disrupting blockchain operations."
Earlier this month, Cambridge Quantum Computing, along with the Inter-American Development Bank and Tecnológico de Monterrey, identified four potential threats to blockchain networks posed by quantum computers and used a post-quantum cryptography layer to help protect them.
"Time is of the essence here," says Jones, pointing to Google chief executive Sundar Pichai's prediction that encryption could be broken in as little as five to 10 years. "It's important for decentralized networks to start this migration process soon because it requires careful planning and execution. However, I'm hopeful the core developers behind these platforms understand the issues and will be addressing them."
Recently, it's been reported that China is pulling ahead in the global quantum race, something Williams fears could undermine both traditional and crypto markets to the same degree as the 2008 global financial crisis.
"On day one, the creation of a quantum computer doesn't break everything," says Williams. "It will probably initially happen in secret and the information will slowly leak out that the cryptography has been broken. Then there will be a complete loss of confidence, similar to how the global financial crisis saw confidence in the system disintegrate."
With more than 11,000 different cryptocurrencies now listed on crypto data website CoinMarketCap and competition between bitcoin and other major cryptocurrencies reaching fever pitch, adding protection against the coming quantum revolution could be beneficial.
"If any one blockchain company could deliver proof it's quantum-safe it would have an advantage," says Williams.
See the original post here:
Urgent Warning Issued Over The Future Of Bitcoin Even As The Crypto Market Price Smashes Past $2 Trillion - Forbes