Category Archives: Quantum Physics
Counting the Risks for IonQ and Quantum Computing Technology – InvestorPlace
"Investing is like a box of chocolates: you never know what you're gonna get" is a line adapted from the movie Forrest Gump. The actual phrase was about life, but investing is a part of life. IonQ (NYSE:IONQ) went public via a special purpose acquisition company (SPAC) deal with dMY Technology Group. It is the first publicly traded quantum computing firm, and its goals are high, but its risks might be higher.
2021 has been a year full of trends for investors. There were all kinds of excitement and drama, as well as new questions about what matters in investing, such as the power of retail investors to move stocks, defying the fundamentals and causing new trends to emerge. One cutting-edge trend new to the investing world is quantum computing. IonQ is developing and commercializing this innovative technology, and it claims that this is the future.
So, what should you know about IONQ stock now?
This first publicly traded, purely quantum computing company has a mission to achieve significant things, from building the world's best quantum computers to solving the world's most complex problems. Its quantum computing systems use quantum physics to solve complicated problems much faster than classical computers.
First, a small bit of quantum computing 101. Quantum computing uses qubits (quantum bits) rather than classic bits. Each bit in classical computing has a value of either zero or one, whereas a qubit can have a value of zero, one, or a linear combination of the two. This superposition state allows quantum computers to bypass many of the traditional limits seen in classical computers, which in turn gives them significant advantages over their classical counterparts.
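To make the idea of superposition a little more concrete, here is a minimal sketch in Python/NumPy (not from the article; purely illustrative) that represents a single qubit as a two-component state vector and reads off the measurement probabilities:

    # Illustrative sketch: a qubit as a two-component state vector.
    import numpy as np

    zero = np.array([1, 0], dtype=complex)  # the classical-like value "zero"
    one = np.array([0, 1], dtype=complex)   # the classical-like value "one"

    # An equal superposition of zero and one: neither value until measured.
    psi = (zero + one) / np.sqrt(2)

    # The squared amplitudes give the odds of reading out zero or one.
    print(np.abs(psi) ** 2)  # [0.5 0.5]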
There are plenty of applications where quantum computing can be helpful: cybersecurity, financial modeling, artificial intelligence, medical research, even predicting the weather. There is just a lot of potential utility in quantum computing development.
However, this potential utility comes with severe risks and challenges.
Quantum computers have several notable problems. As with any revolutionary technology, there are specific challenges and hurdles to clear while the technology is still in its infancy.
The main problems for quantum computers are both structural and operational. They can be summarized simply: quantum computers are hard to engineer, construct, and program to run without significant errors.
Another big problem for quantum computers right now is their price: they are far more expensive than the classical computers that people and businesses use to perform everyday tasks.
However, the biggest problem faced by quantum computers is a loss of coherence when making quantum calculations (aka decoherence). Decoherence means that these highly advanced computers are sensitive to factors such as vibrations, the temperature of their working environment, and any electromagnetic waves, making them operate less efficiently than they are designed to.
Simply speaking, they're extremely delicate.
Thus, the cost of staying competitive and innovative is incredibly high, which at this moment makes the company difficult to evaluate.
The market a company operates in, and that sector's dynamics, are of paramount importance for its financial performance and business excellence. The quantum computing market is expected to grow to $8.6 billion in 2027, from only $412 million in 2020. That is a compound annual growth rate of 50.9% over the forecast period.
IonQ defines itself as a leader in quantum computing. However, there are several factors that raise concerns about its stock now. There are at least four main fundamental worries I want to mention.
To start, IonQ in its third-quarter 2021 financial results reported revenue of a measly $451,000 year-to-date. For a company with a market capitalization of $4 billion as of Dec. 9, the revenue reported is not only negligible, but also reinforces how expensive IONQ stock is now.
Second, I consider the total contract value (TCV) bookings of $15.1 million year-to-date to be similarly insubstantial.
Third, IonQ is an unprofitable company with widening losses in Q3 2021 compared to the second quarter of 2021. For the nine months ended Sept. 30, 2021, IonQ reported a net loss of $32,102,000, whereas for Q2 2020 the net loss was about a third of that, at $10,493,000.
Fourth, IonQ's research and development, sales, and marketing costs rose significantly in Q3 2021, and as a result, the loss from operations was wider than in Q2 2020. The company's operational losses in Q3 2021 were $10,524,000, compared to a loss of $3,576,000 in Q2 2020.
I see no significant revenue, and yet IONQ stock has an incredibly high market capitalization. With net losses widening and operating expenses increasing, this quantum computing firm has not yet commercialized its technology effectively.
IONQ stock is a very high-risk play right now. IonQ is a firm with a technology that may seem revolutionary, but the expectations for high growth are not supported by its fundamentals.
On the date of publication, Stavros Georgiadis, CFA did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Stavros Georgiadis is a CFA charter holder, an Equity Research Analyst, and an Economist. He focuses on U.S. stocks and has his own stock market blog at thestockmarketontheinternet.com. He has written various articles for other publications in the past and can be reached on Twitter and on LinkedIn.
A career built on the strongest force in the universe – EurekAlert
Image: Latifa Elouadrhiri. Credit: DOE's Jefferson Lab
Latifa Elouadrhiri has spent her career pursuing a passion for experimental physics, investing nearly three decades in work at the Department of Energy's Thomas Jefferson National Accelerator Facility. She has also devoted herself to passing on her love of science to other women and underrepresented groups, including through conferences that encourage undergraduate women to pursue physics degrees and careers.
Such efforts and her numerous professional successes haven't gone unnoticed. Elouadrhiri was just presented with the 2021 Jesse W. Beams Research Award, which recognizes especially significant or meritorious research in physics that has earned the critical acclaim of peers from around the world. The award was established by the Southeastern Section of the American Physical Society (SESAPS) in 1973. Elouadrhiri is only the second woman to receive it.
"This is just a great honor for me and for the science we do," Elouadrhiri said. "Not just me: this award is also a recognition of the team of scientists, including the technical staff and the students, that started at Christopher Newport University and continues at Jefferson Lab."
Elouadrhiri first arrived at the lab in 1994 in a joint position with CNU. She joined the experimental hall staff in 2001, and today is senior staff scientist in Hall B.
Elouadrhiri and other experimentalists use the lab's powerful Continuous Electron Beam Accelerator Facility (CEBAF) to probe ever deeper into the proton that sits inside the atomic nucleus. CEBAF is a DOE user facility built to support research in nuclear physics.
In 2018, the CEBAF completed an upgrade that doubled its top design energy to 12 billion electron-volts, or 12 GeV, providing unequaled access to the mysterious elements of subatomic matter. The upgrade also enabled the experimental program in Hall B to be restructured with a novel detector called CLAS12. Elouadrhiri oversaw the full lifecycle of CLAS12 construction and commissioning.
That same year, Elouadrhiri and her team were lauded for achieving the first measurement of the pressure distribution inside the proton: a finding that the quarks that make up the proton are subject to a crushing pressure 10 times that in the heart of a neutron star. Their results were published in the journal Nature and opened up an entirely new direction of exploration in nuclear and particle physics.
In announcing the Beams award, the SESAPS selection committee cited Elouadrhiri's fundamental and lasting contributions to the development of experimental equipment in forefront nuclear science.
I followed my heart
Elouadrhiri's journey to Jefferson Lab was an unlikely one. She was born in Morocco, the sixth of eight children, to a mother who could neither read nor write but who believed in the power of education.
"She had never been to school, but she had a vision," said Elouadrhiri. "She could see far into the future and created the right environment for us, making education, particularly for women, central. She understood the importance of educating girls and finding our way, and supported us in anything we did."
Of the eight siblings, seven would go to college and on to such careers as diplomat, physician, college professor, computer engineer, artist, and economist.
Elouadrhiri took her first physics class in high school and was just fascinated by the topic. "It combines mathematics, science and also some philosophy: how the world works."
At age 15, at a local flea market, she acquired her first physics book: a work by Werner Heisenberg, a German physicist and 1932 Nobel laureate responsible for the namesake Heisenberg's uncertainty principle of quantum mechanics.
"And I was hooked," Elouadrhiri said. Since then, she said, that book travels with her everywhere.
She earned her undergraduate degree and then a master's in theoretical physics at Mohammed V University of Rabat. She moved to France to continue her studies toward a Ph.D., conducting experiments at the Saclay Nuclear Research Centre and also the Paul Scherrer Institute in Switzerland. She was accepted at the University of Massachusetts Amherst for her first postdoctoral position.
It was during an American Physical Society meeting that her work caught the attention of Nathan Isgur, then chief scientist at Jefferson Lab when it was still known simply as CEBAF. Isgur invited her to give a seminar on her research, then suggested she apply for the joint JLab/CNU position.
She was offered that position at the same time another offer for a permanent position came from the prestigious French National Centre for Scientific Research (CNRS).
"I just followed my heart," Elouadrhiri said. "My heart told me that I should stay here."
The Beams award, she said, is "a big recognition for the science that we do. It really inspires me and motivates me to further develop experimental techniques toward understanding the way protons and neutrons, which are the building blocks of all atomic nuclei, are held together by the strong force. And with the CLAS12 science program we will be building a deeper understanding of these forces."
"Personally, this award now helps in sharing my love of scientific learning with women throughout the world, and also continuing my work in broadening scientific participation across genders, ethnicities, religions, cultures and geographies. I'm very excited."
Further Reading: Hall B Staff Bios - Latifa Elouadrhiri; Moroccan Physicist Latifa Elouadrhiri Makes Ground-Breaking Nuclear Physics Discovery; W&M, Jefferson Lab host conference to support women undergrads in physics; Quarks Feel the Pressure in the Proton
By Tamara Dietrich
Jefferson Science Associates, LLC, operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy's Office of Science.
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
The future of scientific research is quantum – The Next Web
Over the past few years, the capabilities of quantum computers have reached the stage where they can be used to pursue research with widespread technological impact. Through their research, the Q4Q team at the University of Southern California, University of North Texas, and Central Michigan University explores how software and algorithms designed for the latest quantum computing technologies can be adapted to suit the needs of applied sciences. In a collaborative project, the Q4Q team sets out a roadmap for bringing accessible, user-friendly quantum computing into fields ranging from materials science to pharmaceutical drug development.
Since it first emerged in the 1980s, the field of quantum computing has promised to transform the ways in which we process information. The technology is centered on the fact that quantum particles such as electrons exist in superpositions of states. Quantum mechanics also dictates that particles will only collapse into one single measurable state when observed by a user. By harnessing these unique properties, physicists discovered that batches of quantum particles can act as more advanced counterparts to conventional binary bits which only exist in one of two possible states (on or off) at a given time.
On classical computers, we write and process information in a binary form. Namely, the basic unit of information is a bit, which takes on the logical binary values 0 or 1. Similarly, quantum bits (also known as qubits) are the native information carriers on quantum computers. Much like bits, we read binary outcomes of qubits, that is 0 or 1 for each qubit.
However, in stark contrast to bits, we can encode information on a qubit in the form of a superposition of the logical values 0 and 1. This means that we can encode much more information in a qubit than in a bit. In addition, when we have a collection of qubits, the principle of superposition leads to computational states that can encode correlations among the qubits, which are stronger than any type of correlation achievable within a collection of bits. Superposition and strong quantum correlations are, arguably, the foundations on which quantum computers rely to provide faster processing speeds than their classical counterparts.
To realize computations, qubit states can be used in quantum logic gates, which perform operations on qubits, thus transforming the input state according to a programmed algorithm. This is a paradigm for quantum computation analogous to that of conventional computers. In 1998, both qubits and quantum logic gates were realized experimentally for the first time, bringing the previously theoretical concept of quantum computing into the real world.
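As an illustration of what operating on qubits with gates means in practice, here is a small Python/NumPy sketch (not from the original article; purely illustrative) in which a Hadamard gate and a controlled-NOT gate turn two qubits into an entangled pair whose readouts are perfectly correlated:

    # Illustrative sketch: two quantum logic gates producing an entangled state.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])               # controlled-NOT gate

    state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits start as 0
    state = np.kron(H, I) @ state                  # superpose the first qubit
    state = CNOT @ state                           # correlate it with the second

    print(np.round(np.abs(state) ** 2, 3))  # [0.5 0. 0. 0.5]: only 00 or 11 is ever read out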
From this basis, researchers then began to develop new software and algorithms, specially designed for operations using qubits. At the time, however, the widespread adoption of these techniques in everyday applications still seemed a long way off. The heart of the issue lay in the errors that are inevitably introduced to quantum systems by their surrounding environments. If uncorrected, these errors can cause qubits to lose their quantum information, rendering computations completely useless. Many studies at the time aimed to develop ways to correct these errors, but the processes they came up with were invariably costly and time-consuming.
Unfortunately, the risk of introducing errors to quantum computations increases drastically as more qubits are added to a system. For over a decade after the initial experimental realization of qubits and quantum logic gates, this meant that quantum computers showed little promise in rivalling the capabilities of their conventional counterparts.
In addition, quantum computing was largely limited to specialized research labs, meaning that many research groups that could have benefited from the technology were unable to access it.
While error correction remains a hurdle, the technology has since moved beyond specialized research labs, becoming accessible to more users. This occurred for the first time in 2011, when the first quantum annealer was commercialized. With this event, feasible routes emerged towards reliable quantum processors containing thousands of qubits capable of useful computations.
Quantum annealing is an advanced technique for obtaining optimal solutions to complex mathematical problems. It is a quantum computation paradigm that serves as an alternative to operating on qubits with quantum logic gates.
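The kind of problem an annealer targets is commonly written as a quadratic unconstrained binary optimization (QUBO). As a rough illustration (not drawn from the article; the coefficients are made up), here is a toy QUBO solved by brute force, which is exactly the search an annealer is meant to perform far more efficiently at scale:

    # Illustrative sketch: a toy QUBO problem of the kind quantum annealers solve.
    from itertools import product

    # Hypothetical QUBO coefficients: minimize the weighted sum over binary variables.
    Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
         (0, 1): 2.0, (1, 2): 2.0}

    def energy(x):
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    best = min(product([0, 1], repeat=3), key=energy)
    print(best, energy(best))  # (1, 0, 1) with energy -2.0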
The availability of commercial quantum annealers spurred a new surge of interest in quantum computing, with consequent technological progress, especially fueled by industrial capital. In 2016, this culminated in the development of a new cloud system based on quantum logic gates, which enabled owners and users of quantum computers around the world to pool their resources together, expanding the use of the devices outside of specialized research labs. Before long, the widespread use of quantum software and algorithms for specific research scenarios began to look increasingly realistic.
At the time, however, the technology still required high levels of expertise to operate. Without specific knowledge of the quantum processes involved, researchers in fields such as biology, chemistry, materials science, and drug development could not make full use of them. Further progress would be needed before the advantages of quantum computing could be widely applied outside the field of quantum mechanics itself.
Now, the Q4Q team aims to build on these previous advances using user-friendly quantum algorithms and software packages to realize quantum simulations of physical systems. Where the deeply complex properties of these systems are incredibly difficult to recreate within conventional computers, there is now hope that this could be achieved using large systems of qubits.
To recreate the technologies that could realistically become widely available in the near future, the team's experiments will incorporate noisy intermediate-scale quantum (NISQ) devices, which contain relatively large numbers of qubits and are by themselves prone to environmental errors.
In their projects, the Q4Q team identifies three particular aspects of molecules and solid materials that could be better explored through the techniques they aim to develop. The first of these concerns the band structures of solids, which describe the range of energy levels that electrons can occupy within a solid, as well as the energies they are forbidden from possessing (a toy illustration of this idea follows below).
Secondly, they aim to describe the vibrations and electronic properties of individual molecules, each of which can heavily influence their physical properties. Finally, the researchers will explore how certain aspects of quantum annealing can be exploited to realize machine-learning algorithms, which automatically improve through their experience of processing data.
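To give a feel for the first of those aspects, here is a minimal Python sketch (not from the team's work; the parameters are made up) of the simplest textbook band structure, a one-dimensional tight-binding chain, in which the allowed electron energies form a single band and everything outside it is forbidden:

    # Illustrative sketch: the energy band of a 1D tight-binding chain, E(k) = -2t*cos(k*a).
    import numpy as np

    t, a = 1.0, 1.0                              # hypothetical hopping energy and lattice spacing
    k = np.linspace(-np.pi / a, np.pi / a, 201)  # wave numbers across the first Brillouin zone
    E = -2.0 * t * np.cos(k * a)                 # allowed energies

    print(E.min(), E.max())  # -2.0 2.0: energies outside this band are forbidden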
As they apply these techniques, the Q4Q team predicts that their findings will lead to a better knowledge of the quantum properties of both molecules and solid materials. In particular, they hope to provide better descriptions of periodic solids, whose constituent atoms are arranged in reliably repeating patterns.
Previously, researchers struggled to reproduce the wavefunctions of interacting quantum particles within these materials, which relate to the probability of finding the particles in particular positions when observed by a user. Through their techniques, the Q4Q team aims to reduce the number of qubits required to capture these wavefunctions, leading to more realistic quantum simulations of the solid materials.
Elsewhere, the Q4Q team will account for the often deeply complex quantum properties of individual molecules made up of large groups of atoms. During chemical reactions, any changes taking place within these molecules will be strongly driven by quantum processes, which are still poorly understood. By developing plugins to existing quantum software, the team hopes to accurately recreate this quantum chemistry in simulated reactions.
If they are successful in reaching these goals, the results of their work could open up many new avenues of research within a diverse array of fields, especially where the effects of quantum mechanics have not yet been widely considered. In particular, they will also contribute to identifying bottlenecks of current quantum processing units, which will aid the design of better quantum computers.
Perhaps most generally, the Q4Q team hopes that their techniques will enable researchers to better understand how matter responds to external perturbations, such as lasers and other light sources.
Elsewhere, widely accessible quantum software could become immensely useful in the design of new pharmaceutical drugs, as well as new fertilizers. By ascertaining how reactions between organic and biological molecules unfold within simulations, researchers could engineer molecular structures that are specifically tailored to treating certain medical conditions.
The ability to simulate these reactions could also lead to new advances in the field of biology as a whole, where processes involving large, deeply complex molecules, including proteins and nucleic acids, are critical to the function of every living organism.
Finally, a better knowledge of the vibrational and electronic properties of periodic solids could transform the field of materials physics. By precisely engineering structures to display certain physical properties on macroscopic scales, researchers could tailor new materials with a vast array of desirable characteristics: including durability, advanced interaction with light, and environmental sustainability.
If the impacts of the team's proposed research goals are as transformative as they hope, researchers in many different fields of technological endeavor could soon be working with quantum technologies.
Such a clear shift away from traditional research practices could in turn create many new jobs whose required skill sets include the use of cutting-edge quantum software and algorithms. Therefore, a key element of the team's activity is to develop new strategies for training future generations of researchers. Members of the Q4Q team believe that this will present some of the clearest routes yet towards the widespread application of quantum computing in our everyday lives.
This article was authored by the Q4Q team, consisting of lead investigator Rosa Di Felice, Anna Krylov, Marco Fornari, Marco Buongiorno Nardelli, Itay Hen and Amir Kalev, in Scientia. Learn more about the team, and find the original article here.
When science mixes with politics, all we get is politics – Big Think
"To be ignorant of causes is to be frustrated in action." So wrote Francis Bacon, counsel to Queen Elizabeth I of England and a key architect of the scientific method. In other words, do not base your actions on ignorance, or your actions will fail and cause damage. Bacon proposed that careful observation of natural phenomena, combined with experimentation and data collection and analysis, could be used to obtain knowledge of the mechanisms of nature. His method, known as the inductive method, looked at particulars (e.g., observations) to achieve the general (e.g., laws). When the Royal Society was founded in 1660 under King Charles II, Bacon's ideas were taken as the guiding principles of natural philosophy (the old name for science).
The method is not foolproof. No scientific theory based on the inductive method can be equated with the final truth on a subject. But, and this is an enormous but, the method is incredibly efficient at gathering evidence that is then used to formulate general principles that describe the operations of the natural world. Once vetted by the scientific community, scientific knowledge is the only way to develop technological applications that will serve society, from antibiotics and vaccines to cell phones and electric cars.
The only reason you step into an airplane with confidence is because, knowing it or not, you trust science. You trust the hydrodynamics used to design wings, you trust the chemical physics of combustion, and you trust the guidance system, an incredibly complex system that involves radar, GPS, intricate electromagnetic circuitry, and even the Theory of Relativity, to achieve amazing levels of precision navigation. You trust the expert, the pilot, who has training in the operation of the airplane and its instrumentation.
The paradox of our age is that although we live in a world that depends in essential ways on science and its technological applications, the credibility of science and of scientists is being questioned by people with no expertise whatsoever in science or how it works. This is not just about silly attacks on social media. It is about questioning knowledge that is painstakingly obtained through years of hard work and study, only to superficially decide that this knowledge is wrong or, worse, manipulative. How did we get ourselves into this mess?
After the Second World War, scientists enjoyed an all-time high in public perception. The technological inventions that decided the outcome of the war depended heavily on cutting-edge science: quantum and nuclear physics, radar, computers and code-breaking, effective explosives, aeronautical technology, faster planes and ships, and deeper-diving submarines. The list goes on. There was an intensified alliance between science and the State, which has been present in Western history since Greek times: think of Archimedes and his catapults and fire-inducing mirrors, applied to protect Syracuse from Roman invaders.
The Cold War amplified this prestige, and defense support has sustained a large part of the scientific research budget. There was also an understanding that basic science is the cornerstone of technological innovation, so that even more abstract topics were worthy of funding.
As science advanced, it also became more technical, complicated, and arcane, moving farther away from general understanding. Quantum physics, genetics, biochemistry, AI, and machine learning are all part of our everyday life, even if few know much about any of these fields. Even the experts are siloed inside their research areas. Specialization is how new knowledge is produced, given the enormous amount of detail within each subfield. An astrophysicist who specializes in black holes knows practically nothing about the physics of graphene or quantum optics. Specialization has a dual role: It strengthens its own subfield but weakens the global understanding of a question. Specialization makes it harder for scientists to be a public voice for their fields in ways that are engaging to the general public.
To complicate things, the relationship between science and society changed. Beginning roughly in the 1960s, scientists started to use their findings to caution people and governments about the dangers of certain products or of unchecked industrialization and population growth. Cigarettes are bad for you. There will be a shortage of energy and water as more and more humans fill up the world. Climate change is going to create hell on Earth. Plastics are evil. Pollution of waterways, oceans, and the atmosphere will make people sick, kill animals, and destroy natural resources. Meanwhile, we, as a species, even if we claim to be the most intelligent on this planet, cannot act collectively to change what we are doing to our own environment.
These discoveries (some of them predating the 1960s by decades) were inconvenient to many. They were inconvenient to the tobacco industry, the auto industry, the fossil fuel industry, and the chemical industry. So, scientists, the darlings of the 1950s, became the harbingers of annoying news, threatening peoples way of life and the profitability of large sectors of the economy. They had to be stopped!
Scientists sounded the alarm, denouncing how the tobacco and fossil fuel industries developed a corrosive strategy to undermine science's credibility, attacking scientists as opportunists and manipulators. Politicians aligned with these industries jumped in, and a campaign to politicize science took over the headlines. Scientific knowledge became a matter of opinion, something that Francis Bacon fought against almost 400 years ago. The media helped, often giving equal weight to the opinion of the vast majority of scientists and to the opinion of a small contrarian group, confusing the general public to no end. The growth of social media compounded the damage, as individuals with no or little scientific training jumped in, ready to make a name for themselves as defenders of freedom and liberty, conflating lies with the American ideal of individual freedom.
The results, not surprisingly, have been catastrophic. From Flat-Earthers to antivaxxers to climate deniers, scientific authority and knowledge became a free-for-all, a matter of individual opinion aligned with political views, often sponsored by corporate interest groups and opportunist politicians.
To get out of this mess will take a tremendous amount of work, especially from the scientific community, the media, and educators. Science needs more popular voices, people who have a gift for explaining to the general public how and why science works. Scientists need to visit more schools and talk to the children about what they do. Educators need to reenergize the science curriculum to reflect the realities of our world, inviting more scientists to visit classes and telling more stories about scientists that are engaging to students, humanizing science in the process.
Historians often say that history swings back and forth like a pendulum. Let's make sure that we do not allow the pendulum of scientific knowledge to swing back to the obscurantism of centuries past, when the few with power and means controlled the vast majority of the population by keeping them in ignorance and manipulating them with fear.
I wrote the book on warp drive. No, we didn’t accidentally create a warp bubble. – Big Think
In perhaps his most famous quip of all time, celebrated physicist Richard Feynman once remarked, when speaking about new discoveries, "The first principle is that you must not fool yourself, and you are the easiest person to fool." When you do science yourself, engaging in the process of research and inquiry, there are many ways you can become your own worst enemy. If you're the one proposing a new idea, you must avoid falling into the trap of becoming enamored with it; if you do, you run the risk of choosing to emphasize only the results that support it, while discounting the evidence that contradicts or refutes it.
Similarly, if you're an experimenter or observer who's become enamored with a particular explanation or interpretation of the data, you have to fight against your own biases concerning what you expect (or, worse, hope) the outcome of your labors will indicate. As the more familiar refrain goes, "If the only tool you have is a hammer, you tend to see every problem as a nail." It's part of why we demand, as part of the scientific process, independent, robust confirmation of every result, as well as the scrutiny of our scientific peers to ensure we're all doing our research properly and interpreting our results correctly.
Recently, former NASA engineer Harold "Sonny" White, famous (or infamous) for his previous dubious claims about physics-violating engines, has made a big splash, claiming to have created a real-life warp bubble: an essential step toward creating an actual warp drive, as made famous by Star Trek. But is this claim correct? Let's take a look.
Warp drive started off as a speculative idea. Rather than being bound by the limits of special relativity, where massive objects can only approach, but can never reach or exceed, the speed of light, warp drive recognized the novel possibility brought about by general relativity: that the fabric of space can be curved. In special relativity, we treat space as being indistinguishable from flat, which is an excellent approximation almost everywhere in the Universe. Only near extremely dense and massive objects do the effects of curved space typically become important. But if you can manipulate the matter and energy in the Universe properly, it's possible to cause space to curve in intricate, counterintuitive ways.
Just as you could take a flat sheet of paper and fold it, it should be possible, with enough matter and energy in the right configuration, to warp the fabric of space between any two points. If you warp space properly, the reasoning goes, you could potentially shorten the amount of space you need to traverse between any two points; all you'd need is the right amount of energy configured in the right way. For a long time, the theoretical solutions that shortened the journey from one point to another were limited to ideas like wormholes, Einstein-Rosen bridges, and black holes that connected to white holes at the other end. In all of these cases, however, there was an immediate problem: Any spacecraft traveling through these mechanisms would violently be torn apart by the irresistible gravitational forces.
But all of this changed in 1994, when physicist Miguel Alcubierre put forth a paper that showed how warp drive could be physically possible. Alcubierre recognized that the presence of matter and/or energy always led to positive spatial curvature, like the heavily curved space just outside a black hole's event horizon. However, negative spatial curvature would also be possible if, instead of matter and/or energy, we had some sort of negative-mass matter or negative energy. By playing around with these two ingredients, instead of just the usual one, Alcubierre stumbled upon an idea that was truly brilliant.
By manipulating large amounts of both positive and negative energy, Alcubierre showed how, without wormholes, a spaceship could travel through the fabric of space at an arbitrarily large speed: unbounded by the speed of light. The way this would work is that both types of energy, positive and negative, would be present in equal quantities, compressing the space in front of the spacecraft while simultaneously rarefying the space behind it by an equal amount. Meanwhile, the spacecraft itself would be encased in a warp bubble where space was indistinguishable from flat on the interior. This way, as the spacecraft and the bubble moved together, they would travel through the compressed space, shortening the journey.
One way to envision this is to imagine we wanted to travel to the TRAPPIST-1 system: a stellar system with a red dwarf star, containing at least seven Earth-sized planets in orbit around it. While the innermost planets are likely to be too hot, akin to Mercury, and the outermost planets are likely frozen over like Pluto, Triton, or Enceladus, some of the intermediate planets might yet be just right for habitability, and may possibly even be inhabited. The TRAPPIST-1 system is approximately 40 light-years away.
Without warp drive, you'd be limited by special relativity, which describes your motion through the fabric of space. If you traveled quickly enough, at, say, 99.992% the speed of light, you could make the journey to TRAPPIST-1 in just six months, from your perspective. If you looked around, assessed the planet, and then turned around and came home at precisely the same speed, 99.992% the speed of light, it would take you another six months to return. Those individuals aboard the spacecraft would experience only one year of time's passage, but back here at home, everyone else would have experienced the passage of 81 years.
When you're limited by the speed of light, this problem cannot be avoided: Even if you could travel arbitrarily close to the speed of light, slowing your own aging through time dilation and shortening your journey through length contraction, everyone back home continues to age at the normal rate. When everyone meets up again, the effects are dramatic.
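For readers who want to check those numbers themselves, here is a quick back-of-the-envelope calculation (not part of the original article) of the no-warp scenario, using the standard special-relativistic time-dilation factor:

    # Rough check of the quoted figures: a 40-light-year trip at 99.992% of light speed.
    import math

    d, v = 40.0, 0.99992                 # distance in light-years, speed as a fraction of c
    gamma = 1.0 / math.sqrt(1.0 - v**2)  # time-dilation factor, roughly 79
    earth_time = d / v                   # one-way travel time seen from home, ~40 years
    ship_time = earth_time / gamma       # one-way travel time aboard the ship, ~0.5 years

    print(round(gamma, 1), round(ship_time * 12, 1))  # ~79.1 and ~6.1 months each way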
With warp drive, however, this problem goes away almost entirely. The way that relativity works dictates that your passage through space and time are related: that the faster you move through space, the slower time passes for you, while remaining completely stationary in space causes time to pass at the maximum possible rate. By warping space itself, you can actually change it so that what was previously a 40-light-year journey in front of you might now appear as though it were only a 0.5-light-year journey. If you travel that distance, now, at 80% the speed of light, it still might take about six months to get to TRAPPIST-1. When you stop, turn around, and come back, with space warped again in your forward direction of motion, it again will take six months. All told, you'll have aged one year on your journey.
But this time, because of how you undertook your journey, someone back on Earth would still be older, but not by very much. Instead of witnessing you traveling through space at nearly the speed of light, a terrestrial observer would witness the space in front of your spacecraft be continually shrunk, while the space behind you would continually be expanded. You'd be moving through space, but the warping of space itself would far and away be the dominant effect. Everyone back at home would have aged about 1 year and 8 months, but (almost) everyone you knew and loved would still be alive. If we want to undertake interstellar journeys and not say a permanent goodbye to everyone at home, warp drive is the way to do it.
In 2017, I authored the book Treknology: The Science of Star Trek from Tricorders to Warp Drive, where I presented nearly 30 different technological advances envisioned by the Star Trek franchise. For each technology, I evaluated which ones had already been brought to fruition, which ones were on their way, which ones were still a ways off but were physically possible, and which ones would require something novel and presently speculative, as far as science was concerned, in order to become possible. Although there were only four such technologies that were currently impossible with our present understanding of physics, warp drive was one of them, as it required some type of negative mass or negative energy, which at present is purely speculative.
Today, however, it's recognized that what's needed isn't necessarily negative mass or negative energy; that was simply the way that Alcubierre recognized one could induce the needed opposite type of curvature to space from what normal mass or energy causes. However, there's another possibility for this that stems from a realization that didn't yet exist back in 1994, when Alcubierre first put his work forth: that the default amount of energy in space isn't zero, but some positive, non-zero, finite value. It wasn't until 1998 that the effects of this energy were first robustly seen, manifesting itself in the accelerated expansion of the Universe. We know this today as dark energy, and it's a form of energy intrinsic to the fabric of space itself.
Now, keep that in mind: There's a finite amount of energy to the fabric of space itself. In addition to that, there's a famous calculation that was done back in the 1940s, in the early days of quantum field theory, by Hendrik Casimir, that has remarkable implications. Normally, the quantum fields that govern the Universe, including the electromagnetic field, exist everywhere in space; they're intrinsic to it, and they cannot be removed. But if you set up certain boundary conditions (Casimir first envisioned two parallel, conducting plates as an example), certain modes of that field would be excluded; they had the wrong wavelength to fit between the plates.
As a result, the energy inherent to the space outside of the plates would be slightly greater than the energy inside the plates, causing them to attract. The effect wasn't experimentally confirmed until almost 50 years after it was proposed, when Steve Lamoreaux successfully did it, and the Casimir effect has now been calculated and measured for many systems and many configurations. It may be possible, with the proper configuration, to use the Casimir effect in a controlled fashion to substitute for Alcubierre's original idea of exotic matter that possessed some type of negative energy.
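To give a sense of scale, here is a quick estimate (not from the article) using the textbook result for two ideal parallel plates, whose attractive pressure is pi^2 * hbar * c / (240 * d^4) for a plate separation d; the one-micron gap below is simply an illustrative choice:

    # Illustrative estimate: the ideal parallel-plate Casimir pressure at a 1-micron gap.
    import math

    hbar = 1.054571817e-34  # reduced Planck constant, J*s
    c = 2.99792458e8        # speed of light, m/s
    d = 1e-6                # plate separation in meters (hypothetical value)

    pressure = math.pi**2 * hbar * c / (240 * d**4)
    print(pressure)  # ~1.3e-3 Pa: tiny, which is why confirmation took until Lamoreaux's experiment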
However, one must be careful: as stated earlier, it's easy to fool yourself. The Casimir effect isn't equivalent to a warp bubble. But in principle, it could be used to warp space in the negative fashion that would be needed to create one.
The article, thankfully published in the open-access (but often dubious) European Physical Journal C, is publicly available to anyone wishing to download it. (Link here.) Using micron-scale electrical conductors in a variety of shapes, including pillars, plates, spheres and other cavities, teams of researchers were able to generate electric potentials (or changes in voltage) of a few hundred microvolts, completely in line with what previous experiments and theoretical predictions both indicate. That's what the DARPA-funded project was for, and that's what the experimental research surrounding this idea accomplished in a custom Casimir cavity.
However, there's an enormous difference between what teams working on Casimir cavities do experimentally and the numerical calculations performed in this paper. That's right: This isn't an experimental paper, but rather a theoretical paper, one with a suspiciously low number (zero) of theoretical physicists on it. The paper relies on the dynamic vacuum model, a model typically applicable to single atoms, to model the energy density throughout space that would be generated by this cavity. They then use another technique, worldline numerics, to assess how the vacuum changes in response to the custom Casimir cavity.
And then it gets shady. Where's my warp bubble? They didn't make one. In fact, they didn't calculate one, either. All they did was show that the three-dimensional energy density generated by this cavity displayed some qualitative correlations with the energy density field required by the Alcubierre drive. They don't match in a quantitative sense; they were not generated experimentally, but only calculated numerically; and most importantly, they are restricted to microscopic scales and extremely low energy densities. There's a lot of speculation and conjecture, and all of it is unproven.
That isn't to say this might not be an interesting idea that might someday pan out. But the most generous thing I can say about it is this: it isn't fully baked. The most worrisome part, as a scientist familiar with Dr. White's grandiose claims surrounding physics-violating engines in the past, is that he's making new grand claims without adequate supporting evidence. He's going to be looking at tiny, low-power systems and attempting to make measurements right at the limit of what his equipment will be able to detect. And, in the very recent past, he has fooled himself (and many others) into believing a novel effect was present when, in fact, it was not. An error, where his team failed to account for the magnetic and electric fields generated by the wires powering his previous apparatus, was all he wound up measuring.
In science, the mindset made famous by The X-Files series, "I want to believe," is frequently the most dangerous one we can have. Science is not about what you hope is true; it's not about the way you'd like reality to be; it's not about what your gut tells you; and it's not about the patterns you can almost see when you ignore the quantitative details. At its core, science is about what is true in our reality, and what can be experimentally and/or observationally verified. Its predictions are reliable when you're using established theories within their established range of validity, and speculative the instant you venture beyond that.
As much as I'd love it if we had created a warp bubble in the lab, that simply isn't what happened here. A lack of appropriately healthy skepticism is how we wind up with scams and charlatans. As soon as you no longer bear the responsibility of rigorously testing and attempting to knock down your own hypotheses, you're committing the cardinal sin of any scientific investigation: engaging in motivated reasoning, rather than letting nature guide you to your conclusions. Warp drive remains an interesting possibility and one worthy of continued scientific investigation, but one that you should remain tremendously skeptical about given the current state of affairs.
Remember: The more you want something to be true, the more skeptical you need to be of it. Otherwise, you are already violating the first principle about not fooling yourself. When you want to believe, you already are the easiest person to fool.
Physicists have coaxed ultracold atoms into an elusive form of quantum matter – Science News Magazine
An elusive form of matter called a quantum spin liquid isn't a liquid, and it doesn't spin, but it sure is quantum.
Predicted nearly 50 years ago, quantum spin liquids have long evaded definitive detection in the laboratory. But now, a lattice of ultracold atoms held in place with lasers has shown hallmarks of the long-sought form of matter, researchers report in the Dec. 3 Science.
Quantum entanglement goes into overdrive in the newly fashioned material. Even atoms on opposite sides of the lattice share entanglement, or quantum links, meaning that the properties of distant atoms are correlated with one another. "It's very, very entangled," says physicist Giulia Semeghini of Harvard University, a coauthor of the new study. "If you pick any two points of your system, they are connected to each other through this huge entanglement." This strong, long-range entanglement could prove useful for building quantum computers, the researchers say.
The new material matches predictions for a quantum spin liquid, although its makeup strays a bit from conventional expectations. While the traditional idea of a quantum spin liquid relies on the quantum property of spin, which gives atoms magnetic fields, the new material is based on different atomic quirks.
A standard quantum spin liquid should arise among atoms whose spins are in conflict. Spin causes atoms to act as tiny magnets. Normally, at low temperatures, those atoms would align their magnetic poles in a regular pattern. For example, if one atom points up, its neighbors point down. But if atoms are arranged in a triangle, for example, each atom has two neighbors that themselves point in opposite directions. That arrangement leaves the third one with nowhere to turn: it can't oppose both of its neighbors at once.
So atoms in quantum spin liquids refuse to choose (SN: 9/21/21). Instead, the atoms wind up in a superposition, a quantum combination of spin up and down, and each atom's state is linked with those of its compatriots. The atoms are constantly fluctuating and never settle down into an orderly arrangement of spins, similarly to how atoms in a normal liquid are scattered about rather than arranged in a regularly repeating pattern, hence the name.
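The frustration at the heart of that conflict is easy to verify for yourself. The short Python sketch below (not from the study) checks every possible up/down assignment on a triangle and confirms that, with each atom trying to oppose its neighbors, at most two of the three pairs can ever be satisfied:

    # Illustrative sketch: geometric frustration on a triangle of "spins".
    from itertools import product

    bonds = [(0, 1), (1, 2), (0, 2)]  # the three pairs of neighbors

    best = 0
    for spins in product([+1, -1], repeat=3):
        satisfied = sum(spins[i] != spins[j] for i, j in bonds)
        best = max(best, satisfied)

    print(best)  # 2: no arrangement can anti-align all three pairs at once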
Conclusive evidence of quantum spin liquids has been hard to come by in solid materials. In the new study, the researchers took a different tack: They created an artificial material composed of 219 trapped rubidium atoms cooled to a temperature of around 10 microkelvins (about -273.15 degrees Celsius). The array of atoms, known as a programmable quantum simulator, allows scientists to fine-tune how atoms interact to investigate exotic forms of quantum matter.
In the new experiment, rather than the atoms' spins being in opposition, a different property created disagreement. The researchers used lasers to put the atoms into Rydberg states, meaning one of an atom's electrons is bumped to a very high energy level (SN: 8/29/16). If one atom is in a Rydberg state, its neighbors prefer not to be. That setup begets a Rydberg-or-not discord, analogous to the spin-up and -down battle in a traditional quantum spin liquid.
The scientists confirmed the quantum spin liquid effect by studying the properties of atoms that fell along loops traced through the material. According to quantum math, those atoms should have exhibited certain properties unique to quantum spin liquids. The results matched expectations for a quantum spin liquid and revealed that long-range entanglement was present.
Notably, the material's entanglement is topological. That means it is described by a branch of mathematics called topology, in which an object is defined by certain geometrical properties, for example, its number of holes (SN: 10/4/16). Topology can protect information from being destroyed: A bagel that falls off the counter will still have exactly one hole, for example. This information-preserving feature could be a boon to quantum computers, which must grapple with fragile, easily destroyed quantum information that makes calculations subject to mistakes (SN: 6/22/20).
Whether the material truly qualifies as a quantum spin liquid, despite not being based on spin, depends on your choice of language, says theoretical physicist Christopher Laumann of Boston University, who was not involved with the study. Some physicists use the term spin to describe other systems with two possible options, because it has the same mathematics as atomic spins that can point either up or down. "Words have meaning, until they don't," he quips. "It all depends how you spin them."
What if Einstein never existed? – Big Think
If you ask the average person to name one scientist from any time or place in history, one of the most common names you're likely to hear is Albert Einstein. The iconic physicist was responsible for a remarkable number of scientific advances during the 20th century, and perhaps single-handedly overthrew the Newtonian physics that had dominated scientific thought for more than 200 years. His most famous equation, E = mc², is so prolific that even people who don't know what it means can recite it. He won the Nobel Prize for advances in quantum physics. And his most successful idea, the general theory of relativity, our theory of gravity, remains undefeated in all tests more than 100 years after Einstein first proposed it.
But what if Einstein had never existed? Would others have come along and made precisely the same advances? Would those advances have come quickly, or would they have taken such a long time that some of them might not yet have occurred? Would it have taken a genius of equal magnitude to bring his great achievements to fruition? Or do we severely overestimate just how rare and unique Einstein was, elevating him to an undeserved position in our minds based on the fact that he was simply in the right place at the right time with the right set of skills? It's a fascinating question to explore. Let's dive in.
Einstein had what's known as his "miracle year" in 1905, when he published a series of papers that would go on to revolutionize a number of areas in physics. But just prior to that, a great number of advances had recently occurred that threw many long-held assumptions about the Universe into great doubt. For over 200 years, Isaac Newton had stood unchallenged in the realm of mechanics: both in the terrestrial and celestial realms. His law of universal gravitation applied just as well to objects in the Solar System as it did to balls rolling down a hill, or cannonballs fired from a cannon.
In the eyes of a Newtonian physicist, the Universe was deterministic. If you could write down the positions, momenta, and masses of every object in the Universe, you could calculate how each of them would evolve to arbitrary precision at any moment in time. Additionally, space and time were absolute entities, and the gravitational force traveled at infinite speed, with instantaneous effects. Throughout the 1800s, the science of electromagnetism was developed as well, uncovering intricate relationships between electric charges, currents, electric and magnetic fields, and even light itself. In many ways, it seemed that physics was almost solved, given the successes of Newton, Maxwell, and others.
Until, that is, it wasn't. There were puzzles that seemed to hint at something new in many different directions. The first discoveries of radioactivity had already taken place, and it was realized that mass was actually lost when certain atoms decayed. The momenta of the decaying particles didn't appear to match the momenta of the parent particles, indicating that either something wasn't conserved or that something unseen was present. Atoms were determined not to be fundamental, but made of positively charged atomic nuclei and discrete, negatively charged electrons.
But there were two challenges to Newton that seemed, somehow, more important than all of the others.
The first confusing observation was the orbit of Mercury. Whereas all of the other planets obeyed Newton's laws to the limits of our precision in measuring them, Mercury did not. Despite accounting for the precession of the equinoxes and the effects of the other planets, Mercury's orbit failed to match predictions by a minuscule but significant amount. The extra 43 arc-seconds-per-century of precession led many to hypothesize the existence of Vulcan, a planet interior to Mercury's orbit, but none was there to be discovered.
The second was perhaps even more puzzling: When objects moved close to the speed of light, they no longer obeyed Newton's equations of motion. If you were on a train at 100 miles per hour and threw a baseball at 100 miles per hour in the forward direction, the ball would move at 200 miles per hour. Intuitively, this is what you'd expect to occur, and also what does occur when you perform the experiment for yourself.
But if you're on a moving train and you shine a beam of light forward, backward, or any other direction, it always moves at the speed of light, regardless of how the train is moving. In fact, it's also true regardless of how quickly the observer watching the light is moving.
Moreover, if you're on a moving train and you throw a ball, but the train and ball are both traveling close to the speed of light, addition doesn't work the way we're used to. If the train moves at 60% the speed of light and you throw the ball forward at 60% the speed of light, it doesn't move at 120% the speed of light, but only at ~88% the speed of light. Although we were able to describe what's happening, we couldn't explain it. And that's where Einstein came onto the scene.
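(As an aside, that ~88% figure is easy to check for yourself with the relativistic velocity-addition formula, w = (u + v)/(1 + uv/c²); the short calculation below is ours, not the article's.)

    # Quick check of the quoted figure using relativistic velocity addition, with c = 1.
    u, v = 0.6, 0.6          # train and ball speeds, as fractions of the speed of light
    w = (u + v) / (1 + u * v)
    print(round(w, 3))       # 0.882: about 88% of light speed, not 120%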
Although it's difficult to condense the entirety of his achievements into even a single article, perhaps his most momentous discoveries and advances are as follows.
The equation E = mc²: When atoms decay, they lose mass. Where does that mass go if it's not conserved? Einstein had the answer: It gets converted into energy. Moreover, Einstein had the correct answer: It gets converted, specifically, into the amount of energy described by his famous equation, E = mc². It works the other way as well; we've since created masses in the form of matter-antimatter pairs from pure energy based on this equation. In every circumstance it's ever been tested under, E = mc² is a success.
Special Relativity: When objects move close to the speed of light, how do they behave? They move in a variety of counterintuitive ways, but all are described by the theory of special relativity. There is a speed limit to the Universe: the speed of light in a vacuum, at which all massless entities in a vacuum move precisely. If you have mass, you can never reach, but only approach that speed. The laws of special relativity dictate how objects moving near the speed of light accelerate, add or subtract in velocity, and how time dilates and lengths contract for them.
The photoelectric effect: When you shine direct sunlight on a piece of conducting metal, it can kick the most loosely held electrons off of it. If you increase the light's intensity, more electrons get kicked off, while if you decrease the light's intensity, fewer electrons get kicked off. But here's where it gets weird: Einstein discovered that it wasn't based on the light's total intensity, but on the intensity of light above a certain energy threshold. Only ultraviolet light would cause the ionization, not visible or infrared, regardless of the intensity. Einstein showed that light's energy was quantized into individual photons, and that the number of ionizing photons determined how many electrons got kicked off; nothing else would do it. (A rough worked estimate of that energy threshold follows this list.)
General relativity: This was the biggest, most hard-fought revolution of all: a new theory of gravity governing the Universe. Space and time were not absolute, but together formed a fabric through which all objects, including all forms of matter and energy, traveled. That spacetime curved and evolved owing to the presence and distribution of matter and energy, and the curved spacetime in turn told matter and energy how to move. When put to the test, Einstein's relativity succeeded where Newton's failed, explaining Mercury's orbit and predicting how starlight would be deflected during a solar eclipse. Since it was first proposed, general relativity has never been experimentally or observationally contradicted.
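In modern notation, the theory's content is summarized by the Einstein field equations, whose left-hand side encodes the curvature of spacetime and whose right-hand side encodes its matter and energy content:

\[
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]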
In addition to this, there were many other advances that Einstein played a major role in initiating. He explained Brownian motion, providing key evidence for the existence of atoms; he co-discovered the statistical rules that bosons obey; he contributed substantially to the foundations of quantum mechanics through the Einstein-Podolsky-Rosen paradox; and he arguably invented the idea of wormholes through the Einstein-Rosen bridge. His career of scientific contributions was truly legendary.
And yet, there are many reasons to believe that, despite Einstein's unparalleled career, the full suite of advances he made would have been made by others in very short order without him. It's impossible to know for certain, but for all that we laud Einstein's genius and hold him up as a singular example of how one incredible mind can change our conception of the Universe (which he, in fact, did), pretty much everything that occurred on account of Einstein would likely have occurred without him.
Prior to Einstein, back in the 1880s, physicist J.J. Thomson, discoverer of the electron, began thinking that the electric and magnetic fields of a moving, charged particle must carry energy with them, and he attempted to quantify the amount of that energy. It was complicated, but a simplified set of assumptions allowed Oliver Heaviside to make a calculation: He determined that the effective mass a charged particle carried was proportional to the electric field energy (E) divided by the speed of light (c) squared. Heaviside's 1889 calculation contained a proportionality constant of 4/3 rather than the true value of 1, as would Fritz Hasenöhrl's calculations in 1904 and 1905. Henri Poincaré independently arrived at E = mc² in 1900, but didn't understand the implications of his derivation.
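Put side by side, the pre-Einstein results already had the right form and only the wrong coefficient:

\[
m_{\text{Heaviside (1889)}} = \frac{4}{3}\,\frac{E}{c^{2}}
\qquad \text{versus} \qquad
m = \frac{E}{c^{2}}
\]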
Even without Einstein, then, physics was already perilously close to his most famous equation; it seems unrealistic to think we wouldn't have gotten the rest of the way there in short order had he never come along.
Similarly, we were already extremely close to special relativity. The Michelson-Morley experiment had demonstrated that light always moves at a constant speed, and it had disproven the most popular aether models. Hendrik Lorentz had already uncovered the transformation equations that determined how velocities added and how time dilated, and, independently of George FitzGerald, had determined how lengths contracted in the direction of motion. In many ways, these were the building blocks that led Einstein to develop the theory of special relativity; it was Einstein, however, who put them all together. Again, it's difficult to imagine that Lorentz, Poincaré, and others working at the interface of electromagnetism and the speed of light wouldn't have taken similar leaps to arrive at the same profound conclusion. Even without Einstein, we were already so close.
Max Planck's work with light set the stage for the explanation of the photoelectric effect; it surely would have come about with or without Einstein.
Fermi and Dirac worked out the statistics for fermions (the other type of particle, besides bosons), while it was Satyendra Bose who worked them out for the particles that bear his name; Einstein was merely the recipient of Bose's correspondence.
Quantum mechanics, arguably, would have developed just as well in the absence of Einstein.
But general relativity is the big one. With special relativity already under his belt, Einstein set about folding in gravity. While Einstein's equivalence principle (the realization that gravitation causes an acceleration, and that the two are indistinguishable to the observer experiencing them) is what led him there, with Einstein himself calling it his "happiest thought," one that left him unable to sleep for three days, others were thinking along the same lines.
Of all the advances that Einstein made, this was the one on which his peers were furthest behind when he put it forth. Still, while it might have taken many years or even decades, the fact that others were already beginning to think along the same lines leads us to believe that even if Einstein had never existed, general relativity would eventually have fallen into the realm of human knowledge.
We typically have a narrative about how science advances: one individual, through a sheer stroke of genius, spots the key advance or way of thinking that everyone else has missed. Without that one individual, the story goes, humanity would never have gained that remarkable knowledge.
But when we examine the situation in greater detail, we find that many individuals were often nipping at the heels of that discovery just before it was made. In fact, when we look back through history, we find that many people had similar realizations to one another at about the same time. Alexei Starobinskii put many of the pieces of inflation together before Alan Guth did; Georges Lemaître and Howard Robertson put together the expanding Universe before Hubble did; and Sin-Itiro Tomonaga worked out the calculations of quantum electrodynamics before Julian Schwinger and Richard Feynman did.
Einstein was the first to cross the finish line on a number of independent and remarkable scientific fronts. But had he never come along, many others were close behind him. Although he may have possessed every bit of dazzling genius that we often attribute to him, one thing is almost certain: Genius is not as unique and rare as we often assume it to be. With a lot of hard work and a little luck, almost any properly trained scientist can make a revolutionary breakthrough simply by stumbling upon the right realization at the right time.
The rest is here:
Air Liquide Enters a Long-term Partnership to Secure Its Supply of Helium-3 – Business Wire
PARIS--(BUSINESS WIRE)--Regulatory News:
Air Liquide (Paris:AI) has entered into a long-term agreement with Laurentis Energy Partners, a leader in the clean-energy industry, to produce and distribute helium-3 (³He), a rare isotope of helium used in quantum computing, quantum science, astrophysics, neutron detection, medical imaging and, in the future, fusion. Thanks to this new partnership, Air Liquide will be able to deliver large quantities of helium-3 to its customers around the world.
Laurentis Energy Partners will extract helium-3 as a by-product of the energy produced by the Darlington power generating station in Canada. As an expert in gas management and extreme cryogenics, Air Liquide will purify the molecule, then package and distribute it to its customers globally. Production will start by the end of this year.
Helium-3 is a very rare, stable isotope of helium. Compared to the most common isotope (helium-4), helium-3 has unique physical properties, such as a lower liquefaction temperature. These give helium-3 a wide range of applications, from neutron detection in security gates to deep science, where helium-3 makes it possible to reach temperatures close to absolute zero. In particular, helium-3 is necessary to produce the ultra-low temperatures required by quantum computers, which harness quantum physics to tackle certain problems far faster than classical computers. The promise of quantum computing is accelerated development in many fields, including the search for new drugs, the discovery of new materials, and even cyber defense. As a medical isotope, helium-3 can also be used to produce highly detailed magnetic resonance imaging (MRI) of the airways in the lung.
Emilie Mouren-Renouard, Member of the Air Liquide Executive Committee, in charge of Innovation, Digital and IT, Intellectual Property and the Global Markets & Technologies World Business Unit, said: "One year after the acquisition of CryoConcept, which specializes in technologies for reaching very low temperatures (close to absolute zero), this major agreement reinforces our core competencies in extreme cryogenics and deep tech, and illustrates our ambition to push back the frontiers of science. In particular, it enables us to offer our customers an ever more comprehensive range of products and services, including the rare gas helium-3. It also confirms the willingness of Air Liquide to take part in the growing market driven by the quantum revolution and to serve customers by leveraging its expertise in the field of ultra-low temperatures."
Global Markets & Technologies
The GM&T World Business Unit delivers technological solutions - molecules, equipment and services - to support the markets of the energy transition and deep tech, in order to drive Air Liquide's sustainable growth. GM&T employs 2,200 people worldwide and generated revenue of 579 million euros in 2020.
A world leader in gases, technologies and services for Industry and Health, Air Liquide is present in 78 countries with approximately 64,500 employees and serves more than 3.8 million customers and patients. Oxygen, nitrogen and hydrogen are essential small molecules for life, matter and energy. They embody Air Liquide's scientific territory and have been at the core of the company's activities since its creation in 1902.
Air Liquide's ambition is to be a leader in its industry, deliver long-term performance and contribute to sustainability - with a strong commitment to climate change and the energy transition at the heart of its strategy. The company's customer-centric transformation strategy aims at profitable, regular and responsible growth over the long term. It relies on operational excellence, selective investments, open innovation and a network organization implemented by the Group worldwide. Through the commitment and inventiveness of its people, Air Liquide leverages the energy and environment transition, changes in healthcare and digitization, and delivers greater value to all its stakeholders.
Air Liquide's revenue amounted to more than 20 billion euros in 2020. Air Liquide is listed on the Euronext Paris stock exchange (compartment A) and belongs to the CAC 40, EURO STOXX 50 and FTSE4Good indexes.
http://www.airliquide.com Follow us on Twitter @airliquidegroup
View original post here:
Air Liquide Enters a Long-term Partnership to Secure Its Supply of Helium-3 - Business Wire
Fujitsu SD-WAN and ISS are first users of quantum security – Capacity Media
Alan Burkitt-Gray
Fujitsu and a company working with the International Space Station have been named among the first users of Quantum Origin, what's claimed to be the world's first commercial product built using quantum computers.
Cambridge Quantum, which is now part of the US-UK group Quantinuum, says it can fit quantum-level security to existing networks, including software-defined wide area networks (SD-WANs) from Fujitsu, which has incorporated the technology into its products.
Duncan Jones, head of cyber security at Cambridge Quantum, said last night: "We are kick-starting the quantum cyber security industry." He said the company will start to distribute "[quantum] keys into cloud platforms".
Houtan Houshmand, principal architect at Fujitsu, said his company was planning to incorporate the technology into its SD-WAN products.
David Zuniga, business development manager at Axiom Space, said the technology has been tested on the International Space Station (ISS) and would lead to space tourism with "researchers and scientists [who] could do their work in space with total security".
Cambridge Quantum founder and Quantinuum CEO Ilyas Khan said: "This product could be used by anyone."
He said it should be used by organisations worried about the threat from people sequestering data: storing encrypted information now, for the time when quantum computers will be available to decrypt it.
"You cannot afford to be asleep at the wheel," said Khan. "When should we be worried? Of course, now." He said existing classical systems could be protected by a quantum computer.
Jones said that the typical Quantum Origin end point might be a hardware security module that could be added to existing infrastructure. For large enterprises, adding this might take a year or two, he said; smaller businesses were slightly further out.
On prices, he said that a typical key using existing technology costs about US$1 a month. He implied that a Quantum Origin key would be cheaper but did not go into details.
Fujitsu's Houshmand was also asked about pricing. "I can't provide a cost," he said, adding that what Fujitsu has done so far is just a proof of concept.
Jones said that Quantinuum, which is a joint venture of Cambridge Quantum and Honeywell, is forming a number of partnerships, naming military supplier Thales and public key infrastructure (PKI) specialist Keyfactor. This, he said, is how the technology will diffuse into the market.
He said: "We want to make this product broadly available," but accepted that there were global security considerations. "There are export control laws. We have to do a lot of due diligence."
Zuniga at Axiom Space, which is training its own crew for the ISS and is planning its own private space station, said that the US operating segment of the ISS, where Quantum Origin is to be used, has a firewall "to keep our data secure from the Russian sector. If we can't secure our data, it hurts a really expensive asset that's floating in space."
Khan, asked about possible exports to China and Russia, said: "We are answerable to the regulators. We are an American and a British company. We're not actually able to sell to adversaries."
Houshmand at Fujitsu agreed: "We have to stay rigidly compliant."
Elaborating on the technology, Jones said: "Quantum Origin is a cloud-based platform that uses a quantum computer from Quantinuum to produce cryptographic keys."
He was asked whether companies had five years, as is often suggested, to install quantum-level protection for their data. "They're wrong by about five years," he said.
Jones said Quantum Origin keys are the strongest that have ever been created, or could ever be created, because they use quantum physics to produce truly random numbers.
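Why the randomness source matters can be illustrated with a minimal Python sketch. This is purely illustrative and is not Quantum Origin's actual mechanism or API: the key-derivation step and the entropy sources shown here are stand-ins, and a quantum-randomness service would aim to supply entropy that is unpredictable in principle rather than merely in practice.

# Illustrative sketch only -- NOT Quantum Origin's API or mechanism.
# It shows why the strength of a cryptographic key depends on the
# unpredictability of the entropy that feeds it.
import hashlib
import random
import secrets

def derive_key(entropy: bytes, context: bytes = b"illustrative-context") -> bytes:
    # Toy key derivation: hash the entropy together with a context label.
    # A real deployment would use a vetted KDF and authenticated delivery
    # (e.g. via a hardware security module, as mentioned above).
    return hashlib.sha256(entropy + context).digest()

# Weak: a deterministic PRNG seeded with a guessable value.
# Anyone who guesses the seed can regenerate this key exactly.
weak_entropy = random.Random(1234).randbytes(32)
weak_key = derive_key(weak_entropy)

# Stronger: entropy drawn from the operating system's CSPRNG.
# A quantum-derived source aims even higher: randomness that is
# unpredictable as a matter of physics, not just computational difficulty.
strong_entropy = secrets.token_bytes(32)
strong_key = derive_key(strong_entropy)

print("weak key:  ", weak_key.hex())
print("strong key:", strong_key.hex())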
Khan noted that the beta version of Quantum Origin has been tested on an IBM quantum network.
Quantinuum and Cambridge Quantum have a number of clients that have tested the technology, but they are operating under a non-disclosure agreement (NDA), said Khan.
"We have been working for a number of years now on a method to efficiently and effectively use the unique features of quantum computers in order to provide our customers with a defence against adversaries and criminals, now and in the future once quantum computers are prevalent," he said.
He added: "Quantum Origin gives us the ability to be safe from the most sophisticated and powerful threats today, as well as threats from quantum computers in the future."
Jones said: "When we talk about protecting systems using quantum-powered technologies, we're not just talking about protecting them from future threats. From large-scale takedowns of organisations, to nation-state hackers and the worrying potential of 'hack now, decrypt later' attacks, the threats are very real today, and very much here to stay. Responsible enterprises need to deploy every defence possible to ensure maximum protection at the encryption level, today and tomorrow."
A quantum of disruption: Capacity's feature about quantum technology, its threat to data security and what it is doing to protect security, is here
Originally posted here:
Fujitsu SD-WAN and ISS are first users of quantum security - Capacity Media
UNSW researcher honoured for outreach in the physics community – UNSW Newsroom
Scientia Professor Andrea Morello of UNSW Engineering has been recognised for his outstanding outreach work in physics by the Australian Institute of Physics New South Wales (AIP NSW).
Prof. Morello is a renowned international leader in the field of quantum computing and has led the development and launch of the world's first bachelor's degree in Quantum Engineering at UNSW Sydney.
Now in its eighth year, the AIP NSW Community Outreach to Physics Award is presented to individuals whose activities engage the public and contribute to participation in the physics community.
Prof. Morello's outreach achievements include a popular YouTube channel, contributions to science initiatives for students, and artistic collaborations.
His YouTube video series explaining quantum computing, how quantum computers are built, and quantum phenomena in everyday life has attracted over 10 million views.
Prof. Morello has contributed to several popular science initiatives to engage students and younger audiences, including the National Youth Science Forum and the World Science Festival, as well as being featured in the Australian Broadcasting Corporation's Science elevator pitch series.
"I am truly honoured by this award. As much as I love basic research, pushing the boundaries of human knowledge isn't worth much if I don't share it with the public," Prof. Morello said.
"I have been fortunate to have, over the years, the opportunity to interact with many outstanding science communicators, who have involved me in their activities, and inspired me to work on outreach myself."
In a ceremony on Friday, Prof. Morello was presented with the AIP NSW Community Outreach to Physics Award.
Collaborations with visual and literary artists have also seen him engage with wider audiences.
Visual art created by UNSW Art & Design's Professor Paul Thomas, inspired by Prof. Morello's research on quantum bits and quantum chaos, has been exhibited internationally.
And together with award-winning writer Bernard Cohen, Prof. Morello has initiated a project to work with NSW schools to develop experiential learning activities that bring together science and creative writing.
"I thank my creative arts collaborators, Professor Paul Thomas and Bernard Cohen, who helped me see things from a very different perspective and find new angles to convey the fascination for science through different channels."
UNSW Dean of Engineering Professor Stephen Foster congratulated Prof. Morello on his outreach achievements.
"Congratulations to Prof. Morello on receiving this prestigious award acknowledging his relentless advocacy work in fostering closeness between science and the community, reflecting UNSW's Values in Action."
UNSW Deputy Vice-Chancellor Academic, Professor Merlin Crossley, also applauded Prof. Morello's engagement initiatives.
"Through his depth of knowledge, Prof. Morello has helped inform the public, here in Australia and across the world, about the opportunities and prospects for quantum computing that are now appearing on the horizon," said Prof. Crossley.
The Australian Institute of Physics is an organisation dedicated to promoting the role of physics in research, education, industry and the community.
Read the original post:
UNSW researcher honoured for outreach in the physics community - UNSW Newsroom