
This Is the Fastest Random-Number Generator Ever Built – Scientific American

Researchers have built the fastest random-number generator ever made, using a simple laser. It exploits fluctuations in the intensity of light to generate randomness, a coveted resource in applications such as data encryption and scientific simulations, and could lead to devices that are small enough to fit on a single computer chip.

True randomness is surprisingly difficult to come by. Algorithms in conventional computers can produce sequences of numbers that seem random at first, but over time these tend to display patterns. This makes them at least partially predictable, and therefore vulnerable to being decoded.
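The predictability of algorithmic generators can be made concrete with a toy example (not from the article): a deliberately weak linear congruential generator whose output is fully determined by its seed and cycles after at most m values, so anyone who observes one full period can predict every subsequent "random" number.

```python
# Toy illustration (not from the article): a deliberately weak linear
# congruential generator.  Its output is fully determined by the seed
# and repeats after at most m values, so an observer who sees one full
# cycle can predict every later "random" number.

def weak_lcg(seed, a=5, c=3, m=16):
    """Yield the infinite stream x -> (a*x + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = weak_lcg(seed=1)
first = next(gen)
seen = [first]
for value in gen:
    if value == first:       # the cycle has closed
        break
    seen.append(value)

print(len(seen))    # 16: the generator's full period
print(seen[:8])     # [8, 11, 10, 5, 12, 15, 14, 9]
```

Real-world generators use far larger states, but the principle is the same: a deterministic rule eventually repeats, which is exactly what quantum sources avoid.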

To make encryption safer, researchers have turned to quantum mechanics, where the laws of physics guarantee that the results of certain measurements, such as when a radioactive atom decays, are genuinely random.

A popular way to tap into quantum randomness is to exploit fluctuations in how photons are emitted by the materials used in lasers. Typical laser devices are designed to minimize these fluctuations to produce light of steady intensity: they make the light waves bounce around inside the material to force its atoms to emit more and more photons in sync with each other.

But for random-number generation, researchers aim for the opposite. "We want the intensity to fluctuate randomly, so we can digitize the intensity to generate random numbers," says Hui Cao, an applied physicist at Yale University in New Haven, Connecticut.

Cao and her team made their laser material, a translucent semiconductor, in the shape of a bow tie. Photons bounce between the curved walls of the bow tie multiple times before coming out as a scattered beam. The researchers can then capture the light with an ultrafast camera. They recorded the light output of 254 independent pixels, which together produced random bits at a rate of around 250 terabits per second, or 250 terahertz. That's several orders of magnitude faster than previous such devices, which recorded only one pixel at a time. Their results were reported in Science on 25 February.
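The digitization step can be sketched in a few lines. This is a hypothetical, simplified stand-in for the experiment's extraction pipeline (the actual procedure in the Science paper is more sophisticated): threshold each intensity sample of one simulated "pixel" at the median to obtain one bit per sample.

```python
import random

# Hypothetical sketch of the digitization idea described above:
# threshold each intensity sample at the median to get one bit per
# sample.  (The extraction pipeline in the actual experiment is more
# sophisticated; this only illustrates turning fluctuations into bits.)

def intensity_to_bits(samples):
    """Return 1 for samples above the median intensity, else 0."""
    median = sorted(samples)[len(samples) // 2]
    return [1 if s > median else 0 for s in samples]

# Simulated noisy intensity trace from one "pixel" of the detector.
random.seed(0)
trace = [random.gauss(1.0, 0.2) for _ in range(1000)]
bits = intensity_to_bits(trace)

print(sum(bits) / len(bits))   # close to 0.5 for symmetric fluctuations
```

Thresholding at the median guarantees a roughly balanced stream of zeros and ones, one desirable property of random bits; real devices additionally apply randomness-extraction post-processing.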

"The invention represents a major leap in performance of random-number generators," says Krister Shalm, a physicist at the US National Institute of Standards and Technology in Boulder, Colorado.

The fastest existing computers have clock speeds measured in gigahertz, which is much too slow to exploit the full power of Cao's device. The set-up could be made smaller by using simpler light detectors instead of a high-speed camera. This could eventually yield practical devices small enough to fit on a single computer chip, says Cao. These could have useful applications, such as encryption technology on mobile phones.

This article is reproduced with permission and was first published on March 2, 2021.


New research indicates the whole universe could be a giant neural network – The Next Web

The core idea is deceptively simple: every observable phenomenon in the entire universe can be modeled by a neural network. And that means, by extension, the universe itself may be a neural network.

Vitaly Vanchurin, a professor of physics at the University of Minnesota Duluth, published an incredible paper last August entitled "The World as a Neural Network" on the arXiv pre-print server. It managed to slide past our notice until today, when Futurism's Victor Tangermann published an interview with Vanchurin discussing the paper.

The big idea

According to the paper:

We discuss a possibility that the entire universe on its most fundamental level is a neural network. We identify two different types of dynamical degrees of freedom: trainable variables (e.g. bias vector or weight matrix) and hidden variables (e.g. state vector of neurons).

At its most basic, Vanchurin's work here attempts to explain away the gap between quantum and classical physics. We know that quantum physics does a great job of explaining what's going on in the universe at very small scales. When we're, for example, dealing with individual photons, we can dabble with quantum mechanics at an observable, repeatable, measurable scale.

But when we start to pan out, we're forced to use classical physics to describe what's happening, because we sort of lose the thread when we make the transition from observable quantum phenomena to classical observations.

The argument

The root problem with sussing out a theory of everything (in this case, one that defines the very nature of the universe itself) is that it usually ends up replacing one proxy-for-god with another. Where theorists have posited everything from a divine creator to the idea we're all living in a computer simulation, the two most enduring explanations for our universe are based on distinct interpretations of quantum mechanics. These are called the "many worlds" and "hidden variables" interpretations, and they're the ones Vanchurin attempts to reconcile with his "world as a neural network" theory.

To this end, Vanchurin concludes:

In this paper we discussed a possibility that the entire universe on its most fundamental level is a neural network. This is a very bold claim. We are not just saying that the artificial neural networks can be useful for analyzing physical systems or for discovering physical laws, we are saying that this is how the world around us actually works. With this respect it could be considered as a proposal for the theory of everything, and as such it should be easy to prove it wrong. All that is needed is to find a physical phenomenon which cannot be described by neural networks. Unfortunately (or fortunately) it is easier said than done.

Quick take: Vanchurin specifically says he's not adding anything to the many worlds interpretation, but that's where the most interesting philosophical implications lie (in this author's humble opinion).

If Vanchurin's work pans out in peer review, or at least leads to a greater scientific fixation on the idea of the universe as a fully functioning neural network, then we'll have found a thread to pull on that could put us on the path to a successful theory of everything.

If we're all nodes in a neural network, what's the network's purpose? Is the universe one giant, closed network, or is it a single layer in a grander network? Or perhaps we're just one of trillions of other universes connected to the same network. When we train our neural networks, we run thousands or millions of cycles until the AI is properly trained. Are we just one of an innumerable number of training cycles for some larger-than-universal machine's greater purpose?

You can read the whole paper here on arXiv.

Published March 2, 2021 19:18 UTC


Physics – The Tiniest Superfluid Circuit in Nature – Physics

February 25, 2021 • Physics 14, 27

A new analysis of heavy-ion collision experiments uncovers evidence that two colliding nuclei behave like a Josephson junction, a device in which Cooper pairs tunnel through a barrier between two superfluids.

The Josephson effect is a remarkable example of a macroscopic quantum phenomenon, in which, without an applied voltage, current flows between two superconductors separated by a thin film of normal material. In this structure, called a Josephson junction, the current is due to the quantum tunneling of paired, superconducting electrons (so-called Cooper pairs) [1]. For decades, nuclear physicists have hypothesized that similar effects can occur on much smaller scales, since atomic nuclei could be regarded as superfluids consisting of paired nucleons. Recent experiments have supported this hypothesis, delivering hints that two colliding nuclei could be described as a Josephson junction in which entangled neutron pairs play the role of Cooper pairs (Fig. 1) [2, 3]. Now, Gregory Potel from Lawrence Livermore National Laboratory in California and colleagues have put these ideas on firmer ground [4]. Analyzing tin-nickel collisions from previous experiments, they found that experimental observables offer compelling signatures that two nuclei indeed form, for a split second, a Josephson junction.

The orderly motion of gigantic ensembles of correlated electron pairs makes superconductors behave as a single object, a macroscopic quantum state called a condensate. The condensate is characterized by its density and phase, and the latter plays the same role as the orientation of magnetic moments in a ferromagnet: an isolated ferromagnet can be rotated at no energy cost, but two ferromagnets with different orientations affect each other. Similarly, according to quantum mechanics, the phase doesn't have observable consequences for a single condensate. But if two condensates are sufficiently close, a Cooper-pair current, whose magnitude depends on the phase difference, may flow from one condensate to the other. A striking feature of this effect is that electric current may flow without a driving voltage.

There may be other systems in Nature where this effect occurs, and atomic nuclei, which can be regarded as superfluid ensembles of nucleons, are good candidates. This idea appeared among nuclear physicists as early as the 1970s [5]. In the 1980s and 1990s, several experiments indicated an enhanced probability of neutron-pair transfer between colliding nuclei, a possible manifestation of the Josephson effect. But the evidence for this interpretation wasn't compelling. There were doubts, in particular, about whether ensembles of nucleons are sufficiently large to be treated as a pair condensate. Superconductivity is an emergent phenomenon: It appears when dealing with a huge number of particles but vanishes when the system is broken down into smaller constituents. But can we consider a nucleus made of about 100 nucleons a huge ensemble of particles? Can we expect that two nuclei in close proximity exhibit a Josephson effect?

The study by Potel and his colleagues provides strong arguments for affirmative answers to these questions. The researchers analyzed data from previous experiments in which tin-116 (116Sn) nuclei were collided with nickel-60 (60Ni) [2]. With energies between 140.60 and 167.95 MeV, these collisions are gentle: they allow the nuclei to overcome just enough of the Coulomb repulsion to get sufficiently close to exchange a few neutrons at most. Under such conditions, two reactions are possible: the transfer of one neutron and the transfer of two neutrons, producing 115Sn+61Ni and 114Sn+62Ni, respectively. The case of two-neutron transfer is particularly interesting, as it may carry signatures of the correlated pairing of neutrons in the nuclei.

The team devised a way to uncover the experimental evidence of Josephson flow. Their idea is that there can be a nuclear equivalent of the alternating current (ac) Josephson effect (Fig. 1). In this variant of the Josephson effect, a constant, or dc, voltage applied to a Josephson junction produces an ac current. This striking behavior arises because the voltage causes the phase difference between the two condensates to increase over time. Since phases that differ by multiples of 2π are equivalent, a linear phase growth produces an oscillating current. The researchers argue that for the nuclear case, a similar effect can occur because neutron pairs inside two colliding nuclei possess different energies. This energy difference plays the role of the dc voltage in the ac Josephson effect.

Therefore, similar oscillatory behavior is expected to occur during a nuclear collision: the back-and-forth tunneling of neutron pairs means that 116Sn+60Ni transforms into 114Sn+62Ni and then back into 116Sn+60Ni, a cyclical process whose frequency is determined by the energy difference of neutron pairs in the initial and final nuclei. Because the collision lasts for only a short time, the team estimates that only about three such back-and-forth transfer cycles may occur in an experiment. However, even these few oscillations can lead to observable consequences. Since neutrons and protons interact strongly, oscillating neutron pairs cause protons to oscillate at the same frequency. Because of their charge, oscillating protons should emit electromagnetic radiation at this frequency. While electrons oscillating in a standard Josephson junction emit microwave photons [6], nuclei are expected to emit gamma-ray photons because of the much larger nuclear energy differences involved. The researchers calculate the expected radiation energy to be slightly less than 4 MeV, which matches the gamma-ray spectrum seen in previous experiments.
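The correspondence between energy difference and emitted frequency is the ac Josephson relation f = ΔE/h. A quick order-of-magnitude check (our own arithmetic, not taken from the paper, and using a representative 10 µeV scale of our choosing for the solid-state comparison) confirms why nuclei emit gamma rays where solid-state junctions emit microwaves:

```python
# Order-of-magnitude check of the relation f = dE / h discussed above
# (our own arithmetic, not from the paper).

H_PLANCK = 6.62607015e-34    # Planck constant, J*s (exact SI value)
EV_TO_J = 1.602176634e-19    # one electron-volt in joules (exact)

def josephson_frequency(delta_e_ev):
    """Oscillation/photon frequency in Hz for an energy difference in eV."""
    return delta_e_ev * EV_TO_J / H_PLANCK

f_nuclear = josephson_frequency(4e6)      # ~4 MeV, the nuclear case
f_microwave = josephson_frequency(10e-6)  # ~10 ueV, a representative
                                          # solid-state junction scale

print(f"{f_nuclear:.2e} Hz")     # ~1e21 Hz: gamma-ray frequencies
print(f"{f_microwave:.2e} Hz")   # ~2e9 Hz: microwave frequencies
```

The twelve-orders-of-magnitude ratio in frequency simply tracks the ratio of the energy scales, MeV for nucleon pairing versus µeV for superconducting Cooper pairs.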

The results are thrilling for two reasons. First, they indicate that the principles of superconductivity valid for macroscopic phenomena in solids may be applicable to the much smaller (femtometer) nuclear scales, a truly spectacular conclusion. Second, the analysis shows that the pairing description is appropriate for a small number of particles: the hundreds of nucleons making up the nuclei. It is worth pointing out, however, that this description contains a puzzling inconsistency. According to quantum mechanics, the phase and the number of particles in the condensate are related by the uncertainty principle, much like the position and momentum of a quantum particle: if either quantity is well defined, the other isn't. But for the nuclear case, the number of nucleons is always exactly defined. Further theoretical work will need to resolve this inconsistency.

These findings whet our appetite for more work aimed at validating superfluid nuclear models by confronting theory with experiments. In particular, it would be crucial to show that such models can deliver accurate, quantitative predictions for analogous effects in nuclear collisions beyond those involving tin and nickel.

Piotr Magierski is Professor of Physics and Head of the Nuclear Physics Division at Warsaw University of Technology, Poland, and an Affiliate Professor at the University of Washington. He is a theoretical physicist whose research interests include superfluidity and superconductivity in systems far from equilibrium, such as nuclear fission and fusion reactions, nuclear matter in neutron stars, and ultracold atomic gases.



New History of the Physics Department by Raj Gupta and Paul Sharrah Published – University of Arkansas Newswire

Cover design by UA Printing Services

A Centennial History of the Physics Department, University of Arkansas.

A new history of the Department of Physics, titled Acoustics to Quantum Materials: A Centennial History of the Department of Physics, University of Arkansas, and authored by Rajendra Gupta and Paul C. Sharrah, has been published.

The Department of Physics was born during the 1907-08 academic year, when the first full-time physics teacher was appointed and a syllabus for a physics major was defined for the first time. The department celebrated its centennial in April 2008. For 35 years, from 1872 to 1907, physics was taught by teachers whose primary discipline was not physics: for example, chemistry, applied mathematics, mechanic arts and engineering, and even biology and geology. While the primary emphasis of this book is on the hundred years from 1907 to 2007, for completeness the previous 35 years are covered in two prologues. The period 2008 to 2018 is summarized in an epilogue. The history includes the perspective of the authors, both emeritus professors of physics, who were eye-witnesses to the events unfolding in the department over a combined period of 76 years.

The book traces the evolution of the department from a one-person department with no physics majors in 1907, to the 1920s when it expanded to three faculty and graduated its first major; to the 1930s when professor Lloyd Ham established the first research laboratory in physics; to the 1940s when the department's ambitious vision of starting a credible research program was interrupted by World War II and it had to teach an estimated 3,500 army trainees, though that was followed by a blossoming of its physics majors program; to the 1950s when a credible research program did start and the department's Ph.D. program was approved; to the 1960s when post-Sputnik government support for research helped the department expand its research efforts; to the evolution of the department's research in many diverse areas, including atomic and molecular physics, quantum optics, biophysics and condensed matter physics.

The first research laboratory in physics established by professor Ham was in acoustics. Today, the largest research effort is in the area of quantum materials, which explains the title of the book.

While the department had a modest beginning, starting with just one teacher and no majors, today it can claim its rightful place among the noteworthy physics departments at U.S. public institutions.

An electronic copy of the book can be downloaded from the university repository: scholarworks.uark.edu/physpub/23.


Quantum Tunneling in Graphene Advances the Age of High Speed Terahertz Wireless Communications – SciTechDaily

Quantum tunneling. Credit: Daria Sokol/MIPT Press Office

Scientists from MIPT, Moscow Pedagogical State University and the University of Manchester have created a highly sensitive terahertz detector based on the effect of quantum-mechanical tunneling in graphene. The sensitivity of the device is already superior to commercially available analogs based on semiconductors and superconductors, which opens up prospects for applications of the graphene detector in wireless communications, security systems, radio astronomy, and medical diagnostics. The research results are published in Nature Communications.

Information transfer in wireless networks is based on the transformation of a high-frequency continuous electromagnetic wave into a discrete sequence of bits. This technique is known as signal modulation. To transfer the bits faster, one has to increase the modulation frequency. However, this requires a synchronous increase in carrier frequency. A common FM radio transmits at frequencies of around a hundred megahertz, a Wi-Fi receiver uses signals of roughly five gigahertz in frequency, while 5G mobile networks can transmit signals of up to 20 gigahertz.

This is far from the limit, and a further increase in carrier frequency admits a proportional increase in data transfer rates. Unfortunately, picking up signals at frequencies of a hundred gigahertz and higher is an increasingly challenging problem.

A typical receiver used in wireless communications consists of a transistor-based amplifier of weak signals and a demodulator that rectifies the sequence of bits from the modulated signal. This scheme originated in the age of radio and television, and becomes inefficient at the frequencies of hundreds of gigahertz desirable for mobile systems. The fact is that most existing transistors aren't fast enough to recharge at such a high frequency.

An evolutionary way to solve this problem is just to increase the maximum operation frequency of a transistor; most specialists in the area of nanoelectronics work hard in this direction. A revolutionary way to solve the problem was theoretically proposed in the early 1990s by physicists Michael Dyakonov and Michael Shur, and realized, among others, by the group of authors in 2018. It implies abandoning active amplification by a transistor, and abandoning a separate demodulator. What's left in the circuit is a single transistor, but its role is now different: it transforms a modulated signal into a bit sequence or voice signal by itself, due to the non-linear relation between its current and voltage drop.
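The rectifying-detection principle can be sketched with a generic square-law demodulator (a textbook illustration of non-linear rectification, not the authors' device model, and with scaled-down frequencies chosen only for the demo): squaring an amplitude-modulated carrier and averaging over a carrier period recovers the slow, data-carrying envelope.

```python
import math

# Generic square-law demodulator illustrating the non-linear
# rectification principle described above (not the authors' device
# model; frequencies are scaled-down stand-ins chosen for this demo).

SAMPLE_RATE = 10_000   # samples per second
F_CARRIER = 100.0      # carrier frequency, Hz
F_MOD = 1.0            # modulation (envelope) frequency, Hz

def am_sample(t):
    """Amplitude-modulated carrier: slow envelope times fast carrier."""
    envelope = 1.0 + 0.5 * math.cos(2 * math.pi * F_MOD * t)
    return envelope * math.cos(2 * math.pi * F_CARRIER * t)

samples = [am_sample(i / SAMPLE_RATE) for i in range(SAMPLE_RATE)]

# "Demodulate": square the signal (the non-linearity) and average over
# exactly one carrier period, which leaves roughly envelope^2 / 2.
window = int(SAMPLE_RATE / F_CARRIER)
squared = [s * s for s in samples]
detected = [sum(squared[i:i + window]) / window
            for i in range(0, len(squared) - window + 1, window)]

print(min(detected), max(detected))  # envelope swing survives rectification
```

The detected trace swings between roughly 0.125 and 1.125 (the squared envelope halved), showing that a single non-linear element can pull the modulation out of a carrier far faster than anything that has to follow the carrier cycle by cycle.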

In the present work, the authors have proved that the detection of a terahertz signal is very efficient in the so-called tunneling field-effect transistor. To understand how it works, one can recall the principle of an electromechanical relay, where the passage of current through control contacts leads to a mechanical connection between two conductors and, hence, to the emergence of current. In a tunneling transistor, applying voltage to the control contact (termed the gate) leads to alignment of the energy levels of the source and channel. This also leads to the flow of current. A distinctive feature of a tunneling transistor is its very strong sensitivity to the control voltage: even a small detuning of the energy levels is enough to interrupt the subtle process of quantum-mechanical tunneling, and, similarly, a small voltage at the gate is enough to align the levels and initiate the tunneling current.

"The idea of a strong reaction of a tunneling transistor to low voltages has been known for about fifteen years," says Dr. Dmitry Svintsov, one of the authors of the study and head of the Laboratory of 2D Materials for Optoelectronics at the MIPT Center for Photonics and 2D Materials. "But it's been known only in the community of low-power electronics. No one realized before us that the same property of a tunneling transistor can be applied in the technology of terahertz detectors. Georgy Alymov (co-author of the study) and I were lucky to work in both areas. We realized then: if the transistor is opened and closed at a low power of the control signal, then it should also be good at picking up weak signals from the ambient surroundings."

The created device is based on bilayer graphene, a unique material in which the position of energy levels (more strictly, the band structure) can be controlled using an electric voltage. This allowed the authors to switch between classical transport and quantum tunneling transport within a single device, with just a change in the polarities of the voltage at the control contacts. This possibility is of extreme importance for an accurate comparison of the detecting ability of a classical and quantum tunneling transistor.

The experiment showed that the sensitivity of the device in the tunneling mode is a few orders of magnitude higher than in the classical transport mode. The minimum signal distinguishable by the detector against the noisy background already competes with that of commercially available superconducting and semiconductor bolometers. However, this is not the limit: the sensitivity of the detector can be further increased in cleaner devices with a low concentration of residual impurities. The developed detection theory, tested by the experiment, shows that the sensitivity of the optimal detector can be a hundred times higher.

"The current characteristics give rise to great hopes for the creation of fast and sensitive detectors for wireless communications," says study author Dr. Denis Bandurin. "And this area is not limited to graphene and is not limited to tunnel transistors. We expect that, with the same success, a remarkable detector can be created, for example, based on an electrically controlled phase transition. Graphene turned out to be just a good launching pad here, just a door, behind which is a whole world of exciting new research."

The results presented in this paper are an example of a successful collaboration between several research groups. The authors note that it is this format of work that allows them to obtain world-class scientific results. For example, the same team of scientists earlier demonstrated how waves in the electron sea of graphene can contribute to the development of terahertz technology. "In an era of rapidly evolving technology, it is becoming increasingly difficult to achieve competitive results," comments Dr. Georgy Fedorov, deputy head of the Laboratory of Nanocarbon Materials, MIPT. "Only by combining the efforts and expertise of several groups can we successfully realize the most difficult tasks and achieve the most ambitious goals, which we will continue to do."

Reference: "Tunnel field-effect transistors for sensitive terahertz detection" by I. Gayduchenko, S. G. Xu, G. Alymov, M. Moskotin, I. Tretyakov, T. Taniguchi, K. Watanabe, G. Goltsman, A. K. Geim, G. Fedorov, D. Svintsov and D. A. Bandurin, 22 January 2021, Nature Communications. DOI: 10.1038/s41467-020-20721-z

The work was supported by Russian Science Foundation (grant # 16-19-10557) and Russian Foundation for Basic Research (grant # 18-29-20116 mk).


International Business Machines : The Decade of Quantum Computing Is Upon Us, IBM Executive Says – Marketscreener.com

By Jared Council

As more uses for quantum computing appear, chief information officers need to help their companies see the emerging technology's potential as well as its risks, an International Business Machines Corp. executive said on Tuesday.

"This is going to be the decade in which quantum really comes of age," said Dario Gil, senior vice president and director of IBM Research, at The Wall Street Journal's virtual CIO Network summit.

Leveraging the properties of quantum physics, quantum computers have the ability to sort through a vast number of possibilities in nearly real time and come up with a probable solution. Difficulties with hardware have held back the adoption of quantum computing, but that is about to change, Dr. Gil said.

IBM is planning to release new quantum systems this year and next, but a big shift will come with the system the company plans to release in 2023, Dr. Gil said. That one for the first time will allow engineers to mitigate errors through software, as opposed to only through hardware.

"We've got to get these computers to operate without errors, and if we can do that we'll realize their full potential," he said. "So what we envision is in 2023, when we deliver that system, it will be an inflection point in that the errors of quantum computers will continue to decrease exponentially through software, as opposed to just by making the device better."

After that point, he said, quantum computers will start to be used on a broader scale.

Major corporations are already experimenting with early-stage quantum technology, among them, Visa Inc., JPMorgan Chase & Co., Roche Holding AG and Volkswagen AG.

Quantum computers harness the properties of quantum physics, including superposition and entanglement, to radically speed up complex calculations intractable to today's computers. Conventional computers store information as either zeros or ones, but quantum computers use quantum bits, or qubits, which represent and store information as both zeros and ones simultaneously.
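The "both zeros and ones simultaneously" description can be made concrete with a tiny state-vector sketch (a generic textbook illustration, not IBM code): a qubit is a pair of complex amplitudes, and the probabilities of measuring 0 or 1 are their squared magnitudes.

```python
import math

# Generic textbook sketch (not IBM code): a single qubit is a pair of
# amplitudes (a, b) with |a|^2 + |b|^2 = 1.  "Zero and one
# simultaneously" means both amplitudes can be non-zero at once;
# measurement gives 0 with probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Hadamard gate: takes |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)        # definite |0>, like a classical bit
plus = hadamard(zero)    # superposition: both outcomes now possible

p0 = abs(plus[0]) ** 2
p1 = abs(plus[1]) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Simulating n qubits this way requires 2^n amplitudes, which is exactly why classical machines cannot track large quantum systems and why real quantum hardware promises a speedup.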

While a commercial-grade quantum computer hasn't been built yet, startups and tech giants including IBM, Alphabet Inc.'s Google and Microsoft Corp. are racing to commercialize the technology.

Dr. Gil said quantum computers will be able to speed up research-and-development discoveries because the machines will excel at modeling physics, chemistry and materials science. This could lead to developing more energy-efficient batteries in the auto sector, he said, or better carbon-capture membranes to fight climate change.

"That's my vision: Bring the power of computing to compress the time to discovery of what we need," he said.

Quantum computers are also poised to benefit from artificial intelligence algorithms, partly because of their ability to solve complex mathematical problems.

CIOs looking to benefit from quantum computing can start by experimenting with community platforms such as the open-source Qiskit framework, which provides tools for developers to create and run quantum programs on real quantum hardware or simulators, he said.

Companies should form small working groups that can start to identify problems that quantum computing may be able to help solve, he said.

"Value today means, do you have a small team that knows what's going on in quantum and start mapping to problems that are relevant to your business?"

But companies should also brace themselves for quantum computing's implications for cybersecurity: The machines will be able to break encryption locks that ordinary computers have trouble cracking, Dr. Gil said. Companies should consider migrating important business systems to so-called "quantum-safe" encryption protocols, he said.

"It's very important that all of you develop crypto-agility, and you develop a migration path and a quantum-safe approach to do encryption and security," he said. "Because if you don't, you're going to leave your institutions vulnerable to these kinds of attacks in the future."

Write to Jared Council at jared.council@wsj.com

(END) Dow Jones Newswires

03-03-21 1524ET


Subtle quantum phenomenon found to alter chemical reactivity for the first time – Chemistry World

A new frontier has been discovered in how quantum phenomena control chemical reactivity. By colliding beams of two different reactants, a Chinese team spanning three universities has shown that the outcome can only be explained by interactions between electron spin and orbital angular momentum. "This is the first time that electronic angular momentum has been found to affect such reactions," explains Xueming Yang from the South University of Science and Technology in Shenzhen. The finding is "very special," adds his colleague Zhigang Sun from the Dalian Institute of Chemical Physics.

The crossed-beam approach the team uses is common in experiments seeking to understand the quantum states involved in reactions, explains Sun. In their reaction, one of the beams consisted of fluorine atoms. The other beam, crossing the first at a right angle, contained hydrogen-deuterium molecules. When the reactants collide, the fluorine atom displaces the deuterium atom, forming a hydrogen fluoride molecule, with the products scattering in various directions.

"The experimental results are simply stunning. They have achieved a resolution that 20 years ago would have been deemed as unattainable."

Francisco Javier Aoiz, Complutense University of Madrid

The transition state between the starting materials and products lasts for less than 10⁻¹² seconds (a picosecond, or one trillionth of a second). But the crossed-beam experiments provide windows on that fleeting world. Varying the collision energy between the reactants creates sharp variations, or resonances, in the probability of products forming at a particular scattering angle and internal energy. By measuring that information, researchers can extract information about the quantum mechanical energy level structure of the transition state.

To get those valuable details, researchers typically shine lasers into the collision zone of their crossed-beam reactions. These both detect where the products go and gain information about their electronic structure spectroscopically. Until recently, this laser ionisation approach could detect the product molecules' rotational energy levels, but not their electronic angular momentum. "Electronic angular momentum energy is much smaller than the rotational energy of a diatomic molecule," says team member Xingan Wang from the Hefei National Laboratory for Physical Sciences at the Microscale. "Its influence on a chemical reaction is therefore subtle and difficult to detect."

However, Wang, Yang and Sun's team has developed a more sensitive technique called near-threshold ionisation. "The key experimental result in the current work is [detected] with high angular resolution," comments Wang. "This was not available previously." If the laser energy is above the ionisation limit, momentum can transfer to the atom and adversely affect the measurements, he explains. Near-threshold ionisation avoids this. "By accurately tuning the photon energy during the detection, we can be sure that the products receive just enough photon energy to be ionised," Wang says.

In one particular resonance state, the researchers found a horseshoe-shaped scattering pattern in which hydrogen fluoride molecules emerged in high rotational energy states. The team's theoretical analysis revealed that the horseshoe pattern largely resulted from quantum interference between electron spin and orbital angular momentum.

Francisco Javier Aoiz from Complutense University of Madrid calls the study "very beautiful work" that represents the state of the art in molecular reaction dynamics. "The experimental results are simply stunning," he adds. "They have achieved a resolution that 20 years ago would have been deemed as unattainable. They have determined quantum-state-resolved angular distributions in the whole range of scattering angles with an unprecedented accuracy." He says the effect detected is subtle, but reveals an interplay of several coupled potential energy surfaces that manifests in different ways in other chemical reactions.

While the work provides a fundamental revelation about the influences on chemical transition states, the team now needs to evaluate its full significance, notes Wang. "We are planning to further investigate the role of the electronic angular momentum in a more general chemical reaction," he says.

See original here:

Subtle quantum phenomenon found to alter chemical reactivity for the first time - Chemistry World


Physicists believe faster-than-light travel is indeed possible with new warp drive – ZME Science

Scientists have often turned to fiction for inspiration about what secrets of nature they could unlock. Some of the most outlandish science fiction concepts include teleportation and warp drive, both of which have traditionally been frowned upon because they would break the laws of physics, or so we thought. A new study found that, at least in the case of warp drives, faster-than-light travel may be possible by taking advantage of a loophole in general relativity.

The concept of the Star Trek-inspired warp drive is simple to grasp. Everything within space is restricted by the speed of light, but the really cool thing is that space-time, the fabric of space, is not limited by the speed of light. As the name suggests, a warp drive involves warping space-time itself around an object.

The most widely used theoretical framework for a warp drive was proposed by the Mexican physicist Miguel Alcubierre in 1994. To allow faster-than-light travel, Alcubierre envisioned catapulting a spaceship in a bubble of negative energy that would expand space and time behind the ship, while at the same time compressing space-time in front of it.

In the theory of general relativity, we usually assume that energy is greater than zero, at all times and everywhere in the universe. This has a very important consequence for gravity: energy (E) is linked to mass via the formula E = mc² (with m being the mass and c the speed of light). Negative energy would consequently also imply negative mass. Positive masses attract each other, but with a negative mass, gravity could suddenly become a repulsive force.
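As a quick numerical sketch of the formula (illustrative values only; the helper name is mine, not from the study):

```python
# Mass-energy equivalence, E = m * c^2, in SI units.
c = 299_792_458.0  # speed of light, m/s

def rest_energy(mass_kg: float) -> float:
    """Energy in joules equivalent to a given rest mass."""
    return mass_kg * c ** 2

print(rest_energy(0.001))   # one gram of matter: ~9e13 J
# A negative mass would flip the sign, giving the negative
# energy the Alcubierre drive calls for:
print(rest_energy(-0.001))  # ~-9e13 J
```

Even a gram of matter corresponds to roughly 9 × 10¹³ joules, about the yield of a 21-kiloton bomb, which hints at how extreme the energy requirements of any warp bubble would be.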

This is where things get interesting. Quantum theory does allow negative energy. According to quantum physics, it is possible to borrow energy from a vacuum at a certain location, like money from a bank. This would theoretically open the door for negative mass and negative gravity.

But thats easier said than done. Due to its requirements for negative energy, many believe that warp drives are not physically possible.

"This, however, is no longer correct," said Alexey Bobrick, scientist and astrophysicist at Lund University.

Bobrick and colleagues took a different route with their research, describing a new class of warp drives that respects general relativity and doesn't require negative energy.

The researchers in Sweden, part of an independent group of scientists known as Applied Physics, worked closely with distinguished researchers in warp field mechanics and even received the blessing of Alcubierre, the godfather of the warp drive.

Writing in the journal Classical and Quantum Gravity, the authors said that any warp drive, including the Alcubierre drive, is "a shell of regular or exotic material moving inertially with a certain velocity". Mathematically, they then go on to show that a class of subluminal, spherically symmetric warp-drive spacetimes can, at least in principle, be constructed "based on the physical principles known to humanity today".

If that sounds hard to understand, well, it kind of is. But fear not! Sabine Hossenfelder, professor and research fellow at the Frankfurt Institute for Advanced Studies, breaks down this study in a recent YouTube video, which you can watch below.

One problem, though: even if the theory is sound, no one really knows how to actually build such a drive yet. For the time being, warp drives remain in the realm of theory, but this research provides new perspectives on how to achieve faster-than-light travel. At the very least, the concept doesn't sound like such a crazy idea anymore.

Read the original here:

Physicists believe faster-than-light travel is indeed possible with new warp drive - ZME Science


Exclusive! Ashwin Sanghi on his dream to cast Sushant Singh Rajput in ‘Keepers Of The Kalachakra’ series: He was like an excited child when it came to…

Indian author Ashwin Sanghi's book Keepers Of The Kalachakra is being adapted into a series. The bestseller is a mythological science-fiction thriller that tells the story of the men who guard the 'Kalachakra', or Wheel of Time.

In an exclusive chat with ETimes, Ashwin Sanghi spoke candidly about this upcoming venture, his dream cast and Bollywood's new trend of turning book adaptations into blockbusters.

Sharing his thoughts about the ideal person to take this franchise forward, he said, "The person I wanted, alas, is no more - Sushant Singh Rajput. He was like an excited child when it came to quantum physics, which is what this thriller is about."

While the nation continues to feel the void left by the late Bollywood actor Sushant Singh Rajput, one of the country's most eminent awards in entertainment, the Dadasaheb Phalke Award, has honoured the star with the 'Critic's Best Actor' accolade at the prestigious award ceremony held on Saturday.

When quizzed about his favoured movie adapted from a book, pat came his reply: "The Godfather wins hands down."

As far as 'Keepers Of The Kalachakra' is concerned, the Vikram Malhotra-headed Abundantia Entertainment has acquired the rights to the book and plans to convert it into a multi-season series. The author will work closely with the screenwriters' team to bring the book to life.

Sharing his excitement about his new venture, Ashwin said, "Vikram Malhotra, Shikhas Sharma and Abundantia are outstanding partners to collaborate with. Their vision for this book is exhilarating. I am sure that we will deliver a series that will pack a punch."

The book follows scientist Vijay Sundaram, who races against time to save humanity from impending doom. Zigzagging from the Ramayana to the birth of Buddhism; from the origin of Wahhabism to the Einsteinian gravitational-wave detectors of LIGO; from tantric practitioners to the Oval Office; and from the rites of Minerva, shrouded in frankincense, to the smoke-darkened ruins of Nalanda, Keepers Of The Kalachakra has it all.


Go here to see the original:

Exclusive! Ashwin Sanghi on his dream to cast Sushant Singh Rajput in 'Keepers Of The Kalachakra' series: He was like an excited child when it came to...


Can god be disproved using the laws of physics? An expert explains how it depends on perspective – Scroll.in

I still believed in god (I am now an atheist) when I heard the following question at a seminar, first posed by Einstein, and was stunned by its elegance and depth: if there is a god who created the entire universe and all of its laws of physics, does god follow god's own laws? Or can god supersede his own laws, such as travelling faster than the speed of light and thus being able to be in two different places at the same time? Could the answer help us prove whether or not god exists, or is this where scientific empiricism and religious faith intersect, with no true answer?

I was in lockdown when I received this question and was instantly intrigued. The timing is no surprise: tragic events, such as pandemics, often cause us to question the existence of god. If there is a merciful god, why is a catastrophe like this happening?

So the idea that god might be bound by the laws of physics, which also govern chemistry and biology and thus the limits of medical science, was an interesting one to explore.

If god were not able to break the laws of physics, she arguably would not be as powerful as you would expect a supreme being to be. But if she could, why have we not seen any evidence of the laws of physics ever being broken in the universe?

To tackle the question, let us break it down a bit. First, can god travel faster than light? Let us just take the question at face value. Light travels at an approximate speed of 300,000 kilometres every second. We learn at school that nothing can travel faster than the speed of light not even the USS Enterprise in Star Trek when its dilithium crystals are set to max.

But is it true? A few years ago, a group of physicists posited that particles called tachyons travelled above light speed. Fortunately, their existence as real particles is deemed highly unlikely. If they did exist, they would have an imaginary mass and the fabric of space and time would become distorted leading to violations of causality (and possibly a headache for god).

It seems, so far, that no object has been observed that can travel faster than the speed of light. This in itself does not say anything at all about god. It merely reinforces the knowledge that light travels very fast indeed.

Things get a bit more interesting when you consider how far light has travelled since the beginning. Assuming a traditional big bang cosmology and a light speed of 300,000 km/s, we can calculate that light has travelled roughly 10²³ kilometres in the 13.8 billion years of the universe's existence. Or rather, the observable universe's existence.

The universe is expanding at a rate of approximately 70 km/s per Mpc (1 Mpc = 1 megaparsec, about 3.26 million light years, or roughly 3 × 10¹⁹ km), so current estimates suggest that the distance to the edge of the observable universe is 46 billion light years. As time goes on, the volume of space increases and light has to travel for longer to reach us.
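The figures in the last two paragraphs can be checked with a few lines of arithmetic. A minimal sketch, using rounded constants (all variable names are mine):

```python
# Back-of-the-envelope checks on the cosmology figures above.
# Constants are rounded, so results are order-of-magnitude only.
C_KM_S = 3.0e5              # speed of light, km/s
SECONDS_PER_YEAR = 3.156e7  # roughly 365.25 days
AGE_YEARS = 13.8e9          # age of the universe, years

# Distance light has travelled since the big bang:
light_travel_km = C_KM_S * SECONDS_PER_YEAR * AGE_YEARS
print(f"light-travel distance: {light_travel_km:.1e} km")  # ~1.3e23 km

# Hubble's law, v = H0 * d: at what distance does the
# recession speed reach the speed of light?
H0 = 70.0                   # km/s per megaparsec
hubble_distance_mpc = C_KM_S / H0
print(f"Hubble distance: {hubble_distance_mpc:.0f} Mpc")   # ~4300 Mpc
# At ~3.26 million light years per Mpc, that is roughly
# 14 billion light years.
```

The 46-billion-light-year figure in the text is larger than this naive Hubble distance because space itself has kept expanding behind each photon while it was in flight.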

There is a lot more universe out there than we can view, but the most distant object that we have seen is a galaxy, GN-z11, observed by the Hubble Space Telescope. This is approximately 13.4 billion light years away, meaning that it has taken 13.4 billion years for light from the galaxy to reach us. But when the light set off, the galaxy was only about 3 billion light years away from our galaxy, the Milky Way.

We cannot observe or see across the entirety of the universe that has grown since the big bang because insufficient time has passed for light from the first fractions of a second to reach us.

Some argue that we therefore cannot be sure whether the laws of physics could be broken in other cosmic regions perhaps they are just local, accidental laws. And that leads us on to something even bigger than the universe.

Many cosmologists believe that the universe may be part of a more extended cosmos, a multiverse, where many different universes co-exist but do not interact. The idea of the multiverse is backed by the theory of inflation, the idea that the universe expanded hugely before it was 10⁻³² seconds old. Inflation is an important theory because it can explain why the universe has the shape and structure that we see around us.

But if inflation could happen once, why not many times? We know from experiments that quantum fluctuations can give rise to pairs of particles suddenly coming into existence, only to disappear moments later.

And if such fluctuations can produce particles, why not entire atoms or universes? It has been suggested that, during the period of chaotic inflation, not everything was happening at the same rate: quantum fluctuations in the expansion could have produced bubbles that blew up to become universes in their own right.

But how does god fit into the multiverse? One headache for cosmologists has been the fact that our universe seems fine-tuned for life to exist. The fundamental particles created in the big bang had the correct properties to enable the formation of hydrogen and deuterium, the substances which produced the first stars.

The physical laws governing nuclear reactions in these stars then produced the stuff that life's made of: carbon, nitrogen and oxygen. So how come all the physical laws and parameters in the universe happen to have the values that allowed stars, planets and ultimately life to develop?

Some argue it is just a lucky coincidence. Others say we should not be surprised to see biofriendly physical laws: they, after all, produced us, so what else would we see? Some theists, however, argue it points to the existence of a god creating favourable conditions.

But god is not a valid scientific explanation. The theory of the multiverse, instead, solves the mystery because it allows different universes to have different physical laws. So it is not surprising that we should happen to see ourselves in one of the few universes that could support life. Of course, you cannot disprove the idea that a god may have created the multiverse.

This is all very hypothetical, and one of the biggest criticisms of theories of the multiverse is that, because there seem to have been no interactions between our universe and other universes, the notion of the multiverse cannot be directly tested.

Now let us consider whether god can be in more than one place at the same time. Much of the science and technology we use in space science is based on the counter-intuitive theory of the tiny world of atoms and particles known as quantum mechanics.

The theory enables something called quantum entanglement: spookily connected particles. If two particles are entangled, manipulating one automatically affects its partner, even if the two are very far apart and never interact. There are better descriptions of entanglement than the one I give here, but this one is simple enough that I can follow it.

Imagine a particle that decays into two sub-particles, A and B. The properties of the sub-particles must add up to the properties of the original particle this is the principle of conservation. For example, all particles have a quantum property called spin roughly, they move as if they were tiny compass needles.

If the original particle has a spin of zero, one of the two sub-particles must have a positive spin and the other a negative spin, which means that each of A and B has a 50% chance of having a positive or a negative spin. (According to quantum mechanics, particles are by definition in a mix of different states until you actually measure them.)

The properties of A and B are not independent of each other: they are entangled, even if located in separate laboratories on separate planets. Say you measure the spin of A and find it to be positive. Imagine a friend measured the spin of B at exactly the same time. For the principle of conservation to work, she must find the spin of B to be negative.

But, and this is where things become murky, like sub-particle A, B had a 50:50 chance of being positive, so its spin state became negative at the moment that the spin state of A was measured as positive.

In other words, information about spin state was transferred between the two sub-particles instantly. Such transfer of quantum information apparently happens faster than the speed of light. Given that Einstein himself described quantum entanglement as "spooky action at a distance", I think all of us can be forgiven for finding this a rather bizarre effect.
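The bookkeeping described above, a 50:50 outcome for A and a forced opposite outcome for B, can be sketched as a toy simulation. This is a classical sketch of the correlations only, not the quantum formalism, and every name in it is mine:

```python
import random

def measure_entangled_pair():
    """Toy model of the spin-zero decay described above: A's outcome
    is a 50:50 coin flip, and conservation forces B to the opposite
    value, however far apart the two measurements happen."""
    spin_a = random.choice([+1, -1])  # 50% positive, 50% negative
    spin_b = -spin_a                  # conservation: spins must sum to zero
    return spin_a, spin_b

trials = [measure_entangled_pair() for _ in range(10_000)]

# Each individual outcome looks random...
positive_fraction = sum(1 for a, _ in trials if a == +1) / len(trials)
print(f"A measured positive in {positive_fraction:.0%} of trials")

# ...but the pair is perfectly anti-correlated, every single time.
assert all(a + b == 0 for a, b in trials)
```

Note what the sketch cannot capture: here B's value is secretly fixed the moment the pair is created, whereas real entangled particles violate Bell inequalities, which rules out any such pre-assigned hidden values.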

So there is something faster than the speed of light after all: quantum information. This does not prove or disprove god, but it can help us think of god in physical terms: maybe as a shower of entangled particles, transferring quantum information back and forth, and so occupying many places at the same time? Even many universes at the same time?

I have this image of god keeping galaxy-sized plates spinning while juggling planet-sized balls, tossing bits of information from one teetering universe to another to keep everything in motion. Fortunately, god can multitask, keeping the fabric of space and time in operation. All that is required is a little faith.

Has this essay come close to answering the questions posed? I suspect not: if you believe in god (as I do not), then the idea of god being bound by the laws of physics is nonsense, because god can do everything, even travel faster than light. If you do not believe in god, then the question is equally nonsensical, because there is no god and nothing can travel faster than light. Perhaps the question is really one for agnostics, who do not know whether there is a god.

This is indeed where science and religion differ. Science requires proof; religious belief requires faith. Scientists do not try to prove or disprove god's existence, because they know there is no experiment that could ever detect god. And if you believe in god, it does not matter what scientists discover about the universe: any cosmos can be thought of as being consistent with god.

Our views of god, physics or anything else ultimately depend on perspective. But let us end with a quotation from a truly authoritative source. No, it is not the bible. Nor is it a cosmology textbook. It is from Reaper Man by Terry Pratchett: "Light thinks it travels faster than anything but it is wrong. No matter how fast light travels, it finds the darkness has always got there first, and is waiting for it."

Monica Grady is a Professor of Planetary and Space Sciences at The Open University.

This article first appeared on The Conversation.

Link:

Can god be disproved using the laws of physics? An expert explains how it depends on perspective - Scroll.in
