Category Archives: Quantum Physics

Six Things Everyone Should Know About Quantum Physics

Quantum physics is usually just intimidating from the get-go. It's kind of weird and can seem counter-intuitive, even for the physicists who deal with it every day. But it's not incomprehensible. If you're reading something about quantum physics, there are really six key concepts about it that you should keep in mind. Do that, and you'll find quantum physics a lot easier to understand.

Everything Is Made Of Waves; Also, Particles

Light as both a particle and a wave. (Image credit: Fabrizio Carbone/EPFL)

There's lots of places to start this sort of discussion, and this is as good as any: everything in the universe has both particle and wave nature, at the same time. There's a line in Greg Bear's fantasy duology (The Infinity Concerto and The Serpent Mage), where a character describing the basics of magic says "All is waves, with nothing waving, over no distance at all." I've always really liked that as a poetic description of quantum physics-- deep down, everything in the universe has wave nature.

Of course, everything in the universe also has particle nature. This seems completely crazy, but it is an experimental fact, worked out by a surprisingly familiar process (there's also an animated version of this I did for TED-Ed).

Of course, describing real objects as both particles and waves is necessarily somewhat imprecise. Properly speaking, the objects described by quantum physics are neither particles nor waves, but a third category that shares some properties of waves (a characteristic frequency and wavelength, some spread over space) and some properties of particles (they're generally countable and can be localized to some degree). This leads to some lively debate within the physics education community about whether it's really appropriate to talk about light as a particle in intro physics courses; not because there's any controversy about whether light has some particle nature, but because calling photons "particles" rather than "excitations of a quantum field" might lead to some student misconceptions. I tend not to agree with this, because many of the same concerns could be raised about calling electrons "particles," but it makes for a reliable source of blog conversations.

This "door number three" nature of quantum objects is reflected in the sometimes confusing language physicists use to talk about quantum phenomena. The Higgs boson was discovered at the Large Hadron Collider as a particle, but you will also hear physicists talk about the "Higgs field" as a delocalized thing filling all of space. This happens because in some circumstances, such as collider experiments, it's more convenient to discuss excitations of the Higgs field in a way that emphasizes the particle-like characteristics, while in other circumstances, like general discussion of why certain particles have mass, it's more convenient to discuss the physics in terms of interactions with a universe-filling quantum field. It's just different language describing the same mathematical object.

Quantum Physics Is Discrete

These oscillations created an image of "frozen" light. (Credit: Princeton)

It's right there in the name-- the word "quantum" comes from the Latin for "how much" and reflects the fact that quantum models always involve something coming in discrete amounts. The energy contained in a quantum field comes in integer multiples of some fundamental energy. For light, this is associated with the frequency and wavelength of the light-- high-frequency, short-wavelength light has a large characteristic energy, while low-frequency, long-wavelength light has a small characteristic energy.

In both cases, though, the total energy contained in a particular light field is an integer multiple of that energy-- 1, 2, 14, 137 times-- never a weird fraction like one-and-a-half, π, or the square root of two. This property is also seen in the discrete energy levels of atoms, and the energy bands of solids-- certain values of energy are allowed, others are not. Atomic clocks work because of the discreteness of quantum physics, using the frequency of light associated with a transition between two allowed states in cesium to keep time at a level of precision that requires the much-discussed "leap second" added last week.
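To put that in the standard textbook notation (my gloss, not a formula from the article): for light of frequency ν, the energy in the field can only take the values

$$ E_n = n\,h\nu, \qquad n = 0, 1, 2, 3, \dots $$

where h is Planck's constant. The allowed energies step up in whole quanta of hν, which is why high-frequency light has a larger characteristic energy than low-frequency light.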

Ultra-precise spectroscopy can also be used to look for things like dark matter, and is part of the motivation for a low-energy fundamental physics institute.

This isn't always obvious-- even some things that are fundamentally quantum, like black-body radiation, appear to involve continuous distributions. But there's always a kind of granularity to the underlying reality if you dig into the mathematics, and that's a large part of what leads to the weirdness of the theory.

Quantum Physics Is Probabilistic

(Credit: Graham Barclay/Bloomberg News)

One of the most surprising and (historically, at least) controversial aspects of quantum physics is that it's impossible to predict with certainty the outcome of a single experiment on a quantum system. When physicists predict the outcome of some experiment, the prediction always takes the form of a probability for finding each of the particular possible outcomes, and comparisons between theory and experiment always involve inferring probability distributions from many repeated experiments.

The mathematical description of a quantum system typically takes the form of a "wavefunction," generally represented in equations by the Greek letter psi (Ψ). There's a lot of debate about what, exactly, this wavefunction represents, breaking down into two main camps: those who think of the wavefunction as a real physical thing (the jargon term for these is "ontic" theories, leading some witty person to dub their proponents "psi-ontologists") and those who think of the wavefunction as merely an expression of our knowledge (or lack thereof) regarding the underlying state of a particular quantum object ("epistemic" theories).

In either class of foundational model, the probability of finding an outcome is not given directly by the wavefunction, but by the square of the wavefunction (loosely speaking, anyway; the wavefunction is a complex mathematical object (meaning it involves imaginary numbers like the square root of negative one), and the operation to get probability is slightly more involved, but "square of the wavefunction" is enough to get the basic idea). This is known as the "Born Rule" after German physicist Max Born who first suggested this (in a footnote to a paper in 1926), and strikes some people as an ugly ad hoc addition. There's an active effort in some parts of the quantum foundations community to find a way to derive the Born rule from a more fundamental principle; to date, none of these have been fully successful, but it generates a lot of interesting science.
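Loosely, and in the usual textbook form rather than anything specific to this article, the Born rule for a single particle in one dimension reads

$$ P(x) = |\psi(x)|^2, $$

so the probability of finding the particle at position x is the squared magnitude of the (complex) wavefunction there, with the understanding that a properly normalized wavefunction satisfies $\int |\psi(x)|^2\,dx = 1$.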

This is also the aspect of the theory that leads to things like particles being in multiple states at the same time. All we can predict is probability, and prior to a measurement that determines a particular outcome, the system being measured is in an indeterminate state that mathematically maps to a superposition of all possibilities with different probabilities. Whether you consider this as the system really being in all of the states at once, or just being in one unknown state depends largely on your feelings about ontic versus epistemic models, though these are both subject to constraints from the next item on the list:

Quantum Physics Is Non-Local

A quantum teleportation experiment in action. (Credit: IQOQI/Vienna)

The last great contribution Einstein made to physics was not widely recognized as such, mostly because he was wrong. In a 1935 paper with his younger colleagues Boris Podolsky and Nathan Rosen (the "EPR paper"), Einstein provided a clear mathematical statement of something that had been bothering him for some time, an idea that we now call "entanglement."

The EPR paper argued that quantum physics allowed the existence of systems where measurements made at widely separated locations could be correlated in ways that suggested the outcome of one was determined by the other. They argued that this meant the measurement outcomes must be determined in advance, by some common factor, because the alternative would require transmitting the result of one measurement to the location of the other at speeds faster than the speed of light. Thus, quantum mechanics must be incomplete, a mere approximation to some deeper theory (a "local hidden variable" theory, one where the results of a particular measurement do not depend on anything farther away from the measurement location than a signal could travel at the speed of light ("local"), but are determined by some factor common to both systems in an entangled pair (the "hidden variable")).

This was regarded as an odd footnote for about thirty years, as there seemed to be no way to test it, but in the mid-1960s the Irish physicist John Bell worked out the consequences of the EPR paper in greater detail. Bell showed that you can find circumstances in which quantum mechanics predicts correlations between distant measurements that are stronger than any possible theory of the type preferred by E, P, and R. This was tested experimentally in the mid-1970s by John Clauser, and a series of experiments by Alain Aspect in the early 1980s is widely considered to have definitively shown that these entangled systems cannot possibly be explained by any local hidden variable theory.
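For concreteness, the most commonly tested modern version of Bell's result is the CHSH inequality (due to Clauser, Horne, Shimony, and Holt, a later refinement of Bell's argument rather than the form in the EPR paper). For correlation functions E measured with detector settings a, a' on one side and b, b' on the other,

$$ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 $$

for any local hidden variable theory, while quantum mechanics allows entangled states to reach $|S| = 2\sqrt{2} \approx 2.83$; it is violations of the classical bound of 2 that experiments like Clauser's and Aspect's measured.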

The most common approach to understanding this result is to say that quantum mechanics is non-local: that the results of measurements made at a particular location can depend on the properties of distant objects in a way that can't be explained using signals moving at the speed of light. This does not, however, permit the sending of information at speeds exceeding the speed of light, though there have been any number of attempts to find a way to use quantum non-locality to do that. Refuting these has turned out to be a surprisingly productive enterprise-- check out David Kaiser's How the Hippies Saved Physics for more details. Quantum non-locality is also central to the problem of information in evaporating black holes, and the "firewall" controversy that has generated a lot of recent activity. There are even some radical ideas involving a mathematical connection between the entangled particles described in the EPR paper and wormholes.

Quantum Physics Is (Mostly) Very Small

Images of a hydrogen atom as seen through a "quantum telescope." (Credit: Stodolna et al., Phys. Rev. Lett.)

Quantum physics has a reputation of being weird because its predictions are dramatically unlike our everyday experience (at least, for humans-- the conceit of my book is that it doesn't seem so weird to dogs). This happens because the effects involved get smaller as objects get larger-- if you want to see unambiguously quantum behavior, you basically want to see particles behaving like waves, and the wavelength decreases as the momentum increases. The wavelength of a macroscopic object like a dog walking across the room is so ridiculously tiny that if you expanded everything so that a single atom in the room were the size of the entire Solar System, the dog's wavelength would be about the size of a single atom within that solar system.
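As a rough numerical sketch of that claim (the mass and speed below are my own illustrative assumptions, not numbers from the article), the de Broglie wavelength λ = h/mv of a walking dog comes out absurdly small:

```python
# Order-of-magnitude sketch: de Broglie wavelength of a walking dog.
# The mass and speed are assumed, illustrative values only.
PLANCK_H = 6.626e-34   # Planck's constant, J*s

mass = 30.0            # kg, a medium-sized dog (assumption)
speed = 1.0            # m/s, a leisurely walk (assumption)

wavelength = PLANCK_H / (mass * speed)   # de Broglie: lambda = h / p

atom_size = 1e-10      # rough atomic diameter, meters
print(f"dog wavelength ~ {wavelength:.1e} m")                         # ~2.2e-35 m
print(f"~ {atom_size / wavelength:.0e} times smaller than an atom")   # ~5e+24
```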

This means that, for the most part, quantum phenomena are confined to the scale of atoms and fundamental particles, where the masses and velocities are small enough for the wavelengths to get big enough to observe directly. There's an active effort in a bunch of areas, though, to push the size of systems showing quantum effects up to larger sizes. I've blogged a bunch about experiments by Markus Arndt's group showing wave-like behavior in larger and larger molecules, and there are a bunch of groups in "cavity opto-mechanics" trying to use light to slow the motion of chunks of silicon down to the point where the discrete quantum nature of the motion would become clear. There are even some suggestions that it might be possible to do this with suspended mirrors having masses of several grams, which would be amazingly cool.

Quantum Physics Is Not Magic

Comic from "Surviving the World" by Dante Shepherd. (http://survivingtheworld.net/Lesson1518.html) Used with permission.

The previous point leads very naturally into this one: as weird as it may seem, quantum physics is most emphatically not magic. The things it predicts are strange by the standards of everyday physics, but they are rigorously constrained by well-understood mathematical rules and principles.

So, if somebody comes up to you with a "quantum" idea that seems too good to be true-- free energy, mystical healing powers, impossible space drives-- it almost certainly is. That doesn't mean we can't use quantum physics to do amazing things-- you can find some really cool physics in mundane technology-- but those things stay well within the boundaries of the laws of thermodynamics and just basic common sense.

So there you have it: the core essentials of quantum physics. I've probably left a few things out, or made some statements that are insufficiently precise to please everyone, but this ought to at least serve as a useful starting point for further discussion.

The rest is here:

Six Things Everyone Should Know About Quantum Physics

A state of vibration that exists simultaneously at two different times – Tech Explorist

Quantum mechanics has an exciting feature: a single event can exist in a state of superposition, happening both here and there, or both today and tomorrow.

Such superpositions are quite challenging to create, as they are easily destroyed if any information about the event's place and time leaks into the surroundings, even if nobody records this information. Once created, though, they lead to observations that are very different from those of classical physics, challenging our very understanding of space and time.

Recently, scientists from EPFL, MIT, and CEA Saclay demonstrated a state of vibration that exists simultaneously at two different times. They evidenced this quantum superposition by measuring the strongest class of quantum correlations between light beams that interact with the vibration.

Using a very short laser pulse, the scientists triggered a specific pattern of vibration inside a diamond crystal. Each pair of neighboring atoms then oscillated like two masses linked by a spring. This oscillation was synchronous across the entire illuminated region.

Light of a new color, shifted toward the red of the spectrum, was emitted during the process to conserve energy.

This classical picture, however, is inconsistent with the experiments. Instead, both light and vibration should be described as particles, or quanta: light energy is quantized into discrete photons, while vibrational energy is quantized into discrete phonons (named after the ancient Greek photo = light and phono = sound).

Therefore, the process described above should be seen as the fission of an incoming photon from the laser into a photon-phonon pair, akin to the nuclear fission of an atom into two smaller pieces.
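In standard Raman-scattering notation (my shorthand, not the paper's), that fission conserves energy as

$$ \hbar\omega_{\mathrm{laser}} = \hbar\omega_{\mathrm{photon}} + \hbar\Omega_{\mathrm{phonon}}, $$

so the emitted photon carries less energy than the incoming laser photon, which is why its color is shifted toward the red, and the missing energy goes into the phonon.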

But it is not the only shortcoming of classical physics. In quantum mechanics, particles can exist in a superposition state, like the famous Schrödinger cat being alive and dead at the same time.

In this new study, scientists successfully entangled the photon and the phonon produced in an incoming laser photon's fission inside the crystal. They did this by designing an experiment in which the photon-phonon pair could be created at two different instants. Classically, this would result in a situation where the pair is created at time t1 with 50% probability, or at a later time t2 with 50% probability.

Here, scientists played a trick to generate an entangled state. They arranged the experiment in such a way that not even the faintest trace of the light-vibration pair creation time (t1 vs. t2) was left in the universe.

In other words, they erased information about t1 and t2. Quantum mechanics then predicts that the photon-phonon pair becomes entangled and exists in a superposition of times t1 and t2. This prediction was beautifully confirmed by the measurements, which yielded results incompatible with the classical probabilistic theory.

By showing entanglement between light and vibration in a crystal that one could hold in their finger during the experiment, the new study creates a bridge between our daily experience and the fascinating realm of quantum mechanics.

Christophe Galland, head of the Laboratory for Quantum and Nano-Optics at EPFL and one of the study's main authors, said, "Quantum technologies are heralded as the next technological revolution in computing and communication. They are currently being developed by top universities and large companies worldwide, but the challenge is daunting. Such technologies rely on very fragile quantum effects surviving only at extremely cold temperatures or under high vacuum.

"Our study demonstrates that even a common material at ambient conditions can sustain the delicate quantum properties required for quantum technologies. There is a price to pay, though: the quantum correlations sustained by atomic vibrations in the crystal are lost after only 4 picoseconds, i.e., 0.000000000004 of a second! This short time scale is, however, also an opportunity for developing ultrafast quantum technologies. But much research lies ahead to transform our experiment into a useful device, a job for future quantum engineers."

Here is the original post:

A state of vibration that exists simultaneously at two different times - Tech Explorist

Counter-Intuitive Quantum Mechanics: State of Vibration That Exists Simultaneously at Two Different Times – SciTechDaily

An especially counter-intuitive feature of quantum mechanics is that a single event can exist in a state of superposition happening both here and there, or both today and tomorrow.

Such superpositions are hard to create, as they are destroyed if any kind of information about the place and time of the event leaks into the surroundings, even if nobody actually records this information. But when superpositions do occur, they lead to observations that are very different from those of classical physics, challenging our very understanding of space and time.

Scientists from EPFL, MIT, and CEA Saclay, publishing in Science Advances, demonstrate a state of vibration that exists simultaneously at two different times, and evidence this quantum superposition by measuring the strongest class of quantum correlations between light beams that interact with the vibration.

The researchers used a very short laser-pulse to trigger a specific pattern of vibration inside a diamond crystal. Each pair of neighboring atoms oscillated like two masses linked by a spring, and this oscillation was synchronous across the entire illuminated region. To conserve energy during this process, a light of a new color is emitted, shifted toward the red of the spectrum.

An illustration representing the common vibe of light and atoms described in this study. Credit: Christophe Galland (EPFL)

This classical picture, however, is inconsistent with the experiments. Instead, both light and vibration should be described as particles, or quanta: light energy is quantized into discrete photons while vibrational energy is quantized into discrete phonons (named after the ancient Greek photo = light and phono = sound).

The process described above should therefore be seen as the fission of an incoming photon from the laser into a photon-phonon pair, akin to the nuclear fission of an atom into two smaller pieces.

But it is not the only shortcoming of classical physics. In quantum mechanics, particles can exist in a superposition state, like the famous Schrödinger cat being alive and dead at the same time.

Even more counterintuitive: two particles can become entangled, losing their individuality. The only information that can be collected about them concerns their common correlations. Because both particles are described by a common state (the wavefunction), these correlations are stronger than what is possible in classical physics. It can be demonstrated by performing appropriate measurements on the two particles. If the results violate a classical limit, one can be sure they were entangled.

1. A laser generates a very short pulse of light.
2. A fraction of this pulse is sent to a nonlinear device to change its color.
3. The two laser pulses overlap on the same path again, creating a "write" and "read" pair of pulses.
4. Each pair is split into a short and a long path,
5. yielding an "early" and a "late" time slot, overlapping once again.
6. Inside the diamond, during the early time slot, one photon from the write pulse may generate a vibration, while one photon from the read pulse converts the vibration back into light.
7. The same sequence may also happen during the late slot. But in this experiment, the scientists made sure that only one vibration is excited in total (in both early and late time slots).
8. By overlapping the photons in time again, it becomes impossible to discriminate the early vs. late moment of the vibration. The vibration is now in a quantum superposition of early and late time.
9. In the detection apparatus, write and read photons are separated according to their different colors, and analyzed with single-photon counters to reveal their entanglement.
Credit: Santiago Tarrago Velez (EPFL)

In the new study, EPFL researchers managed to entangle the photon and the phonon (i.e., light and vibration) produced in the fission of an incoming laser photon inside the crystal. To do so, the scientists designed an experiment in which the photon-phonon pair could be created at two different instants. Classically, it would result in a situation where the pair is created at time t1 with 50% probability, or at a later time t2 with 50% probability.

But here comes the trick played by the researchers to generate an entangled state. By a precise arrangement of the experiment, they ensured that not even the faintest trace of the light-vibration pair creation time (t1 vs. t2) was left in the universe. In other words, they erased information about t1 and t2. Quantum mechanics then predicts that the phonon-photon pair becomes entangled, and exists in a superposition of times t1 and t2. This prediction was beautifully confirmed by the measurements, which yielded results incompatible with the classical probabilistic theory.

By showing entanglement between light and vibration in a crystal that one could hold in their finger during the experiment, the new study creates a bridge between our daily experience and the fascinating realm of quantum mechanics.

"Quantum technologies are heralded as the next technological revolution in computing, communication, sensing," says Christophe Galland, head of the Laboratory for Quantum and Nano-Optics at EPFL and one of the study's main authors. "They are currently being developed by top universities and large companies worldwide, but the challenge is daunting. Such technologies rely on very fragile quantum effects surviving only at extremely cold temperatures or under high vacuum. Our study demonstrates that even a common material at ambient conditions can sustain the delicate quantum properties required for quantum technologies. There is a price to pay, though: the quantum correlations sustained by atomic vibrations in the crystal are lost after only 4 picoseconds, i.e., 0.000000000004 of a second! This short time scale is, however, also an opportunity for developing ultrafast quantum technologies. But much research lies ahead to transform our experiment into a useful device, a job for future quantum engineers."

Reference: "Bell correlations between light and vibration at ambient conditions" by Santiago Tarrago Velez, Vivishek Sudhir, Nicolas Sangouard and Christophe Galland, 18 December 2020, Science Advances. DOI: 10.1126/sciadv.abb0260

Link:

Counter-Intuitive Quantum Mechanics: State of Vibration That Exists Simultaneously at Two Different Times - SciTechDaily

This Incredible Particle Only Arises in Two Dimensions – Popular Mechanics

Physicists have confirmed the existence of an extraordinary, flat particle that could be the key that unlocks quantum computing.


What are the rare and improbable anyons, and how on Earth did scientists verify them?

"[T]hese particle-like objects only arise in realms confined to two dimensions, and then only under certain circumstances, like at temperatures near absolute zero and in the presence of a strong magnetic field," Discover explains.

Scientists have theorized about these flat, peculiar particle-like objects since the 1980s, and the very nature of them has made it sometimes seem impossible to ever verify them. But the qualities scientists believe anyons have also made them sound very valuable to quantum research and, now, quantum computers.


The objects have many possible positions and "remember," in a way, what has happened. In a press release earlier this fall, Purdue University explained more about the value of anyons.


It's these fractional charges that let scientists finally design exactly the right experiments to shake loose the real anyons. A coin sorter is a good analogy for a lot of things, and this time is no different: scientists had to find the right series of sorting ideas in order to build one experimental setup that would, ultimately, only register the anyons. And having the unique quality of fractional charges gave them, at least, a beginning to work on those experiments.

Following an April paper about using a miniature particle accelerator to notice anyons, in July researchers from Purdue published their findings after using a microchip etched to route particles through a maze that phased out all other particles. The maze combined an interferometer (a device that uses waves to measure what interferes with them) with a specially designed chip that activates anyons in a particular state.

Purdue University

What results is a measurable phenomenon called anyonic braiding. This is surprising and good, because it confirms that the particle-like anyons exhibit this particular particle behavior, and because braiding as a behavior has potential for quantum computing. Electrons also braid, but researchers weren't certain the much weaker charge of anyons would exhibit the same behavior.

Braiding isn't just for electrons and anyons, either: photons do it, too. "Braiding is a topological phenomenon that has been traditionally associated with electronic devices," photon researcher Mikael Rechtsman said in October.

He continued:

Now, the quantum information toolkit includes electrons, photons, and what Discover calls "these strange in-betweeners": the anyons.


Excerpt from:

This Incredible Particle Only Arises in Two Dimensions - Popular Mechanics

Quantum Mechanics, the Mind-Body Problem and Negative Theology – Scientific American

Here's how I distinguish science from philosophy. Science addresses questions that can be answered, potentially, through empirical investigation. Examples: What's the best way to defeat COVID-19? What causes schizophrenia, and how should it be treated? Can nuclear power help us overcome climate change? What are the causes of war, and how can we end it?

Philosophy addresses questions that probably can't be solved, now or ever. Examples (and these are of course debatable; some philosophers and scientists insist that science can answer all questions worth asking): Why is there something rather than nothing? Does free will exist? How does matter make a mind? What does quantum mechanics mean?

This final question has absorbed me lately because of my ongoing effort to learn quantum mechanics. Quantum mechanics represents reality at its most fundamental level, that of particles darting through space. Supposedly. That's why science writer and astrophysicist Adam Becker calls his recent book about quantum mechanics What Is Real?

I suspect we'll never have final, definitive answers to what quantum mechanics means and what is real. My reasoning is primarily inductive. For more than a century, experts have sought to interpret quantum mechanics, to specify what it tells us about matter and energy, time and space, the infrastructure of existence.

Physicists and philosophers have come up with lots of possibilities, notably the Copenhagen interpretation, the many-worlds hypothesis and the Bohmian pilot-wave model. I've just become aware of a hypothesis called quantum Bayesianism, or QBism (pronounced "cubism"), which proposes... well, check it out yourself.

Unfortunately, most interpretations don't offer testable predictions to distinguish them from rivals. (An exception is a quantum model proposed by Nobel laureate Roger Penrose, certain versions of which are reportedly ruled out by a recent experiment.) Hence adherents favor one interpretation over others for largely subjective, aesthetic reasons.

You dig the austere minimalism of the Copenhagen interpretation. I favor the pilot-wave model, which insists that particles are particles, and not probabilistic blurs. If I'm feeling frisky, I might go with John Wheeler's metaphysically extravagant "it from bit" proposal, which fuses quantum mechanics and information theory. Arguments about which interpretation is true cannot be resolved, because our preferences are matters of taste, not truth.

When I say a problem is unsolvable, I don't mean we should abandon it. Far from it. I love reading, writing and arguing about intractable puzzles. For example, I don't believe in God, certainly not the God of my Catholic childhood. But I enjoy smart, imaginative theology (defined as the study of God) in the same way that I enjoy good science fiction. Two of my favorite theologians are physicist Freeman Dyson and psychedelic adventurer Terence McKenna.

I'm especially fond of what is known as negative theology. Negative theology assumes that God exists but insists that He/She/It/They transcends human language and concepts. Negative theologians try to say, over and over again, and sometimes with great eloquence, what they acknowledge cannot be said.

Negative theology is an outgrowth of mysticism. Mystical experiences, as defined by William James in The Varieties of Religious Experience, possess two seemingly contradictory properties. They are on the one hand noetic, that is, you feel you are gaining profound insight into and knowledge of reality. They are on the other hand ineffable, meaning you cannot convey your revelation in words.

Mystical aphorisms often emphasize ineffability. "He who knows, does not speak," the ancient Chinese sage Lao Tzu says, violating his own dictum. "He who speaks, does not know." Pseudo-Dionysius the Areopagite, a medieval monk, describes mystical knowledge as being "at one with Him Who is indescribable."

I suspect Wittgenstein had his own mystical experiences in mind when he wrote at the end of his cryptic prose-poem Tractatus Logico-Philosophicus, "Whereof one cannot speak, thereof one must remain silent." (After a friend, a philosopher, quoted that line to me, I replied: "Then why are you still speaking?" The friend hasn't spoken to me since.)

In 1999, while doing research for my book Rational Mysticism, I attended a symposium on mysticism at the University of Chicago. At a session on negative theology, a speaker said he'd arrived by mistake a day early. Upon entering the empty auditorium, he thought, "This is taking negative theology too far." Another speaker described mystical literature as "that which contests its own possibility."

Negative theology can serve as a model for scientists and philosophers trying to solve quantum mechanics and another enigma I posed above: How does matter make a mind? This is the mind-body problem, which investigates, as I argue in a recent book, what we really are, can be and should be, collectively and as individuals. Are we really matter, mind, some combination of the two or, perhaps, none of the above?

When we wrestle with quantum mechanics, we're also taking on the mind-body problem. Quantum paradoxes like Schrödinger's cat and the measurement problem raise questions about the connection between matter and mind, and their status relative to each other. Is matter self-sufficient, as materialists insist, or does reality require mind too?

Mind is essential, according to QBism, "it from bit" and other quantum hypotheses. I have disparaged these mind-centric frameworks as neo-geocentrism, throwbacks to the ancient assumption that the universe revolves around us. But I enjoy mulling them over, just as I enjoy thinking about theodicies, which seek to explain why a loving, all-powerful God would create such a painful, unfair world. (I've even come up with a drug-inspired theodicy of my own.)

Many, perhaps most, scientists and philosophers who dwell on quantum mechanics and the mind-body problem have faith that these conundrums can and will be solved, eventually. They crave answers; they want to know. If they cannot know during their lifetime, they want at least to feel that their efforts are taking us closer to the truth.

Philosopher David Chalmers, who has rejected strictly materialist solutions of what he calls the hard problem of consciousness, nonetheless insists that one day we'll crack it. So does another thinker I admire, philosopher-novelist Rebecca Goldstein. They and other seekers will probably dismiss negative theology as a model for inquiry, and I understand why. I share their craving for a revelation so profound that it dissipates the weirdness of the world.

But I've also become increasingly wary of our craving for absolute knowledge, and absolute certainty, especially when it comes to riddles like what is reality and what are we. People convinced that they possess ultimate knowledge can become self-righteous fanatics, capable of enslaving and exterminating others in the name of truth.

Negative theology helps us avoid fanaticism by keeping us humble. We acknowledge, as an axiom, that ultimate truth will always elude us. Those who have a hard time accepting this anti-truth, and hence the premise of negative theology, should keep two points in mind. First, if we cannot grasp ultimate truth, we can pursue it forever, never losing sight of the mystery at the heart of things.

Second, I'm not proposing negative theology as a model for science as a whole. Science has answered, conclusively, many questions, and it will answer many more, including, I hope, those listed at the beginning of this column. Problems related to infectious disease, mental illness, climate change and war will surely yield to dogged empirical inquiry. Although science will never entirely explain reality, it can make it more bearable.

Further Reading:

For more ruminations on quantum mechanics, the mind-body problem and mysticism, see my new book Pay Attention: Sex, Death, and Science.

Read more:

Quantum Mechanics, the Mind-Body Problem and Negative Theology - Scientific American

Quantum Interference Phenomenon Identified That Occurs Through Time – SciTechDaily

Credit: ULB

Since the very beginning of quantum physics, a hundred years ago, it has been known that all particles in the universe fall into two categories: fermions and bosons. For instance, the protons found in atomic nuclei are fermions, while bosons include photons, which are particles of light, as well as the Brout-Englert-Higgs boson, for which François Englert, a professor at ULB, was awarded a Nobel Prize in Physics in 2013.

Bosons, especially photons, have a natural tendency to clump together. One of the most remarkable experiments demonstrating photons' tendency to coalesce was conducted in 1987, when three physicists identified an effect that has since been named after them: the Hong-Ou-Mandel effect. If two photons are sent simultaneously, each towards a different side of a beam splitter (a sort of semitransparent mirror), one could expect that each photon will be either reflected or transmitted.

Logically, photons should sometimes be detected on opposite sides of this mirror, which would happen if both are reflected or if both are transmitted. However, the experiment has shown that this never actually happens: the two photons always end up on the same side of the mirror, as though they preferred sticking together! In an article published recently in the US journal Proceedings of the National Academy of Sciences, Nicolas Cerf, a professor at the Centre for Quantum Information and Communication (École polytechnique de Bruxelles), and his former PhD student Michael Jabbour, now a postdoctoral researcher at the University of Cambridge, describe how they identified another way in which photons manifest their tendency to stay together. Instead of a semi-transparent mirror, the researchers used an optical amplifier, called an "active" component because it produces new photons. They were able to demonstrate the existence of an effect similar to the Hong-Ou-Mandel effect, but which in this case captures a new form of quantum interference.

Quantum physics tells us that the Hong-Ou-Mandel effect is a consequence of the interference phenomenon, coupled with the fact that both photons are absolutely identical. This means it is impossible to distinguish the trajectory in which both photons were reflected off the mirror on the one hand, and the trajectory in which both were transmitted through the mirror on the other hand; it is fundamentally impossible to tell the photons apart. The remarkable consequence of this is that both trajectories cancel each other out! As a result, the two photons are never observed on the two opposite sides of the mirror. This property of photons is quite elusive: if they were tiny balls, identical in every way, both of these trajectories could very well be observed. As is often the case, quantum physics is at odds with our classical intuition.
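A minimal numerical sketch of that cancellation, assuming an idealized lossless 50:50 beam splitter with the common convention of a 90-degree phase shift on reflection (none of these numbers come from the article itself):

```python
# Hong-Ou-Mandel cancellation for an idealized 50:50 beam splitter.
# Convention (assumed): transmission amplitude t = 1/sqrt(2), reflection r = i/sqrt(2).
import numpy as np

t = 1 / np.sqrt(2)
r = 1j / np.sqrt(2)

# One photon enters each input port. They exit on *opposite* sides if both
# are transmitted or both are reflected.
amp_both_transmitted = t * t        # +1/2
amp_both_reflected = r * r          # -1/2

# Indistinguishable photons: the amplitudes add before squaring, and cancel.
prob_opposite_quantum = abs(amp_both_transmitted + amp_both_reflected) ** 2
print(prob_opposite_quantum)        # 0.0 -> photons are never seen on opposite sides

# Distinguishable "tiny balls": probabilities add instead.
prob_opposite_classical = abs(amp_both_transmitted) ** 2 + abs(amp_both_reflected) ** 2
print(prob_opposite_classical)      # 0.5
```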

The two researchers from ULB and the University of Cambridge have demonstrated that the impossibility to differentiate the photons emitted by an optical amplifier produces an effect that may be even more surprising. Fundamentally, the interference that occurs on a semi-transparent mirror stems from the fact that if we imagine switching the two photons on either side of the mirror, the resulting configuration is exactly identical. With an optical amplifier, on the other hand, the effect identified by Cerf and Jabbour must be understood by looking at photon exchanges not through space, but through time.

When two photons are sent into an optical amplifier, they can simply pass through unaffected. However, an optical amplifier can also produce (or destroy) a pair of twin photons: so another possibility is that both photons are eliminated and a new pair is created. In principle, it should be possible to tell which scenario has occurred based on whether the two photons exiting the optical amplifier are identical to those that were sent in. If it were possible to tell the pairs of photons apart, then the trajectories would be different and there would be no quantum effect. However, the researchers have found that the fundamental impossibility of telling photons apart in time (in other words, it is impossible to know whether they have been replaced inside the optical amplifier) completely eliminates the possibility itself of observing a pair of photons exiting the amplifier. This means the researchers have indeed identified a quantum interference phenomenon that occurs through time. Hopefully, an experiment will eventually confirm this fascinating prediction!

Reference: "Two-boson quantum interference in time" by Nicolas J. Cerf and Michael G. Jabbour, 11 December 2020, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2010827117

See more here:

Quantum Interference Phenomenon Identified That Occurs Through Time - SciTechDaily

9 Most Confusing Sci-Fi Movies That Feel Like You Need a PhD in Quantum Physics – FandomWire

Science fiction is a genre that defies all human imagination. It forces the human mind to think and wander, unlike other genres. It's an all-encompassing genre that covers everything from parallel universes to world-destroying viruses. And then there are movies that go above and beyond. They are so convoluted and cunning that they leave our brains in knots by the time the credits roll.

Before we begin, this list is not going to pander to mainstream audiences. We will not be talking about Interstellar, Inception, and Predestination here. So if you were expecting a typical list of sci-fi movies, this list is not for you.

Arthouse movies are not everyone's cup of tea. Alphaville sits atop the sci-fi arthouse throne, so it's a very, very niche movie. Legendary experimental filmmaker Jean-Luc Godard gives us this classic. Detective Lemmy Caution is sent to infiltrate a dystopian city named Alphaville, which is governed by a sentient A.I. called Alpha 60. Using mind control and brainwashing, Alpha 60 controls the entire city's population. The movie explores several tropes like Surrealism, French New Wave cinema, and the concept of individuality and creativity. It's a movie that relies on a solid story and incredible concepts to drive the plot forward. And it's damn interesting to watch.

From the outside, Ad Astra looks like any normal sci-fi flick. But there's a deeper narrative, something most people will miss on their first watch. Ad Astra is not about a distant son trying to get his estranged father home. It's about relationships across distance, the distance being in light-years. Does a father remain a father to a son if he is no longer within the boundaries of our solar system? Where do love and passion end and madness begin? The most important part of the story is its slow, unnerving pace. You may think you have figured out the movie, but re-watch it and you will find something new every time.

The movie begins like a typical exploration journey. A mysterious meteor strike leads to a region of the United States being enclosed in a quarantined space called the Shimmer. Literally nothing about the Shimmer is ever truly explained. A group of scientists enters the Shimmer in search of another search party. Each of them ends up dying in super mysterious ways. Add to that the movie's non-linear narrative and super-short foreshadowing, and you have a movie as mysterious as it can get.

If you are not familiar with Kabbalah, a Jewish esoteric discipline and school of thought, do not even consider watching this film. The lead protagonist is also the narrator. He is unreliable, a paranoid schizophrenic, and talks in a language that only scientists would understand. The movie revolves around the search for a mysterious number that could explain literally everything happening in the universe. The lead protagonist's paranoia inserts itself into every scene, making an already hard-to-understand movie nigh impossible to decipher.

Upstream Color is the antithesis of convoluted sci-fi. It's beautiful, well-planned, and linear. A group of criminals uses mysterious parasites to induce a hyper-hypnotic state in their victims. The victims are then vulnerable to suggestion; they will do anything the criminals ask them to. Things become complicated when a parasite is inserted into a pig. There's also a telepathic link between two people who believe they are in love, but later they realize they are both victims of the same parasite attack. The narrative is also extremely obscure and never gives us a definitive conclusion.

This movie is considered one of the greatest sci-fi masterpieces of all time. It's also one of the most confusing. A black monolith appears out of nowhere and prehistoric humanity is taught how to use tools. Flash forward to the future: another monolith appears somewhere else in our solar system, and humans send a spaceship to investigate. Meanwhile, there's also a crazed A.I. aboard that the humans need to deal with. A star-gate scene further complicates matters. There's also a very disconcerting space-faring baby whose true purpose is as bewildering as it gets.

Darren Aronofsky's metaphysical drama went through a lot of hoops before it finally hit theaters. The movie is about the human acceptance of mortality and death. There are three different storylines in one movie, each running parallel to the other two. Hugh Jackman and Rachel Weisz play the lead characters in the three arcs. The movie is filled to the brim with powerful symbolism that might bounce right off your head. There are also various historical allegories that are force-fit into the movie's non-linear pace. This movie will leave your head spinning.

The best movies do not rely on CGI or star power. They rely on clever storytelling, making maximum use of whatever resources they have at their disposal. Coherence is a movie that deals with alternate realities. A comet passes over the Earth and the power goes out. People attending a dinner party go to the only house in the neighborhood that still has power, leaving a note. When they come back, they find the same note in front of their home. It's not long before they realize that the comet has opened a portal into parallel worlds. The real question is who among the people that went out and came back is actually from an alternate universe.

The irony is that sci-fi has always been a genre of high-value projects. To make one, you need deep pockets. But the most confusing movie of all time was made on a shoestring budget. It explores the most disturbingly difficult science fiction plot element: time travel. Two part-time inventors accidentally come up with time travel technology. What follows is literal chaos. Multiple timelines exist simultaneously, and there are so many parallel versions of the same time traveler that time itself could fragment into a zillion pieces out of sheer confusion.

View post:

9 Most Confusing Sci-Fi Movies That Feel Like You Need a PhD in Quantum Physics - FandomWire

Expanding the Scope of Electronic-Structure Theory – Physics

December 16, 2020 • Physics 13, 196

An efficient new approach makes density-functional simulations feasible over larger length scales.

R. Godby/University of York

Density-functional theory (DFT) has, since the 1970s, had a huge impact on our understanding of condensed-matter physics through its ability to describe the effect of the electrons' mutual interaction on the electronic structure of matter. However, in solids in which successive crystal unit cells are no longer exact repetitions of one another, the usual approach for implementing DFT can run out of steam. Now, Tristan Müller at the Max Planck Institute of Microstructure Physics, Germany, and colleagues have devised an efficient new way to implement DFT in the presence of such a long-wavelength variation [1]. Their technique potentially extends the scope of DFT to encompass phenomena of technological interest, such as skyrmions and magnetic domain walls.

The key that unlocks a material's electronic structure is an almost magical result known as Bloch's theorem, which greatly assists the solution of Schrödinger's equation [2]. Rather than having to take account of arbitrary mixing of the atomic wave functions of the solid's infinite number of atoms, Bloch showed that the atoms in each unit cell of the solid contribute equally to any wave function. The contribution of each unit cell differs from its neighbors only by a phase factor that is a fixed characteristic of a given wave function. This phase factor is normally described through a Bloch wave vector, the k point. In essence, this idea means that solving Schrödinger's equation for electrons in a periodic solid is little costlier than solving it for a single unit cell. The efficiency of this approach facilitated the development of electronic-structure calculations for solids in the early decades of quantum mechanics [3].
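In its usual textbook form (not specific to the new paper), Bloch's theorem states that every eigenstate of a periodic solid can be written as

$$ \psi_{n\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}}\, u_{n\mathbf{k}}(\mathbf{r}), \qquad u_{n\mathbf{k}}(\mathbf{r}+\mathbf{R}) = u_{n\mathbf{k}}(\mathbf{r}), $$

where R is any lattice vector: a plane-wave phase factor (set by the Bloch wave vector k) multiplying a function that simply repeats from one unit cell to the next.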

The first such calculations did not take into account the effect of the electrostatic repulsion between electrons on the material's electronic structure. Correcting this shortcoming is where DFT comes in [4, 5]. In DFT, a Schrödinger equation is still solved for each electron in turn, but with the periodic potential felt by the electrons now modified by the periodic density of the electrons themselves within each unit cell. Crucially, the power of Bloch's theorem is preserved. The combination of quantitative accuracy and efficiency fueled the explosion in applications of DFT to crystalline solids from the 1970s onwards [6].

What if the system under study is not a periodic solid but is nevertheless infinite? Often, the concept of a supercell is useful: a larger unit cell within which a periodic atomic arrangement can still be assumed to a good approximation. (The simplest example would be an antiferromagnetic material, in which the alternating spins of neighboring atoms double the periodicity.) The power of Bloch's theorem is then regained, albeit with increased computational cost reflecting the presence of, perhaps, dozens of atoms in the new unit cell rather than just one or two. The cost typically scales with the cube of this number of atoms, so supercell calculations can be very (even prohibitively) costly. If, for example, the supercell is 10 times larger than the basic unit cell in each direction, then the reciprocal lattice becomes 10 times finer in each direction. This expansion greatly increases the number of coefficients that must be calculated for each electronic wave function and for the corresponding DFT potentials.
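A back-of-the-envelope sketch of that scaling, taking the article's cubic cost estimate at face value and using assumed, purely illustrative cell sizes:

```python
# Rough cost estimate for a supercell DFT calculation, assuming cost ~ N_atoms**3
# (the cubic scaling quoted in the text). All sizes here are illustrative assumptions.
atoms_in_unit_cell = 2          # e.g. a simple two-atom basis (assumption)
scale_per_direction = 10        # supercell 10x larger in each direction (as in the text)

atoms_in_supercell = atoms_in_unit_cell * scale_per_direction ** 3
relative_cost = (atoms_in_supercell / atoms_in_unit_cell) ** 3

print(atoms_in_supercell)       # 2000 atoms
print(f"{relative_cost:.0e}")   # ~1e+09 times the cost of the unit-cell calculation
```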

Tackling this scaling problem is the purpose of the new work by Müller and colleagues. To achieve their goal, the researchers developed a flexible approach that is closely related to the concept of satellite peaks in x-ray crystallography and electron diffraction. There, the observed image is a series of diffraction peaks that is essentially the Fourier transform of the structure of the solid under observation. If the solid is modulated by acquiring a new, longer periodicity, then each diffraction peak becomes surrounded by a finely spaced set of a few satellite peaks (Fig. 1).

Away from each original peak position, the intensity of the satellites falls off quickly, provided that the modulation has a long wavelength. In the language of a supercell DFT calculation, this behavior means that much information can be neglected to a good approximation: only a few satellite coefficients need be calculated in place of each original coefficient (the coefficients that describe the electronic wave function, charge, or magnetization density for the original periodic solid).

Müller and colleagues, then, address the situation of a periodic solid upon which an additional spatial variation on a long length scale is imposed, either an externally applied potential or a spontaneous internal adjustment of the electrons themselves, such as a charge-density wave. Their approach is equivalent to a DFT supercell calculation plus certain well-founded approximations arising from the retention of a limited set of satellite coefficients, which is the key to the efficiency. As is common in DFT, the local electronic structure is represented using a compact set of functions within each unit cell. Meanwhile, the satellite aspects of the wave functions and densities are naturally described using long-wavelength plane waves, which allows these parts of the calculation to benefit from the numerical efficiency of fast Fourier transforms.

As a demonstration of their technique, Müller and colleagues present three examples: the spin-spiral state of the γ phase of Fe; coupled spin and charge-density waves in Cr; and LiF with an externally applied potential. It is noteworthy that their method need not start from any rigid assumption about the modulation of the original solid, other than that it is on a length scale of many unit cells. When the electronic ground state, as given by DFT, is found, the nature of the modulation (spin spiral or charge-density wave, for example) emerges naturally from the calculation.

When the researchers compare their model's results with full supercell calculations, it is clear that the two methods are not yet in perfect alignment. However, given sufficient computer power, this mismatch should narrow. Looking beyond materials' electronic ground states, Müller and colleagues foresee the application of their approach to the time dependence of such modulated solids, making use of time-dependent DFT [7]. This ability should enable the ab initio simulation of the dynamic coupling between the electronic wave functions on an atomic scale with, say, electromagnetic waves on a longer length scale in a plasmonic optoelectronic device. For designers of such nanostructures, the electromagnetic waveforms emitted in response to some intense applied pulse could therefore take proper account of the quantum-mechanical motion of the electrons, without the limitations of perturbation theory.

Rex Godby is an emeritus professor of theoretical physics in the Department of Physics, University of York. His research focuses on the quantum dynamics of systems of interacting electrons and other complex systems, including studies of the exact Kohn-Sham potential of time-dependent density-functional theory (TDDFT), together with the development of improved approximate TDDFT functionals for dynamical problems. After completing his Ph.D. at the University of Cambridge in 1984, Godby was a postdoctoral researcher at Bell Labs, New Jersey, and then returned to Cambridge in 1986 as a research fellow, moving to York in 1995.


Read more here:

Expanding the Scope of Electronic-Structure Theory - Physics

Physicists attempt to unify all forces of nature and rectify Einstein’s biggest failure – Livescience.com

In his waning years, Albert Einstein spent his time tilting at windmills, trying to unify all the forces of nature. He died disappointed, and his attempt would go down in history as his biggest failure.

But Einstein's failed dream could yet become his ultimate triumph, as a small group of theoretical physicists rework his old ideas. It won't necessarily bring all the forces of the universe together, but it could explain some of the most pressing issues facing modern science.

The most successful theory of gravity known to humanity is Einstein's famous theory of general relativity. Einstein spent more than seven years developing it, and it was worth the wait. On the surface, general relativity is deceptively simple. All of the drama of the universe takes place on the grand, four-dimensional stage called space-time. Matter and energy, the actors and actresses of the cosmos, run around doing their thing, saying their lines. Matter and energy deform space-time, causing it to warp and curve. That warping in turn tells the matter and energy how to move and behave.

Related: 8 ways you can see Einstein's theory of relativity in real life

And voilà: general relativity! The constant dialogue between the space-time stage and the matter and energy on it is what we see as the force of gravity.
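That dialogue has a famously compact mathematical statement, the Einstein field equations (the standard form, not written out in the article):

$$ G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}, $$

where the left side ($G_{\mu\nu}$, the Einstein tensor) encodes the curvature of space-time and the right side ($T_{\mu\nu}$, the stress-energy tensor) encodes the matter and energy doing the deforming.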

Einstein's theory has passed every observational test thrown at it, which is why it's survived for the century since its birth. It has predicted and explained strange phenomena across the universe, including the bending of light around massive objects and the formation of black holes.

And yet, we know that it's broken. While general relativity says that black holes should exist, it completely breaks down when it tries to describe their singular hearts. We have no description of gravity at such a subatomic scale where quantum mechanics holds sway. At this scale, when gravity gets both strong and short-range, general relativity can't even make predictions - the math just falls apart.

Those are places where we know that general relativity breaks down. But beyond that, astronomers have noticed two phenomena that also aren't completely explained by general relativity: Most of the matter in the universe (so-called dark matter) doesn't interact with light; and the expansion of the universe is accelerating every single day (which is thought to be caused by as-yet-unknown dark energy). In order to explain dark matter and dark energy, we have two choices. Either general relativity is perfectly correct, but our cosmos is filled with strange new substances, or general relativity is flat-out wrong.

Einstein himself tried to push past the limits of general relativity. But he wasn't motivated by the puzzles of black hole singularities or an accelerating universe; nobody knew that those existed, let alone that they would become major theoretical challenges.

Instead, Einstein was motivated by a higher purpose: an attempt to unify all the (known) laws of physics in a single mathematical framework. In his case, he had gravity on one side, represented by his now-famous general relativity, and electromagnetism on the other, represented by Maxwell's equations that described everything from magnets and electrical currents to light itself.

In his attempts to make a super-theory of everything, Einstein introduced General Relativity 2.0. The basic version of relativity only cares about space-time's curvature. But Einstein's reboot also paid attention to space-time's twistiness, or torsion. There was no need to include torsion in his original theory, because it turned out that all you needed was curvature to explain gravity. But now that Einstein was trying to explain more than gravity, he had to include additional effects.

Related: The 18 biggest unsolved mysteries in physics

Einstein had hoped that the twistiness of space-time would somehow be connected to electromagnetism (the same way that the curvature of space-time is connected to gravity), but alas, he couldn't find any solutions, and his new theory died with him.

But other physicists never gave up the dream, and they have been attempting to unify physics ever since. One of the most well-developed concepts is called string theory, which claims that all particles are really tiny little vibrating strings. Oh, and our universe has extra spatial dimensions that are all tiny and curled up.

String theory was never based on Einstein's original idea of the twistiness of space-time, but now physicists are giving that old idea, which is called teleparallel gravity, a second look.

The name "teleparallel" comes from Einstein's original work that examined the nature of distant parallel lines in his geometric framework, exploring how both the curvature and twistiness of space-time affected the motion of matter and energy. Physicists nowadays don't think teleparallel gravity can unify physics (even Einstein himself eventually gave up on the idea), but it may be an interesting candidate for a new theory of gravity.

That's because theorists have been using teleparallel gravity to explain things like the accelerated expansion of the universe, the early period after the Big Bang when the universe ballooned (called "inflation"), and more recent problems, such as an observed conflict between different measurements of the expansion rate of the cosmos. In other words, teleparallel gravity has proven to be pretty predictive.

But what about those early dreams of a unified theory? Teleparallel gravity may be an interesting and useful new approach to gravity, but it doesn't get us any closer to understanding a more fundamental law of physics. Instead, physicists have been using the language of string theory to do that job, so naturally the question came up: Does string theory, which claims to be an ultimate theory of everything, in any way connect to teleparallel gravity? In other words, if teleparallel gravity can potentially solve all these nasty problems like dark matter and dark energy, does it flow as a natural consequence of string theory, or are these two separate lines that don't have any connection to each other?

Recently, theoretical physicists have begun to tie teleparallel gravity to string theory, providing a motivation for the theory within the stringy universe, as reported in a paper posted to the arXiv preprint server in November. In that work, the researchers showed how teleparallel gravity can be a consequence of string theory. This is an important insight, because string theory should be able to explain all laws of physics, and if teleparallel gravity is a better version of general relativity, and ultimately turns out to be correct, then you should be able to derive teleparallelism from the math of string theory.

Here's an analogy. Let's say police identify a murder weapon at a crime scene (general relativity). They have a prime suspect (string theory) that they want to connect to the murder weapon. But new analysis of the crime scene reveals that a different weapon (teleparallelism) actually caused the murder. Can the prime suspect still be connected to the new murder weapon?

The short answer is: yes.

There's a lot more work to be done. String theory isn't finished yet (and may never be finished, if we never figure out firm mathematical solutions), so any connection it can make to reality is useful. If teleparallel gravity turns out to be a useful way to explain some of the current shortcomings of general relativity, and we can derive teleparallelism from string theory, then that's one more step in achieving Einstein's ultimate dream of unification: not the way he envisioned it, but it still counts.

Originally published on Live Science.

The rest is here:

Physicists attempt to unify all forces of nature and rectify Einstein's biggest failure - Livescience.com

Black dwarf supernovae: The last explosions in the Universe – SYFY WIRE

Here's a happy thought: The Universe may end in a whimper and a bang. A lot of bangs.

Calculations done by an astrophysicist indicate that in the far future, the Universe will have sextillions of objects called black dwarfs, and that eventually they can explode like supernovae. In fact, they may represent the very last things the Universe can do.

But this won't happen for a long time. A very, very, very long time*. So long from now that I'm having difficulty figuring out how to explain how long it'll be. I'll get to it (your brain will be stomped flat by it, I promise), but we need to talk a bit first about stars, and nuclear fusion, and matter.

Stars like the Sun release energy as they fuse hydrogen atoms into helium atoms in their cores. It's very much like the way a hydrogen bomb works, but on a massively larger scale; the Sun outputs about the equivalent energy of one hundred billion one-megaton bombs. Every second.
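To put a number on that comparison (a minimal back-of-the-envelope sketch, not from the article, assuming the standard solar luminosity of about 3.8 × 10^26 watts and roughly 4.2 × 10^15 joules per megaton of TNT):

    # Rough check: how many one-megaton bombs per second match the Sun's output?
    SOLAR_LUMINOSITY_W = 3.8e26   # joules radiated by the Sun every second
    MEGATON_TNT_J = 4.2e15        # energy released by a one-megaton explosion
    bombs_per_second = SOLAR_LUMINOSITY_W / MEGATON_TNT_J
    print(f"{bombs_per_second:.1e}")  # ~9e10, i.e. roughly one hundred billion

The exact value depends on which constants you plug in, but it lands right around the hundred-billion figure quoted above.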

Eventually the hydrogen runs out. A lot of complicated things can happen then depending on how massive the star is, what's in it, and more. But for stars up to about 8 to 10 times the mass of the Sun, the outer layers all blow away, exposing the core to space; a core that has become a ball of material so compressed that weird quantum mechanics rules come into play. It's still made up of atomic nuclei (like oxygen, magnesium, neon, and such) and electrons, but they're under incredible pressures, with the nuclei practically touching. We call such a material degenerate matter, and the object itself is called a white dwarf.

For stars like this, that's pretty much the end of the road. The kind of fusion process they enjoyed for billions of years (thermonuclear fusion, where, hugely simplified, the atomic nuclei are so hot they slam into each other and fuse) can't work any more. The white dwarf is born very hot, hundreds of thousands of degrees Celsius, but without an ongoing heat source it begins to cool.

That process takes billions of years. White dwarfs that formed in the early Universe are just now cool enough to be red hot, around 4,000° C.

But the Universe is young, only about 14 billion years old. Over very long periods of time, those white dwarfs will cool further. Eventually, they'll cool all the way down to just about absolute zero: -273° C. That will take trillions of years, if not quadrillions. Much, much longer than the Universe has already existed.

But at that point the degenerate matter objects won't emit any light. They'll be dark, which is why we call them black dwarfs.

So is that it? Just black dwarfs sitting out there, frozen, forever?

Well, maybe not, and this is where things start to get weird (yes, I know, they're already weird, but just you wait a few paragraphs). Currently, physicists think that protons, one of the most basic subatomic particles, can decay spontaneously. On average this takes a very long time. Experimental evidence has shown that the proton half-life must be at least 10^34 years. That's a trillion trillion times longer than the current age of the Universe.
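For the curious, that multiplier is easy to check (a minimal sketch, assuming a 10^34-year half-life and a roughly 1.4 × 10^10-year-old Universe):

    # How many current ages of the Universe fit into a 1e34-year proton half-life?
    PROTON_HALF_LIFE_YR = 1e34
    AGE_OF_UNIVERSE_YR = 1.4e10
    ratio = PROTON_HALF_LIFE_YR / AGE_OF_UNIVERSE_YR
    print(f"{ratio:.0e}")  # ~7e23, on the order of a trillion trillion (1e24)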

If true, that means that the protons inside the atomic nuclei in the black dwarfs will decay. If they do, then after some amount of time, 10^35 or more years, the black dwarfs will evaporate. Poof. Gone. At that point all that will be left are even denser neutron stars and black holes.

But proton decay, while predicted by current particle theory, hasn't yet been observed. What if protons don't decay? What happens to black dwarfs then?

That's where this new paper comes in. It turns out that there are other quantum mechanics effects that become important, like tunneling. Atomic nuclei are loaded with protons, which have a positive charge, so the nuclei repel each other. But they are very close together in the center of the black dwarf. Quantum mechanics says that particles can suddenly jump very small distances in space (that's the tunneling part, and of course it's far more complicated than my overly simple synopsis here), and if one nucleus jumps close enough to another, kablam! They fuse, form a heavier element nucleus, and release energy.

This is different than thermonuclear fusion, which needs lots of heat. This kind doesn't need heat at all, but it does need really high density, so it's called pycnonuclear fusion (pycno in ancient Greek means dense).

Over time, the nuclei inside the black dwarf fuse, very very slowly. The heat released is minimal, but the overall effect is that they get even denser. Also, like in normal stars, the nuclei that fuse create heavier nuclei, up to iron.

That's a problem. The effect holding the star up against its own intense gravity is degeneracy pressure between electrons. When you try to fuse iron, it eats up electrons. If enough iron fuses, the electrons go away, the support for the object goes with them, and it collapses.

This happens with normal stars too. They have to be pretty massive, more than 8 to 10 times the mass of the Sun (so the core is at least 1.5 or so times the Sun's mass). But for stars like those the core suddenly collapses, the nuclei smash together and form a ball of neutrons, what we call a neutron star. This also releases a lot of energy, creating a supernova.

This will happen with black dwarfs too! When enough iron builds up, they too will collapse and explode, leaving behind a neutron star.

But pycnonuclear fusion is an agonizingly slow process. How long will that take before the sudden collapse and kablooie?

Yeah, I promised earlier that I'd explain this number. For the highest mass black dwarfs, which will collapse first, the average amount of time it takes is, well, 10^1,100 years.

That's 10 to the 1,100th power. Written out, it's a 1 followed by eleven hundred zeroes.

I... I don't have any analogies for how long that is. It's too huge a number to even have any kind of rational meaning to the pathetic globs of meat in our skulls.

I mean, seriously, try writing it out.

That's a lot of zeroes. Feel free to make sure I got the number right.
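If you'd rather not count zeroes by hand, a minimal sketch (in Python, purely as an illustration) builds the written-out number and checks it:

    # Build a 1 followed by 1,100 zeroes and confirm it really is 10 to the 1,100th power.
    number = "1" + "0" * 1100
    assert int(number) == 10 ** 1100
    print(number)           # the full written-out number
    print(len(number) - 1)  # 1,100 zeroes after the leading 1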

I tried to break it down into smaller units that make sense, but c'mon. One of the largest numbers we've named is a googol, which is 10^100, a one followed by 100 zeroes.

The number above is a googol^11, a googol to the 11th power (since (10^100)^11 = 10^1,100).

And that's the black dwarfs that go first. The lowest mass ones take much longer.

How much longer? I'm not terribly glad you asked. They collapse after about 10^32,000 years.

That's not a typo. It's ten to the thirty-two-thousandth power. A one with 32,000 zeroes after it.

OK then.

I'll note that this is for stars that start out more massive than the Sun. Stars like ours aren't massive enough to get the pycnonuclear fusion going (they don't have enough mass to squeeze the core into the density needed for it), so when they turn into black dwarfs, that's pretty much it. After that, nothing.

Assuming protons don't decay, I'll note again. They probably do, so perhaps this is all just playing with physics without an actual outcome we can see (not that we'll be around to see it anyway). Or maybe we're wrong about protons, and in that unimaginably distant future the Universe will consist of neutron stars, black holes, low-mass black dwarfs like the Sun, and something like a sextillion black dwarfs that will one day collapse and explode.

Black holes, I'll note, evaporate as well, and the last of those should go in less than a googol years. If so, then black dwarf supernovae may be the last energetic events the Universe can muster. After that, nothing. Heat death. Infinite cold for infinite time.

Oh hey, it gets worse. The Universe is expanding, but the part of it that we can see, the observable Universe, is actually shrinking. This has to do with dark energy and the accelerated expansion of the Universe, which I have explained elsewhere. But by the time the black dwarfs start to explode, the Universe we can see will have shrunk to the size of our own galaxy. Well, what's left of it by then. Odds are the black dwarfs will be scattered so far by then that there won't even be one left in our observable frame.

That's a rip-off. You'd think that waiting that long would have some payoff.

So why go through the motions to calculate all this? I actually think it's a good idea. For one thing, science is never wasted. It's possible this may all be right.

Also, the act of doing the calculation could yield interesting side results, things that have implications for the here-and-now that might be observable (like the decay of protons). There could be some tangible benefit.

But really, for my money, this act of spectacular imagination is what science is all about. Push the limits! Exceed the boundaries! Ask, "What's next? What happens after?" This expands our borders, pushes back at our limitations, and frees the brain (within the limits of the known physics and math) to pursue avenues otherwise undiscovered.

Seeking the truth can be a tough road, but it does lead to understanding, and there's beauty in that.

*That links to an article written by my SYFY WIRE colleague Jeff Spry about this topic when it first came out a while back. He gives a good summation of it, but after reading the paper myself I wanted to do a deeper dive. And, to be honest, I could write an article three times this long on this topic. There's a lot going on here.

Link:

Black dwarf supernovae: The last explosions in the Universe - SYFY WIRE