Category Archives: Quantum Physics

Why are there exactly 3 generations of particles? – Big Think

Everything that exists in our Universe, as far as we understand it, is made up of particles and fields. At a fundamental level, you can break everything down until you reach the limit of divisibility; once things can be divided no further, we proclaim that we've landed upon an entity that's truly fundamental. To the best of our current understanding, there are the known elementary particles (those represented by the Standard Model of elementary particle physics), and then there are the unknowns: things that must be out there beyond the confines of the Standard Model, but whose nature remains unknown to us.

In the latter category are things like dark matter, dark energy, and the particle(s) responsible for creating the matter-antimatter asymmetry in our Universe, as well as any particles that would arise from a quantum theory of gravity. But even within the Standard Model, there are things for which we don't quite have an adequate explanation. The Standard Model consists of two types of particles:

While there's only one copy of each of the bosons, for some reason there are three copies of each of the fermionic particles: they come in three generations. Although it's long been accepted and robustly experimentally verified, the three-generational nature of the Standard Model is one of the great puzzles of nature. Here's what we know so far.

On the right, the gauge bosons, which mediate the three fundamental quantum forces of our Universe, are illustrated. There is only one photon to mediate the electromagnetic force, there are three bosons mediating the weak force, and eight mediating the strong force. This suggests that the Standard Model is a combination of three groups: U(1), SU(2), and SU(3).

Although the Standard Model possesses an incredibly powerful framework, leading to, by many measures, our most successful physical theory of all time, it also has limitations. It makes a series of predictions that are very robust, but then has a large number of properties that we have no way of predicting: we simply have to go out and measure them to determine just how nature behaves.

The particles and forces of the Standard Model. Any theory that claims to go beyond the Standard Model must reproduce its successes without making additional predictions that have already been shown to not be true. Pathological behavior that would already be ruled out is the largest source of constraints on beyond-the-Standard Model scenarios.

But what the Standard Model doesn't tell us is also profound.

All of these things can only, at least as we currently understand it, be measured experimentally, and it's from those experimental results that we can determine the answers.


Fortunately, we're good enough at experimental particle physics that we've been able to determine the answers to these questions through a series of both clever and brute-force observations and experiments. Every single one of the Standard Model's particles and antiparticles has been discovered, their particle properties have been determined, and the full scope of what exists in the Standard Model (three generations of fermions that are all massive, and in which quarks of like charges and the massive neutrinos all mix together) is now unambiguous.

The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created, and also describe how they will curve spacetime in General Relativity. The properties of particles, fields, and spacetime are all required to describe the Universe we inhabit, but the actual values of these masses are not determined by the Standard Model itself; they must be measured to be revealed.

The two major ways that we know there are three generations of fermions (no more and no less) are as follows.

1.) The Z-boson, the neutral but very massive weak boson, has a series of different decay pathways. About 70% of the time, it decays into hadrons: particles made up of quarks and/or antiquarks. About 10% of the time, it decays into charged leptons: either the electron (1st generation), muon (2nd generation), or tau (3rd generation) flavor, all with equal probabilities. And about 20% of the time (predicted to be exactly double the frequency with which it decays into charged leptons), it decays into neutral leptons: the neutrinos, with equal probability for each of the various flavors.

These neutrino decays are invisible, since it would take about a light-year's worth of lead to have a 50/50 shot of detecting your average neutrino. The fact that the fraction of Z-bosons that decays into invisible constituents (i.e., neutrinos) is exactly double the fraction that decays into the known charged leptons tells us that there are only three species of neutrinos below half the mass of the Z-boson, or around 45 GeV/c². If there is a fourth generation, its neutrino (the neutrino being the lightest massive particle in each of the three known generations) is more than a trillion times more massive than any of the other neutrinos.

The final results from many different particle accelerator experiments have definitively shown that the Z-boson decays to charged leptons about 10% of the time, to neutral leptons about 20% of the time, and to hadrons (quark-containing particles) about 70% of the time. This is consistent with 3 generations of particles and no other number.
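As a rough cross-check, the branching fractions quoted above can be turned into a count of light neutrino species. The sketch below uses rounded illustrative numbers (the per-flavor charged-lepton fraction of ~3.37% is an assumption of this illustration, inferred from the ~10% total):

```python
# Hedged sketch: counting light neutrino species from approximate Z-boson
# branching fractions. Each neutrino flavor is produced about twice as
# often as each charged-lepton flavor, as described in the text.

B_invisible = 0.20             # total fraction to neutrinos (~20%)
B_per_charged_lepton = 0.0337  # fraction to each of e, mu, tau (~3.37%)

# Invisible width divided by twice the per-flavor charged-lepton width:
n_neutrino_species = B_invisible / (2 * B_per_charged_lepton)

print(f"Inferred light neutrino species: {n_neutrino_species:.2f}")  # ~2.97
```

The small deviation from exactly 3 reflects the rounding here; the actual combined LEP fit gave N_ν = 2.984 ± 0.008.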

2.) The presence of neutrinos that were created in the early Universe, during the first ~second of the hot Big Bang, imprints itself onto other observable cosmic signals.

In addition to the constraints on neutrinos, there are no additional charged leptons or quarks at masses at or below 1.2 and 1.4 TeV, respectively, from experimental constraints at the Large Hadron Collider (and the fact that probabilities must always add up to 100%).

All told, this strongly disfavors the existence of a fourth (or higher) generation of particles.

If there were no oscillations due to matter interacting with radiation in the Universe, there would be no scale-dependent wiggles seen in galaxy clustering. The wiggles themselves, shown with the non-wiggly part (blue, top) subtracted out (bottom), are dependent on the impact of the cosmic neutrinos theorized to be present by the Big Bang. Standard Big Bang cosmology with three neutrino species corresponds to β = 1.

With the exception of the neutrinos, which appear to be just as stable in the electron species as they are in either the muon or tau species, the only stable charged particles (including neutral composite particles with charged, fundamental constituents) in the Universe are made out of first-generation quarks and leptons. The muon is the longest-lived unstable particle, and even it only has a mean lifetime of 2.2 microseconds. If you have a strange (or heavier) quark, your lifetime is measured in nanoseconds or less; if you have a tau lepton, your lifetime is measured in fractions of a picosecond. There are no stable species that contain second- or third-generation quarks or charged leptons.
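To put those lifetimes on a common footing, one can convert each mean lifetime τ into a characteristic decay length c·τ, the distance light travels in one lifetime. A minimal sketch with standard rounded values (the K⁰_S is chosen here purely as an illustrative strange-quark-containing particle):

```python
# Convert mean lifetimes into characteristic decay lengths c * tau.
# Lifetime values are standard rounded figures; treat them as approximate.

C = 299_792_458.0  # speed of light in m/s

lifetimes_s = {
    "muon (2nd-generation lepton)": 2.2e-6,      # ~2.2 microseconds
    "K0_S (contains a strange quark)": 9.0e-11,  # tens of picoseconds
    "tau (3rd-generation lepton)": 2.9e-13,      # fractions of a picosecond
}

for name, tau in lifetimes_s.items():
    print(f"{name}: c*tau = {C * tau:.3g} m")
```

The muon's ~660 m decay length is why it reaches the ground from cosmic-ray showers, while a tau lepton travels well under a millimeter before decaying.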

There are no hints in the decays of the most massive particles (the W, the Z, the Higgs, or the top quark) that there are any particles in addition to the ones we know. When we look at the mass ratios of the different generations, we find that the four separate types of particles (the up-type quarks, the down-type quarks, the charged leptons, and the neutrinos):

all have significantly different mass ratios between the generations from one another. In addition, although quarks mix with one another and neutrinos mix across the generations, the ways in which they mix are not identical to each other. If there is a pattern, or an underlying cause or reason as to why there are three generations, we haven't uncovered it yet.

Instead of an empty, blank, three-dimensional grid, putting a mass down causes what would have been straight lines to instead become curved by a specific amount. In General Relativity, we treat space and time as continuous, but all forms of energy, including but not limited to mass, contribute to spacetime curvature. The deeper you are in a gravitational field, the more severely all three dimensions of your space are curved, and the more severe the phenomena of time dilation and gravitational redshift become. It is not known if there is a connection between the number of spatial dimensions and the number of fermionic generations.

One of the ideas that's sometimes floated is really just a hint: we have three generations of fermionic particles, and we have three spatial dimensions in our Universe. On the other hand, we have only one generation of bosonic particles, and one time dimension in our Universe.

Could this be a potential link: the number of spatial dimensions with the number of generations of fermions, and the number of time dimensions with the number of generations of bosons?

Maybe, but this line of thought doesn't provide any obvious connections between the two. However, pursuing it does help us understand what similarly-minded connections aren't present. Particles don't have different spins or spin-modes across generations, indicating that intrinsic angular momentum is simple and unrelated to either generations or dimensions. There is CP-violation in the (weak) decays of heavy quarks, and that requires a minimum of three generations, but we still don't know why there's no CP-violation in the strong interactions.

If you're looking at 3 as though it's a mysterious number, you might note:

but none of them have any known connection to either the number of spatial dimensions or the number of generations. As far as we can tell, it's all just coincidence.

The difference between a Lie algebra based on the E(8) group (left) and the Standard Model (right). The Lie algebra that defines the Standard Model is mathematically a 12-dimensional entity; the E(8) group is fundamentally a 248-dimensional entity. There is a lot that has to go away to get back the Standard Model from String Theories as we know them, and there are numerous ways to recover three generations based on how the various symmetries are broken in String Theory.

Perhaps. By adding in additional symmetries and by considering larger gauge groups, it's possible to come up with a rationale for why there would be three, and only three, generations of particles. Indeed, that's not too far-fetched. In supersymmetry, there would be more than double the number of particles present in the Standard Model, with an additional fermion for every boson, an additional boson for every fermion, and multiple Higgs particles, as well as supersymmetric Higgsinos.

In string theory, we're required to go to even greater states of symmetry, with larger gauge groups that are capable of admitting the particles of the Standard Model many times over. It is certainly possible, with such a wide set of variables to play with, to choose a way that these very large gauge groups might break to not only give rise to the Standard Model, but to a Standard Model that has three identical copies of its fermions, yet no additional bosons.

But, again, there's no reason that we know of that dictates why this ought to be the case. When you strike a pane of glass with a rock, it's possible that the glass will shatter in such a way that you'll wind up with three specific shards that are identical; that's a plausible outcome. But unless you can predict those shards in advance, the idea doesn't have any predictive power. Such is the case with string theory at present: it could lead to three generations of fermionic particles, but there's no way to predict such an outcome.

A geometrical interpretation of the Koide formula, showing the relative relationship between the three particles that obey its particular mathematical relationship. Here, as was its original intent, it's applied to the charged leptons: the electron, muon, and tau particles.

Back in 1981, physicist Yoshio Koide was looking at the then-known particles of the Standard Model and their particle properties, and took particular notice of the rest masses of the electron, muon, and tau particles. They are approximately 0.511 MeV/c², 105.7 MeV/c², and 1776.9 MeV/c², respectively.

Although it might appear that there's no relationship at all between these three masses, his eponymous Koide formula indicated differently. One of the rules of quantum physics is that any particles with the same quantum numbers will mix together. With the exception of lepton family number (i.e., the fact that they're in different generations), the electron, muon, and tau do have identical quantum numbers, and so they must mix.

What Koide noted was that mixing would generally lead to the following formula: (m_e + m_μ + m_τ) / (√m_e + √m_μ + √m_τ)² = constant,

where that constant must lie between 1/3 and 1. When you put the numbers in, that constant just happens to be a simple fraction that splits the range perfectly: 2/3.

The Koide formula, as applied to the masses of the charged leptons. Although any three numbers could be inserted into the formula, guaranteeing a result between 1/3 and 1, the fact that the result is right in the middle, at 2/3 to the limit of our experimental uncertainties, suggests that there might be something interesting to this relation.
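The arithmetic behind that caption is easy to reproduce. A short sketch, plugging the standard rounded charged-lepton rest masses into the Koide ratio:

```python
# Koide's ratio Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
# is guaranteed to lie between 1/3 and 1 for any three positive masses;
# the measured lepton masses land almost exactly on 2/3.
from math import sqrt

m_e, m_mu, m_tau = 0.511, 105.658, 1776.86  # rest masses in MeV/c^2 (rounded)

Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau)) ** 2

print(f"Q = {Q:.5f}  (2/3 = {2/3:.5f})")
```

Note that the ratio is dimensionless, so it comes out the same whether the masses are expressed in MeV, GeV, or kilograms.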

But even with all that said, there's no underlying reason for any of this; it's just a suggestive correlation. There may be a deep reason as to why there are three generations (no more, no less) of fermionic particles in the Standard Model, but as far as what that reason might be, we have no indicators or evidence any better than these tenuous connections.

The experimental data and the theoretical structure of the Standard Model, combined, allow us to conclude with confidence that the Standard Model, as we presently construct it, is now complete. There are no more Standard Model particles out there, not in additional generations nor in any other yet-undiscovered place. But there are, at the same time, certainly puzzles about the nature of the Universe that require us to go beyond the Standard Model, or we'll never understand dark matter, dark energy, the origin of the matter-antimatter asymmetry, and many other properties that the Universe certainly possesses. Perhaps, as we take steps toward solving those mysteries, we'll take another step closer to understanding why the Standard Model's particle content is neither greater nor lesser than it is.

View original post here:

Why are there exactly 3 generations of particles? - Big Think

Rice physicist wins DOE early career award | Rice News | News and Media Relations | Rice University – Rice News

Guido Pagano, an assistant professor of physics and astronomy at Rice University, has received a prestigious Early Career Research Award from the Department of Energy to continue his development of a quantum simulator.

The five-year award for $750,000 is one of 56 granted to university-based researchers in the round of grants announced by DOE's Office of Science. The office also awarded grants to 27 scientists at national laboratories.

"I'm really thrilled to get this award because it gives me the possibility to address a very promising line of research," said Pagano, who has also won a National Science Foundation CAREER Award and an Office of Naval Research Young Investigator Award for different lines of research this year. "I feel grateful because my ideas are now basically fully funded."

Pagano, who joined Rice and its Quantum Initiative in 2019 and shortly thereafter had papers in both Nature and Science, will use the grant to complete his lab's custom ion trap, while already planning the design of a second system. The lab's focus is on using trapped ions to simulate quantum systems of interest. Currently, Pagano and his students are building a laser-based system able to manipulate individual atomic ions.

"The system we're putting together is complex and flexible enough to connect to theories that nuclear physics researchers are interested in, like simulating gauge field theories," said Pagano, who won the grant in part on the strength of a close collaboration with a University of Maryland theorist, Zohreh Davoudi, on trapped-ion research.

A primary goal for the lab is to tailor new ways in which trapped ions can interact among each other to directly map them to gauge field theories.

The new apparatus is designed to both write and read quantum information on the ions in many different ways. "The system is designed to give us much more flexibility compared to what we had before," Pagano said. "We have so many ways to manipulate the ions, either using multiple atomic states or addressing them from different directions, in ways that other experiments are unable to do."

Pagano's field falls under nuclear physics, among DOE's research topics. These also include advanced scientific computing research, basic energy sciences, biological and environmental research, fusion energy sciences, high energy physics, and isotope and accelerator research and development.

"Supporting talented researchers early in their career is key to fostering scientific creativity and ingenuity within the national research community," said DOE Office of Science Director Asmeret Asefaw Berhe. "Dedicating resources to these focused projects led by well-deserved investigators helps maintain and grow America's scientific skill set for generations to come."

Read the original:

Rice physicist wins DOE early career award | Rice News | News and Media Relations | Rice University - Rice News

Difficult-to-observe effect confirms the existence of quark mass – EurekAlert

Image: A cascade of particles and gluons initiated by a decelerating charm quark. The more developed the cascade, the lower the energies of secondary particles and the greater the opening angle of dead cones avoided by subsequent gluons.

Credit: Source: CERN

A phenomenon that directly proves the existence of quark mass has been observed for the first time in extremely energetic collisions of lead nuclei. A team of physicists working on the ALICE detector at the Large Hadron Collider can boast this spectacular achievement: the observation of the dead-cone effect.

The objects that make up our physical everyday life can have many different properties. Among these, a fundamental role is played by mass. Despite being so fundamental, mass has a surprisingly complex origin. Its primary source is the complex interactions binding triplets of quarks in the interiors of protons and neutrons. In modern physics it is assumed that the masses of the quarks themselves, originating from their interactions with the Higgs field (its manifestations are the famous Higgs bosons), contribute only a few percent to the mass of a proton or neutron. However, this has only been a hypothesis. Although the masses of single quarks have been determined from measurements for many years, only indirect methods were used. Now, thanks to the efforts of scientists and engineers working in Geneva at the LHC of the European Organization for Nuclear Research (CERN), it has finally been possible to observe a phenomenon that directly proves the existence of the mass of one of the heavy quarks.

"When lead nuclei collide at the LHC particle accelerator, the energy density can become so great that protons and neutrons decay and momentarily form quark-gluon plasma. The quarks inside then move in a powerful field of strong interactions and begin to lose energy by emitting gluons. However, they do this in a rather peculiar way, which our team was the first to succeed in observing," explains Prof. Marek Kowalski from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow. Prof. Kowalski is one of the members of a large international collaboration carrying out measurements using the ALICE detector.

Gluons are particles that carry strong interactions between quarks. Their role is therefore similar to that of photons, which are responsible for the electromagnetic interactions between, for example, electrons. In electrodynamics, there is a phenomenon concerning electrons decelerating in an electromagnetic field: they lose energy by emitting photons, and the higher the energy of the electron, the more often the photons fly in a direction increasingly consistent with its direction of motion. This effect is the basis of free-electron lasers (today unique, powerful devices capable of producing ultra-short pulses of X-rays).

"Electrons decelerating in a magnetic field like to emit 'forward' photons, in an angular cone. The higher their original energy, the narrower the cone. Quarks have quite the opposite predilection. When they lose energy in a field of strong interactions, they emit gluons, but the lower the energy and the larger the mass of the quark, the fewer gluons fly 'forward'," says Prof. Kowalski, and specifies: "It follows from the theory that there should be a certain angular cone around the direction of quark motion in which gluons do not appear. This cone (the more divergent, the lower the energy of the quark and the higher its mass) is called the dead cone."
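The trend in Prof. Kowalski's description follows from the standard small-angle estimate that the dead cone's half-angle scales as the quark's mass divided by its energy, θ ≈ m/E (in natural units). A hedged sketch of that relation:

```python
# Approximate dead-cone half-angle theta ~ m / E for a quark of mass m
# and energy E (both in GeV, natural units, small-angle regime).
# Heavier quark or lower energy -> wider dead cone, as described above.

def dead_cone_angle(mass_gev: float, energy_gev: float) -> float:
    """Approximate dead-cone half-angle in radians."""
    return mass_gev / energy_gev

M_CHARM = 1.27  # charm-quark mass in GeV/c^2 (rounded)

for energy in (5.0, 10.0, 20.0):  # illustrative quark energies in GeV
    theta = dead_cone_angle(M_CHARM, energy)
    print(f"E = {energy:4.1f} GeV -> theta ~ {theta:.3f} rad")
```

This is only the leading-order estimate; the measured ALICE result compares full gluon-emission distributions, not a single angle.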

Theorists predicted the phenomenon of the dead cone more than 30 years ago. Unfortunately, its existence had so far only been noticed indirectly in experiments. Both the nature of the phenomenon and the recording process make it extremely difficult to observe directly. A decelerating quark emits gluons, which themselves can emit further gluons at different angles or transform into secondary particles. These particles have smaller and smaller energies, so the gluons they emit will avoid larger and larger dead cones. To make matters worse, individual detectors can only record this complex cascade in its final state, at different distances from the collision point, and therefore at different times. To observe the dead-cone effect, millions of cascades produced by charm quarks had to be reconstructed from fragmentary data. The analysis, performed with sophisticated statistical tools, included data collected during the three years the LHC was in operation.

Experimental confirmation of the existence of the dead cone phenomenon is an achievement of considerable physical significance. This is because the world of quarks and gluons is governed by strong interactions described by a theory called quantum chromodynamics, which predicts that the dead cone effect can only occur when a quark emitting gluons has non-zero mass. The present result, published in the prestigious journal Nature, is therefore the first direct experimental confirmation of the existence of quark masses.

"In the gigantic amount of data collected at the ALICE detector during the collision of lead nuclei and protons, we have traced a phenomenon that we know can only occur in nature when quarks have non-zero masses. Current measurements do not allow us to estimate the magnitude of the mass of the charm quarks we observed, nor do they tell us anything about the masses of quarks of other kinds. So we have a spectacular success, but in fact it is only a prelude to a long line of research," stresses Prof. Kowalski.

The first direct observation of the dead-cone effect involved only gluons emitted by charm (c) quarks. Scientists now intend to look for dead cones in processes involving quarks with larger masses, especially beauty (b) quarks. This will be a huge challenge because the higher the mass of the quark, the less frequently it is produced in collisions, and therefore the more difficult it will be to collect a number of cases that will guarantee adequate reliability of statistical analyses.

The reported research is of fundamental importance to modern physics. This is because the Standard Model is the basic tool currently used to describe phenomena involving elementary particles. Masses of quarks are the key constants here, responsible for the correspondence between theoretical description and physical reality. It is therefore hardly surprising that the observations of dead cones, raising hopes for direct measurements of quark masses, are of such interest to physicists.

The Henryk Niewodniczański Institute of Nuclear Physics (IFJ PAN) is currently one of the largest research institutes of the Polish Academy of Sciences. A wide range of research carried out at IFJ PAN covers basic and applied studies, from particle physics and astrophysics, through hadron physics, high-, medium-, and low-energy nuclear physics, condensed matter physics (including materials engineering), to various applications of nuclear physics in interdisciplinary research, covering medical physics, dosimetry, radiation and environmental biology, environmental protection, and other related disciplines. The average yearly publication output of IFJ PAN includes over 600 scientific papers in high-impact international journals. Each year the Institute hosts about 20 international and national scientific conferences. One of the most important facilities of the Institute is the Cyclotron Centre Bronowice (CCB), which is an infrastructure unique in Central Europe, serving as a clinical and research centre in the field of medical and nuclear physics. In addition, IFJ PAN runs four accredited research and measurement laboratories. IFJ PAN is a member of the Marian Smoluchowski Kraków Research Consortium: "Matter-Energy-Future", which in the years 2012-2017 enjoyed the status of the Leading National Research Centre (KNOW) in physics. In 2017, the European Commission granted the Institute the HR Excellence in Research award. The Institute holds A+ Category (the highest scientific category in Poland) in the field of sciences and engineering.

CONTACTS:

Prof. Marek Kowalski

Institute of Nuclear Physics, Polish Academy of Sciences

tel.: +48 12 6628074

email: marek.kowalski@cern.ch, marek.kowalski@ifj.edu.pl

SCIENTIFIC PUBLICATIONS:

Direct observation of the dead-cone effect in quantum chromodynamics

ALICE Collaboration

Nature 605, 440–446 (2022)

DOI: https://doi.org/10.1038/s41586-022-04572-w

LINKS:

http://www.ifj.edu.pl/

The website of the Institute of Nuclear Physics, Polish Academy of Sciences.

http://press.ifj.edu.pl/

Press releases of the Institute of Nuclear Physics, Polish Academy of Sciences.

IMAGES:

IFJ220609b_fot01s.jpg

HR: http://press.ifj.edu.pl/news/2022/06/09/IFJ220609b_fot01.jpg

A cascade of particles and gluons initiated by a decelerating charm quark. The more developed the cascade, the lower the energies of secondary particles and the greater the opening angle of dead cones avoided by subsequent gluons. (Source: CERN)

Direct observation of the dead-cone effect in quantum chromodynamics

18-May-2022

Continue reading here:

Difficult-to-observe effect confirms the existence of quark mass - EurekAlert

Microsoft aims to win the race to build a new kind of computer. So does Amazon – Sunbury Daily Item

SEATTLE The tech giants are locked in a race.

It might not end for another decade, and there might not be just one winner.

But, at the finish line, the prize they promise is a speedy machine, a quantum computer, that will crack in minutes problems that can't be solved at all today.

Builders describe revolutionary increases in computing power that will accelerate the development of artificial intelligence, help design new drugs and offer new solutions to help fight climate change.

Relying on principles of physics and computer science, researchers are working to build a quantum computer, a machine that will go beyond the capabilities of the computers we use today by moving through information faster.

Unlike the laptop screens we're used to, quantum computers display all their inner organs. Often cylindrical, the computers are an intimidating network of coils, plates, wires and bolts. And they're huge.

"We're talking about computing devices which are just unimaginable in terms of their power in what they can do," said Peter Chapman, president and CEO of IonQ, a startup in the race alongside tech giants Microsoft, Amazon, Google, IBM, Intel and Honeywell.

The companies are riding a swell of interest that could grow to $9.1 billion in revenue by 2030, according to Tractica, a market intelligence firm that studies new technologies and how humans interact with tech advancements.

Right now, each company is deciding how to structure the building blocks needed to create a quantum computer. Some rely on semiconductors, others on light. Still others, including Microsoft, have pinned their ambitions on previously unproven theories in physics.

"Bottom line, we are in very heavy experimentation mode in quantum computing, and it's fairly early days," said Chirag Dekate, who studies the industry for research firm Gartner. "We are in the 1950s state of classical computer hardware."

There's not likely to be a single moment when quantum computers start making the world-changing calculations technologists are looking forward to, said Peter McMahon, an engineering professor at Cornell University. Rather, there's going to be a succession of milestones.

At each one, the company leading the race could change.

In October 2019, Google said it had reached "quantum supremacy," a milestone where one of its machines completed a calculation that would have taken today's most advanced computers 10,000 years. In October last year, startup IonQ went public with an initial public offering that valued the company at $2 billion. In November, IBM said it had also created a quantum processor big enough to bypass today's machines.

In March, it was Microsoft's turn.

After a false start that saw Microsoft retract some research, it said this spring it had proved the physics principles it needed to show that its theory for building a quantum computer was, in fact, possible.

"We expect to capitalize on this to do the almost unthinkable," Krysta Svore, an engineer who leads Microsoft's quantum program, said in a company post announcing the discovery. "It's never been done before. ... [Now] here's this ultimate validation that we're on the right path."

As envisioned by designers, a quantum computer uses subatomic particles like electrons instead of the streams of ones and zeros used by computers today.

In doing so, a quantum computer can examine an unimaginable number of combinations of ones and zeros at once.

A quantum computer's big selling points are speed and multitasking, enabling it to solve complex problems that would trip up today's technology.

To understand the difference between classical computers (the computers we use today) and quantum computers (the computers researchers are working on), picture a maze.

Using a classical computer, you're inside the maze. You choose a path at random before realizing it's a dead end, then circle back.

A quantum computer gives an aerial view of the maze, where the system can see several different paths at once and more quickly reach the exit.

"To solve the maze, maybe you have to go 1,000 times to find the right answer," said IonQ's Chapman. "In quantum computing, you get to test all these paths all at once."
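The speedup behind the maze analogy can be made concrete with a known search-scaling result: an unstructured classical search over N possibilities takes about N/2 tries on average, while Grover's quantum search algorithm needs only about (π/4)·√N queries. A toy comparison of those counts (illustrative arithmetic only, not a simulation of any vendor's hardware):

```python
# Compare average classical search cost (~N/2 tries) with Grover's
# quantum search cost (~(pi/4) * sqrt(N) oracle queries).
from math import pi, sqrt

for n_paths in (1_000, 1_000_000):
    classical_tries = n_paths / 2
    grover_queries = (pi / 4) * sqrt(n_paths)
    print(f"{n_paths:>9} paths: ~{classical_tries:,.0f} classical tries "
          f"vs ~{grover_queries:,.0f} Grover queries")
```

The gap widens with problem size, which is why the payoff stories all involve problems far too large for exhaustive classical search.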

Researchers imagine quantum computers being used by businesses, universities and other researchers, though some industry leaders also talk about quantum computing as a technology that will unlock new ideas our brains can't yet imagine. (It's not likely the average household will have a quantum computer room any time soon.)

Microsoft recently partnered with paints and coatings company AkzoNobel to create a virtual laboratory where it will test and develop sustainable products using quantum computing to overcome some of the constraints that jam up a traditional lab setting, like access to raw materials, lack of space and concerns about toxicity.

Goldman Sachs is working to use quantum computing to speed up risk evaluation done by Wall Street traders. Boeing wants to use the advanced tech to model how materials will react to different environments, while ExxonMobil has plans to use it to simulate the chemical properties of hydrogen, hoping to develop new materials that can be used to make renewable energy.

Most of the companies in the race today will develop fairly credible quantum machines, Chong said, and customers will look for ways to take advantage of their strengths and mitigate their weaknesses.

In the meantime, Amazon, Google and Microsoft are hosting quantum technology from their competitors, alongside their own, hoping to let customers play around with the tech and come up with uses that haven't yet been imagined. In the same way companies can buy cloud space and digital infrastructure technology from Amazon Web Services or Google Cloud, the tech companies now offer customers pay-as-you-go quantum computing.

"At this stage of the tech, it is important to explore different types of quantum computers," said Nadia Carlsten, former head of product at the AWS Center for Quantum Computing. "It's not clear which computer will be the best of all applicants. It's actually very likely there won't be one that's best."

Dekate, who analyzes the quantum industry for research and consulting firm Gartner, says quantum may have reached the peak of its hype cycle.

Excitement and funding for the quantum industry have been building, he said, pointing to a rising slope on a line graph. Now, it could be at a turning point, he continued, pointing to the spot right before the line graph takes a nosedive.

The hype cycle is a five-phase model Gartner uses to analyze new technologies, as a way to help companies and investors decide when to get on board and when to cash out. It takes three to five years to complete the cycle, if a new tech makes it through.

Predictive analytics made it to phase five, where users see real-world benefits. Autonomous vehicles are in phase three, where the original excitement wears off and early adopters are running into problems. Quantum computing is in phase two, the peak of expectations, Dekate said.

"For every industry to advance, there needs to be hype. That inspires investment," he said. "What happens in these ecosystems is end-users [like businesses and other enterprises] get carried away by extreme hype."

Some quantum companies are nearing the deadlines they originally set for themselves, while others have already passed theirs. The technology is still at least 10 years away from producing the results businesses are looking for, Dekate estimates, and investors are realizing they won't see profits anytime soon.

In the next phase of the hype cycle, Dekate predicts private investment in quantum computing will go down, public investment will go up in an attempt to make up the difference, and companies that have made promises they can no longer keep will be caught flat-footed. Mergers, consolidation and bankruptcy are likely, he said.

"The kind of macroeconomic dynamics that we're about to enter into, I think, means some of these companies might not be able to survive," Dekate said. "The ecosystem is ripe for disruption: way too much fragmentation and companies overpromising and not delivering."

In other words, we could be headed toward a quantum winter.

But, even during the funding freeze, businesses are increasingly looking for ways to use quantum computing, preparing for when the technology is ready, Dekate said. While Amazon, Microsoft, Google and others are developing their quantum computers, companies like BMW, JPMorgan Chase, Goldman Sachs and Boeing are writing their lists of problems for the computers to one day solve.

The real changes will come when that loop closes, Dekate said, when the tech is ready and the questions are laid out.

"At some point down the line, the classical [computing] approaches are going to stall, and are going to run into natural limitations," he said. Until then, quantum computing will elicit excitement and, at the same time, disappointment.

© 2022 The Seattle Times. Visit seattletimes.com. Distributed by Tribune Content Agency, LLC.


Microsoft aims to win the race to build a new kind of computer. So does Amazon - Sunbury Daily Item

Best physics books: Change the way you look at the universe – Livescience.com

What is our place in the universe? How do we explain what happens around us? These are big questions to ask on our quest to understand the complexities of physics and the universe. That's why we've curated this roundup of the best physics books, to help you gain a deeper understanding from the top authors in the field.

Physics can be a dense and detailed study, with complicated theories and explorations of ideas that can be difficult for anyone to fully comprehend. The best books explain these concepts in ways that are approachable and will continue your journey of understanding our physical world.

We've collected the best physics books written by some of the world's most renowned scientists, including Stephen Hawking, Brian Greene, and Richard Feynman. These are the books that break down complicated matters into simple, easy-to-read concepts, get to the heart of the matter quickly without getting lost in the details, and entertain you along the way with their humor and personal stories.

Whether you want to discover the origins of physics or follow its evolution into the modern century, these are the best physics books to add to your library. Suitable for enthusiasts of all levels, they will expand your thinking and knowledge of the way our world works.

If you're looking for physics books that specifically deal with the cosmos, then you can check out our guide to the best astronomy books.

1. The Elegant Universe

Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory

Price: $11.59 (paperback, new)

Author: Brian Greene

Publisher: W. W. Norton & Company

Release date: October 11, 2010

Expertly organized

Uses relatable analogies

Complex topics accessible for those without a scientific background

Later chapters can grow in complexity and may seem daunting

Written by one of the world's most renowned string theorists, The Elegant Universe takes complex topics and makes them easily accessible to any reader, with or without a science background. Greene creates an impactful and visual reading experience as he navigates the mysteries of the universe. This international bestseller inspired a major Nova special and leans into Greene's expertise in superstring theory.

The Elegant Universe brings thoughtful discussion surrounding special relativity, general relativity, and quantum mechanics, paving the way towards an explanation of all forces and matter. Simple analogies and footnotes break down heavier topics with a dash of humor. Readers will be delighted by the approachable way in which Greene ties in string theory to help our understanding of the vast universe.

2. The Feynman Lectures on Physics (box set)

The New Millennium Edition

Price: $115.99 (hardcover, new)

Author: Richard P. Feynman

Publisher: Basic Books

Release date: January 4, 2011

World's greatest lectures still used in universities today

Approachable intro for those interested in the foundations of physics

Expensive, but they are hardcovers

Unmissable content for any student, and for anyone eager to learn more about this expansive field, who wants a foundational introduction to physics written by the beloved Nobel laureate Richard P. Feynman. The Feynman Lectures on Physics is a collection of his most profound lectures, reprinted and corrected in collaboration with Caltech. Inside this three-book box set, you'll find everything from the basic principles of Newtonian physics through more complex topics such as general relativity, quantum mechanics, and beyond.

Feynman's lectures are accessible without sacrificing relevant information. His passion is evident throughout the pages, never shying away from asking the tougher questions and challenging his audience to expand their thinking. This is a box set designed for each generation, setting up the future for emerging scientists.

3. Quantum Mechanics: The Theoretical Minimum (illustrated edition)

What you need to know to start doing physics

Price: $16.33 (paperback, new)

Author: Leonard Susskind and Art Friedman

Publisher: Basic Books

Release date: May 12, 2015

Clear presentation of the inner workings of quantum physics

Includes step-by-step exercises

Requires some prior mathematical knowledge

Need to read first book to better understand this one

Quantum Mechanics: The Theoretical Minimum is the second book in the Theoretical Minimum series. If you're a reader with some knowledge of linear algebra and calculus who wants to dive deeper into the world of quantum mechanics, this is for you. Susskind and Friedman make it easy to follow along with the subject matter, getting to logical explanations quickly. Susskind deploys notation in earnest, condensing information into manageable symbols.

It'll get you thinking about the information differently, trying out a new way to speculate about and approach complicated topics. This book will connect the dots, build the bridges between each concept presented, and explain all the core ideas of the theory coherently.

4. Thirty Years that Shook Physics

The story of quantum theory

Price: $12.59 (paperback, new)

Author: George Gamow

Publisher: Dover Publications, Inc

Release date: July 1, 1985

Accounts of personal interactions with all the science greats

Interesting look into the history of science and quantum physics

To get the best out of the theories in this book you'll need a good grasp of maths

Gamow possesses an engaging, entertaining way of presenting the very basics of quantum physics and its progression over the span of three decades. As Gamow was personally acquainted with the scientists presented in this book (Bohr, Pauli, Dirac, and Heisenberg, just to name a few), the result is a level of humanity and personality behind the origins of some of physics' most complex theories and equations.

This is a book about how science changed and developed in the last century, and Gamow writes it in a way that is accessible to a general audience. Covering prominent events between 1900 and 1930, it gives you the inside story on the course that shaped modern physics.

5. A Brief History of Time

Price: $7.99 (paperback, new)

Author: Stephen Hawking

Publisher: Bantam

Release date: September 1, 1998

Filled with images and useful definitions

Short, quick read

Uses basic terminology and avoids over-complicated info dumps

Deeper theories require prior physics knowledge to fully appreciate

Written by the late Stephen Hawking, one of the most renowned scientists of the past century, A Brief History of Time delves into topics such as black holes, wormholes, the uncertainty principle, space and time, the expansion of the universe, time travel, and so much more.

Hawking manages to be accessible while still speaking to those with years of scientific experience under their belts. It's quick and to the point, providing clarity around some of the most complex mechanics of how our universe works. Logically organized, humorous at times, and immersive, it takes you on a journey that spans from our world's earliest astronomers to the latest on the future of the universe.

6. Seven Brief Lessons on Physics

Price: $12.00 (paperback, new)

Author: Carlo Rovelli

Publisher: Penguin

Release date: January 1, 2012

Short (only 7 chapters)

Perfect for those interested in the foundations of physics

Can be dense in some areas

Hard to find

Carlo Rovelli is a widely respected and renowned theoretical physicist who introduces you to the modern world of physics. It's a short book, with the paperback coming in at only 81 pages, but it's packed with playful and entertaining takes on our world and the role we play in it. Moving quickly through Einstein's general relativity, quantum mechanics, and other complexities of our known universe, Seven Brief Lessons outlines how physics arrived where it is now.

Written confidently, and in a way that is accessible to any reader, the intricacies of this book are conveyed with vivid clarity. Beautifully written, and almost lyrical in its presentation of Newton, Bohr, and Einstein, Seven Brief Lessons on Physics is not one to miss.

7. Physics of the Impossible

A Scientific Exploration of the World of Phasers, Force Fields, Teleportation, and Time Travel

Price: $29.82 (hardcover, new)

Author: Michio Kaku

Publisher: Doubleday

Release date: March 11, 2008

Perfect for sci-fi fans

Humorous undertones

Some feel this book is more fantastical rather than focusing on the actual physics

Fans of pop culture will delight in the insights presented in this engaging and humorous book. Michio Kaku, theoretical physicist and bestselling author, explores the possibilities of teleportation, force fields, interstellar spaceships, and other future technologies you've seen only in science fiction. Are they truly as impossible to achieve as they seem?

In this informative yet wildly imaginative look at the universe and the laws of physics, the very topic of scientific possibility is on full display. Kaku looks into several branches of physics, from Newtonian mechanics up to the relativity and quantum mechanics of the 20th century. Sci-fi technologies are broken down into accessible ideas as Kaku explores the possibilities of building starships, time travel, and invisibility.

8. Astrophysics for People in a Hurry

Price: $9.49 (hardcover, new)

Author: Neil deGrasse Tyson

Publisher: W. W. Norton & Company

Release date: May 2, 2017


Clear, concise introduction

Shorter page count


Best physics books: Change the way you look at the universe - Livescience.com

What is quantum mechanics trying to tell us? – Big Think

Classical physics did not need any disclaimers. The kind of physics that was born with Isaac Newton and ruled until the early 1900s seemed pretty straightforward: Matter was like little billiard balls. It accelerated or decelerated when exposed to forces. None of this needed any special interpretations attached. The details could get messy, but there was nothing weird about it.

Then came quantum mechanics, and everything got weird really fast.

Quantum mechanics is the physics of atomic-scale phenomena, and it is the most successful theory we have ever developed. So why are there a thousand competing interpretations of the theory? Why does quantum mechanics need an interpretation at all?

What, fundamentally, is it trying to tell us?

There are many weirdnesses in quantum physics, many ways it differs from the classical worldview of perfectly knowable particles with perfectly describable properties. The weirdness you focus on will tend to be the one that shapes your favorite interpretation.

But the weirdness that has stood out most, the one that has shaped the most interpretations, is the nature of superpositions and of measurement in quantum mechanics.


Everything in physics comes down to the description of what we call the state. In classical physics, the state of a particle was just its position and momentum. (Momentum is related to velocity.) The position and velocity could be known with as much accuracy as your equipment allowed. Most important, the state was never connected to making a measurement; you never had to look at the particle. But quantum mechanics forces us to think about the state in a very different way.

In quantum physics, the state represents the possible outcomes of measurements. Imagine you have a particle in a box, and the box has two accessible chambers. Before a measurement is made, the quantum state is in a superposition, with one term for the particle being in the first chamber and another term for the particle being in the second chamber. Both terms exist at the same time in the quantum state. It is only after a measurement is made that the superposition is said to collapse, and the state has only one term: the one that corresponds to seeing the particle in either the first or the second chamber.
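The two-chamber picture can be sketched numerically. Here is a minimal toy example in plain Python (no real quantum library; the names `measure` and `state` are illustrative): the state is stored as two amplitudes, and the Born rule says the probability of finding the particle in a chamber is the squared magnitude of that chamber's amplitude.

```python
import random

# A toy two-chamber "particle in a box": one amplitude per chamber.
# Plain Python, not a real quantum library; names are illustrative.

def measure(amplitudes):
    """Collapse the superposition: return the index of the observed chamber."""
    probabilities = [abs(a) ** 2 for a in amplitudes]  # Born rule
    r = random.random()
    cumulative = 0.0
    for chamber, p in enumerate(probabilities):
        cumulative += p
        if r < cumulative:
            return chamber
    return len(amplitudes) - 1  # guard against floating-point round-off

# An equal superposition: both terms exist at once before measurement.
state = [1 / 2 ** 0.5, 1 / 2 ** 0.5]

# Any single measurement yields exactly one chamber; the superposition is
# only visible statistically, across many identically prepared states.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(state)] += 1
print(counts)  # roughly [5000, 5000]
```

Each call to `measure` returns a single, definite chamber, mirroring how a measurement leaves only one term of the superposition; the amplitudes themselves never show up in any individual outcome.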

So, what is going on here? How can a particle be in two places at the same time? This is also akin to asking whether particles have properties in and of themselves. Why should making a measurement change anything? And what exactly is a measurement? Do you need a person to make a measurement, or can you say that any interaction at all with the rest of the world is a measurement?

These kinds of questions have spawned a library's worth of so-called quantum interpretations. Some of them try to preserve the classical worldview by finding some way to minimize the role of measurement and preserve the reality of the quantum state. Here, reality means that the state describes the world by itself, without any reference to us. At the extreme end of these is the Many Worlds Interpretation, which makes each possibility in the quantum state a parallel Universe that will be realized when a quantum event (a measurement) happens.

This kind of interpretation is, to me, a mistake. My reasons for saying this are simple.

When the inventors of quantum mechanics broke with classical physics in the first few decades of the 1900s, they were doing what creative physicists do best. They were finding new ways to predict the results of experiments by creatively building off the old physics while extending it in ways that embraced new behaviors seen in the laboratory. That took them in a direction where measurement began to play a central role in the description of physics as a whole. Again and again, quantum mechanics has shown that at the heart of its many weirdnesses is the role played by someone acting on the world to gain information. That, to me, is the central lesson quantum mechanics has been trying to teach us: that we are involved, in some way, in the description of the science we do.

Now to be clear, I am not arguing that the observer affects the observed, or that physics needs a place for some kind of Cosmic Mind, or that consciousness reaches into the apparatus and changes things. There are much more subtle and interesting ways of hearing what quantum mechanics is trying to say to us. This is one reason I find much to like in the interpretation called QBism.

What matters is trying to see into the heart of the issue. After all, when all is said and done, what is quantum mechanics pointing to? The answer is that it points to us. It is trying to tell us what it means to be a subject embedded in the Universe, doing this amazing thing called science. To me, that is just as exciting as a story about a God's-eye view of the Universe.


What is quantum mechanics trying to tell us? - Big Think

No, particle physics on Earth won’t ever destroy the Universe – Big Think

Anytime you reach deeper into the unknown than ever before, you should not only wonder about what you're going to find, but also worry about what sort of demons you might unearth. In the realm of particle physics, that double-edged sword arises the farther we probe into the high-energy Universe. The better we can explore the previously inaccessible energy frontier, the better we can reveal the high-energy processes that shaped the Universe in its early stages.

Many of the mysteries of how our Universe began and evolved from the earliest times can be best investigated by this exact method: colliding particles at higher and higher energies. New particles and rare processes can be revealed through accelerator physics at or beyond the current energy frontiers, but this is not without risk. If we can reach energies that:

certain consequences, not all of which are desirable, could be in store for us all. And yet, just as was the case with the notion that the LHC could create black holes that destroy the Earth, we know that any experiment we perform on Earth won't give rise to any dire consequences at all. The Universe is safe from any current or planned particle accelerators. This is how we know.

The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but only if the LHC makes a beyond-the-Standard-Model discovery. Direct confirmation of what new particles could be causing CDF's observed discrepancy in the W-boson's mass might be a task best suited to a future circular collider, which can reach higher energies than a linear collider ever could.

There are a few different approaches to making particle accelerators on Earth, with the biggest differences arising from the types of particles we're choosing to collide and the energies we're able to achieve when we're colliding them. The main options for which particles to collide are electrons with positrons, or protons with protons or antiprotons.


In the future, it may be possible to collide muons with anti-muons, getting the best of both the electron-positron and the proton-antiproton world, but that technology isn't quite there yet.

A candidate Higgs event in the ATLAS detector at the Large Hadron Collider at CERN. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles, and due to the fact that dozens of proton-proton collisions occur with every bunch crossing. Examining how the Higgs decays to very high precision is one of the key goals of the HL-LHC.

Regardless, the thing that poses the most danger to us is whatever's up there at the highest energy-per-particle-collision that we get. On Earth, that record is held by the Large Hadron Collider, where the overwhelming majority of proton-proton collisions actually result in the gluons inside each proton colliding. When they smash together, because the proton's total energy is split among its constituent particles, only a fraction of the total energy belongs to each gluon, so it takes a large number of collisions to find one where a large portion of that energy, say 50% or more, belongs to the relevant, colliding gluons.

When that occurs, however, that's when the most energy is available to either create new particles (via E = mc²) or to perform other actions that energy can perform. One of the ways we measure energies in physics is in terms of electron-volts (eV): the amount of energy an electron gains when accelerated through an electric potential difference of one volt. At the Large Hadron Collider, the current record-holder for laboratory energies on Earth, the most energetic particle-particle collision possible is 14 TeV, or 14,000,000,000,000 eV.
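As a quick back-of-the-envelope check of those figures (the 1 eV = 1.602176634 × 10⁻¹⁹ J conversion is the standard SI value; the variable names are illustrative):

```python
# Back-of-the-envelope check of the numbers above.
EV_IN_JOULES = 1.602176634e-19  # standard SI conversion: 1 eV in joules

lhc_collision_ev = 14e12  # 14 TeV, the LHC's maximum collision energy

print(f"{lhc_collision_ev:,.0f} eV")                    # 14,000,000,000,000 eV
print(f"{lhc_collision_ev * EV_IN_JOULES:.2e} joules")  # 2.24e-06 joules
```

In everyday terms, that record-setting collision energy is only a couple of microjoules; what makes it extreme is that it is concentrated into a single particle-particle collision.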

Although no light can escape from inside a black hole's event horizon, the curved space outside of it results in a difference between the vacuum state at different points near the event horizon, leading to the emission of radiation via quantum processes. This is where Hawking radiation comes from, and for the tiniest-mass black holes, Hawking radiation will lead to their complete decay in under a fraction of a second.

There are things we can worry will happen at these highest-of-energies, each with their own potential consequence for either Earth or even for the Universe as a whole. A non-exhaustive list includes:

If you draw out any potential, it will have a profile where at least one point corresponds to the lowest-energy, or true vacuum, state. If there is a false minimum at any point, that can be considered a false vacuum, and it will always be possible, assuming this is a quantum field, to quantum tunnel from the false vacuum to the true vacuum state. The greater the kick you apply to a false vacuum state, the more likely it is that the state will exit the false vacuum state and wind up in a different, more stable, truer minimum.

Although these scenarios are all bad in some sense, some are worse than others. The creation of a tiny black hole would lead to its immediate decay. If you didn't want it to decay, you'd have to impose some sort of new symmetry (for which there is neither evidence nor motivation) to prevent its decay, and even then, you'd just have a tiny-mass black hole that behaved similarly to a new, massive, uncharged particle. The worst it could do is begin absorbing the matter particles it collided with, and then sink to the center of whatever gravitational object it was a part of. Even if you made it on Earth, it would take trillions of years to absorb enough matter to rise to a mass of 1 kg; it's not threatening at all.

The restoration of whatever symmetry was in place before the Universe's matter-antimatter asymmetry arose is also interesting, because it could lead to the destruction of matter and the creation of antimatter in its place. As we all know, matter and antimatter annihilate upon contact, which creates bad news for any matter that exists close to this point. Fortunately, however, the absolute energy of any particle-particle collision is tiny, corresponding to tiny fractions of a microgram in terms of mass. Even if we created a net amount of antimatter from such a collision, it would only be capable of destroying a small amount of matter, and the Universe would be fine overall.

The simplest model of inflation is that we started off at the top of a proverbial hill, where inflation persisted, and rolled into a valley, where inflation came to an end and resulted in the hot Big Bang. If that valley isn't at a value of zero, but instead at some positive, non-zero value, it may be possible to quantum-tunnel into a lower-energy state, which would have severe consequences for the Universe we know today. It's also possible that a kick of the right energy could restore the inflationary potential, leading to a new state of rapid, relentless, exponential expansion.

But if we instead were able to recreate the conditions under which inflation occurred, things would be far worse. If it happened out in space somewhere, we'd create, in just a tiny fraction of a second, the greatest cosmic void we could imagine. Whereas today there's only a tiny amount of energy inherent to the fabric of empty space, something on the order of the rest-mass-energy of only a few protons per cubic meter, during inflation it was more like a googol (10^100) protons per cubic meter.

If we could achieve those same energy densities anywhere in space, they could potentially restore the inflationary state, and that would lead to the same Universe-emptying exponential expansion that occurred more than 13.8 billion years ago. It wouldn't destroy anything in our Universe, but it would lead to an exponential, rapid, relentless expansion of space in the region where those conditions occur again.

That expansion would push the space that our Universe occupies outward, in all three dimensions, as it expands, creating a large cosmic bubble of emptiness that would lead to unmistakable signatures that such an event had occurred. It clearly has not, at least, not yet, but in theory, this is possible.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. (Specifically, for the strong interactions.) Even in empty space, this vacuum energy is non-zero, and what appears to be the ground state in one region of curved space will look different from the perspective of an observer where the spatial curvature differs. As long as quantum fields are present, this vacuum energy (or a cosmological constant) must be present, too.

And finally, the Universe today exists in a state where the quantum vacuum, the zero-point energy of empty space, is non-zero. This is inextricably linked, although we don't know how to perform the calculation that underlies it, to the fundamental physical fields and couplings and interactions that govern our Universe: the physical laws of nature. At some level, the quantum fluctuations in those fields that cannot be extricated from space itself, including the fields that govern all of the fundamental forces, dictate what the energy of empty space itself is.

But it's possible that this isn't the only configuration for the quantum vacuum; it's plausible that other energy states exist. Whether they're higher or lower doesn't matter; whether our vacuum state is the lowest-possible one (i.e., the true vacuum) or whether another is lower doesn't matter either. What matters is whether there are any other minima, any other stable configurations, that the Universe could possibly exist in. If there are, then reaching high-enough energies could kick the vacuum state in a particular region of space into a different configuration, where we'd then have at least one of:

Any of these would, if it were a more-stable configuration than the one that our Universe currently occupies, cause that new vacuum state to expand at the speed of light, destroying all of the bound states in its path, down to atomic nuclei themselves. This catastrophe, over time, would destroy billions of light-years' worth of cosmic structure; if it happened within about 18 billion light-years of Earth, that would eventually include us, too.

The size of our visible Universe (yellow), along with the amount we can reach (magenta). The limit of the visible Universe is 46.1 billion light-years, as that's the limit of how far away an object that emitted light that would just be reaching us today would be after expanding away from us for 13.8 billion years. However, beyond about 18 billion light-years, we can never access a galaxy even if we traveled towards it at the speed of light. Any catastrophe that occurred within 18 billion light-years of us would eventually reach us; ones that occur today at distances farther away never will.

There are tremendous uncertainties connected to these events. Quantum black holes could be just out of reach of our current energy frontier. It's possible that the matter-antimatter asymmetry was only generated during electroweak symmetry breaking, potentially putting it within current collider reach. Inflation must have occurred at higher energies than we've ever reached, as must the processes that determine the quantum vacuum, but we don't know how low those energies could have been. We only know, from observations, that such an event hasn't yet happened within our observable Universe.

But, despite all of this, we don't have to worry about any of our particle accelerators, past, present, or even into the far future, causing any of these catastrophes here on Earth. The reason is simple: the Universe itself is filled with natural particle accelerators that are far, far more powerful than anything we've ever built or even proposed here on Earth. From collapsed stellar objects that spin rapidly, such as white dwarfs, neutron stars, and black holes, very strong electric and magnetic fields can be generated by charged, moving matter under extreme conditions. It's suspected that these are the sources of the highest-energy particles we've ever seen: the ultra-high-energy cosmic rays, which have been observed to achieve energies many millions of times greater than any accelerator on Earth ever has.

The energy spectrum of the highest energy cosmic rays, by the collaborations that detected them. The results are all highly consistent from experiment to experiment, and reveal a significant drop-off at the GZK threshold of ~5 x 10^19 eV. Still, many such cosmic rays exceed this energy threshold, indicating that either this picture is not complete or that many of the highest-energy particles are heavier nuclei, rather than individual protons.

Whereas we've reached up above the ten TeV threshold for accelerators on Earth, or 10^13 eV in scientific notation, the Universe routinely creates cosmic rays that rise up above the 10^20 eV threshold, with the record set more than 30 years ago by an event known, appropriately, as the Oh-My-God particle. Even though the highest energy cosmic rays are thought to be heavy atomic nuclei, like iron, rather than individual protons, that still means that when two of them collide with one another, a near-certainty within our Universe given the vastness of space, the fact that galaxies were closer together in the past, and the long lifetime of the Universe, there are many events producing center-of-mass collision energies in excess of 10^18 or even 10^19 eV.
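To put those two energy frontiers side by side (a rough sketch; the ~3.2 × 10^20 eV figure for the Oh-My-God particle is the commonly quoted value, assumed here since the text gives only the 10^20 eV scale):

```python
# Comparing the human-made energy frontier with the cosmic one.
lhc_ev = 1.4e13   # ~14 TeV, the LHC's maximum collision energy
omg_ev = 3.2e20   # the 1991 "Oh-My-God" cosmic-ray event (commonly quoted value)

# The cosmic record exceeds the laboratory record by tens of millions of times.
print(f"{omg_ev / lhc_ev:.1e}")  # 2.3e+07
```

That ratio, roughly 20 million, is the "many millions of times greater" figure quoted above.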

This tells us that any catastrophic cosmic effect that we could worry about is already tightly constrained by the physics of what has happened over the cosmic history of the Universe up until the present day.

When a high-energy particle strikes another one, it can lead to the creation of new particles or new quantum states, constrained only by how much energy is available in the center-of-mass of the collision. Although particle accelerators on Earth can reach very high energies, the natural particle accelerators of the Universe can exceed those energies by a factor of many millions.

None of the cosmic catastrophes that we can imagine have occurred, and that means two things. The first is that we can place likely lower limits on the energy scales at which various cosmic transitions occur. The inflationary state hasn't been restored anywhere in our Universe, and that places a lower limit on the energy scale of inflation of no less than ~10^19 eV. This is about a factor of 100,000 lower, perhaps, than where we anticipate inflation occurred: a reassuring consistency. It also teaches us that it's very hard to kick the zero-point energy of the Universe into a different configuration, giving us confidence in the stability of the quantum vacuum and disfavoring the vacuum decay catastrophe scenario.

But it also means we can continue to explore the Universe with confidence in our safety. Based on how safe the Universe has already shown itself to be, we can confidently conclude that no such catastrophes will arise up to the combined energy-and-collision-total threshold that has already been reached within our observable Universe. Only if we begin to collide particles at energies around 10^20 eV or greater (a factor of 10 million greater than the present energy frontier) will we need to begin to worry about such events. That would require an accelerator significantly larger than the entire planet, and therefore, we can reach the conclusion promised in the article's title: no, particle physics on Earth won't ever destroy the Universe.

Link:

No, particle physics on Earth won't ever destroy the Universe - Big Think

How the Multiverse could break the scientific method – Big Think

Today let's take a walk on the wild side and assume, for the sake of argument, that our Universe is not the only one that exists. Let's consider that there are many other universes, possibly infinitely many. The totality of these universes, including our own, is what cosmologists call the Multiverse. It sounds more like a myth than a scientific hypothesis, and this conceptual troublemaker inspires some while it outrages others.

The controversy started in the 1980s. Two physicists, Andrei Linde at Stanford University and Alex Vilenkin at Tufts University, independently proposed that if the Universe underwent a very fast expansion early on in its existence (we call this an inflationary expansion), then our Universe would not be the only one.

This inflationary phase of growth presumably happened a trillionth of a trillionth of a trillionth of one second after the beginning of time. That is about 10^-36 seconds after the bang, when the clock that describes the expansion of our universe started ticking. You may ask, "How come these scientists feel comfortable talking about times so ridiculously small? Wasn't the Universe also ridiculously dense at those times?"

Well, the truth is we do not yet have a theory that describes physics under these conditions. What we do have are extrapolations based on what we know today. This is not ideal, but given our lack of experimental data, it is the only place we can start from. Without data, we need to push our theories as far as we consider reasonable. Of course, what is reasonable for some theorists will not be for others. And this is where things get interesting.

The supposition here is that we can apply essentially the same physics at energies that are about one thousand trillion times higher than the ones we can probe at the Large Hadron Collider, the giant accelerator housed at the European Organization for Nuclear Research in Switzerland. And even if we cannot apply quite the same physics, we can at least apply physics with similar actors.

In high energy physics, all the characters are fields. Fields, here, mean disturbances that fill space and may or may not change in time. A crude picture of a field is that of water filling a pond. The water is everywhere in the pond, with certain properties that take on values at every point: temperature, pressure, and salinity, for example. Fields have excitations that we call particles. The electron field has the electron as an excitation. The Higgs field has the Higgs boson. In this simple picture, we could visualize the particles as ripples of water propagating along the surface of the pond. This is not a perfect image, but it helps the imagination.

The most popular protagonist driving inflationary expansion is a scalar field: an entity with properties inspired by the Higgs boson, which was discovered at the Large Hadron Collider in July 2012.


We do not know if there were scalar fields at the cosmic infancy, but it is reasonable to suppose there were. Without them, we would be horribly stuck trying to picture what happened. As mentioned above, when we do not have data, the best that we can do is to build reasonable hypotheses that future experiments will hopefully test.

To see how we use a scalar field to model inflation, picture a ball rolling downhill. As long as the ball is at a height above the bottom of the hill, it will roll down. It has stored energy. At the bottom, we set its energy to zero. We do the same with the scalar field. As long as it is displaced from its minimum, it will fill the Universe with its energy. In large enough regions, this energy prompts the fast expansion of space that is the signature of inflation.
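The ball-rolling picture can be made quantitative with the standard slow-roll equations: the field obeys phi'' + 3H*phi' + V'(phi) = 0, with the Friedmann equation H^2 = (phi'^2/2 + V)/3 in reduced Planck units. A minimal numerical sketch, assuming a toy quadratic potential V = m^2*phi^2/2 with illustrative (not realistic) parameters:

```python
import math

# Toy inflation model: a scalar field phi rolling down V = m^2 phi^2 / 2,
# in reduced Planck units. The mass m is illustrative, chosen for a fast run.
m = 0.1
phi, phidot = 16.0, 0.0
dt, efolds = 0.01, 0.0

while phi > 1.0:                                  # stop as slow roll nears its end
    V = 0.5 * m**2 * phi**2
    H = math.sqrt((0.5 * phidot**2 + V) / 3.0)    # Friedmann equation
    phiddot = -3.0 * H * phidot - m**2 * phi      # Klein-Gordon with Hubble friction
    phidot += phiddot * dt
    phi += phidot * dt
    efolds += H * dt                              # N = integral of H dt

print(f"field rolled from 16.0 down to {phi:.2f}; expansion: about {efolds:.0f} e-folds")
```

The field creeps slowly downhill while space inflates by roughly 64 e-folds, matching the analytic slow-roll estimate N ~ phi_0^2/4 for this potential.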

Linde and Vilenkin added quantum physics to this picture. In the world of the quantum, everything is jittery; everything vibrates endlessly. This is at the root of quantum uncertainty, a notion that defies common sense. So as the field is rolling downhill, it is also experiencing these quantum jumps, which can kick it further down or further up. It's as if the waves in the pond were erratically creating crests and valleys. Choppy waters, these quantum fields.

Here comes the twist: When a sufficiently large region of space is filled with the field of a certain energy, it will expand at a rate related to that energy. Think of the temperature of the water in the pond. Different regions of space will have the field at different heights, just as different regions of the pond could have water at different temperatures. The result for cosmology is a plethora of madly inflating regions of space, each expanding at its own rate. Very quickly, the Universe would consist of myriad inflating regions that grow, unaware of their surroundings. The Universe morphs into a Multiverse. Even within each region, quantum fluctuations may drive a sub-region to inflate. The picture, then, is one of an eternally replicating cosmos, filled with bubbles within bubbles. Ours would be but one of them: a single bubble in a frothing Multiverse.
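The criterion for this self-reproducing regime can be estimated: inflation becomes eternal roughly where the quantum kick per Hubble time, H/(2*pi), exceeds the classical roll per Hubble time, |phi'|/H. For a toy quadratic potential V = m^2*phi^2/2 in reduced Planck units (with an illustrative mass m, not a measured one), both quantities are easy to compute:

```python
import math

m = 1e-5  # illustrative field mass in reduced Planck units

def quantum_kick(phi, m):
    """Quantum jump per Hubble time: H/(2*pi), with H ~ m*phi/sqrt(6)."""
    H = m * phi / math.sqrt(6.0)
    return H / (2.0 * math.pi)

def classical_roll(phi):
    """Classical roll per Hubble time: |phidot|/H = 2/phi in slow roll."""
    return 2.0 / phi

# kicks win above phi^2 > 4*pi*sqrt(6)/m: the eternal-inflation threshold
phi_eternal = math.sqrt(4.0 * math.pi * math.sqrt(6.0) / m)
print(f"quantum jumps dominate above phi ~ {phi_eternal:.0f}")
print(quantum_kick(3000.0, m) > classical_roll(3000.0))  # deep in the eternal regime
print(quantum_kick(500.0, m) > classical_roll(500.0))    # below the threshold
```

Below the threshold, the field rolls down and inflation ends everywhere; above it, quantum jumps repopulate the high-energy regions faster than they drain away, and the reproduction never stops.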

This is wildly inspiring. But is it science? To be scientific, a hypothesis needs to be testable. Can you test the Multiverse? The answer, in a strict sense, is no. Each of these inflating regions (or contracting ones, as there could also be failed universes) is outside our cosmic horizon, the region that delimits how far light has traveled since the beginning of time. As such, we cannot see these cosmoids, nor receive any signals from them. The best that we can hope for is to find a sign that one of our neighboring universes bruised our own space in the past. If this had happened, we would see some specific patterns in the sky, more precisely, in the radiation left over after hydrogen atoms formed some 400,000 years after the Big Bang. So far, no such signal has been found. The chances of finding one are, quite frankly, remote.

We are thus stuck with a plausible scientific idea that seems untestable. Even if we were to find evidence for inflation, that would not necessarily support the inflationary Multiverse. What are we to do?

The Multiverse suggests another ingredient: the possibility that physics is different in different universes. Things get pretty nebulous here, because there are two kinds of "different" to describe. The first is different values for the constants of nature (such as the electron charge or the strength of gravity), while the second raises the possibility that there are different laws of nature altogether.

In order to harbor life as we know it, our Universe has to obey a series of very strict requirements. Small deviations are not tolerated in the values of natures constants. But the Multiverse brings forth the question of naturalness, or of how common our Universe and its laws are among the myriad universes belonging to the Multiverse. Are we the exception, or do we follow the rule?

The problem is that we have no way to tell. To know whether we are common, we need to know something about the other universes and the kinds of physics they have. But we don't. Nor do we know how many universes there are, and this makes it very hard to estimate how common we are. To make things worse, if there are infinitely many cosmoids, we cannot say anything at all. Inductive thinking is useless here. Infinity gets us tangled up in knots. When everything is possible, nothing stands out, and nothing is learned.

That is why some physicists worry about the Multiverse to the point of loathing it. There is nothing more important to science than its ability to prove ideas wrong. If we lose that, we undermine the very structure of the scientific method.

Follow this link:

How the Multiverse could break the scientific method - Big Think

Ultracold gas bubbles on the space station could reveal strange new quantum physics – Space.com

While it might be a comfortable 72 degrees Fahrenheit (22 degrees Celsius) inside the International Space Station (ISS), there's a small chamber onboard where things get much, much colder: colder than space itself.

In NASA's Cold Atom Lab aboard the ISS, scientists have successfully blown small, spherical gas bubbles cooled to just a millionth of a degree above absolute zero, the lowest temperature theoretically possible. (That's a few degrees colder than space!) The test was designed to study how ultracold gas behaves in microgravity, and the results may lead to experiments with Bose-Einstein condensates (BECs), the fifth state of matter.
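One standard way to see why a millionth of a degree matters is the thermal de Broglie wavelength, lambda = h / sqrt(2*pi*m*k*T), which sets the quantum "size" of an atom. A minimal sketch for rubidium-87 (a species typical of cold-atom experiments; the choice of atom and temperatures here are illustrative assumptions, not details from the article):

```python
import math

H_PLANCK = 6.62607015e-34       # J*s, Planck constant
K_BOLTZ = 1.380649e-23          # J/K, Boltzmann constant
M_RB87 = 87 * 1.66053907e-27    # kg, mass of a rubidium-87 atom

def thermal_de_broglie(mass_kg, temp_k):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*k*T)."""
    return H_PLANCK / math.sqrt(2.0 * math.pi * mass_kg * K_BOLTZ * temp_k)

print(f"at room temperature (295 K): {thermal_de_broglie(M_RB87, 295.0)*1e9:.4f} nm")
print(f"at one microkelvin (1e-6 K): {thermal_de_broglie(M_RB87, 1e-6)*1e9:.0f} nm")
```

At room temperature the wavelength is far smaller than an atom; at a microkelvin it swells to hundreds of nanometers, comparable to the spacing between atoms in a dilute gas, which is the regime where quantum effects like Bose-Einstein condensation take over.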

The test demonstrated that, like liquid, gas coalesces into spheres in microgravity. On Earth, similar experiments have failed because gravity pulls the matter into asymmetrical droplets.

Related: Scientists create exotic, fifth state of matter on space station to explore the quantum world

"These are not like your average soap bubbles," David Aveline, the study's lead author and a member of the Cold Atom Lab science team at NASA's Jet Propulsion Laboratory (JPL) in California, said in a statement (opens in new tab). "Nothing that we know of in nature gets as cold as the atomic gases produced in Cold Atom Lab.

"So we start with this very unique gas and study how it behaves when shaped into fundamentally different geometries," Aveline explained. "And, historically, when a material is manipulated in this way, very interesting physics can emerge, as well as new applications."

Now, the team plans to transition the ultracold gas bubbles into the BEC state, which can exist only in extremely cold temperatures, to perform more quantum physics research.

"Some theoretical work suggests that if we work with one of these bubbles that is in the BEC state, we might be able to form vortices basically, little whirlpools in the quantum material," Nathan Lundblad, a physics professor at Bates College in Maine and the principal investigator of the new study, said in the same statement. "That's one example of a physical configuration that could help us understand BEC properties better and gain more insight into the nature of quantum matter."

Such experiments are possible only in the microgravity of the Cold Atom Lab, which comprises a vacuum chamber about the size of a minifridge. It was installed on the ISS in 2018, and it's operated remotely by a team on the ground at JPL.

"Our primary goal with Cold Atom Lab is fundamental research we want to use the unique space environment of the space station to explore the quantum nature of matter," said Jason Williams, a project scientist for the Cold Atom Lab at JPL. "Studying ultracold atoms in new geometries is a perfect example of that."

The team's observations were published May 18 in the journal Nature (opens in new tab).


Read more from the original source:

Ultracold gas bubbles on the space station could reveal strange new quantum physics - Space.com

Could quantum mechanics explain the Mandela effect? – Big Think

There are some questions that, if you look up the answer, might make you question the reliability of your brain.

Many other examples abound, from the color of different flavor packets of Walkers crisps to the spelling of Looney Tunes (vs. Looney Toons) and Febreze (vs. Febreeze) to whether the Monopoly Man has a monocle or not.

Perhaps the simplest explanation for all of these is simply that human memory is unreliable, and that, as much as we trust our brains to remember what happened in our own lives, our own minds are at fault. But there's another possibility based on quantum physics that's worth considering: could these truly have been the outcomes that occurred for us, but in a parallel Universe? Here's what the science has to say.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. (Specifically, for the strong interactions.) Even in empty space, this vacuum energy is non-zero, and what appears to be the ground state in one region of curved space will look different from the perspective of an observer where the spatial curvature differs. As long as quantum fields are present, this vacuum energy (or a cosmological constant) must be present, too.

One of the biggest differences between the classical world and the quantum world is the notion of determinism. In the classical world (which also defined all of physics, including mechanics, gravitation, and electromagnetism, prior to the late 19th century) the equations that govern the laws of nature are all completely deterministic. If you can give details about all of the particles in the Universe at any given moment in time, including their mass, charge, position, and momentum at that particular moment, then the equations that govern physics can tell you both where they were and where they will be at any moment in the past or future.

But in the quantum Universe, this simply isn't the case. No matter how accurately you measure certain properties of the Universe, there's a fundamental uncertainty that prevents you from knowing those properties arbitrarily well at the same time. In fact, the better you measure some of the properties that a particle or system of particles can have, the greater the inherent uncertainty becomes in other properties: an uncertainty that you cannot get rid of or reduce below a critical value. This fundamental relation, known as the Heisenberg uncertainty principle, cannot be worked around.

This diagram illustrates the inherent uncertainty relation between position and momentum. When one is known more accurately, the other is inherently less able to be known accurately. Every time you accurately measure one, you ensure a greater uncertainty in the corresponding complementary quantity.
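That trade-off can be checked numerically. A Gaussian wavepacket saturates the Heisenberg bound, sigma_x * sigma_p = hbar/2; a minimal sketch (in natural units with hbar = 1, using a discretized wavefunction and a Fourier transform to momentum space):

```python
import numpy as np

HBAR = 1.0  # natural units

# Gaussian wavepacket on a grid: verify sigma_x * sigma_p ~ hbar/2,
# the minimum allowed by the Heisenberg uncertainty principle.
N, L = 4096, 80.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.7  # position-space width (arbitrary choice)
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt((np.abs(psi)**2).sum() * dx)          # normalize

prob_x = np.abs(psi)**2
mean_x = (x * prob_x).sum() * dx
sigma_x = np.sqrt(((x - mean_x)**2 * prob_x).sum() * dx)

# Fourier transform to momentum space (p = hbar * k)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = 2 * np.pi / L
prob_k = np.abs(np.fft.fft(psi))**2
prob_k /= prob_k.sum() * dk                          # normalize in k-space
mean_k = (k * prob_k).sum() * dk
sigma_p = HBAR * np.sqrt(((k - mean_k)**2 * prob_k).sum() * dk)

print(f"sigma_x * sigma_p = {sigma_x * sigma_p:.4f}  (hbar/2 = {HBAR / 2})")
```

Squeezing sigma to sharpen the position fattens the momentum distribution by exactly the compensating factor, so the product never drops below hbar/2.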


There are many other examples of uncertainty in quantum physics, and many of those uncertain measurements don't just have two possible outcomes, but a continuous spectrum of possibilities. It's only by measuring the Universe, or by causing an interaction of an inherently uncertain system with another quantum from the environment, that we discover which of the possible outcomes describes our reality.

The Many Worlds Interpretation of quantum mechanics holds that there are an infinite number of parallel Universes that exist, holding all possible outcomes of a quantum mechanical system, and that making an observation simply chooses one path. This interpretation is philosophically interesting, but may add nothing of value when it comes to actual physics.

One of the problems with quantum mechanics is the problem of what it means for what's really going on in our Universe. We have this notion that there is some sort of objective reality (a really real reality) that's independent of any observer or external influence. That, in some way, the Universe exists as it does without regard for whether anyone or anything is watching or interacting with it.

This very notion is not something we're certain is valid. Although it's pretty much hard-wired into our brains and our intuitions, reality is under no obligation to conform to them.

What does that mean, then, when it comes to the question of what's truly going on when, for example, we perform the double-slit experiment? If you have two slits in a screen that are narrowly spaced, and you shine a light through them, the illuminated pattern that shows up behind the screen is an interference pattern: multiple bright lines patterned after the shape of the slits, interspersed with dark lines between them. This is not what you'd expect if you threw a series of tiny pebbles through that double slit; you'd simply expect two piles of rocks, with each one corresponding to the rocks having gone through one slit or the other.

Results of a double-slit experiment performed by Dr. Tonomura, showing the build-up of an interference pattern of single electrons. If which slit each electron passes through is measured, the interference pattern is destroyed, leading to two piles instead. The number of electrons in each panel is 11 (a), 200 (b), 6,000 (c), 40,000 (d), and 140,000 (e).
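The difference between the two outcomes comes down to the order of "add" and "square." If the slit is unmeasured, the amplitudes from the two slits add before being squared (interference); if the path is measured, the probabilities add instead (two piles). A toy calculation, with illustrative numbers rather than Tonomura's actual parameters:

```python
import math

WAVELEN, SLIT_SEP, SCREEN_DIST = 50e-9, 1e-6, 1.0  # meters; illustrative values

def amplitude(x, slit_offset):
    """Spherical-wave amplitude at screen position x from one slit."""
    r = math.sqrt(SCREEN_DIST**2 + (x - slit_offset)**2)
    phase = 2.0 * math.pi * r / WAVELEN
    return complex(math.cos(phase), math.sin(phase)) / r

def unmeasured(x):   # add amplitudes, then square: interference fringes
    return abs(amplitude(x, +SLIT_SEP / 2) + amplitude(x, -SLIT_SEP / 2)) ** 2

def measured(x):     # square each amplitude, then add: two overlapping piles
    return abs(amplitude(x, +SLIT_SEP / 2)) ** 2 + abs(amplitude(x, -SLIT_SEP / 2)) ** 2

xs = [i * 0.005 for i in range(-100, 101)]   # screen positions in meters
interf = [unmeasured(x) for x in xs]
piles = [measured(x) for x in xs]
print(f"unmeasured: min/max intensity = {min(interf) / max(interf):.3f}  (dark fringes)")
print(f"measured:   min/max intensity = {min(piles) / max(piles):.3f}  (no fringes)")
```

The unmeasured case shows near-zero dark fringes between bright bands; the measured case is almost featureless, with no dark fringes anywhere.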

The thing about this double-slit experiment is this: as long as you don't measure which slit the light goes through, you will always get an interference pattern.

This remains true even if you send the light through one photon at a time, so that multiple photons aren't interfering with one another. Somehow, it's as though each individual photon is interfering with itself.

It's still true even if you replace the photon with an electron, or other massive quantum particles, whether fundamental or composite. Sending electrons through a double slit, even one at a time, gives you this interference pattern.

And it ceases to be true, immediately and completely, if you start measuring which slit each photon (or particle) went through.

But why? Why is this the case?

That's one of the puzzles of quantum mechanics: it seems as though it's open to interpretation. Is there an inherently uncertain distribution of possible outcomes, and does the act of measuring simply pick out which outcome it is that has occurred in this Universe?

Is it the case that everything is wave-like and uncertain, right up until the moment that a measurement is made, and that the act of measuring is the critical action that causes the quantum mechanical wavefunction to collapse?

When a quantum particle approaches a barrier, it will most frequently interact with it. But there is a finite probability of not only reflecting off of the barrier, but tunneling through it. The actual evolution of the particle is only determined by measurement and observation, and the wavefunction interpretation only applies to the unmeasured system; once its trajectory has been determined, the past is entirely classical in its behavior.
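The probability of tunneling can be computed exactly for a rectangular barrier: T = [1 + V0^2 sinh^2(kappa*a) / (4E(V0 - E))]^-1, with kappa = sqrt(2m(V0 - E))/hbar. A sketch with illustrative numbers (a 1 eV electron meeting a 5 eV, half-nanometer barrier, chosen for this example rather than taken from the article):

```python
import math

HBAR = 1.054571817e-34   # J*s, reduced Planck constant
M_E = 9.1093837015e-31   # kg, electron mass
EV = 1.602176634e-19     # J per electron-volt

def transmission(e_ev, v0_ev, width_m, mass=M_E):
    """Exact transmission probability through a rectangular barrier (V0 > E)."""
    e, v0 = e_ev * EV, v0_ev * EV
    kappa = math.sqrt(2.0 * mass * (v0 - e)) / HBAR
    s = math.sinh(kappa * width_m)
    return 1.0 / (1.0 + (v0**2 * s**2) / (4.0 * e * (v0 - e)))

print(f"1 eV electron, 5 eV barrier, 0.5 nm: T ~ {transmission(1.0, 5.0, 0.5e-9):.1e}")
print(f"same barrier, doubled to 1.0 nm:     T ~ {transmission(1.0, 5.0, 1.0e-9):.1e}")
```

Doubling the barrier's width suppresses the probability by roughly another factor of e^(-2*kappa*a), about four orders of magnitude here, which is why tunneling is dramatic on atomic scales yet utterly invisible for everyday objects.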

Or is it the case that each and every possible outcome that could occur actually does occur, but simply not in our Universe? Is it possible that there are an infinite number of parallel Universes out there, and that all possible outcomes occur infinitely many times in a variety of them, but it takes the act of measurement to know which one occurred in ours?

Although these might all seem like radically different possibilities, they're all consistent interpretations of quantum mechanics (and not, by any means, an exhaustive list of them). At this point in time, the only differences between the Universes they describe are philosophical. From a physical point of view, they all predict the exact same results for any experiment we know how to perform at present.

However, if there are an infinite number of parallel Universes out there (not simply in a mathematical sense, but in a physically real one) there needs to be a place for them to live. We need enough Universe to hold all of these possibilities, and to allow there to be somewhere within it where every possible outcome can be real. The only way this could work is if:

the Universe was born infinite in size a finite amount of time ago, or

the Universe was born finite in size an infinite amount of time ago.

From a pre-existing state, inflation predicts that a series of universes will be spawned as inflation continues, with each one being completely disconnected from every other one, separated by more inflating space. One of these bubbles, where inflation ended, gave birth to our Universe some 13.8 billion years ago, where our entire visible Universe is just a tiny portion of that bubbles volume. Each individual bubble is disconnected from all of the others.

The Universe needs to be born infinite because the number of possible outcomes that can occur in a Universe that starts off like ours, 13.8 billion years ago, increases more quickly than the number of independent Universes that come to exist in even an eternally inflating Universe. Unless the Universe was born infinite in size a finite amount of time ago, or it was born finite in size an infinite amount of time ago, it's simply not possible to have enough Universes to hold all possible outcomes.

But if the Universe was born infinite and cosmic inflation occurred, suddenly the Multiverse includes an infinite number of independent Universes that start with initial conditions identical to our own. In such a case, anything that could occur not only does occur, but occurs an infinite number of times. There would be an infinite number of copies of you, and me, and Earth, and the Milky Way, etc., that exist in an infinite number of independent Universes. And in some of them, reality unfolds identically to how it did here, right up until the moment when one particular quantum measurement takes place. For us in our Universe, it turned out one way; for the version of us in a parallel Universe, perhaps that outcome is the only difference in all of our cosmic histories.

The inherent width, or half the width of the peak in the above image when you're halfway to the crest of the peak, is measured to be 2.5 GeV: an inherent uncertainty of about +/- 3% of the total mass. The mass of the particle in question, the Z boson, is peaked at 91.187 GeV, but that mass is inherently uncertain by a significant amount.
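The shape being described is a Breit-Wigner resonance, whose full width at half maximum equals the particle's decay width Gamma. A quick numerical check using the Particle Data Group values for the Z boson (a simple Lorentzian form, which is adequate for this estimate):

```python
M_Z, GAMMA_Z = 91.1876, 2.4952   # GeV, PDG values for the Z boson

def breit_wigner(e):
    """Lorentzian resonance shape peaked at M_Z with width GAMMA_Z."""
    return 1.0 / ((e - M_Z) ** 2 + GAMMA_Z ** 2 / 4.0)

# scan for the two energies where the curve falls to half its peak value
peak = breit_wigner(M_Z)
es = [M_Z - 5.0 + i * 1e-4 for i in range(100000)]
above = [e for e in es if breit_wigner(e) >= peak / 2.0]
fwhm = above[-1] - above[0]
print(f"FWHM = {fwhm:.3f} GeV -> relative width {fwhm / M_Z:.1%}")
```

The ~2.7% relative width matches the "about +/- 3%" quoted above; by the uncertainty principle, that same Gamma fixes the Z's mean lifetime at hbar/Gamma, roughly 2.6 x 10^-25 seconds.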

But when we talk about uncertainty in quantum physics, we're generally talking about an outcome whose results haven't been measured or decided just yet. What's uncertain in our Universe isn't past events that have already been determined, but only events whose possible outcomes have not yet been constrained by measurables.

If we think about a double slit experiment that's already occurred, once we've seen the interference pattern, it's not possible to state whether a particular electron traveled through slit #1 or slit #2 in the past. That was a measurement we could have made but didn't, and the act of not making that measurement resulted in the interference pattern appearing, rather than simply two piles of electrons.

There is no Universe where the electron travels either through slit #1 or slit #2 and still makes an interference pattern by interfering with itself. Either the electron travels through both slits at once, allowing it to interfere with itself, and lands on the screen in such a way that thousands upon thousands of such electrons will expose the interference pattern, or some measurement occurs to force the electron to travel solely through slit #1 or slit #2, and no interference pattern is recovered.

Perhaps the spookiest of all quantum experiments is the double-slit experiment. When a particle passes through the double slit, it will land in a region whose probabilities are defined by an interference pattern. With many such observations plotted together, the interference pattern can be seen if the experiment is performed properly; if you retroactively ask, "Which slit did each particle go through?" you will find you're asking an ill-posed question.

What does this mean?

It means (as was recognized by Heisenberg himself nearly a century ago) that the wavefunction description of the Universe does not apply to the past. Right now, there are a great many things that are uncertain in the Universe, and that's because the critical measurement or interaction to determine what that thing's quantum state is has not yet been taken.

In other words, there is a boundary between the classical and the quantum (the definitive and the indeterminate), and the boundary between them is when things become real, and when the past becomes fixed. That boundary, according to physicist Lee Smolin, is what defines "now" in a physical sense: the moment where the things that we're observing at this instant fix certain observables to have definitively occurred in our past.

We can think about infinite parallel Universes as opening up before us as far as future possibilities go, in some sort of infinitely forward-branching tree of options, but this line of reasoning does not apply to the past. As far as the past goes, at least in our Universe, previously determined events have already been metaphorically written in stone.

This 1993 photo by Carol M. Highsmith shows the last president of apartheid-era South Africa, F.W. de Klerk, alongside president-elect Nelson Mandela, as both were about to receive America's Liberty Medal for effecting the transition of power away from white minority rule and towards universal majority rule. This event definitively occurred in our Universe.

In a quantum mechanical sense, this boils down to two fundamental questions: can a difference in a quantum outcome propagate upward to create a macroscopically different reality from the one we experienced, and can any information about such an outcome travel between parallel Universes?

The answer seems to be no and no. To achieve a macroscopic difference from quantum mechanical outcomes means we've already crossed into the classical realm, and that means the past history is already determined to be different. There is no way back to a present where Nelson Mandela dies in 2013 if he already died in prison in the 1980s.

Furthermore, the only places where these parallel Universes can exist are beyond the limit of our observable Universe, where they're completely causally disconnected from anything that happens here. Even if there's a quantum mechanical entanglement between the two, the only way information can be transferred between those Universes is limited by the speed of light. Any information about what occurred over there simply doesn't exist in our Universe.

We can imagine a very large number of possible outcomes that could have resulted from the conditions our Universe was born with, and a very large number of possible outcomes that could have occurred over our cosmic history as particles interact and time passes. If there were enough possible Universes out there, it would also be possible that the same set of outcomes happened in multiple places, leading to the scenario of infinite parallel Universes. Unfortunately, we only have the one Universe we inhabit to observe, and other Universes, even if they exist, are not causally connected to our own.

The truth is that there may well be parallel Universes out there in which all of these things did occur. Maybe there is a Berenstein Bears out there, along with Shazaam the movie and a Nelson Mandela who died in prison in the 1980s. But that has no bearing on our Universe; those things never occurred here, and no one who remembers otherwise is correct. Although the neuroscience of human memory is not fully understood, the physical science of quantum mechanics is well enough understood that we know what's possible and what isn't. You do have a faulty memory, and parallel Universes aren't the reason why.

Read the original:

Could quantum mechanics explain the Mandela effect? - Big Think