Category Archives: Quantum Physics

Six fabulous facts about the Standard Model – Symmetry magazine

In ancient times, Greeks interested in forecasting the future would voyage on the seventh day of the month to the Temple of Apollo in Delphi to seek insight from the oracle. Today, we don't need to decipher the riddles of a high priestess; scientists build mathematical models that predict everything from the economy to the weather.

One particularly powerful mathematical prophet is the Standard Model of particle physics. It produces sharp predictions about the subatomic world.

"The Standard Model is a collection of ideas that tells us about nature and how all the particles in the universe interact with each other," says Tulika Bose, a physics professor at the University of Wisconsin.

The Standard Model describes the behavior of the smallest building blocks we know: six types of quarks, six types of leptons, three fundamental forces (and their four associated particles), plus the Higgs boson.

Like the soothsayers of antiquity, the Standard Model speaks in riddles that only trained practitioners can interpret. But unlike Pythia of Apollo, the Standard Model is an amalgamation based on the work of thousands of independent scientists, and its predictions have weathered decades of experimental testing.

Today in Symmetry, learn six fabulous facts about one of the most robust scientific models in the world.

Illustration by Sandbox Studio, Chicago with Corinne Mucha

By the 1960s, physicists had built up quite a collection of what they considered to be fundamental particles: discrete pieces of matter that could not be broken down any further into constituent parts. There were so many different particles, they referred to them as the "particle zoo."

But in 1964, physicists Murray Gell-Mann and George Zweig theorized that many members of the particle zoo were actually composite particles made up of even smaller pieces, which we now call quarks. The list of true fundamental particles was significantly smaller, and scientists began to see new patterns. This was the beginning of the development of the Standard Model of particle physics.

The first references to "a standard model" appear in papers published in the 1970s. At this stage, physicists were still using "standard" as an adjective, not as part of a proper noun.

"If you start digging through the papers, you see, in lowercase, 'the standard model of gauge interactions' and 'the standard description of electroweak interactions,'" says Richard Ruiz, a theorist at the Institute of Nuclear Physics in Krakow, Poland.

This was the era of model-building. The ideas that worked the best were considered standard, but in the traditional sense of the word. Over time, in the '80s and '90s, slowly the S and the M got capitalized.

Illustration by Sandbox Studio, Chicago with Corinne Mucha

The Standard Model makes some basic assumptions about the universe. And it would function very differently if any of them turned out to be untrue. Here are a few of those assumptions:

Illustration by Sandbox Studio, Chicago with Corinne Mucha

The Standard Model's equations currently suggest that our universe could be metastable: With an unlucky quantum occurrence, it could collapse.

But scientists think it's unlikely the universe is actually in existential danger. They attribute this anxiety-provoking prediction to the data the Standard Model is working with: our inexact measurements of the masses of the two heaviest known fundamental particles, the top quark and the Higgs boson.

These particle masses matter so much because the Standard Model is derived from experimental measurements.

Just as you can't use the Pythagorean theorem to figure out the length of the hypotenuse of a right triangle without knowing the length of the triangle's other two sides, you can't use the Standard Model to make predictions without other inputs. By itself, the Standard Model cannot predict the mass of the various fundamental particles, nor can it predict how strongly they will interact with each other.

Because the Standard Models predictions depend on data from experiments, the predictions are not static, but constantly evolving as detectors and analytical methods improve.

And as predictions become more precise, there are hints that some of them might no longer be consistent with each other. "It's a bit like a scavenger hunt where each measured value is a clue that helps lead us to the next," Ruiz says.

Illustration by Sandbox Studio, Chicago with Corinne Mucha

The Standard Model is quite a bit more complicated than the Pythagorean theorem. Whereas Pythagoras needed only two inputs to determine the length of the third side of a right triangle, the Standard Model needs values for at least 18 independent variables to predict the behaviors of subatomic particles. These inputs include factors such as the particle masses, the strength of the Higgs field, and how the various forces intersect with each other.

"It's like 18 independent knobs that each have a fixed value," Bose says. "They are free parameters that are not tied to each other."

The Standard Model funnels these independent values into equations that can predict how particles form, decay and bond to create all matter in the visible universe.
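For a concrete feel of how these "knobs" work, here is a minimal illustrative sketch in Python (our own illustration, not Symmetry's or Bose's). The values shown are approximate published measurements for a handful of the roughly 18 inputs; the full set also includes the remaining fermion masses and the quark-mixing angles and phases.

```python
# A minimal sketch of the "18 independent knobs" idea: the Standard Model
# cannot derive these numbers itself; they come from experiment.
# Values are approximate and illustrative, and the list is not complete.

free_parameters = {
    "electron_mass_GeV": 0.000511,            # measured lepton mass
    "top_quark_mass_GeV": 172.7,              # heaviest known fundamental particle
    "higgs_boson_mass_GeV": 125.25,           # measured at the LHC
    "higgs_vacuum_expectation_GeV": 246.0,    # sets the strength of the Higgs field
    "strong_coupling_at_MZ": 0.118,           # how strongly quarks and gluons interact
    "fine_structure_constant": 1 / 137.036,   # electromagnetic coupling strength
}

def predictions_possible(params: dict) -> bool:
    """The model's equations only produce numbers once every knob has a value."""
    return all(value is not None for value in params.values())

if __name__ == "__main__":
    for name, value in free_parameters.items():
        print(f"{name:32s} = {value}")
    print("Ready to make predictions:", predictions_possible(free_parameters))
```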

Illustration by Sandbox Studio, Chicago with Corinne Mucha

The Standard Model needs inputs such as the masses of particles to make predictions. But some particles, such as photons and gluons, have no mass.

An early version of the Standard Model assumed that another type of particle, the neutrino, was massless as well. But when scientists discovered that was not true, theorists needed to fit this discovery into the Standard Model's equations. (And they are still working on it!)

According to Bose, the Standard Model has developed through decades of trial and error. "The initial description of the Standard Model was far from complete or correct," Bose says. "Our understanding has changed significantly over the years."

Physicists are hopeful that future discoveries will provide further insight into the big questions in physics that the Standard Model fails to address.

Illustration by Sandbox Studio, Chicago with Corinne Mucha

The Standard Model is a trustworthy guide when it comes to its 17 fundamental particles and three fundamental forces. But observations from astronomy and cosmology let us know that there's more to the subatomic world.

"A big open question is gravity," Bose says. "We don't have any way to account for gravity on subatomic scales."

Gravity, dark matter and many other phenomena are omitted from the Standard Model, and experimental results so far have only served to discredit promising ideas for ways to include them.

But Bose isn't worried. As the history of the Standard Model shows, just knowing where the questions lie is an important step. Interesting questions lead to intriguing answers, and intriguing answers lead to even more interesting questions.

"It's one of the greatest theories of all time, and almost a theory of everything," Bose says. "We're excited to see how new experimental results might shake things up and enable the Standard Model to keep growing and evolving."

See original here:

Six fabulous facts about the Standard Model - Symmetry magazine

Cryptocurrency: Can it be climate conscious, and if so, how? – Landscape News

This article is the first in a two-part series on cryptocurrency and climate change.

Are you thinking of investing in Bitcoin? The world's largest cryptocurrency by market cap has surged in value since the beginning of the COVID-19 pandemic, fueled by furloughed millennials and high-profile investors like Elon Musk, co-founder of Tesla. The car manufacturer revealed in February that it had invested USD 1.5 billion in Bitcoin.

With investments rocketing five-fold since October, many are wondering, "How do I get into Bitcoin?" But, given that climate change is forecast to reduce average global incomes by roughly 23 percent by 2100, we should be asking a more fundamental question: What impact do cryptocurrencies have on the environment?

In this first article of our two-part series on cryptocurrency and climate change, Landscape News went to the experts and put Bitcoin under the microscope.

In How Bad Are Bananas? The Carbon Footprint of Everything, Professor Berners-Lee describes cryptocurrencies as "one of the most fundamentally pointless ways of using energy." It's a forthright statement, but the data would seem to support Berners-Lee.

A 2019 study from two sustainability researchers at Aalborg University, Susanne Köhler and Massimo Pizzol, estimated that the Bitcoin mining network in 2018 had an annual footprint of 17.29 megatonnes (million metric tons) of carbon dioxide equivalent. That is roughly the same as driving from San Francisco to New York 15,000 times, or the amount of carbon hypothetically sequestered by a forest the size of Portugal.

But Köhler and Pizzol were using data from 2018, when the market value of Bitcoin was a seventh of its value today. According to the Cambridge Centre for Alternative Finance, Bitcoin currently uses more electricity than the entire countries of Austria and Greece combined. Since the most recent surge in market price, which began in November 2020, the energy demands of the Bitcoin network have doubled.

In order to understand why this has happened, we must first understand how cryptocurrencies work.

Fundamentally, currencies only have value because a group of people believe that they have value. The only reason that we can exchange goods and services with the tap of a credit card is because everybody trusts the system. As anthropologist David Graeber wrote in Debt: The First 5,000 Years, the value of a unit of currency is not the measure of the value of an object, but the measure of one's trust in other human beings.

With conventional currencies, that trust is backed up by a national or federal central bank, the government, the police and, ultimately, the military. As a result, most people have a lot of faith in the system and, most of the time, the system works. Bitcoin, in contrast, is backed by absolutely nothing: no central bank, no government and certainly no military. So why do so many people trust it?

The answer is blockchain technology, invented in 2008 by a still-anonymous creator called Satoshi Nakamoto. In cryptocurrencies like Bitcoin, the blockchain serves as a distributed ledger, a public record of transactions that is virtually impossible to defraud. Over the years, the Bitcoin blockchain has proved itself trustworthy again and again: to the point where a company like Tesla has trusted the cryptocurrency more than the US dollar for its latest investment.

The problem for the environment is that the Bitcoin blockchain is founded on something called proof of work. In proof of work blockchains, computers around the world (called miners) compete to add new blocks of currency transactions to the ledger by solving extremely hard mathematical puzzles. The first computers to solve these puzzles are rewarded in Bitcoin. The difficulty of these puzzles also helps keep financial transactions in the blockchain secure: it is simply too expensive to defraud the network.
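To make the puzzle-solving concrete, here is a deliberately simplified proof-of-work loop in Python. It is only a sketch of the general idea, not Bitcoin's actual protocol (real mining hashes structured block headers, typically on specialized hardware); the point is that finding a valid nonce takes brute-force effort, while checking one is instant.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Try successive nonces until the SHA-256 hash has `difficulty_bits`
    leading zero bits. Each extra bit roughly doubles the expected work."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

if __name__ == "__main__":
    for bits in (8, 16, 20):
        nonce, digest = mine("example transactions", bits)
        print(f"{bits} bits -> nonce {nonce}, hash {digest[:16]}...")
```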


The issue is that Bitcoin is designed to be scarce, like gold. Satoshi Nakamoto designed Bitcoin so that one block would be added to the blockchain roughly every 10 minutes. This 10-minute rule is regulated by the difficulty of the mathematical puzzles and, as computing power has increased since 2008, the difficulty of the mining puzzles has also increased exponentially.

In January 2009, the difficulty of the puzzles was 1.0. In March 2021, the difficulty of the puzzles is 20 trillion. Because the puzzles are harder, the mining computers need to work harder, using more and more electricity. In 2018, Bitcoin miners in Kosovo drained enough power from the grid to make digital clocks all over Europe lose time.
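The 10-minute rule is enforced by periodic retargeting. The arithmetic below is an approximate sketch of how such a rule can work; the specific constants (a 2,016-block window and a four-fold adjustment limit) reflect Bitcoin's commonly described behaviour and should be treated as illustrative rather than as a specification.

```python
# Approximate Bitcoin-style difficulty retargeting: if the last window of
# blocks arrived faster than one block per 10 minutes, raise the difficulty
# proportionally; if slower, lower it. Adjustments are clamped.

BLOCKS_PER_WINDOW = 2016          # retarget interval (approximate description)
TARGET_SECONDS_PER_BLOCK = 600    # the "10-minute rule"
MAX_ADJUSTMENT = 4.0              # each retarget limited to a four-fold change

def retarget(old_difficulty: float, actual_window_seconds: float) -> float:
    expected = BLOCKS_PER_WINDOW * TARGET_SECONDS_PER_BLOCK
    ratio = expected / actual_window_seconds
    ratio = max(1 / MAX_ADJUSTMENT, min(MAX_ADJUSTMENT, ratio))
    return old_difficulty * ratio

if __name__ == "__main__":
    # If more mining hardware joins and blocks arrive every 8 minutes on
    # average, the difficulty rises by about 25 percent at the next retarget.
    print(retarget(old_difficulty=1.0, actual_window_seconds=2016 * 480))
```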

Unfortunately, as more and more investors pile in, the price of Bitcoin is rising faster than the electricity bills of the mining companies.

"I don't think you can participate [in Bitcoin] and have zero impact," says Susanne Köhler, a sustainable blockchain researcher at Aalborg University. "Firstly, one could attribute to your transactions the related share of the system's impacts. Secondly, using Bitcoin adds to the miners' revenues and likely impacts the market price, so you are perpetuating a system that has a negative impact, whether that's you buying USD 0.05 of Bitcoin or Elon Musk buying USD 1.5 billion."

You don't have to search hard to find arguments defending the vast energy use of Bitcoin, many of them extremely convincing, but the resounding message from researchers looking at its environmental impacts is to beware the true believers. "Having these conversations can be extremely frustrating because there are so many people that believe in Bitcoin and don't want to look at the negative sides," says Köhler.

Next we'll examine some of the arguments that Bitcoin believers use to defend the cryptocurrency's high energy cost.

Using survey data from May 2020, Cambridge University's 3rd Global Cryptoasset Benchmarking Study found that 39 percent of global proof of work mining was powered by renewable energy. For miners based in Europe and North America, that proportion goes up to 70 percent and 66 percent respectively.

"There are some Bitcoin mining operations that are projected to positively impact the world of renewable energy, but on a large scale, that is likely not the case," Köhler says.

Because of low energy prices, including from renewables, around half the world's Bitcoin mining takes place in China. However, the enormous environmental cost is already putting the industry on a collision course with President Xi Jinping's pledge to achieve carbon neutrality before 2060, and the Chinese government recently announced that all Bitcoin mining operations in Inner Mongolia will be shut down by the end of April because they were preventing the region from meeting its carbon reduction goals.

Ultimately, the question of whether the power used to mine Bitcoin is environmentally friendly or not is somewhat irrelevant. Renewables only supply 28 percent of the world's electricity, so the Bitcoin network's use of green energy simply means that other areas of the economy cannot decarbonize.

"There have been multiple cases where Bitcoin mining facilities have displaced other electricity consumption," Köhler says. "So, if they use the renewable electricity, others may not have access to [renewables] anymore."

Some Bitcoin mining sites around the world are powered by stranded or curtailed energy: energy that, for whatever reason, cannot be connected to the grid and would otherwise be wasted.

This argument has been used to defend Bitcoin mining in places like Sichuan, China, where, during the rainy season, hydroelectric dams generate a huge excess of power. By only using curtailed power, Bitcoin is in effect preserving surplus energy in the same way that Iceland captures its surplus renewable power by smelting aluminum.

But, according to Köhler, this defense is no longer valid. "We're not at the scale where that is possible any longer," she says. "The Bitcoin network is increasing and cannot be covered by curtailed power alone."

There are two further arguments against the idea of using curtailed electricity, even if it were able to cover the energy demands of Bitcoin. Firstly, as Köhler says, it disincentivizes power plants from being integrated into international grids and, secondly, it also disincentivizes research and development into grid-scale batteries that could store and transport the energy off-site.

Christian Stoll, co-author of another paper that estimates the carbon footprint of Bitcoin, points out that Bitcoin is currently responsible for less than 1 percent of global carbon emissions. Although Stoll agrees that proof of work cryptocurrencies like Bitcoin are not climate conscious investments, he also suggests that there are bigger levers to meet the Paris Agreement goals.

For comparison, the entire Bitcoin network currently uses about half as much power every year as all the electronic devices that citizens of the U.S. leave plugged in when they are not using them.


Nevertheless, because the mathematical puzzles underpinning the cryptocurrency are getting harder and harder to solve, there is no doubt that Bitcoin is environmentally harmful and getting worse, despite a 96 percent increase in mining machine efficiency over the past seven years. That puts a hole in the argument that Bitcoin mining technology might somehow reduce its impact through efficiency gains.

If proof of work is the problem, the question is: are there any greener alternatives? Finally, here's some good news.

Stephen Reid, selected as the Green Party candidate for Totnes, U.K. at the last General Election, is one of the teachers of Tools for the Regenerative Renaissance, a course that combines technology and blockchain education with climate consciousness.

"Bitcoin was the very earliest instantiation of this technology, and it is incredibly energy inefficient," says Reid, who holds a master's degree in physics and another in complexity sciences. "But, nevertheless, it was a stroke of absolute genius. It is changing the world as much as Einstein's 1905 papers on quantum mechanics and special relativity."

Satoshi Nakamoto pieced together three or four different concepts to produce the first decentralized form of money, in a way that no one had thought before. Bitcoin is incredibly energy inefficient and cannot be defended over the long term, but, happily, people have come up with vastly more energy efficient consensus schemes.

The most promising alternative to proof of work is called proof of stake. In proof of stake blockchains, the blocks are "forged" rather than mined and, instead of solving hard puzzles, the creator of the next block in the chain is chosen using a combination of randomization and how much of the cryptocurrency they hold; that's the "stake."
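As a toy illustration of stake-weighted selection (not the algorithm of any particular blockchain), the sketch below picks the next forger with probability proportional to stake; note that there is no brute-force hashing anywhere in the loop.

```python
import random

def choose_forger(stakes: dict[str, float], seed: int) -> str:
    """Pick the next block's creator with probability proportional to stake.
    A toy model: real proof-of-stake schemes add slashing, committees, etc."""
    rng = random.Random(seed)   # stand-in for an agreed-upon source of randomness
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

if __name__ == "__main__":
    stakes = {"alice": 32.0, "bob": 8.0, "carol": 60.0}
    picks = [choose_forger(stakes, seed=block) for block in range(1000)]
    print({v: picks.count(v) for v in stakes})  # roughly proportional to stake
```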

Don't worry. You don't necessarily have to understand the finer details to get the point that proof of stake is better for the environment than proof of work. If you do want to learn more, Coindesk has a great explainer.

Proof of work blockchains are designed to need computers running all day, every day. Proof of stake blockchains only need computers to run for milliseconds at a time. Because it doesn't require hard computing work, proof of stake has the potential to massively reduce the energy needed to add blocks to a cryptocurrency's blockchain.

"Not using proof of work would reduce the calculated footprint to zero and the overall footprint by approximately 99 percent," Köhler says.

It is worth emphasizing that the overall energy use of a proof of stake cryptocurrency is not zero: it still needs a peer-to-peer computer network to verify transactions and secure the system. "The numbers that we calculated are 100 percent the mining process," Köhler explains. "They don't include the servers that host the blockchain nodes."

Nevertheless, proof of stake still represents a huge improvement in energy efficiency over proof of work cryptocurrencies like Bitcoin. There are already proof of stake cryptocurrencies out in the world: Avalanche, Cardano and Harmony, to name three of the largest, but they are all small fry compared to Bitcoin. However, the world's second largest cryptocurrency by market capitalization, Ethereum, is midway through a complicated transition from proof of work to proof of stake.

According to Ethereum's website, the transition to proof of stake has three phases, the first of which is already live and the third due sometime in 2022. "Everybody's been waiting for Ethereum's move to proof of stake for years, but it's a slow, step-by-step process," says Köhler.

The good news is that, should it be successful, the carbon-saving benefits of the transition will not only affect those who hold the Ethereum cryptocurrency. "Ethereum is fundamentally different to Bitcoin," Köhler explains. "It is a cryptocurrency, but it also hosts tons and tons of applications."

Whereas Bitcoin aims to become the world currency, Ethereum aims to become the world computer: it is a blockchain that supports a programming language. In theory, anything that can be imagined and programmed can be hosted on the Ethereum network.

Applications on the Ethereum network include numerous other cryptocurrencies and financial services including venture capital and insurance, but also smart contracts, social media and even, perhaps controversially, carbon credits. To cap it all, on 11 March, British auction house Christie's, founded in 1766, sold a crypto-artwork hosted on the Ethereum network for USD 69.3 million. Christie's, of course, accepted payment in cryptocurrency.

"If Ethereum manages to move to proof of stake, then, by association, all these projects using Ethereum will also be working with a more environmentally friendly blockchain," Köhler says.

"In principle, Bitcoin could do the same," Stephen Reid says of Ethereum's shift to proof of stake. So why don't they?

The problem is that the majority of Bitcoin's miners and stakeholders, everyone from Elon Musk to the miners of Sichuan, would have to come together and agree to change the underlying design of the blockchain. That would be an almost incredible feat of decentralized democracy.

But this question also assumes that the Bitcoin community even wants to move away from proof of work. "The miners are not interested in moving to proof of stake," Köhler says. "They make tons of money with the current system and are highly invested in it, so why would they want to move to a different one?"

The problem is that, when the price of Bitcoin rises, miners can use their higher revenues to reinvest in more mining machines, increasing both their profits and their overall carbon emissions. "This relationship needs to be studied further," Köhler says, "but it seems to be a vicious cycle."

Continue reading part two of this series on cryptocurrency and climate change: Can cryptocurrencies help the planet?

Read more from the original source:

Cryptocurrency: Can it be climate conscious, and if so, how? - Landscape News

Element Six's DNV-B1 Announced Winner for the Quantum Category at the 13th Edition of the SPIE Prism Awards – AZoM

First general-purpose quantum grade diamond enables users to unlock next generation quantum technologies.

Element Six (E6), part of the De Beers Group, has been named as a Prism Awards winner in the quantum category for its chemical vapour deposition (CVD) quantum grade diamond, DNV-B1, part of the DNV Series.

The prestigious award, which is judged by a panel of industry experts and is part of the SPIE Photonics West yearly event programme, recognises leading industrial innovations that make a difference, solve problems, and improve life through optics and photonics across a range of different areas. Element Six's DNV-B1 was announced as the quantum category winner, ahead of fellow finalists Qnami, with their ProteusQ, and AUREA Technology, with their Quantum Entangled Photon Source.

Launched in 2020, DNV-B1 is Element Six's first general-purpose CVD quantum grade diamond available on the market. Building on the company's extensive experience, international network and a unique, patented process to develop bespoke CVD diamond solutions, DNV-B1 is an ideal starting material for those interested in researching nitrogen-vacancy (NV) ensembles for quantum demonstrations, masers, detection of RF radiation, gyroscopes and magnetic sensing.

Diamond NV (DNV) centres offer researchers a unique solid-state platform with spin qubits that can be initialised and read out at room temperature, with long qubit lifetimes. Developed to provide a baseline solution that contains a uniform density of NV spin centres, Element Six's DNV-B1 was specifically designed for emerging diamond applications that require ensembles of NV centres, guaranteeing a minimum level of performance.

ProteusQ, Qnami's quantum microscope containing Element Six's diamond, was also a finalist in the quantum category. ProteusQ is the first desktop-scale diamond-enabled NV scanning microscope, made possible with bespoke quantum grade CVD diamond developed and engineered by Element Six. The microscope uses scanning NV magnetometry to enable table-top analysis of magnetic materials at the atomic scale, offering an unprecedented, miniaturised solution to processes that had previously relied on large-scale light sources.

Matthew Markham, Principal Scientist at Element Six, said: "It really is an honour to receive such a prestigious award and we are grateful to SPIE and its panel of experts for this recognition. We have been developing diamond-enabled quantum solutions for many years and it is rewarding to witness the transition from academic-led research to commercially available solutions in a number of industrial applications. We are really proud of the DNV-B1 and the technology it enables, from fundamental quantum physics to magnetic sensors."

Source: https://www.e6.com/en

Read the rest here:

Element Six's DNV-B1 Announced Winner for the Quantum Category at the 13th Edition of the SPIE Prism Awards - AZoM

Physicists Create Quasiparticles That Bind Together Two Differently Colored Particles of Light – SciTechDaily

Photon-photon polaritons in microresonators. Credit: University of Bath

Researchers exploring the interactions between light particles, photons, and matter find that optical microresonators host quasiparticles made by two photons.

Scientists at the University of Bath have found a way to bind together two photons of different colors, paving the way for important advancements in quantum electrodynamics, the field of science that describes how light and matter interact. In time, the team's findings are likely to impact developments in optical and quantum communication, and precision measurements of frequency, time, and distances.

An apple falling from a tree has velocity and mass, which together give it momentum. The apple's energy of motion depends on the fruit's momentum and mass.

Most people find the concept of momentum and energy (and therefore mass) an easy one to grasp when it's associated with solid objects. But the idea that non-material objects, such as light waves (everything from sunlight to laser radiation), also have a mass is surprising to many. Among physicists, however, it's a well-known fact. This apparently paradoxical idea that waves have a mass marks the place where quantum physics and the physical world come together.

The wave-particle duality, proposed by French physicist Louis de Broglie in 1924, is a powerful concept that describes how every particle or quantum entity can be described as either a particle or a wave. Many so-called quasiparticles have been discovered that combine either two different types of matter particles, or light waves bound to a particle of matter. A list of exotic quasiparticles includes phonons, plasmons, magnons, and polaritons.

The team of physicists at Bath has now reported a way to create quasiparticles that bind together two differently colored particles of light. They have named these formations photon-photon polaritons.

The opportunity to discover, and manipulate, photon-photons is possible thanks to the relatively new development of high-quality microresonators. For light, microresonators act as miniature racetracks, with photons zipping around the internal structure in loops. The signature left by photon-photons in the light exiting the microresonator can be linked to the Autler-Townes effect, a peculiar phenomenon in quantum theory that describes strong photon-atom interactions. To achieve this effect in microresonators, a laser is tuned to the specific resonance frequency where a photon is expected to be absorbed, yet no resonance absorption happens. Instead, the photon-photon interaction makes up two new resonance frequencies away from the old one.

A significant feature that has emerged from the Bath research is that the microresonator provided a whole set of split resonances, where each photon-photon pair displayed its own momentum and energy, allowing the researchers to apply the quasiparticle concept and calculate mass. According to the researchers' predictions, photon-photons are 1,000+ times lighter than electrons.

Professor Dmitry Skryabin, the physicist who led the research, said: "We now have a situation where microresonators, which are millimeter-scale objects, behave like giant atoms. The artificial atoms concept is rapidly gaining ground in the quantum-electrodynamics of microwaves in superconducting circuits, while here we are looking at the similar opportunity in the optical range of frequencies."

The small mass of photon-photons could lead to further developments of many important analogies between light and fluids, where other families of quasiparticles have already been used.

PhD student Vlad Pankratov, who participated in the project, said: "After a year of running models and collecting data, these are incredibly exciting findings for us. The potential applications of our results are in the terabit and quantum optical communication schemes, and in the area of precision measurements."

Reference: "Photon-photon polaritons in χ(2) microresonators" by D. V. Skryabin, V. V. Pankratov, A. Villois and D. N. Puzyrev, 17 February 2021, Physical Review Research. DOI: 10.1103/PhysRevResearch.3.L012017

This work was supported by the European Union Marie Skłodowska-Curie Actions (812818, MICROCOMB).

Continue reading here:

Physicists Create Quasiparticles That Bind Together Two Differently Colored Particles of Light - SciTechDaily

Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature – ScienceAlert

This month is a time to celebrate. CERN has just announced the discovery of four brand new particles at the Large Hadron Collider (LHC) in Geneva.

This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons, particles that make up the atomic nucleus along with neutrons, in 2009.

Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

The LHC's goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab, testing our current best theory of nature: the Standard Model of Particle Physics. And the LHC has delivered the goods: it enabled scientists to discover the Higgs boson, the last missing piece of the model. That said, the theory is still far from being fully understood.

One of its most troublesome features is its description of the strong force, which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks, a state that existed for a fleeting instant at the beginning of the universe.

Don't get us wrong: the theory of the strong interaction, pretentiously called "quantum chromodynamics", is on very solid footing. It describes how quarks interact through the strong force by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic force.

However, the way gluons interact with quarks makes the strong force behave very differently from electromagnetism. While the electromagnetic force gets weaker as you pull two charged particles apart, the strong force actually gets stronger as you pull two quarks apart.

As a result, quarks are forever locked up inside particles called hadrons, particles made of two or more quarks, which include protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at Cern.

To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

You end up with a proton and a brand new "meson", a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

This has been shown repeatedly by experiments: we have never seen a lone quark. An unpleasant feature of the theory of the strong force is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can't exist on their own.

Worse still, we can't even calculate which combinations of quarks would be viable in nature and which would not.

Illustration of a tetraquark. (CERN)

When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks), as long as the number of quarks minus antiquarks in each combination was a multiple of three.
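That counting rule (quarks minus antiquarks must be a multiple of three) is simple enough to check mechanically. The snippet below is just an illustration of the rule as stated above, not a tool used in any analysis.

```python
def allowed_combination(n_quarks: int, n_antiquarks: int) -> bool:
    """A combination can form a hadron only if the number of quarks minus
    antiquarks is a multiple of three, as described above."""
    return (n_quarks - n_antiquarks) % 3 == 0

combinations = {
    "meson (q qbar)": (1, 1),
    "baryon (qqq)": (3, 0),
    "antibaryon (qbar qbar qbar)": (0, 3),
    "tetraquark (qq qbar qbar)": (2, 2),
    "pentaquark (qqqq qbar)": (4, 1),
    "lone quark (q)": (1, 0),   # never observed, and the rule forbids it
}

for name, (q, qbar) in combinations.items():
    print(f"{name:30s} allowed: {allowed_combination(q, qbar)}")
```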

For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn't fit in anywhere. It turned out to be the first of a long series of tetraquarks.

In 2015, the LHCb experiment at the LHC discovered two pentaquarks.

The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

Is a pentaquark tightly (above) or weakly bound (see image below)? (CERN)

The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as "charm" and "bottom".

These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

Is a pentaquark a molecule? A meson (left) interacting with a proton (right). (CERN)

Another mystery is how these particles are bound together by the strong force. One school of theorists considers them to be compact objects, like the proton or the neutron.

Others claim they are akin to "molecules" formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong force behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

In either case, a better understanding of the strong force is needed to find them. With each new hadron, we improve our knowledge of nature's laws, leading us to a better description of the most fundamental properties of matter.

Patrick Koppenburg, Research Fellow in Particle Physics, Dutch National Institute for Subatomic Physics and Harry Cliff, Particle physicist, University of Cambridge.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

See more here:

Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature - ScienceAlert

Tech that sees through the earth could help build cities of the future – The Times

On Thursday a Cumbrian farmer was swallowed by a sinkhole that opened up in the middle of one of his fields. He fell 60ft through the 8ft-wide opening, with his quad bike beneath him, and was taken to hospital with chest injuries.

Many iron ore mines were dug in the area in the 19th century, but they are long abandoned and no one has much more than a rough idea where they lie, so sinkholes are an ever-present danger.

Rescuers at the scene of the sinkhole in Cumbria

Dominic Hogg and his choir were luckier. On Christmas Day he heard a bang outside his home in Bristol. A Japanese pagoda tree in a nearby square had crashed to the ground when a hole opened beneath it. Nobody was hurt, but the choir had

Follow this link:

Tech that sees through the earth could help build cities of the future - The Times

Living in a simulation: Is Universe a Neural Network? – The Indian Wire

In the past few innovation-led years, the best-performing artificial-intelligence systems, such as speech recognizers, automatic translators and the much-welcomed virtual assistants, have resulted from a technique called deep learning, part of the neural network approach.

Neural nets have occupied a major area of research in both neuroscience and computer science, and have seen both a zenith in the 1970s and a nadir in later years.

It is a means of performing machine learning, in which a computer learns to perform some task by analyzing training examples.

When we train our neural networks we run thousands or millions of cycles until the AI is properly trained.

The first Mind-Machine collab: The very first trainable neural network was named the Perceptron. It was demonstrated by the Cornell University psychologist Frank Rosenblatt in 1957.

It exhibited the ability to execute certain fairly common computations that would have been impractically time consuming otherwise.
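To give a flavour of what a trainable network such as the Perceptron does, here is a minimal modern re-creation of the classic perceptron learning rule in Python. It is an illustrative sketch, not Rosenblatt's original hardware: the network learns the logical AND of two inputs by nudging its weights whenever it misclassifies an example.

```python
# A minimal perceptron: weights are nudged toward examples it gets wrong.
# Illustrative sketch of the learning rule, not the original Mark I hardware.

def step(x: float) -> int:
    return 1 if x >= 0 else 0

def train_perceptron(samples, labels, epochs: int = 20, lr: float = 0.1):
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in zip(samples, labels):
            prediction = step(weights[0] * x1 + weights[1] * x2 + bias)
            error = target - prediction          # zero when the guess is right
            weights[0] += lr * error * x1
            weights[1] += lr * error * x2
            bias += lr * error
    return weights, bias

if __name__ == "__main__":
    samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
    labels = [0, 0, 0, 1]                        # logical AND is linearly separable
    w, b = train_perceptron(samples, labels)
    for s in samples:
        print(s, step(w[0] * s[0] + w[1] * s[1] + b))
```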

The latest, but surely not the last, of such revelations regarding neural networks is a recent enlightening study on a possible correspondence between worldly phenomena and the neural network model.

Vitaly Vanchurin (professor of physics at the University of Minnesota Duluth) published a paper by the name The World as a Neural Network that has recently seen a resurgence of interest among peers.

The study attempts to bridge the gap between quantum and classical physics.

We use quantum physics to understand universal processes at the scale of atoms and subatomic particles, including photons and even cells, but on large scales we seek refuge in classical physics.

What is not in doubt is that general relativity works pretty well on large scales and quantum mechanics on small ones.

For years, physicists have tried unsuccessfully to reconcile quantum mechanics and general relativity. This is known as the problem of quantum gravity.

The study proposes that the entire universe, at its most fundamental level, behaves as a neural network.

What has Vanchurin said: "We are not just saying that the artificial neural networks can be useful for analyzing physical systems or for discovering physical laws, we are saying that this is how the world around us actually works."

"With this respect it could be considered as a proposal for the theory of everything, and as such it should be easy to prove it wrong. All that is needed is to find a physical phenomenon which cannot be described by neural networks."

"Unfortunately (or fortunately) it is easier said than done."

There are two available explanations for our universe, based on distinct interpretations of quantum mechanics. These are:

1. Many worlds: there are many worlds existing in parallel in the same space and time as our own. This makes it possible to remove randomness and action at a distance from quantum theory, and thus from all physics.

This implies that all possible outcomes of quantum measurements are physically realized in some world or universe. Stephen Hawking, Hugh Everett and Sean Carroll are among its proponents.

2. Hidden variables: this approach, championed by Albert Einstein and David Bohm, among others, basically states that the wave function is just a temporary fix and that physicists will eventually replace it.

Vanchurin's concept tries to reconcile the two distinct quantum interpretations, adding nothing new to either but exhibiting the approximate behaviours of both.

If such a theory finds more proponents and leads to more work in this area, it could eventually yield the absolute theory of everything, i.e. how the world around us actually works.

Since it is widely believed that the fundamental workings of the entire universe are governed by the rules of quantum mechanics, even gravity should somehow emerge from it.

This may resolve a few things, but it triggers a few more questions that will require further research:

If the Universe is a neural network, do we act as deep learning nodes connecting inputs and yielding output connections?

If so, what's the network's purpose?

Is it one giant closed network or just a single layer in a grander network?

Are there other universes connected to the same network?

Are we just being trained for some larger-than-universal machine's greater purpose?

These questions will rule our minds for some considerable time as this idea is definitely crazy, but is it crazy enough to be true?

That remains to be seen. We may never find it different enough!

Read this article:

Living in a simulation: Is Universe a Neural Network? - The Indian Wire

This Is the Fastest Random-Number Generator Ever Built – Scientific American

Researchers have built the fastest random-number generator ever made, using a simple laser. It exploits fluctuations in the intensity of light to generate randomness, a coveted resource in applications such as data encryption and scientific simulations, and could lead to devices that are small enough to fit on a single computer chip.

True randomness is surprisingly difficult to come by. Algorithms in conventional computers can produce sequences of numbers that seem random at first, but over time these tend to display patterns. This makes them at least partially predictable, and therefore vulnerable to being decoded.

To make encryption safer, researchers have turned to quantum mechanics, where the laws of physics guarantee that the results of certain measurements, such as when a radioactive atom decays, are genuinely random.

A popular way to tap into quantum randomness is to exploit fluctuations in how photons are emitted by the materials used in lasers. Typical laser devices are designed to minimize these fluctuations to produce light of steady intensity: they make the light waves bounce around inside the material to force its atoms to emit more and more photons in sync with each other.

But for random-number generation, researchers aim for the opposite. "We want the intensity to fluctuate randomly, so we can digitize the intensity to generate random numbers," says Hui Cao, an applied physicist at Yale University in New Haven, Connecticut.

Cao and her team made their laser material, a translucent semiconductor, in the shape of a bow tie. Photons bounce between the curved walls of the bow tie multiple times, before coming out as a scattered beam. The researchers can then capture the light with an ultrafast camera. They recorded the light output of 254 independent pixels, which together produced random bits at a rate of around 250 terabits per second, or 250 terahertz. That's several orders of magnitude faster than previous such devices, which recorded only one pixel at a time. Their results were reported in Science on 25 February.
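As a toy picture of the digitization step (our own sketch, not the Yale group's actual processing pipeline), the code below turns simulated intensity samples into raw bits by comparing each sample with the median, and works out the per-pixel rate implied by the article's figures of 254 pixels and roughly 250 terabits per second.

```python
import numpy as np

def intensities_to_bits(samples: np.ndarray) -> np.ndarray:
    """Toy digitization: one raw bit per sample, 1 if the intensity is above
    the median, 0 otherwise. Real devices add further randomness extraction."""
    return (samples > np.median(samples)).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng()
    # Stand-in for fluctuating laser intensity recorded at one camera pixel.
    samples = rng.normal(loc=1.0, scale=0.2, size=10_000)
    bits = intensities_to_bits(samples)
    print("first 32 bits:", "".join(map(str, bits[:32])))
    print("fraction of ones:", bits.mean())

    # Rate arithmetic implied by the article's numbers:
    pixels, total_rate_tbps = 254, 250
    print(f"~{total_rate_tbps / pixels:.2f} Tbit/s per pixel")
```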

The invention represents a major leap in performance of random-number generators, says Krister Shalm, a physicist at the US National Institute of Standards and Technology in Boulder, Colorado.

The fastest existing computers have clock speeds measured in gigahertz, which is much too slow to exploit the full power of Cao's device. The set-up could be made smaller by using simpler light detectors instead of a high-speed camera. This could eventually yield practical devices small enough to fit on a single computer chip, says Cao. These could have useful applications, such as encryption technology on mobile phones.

This article is reproduced with permission and was first published on March 2 2021.

Follow this link:

This Is the Fastest Random-Number Generator Ever Built - Scientific American

Physics – The Tiniest Superfluid Circuit in Nature – Physics

February 25, 2021 • Physics 14, 27

A new analysis of heavy-ion collision experiments uncovers evidence that two colliding nuclei behave like a Josephson junction, a device in which Cooper pairs tunnel through a barrier between two superfluids.

The Josephson effect is a remarkable example of a macroscopic quantum phenomenon, in which, without an applied voltage, current flows between two superconductors separated by a thin film of normal material. In this structure, called a Josephson junction, the current is due to the quantum tunneling of paired, superconducting electrons (so-called Cooper pairs) [1]. For decades, nuclear physicists have hypothesized that similar effects can occur on much smaller scales, since atomic nuclei could be regarded as superfluids consisting of paired nucleons. Recent experiments have supported this hypothesis, delivering hints that two colliding nuclei could be described as a Josephson junction in which entangled neutron pairs play the role of Cooper pairs (Fig. 1) [2, 3]. Now, Gregory Potel from Lawrence Livermore National Laboratory in California and colleagues have put these ideas on firmer ground [4]. Analyzing tin-nickel collisions from previous experiments, they found that experimental observables offer compelling signatures that two nuclei indeed form, for a split second, a Josephson junction.

The orderly motion of gigantic ensembles of correlated electron pairs makes superconductors behave as a single object, a macroscopic quantum state called a condensate. The condensate is characterized by its density and phase, and the latter plays the same role as the orientation of magnetic moments in a ferromagnet: an isolated ferromagnet can be rotated at no energy cost, but two ferromagnets with different orientations affect each other. Similarly, according to quantum mechanics, the phase doesn't have implications for a single condensate. But if two condensates are sufficiently close, a Cooper-pair current, whose magnitude depends on the phase difference, may flow from one condensate to the other. A striking feature of this effect is that electric current may flow without a driving voltage.

There may be other systems in Nature where this effect occurs, and atomic nuclei, which can be regarded as superfluid ensembles of nucleons, are good candidates. This idea appeared among nuclear physicists as early as the 1970s [5]. In the 1980s and 1990s, several experiments indicated an enhanced probability of neutron-pair transfer between colliding nuclei, a possible manifestation of the Josephson effect. But the evidence for this interpretation wasn't compelling. There were doubts, in particular, about whether ensembles of nucleons are sufficiently large to be treated as a pair condensate. Superconductivity is an emergent phenomenon: It appears when dealing with a huge number of particles but vanishes when the system is broken down into smaller constituents. But can we consider a nucleus made of about 100 nucleons a huge ensemble of particles? Can we expect that two nuclei in close proximity exhibit a Josephson effect?

The study by Potel and his colleagues provides strong arguments for affirmative answers to these questions. The researchers analyzed data from previous experiments in which tin-116 (116Sn) nuclei were collided with nickel-60 (60Ni) [2]. With energies between 140.60 and 167.95 MeV, these collisions are gentle: they allow the nuclei to overcome just enough of the Coulomb repulsion to get sufficiently close to exchange a few neutrons at most. Under such conditions, two reactions are possible: the transfer of one neutron and the transfer of two neutrons, producing 115Sn+61Ni and 114Sn+62Ni, respectively. The case of two-neutron transfer is particularly interesting, as it may carry signatures of the correlated pairing of neutrons in the nuclei.

The team devised a way to uncover the experimental evidence of Josephson flow. Their idea is that there can be a nuclear equivalent of the alternating current (ac) Josephson effect (Fig. 1). In this variant of the Josephson effect, a constant, or dc, voltage applied to a Josephson junction produces an ac current. This striking behavior arises because the voltage causes the phase difference between the two condensates to increase over time. Since phases that differ by multiples of 2π are equivalent, a linear phase growth produces an oscillating current. The researchers argue that for the nuclear case, a similar effect can occur because neutron pairs inside two colliding nuclei possess different energies. This energy difference plays the role of the dc voltage in the ac Josephson effect.

Therefore, similar oscillatory behavior is expected to occur during a nuclear collision: the back-and-forth tunneling of neutron pairs means that 116Sn+60Ni transforms into 114Sn+62Ni and then again into 116Sn+60Ni, a cyclical process whose frequency is determined by the energy difference of neutron pairs in initial and final nuclei. Because the collision lasts for only a short time, the team estimates that only about three such back-and-forth transfer cycles may occur in an experiment. However, even these few oscillations can lead to observable consequences. Since neutrons and protons interact strongly, oscillating neutron pairs cause protons to oscillate at the same frequency. Because of their charge, oscillating protons should emit electromagnetic radiation at this frequency. While electrons oscillating in a standard Josephson junction emit microwave photons [6], nuclei are expected to emit gamma-ray photons because of the much larger nuclear energy differences involved. The researchers calculate the expected radiation energy to be slightly less than 4 MeV, which matches the gamma-ray spectrum seen in previous experiments.
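The frequencies involved follow from the textbook ac Josephson relation, f = 2eV/h, and its nuclear analogue f = ΔE/h for a pair-transfer energy difference ΔE. The short sketch below is just this order-of-magnitude arithmetic using the roughly 4 MeV figure quoted above; it is not part of the authors' analysis.

```python
# Order-of-magnitude arithmetic for the ac Josephson effect and its
# proposed nuclear analogue (illustrative only).

E_CHARGE = 1.602176634e-19   # elementary charge, C
PLANCK_H = 6.62607015e-34    # Planck constant, J*s

def josephson_frequency_hz(voltage_volts: float) -> float:
    """Standard ac Josephson relation: f = 2eV/h."""
    return 2 * E_CHARGE * voltage_volts / PLANCK_H

def nuclear_analogue_frequency_hz(delta_e_mev: float) -> float:
    """Photon frequency for an energy difference delta_e (f = E/h)."""
    return delta_e_mev * 1e6 * E_CHARGE / PLANCK_H

if __name__ == "__main__":
    # A ten-microvolt bias across a superconducting junction gives gigahertz
    # (microwave) photons...
    print(f"10 uV -> {josephson_frequency_hz(10e-6):.3e} Hz")
    # ...while a ~4 MeV nuclear energy difference corresponds to gamma rays.
    print(f"4 MeV -> {nuclear_analogue_frequency_hz(4):.3e} Hz")
```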

The results are thrilling for two reasons. First, they indicate that the principles of superconductivity valid for macroscopic phenomena in solids may be applicable to the much smaller (femtometer) nuclear scales, a truly spectacular conclusion. Second, the analysis shows that the pairing description is appropriate for a small number of particles: the hundreds of nucleons making up the nuclei. It is worth pointing out, however, that this description contains a puzzling inconsistency. According to quantum mechanics, the phase and the number of particles in the condensate are related by the uncertainty principle, much like the position and momentum of a quantum particle: if either quantity is well defined, the other isn't. But for the nuclear case, the number of nucleons is always exactly defined. Further theoretical work will need to resolve this inconsistency.

These findings whet our appetite for more work aimed at validating superfluid nuclear models by confronting theory with experiments. In particular, it would be crucial to show that such models can deliver accurate, quantitative predictions for analogous effects in nuclear collisions beyond those involving tin and nickel.

Piotr Magierski is Professor of Physics and Head of the Nuclear Physics Division at Warsaw University of Technology, Poland, and an Affiliate Professor at the University of Washington. He is a theoretical physicist whose research interests include superfluidity and superconductivity in systems far from equilibrium, such as nuclear fission and fusion reactions, nuclear matter in neutron stars, and ultracold atomic gases.


More:

Physics - The Tiniest Superfluid Circuit in Nature - Physics

New research indicates the whole universe could be a giant neural network – The Next Web

The core idea is deceptively simple: every observable phenomenon in the entire universe can be modeled by a neural network. And that means, by extension, the universe itself may be a neural network.

Vitaly Vanchurin, a professor of physics at the University of Minnesota Duluth, published an incredible paper last August entitled The World as a Neural Network on the arXiv pre-print server. It managed to slide past our notice until today, when Futurism's Victor Tangermann published an interview with Vanchurin discussing the paper.

The big idea

According to the paper:

We discuss a possibility that the entire universe on its most fundamental level is a neural network. We identify two different types of dynamical degrees of freedom: trainable variables (e.g. bias vector or weight matrix) and hidden variables (e.g. state vector of neurons).

At its most basic, Vanchurin's work here attempts to explain away the gap between quantum and classical physics. We know that quantum physics does a great job of explaining what's going on in the universe at very small scales. When we're, for example, dealing with individual photons we can dabble with quantum mechanics at an observable, repeatable, measurable scale.

But when we start to pan out we're forced to use classical physics to describe what's happening, because we sort of lose the thread when we make the transition from observable quantum phenomena to classical observations.

The argument

The root problem with sussing out a theory of everything (in this case, one that defines the very nature of the universe itself) is that it usually ends up replacing one proxy-for-god with another. Where theorists have posited everything from a divine creator to the idea we're all living in a computer simulation, the two most enduring explanations for our universe are based on distinct interpretations of quantum mechanics. These are called the "many worlds" and "hidden variables" interpretations, and they're the ones Vanchurin attempts to reconcile with his "world as a neural network" theory.

To this end, Vanchurin concludes:

In this paper we discussed a possibility that the entire universe on its most fundamental level is a neural network. This is a very bold claim. We are not just saying that the artificial neural networks can be useful for analyzing physical systems or for discovering physical laws, we are saying that this is how the world around us actually works. With this respect it could be considered as a proposal for the theory of everything, and as such it should be easy to prove it wrong. All that is needed is to find a physical phenomenon which cannot be described by neural networks. Unfortunately (or fortunately) it is easier said than done.

Quick take: Vanchurin specifically says he's not adding anything to the many worlds interpretation, but that's where the most interesting philosophical implications lie (in this author's humble opinion).

If Vanchurin's work pans out in peer review, or at least leads to a greater scientific fixation on the idea of the universe as a fully-functioning neural network, then we'll have found a thread to pull on that could put us on the path to a successful theory of everything.

If we're all nodes in a neural network, what's the network's purpose? Is the universe one giant, closed network or is it a single layer in a grander network? Or perhaps we're just one of trillions of other universes connected to the same network. When we train our neural networks we run thousands or millions of cycles until the AI is properly trained. Are we just one of an innumerable number of training cycles for some larger-than-universal machine's greater purpose?

You can read the whole paper here on arXiv.

Published March 2, 2021 19:18 UTC

Original post:

New research indicates the whole universe could be a giant neural network - The Next Web