Category Archives: Quantum Physics

Scientists spotted an electron-capture supernova for the first time – Science News Magazine

A long-predicted type of cosmic explosion has finally burst onto the scene.

Researchers have found convincing evidence for an electron-capture supernova, a stellar explosion ignited when atomic nuclei sop up electrons within a star's core. The phenomenon was first predicted in 1980, but scientists have never been sure that they have seen one. A flare that appeared in the sky in 2018, called supernova 2018zd, matches several expected hallmarks of the blasts, scientists report June 28 in Nature Astronomy.

"These have been theorized for so long, and it's really nice that we've actually seen one now," says astrophysicist Carolyn Doherty of Konkoly Observatory in Budapest, who was not involved with the research.

Electron-capture supernovas result from stars that sit right on the precipice of exploding. Stars with more than about 10 times the sun's mass go supernova after nuclear fusion reactions within the core cease, and the star can no longer support itself against gravity. The core collapses inward and then rebounds, causing the star's outer layers to explode outward (SN: 2/8/17). Smaller stars, with less than about eight solar masses, are able to resist collapse, instead forming a dense object called a white dwarf (SN: 6/30/21). But between about eight and 10 solar masses, there's a poorly understood middle ground for stars. For some stars that fall in that range, scientists have long suspected that electron-capture supernovas should occur.


During this type of explosion, neon and magnesium nuclei within a star's core capture electrons. In this reaction, an electron vanishes as a proton converts to a neutron, and the nucleus morphs into another element. That electron capture spells bad news for the star in its war against gravity, because those electrons are helping the star fight collapse.
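Written out, the capture reactions described above take the following schematic form. The specific isotopes shown here (neon-20 and magnesium-24) are the ones usually cited for these degenerate cores in the literature; treat them as illustrative rather than as details drawn from the new study:

\[ e^- + {}^{20}\mathrm{Ne} \rightarrow {}^{20}\mathrm{F} + \nu_e, \qquad e^- + {}^{24}\mathrm{Mg} \rightarrow {}^{24}\mathrm{Na} + \nu_e \]

In each case a proton inside the nucleus absorbs the electron and becomes a neutron, emitting a neutrino; as far as the star's pressure budget is concerned, that electron is gone for good.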

According to quantum physics, when electrons are packed closely together, they start moving faster. Those zippy electrons exert a pressure that opposes the inward pull of gravity. But if reactions within a star chip away at the number of electrons, that support weakens. If the star's core gives way, boom, that sets off an electron-capture supernova.
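For a rough sense of why losing electrons matters, the pressure of a cold, non-relativistic degenerate electron gas scales with the electron number density as

\[ P_{\mathrm{deg}} \propto n_e^{5/3}, \]

so every electron removed by a capture reaction lowers n_e, and with it the pressure holding the core up. (This scaling is a standard textbook result, quoted here for context, not a figure from the paper.)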

But without an observation of such a blast, it remained theoretical. "The big question here was, does this kind of supernova even exist?" says astrophysicist Daichi Hiramatsu of the University of California, Santa Barbara and Las Cumbres Observatory in Goleta, Calif. Potential electron-capture supernovas have been reported before, but the evidence wasn't definitive.

So Hiramatsu and colleagues created a list of six criteria that an electron-capture supernova should meet. For example, the explosions should be less energetic, and should forge different varieties of chemical elements, than more typical supernovas. Supernova 2018zd checked all the boxes.

A stroke of luck helped the team clinch the case. Most of the time, when scientists spot a supernova, they have little information about the star that produced it; by the time they see the explosion, the star has already been blown to bits. But in this case, the star showed up in previous images taken by NASA's Hubble Space Telescope and Spitzer Space Telescope. Its properties matched those expected for the type of star that would produce an electron-capture supernova.

"All together, it really is very promising," says astrophysicist Pilar Gil-Pons of Universitat Politècnica de Catalunya in Barcelona. Reading the researchers' results, she says, "I got pretty excited, especially about the identification of the progenitor."

Finding more of these supernovas could help unveil their progenitors, misfit stars in that odd mass middle ground. It could also help scientists better nail down the divide between stars that will and won't explode. And the observations could reveal how often these unusual supernovas occur, an important bit of information for better understanding how supernovas seed the cosmos with chemical elements.

Read the original post:

Scientists spotted an electron-capture supernova for the first time - Science News Magazine

Wild new theory says the Big Bang wasn’t the beginning – The Next Web

The prevailing theory on the origin of our universe goes like this: about 13.7 billion years ago a single particle exploded. The resultant blast created an ever-expanding universe that, eventually, became home to the planet we call Earth.

The Big Bang theory first appeared in a scientific paper in 1931. Physicist Georges Lemaître is credited with its creation. And the bulk of our assumptions about the universe and its rate of expansion are based on his ideas.

In 2019 that rate of expansion, called the Hubble Constant, was put into question by various teams that determined either the rate of expansion had been incorrectly calculated or something was seriously wrong with the universe.

[Read: Either the Hubble Constant is wrong or the universe is ripping itself apart]

Scientists are still sorting things out and working towards an explanation that can reconcile both the Big Bang and our modern observations.

The reason we can't just punch some numbers into a supercomputer and determine the truth is that we don't have all the information.

Trying to determine how old the universe is by measuring its current rate of expansion is like trying to pick the winner of a NASCAR race based on a blurry, out-of-context photograph of one racer's left rear tire.

To that end, the Big Bang theory only really works if we assume it was the beginning of our universe. Doing so makes it the one piece of the entire puzzle that corresponds with what we're actually able to see and measure.

But what if the Big Bang wasn't the beginning?

Chanda Prescod-Weinstein, a physicist at the University of New Hampshire, has a differing theory.

Writing for New Scientist, they claim it makes more sense to assume the universe has been expanding forever.

Per the article:

"The universe may not have had a beginning moment, and we may live in what is called an eternally inflating universe. One that was expanding exponentially even before what we call the big bang. Mathematically, this seems the most likely scenario assuming inflation is correct."

The ramifications of such a theory may seem trivial (one explanation for a number is as good as another until we're able to measure more), but a lot of our assumptions concerning both classical and quantum physics are grounded in the idea that time is more than just a construct.

Whether we're discussing Newton's laws or breaking down the nature of relative observations in quantum physics, the idea is that there's a dimensional quality called time that's codified by distinct points representing the beginning and end of an event.

Without a finite moment at the creation of the universe where nothingness became something, there's no origin point for time, and there are no beginnings.

The concept of infinite expansion without a beginning may be difficult to wrap our heads around, but it kind of adds up. After all, it seems paradoxical to imagine a period in which the universe itself, and thus time, didn't exist at all, because you're forced to wonder how long time didn't exist for before it finally did.

But, if time's always existed, because the universe itself has always existed, then perhaps it's never existed. What is time without a beginning or end?


Continued here:

Wild new theory says the Big Bang wasn't the beginning - The Next Web

Quantum Theory: A Scientific Revolution that Changed Physics Forever – Interesting Engineering

To many, quantum physics, or quantum mechanics, may seem an obscure subject, with little application for everyday life, but its principles and laws form the basis for explanations of how matter and light work on the atomic and subatomic scale. If you want to understand how electrons move through a computer chip, how photons of light travel in a solar panel or amplify themselves in a laser, or even why the sun keeps burning, you will need to use quantum mechanics.

Quantum mechanics is the branch of physics relating to the elementary components of nature; it is the study of the interactions that take place between subatomic forces. Quantum mechanics was developed because many of the equations of classical mechanics, which describe interactions at larger sizes and speeds, cease to be useful or predictive when trying to explain the forces of nature that work on the atomic scale.

Quantum mechanics, and the math that underlies it, is not based on a single theory, but on a series of theories inspired by new experimental results, theoretical insights, and mathematical methods which were elucidated beginning in the first half of the 20th century, and which together create a theoretical system whose predictive power has made it one of the most successful scientific models ever created.

The story of quantum mechanics can be said to begin in 1859, a full 38 years before the discovery of the electron. Many physicists were concerned with a puzzling phenomenon: no matter what an object is made of, if it can survive being heated to a given temperature, the spectrum of light it emits is exactly the same as for any other substance.

In 1859, physicist Gustav Kirchhoff proposed a solution when he demonstrated that the energy emitted by a blackbody object depends on the temperature and the frequency of the emitted energy, i.e.

E = J(T, ν)

A blackbody is a perfect emitter - an idealized object that absorbs all the energy that falls on it (because it reflects no light, it would appear black to an observer). Kirchhoff challenged physicists to find the function J, which would allow the energy emitted by light to be described for all wavelengths.

In the years following, a number of physicists would work on this problem. One of these was Heinrich Rubens, who worked to measure the energy of black-body radiation. In 1900, Rubens visited fellow physicist Max Planck and explained his results to him. Within a few hours of Rubens leaving Planck's house, Planck had come up with an answer to Kirchhoff's function which fitted the experimental evidence.

Planck sought to use the equation to explain the distribution of colors emitted over the spectrum in the glow of red-hot and white-hot objects. However, when doing this, Planck realized the equation implied that only combinations of certain colors were emitted, and in integer multiples of a small constant (which became known as Planck's constant) times the frequency of the light.
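In modern notation, the quantization Planck stumbled on says that energy exchanged at frequency ν comes only in whole-number multiples of one quantum,

\[ E = n h \nu, \qquad n = 1, 2, 3, \ldots, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}, \]

and the resulting blackbody spectrum, usually written today as

\[ B_\nu(\nu, T) = \frac{2 h \nu^3}{c^2} \, \frac{1}{e^{h\nu / k_B T} - 1}, \]

is the answer to Kirchhoff's challenge. (These are the standard textbook forms, quoted here for context rather than taken from the article.)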

This was unexpected because, at the time, light was believed to act as a wave, which meant that the values of color emitted should be a continuous spectrum. However, Planck realized that his solution gave different values at different wavelengths.

In order to explain how atoms were being prevented from producing certain colors, Planck made a novel assumption - that atoms absorb and emit energy in the form of indistinguishable energy units - what came to be called quanta.

At the time, Planck regarded quantization as a mathematical trick to make his theory work. However, a few years later, physicists proved that classical electromagnetism could never account for the observed spectrum. These proofs helped to convince physicists that Planck's notion of quantized energy levels may in fact be more than a mathematical "trick".

One of the proofs was given by Einstein, who published a paper in 1905 in which he envisioned light traveling not as a wave, but as a packet of "energy quanta" which could be absorbed or generated when an atom "jumps" between quantized vibration rates. In this model, the quanta contained the energy difference of the jump; when divided by Planck's constant, that energy difference determined the wavelength of light given off by those quanta.

In 1913, Niels Bohr applied Planck's hypothesis of quantization to Ernest Rutherford's 1911 "planetary" model of the atom. This model, which came to be called the Rutherford-Bohr model, postulated that electrons orbited the nucleus in a similar way to how planets orbit the sun. Bohr proposed that electrons could only orbit at certain distances from the nucleus, and could "jump" between the orbits; doing so would give off energy at certain wavelengths of light, which could be observed as spectral lines.
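As a concrete illustration of how the Bohr picture connects quantized energy levels to spectral lines, here is a minimal sketch in Python. It uses only standard physical constants and the textbook hydrogen level formula E_n = -13.6 eV / n²; the function names are just for this example, not from any source discussed here.

```python
# Minimal sketch: hydrogen's Bohr energy levels and the wavelength emitted
# when an electron jumps between them (lambda = h*c / delta_E).
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
EV = 1.602176634e-19    # joules per electronvolt
E_RYDBERG_EV = 13.6057  # hydrogen ground-state binding energy, eV

def level_energy_ev(n: int) -> float:
    """Energy of the n-th Bohr level of hydrogen, in eV (negative = bound)."""
    return -E_RYDBERG_EV / n**2

def emitted_wavelength_nm(n_upper: int, n_lower: int) -> float:
    """Wavelength of the photon emitted in a jump from n_upper down to n_lower."""
    delta_e = (level_energy_ev(n_upper) - level_energy_ev(n_lower)) * EV  # joules
    return H * C / delta_e * 1e9  # metres -> nanometres

if __name__ == "__main__":
    # The 3 -> 2 jump gives the red H-alpha line, roughly 656 nm.
    print(f"H-alpha: {emitted_wavelength_nm(3, 2):.1f} nm")
```

Running it reproduces the familiar red hydrogen line near 656 nm, exactly the kind of spectral line Bohr's quantized orbits were invented to explain.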

It now appeared that light could act as a wave and as a particle. However, what about matter?

In 1924, French physicist Louis de Broglie used the equations of Einstein's theory of special relativity to show that particles can exhibit wave-like characteristics, and vice-versa.
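De Broglie's central relation, in its usual textbook form, assigns any particle of momentum p a wavelength

\[ \lambda = \frac{h}{p} = \frac{h}{mv}, \]

which is why wave behavior only becomes noticeable for very small objects such as electrons, whose momenta are tiny.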

German physicist Werner Heisenberg met with Niels Bohr at the University of Copenhagen in 1925, and after this meeting, he applied de Broglie's reasoning to understand the spectrum intensity of an electron. At the same time, Austrian physicist Erwin Schrödinger, working independently, also used de Broglie's reasoning to explain how electrons moved around in atoms. The following year, Schrödinger demonstrated that the two approaches were equivalent.

In 1927, Heisenberg reasoned that if matter can act as a wave, there must be a limit to how precisely we can know some properties, such as an electron's position and speed. In what would later be called "Heisenberg's uncertainty principle," he reasoned that the more precisely an electron's position is known, the less precisely its speed can be known, and vice versa. This proved an important piece of the quantum puzzle.
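In its standard quantitative form, the uncertainty principle bounds the product of the spreads in position and momentum,

\[ \Delta x \, \Delta p \geq \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}, \]

so shrinking the uncertainty in one quantity necessarily inflates the uncertainty in the other.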

In the Heisenberg-Schrödinger quantum mechanical model of the atom, each electron acts as a wave (or "cloud") around the nucleus of an atom, and only the speed or position of an electron can be measured, each to within a particular probability. This model replaced the Rutherford-Bohr model.

All these revelations regarding quantum theory revolutionized the world of physics and revealed important details about universal actions at atomic and subatomic levels.

Quantum mechanics, further combined with other phenomena in physics such as relativity, gravitation, and electromagnetism, has also increased our understanding of the physical world and how construction and destruction occur within it.

For their exceptional contributions, Planck, Einstein, Bohr, Heisenberg, and Schrödinger were awarded the Nobel Prize in Physics in 1918, 1921, 1922, 1932, and 1933 respectively.

While it may seem as though quantum mechanics progressed in a fairly straightforward series of theoretical leaps, in reality, there was a lot of disagreement among physicists over its relevance.

These disagreements reached a peak at the 1927 Solvay Conference in Brussels, where 29 of the world's most brilliant scientists gathered to discuss the many seemingly contradictory observations in quantum theory that could not be reconciled. One major point of contention had to do with the theory that, until they are observed, the location and speed of entities such as electrons can only exist as a "probability".

Bohr, in particular, emphasized that quantum predictions founded on probability are able to accurately describe physical actions in the real world. In what later came to be called the Copenhagen interpretation, he proposed that while wave equations described the probability of where entities like electrons could be found, these entities didn't actually exist as particles unless they were observed. In Bohr's words, they had no "independent reality" in the ordinary physical sense.

He described how the events that take place on atomic levels can alter the outcome of quantum interactions. According to Bohr, a system behaves as a wave or a particle depending on context, but you cannot predict what it will do.

Einstein, in contrast, argued that an electron was an electron even if no one was looking at it, and that particles like electrons had independent reality, prompting his famous claim that God does not play dice with the universe.

Einstein and Bohr would debate their views until Einstein's death three decades later, but remained colleagues and good friends.

Einstein argued that the Copenhagen interpretation was incomplete. He theorized that there might be hidden variables or processes underlying quantum phenomena.

In 1935, Einstein, along with fellow physicists Boris Podolsky and Nathan Rosen, published a paper on what would become known as the Einstein-Podolsky-Rosen (EPR) paradox. The EPR paradox described in the paper again raised doubts about quantum theory.

The EPR paper featured predetermined values of momentum and particle velocity and suggested that the description of physical reality provided by the wave function in quantum theory is incomplete, and that, therefore, physical reality cannot be derived from the wave function or in the context of quantum-mechanical theory.

The same year, Bohr replied to the claims made by Einstein. In his response, published in the Physical Review, Bohr proved that the predetermined values of the second particle's velocity and momentum, as per the EPR paradox, were incorrect. He also argued that the paradox failed to justify the inability of quantum mechanics to explain physical reality.

The understanding of elementary particles and their behavior helped to create groundbreaking innovations in healthcare, communication, electronics, and various other fields. Moreover, there are numerous modern technologies that operate on the principles mentioned in quantum physics.

Laser-based equipment

Laser technology involves equipment that emits light by means of a process called optical amplification. Laser equipment works on the principle of photon emission, releasing light with a well-defined wavelength in a very narrow beam. Hence, laser beams function in alignment with theories (such as the photoelectric effect) described in quantum mechanics.

A report published in 2009 reveals that when extreme ultraviolet lasers hit a metal surface, they can cause electrons to move out of the atom; this outcome is said to further extend Einstein's photoelectric effect in the context of super-intense lasers.

Electronic Devices and Machines

From flash memory storage devices like USB drives to complex lab equipment such as electron microscopes, an understanding of quantum mechanics led to countless modern-day inventions. Light-emitting diodes, electric switches, transistors, quantum computers, etc., are examples of some highly useful devices that resulted from the advent of quantum physics.

Let us understand this from the example of the Magnetic Resonance Imaging (MRI) machine, a piece of medical equipment that is very useful in diagnosing conditions of the brain and other body organs. MRI works on the principle of electromagnetism: it has a strong magnetic field that uses the spin of protons in hydrogen atoms to analyze the composition of different tissues.

MRI aligns all the protons in the body according to their spin. Due to the magnetic field, the protons absorb energy and then emit the same (quantum theory), and the MRI scanner uses the emitted energy signals received from all the water molecules to deliver a detailed image of the internal body parts.
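The frequency involved is the proton's Larmor precession frequency, which sets the radio frequency the scanner must transmit and listen for (a textbook relation quoted here for context, not a detail from the article):

\[ f = \frac{\gamma}{2\pi} B \approx 42.58\ \mathrm{MHz/T} \times B, \]

so in a typical 1.5 T clinical magnet the protons absorb and re-emit at roughly 64 MHz.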

X-Rays

Used in medical diagnosis, border inspection, industrial tomography, cancer treatment, and for many other purposes, X-rays are a form of electromagnetic radiation. While the discovery of X-rays predates quantum mechanics, quantum mechanical theory has allowed the use of X-rays in a practical way.

A beam of X-rays can be regarded as consisting of a stream of quanta. These quanta are projected out from the target of the X-ray tube, and, on penetrating tissue, there is an effect produced that is proportional to the number of the quanta multiplied by the energy carried by each quantum.

The emitted electrons also emit photons, which are able to penetrate matter and form its image on the X-ray screen. Therefore, the elementary particles described in quantum mechanics interact with X-ray energy to deliver an inside look at an object.

Fluorescence-based Applications

Fluorescence refers to the emission of light under UV exposure, which takes place when an electron is excited to a higher quantum state and then emits photons as it returns to a lower one; fluorescent lamps and spectrometers work on the basis of quantum theory. Various minerals such as aragonite, calcite, and fluorite are also known to exhibit fluorescence.

Fluorescence is also used to light synthetic gems and diamonds; jewelry manufacturers use this phenomenon to create artificial imitation stones that look brighter and more beautiful than the naturally occurring original stones.

Apart from these applications, quantum mechanics has contributed to our understanding of many areas of technology, biological systems, and cosmic forces and bodies. While several important questions remain in quantum physics, the core concepts, which define the behavior of energy, particles, and matter, have continued to hold.

Read the original:

Quantum Theory: A Scientific Revolution that Changed Physics Forever - Interesting Engineering

Klarman postdoc seeks ‘theory of everything’ by approximation | Cornell Chronicle – Cornell Chronicle

Two pillar theories in physics, general relativity and quantum mechanics, stand up well on their own, but are incompatible with each other.

"These two theories describe two different regimes of phenomena," said Francesco Sgarlata, a Klarman Postdoctoral Fellow in physics in the College of Arts and Sciences (A&S).

Quantum mechanics, he said, describes physical phenomena at atomic or sub-atomic scales; general relativity describes very large phenomena.

"The two theories are both correct in that they both predict very well, and we don't have any violation of these theories. However, the two theories are inconsistent with each other," Sgarlata said, adding that the inconsistencies show up in processes at extremely small scales.

A member of the first cohort of six Klarman Fellows, Sgarlata is using his three-year fellowship to join theoretical physicists at Cornell and around the world in trying to solve this inconsistency.

Physicists have long sought a theory of everything, or theory of quantum gravity, that would unify quantum mechanics and general relativity. In recent decades, researchers have tried a top-down approach, trying to come up with a unifying theory, such as string theory.

Sgarlata, in contrast, is taking a bottom-up approach to finding a theory of quantum gravity, which attempts to unify gravity with other fundamental forces of physics.

"We seek an approximation," he said. "We don't know what this theory of everything is. [Instead,] we are trying to write down some theory which can be seen as an approximation of quantum gravity, and we study what conditions this theory will have in order to be a good approximation of quantum gravity."

Sgarlata is working with Cornell's theoretical physics community, including his faculty host, Csaba Csaki, professor of physics (A&S), and Thomas Hartman, associate professor of physics (A&S), to identify some hidden properties of quantum gravity, one at a time, and then build from there.

"Francesco's research is on the fundamental properties of particles and forces," Hartman said. "His goal is to understand what particles are consistent with basic principles of relativity and quantum mechanics, and how these particles can interact."

Sgarlata's background is in particle physics, Hartman said, while his own background is in black hole physics and string theory.

"There is a lot of overlap, but these are two different perspectives," Hartman said, "so this is a great opportunity for us to collaborate on new ideas. We are working on joining forces and combining our approaches."

To find conditions necessary to support a theory of quantum gravity, Sgarlata and collaborators focus on first principles: those we experience in everyday life but which are difficult to prove mathematically. One example is causality, the link between cause and effect.

"If I punch you, you will start feeling pain after I punch you, not before," Sgarlata said. "We assume that this theory of everything respects causality."

Other first principles the researchers consider are unitarity (probabilities must add up to 1) and locality (particles only interact with neighboring particles).

"From a swampland of possible theories arise islands of probable theories," Sgarlata said, narrowing the scope. "We get some constraints on the parameters of the theory," he said.

Hartman said that Sgarlata uses methods from particle physics to develop and interpret theories of physics at high energies.

"In some cases, his methods can even be used to understand some corners of the more mysterious theory of quantum gravity at ultrashort distances," Hartman said. "Over the next couple of years, I think Francesco's research at Cornell will lead to better insight into fundamental particles and new connections between particles, gravity and black holes."

The Klarman Fellowship, Sgarlata said, offers independence to pursue research collaborations toward solving the biggest problems in physics.

"We have the tools to understand features of quantum gravity," he said. "Today we are reinterpreting these concepts in a more modern way, and we are discovering new concepts of physics just by our interpretations."

A version of this story appears on the Arts and Sciences website.

Kate Blackwood is a writer for the College of Arts and Sciences.

See the article here:

Klarman postdoc seeks 'theory of everything' by approximation | Cornell Chronicle - Cornell Chronicle

Rare Superconductor Discovered May Be Critical for the Future of Quantum Computing – SciTechDaily

Research led by Kent and the STFC Rutherford Appleton Laboratory has resulted in the discovery of a new rare topological superconductor, LaPt3P. This discovery may be of huge importance to the future operations of quantum computers.

Superconductors are vital materials able to conduct electricity without any resistance when cooled below a certain temperature, making them highly desirable in a society needing to reduce its energy consumption.

They manifest quantum properties on the scale of everyday objects, making them highly attractive candidates for building computers that use quantum physics to store data and perform computing operations, and can vastly outperform even the best supercomputers in certain tasks. As a result, there is an increasing demand from leading tech companies like Google, IBM and Microsoft to make quantum computers on an industrial scale using superconductors.

However, the elementary units of quantum computers (qubits) are extremely sensitive and lose their quantum properties due to electromagnetic fields, heat, and collisions with air molecules. Protection from these can be achieved by making more resilient qubits using a special class of superconductors called topological superconductors, which in addition to being superconductors also host protected metallic states on their boundaries or surfaces.

Topological superconductors, such as LaPt3P, newly discovered through muon spin relaxation experiments and extensive theoretical analysis, are exceptionally rare and are of tremendous value to the future industry of quantum computing.

To ensure its properties are sample- and instrument-independent, two different sets of samples were prepared at the University of Warwick and at ETH Zurich. Muon experiments were then performed at two different types of muon facilities: at the ISIS Pulsed Neutron and Muon Source at the STFC Rutherford Appleton Laboratory and at PSI, Switzerland.

Dr. Sudeep Kumar Ghosh, Leverhulme Early Career Fellow at Kent's School of Physical Sciences and principal investigator, said: "This discovery of the topological superconductor LaPt3P has tremendous potential in the field of quantum computing. Discovery of such a rare and desired component demonstrates the importance of muon research for the everyday world around us."

Reference: "Chiral singlet superconductivity in the weakly correlated metal LaPt3P" by P. K. Biswas, S. K. Ghosh, J. Z. Zhao, D. A. Mayoh, N. D. Zhigadlo, Xiaofeng Xu, C. Baines, A. D. Hillier, G. Balakrishnan and M. R. Lees, 4 May 2021, Nature Communications. DOI: 10.1038/s41467-021-22807-8

The paper is published in Nature Communications (University of Kent: Dr. Sudeep K. Ghosh; STFC Rutherford Appleton Laboratory: Dr. Pabitra K. Biswas, Dr. Adrian D. Hillier; University of Warwick: Dr. Geetha Balakrishnan, Dr. Martin R. Lees, Dr. Daniel A. Mayoh; Paul Scherrer Institute: Dr. Charles Baines; Zhejiang University of Technology: Dr. Xiaofeng Xu; ETH Zurich: Dr. Nikolai D. Zhigadlo; Southwest University of Science and Technology: Dr. Jianzhou Zhao).

Follow this link:

Rare Superconductor Discovered May Be Critical for the Future of Quantum Computing - SciTechDaily

Science Should Not Try to Absorb Religion and Other Ways of Knowing – Scientific American

An edgy biography of Stephen Hawking has me reminiscing about science's good old days. Or were they bad? I can't decide. I'm talking about the 1990s, when scientific hubris ran rampant. As journalist Charles Seife recalls in Hawking Hawking: The Selling of a Scientific Celebrity, Hawking and other physicists convinced us that they were on the verge of a theory of everything that would solve the riddle of existence. It would reveal why there is something rather than nothing, and why that something is the way it is.

In this column, I'll look at an equally ambitious and closely related claim, that science will absorb other ways of seeing the world, including the arts, humanities and religion. Nonscientific modes of knowledge won't necessarily vanish, but they will become consistent with science, our supreme source of truth. The most eloquent advocate of this perspective is biologist Edward Wilson, one of our greatest scientist-writers.

In his 1998 bestseller Consilience: The Unity of Knowledge, Wilson prophesies that science will soon yield such a compelling, complete theory of nature, including human nature, that the humanities, ranging from philosophy and history to moral reasoning, comparative religion, and interpretation of the arts, will draw closer to the sciences and partly fuse with them. Wilson calls this unification of knowledge consilience, an old-fashioned term for coming together or converging. Consilience will resolve our age-old identity crisis, helping us understand once and for all who we are and why we are here, as Wilson puts it.

Dismissing philosophers' warnings against deriving "ought" from "is," Wilson insists that we can deduce moral principles from science. Science can illuminate our moral impulses and emotions, such as our love for those who share our genes, as well as giving us moral guidance. This linkage of science to ethics is crucial, because Wilson wants us to share his desire to preserve nature in all its wild variety, a goal that he views as an ethical imperative.

At first glance you might wonder: Who could possibly object to this vision? Wouldn't we all love to agree on a comprehensive worldview, consistent with science, that tells us how to behave individually and collectively? And in fact, many scholars share Wilson's hope for a merger of science with alternative ways of engaging with reality. Some enthusiasts have formed the Consilience Project, dedicated to developing a body of social theory and analysis that explains and seeks solutions to the unique challenges we face today. Last year, poet-novelist Clint Margrave wrote an eloquent defense of consilience for Quillette, noting that he has often drawn inspiration from science.

Another consilience booster is psychologist and megapundit Steven Pinker, who praised Wilson's excellent book in 1998 and calls for consilience between science and the humanities in his 2018 bestseller Enlightenment Now. The major difference between Wilson and Pinker is stylistic. Whereas Wilson holds out an olive branch to postmodern humanities scholars who challenge science's objectivity and authority, Pinker scolds them. Pinker accuses postmodernists of defiant obscurantism, self-refuting relativism and suffocating political correctness.

The enduring appeal of consilience makes it worth revisiting. Consilience raises two big questions: (1) Is it feasible? (2) Is it desirable? Feasibility first. As Wilson points out, physics has been an especially potent unifier, establishing over the past few centuries that the heavens and earth are made of the same stuff ruled by the same forces. Now physicists seek a single theory that fuses general relativity, which describes gravity, with quantum field theory, which accounts for electromagnetism and the nuclear forces. This is Hawking's theory of everything and Steven Weinberg's "final theory."

Writing in 1998, Wilson clearly expected physicists to find a theory of everything soon, but today they seem farther than ever from that goal. Worse, they still cannot agree on what quantum mechanics means. As science writer Philip Ball points out in his 2018 book Beyond Weird: Why Everything You Thought You Knew about Quantum Physics Is Different, there are more interpretations of quantum mechanics now than ever.

The same is true of scientific attempts to bridge the explanatory chasm between matter and mind. In the 1990s, it still seemed possible that researchers would discover how physical processes in the brain and other systems generate consciousness. Since then, mind-body studies have undergone a paradigm explosion, with theorists espousing a bewildering variety of models, involving quantum mechanics, information theory and Bayesian mathematics. Some researchers suggest that consciousness pervades all matter, a view called panpsychism; others insist that the so-called hard problem of consciousness is a pseudoproblem because consciousness is an illusion.

There are schisms even within Wilson's own field of evolutionary biology. In Consilience and elsewhere, Wilson suggests that natural selection promotes traits at the level of tribes and other groups; in this way, evolution might have bequeathed us a propensity for religion, war and other social behaviors. Other prominent Darwinians, notably Richard Dawkins and Robert Trivers, reject group selection, arguing that natural selection operates only at the level of individual organisms and even individual genes.

If scientists cannot achieve consilience even within specific fields, what hope is there for consilience between, say, quantum chromodynamics and queer theory? (Actually, in her fascinating 2007 book Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning, physicist-philosopher Karen Barad finds resonances between physics and gender politics; but Barad's book represents the kind of postmodern analysis deplored by Wilson and Pinker.) If consilience entails convergence toward a consensus, science is moving away from consilience.

So, consilience doesn't look feasible, at least not at the moment. Next question: Is consilience desirable? Although I've always doubted whether it could happen, I once thought consilience should happen. If humanity can agree on a single, rational worldview, maybe we can do a better job solving our shared problems, like climate change, inequality, pandemics and militarism. We could also get rid of bad ideas, such as the notion that God likes some of us more than others; or that racial and sexual inequality and war are inevitable consequences of our biology.

I also saw theoretical diversity, or pluralism, as philosophers call it, as a symptom of failure; the abundance of solutions to the mind-body problem, like the abundance of treatments for cancer, means that none works very well. But increasingly, I see pluralism as a valuable, even necessary counterweight to our yearning for certitude. Pluralism is especially important when it comes to our ideas about who we are, can be and should be. If we settle on a single self-conception, we risk limiting our freedom to reinvent ourselves, to discover new ways to flourish.

Wilson acknowledges that consilience is a reductionistic enterprise, which will eliminate many ways of seeing the world. Consider how he treats mystical visions, in which we seem to glimpse truths normally hidden behind the surface of things. To my mind, these experiences rub our faces in the unutterable weirdness of existence, which transcends all our knowledge and forms of expression. As William James says in The Varieties of Religious Experience, mystical experiences should forbid a premature closing of our accounts with reality.

Wilson disagrees. He thinks mystical experiences are reducible to physiological processes. In Consilience, he focuses on Peruvian shaman-artist Pablo Amaringo, whose paintings depict fantastical, jungly visions induced by ayahuasca, a hallucinogenic tea (which I happen to have taken) brewed from two Amazonian plants. Wilson attributes the snakes that slither through Amaringo's paintings to natural selection, which instilled an adaptive fear of snakes in our ancestors; it should not be surprising that snakes populate many religious myths, such as the biblical story of Eden.

Moreover, ayahuasca contains psychotropic compounds, including the potent psychedelic dimethyltryptamine, like those that induce dreams, which stem from, in Wilson's words, "the editing of information in the memory banks of the brain" that occurs while we sleep. These nightly neural discharges are arbitrary in content, that is, meaningless; but the brain desperately tries to assemble them into coherent narratives, which we experience as dreams.

In this way, Wilson explains Amaringo's visions in terms of evolutionary biology, psychology and neurochemistry. This is a spectacular example of what Paul Feyerabend, my favorite philosopher and a fierce advocate for pluralism, calls the tyranny of truth. Wilson imposes his materialistic, secular worldview on the shaman, and he strips ayahuasca visions of any genuine spiritual significance. While he exalts biological diversity, Wilson shows little respect for the diversity of human beliefs.

Wilson is a gracious, courtly man in person as well as on the page. But his consilience project stems from excessive faith in science, or scientism. (Both Wilson and Pinker embrace the term scientism, and they no doubt think that the phrase "excessive faith in science" is oxymoronic.) Given the failure to achieve consilience within physics and biology, not to mention the replication crisis and other problems, scientists should stop indulging in fantasies about conquering all human culture and attaining something akin to omniscience. Scientists, in short, should be more humble.

Ironically, Wilson himself questioned the desirability of final knowledge early in his career. At the end of his 1975 masterpiece Sociobiology, Wilson anticipates the themes of Consilience, predicting that evolutionary theory plus genetics will soon absorb the social sciences and humanities. But Wilson doesn't exult at this prospect. When we can explain ourselves in mechanistic terms, he warns, the result might be hard to accept; we might find ourselves, as Camus put it, divested of illusions.

Wilson needn't have worried. Scientific omniscience looks less likely than ever, and humans are far too diverse, creative and contrary to settle for a single worldview of any kind. Inspired by mysticism and the arts, as well as by science, we will keep arguing about who we are and reinventing ourselves forever. Is consilience a bad idea, which we'd be better off without? I wouldn't go that far. Like utopia, another byproduct of our yearning for perfection, consilience, the dream of total knowledge, can serve as a useful goad to the imagination, as long as we see it as an unreachable ideal. Let's just hope we never think we've reached it.

This is an opinion and analysis article; the views expressed by the author or authors are not necessarily those of Scientific American.

Further Reading:

The Delusion of Scientific Omniscience

The End of Science (updated 2015 edition)

Mind-Body Problems: Science, Subjectivity and Who We Really Are

I just talked about consilience with science journalist Philip Ball on my podcast Mind-Body Problems.

I brood over the limits of knowledge in my new book Pay Attention: Sex, Death, and Science.

Read the original:

Science Should Not Try to Absorb Religion and Other Ways of Knowing - Scientific American

Lars Jaeger: Quantum Computers Have Reached the Mainstream – finews.asia

The discussion about quantum computers has reached the mainstream, including investors. This is one of numerous examples of how such technological development is happening much faster today than 50 years ago, Lars Jaeger writes on finews.first.

This article is published on finews.first, a forum for authors specialized in economic and financial topics.

A word that is becoming more and more popular, but still sounds like science fiction, is the term quantum computer. Only 10 to 15 years ago, the construction of such a computer as a future technology seemed impossible within any reasonable time frame.

Thus, the discussion about it was limited to a small team of experts or just material for science fiction. Just as transistor effect or von Neumann processors were not even remotely familiar terms to non-physicists in the 1940s, the same was true for the term quantum computer until recently.

The discussion about quantum computers has even reached the mainstream, including investors. And this could become one of the numerous examples of how such technological development is happening much faster today than 50 years ago.

The quantum world offers even more

However, most people are still completely unaware of what a quantum computer actually is, as in principle all computers today are still entirely based on classical physics, on the so-called von Neumann architecture from the 1940s.

In it, the individual computing steps are processed sequentially, bit by bit. The smallest possible unit of information (a so-called binary digit, or bit for short) thereby always takes a well-defined state of either 1 or 0. In contrast, quantum computers use the properties of quantum systems that are not reducible to classical bits but are based on quantum bits, or qubits for short.

These can assume the different states of bits, i.e., 0 and 1 and all values in between, simultaneously. So, they can be half 1 and half 0, as well as in any other possible combination of them. This possibility is beyond our classical (everyday) imagination, according to which a state is either one or the other, tertium non datur, but is very typical for quantum systems. Physicists call such mixed quantum states superpositions.
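A superposition can be written down very compactly. The sketch below (plain Python with numpy, not code from the article) represents a qubit as a two-component complex vector a|0> + b|1> with |a|^2 + |b|^2 = 1, and shows how a Hadamard gate turns the definite state |0> into an equal mix of 0 and 1:

```python
# Minimal qubit sketch: state vectors and an equal superposition.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # definite state |0>
ket1 = np.array([0, 1], dtype=complex)                        # definite state |1>
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposition = hadamard @ ket0                               # (|0> + |1>) / sqrt(2)
probabilities = np.abs(superposition) ** 2                    # chance of reading 0 or 1

print(superposition)   # [0.707...+0.j  0.707...+0.j]
print(probabilities)   # [0.5 0.5] -- half 0, half 1, until it is measured
```

The amplitudes, not the probabilities, are what a quantum computer manipulates; measurement then collapses the qubit to a plain 0 or 1 with the probabilities shown.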

Quantum computers are supposed to be the crowning achievement

But the quantum world offers even more: different quantum particles can be in so-called entangled states. This is another property that does not exist in our classical world. It is as if the qubits are coupled to each other with an invisible spring. They are then all in direct contact with each other, without any explicit acting force. Each quantum bit knows, so to say, over any distance what the others are doing. Such entanglement was the subject of a heated debate in early quantum physics. Albert Einstein, for example, considered entanglement to be physically impossible and derisively called it a spooky action-at-a-distance.

In the meantime, however, this controversial quantum property is already being exploited in many technical applications. Quantum computers are supposed to be the crowning achievement here. They could open completely new, fantastic possibilities in at least five fields:

Some physicists even believe that a quantum computer could be used to calculate and thus solve any problem in nature, from the behavior of black holes, the development of the very early universe, the collisions of high-energy elementary particles, to the phenomenon of superconductivity as well as the modeling of the 100 billion neurons and the thousand times larger number of their connections in our brain. Quantum computers could therefore represent a revolution in science as well as in the technology world.

Some even spoke of a Sputnik moment in information technology

Less than two years ago, Google announced that its engineers had succeeded in building a quantum computer that for the first time was able to solve a problem that no conventional computer could. The corresponding computer chip, Sycamore, needed just 200 seconds for a special computing task that would have taken the world's best supercomputer 10,000 years.

It had been Google itself that, some years earlier, had christened such an ability of a quantum computer, to be superior to any existing classical computer in accomplishing certain tasks, with "quantum supremacy". The moment of such quantum supremacy seemed to have finally come. Some even spoke of a Sputnik moment in information technology.

However, this was more a symbolic milestone, since the problem solved by Sycamore was still a very special and purely academic one. But there is no doubt that it represented a significant step forward (which, however, was also called into question in some cases: IBM even doubted the quantum nature of this computing machine).

Jiuzhang was also controversial as a quantum computer

Then, in December 2020, a team based mainly at the University of Science and Technology of China in Hefei communicated in the journal Science that a new quantum computer they had developed, which they had named Jiuzhang, was up to 10 billion times faster than Google's Sycamore.

That this news came from China was not quite as surprising as it might have been to those with little familiarity with today's Chinese science. Partly still seen as a developing country and thus technologically behind, China has meanwhile invested heavily in quantum computing and other quantum processes, as well as in artificial intelligence, genetic engineering, and a bunch of other cutting-edge technologies. Communist General Secretary Xi Jinping's government is spending $10 billion over several years on the country's National Laboratory for Quantum Information Sciences.

Jiuzhang was also controversial as a quantum computer. But if both Sycamore and Jiuzhang could indeed solve their (still very specific) problems incomparably fast with quantum technologies, and this can no longer be easily dismissed, there would already be two quantum computers that have achieved the desired quantum superiority.

Just these days, there was another (big-money) announcement

From here, we could then expect numerous further versions quite soon, which can solve more and more problems faster and faster. A few weeks ago, Google announced that they want to have built a powerful quantum computer that can be used on a very broad scale (no longer limited to exotic peripheral problems) by 2029. To this end, they want to bring together one million physical qubits that work together in an error-correcting quantum computer (in today's quantum computers this number still stands at less than 100 qubits).

In addition to Google and the Chinese research center in Hefei, there are countless other quantum computer development sites. And they are increasingly supported by governments. Germany, for example, announced in 2020 that the country will invest billions into quantum computing technology.

The new entity could become another global leader

And just these days, there was another (big-money) announcement: Cambridge Quantum Computing, a British company founded in 2014, announced that it will partner with the quantum solutions division of U.S. industrial giant Honeywell to build a new quantum computer. This deal brings together Honeywell's expertise in (quantum) hardware with that of Cambridge Quantum in software and algorithms.

The new entity could become another global leader (along with Google, IBM, and the Chinese) in developing quantum computers. Without the belief that initial breakthroughs in quantum computing have already been achieved, it is unlikely that so much money would be flowing into the industry already.

These sums are likely to multiply again as further progress is made. One might feel transported back to the early 1970s before commercial computers existed. Only this time, everything will probably happen even much faster.

Lars Jaeger is a Swiss-German author and investment manager. He writes on the history and philosophy of science and technology and has in the past been an author on hedge funds, quantitative investing, and risk management.


More here:

Lars Jaeger: Quantum Computers Have Reached the Mainstream - finews.asia

Quantum Computing Stumped Einstein 100 Years Ago. Today, It’s Ready to Change the World. – InvestorPlace

Back in October of 1927, the world's leading scientists descended upon Brussels for the fifth Solvay Conference, an exclusive, invite-only conference dedicated to discussing and solving the preeminent open problems in physics and chemistry.

In attendance were scientists that, today, we praise as the brightest minds in the history of mankind.

Albert Einstein was there. So was Erwin Schrödinger, who devised the famous Schrödinger's cat experiment, and Werner Heisenberg, the man behind the world-changing Heisenberg uncertainty principle, and Louis de Broglie. Max Born. Niels Bohr. Max Planck.

The list goes on and on. Of the 29 scientists who met in Brussels in October 1927, 17 of them went on to win a Nobel Prize.

These are the minds that collectively created the scientific foundation upon which the modern world is built.

And yet, when they all descended upon Brussels nearly 94 years ago, they got stumped by one concept, one concept that for nearly a century has remained the elusive key to unlocking the full potential of humankind.

And now, for the first time ever, that concept, which stumped even Einstein, is turning into a disruptive reality, via a breakthrough technology that will change the world as we know it.

So what exactly were Einstein, Schrodinger, Heisenberg, and the rest of those Nobel Laureates talking about in Brussels back in 1927?

Quantum mechanics.

Now, to be clear, quantum mechanics is a big, complex topic that would require 500 pages to fully understand, but here's my best job at making a CliffsNotes version in 500 words instead...

For centuries, scientists had developed, tested, and validated the laws of the physical world which became known as classical mechanics. These laws scientifically explained how things worked. Why they worked. Where they came from. So on and so forth.

But the discovery of the electron in 1897 by J.J. Thomson unveiled a new, subatomic world of super-small things that didn't obey the laws of classical mechanics. The biggest differences were two-fold.

First, in classical mechanics, objects are in one place, at one time. You are either at the store, or at home.

But, in quantum mechanics, subatomic particles can theoretically exist in multiple places at once before they are observed. A single subatomic particle can exist in point A and point B at the same time, until we observe it, at which point it only exists at either point A or point B.

So, the true location of a subatomic particle is some combination of all its possible locations.

This is called quantum superposition.

Second, in classical mechanics, objects can only work with things that are also real. You can't use your imaginary friend to help move the couch. You need your real friend to help you.

But, in quantum mechanics, all of those probabilistic states of subatomic particles are not independent. They're entangled. That is, if we know something about the probabilistic positioning of one subatomic particle, then we know something about the probabilistic positioning of another subatomic particle, meaning that these already super-complex particles can actually work together to create a super-complex ecosystem.

This is called quantum entanglement.

So, in short, subatomic particles can theoretically have multiple probabilistic states at once, and all those probabilistic states can work together, again all at once, to accomplish some task.
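To make those two ideas concrete, here is a toy numerical sketch (plain Python with numpy; an illustration of the standard textbook construction, not anything drawn from the article). A Hadamard gate puts one qubit into an equal superposition, then a CNOT gate entangles a second qubit with it, producing the Bell state (|00> + |11>)/sqrt(2), in which the two qubits' measurement outcomes always agree:

```python
# Two-qubit toy model: superposition plus entanglement = a Bell state.
import numpy as np

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0                                   # both qubits start in |0>

hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
identity = np.eye(2, dtype=complex)
cnot = np.array([[1, 0, 0, 0],                   # flips qubit 2 when qubit 1 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = cnot @ np.kron(hadamard, identity) @ ket00   # (|00> + |11>) / sqrt(2)
probs = np.abs(state) ** 2                           # outcomes 00, 01, 10, 11

print(np.round(probs, 3))   # [0.5 0.  0.  0.5] -- you only ever see 00 or 11
```

Neither qubit has a definite value on its own, yet reading one instantly tells you what the other will read; that correlation is the entanglement described above.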

And that, in a nutshell, is the scientific breakthrough that stumped Einstein back in the early 1900s.

It goes against everything classical mechanics had taught us about the world. It goes against common sense. But it's true. It's real. And, now, for the first time ever, we are learning how to harness this unique phenomenon to change everything about everything...

That is, the study of quantum theory has made huge advancements over the past century, especially so over the past decade, wherein scientists at leading technology companies have started to figure out how to harness the powers of quantum mechanics to make a new generation of super quantum computers that are infinitely faster and more powerful than even today's fastest supercomputers.

In short, today's computers are built on top of the laws of classical mechanics. That is, they store information on what are called bits, which can store data binarily as either 1 or 0.

But what if you could harness the power of quantum mechanics to turn those classical bits into quantum bits, or qubits, that can leverage superpositioning to be both 1 and 0 data stores at the same time?

Even further, what if you could take those quantum bits and leverage entanglement to get all of the multi-state bits to work together to solve computationally taxing problems?

You would theoretically create a machine with so much computational power that it would make even today's most advanced supercomputers look like they are from the Stone Age.
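
(To put a rough number on that intuition, here is a small back-of-the-envelope sketch in plain Python, added here as an illustration and not taken from the article: an n-qubit state is described by 2 to the power n complex amplitudes, so merely storing it on a classical machine becomes impossible surprisingly quickly.)

# Memory needed to store a full n-qubit state vector classically,
# assuming one 16-byte complex number per amplitude.
def classical_memory_gigabytes(n_qubits: int) -> float:
    return (2 ** n_qubits) * 16 / 1e9

for n in (10, 30, 50):
    print(f"{n} qubits -> {classical_memory_gigabytes(n):.3g} GB")

# 10 qubits fit in a few kilobytes; 30 qubits already need about 17 GB;
# 50 qubits need roughly 18 million GB, far beyond the working memory of
# today's largest supercomputers.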

That's exactly what is happening today.

Google has built a quantum computer that completed a benchmark calculation in 200 seconds, a calculation Google estimated would take the world's most advanced classical supercomputer, IBM's Summit, about 10,000 years. That comparison underlies the claim that Google's quantum computer is roughly 158 million times faster than the world's fastest supercomputer, at least on that one task.

That's not hyperbole. That's a real number.

Imagine the possibilities if we could broadly create a new set of quantum computers 158 million times faster than even today's fastest computers.

We'd finally have the level of AI that you see in movies. That's because the biggest limitation to AI today is the robustness of machine learning algorithms, which are constrained by supercomputing capacity. Expand that capacity, and you get vastly improved machine learning algorithms and far smarter AI.

We could eradicate disease. We already have tools like gene editing, but the effectiveness of gene editing relies on the robustness of the underlying computing capacity to identify, target, insert, cut, and repair genes. Add quantum computing capacity, and all of that could happen far faster and with far fewer errors, opening the door to treating far more conditions in far more people.

We could finally have that million-mile EV. We can only improve batteries if we can test them, and we can only test them in the real world so much. Therefore, the key to unlocking a million-mile battery is cell simulation, and the speed and effectiveness of cell simulation rest on the robustness of the underlying computing capacity. Make that capacity 158 million times bigger, and cell simulation could happen 158 million times faster.

The applications here are truly endless.

And that's why the Boston Consulting Group believes quantum computing will be the next trillion-dollar industry.

I couldn't agree more. Over the next two decades, quantum computing is going to change everything about everything.

And that's why I've recently made my first-ever foray into quantum computing stocks, adding the best quantum computing stock to buy today to my ultra-exclusive newsletter service, Exponential Growth Report.

You can learn more about my premium newsletter subscription services, and my top picks in the world's emerging megatrends, by becoming a free subscriber to Hypergrowth Investing. By signing up, you'll also get my latest research report, 11 Electric Vehicle Stocks for 2021, sent directly to your inbox.

In Hypergrowth, my team and I cover the preeminent megatrends of today, and the top stocks to buy within them, sending a free issue to your inbox each day at 7:30 a.m. Eastern. I'm talking about markets with hidden gems that could score investors 10X, 50X, or even 100X upside.

And you can learn all about my premium subscription services, where I compile the best of the best in the hypergrowth investing world. My latest pick in Exponential Growth Report is the top quantum computing company in the world, a name no one has heard of yet, which could one day be as big as Amazon Web Services or Google Cloud.

This is the next big thing.

To find out more about this potentially life-changing investment opportunity, start by following Hypergrowth Investing for free.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.

See the original post here:

Quantum Computing Stumped Einstein 100 Years Ago. Today, It's Ready to Change the World. - InvestorPlace

Exotic Superconductors: The Secret That Was Never There – SciTechDaily

Experiments in the lab at TU Wien. Credit: TU Wien

How reproducible are measurements in solid-state physics? New measurements show: An allegedly sensational effect does not exist at all.

A single measurement result is not proof; this has been shown again and again in science. We can only really rely on a research result when it has been measured several times, preferably by different research teams, in slightly different ways. In this way, errors can usually be detected sooner or later.

However, a new study by Prof. Andrej Pustogow from the Institute of Solid State Physics at TU Wien together with other international research teams shows that this can sometimes take quite a long time. The investigation of strontium ruthenate, a material that plays an important role in unconventional superconductivity, has now disproved an experiment that gained fame in the 1990s: it was believed that a novel form of superconductivity had been discovered. As it now turns out, however, the material behaves very similarly to other well-known high-temperature superconductors. Nevertheless, this is an important step forward for research.

Superconductivity is one of the great mysteries of solid-state physics: certain materials lose their electrical resistance completely at low temperatures. This effect is still not fully understood. What is certain, however, is that so-called Cooper pairs play a central role in superconductivity.

Pyramid-shaped crystal in a coil. Credit: TU Wien

In a normal metal, electric current consists of individual electrons that collide with each other and with the metal atoms. In a superconductor, the electrons move in pairs. This changes the situation dramatically, explains Andrej Pustogow. It's similar to the difference between a crowd in a busy shopping street and the seemingly effortless motion of a dancing couple on the dance floor. When electrons are bound in Cooper pairs, they do not lose energy through scattering and move through the material without any disturbance. The crucial question is: Which conditions lead to this formation of Cooper pairs?

From a quantum physics point of view, the important thing is the spin of these two electrons, says Andrej Pustogow. The spin is the magnetic moment of an electron and can point either up or down. In Cooper pairs, however, a coupling occurs: in a singlet state, the spin of one electron points upwards and that of the other electron points downwards. The magnetic moments cancel each other out and the total spin of the pair is always zero.

However, this rule, which almost all superconductors follow, seemed to be broken by the Cooper pairs in strontium ruthenate (Sr2RuO4). In 1998, results were published that indicated Cooper pairs in which the spins of both electrons point in the same direction (a so-called spin triplet). This would enable completely new applications, explains Andrej Pustogow. Such triplet Cooper pairs would no longer have a total spin of zero. This would allow them to be manipulated with magnetic fields and used to transport information without loss, which would be interesting for spintronics and possibly quantum computers.
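
(In standard textbook notation, which the article itself does not use and which is added here only as an illustration, the spin part of a conventional Cooper pair is the singlet combination with total spin zero, while the pairing claimed in 1998 would be one of the three triplet combinations with total spin one.)

\[
  |\text{singlet}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\downarrow\rangle - |\!\downarrow\uparrow\rangle\bigr), \qquad S = 0
\]
\[
  |\text{triplet}\rangle \in \Bigl\{\, |\!\uparrow\uparrow\rangle,\;\; \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\downarrow\rangle + |\!\downarrow\uparrow\rangle\bigr),\;\; |\!\downarrow\downarrow\rangle \,\Bigr\}, \qquad S = 1
\]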

This caused quite a stir, not least because strontium ruthenate was also considered a particularly important material for superconductivity research for other reasons: its crystal structure is identical to that of cuprates, which exhibit high-temperature superconductivity. While the latter are deliberately doped with impurities to make superconductivity possible, Sr2RuO4 is already superconducting in its pure form.

Actually, we studied this material for a completely different reason, says Andrej Pustogow. But in the process, we realized that these old measurements could not be correct. In 2019, the international team was able to show that the supposedly exotic spin effect was just a measurement artefact: the measured temperature did not match the actual temperature of the sample studied; in fact, the sample studied at the time was not superconducting at all. With this realization in mind, the superconductivity of the material was now re-examined with great precision. The new results clearly show that strontium ruthenate is not a triplet superconductor. Rather, the properties correspond to what is already known from cuprates.

However, Andrej Pustogow does not find this disappointing: It is a result that brings our understanding of high-temperature superconductivity in these materials another step forward. The finding that strontium ruthenate shows similar behavior to cuprates means two things: On the one hand, it shows that we are not dealing with an exotic, new phenomenon, and on the other hand it also means that we have a new material at our disposal, in which we can investigate already known phenomena. Ultra-pure strontium ruthenate is better suited for this than previously known materials. It offers a much cleaner test field than cuprates.

In addition, one also learns something about the reliability of old, generally accepted publications: Actually, one might think that results in solid-state physics can hardly be wrong, says Pustogow. While in medicine you might have to be satisfied with a few laboratory mice or a sample of a thousand test subjects, we examine billions of billions (about 10 to the power of 19) electrons in a single crystal. This increases the reliability of our results. But that does not mean that every result is completely correct. As everywhere in science, reproducing previous results is indispensable in our field and so is falsifying them.

References:

"Evidence for even parity unconventional superconductivity in Sr2RuO4" by Aaron Chronister, Andrej Pustogow, Naoki Kikugawa, Dmitry A. Sokolov, Fabian Jerzembeck, Clifford W. Hicks, Andrew P. Mackenzie, Eric D. Bauer, and Stuart E. Brown, 22 June 2021, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2025313118

"Constraints on the superconducting order parameter in Sr2RuO4 from oxygen-17 nuclear magnetic resonance" by A. Pustogow, Yongkang Luo, A. Chronister, Y.-S. Su, D. A. Sokolov, F. Jerzembeck, A. P. Mackenzie, C. W. Hicks, N. Kikugawa, S. Raghu, E. D. Bauer and S. E. Brown, 23 September 2019, Nature. DOI: 10.1038/s41586-019-1596-2

The rest is here:

Exotic Superconductors: The Secret That Was Never There - SciTechDaily

Monetary Policy Around the World Is Too Loose – Barron’s

This commentary was issued recently by money managers, research firms, and market newsletter writers and has been edited by Barron's.

June 24: Does monetary policy have things backward in this highly unusual cycle? Markets suffered only a fleeting blow from the Fed's slight step back from uber-dovishness last week, as the bigger picture is that almost all central banks still have easy policies cranked to 11. The Bank of Mexico's shock rate hike this week is an exception that proves the rule. The fact that 10-year Treasury yields are planted at 1.5%, even as core inflation spikes above 3%, and equities are again testing all-time highs speaks to the lack of fear of the Fed among market participants. And, while Chair Powell flashed a hint of concern about the persistence of inflation at this week's testimony, his main message is that we still have a long way to go in the recovery, particularly on the jobs front.

But the opening question is aimed at whether monetary policy is the proper vehicle to get us to the full-employment destination. As widely covered here and elsewhere, 9 million U.S. job openings do not suggest that there is a demand problem. It is becoming increasingly obvious that supply issues are the constraint on growth, whether it's hesitant workers, bottlenecks, shortages, or backlogs. Yet, policy is still set at maximum support for demand, with fiscal policy now poised to add yet another leg, via an infrastructure deal. Those central banks that are now gingerly stepping back (Norway, Mexico, and even Canada) are the few that seem to openly recognize this new reality.

Douglas Porter

The McClellan Market Report

McClellan Financial Publications

mcoscillator.com

June 24: Anyone can look at the VIX [CBOE volatility index] to get sentiment indications about the stock market. That's beginner stuff, although still pretty good. The real fun lies in going deeper into data that no one else looks at to find the fun insights.

This week [we'll look at] the total open interest in VIX futures. VIX futures first traded in 2004, but didn't really get going as a trading vehicle until around 2012. Normally, total open interest moves up and down with stock prices. It gets interesting when open interest moves too far in one direction or the other, or when the behavior changes.

In 2021, we are seeing a change in behavior. Total open interest has been falling since the peak in February, and is now down to the 200-day moving average, even though prices are continuing higher. This is the change in behavior that is so important to note. Since VIX futures first started trading in 2004, the important price tops for the S&P 500 have appeared when VIX open interest was well above its 200-day MA. I should clarify further that just being well above the 200-day MA isn't enough to put in a top. Prices can keep going up despite such a condition.

Rather, having VIX open interest below the 200-day MA is useful for ruling out the possibility that prices are now at a major top. In other words, a key topping condition is missing. So we have some assurance that there should still be a lot more [room] for prices to run higher. When we see hedge funds getting excited again about trading the VIX futures, and open interest numbers rising to well above the 200-day MA, then we can worry about a meaningful top for stock prices.
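
(As a rough illustration of the check McClellan describes, and not part of his commentary: the rule boils down to comparing total VIX futures open interest with its own 200-day moving average. The sketch below assumes Python with pandas and a hypothetical CSV of daily totals; the file name and column names are made up for the example.)

import pandas as pd

# Hypothetical daily data with columns: date, total_open_interest
df = pd.read_csv("vix_futures_open_interest.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# The indicator: total open interest versus its 200-day moving average.
df["ma_200"] = df["total_open_interest"].rolling(200).mean()
df["above_ma"] = df["total_open_interest"] > df["ma_200"]

latest = df.dropna().iloc[-1]
if latest["above_ma"]:
    print("Open interest above its 200-day MA: watch for topping behavior.")
else:
    print("Open interest below its 200-day MA: the topping condition is missing.")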

Tom McClellan

Economic Update

Regions Financial

regions.com

June 22: Total existing home sales fell to an annualized rate of 5.80 million units in May from April's sales rate of 5.85 million units, a bit better than the 5.73 million unit pace we and the consensus expected. While the May headline sales number may have been a bit better than expected, the real May sales number is much worse than the headline number implies.

As our regular readers know, when it comes to the data on residential construction and sales, we have no use for the seasonally adjusted annualized headline numbers and even less use for any attempts at analysis based on these numbers, with our sole focus on the not-seasonally-adjusted data. The unadjusted data show that there were 528,000 existing homes sold in May, far below our forecast of 561,000 sales. While this is up from 513,000 sales in April, the 2.9% increase is much smaller than the typical increase for the month of May.

As has been the case for years, not months, lean inventories were once again a drag on sales in May. Listings of existing homes rose to 1.23 million units in May, a touch higher than our forecast of 1.22 million units, but this nonetheless left listings down 20.7% year-on-year. The median existing-home sales price rose to $350,300, the highest on record and a year-on-year increase of 23.6%, though the median is being skewed higher because the mix of sales is increasingly weighted toward the higher price ranges, given the dearth of inventory at the lower end. While we do look for some relief on the supply front over the back half of 2021 to help blunt the pace of house price appreciation, affordability will remain an issue, particularly for prospective first-time buyers.

Richard F. Moody

Blog

William Blair

williamblair.com

June 21: There is at least one emerging technology with the potential to be highly disruptive: quantum computing. At some point, leading-edge semiconductors (the tiniest and best performing) will reach a physical limit: chips can't get much smaller.

Computers using quantum physics instead of traditional semiconductor architectures have performance capabilities and processing power far greater than those of classical computers.

While it probably won't become mainstream for at least another five years, quantum computing has the potential to transform everything from technology to healthcare.

Greg Scolaro

Market Commentary

Texas Capital Bank

texascapitalbank.com

June 21: Remember that June is one of the worst months of the year for stock performance. Somebody must be in last place. Friday's [June 18] triple-witching day put the exclamation mark on stock performance for the month. Quarter-end portfolio positioning along with options expiration jolted most stocks lower by a percent or two. Each of the three Dow indices is in the red for June, with the recovery-oriented Transports faring the worst, -7% at Friday's close. One quarter of the S&P 500 consists of growth-oriented technology stocks, and the sector helped the big index stay above water for the month.

The late June jolt may stick around. Most index charts in the very short term are in downtrends, but all remain in their consolidation areas that date back to mid-April. Year 2 of a Bull cycle should see bumps along the way. Stocks are less than 4% below all-time highs, and earnings forecasts are improving. Any summer correction should be a buying opportunity.

Steve Orr, Greg Kalb

To be considered for this section, material, with the author's name and address, should be sent to MarketWatch@barrons.com.

Continue reading here:

Monetary Policy Around the World Is Too Loose - Barron's