Category Archives: Quantum Physics

The neuroscience of advanced scientific concepts | npj Science of Learning – Nature.com

This study identified the content of the neural representations in the minds of physicists considering some of the classical and post-classical physics concepts that characterize their understanding of the universe. In this discussion, we focus on the representations of post-classical concepts, which are the most recent and most abstract and have not been previously studied psychologically. The neural representations of both the post-classical and classical concepts were underpinned by four underlying neurosemantic dimensions, such that these two types of concepts were located at opposite ends of the dimensions. The neural representations of classical concepts tended to be underpinned by underlying dimensions of measurability of magnitude, association with a mathematical formulation, having a concrete, non-speculative basis, and in some cases, periodicity. By contrast, the post-classical concepts were located at the other ends of these dimensions, stated initially here in terms of what they are not (e.g. they are not periodic and not concrete). Below we discuss what they are.

The main new finding is the underlying neural dimension of representation pertaining to the concepts' presence (in the case of the classical concepts) or absence (in the case of the post-classical concepts) of a concrete, non-speculative basis. The semantic characterization of this new dimension is supported by two sources of converging evidence. First, the brain imaging measurement of each concept's location on this underlying dimension (i.e., the concept's factor scores) converged with the behavioral ratings of the concepts' degree of association with this dimension (as we have interpreted it) by an independent group of physicists. (This type of convergence occurred for the other three dimensions as well.) Second, the two types of concepts have very distinguishable neural signatures: a classifier can very accurately distinguish the mean of the post-classical concepts' signatures from the mean of the classical concepts' signatures within each participant, with a grand mean accuracy of 0.93, p < 0.001.
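
To make the classification result concrete, here is a minimal sketch, in Python with scikit-learn, of the kind of within-participant classifier that could separate classical from post-classical signatures. The voxel data, concept counts, and classifier choice below are synthetic placeholders, not the study's actual data or method.

```python
# Minimal sketch of a within-participant classifier separating the neural
# signatures of classical vs. post-classical concepts. The data are synthetic
# stand-ins; the study's actual voxel data and cross-validation scheme are
# not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_concepts, n_voxels = 30, 200          # hypothetical counts
labels = np.array([0] * 15 + [1] * 15)  # 0 = classical, 1 = post-classical
# Simulated signatures: each class is drawn around a different mean pattern.
signatures = rng.normal(loc=labels[:, None] * 0.5, scale=1.0,
                        size=(n_concepts, n_voxels))

clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, signatures, labels, cv=LeaveOneOut()).mean()
print(f"leave-one-concept-out accuracy: {accuracy:.2f}")
```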

As physicists ventured into conceptually new territory in the 20th century and developed new post-classical concepts, their brains organized the new concepts with respect to a new dimension that had not played a role in the representation of classical concepts.

To describe what mental processes might characterize the post-classical end of this new dimension, it is useful to consider what attributes of the post-classical concepts could have led to their being neurally organized as they are, and what cognitive and neural processes might operate on these attributes. As noted previously, post-classical concepts are marked by their immeasurability and by their weaker association with a mathematical formulation and with periodicity, attributes that are often absent from them.

More informative than the absent attributes are four types of cognitive processes evoked by the post-classical concepts: (1) Reasoning about intangibles, taking into account their separation from direct experience and their lack of direct observability; (2) Assessing consilience with other, firmer knowledge; (3) Causal reasoning about relations that are not apparent or observable; and (4) Knowledge management of a large knowledge organization consisting of a multi-level structure of other concepts.

In addition to enabling the decoding of the content of the participants' thoughts (whether they were thinking of dark matter or tachyon, for example), the brain activation patterns are also informative about the concomitant psychological processes that operate on the concepts; in particular, the four processes listed above are postulated to co-occur specifically with the post-classical concepts. The occurrence of these processes was inferred from those locations of the voxel clusters associated with (having high loadings on) the classical/post-classical factor, specifically the factor locations where the activation levels increased for the post-classical concepts. (These voxel clusters are shown in Fig. 4, and their centroids are included in Table 2.) Inferring a psychological process based on previous studies that observed activation in that location is called reverse inference. This can be an uncertain inferential method because many different processes or tasks can evoke activation at the same location. What distinguishes the current study are several sources of independent converging evidence, in conjunction with the brain locations associated with a factor (and not simply observed activation), indicating a particular process.

The factor clusters are encircled and numbered for ease of reference in the text and their centroids are included in Table 2. These locations correspond to the four classes of processes evoked by the post-classical concepts.

First, a statistically reliable decoding model predicted the activation levels for each concept in the factor locations, based on independent ratings of the concepts with respect to the postulated dimension/factor. The activation levels of the voxels in the factor locations were systematically modulated by the stimulus set, with the post-classical concepts, a specific subset of the stimuli, eliciting the highest activation levels in these locations and hence the highest factor scores for this factor. Thus these brain locations were associated with an activation-modulating factor, not with a stimulus or a task. Second, the processes are consistent with the properties participants reported to have associated with the post-classical concepts. These properties provide converging evidence for these four types of processes occurring. For example, the concept of multiverse evoked properties related to assessing consilience, such as "a hypothetical way to explain away constants." Another example is that tachyons and quasars were attributed with properties related to reasoning about intangibles, such as "quasi-stellar objects." Third, the processes attributed to the factor locations were based not simply on an occasional previous finding, but on the large-scale meta-analysis (the Neurosynth database, Yarkoni et al.10) using the association-based test feature. The association between the location and the process was based on the cluster centroid locations; particularly relevant citations are included in the factor descriptions. Each of the four processes is described in more detail below.

The nature of many of the post-classical concepts entails the consideration of alternative possible worlds. The post-classical factor location in the right temporal area (shown in cluster 5 in Fig. 4) has been associated with hypothetical or speculative reasoning in previous studies. In a hypothetical reasoning task, the left supramarginal factor location (shown in cluster 8) was activated during the generation of novel labels for abstract objects11. Additionally, the right temporal factor location (shown in cluster 5) was activated during the assessment of confidence in probabilistic judgments12.

Another facet of post-classical concepts is that they require the unknown or non-observable to be brought into consilience with what is already known. The right middle frontal cluster (shown in cluster 2) has been shown to be part of a network for integrating evidence that disconfirms a belief13. This consilience process resembles the comprehension of an unfolding narrative, where a new segment of the narrative must be brought into coherence with the parts that preceded it. When readers of a narrative judge the coherence of a new segment of text, the dorsomedial prefrontal cortex location (shown in cluster 6) is activated14. This location is a post-classical factor location, as shown in Fig. 4. Thus understanding the coherence of an unfolding narrative text might involve some of the same psychological and neural consilience-seeking processes as thinking about concepts like multiverse.

Thinking about many of the post-classical concepts requires the generation of novel types of causal inferences to link two events. In particular, the inherent role of the temporal relations in specifying causality between events is especially complex with respect to post-classical concepts. The temporal ordering itself of events is frame-dependent in some situations, despite causality being absolutely preserved, leading to counter-intuitive (though not counter-factual) conclusions. For example, in relativity theory the concept of simultaneity entails two spatially separated events that may occur at the same time for a particular observer but which may not be simultaneous for a second observer, and even the temporal ordering of the events may not be fixed for the second observer. Because the temporal order of events is not absolute, causal reasoning in post-classical terms must eschew inferencing on this basis, but must instead rely on new rules (laws) that lead to consilience with observations that indeed can be directly perceived.

Another example, this one from quantum physics, concerns a particle such as an electron that may be conceived to pass through a small aperture at some speed. Its subsequent momentum becomes indeterminate in such a way that the arrival location of the particle at a distant detector can only be described in probabilistic terms, according to new rules (laws) that are very definite but not intuitive. The perfectly calculable non-local wave function of the particle-like object is said to collapse upon arrival in the standard Copenhagen interpretation of quantum physics. Increasingly elaborate probing of physical systems with one or several particles, interacting alone or in groups with their environment, has for decades elucidated and validated the non-intuitive new rules about limits and alternatives to classical causality in the quantum world. The fact that new rules regarding causal reasoning are needed in such situations was described by Richard Feynman as "the heart of quantum mechanics" and as containing "the only mystery"15.

Generating causal inferences to interconnect a sequence of events in a narrative text evokes activation in right temporal and right frontal locations (shown in clusters 3 and 4), which are post-classical factor locations16,17,18, as shown in Fig. 4. Causal reasoning accompanying perceptual events also activates a right middle frontal location (shown in cluster 3) and a right superior parietal location (shown in cluster 1)19. Notably, the right parietal activation is the homolog of a left parietal cluster associated with causal visualization1 found in undergraduates' physics conceptualizations, suggesting that post-classical concepts may recruit right hemisphere homologs of regions evoked by classical concepts. Additionally, a factor location in the left supramarginal gyrus (shown in cluster 8) is activated in causal assessment tasks such as determining whether the causality of a social event was person-based (being a hard worker) or situation-based (danger)20.

Although we have treated a post-classical concept such as multiverse as a single concept, it is far more complex than a concept like velocity. Multiverse entails the consideration of the uncertainty of its existence, the consilience of its probability of existence with measurements of matter in the universe, and the consideration of scientific evidence relevant to a multiverse. Thinking about large, multi-concept units of knowledge, such as the schema for executing a complex multi-step procedure, evokes activation in medial frontal regions (shown in cluster 6)21,22. Reading and comprehending the description of such procedures (read, think about, answer questions, listen to, etc.) requires the reader to cognitively organize diverse types of information in a common knowledge structure. Readers who were trained to self-explain expository biological texts activated an anterior prefrontal cortex region (shown in cluster 7 in Fig. 4) during the construction of text models and strategic processing of internal representations23.

This underlying cognitive function of knowledge management associated with the post-classical dimension may generate and utilize a structure to manage a complex array of varied information that is essential to the concept. This type of function has been referred to as a Managerial Knowledge Unit22. As applied to a post-classical concept such as a tachyon, this knowledge management function would contain links to information to evaluate the possibility of the existence of tachyons, hypothetical particles that would travel faster than light-speed in vacuum. The concept invokes a structured network of simpler concepts (mass, velocity, light, etc.) that compose it. This constitutes a knowledge unit larger than a single concept.

Although the discussion has so far focused on the most novel dimension (the classical vs. post-classical), all four dimensions together compose the neural representation of each concept, which indicates where on each dimension a given concept is located (assessed by the concept's factor scores). The bar graphs of Fig. 5 show how the concepts at the extremes of the dimensions can appear at either extreme on several dimensions. These four dimensions are:

the classical vs. post-classical dimension, as described above, which is characterized by contrasting the intangible but consilient nature of post-classical concepts versus the quantifiable, visualizable, otherwise observable nature of classical concepts.

the measurability of a magnitude associated with a concept, that is, the degree to which it has some well-defined extent in space, time, or material properties versus the absence of this property.

the periodicity or oscillation that describes how many systems behave over time, versus the absence of periodicity as an important element.

the degree to which a concept is associated with a mathematical formulation that formalizes the rules and principles of the behavior of matter and energy versus being less specified by such formalizations.

A concept may have a high factor score for more than one factor; for example, potential energy appears as measurable, mathematical, and on the classical end of the post-classical dimension. In contrast, multiverse appears as non-measurable, non-periodic, and post-classical.

The locations of the clusters of voxels with high loadings on each of the factors are shown in Fig. 6.

Colors differentiate the factors and greater color transparency indicates greater depth. Sample concepts from the two ends of the dimensions are listed. The classical/post-classical factor locations include those whose activations were high for the post-classical concepts (shown in Fig. 4) as well as those whose activations were high for the classical concepts.

Classical concepts with high factor scores on the measurability factor, such as frequency, wavelength, acceleration, and torque, are all concepts that are often measured, using devices such as oscilloscopes and torque wrenches, whereas post-classical concepts such as duality and dark matter have an uncertainty of boundedness and no defined magnitude, resulting in factor scores at the other end of the dimension. This factor is associated with parietal and precuneus clusters that are often found to be activated when people have to assess or compare magnitudes of various types of objects or numbers24,25,26, a superior frontal cluster that exhibits higher activation when people are comparing the magnitudes of fractions as opposed to decimals27, and an occipital-parietal cluster (dorsolateral extrastriate V3A) that activates when estimating the arrival time of a moving object28. Additional brain locations associated with this factor include left supramarginal and inferior parietal regions that are activated during the processing of numerical magnitudes26, and left intraparietal sulcus and superior parietal regions activated during the processing of spatial information29. This factor was not observed in a previous study that included only classical concepts, presumably because it would not have differentiated among those concepts1.

The mathematical formulation factor is salient for concepts that are clearly associated with a mathematical formalization. The three concepts that are most strongly associated with this factor, commutator, Lagrangian, and Hamiltonian, are mathematical functions or operators. Cluster locations that are associated with this factor include parietal regions that tend to activate in tasks involving mathematical representations30,31 and right frontal regions related to difficult mental calculations32,33. The parietal regions associated with the factor, which extend into the precuneus, activate in arithmetic tasks34. Although most if not all physics concepts entail some degree of mathematical formulation, post-classical concepts such as quasar, while measurable, are typically not associated with an algebraic formulation.

The periodicity factor is salient for many of the classical concepts, particularly those related to waves: wave function, light, radio waves, and gamma rays. This factor is associated with right hemisphere clusters and a left inferior frontal cluster, locations that resemble those of a similarly described factor in a neurosemantic analysis of physics concepts in college students1. This factor was also associated with a right hemisphere cluster in the inferior frontal gyrus and bilateral precuneus.

For all four underlying semantic dimensions, the brain activation-based orderings of the physics concepts with respect to their dimensions were correlated with the ratings of those concepts along those dimensions by independent physics faculty. This correlation makes it possible for a linear regression model to predict the activation pattern that will be evoked by future concepts in physicists' brains. When a new physics concept becomes commonplace (such as a new particle category, say, magnetic monopoles), it should be possible to predict the brain activation that will be the neural signature of the magnetic monopole concept, based on how that concept is rated along the four underlying dimensions.
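
As an illustration of the kind of linear regression model described here, the following Python sketch fits voxel activations to ratings on four dimensions and then predicts a signature for an unseen concept. All ratings, voxel counts, and weights are invented placeholders; the study's actual model and data are not reproduced.

```python
# Sketch of a linear model that predicts a concept's activation pattern from
# its ratings on four semantic dimensions. Everything numeric here is an
# illustrative placeholder.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_train_concepts, n_voxels = 28, 200
# Each concept rated on: post-classicality, measurability, periodicity,
# mathematical formulation (hypothetical 1-7 scale ratings).
ratings = rng.uniform(1, 7, size=(n_train_concepts, 4))
true_weights = rng.normal(size=(4, n_voxels))
activations = ratings @ true_weights + rng.normal(scale=0.1,
                                                  size=(n_train_concepts, n_voxels))

model = LinearRegression().fit(ratings, activations)

# Predict the signature of an unseen concept (e.g., "magnetic monopole")
# from hypothetical ratings on the same four dimensions.
new_ratings = np.array([[6.5, 3.0, 1.5, 5.0]])
predicted_signature = model.predict(new_ratings)
print(predicted_signature.shape)   # (1, n_voxels)
```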

The neurosemantic conceptual space defined by the four underlying dimensions includes regions that are currently sparsely populated by existing concepts, but these regions may well be the site of some yet-to-be theorized concepts. It is also possible that as future concepts are developed, additional dimensions of neural representation may emerge, expanding the conceptual space that underpins the concepts in the current study.

More here:

The neuroscience of advanced scientific concepts | npj Science of Learning - Nature.com

The 3 types of energy stored within every atom – Big Think

The humble atom is the fundamental building block of all normal matter.

Hydrogen, in which single electrons orbit individual protons, composes ~90% of all atoms.

Quantum mechanically, electrons only occupy specific energy levels.

Atomic and molecular transitions between those levels absorb and/or release energy.

Energetic transitions have many causes: photon absorption, molecular collisions, atomic bond breaking/forming, etc.

Chemical energy powers most human endeavors, through coal, oil, gas, wind, hydroelectric, and solar power.

The most energy-efficient chemical reactions convert merely ~0.000001% of their mass into energy.

However, atomic nuclei offer superior options.

The nucleus contains 99.95% of an atom's mass, and the bonds between its protons and neutrons involve significantly greater energies.

Nuclear fission, for example, converts ~0.09% of the fissionable mass into pure energy.

Fusing hydrogen into helium achieves even greater efficiencies.

For every four protons that fuse into helium-4, ~0.7% of the initial mass is converted into energy.

Nuclear power universally outstrips electron transitions for energy efficiency.

Still, the atom's greatest source of energy is its rest mass, extractable via Einstein's E = mc².

Matter-antimatter annihilation is 100% efficient, converting mass completely into energy.
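
As a rough sense of scale for the percentages above, the following Python sketch converts each quoted mass fraction into energy per kilogram of fuel using E = mc²; the fractions come from the article, while the absolute joule figures are simple illustrative arithmetic.

```python
# Back-of-the-envelope comparison of the efficiencies quoted above, converting
# the released mass fraction into energy per kilogram of starting material.
c = 299_792_458.0            # speed of light, m/s
fuel_mass = 1.0              # kilograms of starting material

fractions = {
    "chemical reaction": 1e-8,                # ~0.000001% of mass
    "nuclear fission": 9e-4,                  # ~0.09%
    "hydrogen fusion": 7e-3,                  # ~0.7%
    "matter-antimatter annihilation": 1.0,    # 100%
}

for process, fraction in fractions.items():
    energy_joules = fuel_mass * fraction * c**2   # E = m * c^2
    print(f"{process:>32}: {energy_joules:.2e} J per kg")
```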

Practically unlimited energy is locked within every atom; the key is to safely and reliably extract it.

Mostly Mute Monday tells an astronomical story in images, visuals, and no more than 200 words. Talk less; smile more.

The rest is here:

The 3 types of energy stored within every atom - Big Think

What’s behind rising violent crime? Progressive prosecutors’ non-enforcement of the law | TheHill – The Hill

Mutual combatants.

I was a prosecutor for nearly 20 years and taught criminal law. But I must admit, I'd missed that one and thus had to be, um, edified by the office of Kim Foxx, state's attorney for Cook County, Ill., who is responsible for enforcing the law on Chicago's murder-ravaged streets.

There was a gang firefight in Chicago recently. Well, okay, there's always a gang firefight in Chicago. This one seemed run-of-the-mill at first. There is internecine strife, I'm sure you'll be stunned to hear, in the Four Corner Hustlers street gang. As a result, rival factions shot it out. Not in the dark of night, but in mid-morning, as though a brisk daylight gunfight were as routine as a jog along the river (provided, of course, that you wear your flak jacket and matching N95 mask). When the dust settled, one young man was dead and two were wounded.

Windy City police rounded up five of the gangbangers and brought them to prosecutors, expecting that each would be charged with commensurately serious felonies, including first-degree murder. The cops, however, were stunned when Foxx's office released the suspects without any charges.

Why? Did the shooting not really happen? Oh, it happened all right. But there's apparently a new law non-enforcement standard in Chicago: no indictments, even in brutal crossfire between rival criminals, because those criminals are, yes, mutual combatants.

They're in gangs, get it? And if you're in a gang, this is what you sign up for. Next case.

You want to know why violent crime is surging in the nation's urban centers? Why, as the latest FBI statistics indicate, murder was up an astonishing 30 percent year-over-year in 2020, a record increase? Look no further than the Progressive Prosecutors Project (as I branded it in a March 2020 Commentary essay). This is the radical left's enterprise to reform the criminal justice system by pretending that we don't have criminals or, to get so very nuanced about it, by placing the blame for all crime on our systemically racist society.

No, it is not the sociopaths committing the violence, the thinking goes. See, when you really consider it, it's each of us, right?

Imagine if today's prosecutors didn't rationalize this way. Let's say they weren't hypnotized under the spell of disparate impact analysts, who insist that racism, not crime, explains America's prison population, which, not coincidentally, has plummeted as felonies have surged.

Well, then they'd have to wrestle with what, and who, is responsible for the bloodshed. Progressive prosecutors would have to come to grips with the stubborn fact they and their media cheerleaders strain to avoid: patterns of offending.

By the metric of percentage composition of the overall population, young Black males account for a disproportionate amount of the incarcerated population because, as a demographic class, they commit a disproportionate amount of the crime. They account for an overwhelming number of the gang arrests in cities such as Chicago because they are shooting at each other; that means they also account for more of the victims. And shooting each other is something they are certain to do more of, if progressive prosecutors keep coming up with creative ways to resist charging them.

These novelties include declining to invoke the anti-gang sentencing enhancement provisions. Though state legislatures enact these laws, prosecutors are effectively and imperiously repealing them because they disproportionately punish African Americans (as if defendants were being prosecuted for being African American rather than for committing murder and mayhem).

Now, evidently, the do-not-prosecute trick bag also includes the flat-out refusal to prosecute on a mutual combatant theory, the lunatic notion that killing and being killed is what gang members volunteer for, so who are we prosecutors to intrude?

Here's an interesting point: If arrests and prosecutions are explained by racism rather than criminal behavior, why rely on the statistics breaking down the races of prison inmates? Why not just say that police departments, even though many are run and heavily staffed by minority officers, are systematically racist and, therefore, must be blinded by bias in making arrests?

Because progressive prosecutors know that is not how things work.

In most cases, police do not witness crimes or theorize about who the suspects are. Crimes have victims, and victims file reports identifying the perps. That's what police act on. And that's how we know that, although progressive prosecutors are preoccupied by their perception of racism against the criminals (which may reflect their own ingrained bias), it is African American communities that disproportionately bear the brunt of violent crime. Prosecutors like Kim Foxx exacerbate this tragedy when they invent reasons not to address it.

If you want to know why crime is up, you need to understand why it went down, dramatically, after the high-crime generation from the 1970s into the 90s.

Prosecutors and police back then grasped that crime rates were a function of expectations about the rule of law. When prosecutors set the tone by acting against quality-of-life crimes, it signaled to more serious criminals that the community's laws would be enforced. When serious crimes were committed, police were not told the cases would be dismissed; they were encouraged to conduct interrogations and follow-up investigations that improved law enforcement's intelligence data bank. That intelligence was carefully and continually studied so that police could be deployed in the places where crime trends were emerging. Order does not need to be re-established if you take pains not to lose it in the first place.

This is not quantum physics. Progressive prosecutors' dereliction of their duty invites more crime. Professional criminals are recidivists, and if they are repeatedly returned to the streets, rather than prosecuted and imprisoned, they commit lots more crime. The only way to stop it is to stop it. That means enforcing the law, even, or especially, I should say, against the mutual combatants.

Former federal prosecutor Andrew C. McCarthy is a senior fellow at National Review Institute, a contributing editor at National Review, a Fox News contributor and the author of several books, including Willful Blindness: A Memoir of the Jihad. Follow him on Twitter @AndrewCMcCarthy.

Read more:

What's behind rising violent crime? Progressive prosecutors' non-enforcement of the law | TheHill - The Hill

The Week of October 11, 2021 – FYI: Science Policy News

NIH Director Francis Collins Announces Plans to Step Down

Francis Collins announced on Oct. 5 that he will resign as director of the National Institutes of Health by the end of the year and return to his laboratory at NIH's National Human Genome Research Institute (NHGRI). President Obama originally picked Collins as director in 2009, and, having been retained by Presidents Trump and Biden, he now ranks among the longest-serving science agency heads of the last half-century. During his tenure, Congress increased NIH's budget from $30 billion to $43 billion and the agency launched a number of major research initiatives, including the BRAIN Initiative to study how the brain functions, the All of Us Research Program to gather health data from a million-person cohort, and the Cancer Moonshot that Biden spearheaded when he was vice president. Previously, as NHGRI director from 1993 to 2008, Collins oversaw federally funded work on the Human Genome Project, which led President Bush to award him the Presidential Medal of Freedom in 2007.

Most recently, Collins led NIH's response to the COVID-19 pandemic and developed plans for a new Advanced Research Projects Agency for Health. In addition, he has overseen efforts to expand the diversity of the researchers NIH funds, crack down on sexual harassment at NIH-funded institutions, and uncover researchers who have not properly disclosed connections to foreign institutions, leading dozens to resign or be fired. According to the Washington Post, Collins considered resigning last year when Trump contravened researchers' views on COVID-19. In a statement last week, Collins said he is stepping aside now in the belief that no single person should serve in the position too long. Calling Collins "one of the most important scientists of our time," Biden remarked in his own statement, "I will miss the counsel, expertise, and good humor of a brilliant mind and dear friend."

At a House Science Committee hearing last week, National Science Foundation Inspector General Allison Lerner stated that suspected cases of undue foreign influence on NSF grantees now comprise 63% of her office's investigative portfolio. Lerner said the workload growth began in late 2017 after her office became aware of issues associated with grantees participating in talent recruitment programs sponsored by foreign entities. In her written testimony, Lerner indicated that as of this August NSF had recovered $7.9 million in grant funds after taking action against 23 grantees. According to reporting by the journal Science, all but one of the cases involved researchers with ties to China, and based on the office's overall caseload, it perhaps has around 80 active investigations related to foreign influence. NSF Chief of Research Security Strategy and Policy Rebecca Spyke Keiser stated this summer that another reason for the increase is that around 2016 the Chinese government began allowing participants in its talent recruitment programs to participate on a part-time rather than full-time basis, remarking, "That was when we also saw many of these programs start to not be disclosed and an uptick in number of people subscribing to the programs." Lerner testified that her office has become overwhelmed by allegations of grantee wrongdoing that it has received from NSF, from academic institutions, and from other law enforcement entities. She suggested that a doubling of her office's current investigative staff of 20 is warranted to handle the caseload.

On Oct. 7, the Central Intelligence Agency announced it has formed a Transnational and Technology Mission Center focused on new and emerging technologies, economic security, climate change, and global health. In addition, it has created a chief technology officer position and launched a Technology Fellows program that will bring outside experts into the CIA for one-to-two-year terms. The moves build on the CIA's creation of a federal laboratory last year and a digital innovation directorate in 2015. Alongside the new technology center, the agency has also created a China Mission Center that will unify its activities related to the country. In a statement, CIA Director William Burns said the center will "strengthen our collective work on the most important geopolitical threat we face in the 21st Century, an increasingly adversarial Chinese government."

The White House Office of Science and Technology Policy held a summit on Oct. 5 on quantum industry and society with representatives of more than 20 companies developing quantum technologies. Participants included Google, Microsoft, Amazon Web Services, IBM, Intel, Honeywell, Boeing, Northrop Grumman, Lockheed Martin, HRL, and Goldman Sachs, as well as a number of smaller companies focused specifically on quantum technology. In conjunction with the event, OSTP released an interagency report that highlights how foreign nationals comprise about half of U.S. university graduates in fields related to quantum information science and technology (QIST). The report states there is currently significant unmet demand for talent at all levels of the QIST workforce and notes long lead times are required to expand the domestic workforce in these fields. Accordingly, it calls for efforts to better attract and retain foreign-born talent while simultaneously expanding the domestic workforce. The summit is the latest in a series OSTP has held in recent years on QIST.

Last week, the interagency National Science and Technology Council released a blueprint for a national strategic computing reserve that would be tapped into during emergencies. The concept is motivated in large part by lessons learned from the COVID-19 High-Performance Computing Consortium, which was stood up to provide pandemic researchers with priority access to federal and non-federal supercomputing resources. Noting that the ad hoc creation of the consortium diverted resources from existing research projects and significantly increased the workloads of the personnel involved, the blueprint outlines structures needed to ensure computing resources and expertise can be rapidly mobilized while minimizing disruptions to the broader research ecosystem. It recommends establishing a program office that would develop partnerships with resource providers and coordinate the allocation of resources to users. The office would also determine specific activation criteria for future crises and coordinate annual training exercises to demonstrate readiness for a variety of potential disasters. The blueprint estimates the office would require an annual budget of $2 million and that the necessary cyberinfrastructure platform for allocating resources would require an additional $2 million per year. It also indicates federal agencies would have to expand their existing computing capacity, suggesting that 20% of additional resource capacity in the steady state is necessary to ensure adequate resources for future emergencies.

The National Institute of Standards and Technology released a report last week analyzing causes of an incident in February in which the agency's research reactor released radiation into the surrounding facility. Several workers received elevated radiation doses as a result of the incident, albeit within regulatory limits, and the reactor has been shut down since then, depriving researchers of a facility that supports almost half of U.S.-based neutron-scattering research. According to NIST's analysis, reactor operators did not fully secure one of the reactor's 30 fuel elements during a routine refueling operation and then failed to properly perform follow-up checks. NIST traces the errors to the departure of personnel who had long experience with refueling the reactor and to deficiencies in refueling procedures and the training of new operators that were exacerbated by the COVID-19 pandemic. A triennial National Academies review of the reactor facility in 2018 did not identify increased staff turnover as a risk for the reactor facility. The next such review is currently underway. NIST cannot restart the reactor until the Nuclear Regulatory Commission completes its own review of the incident and NIST's corrective actions.

SLAC National Accelerator Laboratory announced last week that the proposed upgrade to its Matter in Extreme Conditions instrument has received approval from the Department of Energy to begin preliminary design work, a milestone known as Critical Decision 1. The upgrade involves building a new underground cavern that will couple the X-ray laser beam from SLAC's Linac Coherent Light Source-II with a short-pulse petawatt laser as well as a second, lower-energy long-pulse laser, enabling study of new regimes of hot dense plasmas. DOE initiated the project in response to a 2017 National Academies report that spotlighted how the U.S. lags internationally in capabilities for laser research at the highest powers currently achievable. As of this spring, the estimated cost range for the upgrade was $234 million to $372 million, with a target of completing the project by 2028. SLAC is pursuing the project in partnership with Lawrence Livermore National Lab and the University of Rochester's Laboratory for Laser Energetics.

The White House announced federal agencies' release of 23 climate change adaptation plans last week that respond to a Jan. 27 executive order by President Biden. Preparedness actions identified in the plans primarily revolve around protecting federal facilities and activities, with some agencies also outlining research and services aimed at mitigating climate change impacts. For instance, the Department of Energy's plan states that over the next year all department sites will conduct climate vulnerability assessments, and it outlines departmental efforts to provide climate-adaptation tools and develop climate-resilient technologies. The Commerce Department's plan outlines activities such as the provision of climate information by the National Oceanic and Atmospheric Administration and the development of forward-looking building standards by the National Institute of Standards and Technology. The Department of the Interior's plan focuses on efforts across agencies to examine threats, such as those related to wildfire and water supplies, and to develop climate-sensitive resource management strategies.

Syukuro Manabe, Klaus Hasselmann, and Giorgio Parisi were jointly awarded the 2021 Nobel Prize in Physics by the Royal Swedish Academy of Sciences last week for their work on complex systems. Half of the award was given to Manabe and Hasselmann for "the physical modelling of Earth's climate, quantifying variability and reliably predicting global warming," marking the first time the physics prize has gone to advances in climate science. Working at the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory in the 1960s, Manabe developed a numerical model of energy in the atmosphere that enabled sound quantitative predictions of future warming. A decade later, at the Max Planck Institute for Meteorology in Germany, Hasselmann created a stochastic climate model that added fluctuations due to weather, paving the way for the attribution of weather events to climate change. The other half of the prize was awarded for Parisi's mathematical solution to the spin-glass problem, which bears on the complex organization of magnetic spins in certain materials but has also found applications in fields such as machine learning and artificial intelligence, neuroscience, and biology.

Read more:

The Week of October 11, 2021 - FYI: Science Policy News

Coldest Temperature Ever Recorded | What Is Absolute Zero? – Popular Mechanics

Researchers from four universities in Germany have created the coldest temperature ever recorded in a lab: 38 trillionths of a degree warmer than absolute zero, to be exact, according to their new work, recently published in the journal Physical Review Letters.

The bone-chilling temperature only persisted for a few seconds at the University of Bremen's Center for Applied Space Technology and Microgravity, but the breakthrough could have longstanding ramifications for our understanding of quantum mechanics.

That's because the closer we get to absolute zero (the lowest possible temperature that we could ever theoretically reach, as outlined by the laws of thermodynamics), the more peculiarly particles, and therefore substances, act. Liquid helium, for instance, becomes a "superfluid" at significantly low temperatures, meaning that it flows without any resistance from friction. Nitrogen freezes at -210 degrees Celsius. At cool enough temperatures, some particles even take on wave-like characteristics.

Absolute zero is equal to -273.15 degrees Celsius, or -459.67 degrees Fahrenheit, but most commonly it's measured as 0 kelvin. This is the point at which "the fundamental particles of nature have minimal vibrational motion," according to ScienceDaily. However, it's impossible for scientists to create absolute zero conditions in the lab.

In this case, the researchers were studying wave properties of atoms when they came up with a process that could lower a system's temperature by slowing particles to virtually a total standstill. For several seconds, the particles held completely still, and the temperature lowered to an astonishing 38 picokelvins, or 38 trillionths of a degree above absolute zero. This temperature is so low that it's not even detectable with a regular thermometer of any kind. Instead, the temperature is based on the lack of kinetic movement of the particles.
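
For a sense of just how small these numbers are, the following Python sketch converts absolute zero between temperature scales and estimates the mean thermal kinetic energy per particle at 38 picokelvin. It uses standard constants and is illustrative arithmetic only, not part of the experiment's analysis.

```python
# Scale check for the temperatures quoted above.
k_B = 1.380649e-23           # Boltzmann constant, J/K

absolute_zero_celsius = -273.15
absolute_zero_fahrenheit = absolute_zero_celsius * 9 / 5 + 32
print(f"{absolute_zero_fahrenheit:.2f} F")       # -459.67 F

T = 38e-12                   # 38 picokelvin, expressed in kelvin
mean_kinetic_energy = 1.5 * k_B * T              # (3/2) * k_B * T per particle
print(f"{mean_kinetic_energy:.2e} J")            # ~7.9e-34 J per particle
```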

The mechanism at play here is "a time-domain matter-wave lens system," according to the team's research paper. A matter wave is just what it sounds like: matter that is behaving like a wave. This is part of quantum physics, where everything we previously thought we knew gets a little wobbly upon close examination. In this case, scientists used a magnetic lens to shape a quantum gas, and used that to make a matter wave focus and behave in a particular way. A regular gas is made of a loose arrangement of discrete particles, but a quantum gas is no such predictable material. In this case, the quantum gas is a perplexing state of matter called a Bose-Einstein condensate.

The lens is "tuned" using careful excitation. Think of the lenses on a pair of glasses, where the bend is designed to focus closer or further away depending on the patient's eyes. For this experiment, the scientists tuned the focus to literally infinity. Within the subset of quantum physics known as optics, this means the quantum gas confines the passing particles until they pass one at a time and at an astonishingly slow speed.

"By combining an excitation of a Bose-Einstein condensate (BEC) with a magnetic lens, we form a time-domain matter-wave lens system," the researchers write. "The focus is tuned by the strength of the lensing potential. By placing the focus at infinity, we lower the total internal kinetic energy of a BEC to 38pK."

The researchers, from the University of Bremen, Leibniz University Hannover, the Humboldt University of Berlin, and the Johannes Gutenberg University Mainz, say they envision future researchers making the particles go even slower, with a top potential "weightlessness" period of up to 17 seconds.

See the article here:

Coldest Temperature Ever Recorded | What Is Absolute Zero? - Popular Mechanics

Scientists are using quantum computing to help them discover signs of life on other planets – ZDNet

Scientists will use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.

Quantum computers are assisting researchers in scouting the universe in search of life outside of our planet -- and although it's far from certain they'll find actual aliens, the outcomes of the experiment could be almost as exciting.

Zapata Computing, which provides quantum software services, has announced a new partnership with the UK's University of Hull, which will see scientists use quantum computing tools to eventually help them detect molecules in outer space that could be precursors to life.

During the eight-week program, quantum resources will be combined with classical computing tools to resolve complex calculations with better accuracy, with the end goal of finding out whether quantum computing could provide a useful boost to the work of astrophysicists, despite the technology's current limitations.

Detecting life in space is as tricky a task as it sounds. It all comes down to finding evidence of molecules that have the potential to create and sustain life -- and because scientists don't have the means to go out and observe the molecules for themselves, they have to rely on alternative methods.

Typically, astrophysicists pay attention to light, which can be analyzed through telescopes. This is because light -- for example, infrared radiation generated by nearby stars -- often interacts with molecules in outer space. And when it does, the particles vibrate, rotate, and absorb some of the light, leaving a specific signature on the spectral data that can be picked up by scientists back on Earth.

Therefore, for researchers, all that is left to do is detect those signatures and trace back to which molecules they correspond.

The problem? MIT researchers have previously established that over 14,000 molecules could indicate signs of life in exoplanets' atmospheres. In other words, there is still a long way to go before astrophysicists have drawn a database of all the different ways that those molecules might interact with light -- of all the signatures that they should be looking for when pointing their telescopes to other planets.
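
To picture what that signature-matching task looks like computationally, here is a deliberately toy Python sketch that compares a noisy "observed" spectrum against a tiny library of template signatures. The wavelength grid, Gaussian band shapes, and molecule labels are invented placeholders, not real spectral data.

```python
# Toy spectral-signature matching: rank a small library of templates by
# their similarity to a simulated observation. All data here are invented.
import numpy as np

rng = np.random.default_rng(42)
wavelengths = np.linspace(1.0, 5.0, 500)        # microns, hypothetical grid

def template(center, width):
    """A single Gaussian band standing in for a real molecular signature."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

library = {
    "H2O-like": template(2.7, 0.05),
    "CO2-like": template(4.3, 0.04),
    "CH4-like": template(3.3, 0.06),
}

# Fake "observation": a CO2-like band plus noise.
observed = library["CO2-like"] + rng.normal(scale=0.05, size=wavelengths.size)

# Cosine similarity between the observation and each template.
scores = {name: float(np.dot(observed, sig) /
                      (np.linalg.norm(observed) * np.linalg.norm(sig)))
          for name, sig in library.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```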

That's the challenge that the University of Hull has set for itself: the institution's Centre for Astrophysics is effectively hoping to generate a database of detectable biological signatures.

For over two decades, explains David Benoit, senior lecturer in molecular physics and astrochemistry at the University of Hull, researchers have been using classical means to try and predict those signatures. Still, the method is rapidly running out of steam.

The calculations carried out by the researchers at the center in Hull involve describing exactly how electrons interact with each other within a molecule of interest -- think hydrogen, oxygen, nitrogen and so on. "On classical computers, we can describe the interactions, but the problem is this is a factorial algorithm, meaning that the more electrons you have, the faster your problem is going to grow," Benoit tells ZDNet.

"We can do it with two hydrogen atoms, for example, but by the time you have something much bigger, like CO2, you're starting to lose your nerve a little bit because you're using a supercomputer, and even they don't have enough memory or computing power to do that exactly."

Simulating these interactions with classical means, therefore, ultimately comes at the cost of accuracy. But as Benoit says, you don't want to be the one claiming to have detected life on an exo-planet when it was actually something else.

Unlike classical computers, however, quantum systems are built on the principles of quantum mechanics -- those that govern the behavior of particles when they are taken at their smallest scale: the same principles as those that underlie the behavior of electrons and atoms in a molecule.

This prompted Benoit to approach Zapata with a "crazy idea": to use quantum computers to solve the quantum problem of life in space.

"The system is quantum, so instead of taking a classical computer that has to simulate all of the quantum things, you can take a quantum thing and measure it instead to try and extract the quantum data we want," explains Benoit.

Quantum computers, by nature, could therefore allow for accurate calculations of the patterns that define the behavior of complex quantum systems like molecules without calling for the huge compute power that a classical simulation would require.

The data that is extracted from the quantum calculation about the behavior of electrons can then be combined with classical methods to simulate the signature of molecules of interest in space when they come into contact with light.

It remains true that the quantum computers that are currently available to carry out this type of calculation are limited: most systems don't break the 100-qubit count, which is not enough to model very complex molecules.

Benoit explains that this has not put off the center's researchers. "We are going to take something small and extrapolate the quantum behavior from that small system to the real one," says Benoit. "We can already use the data we get from a few qubits, because we know the data is exact. Then, we can extrapolate."

That is not to say that the time has come to get rid of the center's supercomputers, continues Benoit. The program is only starting, and over the course of the next eight weeks, the researchers will be finding out whether it is possible at all to extract those exact physics on a small scale, thanks to a quantum computer, in order to assist large-scale calculations.

"It's trying to see how far we can push quantum computing," says Benoit, "and see if it really works, if it's really as good as we think it is."

If the project succeeds, it could constitute an early use case for quantum computers -- one that could demonstrate the usefulness of the technology despite its current technical limitations. That in itself is a pretty good achievement; the next milestone could be the discovery of our exo-planet neighbors.

View post:

Scientists are using quantum computing to help them discover signs of life on other planets - ZDNet

Quantum computing will break today’s encryption standards – here’s what to do about it – Verizon Communications

When you come to the fork in the road, take it. Yogi Berra

For cryptologists, Yogi Berra's words have perhaps never rung more true. As a future with quantum computing approaches, our internet and stored secrets are at risk. The tried-and-true encryption mechanisms that we use every day, like Transport Layer Security (TLS) and Virtual Private Networks (VPN), could be cracked and exposed by a hacker equipped with a large enough quantum computer running Shor's algorithm, a powerful algorithm offering an exponential speedup over classical algorithms. The result? The security algorithms we use today that would take roughly 10 billion years to break could be broken in as little as 10 seconds. To prevent this, it's imperative that we augment our security protocols, and we have two options to choose from: one using physics as its foundation, or one using math, our figurative fork in the road.

To understand how to solve the impending security threats in a quantum era, we need to first understand the fundamentals of our current encryption mechanism. The most commonly used, TLS, appears in nearly all internet activities and is implemented anytime someone performs an online activity involving sensitive information, like logging into a banking app, completing a sale on an online retailer's website, or simply checking email. It works by combining the data with a 32-byte key of random 1s and 0s in a complicated and specific way so that the data is completely unrecognizable to anyone except for the two end-to-end parties sending and receiving the data. This process is called public key encryption, and currently it leverages a few popular algorithms for key exchange, e.g., Elliptic-curve Diffie-Hellman (ECDH) or RSA (each named after cryptologists), each of which is vulnerable to quantum computers. The data exchange has two steps: the key exchange and the encryption itself. The encryption of the data with a secure key will still be safe, but the delivery of the key to unlock that information (key distribution) will not be secure in the future quantum era.
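
For readers who want to see what that key exchange looks like in code, here is a minimal sketch using the widely available Python `cryptography` package: an X25519 (elliptic-curve Diffie-Hellman) exchange followed by derivation of a 32-byte session key. It mirrors the kind of handshake TLS performs rather than reproducing any particular TLS implementation, and it is precisely the step that Shor's algorithm would threaten.

```python
# Classical elliptic-curve key exchange plus derivation of a 32-byte key,
# using the `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates a private key and shares only its public key.
client_private = X25519PrivateKey.generate()
server_private = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key ...
client_shared = client_private.exchange(server_private.public_key())
server_shared = server_private.exchange(client_private.public_key())
assert client_shared == server_shared

# ... and derives the same 32-byte session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"handshake demo").derive(client_shared)
print(len(session_key))   # 32
```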

To be ready for quantum computers, we need to devise a new method of key distribution, a way to safely deliver the key from one end of the connection to the other.

Imagine a scenario wherein you and a childhood friend want to share secrets, but can only do so once you each have the same secret passcode in front of you (and there are no phones). One friend has to come up with a unique passcode, write it down on a piece of paper (while maintaining a copy for themselves), and then walk it down the block so the other has the same passcode. Once you and your friend have the shared key, you can exchange secrets (encrypted data) that even a quantum computer cannot read.

While walking down the block though, your friend could be vulnerable to the school bully accosting him or her and stealing the passcode, and we can't let this happen. What if your friend lives across town, and not just down the block? Or, even more difficult, in a different country? (And where is that secret decoder ring we got from a box of sugar-coated-sugar cereal we ate as kids?)

In a world where global information transactions are happening nonstop, we need a safe way of delivering keys no matter the distance. Quantum physics can provide a way to securely deliver shared keys quicker and in larger volume, and, most importantly, immune to being intercepted. Using fiber optic cables (like the ones used by telecommunications companies), special Quantum Key Distribution (QKD) equipment can send tiny particles (or light waves) called photons to each party in the exchange of data. The sequence of the photons encapsulates the identity of the key, a random sequence of 1s and 0s that only the intended recipients can receive to construct the key.

Quantum Key Distribution also has a sort of built-in anti-hacker bonus. Because of the no-cloning theorem (which essentially states that, by their very nature, photons cannot be cloned), QKD also renders the identity of the key untouchable by any hacker. If an attacker tried to grab the photons and alter them, it would automatically be detected, and the affected key material would be discarded.
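
A toy simulation makes the idea easier to see. The Python sketch below is loosely modeled on the BB84 protocol: random bits are encoded in randomly chosen bases, the receiver measures in its own random bases, and only the positions where the bases match are kept. Real QKD hardware, and the statistics used to detect an eavesdropper, are far richer than this sketch.

```python
# Toy BB84-style sifting: keep only the bits where sender and receiver
# happened to use the same basis.
import secrets

n_photons = 64
sender_bits    = [secrets.randbelow(2) for _ in range(n_photons)]
sender_bases   = [secrets.randbelow(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
receiver_bases = [secrets.randbelow(2) for _ in range(n_photons)]

# Matching bases give the correct bit; mismatched bases give a random outcome
# and are discarded during basis reconciliation.
receiver_bits = [bit if sb == rb else secrets.randbelow(2)
                 for bit, sb, rb in zip(sender_bits, sender_bases, receiver_bases)]

sifted_key = [bit for bit, sb, rb in zip(sender_bits, sender_bases, receiver_bases)
              if sb == rb]
print(f"kept {len(sifted_key)} of {n_photons} bits after sifting")
```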

The other way we could choose to solve the security threats posed by quantum computers is to harness the power of algorithms. Although it's true the RSA and ECDH algorithms are vulnerable to Shor's algorithm on a suitable quantum computer, the National Institute of Standards and Technology (NIST) is working to develop replacement algorithms that will be safe from quantum computers as part of its post-quantum cryptography (PQC) efforts. Some are already in the process of being vetted, like ones called McEliece, Saber, CRYSTALS-Kyber, and NTRU.

Each of these algorithms has its own strong and weak points that the NIST is working through. For instance, McEliece is one of the most trusted by virtue of its longstanding resistance to attack, but it is also handicapped by its excessively long public keys that may make it impractical for small devices or web browsing. The other algorithms, especially Saber, run very well on practically any device, but, because they are relatively new, the confidence level in them from cryptographers is still relatively low.
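
To see how such a candidate would actually be exercised, the sketch below runs a key-encapsulation round trip. It assumes the open-source liboqs-python bindings (imported as `oqs`) and assumes the installed build exposes a Kyber parameter set under the name used here; both the package and the algorithm-name string are assumptions about the local environment, not part of the NIST process itself.

```python
# Post-quantum key encapsulation (KEM) round trip, assuming liboqs-python
# is installed and the named algorithm is enabled in the local liboqs build.
import oqs

algorithm = "Kyber512"   # assumed name; depends on the installed liboqs version

with oqs.KeyEncapsulation(algorithm) as receiver:
    public_key = receiver.generate_keypair()

    # The sender encapsulates a fresh shared secret against the public key.
    with oqs.KeyEncapsulation(algorithm) as sender:
        ciphertext, sender_secret = sender.encap_secret(public_key)

    # The receiver recovers the same secret from the ciphertext.
    receiver_secret = receiver.decap_secret(ciphertext)
    assert sender_secret == receiver_secret
```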

With such a dynamic landscape of ongoing efforts, there is promise that a viable solution will emerge in time to keep our data safe.

The jury is still out. We at Verizon and most of the world rely heavily on e-commerce to sell our products and on encryption to communicate via email, messaging, and cellular voice calls. All of these need secure encryption technologies in the coming quantum era. But whether we choose pre-shared keys (implemented by the awesome photon) or algorithms, further leveraging mathematics, our communications software will need updating. And while the post-quantum cryptography effort is relatively new, it is not clear which algorithms will withstand scrutiny from the cryptographic community. In the meantime, we continue to peer down each fork in the road to seek the best option to take.

Read more from the original source:

Quantum computing will break today's encryption standards - here's what to do about it - Verizon Communications

A novel way to heat and cool things – The Economist

Oct 7th 2021

REFRIGERATORS AND air-conditioners are old and clunky technology, and represent a field ripe for disruption. They consume a lot of electricity. And they generally rely on chemicals called hydrofluorocarbons which, if they leak into the atmosphere, have a potent greenhouse-warming effect. Buildings' central-heating systems, meanwhile, are often powered by methane in the form of natural gas, which releases carbon dioxide, another greenhouse gas, when it is burned, and also has a tendency to leak from the pipes that deliver it, which is unfortunate, because methane, too, is a greenhouse gas, and one much more potent than CO2.

One potential way of getting around all this might be to exploit what is known as the thermoelectric effect, a means of carrying heat from place to place as an electric current. Thermoelectric circuits can be used either to cool things down, or to heat them up. And a firm called Phononic, based in Durham, North Carolina, has developed a chip which does just that.

The relevant effect, known as the Peltier effect, was discovered in 1834 by Jean Charles Athanase Peltier, a French physicist. It happens in an electrical circuit that includes two materials of different conductivity. A flow of electrons from the more conductive material into the less conductive one causes cooling. A flow in the other direction causes heating.

The reason for this is that electrons are able to vibrate more freely when pushed into a conductive material. They thereby transfer energy to their surroundings, warming them. When shunted into a less conductive one, the electrons' vibrations are constrained, and they absorb energy from their surroundings, cooling those surroundings down. An array of thermoelectric circuits built with all the high-conductivity materials facing in one direction and all the low-conductivity ones in the other can thus move heat in either direction, by switching the polarity of the current. For reasons buried in the mathematics of quantum physics, the heat thus flowing does so in discrete packages, called phonons. Hence the name of the firm.
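
The practical behaviour of such a device is usually summarised by a simple energy balance: the Peltier term pumps heat in proportion to the drive current, while Joule heating and ordinary heat conduction work against it. The sketch below evaluates that standard lumped model for an illustrative set of module parameters; the numbers are assumptions chosen only to show the shape of the trade-off, not Phononic's specifications.

```python
# Standard lumped model of a Peltier (thermoelectric) cooler:
#   Q_cold = S*T_cold*I - 0.5*I**2*R - K*(T_hot - T_cold)
# S is the module's Seebeck coefficient, R its electrical resistance and
# K its thermal conductance. Values below are illustrative assumptions.
S = 0.05                       # V/K
R = 2.0                        # ohm
K = 0.5                        # W/K
T_hot, T_cold = 300.0, 285.0   # kelvin

def cooling_power(current):
    """Net heat pumped from the cold side at a given drive current (watts)."""
    return S * T_cold * current - 0.5 * current**2 * R - K * (T_hot - T_cold)

def cop(current):
    """Coefficient of performance: heat pumped per watt of electrical input."""
    electrical_power = S * (T_hot - T_cold) * current + current**2 * R
    return cooling_power(current) / electrical_power

# Too little current cannot overcome back-conduction; too much is eaten by
# Joule heating. Reversing the current direction turns the cooler into a heater.
for i in (0.5, 1.0, 2.0, 4.0, 7.0):
    print(f"I = {i:4.1f} A: Q_cold = {cooling_power(i):6.2f} W, COP = {cop(i):5.2f}")
```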

The thermoelectric effect works best when the conductors involved are actually semiconductors, with bismuth and tin being common choices. Fancy cameras contain simple cooling chips which use these, as do some scientific instruments. But Phononic's boss, Tony Atti, thinks that is small beer. Using the good offices of Fabrinet, a chipmaker in Thailand, he has started making more sophisticated versions at high volume, using the set of tools and techniques normally employed to etch information-processing circuits onto wafers made of silicon. In this case, though, the wafers are made of bismuth.

The results are, admittedly, still a long way from something that could heat or cool a building. But they are already finding lucrative employment in applications where space is at a premium. At the moment, the fastest-growing market is cooling the infrared lasers used to fire information-encoding photons through fibre-optic cables, for the long-distance transmission of data. They are also being used, though, in the 5G mobile-phone base stations now starting to blanket street corners, to keep the batteries of electric vehicles at optimal operating temperatures, and as components of the optical-frequency radar-like systems known as LIDAR, which help guide autonomous vehicles.

The crucial question from Mr Atti's point of view is whether semiconductor-based thermoelectronics can break out of these niches and become more mainstream, in the way that semiconductor-based electronics and lighting have done. In particular, he would like to incorporate heat-pumping chips into buildings, to provide them with integral thermoregulation.

In their current form, thermoelectric chips are unlikely to replace conventional air conditioning and central heating because they cannot move heat over the distances required to pump it in and out of a building in bulk. But they could nonetheless be used as regulators. Instead of turning a big air-conditioning system on or off, to lower or raise the temperature by the small amounts required to maintain comfort, with all the cost that entails, thermoelectric chips might tweak matters by moving heat around locally.

Phononic has already run trials of such local-temperature-tweaking chips in Singapore, in partnership with Temasek, that country's state-run investment fund. In 2019 SP Group, Singapore's utility company, installed eight of the firm's heat pumps, which comprise an array of chips pointed down at people, pumping heat out of the air above them, on the boardwalk on Clarke Quay in the city. Phononic claims the devices lowered the temperature in their vicinity by up to 10°C and, as a bonus, consequently reduced humidity by 15%. If that can be scaled up, it would certainly be a cool result.

This article appeared in the Science & technology section of the print edition under the headline "Cool thinking"

View original post here:

A novel way to heat and cool things - The Economist

New Fundamental Limit of Trapping and Exploiting Light at the Nanoscale – SciTechDaily

Metasurface of split-ring resonators, partially overlaid with 3D colourmaps showing the simulated electric-field distribution. High-momentum magnetoplasmons lead to the break-down of polaritons (blue spheres with photon energies in red). Credit: Urban Senica, ETH Zurich

Physicists from the University of Southampton and ETH Zürich have reached a new threshold of light-matter coupling at the nanoscale.

The international research, published recently in Nature Photonics, combined theoretical and experimental findings to establish a fundamental limitation of our ability to confine and exploit light.

The collaboration focused on photonic nano-antennas fabricated in ever reducing sizes on the top of a two-dimensional electron gas. The setup is commonly used in laboratories all over the world to explore the effect of intense electromagnetic coupling, taking advantage of the antennas' ability to trap and focus light close to electrons.

Professor Simone De Liberato, Director of the Quantum Theory and Technology group at the University of Southampton, says: "The fabrication of photonic resonators able to focus light in extremely small volumes is proving a key technology which is presently enabling advances in fields as different as material science, optoelectronics, chemistry, quantum technologies, and many others.

"In particular, the focussed light can be made to interact extremely strongly with matter, making electromagnetism non-perturbative. Light can then be used to modify the properties of the materials it interacts with, thus becoming a powerful tool for material science. Light can be effectively woven into novel materials."

The scientists discovered that light could no longer be confined in the system below a critical dimension, of the order of 250 nm in the sample under study, once the experiment started exciting propagating plasmons. These caused waves of electrons to move away from the resonator, spilling the photon's energy.

Experiments performed in the group of Professors Jérôme Faist and Giacomo Scalari at ETH Zürich had obtained results that could not be interpreted with state-of-the-art understanding of light-matter coupling. The physicists approached Southampton's School of Physics and Astronomy, where researchers led theoretical analysis and built a novel theory able to quantitatively reproduce the results.

Professor De Liberato believes the newfound limits could yet be exceeded by future experiments, unlocking dramatic technological advances that hinge on ultra-confined electromagnetic fields.

Read "Exploring the Quantitative Limits of Light-Matter Coupling at the Nanoscale" for more on this research.

Reference: "Polaritonic nonlocality in light-matter interaction" by Shima Rajabali, Erika Cortese, Mattias Beck, Simone De Liberato, Jérôme Faist and Giacomo Scalari, 9 August 2021, Nature Photonics. DOI: 10.1038/s41566-021-00854-3

The rest is here:

New Fundamental Limit of Trapping and Exploiting Light at the Nanoscale - SciTechDaily

The best way to win a Nobel is to get nominated by another laureate – The Economist

Oct 9th 2021

THE NOBEL prizes, whose winners are announced this month (see Science), may be the world's most coveted awards. As soon as a new crop of laureates is named, critics start comparing the victors' achievements with those of previous winners, reigniting debates over past snubs.

A full account of why, say, Stephen Hawking was passed over will have to wait until 2068: the Nobel Foundation's rules prevent disclosure about the selection process for 50 years. But once this statute of limitations ends, the foundation reveals who offered nominations, and whom they endorsed. Its data start in 1901 and end in 1953 for medicine; 1966 for physics, chemistry and literature; and 1967 for peace. (The economics prize was first awarded in 1969.)

Nomination lists do not explain omissions like Leo Tolstoy (who got 19 nominations) or Mahatma Gandhi (who got 12). But they do show that in 1901-66, Nobel voters handed out awards more in the style of a private members' club than of a survey of expert opinion. Whereas candidates with lots of nominations often fell short, those with the right backers, like Albert Einstein or other laureates, fared better.

The bar to a Nobel nomination is low. For the peace prize, public officials, jurists and the like submit names to a committee, chosen by Norway's parliament, that picks the winner. For the others, Swedish academies solicit names from thousands of people, mostly professors, and hold a vote for the laureate. On average, 55 nominations per year were filed for each prize in 1901-66.

Historically, voters paid little heed to consensus among nominators. In literature and medicine, the candidate with the most nominations won just 11% and 12% of the time; in peace and chemistry, the rates were 23% and 26%. Only in physics, at 42%, did nomination leaders have a big advantage. In 1956 Ramón Menéndez Pidal, a linguist and historian, got 60% of nominations for the literature prize, but still lost.

However, voters did make one group of nominators happy: current and future laureates. Candidates put forward by past victors went on to win at some point in the future 40% more often than did those whose nominators never won a Nobel. People whose nominators became laureates later on also won unusually often. This implies that being accomplished enough to merit future Nobel consideration was sufficient to gain extra influence over voters.
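
A rough way to reproduce that kind of comparison from the foundation's public nomination archive is sketched below. The file name and column names (nominee, nominator_is_laureate, nominee_won) are hypothetical placeholders for however one tabulates the archive, and pandas is assumed.

```python
# Sketch: do candidates nominated by laureates go on to win more often?
# File and column names are hypothetical placeholders for a hand-built
# table of the public nomination archive (one row per nomination).
import pandas as pd

nominations = pd.read_csv("nobel_nominations_1901_1966.csv")

# Collapse to one row per candidate: were they ever backed by a laureate,
# and did they ever win?
per_candidate = nominations.groupby("nominee").agg(
    backed_by_laureate=("nominator_is_laureate", "any"),
    won=("nominee_won", "max"),
)

win_rate = per_candidate.groupby("backed_by_laureate")["won"].mean()
boost = win_rate.loc[True] / win_rate.loc[False] - 1

print(win_rate)
print(f"Candidates backed by a laureate won {boost:.0%} more often.")
```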

In theory, this imbalance could simply reflect laureates nominating stronger candidates. However, at least one Nobel winner seems to have boosted his nominees' chances, rather than merely naming superstars who would have won anyway.

According to the Nobel Foundation's online archive, all 11 of Einstein's nominees won a prize. Some were already famous, like Max Planck; others, like Walther Bothe, were lesser-known. In two cases, his support seems to have been decisive.

In 1940 Einstein supported Otto Stern, a physicist who had already received 60 nominations. Stern won the next time the prize was given. Similarly, Wolfgang Pauli, whose exclusion principle is central to quantum mechanics, had received 20 nominations before Einstein backed him in 1945. He got his prize that same year.

Source: Nobel Foundation

This article appeared in the Graphic detail section of the print edition under the headline "Noblesse oblige"

See the article here:

The best way to win a Nobel is to get nominated by another laureate - The Economist