Category Archives: Quantum Physics
What Is The Law Of Attraction? Forbes Health – Forbes
The basic philosophy behind the law of attraction is that energy precedes manifestation, explains Whitman. As such, positive thoughts may bring positive results into a person's life, while negative thoughts bring the opposite.
Whatever we direct our powerful focus upon within the invisible realm of our thoughts, beliefs and emotions eventually manifests into outer form, claims Whitman. Thereby the law of attraction says the state of everything in our external world (our bodies, our relationships, the robustness of our careers and our finances) is a direct reflection of our internal state.
When we're focused on what's missing, what's unjust or on all the ways we've been wronged, we not only continue to find evidence of wrongdoing, but we hold ourselves in a victim mindset that deprives us of our power to think and feel on purpose, adds Whitman. Conversely, the law of attraction says if you don't like the quality of the experiences you're drawing to yourself, you can change them by adjusting your vibrational output, which means shifting your mood, attitude, words, thoughts or perspective.
As an example, imagine you are selling your house and you feel anxious about people going through your home, because someone might steal something: that is a negative output and like a boomerang will give an equal reaction of vibrational energy, claims Whitman. You might find that someone moved something, stole something or the energy in the house feels off. However, if you are tending to your energy and the fear or anxiety around selling your home, and trust that the perfect buyer will come, in the perfect timing, and that you are excited about your next move or the money that will flow in, then you are sending out positive vibes.
The energy, attitude, acts and emotions one puts out into the world are more likely than not to attract similar or resonant energy, attitudes, acts, vibrations and emotions, says James Michael Nolan, Ph.D., a licensed psychologist, law of attraction coach and former president of Southwestern College in Santa Fe, New Mexico. This is not unrelated to the common notion of karma, you reap what you sow, or what goes around comes around.
Principles behind the law of attraction can be traced back to ancient philosophers, says Nolan, including Hermes Trismegistus and the Hermetic Teachings, which were written as far back as the first century. So, this is not by any means new or new age-y thinking, says Nolan. It has been around a very long time, across all major religions and many philosophical schools.
There are various ways people look at the law of attraction, explains Nolan. On the one hand, there are people who subscribe to the more scientific-sounding explanation of vibrational energy. The other, more metaphysical worldview of this concept isn't based on a logical, linear explanation at all, and instead relies on intuition, faith and trust, according to Nolan.
See the original post:
What Is The Law Of Attraction? Forbes Health – Forbes
First Experimental Proof That Quantum Entanglement Is Real – SciTechDaily
Scientists, including Albert Einstein and Erwin Schrödinger, first discovered the phenomenon of entanglement in the 1930s. In 1972, John Clauser and Stuart Freedman were the first to prove experimentally that two widely separated particles can be entangled.
A Q&A with Caltech alumnus John Clauser on his first experimental proof of quantum entanglement.
When scientists, including Albert Einstein and Erwin Schrödinger, first discovered the phenomenon of entanglement in the 1930s, they were perplexed. Disturbingly, entanglement required two separated particles to remain connected without being in direct contact. In fact, Einstein famously called entanglement "spooky action at a distance," because the particles seemed to be communicating faster than the speed of light.
Born on December 1, 1942, John Francis Clauser is an American theoretical and experimental physicist known for contributions to the foundations of quantum mechanics, in particular the Clauser–Horne–Shimony–Holt inequality. Clauser was awarded the 2022 Nobel Prize in Physics, jointly with Alain Aspect and Anton Zeilinger, "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science."
To explain the bizarre implications of entanglement, Einstein, along with Boris Podolsky and Nathan Rosen (EPR), argued that hidden variables should be added to quantum mechanics. These could be used to explain entanglement, and to restore locality and causality to the behavior of the particles. Locality states that objects are only influenced by their immediate surroundings. Causality states that an effect cannot occur before its cause, and that causal signaling cannot propagate faster than light speed. Niels Bohr famously disputed EPR's argument, while Schrödinger and Wendell Furry, in response to EPR, independently hypothesized that entanglement vanishes with wide-particle separation.
Unfortunately, at the time, no experimental evidence for or against quantum entanglement of widely separated particles was available. Experiments have since proven that entanglement is very real and fundamental to nature. Furthermore, quantum mechanics has now been proven to work, not only at very short distances but also at very great distances. Indeed, China's quantum-encrypted communications satellite, Micius (part of the Quantum Experiments at Space Scale (QUESS) research project), relies on quantum entanglement between photons that are separated by thousands of kilometers.
John Clauser standing with his second quantum entanglement experiment at UC Berkeley in 1976. Credit: University of California Graphic Arts / Lawrence Berkeley Laboratory
The very first of these experiments was proposed and executed by Caltech alumnus John Clauser (BS '64) in 1969 and 1972, respectively. His findings are based on Bell's theorem, devised by CERN theorist John Bell. In 1964, Bell ironically proved that EPR's argument actually led to the opposite conclusion from what EPR had originally intended to show. Bell demonstrated that quantum entanglement is, in fact, incompatible with EPR's notion of locality and causality.
In 1969, while still a graduate student at Columbia University, Clauser, along with Michael Horne, Abner Shimony, and Richard Holt, transformed Bell's 1964 mathematical theorem into a very specific experimental prediction via what is now called the Clauser–Horne–Shimony–Holt (CHSH) inequality. (Their paper has been cited more than 8,500 times on Google Scholar.) In 1972, when he was a postdoctoral researcher at the University of California, Berkeley and Lawrence Berkeley National Laboratory, Clauser and graduate student Stuart Freedman were the first to prove experimentally that two widely separated particles (about 10 feet apart) can be entangled.
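To make the CHSH test concrete, here is a minimal sketch (my own illustration in Python; the angle choices are the textbook optimum, not values from this article). Quantum mechanics predicts the correlation E(a, b) = -cos(a - b) for a spin-singlet pair measured at analyzer angles a and b; local realism bounds the CHSH combination by |S| <= 2, while quantum mechanics reaches 2*sqrt(2):

import numpy as np

# Quantum prediction for the singlet-state correlation at analyzer angles a, b.
def E(a, b):
    return -np.cos(a - b)

# CHSH combination: S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Local realism requires |S| <= 2; these angles maximize the quantum value.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))  # 2.828... = 2*sqrt(2) > 2, the kind of quantum violation the experiments test for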
Clauser went on to perform three more experiments testing the foundations of quantum mechanics and entanglement, with each new experiment confirming and extending his results. The Freedman–Clauser experiment was the first test of the CHSH inequality. It has now been tested experimentally hundreds of times at laboratories around the world to confirm that quantum entanglement is real.
Clauser's work earned him the 2010 Wolf Prize in physics. He shared it with Alain Aspect of the Institut d'Optique and École Polytechnique and Anton Zeilinger of the University of Vienna and the Austrian Academy of Sciences "for an increasingly sophisticated series of tests of Bell's inequalities, or extensions thereof, using entangled quantum states," according to the award citation.
John Clauser at a yacht club. Clauser enjoys sailboat racing in his spare time. Credit: John Dukat
Here, John Clauser answers questions about his historical experiments.
In the 1960s and '70s, experimental testing of quantum mechanics was unpopular at Caltech, Columbia, UC Berkeley, and elsewhere. My faculty at Columbia told me that testing quantum physics was going to destroy my career. While I was performing the 1972 Freedman–Clauser experiment at UC Berkeley, Caltech's Richard Feynman was highly offended by my impertinent effort and told me that it was tantamount to professing a disbelief in quantum physics. He arrogantly insisted that quantum mechanics is obviously correct and needs no further testing! My reception at UC Berkeley was lukewarm at best and was only possible through the kindness and tolerance of Professors Charlie Townes [PhD '39, Nobel Laureate '64] and Howard Shugart [BS '53], who allowed me to continue my experiments there.
In my correspondence with John Bell, he expressed exactly the opposite sentiment and strongly encouraged me to do an experiment. John Bell's 1964 seminal work on Bell's theorem was originally published in the terminal issue of an obscure journal, Physics, and in an underground physics newspaper, Epistemological Letters. It was not until after the 1969 CHSH paper and the 1972 Freedman–Clauser results were published in the Physical Review Letters that John Bell finally openly discussed his work. He was aware of the taboo on questioning quantum mechanics' foundations and had never discussed it with his CERN co-workers.
Part of the reason that I wanted to test the ideas was because I was still trying to understand them. I found the predictions for entanglement to be sufficiently bizarre that I could not accept them without seeing experimental proof. I also recognized the fundamental importance of the experiments and simply ignored the career advice of my faculty. Moreover, I was having a lot of fun doing some very challenging experimental physics with apparatuses that I built mostly using leftover physics department scrap. Before Stu Freedman and I did the first experiment, I also personally thought that Einstein's hidden-variable physics might actually be right, and if it is, then I wanted to discover it. I found Einstein's ideas to be very clear. I found Bohr's rather muddy and difficult to understand.
In truth, I really didn't know what to expect except that I would finally determine who was right: Bohr or Einstein. I admittedly was betting in favor of Einstein but did not actually know who was going to win. It's like going to the racetrack. You might hope that a certain horse will win, but you don't really know until the results are in. In this case, it turned out that Einstein was wrong. In the tradition of Caltech's Richard Feynman and Kip Thorne [BS '62], who would place scientific bets, I had a bet with quantum physicist Yakir Aharonov on the outcome of the Freedman–Clauser experiment. Curiously, he put up only one dollar to my two. I lost the bet and enclosed a two-dollar bill and congratulations when I mailed him a preprint with our results.
I was very sad to see that my own experiment had proven Einstein wrong. But the experiment gave a 6.3-sigma result against him [a five-sigma result or higher is considered the gold standard for significance in physics]. But then Dick Holt and Frank Pipkin's competing experiment at Harvard (never published) got the opposite result. I wondered if perhaps I had overlooked some important detail. I went on alone at UC Berkeley to perform three more experimental tests of quantum mechanics. All yielded the same conclusions. Bohr was right, and Einstein was wrong. The Harvard result did not repeat and was faulty. When I reconnected with my Columbia faculty, they all said, "We told you so! Now stop wasting money and go do some real physics." At that point in my career, the only value in my work was that it demonstrated that I was a reasonably talented experimental physicist. That fact alone got me a job at Lawrence Livermore National Lab doing controlled-fusion plasma physics research.
In order to clarify what the experiments showed, Mike Horne and I formulated what is now known as Clauser–Horne Local Realism [1974]. Additional contributions to it were subsequently offered by John Bell and Abner Shimony, so perhaps it is more properly called Bell–Clauser–Horne–Shimony Local Realism. Local Realism was very short-lived as a viable theory. Indeed, it was experimentally refuted even before it was fully formulated. Nonetheless, Local Realism is heuristically important because it shows in detail what quantum mechanics is not.
Local Realism assumes that nature consists of stuff, of objectively real objects, i.e., stuff you can put inside a box. (A box here is an imaginary closed surface defining separated inside and outside volumes.) It further assumes that objects exist whether or not we observe them. Similarly, definite experimental results are assumed to obtain, whether or not we look at them. We may not know what the stuff is, but we assume that it exists and that it is distributed throughout space. Stuff may evolve either deterministically or stochastically. Local Realism assumes that the stuff within a box has intrinsic properties, and that when someone performs an experiment within the box, the probability of any result that obtains is somehow influenced by the properties of the stuff within that box. If one performs, say, a different experiment with different experimental parameters, then presumably a different result obtains.

Now suppose one has two widely separated boxes, each containing stuff. Local Realism further assumes that the experimental parameter choice made in one box cannot affect the experimental outcome in the distant box. Local Realism thereby prohibits spooky action-at-a-distance. It enforces Einstein's causality that prohibits any such nonlocal cause and effect. Surprisingly, those simple and very reasonable assumptions are sufficient on their own to allow derivation of a second important experimental prediction limiting the correlation between experimental results obtained in the separated boxes. That prediction is the 1974 Clauser–Horne (CH) inequality.
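For reference, the CH prediction can be written out explicitly (this is the standard form of the 1974 inequality; the notation is mine, not Clauser's in this interview). With $p_1(a')$ and $p_2(b)$ the single-detector probabilities and $p_{12}$ the joint detection probabilities at analyzer settings $a, a'$ and $b, b'$, any Local Realist theory must satisfy

$$-1 \;\le\; p_{12}(a,b) - p_{12}(a,b') + p_{12}(a',b) + p_{12}(a',b') - p_{1}(a') - p_{2}(b) \;\le\; 0,$$

while entangled photon pairs can push the middle expression above 0 at suitable settings.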
The 1969 CHSH inequality's derivation had required several minor supplementary assumptions, sometimes called "loopholes." The CH inequality's derivation eliminates those supplementary assumptions and is thus more general. Quantum entangled systems exist that disagree with the CH prediction, whereby Local Realism is amenable to experimental disproof. The CHSH and CH inequalities are both violated, not only by the first 1972 Freedman–Clauser experiment and my second 1976 experiment but now by literally hundreds of confirming independent experiments. Various labs have now entangled and violated the CHSH inequality with photon pairs, beryllium ion pairs, ytterbium ion pairs, rubidium atom pairs, whole rubidium-atom cloud pairs, nitrogen vacancies in diamonds, and Josephson phase qubits.
Testing Local Realism and the CH inequality was considered by many researchers to be important to eliminate the CHSH loopholes. Considerable effort was thus marshaled, as quantum optics technology improved and permitted. Testing the CH inequality had become a "holy grail" challenge for experimentalists. Violation of the CH inequality was finally achieved first in 2013 and again in 2015 at two competing laboratories: Anton Zeilinger's group at the University of Vienna, and Paul Kwiat's group at the University of Illinois at Urbana–Champaign. The 2015 experiments involved 56 researchers! Local Realism is now soundly refuted! The agreement between the experiments and quantum mechanics now firmly proves that nonlocal quantum entanglement is real.
One application of my work is to the simplest possible object defined by Local Realism: a single bit of information. Local Realism shows that a single quantum mechanical bit of information, a qubit, cannot always be localized in a space-time box. This fact provides the fundamental basis of quantum information theory and quantum cryptography. Caltech's quantum science and technology program, the 2019 $1.28-billion U.S. National Quantum Initiative, and the 2019 $400 million Israeli National Quantum Initiative all rely on the reality of entanglement. The Chinese Micius quantum-encrypted communications satellite system's configuration is almost identical to that of the Freedman–Clauser experiment. It uses the CHSH inequality to verify entanglement's persistence through outer space.
My dad, Francis H. Clauser [BS '34, MS '35, PhD '37, Distinguished Alumni Award '66] and his brother Milton U. Clauser [BS '34, MS '35, PhD '37] were PhD students at Caltech under Theodore von Kármán. Francis Clauser was Clark Blanchard Millikan Professor of Engineering at Caltech (Distinguished Faculty Award '80) and chair of Caltech's Division of Engineering and Applied Science. Milton U. Clauser's son, Milton J. Clauser [PhD '66], and grandson, Karl Clauser [BS '86], both went to Caltech. My mom, Catharine McMillan Clauser, was Caltech's humanities librarian, where she met my dad. Her brother, Edwin McMillan [BS '28, MS '29], is a Caltech alum and '51 Nobel Laureate. The family now maintains Caltech's Milton and Francis Doctoral Prize awarded at Caltech commencements.
Originally posted here:
First Experimental Proof That Quantum Entanglement Is Real - SciTechDaily
Quantum physics forces us to make really weird choices – Big Think
On Tuesday, the 2022 Nobel Prize in Physics was awarded to three researchers: Alain Aspect, John F. Clauser and Anton Zeilinger. These scientists' work opened up new frontiers in quantum weirdness to study. What their findings also showed is that the most philosophically challenging aspects of quantum mechanics are also its most essential. Those challenges mean that anyone taking quantum mechanics seriously is faced with strange choices in thinking about the nature of reality and our place in it. That is what I want to focus on today.
To be explicit, the three physicists share their prize for their studies of quantum entanglement. When particles are entangled, they can no longer be thought of as having separate properties. Imagine I have two particles with properties that I cannot know before I take measurements of them. But if the particles are entangled, then a measurement of just one out of the pair instantly establishes what a measurement on the other would produce. This is true even if the particles are separated by a distance so large that there would be no chance for them to communicate in the time it would take to measure one and then the other. In this way, entangled particles seem to form a coherent whole across space and time.
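As a hedged illustration of these statistics (my own sketch, not code from the article): for a spin singlet, quantum mechanics says the two outcomes are always opposite when the analyzers are aligned, and the probability that they agree grows as sin²(θ/2) with the relative analyzer angle θ. The snippet below samples that joint distribution directly; note that it "knows" the relative angle between the two distant settings, which is exactly the kind of coordination no local mechanism is allowed to have:

import numpy as np

rng = np.random.default_rng(0)

def singlet_outcomes(theta, n=100_000):
    """Sample +1/-1 outcome pairs for a spin singlet measured at
    analyzers offset by angle theta: P(agree) = sin^2(theta/2)."""
    alice = rng.choice([-1, 1], size=n)              # Alice's outcome: 50/50
    agree = rng.random(n) < np.sin(theta / 2) ** 2   # quantum agreement rate
    bob = np.where(agree, alice, -alice)
    return alice, bob

a, b = singlet_outcomes(0.0)        # aligned analyzers
print((a == -b).mean())             # 1.0: perfectly anticorrelated
a, b = singlet_outcomes(np.pi / 3)  # analyzers 60 degrees apart
print((a * b).mean())               # about -0.5, i.e. -cos(60 deg)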
Entanglement is exactly the kind of "spooky action at a distance" that Einstein was famously concerned about in quantum mechanics. It's why he felt quantum theory was somehow incomplete, meaning there must be something about it we have yet to understand.
What Einstein wanted was a physics that returned us to a classical view of reality: a view where things have their own distinct properties, regardless of whether a measurement of those properties was made or not. In 1964 Irish physicist John Stewart Bell proposed a way to clearly differentiate Einstein's vision of reality from the spookier quantum version. Measuring entanglement was the key. It took a few decades, but eventually measurements of separate entangled particles became commonplace, and in every experiment, Einstein lost. Reality really is spooky.
But what exactly is that spookiness telling us? The answer is that no one knows. Unlike classical physics, quantum mechanics always requires an interpretation to be pinned on top of its mathematical formalism. Whereas Newtonian physicists could easily imagine their laws of motion governing atoms that acted just like tiny billiard balls, quantum physicists never had any such assurance. The heart of the dilemma comes with the role of measurement. Quantum mechanics is famous for its wave-particle duality, where an electron, for example, will behave as a wave or a particle depending on which kind of experiment you perform. It's the choice of measurement, of a wave kind or a particle kind, that seems to determine the result.
So, is the electron a wave spread out through space, or is it a particle holding just a single position at any one time? And why should the choice made by a measurer have any effect? What is a measurement anyway, and what is a measurer? Is it always a person, an observer, or does any interaction with any kind of thing count? The answers to these questions cannot be found in the mathematical theory, at least not yet. That leaves people to interpret the mathematics according to the features of the reality they think the mathematics must express. But the problem is that no one agrees on which interpretation is correct, and the interpretations can vary wildly. And the spookiness of quantum cannot be made to go away: every interpretation is forced to accept something about reality that seems really, really weird.
For example, the Many Worlds Interpretation of quantum mechanics holds that there is still a reality out there independent of measurers, but there is a price paid for this view. Every measurement (in other words, every interaction with anything) forces the Universe to split into a near infinity of copies. Each of these many worlds holds one of the possible measurement results.
In Quantum Bayesianism, on the other hand, the measurements of quantum mechanics never reveal the world in itself, but our interactions with the world. QBism has no problem explaining the importance of measurements, but it gives up on the dream (or fantasy) of a perfectly objective view of reality. As you can see, the Many Worlds interpretation is very different from Quantum Bayesianism. But each shows the kinds of choices you must make when you try to ask what quantum mechanics tells us about reality. If someone could tell us which choice we simply have to make, well, that would be worth another Nobel Prize.
Read more here:
Quantum physics forces us to make really weird choices - Big Think
Carlo Rovelli on the bizarre world of relational quantum mechanics – New Scientist
Physicist Carlo Rovelli explains the strange principles of relational quantum mechanics - which says objects don't exist in their own right - and how it could unlock major progress in fundamental physics
By Michael Brooks
Carlo Rovelli at the Cornelia Parker exhibition, Tate Britain. Credit: David Stock
Carlo Rovelli stands in front of an exploding shed. Fragments of its walls and shattered contents (parts of a child's tricycle, a record player, a shredded Wellington boot) hang in mid-air behind him. I have come to meet the physicist and bestselling author at an exhibition at the Tate Britain art gallery in London. The scattered objects are the work of Cornelia Parker, one of the UK's most acclaimed contemporary artists, known for her large-scale installations that reconfigure everyday objects.
For Rovelli, based at Aix-Marseille University in France, Parker's work is meaningful because it mirrors his take on the nature of reality. "I connect with the process: of her coming up with the idea, producing the idea, telling us about the idea and of us reacting to it," he tells me. "We don't understand Cornelia Parker's work just by looking at it, and we don't understand reality just by looking at objects."
Rovelli is an advocate of an idea known as relational quantum mechanics, the upshot of which is that objects don't exist independently of each other. It is a concept that defies easy understanding, so Parker's reality-challenging exhibition seemed like it might be a helpful setting for a conversation about it, and about what else Rovelli is up to. It is a happy coincidence that Parker's shed is called Cold Dark Matter, a reference to the unidentified stuff that is thought to make up most of the universe. Because Rovelli now thinks he knows how we might finally pin…
Go here to see the original:
Carlo Rovelli on the bizarre world of relational quantum mechanics - New Scientist
'Click chemistry' and quantum weirdness: What are this year's Nobel Prizes all about? – Euronews
Groundbreaking discoveries in the fields of click chemistry and quantum mechanics have been awarded this year's prestigious Nobel Prizes in Chemistry and Physics, with three scientists per field jointly receiving recognition for their individual work.
But what does their research tell us about the world we live in, and how did they change it?
While most people have the utmost respect for Nobel Prize winners, it's also true that a majority of us may not have the knowledge to immediately grasp why the scientific results, recognised by the Swedish Academy, which judges the prizes, have the potential to change our lives.
We tried to explain it as simply as possible.
John Clauser, Alain Aspect and Anton Zeilinger were jointly awarded this year's Nobel Prize in Physics for their independent work on "quantum weirdness".
The three all studied the entanglement of photons, the behaviour of two subatomic particles of light which act like a single unit even when not physically linked and separated by a large distance.
Even when apart, what happens to one of the two particles influences what happens to the other.
This phenomenon was dubbed "spooky action at a distance" by arguably the most influential physicist in history, Albert Einstein.
Though the idea of quantum entanglement, or weirdness, was theoretically developed in the 1960s by Northern Irish physicist John Stewart Bell, it had not been proven before Clauser, Aspect and Zeilinger's experiments.
It turned out that the spookiness Einstein spoke of is definitely there - and it could be used to our benefit.
Quantum mechanics might sound rather abstract and far away from our daily lives, but it's closer than we might think.
The trio's entanglement research could be applied to quantum computers, promising a change in the way we use artificial intelligence to solve complex problems, such as world hunger and the climate crisis, as well as for encryption, making it harder for someone to hack private communications.
This year's Nobel Prize in Chemistry was awarded to the groundbreaking research of scientists Carolyn Bertozzi, Morten Meldal and Barry Sharpless, who discovered reactions that let molecules snap together like Lego pieces to create new compounds.
Explained in such terms, click chemistry - the field studying the way molecules can be linked together - sounds rather simple. It's far from it.
Click chemistry is a field of study in which molecular building blocks snap together quickly and efficiently, forming new complicated molecules.
Sharpless, who has just received his second Nobel Prize in Chemistry, laid the foundation of this new form of chemistry and coined its name around the year 2000.
Where linking molecules together is normally an extremely complex operation, click reactions are very fast, clean and reliable and don't produce unwanted side products. Instead of taking place in harmful solvents, the reactions take place in water.
Shortly afterwards, Meldal and Sharpless both independently discovered the copper-catalysed azide-alkyne cycloaddition, a mouthful of words that denotes a chemical reaction now widely used in chemistry and which has created its own branch of synthetic chemistry.
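Schematically (my sketch of the widely cited textbook reaction, not a scheme from the article), the copper-catalysed azide-alkyne cycloaddition "clicks" an azide and a terminal alkyne into a stable triazole ring:

$$\mathrm{R{-}N_3} \;+\; \mathrm{R'{-}C{\equiv}CH} \;\xrightarrow{\;\mathrm{Cu(I)}\;}\; \text{1,4-disubstituted 1,2,3-triazole}$$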
Bertozzi then took click chemistry to a new level, according to the Swedish Academy, developing click reactions that work inside living organisms - what she called bioorthogonal reactions - in 2004.
These click reactions, which don't require copper, take place without disrupting the normal biochemistry of the cell.
Bertozzi's bioorthogonal reactions have given us invaluable insight into the working of tumour cells, how they develop and evade our immune system, as well as offering us ways to track these cancerous cells.
But there's something inherent to click chemistry that is revolutionary: this form of chemistry is environmentally friendly. Unlike other chemical creations, click chemistry avoids creating harmful waste and byproducts which would be expensive and hard to dispose of.
Beyond cancer research and treatments, click chemistry has been used to produce antimicrobials, herbicides, diagnostic tests, corrosion retardants and brightening agents.
Bertozzi becomes only the eighth woman to receive the chemistry prize in Nobel history, joining the likes of Marie Curie, Dorothy Crowfoot Hodgkin and Ada E. Yonath.
Read more here:
'Click chemistry' and quantum weirdness: What are this year's Nobel Prizes all about? - Euronews
Einstein, God, and Spooky Action at-a-Distance | Tim Maudlin – IAI
The 2022 Physics Nobel Prize is misunderstood even by the Nobel prize committee itself. What the work of John Clauser, Alain Aspect and Anton Zeilinger has shown, building on John Bell's ideas, isn't that quantum mechanics cannot be replaced by a deterministic, hidden variables theory. What it has shown is that quantum mechanics, as well as all of physics, is non-local. Spooky action at a distance, what Einstein had found disturbing about quantum mechanics, is real and emerging technologies depend on it, argues Tim Maudlin.
The presentation of the 2022 Nobel Prize in Physics to John Clauser, Alain Aspect and Anton Zeilinger is a bittersweet moment for those of us who work in the foundations of physics. It is mostly bittersweet because John Bell, whose brilliant theoretical work provided the impetus and basis for the experimental work done by the laureates, did not live long enough to receive this same recognition for his achievement.
Bell died unexpectedly of a cerebral hemorrhage in 1990 at the age of 62. His seminal work was done in 1963, so this has been a long time coming. Bell's work has rightly been characterized as the spark that fueled the second quantum revolution, arising from the appreciation of the potential technological usefulness of entanglement in quantum systems. But the primary focus of Bell's work in 1963 was the related question of locality. Unfortunately, many news accounts of the implications of Bell's work and the experiments based on it (including the press release from the Nobel committee itself) have not correctly recounted the situation. So the announcement is also bittersweet because it mischaracterizes what Bell did.
Albert Einstein did not like the orthodox, so-called Copenhagen understanding of quantum theory. His most oft-repeated and quotable complaint is that according to Copenhagen "God plays dice" with the universe, i.e. the fundamental physical laws are indeterministic and probabilistic rather than deterministic (as Newton's and Maxwell's were). But Einstein's main complaint about quantum theory was not the indeterminism. It was rather what he called "spooky action-at-a-distance". These two issues are related: in the Copenhagen approach (which postulates both that the mathematical wavefunction used to describe a system provides a complete physical description and that it suddenly collapses when a measurement is done) it is not just that the collapse is random and unpredictable, but that it has physical effects instantaneously far away from where the collapse occurred. For example, when a dot forms on a screen, indicating that a particle was found there, the wavefunction of the particle everywhere else in the universe is instantly reduced to zero. Einstein objected to this sudden universal physical change, not least because it would have to be produced faster than light. His objections began in the 1920s, when the new quantum theory was formulated.
In 1935, together with Boris Podolsky and Nathan Rosen, Einstein made the issue even more obvious and acute by discussing an experimental set-up using a pair of particles (rather than just one) prepared in a special sort of quantum state called an entangled state. According to the theory, these particles could be separated to an arbitrary distance from one another (one sent to Alice and the other to Bob) and separately experimented on. Einstein's spooky action-at-a-distance amounted to the claim that what Alice does to her particle in her lab can have an instantaneous physical effect on the state of Bob's particle in his lab, arbitrarily far away. What they argued was that the Copenhagen understanding of quantum theory (in which the wavefunction provides a complete physical description of the system) requires such spooky action-at-a-distance. Why?
Two entangled particles: For Einstein, Podolsky and Rosen, "spooky action at a distance" was not acceptable.
Einstein, Podolsky and Rosen (EPR) constructed a wavefunction such that, although the outcome of a position measurement made by Alice cannot be predicted and so would be regarded as the outcome of an indeterministic and random process, it can be predicted with certainty by Bob on the basis of the outcome of his own observation of the position of his particle. According to Copenhagen, Bob can be in this position to make better predictions than Alice (from the original wavefunction) because his experiment has the effect of collapsing the wavefunction of the pair of particles and therefore, because they are entangled, changing the physical state of Alice's. And since Bob (but not Alice) knows how the collapse occurred, he can make better predictions than she can.
EPR observed that this postulation of collapse, and its attendant spooky action, was completely unnecessary. It was only forced on the Copenhagen view because they insisted on the completeness of the wavefunction. Much more reasonable, they argued, is to assume that the outcomes of both Alice's and Bob's experiments are determined by the local situation in their own labs, and all Bob is doing by his experiment is finding out about Alice's particle, not changing the physical state of Alice's particle. But to pursue this sort of account, one has to concede that the wavefunction does not provide a complete description of the system, which the Copenhagenists were unwilling to do.
This is how things stood for over two decades: EPR argued that to save locality and avoid spooky action-at-a-distance one had to postulate that a quantum system has physical characteristics (such as, for example, the exact position of a particle) that are not recorded or reflected in its wavefunction. These additional postulated physical characteristics came to be known as hidden variables (although, as Bell pointed out, it is a very misleading name since the additional variables better not be hidden if they are to help solve the problem).
A critical event in our story occurs in 1952 when David Bohm, deeply influenced by discussions with Einstein, publishes a paper laying out an interpretation of quantum theory in terms of hidden variables, i.e. a theory that explicitly postulates the incompleteness of the wavefunction. The idea had already been discovered by Louis De Broglie in 1927, but he abandoned the theory for some time. The theory (sometimes called De Broglie/Bohm theory or pilot wave theory or Bohmian Mechanics) is deterministic, and so denies that God plays dice. And it is a hidden variables theory that makes the same predictions as the standard Copenhagen quantum theory. The article caught the attention of John Bell, who immediately appreciated what Bohm had accomplished. But the theory was also manifestly and undeniably non-local: it directly postulated spooky action-at-a-distance in its dynamics. In Bohm's theory, Bob's experiment can indeed physically affect (and not just provide information about) the goings-on in Alice's distant lab. So the theory was of no interest to Einstein, as it did not eliminate the non-locality.
It occurred to Bell to ask whether the achievement of Bohm's theory (the determinism and general physical clarity of the theory) could somehow be retained but the non-locality eliminated. In 1963, on sabbatical, he had the time to look into the question and he discovered an astonishing fact. Bell considered experiments similar to, but also crucially different from, the ones EPR had discussed. He could do this also due to Bohm's help: Bohm, in his textbook on quantum theory, had reconfigured the EPR set-up to deal with a particle's spin rather than its position or momentum, as EPR had. Since the spins of particles can be measured in any direction, that immediately suggests a whole universe of experimental possibilities. When Alice and Bob set their spin-measuring apparatuses in the same direction, they always get opposite results: one particle will be deflected up and the other down (although quantum theory does not predict which will go up and which down). So once again, when Bob sees the outcome in his lab, he can make better predictions about what Alice will see than Alice can. But if the two directions of the measuring devices are offset from each other, these perfect correlations slowly degrade.
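In the standard notation (a textbook summary, not Maudlin's own equations), Bohm's version uses the spin singlet, whose correlation depends only on the angle $\theta$ between the two analyzer directions $\mathbf{a}$ and $\mathbf{b}$:

$$|\psi^{-}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow\downarrow}\rangle - |{\downarrow\uparrow}\rangle\bigr), \qquad E(\mathbf{a},\mathbf{b}) = -\,\mathbf{a}\cdot\mathbf{b} = -\cos\theta.$$

At $\theta = 0$ the outcomes are perfectly anticorrelated; as the analyzers are offset, the correlation falls off as the cosine, and it is exactly this fall-off that Bell scrutinized.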
Bell took a close look at exactly how they degrade and discovered an amazing thing: the full set of quantum mechanical predictions for these sorts of experiments cannot be reproduced by any local theory. The EPR correlations, of course, could be recovered by a local theory: it just had to be a deterministic and hence a hidden variables theory. But Bell proved that even a deterministic hidden variables theory cannot save locality and replicate all the quantum predictions. Therefore no local theory, of any sort, can return the quantum predictions. In other words, if quantum theory is predictively accurate in these cases, then locality is a lost cause. One must accept what Einstein called "spooky action-at-a-distance": what is done in one place makes a difference to how things happen in a location and time so far away that even light could not reach there in time.
Bell's mathematical result was extended and refined in various ways. Clauser, Michael Horne, Abner Shimony and Richard Holt relaxed Bell's assumption of perfect correlations and derived the CHSH inequalities, a generalization of the original Bell inequality. No local physics can robustly produce violations of the CHSH inequality for experiments done far apart. Later, Daniel Greenberger, Michael Horne and Zeilinger derived the GHZ example, which involves three entangled particles rather than two. Again, no local theory can reproduce the quantum predictions for experiments done on the triple, and the relevant predictions are not statistical in nature: they are absolute.
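To see why the GHZ predictions are absolute rather than statistical, it helps to write the state down (a standard textbook presentation, not taken from Maudlin's text). The three-particle state

$$|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|000\rangle + |111\rangle\bigr)$$

is a simultaneous eigenstate of four spin-measurement combinations:

$$X{\otimes}Y{\otimes}Y\,|\mathrm{GHZ}\rangle = Y{\otimes}X{\otimes}Y\,|\mathrm{GHZ}\rangle = Y{\otimes}Y{\otimes}X\,|\mathrm{GHZ}\rangle = -|\mathrm{GHZ}\rangle, \qquad X{\otimes}X{\otimes}X\,|\mathrm{GHZ}\rangle = +|\mathrm{GHZ}\rangle.$$

If each particle carried pre-existing local values $x_i, y_i = \pm 1$, multiplying the first three relations would force $x_1 x_2 x_3 = (-1)^3 = -1$ (the $y$ values cancel in pairs), contradicting the quantum value $+1$ in a single, non-statistical run.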
The theoretical work by Bell and his successors yields only a conditional conclusion: if Bell's inequality or the CHSH inequality is violated in the lab, then nature itself must be non-local. Quantum mechanics predicts such violations, but for many physicists that was just a good reason to expect the quantum-mechanical predictions to fail. Non-locality seemed too strange to accept, as it was for Einstein. But the final arbitration of the question had to be left to the experimentalists, some of whom have just received the Nobel prize for their work.
The first tests were carried out by Clauser and Stuart Freedman in 1972. They appeared to support the quantum-mechanical predictions, but those loath to give up locality seized upon various experimental loopholes. One required improving detector efficiencies and another demanded that Alice and Bob have the settings of their apparatuses change so rapidly that information about how they are set could not be sent from one to the other even using signals that travel at the speed of light. Via the subsequent experimental efforts of Aspect and Zeilinger and others, these loopholes have been successively closed, and there is no serious doubt that the quantum-mechanical predictions are accurate.
Further, it has been shown that the quantum-mechanical states that allow for violations of Bell's inequality are exactly the entangled states, so proving the physical reality of non-locality suggests that entanglement is an exploitable physical resource in a quantum system. This realization inspired people working in quantum computation, quantum communication and quantum cryptography to investigate how entangled states might be used for practical purposes. And that set off the second quantum revolution.
Unfortunately, much of this history has been garbled in the public discussion of Bell's work and its experimental tests. The Nobel prize committee itself gets it wrong in its press release:
"John Clauser developed John Bells ideas, leading to a practical experiment. When he took the measurements, they supported quantum mechanics by clearly violating a Bell inequality. This means that quantum mechanics cannot be replaced by a theory that uses hidden variables."
But that statement is flatly false. Indeed, it was a theory that uses hidden variables (Bohmian mechanics) that inspired Bell to find his inequalities, and that theory makes the correct prediction that the inequalities will be violated. The most well-known and highly developed hidden variables theory is that of Bohm, and Bell not only did not refute it, he was one of its most vocal proponents. In "On the Impossible Pilot Wave", Bell's wonderful paper expositing the theory, he writes:
Why is the pilot wave picture ignored in textbooks? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show that vagueness, subjectivity, and indeterminism are not forced on us by experimental facts, but by deliberate theoretical choice?
If one rightly wonders how such an encomium for a hidden variables theory could be written by the man who disproved hidden variables, the simple answer is that it wasn't. What Bell's theoretical work and the subsequent experimental work of Clauser, Aspect and Zeilinger proved was non-locality, not no-hidden-variables. Ultimately, they proved Einstein wrong in his suspicions against spooky action-at-a-distance. And that, surely, deserves the highest honors one can bestow.
The long gap between Bell's proof (and its experimental tests) and the recognition of the accomplishment is regrettable. We should all be very pleased that that recognition has finally arrived. But the delay was in large part due to persistent misunderstandings and even dismissal of Bell's work by people who had not put in the effort needed to understand it. May the presentation of the Nobel prize mark the end of these mischaracterizations, so the sweet may triumph over the bitter.
The rest is here:
Einstein, God, and Spooky Action at-a-Distance | Tim Maudlin - IAI
Think tank smears Orch OR quantum theory of consciousness – Newswise
Introduction: A Tale of Two ORs
Newswise – The think tank Foundational Questions Institute (FQXI) has sponsored coordinated research, articles and a press release comparing two theoretical versions of objective reduction (OR), proposals for gravity-related collapse of the quantum wavefunction.1-3 They also wrongly insinuate that the Orch OR theory of consciousness, put forth by Sir Roger Penrose and Stuart Hameroff in the mid 1990s,4,5 has been refuted.
Orch OR depends on a particular version of OR developed by Roger Penrose (P-OR).6,7 A similar version of OR was introduced by Hungarian physicist Lajos Diósi8 (D-OR), who is an investigator and author on the FQXI-sponsored research and articles.
P-OR and D-OR both address the measurement problem in quantum mechanics, the problem that quantum superpositions of multiple possibilities exist, but are not observable. Many believe the very act of measurement or of conscious observation causes quantum state reduction to definite states (collapse of the wavefunction). But P-OR and D-OR both suggest instead that superpositions self-collapse, and undergo reduction due to a gravity-related objective threshold (hence objective reduction, 'OR'). Both P-OR and D-OR propose a reduction threshold at time t = ℏ/E_G, a form of the uncertainty principle (ℏ is the Planck–Dirac constant, and E_G the gravitational self-energy of the superposition separation). Consequently, the P-OR and D-OR versions of OR are often lumped together, as "DP theory".
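Written out (a schematic restatement of the threshold as described in this article, not a new result), the shared collapse criterion is

$$\tau \;\approx\; \frac{\hbar}{E_G},$$

so the larger the gravitational self-energy $E_G$ of the superposition (bigger mass, wider separation), the sooner the spontaneous reduction: an isolated electron in superposition would persist enormously long, while a macroscopic body would self-collapse almost instantly.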
However there are critical differences. D-OR predicts measurable radiation, heat emission with each OR event, and P-OR does not. P-OR predicts (proto-)consciousness (qualia, the experiential feature of consciousness) with OR events, and D-OR is not known to have proposed it.
As radiative heat can be measured to distinguish between the two OR approaches, FQXI sponsored an impressive attempt by a well-qualified group to detect radiation from OR collapse, and thus test D-OR. But continuous experiments over two months under Gran Sasso Mountain in Italy did not find spontaneous OR-related radiation.1 D-OR appeared to be ruled out.
However, the articles and press release strangely avoided discussing the results, remained committed to radiation (D-OR), and didn't consider P-OR (which correctly predicts no radiation). The Gran Sasso authors2 then put forth their own variant of Orch OR based on (the just-refuted) D-OR, instead of P-OR. They then refuted their own variant, but falsely implied that P-OR and Orch OR were also refuted.

What is Orch OR?
To address the measurement problem, and consciousness, Roger Penrose6,7 considered the fundamental question of how a particle can be in quantum superposition of multiple locations simultaneously. He used microscopic general relativity, portraying quantum particles as tiny spacetime curvatures, and superpositions as separated, alternative curvatures, governed by a wavefunction (Figure 1).
Collapse/reduction caused by subjective conscious observation (BING!!) is shown in Figure 1a (subjective reduction, SR). In this view, consciousness collapses the wavefunction, but consciousness is left outside science.
Penrose proposed the precise opposite, that OR collapse occurs spontaneously at time t = ℏ/E_G, and (in P-OR) also causes (or is equivalent to) a conscious moment. In random microenvironments, ubiquitous P-OR moments would be experiential, but isolated, acausal and lacking memory, merely proto-conscious (ping!!, Figure 1b), metaphorically the noise of musical instruments being tuned before a performance.
Figure 1
Figure 1. Penrose1 described quantum particles as tiny curvatures in Planck-scale spacetime geometry, and superpositions as separated curvatures. Figure 1a: a conscious observer (BING!!) causes quantum state reduction, collapsing the wavefunction. Figure 1b: according to Penrose, separated curvatures are unstable, and reduce at time t = ℏ/E_G, producing a moment of phenomenal proto-conscious experience (ping!!).
How could P-OR moments be organized in the brain to give full, rich conscious perception and action? In the mid 1990s Penrose and Hameroff proposed that orchestrated quantum computations terminating by P-OR occur in microtubules, lattice polymers of the protein 'tubulin', inside brain neurons. Dendritic-somatic superpositions would entangle, evolve, quantum compute and collectively reach OR threshold for full conscious moments, selecting microtubule states which regulate axonal firing and behavior: orchestrated objective reduction, Orch OR.4,5
To accommodate Orch OR events by t = ℏ/E_G, a scalar brain hierarchy was described, both upward in size from neurons into networks, but also downward, inward, deeper and faster, to quantum vibrations among electron clouds inside microtubules,9 and ultimately to fundamental spacetime geometry at the Planck scale (Figures 2 and 3).
Figure 2
Figure 2. Brain scalar hierarchy inward from 1) neuron to 2) microtubule bundles, 3) microtubules, 4) row of tubulin dipoles. Self-similar dynamics repeat every ~3 orders of magnitude.
Figure 3
Figure 3. Brain scalar hierarchy continued. 1) tubulin protein, 2) pi resonance electron cloud dipoles, 3) atomic nuclei, and ... 4) fundamental Planck scale spacetime geometry.
The inward hierarchy was discovered by Anirban Bandyopadhyay and colleagues10,11 as self-similar patterns of microtubule quantum vibrations and resonances repeating in terahertz, gigahertz, megahertz, kilohertz and hertz frequency ranges. Orch OR conscious events are proposed as quantum computations selecting particular microtubule vibrational states, like notes and chords in music, able to control axonal firings and neuronal functions. Thus Orch OR entails conscious qubits, e.g. in and among dendritic-somatic microtubules in cortical pyramidal neurons, multiple poised alternatives that can collapse to specific states as the solution, selecting a conscious perception or action via axonal firing directly to spinal cord.
Consider a tennis player, Roger Federer say, charging the net on a drop-shot, peeking at his opponent Rafael Nadal: shall Roger go left/cross-court, or right/down the line? He has, at that instant, a superposition of both possibilities as poised alternatives, e.g. in microtubule quantum superpositions in cortical layer V pyramidal neuron cell bodies and dendrites in his motor and pre-motor cortex. Then, Orch OR occurs, BING!!. Axons fire to spinal cord, and Federer hits down the line for a winner. Conscious quantum computing would be very useful in biological evolution.
Ample evidence exists for functional quantum states in microtubules. Bandyopadhyay's quantum resonances,10,11 genomic, proteomic and optogenetic studies,12 and quantum optical effects in microtubules inhibited by anesthetics13-15 support the biological (Orch) side of Orch OR nicely. What about P-OR?
In the abstract of their paper in Nature Physics,1 the FQXI-sponsored authors say: "We then report the results of a dedicated experiment at the Gran Sasso underground laboratory to measure this radiation emission rate."
But they then don't report the results of the dedicated experiment (the results apparently being that radiation emission was not measured or detected). Without saying what the result was, they pivot directly to: "Our result sets a lower bound on the effective size of the mass density of nuclei, which is about three orders of magnitude larger than previous bounds. This rules out the natural parameter-free version of the Diósi–Penrose model." (The authors clarify that the natural parameter-free version of the Diósi–Penrose model is P-OR.)
What the FQXI are they talking about?!
Let's unpack this. Mass density of nuclei refers to the degree of superposition, or smearing out, of mass separated from itself. The more the smearing, the lower the mass density, and the lower the E_G, given by E_G = Gm²/a_c (where G is the gravitational constant, m is the mass, and a_c a measure of superposition separation length, or smear factor). A 'smear factor' length value for a_c is necessary to calculate E_G.
To calculate E_G for tubulin, the component protein in microtubules, Hameroff and Penrose4 determined a_c for tubulin in 3 possible ways: 1) partial separation of the entire protein, 2) complete separation at the level of the protein's atomic nuclei, and 3) complete separation at the level of its protons and neutrons. The dominant effect was 2) complete separation at the level of (carbon) atomic nuclei, each nucleus separated from itself by its radius 2.5 femtometers (2.5 × 10⁻¹⁵ m, 2.5 fermi lengths). (And an a_c of 2.5 femtometers theoretically enables electronic-nuclear quantum coupling.) This smear factor a_c = 2.5 femtometers was used to calculate tubulin E_G, to then relate the number of superpositioned tubulins for Orch OR moments to occur by t = ℏ/E_G at various times and frequencies.
Anirban Bandyopadhyay had already shown quantum resonances in microtubules in megahertz and gigahertz experimentally,10,11 so 10 megahertz (t = 10⁻⁷ secs) was selected as a likely frequency for Orch OR events, brief enough to avoid decoherence. By t = ℏ/E_G, Hameroff and Penrose calculated that to reach Orch OR threshold at time t = 10⁻⁷ secs (potentially repeating at 10⁷ Hz, 10 megahertz), an E_G of 10¹⁵ superpositioned, entangled tubulins would be required. With a total of about 10²⁰ tubulins in a human brain, 10 megahertz Orch OR events would involve 10⁻⁵ of total brain tubulins, a seemingly reasonable fraction (1/100,000) which could change dynamically. Hameroff and Penrose5 further suggested that Orch OR events in megahertz and other frequencies negatively resonate, interfere and change scale to give beats at slower kilohertz and hertz frequencies, including EEG and cognitive epochs. See the scalar hierarchy in Figure 2.
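As a hedged, back-of-envelope check of these numbers (my own sketch in Python; the tubulin mass and the carbon-equivalent nucleus count are illustrative assumptions, not figures quoted in the Hameroff–Penrose papers):

hbar = 1.055e-34          # J*s, reduced Planck constant
G    = 6.674e-11          # m^3 kg^-1 s^-2, gravitational constant
m_C  = 1.99e-26           # kg, mass of one carbon-12 nucleus
a_c  = 2.5e-15            # m, 'smear factor': the carbon nuclear radius
n_nuclei = 110_000 / 12   # assumed: ~110 kDa tubulin as carbon-equivalent nuclei

# E_G per tubulin, treating each nucleus as separated from itself by a_c.
E_G_tubulin = n_nuclei * G * m_C**2 / a_c   # roughly 1e-43 J
tau = 1e-7                                  # s, the 10 MHz Orch OR target
n_tubulins = hbar / (tau * E_G_tubulin)
print(f"E_G per tubulin ~ {E_G_tubulin:.1e} J")
print(f"tubulins needed for tau = 1e-7 s: ~ {n_tubulins:.0e}")
# Prints ~1e16, the same order as the ~1e15 quoted above; such estimates
# are only meaningful to an order of magnitude.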
Back to the Big Smear. D-OR predicts that small superposition smear values (such as the 2.5 femtometer a_c) result in high levels of radiation (large E_G), and that large smear values result in low levels of radiation (small E_G), or if large enough, none at all. The Gran Sasso authors then assert that the absence of radiation in their experiments was because the smear value must have been too large. They then dry-labbed a smear factor big enough to give zero radiation (R0) to match their negative results. This is their lower bound R0 ≈ 0.5 × 10⁻¹⁰ m, half an angstrom. (With a carbon atom nucleus smeared to half an angstrom, a single carbon atom would be microns across, larger than some entire biological cells!)
Apparently to test their R0, the Gran Sasso authors2 then concocted a variant of Orch OR based on D-OR constraints, smearing the 2.5 femtometer a_c smear factor 3 to 4 orders of magnitude larger! They then calculated times t = ℏ/E_G, and, not surprisingly, got bizarre results. For example, an E_G of 10¹⁵ tubulins with their smeared-out R0 would reach threshold only after 12 days, rather than 10⁻⁷ secs with the 2.5 femtometer a_c factor (a 12-day response time not being very helpful in tennis, for example). They thus successfully disproved their own D-OR variant of Orch OR, but have absolutely no reason to rule out "the natural parameter-free version of the Diósi–Penrose model" if that implies P-OR. This would be like saying: "Sorry Einstein, you were wrong. We changed your equation to e=mc⁵ and it doesn't work."
The Gran Sasso authors and FQXI press release1-3 don't actually criticize, accuse or find fault with P-OR, or Orch OR, whatsoever. But they repeatedly insinuate that both have been refuted, by conflating P-OR and D-OR.
They say2:
"Orch OR based on the DP theory is definitively ruled out"
Reply: DP theory lumps together D-OR and P-OR, but only the D-OR variant of Orch OR was ruled out. Authentic Orch OR based on P-OR remains perfectly viable.
And they say2:
"Orch OR theory, when based on the simplest version of gravity-related dynamical collapse, is highly implausible"
Reply: The "simplest version" (according to the authors) is Diósi OR. Their Orch OR variant based on D-OR doesn't work. Authentic Orch OR based on P-OR remains completely plausible.
To be fair, they also say:2 "the results of this paper do not rule out Orch OR theory in general. Rather, they rule out variants of Orch OR based on the simplest version of the DP theory of gravity-related dynamical wavefunction collapse."
Reply: Yes, they say this twice in the 3 papers, but 7 times say that Orch OR based on DP theory is ruled out, or implausible.
Finally, the FQXI press release3 title: "Collapsing a leading theory for the quantum origin of consciousness"
Reply: The leading theory is presumably authentic Orch OR based on P-OR and, as a scientific theory, is most definitely not collapsing. The two peer-reviewed papers made deceptive insinuations, but the FQXI press release headline did the dirty work, and wrongly declared Orch OR's demise.
Could both views on OR radiation be correct? Penrose has maintained such radiation would not be detected, but has also proposed retro-active effects of OR, time symmetry being an essential feature of quantum mechanics. For example, in The Emperor's New Mind,6 Penrose described seemingly backward time effects in brain sensory processing, as discovered by Ben Libet in the late 1970s,16 as retro-active effects of OR. Many other examples of apparent backward time effects in consciousness include the color phi effect, and pre-cognition. And real-time conscious behavior and free will may depend on retro-activity, avoiding the apparent problem of consciousness coming too late.17
In principle, retro-activity can erase the unselected superposition curvature, as if it never happened, preventing observable radiation in OR. What is retro-activity? Penrose develops the idea in his chapter in the book Quantum Mechanics and Consciousness, edited by Shan Gao (~October 2022, Oxford Press) [18].
Retro-active influence from near-future events could be beneficial in predator-prey relations, and in evolution in general. Let's return to Roger Federer charging the net on a drop shot, peeking at his opponent Nadal: will he go cross-court, or down the line? With retro-activity, perhaps Federer sees Nadal leaning to his right, expecting Roger to hit cross-court. Orch OR occurs (BING!!), axons fire, and Federer hits to Nadal's left, down the line for a winner.
Insinuations by FQXI and the Gran Sasso team that Penrose OR and/or Orch OR had been disproven are deceptive, misleading and false. Penrose OR is the first, and as yet only, truly scientific mechanism for conscious experience yet put forth. It cannot simply be replaced in the Orch OR theory by D-OR or anything else.
But new avenues of research interest have been highlighted, for example quantum optical pharmacology, the brain's microtubule-based scalar hierarchy, and Roger's retro-activity in physics and consciousness. Among theories of consciousness, Orch OR has 1) the greatest explanatory power, 2) the tightest connection to brain biology, and 3) the most supportive evidence. We welcome constructive criticism and collaboration.
The rest is here:
Think tank smears Orch OR quantum theory of consciousness - Newswise
UPSC Key-October 5, 2022: Why you should read 'Animal Adoption Scheme' or 'Credit Suisse' or 'Quantum Theory' for UPSC CSE – The Indian Express
FRONT PAGE
EC asks parties to explain how they plan to finance poll promises, draws Opp fire
Syllabus:
Preliminary Examination: Indian Polity and Governance-Constitution, Political System, Panchayati Raj, Public Policy, Rights Issues, etc.
Mains Examination: General Studies II: Salient features of the Representation of the People Act.
Key Points to Ponder:
What's the ongoing story- Amid a raging debate on the financial implications of freebies promised by political parties during elections - or the 'Revdi culture' referred to by Prime Minister Narendra Modi - the Election Commission of India on Tuesday wrote to parties proposing that they spell out the ways and means of raising additional resources to finance the promises, and the impact this would have on the fiscal sustainability of the state or the Central government.
For Your Information- Floating a consultation paper, the EC, in its letter to all recognised national and state parties, has prescribed a standardised disclosure proforma for them to declare the quantification of the physical coverage of the schemes promised, the financial implications of the promise, and the availability of the financial resources. The EC has asked the parties to furnish their views by October 19.
The EC's move drew a sharp reaction from the Opposition. Is this move by the Election Commission an overreach?
What is the Model Code of Conduct?
What are freebies?
Is there any definition of the term 'freebies' in the existing legal/policy framework?
What is wrong with political parties promising freebies?
Promises made by political parties are often driven by short-term electoral calculations-Agree or disagree?
Do You Know- The parties will have to detail how they propose to raise the additional resources to finance the scheme or schemes if voted to power: whether they plan an increase in tax and non-tax revenues, rationalise expenditure, go for additional borrowings, or do it in any other manner.
How do freebies impact Union and State budgets?
Why does the word 'freebie' almost sound pejorative, especially in political and policy circles?
The political dialogue built around freebies is fraught with danger. What danger is associated with freebie schemes?
How is the budget managed for freebie schemes and programmes?
Election Commission of India and Article 324 of the Constitution-Know in detail
The independent and impartial functioning of the Election Commission-How is it ensured?
Election Commission of India- Powers and Functions
Chief Election Commissioner and the two other Election Commissioners have equal powers-True or False?
In case of a difference of opinion amongst the Chief Election Commissioner and/or the two other Election Commissioners, the matter is decided by the Supreme Court of India-Right or Wrong?
Other Important Articles Covering the same topic:
As constitutional office, may not be apt to be part of SC panel on poll freebies: EC
Election monitor does well to refrain from stepping into debate on freebies. That
Over-reach, will stay out, Election Commission told Supreme Court before U-turn
Split on method to name new SC judges, CJI sends 2nd note to Collegium
Syllabus:
Preliminary Examination: Indian Polity and Governance
Main Examination: General Studies-II: Structure, organization and functioning of the Executive and the Judiciary
Key Points to Ponder:
What's the ongoing story- With a little over a month left for his retirement and two of the five-member Supreme Court Collegium opposed to a proposal to recommend four new judges, including a Supreme Court lawyer, to the top court through a written note instead of a formal meeting, Chief Justice of India U U Lalit is learnt to have written to them again, seeking reconsideration of their stand.
For Your Information- As per the Memorandum of Procedure (MoP), the document governing the process of appointment of judges and the appointment of the CJI, the Law Minister asks the outgoing Chief Justice of India to recommend the next CJI. The MoP states that the CJI should be the senior-most judge of the Supreme Court considered fit to hold the office. Although the MoP says that the CJI's views must be sought at the appropriate time and does not specify a timeline for the process, this normally takes place a month before the retirement of the incumbent CJI.
Do You Know- Articles 124 to 147 in Part V of the Constitution deal with the organisation, independence, jurisdiction, powers, procedures and so on of the Supreme Court. Parliament is also authorised to regulate them.
What is the sanctioned strength of Supreme Court Judges in India?
Who appoints the Chief Justice of India?
How is the seniority of judges in the Supreme Court decided?
What does the Collegium consider while making the recommendation?
National Judicial Appointments Commission (NJAC) Act 2014-Know the key highlights
Supreme Court on NJAC Act 2014 (99th Constitutional Amendment Act)-Know in detail
First Judges Case (1982), Second Judges Case (1993) and Third Judges Case (1998)-Know in detail
What was the Supreme Court's ruling in the Second Judges case (1993) with respect to the appointment of a judge?
What was the Supreme Court's ruling in the Third Judges case (1998) with respect to appointments?
The National Judicial Appointments Commission Act of 2014 and the Collegium System-Compare and Contrast
What are the qualifications required for a person to be appointed as the Chief Justice of India?
A person appointed as Chief Justice of the Supreme Court, before entering upon office, has to make and subscribe an oath or affirmation before whom?
The Constitution has made certain provisions to safeguard and ensure the independent and impartial functioning of judges-Know in detail
Jurisdiction And Powers Of Chief Justice of India-Know in detail
Executive Vs Judiciary for appointment of judges in higher judiciary-Know in detail
Other Important Articles Covering the same topic:
How is seniority decided in the SC?
Supreme Court strikes down NJAC, revives collegium system
Debate over the collegium system: How are SC and HC judges appointed?
Reservation for Paharis too in J&K, says Shah
Syllabus:
Preliminary Examination: Economic and Social Development-Sustainable Development, Poverty, Inclusion, Demographics, Social Sector Initiatives, etc.
Mains Examination: General Studies II: Government policies and interventions for development in various sectors and issues arising out of their design and implementation.
Key Points to Ponder:
What's the ongoing story- Ahead of the first-ever visit of Union Home Minister Amit Shah to the border district of Rajouri on Tuesday, the Pir Panjal region of Jammu and Kashmir is seeing growing tension between the Gujjar and Pahari communities over expected plans to grant Scheduled Tribe status to the latter.
Gujjar and Bakarwal Communities-Know in detail
Who are Paharis in Jammu & Kashmir?
Do You Know- The Gujjars and Bakerwals, who follow Islam, constitute 40% of the population in the border districts of Rajouri and Poonch, with the rest living in these areas identifying themselves as Paharis. With a population of nearly 15 lakh as per the 2011 Census, the Gujjars and Bakerwals form the third largest ethnic group in J&K after Kashmiris and Dogras. Since April 1991, they have enjoyed the benefits of 10% reservation for STs in government jobs and admissions to educational institutions.
What is the issue between Paharis and Gujjars?
How are Scheduled Tribes declared?
The Gujjars contend that the majority of Pahari speakers are upper-class Muslims such as Syeds, Qazis, Pirs, Begs, Rajas, Maliks, Mirzas and Khans, upper-caste Sikhs, Kashmiris, Hindus such as Brahmins, Rajputs and Mahajans, and Christians, who do not experience caste discrimination or social stigma like the Gujjars, Bakerwals, and other tribes of Jammu and Kashmir-Comment
Other Important Articles Covering the same topic:
New contours as J&K map redrawn: Paharis press for ST status, NC loses key leader
THE CITY
You can adopt an animal at Delhi Zoo, but prices vary
Syllabus:
Preliminary Examination: General issues on Environmental ecology, Bio-diversity and Climate Change that do not require subject specialization.
Mains Examination: General Studies III: Conservation, environmental pollution and degradation, environmental impact assessment.
Key Points to Ponder:
What's the ongoing story- The National Zoological Park (Delhi Zoo) has launched an animal adoption scheme allowing people to sign up to pay for the care of animals at the facility. Zoo director Dharam Deo Rai said that the scheme is being introduced for the first time at the zoo and is meant to encourage the participation of people in wildlife conservation.
What is the animal adoption scheme?
How will the animal adoption scheme work?
The animal adoption scheme is the first of its kind in India-Agree or disagree?
Animal adoption scheme-Know its features and highlights
Human-animal conflict is the biggest challenge in India and the animal adoption scheme could be an icebreaker-Discuss
Can the animal adoption scheme be implemented in sanctuaries or national parks?
Other Important Articles Covering the same topic:
Animal Adoption Scheme: You can now adopt your favorite animals at Delhi Zoo - Check how the scheme works and the rate list
EXPLAINED
What led to the Credit Suisse crisis; and the road ahead
Syllabus:
Preliminary Examination: Current events of national and international importance.
Mains Examination: General Studies III: Indian Economy and issues relating to planning, mobilization of resources, growth, development and employment.
Key Points to Ponder:
What's the ongoing story- Over the past few days, the share price of Credit Suisse, one of the oldest and historically one of the most influential banks in the world, has hit an all-time low. Since the beginning of 2022, Credit Suisse's share price has fallen close to 60 per cent.
Credit Suisse-Know in detail
What has triggered the concern around Credit Suisse?
How did it impact Credit Suisse?
What is a credit default swap?
How do credit default swaps work?
Why is a rise in the spread of credit default swaps worrisome?
What has Credit Suisse said?
Other Important Articles Covering the same topic:
What Is a Credit Default Swap (CDS)?
Continue reading here:
The biggest problem with gravity and quantum physics – Big Think
No matter what you may have heard, make no mistake: physics is not over in any sense of the word. As far as we've come in our attempts to make sense of the world and Universe around us - and we have come impressively far - it's absolutely disingenuous to pretend that we've solved and understood the natural world around us in any sort of satisfactory sense. We have two theories that work incredibly well: in all the years we've been testing them, we've never found a single observation or made a single experimental measurement that's conflicted with either Einstein's General Relativity or with the Standard Model's predictions from quantum field theory.
If you want to know how gravitation works, or what its effects on any object in the Universe will be, General Relativity has yet to let us down. From tabletop experiments to atomic clocks to celestial mechanics to gravitational lensing to the formation of the great cosmic web, its success rate is 100%. Similarly, for any particle physics experiment or interaction conceivable, whether mediated via the strong, weak, or electromagnetic force, the Standard Model's predictions have always been found to agree with the results. In their own realms, General Relativity and the Standard Model can each lay claim to being the most successful physics theory of all time.
But there's a huge fundamental problem at the heart of both of them: they simply don't work together. If you want your Universe to be consistent, this situation simply won't do. Here's the fundamental problem at the heart of physics in the 21st century.
Countless scientific tests of Einstein's General Theory of Relativity have been performed, subjecting the idea to some of the most stringent constraints ever obtained by humanity. Einstein's first solution was for the weak-field limit around a single mass, like the Sun; he applied these results to our Solar System with dramatic success. Very quickly, a handful of exact solutions were found thereafter.
On the one hand, General Relativity, our theory of gravity, was a radical concept when it first came out: so radical that it was attacked by many on both philosophical and physical grounds for many decades.
Regardless of how anyone might have felt about the new picture that Einstein's greatest achievement, the general theory of relativity, brought along with it, the behavior of physical phenomena in the Universe doesn't lie. Based on a whole suite of experiments and observations, General Relativity has proven to be a remarkably successful description of the Universe, succeeding under every conceivable condition that we've been able to test, whereas no other alternative does.
The results of the 1919 Eddington expedition showed, conclusively, that the General Theory of Relativity described the bending of starlight around massive objects, overthrowing the Newtonian picture. This was the first observational confirmation of Einsteins theory of gravity.
What General Relativity tells us is that the matter-and-energy in the Universe - specifically, the energy density, the pressure, the momentum density, and the shear stress present throughout spacetime - determines the amount and type of spacetime curvature that's present in all four dimensions: the three spatial dimensions as well as the time dimension. As a result of this spacetime curvature, all entities that exist in this spacetime, including (but not limited to) all massive and massless particles, move not necessarily along straight lines, but rather along geodesics: the shortest paths between any two points defined by the curved space between them, rather than by an (incorrectly) assumed flat space.
Where spatial curvature is large, the deviations from straight-line paths are large, and the rate at which time passes can dilate significantly as well. Experiments and observations in laboratories, in our Solar System, and on galactic and cosmic scales all bear this out, in great agreement with General Relativity's predictions, lending further support to the theory.
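Stated compactly (standard textbook notation, added here for reference; the article itself keeps everything in prose), the field equation sources curvature from the stress-energy content, and the geodesic equation then dictates how free particles move through the resulting geometry:

```latex
% Einstein's field equation: spacetime curvature (left-hand side) is
% sourced by the energy, pressure, momentum and stress content (right).
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu}
           = \frac{8\pi G}{c^{4}} \, T_{\mu\nu}

% Geodesic equation: force-free particles follow the straightest possible
% paths of the curved geometry, not the straight lines of flat space.
\frac{d^{2}x^{\mu}}{d\tau^{2}}
  + \Gamma^{\mu}{}_{\alpha\beta} \,
    \frac{dx^{\alpha}}{d\tau} \frac{dx^{\beta}}{d\tau} = 0
```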
Only this picture of the Universe, at least so far, works to describe gravitation. Space and time are treated as continuous, not discrete, entities, and this geometric construction is required to serve as the background spacetime in which all interactions, including gravitation, take place.
The particles and antiparticles of the Standard Model obey all sorts of conservation laws, but also display fundamental differences between fermionic particles and antiparticles and bosonic ones. While there's only one copy of the bosonic contents of the Standard Model, there are three generations of Standard Model fermions. Nobody knows why.
On the other hand, there's the Standard Model of particle physics. Originally formulated under the assumption that neutrinos were massless entities, the Standard Model is based on quantum field theory, where there are three gauge interactions:
The electromagnetic force is based on electric charges, and so all six of the quarks and the three charged leptons (electron, muon, and tau) experience the electromagnetic force, while the massless photon mediates it.
The strong nuclear force is based on color charges, and only the six quarks possess them. There are eight massless gluons that mediate the strong force, and no other particles are involved in it.
The weak nuclear force, meanwhile, is based on weak hypercharge and weak isospin, and all of the fermions possess at least one of them. The weak interaction is mediated by the W and Z bosons, and the W bosons also possess electric charges, meaning they experience the electromagnetic force (and can exchange photons) as well.
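Condensed into a lookup (my own compact restatement of the three paragraphs above, not an exhaustive Standard Model table):

```python
# Which Standard Model fermions couple to which force (mediators in the keys).
quarks = ["u", "d", "c", "s", "t", "b"]
charged_leptons = ["e", "mu", "tau"]
neutrinos = ["nu_e", "nu_mu", "nu_tau"]

forces = {
    "electromagnetic (photon)": quarks + charged_leptons,      # electric charge
    "strong (8 gluons)": quarks,                               # color charge
    "weak (W+, W-, Z)": quarks + charged_leptons + neutrinos,  # all fermions
}

for force, fermions in forces.items():
    print(f"{force}: {', '.join(fermions)}")
```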
The inherent width, or half the width of the peak in the above image when you're halfway to the crest of the peak, is measured to be 2.5 GeV: an inherent uncertainty of about +/- 3% of the total mass. The mass of the particle in question, the Z boson, is peaked at 91.187 GeV, but that mass is inherently uncertain by a significant amount owing to its excessively short lifetime. This result is remarkably consistent with Standard Model predictions.
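To connect those numbers (a back-of-the-envelope sketch of my own, not from the article): the energy-time uncertainty relation converts the measured width into a lifetime via t = ℏ/Γ:

```python
# Back-of-the-envelope: a decay width Gamma implies a mean lifetime
# tau = hbar / Gamma (energy-time uncertainty relation).
HBAR_GEV_S = 6.582119569e-25  # hbar in GeV*s

gamma_z = 2.5    # GeV, measured width of the Z peak
mass_z = 91.187  # GeV, Z boson mass (position of the peak)

tau_z = HBAR_GEV_S / gamma_z
print(f"Z lifetime ~ {tau_z:.1e} s")                       # ~2.6e-25 s
print(f"fractional uncertainty ~ {gamma_z / mass_z:.1%}")  # ~2.7%, the '+/- 3%'
```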
There's a rule in quantum physics that all identical quantum states are indistinguishable from one another, and that enables them to mix together. Quark mixing was expected and then confirmed, with the weak interaction determining various parameters of this mixing. Once we learned that neutrinos were massive, not massless as originally expected, we realized that the same type of mixing must occur for neutrinos, also determined by the weak interactions. This set of interactions - the electromagnetic, weak, and strong nuclear forces, acting upon the particles that have the relevant and necessary charges - describes everything one could want in order to predict particle behavior under any imaginable conditions.
And the conditions we've tested them under are extraordinary. From cosmic ray experiments to radioactive decay experiments to solar experiments to high-energy physics experiments involving particle colliders, the Standard Model's predictions have agreed with every single such experiment ever performed. Once the Higgs boson was discovered, it confirmed our picture that the electromagnetic and weak forces were once unified at high energies into the electroweak force, which was the ultimate test of the Standard Model. In all of physics history, there's never been a result the Standard Model couldn't explain.
Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. The electromagnetic interactions, shown here, are all governed by a single force-carrying particle: the photon, but weak, strong, and Higgs couplings can also occur. These calculations are difficult to perform, but are still far more complicated in curved, rather than flat, space.
But there's a catch. All of the Standard Model calculations we perform are based on particles that exist in the Universe, which means they exist in spacetime. The calculations we typically perform are done under the assumption that spacetime is flat: an assumption that we know is technically wrong, but one that's so useful (because calculations in curved spacetime are so much more difficult than they are in flat space) and such a good approximation to the conditions we find on Earth that we plow ahead and make this approximation anyway.
After all, this is one of the great methods we use in physics: we model our system in as simple a fashion as possible in order to capture all of the relevant effects that will determine the outcome of an experiment or measurement. Saying "I'm doing my high-energy physics calculations in flat spacetime rather than in curved spacetime" doesn't give you an appreciably different answer except in the most extreme conditions.
But extreme conditions do exist in the Universe: in the spacetime around a black hole, for example. Under those conditions, we can determine that using a flat spacetime background is simply no good, and we're compelled to take on the herculean task of performing our quantum field theory calculations in curved space.
Inside a black hole, the spacetime curvature is so large that light cannot escape, nor can particles, under any circumstances. Although we lack an understanding of what happens at the central singularities of black holes themselves, Einstein's General Relativity is sufficient for describing the curvature of space more than a few Planck lengths away from the singularity itself.
It might surprise you that, in principle, this isn't really all that difficult. All you have to do is replace the flat spacetime background you normally use for performing your calculations with the curved background as described by General Relativity. After all, if you know how your spacetime is curved, you can write down the equations for the background, and if you know what quanta/particles you have, you can write down the remaining terms describing the interactions between them in that spacetime. The rest, although it's quite difficult in practice under most circumstances, is simply a matter of computational power.
You can describe, for example, how the quantum vacuum behaves inside and outside of a black hole's event horizon. Because you're in a region where spacetime is more severely curved the closer you are to a black hole's singularity, the quantum vacuum differs in a calculable way. The difference in what the vacuum state is in different regions of space - particularly in the presence of a horizon, whether a cosmological or an event horizon - leads to the production of radiation and particle-antiparticle pairs wherever quantum fields are present. This is the fundamental reason behind Hawking radiation: the reason that black holes, in a quantum Universe, are fundamentally unstable and will eventually decay.
Although no light can escape from inside a black hole's event horizon, the curved space outside of it results in a difference between the vacuum state at different points near the event horizon, leading to the emission of radiation via quantum processes. This is where Hawking radiation comes from, and for the tiniest-mass black holes, Hawking radiation will lead to their complete decay in under a fraction of a second. For even the largest-mass black holes, survival beyond 10^103 years or so is impossible due to this exact process.
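For a rough sense of those timescales (a sketch using the standard Schwarzschild evaporation formula, which the article does not spell out; the specific masses are my own illustrative choices):

```python
# Sketch: Hawking evaporation time of a Schwarzschild black hole,
# t_evap = 5120 * pi * G^2 * M^3 / (hbar * c^4), photon emission only.
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
C = 2.998e8       # m/s
HBAR = 1.055e-34  # J*s
M_SUN = 1.989e30  # kg
YEAR = 3.156e7    # s

def evaporation_time(mass_kg):
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

for label, m in [("1e5 kg (tiny)", 1e5),
                 ("1 solar mass", M_SUN),
                 ("1e12 solar masses", 1e12 * M_SUN)]:
    t = evaporation_time(m)
    print(f"{label:>18}: {t:.1e} s = {t / YEAR:.1e} yr")
```

The cubic mass dependence spans the whole range quoted above: a 10^5 kg hole is gone in under a tenth of a second, while a hypothetical 10^12-solar-mass hole lasts of order 10^103 years.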
That's as far as we can go, however, and that doesn't take us everywhere. Yes, we can make the Standard Model and General Relativity play nice in this fashion, but this only allows us to calculate how the fundamental forces work in strongly curved spacetimes that are sufficiently far away from singularities, like those at the centers of black holes or - in theory - at the very beginning of the Universe, assuming that such a beginning exists.
The maddening reason is that gravity affects all types of matter and energy. Everything is affected by gravitation, including, in theory, whatever types of particles are ultimately responsible for gravitation. Given that light, which is an electromagnetic wave, is made up of individual quanta in the form of photons, we assume that gravitational waves are made up of quanta in the form of gravitons - many of whose particle properties we already know, even in the absence of a full quantum theory of gravitation.
But that's precisely what we need. That's the missing piece: a quantum theory of gravity. Without it, we cannot understand or predict any of the quantum properties of gravity. And before you say, "What if they don't exist?", know that this wouldn't paint a consistent picture of reality.
Results of a double-slit experiment performed by Dr. Tonomura showing the build-up of an interference pattern of single electrons. If which slit each electron passes through is measured, the interference pattern is destroyed, leading to two piles instead. The numbers of electrons in each panel are 11 (a), 200 (b), 6000 (c), 40000 (d), and 140000 (e).
For example, consider the most inherently quantum of all the quantum experiments that have ever been performed: the double-slit experiment. If you send a single quantum particle through the apparatus and you observe which slit it goes through as it goes through it, the outcome is completely determined, as the particle behaves as though it went through the slit you observed it to go through at every step of the way. If that particle was an electron, you could determine what its electric and magnetic fields were during its entire journey. You could also determine what its gravitational field was (or equivalently, what its effects on the curvature of spacetime were) at every moment as well.
But what if you don't observe which slit it goes through? Now the electron's position is indeterminate until it gets to the screen, and only then can you determine where it is. Along its journey, even after you make that critical measurement, its past trajectory is not fully determined. Because of the power of quantum field theory (for electromagnetism), we can determine what its electric field was. But because we don't have a quantum theory of gravitation, we cannot determine its gravitational field or effects. In this sense - as well as at small, quantum fluctuation-rich scales, or at singularities, where classical General Relativity gives only nonsense answers - we don't fully understand gravitation.
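The which-path logic itself is easy to see numerically (a toy sketch of my own, not tied to Dr. Tonomura's actual parameters): adding the two paths' amplitudes gives fringes, while adding their probabilities, which is what an observed path amounts to, gives none:

```python
# Toy two-slit model: coherent sum (path not observed) vs incoherent sum
# (path observed). Wavelength, slit spacing and geometry are illustrative.
import numpy as np

wavelength = 50e-12  # m, an electron-scale de Broglie wavelength
d = 1e-6             # m, slit separation
L = 1.0              # m, distance from slits to screen
x = np.linspace(-5e-3, 5e-3, 1001)  # screen positions, in meters

phase = 2 * np.pi * d * x / (L * wavelength)  # far-field path difference
amp1 = np.exp(+1j * phase / 2)  # amplitude for the path through slit 1
amp2 = np.exp(-1j * phase / 2)  # amplitude for the path through slit 2

coherent = np.abs(amp1 + amp2) ** 2                 # 4*cos^2(phase/2): fringes
incoherent = np.abs(amp1) ** 2 + np.abs(amp2) ** 2  # flat 2.0: two piles

print("coherent   min/max:", coherent.min().round(3), coherent.max().round(3))
print("incoherent min/max:", incoherent.min().round(3), incoherent.max().round(3))
```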
Quantum gravity tries to combine Einstein's General Theory of Relativity with quantum mechanics. Quantum corrections to classical gravity are visualized as loop diagrams, such as the one shown here in white. Whether space (or time) itself is discrete or continuous is not yet decided, as is the question of whether gravity is quantized at all, or whether particles, as we know them today, are fundamental or not. But if we hope for a fundamental theory of everything, it must include quantized fields, which General Relativity does not do on its own.
This works both ways: because we don't understand gravitation at a quantum level, we don't quite understand the quantum vacuum itself. The quantum vacuum, or the properties of empty space, is something that can be measured in various ways. The Casimir effect, for instance, lets us measure the effect of the electromagnetic interaction through empty space under a variety of setups, simply by changing the configuration of conductors. The expansion of the Universe, if we measure it over all of our cosmic history, reveals to us the cumulative contributions of all of the forces to the zero-point energy of space: the quantum vacuum.
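To make the Casimir example concrete (a sketch using the textbook ideal-plate formula, which the article doesn't quote):

```python
# Sketch: Casimir pressure between ideal parallel conducting plates,
# P = pi^2 * hbar * c / (240 * d^4), attractive and steeply rising as d shrinks.
import math

HBAR = 1.055e-34  # J*s
C = 2.998e8       # m/s

def casimir_pressure(d_m):
    return math.pi**2 * HBAR * C / (240 * d_m**4)

for d in (1e-6, 100e-9, 10e-9):  # plate separations in meters
    print(f"d = {d:.0e} m -> P ~ {casimir_pressure(d):.3g} Pa")
```

At a 1-micron separation the attraction is about a millipascal; shrink the gap tenfold and it grows by a factor of 10^4.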
But can we quantify the quantum contributions of gravitation to the quantum vacuum in any way?
Not a chance. We don't understand how to calculate gravity's behavior at high energies, at small scales, near singularities, or when quantum particles exhibit their inherently quantum nature. Similarly, we don't understand how the quantum field that underpins gravity - assuming there is one - behaves at all under any circumstances. This is why attempts to understand gravity at a more fundamental level must not be abandoned, even if everything we're doing now turns out to be wrong. We've actually managed to identify the key problem that needs to be solved to push physics forward beyond its current limitations: a huge achievement that should never be underestimated. The only options are to keep trying or give up. Even if all of our attempts turn out to ultimately be in vain, it's better than the alternative.
Excerpt from:
The biggest problem with gravity and quantum physics - Big Think
Physics – Breakthrough Prize for the Physics of Quantum Informationand of Cells – Physics
The idea of using the laws of quantum mechanics for computation was proposed in 1982 by Richard Feynman. But Deutsch - who is at the University of Oxford, UK - is often credited with establishing the conceptual foundations of the discipline. Computer bits that obey quantum principles, such as superposition and entanglement, can carry out some calculations much faster and more efficiently than ones that obey classical rules. In 1985 Deutsch postulated that a device made from such quantum bits (qubits) could be made universal, meaning it could simulate any quantum system. Deutsch framed his proposal in the context of the many-worlds interpretation of quantum mechanics (of which he is an advocate), likening the process of one quantum computation to that of many parallel computations occurring simultaneously in entangled worlds.
To motivate further work in quantum computing, researchers at the time needed problems that a quantum computer could uniquely solve. "I remember conversations in the early 1990s in which people would argue about whether quantum computers would ever be able to do anything really useful," says quantum physicist William Wootters of Williams College, Massachusetts, who has worked with Bennett and Brassard on quantum cryptography problems. "Then suddenly Peter Shor devised a quantum algorithm that could indeed do something eminently useful."
In 1995 Shor, who is now at the Massachusetts Institute of Technology, developed an algorithm that could factorize large integers - decompose them into products of primes - much more efficiently than any known classical algorithm. In classical computation, the time that it takes to factorize a large number increases exponentially as the number gets larger, which is why factorizing large numbers provides the basis for today's methods of online data encryption. Shor's algorithm showed that for a quantum computer, the time needed increases less rapidly, making factorizing large numbers potentially more feasible. This theoretical demonstration immediately injected energy into the field, Wootters says. Shor has also made important contributions to the theory of quantum error correction, which is more challenging in quantum than in classical computation (see Focus: Landmarks - Correcting Quantum Computer Errors).
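The classical skeleton of Shor's reduction is short enough to sketch (my own toy version; the quantum speedup lives entirely in the period-finding step, which is simply brute-forced here and is therefore only feasible for tiny N):

```python
# Toy sketch of the factoring-to-period-finding reduction behind Shor's
# algorithm. A quantum computer finds the period r efficiently; here it is
# brute-forced, which only works for tiny N.
import math
import random

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n): the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n):
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g           # lucky guess: a already shares a factor
        r = find_period(a, n)
        if r % 2:                      # need an even period; try another a
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                 # trivial square root of 1; try again
            continue
        p = math.gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(factor(15))  # (3, 5) or (5, 3)
print(factor(21))  # (3, 7) or (7, 3)
```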
Without Deutsch and Shor we would not have the field of quantum computation as we know it today, says quantum theorist Artur Ekert of the University of Oxford, who considers Deutsch his mentor. David defined the field, and Peter took it to an entirely different level by discovering the real power of quantum computation and by showing that it actually can be done.
Data encryption is the topic cited for the award to Bennett (IBM's Thomas J. Watson Research Center in Yorktown Heights, New York) and Brassard (University of Montreal, Canada). In 1984 the pair described a protocol in which information could be encoded in qubits and sent between two parties in such a way that the information could not be read by an eavesdropper without that intervention being detected. Like quantum computing, this quantum cryptographic scheme relies on entangling qubits, meaning that their properties are interdependent, no matter how far apart they are separated. This BB84 protocol and similar quantum encryption schemes have now been used for secure transmission of data along optical networks and even via satellite over thousands of kilometers (see Focus: Intercontinental, Quantum-Encrypted Messaging and Video).
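The sift-and-compare core of BB84 can be simulated classically in a few lines (a sketch of my own; real implementations send polarized photons, and an eavesdropper would be caught by comparing a random sample of the sifted key, which this toy run omits):

```python
# Classical simulation sketch of BB84 basis sifting (no eavesdropper).
import random

n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # rectilinear or diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Bob's measurement: the correct bit when bases match, a random bit otherwise.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Public sifting: both sides keep only the positions where the bases agreed.
keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in keep]
key_bob   = [bob_bits[i] for i in keep]

assert key_alice == key_bob  # always true with no eavesdropper on the line
print("sifted key:", key_alice)
```

If an eavesdropper measured each qubit in a random basis and re-sent it, about a quarter of the sifted bits would disagree, which is what reveals the intrusion.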
In 1993 Bennett and Brassard also showed how entanglement may be harnessed for quantum teleportation, whereby the state of one qubit is transferred to another, distant one, while the original state is destroyed (see Focus: Landmarks - Teleportation is not Science Fiction). This process too has applications in quantum information processing.
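The whole protocol fits in a small statevector simulation (a toy of my own, with hand-rolled gates rather than a quantum library; the qubit axes are Alice's state, Alice's half of the pair, and Bob's half):

```python
# Toy statevector simulation of one-qubit teleportation.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

a, b = 0.6, 0.8j                    # the state |psi> = a|0> + b|1> to teleport
psi = np.zeros((2, 2, 2), complex)  # axes: (Alice's qubit, Alice's half, Bob's half)
psi[:, 0, 0] = [a, b]               # start in |psi>|00>

# Share a Bell pair between Alice and Bob: H on the middle qubit, then CNOT.
psi = np.einsum('ij,ajb->aib', H, psi)
psi[:, 1, :] = psi[:, 1, ::-1].copy()  # CNOT: flip Bob's qubit when Alice's half is 1

# Alice's Bell-basis measurement circuit: CNOT then H on her state qubit.
psi[1, :, :] = psi[1, ::-1, :].copy()  # CNOT: flip her half when her qubit is 1
psi = np.einsum('ij,jab->iab', H, psi)

# For each of Alice's four outcomes (m2, m1), Bob applies Z^m2 X^m1
# and recovers |psi> exactly.
for m2 in (0, 1):
    for m1 in (0, 1):
        bob = 2 * psi[m2, m1, :]  # each outcome occurs with probability 1/4
        bob = np.linalg.matrix_power(Z, m2) @ np.linalg.matrix_power(X, m1) @ bob
        assert np.allclose(bob, [a, b])
print("state recovered for all four measurement outcomes")
```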
"I am really gratified by this award because it recognizes the field of quantum information and computation," Shor says. Deutsch echoes the sentiment: "I'm glad that [quantum information] is now officially regarded as fundamental physics rather than as philosophy, mathematics, computer science, or engineering."
"Deutsch, Shor, Bennett, and Brassard deserve recognition for their work, and I'm delighted that they're getting it," Wootters says. He notes that their research not only inspired the development of quantum technologies, but also influenced new research in quantum foundations. "Quantum information theory views quantum theory through a novel lens and opens up a new perspective from which to address foundational questions."
Link:
Physics - Breakthrough Prize for the Physics of Quantum Informationand of Cells - Physics