Category Archives: Quantum Physics

Quantum information science is rarely taught in high school here’s … – Nextgov

The first time I heard about quantum information science, I was at a teacher development workshop in Canada in 2008.

I already knew that quantum science was the study of the smallest objects in nature. I also knew that information science was the study of computers and the internet. What I didn't know was that quantum information science, sometimes called QIS, was a new field of science and technology, combining physical science, math, computer science and engineering.

Until then, I didn't realize how QIS was key to so many everyday items, like cellphones, satellites, MRI machines, lasers, cybersecurity and solar technology. I was a physics teacher and didn't know this, so I knew other teachers didn't either. And if they didn't know about it, that meant K-12 students were definitely not learning it.

I vowed to do a better job of teaching these concepts in my own classroom and to the teachers I mentor. But I quickly discovered significant barriers.

Those barriers include:

With the help of colleagues, I organized Quantum for All in 2020 to help give high school teachers support in teaching quantum information science. The project received nearly US$1 million in funding from the National Science Foundation. The goal of the grant is to help students become "quantum smart" by teaching K-12 educators how to teach QIS.

From a societal perspective, there are many reasons to invest in quantum education at the high school level.

The quantum information technology market is poised to be worth $44 billion by 2028. Yet one study estimates a major talent shortage in the industry with the number of open jobs outnumbering the number of qualified applicants by about 3 to 1.

Not having fundamental knowledge in the field may keep students from pursuing these highly paid jobs. Annual salaries can start at about $100,000 for quantum engineers, developers and scientists. Quantum physicists can earn up to $170,000.

While there is a need for quantum science talent in many industries, one of the most critical is in national security.

Historically, huge scientific and technological advancements have been made in the United States when politicians invest in efforts they deem critical to national security: think of the space race, where the U.S. spent US$257 billion over 13 years, or the atomic bomb, which cost about $30 billion to $50 billion over four years, both in today's dollars.

In 2016, the U.S. government recognized the importance of quantum information science in maintaining the country's strategic edge when China launched the world's first quantum satellite, showcasing its emerging space and technology program. U.S. military leaders also worried that China was on the verge of creating hack-proof communications tools far more sophisticated than American designs. This raises questions about which nation will dominate from space in times of crisis.

The Center for New American Security, a Washington-based think tank, warned that China's focus on quantum science as part of its research efforts could help that country surpass the U.S. as an economic and military superpower.

In 2018, the National Quantum Initiative Act was signed into law to accelerate quantum research and development and develop a quantum information science and technology workforce pipeline. However, the initiative lacked details on how this workforce would be developed.

With a new national focus on quantum information science, the National Quantum Network was launched in 2020 to help support and coordinate the K-12 education efforts, expand available learning tools and create opportunities for students to envision their role in a quantum workforce.

The most logical venue for exposure to quantum information science would be a high school physics course. However, as many as 16% to 39% of high school students do not attend high schools where physics is offered each year.

Traditional professional development focuses on teaching the teacher, rather than helping the teacher prepare to teach. That's why I and other researchers are studying the effectiveness of a different professional development model. Components of the model include having the content taught by fellow science teachers.

Our model educates teachers one week and then allows them to teach students at a camp the following week while the information and techniques are still fresh. Research has shown that this approach is more effective than doing summer workshops that don't allow teachers to try out what they learned until much later.

This model also allows teachers to gain confidence as they practice teaching techniques with fellow science teachers, making it more likely they will implement this knowledge in their own lessons. The lessons being developed by the project can be embedded into existing STEM (science, technology, engineering and math) curricula or taught as stand-alone topics.

Examples of quantum information science lessons that have been developed include levitation, where students are shown the basics of superconductors and quantum levitation. These concepts are already being used in applications such as Maglev trains, which use magnets to quietly float above the tracks instead of using wheels. There are many benefits to this type of travel, including energy efficiency, fewer derailments, less maintenance and less impact on the environment.

Other lessons involve understanding cryptography and cybersecurity. Cryptography is the technique of coding information (encryption) so it can only be read by the intended receiver, whereas cybersecurity is the set of processes and procedures used to keep information secure on devices and networks.
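The distinction can be made concrete with a short classical example. The sketch below is purely illustrative, not taken from the project's lesson materials: it implements a one-time pad, the textbook encryption scheme whose key-distribution problem is exactly what quantum key distribution is designed to solve.

```python
# Illustrative one-time pad (hypothetical classroom example, not from the
# Quantum for All curriculum). The same XOR operation both encrypts and
# decrypts, provided sender and receiver share the same random key.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # must be shared over a secure channel

ciphertext = xor_bytes(message, key)     # encryption
recovered = xor_bytes(ciphertext, key)   # decryption with the same key

assert recovered == message
```

Distributing that shared key securely is the hard part, and it is the task that quantum cryptography lessons build toward.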

As districts and educators begin to implement quantum information science concepts, my colleagues and I are collecting feedback from teachers on the effectiveness of their lessons and student engagement. This feedback will be used to inform how to add quantum information into more lessons.

If this new model of teacher education works, it could be expanded nationwide.

This type of professional development may be expensive due to the time teachers need to learn the content and increase their teaching confidence. But failing to prepare students for the jobs of the future could be even more costly if the U.S. yields its place in quantum technology, allowing countries like China to assert their supremacy in the field.

Karen J. Matsler, Assistant Professor in Practice for UTeach Arlington Program, University of Texas at Arlington

This article is republished from The Conversation under a Creative Commons license. Read the original article.

New Zealand to invest in quantum technology research – Xinhua

WELLINGTON, Sept. 13 (Xinhua) -- New Zealand will invest in quantum technology research through an ambitious and internationally-connected program, according to the Ministry of Business, Innovation and Employment (MBIE) on Wednesday.

The program to leverage New Zealand's niche expertise in quantum technology research will receive up to 12 million NZ dollars (7.07 million U.S. dollars) of government funding over the next five years, said a statement from the ministry.

The Quantum Technologies Research Program will be developed by the University of Otago's Dodd-Walls Center for Photonic and Quantum Technologies, one of New Zealand's leading centers for quantum research.

It will focus on increasing international connectivity and domestic capability in this rapidly developing sector that leverages the principles of quantum physics to build new and advanced technologies, the ministry said.

"Quantum technologies have potentially transformative applications across many parts of society and in nearly every industry including climate and environmental monitoring, drug discovery and medical imaging, materials sciences and communication," said MBIE Manager International Science Partnerships Loveday Kempthorne.

The program will enable New Zealand researchers to initiate and respond to collaboration prospects and to become valued partners as governments, large technology companies and start-ups around the world invest heavily to achieve breakthroughs in quantum innovation, Kempthorne said.

The Dodd-Walls Center has established strong connections with world-leading research organizations and draws expertise from research institutions throughout New Zealand, she added.

This funding allows New Zealand to develop strategic capabilities and grow the ecosystem to a point to leverage the nascent "second quantum revolution," said Dodd-Walls Center Director Frederique Vanholsbeeck.

Some of the first countries New Zealand will seek to collaborate with include Britain, Japan, Singapore, the United States and Germany, Vanholsbeeck said, adding that Australia is also an important partner for New Zealand alongside the Association to Horizon Europe.

Research Fellow in Theoretical Quantum Physics job with … – Times Higher Education

About Centre for Quantum Technologies (CQT)

The Centre for Quantum Technologies (CQT) is a research centre of excellence in Singapore. It brings together physicists, computer scientists and engineers to do basic research on quantum physics and to build devices based on quantum phenomena. Experts in this new discipline of quantum technologies are applying their discoveries in computing, communications, and sensing. CQT is hosted by the National University of Singapore and also has staff at Nanyang Technological University. With some 180 researchers and students, it offers a friendly and international work environment. Learn more about CQT at www.quantumlah.org.

Job Description

The Centre is looking for a talented and driven Research Fellow to join the research group of Prof. Valerio Scarani. The group conducts theoretical research, specialising in foundational aspects, nonlocality and device-independent certification, quantum thermodynamics, and description of quantum devices.

We are specifically looking for a candidate to lead an original research project drawn from a broad variety of topics, centered on the theoretical description of realistic physical systems and devices. A non-exhaustive list of possible topics includes: the theory of mechanical systems in the quantum regime; the characterisation and benchmarking of large quantum devices; proposals for the detection of tabletop gravity from quantum sources; and implementations of primitives in quantum networks. It is absolutely essential that the candidate propose a convincing research plan and show their ability to lead it.

The position is open for two years, with further extensions possible in case of exceptional performance. Starting date: January 2024.

Job Requirements

More Information

For enquiries and details about the position, please contact Prof. Valerio Scarani at physv@nus.edu.sg.

Please include your consent by filling in the NUS Personal Data Consent for Job Applicants.

Applications should contain your latest CV and a description of your research interests.

It Happened at Michigan Physics, Ann Arbor and J. Robert … – The University Record

What began as a modest university summer lecture program featuring notable physicists in 1923 evolved into an extraordinary series of appearances by some of the greatest minds in theoretical physics.

J. Robert Oppenheimer, Niels Bohr, Werner Heisenberg, E.O. Lawrence, and others traveled to Ann Arbor to join the U-M faculty for the Summer Symposia in Theoretical Physics between the world wars. No fewer than 15 of the visiting physicists were either Nobel laureates or would go on to receive the Nobel Prize in physics.

When Harrison M. Randall, chair of the Department of Physics, rolled out the first symposium in 1923, it drew on the program's strengths in experimental physics. By 1928, the emphasis turned to theoretical physics, where the real scientific advances of the day were unfolding. David M. Dennison, a junior faculty member, was a driving force in attracting physicists he had come to know while studying abroad.

"In order to remain ahead, it was necessary to have very frequent contacts all the while with the foremost physics, and the foremost physics at that time was being done in Europe," Dennison later recalled.

From 1929 to 1941, a parade of the world's leading physicists marched into Ann Arbor each summer. Enrico Fermi came from the Royal University of Rome in 1930 to discuss quantum electrodynamics; it was the first of five visits he would make over the next decade. Wolfgang Pauli, a pioneer of quantum physics, lectured in 1931 and returned in '41.

Oppenheimer, who was a professor at the California Institute of Technology and would come to be known as the "father of the atomic bomb," taught in 1931 and 1934.

Lectures drew advanced physics students from around the country and globe who spent their summers interacting with each other and leaders in their field. "If we looked at the photographs, we could see all of these people who are now prominent physicists, who in those days were in the beginning stages," Randall said in 1964. "So, this was a very great service, I think, to physics as a whole that was accomplished by the summer symposia."

In 2010, the American Physical Society designated U-M a historic site for the symposia's impact, which played a critical role in helping America achieve international status in theoretical physics.

Interview of Harrison Randall by David Dennison and W. James King on 1964 February 19 courtesy of Niels Bohr Library & Archives, American Institute of Physics, College Park, Maryland

Two professors awarded Vannevar Bush Fellowships – The Rice Thresher

By Keegan Leibrock 9/13/23 12:01am

Qimiao Si, a professor of physics and astronomy, and Jeffrey Tabor, a professor of bioengineering and biosciences, have been awarded Vannevar Bush Faculty Fellowships by the U.S. Department of Defense. The award comes with a five-year fellowship and $3 million in research funding to continue work in their respective fields.

Si studies theoretical condensed matter physics and has been a faculty member at Rice since 1995. His contributions to the field of condensed matter physics focus on strongly correlated electron systems and the theory of quantum criticality, which relates to the transition of matter from one quantum state to another.

Si said his research intends to explore the responsiveness of electrons to stimuli. With the grant money, he said he will pursue research dealing with the control of topological states of matter.

"We want to establish a theoretical framework for realizing materials that have unusual properties, in this case being extremely responsive," Si said. "We want to start from intuition and build up the equations to be able to make a statement about realizing new states of matter ... It's not the approach that one takes on a daily basis, and it's also highly risky. It may not work."

Tabor's research focuses on bioengineering and biosciences. Since joining Rice in 2010, Tabor's lab has programmed living cells to sense and respond to environmental stimuli, along with developing potential methods to sense and treat intestinal disease.

Tabor said that he hopes this new research will streamline the DNA synthesis processes to open new research opportunities.

"Unfortunately, the technology that's used to make synthetic DNA is based on chemistry and has not been changed in 40 years. It's very expensive, it produces toxic chemical waste and it severely limits research," Tabor said. "[Because of this], we aren't able to do all the research that we want ... our goal is to harness the power of enzymes that come from living things to naturally make DNA in an approach that is astronomically cheaper."

Tabor said that this new process of DNA synthesis would have substantial benefits for those seeking medical treatment for cancer through cancer immunotherapy.

"While there [have] been early breakthroughs [in natural DNA synthesis], there's also still a lot of challenges," Tabor said. "If we were to make that process much cheaper, we could try many new designs, learn much more about how this system works and then make a new generation of these cancer immunotherapies that targeted more cancers, had fewer side effects and helped more people."

Si said that obstacles to success, both in seeking this grant and in his research, have driven his excitement.

"Over the course of my career, I've made many predictions, and more often than not, they turn out not to be true," Si said. "To me, that's where the excitement lies, in this inevitability of unpredictability ... the opportunities are enormous [and] the challenges are enormous."

Si said that the grant funding will open new opportunities in quantum physics that would have otherwise not been possible.

"The [$3 million grant] allows us to do two things," Si said. "One is that it allows us to have a large group of postdocs and graduate students who can come together to brainstorm and formulate equations ... [the grant] also allows us to facilitate experimental capacity."

Si said that he would advise aspiring researchers to recognize a specific area of interest and work hard within that area.

"Follow your trajectory, and you'll get something great ... I think that's quite true," Si said.

Tabor expressed a similar sentiment, advising young researchers to be ambitious in their goals.

"Don't be afraid to think big and pursue something that you're really excited about," Tabor said. "Even if it seems too far away, or too risky, or too challenging, if you are excited about it, [then] that excitement will come through in your daily life, and you'll be doing your best work ... that's a good recipe for success."

Dr. Rashad Richey Named Physics Professor and Dep. Chair at … – WAOK

ATLANTA, Ga. (September 12, 2023) – Renowned scholar and multimedia broadcaster, Rashad Richey, PhD, EdD, MBA, LLM, LLD, JD Candidate, has been named the new chair of the Department of Science, Law, and Philosophy, as well as professor of physics at the prestigious Paris Graduate School (PGS) in France. The internationally known lecturer, best-selling author, and social activist began his new role at PGS September 1.

"I am beyond humbled. PGS has provided me the opportunity to inspire, create, research and educate at the highest levels of education." – Dr. Rashad Richey

Richey brings a wealth of knowledge and experience to his new role. He previously served as chair of the Department of Adult Learning at Beulah Heights University (BHU) in Atlanta. Richey is currently a professor at Morris Brown College, director of the Applied Medical Physics Program at OcuPrep Medical Institute and lectures in health equity and physics at Morehouse School of Medicine. Richey has also lectured at the University of Michigan, Reinhart University and Clark Atlanta University (CAU). In addition, Richey hosts The Rashad Richey Morning Show, an award-winning radio program on News & Talk 1380 WAOK-AM. He also anchors the nationally syndicated daily television show, Indisputable with Dr. Rashad Richey, on The Young Turks Network. He is the recipient of numerous awards, including the President's Lifetime Achievement Award from President Biden, and the Distinguished Alumni Award from CAU. Richey was recently inducted into the highly selective National Black College Hall of Fame, into which notable Black leaders such as Dr. Martin Luther King Jr., Oprah Winfrey and Justice Thurgood Marshall have also been inducted.

"The Paris Graduate School steadfastly believes in seven core values that are key to adapting to the changing world: inclusivity, research excellence, sustainability, adaptability, student wellness, ethics and opportunity," said Souha Akiki, PhD, president and founder of PGS. "Dr. Richey is the right person at the right time to help us maintain these values and accomplish our goals."

Richey is a rare multidisciplinary scholar who holds multiple advanced degrees, including completing doctoral research studies in higher education policy reform at CAU. He also earned a PhD from the Business University of Costa Rica, where his research focused on the intersectionality of government policy contextualized through religious norms, an MBA from BHU, a Master of Laws from the University of Renaissance, and his Doctor of Law (LLD-ABD) from Azteca University. Grounding his enthusiasm for physics by completing 43 physics-based college courses through the Massachusetts Institute of Technology (MIT) OpenCourseWare (non-degree) program, Richey earned his Master of Science in Neuroscience at the University of Pacific, and holds a Master of Applied Physics and Quantum Mechanics, for which his master's thesis was adapted into a book, Ancient Egyptian Mastery of Quantum Physics, Vibratory Frequency, and Geometric Sciences: An Overview of Complex Scientific Applications in Ancient Cultures. The book quickly became one of the best-selling physics books on the Amazon platform. He is currently pursuing his fourth PhD as a research student in quantum physics and is a Senior Physicist Research Fellow at the School of Life Information Science & Engineering at Asia Pacific in collaboration with CAU's science department laboratory.

NOTE TO JOURNALISTS: To schedule an interview or for more information, please contact RicheyAssistant@gmail.com or call 912.508.2170.

Dr. Rashad Richey, host of the award-winning Rashad Richey Morning Show on News & Talk 1380 WAOK/V-103FM (HD3) (Weekdays 7am -10am), and the Dr. Rashad Richey Review on SiriusXMs Urban View (Sundays at 1pm and 9pm), was voted 'Best Talk Radio Personality in Atlanta' by readers of the Atlanta Journal-Constitution and named 'Most Trusted Voice in Atlanta' by the Atlanta Business Journal, making him the first African-American to receive these distinctions.

The intelligent and fearless television news anchor for the opinion news show 'Indisputable with Dr. Rashad Richey' on the TYT Network, which was named 'fastest growing TV news show in America', and Political Commentator for The People's Station V-103 FM, America's largest urban station, also serves as President of Rolling Out, the largest free-print urban publication in the country. This multimedia powerhouse with over 3 million combined subscribers/followers on Facebook Watch, YouTube, Podcasts, and Twitch is a noted multidisciplinary academic scholar and university professor/lecturer and an Emmy-nominated television Political Analyst for CBS News Atlanta.

Believing in the power of knowledge and education, Dr. Richey holds several advanced degrees, making him one of the most academically credentialed individuals in American history according to America News Now. Completing doctoral research studies in federal policy reform from Clark Atlanta University, Dr. Richey also holds a PhD from the Business University of Costa Rica where his research and doctoral dissertation highlighted the nuances and intersectionality of politics, policy and religion.

Being a student of leadership, Dr. Richey completed studies in Executive Leadership at Cornell University and was accepted into a specialty executive law program at Harvard University in International Finance: Policy, Regulation, and Transactions. Understanding the connectivity of culture and science, Dr. Richey earned his Master of Science in Neuroscience from the University of Pacific, where his master's thesis researched cognitive functionalities of brain entrainment. Dr. Richey also completed a Master of Science in Applied Physics and Quantum Mechanics from Universidad Empresarial; his master's thesis was adapted into a book titled, Ancient Egyptian Mastery of Quantum Physics, Vibratory Frequency, and Geometric Sciences: An Overview of Complex Scientific Applications in Ancient Cultures, which quickly became the #1 Physics, #1 Science, #1 History, and #1 Egyptian Genre book on the Amazon platform.

As an executive leader, Dr. Richey says it's imperative to combine business practices with compassionate leadership principles, which was a primary focus when completing his Master of Business Administration (MBA) program at Beulah Heights University. Dr. Richey also holds a Master of Laws (LL.M) in Humanitarian Studies from the University of Renaissance and a Doctor of Law in International Law (ABD) from the research institution, Azteca University.

He is currently in the final leg of completing his Juris Doctor (Law Degree) from Birmingham Law School and dually enrolled in a PhD in Quantum Physics program, a collaboration between the Clark Atlanta University physics department and the School of Life Information Science & Engineering at Asia Pacific School of Business.

As host of The Rashad Richey Morning Show, Dr. Richey has interviewed everyone from Vice-President Kamala Harris to TI, and always brings relevant information, the best on-air debates, and the most insightful interviews in media. Tune in every weekday morning from 7am-10am on News and Talk 1380-WAOK, V-103FM (HD3), www.WAOK.com, or on the Audacy App.

The grand paradox at the heart of every black hole – Big Think

When something falls into a black hole, where does it go, and will it ever come back out again? According to Einstein's General Relativity, those answers are simple: as soon as anything physical (matter, antimatter, radiation, etc.) crosses over the event horizon, it's gone. It can add things like mass, electric charge, and angular momentum to the black hole, but little else. It goes swiftly toward and eventually into the central singularity, and will never escape again.

But our Universe isn't governed by General Relativity alone, but also by quantum physics. According to our best understanding of quantum reality, there's much more that needs to be considered. Not only are there other quantum properties inherent to the raw ingredients that go into making a black hole (baryon number, lepton number, color charge, spin, lepton family number, weak isospin and hypercharge, etc.), but the fabric of spacetime itself, which contains the black hole, is quantum in nature. Because of those quantum properties, black holes do not remain static, but rather evaporate over time: emitting Hawking radiation (and perhaps even more) in the process.

When black holes do evaporate, then, what happens to the information that went into creating them? Is it conserved? Is it destroyed? Is it encoded in the outgoing radiation? And if so, how? These questions are at the heart of perhaps the greatest paradox of all: the black hole information paradox. Here's both what we know and what we still need to find out.

When two particles are entangled in the quantum mechanical sense, it's as though some sort of hidden, invisible connection exists between them. Many have conjectured that this connection persists even across the event horizon of a black hole, and that whatever information goes into forming a black hole will eventually emerge as the black hole evaporates.

Information

When a physicist talks about "information," they don't necessarily mean what we conventionally think of as information: a string of letters, numbers, symbols, or anything else that can be encoded with bits like 0s or 1s. Conventionally, this is often described as the number of yes/no questions that must be answered to fully specify the properties of your physical system, although even that description has limitations. These are all certainly examples of information, but those examples don't encompass all the various types of information that exist. Information can also include:

That last one is tricky, because entropy, an inherently thermodynamical quantity, is very often misunderstood. You'll often hear statements like "entropy is a measure of disorder" or "entropy always increases for any system," and while those things are kind of true, it's possible to make very ordered high-entropy systems and to decrease a system's entropy through the input of an external energy source.

As an alternative, consider this: what entropy actually measures is the number of possible arrangements of the (fully quantum) state of your system.

A system set up in the initial conditions on the left and allowed to evolve will have less entropy if the door remains closed (left) than if the door is opened (right). If the particles are allowed to mix, there are more ways to arrange twice as many particles at the same equilibrium temperature than there are to arrange half of those particles, each, at two different temperatures, resulting in a much greater entropy for the system at right than the one at left.

A classic example is to consider two systems:

Both systems have the same number of particles and the same total energy, but wildly different entropies from one another. The second system has a much greater amount of entropy, as there are many more ways to distribute energy among all of the particles in your system to achieve the desired configuration than there are for the first system; the number of possible arrangements of the fully quantum state of your full system is much greater for the second system than the first.

Because there is a greater number of possible arrangements, you have to provide a greater amount of information, and therefore answer a greater number of yes/no questions, to fully describe the system with the greater amount of entropy. Information and entropy aren't identical, but they are proportional: a greater entropy for your system means it requires more information to fully describe it.
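A toy calculation, assuming the two-chamber gas picture described above, makes that proportionality explicit: the information needed to single out one arrangement grows as the logarithm of the number of possible arrangements.

```python
# Counting arrangements for N distinguishable particles in two chambers
# (illustrative sketch; the numbers and function names are ours, not the article's).
import math

def microstates(N: int, n_left: int) -> int:
    """Number of ways to place n_left of N particles in the left chamber."""
    return math.comb(N, n_left)

def info_bits(omega: int) -> float:
    """Yes/no questions needed to pin down one arrangement among omega."""
    return math.log2(omega)

N = 100
door_closed = microstates(N, N)                           # all confined: 1 way
door_open = sum(microstates(N, n) for n in range(N + 1))  # any split: 2**N ways

print(info_bits(door_closed))  # 0.0 bits: the state is fully specified
print(info_bits(door_open))    # 100.0 bits: one question per particle
```

Multiplying this count by Boltzmann's constant (and switching to natural logarithms) turns it into the thermodynamic entropy, S = k_B ln Ω.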

A wine glass, when vibrated at the right frequency, will shatter. This is a process that dramatically increases the entropy of the system, and is thermodynamically favorable. The reverse process, of shards of glass reassembling themselves into a whole, uncracked glass, is so unlikely that it never occurs spontaneously in practice. However, if the motion of the individual shards, as they fly apart, were exactly reversed, they would indeed fly back together and, at least for an instant, successfully reassemble the wine glass. Time reversal symmetry is exact in Newtonian physics.

Information and black holes

If you take a book and burn it, the book's information doesn't get lost or destroyed, but merely scrambled. In principle, although maybe not in practice just yet, you could trace each and every particle of paper-and-ink that went into the fire, determine where they went, and from the ash, soot, chemicals, and invisible gases they produced, keep track of every character on every page in that book. In principle, you could look at that final system of the completely burned book and reconstruct the complete information that was in the book before you burned it.

You can do this with the remnants of a shattered glass, reconstructing what the original, unbroken structure looked like. You can do this with a scrambled-and-cooked egg, reconstructing what the uncooked, unscrambled egg was like. As long as the fundamental particles that the original system was made out of were preserved, no matter what interactions they underwent in the meanwhile, that original information about the initial state of the system would be preserved as well.

But with black holes, that absolutely isn't the case any longer. In General Relativity, black holes don't have any memory of the types of particles (or the properties of those particles) that went into creating or growing the black hole. The only measurable properties a black hole can possess are mass, electric charge, and angular momentum.

One of the most important contributions of Roger Penrose to black hole physics is the demonstration of how a realistic object in our Universe, such as a star (or any collection of matter), can form an event horizon and how all the matter bound to it will inevitably encounter the central singularity. Once an event horizon forms, the development of a central singularity is not only inevitable, it's extremely rapid.

In the early 1970s, this puzzle was considered by physicist Jacob Bekenstein, who recognized why this was such a problem. Whatever particles go into forming a black hole have their own properties, configuration, and amount of entropy (and information) encoded within them. According to the second law of thermodynamics, entropy can never decrease for a closed system; it can only increase or remain the same, unless some external source of energy is inputted to decrease that entropy. (And even then, the total entropy of the original system plus the external source, where the external source is where that inputted energy comes from, will continue to increase.)

But in pure General Relativity, black holes have zero entropy, and that definition simply won't work. From the perspective of an external observer, it's quantum particles that go into the creation of a black hole, and as the black hole gets created and grows, the surface area of its event horizon increases. As the mass goes up, the surface area goes up, and as more particles pour in, the entropy must rise as well.

It was Bekenstein who first recognized that the information encoded by the infalling particles would, from an external observer's perspective, appear to get smeared out over the surface of the event horizon, enabling a definition of entropy proportional to the surface area of the black hole's event horizon. Today, this is known as the Bekenstein-Hawking entropy: the entropy of a black hole.
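That proportionality can be made concrete. In a hedged sketch (my own numbers, using standard SI constants, not figures from the article), the Bekenstein-Hawking entropy in units of Boltzmann's constant is the horizon area divided by four Planck areas, S / k_B = A c^3 / (4 G hbar):

```python
# Estimate the Bekenstein-Hawking entropy of a Schwarzschild black hole.
import math

G    = 6.674e-11      # m^3 kg^-1 s^-2
c    = 2.998e8        # m / s
hbar = 1.055e-34      # J s
M_sun = 1.989e30      # kg

def bh_entropy_in_kB(mass_kg):
    r_s  = 2 * G * mass_kg / c**2          # Schwarzschild radius
    area = 4 * math.pi * r_s**2            # event-horizon area
    return area * c**3 / (4 * G * hbar)    # entropy in units of k_B

print(f"S(1 solar mass) ~ {bh_entropy_in_kB(M_sun):.2e} k_B")  # ~1e77 k_B
```

An entropy of roughly 10^77 k_B for a single solar-mass black hole dwarfs the entropy of the star it formed from, which is why Bekenstein's proposal resolved the thermodynamic puzzle so neatly.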

Encoded on the surface of the black hole can be bits of information, proportional to the event horizon's surface area. When the black hole decays, it decays to a state of thermal radiation. Whether that information survives and is encoded in the radiation or not, and if so, how, is not a question that our current theories can provide the answer to.

Will that information get destroyed?

This definition was very exciting, but the notion that we had made sense of the Universe of entropy, information, and black holes was extremely short-lived. In 1974, just two years after Bekenstein's earliest work on the topic, Stephen Hawking came along and not only had a spectacular realization, but performed a tremendous calculation to go with it.

His realization was that the standard way of performing quantum field theory calculations made an assumption: that space would, on tiny quantum scales, be treated as though it were flat, unaffected by the General Relativistic curvature of space. However, in the vicinity of a black hole, this wasn't just a bad approximation, it was a worse approximation than it would be under any other conditions that occurred within our physical Universe.

Instead, Hawking recognized, the calculation needed to be done in a background of curved space, where the background spatial curvature was given by Einstein's equations and the properties of the black hole in question. In 1974, Hawking calculated the simplest case, that of a black hole with mass only (without electric charge or angular momentum), and recognized that the state of the quantum vacuum, or empty space itself, was fundamentally different in curved space, near the black hole's event horizon, than the state of the quantum vacuum far away from the black hole, where space is flat.

In the far future, there will be no more matter around black holes, but instead their emitted energy will be dominated by Hawking radiation, which will cause the size of the event horizon to shrink. The transition from growing to decaying black holes will occur whenever the accretion rate drops below the mass loss rate due to Hawking radiation, an event estimated to occur some ~10^20 years in the future. How the information that went into making the black hole gets encoded into the outgoing radiation, or whether that's even the case, has not yet been determined.

That calculation revealed that black holes don't simply exist, stably, in this curved space, but that the differences in the vacuum near and far away from the event horizon lead to a continuous emission of blackbody radiation: now known as Hawking radiation. This radiation should:

This is remarkable, and is a purely quantum effect that we're now realizing may apply to systems other than black holes as well.

However, it raised a new, troubling issue. If the radiation that comes out of a black hole as it evaporates, this Hawking radiation, is purely blackbody in nature, it should have no preference for:

or any other metric needed to answer a yes/no question regarding the initial quantum state of the matter that went into creating the black hole in the first place. For the first time, it seems that we've encountered a physical system where knowing and measuring all of the information about its final state doesn't, even in principle, allow you to reconstruct the initial state.

As the Universe continues to age, the last sources of light will arise from the evaporation of black holes. While the least massive black holes will complete their evaporation after only 10^67 years or so, the most massive ones will persist for over a googol (10^100) years, making them the last cosmic objects to emit light, as far as we know.
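Those timescales follow from the fact that the Hawking evaporation time grows as the cube of the black hole's mass. A rough check (my own back-of-the-envelope calculation, assuming pure Hawking mass loss with no accretion) uses the standard estimate t ~ 5120 pi G^2 M^3 / (hbar c^4):

```python
# Hawking evaporation time for a Schwarzschild black hole, which scales as M^3.
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI constants
YEAR  = 3.156e7                              # seconds per year
M_sun = 1.989e30                             # kg

def evaporation_time_years(mass_kg):
    t_sec = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)
    return t_sec / YEAR

print(f"1 solar mass    : {evaporation_time_years(M_sun):.1e} yr")        # ~1e67 yr
print(f"1e11 solar mass : {evaporation_time_years(1e11 * M_sun):.1e} yr") # over a googol
```

The two quoted figures fall out directly: a solar-mass hole lasts of order 10^67 years, and the most massive holes, at around 10^11 solar masses, survive past 10^100 years.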

The core of the black hole information paradox

So where, then, does the information go?

That's the puzzle: we believe that information cannot be destroyed, but if the black hole is evaporating into pure blackbody radiation, then all of the information that went into making the black hole has somehow disappeared.

The event horizon of a black hole is a spherical or spheroidal region from which nothing, not even light, can escape. But outside the event horizon, the black hole is predicted to emit radiation. Hawking's 1974 work was the first to demonstrate this, and it was arguably his greatest scientific achievement. A new study now suggests that Hawking radiation may even be emitted in the absence of black holes, with profound implications for all stars and stellar remnants in our Universe.

The truth is that, despite many declarations over the years that the black hole information paradox has been resolved, nobody knows. Nobody knows whether the information is preserved or destroyed, whether the answer depends on what occurs in a black hole's interior, or whether it can be completely described from an outside observer's perspective.


We have mathematical correspondences between what happens on the inside and the outside of a black hole, including an underappreciated fact that takes us beyond the semiclassical approximation (quantum field theory calculations in a background of curved spacetime) used by Hawking: that when radiation comes out of a black hole, it should maintain a quantum mechanical entangled link to the black hole's interior.

We have devised methods that allow us to map the entropy of a black hole's interior onto the outgoing radiation that arises due to the Hawking mechanism, which suggests (but does not prove) that we may be approaching a mechanism for understanding how the information that went into creating a black hole gets encoded back into the Universe outside of the black hole's event horizon.

Unfortunately, we don't know how to calculate individual bits of information using any of these methods; we only know how to calculate overall amounts of information, as though we're putting them on a scale, seeing whether they balance or not. That's an important step, but it isn't enough to resolve this paradox.

In the final stages of a black hole's evaporation, quantum gravitational effects are likely to become important. It is conceivable that these effects could play an important role when it comes to encoding the information that went into creating the black hole in the first place.

Certainly, there are other ideas that play a major role. String-inspired ideas like complementarity and the AdS/CFT correspondence, as well as the notion of a firewall appearing partway through the evaporation process, are considered by many working on the paradox. Others suggest that there are correlations between every quantum of radiation emitted in the Hawking process (similar to entanglement), and that the full suite of those correlations must be understood in order to resolve the paradox. Still others have suggested altering the black hole's internal and external geometries over the course of the emission of Hawking radiation to attempt to preserve information, while others appeal to whatever strong quantum effects must be present at the interface of quantum physics and relativity, effects that become important in the final stages of black hole evaporation.

However, we still do not understand the most important aspects of the paradox: where the information from the particles that create the black hole goes, and how that information (assuming it does get out into the Universe again) actually gets encoded into the outgoing radiation that results when black holes evaporate. Despite whatever claims you may have heard, make no mistake: the black hole information paradox is still an unresolved paradox, and although it's an active area of research, no one can be sure what the solution will ultimately be, or what method will eventually lead us to it.

See original here:

The grand paradox at the heart of every black hole - Big Think

Harnessing the Void: MIT Controls Quantum Randomness For the … – SciTechDaily

By Massachusetts Institute of Technology, Institute for Soldier NanotechnologiesSeptember 9, 2023

MIT researchers have successfully controlled quantum randomness using vacuum fluctuations, introducing a breakthrough in probabilistic computing with potentially wide-ranging applications.

Groundbreaking study demonstrates control over quantum fluctuations, unlocking potential for probabilistic computing and ultra-precise field sensing.

A team of researchers from the Massachusetts Institute of Technology (MIT) has achieved a milestone in quantum technologies, demonstrating for the first time the control of quantum randomness.

The team of researchers focused on a unique feature of quantum physics known as vacuum fluctuations. You might think of a vacuum as a completely empty space without matter or light. However, in the quantum world, even this empty space experiences fluctuations or changes. Imagine a calm sea that suddenly gets waves; that's similar to what happens in a vacuum at the quantum level. Previously, these fluctuations have allowed scientists to generate random numbers. They're also responsible for many fascinating phenomena that quantum scientists have discovered over the past hundred years.

Experimental setup to generate tunable random numbers from vacuum fluctuations. Credit: Charles Roques-Carmes, Yannick Salamin

The findings were described recently in the journal Science, in a paper led by MIT postdoctoral associates Charles Roques-Carmes and Yannick Salamin; MIT professors Marin Soljačić and John Joannopoulos; and colleagues.

Conventionally, computers function in a deterministic manner, executing step-by-step instructions that follow a set of predefined rules and algorithms. In this paradigm, if you run the same operation multiple times, you always get the exact same outcome. This deterministic approach has powered our digital age, but it has its limitations, especially when it comes to simulating the physical world or optimizing complex systems, tasks that often involve vast amounts of uncertainty and randomness.

Artistic illustration of the generation of tunable random numbers from the quantum vacuum. Credit: Lei Chen

This is where the concept of probabilistic computing comes into play. Probabilistic computing systems leverage the intrinsic randomness of certain processes to perform computations. They don't just provide a single right answer, but rather a range of possible outcomes, each with its associated probability. This inherently makes them well-suited to simulate physical phenomena and tackle optimization problems where multiple solutions could exist and where exploration of various possibilities can lead to a better solution.

Dr. Charles Roques-Carmes, one of the lead authors of the work, operating the experimental system. Credit: Anthony Tulliani

However, the practical implementation of probabilistic computing has historically been hampered by a significant obstacle: the lack of control over the probability distributions associated with quantum randomness. The research conducted by the MIT team has now shed light on a possible solution.

Specifically, the researchers have shown that injecting a weak laser bias into an optical parametric oscillator, an optical system that naturally generates random numbers, can serve as a controllable source of biased quantum randomness.

Despite extensive study of these quantum systems, the influence of a very weak bias field was unexplored, remarks Charles Roques-Carmes, a researcher in the study. Our discovery of controllable quantum randomness not only allows us to revisit decades-old concepts in quantum optics but also opens up potential in probabilistic computing and ultra-precise field sensing.

The team has successfully exhibited the ability to manipulate the probabilities associated with the output states of an optical parametric oscillator, thereby creating the first-ever controllable photonic probabilistic bit (p-bit). Additionally, the system has shown sensitivity to the temporal oscillations of bias field pulses, even far below the single photon level.

Dr. Yannick Salamin, one of the lead authors of the work, operating the experimental system. Credit: Allyson Mac Basino

Yannick Salamin, another team member, remarks, Our photonic p-bit generation system currently allows for the production of 10,000 bits per second, each of which can follow an arbitrary binomial distribution. We expect that this technology will evolve in the next few years, leading to higher-rate photonic p-bits and a broader range of applications.
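As a toy model of what a tunable p-bit means (my own sketch, not the MIT experiment, which biases an optical parametric oscillator with a weak laser field), a software "bias" parameter plays the role of that field, steering the probability that each bit reads 1:

```python
# A software p-bit: a random bit whose probability of reading 1 is tunable.
import random

def p_bit(p_one, rng=random.random):
    """Sample one probabilistic bit that is 1 with probability p_one."""
    return 1 if rng() < p_one else 0

random.seed(0)
for bias in (0.1, 0.5, 0.9):
    ones = sum(p_bit(bias) for _ in range(10_000))   # 10,000 bits, as in the article
    print(f"bias={bias:.1f} -> fraction of ones: {ones / 10_000:.3f}")
```

The photonic version does this with genuine quantum randomness rather than a pseudorandom generator, which is the whole point: the distribution is physical, yet still controllable.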

Professor Marin Soljačić from MIT emphasizes the broader implications of the work: By making the vacuum fluctuations a controllable element, we are pushing the boundaries of what's possible in quantum-enhanced probabilistic computing. The prospect of simulating complex dynamics in areas such as combinatorial optimization and lattice quantum chromodynamics simulations is very exciting.

Reference: Biasing the quantum vacuum to control macroscopic probability distributions by Charles Roques-Carmes, Yannick Salamin, Jamison Sloan, Seou Choi, Gustavo Velez, Ethan Koskas, Nicholas Rivera, Steven E. Kooi, John D. Joannopoulos and Marin Soljačić, 13 July 2023, Science. DOI: 10.1126/science.adh4920

View post:

Harnessing the Void: MIT Controls Quantum Randomness For the ... - SciTechDaily

Toward a Complete Theory of Crystal Vibrations – Physics

September 11, 2023 • Physics 16, 151

A new set of equations captures the dynamical interplay of electrons and vibrations in crystals and forms a basis for computational studies.

J. Berges/University of Bremen


Although a crystal is a highly ordered structure, it is never at rest: its atoms are constantly vibrating about their equilibrium positions, even down to zero temperature. Such vibrations are called phonons, and their interaction with the electrons that hold the crystal together is partly responsible for the crystal's optical properties, its ability to conduct heat or electricity, and even its vanishing electrical resistance if it is superconducting. Predicting, or at least understanding, such properties requires an accurate description of the interplay of electrons and phonons. This task is formidable given that the electronic problem alone (assuming that the atomic nuclei stand still) is already challenging and lacks an exact solution. Now, based on a long series of earlier milestones, Gianluca Stefanucci of the Tor Vergata University of Rome and colleagues have made an important step toward a complete theory of electrons and phonons [1].

At a low level of theory, the electron-phonon problem is easily formulated. First, one considers an arrangement of massive point charges representing electrons and atomic nuclei. Second, one lets these charges evolve under Coulomb's law and the Schrödinger equation, possibly introducing some perturbation from time to time. The mathematical representation of the energy of such a system, consisting of kinetic and interaction terms, is the system's Hamiltonian. However, knowing the exact theory is not enough because the corresponding equations are only formally simple. In practice, they are far too complex (not least owing to the huge number of particles involved), so that approximations are needed. Hence, at a high level, a workable theory should provide the means to make reasonable approximations, yielding equations that can be solved on today's computers.

One way to reduce the complexity of the problem is to step back from the picture of individual particles in favor of one of effective quasiparticles specific to the system at hand. An early example of a quasiparticle in the literature is the phonon: instead of focusing on the atomic nuclei that could, in principle, be located anywhere in space, one considers their collective vibration about their positions in a predefined crystal structure. Scientists have studied such elastic waves for almost a century [2], often resorting to two famous approximations: the Born-Oppenheimer approximation, which assumes that the electrons respond instantaneously to displacements of the nuclei; and the harmonic approximation, which posits that this response results in restoring forces proportional to the displacements.
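The harmonic approximation can be made concrete with the textbook example of a one-dimensional monatomic chain (my illustration, not from the article): nearest-neighbor springs of stiffness K between atoms of mass m and spacing a yield the phonon dispersion omega(k) = 2*sqrt(K/m)*|sin(k*a/2)|:

```python
# Phonon dispersion of a 1D monatomic chain in the harmonic approximation.
import math

def phonon_omega(k, K=1.0, m=1.0, a=1.0):
    """Angular frequency of the lattice wave with wavevector k."""
    return 2 * math.sqrt(K / m) * abs(math.sin(k * a / 2))

# Long wavelengths (small k): omega ~ v * k with sound speed v = a * sqrt(K/m).
# The Brillouin-zone edge sits at k = pi/a, where omega is maximal.
for k in (0.01, 0.1, math.pi):
    print(f"k = {k:5.2f}   omega = {phonon_omega(k):.4f}")
```

The linear small-k branch is the sound wave; the flattening toward the zone edge is a purely lattice effect with no continuum analog, which is exactly why the phonon is a useful quasiparticle.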

Stefanucci and colleagues' work builds on studies made in the middle of the last century that analyzed the interaction between quasiparticles by borrowing tools from quantum field theory. In 1961, Gordon Baym published a corresponding theory of electrons and phonons, in which the phonon field assigns a displacement to points in space and time [3]. One of the aforementioned tools is the technique of Feynman diagrams, which represent interaction processes graphically (Fig. 1) and can be translated into mathematical formulas through simple rules. By combining such diagrams into sets of equations that recursively depend on each other, one can account for all possible processes occurring in physical reality. In 1965, Lars Hedin presented examples of such equations, which completely describe systems of interacting electrons [4]. In a 2017 review, Feliciano Giustino merged these approaches and coined the term Hedin-Baym equations in the context of state-of-the-art materials simulations, answering many, but not all, open questions [5].

Stefanucci and colleagues have addressed several of the remaining issues [1]. First, they imposed requirements on the electron-phonon Hamiltonian, avoiding the mistake of trying to solve a problem not properly formulated in the first place. They emphasized that the equilibrium state around which the theory is built is not known in advance, making setting up and evaluating the Hamiltonian an iterative procedure. They also stressed that this Hamiltonian cannot generally be written in terms of physical phonons, contrary to what is often supposed. Second, the team generalized Giustino's work [5] to systems driven out of equilibrium at any temperature, a key advance because this scenario reflects experimental and technological conditions. Mathematically, this generalization allows time to take on complex values. Third, the researchers carefully derived the corresponding rules for Feynman diagrams and provided the first complete set of diagrammatic Hedin-Baym equations. Such equations form the basis of systematic approximations, in which certain diagrams are neglected, and provide a criterion [3] for the resulting dynamics to respect fundamental conservation laws. Whereas the effects of electrons on phonons and vice versa are well studied separately [5], here it is crucial that both occur simultaneously.

Nowadays, parameter-free simulations of electrons and phonons rely heavily on so-called density-functional perturbation theory [6], which is based on the Born-Oppenheimer and harmonic approximations. By contrast, diagrammatic techniques are often, but not always [7], used in combination with parameterized model Hamiltonians. Efforts to bring both approaches together have led to so-called downfolding methods, which already exist for the electron-phonon problem [8]. The insights gained by Stefanucci and colleagues will certainly help to further bridge the different strategies. Moreover, the advancements beyond thermal equilibrium will be of utmost importance because such an extension is needed to explain the latest time-resolved spectroscopy experiments and to design better photovoltaics. Finally, given that the team's results apply to any fermion-boson system, such as an interacting light-matter system, many fields will benefit from this seminal work.

Jan Berges is a postdoctoral researcher at the University of Bremen in Germany. He is working on electron-phonon interactions at the interface of first-principles and model calculations, with a focus on computational implementation. Since the beginning of his doctoral studies, which he completed in 2020, he has been interested in many-body instabilities, such as charge-density waves and superconductivity, especially in two-dimensional materials.


Read the rest here:

Toward a Complete Theory of Crystal Vibrations - Physics

A potential fifth fundamental force – The Chronicle – Duke Chronicle

For over 50 years, physicists have generally agreed that the interactions between elementary particles are governed by four fundamental forces: the strong force, weak force, electromagnetic force and gravity. The former three forces are related through the Standard Model of particle physics, while gravity derives its explanation from Einstein's general relativity.

However, similar to the fashion industry, sometimes models can give inaccurate conclusions of what our perceived reality should look like. And like any Victoria's Secret catalog, we often don't get the full picture. Experimentalists have pointed out several phenomena that cannot be explained by our current understanding of physics, such as the nature of dark matter and dark energy, as well as the subatomic darties that may be going on inside black holes.

As a result, since as early as the 1980s, some scientists have postulated the existence of a fifth fundamental force to describe such anomalous behavior. This idea has been the subject of significant debate among the physics community ever since, with disparate studies proving and disproving the concept in an ever-perplexing will-they-wont-they battle featuring theory, experiment and a lot of lasers. Nevertheless, recent findings from Fermilab have shed light (or as they say in the particle physics community, shed photons) on the potential for a fifth force after all.

The evidence for a fifth force comes from the fact that the Standard Model can make precise predictions about particle behavior; if we run an experiment in which the particles behave in a way that doesn't align with those predictions, there must be some unknown additional force at play. This notion underlies the workflow of the Muon g-2 experiment (pronounced "gee minus two"), a collaboration of nearly 200 scientists from 33 institutions, based at Fermilab in Batavia, Illinois. The experiment analyzes the behavior of muons, the heavier cousin of the electron by a factor of over 200. Muons are a temporary byproduct of collisions between positively charged protons and the nuclei of air molecules, part of the ever-fluctuating quantum foam of appearing and disappearing particles in the boundless Diet Coke that is our universe.

Researchers at Fermilab shot a bunch of muons through a 50-foot diameter electromagnetic ring, where the muons sped around the ring about 1,000 times near the speed of light, like the coolest NASCAR race whose ionizing radiation would probably mutate your DNA into a sheep. The ring was filled with detectors that can make precise measurements of the muons behavior.

When elementary particles are exposed to a magnetic field, they act like mini-magnets themselves and exhibit an intrinsic property called spin, where they act as if they were a spinning object but aren't actually spinning. The muon's internal magnet wants to rotate itself to align along the magnetic field axis, like a compass aligning with the Earth's magnetic field. However, the muon's spin prevents this from happening. Instead, the muon starts to act like a spinning top, wobbling around a wide axis, a type of rotation called precession which is notoriously one of the most difficult things to calculate in any mechanics course. Physicists can precisely measure this wobbling and eventually calculate the gyromagnetic ratio of a muon, denoted by g. Theory predicts g for a muon to be very close to two, with quantum corrections shifting it by only about 0.1%; the experiment takes its name from that tiny difference, g minus two.
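The wobble rate can be estimated directly. A hedged numerical sketch (approximate values I supply here, not figures from the article): relative to its momentum, the muon's spin precesses at the anomalous frequency omega_a = a_mu * e * B / m_mu, where a_mu = (g - 2)/2 and B is the storage-ring field, about 1.45 T at Fermilab:

```python
# Estimate the anomalous precession frequency measured by Muon g-2.
import math

e    = 1.602e-19     # C, elementary charge
m_mu = 1.883e-28     # kg, muon mass
a_mu = 0.00116592    # anomalous magnetic moment, (g - 2) / 2
B    = 1.45          # T, approximate storage-ring field

omega_a = a_mu * e * B / m_mu   # rad/s
print(f"anomalous precession: {omega_a / (2 * math.pi) / 1e3:.0f} kHz")  # ~229 kHz
```

A few hundred kilohertz is slow enough to track with the ring's detectors, which is precisely what makes the experiment feasible: the tiny quantity g minus two is converted into a directly countable wobble.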

But those corrections are not quite what theory predicted. Last month, Fermilab reported that the measured value of g minus two, about 0.00233, differs from the Standard Model prediction by about five standard deviations. This means that there's only a 1-in-3.5 million chance of the result being a statistical fluke! Such a discrepancy provides staggering evidence that there might be some additional force at play. However, the researchers did note a high uncertainty in the theoretical calculation of g due to contributions from the four known fundamental forces at play in the experimental environment, slightly dampening the novelty of the result. Nevertheless, the staggering nature of these findings is enough to get theorists sharpening their pencils to figure out what's going on, and experimentalists sharpening... whatever tool you need to build a particle detector.
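The quoted odds are easy to check (my own calculation, not from the article): for a one-sided five-sigma deviation of a normally distributed quantity, the tail probability is p = erfc(5 / sqrt(2)) / 2.

```python
# Tail probability of a one-sided 5-sigma fluctuation of a normal variable.
import math

p = 0.5 * math.erfc(5 / math.sqrt(2))
print(f"p = {p:.2e}, odds of a fluke ~ 1 in {1 / p:,.0f}")
```

The result is roughly 2.9e-7, or about 1 in 3.5 million, matching the figure the experiment quotes.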

When we say that something is defying the laws of physics, what we're really saying is that something is defying our current understanding of how physics should work (or that we just know someone who's really good at calisthenics). Although particle physics seeks to describe the most fundamental aspects of the universe, the fundamentals of physics themselves are always subject to change in response to new information. This makes studying physics exciting and full of the feeling that you're always on some cutting edge, but also frustrating in that you're constantly bombarded with ideas that fundamentally conflict with one another, like relativity and quantum mechanics. Relativity produces conclusions that are definite, whereas quantum mechanics produces conclusions that are probabilistic. Relativity sees time as malleable, whereas quantum mechanics sees time as fixed. Relativity is continuous or smooth, whereas quantum mechanics is discrete or chunky.

None of this is to say that either theory is wrong and that we need to Marie Kondo an entire discipline from a few discrepancies. If I've learned anything from peanut butter, smooth and chunky ways of thinking are equally valid in their own contexts. What physicists are now hoping to do is to find the gaps in these models, and potentially find ways of better connecting the two schools of thought. As the Muon g-2 experiment continues to analyze data through 2025, we are in for a potential restandardization of the Standard Model itself and a mass reckoning for how well we know the things we are supposed to understand.

And although I'm hesitant to jump to conclusions from preliminary data, if a fifth force is confirmed, I believe it deserves a cool name for future textbooks. Like pickles. Or Jeff.

Monika Narain is a Trinity junior. Her column typically runs on alternating Thursdays.


See the article here:

A potential fifth fundamental force - The Chronicle - Duke Chronicle