Postdoctoral position in Quantum Physics job with UNIVERSITY OF HAMBURG | 273815 – Times Higher Education (THE)
The theory group on 'Fundamental Processes in Quantum Physics' at the Center for Optical Quantum Technologies of the University of Hamburg announces a postdoctoral position in theoretical physics.
The underlying project aims at developing novel quantum and hybrid algorithms for quantum simulation based on a Rydberg tweezer platform for ultracold atoms. Applications to relevant computational and optimization problems are envisaged.
Ideally a close interface with the underlying driven many-body Rydberg physics will be established. Research is performed in the above theory group with an immediate link to the corresponding experimental groups.
For more information and details, please contact Prof. Dr. Peter Schmelcher at the email address given below (see also https://www.physik.uni-hamburg.de/en/ilp/schmelcher.html).
We are looking for a strongly motivated and highly skilled postdoctoral researcher who shares the excitement of doing research in theoretical physics. The position will be available for a two-year period with a possible extension of up to five years.
To apply, please send a meaningful CV (including the names of potential references) with a cover letter to firstname.lastname@example.org.
Salaries are paid according to German standards. Recruitment will continue until the position is filled.
A solid is made of atoms that are, more or less, locked in an ordered structure. A liquid, on the other hand, is made of atoms that can flow freely around and past each other. But imagine atoms that stay unfrozen, like those in a liquid, but which are in a constantly changing magnetic mess.
What you have then is a never-before-seen state of matter, a state of quantum weirdness called a quantum spin liquid. Now, by carefully manipulating atoms, researchers have managed to create this state in the laboratory. The researchers published their work in the journal Science on December 2.
Scientists had discussed theories about spin liquids for years. "But we really got very interested in this when these theorists, here at Harvard, finally found a way to actually generate the quantum spin liquids," says Giulia Semeghini, a physicist and postdoc at Harvard University, who coordinated the research project and was one of the paper's authors.
Under extreme conditions not typically found on Earth, the rules of quantum mechanics can twist atoms into all sorts of exotica. Take, for instance, degenerate matter, found in the hearts of dead stars like white dwarfs or neutron stars, where extreme pressures cook atoms into slurries of subatomic particles. Or, for another, the Bose-Einstein condensate, in which multiple atoms at very low temperatures sort of merge together to act as one (its creation won the 2001 Nobel Prize in Physics).
The quantum spin liquid is the latest entry in that bestiary of cryptid states. Its atoms don't freeze into any sort of ordered state, and they're constantly in flux.
The spin in the name refers to a property inherent to each particle, either up or down, which gives rise to magnetic fields. In a normal magnet, all the spins point up or down in a careful order. In a quantum spin liquid, on the other hand, there's a third spin in the picture. This prevents coherent magnetic fields from forming.
This, combined with the esoteric rules of quantum mechanics, means that the spins are constantly in different positions at once. If you look at just a few particles, it's hard to tell whether you have a quantum liquid or, if you do, what properties it has.
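The role of that third spin can be made concrete with a toy model of geometric frustration (a generic illustration, not the Harvard setup): three Ising spins on a triangle with antiferromagnetic coupling can never make all three neighboring pairs anti-aligned, so no single ordered arrangement wins and the ground state is degenerate.

```python
from itertools import product

# Toy model of geometric frustration: three Ising spins (+1 or -1) on a
# triangle. With antiferromagnetic coupling J > 0, aligned neighbors cost
# energy; the energy is E = J * (s0*s1 + s1*s2 + s2*s0).
J = 1.0

def energy(spins):
    s0, s1, s2 = spins
    return J * (s0 * s1 + s1 * s2 + s2 * s0)

configs = list(product((-1, +1), repeat=3))
energies = [energy(c) for c in configs]
ground = min(energies)

# If all three bonds could be anti-aligned, the energy would be -3J.
# The best achievable is only -J: one bond is always "frustrated",
# and six of the eight configurations tie for the ground state.
print(ground)                  # -1.0
print(energies.count(ground))  # 6
```

That six-fold tie at the minimum, rather than one unique ordered state, is the seed of the disorder that the article describes.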
Quantum spin liquids were first theorized in 1973 by a physicist named Philip W. Anderson, and physicists have been trying to get their hands on this matter ever since. "Many different experiments tried to create and observe this type of state. But this has actually turned out to be very challenging," says Mikhail Lukin, a physicist at Harvard University and one of the paper's authors.
The researchers at Harvard had a new tool in their arsenal: what they call a programmable quantum simulator. Essentially, its a machine that allows them to play with individual atoms. Using specifically focused laser beams, researchers can shuffle atoms around a two-dimensional grid like magnets on a whiteboard.
"We can control the position of each atom individually," says Semeghini. "We can position them individually in any shape or form that we want."
Moreover, to actually determine if they had successfully created a quantum spin liquid, the researchers took advantage of something called quantum entanglement. They energized the atoms, which began to interact: changes in the property of one atom would be reflected in another. By looking at those connections, the scientists found the confirmation they needed.
All this might seem like creating abstract matter for abstract matter's sake, but that's part of the appeal. "We can kind of touch it, poke, play with it, even in some ways talk to this state, manipulate it, and make it do what we want," says Lukin. "That's what's really exciting."
But scientists do think quantum spin liquids have valuable applications, too. Just venture into the realms of quantum computers.
Quantum computers have the potential to far outstrip their traditional counterparts. Compared with computers today, quantum computers could create better simulations of systems such as molecules and complete certain calculations far more quickly.
But what scientists use as the building blocks of quantum computers can leave something to be desired. Those blocks, called qubits, are often things like individual particles or atomic nuclei, which are sensitive to the slightest bit of noise or temperature fluctuations. Quantum spin liquids, with information stored in how they're arranged, could be less finicky qubits.
If researchers were able to demonstrate that a quantum spin liquid could be used as a qubit, says Semeghini, it could lead to an entirely new sort of quantum computer.
Atos Confirms Role in Quantum Hybridization Technologies at Its 8th Quantum Advisory Board – HPCwire
PARIS, Dec. 3, 2021 – At the meeting of the 8th Atos Quantum Advisory Board, a group of international experts, mathematicians and physicists, authorities in their fields, Atos reaffirmed its position as a global leader in quantum computing technologies. In particular, the quantum hybridization axis (convergence of high-performance computing (HPC) and quantum computing) positions the company at the forefront of quantum research, converging its expertise. Atos has invested, along with partner start-ups Pasqal and IQM, in two major quantum hybridization projects in France and Germany.
Held at Atos' R&D center, dedicated to research in quantum computing and high-performance computing, in Clayes-sous-Bois, in the presence of Atos' next CEO, Rodolphe Belmer, and under the chairmanship of Pierre Barnabé, Chair of the Quantum Advisory Board, Interim co-CEO and Head of Big Data and Cybersecurity, this meeting of the Quantum Advisory Board was an opportunity to review Atos' recent work and to take stock of future prospects.
Artur Ekert, Professor of Quantum Physics at the Mathematical Institute, University of Oxford, Founding Director of the Centre for Quantum Technologies in Singapore and member of the Quantum Advisory Board, said: "We are truly impressed by the work and the progress that Atos has made over the past year. The company takes quantum computing seriously and it gives us great pleasure to see it becoming one of the key players in the field. It is a natural progression for Atos. As a world leader in High Performance Computing (HPC), Atos is in a unique position to combine its existing, extensive expertise in HPC with quantum technology and take both fields to new heights. We are confident that Atos will shape the quantum landscape in years to come, both with research and applications that have long-lasting impact."
In the field of quantum hybridization, Atos is the only player, and the company is already enabling several applications in the areas of chemistry, such as catalysis design for nitrogen fixation, and for the optimization of smart grids. Atos is also involved in two additional quantum hybridization projects, which are currently being launched:
The European HPC-QS (Quantum Simulation) project, which starts this December 2021, aims to build the first European hybrid supercomputer with an integrated quantum accelerator by the end of 2023. It is intended to be a first major brick of the French quantum plan. Atos is involved in this project alongside national partners including the CEA, GENCI, Pasqal and the Jülich Supercomputing Centre. Pasqal will provide its analog quantum accelerator and Atos, with its quantum simulator, the Quantum Learning Machine (QLM), will ensure the hybridization with the HPCs at the two datacenters at GENCI and Jülich.
The Q-EXA project, part of the German Government's quantum plan, will see a consortium of partners, including Atos, work together to integrate a German quantum computer into an HPC supercomputer for the first time. Atos' QLM will be instrumental in connecting the quantum computer, from start-up IQM (also part of the Atos Scaler program), to the Leibniz Supercomputing Centre (LRZ).
The European Organization for Nuclear Research (CERN), one of the world's largest and most respected research centres, based in Geneva, has recently acquired an Atos Quantum Learning Machine (QLM) appliance and joined the Atos User Club. The Atos QLM, delivered to CERN in October, will be made available to the CERN scientific community to support research activities in the framework of the CERN Quantum Technology Initiative (CERN QTI), thus accelerating the investigation of quantum advantage for high-energy physics (HEP) and beyond.
"Building on CERN's unique expertise and strong collaborative culture, co-development efforts are at the core of CERN QTI. As we explore the fast-evolving field of quantum technologies, access to the Atos Quantum Learning Machine and Atos' expertise can play an important role in our quantum developments roadmap in support of the high-energy physics community and beyond," says Alberto Di Meglio, Coordinator of the CERN Quantum Technology Initiative. "A dedicated training workshop is being organized with Atos to investigate the full functionality and potential of the quantum appliance, as well as its future application for some of the CERN QTI activities."
"Atos is the world leader in the convergence of supercomputing and quantum computing, as shown by these two major and strategic projects we are involved in, in France and Germany. At a time when the French government is expected to announce its plan for quantum computing, the durability of our Quantum Board, the quality of the work carried out and the concrete applications of this research in major projects reinforce this position," comments Pierre Barnabé, interim co-CEO and Head of Big Data and Cybersecurity at Atos.
The Quantum Advisory Board is made up of universally recognized quantum physicists.
As a result of "Atos Quantum", the company's ambitious program to anticipate the future of quantum computing and to be prepared for the opportunities and challenges that come with it, Atos was the first organization to offer a quantum noisy simulation module able to simulate real qubits, the Atos QLM, and to propose Q-score, the only universal metric for assessing quantum performance and superiority. Atos is also the first European patent holder in quantum computing.
Atos is a global leader in digital transformation with 107,000 employees and annual revenue of over €11 billion. European number one in cybersecurity, cloud and high performance computing, the Group provides tailored end-to-end solutions for all industries in 71 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is a SE (Societas Europaea), listed on Euronext Paris and included in the CAC 40 ESG and Next 20 Paris Stock indexes.
The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large, to live, work and develop sustainably, in a safe and secure information space.
Robert Crease Awarded 2021 Institute of Physics William Thomson, Lord Kelvin Medal and Prize | SBU News – Stony Brook News
Robert Crease, professor and chair of the Department of Philosophy in the College of Arts and Sciences, has been awarded the 2021 Institute of Physics (IOP) William Thomson, Lord Kelvin Medal and Prize. Crease has received this award for his 21 years writing Physics World's outstanding Critical Point column, devoted to describing key humanities concepts for scientists and explaining the significance of key scientific ideas for humanists.
Crease has written, co-written, translated, and edited nearly 20 books on the history and philosophy of science, several of which sprang from material in Critical Point. These books have been reviewed in places as diverse as The Economist, the London Review of Books, and the Wall Street Journal, and translated into a total of 10 languages. One book in particular, The Quantum Moment: How Planck, Bohr, Einstein, and Heisenberg Taught Us to Love Uncertainty, is about the real and fanciful impact that quantum mechanics has had on philosophy, culture, and life. The book stemmed from an innovative class that Crease and physicist Alfred Goldhaber co-taught at Stony Brook University.
"My sincere congratulations to Bob on his receipt of the William Thomson, Lord Kelvin Medal and Prize," said Nicole Sampson, dean of the College of Arts and Sciences and distinguished professor of chemistry. "His decades-long contribution to the sciences from a humanist's perspective, through his Critical Point column and numerous publications as well as inventive course offerings that blend the arts and sciences, is testament to the importance of interdisciplinary collaboration as we navigate our ever-changing world. I applaud Bob for his commitment to communicating ideas and knowledge from his own area of expertise for the benefit of other disciplines."
Crease is also co-editor-in-chief of Physics in Perspective, whose mission is to bridge the gulf between physicists and non-physicists through historical and philosophical studies that typically display the unpredictable as well as the cross-disciplinary interplay of observation, experiment, and theory that has occurred over extended periods of time in academic, governmental, and industrial settings and in allied disciplines such as astrophysics, chemical physics, and geophysics.
"I'm thrilled to get this award," said Crease. "The IOP, a century-old scientific organization, gave it to me for more than 20 years of writing the column Critical Point for Physics World. It's a good sign for the humanities, for the column explores the numerous intersections between the humanities and the sciences. In a science-dominated world, I think, the vitality of the humanities is threatened not by interacting too much with the sciences, but too little. By the way, Kelvin, the scientist for whom the award is named, occupied what at the time was called a Chair of Natural Philosophy."
Other books include Philosophy of Physics, an IOP ebook, and the final portion of J. Robert Oppenheimer: A Life, which was begun by physicist Abraham Pais and left incomplete with his death. Crease also edited Science Policy Up Close by physicist and U.S. Presidential Science Advisor John H. Marburger III. For these and other contributions to history and philosophy of science, Crease was elected a fellow of the APS and IOP.
Crease received his BA from Amherst College and his PhD from Columbia University. He has written more than 75 reviews, editorials, and other short pieces on philosophy, history, and science for academic journals and The New York Times, Wall Street Journal, Nature, Newsday, and more. Crease has also appeared on a range of radio programs, from the BBC to the offbeat Talk Nerdy.
"On behalf of the Institute of Physics, I warmly congratulate all of this year's award winners," said Professor Sheila Rowan, president of the Institute of Physics. "Each and every one of them has made a significant and positive impact in their profession, whether as a researcher, teacher, industrialist, technician or apprentice. Recent events have underlined the absolute necessity to encourage and reward our scientists and those who teach and encourage future generations. We rely on their dedication and innovation to improve many aspects of the lives of individuals and of our wider society."
The Institute of Physics is the professional body and learned society for physics, and the leading body for practising physicists, in the UK and Ireland. The IOP awards celebrate physicists at every stage of their career: from those just starting out, through to physicists at the peak of their careers, and those with a distinguished career behind them. They also recognize and celebrate companies which are successful in the application of physics and innovation, as well as employers who demonstrate their commitment and contribution to scientific and engineering apprenticeship schemes.
Hopes are high that quantum theory could yield revolutionary applications. Physicist Jasmin Meinecke is working on the physical foundations of particle systems and studying the central phenomenon of entanglement, which is still a mystery to science.
It's not long into the conversation with Jasmin Meinecke that the magic word comes up: entanglement. Without it, nothing seems possible in the quantum world. "It is a mysterious connection between things like photons or atoms," Meinecke explains. "It means that these particles cannot be regarded as separate, even if they are far apart." Albert Einstein once called this invisible force "spooky action at a distance." To this day, it is still not really understood.
Jasmin Meinecke is interested in such strange phenomena. She wants to learn more about a world that existed mainly only in theory for decades, partly because it is so difficult to observe. Quantum theory is considered one of the most powerful physical theories of the 20th century. Although the finer details of many of its peculiar features remain unknown and still puzzle scientists to this day, there has recently been a great deal of interest around potential applications: from ultra-powerful quantum computers that are superior to today's computers, to communication encrypted with quantum cryptography, making it tap-proof, to super-precise quantum sensors that could enable completely new kinds of measuring instruments to become reality. The current euphoria is driven by the fascinating possibilities offered by the quantum world itself: the theory promises dramatic improvements over the classical equipment we know and use today. "Right now we're learning more about what you can actually do with entangled particles," says Meinecke.
Entanglement in a waveguide: Jasmin Meinecke in her laboratory in Garching.
The young physicist is one of many scientists to feel torn between the conflicting priorities of ambitious basic research and the great hopes for practical applications in the near future in areas like quantum computing. Meinecke, who heads a junior research group at the Max Planck Institute of Quantum Optics (MPQ) in Garching, is keen to conduct basic research, and so she is studying photons (light particles) in that context. Yet at the same time, she also wants to keep an eye on possible later applications, for instance by understanding how quantum systems can be used to measure the physical and chemical properties of matter.
Jasmin Meinecke completed her doctorate in Bristol in 2015 before moving to LMU Munich. She is also one of the researchers in the cluster of excellence MCQST (the Munich Center for Quantum Science and Technology), and in January 2020 she was awarded a START Fellowship. This is a program designed to enable excellent postdocs to set up their own project within two years, with 300,000 euros in funding. Meinecke is currently working in LMU physicist Harald Weinfurter's group, where she is using this funding to study open quantum systems, which are systems that interact with their environment, something that scientists normally try to avoid at all costs. That is because such systems are very sensitive and are quickly disturbed by external influences; all it takes is a change in temperature or a vibration.
Comparatively simple experimental setup. (Photo: Christoph Hohmann / LMU)
Wave patterns in glass

To conduct her experiments, Meinecke uses photons that she observes under controlled conditions in integrated waveguides. Waveguides are small, unremarkable-looking glass plates with a kind of pattern inscribed inside them; the patterns are predetermined pathways for the light particles to travel along. Various entangled photons move towards and away from each other along these pathways.
Meinecke has sketched such wave patterns in her lab book and on whiteboards in the lab: curved paths, pretty to look at. Located inside the glass, they are not visible from the outside. The aim of these experiments is for the researcher to begin to understand, for example, how the quantum system and its environment exchange information and how that affects the quantum properties within it. "I'm fascinated by how and when quantum properties like coherence or entanglement are lost," says Meinecke. "Quantum properties don't just suddenly disappear. The question is, where does the information go?"
For some years now, researchers have had increasing success with comparatively simple experiments like these as a way to better understand some of the concepts of quantum mechanics, such as superposition and entanglement, and to be able to use them in technological applications. Entanglement is not just one of the most important properties of quantum particles; it is also the central resource for promising quantum technologies.
Meinecke's field, experimental quantum optics, is ultimately based on the ability of scientists to control light, matter, and how they interact to an ever better degree. It's a field in which the Munich cluster of excellence brings together a huge amount of expertise and offers broad-ranging opportunities for cooperation. Meinecke chose photons as an experimental platform partly, she says, because light particles are easy to control, easier than solid-state systems with atoms, which are often very sensitive to their environment and in many cases can only serve as a platform for experiments at extremely low temperatures close to absolute zero. Added to that, photons have many properties that will be needed for future applications. For example, they can be readily transmitted in fiber optic cables like those used in telecommunications, and they are also easy to generate.
Several of the properties of photons can in principle be used for entanglement: the polarization of the light particles, the color of the light (the wavelength, in other words), the energy, the spin. Part of the reason why Meinecke is so excited about the possibilities offered by photons is because, unlike her colleagues who work with ultracold atoms, her experiments can be realized in a small space and she is able to take advantage of advances in the miniaturization of optical components.
The secret life of photons

The setups in her lab are, in fact, astonishingly simple compared to other quantum experiments: a laser, a crystal, a small glass plate, and a little bit of electronics to analyze the experiments. All of it fits into a space the size of a kitchen table. The laser beams its light through a fiber optic cable into a nonlinear crystal, in which the laser's light particles can be used to generate photon pairs that are entangled. Whether an entangled pair is created from a photon coming out of the laser is pure chance.
That's the mysterious bit about the experiment. It is a process that has only a certain probability of working. Once the particles have become entangled, that's when things get exciting for the physicist. The particles can then move along different paths through the waveguides. But what do they really do? Which pathway do they choose? Do they remain entangled? These seemingly simple questions transport you deep into the jungle of the quantum world. Meinecke has a more fundamental way of putting all these questions: what actually happens to the particles when they are not being observed?
For the LMU physicist, this is not just a philosophical question. Observation is an important thing in the quantum world. Because on the one hand, you don't know anything about what's happening in the waveguide until you observe the particles. And on the other hand, the entanglement between the photons is usually lost very suddenly if you only measure one state of a photon, such as its energy.
The peculiarities of the quantum walk

So what can be done? "We're working on weak measurement too," says Meinecke. That means measuring without completely destroying the entanglement. Many research groups are interested in these same topics: fellow scientists at the MPQ recently developed a method for detecting the entanglement of two distant atomic qubits, the quantum stores of information.
The scientists' findings are not always easy for laypeople to grasp. A conversation with Jasmin Meinecke is like taking a high-speed ride through many exotic topics around quantum physics. She talks about Bell states and the peculiarities of the quantum walk, a kind of random walk that particles do, and of course the integrated waveguides that can be used for a wide variety of applications thanks to the complex structures inside the glass plates. "They can now provide the level of stability and miniaturization we need to build larger experimental arrays," says Meinecke.
High-precision work. (Photo: Christoph Hohmann / LMU)

Qubits for quantum computers

Integrated waveguides are an ideal tool for exploring fundamental questions of quantum physics. This is another reason why Meinecke is now keen to test the possibilities of integrated waveguide structures as quantum simulators in several further series of experiments. She says that entangled photons make a good study platform precisely because quantum systems are sensitive to the tiniest of disturbances in the environment. And that is what Meinecke wants to develop measurement techniques for.
She evidently prefers the small experiments to experiments using large-scale lasers, such as those still set up from years past in the Laboratory for Multiphoton Physics in Garching. One of these lasers, a powerful, high-performance machine, has been dubbed "Tsunami". Fitted with much more complex attachments, it can be used to generate up to six entangled photons; only a few laboratories are capable of generating more entangled pairs. The current record is twelve. The hope here is that these could be used as qubits for quantum computers. But the effort is immense, especially considering that useful applications would require many times more photons. "China still has research groups that keep reporting new records with more and more entangled particles," she says. This is not for her: "The amount of resources and lab time it consumes is enormous. And it doesn't tell me anything more about physics," she says. A scientist at a university needs to be generating more fundamental insights anyway.
Despite all the current interest in areas like quantum computing, the researcher tries to keep her focus on the fundamentals. "Quantum simulators and perhaps, later, quantum computers are not just faster computers that can solve problems like how to optimize delivery routes in logistics or work out the structure and effect of new molecules," says Meinecke. "It's not just a matter of switching over from something like diesel engines to electric powertrains. It's a long road. It's quite simply something totally different, a new way of computing and understanding the world."
Quantum computers could one day blow boring old classical computers out of the water, but so far their complexity limits their usefulness. Engineers at Stanford have now demonstrated a new relatively simple design for a quantum computer where a single atom is entangled with a series of photons to process and store information.
Quantum computers tap into the weird world of quantum physics to perform calculations far faster than traditional computers can handle. Where existing machines store and process information in bits, as either ones or zeroes, quantum computers use qubits, which can exist as one, zero, or a superposition of both one and zero at the same time. That means their power scales exponentially with each added qubit, allowing them to tackle problems beyond the reach of classical computers.
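That exponential scaling can be seen directly in a classical state-vector picture (a generic NumPy sketch for illustration, not the Stanford team's code): an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the amount of information needed to track the machine.

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

def register(n_qubits):
    """State vector of n qubits, each prepared in the |+> superposition."""
    state = np.array([1.0])
    for _ in range(n_qubits):
        state = np.kron(state, plus)  # tensor product adds one qubit
    return state

for n in (1, 2, 10, 20):
    # The vector has 2**n entries: one amplitude per classical bit string.
    print(n, register(n).size)
```

Already at 20 qubits, a classical description needs over a million amplitudes, which is exactly why simulating even modest quantum hardware strains ordinary computers.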
Of course, quantum computers bring their own challenges. For one, the quantum effects they run on are sensitive to disturbances like vibration or heat, so quantum computers need to be kept at temperatures approaching absolute zero. On top of that, their complexity scales with the computing power of the machine, so they become physically larger and more cumbersome as more processing power is added.
But the Stanford team says their new design is deceptively simple. It's a photonic circuit made using a few components that are already available: a fiber optic cable, a beam splitter, two optical switches and an optical cavity. And it can reduce the number of physical logic gates needed.
"Normally, if you wanted to build this type of quantum computer, you'd have to take potentially thousands of quantum emitters, make them all perfectly indistinguishable, and then integrate them into a giant photonic circuit," says Ben Bartlett, lead author of the study. "Whereas with this design, we only need a handful of relatively simple components, and the size of the machine doesn't increase with the size of the quantum program you want to run."
The new design is made up of two main parts: a ring that stores photons, and a scattering unit. The photons represent qubits, with the direction that they travel around the ring determining whether their value is a one or a zero (or both, if a photon travels in both directions at once, thanks to the quirks of quantum superposition).
To encode information on the photons, the system can direct them out of the ring into the scattering unit, where they enter a cavity containing a single atom. When the photon interacts with the atom, they become entangled, a quantum state where the two particles can no longer be described separately, and changes made to one will affect its partner, no matter how large a distance separates them.
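That "can no longer be described separately" property has a precise linear-algebra counterpart, which a small sketch can check (illustrative NumPy code, unrelated to the Stanford hardware): if you arrange a two-qubit state's four amplitudes as a 2x2 matrix, the state factors into two independent single-qubit states exactly when that matrix has rank 1. An entangled state, such as a Bell state, has rank 2.

```python
import numpy as np

def schmidt_rank(state):
    """Schmidt rank of a two-qubit state vector (length 4).
    Rank 1 means the state is a product of two single-qubit states;
    rank 2 means the two qubits are entangled."""
    amplitude_matrix = np.asarray(state).reshape(2, 2)
    singular_values = np.linalg.svd(amplitude_matrix, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

# Product state |0>(|0>+|1>)/sqrt(2): the qubits are independent.
product = np.array([1, 1, 0, 0]) / np.sqrt(2)
# Bell state (|00>+|11>)/sqrt(2): maximally entangled.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

print(schmidt_rank(product))  # 1
print(schmidt_rank(bell))     # 2
```

For the Bell state, neither qubit has a definite state of its own, matching the article's point that a change made to one particle is reflected in its partner regardless of distance.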
In practice, after the photon is returned to the storage ring, it can be written to by manipulating the atom with a laser. The team says that the one atom can be reset and reused, manipulating many different photons in the one ring. That means the quantum computer's power can be scaled up by adding more photons to the ring, rather than needing to add more rings and scattering units.
"By measuring the state of the atom, you can teleport operations onto the photons," says Bartlett. "So we only need the one controllable atomic qubit and we can use it as a proxy to indirectly manipulate all of the other photonic qubits."
Importantly, this system should be able to run a variety of quantum operations. The team says that different programs can be run on the same circuit, by writing new code to change how and when the atom and photons interact.
"For many photonic quantum computers, the gates are physical structures that photons pass through, so if you want to change the program that's running, it often involves physically reconfiguring the hardware," says Bartlett. "Whereas in this case, you don't need to change the hardware; you just need to give the machine a different set of instructions."
Better still, photonic quantum computer systems can operate at room temperature, removing the bulk added by the extreme cooling systems.
The research was published in the journal Optica.
Source: Stanford University
Spectral Lines From A Dying Nation: The Molecular Spectrometry Of Hertha Sponer – Women You Should Know
It is hard to imagine a time and place outside of Charles Dickens's Revolutionary France that more embodies the spirit of the Best of Times and Worst of Times than early 20th century Göttingen. At its best, in the 1920s, it boasted a heady selection of the planet's greatest mathematicians and physicists, clustered together to rewrite entirely the rules for atomic and molecular structure and behavior. At its worst, in the 1930s, it gave itself over to the anti-Semitic ravings of the rising Nazi party, and tore its magnificence to tatters in the name of party ideology and racial purity. The list of those who directly experienced those high highs and low lows reads like a roll call of the modern scientific pantheon: Max Born, James Franck, Emmy Noether, Maria Goeppert-Mayer, Richard Courant, Edward Teller, and Germany's second-ever woman to hold a professorship in physics, Hertha Sponer.
Historically, Sponer's story has tended to get lost in the deep shadow of her contemporary Lise Meitner, whose tale features many of the same contours as Sponer's, but with both the triumphs and tragedies amplified. We have decided which woman physicist victimized by the Nazi era and the systemic gender discrimination of early 20th century Germany we have chosen to remember, and it's Meitner, thank you very much. And that's rather too bad, because not only did Hertha Sponer do some gorgeous physics over the course of four personally challenging decades, but her story also highlights the rapid social change occurring in the Western world in the early 20th century, and how a professional and highly educated woman was able to successfully navigate that evolving culturescape.
Hertha Dorothea Elisabeth Sponer was born, the eldest of five children, on September 1, 1895, in the small east German (currently Polish) town of Neisse. Her family was of a Protestant background that highly valued education, and in her elementary school Sponer was exposed to an impressively broad array of subjects, including French, German, geography, natural science, writing, singing, arithmetic, religion, gymnastics, and needlework. Somewhere around the year 1906, her family moved to the town of Zittau, which still remains, if just barely, within the borders of Germany.
It was, for Sponer, an unfortunate move. The school in Zittau was not as advanced as Neisse's, and Sponer believed that, between the slow pace and unchallenging material, she would never learn enough at Zittau to be able to take the Abitur, the competitive exam for the German university system, where her ambitions lay. As such, she downgraded her expectations for a life of scientific research to that of a workaday private teacher, and attended a governess training school, which she graduated from in 1913.
She worked as a governess for two years, and in the middle of World War I took up a substitute teaching appointment. If you've ever acted as a substitute teacher, you won't be surprised by the fact that she did not remain long in that generally thankless profession and, in 1916, she made the decision to make up for lost time and attend a cram school for the Abitur. She worked hard, mastered the two-year condensed curriculum in one year, and took the exam in March of 1917, excelling in its physics portion, and becoming one of only 570 women that year to pass. Her parents had expected, after the Abitur, that she would take up a position as a teacher at a girls' secondary school, but after having read in a newspaper story about Sponer's performance how remarkable it was to have passed the exam with so few years of preparation, they wrote to her proudly giving their full permission for her to undertake whatever university studies she liked.
Sponer began her university career at Tübingen University in May of 1917, but was quickly frustrated by the lack of theoretical physics courses on offer, or indeed any physics beyond an experimental methods course and a lab section. Within a year, she was preparing to transfer, and by April of 1918 she was settling in at the University of Göttingen. Here, she came to the notice of Peter Debye, a future Nobel Prize winner, when she complained to one of his teaching assistants about the low difficulty level of the labs he had assigned, and the poor quality of the equipment allotted for the students' use. She was told to go to Debye's office and present her criticisms in person, which, amazingly, she did, and even more amazingly, which Debye responded positively to, setting her on a path of independent research which culminated in her writing her dissertation with him.
After only four semesters of study at Göttingen, Sponer took the oral examinations for her PhD in March of 1920. Ordinarily, she would have taken longer to learn more material, but Debye was planning on leaving the university shortly, and in his absence her examination would have fallen to Robert Pohl, who was notoriously against the higher education of women, and might therefore have done his level best to stack the examination against her. Ultimately, she passed her orals with an overall mark of Very Good.
Rather than remaining at Göttingen, where the focus was on theoretical physics, Sponer decided to find an institute where she could perfect her experimental technique, and Debye fatefully recommended that she go to Berlin and the Kaiser Wilhelm Institute, where Fritz Haber (the Nobel laureate we have met before in the tragic and sinister guise of the husband of Clara Immerwahr) was her supervisor and, more monumentally, where she met a young man named James Franck, thirteen years her senior, who would play a pivotal role in her life for the next four decades.
Before World War I, Franck had become famous for his work in verifying an early quantum prediction about how electrons of certain energies interact with atoms and now, after a brief stint as a soldier in the German army, he was returning to his research. Sponer's first paper from her time at the Institute was an extension of this work. Sponer and Franck worked well together, and in 1921, when Franck was made the head of an experimental physics institute in Göttingen created by his friend Max Born, he took Sponer along with him as his Assistentin.
A gregarious and inspiring teacher and leader, Franck created a legendary intellectual atmosphere in Göttingen that set aside the rigid formalities of the usual German university system. Here it was that the new quantum physics was applied to the interpretation of atomic and molecular structures, and where the experiments were performed that dramatically supported even the most extreme predictions of the new paradigm.
To begin with, Sponer continued her work in measuring how often electron collisions with atoms induce the electrons in those atoms to undergo a quantum leap. It was not until 1924 that she pushed off into the research that would cement her name in the global physics community, the study of molecular spectra.
So, what is a molecular spectrum? You might remember from your high school chemistry courses that, when a photon of light of a particular frequency is absorbed by a lone atom, it can take that energy and use it to promote one of its electrons to a higher energy level. Since most atoms have many different electrons, there are many different promotions that can take place, each of a very particular energy. Those energies are so specific that, just by looking at the energies absorbed by an atom, or those emitted by one, you can identify what the atom is. An atom's absorption spectrum, which is a tally of all the different energies absorbed by it, acts for scientists like a tell-tale fingerprint.
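The fingerprint idea can be made concrete with hydrogen, the simplest case. A short sketch (using the textbook Rydberg formula; values are approximate) reproduces hydrogen's famous visible Balmer emission lines:

```python
# Wavelengths of hydrogen's visible (Balmer) emission lines via the
# Rydberg formula: 1/lambda = R * (1/n1^2 - 1/n2^2), with n1 = 2.
R = 1.0973731568e7  # Rydberg constant, m^-1

def balmer_wavelength_nm(n2):
    """Wavelength (nm) of the transition n2 -> 2 in hydrogen."""
    inv_lambda = R * (1 / 2**2 - 1 / n2**2)
    return 1e9 / inv_lambda

lines = {n2: round(balmer_wavelength_nm(n2), 1) for n2 in (3, 4, 5, 6)}
print(lines)  # H-alpha ~656 nm (red), H-beta ~486 nm (blue-green), ...
```

No other element produces this exact set of wavelengths, which is why a spectrum works as a fingerprint.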
When you send photons of energy at something containing more than a single atom, however, the story of what is absorbed and what that energy is used for becomes much more complicated, and molecular spectra in the early 20th century were correspondingly much harder to interpret than atomic spectra. Not only can molecules (consisting of two or more atoms bonded together) use incoming energy to promote electrons to higher energy orbits, but they can also use it to change how their atoms vibrate relative to one another, AND how they rotate around different possible axes. The more atoms there are in a molecule, the more possible vibrational states there will be, and the more asymmetric the molecule is, the more likely it is that the molecule's rotational states will require distinctly different amounts of energy, all summing to a rich cornucopia of absorbed energies which it was Sponer's task to untangle.
What Sponer did as a physicist throughout the 1920s, and up through the 1960s, was to take the complicated patterns produced by gaseous molecules exposed to different energies of light, and interpret them employing the theoretical mathematical models emerging from the great modern physics capitals of Berlin, Göttingen, and Copenhagen. This branch of physics, called molecular spectroscopy, was among our most powerful techniques for determining the bond lengths between the different atoms of a molecule, and the angles between them. By knowing what different energies were required to induce different types of rotation, for example, you could determine the moments of inertia around those different axes, and once you knew that, coupled with the masses of the atoms involved, you could start mapping out the lengths of all the bonds and thus create geometrical pictures of increasingly complicated molecular structures.
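As a back-of-the-envelope illustration of that procedure (not Sponer's actual calculation), here is how a single measured rotational constant pins down the bond length of a diatomic molecule like carbon monoxide, via B = h / (8 pi^2 c I) and I = mu r^2:

```python
import math

h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
u = 1.66053907e-27    # atomic mass unit, kg

# Measured rotational constant of carbon monoxide (12C-16O), in cm^-1.
B_cm = 1.9225

# Moment of inertia from B = h / (8 pi^2 c I), with B converted to m^-1.
I = h / (8 * math.pi**2 * c * (B_cm * 100))

# Reduced mass of 12C-16O, then bond length from I = mu * r^2.
mu = (12.000 * 15.995) / (12.000 + 15.995) * u
r = math.sqrt(I / mu)
print(f"C-O bond length: {r * 1e12:.1f} pm")  # ~113 pm
```

One number read off a spectrum, plus the atomic masses, yields the geometry of the molecule.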
On the strength of her work in molecular spectroscopy, Sponer received her Habilitation in 1925, which brought with it the ability to lecture at the university level. In the 25 years from 1908 to 1933, of the 10,595 women in Germany who obtained their doctorate, only 54 received a Habilitation, less than half of whom went on to become professors. Predictably, Robert Pohl was vehemently against the appointment, but relented when Franck promised that, should he ever leave the university, Sponer would as well.
That seemed an unlikely possibility in that banner year of 1925, however, and soon after receiving her Habilitation, Sponer also received a Rockefeller grant to study at the University of California, Berkeley, for a year, where she wished to learn their highly successful techniques in the field of vacuum ultraviolet spectroscopy. She learned a great deal about science while there (and developed a technique with Raymond Birge for using measured vibrational frequencies to calculate the energy required to break a molecule apart, known today as the Birge-Sponer Method), and the development of her English language skills would prove to be deeply important to her in the lean years ahead, but it was also an unfortunate time to leave Göttingen, as 1925 saw an explosion of quantum mathematical progress there, including Heisenberg's landmark paper producing a matrix mechanical explanation of electron orbitals and quantum transitions. It was also the year that James Franck won the Nobel Prize for Physics.
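The Birge-Sponer method itself is simple enough to sketch. The spacings between adjacent vibrational levels shrink roughly linearly as the molecule approaches dissociation; fit a line to the spacings, extrapolate to zero, and the area under that line estimates the dissociation energy. The numbers below are invented for illustration, not data for any real molecule:

```python
import numpy as np

# Hypothetical vibrational level spacings Delta-G(v + 1/2), in cm^-1,
# decreasing roughly linearly as in a Morse-like potential.
v = np.arange(8)
dG = 2000.0 - 100.0 * v  # illustrative numbers only

# Birge-Sponer: fit a line to the spacings, extrapolate to Delta-G = 0;
# the dissociation energy D0 is the area under the line (a triangle).
slope, intercept = np.polyfit(v, dG, 1)
v_max = -intercept / slope            # vibrational level where spacing vanishes
D0 = 0.5 * intercept * v_max          # triangle area, in cm^-1
print(f"Estimated dissociation energy: ~{D0:.0f} cm^-1")
```

Real spacings curve away from a straight line near dissociation, so the method gives an upper-bound estimate rather than an exact value.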
The late 1920s were an era of mounting professional respect and international reputation for Sponer, and in 1930 she was commissioned by the prestigious German science publishing house Springer to produce a book on molecular structure. That work would have to wait, however, as in January of 1933 the Nazi party came to power in Germany, and on April 7 promulgated the Law for the Restoration of the Professional Civil Service whereby all Jews serving in universities were to be removed from their positions, with the exception of World War I veterans.
James Franck was half-Jewish, but as a war veteran he could have maintained his position at Göttingen. Courageously, however, he elected to renounce his position in protest of the treatment of Jewish intellectuals in Germany. The decision made Franck a hero of the international intellectual community, but left Sponer in an awkward position. Classified as an Aryan by the Nazi state, she could have kept her position, but with Franck gone, Pohl expected her to leave in accord with Franck's 1925 promise.
Even if Pohl had been willing to keep Sponer on at Göttingen, in May of 1933 the German state passed the Law for the Modification of Regulations Concerning Civil Servants, Their Salaries, and Benefits, which restricted women's ability to advance in their professions and downgraded their pay scale, sending a clear indication of things to come for women at work in government. Sponer saw that her position at Göttingen was fast becoming untenable, and started making plans for emigration, either to Norway or the United States. In 1934, she moved to Oslo, where she would spend the next two years trying to get the University of Oslo on some kind of footing in the world of experimental physics. Though not the dismal situation that Lise Meitner found herself in when she moved to the University of Stockholm in 1938, it was nonetheless soon clear to her that this was not the place to continue her life's work, and in 1936 she accepted an appointment at Duke University.
At Duke, Sponer had the funding and administrative support she needed to build a new department that married the best of European experimental techniques with the best of American educational principles. As the situation in Germany deteriorated for scientists, she became active in the émigré community, organizing resources to bring as many imperilled scientists and their families to the United States as possible, thereby rescuing some of the continent's greatest minds from the calamities to come. During World War II, she worked in the V-12 Program, training Navy and Marine officer candidates in accelerated education programs, while sporadically continuing her work on the spectroscopy of aromatic compounds (molecules composed of rings of atoms, like benzene).
In 1946, Hertha Sponer married James Franck, who had come to America the year before she had, and had worked on the Manhattan Project during the war on the condition that he would be able to directly advise the government against the bomb's actual use should they succeed in building it. His wife of 35 years had died in 1942, and in the ensuing years he sank deeper into ill-health and depression. The marriage of Sponer and Franck in 1946 was a tonic to both, pulling them from their respective loneliness and heartbreak over the state of their home country, and giving them both something to look forward to in the future.
Franck remained at the University of Chicago, while Sponer stayed at Duke, so the newly married pair saw each other primarily during vacations and scientific conferences, which made their time together all the more precious. Franck died in 1964, by which point Sponer was suffering from an Alzheimer's-like form of dementia which had to be increasingly compensated for by her friends and loyal students. At Franck's funeral, she was unable to deliver a brief eulogy, and one year later, in 1965, Hertha Sponer retired from Duke University after nearly three decades of service in which she built the experimental physics program up from the ground to become one of the nation's leading centers of molecular investigation. She was brought back to Germany by her family in 1966, declared incapable of independent self-care in 1967, and died in a sanatorium in Ilten on February 17, 1968.
FURTHER READING: Marie-Ann Maushart's 1997 Um nicht zu vergessen: Hertha Sponer. Ein Frauenleben für die Physik im 20. Jahrhundert was translated in 2011 by Ralph A. Morris and is the source to have for the career of Hertha Sponer; it also has a good deal of highly interesting information about women's evolving place in the German university system in the early 20th century.
Lead image credit: Hertha Sponer, in the photo Physikalisches Institut der Universität Göttingen anlässlich der Franckfeier 1923, by GFHund (Friedrich Hund), CC BY-SA 4.0, via Wikimedia Commons
There is a realm the laws of physics forbid us from accessing, below the resolving power of our most powerful microscopes and beyond the reach of our most sensitive telescopes. There's no telling what might exist there: perhaps entire universes.
Since the beginning of human inquiry, there have been limits to our observing abilities. Worldviews were restricted by the availability of tools and our own creativity. Over time, the size of our observable universe grew as our knowledge grew: we saw planets beyond Earth, stars beyond the Sun, and galaxies beyond our own, while we peered deeper into cells and atoms. And then, during the 20th century, mathematics emerged that can explain, shockingly well (and, to a point, predict), the world we live in. The theories of special and general relativity describe exactly the motion of the planets, stars, and galaxies. Quantum mechanics and the Standard Model of particle physics have worked wonders at clarifying what goes on inside of atoms.
However, with each of these successful theories come hard-and-fast limits to our observing abilities. Today, these limits seem to define true boundaries to our knowledge.
On the large end, there is a speed limit that caps what we can see. It hampers any hope for us to observe most of our universe first-hand.
The speed of light is approximately 300,000,000 meters per second (or 671,000,000 miles per hour, if that's how your brain works). The theory of special relativity, proposed by Albert Einstein in 1905, forbids anything from traveling faster than that. Massless things always travel at this speed in a vacuum. Accelerating massive objects to this speed essentially introduces a divide-by-zero in one of special relativity's equations; it would take infinite energy to accelerate something with mass to the speed of light.
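That divide-by-zero is easy to see numerically. A minimal sketch of the relativistic kinetic energy, (gamma - 1) m c^2, for a 1-kilogram mass shows the energy growing without bound as v approaches c:

```python
import math

c = 2.99792458e8  # speed of light, m/s
m = 1.0           # a 1 kg test mass

def kinetic_energy(v):
    """Relativistic kinetic energy (J) of mass m moving at speed v < c."""
    gamma = 1 / math.sqrt(1 - (v / c) ** 2)
    return (gamma - 1) * m * c**2

for frac in (0.5, 0.9, 0.99, 0.9999):
    print(f"{frac:>7} c : {kinetic_energy(frac * c):.3e} J")
# The energy diverges as v -> c; at v = c the formula divides by zero.
```

Each extra "9" on the speed costs vastly more energy than the last, which is the practical content of the light-speed limit.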
If, as a child, you hopped on a spaceship traveling out of the solar system at 99% the speed of light, you might be able to explore other parts of the galaxy before succumbing to age, but because time is relative, your friends and family would likely be long gone before you could report your observations back to Earth. But you'd still have your limits: the Milky Way galaxy is 105,700 light-years across, our neighboring galaxy Andromeda is 2.5 million light-years away, and the observable universe is around 93 billion light-years across. Any hope of exploring farther distances would require multigenerational missions or, if using a remote probe, accepting that you'll be dead and humanity may be very different by the time the probe's data returns to Earth.
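The "time is relative" point can be made quantitative with the standard time-dilation factor. This sketch (constant speed, ignoring acceleration and deceleration) compares the traveler's clock with Earth's for a 100-light-year trip at 99% of light speed:

```python
import math

def trip_times(distance_ly, v_frac):
    """Return (Earth years, traveler years) for a trip at constant speed.

    distance_ly: distance in light-years; v_frac: speed as a fraction of c.
    """
    earth_years = distance_ly / v_frac               # time on Earth's clock
    gamma = 1 / math.sqrt(1 - v_frac**2)             # Lorentz factor
    traveler_years = earth_years / gamma             # proper time aboard ship
    return earth_years, traveler_years

earth, traveler = trip_times(100, 0.99)  # 100 light-years at 99% of c
print(f"Earth clock: {earth:.1f} yr, traveler's clock: {traveler:.1f} yr")
```

The traveler ages only about 14 years, while more than a century passes at home, which is why any report back would arrive to a very different Earth.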
The speed of light is more than just a speed limit, however. Since the light we see requires travel time to arrive at Earth, we must contend with several horizons beyond which we can't interact, which exist due to Einstein's theory of general relativity. There is an event horizon, a moving boundary in space and time beyond which light and particles emitted now will never reach Earth, no matter how much time passes; those events we will never see. There is also the particle horizon, a boundary beyond which we cannot observe light arriving from the past; this defines the observable universe.
Theres a second kind of event horizon, one surrounding a black hole. Gravity is an effect caused by the presence of massive objects warping the shape of space, like a bowling ball on a trampoline. A massive-enough object might warp space such that no information can exit beyond a certain boundary.
These limits aren't static. "We will see further and further as time goes on, because the distance light travels outward gets bigger and bigger," said Tamara Davis, an astrophysics professor who studies cosmology at the University of Queensland. But this expanding perspective won't be permanent, since our universe is also expanding (and that expansion is accelerating). If you fast-forward 100 billion years into the future, all of the galaxies that we can currently see will be so far away, and accelerating so quickly away from us, that the light they emitted in the past will have faded from view. At that point, our observable universe would be just those nearby galaxies gravitationally bound to our own.
Another boundary lives on the other end of the scale. Zoom in between molecules, into the center of atoms, deep into their nuclei and into the quarks that make up their protons and neutrons. Here, another set of rules, mostly devised in the 20th century, governs how things work. In the rules of quantum mechanics, everything is quantized, meaning particles' properties (their energy or their location around an atomic nucleus, for example) can only take on distinct values, like steps on a ladder, rather than a continuum, like places on a slide. However, quantum mechanics also demonstrates that particles aren't just dots; they simultaneously act like waves, meaning that they can take on multiple values at the same time and experience a host of other wave-like effects, such as interference. Essentially, the quantum world is a noisy place, and our understanding of it is innately tied to probability and uncertainty.
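The "steps on a ladder" picture has a textbook toy model: a particle confined to a one-dimensional box can only hold certain energies, spaced as the squares of whole numbers. A quick sketch (an electron in a 1-nanometer box, a standard illustration rather than any real system):

```python
import math

# Energy levels of a particle in a 1-D box: E_n = n^2 h^2 / (8 m L^2).
# Only these discrete values are allowed; nothing in between.
h = 6.62607015e-34       # Planck constant, J s
m_e = 9.1093837015e-31   # electron mass, kg
L = 1e-9                 # box width: 1 nanometer

def energy_ev(n):
    """Energy (eV) of the n-th allowed level."""
    return n**2 * h**2 / (8 * m_e * L**2) / 1.602176634e-19

for n in (1, 2, 3):
    print(f"n={n}: {energy_ev(n):.3f} eV")  # discrete, n^2-spaced rungs
```

The levels climb as 1, 4, 9, ... times the ground-state energy; that fixed ladder, rather than a smooth ramp, is what "quantized" means.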
This quantum-ness means that if you try to peer too closely, you'll run into the observer effect: attempting to see things this small requires bouncing light off of them, and the energy from this interaction can fundamentally change that which you're attempting to observe.
But there's an even more fundamental limit to what we can see. Werner Heisenberg discovered that the wonkiness of quantum mechanics introduces a minimum accuracy with which you can measure certain pairs of mathematically related properties, such as a particle's position and momentum. The more accurately you can measure one, the less accurately you can measure the other. And finally, even attempting to measure just one of those properties becomes impossible at a small enough scale, called the Planck scale, which comes with a shortest length, around 10^-35 meters, and a shortest time interval, around 5 x 10^-44 seconds.
"You take the constant numbers that describe nature: the gravitational constant, the speed of light, and Planck's constant, and if I put these constants together, I get the Planck length," said James Beacham, a physicist at the ATLAS experiment of the Large Hadron Collider. "Mathematically, it's nothing special; I can write down a smaller number, like 10^-36 meters. But quantum mechanics says that if I have a prediction to my theory that says structure exists at a smaller scale, then quantum has built-in uncertainty for it. It's a built-in limit to our understanding of the universe; these are the smallest meaningful numbers that quantum mechanics allows us to define."
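Beacham's recipe can be carried out directly. Combining the gravitational constant, the speed of light, and the (reduced) Planck constant yields the Planck length and Planck time quoted above:

```python
import math

G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
hbar = 1.054571817e-34  # reduced Planck constant, J s

# Planck length and Planck time from dimensional combinations of G, c, hbar.
l_planck = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
t_planck = math.sqrt(hbar * G / c**5)  # ~5.4e-44 s

print(f"Planck length: {l_planck:.3e} m")
print(f"Planck time:   {t_planck:.3e} s")
```

These are the only length and time you can build from those three constants alone, which is why they fall out of "putting the constants together."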
This is assuming that quantum mechanics is the correct way to think about the universe, of course. But time and time again, experiments have demonstrated there's no reason to think otherwise.
These fundamental limits, large and small, present clear barriers to our knowledge. Our theories tell us that we will never directly observe what lies beyond these cosmic horizons or what structures exist smaller than the Planck scale. However, the answers to some of the grandest questions we ask ourselves might exist beyond those very walls. Why and how did the universe begin? What lies beyond our universe? Why do things look and act the way that they do? Why do things exist?
The unobservable and untestable exist beyond the scope of scientific inquiry. "All's well and good to write down the math and say you can explain the universe, but if you have no way of testing the hypothesis, then that's getting outside the realm of what we consider science," said Nathan Musoke, a computational cosmologist at the University of New Hampshire. Exploring the unanswerable belongs to philosophy or religion. It's possible, however, that science-derived answers to these questions exist as visible imprints on these horizons that the scientific method can uncover.
That imprinting is literal. Ralph Alpher and Robert Herman first predicted in 1948 that some light left over from an early epoch in the universe's history might still be observable here on Earth. Then, in 1964, Arno Penzias and Robert Wilson were working as radio astronomers at Bell Labs in New Jersey when they noticed a strange signal in their radio telescope. They went through every idea to figure out the source of the noise; perhaps it was background radiation from New York City, or even poop from pigeons nesting in the experiment? But they soon realized that the data matched Alpher and Herman's prediction.
Penzias and Wilson had spotted the microwave radiation from just 400,000 years after the Big Bang, called the cosmic microwave background, the oldest and most distant radiation observable to today's telescopes. During this era in the universe's history, electrons and nuclei combined into neutral atoms, allowing light to travel uninhibited through the previously opaque universe. This light, stretched out by the expanding universe, now appears as faint microwave radiation coming from all directions in the sky.
Astronomers' experiments since then, such as the Cosmic Background Explorer (COBE), the Wilkinson Microwave Anisotropy Probe (WMAP), and the Planck space observatory, have attempted to map this cosmic microwave background, revealing several key takeaways. First, the temperature of these microwaves is eerily uniform across the sky: around 2.725 degrees above absolute zero, the universe's minimum temperature. Second, despite its uniformity, there are small, direction-dependent temperature fluctuations: patches where the radiation is slightly warmer and patches where it's slightly cooler. These fluctuations are a remnant of the structure of the early universe before it became transparent, produced by sound waves pulsing through it and gravitational wells, revealing how the earliest structures may have formed.
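That 2.725-kelvin temperature also fixes where the background's spectrum peaks. A one-line application of Wien's displacement law shows why it is specifically a microwave background:

```python
# Wien's displacement law: the peak wavelength of blackbody radiation
# is inversely proportional to its temperature.
b = 2.897771955e-3  # Wien's displacement constant, m K
T_cmb = 2.725       # measured CMB temperature, K

peak = b / T_cmb
print(f"CMB peak wavelength: {peak * 1000:.2f} mm")  # ~1 mm: microwaves
```

A millimeter-scale peak wavelength sits squarely in the microwave band, which is why radio astronomers, not optical ones, stumbled onto it.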
At least one theory has allowed for a scientific approach to probing this structure, with hypotheses that have been tested and supported by further observations of these fluctuations. This theory is called inflation. Inflation posits that the observable universe as we see it today would have once been contained in a space smaller than any known particle. Then, it underwent a burst of unthinkable expansion lasting just a small fraction of a second, governed by a field with dynamics determined by quantum mechanics. This era magnified tiny quantum-scale fluctuations into wells of gravity that eventually governed the large-scale structure of the observable universe, with those wells written into the cosmic microwave background data. You can think of inflation as part of the bang in the Big Bang theory.
It's a nice thought, that we can pull knowledge from beyond the cosmic microwave background. But this knowledge leads to more questions. "I think there's a pretty broad consensus that inflation probably occurred," said Katie Mack, a theoretical astrophysicist at North Carolina State University. "There's very little consensus as to how or why it occurred, what caused it, or what physics it obeyed when it happened."
Some of these new questions may be unanswerable. "What happens at the very beginning, that information is obscured from us," said Mack. "I find it frustrating that we're always going to be lacking information. We can come up with models that explain what we see, and models that do better than others, but in terms of validating them, at some point we're going to have to just accept that there's some unknowability."
At the cosmic microwave background and beyond, the large and the small intersect; the early universe seems to reflect quantum behaviors. Similar conversations are happening on the other end of the size spectrum, as physicists attempt to reconcile the behavior of the universe on the largest scale with the rules of quantum mechanics. Black holes exist in this scientific space, where gravity and quantum physics must play together, and where physical descriptions of what's going on sit below the Planck scale.
Here, physicists are also working to devise a mathematical theory whose subject matter, while too small to observe directly, produces observable effects. Perhaps most famous among these ideas is string theory, which isn't really a theory but a mathematical framework based on the idea that fundamental particles like quarks and electrons aren't just specks but one-dimensional strings whose behavior governs those particles' properties. This theory attempts to explain the various forces of nature that particles experience, while gravity seems to be a natural result of thinking about the problem in this way. Like those studying any theory, string theorists hope that their framework will put forth testable predictions.
Finding ways to test these theories is a work in progress. "There's faith that one way or another we should be able to test these ideas," said David Gross, a professor at the Kavli Institute for Theoretical Physics and winner of the 2004 Nobel Prize in Physics. "It might be very indirect, but that's not something that's a pressing issue."
Searching for indirect ways to test string theory (and other theories of quantum gravity) is part of the search for the theory itself. Perhaps experiments producing small black holes could provide a laboratory to explore this domain, or perhaps string theory calculations will require particles that a particle accelerator could locate.
"At these small timescales, our notion of what space and time really is might break down in profound ways," said Gross. The way physicists formulate questions in general often assumes various givens, such as that spacetime exists as a smooth, continuous manifold, he said. "Those questions might be ill formulated." Often, very difficult problems in physics require profound jumps, revolutions, or different ways of thinking, and it's only afterward that we realize we were asking the question in the wrong way.
For example, some hope to know what happened at the beginning of the universe, and what happened before time began. "That, I believe, isn't the right way to ask the question," said Gross, as asking such a question might mean relying on an incorrect understanding of the nature of space and time. Not that we know the correct way, yet.
Walls that stop us from easily answering our deepest questions about the universe? Well, they don't feel very nice to think about. But offering some comfort is the fact that 93 billion light-years is very big, and 10^-35 meters is very small. Between the largest and the smallest is a staggering space full of things we don't, but theoretically can, know.
Today's best telescopes can look far into the distance (and remember, looking into the distance also means looking back in time). Hubble can see objects as they were just a few hundred million years after the Big Bang, and its successor, the Webb Space Telescope, will look farther still, perhaps to just 150 million years after the Big Bang. Existing galactic surveys like the Sloan Digital Sky Survey and the Dark Energy Survey have collected data on millions of galaxies, the latter having recently released a 3D map of the universe with 300 million galaxies. The upcoming Vera C. Rubin Observatory in Chile will survey up to 10 billion galaxies across the sky.
"From an astronomy point of view, we have so much data that we don't have enough people to analyze it," said Mikhail Ivanov, a NASA Einstein Fellow at the Institute for Advanced Study. "There are so many things we don't understand in astrophysics, and we're overwhelmed with data. To question whether we're hitting a limit is like trolling." Even then, these mind-boggling surveys represent only a small fraction of the universe's estimated 200 billion galaxies that future telescopes might be able to map.
But as scientists attempt to play in these theoretically accessible spaces, some wonder whether the true limit is us.
Today, particle physics seems to be up against an issue of its own: despite plenty of outstanding mysteries in need of answers, the physicists at the Large Hadron Collider have found no new fundamental particles since the Higgs boson in 2012. This lack of discovery has physicists scratching their heads; it's ruled out the simplest versions of some theories that had been guiding particle physicists previously, with few obvious signposts about where to look next (though there are some!).
Beacham thinks that these problems could be solved by searching for phenomena all the way down to the Planck scale. A vast, unknown chasm exists between the scale of today's particle physics experiments and the Planck scale, and there's no guarantee of anything new to discover in that space. Exploring the entirety of that chasm would take an immense amount of energy and increasingly powerful colliders. Quantum mechanics says that higher-momentum particles have smaller wavelengths, and thus are needed to probe smaller length scales. However, actually exploring the Planck scale may require a particle accelerator big enough to circle the Sun, maybe even one the size of the solar system.
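A rough, order-of-magnitude sketch (illustrative code, not from the article) shows why Planck-scale probes demand such extreme machines: by the de Broglie relation λ = h/p, resolving a given length requires a momentum of at least h divided by that length, and the corresponding energy lands near the Planck energy, far beyond any collider built or planned.

```python
# Order-of-magnitude estimate (illustrative): the energy a probe needs for
# its de Broglie wavelength to match the Planck length.
h = 6.626e-34            # Planck constant, J*s
c = 2.998e8              # speed of light, m/s
planck_length = 1.6e-35  # approximate Planck length, m

p = h / planck_length          # de Broglie: momentum needed to resolve that length
E_joules = p * c               # ultra-relativistic energy estimate, E ~ p * c
E_GeV = E_joules / 1.602e-10   # convert joules to GeV (1 GeV = 1.602e-10 J)

print(f"required energy ~ {E_GeV:.1e} GeV")  # roughly 1e19-1e20 GeV; the LHC reaches ~1e4 GeV
```

The result, around 10^19 GeV, is some fifteen orders of magnitude beyond the LHC, which is the chasm the article describes.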
"Maybe it's daunting to think of such a collider, but it's inspiration for a way to get to the scale, and inspiration to figure out how to get there with a smaller device," he said. Beacham views it as particle physicists' duty to explore whether any new physical phenomena might exist all the way down to the Planck scale, even if there currently isn't evidence there's anything to find. "We need to think about going as high in energy as we can, building larger and larger colliders until we hit the limit. We don't get to choose what the discoveries are," he said.
Or perhaps we can use artificial intelligence to create models that perfectly explain the behavior of our universe. Zooming back out, Fermilab and University of Chicago scientist Brian Nord has dreamed up a system that could model the universe with the help of artificial intelligence, constantly and automatically updating its mathematical model with new observations. Such a model could grow arbitrarily close to the model that actually describes our universe; it could generate a theory of everything. But, as with other AI algorithms, it would be a black box to humans.
Such issues are already cropping up in fields where we use software-based tools to make accurate models, explained Taner Edis, physicist at Truman State University. Some software tools, machine learning models for example, may accurately describe the world we live in but are too complex for any individual to completely understand. In other words, we know that these tools work, but not necessarily how. Maybe AI will take us farther down this path, where the knowledge we create will exist spread over a civilization and its technology, owned in bits and pieces by humanity and the algorithms we create to understand the universe. Together, we'd have generated a complete picture, but one inaccessible to any single person.
Finally, these sorts of models may provide supreme predictive power, but they wouldn't necessarily offer comfortable answers to questions about why things work the way they do. Perhaps this sets up a dichotomy between what scientists can do (make predictions based on initial conditions) and what they hope these predictions will allow them to do (lead us to a better understanding of the universe we live in).
"I have a hunch that we'll be able to effectively achieve full knowledge of the universe, but what form will it come in?" said Nord. "Will we be able to fully understand that knowledge, or will it be used merely as a tool to make predictions without caring about the meaning?"
Thinking realistically, today's physicists are forced to think about what society cares about most and whether our systems and funding models permit us to fully examine what we can explore, before we can begin to worry about what we can't. U.S. legislators often discuss basic science research with the language of applied science or positive outcomes; the Department of Energy funds much particle physics research. The National Science Foundation's mission is "to promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense; and for other purposes."
Physicists hoping to receive funding must compete for resources in order to do research that promotes the missions of these organizations. While many labs, such as CERN, exist solely to fund peaceful research with no military applications, most still brag that indirectly solving bigger problems will lead to new tech: the internet, or advances in data handling and AI, for example. Private funding organizations exist, but they, too, are either limited in their resources, driven by a mission, or both.
But what if answering these deep questions requires thinking that isn't driven by anything? How can scientists convince funders that we should build experiments, not with the hope of producing new technology or advancing society, but merely with the hope of answering deep questions? Echoing a sentiment expressed in an article by Vanessa A. Bee, what if our systems today (sorry, folks, I'm talking about capitalism) are actually stifling innovation in favor of producing some short-term gain? What if answering these questions would require social policy and international collaboration deemed unacceptable by governments?
If this is indeed the world we live in, then the unknowable barrier is far closer than the limits of light speed and the Planck scale. It would exist because we collectively (the governments we vote for, the institutions they fund) don't deem answering those questions important enough to devote resources to.
Prior to the 1500s, the universe was simply Earth; the Sun, Moon, and stars were small satellites that orbited us. By 1543, Nicolaus Copernicus had proposed a heliocentric model of the universe: the Sun sat at the center, and Earth orbited it. It was only in the 1920s that Edwin Hubble calculated the distance of Andromeda and proved the Milky Way wasn't the whole universe; it was just one of many, many galaxies in a larger universe. Scientists discovered most of the particles that make up today's Standard Model of particle physics in the second half of the 20th century. Sure, relativity and quantum theory seem to have established the size of the sandbox we have to play in, but precedent would suggest there's more to the sandbox, or even beyond the sandbox, that we haven't considered. But then, maybe there isn't.
There are things that we'll never know, but that's not the right way to think about scientific discovery. We won't know unless we attempt to know, by asking questions, crafting hypotheses, and testing them with experiments. The vast unknown, both leading up to and beyond our boundaries, presents limitless opportunities to ask questions, uncover more knowledge, and even render previous limits obsolete. We cannot truly know the unknowable, then, since the unknowable is just what remains when we can no longer hypothesize and experiment. The unknowable isn't fact; it's something we decide.
Quantum computing could be the solution to challenges faced by quantum physicists. It has the power to change our fundamental understanding of reality, and it may soon become practical.
Quantum computing is an area of research in which engineers, scientists, and technologists are trying to build a computer where information is represented at the quantum level.
Quantum computers would be able to solve problems that are not possible with classical computers, or solve them much more quickly. Today's silicon-based computer chips use binary digits (bits) with values of either 0 or 1 for storing information. Each bit occupies exactly one of those two states at any given time and can't represent both 0 and 1 simultaneously. Qubits, by contrast, can hold a blend of both values at once, thanks to the quantum-mechanical principle called superposition.
Classical Computers vs. Quantum Computers
To understand how quantum computing works, it's important to know the difference between the old (classical) way of computing and the new (quantum) way.
On classical computers, information is encoded into binary digits called "bits." These bits can be in one of two states: 0 or 1. A qubit also has two basis states, 0 and 1, but it can occupy both at once in a superposition. This means it can encode much more information than a binary digit. The physical world behaves according to quantum mechanics, so theoretically, if we want to simulate physical phenomena on a computer, we should use quantum mechanical principles as well.
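As a minimal sketch (illustrative code, not from the article), a single qubit can be written down classically as a two-component state vector whose squared amplitudes give the measurement probabilities; the names `ket0`, `plus`, and `measure_probs` are just for this example:

```python
import math

# A qubit modeled as a 2-component state vector [amplitude_of_0, amplitude_of_1].
ket0 = [1.0, 0.0]                  # definite |0>, like a classical bit set to 0
plus = [1 / math.sqrt(2),          # equal superposition (|0> + |1>) / sqrt(2):
        1 / math.sqrt(2)]          # "both values at once" until it is measured

def measure_probs(state):
    """Born rule: the probability of reading 0 or 1 is the squared amplitude."""
    return [abs(a) ** 2 for a in state]

print(measure_probs(ket0))   # always reads 0
print(measure_probs(plus))   # reads 0 or 1 with roughly 50/50 odds
```

Measurement collapses the superposition to a single 0 or 1, which is why a qubit's extra information is not the same as simply storing two bits.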
Now that we have made the switching and memory units of computers, known as transistors, almost as small as an atom, we need to find an entirely new way of thinking about and building computers. Quantum computers are not intended to replace classical computers; they are expected to be a different tool we will use to solve complex problems that are beyond the capabilities of a classical computer. A problem that requires more power and time than today's computers can accommodate is called an intractable problem. These are the problems that quantum computers are predicted to solve.
When you enter the world of atomic and subatomic particles, things begin to behave in unexpected ways. It's this behavior that quantum computers take advantage of. By entering this quantum realm of computing, where classical intuitions about physics no longer apply, we may be able to create processors that are significantly faster than the ones we use today. Sounds fantastic, but the challenge is that quantum computing is also incredibly complex.
That's precisely why the computer industry is racing to make quantum computers work on a commercial scale.
Quantum computers are different from traditional computers because they use quantum bits (qubits) instead of binary bits. One qubit can be in two states at the same time, which opens up problems that current computers can't handle. Moreover, quantum computing can tackle highly complex problems by using "parallelism" to process many calculations at the same time. The downside to this technology is that it needs an enormous amount of energy for operations to work properly. For instance, IBM has said that qubits need about 100 milliwatts of power per operation, whereas regular processors need about 10 kilowatts.
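One common, informal way to see where that "parallelism" intuition comes from (sketched here as illustrative code, not anything from the article): describing n qubits classically takes 2^n complex amplitudes, so the state space a quantum computer manipulates grows exponentially with the size of the machine.

```python
# Exponential growth of the quantum state space: n qubits are described by
# 2**n complex amplitudes, all of which a classical simulator must store.
for n in (1, 10, 50, 300):
    print(f"{n:>3} qubits -> {2 ** n:.3e} amplitudes")
```

At 50 qubits the amplitude count already exceeds 10^15, and at 300 it exceeds the estimated number of atoms in the observable universe, which is why exact classical simulation becomes impossible long before quantum hardware gets large.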
The Quantum Revolution
The practical uses of quantum computers are still being researched and tested. In the future, it is possible that quantum computers will be able to solve problems that have been impossible to solve before. For example, they have the potential to be used for modelling molecules or predicting how a molecule will behave under different conditions.
We should also remember that a quantum computer is not simply a faster regular computer - it's a more powerful one. "Running" a program on a quantum computer may take just as long as on a regular computer, but with much better results because of that increased power.
Quantum computers will allow for the storage and processing of data in ways that we cannot even comprehend today. They also offer more complex calculations than traditional computers and therefore can easily solve problems that would take years to solve on a traditional computer.
Some experts believe that they could be used to calculate complex formulas with no time limit, which will make them an invaluable tool in medical science, AI technologies, aeronautical engineering and so on. So far, quantum computing has been used to solve optimization problems, which are too complex for traditional computer models. It's also been used to study protein folding and drug interactions within the body.
Quantum computers are powerful computers that work on the principles of quantum mechanics. They use qubits, not bits, to represent data, and they can access potentially more than two values at the same time. Quantum computers may be able to break much of the encoding and encryption we have today, and quantum computing is already changing the world of cybersecurity. Quantum computers are capable of running sophisticated simulations in parallel, making them much faster than classical computers at certain tasks. The ability to run simulations in parallel means that quantum computers can quickly find solutions to difficult problems. Quantum computers will disrupt many industries, like finance, healthcare, and education.
While it's still unclear how big of an impact quantum computing will have on marketing in the future, there are already some significant uses happening now. One example is in ad targeting, where companies can analyze customer behaviour with astounding precision by processing large amounts of data.
Albert Einstein's handwritten notes on the theory of relativity fetched a record €11.6m (£9.7m) at an auction in Paris on Tuesday.
The manuscript had been valued at about a quarter of the final sum, which is by far the highest ever paid for anything written by the genius scientist.
It contains preparatory work for the physicist's signature achievement, the theory of general relativity, which he published in 1915.
Calling the notes "without a doubt the most valuable Einstein manuscript ever to come to auction," Christie's, which handled the sale on behalf of the Aguttes auction house, had estimated prior to the auction that it would fetch between €2m and €3m.
Previous records for Einstein's works were $2.8m for the so-called "God letter" in 2018, and $1.56m in 2017 for a letter about the secret to happiness.
The 54-page document was handwritten in 1913 and 1914 in Zurich, Switzerland, by Einstein and his colleague and confidant Michele Besso, a Swiss engineer.
Christie's said it was thanks to Besso that the manuscript was preserved for posterity. "This was almost like a miracle," it said, since Einstein would have been unlikely to hold on to what he considered to be a simple working document.
Today the paper offered "a fascinating plunge into the mind of the 20th century's greatest scientist," Christie's said. It discusses his theory of general relativity, building on his theory of special relativity from 1905 that was encapsulated in the equation E=mc².
Einstein died in 1955 aged 76, lauded as one of the greatest theoretical physicists of all time. His theories of relativity revolutionised his field by introducing new ways of looking at the movement of objects in space and time.
"In 1913 Besso and Einstein attacked one of the problems that had been troubling the scientific community for decades: the anomaly of the planet Mercury's orbit," Christie's said.
"This initial manuscript contains a certain number of unnoticed errors," it added. Once Einstein spotted them, he let the paper drop, and it was taken away by Besso.
Scientific documents by Einstein in this period, and before 1919 generally, are extremely rare, Christie's said. Being one of only two working manuscripts documenting the genesis of the theory of general relativity that we know about, "it is an extraordinary witness to Einstein's work."
Einstein also made major contributions to quantum mechanics theory and won the Nobel physics prize in 1921. He became a pop culture icon thanks to his dry witticisms and trademark unruly hair, moustache and bushy eyebrows.