Category Archives: Quantum Physics
What is time? The mysterious essence of the fourth dimension – New Scientist
The nature of time is a tricky notion to pin down. But whether it is a fundamental part of our universe or just an illusion has huge implications
By Richard Webb
WE ARE BORN; we live; at some point, we die. The notion that our existence is limited by time is fundamental to human experience. We can't fight it, and truth be told, we don't know what we are fighting against. Time is a universal whose nature we all, and physicists especially, fail to grasp. But why is time so problematic? "If we had a really good answer to that question," says Astrid Eichhorn, a theoretical physicist at the University of Southern Denmark in Odense, "then it wouldn't be so problematic."
On a certain level, time is simple: it is what stops everything happening at once. That might seem flippant, but it is at least something people can agree on. "The causal order of things is really what time is all about," says Eichhorn.
Viewed this way, the existence of time can be interpreted as a necessary precondition for the sort of universe where things lead to other things, among them intelligent life that can ask questions such as "what is time?". Beyond that, time's essence is mysterious. For instance, why can things only influence other things in one direction in time, but in multiple directions in the three dimensions of space?
Most physical theories, from Isaac Newton's laws of motion to quantum mechanics, skirt such questions. In these theories, time is an independent variable against which other things change, but which can't be changed by anything else. In that sense, time exists outside physics, like the beat of a metronome outside the universe to which everything inside it plays out.
Albert Einstein's theories of relativity, developed in the early 20th century, threw such ethereal notions over a barrel. In relativity, time is a physical, dynamic thing, fused with space to form space-time, the fabric of the universe itself. And space-time isn't absolute, but relative, warped by motion and gravity. If you travel fast, or if you are in a strong gravitational field, it slows down.
The relativity of time has wide-ranging consequences. Because there is no unique way of defining its passage, there is no unique way of defining "now". Einstein concluded that all nows, past, present and future, must exist simultaneously, a picture known as the block universe that is completely at odds with our intuitions.
That mismatch occurs because, in our universe, the speed of light is finite. We can only reach certain times within a certain, well, time, so we can never achieve that God-like block-universe view. "In practice, causality limits what we can perceive in a very strict way, and our experience and anything that affects us is limited strongly by causality," says cosmologist Katie Mack at North Carolina State University.
The mysteries don't stop there. By making time part of the physical fabric of a universe that, as far as we can tell, began in a big bang some 13.8 billion years ago, Einstein's theory of general relativity implies that time itself had a beginning, and perhaps an end, too. There can be no eternal metronome ticking outside the universe as quantum theory implies, because such a source would have to exist outside space and time itself. This sets up a currently unbridgeable divide between relativity and quantum theory. In attempting to cross it, researchers such as Eichhorn hope to make progress towards a more unified picture of physics, one that would have to have a very different conception of time.
Many quantum gravity theories propose that if you could zoom in very close to the fabric of Einstein's space-time, to a fine-grained level known as the Planck scale, you would discover a substructure, a kind of quantum pixelation. That would open up entirely new possibilities. "It may very well be that the quantum structure of space and time is different in the presence of matter than it is if you're just thinking of sort of a universe which contains just space and time," says Eichhorn.
Not everyone thinks we need to go that far. Some see an avenue to finding the nature of time in a better understanding of quantum theory. Or perhaps time is itself a mirage. Like the colour or pattern of a tree leaf, time might be "something of no significance", says Mack, the passage of which we invent to make sense of local patterns around us and our own lives.
After all, we never measure time itself, but rather regular changes, be it the passage of the seasons, the swing of a pendulum or the oscillation of a caesium atom, that we reverse-engineer into some mysterious thing we call time. "It's something that we see, and that appears to be there," says Mack. "It may not matter to the cosmos."
Aqemia Announces an Extension of Its First Collaboration With Sanofi About AI and Quantum Physics-driven Drug Discovery in Oncology – Business Wire
PARIS--(BUSINESS WIRE)--Aqemia, the next-gen pharmatech company leveraging artificial intelligence and quantum physics, announced today that it has entered a new research collaboration with Sanofi.
This new agreement is a follow-up to a research collaboration initiated at the end of 2020 by Sanofi to bring Aqemia's unique technologies to the design and discovery of novel molecules in several projects in oncology, a priority therapeutic area for Sanofi.
This initial collaboration resulted in promising molecules for an oncology program, for which Sanofi and Aqemia decided to pursue joint efforts.
Aqemia will take responsibility for the AI-based design of optimized molecules that fulfill several small-molecule design goals, among them potency and selectivity, in a priority oncology project. Unlike most AI-based technologies, which need experimental data to train their algorithms before design can begin, Aqemia will tackle the drug discovery project by generating its own data with quantum and statistical physics-based calculations.
This collaboration includes an undisclosed upfront payment from Sanofi.
Maximilien Levesque, CEO and co-founder of Aqemia, commented, "We are really proud of the results obtained in the first Sanofi-Aqemia oncology collaboration and are very excited to continue working together to accelerate important projects in oncology." He added, "This follow-up of our first collaboration project with Sanofi, a global leader in the pharmaceutical industry, demonstrates our ability to quickly generate novel potent and selective compounds for a given target, and we can't wait to scale it up to dozens of drug discovery projects."
"We are also extremely excited by the promising results obtained by Aqemia using their proprietary and disruptive technology to design potent inhibitors on given targets. We are eager to prolong our collaboration to speed up our candidate finding process for the sake of patients suffering from cancer," said Laurent Schio, head of Integrated Drug Discovery, Sanofi France.
About Aqemia
Aqemia is a next-gen pharmatech company generating one of the world's fastest-growing drug discovery pipelines. Our mission is to rapidly design innovative drug candidates for dozens of critical diseases. Our differentiation lies in our unique quantum and statistical mechanics algorithms fueling a generative artificial intelligence to design novel drug candidates. The disruptive speed and accuracy of our technological platform enable us to scale drug discovery projects just like tech projects.
For more information visit us on http://www.aqemia.com or follow us on LinkedIn
Quantum leap: uOttawa partners with TO firm in bid to commercialize high-powered computing technology – Ottawa Business Journal
The University of Ottawa is teaming up with a Toronto-based company to develop and commercialize high-powered quantum computing technology.
The university said this week it's signed a memorandum of understanding with Xanadu, one of the world's leading suppliers of quantum hardware and software, to create new courses aimed at training the next generation of quantum computing experts as well as develop algorithms to make high-speed quantum computers even more powerful.
The one-year agreement, which has the option of being renewed, is expected to take effect in September. Sylvain Charbonneau, the university's vice-president of research and innovation, said it will make uOttawa a leader in discovering real-world applications for quantum computing.
"This partnership will help elevate emerging quantum research by giving our students and researchers access to the cutting-edge technologies and expertise held at Xanadu," he said in a statement.
"It has the potential to change lives as we train the next generation of quantum pioneers, and work with industry experts to develop and commercialize real-life applications."
Xanadu will provide an undisclosed amount of funding for the research program. The federal government, which last year said it planned to invest $360 million in a national strategy to advance quantum research, is also expected to help fund the project.
"Combining uOttawa's deep knowledge in quantum photonics with Xanadu's industry-leading expertise in quantum hardware and software will pave the way for tackling today's most important scientific and engineering challenges," Josh Izaac, Xanadu's director of product, said in a statement.
Under the agreement, uOttawa researchers will use Xanadu's hardware and software to test quantum computing technology in real-world settings and help find ways of commercializing it.
Charbonneau said Xanadu, which was founded in Toronto in 2016 and now employs more than 130 people, will also help the school create new quantum diploma and certificate programs that straddle the border between science and engineering.
Quantum computing uses the laws of quantum physics, tapping into the world of atoms and molecules to create computers that can be far faster and more powerful than traditional digital computers for certain kinds of problems.
Charbonneau said the technology has a wide range of applications, including encrypting data to make it more difficult for hackers to crack and creating ultra-powerful sensors for industries such as health care and mining.
The veteran academic said recent market research suggests quantum computing will be an $86-billion industry by 2040.
"It's going to be big," he told Techopia on Wednesday afternoon. "If you're (the Department of National Defence) and you want to communicate securely between A and B, you're going to use quantum cryptography for sure."
Charbonneau said uOttawa currently has more than 70 faculty members involved in quantum research, from faculties as diverse as engineering, law and physics. About a dozen of them will be part of the university's quantum research team, and they will be assisted by upwards of 100 graduate and PhD students.
The new deal with Xanadu promises to boost uOttawa's growing expertise in the field of quantum research.
The agreement comes seven years after the launch of the Max Planck uOttawa Centre for Extreme and Quantum Photonics. The facility was created to provide a forum for researchers from the university and the Max Planck Society, a non-profit association of German research institutes, to work together on technology such as high-intensity lasers.
Charbonneau said quantum computing is getting closer to becoming mainstream, and uOttawa hopes to lead the pack when it comes to training developers and programmers.
"Talent really is the new currency, and we're capable of providing it to the ecosystem," he said.
Emily Williams, Mark Turiansky Win 2021-22 Winifred and Louis Lancaster Dissertation Awards – Noozhawk
How can we better hold environmental polluters accountable? How can we enhance the efficiency of qubits?
These questions, which loom large for the researchers who study them, are the type of big-issue topics UC Santa Barbara graduate students are encouraged to tackle. And they're the central themes of the dissertations that won the 2021-2022 Winifred and Louis Lancaster Dissertation Awards.
This year's recipients are Emily Williams and Mark Turiansky, selected by the awards committee for dissertations with significant impact on the field in terms of methodological and substantive contributions.
As global temperatures rise and communities feel the effects of climate change, how do we as a global society address the uneven distribution of harms and gains?
The tropics, for instance, are already bearing the brunt of sea level rise and ocean acidification, yet they are not the places that have generated the magnitude of carbon emissions that cause these events, nor do they benefit in a proportionate way from the activities that cause these emissions.
Elsewhere around the world, weather events of disastrous proportions are increasing in severity and frequency, clearly caused by anthropogenic activity, yet who exactly do we hold accountable?
Inequalities and blind spots such as these are the type of thing that sparks Emily Williams' curiosity and activist drive. A long-time environmentalist, she got her first taste of the discipline of environmental studies as an undergraduate at UCSB under the tutelage of the late Professor William Freudenburg.
"He opened my eyes to thinking about the causes of climate change," Williams said. She became conscious of the strategies corporations use to justify their actions and their methods of deflection from their outsized contribution to the problem.
Around that time, Typhoon Haiyan, then the most powerful typhoon on record, struck the central Philippines, becoming a strong and real reminder of global warming's effects. But even more compelling for Williams, who had become part of a civil delegation to the UN Framework Convention on Climate Change (the international climate negotiations space), was the maddening slowness to address these impacts.
Fast-forward several years, and Williams' desire to illuminate the gaps in climate accountability resulted in her dissertation, "Interrogating the science of climate accountability: Allocating responsibility for climate impacts within a frame of climate justice." In it, she builds a best-practices conceptual framework to identify responsibility for climate impacts.
She then tests it using an empirical case study involving the drought in the greater Four Corners region and the Zuni people who live there.
"I had the opportunity to work with very diverse mentors, meaning I got to do the attribution science, engage ethnographic methods, organizational sociology and some science and technology studies-related work," she said. "It's certainly hard to do interdisciplinary work, but if you find a group of mentors that will support you in this effort, it's fascinating."
Among the things she uncovered in her research is the meteorological concept of vapor pressure deficit and its role in intensifying droughts as temperatures increase.
By linking this fundamental principle to vegetation, Williams and her co-authors were able to estimate what the Four Corners region would look like without climate change, and identify the human fingerprint in this whodunit of global warming.
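Vapor pressure deficit has a simple quantitative core: it is the gap between how much moisture the air could hold at a given temperature and how much it actually holds. As a rough illustration of why warming widens that gap, here is a minimal Python sketch using the common Tetens approximation; the constants, function names and values are illustrative only and are not drawn from Williams' analysis.

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Saturation vapor pressure in kPa via the Tetens approximation."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit_kpa(temp_c: float, rel_humidity_pct: float) -> float:
    """VPD = saturation vapor pressure minus actual vapor pressure.
    Higher temperatures raise the saturation pressure, so at the same relative
    humidity the atmosphere pulls more moisture from soil and vegetation."""
    e_sat = saturation_vapor_pressure_kpa(temp_c)
    e_actual = e_sat * rel_humidity_pct / 100.0
    return e_sat - e_actual

# Example: the same 30% relative humidity is markedly more drying at 35 C than at 30 C.
print(round(vapor_pressure_deficit_kpa(30.0, 30.0), 2))  # ~2.97 kPa
print(round(vapor_pressure_deficit_kpa(35.0, 30.0), 2))  # ~3.94 kPa
```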
This ability to definitively attribute effects to human activity can help build a case toward holding polluters accountable, advancing the field of climate justice. Its also what earned Williams the Lancaster Award.
"Emily's outstanding integration of theory with qualitative and quantitative methods and her passionate commitment to climate justice truly set her apart," said her adviser, geography professor David López-Carr.
"Her dissertation makes a significant contribution to the nascent climate accountability literature by being the first to identify the human contribution to regional climate change and to follow those climate change impacts on vulnerable populations at the local level," López-Carr said.
"Her work provides a framework for future researchers and practitioners to advance the important area of climate accountability, with real-world implications for holding accountable those responsible for climate change emissions and for mitigating impacts on vulnerable populations," he said.
"I feel so honored and so humbled to have received this award," said Williams, who plans to complete a short post-doc before moving into the nonprofit world for more advocacy work. "I know for certain that anyone who gets through a Ph.D. program, with all the challenges and opportunities the program presents, deserves such an award.
"I chose my dissertation topic because I believe so deeply in the importance of ensuring climate accountability work is done within principles of justice. I am just so happy that the selection committee thinks this topic is important, too."
The quantum world holds much potential for those who learn to wield it. This space of subatomic particles and their behaviors, interactions and emergent properties can open the door to new materials and technologies with capabilities we have yet to even dream of.
Mark Turiansky is among those at the forefront of this discipline at UCSB, joining some of the finest minds in the quantum sciences as a fellow at the NSF-supported UCSB Quantum Foundry.
"The field of quantum information science is rapidly developing and has garnered a ton of interest," said Turiansky, who developed an abiding interest in physics as a child. "In the past few years, billions of dollars of funding have been allocated to quantum information science."
Enabled by relatively recent technologies that allow for the study of the universe at its smallest scales, quantum researchers like Turiansky are still just scratching the surface as they work to nail down the fundamentals of the strange yet powerful reality that is quantum physics.
At the heart of some of these investigations is the quantum defect: imperfections in a semiconductor crystal that can be harnessed for quantum information science.
One common example is the nitrogen-vacancy center in a diamond: In an otherwise uniform crystalline carbon lattice, an NV center is a defect wherein one carbon atom is replaced with a nitrogen atom, and an adjacent spot in the lattice is vacant. These defects can be used for sensing, quantum networking and long-range entanglement.
The NV center is only one such type of quantum defect, and though well-studied, has its limitations. For Turiansky, this underlined the need to gain a better understanding of quantum defects and to find ways to predict and possibly generate more ideal defects.
These needs became the basis of his dissertation, "Quantum Defects from First Principles," an investigation into the fundamental concepts of quantum defects, which could lead to the design of a more robust qubit, the basic unit of a quantum computer.
To explore his subject, Turiansky turned his attentions to hexagonal boron nitride.
"Hexagonal boron nitride is an interesting material because it is two-dimensional, which means that you can isolate a plane of the material that is just one atom thick," he said. By shining light on this material, it is possible to detect quantum defects called single-photon emitters by the bright spots that shine back. These single photons, he said, are inherently quantum objects that can be used for quantum information science.
"The main feat was identifying the defect that was responsible for single-photon emission," Turiansky said. He accomplished it with computational methodologies that he worked to develop in his research.
"One methodology that I've worked on a lot is for nonradiative recombination," he said, describing it in his paper as "fundamental to the understanding of quantum defects, dictating the efficiency and operation of a given qubit."
By applying his methodology, Turiansky was able to determine the origin of these single-photon emitters, a topic of much debate in the community. It's a feat that could be applied to examine other quantum defects, and one that was deemed worthy of the Lancaster Award.
"Mark's work has moved the field forward by systematically identifying promising quantum defects, and providing an unambiguous identification of the microscopic nature of the most promising quantum emitter in hexagonal boron nitride," said Turiansky's adviser, materials professor Chris Van de Walle. "He accomplished this by creatively applying the computational approaches he developed and fruitfully collaborating with experimentalists."
"It's really an exceptional honor to receive such a prestigious award for my research efforts over the last five years," Turiansky said. "It's even more meaningful knowing the high quality of research turned out at UCSB and the fierce competition of my peers.
"I'm incredibly grateful to my adviser, group members, collaborators, friends and family who helped make this achievement possible."
The two Lancaster dissertations are entered into a national competition sponsored by the Council of Graduate Schools. A check for $1,000 and a plaque will be awarded upon completion of entry for the national competition.
Review: Elusive: How Peter Higgs Solved the Mystery of Mass, by Frank Close – The New York Times
ELUSIVE: How Peter Higgs Solved the Mystery of Mass, by Frank Close
In early October 2013, the Nobel Prize committee was preparing to announce the winner of its award in physics. The leading candidate, as pretty much everyone knew, was an 84-year-old British scientist named Peter Higgs, who was not feeling nearly as joyful as you might think. Yes, he wanted to win the award; yes, he wanted to be recognized for his pioneering insights into how subatomic particles build our universe. He just wanted to be recognized for it quietly.
But as a theorist already heralded for his 1964 work predicting the Higgs boson (sometimes called the God particle), he knew he was pipe-dreaming. He could almost hear the thunder of microphone-wielding journalists advancing on his Edinburgh apartment. So he made a pre-emptive decision: "I decided not to be home." On the morning of the announcement, Higgs crept out his back door, caught a bus to a nearby town, tucked himself into a pub and hunkered down with a medicinal pint of ale.
Thus, when Higgs did win the Nobel (along with the French physicist François Englert), neither journalists nor fellow physicists could find him. "We don't know where he is," one University of Edinburgh colleague sadly explained to an exasperated reporter. One is left to wonder if Frank Close chose the title for "Elusive" as a reference to the glimmering subatomic particle of Higgs's theory or to the theorist himself.
As Close notes, Peter Higgs has managed to avoid much of the pace of modern life. He does his best to avoid both email and cellphones. Close, a physicist himself and the author of numerous popular science books, is a longtime colleague and friend of Higgs's, but to research this volume he was forced to mail reminder letters to confirm appointments. Their conversations, not entirely revealing, were mostly conducted via Higgs's treasured landline phone. As a result, although his publisher describes "Elusive" as the first major biography of Peter Higgs, Close seems less sure of that, describing his book as not so much a biography of the man but of the boson named after him.
Close's description is more accurate. The biographical facts add up to more of a brisk sketch than a richly detailed portrait. This is not to deny that there are moments of sharp and even bitter insight: Higgs's belief that his antisocial personality developed during a sickly and lonely childhood in northern England ("I grew up a rather isolated child"); his marriage and its failure because of his workaholic habits; a resulting, paralyzing depression; Higgs's dedication to social justice causes, which at one point led him to suspect that he had become an embarrassment to some of his colleagues. After all, Higgs notes modestly, "The portion of my life for which I am known is rather small: three weeks in the summer of 1964."
It is those three weeks that anchor the real story in this book, a clear, vivid and occasionally even beautiful portrait of a scientific breakthrough: the tale of how a relatively obscure Scotland-based physicist developed a stunning theory, one that would help illuminate the invisible, particulate web that holds our universe together. And how in the following decades, the research community would argue, debate, build and expand on his idea, setting out on a quest to confirm the existence of the Higgs boson and with it our own understanding of the universe.
At a basic level, Higgs's theory belongs to a fundamental and puzzling question: Where does the mass of the universe come from? Using the known rules of physics, from electromagnetism to quantum mechanics, Higgs raised the possibility of an unstable subatomic particle that, through a series of fizzing interactions, could lend mass to other particles. He predicted this particle would be a boson, a notably massive subatomic particle that helps hold matter together, and that it would exist in an energy field that enabled the interactions. Higgs suggested a path to confirming the existence of the boson and the eventual measurement of its decay products. In doing so, Close writes, the theory issued a subtle challenge: "Is this just a clever piece of mathematics or does nature really work this way?"
Close uses that question as a launching point, taking the reader through much of the history of particle physics and introducing the key players, the insights by others in the field who moved the ideas forward and the eventual decision to build a machine in Switzerland, the Large Hadron Collider, to test the possibilities. The L.H.C. would find confirmation for the boson's decay products in 2012. Close brings to this story an insider's knowledge and a combat-ready willingness to defend Higgs against his occasional critics, at one point dismissing the high-profile British physicist Stephen Hawking as a man with a singular genius for playing the media.
In other words, this is a very human telling of the ways that we've figured out at least some of the mysteries of our universe since the mid-20th century. "What does the discovery reveal about the cosmos and our place in the universe?" Close wonders, and he ends his book on a note of additional mystery, reminding us that there are great achievements in physics to come and that tantalizing questions still shine in front of us, their answers still out of reach, ever elusive.
Deborah Blum is the author of "The Poison Squad: One Chemist's Single-Minded Crusade for Food Safety at the Turn of the Twentieth Century" and the director of the Knight science journalism program at the Massachusetts Institute of Technology.
ELUSIVE: How Peter Higgs Solved the Mystery of Mass, by Frank Close | Illustrated | 304 pp. | Basic Books | $30
BSC Contributes to Gaia with Nearly 58M Hours of MareNostrum and Programming Models – HPCwire
June 13, 2022 The largest collection of astrophysical data for stars of the Milky Way, a catalog of binary stars that surpasses all the scientific work from the past two centuries and the first low-resolution and radial velocity spectroscopy studies carried out to date: these are some of the scientific findings of the third catalog release of the Gaia mission, published by the European Space Agency (ESA) on Monday, June 13.
Since its beginning, Gaia has counted on the participation of a team of astronomers and engineers of the Institute of Cosmos Sciences of the University of Barcelona (ICCUB) and the Institute of Space Studies of Catalonia (IEEC), led by researchers Carme Jordi, Xavier Luri and Francesca Figueras, from the Department of Quantum Physics and Astrophysics (UB-ICCUB-IEEC).
The Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS) has contributed to Gaia since its beginning, providing millions of hours of supercomputing in the MareNostrum supercomputer and programming models.
Specifically, since the beginning of the project, BSC has contributed almost 58 million hours, and for this third release of data, it has contributed almost 33 million hours.
On the other hand, the PyCOMPSs programming model and the dislib machine learning library developed by the BSC's Workflows and Distributed Computing group have been used in the software developed by the Gaia team to search for new open star clusters.
The BSC user support team has collaborated in data storage and transfer to other processing centers involved in the project.
This new data release, which includes a total of 1.8 billion stars of the Milky Way, provides the international astronomical community with an unprecedented perspective on stellar characteristics and their life cycle, as well as the structure and evolution of the Galaxy. The published data of the Gaia Data Release 3 (DR3) were collected during thirty-four months, between July 25, 2014 and May 28, 2017.
Since the launch of Gaia in 2013, data sets have been released in 2016 and 2018, as well as a subgroup of the third data set in 2020. For the moment, the Gaia mission exceeds 2,850 days of sky observation, has collected 100 terabytes of data and has documented 200 billion star transits in its focal plane.
Gaia Mission: the most accurate map of our galaxy
Gaia is the ESA's emblematic mission launched in December 2013 to create the most accurate and complete multi-dimensional map of our galaxy, the Milky Way, with data on the position, speed and direction of motion, brightness, temperature and composition of nearly two billion galactic and extragalactic objects. This information will allow astronomers to rebuild the past and future evolution of the Galaxy over billions of years.
The largest low-resolution spectroscopy study ever
The Gaia satellite, located 1.5 million kilometres from Earth in the direction opposite the Sun, at the Lagrange point L2, has surveyed the sky through two telescopes which have provided scientific data to calculate the position, distance, speeds and physical features of nearly 2 billion stars.
One of the first scientific results of the dataset now published is the light spectra of 220 million stars, which can be used to determine brightness, temperature, mass and chemical compositions with precision. As noted by Professor Carme Jordi, "for the first time, we can separate in detail the light we receive from the stars and that from other objects observed by Gaia." The expert adds that this separation provides knowledge of physical properties such as temperature, brightness and chemical composition, essential information for determining the age of the stars and deducing their origins.
Gaia DR3 includes the radial velocity of 33 million stars, a volume of information five times larger than that provided by the mission's second data set in 2018. The radial velocity is the speed at which objects move away from or towards us, the parameter that supplies the third dimension of velocity in the Gaia map of our galaxy.
As Professor Xavier Luri says, "the number of measurements is, by far, larger than the total measures of radial velocity conducted from Earth in all history. This is already a radical change in data availability." Luri also notes that having the third motion component (the other two are provided by astrometry, through the proper motions measured by Gaia) "enables us to make a complete analysis of the kinematics of the stars." Overall, the volume, quality and completeness of data open new perspectives for understanding the kinematics and dynamics of our galaxy.
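For context, converting Gaia's astrometric measurements into the two sky-plane velocity components uses a standard textbook relation; the short Python sketch below is purely illustrative (the input numbers are made up, and this is not code from the Gaia or BSC pipelines), showing how the spectroscopic radial velocity completes the third component.

```python
def tangential_velocity_km_s(proper_motion_mas_yr: float, parallax_mas: float) -> float:
    """Transverse (sky-plane) velocity from astrometry.
    distance [pc] = 1000 / parallax [mas]; v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc].
    The factor 4.74 converts AU per year into km per second."""
    distance_pc = 1000.0 / parallax_mas
    mu_arcsec_yr = proper_motion_mas_yr / 1000.0
    return 4.74 * mu_arcsec_yr * distance_pc

# Combine with the spectroscopic radial velocity for a full 3D space velocity.
v_t = tangential_velocity_km_s(proper_motion_mas_yr=25.0, parallax_mas=5.0)  # ~23.7 km/s
v_r = 15.0  # radial velocity in km/s (illustrative value, not a Gaia measurement)
space_velocity = (v_t**2 + v_r**2) ** 0.5
print(round(space_velocity, 1))  # ~28.1 km/s
```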
The largest catalog of binary stars to date
Another novelty of the dataset is that it contains the largest catalog of binary stars of the Milky Way to date. With positions, distances, orbits and masses of more than 800,000 systems, this catalog is key for understanding stellar evolution. Moreover, Gaia DR3 has essential information for studying the origins of the Solar System: specifically, high-precision data on 156,000 asteroids, combining compositions and orbits.
The great volume of data Gaia offers to the international astronomical community provides unprecedented views on the understanding of the characteristics of the stars and their life cycle, as well as on the study of the structure and evolution of the Milky Way. The data now presented include information on stars whose brightness varies over time, in addition to objects from the Solar System, such as asteroids and planetary moons, and galaxies and quasars beyond the Galaxy in which we find ourselves.
"As seen in previous data releases, the most unexpected and surprising findings will arrive during the following weeks, as soon as we uncover the secrets these data hold; these data have been open to the professional community and amateurs since the beginning," notes lecturer Francesca Figueras. "We are watching millions of eclipsing binary stars moving and beating, as well as thousands of pulsating cepheids, stellar populations that trace the distances of the universe. We also capture the non-radial pulsations of variable stars in rapid rotation, small tsunamis on their surfaces. These are only some examples; I cannot imagine the euphoria and passion Henrietta Swan Leavitt would feel now."
A scientific collaboration since the beginning of the space mission
The role of the UB-ICCUB-IEEC team focused on the scientific and technological design of the project, the development of the data processing system and the production of simulated data. Part of the software for processing the data sent by the satellite has been developed by the UB-ICCUB-IEEC team and runs on the MareNostrum supercomputer at the Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS).
The team members work on the scientific exploitation of the data, in fields such as the study of the spiral structure of the Galaxy; the identification of past interactions of the Milky Way with nearby galaxies, which are essential for knowing its evolution to present times; open clusters, including the identification of clusters unknown to date; and the study of the Magellanic Clouds, two small galaxies orbiting our own.
"With each new release, the data accuracy and its volume improve. In the upcoming years, we will have, for instance, 150 million high-resolution spectra with more accurate distances and motions. The results we will obtain from the analysis of these data are unpredictable, but they will allow us, among other things, to better understand the evolution of the Galaxy, or its structure," notes lecturer Eduard Massana.
The Gaia team at ICCUB (UB-IEEC), led by Professor Jordi Torra at the beginning of the mission, was awarded the 2013 Barcelona City Award in the category of Experimental Sciences and Technology. Some of its members are part of the Gaia Science Team (GST), ESA's scientific advisory body. Based on fuel consumption, Gaia is expected to operate until 2025, and the final catalog will not come out before 2030.
Source: Barcelona Supercomputing Center
Methane: As concerns rise about this greenhouse gas, CU startup works to plug leaks – CU Boulder Today
Title image: A laser-emitting device atop a tower at an oil and gas operation scans the landscape for methane-containing natural gas leaks. Credit: Casey Cass/CU Boulder
Sean Coburn walks down a dusty dirt road in Greeley, Colorado, flanked by a scene that's becoming more common in this city at the edge of the Front Range: rows and rows of tanks, pipes, stacks and other hallmarks of the oil and gas industry.
The engineer, who earned his doctorate from CU Boulder and now splits time between the university and a company called LongPath Technologies, is wearing a flame retardant jacket, bulky boots and a hard hat. He needs them on this site. Here, operators take raw and very flammable oil and natural gas, the latter mostly composed of methane, and process it into a form that people can use to heat their homes or drive their cars.
But Coburn is heading for something else: a metal tower, about 50 feet tall, with what looks like a security camera on top.
"We pipe the laser light up from there," said Coburn, pointing at a cabinet at the base of the tower. "Then we shoot it at different targets around the site."
As he talks, the cabinet beeps, and the laser emitter at its end begins to turn, sweeping over the landscape.
The tower is part of an ambitious undertaking from scientists at LongPath and CU Boulder. They're using new laser technology to do what other technologies have struggled to do for years: detect natural gas, which is invisible to the eye, leaking from pipes at sites like this, in real time.
"Methane is a powerful greenhouse gas," said Greg Rieker, an associate professor of mechanical engineering who testified before the House Science, Space and Technology Committee June 8 about the problem of methane emissions. It can trap nearly 80 times more heat in the atmosphere than carbon dioxide, and research suggests that escaped methane from oil and gas operations may play a much bigger role in climate change than previously thought.
LongPath is trying to plug that source. The company's towers shoot lasers over miles of terrain to sniff out even the faintest whiffs of methane in the air. So far, the company has installed 23 of them covering almost 300,000 acres in Texas, New Mexico, Oklahoma and Colorado. Rieker believes the technology could be a win-win for the West: slowing down emissions of this dangerous gas, while also reducing costs for an industry that employs tens of thousands.
The story of this technology, called a dual frequency comb laser spectrometer, dates back to the 1990s when a CU/JILA physicist named Jan Hall first developed frequency comb lasers to explore the workings of tiny atoms, and earned a Nobel Prize in the process.
"Now, we're able to use those same ideas and, with just one of these systems, mitigate about 80 million cubic feet of methane emissions per year," said Rieker, who co-founded LongPath in 2017.
Scott Diddams was part of those early days of frequency comb lasers. He was a postdoctoral researcher working with Hall at JILA, a joint research institute between CU Boulder and the National Institute of Standards and Technology (NIST), to probe quantum physics, or the mysterious workings of very, very small things.
Greg Rieker (left) works with a colleague in the lab at CU Boulder.
The researchers weren't thinking about methane hovering over oil fields at the time. Instead, they used their lasers to measure how fast atoms tick. To make an atomic clock, Diddams explained, physicists first shine laser light at a cloud of atoms, giving them a kick so that they flip between different energy levels at a staccato pace. Hall's group invented frequency combs to help count out that rhythm.
"Atoms tick nearly a quadrillion times per second," said Diddams, now a professor in the Department of Electrical, Computer and Energy Engineering. "You need a really special tool to count those cycles."
Frequency combs were special. Normal lasers, like the pointers in any lecture hall, can only generate one type of light: say, red light or green light. But these new lasers could produce thousands or even millions of colors of infrared light at the same time, an entire rainbow inside a single beam.
Hall and German scientist Theodor Hänsch took home a Nobel in 2005 for their "contributions to the development of laser-based precision spectroscopy, including the optical frequency comb technique."
By the time Rieker joined CU Boulder in 2013, he and Diddams were already wondering what else frequency combs could do.
At LongPath's offices in Boulder, Coburn and his colleagues open a computer window showing the data coming in from the system in Greeley. The graph shows a squiggly readout with sharp spikes like the teeth in a comb.
Each tooth corresponds to a color in the team's frequency comb laser (hence, the name). Rieker explained that if you shine one of these devices into a cloud of gas, the molecules inside will absorb some of those colors but not all of them. In other words, molecules will leave an imprint on the laser light, almost like pressing your thumb to a glass.
Comb-like spikes on a computer screen illustrate measurements of methane, water and carbon dioxide.
"Each of these different molecules absorbs a different pattern of light," Rieker said. "Methane has one pattern. Water and carbon dioxide have another."
Frequency comb technology can read those molecular fingerprints to tell you exactly what kinds of molecules are present in a patch of air.
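As a toy illustration of that fingerprinting idea (not LongPath's actual analysis; the wavelengths below are placeholders, not real absorption-line positions), one can think of it as matching measured absorption dips against reference patterns for each gas.

```python
# Toy spectral fingerprinting: compare measured absorption dips against
# reference line lists for each gas. All wavelength values are placeholders.
REFERENCE_LINES_NM = {
    "methane": [1645.0, 1650.5, 1666.2],
    "carbon_dioxide": [1572.0, 1602.3],
    "water": [1392.5, 1410.1],
}

def match_gases(measured_dips_nm, tolerance_nm=0.5):
    """Return gases whose reference lines all appear among the measured dips."""
    detected = []
    for gas, lines in REFERENCE_LINES_NM.items():
        if all(any(abs(dip - line) <= tolerance_nm for dip in measured_dips_nm)
               for line in lines):
            detected.append(gas)
    return detected

print(match_gases([1645.1, 1650.4, 1666.0, 1572.2, 1602.5]))
# ['methane', 'carbon_dioxide']
```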
Or that was the theory in the mid-2000s. Rieker and scientists from NIST took roughly a decade to make it reality. First the team had to shrink these lasers, which could fill entire rooms, down to the size of a suitcase, then design them to survive the extremes of Colorado winters.
"We tested what happened when our laser froze," Rieker said. "We broke it every way we could think of breaking it."
Traditionally, he said, oil and gas operators look for leaks by using special video cameras or by hiring airplanes to fly overhead. Frequency comb lasers, in contrast, can operate 24/7 without a single human involved.
For 11 months in 2017 and 2018, the team put its technology to the test with funding from the U.S. Department of Energy. Rieker and his colleagues deployed one of their lasers at a natural gas storage facility in California. The laser, then mounted to the roof of a trailer, was able to detect methane leaks over several miles of terrain and at an incredible precision of just a few parts per billion. Because the system ran all the time, they were able to detect 12 times more methane per month on average than traditional tools spotted.
"After that, it spread by word of mouth," Rieker said. "Because these things work."
A technician monitors methane at an oil and gas site in Colorado.
Around the same time, Rieker co-founded LongPath Technologies with his then research scientists Coburn and Robbie Wright, and Caroline Alden, a research scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES) at CU Boulder.
In the beginning, it was slow going. To launch LongPath and secure initial funding, Rieker and his colleagues worked with Venture Partners, the university's commercialization arm for campus researchers. The company's first employees worked out of rented space in Rieker's basement lab on campus.
"Instead of the startup-in-a-garage, we were the startup-in-a-basement. Then when COVID hit we all were working out of our own basements," said Wright, now vice president of engineering at LongPath. "But in the past year we finally got our first dedicated office, and we've scaled from having three deployments out with one customer to 23 deployments with 17 customers."
Oil and gas executives have come around to these lasers, in part because they can save companies money, Rieker added; even a routine leak, he said, could cost operators thousands of dollars if they don't catch it right away.
He's now trying to replicate the success of LongPath.
In 2021, Rieker signed on to lead a new effort on campus called the Quantum Engineering Initiative, which seeks to transform other, fundamental scientific discoveries into real tools that you can hold in your hand. Graduate students in the engineer's lab aren't done with frequency comb lasers, either. This year, researchers will install one over a patch of frozen soil near Fairbanks, Alaska. They're hoping to measure how much methane gas leaks out from that soil as it warms because of climate change.
Graduate student David Yun, meanwhile, uses frequency comb lasers for a completely different purpose: To study how hypersonic jet engines suck up and burn oxygen as they roar to life. Diddams employs a similar set of tools to search for planets circling stars tens of light-years from Earth.
"We really want to push the limits of where we can take this technology," Yun said. "We keep pushing to see what is the craziest thing we can do with frequency combs?"
For Rieker, it's a testament to science coming full circle, from explorations of atomic jitters to a Nobel Prize and even technology that may soon improve the lives of everyday Coloradans.
"This is a technology that was developed for something completely different, for creating better atomic clocks and other tools for quantum research," he said. "Now, we're making an impact on climate change."
Physicists have made a quantum boomerang for the first time – New Scientist
Hundreds of thousands of lithium atoms cooled to near absolute zero exhibit a strange quantum effect a bit like a boomerang
By Karmela Padavic-Callaghan
A hundred thousand lithium atoms held in a vacuum exhibited the quantum boomerang effect
Tony Mastres/David Weld Lab at University of California, Santa Barbara
Researchers have managed to demonstrate a strange phenomenon known as the quantum boomerang effect for the first time.
David Weld at the University of California, Santa Barbara and his colleagues cooled hundreds of thousands of lithium atoms close to absolute zero inside a small vacuum-sealed box. They used lasers to arrange the lithium atoms in a line and keep them in a particular quantum state that they hoped would reveal the boomerang effect.
The researchers then used the laser to nudge the atoms. This resulted in them going from having zero average momentum to having a positive average momentum. If the same change happened to a ball it would roll away, but due to the quantum boomerang effect, the team found that the atoms average momentum soon returned to zero.
Theorists had originally proposed that this boomerang effect could happen with electrons moving inside a crystal filled with particles of dirt, but that has proved difficult to demonstrate and study. Weld and the team sidestepped that difficulty by instead focusing on very cold atoms which can be precisely manipulated with lasers.
Weld and colleagues presented the new experiment at the DAMOP conference in Orlando, Florida in May.
He says that the next goal is to determine if boomeranging happens when super cold atoms interact with each other very intensely. Behaviour of such very coordinated atoms is not well understood, so seeing them boomerang could uncover something new about quantum physics.
Researchers Discovered a New Kind of Higgs Relative in The Unlikeliest of Places – ScienceAlert
Sometimes the discovery of new physics demands insane levels of energy. Big machines. Fancy equipment. Countless hours of sifting through reams of data.
And then sometimes the right combination of materials can open a doorway to invisible realms in a space little bigger than a tabletop.
Take this new kind of relative to the Higgs boson, for example. It was found lurking in a room temperature chunk of layered tellurium crystals. Unlike its famous cousin, it didn't take years of smashing up particles to spot it, either. Just a clever use of some lasers and a trick for unweaving their photon's quantum properties.
"It's not every day you find a new particle sitting on your tabletop," saysKenneth Burch, a Boston College physicist and the lead co-author of the study announcing the discovery of the particle.
Burch and his colleagues caught sight of what's known as an axial Higgs mode, a quantum wiggle that technically qualifies as a new kind of particle.
Like so many discoveries in quantum physics, observing theoretical quantum behaviors in action gets us closer to uncovering potential cracks in the Standard Model, and even helps us home in on solving some of the remaining big mysteries.
"The detection of the axial Higgs was predicted in high-energy particle physics to explain dark matter," says Burch.
"However, it has never been observed. Its appearance in a condensed matter system was completely surprising and heralds the discovery of a new broken symmetry state that had not been predicted."
It's been 10 years since the Higgs boson was formally identified amid the carnage of particle collisions by CERN researchers. This not only ended the hunt for the particle but loosely closed the final box in the Standard Model, the zoo of fundamental particles making up nature's complement of bricks and mortar.
With the Higgs field's discovery, we could, at last, confirm our understanding of how components of the model gained mass while at rest. It was a huge win for physics, one we're still using to understand the inner mechanics of matter.
While any single Higgs particle exists for barely a fraction of a second, it's a particle in the truest sense of the word, blinking briefly into reality as a discrete excitation in a quantum field.
There are, however, other circumstances in which particles can bestow mass. A break in the collective behavior of a surge of electrons called a charge density wave, for example, would do the trick.
This 'Frankenstein's monster' version of Higgs, called a Higgs mode, can also appear with traits that aren't seen in its less patchwork cousin, such as a finite degree of angular momentum (or spin).
A spin-1 or axial Higgs mode not only does a similar job to the Higgs boson under very specific circumstances, it (and quasiparticles like it) could provide interesting grounds for studying the shadowy mass of dark matter.
As a quasiparticle, the axial Higgs mode can only be seen emerging from the collective behaviors of a crowd. Spotting it requires knowing its signature amid a wash of quantum waves and then having a way to sift it out of the chaos.
By sending perfectly coherent beams of light from two lasers through such material and then watching for telltale patterns in their alignment, Burch and his team uncovered the echo of an axial Higgs mode in layers of rare-earth tritelluride.
"Unlike the extreme conditions typically required to observe new particles, this was done at room temperature in a table top experiment where we achieve quantum control of the mode by just changing the polarization of light," says Burch.
It's possible there could be plenty of other such particles emerging from the tangle of body parts making up exotic quantum materials. Having a means of easily catching a glimpse of their shadow in the light of a laser could reveal a whole litany of new physics.
This research was published in Nature.
Is life the result of the laws of entropy? – New Scientist
By Stephon Alexander and Salvador Almagro-Moreno
Can physics explain biology?
Shutterstock / Billion Photos
The following is an extract from our Lost in Space-Time newsletter. Each month, we hand over the keyboard to a physicist or two to tell you about fascinating ideas from their corner of the universe. You can sign up for the Lost in Space-Time here.
At the dawn of time, the universe exploded into existence with the big bang, kick-starting a chain of events that led to subatomic particles clumping together into atoms, molecules and, eventually, the planets, stars and galaxies we see today. This chain of events also led to us, although we often see life and the formation of the universe as separate, or "non-overlapping magisteria" to borrow biologist Stephen Jay Gould's phrase.
To cosmologists, complex systems like life seem of little consequence to the problems they are trying to solve, such as those relating to the big bang or the standard model of particle physics. Similarly, to biologists, life is housed in a biosphere that is decoupled from the happenings of the grandiose universe. But is that right?
Notable scientists, including John von Neumann, Erwin Schrödinger, Claude Shannon and Roger Penrose, have entertained the idea that there could be insights to gather from looking at life and the universe in tandem.
Physicist Erwin Schrödinger's views were particularly interesting, as his audacious speculations and predictions in biology have been hugely influential. In 1943, he gave a series of lectures at Trinity College Dublin that would eventually be published in a tiny, but mighty, book called What Is Life? In it, he speculated on how physics could team up with biology and chemistry to explain how life emerges from inanimate matter.
Schrödinger believed that the same laws of physics that describe a star must account for the intricate processes of metabolism within a living cell. He knew that the physics of his time was insufficient to explain some of the ingenious experimental findings that had already been made about living cells, but he ploughed on regardless, attempting to use the physics he knew to explain biology.
He said that quantum mechanics must play a key role in life, as it is necessary for making atoms stable and enabling them to bond in the molecules found in matter, both living and not. For non-living matter, such as in metal, quantum mechanics allows molecules to organise in interesting ways, such as periodic crystals: lattices of molecules with high degrees of symmetry. But he believed that periodicity was too simple for life; instead he speculated that living matter is governed by aperiodic crystals. He proposed that this type of non-repetitive molecular structure should house a "code-script" that would give rise to "the entire pattern of the individual's future development and of its functioning in the mature state". In other words, he was stumbling across an early description of DNA.
Before Schrödinger's time, biologists had hit upon the idea of the gene, but it was just an undefined unit of inheritance. Today, the idea that genes are governed by a code that programs the structures and mechanisms of cells and determines the fate of living organisms seems so familiar that it feels like common sense. Yet exactly how this is accomplished at a molecular level is still being teased out by biologists.
What is particularly remarkable is that Schrödinger used reasoning stemming from quantum mechanics to formulate his hypothesis. He was an outsider to biology, and this naturally made him bring a different approach.
Physics and biology have moved on a lot since Schrödinger's day. What if we were to follow the same process and ask "what is life?" today?
Over the years we, the authors of this newsletter, have developed a pattern. We meet up, sometimes over a drink, to exchange ideas and share our latest musings in cosmology or molecular biology. We have often stayed up late talking while listening to our favourite jazz or flamenco musicians. In part, our conversations are an exercise in deliberately generating an outsider perspective, like Schrdinger did, hopefully to benefit each others research. But it is also just a lot of fun.
Specifically, since 2014 we have developed a common intuition that there is a hidden interdependence between living systems and cosmology, as demonstrated in some of our publications. To understand this, we need to talk about entropy, a measure of disorder, and how it flows in the universe, both at biological and cosmological scales.
In the early universe, before there were stars and planets, space was mostly filled with an equal amount of radiation and matter. As this mixture warmed and moved about more, it became less ordered and its entropy increased. But as the universe expanded, it distributed radiation and matter in a homogenous, ordered fashion, lowering the entropy of the universe.
As the universe further expanded and cooled, complex structures such as stars, galaxies and life formed. The second law of thermodynamics says that entropy always increases, but these structures had more order (and therefore less entropy) than the rest of the cosmos. The universe can get away with this because the regions of lower entropy are concentrated within cosmic structures, while entropy in the universe as a whole still increases.
We believe this entropy-lowering network of structures is the main currency for the biosphere and life on planets. As the father of thermodynamics, Ludwig Boltzmann, said: "The general struggle for existence of animate beings is therefore not a struggle for raw materials nor for energy which exists in plenty in any body in the form of heat, but a struggle for entropy, which becomes available through the transition of energy from the hot sun to the cold earth."
As the universe deviates from homogeneity, by seeding and forming lower entropy structures, entropy elsewhere in the universe continues to grow. And entropy also tends to grow within those structures. This makes entropy, or its absence, a key player in sustaining cosmic structures, such as stars and life; therefore, an early lifeless universe with low entropy is necessary for life here on Earth. For example, our sun radiates energy that is absorbed by electrons in plants on Earth and used in the functions they need to live. Plants release this energy in the form of heat, giving back to the universe more entropy than was taken in.
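A back-of-the-envelope version of this bookkeeping, using standard textbook numbers rather than anything from our own papers: energy arriving as sunlight carries entropy of order the energy divided by the Sun's surface temperature, while the same energy re-radiated as heat at Earth's effective temperature carries far more.

```latex
% Entropy carried by energy Q exchanged at temperature T is S = Q / T.
% Sunlight has the character of the Sun's surface, T_sun ~ 5800 K;
% Earth re-emits the same energy as infrared at roughly T_earth ~ 255 K.
\[
  S_{\text{in}} \approx \frac{Q}{T_{\text{sun}}}, \qquad
  S_{\text{out}} \approx \frac{Q}{T_{\text{earth}}}, \qquad
  \frac{S_{\text{out}}}{S_{\text{in}}} \approx \frac{T_{\text{sun}}}{T_{\text{earth}}}
  \approx \frac{5800~\text{K}}{255~\text{K}} \approx 23,
\]
% so Earth exports roughly twenty times more entropy than it imports,
% and that surplus is what pays for locally ordered, low-entropy structures.
```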
Unfortunately, it is difficult to explain with our current understanding of physics why the entropy was so low in the early universe. In fact, this problem of the low entropy we demand of the big bang is one of the major problems with this theory.
The biology side of the story stems from Salvador's research into the genetic and ecological drivers that lead populations of harmless bacteria to evolve and emerge as pathogens. Crucial to the story is that it isn't just a question of the genetic code of the bacteria. One of Salvador's mantras is that life is an adaptive phenomenon responding to constant and unexpected changes in pressures from the environment.
This makes an organism an emergent phenomenon, where the final shape of it isn't contained in the individual pieces that make it up, but can be influenced by a series of larger systems to which it belongs. Living things comprise a network of interactions mediated through the environment. A living system is able to regulate billions of cells to maintain its overall functioning. Beyond that, collections of organisms belong to a network called an ecosystem, which also maintains a dynamical equilibrium.
This extends all the way to networks at life's largest scales. The idea of Earth being a self-regulating ecosystem was co-discovered by James Lovelock and Lynn Margulis in the 1970s, and it became known as the Gaia hypothesis. The takeaway for us is that the flow of negative entropy exists not only for individual living things, but for the entire Earth.
The sun sends free energy to Earth, and through a chain of complex interactions, the energy gets distributed through a network of interactions to living things, each relying on it to maintain its complexity in the face of increasing disorder. To contextualise the role of life within the framework of thermodynamics, we define these order-generating structures (such as a cell) as Units Of Negentropy, or UONs. But there's no such thing as a free lunch. When UONs release this energy back into the environment, they mostly do so in a form that has higher entropy than was received.
This uncanny parallel between living systems, UONs and the evolution of the universe may seem like a coincidence, but we choose not to think of it this way. Instead, we propose that it is a central organising principle of the evolution of the cosmos and the existence of life. Salvador elected to call this the entropocentric principle, a wink at the anthropic principle, which, in its strong form, states that the universe is fine-tuned for life. This arises because the laws of nature seem to be just right for life. For example, if the strength of the nuclear force that binds the hearts of atoms differed by a few per cent, stars wouldn't be able to produce carbon and there would be no carbon-based life.
The fine-tuning problem may not be as severe as it seems, though. In research Stephon conducted with colleagues, he showed that the universe can be fit for life even when we let the constants of nature like gravity and electromagnetism vary, so long as they vary simultaneously. Maybe we don't need the anthropic principle after all. The entropocentric principle, on the other hand, is harder to shake. If the universe was unable to provide pathways that enabled it to create regions of lower entropy, then life as we know it wouldn't exist. This leaves us wondering: do we live in a cosmic biosphere or is the universe a cosmic cell?
Stephon Alexander is a theoretical physicist at Brown University in Rhode Island who spends his time thinking about cosmology, string theory and jazz, and wondering if the universe is a self-learning AI. He is author of the book Fear of a Black Universe. Salvador Almagro-Moreno is a molecular biologist at the University of Central Florida who investigates emergent properties in complex biological systems, from protein evolution to pandemic dynamics.