Category Archives: Quantum Physics

The expansion of the universe could be a mirage, new theoretical … – Livescience.com

The expansion of the universe could be a mirage, a potentially controversial new study suggests. This rethinking of the cosmos also suggests solutions for the puzzles of dark energy and dark matter, which scientists believe account for around 95% of the universe's total energy and matter but remain shrouded in mystery.

The novel approach is detailed in a paper published June 2 in the journal Classical and Quantum Gravity by University of Geneva professor of theoretical physics Lucas Lombriser.

Related: Dark energy could lead to a second (and third, and fourth) Big Bang, new research suggests

Scientists know the universe is expanding because of redshift, the stretching of light's wavelength towards the redder end of the spectrum as the object emitting it moves away from us. Distant galaxies have a higher redshift than those nearer to us, suggesting those galaxies are moving ever further from Earth.
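To put a number on it: redshift is usually quantified as z, the fractional stretch in wavelength. The short sketch below (a rough illustration with made-up wavelength values, not figures from the study) computes z for a hypothetical distant galaxy.

```python
# Toy illustration of how redshift is quantified: z = (observed - emitted) / emitted.
# The wavelengths below are hypothetical, chosen only for the example.

emitted_nm = 656.3    # rest wavelength of the hydrogen-alpha line, in nanometers
observed_nm = 721.9   # the same line as measured from a hypothetical distant galaxy

z = (observed_nm - emitted_nm) / emitted_nm
print(f"redshift z = {z:.3f}")  # larger z means the light is stretched more
```

The farther the light is stretched toward the red, the larger z becomes, which is why distant galaxies show higher redshifts than nearby ones.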

More recently, scientists have found evidence that the universe's expansion isn't fixed, but is actually accelerating faster and faster. This accelerating expansion is captured by a term known as the cosmological constant, or lambda.

The cosmological constant has been a headache for cosmologists because predictions of its value made by particle physics differ from actual observations by 120 orders of magnitude. The cosmological constant has therefore been described as "the worst prediction in the history of physics."
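To see where a number like 120 orders of magnitude comes from, here is a hedged back-of-the-envelope comparison (an illustration using textbook constants, not a calculation from Lombriser's paper): a naive vacuum energy density with a Planck-scale cutoff versus the observed dark-energy density.

```python
import math

# Rough, order-of-magnitude illustration of the cosmological constant problem.
# All numbers are standard constants; the Planck-scale cutoff is a conventional
# (and naive) choice, which is exactly why the mismatch is so enormous.

c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J*s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
H0   = 2.27e-18     # Hubble constant (~70 km/s/Mpc) expressed in 1/s

rho_planck = c**7 / (hbar * G**2)                  # naive vacuum energy density, J/m^3
rho_crit   = 3 * H0**2 * c**2 / (8 * math.pi * G)  # critical energy density, J/m^3
rho_obs    = 0.69 * rho_crit                       # observed dark-energy share, roughly 69%

print(f"predicted ~ {rho_planck:.1e} J/m^3, observed ~ {rho_obs:.1e} J/m^3")
print(f"mismatch ~ 10^{math.log10(rho_planck / rho_obs):.0f}")
```

Depending on the cutoff chosen, the mismatch comes out at roughly 120 or more powers of ten, which is the figure quoted above.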

Cosmologists often try to resolve the discrepancy between the different values of lambda by proposing new particles or physical forces, but Lombriser tackles it by reconceptualizing what's already there.

"In this work, we put on a new pair of glasses to look at the cosmos and its unsolved puzzles by performing a mathematical transformation of the physical laws that govern it," Lombriser told Live Science via email.

In Lombriser's mathematical interpretation, the universe isn't expanding but is flat and static, as Einstein once believed. The effects we observe that point to expansion are instead explained by the evolution of the masses of particles such as protons and electrons over time.

In this picture, these particles arise from a field that permeates space-time. The cosmological constant is set by the field's mass and because this field fluctuates, the masses of the particles it gives birth to also fluctuate. The cosmological constant still varies with time, but in this model that variation is due to changing particle mass over time, not the expansion of the universe.

In the model, these field fluctuations result in larger redshifts for distant galaxy clusters than traditional cosmological models predict. And so, the cosmological constant remains true to the model's predictions.

"I was surprised that the cosmological constant problem simply seems to disappear in this new perspective on the cosmos," Lombriser said.

Lombriser's new framework also tackles some of cosmology's other pressing problems, including the nature of dark matter. This invisible material outweighs ordinary matter by a ratio of roughly 5 to 1, but remains mysterious because it doesn't interact with light.

Lombriser suggested that fluctuations in the field could also behave like a so-called axion field, with axions being hypothetical particles that are one of the suggested candidates for dark matter.

These fluctuations could also do away with dark energy, the hypothetical force stretching the fabric of space and thus driving galaxies apart faster and faster. In this model, the effect of dark energy, according to Lombriser, would be explained by particle masses taking a different evolutionary path at later times in the universe.

In this picture "there is, in principle, no need for dark energy," Lombriser added.

Luz Ángela García, a postdoctoral researcher at the Universidad ECCI in Bogotá, Colombia, was impressed with Lombriser's new interpretation and how many problems it resolves.

"The paper is pretty interesting, and it provides an unusual outcome for multiple problems in cosmology," Garca, who was not involved in the research, told Live Science. "The theory provides an outlet for the current tensions in cosmology."

However, García urged caution in assessing the paper's findings, saying it contains elements in its theoretical model that likely can't be tested observationally, at least in the near future.

Editor's note: This article was corrected at 1:30 p.m. ET on June 20, to reflect that redshift is evidence of cosmic expansion, but not evidence of accelerated cosmic expansion.

Did physicists get the idea of "fundamental" wrong? – Big Think

If all you start with are the fundamental building blocks of nature (the elementary particles of the Standard Model and the forces exchanged between them), you can assemble everything in all of existence with nothing more than those raw ingredients. That's the most common approach to physics: the reductionist approach. Everything is simply the sum of its parts, and these simple building blocks, when combined together in the proper fashion, can come to build up absolutely everything that could ever exist within the Universe, with absolutely no exceptions.

In many ways, it's difficult to argue with this type of description of reality. Humans are made out of cells, which are composed of molecules, which themselves are made of atoms, which in turn are made of fundamental subatomic particles: electrons, quarks, and gluons. In fact, everything we can directly observe or measure within our reality is made out of the particles of the Standard Model, and the expectation is that someday, science will reveal the fundamental cause behind dark matter and dark energy as well, which thus far are only indirectly observed.

But this reductionist approach might not be the full story, as it omits two key aspects that govern our reality: boundary conditions and top-down formation of structures. Both play an important role in our Universe, and might be essential to our notion of fundamental as well.

On the right, the gauge bosons, which mediate the three fundamental quantum forces of our Universe, are illustrated. There is only one photon to mediate the electromagnetic force, there are three bosons mediating the weak force, and eight mediating the strong force. This suggests that the Standard Model is a combination of three groups: U(1), SU(2), and SU(3), whose interactions and particles combine to make up everything known in existence.

This might come as a surprise to some people, and might sound like a heretical idea on its surface. Clearly, there's a difference between phenomena that are fundamental (like the motions and interactions of the indivisible, elementary quanta that compose our Universe) and phenomena that are emergent, arising solely from the interactions of large numbers of fundamental particles under a specific set of conditions.

Take a gas, for example. If you look at this gas from the perspective of fundamental particles, you'll find that every fundamental particle is bound up into an atom or molecule that can be described as having a certain position and momentum at every moment in time: well-defined to the limits set by quantum uncertainty. When you take together all the atoms and molecules that make up a gas, occupying a finite volume of space, you can derive all sorts of thermodynamic properties of that gas, including its entropy, pressure, and temperature.

Entropy, pressure, and temperature are the derived, emergent quantities associated with the system, and can be derived from the more fundamental properties inherent to the full suite of component particles that compose that physical system.

This simulation shows particles in a gas of a random initial speed/energy distribution colliding with one another, thermalizing, and approaching the Maxwell-Boltzmann distribution. The quantum analogue of this distribution, when it includes photons, leads to a blackbody spectrum for the radiation. Macroscopic properties like pressure, temperature, and entropy can all be derived from the collective behavior of the individual component particles within the system.
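As a concrete (and deliberately simplified) version of that idea, the sketch below generates a microscopic state for an ideal gas and then recovers temperature and pressure from nothing but the particle velocities; the particle mass, box volume, and particle count are arbitrary choices, and this is not the simulation shown in the article.

```python
import numpy as np

# Minimal sketch: derive "emergent" quantities (temperature and pressure)
# from a microscopic description of an ideal gas. All parameters are illustrative.

k_B  = 1.380649e-23   # Boltzmann constant, J/K
m    = 6.63e-26       # mass of one argon atom, kg (arbitrary choice of gas)
N    = 100_000        # number of particles
V    = 1e-3           # box volume, m^3
T_in = 300.0          # temperature used to generate the microstate, K

rng = np.random.default_rng(0)
# For an ideal gas in equilibrium, each velocity component is Gaussian
# with variance k_B * T / m (this is the Maxwell-Boltzmann distribution).
v = rng.normal(0.0, np.sqrt(k_B * T_in / m), size=(N, 3))

mean_ke = 0.5 * m * np.mean(np.sum(v**2, axis=1))  # average kinetic energy per particle
T_out   = 2.0 * mean_ke / (3.0 * k_B)              # equipartition: <KE> = (3/2) k_B T
P_out   = N * k_B * T_out / V                      # ideal-gas law for the pressure

print(f"recovered temperature ~ {T_out:.1f} K, pressure ~ {P_out:.2e} Pa")
```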

But not every one of our familiar, macroscopic laws can be derived from these fundamental particles and their interactions alone. For example, when we look at our modern understanding of electricity, we recognize that it's fundamentally composed of charged particles in motion through a conductor (such as a wire), where the flow of charge over time determines the quantity that we know of as electric current. Wherever you have a difference in electric potential, or a voltage, the magnitude of that voltage determines how fast that electric charge flows, with voltage being proportional to current.

On macroscopic scales, the relation that comes out of it is the famous Ohm's law: V = IR, where V is voltage, I is current, and R is resistance.

Only, if you try to derive this from fundamental principles, you can't. You can derive that voltage is proportional to current, but you cannot derive that the thing that turns your proportionality into an equality is resistance. You can derive that there's a property to every material known as resistivity, and you can derive the geometrical relationship between how the cross-sectional area and the length of your current-carrying wire affect the current that flows through it, but that still won't get you to V = IR.
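Here is what the derivable part looks like in practice, a minimal sketch using textbook values for copper (standard reference numbers, not figures from the article): resistivity plus geometry gives a resistance, and the macroscopic law V = IR then follows as an empirical relation rather than a derived one.

```python
import math

# Resistivity plus geometry gives resistance, R = rho * L / A, and then the
# macroscopic Ohm's law V = I * R. Copper's resistivity is a standard tabulated
# room-temperature value; length, diameter, and current are arbitrary.

rho_copper = 1.68e-8        # resistivity of copper, ohm*m
length     = 10.0           # wire length, m
diameter   = 1.0e-3         # wire diameter, m
area       = math.pi * (diameter / 2) ** 2

R = rho_copper * length / area   # geometric relation for resistance
I = 2.0                          # current, A
V = I * R                        # Ohm's law on macroscopic scales

print(f"R = {R * 1000:.1f} milliohm, so V = {V * 1000:.1f} millivolt at {I} A")
```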

At temperatures greater than the critical temperature of a superconductor, magnetic flux can freely pass through the conductor's atoms. But below the critical superconducting temperature, all of the magnetic flux gets expelled. This is the essence of the Meissner effect, which enables flux-pinning inside regions of a superconductor and the resultant application of magnetic levitation.

In fact, there's a good reason you can't derive V = IR from fundamental principles alone: because it's neither a fundamental nor a universal relation. After all, there's a famous experimental set of conditions where this relationship breaks down: inside all superconductors.

In most materials, as they heat up, the resistance of the material to current flowing through it increases, which makes some intuitive sense. At higher temperatures, the particles inside a material zip around more quickly, which makes pushing charged particles (such as electrons) through it more difficult. Common materials such as nickel, copper, platinum, tungsten, and mercury all have their resistances rise as their temperatures increase, as it becomes more and more difficult at higher temperatures to achieve the same flow of current through a material.

On the flipside, however, cooling a material down often makes it easier for current to flow through it. These same materials, as the temperature lowers and cools them down, exhibit less and less resistance to the flow of current. Only, there's a specific transition point: all of a sudden, once a specific temperature threshold (unique to each material) is crossed, the resistance suddenly drops to zero.

When cooled to low enough temperatures, certain materials will superconduct: the electrical resistance inside them will drop to zero. When exposed to a strong magnetic field, some superconductors will exhibit levitation effects, and with a properly configured external magnetic field, it's possible to pin the superconducting object in place in one or more dimensions, resulting in spectacular applications like quantum levitation.

It's specifically when this occurs that we declare a material has entered a superconducting state. First discovered all the way back in 1911, when mercury was cooled to below 4.2 K, superconductivity still remains only partially explained even today; it cannot be derived or fully explained by fundamental principles alone.
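A toy model makes the abruptness of that transition easy to picture. In the sketch below, the critical temperature of 4.2 K for mercury comes from the article; the normal-state resistance values are invented purely for illustration.

```python
# Toy model of a superconducting transition: resistance shrinks smoothly as the
# metal cools, then drops abruptly to zero below the critical temperature T_c.
# Mercury's T_c of 4.2 K is quoted above; the other numbers are made up.

T_C = 4.2  # critical temperature of mercury, in kelvin

def resistance(temperature_k: float) -> float:
    """Illustrative resistance in ohms; exactly zero in the superconducting state."""
    if temperature_k < T_C:
        return 0.0
    # crude linear normal-state behavior: less resistance as the metal cools
    return 0.1 + 0.002 * (temperature_k - T_C)

for T in (300, 77, 10, 4.3, 4.1):
    print(f"T = {T:>5} K  ->  R = {resistance(T):.3f} ohm")
```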

Instead, one needs to apply another set of rules atop the fundamental particles and their interactions: a set of rules known collectively as boundary conditions. Simply giving the information about what forces and particles are at play, even if you include all the information you could possibly know about the individual particles themselves, is insufficient to describe how the full system will behave. You also need to know, in addition to what's going on within a specific volume of space, what's happening at the boundary that encloses that space, with the two most common types being conditions that fix the value of a quantity on the boundary and conditions that fix its rate of change there (Dirichlet and Neumann conditions, respectively).

If you want to create a propagating electromagnetic wave down a wire where the electric and magnetic fields of that propagating wave are always both perpendicular to the wire and perpendicular to one another, you have to tweak the boundary conditions (e.g., set up a coaxial cable for the wave to travel through) in order to get the desired outcome.

This diagram shows a cutaway of the interior of a coaxial cable. With current flowing in one direction down the central, interior cable and the opposite direction down the outer cable, these boundary conditions enable the propagation of an internal transverse electric-and-magnetic mode in the space between the conductors. This configuration, known as TEM, can only arise due to the specific boundary conditions present in a coaxial cable-like system.

Boundary conditions are of tremendous importance under a wide variety of physical circumstances as well: for plasmas in the Sun, for particle jets around the active black holes at the centers of galaxies, and for the ways that protons and neutrons configure themselves within an atomic nucleus. They're required if we want to explain why external magnetic and electric fields split the energy levels in atoms. And they're absolutely going to come into play if you want to learn how the first strings of nucleic acids came to reproduce themselves, as the constraints and inputs from the surrounding environment must be key drivers of those processes.

One of the most striking places where this arises is on the largest cosmic scales of all, where for decades, a debate took place between two competing lines of thought as to how the Universe grew up and formed stars, galaxies, and the grandest cosmic structures of all.

This image shows the view of JWST's NIRCam instrument as it looked at galaxy cluster Abell 2744 and revealed a number of galaxies that are members of a proto-cluster. The red squares show several of the galaxies for which spectroscopic measurements were obtained; the orange circles are photometric galaxy candidates that may yet turn out to be part of this cluster. Small, low-mass galaxies form earlier; larger, evolved galaxies and galaxy clusters only appear at later times.

In a top-down Universe, the largest imperfections are on the largest scales; they begin gravitating first, and as they do, these large imperfections fragment into smaller ones. They'll give rise to stars and galaxies, sure, but they'll mostly be bound into larger, cluster-like structures, driven by the gravitational imperfections on large scales. Galaxies that are a part of groups and clusters would have largely been a part of their parent group or cluster since the very beginning, whereas isolated galaxies would only arise in sparser regions: in between the pancake-and-filament regions where structure was densest.

A bottom-up Universe is the opposite, where gravitational imperfections dominate on smaller scales. Star clusters form first, followed later by galaxies, and only thereafter do the galaxies collect together into clusters. The primary way that galaxies form would be as the first-forming star clusters gravitationally grow and accrete matter, drawing adjacent star clusters into them to form galaxies. The formation of larger-scale structure would only occur as small-scale imperfections experience runaway growth, eventually beginning to affect larger and larger cosmic scales.

If the Universe were purely built based on a top-down scenario of structure formation, we'd see large collections of matter fragment into smaller structures like galaxies. If it were purely bottom-up, it would begin by forming small structures whose mutual gravitation brings them together later. Instead, the actual Universe appears to be an amalgam of both, meaning that it's not described well by either scenario on its own.

In order to answer this question from an observational perspective, cosmologists began attempting to measure what we call cosmic power, which describes on what scale(s) the gravitational imperfections that seed the Universe's structure first appear. If the Universe is entirely top-down, all of the power would be clustered on large cosmic scales, and there would be no power on small cosmic scales. If the Universe is entirely bottom-up, all the cosmic power is clustered on the smallest of cosmic scales, with no power on large scales.

But if there's at least some power on all manner of cosmic scales, we'd instead need to characterize the Universe's power spectrum by what we call a spectral index: a parameter that tells us how tilted the Universe's power is, and whether it favors large scales, favors small scales, or is perfectly scale-invariant, with equal power on all scales.

If it were this final case, the Universe would've been born with power evenly distributed on all scales, and only gravitational dynamics would drive the structure formation of the Universe to get the structures we wind up observing at late times.

The evolution of large-scale structure in the Universe, from an early, uniform state to the clustered Universe we know today. The type and abundance of dark matter would deliver a vastly different Universe if we altered what our Universe possesses. Note that in all cases, small-scale structure arises before structure on the largest scales comes about, and that even the most underdense regions of all still contain non-zero amounts of matter.

When we look back at the earliest galaxies we can see (a set of records that are now being newly set all the time with the advent of JWST), we overwhelmingly see a Universe dominated by smaller, lower-mass, and less evolved galaxies than we see today. The first groups and proto-clusters of galaxies, as well as the first large, evolved galaxies, don't seem to appear until hundreds of millions of years later. And the larger-scale cosmic structures, like massive clusters, galactic filaments, and the great cosmic web, seem to take billions of years to emerge within the Universe.

Does this mean that the Universe really is bottom-up, and that we don't need to examine the birth conditions for the larger scales in order to understand the types of structure that will eventually emerge?

No; that's not true at all. Remember that, regardless of what types of seeds of structure the Universe begins with, gravitation can only send and receive signals at the speed of light. This means that the smaller cosmic scales begin to experience gravitational collapse before the larger scales can even begin to affect one another. When we actually measure the power spectrum of the Universe and recover the scalar spectral index, we measure it to be equal to 0.965, with an uncertainty of less than 1%. It tells us that the Universe was born nearly scale-invariant, but with slightly more (by about 3%) large-scale power than small-scale power, meaning that it's actually a little bit more top-down than bottom-up.
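In the standard parameterization (a textbook convention, not something spelled out in the article), the primordial power spectrum is written as a power law in scale, and the measured tilt of 0.965 translates directly into slightly more power on large scales. The sketch below uses that convention with an illustrative amplitude and pivot scale.

```python
# Nearly scale-invariant primordial power spectrum:
#   Delta^2(k) = A_s * (k / k_pivot) ** (n_s - 1)
# n_s = 1 would be perfectly scale-invariant; the measured n_s = 0.965 tilts the
# power slightly toward large scales (small k). Amplitude and pivot are
# conventional illustrative values, not a fit performed here.

n_s   = 0.965
k_piv = 0.05       # pivot wavenumber, 1/Mpc
A_s   = 2.1e-9     # rough amplitude of primordial fluctuations

def primordial_power(k: float) -> float:
    return A_s * (k / k_piv) ** (n_s - 1)

large_scale = primordial_power(0.005)  # small k corresponds to a large cosmic scale
small_scale = primordial_power(0.5)    # large k corresponds to a small cosmic scale
print(f"large-scale / small-scale power = {large_scale / small_scale:.3f}")
```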

The large, medium, and small-scale fluctuations from the inflationary period of the early Universe determine the hot and cold (underdense and overdense) spots in the Big Bang's leftover glow. These fluctuations, which get stretched across the Universe in inflation, should be of a slightly different magnitude on small scales versus large ones: a prediction that was observationally borne out at approximately the ~3% level. By the time we observe the CMB, 380,000 years after the end of inflation, there's a spectrum of peaks-and-valleys in the temperature/scale distribution of fluctuations, owing to interactions between normal/dark matter and radiation.

In other words, if you want to explain all of the phenomena that we actually observe in the Universe, simply looking at the fundamental particles and the fundamental interactions between them will get you far, but won't cover it all. A great many phenomena in a great many environments require that we throw in the additional ingredients of conditions, both initially and at the boundaries of the physical system, on much larger scales than the ones where fundamental particles interact. Even with no novel laws or rules, simply starting from the smallest scales and building up from there won't encapsulate everything that's already known to occur.

This doesn't mean, of course, that the Universe is inherently non-reductionist, or that there are some important and fundamental laws of nature that only appear when you look at non-fundamental scales. Although many have made cases along those lines, those are tantamount to "God of the gaps" arguments, with no such rules ever having been found, and no emergent phenomena ever coming to be only because some new rule or law of nature has been found on a non-fundamental scale. Nevertheless, we must be cautious against adopting an overly restrictive view of what "fundamental" means. After all, the elementary particles and their interactions might be all that make up our Universe, but if we want to understand how they assemble and what types of phenomena will emerge from that, much more is absolutely necessary.

How Does Multiverse Theory Relate to Time Travel? – DISCOVER Magazine

While many sci-fi writers have wondered if time travel into the past can be allowed as long as you slip into an alternate universe, you should be warned: Nature doesn't like a cheater.

Time travel into the past is a real pain in the neck. For one, as far as we can tell it's forbidden in our universe. You can travel into the future as slowly or as quickly as you like, but you can't stop time or go in reverse.

Technically speaking, physicists are not exactly sure why time travel is forbidden. We don't have a single law of physics that clearly rejects the possibility.

But every time we try to concoct a time travel concept (like wormholes or twisting paths around infinitely long cylinders), we find some reason that it's not allowed. Wormholes, it turns out, require negative mass to stabilize themselves, and infinitely long cylinders are rather hard to come by.

Read More: Why Do Humans Perceive Time The Way We Do?

Even within any glimmer of a possibility that time travel might be possible, traveling into our own pasts opens up all sorts of noxious paradoxes, as plenty of films and comic book storylines have demonstrated.

Beyond the fact that the past is over and done with, we must reckon with the reality that what happened in the past created the present that we experience.

To put the paradox succinctly: If you change the past, you change the present, but the present already exists.

One of the most common examples of this head-scratching scenario is known as the grandfather paradox.

Suppose you travel back in time and kill your own grandfather before he met your grandmother (no need to dissect your motivations behind this hypothetical act here). That means one of your parents was never born, which means you never exist. But if you never existed, how did you go back in time to kill your grandfather in the first place?

Read More: What Is the Grandfather Paradox of Time Travel?

To address this dilemma, we could invoke the idea of parallel or alternate universes, a concept under the general umbrella of the multiverse.

Perhaps when you go back and kill your grandfather, you create a new universe where you don't exist. You can then return to your original universe where everything is hunky-dory (or, at least, you're still alive). Essentially, you only monkeyed around with some other, disconnected cosmos.

There are, in fact, two places in physics where the multiverse concept pops up. Unfortunately for any wannabe time travelers, neither of them allows for this type of parallel-futures hopping.

The first place where alternate universe theories have gained traction is in quantum mechanics.

Subatomic processes are inherently random. When we run an experiment, we never know what result we're going to get. One explanation for this is that all experimental results happen, just in separate universes.

If someone shoves an electron through a strong magnetic field, for example, and it has a fifty-fifty chance of deflecting either up or down, then every time they run an electron through the device, they get two universes: one where the electron went up, and one where the electron went down.

As the theory goes, this kind of splitting isn't just limited to arcane physics experiments. It happens with every single quantum interaction throughout the cosmos, which is a lot of interactions, thus generating countless parallel worlds.
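The arithmetic behind "countless" is stark. If every two-outcome quantum event (like the electron deflection above) doubled the number of branches, the count would grow as two raised to the number of events; the event counts below are arbitrary, chosen only to show the scale.

```python
# Back-of-the-envelope count of branches if each two-outcome event doubled them.
# Purely illustrative; real interaction counts are vastly larger.

for n_events in (10, 100, 1000):
    branches = 2 ** n_events
    print(f"{n_events:>4} two-outcome events -> about 10^{len(str(branches)) - 1} branches")
```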

In this view, there is a universe out there that already exists where your grandfather is dead and you were never born. But you could never see it.

Read More: Advanced Quantum Material Curves the Fabric of Space

Even if the theory above is actually happening, the rules of quantum physics are absolutely clear: There is no travel or communication between these universes.

Granted, "universe" is probably the wrong word here. That's because the splitting is really separate partitions of a single quantum wave function. And that separation is really permanent.

If there was any kind of connection allowed, then the many branching possibilities of quantum processes simply wouldn't happen.

The second place where the multiverse pops up in physics is in studies related to the extremely early universe.

From this lens, our cosmos could be just one of many, like a bubble in an infinite foam of universes. These universes, too, could contain all the many interesting and varied possibilities that make life so fun.

In this (extremely hypothetical) view, there are many universes out there that contain alternate realities from the one you know. There's one where your grandfather died. There's another where no stars ever formed. And there's a universe where all your matter is replaced with antimatter, and so on.

Because all these bubble universes exist in the same expanding foam that is reality, those alternate universes really are out there, an unimaginable but still-finite distance away.

But even in this foam-like collection of bubble universes, time still does its thing. That is, it moves forward, always.

Even if you could concoct some scheme to travel among the alternate universes, those universes still share the same spacetime fabric, marching forward along with us. Thus they all appear to forbid time travel, as the theory goes.

They also, in theory, share the reality that nature doesn't like a cheater. So, if you're theoretically going to visit a universe where you never existed, maybe nix the time travel method.

Read More: Scientists Attempt to Map the Multiverse

Fast magnetic imaging with diamond-based quantum sensor technology – Phys.org

by Fraunhofer Institute for Applied Solid State Physics IAF

Microscopic imaging of magnetic fields, enabled by quantum sensing, allows the measurement of the unique magnetic fingerprint of objects. This opens the door for fundamentally new applications in various fields such as materials testing or biomedicine.

Fraunhofer IAF has developed an innovative method using fast camera images in the form of an improved wide-field magnetometer. The system offers a unique compromise of sensitivity, resolution and speed. It will be presented at LASER World of QUANTUM 2023, held June 27 to 30 in Munich, as part of the Quantum Sensing Hub Freiburg, in which the institutes Fraunhofer IAF, IPM and IWM pool their expertise in the field of quantum magnetometry.

Researchers at the Fraunhofer Institute for Applied Solid State Physics IAF have managed to harness the great potential of quantum sensor technology based on nitrogen-vacancy (NV) centers in a unique measurement setup. Their wide-field magnetometer enables the magnetic stray field of a sample to be measured rapidly over a large range. Its high measurement accuracy is characterized by a resolution down to the nanometer range and is absolutely quantifiable.

This measurement method opens up new avenues in metrology and is suitable for various industries such as (nano-)electronics, material sciences or biomedicine due to its wide range of applications, from inorganic to organic samples.

The innovative measurement system was developed in the course of the Fraunhofer lead project QMag. Over the course of the project, concentrated magnetometry expertise and infrastructure has developed in Freiburg im Breisgau at the three institutes Fraunhofer IAF, IPM and IWM, which together form the Quantum Sensing Hub Freiburg.

Image caption: Schematic structure of the measuring principle of the wide-field magnetometer. Due to the direct contact of the sample with the diamond sensor, magnetic nanoparticles can be measured precisely and quickly. Credit: Fraunhofer IAF

Wide-field magnetometry is based on NV centers in thin diamond films and is a young approach in quantum sensing. The measurement setup developed at Fraunhofer IAF uses an arbitrary waveform generator (AWG), which generates microwave radiation and triggers a laser and the recording time window of a camera with nanosecond precision. By using different measurement protocols, this allows for high flexibility and precision of measurements.

"The wide-field magnetometer benefits not only from our improved setup, but also from the growth process developed at Fraunhofer IAF for diamond plates, which we use as sensors," explains Dr. Jan Jeske, deputy business unit manager quantum devices at Fraunhofer IAF. The substrates grown at the institute are based on (100)-oriented, pure, undoped diamond of type "IIa" with a thickness of 500 m and an area of 4 x 4 mm. This substrate is overgrown with a thin layer in which the NV centers for the sensor application are generated close to the sample.

In materials science, experimental methods are used to characterize polycrystalline materials in order to obtain a microscopic understanding of the macroscopic material behavior. This makes it possible to better understand materials and optimize their properties. However, current methods usually rely on long measurement times and large experimental facilities. Often, vacuum conditions or high-energy particles are also necessary, which can have a detrimental effect on the sample material.

Wide-field magnetometry based on NV centers is an alternative, non-invasive method that operates at room temperature. This opens up new possibilities for insights into the microscopic magnetic field distribution, which has great potential for material analyses. The system is not limited to inorganic material samples, but can also be applied to organic samples due to its comparatively low demands on the measurement environment. These measurement properties, coupled with the high measurement speed of the method developed at Fraunhofer IAF, allow even complex measurements such as fluctuations, alternating fields and alternating current (AC) measurements, paving the way for new material analysis methods.

Provided by Fraunhofer Institute for Applied Solid State Physics IAF

Researchers manufacture first-ever droplet-etched quantum dots that glow in C-band optical light – Phys.org

by Universität Paderborn

Paderborn researchers from the Department of Physics and the Institute for Photonic Quantum Systems (PhoQS) have succeeded in manufacturing quantum dots (nanoscopic structures where the material's quantum properties come into play) that glow in the optical C-band at wavelengths of 1530 to 1565 nanometers.

This is particularly special as it is the first time that quantum dots like these have been manufactured by the local droplet etching and subsequent filling of nanoholes in an indium aluminum arsenide / indium gallium arsenide system lattice-matched to indium phosphide substrates.

In the future, these quantum dots could, for example, be used as a source of entangled photons, which might be relevant for innovative encryption systems involving quantum technologies. Luminescence in the optical C-band is particularly relevant here: signal loss in fiber optic networks is minimal at these wavelengths, enabling potential future use with the current network. The researchers have now published their findings in the journal AIP Advances.
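For context on those numbers (a standard conversion using textbook constants, not a calculation from the paper), C-band wavelengths of 1530 to 1565 nanometers correspond to photon frequencies near 192 to 196 THz and energies of roughly 0.8 electron volts.

```python
# Convert the telecom C-band wavelengths quoted above into frequency and photon energy.
# Standard physical constants; not values taken from the paper.

h  = 6.626e-34   # Planck constant, J*s
c  = 2.998e8     # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

for wavelength_nm in (1530, 1565):
    lam = wavelength_nm * 1e-9
    freq_THz = c / lam / 1e12
    energy_eV = h * c / lam / eV
    print(f"{wavelength_nm} nm -> {freq_THz:.1f} THz, {energy_eV:.3f} eV")
```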

The team, consisting of Dennis Deutsch, Christopher Buchholz, Dr. Viktoryia Zolatanosha, Prof. Dr. Klaus Jöns and Prof. Dr. Dirk Reuter, etched nanoholes in an indium aluminum arsenide surface and filled them with indium gallium arsenide.

"One critical element of manufacturing quantum dots, if they are to be used for generating entangled photons, is lattice matching. If this is not performed, it causes tension in the quantum dot, which can dispel the quantum mechanical entanglement of the photons generated," Denis Deutsch explains.

Manufacturing quantum dots by filling droplet-etched holes is not new, but unlike in previous processes, the researchers used lattice matching to indium phosphide rather than gallium arsenide. The change of material enabled them to achieve emission in the C-band. As well as lattice matching materials, the symmetry of quantum dots is also a key factor in their suitability as an entangled photon source. The publication therefore also statistically evaluated and examined the symmetry of numerous holes manufactured using different parameters.

This is a long way from being technically implementable, but the method is already demonstrating its potential for manufacturing quantum dots. This matters because, in the future, quantum computing is likely to be far superior to traditional computing when it comes to encryption.

The phenomenon of entanglement is a promising approach to securely exchanging encrypted data, as any attempts to eavesdrop are exposed thanks to the laws of physics. Since entangled photons are exchanged via fiber optic cables, it is essential that transmission should be as low-loss as possible. "Manufacturing photons in the particularly low-loss optical C-band is therefore a major step forward in encryption using entangled photons," Deutsch concludes.

More information: D. Deutsch et al, Telecom C-band photon emission from (In,Ga)As quantum dots generated by filling nanoholes in In0.52Al0.48As layers, AIP Advances (2023). DOI: 10.1063/5.0147281

Journal information: AIP Advances

Provided by Universität Paderborn

Research – Stony Brook University

Quantum Information Science and Technology (QIST) is a key area of growth for Stony Brook University, and a central piece of investment from President Maurie McInnis' Innovation and Excellence (PIE) Fund.

The field of QIST offers scientific opportunities that may revolutionize information technology and how we store, manipulate and transmit information.

Earlier this year, faculty were asked to submit proposals for protocenters to comprise a QIST Consortium. A protocenter represents the planning stage in a QIST research program, and, based on the success of each protocenter in key domains including the generation of cutting-edge research and external funding, each has the potential to grow into a university-recognized center within the QIST Consortium.

Protocenter proposals requested between $100,000 and $300,000 per year for up to four years, at the end of which time their success will be evaluated to determine additional funding and potential growth into a QIST center within the QIST Consortium. Protocenters demonstrating considerable success and potential for long-term sustainability will then transition into a university-recognized center within the QIST Consortium.

The review panel selected two proposals to fund for the next four years.

Leon Shterengas, professor in the Department of Electrical and Computer Engineering in the College of Engineering and Applied Sciences, serves as Principal Investigator for the Devices for Quantum Sensing and Communication Protocenter. Along with co-PIs Jennifer Cano (assistant professor, Department of Physics and Astronomy in the College of Arts and Sciences), Eden Figueroa (associate professor, Department of Physics and Astronomy), Dmitri Donetski (associate professor, Department of Electrical and Computer Engineering) and Sergey Suchalkin (associate professor, Department of Electrical and Computer Engineering), this research center will focus on creating new and advanced devices that use the principles of quantum physics.

The initial goal of the research center will be the development of photonic devices for quantum communication and sensing. The primary targets will be the design and fabrication of single-photon and entangled-photon emitters for the quantum internet and distributed sensing, as well as single-photon detectors for quantum ghost imaging in the mid-infrared region of the spectrum.

The research center will also work on the development of a scalable material platform for quantum computation and signal processing. Advanced epitaxial and nanofabrication capabilities, including metamorphic epitaxy of ultra-short-period superlattices and monolithic integration with photonic crystals, will be applied.

"This research center offers a unique opportunity to leverage state-of-the-art semiconductor optoelectronic technological capabilities developed at the Belenky Lab at the Department of Electrical and Computer Engineering to achieve critical advances in the QIST field," Shterengas said.

Himanshu Gupta, professor in the Department of Computer Science, serves as Principal Investigator for the Center for Quantum Computing and Networks Protocenter. The following faculty members serve as co-PIs: Aruna Balasubramanian (associate professor, Department of Computer Science), Xianfeng (David) Gu (professor, Department of Computer Science), Jon Longtin (Interim Dean of the College of Engineering and Applied Sciences and professor, Department of Mechanical Engineering), Omkant Pandey (associate professor, Department of Computer Science), Supartha Podder (assistant professor, Department of Computer Science; NSF Quantum Faculty Fellow), C. R. Ramakrishnan (professor, Department of Computer Science) and Nengkun Yu (associate professor, Department of Computer Science; SUNY Empire Innovation Scholar).

The research conducted in this protocenter will mainly focus on solving complex problems related to the design and development of large, scalable, and reliable quantum computing systems. In particular, the protocenter will have four specific research areas: (i) Quantum Networks and Distributed Quantum Computing, (ii) Quantum Algorithms and Advantages, (iii) Quantum Cryptography and (iv) Quantum Verification.

The specific research goals of the center align well with the eight frontiers identified in the 2020 Quantum Frontiers report by the White House's National Quantum Coordination Office. This broad research agenda positions the protocenter to attract collaborators and compete for different funding opportunities. The overall goals of the protocenter are to perform fundamental research in the scientific and technological foundations of quantum computing platforms and to build a vibrant, diverse community of researchers and practitioners.

Protocenter PI Himanshu Gupta said, "We are thankful to the university for giving us this opportunity to build a center on Quantum Computing and Networks. We hope to leverage the strengths of our team to build a self-sustaining, visible center. Quantum Computing has the potential to significantly alter the computing landscape, and the focus of our center would be to conduct transformative research that can lead to new frontiers and usable technology in quantum information science."

Beth Squire

Read story "QIST Protocenters Receive a Piece of the PIE (Fund)" on SBU News

15 Action Films with Plots So Convoluted, You’ll Need an Actual Map – Startefacts

These are the movies that made our brains hurt. Repeatedly.

Welcome, dear reader, to a bewildering cinematic landscape filled with narrative twists, plot-turns and mind-bending realities that refuse to adhere to the law of simplicity.

Here, in the sometimes confusing but always entertaining world of action films, some plots demand more than just your popcorn and soda; they require a compass, a map, and perhaps a background in quantum physics.

15. Inception (2010)

Christopher Nolan's tour de force through dreamscapes is a riveting exploration of the human subconscious. Armed with an ensemble cast and a plot so twisted that it makes a pretzel look straight, 'Inception' requires a notepad, a pause button, and a personal lecture from Nolan himself to understand. Dreams within dreams within dreams and no, pinching yourself won't help here.

14. Primer (2004)

Shane Carruth's low-budget indie darling doesn't just walk the line of complex narrative; it pole-vaults over it. 'Primer' takes a hard dive into time travel and wraps its plot around a Möbius strip of paradoxes. With a script that would baffle Einstein, it's the kind of movie that makes Sudoku seem like child's play.

13. Ghost in the Shell (1995)

This seminal anime film takes us deep into a cyberpunk universe where cybernetic humans coexist with artificial intelligence. What starts as a simple investigation spirals into a metaphysical discourse about consciousness, identity, and reality. Halfway through, you may find yourself wishing for an actual shell to crawl into.

12. The Matrix Revolutions (2003)

If the first 'Matrix' was a refreshing wake-up call, the third instalment felt more like being trapped in a recursive nightmare. 'Revolutions' amps up the series' philosophy quotient, culminating in a finale that could make Schrödinger's cat scratch its head in confusion. Red pill or blue pill? How about a headache pill?

11. Predestination (2014)

'Predestination' isn't just a movie; it's a mind-bending odyssey through time and fate that challenges you at every twist. The Spierig Brothers crafted a narrative so interwoven, so cyclical, it defies straightforward comprehension. Ethan Hawke gives a stellar performance as a Temporal Agent, but it's the screenplay that's the real star: a star around which we orbit in dizzying circles, trying to piece together a plot as elusive as time itself.

10. Donnie Darko (2001)

Richard Kelly's cult classic introduces us to a disturbed teenager, a monstrous rabbit, and a series of bizarre events culminating in an impending apocalypse. However, beneath this strange premise is a narrative that merges quantum mechanics, time travel, and philosophical dilemmas. 'Donnie Darko' doesn't merely require a map; it needs an astrophysicist to decipher its multi-layered plot. Keep an eye on that countdown clock; it's ticking toward a cerebral workout!

9. Looper (2012)

Rian Johnson's 'Looper' offers an intriguing take on time travel. It's not merely the complex mechanics of past meeting future that demands our attention; it's the moral and existential questions that come along. While Bruce Willis and Joseph Gordon-Levitt deliver impressive performances, the plot's intricacies are as twisted as the looping timelines it portrays. A note to the viewer: a detailed flowchart might be as necessary as a bucket of popcorn.

8. Cloud Atlas (2012)

The Wachowskis and Tom Tykwer presented us with an ambitious narrative spanning five centuries in 'Cloud Atlas'. With six intertwined stories exploring themes of reincarnation, destiny, and interconnectedness, it's less of a film and more of a cinematic jigsaw puzzle. Each story is a piece of the larger narrative, and assembling them in order can be as daunting as scaling a mountain peak.

7. Interstellar (2014)

Another entry from the mind of Christopher Nolan, 'Interstellar' takes us on an awe-inspiring journey through wormholes, across galaxies, and into the heart of a black hole. Between the science of relativity and the concept of five-dimensional space-time, the plot becomes as infinite as the universe itself. An actual map? You might need an actual spaceship.

6. Mr. Nobody (2009)

Jaco Van Dormael's magnum opus 'Mr. Nobody' is a tapestry of tangled timelines, quantum physics, and 'what if' scenarios. We follow Nemo Nobody, played by Jared Leto, as he navigates different versions of his life based on various choices. The film is an exhilarating exercise in mental gymnastics, making the labyrinth of Minos seem like a backyard maze.

5. Memento (2000)

Yet again, we find ourselves ensnared in Nolan's web. 'Memento' subverts the traditional storytelling structure, unfolding its plot in a unique non-linear format that mirrors the protagonist's anterograde amnesia. It's an ingenious narrative trick, but it turns the plot into a jigsaw puzzle: one you're asked to solve while blindfolded.

4. Twelve Monkeys (1995)

Terry Gilliam's dystopian time-travel thriller 'Twelve Monkeys' is an intellectual maze that intertwines the past, present, and future in a seemingly chaotic fashion. It combines elements of mental illness, environmental devastation, and fatalism in a plot as intricate as the gears of a clock. Tick-tock goes the plot, leaving us to untangle its threads.

3. The Adjustment Bureau (2011)

What happens when a romantic thriller meets a science fiction mystery? You get 'The Adjustment Bureau'. Matt Damon stars as a politician whose life is manipulated by a group of mysterious agents controlling fate itself. As the film delves into philosophical questions about destiny and free will, the plot takes more turns than a labyrinth. Here's a tip: don't adjust your screen, adjust your expectations of understanding everything at the first watch.

2. Timecrimes (2007)

This Spanish sci-fi thriller by Nacho Vigalondo presents a multi-layered exploration of time travel that makes a Rubik's cube look like a walk in the park. As the protagonist stumbles into a series of eerie and confusing events, the plot spirals into a dizzying temporal paradox. A word of advice: keep track of the bandaged man, and remember, every action has a consequence, especially when time travel's involved.

1. Edge of Tomorrow (2014)

Closing our list is this science fiction action film that takes 'rinse and repeat' to a whole new level. Tom Cruise plays a soldier who, after being killed in a battle against alien invaders, finds himself in a time loop that forces him to relive the same combat over and over again. The complexity isn't just in the plot, but in the intricate weave of the timeline. Think Groundhog Day meets Starship Troopers with a dash of temporal chaos.

Stephen Hawking and I created his final theory of the cosmos … – Daily Maverick

The late physicist Stephen Hawking first asked me to work with him to develop a new quantum theory of the Big Bang in 1998. What started out as a doctoral project evolved over some 20 years into an intense collaboration that ended only with his passing on March 14, 2018.

The enigma at the centre of our research throughout this period was how the Big Bang could have created conditions so perfectly hospitable to life. Our answer is being published in a new book, On the Origin of Time: Stephen Hawking's Final Theory.

Questions about the ultimate origin of the cosmos, or universe, take physics out of its comfort zone. Yet this was exactly where Hawking liked to venture. The prospect, or hope, of cracking the riddle of cosmic design drove much of Hawking's research in cosmology. "To boldly go where Star Trek fears to tread" was his motto and also his screen saver.

Our shared scientific quest meant that we inevitably grew close. Being around him, one could not fail to be influenced by his determination and optimism that we could tackle mystifying questions. He made me feel as if we were writing our own creation story, which, in a sense, we did.

In the old days, it was thought that the apparent design of the cosmos meant there had to be a designer, a God. Today, scientists instead point to the laws of physics. These laws have a number of striking life-engendering properties. Take the amount of matter and energy in the universe, the delicate ratios of the forces, or the number of spatial dimensions. Physicists have discovered that if you tweak these properties ever so slightly, it renders the universe lifeless. It almost feels as if the universe is a fix, even a big one.

But where do the laws of physics come from? From Albert Einstein to Hawking in his earlier work, most 20th-century physicists regarded the mathematical relationships that underlie the physical laws as eternal truths. In this view, the apparent design of the cosmos is a matter of mathematical necessity. The universe is the way it is because nature had no choice.

Around the turn of the 21st century, a different explanation emerged. Perhaps we live in a multiverse, an enormous space that spawns a patchwork of universes, each with its own kind of Big Bang and physics. It would make sense, statistically, for a few of these universes to be life-friendly. However, such multiverse musings soon got caught in a spiral of paradoxes and produced no verifiable predictions.

Can we do better? Yes, Hawking and I found out, but only by relinquishing the idea, inherent in multiverse cosmology, that our physical theories can take a God's-eye view, as if standing outside the entire cosmos.

It is an obvious and seemingly tautological point: cosmological theory must account for the fact that we exist within the universe. "We are not angels who view the universe from the outside," Hawking told me. "Our theories are never decoupled from us." We set out to rethink cosmology from an observer's perspective. This required adopting the strange rules of quantum mechanics, which governs the microworld of particles and atoms.

According to quantum mechanics, particles can be in several possible locations at the same time, a property called superposition. It is only when a particle is observed that it (randomly) picks a definite position. Quantum mechanics also involves random jumps and fluctuations, such as particles popping out of empty space and disappearing again.
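A toy numerical picture of that last point (my own illustration, with arbitrary amplitudes): a particle in a superposition of two locations picks one at random when observed, with frequencies set by the squared amplitudes, a rule known as the Born rule.

```python
import random

# Toy illustration of superposition and random "collapse" on observation.
# The amplitudes are arbitrary; outcome probabilities are their squares.

amp_left, amp_right = 0.6, 0.8   # 0.6**2 + 0.8**2 = 1, so this is a valid state
p_left = amp_left ** 2           # probability of finding the particle on the left

counts = {"left": 0, "right": 0}
rng = random.Random(42)
for _ in range(10_000):          # repeat the "observation" many times
    outcome = "left" if rng.random() < p_left else "right"
    counts[outcome] += 1

print(counts)  # roughly 36% left, 64% right
```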

In a quantum universe, therefore, a tangible past and future emerge out of a haze of possibilities by means of a continual process of observing. Such quantum observations don't need to be carried out by humans. The environment, or even a single particle, can observe. Countless such quantum acts of observation constantly transform what might be into what does happen, thereby drawing the universe more firmly into existence. And once something has been observed, all other possibilities become irrelevant.

We discovered that when looking back at the earliest stages of the universe through a quantum lens, there's a deeper level of evolution in which even the laws of physics change and evolve, in sync with the universe that is taking shape. What's more, this meta-evolution has a Darwinian flavor.

Variation enters because random quantum jumps cause frequent excursions from what's most probable. Selection enters because some of these excursions can be amplified and frozen, thanks to quantum observation. The interplay between these two competing forces, variation and selection, in the primeval universe produced a branching tree of physical laws.

The upshot is a profound revision of the fundamentals of cosmology. Cosmologists usually start by assuming laws and initial conditions that existed at the moment of the Big Bang, then consider how todays universe evolved from them. But we suggest that these laws are themselves the result of evolution.

Dimensions, forces, and particle species transmute and diversify in the furnace of the hot Big Bang (somewhat analogous to how biological species emerge billions of years later) and acquire their effective form over time.

Moreover, the randomness involved means that the outcome of this evolution (the specific set of physical laws that makes our universe what it is) can only be understood in retrospect. In some sense, the early universe was a superposition of an enormous number of possible worlds. But we are looking at the universe today, at a time when humans, galaxies and planets exist. That means we see the history that led to our evolution.

We observe parameters with lucky values. But we are wrong to assume they were somehow designed or always like that.

The crux of our hypothesis is that, reasoning backward in time, evolution towards more simplicity and less structure continues all the way. Ultimately, even time and, with it, the physical laws fade away.

This view is especially borne out of the holographic form of our theory. The holographic principle in physics predicts that just as a hologram appears to have three dimensions when it is in fact encoded in only two dimensions, the evolution of the entire universe is similarly encoded on an abstract, timeless surface.

Hawking and I view time and causality as emergent qualities, having no prior existence but arising from the interactions between countless quantum particles. It's a bit like how temperature emerges from many atoms moving collectively, even though no single atom has temperature. One ventures back in time by zooming out and taking a fuzzier look at the hologram. Eventually, however, one loses all information encoded in the hologram. This would be the origin of time: the Big Bang.

For almost a century, we have studied the origin of the universe against the stable background of immutable laws of nature. But our theory reads the universe's history from within, and as one that includes, in its earliest stages, the genealogy of the physical laws. It isn't the laws as such but their capacity to transmute that has the final word.

Future cosmological observations may find evidence of this. For instance, precision observations of gravitational waves (ripples in the fabric of spacetime) may reveal signatures of some of the early branches of the universe. If spotted, Hawking's cosmological finale may well prove to be his greatest scientific legacy. DM

This story was first published in The Conversation.

Thomas Hertog is a Professor of Physics at KU Leuven.

EDITORIAL ANALYSIS : The promise in India’s National Quantum Mission – INSIGHTSIAS – Insights IAS

Source: Indian Express, Indian Express

Prelims: Current events of national importance, quantum, National Quantum Mission, quantum computing, Nano Mission, National Supercomputing Mission, etc.

Mains GS Paper II and III: Development process and the development industry - the role of NGOs, SHGs, etc.

ARTICLE HIGHLIGHTS

INSIGHTS ON THE ISSUE

Context

Quantum technology:

National Quantum Mission (NQM):

Advantages:

Quantum materials:

Quantum devices:

Need for a Quantum Mission:

How to achieve the targets?

Challenges:

Way Forward

QUESTION FOR PRACTICE

How is the S-400 air defense system technically superior to any other system presently available in the world? (UPSC 2021) (200 WORDS, 10 MARKS)

Heisenberg’s Uncertainty Principle: Everything You Need to Know – Popular Mechanics

Stop me if you've heard this one.

Werner Heisenberg is driving down the road when he gets pulled over by a traffic cop.

"Excuse me, sir," the cop says. "Do you know how fast you were going?" "No," Heisenberg replies. "But I know exactly where I am."

Whether you're laughing right now, or staring at your screen in confusion, hinges on how much you know about one of the foundational ideas in quantum physics: Heisenberg's uncertainty principle.

In its most basic and commonly known form, the uncertainty principle says that the more precisely you know the position of a particle in a quantum system, the less well you know its momentum (and vice-versa). The principle also applies to other pairs of characteristics in quantum systems, like energy and time. But every physics graduate first starts to unpack this concept through the lens of position and momentum, so we will, too.

If this were all Heisenberg's uncertainty principle said, it probably wouldn't have been profound enough to weave its way into pop culture in the form of mugs, T-shirts, and cartoons, let alone lend its pioneer's name as an alias for an infamous meth-cooking chemistry teacher.

German theoretical physicist Werner Heisenberg first introduced his uncertainty principle in a 1927 paper. It's special because it remains intact no matter how good our experimental methods get; the uncertainty isn't a shortcoming of our measurements. It doesn't matter how smart you are or how sophisticated your equipment is: you can't think your way past it. It's a fact of nature.

Legendary physicist and master bongo player Richard Feynman put it like this: "The uncertainty principle protects quantum mechanics. Heisenberg recognized that if it were possible to measure both the momentum and the position simultaneously with greater accuracy, quantum mechanics would collapse. So he proposed that it must be impossible."

Reality is telling us that we can have our quantum cake, but we can't eat it, too.


Chad Orzel is an associate professor in the Department of Physics and Astronomy at Union College in Schenectady, New York, and the author of several books that explain complicated and esoteric ideas to a lay audience. In his book How to Teach Quantum Physics to Your Dog, he covers Heisenberg's uncertainty principle.

"The origin of the uncertainty principle is found in the duality of particles in quantum physics; depending on what they're doing, they can be described as either a particle or a wave," Orzel tells Popular Mechanics.

At the turn of the 20th century, physicists were engaged in a heated debate regarding the nature of light, and whether it exists as a particle or a wave. Thanks to a pioneering test known as Young's double-slit experiment, physicists discovered the answer was "Door No. 3," as Orzel puts it. That is, light isn't a particle or a wave; it has properties of both. And, shockingly, particles of matter like electrons also demonstrate this particle-wave duality.

Werner Karl Heisenberg, a Nobel Laureate who later became a key figure in Hitler's atomic project.

"So every particle in the universe, or every kind of object that we know of in the universe, has this combination of properties we associate with waves and properties we associate with particles; they're a third kind of object that isn't really one or the other," Orzel says. "And it's because of that you can't get rid of the Heisenberg uncertainty principle with a better experimental technique, because it's really fundamental to that dual nature."

He continues by explaining that needing both wave-like properties and particle-like properties means you can't measure either of them perfectly.

"Being able to have both requires that each be imperfect, in a way, and there's just no way around that," Orzel explains. "We know from experiments that things that we think of as particles, like an electron, would have a well-defined position, but they also have a wavelength associated with them, and that wavelength is related to the momentum."

When the particle is moving, it's doing wavy stuff that has a characteristic wavelength associated with it, and that length, it turns out, is inversely proportional to the momentum. This means the faster the particle is going, the shorter the wavelength, and the slower the particle is going, the longer the wavelength.

"You need to have both of these things if you want to have the position well defined and momentum well defined," Orzel says. "It has to have both a position in space that you can point to and say it is right here. And it has to have a wavelength with some characteristic length associated with it. And those things are incompatible."

Think of the momentum of a traveling particle as a wave; the peaks of the wave represent the probability of finding the particle at a given position. A single, infinitely long wave with one definite wavelength represents a perfectly precise momentum. The problem is that such a wave has an infinity of peaks, so while the momentum is precisely known, the position is completely unknown. The particle could be anywhere: an exact momentum, but no clue about location.

To get a read on the particle's position, we can start stacking waves of different wavelengths, each representing a different possible momentum for the particle. Where a peak meets a trough, you get destructive interference and the wave is flattened. Where peaks meet, you get a taller peak, and thus an increased probability of finding the particle there.

Add enough wavelengths, with enough constructive and destructive interference, and you've got a single peak and something close to a definite position for the particle. In the process of creating this peak, however, you've destroyed any single well-defined wavelength, meaning you now know next to nothing about the momentum; you've sacrificed it for certainty about the position.

"The best you can do is create a wave packet, which is flat, and then you have some waves that get bigger and bigger and they come to a peak and then they get smaller on the other side, sort of tapering off on either side," Orzel says. "You can look at that region of space and say, here are the peaks, and I've got so many here. We also have a wavelength giving the momentum, but the region in which that's happening is relatively confined and can be quite small."
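For readers who like to tinker, here is a minimal numerical sketch of that wavelength-stacking, wave-packet picture, written in Python (the grid sizes, the Gaussian weighting and the particular momentum spreads are illustrative choices of ours, not anything from Orzel's book or this article). It superposes plane waves drawn from a band of momenta and measures how spread out the resulting packet is in space:

# Minimal sketch: build a wave packet by stacking plane waves and watch the
# position/momentum trade-off. Units are chosen so that hbar = 1, so the
# wavenumber k plays the role of momentum. All numbers are illustrative.
import numpy as np

x = np.linspace(-200.0, 200.0, 2001)      # position grid (arbitrary units)
k0 = 1.0                                  # central wavenumber (momentum / hbar)

def spreads(dk, n_waves=801):
    """Superpose n_waves plane waves with Gaussian amplitudes of width dk
    around k0; return the rms spread of the packet in position and momentum."""
    ks = np.linspace(k0 - 5 * dk, k0 + 5 * dk, n_waves)
    amps = np.exp(-((ks - k0) ** 2) / (2 * dk ** 2))   # weight of each wave
    psi = amps @ np.exp(1j * np.outer(ks, x))          # stack ("superpose") them

    def rms(prob, grid):
        prob = prob / prob.sum()
        mean = (prob * grid).sum()
        return np.sqrt((prob * (grid - mean) ** 2).sum())

    return rms(np.abs(psi) ** 2, x), rms(amps ** 2, ks)

for dk in (0.02, 0.05, 0.2):
    sx, sk = spreads(dk)
    print(f"momentum spread {sk:.3f} -> position spread {sx:7.2f} (product {sx * sk:.2f})")

Squeeze the band of momenta and the packet fattens in space; widen it and the packet sharpens, with the product of the two spreads stuck near 0.5, that is, near hbar/2 in these units. That is the trade-off being described above.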

That means both a momentum and a position can be assigned to a system described by such a wave packet, but, crucially, there's an uncertainty in both. We're all clued in on the joke now, but there's still the question of what makes it ludicrous.


Obviously, a car (whether it's driven by one of the founders of quantum mechanics or not) isn't a quantum object. It doesn't travel like a wave, meaning your car can't diffract around corners, and thus it isn't governed by the rules of the subatomic or the uncertainty principle; nor are tennis balls, comic books, or squirrels. The reason you can't find your keys every morning isn't that you know their momentum precisely and thus can't possibly know their position, so no more using that as an excuse for being late for work.

The big question is: why doesn't the Heisenberg uncertainty principle affect everyday or macroscopic objects?

The answer lies in the equation that describes the phenomenon.

The generalized form of the Heisenberg uncertainty principle says that if you measure the momentum of a particle with uncertainty Δp, then this limits the uncertainty of the position, Δx, which can't be any less than ℏ/(2Δp).

The whole equation looks like this: Δx ≥ ℏ/(2Δp). It's the ℏ (pronounced "h-bar") element we're interested in here.

This is known as the reduced Planck's constant, and the thing about it is... it's small, very small (about 1.055 × 10⁻³⁴ joule-seconds), and it keeps the minimum possible uncertainties in our two properties small, too.
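To get a feel for the numbers, here is a quick back-of-the-envelope check of that bound in Python (the electron speed and the momentum uncertainties below are illustrative choices of ours, not figures from the article):

# Sketch: the lower bound delta-x >= hbar / (2 * delta-p), using the measured
# value of the reduced Planck constant. The scenarios are illustrative.
HBAR = 1.054_571_817e-34          # reduced Planck constant, in joule-seconds
M_ELECTRON = 9.109e-31            # electron mass, in kilograms

def min_position_uncertainty(delta_p):
    """Smallest position uncertainty allowed for a given momentum uncertainty."""
    return HBAR / (2.0 * delta_p)

# Electron whose momentum is pinned down to 1% of a typical atomic-scale
# momentum (speed of roughly 2.2e6 m/s, about the Bohr-orbit speed in hydrogen).
p_atomic = M_ELECTRON * 2.2e6
print(f"electron:    {min_position_uncertainty(0.01 * p_atomic):.1e} m")  # ~2.6e-09 m

# A 1 kg object whose momentum is pinned down to one billionth of 1 kg*m/s.
print(f"1 kg object: {min_position_uncertainty(1e-9):.1e} m")             # ~5.3e-26 m

For the electron, the unavoidable blur is a few nanometers, many atom-widths wide; for the kilogram, it is absurdly far below anything measurable.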


"The uncertainties are so small for macroscopic objects that if you have an object that's one kilogram, moving at one meter per second, its wavelength would be 10⁻³⁴ meters [that's a zero, a decimal point, another 33 zeros, and then a one], which is a distance that's so small it doesn't really make sense to talk about," Orzel explains. "Then the uncertainty in the position is going to be some smallish multiple of that, which is just so tiny it's ridiculous. So, you can't see the uncertainty principle with ordinary macroscopic objects."
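Orzel's kilogram figure is easy to reproduce from the de Broglie relation, wavelength = h / (mass × speed). Here is a short Python check, with an electron thrown in at an illustrative speed for contrast (the speed is our choice, not a number from the article):

# Quick check of the wavelengths being discussed, via lambda = h / (m * v).
H_PLANCK = 6.626_070_15e-34       # Planck constant, in joule-seconds

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength associated with an object of the given mass and speed."""
    return H_PLANCK / (mass_kg * speed_m_s)

print(f"1 kg at 1 m/s:       {de_broglie_wavelength(1.0, 1.0):.1e} m")       # ~6.6e-34 m
print(f"electron at 1e6 m/s: {de_broglie_wavelength(9.109e-31, 1e6):.1e} m") # ~7.3e-10 m

The kilogram's wavelength is vastly smaller than an atomic nucleus, while the electron's is comparable to the spacing between atoms in a solid, which is why wave behavior, and with it the uncertainty principle, shows up for electrons but not for cars.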

You can see that uncertainty with subatomic objects like electrons, however, when the wave properties become apparent; that's because their wavelengths are long enough. As Orzel points out, that's also when you can measure that uncertainty.

Exactly where the line between quantum and non-quantum behavior lies is currently a hot research topic in physics, with scientists observing quantum effects in particles as large as (and even larger than) carbon-60 molecules, also known as buckyballs because of their resemblance to the geodesic-dome architecture of Buckminster Fuller.

As for why the uncertainty principle is so captivating, Orzel explains:

"It's telling us something fascinating about the universe, which is that at a very deep, fundamental level, the nature of reality is such that there will always be some uncertainty, and that it is impossible, even in principle, to know certain things about the world, or certain combinations of things about the world."

Oh, and it makes for great jokes, too.

Robert Lea is a freelance science journalist focusing on space, astronomy, and physics. Rob's articles have been published in Newsweek, Space, Live Science, Astronomy magazine and New Scientist. He lives in the North West of England with too many cats and comic books.

More:

Heisenberg's Uncertainty Principle: Everything You Need to Know - Popular Mechanics