
It’s Only Cloud Computing (but I like it) – Finextra

A rolling stone gathers no moss, and a half-hearted move to the cloud delivers no value; it can even increase costs.

Moving to the cloud provides a catalyst for change and enables new possibilities in technology adoption, but only if an organisation is prepared to truly transform how it operates. Half measures won't cut it here.

Simply moving applications is not enough. Without changing the architecture and design of applications, or the way they are built and maintained, limited value will be gained. The best outcomes are achieved when applications are designed specifically to take advantage of the cloud, and organisations change the way they work to fully embrace the value of elastic compute and DevOps.

And it's not just us saying it: according to a report from IT analyst firm Freeform Dynamics, using cloud and DevOps together accelerates software delivery by 81 percent.

Sympathy For The DevOp

To understand where an organisation is on its path towards fully adopting a DevOps environment, ask when its latest release was. An answer in hours or minutes signifies a company that has fully embraced the transformative benefits that DevOps can bring.

Yet adopting such a release schedule can seem like anathema to the financial services sector. Decades of release pain have seen banks respond to the reality of poor code releases by fortifying their production environments with bureaucratic control frameworks to limit their exposure.

But the more frequent the releases, the better it gets, and when you reach the premier league, where release gaps are measured in hours or minutes, good things start happening.

You can always get what you want

The hardest part of moving to DevOps is to change the process and control environment, hence the need to fully embrace a shift to the cloud as a catalyst for change. By radically transforming the architecture and design of applications, and how they are built and maintained, significant benefits can be achieved.

Firstly, the relationship between the business and IT can be transformed. The business gives up its shadow IT, which is no longer necessary, and IT comes to be seen as an enabler. Business agility and the rate of change increase dramatically, making the firm more competitive and allowing the business to experiment more freely.

Secondly, more technology change means faster response times, greater agility and reduced issues. We advise clients to expect IT operating costs to fall by 20-35% by adopting this approach.

Hey You, Get On To My Cloud

Of course, there are considerations that need to be in place to make a move to DevOps a success, such as having the right microservices architecture. This allows the development of applications to be broken down into functional components that can be built, tested, maintained, and corrected in parallel with each other. This not only saves time but also allows the most appropriate platforms, tools and coding languages to be used while fostering a culture of agility.

Which brings us back neatly to the cloud and choosing the right partner. As you would expect, there are advantages and compromises with each. The existing development platform (.NET, for example) might make one vendor the obvious choice; vendor lock-in and each vendor's approach to open source are other key considerations.

But regardless of vendor, the case for embracing DevOps in the move to the cloud is a strong one. True, you can achieve much of what has been outlined without the cloud, but that rather misses the point. As a catalyst for change and a means to achieving true business agility, adopting the cloud opens up a whole range of possibilities.

Read more:
It's Only Cloud Computing (but I like it) - Finextra

Read More..

AWS has selected MIT ADT University for Cloud Computing Curriculum – United News of India

More News | 21 Sep 2021 | 8:43 PM

Mumbai, Sep 21 (UNI) Maharashtra minister and Shiv Sena leader Anil Parab has filed a defamation suit in the Bombay High Court against BJP leader Kirit Somaiya, alleging that the latter made malicious, malafide and defamatory statements, and claimed damages to the tune of Rs 100 crore. Mr Parab also sought an unconditional apology from Mr Somaiya, stating that his reputation has been damaged and that the former BJP MP is only seeking cheap publicity.

Nanded, Sep 21 (UNI) Swami Ramanand Teerth Marathwada (SRTM) University will organise a music webinar, titled 'Importance of Drama Music', on September 24.

Parbhani, Sep 21 (UNI) Bharatiya Janata Party (BJP) Mahila Morcha state president Uma Khapre on Tuesday demanded that a special session of the legislature be convened immediately to discuss the safety of women in the state, an issue that has come to the fore again after the rape of a schoolgirl in Parbhani, following the Sakinaka incident in Mumbai.

Kolhapur, Sep 21 (UNI) Swabhimani Shetkari Sanghatana (SSS) national president Raju Shetty today demanded that the sugar commissioner provide sugarcane-cutting labourers to all sugar factories through the Sugar Cutting Labourers Corporation (SCLC) for this year's season, to avoid suicides among sugarcane transport farmers.

Nanded, Sep 21 (UNI) A Punjab-based man donated a 2.5 kg gold crown worth Rs 2 crore to the Sachkhand Gurudwara, the religious institution informed on Tuesday.

See the rest here:
AWS has selected MIT ADT University for Cloud Computing Curriculum - United News of India

Read More..

Inari Further Invests in Predictive Design Capabilities with Hiring of Chief Information and Data Officer – PRNewswire

Khalaf brings more than a decade of artificial intelligence (AI) and cloud computing experience to further expand a team that is central to Inari's mission. She will lead the company's work in machine learning to further advance its predictive design, tapping into deep learning to map highly complex genetic networks and identify how to best address critical challenges such as resource use inefficiencies. These systemic discoveries identify unique gene pathways that will lead to improved plant performance to support a more sustainable food system.

"To transform the global food system for a more sustainable future, we are building a deeper understanding of some of nature's most complex systems to determine the multiplex edits required for the desired outcome," said Ponsi Trivisvavet, chief executive officer at Inari. "Our team has built a lot of momentum in this area over the past five years, so Rania's experience in building digital platforms will accelerate our product development."

Khalaf joins Inari from IBM Research, where she created and led the global research strategy on AI-infused automation. She ran an organization that developed foundational, award-winning innovation in cloud-native AI platforms, human-centered AI and serverless computing, leading to the launch and enhancement of an array of products. Khalaf held multiple leadership and research positions within IBM, laying the industry foundation for service-oriented computing and co-authoring core web service standards. She holds several patents and has over 90 peer-reviewed publications.

"I was drawn and inspired by the mission of Inari to drive impactful change," Khalaf said. "Accelerating this journey by scaling machine learning pipelines and interleaving computation with biological systems to create a more sustainable future is a unique and challenging opportunity that I'm excited to take on."

Khalaf serves on the board of the Hariri Institute for Computing at Boston University. She holds bachelor's and master's degrees in electrical engineering and computer science from the Massachusetts Institute of Technology (MIT) and a doctorate in computer science from the University of Stuttgart.

About Inari
Inari is embracing diversity to build a new, more sustainable food system using unsurpassed technology to unlock the full potential of seed. Through its SEEDesign platform, Inari unlocks new possibilities to bring step-change products to market. Founded by Flagship Pioneering in 2016, Inari is based in Cambridge, Mass., with additional sites in West Lafayette, Ind., and Ghent, Belgium. Inari is a growing team of more than 180 employees. To learn more, visit Inari.com.

SOURCE Inari

Read more:
Inari Further Invests in Predictive Design Capabilities with Hiring of Chief Information and Data Officer - PRNewswire

Read More..

5 reasons why Java is still the best programming language – TheServerSide.com

A new, long-term support Java release is just around the corner. As Java passes the 25-year mark, let's step back and look at some of the reasons why Java remains the best programming language for modern software development.

Every programming language comes with a learning curve, but Java shares many similarities with C, C++ and JavaScript. Anyone with experience in any of those languages will pick up Java syntax quickly.

Java also has a very rigid and predictable set of rules that govern code structure. This contrasts starkly with other, untyped scripting languages where it seems like anything goes. When you try to master a new programming language, a clear set of consistently enforced rules makes learning easier.
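As a quick illustration of that predictability, here is a minimal sketch (the file and variable names are hypothetical, not from the article): every variable carries a declared type, and violating it stops the build before the program ever runs.

    // Typing.java: a minimal sketch of Java's compile-time type rules.
    public class Typing {
        public static void main(String[] args) {
            int releases = 12;                // every variable has a declared type
            String service = "payments-api";  // hypothetical example names
            // releases = "twelve";           // uncommenting this fails to compile: incompatible types
            System.out.println(service + " shipped " + releases + " releases this year");
        }
    }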

Furthermore, when things don't make sense, programmers new to Java can find a strong support network of YouTube videos, websites like StackOverflow and online forums like CodeRanch to find answers to a multitude of questions.

The Java API is extensive. The standard JDK comes with over 200 built-in packages containing Java APIs that allow for everything from parsing XML to translating between time zones. When developers add on the Jakarta EE APIs, they have an even more extensive library of APIs that allow for the development of complex, middle-tier applications and cloud-native microservices.
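To make the time-zone example concrete, here is a minimal sketch using only the built-in java.time package from the standard JDK (no third-party libraries; the zone IDs chosen are arbitrary).

    // TimeZones.java: translating the same instant between time zones with java.time.
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class TimeZones {
        public static void main(String[] args) {
            ZonedDateTime newYork = ZonedDateTime.now(ZoneId.of("America/New_York"));
            // Same instant in time, re-expressed on the Tokyo wall clock.
            ZonedDateTime tokyo = newYork.withZoneSameInstant(ZoneId.of("Asia/Tokyo"));
            System.out.println(newYork + " = " + tokyo);
        }
    }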

But the rich ecosystem of Java APIs extends far beyond those sanctioned by Oracle or provisioned through Jakarta.

For data persistence, there's the JBoss Hibernate project. For cloud-native microservices development in Java, there's the full suite of Spring Boot APIs. And of course there's a multitude of open source Apache API projects that address a wide range of software development use cases, from aggregating error messages with log4j to solving complicated problems with HashMaps and fail-safe Iterators through the Apache Commons Collections API.
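As a small, hedged example of one of those APIs, this is roughly what aggregating error messages with Log4j 2 looks like (the class name is hypothetical, and it assumes log4j-api and log4j-core are on the classpath with a default configuration):

    // ErrorLogging.java: routing an error and its stack trace through Log4j 2.
    import org.apache.logging.log4j.LogManager;
    import org.apache.logging.log4j.Logger;

    public class ErrorLogging {
        private static final Logger LOGGER = LogManager.getLogger(ErrorLogging.class);

        public static void main(String[] args) {
            try {
                throw new IllegalStateException("simulated failure");
            } catch (IllegalStateException e) {
                // The message and stack trace go to whatever appenders are configured.
                LOGGER.error("Request processing failed", e);
            }
        }
    }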

The rich set of Java APIs available to developers aids in the development of strong, error-free applications.

The application development landscape is filled with software development tools written in Java -- by Java developers -- that are designed to simplify and streamline the development, deployment and even decommissioning of Java applications.

This is by no means an exhaustive survey of Java's tool ecosystem. Other examples of tools and technologies written in Java range from application servers like Tomcat to Red Hat's popular, Kubernetes-native Java stack, Quarkus.

Few other programming languages enjoy the same level of tooling support that Java does, which helps cement Java's status as the best programming language.

Android is the world's most popular mobile phone OS and Java is the de facto programming language for Android application development.

While Android's version of Java isn't exactly the same as what you'd find on the JDK, Google did copy over 11,500 lines of code from the Java Standard Edition when they built their Java clone. As a result, developers can expect that the version of Java they see on Android is pretty close to the original.

If you can write Java code for desktop or server-side applications, you'll pick up Android development very quickly. The low-level differences between the JVM and the Dalvik Android Runtime will all be pleasantly abstracted away from you after a brief learning curve. When developers learn Java, the entire Android ecosystem is at their disposal.

Java evolves slowly, but it does evolve. With over 25 years of evolution, Java has plenty of evolutionary improvements to be proud of.

From the bulletproof modularity system delivered as part of Project Jigsaw, to the more recently added support for functional programming with lambda expressions, Java continues to implement the big changes that the community demands.

Incremental additions in non-LTS releases, such as the addition of the new record data type and new garbage collectors for improved memory management, showcase that the JDK is also constantly evolving.
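A minimal sketch of those two features side by side (requires Java 16 or later; the Release type is an invented example, not from the JDK): the record declares an immutable data carrier in one line, and the lambda passes a filter to the streams API in functional style.

    // Evolution.java: a record plus a lambda, two of the additions discussed above.
    import java.util.List;

    public class Evolution {
        // One line replaces a class full of constructor/accessor/equals/hashCode boilerplate.
        record Release(String version, boolean lts) {}

        public static void main(String[] args) {
            List<Release> releases = List.of(
                new Release("11", true), new Release("16", false), new Release("17", true));
            // The lambda r -> r.lts() is a compact, functional-style predicate.
            releases.stream()
                    .filter(r -> r.lts())
                    .forEach(r -> System.out.println("LTS release: " + r.version()));
        }
    }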

But in the enterprise software development world, backwards compatibility is just as important as adding new features, and the stewards of the language have always made it a top priority. Very rarely does a comprehensive update or feature addition create issues with code written against older releases.

I personally compiled and packaged some complicated code that was written 20 years ago, and it ran without any issues on the latest Java 17 release. When enterprise clients know that things won't break when they perform a JVM upgrade, it gives just another reason why they choose to stick with Java as the best programming language.

View original post here:
5 reasons why Java is still the best programming language - TheServerSide.com

Read More..

$5M NSF Grant to Fund Research on Quantum Internet Foundations – Maryland Today

The National Science Foundation (NSF) today announced a $5 million, two-year award to a University of Maryland-led multi-institutional team to develop quantum interconnects, crucial technology to connect quantum computers and pave the way for a quantum internet.

The team, QuaNeCQT (Quantum Networks to Connect Quantum Technology), has been developing the quantum versions of a modem and a router, familiar equipment in the world of standard, or classical, computing, but a challenge to build for use with devices that operate based on the principles of quantum physics.

The devices allow ion trap quantum computers, a leading approach to quantum information processing developed in part at the University of Maryland, to exchange quantum information over distances measured in kilometers, eventually leading to the development of networks that could revolutionize numerous industries and help solve vexing societal problems.

"Quantum networks are at an inflection point, with the potential for significant expansion," said Edo Waks, a professor of electrical and computer engineering and associate director of UMD's Quantum Technology Center (QTC). "But the scale-up can't happen without standardized modular hardware between the new computers that are emerging and the vast infrastructure of the current internet."

"The hardware we are developing will address the critical gap, opening up the door to the future quantum internet that can connect quantum computers over continental distances," said Waks.

Other UMD team members include physics Assistant Professor and QTC Fellow Norbert Linke, and Mid-Atlantic Crossroads (MAX) Executive Director Tripti Sinha, assistant vice president and chief technology officer for UMD's Division of Information Technology. The team also includes Dirk Englund of the Massachusetts Institute of Technology and Saikat Guha of the University of Arizona.

The researchers plan to deploy this new technology in the Mid-Atlantic Region Quantum Internet (MARQI), UMD's regional quantum network footprint. The MARQI network will interconnect quantum computers at UMD, the Army Research Laboratory, MAX and IonQ, a leading quantum computing company focused on ion-trap computers that operates in UMD's Discovery District, with a potential for significant expansion.

During the first phase of research, the team developed working prototypes of the quantum router and modem. Using a process called quantum frequency conversion, the modem converts signals from a quantum computer to infrared photons that can propagate through optical fibers over long distances. The router is powered by a silicon photonic chip that manipulates quantum signals in the network using quantum teleportation, an effect demonstrated in 2009 by researchers at UMD's Joint Quantum Institute that allows quantum states to be transferred between particles that are physically separate. The team has deployed these prototypes in the MARQI network and established direct links with the various nodes of the network.

A quantum network could revolutionize numerous industries that take advantage of quantum computing, including computing, banking, medicine and data analytics. It would also enable the connection of many small quantum computers into powerful distributed quantum computers that could potentially solve problems with significant societal impact, from curing diseases to new approaches to fighting climate change.

As quantum technology converges with the Internet, a new technology sector would emerge, the researchers say, bringing with it the potential for major economic growth by producing rapid technological innovation and creating a large number of new jobs for the future quantum workforce, just as the emergence of the Internet did toward the late 20th century.

Original post:
$5M NSF Grant to Fund Research on Quantum Internet Foundations - Maryland Today

Read More..

3 Quantum Computing Stocks to Buy for Their Promising Healthcare Potential – InvestorPlace

Quantum computing stocks are gaining traction as this once-nascent industry is fast evolving. Wall Street is paying increased attention to the segment as companies move from the experimental research phase to developing commercially feasible computers that can solve the world's most complex problems and revolutionize businesses in many industries. Thus, quantum computing stocks have become a hot item.

By some accounts, quantum computers can already offer computational power up to 100 million times faster than today's ordinary computers for certain tasks. They can process exponentially more information with each additional quantum bit, or qubit.
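A back-of-the-envelope way to see the qubit claim: describing the state of n qubits takes 2^n complex amplitudes, so

    n = 50 qubits -> 2^50, or roughly 1.1 x 10^15, amplitudes

which is why each added qubit doubles, rather than merely increments, the information a quantum computer can work with.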

From advances in machine learning to healthcare, artificial intelligence (AI) and advanced cybersecurity capabilities, quantum computers are expected to have a significant impact across a wide range of industries. Therefore, I want to introduce three quantum computing stocks to invest in for the rest of this year.

Multiple countries are already involved in the quantum computing race, and the global quantum computing market is estimated at USD 487.4 million in 2021 and is expected to reach USD 3,728.4 million by 2030, a CAGR of 25.40% over the forecast period from 2021 to 2030.
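As a quick arithmetic check on that forecast, compounding the 2021 base at the quoted rate over the nine years to 2030 gives

    USD 487.4 million x (1.2540)^9 ≈ USD 487.4 million x 7.67 ≈ USD 3,737 million

which matches the quoted USD 3,728.4 million figure to within rounding.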

So, with that information, let's take a look at three of the top quantum computing stocks on the market right now.

Now, let's dive in and take a closer look at each one.

52-Week Range: $31.76 to $52.51
Dividend Yield: 0.43%
Expense Ratio: 0.40% per year

We start our discussion with an exchange-traded fund (ETF), namely the Defiance Quantum ETF. It invests in global businesses that are leading the technology and applications behind quantum computing, cloud platforms, machine learning, as well as other advanced computing technologies.

QTUM, which has 71 holdings, tracks the returns of the BlueStar Quantum Computing and Machine Learning Index. The fund was first listed in September 2018.

In terms of subsectors, we see Quantum Computing Technology (35.56%), followed by Machine Learning Services (21.44%), AI Chips (17.67%), GPU & Other Hardware (13.07%) and Big Data & Cloud Computing (9.39%). Close to 60% of the companies are U.S.-based. Others come from Japan (12.64%), the Netherlands (8.39%) and Taiwan (4.11%), among others.

Leading names in the roster are Analog Devices (NASDAQ:ADI), Ambarella (NASDAQ:AMBA), Advanced Micro Devices (NASDAQ:AMD), Synaptics (NASDAQ:SYNA) and Splunk (NASDAQ:SPLK). The top 10 stocks comprise close to 20% of net assets of $132.4 million.

Year-to-date, QTUM is up more than 25% and hit a record high in recent days. As the fund's holdings show, there are not many pure-play quantum computing stocks. Instead, a large number of tech names are increasing their focus on the quantum realm. Despite the recent run-up in price, such names in the quantum computing space are likely to create many more quarters of shareholder value. Potential investors could consider buying the dips.

52-Week Range: $105.92 to $152.84
Dividend Yield: 4.8%

In June, International Business Machines revealed Europe's first quantum computer in Germany. According to IBM, the Q System One is now Europe's most powerful quantum computer. IBM is not alone in this race; elsewhere, other tech giants, including Google (NASDAQ:GOOG, NASDAQ:GOOGL), Amazon (NASDAQ:AMZN) and Honeywell (NASDAQ:HON), are also investing heavily in the quantum computing world.

IBM generates revenue from five segments: cloud and cognitive software, global business services, global technology services, systems and global financing. While global technology services has the highest share of the top line at about 35%, cloud and cognitive software is the most lucrative business, with a pre-tax margin of more than 25%.

The company announced second-quarter financial figures at the end of July. Revenue was $18.7 billion, implying 3% year-over-year (YOY) growth. Net income of $1.3 billion meant a decline of 3% YOY. Diluted non-GAAP earnings per share (EPS) was $2.33; a year ago, it had been $2.18. Meanwhile, net cash from operating activities stood at $17.7 billion.

Management believes quantum computing will play a key role in healthcare as it could enable a range of disruptive use cases for providers and health plans by accelerating diagnoses, personalizing medicine, and optimizing pricing. Quantum-enhanced machine learning algorithms are particularly relevant to the sector.

On the results, CFO James Kavanaugh said, "We expanded operating margins and grew profit dollars in the quarter, providing a key contribution to our cash performance." The company expects to grow revenue for fiscal year 2021 and anticipates free cash flow of $11 billion to $12 billion in 2021.

So far this year, IBM stock has returned just over 9.3%, and it hit a multi-year high in June. Since then, though, the shares have come under pressure, and the price-to-sales (P/S) ratio stands at 1.66. Potential investors could regard the recent decline in price as an opportunity to buy for the long run.

52-Week Range: $196.25 to $305.84
Dividend Yield: 0.76%

Our last stock is the tech giant Microsoft, which generates revenue from three segments: Productivity and Business Processes (such as Office 365 and LinkedIn), Intelligent Cloud (Azure, Premier Support Services and Consulting Services) and More Personal Computing (Windows Commercial, Bing and Xbox).

Microsoft's fiscal year ends on June 30. At the end of July, the company issued Q4 2021 results. Revenue was $46.2 billion, up 21% YOY. Additionally, net income grew 47% YOY to $16.5 billion. Diluted EPS was $2.17 for the fourth quarter, up 49% from a year ago. The company also ended its fiscal year with $14.2 billion in cash and equivalents.

Following the announcement, CFO Amy Hood said, "As we closed out the fiscal year, our sales teams and partners delivered a strong quarter with over 20% top- and bottom-line growth, highlighted by commercial bookings growth of 30% year over year."

For the next quarter, Microsoft shared its segment revenue guidance. In the Productivity and Business Processes segment, the company expects revenue between $14.5 billion and $14.75 billion. For Intelligent Cloud, Microsoft anticipates revenue of between $16.4 billion and $16.65 billion.

Microsoft highlights: "From breakthroughs in physics and nanomaterials to seamless integration with Microsoft Azure, Microsoft is leading the way to scalable, accessible quantum computing." For example, analysts have been pointing out how Microsoft's quantum technology could influence the power industry, healthcare privacy and personalized medicine.

So far in 2021, MSFT stock is up more than 33%, and it reached a record high in late August. Moreover, the stock is trading at 13.38 times current sales. Therefore, interested readers could consider investing in the shares for the long term around current levels.

On the date of publication, Tezcan Gecgil did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all three levels of the Chartered Market Technician (CMT) examination. Her passion is options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.

Read this article:
3 Quantum Computing Stocks to Buy for Their Promising Healthcare Potential - InvestorPlace

Read More..

DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever – Breaking Defense

Sandia National Laboratory Computer Annex conducts the hourly walk-through of the Thunderbird supercomputer at 2 a.m.

WASHINGTON: The Pentagon recently completed a $68 million acquisition of two new supercomputing platforms and related technical services that rank among its most powerful supercomputers ever and will be among the top 100 performers globally.

"These are significant assets," Kevin Newmeyer, deputy director of the Defense Department's High Performance Computing Modernization Program (HPCMP), told Breaking Defense. "They bring to us an increase in our computing capacity and the latest advanced chips for artificial intelligence work, and storage to support applications of both computational and machine learning concepts within the same computer, that we hope will deliver products and services to the warfighter faster."

It's the HPCMP's job to give DoD military and civilian personnel, as well as defense contractor scientists, engineers and technologists, access to such supercomputers to solve some of the military's most computationally complex problems.

The problems range from climate/weather/ocean modeling and simulation, space/astrophysical sciences, and acoustics to signal/image processing, data/decision analytics, and electronics, networks, and C4I systems. Newmeyer said the most common use case is computational fluid dynamics, which is required for making complicated calculations in areas such as aircraft and ship design and engineering.

For the latest acquisition, the Pentagon chose Penguin Computing's TrueHPC supercomputing platform. The two new supercomputers, according to the company, will provide DoD with a combined total of over 365,000 cores, more than 775 terabytes of memory, and a total of 47 petabytes of high-performance storage, including over 5 petabytes of high-performance flash storage.

"That's about 150,000 computers all stacked together, operating as one thing," Newmeyer said. "If you laid them end to end, you would work your way pretty much across the country."

What does all that compute power get you? An additional 17.6 petaFLOPS, in total. FLOPS, or floating point operations per second, are the standard measure of a supercomputer's performance. FLOPS are determined by how many real numbers a computer can process per second while accounting for the trade-off between range and precision of calculations.

"FLOPS are a measure of computational power for solving computer-based problems. It's the horsepower of a machine," Penguin's Vice President of Federal Sales Tom Ireland told Breaking Defense.

A petaFLOPS is one quadrillion (1,000,000,000,000,000) FLOPS. To put that in perspective, HPCMP currently has a total capacity across all of its supercomputers of approximately 100 petaFLOPS, according to Newmeyer. That includes the Navy's most powerful (known) supercomputer, Narwhal, which is capable of 12.8 petaFLOPS. The known part of the Air Force's most powerful supercomputer, Mustang, is capable of 4.87 petaFLOPS. (Part of Mustang is classified, Newmeyer noted.) Penguin's two TrueHPC supercomputers, expected to register at 8.5 petaFLOPS and 9 petaFLOPS, will be two of HPCMP's most powerful computers ever, Ireland said.
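For a rough sense of scale, dividing the company's combined figures gives

    17.6 x 10^15 FLOPS / 365,000 cores ≈ 4.8 x 10^10 FLOPS, or about 48 gigaFLOPS per core

though this is only an average, since part of the capability sits in accelerators rather than CPU cores.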

According to the Top500 Project, the fastest supercomputer in the world, as of June 2021, is Japan's Fugaku, which registered 442.01 petaFLOPS in November 2020, taking the top spot from IBM's Summit (148.6 petaFLOPS), which is housed at the Department of Energy's Oak Ridge National Laboratory.

The Pentagon's upgrade in supercomputing power comes amid an intense technological race against near-peer rival China. According to the Top500, China currently leads the world in the total number of supercomputers with 188, but when ranked by performance, the US has five of the top 10 most powerful supercomputers in the world, while China has two of the top 10. No other country has more than one in the top 10.

Ireland noted that Penguin, which has been building supercomputers for 20 years, has for years been running programs at the Department of Energy, which has the most powerful (known) supercomputers in the US. Fifteen of Penguin's debuts over 20 years have made the Top500, and were DoD to run official benchmarks on these two new supercomputers, they would rank within the top 100 worldwide, Ireland said.

The Navy's DoD Supercomputing Resource Center (DSRC) at Stennis Space Center in Mississippi will house one of the new platforms, while the other will go to the Air Force Research Lab's DSRC at Wright-Patterson Air Force Base in Dayton, Ohio.

But first Penguin has to build, deploy, and integrate them into HPCMP's network, known as the Defense Research Engineering Network (DREN). Ireland said Penguin's TrueHPC consists of about 1,500 nodes, which must be engineered to work as one giant machine.

"The trick with distributed computing (meaning it's taking what heretofore was done on a mainframe-style computer, where it's all on a board, and breaking it up into separate, discrete servers) is making sure that is an adequate platform for any given application," Penguin's Chief Strategy Officer Matt Jacobs told Breaking Defense. "To make sure that balance between the elements is right and there's an appropriate amount of compute to solve the problem."

Jacobs said some of the key elements include data patterns, network traffic and storage capacity, which all must be brought together in a way that doesn't strand investment in any given element of those resources and that makes it an effective production platform for the workload application. "That's really the art," he added.

Jacobs said that Penguin generally builds these types of platforms in a couple of months, but, like many companies worldwide, Penguin has encountered challenges in the global supply chain, especially around chips. Jacobs and Ireland said the supply chain hiccups are beyond the company's control, but said they still wouldn't significantly delay the project.

Notably, the platforms will include over 100 NVIDIA graphics processing units, or GPUs, to bolster DoD's AI and machine learning capabilities, Ireland said.

Ultimately, Ireland said, the project is about keeping the US warfighter equipped with state-of-the-art technologies to solve compute problems. "We're keeping our warfighters current. You don't want them fighting wars with F-14s when there's F-22s."

It's unclear how long the era of supercomputers will last, as the US and China, among others, race ahead toward quantum computing, which uses quantum mechanics to make a technological leap in processing power. But Newmeyer said he's not concerned traditional supercomputing platforms will become obsolete anytime soon.

"You'll still have a use for these types of machines," he said. "Any quantum computer built in the near future is going to be highly expensive to operate, and [quantum computers] are only more useful for certain applications: maybe in some stuff around hypersonics, certainly cryptology, navigation; there quantum has a key role. But for general computation, [quantum] is an awful lot of money."

Excerpt from:
DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever - Breaking Defense

Read More..

Are we approaching quantum gravity all wrong? – Big Think

Electrons exhibit wave properties as well as particle properties, and can be used to construct images or probe particle sizes just as well as light can. Here, you can see the results of an experiment where electrons are fired one-at-a-time through a double-slit. Once enough electrons are fired, the interference pattern can clearly be seen. (Credit: THIERRY DUGNOLLE / PUBLIC DOMAIN)

At a fundamental level, we often assume that there are two ways of describing nature that each work well in their own regime, but that don't seem to play well together. On the one hand, we know that the matter and energy that make up the Universe, from stars to atoms to neutrinos to photons, all require a quantum description in order to extract their properties and behavior. The Standard Model, the pinnacle of quantum physics, works perfectly well to describe every interaction we've ever measured in the Universe.

On the other hand, we also have General Relativity: our theory of gravity. However, this is fundamentally a classical theory, where the presence of matter and energy curves the fabric of space, and that curved space in turn tells matter and energy how to move through it. Although each one works quite well over its own range of validity, there are plenty of questions that require a thorough knowledge of both, together, to answer. Due not only to their fundamental differences but their fundamental incompatibilities, many questions that we can imagine are currently beyond our ability to answer.

That doesn't necessarily imply that anything is broken with physics, but it certainly seems to indicate that our current understanding of matters is, at the very least, incomplete. In an attempt to uncover just what it is that we know, what we don't, and what the path forward might look like, I sat down in an interview with physicist Lee Smolin, who'll be appearing at the HowTheLightGetsIn festival in London this September 18 and 19. Lee is a pioneer in the field of quantum gravity and someone whose latest book, Einstein's Unfinished Revolution, details the search for what lies beyond what's presently known about the quantum Universe.

Ethan Siegel: What are the motivations behind why you would say quantum field theory and the Standard Model, and General Relativity for gravity, why can that not describe the Universe at a fundamental level?

Lee Smolin: Well, it just can't. It's easy to think of experiments that that collection of ideas doesn't give consistent predictions for. More than that, there are reasons, in principle, why the principles that quantum physics is based on contradict the principles that General Relativity is based on, and we need to make these things fit together on a level of principle, because it's supposed to be a fundamental theory of nature.

There are both experimental reasons and reasons of principle, and between them there are also lots of technical problems that, as we get to know them, we see are a consequence of these conceptual clashes: clashes of principle.

ES: Can you give one example?

LS: Sure, what does collapse of the wavefunction, which is a part of quantum mechanics, look like in a spacetime which is dynamical, and which evolves according to some equations of motion in General Relativity?

ES: Einstein's original idea of unification was to geometrically add in classical electromagnetism to General Relativity, and we know that can't be right because we know the Universe is quantum mechanical in nature. You write about what you call Einstein's unfinished dream. Why is this dream important, even if Einstein's original ideas about it are no longer relevant?

LS: Well, I disagree with you about how relevant Einstein's original ideas were, for better or for worse. There are, in the history of science, revolutions where our understanding of nature changes profoundly and on every possible scale. When you go from being an Aristotelian scientist to a Newtonian scientist, your picture of the world changes drastically, on all scales, and there are many applications of that.

Here's what's at stake. Einstein started two revolutions at the start of the 20th century: general relativity and quantum mechanics. He understood that they did not give a consistent picture put together. And in fact, he believed, and I agree with him, that quantum mechanics all by itself doesn't give a consistent picture. To put it directly, it just doesn't make sense: the quantum mechanics as it was formulated in the 1920s, by his friends and colleagues.

LS: So we have two tasks on the agenda. One is to make sense of quantum mechanics. And two is to fix that theory which is better than quantum mechanics, and to make that theory that's better than quantum mechanics also complete General Relativity. So I see it as a question of completion.

General Relativity covers very well, to a certain degree of approximation, certain phenomena. Quantum mechanics covers very well, to a certain degree of approximation, certain phenomena. They're both incomplete. Highly incomplete. At the level of experiments, you have to use some imagination, but it's not all down at the Planck scale. There are experiments which involve timescales of minutes or seconds where we have no clear prediction. But this double revolution needs to be completed on both sides, and that's what's at stake: it's to complete the revolution, because we're living in a conceptual situation much analogous to that faced by Kepler and Galileo, who were contemporaries; they were each halfway between Aristotelian and Newtonian physics. They understood certain things very well, but they were deeply confused about other things. And that's our situation now.

ES: From the quantum side, I've heard many people argue, counter to what you're arguing, that quantum physics works exactly fine for describing every quantum phenomenon in the Universe, so long as you don't also fold in quantum gravitational effects. If I can treat spacetime as being a classical or semi-classical background, then I can do everything that my quantum field theory predicts I should do without any errors or uncertainties. Do you disagree with that?

LS: Am I supposed to be impressed by that?

Aristotle worked with orbits and the positions of the planets that were accurate to a part in a thousand over a millennium. That was impressive, but it was bloody wrong. That simple-minded theory that you're describing: why would somebody take such a little, little, low-ambitious thing? Of course you can make it work if you put in enough caveats and enough approximations; that's what we're trained to do.

And there are some beautiful things that come out of it, like Steve Hawking's prediction of black hole radiation. So that's fine, but man, that's 1970s physics; do we want to do 1970s physics forever? I'm being deliberately a bit provocative, but, you know, we've got to wake these people up!

ES: So I read that, back in 2003, you co-wrote a paper [with Fotini Markopoulou] that showed what I'll say is an intriguing link between general ideas in quantum gravity and the fundamental non-locality of quantum physics. Now, maybe I should even ask you a setup question for this: we often state that quantum physics is fundamentally a non-local theory. And when we talk about quantum entanglement, we use that as sort of an illustration of that. But critics of that will say that no information ever travels faster than light from one quantum to another. Does this create any conflict in your mind? Would you say that quantum mechanics is fundamentally non-local?

LS: That quantum mechanics is fundamentally non-local, and therefore, making sense of quantum mechanics requires a strong modification in our understanding of what space is. And that General Relativity requires a strong modification in our ideas of what space is. And therefore, the things should go together. We shouldn't try to ignore that and do this, and then ignore this and do that; we should fix them together, in one move. And that's what I've been trying to do since 19... since I was in college.

That [paper], that was mostly [Markopoulou's] idea, and that was a very clever demonstration of the principle that space could be emergent, so that time could be fundamental. And that's what she believed, and she convinced me, and that's what I've been working on, really, for the last 20 years: the idea that time and causation are at the bottom, and are fundamental, and that space is a secondary, emergent quantity, like the pressure of the air or the temperature of the Earth. And so that's what we've been trying to do, and we've been having some moderate success along the road.

So that what we experience of the world, evolving in time event-by-event, event-by-event, is real; that's how the world really is. And out of that fundamental, active notion of time and causation, we make space as a derivative concept, the same way that out of the motion of atoms, you make a gas.

ES: Interesting. So you are very strongly an advocate that this classical notion of cause-and-effect persists all the way down to the quantum level. I would assume that this means you are not a fan of quantum mechanics interpretations that do not maintain cause-and-effect as a fundamental tenet of all interactions?

LS: Mmm-hmmm, yes.

ES: I know that you have stated, and I don't know if it's for ideological or physical reasons, that reality ought to be independent of us, the observer.

LS: Yes, of course.

ES: You say, yes, of course. And many people throughout the history of quantum mechanics have not thought, yes, of course. Can you explain why reality should be independent of the observer?

LS: Because I'm a realist, and for me the goal of science is precisely the description of nature as it would be in our absence. Now, that doesn't mean that there isn't a role for the observer. For example, in the theories I've been developing for the last five years (it's called the theory of views), what is real in that Universe is a view of that Universe, looking back, causally, into the past. And that's exactly what's real. John Bell, who was very much a realist, used to say, we have to say not what the observables are, but what the viewables are. So I've been developing this theory where we have events, and then have information or news that comes to them from the past, and that's what's real: those views. And the dynamics of the world doesn't depend on differential equations in space, or fields; it depends on the views, and the differences between those views. And the basic dynamical principle of the theory is that the Universe evolves to make the views as varied and as different from each other as possible.

ES: So you have a principle, then, of something that's either maximized or minimized.

LS: Of course.

ES: Is that something you could describe for us?

LS: Sure. It's called the variety. It can be applied to many different kinds of systems, so let's take cities. Consider an old city: the center of Rome, which was preserved. Think of calling a friend and saying, "I'm lost, I'm at some corner and here's what I see around me." Now, Rome is a city with a lot of variety, so your friend is gonna be able to say, "Oh, you're there, near the [whatever]," because every corner looks different. Rome is a city with high variety. On the other hand, there are some very suburban-dominated cities, in which you wouldn't know very much about where you are just from what you see when you look around, because many of the corners are similar to each other. So that can give you an example of what we mean when we say we want to increase the variety.

ES: So when you say, we want to increase the variety, do you think that nature extremizes variety?

LS: Yes, and I can write that down as an equation within the framework I discussed, where there are these causal relations, and there's energy and momentum, but there's no space. We can construct a dynamical theory that extremizes, over time, the variety of the system. And we derive from that quantum mechanics, and as a limit of that, classical mechanics.

Why do we get quantum mechanics out? Roughly speaking, there was an original realist interpretation of quantum mechanics called pilot wave theory, which Louis de Broglie invented in 1927 and which was reinvented by David Bohm in about 1952. And in that theory, there's potential energy and there's another, new function of the wavefunction, and it sits where the potential energy usually sits. And they derive the Schrodinger equation from maximizing the influence of this function. Well, it turns out that this function that David Bohm invented is a certain limit of the quantity we call the variety, which we defined, by the way, with Julian Barbour back in the 80s. And this was one of the great surprises of my working life.
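For readers who want the equation being alluded to, a standard textbook form (not Smolin's own notation) is obtained by writing the wavefunction as \psi = \sqrt{\rho}\, e^{iS/\hbar}; the extra term that "sits where the potential energy usually sits" is the quantum potential

    Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}}

and classical mechanics is recovered in the limit where Q is negligible next to the ordinary potential V.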

ES: When you take this limit of the quantity you call the variety, and you're saying we're extremizing over that, this sounds to me like something that would be pretty analogous to some type of entropy, some type of thermodynamic quantity. So far, everyone I know who's tried to come up with a concept where gravity is emergent, or space is emergent, or some other quantity that we normally look at as fundamental is in fact emergent, takes something that in typical physics thought we view as emergent and makes that fundamental. I would say the typical view of physics is that entropy is an emergent property that you can calculate based on, say, the microscopic quantum state of all the particles aggregated together. Are you basically doing something similar to that, except with this thing you define as variety instead of entropy?

LS: Roughly speaking, yes, but that's a long discussion, because the role of entropy in cosmological theory is something we have to get our heads straight about. There's a series of three very beautiful papers that Marina Cortês, Andrew Liddle and Stu Kauffman have, that we've been working on for a few years, and they contain some important new insights about very far-from-equilibrium systems and their relation to cosmology.

ES: I'd like to ask about this idea that Heisenberg and a lot of other people had, which is that unless you have what we call an interaction, in some sense one quantum interacting with another quantum, that's the only thing that provides meaningful information about the Universe. If you don't make a measurement, then you don't have a quantifiable property of the Universe. So all of the information that we have has to come out of that act, which I look at, maybe naively, as fundamentally antagonistic to this idea of an objective reality. The fact that we can't make any measurements that discern between this Heisenberg-esque picture of reality and an "objective reality exists" picture of reality: you have a certainty about your perspective that I don't share and that many physicists don't share. How do you make sense of this if you can't tell experimentally between these different interpretations?

LS: No, that's a fake. I don't know that, but it's a good working fake. Let me tell you about how I look at quantum mechanics these days, because it's new and it's been very exciting to me. Our realization came from actually following down some quotes of Heisenberg which were very mysterious at first: you know that Heisenberg said that the wavefunction description does not apply to the past. Somehow, the wavefunction was about the future, and the classical description is about the past. And a few people said this. Freeman Dyson said this at length; Schrodinger said something like that, and even deeper and more mysterious.

Recently I've been working on these questions with Clelia Verde, and we realized what they were trying to say: that in the Copenhagen version of quantum mechanics, there is a quantum world and there is a classical world, and a boundary between them, where things become definite. Where things that are indefinite in the quantum world become definite. And what they're trying to say is that that is the fundamental thing that happens in nature: when things that are indefinite become definite. And that's what now is. The moment now, the present moment, that all these people say is missing from science and missing from physics, that is the transition from indefinite to definite. And quantum mechanics, the wavefunction, is a description of the future, which is indefinite and incomplete. And classical physics is how we describe the past.

Why? Because the past happened; what happened was definite, and it doesn't change, because it's the past. So we have this different way of thinking about quantum mechanics, and it seems to be helpful; we're having a good time.

ES: It's very hard to disagree with that. So when you look at, let's say, Wheeler's delayed choice experiment. And I'm thinking in particular of one where you send in a photon and you have a beam splitter, and the photon can take two paths around mirrors, and they'll meet up on the other side. And either you'll have another beam splitter that will combine them, and you'll get your detector that will see an interference pattern of the recombined photons, or you won't put the splitter in there, and you'll just get one of the photons that comes into your detector.

So, you can do this, and Wheeler's idea is that you can send the photon through that first splitter, to have it go those two different ways. And then you can either put the second splitter there or not. And at the last second, you can either remove the splitter that was there (or not) or you can insert the splitter that wasn't there to try and, as he called it, catch the photon deciding on what it was going to do before you made that measurement.

In hindsight, to no one's surprise, what did you measure at the detector? Well, if the splitter was there, you get the interference pattern back. And if the splitter wasn't there, you'd just get the one photon back. Basically, nature doesn't know in advance what you're going to do. But once you do it, it's like it knew all along what you were going to do. That, to me, and you're going to tell me that's not the only interpretation, has always meant the act of interacting, itself, is what gives you that meaningful information. If you don't have any interactions taking place, you have not determined your reality yet. Your reality remains indeterminate until you make a measurement that would discern between the different possibilities.

LS: Yeah, but you see, I agree with that. Only, our line is: now is the boundary between the future and the past.

ES: Are you saying that right now, the in-progress things that have not yet been decided, that will be decided with an interaction at some point in the future... are you saying that everything in the past has already been determined, even those things where the measurement that will draw that line has not yet occurred?

LS: So that event has not yet occurred, so that's rather compatible. The notion of the now that this gives rise to is not a thin instant, where it has to happen here; it's what the philosophers call a thick now. So there can be events that turn something definite that are late, or that are early, so our now can zigzag quite a bit. At least, that's the way we try to understand those cases. They're not in the original papers with Verde, but we're going through all these thought experiments in detail to show how to think about what's going on.

ES: This is stuff that's right on the cutting edge of trying to understand what the fundamental nature of reality is. You've written very much, I'd say, non-positively about many of the ideas in string theory, and how they've become this dominant theoretical paradigm. One of the things I've noticed about your work is that it seems to be relatively agnostic about other extensions to what might be out there: string theory, supersymmetry, grand unification, etc. You seem pretty agnostic about it all, which is maybe in contrast to people's public perception of you.

LS: If people want to express an opinion about [my 2006 book, The Trouble With Physics], I would ask them the favor that they should read it. There was a lot of angst and conflict in that period, and I think people would be surprised here, but let me just tell you what I think. What I believe is that there are a number of interesting different approaches to quantum gravity, which so far are all incomplete. They all manage to explain something to us about what a quantum description of spacetime may be, but each of them also gets stuck somewhere on some characteristic.

String theory is a beautiful set of ideas, which in my view has gotten stuck. And so has loop quantum gravity, which I'm fortunate enough to have had the experience of working on while it was being invented, but it's also clearly gotten stuck. Both of them express the same idea: that there's a duality between fields carrying forces, like the electromagnetic field, and quantum excitations of those fields that can look like extended objects, like strings or loops, propagating. Both loop quantum gravity and string theory express, in different contexts, that fundamental conjecture.

What I tried to express in that book, and it's always the author's fault when you're misunderstood, is that it started as a case study of the role of conflict in science. Being a student of Paul Feyerabend, I think that conflict and disagreement are vital to the progress of science. And that book was meant to be an argument for that, using the case study that I knew best. As the book got shaped by me and by the editors, we flipped the book so that the case study came first and the analysis, in terms of how conflict plays a driving role in science, came second, and most people only read the first half.

What I was against, and what I am against wherever I see it, is premature dogmatism: premature believing in something more than what the evidence supports. And this, unfortunately, is very common in science, because we all want to believe that we've done something good and discovered something. There was an atmosphere at the time, which I think is very dissipated now, of over-optimism, in my view. I tried to give a balanced view of what the strengths of string theory were and what the weaknesses were, and unfortunately some people reacted to that. But that was a long time ago.

ES: Can I ask you what you think of certain effective approaches to quantum gravity? Like asymptotically safe gravity; do you think that offers any promise? I've always had an appreciation for that one because it seems to allow for predictions to be made in an otherwise inaccessible regime.

LS: Asymptotic safety has some very attractive points. It's basically an application by Steve Weinberg of some ideas about perturbatively non-renormalizable theories that Ken Wilson had, and he applied those ideas to gravity. It's a very attractive story, but there's a problem; as I said, there's always a problem. The problem in asymptotic safety is unitarity. We know of an asymptotically safe theory which is present even in perturbation theory. Can we speak a little math here?

ES: Go ahead, I'll translate.

LS: The action principle of the theory is the Einstein action, plus the cosmological constant term, plus a term in the Ricci scalar squared, plus a term in the Ricci tensor squared. And this last one invariably introduces instabilities and an impossibility of satisfying the principle of unitarity, which among other things means you can't guarantee that the probabilities for all the things that can happen will add up to one. And this has been a known problem since 1978 or 1982 or something, and I wrote the third paper in response to Steve's paper that showed the violation of unitarity. So that's where it stands in my mind, but it's always good to follow the kids, and there's a bunch of smart, young people working on this. It's not my bet, but it's their bet, and they're really good.
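In symbols, and up to conventions, the action Smolin is describing is, schematically,

    S = \int d^4x \sqrt{-g} \left[ \frac{1}{16\pi G}(R - 2\Lambda) + a\,R^2 + b\,R_{\mu\nu}R^{\mu\nu} \right]

where a and b are new couplings; it is the Ricci-tensor-squared term that is generally blamed for the ghost-like instabilities and the clash with unitarity he mentions.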

We don't have any senior faculty working on asymptotic safety at Perimeter, but we were so impressed by some of the young people who applied to us that, despite our own misgivings, we hired them for a few years. Because it's interesting and exciting to have them around, and if you want your field to prosper, you've got to be able to listen to and promote young people who disagree with you; otherwise it's not science.

ES: When I've felt optimistic about it, I've looked at asymptotically safe gravity the same way I now look back at the time-dependent Schrodinger equation. I say, okay, look, this has cases where it doesn't apply, and cases where it breaks down, because it's not a relativistically invariant theory. But perhaps you can find a formulation of it that is relativistically invariant, like the Dirac equation, or a more general formulation, like quantum field theory, that eliminates the need for that sort of thing. Maybe this idea can be salvaged, despite the fact that, the way it's formulated now, it doesn't guarantee unitarity.

LS: But if you tune it so that it gives you unitary answers to second or third order in perturbation theory, then the condition that there should be a non-trivial fixed point constrains the top quark mass by a measurable amount. They actually get a prediction: if this all works out, then here's the top quark mass.

ES: I remember reading a paper by Wetterich and Shaposhnikov, years before the mass of the Higgs boson had been measured, where they used the masses of all the particles except the Higgs to say, well, instead of getting the mass of the top we're going to get the mass of the Higgs, and the value they got was ~126 ± 1 GeV. But if I remember right, since then the measured mass of the top has changed a little bit, and now if you put that same math back in, you'd get something like 129 or 130, which doesn't agree with what they've seen at the LHC.

LS: I didn't know that; that's interesting. That's great. What else excites you?

ES: One thing I'd like to press you on a little bit is this: if you have a dynamical spacetime rather than a static one, how can you describe wavefunction collapse? If you have a wavefunction in a changing spacetime, what does its collapse look like?

LS: Roger Penrose's view of that is that the collapse of the wavefunction is a physical thing that happens when a certain measure of the energy involved in that possible event reaches the Planck energy per Planck time, or something like that; I don't remember the exact way he did it. So then you're in a domain where neither the Einstein equation nor the Schrödinger equation is quite right.
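Smolin is recalling this from memory; the criterion usually attributed to Penrose, stated here as background rather than as his exact formulation, sets a collapse timescale from the gravitational self-energy $E_G$ of the difference between the superposed mass configurations:

$$
\tau \sim \frac{\hbar}{E_G}
$$

Large, massive superpositions have large $E_G$ and collapse quickly, while microscopic ones persist, which is why the tabletop experiments he mentions next aim at mesoscopic masses.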

What I'm really, really excited about is that there are some experiments under development that actually test that. There's a whole new generation of tabletop gravity, or quantum gravity, experiments that different people are working on.

ES: I like the tabletop experiments that are happening. One thing that I definitely wanted to ask you about is this: you've talked, I'll say derisively, about people who treat conclusions as foregone without having the evidence to back that up. You want to remain open-minded to anything that may be possible before that critical evidence comes in. Do you worry that taking the stance of saying, "I am a realist when it comes to quantum physics," violates that piece of advice? That saying, "I'm a realist and I believe that reality is observer-independent," is making that mistake?

LS: You know, I don't know what's wrong with me, but I love this stuff to death. I am having so much fun, and there's nothing like being able to think about this stuff. Some people have this wire in them that says they have to be right, and I don't have that. I don't know why; maybe it's a defect? So, sure, if you ask me: yeah, I could be wrong about that. I could be wrong about a lot of things.

Let's put ourselves 1,000 years in the future: we'll all look like fools for having missed the obvious things in neuroscience or planetary science or something that turned out to be important. There was a famous boxer who was asked how he felt about his career, and he said, you know, I did the best I could with what I was given. And I'm happy with that. I don't gotta be right, but if I didn't follow what I believe in, I wouldn't be as happy a person now.

ES: I want to pull out a Niels Bohr quote, then, and ask your opinion of it: "When we measure something, we are forcing an undetermined, undefined world to assume an experimental value. We are not measuring the world; we are creating it." This strikes me as a statement I would expect you to fundamentally disagree with, but you might surprise me.

LS: No, it doesn't appeal to me, but wow, I'm really sorry I never got to meet Bohr. He was an interesting guy; can't we just be on that level? In the end, Bohr was at a very weird place, from our point of view, in the development of Western culture and society. He was influenced by Schopenhauer and people like that, and so he had what we would consider not just a non-realist viewpoint but a radical non-realist viewpoint, and he did the best he could with that. But I don't believe it; it doesn't keep me up at night, but sure.

ES: Are there any thoughts I haven't asked you about that you think are too important not to share?

LS: Open up the scientific community to more people who are highly trained and really good. And maybe I'm just getting this in because I like these ideas. For me, when people talk about diversity, that means not just women and Black and Aboriginal people and so on, all of which is very, very important, but also people who think differently. Now, to make a success in physics, you can't just be anyone off the street; it's like how I couldn't compose a piece of music, send it to the New York Philharmonic, and have them play it.

You've gotta have your tools, you've got to be practiced, you've gotta be good with your tools, and you've gotta make a convincing case for the results you've found in your work. That's what a Ph.D. symbolizes. But among the people who are excellent technically, we want as wide a variety as possible of ideas and viewpoints and types and personalities and genders and races; it's yes, yes, yes, yes. I would hope that the next generation and the second-to-next generation live in a scientific world that is much more fun. Because if everyone's like you, it's not fun.

Lee Smolin will be appearing at the HowTheLightGetsIn London 2021 festival this September 18/19, with remaining tickets still available here.

Here is the original post:

Are we approaching quantum gravity all wrong? - Big Think


Time Crystals dodge the second law of thermodynamics, Thanos style – PC Gamer

People are calling it a breakthrough, a new phase of matter. Google's invention of Time Crystals has the science and tech scenes in a daze. No, we're not talking about one of six physics-defying stones evading the grasp of Thanos. This is actual science that apparently defies our current understanding of physics.

Here is a discovery that could move us closer to the most accurate and powerful atomic clocks, gyroscopes, and magnetometers to date, but that's just the icing on the cake. Who knows what other practical applications a discovery of this magnitude could have.

Within Google's Sycamore quantum processor, strange things are afoot. A new state of matter, theorised nine years ago by theoretical physicist Frank Wilczek, has been achieved. By blasting strips of superconducting aluminum with microwaves, the system's qubits, which encode the ones and zeros in quantum computing, were put into a kind of perpetual motion.

A paper posted to the arXiv preprint server (hosted by Cornell University), titled Observation of Time-Crystalline Eigenstate Order on a Quantum Processor, notes: "We demonstrate the characteristic spatiotemporal response of a DTC for generic initial states. Our work employs a time-reversal protocol that discriminates external decoherence from intrinsic thermalization, and leverages quantum typicality to circumvent the exponential cost of densely sampling the eigenspectrum."

More simply put: "It just sort of flip-flops," explains University of Birmingham physicist Curt von Keyserlingk. "It doesn't end up looking random; it just gets jammed, stuck. It's like it remembers what it looked like initially, and it repeats that pattern over time."
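As a loose illustration of that flip-flop, here is a toy sketch, emphatically not a real many-body time crystal, of a single driven qubit whose state repeats every two drive periods; the pulse angle and period count are invented for the example:

```python
import numpy as np

# Toy sketch of a period-doubled ("flip-flop") response: a qubit hit by a
# near-pi pulse each drive period alternates between mostly-up and
# mostly-down, so its pattern repeats every TWO periods. A genuine
# discrete time crystal needs many interacting, disordered spins to lock
# this subharmonic response in despite pulse imperfections; a lone qubit
# like this one slowly drifts instead.

theta = 0.97 * np.pi  # slightly imperfect flip (assumed value)
pulse = np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                  [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

state = np.array([1.0, 0.0], dtype=complex)  # start spin-up
for period in range(1, 9):
    state = pulse @ state
    print(f"period {period}: P(up) = {abs(state[0])**2:.3f}")
```

Running it prints a probability that alternates high/low from one period to the next, the cartoon version of the pattern von Keyserlingk describes.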

What this essentially means is that it may be time to rethink the second law of thermodynamics. The second law basically states that entropy (or disorder) will always increase. This is why perpetual motion machines have never been invented; at least, potentially, until now.

Speaking to Live Science, Loughborough University physicist Achilleas Lazarides (who was part of the theory's initial development) describes time crystals by way of an ever-swinging pendulum. The second law dictates that, "even if you totally physically isolate a pendulum from the universe, so there's no friction and no air resistance, it will eventually stop." This 'thermalisation' just doesn't happen in the case of time crystals.

"Energy starts out concentrated in the pendulum's center of mass, but there's all of these internal degrees of freedomlike the ways the atoms can vibrate inside the rodthat it will eventually be transferred into."

Eventually, the discovery of Time Crystals could translate into supreme quantum computing technology. For now, while researchers are still in the very early stages of experimentation, it seems to point to some profound insights into the workings of the universe. None of it is concrete at this point, though, and quantum physics is not an easy thing to explain, let alone research, so just keep that in mind.

Now I just have a massive urge to play Bioshock Infinite.

The rest is here:

Time Crystals dodge the second law of thermodynamics, Thanos style - PC Gamer


UArizona Engineer Awarded $5M to Build Quantum-Powered Navigation Tools – The University of Arizona Research

By Emily Dieckman, College of Engineering


Zheshen Zhang, a University of Arizona assistant professor of materials science and engineering, is leading a $5 million quantum technology project to advance navigation for autonomous vehicles and spacecraft, as well as measurement of otherworldly phenomena such as dark matter and gravitational waves.

The National Science Foundation's Convergence Accelerator Program, which fast-tracks multidisciplinary efforts to solve real-world problems, is funding the Quantum Sensors project.

In September 2020, 29 U.S. teams received phase I funding to develop solutions in either quantum technology or artificial intelligence-driven data sharing and modeling. Ten prototypes have advanced to phase II, each receiving $5 million, including two projects led by UArizona researchers: Zhang's project and another led by hydrology and atmospheric sciences assistant professor Laura Condon.

"Quantum technology and AI innovation are a priority for the National Science Foundation," said Douglas Maughan, head of the NSF Convergence Accelerator program. "Today's scientific priorities and national-scale societal challenges cannot be solved by a single discipline. Instead, the merging of new ideas, techniques and approaches, plus the Convergence Accelerator's innovation curriculum, enables teams to speed their research into application. We are excited to welcome Quantum Sensors into phase II and to assist them in applying our program fundamentals to ensure their solution provides a positive impact on society at large."

Upgrading Gyroscopes and Accelerometers

The objects we interact with in our daily lives adhere to classic laws of physics, like gravity and thermodynamics. Quantum physics, however, has different rules, and objects in quantum states can exhibit strange but useful properties. For example, when two particles are linked by quantum entanglement, anything that happens to one particle affects the other, no matter how far apart they are. This means probes in two locations can share information, allowing for more precise measurements. Or, while "classical" light emits photons at random intervals, scientists can induce a quantum state called "squeezed" light to make photon emission more regular and reduce uncertainty or "noise" in measurements.

The Quantum Sensors project will take advantage of quantum states to create ultrasensitive gyroscopes, accelerometers and other sensors. Gyroscopes are used in navigation of aircraft and other vehicles to maintain balance as orientation shifts. Accelerometers, working in tandem with them, measure vibration and acceleration. These navigation-grade gyroscopes and accelerometers are light-based and can be extremely precise, but they are bulky and expensive.

Many electronics, including cellphones, are equipped with tiny gyroscopes and accelerometers that enable features like automatic screen rotation and directional pointers for GPS apps. At this scale, gyroscopes are made up of micromechanical parts, rather than lasers or other light sources, rendering them far less precise. Zhang and his team aim to develop chip-scale light-based gyroscopes and accelerometers to outperform current mechanical methods. However, the detection of light at this scale is limited by the laws of quantum physics, presenting a fundamental performance limit for such optical gyroscopes and accelerometers.

Rather than combat these quantum limitations with classical resources, Zhang and his team are fighting fire with fire, so to speak, by using quantum resources. For example, the stability of squeezed light can counterbalance the uncertainty of quantum fluctuations, which are temporary changes in variables such as position and momentum.

"The fundamental quantum limit is induced by quantum fluctuations, but this limit can be broken using a quantum state of light, like entangled photons or squeezed light, for the laser itself," said Zhang, director of the university's Quantum Information and Materials Group. "With this method, we can arrive at much better measurements."

Gaining an Edge on Earth and Beyond

The benefits of extremely precise measurements are numerous. If a self-driving car could determine its exact location and speed using only a compact, quantum-enhanced, onboard gyroscope and accelerometer, it wouldn't need to rely on GPS to navigate. A self-contained navigation system would protect the car from hackers and provide more stability. The same goes for navigation of spacecraft and terrestrial vehicles sent to other planets.

"In both space-based and terrestrial technologies, there are a lot of fluctuations. In an urban environment, you might lose GPS signal driving through a tunnel," Zhang said. "This method could capture information not provided by a GPS. GPS tells you where you are, but it doesnt tell you your altitude, the direction your vehicle is driving or the angle of the road. With all of this information, the safety of the passengers would be ensured."

Zhang is collaborating with partners at General Dynamics Mission Systems, Honeywell, the NASA Jet Propulsion Laboratory, the National Institute of Standards and Technology, Purdue University, Texas A&M University, UCLA and Morgan State University.

"We are excited to work with the University of Arizona on this NSF Convergence Accelerator project," said Jianfeng Wu, Honeywell representative and project co-principal investigator. "The integrated entangled light sources can reduce the noise floor and enable the navigation-grade performance from chip-scale gyroscopes. The success of this program will significantly disrupt the current gyroscope landscape from many perspectives."

Because precise navigation would directly affect 700 million people worldwide, researchers estimate that quantum sensors could create a $2.5 billion market by 2035. They also expect that the precision and stability offered by the technology will give researchers a way to measure previously unmeasurable forces, such as gravitational waves and dark matter.

"As a leading international research university bringing the Fourth Industrial Revolution to life, we are deeply committed to advance amazing new information technologies like quantum networking to benefit humankind, said University of Arizona PresidentRobert C. Robbins. "The University of Arizona is an internationally recognized leader in this area, and I look forward to seeing how Dr. Zhang's Quantum Sensors project moves us forward in addressing real-world challenges with quantum technology."

Go here to see the original:

UArizona Engineer Awarded $5M to Build Quantum-Powered Navigation Tools - The University of Arizona Research
