Category Archives: Quantum Computing

Supercomputers: Tackling the Power Problem in High-Performance … – EnergyPortal.eu

Supercomputers have long been at the forefront of technological advancements, enabling researchers to tackle complex problems and simulate processes that would be impossible to study in real-world conditions. These high-performance computing (HPC) systems have been instrumental in advancing fields such as climate modeling, drug discovery, and astrophysics. However, as the demand for more powerful supercomputers continues to grow, so does the need for a more energy-efficient solution to power these behemoths.

Traditionally, supercomputers have relied on thousands of processors working in parallel to perform complex calculations at breakneck speeds. While this approach has yielded impressive results, it has also led to a significant increase in power consumption. As a result, researchers and engineers have been exploring alternative computing paradigms that could potentially offer both increased performance and reduced energy requirements.

One such promising technology is quantum computing, which leverages the principles of quantum mechanics to perform certain calculations that are intractable for classical computers. Unlike traditional computing, which relies on bits that are either 0 or 1, quantum computing uses qubits, which can exist in a superposition of both states simultaneously. This allows quantum computers to explore many computational paths at once, potentially leading to exponential speedups on certain problem-solving tasks.
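
The superposition idea can be made concrete with a small classical simulation of a single qubit (a sketch for intuition only; real quantum hardware is not simply matrix arithmetic, and the gate names here are standard textbook conventions, not tied to any vendor mentioned in this article):

```python
import numpy as np

# A qubit's state is a 2-component complex vector; superposition means
# both amplitudes can be nonzero at once.
zero = np.array([1, 0], dtype=complex)   # the |0> basis state
one = np.array([0, 1], dtype=complex)    # the |1> basis state

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- an equal chance of reading out 0 or 1
```

Reading the qubit collapses this superposition to a single classical bit, which is why extracting useful answers requires carefully designed algorithms rather than naive parallelism.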

The potential applications of quantum computing in the realm of high-performance computing are vast. For example, quantum computers could be used to simulate the behavior of molecules and materials at the quantum level, leading to breakthroughs in materials science and drug discovery. Additionally, quantum computers could be used to optimize complex systems, such as transportation networks and supply chains, leading to increased efficiency and reduced costs.

Despite the potential benefits of quantum computing, there are still significant challenges that must be overcome before this technology can be widely adopted in the HPC space. One of the primary obstacles is the development of error-correcting techniques that can mitigate the inherent instability of qubits. Additionally, researchers must find ways to scale up the number of qubits in a quantum computer, as current prototypes typically have only a few dozen qubits.

Another challenge facing the adoption of quantum computing in the HPC space is the development of suitable software and algorithms. While some progress has been made in this area, there is still much work to be done in order to fully harness the power of quantum computing. This includes the development of new programming languages and tools that can effectively leverage the unique capabilities of quantum computers.

Despite these challenges, there is a growing consensus among researchers and industry leaders that quantum computing represents the future of high-performance computing. As a result, significant investments are being made in the development of this technology, both by governments and private companies. For example, the United States recently announced a $1 billion initiative to support research in quantum computing and artificial intelligence, while companies such as IBM, Google, and Intel are actively working on developing their own quantum computing platforms.

In conclusion, the quest for more powerful and energy-efficient supercomputers has led researchers to explore the potential of quantum computing as a viable alternative to traditional high-performance computing paradigms. While there are still significant challenges to be overcome, the potential benefits of quantum computing in terms of performance and energy efficiency make it an attractive option for the future of supercomputing. As research and development in this area continue to progress, it is likely that we will see quantum computing play an increasingly important role in tackling the power problem in high-performance computing.


US Congress to consider two new bills on artificial intelligence – Economic Times

US senators on Thursday introduced two separate bipartisan artificial intelligence bills amid growing interest in addressing issues surrounding the technology. One would require the US government to be transparent when using AI to interact with people, and the other would establish an office to determine whether the United States is remaining competitive in the latest technologies.

Lawmakers are beginning to consider what new rules might be needed because of the rise of AI. The technology made headlines earlier this year when ChatGPT, an AI program that can answer questions in written form, became generally available.

"The federal government needs to be proactive and transparent with AI utilization and ensure that decisions aren't being made without humans in the driver's seat," said Braun in a statement.

"We cannot afford to lose our competitive edge in strategic technologies like semiconductors, quantum computing, and artificial intelligence to competitors like China," Bennet said.

The briefings include a general overview of AI, an examination of how to achieve American leadership in AI, and a classified session on defense and intelligence issues and implications.


AI 101: 10 artificial intelligence terms to keep up with this new-age … – YourStory

Over the last few months, artificial intelligence (AI) has dominated headlines everywhere. And now, it looks like this new-age technology has taken over big tech companies, enterprises, startups, schools, and life itself. From music, art, and films to homework, news, and more, AI is the next big thing.

Suffice it to say, it is important to keep up with why AI is this interesting, what it is made up of, and why exactly it is a big deal.

In this article, we will navigate the vast landscape of artificial intelligence terminology across machine learning (ML), natural language processing (NLP), deep learning, quantum computing, and a lot more.

While numerous scientists and engineers laid the groundwork for AI from the 1940s onwards, American computer scientist John McCarthy coined the term Artificial Intelligence in 1955. A year later, he, along with other scientists, held the first AI conference at Dartmouth College.

In the 1980s, there was a shift towards neural networks and machine learning approaches. Researchers explored algorithms inspired by the structure and functioning of the human brain, enabling machines to learn from data and improve their performance over time.

The late 1980s and early 1990s witnessed a period known as the "AI Winter" when interest and funding significantly declined in this area due to unmet expectations. However, the field experienced a resurgence in the late 1990s with advancements in areas such as data mining, natural language processing, and computer vision.

In recent years, the availability of vast amounts of data and advancements in computational power have fuelled breakthroughs in AI. Deep learning, a subfield of machine learning that utilises neural networks with multiple layers, has led to significant advancements in image and speech recognition, natural language processing, and other AI applications.

An algorithm is a definite set of instructions that allow a computer to perform a certain task. AI algorithms help a computer understand how to perform certain tasks and achieve the desired results on its own. In other words, algorithms set the process for decision-making.
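
A classic illustration of such a definite set of instructions is binary search, which locates a value in a sorted list by repeatedly halving the search range (a generic example, not tied to any product mentioned in this article):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # inspect the middle element
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1

print(binary_search([2, 5, 8, 12, 21], 12))  # 3
```

Every step is fully specified in advance; what distinguishes AI algorithms is that parts of the decision-making process are derived from data rather than written out by hand.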

Machine learning is a subset of AI. It effectively enables machines to learn using algorithms, data and statistical models to make better decisions. While AI is a broad term that refers to the ability of computers to mimic human thought and behaviours, ML is an application of AI used to train computers to do specific tasks using data and pattern recognition.
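
A minimal sketch of learning from data and pattern recognition is a toy nearest-centroid classifier: it computes the average of each class from labelled examples, then classifies new points by proximity (illustrative only; real ML systems use far richer models):

```python
import numpy as np

# Toy training data: two clusters of 2-D points with labels 0 and 1.
X = np.array([[1.0, 1.2], [0.8, 1.0], [1.1, 0.9],   # class 0
              [4.0, 4.2], [3.9, 4.1], [4.2, 3.8]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# "Training": compute the mean (centroid) of each class from the data.
centroids = {label: X[y == label].mean(axis=0) for label in (0, 1)}

def predict(point):
    # Classify a new point by its nearest class centroid.
    dists = {label: np.linalg.norm(point - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

print(predict(np.array([1.0, 1.0])))  # 0
print(predict(np.array([4.0, 4.0])))  # 1
```

No rule for telling the clusters apart was ever written by hand; the decision boundary falls out of the data, which is the essential difference between ML and conventional programming.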

A subset of ML, deep learning trains computers to do what comes naturally to humans: learn by example. Computer models can be taught to perform tasks by recognising patterns in images, text, or sound, sometimes surpassing humans in their ability to make connections. Computer scientists leverage large data sets and neural network architectures to teach computers to perform these tasks.

DL is employed in cutting-edge technology like driverless cars to process a stop sign or differentiate between a human and a lamp post.

Yet another application of ML, natural language processing helps machines understand, interpret, and process human language to perform routine tasks. It uses rules of linguistics, statistics, ML, and DL to equip computers to fully understand what a human is communicating through text or audio and perform relevant tasks. AI virtual assistants and AI voice recognition systems like voice-operated GPS are examples of NLP.

Computer vision is a form of AI that trains computers to recognise visual input. For instance, a machine will be able to analyse and interpret images, videos and other visual objects to perform certain tasks that are expected of it.

An example is medical professionals using this technology to scan MRIs, X-rays or ultrasounds to detect health problems in humans.

Robotics is a branch of engineering, computer science, and AI that designs machines to perform human-like tasks without human intervention. These robots can be used for a wide variety of tasks that are too complex or difficult for humans, too repetitive, or both. For example, a robotic arm that assembles cars on an assembly line is a robot.

Data science uses large sets of structured and unstructured data to generate insights that data scientists and others can use to make informed decisions. Often, data science employs ML practices to find solutions to different challenges and solve real-world problems.

For instance, financial institutions may employ data science to analyse a customer's financial situation and bill-paying history to make better decisions on lending.

An extension of data science is data mining. It involves extracting useful and pertinent information from a large data set and providing valuable insights. It is also known as knowledge discovery in data (KDD). Data mining has numerous applications, including in sales and marketing, education, fraud detection, and improving operational efficiency.

Quantum computing uses the theory of quantum physics to solve complex problems that classical computing cannot. By encoding problems in quantum states rather than classical bits, it can run certain complex simulations in a matter of seconds.

Google has a quantum computer that it claims is 100 million times faster than a conventional computer on specific benchmark tasks. Quantum computing can be applied in fields ranging from cybersecurity to pharmaceuticals to solve big problems with fewer resources.

A chatbot employs AI and NLP to simulate human conversations. It can operate through text or voice conversations. Chatbots use AI to analyse millions of conversations, learn from human responses and mimic them to provide human-like responses. This tech has found great usage in customer service and as AI virtual assistants.

All AI and ML tools are trained by humans, which means any inherent human bias can surface as AI bias. AI bias refers to the tendency of machines to adopt the biases of the people who code or train them. Algorithms can often reinforce human biases.

For instance, a facial recognition platform may be able to recognise Caucasian people better than people of colour because of the data set it has been fed. It is possible to reduce AI bias through more testing in real-life circumstances, accounting for these biases and improving how humans operating these systems are educated.

Today, AI represents the human capacity to create, innovate, and push the boundaries of what was once thought impossible.

So, whether you are an AI enthusiast, a curious learner, or a decision-maker shaping the future, it is essential to equip yourself with the right knowledge to survive in a world that is being increasingly powered by AI and its tools.


Quantum Computing: A New Era of Smart Cities and Infrastructure – CityLife


Quantum computing, a groundbreaking technology that has the potential to revolutionize various industries, is now poised to transform the way we build and manage smart cities and infrastructure. This cutting-edge innovation, which leverages the principles of quantum mechanics to perform complex calculations at speeds previously unimaginable, promises to enhance the efficiency, sustainability, and resilience of urban environments.

As the world's population continues to grow and urbanize, the need for smart cities and infrastructure has become increasingly urgent. These interconnected, data-driven systems are designed to optimize the use of resources, reduce environmental impact, and improve the quality of life for residents. However, the sheer volume of data generated by smart cities, coupled with the complexity of the challenges they face, has pushed the limits of classical computing.

Enter quantum computing, a technology that has the potential to tackle problems that are currently intractable for classical computers. Unlike classical computers, which use bits to represent information as either 0s or 1s, quantum computers use quantum bits, or qubits, which can represent both 0 and 1 simultaneously. This allows quantum computers to work through many possibilities at once, enabling them to solve certain complex problems exponentially faster than their classical counterparts.

One of the most promising applications of quantum computing in the realm of smart cities and infrastructure is traffic management. With the ability to process vast amounts of data in real-time, quantum computers could optimize traffic flow, reducing congestion and emissions while improving overall transportation efficiency. For example, by analyzing data from traffic sensors, GPS devices, and other sources, quantum computers could predict and prevent traffic jams, dynamically adjust traffic signals, and even recommend alternative routes to drivers.

Another area where quantum computing could have a significant impact is energy management. As the demand for clean, renewable energy sources grows, so too does the need for efficient energy distribution and storage systems. Quantum computers could help optimize the energy grid by analyzing and predicting fluctuations in supply and demand, allowing for more efficient allocation of resources and reducing waste. Moreover, quantum computing could also play a role in the development of new materials for energy storage, such as advanced batteries, by simulating their properties at the atomic level.

In addition to traffic and energy management, quantum computing could also revolutionize the way we design and maintain infrastructure. For instance, by simulating the behavior of materials and structures under various conditions, quantum computers could help engineers develop more resilient and sustainable buildings, bridges, and other infrastructure elements. Furthermore, quantum computing could enable more effective monitoring and maintenance of infrastructure by analyzing sensor data to detect potential issues before they become critical.

Of course, the widespread adoption of quantum computing in smart cities and infrastructure is not without its challenges. One of the primary obstacles is the development of scalable, reliable quantum computers, which are still in their infancy. Additionally, integrating quantum computing into existing systems and ensuring data security will be crucial to its success.

Despite these challenges, the potential benefits of quantum computing for smart cities and infrastructure are too significant to ignore. As the technology continues to advance, it is likely that we will see quantum computing play an increasingly important role in the development and management of urban environments. By harnessing the power of quantum computing, we can build smarter, more sustainable cities that are better equipped to meet the challenges of the 21st century and beyond.


The Role of Neuromorphic Computing in the Future of Quantum … – CityLife

Exploring the Synergy between Neuromorphic Computing and Quantum Computing for Advanced AI Applications

The rapid advancements in artificial intelligence (AI) and machine learning (ML) have created a significant demand for powerful computing systems that can handle the massive amounts of data and complex algorithms involved in these fields. Traditional computing architectures, such as those based on the von Neumann model, are reaching their limits in terms of energy efficiency and processing capabilities. This has led researchers to explore alternative computing paradigms, such as neuromorphic computing and quantum computing, which hold the potential to revolutionize the way we process and analyze information.

Neuromorphic computing is a novel approach that aims to mimic the structure and function of the human brain in order to create more efficient and adaptive computing systems. It is based on the idea of using artificial neural networks, which are composed of interconnected artificial neurons, to process and store information. These networks can be implemented in hardware, using specialized electronic components, or in software, running on conventional computing platforms. Neuromorphic systems are designed to be highly parallel, fault-tolerant, and energy-efficient, making them well-suited for AI and ML applications.

Quantum computing, on the other hand, is a fundamentally different approach that relies on the principles of quantum mechanics to perform computations. Unlike classical computers, which use bits to represent information as either 0 or 1, quantum computers use quantum bits, or qubits, which can exist in multiple states simultaneously due to a phenomenon known as superposition. This allows quantum computers to perform certain types of calculations much faster than classical computers, potentially enabling them to solve problems that are currently intractable.

The synergy between neuromorphic computing and quantum computing is an exciting area of research that could lead to the development of advanced AI applications that were previously thought to be impossible. By combining the strengths of both paradigms, researchers hope to create hybrid systems that can tackle complex problems in areas such as natural language processing, pattern recognition, and decision-making.

One of the key challenges in developing such hybrid systems is finding ways to integrate neuromorphic and quantum components in a seamless and efficient manner. Researchers are exploring various techniques to achieve this, such as using quantum-inspired algorithms to train neuromorphic networks, or employing neuromorphic hardware to control and read out the states of qubits in a quantum processor.

Another important aspect of this research is the development of new materials and fabrication techniques that can support the implementation of neuromorphic and quantum devices. For example, researchers are investigating the use of superconducting materials, which can carry electrical currents without resistance, to create energy-efficient neuromorphic circuits and qubits. They are also exploring the potential of nanoscale structures, such as quantum dots and nanowires, to enable the miniaturization and integration of these devices.

As the field of neuromorphic-quantum computing continues to evolve, it is expected to have a profound impact on the future of AI and ML. By harnessing the power of both neuromorphic and quantum computing, researchers aim to develop systems that can learn and adapt in real-time, allowing them to handle complex tasks with greater speed and accuracy than ever before. This could lead to breakthroughs in areas such as robotics, autonomous vehicles, and personalized medicine, among others.

In conclusion, the synergy between neuromorphic computing and quantum computing holds great promise for the future of AI and ML applications. By exploring the potential of these two emerging paradigms, researchers are paving the way for the development of advanced computing systems that can tackle some of the most challenging problems in science and technology. As we continue to push the boundaries of what is possible with AI and ML, the integration of neuromorphic and quantum computing will undoubtedly play a crucial role in shaping the future of these fields.


Strangeworks Announces an Expansion of Its Advanced Compute Platform – Quantum Computing Report

Strangeworks, an Austin, Texas based software startup, has updated and expanded its platform to add additional functionality. Besides offering access to over 60 different hardware platforms, the company has now added algorithm libraries from both internal and external sources that its customers can use to accelerate their applications.

Examples of libraries provided by their external partners include a quantum kernel method from Rigetti, a portfolio optimization application from Multiverse, and an airline route optimization application from BosonQ Psi. Strangeworks has also made available three internally developed algorithms for various optimization applications, including QAOA (Quantum Approximate Optimization Algorithm) and VQE (Variational Quantum Eigensolver). These applications can be run on several different quantum hardware backends, allowing a user to compare the results. The company also announced it will be providing both classical and quantum-inspired solutions that leverage artificial intelligence technology, giving its customers additional alternatives when a quantum approach isn't a good fit.
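
For readers unfamiliar with VQE, its core idea, a parameterised trial state whose energy is minimised by a classical outer loop, can be sketched in a few lines of classical simulation. This is a generic textbook illustration, not Strangeworks' implementation, and the one-qubit Hamiltonian is chosen purely for simplicity:

```python
import numpy as np

# Hamiltonian: the Pauli-X operator on one qubit; its ground-state energy is -1.
H = np.array([[0, 1], [1, 0]], dtype=float)

def ansatz(theta):
    # Parameterised trial state: Ry(theta) applied to |0>.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = ansatz(theta)
    return psi @ H @ psi   # expectation value <psi|H|psi> = sin(theta)

# Classical outer loop: sweep the parameter and keep the lowest energy.
# (On real hardware, each energy evaluation would be a quantum measurement.)
thetas = np.linspace(0, 2 * np.pi, 1001)
best = min(thetas, key=energy)
print(round(energy(best), 3))  # -1.0, the exact ground-state energy
```

In a production setting the expectation value is estimated on a quantum backend while a classical optimizer updates the parameters, which is why VQE is considered well suited to today's noisy, small-scale devices.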

Other improvements include improved enterprise business and reporting tools for controlling usage, as well as an updated Backstage Pass program, which provides early access to new quantum hardware, software, features, and functionality. A recent addition to this program is Quandela's photonic quantum computer in the cloud.

More information about these new features is available in a news release on the Strangeworks website here.

April 29, 2023


Quantum commercialization softly, softly towards the inevitable future, says backer – Diginomica


The quantum tipping point, that fabled moment when quantum technologies break through to commercial adoption at scale, has been questioned in a previous diginomica report.

According to speakers at Davos this year, a more likely scenario is the gradual emergence of vertical use cases in which a quantum advantage (finding hidden correlations between data points, for example) will present itself, plus algorithms that allow the querying of massive data sets more quickly than classical models.

Even so, investors of every size are beginning to back quantum technologies, especially those that relate to enterprise data. And they are doing so despite the lack of strategic, long-term thinking among potential customers.

According to McKinsey, quantum investment exceeded $1.4 billion in 2021, while the UK government launched a 10-year, £2.5 billion ($3.2 billion) funding drive in March.

One of the challenges, says David Magerman, a founding partner at New York City-based investment house Differential Ventures, is that in a world of AI hype and a rush to associate with popular technologies like GPT, some enterprise users prefer a quick hit of tactical advantage to a long bet with an uncertain payback date.

A sugar rush trumps healthy eating for the future, perhaps? Magerman says:

I come from the financial industry. And while there are a lot of problems that quantum computing will solve significantly better [than classical systems], the financial industry is very much about the next five minutes.

All the spending that goes on research in finance is about solving problems today. There's such a competitive environment that they can't afford to be too forward-looking. We can't get the financial industry to devote many resources to quantum until production solutions are imminent. Then they'll spend a lot of money playing catch-up.

But every dollar they spend on forward-looking research gives their competitors a chance to beat them today.

A frustrating problem for tech innovators who need financial backing. One that applies across many sectors, though in areas such as cybersecurity and communications, where quantum-safe technologies are vital, and industries such as materials science and pharmaceuticals, early investment is a more immediate priority.

Yet being first out of the (quantum) gate is what drives Differential Ventures, a seed fund that backs promising early-stage companies in quantum, plus AI, machine learning, and data science. That said, seed investment in the US can mean writing cheques for a few million dollars, as well as smaller, less risky sums.

Hotspots include bridging the gap between quantum technologies and classical environments, says Magerman. For example, his company recently led a $6.1 million seed extension round in Agnostiq, the Toronto-based distributed computing start-up that is building Covalent, an enterprise quantum and high-performance computing platform.

His company is also backing more personal artificial intelligence (AIs that query personally identifiable data, or PID), and their necessary flipside: private AIs, which keep PID safe from acquisitive AIs online. That certainly keeps the options open!

He explains the Differential philosophy:

A lot of people promote fantastical solutions that are oversold and aren't realistic. We'll avoid those companies. And those that are pitching things that aren't that hard, even if they're interesting applications. Once those ideas hit the mainstream, they are hard to protect from the competition, so we avoid those investments too.

Really, it's based on my experience as a data scientist.

So, how common is demand for seed funding in quantum computing, an area often based in research labs and university spinouts, where high-risk, long-term, big-ticket VC backing would seem to be the order of the day? Massive funds that can afford a speculative under-performer in their portfolio when the wins elsewhere might be massive?

He says:

We typically avoid hardware companies, because they tend to require much more capital. And not only to start out: a quantum hardware company might raise a few million dollars in their first investment round, but they're going to require probably hundreds of millions to get to product.

But there are a lot of quantum software companies that typically want the seed rounds we look to invest in. They can get to product and get to revenue at scale, which means we can continue to invest in them before they get too big for us.

For example, Agnostiq recognized early on that the big problem impeding the growth of quantum research was orchestration tools that could integrate classical computing workloads with quantum computing resources. So, they built a product that allows companies to do high-performance computing orchestration.

So, does Magerman believe that a tipping point is approaching, or is it more a case of softly, softly? Because on the face of it, a $42 million seed fund (at present) could not afford to wait too long for commercial payback in such an uncertain space?

He explains:

I wouldn't say it's fast approaching, but I would say it's inevitable. And I think it will approach more quickly based on more intellectual and financial resources being devoted to solving the problem.

The subtext is the limits of Moore's Law being reached, at which point quantum may offer a promising way forward: a new upward slope of increasing speed and power, assuming the challenge of noise in quantum circuits can be, if not solved, then at least made workable at scale.

He says:

The end of Moore's Law, plus the supply-chain issues that we had with silicon during COVID are among the factors contributing to the increased attention on quantum, accelerating the solutions to problems that prevent it from scaling.

For so long there has been this belief that, if you can continue to scale and get more power out of classical computing chips by engineering them with more densely packed circuitry, making them faster, and simply building more of them there is less pressure to develop alternative architectures and hardware.

But now we're seeing that we're becoming more limited in computing power [in the near future] yet we have more and more data, and more and more computing needs, especially with AI there'll be a lot more pressure to accelerate the development of quantum hardware.

There are still unsolved problems it is still too error prone and difficult to scale but these are practical problems, not theoretical ones. I wouldn't bet against the ingenuity of humans, especially when there's economic forces pushing for a solution!

So, the voices saying a tipping point won't happen were wrong?

He says:

I think the tipping point is more about coming up with the killer algorithm that is currently intractable. One that would take, effectively, infinite computing time in a classical environment, but which quantum computing could solve in a reasonable amount of time.

So, the two questions are: one, can you actually implement these algorithms in a quantum computing environment? Can you solve that problem? And, two, does the precise solution to the optimization perform substantially better, in practical problems, than the heuristic approximation?

Because it could be that the heuristic approximation is good enough and that getting the exact mathematical solution doesn't give you meaningfully better real-world performance.

So, the value of quantum will be finding the real-world problem where the heuristic solutions we are using today are inadequate and the quantum solution is substantially better.

So, what should enterprise decision-makers do in this push me, pull you environment of conflicting messages, not to mention the tactical pressure to innovate now, rather than place longer-term bets?

He says:

There's a big education process that needs to happen, and that's going to need both a bottom-up and a top-down approach.

First, you have to get quantum researchers inside these corporate research groups, and give them the opportunity to experiment with solutions and promote those up the food chain to senior management.

And the top-down is getting management to make the hiring decisions, to bring in people who understand quantum technology well enough to do good research and prepare the company for quantum at scale.

Wise words from a brave investor who is prepared to be there in the early days of new ventures, even as the big guns of IBM, Google, Microsoft, et al, gear up in the background.


A model system of topological superconductivity mediated by skyrmionic magnons – Phys.org


by Ingrid Fadelli, Phys.org

Topological superconductors are superconducting materials with unique characteristics, including the appearance of so-called in-gap Majorana states. These bound states can serve as qubits, making topological superconductors particularly promising for the creation of quantum computing technologies.

Some physicists have recently been exploring the potential for creating quantum systems that integrate superconductors with swirling configurations of atomic magnetic dipoles (spins), known as quantum skyrmion crystals. Most of these efforts suggested sandwiching quantum skyrmion crystals between superconductors to achieve topological superconductivity.

Kristian Mæland and Asle Sudbø, two researchers at the Norwegian University of Science and Technology, have recently proposed an alternative model system of topological superconductivity, which does not contain superconducting materials. This theoretical model, introduced in Physical Review Letters, would instead use a sandwich structure of a heavy metal, a magnetic insulator, and a normal metal, where the heavy metal induces a quantum skyrmion crystal in the magnetic insulator.

"We have been interested in low-dimensional novel types of quantum spin systems for a long time and were looking into the question of how quantum spin-fluctuations in quantum skyrmion crystals could affect normal metallic states and possibly lead to superconductivity of an unusual type," Sudbø told Phys.org.

"Previous work that has in particular inspired us, and that we have been building on, is the experimental work of Heinze et al. on realizations of quantum skyrmion crystals, and two of our own papers on quantum skyrmion crystals."

In a paper published in 2011, Stefan Heinze at the University of Kiel and his colleagues at the University of Hamburg showed that skyrmion crystals could be realized in actual physical systems. Inspired by this previous work, Sudbø and Mæland made a series of predictions, which serve as the basis of their newly proposed model system of topological superconductivity.

Figure: Illustrations of the skyrmion crystal ground states in the magnetic monolayer. Arrows show the in-plane component, while color gives the out-of-plane component. Credit: Kristian Mæland and Asle Sudbø

"We ourselves have not made these systems experimentally, but we are suggesting materials that could be used to create such systems and study their properties," Sudbø said. "We specifically studied a new way of creating topological superconductivity by sandwiching a normal metal with a very specific spin system where the spins form skyrmions in a repeated pattern, a skyrmion crystal. Previous propositions for creating topological superconductivity suggested sandwiching skyrmion crystals with superconductors. Our approach obviates the need for a superconductor in the sandwich."

While they did not experimentally realize their proposed model system, Sudbø and Mæland tried to determine its properties through a series of calculations. Specifically, they calculated a property of the system's induced superconducting state, the so-called superconducting order parameter, and found that it had a non-trivial topology.

"We were able to create a model system where we can produce topological superconductivity in a heterostructure without having a superconductor a priori in the sandwich," Sudbø said. "Our system is a sandwich structure of a normal metal and a magnetic insulator, while previous proposals have involved a sandwich structure of magnetic insulators and other superconductors."

In the future, new studies could try to realize the model system proposed by these researchers in an experimental setting, further examining its properties and potential for quantum computing applications. Meanwhile, Sudbø and Mæland plan to theoretically explore other possible routes to achieving unconventional superconductivity.

"In general terms, we will pursue unconventional superconductivity and routes to topological superconductivity in heterostructures involving magnetic insulators with unusual and unconventional ground states, as well as novel types of spin-excitations out of the ground state," Sudbø said.

More information: Kristian Mæland et al, Topological Superconductivity Mediated by Skyrmionic Magnons, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.130.156002

Journal information: Physical Review Letters

2023 Science X Network

Read this article:
A model system of topological superconductivity mediated by skyrmionic magnons - Phys.org

Majority News Release | Majority News Releases | News | United … – United States Senate Committee on Appropriations

04.26.23

***WATCH: Senator Murray's remarks and questioning***

Washington, D.C.: Today, at a Senate Appropriations Commerce, Justice, Science, and Related Agencies subcommittee hearing on President Biden's FY24 budget request for the Department of Commerce, Senator Patty Murray (D-WA), Chair of the Senate Appropriations Committee, spoke about the need to build on investments that keep the U.S. economy competitive, support innovation, and strengthen trade partnerships.

"When we compete with our adversaries, it's not just a matter of who spends the most on defense," said Senator Murray. "It's a matter of who has the strongest supply chains, who supports workers and growing businesses most, who makes sure our communities have the high-speed internet they need to do just about anything in the 21st century, and who is on the cutting edge of technology: advanced manufacturing, green energy, quantum computing, and more."

In particular, Senator Murray underscored just how critical it would be that the Biden administration have the resources necessary to effectively implement the bipartisan CHIPS and Science Act and how the investments of the Inflation Reduction Act would also help build a stronger clean energy economy and boost American manufacturing.

"The Department is also using resources we provided in the CHIPS and Science Act to bring manufacturing jobs in things like quantum computing and chip manufacturing to states like mine, because we've got to make sure the jobs of the future are here in America, not shipped overseas," stated Senator Murray. "And the Inflation Reduction Act is actually ushering in a boom in green manufacturing for batteries, and electric vehicles, and more."

"We know China is making investments in many of these same industries and trying to get the upper hand. So if we want to not just compete, but to lead the world, we cannot let our investments fall behind or fall victim to partisan pot shots," Senator Murray continued.

In her questioning, Senator Murray asked Secretary of Commerce Gina Raimondo how the National Oceanic and Atmospheric Administration (NOAA) plans to continue aiding salmon recovery in Washington state and protecting important ecosystems in the Pacific Northwest from invasive species like European green crabs. Senator Murray and Congressman Derek Kilmer recently joined tribal leaders, environmental advocates, and experts to trap European green crabs and discuss the threat they pose to Washington state's native species.

###

Continue reading here:
Majority News Release | Majority News Releases | News | United ... - United States Senate Committee on Appropriations

Top 10 New and Trending Technologies in 2023 You Must Know … – Analytics Insight

This article lists the top 10 new and trending technologies in 2023 you must know about.

Technology is getting better and better, and it has only made our lives easier than ever. With every passing day, technology evolves, and at a fast pace. It has made it extremely convenient for organizations to reduce costs, enhance customer experiences, and increase profits. The onset of the pandemic has also boosted businesses' digital transformation journeys in the best possible manner. On the same lines, being well aware of the tech trends for the coming future has its own benefits. Here, we bring you a list of the top 10 new and trending technologies in 2023 you must know about.

It has been quite some time since artificial intelligence became popular. As a matter of fact, AI is constantly evolving, and newer applications for this emerging technology continue to spring up. Today, countless applications rely on AI: image and speech recognition, navigation programs, and voice assistants like Siri and Alexa, to name a few. No wonder the coming years will see a lot more dependence on AI.

These days, millions of data points collect user information from various sources such as social media, websites, emails, and web searches. This shows how exponentially the volume of collected data is increasing, and why old, traditional computing methods fall short. Well, edge computing to the rescue.

Quantum computing, a field focused on developing computer technology based on the principles of quantum theory, has become extensively recognised and popular in no time. Quantum computing has the ability to perform calculations based on the probability of an object's state before measurement, instead of just 0s and 1s. This field is highly likely to gain even more recognition in 2023.
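A minimal way to see the "probabilities instead of just 0s and 1s" idea is to track a qubit's two complex amplitudes classically. This plain-Python sketch (illustrative only; function names are ours, not from any quantum library) applies a Hadamard gate to the |0⟩ state and reads off the measurement probabilities:

```python
import math

# A qubit is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1.
state = [1 + 0j, 0 + 0j]  # start in the definite state |0>

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: the chance of measuring 0 or 1 is the squared magnitude of each amplitude."""
    return [abs(amp) ** 2 for amp in state]

state = hadamard(state)
print(probabilities(state))  # ~[0.5, 0.5]: equal odds of 0 or 1 until measured
```

Until the qubit is measured, both outcomes coexist with these probabilities, which is the superposition property the paragraph above alludes to.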

Gone are the days when the world was invariably happy with 3G and 4G. Today is the era of 5G. As it has the potential to change the way we see the online world, there's no looking back. 5G aims to revolutionise our virtual interactions by integrating AR and VR technology and enabling better cloud-based gaming experiences.

IoT has emerged as one of the most promising technologies of the decade. Many devices or "things" today are Wi-Fi-enabled, which means they can be connected to the internet. Devices within the network can communicate with each other, collect data, and transfer it across the network without human intervention.

The cryptocurrency market has given importance to blockchain technology like never before. This technology offers security that can be useful in several other ways as well. Because blockchain is consensus-driven, no single person or organisation can take control of the data, which is why it makes it to the list of top tech trends.

Virtual reality and augmented reality have been popular technologies for almost a decade now. Virtual reality immerses the user in a new environment; augmented reality, on the other hand, enhances the user's existing environment. These technologies have profound importance in gaming, social media filters, simulation software, and more, and hence continue to be a top trend.

RPA uses multiple software tools and applications to automate business processes such as data collection and analysis, customer service, and other repetitive tasks previously managed through manual processes. Just like AI and machine learning, RPA is a rapidly advancing technology, thereby making it to the list of top tech trends for 2023.

Full stack development is one of the newest industry trends within the technological space and is all set to gain momentum in 2023 and beyond. It continues to rise as IoT becomes a more mainstream technology with diverse applications. Full stack development covers the front-end and back-end development of applications and websites.

Though cybersecurity is not a new tech trend, it is definitely worth watching out for in the next year. The reason is quite simple: threats and hacking attempts are growing in number and intensity, which calls for improving security protocols and strengthening systems against malicious attacks.

More:
Top 10 New and Trending Technologies in 2023 You Must Know ... - Analytics Insight