Category Archives: Quantum Computing

Are these the best quantum computing stocks to watch? – IG UK

One of the highest-profile players in the quantum computing space is Alphabet's Google. Google announced in 2019 that it had attained quantum supremacy. In other words, its quantum processor, Sycamore, had successfully performed its first function beyond the capabilities of classical computers.

However, this was soon questioned by IBM, which claimed that the same problem could be solved by a classical computer in a matter of days, compared to Sycamore's mere minutes.

Then, in the second half of 2020, a smaller version of Sycamore reached another milestone: simulating its first quantum chemistry reaction.

In May 2021, Google opened its new Quantum AI Campus in Santa Barbara, California, along with a new goal: to build the world's first useful, error-corrected quantum computer by 2029.

Google has even, to a certain extent, opened this effort up to outside collaboration: in December 2021 it announced its Quantum Computing Service, which gives approved customers the opportunity to send their own computing programs to Google to be run on its quantum computing hardware at the lab in Santa Barbara.
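Programs for Google's quantum hardware are typically written with Cirq, the company's open-source quantum framework. The sketch below is illustrative only: it runs a small circuit on Cirq's local simulator, since access to the real hardware backends requires an approved account.

```python
# Minimal sketch: build and run a two-qubit circuit with Cirq, Google's
# open-source quantum framework. Approved customers target Google's
# hardware backends; everyone else can use the local simulator below.
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                         # put q0 into superposition
    cirq.CNOT(q0, q1),                  # entangle q0 and q1 (Bell state)
    cirq.measure(q0, q1, key="result"),
)

result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key="result"))   # expect roughly 50/50 counts of 0 and 3
```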

With this kind of computing power, Google is hoping to solve problems humanity hasn't been able to for centuries, such as developing better medicines and addressing world hunger and the climate crisis. However, this is a long way off for now.

The only recent news regarding Google and quantum computing has been speculative. For example, there have been rumours that Google may take Sandbox, its secretive quantum department unrelated to the Quantum AI campus, public. However, nothing concrete has been confirmed, and it could be years before any further tangible quantum milestones are reached.

See the original post:
Are these the best quantum computing stocks to watch? - IG UK

Accenture Invests in Good Chemistry Company to Drive Quantum Computing in Materials and Life Sciences – HPCwire

NEW YORK, April 6, 2022 - Accenture has made a strategic investment, through Accenture Ventures, in Good Chemistry Company, a company that uses quantum chemistry, machine learning and quantum computing to accelerate new materials design.

The Good Chemistry Company platform, QEMIST Cloud, combines cloud, AI, and quantum computing in an integrated platform designed for developers. The platform's engine enables faster, more accurate, and scalable ways to perform computational chemistry simulations.
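The announcement does not expose QEMIST Cloud's own APIs, but the kind of computational chemistry workload being described can be sketched with the open-source PySCF package. The example below is purely illustrative and unrelated to Good Chemistry's actual platform: it computes a ground-state energy for molecular hydrogen.

```python
# Illustrative only: a tiny Hartree-Fock ground-state calculation with the
# open-source PySCF package, standing in for the kind of computational
# chemistry simulation discussed here (not Good Chemistry's actual API).
from pyscf import gto, scf

# Hydrogen molecule near its equilibrium bond length (coordinates in angstroms)
mol = gto.M(atom="H 0 0 0; H 0 0 0.74", basis="sto-3g")

mf = scf.RHF(mol)      # restricted Hartree-Fock mean-field solver
energy = mf.kernel()   # iterate to self-consistency; returns energy in Hartree
print(f"RHF ground-state energy: {energy:.6f} Ha")
```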

"We're doubling down on the growth potential of quantum computing and uncovering new ways to navigate its potential while empowering our clients to confidently absorb and access this breakthrough technology," said Tom Lounibos, managing director, Accenture Ventures. "Simulating chemistry in this new way leverages easily and readily accessible computers on the cloud to perform simulations that were previously intractable even on expensive, high-performance computing environments. This brings a competitive advantage to clients and can change pharmaceutical drug discovery and more."

According to the Accenture Technology Vision 2022, 69% of global executives say quantum computing will have a breakthrough or transformational positive impact on their organizations in the future. Quantum is the pinnacle of next-generation problem solving, and Accenture and Biogen have already collaborated with 1QBit to accelerate drug discovery, developing a proof of concept that validated a quantum-computing molecule comparison approach and building an enterprise-ready, quantum-enabled application with transparent processes that generates molecular comparison results with deeper insights about shared traits.

Arman Zaribafiyan, CEO of Good Chemistry Company, said: "With our platform, we are re-imagining the way computational chemistry simulations are done. Simulating chemistry on computers will help drive faster, more accurate and more accessible materials innovation in the decades to come. With Accenture's support and collaboration, we will be able to explore the vastness of chemical space and enable rational materials design at scale."

Carl Dukatz, Accenture's global quantum computing lead, said: "By building on and extending our relationship with 1QBit to the newly formed Good Chemistry Company, we are demonstrating our ongoing commitment to accelerating quantum computing innovation. We are witnessing the emergence of a new class of scalable cloud-based technology that is stretching the boundaries of what computers can solve. We recognize the potential of arming our clients with the next generation of chemistry, material science, and structural design."

Good Chemistry Company is the latest organization to join Accenture Ventures' Project Spotlight, an engagement and investment program that connects emerging technology software startups with the Global 2000 to fill strategic innovation gaps. Project Spotlight offers extensive access to Accenture's domain expertise and its enterprise clients, helping startups harness human creativity and deliver on the promise of their technology.

Terms of the investment were not disclosed.

About Accenture

Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at accenture.com.

About Good Chemistry Company

Good Chemistry Company's mission is to enable high-throughput, high-accuracy computational chemistry simulations to accelerate new material designs. Its proprietary QEMIST Cloud, a cloud-based computational chemistry platform, provides the building blocks for computational chemistry developers to build chemical simulation applications and workflows, using emerging algorithms in quantum chemistry, machine learning, and quantum computing. Through simple, easy-to-use APIs, QEMIST Cloud provides access to computational chemistry tools at unprecedented scale, enabled by the power of the cloud. Headquartered in Vancouver, Canada, Good Chemistry Company's interdisciplinary team comprises computational and quantum chemists, software developers, ML engineers and quantum computing scientists. For more information about Good Chemistry Company, visit goodchemistry.com.

Source: Accenture

See original here:
Accenture Invests in Good Chemistry Company to Drive Quantum Computing in Materials and Life Sciences - HPCwire

Quantum Week at Yale geared toward novices and experts alike – Yale News

Yale's hub for quantum research will soon entangle the campus, in the best possible sense, in a full week of mind-bending science, artistry, and discussion devoted to the wonders of quantum research.

Quantum Week at Yale, organized by the Yale Quantum Institute (YQI), will feature a hackathon, a lab tour, a movie screening, a record launch party, hands-on computer programming, a superconductive jewelry display, and an assortment of quantum-related library and museum exhibits.

The activities begin April 8 and run through April 14. A full list of events is available here.

"Yale's quantum scientists are at the very top of this field," said Florian Carle, YQI manager and coordinator for the event. "We want to take some of the excitement we see in the labs and at YQI and share it with the rest of the campus."

Quantum science delves into the physical properties that explain the behavior of subatomic particles, atoms, and molecules. Over the past century, quantum research has transformed disciplines as diverse as physics, engineering, mathematics, chemistry, computer science, and materials science.

Over the past 20 years, Yale researchers have propelled quantum research, particularly in quantum information science and quantum computing, with a series of groundbreaking discoveries including the first demonstration of two-qubit algorithms with a superconducting quantum processor.

Yales research has led to unprecedented control over individual quantum objects, whether those objects are naturally occurring microscopic systems such as atoms, or macroscopic, human-made systems with engineered properties. Researchers say these advances may soon enable them to perform otherwise intractable computations, ensure privacy in communications, better understand and design novel states of matter, and develop new types of sensors and measurement devices.

"This is the time when computer scientists, mathematicians, physicists, and engineers are all coming together," said Yongshan Ding, assistant professor of computer science, who will lead a programming workshop on April 14 that shows visitors, including those without any experience of quantum computing, how to play with quantum interference patterns.

"People can just code away," Ding said. "My vision is that by exposing people to these activities, we can build a quantum-native programming language. This is a new paradigm of computation, so we're going to need new ways to program for it."
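The workshop's materials are not public, but an interference experiment of the sort a newcomer might start with takes only a few lines. The sketch below, written with Google's open-source Cirq framework purely for illustration, shows how two paths through a superposition interfere to give a deterministic outcome.

```python
# Illustrative sketch of quantum interference (not the workshop's material):
# H-Z-H on a single qubit. The Z gate flips the phase of the |1> branch,
# so when the second H recombines the branches they interfere to give |1>
# every time; remove the Z and the output is |0> every time.
import cirq

q = cirq.LineQubit(0)
circuit = cirq.Circuit(
    cirq.H(q),                 # split |0> into an equal superposition
    cirq.Z(q),                 # phase-flip the |1> branch
    cirq.H(q),                 # recombine the branches
    cirq.measure(q, key="m"),
)

print(cirq.Simulator().run(circuit, repetitions=50).histogram(key="m"))
# Expected output: Counter({1: 50}) -- a deterministic interference pattern
```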

YQI has partnered with 18 Yale departments and centers to create 23 events for Quantum Week at Yale. One of the challenges in organizing the week, Carle explained, was developing an engaging mix of activities suited for both experienced researchers and quantum science novices.

To that end, the week is organized around four components: Understanding Quantum, Art & Quantum, Career and Entrepreneurship, and For Researchers.

The hands-on programming event, for example, comes under the Understanding Quantum banner. Others include an April 9-10 Quantum Coalition Hack, hosted by the Yale Undergraduate Quantum Computer Club; an April 11 tour of superconducting qubit laboratories; and a quantum-related exhibit of rare books at the Beinecke Rare Book and Manuscript Library on April 11.

"We're always looking for ways that our libraries can engage with the academic work going on at Yale," said Andrew Shimp, who consulted on Quantum Week events at Yale libraries. Shimp is Yale's librarian for engineering, applied science, chemistry, and mathematics. "One of the unique things a Yale library can offer is the chance to view rare collections that aren't necessarily digitized yet."

The quantum exhibit at the Beinecke Library, for example, includes materials from quantum science pioneers such as Albert Einstein, Werner Heisenberg, and Max Planck. There is also an astronomy textbook, published in 1511, that includes the word quantum in its title. The title is Textus de Sphera Johannis de Sacrobosco: cum additione (quantum necessarium est) adiecta / Nouo commentario nuper edito ad vtilitate[m] studentiu[m] philosophice Parisien[em]. A brief English translation would be Sphere of Sacrobosco.

Under the Art & Quantum heading, there will be an April 8 screening of the 2013 indie thriller Coherence; a visual arts competition called Visualize Science hosted by Wright Lab on April 13; a launch party for Quantum Sound (a record project begun at YQI in 2018) on April 13; a display of Superconductive Jewelry throughout the week at YQI; a Quantum and the Arts exhibit all week at the Arts Library; an April 13 event hosted by the Yale Schwarzman Center devoted to historical preservation of technology ephemera, called Dumpster Diving: Historical Memory and Quantum Physics at Yale; and a new exhibit at the New Haven Museum, The Quantum Revolution, that opens April 13 and features drawings by former YQI artist in residence Martha Willette Lewis.

Carle is curator for the New Haven Museum exhibit. "We wanted to show the evolution of quantum science at Yale," he said. "It will take people from some of the first qubits in 1998 to Badger, the dilution refrigerator that ran the first two-qubit algorithms with a superconducting quantum processor in 2009."

Quantum computers require extremely cold temperatures near absolute zero in order to reduce operational errors.

The week's Career and Entrepreneurship component will include a discussion of quantum startups hosted by the Tsai Center for Innovative Thinking at Yale (Tsai CITY) on April 12; a conversation with IBM's Mark Ritter on the global implications of quantum research, hosted by the Jackson Institute for Global Affairs on April 12; a session on how to access market research from major industry analysts, hosted by the Yale University Library, on April 12; and a series of panel discussions on how to join the quantum workforce.

Finally, the For Researchers component of Quantum Week at Yale will feature a quantum sensing workshop at Wright Lab on April 8; and an April 14 lecture by quantum researcher Nathan Wiebe of the University of Washington.

"The final day for Quantum Week at Yale, April 14, also happens to be World Quantum Day," Carle said. "Our hope is that by then, students all over campus will be aware of quantum work being done here and want to explore it themselves in some way."

Visit link:
Quantum Week at Yale geared toward novices and experts alike - Yale News

Quantum computing and the bigger picture – ComputerWeekly.com

Every few years IBM brings out a new addition to its Z series mainframe family. From the information accompanying the release of the new enterprise system, IBM appears to be touting the new z16 machine's ability to handle real-time fraud detection for instant payments across the financial sector. It also offers an AI (artificial intelligence) accelerator, using IBM's Telum chip. This will certainly be good news for many financial institutions. For instance, speaking at a recent IBM-hosted roundtable, Steve Suarez, global head of innovation, finance & risk at HSBC, described how the bank was "drowning in data". Suarez sees a need for technology that can help the bank provide insights that actually benefit people.

What is interesting from the virtual z16 briefing Computer Weekly attended is IBM's focus on the new machine's ability to protect against hackers using quantum computing to break the strong encryption that underpins financial transactions.

IBM distinguished engineer Anne Dames said: "Good technology can be used to do bad things." In other words, a quantum computer could be used to break the cryptographic keys that are used to encrypt data.

"We are entering a new cryptographic era," she warns, adding that the IT industry needs to act now, before there is an effective quantum computing-based attack.

The worst-case scenario IBM paints is one where a successful hacking attack gains access to a large quantity of encrypted data. Since this data is encrypted, it is near impossible to decipher in a realistic timescale today. The US National Institute of Standards and Technology (NIST) warns that if large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of digital communications on the internet and elsewhere. NIST is encouraging the IT sector to develop post-quantum cryptography, and IBM's z16 is one of the first systems to claim to be quantum-safe.
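The quantum attack NIST has in mind is Shor's algorithm, which breaks RSA by finding the multiplicative order of a number modulo the public key; everything after that step is easy classical arithmetic. The toy sketch below, with a brute-force loop standing in for the quantum order-finding step, shows the classical skeleton on a number small enough to handle.

```python
# Toy sketch of the classical skeleton of Shor's algorithm. A quantum
# computer replaces the brute-force order-finding loop below with a step
# that is exponentially faster, which is what puts RSA-sized moduli at risk.
from math import gcd

def factor_via_order(N: int, a: int):
    """Recover nontrivial factors of N from the order of a modulo N."""
    r, x = 1, a % N
    while x != 1:            # brute-force order finding: the quantum step
        x = (x * a) % N
        r += 1
    if r % 2:
        return None          # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # unlucky case: retry with a different a
    return gcd(y - 1, N), gcd(y + 1, N)

# 15 = 3 * 5 stands in for a 2048-bit RSA modulus
print(factor_via_order(15, 7))   # -> (3, 5)
```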

While this is clearly an important development and IBM's efforts should be applauded, one can't help worrying that IBM, NIST and the IT sector at large are somehow missing the bigger picture. Breaking cryptography is one thing, but quantum computers have the potential to revolutionise drug development and the creation of new chemical processes, such as ones that reduce carbon emissions. The flip side is that these techniques may also be used to develop devastatingly effective, targeted chemical and biological weapons. As such, policy makers need to wake up to the risk and track quantum computing in the same way that atomic, biological and chemical weapon materials are monitored.

More here:
Quantum computing and the bigger picture - ComputerWeekly.com

IonQ Announces Fourth Quarter and Full Year 2021 Results – Quantum Computing Report

IonQ announced its fourth quarter and full year 2021 financial results, with significant increases both from earlier periods and from its own earlier forecasts. Fourth quarter 2021 revenue was $1.6 million, compared to third quarter revenue of $234 thousand, with full year revenue at $2.1 million. IonQ also reported contract bookings in 2021 of $16.7 million, compared with the previous 2021 estimate of $15 million made in November. IonQ's EBITDA (Earnings Before Interest, Taxes, Depreciation, and Amortization) loss was $28.3 million, with a total net loss of $106.2 million. The net loss was dominated by a Change in Fair Value of Warrant Liabilities of $63.3 million and Offering Costs Associated with Warrants of $4.3 million. These are non-cash expenses and are expected to be non-recurring. R&D expenditures roughly doubled in 2021 to $20.2 million from $10.2 million in 2020.

The company forecasts significant growth in 2022, with revenues between $10.2 million and $10.7 million, an EBITDA loss of around $55 million, and contract bookings between $20 and $24 million. Contract bookings can extend over several years, which explains why they would be higher than revenues, particularly in a growing business. The company also warned that revenue and earnings could be lumpy due to large individual bookings or shipments. This is not uncommon, and we have seen it before in results from classical computing companies that ship expensive supercomputers.

The company highlighted several of their commercial and technical achievements in 2021, most of which we have previously reported on in the Quantum Computing Report. However, there were a few new items that we thought were noteworthy. First, they indicated they are discussing with potential customers the sale of full systems for on-premise installations. This would provide them with additional revenue beyond selling time over the cloud on one of their machines. Although the closing of any of these deals is not assured, if it did happen it would augment revenue starting in 2023. IonQ also indicated they are setting up a manufacturing group to allow them to build machines in higher volume. And finally, they announced they are setting up a team in Seattle, Washington to increase their technical capabilities and take advantage of talent available in the Pacific Northwest. Among other activities, this group will be researching the use of photonics to connect multiple ion trap processors together.

You can view IonQ's press release announcing their financial results and business update here. And you can find the 10-K Report they have filed with the U.S. Securities and Exchange Commission here.

March 28, 2022

See the article here:
IonQ Announces Fourth Quarter and Full Year 2021 Results - Quantum Computing Report

Think big, start small, scale fast to enter the metaverse – Mint

Three years ago, when you spoke about the world being on the cusp of a major business transformation with AI, you pointed out that many companies were experiencing AI stagnation due to the lack of the right data, the lack of the right talent, and the wrong mindset, among other things. To what extent has the scenario improved now?

We saw Covid being an accelerator of tech adoption. According to our research, before Covid, digital leaders who were adopting new technologies like cloud and AI faster outpaced others by a factor of 2x. When we did that same research after Covid, we found that these digital leaders, who comprise the top 10%, had widened the gap to 5x. We saw another category of leapfroggers: another 15% of companies that were rapidly innovating to adopt new technologies like AI. Roughly 63% of companies started adopting AI for the first time during Covid, which drove a lot of innovation. That said, there's another round of research that I'm going to publish shortly, specifically on the adoption rate of AI. What that will show is that only about 10% of organizations are really adopting AI at scale. It's a relatively small number and the rest are doing pilot deployments. So, there's still a lot of room to cover.

Is the trend in India any different from other parts of the world?

The work we're doing in India supports what I just said, which means there is a lot of acceleration and a lot more of AI happening now.

What's Accenture's vision of the metaverse, which you have christened the Metaverse Continuum in your Tech Vision 2022 report?

There are all sorts of definitions of the metaverse, and some of them are crazy--about alternative worlds and realities and things. A lot of these definitions are very consumer focused. What we're doing with the metaverse continuum is talking about a definition that's very grounded in business. We do believe the metaverse will impact every part of every business, and companies need to start acting now. We should talk about what's happening with the future of the internet, which you referred to as Web3, which is enabling this new capability and looking to create worlds that bridge the virtual and real. That's where NFTs (non-fungible tokens) come into place. That's where VR/AR (virtual reality and augmented reality) come in and enhance experiences. And it's here that the continuum becomes important. We say Metaverse Continuum because we believe it's not just about the consumer--it's about the worker, and it's about the extensive continuum of the roles we play as people across the enterprise. It's a continuum spanning not just the virtual, but also the real.

As an example, we will work with a client or a worker who uses a digital twin (a replica of a physical asset) to understand the workings of a manufacturing plant. They can put on their Augmented Reality (AR) headsets and play around with the equipment to change it the way they need. Then they can go back to the real world. The bridging of the real and virtual is a key part of the definition, since most people think only of the virtual when they define the metaverse.

The technology will get better. But not all eight billion people in the world can afford this. We need it to be inclusive. So, our ability to create 2D experiences that allow access to some of these 3D worlds is critical. Hence, the continuum from 2D to 3D is important. The good news is that the technology is moving in that direction.

We've been talking about the way you use artificial intelligence to create the experiences in the metaverse, to do digital twins, to create digital humans like we're doing for museums when we create intelligent guides. A lot of people think the metaverse is just about gaming and headsets, and I think that's going to mislead people into missing the opportunity. I caution companies: you might have missed Web1, when Amazon disrupted e-commerce, or Web2, when Airbnb disrupted lodging, Uber disrupted transportation, or Facebook, social. But now's your opportunity to define the rules. If you don't, you're going to wake up in worlds (the metaverse and Web3) that are defined by someone else, and for someone else. That's not going to be a good place to be.

But people and businesses talk about the metaverse and Web3 in the same breath, giving the impression that these are interchangeable terms. What does Accenture think?

Sometimes Web3 is used in almost a political context by some in the crypto community. I'm not using Web3 in that sense. Web3 is the new set of capabilities that are enabling, in one sense, the internet of place--shared virtual spaces to collaborate--and the internet of ownership--the ability to use blockchain and other technologies to invert the way that ownership works and create unique, differentiated, protected digital identities for objects and products that people use.

In your Tech Vision 2022 report, you speak about the four building blocks of the metaverse continuum--WebMe, Programmable World, The Unreal, and Computing the Impossible. Can you please explain these in brief?

The first one, WebMe, is where we explore how the internet is being reimagined. It's really about the metaverse and the future of the internet--Web3. The second trend, called Programmable World, talks about our world being personalized. This one does fold into our metaverse continuum definition. It projects how the convergence of the internet of things (IoT), sensors, digital twins, 5G, ambient computing, augmented reality, smart materials, and more are paving the way for businesses to reshape how they interact with the physical world. You can program smart materials with haptic sensing, for instance, so that a worker can sense if s/he gets close to something hot and know not to touch it. This enhances the experience. During Covid, they programmed a messenger RNA to create a vaccine. And it was done by a drug discovery studio that was powered by convolutional neural networks, artificial intelligence, and the cloud. So, it's this fusion of science and the world that allows us to program the world--in this case, a vaccine--in a very different way.

The third trend--The Unreal--is one of my favorite trends because it's talking about making synthetic objects, and the new issues we have to deal with. Here's just one small example of the tremendous innovations coming from the biotech world of synthetic biology, which is allowing us to grow the equivalent of plastics right now. But there's also a dark side to this trend, which is about deepfakes. The other issue associated with the Unreal trend is synthetic data. We're talking about this tremendous amount of unreal data you're managing--data about the digital twin of your manufacturing plant, your aircraft, or whatever. Most of the data that you use to train AI as you go forward is going to be synthetic data. One of the analyst firms just did a report saying that by 2030, 80% of the data managed by companies will be synthetic data, not real data, because of these trends. The ability to manage the unreal in a responsible manner becomes very important because the data itself is generated by AI and is not real data.

The fourth trend is about what happens in the post-Silicon era. We're talking about quantum computing and bio computing--new forms of computing. Quantum computing is getting interesting, and you need to pay attention to it. The US government just passed an executive order mandating every federal agency to be quantum ready with quantum-safe encryption. Quantum cryptography is something that companies need to start paying attention to because it's just a matter of time before quantum computers can break RSA-based encryption. Once that happens, it creates a Y2K-like moment for everybody to upgrade the cryptography and the security systems they have. This is not the buzzword right now, but it may become so in two to five years. That's why we call this trend Computing the Impossible.

Coming back to the metaverse, companies will also require an AI-type of architecture. What will the elements of this architecture comprise?

We have a reference architecture for the metaverse. There are a couple of principles underlying it. One is we believe in an open metaverse, and interoperability is a key tenet of it. We are also putting a lot of effort and time into what we call the 'Responsible Metaverse', which has two big focus areas--trust and sustainability. On the trust side, we've done a lot of work. The risks, and the need to focus on responsibility, are higher in the metaverse than in AI because of the high level of concern. There are concerns around deepfakes, inclusion, and equal opportunity in the metaverse. We believe in pursuing a multi-stakeholder approach (like we did with AI) to work with other companies and organizations. It's a trusted metaverse that gets to handle data and privacy in the right way. Sustainability, too, is a big issue with the metaverse because 3D experiences and the multiple devices consume a lot of energy and resources. If you look at NFTs and cryptocurrencies, they use proof-of-work mining systems that consume too much energy.

Given all these complexities, what should be the approach of enterprises that are seeking to adopt Web3 and having a presence in the metaverse?

I think the challenge for every company is that every new technology adds on but doesn't eliminate something from the past. With the metaverse, you need to think big because you need to envision the possibilities. But then, start small in a focused way to understand how to apply it. Companies must build an architecture that can scale fast because you will need to move your business fast. So, think big, start small, scale fast is the approach I would recommend from a company perspective.

We made a big entry into our own enterprise metaverse called the Nth Floor. It refers to the virtual environments we have created to bring Accenture people together to meet, collaborate and learn. So, 150,000 of our employees will be onboarded with a virtual reality headset over the next year. (Accenture has created digital twins of many of its physical offices--from Bangalore in India to Madrid in Spain and San Francisco in the US to provide familiar environments for its people to meet, collaborate and network.)

The results we're getting are tremendous. We did this not just to put a toy in people's hands, but we studied the neuroscience around learning and engagement which shows how there's 20-30% improved retention of concepts in an immersive environment. We're seeing that in practice as we apply it and measure it ourselves. People are engaging in new ways. They're sharing experiences that they wouldn't have before.


See the original post here:
Think big, start small, scale fast to enter the metaverse - Mint

How Kronos Could Help the US Win the Fusion and Quantum Computing Race With China – Newswire

Press Release - Mar 28, 2022 07:00 CST

WASHINGTON, March 28, 2022 (Newswire.com) - Major world governments are increasingly focusing on fusion energy research as a potential foundation for gaining the economic and military advantage in the twenty-first century, and perhaps beyond. In this emerging arena of supercharged competition, the quantum computing systems, algorithms, and tokamak design plans developed by Kronos Fusion Energy Defense Systems could be a key factor in winning a significant edge for the USA over its economic and political rival, China.

Fusion energy, known theoretically since 1920, promises potentially near-limitless energy generation, free from polluting or radioactive byproducts. With rising petroleum costs and the looming specter of global warming, developing workable fusion technology is more urgent than ever. The first country to make breakthroughs to practical fusion will become the world's energy leader, giving it a decisive advantage in commerce, defense, and space exploration that could last for generations.

With immense government backing and funding, most recently reinforced in China's 14th Five-Year Plan, Chinese scientists seemingly lead the world with the $900 million Experimental Advanced Superconducting Tokamak (EAST). The EAST recently set records by maintaining stable plasma at 120 million degrees for more than 1.5 minutes. China budgeted hundreds of millions more to operate and upgrade the EAST reactor, while funding the training of over 1,000 new fusion physicists.

China's vigorous fusion program is committed to developing its quantum computing resources. Centered on the recently founded Chinese National Laboratory for Quantum Information Sciences, the program has received billions of dollars in funding. China currently holds 2.5 times more patents than America in deep learning, a cornerstone of advanced quantum computing, while aggressively pursuing further developments. Chinese President Xi Jinping even describes these technological sectors as the "main battleground" between the USA and China.

Currently, the edge in these economically and strategically vital technologies arguably belongs to the PRC. However, Kronos offers the potential to redress this balance by bringing together quantum computing and fusion energy into a single powerful project. By harnessing the ability of quantum devices, neural networks, and machine learning to crunch immense quantities of data, while testing a multidimensional array of thousands of problems and learning and adapting in real time, the potent simulations Kronos has developed should enable building fusion tokamaks 4,000% more effective than current reactors.

Kronos believes the lightning-fast development and analysis cycle provided by its algorithms will empower the U.S. to leapfrog twenty years ahead of China in fusion energy generation. Its quantum computing systems will not only enable developing precise, efficient fusion reactor designs, compact fusion engines for spacecraft, and other fusion technology, but demonstrate the viability of quantum learning as a breakthrough tool of economic and scientific success. Kronos' cutting-edge "proof-of-concept" will potentially attract robust public and private investment to the wider quantum research sector, putting the USA on course to achieve superiority not only in tokamak design but also in quantum computing research.

PR Contact: Erin Pendleton - pr@kronosfusionenergy.com

Source: Kronos Fusion Energy

Original post:
How Kronos Could Help the US Win the Fusion and Quantum Computing Race With China - Newswire

Friend and foe: The little-known pact at the heart of cybersecurity – TechRadar

The cybersecurity industry is founded upon two types of competition: that between security vendors and cybercriminal adversaries, and that between the vendors themselves.

What's unusual about the situation is the way in which these two battlegrounds are connected; to prevent threat actors from infecting devices with malware and infiltrating business networks, cybersecurity vendors often have to establish a temporary truce.

This balance between competition and collaboration is characterized by Jaya Baloo, CISO at antivirus company Avast, as a "friendly rivalry" that allows all the largest market players to work hand-in-hand when it is important to do so.

In conversation with TechRadar Pro at MWC 2022, Baloo spoke to the unconventional relationship between vendors in the sector. She insists the cybersecurity community is focused first and foremost on shielding people against attack, and that turning a profit is a secondary consideration.

"I don't really care which antivirus you're using, so long as you're using one," she told us. "We're still seeing so many people attacked on so many different devices, so our biggest concern is the people who are completely unprotected."

In the coming years, there is expected to be a blending together of various emerging technologies, which will create the foundation for new digital experiences for consumers and businesses.

At MWC 2022, for example, there was plenty of talk about the interplay between 5G, AI, IoT and edge computing, a heady mixture that will enable use cases ranging from driverless cars to autonomous factories and more.

However, this level of interaction between technologies is bound to create headaches for security professionals, noted Baloo, especially if new products and services are not developed with security front-of-mind.

"There is an organic and orgasmic coming together of technologies right now," she said. "But this will involve an increase in complexity, and complexity is the enemy of security."

In a scenario such as this, cybersecurity companies stand the best chance of shielding customers from attack if they share intelligence on new vectors, vulnerabilities and cybercriminal groups.

Baloo highlighted the work of the Avast threat intelligence team, which publishes regular reports unpacking its discoveries. One recent report analyzed an increase in phishing attacks on Ukrainian companies in the leadup to the Russian invasion, for example, and the previous instalment covered the spike in DDoS hacktivism.

When the threat intelligence team discovers a new malware strain or route of attack, not only does Avast build protections into its own services where possible, but also offers assistance to the victims and alerts the wider community to its findings, Baloo explained.

"We work with all the people you'd think we'd be competing against. There's a very healthy level of dialogue across the ecosystem," she told us.

"That's why it's so much fun; we're collaborating with like-minded people to take down the bad guys. I love our threat intelligence work."

Asked whether there are any instances in which Avast would not share intelligence, say, if withholding information had the potential to confer a competitive advantage, Baloo gave us a disapproving shake of the head. "When it's information about the bad guys, we share. It's as simple as that."

Last year, the cybersecurity news cycle was dominated by the SolarWinds attack and the Log4j vulnerability, both of which highlighted the dangers posed by the software supply chain, a source of risk often overlooked by businesses.

Despite the commotion that surrounded both incidents, Baloo told us she expects to see more of the same in 2022, because the necessary lessons have still not been learned.

"Supply chain attacks are not going anywhere," she said. "The biggest problem is that we don't fully understand our potential points of weakness."

"We've reached a certain level of maturity in terms of the technologies we use, but don't understand how they interlink to create areas of weakness."

This is an issue that affects open source software to the same extent as proprietary services, notes Baloo. The fact that code is available for anyone to pore over does not necessarily mean someone has done so with the requisite level of scrutiny, as Log4j demonstrated.

However, Baloo is optimistic that regulation requiring companies to maintain greater oversight over their software bill of materials (SBOM) could play a role in minimizing risk for their customers.

In the aftermath of the SolarWinds attack, for example, US President Biden put in place an executive order that led to new guidance that requires software vendors to provide a comprehensive SBOM as part of the government procurement process.

The US stopped short of requiring vendors to provide SBOMs to all customers, but the hope is that the practice will become more mainstream and, at the very least, that new regulation will raise the profile of supply chain-related risk.

Not only are cybersecurity companies tasked with anticipating the kinds of attacks that may threaten customers in the short-term, but they must also look further ahead and further afield.

Another developing field of technology expected to have a significant impact on the cybersecurity landscape is quantum computing, which happens to be an additional area of expertise for Baloo, who advises the World Economic Forum on the issue.

Quantum computers solve problems in an entirely different way to classical machines, exploiting a phenomenon known as superposition (whereby subatomic particles exist in multiple states at once) to perform certain calculations many times faster than is currently possible.

Although the world's most powerful quantum processors currently offer too few quantum bits (qubits) to establish a meaningful advantage over traditional supercomputers, the maturation of quantum computing will create various problems from a security perspective.

Most significantly, large-scale quantum computers will have enough horsepower to break modern cryptography. It is a mistake, therefore, to assume that information protected by encryption today will remain secure for years to come. State-sponsored threat actors may already be collecting large quantities of encrypted data in the hope of one day being able to access it.

"Quantum computing will answer fundamental needle-in-the-haystack scientific questions," noted Baloo. "But we're screwed as soon as we have a quantum computer capable of breaking current encryption."

"To enjoy the benefits of quantum computing, we need a new set of cryptographic algorithms that will be unbreakable even with a quantum computer. As a cybersecurity community, we need to have a forward-looking defence, so we're ready for these kinds of challenges."

Again, this is a problem on which security companies will have to collaborate closely in the coming years, both to develop new quantum-safe algorithms and push for regulation that ensures the most vulnerable portions of the economy are quantum ready.

In a scenario in which quantum-secure technologies do not develop apace with quantum computers, the foundations of modern cybersecurity will be compromised.

"And the clock is ticking," warned Baloo.

Excerpt from:
Friend and foe: The little-known pact at the heart of cybersecurity - TechRadar

March: Robust-and-Reliable-Quantum-Computing | News and features – University of Bristol

A new programme called RoaRQ, funded by a £3m grant from the Engineering and Physical Sciences Research Council (EPSRC), will establish a vibrant and cross-disciplinary community of researchers in universities - including the University of Bristol - in quantum computing and computer science.

The team will collaborate to address the global challenge of delivering quantum computing that is robust, reliable, and trustworthy. With substantial recent progress internationally in building ever larger quantum computers, verifying that they do indeed perform the tasks they were designed for has become a central unsolved problem in the field.

From complex software articulated in high-level languages down to the silicon chips made in foundries, 60 years of computer science and engineering has defined and refined a tower of abstractions that constitute the solid foundations of today's classical computer systems. Challenges to reliability and correctness have been faced, and overcome, at many levels in the stack, and there is a wealth of insight and expertise in the diverse community of computer science researchers who work across it. Verification and testing are done at each level, with clearly defined protocols and acceptance criteria. Decades of classical computing systems research has worked out the architectures, languages and translations that bring it all together to make reliable digital systems.

Achieving reliable quantum computation faces unique challenges, not least the fragility of quantum systems due to their interactions with their environment and the fact that the state of the system during a computation cannot be measured to confirm its correctness. The very feature that makes quantum computation powerful, the exponential size of the space of states in the number of qubits, makes it hard to emulate and hence assess behaviour.

This programme will bring quantum computation research into close contact with the scientific tools, methods and (especially) mindsets of the computer science research community, across a broad spread of the key classical computing stacks. Together, they will define the beginnings of a general framework and advance specific solutions for robust and reliable quantum computation, at key layers across the principal quantum computing stacks needed to achieve trustworthy quantum computing systems.

Over the first year, the programme directors will invite engagement from across the UK's scientific community to co-create a portfolio of funded, cross-disciplinary projects that address this ambitious goal. A series of scoping workshops will be convened to propose and discuss technical directions and to facilitate the formation of project investigator teams. Projects selected for funding will commence from April 2023.

Prof Noah Linden of Bristol's School of Mathematics said: "At its most ambitious, our programme, with its focus on reliability and robustness, could lead to a completely new view of the quantum computing stack, with implications for hardware and software at every level."

Simon Benjamin, Professor of Quantum Technologies at the University of Oxford, said: "It's an incredibly exciting time for quantum computing, when we need people to come together from diverse backgrounds so that these machines achieve their potential as enabling tools for everyone, not just people with doctorates in quantum physics! This project is an important step in making that happen."

Tom Melham, Professor of Computer Science at the University of Oxford, said: "This innovative programme, funded by the EPSRC, will create an entirely new scientific community in the UK aimed at making trustworthy quantum computing a reality. Our ambition is to seed innovation in the design of reliable quantum computing systems as far-reaching as the revolution in VLSI chip design of the late 1970s and 80s."

Dan Browne, Professor of Physics at University College London, said: "I'm excited to be taking part in such an innovative research programme. Quantum computing can learn a huge amount from the know-how in the established computer science community. I am looking forward to sharing ideas with this community and building new collaborations."

Paul Kelly, Professor of Software Technology at Imperial College London, said: "This is an unusual and exciting opportunity to reach out to, establish, expand and seed the network of UK computer systems and software researchers to exploit the capabilities of quantum computing, and to bridge the gap to deliver quantum-accelerated applications to realise new computational capability across diverse application domains."

Read the original here:
March: Robust-and-Reliable-Quantum-Computing | News and features - University of Bristol

Probing The Inner Workings Of High-Fidelity Quantum Processors – Eurasia Review

The Science

Tiny quantum computing processors built from silicon have finally surpassed 99 percent fidelity in certain logic operations (gates). Quantum computers store information in the quantum state of a physical system (in this case, two silicon qubits), then manipulate the quantum state to perform a calculation in a manner that isn't possible on a classical computer. Fidelity is a measure of how close the final quantum state of the real-life qubits is to the ideal case. If the fidelity of logic gates is too low, calculations will fail because errors will accumulate faster than they can be corrected. The threshold for fault-tolerant quantum computing is over 99 percent.

Three research groups demonstrated more than 99 percent fidelity for if-then logic gates between two silicon qubits. This required precisely measuring failure rates, identifying the nature and cause of the errors, and fine-tuning the devices. The researchers used a technique called gate set tomography to achieve this in two of the three experiments. The technique combined the results of many separate experiments to create a detailed snapshot of the errors in each logic gate. The researchers were able to make a precise determination of the error generated by different sources and fine-tune the gates to achieve error rates below 1 percent.

Quantum computing may be able to solve certain problems, such as predicting the behavior of new molecules, far faster than today's computers. To do so, researchers must build qubits, engineer precise couplings between them, and scale up systems to thousands or millions of qubits.

Researchers expect qubits made of silicon to scale up better than the qubits used in today's testbed quantum computers, which rely on either trapped ions or superconducting circuits. Achieving high-fidelity logic gates opens the door to silicon-based testbed quantum computers. It also demonstrates the power of detailed error characterization to help users pinpoint error modes, then work around or eliminate them.

Qubits (protected, controllable two-state quantum systems) lie at the heart of quantum computing. Quantum computing processors are built by assembling an array of at least two (and hopefully someday thousands or millions) of qubits, with an integrated control system that can perform logic gates on each qubit and between pairs of qubits. Their performance and capability are limited by errors in the logic gates. High-fidelity gates have low error rates. Once the error rate is less than a certain threshold (which scientists believe to be about 1 percent), quantum error correction can, in principle, reduce it even further. Beating this threshold in laboratory experiments is a major milestone for any qubit technology.
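A back-of-envelope way to see why that roughly 1 percent threshold matters (a sketch, not the researchers' analysis): if each gate fails independently with probability p, a depth-n circuit runs error-free with probability (1 - p)^n, which collapses quickly as circuits get deeper.

```python
# Back-of-envelope sketch (not the papers' analysis): how per-gate errors
# compound with circuit depth. With gate error rate p, the probability a
# depth-n circuit sees no error at all is (1 - p) ** n.
for p in (0.05, 0.01, 0.001):              # 95%, 99%, 99.9% gate fidelity
    for depth in (10, 100, 1000):
        ok = (1 - p) ** depth
        print(f"fidelity {1 - p:.3f}, depth {depth:4d}: "
              f"error-free probability {ok:.3f}")

# At 99% fidelity, a 100-gate circuit already fails about 63% of the time,
# which is why gates must get below the ~1% threshold at which quantum
# error correction can take over.
```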

What kinds of errors are occurring is also a big deal for quantum error correction. Some errors are easier to eliminate or correct; others may be fatal. Quantum computing researchers from the Department of Energy (DOE)-funded Quantum Performance Laboratory worked with Australian experimental physicists to design a new kind of gate set tomography customized to a 3-qubit silicon processor. They used it to measure the rates of 240 distinct types of possible errors on each of six logic gates. Of those possible errors, 95 percent did not occur in the experiments, and the remaining errors added up to less than 1 percent infidelity. Research groups in Japan and the Netherlands reported similar results simultaneously, with the Dutch group also using the DOE-funded pyGSTi gate set tomography software to confirm their demonstration.
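pyGSTi is openly available, and a one-qubit gate set tomography run in the style of its tutorials looks roughly like the sketch below. The function names follow recent pyGSTi releases but are assumptions here; check the project documentation for exact signatures.

```python
# Hedged sketch of one-qubit gate set tomography with the open-source
# pyGSTi package, following its tutorial workflow. Names assume a recent
# pyGSTi release and should be verified against the docs.
import pygsti
from pygsti.modelpacks import smq1Q_XYI   # idle, X(pi/2), Y(pi/2) gate set

target = smq1Q_XYI.target_model()
edesign = smq1Q_XYI.create_gst_experiment_design(max_max_length=8)

# Stand-in for lab data: sample counts from a deliberately noisy model
noisy = target.depolarize(op_noise=0.01, spam_noise=0.001)
dataset = pygsti.data.simulate_data(
    noisy, edesign.all_circuits_needing_data, num_samples=1000, seed=2022)

# Fit gate-set models to the data; the result characterizes each gate's errors
data = pygsti.protocols.ProtocolData(edesign, dataset)
results = pygsti.protocols.StandardGST(modes="full TP,Target").run(data)
```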

Here is the original post:
Probing The Inner Workings Of High-Fidelity Quantum Processors - Eurasia Review