Category Archives: Quantum Computer
Conclusions from Forum TERATEC 2021: European Cooperation, Novel Uses of HPC – HPCwire
July 1, 2021 As the world enters the quantum era and politicians define the future face of a digital Europe, High Performance Computing (HPC) shapes the necessary and expected post-Covid rebound. Held from June 22 to 24, 2021, the 16th Forum Teratec highlighted the major challenges facing the entire HPC sector and the European community: autonomous production of supercomputers, democratization of HPC uses, and pooling of knowledge and skills at the European level.
Democratization of HPC use across businesses
As participating companies showed during this Teratec Forum, supercomputers are becoming increasingly popular in several sectors even outside industry, such as medical optics for smart glasses and archaeology for large-scale, ultra-precise surveys. However, these uses are still emerging.
Industries in all sectors see supercomputers as a possible answer to the complex problems their customers encounter, and to the new products yet to be created to address them.
With the growing number and diversity of HPC users, and the corresponding demands on data storage, new challenges will appear for manufacturers and technology providers.
The increase in demand will be matched by an increase in computing power and, consequently, in energy consumption and computing costs. As Daniel Verwaerde, President of Teratec, points out: "In the next few decades, the world of supercomputers must be able to offer solutions closely approaching carbon neutrality."
From a purely technical standpoint, manufacturers that want to offer seamless interfaces without loss of performance will need to ensure that data is managed consistently across conventional processors, accelerators, and coprocessors.
Agnès Pannier-Runacher, Minister Delegate to the Minister of Economy, Finance and Recovery, in charge of Industry, recalled that if quantum technology is to deliver its promised breakthrough of shortening computing times by a factor of one billion within 5 to 10 years, investment will have to be made in hardware, as planned by the French Quantum Plan (1.8 billion euros over five years) alongside European projects, which will also make it possible to bring accelerators and quantum computers into operation within computing centers.
As Daniel Verwaerde put it: without working on these three areas simultaneously, the investments made, however large, will be wasted.
Europe-wide cooperation
For France to be among the leaders in these technologies, and particularly in quantum computing, cooperation on a European scale is required. Even though France has pursued a proactive policy in this field since the 1960s, the financial stakes for the next generations of supercomputers are such that the nation cannot act alone. Provided French and European policies remain aligned, the expertise acquired over decades can give France a leading role.
As Daniel Verwaerde reaffirmed: because supercomputers are strategic tools for European development and collective security, failing to be autonomous in this field would be a serious handicap, economic (supercomputers are high value-added products) as well as societal (their production generates jobs, from the most highly qualified to a broad range of technical and manufacturing skills).
Anders Dam Jensen, Executive Director of EuroHPC, recalled the missions of the European joint undertaking: to provide Europe with eight supercomputers ranked, if possible, in the top five in the world, enabling it to compete on an equal footing with its competitors, and to develop a complete supply chain so that Europe can be autonomous in producing such supercomputers. One milestone is to produce a computer based on European technology through the next call for tenders for exascale machines, starting in 2023.
This collaboration will be fully expressed in the planned interconnection of all the major European computing centers after 2025. National competence centers will be designated for each member country so that all industrial companies, including SMEs and government agencies, have access to high-performance computing. In France, Teratec has been designated by the government and EuroHPC as the national competence center, in cooperation with Cerfacs (European Center for Research and Advanced Training in Scientific Computing) and GENCI (Grand Equipement National de Calcul Intensif).
"We are at a turning point [...] and this is where European involvement is particularly important," said Hervé Mouren, Director of Teratec. Daniel Verwaerde added: "Investing in the European project increases the chances that the policy decided by Europe will be the one that France needs."
To be released: a summary of the presentations at Teratec
A summary will soon be available outlining the rich program of presentations given during the 16th Forum Teratec.
Workshops were also held on targeted topics: hybrid quantum computing, communicable diseases, cyber threats, satellite data for the environment and climate, autonomous systems and HPC storage.
Finally, roundtables reviewed the technological challenges of high-performance simulation and the diversity of HPC uses.
The next Forum Teratec 2022 will be held June 14 & 15, 2022.
Find more information on Forum 2021 here:
https://teratec.eu/forum/index.html
Source: TERATEC
Read more:
Conclusions from Forum TERATEC 2021: European Cooperation, Novel Uses of HPC - HPCwire
Quantum Blockchain Technologies Plc – Working with D-Wave Systems – Yahoo Finance UK
6 July 2021
Quantum Blockchain Technologies Plc (QBT or the Company)
QBT To Use D-Wave's Quantum Technologies In Cryptography Algorithms
Quantum Blockchain Technologies Plc (AIM: QBT), the UK Quantum Computing Cryptography and Artificial Intelligence research and development (R&D) and investment company, listed on the London Stock Exchange's AIM market, announces it will use the Leap quantum cloud service from D-Wave Systems Inc., the leader in quantum computing systems, software and services, to develop cryptography algorithms for cryptocurrency mining.
QBT will now be able to access D-Wave's quantum-classical hybrid solvers, which leverage both quantum solutions and best-in-class classical algorithms to run large-scale, business-critical problems. With real-time access to quantum computers via the cloud, QBT aims to transform classical cryptography algorithms, such as the one used for Bitcoin mining, into quantum or quantum-classical hybrid computations.
QBT's quantum computing team is working on the Leap platform with the goal of exploiting the speed of quantum computation, which can be, under the appropriate conditions, several orders of magnitude faster than a classical computer.
D-Waves new quantum computer, Advantage, includes more than 5,000 qubits and 15-way qubit connectivity. More qubits and richer connectivity provide programmers and businesses access to a larger, denser, and more powerful graph for building commercial quantum applications.
The hybrid solver services in the Leap platform combine the power of Advantage with classical resources, enabling businesses and developers to build, run and solve complex, large-scale business-critical problems with up to 1 million variables.
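As an illustration of what submitting a problem to Leap's hybrid solvers looks like in practice, here is a minimal sketch using D-Wave's open-source Ocean SDK. The toy QUBO, and the assumption that a Leap account and API token are configured locally, are our placeholders, not QBT's actual cryptographic workload.

```python
# Minimal sketch (our illustration, not QBT's code): submit a toy QUBO to
# D-Wave's Leap hybrid solver via the Ocean SDK (pip install dwave-ocean-sdk).
import dimod
from dwave.system import LeapHybridSampler

# Toy objective: minimize x0 + x1 - 2*x0*x1; the minima are x0 == x1.
bqm = dimod.BinaryQuadraticModel(
    {"x0": 1.0, "x1": 1.0},   # linear coefficients
    {("x0", "x1"): -2.0},     # quadratic coefficient
    0.0,                      # constant offset
    dimod.BINARY,
)

# Requires a Leap account; the API token is read from the local Ocean
# configuration created by `dwave config create`.
sampler = LeapHybridSampler()
result = sampler.sample(bqm)
print(result.first.sample, result.first.energy)
```

Real workloads, such as the mining-related optimisation problems QBT describes, would be cast as much larger binary quadratic models; the hybrid solvers accept problems with up to one million variables.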
QBT has already assembled a team, which has started working on the conversion of optimised cryptographic algorithms to make them suitable to run on D-Wave's quantum system and quantum hybrid solvers.
Francesco Gardin, CEO and Executive Chairman of QBT, commented: "QBT is delighted to work with the D-Wave team, which we believe will provide us with an alternative approach to the computation of cryptographic algorithms. We have selected what we believe is a major, consolidated international player in the quantum computing market and we look forward to working with them during the first phase of the project. In particular, we are excited to have the benefit of utilising D-Wave's Advantage quantum processor, with more than 5,000 qubits, where we intend to develop our optimised cryptographic algorithms."
Alan Baratz, CEO of D-Wave Systems Inc., commented: "Bringing quantum computing to the world requires a robust ecosystem of developers and researchers, as well as forward-thinking businesses that are committed to building practical and applied quantum computing applications. QBT is a leader in developing new and disruptive approaches to blockchain technology, an important innovation with the power to change the world."
For more information on D-Wave, please go to: https://www.dwavesys.com/quantum-computing
-ends-
For further information please contact:
Quantum Blockchain Technologies Plc
Francesco Gardin, CEO and Executive Chairman
+39 335 296573
SP Angel Corporate Finance (Nominated Adviser & Broker)
Jeff Keating
+44 (0)20 3470 0470
Leander (Financial PR)
Christian Taylor-Wilkinson
+44 (0) 7795 168 157
About Quantum Blockchain Technologies (AIM: QBT)
Quantum's R&D focus is on cryptography and AI using quantum computing, bringing together the most advanced classical computing technology with quantum computing and AI deep learning to develop, among other things, a new and disruptive approach to blockchain technology, including cryptocurrency mining and other advanced blockchain applications.
The Company has assembled a team of international experts in these sectors, as well as a computing infrastructure, to support the development of advanced, innovative solutions based on front-line IT technologies.
For further information, please visit, http://www.quantumblockchaintechnologies.co.uk
Originally posted here:
Quantum Blockchain Technologies Plc - Working with D-Wave Systems - Yahoo Finance UK
IBM researchers demonstrate the advantage that quantum computers have over classical computers – ZDNet
Big Blue's quantum team set out to discover if today's quantum devices could be used to complete a task that cannot be done on a classical system.
IBM researchers have finally proven in a real-world experiment that quantum computers are superior to classical devices, although for now only at a miniature scale.
Big Blue's quantum team set out to discover if today's quantum devices, despite their limitations, could be used to complete a task that cannot be done on a classical system.
Since quantum computing is still in its infancy, the researchers leveled the playing field between the two methods by designing a microscopic experiment with limited space, that is, a limited amount of available memory.
Two limited-space circuits were built, one quantum and one classical, with only one bit or qubit available for computation and result storage. The task programmed into the circuits consisted of finding the majority out of three input bits, returning zero if more than half of the bits are zero, and one if more than half of the bits are one.
The restrictions, said the scientists, enabled a fair comparison between the power of classical and quantum space when carrying out a calculation.
"Through our research, we're exploring a very simple question,"said IBM's quantum team in a blog post."How does the computational power differ when a computer has access to classical scratch space versus quantum scratch space?"
Equipped with a single bit for computation and storage, the classical system is not capable of running the algorithm, theorized the scientists. Even when giving the system's computational capabilities a boost by adding what is known as random Boolean gates, the classical computer only succeeded 87.5% of the time.
Quantum devices, on the other hand, fared better: a perfect, noiseless quantum computer could succeed 100% of the time, said the scientists in their theoretical demonstration.
This is because, unlike classical bits that can either represent a 1 or a 0, qubits can take on a combination of various states at once, meaning that they have access to a larger space of values. In other words, quantum space is more valuable than classical space.
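To see concretely why a single classical bit falls short, one can enumerate every deterministic one-bit machine that reads the three input bits in sequence and outputs its final state; no such machine computes majority on all eight inputs, since after two bits it would need to distinguish three situations (majority already 0, majority already 1, copy the next bit) with only two states. The short Python sketch below is our own illustration of that counting argument, not IBM's experiment code.

```python
# Our illustration (not IBM's code): brute-force every deterministic machine
# that streams 3 input bits through a single bit of state and outputs the
# final state. How close can it get to the 3-bit majority function?
from itertools import product

def majority(bits):
    return 1 if sum(bits) >= 2 else 0

best = 0.0
for s0 in (0, 1):                             # initial state
    for table in product((0, 1), repeat=4):   # T: (state, bit) -> next state
        correct = 0
        for bits in product((0, 1), repeat=3):
            state = s0
            for b in bits:
                state = table[2 * state + b]
            correct += (state == majority(bits))
        best = max(best, correct / 8)

# Prints the best deterministic success rate, which is below 100%; the
# paper's 87.5% classical ceiling additionally allows random Boolean gates,
# while a noiseless qubit reaches 100%.
print(f"best deterministic one-bit machine: {best:.1%}")
```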
The theory, however, is still some distance away from reality. Current quantum computers are still too noisy to achieve the perfect results demonstrated by the scientists in their paper. But when carrying out the experiment in real life, with circuits calibrated to run the program more efficiently, IBM's team still observed a success rate of 93%, which beats the classical system.
"We show that qubits, even today's noisy qubits, offer more value than bits as a medium of storage during computations," said the scientists.
This means that even today's noisy quantum computers can offer better performance on the problem than the theoretical maximum performance of a classical device, suggesting that as the technology evolves, the performance gap with classical devices will only widen.
Big Blue's quantum team claims that this is a world-first demonstration of quantum advantage, because the theory is backed by a real-life experiment.
To date, research projects are concerned with proving a theoretical quantum advantage that can only be demonstrated when the hardware is mature enough to run large-scale programs, according to the scientists.
From improving car manufacturing supply chains to optimizing the routes of merchant ships around the world's oceans: there is no shortage of ideas when it comes to researching how quantum computing could create business value. But for now, scientists are mostly finding that quantum technologies are comparable to classical systems for small-scale problems, and only theorizing that quantum devices will eventually deliver an advantage as the computers develop.
"Here, for the first time that we are aware of, we report a simultaneous proof and experimental verification of a new kind of quantum advantage," said IBM's researchers.
As quantum hardware improves, these experimental verifications are expected to expand beyond tests carried out at the level of single bits. IBM recently unveiled a quantum roadmap for the next few years, which includes a 1,121-qubit system to be built by 2023, on track to creating systems supporting more than one million qubits in the longer term.
See original here:
IBM researchers demonstrate the advantage that quantum computers have over classical computers - ZDNet
Is this the first physics problem that the quantum computer will solve? – Centrum Wiskunde & Informatica (CWI)
In his PhD thesis, theoretical physicist Joris Kattemölle (UvA/QuSoft/CWI) proposes a physical problem that could be the first one for a quantum computer to solve. The problem cannot be solved by a classical computer, but a quantum computer with about one hundred quantum bits can. On Wednesday 30 June Kattemölle received a doctorate from the University of Amsterdam for his PhD thesis entitled 'Many-body physics meets quantum computation'.
A quantum computer can solve problems that a classical computer would never be able to calculate. Current quantum computers only exist in a few large research labs around the world and contain at most a few dozen quantum bits, the elementary calculation units of this revolutionary new computer. In 2019, Google demonstrated a quantum computer consisting of 53 quantum bits that solved a problem that a classical computer cannot solve. This became world news, despite the fact that it was a toy problem with no applications.
In his doctoral thesis, theoretical physicist Joris Kattemölle describes a problem that is interesting for physicists to solve, one that cannot be solved by a classical computer but can be solved with only around one hundred quantum bits. And a quantum computer consisting of one hundred quantum bits is already in sight.
The problem that Kattemölle proposes is called the kagome lattice (kagome is a Japanese word for a certain weaving pattern that looks exactly like the lattice). Reproducing this lattice on a computer can provide new insights into the behaviour of solids found in nature. For example, the kagome lattice describes the magnetic properties of the mineral Herbertsmithite, which was discovered by Herbert Smith in Chile in 1972. The mineral has no specific applications but is an interesting object for physicists to study in order to understand all possible behaviours of atoms and molecules in solids.
The most exciting aspect of the kagome lattice is that it is a promising candidate for proving that there is a new kind of magnetism: a so-called quantum spin liquid (a new kind of disordered magnetic state in which there is no order in the direction of the elementary magnets, as there is, for example, in a ferromagnet, where all the elementary magnets point in the same direction). Physicists think that a quantum spin liquid exists, but have never proven it or found it experimentally. In his thesis, Kattemölle has shown that the kagome problem has exactly the right properties that make it very suitable to be solved with a quantum computer.
The thread running through Kattemölle's thesis is the interaction between many-particle physics (which, for example, explains why electrical conduction only occurs with many electrons and not with a single electron) and the quantum computer. A many-particle problem that some physicists believe is a practical obstacle to the construction of a quantum computer is super-noise. Super-noise is the phenomenon that the noise of all quantum bits combined is greater than the sum of the noise of all individual quantum bits. In his thesis, Kattemölle, in addition to his work on the kagome lattice, also demonstrated that this super-noise does not pose any practical problem for the construction of a future quantum computer.
Here is the original post:
Is this the first physics problem that the quantum computer will solve? - Centrum Wiskunde & Informatica (CWI)
New research proves that quantum computational errors are correlated and connects them to cosmic rays – Illinoisnewstoday.com
In an experiment conducted at the University of Wisconsin-Madison, researchers found that the charge fluctuations of multiple qubits can be highly correlated rather than completely random and independent. The team also linked the perturbations that cause small errors in the qubits' charge state to cosmic ray absorption. Illustration courtesy of UW-Madison.
A study by Lawrence Livermore National Laboratory (LLNL) physicists and many collaborators sheds new light on error correction, one of the key challenges in realizing the potential of quantum computing.
In a new paper published in Nature, scientists including LLNL physicist Jonathan DuBois investigated the stability of quantum computing, in particular the causes of errors and how quantum circuits react to them, an understanding that is needed to build a working quantum system. The other co-authors come from the University of Wisconsin-Madison, Fermi National Accelerator Laboratory, Google, Stanford University and international universities.
In the experiment conducted at UW-Madison, the research team characterized a quantum testbed device and found that the charge fluctuations of multiple qubits, the basic units of quantum computers, are not completely random and independent. Catastrophic events, such as bursts of energy entering from outside the system, can affect all qubits near the event at the same time, resulting in potentially system-wide correlated errors, the researchers discovered. In addition, the team linked the perturbations that cause small errors in the charge state of qubits to cosmic ray absorption, a discovery that has already influenced the way quantum computers are designed.
"In most cases, schemes designed to correct quantum computer errors assume that errors between qubits are uncorrelated; they are random. Correcting correlated errors is very difficult," said DuBois, a co-author from LLNL's Quantum Coherent Device Physics (QCDP) group. In essence, the paper shows that if a high-energy cosmic ray hits somewhere in the device, it can affect everything in the device at once. Unless this can be prevented, error correction cannot be performed efficiently, and without efficient error correction a working system cannot be built.
Unlike the bits found in classical computers, which can only exist in a binary state (0 or 1), the qubits that make up a quantum computer can exist in a superposition of 1 and 0 for hundreds of microseconds before being projected into a classical binary state. Whereas bits are susceptible to one type of error, delicate qubits in their transient excited states are susceptible to two types of errors caused by changes in their environment.
The researchers found that charged impulses, even tiny ones such as those from cosmic rays absorbed by the system, can heat the substrate of the quantum device, disturb the quantum state of the qubits and set off bursts of (relatively) high-energy electrons. When a particle collision occurs, wakes of charged particles race through the material inside the device, scattering atoms and producing high-energy vibrations and heat. This changes the thermal and vibrational environment around the qubits, causing errors, DuBois explained.
"We've always known that this is possible and has potential implications, and it's one of many issues that can affect the behavior of qubits," DuBois added. "When we saw bad performance, we used to joke that cosmic rays might have caused it. The importance of this study is that it puts some quantitative limits on the performance you can expect from current device designs in the presence of environmental radiation."
To confirm the disturbances, the researchers sent radio-frequency signals to a four-qubit system and measured its excitation spectrum, using spectroscopy to verify that qubits flipped from one quantum state to another as the charge environment changed.
"If the model for particle impact is correct, most of the energy is expected to be converted into vibrations in the chip that propagate over long distances," said Chris Wilen, a graduate student at UW-Madison and lead author of the paper. "As the energy spreads, the disturbance leads to qubit flips that are correlated across the chip."
Using this method, the researchers also examined the lifetime of the qubits (the length of time a qubit can stay in a superposition of 1 and 0) and correlated system-wide changes in the charge state with decreases in qubit lifetime.
The team concluded that quantum error correction will require the development of mitigation strategies to protect quantum systems from correlated errors caused by cosmic rays and other particle impacts.
"I think people have been approaching the problem of error correction in an overly optimistic way, blindly assuming that the errors are uncorrelated," said Robert McDermott, professor of physics at UW-Madison and senior author of the study. "Our experiments show absolutely that the errors are correlated, but as we identify the problems and gain a deeper physical understanding, we will find ways around them."
DuBois said the team's findings, long theorized, had never before been experimentally demonstrated on multi-qubit devices. The results may influence future quantum system architectures: quantum computers could be placed in lead shielding or underground, heat sinks and dampers could be introduced to quickly absorb energy and isolate the qubits, and the types of materials used in quantum systems could be changed.
LLNL's own quantum computing testbed system was designed and built with funding from a Laboratory Directed Research and Development (LDRD) Strategic Initiative launched in 2016. It is being developed with the continued support of the National Nuclear Security Administration's Advanced Simulation & Computing program and its Beyond Moore's Law project.
In related follow-up work, DuBois and his team in the QCDP group are studying quantum devices that are significantly less sensitive to the charge environment. At the extremely low temperatures required by quantum computers (the system is kept colder than space), DuBois observed, heat and coherent energy transport behave qualitatively differently from room temperature. For example, instead of diffusing, heat energy can bounce around the system like a sound wave.
DuBois said he and his team are focused on understanding the dynamics of the microscopic explosions that occur when quantum computing devices interact with high-energy particles, and on developing ways to absorb that energy before it destroys the delicate quantum states stored in the devices.
"There are potentially ways to design a system to be as insensitive to these types of events as possible. To do so, we need to fully understand how the system heats and cools, and exactly what happens throughout the process when it is exposed to background radiation," said DuBois. "The physics of what is happening is very interesting. Aside from the quantum applications, it is a frontier problem because of the strangeness of how energy is transported at these low temperatures. That makes it a physics challenge."
DuBois worked with McDermott (UW-Madison) and his group to develop a method for using qubits as detectors to measure the charge environment; this is the method the team used in the paper's experiments.
The featured work, including DuBois's contribution, was funded by a joint LLNL and UW-Madison grant from the U.S. Department of Energy Office of Science.
The paper's co-authors come from UW-Madison, Fermi National Accelerator Laboratory, the Kavli Institute for Cosmological Physics at the University of Chicago, Stanford University, INFN Sezione di Roma, the Laboratoire de Physique Théorique et Hautes Energies at Sorbonne Université, and Google.
View original post here:
New research proves that quantum computational errors are correlated and connects them to cosmic rays - Illinoisnewstoday.com
CSRWire – Refusing Limits with Liz Ruetsch – CSRwire.com
Published 07-01-21
Submitted by Keysight Technologies
Keysight Blog
By Brianne McClure | Brand Storyteller
Two years into the electrical engineering program at Rutgers University, Elizabeth (Liz) Ruetsch called her father in tears. She told him that she wanted to quit the program. The problem was, as her father pointed out pragmatically, she didn't have a plan B.
Liz shared this story with me when I invited her to participate in our Refusing Limits interview series to celebrate International Women in Engineering Day. Despite her initial feelings that the electrical engineering program was too challenging and she could not see herself working in research and development, Liz would go on to graduate as one of six women in a class of 160 engineers. She has since become an inspiration to many engineers, especially women.
On her way to the finish line, Liz saw many of her female peers come to a similar crossroads and drop out. That's when she realized how important it is for women in engineering to have beacons. Liz explained that beacons are people in the industry who inspire you and give you a reason to stick with the engineering journey when things get tough. Once she found her own beacons, Liz wanted to help other women do the same, so they would be inspired to complete the engineering program.
When I spoke with Liz, I was eager to learn how she went from almost dropping out of engineering school to forging a fascinating career in the test and measurement industry - spanning twenty-seven years of sales, marketing, and leadership. She has worked in the US and internationally during her career, including a two-year assignment living and working in China. She was also recognized by the Society of Women Engineers with a Global Leadership Award and the North Bay Business Journal with a Women in Business Award. She now leads the quantum engineering team at Keysight.
Liz, how much of your ability to stick with the engineering program came down to sheer determination? And do you think women with grit are more likely to succeed as engineers?
The women in my engineering program were brilliant and had plenty of grit. So, I think it's more likely that they didn't have good enough reasons to keep going. The program is very demanding, and if you can't picture yourself coming out of it and entering a career that excites you, changing course makes a lot of sense. That's especially true at a university like Rutgers, where you can pursue degrees outside of engineering.
During the program, I found myself looking for inspiration. When I was introduced to a broader range of engineering careers, I became more excited about being an engineer. I wanted to inspire that same kind of excitement in my peers, so I got involved with the Society of Women Engineers (SWE). As co-president of our local section, I introduced a weekly speaker series where people from different engineering disciplines and roles (sales, marketing, operations) would talk about their work. Those speakers became beacons who showed the women in our section that even if mechanical or electrical engineering wasn't for them, they might enjoy industrial, packaging, or environmental engineering. I'm proud to say that the program made a difference in retaining women in the overall engineering program.
We also started a program where girls in high school spent a weekend at the university getting a feel for studying engineering by working on some projects and meeting women studying in various engineering fields. When I received my leadership award at the SWE conference, I sought out the current president of the Rutgers SWE section. I was thrilled to hear from her that this weekend program is still going today - almost 30 years later.
In hindsight, do you think working through the most challenging parts of the engineering program helped prepare you for the real world?
I learned a lot about myself between the time I called my father, ready to quit, and graduation. Sticking with the program taught me how to navigate a hard situation that I knew would last at least another two years until completion. Along the way, I realized that I don't have to have all of the answers on day one to keep moving forward. Once I could break the unknown down into smaller, solvable problems, the challenge suddenly became exciting and ultimately rewarding. And I'm glad I learned that lesson early on because the most pivotal points in my career came down to taking on big challenges that I did not have a clear path to solving on day one.
Can you describe some of those pivotal points in your career?
When I started my career as a sales representative for Hewlett Packard (HP), my customer was a big defense contractor. At that time, I was twenty-something years old and trying to sell to a bunch of guys who were radar, missile, and satellite engineers. The first time I walked into a meeting, they said, "you know nothing about radar, right?" They said, "sure; maybe you have an engineering degree. And maybe you understand circuits and electromagnetics or digital signal processing from your textbooks. But what do you really know about radar? How can you possibly help me?" That was an intimidating situation. Luckily, I was learning at that time how to be comfortable with not having all the answers. So, I said, "You know what? I know absolutely nothing about radar, but I'd love to hear about it." And thankfully, people love to talk about what they are working on. And the more they talked, the more I listened to their challenges and learned what solutions we could bring to bear. Many of these customers became close friends, and here it is twenty years later, and I'm still in contact with them even though they are well into retirement.
Another significant challenge in my career was living and working in China. I had traveled to China frequently and managed people there and in 14 other countries. But living and working in China is far different than staying at the Marriott there for a few days. During my first three months, I struggled with learning the most effective way to lead the local team. But once I solicited some excellent mentors and did some deep reflecting, it turned into a tremendous experience. I learned more in my two years there than in other roles I had held for over five years.
Twenty-seven years later, I'm still doing work that stretches me as a leader. Because as I like to tell my teams - it's good to feel scared every few years. That's how you know you are pushing yourself out of your comfort zone. Before taking on my latest role, I had expressed interest to my management about getting involved with mergers and acquisitions. In late 2019, an opportunity came about where we planned to acquire a company in Boston and set up a research and development team there. My leaders were looking for a general manager to integrate the acquired company with Keysight. It was one of those opportunities that's equal parts thrilling and terrifying. On the one hand, I had an excellent background in many of the areas that touch quantum, including aerospace and defense, markets like China, business models for selling software and services, and providing complete test solutions. On the other hand, I was not a quantum physicist. Since Keysight is a results-oriented company, and I've delivered results consistently in multiple business units, the management team supported me to stretch myself into this new GM role. When they offered me the role, I took on the challenge enthusiastically and started to navigate this new territory.
And you've been in that role for over a year now. Would you make the same decision again?
It was a massive leap for me with a lot of unknowns. But I knew that I would be able to figure things out along the way. Part of the reason I was confident was because of the caliber of the team that I had the opportunity to work with and learn from. And we have since added to that team with some exceptional industry and university talent. Having the opportunity to lead the team that is enabling our customers to advance quantum computing has been one of the most exhilarating adventures of my career. And we're just getting started!
Immediately after we founded our quantum research lab in Cambridge, Massachusetts, the world went into quarantine due to the pandemic. Like many people, we had to learn how to interview, hire, onboard, and manage a new team remotely. Hiring both quantum physicists and software engineers for research and development was entirely new to me, so we formed a group of managers with experience in this area to assist.
In parallel with this work, we also started the process to acquire another company, Quantum Benchmark. Quantum Benchmark was the first acquisition that I led from beginning to end, which was an even more complex challenge. It takes a lot of preparation to identify and promote an acquisition target to your CEO and board of directors. Once again, I called on a team of people with experience in this area to coach and guide us. And it worked out as Quantum Benchmark became part of Keysight in April.
You've talked a lot about the importance of taking on challenges that push you out of your comfort zone. How does that belief manifest in your leadership style?
For the first time in my management career, there are more people on my team with Ph.D.'s than not. These individuals are at the leading edge of quantum, and they are very comfortable pushing the boundaries of technology. But I did encourage our team to be intentional about cultivating a diversity of thought across the ecosystem as they hired new team members.
Right now, the physics part of quantum is reasonably known. But the engineering part of actually building a computer is a big challenge. To progress this technology forward, you need very cross-disciplinary teams. You need physicists, software engineers, and FPGA [field programmable gate array] engineers. You also need to balance university experience with start-up experience and corporate experience to ensure that the solutions are innovative, scalable, and supportable.
And it's exciting to see this unique combination of talent working together to challenge what's possible. The most rewarding part about leading this team is seeing them engaging with customers and partners, being excited about their work, and having opportunities to stretch themselves.
And now that you've helped launch the Women in Quantum mentoring program, you're empowering people inside and out of the company to grow. Can you give an update on how that's going?
Sure. We introduced the Women in Quantum mentoring program earlier this year. The idea behind creating a network of women in quantum goes back to our conversation earlier about setting up beacons to illuminate paths forward when people are feeling stuck or just needing some inspiration. When I learned about the Women in Quantum organization led by Denise Ruffner, I saw an opportunity to leverage Keysight's internal mentoring platform to connect mentors and mentees across the industry. I then sought out support from our Director of Diversity and Inclusion, Leslie Camino-Markowitz, and she made it happen. We have had over 400 people sign up for the program to date. It is also exciting that it keeps coming up on my calls with customers who've told me how glad they are that Keysight is sponsoring this effort to help with the talent pipeline in the quantum ecosystem.
The program is open to people of all gender identities who want to be a mentee or mentor. And it's not just mentoring on technical topics. A lot of people have called me out of the blue about career navigation. Or they have great ideas but can't get any buy-in, and they want coaching on how to improve their influencing skills. I'm always amazed when I'm speaking with mentees that sharing the simplest things can help somebody get unstuck and make them feel empowered to move forward.
You've touched a lot of lives over the years. How do you feel when people call you inspirational?
I was surprised by how many people came up to me and said something along those lines after I received the Global Leadership Award during the Society of Women Engineers conference in Austin, TX. I have never intentionally set out to challenge the status quo or to inspire anyone. I like to challenge myself and try new things, and somehow that inspires other women in the process. When that happens, when I hear their success stories, it is special.
Keysight Technologies, Inc. (NYSE: KEYS) is a leading technology company that helps enterprises, service providers and governments accelerate innovation to connect and secure the world. Keysight's solutions optimize networks and bring electronic products to market faster and at a lower cost with offerings from design simulation, to prototype validation, to manufacturing test, to optimization in networks and cloud environments.
More from Keysight Technologies
See original here:
CSRWire - Refusing Limits with Liz Ruetsch - CSRwire.com
Crédit Agricole CIB partners with Pasqal and Multiverse Computing – IBS Intelligence
Crédit Agricole CIB, together with European tech companies Pasqal and Multiverse Computing, announced a partnership to design and implement new approaches, running on classical and quantum computers, that aim to outperform state-of-the-art algorithms for capital markets and risk management.
International companies and institutions have started investing heavily in quantum technologies. Europe launched the Quantum Flagship Plan in October 2018, and France recently announced a €1.8 billion investment plan.
Quantum computing is likely to profoundly impact multiple industries in the coming years, including finance. Finance has long made substantial use of algorithms requiring advanced mathematics and statistics; now it is the turn of quantum physics to help solve quantitative financial problems. Quantum theory and technology, assembled in quantum computing, are starting to demonstrate promising applications in capital markets and risk management.
Crédit Agricole CIB has teamed up with two quantum technology companies to apply quantum computing to real-world finance applications. French company Pasqal is developing a quantum computer based on neutral-atom arrays, a technology currently being trialled to build industrial quantum computers. Spanish company Multiverse Computing specialises in quantum algorithms that can run on both quantum and classical computers.
Georges-Olivier Reymond, CEO of Pasqal, said: "I strongly believe in this partnership to foster the use of quantum computing for finance. To our knowledge, it is the first ever in which all the stakeholders (software developer, hardware provider and end-user) are working together on a problem. All the teams are very excited, and this development will be the cornerstone of future industrial applications for neutral-atom quantum computers."
Enrique Lizaso, CEO of Multiverse Computing, said: "We are thrilled with the opportunity of working together with Crédit Agricole CIB and Pasqal on this ambitious project, which will put into production the most advanced tools, currently only used in large non-financial institutions in the US and China. This is a landmark project for finance worldwide."
Read the original:
Crdit Agricole CIB partners with Pasqal and Multiverse Computing - IBS Intelligence
Keynotes Announced for IEEE International Conference on Quantum Computing and Engineering – HPCwire
LOS ALAMITOS, Calif., June 24, 2021 The IEEE International Conference on Quantum Computing and Engineering (QCE21), a multidisciplinary event bridging the gap between the science of quantum computing and the development of an industry surrounding it, reveals its full keynote lineup. Taking place 18-22 October 2021 virtually, QCE21 will deliver a series of world-class keynote presentations, as well as workforce-building tutorials, community-building workshops, technical paper presentations, stimulating panels, and innovative posters. Register here.
Also known as IEEE Quantum Week, QCE21 is unique in integrating dimensions of both academic and business conferences, and will reveal cutting-edge research and developments across quantum research, practice, applications, education, and training.
QCE21's keynote speakers include the following quantum groundbreakers and leaders:
Alan Baratz, D-Wave Systems, President & CEO
James S. Clarke, Intel Labs, Director of Quantum Hardware
David J. Dean, Oak Ridge National Laboratory, Director, Quantum Science Center
Jay Gambetta, IBM Quantum, IBM Fellow & VP Quantum Computing
Sonika Johri, IonQ, Senior Quantum Applications Research Scientist
Anthony Megrant, Google Quantum AI, Lead Research Scientist
Prineha Narang, Harvard University & Aliro Quantum, Professor & CTO
Brian Neyenhuis, Honeywell Quantum Solutions, Commercial Operations Leader
Urbasi Sinha, Raman Research Institute, Bangalore, Professor
Krista Svore, Microsoft, General Manager Quantum Systems
Through participation from the international quantum community, QCE21 has developed an extensive conference program with world-class keynote speakers, technical paper presentations, innovative posters, exciting exhibits, technical briefings, workforce-building tutorials, community-building workshops, stimulating panels, and Birds-of-Feather sessions.
Papers accepted by QCE21 will be submitted to the IEEE Xplore Digital Library, and the best papers will be invited to the journals IEEE Transactions on Quantum Engineering (TQE) and ACM Transactions on Quantum Computing (TQC).
QCE21 is co-sponsored by IEEE Computer Society, IEEE Communications Society, IEEE Council of Superconductivity, IEEE Future Directions Committee, IEEE Photonics Society, IEEE Technology and Engineering Management Society, IEEE Electronics Packaging Society, IEEE Signal Processing Society (SP), and IEEE Electron Device Society (EDS).
The inaugural 2020 IEEE Quantum Week built a solid foundation and was highly successful over 800 people from 45 countries and 225 companies attended the premier event that delivered 270+ hours of programming on quantum computing and engineering.
The second annual 2021 Quantum Week will virtually connect a wide range of leading quantum professionals, researchers, educators, entrepreneurs, champions, and enthusiasts to exchange and share their experiences, challenges, research results, innovations, applications, and enthusiasm, on all aspects of quantum computing, engineering and technologies. The IEEE Quantum Week schedule will take place during Mountain Daylight Time (MDT).
Visit IEEE QCE21 for all event news including sponsorship and exhibitor opportunities.
QCE21 Registration Package provides Virtual Access to IEEE Quantum Week Oct 18-22, 2021 as well as On-Demand Access to all recorded events until the end of December 2021 featuring over 270 hours of programming in the realm of quantum computing and engineering.
About the IEEE Computer Society
The IEEE Computer Society is the world's home for computer science, engineering, and technology. A global leader in providing access to computer science research, analysis, and information, the IEEE Computer Society offers a comprehensive array of unmatched products, services, and opportunities for individuals at all stages of their professional career. Known as the premier organization that empowers the people who drive technology, the IEEE Computer Society offers international conferences, peer-reviewed publications, a unique digital library, and training programs.
Source: IEEE
The rest is here:
Keynotes Announced for IEEE International Conference on Quantum Computing and Engineering - HPCwire
The evolution of cryptographic algorithms – Ericsson
Cryptographic algorithms and security protocols are among the main building blocks for constructing secure communication solutions in the cyber world. They correspond to the locks that secure a house in the physical world. In both, it is very difficult to access the assets inside without a valid key. The algorithms and protocols rely on mathematical problems that are computationally infeasible to solve, whereas lock mechanisms rely on the difficulty of defeating their physical construction.
Mobile networks are critical infrastructure and heavily use advances in cryptographic algorithms and protocols to ensure the security of the information in the communication and privacy protection for the individuals. In this blog post, we take a detailed look at the cryptographic algorithms and protocols used in mobile communications and share some insights into the recent progress. We give an overview taking into consideration the development from 2G to 5G and beyond. In addition, we present detailed information on the progress toward defining the profiles to be used in the security protocols for the mobile communication systems. Last but not least, we give the current status and future plans for post-quantum cryptographic algorithms and protocols.
It can be hard to get an overview of the cryptographic algorithms used in mobile networks. The specifications are spread out over many documents, published over a period of 30 years by the three standardization organizations: 3GPP, ETSI and GSMA. The algorithms can also have quite cryptic names, with more than one name often given to the same algorithm. For example, GEA5, UEA2, 128-EEA1 and 128-NEA1 are almost identical specifications of SNOW 3G for GPRS, UMTS, LTE and NR respectively.
The 3GPP/GSMA algorithms come in three different types: authentication and key generation, encryption and integrity. The authentication and key generation algorithms are used in the Authentication and Key Agreement (AKA) protocol. The encryption and integrity algorithms are used together or independently to protect control plane and user plane data. An overview of all currently specified algorithms is shown in Figures 1 and 2.
The second generation (2G or GSM) mobile networks have quite low security by today's standards. But GSM was actually the first mass-market communication system to use cryptography, which was both revolutionary and controversial. At the time, export of cryptography was heavily restricted and GSM had to be designed with this in mind. The encryption algorithms A5/1 and A5/2 are LFSR-based stream ciphers supporting 64-bit key length. A5/2 is a so-called export cipher designed to offer only a 40-bit security level. Usage of export ciphers providing weak security was common at that time, and other standards like TLS also supported export cipher suites.
To further align with export control regulations, the key generation algorithms COMP128-1 and COMP128-2 decreased the effective key length to 54 bits by setting 10 bits of the key to zero. While A5/1 and A5/2 mostly met their design criteria, COMP128-1 was a very weak algorithm and was soon replaced by COMP128-2 and COMP128-3. When packet-switched data was introduced with GPRS, slightly different algorithms, GEA1 and GEA2, were introduced. Similar to A5/1 and A5/2, GEA1 and GEA2 are LFSR-based stream ciphers supporting 64-bit key length, where GEA1 was the export cipher. The export ciphers A5/2 and GEA1 have been forbidden in phones for many years, and COMP128-1 is forbidden in both networks and SIM cards. None of the original 2G algorithms were officially published, as they were intended to be kept secret, which was quite common practice at the time. But all were reverse engineered by researchers in academia nearly a decade after their development.
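A quick back-of-the-envelope check shows what the export weakening meant in practice: forcing 10 of the 64 key bits to zero shrinks the brute-force search space from 2^64 to 2^54 keys, a 1,024-fold reduction.

```python
# Arithmetic behind the export weakening: zeroing 10 of 64 key bits
# leaves 2**54 candidate keys instead of 2**64.
full_space = 2 ** 64
weakened_space = 2 ** (64 - 10)
print(f"full:     2**64 = {full_space}")
print(f"weakened: 2**54 = {weakened_space} ({full_space // weakened_space}x smaller)")
```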
The third generation (3G or UMTS) mobile networks introduced 128-bit security level public encryption and integrity algorithms. In 3G, the algorithms were selected by the ETSI Security Algorithms Group of Experts (SAGE), which has since made recommendations for all the new algorithms for mobile networks. The final decision is always taken by 3GPP SA WG3, the security working group in 3GPP. While many other designs from the same time, such as SSH and TLS, turned out to have significant flaws, the 3G algorithms and their modes of operation are still secure today.
The 3G encryption algorithms UEA1 and UEA2 use the KASUMI block cipher and the SNOW 3G stream cipher, which are slightly modified versions of the MISTY1 block cipher and the SNOW 2.0 stream cipher respectively. The integrity algorithm UIA1 is a CBC-MAC using KASUMI, and UIA2 is a Carter-Wegman MAC based on SNOW 3G. For authentication and key generation, the exact algorithm is not standardized and it is up to the operator to choose the algorithm deployed in their home network and SIM cards. 3GPP defines the Milenage algorithm (based on AES-128) as a well-designed example algorithm, and this choice is widely used in practice. All the 3G algorithms have also been specified for use in 2G.
Figure 1: 3GPP/GSMA algorithms for authentication and key generation - Green algorithms are secure while red algorithms only offer 64-bit security or less.
Figure 2: 3GPP/GSMA algorithms for encryption and integrity protection - Green algorithms are secure while red algorithms only offer 64-bit security or less.
The fourth generation (4G or LTE) mobile networks replaced KASUMI with AES-128. The encryption algorithm 128-EEA2 is AES in counter mode (AES-CTR), while the integrity algorithm 128-EIA2 is AES in CMAC mode. 4G also introduced Tuak, a new algorithm family for authentication and key generation based on the Keccak hash algorithm, but using slightly different parameters from those NIST later standardized as SHA-3. SIM cards are recommended to support both Milenage and Tuak. 4G also introduced an optional algorithm, ZUC, used to construct the 128-EEA3 and 128-EIA3 algorithms, which are the only ones optional for implementations to support. It is also worth mentioning that 3GPP specifies at least two mandatory algorithms, following the security practice of having a backup algorithm.
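To get a feel for the 4G primitives, the sketch below uses the Python cryptography package to run AES in counter mode (the core of 128-EEA2) and AES-CMAC with a truncated 32-bit tag (the core of 128-EIA2). The counter block here is a simplified stand-in of ours; the real 128-EEA2 constructs it from the COUNT, BEARER and DIRECTION parameters defined in TS 33.401.

```python
# Hedged sketch of the 4G building blocks with the 'cryptography' package
# (pip install cryptography). The counter block is simplified: real
# 128-EEA2 derives it from COUNT, BEARER and DIRECTION (TS 33.401).
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.cmac import CMAC

key = os.urandom(16)             # 128-bit key, as in 128-EEA2 / 128-EIA2
counter_block = os.urandom(16)   # simplified stand-in, see note above
plaintext = b"user plane traffic"

# Confidentiality: AES in counter mode, the core of 128-EEA2.
enc = Cipher(algorithms.AES(key), modes.CTR(counter_block)).encryptor()
ciphertext = enc.update(plaintext) + enc.finalize()

# Integrity: AES-CMAC, the core of 128-EIA2. 3GPP truncates the 128-bit
# tag to 32 bits, one reason future algorithms target 64-bit tags.
mac = CMAC(algorithms.AES(key))
mac.update(ciphertext)
tag32 = mac.finalize()[:4]

print(ciphertext.hex(), tag32.hex())
```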
The fifth generation (5G or NR) uses exactly the same algorithms as 4G. There are no known weaknesses in any of the 4G algorithms, and they offer good enough performance when implemented in hardware. However, the currently used algorithms are not suitable for future deployments: they are slow in software, do not support 256-bit keys, and only support 32-bit MACs. Software performance is essential for software implementations in virtualized deployments. While these algorithms are fast enough for 5G when implemented in hardware, they perform far worse than state-of-the-art algorithms even in hardware and will likely not be suitable for 6G.
3GPP SA3 and ETSI SAGE have therefore started working together on new virtualization-friendly algorithms suitable for later 5G releases and 6G. It is essential that the new algorithms perform well in software on a wide range of architectures (such as x86, ARM and RISC-V) and that they can also be efficiently implemented in hardware. AES-CTR is already fulfilling these criteria, but would have to be accompanied by a high-performance integrity mode like GMAC. SNOW 3G is not up to the task, but the new cipher SNOW-V would be a perfect fit, outperforming even AES-GCM on x86 processors.
The new algorithms to be introduced to 3GPP will likely support only 256-bit key length and offer at least 64-bit tags. While 128-bit algorithms will be practically secure against quantum computers, cellular networks are increasingly classified as critical infrastructure. Already today, governments and financial institutions often mandate more than 128-bit security level for protection of their communication.
While mobile networks use some algorithms and security protocols specific to 3GPP, most of the security protocols used in 5G such as TLS, DTLS, IKEv2, ESP, SRTP, X.509, and JOSE are standardized or maintained by the Internet Engineering Task Force (IETF). 3GPP has, for many years, had the excellent tradition of updating their security profiles in almost every release following recommendations from academia, IETF and other organizations. A large part of this work has been driven by Ericsson.
The general 3GPP profiles for (D)TLS, IPsec and X.509 specified in TS 33.210 and TS 33.310 apply to many different 3GPP interfaces. 3GPP now has some of the best and most secure profiles for TLS and IPsec. 3GPP was, for example, very early in mandating support for TLS 1.3 and in forbidding TLS 1.1 and all weak cipher suites in TLS 1.2. Best practice today is to encrypt as much information as possible and to do key exchange with Diffie-Hellman to enable Perfect Forward Secrecy (PFS). The profiles are well ahead of most other industries as well as IETF's own profiles. 5G is increasingly referred to as critical infrastructure, and as such its security profiling should be state of the art.
For Rel-16 and Rel-17, 3GPP initiated work items specific to security updates, but similar work has been done for much longer under the general TEI work item. For Rel-17, 3GPP aims to mandate support for SHA-256 in the few remaining places where MD5 or SHA-1 is still in use, introduce Curve25519 for low-latency key exchange in IKEv2, enable use of OCSP and OCSP stapling as an alternative to CRLs everywhere, mandate support of DTLS-SRTP and AES-GCM for SRTP, and introduce deterministic ECDSA.
Updating profiles for cryptographic algorithms and security protocols is a process that takes many years because of backward compatibility, as nodes from one release often have to talk to devices from much older releases. Before any weak algorithms or protocol versions are forbidden, the support of strong alternatives needs to have been mandatory for several releases.
Taking into consideration that 3GPP produces approximately one release every 1.5 years, it is essential to mandate the support of new versions of security protocols as soon as possible, as 3GPP did with TLS 1.3. Drawbacks of TLS 1.2 are that it requires a large amount of configuration to become secure and does not provide identity protection; it should therefore be phased out in the future.
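As a small generic illustration (not a 3GPP-specified profile), Python's standard ssl module can enforce a TLS 1.3 floor on a client connection, refusing the TLS 1.2 and earlier versions being phased out:

```python
# Generic illustration, not a 3GPP profile: enforce TLS 1.3 as the
# minimum version using Python's standard library (3.7+).
import socket
import ssl

ctx = ssl.create_default_context()            # secure defaults, certificate checks on
ctx.minimum_version = ssl.TLSVersion.TLSv1_3  # refuse TLS 1.2 and below

host = "www.example.com"  # placeholder host
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        # TLS 1.3 always performs an (EC)DHE key exchange, providing the
        # forward secrecy that the 3GPP profiles call for.
        print(tls.version(), tls.cipher())
```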
Current best practice is to mandate the support of at least two strong algorithms everywhere, so there is always a strong algorithm supported if one of the algorithms is broken. The National Institute of Standards and Technology (NIST) has long functioned as a global standardization organization for cryptographic algorithms. NIST standardizes algorithms in open competitions, inviting contributions from academia all over the world. Both AES and SHA-3 were designed by researchers from Europe. Recently, the Internet Research Task Force Crypto Forum Research Group (IRTF CFRG) has complemented NIST as a global cryptographic Standards Developing Organization (or SDO) and has standardized algorithms like ChaCha20-Poly1305, Curve25519, EdDSA, LMS, and XMSS. NIST has introduced many of the CFRG algorithms within their own standards.
Broken algorithms were once very common, but essentially all algorithms standardized by NIST, IRTF CFRG and ETSI SAGE since 2000 (such as AES, SHA-2, SHA-3, ChaCha20, KASUMI and SNOW 3G) have remained secure, with no practical attacks. Figure 3 gives an overview of broken, weak or legacy algorithms and security protocols. 3GPP has already forbidden most of these and will likely phase out the rest in future releases.
Figure 3: Broken and legacy cryptographic algorithms and security protocols
A big part of future work in upcoming releases will be to introduce quantum-safe algorithms, also known as Post-Quantum Cryptography (PQC). PQC algorithms are cryptographic algorithms that remain secure against attacks from quantum computers; in practice this covers most algorithms except RSA and Elliptic-Curve Cryptography (ECC), which would likely be broken in a matter of hours if somebody built a sufficiently large quantum computer. This is something 3GPP is well prepared for, having already future-proofed protocols like the 5G Subscription Concealed Identifier (SUCI) by allowing ciphertexts and public keys to be several thousand bytes long.
Small quantum computers already exist; however, it is still uncertain when (or if) quantum computers capable of breaking these cryptographic algorithms will be built. 3GPP will likely introduce quantum-safe algorithms long before quantum computers get anywhere close to affecting the security of 3GPP systems. Introducing non-standardized cryptographic algorithms would likely create more risks than it solves, and both 3GPP and the IETF have decided to wait for the NIST standardization of PQC algorithms, which is already in its final round and expected to be ready in 2022-2024. After that, the IETF will standardize the use of PQC algorithms in (D)TLS, IKEv2, X.509, JOSE, and HPKE, and as soon as this is done, 3GPP will introduce the updated IETF RFCs.
Some of the candidates for post-quantum security level 1 in the final round of the NIST PQC standardization are summarized in Figure 4. It seems very likely that one of the lattice-based algorithms will be the main replacement for RSA and ECC, both for Key Encapsulation Mechanisms (KEMs) and for signatures. A KEM provides a simplified interface for key exchange and public-key encryption. Lattice-based algorithms have somewhat larger public keys, signatures, and ciphertexts than RSA, but they are even faster than ECC. As can be seen from Figure 4, PQC is quite practical for most applications. The transition to PQC can be seen as a bigger step than the transitions from 3DES to AES and from SHA-1 to SHA-256, as it may require security protocol changes to a larger degree. Note that PQC algorithms do not rely on quantum mechanics, and software implementations do not require any new hardware.
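To make the KEM interface concrete, here is a minimal sketch of its three operations (key generation, encapsulation, decapsulation). As an assumption for illustration, it builds a toy KEM from X25519, roughly in the spirit of the DHKEM construction in RFC 9180; the NIST lattice candidates expose the same shape of interface, though their internals are entirely different.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def _derive(raw):
    # Hash the raw exchange output into a uniform shared secret.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-kem").derive(raw)

def keygen():
    sk = X25519PrivateKey.generate()
    return sk.public_key(), sk

def encaps(public_key):
    # Sender: fresh ephemeral key pair; the "ciphertext" is its public key.
    eph = X25519PrivateKey.generate()
    shared = _derive(eph.exchange(public_key))
    ciphertext = eph.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    return ciphertext, shared

def decaps(secret_key, ciphertext):
    # Receiver recovers the same shared secret from the ciphertext.
    eph_pub = X25519PublicKey.from_public_bytes(ciphertext)
    return _derive(secret_key.exchange(eph_pub))

pk, sk = keygen()
ct, shared_sender = encaps(pk)
assert decaps(sk, ct) == shared_sender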
Figure 4: Some candidates (post-quantum security level 1) in the third and final round of the NIST PQC Standardization. The performance measurements are single-core on Skylake 2.5 GHz (https://bench.cr.yp.to/ebats.html); lower is better.
128-bit symmetric algorithms will not be practically affected by quantum computers, and NIST currently labels AES-128 as post-quantum security level 1. Even so, 3GPP is moving towards increased use of 256-bit keys and algorithms such as AES-256.
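The arithmetic behind this assessment, stated as a rough rule of thumb: Grover's algorithm, the best known generic quantum key search, needs on the order of 2^(n/2) quantum operations to find an n-bit key, which is still considered infeasible at n = 128 and absurdly so at n = 256.

# Back-of-the-envelope: Grover's algorithm reduces generic key search from
# ~2^n classical steps to ~2^(n/2) quantum steps, still infeasible for n >= 128.
for bits in (128, 256):
    print(f"AES-{bits}: ~2^{bits} classical vs ~2^{bits // 2} quantum steps")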
More information about the algorithms used in mobile networks can be found in the specification series prepared by the 3GPP SA3 working group. For the main profiles used in the security protocols, check 3GPP TS 33.210 and TS 33.310.
To learn and keep up to date on the latest progress in post-quantum cryptography, follow NIST PQC Standardization.
Learn more about the realities of post-quantum cryptography in our previous blog post from 2020.
Discover how 5G fits into mobile communication network security in our guide to 5G network security
Read our summary of the latest standardization work from 3GPP, Release 16 (5G phase 2)
See the original post:
The evolution of cryptographic algorithms - Ericsson
EU rewrites rulebook on science and technology cooperation with the rest of the world – Science Business
After years of fighting against a "Europe first" approach in research funding, policy makers and lobbyists in Brussels now want to explicitly limit access for scientists in countries that flout academic freedom and intellectual property rights.
The Commission says in future it will base its rules for international scientific cooperation on the principle of "open strategic autonomy". In particular, it is drawing up a roadmap on science and technology with China, in which it is seeking to impose stricter terms on cooperation, to ensure EU research organisations and companies can access the Chinese market safely, without needing to worry about potential IP breaches.
"We are not in the autonomy trap, we are keeping our eyes open," Jean-Eric Paquet, director general for research and innovation, told the EU's Research & Innovation Days conference last week.
Paquet said both he and EU research commissioner Mariya Gabriel want "intense" cooperation with China, whose scientific prowess is "spectacular in many respects". But before opening its arms, the Commission is planning to scrutinise China's practices in intellectual property and academic freedom. "We need indeed to do it on a proper basis, and that is the challenge ahead of us in the coming weeks and months," said Paquet.
Researchers remain largely in favour of international R&D cooperation, but they do want limitations to be applied to countries that could use open access to EU programmes to spy on sensitive technologies for economic and military gains.
"We are fully in favour of international cooperation, and we say no to a blind EU first approach," said Kurt Deketelaere, secretary general of the League of European Research Universities. But, he said, the EU should protect itself from unfair and abusive practices by China.
Universities want help in steering a path through the new requirements and say the Commission should list the technologies it wants to protect, and specify the criteria for international cooperation universities should apply when weighing up collaboration projects with the rest of the world.
Heads of Canadian, American and UK universities are spending time with people from the national security services to track potential security threats via science collaboration, according to Deketelaere.
"It would be very useful if the Commission took the initiative to open up channels for similar discussions between the EU's 27 national security services," said Deketelaere. "We have to take this seriously," he said.
Germany has unilaterally put in place measures to make scientists and universities aware of the challenges posed by international cooperation, said Susanne Burger, head of European and international cooperation at Germany's federal ministry for education and research. "When dealing with one of the systemic rivals, [universities] can approach us," she said.
"Our aim is not to control the freedom of science, or to interfere in any case," said Burger. "What we do is to just help [universities] discover the traps in international cooperation."
As a sign of a tougher stance, last week German authorities arrested a scientist who had been passing sensitive information from a German university to Russian intelligence services.
Research spies
German MEP Reinhard Bütikofer said China has been exploiting its relationship with the EU. "I think it's not a trust-building measure if they send scientists and don't say that these scientists are working on behalf of the Chinese armed forces, and that they are here to spy on our technology in order to enhance China's civil-military fusion programme, and to help them in building an even stronger military at the service of their aggressive foreign policy," he said.
Bütikofer was recently banned from entering China. In March, the EU announced sanctions against officials of the Chinese Communist Party who are thought to be involved in human rights abuses in the Xinjiang province. In retaliation, Beijing imposed sanctions on seven EU politicians, Bütikofer included.
As a long-time critic of the Chinese government, Bütikofer is calling for the EU to ensure science cooperation is based on commonly agreed principles and rules. "You cannot build the relationship with a systemic rival on the basis of blind trust," he said.
Autonomy trap
It is ironic that, as recently as 2018, EU research lobbies and policy makers in Brussels were quick to dismiss a proposal by Romanian MEP Dan Nica to make the Horizon Europe programme exclusive to EU researchers.
The Commission's pivot to open strategic autonomy is a significant departure from the position of the previous EU research commissioner Carlos Moedas, a leading proponent of science diplomacy. As the 2021-2027 Horizon Europe programme was taking shape, Moedas frequently called for EU research and innovation to be "open to the world".
Despite believing there should be some restraints, Deketelaere warned that the EU should take care not to isolate itself from the international scientific community. "Organisations that feel weak, or are becoming weaker, very often tend to ask for more autonomy," he said. "And with asking for more autonomy, they often isolate themselves more than they already were, and then they become even weaker as a consequence. And at the end of the day, of course, they collapse," said Deketelaere.
As one example of how these things can be finessed, the Commission and member states recently came to an agreement on third country access to sensitive research projects in quantum science and space that are funded through Horizon Europe.
The terms on which they take part will be negotiated separately with each associated country, the Commission agreed, after caving in to pressure from member states. Before that, the Commission planned to introduce a blanket ban on non-EU researchers and companies in all quantum and space projects.
"Quantum is the future of computation, is the future of communication, is really the next breakthrough in terms of technology," said Roberto Viola, the Commission's director general for communications technology. "I think there's no doubt that we want a quantum computer available for our scientists and for our companies."
The EU is trying to boost its industrial competitiveness in key emerging technologies and make sure it has the capabilities needed to immunise itself against shortages of microelectronics and medical devices.
A global shortage of microchips, paired with significant delays in international shipping, has forced EU manufacturers of cars and medical devices to reduce or suspend production. "This is really impairing the capability of Europe to produce," said Viola.
Viola said it is in Europe's interest to build open strategic autonomy so that such situations can be avoided in the future. "There's not one way to deliver autonomy. It's a combination of being stronger technologically speaking in Europe, and also being smarter when it comes to partnerships," he said.
Europe should be able to build its own technologies and pay more attention to the rules of engagement with the rest of the world. "We want those technologies to be used for peaceful use and no country, no company outside Europe should use these technologies against us," said Viola.
The EU recently announced it will establish closer research ties with the US and Canada, two countries which have already put in place new rules of engagement with China, for fear of technology and science espionage.
"It is, I think, an obvious fact that we have to make a distinction, whether we are dealing with likeminded partners, or whether partners do not necessarily share our values on the principles of transparency, or reciprocity, or other core concerns," said Bütikofer.