
The Convergence of Communication and Computation with Dr. Vida Ilderem – RCR Wireless News

What type of talent is essential for the future of wireless communications? We need people with a systems mindset. The guest for today's episode is Dr. Vida Ilderem, Vice President at Intel Labs. Vida discusses with Carrie Charles how wireless communications will require a plethora of expertise to succeed in the future, all the way from engineering to social science. That's why we need talent that knows how to bring multiple disciplines together and work as one. Join the conversation to discover the future of wireless communications and why we need a systems mindset.

I have a wonderful guest with me. Her name is Vida Ilderem. She is Vice President at Intel Labs and Director of Wireless Systems Research at Intel. Vida, thank you so much for joining me. I can't wait for our conversation.

Thank you, Carrie. It's a great privilege to be here. Thank you for the opportunity to share my point of view.

I'd like to start first, Vida, by asking you about your professional journey. You're an executive at Intel with a very important position there. What has been your journey? How did you get from where you were to where you are now?

I have a PhD in Electrical Engineering from MIT, and in my career in electronics I'm now running wireless systems research. It's quite a journey to talk about here. Intel is my third company. The journey has been very exciting and frustrating because you have boulders and rocks on the road. You have to go around them or you have to remove them. You can always use help from other people. When I reflect on my journey, I can put it in three phases. Phase one, post-graduate: you're coming from a school with a good name, so the expectations are very high. There is pressure for you to perform. You have to establish yourself, show your expertise, and earn the trust and respect of your colleagues.

That was the starting point. The second phase is where I was able to broaden my skillset. I grew up in Motorola, which was my second company. I went into the RF side. I got deep expertise in RF, from silicon all the way into simulation, modeling, packaging, and working with our business groups, so I knew manufacturing pretty well. I joined Motorola Labs, and I had a sponsor who said, "You're too much in your comfort zone. You need to become uncomfortable." I joined Motorola's applied research side, and that was where I expanded my background, because we had physical sciences, which included nanotechnology, displays, printed electronics, and fuel cells. It was a variety of technologies.

Later on, I added computer architecture, security, imaging, and wireless sensor networks. It was getting broader. I had fantastic people working with me, so I learned a lot. I'm a technical person. When I enter a domain, I like to learn about it while depending on my people to drive it. The third phase, where I am now, is about growing people. You have to love what you're doing; that's a given. We achieve through people. It's about how to grow them and help them achieve their aspirations. Once they do that, the company is successful too. I'm in that leg of my journey.

I love what you said about growing people; that's so important. We're going to talk a little bit more about that later. Tell me about you. Describe your role as Director of Wireless Systems Research at Intel. Also, what is Intel Labs?

If you want to go to my team's mission statement, we look at bringing seamless connectivity to improve the quality of experience for people and things, because now things are being instrumented too. We start from silicon, we have design, we have IP, and we go all the way to the system level. We have algorithms and we have protocol designs. One of the joys of the job is the multicultural, multidisciplinary work with different teams. I think that's what's exciting and what makes us successful, frankly. That's what we do. Intel Labs is a corporate lab of about 700-plus people, two-thirds with PhDs, with a very diverse set of skillsets: from engineering and science, physicists and theorists, all the way to anthropologists and psychologists.

We need social sciences because you need to understand what users want and need; as engineers and scientists, that's not our strength. That symbiotic relationship between the different disciplines is very important. Intel Labs also has a global footprint. We are in the US, in Guadalajara, Mexico, in China, in India, and in Europe, Israel, and Germany. We have a good footprint and access to talent globally. The lab also operates on three tenets or pillars. One is exploration. What do we mean by exploration? People always ask, "How do you come up with an idea? How do you know what to do?" It's about partnerships with academia, government, and industry. Those partnerships are very important to not only see but also understand where the world is going.

Second, we innovate. The world is becoming very data-centric. Our charter is to take advantage of data and bring its full potential to the table. We are about sensing, analyzing, moving, storing, computing, securing, and designing: all the elements that go into digitizing the world and its data. Finally, we advise. We want to be the trusted advisor to our colleagues in the corporation. Those are the three pillars that we operate on in Intel Labs.

I know that 5G is a big part of your work. In your view, what's the current state of 5G? Where are we now? Where are we going? Any new developments that you can share with us that you or your team are seeing?

We started on 5G around 2011 or 2012, working with universities and coming up with the spec and everything else; it takes a long time. It was kicked off around 2015 or 2016. 5G deployment depends on the standards, which come from the body called 3GPP. The two releases that exist already are called Release 15 and Release 16. These releases are about enhanced mobile broadband communication. It's more human-to-human communication, giving me more data rate, more capacity, and better coverage. It builds on a 4G-type environment. Release 17 is in the works, and now there are two more legs to 5G, which are about machine-type communication.

One is the billions of devices coming in. These are battery-operated devices in the field, using low-frequency spectrum, less than 2 gigahertz. That's in the works. The other side is called mission-critical machine-type communication. This is where you want at least five-nines reliability. You're going to have very low latency, basically real-time or near-real-time response. When 5G deployment started, you saw more of the mobile broadband, human-to-human communication. Machine communication is now being deployed. On spectrum: the machine types are below 2 gigahertz, and enhanced mobile broadband, human to human, is below 6 or 7 gigahertz, mostly around 3.5 gigahertz, which is the CBRS band.

It is a matter of your system. There's the millimeter-wave side, which is 28 and 39 gigahertz in the US, looking at improving data rates even more, lowering latency, and that type of thing. I will say the uptake of millimeter-wave has been slow, but it's still in the works. That's where I would say the state of 5G is now. As for where we're going, one thing that has happened with the deployment of 5G is the question of where the data is created. That's what we have to think about. Data is created at the devices and sensors in the field. We have a lot of data. It's the advent of IoT with 5G enablement, so we call it a tsunami of data. This created the concept of the Edge. We talk about the Edge network, or mobile edge computing.

This is where the data is created. You have devices and the radio access network, which is now represented by the Edge. You go to the core network and the data center or cloud computing. With Edge, where we are going is that the computation is getting closer to where the data is created. That's growing. The exciting thing here is that you now have communication and computation working together. That reduces latency significantly because you're where the data is; it's a shorter round-trip time. That's where we're going. Another development, which is not new but is taking off now, is what we call private networks. This is where you look at a local area network that delivers very targeted coverage; think stadiums or a factory. That is taking off too, with 5G and Wi-Fi coming together. Those are the next things we see in this domain.

You spoke a little bit about how communications and computation have merged or come together. How do you see that in the future? What does this merge look like in the future?

I would say that because of the Edge, your devices are everywhere. You have many Edges, which are heterogeneous. What do we mean by heterogeneous? You have different devices and protocols: 3G, 4G, 5G. You have dynamism, with nodes coming and going. Your car is a very high-velocity example. The Edge is distributed. We have many Edges, so the computation is distributed. What we see is distributed compute all over the network. The network enables the computation to come to the data, and the computation enables the network to perform more efficiently. There is also the storage angle, where you store the data. That's the coming together of these two.

When we talk about the next G, what barriers or challenges do we face in bringing 6G to life? What's next?

Standardization works on everything. Each G takes about ten years to peak, but when you have that G, you still have the previous Gs. Let's see what the previous generations were, to set it up for 6G in this case. 1G was analog, so nobody even talks about it. 2G was about voice. Voice was king for the first digital set of networks. 3G became more about broadband. Now you have voice plus data. People started downloading a lot of images and videos. 4G brought mobile broadband on the go: I want to take my information anywhere.

I want to have access to my data. 5G is the first generation that brings the machines in. Building on human-to-human communication, it brings in machine-type communication and human-to-machine communication as well. It's transforming the network, because the network was not built for machines. Why do I say that? Many of these things come online like a blip: they transmit and go back down. The requirements are very different from streaming a live video. What 5G did is allow us to build a new type of network. What is 6G? It's all about data.

Data is queen. I like to say queen instead of king.

We have this large amount of data created, and you want to make sense of it. You don't want to transport or move the raw data, because it takes a lot of bandwidth and energy; there's a huge overhead associated with that. This is when AI comes in: artificial intelligence, with machine learning as a subset. Now you want to make sense of the data you've collected and send only the information associated with it. 6G brings that third pillar. Now you have communication, computation, and AI to help with the real-time requirements, the lower latency, the higher reliability, and making sense of the information. That's where we see it going: a confluence of these three disciplines. You still have to optimize power, performance, and quality of service, and security is becoming important too. It's about data.

Has your team already started research on 6G?

Yes. Let's go back to partnerships. We're working with academia, and in this case, NSF. We have two centers with NSF. The first one is called MLWiNS, which is Machine Learning for Wireless Intelligent Network Systems. This center kicked off with NSF in September of 2020. It's looking at three research vectors. One is how to apply machine learning to improve wireless performance. Another is how to make better use of spectrum through machine learning. Finally, since we said everything is going to be distributed, how to do distributed learning over wireless and apply that learning to wireless itself. That's MLWiNS. There's a new center with NSF that nine companies have joined; it's in proposal mode right now. It's called RINGS, which is about Resilient Intelligent Next-Gen Systems. It's about looking at the resiliency of the network, which we believe is very important, and security: how do you gracefully degrade the service rather than bring it down? That addresses all the protocol layers.

The future is so exciting, and I can't help but think about the people component. We're going to need not just a certain number of people but a certain type of talent. In your perspective, what type of talent is needed now and for the future in wireless communications?

I would say that because we are looking at comms, compute, and AI all coming together, it's very hard for one person to have all these competencies, but you need some depth in one of them, and then you can broaden yourself in the others. Because of this multidisciplinary type of research, you have to have a systems mindset. We need somebody with that mindset. You need different competencies: obviously computer science and engineering, math, and even, as I said, social science and information theory. It's a plethora of expertise you need to be successful in the future. I don't think you can embody all of that in any one person, but I look for people who have the capability of being deep in one or two disciplines while having a bigger sense of the domain, because everything is touching everything now. There is a fusion going on here.

Do you feel that universities are producing the right type of talent that we need and in what you discussed?

Yes, and we do work with universities, so we do get interns. When they come on board, we leverage the expertise they bring, and they also get exposed to the team members and what they're working on. We have also established centers directly with universities, separate from NSF. We call them Intel Science and Technology Centers. In this case, we have one PI at the university and a principal investigator on our side. We work very closely with that group of professors and their students. We bring that multidisciplinary thought process in. Obviously, it's very hard to scale that, but we've been doing it ever since I've been with Intel. It's a pretty good way of showing people what you need to be successful.

How do you apply wireless expertise to other domains at Intel? Lets say using the wireless expertise and using those transferable skills in another area. What are some examples of that at Intel?

Wireless is about waves, and my team works at high frequencies. A great example I can give you is quantum computing. You have your quantum bits sitting at very low temperatures, in millikelvins. Usually, your instrumentation is sitting at room temperature, so you have all of these wires, possibly thousands of wires, coming out of the refrigeration system. What my team has done, using our RF understanding and design capability, is design a controller that sits at 4 kelvin, so still cold, but not the coldest, not in the millikelvin range. We call it Horse Ridge. We have published that at ISSCC, which is a conference for this field, and it has also appeared in the press. It's where we took what we know from our designs and applied it to quantum computing. Who would have thought about applying radio frequencies rather than having a wire going all the way down to millikelvin? That's an example of how we are extending our expertise into a different field.

That's a perfect example of re-skilling and up-skilling, and I do think it's exciting. When we look at the future, I know some people might be nervous about automation and machines, but what you've described is that perfect use of transferable skills: saying, "You know this already. You can learn this and be of great value to the organization and society." That's exciting news. I want to switch gears a bit and ask you some questions about your path and about being a female executive in the tech space. What challenges have you faced on your climb to the top, and what are some lessons you've learned?

I would say number one is networking. I was not good at it. I thought my work would speak for itself, and that's not true. Networking in the sense of talking about your work, across different mediums. That communication is so important. Network with your peers and colleagues and learn from them. I think I missed a bit there, starting from graduate school through to work. It was a hard lesson, but it's a learning experience. Some of us are wired for it, no pun intended. As women, I also think we need to support each other more. There are very few of us, and we shouldn't compete against each other. It's very important to establish a reputation and expertise and earn respect. Those are some lessons learned. And work hard: it's not delivered to you on a silver platter. You have to work for it.

I know that you mentor quite a few women. In your role, you serve as a mentor. What should women keep in mind as they strive toward those executive roles? Can you give us a few tips?

Network, talk about your work, and support each other. It's also important to have a sponsor. A sponsor is different from a mentor. A sponsor is basically an advocate who talks about you or your work when you're not in the room. A mentor is more of a one-to-one relationship focused on how to get better at whatever the person needs. The sponsor is important. You can be purposeful about it, or you can have a sponsor you don't even know you have. That goes back to communication: talking about your work and people seeing the results you deliver. You have to deliver results. That would be my tip. Find the advocate. I think that helps.

Let's say you're looking for a sponsor, or there's a woman who needs a sponsor. Is this something where she reaches out to someone and requests that person be her advocate? I know you spoke briefly about it, but what do you tell your mentees when they are in need of a sponsor?

I usually offer names and say, "Would you like me to approach these people?" It depends on the mentee's function and role. With that permission, I approach the person and say, "Would you be willing to be a sponsor for such and such?" I include a brief description, a short bio of the individual: what they do, who they are, their essence. If they say yes, I do the introduction. I hook them up. Also, we have a sponsorship program at Intel, which has been very helpful and powerful for upcoming women.

I know that diversity is a huge challenge at all companies. It's even more of a challenge now, post-COVID, because so many women have left the workforce. I know this is a much bigger question than we can answer right here, but give me a few of your thoughts. What can companies do to bring more diversity to their leadership teams?

Number one, you need a healthy pipeline. If you don't have a pipeline, nothing else matters. I would say you have to activate at the university level. You bring people in at the lower grade levels, and they grow in the company. They learn the ropes. The other important thing is to give people the opportunity, because when we apply for something, we want to make sure we check all the boxes. It is okay to apply with 60% of the boxes checked. Have the courage to go forward. That courage is important. I rely on job shadowing and rotation opportunities to grow people, to show them what a role means. At Intel in the US, we offer sabbaticals, so maybe you give someone coverage of a bigger role during a sabbatical as their next opportunity.

Those are some of the important things, along with the sponsorship I talked about, to help advocate for them. The other thing we've done is create a cohort of executive women so that we support each other and the upcoming women. It's been pretty powerful and very important, because you have to keep your eye on the ball. You have to develop the pipeline and the next generation of women and minorities. That's why I emphasize the pipeline and sponsorship advocacy. The other thing COVID showed us is family benefits, which go for both men and women; it's not diversity-specific. What benefits do you offer? People can work from home, or can have peace of mind with their children there, or get whatever support they need. Those are very important pieces going into the pipeline side of it.

I could talk to you forever. I feel so much smarter after talking to you. I've learned so much. This has been wonderful. It's not every day that I get to talk to a PhD in tech, and I'm so honored by this. The last question is: what new technologies are you working on now? What's most exciting to you?

What excites me is challenges, when somebody says, "This cannot be done." I have a very strong team. One obvious angle we've talked about is going towards 6G: what does the future look like for wireless? Another angle is how we can apply our competencies to a new area like quantum computing. By training, I'm an experimentalist. Anything where we can create IP and designs and take them all the way through the process to a proof-of-concept system is very exciting for me personally and for my team, because we all learn. We have people with different passions and expertise, and they get exposed to different areas. If you stop learning, the excitement goes away. For me, it's about learning new things and keeping up with reading. And let me tell you, the ideas come. I love watching sci-fi Hollywood movies.

I think there's a lot of future there. You can get a lot from the sci-fi movies.

You get to pick up ideas from movies.

Look at The Jetsons. When I was younger, that's what I watched. Now all of that seems to be coming to reality soon: flying cars and all of that.

Autonomous systems are the next big thing. All this connectivity, communication, and AI is what's enabling them. Of course, on the other side, there are safety, reliability, and all the policies and regulations, which are additional things needed to make it real. That's some exciting stuff on the table.

One more question. Tell me about your creative process and how you innovate and come up with ideas. How do they come to you, besides the sci-fi movies? I'm curious.

I read a lot, not just technical journals but other types of journals, like business journals. I'll give you an example from my previous life at Motorola. I read a book on materials; it was on construction. I was looking for movable materials, materials that change shape or characteristics based on temperature or voltage, for the application I was thinking about. I opened this book, which is all pictures, and flipped through it. There were materials out there that did what I was looking for, but they were in totally different disciplines.

There were these paints that change color when you touch them. Imagine you have a cell phone with a battery and no battery cover. As the battery heats up, it changes color; it tells you it's time to do something. There were a lot of ideas I picked up from there. We ordered the materials, we worked with the team, and they went for it. It's how you connect the dots, but you don't have to be in the wireless domain to have bright ideas. They're all over the place. It's how you apply different things and bring them to your domain.

I appreciate that, especially the reading part, because I do believe that these days we don't make as much time for reading. The more brilliant people and leaders I speak with, the more I notice that everyone reads. I think it is something we need to make time for. It does produce those innovative ideas where you can connect the dots. Thank you for sharing that. I'm going to do more reading. I read a lot, but I always need to read more.

It gets the juices flowing and gets you thinking.

Vida, thank you so much. This has been an absolute joy for me. I appreciate you coming on the show, and I know our audience has gotten a great deal from this interview. Thanks for coming on.

Thank you for the opportunity.

You take care.

Dr. Vida Ilderem is Vice President and Director of Wireless Systems Research (WSR) at Intel Labs. WSR explores breakthrough wireless technologies to fulfill the promise of secure, energy-efficient, seamless, and affordable connection and sensing for people and things. Prior to joining Intel in 2009, Vida served as Vice President of Systems and Technology Research at Motorola's Applied Research and Technology Center, where she was also recognized as a Motorola Distinguished Innovator.

Vida holds a doctorate and a master's degree in electrical engineering from the Massachusetts Institute of Technology, and a bachelor's degree in electrical engineering and a bachelor's degree in physics from California State University, Fresno. She has 27 issued patents and has given numerous invited talks and keynotes at IEEE and other venues on nanotechnology, RF, IoT & wearables, 5G/5G+, and innovation.


Quantum Computing Market is anticipated to surge at a CAGR of 33.7% over the next ten years – PRNewswire

NEW YORK, July 19, 2021 /PRNewswire/ -- As per the findings of a revised market research report by Persistence Market Research, the worldwide quantum computing market reached a valuation of around US$ 5.6 Bn in 2020 and is anticipated to surge at a CAGR of 33.7% over the next ten years.
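As a sanity check on these figures: the release does not state the end-of-period market size, but compounding the stated US$ 5.6 Bn 2020 base at a constant 33.7% annual rate for ten years implies a value of roughly US$ 102 Bn. A minimal sketch of that arithmetic (the function name is illustrative, not from the report):

```python
def project_value(base, cagr, years):
    """Compound a base valuation at a constant annual growth rate (CAGR)."""
    return base * (1 + cagr) ** years

# Stated base: US$ 5.6 Bn in 2020, growing at a 33.7% CAGR for ten years
implied = project_value(5.6, 0.337, 10)
print(round(implied, 1))  # ~102.2 (US$ Bn), the implied end-of-decade figure
```

This is only the implied projection under the release's own assumptions, not a number the report publishes.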

Major companies are developing quantum computers focused on delivering free access to their quantum systems through cloud platforms, with the objective of creating awareness and a community for developers working on quantum computing technology. Through this new way of offering access, companies are targeting universities, research groups, and organizations focused on quantum computing to practice, test, and develop applications of quantum computing.

Key Takeaways from Market Study

Request for sample PDF of report: https://www.persistencemarketresearch.com/samples/14758

"Growing trend of cost-effective cloud quantum computing along with technological advancements and rising governmental investments to develop quantum computing solutions for commercial applications to propel market growth," says a Persistence Market Research analyst.

Pharmaceutical Industry Preclinical Drug Discovery and Development of Personalized Medicine

Quantum computers are computational devices that use the dynamics of atomic-scale objects to manipulate and store information. Current methods in drug synthesis involve significant approximations at the molecular and atomic level. Material science and pharmaceutical vendors use a variety of computationally exhaustive methods to evaluate molecular matches and predict the positive effects of potential therapeutic approaches.

Ask an expert for any other query: https://www.persistencemarketresearch.com/ask-an-expert/14758

Accurate predictions often require lengthy simulation processes on current binary computing systems, and it can take years and cost millions of dollars to achieve the desired result. There is an opportunity for quantum computing to replace existing binary systems in drug discovery processes, as quantum computers can analyze large-scale molecules in less time. Also, the high computational power of quantum computers opens up the possibility of developing personalized medicines based on an individual's unique genetic makeup.

COVID-19 Impact Analysis

The COVID-19 pandemic has disrupted many industries, including the quantum computing space. Demand for quantum computing software, machine learning, cloud-based quantum computing, artificial intelligence (AI), and quantum computing-as-a-service increased during lockdowns. This is fueling demand for quantum computing software and services.

During the outbreak, manufacturing as well as design and development of quantum computing devices declined by nearly 5%-7% in Q3-Q4 2020, due to falling production across East Asian and North American factories, as both regions are the world's major quantum computing device manufacturers and suppliers. However, according to the report, production largely stabilized in the first half of 2021, with demand gaining traction again.

Large quantum computing enterprises in North America, Europe, Canada, China, Australia, India, and Russia are investing in qubit research, while also giving researchers access to cloud-based and commercial cloud services. Overall, the market for quantum computing is projected to grow faster from Q3-Q4 2021 onwards.

Get full access of report: https://www.persistencemarketresearch.com/checkout/14758

Find More Valuable Insights

Persistence Market Research puts forward an unbiased analysis of the global quantum computing market, providing historical demand data (2016-2020) and forecast statistics for the period 2021-2031.

To understand the opportunities in the market, it has been segmented on the basis of component (quantum computing devices, quantum computing software, and services (consulting services, implementation services, and support & maintenance)), application (simulation & testing, financial modeling, artificial intelligence & machine learning, cybersecurity & cryptography, and others), and industry (healthcare & life sciences, banking & financial services, manufacturing, academics & research, aerospace & defense, energy & utilities, IT & telecom, and others) across major regions of the world (North America, Latin America, Europe, East Asia, South Asia & Pacific, and MEA).


About Persistence Market Research:

Persistence Market Research (PMR), a third-party research organization, operates through an exclusive amalgamation of market research and data analytics, helping businesses ride high irrespective of the turbulence faced on account of financial or natural crunches.

Overview:

Persistence Market Research is always ahead of its time. In other words, it tables market solutions by stepping into companies'/clients' shoes well before they themselves have a sneak peek into the market. The proactive approach followed by experts at Persistence Market Research helps companies/clients lay their hands on techno-commercial insights beforehand, so that the subsequent course of action can be simplified on their part.

Contact

Rajendra Singh, Persistence Market Research. U.S. Sales Office: 305 Broadway, 7th Floor, New York City, NY 10007, United States. Phone: +1-646-568-7751. USA-Canada Toll-Free: 800-961-0353. Email: [emailprotected]. Visit our website: https://www.persistencemarketresearch.com

SOURCE Persistence Market Research Pvt. Ltd.


Quantum Computing for the Future Grid – Transmission & Distribution World

The electric power grid is undergoing unprecedented change. This change is due to decarbonization efforts, increased reliance on renewable and variable generation resources, the integration of distributed energy resources, and transportation electrification. In turn, these changes have required electric utilities to expand their monitoring and measurement efforts through metering infrastructure and distribution automation initiatives. All these efforts have resulted in the collection of mountains of data from the electric grid. While this significant increase in data collection enables better monitoring of the grid and enhanced decision making, we still need a robust computational foundation that can convert all this collected big data into actionable information.

As mathematical challenges increase and data becomes core to modern utility decision-making, our industry needs to make progress and draw from emerging analytics and computing technologies. Quantum computing is a ground-breaking information processing technology that can support efforts to address power system challenges and enable the grid of the future. Given the promising applications to the power grid, this is an area of research that has really caught my attention lately. While quantum computing applications to the power grid have remained mostly unexamined, forward-looking utilities are exploring the next step to enhance these analytics by understanding how emerging quantum computing technologies can be leveraged to provide higher service levels.

Building the future grid will require an overall view of the quantum computing technology applications in power systems, such as the dynamic interaction of the transmission and distribution systems. According to a recent IEEE article by Rozhin Eskandarpour and a team of researchers from the University of Denver Electrical and Computing Engineering Department, current computational technologies might not be able to adequately address the needs of the future grid.

The most notable change is in the role of the distribution grid and of customers in system design and management. Transmission and distribution systems have traditionally been operated as distinct systems but are becoming more of an integrated system. The underlying assumption was that the transmission system would supply a prescribed voltage at the substation, and the distribution system would deliver the energy to individual customers. However, as various types of distributed energy resources, including generation, storage, electric vehicles, and demand response, are integrated into the distribution network, distinct interactions arise between the transmission and distribution systems. Distributed generation's transient and small-signal stability problems are one instance that changes the energy system's dynamic nature. Therefore, developing more comprehensive models that capture the dynamic relationships between transmission and distribution systems, along with computational tools that can solve such models, will be essential in the future. Furthermore, better scheduling models are needed to design viable deployment and use of distributed energy resources.

Eskandarpour et al. describe other potential quantum computing applications for the power grid, including optimization, planning, and logistics; forecasting; weather prediction; wind turbine design; cybersecurity; grid security; and grid stability.

Given that I am both professionally embedded in covering the newest innovations within the power sector and nearing the end of a Ph.D. program at the University of Denver, it is not particularly surprising that a new university-industry research consortium has caught my attention. I am excited to share about this ground-breaking initiative and its potential role in building the future grid.

The University of Denver, in collaboration with various utilities, has established a consortium related to envisioning the quantum upgraded electric system of tomorrow. QUEST is the clever acronym that has been adopted for this university-industry consortium. The consortium aims to enhance university-industry collaborations to solve emerging challenges in building the future grid by utilizing quantum information and quantum computation. The consortium will develop new quantum models, methodologies, and algorithms to solve a range of grid problems faster and more accurately. Topics of interest include:

Industry members financially support the QUEST consortium, and membership is voluntary and open to any public or private organization active in the power and energy industry. For more information, contact Dr. Amin Khodaei at the University of Denver, School of Engineering and Computer Science.

Read more:
Quantum Computing for the Future Grid - Transmission & Distribution World

Red Hat embraces quantum supremacy as it looks to the future – SiliconANGLE News

Since its founding in 1993, Red Hat Inc. has seen significant growth and witnessed firsthand the transformation from an analog to a digital economy.

With years of experience under its belt, Red Hat is looking on the horizon to prepare for emerging technology with its partnership with IBM Corp., giving it a front-row seat to technological progress. The software company employs a variety of experts across different departments to maintain the massive overhead of running a large tech business.

"We typically organize our teams around horizontal technology sectors," said Stephen Watt (pictured, right), distinguished engineer and head of emerging technologies at Red Hat. "I have an edge team, cloud networking team, a cloud storage team, application platforms team. We've got different areas that we attack work and opportunities, but the good ideas can come from a variety of different places, so we try and leverage co-creation with our customers and our partners."

Watt, along with Parul Singh (pictured, left), senior software engineer at Red Hat, and Luke Hinds (pictured, middle), senior software engineer at Red Hat, spoke with John Furrier, host of theCUBE, SiliconANGLE Media's livestreaming studio, during the recent Red Hat Summit. They discussed quantum supremacy, how Red Hat manages its consumers' needs, signature server and more. (* Disclosure below.)

One of the many new technologies emerging is quantum computing, which uses qubits instead of bits and can explore an exponentially larger state space than its classical counterpart.

"Quantum computers are evolving, and they have been around, but right now you see that they are going to be the next thing," Singh said. "We define quantum supremacy as, say you have any program that you run or any problem that you solve on a classical computer, a quantum computer would be giving you the results faster."

Because quantum computers are not as easily accessible as classical computers, Red Hat has sought out a solution that combines OpenShift's classical components with quantum computing, taking the results and integrating them into classical workloads.

Signature server, or sigstore, is an umbrella organization containing various open-source projects.

"Sigstore will enable developers to sign software artifacts, bills of materials, containers, binaries, all of these different artifacts that are part of a software supply chain," Hinds said. "It's very similar to a blockchain. It allows you to have cryptographic-proof auditing of our software supply chain, and we've made sigstore so that it's easy to adopt, because traditional cryptographic signing tools are a challenge for a lot of developers to implement in their open-source projects."

Open-source boasts the advantage of being transparent, allowing everyone to see the code with no hidden surprises or security issues lurking underneath. Another advantage of open-source software is agency, according to Watt.

"If you're waiting on a vendor to go do something, if it's proprietary software, you don't have much agency to get that vendor to go do that thing. Whereas with open source, if you're tired of waiting around, you can just submit the patch," he said. "So people can then go and take sigstore, run it as a smaller internal service. Maybe they discover a bug. They can fix that bug, contribute it back to the operationalizing piece, as well as the traditional package software, to make it a much more robust and open service. So you bring that transparency and the agency back to the software-as-a-service model as well."

Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's coverage of Red Hat Summit. (* Disclosure: TheCUBE is a paid media partner for Red Hat Summit. Neither Red Hat Inc., the sponsor for theCUBE's event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)


Read more here:
Red Hat embraces quantum supremacy as it looks to the future - SiliconANGLE News

Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

Digital computing has limitations with regard to an important category of calculation called combinatorics, in which the order of data is important to the optimal solution. These complex, iterative calculations can take even the fastest computers a long time to process. Computers and software that are predicated on the assumptions of quantum mechanics have the potential to perform combinatorics and other calculations much faster, and as a result many firms are already exploring the technology, whose known and probable applications already include cybersecurity, bio-engineering, AI, finance, and complex manufacturing.

Quantum technology is approaching the mainstream. Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as soon as five years. Honeywell anticipates that quantum will form a $1 trillion industry in the decades ahead. But why are firms like Goldman taking this leap, especially with commercial quantum computers possibly years away?

To understand what's going on, it's useful to take a step back and examine what exactly it is that computers do.

Let's start with today's digital technology. At its core, the digital computer is an arithmetic machine. It made performing mathematical calculations cheap, and its impact on society has been immense. Advances in both hardware and software have made possible the application of all sorts of computing to products and services. Today's cars, dishwashers, and boilers all have some kind of computer embedded in them, and that's before we even get to smartphones and the internet. Without computers we would never have reached the moon or put satellites in orbit.

These computers use binary signals (the famous 1s and 0s of code), which are measured in bits or bytes. The more complicated the code, the more processing power required and the longer the processing takes. What this means is that, for all their advances, from self-driving cars to beating grandmasters at chess and Go, there remain tasks that traditional computing devices struggle with, even when the task is dispersed across millions of machines.

A particular problem they struggle with is a category of calculation called combinatorics. These calculations involve finding an arrangement of items that optimizes some goal. As the number of items grows, the number of possible arrangements grows exponentially. To find the best arrangement, today's digital computers basically have to iterate through each permutation to find an outcome and then identify which does best at achieving the goal. In many cases this can require an enormous number of calculations (think about breaking passwords, for example). The challenge of combinatorics calculations, as we'll see in a minute, applies in many important fields, from finance to pharmaceuticals. It is also a critical bottleneck in the evolution of AI.
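The exponential blow-up described here is easy to see in a toy combinatorial problem. The sketch below (illustrative data, not from the article) brute-forces a tiny traveling-salesman-style tour exactly as the paragraph describes: enumerate every permutation, score each, keep the best. With n stops there are (n-1)! orderings to check, which is what overwhelms classical machines as n grows.

```python
from itertools import permutations

# Toy combinatorial optimization: visit every city once, minimizing total
# distance.  Brute force must examine (n-1)! orderings.
dist = {
    ("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
    ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 8,
}

def d(x, y):
    # Distances are symmetric; look up either ordering of the pair.
    return dist.get((x, y)) or dist[(y, x)]

def tour_length(order):
    stops = ("A",) + tuple(order) + ("A",)  # fixed start and end at A
    return sum(d(a, b) for a, b in zip(stops, stops[1:]))

# Iterate through every permutation and keep the best, exactly the
# exhaustive procedure the text describes.
best = min(permutations(["B", "C", "D"]), key=tour_length)
print(best, tour_length(best))  # ('B', 'D', 'C') 23
```

Three free stops means only 6 permutations, but 20 stops would already mean about 1.2 * 10^17, which is the wall quantum approaches hope to get around.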

And this is where quantum computers come in. Just as classical computers reduced the cost of arithmetic, quantum computing promises a similar cost reduction for daunting combinatorics problems.

Quantum computers (and quantum software) are based on a completely different model of how the world works. In classical physics, an object exists in a well-defined state. In the world of quantum mechanics, objects only occur in a well-defined state after we observe them. Prior to our observation, an object's state, and how two objects are related, are matters of probability. From a computing perspective, this means that data is recorded and stored in a different way, through non-binary qubits of information rather than binary bits, reflecting the multiplicity of states in the quantum world. This multiplicity can enable faster and lower-cost calculation for combinatorics problems.
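The bit/qubit contrast can be made concrete with a minimal classical simulation of a single qubit (an illustration of the math, not real quantum software): where a bit is 0 or 1, a qubit is two complex amplitudes, and a gate such as the Hadamard puts it into a superposition of both values.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate H = 1/sqrt(2) * [[1, 1], [1, -1]]
    to a single-qubit state given as a pair of complex amplitudes."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

qubit = (1 + 0j, 0j)        # the definite state |0>, like a classical 0
qubit = hadamard(qubit)     # now an equal superposition of |0> and |1>

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = [abs(amp) ** 2 for amp in qubit]
print([round(p, 3) for p in probs])  # [0.5, 0.5]

# An n-qubit register needs 2**n amplitudes, which is why classically
# simulating even 50 qubits is already intractable.
print(2 ** 50)  # 1125899906842624 amplitudes
```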

If that sounds mind-bending, it's because it is. Even particle physicists struggle to get their minds around quantum mechanics and the many extraordinary properties of the subatomic world it describes, and this is not the place to attempt a full explanation. But what we can say is that quantum mechanics does a better job of explaining many aspects of the natural world than classical physics does, and it accommodates nearly all of the theories that classical physics has produced.

Quantum translates, in the world of commercial computing, to machines and software that can, in principle, do many of the things that classical digital computers can, and in addition do one big thing classical computers can't: perform combinatorics calculations quickly. As we describe in our paper, Commercial Applications of Quantum Computing, that's going to be a big deal in some important domains. In some cases, the importance of combinatorics is already known to be central to the domain.

As more people turn their attention to the potential of quantum computing, applications beyond quantum simulation and encryption are emerging:

The opportunity for quantum computing to solve large-scale combinatorics problems faster and cheaper has encouraged billions of dollars of investment in recent years. The biggest opportunity may be in finding more new applications that benefit from the solutions offered through quantum. As professor and entrepreneur Alan Aspuru-Guzik said, "There is a role for imagination, intuition, and adventure. Maybe it's not about how many qubits we have; maybe it's about how many hackers we have."

Link:
Quantum Computing Is Coming. What Can It Do? - Harvard Business Review

Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery – HPCwire

LONDON and CAMBRIDGE, England, July 13, 2021 Rigetti UK announced today it will partner with Riverlane and Astex Pharmaceuticals to develop an integrated application for simulating molecular systems using Rigetti Quantum Cloud Services, paving the way for a commercial application that could transform drug discovery in pharmaceutical R&D.

"Our consortium brings together a complete quantum supply chain, from hardware to end-user, allowing us to develop a tailor-made solution to address a problem of real value to the pharmaceutical sector," says Mandy Birch, SVP of Technology Partnerships at Rigetti. "This project lays the groundwork for the commercial application of Rigetti Quantum Cloud Services in the pharmaceutical industry."

The average cost of discovering a new drug and bringing it to market has tripled since 2010, reaching almost $3bn in 2018. However, soaring R&D costs have not translated into shorter times to market or higher numbers of newly approved drugs.

"We want to solve this problem by using quantum computers to speed up the process of drug discovery," says Chris Murray, SVP Discovery Technology at Astex. "Quantum computers provide a fundamentally different approach that could enable pharmaceutical companies to identify, screen, and simulate new drugs rather than using expensive, trial-and-error approaches in the laboratory."

To design more efficient drugs and shorten the time to market, researchers rely on advanced computational methods to model molecular structures and the interactions with their targets. While classical computers are limited to modelling simple structures, quantum computers have the potential to model more complex systems that could drastically improve the drug discovery process. However, today's quantum computers remain too noisy for results to evolve past proof-of-concept studies.

"Building on previous work with Astex, our collaboration aims to overcome this technological barrier and address a real business need for the pharmaceutical sector," says Riverlane CEO Steve Brierley. The project will leverage Riverlane's algorithm expertise and existing technology for high-speed, low-latency processing on quantum computers using Rigetti's commercially available quantum systems. The team will also develop error mitigation software to help optimise the performance of the hardware architecture, which they expect to result in up to a threefold reduction in errors and runtime improvements of up to 40x. "This is an important first step in improving the performance of quantum computers so that they can solve commercially relevant problems," Brierley adds.

Science Minister Amanda Solloway says, "The UK has bold ambitions to be the world's first quantum-ready economy, harnessing the transformative capabilities of the technology to tackle global challenges such as climate change and disease outbreak."

"This government-backed partnership will explore how the power of quantum could help boost drug discovery, with the aim of shortening the time it takes potentially life-saving drugs to transfer from lab to market, all while cementing the UK's status as a science superpower."

The 18-month feasibility study is facilitated by a grant through the Quantum Challenge at UK Research and Innovation (UKRI). Rigetti UK has previously received funding from UKRI to develop the first commercially available quantum computer in the UK. Riverlane has also received funding from UKRI to develop an operating system that makes quantum software portable across qubit technologies.

About Rigetti UK

Rigetti UK Limited is a wholly owned subsidiary of Rigetti Computing, based in Berkeley, California. Rigetti builds superconducting quantum computing systems and delivers access to them over the cloud. These systems are optimized for integration with existing computing infrastructure and tailored to support the development of practical software and applications. Learn more at rigetti.com.

About Riverlane

Riverlane builds ground-breaking software to unleash the power of quantum computers. Backed by leading venture-capital funds and the University of Cambridge, it develops software that transforms quantum computers from experimental technology into commercial products. Learn more at riverlane.com.

About Astex

Astex is a leader in innovative drug discovery and development, committed to the fight against cancer and diseases of the central nervous system. Astex is developing a proprietary pipeline of novel therapies and has a number of partnered products being developed under collaborations with leading pharmaceutical companies. Astex is a wholly owned subsidiary of Otsuka Pharmaceutical Co. Ltd., based in Tokyo, Japan.

For more information about Astex Pharmaceuticals, please visit https://astx.com
For more information about Otsuka Pharmaceutical, please visit http://www.otsuka.co.jp/en/

Source: Rigetti UK

See the rest here:
Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery - HPCwire

Quantware Launches the World’s First Commercially Available Superconducting Quantum Processors, Accelerating the Advent of the Quantum Computer. -…

Delft, Netherlands -- July 15, 2021 -- Today Dutch startup QuantWare has launched the world's first commercially available superconducting processor for quantum computers (QPU). This is the first time superconducting quantum processors have been available off the shelf, a development with the potential to significantly accelerate the quantum computing revolution.

Quantum technology promises to significantly expand the amount of data computers are able to process, which could have huge implications for fields such as A.I., medicine, business intelligence, and cybersecurity. But the quantum industry is still young and scaling is difficult. Companies building parts for quantum computers need qubits, the microscopic objects that make quantum computing possible, but producing them in-house is often cost-prohibitive. QuantWare's superconducting QPUs eliminate that barrier and may be instrumental in accelerating the development of the quantum computer market.

Superconducting qubits are the leading and most mature approach to quantum processors; Google achieved quantum supremacy in 2019 using a superconducting QPU. While other QPUs are already available off the shelf, this is the first time a superconducting QPU has been easily available in productised form, levelling the playing field for quantum experimentation.

QuantWare's proprietary product, Soprano, is a 5-qubit QPU. In an article published by Ars Technica, QuantWare shared that the fidelity of each qubit will be 99.9 percent, which should keep the error rate manageable. Five qubits are sufficient for the immediate customer base QuantWare expects to attract, namely research institutions and university labs.
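A rough back-of-the-envelope shows why that 99.9 percent figure matters (this assumes, purely for illustration, that the fidelity applies per operation and that errors compound independently, neither of which the announcement states): with per-operation fidelity f, a circuit of k operations succeeds with probability roughly f**k.

```python
# Hypothetical error-compounding estimate, not a QuantWare specification:
# success probability of a k-operation circuit at per-op fidelity f.
f = 0.999
for k in (10, 100, 1000):
    print(k, round(f ** k, 3))
# 10   0.990  -> short circuits are fine
# 100  0.905  -> noticeable failure rate
# 1000 0.368  -> deep circuits mostly fail without error mitigation
```

This is why small qubit counts paired with high fidelity are a sensible target for research-lab customers running shallow circuits.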

"The race towards useful Quantum Computation is heating up, but still reserved to a small group of companies. By making QPUs more available, we will speed up the development of practical quantum-driven solutions to the world's biggest problems," said QuantWare co-founder Dr. Alessandro Bruno.

Another way to achieve quantum advantage is by designing a chip specifically for a particular application. The startup wants to exploit this by co-designing QPUs with software companies, allowing them to develop processors specialized for their algorithms.

QuantWare was founded in 2020 by quantum engineer Dr. Alessandro Bruno and Delft University of Technology (TU Delft) graduate MSc Matthijs Rijlaarsdam. They met while doing research at QuTech, a quantum technology research institute at TU Delft in the Netherlands. The company recently closed its pre-seed funding round, meaning it has now raised €1.15M. They plan to quickly expand their team and upgrade their processors towards higher qubit numbers. One of their growth goals for the rest of the year is to expand fabrication capabilities and partnerships; QuantWare hopes to become a collaborative bridge between quantum companies worldwide. The company is already looking for new operational facilities, as they expect to outgrow their current building within months. QuantWare's first two products, Crescendo and Soprano, are now available for pre-order.

Investors

About QuantWare

QuantWare builds superconducting quantum processors and related hardware. The processors lie at the heart of quantum computers and are crucial for conducting research in this field. By providing processors, QuantWare is making quantum research accessible to researchers and startups. The company also develops technology that will increase the computational power of processors beyond current restrictions. QuantWare's innovations are creating a new standard for quantum processors.

About UNIIQ

UNIIQ is a €22 million investment fund focused on the proof-of-concept phase, which helps entrepreneurs in West Holland bring their unique innovation to market faster. UNIIQ offers entrepreneurs the seed capital to achieve their plans and bridge the riskiest phase from concept to promising business. A consortium, including Erasmus MC, TU Delft, Leiden University and the regional development agency InnovationQuarter, created the fund. In 2021, Erasmus University Rotterdam also joined the fund. UNIIQ is made possible by the European Union, the Province of South Holland and the municipalities of Rotterdam, The Hague and Leiden. InnovationQuarter is responsible for the fund management.

About FORWARD.one

FORWARD.one is a VC fund focussed on investing in high-tech start-ups and scale-ups. With a team of financial professionals and technology entrepreneurs, FORWARD.one actively supports its portfolio companies to achieve their goals and ambitions. After successfully deploying the first fund in 11 promising start-ups, FORWARD.one has recently launched its second fund with a size of €100m. With this fund FORWARD.one will continue to invest in ambitious high-tech entrepreneurs and their companies. https://www.forward.one/

About Rabobank Startup & Scale-up Team

Start-ups and scale-ups are the innovators of the economy, contributing significantly to solving societal challenges, and are the main engine for economic growth and employment in the Netherlands. This target group therefore represents great commercial and strategic value for Rabobank. The Startup & Scale-up Team helps entrepreneurs who share this mission to grow sustainably by opening up their (international) network, by providing knowledge and funding.

Read more:
Quantware Launches the World's First Commercially Available Superconducting Quantum Processors, Accelerating the Advent of the Quantum Computer. -...

The Future of Data Encryption: What You Need to Know Now – FedTech Magazine

Making Encryption Harder, Better, Faster and Stronger

In response, the industry is advancing encryption on several fronts. Some efforts are focused on increasing key sizes to protect against brute-force decryption. Other efforts are looking at new cryptographic algorithms. For example, the National Institute of Standards and Technology is evaluating next-generation public key algorithms intended to be quantum safe.

The trouble is that most quantum-safe algorithms aren't efficient on classical computer architectures. To address this problem, the industry is focused on developing accelerators to speed up these algorithms on x86 platforms.

A third area of research is homomorphic encryption, a remarkable concept that allows users to perform calculations on encrypted data without first decrypting it. For example, an analyst could query a database containing classified information without having to ask an analyst with higher clearance to access the data, or to request that the data be declassified.

A big advantage of homomorphic encryption is that it protects data in all its states: at rest (stored on a hard drive), in motion (transmitted across a network), and in use (while in computer memory). Another boon is that it's quantum safe, because it's based on some of the same math as quantum computing.

A downside is that homomorphic encryption performs very poorly on traditional computers, because it's not designed to work with them. The industry is collaborating to develop x86-style instructions to make these new cryptosystems operate at cloud speeds. Practical applications are still a few years away, but we're confident we'll get there.
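As a toy illustration of what "computing on encrypted data" means (not one of the lattice-based schemes the industry is actually pursuing), textbook RSA happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a valid ciphertext of the product of the plaintexts, so the party doing the multiplication never sees the inputs.

```python
# Textbook RSA with tiny demo primes (insecure, illustration only).
p, q = 61, 53
n = p * q                            # public modulus, 3233
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 7, 11
# Multiply only the ciphertexts; neither plaintext is ever decrypted.
product_ct = (enc(a) * enc(b)) % n
print(dec(product_ct))  # 77, i.e. a * b recovered from encrypted inputs
```

Fully homomorphic schemes extend this idea to both addition and multiplication of encrypted values, which is what makes arbitrary computation on ciphertexts possible, at the performance cost the paragraph above describes.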


In the interim, a new encryption capability has emerged that organizations can take advantage of right now: confidential computing. Confidential computing safeguards data while it's being acted upon in computer memory; for example, while a user is conducting analytics on a database.

Confidential computing works by having the CPU reserve a section of memory as a secure enclave, encrypting the memory in the enclave with a key unique to the CPU. Data and application code placed in the enclave can be decrypted only within that enclave, on that CPU. Even if attackers gained root access to the system, they wouldn't be able to read the data.

With the latest generation of computer processors, a two-CPU server can create a 1 terabyte enclave. That enables organizations to place an entire database or transaction server inside the enclave.

The functionality is now being extended with the ability to encrypt all of a computer's memory with minimal impact on performance. Total memory encryption uses a platform-specific encryption key that's randomly derived each time the system is booted up. When the computer is turned off, the key goes away. So even if cybercriminals stole the CPU, they wouldn't be able to access the memory.

Confidential computing transforms the way organizations approach security in the cloud, because they no longer have to implicitly trust the cloud provider. Instead, they can protect their data while it's in use, even though it's being hosted by a third party.

One major cloud provider already offers a confidential computing service to the federal government, and more will surely follow. Agencies can now build enclave-based applications to protect data in use in a dedicated cloud that meets government security and compliance requirements.

The need for strong data encryption won't go away, and the encryption challenges will only increase as quantum computing emerges over the next several years. In the meantime, innovative new encryption capabilities are delivering tighter cybersecurity to agencies today, and the industry is investing in the next generation of cryptosystems to protect government information for the next 25 years.

More here:
The Future of Data Encryption: What You Need to Know Now - FedTech Magazine

Quantum computing: this is how quantum programming works using the example of random walk – Market Research Telecast


Developers are familiar with software development on classic computers. Intuitive programming languages, which are based on familiar thought and language patterns, enable even newbies to get started quickly and achieve initial success with small applications.

When programming a quantum computer, the situation is more complicated and significantly more abstract due to the underlying laws of quantum mechanics. The differences between programming a classical and a quantum computer can be illustrated with an example.

Steffen is on vacation, and he was immediately drawn to the beach promenade. At five o'clock in the morning he stumbles out of a bar, heavily drunk, and can't remember in which direction his hotel lies. But he has to get there as soon as possible if he wants to reserve a lounger in the first row by the hotel pool at 6:00 a.m. Steffen thinks it over: the hotel must be somewhere on this street, and in a math lecture several years ago, the professor had said something about random walks and that the walker can reach any point on a line after a sufficient number of steps.
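Steffen's predicament is a symmetric one-dimensional random walk, which a classical program can simulate directly (the starting position and the hotel's distance below are made up for illustration):

```python
import random

def steps_until(target, rng):
    """Walk +/-1 with equal probability from position 0 until the
    target position (the hotel) is reached; return the step count.
    A symmetric 1-D walk reaches any point with probability 1."""
    pos, steps = 0, 0
    while pos != target:
        pos += rng.choice((-1, 1))
        steps += 1
    return steps

rng = random.Random(42)  # seeded so the runs are reproducible
walks = [steps_until(10, rng) for _ in range(5)]
print(walks)  # the walker always arrives, but step counts vary wildly
```

This classical version inspects its position after every step. The quantum walk the article goes on to discuss differs in exactly this respect: observing the walker after each "coin toss" collapses the superposition and destroys the quantum speedup.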


Disclaimer: This article is generated from the feed and not edited by our team.

Read the original:
Quantum computing: this is how quantum programming works using the example of random walk - Market Research Telecast

IBM shows the advantages of a quantum computer over traditional computers – Tech News Inc

Among the most promising applications of quantum computing, quantum machine learning is set to make waves. But how this could be achieved is still a bit of a mystery.

IBM researchers now claim to have mathematically proven that, with a quantum approach, some machine learning problems can be solved faster than on conventional computers.

Machine learning is a well-established branch of artificial intelligence, and it is already used in many industries to solve different problems. This involves training an algorithm with large data sets, in order to allow the model to identify different patterns and ultimately calculate the best answer when new information is provided.

With larger data sets, a machine learning algorithm can be improved to provide more accurate answers, but this comes at a computational cost that quickly reaches the limits of traditional hardware. That's why researchers hope that one day they will be able to harness the enormous computing power of quantum techniques to take machine learning models to the next level.

One method in particular, called quantum kernels, is the subject of many research papers. In this approach, a quantum computer intervenes only for part of the overall algorithm, by expanding the so-called feature space, that is, the set of properties used to characterize the data submitted to the model, such as gender or age if the system is trained to recognize patterns in people.

To put it simply, using a quantum kernel approach, a quantum computer can distinguish between a larger number of features and thus identify patterns even in a huge database where a classical computer would see only random noise.
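The classical analogue of this idea is easy to sketch. A kernel k(x, y) measures the similarity of two data points after an (implicit) feature map; in the quantum variant, the feature map embeds the data into a quantum state and the kernel is the overlap between states. The example below uses an ordinary Gaussian (RBF) kernel purely for illustration; the function names and sample points are our own, not IBM's:

```python
# Illustrative classical kernel, sketching the role a quantum kernel plays.
import math

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel: similarity of x and y in an implicit feature space."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def gram_matrix(data, gamma=1.0):
    """Kernel (Gram) matrix of pairwise similarities, as used by kernel classifiers."""
    return [[rbf_kernel(x, y, gamma) for y in data] for x in data]

points = [[0.0, 0.0], [0.1, 0.0], [3.0, 3.0]]
K = gram_matrix(points)
# Nearby points get a kernel value close to 1, distant points close to 0.
```

In the quantum kernel method, it is exactly this matrix of pairwise similarities that the quantum computer estimates, while the rest of the learning algorithm stays classical.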

IBM researchers set out to use this approach to solve a specific type of machine learning problem called classification. As the IBM team explains, the most common example of a classification problem is a computer that receives pictures of dogs and cats and needs to be trained with this data set. The ultimate goal is to allow it to automatically tag all future images it receives whether it is a dog or a cat, with the goal of creating accurate tags in the least amount of time.
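The train-then-predict loop described above can be sketched with a deliberately tiny toy classifier. We use a nearest-centroid rule here; that choice, and the two-feature "cat"/"dog" data, are our own illustration, not the article's method:

```python
# Toy train-then-predict loop for classification (nearest-centroid rule).

def train(samples):
    """Compute one centroid (mean feature vector) per label."""
    sums, counts = {}, {}
    for features, label in samples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def predict(centroids, features):
    """Assign the label whose centroid is closest to the new point."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda lab: sq_dist(centroids[lab]))

data = [([1.0, 1.0], "cat"), ([1.2, 0.9], "cat"), ([5.0, 5.0], "dog")]
model = train(data)
print(predict(model, [4.8, 5.1]))  # → dog
```

Real image classifiers are far more elaborate, but the structure is the same: fit a model to labeled examples, then label new inputs as fast and accurately as possible.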

Big Blue scientists developed a new classification task and found that a quantum algorithm using the quantum kernel method was able to find relevant features in the data for accurate labeling, while for classical computers, the data set looked like random noise.

The routine we are using is a general method that in principle can be applied to a wide range of problems, Kristan Temme, a researcher at IBM Quantum, told ZDNet. In our research paper, we formally demonstrated that a quantum kernel estimation routine can lead to learning algorithms that, for specific problems, go beyond classical machine learning approaches.

To demonstrate the advantage of the quantum method over the classical approach, the researchers created a classification problem for which data could be generated on a classical computer, and showed that no classical algorithm could do better than random guessing on the problem.

However, when they visualized the data in a quantum feature map, the quantum algorithm was able to predict the labels very accurately and quickly.

The research team concludes, This article can be considered an important step in the field of quantum machine learning, as it demonstrates an end-to-end speedup of a quantum kernel method implemented in a fault-tolerant manner with realistic assumptions.

Of course, the classification task developed by scientists at IBM is specifically designed to determine whether the quantum kernel method is useful, and is still far from ready to be applied to any kind of large-scale business problem.

According to Kristan Temme, this is mainly due to the limited size of IBM's current quantum computers, which so far support fewer than 100 qubits. That is far from the thousands, if not millions, of qubits that scientists believe are necessary to start creating value in the field of quantum technologies.

At this point, we can't cite a specific use case and say this will have a direct impact, the researcher adds. We have not yet implemented a large quantum machine learning algorithm. The size of such an algorithm is of course directly tied to the development of quantum hardware.

IBM's latest experiment also applies to a specific type of classification problem in machine learning, and it does not mean that all machine learning problems will benefit from the use of quantum kernels.

But the results open the door to further research in this area, to see if other machine learning problems could benefit from using this method.

Much of this work is still open for debate at the moment, and the IBM team recognizes that any new discovery in this area comes with many caveats. But until quantum hardware improves, researchers are committed to continuing to prove the value of quantum algorithms, even if only from a mathematical point of view.

Source : ZDNet.com

Read the original:
IBM shows the advantages of a quantum computer over traditional computers - Tech News Inc