
From 5G to 6G: The race for innovation and disruption – TechRepublic

Image: Unsplash

Connectivity is all about faster, better and higher-volume data transfer between endpoints. The race for wireless connections, which began in 1979 when Nippon Telegraph and Telephone (NTT) deployed the first 1G technology in Tokyo, has led the world to 5G and 6G four decades later.

McKinsey Technology Trends Outlook 2022 reveals that advanced connectivity, which includes 5G, 6G, low-Earth-orbit satellites and other technologies, is driving growth and productivity across industries with an investment of $166 billion in 2021. Unlike other new technologies like artificial intelligence (AI) or mobility, the technology has a high adoption rate.

In a report shared by Market Research and Future to TechRepublic, the organization explains that the COVID-19 pandemic was a significant catalyst for implementing 5G globally.

With the power to transform industries faster, with greater capacity and less latency, 5G tech will impact transportation, banking systems, traffic control, remote healthcare, agriculture, digitized logistics and more, Market Research Future says.

New technologies like AI, machine learning, industrial Internet of Things (IIoT), new intelligent cars, and augmented and virtual reality applications in the metaverse also require faster download times and increased data communications in real-time. 5G and 6G are expected to boost these new trends.

SEE: Metaverse cheat sheet: Everything you need to know (free PDF) (TechRepublic)

Market Research and Future explains that the deployment of 5G does not come without challenges. The standardization of spectrum and the complexity in 5G network installation are the most prominent. MIT Tech Review adds that 6G will also face challenges and require cross-disciplinary innovation, new chips, new devices and software.

The next generation of cellular technologies, offering higher spectrum efficiency and higher bandwidth, has seen its share of debate. As McKinsey explains, many still wonder whether 5G can completely replace the 4G LTE network and what percentage of networks will have 5G.

By May 2022, the Global Mobile Suppliers Association had identified 493 operators in 150 countries investing in 5G technology and an additional 200 companies with technology that could potentially be used for 5G. New announcements of 5G smartphones rose by 164% by the end of 2020, and cataloged 5G devices increased by 60%.

While new consumer products have rapidly adapted to 5G capabilities, industrial and business devices have not.

Shifting from 4G LTE to private 5G may not be cost-effective for all players; this would depend on a player's technological aspirations and planned use cases, McKinsey said.

Market Research Future explains that $61.4 billion is driving this very competitive market, which is expected to reach $689.6 billion by 2027. But infrastructure equipment, devices and software providers have been restraining growth.

MIT explains that 6G shares similar challenges with 5G but also presents new ones. 6G engineers must work on infrastructure, devices and software to build the next-generation communication systems; 6G connectivity cannot be achieved by simply scaling or updating today's technology.

MIT adds that 6G uses more sophisticated active-antenna systems, which integrate further using other Radio Access Technologies such as WLAN (wireless local area network), Bluetooth, UWB (ultra-wideband) and satellite. Fitting all this tech into a smartphone requires reimagining components like chips and radio transceiver technology.

"This will require very creative electrical and computer engineering as well as disruptive industrial engineering and power management," MIT explained.

New 6G chips are essential to process the increased computing power. Low latency (the capacity to process a very high volume of data messages with minimal delay) is already a challenge for 5G and will be even more demanding with 6G tech.

Low latency is essential for interactive data, real-time data and applications, and virtual environments or digital twins. These are all requirements for AI, the metaverse and the industrial sector. 6G latency will be reduced by using nearby devices, creating a signal on a 3-dimensional network.

SEE: Artificial Intelligence Ethics Policy (TechRepublic Premium)

To solve these problems, new semiconductor materials, intelligent surfaces, AI and digital twin technology developments are being used to test concepts, develop prototypes, and manage and enhance the network.

McKinsey stresses that only a few telecommunications companies have so far been able to monetize 5G enough to get a good return on investment (ROI). Therefore, capital expenditures and maintenance costs will also be closely watched. Additionally, large capital investments are required to build new technology and networks, representing another business challenge.

In its Dresden plant in Germany, Volkswagen replaced wired connections between machinery and now updates finished cars with over-the-air updates and connects unmanned vehicles with edge-cloud servers. Michelin uses new connectivity technologies for real-time inventory management, and Bosch equipped their first factory with 5G, enabling automation, connecting hundreds of end-points and synchronizing robotics with human factory workers. These are just some examples McKinsey gives of how advanced connectivity is disrupting industries.

Connectivity is expected to increase the annual rate of data creation by up to 25%, connect 51.9 billion devices by 2025 and impact the global GDP (gross domestic product) by more than $2 trillion. Additionally, 5G and 6G are expected to contribute to closing the digital divide, allowing hundreds of millions of people to be connected for the first time.

In automotive and assembly, 5G and 6G are used to enhance maintenance and navigation, prevent collisions and drive the first fleets of autonomous vehicles. Healthcare devices and sensors connected to low-latency networks will improve patient treatment and monitoring with real-time data, significantly impacting treatment for patients with chronic diseases that require constant checks.

Aerospace and defense are using 5G to boost their capacity and performance, while retail has improved inventory management, supply chain coordination and payment processing and has created metaverse experiences thanks to the technology. The construction and building industry is printing 3D structures and using high-speed digital twins and applications, and the mining and natural resources sector is turning to smart exploration and exploitation with the digitalization of practices and automation of operations.

Leaders from almost every industry are considering engaging with new connectivity technologies. McKinsey says they should consider advanced connectivity a key enabler of revolutionary capabilities. From digital transformations to driving efficiency through automation and enabling technologies reliant on high-quality connectivity, such as cloud computing and IoT, connectivity will continue to drive the way the world works and lives.

More here:
From 5G to 6G: The race for innovation and disruption - TechRepublic

Read More..

Ransomware attackers expand the attack surface. This Week in Ransomware Friday, Sept 2 – IT World Canada

Ransomware continues to grow and expand, both in the number of attackers and the number of potential victims. This week we feature some of the attackers' strategies described in recent news items.

What's next: ransomware in a box? New Agenda ransomware can be customized for each victim

A new ransomware strain called Agenda, written in Google's open-source programming language Go (aka Golang), was detected and reported by researchers at Trend Micro earlier this week. There has been a trend towards using newer languages like Go and Rust to create malware, particularly ransomware.

The fact that many of these languages can operate cross-platform makes them a much greater threat. Go programs are cross-platform and standalone: compiled Go binaries can execute without any Go runtime installed on the host system.

In addition, the creators have added a new wrinkle, making this new variant easily customizable. This new strain is being sold on the dark web as Ransomware as a Service (RaaS). Qilin, the threat actor selling it to its affiliates, claims it will allow them to easily customize the ransomware for each victim.

Finally, Agenda has a clever detection-evasion technique also used in another ransomware variant, REvil. It changes the user's password and enables automatic login with the new credentials, allowing the attacker to reboot the victim's system in safe mode and take control of it.

Trend Micro reported that this allowed one attacker to move from reconnaissance to full-fledged attack in only two days. On the first day, the attacker scanned a Citrix server, and on the second day mounted a customized attack.

For more information you can review the original Trend Micro posting.

New Linux ransomware families

Another way that threat actors are expanding the attack surface is by targeting Linux, one of the predominant operating systems used on internet and cloud servers. RaaS offerings are increasingly targeting Linux systems.

Although regarded as a very secure operating system, and despite a consistent move to patch vulnerabilities, the large number of Linux offerings used world-wide ensures there are a significant number of vulnerabilities at any given time. Failure to update and patch systems creates a large potential target base.

But software vulnerabilities are not the only area of weakness. Configuration mistakes are often the more likely factor in the breach of a Linux system, according to researchers at Trend Micro.

Remarkably, these include easily remedied issues such as:

To quote Trend's report, "given the prevalence of Linux, ransomware actors find the operating system to be a very lucrative target."

Ransomware going to the dogs is no joke

As RaaS and customizability become more and more prevalent, there's an increasing ability to target smaller and more specific groups. We are familiar with ransomware attacking health care organizations, but recently the United Veterinary Services Association wrote to its members with recommendations to increase ransomware prevention after an attack that hit more than 700 animal health networks around the world.

It is a reminder that no group, regardless of size or type of business, is immune to ransomware. Every organization must communicate the need to have, at a minimum, the basics of ransomware protection in place:

Read this article:
Ransomware attackers expand the attack surface. This Week in Ransomware Friday, Sept 2 - IT World Canada

Read More..

The Network Binds The Increasingly Distributed Datacenter – The Next Platform

Before founding software-defined networking startup PlumGrid and then moving to VMware when it bought his company in 2016, Pere Monclus spent almost 12 years with Cisco Systems at a time when, although much of enterprise networking was still in the corporate datacenter, the shift to network virtualization and the migration to the cloud were getting underway.

Cisco was dominant in the datacenter networking space and fed organizations a steady stream of hardware, from routers to switches to silicon. The company carried an expansive view of its role in networking.

"At Cisco, we were thinking always we have to control the end-to-end of the network," Monclus, vice president and chief technology officer of VMware's Networking and Security business unit, tells The Next Platform. "The idea was we have to control the edge of the network so the core doesn't fall, because the core was where most of the markets were. We would have core routers, core switches and then take it all the way to the access to create the end-to-end networking as a principle, because from a Cisco perspective, what we were delivering was an end-to-end connectivity solution with our protocols."

About a year after Monclus left Cisco to found PlumGrid, VMware bought Nicira for $1.26 billion, a move that allowed the company, which already had a significant datacenter presence through its server and storage virtualization, to absorb networking into its increasingly software-defined world. NSX and networking have evolved over the past ten years to become a key part of VMware's own adaptation to an IT world that has broken well beyond the datacenter boundaries and out to the cloud and the edge. With containers, microservices and Kubernetes, software now dictates to hardware rather than the other way around.

It's also a world where the network is now the tie that binds this increasingly decentralized IT environment, becoming the main thoroughfare for applications and data moving between the datacenter, cloud and edge, and a central focus of organizations' security measures. All this was on full display this week at VMware's Explore 2022 conference, which allowed the company to tout its ongoing expansion into the cloud and out to the edge and its networking portfolio's central role in helping to make this happen.

The evolution of networking at VMware has taken several steps, Monclus says. At the time of the Nicira acquisition, enterprises would spend weeks or months putting the network in place before applications that would run on top of it could be put into production.

When VMware got into networking, the company heard from customers that they could quickly create an application and get a server up and running, but that it took them weeks to configure the network, he says. "We started that journey with network virtualization and the first story [for networking] was about automation and agility. The question was, 'If I create a VM, could I just connect it to the network and give it an IP address?' That was kind of the early days of network virtualization."

As more workloads and data were making their way out of the datacenter, security of the network became increasingly important, which is why VMware embraced micro-segmentation, a way to manage network access and separate workloads from one another to reduce an organization's attack surface and more easily contain breaches by preventing the lateral movement of attackers. The acquisition two years ago of network security startup Lastline helped fuel the vendor's distributed IDS/IPS technology, complementing the east-west protection delivered by micro-segmentation.
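At its core, micro-segmentation is a default-deny, allow-list model applied to east-west traffic between workloads. The sketch below is a rough conceptual illustration only; the tier names, ports and rule format are hypothetical and do not reflect NSX's actual policy model or API.

```python
# Minimal sketch of micro-segmentation as an east-west allow-list.
# Tier names, ports and rule structure are hypothetical examples.

ALLOW_RULES = {
    ("web", "app"): {443},   # web tier may call the app tier over HTTPS
    ("app", "db"): {5432},   # app tier may reach the database
}

def is_allowed(src_tier: str, dst_tier: str, port: int) -> bool:
    """Default-deny: traffic passes only if an explicit rule permits it."""
    return port in ALLOW_RULES.get((src_tier, dst_tier), set())

if __name__ == "__main__":
    print(is_allowed("web", "app", 443))    # True  - permitted flow
    print(is_allowed("web", "db", 5432))    # False - lateral movement blocked
```

The point of the default-deny posture is the second call: a compromised web server cannot reach the database directly, which is exactly the lateral movement the article describes containing.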

In June, the company added to its lateral security for network and endpoint technologies with a broad threat intelligence capability called Contexa. It sits in the infrastructure and offers visibility into both traditional and modern applications.

VMware over the years has put networking and security capabilities into the hypervisor and made them available as services in its own cloud offering and those of hyperscalers like Amazon Web Services and Google Cloud. It's also making NSX, and its growing security capabilities (including those from Carbon Black, which it bought in 2019 for $2.1 billion), key parts of the multicloud strategy.

The vendor at Explore rolled out a broad range of enhancements to its networking and security portfolio, all aimed at making it easier for enterprises to manage and secure their multicloud environments. It also gave a look at what the near-term future holds with the introduction of a number of network- and security-focused projects.

VMware is embedding network detection and visibility capabilities into Carbon Black Cloud's endpoint protection program, a move that is now in early access and brings together visibility into both the network and endpoints. It also is adding threat prevention tools like IDPS, malware analysis, sandboxing and URL filtering to its NSX Gateway Firewall, and enhanced bot management to the NSX Advanced Load Balancer (ALB). The last two, along with Project Watch (which aims to offer a continuous risk and compliance assessment model for multicloud environments), are part of VMware's Elastic App Secure Edge (EASE), a strategy announced last year to offer a range of data plane services around networking and security.

As we noted earlier this week, VMware also is embracing data processing units (DPUs) from Nvidia for a number of its cloud-based offerings, including vSphere 8 and, in this case, NSX. Cloud providers like AWS and Oracle already are using DPUs, and many in the industry believe that servers and other hardware in the near future will routinely include the chips. Monclus says customers will gravitate toward DPUs or SmartNICs for performance and security. For organizations like telcos, which demand high performance and whose datacenters are revenue-generating facilities, enabling CPUs to offload networking or compute tasks to DPUs is attractive.

There is a tradeoff: they may save 15 percent in CPU utilization, which they can sell back to customers, but there also is the cost of the DPUs themselves. However, where datacenters are a cost center, the draw is increased security through the workload isolation offered by the DPUs, and that likely will be a fast-growing use case for the chips, Monclus says.
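As a back-of-the-envelope illustration of that tradeoff, the sketch below plugs the quoted 15 percent CPU saving into purely hypothetical server and DPU prices; none of the dollar figures come from VMware or Nvidia.

```python
# Illustrative DPU trade-off: value of reclaimed CPU capacity vs. DPU cost.
# Only the 15% saving is from the article; the prices are made up.

server_cost = 12_000          # hypothetical cost of a revenue-generating server
cpu_saving_fraction = 0.15    # CPU capacity reclaimed by offloading to the DPU
dpu_cost = 1_500              # hypothetical DPU price per server

reclaimed_value = server_cost * cpu_saving_fraction  # capacity that can be resold
net_benefit = reclaimed_value - dpu_cost

print(f"Value of reclaimed CPU capacity: ${reclaimed_value:,.0f}")
print(f"Net benefit per server:          ${net_benefit:,.0f}")
```

Whether the net benefit is positive depends entirely on those assumed prices, which is the point Monclus is making about performance-driven versus cost-driven datacenters.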

Looking to the near future, VMware offered a look at Project Northstar and Project Trinidad, along with the aforementioned Project Watch. Project Northstar, now in technical preview, is a software-as-a-service (SaaS) network and security offering that will deliver services, visibility and controls to NSX users, who can manage them via a central cloud control plane.

The services include VMware's NSX Intelligence, ALB, Network Detection and Response, and Web Application Firewall.

"We are taking the control plane of NSX and turning it into a SaaS service to enable true multicloud solutions," Monclus says. "When we have a policy as a service, it works on vSphere environments, but it works across VMware Cloud, VMware Cloud Network, AWS, Google, Azure, and we have the same advanced protection, we have the same load balancer."

Both Project Trinidad and Project Watch are aimed at addressing the needs of modern workloads, he says. They're not tied to physical endpoints; instead, the API becomes the endpoint. Project Trinidad uses AI and machine learning models to understand what normal and expected east-west API traffic patterns between microservices look like, so that if something anomalous pops up, it can be quickly detected.

"We basically discover all the APIs, the schemas, API data, and we create a baseline and we can start from the baseline," Monclus says. What Project Trinidad introduces is AI/ML-driven deep correlation between workflows and microservices.
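Conceptually, that baseline-then-flag approach can be sketched in a few lines. The example below is a toy illustration under assumed service names and a made-up threshold; it is not Project Trinidad's actual model, which uses ML rather than simple counting.

```python
# Toy illustration of baselining east-west API traffic and flagging deviations.
from collections import Counter

def build_baseline(observed_calls):
    """Count (caller, endpoint) pairs seen during a learning window."""
    return Counter(observed_calls)

def is_anomalous(call, baseline, min_count=5):
    """Flag pairs never (or rarely) seen while the baseline was learned."""
    return baseline[call] < min_count

# Hypothetical learning-window traffic between microservices.
training = [("cart-svc", "POST /orders")] * 50 + [("ui", "GET /catalog")] * 200
baseline = build_baseline(training)

print(is_anomalous(("ui", "GET /catalog"), baseline))           # False - expected traffic
print(is_anomalous(("cart-svc", "GET /admin/keys"), baseline))  # True  - never seen before
```

A real system would learn schemas and payload structure as well as call frequency, but the shape is the same: learn what normal east-west traffic looks like, then surface anything that deviates from it.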

As noted, Project Watch brings continuous security, compliance and risk assessment, as well as automated and encrypted connectivity across clouds (AWS, Google Cloud and Microsoft Azure virtual private clouds (VPCs) and virtual networks (VNETs)) and security operations, and integrates workflows from such areas as security and cloud operations and lines of business onto a single platform.

It also addresses the challenge of not only enabling networks and security to adapt to modern workloads, but also of ensuring that legacy hardware that can't make that change remains secure.

VMware will assess and report the security risks enterprises face, giving them the necessary data to make decisions, he says, adding that the vendor wants to create a continuous monitoring model analogous to high availability, which uses the metric of three 9s, four 9s, and so forth. "We are trying to create a metric of how well you're running your datacenter or your applications from a security standpoint."

Read more here:
The Network Binds The Increasingly Distributed Datacenter - The Next Platform

Read More..

Opinion: The line between data and privacy is measured only by success – Gambling Insider

If there is a modern subject of discussion which elicits a strong response from the public, it is that of data privacy.

Tech businesses such as Apple have made a conscious push towards informing the public about the subject, while making their products ever more resistant to underhanded data retrieval.

In the gaming industry, data has become a way of life. Every bit that can be used is used by the industry to target new players, improve the flow of casinos (thereby making it easier and more profitable to attract customers) and generally improve market trends.

However, there is a catch.

At G2E Asia this year, Qlik's Senior Director of Solutions and Value Engineering, Chin Kuan Tan, revealed the results of Qlik's research into player preferences relating to data usage in the gaming and hospitality industry, which threw up some interesting conundrums.

Tan's presentation showed that 72% of people will stop engaging with a company completely if they have concerns over data collection, while 76% of players prefer hyper-personalisation over mass marketing techniques.

The duality of these two statistics shows that gaming operators are walking a knife edge when it comes to how the data gleaned from customers is used; and with the increased focus on data on an individual scale, the manner in which operators market themselves to customers has to evolve.

The more the industry uses servers and algorithms to solve and modernise everyday tasks, the more it relies on data collection to operate. This is something that Oosto CMO Dean Nicolls spoke about in a recent interview with Gambling Insider in relation to the company's facial recognition software.

When asked how Oosto's system protects the faces of the millions of people who enter the locations where its technology is used, Nicolls spoke in depth about the subject in the upcoming September/October edition of Gambling Insider magazine:

"You might think a lot of the data is traversing from the casino to our central servers or to our cloud servers; that's not the case. Everything is done locally. Traditionally, in a Vegas casino, all the servers are sitting on the premises and they are running our algorithms themselves, so we're not getting that data on our servers. Now, naturally any data that goes from the camera to servers still needs to be encrypted; and it is, both in transit and at rest, but it isn't going anywhere on our servers."

The comments of the Oosto CMO show a willingness to please the audience, though perhaps they also show a brush-off, a desire to get past the question as quickly and easily as possible without being drawn into a larger conversation about ethical data practices.

On the whole, Oosto appears to do a good job of protecting the data of innocents, a difficult task when your business relies on filming and recognising large quantities of people en masse. However, the failure to categorically explain the safety precautions in place, outside of using the term "encrypted", feels telling.
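For readers unfamiliar with the term, "encrypted at rest" generally means data is sealed with a key before it ever touches disk. The sketch below is a generic illustration of that idea only, explicitly not a description of Oosto's implementation; it assumes the third-party Python cryptography package is installed.

```python
# Generic illustration of "encryption at rest": seal data with a symmetric
# key before writing it to local storage. Not Oosto's implementation.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice held in a key-management system
sealer = Fernet(key)

embedding = b"face-embedding-vector-bytes"   # stand-in for locally processed data
stored_blob = sealer.encrypt(embedding)      # what would actually land on disk

assert sealer.decrypt(stored_blob) == embedding
print("ciphertext differs from plaintext:", stored_blob != embedding)
```

Encryption in transit works the same way in spirit (typically via TLS between camera and server), but without details on key management and access controls, the word "encrypted" alone says relatively little.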

Data and the gaming industry is an odd mix, then.

In the modern day, the industry demands that players be protected from those that would do harm by obtaining data, while players themselves are ready to quit if they feel vulnerable for a second in signing up to a service.

Gaming companies want to use data to further the consumer experience while retaining customers and reassuring them that any data provided will not be sold or used in other, nefarious ways, as has been reported frequently since Edward Snowden's revelations in 2013.

Customers want what they always have wanted: a seamless service that benefits them without risk. But with the online nature of the modern world, this risk is accepted as long as it is mitigated, leaving gaming companies juggling the subjects of personalised experiences, data loss and customer satisfaction.

More:
Opinion: The line between data and privacy is measured only by success - Gambling Insider

Read More..

VMware's Project Monterey bears hardware-accelerated fruit – The Stack

Back in September 2020, VMware announced what it called Project Monterey as a material rearchitecture of infrastructure software for its VMware Cloud Foundation (VCF), a suite of products for managing virtual machines and orchestrating containers. The project drew in a host of hardware partners as the virtualisation heavyweight looked to respond to the way in which cloud companies were increasingly using programmable hardware like FPGAs and SmartNICs to power data centre workloads that could be offloaded from the CPU, in the process freeing up CPU cycles for core enterprise application processing and improving performance for the networking, security, storage and virtualisation workloads offloaded to these hardware accelerators.

Companies, including numerous large VMware customers looking to keep workloads on-premises, had struggled to keep pace with the way in which cloud hyperscalers were rethinking data centre architecture like this. Using these emerging SmartNICs, or what are increasingly called Data Processing Units (DPUs), required extensive software development to adapt them to the user's infrastructure. Project Monterey was an ambitious bid to tackle that from VMware's side, in close collaboration with SmartNIC/DPU vendors Intel, NVIDIA and Pensando (now owned by AMD) and server OEMs Dell, HPE and Lenovo, in order to improve performance.

This week Project Monterey bore some very public fruit, with those server OEMs and chipmakers lining up to offer new products optimised for VMware workloads and making some bold claims about total cost of ownership (TCO) savings as well as infrastructure price-performance. Ultimately, by relieving server CPUs of networking services and running infrastructure services on the DPU, isolated from the workload domain, security will also markedly improve, all partners emphasised, as will workload latency and throughput for enterprises.

NVIDIA, for example, thinks using its new data processing units (DPUs) can save $8,200 per server in total cost of ownership (TCO) terms, or $1.8 million in efficiency savings over three years for 1,000-server installations. (A single BlueField-3 DPU replaces approximately 300 CPU cores, NVIDIA CEO Jensen Huang has previously claimed.) Customers can get the DPUs with new Dell PowerEdge servers that start shipping later this year and which are optimised for vSphere 8: "This is a huge moment for enterprise computing and the most significant vSphere release we've ever seen," said Kevin Deierling, a senior VP at Nvidia. "Historically, we've seen lots of great new features and capabilities with VMware's roadmap. But for the first time, we're introducing all of that goodness running on a new accelerated computing platform that runs the infrastructure of the data centre."

(The BlueField-3 DPUs accelerating these VMware workloads feature 16 x Arm A78 cores, 400Gbit/s bandwidth and PCIe Gen 5 support, plus accelerators for software-defined storage, networking, security, streaming, line-rate TLS/IPSEC cryptography, and precision timing for 5G telco and time-synchronised data centres.)

"Certain latency and bandwidth-sensitive workloads that previously used virtualization pass-thru can now run fully virtualized with pass-thru-like performance in this new architecture, without losing key vSphere capabilities like vMotion and DRS," said VMware CEO Raghu Raghuram. Infrastructure admins can rely on vSphere to also manage the DPU lifecycle, Raghuram added: "The beauty of what the vSphere engineers have done is they have not changed the management model. It can fit seamlessly into the data center architecture of today."

The releases come amid a broader shift to software-defined infrastructure. As NVIDIA CEO Jensen Huang has previously noted: "Storage, networking, security, virtualisation, and all of that: all of those things have become a lot larger and a lot more intensive, and it's consuming a lot of the data centre; probably half of the CPU cores inside the data centre are not running applications. That's kind of strange, because you created the data centre to run services and applications, which is the only thing that makes money. The other half of the computing is completely soaked up running the software-defined data centre, just to provide for those applications. [That] commingles the infrastructure, the security plane and the application plane and exposes the data centre to attackers. So you fundamentally want to change the architecture as a result of that; to offload that software-defined virtualisation and the infrastructure operating system, and the security services, and to accelerate it."

Vendors are meanwhile lining up to offer new hardware and solutions born out of the VMware collaboration. Dell's initial offering of NVIDIA DPUs is on its VxRail solution and takes advantage of a new element of VMware's ESXi (vSphere Distributed Services Engine), moving network and security services to the DPU. These will be available via channel partners in the near future. Jeff Boudreau, President, Dell Technologies Infrastructure Solutions Group, added: "Dell Technologies and VMware have numerous joint engineering initiatives spanning core IT areas such as multicloud, edge and security to help our customers more easily manage and gain value from their data."

AMD meanwhile is making its Pensando DPUs available through Dell, HPE and Lenovo. Those with budgets for hardware refreshes and a need for performance for distributed workloads will be keeping a close eye on price and performance data as the products continue to land in coming weeks and months.

More:
VMwares Project Monterey bears hardware-accelerated fruit - The Stack

Read More..

How reality gets in the way of quantum computing hype – VentureBeat


Baidu is the latest entrant in the quantum computing race, which has been ongoing for years among both big tech and startups. Nevertheless, quantum computing may face a trough of disillusionment as practical applications remain far from reality.

Last week, Baidu unveiled its first quantum computer, named Qian Shi, as well as what it claimed is the world's first all-platform integration solution, called Liang Xi. The quantum computer is based on superconducting qubits, one of the first qubit types, among the many techniques that have been investigated, to become widely adopted, most notably in the quantum computer Google used to proclaim quantum supremacy.

Qian Shi has a computing power of 10 high-fidelity qubits, where high fidelity refers to low error rates. According to the Department of Energy's Office of Science, once the error rate is below a certain threshold (about 1%), quantum error correction can, in theory, reduce it even further. Beating this threshold is a milestone for any qubit technology, according to the DOE's report.
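A common rule of thumb for surface-code error correction illustrates why that roughly 1% threshold matters: once the physical error rate p sits below the threshold p_th, each increase in code distance suppresses the logical error rate by roughly another factor of p/p_th. The constants and the exact scaling law below are illustrative assumptions, not DOE figures.

```python
# Rule-of-thumb surface-code scaling: below threshold, logical errors fall
# roughly as (p/p_th)^((d+1)/2) with code distance d. Constants are illustrative.

def logical_error_rate(p, d, p_th=0.01, a=0.1):
    return a * (p / p_th) ** ((d + 1) // 2)

p = 0.001  # hypothetical physical error rate, 10x better than threshold
for d in (3, 5, 7, 9):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(p, d):.1e}")
# Each step up in distance buys roughly another factor-of-10 suppression here.
```

Above the threshold the same formula works against you, which is why crossing it is treated as a milestone rather than an incremental improvement.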

Further, Baidu said it has also completed the design of a 36-qubit chip with couplers, which offer a way to reduce errors. Baidu said its quantum computer integrates hardware, software and applications. The software-hardware integration allows access to quantum chips via mobile, PC and the cloud.

Moreover, Liang Xi, Baidu claims, can be plugged into both its own and third-party quantum computers. This may include quantum chips built on other technologies, with Baidu giving a trapped ion device developed by the Chinese Academy of Sciences as an example.

"With Qian Shi and Liang Xi, users can create quantum algorithms and use quantum computing power without developing their own quantum hardware, control systems or programming languages," said Runyao Duan, director of the Institute for Quantum Computing at Baidu Research. "Baidu's innovations make it possible to access quantum computing anytime and anywhere, even via smartphone. Baidu's platform is also instantly compatible with a wide range of quantum chips."

Despite Baidu's claim of being the world's first such solution, the Liang Xi platform is reminiscent of the Israel Innovation Authority's approach, which is also aimed at being compatible with various types of qubits.

Although this is Baidu's first quantum computer, the company has already submitted over 200 patents in the four years since the founding of its quantum computing research institute. The patents span various areas of research, including quantum algorithms and applications, communications and networks, encryption and security, error correction, architecture, measurement and control, and chip design.

Baidu claims its offering paves the way for the industrialization of quantum computing, making it the latest company to make grandiose claims about quantum computing being on the verge of widespread adoption. Some quantum startups have already amassed staggering valuations of over $1 billion.

However, real applications for quantum computers, besides encryption, have yet to emerge. And even if they do, it's expected that they will require thousands of qubits, far more than anyone has yet been able to achieve. For example, this scalability concern led Intel to stop pursuing the popular superconducting qubit approach in favor of the less mature silicon and silicon-germanium qubits, which are based on transistor-like structures that can be manufactured using traditional semiconductor equipment.

Nevertheless, voices are already emerging to warn of overhyping the technology. In the words of the Gartner Hype Cycle, this may mean that quantum computing is approaching its trough of disillusionment.

The other main challenge in quantum computing is that real qubits tend to be too noisy, leading to decoherence. This creates the need for quantum error correction, which increases the number of qubits far above the theoretical minimum for a given application. A solution called noisy intermediate-scale quantum (NISQ) computing has been proposed as a sort of midway point, but its success has yet to be shown.

The history of classical computers is filled with examples of applications that the technology enabled that had never been thought of beforehand. This makes it tempting to think that quantum computing may similarly revolutionize civilization. However, most approaches for qubits currently rely on near-absolute-zero temperatures. This inherent barrier implies quantum computing may remain limited to enterprises.


Go here to see the original:
How reality gets in the way of quantum computing hype - VentureBeat

Read More..

The U.S., China, and Europe are ramping up a quantum computing arms race. Here's what they'll need to do to win – Fortune

Every country is vying to get a head start in the race to the world's quantum future. A year ago, the United States, the United Kingdom, and Australia teamed up to develop military applications of digital technologies, especially quantum computing technologies. That followed the passage in 2019 of the National Quantum Initiative Act by the U.S. Congress, which laid out the country's plans to rapidly create quantum computing capabilities.

Earlier, Europe launched a $1 billion quantum computing research project, Quantum Flagship, in 2016, and its member states have started building a quantum communications infrastructure that will be operational by 2027. In a like vein, China's 14th Five-Year Plan (2021-2025) prioritizes the development of quantum computing and communications by 2030. In all, between 2019 and 2021, China invested as much as $11 billion, Europe spent $5 billion, the U.S. $3 billion, and the U.K. around $1.8 billion to become tomorrow's quantum superpowers.

As the scientific development of quantum technologies gathers momentum, creating quantum computers has turned into a priority for nations that wish to gain the next competitive advantage in the Digital Age. They're seeking this edge for two very different reasons. On the one hand, quantum technologies will likely transform almost every industry, from automotive and aerospace to finance and pharmaceuticals. These systems could create fresh value of between $450 billion and $850 billion over the next 15 to 30 years, according to recent BCG estimates.

On the other hand, quantum computing systems will pose a significant threat to cybersecurity the world over, as we argued in an earlier column. Hackers will be able to use them to decipher the public keys generated by the RSA cryptosystem, and to break through the security of any conventionally encrypted device, system, or network. It will pose a potent cyber-threat, popularly called Y2Q (Years to Quantum), to individuals and institutions as well as corporations and country governments. The latter have no choice but to tackle the unprecedented challenge by developing countermeasures such as post-quantum cryptography, which will itself require the use of quantum systems.

Countries have learned the hard way since the Industrial Revolution that general-purpose technologies, such as quantum computing, are critical for competitiveness. Consider, for instance, semiconductor manufacturing, which the U.S., China, South Korea, and Taiwan have dominated in recent times. When the COVID-19 pandemic and other factors led to a sudden fall in production over the last two years, it resulted in production stoppages and price increases in over 150 industries, including automobiles, computers, and telecommunications hardware. Many countries, among them members of the European Union, Brazil, India, Turkey, and even the U.S., were hit hard, and are now trying to rebuild their semiconductor supply chains. Similarly, China manufactures most of the world's electric batteries, with the U.S. contributing only about 7% of global output. That's why the U.S. has recently announced financial incentives to induce business to create more electric battery-manufacturing capacity at home.

Much worse could be in store if countries and companies don't focus on increasing their quantum sovereignty right away. Because the development and deployment of such systems requires the efforts of the public and private sectors, it's important for governments to compare their efforts on both fronts with those of other countries.

The U.S. is expected to be the global frontrunner in quantum computing, relying on its tech giants, such as IBM and Google, to invent quantum systems, as well as numerous startups that are developing software applications. The latter attract almost 50% of the investments in quantum computing by venture capital and private equity funds, according to BCG estimates. Although the U.S. government has allocated only $1.1 billion, it has created mechanisms that effectively coordinate the efforts of all its agencies, such as NIST, DARPA, NASA, and the NQI.

Breathing down the U.S.'s neck: China, whose government has spent more on developing quantum systems than any other. Those investments have boosted academic research, with China producing over 10% of the world's research in 2021, according to our estimates, second only to the U.S. The spillover effects are evident: Less than a year after Google's quantum machine had solved in minutes a calculation that would have taken supercomputers thousands of years to unravel, the University of Science and Technology of China (USTC) had cracked a problem three times tougher. As of September 2021, China hadn't spawned as many startups as the U.S., but it was relying on its digital giants, such as Alibaba, Baidu, and Tencent, to develop quantum applications.

Trailing only the U.S. and China, the European Union's quantum computing efforts are driven by its member states as well as the union. The EU's Quantum Flagship program coordinates research projects across the continent, but those efforts aren't entirely aligned yet. Several important efforts, such as those of France and Germany, run the risk of duplication or don't exploit synergies adequately. While the EU has spawned several startups that are working on different levels of the technology stack, such as Finland's IQM and France's Pasqal, many seem unlikely to scale because of the shortage of late-stage funding. In fact, the EU's startups have attracted only about one-seventh as much funding as their American peers, according to BCG estimates.

Finally, the U.K. was one of the first countries in the world to launch a government-funded quantum computing program. It's counting on its educational policies and universities; scholarships for postgraduate degrees; and centers for doctoral training to get ahead. Like the EU, the U.K. also has spawned promising start-ups such as Orca, which announced the world's smallest quantum computer last year. However, British start-ups may not be able to find sufficient capital to scale, and many are likely to be acquired by the U.S.'s digital giants.

Other countries, such as Australia, Canada, Israel, Japan, and Russia are also in the quantum computing race, and could carve out roles for themselves. For instance, Canada is home to several promising startups, such as D-Wave, a leader in annealing computers; while Japan is using public funds to develop a homegrown quantum computer by March 2023. (For an analysis of the comparative standings and challenges that countries face in quantum computing, please see the recent BCG report.)

Meanwhile, the locus of the quantum computing industry is shifting to the challenges of developing applications and adopting the technology. This shift offers countries, especially the follower nations, an opportunity to catch up with the leaders before it's too late. Governments must use four levers in concert to accelerate their quantum sovereignty:

* Lay the foundations. Governments have to invest more than they currently do if they wish to develop quantum systems over time, even as they strike partnerships to bring home the technology in the short run. Once they have secured the hardware, states must create shared infrastructure to scale the industry. The Netherlands, for instance, has set up Quantum Inspire, a platform that provides users with the hardware to perform quantum computations.

* Coordinate the stakeholders. Governments should use funding and influence to coordinate the work of public and private players, as the U.S. Quantum Coordination Office, for instance, does. In addition, policymakers must connect stakeholders to support the technology's development. That's how the U.S. Department of Energy, for instance, came to partner with the University of Chicago; together, they've set up an accelerator to connect startups with investors and scientific experts.

* Facilitate the transition. Governments must support businesses' transition to the quantum economy. They should offer monetary incentives (such as tax credits, infrastructure assistance, no- or low-interest financing, and free land) so incumbents will shift to quantum technologies quickly. The U.K., for instance, has recently expanded its R&D tax relief scheme to cover investments in quantum technologies.

* Develop the business talent. Instead of developing only academics and scientists, government policies will have to catalyze the creation of a new breed of entrepreneurial and executive talent that can fill key roles in quantum businesses. To speed up the process, Switzerland, for instance, has helped create a master's program rather than offering only doctoral programs on the subject.

Not all general-purpose technologies affect a country's security and sovereignty as quantum computing does, but they're all critical for competitiveness. While many countries talk about developing quantum capabilities, their efforts haven't translated into major advances, as in the U.S. and China. It's time every government remembered that if it loses the quantum computing race, its technological independence will erode, and, unlike with Schrödinger's cat, there's no doubt that its global competitiveness will atrophy.

Read other Fortune columns by François Candelon.

François Candelon is a managing director and senior partner at BCG and global director of the BCG Henderson Institute.

Maxime Courtaux is a project leader at BCG and ambassador at the BCG Henderson Institute.

Gabriel Nahas is a senior data scientist at BCG Gamma and ambassador at the BCG Henderson Institute.

Jean-François Bobier is a partner & director at BCG.

Some companies featured in this column are past or current clients of BCG.

Read the original post:
The U.S., China, and Europe are ramping up a quantum computing arms race. Heres what theyll need to do to win - Fortune

Read More..

Quantum Computing Market to Expand by 500% by 2028 | 86% of Investments in Quantum Computing Comes from 4 countries – GlobeNewswire

Westford, USA, Aug. 30, 2022 (GLOBE NEWSWIRE) -- Quantum computers are touted as the next big thing in computing. Major reliance on quantum computers could mean we're soon entering a new era of artificial intelligence, ubiquitous sensors, and more efficient drug discovery. While quantum computers are still in the earliest stages of development, growing interest in their capabilities means that they are likely to become a central part of future computing systems. This has created growing demand for quantum computing hardware and software, with providers already reporting strong demand from major customers.

The promise of quantum computing is that it can solve complex problems much faster than traditional computers. This is because quantum computers are able to exploit the properties of subatomic particles such as photons, which are able to ferry information around extremely fast. So far, the quantum computing market has seen demand mainly for scientific and research purposes.

However, this is set to change soon, as there is growing demand for quantum computers for various applications such as artificial intelligence (AI), machine learning and data analytics. Artificial intelligence is one application that could benefit greatly from the speed and accuracy of quantum computing. AI relies on algorithms that are trained on large data sets and are able to learn and improve upon their skills with repeated use. However, classical computer systems can take hours or even days to train an AI algorithm.

Get sample copy of this report:

https://skyquestt.com/sample-request/quantum-computing-market

Only 4 Countries are Responsible for 86% of Total Funding Since 2001

The quantum computing market is heating up. Companies like Google and IBM are racing to develop the technology, which could one day lead to massive improvements in artificial intelligence and other areas such as cybersecurity. As per SkyQuest's analysis, $1.9 billion in public funding was announced in the second half of 2021, taking total global funding since 2001 to $31 billion. It was also observed that most of the private and public funding comes from the US, which accounts for around 49% of private funding, followed by the UK (17%), Canada (14%), and China (6%).

In 2021, the global quantum computing market witnessed investment of around $3 billion, of which $1.9 billion came in the second half of the year. All this investment is coming from both the private and public domains to capture the upcoming opportunity of generating around $41 billion in revenue by 2040, at a CAGR of more than 30%. The market is projected to experience a significant surge in demand for quantum sensing and quantum communication in the years to come. As a result, investors have started pouring money in to take advantage of this rapidly expanding field. For instance, in 2021 alone, $1.1 billion of the $3 billion was invested in these two technologies: $400 million and $700 million, respectively.
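As a quick sanity check on the headline figures, the release's "expand by 500% by 2028" claim (i.e., the market ending at six times its starting size) implies a compound annual growth rate close to the roughly 30% CAGR cited, assuming a 2021 base year; the base year is an assumption here, since the release does not state one.

```python
# Back-of-the-envelope check: what CAGR does a 6x expansion over 7 years imply?
# (A "500% expansion" means end = 6 * start; 2021 base year is assumed.)

def implied_cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by an overall growth multiple."""
    return multiple ** (1 / years) - 1

print(f"{implied_cagr(6.0, 2028 - 2021):.1%}")   # ~29.2%, in line with the ~30% CAGR cited
```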

SkyQuest has done a deep study of public and private investment coming into the global quantum computing market. This will help market participants understand who the major investors are, what their areas of interest are and what makes them invest in the technology, along with investor profile analysis and investment pockets, among others.

IonQ, Rigetti, and D-Wave are Emerging Players in Global Quantum Computing Market

As the quantum computing market becomes more mainstream, companies like IonQ, Rigetti and D-Wave are quickly proving they are the top emerging players in the field. IonQ has been working on developing trapped-ion quantum computer technology for several years now. IonQ's flagship product is the IonQ One, which is a single-core quantum computer that can process quantum information.

The IonQ One has already been deployed at a number of institutions around the world, including NASA.

Rigetti is another company that has been making significant strides in the development of quantum computing technology. Rigetti's flagship product is the Rigetti Quilter, which is a scalable two-qubit quantum computer. The Rigetti Quilter is currently undergoing Phase II testing at NASA's Ames Research Center. D-Wave has also been making significant progress in the development of quantum computing technology. D-Wave's flagship product is the D-Wave Two, which is a five-qubit quantum computer. The D-Wave Two was recently deployed with Google physicists to help accelerate the discovery of new phenomena in physics.

Browse summary of the report and Complete Table of Contents (ToC):

https://skyquestt.com/report/quantum-computing-market

Rigetti has secured total funding of around $298 million through 11 rounds as of 2022. As per our analysis, the company secured its most recent funding through post-IPO equity, with Bessemer Venture Partners and Franklin Templeton Investments as the major investors in the company.

As per SkyQuest's findings, these three organizations collectively generated revenue of around $32 million in 2021, with a combined market cap of more than $3 billion. At the same time, however, they are facing heavy losses: in 2021, they recorded a collective loss of over $150 million. We also observed that billions of dollars are being poured into building quantum computers, but most market players are not yet earning much revenue in terms of ROI.

SkyQuest has published a report on the global quantum computing market and has tracked all the current developments, market revenue, companies' growth plans and strategies, their ROI, SWOT analysis, and value chain analysis. Apart from this, the report provides insights on market dynamics, the competitive landscape, market share analysis, opportunities, and trends, among others.

Machine Learning Generated Revenue of Over $189 Million in 2021

Today, machine learning is heavily used for training artificial intelligence systems on data. Quantum computing can help speed up the process of training these systems by vastly increasing the amount of data that can be processed. Another potential advantage of quantum computing is the ability to perform Fast Fourier Transform (FFT) calculations millions of times faster than classical computers. This is important for tasks like image processing and machine learning, which rely on fast FFT algorithms for comparing data sets.
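The FFT point is easiest to appreciate classically: a naive discrete Fourier transform scales as O(N^2), while the FFT scales as O(N log N). The snippet below illustrates only that classical gap; the quantum Fourier transform's claimed further speedup is not demonstrated here.

```python
# Classical illustration: naive O(N^2) DFT vs. numpy's O(N log N) FFT.
import time
import numpy as np

def naive_dft(x):
    n = len(x)
    k = np.arange(n)
    m = np.exp(-2j * np.pi * np.outer(k, k) / n)  # full N x N DFT matrix
    return m @ x

x = np.random.rand(2048)

t0 = time.perf_counter()
slow = naive_dft(x)
t1 = time.perf_counter()
fast = np.fft.fft(x)
t2 = time.perf_counter()

print("results match:", np.allclose(slow, fast))
print(f"naive DFT: {t1 - t0:.4f}s   FFT: {t2 - t1:.4f}s")
```

The asymptotic gap is what matters: at large N the matrix-based transform becomes unusable while the FFT stays cheap, which is why any further speedup on top of the FFT is attractive for signal-heavy machine learning workloads.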

The huge potential of the quantum computing market has led to the development of several machine learning applications that use quantum computers. Some of these applications include fraud detection, drug discovery, and speech recognition. As per SkyQuest, the fraud detection and drug discovery markets were valued at around $25.1 billion and $75 billion, respectively. This represents a huge revenue opportunity for the quantum computing market.

This technology has been used for a variety of purposes, including predicting the stock market and automating tasks such as decision making and recommendations. In machine learning, traditional processing is a major bottleneck: conventional computers can only handle a small amount of data at a time. This limits how much data can be used in machine learning projects, which in turn limits the accuracy of the predictions made by artificial neural networks (ANNs).

Quantum computing solves this problem by allowing computers to perform multiple calculations at the same time. This makes it possible to process vast amounts of data and make accurate predictions. As a result, quantum computing has already begun to revolutionize the machine learning market.

SkyQuest has prepared a report on the global quantum computing market. The report segments the market by application and provides an in-depth analysis of each application's revenue generation, market forecast, growth factors, and top players, among others. The report helps readers understand the potential of the global market by application and how other players are performing and generating revenue in each segment.

Speak to Analyst for your custom requirements:

https://skyquestt.com/speak-with-analyst/quantum-computing-market

Top Development in Global Quantum Computing Market

Top Players in Global Quantum Computing Market

Related Reports in SkyQuests Library:

Global Silicon Photonics Market

Global Data Center Transformer Market

Global Wireless Infrastructure Market

Global Cable Laying Vessel Market

Global Digital Twin Market

About Us:

SkyQuest Technology is a leading growth consulting firm providing market intelligence, commercialization and technology services. It has 450+ happy clients globally.

Address:

1 Apache Way, Westford, Massachusetts 01886

Phone:

USA (+1) 617-230-0741

Email:sales@skyquestt.com

LinkedInFacebookTwitter

Continued here:
Quantum Computing Market to Expand by 500% by 2028 | 86% of Investments in Quantum Computing Comes from 4 countries - GlobeNewswire

Read More..

Amazon and Google are not happy with Microsoft’s cloud changes – Quartz

Microsoft's recent changes to its cloud contracts aren't going down well with other tech behemoths.

Microsoft has decided to amend the terms of its licensing deals, making it easier for customers to run Microsoft software on cloud platforms beyond its own Azure. According to Microsoft, the changes will enable new scenarios for how customers can license and run workloads with infrastructure outsourcers, and are due to go into effect starting Oct. 1.

The new flexible virtualization benefits, which are meant to benefit smaller EU-based cloud service providers, exclude certain listed providers such as Alibaba, Amazon Web Services, Google Cloud, and Microsoft itself.

Amazon and Google's parent company, Alphabet, criticized Microsoft's cloud computing changes, saying they limit competition and discourage customers from switching to rival cloud service providers, Reuters reported.

But Microsoft likely doesn't care what its rivals think: the plans aim to win over EU antitrust regulators, who are yet to react.

Brief history

On Feb. 1, 2010, when Microsoft launched Azure, it was among the trailblazers in the sector. Amazon Web Services (AWS) had debuted earlier, in 2006, and Google Cloud in 2008.

While 95% of Fortune 500 companies use Azure, maintaining a stronghold hasn't been easy for the service. AWS has quickly climbed to the top of the chart.

Complicating matters further, Microsoft has been hit with a litany of complaints in Europe. In 2019, EU antitrust regulators launched a probe into the Washington-based firm after several smaller European competitors complained that Microsoft's licensing terms made it prohibitively more expensive to run Microsoft software such as Office on cloud platforms other than its own Azure.

Microsoft's move is meant to appease these critics, but has drawn ire from other tech giants.

Quotable

"The promise of the cloud is flexible, elastic computing without contractual lock-ins. Customers should be able to move freely across platforms and choose the technology that works best for them, rather than what works best for Microsoft. At Google Cloud, we believe that openness matters, and we continue to gain customers' trust by promoting the security, cost, and benefits of using multiple cloud providers. We urge all cloud providers to avoid locking in their customers and compete on the merits of their technologies." – Marcus Jadotte, vice president of government affairs & policy at Google Cloud

By the digits

€1.6 billion ($1.6 billion): How much Microsoft has been fined by EU antitrust regulators over the last decade

32%, 19%, and 7%: The cloud market shares commanded by AWS, Azure, and Google Cloud, respectively

715 million: Microsoft Azure users globally (as of 2017)

200: Microsoft Azure products

Fun fact

The Microsoft Cloud is connected by enough fiber to stretch to the moon and back three times.

Related stories

A total Amazon cloud outage would be the closest thing to the world going offline

Early cloud computing was like borrowing a book from the library

All the innovative ways Google docs are being used today

The South African origins of Andy Jassys profitable Amazon division

Microsoft is rolling out its own metaverse

The rest is here:
Amazon and Google are not happy with Microsoft's cloud changes - Quartz

Read More..

There's a storm brewing in cloud computing, and most firms aren't prepared for it – Fortune

Like so many other internet-era staples, cloud computing has evolved a great deal from its beginnings more than a quarter-century ago. What started as a way to centralize server capacity has grown to encompass thousands of services from providers big and small.

Yet for many firms, an understanding of the cloud and its associated costs remains primitive. PwC's Cloud Business Survey found that more than half (53%) of companies have yet to realize substantial value from their cloud investments.

With economic uncertainty growing by the month, pushing cloud-related decisions down the road could prove costly. Weve reached a critical juncture for business leaders to change how they approach the cloud, and the urgency is greater than you might think.

Firms that prepare today can recast the cloud as a driver of new business models, not just an outsourcing tool. They can minimize service creep, find value in what they've paid for, and form partnerships to maximize return. By contrast, firms that wait could be forced into a more transactional view. They might treat the cloud like an on-or-off mechanism, cut from stacks they don't fully understand, and neglect to leverage the capabilities that remain.

The hard part is not deciding which organization you want to be in. The hard part is making the right investments to bring that vision to life.

To varying extents, most companies are undergoing some sort of digital transformation. They expect technology to form more of their business operations, and they want digital leverage to get there faster.

From their early days, cloud platforms have offered some fundamental tools to help. They replaced physical infrastructure with a virtual environment and solutions that required no physical footprint or hands-on management. But firms that still view cloud computing as just that, or even mostly that, are selling themselves short. Yes, storage and processing capacity are an important part of the cloud. But what you're really buying is agility: the speed to operate, scale, and change your enterprise. What once took two or three years to build in a physical environment can happen overnight.

That unlocks huge potential, and budgetary creep. Computing costs are here to stay, and overall technology costs will continue to rise, with the main driver being increasing energy prices and consumption. They can equate to nearly 10% of revenue for some companies, with cloud services comprising a significant chunk of that. Already, the bill for cloud services can top $100 million per year for some firms. Leaders who see the cloud as a transition to a cheaper operating model might be in for a rude awakening. Instead, they should leverage the agility of the cloud and keep costs under control using a new approach that combines discipline and smart investments.

Companies should seek to reduce costs and capture returns.

It's easy to spin up cloud services at a moment's notice, and it's hard to manage the growth of infrastructure you can't see. Firms need the operating discipline to ask themselves whether they really need specific computing power or storage capability.

In the age of data centers, shutting down an application meant reallocating server space to another capability. Today, you'd just acquire more cloud capacity to gain that new capability before shuttering the old. Even with watchful eyes at the top, it's easy to end up with inefficient infrastructure usage.

It's vital to align the growth of cloud infrastructure with the growth of the larger business and manage the two similarly. What does that look like? Leaders need to embrace new mental models, expand digital upskilling for relevant employees, and invest in the headcount to engage the services they've bought. Firms need to innovate faster, deploy low-code/no-code solutions, and leverage the cloud to create better experiences for their employees and customers.

That creates value in cloud services, which tightening market conditions will make even more critical. Efficient cloud usage can make the difference in retaining the very jobs responsible for managing that usage.

Of course, businesses should absolutely manage their cloud services just as they do other variable utilities and put processes in place to ensure they turn off what's not being used, even if it served a past purpose. Just as you wouldn't blast the HVAC at home while you were on vacation, unused cloud capacity is running up a bill.
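In practice, that discipline often starts with something as simple as flagging resources whose recent utilization is effectively idle. The records, field names and threshold below are hypothetical; a real version would pull metrics from the cloud provider's monitoring APIs rather than a hard-coded list.

```python
# Sketch of the "turn off what's not being used" discipline: flag resources
# whose recent utilization is effectively idle. All data here is made up.

resources = [
    {"name": "analytics-dev-cluster", "avg_cpu_7d": 0.01, "monthly_cost": 4200},
    {"name": "prod-api",              "avg_cpu_7d": 0.63, "monthly_cost": 9800},
]

IDLE_THRESHOLD = 0.05  # below 5% average CPU over a week counts as idle

idle = [r for r in resources if r["avg_cpu_7d"] < IDLE_THRESHOLD]
for r in idle:
    print(f"Candidate to shut down: {r['name']} (${r['monthly_cost']:,}/month)")
```

The review-and-decide step still belongs to a person who knows why the resource exists, but automating the discovery keeps invisible infrastructure from quietly running up the bill.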

The cloud still needs people. Recasting the cloud, remaining disciplined on spending, and extracting value from the services you retain requires a wholesale change of thinking.

Internally, leaders need to build transparency with partners who are less technologically oriented to explain how deriving value from the cloud means investing in the right talent to leverage its capabilities. In return, those same partners can feed business acumen back into cloud management. The cloud offers opportunities for efficiency for all parts of an organization, so everyone in the C-suite has a stake in itand should align on strategy. Having a unified executive team on these converging and complex business issues will remind you of businesswide priorities and help enhance the return on your cloud investment.

The best organizations have a harmonious relationship with their cloud provider so they can be in lockstep to develop new services as business environments change. They should align on the goals, pathways, and budgets of their business. Absent those conversations, the relationships become transactional: Clients derive less and less value, and thin budgets risk indiscriminate cloud cuts.

That gets back to the original point: Maximizing cloud potential, especially in uncertain economic times, requires new mentalities, more discipline, and some level of strategic investment. Firms that wait until the balance sheet forces their hand won't have the time or space to develop the best strategy. It's time for a completely new approach to managing the cloud.

Joe Atkinson is Chief Products & Technology Officer at PwC.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not reflect the opinions and beliefs of Fortune.


Read this article:
Theres a storm brewing in cloud computingand most firms arent prepared for it - Fortune

Read More..