Most HPC data centers will deploy quantum computing within the next few years – TechRadar
Most high-performance computing (HPC) data center operators expect to deploy quantum computing solutions within the next couple of years, a new report from Atos and IQM has found.
The firms polled 110 key decision-makers from HPC centers worldwide and found that getting optimal performance out of HPC, while ensuring security and resilience, is getting more challenging for users.
To tackle the problem, 76% plan on using quantum computing by 2023. Furthermore, 71% plan to move to on-premises quantum computing by 2026.
In fact, quantum computing is the number-one technology in Europe and among the top three worldwide. Three-quarters (76%) of HPC centers are already using quantum computers, or have plans to do so within the next two years. They expect quantum computers to solve supply chain logistics challenges, as well as those related to climate change. They also expect it to solve existing problems faster, and reduce overall computing costs.
Furthermore, the top use cases for HPC centers are database searching, investment risk analysis, molecular modeling, and asset management.
Being able to mix standard elements with custom-developed infrastructure components is what makes cloud an essential part of the HPC architecture, the report further found. Hybrid and cloud deployments are of high priority all over the world, but very little is known about how quantum will work side-by-side with classical HPC infrastructure.
This will result, the report concludes, in the growth of outsourcing operations and maintenance in quantum computing.
Unlike classic computers, whose bits (the basic unit of information) can only have two states (either 0 or 1), quantum computers' qubits can take advantage of the collective properties of quantum states (superposition, interference, entanglement) when performing calculations. That can make them vastly faster than traditional computers for certain workloads, but at the moment, they are only capable of solving specific computational problems.
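The bit-versus-qubit distinction is easy to see in a toy simulation. The sketch below (plain Python, purely illustrative; real quantum hardware works nothing like this) tracks a single qubit as a pair of complex amplitudes and applies a Hadamard gate to put it into an equal superposition of 0 and 1:

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1;
# |a|^2 is the probability of measuring 0, and |b|^2 of measuring 1.
def hadamard(state):
    """Apply the Hadamard gate, which puts a definite basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1 + 0j, 0 + 0j)       # start in the definite state |0>
qubit = hadamard(qubit)        # now an equal superposition of 0 and 1
p0, p1 = probabilities(qubit)  # both probabilities are 0.5
```

Applying the gate a second time returns the qubit to a definite 0: an interference effect with no classical-bit analogue.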
Friday FOSS fest: Franz, RamBox, Pidgin and more – The Register
Most modern chat systems are entirely proprietary: proprietary clients, talking proprietary protocols to proprietary servers. There's no need for this: there are free open standards for one-to-one and one-to-many comms for precisely this sort of system, and some venerable clients are still a lot more capable than you might remember.
But as it is today, if you need to be on more than one chat system at once, the official way is to install each service's client app, meaning multiple clients or, at best, multiple tabs open in your web browser. Most of these "clients" are JavaScript web apps anyway, running inside Electron, an embedded Chromium-based single-site browser. Which is fine, but Chrome is famously memory-hungry.
There is a brute-force way round this: have one app that embeds lots of separate Electron instances in tabs. There are a few of these around: first came RamBox, followed by Franz. Both use the "freemium" model: there's a completely functional free client, plus subscriptions for extra features. If you prefer to avoid such things, both services have no-cost forks: Ferdi from Franz and Hamsket from RamBox. A newer rival still is Station.
They're not perfect, but these messaging aggregators are very handy: you get all your messaging apps in a single client, with a single set of notifications, and they're separate from your browser. You can configure multiple accounts on each service, which can be tricky in a browser if it stores your credentials. The snags are that the UI inside each tab is totally different, and they are very memory-hungry: each tab takes hundreds of megabytes of RAM, and if you have a lot of tabs, the parent app can easily snarf a couple of gigabytes. That's the price of building apps in JavaScript.
But there is another way. If you were online in the 1990s, you may recall the early days of online chat, with multiple proprietary "instant messengers": AIM, Yahoo, MSN and so on. Most of them have been shut down now, although the oldest of all, ICQ, was spun off by AOL and is still around. Some of the clients could connect to rival services, leading to decidedly hairy hacks to validate that clients were genuine, such as AOL intentionally exploiting a buffer overflow in its own code. This didn't stop third parties creating their own clients, such as the Linux client GAIM.
Back in 1999, a group came together to create a free open standard for person-to-person messaging: Jabber, later renamed the Extensible Messaging and Presence Protocol or XMPP. For a while, it was widespread, including large corporations such as Facebook and Google, although most have removed support for it now.
The original purpose of GAIM went away, but the app did not. The team added support for other operating systems and protocols and renamed it Pidgin. It's still very much alive. It runs on Windows as well as Linux, and it works not only with any XMPP service, such as Cisco's, but also with Apple's Bonjour, Google Talk, GroupWise, ICQ, IRC, SIMPLE, and Zephyr.
There are also plugins available for dozens more: Telegram, Facebook Messenger, both personal and business versions of Skype, Discord, Mattermost, QQ, Rocket.chat, Twitter, Slack, Steam, Threema, WeChat, and more. It may not talk to every chat service out there, but it supports most of them.
There is a drawback with multiprotocol clients like this, though, be they tabbed web apps or true native client-server setups: you need to configure all the protocols you will use in each client. A newer protocol hopes to tackle that problem: Matrix. Matrix can do point-to-point conversations, as XMPP does, but also channels and chatrooms, and, more importantly, it can link to other services via server-side bridges. A Matrix client (the reference one is Element) can bring multiple messaging services together into a single inbox in a single local app, connected via a single login.
Matrix can be tricky to configure, though, so some companies are offering paid-for messenger-unification services running on top of the Matrix protocol. Beeper is a commercial effort and includes Apple iMessage via a hilarious workaround, whereas the cheaper Element One is basically a hosted version of Matrix. A new mystery contender is Texts, which is closed-source for now although the company says it will open-source its SDK later.
There never was a golden age of any-to-any chat systems, but fifteen years ago, things were a lot better than they are now. There is reason to hope, though: there are signs that things are getting better.
FBI spams thousands with fake infosec advice after ‘software misconfiguration’ – The Register
The FBI has admitted that a software misconfiguration let parties unknown send legit-looking email from its servers.
A statement from the bureau, dated November 14, states the agency "is aware of a software misconfiguration that temporarily allowed an actor to leverage the Law Enforcement Enterprise Portal (LEEP) to send fake emails."
Spam-tracking service Spamhaus tweeted about the incident on November 13.
The mails contained a warning that FBI monitoring had detected "exfiltration of several of your virtualized clusters in a sophisticated chain attack" perpetrated by a chap named Vinny Troia, founder of the infosec firms Shadow Byte Cyber and Night Lion Security.
There is no indication Troia had anything to do with the incident and The Register makes no suggestion he was in any way involved. However, an entity using the name and Twitter handle "@pompompur_in" appears to have told Krebs on Security they were behind the incident.
"I could've 1000% used this to send more legit looking emails, trick companies into handing over data etc.," Pompompurin told Krebs. "And this would've never been found by anyone who would responsibly disclose, due to the notice the feds have on their website."
Troia also appears to have attributed the incident to @pompompur_in.
For what it's worth, @pompompur_in's Twitter profile states it also operates a private account on the service with the handle @seds. The profile for that account reads: "Call me vinny troia the way I be selling DBs." Other @pompompur_in posts suggest bad blood between whoever operates the account and Troia.
Whoever was behind the attack, the FBI has admitted it was real and that a server it operates was used to send the mails. Another Spamhaus tweet suggests that whoever got in was able to use the FBI server to send two spurts of mail, with around 100,000 messages making it out.
The server in question was part of LEEP, which the FBI describes as "a secure platform for law enforcement agencies, intelligence groups, and criminal justice entities [that] provides web-based investigative tools and analytical resources" for other law enforcement agencies.
"Users collaborate in a secure environment, use tools to strengthen their cases, and share departmental documents." Or at least that's what they do when they're not trying to figure out what "exfiltration of several of your virtualized clusters in a sophisticated chain attack" means.
But we digress.
The FBI explains that the server was "dedicated to pushing notifications for LEEP and was not part of the FBI's corporate email service", and that no data or personally identifiable information was accessed.
"Once we learned of the incident, we quickly remediated the software vulnerability, warned partners to disregard the fake emails, and confirmed the integrity of our networks."
Unusually, the FBI's posts don't mention an investigation into the incident. Perhaps the Bureau's waiting for the weekend to end before trying to track down @pompompur_in.
Dynatrace : Automatic connection of logs and traces accelerates AI-driven cloud analytics – marketscreener.com
As digital transformation continues to accelerate and enterprises modernize with the adoption of cloud-native architectures, the number of interconnected components and microservices is exploding. Logs are a critical ingredient in managing and optimizing these application environments. Dynatrace now unifies log monitoring with its patented PurePath technology for distributed tracing and code-level analysis. Logs are now automatically connected to distributed traces for faster analysis and optimization of cloud-native and hybrid applications.
Customers expect enterprises to deliver increasingly better, faster, and more reliable digital experiences. Cloud-native observability is a prerequisite for companies that need to meet these expectations. Observability enables a holistic approach to automation and BizDevOps collaboration for the optimization of applications and business outcomes.
Logs are a crucial component in the mix that help BizDevOps teams understand the full story of what's happening in a system. Logs include critical information that can't be found elsewhere, like details on transactions, processes, users, and environment changes.
A key element of effectively leveraging observability is analyzing telemetry data in context. Being able to cut through the noise, with all the relevant logs at hand, dramatically reduces the time it takes to get actionable insights into the optimization and troubleshooting of workloads.
Without automation, this contextualization is hardly feasible, especially in large and dynamic environments. Modern heterogeneous stacks consist of countless interconnected and ephemeral components and microservices. Log entries related to individual transactions can be spread across multiple microservices or serverless workloads. Manual and configuration-heavy approaches to putting telemetry data into context and connecting metrics, traces, and logs simply don't scale.
With PurePath distributed tracing and analysis technology at the code level, Dynatrace already provides the deepest possible insights into every transaction. Starting with user interactions, PurePath technology automatically collects all code execution details, executed database statements, critical transaction-based metrics, and topology information end-to-end.
By unifying log analytics with PurePath tracing, Dynatrace is now able to automatically connect monitored logs with PurePath distributed traces. This provides a holistic view, advanced analytics, and AI-powered answers for cloud optimization and troubleshooting.
Automatic contextualization of log data works out-of-the-box for popular languages like Java, .NET, Node.js, Go, and PHP, as well as for NGINX and Apache web servers. Unlike other approaches in the market, Dynatrace allows you to apply this new functionality broadly via central activation. This automated approach avoids any manual configuration of tracers or agents and the need to restart processes.
In addition, Dynatrace offers an open-source approach to the contextualization of log entries and distributed traces as well via OpenTelemetry.
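Conceptually, this kind of log-trace correlation just means stamping each log line with the ID of the trace that was active when the line was written, so the two data sets can be joined later. The sketch below illustrates the idea with nothing but Python's standard library; it is not Dynatrace's or OpenTelemetry's actual API (real agents and SDKs propagate W3C trace context automatically), and the trace ID here is a made-up stand-in:

```python
import logging
import uuid

# Illustrative only: a logging filter that stamps a trace ID onto every
# record, so each emitted log line can be joined to its distributed trace.
class TraceContextFilter(logging.Filter):
    def __init__(self, trace_id):
        super().__init__()
        self.trace_id = trace_id

    def filter(self, record):
        record.trace_id = self.trace_id  # enrich the record in place
        return True                      # keep the record

trace_id = uuid.uuid4().hex              # stand-in for the active span's trace ID
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(levelname)s trace_id=%(trace_id)s %(message)s"))

logger = logging.getLogger("payments")
logger.addHandler(handler)
logger.addFilter(TraceContextFilter(trace_id))
logger.setLevel(logging.INFO)

logger.info("credit card verification failed")  # line now carries the trace ID
```

With the ID on every line, a log viewer can pivot from a single log entry to the whole transaction, which is exactly the user journey the PurePath Logs tab described below provides.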
You can instantly investigate logs related to individual transactions on the new Logs tab in the PurePath view. This instantly reveals additional context.
The example below includes analysis of a payment issue. It shows that a call to the payment provider was declined because the credit card verification failed. From the call perspective, you can easily see the related entry in the code-level information to understand where in your code this specific log entry was really created.
Dynatrace makes it easy to view the log lines related to individual spans or a broader view that covers transactions end-to-end or even entire workloads.
Uniquely, Dynatrace also provides connections to the processes that handle each call.
In the screenshot above, you can see that a single trace created 64 different log entries. The top entry, marked with status ERROR, is critical to the analysis of this issue.
This seamless user journey is also available from the log viewer side. You can easily get from individual log lines to a transaction-centric view for additional context and analysis.
Starting with OneAgent version 1.231, you can activate our OneAgent code modules for Java, .NET, Go, Node.js, PHP, NGINX, or Apache web server to automatically enrich logs with trace context without any manual code or configuration change on your workload. Just go to Settings > Server-side service monitoring > Deep Monitoring > New OneAgent features. This ensures that trace IDs are automatically added to log lines for transaction-based analytics.
Structured log entries that are ingested via the Generic log ingestion of Log Monitoring V2 will show up in related PurePath traces starting with Dynatrace version 1.232.
Within the next 90 days, all transaction-related logs will show up in PurePath view after activation of this new functionality.
To find out more, see Dynatrace Log Monitoring documentation and PurePath distributed tracing documentation.
Disclaimer
Dynatrace Inc. published this content on 18 November 2021 and is solely responsible for the information contained therein. Distributed by Public, unedited and unaltered, on 18 November 2021 13:02:03 UTC.
EU digital sovereignty project Gaia-X opens its summit with the departure of Scaleway – The Register
French cloud hosting outfit Scaleway is to depart the EU's data sovereignty project, Gaia-X, with CEO Yann Lechelle worrying that what began with splendid ideals is getting increasingly mired in the status quo.
Scaleway's announcement came as the second Gaia-X summit got under way, titled "Here to deliver", and amid rumblings from members over the sponsorship of the gathering by the likes of Huawei and Alibaba as well as the involvement of US cloud giants such as Microsoft and AWS.
"Scaleway will not renew its GAIA-X membership," the company said. "The objectives of the Association, while initially laudable, are being sidetracked and slowed down by a polarization paradox which is reinforcing the status quo, that is an unbalanced playing field."
Scaleway was one of the founders of Gaia-X, and its departure will doubtless cause more than a little discussion behind the scenes during this week's event.
"It's been brewing for many months," Scaleway CEO Yann Lechelle told The Register. Of Gaia-X, he said: "The idea was to create a sort of improvement in terms of sovereignty, right. And sovereignty is very loaded as a term.
"Does anyone question the sovereignty of Microsoft in the US?" he asked, rhetorically. "Nobody does. What we do know is that we do not have sovereignty when it comes to tech, and that is the main issue."
Lechelle went on to highlight differences between the German and French approaches, noting that a preference by some countries to work with the tech giants ("itself, not a problem," he added) "reinforces the dependence and the dependencies on extraterritorial tech monopolies."
"Gaia-X," he said, "evolved in terms of governance. I fought very hard that the governance remained strictly European. But then, the German counterparts and even the French big players (not cloud providers, but big consumers of cloud) wanted to keep working with the big tech players.
"And so at the end of the day, the reason why we're leaving is that Gaia-X as a construct is only reinforcing the status quo, which is that dominating players will keep dominating."
As the news broke, Sid Nag, a VP analyst at Gartner, told The Register: "My feeling is that players like Scaleway might be saying, 'You know what, is it worthwhile for us to continue to invest in this?' For a second or third-tier cloud provider it involves a significant investment of time and money and energy. They're probably saying, 'I'm not seeing the benefit of doing this any more.'
"The Gaia-X initiative was about creating a sovereign cloud of sorts for the Eurozone. But there's this secondary motivation of competing with the hyperscalers based in North America.
"If there was a way for North American providers to operate on Europe-based data for their European clients in a sovereign manner then I don't think this would even be a conversation today."
On the latter point, Lechelle noted efforts by the French government to, as he put it, "move away from the dusty mainframes and move to the public cloud."
"Except, at the same time, there's the CLOUD Act. And the CLOUD Act gives the US way too much access worldwide. So we talk about extraterritorial laws. And the EU is weak in that sense, because we do not have reciprocity."
As Microsoft continues to push Office customers to the cloud, European governments have found themselves in somewhat of a quandary when it comes to data. "So what they're saying in a way is like, OK, here is the deal: Microsoft, you need to do a joint venture with French players, create a packaged version of Azure, and Office 365. It will be managed by French operators, and therefore it will be disjointed from the US cloud and therefore not subject to the CLOUD Act."
As for Gaia-X, Lechelle said Scaleway would "be looking in from the outside." After all, Gaia-X is an open project. "But," he added, "we don't have time for this now; we need to spend our energy becoming a relevant player at scale."
Reactions to Scaleway's departure have so far been muted. Lechelle put it diplomatically, saying: "Some of the members might want to reach out to us (and they have); maybe they will disengage, maybe they will not..." or, as he suggested, some might opt to leave things until after this week's summit.
"We did not intend to create a mass movement away from Gaia-X," he told us. "I actually believed in it, even though the foundation was asymmetrical because of German interests, working with US players, mostly, and French players, who were also cloud providers."
Other members reiterated their support. An HPE spokesperson told The Register the company was "committed to the framework. We are contributing to the Gaia-X foundations and driving a number of projects with customers and partners."
Amanda Brock, CEO of Gaia-X member OpenUK, also confirmed her organisation's support and commitment. "The Gaia-X members represent the state of the art for Europe in terms of digital sovereignty and they act as the backbone for Europe's federated data model," she said.
"The UK will engage with this more fully over time, and with that in mind, we are working with a group across the UK to shape a potential Gaia-X Hub for the UK to launch in 2022."
Of Scaleway's departure, Brock said: "With 300-plus members, if we are realistic, there will inevitably be a level of dropout. I don't see anything to be surprised about in the announcements this week. At the same time as we see this natural evolution, we see a true doubling down on open from Europe."
Simon Hansford, CEO of UKCloud, took a slightly more cautious tone: "It has been puzzling to see the increasing presence of global giants such as Huawei and Alibaba on Gaia-X, which we wholeheartedly support in principle. Nonetheless, this does raise questions as to whether the current setup of Gaia-X is capable of fulfilling its objective as a genuinely sovereign cloud for Europe."
The Register put Scaleway's points to Gaia-X and will update should we receive a response.
Why is AI an obsession for business insiders? – just-auto.com
Business leaders are still worrying about artificial intelligence (AI), but with Facebook pushing hard into the metaverse, augmented reality (AR) has also proven a massive concern for corporate chieftains.
That's according to analytics firm GlobalData. The company defines a theme as any issue that keeps chief executives awake at night. In a thematic survey published in October 2021, GlobalData gauged the business community's current sentiment towards emerging technologies that kept executives stirring into the early hours.
The research found that AI was the technology perceived as most disruptive in Q3 2021, regaining its position from AR, which, as Verdict previously reported, held the top spot in the previous quarter.
66% of professionals from over 30 industries stated in the poll that AI would deliver either slight or significant disruption to their industry. This was a sharp increase from the previous quarter when 49% said AI would disrupt their industry. It returns AI to the position that it held in Q4 2020 and Q1 2021.
AR has gone in the opposite direction, and now only 48% see the technology as disruptive, down from 70% in Q2 2021.
The interactive tech, which blends physical space with digital visualisations, reached wider public awareness through the Pokémon Go craze in 2016, but also has real business potential. AR tech, for example, is being used for remote collaboration, training, maintenance, customer support, and product design.
It's also seeing large uptake in ecommerce as a utility, both for consumers and brands. Various social giants have merged AR and ecommerce into their social media platforms, offering users the ability to try on products virtually.
The buzz around AR has grown recently thanks to Facebook's expansion into the metaverse, a virtual world where users can share experiences and interact in real-time within simulated scenarios. This is made possible through AR applications and virtual reality (VR) headsets.
Facebook is banking on the digital world enough to have renamed itself as Meta in October. The recent name change came alongside the company's pivot to becoming a metaverse company instead of a social media one (and an increasingly toxic social media brand at that).
Name change or not, it seems insiders may have seen through the AR hype, according to GlobalData's polling.
"The greater variation in perceptions of AI and AR noted in Q3 2021 could be because, like cybersecurity and cloud computing, both have a wide range of applications," explains Rupantar Guha, associate project manager of Thematics at GlobalData. "However, unlike cyber and cloud, deployment of AI and AR is at an early stage."
"Regarding the metaverse, AR and VR are key technologies in this developing mega-theme. VR-based metaverses are arriving in the market (e.g., Facebook's Horizon Workrooms) and AR-based metaverses (e.g., Microsoft Mesh) are also in development.
"It is too early to say which technology will outpace the other in the short run, given that both are in nascent stages of development and the metaverse is still largely conceptual. However, AR's accessibility through web browsers, smartphones, and smart glasses (that are less expensive than VR headsets) could give it an edge over VR in the long run."
In the meantime, it is AI which business bosses are banking on as the emerging technology of choice.
Of the emerging technologies included in GlobalData's polls every quarter, perceptions of AI and AR are the most volatile. In the company's view, this is because confidence in the disruptive potential of the two technologies is fragile and likely to continue to experience variation as more companies implement them.
The majority of the respondent pool said that they felt more positive towards emerging technologies in Q3 2021 than a year ago. At least half of all those polled said they were more positive towards four of the seven technologies that GlobalData enquired about: cybersecurity, AR, AI and 5G.
AR and AI were behind only cybersecurity in positive sentiment change. This indicates that, despite the volatile perceptions of the two technologies regarding the level of disruption they can bring, a majority still felt more favourable than in 2020.
55% of respondents said that AI would live up to all its promises, which is only a small drop from the 57% who said the same in the previous quarter. The continued good performance of AI in this indicator suggests that businesses are hopeful that the technology will ultimately deliver significant benefits.
GlobalData predicts the global AI platform market will be worth $52bn within three years. The burgeoning theme is driven by the obvious business benefits of AI.
The tech allows businesses to accelerate digital innovation and development, resulting in increased efficiency, lower operational costs, higher revenues and improved customer experience.
Enterprise AI projects often share three main objectives. One is the automation of business processes as automating routine day-to-day activities and obligations contributes to more efficient use of labour, with workers able to focus their time and energy on higher-value tasks. AI lets businesses also reduce operating costs and cut out errors that come part and parcel with routine processes and tasks in the workplace.
AI can also provide business insights that make sense of vast amounts of data to predict customer preferences and generate high-quality sales leads. Essentially converting information into knowledge, AI can help with everything from providing personalised product recommendations to identifying credit fraud.
Finally, the tech improves customer engagement. AI-driven customer service capabilities such as virtual assistants and chatbots enable businesses to communicate with high volumes of customers every day, something which proved especially useful in the pandemic, when face-to-face options weren't possible. The pseudo-human face of AI can provide a more personalised experience that drives growth, reduces costs, and improves retention and overall customer satisfaction.
"Companies must remember, though, that delivering a successful AI project is not easy," warns GlobalData thematic research director Ed Thomas. "It requires meticulous planning, detailed preparation, and complete buy-in from all parts of the business. It also requires an understanding of the problem that needs solving."
AI alone will not cure all ills, but it can successfully address specific business challenges.
Another part of AI's attraction is that the tech is, arguably, a rising tide lifting the profile of various emerging technologies.
Take the example of AR. AI technologies such as machine learning (ML), conversational platforms, and AI chips power most of today's AR devices and apps.
AR developers use ML to improve the user experience (UX) by continually analysing user activities. Apple's Core ML and Google's TensorFlow Lite ML frameworks support ARKit and ARCore, respectively, and allow developers to run ML models to improve object recognition in photos, convert speech to text, and enable gesture recognition. Eye tracking and facial recognition, fast becoming standard functions across all AR devices, use ML to improve UX.
Virtual assistants like Amazon's Alexa, meanwhile, enable the hands-free operation of AR devices. This is critical for some use cases, especially in enterprises: for example, doctors using Vuzix's M400 smart glasses for training and conducting patient rounds remotely during the pandemic. Equipped with hands-free voice support for Skype for Business and Zoom, the glasses help to keep human contact to a minimum with the aid of AI-powered conversation.
"Smart glasses' and headsets' heads-up displays bring visuals into the user's field of view, while voice assistants enable voice-based control of devices and apps," elaborates Guha. "The use of voice in AR devices is limited to specific tasks that add convenience to the user, helping them avoid using hands.
"Voice assistants are used as supporting capabilities such as pulling up apps to view documents and connecting with remote experts in industries such as healthcare, oil & gas, logistics, and manufacturing, among others."
Many organisations are also putting their faith in AI to improve their cybersecurity, with the tech providing cover for the continuing cybersecurity skills gap.
AI offers a more proactive defensive approach to discover and analyse the growing landscape for attacks. As GlobalData reports, the reality is that all software is at risk through human error and inadvertent security holes, which attackers can exploit.
With AI, the target is to have more comprehensive, predictive assessments of breach risk that recognise and prioritise the necessary steps to avoid breaches. The concept is enough to have led UK-US cybersec brand Darktrace to make its name and go public. Its Cyber AI Analyst aims to emulate human thought processes and automate tasks to continuously investigate cyber threats at machine speeds. The proprietary software is said to reduce the average time to investigate threats by 92%.
Darktrace was founded by Cambridge University mathematicians and US and UK government cyber intelligence experts, backed by infamous Autonomy founder Mike Lynch, once dubbed Britain's Bill Gates and today a wanted man facing extradition to the US over the sale of Autonomy to HP in 2011. Darktrace's original AI tech, the Enterprise Immune System, was supplemented by autonomous response technology, which allowed the system to react to in-progress cyberattacks.
It should be noted though that Darktrace shares recently plunged by 23% over what's seen as a gap "between promise and reality" regarding its products: and the legacy of Lynch and Autonomy hangs over the company. Several executives at Darktrace, including its founder and CEO Poppy Gustafsson, have previously held roles at Autonomy and at Invoke Capital, Lynch's VC fund.
There is also the concept of AIoT, an amalgamation of AI and the Internet of Things (IoT). AIoT involves embedding AI technology into IoT components; combining data collected by connected sensors and actuators with AI allows for reduced latency, increased privacy and real-time intelligence at the edge. It also means that less data needs to be sent to, and stored on, cloud servers.
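As a minimal illustration of that edge-filtering idea (a hypothetical threshold model, not any vendor's SDK), a device can score sensor readings locally and upload only the outliers, so most raw data never leaves the edge:

```python
# Hypothetical AIoT sketch: a stand-in "model" flags anomalous sensor
# readings on-device; only those are sent to the cloud.

def is_anomaly(reading: float, mean: float = 20.0, threshold: float = 5.0) -> bool:
    """Tiny stand-in for an on-device model: flag readings far from the mean."""
    return abs(reading - mean) > threshold

def filter_for_upload(readings):
    """Return only the readings worth sending to the cloud."""
    return [r for r in readings if is_anomaly(r)]

readings = [19.8, 20.1, 31.5, 20.3, 7.2, 20.0]
upload = filter_for_upload(readings)
print(upload)                                  # [31.5, 7.2]
print(len(upload), "of", len(readings), "readings uploaded")
```

Here two of six readings cross the (assumed) threshold, so only a third of the data is transmitted and stored centrally, which is the latency, privacy and bandwidth benefit the AIoT pitch describes.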
Apple is one name investing in this nascent field; in January 2020, the tech giant acquired Xnor.ai, which offered AI-enabled image recognition tools capable of functioning on low-power devices.
With AI a rising tide lifting all boats, and even profit margins, it's no wonder the technology is such an obsession for the global business community.
GlobalData polls were conducted online between the first week of July and the fourth week of September 2021, and ran on Verdict, GlobalData's network of B2B websites. In total, the polls received 2,128 responses, distributed unevenly across the polls.
This article is part of a special series by GlobalData Media on artificial intelligence. Other articles in this series include:
See more here:
Why is AI an obsession for business insiders? - just-auto.com
Sponsored post: Ready for cloud? Five factors to consider before choosing your partner – TechCrunch
By Vinay Kumar, senior vice president, Oracle
Given all the talk about cloud computing, you might think that all business workloads are already running in a cloud. You would be mistaken.
While new applications are naturally born in the cloud, older applications can be challenging to migrate, and many remain in companies' own server rooms or data centers. Companies should not have to choose between sacrificing existing applications and building for the future.
Since the majority of workloads have yet to move to the cloud, here are five things all companies should consider as they make their IT deployment plans.
The bulk of these on-prem workloads are the complicated-but-important applications that run a company's business. The term mission critical is no misnomer for the manufacturing, inventory, and financial software packages that pay the bills. If a migration of even one of these workloads goes awry, there will be a very real negative impact on the company's bottom line. And the challenges can persist, as companies work to run, maintain and secure legacy applications in the cloud.
Early cloud providers tried to bypass that issue by telling customers to rewrite (or refactor) these applications to run on first-generation clouds. That's a hard sell for C-level executives who don't want to get caught up in technology wars and who realize that these applications still power the business. Why throw them out if they work?
The good news is that Oracle built its next-generation cloud to run these applications in their current state while also endowing them with the cloud benefits of scale-up-and-down capacity and price flexibility.
Moving on-prem applications to Oracle Cloud Infrastructure (OCI), while not as jarring as rewriting them for another cloud, still requires expertise and support. Oracle Cloud Lift Services, a free program launched recently, will help companies get their deployments done smoothly. New businesses can also lean on engineering talent from Oracle for Startups, which provides support from engineers experienced in building, testing and running nascent products on OCI.
In cloud as in real estate, location matters. A lot. So, the fact that we are able to quickly deploy full-featured cloud regions is an advantage and an important differentiator for Oracle and our customers. Not every company, or even region, has the acreage or wherewithal for massive scale-out data centers.
Data sovereignty laws enacted in several countries impose requirements on the location of some data. In some cases, not only must user data be stored locally, it also cannot be transmitted through servers in any other country, even if its final destination is another domestic cloud region.
Most large corporations operate in many countries and bear the burden of adhering to countless local requirements. These companies clearly need a cloud provider with a vast network of cloud regions in many countries, accommodating all sovereignty requirements. Toward that end, there are now 33 Oracle Cloud regions, with 11 more on the drawing board to launch by the end of 2022.
But this location metric is not just about geography. Lots of companies in certain industries (financial services, medical, pharmaceutical, and education, among them) are also constrained by regulations on how their data is stored and processed.
Flexibility of deployment options is as important as geographical agility. Many companies, including those in the industries mentioned above, would like to run some applications and keep some data in facilities that are fully under their control, while also using public cloud resources for other corporate tasks.
Providing organizations in these markets with their own cloud regions is a unique advantage offered by Oracle Dedicated Region Cloud@Customer.
These customers can run the exact same services in their private clouds as they run in public cloud deployments, whereas some first-generation cloud providers only offer subsets of their services. Deploying Cloud@Customer gives businesses an elegant way to maintain consistent operations, upgrade legacy applications, reduce costs, and meet demanding data residency and latency requirements. For enterprise SaaS applications, including Oracle Fusion Cloud Applications, this means customers can run apps closer to home, decreasing latency and resulting in faster response times. Oracle achieves service parity across private and public cloud, which is a key requirement for running a truly hybrid cloud implementation and something that industry leaders have advocated for years.
Choice as to where given workloads run and where their associated data are stored is an important business requirement. But cloud providers ought to offer a certain amount of guidance to help customers avoid making costly mistakes in their cloud deployment.
Typically, when there is news about a major breach of a cloud-based workload, it is attributed (by the cloud vendor) to user error: the customer, not the vendor, left its cloud storage buckets unprotected by failing to turn on encryption, or by leaving ports open.
This is the sort of freedom customers would love to do without. Most businesses would be happy if their cloud provider offered guardrails to avoid costly mistakes like this. With Oracle Cloud Infrastructure, encryption and other security safeguards are activated by default.
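As a sketch of what such guardrails amount to in practice (a hypothetical configuration checker, not the actual OCI API), a deployment pipeline can fill in secure defaults and refuse insecure settings before they reach production:

```python
# Hypothetical guardrail check: secure-by-default settings plus an audit
# that rejects the misconfigurations described above (encryption off,
# publicly exposed storage). Names and fields are illustrative only.

DEFAULTS = {"encryption_enabled": True, "public_access": False}

def apply_secure_defaults(config: dict) -> dict:
    """Fill in any missing settings with secure defaults."""
    return {**DEFAULTS, **config}

def audit(config: dict) -> list:
    """Return a list of violations; an empty list means the config passes."""
    issues = []
    if not config.get("encryption_enabled", False):
        issues.append("encryption disabled")
    if config.get("public_access", True):
        issues.append("bucket publicly accessible")
    return issues

risky = {"public_access": True}             # user explicitly opened the bucket
print(audit(apply_secure_defaults(risky)))  # ['bucket publicly accessible']
print(audit(apply_secure_defaults({})))     # []
```

The point of the design is that an omitted setting lands on the safe side automatically; only a deliberate override can introduce risk, and even that is flagged.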
And instead of providing a patchwork of discrete, confusing, and sometimes overlapping security tools, OCI offers a built-in set of security capabilities, so customers don't have to piece their defenses together by hand. Businesses can also choose to implement security options over time at their own pace, based on what works best for their current environment and future plans.
In addition, use of Oracle Autonomous Database technology means the infrastructure updates itself in near real time without requiring a phalanx of human administrators to manually track and respond to the latest security vulnerabilities.
In cloud computing, cost isn't everything, but it's pretty darn important. That's why prospective customers really must look at cost options up front for all key services, including compute, storage and networking. Outbound networking charges, in particular, have been a sore point for many, many early cloud customers. These data egress charges accrue when data is shipped out of a given cloud to the internet and beyond.
Virtually no cloud player charges for data streaming into its cloud from customers, but one first-generation provider notoriously starts the meter running after just one GB of data ships out to the internet per month. Those dollars add up incredibly fast, leaving many customers shell-shocked because they probably didn't realize, or could not predict, how much data they might transfer at some point in the future.
OCI, on the other hand, starts charging only after 10 TB of data ships out. This means our customers can transfer 10,000 times as much data with OCI as they could with the other provider, without paying a cent. In addition, Oracle has recently teamed up with the Cloudflare-led Bandwidth Alliance of 19 tech companies that aim to minimize the cost of data egress. This is a huge step forward for cloud customers.
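The difference between those free tiers is easy to quantify. The sketch below uses a hypothetical $0.09/GB egress rate purely for illustration (actual rates vary by vendor, region, and volume tier):

```python
# Illustrative arithmetic only: compare monthly egress bills under a
# 1 GB free tier versus a 10 TB (10,000 GB) free tier. The $0.09/GB
# rate is an assumed figure, not any vendor's published price.

def egress_cost(gb_transferred: float, free_gb: float, rate_per_gb: float = 0.09) -> float:
    """Bill only the data shipped out beyond the free allowance."""
    billable = max(0.0, gb_transferred - free_gb)
    return round(billable * rate_per_gb, 2)

monthly_egress_gb = 5_000  # 5 TB shipped out in a month

print(egress_cost(monthly_egress_gb, free_gb=1))       # 449.91
print(egress_cost(monthly_egress_gb, free_gb=10_000))  # 0.0
```

At 5 TB a month, the small free tier produces a recurring four-figure annual bill while the 10 TB tier charges nothing, which is why egress allowances deserve scrutiny before a workload's data gravity makes switching painful.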
While the cloud computing era is still young, it is showing signs of maturity. The technology is getting better and more services are coming online. This is good news for the thousands of businesses still evaluating the best cloud for their existing and future workloads.
There are now a handful of top-tier players in cloud, each with its own strengths and weaknesses. Many businesses realize that reliance on a single cloud infrastructure provider is neither wise nor practical, and are opting for a multi-cloud approach. Working with multiple cloud vendors allows organizations to select the right cloud for each important workload, scale up and down as needed, and make the most of each vendor's cost structures.
If businesses carefully consider these five factors in their selection process, chances are their deployment will be less disruptive, less stressful, and more productive than it would have been otherwise.
Excerpt from:
Sponsored post: Ready for cloud? Five factors to consider before choosing your partner - TechCrunch
Cloud Computing Market Research Report Highlights the Key Findings in the Area of Vendor Landscape, Key Market Segments, Regions, Latest Trends &…
Vendor Landscape
The market structure is expected to remain fragmented during the forecast period. Vendors operating in the market are adopting various marketing and growth strategies such as competitive pricing to compete in the market.
Adobe Inc., Alibaba Group Holding Ltd., Alphabet Inc., Amazon.com Inc., Hewlett Packard Enterprise Development LP, International Business Machines Corp., Microsoft Corp., Oracle Corp., Salesforce.com Inc., and SAP SE are some of the key vendors of this market. Vendors are competing to maintain their market position in the market.
Vendors are also trying to expand their market presence and strengthen their product portfolio by entering partnerships and launching new and innovative products.
For instance, in October 2020, Oracle Corp launched the Oracle Cloud Observability and Management platform, a suite of services to enable better visibility and insight across both cloud-native and traditional technologies, whether deployed in multi-cloud or on-premises environments.
View more about the market's vendor landscape highlights with a comprehensive list of vendors and their offerings.
Key Market Segmentation
Request a FREE sample of this report for more highlights into the market segments.
Regional Market Outlook
North America led the market, contributing 40% of the overall market in 2020, and is expected to increase its contribution to the global cloud computing market by 2025. The rising adoption of cloud solutions by various end-user industries will facilitate cloud computing market growth in North America over the forecast period.
Download our FREE sample report for more key highlights on the regional market share of most of the above-mentioned countries.
Latest Trends Driving the Global Cloud Computing Market
SMEs have started opting for public cloud solutions to scale hardware and resources up or down. Apart from reducing CAPEX, cloud computing solutions can facilitate faster storage, processing, and communication. Clouds also enable the deployment of applications without the need to provision hosting capabilities. Cloud services provide security, facilitate optimum use of resources, and combine the reliability of a dedicated server with the flexibility of cloud resources.
The increased inclination for private cloud solutions for enhanced data security is another major factor supporting the cloud computing market share growth. Security and compliance concerns have been among the primary reasons for unwillingness among organizations to adopt a public cloud solution. There are also many regulations, such as the GDPR in Europe, which impose certain restrictions on where the data can be stored.
A private cloud offers cloud storage resources to a single enterprise or organization. Resources such as storage, servers, and network are not accessible from outside the enterprise network, and they only hold data related to a single business entity. Therefore, a private cloud offers a greater degree of security and control than a public cloud platform.
Find additional information about various other market drivers and trends mentioned in our FREE sample report.
Need More? Are You Looking for Information Not Covered in This Report?
Tailor this report according to your needs. Get it done with our $1000 worth of free customization. Speak to Our Analyst Now!
Related Reports:
Commerce Cloud Market - The commerce cloud market is expected to rise by USD 27.33 billion from 2021 to 2025 at a CAGR of 26.41%. Download a free sample now!
Cloud Backup and Recovery Market - The cloud backup and recovery market has the potential to grow by USD 14.59 billion during 2021-2025, and the market's growth momentum will accelerate at a CAGR of 17.07%. Download a free sample now!
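The report figures above and below quote compound annual growth rates. A CAGR compounds the market size year over year, and a short helper (illustrative base value only, not taken from the report's underlying data) makes the relationship concrete:

```python
# Relationship between CAGR and compounded growth. The base value of 100
# is an assumption for illustration; only the 17.07% rate comes from the
# report summary above.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate as a fraction (0.1707 == 17.07%)."""
    return (end / start) ** (1 / years) - 1

def grow(start: float, rate: float, years: int) -> float:
    """Project a value forward at a constant annual rate."""
    return start * (1 + rate) ** years

# A market of 100 units growing at a 17.07% CAGR over the four annual
# steps from 2021 to 2025 ends up nearly 88% larger:
end = grow(100, 0.1707, 4)
print(round(end, 2))                # 187.84
print(round(cagr(100, end, 4), 4))  # 0.1707
```

Note that a CAGR smooths out year-to-year variation: the actual YoY growth in any single year of the forecast period can differ from the headline rate.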
Cloud Computing Market Scope
Page number: 120
Base year: 2020
Forecast period: 2021-2025
Growth momentum & CAGR: Decelerate at a CAGR of over 17%
Market growth 2021-2025: USD 287.03 billion
Market structure: Fragmented
YoY growth (%): 20.37
Regional analysis: North America, Europe, APAC, South America, and MEA
Performing market contribution: North America at 40%
Key consumer countries: US, China, UK, Germany, and Japan
Competitive landscape: Leading companies, competitive strategies, consumer engagement scope
Companies profiled: Adobe Inc., Alibaba Group Holding Ltd., Alphabet Inc., Amazon.com Inc., Hewlett Packard Enterprise Development LP, International Business Machines Corp., Microsoft Corp., Oracle Corp., Salesforce.com Inc., and SAP SE
Market dynamics: Parent market analysis, market growth inducers and obstacles, fast-growing and slow-growing segment analysis, COVID-19 impact and future consumer dynamics, market condition analysis for the forecast period
Customization preview: If our report has not included the data that you are looking for, you can reach out to our analysts and get segments customized.
About Us
Technavio is a leading global technology research and advisory company. Its research and analysis focus on emerging market trends and provide actionable insights to help businesses identify market opportunities and develop effective strategies to optimize their market positions. With over 500 specialized analysts, Technavio's report library consists of more than 17,000 reports and counting, covering 800 technologies and spanning 50 countries. Its client base consists of enterprises of all sizes, including more than 100 Fortune 500 companies. This growing client base relies on Technavio's comprehensive coverage, extensive research, and actionable market insights to identify opportunities in existing and potential markets and assess their competitive positions within changing market scenarios.
Contact
Technavio Research
Jesse Maida
Media & Marketing Executive
US: +1 844 364 1100
UK: +44 203 893 3200
Email: [emailprotected]
Website: www.technavio.com/
SOURCE Technavio
Follow this link:
Cloud Computing Market Research Report Highlights the Key Findings in the Area of Vendor Landscape, Key Market Segments, Regions, Latest Trends &...
Actiphy Inc., Releases ActiveImage Protector 2022 Backup and Recovery Solution Featuring Direct-To-Cloud Backup and In-Cloud Recovery – Galveston…
Amazon plans to build servers to host cloud computing in Calgary – MobileSyrup
Amazon is on track to open a second cloud computing server hub in Canada with a new location planned for Calgary in Alberta.
The hub is being placed in Calgary to provide better coverage for Western Canada since the only other Canadian server hub for Amazon web services (AWS) is in Montreal.
The tech/retail giant plans to open the data centre at some point in late 2023 or early 2024. Amazon also estimates that the construction and operation of the new server hub, combined with the existing jobs in Montreal, will amount to 5,000 jobs and an investment of $17 billion into the local economies around Calgary and Montreal by 2037.
Obviously, these are some giant numbers with minimal context, so it's hard to gauge how profitable or successful the Calgary hub will actually be. But for developers, websites and tools based in Western Canada that take advantage of AWS, this should bring a fairly substantial speed improvement.
Amazon has also invested in other areas of the province. In April, the company even announced that it was planning to build a giant solar farm to generate 195,000 megawatt-hours of power per year.
Source: Amazon
Read the original here:
Amazon plans to build servers to host cloud computing in Calgary - MobileSyrup