Category Archives: Cloud Computing

3 Strong Buy Cloud Computing Stocks to Add to Your Q2 Must-Watch List – InvestorPlace

These companies are global leaders in the cloud computing space

Cloud computing is the storing and accessing of data and programs over the internet rather than on a computer's hard drive. With today's wired world creating data at an exponential rate, cloud computing is one of the fastest-growing areas of technology. And spending on cloud computing services is growing almost as fast as the data that's being accumulated.

According to Statista, spending on cloud infrastructure services worldwide grew by $12 billion in the fourth quarter of 2023, reaching $73.7 billion for the three months ended Dec. 31. For all of last year, $270 billion was spent on cloud infrastructure services. The dollars up for grabs explain why technology companies large and small are racing into the space and trying to gain market share in the hotly contested and supercompetitive sector. Here are three strong buy cloud computing stocks to add to your Q2 must-watch list.


Microsoft's (NASDAQ:MSFT) current growth is not being driven by artificial intelligence (AI) but by its cloud-computing segment. The Redmond, Washington-based technology giant most recently reported that its Intelligent Cloud segment posted $25.88 billion in quarterly revenue, up 20% from a year ago. Within the cloud segment, revenue from Azure and other cloud services grew 30%, ahead of the 27% expected on Wall Street.

In its earnings release, Microsoft noted that it now has 53,000 Azure AI customers, and one-third of them are new to Azure in the past year. The company added that the number of commitments it has received to spend more than $1 billion on its Azure cloud services in the year ahead increased during the last quarter of 2023. The company has also introduced new custom cloud-computing microchips.


Oracle (NYSE:ORCL) is also seeing strong growth driven by its cloud unit. Oracle's cloud services and license support segment, its largest business, saw sales rise 12% to $9.96 billion in Q4 2023. The growth in the cloud services unit offset declines in the company's other units, especially its hardware business, where revenue fell 7% year-over-year to $754 million. The strong cloud growth also led Oracle to issue upbeat forward guidance.

The company expects earnings of $1.62 to $1.66 a share for the current first quarter of 2024, ahead of analysts' estimates of $1.64 in earnings. The company said it anticipates revenue growth of 4% to 6% in the current quarter, also ahead of analyst expectations. Management also said during an earnings call that continued growth of its cloud-computing unit should help the company reach $65 billion in sales by 2026.

ORCL stock has gained 40% in the last 12 months, including a 20% increase so far in 2024.


Amazon (NASDAQ:AMZN) remains the cloud-computing king with a 31% share of the global market for cloud infrastructure. That said, Microsoft is quickly gaining on Amazon and now holds a 24% share of the market. There have been some concerns about Amazon Web Services (AWS) maintaining its dominant position. In its most recent quarterly results, the company reported that revenue from AWS totaled $24.20 billion, matching, though not exceeding, Wall Street's expectations.

The concerns might be a little premature. Many analysts expect Amazon's cloud unit to get a big boost in coming quarters from the adoption of AI technologies. Analysts at Monness, Crespi, Hardt recently reiterated a buy rating on AMZN stock with a $215 price target, implying 20% upside from current levels. The analysts said: "We believe the leading cloud service providers are well positioned to benefit from the early-stage ramp of generative AI projects, including AWS."

AMZN stock has gained 82% in the last 12 months, including a 19% increase so far this year.

On the date of publication, Joel Baglole held a long position in MSFT. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Joel Baglole has been a business journalist for 20 years. He spent five years as a staff reporter at The Wall Street Journal, and has also written for The Washington Post and Toronto Star newspapers, as well as financial websites such as The Motley Fool and Investopedia.

Read the original post:
3 Strong Buy Cloud Computing Stocks to Add to Your Q2 Must-Watch List - InvestorPlace

Hive: Distributed Cloud Computing Company Raises €12 Million – Pulse 2.0

SC Ventures (Standard Chartered's ventures arm) is leading a €12 million (USD $13 million) Series A funding round for distributed cloud provider Hive, which aims to increase businesses' and individuals' access to sustainable and high-powered computing resources. OneRagtime, a French venture capital fund that led Hive's seed round, and a collection of private investors joined the round.

Hive is reinventing the cloud, moving from a centralized model that relies on expensive physical servers to a distributed cloud infrastructure that aggregates individual devices' unused hard drives and computing capacities.

Hive's model helps businesses efficiently manage their cloud-related expenses, reduce dependency on several cloud providers, and significantly reduce cloud energy use.

Since October 2023, Hive has had over 25,000 active users and contributors from 147 countries. These users store their files on hiveDisk and contribute unused hard drives to hiveNet to lower their subscription costs and effectively build the distributed cloud.

The computing capacity contributed to hiveNet also powers hiveCompute, allowing companies to manage workloads such as running GenAI inference, video processing, and 3D modeling. hiveNet's architecture provides access to additional CPU, GPU, or NPU capacity when needed, delivering much-needed computing power. Companies looking for more control can build their own private hiveNet, where IT managers control the devices entirely.

In December, Hive unveiled a Joint Development Partner (JDP) initiative, working closely with key partners to innovate the cloud landscape for businesses leveraging GenAI LLM computations.

Hive is a champion of sustainable technological progress, offering a practical solution to the challenges posed by traditional cloud computing models. With its latest funding round, Hive is set on growing its team and global footprint to address the enterprise markets, starting with startups and SMBs. The team prioritizes several business areas, including product development, building an engaged community of contributing Hivers, and sales and marketing efforts to reach users at scale.

KEY QUOTES:

"Hive is addressing the pressing need for a new cloud paradigm that democratizes access, lowers financial barriers, and encourages innovation. With over 70% of the computing power available in our devices and billions of devices connected to the Internet, Hive's community-driven model builds The Right Cloud to offer a greener, more resilient, and more secure alternative that also promotes a more equitable cloud solution. We thank our investors, as well as INRIA and Bpifrance, for their continuous support as we look to achieve our ambitious goals."

David Gurlé, Hive Founder

"We are big believers in Hive's distributed cloud technology, which will enable cheaper and more efficient access to computing power and storage, a critical point when most of our ventures may have an AI component requiring increasing computing power. In addition to our investment, our ventures will be leveraging Hive's services."

Alex Manson, who heads SC Ventures

"Cloud technology has opened up horizons of innovation, but it also comes with challenges in terms of costs, security, data privacy, and environmental impact, heightened by the increasing demand for computing resources, especially for artificial intelligence. Hive, with its pioneering approach to distributed cloud, makes cloud access more secure, affordable, and efficient for everyone, and enables the sharing of computational power resources. As an early investor and believer, OneRagtime is particularly excited to support Hive's vision and team."

Stéphanie Hospital, Founder & CEO at OneRagtime

See the original post here:
Hive: Distributed Cloud Computing Company Raises 12 Million - Pulse 2.0

Private vs. public cloud security: Benefits and drawbacks – TechTarget

Regardless of whether an enterprise's infrastructure operates in a private, public, or hybrid cloud, or across multiple clouds, cybersecurity is a critical component. Some cloud architectures greatly simplify security tasks and tool integrations, but that often comes at the cost of flexibility.

Let's look at the benefits and challenges organizations face as they compare private vs. public cloud security, as well as hybrid cloud security and multi-cloud security, in 2024 and beyond.

As the name implies, a private cloud grants a business private access to dedicated infrastructure resources within a cloud. This infrastructure has both advantages and disadvantages.

Private clouds are attractive to organizations seeking more granular control over the underlying infrastructure. This commonly includes customer configuration access to the network, OSes and server virtualization platform.

From a security perspective, private cloud's advantages include the following:

The flexibility of private cloud comes at a cost in two areas: pricing and management.

For these two reasons, it is critically important that IT decision-makers carefully weigh the cybersecurity benefits of private clouds against the added financial expenses and management overhead.

Organizations can employ third-party cloud service providers (CSPs) to manage applications and data within their data center infrastructure. Many CSPs also provide built-in security tools to help protect business-critical data.

Businesses are attracted to public cloud infrastructures for a variety of reasons, including low Capex, service scalability and easing the management workload for in-house IT staff.

Public cloud model security benefits include the following:

Other businesses, especially larger ones with massive IT infrastructures, might find that public cloud security is not the right fit.

Potential public cloud security challenges include the following:

In hybrid cloud environments, some business applications and data reside in public clouds, while others are managed inside private clouds or private data centers.

With hybrid cloud, the whole might be greater than the sum of its parts. Security advantages of hybrid cloud infrastructure include the following:

As with a private cloud, the flexibility of a hybrid cloud infrastructure has its downsides. For example, decisions about where applications and data reside are a significant responsibility and require deliberation.

Organizations should consider the following potential security disadvantages of the hybrid cloud model:

As the name suggests, a multi-cloud environment involves an organization using two or more cloud platforms or vendors. For example, an organization might use AWS for IaaS, Google App Engine for PaaS, and Microsoft 365 and Salesforce for SaaS applications.

As in hybrid environments, multi-cloud deployments enable admins to put applications and data into the service with the most appropriate security levels. Similarly, they can adopt the most secure cloud offerings across CSPs.

Multi-cloud environments also offer the following security benefits:

Like hybrid cloud security challenges, multi-cloud environments require close management and consideration to decide where applications and data should reside. It can be difficult to apply a single security policy across multiple clouds, which can create security gaps. Using multiple clouds also requires security teams to know how to secure each cloud, as well as the best tools to use.

Multi-cloud deployments are also prone to the following security challenges:

With these challenges in mind, remember that many infrastructure security tools are now largely virtualized. This means the same security tools and policy configurations deployed within in-house data centers and across the corporate LAN can extend to private clouds to achieve hybrid or multi-cloud security parity. For many security departments, this greatly reduces security complexity from a uniformity point of view.

When it comes to cloud computing and cloud security, no single architecture is suitable for all businesses. IT architects must gauge the cybersecurity needs for all business applications and data sets. Once defined, the technology services can be categorized and earmarked for deployment in the public or private cloud -- whichever makes the most sense both from a cost and cybersecurity perspective.

Andrew Froehlich is founder of InfraMomentum, an enterprise IT research and analyst firm, and president of West Gate Networks, an IT consulting company. He has been involved in enterprise IT for more than 20 years.

Sharon Shea is executive editor of TechTarget Security.

Read the rest here:
Private vs. public cloud security: Benefits and drawbacks - TechTarget

Cloud computing trends – Enterprise License Optimization Blog

The thirteenth annual Flexera 2024 State of the Cloud Report (previously known as the RightScale State of the Cloud Report) highlights the latest cloud computing trends and statistics, including strategies, challenges and initiatives from a broad cross-section of industries and organizations. The cloud computing report explores the thinking of 753 IT professionals and executive leaders from a survey conducted in late Q4 2023 and highlights the year-over-year (YoY) changes to help identify trends. The respondents, global cloud decision-makers and users, revealed their experiences with cloud migration, cloud computing and their thoughts about the public, private and multi-cloud market.

Select highlights of the report on cloud computing are included below.

Terminology used:

This marks the second year in a row that managing cloud spending is the top challenge facing organizations. As in previous years, a lack of resources/expertise follows close behind. More than a quarter of respondents (29%) spend over $12 million a year on cloud, and nearly a quarter (22%) spend that much on SaaS.

Respondents saw a slight increase in multi-cloud usage, up from 87% last year to 89% this year.

Sixty-one percent of large enterprises use multi-cloud security, and 57% use multi-cloud FinOps (cost optimization) tools.

The top two multi-cloud implementations are apps siloed on different clouds and DR/failover between clouds. Apps siloed on different clouds increased the most (up to 57% from 44% YoY). Data integration between clouds increased to 45% from 37% YoY as organizations looked for the best fit for applications and data analysis.

Adoption grew for Amazon Web Services (AWS), Microsoft Azure and Google Cloud. Forty-nine percent of respondents reported using AWS for significant workloads, while 45% reported using Azure and 21% reported using Google Cloud Platform. In contrast, Oracle Cloud Infrastructure, IBM and Alibaba Cloud usage is substantially lower and relatively unchanged compared to the previous year.

SMBs are the highest cloud adopters but fell off slightly from the previous year, with 61% of workloads (down from 67% last year) and 60% of data (unchanged) in the public cloud.

Nearly all platform-as-a-service (PaaS) offerings saw a gain in usage, with the most prominent being in the data warehouse (up to 65% from 56% YoY). Container-as-a-service (52%) and serverless (function-as-a-service) (48%) are both up nine percentage points this year. Machine learning/artificial intelligence (ML/AI) had a modest gain at 41%, up from 36% last year. However, ML/AI is the PaaS offering getting the most attention from companies experimenting (32%) or planning to use it (17%).

Forty-eight percent of respondents say they already have defined sustainability initiatives that include tracking the carbon footprint of cloud usage. When asked how sustainability compares to cost optimization, 59% prioritized cost optimization, though an additional 29% say that both cloud cost optimization and sustainability are equally prioritized.

The world has experienced extraordinary disruption in the past few years, and while organizations of all sizes are prioritizing every dollar of spend, the cloud and technology will weather economic storms. Enterprises that remain focused on digital transformation, seizing new opportunities and evolving strategic initiatives through a cost-conscious lens will be better positioned for success than their competitors.

Get the latest insights in cloud computing trends and cloud migration statistics by viewing the complete survey results here.

Read more here:
Cloud computing trends - Enterprise License Optimization Blog

Bitmovin to run Live Encoder on Akamai cloud – Televisual

Bitmovin, a leading video streaming software solutions provider, is launching its Live Encoder running on Akamai Cloud Computing.

By reducing data transfer out (DTO) costs, the combination of Bitmovin Live Encoder and Akamai can significantly lower operating costs.

Running Bitmovin Live Encoder on Akamai Cloud Computing is intended to help streaming services deliver better live viewing experiences across a host of use cases including sports/eSports, news, online fitness, eLearning, religious services, large-scale events, corporate communications, and political campaigns, among others. Bitmovin's Live Encoder also supports several ad monetization models, including 24/7 linear television channels and Free Ad-supported Television (FAST) channels.

"Bitmovin can help its live-streaming customers deliver higher-quality viewing experiences, and reduce and better control costs by running Live Encoder on Akamai Cloud Computing," said Dan Lawrence, Vice President of Cloud Computing at Akamai (pictured). "Placing and executing Live Encoder's critical compute functions closer to end users can realize lower latency streaming while maintaining the high quality of service that consumers have come to expect and demand from streaming providers. It can also help dramatically reduce DTO fees in many cases. Collectively, we believe this meets the industry's desire to continue raising the standards of live streaming, provide lower and more predictable operational costs, and more opportunities to monetize content."

Bitmovin's Live Encoder has a user interface designed to make it easy for users of all levels to set up live streams quickly, while Bitmovin's API gives developers control over every aspect of the encoding pipeline. Live Encoder is pre-integrated with Akamai Media Services Live to support live-to-VOD and live clipping, which is part of Akamai Connected Cloud to support secure and efficient streaming at massive scale across Akamai's global content delivery network (CDN).

Customers who run Bitmovin's Live Encoder on Akamai will also benefit from pre-integrated third-party solutions for their video streaming workflows, including Videon's LiveEdge contribution encoders; Grass Valley's Agile Media Processing Platform (AMPP) for live production; Zixi for secure transport and ingest; EZDRM for multi-DRM and content encryption; Yospace for Server-side Ad Insertion (SSAI); and more.

"Our Live Encoder elevates live streaming, eliminating sub-par image and audio quality so audiences can enjoy truly immersive live experiences," said Stefan Lederer, CEO and co-founder of Bitmovin. "It's a huge honor to announce our Live Encoder is running on Akamai Cloud Computing, which will help organizations of every size accelerate the quality of their live streaming workflows and deliver world-class viewing experiences."

The Bitmovin Live Encoder running on Akamai comes by way of Bitmovin joining the Akamai Qualified Computing Partner (QCP) Program. The program is designed to make solution-based services that are interoperable with Akamai Cloud Computing services easily accessible to Akamai customers. The services are provided by Akamai technology partners that complete a rigorous qualification process to ensure they are readily available to deploy and scale across the globally distributed Akamai Connected Cloud.

Bitmovin will demonstrate its Live Encoding on Akamai Cloud Computing at the 2024 NAB Show in Las Vegas, April 14-17 (Bitmovin exhibitor stand W3013, Akamai meeting space W235LMR).


Follow this link:
Bitmovin to run Live Encoder on Akamai cloud - Televisual

Cloud Email Filtering Bypass Attack Works 80% of the Time – Dark Reading

Computer scientists have uncovered a shockingly prevalent misconfiguration in popular enterprise cloud-based email spam filtering services, along with an exploit for taking advantage of it. The findings reveal that organizations are far more open to email-borne cyber threats than they know.

In a paper that will be presented at the upcoming ACM Web Conference 2024 in Singapore in May, the authoring academic research team noted that services in wide use from vendors such as Proofpoint, Barracuda, Mimecast, and others could be bypassed in at least 80% of major domains that they examined.

The filtering services can be "bypassed if the email hosting provider is not configured to only accept messages that arrive from the email filtering service," explains Sumanth Rao, a doctoral student at the University of California, San Diego and lead author of the paper, entitled "Unfiltered: Measuring Cloud-based Email Filtering Bypasses."

That might seem obvious, but setting the filters to work in tandem with the enterprise email system is tricky. The bypass attack is possible because of a mismatch between the filtering server and the email server, rooted in how Google and Microsoft email servers react to a message arriving from an unknown IP address, such as one a spammer would use.

Google's servers reject such a message during its initial receipt, while Microsoft's servers reject it during the "Data" command, after the message content has already been transmitted. This difference affects how the filters should be set up.
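The gap can be sketched in a few lines: if the mail host does not restrict which source IPs may deliver mail, a sender can connect to it directly and skip the filter entirely. Below is a minimal toy model of that acceptance logic; the IP addresses are illustrative placeholders, not any vendor's actual behavior.

```python
# Toy model of the bypass: a mail host that accepts SMTP traffic from any
# IP address can be reached directly, skipping the cloud filtering service.
FILTER_IP = "203.0.113.10"  # hypothetical IP of the cloud filtering service


def accepts(connection_ip: str, restrict_to_filter: bool) -> bool:
    """Return True if the mail host accepts a message from connection_ip."""
    if restrict_to_filter:
        # Hardened: only mail relayed through the filtering service gets in.
        return connection_ip == FILTER_IP
    # Permissive default: direct delivery from anywhere is accepted.
    return True


# Misconfigured host: a spammer connecting straight to the MX is accepted.
assert accepts("198.51.100.7", restrict_to_filter=False) is True
# Hardened host: the same direct connection is rejected...
assert accepts("198.51.100.7", restrict_to_filter=True) is False
# ...while mail arriving via the filtering service still gets through.
assert accepts(FILTER_IP, restrict_to_filter=True) is True
```

The whole class of attack reduces to that one boolean: whether the host checks the connecting IP before accepting mail.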

The stakes are high, given that phishing emails remain the initial access mechanism of choice for cybercriminals.

"Mail administrators that don't properly configure their inbound mail to mitigate this weakness are akin to bar owners who deploy a bouncer to check IDs at the main entrance but allow patrons to enter through an unlocked, unmonitored side door as well," says Seth Blank, CTO of Valimail, an email security vendor.

After examining Sender Policy Framework (SPF)-specific configurations for 673 .edu domains and 928 .com domains that were using either Google or Microsoft email servers along with third-party spam filters, the researchers found that 88% of Google-based email systems were bypassed, while 78% of Microsoft systems were.
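An audit of the kind the researchers describe starts from a domain's published SPF record. The sketch below (my own illustration, not the paper's tooling; the `include:` hostname is hypothetical) checks whether a record authorizes only the expected filtering service and ends with a hard-fail `-all`, which is the strict configuration.

```python
def spf_allows_only(record, allowed_mechanisms):
    """Return True if an SPF record authorizes only the expected senders
    and hard-fails ('-all') everyone else."""
    terms = record.split()
    if not terms or terms[0] != "v=spf1":
        return False  # not a valid SPF record
    mechanisms = terms[1:]
    if not mechanisms or mechanisms[-1] != "-all":
        return False  # missing or soft ('~all') policy leaves the door open
    # Every remaining mechanism must be on the expected allow-list.
    return set(mechanisms[:-1]) <= allowed_mechanisms


# Hypothetical filter include; real values come from your filtering vendor.
allowed = {"include:spf.filter.example.com"}

strict = "v=spf1 include:spf.filter.example.com -all"
loose = "v=spf1 include:spf.filter.example.com ~all"

assert spf_allows_only(strict, allowed) is True
assert spf_allows_only(loose, allowed) is False  # soft fail flagged
```

Note that SPF governs which hosts may *send* for a domain; it is the inbound acceptance policy on the mail server, covered above, that actually blocks the direct-delivery bypass.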

The risk is higher when using cloud vendors, since a bypass attack isn't as easy when both filtering and email delivery are housed on premises at known and trusted IP addresses, they noted.

The paper offers two major reasons for these high failure rates. First, the documentation to properly set up both the filtering and email servers is confusing and incomplete, and it is often ignored, misunderstood, or not followed closely. Second, many corporate email managers err on the side of making sure that messages reach recipients, for fear of deleting valid ones if they institute too strict a filter profile. "This leads to permissive and insecure configurations," according to the paper.

Not mentioned by the authors, but an important factor, is that stopping spam effectively requires configuring all three of the main email security protocols: SPF, Domain-based Message Authentication, Reporting and Conformance (DMARC), and DomainKeys Identified Mail (DKIM). That isn't easy, even for experts. Add to that the challenge of making sure the two cloud services for filtering and email delivery communicate properly, and the coordination effort becomes extremely complex. To boot, the filter and email server products are often managed by two separate departments within larger corporations, introducing yet more potential for error.

"Email, like many legacy Internet services, was designed around a simple use case that is now out of step with modern demands," the authors wrote.

The documentation provided by each filtering vendor does vary in quality, according to the researchers. The paper points out that the instructions for the filtering products from Trend Micro and Proofpoint are particularly error-prone and can easily produce vulnerable configurations. Even vendors with better documentation, such as Mimecast and Barracuda, still produce high rates of misconfiguration.

While most vendors did not respond to Dark Reading's request for comment, Olesia Klevchuk, a product marketing manager at Barracuda, says, "Proper setup and regular 'health checks' of security tools are important. We provide a health-check guide that customers can use to help them identify this and other misconfigurations."

She adds, "most, if not all, email-filtering vendors will offer support or professional services during deployment and after to help ensure that their solution works as it should. Organizations should periodically take advantage and/or invest in these services to avoid potential security risks."

Enterprise email administrators have several ways to strengthen their systems and prevent these bypass attacks from happening. One way, suggested by the paper's authors, is to specify the filtering server's IP address as the sole origin of all email traffic, and to ensure that it can't be spoofed by an attacker.

"Organizations need to configure their email server to only accept email from their filtering service," the authors wrote.

Microsoft's documentation lays out email defense options and recommends setting a series of parameters to enable this protection for Exchange Online deployments, for example. Another is to ensure that all SPF, DKIM, and DMARC records are correctly specified for all domains and subdomains used by an enterprise for email traffic. As mentioned, that could be a challenge, particularly for larger companies or ones that have acquired numerous domains over time and have forgotten about their use.
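For illustration, correctly specified records for a hypothetical example.com domain might look like the following DNS TXT entries (all names, selectors, and values are placeholders; real ones come from your filtering vendor and key material):

```
example.com.                        TXT  "v=spf1 include:spf.filter.example.com -all"
selector1._domainkey.example.com.   TXT  "v=DKIM1; k=rsa; p=MIGfMA0G...AB"   ; public key truncated
_dmarc.example.com.                 TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

The SPF record names the authorized senders, the DKIM record publishes the signing key under its selector, and the DMARC record tells receivers what to do when both checks fail and where to send aggregate reports.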

Finally, another solution, says Valimail's Blank, "is for the filtering application to include Authenticated Received Chain (RFC 8617) email headers, and for the inner layer to consume and trust these headers."

Read the original here:
Cloud Email Filtering Bypass Attack Works 80% of the Time - Dark Reading

Cloud Provider Vultr Has Bone To Pick After Reddit Post – CRN

"We do think this person knows better," chief marketing officer Kevin Cochrane tells CRN. "We're HIPAA compliant. If our terms of service meant we owned your data, we wouldn't be HIPAA compliant. We're GDPR compliant. If we owned your data, we wouldn't be GDPR compliant."

Private cloud provider Vultr is clearing the air after a widely viewed Reddit post claimed the company had changed its terms of service in a way that would give it ownership of all of the data stored or used on its network.

"We do think this person knows better," Vultr Chief Marketing Officer Kevin Cochrane told CRN Thursday. "We're HIPAA compliant. If our terms of service meant we owned your data, we wouldn't be HIPAA compliant. We're GDPR compliant. If we owned your data, we wouldn't be GDPR compliant."

Cochrane said that the terms of service cited in the Reddit post referred specifically to content posted to a public message board that has not been active in some time.

"The content that you deploy on Vultr servers is wholly owned by you," said Cochrane (pictured above).

He said West Palm Beach, Fla.-based Vultr did update its terms of service. However, it was only to notify customers that the company will suspend accounts that have been dormant for two years.

"That's the reason everyone is having to click on this," Cochrane said.

Cochrane said the company believes the Reddit post was designed to spread misinformation after Vultr was among the first cloud providers to offer customers the ability to use Nvidia's GH200 Grace Hopper Superchip with their workloads.

"This is why the terms of service is such a concern for us. We specifically challenge the hyperscalers and other public clouds for using your private data for other purposes," he said. "Our statement has always been: Your private data is your private data."

The Reddit account that created the post has been active for five days. The post has 1,500 upvotes, which gives it added credibility inside Reddit's platform. In it, the original poster claimed that Vultr changed its terms of service to state:

You hereby grant to Vultr a non-exclusive, perpetual, irrevocable, royalty-free, fully paid-up, worldwide license (including the right to sublicense through multiple tiers) to use, reproduce, process, adapt, publicly perform, publicly display, modify, prepare derivative works, publish, transmit and distribute each of your User Content, or any portion thereof, in any form, medium or distribution method now known or hereafter existing, known or developed, and otherwise use and commercialize the User Content in any way that Vultr deems appropriate, without any further consent, notice and/or compensation to you or to any third parties, for purposes of providing the Services to you.

Cochrane said this portion of Vultr's terms of service relates just to messages and content shared on a public discussion forum that Vultr hosts and is not related to the data and apps that customers use on Vultr systems.

"The specific language in the post is, if you post content on one of our public mediums. It was specific to when we had a forum. So if you are posting content on a forum, that forum is owned by us because we have to publicly publish it so other people can see the posts."

He compared the language to tech debt that is no longer needed but has been carried forward through newer iterations. To avoid confusion, he said, Vultr is stripping the language from its terms going forward.

CRN has reached out to the person who penned the Reddit post but had not heard back at press time.

Vultr, a privately held cloud computing platform, has 1.5 million customers across 185 countries. It offers cloud computing infrastructure and resources spanning from bare metal options to GPU compute available on demand.

Backed by parent company Constant, Vultr provides shared and dedicated CPU, block and object storage, Nvidia cloud GPUs, as well as networking and Kubernetes solutions. The company's mission is to make high-performance cloud computing easier to use, affordable and locally accessible.

Vultr is consistently expanding its data center footprint. In May 2023, Vultr Talon was launched to offer customers accelerated computing by enabling GPU sharing so multiple workloads can run on a single Nvidia GPU.

More here:
Cloud Provider Vultr Has Bone To Pick After Reddit Post - CRN

The Shocking Power Problem Behind Cloud Computing and Artificial Intelligence – Channelnomics

The demand for electric power from data centers is outstripping supply, which has the potential to slow the development of cloud computing and artificial intelligence services.

By Larry Walsh

A passing press release at the beginning of the year received little attention, as is often the case, despite predicting substantial sales growth for the vendor and its partners over the next two years. The press release was issued by Vertiv, a power systems manufacturer that provides equipment essential for running servers in data center racks. In touting the success of its acquisitions of E&I Engineering and Powerbar Gulf, Vertiv stated that it expects to more than double capacity for its switchgear, busbar, and modular solutions in the next two years.

This is a bold claim, especially considering the shift of most enterprises toward cloud-based infrastructure. While the sale of data center products (servers, switches, storage) is projected to increase by about 10% this year, cloud computing sales are expected to jump 22%, and artificial intelligence technologies are forecasted to soar by more than 50%.

However, it's the increasing demand for AI and cloud computing that's driving the sales of basic, seemingly conventional technologies such as power conditioning and backup systems. The construction of data centers, whether on-premises or for cloud services, necessitates products like those offered by Vertiv, Eaton, and Schneider Electric.

Yet, the optimistic outlook of Vertiv reveals a startling problem lurking behind the trend of cloud computing and AI: a lack of power.

Insatiable Power Demand

Earlier this month, the energy industry's leaders convened in Houston for the annual CERAWeek by S&P Global, an event typically centered on electric generation and its associated inputs (oil, gas, coal, renewables). This year's attendees included tech elites such as Microsoft's Bill Gates and Bill Vass, vice president of engineering at Amazon Web Services, who were there to sound the alarm over the diminishing electrical capacity and the urgent need for more data centers.

At the event, Vass remarked that the world is adding three new data centers every day, each consuming as much energy as possible, with demand being insatiable. Over the next decade, the United States alone could require new capacity exceeding 100 gigawatts, sufficient to power 82 million homes. This figure doesn't even account for the capacity needed to power new homes and offices, as well as electric-vehicle fleets.
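A quick back-of-the-envelope check of those figures (a rough sketch using only the numbers quoted above):

```python
# Rough sanity check of the article's figures: 100 GW of new capacity
# said to be enough for 82 million homes.
new_capacity_gw = 100            # projected new U.S. capacity, in gigawatts
homes_powered = 82_000_000       # homes that capacity could supply

# Implied average continuous draw per home, in kilowatts
kw_per_home = new_capacity_gw * 1_000_000 / homes_powered
print(f"Implied average draw: {kw_per_home:.2f} kW per home")
```

That works out to roughly 1.2 kW of continuous draw per home, consistent with typical U.S. residential electricity use.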

The enormous capacity requirements to power the next-generation cloud and AI era explain why OpenAI CEO Sam Altman proposed a $7 trillion fund to build data center capacity globally.

The U.S. already faces a challenge with electrical production and distribution. The Grid, as it's commonly referred to, is based on century-old technology. Significant portions of the distribution network are decades old and in need of repair, with parts of the country already experiencing brownouts and periodic disruptions because capacity can't keep up with demand.

Other developed regions encounter similar issues. Germany, for instance, became heavily reliant on Russian gas after decommissioning all of its nuclear power plants. Now, with the war in Ukraine disrupting energy supplies, Germany is compelled to reactivate conventional power plants to meet power demands.

AI Is Making the Problem Worse

The development of AI will only intensify this issue. Data centers, already known as heat blooms due to their high energy consumption, will become furnaces as the massive computational processes consume more electrons. Manufacturers of servers and storage hardware are already cautioning partners and customers about the pitfalls of low-cost but power-intensive alternatives.

At CERAWeek, Gates, an advocate for sustainability, stated that the success and profitability of a data center hinge on the cost of its inputs. If electricity costs are too high, data centers will struggle to turn a profit without passing costs onto consumers.

Given that vendors sell cloud and Software-as-a-Service (SaaS) contracts on a multi-year basis, passing on costs is complicated. If a series of data centers proves unprofitable, costs will rise across the board to compensate.

Constructing more data centers isnt a straightforward solution. Real estate services firm CBRE Group reports that data center construction timelines are delayed two to six years due to electrical capacity issues.

Cloud vendors are establishing new data centers near sustainable energy sources, which doesn't always align with population or commercial needs. Moreover, building new power-generation facilities (conventional or sustainable) is slowed by regulatory reviews and local opposition.

Sustainability: A Solution?

Balancing new data center capacity with electrical consumption needs will become a contentious issue for the technology industry, which is largely committed to sustainability goals. Microsoft aims to be carbon-neutral by 2035, and many other technology vendors are pursuing similar objectives to reduce their carbon footprint, which includes minimizing their consumption of materials and resources such as electricity.

While sustainable energy sources such as wind and solar may appear to be the solution, constructing the necessary infrastructure is as challenging as establishing a coal-fired power plant.

The power issue underlying cloud computing and AI is alarming and could hinder sales and growth, at least in the short term. Over time, vendors will improve the power efficiency of AI systems.

In the interim, vendors and partners must emphasize sustainability and efficiency as key selling points to their cloud computing, AI, and infrastructure customers. Power consumption deserves a prominent role in the total-cost-of-ownership equation, illustrating to customers the full expense of opting for cheaper but less efficient product options.
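To illustrate how power consumption fits into the total-cost-of-ownership equation, consider comparing a cheaper but power-hungry server against a pricier, more efficient one. All prices, wattages, and electricity rates below are hypothetical, not from the article:

```python
# Illustrative TCO sketch: purchase price plus electricity over five years.
# Every figure here is an assumption chosen for demonstration.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.12   # assumed commercial electricity rate, USD
YEARS = 5

def total_cost_of_ownership(purchase_price, avg_watts):
    """Purchase price plus the cost of electricity over the service life."""
    energy_kwh = avg_watts / 1000 * HOURS_PER_YEAR * YEARS
    return purchase_price + energy_kwh * PRICE_PER_KWH

cheap_inefficient = total_cost_of_ownership(purchase_price=8_000, avg_watts=900)
costly_efficient = total_cost_of_ownership(purchase_price=10_000, avg_watts=450)
print(f"Cheap but inefficient: ${cheap_inefficient:,.0f}")
print(f"Costly but efficient:  ${costly_efficient:,.0f}")
```

Under these assumed numbers, the server with the higher sticker price ends up cheaper over five years, which is exactly the point vendors can make to customers weighing low-cost, power-intensive alternatives.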

In the long term, the technology and energy sectors are likely to find a solution to this power dilemma. The remarkable aspect of technology is that it often becomes the solution to the problems it creates. For the present, though, vendors and solution providers must manage market expectations carefully.

The technology industry has excelled in convincing everyone that anything is possible with cloud computing. It's now doing the same with AI. Anything is indeed possible, provided the lights remain on.

Larry Walsh is the CEO, chief analyst, and founder of Channelnomics. He's an expert on the development and execution of channel programs, disruptive sales models, and growth strategies for companies worldwide.

Read the original:
The Shocking Power Problem Behind Cloud Computing and Artificial Intelligence - Channelnomics

SC Ventures Leads Hive's €12M Series A, Enabling Sustainable Distributed Cloud Computing for the Masses – PR Newswire

Hive's technology aims to transform 70% of the world's unused device capacity into a global supercomputer

GENEVA, March 29, 2024 /PRNewswire/ -- SC Ventures, Standard Chartered's innovation, fintech investment and ventures arm, is leading a €12 million (about $13 million) Series A round for distributed cloud provider Hive to increase access to sustainable, high-powered computing resources for businesses and individuals. OneRagtime, a French venture capital fund that led Hive's Seed round, and a collection of private investors also joined the round.

Hive is reinventing the cloud from a centralized model that uses expensive physical servers to a distributed cloud infrastructure that aggregates individual devices' unused hard drive and computing capacities. Hive's model helps businesses efficiently manage their cloud-related expenses, reduce dependency on a select few cloud providers, and significantly reduces cloud energy use. In 2023, global data centres, which power the world's cloud, required 7.4 Gigawatts of power, a 55% increase from 2022. Currently, data centres account for up to 3% of global electricity consumption, with projections suggesting this could rise to 4% by 2030.
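Working backwards from those figures gives the implied 2022 baseline:

```python
# If 2023's 7.4 GW was a 55% increase over 2022, the 2022 figure
# follows by dividing out the growth.
power_2023_gw = 7.4
growth_rate = 0.55
power_2022_gw = power_2023_gw / (1 + growth_rate)
print(f"Implied 2022 demand: {power_2022_gw:.1f} GW")
```

That is, roughly 4.8 GW of data-centre power demand in 2022.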

"Hive is addressing the pressing need for a new cloud paradigm that democratizes access, lowers financial barriers, and encourages innovation," said David Gurl, Hive Founder. "With over 70% of the computing power available in our devices and billions of devices connected to the Internet, Hive's community-driven model builds 'The Right Cloud' to offer a greener, more resilient network and secure alternative that also promotes a more equitable cloud solution. We thank our investors, as well as INRIA and Bpifrance, for their continuous support as we look to achieve our ambitious goals."

Since October 2023, Hive has amassed over 25,000 total active users and contributors from 147 countries, who store their files on hiveDisk and contribute a portion of their unused hard drive to hiveNet to effectively lower their subscription costs and build the distributed cloud. The contributed computing capacity on hiveNet also powers hiveCompute, allowing companies to manage workloads such as GenAI inference, video processing, and 3D modelling. HiveNet's architecture provides access to additional CPU, GPU, or NPU capacity when needed, boosting the much-needed computing power. Companies seeking more control can also build their own private hiveNet, where IT managers retain full control over the devices.
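Hive has not published its protocol, but the distributed-storage idea described above (splitting files across contributors' spare disk space, with redundancy so no single device holds the whole file) can be sketched generically. Every name and parameter below is invented for illustration:

```python
# Generic sketch of distributed-cloud storage: chunk a file, replicate
# each chunk across several peers, and reassemble from whoever is online.
# (Illustrative only; Hive's actual implementation is not public.)

CHUNK_SIZE = 4   # bytes; tiny on purpose for demonstration
REPLICAS = 2     # copies of each chunk, for resilience

def split_chunks(data: bytes, size: int = CHUNK_SIZE):
    """Split the payload into fixed-size chunks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place_chunks(chunks, peers, replicas: int = REPLICAS):
    """Round-robin each chunk onto `replicas` distinct peers."""
    placement = {peer: [] for peer in peers}
    for i, chunk in enumerate(chunks):
        for r in range(replicas):
            peer = peers[(i + r) % len(peers)]
            placement[peer].append((i, chunk))
    return placement

def reassemble(placement, n_chunks):
    """Rebuild the file from whichever peers are still reachable."""
    recovered = {}
    for stored in placement.values():
        for i, chunk in stored:
            recovered[i] = chunk
    return b"".join(recovered[i] for i in range(n_chunks))

peers = ["laptop-a", "desktop-b", "nas-c"]
chunks = split_chunks(b"hello distributed cloud!")
placement = place_chunks(chunks, peers)
placement.pop("desktop-b")   # simulate one contributor going offline
print(reassemble(placement, len(chunks)) == b"hello distributed cloud!")
```

With two replicas per chunk, any single contributor can drop offline and the file remains recoverable; production systems typically use erasure coding instead of plain replication to reduce storage overhead.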

In December, Hive unveiled a Joint Development Partner (JDP) initiative, working closely with key partners to innovate the cloud landscape for businesses leveraging GenAI LLM computations.

"We are big believers in Hive's distributed cloud technology, which will enable cheaper and more efficient access to computing power and storage, a critical point when most of our ventures may have an AI component requiring increasing computing power," said Alex Manson, who heads SC Ventures. "In addition to our investment, our ventures will be leveraging Hive's services."

"Cloud technology has opened up horizons of innovation, but it also comes with challenges in terms of costs, security, data privacy, and environmental impact, heightened by the increasing demand for computing resources, especially for artificial intelligence," said Stéphanie Hospital, Founder & CEO at OneRagtime. "Hive, with its pioneering approach to distributed cloud, makes cloud access more secure, affordable, and efficient for everyone, and enables the sharing of computational power resources. As an early investor and believer, OneRagtime is particularly excited to support Hive's vision and team."

Hive is a champion of sustainable technological progress, offering a practical solution to the challenges posed by traditional cloud computing models. With its latest funding round, Hive's sights are set on growing its team and global footprint, with a focus on addressing the enterprise markets starting with startups and SMBs. The team is prioritizing several areas of the business, including product development, building an engaged community of contributing Hivers, and sales and marketing efforts to reach users at scale.

About Hive

Hive is revolutionizing the digital world by bringing the power of distributed cloud computing directly to the masses, providing everyone with the tools to innovate, secure their data, and contribute to a greener planet. Through hiveNet, hiveDisk, hiveCompute, and many other applications to come, Hive is aggregating latent computing resources from a community of interconnected computers to provide the market with easy-to-implement solutions that will unleash the power of distributed cloud to the masses, all while minimizing environmental impact. Hive's Joint Development Partner (JDP) program develops innovations in the cloud computing landscape for businesses leveraging GenAI LLM computations. Together, Hive is shaping a technological future that equitably uplifts every community.

To learn more about Hive, please visit http://www.hivenet.com

About SC Ventures

SC Ventures is a business unit that provides a platform and catalyst for Standard Chartered to promote innovation, invest in disruptive financial technology and explore alternative business models.

For more information, please visit http://www.scventures.io and follow SC Ventures on LinkedIn.

About Standard Chartered

We are a leading international banking group, with a presence in 53 of the world's most dynamic markets and serving clients in a further 64. Our purpose is to drive commerce and prosperity through our unique diversity, and our heritage and values are expressed in our brand promise, here for good.

Standard Chartered PLC is listed on the London and Hong Kong Stock Exchanges.

For more stories and expert opinions please visit Insights at sc.com. Follow Standard Chartered on Twitter, LinkedIn, Instagram and Facebook.

About OneRagtime

OneRagtime is a venture capital platform founded by Stéphanie Hospital and Jean-Marie Messier. Since 2017, OneRagtime has sourced, financed, and scaled more than 40 start-ups. With its unique platform model, OneRagtime allows its investor community to invest in the most promising French and European startups through its funds or dedicated club deals, while giving entrepreneurs an unparalleled network and business acceleration.

Photo - https://mma.prnewswire.com/media/2375182/Hive.jpg Logo - https://mma.prnewswire.com/media/2375183/Hive_Logo.jpg

SOURCE Hive

Read more:
SC Ventures Leads Hive's €12M Series A, Enabling Sustainable Distributed Cloud Computing for the Masses - PR Newswire