The Shocking Power Problem Behind Cloud Computing and Artificial Intelligence

The demand for electric power from data centers is outstripping available generating capacity, which has the potential to slow the development of cloud computing and artificial intelligence services.

By Larry Walsh

A press release issued at the beginning of the year received little attention, as is often the case, despite predicting substantial sales growth for the vendor and its partners over the next two years. It came from Vertiv, a power systems manufacturer that provides equipment essential for running servers in data center racks. Touting the success of its acquisitions of E&I Engineering and Powerbar Gulf, Vertiv said it expects to more than double capacity for its switchgear, busbar, and modular solutions in the next two years.

This is a bold claim, especially considering the shift of most enterprises toward cloud-based infrastructure. While sales of data center products (servers, switches, storage) are projected to increase by about 10% this year, cloud computing sales are expected to jump 22%, and artificial intelligence technologies are forecast to soar by more than 50%.

However, it's the increasing demand for AI and cloud computing that's driving the sales of basic, seemingly conventional technologies such as power conditioning and backup systems. The construction of data centers, whether on-premises or for cloud services, necessitates products like those offered by Vertiv, Eaton, and Schneider Electric.

Yet Vertiv's optimistic outlook reveals a startling problem lurking behind the cloud computing and AI trend: a lack of power.

Insatiable Power Demand

Earlier this month, the energy industry's leaders convened in Houston for the annual CERAWeek by S&P Global, an event typically centered on electric generation and its associated inputs (oil, gas, coal, renewables). This year's attendees included tech elites such as Microsoft co-founder Bill Gates and Bill Vass, vice president of engineering at Amazon Web Services, who were there to sound the alarm over diminishing electrical capacity and the urgent need for more data centers.

At the event, Vass remarked that the world is adding three new data centers every day, each consuming as much energy as possible, with demand being insatiable. Over the next decade, the United States alone could require new capacity exceeding 100 gigawatts, sufficient to power 82 million homes. This figure doesn't even account for the capacity needed to power new homes and offices, as well as electric-vehicle fleets.
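A rough back-of-envelope check shows how 100 gigawatts maps onto tens of millions of homes. This is a sketch, assuming an average U.S. household draw of about 1.2 kW (roughly 10,500 kWh per year); that figure is an assumption, not one cited at the event, and actual household consumption varies widely by region.

```python
# Sanity check on the "100 GW is enough for roughly 82 million homes" comparison.
# avg_home_draw_kw is an assumed average U.S. household load, not a source figure.
new_capacity_gw = 100
avg_home_draw_kw = 1.2

homes_powered = (new_capacity_gw * 1_000_000) / avg_home_draw_kw  # convert GW to kW
print(f"~{homes_powered / 1_000_000:.0f} million homes")  # prints "~83 million homes"
```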

The enormous capacity requirements to power the next-generation cloud and AI era explain why OpenAI CEO Sam Altman proposed a $7 trillion fund to build data center capacity globally.

The U.S. already faces a challenge with electrical production and distribution. The Grid, as it's commonly referred to, is based on century-old technology. Significant portions of the distribution network are decades old and in need of repair, with parts of the country already experiencing brownouts and periodic disruptions because capacity can't keep up with demand.

Other developed regions face similar issues. Germany, for instance, became heavily reliant on Russian gas as it phased out its nuclear power plants. Now, with the war in Ukraine disrupting energy supplies, Germany has been compelled to reactivate conventional power plants to meet demand.

AI Is Making the Problem Worse

The development of AI will only intensify this issue. Data centers, already known as heat blooms due to their high energy consumption, will become furnaces as massive computational processes consume more electrons. Manufacturers of servers and storage hardware are already cautioning partners and customers about the pitfalls of low-cost but power-intensive alternatives.

At CERAWeek, Gates, an advocate for sustainability, stated that the success and profitability of a data center hinge on the cost of its inputs. If electricity costs are too high, data centers will struggle to turn a profit without passing costs onto consumers.

Given that vendors sell cloud and Software-as-a-Service (SaaS) contracts on a multi-year basis, passing on costs is complicated. If a series of data centers proves unprofitable, costs will rise across the board to compensate.

Constructing more data centers isn't a straightforward solution. Real estate services firm CBRE Group reports that data center construction timelines are delayed two to six years due to electrical capacity issues.

Cloud vendors are establishing new data centers near sustainable energy sources, which doesn't always align with population or commercial needs. Moreover, building new power-generation facilities, conventional or sustainable, is slowed by regulatory reviews and local opposition.

Sustainability: A Solution?

Balancing new data center capacity with electrical consumption needs will become a contentious issue for the technology industry, which is largely committed to sustainability goals. Microsoft aims to be carbon negative by 2030, and many other technology vendors are pursuing similar objectives to reduce their carbon footprints, which includes minimizing their consumption of materials and resources such as electricity.

While sustainable energy sources such as wind and solar may appear to be the solution, constructing the necessary infrastructure is as challenging as establishing a coal-fired power plant.

The power issue underlying cloud computing and AI is alarming and could hinder sales and growth, at least in the short term. Over time, vendors will improve the power efficiency of AI systems.

In the interim, vendors and partners must emphasize sustainability and efficiency as key selling points to their cloud computing, AI, and infrastructure customers. Power consumption deserves a prominent role in the total-cost-of-ownership equation, illustrating to customers the full expense of opting for cheaper but less efficient product options.
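To make that TCO point concrete, here is a minimal sketch of how power can be folded into the comparison. The purchase prices, power draws, electricity rate, service life, and PUE below are illustrative assumptions, not figures from any vendor or from this article.

```python
# Minimal sketch: folding power consumption into a total-cost-of-ownership comparison.
# All inputs are illustrative assumptions, not vendor data.
def tco(purchase_price, avg_draw_kw, price_per_kwh=0.12, years=5, pue=1.5):
    """Hardware cost plus energy cost over the service life.

    PUE (power usage effectiveness) accounts for cooling and power-distribution overhead.
    """
    hours = years * 365 * 24
    energy_cost = avg_draw_kw * pue * hours * price_per_kwh
    return purchase_price + energy_cost

cheap_but_hungry = tco(purchase_price=8_000, avg_draw_kw=1.0)   # roughly $15,900
pricier_but_lean = tco(purchase_price=10_000, avg_draw_kw=0.6)  # roughly $14,700
print(f"Power-hungry server: ${cheap_but_hungry:,.0f}")
print(f"Efficient server:    ${pricier_but_lean:,.0f}")
```

On these assumptions, the pricier but more efficient machine comes out ahead over five years, which is exactly the kind of comparison partners can put in front of customers.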

In the long term, the technology and energy sectors are likely to find a solution to this power dilemma. The remarkable aspect of technology is that it often becomes the solution to the problems it creates. For the present, though, vendors and solution providers must manage market expectations carefully.

The technology industry has excelled in convincing everyone that anything is possible with cloud computing. It's now doing the same with AI. Anything is indeed possible, provided the lights remain on.

Larry Walsh is the CEO, chief analyst, and founder of Channelnomics. He's an expert on the development and execution of channel programs, disruptive sales models, and growth strategies for companies worldwide.
