
How AI and the Metaverse Will Impact the Datasphere

The datasphere, the infrastructure that stores and processes our data, is critical to many of the advanced technologies on which we rely.

So we partnered with HIVE Digital on this infographic to take a deep dive into how it could evolve to meet the twin challenges of AI and the metaverse.

If the second decade of the 21st century is remembered for anything, it will probably be the leaps and bounds made in the field of AI. Large language models (LLMs) have pushed AI performance to near-human levels, and in some cases beyond. But getting there has required more and more computational resources to train and operate these models.

The Large-Scale Era is often considered to have started in late 2015 with the release of DeepMind's AlphaGo Fan, the first computer program to defeat a professional Go player.

That system required a training compute of 380 quintillion FLOP, or floating-point operations, a measure of the total computation used. In 2023, OpenAI's GPT-4 had a training compute 55 thousand times greater, at 21 septillion FLOP.

At this rate of growth (essentially doubling every 9.9 months), future AI systems will need exponentially larger computers to train and operate them.
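To put those numbers in perspective, here is a minimal Python sketch of the arithmetic, using only the figures quoted above. The five-year projection is illustrative, not a forecast.

```python
import math

# Figures quoted above: AlphaGo Fan (late 2015) vs. GPT-4 (2023).
ALPHAGO_FAN_FLOP = 3.8e20   # 380 quintillion FLOP of training compute
GPT4_FLOP = 2.1e25          # 21 septillion FLOP of training compute
DOUBLING_MONTHS = 9.9       # doubling period cited in the article

ratio = GPT4_FLOP / ALPHAGO_FAN_FLOP
print(f"GPT-4 used {ratio:,.0f}x the training compute of AlphaGo Fan")
print(f"That is about {math.log2(ratio):.1f} doublings")

def projected_multiplier(months: float) -> float:
    """Growth in training compute after `months`, at the cited trend."""
    return 2 ** (months / DOUBLING_MONTHS)

# Illustrative projection: at a 9.9-month doubling time, training
# compute grows roughly 67x over any five-year window.
print(f"5-year multiplier: {projected_multiplier(60):,.0f}x")
```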

The metaverse, an immersive and frictionless web accessed through augmented and virtual reality (AR and VR), will only add to these demands. One way to quantify this demand is to compare bitrates across applications, which measure the amount of data (i.e., bits) transmitted per second.

On the low end, music streaming, web browsing, and gaming all have relatively low bitrate requirements; only game streaming breaks the one Mbps (megabits per second) threshold. Things go up from there, and fast. AR, VR, and holograms, all technologies that will be integral to the metaverse, top out at 300 Mbps.

Consider also that VR and AR require incredibly low latency (less than five milliseconds) to avoid motion sickness. So not only will the metaverse increase the amount of data that needs to be moved (644 GB per household per day), but it will also need that data moved very quickly.
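A quick sketch shows how a bitrate translates into data volume, assuming a stream held steady at the quoted rate. One hour at the 300 Mbps top end already moves about 135 GB, so the 644 GB daily household figure corresponds to under five hours of traffic at that peak rate.

```python
def gigabytes_moved(bitrate_mbps: float, hours: float) -> float:
    """Data volume in GB for a stream at `bitrate_mbps` held for `hours`."""
    bits = bitrate_mbps * 1e6 * hours * 3600  # megabits/s -> total bits
    return bits / 8 / 1e9                     # bits -> bytes -> gigabytes

# One hour at the 300 Mbps top end moves about 135 GB...
print(f"{gigabytes_moved(300, 1):.0f} GB in one hour at 300 Mbps")

# ...so 644 GB per household per day equates to under five hours of
# traffic sustained at that peak rate.
print(f"{644 / gigabytes_moved(300, 1):.1f} hours to reach 644 GB")
```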

At the time of writing, there are 5,065 data centers worldwide, with 39.0% located in the U.S. The next largest national player is the UK, with only 5.5%. Data centers not only store the data we produce, but also run the applications we rely on. And they are evolving.

There are two broad approaches that data centers are taking to get ahead of the demand curve. The first and probably most obvious option is going big. The world's three largest hyperscale data centers are:

The other route is to go small, but closer to where the action is. And this is what edge computing does, decentralizing the data center in order to improve latency. This approach will likely play a big part in the rollout of self-driving vehicles, where safety depends on speed.
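Why distance matters comes down to physics: light in optical fiber travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), which puts a hard floor under latency before any routing, queuing, or processing delay is added. The back-of-the-envelope sketch below illustrates how quickly that floor eats the five-millisecond VR budget mentioned earlier.

```python
# Fiber-optic propagation only; real networks add routing, queuing,
# and processing delay on top of this hard physical floor.
LIGHT_IN_FIBER_KM_PER_S = 200_000  # roughly 2/3 of c in a vacuum

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / LIGHT_IN_FIBER_KM_PER_S * 1000

for km in (10, 100, 500, 2000):
    print(f"{km:>5} km to the data center -> {round_trip_ms(km):5.1f} ms")
# At 500 km, propagation alone consumes the entire 5 ms VR budget,
# which is why pushing compute to the edge matters.
```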

And investors are putting their money behind the idea: global spending on edge computing is expected to hit $208 billion in 2023, up 13.1% from 2022.

The International Data Corporation projects that the amount of data produced annually will grow to 221 zettabytes by 2026, at a compound annual growth rate of 21.2%. With the zettabyte era nearly upon us, data centers will have a critical role to play.
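The compound-growth arithmetic behind that projection is easy to reproduce. Assuming a 2021 base year (an inference; the article does not state it), the implied starting volume works out to roughly 84.5 zettabytes.

```python
CAGR = 0.212       # 21.2% compound annual growth rate (IDC)
TARGET_ZB = 221    # projected zettabytes produced annually by 2026

# Assumed 2021 base year (five compounding years); the implied
# starting volume is roughly 84.5 ZB.
base_zb = TARGET_ZB / (1 + CAGR) ** 5
print(f"Implied 2021 volume: {base_zb:.1f} ZB")

# Year-by-year path implied by the same growth rate.
for year in range(2021, 2027):
    print(f"{year}: {base_zb * (1 + CAGR) ** (year - 2021):6.1f} ZB")
```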

Learn more about how HIVE Digital exports renewably sourced computing power to customers all over the world, helping to meet the demands of emerging technologies like AI.


