What Is Edge Computing? Why It's Important and How It Works

In a nutshell, edge computing is any computing that occurs at the edge of the network rather than on a centralized server.

If you dig deeper into edge, you'll see that edge computing deployments, often supported by cloud computing providers, are part of a distributed infrastructure that places compute power closer to the people and devices that produce or consume data.

Key to the idea of edge, whether your deployment supports machine learning, artificial intelligence, or data analytics, is that it extends computing resources far beyond the once-dominant data center. Edge is the forward-looking model today in much the same way the data center was a dozen years ago.

The most important thing to understand about edge technology is that it's a form of distributed computing. Looking back at computing history, you can see a cycle between more centralized models (like the early mainframes) and more distributed ones (like networked PCs). In recent years, the shift to cloud computing has moved workloads toward a more diffuse, multicloud model. The newer trend toward edge computing extends that distributed model even further.

You might not realize it, but you probably interact with devices that leverage edge computing every day. For example, if you work in a remote office/branch office (ROBO) environment with its own computing infrastructure, that's an example of edge computing.

The smartphone you have in your pocket does edge computing. So does your car. Your printer. Probably your TV.

Here's a non-exhaustive list of edge computing devices:

- Smartphones
- Cars, including autonomous vehicles
- Printers
- Smart TVs
- Gaming systems
- Industrial IoT (IIoT) sensors and controllers
- In-store kiosks, such as restaurant ordering tablets
- Remote office/branch office (ROBO) infrastructure

Because there are so many different kinds of edge devices, there is no single edge architecture that covers all use cases. In general, though, most edge computing deployments share some common characteristics.

First, edge devices usually collect data from sensors. Those sensors might be part of the device itself (as in the case of smartphones and autonomous vehicles) or they might be separate (as in the case of gaming systems and many industrial IoT, or IIoT, deployments).

Then the edge device does some processing and storage locally. In theory, a device could store the data at the edge indefinitely, but in most deployments, the device then sends a portion of the data up to the cloud for additional processing and analytics. Other devices and users can then access the processed data via the cloud.
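To make that flow concrete, here is a minimal sketch, in Python, of the collect-process-upload pattern just described. Everything in it is a hypothetical placeholder: read_sensor(), CLOUD_ENDPOINT, and the summary fields are invented for illustration, not taken from any real edge platform.

```python
# A minimal sketch of the typical edge data flow: collect sensor
# readings, process and store them locally, and send only a reduced
# summary to the cloud. All names here are hypothetical placeholders.

import json
import random
import statistics
import urllib.request

CLOUD_ENDPOINT = "https://example.com/ingest"  # placeholder URL


def read_sensor() -> float:
    """Stand-in for an on-device sensor read (e.g., a temperature probe)."""
    return 20.0 + random.random() * 5.0


def main() -> None:
    # 1. Collect raw data at the edge.
    readings = [read_sensor() for _ in range(100)]

    # 2. Process and store locally; the raw data stays on the device.
    with open("readings.json", "w") as f:
        json.dump(readings, f)

    # 3. Send only a small summary upstream for cloud analytics.
    summary = {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    # urllib.request.urlopen(req)  # uncomment with a real endpoint


if __name__ == "__main__":
    main()
```

The point of the sketch is the division of labor: raw readings stay on the device, while only a compact summary crosses the network to the cloud.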

It might be easier to understand this architecture by considering a particular use case. Think about the tablet-style kiosks you might see at each table in a chain restaurant. These edge devices collect data input by users, such as order information, payment details, and/or survey responses.

Those tablets then transmit all that data via Wi-Fi to a centralized server in the restaurant. That server processes and stores the data, and forwards it to various internet-connected servers that process payments, monitor company financials, and analyze customer orders and survey responses. Administrators and business managers can then access that cloud-based data through various applications.
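Continuing the illustration, here is a hedged sketch of what the in-store server's routing logic might look like. The CLOUD_ROUTES table, the record fields, and handle_kiosk_record() are all hypothetical names invented for this example; a real deployment would use its vendor's actual APIs.

```python
# A hypothetical sketch of the restaurant scenario: the in-store server
# receives records from table kiosks and routes each kind of data to a
# different cloud service. Endpoint names and record fields are invented
# for illustration; they don't reflect any specific vendor's API.

from typing import Any

# Hypothetical cloud destinations for each kind of kiosk data.
CLOUD_ROUTES = {
    "payment": "https://payments.example.com/charge",
    "order": "https://analytics.example.com/orders",
    "survey": "https://feedback.example.com/responses",
}


def handle_kiosk_record(record: dict[str, Any]) -> str:
    """Store the record locally, then pick the cloud service it goes to."""
    # Local processing/storage would happen here (e.g., append to a DB).
    destination = CLOUD_ROUTES[record["kind"]]
    # A real server would POST the record to `destination` here.
    return destination


# Example: an order placed at table 12 is routed to the analytics service.
print(handle_kiosk_record({"kind": "order", "table": 12, "items": ["burger"]}))
```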

This combination of edge computing and cloud computing is becoming increasingly common across a wide variety of use cases and industries.

Edge computing offers a number of benefits over centralized computing models, including the following:

- Lower latency: data is processed close to where it is generated instead of making a round trip to a distant data center.
- Reduced bandwidth and transmission costs: only a subset of the data needs to travel to the cloud.
- Greater resilience: edge devices can keep working when the connection to the cloud is slow or unavailable.
- Better data locality and privacy: sensitive information can stay on or near the device that produced it.

As you might expect, edge computing also has some downsides. Here are some of the most significant:

- Management complexity: keeping a large fleet of distributed devices configured, patched, and secured is harder than managing a single data center.
- Physical security: edge hardware often sits outside controlled facilities, where it can be damaged, stolen, or tampered with.
- Limited local resources: an individual edge device has far less compute and storage capacity than a cloud data center.

Despite these challenges, enterprises should be paying attention to the edge computing trend and considering how their companies might participate. Here's why: the number of connected devices, and the volume of data they generate, keeps growing, so shipping everything to the cloud is increasingly impractical; latency-sensitive applications such as autonomous vehicles and industrial automation can't wait on a round trip to a distant data center; and organizations that can act on data at the edge, in real time, gain an advantage over those that can't.
