We’re all sensitive to slow load times, whether we’re streaming music or a video, waiting for a webpage to display, downloading a large file or competing in online gaming. More data than ever before is being pushed down to the end-user – but problems can occur (buffering, load times, etc.) as network traffic increases.

The influx of data coming from a growing number of sensors and devices involved in the Internet of Things (IoT) only adds to network stress.

As recently as five years ago, enterprises were investing in massive, centralized data centers, relying on networks to quickly and reliably carry large amounts of data to end-users. But it’s funny how quickly things in our industry can change.

Less than half a decade later, the trend has reversed: enterprises are choosing to establish smaller, regional data centers in locations of high data consumption. This eliminates the need to wait for data to travel across the country (or farther), placing information on servers much closer to end-users – shortening the distance the data must travel before it reaches its final destination.

These small data centers are called “edge data centers”; the concept is also known as “fog computing” or “edge computing.”

1. How Does Edge Computing Work?

Hardware – anything from routers and switches to wide area network (WAN) multiplexers – is moving from the core network in a central data center to smaller data centers, which then relay data to end-users. (“End-users” could be people using smartphones and tablets, or enterprises expecting data from sensors and devices as part of IoT.)

This geographically distributed platform moves huge amounts of data much faster than sending it all back to a central cloud for processing. When data is kept “close to the edge” through edge computing, it’s processed locally for faster delivery.
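To make the distance argument concrete, here is a minimal back-of-the-envelope sketch (not from the article; the distances and the fiber speed are assumptions) estimating one-way propagation delay for a far-away central data center versus a nearby edge data center:

```python
# Rough one-way propagation delay over optical fiber.
# Light travels at about 200,000 km/s in fiber (~2/3 the speed of light
# in a vacuum); real latency also includes routing and processing time.

SPEED_IN_FIBER_KM_S = 200_000

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds for a given distance."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

central_km = 4000   # e.g., roughly coast-to-coast in the U.S.
edge_km = 100       # a regional edge data center

print(f"Central: {propagation_delay_ms(central_km):.1f} ms one-way")  # 20.0 ms
print(f"Edge:    {propagation_delay_ms(edge_km):.1f} ms one-way")     # 0.5 ms
```

Even before counting routing hops and server processing, shortening the physical path cuts the unavoidable speed-of-light delay by an order of magnitude or more.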

2. What are the Goals of Edge Computing?

Edge computing has several goals, which translate to speed and reliability benefits for end-users.

  1. Reduce load times
  2. Shrink core network data traffic
  3. Reduce bandwidth requirements
  4. Lessen latency (the round-trip time between sending data and receiving a response)
  5. Allow fast data center commissioning
  6. Decrease data center complexity
  7. Reduce outages and downtime

By placing information on servers much closer to where the users actually are, rich content is delivered faster.

Because of the shift to geographically dispersed, small data centers vs. one huge central data center, installation, commissioning, repairs and upgrades may be more difficult. The data centers may be located in remote areas, and IT professionals may not be close by. For this reason, edge computing must facilitate easy installation, reduced complexity and the ability for remote management and control.

Download our 10 Steps to Holistic Data Center Design white paper.

3. Who is Using Edge Computing?

Big companies like Facebook, Twitter, Netflix and Adobe are already utilizing edge computing to supply rich content directly to users, improving their experience. But there are other interesting examples as well.

Coca-Cola uses edge computing to run its Freestyle machines located across the country. Edge servers – which gather a wealth of information about consumer preferences and then share it with the central Coca-Cola data center via the cloud – are optimized to perform specific tasks very quickly, and communicate customer information back to Coca-Cola in real-time.
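The pattern behind examples like this can be sketched in a few lines. The code below is a hypothetical illustration (not Coca-Cola’s actual system): an edge node handles each event locally and only forwards a compact summary upstream, which is how edge computing shrinks core-network traffic.

```python
from collections import Counter

class EdgeNode:
    """Hypothetical edge server: tally events locally, batch to the cloud."""

    def __init__(self):
        self.counts = Counter()   # fast, local tally of consumer choices

    def record_choice(self, drink: str) -> None:
        self.counts[drink] += 1   # handled at the edge, no round trip

    def summary_for_cloud(self) -> dict:
        """Compact batch sent upstream instead of every raw event."""
        batch = dict(self.counts)
        self.counts.clear()
        return batch

node = EdgeNode()
for drink in ["cola", "cola", "lime", "cola"]:
    node.record_choice(drink)

print(node.summary_for_cloud())  # {'cola': 3, 'lime': 1}
```

Four raw events become one small summary message – the central data center still sees the trends, but the core network carries far less traffic.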

As edge computing continues to change the way data centers are designed, maintained and upgraded, we’ll keep you updated on the impacts it will have on data center managers like you.

Learn more about Belden solutions that allow you to reduce data center complexity and transfer data faster.

Did you learn more about edge computing by reading this article?
Share your thoughts with us in the comments section below!