We’re all sensitive to slow load times, whether we’re streaming music or video, waiting for a webpage to load, downloading a large file, or competing in online games. More data than ever before is being pushed down to the end-user – but problems (buffering, long load times, etc.) can occur as network traffic increases.
The influx of data coming from a growing number of sensors and devices involved in the Internet of Things (IoT) only adds to network stress.
As recently as five years ago, enterprises were investing in massive, centralized data centers, relying on networks to quickly and reliably carry large amounts of data to end-users. But it’s funny how quickly things in our industry can change.
Less than half a decade later, the trend has reversed: enterprises are establishing smaller, regional data centers in locations of high data consumption. Placing information on servers much closer to end-users shortens the distance data must travel before it reaches its final destination, eliminating the need to wait for data to cross the country (or farther).
These small data centers are called “edge data centers”; the concept is also known as “fog computing” or “edge computing.”
Hardware – anything from routers and switches to wide area network (WAN) multiplexers – is moving from the core network in a central data center to smaller data centers, which then relay data to end-users. (“End-users” could be people using smartphones and tablets, or enterprises expecting data from sensors and devices as part of IoT.)
This geographically distributed platform moves huge amounts of data much faster than sending it back to the cloud for processing. When data is kept “close to the edge” through edge computing, it’s processed locally for faster delivery.
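To make the “close to the edge” idea concrete, here is a minimal sketch of how a request router might pick the nearest edge site by great-circle distance. The site names and coordinates are illustrative assumptions, not real deployments:

```python
import math

# Hypothetical edge data center locations (site -> latitude, longitude).
EDGE_SITES = {
    "us-east": (40.71, -74.01),      # New York area
    "us-central": (41.88, -87.63),   # Chicago area
    "us-west": (34.05, -118.24),     # Los Angeles area
}

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))  # Earth radius ~6371 km

def nearest_edge(user_location):
    """Route a user's request to the geographically closest edge site."""
    return min(EDGE_SITES,
               key=lambda site: haversine_km(EDGE_SITES[site], user_location))

print(nearest_edge((33.75, -84.39)))  # a user near Atlanta
```

In practice, content delivery networks use DNS and anycast routing rather than raw geographic distance, but the principle is the same: serve each request from the closest point of presence.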
Edge computing has several goals, which translate to speed and reliability benefits for end-users.
By placing information on servers much closer to where the users actually are, rich content is delivered faster.
Because of the shift to geographically dispersed, small data centers vs. one huge central data center, installation, commissioning, repairs and upgrades may be more difficult. The data centers may be located in remote areas, and IT professionals may not be close by. For this reason, edge computing must facilitate easy installation, reduced complexity and the ability for remote management and control.
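The remote-management requirement can be sketched in a few lines: a central console watching heartbeat timestamps from unattended edge sites and flagging any that have gone quiet. The site names, timestamps, and timeout below are illustrative assumptions; a real deployment would pull this data from a management agent or monitoring API:

```python
# Hypothetical heartbeat log: site -> timestamp (seconds) of last check-in.
def check_sites(heartbeats, now, timeout_s=60):
    """Flag edge sites whose last heartbeat is older than timeout_s."""
    return [site for site, last_seen in heartbeats.items()
            if now - last_seen > timeout_s]

heartbeats = {"us-east": 990.0, "us-central": 1055.0, "us-west": 930.0}
print(check_sites(heartbeats, now=1060.0))  # sites needing attention
```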
Big companies like Facebook, Twitter, Netflix and Adobe are already utilizing edge computing to supply rich content directly to users, improving their experience. But there are other interesting examples as well.
Coca-Cola uses edge computing to run its Freestyle machines located across the country. Edge servers – which gather a wealth of information about consumer preferences and then share it with the central Coca-Cola data center via the cloud – are optimized to perform specific tasks very quickly, and communicate customer information back to Coca-Cola in real time.
As edge computing continues to change the way data centers are designed, maintained and upgraded, we’ll keep you updated on the impacts it will have on data center managers like you.
Did you learn more about edge computing by reading this article?
Share your thoughts with us in the comments section below!
With 24 years of telecommunications and data center industry experience, Warren McCarty is a Lucent Technologies/Bell Laboratories training graduate and BICSI RCDD who is responsible for supervising Belden’s direct sales, partner management and marketing activity implementation for data centers.