Consider all the things we do now with our devices that weren’t happening a decade ago. Low-voltage systems, such as security, nurse call and fire alarms, are connecting directly to enterprise networks. Cloud-based services and virtualization were just becoming business discussions; now, the number of enterprises using these applications increases every day. Seventy-two percent of people in the United States report owning a smartphone.
When streaming online content or doing some general web surfing, the average user probably doesn’t notice (or care) how the data is being delivered. It just needs to be fast and flawless.
As Millennials and Generation Z enter the workforce – two generations that have grown up with technology in their everyday lives – expectations are higher. These generations demand much more than previous groups in terms of what their devices and connections should be able to do for them.
They insist on rich content with no lag or delays, even as the numbers of both users and devices increase. Those demands, however, strain existing networks – and ultimately degrade the quality of content delivery. The heavy traffic generated by downloads, streaming and similar activities drives up network latency, which can significantly reduce performance and increase security risk.
The number of users and devices isn’t going to decrease, and expectations for multimedia content aren’t going to go down, so how can our industry address this encroaching challenge?
By moving the content physically closer to users through edge data centers.
Edge data centers contain all the components of a regular data center, but at a much smaller scale.
This approach, often referred to as “edge computing” or “fog computing,” delivers a richer media experience by processing data in several small data centers spread across geographic areas rather than in one central place.
One main data center is surrounded by small edge data centers close to sets of users. By cutting down on lengthy data transmission, network speed and bandwidth are improved.
Large, bandwidth-intensive applications and information regularly accessed by groups of people – whether cloud-based services, gaming, streaming content, complex diagrams or medical images – generate constant traffic that slows network response time and reduces productivity as users wait for downloads and streams to finish.
When information is stored geographically closer to the people who access it, the data will be delivered faster – with less opportunity for a problem to occur along the way.
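The effect of distance can be sketched with a back-of-envelope propagation-delay calculation. This is a minimal illustration only: the fiber speed and distances below are assumptions, and real-world latency also includes routing, queuing and processing delays on top of propagation time.

```python
# Light in optical fiber travels at roughly two-thirds the speed of light
# in a vacuum, i.e. about 200,000 km/s, or ~200 km per millisecond.
SPEED_IN_FIBER_KM_PER_MS = 200.0  # assumed value for illustration

def round_trip_ms(distance_km: float) -> float:
    """Convert a one-way distance into round-trip propagation delay (ms)."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Hypothetical comparison: a central data center 2,000 km away
# vs. an edge data center 50 km away.
central_ms = round_trip_ms(2000)  # 20.0 ms per round trip
edge_ms = round_trip_ms(50)       # 0.5 ms per round trip
print(f"central: {central_ms} ms, edge: {edge_ms} ms")
```

Even this simplified view shows why proximity matters: an interactive application that makes many round trips per page or frame multiplies that per-trip saving.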
For example, a national architecture firm that stores large design and architectural files may find it beneficial to move its data closer to “the edge,” locating edge data centers near each of its branch offices across the country. This speeds up performance for users, reducing frustration and improving productivity.
Data center complexity is rarely a good thing. As systems become more complex, they become more expensive and difficult to manage, and present the opportunity for more mistakes during troubleshooting and repair.
As data centers get bigger, the connections among systems often become more complex. This increased complexity can require more floor space, more staff resources and more money. Dispersing IT resources across several locations rather than one large, central location also eliminates a single point of failure: if one edge data center goes down, the other locations aren’t affected.
Moving your data to the edge will be a slow process. Edge data centers aren’t commonplace yet, but interest is growing as IT professionals learn more about their benefits.
According to an April 2016 survey from Green House Data, 52% of IT professionals believe that edge data centers will reduce costs due to shorter backbone transport; nearly half of the IT professionals surveyed plan to add an edge data center within the next 12 months.
Are you investigating edge data centers?
Share a comment below!