The world is getting faster. We expect the best customer experience: problems need to be resolved right away, goods need to arrive the day after we order them, or even the same day, as with Amazon.com and Bol.com, and we have become used to communicating with anyone, anywhere, at any time. This raises the need for edge computing.
For enterprises, this trend is reflected in the increased demand for real-time data processing. Look at the trends that will power the next generation of innovation, such as Artificial Intelligence (AI) and the Internet of Things (IoT): they are driving a surge in data production. In our earlier blog, How Internet of Things (IoT) will change data centers, we shared that Statista expects the number of installed IoT devices to exceed 74.5 billion by 2025: devices such as connected cars, smart drones, intelligent rental systems, and more.
So what is edge?
Let's start with the definition of edge computing:
“Edge computing is a distributed, open IT architecture that features decentralized processing power, enabling mobile computing and Internet of Things technologies. In edge computing, data is processed by the device itself or by a local server, rather than being transmitted to a data center”.
At present, edge computing is more of a concept than an off-the-shelf product. Edge computing is the practice of processing data near the edge of the network, where the data is generated, instead of in centralized data centers. The term “edge” is used in many ways, but perhaps the description by Dr. Tom Bradicich captures it best: the edge is where the action is. It’s a manufacturing floor, a building, a campus, a city, your house, a wind farm, a power plant, a telecommunications outpost, in your car, in the sky or under the sea. It’s everywhere everything is, and it’s where the “things” are in the Internet of Things.
Why edge data centers?
As the Internet of Things (IoT) expands, massive amounts of data require faster processing and better security. Edge data centers allow organizations to:
- Minimize latency – Many applications require immediate insight and control, and for these, any latency is intolerable.
- Reduce bandwidth – Sending large data sets back and forth to the cloud can consume enormous bandwidth.
- Lower cost – Even if bandwidth is available, it can be costly.
- Reduce threats – Processing data locally can reduce security vulnerabilities. Transfer of data across various networks is simply more prone to attacks and breaches.
- Maintain compliance – Laws and corporate policies govern the remote transfer of data. For example, certain countries forbid companies from moving the personal data of their citizens outside their borders.
Edge computing and the future of the data center
The ‘traditional’ data center won’t become obsolete by any means and will still be used for a wide range of functions, but the rise of edge computing will drive the construction of a larger number of smaller data centers closer to population centers such as cities, campuses and business parks. Data center infrastructure could therefore change, slowly shifting toward a more distributed model. Edge-driven systems will work alongside the cloud, and the large, centralized data center model will still thrive and remain vital to all kinds of businesses. As demand for edge devices and applications increases, this part of the network could see big growth.
The working definition of edge data centers