In recent years there’s been a massive migration of processing and data into the cloud, but powerful though the arguments are for centralising computing and resources in large data centres, it’s not a suitable architecture for all applications.
We’re not talking here about some of the usual arguments against cloud, such as data sovereignty and control. We’re talking about issues of latency — the need to process data and make decisions on it close to its source – and the need to reduce the sheer volume of data, much of it redundant, that can be generated by some applications. Welcome to the world of edge computing, aka fog computing.
Edge computing is very much a child of the Internet of Things. Some of these things – of which there will be billions – will generate massive amounts of data. In some cases most of it will be redundant. In some cases it will require an instant response: hence the need to decentralise compute resources to the edge of the network, near where the action is.
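The data-reduction argument can be made concrete with a small sketch. The scenario and names here are hypothetical, not from any vendor's product: an edge node applies a simple deadband filter, forwarding a sensor reading to the cloud only when it differs meaningfully from the last value sent.

```python
def filter_redundant(readings, threshold=0.5):
    """Yield only readings that differ from the last forwarded value
    by more than `threshold` -- a simple deadband filter an edge node
    might run so that redundant samples never leave the site.
    (Hypothetical example; names and threshold are illustrative.)"""
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            last = value
            yield value

# A temperature sensor sampling every second mostly repeats itself;
# the edge node forwards only the significant changes.
samples = [20.0, 20.1, 20.0, 20.2, 21.0, 21.1, 22.5, 22.4]
forwarded = list(filter_redundant(samples))
print(forwarded)  # only 3 of the 8 samples reach the cloud
```

Even this trivial filter cuts the upstream traffic by more than half; real deployments layer on aggregation and local decision logic, which is where the latency argument comes in.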
Fog computing is a term coined by Cisco for its version of edge computing. It seems to have made its debut in an academic paper from 2012. Originally it was received with some scepticism: one critic branded the concept “an ill-conceived marketing metaphor that further confuses the cloud market,” saying it represented a reason why “reporters should be trained not to parrot the marketing hype of large companies.”
However, far from clearing, the fog has thickened, with the launch in November 2015 of the OpenFog Consortium, whose founding members are Cisco, chipmakers ARM and Intel, Dell, Microsoft and Princeton University, through the Princeton University Edge Laboratory.
But to get back to the more generic concept of edge computing: consortium members Intel, Dell and Microsoft all have their own implementations. Intel and HP announced in November 2015 that they would jointly develop edge computing products. Akamai has teamed up with IBM. Dell announced its Edge Gateway in October 2015.
All these vendors have their own take on edge computing. If you want to get a more neutral overview, a good place to start is the white paper The Drivers and Benefits of Edge Computing from Schneider Electric.
As a provider of data centre infrastructure, Schneider is fairly agnostic about the processing and data architectures it supports. Its white paper “explains the drivers of edge computing and explores the various types of edge computing available.”
While edge computing can be implemented in small devices out in the field, Schneider is in the business of supplying data centre infrastructure, so it describes a whole gamut of implementations stretching from devices in the field, through localised data centres with one to 10 racks, to regional data centres. It concludes that the optimal architecture, one that “provides the deployment speed and capacity in line with future IoT application demands”, is the localised one-to-10-rack version.
However, the model chosen will depend heavily on the number of IoT devices, the volume of data and the nature of the application.