The enterprise is rapidly deploying new classes of compute and storage on the network edge in order to service the plethora of devices that are populating the internet of things (IoT).
But while the edge forms a vital link between centralized resources and the multitude of data points all around us, it is by no means the only link in what is emerging as a highly dynamic and intelligent data chain. Between the data center, the cloud facility and the edge lies a host of systems and architectures that some experts have begun to call “fog computing.”
According to Infiniti Market Research, the global fog computing market is poised to grow more than 60 percent per year until 2021, representing a wealth of machine-to-machine (M2M) communications, security, automation and other technologies. Already, the array of services targeting just the modern automobile has led to a dramatic increase in data volume and complexity across wired and wireless infrastructure. Much of this is sensor-driven at the moment, but the advent of higher-order devices related to self-driving cars and other applications is set to push traffic levels even higher.
Fog computing is more than just a collection of discrete components. Tight integration between multiple points and on multiple layers of the data/infrastructure stack is crucial to maintaining an optimal IoT environment, which Cisco and SAS are addressing with a new edge-to-enterprise analytics platform and reference architecture. These systems are designed to analyze data at the edge to determine an appropriate response to user requirements, and then massage the data destined for central repositories in order to speed up the high-volume number-crunching that organizations need for strategic planning and market analysis. By preconfiguring the platform on multiple Cisco hardware and software systems and SAS-based analytics and database technologies, the companies hope to offer a quick and easy means of building working fog computing environments.
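To make the pattern concrete, the sketch below shows the edge-analytics idea in miniature: score sensor readings locally, react to anything urgent on the spot, and forward only a compact summary to the central repository. This is an illustrative Python sketch under assumed names and thresholds, not the Cisco/SAS platform itself.

```python
# Minimal sketch of the edge-analytics pattern: analyze raw readings locally,
# act on anomalies immediately, and send only a summary toward the core.
# All names, thresholds and the alert handler are hypothetical.
import statistics
from dataclasses import dataclass
from typing import List

@dataclass
class Summary:
    device_id: str
    count: int
    mean: float
    maximum: float

ALERT_THRESHOLD = 90.0  # hypothetical limit, e.g. degrees C

def handle_locally(device_id: str, reading: float) -> None:
    """Placeholder for an immediate edge-side response (raise an alarm, actuate a device)."""
    print(f"ALERT {device_id}: reading {reading} exceeds {ALERT_THRESHOLD}")

def summarize(device_id: str, readings: List[float]) -> Summary:
    """Reduce a window of raw samples to the aggregate the central repository actually needs."""
    return Summary(device_id, len(readings), statistics.mean(readings), max(readings))

def process_window(device_id: str, readings: List[float]) -> Summary:
    # 1. Analyze at the edge and respond to anything urgent right away.
    for r in readings:
        if r > ALERT_THRESHOLD:
            handle_locally(device_id, r)
    # 2. Condition the data bound for the core: forward the summary,
    #    not every raw sample.
    return summarize(device_id, readings)

if __name__ == "__main__":
    window = [71.2, 73.5, 95.1, 72.8, 70.4]
    print(process_window("sensor-017", window))
```

The point of the sketch is the division of labor: the latency-sensitive decision happens at the edge, while the bandwidth-heavy raw stream never leaves it.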
Fog computing will be the main driver of the IoT precisely because it alleviates the disadvantages of both on-premises and cloud infrastructure, says AetherWorks CEO Dr. Rob MacInnis. In the data center, enterprises face a lack of scale and flexibility, while the cloud brings high latency and expensive bandwidth requirements. Indeed, it is often faster to send a 2TB hard drive to Amazon by snail mail than to upload its contents over the internet. With a wide range of connected compute nodes in a fog architecture, organizations get both the speed and scalability that IoT applications require, and they can better take advantage of idle compute resources, such as workstations and other devices, for more effective workload distribution. Rather than simply housing compute infrastructure in the data center or the cloud, the fog employs all of the data points between the enterprise and the end user.
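A quick back-of-the-envelope calculation shows why the snail-mail comparison holds. Assuming, purely for illustration, a 100 Mbit/s uplink and overnight shipping (both figures are assumptions, not from the source), uploading 2TB takes roughly 44 hours:

```python
# Back-of-the-envelope check of the "ship the drive" claim.
# The link speed and courier time are illustrative assumptions.
DRIVE_BYTES = 2 * 10**12          # 2 TB drive
UPLINK_BPS = 100 * 10**6          # assumed 100 Mbit/s business uplink
COURIER_HOURS = 24                # assumed overnight shipping

upload_hours = (DRIVE_BYTES * 8) / UPLINK_BPS / 3600
print(f"Upload time at 100 Mbit/s: {upload_hours:.0f} hours")  # ~44 hours
print(f"Courier time:              {COURIER_HOURS} hours")
```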
This is a radical shift from the client-server paradigm that has driven data infrastructure in the past, says Formtek’s Dick Weisinger, but it is necessary due to the sheer scale of the IoT. Not only will the data from literally billions of devices flood network pathways to and from the centralized processing facility, but it will also overwhelm the server and storage infrastructure itself, effectively killing the IoT before it even begins. This is why fog computing is about more than just the edge – it represents a soup-to-nuts reimagining of data, infrastructure and the services that will drive the next-generation economy. Architected properly, the fog will allow the enterprise to offload multiple gigabytes of data from core systems, while protecting sensitive data from intrusion.
As a metaphor, “fog computing” is not the best. It’s no fun getting lost in the fog, and foggy days are usually dreary and unpleasant. But as a way to describe how data processing comes to inhabit a dynamic network of connected devices rather than a centralized facility, it is highly accurate.
The enterprise will still need to learn how to manage and navigate this diffuse data architecture, but in the end it should gain a faster, more personal connection with its customers.