The Evolution of the IIoT Edge

The edge is the most critical and perhaps the most difficult piece of the IIoT puzzle. It is not one single thing, because application requirements vary so widely. Take just one requirement – the frequency of data updates – and the needs range all over the map. A vendor-managed inventory application may work very well with a single data point every few days: analytical models can estimate inventories, so the sensor data is needed only to correct model error and to keep those errors from accumulating. At the other extreme, data can be highly dynamic in applications like field service management, where a critical customer event can re-order service priorities instantly, so data about the state of the service assets (machines, tools, and people) needs to be updated very often.

In IIoT implementations, a number of different classes of edge can be identified:

Intermittent – For applications that are not data-intensive, an intermittent connection to the edge may be a perfectly adequate solution, such as a small sensor network linked via cellular data.

Data Historian – Larger industrial sites maintain data historian applications, which usually provide near-real-time sensor data services as well. A historian can serve as a very economical source of time-series and current data. The scalability advantage comes from leveraging the resources of the historian service, which has already configured data collection for a large number of installed sensors.

IIoT Gateway – This is the canonical approach, in which a dedicated edge device manages the local data collection. The importance of a dedicated device is that it can be installed and maintained by a third party, and its configuration and embedded applications can be updated as edge service needs evolve.

Fog – In this class the edge device moves beyond data collection and management: it includes compute and storage resources that begin to take a share of the application’s analytics tasks. Why? First, backhaul capacity may be tightly constrained, making it impossible to simply pump all the edge data to a cloud service. Second, certain applications demand faster responses, and these can be determined locally when they depend only on local data that is already available.
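The two motivations for fog-class edge devices – constrained backhaul and latency-critical responses – can be illustrated with a minimal sketch. The class and field names below are hypothetical, not from any particular IIoT product: raw readings are summarized locally so only compact aggregates cross the backhaul, while threshold alarms are raised immediately on the device without a cloud round trip.

```python
from statistics import mean

# Hypothetical sketch of a fog-style edge node. All names are
# illustrative; a real deployment would use an IIoT platform's SDK.
class EdgeNode:
    def __init__(self, alarm_threshold, window_size=10):
        self.alarm_threshold = alarm_threshold
        self.window_size = window_size
        self.buffer = []   # raw readings held locally
        self.uplink = []   # stand-in for the constrained backhaul to the cloud
        self.alarms = []   # stand-in for a local actuator or alarm

    def ingest(self, reading):
        # Fast local path: respond immediately, no cloud round trip.
        if reading > self.alarm_threshold:
            self.alarms.append(reading)
        self.buffer.append(reading)
        # Slow path: ship one compact summary instead of every raw point.
        if len(self.buffer) == self.window_size:
            self.uplink.append({
                "min": min(self.buffer),
                "max": max(self.buffer),
                "mean": mean(self.buffer),
                "n": len(self.buffer),
            })
            self.buffer.clear()

node = EdgeNode(alarm_threshold=90.0, window_size=5)
for r in [70, 72, 95, 71, 73]:
    node.ingest(r)

print(node.alarms)   # the out-of-range reading triggered a local alarm
print(node.uplink)   # five raw points collapsed into one summary message
```

The design choice is the same trade-off the paragraph describes: the cloud sees enough to run fleet-wide analytics, but the bandwidth cost is one message per window rather than per reading, and time-critical reactions never wait on the network.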

On-premise cloud – If the application requires significant local resources, and can afford the investment, then an on-premise cloud can be installed at the network edge and form part of the overall application. This is certainly the exception rather than the norm. However, the telecommunications industry is beginning to deploy such clouds at its network edge (the cell tower base station). The value is that a single edge cloud and a set of software applications can replace a number of dedicated-function hardware appliances.

The relentless advance of Moore’s Law has implications for the future of these choices. Five years ago the idea of deploying a cloud at the base of every cell tower would have been considered madness; five years hence it may well be the norm. Expect higher-end edge solutions to challenge the traditionally lower-cost solutions over time. Why? Their long-term advantage is lower service costs, and that advantage should persist. Their disadvantage is a higher initial outlay for IT equipment, and this disadvantage will erode as Moore’s Law drives down the cost of hardware resources.

Reprinted with permission; the original blog was posted here.

About ARC Advisory Group (www.arcweb.com): Founded in 1986, ARC Advisory Group is a leading Boston-based technology research and advisory firm for industry and infrastructure.

For further information or to provide feedback on this article, please contact nsingh@arcweb.com
