The Future of Efficient IoT Deployment: Using Containerisation at the Edge
The cloud offers web-scale processing power and storage, which would seem to make it an ideal companion to Internet of Things (IoT) implementations. Centralised cloud-based systems are obvious destinations for analytics processing and for supporting Machine Learning activities. However, the demand for rapid operational-support systems requires processing nearer to the source of IoT data, in other words a decentralised approach. With transactional sensor data going to the cloud, the challenge is therefore to prevent the inherent limitations of network connectivity and latency from adversely impacting local, time-sensitive operations.
I can relate to this from first-hand experience of a live IoT implementation, where adopting containerisation technology at the “Edge” proved to be the answer: decentralising analytics and knowledge generation and bringing them closer to the source of the data.
Below I have shared some insights on the challenges we faced and how the power of the Edge, combined with containerisation, helped take operations to the next level.
Challenges in making IoT happen in practice
Once deployed in the field, IoT devices constantly struggle to communicate with centralised systems over the Internet. Each of the connectivity enablers - SIMs, networking equipment, leased lines/broadband, cellular towers etc. - has the potential to introduce latency or disrupt connectivity. The heterogeneous nature of the networks required for end-to-end Internet connectivity makes cost-effective end-to-end uptime SLAs unlikely.
Network latency can cause chaos within real-time industrial operations that depend solely on a centralised IoT back-end infrastructure.
Taming the Beast using Containerisation
With no foolproof connectivity option in sight, IoT systems need to be designed for distributed deployment. This means decoupling business logic, data processing and storage from the confines of centralised servers and moving them onto smart controllers nearer to the field-based IoT devices. Quick decision support can then be rendered locally, over peer-to-peer communication between IoT devices and their controllers, instead of relying on Internet connectivity.
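As a rough illustration of this pattern, the sketch below shows the kind of logic a smart controller might run locally: a simple threshold rule acts on each sensor reading immediately, while readings are buffered in a local SQLite store for later upload once cloud connectivity is available. The threshold value, table layout and actuation callback are hypothetical placeholders, not details of the actual implementation.

```python
import sqlite3
import time

TEMP_THRESHOLD_C = 85.0  # hypothetical safety limit, for illustration only

# Local buffer: readings are stored on the controller and synced to the
# cloud later, so a connectivity outage never blocks local decisions.
db = sqlite3.connect("edge_buffer.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings (
                ts REAL, sensor_id TEXT, value REAL, uploaded INTEGER DEFAULT 0)""")

def act_locally(sensor_id: str, value: float) -> None:
    """Hypothetical local actuation, e.g. trip a relay via the controller's I/O."""
    print(f"[LOCAL ACTION] {sensor_id} reported {value} C - shutting down pump")

def handle_reading(sensor_id: str, value: float) -> None:
    # 1. Immediate, local decision - no round trip to the cloud.
    if value > TEMP_THRESHOLD_C:
        act_locally(sensor_id, value)
    # 2. Store-and-forward: buffer the reading for later upload.
    db.execute("INSERT INTO readings (ts, sensor_id, value) VALUES (?, ?, ?)",
               (time.time(), sensor_id, value))
    db.commit()

def pending_uploads():
    """Readings still waiting to be pushed to the central IoT back end."""
    return db.execute("SELECT ts, sensor_id, value FROM readings "
                      "WHERE uploaded = 0").fetchall()

if __name__ == "__main__":
    handle_reading("boiler-temp-01", 91.3)   # triggers local action immediately
    handle_reading("boiler-temp-01", 72.8)   # just buffered
    print(f"{len(pending_uploads())} readings queued for cloud upload")
```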
Containerisation technologies like LXC and Docker were harnessed to create containerised miniature applications and systematically propagate them onto smart controllers at the project sites.
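To give a flavour of what such propagation can look like, here is a hedged sketch using the Docker SDK for Python: the controller pulls the latest image of a micro-application from a registry and replaces the running container with the new version. The registry URL, image name and container name are illustrative assumptions rather than details of the actual deployment.

```python
import docker
from docker.errors import NotFound

# Illustrative names - the real registry, image and container names would differ.
IMAGE = "registry.example.com/edge/analytics-service"
TAG = "latest"
CONTAINER_NAME = "edge-analytics"

def propagate_update() -> None:
    """Pull the newest micro-application image and restart the local container."""
    client = docker.from_env()

    # Fetch the latest containerised application from the central registry.
    image = client.images.pull(IMAGE, tag=TAG)

    # Replace the currently running instance, if any, with the new version.
    try:
        old = client.containers.get(CONTAINER_NAME)
        old.stop()
        old.remove()
    except NotFound:
        pass  # first deployment on this controller

    client.containers.run(
        f"{IMAGE}:{TAG}",
        name=CONTAINER_NAME,
        detach=True,
        restart_policy={"Name": "always"},  # survive controller reboots
    )
    print(f"Deployed {IMAGE}:{TAG} ({image.short_id}) as {CONTAINER_NAME}")

if __name__ == "__main__":
    propagate_update()
```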
Containers armed with an ideal payload of business logic, data processing and storage efficiently support localised operations, as depicted in the industrial setup below.
Important design considerations for effective containerisation (see the configuration sketch after this list):
- Prioritising execution within the IoT controller from a real-time kernel perspective.
- Restricting each container's compute and storage footprint.
- Setting up bridge networks between hosts and containers.
- Setting the frequency of data uploads to the cloud.
- Hardening the containers and host to avoid security breaches.
- Setting up dynamic DNS to securely resolve domains for web communication.
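Several of these considerations translate directly into container run-time configuration. The sketch below, again using the Docker SDK for Python, shows how compute and memory limits, a dedicated bridge network and a cloud-upload interval might be applied when launching a container; the specific values, network name, image and environment variable are assumptions made purely for illustration.

```python
import docker
from docker.errors import NotFound

client = docker.from_env()

# Dedicated bridge network between the host controller and its containers
# (illustrative name; created once, reused thereafter).
try:
    network = client.networks.get("edge-bridge")
except NotFound:
    network = client.networks.create("edge-bridge", driver="bridge")

container = client.containers.run(
    "registry.example.com/edge/analytics-service:latest",  # hypothetical image
    name="edge-analytics",
    detach=True,
    network="edge-bridge",
    # Restrict compute and memory so one container cannot starve the controller.
    nano_cpus=500_000_000,          # 0.5 CPU
    mem_limit="256m",
    # Frequency of data uploads to the cloud, read by the application inside.
    environment={"CLOUD_UPLOAD_INTERVAL_SECONDS": "300"},
    restart_policy={"Name": "unless-stopped"},
)
print(f"Started {container.name} on network {network.name}")
```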
Using the approach above, local operations can be further optimised by systematically moving HMI (human-machine interface) logic into containers, making expensive dedicated HMI equipment redundant.
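As a rough illustration of what containerised HMI logic could look like, the sketch below serves a minimal status page over HTTP from inside a container on the controller, so any browser or tablet on the local network can act as the operator panel. The port, page content and in-memory source of the readings are hypothetical.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# In a real deployment these values would come from the local data-processing
# container; here they are hard-coded purely for illustration.
LATEST_READINGS = {"boiler-temp-01": 72.8, "pump-pressure-02": 4.1}

class HmiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>"
                       for k, v in LATEST_READINGS.items())
        body = f"<html><body><h1>Plant status</h1><table>{rows}</table></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # Served on port 8080 inside the container; any browser on the local
    # network can then act as the HMI panel.
    HTTPServer(("0.0.0.0", 8080), HmiHandler).serve_forever()
```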
Applying the concept of app stores, the controller board can be made to mimic an app store hosting containers (acting as app variants) bundled with specific micro IoT services.
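One hedged way to picture this is a small catalogue of container images, each bundling a specific micro IoT service, from which the variants a given controller should run are selected. The catalogue entries, image names and resource limits below are invented for illustration.

```python
import docker

# Hypothetical "app store" catalogue: each entry is a container image bundling
# one micro IoT service, plus the resources it is allowed to consume.
APP_CATALOGUE = {
    "vibration-analytics": {"image": "registry.example.com/apps/vibration:1.2",
                            "mem_limit": "128m"},
    "energy-metering":     {"image": "registry.example.com/apps/energy:2.0",
                            "mem_limit": "64m"},
    "local-hmi":           {"image": "registry.example.com/apps/hmi:1.0",
                            "mem_limit": "96m"},
}

def install_app(name: str) -> None:
    """Pull and start one 'app' from the catalogue on this controller."""
    spec = APP_CATALOGUE[name]
    client = docker.from_env()
    client.images.pull(spec["image"])
    client.containers.run(spec["image"], name=name, detach=True,
                          mem_limit=spec["mem_limit"],
                          restart_policy={"Name": "always"})
    print(f"Installed app '{name}' from {spec['image']}")

if __name__ == "__main__":
    install_app("vibration-analytics")
```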