At present, IoT devices are not only proliferating but also growing in sophistication. The compute demands of artificial intelligence (AI) workloads mean that future IoT environments will consist of more than simple sensors driven by 8-bit microprocessors. Gradually, they will be populated by devices that host diverse sensors and actuators, combined with heterogeneous (general-purpose and GPU) processors. In fact, IDC expects this trend to be the main source of growth for the microprocessor industry, with edge devices representing 40.5% of the market by 2023.
The proliferation of these AI-powered, resource-rich IoT edge devices (the so-called “empowered edge”) will give rise to an ever-growing computing continuum, ranging from the cloud to the numerous devices at the edge that business analysts such as Gartner refer to as “autonomous things.”
Edge computing, according to the Open Glossary, refers to the delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost, and reliability of applications and services. By shortening the distance between devices and the cloud resources that serve them, and by reducing network hops, edge computing mitigates the latency and bandwidth constraints of today’s Internet, ushering in new classes of applications.
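The latency argument can be made concrete with a back-of-the-envelope round-trip budget. The hop counts, per-hop delays, and processing time below are purely hypothetical assumptions chosen for illustration, not measurements:

```python
# Illustrative latency budget: a request to a distant cloud region versus a
# request served by a nearby edge node. All numbers are assumed values.

def round_trip_ms(hops: int, per_hop_ms: float, processing_ms: float) -> float:
    """Total request latency: network traversal out and back, plus processing."""
    return 2 * hops * per_hop_ms + processing_ms

# Assumptions: 12 hops at ~4 ms each to the cloud; 2 hops at ~1 ms to the edge.
cloud_latency = round_trip_ms(hops=12, per_hop_ms=4.0, processing_ms=5.0)
edge_latency = round_trip_ms(hops=2, per_hop_ms=1.0, processing_ms=5.0)

print(f"cloud: {cloud_latency:.0f} ms, edge: {edge_latency:.0f} ms")
# → cloud: 101 ms, edge: 9 ms
```

Even with identical processing time, shortening the network path dominates the budget, which is the mechanism the definition above describes.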
A common topology for an edge computing installation is composed of three layers. From bottom to top, these are:
- IoT devices: connected to an edge node, they act as data sources, communicating with the edge environment over diverse protocols.
- Edge nodes: they enable data processing close to the data sources through near-real-time analytics and model execution. They offer diverse communication and messaging protocols for data acquisition from nearby IoT devices and serve as temporary data storage.
- Cloud services: they provide management functionality for both edge and IoT devices, and they perform long-term data storage and analytics.
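The three layers above can be sketched in a few lines of Python. This is a minimal illustration of the data flow, not a real framework; all class and method names (`IoTDevice`, `EdgeNode.ingest`, `CloudService.store`, etc.) are assumptions made for the sketch:

```python
# Sketch of the three-layer topology: IoT devices act as data sources, an
# edge node pre-processes readings near the source, and a cloud service
# keeps long-term storage. Names are illustrative, not a real API.
from statistics import mean

class IoTDevice:
    """Bottom layer: a sensor emitting raw readings."""
    def __init__(self, device_id: str, readings: list[float]):
        self.device_id = device_id
        self.readings = readings

class EdgeNode:
    """Middle layer: near-real-time analytics and temporary storage."""
    def __init__(self):
        self.buffer: dict[str, list[float]] = {}

    def ingest(self, device: IoTDevice) -> None:
        # Acquire raw data from a nearby IoT device.
        self.buffer.setdefault(device.device_id, []).extend(device.readings)

    def summarize(self) -> dict[str, float]:
        # Pre-process locally: only aggregates travel up to the cloud.
        return {dev: mean(vals) for dev, vals in self.buffer.items()}

class CloudService:
    """Top layer: long-term storage and fleet-wide analytics."""
    def __init__(self):
        self.history: list[dict[str, float]] = []

    def store(self, summary: dict[str, float]) -> None:
        self.history.append(summary)

edge = EdgeNode()
edge.ingest(IoTDevice("temp-1", [21.0, 22.0, 23.0]))
edge.ingest(IoTDevice("temp-2", [19.0, 19.5]))

cloud = CloudService()
cloud.store(edge.summarize())
print(cloud.history)  # one compact summary instead of every raw sample
```

The key design point mirrors the topology: raw samples never leave the edge node; the cloud layer only ever sees pre-processed summaries.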
The 5 advantages of edge computing
- Latency and reliability: responsiveness and reliability improve by maximizing processing at the edge, thus minimizing dependence on the cloud connection.
- Bandwidth: only pre-processed data is sent to the cloud or data center for mid-term analysis, so upstream traffic, and the cost it drives, stays under control regardless of the amount of raw data produced.
- Scalability: edge computing installations are not confined by data center boundaries or bound to conventional data-center fault-tolerance practices.
- Security: critical data is kept at the data source, which reduces exposure to breaches and hacking.
- Cost: the total cost of ownership (TCO) remains stable whatever the data volume and complexity.
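The bandwidth and cost claims above rest on a simple observation: if the edge node forwards only aggregates, upstream traffic grows with the number of summaries, not with the raw sample rate. A back-of-the-envelope sketch, with sample rates and record sizes assumed purely for illustration:

```python
# Illustrative arithmetic for the bandwidth/cost advantage: raw streaming
# versus forwarding one pre-processed aggregate per device per minute.
# All sizes and rates below are assumptions, not measured figures.

SAMPLE_BYTES = 8          # one raw reading (assumed)
SAMPLES_PER_SEC = 100     # per device (assumed)
SUMMARY_BYTES = 64        # one aggregate record per device per minute (assumed)

def raw_upstream_bytes_per_min(devices: int) -> int:
    """Upstream traffic if every raw sample were sent to the cloud."""
    return devices * SAMPLES_PER_SEC * 60 * SAMPLE_BYTES

def edge_upstream_bytes_per_min(devices: int) -> int:
    """Upstream traffic if the edge forwards only per-device aggregates."""
    return devices * SUMMARY_BYTES

for n in (10, 1000):
    raw = raw_upstream_bytes_per_min(n)
    agg = edge_upstream_bytes_per_min(n)
    print(f"{n} devices: raw {raw} B/min vs aggregated {agg} B/min "
          f"({raw // agg}x reduction)")
```

Under these assumptions the reduction factor is the same at any fleet size, which is what keeps cloud-bound traffic, and hence TCO, flat as raw data volume grows.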