How Does the Edge Plan Work?

Edge computing is a form of computing that takes place at the edge of corporate networks. The “edge” is the place where end devices such as phones, laptops, industrial robots, and sensors access the rest of the network. In the past, the edge was where these devices connected so they could deliver data to, and receive instructions and software updates from, a centrally located data center or the cloud. With the explosion of the Internet of Things (IoT), that model has shortcomings. This is where the edge plan comes in.

IoT devices gather so much data that the sheer volume requires larger and more expensive connections to data centers and the cloud. The nature of the work these devices perform is also creating a need for much faster connections between them and the data center or cloud. For example, if sensors in valves at a petroleum refinery detect dangerously high pressure in the pipes, shutoffs need to be triggered as soon as possible. If the analysis of that pressure data takes place at a distant processing center, the automatic shutoff instruction may come too late.
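
As a rough illustration of why that decision has to happen locally, here is a minimal Python sketch of an edge controller that checks each pressure reading against a safety threshold and triggers the shutoff on the spot, instead of waiting for a remote service to analyze the data. The threshold value and the sensor and valve interfaces are hypothetical placeholders, not any particular vendor's API.

```python
# Minimal sketch of a local safety check on an edge device.
# MAX_SAFE_PRESSURE_PSI, read_pressure(), and close_valve() are
# hypothetical placeholders for a real hardware interface.
import random
import time

MAX_SAFE_PRESSURE_PSI = 300.0  # assumed safety limit, for illustration only

def read_pressure() -> float:
    """Simulate a pressure reading from a pipeline sensor."""
    return random.uniform(250.0, 320.0)

def close_valve() -> None:
    """Stand-in for the call that actually actuates the shutoff valve."""
    print("Shutoff triggered locally -- no round trip to a remote data center.")

def forward_for_analysis(reading: float) -> None:
    """Non-urgent copy of the reading, queued for upload on a slower link."""
    print(f"Queued reading {reading:.1f} psi for later upload.")

while True:
    pressure = read_pressure()
    if pressure > MAX_SAFE_PRESSURE_PSI:
        close_valve()                    # time-critical decision made at the edge
        break
    forward_for_analysis(pressure)       # everything else can wait
    time.sleep(1)
```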

With processing power placed local to the end devices, that round trip is dramatically shorter, potentially saving downtime, damage to property, and even lives. Even with edge devices providing local computing and storage, there will still be a need to connect them to data centers, whether those are on premises or in the cloud.

Temperature and humidity sensors in agricultural fields gather valuable data, but that data doesn’t have to be analyzed or stored in real time. Edge devices can collect, sort, and perform preliminary analysis of the data, then send it along to where it needs to go: to centralized applications or some form of long-term storage, again either on-prem or in the cloud. Because this traffic may not be time-sensitive, slower, less expensive connections – possibly over the internet – can be used. And because the data is pre-sorted, the volume of traffic that has to be sent upstream at all may be reduced.
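
A sketch of that pre-sorting step might look like the following: an edge device condenses a window of raw temperature and humidity readings into a small summary before sending it over the slower link. The field names and the upload function are illustrative assumptions, not part of any particular product.

```python
# Sketch of edge-side aggregation: condense a window of raw sensor
# readings into a small summary before uploading it on a slower,
# cheaper connection. Field names and upload_summary() are illustrative.
from statistics import mean

def summarize(readings: list[dict]) -> dict:
    """Reduce a window of raw samples to the values worth keeping."""
    temps = [r["temp_c"] for r in readings]
    humidities = [r["humidity_pct"] for r in readings]
    return {
        "samples": len(readings),
        "temp_min": min(temps),
        "temp_max": max(temps),
        "temp_avg": round(mean(temps), 2),
        "humidity_avg": round(mean(humidities), 2),
    }

def upload_summary(summary: dict) -> None:
    """Placeholder for the slower, non-time-sensitive upstream transfer."""
    print("Uploading summary:", summary)

# One hour of simulated field readings, one per minute.
raw_window = [{"temp_c": 20 + i * 0.05, "humidity_pct": 55 - i * 0.1}
              for i in range(60)]

# Sixty raw records become one small summary record sent upstream.
upload_summary(summarize(raw_window))
```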

The upside of edge computing is faster response times for the applications that need them and slower growth in expensive long-haul connections to processing and storage centers. The downside can be security. With data being collected and analyzed at the edge, it’s important to secure both the IoT devices that connect to the edge devices and the edge devices themselves. They contain valuable data, but they are also network elements that, if exploited, could be used to compromise other devices that hold stores of valuable assets.
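
One common building block for that kind of protection is verifying that a message really came from a known device before the edge node acts on it or forwards it. The sketch below shows a shared-key HMAC check using only Python's standard library; the key handling and message format are simplified assumptions, not a complete security design.

```python
# Sketch of message authentication between an IoT device and an edge node,
# using a shared-key HMAC from the standard library. Key distribution,
# rotation, and transport encryption are out of scope for this example.
import hashlib
import hmac

SHARED_KEY = b"example-key-provisioned-per-device"  # assumption: one key per device

def sign(payload: bytes) -> str:
    """What the IoT device attaches to each message it sends."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """What the edge node checks before trusting or forwarding the data."""
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

message = b'{"sensor": "valve-17", "pressure_psi": 287.4}'
tag = sign(message)

print(verify(message, tag))                    # True: accept the reading
print(verify(b'{"pressure_psi": 9999}', tag))  # False: reject the tampered message
```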

With edge computing becoming more essential, it’s also important to make sure that the edge devices themselves don’t become a single point of failure. Network architects need to build in redundancy and provide failover contingencies to avoid crippling downtime if a primary node goes down.
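
In practice that can be as simple as having every sensor or gateway know about more than one edge node and falling back when the primary stops responding. The sketch below shows the idea with hypothetical node addresses and a simulated send function; a real deployment would rely on whatever health checks and replication its platform provides.

```python
# Sketch of client-side failover across redundant edge nodes.
# Node addresses and the intermittent failure are simulated for illustration.
import random

EDGE_NODES = ["edge-primary.local", "edge-standby-1.local", "edge-standby-2.local"]

class NodeUnavailable(Exception):
    pass

def send_to_node(node: str, payload: dict) -> str:
    """Stand-in for a real network call; the primary randomly 'goes down'."""
    if node == "edge-primary.local" and random.random() < 0.5:
        raise NodeUnavailable(node)
    return f"ack from {node}"

def send_with_failover(payload: dict) -> str:
    """Try each node in priority order instead of depending on a single one."""
    for node in EDGE_NODES:
        try:
            return send_to_node(node, payload)
        except NodeUnavailable:
            continue  # that node is down; fall through to the next one
    raise RuntimeError("all edge nodes unreachable")

print(send_with_failover({"sensor": "soil-7", "temp_c": 21.3}))
```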

The industry has already gone a long way toward addressing the demands of edge computing, and it is becoming mainstream. Its importance is likely to grow even more as real-time applications become more prevalent. In summary, the edge plan brings computing closer to the end devices to reduce latency and improve response times for the applications that need them, and it helps slow the growth of expensive connections to processing and storage centers. It does, however, require securing the edge devices themselves and building in redundancy to avoid downtime.