Wednesday, March 6, 2019

New Use Cases are Driving Demand for Edge Computing

By 2025, almost 20 percent of the data created will be real-time in nature, says B.S. Teh, Seagate SVP. That is the sort of data that benefits from local processing at the edge, close to where it actually is generated, rather than being sent to the core of the network for processing.

“We see edge as a big driver of growth,” he said. Some growing use cases, such as video surveillance, will benefit from edge processing, not always because latency is critical, but simply to avoid wide area network transport costs.

The top reason for using edge computing is ultra-low latency. But there are other drivers, including use cases where processing at the edge eliminates the need to move bulk data generated by bandwidth-intensive applications across the wide area network.

Wikibon compared the three-year management and processing costs of a cloud-only solution using AWS IoT services with those of an edge-plus-cloud solution, supporting cameras, security sensors, sensors on the wind turbines, and access sensors for all employee physical access points at a remote wind farm.

At a distance of 200 miles between the wind farm and the cloud, and with an assumed 95 percent reduction in traffic from using the edge computing capabilities, the total cost drops from about $81,000 to $29,000 over three years. The cost of edge-plus-cloud computing is about a third the cost of a cloud-only approach, Wikibon estimated.
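
To make the arithmetic concrete, here is a minimal Python sketch of that kind of comparison. The 95 percent traffic reduction and the roughly $81,000 and $29,000 three-year totals come from the Wikibon estimate cited above; the split between volume-driven and fixed costs and the edge hardware figure are illustrative assumptions chosen only to show how a traffic reduction of that size can translate into a roughly two-thirds cost saving.

```python
# Back-of-the-envelope model of an edge-plus-cloud vs. cloud-only comparison.
# Only the 95% traffic reduction and the ~$81K cloud-only total are taken
# from the cited Wikibon figures; every other parameter is an assumption
# for illustration, not data from the study.

CLOUD_ONLY_3YR = 81_000        # cited three-year cloud-only cost (USD)
TRAFFIC_REDUCTION = 0.95       # assumed share of raw data filtered at the edge

# Assumption: 75% of the cloud-only bill (WAN transport plus cloud-side
# ingestion and processing) scales with data volume; the rest is fixed.
volume_driven_share = 0.75
fixed_share = 1 - volume_driven_share

edge_hardware_and_ops = 5_000  # assumption: three-year edge node and ops cost

edge_plus_cloud = (
    CLOUD_ONLY_3YR * fixed_share
    + CLOUD_ONLY_3YR * volume_driven_share * (1 - TRAFFIC_REDUCTION)
    + edge_hardware_and_ops
)

print(f"Cloud-only (3 yr):      ${CLOUD_ONLY_3YR:,.0f}")
print(f"Edge-plus-cloud (3 yr): ${edge_plus_cloud:,.0f}")
print(f"Edge-plus-cloud is {edge_plus_cloud / CLOUD_ONLY_3YR:.0%} of cloud-only cost")
```

With those assumed inputs the model lands near the cited $29,000 figure, about a third of the cloud-only cost, which is the point of the comparison: when most of the bill scales with data volume, filtering 95 percent of the traffic at the edge dominates the savings.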

