It has to be said: edge computing is not really new. In a real sense, nearly all computing happens at the edge; it just might not be your local edge. About the only computing that happens “in the core” is the set of operations transmission networks perform to deliver bits from one place to another.
Still, we normally analyze the successive eras of computing in terms of centralization versus decentralization. The mainframe era was centralized. The client-server era was more decentralized. The internet era has been centralized. Now, with the emerging internet of things era, we seem to be moving back, in part, toward decentralization.
The drivers of edge computing utility tend to center on internet of things use cases where data must be analyzed very quickly (requiring ultra-low latency), or where the sheer volume of data (full-motion, very-high-definition video, for example) makes transport to remote computing centers a cost issue.
We might ultimately even find that many of the latency-sensitive or transport-cost-sensitive use cases can be handled by metro-level computing, neither strictly “at the edge” nor at remote hyperscale data centers.