Most would consider edge computing a move back toward decentralized computing. Where mainframes centralized computing operations, personal computing moved them to the desktop, and cloud computing then shifted much of that work back to a centralized model. Seen in that arc, edge computing looks like a return to more-local computing.
But not everybody agrees that edge computing is a return to decentralized computing.
Cloudflare, for example, considers edge computing an instance of “centralized applications running close to users.” That makes sense from the perspective of a content delivery network. Viewed from the standpoint of “where computing happens,” though, the term “edge computing” captures the concept well.
Still, most would agree that edge computing puts computing operations closer to edge devices and users, even if the web remains a form of decentralized computing supported by centralized, remote servers.
Half a decade ago it might have been controversial to talk about the end of cloud computing. These days, even if most of the growth is at the edge, we seem to be heading toward a hybrid model where computing happens wherever it makes the most sense.
For applications requiring extremely low latency, real-time processing of large quantities of data, or autonomous operation, local processing will make sense. Many other applications are still most efficient using the remote computing (cloud) model.
As networks become increasingly virtualized and heterogeneous, with resources used where they make the most sense, we might say computing is moving to a more complex model as well.