“Edge computing” is much harder to define than “cloud computing.” AWS defines cloud computing as “the on-demand delivery of IT resources over the Internet with pay-as-you-go pricing.”
Not all of the clauses in that definition apply to edge computing in every case.
Edge computing can occur on a device, in which case internet delivery is not necessary. It might occur on an organization’s premises, in which case delivery is over a private network.
Also, when the enterprise owns the computing resources, there is no pay-as-you-go pricing: no usage fees or payments.
And one might not consider a device’s own processor and memory, or a user-owned server located on premises or at a remote site, to provide “on-demand” delivery. In such instances workloads are invoked as needed, but not “on demand” in the sense of drawing on a third party’s compute infrastructure.
So edge computing rests on the notion of computing close to the location where it is needed.
That has arguably always been the case for personal computers, smartphones, and other devices, and for enterprises and other organizations computing on owned equipment on their own premises. For remote computing, it often means moving resources geographically closer to the point of use.
Others might add that edge computing involves local or onboard computing; running analytics on the device or premises; or that it happens wherever the digital and physical worlds intersect.
Such notions suggest why hybrid computing has emerged: workloads are already a combination of edge and remote processes.
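To make that hybrid split concrete, here is a minimal Python sketch of a workload divided between an edge stage and a remote stage: raw samples are filtered and summarized on the device, and only the small aggregate is handed to the cloud side. The endpoint URL, anomaly threshold, and sensor readings are illustrative assumptions, not details from the source.

```python
import json
import statistics

# Hypothetical ingest endpoint; a real deployment would POST to an
# actual cloud service rather than this placeholder URL.
CLOUD_ENDPOINT = "https://example.com/ingest"

def read_sensor_window():
    """Stand-in for on-device data capture (e.g., temperature samples)."""
    return [21.4, 21.6, 35.2, 21.5, 21.7, 21.3]

def process_at_edge(samples, threshold=30.0):
    """Edge stage: filter and summarize locally, so only a small
    aggregate, not the raw stream, ever leaves the device."""
    anomalies = [s for s in samples if s > threshold]
    return {
        "mean": round(statistics.mean(samples), 2),
        "max": max(samples),
        "anomaly_count": len(anomalies),
    }

def send_to_cloud(summary):
    """Remote stage: stubbed here; in practice this would be an HTTP
    POST or message-queue publish to CLOUD_ENDPOINT."""
    print(f"POST {CLOUD_ENDPOINT} -> {json.dumps(summary)}")

if __name__ == "__main__":
    summary = process_at_edge(read_sensor_window())
    send_to_cloud(summary)  # only the summary crosses the network
```

The design choice the sketch illustrates is the usual hybrid trade: latency-sensitive, data-heavy filtering happens at the edge, while aggregation, storage, and fleet-wide analytics remain a remote process.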
Source: 451 Research, S&P Global Market Intelligence