Edge computing is certain to play a bigger role in our computing fabric as augmented reality, virtual reality and future Metaverse environments become possible. “Even at ultra-low latency, it makes little sense to stream (versus locally process) AR data given the speed at which a camera moves and new input data is received (i.e. literally the speed of light and from only a few feet away),” says Matthew Ball, EpyllionCo managing partner.
The conventional wisdom today, says Ball, is that multiplayer games, to say nothing of more immersive applications, stop working when total latency exceeds 150 milliseconds, and that user experience is already impaired at latencies as low as 50 milliseconds.
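To see why distance, and not just bandwidth, drives that argument, here is a rough latency-budget sketch. The distances and per-stage processing costs below are illustrative assumptions, not figures from Ball; only the 50 millisecond threshold comes from the claim above.

```python
# Rough, illustrative round-trip latency budget for streaming AR frames.
# All distances and per-stage overheads are assumptions for the sake of
# back-of-envelope math, not figures from Ball or Intel.

SPEED_IN_FIBER_KM_PER_MS = 200  # light covers roughly 200 km per millisecond in fiber (~2/3 c)

def round_trip_ms(distance_km, capture_ms=5, encode_ms=10, render_ms=10, network_overhead_ms=10):
    """Propagation delay there and back, plus assumed fixed processing costs."""
    propagation = 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS
    return propagation + capture_ms + encode_ms + render_ms + network_overhead_ms

for label, km in [("on-device", 0), ("metro edge node", 100), ("regional cloud", 2000)]:
    total = round_trip_ms(km)
    print(f"{label:>16}: ~{total:.0f} ms total "
          f"({'over' if total > 50 else 'under'} the 50 ms impairment threshold)")
```

Under these assumptions, propagation itself is tiny for a nearby edge node (about 1 ms for 100 km) but grows to roughly 20 ms round trip for a data center 2,000 km away, which is enough to push an already-tight processing budget past the 50 ms mark. That is the case for pushing compute closer to the user, or keeping it on the device entirely.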
Will the Metaverse require 1,000 times more computing power? Intel thinks so. And that implies we might be decades away from a ubiquitous, widely accepted Metaverse that people actually use routinely.
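As a rough sanity check on "decades" (my arithmetic, not Intel's): a 1,000-fold gain is about ten doublings, so the timeline depends almost entirely on how quickly you assume computational efficiency doubles.

```python
import math

# Back-of-envelope: how long does a 1,000x gain take at a steady doubling rate?
# The doubling periods are assumptions (roughly Moore's-law-like pacing), not Intel's figures.
target_gain = 1000
doublings_needed = math.log2(target_gain)  # ~10 doublings

for years_per_doubling in (2.0, 2.5, 3.0):
    years = doublings_needed * years_per_doubling
    print(f"Doubling every {years_per_doubling} years -> ~{years:.0f} years to reach 1,000x")
```

At a doubling every two to three years, ten doublings lands somewhere between 20 and 30 years, which is where the "decades away" framing comes from, unless architectural or algorithmic breakthroughs shortcut the curve.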
“Consider what is required to put two individuals in a social setting in an entirely virtual environment: convincing and detailed avatars with realistic clothing, hair and skin tones; all rendered in real time and based on sensor data capturing real world 3D objects, gestures, audio and much more; data transfer at super high bandwidths and extremely low latencies; and a persistent model of the environment, which may contain both real and simulated elements,” says Raja Koduri, Intel SVP and GM of Intel’s Accelerated Computing Systems and Graphics Group. “Now, imagine solving this problem at scale--for hundreds of millions of users simultaneously--and you will quickly realize that our computing, storage and networking infrastructure today is simply not enough to enable this vision.”
“We need several orders of magnitude more powerful computing capability, accessible at much lower latencies across a multitude of device form factors,” says Koduri.
“Truly persistent and immersive computing, at scale and accessible by billions of humans in real time, will require even more: a 1,000-times increase in computational efficiency from today’s state of the art,” he notes.