Saturday, April 30, 2022

5G Edge Computing Not a Big Deal in Asia-Pac in 2025?

To the extent that multi-access edge computing is linked with 5G, revenue impact in some regions--including Asia-Pacific--might be quite muted through 2025.


You would be hard pressed to find any observer who does not believe edge computing, private networks and network slicing will lift revenue for mobile operators over the next decade, in the Asia-Pacific or any other region.


The only question is the magnitude of those increases. And that is where matters get tricky. Some forecasts suggest sharp drop-offs in Asia-Pacific mobile revenue through 2025, compared to trends up to 2019.


But most forecasts call for mobile revenue in the range of $230 billion to $390 billion by about 2025, with total revenue--fixed and mobile--closer to $500 billion in the region.


If 5G revenue earned by mobile operators in the Asia-Pacific region reaches $24 billion by about 2025, then 5G would represent between six percent and 10 percent of mobile operator revenues.


If one assumes that consumer mobile connections represent 90 percent of 5G revenue in 2025, and using the higher figure of $24 billion in 5G revenue, then edge computing, network slicing and private networks together would represent perhaps $2.4 billion in revenue.


That is a small amount contributed by three new revenue sources. 


But some believe 5G might contribute less, perhaps $14 billion in mobile revenues by about 2025. In that case, 5G would represent between four percent and six percent of mobile operator revenues in 2025.


In that case, network slicing, private networks and edge computing would be negligible revenue contributors, generating perhaps $1.4 billion, well under one percent of mobile operator revenues.
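The arithmetic behind both scenarios is simple enough to sketch, using only the revenue figures quoted above (the 10 percent share for new services is the post's own assumption):

```python
# Revenue-share arithmetic for the two 5G scenarios discussed above.
# Figures are the ones quoted in the post, in billions of USD.

mobile_revenue_range = (230, 390)   # Asia-Pacific mobile revenue by ~2025

def five_g_share(five_g_revenue, new_services_fraction=0.10):
    """Return 5G's low and high share of mobile revenue, plus the slice
    attributable to edge, slicing and private networks combined."""
    low = five_g_revenue / mobile_revenue_range[1]
    high = five_g_revenue / mobile_revenue_range[0]
    new_services = five_g_revenue * new_services_fraction
    return low, high, new_services

# Higher scenario: $24 billion in 5G revenue
low, high, new_rev = five_g_share(24)
print(f"5G share: {low:.0%} to {high:.0%}; new services: ${new_rev:.1f}B")

# Lower scenario: $14 billion in 5G revenue
low, high, new_rev = five_g_share(14)
print(f"5G share: {low:.0%} to {high:.0%}; new services: ${new_rev:.1f}B")
```

The higher scenario yields the six-to-10-percent range and $2.4 billion; the lower scenario yields four to six percent and $1.4 billion.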


At such levels, the impact of changes in subscription volume, average revenue per account, increases in internet access revenues and market share changes will have far more impact on mobile operator revenues than network slicing, edge computing and private networks.


Google Cloud Acquires MobiledgeX

MobiledgeX, which sought to become an operating system or orchestration layer for multi-access edge computing, now will be an open source framework owned by Google Cloud. 


Two possible avenues for development arguably exist. The former MobiledgeX assets could be used by ecosystem participants to support their own edge computing products and services, a horizontal approach. 


source: MobiledgeX 


Or, those assets might play more directly in connectivity provider efforts to build on cloud-native computing supporting their own internal operations. 


Even if some entities do both, the largest opportunity likely exists in the vertical plane, as a support to wider connectivity provider shifts of computing platform to cloud-native operations using public cloud resources. 


Google Cloud's Anthos has been used by a number of telcos globally as part of the building of their own internal cloud operations.  


Some indications could emerge as supporters of the original MobiledgeX effort further develop their edge strategies. Deutsche Telekom, Samsung, VMware, SK Telecom, Telefónica and Singtel are among the firms to watch.


As so often happens when telcos try to create new roles for themselves in ecosystem adjacencies, the efforts eventually fail. That has been true for app store, data center, device, cloud and premises computing initiatives. Some might add content ownership and operations to that list.


What happens with the internet of things or private networks is not yet clear. Still, it has to be noted that telco efforts to diversify into adjacencies to connectivity have not generally worked well.


Wednesday, April 27, 2022

Edge Computing Will Underpin Metaverse, Eventually

Some technology transformations are so prodigious that it takes decades for mass adoption to happen. That is likely to be true for edge computing.


Edge computing will eventually represent a significant portion of total computing as a service revenues. What is not yet clear is the percentage of total computing as a service that edge will claim in the future, compared to classic cloud computing.


We might point to artificial intelligence or virtual reality as other prime examples of how long major technology transformations actually take. It can take decades before an innovation becomes commercially ubiquitous. Now we probably can add Web 3.0 and the metaverse to that list.


At a practical level, we might also point to the delay of “new use cases” developing during the 3G and 4G eras. That is likely to happen with 5G as well. Some futuristic apps predicted for 3G did not happen until 4G. Some will not happen until 5G. Likely, many will not mature until 6G. 


The simple fact is that the digital infrastructure will not support metaverse immersive apps, as envisioned, for some time. Latency performance is not there; compute density is not there; bandwidth is not there. 


In fact, it is possible to argue that metaverse is itself digital infrastructure, as much as it might also be viewed as an application supported by a range of other elements and capabilities, including web 3.0, blockchain and decentralized autonomous organizations, artificial intelligence, edge computing, fast access networks and high-performance computing. 


source: Constellation Research 


Scaling persistent, immersive, real-time computing globally to support the metaverse will require computational efficiency 1,000 times greater than today’s state of the art can offer, Intel has argued. 
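That 1,000x target suggests how long the runway could be. A back-of-envelope sketch, assuming (hypothetically, not from Intel) that computational efficiency doubles roughly every two to three years:

```python
import math

target_gain = 1_000                  # Intel's estimate of required efficiency gain
doublings = math.log2(target_gain)   # ~10 doublings of efficiency needed

# Assumed (not from the post): one doubling every two to three years
for years_per_doubling in (2, 3):
    years = doublings * years_per_doubling
    print(f"{years_per_doubling} yr/doubling -> ~{years:.0f} years to 1,000x")
```

Under those assumptions the gap closes in roughly 20 to 30 years, consistent with the "decades" framing of this post.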


To reduce latency, computing will have to move to the edge and access networks will have to be upgraded. 


All of that takes time, lots of capital investment and an evolution of business models and company cultures. Metaverse is coming, but it is not here today, and will take a decade or more to fully demonstrate its value. Major technology transformations are like that. 


Monday, April 25, 2022

In 2030, Most IoT Devices Will Use a Short-Range Connectivity Solution, Transforma Insights Predicts

Whatever else one might say about the internet of things, it seems clear there will be many billions of sensors that require some sort of connection to broader internet resources.


But the types of possible connections are myriad, with choices ranging from bits per second data rates up to Gbps rates, with a corresponding range of connectivity platforms and services. Many apps and devices will be able to use standard Wi-Fi for connectivity or other short-range connectivity solutions such as Bluetooth. 


Others might require either mobile or satellite network connections. 


source: Eseye, Transforma Insights 


There remains debate about whether specialized low power data networks or mobile networks will be the biggest supplier of connections, compared to use of private network options. Most observers likely believe most connections will use short-range connectivity platforms using unlicensed spectrum. 


Of perhaps 27 billion IoT devices and sensors at work by 2030, perhaps a third will use some form of wide area network connection, according to Transforma Insights data.
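The split implied by that forecast is straightforward; a minimal sketch using the figures above:

```python
# Device split implied by the Transforma Insights forecast cited above.
total_devices_billion = 27   # IoT devices and sensors at work by 2030
wan_fraction = 1 / 3         # share using some form of wide area network

wan = total_devices_billion * wan_fraction
short_range = total_devices_billion - wan
print(f"WAN-connected: ~{wan:.0f}B; "
      f"short-range (Wi-Fi, Bluetooth and similar): ~{short_range:.0f}B")
```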


Sunday, April 24, 2022

No Metaverse Without Edge Computing

It is possible to predict that edge data centers and edge data processing will be foundational for support of metaverse applications and use cases. But it also is possible to predict that much more orchestration also will be required. 


In fact, improvements seem likely almost everywhere, from chips and algorithms to devices to connectivity and computing infrastructures. 

source: CB Insights 


Much as today’s web pages are essentially assembled in real time, so too will metaverse experiences and other immersive applications require assembling elements from many sources, in real time. That will require much more real-time communication and computing, and likely more server density as well.


Architectures that are more open also are likely. Edge computing, hybrid local and remote real-time rendering, video compression, cross-layer visibility, network optimizations and improved latency between devices and within radio access networks all seem necessary. 


Friday, April 22, 2022

Can Digital Transformation Results be Quantified?

It might not be so easy to describe or implement digital transformation as it applies to professional services firms. According to a Hinge Marketing study, “the most common goal firms want to achieve through digital transformation is improving their customer/client experience.”

source: Hinge Marketing 


That would be difficult to quantify under the best of circumstances. But many firms use net promoter scores as a proxy for satisfaction. 


Respondents to a survey do believe their annual revenue and profitability have increased.

More than 60 percent of firms cited growth in revenue and profitability as a result of digital transformation efforts. 


The caveats are that these are user perceptions of change that might be hard to quantify. It might also be the case that multiple changes are made simultaneously, limiting ability to attribute the success to any single factor. 


Many firms undertake digital transformation to improve efficiency, Hinge says, often by automating processes. Some might argue that is “digitalization” but not necessarily “digital transformation.” It might not matter much if applying technology leads to performance gains that can be quantified.


Operational efficiency was most likely to increase as a result of digital transformation, as over 75 percent of firms reported they experienced an increase, says Hinge. 


A large majority of firms also saw increases in other critical performance metrics such as client satisfaction and awareness.


Or at least that is what respondents claim. Realists will regard those responses with caution. Very few professional services firms likely can actually quantify what their digital transformation projects have produced. 


Thursday, April 14, 2022

Google Distributed Cloud Runs Almost Anywhere on the Edge


Google Distributed Cloud is designed to run in customers' data centers; on customers' edge devices; on the edge infrastructure of network operators and on Google's own public cloud infrastructure.

Why Most Edge Computing Revenue Forecasts are Too Optimistic

As important as edge computing will likely be for mobile operators in many markets, estimates of incremental revenue are virtually always too optimistic. The reason is that most of the activity, and close to all of the actual revenue from “edge computing as a service,” will not be earned by mobile operators.

source: STL Partners 


The same goes for edge computing estimates that involve sensors, edge customer premises equipment, platforms supporting those devices, server and other hardware. The most obvious opportunities for mobile operators are real estate (racks, power, air conditioning, security) and connectivity services. 


Of those two types of value, connectivity is the greatest likely revenue driver. There are some opportunities to operate private 5G networks on behalf of enterprise clients as well. But it remains to be seen how the private 5G market develops. 


It might not be unreasonable to assume that the suppliers of enterprise 5G private networks are the same entities that already supply much enterprise information technology services. System integrators, for example, would seem to be likely winners in that regard.


Will 25% of 5G Use Cases Involve Edge?

It is by now relatively well accepted that most enterprise compute workloads will happen at the edge by 2025, rather than being processed remotely. That might be important for 5G value and use cases if, as some expect, as much as a quarter of 5G use cases are based on edge computing by about 2023. 


source: MIT Technology Review 



Friday, April 8, 2022

Web 3.0 Will Require Edge Computing

Web 3.0 often is said to be “decentralized,” featuring applications based on open-source, trustless, and permissionless blockchain networks. Web 3.0 is said to rely on: 

  • Blockchain

  • Crypto assets

  • Low code or “no code” app development

  • Artificial intelligence

  • Metaverse or extended or virtual reality

  • Users able to monetize their data


It also is reasonable to argue that if Web 3.0 develops, it will build on edge computing as well, simply because the sheer amount of data updates--in real time-- for any immersive experience will require high-performance computing very close to the end user. 

source: GlobalData 


source: Elastos


Wednesday, April 6, 2022

Edge is Important for Streaming Content

As a survey of content streamers suggests, application latency is affected by local loop issues, but primarily by encoding and transcoding. About 17 percent of latency issues were attributed to the access networks on either end of any streaming operation. Most of the latency is caused by encoding or transcoding at the launch or receive ends, rather than in the core network.


source: Wowza 


As with anything related to the internet, end-to-end performance is affected by lots of things--not just the performance of access networks. These days, core networks add so little latency as to be almost negligible in most cases. 


Also, use of content delivery networks (a form of edge computing) reduces the impact of core network transit issues. About 42 percent of streaming content providers say CDNs (“short segment lengths”) are a primary tool for controlling latency issues.

source: Wowza 
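To make the survey's proportions concrete, here is a hypothetical end-to-end latency budget. The 17 percent access-network share is the survey figure; the five-second total and the core/coding split are illustrative assumptions, not from Wowza:

```python
# Hypothetical end-to-end streaming latency budget.
# Only the 17 percent access share comes from the survey cited above;
# the 5-second total and remaining split are illustrative assumptions.
total_latency_s = 5.0

access_share = 0.17                            # access networks (survey figure)
core_share = 0.03                              # core transit: near negligible (assumed)
coding_share = 1 - access_share - core_share   # encoding/transcoding dominates

for name, share in [("access", access_share),
                    ("core", core_share),
                    ("encode/transcode", coding_share)]:
    print(f"{name:>16}: {share:.0%} -> {total_latency_s * share:.2f} s")
```

The point of the sketch: even large improvements in access networks move only a modest slice of the total budget.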


Streaming providers universally hope for improved performance in the future. 

source: Wowza

Hyperscalers Already Dominate Edge Computing

Hyperscalers already seem poised to dominate edge computing. Wavelength from Amazon Web Services (AWS), Azure Edge Zones from Microsoft and Anthos for Telecom from Google Cloud appear to be the way connectivity service providers will make edge computing services available on their networks, eschewing becoming edge computing suppliers in their own right. 


The upside from multi-access edge computing therefore will come indirectly, in the form of account additions, churn reduction, some real estate revenue and support for mobile operator core network operations, as well as marketing. 


Much of the value might come as connectivity providers pitch their networks as better able to support low-latency applications, private networks or internet of things. 

source: Heavy Reading 


Beyond the obvious explanation that the hyperscalers have scale advantages mobile operators could not hope to overcome, ease of deployment and time to market seem key advantages.


source: Heavy Reading 


In the end, edge computing “as a service” remains a computing competency the hyperscalers dominate. They are the likely winners, for that reason, as computing continues to move to the edge. 



Sunday, April 3, 2022

Can Metaverse Exist Without Edge Computing?

Though metaverse will require many changes in computing and communications, it almost certainly will require edge computing.


As video content distribution has shaped global demand for inter-continental data transport and high-speed connections between major data centers, the metaverse will shape data center and connectivity network requirements. And the key words are “more” and “less.”


“Making the metaverse a reality will require significant advancements in network latency, symmetrical bandwidth and overall speed of networks,” says Dan Rabinovitsj, Meta VP for connectivity. 


The metaverse “will require innovations in fields like hybrid local and remote real-time rendering, video compression, edge computing, and cross-layer visibility, as well as spectrum advocacy, work on metaverse readiness of future connectivity and cellular standards, network optimizations, improved latency between devices and within radio access networks (RANs), and more,” he says. 


Already, experts predict metaverse environments will require more data centers, more edge computing, more distributed computing, more colocation, more content distribution mechanisms, more power consumption and more cooling.


Eventually, fully-developed metaverses will require advances in chip technology as well. Beyond all that, blockchain will probably be necessary to support highly-decentralized value exchanges. And it is impossible to separate metaverse platforms and experiences from use of artificial intelligence, for business or consumer uses. 


source: iCapital Network 


If metaverses are built on persistent and immersive computing and tightly-integrated software stacks, platforms will be necessary. New developments in chip manufacturing also will be needed. 


For connectivity providers--especially internet service providers--far lower latency will be key. Today’s latency-sensitive applications such as video calling and cloud-based games have to meet round-trip latencies of 75 milliseconds to 150 ms. Multi-player, complex games might require 30 ms latency.


“A head-mounted mixed reality display, where graphics will have to be rendered on screen in response to where someone is focusing their eyes, things will need to move an order of magnitude faster: from single to low double digit ms,” says Rabinovitsj. 


Image rendering will require edge computing. “We envision a future where remote rendering over edge cloud, or some form of hybrid between local and remote rendering, plays a greater role,” he adds. “Enabling remote rendering will require both fixed and mobile networks to be rearchitected to create compute resources at a continuum of distances to end users.”


Bandwidth requirements could increase by orders of magnitude over what is required to view 720p video on a standard smartphone screen, which might work with just 1.3 Mbps to 1.6 Mbps of downlink throughput.


But a head-mounted display sitting just centimeters from the eyes, required to display images at retina-grade resolution, will need bandwidth many orders of magnitude greater, he notes.
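A back-of-envelope sketch of that gap, assuming an entirely hypothetical headset (8K per eye, 90 frames per second, 24 bits per pixel, 200:1 video compression -- none of these figures are from Meta), measured against the 1.6 Mbps 720p figure above:

```python
# Compare 720p smartphone streaming with a hypothetical retina-grade headset.
# The 720p figure is from the post; all headset parameters are assumptions.

smartphone_720p_mbps = 1.6

pixels = 7680 * 4320 * 2    # 8K per eye, two eyes (assumed)
fps = 90                    # frame rate (assumed)
bits_per_pixel = 24         # color depth (assumed)
compression = 200           # aggressive 200:1 video compression (assumed)

headset_mbps = pixels * fps * bits_per_pixel / compression / 1e6
print(f"Headset: ~{headset_mbps:,.0f} Mbps, "
      f"~{headset_mbps / smartphone_720p_mbps:,.0f}x the 720p figure")
```

Even under these aggressive compression assumptions, the hypothetical headset needs hundreds of times the 720p rate; looser assumptions push the gap further toward the orders of magnitude the post describes.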


To be sure, most of what happens that is part of metaverse experiences rests on things that happen up the stack from computing and communications. 


source: Constellation Research


But we already can see how metaverse support will require changes in computing architecture and network capabilities.