Thursday, December 29, 2022

Containerized Edge Computing Forecast

“What is the edge?” is a big question these days, as an answer has to be formulated before we can figure out where revenue growth opportunities might lie, and for whom those opportunities exist. 


Containerized miniature data centers are one expression of edge computing. In Asia-Pacific markets, as well as others globally, the likely revenue participants are server suppliers, cloud computing firms and the ecosystem players that supply the containerized and racked computing facilities.


source: Global Market Insights


As with most other information technology platforms, a majority of revenues will likely flow to hardware and software suppliers and system integrators.

source: Global Market Insights

Wednesday, December 21, 2022

Cloud Keeps Growing, Issue is Whether Some Use Cases Switch Back to--or Remain--Private

Everyone seemingly agrees public cloud spending is growing, and growing faster than spending on private cloud or traditional enterprise computing. But some analyses of spending suggest the market is still young. Public cloud spending might still represent just single-digit portions of enterprise information technology budgets.


By now, cloud computing's advantages are clear enough: lower cost for small and growing companies, plus flexibility for unpredictable workloads. The downsides might include security, downtime, vendor lock-in, data portability and even cost when workloads are sufficiently large or unmonitored.


Mid-sized companies with stable workloads might actually find that public cloud costs more than using in-house private computing. The cloud cost advantage might hold mainly for small workloads and simple apps, or for highly irregular, “spiky” compute volume, some argue.


Still, Gartner analysts say that by 2025, 51 percent of IT spending in the four enterprise categories that can transition to cloud--application software, infrastructure software, business process services and system infrastructure--will have shifted from traditional solutions to the public cloud.


Figure 1: Sizing Cloud Shift, Worldwide, 2019 – 2025

source: Gartner


Security or criticality probably will remain drivers for private computing, both cloud and traditional.  And we will have to see whether total cost of ownership at scale tips decisions back towards private computing, as volume grows for any single entity.
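To make that trade-off concrete, here is a minimal sketch in Python using entirely hypothetical figures: public cloud cost is modeled as scaling with usage, while a private deployment carries a large fixed cost plus a smaller per-unit cost. All of the dollar values and the per-unit cost model are assumptions for illustration, not measured data.

# A minimal sketch, with hypothetical numbers, of how total cost of ownership
# can tip back toward private computing as workload volume grows.

PUBLIC_COST_PER_UNIT = 1.00     # assumed cost per workload-unit in public cloud
PRIVATE_FIXED_COST = 400_000    # assumed annual fixed cost of a private deployment
PRIVATE_COST_PER_UNIT = 0.35    # assumed marginal cost per workload-unit in-house

def public_cloud_cost(units: int) -> float:
    return units * PUBLIC_COST_PER_UNIT

def private_cost(units: int) -> float:
    return PRIVATE_FIXED_COST + units * PRIVATE_COST_PER_UNIT

for units in (100_000, 500_000, 1_000_000, 2_000_000):
    pub, priv = public_cloud_cost(units), private_cost(units)
    cheaper = "public" if pub < priv else "private"
    print(f"{units:>9,} units: public ${pub:>12,.0f}  private ${priv:>12,.0f}  -> {cheaper}")

Under these assumed inputs the crossover sits near 615,000 workload units per year: below it, public cloud is cheaper; above it, private computing wins. The real break-even depends on discounts, utilization and staffing costs, which is exactly why the question remains open.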

Friday, December 16, 2022

HPC or Data Storage: Rural Data Centers Could Go Either Way

It remains to be seen how high-performance computing, data storage, colocation and cloud computing use cases might change over the next decade, beyond the expectation that change will happen. Edge computing, for example, will disperse some compute resources. 


Storage might centralize further, shifting to new sites in remote, rural areas where power costs are low.


High-performance computing might take a couple of paths, depending on the use cases. Real-time apps are likely to need facilities that are near large population centers where such apps are required and where access to wide area and metro networks is plentiful.  


Applied Digital Corporation, for example, is building a specialized processing center, a five-megawatt facility next to the company's currently operating 100-MW hosting facility in Jamestown, North Dakota.


The new center is purpose-built for graphics processing units and designed to run high-performance computing applications, including natural language processing and machine learning.


The new 16,382-square-foot building is planned for energization in the first calendar quarter of 2023.


The hope is that new high-performance apps can be run at such remote data centers. Perhaps obviously, such apps would not be latency sensitive or especially designed to support real-time processing needs. 


It remains to be seen whether HPC-optimized data centers located in areas with very low cost energy, but located far from metro areas, could become a new niche within the cloud computing and data center industries. 


It might seem equally plausible that such areas might also become more important for bulk data storage, which likewise does not require real-time response.


Sunday, December 11, 2022

Who Really Wins New Revenue from Edge Computing?

One rule of thumb I use for determining whether any proposed new line of business makes sense for tier-one connectivity providers is whether the new line has potential to produce a minimum of $1 billion in annual revenues for a single provider in some definable time span (say, five years for a specific product).


By that rule of thumb, tier-one service providers might be able to create edge computing revenue streams that amount to as much as $1 billion in annual revenue for some service providers. But most will fail to achieve that level of return in the next five to seven years.


That is not to say "computing at the edge" will be a small business. Indeed, it is likely to account for a growing part of public cloud computing revenues, eventually. And that is a big global business, already representing more than $400 billion in annual revenues, including public cloud revenues; infrastructure spending to support cloud computing; and the value of business applications and associated consulting and services to implement cloud computing.


The leading public cloud computing hyperscalers themselves represent about $72 billion or more in annual revenues already. All the rest of the revenue in the ecosystem comes from sales of software, hardware and services to enable cloud computing, both public and private.




source: IoT Analytics


It is likely a reasonable assumption that most public edge computing revenue is eventually earned by the same firms leading public cloud computing as a service.


Perhaps service provider revenues from edge computing could reach at least $20 billion, in about five years. By that standard, multi-access edge computing barely qualifies as "something worth pursuing," at least for tier-one connectivity service providers.


In other words, MEC is within the category of products that offers reasonable hope of payback, but is not yet in the category of “big winners” that add at least $100 billion to $200 billion in global service provider revenues. 


Put another way, MEC is not “mobile phone service” or “home broadband.” Perhaps it will be as big as MPLS or SD-WAN. For tier-one connectivity providers, perhaps MEC is more important than business voice (unified communications as a service).


source: STL, KBV Research 


As with many other products, including Wi-Fi, SD-WAN, MPLS, 4G or 5G private networks, local area networks in general and  enterprise voice services, most of the money is earned by suppliers of software (business functionality) and hardware platforms, not end-user-facing services. 


The reason is that such solutions can be implemented on a do-it-yourself basis, directly by enterprises and system integrators, without needing to buy anything from tier-one connectivity providers but bandwidth or capacity. 


So one reason I believe other new connectivity services enabled by 5G lack the potential to move the industry to its next major revenue model is that none of those innovations is likely to produce much more than perhaps one percent of total service revenues for the typical tier-one service provider.


The opportunity for big public connectivity providers lies in use cases related to the wide area network rather than the domain of indoor and private networks. That is why the local area networks industry has always been dominated by infra providers (hardware platforms) and users who build and own their own networks (both enterprise and consumer). 


And most of the proposed “new revenue sources” for 5G are oriented towards private networks, such as private enterprise local area networks. Many of the other proposed revenue generators can be done by enterprises on a DIY basis (edge computing, internet of things). Some WAN network services--such as network slicing--attack problems that can be solved with DIY solutions.


Edge computing is a solution for some problems network slicing is said to solve, for example. 


None of the new 5G services--or new services in aggregate--is believed capable of replacing half of all current mobile operator revenues, for example. And that would be the definition of a “new service” that transforms the industry.


All of which suggests there is something else, yet to be discovered, that eventually drives industry revenue forward once mobility and home broadband have saturated. So far, nobody has a plausible candidate for that new service.


Edge computing might be helpful. So might network slicing, private networks or internet of things. But not even all of them together are likely to supply the revenue drivers the industry will need once home broadband and mobile service no longer produce at least half of industry revenues.


It already seems clear that others in the edge computing ecosystem--including digital infra providers and hyperscale cloud computing as a service suppliers--will profit most from edge computing.


Sunday, December 4, 2022

Size Should Correlate with Profit: for AWS and Alphabet, that is Clear

Firms with double the share of their closest competitor tend to lead their markets in profitability. The reason is the relationship between market share and profit margin or return on investment.


That is arguably true in the connectivity and data center markets as well. Alphabet's cloud profit margins are said to be quite low, in the single-digit range, while AWS margins are in the 61 percent range.


Since Microsoft has never published its profit margins from public cloud services, it is hard to say for certain that the expected pattern holds. 

Profit margin almost always is related to market share or installed base, at least in part because scale advantages can be obtained. Most of us would intuitively suspect that higher share would be correlated with higher profits. 


That is true in the connectivity and data center markets as well. 

source: Harvard Business Review 


But researchers also argue that market share confers market power, making leaders less susceptible to price predation from competitors. There also is an argument that the firms with the largest shares outperform because they have better management talent. PIMS researchers might argue that better management leads to outperformance; others might argue that outperformance attracts better managers, or at least those perceived to be “better.”


Without a doubt, firms with larger market shares are able to vertically integrate to a greater degree. Apple, Google, Meta and AWS can create their own chipsets, build their own servers, run their own logistics networks. 

source: Slideserve 


The largest firms also have bargaining power over their suppliers. They also may be able to be more efficient with marketing processes and spending. Firms with large share can use mass media more effectively than firms with small share.


Firms with larger share can afford to build specialized sales forces for particular product lines or customers, where smaller firms are less able to do so. Firms with larger share also arguably benefit from brand awareness and preferences that lessen the need to advertise or market as heavily as lesser-known and smaller brands with less share. 


Firms with higher share arguably also are able to develop products with multiple positionings in the market, including premium products with higher sales prices and profit margins. 


source: Contextnet 


That noted, the association between higher share and higher profit is stronger in industries selling products purchased infrequently. The relationship is weaker for firms and industries selling frequently-purchased, lower-value, lower-priced products, where buying an alternate brand poses little risk.


The relationships tend to hold whether firms are spending to gain share, mostly focused on keeping share, or harvesting products late in their product life cycles.

source: Harvard Business Review 


The adage that “nobody gets fired for buying IBM” or Cisco or any other “safe” product in any industry is an example of that phenomenon for high-value, expensive and more mission-critical products.


For grocery shoppers, house brands provide an example of what probably drives the weaker relationship between share and profit for regularly-purchased items. Many such products are actual or near commodities, where brand value helps but does not necessarily ensure high profit margins.


On the other hand, in industries with few buyers--such as national defense products--profit margin can be more compressed than in industries with highly-fragmented buyer bases. 


Studies such as the Profit Impact of Market Strategy (PIMS) program have been looking at this for many decades. PIMS is a comprehensive, long-term study of the performance of strategic business units in thousands of companies in all major industries.


The PIMS project began at General Electric in the mid-1960s. It was continued at Harvard University in the early 1970s, then was taken over by the Strategic Planning Institute (SPI) in 1975. 


Over time, markets tend to consolidate, and they consolidate because market share is related fairly directly to profitability.


One rule of thumb some of us use is that the profits earned by a contestant with 40-percent market share are at least double those of a provider with 20-percent share.


And profits earned by a contestant with 20-percent share are at least double the profits of a contestant with 10-percent market share.
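A stylized model shows why that doubling holds whenever fixed costs are significant. The sketch below uses purely hypothetical numbers: revenue and variable costs scale with share, while fixed costs (network, overhead) do not.

# Hypothetical illustration: profit more than doubles when share doubles,
# because fixed costs are spread over more revenue.

MARKET_REVENUE = 1_000      # total market revenue, arbitrary units
VARIABLE_COST_RATIO = 0.5   # variable cost as a fraction of revenue (assumed)
FIXED_COST = 30             # per-competitor fixed cost (assumed)

def profit(share: float) -> float:
    revenue = MARKET_REVENUE * share
    return revenue * (1 - VARIABLE_COST_RATIO) - FIXED_COST

for share in (0.10, 0.20, 0.40):
    print(f"{share:.0%} share -> profit {profit(share):.0f}")

# 10% share -> profit 20
# 20% share -> profit 70    (3.5x the 10% player)
# 40% share -> profit 170   (2.4x the 20% player)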


This chart shows that for connectivity service providers, market share and profit margin are related. Ignoring market entry issues, the firms with higher share have higher profit margin. Firms with the lowest share have the lowest margins. 

source: Techeconomy  


In facilities-based access markets, there is a reason for the rule of thumb that a contestant must achieve market share of no less than 20 percent to survive: access is a capital-intensive business with high break-even requirements.


At 20 percent share, a network is earning revenue from only one in five locations passed. Other competitors are getting the rest. At 40 percent share, a supplier has paying customers at four out of 10 locations passed by the network. 


That allows the high fixed costs to be borne by a vastly larger number of customers. That, in turn, means significantly lower infrastructure cost per customer.
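The arithmetic is easy to sketch. Assuming, purely for illustration, a build cost of $1,200 per location passed, the network cost allocated to each paying customer falls in direct proportion to the take rate:

# Hypothetical illustration: infrastructure cost per paying customer
# at different take rates on the same network.

LOCATIONS_PASSED = 100_000
COST_PER_PASSING = 1_200    # assumed build cost per location passed, USD

total_network_cost = LOCATIONS_PASSED * COST_PER_PASSING

for take_rate in (0.10, 0.20, 0.40):
    customers = LOCATIONS_PASSED * take_rate
    print(f"take rate {take_rate:.0%}: network cost per customer ${total_network_cost / customers:,.0f}")

# take rate 10%: network cost per customer $12,000
# take rate 20%: network cost per customer $6,000
# take rate 40%: network cost per customer $3,000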