Saturday, February 25, 2023

Higher Interest Rates a Negative for Data Center Assets, But Will Inevitably Produce Higher Demand and Valuations

All other things being equal, higher interest rates are a negative for real estate investment trusts and other businesses that rely on borrowed money to fuel growth. Rising rates tend to depress valuations for all real estate assets, so if rates remain higher for longer, both construction and acquisition activity in the data center industry could slow for a while.  


Of course, rarely in life are all other things equal. Nobody believes demand for data center colocation, computing as a service or internet-fueled data storage is going to slacken. So any temporary slowdown in facility construction eventually produces shortages that drive higher demand for new facilities, a reinflation of asset values and renewed interest in acquisitions. 


To be sure, in the interim, higher interest rates will force firms to take a harder look both at construction and acquisition activities. But underlying demand eventually requires more construction, more capacity, more investment. 


Private equity drove most of the 187 data center acquisitions in 2022, nearly matching the record set in 2021. Higher interest rates are expected to crimp deal flow in 2023, though, since borrowed money finances the private equity transactions that have fueled the data center acquisition spree.  


It is reasonable to expect a dip in deal flow while the cost of borrowed money stays high: acquisition multiples should contract a bit, deal flow should slow and deal sizes could shrink. 


On the other hand, underlying demand for data center services should continue to grow at a strong clip, which will eventually combine with slower build patterns to reignite multiple expansion, making the assets more attractive again even in the face of high interest rates. 


Demand for data center assets has many drivers, attracting interest from growth capital, buyout, real estate, and infrastructure investors. 


For growth investors, the attraction is higher expected revenue and above-average revenue growth rates. Buyout firms believe there are opportunities to create value by boosting the performance of underperforming assets. 


For many, the attraction is the perceived similarity of data center assets to other infrastructure assets purchased to diversify investment portfolios. Traditionally, those alternative investments have focused on real estate, airports, toll roads, seaports and energy utilities. Such assets offer two key perceived values: barriers to entry and predictable cash flow. 


But digital infrastructure now is perceived as a new asset class with those desired characteristics. In addition to data centers, fiber-to-the-home assets have now joined mobile cell towers as assets in that category. 


source: McKinsey 


Of course, there are risks as well. Some believe valuation premiums might not be warranted longer term. The mix of demand from enterprise customers shifting from operating their own facilities to relying on third-party data centers could change. 


Though the shift to cloud computing generally has boosted demand for third-party data center capacity, scale often makes ownership more attractive than leasing. We see this in enterprise infrastructure quite frequently.


At low usage volumes, renting often makes much more sense than owning facilities. At high usage, the economics often reverse, and ownership becomes more affordable than leasing capacity. The mix of spending in a hybrid cloud or multicloud environment could shift over time. 
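That crossover can be sketched with a simple break-even model. Every cost figure below is an illustrative assumption, not industry data: the point is only that leasing scales linearly while ownership carries large fixed overhead with lower per-unit cost.

```python
# Illustrative own-vs-lease break-even sketch (all numbers are assumptions).

def annual_lease_cost(kw, rate_per_kw_month=150):
    """Colocation lease cost scales linearly with capacity used."""
    return kw * rate_per_kw_month * 12

def annual_ownership_cost(kw, fixed_overhead=2_000_000, variable_per_kw_year=600):
    """Ownership carries big fixed overhead but a lower per-unit cost."""
    return fixed_overhead + kw * variable_per_kw_year

for kw in (200, 1_000, 5_000):
    lease, own = annual_lease_cost(kw), annual_ownership_cost(kw)
    cheaper = "lease" if lease < own else "own"
    print(f"{kw:>5} kW: lease ${lease:,.0f} vs own ${own:,.0f} -> {cheaper}")
```

With these hypothetical numbers the answer flips from "lease" at 200 kW to "own" at 5,000 kW, which is the whole argument in miniature: the crossover point, wherever it actually sits, is what scale moves an enterprise past.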


Hyperscalers, who drive much third-party data center demand, could slow their leasing of third-party assets and build their own facilities, as they have done in the past. On the other hand, the reverse should hold in expanding markets, where time-to-market considerations favor hyperscale app providers leasing capacity from third parties, at least until volume kicks in and the economics of owning versus leasing change for them as well. 


Deal volume has been growing since the mid-2010s. In the first nine months of 2022 there were about 150 data center mergers or acquisitions, says Synergy Research Group. 


In 2021, there were 209 data center deals, with an aggregate value of more than $48 billion, up some 40 percent from 2020, when the deals were worth $34 billion, according to Synergy Research Group. 


source: Synergy Research Group 


 In the first half of 2022, there were 87 deals, with an aggregate value of $24 billion. 


From 2015 to 2018, private equity buyers accounted for 42 percent of the deal value. Their share increased to 65 percent from 2019 to 2021 and to more than 90 percent in the first half of 2022.


While interest rates remain elevated, organic growth will be quite important as a driver of data center asset valuation. In the near term, asset values should be compressed, at least a bit. 


Of course, compressed valuations make those assets more attractive, since virtually everyone believes the long term need for data center capacity remains high. Lower interest rates will stimulate activity as well, but tighter supply will inevitably produce higher demand, even in the face of higher interest rates. 


Time is part of the equation. As existing supply is soaked up, shortages will develop that must be remedied. And so investment will climb again. It is not a new story.


Thursday, February 23, 2023

Flip a Coin: Both AI and Edge Computing Widely Expected to be Used in Asia-Pacific Firms in 2023

You might flip a coin to determine whether edge computing or artificial intelligence will be more widely deployed in Asia-Pacific organizations in 2023, according to results of a survey of regional business leaders by IDC.


Executives report identical "using or plan to use" intentions for edge computing and artificial intelligence in 2023.

ChatGPT interest is likely proof that “AI will become mainstream in 2023.” In the Asia-Pacific region, more than 88 percent of survey respondents say they are using, or plan to use, artificial intelligence or machine learning in the next 12 months. In the ASEAN+ region, some 91 percent say they will use, or plan to use, AI applications in the next year.  

source: IDC, AMD, Lenovo 




It might seem as though cloud computing adoption by enterprises should, by now, rival online ordering or the use of enterprise resource planning or customer relationship management software. That likely was already true in 2021. 


The IDC survey of executives in India, Japan, Korea, Indonesia, Australia, New Zealand, Singapore, Taiwan, Thailand, Hong Kong, Malaysia, and the Philippines also finds that 88 percent of respondent firms already are either using, or planning to use, edge computing in the next 12 months for business operations.


According to an analysis sponsored by the Organisation for Economic Co-operation and Development, “big data” analytics use is far lower than one might expect: in 2021, perhaps 17 percent of enterprises used that tool, compared to more than 40 percent using cloud computing in some way. 

sources: Lenovo, Impact Economist 


Use of artificial intelligence and internet of things began a sharp rise around 2020.


Sunday, February 19, 2023

Data Center Architecture Hinges on Connectivity: N-S as well as E-W

Some adaptations of data center architecture in the cloud computing era have to do with connectivity architecture more than with how servers are cooled, the types of processors and specialized chips used, or the way servers are clustered.


A cluster is a large unit of deployment, involving hundreds of server cabinets with top-of-rack (ToR) switches aggregated on a set of cluster switches. Meta has rearchitected its switch-and-server network into “pods,” in an effort to recast the entire data center as one fabric. 


source: Meta 


The point is to create a modularized compute fabric able to function with lower-cost switches. The fabric is then not limited by the port density of any one switch, and deployment costs fall because efficient, standard units are the building blocks. 
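The scaling logic amounts to back-of-envelope arithmetic. The rack and port counts below are illustrative assumptions, not Meta's actual design numbers; the point is that capacity grows by adding identical pods rather than by buying bigger switches.

```python
# Sketch: capacity of a pod-based fabric built from identical low-cost switches.
# All counts are illustrative assumptions, not Meta's actual design numbers.

def pod_capacity(tor_switches=48, servers_per_tor=40):
    """Servers in one pod: each ToR switch serves one rack of servers."""
    return tor_switches * servers_per_tor

def fabric_capacity(pods):
    """Total capacity scales linearly with pod count: add pods, not ports."""
    return pods * pod_capacity()

print(fabric_capacity(4))  # 4 pods of 48 racks x 40 servers each
```

Because each pod is self-contained, growing the fabric never requires a forklift upgrade of a chassis switch, only more of the same standard unit.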


That also leads to simpler management tasks and operational supervision as well. 


And though much of that architecture concerns “east-west” communications within any single data center, its value also hinges on the robustness of the “north-south” connections between data centers and the rest of the internet ecosystem. 

source: Meta 


In fact, it is virtually impossible to describe a data center architecture without reference to wide area and local area connectivity, using either the older three-tier or newer spine-leaf models. The point is that data centers require connectivity as a fundamental part of the computing architecture. 

source: Ultimate Kronos Group 


That will likely be amplified as data centers move to support machine learning operations as well as higher-order artificial intelligence operations. 


Still, in the cloud computing era, no data center has much value unless it is connected by high-bandwidth optical fiber links to other data centers and internet access and transport networks. 


“From a connectivity perspective, these networks are heavily meshed fiber infrastructures to ensure that no one server is more than two network hops from each other,” says Corning.
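That two-hop property can be expressed as a toy path model of a leaf-spine mesh, in which every leaf switch connects to every spine switch. The switch names are hypothetical, and the model ignores link redundancy and load balancing.

```python
# Toy path model of a leaf-spine mesh: every leaf connects to every spine,
# so any server-to-server path is leaf -> spine -> leaf (two network hops).
# Switch names are hypothetical.

def spine_leaf_path(src_leaf, dst_leaf, spine="spine-0"):
    """Any spine can carry the flow; real fabrics balance across all of them."""
    if src_leaf == dst_leaf:
        return [src_leaf]  # same-rack traffic never leaves the leaf switch
    return [src_leaf, spine, dst_leaf]

print(spine_leaf_path("leaf-3", "leaf-7"))  # -> ['leaf-3', 'spine-0', 'leaf-7']
```

The path length never depends on which two leaves are chosen, which is the uniformity Corning's "no more than two hops" description captures.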


The other change is a shift in the relative importance of east-west (inside the data center) and north-south (between the data center and the rest of the cloud) connections and data movement. As important as east-west intra-data-center traffic remains, communication across wide area networks assumes new importance in the AI era. 


The traditional three-tier (core-aggregation-access) architecture connecting switches and servers increasingly looks to be replaced by a two-tier spine-leaf design that reduces latency. In other words, in addition to what happens inside the data center, there will be changes in the connectivity network design outside it as well. 


source: Commscope


In the above illustration, the Entrance Room (ER) is the entrance facility to the data center.


The Main Distribution Area (MDA) holds equipment such as routers, LAN and SAN switches. The Zone Distribution Area (ZDA) is a consolidation point for all the data center network cabling and switches.


The Equipment Distribution Area (EDA) is the main server area where the racks and cabinets are located.

 

Connectivity may remain a separate business from cloud computing and the data center business, but all are part of a single ecosystem these days. Cloud computing requires data centers, and data centers require good connectivity.


How Will AI Change Data Center Architectures?

It is not so clear yet how increasing use of artificial intelligence such as ChatGPT could affect cloud computing and data center architectures. But more edge computing seems a reasonably safe bet.


But many of the other changes might be characterized as augmenting or increasing the value of trends we already can identify. 


“What goes in the racks” is one adaptation. More powerful servers to support high-performance computing seem an obvious inference, though that already is happening for other reasons. Likewise, greater use of parallel processing is likely, along with the use of customized or specialized servers designed to support machine learning operations. 


Those developments arguably are more tactical changes. 


It also is possible that more distributed workloads will be necessary, in part because data might be stored at more locations, including at edge locations. Again, that process already has been underway, driven by the need to support more low-latency processing operations.


Data gravity and edge or distributed locations therefore seem to be opposing trends that will have to be harmonized. 


And while energy consumption already is a big issue, the greater amount of processing makes sustainability even more important as AI operations proliferate. AI operations, being more intensive, also will require more energy, and create more heat, fueling a shift to liquid cooling as well. 


source: Meta 


That, at least, has prompted Meta to consider new data center designs built on liquid cooling. 


Some expect higher degrees of data center automation as well. And, of course, data centers are applying AI to support their own automation efforts and operations.  


As each data center in the past essentially took the form of an ecosystem, so AI operations might push data centers toward a mesh-of-locations concept, which arguably already has been the case. That data mesh concept includes federated governance, domain-oriented and decentralized data ownership and architecture, and much greater self-serve capability, says IBM.


So far, the shift to liquid cooling seems the most distinct change in data center design. Most of the other trends--faster processors, specialized processors, energy efficiency, automation, distributed computing, ecosystems and lower latency--were already underway for other reasons. 


Wednesday, February 8, 2023

Supporting Work and Life in the Cloud

It would be hard to overestimate the value of cloud computing and the internet, yet the changes those innovations continue to bring often are underestimated.


Sometimes an acquisition proves so vital that one might actually argue the acquired firm actually bought the acquirer. Or so quipped PCCW Global CEO Marc Halbfinger at his PTC’23 main stage session, talking about Console Connect. 


Though “network as a service” was among the key themes he addressed, Halbfinger emphasized the potential for network effects as enterprises, application providers, data centers and connectivity providers are able to dial up, on demand, connectivity to cloud-based resources globally. 


source: GetSix 


Some might view the automated platform as “infrastructure as a service.” It is that, allowing users to self-provision capacity and connections in granular fashion on a global basis. 


But it is the on-demand global cloud ecosystem that really represents the underlying value, Halbfinger said. Rather than providing simple connections between places, the platform enables enterprises to connect with each other, without intermediaries, and with applications, software, sensor network providers and multiple cloud services providers. 


All of which makes Console Connect hard to categorize. Is it an automated ordering system for network resources? Yes. Is it a form of “network as a service?” Again, yes. 


Is it a form of “zero touch” automated ordering, provisioning and settlements? True. Is Console Connect a platform? Yes, but a platform with a network. 


Is the fabric a way for enterprises, app providers and computing as a service suppliers to make connections? Certainly. Is Console Connect a sales or distribution channel for its users? Yes. 


And yet the words and concepts fail us. 


Think about the original vision of the internet: any-to-any information exchange globally by computers and their users. Think about the cloud: computing services and apps accessed globally over the internet. 


Think about the promise of cloud computing: on-demand, self service, resource pooling, elastic scaling, pay per use, ubiquity, resilience and security. 


source: Slideshare 


Console Connect enables cloud computing for enterprises and people that now live in the cloud.

Friday, February 3, 2023

Meta Gets Efficient, Even as it Ramps AI Compute Capabilities

For investors, the highlight of Meta’s fourth-quarter 2022 earnings report was the improvement in profit margin. For others it was the new emphasis on “efficiency,” including better use of capital investment, flattening the organization and moving faster to cut projects that are not showing nearer-term upside. 


“We expect capital expenditures to be in the range of $30-33 billion, lowered from our prior estimate of $34-37 billion,” said Susan Li, Meta CFO. “The reduced outlook reflects our updated plans for lower data center construction spend in 2023 as we shift to a new data center architecture that is more cost efficient and can support both AI and non-AI workloads.”


For some, that comment about “lower cost” data centers might have been interesting as well. 


One issue appears to be the difference between centers able to support artificial intelligence and all other workloads. 


Supercomputers might be one example of the change. Will we see more data centers optimized for AI workloads? And are such specialized facilities needed mostly to train models? If so, that implies non-real-time training can be done at specialized facilities, while inference runs closer to end users, at the edge. 


But Meta also is saying that the new data center architecture combines AI and standard workloads, which implies some new way of assigning functions and running the workloads. 


 Is a shift of compute towards the edge also part of the architectural shift? 


And some of the savings might come from a more modular approach to assets, which is not new, but might be operationally more important. 


“Along with the new data center architecture, we're going to optimize our approach to building data centers,” Li said. “So we have a new phased approach that allows us to build base plans with less initial capacity and less initial capital outlay, but then flex up future capacity quickly if needed.”