We might as well set expectations about how CxOs are going to evaluate edge computing: many will be disappointed, for several reasons. As with cloud computing generally, many of the outcomes CxOs expect are hard to measure: faster innovation, improved resilience, the ability to advance core company strategies.
Increased revenue and cost savings can be measured, but both are subject to so many powerful influences that edge computing might not move them in a clearly material way, at least at first.
[chart omitted; source: PwC]
Edge computing might allow better real-time monitoring of processes, but the value created by that monitoring might not be easy to measure. Edge computing might reduce wide area network costs, but how much do WAN costs actually contribute to product costs, revenue or profits?
CxOs arguably would be far happier if edge computing demonstrably enabled new business models, products and services, or powerfully reduced manufacturing or distribution costs. But those outcomes will be tough to demonstrate clearly, early on.
Patience, in other words, will be needed, as has been the case for most important information technologies over the past 50 years. The issue, of course, is that such lengths of time are beyond the meaningful horizon for any single CxO at any single company. In most cases, it will not advance a career to argue that big investments now will pay off in a decade, let alone 50 years.
The question of value already is emerging for cloud computing, a much more established trend.
It should come as no surprise that cloud computing payback lags CxO expectations in areas such as resilience, agility, decision making, innovation, customer experience, profits, talent recruitment and retention, costs and reputation.
All of those business outcomes are shaped by many inputs other than applied technology. And the general rule with any important new technology is that its value is not realized until core business processes are reshaped to take advantage of it. That is as likely to be true of cloud computing as of any other important new tool.
Any major shift in technology and related business processes takes time. So much time, in fact, that there often is a “productivity paradox,” where investments do not seem to make much difference in outcomes for a decade or more.
Nokia has noted that manufacturing productivity growth since the 1980s has been slight, in the range of one percent per year, despite all the information technology applied to manufacturing.
[chart omitted; source: PwC]
Despite the promise of big data, industrial enterprises are struggling to maximize its value. A survey conducted by IDG showed that “extracting business value from that data is the biggest challenge the Industrial IoT presents.”
Why? Abundant data by itself solves nothing, says Jeremiah Stone, GM of Asset Performance Management at GE Digital. At least one study suggests the same is true of broadband internet access.
The consensus view is that broadband access leads to higher productivity for businesses. But a study by Ireland’s Economic and Social Research Institute finds that while there are “small positive associations between broadband and firms’ productivity levels, none of these effects are statistically significant.”
“We also find no significant effect looking across all service sector firms taken together,” ESRI notes. “These results are consistent with those of other recent research that suggests the benefits of broadband for productivity depend heavily upon sectoral and firm characteristics rather than representing a generalised effect.”
“Overall, it seems that the benefits of broadband to particular local areas may vary substantially depending upon the sectoral mix of local firms and the availability of related inputs such as highly educated labour and appropriate management,” says ESRI.
Big waves of information technology investment have in the past taken quite some time to show up in the form of measurable productivity increases.
In fact, there was a clear productivity paradox when enterprises began to spend heavily on information technology in the 1980s.
“From 1978 through 1982 U.S. manufacturing productivity was essentially flat,” said Wickham Skinner, writing in the Harvard Business Review.
In fact, researchers have a name for this phenomenon: the Solow computer paradox, after economist Robert Solow’s 1987 observation that “you can see the computer age everywhere but in the productivity statistics.”
Here’s the problem: the paradox suggests that as more investment is made in information technology, worker productivity may go down instead of up.
Empirical evidence from the 1970s to the early 1990s fits the hypothesis.
Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.
When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.
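To put that gap in context, here is a rough back-of-the-envelope illustration (assuming, purely for the sake of the example, that those returns compound annually as productivity growth):

$$1.035^{20} \approx 1.99 \qquad \text{versus} \qquad 1.01^{20} \approx 1.22$$

In other words, a 3.5 percent annual gain roughly doubles productivity over two decades, while a one percent gain yields only about a 22 percent cumulative improvement, a gap large enough to explain the disappointment.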
This productivity paradox is not new. Information technology investments did not measurably improve white-collar productivity for decades.
To be sure, some argue the real issue is measurement: the gains are happening, but our tools cannot capture them. That argument will not win many supporters in the CxO suites.
Still, the disappointment is to be expected. It will take time to reap measurable benefits from cloud computing, 5G, edge computing, the internet of things or any other major new technology.