For cloud to meet its promise, individuals must be empowered to consume as required, to innovate without delay and to take chances without commitment. Flexible pricing models provide this freedom, but organizations must change their cultures to support the real cloud revolution.

The 451 Take
Electricity is essentially invisible; we only know it is there when we see the value it produces, be it the suction of a vacuum cleaner, the recharging of a mobile device or the chill of a refrigerator. Cloud infrastructure, too, is invisible, but we are a long way from making it plug-and-play and accessible to all those seeking to add value. The issue is that, left uncontrolled, costs can rapidly mount, which naturally scares those who are used to fully predictable and controlled expenditure. The solution is to optimize on an ongoing basis, thereby empowering individuals to innovate, to scale when needed, and to self-serve while being assured that every byte and CPU cycle delivers value. We are at an early stage, but there are signs that enterprises have an appetite for such a revolution. Service providers need to be the leaders of this journey.

Procurement maturity model
Just as no single venue is the right destination for all computing workloads, a single way of consuming IT is unlikely to satisfy most customers. Some applications will warrant the 'Lego bricks' flexibility and control of a set of primitives that can be put together in unique and powerful ways, while others favor the turnkey convenience of fully managed services – with infinite variations and combinations in between.

Either way, the processing power underlying the services delivered must be purchased and paid for by someone – the cloud provider, hardware vendor, managed service partner or systems integrator that's charging the enterprise for capabilities or expertise. This ongoing procurement process has a labor overhead related to ordering, provisioning and budgeting for the resources. Given that so many resources are now ordered online through cloud APIs – and automatically reconciled with billing systems in the back end – this overhead should be diminishing, but culture and complexity are holding back fully automated, flexible and ongoing procurement. Traditional procurement departments and processes based on depreciable capital assets and enterprise license agreements are impeding the use of consumption-based models.

A maturity model offers a view of how procurement practices are changing and which milestones – in terms of people, process, technology and strategy – must be reached on the way to infrastructure invisibility. At level 1, we are still in the traditional world of IT procurement: fixed and inflexible, capital-led, manually procured. Level 5 represents the fully automated utility: not just pay-as-you-go, but optimized-as-you-go – dynamically purchased, intelligently forecasted, automatically provisioned. Level 1 has a high procurement overhead, while level 5 is invisible – it just works, with little or no human intervention. We believe most companies today are at level 3 or lower.

Proceeding from level 1 to 5 requires a culture shift, where users are empowered to consume what they want if it adds business value. For cloud to meet its promise, it must be plug-and-play. Enterprises should look forward and plan to increase their maturity using this model; service providers must support enterprises in this objective. Reassurance that a bill is as low as possible is of huge value to cost-conscious CIOs, especially if this can be done across multiple clouds, applications and technologies.

Figure 1. Cloud-procurement maturity model
Where are we now?
It's safe to say that most enterprises are closer to the beginning of this transition than to the end. Server virtualization has been in widespread use for 20 years, but many organizations still rely on legacy physical infrastructure (including mainframes) for some mission-critical applications.

Those who use cloud are using pay-as-you-go pricing, perhaps alongside other instruments, such as AWS's Reserved Instances or a renewable monthly contract with an 'overdraw' facility. Naturally, CIOs and security teams are concerned about spiraling costs or data leakage – but this is precisely the point. The ideal cloud would optimize costs and control risk while empowering users to add value in an ad hoc manner. Just as a user can spontaneously plug in a desk lamp to be more productive, a step change is needed to give cloud the same level of flexibility.
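The trade-off between these instruments comes down to simple break-even arithmetic: pay-as-you-go tracks actual usage, while a reserved commitment amortizes an upfront fee in exchange for a lower hourly rate. A minimal sketch, using invented prices rather than actual AWS rates:

```python
# Illustrative break-even comparison between on-demand and reserved pricing.
# All rates and fees below are hypothetical, not real provider prices.

HOURS_PER_MONTH = 730

def monthly_cost_on_demand(hourly_rate: float, utilization: float) -> float:
    """Pay-as-you-go: cost scales with the hours actually used."""
    return hourly_rate * HOURS_PER_MONTH * utilization

def monthly_cost_reserved(upfront: float, term_months: int,
                          hourly_rate: float) -> float:
    """Reserved: amortized upfront fee plus a discounted rate for every
    hour in the term, whether the instance is busy or idle."""
    return upfront / term_months + hourly_rate * HOURS_PER_MONTH

on_demand_rate = 0.10   # $/hour, hypothetical
reserved_rate = 0.06    # $/hour, hypothetical
upfront = 200.0         # $ upfront for a one-year commitment, hypothetical

for utilization in (0.25, 0.50, 0.75, 1.00):
    od = monthly_cost_on_demand(on_demand_rate, utilization)
    rs = monthly_cost_reserved(upfront, 12, reserved_rate)
    cheaper = "reserved" if rs < od else "on-demand"
    print(f"utilization {utilization:.0%}: on-demand ${od:.2f}, "
          f"reserved ${rs:.2f} -> {cheaper}")
```

At low utilization the pay-as-you-go instrument wins; near-constant usage favors the commitment – which is exactly the calculation an optimization layer must repeat continuously as workloads change.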

As shown by the avalanche of vendors and providers touting their hybrid cloud capabilities, the shift is not happening smoothly or in wholesale fashion. Companies such as GE and Juniper that have declared themselves 'all-in on cloud' are the exception rather than the rule, and the majority of firms taking advantage of multiple clouds will need cost-optimization tools to help manage the resulting complexity.

Most enterprises are still getting used to opex billing at this stage, and have yet to confront the optimization problem that stands between them and further cost reductions. The Cloud Price Index Cloud Tracker detected the addition of over 50,000 SKUs to AWS's already-huge 300,000-strong product manifest in November 2017. How can anyone optimize with such inherent complexity? Tools are required. Machine learning and artificial-intelligence techniques have begun to impact pricing and procurement – being applied to infrastructure management to determine which services from which vendors result in the optimal combination of performance and price for a given workload. 451 Research's Voice of the Enterprise research shows that only 14% of organizations say machine learning is important for managing their cloud computing environments today (although 24% consider machine-learning skills to be acutely lacking).
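To see why tooling is required, consider even the most naive form of the problem: find the cheapest catalog entry that meets a workload's resource floors. The sketch below uses a four-entry invented catalog; scale it to hundreds of thousands of SKUs, multiple pricing instruments and shifting workload profiles, and manual selection becomes hopeless.

```python
# Toy "cheapest SKU that fits" search. The catalog entries and prices are
# invented for illustration; real catalogs run to hundreds of thousands of SKUs.

from dataclasses import dataclass

@dataclass
class Sku:
    name: str
    vcpus: int
    memory_gb: float
    hourly_price: float

CATALOG = [
    Sku("gp.small", 2, 4.0, 0.04),
    Sku("gp.medium", 4, 8.0, 0.09),
    Sku("gp.large", 8, 16.0, 0.17),
    Sku("mem.large", 4, 32.0, 0.15),
]

def cheapest_fit(catalog, min_vcpus: int, min_memory_gb: float):
    """Return the lowest-priced SKU that satisfies both resource floors,
    or None when nothing in the catalog fits."""
    candidates = [s for s in catalog
                  if s.vcpus >= min_vcpus and s.memory_gb >= min_memory_gb]
    return min(candidates, key=lambda s: s.hourly_price, default=None)

best = cheapest_fit(CATALOG, min_vcpus=4, min_memory_gb=16.0)
print(best.name if best else "no SKU fits")
```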

Cloud and managed service providers, however, are moving ahead, and optimization is becoming a value-add. The Cloud Price Index has logged a flurry of activity in cloud cost optimization, with service providers such as Google, Microsoft and Oracle releasing new flexible purchasing models in a bid to take on AWS as buyers become increasingly cost-aware. Many MSPs offer cost-optimization services. More recently, the hyperscalers have been cutting prices on GPU instances and rolling out more sophisticated machine-learning and AI tools, not only to attract developers, but also to see what they create for optimizing multi-cloud usage and workload placement. Several stand-alone optimization companies have also hit the market. The advantage of third-party optimization is that users are empowered to consume, but costs aren't wasted on unused resources.

As often happens, the technology for evolving pricing models appears to be running ahead of people, processes and strategy. Showback and chargeback are already used by about half of organizations, according to 451 Research's VotE data, and among the largest companies (those with revenue over $1bn), 46% use chargeback and 21% use showback. Cloud providers, system suppliers and software vendors have the metering technology to measure usage in increasingly granular increments – whether based on volume, number of transactions or compute nodes in service – and the machine intelligence to apply that data to price-engineer applications. Enterprises are becoming more flexible and want to empower users, but the next step is optimizing this usage.
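Mechanically, showback and chargeback are the same roll-up: metered usage records are aggregated per cost center and priced against a rate card; chargeback bills the result, showback merely reports it. A minimal sketch with invented departments, metrics and rates:

```python
# Sketch of a chargeback calculation: metered usage records rolled up per
# department against a simple rate card. All names and rates are hypothetical.

from collections import defaultdict

RATE_CARD = {"compute_hours": 0.08, "storage_gb_month": 0.02}  # $ per unit

usage_records = [
    {"dept": "marketing", "metric": "compute_hours", "quantity": 1200},
    {"dept": "marketing", "metric": "storage_gb_month", "quantity": 500},
    {"dept": "finance", "metric": "compute_hours", "quantity": 300},
]

def chargeback(records, rates):
    """Aggregate metered usage into a per-department bill."""
    bills = defaultdict(float)
    for rec in records:
        bills[rec["dept"]] += rec["quantity"] * rates[rec["metric"]]
    return dict(bills)

print(chargeback(usage_records, RATE_CARD))
```

The granularity the providers now meter – per transaction, per node, per volume – simply adds rows and rate-card entries; the optimization step the text calls for is deciding how to change the usage those rows record.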

Public cloud consumption models move on-premises, provisioning expertise moves off
The cross-fertilization of consumption models – with public-cloud-like pay-as-you-go billing finding its way on-premises while predictable, longer-term payment plans take the form of reserved instances and service tiers for public IaaS – points to a larger trend toward invisible infrastructure. In that world, the source of compute power will be little more than an afterthought, with information and insight shifting from purpose-built hardware and software to functions expressed primarily in terms of business needs.

Attachment to traditional IT models reflects a desire for control, ownership, and visibility of devices and data by IT stakeholders, but nimbler technologies have encouraged individuals to use self-service resources – first in bring-your-own-device policies (compounded by near-universal mobile engagement) and then in shadow IT enabled by pay-as-you-go public cloud. Vendors responded with models offering shorter capacity-planning horizons, lower capex commitment and faster technology refreshes, providing a low-risk way to shift attention from infrastructure management and focus on wringing business value from data and applications.

Contributing to the shift are US FASB and IASB rule changes that will affect the way leases (e.g., for datacenter capacity) can be accounted for. Previously, leases could be charged as an operating expense rather than as capital expenditure on balance sheets, allowing systems integrators, outsourcers and service providers to operate 'asset light' models, where datacenter capacity was completely outsourced. The new FASB/IASB rules mean that, if today's leased assets are still leased assets by 2019, they will have to be capitalized on the balance sheet, instead of recorded as operating expenses.

In the new definition of a lease, an identified asset plus the right to control that asset constitutes a lease that must be recorded on the balance sheet. In these cases, systems integrators and enterprises will have a significant incentive to optimize utilization and eliminate overprovisioning in order to guard against having to record unused assets on their balance sheets. Operational data and expertise gained from years of multi-cloud provisioning and the application of machine learning to discover and eliminate the causes of overcapacity will be a factor here.
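The first step in eliminating overcapacity is detecting it. A simple threshold check over utilization telemetry, sketched below with invented assets and samples, is the kind of signal a learning system would refine with seasonality and forecasting:

```python
# Toy overcapacity check: flag assets whose average utilization falls below
# a threshold, candidates for rightsizing or release before lease accounting
# puts them on the balance sheet. Asset data and threshold are illustrative.

assets = {
    "rack-01": [0.65, 0.70, 0.60],   # utilization samples (fraction of capacity)
    "rack-02": [0.05, 0.08, 0.04],
    "rack-03": [0.40, 0.35, 0.45],
}

def underutilized(samples_by_asset, threshold=0.20):
    """Return, sorted by name, the assets whose mean utilization is below
    the threshold."""
    return sorted(
        name for name, samples in samples_by_asset.items()
        if sum(samples) / len(samples) < threshold
    )

print(underutilized(assets))
```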

Applying machine intelligence to determine best execution venue
Cloud providers, managed service providers and systems integrators have been collecting and deploying data to train models for examining workloads and determining which of the many available options represent customers' best execution venues to meet their performance, cost, contractual and other requirements. As the worlds of outsourcing, hosting, managed services and cloud converge, the possibilities available to customers are growing exponentially. Independent software vendors have stepped in to create and tune provider-agnostic decision engines that discover existing IT resources and continually apply learning algorithms to indicate where more cost-effective configurations are available within user-defined constraints.
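Stripped of the machine learning, the core of such a decision engine is constrained selection: filter the candidate venues by user-defined requirements, then rank the survivors on cost. A minimal, provider-agnostic sketch – venue data and constraint names are invented for illustration:

```python
# Minimal "best execution venue" chooser: filter venues by a latency budget
# and an optional region requirement, then take the cheapest survivor.
# Providers, prices and latencies below are hypothetical.

venues = [
    {"provider": "cloud-a", "region": "us-east", "price": 0.095, "latency_ms": 20},
    {"provider": "cloud-b", "region": "eu-west", "price": 0.080, "latency_ms": 95},
    {"provider": "cloud-c", "region": "us-east", "price": 0.110, "latency_ms": 12},
]

def best_venue(options, max_latency_ms, required_region=None):
    """Cheapest venue within the latency budget (and region, if given),
    or None when no venue satisfies the constraints."""
    eligible = [v for v in options
                if v["latency_ms"] <= max_latency_ms
                and (required_region is None or v["region"] == required_region)]
    return min(eligible, key=lambda v: v["price"], default=None)

choice = best_venue(venues, max_latency_ms=50, required_region="us-east")
print(choice["provider"] if choice else "no venue meets constraints")
```

A production decision engine replaces the static price and latency fields with learned predictions per workload, and re-evaluates continuously as offers and usage change.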

The most sophisticated IT users may operate multiple supplier relationships independently, but even those with contract negotiation expertise will likely struggle with the complexity of mechanisms available, especially in relation to cloud service pricing and delivery. This is why best execution tools, cloud service brokers, integrators and marketplaces using machine intelligence will play an increasingly important role when it comes to assessing and accessing available venues.

The crux of all of this is that old-fashioned IT procurement could be holding back the business and its revenue. It's a balance of risk and reward. Empowering individuals might increase cloud spend, but there are huge rewards up for grabs, and companies that are slower to act because they lack flexibility will suffer. If cost and security can be intelligently and automatically optimized and controlled across multiple clouds, the risk can be mitigated and is therefore worth taking. The procurement maturity model can help enterprises understand this balancing act between cost and value, opportunity and risk. After all, to the victor belong the spoils.

Owen Rogers
Research Director, Digital Economics

As Research Director, Owen Rogers leads the firm's Digital Economics Unit, which serves to help customers understand the economics behind digital and cloud technologies so they can make informed choices when costing and pricing their own products and services, as well as those from their vendors, suppliers and competitors.

William Fellows
Co-Founder & Research VP

William Fellows is a cofounder of The 451 Group. As VP of Research, he is responsible for the Cloud Transformation Channel at 451 Research. This Channel provides a point of intellectual convergence for 451 Research around cloud computing, in much the same way that the industry is converging on cloud from all points.

Jean Atelsek
Analyst, Cloud Price Index

Jean Atelsek is an analyst for 451 Research’s Digital Economics Unit, focusing on cloud pricing in the US and Europe.
