

Moving from Allocation to Consumption: Overcoming Cloud Waste for a Greener Public Cloud

Posted by Bobby Allen and Eric Senunas on Apr 22, 2021 8:00:00 AM

Some six Earth Days ago, in 2015, Turbonomic began a campaign to help the world reduce the amount of energy wasted in delivering digital experiences to the planet. In the intervening years, the world has seen some progress – at least compared to earlier projections that data centers would come to consume one-fifth of all global electricity produced.

Data Center Energy Consumption Leveling Out as Workloads Grow
Indeed, the International Energy Agency found that while internet traffic tripled and data center workloads more than doubled, the consumption of global electricity by data centers remained flat:

Some of this was due to efforts by our customers and others to become smarter about how they manage workloads in their data centers, reducing electricity-wasting over-provisioning by matching infrastructure supply with application demand in real time. But perhaps more of the progress was due to the accelerated march of enterprises to the public cloud – and the drive by public cloud providers to become significantly more energy efficient.

As one article noted, while public clouds have a strong profit incentive to drive energy efficiency, this is less true for private data centers:

“Corporate data centers, on the other hand, often don’t have such incentives, their managers are rewarded only for maintaining uptime, not for doing it efficiently. They are notorious for not only being designed and operated inefficiently but also for not having their energy consumption closely tracked – or tracked at all – by their managers.”

We see this in our own interactions with our customers – while more corporations are becoming conscious of their role in combating climate change and are writing that role into corporate policy, only about one-third have set the ambitious goals needed, and few of those initiatives have reached the level of corporate data center management. Some large, well-known corporations like Apple, however, are beginning to lead the way.

Thus, the rise of public cloud and private cloud architectures may enable more immediate savings. The progress of this march to the public cloud is clearly visible in this chart from the IEA:

Electrical consumption by traditional data center architectures has declined by over 70% due to this shift, as public clouds take an accelerating share of workloads. The impact should not be underestimated: even as the number of instances has exploded over the past decade, the shift has helped keep energy consumption in check.

Overall energy consumption in support of these workloads tells an even better story, thanks to efficiency improvements in cooling and other supporting data center tasks:

The progress in making data centers more energy efficient is highly impactful, due to the unique nature of electrical generation and the realities of the electric grid.

The Power of the Negawatt
Energy efficiency is incredibly important in mitigating the impacts of climate change because of the difference between source energy and site energy – in other words, every watt consumed by a data center is not the only watt contributing to climate change. This reality is best illustrated by this graphic, courtesy of Schneider Electric:

As this graphic illustrates, the Negawatt concept is incredibly powerful. Every watt that isn't wasted in a data center avoids up to 3 watts of fossil-fuel source energy that never needs to be generated – thus the carbon savings are trebled with every watt of cloud waste prevented.
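To make that math concrete, here is a minimal sketch in Python, assuming the roughly 3:1 source-to-site ratio shown in the Schneider Electric graphic (the exact ratio varies by grid mix and is an illustrative assumption here):

```python
# Minimal sketch: converting site energy saved inside the data center into
# source energy avoided at the power plant. The ~3:1 source-to-site ratio
# follows the Schneider Electric graphic above; real ratios vary by grid mix.
SOURCE_TO_SITE_RATIO = 3.0  # assumed illustrative value

def source_energy_avoided_kwh(site_kwh_saved: float) -> float:
    """Source energy (kWh) that never needs to be generated when the given
    amount of site energy is saved inside the data center."""
    return site_kwh_saved * SOURCE_TO_SITE_RATIO

# A negawatt example: saving 1,000 kWh at the rack avoids roughly
# 3,000 kWh of fossil-fuel source energy upstream.
print(source_energy_avoided_kwh(1_000))  # -> 3000.0
```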

To help wrap our minds around the scale here, a terawatt is one trillion watts, so a terawatt hour (TWh) is one trillion watt hours of energy. As noted above, data centers consume some 200 TWh of electricity per year.

This infographic illustrates what one kilowatt hour can power – keep in mind that data centers consume over 200 billion times this amount each year:

 
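The arithmetic behind that comparison is simple unit conversion; a quick sketch, using the 200 TWh annual figure cited above:

```python
# Unit arithmetic behind the "200 billion times" comparison: data centers
# consume roughly 200 TWh per year, and 1 TWh = 10^9 kWh.
TWH_CONSUMED_PER_YEAR = 200
KWH_PER_TWH = 10**9

kwh_per_year = TWH_CONSUMED_PER_YEAR * KWH_PER_TWH
print(f"{kwh_per_year:,} kWh per year")  # -> 200,000,000,000 (200 billion)
```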

The problem is that the progress made through improvements in cooling and floor space design represents much of the low-hanging fruit, and it is beginning to reach the limits of the efficiency that can be wrung out of data centers themselves. To really reduce the climate impact of the explosion in smartphones and the cloud, we must focus on the workload and strive to end cloud waste. (Infographic courtesy of Direct Energy)

The Problem of Cloud Waste
While the public cloud has become far more efficient in powering workloads (increasingly with green energy such as wind and solar), the corporations taking advantage of the public cloud haven’t been nearly as fastidious:

As estimated by Gartner, corporations will waste over $26 billion in cloud resources in 2021 alone. Behind every wasted dollar are wasted watts – and, per the source-energy math above, every wasted watt worsens the impacts of climate change three times over.

The largest culprit is idle resources – effectively leaving the workload lights on while no one is home. The next largest area of waste comes from cloud instances sized larger than necessary – the equivalent of building an NFL stadium for a high school football game. One estimate pegged direct energy consumption by servers at 40,000 gigawatt hours – 40 trillion watt hours – in 2020, with 20 trillion or more watt hours wasted every year. That's the carbon equivalent of over 3 billion passenger miles driven annually and would require 1.7 million acres of trees to capture.


Graphics: EPA Carbon Calculator
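Conversions like these follow the pattern of the EPA's greenhouse gas equivalencies calculator: electricity to CO2, then CO2 to everyday equivalents. Here is a minimal sketch; the factor values are illustrative assumptions, so substitute the EPA's current published factors to reproduce official figures:

```python
# Sketch of EPA-style greenhouse gas equivalency conversions. All factor
# values below are illustrative assumptions, not official EPA figures.
CO2_TONS_PER_KWH = 7.0e-4           # assumed: metric tons CO2 per kWh generated
CO2_TONS_PER_VEHICLE_MILE = 4.0e-4  # assumed: metric tons CO2 per passenger-vehicle mile
CO2_TONS_PER_FOREST_ACRE = 0.84     # assumed: metric tons CO2 sequestered per acre-year

def equivalencies(wasted_gwh: float) -> dict:
    """Translate wasted electricity into CO2 and everyday equivalents."""
    kwh = wasted_gwh * 1_000_000    # 1 GWh = 10^6 kWh
    tons_co2 = kwh * CO2_TONS_PER_KWH
    return {
        "metric_tons_co2": round(tons_co2),
        "passenger_miles": round(tons_co2 / CO2_TONS_PER_VEHICLE_MILE),
        "acres_of_forest": round(tons_co2 / CO2_TONS_PER_FOREST_ACRE),
    }

# Purely illustrative: the equivalents for a single wasted gigawatt hour.
print(equivalencies(1))
```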

The reality is that the cloud isn't some magic box that takes your apps and makes them elastic. In order to leverage the elastic nature of the cloud, you must either re-design your apps to take advantage of specific cloud services (and perhaps become locked into one cloud), write them from scratch to be cloud-native, or leverage AI to match cloud supply with application demand and make your existing apps elastic. If you don't, you'll still be allocating – guessing at – the resources needed, and those guesses produce waste that is billed by the minute.

In either case, this waste can be stopped without negative impact to the customer experience – but only by switching our mindset from the traditional model of resource allocation to a cloud-friendly model that matches resource consumption to real-time application demand.

Moving from Allocation to Consumption Models
Traditionally, corporate data centers were resourced using what IT has called an “allocation” model: IT staff estimate future data center demand and then purchase servers, storage and network equipment to meet that demand, plus a margin for error, given that adding incremental capacity could take months.

The dirty secret of this model is that the word “allocation” is just a technical term for what is really going on here: guessing. And since few businesses can accurately predict future demand (in normal years, let alone years like 2020), the margin of error IT builds in can be quite extreme. That was understandable, however: IT was seldom fired for over-spending on equipment, often because it was planned in the annual budget, but would face the wrath of the CEO if under-estimation caused application response time delays or outages leading to revenue loss.

As a result, server utilization has historically ranged from the low teens to nearly 50% in even the most efficient data centers – effectively wasting the vast majority of the energy generated to power our data centers over the past decade. Perhaps due to this realization, many corporate executives have decided to outsource the problem to public cloud vendors – hence the accelerating growth in cloud services – but have done so without a concomitant shift in mindset.
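A rough sketch of why low utilization is so costly, assuming servers draw about 60% of their peak power even when idle (an illustrative value; actual idle draw varies by hardware):

```python
# Rough sketch: the share of a server's energy that does no useful work.
# Power is modeled as an idle baseline plus a utilization-proportional part;
# the 60% idle draw is an assumed, illustrative value.
def wasted_fraction(utilization: float, idle_draw: float = 0.6) -> float:
    """Compare actual power draw to an ideal, perfectly proportional server."""
    actual = idle_draw + (1 - idle_draw) * utilization
    ideal = utilization  # a hypothetical server whose draw tracks its work
    return 1 - ideal / actual

for u in (0.13, 0.30, 0.50):
    print(f"{u:.0%} utilized -> ~{wasted_fraction(u):.0%} of energy wasted")
```

Under this simple model, a server running at 13% utilization wastes roughly 80% of the energy it draws – which is why the shift described below matters so much.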

To properly migrate to the cloud, it isn't enough to just move workloads; you have to transform the thinking of your organization to match the consumption model of the cloud – and those executives failing to do so will pay for their error for every minute that each workload runs in the cloud, with our global environment paying a price 3x worse.

The problem is that this transformation isn't easy. In fact, it is quite a bit beyond human scale. Even a small migration of 500 workloads faces at least seven considerations in choosing a cloud instance (CPU, memory, storage, etc.), which yields 1,486,071,034,734,000 possible combinations.
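To get a feel for why the search space explodes, here is a toy sketch; the option counts per consideration are hypothetical, chosen only to show the shape of the math:

```python
# Toy sketch of the cloud-sizing search space. The seven considerations come
# from the article; the option counts per consideration are hypothetical.
considerations = {
    "cpu": 10, "memory": 10, "storage": 8, "storage_iops": 6,
    "network": 5, "region": 20, "instance_family": 25,
}

per_workload = 1
for name, option_count in considerations.items():
    per_workload *= option_count

print(f"{per_workload:,} combinations per workload")  # 12,000,000 here
# Across 500 workloads the joint space is per_workload ** 500 -- a number
# thousands of digits long, far beyond any spreadsheet.
```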

Try having your smartest and most expensive staff member do that with a spreadsheet…

Rather than frying your staff, the best way to transform to a consumption mindset is to focus on the app response time your customers expect and then leverage AI to ensure that the cloud resources are available – no more and no less – to assure that end-user experience. By focusing on the customer and their expectations, you’ll end the flawed guesswork of allocation; by leveraging AI, you’ll empower your IT staff to deliver the flawless experience your customers demand.
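What does consumption-based thinking look like in practice? Here is a minimal sketch, with a hypothetical instance catalog and a simple cheapest-fit rule; real AI-driven tooling analyzes far more dimensions, continuously:

```python
# Minimal sketch of consumption-based rightsizing: pick the cheapest instance
# that covers observed peak demand plus headroom, rather than guessing up
# front. The catalog and headroom value are hypothetical.
from dataclasses import dataclass

@dataclass
class InstanceType:
    name: str
    vcpus: int
    memory_gib: int
    hourly_cost: float

CATALOG = [
    InstanceType("small", 2, 4, 0.05),
    InstanceType("medium", 4, 8, 0.10),
    InstanceType("large", 8, 16, 0.20),
]

def rightsize(peak_vcpus: float, peak_memory_gib: float,
              headroom: float = 1.2) -> InstanceType:
    """Cheapest instance covering observed peak demand plus safety headroom."""
    need_cpu, need_mem = peak_vcpus * headroom, peak_memory_gib * headroom
    for itype in sorted(CATALOG, key=lambda i: i.hourly_cost):
        if itype.vcpus >= need_cpu and itype.memory_gib >= need_mem:
            return itype
    raise ValueError("no catalog entry large enough")

# A workload peaking at 1.5 vCPUs and 3 GiB fits on "small" -- allocation
# guesswork might have reserved a "large" just in case.
print(rightsize(1.5, 3.0).name)
```

The point of the sketch is the decision rule: sizing flows from measured consumption, not from an up-front allocation guess.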

In making this shift, you also accomplish something else: you'll stop wasting cloud resources. You'll begin saving the billions flushed away on under-used or unused resources in AWS, Azure or Google Cloud. And perhaps most importantly, you'll help reduce the carbon emitted to generate the electricity powering these data centers – and the millions of tons of coal that don't need to be burned.

It’s up to you. Every swipe of every customer on your company’s digital assets has an impact – and can either remain part of the problem or become part of the solution.

We don’t have six more Earth Days to wait…
