Powering the Cloud: Energy Trends in Datacenters

Travis Lanham
November 24, 2017

Submitted as coursework for PH240, Stanford University, Fall 2017

Introduction

Fig. 1: Inside a datacenter. (Source: Wikimedia Commons)

One of the most impactful recent trends in enterprise technology is the migration from private data centers to shared public ones. This trend is driven by the increasing complexity of digital applications as well as by the cost savings available through economies of scale and resource multiplexing. Public cloud vendors like Amazon Web Services, Microsoft Azure, and Google Cloud Platform offer services ranging from raw storage and compute to high-level databases and managed services.

Providing these services requires large-scale data centers that consume significant amounts of energy to serve digital applications to billions of users across the globe. Although new users connect to digital services every day and applications demand steadily increasing data storage and processing, several technological and economic trends have made existing data centers more efficient and slowed the growth of data center energy consumption.

Public cloud adoption is a large factor in this energy trend: more companies now purchase digital services that run multiplexed on hardware shared among many customers, achieving better utilization. An indication of the tremendous size of modern data centers can be seen in Fig. 1; each row in the image consists of racks of servers, each rack in turn holding dozens of high-performance servers. The growing centralization of digital services, from small private data centers to large shared public cloud facilities, allows public cloud vendors to take advantage of higher utilization, best practices, and economies of scale to reduce energy usage.

Moving to the Cloud

Modern data centers, especially those operated by public cloud providers, are significant complexes: typically the size of several football fields, a single facility can host on the order of 400,000 compute servers. In 2014, U.S. data centers consumed an estimated 70 billion kWh, approximately 2% of total U.S. electricity consumption. [1]
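The 2% figure can be checked with back-of-the-envelope arithmetic. The total U.S. electricity figure used below (roughly 3.9 trillion kWh of retail sales in 2014) is an assumption supplied for illustration; it does not appear in the original report.

```python
# Rough check of the ~2% share cited from [1].
# us_total_kwh is an assumed figure (~3.9 trillion kWh, 2014), not from [1].

datacenter_kwh = 70e9      # estimated U.S. data center consumption, 2014 [1]
us_total_kwh = 3.9e12      # assumed total U.S. electricity consumption, 2014

share = datacenter_kwh / us_total_kwh
print(f"Data center share of U.S. electricity: {share:.1%}")
```

This works out to roughly 1.8%, consistent with the "approximately 2%" stated above.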

Several technological changes have given rise to public cloud computing, most notably server virtualization, which allows a single physical CPU to be multiplexed among multiple tenants and thereby raises utilization. CPUs in public cloud data centers run at almost three times the utilization of those in typical private data centers, so fewer CPUs are needed to support the same workloads. [2]
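The effect of higher utilization on server count can be sketched with hypothetical numbers. The workload size, per-server capacity, and the 15% vs. 45% utilization figures below are illustrative assumptions; the source only states that cloud utilization is almost three times higher.

```python
import math

def servers_needed(workload_units, capacity_per_server, utilization):
    """Servers required to carry a fixed aggregate workload when each
    server runs at the given average utilization (hypothetical model)."""
    return math.ceil(workload_units / (capacity_per_server * utilization))

workload = 10_000   # abstract units of compute demand (assumed)
capacity = 10       # units a fully loaded server can supply (assumed)

private = servers_needed(workload, capacity, 0.15)  # assumed private-DC utilization
cloud = servers_needed(workload, capacity, 0.45)    # ~3x higher, per [2]

print(private, "servers privately vs.", cloud, "in the cloud")
```

Tripling utilization cuts the required fleet, and hence the idle power draw, by roughly a factor of three for the same delivered work.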

In addition to these utilization gains, public cloud operators can apply best practices and domain expertise to improve efficiency beyond what is possible for smaller private data centers. Algorithmic techniques such as machine learning are also used to optimize cooling and schedule workloads efficiently, yielding significant energy savings. [3] Shehabi et al. anticipate that the shift to public cloud "hyperscalers" (massive-scale data centers) could yield energy savings of 25% by 2020 relative to baseline scenarios. [1]
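The core idea behind energy-aware scheduling is consolidation: pack jobs onto as few servers as possible so that idle machines can be powered down. The greedy first-fit-decreasing heuristic below is a minimal illustration of that idea with made-up job sizes, not the machine learning approach of [3].

```python
# Sketch of workload consolidation via first-fit-decreasing bin packing.
# Job loads and server capacity are hypothetical; this illustrates the
# consolidation principle, not the actual scheduler from [3].

def consolidate(job_loads, server_capacity):
    """Greedily assign jobs to servers; returns per-server load lists."""
    servers = []
    for load in sorted(job_loads, reverse=True):
        for s in servers:
            if sum(s) + load <= server_capacity:
                s.append(load)   # fits on an already-running server
                break
        else:
            servers.append([load])   # power on a new server

    return servers

jobs = [0.5, 0.2, 0.7, 0.3, 0.6, 0.1, 0.4]   # fractional CPU demand (assumed)
placement = consolidate(jobs, server_capacity=1.0)
print(len(placement), "servers instead of", len(jobs))
```

Here seven jobs consolidate onto three servers, letting the other four machines idle at near-zero power or shut down entirely.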

Conclusions

The rise of the public cloud offers not only the promise of economic and human resource savings for companies with digital assets, but also significant energy savings. With the acceleration of digital transformation for enterprises, the energy consumption of digital services will only become more important. Consolidating these services through public cloud providers will unlock energy savings previously available only to the largest technology companies with massive infrastructure.

© Travis Lanham. The author warrants that the work is the author's own and that Stanford University provided no input other than typesetting and referencing guidelines. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.

References

[1] A. Shehabi et al., "United States Data Center Energy Usage Report," Lawrence Berkeley National Laboratory, LBNL-1005775, June 2016.

[2] E. Masanet et al., "The Energy Efficiency Potential of Cloud-Based Software: A U.S. Case Study," Lawrence Berkeley National Laboratory, June 2013.

[3] J. Berral et al., "Towards Energy-Aware Scheduling in Data Centers Using Machine Learning," in Proceedings of the 1st International Conference on Energy-Efficient Computing and Networking (ACM, 2010), p. 215.