Reversible Computing and Data Centers for Efficient Computing

Emanuel Pinilla
November 8, 2017

Submitted as coursework for PH240, Stanford University, Fall 2017


Fig. 1: A computation is logically reversible if the outputs can be computed from the inputs and the inputs recovered from those same outputs. Binary search trees illustrate this idea because there are only ever two options at each node, resembling the flip of a bit from 0 to 1 discussed in Landauer's original principle. (Source: Wikimedia Commons).

Computers have become ubiquitous, and the computing power they possess has increased significantly, decreasing the energy required per computation. The issue is that computers have become so common, with the number of personal computers in populous countries alone surpassing 2.25 billion, that there is a need both to make computers run more efficiently to reduce energy use and to develop better ways to manage data centers to decrease their carbon footprint. [1]

The total energy consumption of computing, including the power consumption and embodied energy of data centers, PCs and peripherals, and networks and devices, accounted for 2% of the world's carbon footprint in 2016. Data centers alone accounted for 0.5% of the world's energy use, a figure projected to quadruple to 2.0% by 2020 due to the shift toward cloud computing, which places heavy loads on servers. [1]

To combat the rising carbon footprint, two things must happen: (1) the software and algorithms computers implement must become more efficient to reduce energy cost, and (2) data centers must better regulate heat and energy consumption.

Efficient Algorithms

Efficient algorithms can significantly reduce the energy consumption of computation and decrease the amount of heat dissipated. Landauer's principle is a physical principle establishing a lower theoretical limit on the energy consumption of computing: it states that any logically irreversible operation, such as erasing or overwriting a bit, leads to heat dissipation and an increase in entropy. [2,3]
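The limit itself is easy to state numerically: erasing one bit dissipates at least k_B T ln 2 of heat. A minimal sketch, assuming room temperature (T = 300 K):

```python
import math

# Landauer's limit: erasing one bit dissipates at least k_B * T * ln(2)
# joules of heat, where k_B is the Boltzmann constant.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin):
    """Minimum heat dissipated per bit erased, in joules."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (300 K assumed) this is about 2.87e-21 J per bit,
# tiny per operation but nonzero, so it accumulates over trillions of
# operations per second across billions of machines.
print(landauer_limit_joules(300.0))
```

The per-bit figure is minute, which is why the limit matters only in aggregate: modern processors perform on the order of billions of irreversible operations per second.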

Irreversibly flipping a 0 bit to a 1 bit consumes a small amount of energy and releases a small amount of heat into the computer's environment. [2] Reversible computing offers a promising way to combat these computational energy inefficiencies. It is broken into two subsets: physically reversible computing, in which there is no increase in entropy, and logically reversible computing, the notion that outputs can be computed from inputs and inputs recovered from outputs, like a one-to-one function. [2,4] Reversible computing serves to solve energy inefficiencies because if computations can be run backwards, no energy need be lost, and Landauer's principle is sidestepped: the computer no longer faces the same lower theoretical limit on how much energy it must consume. [5]

Refining classical algorithms to incorporate reversible computing has shown success. Search algorithms are used in almost every computer made, and an unstructured classical search through n elements can require examining every element, running in O(n) time and consuming significant energy over millions of operations. Binary search trees, as seen in Fig. 1, were among the first structures adapted to reversible computing because traversing the tree offers only two options at each node, mapped as either a 0 or a 1. This mapping is logically reversible, allowing reversible computing to be implemented to decrease energy consumption. By applying reversible quantum computing to tree search, the number of queries for these search problems was cut significantly, with the new algorithm running in O(√n) time, lowering both the heat dissipated and the time of operation. [6]
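The size of that speedup is easy to see numerically. A rough sketch, assuming a Grover-style quantum search that needs about (π/4)·√n oracle queries versus up to n queries for an unstructured classical search (the constant π/4 is the standard Grover figure, not a value from the cited paper):

```python
import math

def classical_queries(n):
    """Worst case for unstructured classical search: inspect every element."""
    return n

def grover_queries(n):
    """Approximate Grover-style query count: about (pi/4) * sqrt(n)."""
    return math.ceil(math.pi / 4 * math.sqrt(n))

# The gap widens rapidly: at a million elements, roughly 786 queries
# replace up to a million.
for n in (1_000, 1_000_000):
    print(n, classical_queries(n), grover_queries(n))
```

Since each query is a physical operation that dissipates heat, fewer queries translates directly into less energy per search.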

Reversible computing thus provides efficient computation that requires significantly less energy, addressing the energy issues inherent in conventional computing.

Fig. 2: An inside look at a data center. Large data centers are filled with rows and rows of servers that are on at all times. The issue with all these servers is that they consume significant quantities of energy despite not being used to their full capacity. (Source: Wikimedia Commons).

Data Centers

Data centers consume significant amounts of energy and contribute to carbon emissions, as most servers in data centers are not run at full efficiency.

Data centers are the backbone of the internet and are valuable resources in the never-ending hunger for more data. As seen in Fig. 2, data centers are composed of rows and rows of servers, all of which are meant to collect and manage data. Despite their importance to data storage and retrieval, data centers are contributing heavily to increased global energy consumption and carbon emissions. In 2013, US data centers alone consumed an estimated 91 billion kilowatt-hours of electricity, equivalent to the output of 34 coal-fired power plants of 500 MW each. Consumption is projected to reach 140 billion kilowatt-hours by 2020, the equivalent of 50 coal-fired power plants, emitting 100 million metric tons of carbon pollution each year. [7] With the large shift toward cloud computing, the need for data centers is sure to increase, and as a result, so will the carbon footprint of computing.

Server inefficiencies are one of the key components that need to be addressed. From 2006 to 2012, average server utilization was between 12 and 18% of total server capacity. This presents an issue because these inefficient servers not only waste energy but also place limits on the capacity of data centers. With current hardware, even servers experiencing higher-than-average traffic still operated below 50% of capacity, and with an estimated 20% of servers completely idle yet still consuming energy, it is clear that the inefficient use of servers is largely responsible for data centers' massive carbon footprint. [7]
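A back-of-envelope calculation shows the scale of the idle-server waste. The fleet size and idle power draw below are assumed figures for illustration only; the 20% idle fraction is the estimate cited above:

```python
# Back-of-envelope sketch (assumed figures, for illustration only):
# an idle server still draws substantial power even while doing no work.
servers = 1_000_000      # hypothetical fleet size
idle_fraction = 0.20     # share of servers sitting completely idle (cited estimate)
idle_power_w = 150.0     # assumed power draw of an idle server, in watts
hours_per_year = 8760    # hours in a non-leap year

# Energy consumed by the idle portion of the fleet over one year, in kWh.
idle_energy_kwh = servers * idle_fraction * idle_power_w * hours_per_year / 1000
print(f"{idle_energy_kwh:.3e} kWh/year consumed by idle servers")
```

Under these assumptions the idle fleet alone burns hundreds of millions of kilowatt-hours a year while performing no useful computation, which is why consolidation onto fewer, busier servers pays off.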

A "pay as you grow" philosophy can also be applied: many data centers are large and full of idle servers because operators greatly overestimate the traffic they will receive, instead of adding capacity as traffic increases steadily.

The location of data centers is also crucial to their carbon footprint, as weather plays a large role in energy consumption. Data centers in extreme temperatures and high humidity require more energy to sustain consistent temperatures and prevent overheating. Location also dictates the source of energy powering the data center, which can increase overall CO2 emissions if the local grid relies on coal rather than greener alternatives. [8]

Data centers are essential to modern computation, and the transition to cloud computing will only place more strain on them. To prevent extensive pollution and a massive, expensive carbon footprint, data center operators must address server inefficiencies and choose locations that avoid weather penalties and provide greater access to greener sources of energy.


The influx of computers and the drive toward cloud computing are creating a massive carbon footprint due to inefficient software and algorithms as well as poor management of inefficient data centers.

The issue with inefficient software and algorithms is that every computation requires energy and releases heat. This is a problem because energy consumption strains computing power, and the released heat requires additional power for cooling. Modifying existing software and algorithms to incorporate reversible computing would significantly decrease energy consumption and, in the ideal reversible limit, could approach zero dissipation.

Data center management must be significantly more attentive to server inefficiencies and location factors. Current data centers have many servers working well below capacity yet still consuming energy and releasing heat around the clock. Certain data centers are in locations where green energy sources are unavailable and must rely on coal-based energy, increasing CO2 emissions. The number of servers in these centers can be cut significantly so that each server does more work and thus makes better use of the energy it consumes. Data centers can be built in locations with ideal weather, so there is neither overheating nor excessive energy consumption to sustain consistent temperatures, and in locations that provide access to green energy. [9]

Addressing the inefficiencies in software as well as the inefficiencies of data centers will significantly reduce the carbon footprint left behind by the computer industry.

© Emanuel Pinilla. The author warrants that the work is the author's own and that Stanford University provided no input other than typesetting and referencing guidelines. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.


[1] J. Mankoff, R. Kravets, and E. Blevis, "Some Computer Science Issues in Creating a Sustainable World," Computer 41, No. 8, 94 (August 2008).

[2] R. Landauer, "Irreversibility and Heat Generation in the Computing Process," IBM J. Res. Develop. 5, No. 3, 183 (1961).

[3] C. H. Bennett, "Notes on the History of Reversible Computation," IBM J. Res. Develop. 32, No. 1, 16 (January 1988).

[4] G. Vega, "Computation, Energy-Efficiency, and Landauer's Principle," Physics 240, Stanford University, Fall 2016.

[5] T. Toffoli, "Reversible Computing" in Automata, Languages and Programming, ed. by J. W. de Bakker and J. van Leeuwen (Springer, 1980).

[6] L. Tarrataca and A. Wichert, "Tree Search and Quantum Computing," Quantum Inf. Process. 10, 475 (2011).

[7] "Data Center Efficiency Assessment," Natural Resources Defense Council, August 2014.

[8] D. Bouley, "Estimating a Data Center's Electrical Carbon Footprint," Schneider Electric, 2010.

[9] J. Lee, "Energy Use of Server Farms," Physics 240, Stanford University, Fall 2012.