December 5, 2016

Gordon Moore predicted in the 1960s that the number of transistors per unit chip area would grow exponentially. This increased density has largely come from the semiconductor industry's ability to manufacture smaller transistors. Rising transistor counts have increased computational power per chip, but at the cost of increased power density. [1] Modern computers perform their computation predominantly in the digital realm, with information communicated and manipulated via transistors switching between low (zero voltage) and high (positive voltage) states. The energy consumption of computation is directly related to the number of transitions between these states. [1]

The conventional approaches used to keep pace with Moore's Law and improve energy-efficiency are fast approaching physical limits. [1] Despite the continued improvement of materials and engineering techniques, there are limits to how far current methods can reduce power consumption. [1] Indeed, as illustrated in Fig. 1, the Semiconductor Industry Association (SIA) has predicted that, with current engineering approaches, the energy required for computation will exceed the estimated world energy supply by the year 2040. [1] Additionally, the SIA has estimated that current approaches focused on reducing transistor size will become economically unviable by 2020. [1] If computational abilities are to continue increasing as they have, there is a significant need for improved energy-efficiency: many believe that Moore's Law is fast approaching its physical limits, but the demand for computational power will likely continue to grow regardless.

Many of the practical limits to computational energy-efficiency stem from the physical and economic limits and imperfections of materials. Independent of fabrication techniques or materials, however, there is also a theoretical limit to the minimum energy modern computational machines can use. This limit is given by Landauer's Principle and the relationship between information and entropy. (The validity of Landauer's Principle has been debated; for a review of the major critiques and their counterpoints, see Bennett. [2])

As described by the Second Law of Thermodynamics, the entropy of an isolated system undergoing a physical process never decreases. A reversible physical process is one that generates no net entropy: running the process and then its reverse leaves both the system and its environment unchanged, so the system can be returned from its final state to its initial state. All other processes (those that increase entropy) are called irreversible.

If a physical process causes a change in entropy dS (with dS greater than or equal to zero), then the system dissipates heat in the amount described by Eq. (1) (where T is the temperature in kelvins). It is important to note that physically reversible processes are an idealization and do not occur in nature. In some cases, however, a physical process can be modified to reduce its associated entropy increase or heat transfer, which can be thought of as making it more reversible. [3]

Landauer's Principle relates the Second Law of Thermodynamics to computation. As described by the principle, any logically irreversible manipulation of information is accompanied by an increase in entropy in the physical system implementing that manipulation. [2] Landauer's Principle places a fundamental limit on the number of computations that can be performed per joule of energy. It also illustrates the connection between the reversibility of logical transformations and the thermodynamic reversibility of the physical systems implementing them, stating that "any logically irreversible operation cannot be implemented in a thermodynamically reversible manner". [4] It is important to note that while a physically reversible process is an idealization, a logical transformation (an abstract mapping from inputs to outputs) can be reversible. That is, there are logical transformations whose inverses allow perfect recovery of inputs from outputs, so no information is lost. Examples of logically irreversible manipulations of information include binary decisions and the merging of two computational paths. [2] A binary decision is logically irreversible because it erases the information carried in one of its inputs; it therefore causes an increase in entropy of at least k_{B} ln(2) in the physical system realizing the decision (see Eq. (2), where k_{B} is Boltzmann's constant). In turn, this results in heat dissipation (dQ) according to Eq. (3) (where N is the number of decisions). [5]

dQ = T dS    (1)

dS ≥ k_{B} ln(2)    (2)

dQ ≥ N k_{B} T ln(2)    (3)
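As a numeric sketch of Eq. (3), the minimum heat dissipated by N irreversible binary decisions can be computed directly; the values chosen here (room temperature, one billion decisions) are illustrative and not from the text:

```python
import math

k_B = 1.380649e-23      # Boltzmann's constant, J/K
T = 300.0               # assumed room temperature, K
N = 1e9                 # assumed number of irreversible binary decisions

per_bit = k_B * T * math.log(2)   # Landauer bound per erased bit, Eq. (2) times T
dQ_min = N * per_bit              # Eq. (3): dQ >= N k_B T ln(2)

print(f"per bit: {per_bit:.3e} J")   # roughly 2.9e-21 J
print(f"N bits:  {dQ_min:.3e} J")    # roughly 2.9e-12 J
```

The per-bit bound is tiny compared to what real transistors dissipate per switching event, which is why practical devices sit far above the Landauer limit.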

As noted by Landauer, the energy dissipation of most computational processes (including those performed by modern computers) has an "unavoidable" minimum value because these devices "perform irreversible operations". [5] This lower limit cannot be overcome by improved materials. Pushing back the theoretical limits may require more fundamental changes to how computation is performed.
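The distinction between logically reversible and irreversible operations can be sketched by checking injectivity: an operation is reversible exactly when distinct inputs always give distinct outputs. The AND and CNOT gates used here are standard illustrations, not operations named in the text:

```python
from itertools import product

def and_gate(a, b):
    # Two input bits collapse to one output bit: information is erased.
    return a & b

def cnot(a, b):
    # Controlled-NOT: (a, b) -> (a, a XOR b). Both bits survive.
    return (a, a ^ b)

def is_reversible(op):
    # Reversible iff the mapping over all 2-bit inputs is injective.
    outputs = [op(a, b) for a, b in product((0, 1), repeat=2)]
    return len(set(outputs)) == len(outputs)

print(is_reversible(and_gate))  # False: (0,1) and (1,0) both map to 0
print(is_reversible(cnot))      # True: every 2-bit output is unique
```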

There may be potential for improving energy-efficiency by increasing reversibility from both a logical and a physical perspective. For example, the fields of "conservative" and "reversible" computing involve minimizing the amount of information destroyed. [6,7] From a hardware perspective, such changes could correspond to conservation of low ('0') and high ('1') states; for example, the aforementioned binary decisions could retain the bits that would otherwise be destroyed and reuse them. From a software perspective, such changes could involve including energy use in optimization criteria and requiring logical reversibility in algorithms (so that they can run in both directions).
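One standard primitive from the reversible-computing literature is the Toffoli (controlled-controlled-NOT) gate, which computes AND while keeping the bits that an ordinary AND would destroy; this sketch is illustrative and not drawn from the cited papers:

```python
def toffoli(a, b, c):
    # (a, b, c) -> (a, b, c XOR (a AND b)).
    # With c = 0, the third output holds a AND b while a and b survive.
    return (a, b, c ^ (a & b))

# The gate is its own inverse, so applying it twice restores the
# original bits: no information is destroyed.
for bits in [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# AND computed reversibly: the inputs are preserved alongside the result.
print(toffoli(1, 1, 0))  # (1, 1, 1)
```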

Improved efficiency will likely require combined efforts and reevaluation at all levels of computation, from materials to software. Although the circuit and algorithmic changes mentioned above offer potential improvements in energy-efficiency, they alone are unlikely to prevent the energy catastrophe the SIA predicts computation will cause. In addition, economics introduces several complicating factors that current ideas in software and circuit design are not likely to overcome. For example, high-state voltages are higher than seemingly necessary because of imperfections in the materials used and the need to ensure that state switching is performed correctly; fixing such material imperfections is largely economically unviable.

Demand for computing will likely increase to meet growing computational capacity. Independent of the detrimental environmental effects of excessive energy use, this seemingly inexhaustible demand highlights the importance of efficiency: in the not-so-distant future, energy availability could limit the expansion of computational abilities. Without devaluing the importance of materials and of overcoming other practical limits to energy-efficiency, fundamental changes to the process of computation (keeping Landauer's limit in mind) may be needed to meet ever-increasing demand and limit the energetic toll of computation.

© Gabriel Vega. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.

[1] "Rebooting the IT Revolution: A Call to Action," Semiconductor Industry Association, September 2015.

[2] C. H. Bennett, "Notes on Landauer's Principle, Reversible Computation, and Maxwell's Demon," Stud. Hist. Philos. M. P. **34**, 501 (2003).

[3] E. Rathakrishnan, *Fundamentals of Engineering Thermodynamics, 2nd Ed.* (Prentice Hall India, 2006), Ch. 4.

[4] J. A. C. Ladyman *et al.*, "The Connection Between Logical and Thermodynamic Irreversibility," Stud. Hist. Philos. M. P. **38**, No. 1, 58 (2007).

[5] R. Landauer, "Irreversibility and Heat Generation in the Computing Process," IBM J. Res. Dev. **5**, No. 3, 183 (1961).

[6] T. Toffoli, "Reversible Computing," in *Automata, Languages and Programming*, ed. by J. de Bakker and J. van Leeuwen (Springer, 1980), p. 632.

[7] E. Fredkin and T. Toffoli, "Conservative Logic," in *Collision-Based Computing*, ed. by A. Adamatzky (Springer, 2002), p. 47.