The Use of Computer Modeling in Reactor Design

Zach Vane
March 21, 2012

Submitted as coursework for PH241, Stanford University, Winter 2012


The past few decades have seen a meteoric rise in the capabilities of modern computers. As of the end of 2011, a handful of supercomputers in the United States, Japan, and China are capable of petaflop calculations [1], that is, a quadrillion (10^15) floating-point operations per second. As the power of these machines has increased, so too has the ambition of researchers to apply them to ever more complicated challenges. Computer simulations have already proven themselves in the aerospace and automobile industries as ways to investigate complex processes while greatly reducing the time and cost needed to evaluate designs and introduce new products. Modeling the multi-scale, multi-physics phenomena inherent to the fission process and fluid flow in a nuclear reactor is another area where simulation could lead to success [2]. Hence, this tool has become an attractive option in the design of the next generation of reactors. The goal of this approach is to produce true breakthroughs in reactor safety, performance, and reliability rather than simply modifying Cold War era designs for marginal improvement.
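To put the petaflop figure in perspective, a short back-of-envelope calculation shows how quickly such a machine works through a large job. The workload size here (10^21 total operations) is a purely hypothetical example, not a figure from any of the programs discussed below:

```python
# Back-of-envelope timing for a petaflop-class machine.
PETAFLOPS = 1e15  # one quadrillion floating-point operations per second

# Hypothetical workload: a simulation needing 10^21 total operations
total_ops = 1e21

seconds = total_ops / PETAFLOPS  # time at sustained petaflop speed
days = seconds / 86400           # convert seconds to days
print(f"{seconds:.0f} s, or about {days:.1f} days")
```

At a sustained petaflop, this hypothetical job finishes in about a million seconds, roughly a week and a half; a machine a thousand times slower would need over thirty years.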

Application to Nuclear Reactors and Practices

As an investment in clean energy for the United States, the Department of Energy began funding centers to investigate several promising areas of research. Three such institutions have been established to date. One of these, the Nuclear Energy Modeling and Simulation Energy Innovation Hub, focuses on using supercomputers to advance nuclear energy technology. Under this hub, the Consortium for Advanced Simulation of Light Water Reactors (CASL) was established at Oak Ridge National Laboratory (ORNL) in 2010 with the goals of getting more power out of existing reactors, extending their operating lives, and aiding in the design of the next generation of reactors [3]. The program was even highlighted in President Obama's 2012 State of the Union address: "At Oak Ridge National Laboratory, they're using supercomputers to get a lot more power out of our nuclear facilities."

Rather than simply stringing together the results of different legacy codes, CASL sought to develop an entirely new software package, called Denovo, capable of modeling a complete power plant. This approach encompasses everything from the atomic-level physics inside the reactor core to the large-scale operations in the auxiliary buildings of a nuclear power facility [4]. Such an ambitious program was only feasible by taking advantage of ORNL's Jaguar supercomputer. This total-picture approach, coupled with an unrivaled level of fidelity, will hopefully lead to higher safety, reduced costs, and increased efficiency in the next generation of reactors.

The creation of a reliable, full-scale model of any nuclear facility starts with the reactor core. To better understand the complex physics it contains, many universities, national laboratories, and government agencies have collaborated to tackle the problem at the most fundamental level. The Department of Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program uses this approach in examining the entire fuel cycle.

Proper modeling must consider a wide range of physical phenomena in order to accurately represent fission processes. These simulations must take into account grid and fuel interactions, irradiation, coolant flow and fluid forces, and structural and thermal response, as well as the behavior of materials [2]. One of the unique challenges in doing this is combining the simple empirical models developed over the years with newer, high-fidelity algorithms capable of providing physical insight. Real-world data, when available, is then used to validate these codes. This validation process is essential to confirming that a model reliably represents the desired system. Once validated, these models provide the capability to evaluate a multitude of design options in a much faster and more cost-effective manner. This virtual procedure offers many advantages over the much riskier "trial and error" approach traditionally used in experimentally developing nuclear technology [5].
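The validation step described above can be sketched in a few lines: pair each model prediction with a measurement and check the worst relative error against a tolerance. The function name, the data values, and the 5% tolerance are all illustrative assumptions for this sketch, not part of any actual DOE validation code:

```python
# Minimal sketch of code validation: compare model predictions against
# measured data and report the worst relative error.

def validate(predicted, measured, tol=0.05):
    """Return (passes, max relative error) for paired data points."""
    errors = [abs(p - m) / abs(m) for p, m in zip(predicted, measured)]
    worst = max(errors)
    return worst <= tol, worst

# Example: simulated vs. measured coolant outlet temperatures (K);
# values are made up for illustration.
sim = [565.2, 571.8, 580.1]
exp = [563.9, 573.0, 578.5]
ok, err = validate(sim, exp)
print(ok, round(err, 4))  # worst error well under the 5% tolerance
```

Real validation suites compare many quantities across many operating conditions, but the core idea is the same: quantify the disagreement between code and experiment before trusting the code for design decisions.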

In the same vein, CASL has investigated various operational and safety concerns by testing them in a "virtual reactor" [6]. The first version of such a code, called the Virtual Environment for Reactor Applications (VERA), attempts to simulate a complete reactor core. This includes phenomena ranging from the creation and transport of neutrons to the fission behavior of the nuclear fuel, along with any relevant chemistry and the coolant and structural response of the core. VERA will offer scientists and engineers a way to determine how refining safety margins or uprating the power of existing reactors will affect safety and efficiency. Unlike the more ambitious Denovo project, which requires supercomputer-level resources, the VERA code has been designed to be highly portable and can be run on far fewer processors. Hence, VERA could serve as an investigative tool for a much broader subset of the nuclear industry [7]. Accurate predictions of the operational response of a reactor could also lead to better choices of the materials and geometries used in a reactor. A key advantage of computer models is that they can generate a long time history of virtual results in a relatively short amount of real-world time. This enables researchers to examine decades of virtual reactor performance data to inform their current real-world decisions [8]. Since materials tend to be the limiting factor in the service lifetime of a nuclear reactor, being able to predict how a component evolves within the core can allow for a longer period of safe operation.

Application to Fuels and Safety

The nuclear fuel itself can also be a target of study. Current development of a nuclear fuel takes roughly 15 years due to the time-consuming process of repeatedly burning and inspecting a fuel sample in a reactor [9]. Modeling, as stated previously, can significantly reduce the time needed to age the fuel while generating all of the relevant data that scientists require. Hotter-burning, longer-lasting fuels could then lead to greater power output and a significant increase in a power plant's service lifetime. The configuration of fuel in a reactor is another area of current research. For instance, research at Argonne National Laboratory has examined the thermal mixing potential of different wire-wrapped reactor fuel pin bundles [10]. Similar investigations have also considered how different fuel assemblies and coolant flows affect power generation.

Computer models not only offer the chance to predict future behavior, but also allow for after-the-fact analysis of events. In a process similar to code validation, a simulation can be performed using the reactor conditions at the time of a failure. Using data from the actual scenario, the computational result can then be compared against the event itself. This kind of virtual autopsy can be extremely useful in understanding how and why operations were disrupted [2]. The same method can also provide guidance on the most appropriate way to avoid similar problems in the future.

© Zachary Vane. The author grants permission to copy, distribute, and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.


[1] G. Derene, "How IBM Built the Most Powerful Computer in the World," Popular Mechanics, 28 Dec 11.

[2] R. Michal, "The Nuclear News Interview: Doug Kothe: CASL and the Virtual Reactor," Nuclear News 54, No. 3, 88 (2011).

[3] J. Huotari, "President's Speech Names Names: ORNL," The Oak Ridger, 27 Jan 11.

[4] "Advancing Nuclear Through Computing," The Oak Ridger, 18 May 10.

[5] C. Dillow, "Oak Ridge Labs Using World's Fastest Supercomputer to Model Next-Gen Nuclear Plants," Popular Science, 19 May 10.

[6] "Oak Ridge Chosen for $122M Energy Hub," The Oak Ridger, 28 May 10.

[7] F. Munger, "ORNL Project Attracts Notice: President Mentions Computer Simulations to Improve Reactors," Knoxville News Sentinel, 31 Jan 11.

[8] F. Munger, "ORNL Team to Get $122M: TVA Also Part of Five-Year Nuclear Simulation Project," Knoxville News Sentinel, 29 May 10.

[9] W. L. J. Howell, "NCSU Professor Testing Nuclear Fuels with Computers," 6 Sep 10.

[10] R. Ranjan, C. Pantano, and P. Fischer, "Direct Simulation of Turbulent Heat Transfer in Swept Flow Over a Wire in a Channel," Intl. J. Heat Mass Transfer 54, 4636 (2011).