Virtual Nuclear Testing

Curtis Hamman
March 19, 2011

Submitted as coursework for Physics 241, Stanford University, Winter 2011

Nuclear Testing

Fig. 1: The above-ground test regime (1945-1962): Underwater test explosion in the Pacific at Bikini Atoll showing the blast wave, condensation cloud and several target ships. Source: Wikimedia Commons

Three weeks before the atomic bombings of Hiroshima and Nagasaki, the United States conducted the first nuclear weapons test at the White Sands Proving Ground in New Mexico. The device detonated at the Trinity test site was an implosion-type weapon design similar to that dropped on Nagasaki. In contrast, the simpler gun-type fission weapon detonated over Hiroshima was not tested in advance. Individual components for both designs were tested, but the first integrated systems test of the gun-type design with supercritical explosive yield occurred on August 6th, 1945, over Hiroshima, Japan. [1]

Shortly thereafter, a sustained above-ground testing campaign, which involved 210 atmospheric and five underwater detonations (see Fig. 1), began in earnest in the United States (the Soviet Union, the United Kingdom, France and China would start similar campaigns within twenty years). Experimental data from each test were collected to study the effects of nuclear explosions at high altitude, in the air, on land and underwater upon buildings, machinery, plants, animals and people. [2] Many new weapon designs were tested to evaluate their military characteristics and performance. Confidence that a new weapon design would perform to specifications was largely established by a successful test. Campaigns to optimize designs, e.g. by miniaturizing the nuclear warhead while simultaneously increasing its explosive yield, and to tailor the blast wave to specific targets and conditions were then conducted. By the 1950s, completely assembled weapons built "ready-to-go" were introduced into the U.S. stockpile. Peacetime accidental explosions and increased proliferation risks would be addressed further in the ensuing decades. The early 1960s saw significant yield-to-weight improvements that, when coupled with advances in ballistic missile and delivery technology, created an unsettling environment in which the U.S. and Soviet Union could each threaten the other within about 30 minutes. [3]

Concern over global thermonuclear war and atmospheric fallout began to challenge the national security rationale for above-ground testing. [4-7] Efforts to secure international control of nuclear weapons were largely unsuccessful. At the time, some thought that atmospheric testing of clean tactical nuclear weapons under international authority would help limit fallout effects to an acceptable level and compel mankind towards substantial disarmament and open-sharing of nuclear technology. [8,9] Instead, the buildup of nuclear armaments accelerated and continued in secret with only the explosion of multi-megaton bombs signifying progress to the world. Increasing international pressure eventually forced a cessation of testing by the U.S. and Soviet Union from 1958 to 1961 and set the stage for test-ban negotiations. After several years of negotiation, the Limited Test Ban Treaty was signed by the United States, the Soviet Union and the United Kingdom in 1963. The limits, however, were not absolute. Underground testing was still permitted under the terms of the new treaty. The shift from above-ground to below-ground testing led to the further consolidation of the U.S. nuclear weapons complex. [10]

From October 1963 to September 1992, the U.S. detonated more than 700 nuclear devices in below-ground tests, most at the Nevada Test Site (see Fig. 2), averaging about two tests per month. The bright flash, blinding heat, blast wave and mushroom cloud typical of the atmospheric testing era were no longer perceptible to the casual observer. Watching seismic monitors project data on video screens from the relative comfort of a permanent control room became the norm during the underground testing era. Multiple new weapons systems were designed, manufactured, and tested on a routine basis during this time to replace the previous generation within the nuclear stockpile. [10] Increasingly optimized designs were developed with tighter performance margins; relatively minor defects could render such a device ineffective where a more robust design would still function. With an active design program, any defects discovered during routine surveillance were repaired, or the affected weapons were replaced by modern ones with an established testing pedigree, within a few years. Aging effects were of relatively little concern with a fresh supply of "ready-to-go" nuclear weapons. Engineering new safe, secure and reliable nuclear weapons for the purpose of preventing conflict became the norm. The march towards weapons of unprecedented mass destruction was replaced by the slow evolution of more complex engineering devices so fail-safe and proliferation-proof they were just barely able to detonate. [3,10]

Fig. 2: The underground test regime (1963-1992): Subsidence craters from old tests and monitoring equipment are visible at the Nevada Test Site. Source: Wikimedia Commons

With the dissolution of the Soviet Union in 1991, U.S. security policy transitioned to preserving the existing deterrent while reducing proliferation risks. Safety improvements from the 1970s had been incorporated into the stockpile, which appeared to satisfy anticipated military needs given the emerging international commitments. President Bush and the U.S. Congress initiated a test ban moratorium in 1992 that was subsequently extended by President Clinton, culminating in the signing of the Comprehensive Test Ban Treaty in 1996 (which has yet to be ratified by the U.S. Senate). [11] The weapons laboratories were now tasked with maintaining their nuclear weapons expertise and certifying the stockpile without physically detonating a nuclear device. [3,12]

A Virtual Reality

The U.S. last physically tested a nuclear weapon in 1992, underground, at the Nevada Test Site about 65 miles north of Las Vegas. By the mid-1990s, U.S. policy had solidified its commitment to a zero-yield test moratorium for the foreseeable future. Since then, the U.S. nuclear weapons community has been directed to exercise stewardship over the existing stockpile. The strategy developed to achieve this, known as Stockpile Stewardship, is motivated in part by the need to maintain confidence in the stockpile, preserve both the human and physical resources needed to maintain the current stockpile, and remain capable of resuming nuclear design, production and testing with minimum delay if U.S. policy were to change. [13]

In the absence of nuclear testing, policy makers argued that stockpile stewardship would strengthen the test ban treaty and enable the U.S. to maintain its nuclear deterrent without testing. Three different ways to maintain the deterrent are often highlighted. First, managing an aging stockpile would require the replacement of particular components with differently manufactured parts whose effect on the explosive yield of the bombs must be sufficiently understood to detect defects before they jeopardize the deterrent. Second, sustaining a scientifically competent workforce able to guarantee the reliability of the deterrent requires investment in experimental and computational facilities (see Fig. 3) that keep the nuclear weapon designers active and attract a new generation of scientists into the weapons community. Third, a virtual testing environment of refined physics fidelity, built on "first-principles" simulation, is expected to replace physical testing as a means of certifying the enduring stockpile. Without physical testing, scientific judgment honed in a virtual reality is expected to validate the performance of the nuclear arsenal. [13,14]

The extraordinary temperatures and pressures produced in a nuclear explosion are difficult to reproduce by conventional means. To provide a detailed physical understanding of what goes on inside a nuclear explosion, mathematical models are constructed from first-principles theoretical knowledge of the underlying physics. The governing equations, full of non-linearities and complex interactions, often resist any simple solution, and further coarse-grained approximations are made to arrive at a set of equations that "captures the physics" while being amenable to solution with computers. These equations are then reduced still further into a model suitable for weapon design and refurbishment. In doing so, engineering approximations are introduced that are calibrated against the results of actual nuclear testing or, possibly, against the results of more well-resolved virtual simulations. [15]
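The flavor of this pipeline can be conveyed with a deliberately simple stand-in: a single nonlinear conservation law discretized so a computer can march it forward in time. The sketch below (plain Python with NumPy, entirely unrelated to any actual weapons code) applies the conservative Lax-Friedrichs scheme to the inviscid Burgers equation, the textbook example of a "coarse-grained" equation that resists analytic solution once shocks form:

```python
import numpy as np

# Inviscid Burgers equation u_t + (u^2/2)_x = 0 on a periodic domain,
# discretized with the conservative Lax-Friedrichs scheme. A toy stand-in
# for the coarse-grained governing equations described in the text.
def lax_friedrichs(u, dx, dt, steps):
    for _ in range(steps):
        up = np.roll(u, -1)   # periodic right neighbor u_{j+1}
        um = np.roll(u, 1)    # periodic left neighbor  u_{j-1}
        # u_j^{n+1} = (u_{j+1} + u_{j-1})/2 - dt/(2 dx) * (f_{j+1} - f_{j-1})
        u = 0.5 * (up + um) - dt / (2.0 * dx) * (0.5 * up**2 - 0.5 * um**2)
    return u

n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
u0 = 1.0 + 0.5 * np.sin(2.0 * np.pi * x)   # smooth initial data
# CFL number 0.4 with max wave speed 1.5 keeps the scheme stable.
u = lax_friedrichs(u0.copy(), dx, dt=0.4 * dx / 1.5, steps=400)

# Because the scheme is written in conservative form, the discrete
# integral of u is preserved to roundoff even after a shock develops.
print(abs(u.sum() - u0.sum()) * dx)
```

The design point is that the discrete update mirrors the conservation structure of the continuous equation; a scheme lacking that structure can look plausible while silently creating or destroying the conserved quantity.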

Scientific judgment often arbitrates the selection of governing equations and engineering models that collectively "capture the physics" of a given simulation scenario. Human judgment is, however, limited and prone to fantasy in the absence of experimental validation. Simple sanity checks (e.g. conservation of energy/momentum) and tests of isolated physics components (e.g. neutron transport, hydrodynamics, and equations of state) may be deemed sufficient to validate a fully-coupled physics simulation and produce useful output; however, simulations of nuclear weapons are still subject to the old maxim: garbage in, garbage out. Without new experimental data, the simple expediency of arbitrary correlation functions and parameters is left unchallenged. Only the simulated world persists. The proprietary, restricted and classified nature of weapons codes further complicates their validation. [16] Equations of state, cross-sections, opacities and physics modules useful to the design of nuclear weapons have largely been sequestered from scrutiny by the wider scientific community, allowing poor models to persist and enabling opaque decision-making that supports institutional objectives instead of scientific objectivity. These boundaries of secrecy continue to fluctuate, though, as general scientific research expands into bomb-relevant areas.
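The sanity checks mentioned above can be made concrete with a trivial example: verifying that a simulated interaction conserves momentum and energy. The snippet below is a hypothetical Python sketch; `elastic_collision` is an invented stand-in for a physics module, not a routine from any real code. The checking pattern, not the physics, is the point:

```python
# Toy "physics module": final velocities of a 1D elastic collision,
# from the textbook conservation-of-momentum/energy solution.
def elastic_collision(m1, v1, m2, v2):
    v1p = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2)
    v2p = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2)
    return v1p, v2p

# Sanity check: total momentum and kinetic energy must be unchanged,
# up to floating-point roundoff, by any valid elastic-collision model.
def check_conservation(m1, v1, m2, v2, tol=1e-12):
    v1p, v2p = elastic_collision(m1, v1, m2, v2)
    p_err = abs((m1 * v1 + m2 * v2) - (m1 * v1p + m2 * v2p))
    e_err = 0.5 * abs((m1 * v1**2 + m2 * v2**2)
                      - (m1 * v1p**2 + m2 * v2p**2))
    return p_err < tol and e_err < tol

print(check_conservation(2.0, 3.0, 5.0, -1.0))  # prints True
```

A check like this catches gross coding errors but, as the text notes, says nothing about whether the model itself is right: a wrong collision law that happens to conserve both quantities would pass unchallenged.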

Fig. 3: The virtual test regime (1995-today): Several racks of Dawn, a half petaflop system that supports weapons science codes for the U.S. Department of Energy. (Courtesy of the Lawrence Livermore National Laboratory)

To help assess the quality of internal scientific judgment and recruit scientists into an exciting career in weapons research instead of weapons gerontology, Stockpile Stewardship has supported several unclassified university research programs designed to develop predictive simulation tools for "grand-challenge", integrated multi-physics problems. These include integrated simulations of gas turbines, hypersonic scramjets, rocket engines, supernova explosions, and the hypervelocity impact of metallic projectiles in extreme environments. Each of these programs is expected to deliver simulation tools, models and scientists that can be put to use in classified-domain applications "behind-the-fence." [17] Conversely, many numerical techniques commonplace in graduate education today had their origins in nuclear weapons research many decades ago. [18] Stockpile Stewardship has also increased the support for advanced experimental facilities that mimic the physics found in thermonuclear weapons. Basic research in astrophysics and high-energy density physics, which share similar physical processes, is expected to benefit greatly from increased access to these facilities. [19] Laser-driven fusion research, which also shares common physics with thermonuclear weapons, has led to further declassification of weapons information as the open scientific community rediscovers old concepts on new machines. [20,21] Some have argued that such interactions present a profound proliferation risk, while others argue that any presumed proliferation risk is outweighed by the need to reward and retain scientific rigor within the weapons community by allowing publication in the open literature. [22] Inevitably, though, open scientific progress in bomb-relevant fields mutes the hypersecurity rationale for knowledge sequestration.

On the road to disarmament, the U.S. nuclear weapons labs face an identity crisis. They are tasked with maintaining the technical competence needed to support the nation's nuclear deterrent even as the size of that stockpile tends toward zero. [23,24] Some argue that this limit is achievable by investing in surveillance and remanufacture capabilities preceded by only one last-time modification of the physics package. [25] Others argue that only those skills needed to remanufacture defective weapons according to their original specifications should be retained, akin to curatorship of antique nuclear weapons, while others are left to atrophy. [21] Stockpile Stewardship, on the other hand, has sought to sustain the nuclear enterprise, even as the physical weapons are dismantled, by the infusion of scientifically challenging problems and experimental facilities to support a virtual testing ground. The limits of computation are, however, real. After more than a decade under stewardship, some now claim that the only way to maintain the long-term safety and reliability of the stockpile is to end the moratorium and resume underground nuclear testing and weapon design. [26,27] Institutional patronage is apparently at odds with public policy.

© Curtis W. Hamman. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.

References

[1] R. Rhodes, The Making of the Atomic Bomb, (Simon and Schuster, 1986).

[2] S. Glasstone, The Effects of Nuclear Weapons, (Knowledge Publications, 2006).

[3] S. Drell and B. Peurifoy, "Technical Issues of a Nuclear Test Ban," Annual Review of Nuclear and Particle Science 44, 285 (1994).

[4] H. Kahn, On Thermonuclear War, (Princeton U. Press, 1960).

[5] W. F. Libby, "Radioactive Fallout," Proc. Nat. Acad. Sci. 44, 800 (1958).

[6] H. Kissinger, "Nuclear Testing and the Problem of Peace," Foreign Affairs 37, 1 (1958).

[7] L. Pauling and E. Teller, Fallout and Disarmament: A Debate, (Fearon Publishers, 1958).

[8] F. Dyson, "The Future Development of Nuclear Weapons," Foreign Affairs 38, 457 (1960).

[9] E. Teller, "The Feasibility of Arms Control and the Principle of Openness," Daedalus 89, No. 4, 781 (1960).

[10] J. Masco, "Nuclear Technoaesthetics: Sensory Politics from Trinity to the Virtual Bomb in Los Alamos," American Ethnologist 31, 349 (2004).

[11] J. Roehl, "The United States Senate and the Politics of Ratifying the Comprehensive Nuclear-Test-Ban Treaty," Comparative Strategy 28, 303 (2009).

[12] R. L. Garwin, "The Future of Nuclear Weapons Without Nuclear Testing," Arms Control Today 27, No. 8, 3 (1997).

[13] R. Jeanloz, "Science-Based Stockpile Stewardship," Physics Today 53, 44 (2000).

[14] H. Gusterson, "The Virtual Nuclear Weapons Laboratory in the New World Order," American Ethnologist 28, 417 (2001).

[15] D. MacKenzie, "The Influence of the Los Alamos and Livermore National Laboratories on the Development of Supercomputing," IEEE Annals of the History of Computing 13, No. 2, 179 (1991).

[16] R. B. Laughlin, "The Physical Basis of Computability," Computing in Science and Engineering 4, 27 (2002).

[17] C. E. Paine, "A Case Against Virtual Nuclear Testing," Scientific American 281, No. 3, 74 (1999).

[18] F. H. Harlow, "Fluid Dynamics in Group T-3 Los Alamos National Laboratory," J. Comp. Phys. 195, 414 (2004).

[19] E. Moses et al., "The National Ignition Facility: Ushering in a New Age for High Energy Density Science," Physics of Plasmas 16, 041006 (2009).

[20] H. Gusterson, "NIF-ty Exercise Machine," Bulletin of the Atomic Scientists 51, No. 5, 22 (1995).

[21] J. Katz, "Curatorship, Not Stewardship," Bulletin of the Atomic Scientists 51, No. 6, 3 (1995).

[22] C. E. Paine and M. G. McKinzie, "Does the U.S. Science-Based Stockpile Stewardship Program Pose a Proliferation Threat?," Science and Global Security 7, 151 (1998).

[23] J. Johnson, "DOE Weapons Labs at a Crossroad," Chemical and Engineering News 87, No. 2, 32 (2009).

[24] J. Reppy, "U.S. Nuclear Laboratories in a Nuclear-Zero World," Bull. Atomic Scientists 66, 42 (2010).

[25] R. E. Kidder, "Problems with Stockpile Stewardship," Nature 386, 645 (1997).

[26] D. H. Sharp, "Nuclear Testing: Deterrence, Stewardship and Arms Reduction," Comparative Strategy 29, 295 (2010).

[27] K. Bailey and R. Barker, "Why the United States Should Unsign the Comprehensive Test Ban Treaty and Resume Nuclear Testing," Comparative Strategy 22, 131 (2003).