Scientists at U.S. National Laboratories are still testing nuclear weapons among the mountains, desert, and chaparral of the American West. High-tech machinery and warehouses stocked with supercomputer processors take data on warheads and explosions—yes, there are still explosions, which crack like rifle fire on schedule in the distance.
But there are no nuclear explosions. Though the treaty explicitly banning all nuclear weapons tests has not yet entered into force, the United States has not detonated a nuclear weapon since 1992. American nuclear strategy still depends on the weapons working, but without full-scale tests, the Department of Energy’s National Labs now operate the Stockpile Stewardship program, which relies on theory, simulations, and experiments to deliver annual weapons assessments to the federal government. Science and computing initiatives, such as increasing supercomputing speed and investing in new processing technologies like quantum computing, may one day make simulated testing as effective as actually detonating a device.
“The [Stockpile Stewardship program] has gone through a number of administrations, and the Defence Department hasn’t said that we have to go back to testing,” Victor “Vic” Reis, former assistant secretary of energy for defence programs at the Department of Energy and one of the program’s architects, told Gizmodo. “We understand enough of what’s happening with the current stockpile of weapons—they’re safe and reliable.”
From the start of the nuclear era, ensuring that nuclear devices worked relied, in part, on detonating them. But growing public concern in the 1950s regarding the health and environmental effects of nuclear fallout and the overall unease with the devastating potential of these weapons slowly led to treaty negotiations. The start-and-stop efforts were closely tied to U.S.-Soviet tensions during the Cold War. But thanks in part to the Cuban Missile Crisis in 1962 (when the U.S. and the Soviet Union very nearly kicked off a nuclear war), countries around the world signed the 1963 Treaty Banning Nuclear Weapon Tests in the Atmosphere, in Outer Space and Under Water, called the Partial Test Ban Treaty (PTBT).
The treaty didn’t stop nuclear arms proliferation, and testing moved underground. Further treaties followed, limiting the size of the weapons that could be tested and introducing ways to verify that each country was complying with the terms of the treaties. But it took until the end of the Cold War for the next big push to end nuclear weapons testing. The PTBT’s signers met and negotiated a stronger treaty, the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which would ban all nuclear explosions for all purposes. That treaty has not yet gone into effect; China, Egypt, India, Iran, Israel, North Korea, Pakistan, and the United States have not ratified it.
In a 1993 radio address, U.S. President Bill Clinton said that “To assure that our nuclear deterrent remains unquestioned under a test ban, we will explore other means of maintaining our confidence in the safety, the reliability, and the performance of our own weapons.” After all, the principle of nuclear deterrence requires a guarantee that the nuclear weapons actually work. Clinton did not say how this assurance would play out, however. When Vic Reis became the assistant secretary of energy for defence programs at the Department of Energy, he made this matter his highest priority.
“The issue became, how do you maintain the lab’s competence to be able to confidently tell the President that the weapons were still OK as they aged—or did we have to return to testing?” Reis told Gizmodo. “Did we understand the ageing process? What were the effects of ageing, and were we able to provide high-confidence fixes, if necessary?”
Reis teamed up with senior scientists and military personnel to draft a program that could validate the performance of the weapons and simulate the effects of ageing on the weapons and their safety—what he called Science Based Stockpile Stewardship. The three national labs with weapons programs—Los Alamos National Lab, Lawrence Livermore National Lab, and Sandia National Laboratories—were already working on large experiments for testing components of nuclear weapons. However, there wasn’t nearly enough computing capacity to run all of the required simulations.
Fortunately, Reis had previously been the director of DARPA and convinced a manager there to lead what would become the Accelerated Strategic Computing Initiative, a program that would significantly increase the computing power available to the weapons labs. Today, the Stockpile Stewardship program operates on a three-pillared approach, combining theory, simulation, and experiment, and runs mainly out of those three labs as well as the Nevada National Security Site.
The goal of the program is to issue an annual report to Congress expressing complete confidence that the nuclear arsenal is reliable, even as the radioactive pits inside the weapons age and undergo molecular changes. But the pit is just one part of a weapon; several thousand other components go into the device.
Conventional explosives must detonate precisely to compress the pit and set off the nuclear chain reaction. The pit must be secured so it doesn’t jostle around and detonate accidentally, and the weapon must sit inside a casing that protects it from the outside world. Nuclear warheads require a delivery method, such as a gravity bomb dropped from a plane, an intercontinental ballistic missile, or a submarine-launched ballistic missile. The nuke must continue to work even if another country attempts to stop it (say, with another nuke).
I visited Los Alamos in June 2019 to learn more about this program. Experiments there attempt to recreate the conditions that a nuclear device might face as it approaches its target. A steel tube about the height and width of a semi-truck but much longer sits in the scrubby fields behind security-guarded outposts at the lab, which is located about 95 kilometres northeast of Albuquerque. Warheads with dummy nuclear pits are placed at one end of the pipe, and more than 45 kilograms of the conventional explosive C4 is set off at the other.
The tube guides the shockwave toward the warhead, where scientists image the interaction using high-speed cameras. Beside the shock tube, a low concrete building contains a blue-and-white centrifuge that can spin test warheads to 200 revolutions per minute to ensure they can survive the 12-g force of reentry into the atmosphere.
The centrifuge weighs 9 tonnes but has low-enough friction that I was able to move it with a hard push. It was hypnotising to watch the centrifuge spin at 60 rotations per minute in person and downright upsetting to see a video of a test nuclear warhead attached to the centrifuge, whipped around at full speed like a tetherball. These experiments are operated remotely or from bunkers. Detonations of C4 and other explosives are heard daily around the lab’s vast campus and occasionally from the neighbouring towns.
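Taking the article’s figures at face value, the rpm and g-load pin down where on the arm that load is felt: centripetal acceleration is a = ω²r, so 12 g at 200 revolutions per minute corresponds to an effective radius of roughly a quarter of a metre. The radius below is derived from those two numbers, not a reported specification of the Los Alamos machine, whose actual geometry may differ:

```python
import math

RPM = 200       # centrifuge speed quoted above
G_TARGET = 12   # g-load a warhead must survive on reentry
G0 = 9.81       # standard gravity, m/s^2

# Convert revolutions per minute to angular velocity in rad/s.
omega = RPM * 2 * math.pi / 60

# Centripetal acceleration a = omega^2 * r, so the radius at which
# the payload feels the target g-load is r = a / omega^2.
radius = (G_TARGET * G0) / omega**2

print(f"omega = {omega:.2f} rad/s")
print(f"radius for {G_TARGET} g at {RPM} rpm: {radius:.2f} m")
```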
Other experiments simulate the explosive triggering of a nuclear weapon without the pit inside. At the Dual-Axis Radiographic Hydrodynamic Test Facility, or DARHT, x-rays produced by an on-site particle accelerator are used to image material imploding as it undergoes shock. At Lawrence Livermore’s National Ignition Facility, the highest-energy laser ever built is housed in a building the size of a sports stadium, where it focuses 192 beams onto a target to set off fusion.
The data from these experiments, as well as data from the 1,054 officially counted U.S. nuclear tests that occurred between 1945 and 1992, are incorporated into the simulation phase of the Stockpile Stewardship program. Understanding how the weapons age is a crucial component to the simulations. “There’s a whole aspect of what happens to various materials and how they interact with metals, or with components of the devices themselves, that’s all ageing. We have no data on what happens when something is 40 years old,” Irene Qualters, associate laboratory director for simulation and computation at Los Alamos National Lab, told Gizmodo.
Improving the simulations requires ever-more-powerful and advanced computers. Today, Los Alamos hosts the world’s seventh-fastest supercomputer, called Trinity. Aisles of black monoliths behind clear doors in a loud, white-tiled room act as an enormous processor for running simulations, as well as storage for simulation results on tapes.
Small stickers reading “secret restricted data” decorate the stacks, a reminder of these processors’ true use. Lawrence Livermore National Lab hosts the world’s second-fastest supercomputer, used for similar purposes, called Sierra. Each lab has a backup of the other’s data in case one is taken offline or destroyed—say, in a nuclear blast. An endless race to boost supercomputer performance aims to improve the speed, detail, and efficiency of these simulations.
Meanwhile, other researchers study the limits of high-performance computing and try to understand what the next computers might look like. They write new algorithms and push computational theory. Los Alamos owns and tests a D-Wave quantum computer, a black cube emblazoned with LED lights that uses quantum effects to solve certain optimisation problems.
Other researchers try out new algorithms on quantum computers over the cloud from the likes of IBM and IonQ. These research areas are meant to advance the general understanding of computation and to ensure that scientists understand the highest-performance devices that might one day be useful for nuclear weapons simulations, all in the name of making sure the nukes still work. Building faster, better computers is a proxy for an arms race itself.
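The optimisation problems D-Wave’s annealers natively target are quadratic unconstrained binary optimisation (QUBO) problems: pick 0/1 values for a set of variables to minimise a quadratic cost. As an illustration of the problem class only (not of the hardware, and not code from the labs), here is a brute-force QUBO minimiser over a toy instance I made up; annealers are aimed at instances with thousands of variables, where enumerating all 2ⁿ assignments is hopeless:

```python
from itertools import product

def solve_qubo_brute_force(Q):
    """Minimise x^T Q x over binary vectors x by exhaustive search.

    Q maps (i, j) index pairs to coefficients; diagonal entries (i, i)
    act as linear terms, since x_i^2 == x_i for binary x_i.
    """
    n = 1 + max(max(i, j) for i, j in Q)
    best_x, best_cost = None, float("inf")
    for bits in product((0, 1), repeat=n):
        cost = sum(c * bits[i] * bits[j] for (i, j), c in Q.items())
        if cost < best_cost:
            best_x, best_cost = bits, cost
    return best_x, best_cost

# Toy instance: reward selecting variables 0 and 2 individually,
# but penalise selecting 0 and 1 together.
Q = {(0, 0): -1.0, (2, 2): -1.0, (0, 1): 2.0}
x, cost = solve_qubo_brute_force(Q)
print(x, cost)   # (1, 0, 1) -2.0
```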
As I spoke to scientists at Los Alamos, I wondered how they felt working on maintaining the nuclear arsenal. Many of these scientists were initially interested in studying physics, nuclear power, or computing for peaceful purposes. “Nuclear weapons and the stockpile are important to this country,” Qualters said. “I would rather be a part of maintaining that with integrity than to abdicate the responsibility.” This sentiment was mirrored by others who work on the nuclear stockpile.
The program has been successful—the United States hasn’t detonated a nuclear weapon in almost 30 years. But the facilities have also allowed the U.S. to continue maintaining and upgrading its nuclear arsenal; the Obama administration committed hundreds of billions of dollars to rebuilding nuclear weapons, an effort that has continued under the Trump administration. But is there anything to stop the federal government from ordering new nuclear tests?
“I’m more worried about… members of Congress that just want to do a nuclear test, not for reliability but to send a signal, up the ante, what have you,” Lisbeth Gronlund, senior scientist and co-director of the Global Security Program at the Union of Concerned Scientists, told Gizmodo. It’s possible that lawmakers will doubt the effectiveness of the Stockpile Stewardship program alone and ask to resume detonations, Gronlund said.
Still, Reis told Gizmodo that he thinks the strategy should last at least another generation. The U.S. has found an effective workaround to true nuclear testing—it’s not quite as showy as nuking ships in the Pacific, but scientists each year report to Congress with 100 per cent confidence that the nuclear arsenal is reliable.
“But beyond 20 to 25 years, who knows,” Reis said. Future politicians will eventually have to decide what to do about the ageing nuclear arsenal.