Secret Los Alamos tunnel revealed, plus Stockpile Stewardship

The Rocketry Forum


Winston

There's only one McDonald's in Los Alamos, so I was unknowingly parked right over this tunnel on my last tourism visit there. As always, quoted text is italicized, mine isn't:

Secret Los Alamos tunnel revealed

[video=youtube;PlLvb_d8saE]https://www.youtube.com/watch?v=PlLvb_d8saE[/video]

Top-secret super-secure Los Alamos vault declassified

[video=youtube;dWA5Z32tiKM]https://www.youtube.com/watch?v=dWA5Z32tiKM[/video]

Historic Manhattan Project Sites at Los Alamos

[video=youtube;axkQ4UjTc8M]https://www.youtube.com/watch?v=axkQ4UjTc8M[/video]

To ensure that our aging nuke stockpile remains functional, we have the stockpile stewardship programs. From the incredibly impressive stuff seen in the videos below, and considering the simulation power of modern supercomputers, I suspect they are now able, or soon will be able, to entirely simulate and optimise a nuclear warhead within a supercomputer, although that's not the openly stated goal of stockpile stewardship. Considering the high percentage of successful tests during the 1950s, achieved with analysis and computing capabilities that were absolutely CRUDE compared to today's, and the highly successful efforts using those crude tools to reduce the mass and increase the efficiency of devices even back then, I think my suspicion may be justified.

The Big Science of Stockpile Stewardship (PDF)

https://aip.scitation.org/doi/pdf/10.1063/1.5009218

Understanding how the fissile Pu pit is compressed during the primary-stage implosion is fundamental to nuclear weapons design. By simulating the pit’s compression using nonfissile Pu surrogates, researchers at LANL’s DARHT facility are able to generate a variety of two-dimensional, full-scale images that then serve to inform weapons primary design. At the Nevada Test Site, now known as the Nevada National Security Site (NNSS), primary implosions are performed with fissile Pu, but material quantities are reduced sufficiently to ensure that the assembly remains subcritical at all times, in compliance with the CTBT.

FIGURE 1. The US Department of Energy's Stockpile Stewardship Program makes use of (clockwise from top left) subcritical plutonium implosive testing equipment at the Nevada National Security Site, supercomputers such as Lawrence Livermore National Laboratory's Sequoia, electron accelerators at Los Alamos National Laboratory's Dual-Axis Radiographic Hydrodynamic Test facility, and an inertial confinement fusion chamber at LLNL's National Ignition Facility. (Images courtesy of the US Department of Energy.)


Stockpile Stewardship: Los Alamos - lots of really impressive hardware shown.

[video=youtube;SdRmhrf6oXE]https://www.youtube.com/watch?v=SdRmhrf6oXE[/video]

Trinity Supercomputer Now Fully Operational

[video=youtube;z9eZs2GBn9c]https://www.youtube.com/watch?v=z9eZs2GBn9c[/video]

Stockpile Stewardship: How we ensure the nuclear deterrent without testing - Lawrence Livermore National Laboratory does it, too - more extremely impressive hardware shown.

[video=youtube;8MmujbPYT80]https://www.youtube.com/watch?v=8MmujbPYT80[/video]

Weapon Simulation and Computing Program at LLNL

https://wci.llnl.gov/about-us/weapon-simulation-and-computing

Supercomputers offer tools for nuclear testing — and solving nuclear mysteries
1 Nov 2011

https://www.washingtonpost.com/nati...ar-mysteries/2011/10/03/gIQAjnngdM_story.html

The laboratories, including Livermore in California and Los Alamos National Laboratory and Sandia National Laboratory in New Mexico, are responsible for certifying to the president the safety and reliability of the nation’s nuclear weapons under a Department of Energy program known as stockpile stewardship, run by the National Nuclear Security Administration.

Over the years, various flaws have been detected in the nuclear arsenal, some worse than others. A serious incident occurred in 2003, when traditional checks revealed a problem that, while not catastrophic, was widespread. Details of that problem are also classified. In response to the discovery, Livermore scientists performed a series of computer simulations, followed by high-explosive but nonnuclear experiments at Los Alamos, that showed the weapons did not need a major repair that might have cost billions of dollars, Goodwin said. In an earlier time, he added, the only way to reach that conclusion might have been to resume nuclear testing.

At the time the test ban treaty was defeated, critics said the United States might someday need to return to testing. Six former secretaries of defense in Republican administrations, including Caspar W. Weinberger, Richard B. Cheney and Donald H. Rumsfeld, wrote to the Senate in 1999 that the planned stockpile stewardship program “will not be mature for at least 10 years” and could only mitigate, not eliminate, a loss of confidence in weapons without testing.

Sen. Jon Kyl (R-Ariz.), who has long opposed the treaty, said: “Computer simulation is a part of the stockpile stewardship program, which scientists say has been helpful. One told me it produced good news and bad news. The good news is that it tells us a lot more about these weapons than we ever knew before. The bad news is that it tells us the weapons have bigger problems than we realized. While computers are helpful, they’re not a substitute for testing. That’s why, even though we’re not testing right now, we should not give up the legal right to test.”


Something they don't talk about much, but which is very important if one wants to design a functional and optimized nuke within a supercomputer:

Subcritical experiments

https://thebulletin.org/subcritical-experiments

https://str.llnl.gov/str/Conrad.html

[video=youtube;bGf4-ZOjyVY]https://www.youtube.com/watch?v=bGf4-ZOjyVY[/video]
 
Real world applications in many fields have shown that trusting 100% in simulations is a gamble, if not foolish.

Electrical, mechanical, chemical, it does not matter: the software designers may not even completely understand the effects and fidelity of their own simulation.

An example is SPICE, used to model circuits. Early versions were great with analog circuits but got confused by switching power supplies. The switching aspect is really digital; the fast rise time of the power transition involves a lot of parameters, including EMI generation. The program would produce bizarre results.
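A minimal toy sketch (in Python, not SPICE, with made-up component values) of the kind of thing that can go wrong: a plain fixed-step integrator that handles a slow analog RC filter fine will ring and overshoot wildly when the time step is no longer small compared with the circuit's time constant, which is exactly the regime that fast switching edges push a simulator into.

```python
# Toy illustration (Python, not SPICE): a fixed-step integrator that is fine for
# slow analog behaviour produces garbage when the step is no longer small
# compared with the circuit time constant. All component values are made up.

def simulate_rc(dt, t_end=2e-3, tau=1e-6, period=1e-4):
    """Forward-Euler integration of an RC low-pass: dv/dt = (v_in - v) / tau."""
    v, t, trace = 0.0, 0.0, []
    while t < t_end:
        v_in = 5.0 if (t % period) < period / 2 else 0.0  # fast-edged square wave
        v += dt * (v_in - v) / tau                        # Euler step
        trace.append((t, v))
        t += dt
    return trace

fine   = simulate_rc(dt=1e-7)    # dt << tau: output settles cleanly toward 5 V
coarse = simulate_rc(dt=1.9e-6)  # dt ~ 2*tau: rings and overshoots to ~9.5 V

print("peak output, fine step  :", max(v for _, v in fine))
print("peak output, coarse step:", max(v for _, v in coarse))
```

With the fine step the output never exceeds the 5 V drive; with the coarse step it "overshoots" to about 9.5 V, a physically impossible result of the numerics rather than the circuit.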

A space example, not quite a simulation problem but an example of what happens when a machine meets a new, unexpected environment, is the Vapor Compression Distillation unit, about the size of a dishwasher, sent up to the ISS to purify waste water, including urine, for use in washing (not consumption, though it was capable of it). It worked great on the ground (I contributed some biological matter during the testing). :eek:

It turned out that in space, with zero gravity (bad for you), calcium leaches out of the astronauts' bones and into their urine. It clogged the machine's filters, and the unit rapidly failed.

They really need to test their bombs once in a while. I suggest they have NASA boost them behind the moon, and detonate. The moon will hide the flash, and there will be no evidence on Earth of the test. You don't need many instruments, just some to tell if it went off. Let's go back to real testing to ensure a viable stockpile.
 
Great read and videos. Crude and rudimentary devices killed over 200,000 in the 1940s. Just how optimal does a device have to be? After years of seeing silver bullets loaded on the centerline station of our fighters, you realize that the majority of the destruction isn't from the detonation itself but from the cyclical exposure to radioactive materials in the debris and fallout spread by high-atmosphere weather phenomena. Hard to believe that people would visit the Strip in Vegas to watch a test go off.
 
Optimization was basically about doing the most with less; plutonium is one of the most expensive materials to produce. To this day the Superfund site I work at stores 53 million gallons of highly toxic, corrosive, radioactive waste left over from the manufacturing processes; we have 11 reactors in varying stages of demolition or long-term storage, and there are 5 chemical separation canyons (plants) awaiting decontamination/stabilization and demolition. Plutonium is fantastically expensive, the most common price I have seen being $4,000 per gram, so optimization is necessary since we are no longer manufacturing the material. More optimization and accuracy means more and smaller devices can do the same job as larger, less accurate weapons.
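To put a rough number on that, a trivial back-of-the-envelope calculation: the $4,000-per-gram figure is the one quoted above, while the pit mass used below is purely a notional, round-number assumption for illustration, not a real design figure.

```python
# Back-of-the-envelope sketch only. The price comes from the post above; the
# pit mass is a notional round number for illustration, not a real design figure.
price_per_gram_usd = 4_000       # quoted figure, USD per gram of Pu
notional_pit_mass_g = 4_000      # assume ~4 kg of Pu, illustrative only

material_cost = price_per_gram_usd * notional_pit_mass_g
print(f"Pu material cost for a notional 4 kg of Pu: ${material_cost:,}")  # $16,000,000
```

Even at that rough order of magnitude, the fissile material alone runs into the tens of millions of dollars per device, which is why squeezing the same job out of less plutonium mattered so much.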
 
Yep, I was right about a desire to design a warhead without testing:

Reliable Replacement Warhead Program

https://en.wikipedia.org/wiki/Reliable_Replacement_Warhead

Excerpt:

Designs which trade off higher weight and larger volume to maximise:

Certification without nuclear testing
Comparable or improved levels of reliability to existing designs, using larger margins and simpler components
Designs which can be designed and certified without necessarily undergoing nuclear testing


----------

Fission implosion devices really aren't that hard to develop, as the ancient tech of the '40s and '50s proved. Their test failure rate was incredibly low.

Adiabatic compression of the fusion secondary is more difficult, but as the huge, cryogenic, liquid-fueled Ivy Mike device with its very large Dewar-flask secondary proved, not THAT difficult.

For experiments on the precise physics of fusion secondaries, we have the otherwise laughable National Ignition Facility (NIF), the monster laser implosion array that never had a snowball's chance in hell of being commercially viable for power production, operated by, wait for it, one of our nuclear weapons labs:

What is NIF?

https://lasers.llnl.gov/about/what-is-nif

"NIF’s goals are to help ensure the reliability of the nation’s nuclear weapons without underground testing"

Oh, and I'm SURE this is just a coincidink:

Comprehensive Nuclear-Test-Ban Treaty

https://en.wikipedia.org/wiki/Comprehensive_Nuclear-Test-Ban_Treaty

The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is a multilateral treaty that bans all nuclear explosions, for both civilian and military purposes, in all environments. It was adopted by the United Nations General Assembly on 10 September 1996.

National Ignition Facility

https://en.wikipedia.org/wiki/National_Ignition_Facility

Construction on the NIF began in 1997...

I have no doubt at all that, with the incredibly impressive tools shown in those videos, which measure in minute detail the physics of every possible part of a device, they can now design a fully functional and optimised nuclear warhead in a modern supercomputer.

Ivy Mike

[Image: Ivy Mike "Sausage" device]

[Image: "Ivy Mike" atmospheric nuclear test, November 1952 (The Official CTBTO Photostream)]
 
And here is an extended answer to a question which basically outlines exactly the same methodology used by our incredibly impressive "stockpile stewardship", a methodology which could also be used to design reliable but untested warheads. The Israelis, with undoubtedly vastly smaller testing resources, have likely done exactly that:

How could Israel have nuclear weapons, if they have never tested the weapons?
by David Kahana, physicist unhinged

https://www.quora.com/How-could-Israel-have-nuclear-weapons-if-they-have-never-tested-the-weapons
 
From my short investigation of publicly available resources on this topic, I believe that:

1. The US entered into the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in 1996 because:
a) we had all of the types and numbers of nukes we'd ever need and were confident we could keep them reliable;
b) the lack of an ability to test would harm less capable adversaries far more than the US, while
c) it would allow valid justifications for considerable expenditures on technical assets to monitor and maintain the reliability of the current nuclear stockpile, assets which would also generate the copious amounts of data required to
d) provide a rapid breakout capability, if ever needed, to create, with the crucial assistance of supercomputer simulations, some amazingly capable and specialized nuclear warheads in short order and with minimal testing.

As examples of those specialized types of weapons designed by supercomputer, imagine things like directed-energy nuclear penetrators, devices designed for specific radiation output and directivity to maximize EMP effects, etc.

I found the following PDF. Note the 3.1 excerpt confirming my suspicion about the primary reason for the NIF. It's followed by the paper's index of some examples of specialized 4th generation nuclear weapons:

Fourth Generation Nuclear Weapons (FGNW): Military effectiveness and collateral effects
February 2, 2008

https://arxiv.org/pdf/physics/0510071.pdf

3.1 Inertial confinement fusion experiments and FGNW

Inertial confinement fusion (ICF), which basically consists of exploding very small amounts of thermonuclear fuel highly compressed by lasers or other means, enables the study of the physics of thermonuclear secondaries in the laboratory (see Fig. 2). While this technique has the potential to be used in a thermonuclear reactor to produce energy, it has primarily been developed as an alternative to the underground testing of nuclear weapons, and as a tool for designing new types of nuclear weapons [1].

Basically, as can be seen by comparing Figs. 1 and 2, ICF reproduces in the laboratory the same arrangement as the one on which two-stage H-bombs are based: the “Teller-Ulam principle.”

4. Target coupling

4.1. Initial energy from conventional or nuclear weapons
4.2. Initial work from conventional or nuclear weapons
4.3. Coupling to homogeneous and heterogeneous targets
4.4. FGNW coupling

5. Thermonuclear-driven jets and projectiles

5.1. Conventional shaped-charges
5.2. “Nuclear” and “thermonuclear” shaped-charges
5.3. FGNW-driven jets and projectiles

6. Collateral effects

6.1. Mechanical and thermal effects
6.2. Prompt radiation effects
6.3. Delayed radiological effects
6.4. Electromagnetic effects
 