The United States began developing nuclear weapons during World War II on the orders of President Franklin D. Roosevelt in 1939, motivated by the fear that it was engaged in a race with Nazi Germany to develop such a weapon. After a slow start under the direction of the National Bureau of Standards, the program was placed under the Office of Scientific Research and Development at the urging of British scientists and American administrators, and in 1942 it was officially transferred to the United States Army and became known as the Manhattan Project, a joint American, British, and Canadian venture. Under the direction of General Leslie Groves, over thirty different sites were constructed for the research, production, and testing of components related to bomb-making. These included the Los Alamos Laboratory at Los Alamos, New Mexico, under the direction of physicist Robert Oppenheimer; the Hanford plutonium production facility in Washington; and the Y-12 uranium enrichment complex in Tennessee.
By investing heavily in breeding plutonium in early nuclear reactors and in the electromagnetic and gaseous-diffusion enrichment processes for the production of uranium-235, the United States was able to develop three usable weapons by mid-1945. The Trinity test, conducted on 16 July 1945, detonated a plutonium implosion-design weapon with a yield of around 20 kilotons.
Faced with a planned invasion of the Japanese home islands scheduled to begin on 1 November 1945 and with Japan not surrendering, President Harry S. Truman ordered the atomic bombings of Japan. On 6 August 1945, the U.S. detonated a uranium gun-type bomb, Little Boy, over the Japanese city of Hiroshima with an energy of about 15 kilotons of TNT, killing approximately 70,000 people, among them 20,000 Japanese combatants and 20,000 Korean slave laborers, and destroying nearly 50,000 buildings (including the headquarters of the 2nd General Army and the Fifth Division). Three days later, on 9 August, the U.S. attacked Nagasaki with a plutonium implosion-design bomb, Fat Man, producing an explosion equivalent to about 20 kilotons of TNT, destroying 60% of the city and killing approximately 35,000 people, among them 23,200–28,200 Japanese munitions workers, 2,000 Korean slave laborers, and 150 Japanese combatants.
On 1 January 1947, the Atomic Energy Act of 1946 (known as the McMahon Act) took effect, and the Manhattan Project was officially turned over to the United States Atomic Energy Commission.
On 15 August 1947, the Manhattan District was abolished.
During the Cold War
Protest in Bonn against the deployment of Pershing II missiles in West Germany, 1981
Between 1945 and 1990, more than 70,000 total warheads were developed, in over 65 different varieties, ranging in yield from around 0.01 kilotons (such as the man-portable Davy Crockett shell) to the 25-megaton B41 bomb. Between 1940 and 1996, the U.S. spent at least $9.3 trillion in present-day terms on nuclear weapons development; over half of that was spent on building delivery mechanisms for the weapons. A further $583 billion in present-day terms was spent on nuclear waste management and environmental remediation.
Richland, Washington, was the first city established to support plutonium production at the nearby Hanford nuclear site, which supplied plutonium for Cold War atomic bombs and the wider American nuclear arsenal.
Throughout the Cold War, the U.S. and the USSR each threatened the other with all-out nuclear attack in case of war, regardless of whether the clash was conventional or nuclear. U.S. nuclear doctrine called for mutually assured destruction (MAD), which entailed a massive nuclear attack against strategic targets and major population centers of the Soviet Union and its allies. The term "mutual assured destruction" was coined in 1962 by American strategist Donald Brennan. MAD was implemented by deploying nuclear weapons simultaneously on three different types of platforms: long-range bombers, land-based intercontinental ballistic missiles, and submarine-launched ballistic missiles.
After the end of the Cold War in 1989 and the dissolution of the Soviet Union in 1991, the U.S. nuclear program was heavily curtailed: nuclear testing was halted, production of new nuclear weapons ceased, and the stockpile was reduced by half by the mid-1990s under President Bill Clinton. Many former nuclear facilities were shut down, and their sites became targets of extensive environmental remediation. Efforts were redirected from weapons production to stockpile stewardship, which attempts to predict the behavior of aging weapons without full-scale nuclear testing. Increased funding also went to nuclear nonproliferation programs, such as helping the states of the former Soviet Union eliminate their former nuclear sites and assisting Russia in its efforts to inventory and secure its inherited nuclear stockpile. By February 2006, over $1.2 billion had been paid under the Radiation Exposure Compensation Act of 1990 to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program; by 1998, at least $759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing; and over $15 million was paid to the Japanese government after the 1954 "Bravo" test exposed its citizens and food supply to nuclear fallout. In 1998, the country spent an estimated total of $35.1 billion on its nuclear weapons and weapons-related programs.
In the 2013 book Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters (Oxford), Kate Brown explores the health of affected citizens in the United States and the "slow-motion disasters" that still threaten the environments where the plants are located. According to Brown, the plants at Hanford, over a period of four decades, released millions of curies of radioactive isotopes into the surrounding environment. Brown says that most of this radioactive contamination at Hanford over the years was part of normal operations, but that unforeseen accidents did occur and plant management kept them secret as the pollution continued unabated. Even today, as pollution threats to health and the environment persist, the government withholds knowledge about the associated risks from the public.
During the presidency of George W. Bush, and especially after the 11 September terrorist attacks of 2001, rumors circulated in major news sources that the U.S. was considering designing new nuclear weapons ("bunker-busting nukes") and resuming nuclear testing for reasons of stockpile stewardship. Republicans argued that small nuclear weapons appear more likely to be used than large ones, and that small nuclear weapons therefore pose a more credible threat with a greater deterrent effect against hostile behavior. Democrats counterargued that allowing such weapons could trigger an arms race. In 2003, the Senate Armed Services Committee voted to repeal the 1993 Spratt-Furse ban on the development of small nuclear weapons; the change was part of the 2004 fiscal year defense authorization. The Bush administration wanted the repeal so that it could develop weapons to address the threat from North Korea. "Low-yield weapons" (those with one-third the force of the bomb dropped on Hiroshima in 1945) were permitted to be developed.
Statements by the U.S. government in 2004 indicated that it planned to decrease the arsenal to around 5,500 total warheads by 2012; much of that reduction had already been accomplished by January 2008.
According to the Pentagon's June 2019 Doctrine for Joint Nuclear Operations, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."