Following World War II, the publication of accounts such as John Hersey’s Hiroshima (1946) documented the devastating effects of atomic weaponry on inhabitants of the two Japanese cities targeted by atomic bombs. Yet the American government presented a positive image of the atom’s benefits for its citizens in peacetime. In the late 1940s and 1950s, the U.S. Atomic Energy Commission sought to develop nuclear medicine and nuclear energy alongside its continued production and testing of atomic weapons. In both its civilian and military endeavors, the agency maintained that its safety guidelines were sufficient to protect workers and the general population from dangerous exposures to ionizing radiation. In the 1950s, mounting concerns about low-level radiation exposure, particularly from atomic weapons fallout, raised the stakes of understanding and protecting against the health hazards of ionizing radiation. Radiological protection guidelines focused on preventing the somatic effects of radiation—leukemia, cancer, and life-shortening—for which experts postulated a safety threshold. By contrast, the genetic effects of ionizing radiation on an individual’s fertility and gametes appeared to be dose-dependent even at the lowest levels. Geneticists challenged the perceived gap between somatic and genetic effects of radiation by arguing that radiation-induced mutations play a role in diseases, especially cancer. In doing so, they also contested the Commission’s portrayal of a safe atomic future. This article examines how concerns over low-level radiation from fallout facilitated acceptance of the then-controversial somatic mutation theory of carcinogenesis, which became an enduring feature of cancer biology.