Nuclear disaster in the context of Kashiwazaki-Kariwa Nuclear Power Plant


⭐ Core Definition: Nuclear disaster

A nuclear and radiation accident is defined by the International Atomic Energy Agency (IAEA) as "an event that has led to significant consequences to people, the environment or the facility." Examples include lethal effects to individuals, large radioactivity release to the environment, or a reactor core melt. The prime example of a "major nuclear accident" is one in which a reactor core is damaged and significant amounts of radioactive isotopes are released, such as in the Chernobyl disaster in 1986 and Fukushima nuclear accident in 2011.

The impact of nuclear accidents has been a topic of debate since the first nuclear power reactors were constructed in 1954 and has been a key factor in public concern about nuclear facilities. Technical measures to reduce the risk of accidents or to minimize the amount of radioactivity released to the environment have been adopted; however, human error remains, and "there have been many accidents with varying impacts as well as near misses and incidents". As of 2014, there have been more than 100 serious nuclear accidents and incidents from the use of nuclear power. Fifty-seven accidents or severe incidents have occurred since the Chernobyl disaster, and about 60% of all nuclear-related accidents and severe incidents have occurred in the USA. Serious nuclear power plant accidents include the Fukushima nuclear accident (2011), the Chernobyl disaster (1986), the Three Mile Island accident (1979), and the SL-1 accident (1961). Nuclear power accidents can involve loss of life and large monetary costs for remediation work.


Nuclear disaster in the context of the Atomic Age

The Atomic Age, also known as the Atomic Era, is the period of history following the detonation of the first nuclear weapon, The Gadget, at the Trinity test in New Mexico on 16 July 1945 during World War II. Although nuclear chain reactions had been hypothesized in 1933 and the first artificial self-sustaining nuclear chain reaction (Chicago Pile-1) had taken place in December 1942, the Trinity test and the ensuing bombings of Hiroshima and Nagasaki that ended World War II represented the first large-scale use of nuclear technology and ushered in profound changes in sociopolitical thinking and the course of technological development.

While atomic power was promoted for a time as the epitome of progress and modernity, entering the nuclear power era also entailed the frightful implications of nuclear warfare, the Cold War, mutual assured destruction, nuclear proliferation, and the risk of nuclear disaster (potentially as extreme as anthropogenic global nuclear winter), alongside beneficial civilian applications such as nuclear medicine. It is no easy matter to fully separate peaceful uses of nuclear technology from military or terrorist uses (such as the fabrication of dirty bombs from radioactive waste), a tension that complicated the development of a global nuclear-power export industry from the outset.
