News Medical Life Sciences, By Supriya Subramanian, PhD, 10 Apr 18
Solar ultraviolet radiation (UV) exposure triggers DNA damage, a preliminary step in the process of carcinogenesis.
The stability of DNA is extremely important for the proper functioning of all cellular processes. Exposure to UV radiation alters the structure of DNA, affecting the physiological processes of all living systems ranging from bacteria to humans.
Ultraviolet Radiation
Natural sunlight stimulates the production of vitamin D, an important nutrient for the formation of healthy bones. However, sunlight is also a major source of UV radiation. Individuals with excessive UV exposure are at greater risk of developing skin cancer. There are three types of UV rays: UVA, UVB and UVC.
UVC rays (100-280 nm) are the most energetic and damaging of the three rays. Fortunately, UVC is absorbed by the ozone layer before reaching the earth’s surface.
UVA rays (315-400 nm) possess the lowest energy of the three but are able to penetrate deep into the skin. Prolonged exposure has been linked to ageing and wrinkling of the skin. UVA is also the main cause of melanomas.
UVB rays (280-315 nm) possess higher energy than UVA rays and affect the outer layer of the skin leading to sunburns and tans. Basal cell carcinoma and squamous cell carcinoma are caused by UVB radiation.
DNA Damage by UV Radiation
DNA is composed of two complementary strands that are wound into a double helix. The hereditary message is chemically coded and made up of the four nucleotides adenine (A), thymine (T), guanine (G) and cytosine (C). UVB light interferes directly with the bonding between the nucleotides in the DNA. …
Energy Hogs: Can World’s Huge Data Centers Be Made More Efficient?
The gigantic data centers that power the internet consume vast amounts of electricity and emit 3 percent of global CO2 emissions. To change that, data companies need to turn to clean energy sources and dramatically improve energy efficiency. Yale Environment 360, by Fred Pearce, April 3, 2018
The cloud is coming back to Earth with a bump. That ethereal place where we store our data, stream our movies, and email the world has a physical presence – in hundreds of giant data centers that are taking a growing toll on the planet.
Data centers are the factories of the digital age. These mostly windowless, featureless boxes are scattered across the globe – from Las Vegas to Bangalore, and Des Moines to Reykjavik. They run the planet’s digital services. Their construction alone costs around $20 billion a year worldwide.
The biggest, covering a million square feet or more, consume as much power as a city of a million people. In total, they eat up more than 2 percent of the world’s electricity and produce 3 percent of CO2 emissions, as much as the airline industry. And with global data traffic more than doubling every four years, they are growing fast.
Yet if there is a data center near you, the chances are you don’t know about it. And you still have no way of knowing which center delivers your Netflix download, nor whether it runs on renewable energy using processors cooled by Arctic air, or runs on coal power and sits in desert heat, cooled by gigantically inefficient banks of refrigerators.
We are often told that the world’s economy is dematerializing – that physical analog stuff is being replaced by digital data, and that this data has minimal ecological footprint. But not so fast. If the global IT industry were a country, only China and the United States would contribute more to climate change, according to a Greenpeace report investigating “the race to build a green internet,” published last year.
Storing, moving, processing, and analyzing data all require energy. Lots of it. The processors in the biggest data centers hum with as much energy as can be delivered by a large power station, 1,000 megawatts or more. And it can take as much energy again to keep the servers and surrounding buildings from overheating.
Almost every keystroke adds to this. Google estimates that a typical search using its services requires as much energy as illuminating a 60-watt light bulb for 17 seconds and typically is responsible for emitting 0.2 grams of CO2. That doesn’t sound like a lot until you begin to think about how many searches you might make in a year.
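Scaled up, those per-search figures add up quickly. A rough back-of-the-envelope sketch using the numbers quoted above (the annual search count is an illustrative assumption, not from the article):

```python
# Per-search figures as cited from Google in the article.
JOULES_PER_SEARCH = 60 * 17       # 60 W bulb for 17 seconds = 1,020 joules
CO2_GRAMS_PER_SEARCH = 0.2

# Illustrative assumption: about 5 searches a day for one person.
searches_per_year = 5 * 365

energy_kwh = searches_per_year * JOULES_PER_SEARCH / 3.6e6  # joules -> kWh
co2_kg = searches_per_year * CO2_GRAMS_PER_SEARCH / 1000    # grams -> kg

print(f"{searches_per_year} searches/year ~= {energy_kwh:.2f} kWh, {co2_kg:.2f} kg CO2")
```

Roughly half a kilowatt-hour and a third of a kilogram of CO2 per person per year, before any video streaming is counted.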
And these days, Google is data-lite. Streaming video through the internet is what really racks up the data count. IT company Cisco, which tracks these things, reckons video will make up 82 percent of internet traffic by 2021, up from 73 percent in 2016. Around a third of internet traffic in North America is already dedicated to streaming Netflix services alone.
Two things matter if we are to tame these runaway beasts: One is making them use renewable or other low-carbon energy sources; the other is ramping up their energy efficiency. On both fronts, there is some good news to report. Even Greenpeace says so. “We are seeing a significant increase in the prioritization of renewables among some of the largest internet companies,” last year’s report concluded.
More and more IT companies are boasting of their commitment to achieving 100 percent reliance on renewable energy. To fulfil such pledges, some of the biggest are building their own energy campuses. In February, cloud giant Switch, which runs three of the world’s top 10 data centers, announced plans for a solar-powered hub in central Nevada that will be the largest anywhere outside China.
More often, the data titans sign contracts to receive dedicated supply from existing wind and solar farms. In the U.S., those can still be hard to come by. The availability of renewable energy is one reason Google and Microsoft have recently built hubs in Finland, and Facebook in Denmark and Sweden. Google last year also signed a deal to buy all the energy from the Netherlands’ largest solar energy park, to power one of its four European data centers.
Of the mainstream data crunchers for consumers, Greenpeace singled out Netflix for criticism. It does not have its own data centers. Instead, it uses contractors such as Amazon Web Services, the world’s largest cloud-computing company, which Greenpeace charged with being “almost completely non-transparent about the energy footprint of its massive operations.” Amazon Web Services contested this. A spokesperson told Yale Environment 360 that the company had a “long-term commitment to 100 percent renewable energy” and had launched a series of wind and solar farm projects now able to deliver around 40 percent of its energy. Netflix did not respond to requests for comment.
Amazon Web Services has some of its largest operations in Northern Virginia, an area just over the Potomac River from Washington D.C. that has the largest concentration of data centers in the world. Virginia gets less than 3 percent of its electricity from renewable sources, plus 33 percent from nuclear, according to Greenpeace.
Some industry insiders detect an element of smoke and mirrors in the green claims of the internet giants. “When most data center companies talk about renewable energy, they are referring to renewable energy certificates,” Phillip Sandino, vice-president of data centers at RagingWire, which has centers in Virginia, California, and Texas, claimed in an online trade journal recently. In the U.S. and some other countries, renewable energy certificates are issued to companies generating renewable energy for a grid, according to the amount generated. The certificates can then be traded and used by purchasers to claim their electricity is from a renewable source, regardless of exactly where their electricity comes from. “In fact,” Sandino said, “the energy [the data centers] buy from the power utility is not renewable.”
Others, including Microsoft, help sustain their claims to carbon neutrality through carbon offsetting projects, such as investing in forests to soak up the CO2 from their continued emissions.
All this matters because the differences in carbon emissions between data centers with different energy sources can be dramatic, says Geoff Fox, innovation chief at DigiPlex, which builds and operates centers in Scandinavia. Using data compiled by Swedish state-owned energy giant Vattenfall, he claims that in Norway, where most of the energy comes from hydroelectricity, generating a kilowatt-hour of electricity emits only 3 grams of CO2. By comparison, in France it is 100 grams, in California 300 grams, in Virginia almost 600 grams, in New Mexico more than 800 grams.
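The spread in those grid figures translates directly into very different footprints for identical facilities. A sketch using the per-kWh numbers quoted above (the 1-megawatt facility size and round-the-clock load are illustrative assumptions):

```python
# Grid carbon intensity in grams of CO2 per kWh, as quoted from Vattenfall data.
GRAMS_CO2_PER_KWH = {
    "Norway": 3,
    "France": 100,
    "California": 300,
    "Virginia": 600,
    "New Mexico": 800,
}

# Illustrative assumption: a modest 1 MW data center running flat out all year.
annual_kwh = 1_000 * 24 * 365  # 1,000 kW * 8,760 hours = 8.76 million kWh

for region, grams in GRAMS_CO2_PER_KWH.items():
    tonnes = annual_kwh * grams / 1e6  # grams -> metric tonnes
    print(f"{region:>10}: {tonnes:>8,.0f} tonnes CO2 per year")
```

The same workload emits roughly 200 times more CO2 in Virginia than in Norway, which is why siting and power-purchase decisions dominate a data center's footprint.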
Meanwhile, there is growing concern about the carbon footprint of centers being built for Asian internet giants such as Tencent, Baidu, and Alibaba in China; Naver in South Korea; and Tulip Telecom in India. Asia is where the fastest global growth in data traffic is now taking place. These corporations have been tight-lipped about their energy performance, claims Greenpeace. But with most of the region’s energy coming from coal-fired power stations, their carbon footprint cannot be anything but large.
Vattenfall estimates the carbon emissions in Bangalore, home of Tulip’s giant Indian data center, at 900 grams per kilowatt-hour. Even more troubling, the world’s largest center is currently the Range International Information Hub, a cloud-data store at Langfang near the megacity of Tianjin in northeast China, where it takes more than 1,000 grams of CO2 for every kilowatt-hour.
Almost as important as switching data centers to low-carbon energy sources is improving their energy efficiency. Much of this comes down to the energy needed to keep the processors cool. Insanely, most of the world’s largest centers are in hot or temperate climates, where vast amounts of energy are used to keep them from overheating. Of the world’s 10 largest, two are in the desert heat of Nevada, and others are in Georgia, Virginia, and Bangalore.
Most would dramatically reduce their energy requirements if they relocated to a cool climate like Scandinavia or Iceland. One fast-emerging data hub is Iceland, where Verne Global, a London company, set up its main operation.
Greenpeace says the very size of the internet business, and its exposure to criticism for its contribution to climate change, has the potential to turn it from being part of the problem to part of the solution. Data centers have the resources to change rapidly. And pressure is growing for them to do so. The hope is that they will bring many other giant corporations with them. “The leadership by major internet companies has been an important catalyst among a much broader range of corporations to adopt 100 percent renewable goals,” says Gary Cook, the lead author of the Greenpeace report. “Their actions send an important market signal.”
But the biggest signal, says Fox, will come from us, the digital consumers. Increasingly, he says, “they understand that every cloud lives inside a data center. And each has a different footprint.” We will, he believes, soon all demand to know the carbon footprint of our video streams and internet searches. The more far-sighted of the big data companies are gearing up for that day. “I fully expect we may see green labelling for digital sources as routine within five years.” https://e360.yale.edu/features/energy-hogs-can-huge-data-centers-be-made-more-efficient
During this dangerous time, women are leading the charge to eradicate weapons of mass destruction and forestall nuclear war. We saw this most recently in the 2017 U.N. Treaty to Prohibit the Use of Nuclear Weapons. Approved with 122 states voting for, and one against, it is the first legally binding global ban on nuclear weapons, with the intention of moving toward their complete elimination. The preamble to the treaty recognizes the maltreatment suffered as a result of nuclear weapons, including the disproportionate impact on women and girls, and on indigenous peoples around the world. The treaty has been predominantly championed and promoted by women.
My interest in nuclear issues began nearly 10 years ago when I first uncovered my mother’s work as an antinuclear activist with a group called Women Strike for Peace. I have been following women doing nuclear activism all over the world—writing about them, protesting with them, teaching about them in my university classes—and I often bring my daughter with me. My mother’s story is being passed down through an intergenerational maternal line, and with it, the activism that may help save the world, or at least help shift its view on disastrous weapons. Learning about my mother’s work radically changed my perception of her. It also changed my life.
Between 1945 and 1963, more than 200 atmospheric, underwater, and space nuclear bomb tests were conducted by the U.S., primarily in the Nevada desert and the Marshall Islands. Hundreds more took place around the world. In many instances citizens were not informed of the tests, nor were they warned of their effects. The negative health impacts of the testing and exposure to ionizing radiation turned out to be vast: early death, cancer, heart disease, and a range of other incurable illnesses, including neurological disabilities, weakened immune systems, infertility, and miscarriage. Ionizing radiation damages genes (it is mutagenic), so the health ramifications of exposures are passed down through the generations.
In the 1950s, scientists concerned with the health impacts of bomb testing and the spread of ionizing radiation conducted the St. Louis Baby Tooth Survey. The survey showed that radioactive fallout had traveled far and wide. Cow and breast milk contaminated with the isotope strontium 90 had entered children’s teeth. Strontium 90 metabolizes as calcium and these isotopes remain active in the body for many years. When Dagmar Wilson and Bella Abzug—who went on to become a Congresswoman and co-founder of the National Women’s Political Caucus with Gloria Steinem and Betty Friedan—learned the results of the Baby Tooth Survey, they formed Women Strike for Peace. The group brought together concerned mothers from across the U.S. The women organized. First within their communities. And then, 50,000 mothers protested across the country, and 15,000 descended on Washington, D.C. for Women’s Strike for Peace Lobbying Day on November 1, 1961. My mother was one of those 15,000 protestors. The group’s efforts brought vast political attention to the dire health consequences of radioactive fallout and led to the banning of atmospheric bomb testing by the U.S., Great Britain, and the former Soviet Union in 1963, with the signing of the Limited Nuclear Test Ban Treaty.
Women Strike for Peace reflects a cultural nuclear gender binary—with women constructed as peaceful antinuclear protectors of children and the nation, and men positioned as perpetrators of nuclear war—the designers, planners, and regulators of weapons of mass destruction.
Since the dawn of the nuclear age men have dominated and controlled nuclear weapons design and policy. As Benjamin A. Valentino, Associate Professor of Government, and Coordinator, War and Peace Studies Program, Dickey Center for International Understanding at Dartmouth College says, it is only recently that women have had access to positions of power in the military sphere. This is true in weapons’ sciences and engineering as well. While many women worked on the Manhattan Project, most held administrative roles. Has this exclusion of women from nuclear decision-making led to our current crisis—a host of locations worldwide contaminated with radioactive waste, and the great potential for nuclear war? Leading anti-nuclear activists seem to think so.
Carol Cohn, founding director of the Consortium on Gender, Security and Human Rights at the University of Massachusetts-Boston, suggests that nuclear-weapons discourse is deeply rooted in hegemonic patriarchy. In nuclear techno-language, metaphors of male sexual activity are used to describe nuclear violence. Nuclear missiles are referred to in phallic terms. The violence of nuclear war is described in abstract and impersonal terms, such as “collateral damage.” In her recent New York Times op-ed, Cohn finds it unsurprising that hypermasculine nuclear language has surfaced so blatantly today with Trump’s tweets about the size of his nuclear button and his overall muscular championing of expanding the nuclear weapons complex.
Following the Women Strike for Peace model, legions of anti-nuclear NGOs worldwide are predominantly led by women, including Women’s Action for Nuclear Disarmament, Women’s International League for Peace and Freedom, Reaching Critical Will, the German Green Party, Mothers for Peace, Just Moms (St. Louis), International Campaign to Abolish Nuclear Weapons (ICAN), Greenham Common Women’s Peace Camp, Green Action Japan, the women of Koondakulam in India, the antinuclear nuns Megan Rice, Ardeth Platte, Carol Gilbert, and many more.
At the U.N. conference to ban nuclear weapons in 2017, I asked Civil Society experts and participants about the importance of women as leaders in the antinuclear movement, and about the hegemony of masculinity in the nuclear weapons complex.
“Of course many men support disarmament and have participated in the treaty and current anti-nuclear efforts in general, but women overwhelmingly lead,” said Tim Wright, of the Australian branch of ICAN. ICAN won the 2017 Nobel Peace Prize for its work on the Treaty to Prohibit the Use of Nuclear Weapons.
Ray Acheson, of Reaching Critical Will, said the proliferation of nuclear weapons is deeply embedded in “a misogynist and hegemonic culture of violence.” She stated this culture is oppressive to women, LGBTQ, the poor, and people of color, and, “we must smash patriarchy.” Such is the feminist cry heard around the world, but in this case, it might actually save us.
Beatrice Fihn, director of ICAN, explained that men are raised to be violent, to think it’s necessary to resolve differences through force, while “women, conversely, are socially trained to negotiate and compromise.”
According to Fihn, the problem in a patriarchal world is that peaceful negotiations are viewed as weak. The U.S. misogynist-in-chief feels we must drop nuclear bombs, expand our nuclear arsenal, and strong-arm competing nations, such as North Korea and Russia. The very act of supporting disarmament efforts in a patriarchal framework places “you in a feminine category,” Fihn stressed. “Those in favor of abolishing nuclear weapons, whether male or female, are characterized in negative, feminized terms. This characterization must be changed. It is not weak to abolish weapons of mass destruction. It is life-affirming.”
Women better understand this because they are the ones in charge of improving quality of life for all. Women most often function as caretakers of children and the elderly, they are aware of the human cost of war and radioactive disaster. When thinking about nuclear war, they wonder, if war breaks out, “How will we feed our children, how will we feed our sick? What will happen to our communities?” Fihn says she fears nuclear violence in respect to the safety of her own children. Fihn’s concern for her children echoes the concerns of my mother and her antinuclear cohort in the 1950s and ’60s. Like Fihn, they worked to save their children—all children—from radiation contamination and nuclear war. I hope I can carry on that legacy, and that my daughter chooses to pick up the cause as well.
For the 2017 UN Treaty to Prohibit the Use of Nuclear Weapons, women helped prepare key elements of the document and gave vital health testimony. Particularly poignant were tales from Australian Indigenous, Marshallese, and Hibakusha (Hiroshima and Nagasaki survivors) women. I interviewed many of these women. Abacca Anjain-Madison, a former Senator of the Republic of the Marshall Islands, told me that between 1946 and 1958, the U.S. conducted 67 nuclear bomb tests on the islands’ atolls. Many babies born during the testing period resembled jellyfish and died quickly after their births. The Marshallese developed very high rates of cancer (and other diseases) as a result of ionizing radiation exposures. Now, with climate change, the radioactive dangers persist. Rising sea levels threaten the Runit Dome—a sealed space that contains large amounts of radioactive contamination. The dome has also begun to crack, and the U.S. has no plans to assist the Marshallese with this crisis. The U.S. finished the cleanup and sealed the dome in 1979. Abacca Anjain-Madison asserts the cleanup was not sufficient and the dome was never meant to be permanent. The Marshallese do not have the means to protect themselves from the impending disaster.
Mary Olson, Southeast Director of the Nuclear Information and Resource Service, gave a presentation at the UN on the unequal health impacts of radiation exposures. Women remain unaccounted for in nuclear regulatory safety standards. Based on the data set from the BEIR VII report that both Olson and Dr. Arjun Makhijani, President of the Institute for Energy and Environmental Research, have studied, women are twice as likely to get cancer, and nearly twice as likely as men to die from cancer, associated with ionizing radiation exposures. Children are five to 10 times more likely to develop cancer in their lifetimes from radiation exposures than adult males, and girls are most vulnerable of all. Scientists do not yet understand why there is an age and gender disparity. The standard “reference man” by which radiation safety regulations are set is based on a white adult male. Olson and Makhijani argue that safety regulations must change to account for age and gender disparities. Further studies are needed to assess how people of different races are impacted by radiation exposures. To date, no such completed studies exist.
At the closing of the conference and signing of the 2017 UN Treaty to Prohibit the Use of Nuclear Weapons, two speeches were made—one by Setsuko Thurlow, a Hiroshima survivor, Nobel Peace Prize winner, and leading campaigner for the prohibition of nuclear weapons. Abacca Anjain-Madison of the Marshall Islands also spoke.
Setsuko Thurlow told her story of beholding the bomb dropping on her city in 1945. She described how, as a 13-year-old child, she witnessed the death of her brother, and “unthinkable” violence thrust upon her people. For Thurlow, the signing of the UN Treaty to ban nuclear weapons is a miracle, but she believes we must rid the world of weapons entirely. She will not give up her efforts until that day comes. Neither will I.
Heidi Hutner is a writer and professor at Stony Brook University in New York. She teaches and writes about ecofeminism, literature, film and environmental studies. Currently, Hutner is working on a narrative nonfiction book manuscript titled, “Accidents Can Happen: Women and Nuclear Disaster Stories From the Field.” Find her @HeidiHutner
The disaster at Unit 2 of the Three Mile Island (TMI) nuclear power plant near Harrisburg, Pennsylvania, began on March 28, 1979. Today, 39 years later, the reality of what really happened, and how many people it harmed, remains cloaked in mystery and misinformation. Unlike the popular catchphrase, TMI is a story of too little information.
What happened?
The two-unit Three Mile Island nuclear power plant sits on an island in the middle of the Susquehanna River, just ten miles southeast of Harrisburg, Pennsylvania. TMI Unit 2 was running at full power, but had been commercially operational for just 88 days when, at 4 A.M. on Wednesday, March 28, 1979, it experienced either a mechanical or electrical failure that caused the turbine-generator and the nuclear reactor to automatically shut down.
The pressure and temperature in the reactor began to increase. A relief valve on top of the reactor’s primary coolant pressurizer stuck open, but malfunctioning instrumentation indicated that the valve had shut. While cooling water emptied out of the reactor, operators mistakenly reduced the amount of cooling water flowing into the core, leading to a partial meltdown.
Workers deliberately and repeatedly vented radioactive gas over several days to relieve pressure and save the containment structure. Then came fears of a hydrogen explosion. But by April 1, when President Jimmy Carter arrived at the site, that crisis had been averted, and by April 27 the now destroyed reactor was put into “cold shutdown.” TMI-2 was finished. But its deadly legacy was to last decades.
How much radiation got out?
Within hours of the beginning of the nuclear disaster, onsite radiation monitors went off the scale because radiation levels exceeded their measurement capacity. There were only a few offsite radiation monitors operating that day. Subsequent examination of human blood, and of anomalies in animals and plants, suggest that significant levels of radiation were released.
In the days following the TMI meltdown, hundreds of local residents reported the same acute radiation exposure symptoms as victims of the Hiroshima and Nagasaki bombings — nausea and vomiting, severe fatigue, diarrhea, hair loss and graying, and a radiation-induced reddening of the skin. For example, Marie Holowka, a dairy farmer near TMI, recalled as she left the milkhouse that morning that, outside, “it was so blue, I couldn’t see ten feet ahead of myself.” There was a “copper taste” in the air. She was later treated for thyroid problems. Given the absence of monitors and the paucity of evidence, the only real radiation meters were the people of Three Mile Island.
“No one died:” The biggest lie
Given that exposure to ionizing radiation is medically understood to cause diseases like cancer which can be fatal, there is no way to definitively state that “no one died at TMI” or later developed cancers. The opposite is far more likely to be true.
Estimates can be complicated by the long latency period for illnesses caused by exposure to radiation. Sometimes exposed populations move away and cannot be tracked. Nevertheless, long after a catastrophic radiation release, disease can still manifest, both from the initial radiation exposure and from slow environmental poisoning, as the radionuclides released by the disaster are ingested or inhaled for many generations.
The only independent study that looked at the aftermath of TMI was conducted by the late Dr. Stephen Wing and his team at the University of North Carolina, Chapel Hill. They looked at radiation-specific markers in residents’ blood, called biomarkers, to assess dose rather than relying solely on industry-measured (or, as it turned out, mis-measured) radiation emissions. The team concluded that lung cancer and leukemia rates were two to 10 times higher downwind of the Three Mile Island reactor than upwind.
Harm to animals and plants
After the radiation release from Three Mile Island, a number of plants exhibited strange mutations including extra large leaves (gigantism), double-headed blossoms and other anomalies. These plant anomalies were documented over decades by Mary Osborn, a local resident who conducted meticulous plant research and is a founder of Three Mile Island Alert. (Her deformed rose is pictured at the top of this story.)
Robert Weber, a Mechanicsburg veterinarian, reported a 10% increase in stillbirths, and a marked increase in the need for Cesarean Sections among sheep, goats and pigs in 1979, 1980, and 1981 in a 15-mile area around the TMI site. Dr. Weber also reported significant increases in the cancer rate among animals with shorter life spans such as dogs and cats. These findings are consistent with research around Chernobyl.
Evacuation failure
During the licensing phase of the construction and operation of TMI, a nuclear disaster was considered unthinkable. Consequently, emergency plans were practically non-existent when TMI began its meltdown. Emergency planning officials were repeatedly misinformed by TMI owner, Metropolitan Edison, on the disaster’s progression, and kept in the dark about the need for public protective actions in the early days at TMI.
On March 30, Pennsylvania Governor Richard Thornburgh finally “advised” that pregnant women and pre-school age children voluntarily evacuate a five-mile perimeter around TMI, an anticipated target population of 3,500 people. Instead, approximately 200,000 people spontaneously evacuated from a 25-mile perimeter.
TMI demonstrated that managing human responses during a nuclear catastrophe is not realistic and provokes unique human behavior not comparable to any other hazard.
Competing loyalties between work duty and personal family caused a significant number of staffing problems for various emergency response roles. As the crisis intensified, more emergency workers reported late or not at all.
Doctors, nurses and technicians in hospitals beyond the five-mile perimeter and out to 25 miles, spontaneously evacuated emergency rooms and their patients. Pennsylvania National Guard, nuclear power plant workers, school teachers and bus drivers assigned to accompany their students, abandoned their roles for family obligations. A similar response could be expected in the same situation today.
This second article on the unsuitability of nuclear power discusses radioactive waste. Waste disposal assumes there is a place ‘away’, where unwanted things can be discarded. This is a dualistic assumption. Holism recognizes that everything is connected: there is no ‘away’. In nature, everything is recycled, but humans have yet to learn this. Since 1945, more than 80K chemicals have been created and dispersed into our air, water, and soil, 90 percent of them untested for biological toxicity. Bacteria and fungi can break many of these chemicals into component parts, rendering them non-toxic and available for reintroduction into living systems.
However, radioactive waste is toxic by processes of nuclear physics (not chemistry), and only becomes safe over long periods of time. The life span of radioactive isotopes is measured by the time needed for 50 percent decay, a half-life, which can be tens of thousands of years.
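Half-life decay compounds geometrically: after n half-lives, a fraction (1/2)^n of the isotope remains, so "safe" means waiting through many half-lives, not one. A minimal sketch of the arithmetic (plutonium-239's commonly cited ~24,100-year half-life is used purely for illustration):

```python
def fraction_remaining(elapsed_years: float, half_life_years: float) -> float:
    """Fraction of a radioactive isotope left after a given elapsed time."""
    return 0.5 ** (elapsed_years / half_life_years)

# Illustrative example: plutonium-239, half-life about 24,100 years.
HALF_LIFE = 24_100

for years in (24_100, 241_000):
    frac = fraction_remaining(years, HALF_LIFE)
    print(f"after {years:,} years: {frac:.4%} remaining")
```

Even after ten half-lives (241,000 years in this example), about a tenth of a percent of the original material is still present, which is why repository timescales are quoted in the hundreds of thousands of years.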
The Nuclear Regulatory Commission (NRC) requires that nuclear waste be categorized and treated based on its level of radioactivity. Very Low Level Waste, twice as radioactive as natural granite, will decay to natural levels within 30 years. This material is disposed of in monitored landfills. Low Level Waste, about 20 times more radioactive than granite, contains isotopes with long half-lives. In the US, this material must be buried in one of four NRC-regulated sites.
Intermediate Level Waste, generated from reprocessing spent fuel rods, is 100K to 100M times more radioactive than granite and can take more than 100 thousand years to return to natural levels. High Level Waste, spent fuel rods, is a billion times more radioactive than granite (an exposure of less than 20 seconds is lethal), and remains radioactive for millions of years. The 99 reactors currently operating in the US have already produced 80,000 tons of High Level Waste. This will double by the time the reactors are decommissioned. These rods are stored in cooling pools, or dry cask storage, within the reactor facilities, but space is limited.
The only safe disposal of Intermediate and High Level waste requires geologic and social stability for hundreds of thousands of years. Globally, there are six research facilities studying the problem but, after 60 years of commercial nuclear power, there are no repositories that accept this type of radioactive waste. Geologic sites might exist with this kind of longevity, but human structures, social and physical, are relatively short-lived. The Pandyan Empire in southern India lasted 2,000 years, and the oldest continuous culture, the Australian Aborigines, dates back only 50,000 years. Warning signs about enduring danger are problematic, since language originated about 10,000 years ago.
Yucca Mountain, designated as the US High Level Waste repository, was shut down in 2011, after decades of construction and billions of dollars in cost, because of unexpected groundwater intrusion and political resistance from the State of Nevada. In any event, it is too small to hold even the waste now in storage, let alone future production, and would require air conditioning for a century. There are no plans for alternative sites, although the Trump administration wants to reopen Yucca for consideration.
“Spent fuel rods” still contain 90 percent of the original enriched uranium, and fissionable plutonium is produced within the rods as a result of nuclear reactions. Advocates of nuclear technology and weapons see this as a potential resource and want future access preserved. High Level Waste must not be allowed to migrate into the environment, for health reasons, and national security demands that this material be kept out of the hands of terrorists. Designing geologic repositories that also allow possible future access complicates an already difficult problem.
A nuclear power plant boils water for less than half a century, and leaves a legacy lethal to life for a million years, with the added risk of it falling into the hands of terrorists to produce weapons of mass destruction. Such short-sighted thinking is typical of the dualistic mindset, which seems comfortable sacrificing future generations for the immediate gain of a few. We must be better than that.
Crispin B. Hollinshead is a retired mechanical engineer, a lifelong model maker, woodworker, and philosopher, residing in Mendocino County for over half his adult life, currently living in Ukiah.
Nuclear power is unsuited for a populated planet for three reasons: radiation, waste, and economics. Nuclear fission of a radioactive atom produces two smaller fragments (daughter products) plus energetic debris: gamma rays, beta and alpha particles, and neutrons. The bombs dropped on Japan were detonated at altitude to maximize blast damage; the radiation damage came from gamma rays, which irradiate the entire body. Biologists were not involved in the development of the atom bomb, so the devastation from radiation was unexpected. Radiation deaths continued long after the surrender, but this information was overshadowed by enthusiasm for the bomb's role in ending the war.
Three years later, a detailed study examined the health impact of radiation. Since Hiroshima and Nagasaki were obliterated and people had moved, it was difficult to track effects accurately, but damage correlated with radiation dosage. One conclusion, which ignored long-term results, was the idea of a “safe” level of radiation exposure, below which there was no cause for concern.
External radiation exposure from a single blast differs in effect from long-term exposure to radiation from material ingested by breathing, drinking, or eating. Radioactive isotopes concentrate in different parts of the body and decay at different rates, some long lasting. Internal beta and alpha exposure is very damaging, increasing the likelihood of disease, cancer, and genetic mutation.
Physicians for Nuclear Responsibility have campaigned for decades against the idea of a safe level for ingesting radioactive material. However, the idea of a “safe” level is important to governments and corporations that build nuclear power plants, because all aspects of the nuclear process release radioactive material into the environment. The fiction of a safe level means that no one takes responsibility for the health problems associated with radiation.
The three worst nuclear power accidents released untold amounts of radioactive material into the environment. At Three Mile Island in 1979, the reactor experienced a partial core meltdown, which vented contamination to the surrounding area for over 12 hours. Onsite radiation instruments quickly went off the scale, so the amount of radioactive material released could not be measured. To this day, the Nuclear Regulatory Commission (NRC) states that only small amounts were released and that no people were harmed; the many reports of health damage were dismissed as hysteria. Recent studies by independent investigators indicate the NRC understated the radiation exposure by a factor of 1,000.
At Chernobyl, in 1986, the graphite core burned for more than a week, consuming over five percent of the nuclear fuel. As there was no containment structure, a radioactive plume spread across western Russia and much of Europe. The Chernobyl area is still contaminated, and requires constant investment to keep it contained.
The 2011 Japanese earthquake and resulting tsunami led to the core meltdown and containment breach in three of six reactors at the Fukushima complex, and an explosive release from a spent fuel pool. The contamination of air, land, and water by highly radioactive hot particles was widespread, extending to Tokyo, 150 miles away. Contaminated water continues to flow into the Pacific Ocean, but the US government has never measured radiation in the ocean or the air off the west coast.
While large contaminations due to accidents have been rare, reactors are aging and growing more vulnerable to failure. Even normal reactor operation releases radiation into the environment. The mining, refining, and enriching of uranium fuel releases radioactive material. Mountains of radioactive mine tailings sit next to the Colorado River, the source of drinking water for millions.
The efficiency of a uranium reactor core is reduced by contamination from the daughter products of nuclear decay. Within a few years, when as little as 10 percent of the uranium has been consumed, this “spent fuel” is removed to cooling pools, and fresh fuel rods are installed. Even though most of the uranium is still useful, it is expensive to reprocess the spent fuel by removing the daughter products, and reconstituting fresh fuel rods. Everywhere this has been tried, massive radioactive environmental contamination has resulted.
Every reactor has a designed life span, after which it must be decommissioned and the site cleaned of radioactive material. There are 449 large commercial power reactors in operation globally. Another 150 have been shut down, but only 17 very small plants have been completely decontaminated. Decommissioning all the rest will introduce massive amounts of radioactive material into the environment.
Dualistic economics and the fiction of safe levels of radioactivity guarantee that.
Seven years after the Fukushima nuclear disaster began in Japan, forcing the evacuation of at least 160,000 people, research has uncovered significant health impacts in monkeys living in the area and exposed to the radiological contamination of their habitat.
Shin-ichi Hayama, a wild animal veterinarian, has been studying the Japanese macaque (Macaca fuscata), or snow monkey, since before the Fukushima nuclear disaster. His research has now shown that monkeys in Fukushima have significantly lower white and red blood cell counts, as well as a reduced body-weight growth rate and smaller head sizes.
Hayama, who began his macaque research in 2008, had access to monkeys culled by Fukushima City as a crop protection measure. He continued his work after the Fukushima nuclear explosions. As a result, he is uniquely positioned to discover how low, chronic radiation exposure can affect generations of monkeys.
The macaque is an Old World monkey native to Japan, living in colder climates than any other non-human primate. Like humans, macaques enjoy a good soak in the region's mountain hot springs. They are even said to have developed a “hot tub culture,” spending time at the pools to keep warm during winter.
However, snow monkeys and humans share more than a love of hot springs. Human DNA differs from that of the rhesus monkey, a relative of the snow monkey, by just 7%. While that 7% can mean the difference between building vast cities and living unsheltered outdoors, for basic processes like reproduction these differences begin to fade. Consequently, what is happening to the macaques in Fukushima should serve as a warning about the implications for human health as well, especially for evacuees now returning to a region that has been far from “cleaned up” to any satisfactory level.
Hayama’s research group has published two studies, each comparing data from before and after the nuclear catastrophe began, and between exposed and unexposed monkey populations. In a 2014 study, researchers compared monkeys from two regions of Japan: one group from the Shimokita region, 400 km north of Fukushima, and a second group from contaminated land in Fukushima.
The monkeys in Fukushima had significantly lower white and red blood cell counts; other blood components were also reduced. The more radioactive cesium was present in their muscles, the lower their white blood cell count, suggesting that exposure to radioactive material contributed to the damaging blood changes. These blood counts had not recovered even by 2017, meaning that this has become a chronic health issue.
Changes in blood are also found in people inhabiting contaminated areas around Chernobyl. A diminished number of white blood cells, which fight disease, can compromise the immune system in monkeys as well as people, leaving both species less able to fight off disease.
Hayama followed up his 2014 study with another in 2017 examining the differences in monkey fetus growth before and after the disaster. The researchers measured fetuses collected between 2008 and 2016 from Fukushima City, approximately 70 km from the ruined reactors. Comparing the relative growth of 31 fetuses conceived prior to the disaster and 31 fetuses conceived after the disaster revealed that body weight growth rate and head size were significantly lower in fetuses conceived after the disaster. Yet, there was no significant difference in maternal nutrition, meaning that radiation could be responsible.
Smaller head size indicates that fetal brain development was delayed, although the researchers could not identify which part of the brain was affected. As in the 2014 study, the mothers' muscles still contained radioactive cesium, although at lower levels. These mothers conceived after the initial disaster began, meaning that their fetuses' condition reflects continuing exposure from environmental contamination. This study mirrors human studies around Chernobyl that show similar impacts, as well as research on atomic bomb survivors. Studies of birds in Chernobyl-contaminated areas show that they, too, have smaller brains.
Although Hayama has approached radiation experts to aid with his research, he claims they have rejected it, saying they don’t have resources or time, preferring to focus on humans. But humans can remove themselves from contaminated areas, and many have chosen to stay away despite government policies encouraging return. Tragically, monkeys don’t know to leave, and relocating them is not under discussion, making study of radiation’s impact on their health vital to inform radiation research on humans, the environment, and any resettlement plans the government of Japan may have.
Hayama presented his work most recently as part of the University of Chicago’s commemoration of the 75th anniversary of the first man-made controlled nuclear chain reaction. His work follows a long, important, and growing line of research demonstrating that radiation can damage not only in the obvious ways we have been told, but also in subtle yet destructive ways that were not expected. The implications for humans, other animals, and the environment are stark. Cindy Folkers is the radiation and health specialist at Beyond Nuclear.
The Swedish Environmental Court has rejected the Nuclear Waste Company SKB’s license application for a final repository for spent nuclear fuel in Forsmark, Sweden. This is a huge triumph for safety and the environment – and for the Swedish NGO Office for Nuclear Waste Review (MKG), the Swedish Society for Nature Conservation (SSNC), and critical scientists. Now it is up to the Swedish government to make a final decision.
The Environmental Court took into consideration viewpoints from all parties in the case, including scientists who have raised concerns about disposing of spent nuclear fuel in copper canisters. During the legal proceedings, the Swedish NGO Office for Nuclear Waste Review (MKG) and the Swedish Society for Nature Conservation (SSNC) presented the shortcomings of this method of disposal. For many years, the environmental organisations have been arguing that the Nuclear Waste Company SKB needs to listen to critical scientists and investigate alternative disposal methods, especially the possibility of developing a deep borehole disposal system. (1) Johan Swahn, Director at MKG, said:
“Several independent researchers have criticized both the applied method and the selected site. There is a solid documentation base for the Environmental Court’s decision. It is hard to believe the Swedish Government’s conclusions will be any different from the Court’s.”
MKG has made an unofficial translation into English of the Environmental Court opinion. (2)
The court said no to the application because it considered that there were problems with the copper canister that had to be resolved now, not later. The translation shows the court's judicial argumentation and why it decided not to accept the opinion of the regulator, the Swedish Radiation Safety Authority (SSM), that the problems with the integrity of the copper canister were not serious and could likely be solved at a later stage of the decision-making process. The court is quite clear in its statement and argumentation:
“The Land and Environmental Court finds that the environmental impact assessment meets the requirements of the Environmental Code and can therefore be approved. All in all, the investigation meets the high standards set out in the Environmental Code, except in one respect, the safety of the canister.” (Emphasis added)
“The investigation shows that there are uncertainties, or risks, regarding how much certain forms of corrosion and other processes can impair the ability of the canister to contain the nuclear waste in the long term. Overall, these uncertainties about the canister are significant and have not been fully taken into account in the conclusions of SKB’s safety analysis. The Land and Environmental Court considers that there is some leeway for accepting further uncertainties. The uncertainties surrounding certain forms of corrosion and other processes are, however, of such gravity that the Court cannot, based on SKB’s safety analysis, conclude that the risk criterion in the Radiation Safety Authority’s regulations has been met. In the context of the comprehensive risk assessment required by the Environmental Code, the documentation presented to date does not provide sufficient support for concluding that the final repository will be safe in the long term.” (Emphasis added)
The court says that the application is only permissible if the nuclear waste company SKB:
“…produces evidence that the repository in the long term will meet the requirements of the Environmental Code, despite remaining uncertainties regarding how the protective capability of the canister may be affected by: a. corrosion due to reactions in oxygen-free water; b. pit corrosion due to reaction with sulphide, including the contribution of the sauna effect to pit corrosion; c. stress corrosion due to reaction with sulphide, including the contribution of the sauna effect to stress corrosion; d. hydrogen embrittlement; e. radioactive radiation impact on pit corrosion, stress corrosion and hydrogen embrittlement.”
The main difference between the court's and the regulator's decision-making was that the court decided to rely on a multitude of scientific sources and information, not only on the material provided by SKB. It had also emerged that the main corrosion expert at SSM did not want to say yes to the application at this time, which may have influenced the court's decision-making. In fact, there appear to have been many dissenting voices within the regulator, despite SSM's claim in court that a united agency stood behind its opinion.
The court underlines in its opinion that the Environmental Code requires the repository to be shown to be safe at this stage of the decision-making process, i.e. before the government has its say. The court says that some uncertainties will always remain, but it sees the possible copper canister problems as so serious that it is not clear that the regulator's limits for release of radioactivity can be met. This is a reason to say no to the project unless it can be shown that the copper canister will work as intended. The copper canister has to isolate humans and the environment from the radioactivity in the spent nuclear fuel over very long time-scales.
It is still unclear how the process will proceed. The community of Östhammar has cancelled its referendum on the repository, as there will be no question from the government in the near future. The government has set up a working group of civil servants to manage its handling of the opinions delivered by the court and SSM. SKB has said that it is preparing documentation for the government to show that there are no problems with the canister. Whether the government thinks this will be enough remains to be seen; it is likely not what the court had in mind. The government would be wise to make a much broader review of the issue: overriding the court's opinion would require a thorough judicial review at the governmental level, and otherwise the government's decision may not survive an appeal to the Supreme Administrative Court.
There are eminent corrosion experts who believe that copper is a bad choice as a canister material, and there is increasing experimental evidence that this is the case. The court's decision shows the importance of democratic and open governance in environmental decision-making. It is important that the continued decision-making on the Swedish repository for spent nuclear fuel is transparent and multi-faceted. (3)
Copper Canisters
The canister has to enclose the nuclear waste for a very long time; it is the final repository's primary safety function. The canister has a 50 mm thick copper shell with an insert of cast iron, and must withstand corrosion and mechanical stress.
The investigation on the capability of the canister is extensive and involves complex technical and scientific issues. These include groundwater chemistry, corrosion processes, as well as creep and hydrogen embrittlement (this latter affects the mechanical strength of the canister). However, the parties taking part in the court proceedings disagreed on several issues crucial to the final repository’s long-term security.
The Land and Environmental Court considered the following uncertainties regarding the canister to be most important in the continued risk assessment:
1. General corrosion due to reaction in oxygen-free water. The parties have different views on the scientific issues surrounding this kind of corrosion. The Court found that there is considerable uncertainty on this topic that has not been taken account of in SKB's safety analysis.
2. Local corrosion in the form of pit corrosion due to reaction with sulphide. The Court found that there is significant uncertainty regarding pit corrosion due to reaction with sulphide. This uncertainty has not been included in the safety analysis. In addition, there is uncertainty about the sauna effect, which may have an amplifying effect on pit corrosion.
3. Local corrosion in the form of stress corrosion due to reaction with sulphide. The Court found that there is significant uncertainty regarding stress corrosion due to reaction with sulphide. This uncertainty has not been included in the safety analysis. In addition, there is uncertainty about the sauna effect, which may have an amplifying effect on stress corrosion.
4. Hydrogen embrittlement, a process that affects the mechanical strength of the canister. The Court found that significant uncertainty regarding hydrogen embrittlement remains. This uncertainty has not been taken account of in the safety analysis.
5. The effect of ionizing radiation on pit corrosion, stress corrosion and hydrogen embrittlement. There is significant uncertainty regarding the impact of ionizing radiation on pit corrosion, stress corrosion and hydrogen embrittlement. This uncertainty has been included only to a limited extent in the safety assessment.
Meanwhile, the UK’s National Nuclear Laboratory (NNL) is to carry out an expert peer review of a Canadian research programme on microbiologically influenced corrosion of canisters that will be used to dispose of used nuclear fuel. The NNL has been contracted by Canada’s Nuclear Waste Management Organization (NWMO) to review its work on the potential for corrosion of the copper-clad canisters. The NWMO is responsible for designing and implementing the safe, long-term management of Canada’s used nuclear fuel under a plan known as Adaptive Phased Management. This requires used fuel to be contained and isolated in a deep geological repository, with a comprehensive process to select an informed and willing host for the project.
The used fuel will be isolated from the environment by a series of engineered barriers. Fuel elements comprise highly durable ceramic fuel pellets contained inside corrosion-resistant zircaloy tubes. Bundles of fuel elements are placed into large, durable copper-coated steel containers designed to contain and isolate used nuclear fuel in a deep geological repository, essentially indefinitely. The containers will be placed in so-called “buffer boxes” of bentonite clay, providing a fourth barrier.
World Nuclear News reports that although copper is highly resistant to corrosion, under anoxic conditions – that is, where no oxygen is present – sulphate-reducing bacteria have the potential to produce sulphide, which can lead to microbiologically induced corrosion (MIC) of copper. Waste management organisations and regulators therefore need to understand the levels of sulphide that will be present in a geological disposal facility, to understand its potential to migrate to the canister surface and the potential for it to cause copper corrosion, the NNL said.
The NWMO has been actively developing computer models that will be used to evaluate the potential for MIC once a disposal site has been selected. It selected the NNL to carry out a peer review of this work because of the UK laboratory's expertise in the biogeochemical processes that could affect repository performance, and in developing computer modelling techniques that simulate the effects of sulphate-reducing bacteria. The work is closely linked with NNL's participation in the European Commission Horizon 2020 MIND (Microbiology in Nuclear waste Disposal) project. (4)
MIT Receives Millions to Build Fusion Power Plant Within 15 Years https://gizmodo.com/mit-receives-millions-to-build-fusion-power-plant-withi-1823644634?IR=T Ryan F. Mandelbaum, 10 Mar 18
Nuclear fusion is like a way-more-efficient version of solar power: instead of harnessing energy from the rays of a distant sun, scientists create miniature suns in power plants here on Earth. It would be vastly more efficient and, more importantly, much cleaner than current methods of energy production. The main issue is that actually realizing fusion power has been really difficult.
But MIT announced yesterday that it is working with a new private company, Commonwealth Fusion Systems (CFS), to make nuclear fusion finally happen. CFS recently attracted a $50 million investment from the Italian energy company Eni, which it will use to fund the development.
The goal? “The 15-year timeline is to get a device that puts 200 megawatts on the grid,” CFS CEO and MIT grad Robert Mumgaard told Gizmodo. “That’s a small city.”
Nuclear fusion is the process by which two hydrogen nuclei are fused together into helium, releasing enormous amounts of energy. Scientists have been chasing fusion since the 1950s, but technological roadblocks remain, chiefly building a device that outputs more energy than it takes to run. Another fusion device under construction in Europe, ITER, should be ready by 2035 but is far over budget; the US Senate has attempted to pull out of the project, reports Nature.
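For reference, the "two hydrogen atoms" here are in practice the heavy hydrogen isotopes deuterium and tritium, the reaction most tokamak designs target; a minimal statement of it (the energy figures are standard textbook values, not from the article):

```latex
% Deuterium-tritium fusion: neutron-rich hydrogen isotopes fuse into helium
{}^{2}_{1}\mathrm{H} \;+\; {}^{3}_{1}\mathrm{H} \;\longrightarrow\;
{}^{4}_{2}\mathrm{He}\,(3.5\ \mathrm{MeV}) \;+\; n\,(14.1\ \mathrm{MeV})
```

Most of the released energy is carried away by the fast neutron, which is why neutron handling dominates the engineering of any D-T power plant.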
MIT’s “tokamak” fusion reactor seems to promise much cheaper fusion power. It relies on a non-traditional, higher-temperature superconductor, yttrium-barium-copper-oxide, a kind of cuprate, to allow electricity to flow efficiently without resistance. These superconductors produce the strong magnetic fields and high plasma pressures that confine the fusion reactions inside the device; the sun confines its own reactions with its immense gravity.
The MIT group has been researching the device for a few years. This infusion of capital should allow them to build magnets four times stronger within the next three years. The planned device, called Sparc, will be a 65th of ITER's size but release a fifth of its power, according to an email from MIT's press office.
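As a sanity check on those two figures, the implied gain in power density can be computed directly (treating "size" as volume is an assumption; the 1/65 and 1/5 ratios are the article's):

```python
from fractions import Fraction  # exact arithmetic avoids float rounding

# Sparc vs ITER, per the article: 1/65 the size, 1/5 the power.
size_ratio = Fraction(1, 65)   # Sparc volume relative to ITER (assumed volumetric)
power_ratio = Fraction(1, 5)   # Sparc fusion power relative to ITER

# Power density advantage implied by the two ratios
density_gain = power_ratio / size_ratio
print(density_gain)  # 13 -- Sparc would pack ~13x ITER's power density
```

That factor-of-13 density gain is what the stronger high-temperature-superconducting magnets are meant to buy: field strength, not sheer scale, does the confining.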
Of course, there are challenges. Martin Greenwald, deputy director of MIT’s Plasma Science and Fusion Center, told Gizmodo that they have yet to actually develop these magnets and integrate them into such a machine. They will also need to learn how to build and operate the device, and bring it to a position that it can actually enter the market.
It’s important to note that fusion is way different from fission, the process that powers current nuclear power plants. Fusion can’t cause an explosive runaway chain reaction without some sort of fissile material, and rather than using uranium like fission does, it’s water and lithium in, helium out. This also cuts down on nuclear weapons risks. “There’s no reason to have fissile materials around fusion plants,” said Greenwald. “If you do, you’re up to no good.”
Some, like the folks at the Bulletin of the Atomic Scientists, still worry that the excess neutrons produced in fusion could lead to radioactive waste or contaminants, as well as high costs.
Nature points out that plenty of others are in the fusion-with-high-temperature-superconductors game, too. Princeton has its own tokamak, and a British company called Tokamak Energy is using a similar device to produce fusion energy. But all the cash behind the MIT effort is significant.
“If MIT can do what they are saying—and I have no reason to think that they can’t — this is a major step forward,” Stephen Dean, head of Fusion Power Associates, in Maryland, told Nature. Perhaps all fusion power needed to become reality was, well, a lot of money. Mumgaard said that CFS’ collaboration with MIT will “provide the speed to take what’s happening in the lab and bring it to the market.”
Today is International Women’s Day “a time to reflect on progress made, to call for change and to celebrate acts of courage and determination by ordinary women who have played an extraordinary role in the history of their countries and communities.”
There are many such women in the anti-nuclear movement. For example:
Mary Olson is the founder of the Gender and Radiation Impact Project, and is clear that her life's mission is to bring to light the disproportionate impact of radiation on girls and women. Over her long career, Olson has studied radiation health consequences with some of the leading radiation researchers of the 20th century, including Rosalie Bertell, Alice Stewart, Helen Caldicott and Wing, and was featured in the educational film “The Ultimate Wish: Ending the Nuclear Age”. Through her work as a staff biologist and policy analyst at Nuclear Information and Resource Service, she has worked for decades to improve public policy on highly radioactive spent nuclear fuel and plutonium.
Below is an excellent fact sheet from the Nuclear Information and Resource Service
Little girls (age 0–5 years) are twice as likely to suffer harm from radiation (defined in BEIR VII as cancer) as little boys in the same age group. (iii) In October 2011, NIRS published a briefing paper, Atomic Radiation is More Harmful to Women, (iv) containing more details about these findings. The numbers in the BEIR VII tables are the source of this new information; gender difference is not discussed in the report text.

Not every dose of radiation results in detectable harm: cells have repair mechanisms. However, every exposure carries the potential for harm, and that potential is tied to age at exposure and gender.

Radiation Exposure Standards Based on the Adult Male Body

While we cannot see or otherwise detect radiation with our senses, we can see its damage…. The first regulations were made because soldiers and scientists in the U.S. (virtually all male to begin with) were working on building nuclear weapons. The first standards were “allowable” limits for exposing these men to a known hazard.

Radiation Levels vs. Dose

Geiger counters and other devices can detect levels of radiation and concentrations of radioactivity. It is much more difficult to say how much of that energy has impacted a living body (dose). Dose is calculated based on body size, weight, distance from the source and assumptions about biological impact. Gender is not factored into a typical determination of dose. Historically the “dose receptors” were male, and of a small age range. It is somewhat understandable that the “Reference Man” (v) was based on a “Standard Man”: a guy of a certain height, weight and age. Clearly such assumptions are no longer valid when there is such a striking gender difference: a 40% to 100% greater likelihood of cancer or cancer death (depending on age) for females, compared to males. (vi)
Not Only Cancer
Radiation harm includes not only cancer and leukemia, but also reduced immunity, reduced fertility, increases in other diseases including heart disease, birth defects including heart defects, and other mutations, both heritable and not. When damage to a developing embryo is catastrophic, spontaneous abortion or miscarriage of the pregnancy may result. (vii)
Gender Mechanism Not Yet Described
Perhaps the reason that the National Academy of Sciences does not discuss the fact that gender has such a large impact on outcome of exposure to radiation is that the causal mechanism is not yet described.
Dr. Rosalie Bertell, one of the icons of research and education on radiation health effects, suggests that one basis may be that the female body has a higher percentage of reproductive tissue than the male body, and points to studies showing that reproductive organs and tissues are more sensitive to radiation. Nonetheless, Dr. Bertell is clear: “While research is clearly needed, we should PROTECT FIRST.”
Ignoring Gender Results in More Harm
The NAS BEIR VII findings show that males of all ages are more resistant to radiation exposure than females, and that all children are more vulnerable than adults. The only radiation standard certain to protect everyone is zero; given that there is no safe dose of radiation, that is an appropriate goal. Any additional exposure above unavoidable naturally occurring radiation should require full disclosure and the concurrence of the individual. It is time to adopt non-radioactive practices for making energy, peace, security and healing.
03/10/2012 Mary Olson, NIRS Southeast maryo@nirs.org / 828-252-8409
i See http://www.nirs.org/radiation/
ii BEIR VII, Table 12D‐3 page 312, National Academy Press (Washington, DC) 2006.
iii BEIR VII page 311, Table 12‐D 1.
iv NIRS: Atomic Radiation is More Harmful to Women http://www.nirs.org/radiation/radhealth/radiationwomen.pdf
v ICRP Publication 23: Reference Man: Anatomical, Physiological and Metabolic Characteristics, 1st Edition
vi IEER: The use of Reference Man in Radiation Protection Standards and Guidance with Recommendations for Change http://www.ieer.org/reports/referenceman.pdf
vii Non‐cancer health effects are documented in the classic works of John Gofman, for instance Radiation and Human Health (Random House 1982), in digital documents available at http://www.ratical.org/radiation/overviews.html#CNR, and in Dr. Rosalie Bertell’s classic work No Immediate Danger (Summer Town Books, 1986).
Thorium ‒ a better fuel for nuclear technology? Nuclear Monitor, by Dr. Rainer Moormann 1 March 2018 An important, detailed critique of thorium by Dr. Rainer Moormann, translated from the original German by Jan Haverkamp. Dr. Moormann concludes:
“The use of technology based on thorium would not be able to solve any of the known problems of current nuclear techniques, but it would require an enormous development effort and wide introduction of breeder and reprocessing technology. For those reasons, thorium technology is a dead end.”
Author: Dr. Rainer Moormann, Aachen (r.moormann@gmx.de)
Thorium is currently described by several nuclear proponents as a better alternative to uranium fuel.
Thorium itself is, however, not a fissile material. It can only be transformed into fissile uranium-233 using breeder and reprocessing technology. It is 3 to 4 times more abundant than uranium.
Concerning safety and waste disposal, there are no convincing advantages in comparison to uranium fuel. A severe disadvantage is that uranium-233 bred from thorium can be used by terror organisations for the construction of simple but high-impact nuclear explosives. Development of a thorium fuel cycle without effective denaturation of the bred fissile material is therefore irresponsible.
Introduction
Thorium (Th) is a heavy metal of atomic number 90 (uranium has 92). It belongs to the group of actinides, is around 3 to 4 times more abundant than uranium, and is radioactive (the half-life of Th-232, the starter of the thorium decay chain, is 14 billion years, with alpha decay). There are currently hardly any technical applications. Distinctive is the highly penetrating gamma radiation from its decay chain (thallium-208 (Tl-208): 2.6 MeV, compared to 0.66 MeV for the gamma radiation from Cs-137). Over the past decade, a group of globally active nuclear proponents has been recommending thorium as fuel for a safe and affordable nuclear power technology without larger waste and proliferation problems. These claims deserve a scientific fact check. For that reason, we examine here the claims of thorium proponents.
Dispelling Claim 1: The use of thorium expands the availability of nuclear fuel by a factor of 400
Thorium itself is not a fissile material. It can, however, be transformed in breeder reactors into fissile uranium-233 (U-233), just as non-fissile U-238 (99.3% of natural uranium) can be transformed in a breeder reactor into fissile plutonium. (A breeder reactor is a reactor in which more fissile material can be harvested from spent nuclear fuel than was present in the original fresh fuel elements. It can be confusing that in nuclear vocabulary every conventional reactor breeds, but less than it uses, and is therefore not called a breeder reactor.)
For that reason, the use of thorium presupposes the use of breeder and reprocessing technology. Because these technologies have almost globally fallen into disrepute, it cannot be excluded that the more neutral term thorium is currently also used to disguise an intended reintroduction of these problematic techniques.
The claimed factor of 400: a factor of 100 is due to breeder technology, and is also achievable in the uranium-plutonium cycle. Only a factor of 3 to 4 is specific to thorium, simply because it is more abundant than uranium.
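The arithmetic behind the claimed factor of 400 can be sketched as follows; the variable names and the exact 3.5 abundance value are illustrative assumptions, and only the ratios come from the text.

```python
# Decomposing the "factor 400" resource claim, per the article's figures.
# Values are illustrative: the breeder gain is also achievable with the
# uranium-plutonium cycle, so it is not specific to thorium.

breeder_gain = 100       # rough resource-stretch factor from breeder technology
thorium_abundance = 3.5  # thorium is ~3 to 4 times more abundant than uranium

combined = breeder_gain * thorium_abundance  # ~350, i.e. "roughly 400"
print(f"combined factor: {combined:.0f}")
print(f"thorium-specific share: only the factor {thorium_abundance}")
```

The point of the decomposition is that almost all of the claimed gain belongs to breeder technology, not to thorium itself.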
Dispelling Claim 2: Thorium did not get a chance in nuclear energy development because it is not usable for military purposes
In the early stages of nuclear technology in the USA (from 1944 to the early 1950s), reprocessing technology was not yet well developed. Better developed were graphite-moderated reactors that used natural uranium and bred plutonium.
For the use of thorium (which, unlike uranium, contains no fissile component), enriched uranium or possibly plutonium would have been indispensable.
Initially, neither pathway for thorium development was chosen, because each would automatically have reduced the still limited capacity for producing military fissile materials. (Thorium has a higher capture cross-section for thermal ‒ that is, slow ‒ neutrons than U-238. As a fertile material in reactors it therefore requires a higher fissile density than U-238.)
Only around 1950, when US enrichment capacity delivered sufficient enriched uranium, did the military and later civil entry into thorium technology begin: in 1955 a bomb with U-233 from thorium was exploded, and a strategic U-233 reserve of around 2 metric tons was created. The large head start of the plutonium bomb could no longer be overtaken, and plutonium remained globally the leading military fission material (although, according to unconfirmed sources, Indian nuclear weapons contain U-233).
US military research concluded in 1966 that U-233 is a very potent nuclear weapon material, but that it offers hardly any advantages over the already established plutonium. Because light water reactors with low-enriched uranium (LEU) were already too far developed, thorium use remained marginal in civil nuclear engineering as well: for instance, the German “thorium reactor” THTR-300 in Hamm operated only for a short time, and in reality it was a uranium reactor (fuel: 10% weapon-grade, 93%-enriched U-235 and 90% thorium), because the share of energy produced from thorium did not exceed 25%.
Dispelling Claim 3: Thorium use has hardly any proliferation risk
The proliferation problem of Th / U-233 needs a differentiated analysis ‒ general answers are easily misleading. First, one has to assess the weapon capability of U-233. Criteria for good suitability are a low critical mass and a low rate of spontaneous fission. The critical mass of U-233 is only 40% of that of U-235, while the critical mass of plutonium-239 is around 15% smaller than that of U-233. A relatively easy-to-construct nuclear explosive needs around 20 to 25 kg of U-233.
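The relative critical masses stated above can be turned into rough absolute numbers; the ~50 kg bare-sphere reference mass for U-235 is an assumption for illustration (a figure of this order is often quoted), and only the percentage ratios are taken from the text.

```python
# Rough critical-mass comparison using the ratios given in the text.
# The U-235 reference mass is an assumed round number (bare sphere,
# no reflector); real figures depend strongly on geometry and tamper.

m_u235 = 50.0            # kg, assumed reference for U-235
m_u233 = 0.40 * m_u235   # "only 40% of that of U-235"
m_pu239 = 0.85 * m_u233  # "around 15% smaller than for U-233"

print(f"U-233 : {m_u233:.1f} kg")
print(f"Pu-239: {m_pu239:.1f} kg")
# A simple gun-type device needs somewhat more than one critical mass,
# consistent with the 20-25 kg of U-233 quoted in the text.
```

Under these assumptions the U-233 figure lands near 20 kg, which is why the article's 20-25 kg estimate for a simple explosive is plausible.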
The spontaneous fission rate is important because the neutrons from spontaneous fission act as a starter of the chain reaction; for an efficient nuclear explosion, the fissile material needs to reach a super-criticality of at least 2.5 (criticality is the number of new fissions produced by the neutrons of each fission).
If, because of spontaneous fissions, a noticeable chain reaction already starts during the initial conventional explosive trigger phase, while criticality is still between 1 and 2.5, undesired weak nuclear explosions end the super-criticality before a significant part of the fissile material has reacted. Whether this happens depends largely on how fast the criticality phase from 1 to 2.5 is passed. Weapons plutonium (largely Pu-239), and even more so reactor plutonium, has ‒ unlike the uranium fission materials U-235 and U-233 ‒ a high spontaneous fission rate, which excludes its use in easy-to-build bombs.
More specifically, plutonium cannot be made to explode in a so-called gun-type fission weapon, but both uranium isotopes can; plutonium needs the far more complex implosion design, which we will not go into further here. A gun-type fission weapon was used at Hiroshima: a cannon-barrel set-up in which a fissile projectile is shot into a fissile block of suitable form, so that together they form a highly super-critical arrangement. Here, the criticality phase from 1 to 2.5 lasts on the order of milliseconds ‒ a relatively long time, in which a plutonium explosive would destroy itself through weak nuclear explosions triggered by spontaneous fission.
Such uranium gun-type fission weapons are no longer found in modern weapon arsenals (South Africa's apartheid regime built 7 of them using uranium-235): their efficiency (at most a few percent) is low, and they are bulky (the Hiroshima bomb: 3.6 metric tons, 3.2 meters long), inflexible, and not really suitable for carriers like intercontinental rockets.
On the other hand, gun-type designs are highly reliable and relatively easy to build. The International Atomic Energy Agency (IAEA) also reckons that larger terror groups would be capable of constructing a nuclear explosive based on the gun-type design, provided they got hold of a sufficient amount of suitable fissile material.1
Bombs with a force of at most 2 to 2.5 times that of the Hiroshima bomb (13 kt TNT) are conceivable. For that reason, the USA and Russia have tried intensively for decades to repatriate the highly enriched uranium (HEU) they delivered world-wide.
A drawback of U-233 in weapon technology is that ‒ when it is produced for energy generation purposes ‒ it is contaminated with up to 250 parts per million (ppm) of U-232 (half-life 70 years).2 That does not impair the nuclear explosion capability, but U-232 feeds into the thorium decay chain, which means ‒ as mentioned above ‒ emission of the highly penetrating radiation of Tl-208. A strongly radiating bomb is undesirable in a military environment, both from the point of view of handling and because the radiation interferes with the bomb's electronics.
In the USA, there exists a limit of 50 ppm U-232 above which U-233 is no longer considered suitable for weapons.
Nevertheless, U-232 contamination does not really remove the proliferation problems around U-233. First of all, simple gun-type designs do not need any electronics; furthermore, radiation-safety concerns during bomb construction will hardly play a role for terrorist organisations that use suicide bombers.
Besides that, Tl-208 appears only at the end of the decay chain of U-232: freshly produced or purified U-233/U-232 radiates little for weeks and is easier to handle.2 It is also possible to suppress the build-up of U-232 to a large extent if, during the breeding of U-233, fast neutrons with energies above 0.5 MeV are filtered out (for instance by placing the thorium in the reactor behind a moderating layer) and thorium is used from ore containing as little uranium as possible.
A very elegant way to harvest highly pure U-233 is offered by the proposed molten salt reactors with integrated reprocessing (MSR): during the breeding of U-233 from thorium, the intermediate protactinium-233 (Pa-233) is produced, which has a half-life of around one month. If this intermediate is isolated ‒ as is intended in some molten salt reactor designs ‒ and allowed to decay outside the reactor, pure U-233 is obtained that is optimally suited for nuclear weapons.
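The decay step described above can be quantified with the standard half-life formula; the ~27-day half-life of Pa-233 is the accepted value behind the text's "around one month", and the holding time is a hypothetical example.

```python
# Pa-233 -> U-233 decay outside the reactor.
# Half-life of Pa-233 is ~27 days ("around one month" in the text);
# the 150-day holding period is a hypothetical example.

half_life_days = 27.0
t_days = 150.0

remaining = 0.5 ** (t_days / half_life_days)  # fraction of Pa-233 still left
decayed = 1.0 - remaining                     # fraction already turned into U-233

print(f"after {t_days:.0f} days, {decayed:.1%} of the Pa-233 has become U-233")
```

After roughly five half-lives, almost all of the isolated protactinium has become U-233, which is why a few months of holding outside the reactor suffices to harvest the material.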
An advantage of U-233 over Pu-239 for military use is that, under neutron irradiation during production in the reactor, it is far less prone to turning into nuclides that degrade the explosion capability. Like U-235, U-233 can be made unsuitable for weapons use by adding U-238: if depleted uranium is already mixed with the thorium when it is fed into the reactor, the resulting mix of nuclides is virtually unusable for weapons.
For MSRs with integrated reprocessing, however, this is not a sufficient remedy; one would also have to prevent the separation of protactinium-233.9
The conclusion has to be that the use of thorium carries severe proliferation risks. The danger is less that highly developed states would find it easier to obtain high-tech weapons than that the bar for constructing simple but highly effective nuclear explosives would be lowered considerably for terror organisations or unstable states.
Dispelling Claim 4: Thorium reactors are safer than conventional uranium reactors
The fission of U-233 produces roughly the same amounts of the safety-relevant nuclides iodine-131, caesium-137 and strontium-90 as that of U-235, and the decay heat is virtually the same. The differences in the actinides produced (see next claim) are of secondary importance for the risk during operation or in an accident. In this respect, thorium use does not deliver any recognisable safety advantages.
Of greater safety relevance is the fact that U-233 fission produces 60% fewer so-called delayed neutrons than U-235 fission. Delayed neutrons are not created directly during the fission of uranium, but by some short-lived decay products. Only because delayed neutrons exist can a nuclear reactor be controlled, and the bigger their share (for instance 0.6% with U-235), the larger the criticality range in which controllability is given (this is called delayed criticality). Above this controllable range (prompt criticality), a nuclear power excursion can occur, as in the Chernobyl accident. The fact that the delayed super-critical range is considerably smaller with U-233 than with U-235 is, from a safety point of view, an important technical disadvantage of thorium use.
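The narrowing of the controllable window can be illustrated numerically; the 0.6% delayed neutron fraction for U-235 is the figure given in the text, and the U-233 value follows from the stated "60% less".

```python
# Width of the delayed-critical (controllable) reactivity window,
# taken here as the delayed neutron fraction. The U-235 value is from
# the text; the U-233 value follows from "60% fewer delayed neutrons".

beta_u235 = 0.006             # 0.6% delayed neutrons for U-235 (per the text)
beta_u233 = 0.40 * beta_u235  # 60% fewer -> ~0.24%

print(f"U-235 controllable window: up to {beta_u235:.2%} above critical")
print(f"U-233 controllable window: up to {beta_u233:.2%} above critical")
# Between criticality 1 and 1 + beta, the reactor responds on the slow
# timescale of the delayed-neutron precursors and can be regulated;
# beyond that (prompt criticality), power can excurse, as at Chernobyl.
# U-233's window is less than half as wide, leaving less control margin.
```

The design consequence is that a U-233-fuelled core tolerates a smaller reactivity insertion before becoming prompt-critical, which is the safety disadvantage the text describes.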
During the design of thermal molten salt breeder reactors, the conclusion was that the use of thorium brings criticality-safety problems that do not appear with classical uranium use in this type of reactor. For that reason, attention had to turn to fast reactors for the use of thorium in molten salt reactors. Although this conclusion cannot be generalised, it shows that the use of thorium can lead to increased safety problems.
As mentioned, a serious safety problem is the necessity to restart breeder and reprocessing technology for thorium. Thorium is often advertised in connection with the development of so-called advanced reactors (Generation IV). The safety advantages attributed to thorium in this context are, however, mostly not germane to thorium (the fuel) but rather due to the reactor concept. Whether or not these