A serious new problem with stainless steel canisters for nuclear waste
By David Szondy, January 28, 2020. A new study by researchers at Ohio State University suggests that stainless steel may not be the best choice for containing high-level nuclear waste. By simulating long-term storage conditions, the team found that the storage materials interact with each other more than previously thought, causing them to degrade faster.
The storage of nuclear waste is more than a perennial political football; it is an existential problem. Whatever one’s opinions about nuclear power or weapons, there are thousands of tons of nuclear waste temporarily stored around the world, meaning that a way must be found to store it all safely in the long term.
The most important type of nuclear waste is the high-level waste left over from reprocessing nuclear fuel or from nuclear weapon production. Such waste is made up of a complex mixture of radioactive isotopes with half-lives ranging from years to millennia. Though reactors have been operating all over the world for over 75 years, only Finland has started to build a permanent storage facility for such very dangerous waste.
That may show a remarkable lack of political will or even courage, but perhaps this reluctance will turn out to be serendipitous. That’s because the favored way of storing high-level waste is to vitrify it: that is, to mix the isotopes with molten glass or ceramics to form a chemically inert mass that can be sealed in stainless steel canisters before being placed in an underground storage facility.
That plan may now have to change if the Ohio study is correct. Led by Xiaolei Guo, the team took glasses and ceramics and put them in close contact with stainless steel in various wet solutions for 30 days in conditions similar to those that would be found in the proposed US Yucca Mountain nuclear waste repository.
“In the real-life scenario, the glass or ceramic waste forms would be in close contact with stainless steel canisters,” says Guo. “Under specific conditions, the corrosion of stainless steel will go crazy. It creates a super-aggressive environment that can corrode surrounding materials.”
They found that the steel interacted with the glass or ceramic to produce severe and localized corrosion that both damaged the steel and corroded and cracked the glass and ceramics. According to the team, this is because the iron in stainless steel has a chemical affinity with the silicon in glass, accelerating corrosion. “This indicates that the current models may not be sufficient to keep this waste safely stored,” says Guo. “And it shows that we need to develop a new model for storing nuclear waste.” The research was published in Nature Materials. Source: Ohio State University
Nuclear recycling is a bad idea
The nuclear industry and the federal government keep claiming that nuclear power is safe and that finding a solution for its toxic waste is easy. If it’s so easy, why don’t they have a workable solution? Is it really just people’s unreasonable fears that obstruct the industry and the federal government from creating a permanent solution?

Originally we were told that there was no waste problem because the waste would be reprocessed and used again in bombs and new “breeder” reactors. That idea failed! Miserably! The only reprocessing facility for commercial nuclear waste that ever existed was West Valley in upstate New York, and it was shuttered after only five years because it contaminated the land and water around it with radiation. It remains a Superfund site to this day. Without the technology to safely reprocess it, nuclear fuel waste remains in fuel pools and dry storage at reactor sites all over the country.
All this was codified under the 1982 Nuclear Waste Policy Act (NWPA). Once the law was established, investigations began to determine the best dump sites. But every state identified as a potential site for a repository threatened to sue, and implementation of the NWPA was in crisis. The NWPA was amended and Congress targeted Yucca Mountain, which ultimately failed to meet the necessary criteria for safe isolation of the deadly material.

With the failure of the federal government and the nuclear industry to establish Yucca Mountain as the national repository for nuclear waste, nuclear corporations were forced to establish onsite storage at their operating and shuttered reactor sites. Six out of nine reactors in New England have shut down due to significant public opposition and their inability to compete with gas and renewables. These six sites are in varying degrees of cleanup. Without a “solution” for dealing with the nuclear waste, these sites have devolved into ad hoc nuclear waste dumps. All have created onsite storage for their high-level waste. It costs a lot to store the waste onsite: at least $5 million out of pocket for each year. This waste could remain onsite for decades if not centuries, so costs could really add up for corporations without any revenue. Naivety, arrogance, and thoughtlessness add up to a lot of money!

With waste piling up at shuttered reactor sites throughout the country, the industry has a perception problem. This is not a favorable image for an
industry trying to reinvent itself as the answer to global warming. So what’s the industry’s answer? It wants to create “interim storage” dump sites in west Texas and New Mexico, in working-poor, Hispanic communities, to make this problem disappear. These sites don’t have to meet the strict environmental standards that sank Yucca Mountain: isolation from the environment for 1,000 years and isolation from groundwater for 10,000 years.
This “interim storage” initiative is a statement of the failure of the nuclear industry and the federal government to address the most toxic waste we have ever created. We don’t need more nukes; we don’t need half-baked “solutions”. We need a commitment to put our best minds to solving this thorny problem. What is needed is a scientifically sound and environmentally just solution, not more magic or wish fulfillment. A qualified “panel” must be established and funded to create the standards required to protect the health and safety of the public and the planet, not the profit-driven, short-sighted monetary bottom line of a moribund industry.
Deb Katz is the executive director of the Citizens Awareness Network, which was founded locally in 1991 and has offices in Shelburne Falls and Rowe. Here’s a link to our website www.nukebusters.org.
World’s first public database of mine tailings dams aims to prevent deadly disasters
World’s first public database of mine tailings dams aims to prevent deadly disasters https://www.eurekalert.org/pub_releases/2020-01/g-wfp012320.php
Previously unreleased data offer unprecedented view into mining industry’s waste storage practices
GRID-Arendal, 24 Jan 2020. Environmental organization GRID-Arendal has launched the world’s first publicly accessible global database of mine tailings storage facilities. The database, the Global Tailings Portal, was built by Norway-based GRID-Arendal as part of the Investor Mining and Tailings Safety Initiative, which is led by the Church of England Pensions Board and the Swedish National Pension Funds’ Council on Ethics, with support from the UN Environment Programme. The initiative is backed by funds with more than US$13 trillion under management.
Until now, there has been no central database detailing the location and quantity of the mining industry’s liquid and solid waste, known as tailings. The waste is typically stored in embankments called tailings dams, which have periodically failed with devastating consequences for communities, wildlife and ecosystems.
“This portal could save lives”, says Elaine Baker, senior expert at GRID-Arendal and a geosciences professor with the University of Sydney in Australia. “Dams are getting bigger and bigger. Mining companies have found most of the highest-grade ores and are now mining lower-grade ones, which create more waste. With this information, the entire industry can work towards reducing dam failures in the future.”
The database allows users to view detailed information on more than 1,700 tailings dams around the world, categorized by location, company, dam type, height, volume, and risk, among other factors.
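To illustrate how records of this kind can be worked with, here is a minimal Python sketch that builds a few hypothetical rows using the field names described above and filters them. The column names, example values, and company names are invented for illustration; they are not the portal’s actual schema or data.

```python
import pandas as pd

# Hypothetical records mimicking the kinds of fields the Global Tailings Portal
# is described as holding; these rows are illustrative, not real portal data.
dams = pd.DataFrame([
    {"name": "Dam A", "country": "Brazil", "company": "Example Mining Co",
     "dam_type": "Upstream", "height_m": 86, "volume_m3": 12_000_000, "risk": "High"},
    {"name": "Dam B", "country": "Australia", "company": "Sample Resources",
     "dam_type": "Downstream", "height_m": 40, "volume_m3": 3_500_000, "risk": "Low"},
    {"name": "Dam C", "country": "Chile", "company": "Placeholder Metals",
     "dam_type": "Centreline", "height_m": 120, "volume_m3": 60_000_000, "risk": "Extreme"},
])

# Example query: tall, high-consequence facilities.
flagged = dams[(dams["height_m"] > 50) & (dams["risk"].isin(["High", "Extreme"]))]
print(flagged[["name", "country", "dam_type", "height_m", "risk"]])
```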
“Most of this information has never before been publicly available”, says Kristina Thygesen, GRID-Arendal’s programme leader for geological resources and a member of the team that worked on the portal. When GRID-Arendal began in-depth research on mine tailings dams in 2016, very little data was accessible. In a 2017 report on tailings dams, co-published by GRID and the UN Environment Programme, one of the key recommendations was to establish an accessible public-interest database of tailings storage facilities.
“This database brings a new level of transparency to the mining industry, which will benefit regulators, institutional investors, scientific researchers, local communities, the media, and the industry itself”, says Thygesen.
The release of the Global Tailings Portal coincides with the one-year anniversary of the tailings dam collapse in Brumadinho, Brazil, that killed 270 people. After that disaster, a group of institutional investors led by the Church of England Pensions Board asked 726 of the world’s largest mining companies to disclose details about their tailings dams. Many of the companies complied, and the information they released has been incorporated into the database.
For more information on tailings dams, see the 2017 report “Mine Tailings Storage: Safety Is No Accident” and the related collection of graphics, which are available for media use.
About GRID-Arendal
GRID-Arendal supports environmentally sustainable development by working with the UN Environment Programme and other partners. We communicate environmental knowledge that motivates decision-makers and strengthens management capacity. We transform environmental data into credible, science-based information products, delivered through innovative communication tools and capacity-building services.
India – a case study in regulatory capture by the nuclear industry
In this piece, the author, while discussing the issues around nuclear safety, explains why it is important to re-examine the proposed Nuclear Safety Regulatory Authority Bill for better regulation, transparency, and liability. Since returning to power last year with an overwhelming majority in the 2019 general elections, the Modi-led government has passed a series of laws in rapid succession without any credible dialogue either within or outside Parliament. ………… Even as there has been exceptional eagerness to push these amendments and pass new legislation, including notifying the Citizenship (Amendment) Act, 2019 despite intense country-wide protests and a raging debate on its underlying intent, there are urgent issues, such as nuclear safety, which remain in indefinite suspension.
The UPA-II government, under Dr Manmohan Singh, had introduced the Nuclear Safety Regulatory Authority (NSRA) Bill in the Lok Sabha on 07 September 2011, aimed at replacing India’s existing nuclear regulator, the Atomic Energy Regulatory Board (AERB), with a purportedly improved and more autonomous Nuclear Safety Regulatory Authority (NSRA), which would have the mandate to ‘regulate nuclear safety and activities related to nuclear material and facilities’.
The Bill, however, which had been referred to the Department-related Parliamentary Standing Committee on Science and Technology, Environment and Forests, did not come up for discussion before the dissolution of the 15th Lok Sabha, and subsequently lapsed. The Standing Committee had reportedly endorsed the Bill with only minor suggestions for changes, while two members of the Committee from the CPI(M) gave dissent notes, arguing that the Bill provided ‘no substantive autonomy’ to the proposed NSRA. According to available information, in April 2017 the Union Minister of State (Independent Charge) for Atomic Energy and Space, Dr Jitendra Singh, in a written response to a question in the Lok Sabha, had stated that a ‘fresh Bill’ similar to the earlier NSRA Bill was ‘under examination’.

………. India’s nuclear regulatory framework has long been criticized for being so thoroughly enmeshed within the government structure as to render its requisite independence practically meaningless. Nuclear safety in India has been the remit of the AERB, which was set up in November 1983 by an executive order of the Secretary of the DAE under Section 27 of the Atomic Energy Act, 1962, with modifications made in April 2000 to “exclude all BARC facilities from (its) oversight, (following) the declaration of BARC as a nuclear weapons laboratory”.
The AERB has had the dishonourable reputation of being subservient to India’s exclusively public sector operators, which it is required to monitor, and is also acknowledged as suffering from an acute lack of independence from industry and government.
As things stand, the AERB is responsible for monitoring the safety of the various nuclear facilities operated by agencies such as, the Nuclear Power Corporation of India Limited (NPCIL) and the Uranium Corporation of India Limited (UCIL), which fall under the purview of the Department of Atomic Energy (DAE). However, the Board is required to report to the Atomic Energy Commission (AEC), whose chairman is the Secretary of the DAE and one of whose members is the Chair of the NPCIL, and which overall, comes under the direct control of the Prime Minister of India. Thus, the regulatory board reports to the very agency it is required to assess and monitor in the interest of public safety.
Moreover, the AERB frequently draws upon the ‘expertise’ of scientists and engineers provided by the DAE – “almost 95% of the members in AERB’s review and advisory committees are drawn from among retired employees of the DAE, either from one of their research institutes like the Bhabha Atomic Research Center or a power generation company like the Nuclear Power Corporation of India Ltd.” – thus, calling into question the AERB’s functional autonomy.
Dr A Gopalakrishnan, the former Chairman of the AERB, has been at pains to explain how the present institutional setup makes nuclear safety regulation in India a ‘mere sham’ and that, for the AERB to function effectively, the DAE’s hold on the Board needs to be urgently done away with. In 1995, during Dr Gopalakrishnan’s tenure as the nuclear regulator, the AERB had prepared a comprehensive ‘Document on Safety Issues in DAE Installations’, a report detailing nearly 130 safety issues across India’s nuclear installations, with 95 of them designated ‘top priority’. The first reactions from the NPCIL and BARC, according to Dr Gopalakrishnan, were denial and questioning of the AERB’s own technical competence to review safety matters.
A 2012 Performance Audit Report on the AERB prepared by the Comptroller and Auditor General of India (CAG) and submitted to the Indian Parliament labelled the AERB a ‘subordinate office, exercising delegated functions of Central government and not that of the regulator’.
The Public Accounts Committee (PAC) scrutinizing the CAG report in 2013 castigated the Regulatory Board for failing to prepare a ‘comprehensive nuclear radiation safety policy despite a specific mandate in its constitution order of 1983’. The International Atomic Energy Agency’s (IAEA) Peer Review of India’s Nuclear Regulatory Framework in 2015 was also categorical in asserting that the AERB was in need of being separated from ‘other entities having responsibilities or interests that could unduly influence its decision making’.
As has been pointed out by MV Ramana, physicist and author of The Power of Promise, there have been accidents of ‘varying severity’ at several of the nuclear facilities being operated by the DAE, yet the regulatory board has frequently been seen downplaying the seriousness of such incidents, “postponing essential repairs to suit the DAE’s time schedules, and allowing continued operation of installations when public safety considerations would warrant their immediate shutdown and repair”. The charade of the AERB’s professed independence is further underscored by its conspicuous silence on the recent cybersecurity breach at the Koodankulam Nuclear Power Plant in Tirunelveli District in Tamil Nadu in October 2019.
It is these glaring frailties of the nuclear regulatory framework, coupled with the obdurate insistence of the Central government on massively expanding its activities along the entire nuclear fuel cycle despite unsettled safety concerns, a long-standing and vociferous people’s resistance against uranium mining and nuclear energy projects, and concerns surrounding the health, environmental, economic, and democratic costs of this expansion, that make imperative the need for a fiercely independent and non-partisan nuclear regulator.
Does the proposed NSRA fit the bill?
The NSRA Bill, 2011, upon its introduction, failed to evoke any enthusiasm among independent experts, nuclear sector watchers, and civil society actors, and instead was met with grim scepticism given that, among other things, it made light of the principle of ‘separation’ required under Article 8 of the IAEA Convention on Nuclear Safety, to which India is a State Party.
The NSRA Bill provides for the establishment of a ‘Council of Nuclear Safety’, headed by the Prime Minister and comprising five or more Union Ministers, the Cabinet Secretary, the Chairman of the AEC, and other ‘eminent experts’ nominated by the Central government, which in turn will constitute ‘search committees’ to select the Chair and Members of the proposed Regulatory Authority. Moreover, under Article 14 of the Bill, the Chairperson and Members of the NSRA can be removed by an order of the Central government.
Dr Gopalakrishnan argues that the Bill makes only an ornamental show of granting independence to the NSRA by requiring the Authority to report to Parliament instead of a government department, ministry or official. Concomitantly, however, the Bill also unambiguously provides for the supersession of the Authority and the assumption of ‘all the powers, functions and duties’ of the Authority by the Central Government if, in its ‘opinion’, the Authority fails to function in concert with the provisions of the proposed Act. It also requires the Authority to seek the approval of the central government prior to initiating any interaction with nuclear regulators of other countries and/or international organizations ‘engaged in activities relevant to…nuclear/radiation safety, physical security of nuclear material and facilities, transportation of nuclear and radioactive materials and nuclear and radiation safety and regulation’.
Article 20 (q) of the Bill mandates the NSRA to ‘discharge its functions and powers in a manner consistent with the international obligations of India’. This provision, argues Dr Gopalakrishnan, is deeply worrisome, for it “could mean, that if the Prime Minister has promised the French President in 2008 that India would buy six European Pressurised Reactors (EPRs)…(this) unilateral and personal commitment…will now (be) labelled ‘India’s international obligation’, and the NSRA cannot question, even on strong safety grounds, the setting up of those six EPR units, since that will violate the said clause of the Bill”. This might prove disastrous for both public and environmental safety in the long term.
Experts argue that far from separating the regulator from the government, these provisions contained in the NSRA Bill will only mean absolute government control over nuclear regulation, including over appointment and dismissal procedures, thus, opening the way for ‘pliant technocrats’ to occupy prominent positions within the Authority.
The proposed Bill is also fuzzy on the question of which nuclear facilities will fall under the purview of the NSRA – it empowers, for instance, the central government to exempt “any nuclear material, radioactive material, facilities, premises and activities” from the jurisdiction of the Authority, on grounds of ‘national defence and security’. ……….
These and other provisions of the Bill are a stark reminder that the DAE has no love for transparency and public oversight. Take, for instance, Article 45, which requires the Chairperson, Members, and other employees of the Authority to sign a ‘declaration of fidelity and secrecy’ “to not communicate or allow to be communicated to any person not legally entitled to any information relating to the affairs of the Authority”. It is for these reasons that the former nuclear regulator, Dr Gopalakrishnan, has described the proposed NSRA Bill as an exercise in ‘boxing in’ nuclear regulation “from all sides by government controls, diktats, and threats of retaliation”, thus making it even more emaciated than the existing nuclear regulator, the AERB. ….. https://theleaflet.in/in-a-season-of-impetuous-lawmaking-whither-nuclear-safety/
History of U.S. missiles testing atomic, bacterial and viral weapons in Utah
From an anonymous contributor, 22 Jan 2020. Some of what the US military, corporations and government did to us, on top of detonating several open-air nuclear bombs on us.
My government had a missile base 60 miles from uranium and downwinder hell in Utah. My government also detonated four atomic bombs under the river that ran through my town, in the late 1960s.
My government and military contractors launched hundreds of single- and multi-stage missiles from Green River, Utah, to White Sands, New Mexico, from 1965 to 1970. The missiles went to White Sands, New Mexico, which is approximately 900 miles from Green River, Utah. Their trajectory took them over southern Utah national parks, the Navajo, Zuni, Ute, Hopi and Pueblo native reservations, and most of north and central New Mexico. White Sands, New Mexico, is 50 miles from Alamogordo, New Mexico. Alamogordo is where the first atomic bomb in the world was detonated. The Cold War, multibillion-dollar missile project tested single- and multi-stage rockets and biological warfare payloads.
The Green River to White Sands missile project was meant to test missile payloads over the southwestern USA. You can find sections of missiles that failed along the 900-mile stretch from Green River to White Sands, New Mexico. They launched hundreds of missiles and rockets. Some of the missile-rockets failed and crashed into the desert, long before reaching their south-central New Mexico destination close to the Mexico-USA border. There are missile carcasses and stages from southern Utah to Canyonlands National Park and Grand Gulch National Monument, in the Four Corners area of the USA where Utah, Colorado, Arizona and New Mexico intersect. Missile parts from failed missiles can also be found on the New Mexico and Utah Navajo reservations. Missile detritus can be found near Farmington, New Mexico, west of Santa Fe, east of Deming, and near Socorro, New Mexico. The initial stages of the multi-stage rockets are mostly in Utah.
The military and government tested several biological weapon payloads in the missiles and rockets that went from Utah to White Sands, New Mexico. The Army and corporate contractors put biological warfare payloads on missiles, with viruses and bacteria in them. They tested a weaponized version of the hantavirus. They tested hardy bacterial spores, like anthrax, as well. They launched the biological warfare payloads with viruses and bacteria in special media, to test the stability of the hardiest virus and bacterial spore systems in missile carrier systems.
These biological warfare payloads, containing the media, were on rockets that went from Green River, Utah, to White Sands, New Mexico, from 1965 to 1970.
Hantavirus did not exist in humans in the United States of America until 1990. It was weaponized by the United States government and corporations in the 1960s. The first hantavirus casualties recorded in the USA were a family of Navajos in New Mexico, in 1992. Hantavirus has now spread to mice vectors in all parts of the United States of America. It is epidemic in the USA. I know because I once worked for an agency that treated and tracked it.
The 1993 Four Corners hantavirus outbreak was an outbreak of hantavirus that caused the first known human cases of hantavirus disease in the United States. It occurred within the Four Corners region – the geographic intersection of the U.S. states of Utah, Colorado, New Mexico, and Arizona – of the southwestern part of the country in the spring of 1993. This region is largely occupied by Native American tribal lands, including the Hopi, Ute, Zuni, and Navajo reservations, from which many of the cases were reported.
From “The Discovery of Hantaan Virus: Comparative Biology and …” by K.M. Johnson (Nov 1, 2004): “They became infected by tissues of antigen-positive wild mice of that single species. … Dr. Lee named it ‘Hantaan,’ after a small river near the border between the 2 Koreas, where human infection was isolated and endemic in the 1950s”
From “Brief Histories of Three Federal Military Installations in Utah: Kearns Army Air Base, Hurricane Mesa, and Green River Test Complex” (PDF), Utah Historical Quarterly, Utah State Historical Society, 34 (2), Spring 1966; archived from the original (PDF) on October 29, 2013, retrieved September 12, 2013: “More than 100 employees of the [Atlantic Research]
The Utah Launch Complex was a Cold War military subinstallation of White Sands Missile Range for USAF and US Army rocket launches. In addition to firing Pershing missiles, the complex launched Athena RTV missiles with subscale warheads of the Advanced Ballistic Re-entry System to reentry speeds and impact at the New Mexico range. From 1964 to 1975 there were 244 Green River launches, including 141 Athena launches and a Pershing to 281 kilometers altitude. “Utah State Route 19 runs through the Green River Launch Complex, which is south of the town and eponym of Green River.”
Debunking James Hansen’s claims in favour of nuclear power
John Wayne squares off against Jim Hansen, Medium, Albert Bates, 11 Jan 2020 “……. I greatly admire James Hansen ……. What annoys me, however, about Hansen, then and now, is his insistence, in utter disregard of best science, that nuclear energy can somehow save humanity from climate change because it is clean, safe, too cheap to meter and besides all that, is carbon-free. I watched with pity more than scorn when he took his time to repeat this nonsense at the recent UN climate conference in Madrid. He mounted fallacy upon fallacy in a pyramid of lies that had been heard since the 1940s coming from the Atomic Energy Commission, Nuclear Regulatory Commission, International Atomic Energy Agency and others in thrall to the atomic devil.

“. . . the genetic effect has no threshold and exposure is not only cumulative in the individual, but in succeeding generations. On this basis, there would be no tolerance dose, but rather an acceptable injury-limit.” [Parker, H.M., Instrumentation and Radiation Protection (March 1947), Health Physics, 38:957-970, June 1980]
and:
“Even sub-tolerance radiations produce certain biological changes (cosmic rays are supposed to have some biological effects), so tolerance radiation is not what one strives to get but the maximum permissible dose.”[Morgan, K.Z., The Responsibilities of Health Physics, The Scientific Monthly, 93 (August 1946); reprinted in Health Physics 38:949–952, June 1980.]
The question of what percentage of the population can be acceptably damaged first came to the attention of the AEC at a meeting of the Advisory Committee on Biology and Medicine on January 16–19, 1957. At this meeting the AEC advisors determined that a 20 percent increase in the rate of bone cancers and birth defects nationwide would be an “acceptable” effect of U.S. nuclear weapons testing activities. These scientists also acknowledged at this time that the long-term genetic effects were totally unknown.
The historical record indicates that prominent radiologists, health physicists, and geneticists of the time recognized even at the outset of America’s atomic power program that any large population exposure to even very minute amounts of ionizing radiation could create lingering public health problems and genetic damage, and these scientists went to some lengths, including sacrificing their own illustrious careers, to express their views publicly. [ long list of references given here]
[ discusses Fukushima]
….. atmospheric physicists should not opine on health physics. There is no dose of radiation below which there is not a negative biological effect. Indeed, there is a “superlinear” ratio of dose to effect at low doses, because doses that do not kill a cell cause genetic damage that is a larger health threat than dead cells, so humans and animals exposed to low doses are at greater health risk than those exposed to higher doses.
While there are hundreds of different radioactive isotopes within a nuclear reactor, the isotope Cesium-137 is easily measured and has become a standard by which to calculate impacts. During the two-day accident, 18 quadrillion becquerels of cesium were released into the Pacific (18 with 15 zeros). A typical abdominal or pelvic CT scan (the most often performed) is 14–18 thousandths of a becquerel, so during the accident the cesium dose to the environment was the same as about 1 quintillion (1 with 18 zeros) CT scans (repeated every second, continuously, for the next 300 to 600 years). Depending on the type of scan and the age and sex of the patient, a single CT scan will produce 1 cancer for 150 to 3300 exposures, or a median risk of 10 cancers per becquerel (or sievert). [table here on original]
By that calculation, the cesium released during the Fukushima accident was capable of causing roughly 10 quadrillion cancers, but with one important difference.
When you receive a radiation procedure like a CT scan, it is sudden and one-off. One second. The technician presses the button and it is on and then off. There is no danger from the machine when it is off. When radioactive elements like cesium-137 (and remember that is just one of hundreds of elements in a nuclear reactor) are released to the environment, there is no off-switch. Thus, the cesium released during the Fukushima accident is capable of roughly 10 quadrillion cancers per second. Inhaling or ingesting it can kill a person, a dolphin or a seagull, but then as the individual’s body decomposes after death, as bacteria, worms and fungi eat away the flesh and bone, the isotope goes back into the food chain to strike another individual, and another, and so on. The danger is limited only by the isotope’s half-life, the time it takes for half of the material to decay, which for cesium-137 is 30.17 years. Scientists generally use 10 or 20 half-lives to bracket safety concerns, so for cesium-137, “safe” levels arrive in 302 to 604 years (around year 2322 to year 2624), admittedly an imperfect measurement since any residue, no matter how microscopic, may still be lethal, as we have known since before the Manhattan Project. Cesium is one of 256 radionuclides released during Fukushima, so we would need to calculate quantities, biological effectiveness, and the decay time of each of those to get the full health picture. Other isotopes in the Fukushima fuel include Uranium-235, with a half-life of 704 million years, and Uranium-238, with a half-life of 4.47 billion years, or longer than the age of the Earth.
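The 302-to-604-year window quoted above follows directly from the 30.17-year half-life. Here is a minimal Python sketch, assuming nothing beyond that half-life figure, showing how much cesium-137 remains after a given number of half-lives:

```python
def fraction_remaining(half_lives: float) -> float:
    """Fraction of a radionuclide left after the given number of half-lives."""
    return 0.5 ** half_lives

CS137_HALF_LIFE_YEARS = 30.17  # figure quoted in the article

for n in (1, 10, 20):
    years = n * CS137_HALF_LIFE_YEARS
    print(f"{n:2d} half-lives = {years:6.1f} years, "
          f"{fraction_remaining(n):.6%} of the cesium-137 remains")

# 10 half-lives is about 301.7 years (roughly 0.1% left); 20 half-lives is about
# 603.4 years (roughly 0.0001% left), matching the 302-604 year window cited above.
```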
At Fukushima, the end of the accident was not the end of the story. In 2013, 30 billion becquerels of cesium-137 were still flowing into the ocean every day from the damaged and leaking reactor cores. That is 300 billion cancer doses per second of man-made cesium added every day, or 109.5 trillion cancer doses per second added every year. To stop this assault on ocean life, and our own, over the next 5 years the owner of the plant constructed more than 1000 tanks to hold contaminated water away from the ocean. In September 2019, the Japanese government announced that more than one million tons were in storage but that space would run out by the summer of 2022, so it planned to begin releasing those billions of becquerels to the ocean again. Swimmers and sailors who plan to compete in open water events at the 2020 Tokyo Olympics might want to think about that, as might any who fish those waters or consume the catch.
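The daily and yearly figures above follow from multiplying the article’s own numbers: a becquerel is one decay per second, and the article assigns roughly 10 potential cancer doses to each becquerel. A minimal sketch of that arithmetic (the conversion factor is the article’s assumption, not an established risk coefficient):

```python
# Figures as stated in the article (not independently verified here).
leak_bq_per_day = 30e9   # becquerels of cesium-137 leaking per day (2013 figure)
doses_per_bq = 10        # the article's "median risk of 10 cancers per becquerel"

added_per_day = leak_bq_per_day * doses_per_bq   # 3.0e11  -> "300 billion"
added_per_year = added_per_day * 365             # 1.095e14 -> "109.5 trillion"

print(f"{added_per_day:.3e} dose-units added per day")
print(f"{added_per_year:.4e} dose-units added per year")
```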
What happens to ocean creatures who ingest radionuclides from leaking nuclear power plants is not very different from what happened to John Wayne, his sons and his co-stars. As the isotopes decay within the body of a dolphin or a coral polyp, they send microscopic bullets hurtling through DNA chains, causing tumors, sicknesses, defective offspring and death for untold generations. The chance that a single mutation will produce a beneficial result is less than one in a million. Radioactivity is, for practical purposes, forever, as we can see just by looking up at our Sun, a benevolent nuclear reactor providing us energy from the relatively safe distance of 93 million miles.
Even that radiation will kill a number of us, but far fewer than would die if, by some devilish plan or panic response, we follow Dr. Hansen’s advice. https://medium.com/@albertbates/john-wayne-squares-off-against-jim-hansen-42a258b2260d
Climate change afflicting the health of the world’s children
Warning: Climate change will bring major new health risks for kids https://thebulletin.org/2020/01/warning-climate-change-will-bring-major-new-health-risks-for-kids/?utm_source=Newsletter&utm_medium=Email&utm_campaign=MondayNewsletter01202020&utm_content=ClimateChange_HealthRisks_01172020#
By Kathleen E. Bachynski, January 17, 2020 As we enter a new decade, headlines from across the world make all too clear that the effects of climate change are not just looming. They’re here, they’re now, and they’re devastating communities on every continent. For example, in Australia, unprecedented fires have emitted roughly 400 million tons of carbon, killed at least 25 people, and destroyed 2,000 homes. In Indonesia, terrible flooding has killed at least 67 people and caused 400,000 to abandon their homes. The loss of sea ice in the Arctic is shrinking access to food resources that numerous indigenous communities have depended on for generations.
But the health effects of climate change go beyond even the most immediate and obvious consequences of fires, floods, and melting ice. In November 2019, the medical journal The Lancet published a detailed report examining the effects that climate change will have on human health under two scenarios: one in which the world reins in emissions according to commitments laid out in the Paris agreement, and one in which the world does not. In both cases, children will be most vulnerable to the numerous health harms resulting from decisions made by their parents and grandparents. Children are particularly likely to suffer the effects of climate change for numerous reasons: Their immune and organ systems are still developing, they drink relatively more water and breathe in more air than do adults relative to their body weight, and they tend to spend more time outdoors. Understanding the full scope of the public health consequences of a changing climate, then, involves examining how the risks will affect the bodies of the youngest people.
According to the Lancet report, air pollution—specifically, exposure to fine particulate matter known as PM 2.5—represents the largest environmental risk factor for premature deaths across the globe. When people think of the public health effects of air pollution, they often imagine the worst-case scenarios. For example, the smoke from the fires in Australia is currently so severe that a day spent inhaling the air in east Sydney represents the equivalent of smoking 19 cigarettes.
But air pollution need not reach such extreme levels to cause serious harm. Far more commonly, people are unaware of the daily pollution that they are breathing in due to the burning of fossil fuels, such as coal and gas. In fact, more than 90 percent of children are exposed to concentrations of PM 2.5 higher than the World Health Organization’s guidelines on outdoor air pollution. Over a lifetime, unhealthy air damages lungs and increases risks for a host of diseases, from asthma to pneumonia. And due to their small body size and the factors cited above, children absorb more of this pollution than do adults.
Similarly, The Lancet report notes that children are particularly vulnerable to the effects of heat. Specifically, young children are at greater risk for experiencing electrolyte imbalance, fever, respiratory disease, and kidney disease during periods of extreme heat. Rates of heat-related deaths are four times higher among children younger than one year old as compared to people aged 1-to-44. Changing temperature and precipitation patterns are also influencing the transmission of disease from insects to humans. In particular, malaria and dengue are spread by mosquitoes, and climate suitability for transmission of these diseases is increasing in numerous parts of the world. Because children tend to spend more time outdoors, they are more likely to contract these diseases. In 2017, children accounted for 61 percent of all malaria deaths worldwide, and climate change is putting more children at even greater risk.
Changing climate patterns, droughts, and fires also threaten to reduce crop yields and increase food insecurity. Moreover, rising carbon dioxide appears to diminish the nutrient quality of crucial staple foods such as wheat and rice. Combined, these trends are likely to exacerbate the already serious global health problem of malnutrition, which currently accounts for nearly one-fifth of premature deaths and poor health globally. The consequences of malnutrition are particularly severe among children. In 2018, 22 percent of children under five years of age were stunted, meaning they experienced impaired growth and development. Stunting is largely irreversible and includes serious consequences, from poorer cognition to increased risk of nutrition-related chronic diseases later in life.
Finally, The Lancet report observes that climate change has other health implications that are more challenging to quantify but crucial to address, such as mental health effects. Researchers have found that children are at high risk of mental health problems following the types of natural disasters that are likely to increase due to climate change. For example, one study found that 31 percent of a group of children who were evacuated during Hurricane Katrina reported clinically significant symptoms associated with depression and Post Traumatic Stress Disorder. According to the Centers for Disease Control, children are at particular risk for stress after a disaster because they often understand less about what is occurring, feel less able to control events, and have less experience coping with difficult situations.
Protecting children from air pollution, heat-related deaths, infectious diseases, malnutrition, and mental health effects associated with climate change will involve the mobilization of all sectors of society to drastically reduce emissions and invest in health systems and infrastructure. The Lancet report notes a few promising signs, such as increased public and political engagement, and increasing health adaptation spending to improve communities’ resilience to a changing climate. Unfortunately, however, current efforts are falling far short of what is needed to meaningfully reduce carbon emissions on the scale needed to address the threat posed to human health. According to a 2019 United Nations report, greenhouse gas emissions must begin falling by 7.6 percent this year in order to meet the most ambitious goals laid out in the 2015 Paris climate accord. But the world is nowhere near this goal, and many countries are heading in the opposite direction. Notably, in 2018, energy-related carbon dioxide emissions rose by 2.7 percent in the United States. The United Nations has warned that every year of delay “brings a need for faster cuts, which become increasingly expensive, unlikely, and impractical.”
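For a sense of what sustained 7.6 percent annual cuts would add up to over the decade the UN report covers, here is a small illustrative sketch compounding that reduction from 2020 to 2030; the baseline index of 100 and the start year are assumptions for illustration only:

```python
# Compound the 7.6% annual reduction the UN report calls for, over 2020-2030.
annual_cut = 0.076
emissions = 100.0  # arbitrary baseline index for 2020

for year in range(2020, 2031):
    print(f"{year}: {emissions:6.1f}")
    emissions *= (1 - annual_cut)

# After ten consecutive 7.6% cuts, the index falls to roughly 45% of the
# baseline, i.e. a reduction of a little over half by 2030.
```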
Waiting until action becomes more difficult, or perhaps even impossible, has appalling moral consequences. The longer we fail to act to address the risks of climate change, the more human lives we place on the line. And the majority of those lives will belong to the most vulnerable among us. It is no wonder, then, that children across the world have taken the lead in advocating for urgent, necessary action. The public health stakes for them—and for all people—grow higher with each passing year. Our health is fundamentally tied to our planet’s health. We must all consider, then, what actions we need to take to protect our planet—and thereby our communities, our children, and our selves.
Britain’s £1.2bn cleanup begins, of Berkeley power station, closed 30 years ago
Nuclear waste removal begins 30 years after power station closure, https://www.bbc.com/news/uk-england-somerset-50866867 5 Jan 2020. Work has begun on removing nuclear waste from Berkeley power station, 30 years after it was decommissioned. The disused Magnox generator, situated on the banks of the River Severn in Gloucestershire, closed in 1989.
It was one of the world’s first commercial nuclear power stations, and its laboratories and many of its buildings have already been dismantled. Work emptying its vast concrete vaults of the nuclear waste Berkeley generated is only now able to safely begin. But it will not be safe for humans to go inside its reactor cores until 2074. The BBC has been given a rare glimpse of what is stored under the disused site.

For the past 50 years parts of the coastline of the west of England have been dominated by nuclear power stations. The 1960s saw the construction of Hinkley A and Hinkley B in Somerset, with both Oldbury and Berkeley built on the banks of the River Severn in the 1950s and 1960s. Only Hinkley B is still in use, but the nuclear waste the stations generated has remained in place. It takes hundreds of years to decay and has to be stored underground. It will cost an estimated £1.2bn to fully decommission Berkeley. About 200 people are currently working on the site under strict security.

Work emptying waste products from the concrete vaults, eight metres (26ft) underground, is a complicated process. They contain used graphite from the fuel elements in the nuclear generating process, material from the cooling ponds and from the laboratories. The removal is expected to take five or six years to complete.

Rob Ledger, waste operations director at Berkeley, said: “When the power stations first started generating I don’t think there was much thought put into how the waste was going to be dealt with or retrieved.

“It’s taken a while to develop the equipment and the facilities [to do this].

“A mechanical arm moves the debris into position and then a ‘grab’ comes down through an aperture in the vaults and picks up the debris [and] puts it into a tray.

“Each debris-filled tray weighs up to 100kg (220lb).

“The automated machinery is controlled by computers [and] tips [the waste] into a cast iron container.”

The containers will house the waste in an intermediate storage facility until a long-term solution can be found. “Nuclear waste does take a long time to decay… it’s hundreds of years. And that’s why we have to go to these lengths, to store it safely,” said Mr Ledger. Eventually the boxes will be housed deep underground in a long-term storage facility. The location has not yet been decided by the government.

There are currently estimated to be almost 95,000 tonnes of nuclear waste in the form of graphite blocks across the UK. But if the carbon-14 can be extracted from the blocks, they become much safer and easier to deal with. A new process is being explored by scientists at Bristol University to ensure not all of the waste will be discarded. They have developed a process that uses spent reactor-core material as a new power source. Carbon-14 from nuclear reactors is infused into wafer-thin diamonds, man-made in a lab at Bristol University. These then become radioactive and form the heart of a battery that would last for many thousands of years. The tiny batteries could be used in pacemakers, hearing aids or sent into space as part of the space programme. The process is being piloted in association with the UK Atomic Energy Authority in Abingdon. It is hoped the decommissioned Gloucestershire site may be redeveloped to manufacture the new batteries, creating jobs in the region.
War planners ignore the fire effects of nuclear bombing
City on fire, Nuclear Darkness, by Lynn Eden, 30 Dec 19. By ignoring the fire damage that would result from a nuclear attack and taking into account blast damage alone, U.S. war planners were able to demand a far larger nuclear arsenal than necessary.
For more than 50 years, the U.S. Government has seriously underestimated damage from nuclear attacks. The earliest schemes to predict damage from atomic bombs, devised in 1947 and 1948, focused only on blast damage and ignored damage from fire, which can be far more devastating than blast effects.
The failure to include damage from fire in nuclear war plans continues today. Because fire damage has been ignored for the past half-century, high-level U.S. decision makers have been poorly informed, if informed at all, about the extent of damage that nuclear weapons would actually cause. As a result, any U.S. decision to use nuclear weapons almost certainly would be predicated on insufficient and misleading information. If nuclear weapons were used, the physical, social, and political effects could be far more destructive than anticipated.
How can this systematic failure to assess fire damage have persisted for more than half a century? The most common response is that fire damage from nuclear weapons is inherently less predictable than blast damage. This is untrue. Nuclear fire damage is just as predictable as blast damage.
One bomb, one city
To visualize the destructiveness of a nuclear bomb, imagine a powerful strategic nuclear weapon detonated above the Pentagon, a short distance from the center of Washington, D.C. Imagine it is a “near-surface” burst, about 1,500 feet above the ground, which is how a military planner might choose to wreak blast damage on a massive structure like the Pentagon. Let us say that it is an ordinary, clear day with visibility at 10 miles, and that the weapon’s explosive power is 300 kilotons, the approximate yield of most modern strategic nuclear weapons. This would be far more destructive than the 15-kiloton bomb detonated at Hiroshima or the 21-kiloton bomb detonated at Nagasaki.
Washington, D.C., has long been a favorite hypothetical target. But a single bomb detonated over a capital city is probably not a realistic planning assumption.
When a former commander in chief of the U.S. Strategic Command read my scenario, he wanted to know why I put only one bomb on Washington. “We must have targeted Moscow with 400 weapons,” he said. He explained the military logic of planning a nuclear attack on Washington: “You’d put one on the White House, one on the Capitol, several on the Pentagon, several on National Airport, one on the CIA, I can think of 50 to a hundred targets right off. . . . I would be comfortable saying that there would be several dozens of weapons aimed at D.C.” Moreover, he said that even today, with fewer weapons, what makes sense would be a decapitating strike against those who command military forces. Today, he said, Washington is in no less danger than during the Cold War.
The discussion that follows greatly understates the damage that would occur in a concerted nuclear attack, and not only because I describe the effects of a single weapon. I describe what would happen to humans in the area, but I do not concentrate on injury, the tragedy of lives lost, or the unspeakable loss to the nation of its capital city. These are important. But I am concerned with how organizations estimate and underestimate nuclear weapons damage; thus, I focus largely, as do they, on the physical environment and on physical damage to structures.
With this in mind, let us look at some of the consequences of a nuclear weapon detonation, from the first fraction of a second to the utter destruction from blast and fire that would happen within several hours. This will allow us to understand the magnitude of the damage from both effects, but particularly from fire, which is neither widely understood nor accounted for in damage prediction in U.S. nuclear war plans.
Unimaginable lethality
The detonation of a 300-kiloton nuclear bomb would release an extraordinary amount of energy in an instant: about 300 trillion calories within about a millionth of a second. More than 95 percent of the energy initially released would be in the form of intense light. This light would be absorbed by the air around the weapon, superheating the air to very high temperatures and creating a ball of intense heat, a fireball.
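The 300-trillion-calorie figure follows from the standard definition of explosive yield, in which one kiloton of TNT equivalent is set at 10^12 calories (about 4.184 × 10^12 joules). A quick check:

```python
# Yield-to-energy conversion using the standard TNT-equivalent definition.
CAL_PER_KILOTON = 1e12      # one kiloton of TNT equivalent is defined as 1e12 calories
JOULES_PER_CALORIE = 4.184

yield_kt = 300
energy_cal = yield_kt * CAL_PER_KILOTON          # 3e14 calories, i.e. 300 trillion calories
energy_joules = energy_cal * JOULES_PER_CALORIE  # roughly 1.26e15 joules

print(f"{energy_cal:.2e} calories ({energy_joules:.2e} joules)")
```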
Because this fireball would be so hot, it would expand rapidly. Almost all of the air that originally occupied the volume within and around the fireball would be compressed into a thin shell of superheated, glowing, high-pressure gas. This shell of gas would compress the surrounding air, forming a steeply fronted, luminous shockwave of enormous extent and power: the blast wave.
By the time the fireball approached its maximum size, it would be more than a mile in diameter. It would very briefly produce temperatures at its center of more than 200 million degrees Fahrenheit (about 100 million degrees Celsius), about four to five times the temperature at the center of the sun.
This enormous release of light and heat would create an environment of almost unimaginable lethality. Vast amounts of thermal energy would ignite extensive fires over urban and suburban areas. In addition, the blast wave and high-speed winds would crush many structures and tear them apart. The blast wave would also boost the incidence and rate of fire-spread by exposing ignitable surfaces, releasing flammable materials, and dispersing burning materials.
Within minutes of a detonation, fire would be everywhere. Numerous fires and firebrands (burning materials that set more fires) would coalesce into a mass fire. (Scientists prefer this term to “firestorm,” but I will use them interchangeably here.) This fire would engulf tens of square miles and begin to heat enormous volumes of air that would rise, while cool air from the fire’s periphery would be pulled in. Within tens of minutes after the detonation, the pumping action from rising hot air would generate superheated ground winds of hurricane force, further intensifying the fire.
Virtually no one in an area of about 40-65 square miles would survive.
A little farther away…….
Three miles from ground zero……..
A hurricane of fire…..
….The first indicator of a mass fire would be strangely shifting ground winds of growing intensity near ground zero. (Such winds are entirely different from and unrelated to the earlier blast-wave winds that exert “drag pressure” on structures.) These fire-winds are a physical consequence of the rise of heated air over large areas of ground surface, much like a gigantic bonfire.
The inrushing winds would drive the flames from combusting buildings horizontally toward the ground, filling city streets with hot flames and firebrands, breaking in doors and windows, and causing the fire to jump hundreds of feet to swallow anything that was not yet violently combusting. These extraordinary winds would transform the targeted area into a huge hurricane of fire. Within tens of minutes, everything within approximately 3.5 to 4.6 miles of the Pentagon would be engulfed in a mass fire. The fire would extinguish all life and destroy almost everything else.

Firestorm physics

This description of the physics of mass fire is based on the work of a few scientists who have examined in detail the damaging effects of nuclear weapons, including nuclear engineer Theodore A. Postol and physicist Harold Brode. Postol is one of the country’s leading non-government funded technical experts on nuclear weapons, missiles, and arms control. Brode’s five-decade career has been devoted to the study of nuclear weapons effects.

That mass fires have occurred, and that something like the firestorm described here could occur, is not in dispute. What is not widely accepted is that nuclear weapons detonated in urban or suburban areas would be virtually certain to set mass fires, and that the resulting damage is as predictable as blast damage. The much more widely held view is that the probability and range of mass fire depends on many unpredictable environmental variables, including rain, snow, humidity, temperature, time of year, visibility, and wind conditions. But the work of Postol, Brode, and Brode’s collaborators shows that mass fire creates its own environment. Except in extreme cases, environmental factors do not affect the likelihood of mass fire. Weather can affect the fire’s range, but this can be reasonably well predicted. For nuclear weapons of approximately 100 kilotons or more, the range of destruction from mass fire will generally be substantially greater than from blast.

The extraordinarily high air temperatures and wind speeds characteristic of a mass fire are the inevitable physical consequence of many simultaneous ignitions occurring over a vast area. The vacuum created by buoyantly rising air follows from the basic physics of combustion and fluid flow (hydro- or fluid dynamics). As the area of the fire increases, so does the volume of rising air over the fire zone, causing even more air to be sucked in from the periphery of the fire at increasingly higher speeds.

Only a few mass fires have occurred in human history: those created by British and U.S. conventional incendiary weapons and by U.S. atomic bombs in World War II. These include fires that destroyed Hamburg, Dresden, Kassel, Darmstadt, and Stuttgart in Germany, and Tokyo, Hiroshima, and Nagasaki in Japan. History’s first mass fire began on the night of July 27, 1943, in Hamburg, created by Allied incendiary raids. Within 20 minutes, two thirds of the buildings within an area of 4.5 square miles were on fire. It took fewer than six hours for the fire to completely burn an area of more than five square miles. Damage analysts called it the “Dead City.” Wind speeds were of hurricane force; air temperatures were 400-500 degrees Fahrenheit. Between 60,000 and 100,000 people were killed in the attack. A mass fire from a modern nuclear bomb could be expected to destroy a considerably larger urban or suburban area, in a similarly short time.
The unique features of the mass fire fundamentally distinguish it from the more slowly propagating line fire. …….. Fire environments created by mass fires are fundamentally more violent and destructive than smaller-scale fires, and they are far less affected by external weather conditions. They are not substantially altered by seasonal and daily weather conditions. ….. Average air temperatures in the burning areas after the attack would be well above the boiling point of water; winds generated by the fire would be hurricane force; and the fire would burn everywhere at this intensity for three to six hours. Even after the fire burned out, street pavement would be so hot that even tracked vehicles could not pass over it for days, and buried, unburned material from collapsed buildings could burst into flames if exposed to air even weeks after the fire. Those who sought shelter in basements of strongly constructed buildings could be poisoned by carbon monoxide seeping in, or killed by the ovenlike conditions. Those who tried to escape through the streets would be incinerated by the hurricane-force winds laden with firebrands and flames. Even those able to find shelter in the lower-level sub-basements of massive buildings would likely die of eventual heat prostration, poisoning from fire-generated gases, or lack of water. The firestorm would eliminate all life in the fire zone.

All publication data from “Whole World on Fire” by Lynn Eden at Google Books. http://www.nucleardarkness.org/web/cityonfire/
|
Ionising radiation damages brain connections
|
How Radiation Can Affect Brain Connections https://www.technologynetworks.com/neuroscience/news/how-radiation-can-affect-brain-connections-328547 Dec 17, 2019. Original story from University of Rochester Medical Center. One of the potentially life-altering side effects that patients experience after cranial radiotherapy for brain cancer is cognitive impairment. Researchers now believe that they have pinpointed why this occurs, and these findings could point the way to new therapies to protect the brain from the damage caused by radiation.
The new study – which appears in the journal Scientific Reports – shows that radiation exposure triggers an immune response in the brain that severs connections between nerve cells. While the immune system's role in remodeling the complex network of links between neurons is normal in the healthy brain, radiation appears to send the process into overdrive, resulting in damage that could be responsible for the cognitive and memory problems that patients often face after radiotherapy. |
USA’s Hanford nuclear site could suffer the same fate as Russia’s Mayak – or worse
Comment from Dtlt, 21 Dec 19: Trump is cutting in half the budget to monitor and try to clean up the Hanford mess.
A massive nuclear explosion similar to the Kyshtym disaster at Mayak can happen at Hanford if the site is not monitored and the tanks are not taken care of.
A ten-thousand-gallon tank at Mayak exploded from decay heat. The decay heat came from strontium-90, caesium-137, cobalt-60 and plutonium stored in the underground tank. The explosion was equivalent to roughly 100 tons of TNT. There are 55 million gallons of a similar radionuclide mix stored in underground tanks at Hanford. Nitric acid was used to extract radionuclides at Hanford, as it was at the Mayak plant near Kyshtym. The nitrates, mixed with heat-emitting, hydrogen-gas-generating radionuclides, are very much like the explosive brew that went off at Kyshtym in 1957, and there are 55 million gallons of that brew at Hanford. The decay heat, the heat-emitting radionuclides, the hydrogen gas and the nitrates leave the mixture very much at risk of a massive, catastrophic chemical-radionuclide explosion. The Kyshtym disaster was a radioactive contamination accident that occurred on 29 September 1957 at Mayak, a plutonium production site for nuclear weapons and a nuclear fuel reprocessing plant of the Soviet Union.
If the explosive stew becomes too concentrated and hot, the same thing will happen there, contaminating a great portion of the Pacific Northwest of the USA and south-western Canada.
Medvedev, Zhores A. (4 November 1976). “Two Decades of Dissidence”. New Scientist.
Medvedev, Zhores A. (1980). Nuclear disaster in the Urals translated by George Saunders. 1st Vintage Books ed. New York: Vintage Books. ISBN 978-0-394-74445-2. (c1979)
In 1957 the cooling system in one of the tanks containing about 70–80 tons of liquid radioactive waste failed and was not repaired. The temperature in it started to rise, resulting in evaporation and a chemical explosion of the dried waste, consisting mainly of ammonium nitrate and acetates (see ammonium nitrate/fuel oil bomb). The explosion, on 29 September 1957, estimated to have a force of about 70–100 tons of TNT,[10] threw the 160-ton concrete lid into the air.[8] There were no immediate casualties as a result of the explosion, but it released an estimated 20 MCi (800 PBq) of radioactivity. Most of this contamination settled out near the site of the accident and contributed to the pollution of the Techa River, but a plume containing 2 MCi (80 PBq) of radionuclides spread out over hundreds of kilometers. Previously contaminated areas within the affected area include the Techa river, which had previously received 2.75 MCi (100 PBq) of deliberately dumped waste, and Lake Karachay, which had received 120 MCi (4,000 PBq).
In the next 10 to 11 hours, the radioactive cloud moved towards the north-east, reaching 300–350 km (190–220 mi) from the accident. The fallout of the cloud resulted in long-term contamination of an area of more than 800 to 20,000 km2 (310 to 7,720 sq mi), depending on what contamination level is considered significant, primarily with caesium-137 and strontium-90. This area is usually referred to as the East-Ural Radioactive Trace (EURT).
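A note on the mixed units in the passage above: curies and becquerels measure the same quantity, with 1 Ci = 3.7 × 10^10 Bq by definition. A minimal Python check of the conversions quoted (an editorial aid, not part of the quoted source):

    # Editorial check of the curie-to-petabecquerel figures quoted above
    CI_TO_BQ = 3.7e10  # 1 curie = 3.7e10 becquerels, by definition
    figures_mci = {
        "1957 explosion release": 20,
        "far-travelling plume": 2,
        "earlier Techa River dumping": 2.75,
        "Lake Karachay": 120,
    }
    for label, megacuries in figures_mci.items():
        petabecquerels = megacuries * 1e6 * CI_TO_BQ / 1e15
        print(f"{label}: {megacuries} MCi = {petabecquerels:.0f} PBq")
    # 20 MCi = 740 PBq, 2 MCi = 74 PBq, 2.75 MCi = 102 PBq, 120 MCi = 4440 PBq;
    # the text rounds these to 800, 80, 100 and 4,000 PBq respectively.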
USA House Democrats let Jared Kushner suck them into a very bad space weapons deal
|
The Very Bad Space Force Deal, https://www.counterpunch.org/2019/12/18/the-very-bad-space-force-deal/ by KARL GROSSMAN
Unless grassroots action somehow stops it, it looks likely that the Trump scheme for a Space Force, a sixth branch of the United States armed forces, will happen. The U.S. House of Representatives last week passed the $738 billion military policy bill that gives Trump his sought-for Space Force as he moves for what he terms "American dominance in space." The vote for what is titled the National Defense Authorization Act (NDAA) for 2020 was 377 to 48. Some 189 Republicans and 188 Democrats voted for it. Six Republican House members voted no, along with 41 Democrats and one independent. The large Democratic yes vote came as a result of a trade-off for 12 weeks of paid parental leave for civilian federal employees.
The U.S. Senate will now consider the measure and, given the Trump-controlled majority there, can be expected to pass it, and Trump will sign it. Indeed, last week Trump tweeted: "Wow! All our priorities have made it into the final NDAA: Pay Raises for our Troops, Rebuilding our Military, Paid Parental Leave, Border Security, and Space Force!"
Establishment of a U.S. Space Force would come despite the landmark Outer Space Treaty of 1967, put together by the U.S., the then Soviet Union and the U.K., designating space as a global commons to be used for peaceful purposes. The U.S. move to negate the intent of the Outer Space Treaty will cause Russia and China to respond in kind, especially considering Trump's declaration that "it is not enough to merely have an American presence in space. We must have American dominance in space." This will lead to an arms race in space.
The Trump administration and the U.S. military have been claiming that a Space Force is necessary because of Russia and China moving into space militarily, but, in fact, Russia, China and U.S. neighbor Canada have for decades been leaders in pushing for an expansion of the Outer Space Treaty, which bans weapons of mass destruction in space. The Prevention of an Arms Race in Outer Space (PAROS) treaty that the three nations have sought would prohibit the placement of any weapons in space. The U.S., under both Republican and Democratic presidential administrations, has opposed the PAROS treaty and effectively vetoed its enactment at the United Nations.
The leading organization internationally in opposing the plan for a U.S. Space Force has been the Global Network Against Weapons & Nuclear Power in Space (www.space4peace.org). Commenting on the House vote, Bruce Gagnon, the network's coordinator, said: "It is no surprise, but still disheartening, to see that 188 Democrats joined with Republicans to pass the NDAA bill in the House." He noted that "the Democrats were led by Rep. Adam Smith from the Seattle area which means that the aerospace giant Boeing Corp., which stands to make a gold mine off Space Force, clearly pulls Mr. Smith's chain." (Smith, chairman of the House Armed Services Committee, called the bill "the most progressive defense bill we have passed in decades.")
Gagnon continued: "Another Democrat, Rep. Jim Cooper from Tennessee chimed in saying, 'Trump's belated support for a Space Force does not make this a Republican idea.' Cooper chairs the House Armed Services Strategic Forces Subcommittee and clearly is trying to stake out Democratic Party 'bragging rights' on passage of this proposal to move warfare into the heavens." Gagnon said, "About three dozen progressive and anti-war groups worked hard to stop this NDAA and called the Democrats' support for it 'near complete capitulation.'"
"With this newly enshrined Space Force—the NDAA will easily pass in the Senate—Trump will now be poised to tweet that Washington will be able to 'control and dominate' space on behalf of corporate interests," Gagnon stated. "With technology now nearly in place to allow 'mining the sky' for precious minerals on planetary bodies, the Space Force fits in nicely with the long-planned Pentagon ability to control which nations, corporations and wealthy individuals will be able to venture into space and which will not. The idea was spelled out in a 1989 Congressional study called 'Military Space Forces: The Next 50 Years.'"
"Thus, Space Force would have two primary missions—give the Pentagon full control of the Earth and control the pathway on and off the Earth—both on behalf of corporate interests," he said. "These provocative, expensive and destabilizing plans to control space will not be taken lightly by the rest of the world's space-faring nations. Even the Pentagon has lately been predicting the inevitability of war in the heavens."
Gagnon recounted: "In 1989 at a protest I organized at the Kennedy Space Center in Florida, Apollo astronaut Edgar Mitchell, the sixth man to walk on the moon [who took part in the protest], told the assembled that any war in space would be the one and only. By destroying satellites in space, massive amounts of space debris would be created that would cause a cascading effect, and even the billion-dollar International Space Station would likely be broken into tiny bits. So much space junk would be created, Mitchell told us, that we'd never be able to get a rocket off the planet again because of the minefield of debris orbiting the Earth at 15,000 mph."
"That would mean," said Gagnon, "activity on Earth below would immediately shut down—cell phones, ATM machines, cable TV, traffic lights, weather prediction and more—all hooked up to satellites, would be lost. Modern society would go dark."
"The aerospace industry has long claimed that Star Wars would be the largest industrial project in the history of our planet," said Gagnon. "So much money would be needed that the industry has identified the 'entitlement programs' for defunding to pay for 'everything space.' That means Social Security, Medicare, Medicaid and the remaining tattered social safety net would be cut to pay for Space Force."
"Everything has an Achilles heel," said Gagnon. He said that "if you want to help defeat plans for Space Force, fight for social and environmental progress. Demand that the compromised Congress not fund this disastrous plan to move the arms race into space. It is going to cost all of us dearly."
A return in many respects to President Reagan's "Star Wars" scheme of the 1980s, the Space Force notion "started as a joke," National Public Radio reported in August in a report by correspondent Claudia Grisales titled "With Congressional Blessing, Space Force Is Closer to Launch." She related: "Early last year President Trump riffed on an idea he called 'Space Force' before a crowd of Marines in San Diego.
It drew laughs, but the moment was a breakthrough for a plan that had languished for nearly 20 years." She continued: "'I said maybe we need a new force, we'll call it the Space Force,' Trump said at Marine Corps Air Station Miramar in March 2018. 'And I was not really serious. Then I said, what a great idea, maybe we'll have to do that.'"
The Outer Space Treaty was spurred by the Soviet Union launching the first space satellite, Sputnik, in 1957, as Craig Eisendrath, who was involved in its creation as a U.S. State Department officer, noted in the 2001 TV documentary that I wrote and narrate, "Star Wars Returns." Eisendrath said "we sought to de-weaponize space before it got weaponized…to keep war out of space."
The Reagan "Star Wars" program also used a defense rationale; it was formally called the Strategic Defense Initiative. It was based on orbiting battle platforms with nuclear reactors or "super" plutonium systems on board providing the power for hypervelocity guns, particle beams and laser weapons. Despite its claim of being defensive, it was criticized for being offensive and a major element in what U.S. military documents then and since have described as "full spectrum dominance" over the Earth below, which the U.S. has been seeking by taking the "ultimate high ground" of space.
Among those voting against the NDAA bill last week were Representatives Jerry Nadler, chair of the House Judiciary Committee, which has just approved articles of impeachment against Trump; Alexandria Ocasio-Cortez; Tulsi Gabbard; and Ro Khanna, who earlier, with Senator Bernard Sanders, issued a joint statement decrying it as a "bill of astonishing moral cowardice."
Meanwhile, the U.S. military is gearing up for a campaign to sell the Space Force. On a website called "Space War Your World At War," Barbara Barrett, the Air Force secretary, is quoted as saying that the Air Force has come up with a "strategy to find support among not just U.S. lawmakers, but also among the public" for Trump's new branch of the country's armed forces, the Space Force. She opined that this could clarify to the broader public what the U.S. is doing in this domain and why exactly it needs a separate force for operations in space, as well as funding.
Karl Grossman is a professor of journalism at the State University of New York College at Old Westbury and the author of the book The Wrong Stuff: The Space Program's Nuclear Threat to Our Planet. Grossman is an associate of the media watch group Fairness and Accuracy in Reporting (FAIR). He is a contributor to Hopeless: Barack Obama and the Politics of Illusion. |
|
Analysis of the decontamination of irradiated soil in the Fukushima area
Fukushima: Lessons learned from an extraordinary case of soil decontamination https://www.sciencedaily.com/releases/2019/12/191212081926.htm
Source: European Geosciences Union
Summary: Following the accident at the Fukushima nuclear power plant in March 2011, the Japanese authorities decided to carry out major decontamination works in the affected area, which covers more than 9,000 km2. On December 12, 2019, with most of this work having been completed, researchers provided an overview of the decontamination strategies used and their effectiveness.
On December 12, 2019, with most of this work having been completed, the scientific journal SOIL of the European Geosciences Union (EGU) is publishing a synthesis of approximately sixty scientific publications that together provide an overview of the decontamination strategies used and their effectiveness, with a focus on radiocesium. This work is the result of an international collaboration led by Olivier Evrard, researcher at the Laboratoire des Sciences du Climat et de l'Environnement [Laboratory of Climate and Environmental Sciences] (LSCE — CEA/CNRS/UVSQ, Université Paris Saclay).
Soil decontamination, which began in 2013 following the accident at the Fukushima Dai-ichi nuclear power plant, has now been nearly completed in the priority areas identified[1]. However, areas that are difficult to access, such as the municipalities located in the immediate vicinity of the nuclear power plant, have not yet been decontaminated. Olivier Evrard, a researcher at the Laboratory of Climate and Environmental Sciences and coordinator of the study (CEA/CNRS/UVSQ), in collaboration with Patrick Laceby of Alberta Environment and Parks (Canada) and Atsushi Nakao of Kyoto Prefectural University (Japan), compiled the results of approximately sixty scientific studies published on the topic.
This synthesis focuses mainly on the fate of radioactive cesium in the environment because this radioisotope was emitted in large quantities during the accident, contaminating an area of more than 9,000 km2. In addition, since one of the cesium isotopes (137Cs) has a half-life of 30 years, it constitutes the highest risk to the local population in the medium and long term, as it can be estimated that in the absence of decontamination it will remain in the environment for around three centuries.
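The "around three centuries" estimate follows from the standard exponential-decay relation: after about ten half-lives (10 × 30 years), only roughly a thousandth of the original caesium-137 activity remains. A minimal sketch of that arithmetic (an editorial illustration, not taken from the study):

    # Fraction of caesium-137 remaining after t years, given its ~30-year half-life
    HALF_LIFE_YEARS = 30.0
    def fraction_remaining(t_years):
        return 0.5 ** (t_years / HALF_LIFE_YEARS)
    for t in (30, 100, 300):
        print(f"after {t} years: {fraction_remaining(t) * 100:.2f}% remains")
    # after 30 years: 50.00% remains
    # after 100 years: 9.92% remains
    # after 300 years: 0.10% remains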
“The feedback on decontamination processes following the Fukushima nuclear accident is unprecedented,” according to Olivier Evrard, “because it is the first time that such a major clean-up effort has been made following a nuclear accident. The Fukushima accident gives us valuable insights into the effectiveness of decontamination techniques, particularly for removing cesium from the environment.”
This analysis provides new scientific lessons on decontamination strategies and techniques implemented in the municipalities affected by the radioactive fallout from the Fukushima accident. This synthesis indicates that removing the surface layer of the soil to a thickness of 5 cm, the main method used by the Japanese authorities to clean up cultivated land, has reduced cesium concentrations by about 80% in treated areas. Nevertheless, the removal of the uppermost part of the topsoil, which has proved effective in treating cultivated land, has cost the Japanese state about €24 billion. This technique generates a significant amount of waste, which is difficult to treat, to transport and to store for several decades in the vicinity of the power plant, a step that is necessary before it is shipped to final disposal sites located outside Fukushima prefecture by 2050. By early 2019, Fukushima's decontamination efforts had generated about 20 million cubic metres of waste.
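As a rough consistency check (an editorial back-of-envelope, not a figure from the synthesis, and ignoring non-soil waste and volume changes during handling), the reported waste volume and the 5 cm stripping depth imply the approximate area actually scraped:

    # Hypothetical back-of-envelope: area implied by the quoted waste volume and stripping depth
    waste_volume_m3 = 20e6     # ~20 million cubic metres of waste by early 2019
    stripping_depth_m = 0.05   # 5 cm of topsoil removed
    implied_area_km2 = waste_volume_m3 / stripping_depth_m / 1e6
    print(f"implied treated area: about {implied_area_km2:.0f} km2")
    # about 400 km2 -- a small fraction of the >9,000 km2 contaminated zone,
    # consistent with decontamination targeting farmland and residential areas.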
Decontamination activities have mainly targeted agricultural landscapes and residential areas. The review points out that the forests have not been cleaned up — because of the difficulty and very high costs that these operations[2] would represent — as they cover 75% of the surface area located within the radioactive fallout zone. These forests constitute a potential long-term reservoir of radiocesium, which can be redistributed across landscapes as a result of soil erosion, landslides and floods, particularly during typhoons that can affect the region between July and October. Atsushi Nakao, co-author of the publication, stresses the importance of continuing to monitor the transfer of radioactive contamination at the scale of coastal watersheds that drain the most contaminated part of the radioactive fallout zone. This monitoring will help scientists understand the fate of residual radiocesium in the environment in order to detect possible recontamination of the remediated areas due to flooding or intense erosion events in the forests.
The analysis recommends further research on:
- the issues associated with the recultivation of decontaminated agricultural land[3],
- the monitoring of the contribution of radioactive contamination from forests to the rivers that flow across the region,
- and the return of inhabitants and their reappropriation of the territory after evacuation and decontamination.
This research will be the subject of a Franco-Japanese and multidisciplinary international research project, MITATE (Irradiation Measurement Human Tolerance viA Environmental Tolerance), led by the CNRS in collaboration with various French (including the CEA) and Japanese organizations, which will start on January 1, 2020 for an initial period of 5 years.
Complementary approaches
This research is complementary to the project to develop bio- and eco-technological methods for the rational remediation of effluents and soils, in support of a post-accident agricultural rehabilitation strategy (DEMETERRES), led by the CEA, and conducted in partnership with INRA and CIRAD Montpellier.
Decontamination techniques
- In cultivated areas within the special decontamination zone, the surface layer of the soil was removed to a depth of 5 cm and replaced with a new “soil” made of crushed granite available locally. In areas further from the plant, substances known to fix or substitute for radiocesium (potassium fertilizers, zeolite powders) have been applied to the soil.
- As far as woodland areas are concerned, only those that were within 20 metres of the houses were treated (cutting branches and collecting litter).
- Residential areas were also cleaned (ditch cleaning, roof and gutter cleaning, etc.), and (vegetable) gardens were treated as cultivated areas.
[1] In Fukushima prefecture and the surrounding prefectures, the decision to decontaminate the landscapes affected by the radioactive fallout was made in November 2011 for 11 districts that were evacuated after the accident (special decontamination zone — SDZ — 1,117 km2) and for 40 districts affected by lower, but still significant, levels of radioactivity that had not been evacuated in 2011 (areas of intensive monitoring of the contamination — ICA, 7,836 km2).
[2] 128 billion euros, according to one of the studies appearing in the review to be published on 12 December 2019 in SOIL.
[3] Relating to soil fertility and the transfer of radiocesium from the soil to plants, for example.
The study was conducted by Olivier Evrard (Laboratoire des Sciences du Climat et de l’Environnement (LSCE/IPSL), Unité Mixte de Recherche 8212 (CEA/CNRS/UVSQ), Université Paris-Saclay), J. Patrick Laceby (Environmental Monitoring and Science Division (EMSD), Alberta Environment and Parks (AEP)), and Atsushi Nakao (Graduate School of Life and Environmental Sciences, Kyoto Prefectural University).