nuclear-news

The News That Matters about the Nuclear Industry Fukushima Chernobyl Mayak Three Mile Island Atomic Testing Radiation Isotope

Grandfather and granddaughter join forces to prevent nuclear doom


At 89, he works with granddaughter to prevent nuclear doom


MARCH 11, 2017. Picture a nondescript packing crate labeled “agricultural equipment” being loaded onto a delivery truck, which drives along Pennsylvania Avenue in Washington, D.C., until it stops midway between the White House and the Capitol.

The nuclear bomb explodes with the power of 15 kilotons. There are more than 80,000 deaths, from the highest ranking members of government to the youngest schoolchildren. All major news outlets then report receiving an identical claim: that five more nuclear bombs are hidden in five major cities.

Such is the nightmare nuclear scenario that former US Defense Secretary William Perry says may seem remote but, if realized, would be disastrous.

“I do not like to be a prophet of doom,” says Perry, 89, with the gentle grace of a decadeslong diplomat who has negotiated with countries both hostile and friendly to US interests. Then he bluntly gets to the point. “What we’re talking about is no less than the end of civilization.”

Perry doesn’t believe an intentional terrorist attack or all-out nuclear war is the greatest risk — he fears a “blunder” that plunges the globe into a nuclear conflict.

Perry says with a more aggressive Russia, and a brash and at times unpredictable President Donald Trump, “the possibility of a nuclear catastrophe is probably greater than it has ever been, greater than any time in the Cold War.”

CNN reached out to the White House for comment on Perry’s statements. It did not respond.

While he’s long been out of government, Perry uses his extensive policy chops and background to engage the public — through speeches, presentations and online courses.

He worries that tensions between the Koreas, and possibly Japan, could turn into a conventional conflict that could go nuclear. A bellicose and expansion-minded Russia could draw the United States into a situation that could escalate, Perry says. And the District of Columbia scenario shows how devastation can result from a crude bomb.

“When my kids were getting under desks at their school and going through nuclear drills — the danger today is actually greater. We’re just not aware of it,” says Perry.

The former defense secretary is spending his twilight years sounding the alarm with his 29-year-old granddaughter. They’re trying to awaken a new audience on social media with the William J. Perry Project, an advocacy group dedicated to helping end the nuclear threat.

“We’re really just out there trying to reach a generation that isn’t really engaged on this issue right now,” says Lisa Perry, the digital communications manager for the project. “It’s something we learned in history class. There was no conversation about what’s happening now.”

“The dangers will never go away as long as we have nuclear weapons,” William Perry explains. “But we should take every action to lower the dangers and I think it can be done.”

A lifetime dealing with the nuclear threat

Perry served three years under President Bill Clinton, a time when more than 8,000 nuclear weapons were dismantled. His nuclear knowledge traces back to his days as a CIA analyst working with the Kennedy administration during the Cuban Missile Crisis. He was tapped to evaluate photos showing Soviet nuclear missiles in Cuba and recalls it as one of the scariest times in his life.

“We made miscalculations,” recalls Perry about those anxious two weeks. “It’s a miracle they did not lead to war.”

Perry lists the risks: US-Russia hostilities. A nuclear terror attack. A regional crisis.

On a regional conflict, Perry sees North Korea as an unpredictable nuclear threat. The regime’s growing arsenal and history of bold actions, Perry says, could be met by an escalated response by South Korea or even the United States. Not necessarily a deliberate attack, says Perry, but he fears a “blunder” that plunges the globe into a nuclear conflict.

“When a crisis reaches a boiling point then you have a possibility of a miscalculation,” warns Perry.

Trump and the nuclear threat……….http://wtkr.com/2017/03/11/at-89-he-works-with-granddaughter-to-prevent-nuclear-doom/

March 13, 2017 Posted by | opposition to nuclear, PERSONAL STORIES, Reference, weapons and war | Leave a comment

Thorough research to be done on uranium health effects on Navajo Nation

Mothers, Babies on Navajo Nation Exposed to High Levels of Uranium, https://indiancountrymedianetwork.com/culture/health-wellness/mothers-babies-navajo-nation-exposed-high-levels-uranium/ Navajo Birth Cohort Study figuring out how exposure affects health, December 20, 2016

Researchers with the Navajo Birth Cohort Study aren’t looking for simple answers about how uranium exposure affects health. We already know—and have known for decades—that contact with uranium can cause kidney disease and lung cancer.

This study is the first to look at what chronic, long-term exposure from all possible sources of uranium contamination—air, water, plants, wildlife, livestock and land—does down through the generations in a Native American community.

Since the study began in 2012, over 750 families have enrolled and 600 babies have been born to those families, said Dr. Johnnye Lewis, director of the Community Environmental Health Program & Center for Native Environmental Health Equity Research at the University of New Mexico Health Sciences Center and NBCS principal investigator.

“We’re collecting a huge amount of data,” Lewis said. “At this point … all of our results are preliminary, [but] what we do know is that if we look at uranium in urine in the Navajo participants we see higher concentrations than we would expect based on the U.S. population as a whole… [In babies,] we are seeing a trend that uranium levels in urine increase over the first year.”

The Navajo Nation overlies some of the largest uranium deposits in the U.S. Between 1944 and 1986, miners extracted nearly 30 million tons of uranium from Navajo Nation lands. Navajo miners did not have protective suits or masks; they took their work clothes home for laundering; they and other community members used rocks from the mines to build their homes.

When the Cold War ended, most of the uranium mines on Navajo were abandoned—not covered, or sealed, or remediated, just left as they were with waste piles exposed to wind and rain and accessible to anyone, including children.

Today, more than 500 open abandoned uranium mines are spread across the Navajo Nation and uranium dust, particles and radiation continue to be released into the environment.

The questions the NBCS seeks to answer are complex. Uranium does not exist in isolation at the mine sites, so the study is looking at 36 different metals associated with uranium. “We do that because when you look at uranium waste, it’s not just uranium that’s in the waste,” said Lewis. “None of the variables that we look at, none of the causes or the outcomes that we look at are on-off binary sort of things. What we look at is as concentrations of uranium or other metals changes, can we see changes in responses?”

Researchers have also been alarmed by the findings that levels of iodine and zinc are lower than they should be in the study group. Iodine levels are about 40 percent below the World Health Organization sufficiency level, and 61 percent of the mothers in the study have zinc levels below the WHO sufficiency level. “Iodine deficiencies [are] very, very important because iodine is really critical for normal organ development and neurodevelopment,” said Lewis. “And we worry about zinc because we have some evidence that it may be involved in the repair process when you have exposure to some of the metals that we look at. [A lack of zinc] actually inhibits the body’s ability to fix damage to DNA.”

Documenting these deficiencies would make the NBCS worthwhile “even if we learn there are absolutely no [long-term health] effects from uranium,” Lewis said. “Whatever we find out is going to be important.”

Two other endeavors resulting from the study are already in the works, and both will be hugely important to the well-being of Navajo families in the future.

The project has just won Environmental Influences on Child Health Outcomes (ECHO) Program funding from the National Institutes of Health. The ECHO program is looking at kids all across the U.S. to try to understand how their environment influences their health. It will eventually include 50,000 children, and at least two cohorts will be from Native American communities, Lewis said. “We’re just really pleased that they’re including Native Americans.”

The Centers for Disease Control funding for the NBCS only allows families to be followed for up to one year. This new funding, which extends over 5 years after a 2-year initial period, will allow the researchers to go back and look again at each child on an annual basis and do much more detailed developmental assessments. In the process, they will be able to develop an assessment that takes into account Navajo parenting styles and create an instrument that is valid specifically for Navajo children, unlike standardized developmental assessments that are devised based primarily on the dominant culture’s parenting practices.

To accomplish that, “we put together a clinical team that is going to be training our Navajo staff to deliver these developmental assessments. It will be a long process of working together. They’ll be trained and then they will shadow the clinical team so that they get a lot more experience off Navajo before ever coming back here and then when they come back they’ll each be partnered with either a neurodevelopmental expert or psychometrician … who will be hired through the program. They will initially shadow them and then be shadowed by them to ensure that we have consistency.

“So at the end of seven years what we’re going to have is a really great team of professional evaluators who will be staying on Navajo and who will provide that new service” to Navajo families, Lewis said.

The NBCS is a collaborative effort of the University of New Mexico’s DiNEH Project, the Centers for Disease Control/Agency for Toxic Substances and Disease Registry (CDC/ATSDR), the Navajo Area Indian Health Service, the Navajo Nation Division of Health, and the Southwest Research and Information Center.

Women between the ages of 14 and 45 who have lived on the Navajo Nation for five years, are pregnant and will deliver their babies at hospitals in Chinle, Gallup, Shiprock, Ft. Defiance and Tuba City are eligible to participate in the study. Call 1-877-545-6775 for information.

 

March 13, 2017 Posted by | health, indigenous issues, Reference, USA | Leave a comment

Lockheed Martin – USA’s top salesman for war?


Trump Is Bankrupting Our Nation to Enrich the War Profiteers, March 06, 2017, By Jonathan King and Richard Krushnic, Truthout | News Analysis

“……..Corporations that contract with the Department of Defense (DOD) for nuclear weapons complex work do not report revenues and profits from this work separately from their other military work, although they do break out government work from civilian work, and sometimes break out military work from other government work. Hence, it is not possible to determine profits made from nuclear weapons complex work from the annual reports and Securities and Exchange Commission (SEC) filings of large military corporations. However, it is possible to estimate, and to demonstrate how a significant amount of military R&D and production not recorded as nuclear weapons work is in fact partially nuclear weapons work. The nuclear weapons work financed by the US Department of Energy (DOE) is (not surprisingly) carried out in a semi-secret insiders club that insulates it from public knowledge and oversight. The first contracts for the upgrading of the nuclear weapons triad have already been awarded — one to Northrop Grumman — for a new generation of long-range bomber. But the public remains in the dark as to how many tens of billions of their tax dollars will be spent on the project.

From 2012-2014, according to Lockheed Martin’s 2014 annual report, the company realized an average of $46 billion a year in revenue, with an average of $3.2 billion in profits — 7 percent of revenue, and a 76 percent return on $4.2 billion of investor equity. The annual report informs us that 59 percent of 2014 revenue came from the Pentagon. We know from other sources that $1.4 billion a year is coming from the DOE for operation of the Sandia nuclear weapons lab, and we are estimating that an additional $600 million a year is coming for DOE nuclear weapons complex work. Information in the annual report indicates that around $6.1 billion came from foreign military sales. This adds up to around $35 billion of military revenue, or 75.3 percent of total 2014 revenue. The single biggest revenue earner in recent years is the F-35 jet fighter, bringing in $8.2 billion, 17 percent of total corporation revenue, in 2014. (William Hartung’s recent report describes additional aspects of Lockheed Martin’s military business, and his book Prophets of War: Lockheed Martin and the Making of the Military Industrial Complex provides extensive background).
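As a sanity check, the article’s own figures can be added up. The sketch below uses only the numbers quoted in the paragraph above (all in billions of dollars); since the components are themselves rounded, small differences against the stated 75.3 percent military share are expected.

```python
# Rough check of the Lockheed Martin figures quoted above
# (all amounts in billions of dollars, taken from the article).
revenue = 46.0   # average annual revenue, 2012-2014
profit = 3.2     # average annual profit
equity = 4.2     # investor equity

pentagon = 0.59 * revenue   # 59 percent of revenue from the Pentagon
doe_sandia = 1.4            # DOE funding for the Sandia lab
doe_other = 0.6             # estimated additional DOE weapons-complex work
foreign = 6.1               # foreign military sales

military = pentagon + doe_sandia + doe_other + foreign
print(round(profit / revenue, 3))   # profit margin, ~0.07 (7 percent)
print(round(profit / equity, 2))    # return on equity, ~0.76 (76 percent)
print(round(military, 1))           # ~35.2, i.e. "around $35 billion"
```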

The only references to Lockheed Martin’s nuclear weapons complex work in its 2014 annual report are a sentence noting provision of infrastructure and site support to the DOE’s Hanford complex, and a phrase noting continuing work on the Trident missile. The words “nuclear weapons” never appear in the report.

Lockheed Martin’s Nuclear Weapons Operations

In spite of the lack of mention in the annual report, Lockheed Martin is a partner with Bechtel, ATK, SOC LLC and subcontractor Booz Allen Hamilton in Consolidated Nuclear Security LLC (CNS), which runs the DOE Pantex Plant and the Y-12 Complex. Pantex does nuclear weapons life extension, dismantlement, development, testing and fabrication of high explosive nuclear warhead components. Y-12 stores and processes uranium, and fabricates uranium weapons components.

Lockheed Martin produced the Trident strategic nuclear missile for the 14 US Ohio-class nuclear submarines and for the four British Vanguard-class submarines. The 24 Tridents on each Ohio-class submarine each carry eight or 12 warheads, all of them 20 to 50 times more powerful than the bombs dropped on Hiroshima and Nagasaki. Each warhead is capable of killing most of the people in any one of the world’s largest cities — either immediately or later, from radiation, burns, other injuries, starvation and disease. Lockheed Martin is not producing new Trident missiles now, but it maintains and modifies them. Previously, Lockheed Martin and its subcontractors received $65 million for each of the 651 Trident missiles, in addition to the $35 billion in earlier development costs.

The other primary strategic nuclear weapon delivery vehicle is Boeing’s land-based Minuteman III strategic missile, also with many warheads per missile. About 450 of them are in silos in Colorado and northern plains states. Lockheed Martin produced and continues to produce key systems for the Minuteman III, and plays a large role in maintaining them. It was awarded a $452 million contract for this work in 2014.

Lockheed’s Sandia Subsidiary

Regarding the Pentagon’s nuclear weapons upgrades planned for the next decade, particularly important is the role of Sandia National Laboratories (SNL). Outside of Albuquerque, New Mexico, this DOE lab’s 10,600 employees make 95 percent of the roughly 6,500 non-nuclear components of all seven US nuclear warhead types. Components arm, fuse, fire, generate neutrons to start nuclear reactions, prevent unauthorized firing, preserve the aging nuclear weapons stockpile and mate the weapons to the missiles, planes and ships that deliver them to targets. Sandia Corporation LLC, wholly owned by Lockheed Martin, operates Sandia. The DOE is spending at least $1.4 billion a year on Sandia nuclear weapons work. The secret Lockheed Martin nuclear warhead assembly plant uncovered in Sunnyvale in 2010 is an extension of Lockheed Martin’s Sandia operations. Again, none of this received any mention or revenue numbers in Lockheed Martin’s 2014 annual report.

Lockheed Martin Used Pentagon Dollars to Lobby Congress for Nuclear Weapons Funding

One of the uses of the billions of dollars from these contracts is to recycle them back into lobbying the government to push for additional conventional and nuclear weapons spending, as reported by William Hartung and Stephen Miles. In addition, these funds are used to support a general environment of fear and insecurity, through contributions supporting hawkish think tanks. Technically, the federal government does not allow military contracting firms to use awarded funds to lobby Congress. Lobbying funds must come from other parts of the companies’ businesses. In reality, this is a non-functional restriction, since profits from various business segments are fungible; that is, once they are profits, they are intermingled, so firms can in effect use the profits from military contracts to lobby Congress. But Lockheed Martin went further and spent military contract funds from 2008-2012 as part of the contract expenditures, without even booking the lobbying expenditures as expenditures of profits. In 2015, the US Department of Justice required Lockheed Martin’s Sandia subsidiary to repay to the Pentagon $4.9 million of a Sandia contract award that the firm had spent under the contract on lobbying of Congress, the DOE secretary and the secretary’s family and friends………http://www.truth-out.org/news/item/39712-trump-is-bankrupting-our-nation-to-enrich-the-war-profiteers

March 8, 2017 Posted by | business and costs, politics, Reference, secrets,lies and civil liberties, USA, weapons and war | Leave a comment

Italy’s thorium contamination resulting from military operations

Subject:  Alarming levels of thorium-232 at the military firing range lying between Cordenons, San Quirino, Vivaro and San Giorgio della Richinvelda, in the province of Pordenone http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+WQ+E-2014-000031+0+DOC+XML+V0//EN

The Italian Army operates a military firing range lying between the districts of Cordenons, San Quirino, Vivaro and San Giorgio della Richinvelda in the province of Pordenone, in the vicinity of the River Cellina and the River Meduna, and the drills carried out at this firing range have led to the area becoming radioactively contaminated.

As has been reported by the press, in late December 2013 the Commander of the 132nd Ariete Armoured Division in Cordenons, the Commander-in-Chief of the Italian Army, the offices of the region of Friuli-Venezia Giulia, the province of Pordenone and the affected districts, the prefect of Pordenone, and lastly Local Health Authority (ASS) No 6, were all sent the results of tests that had been carried out by the Friuli-Venezia Giulia provincial department of the Italian Regional Environmental Protection Agency (ARPA), which showed alarming levels of thorium-232 in the area.

Thorium-232 is a notoriously radioactive metal, which emits particles that are six times more hazardous to human health than those released by depleted uranium. It is at its most toxic around 20 to 25 years after use. More specifically, out of the eight targets (the shells of armoured tanks used for firing practice) tested by the ARPA, four were found to contain thorium-232 at markedly higher levels than those that generally occur naturally; these levels were therefore unnatural, and presumably attributable to military firing operations.

In all likelihood, such levels are the legacy left behind by the drills carried out at the site in the 1980s and 1990s: between 1986 and 2003, the Italian Army’s units were equipped with ‘Milan’ shoulder-fired anti-tank missiles, which emitted thorium-232(1). The ARPA has indicated that it will shortly carry out more extensive tests in the area. It is recalled that, as a result of the area’s geological make-up, materials tend to trickle down to the lowest layers, which makes their future recovery appear rather difficult.

Consequently, there is an acute risk that the ‘Magredi’ region, and the rocky terrain that makes it so distinctive, will be devastated; what is more, the area is protected as both a site of Community importance and a Special Protection Area within the meaning of the Habitats Directive (92/43/EEC) and the Birds Directive (2009/147/EC), due to the wide variety of flora and fauna present there(2).

1. Is the Commission aware of this contamination?

2. Can it report whether any similar cases have occurred in the EU, how they were tackled and whether the areas affected were restored to their original state?

3. What initiatives does it intend to implement in order to prevent similar episodes from occurring in the EU, and in particular to prevent the contamination of aquifers?

(1) The same missiles were also used at the inter-force firing range in Quirra (Sardinia), which is sadly famous for the effects resulting from thorium-232 contamination.
(2) SCI IT3310009 ‘Magredi del Cellina’, SPA IT3311001 ‘Magredi di Pordenone’.

March 8, 2017 Posted by | environment, Italy, Reference, thorium, wastes, weapons and war | Leave a comment

America’s war profiteers

Trump Is Bankrupting Our Nation to Enrich the War Profiteers, March 06, 2017, By Jonathan King and Richard Krushnic, Truthout | News Analysis “………The Role of Weapons Contractors

We have previously argued that it is the guaranteed profits from nuclear weapons manufacture that leads contractors to resist nuclear disarmament and promote the concept of danger from abroad.

The profitability derives from three distinct aspects of such weapons contracts:

  • First, by congressional edict they cannot be outsourced to lower-cost suppliers such as those in China or Mexico.
  • Second, the contracts are cost-plus. That is, no matter what the companies spend on the manufacture, they are guaranteed a healthy profit on top. And, of course, the more they run up the costs, the more they make.
  • And third, the contracts are screened from oversight, such as proper audits, by national security considerations.

The current 2017 congressional military authorization calls for spending of some $350 billion over the next decade for upgrades of our nuclear weapons ($35 billion a year) — land-based missiles in silos, long-range bombers and their bombs, new Trident submarines and upgraded Trident missiles and new nuclear-capable cruise missiles. The so-called “modernization” program that Trump supports will spend more than $1 trillion — a thousand billion — income tax dollars over the next 30 years.
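The per-year figures quoted above follow from simple division; a minimal check using only the totals stated in the text:

```python
# Annualizing the nuclear "modernization" totals quoted above.
upgrade_total = 350e9          # $350 billion over the next decade
per_year = upgrade_total / 10
print(per_year / 1e9)          # 35.0 -> $35 billion a year, as stated

trillion_total = 1e12          # at least $1 trillion over 30 years
print(round(trillion_total / 30 / 1e9, 1))  # ~33.3 billion a year, a comparable pace
```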

Given that the Soviet Union no longer exists, that China has become a capitalist economy and that the major difficulties faced abroad are ISIS (also known as Daesh) and related groups, it is deeply questionable why the congressional budget still devotes tens of billions of dollars to Cold War-era nuclear weapons. Yet the Trump administration is proposing to spend a trillion dollars or more over the next three decades upgrading the US nuclear weapons triad.

Where does the pressure for these wasteful and provocative programs — which almost certainly decrease national security — come from? While military high command and the intelligence agencies also press for nuclear weapons upgrades, corporate profits derived from nuclear weapons contracts may be the most powerful driving force, supported by members of Congress with military research and development (R&D) and production facilities in their districts.

A closer look at Lockheed Martin, the largest weapons contractor in the world, reveals how this coupling between corporate profits and the continuation of nuclear weapons delivery programs operates……….http://www.truth-out.org/news/item/39712-trump-is-bankrupting-our-nation-to-enrich-the-war-profiteers

March 8, 2017 Posted by | Reference, weapons and war | Leave a comment

America’s nuclear bomb tests and their health toll on Americans


America’s Forgotten Nuclear War (On Itself), National Interest http://nationalinterest.org/blog/americas-forgotten-nuclear-war-itself-19662 Kyle Mizokami, 4 March 2017. Nuclear weapons have a mysterious quality. Their power is measured in plainly visible blast pressure and thermal energy common to many weapons, but also in invisible yet equally destructive radiation and electromagnetic pulse. Between 1945 and 1992, the United States conducted 1,032 nuclear tests seeking to get the measure of these enigmatic weapons. Many of these tests would today be considered unnecessary, overly dangerous and just plain bizarre. These tests, undertaken on the atomic frontier, gathered much information about these weapons—enough to cease actual use testing—yet scarred the land and left many Americans with long-term health problems.

The majority of U.S. nuclear tests occurred in the middle of the Western desert, at the Nevada Test Site. The NTS hosted 699 nuclear tests, utilizing both above-ground and later underground nuclear devices. The average yield for these tests was 8.6 kilotons. Atmospheric tests could be seen from nearby Las Vegas, sixty-five miles southeast of the Nevada Test site, and even became a tourist draw until the Limited Test Ban Treaty banned them in 1963. Today the craters and pockmarks from underground tests are still visible in satellite map imagery.

The bulk of the remaining nuclear tests took place in the Pacific, at the islands of Bikini, Enewetak, Johnston Island and Christmas Island. The second nuclear test, after 1945’s Trinity Test, took place at Bikini Atoll. The Pacific tests were notable not only for their stunning visuals, the most compelling imagery of nuclear weapons since Hiroshima, but also for the forced relocation of native islanders. Others near the tests were exposed to dangerous levels of radioactive fallout and forced to flee. In 1954, the crew of the Japanese fishing boat Daigo Fukuryu Maru accidentally sailed through fallout from the nearby fifteen-megaton Castle Bravo test. Contaminated with nuclear fallout, one crew member died, and the rest were sickened by radiation.

The first test of a thermonuclear, or fusion, bomb took place in November 1952 at Enewetak. Nicknamed Ivy Mike, the huge eighty-two-ton device was more of a building than a usable weapon. It registered a yield of 10.4 megatons, or the equivalent of 10,400,000 tons of TNT. (Hiroshima, by contrast, was roughly eighteen thousand tons of TNT.) Ivy Mike was by far the biggest test up to that time, creating a fireball 1.8 miles wide and a mushroom cloud that rose to an altitude of 135,000 feet.

One of the strangest atmospheric tests occurred in 1962 at the NTS, with the testing of the Davy Crockett battlefield nuclear weapon. Davy Crockett was a cartoonish-looking recoilless rifle that lobbed a nuclear warhead with an explosive yield of just ten to twenty tons of TNT. The test, code-named Little Feller I, took place on July 17, 1962, with attorney general and presidential adviser Robert F. Kennedy in attendance. Although hard to believe, Davy Crockett was issued at the battalion level in both Germany and South Korea.

Also in 1962, as part of a series of high-altitude nuclear experiments, a Thor rocket carried a W49 thermonuclear warhead approximately 250 miles into the exoatmosphere. The test, known as Starfish Prime, had an explosive yield of 1.4 megatons, or 1,400,000 tons of TNT, and resulted in a large amount of electromagnetic pulse being released over the Eastern Pacific Ocean. The test, conducted off Johnston Island, sent a man-made electrical surge as far as Hawaii, more than eight hundred miles away. The surge knocked out three hundred streetlights and a telephone exchange, and caused burglar alarms to go off and garage doors to open by themselves.

Nuclear tests weren’t just restricted to the Pacific Ocean and Nevada. In October 1964, as part of Operation Whetstone, the U.S. government detonated a 5.3-kiloton device just twenty-eight miles southwest of Hattiesburg, Mississippi. The test, nicknamed Salmon, was an experiment designed to determine if nuclear tests could be detected by seismometer. This was followed up in 1966 with the Sterling test, which had a yield of 380 tons.

In 1967, as part of a misguided attempt to use nuclear weapons for peaceful purposes, the United States detonated a nuclear device near Farmington, New Mexico. Project Gasbuggy was an early attempt at nuclear “fracking,” detonating a twenty-nine-kiloton nuke 4,227 feet underground just to see if the explosion would fracture surrounding rock and expose natural-gas reserves. The experiment was unsuccessful. Two similar tests, Rulison and Rio Blanco, took place in nearby Colorado. Although Rulison was a success in that it uncovered usable gas reserves, the gas was contaminated with radiation, leaving it unsuitable for practical commercial use.

A handful of nuclear tests were conducted in Alaska, or more specifically the Aleutian island of Amchitka. The first test, in October 1965, was designed to test nuclear detection techniques and had a yield of eighty kilotons. A second test occurred four years later, and had a yield of one megaton, or one thousand kilotons. The third and largest test, Cannikin, was a test of the Spartan antiballistic-missile warhead and had a yield of less than five megatons.

During the early years of nuclear testing it was anticipated that nuclear weapons would be used on the battlefield, and that the Army and Marine Corps had better get used to operating on a “nuclear battlefield.” During the 1952 Big Shot test, 1,700 ground troops took shelter in trenches just seven thousand yards from the thirty-three-kiloton explosion. After the test, the troops conducted a simulated assault that took them to within 160 meters of ground zero. This test and others like it led to increases in leukemia, prostate and nasal cancers among those who participated.

U.S. nuclear testing ceased in 1992. In 2002, the Centers for Disease Control estimated that virtually every American that has lived since 1951 has been exposed to nuclear fallout, and that the cumulative effects of all nuclear testing by all nations could ultimately be responsible for up to eleven thousand deaths in the United States alone. The United States did indeed learn much about how to construct safe and reliable nuclear weapons, and their effects on human life and the environment. In doing so, however, it paid a terrible and tragic price.

Kyle Mizokami is a defense and national-security writer based in San Francisco who has appeared in the Diplomat, Foreign Policy, War is Boring and the Daily Beast. In 2009, he cofounded the defense and security blog Japan Security Watch. You can follow him on Twitter: @KyleMizokami.

March 6, 2017 Posted by | Reference, USA, weapons and war | Leave a comment

Why milk is nature’s perfect radioactivity delivery system

What’s up with milk and radiation? Connect Savannah, 14 Sept 2011

1. It’s a food. While an external dusting of radionuclides isn’t healthy, for efficient long-term irradiation of vulnerable organs there’s no substitute for actually ingesting the stuff.

2. It’s fast. Not to knock potatoes and chicken, but growing these items can take weeks or months. With milk, the fallout simply drifts over the pasture and lands on the grass, which the cows then eat. The radioactive particles are deposited in the cows’ milk, the farmers milk the cows, and in a day or two the contaminated product shows up in the dairy case.

3. Because it’s processed quickly, milk makes effective use of contaminants that would otherwise rapidly decay. A byproduct of uranium fission is the radioactive isotope iodine-131. Iodine is critical to the functioning of the thyroid gland, and any iodine-131 consumed will be concentrated there. However, iodine-131 has a half-life of just eight days. The speed of dairying eliminates this impediment.

4. Milk also does a good job of delivering other radioactive contaminants, such as cesium-134 and cesium-137. Although cesium itself plays no role in human health, radioactive cesium mimics potassium, which we do need, and so is readily absorbed by the body. Another uranium fission product is strontium-90, which is especially hazardous to children, since it can be incorporated into growing bones. In contrast to radioactive iodine, strontium-90 has a half-life of about 29 years, so once it gets embedded in you, you are, as the Irish say, fooked.

5. That brings us to the most fiendish property of radioactive milk: it targets the young. Children (a) drink a lot more milk and (b) are smaller, which when you add it up means they get a much stiffer dose. Some cancers triggered by radioactivity have a long latency period; older people may die of something else first, but kids bear the full brunt.
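The half-lives quoted in points 3 and 4 above make the contrast concrete: a short milk ban outlasts most of the iodine-131, while strontium-90 barely decays at all on that timescale. A rough sketch of the arithmetic (the half-life figures come from the text; the 30-day interval is an illustrative assumption):

```python
# Fraction of a radionuclide remaining after `days`, given its
# half-life in days: N(t)/N0 = 0.5 ** (t / half_life)
def fraction_remaining(days, half_life_days):
    return 0.5 ** (days / half_life_days)

# Iodine-131 (half-life ~8 days): after a 30-day milk ban, only
# about 7% of the original activity is left.
print(round(fraction_remaining(30, 8), 3))            # ~0.074

# Strontium-90 (half-life ~29 years): after the same 30 days,
# about 99.8% of it is still there.
print(round(fraction_remaining(30, 29 * 365.25), 3))  # ~0.998
```

This is why banning contaminated milk for a few weeks works against iodine-131 but does nothing about strontium-90 already in the environment.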

For all these reasons, testing milk and dumping any that is contaminated are at the top of the list of disaster-response measures following a nuclear accident, and it’s unusual, though not unknown, for bad milk to find its way into the food supply. For example:

• Iodine contamination during the 1979 Three Mile Island accident was negligible, 20 picocuries per liter. The FDA’s “action level” at the time was 12,000 picocuries per liter; the current limit of 4,600 picocuries is still far in excess of what was observed.

• After the problems with the Fukushima reactors in Japan, one batch of hot milk did test at nine times the current limit, and milk and vegetable consumption was prohibited in high-risk areas. But most bans were rescinded after a couple of months.

• In 1957, after a fire at the Windscale plutonium processing plant in the UK, radiation levels of 800,000 picocuries per liter and higher were found in local milk. Though contamination of milk wasn’t well understood at the time, authorities figured 800,000 of anything involving curies can’t be good and banned the stuff.

• Then there’s Chernobyl. Milk sales were banned in nearby cities after the 1986 reactor explosion, but feckless Soviet officials let the sizable rural population fend for itself. Not surprisingly, 6,000 cases of thyroid cancer subsequently developed, proving there’s no catastrophic situation that stupidity can’t make worse.
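For scale, the bullet points above can be put on one yardstick, the 4,600-picocurie-per-liter action level mentioned in the Three Mile Island item (a quick ratio check, not a health assessment):

```python
# Compare reported milk contamination with the FDA's current
# 4,600 picocurie-per-liter action level cited above.
ACTION_LEVEL = 4_600  # picocuries per liter

readings = {
    "Three Mile Island (1979)": 20,
    "Windscale (1957)": 800_000,
}

for event, picocuries in readings.items():
    ratio = picocuries / ACTION_LEVEL
    print(f"{event}: {ratio:.3g}x the action level")
# Three Mile Island comes out well under 1% of the limit;
# Windscale is roughly 174 times over it.
```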

One last thing. We’ve been talking about cow’s milk, but be aware that iodine-131, strontium-90, and other radioactive contaminants can also be transferred through human milk….. http://www.connectsavannah.com/savannah/whats-up-with-milk-and-radiation/Content?oid=2135647

March 4, 2017 Posted by | radiation, Reference | Leave a comment

Up to £219 billion to clean up the UK’s nuclear mess: autonomous robots to be developed

UK funding development of autonomous robots to help clear up nuclear waste. A new UK consortium will be developing robots to handle nuclear sites, bomb disposal, space and mining. International Business Times, February 28, 2017. The UK government is funding a new consortium of academic institutions and industrial partners to jump-start the robotics industry and develop a new generation of robots to help deal with situations that are hazardous for humans.


It is estimated that it will cost between £95 billion and £219 billion to clean up the UK’s existing nuclear facilities over the next 120 years or so. The environment is so harsh that humans cannot physically work on site, and robots sent in often encounter problems: the small IRID Toshiba shape-shifting scorpion robot used to explore Fukushima’s nuclear reactors, for instance, broke down and could not be retrieved. Remote-controlled robots are needed to enter dangerous zones that haven’t been accessed in over 40 years to carry out relatively straightforward tasks that a human could do in an instant.

The problem is that robots are just not at the level they need to be yet, and it is very difficult to build a robot that can successfully navigate staircases, move over rough terrain and turn valves.

To fix this problem, the Engineering and Physical Sciences Research Council is investing £4.6m ($5.7m) into a new group consisting of the University of Manchester, the University of Birmingham, the University of the West of England (UWE) and industrial partners Sellafield, EDF Energy, UKAEA and NuGen……. http://www.ibtimes.co.uk/uk-funding-development-autonomous-robots-help-clear-nuclear-waste-1608985

 

March 1, 2017 Posted by | Reference, technology, UK, wastes | Leave a comment

Rapid spread of ocean acidification in the Arctic

International team reports ocean acidification spreading rapidly in Arctic Ocean, EurekAlert, 28 Feb 17, UNIVERSITY OF DELAWARE. Ocean acidification (OA) is spreading rapidly in the western Arctic Ocean in both area and depth, according to new interdisciplinary research reported in Nature Climate Change by a team of international collaborators, including University of Delaware professor Wei-Jun Cai.

The research shows that, between the 1990s and 2010, acidified waters expanded northward approximately 300 nautical miles from the Chukchi slope off the coast of northwestern Alaska to just below the North Pole. Also, the depth of acidified waters was found to have increased, from approximately 325 feet to over 800 feet (or from 100 to 250 meters).


“The Arctic Ocean is the first ocean where we see such a rapid and large-scale increase in acidification, at least twice as fast as that observed in the Pacific or Atlantic oceans,” said Cai, the U.S. lead principal investigator on the project and Mary A.S. Lighthipe Professor of Earth, Ocean, and Environment at UD.

“The rapid spread of ocean acidification in the western Arctic has implications for marine life, particularly clams, mussels and tiny sea snails that may have difficulty building or maintaining their shells in increasingly acidified waters,” said Richard Feely, NOAA senior scientist and a co-author of the research. Sea snails called pteropods are part of the Arctic food web and important to the diet of salmon and herring. Their decline could affect the larger marine ecosystem.

Among the Arctic species potentially at risk from ocean acidification are subsistence fisheries of shrimp and varieties of salmon and crab.

Other collaborators on the international project include Liqi Chen, the Chinese lead principal investigator and scientist with the Third Institute of Oceanography of State Oceanic Administration of China; and scientists at Xiamen University, China and the University of Gothenburg, Sweden, among other institutions…….

Meltwater from summer Arctic sea ice melt, once found only in shallow waters of depths less than 650 feet (200 meters), now spreads further into the Arctic Ocean.

“It’s like a melting pond floating on the Arctic Ocean. It’s a thin water mass that exchanges carbon dioxide rapidly with the atmosphere above, causing carbon dioxide and acidity to increase in the meltwater on top of the seawater,” said Cai. “When the ice forms in winter, acidified waters below the ice become dense and sink down into the water column, spreading into deeper waters.” https://www.eurekalert.org/pub_releases/2017-02/uod-itr022717.php

March 1, 2017 Posted by | ARCTIC, climate change, oceans, Reference | Leave a comment

Climatologists explain threat of drastic cooling in North Atlantic

Drastic cooling in North Atlantic beyond worst fears, scientists warn https://www.theguardian.com/environment/2017/feb/24/drastic-cooling-north-atlantic-beyond-worst-fears-scientists-warn

Climatologists say Labrador Sea could cool within a decade before end of this century, leading to unprecedented disruption, reports Climate News Network. Guardian, 25 Feb 17. For thousands of years, parts of northwest Europe have enjoyed a climate about 5C warmer than many other regions on the same latitude. But new scientific analysis suggests that that could change much sooner and much faster than thought possible.

Climatologists who have looked again at the possibility of major climate change in and around the Atlantic Ocean, a persistent puzzle to researchers, now say there is an almost 50% chance that a key area of the North Atlantic could cool suddenly and rapidly, within the space of a decade, before the end of this century.

That is a much starker prospect than even the worst-case scientific scenario proposed so far, which does not see the Atlantic ocean current shutdown happening for several hundred years at least.

A scenario even more drastic (but fortunately fictional) was the subject of the 2004 US movie The Day After Tomorrow, which portrayed the disruption of the North Atlantic’s circulation leading to global cooling and a new Ice Age.

To evaluate the risk of extreme climate change, researchers from the Environnements et Paléoenvironnements Océaniques et Continentaux laboratory (CNRS/University of Bordeaux, France) and the University of Southampton developed an algorithm to analyse the 40 climate models considered by the IPCC’s Fifth Assessment Report.

The findings by the British and French team, published in the Nature Communications journal, in sharp contrast to the IPCC, put the probability of rapid North Atlantic cooling during this century at almost an even chance – nearly 50%.

Current climate models foresee a slowing of the meridional overturning circulation (MOC), sometimes known also as the thermohaline circulation, which is the phenomenon behind the more familiar Gulf Stream that carries warmth from Florida to European shores. If it did slow, that could lead to a dramatic, unprecedented disruption of the climate system.

In 2013, drawing on 40 climate change projections, the IPCC judged that this slowdown would occur gradually, over a long period. Its findings suggested that fast cooling of the North Atlantic during this century was unlikely.

But oceanographers from EU emBRACE had also re-examined the 40 projections by focusing on a critical spot in the northwest of the North Atlantic: the Labrador Sea.

The Labrador Sea is host to a convection system ultimately feeding into the ocean-wide MOC. The temperatures of its surface waters plummet in the winter, increasing their density and causing them to sink. This displaces deep waters, which bring their heat with them as they rise to the surface, preventing the formation of ice caps.

The algorithm developed by the Anglo-French researchers was able to detect quick sea surface temperature variations. With it they found that seven of the 40 climate models they were studying predicted a total shutdown of convection, leading to abrupt cooling of the Labrador Sea by 2C to 3C over less than 10 years. This in turn would drastically lower North Atlantic coastal temperatures.

But because only a handful of the models supported this projection, the researchers focused on the critical parameter triggering winter convection: ocean stratification. Five of the models that included stratification predicted a rapid drop in North Atlantic temperatures.

The researchers say these projections can one day be tested against real data from the international OSNAP project, whose teams will be anchoring scientific instruments within the sub-polar gyre (a gyre is any large system of circulating ocean currents).

If the predictions are borne out and the North Atlantic waters do cool rapidly over the coming years, the team says, with considerable understatement, climate change adaptation policies for regions bordering the North Atlantic will have to take account of this phenomenon.

February 27, 2017 Posted by | 2 WORLD, climate change, oceans, Reference | Leave a comment

Future sea level rise studies by NASA project – Oceans Melting Greenland (OMG)

OMG measurements of Greenland give us a glimpse of future sea rise https://www.skepticalscience.com/omg-greenland-sea-level-rise.html 24 February 2017, by John Abraham. If you meet a group of climate scientists and ask them how much sea levels will rise by, say, the year 2100, you will get a wide range of answers. But those with the most expertise in sea level rise will tell you perhaps 1 meter (a little over three feet). Then they will immediately say, “but there is a lot of uncertainty on this estimate.” It doesn’t mean they aren’t certain there will be sea level rise – that is guaranteed as we add more heat to the oceans. Here, uncertainty means it could be a lot more or a little less.

Why are scientists not certain about how much the sea level will rise? Because there are processes that are occurring that have the potential for causing huge sea level rise, but we’re uncertain about how fast they will occur. Specifically, two very large sheets of ice sit atop Greenland and Antarctica. If those sheets melt, sea levels will rise hundreds of feet.

Parts of the ice sheets are melting, but how much will melt and how fast will the melting occur? Are we talking decades? Centuries? Millennia? Scientists really want to know the answer to this question. Not only is it interesting scientifically, but it has huge impacts on coastal planning.

One reason the answer to this question is elusive is that melting of ice sheets can occur from above (warm air and sunlight) or from below (warm ocean waters). In many instances, it’s the melting from below that is most significant – but this melting from below is really hard to measure.

With hope, we will soon have a much clearer sense of ice sheet melting and sea level rise because of a new scientific endeavor that is part of a NASA project – Oceans Melting Greenland (OMG). This project has brought together some of the best oceanographers and ice experts in the world. The preliminary results are encouraging and are discussed in two recent publications here and here.

In the papers, the authors note that Greenland ice loss has increased substantially in recent decades. It now contributes approximately one-third of total sea level rise. The authors want to know whether this contribution will change over time, and they recognize that underwater processes may be the most important to study. In fact, they note in their paper:

Specifically, our goal is improved understanding of how ocean hydrographic variability around the ice sheet impacts glacial melt rates, thinning and retreat.

In plain English, they want to know how water flow around Greenland affects the ice melt.

Their experiments are measuring a number of key attributes. First, yearly changes in the temperature of ocean water near Greenland. Second, the yearly changes to the glaciers on Greenland that extend into the ocean waters. Third, they are observing marine topography (the shape of the land underneath the ocean surface).

The sea floor shape is quite complicated, particularly near Greenland. Past glaciers carved deep troughs in the sea floor in some areas, allowing warm salty water to reach huge glaciers that are draining the ice sheet. As lead OMG investigator Josh Willis said:

What’s interesting about the waters around Greenland is that they are upside down. Warm, salty water, which is heavy, sits below a layer of cold, fresh water from the Arctic Ocean. That means the warm water is down deep, and glaciers sitting in deep water could be in trouble.

As the warm water attacks marine glaciers (glaciers that extend into the ocean), the ice tends to break and calve, retreating toward land. In some cases, the glaciers retreat until their grounding line coincides with the shore. But in other cases the undulating surface allows warm water to wear the glacier underside for long distances and thereby increase the risk of large calving events.

Oftentimes, when glaciers near the coast break off they uncork other ice that can then more easily flow into the oceans.

Click here to read the rest

February 27, 2017 Posted by | ARCTIC, climate change, oceans, Reference | Leave a comment

Oh dear! Transatomic Power has been making false claims about Generation IV nuclear reactors

It’s interesting the way that, for dubious nuclear enterprises, they like to put a young woman at the top. Is this to make the nuclear image look young and trendy? Or is it so that she can cop the flak when it all goes wrong?

Below – Leslie Dewan – CEO of Transatomic Power

Nuclear Energy Startup Transatomic Backtracks on Key Promises. The company, backed by Peter Thiel’s Founders Fund, revised inflated assertions about its advanced reactor design after growing concerns prompted an MIT review. MIT Technology Review, by James Temple, February 24, 2017. Nuclear energy startup Transatomic Power has backed away from bold claims for its advanced reactor technology after an informal review by MIT professors highlighted serious errors in the company’s calculations, MIT Technology Review has learned.

The Cambridge, Massachusetts-based company, founded in 2011 by a pair of MIT students in the Nuclear Science & Engineering department, asserted that its molten salt reactor design could run on spent nuclear fuel from conventional reactors and generate energy far more efficiently than they do. In a white paper published in March 2014, the company proclaimed its reactor “can generate up to 75 times more electricity per ton of mined uranium than a light-water reactor.”

Those lofty claims helped it raise millions in venture capital, secure a series of glowing media profiles (including in this publication), and draw a rock-star lineup of technical advisors. But in a paper on its site dated November 2016, the company downgraded “75 times” to “more than twice.” In addition, it now specifies that the design “does not reduce existing stockpiles of spent nuclear fuel,” or use them as its fuel source. The promise of recycling nuclear waste, which poses tricky storage and proliferation challenges, was a key initial promise of the company that captured considerable attention.

“In early 2016, we realized there was a problem with our initial analysis and started working to correct the error,” cofounder Leslie Dewan said in an e-mail response to an inquiry from MIT Technology Review.

The dramatic revisions followed an analysis in late 2015 by Kord Smith, a nuclear science and engineering professor at MIT and an expert in the physics of nuclear reactors.

At that point, there were growing doubts in the field about the company’s claims and at least some worries that any inflated claims could tarnish the reputation of MIT’s nuclear department, which has been closely associated with the company. Transatomic also has a three-year research agreement with the department, according to earlier press releases.

In reviewing the company’s white paper, Smith noticed immediate red flags. He relayed his concerns to his department head and the company, and subsequently conducted an informal review with two other professors.

“I said this is obviously incorrect based on basic physics,” Smith says. He asked the company to run a test, which ended up confirming that “their claims were completely untrue,” Smith says.

He notes that promising to increase the reactor’s fuel efficiency by 75 times is the rough equivalent of saying that, in a single step, you’d developed a car that could get 2,500 miles per gallon.

Ultimately, the company redid its analysis, and produced and posted a new white paper………

The company has raised at least $4.5 million from Peter Thiel’s Founders Fund, Acadia Woods Partners, and Daniel Aegerter of Armada Investment AG. Venture capital veteran Ray Rothrock serves as chairman of the company.

Founders Fund didn’t immediately respond to an inquiry…… https://www.technologyreview.com/s/603731/nuclear-energy-startup-transatomic-backtracks-on-key-promises/

February 25, 2017 Posted by | Reference, spinbuster, technology, USA | Leave a comment

Effect of air pollution might have masked mid-20th Century sea ice loss

Air pollution may have masked mid-20th Century sea ice loss https://www.sciencedaily.com/releases/2017/02/170223124327.htm February 23, 2017

Source:
American Geophysical Union
Summary:
Humans may have been altering Arctic sea ice longer than previously thought, according to researchers studying the effects of air pollution on sea ice growth in the mid-20th Century.

Humans may have been altering Arctic sea ice longer than previously thought, according to researchers studying the effects of air pollution on sea ice growth in the mid-20th Century. The new results challenge the perception that Arctic sea ice extent was unperturbed by human-caused climate change until the 1970s.

Scientists have observed Arctic sea ice loss since the mid-1970s and some climate model simulations have shown the region was losing sea ice as far back as 1950. In a new study, recently recovered Russian observations show an increase in sea ice from 1950 to 1975 as large as the subsequent decrease in sea ice observed from 1975 to 2005. The new observations of mid-century sea ice expansion led researchers behind the new study to the search for the cause.

The new study supports the idea that air pollution is to blame for the observed Arctic sea ice expansion. Particles of air pollution that come primarily from the burning of fossil fuels may have temporarily hidden the effects of global warming in the third quarter of the 20th Century in the eastern Arctic, the researchers say.

These particles, called sulfate aerosols, reflect sunlight back into space and cool the surface. This cooling effect may have disguised the influence of global warming on Arctic sea ice and may have resulted in sea ice growth recorded by Russian aerial surveys in the region from 1950 through 1975, according to the new research.

“The cooling impact from increasing aerosols more than masked the warming impact from increasing greenhouse gases,” said John Fyfe, a senior scientist at Environment and Climate Change Canada in Victoria and a co-author of the new study accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

To test the aerosol idea, researchers used computer modeling to simulate sulfate aerosols in the Arctic from 1950 through 1975. Concentrations of sulfate aerosols were especially high during these years before regulations like the Clean Air Act limited sulfur dioxide emissions that produce sulfate aerosols.

The study’s authors then matched the sulfate aerosol simulations to Russian observational data that suggested a substantial amount of sea ice growth during those years in the eastern Arctic. The resulting simulations show the cooling contribution of aerosols offset the ongoing warming effect of increasing greenhouse gases over the mid-twentieth century in that part of the Arctic. This would explain the expansion of the Arctic sea ice cover in those years, according to the new study.

Aerosols spend only days or weeks in the atmosphere so their effects are short-lived. The weak aerosol cooling effect diminished after 1980, following the enactment of clean air regulations. In the absence of this cooling effect, the warming effect of long-lived greenhouse gases like carbon dioxide has prevailed, leading to Arctic sea ice loss, according to the study’s authors.

The new study helps sort out the swings in Arctic sea ice cover that have been observed over the last 75 years, which is important for a better understanding of sea ice behavior and for predicting its behavior in the future, according to Fyfe.

The new study’s use of both observations and modeling is a good way to attribute the Arctic sea ice growth to sulfate aerosols, said Cecilia Bitz, a sea ice researcher at the University of Washington in Seattle who has also looked into the effects of aerosols on Arctic ice. The sea ice record prior to satellite images is “very sparse,” added Bitz, who was not involved in the new study.

Bitz also points out that some aerosols may have encouraged sea ice to retreat. Black carbon, for instance, is a pollutant from forest fires and other wood and fossil fuel burning that can darken ice and cause it to melt faster when the sun is up — the opposite effect of sulfates. Also, black carbon emissions in some parts of the Arctic are still quite common, she said.


Story Source:

Materials provided by American Geophysical Union.

February 25, 2017 Posted by | ARCTIC, climate change, oceans, Reference | Leave a comment

The link between nuclear power stations and cancer rates

A link between cancer rates and nuclear plants? http://www.pottsmerc.com/article/MP/20170221/NEWS/170229937 Joseph Mangano, Executive Director, Radiation and Public Health Project, 02/21/17. Since the two nuclear reactors at Limerick began operating in the 1980s, the question of whether toxic radiation releases have affected local cancer rates has persisted.

February 24, 2017 Posted by | health, Reference, USA | Leave a comment

The Fukushima Daiichi nuclear power complex is a continuing, permanent catastrophe

HELEN CALDICOTT: The Fukushima nuclear meltdown continues unabated https://independentaustralia.net/politics/politics-display/helen-caldicott-the-fukushima-nuclear-meltdown-continues-unabated,10019 3 February 2017. Dr Helen Caldicott explains recent robot photos taken of Fukushima’s Daiichi nuclear reactors: radiation levels have not peaked, and the plant has continued to spill toxic waste into the Pacific Ocean; it is only now that the damage has been photographed.

RECENT reporting of a huge radiation measurement at Unit 2 in the Fukushima Daiichi reactor complex does not signify that there is a peak in radiation in the reactor building.

All that it indicates is that, for the first time, the Japanese have been able to measure the intense radiation given off by the molten fuel, as each previous attempt has led to failure because the radiation is so intense the robotic parts were functionally destroyed.

The radiation measurement was 530 sieverts, or 53,000 rems (Roentgen Equivalent for Man). The dose at which half an exposed population would die is 250 to 500 rems, so this is a massive measurement. It is quite likely had the robot been able to penetrate deeper into the inner cavern containing the molten corium, the measurement would have been much greater.
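The unit conversion behind those figures is simple: one sievert equals 100 rem. A quick sanity check of the article’s numbers (arithmetic only, not a dosimetry model):

```python
# Check the dose figures quoted above: 1 sievert = 100 rem.
SV_TO_REM = 100

measured_sv = 530                      # robot reading at Unit 2
measured_rem = measured_sv * SV_TO_REM
print(measured_rem)                    # 53000, matching the article

# The article's LD50 range (dose at which half an exposed
# population would die) is 250 to 500 rem; the reading exceeds
# even the upper bound more than a hundredfold.
print(measured_rem / 500)              # 106.0
```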

These facts illustrate why it will be almost impossible to “decommission” units 1, 2 and 3, as no human could ever be exposed to such extreme radiation. This fact means that Fukushima Daiichi will remain a diabolical blot upon Japan and the world for the rest of time, sitting as it does on active earthquake zones.

What the photos taken by the robot did reveal was that some of the structural supports of Unit 2 have been damaged. It is also true that all four buildings were structurally damaged by the original earthquake some five years ago and by the subsequent hydrogen explosions so, should there be an earthquake greater than seven on the Richter scale, it is very possible that one or more of these structures could collapse, leading to a massive release of radiation as the building fell on the molten core beneath. But units 1, 2 and 3 also contain cooling pools with very radioactive fuel rods — numbering 392 in Unit 1, 615 in Unit 2, and 566 in Unit 3; if an earthquake were to breach a pool, the gamma rays would be so intense that the site would have to be permanently evacuated. The fuel from Unit 4 and its cooling pool has been removed.

But there is more to fear.

The reactor complex was built adjacent to a mountain range and millions of gallons of water emanate from the mountains daily beneath the reactor complex, causing some of the earth below the reactor buildings to partially liquefy. As the water flows beneath the damaged reactors, it immerses the three molten cores and becomes extremely radioactive as it continues its journey into the adjacent Pacific Ocean.

Every day since the accident began, 300 to 400 tons of water have poured into the Pacific, where numerous isotopes – including cesium-137, cesium-134, strontium-90, tritium, plutonium, americium and up to 100 more – enter the ocean and bio-concentrate by orders of magnitude at each step of the food chain — algae, crustaceans, little fish, big fish, then us.
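The bioconcentration described above compounds multiplicatively at each trophic step. The sketch below assumes a tenfold gain per step purely for illustration; real concentration factors vary widely by isotope and species and are not given in the article:

```python
# Illustrative compounding of a contaminant up a food chain.
# The tenfold factor per step is an assumed round number, NOT
# a figure from the article; real factors differ by isotope.
water_activity = 1.0            # arbitrary baseline (activity per kg)
chain = ["algae", "crustaceans", "little fish", "big fish"]
FACTOR_PER_STEP = 10            # one "order of magnitude" per step

conc = water_activity
for organism in chain:
    conc *= FACTOR_PER_STEP
    print(f"{organism}: {conc:g}x baseline")
# After four steps, big fish sit at 10,000x under these assumptions.
```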

Fish swim thousands of miles and tuna, salmon and other species found on the American west coast now contain some of these radioactive elements, which are tasteless, odourless and invisible. Entering the human body by ingestion they concentrate in various organs, irradiating adjacent cells for many years. The cancer cycle is initiated by a single mutation in a single regulatory gene in a single cell and the incubation time for cancer is any time from 2 to 90 years. And no cancer defines its origin.

We could be catching radioactive fish in Australia or the fish that are imported could contain radioactive isotopes, but unless they are consistently tested we will never know.

As well as the mountain water reaching the Pacific Ocean, since the accident, TEPCO has daily pumped over 300 tons of sea water into the damaged reactors to keep them cool. It becomes intensely radioactive and is pumped out again and stored in over 1,200 huge storage tanks scattered over the Daichi site. These tanks could not withstand a large earthquake and could rupture releasing their contents into the ocean.

But even if that does not happen, TEPCO is rapidly running out of storage space and is trying to convince the local fishermen that it would be okay to empty the tanks into the sea. The Bremsstrahlung radiation, like X-rays, given off by these tanks is quite high – measuring 10 millirems – presenting a danger to the workers. There are over 4,000 workers on site each day, many recruited by the Yakuza (the Japanese Mafia), including men who are homeless, drug addicts and mentally unstable.

There’s another problem. Because the molten cores are continuously generating hydrogen, which is explosive, TEPCO has been pumping nitrogen into the reactors to dilute the hydrogen and reduce the risk of explosion.

Vast areas of Japan are now contaminated, including some areas of Tokyo, which are so radioactive that roadside soil measuring 7,000 becquerels (Bq) per kilo would qualify to be buried in a radioactive waste facility in the U.S..

As previously explained, these radioactive elements concentrate in the food chain. The Fukushima Prefecture has always been a food bowl for Japan and, although much of the rice, vegetables and fruit now grown here is radioactive, there is a big push to sell this food both in the Japanese market and overseas. Taiwan has banned the sale of Japanese food, but Australia and the U.S. have not.

Prime Minister Abe recently passed a law under which any reporter who told the truth about the situation could be gaoled for ten years. In addition, doctors who tell their patients their disease could be radiation-related will not be paid, so there is an immense cover-up in Japan as well as in the global media.

The Prefectural Oversight Committee for Fukushima Health is only looking at thyroid cancer among the population, and by June 2016, 172 people who were under the age of 18 at the time of the accident had developed, or were suspected of having, thyroid cancer; the normal incidence in this population is 1 to 2 per million.

However, other cancers and leukemia that are caused by radiation are not being routinely documented, nor are congenital malformations, which were, and are, still rife among the exposed Chernobyl population.

Bottom line, these reactors will never be cleaned up nor decommissioned because such a task is not humanly possible. Hence, they will continue to pour water into the Pacific for the rest of time and threaten Japan and the northern hemisphere with massive releases of radiation should there be another large earthquake.

February 22, 2017 Posted by | Fukushima continuing, Reference | Leave a comment