MAKING OUR GREEN ECONOMY COME ALIVE AND THRIVE™

NEWS

ENVIRONMENTAL NEWS

Remember to also click on and check out the "Green Local 175 News" for environmental and green news within a 175-mile radius of Utica-Rome. Organizations and companies can send their press releases to info@greenlocal175.com. Express your viewpoint on a particular issue by sending an e-mail to our opinion page, "Urge to be Heard". Lastly, for more environmental information, listen to our Green Local 175 Radio & Internet Show.

Chinese icebreaker steams for Antarctica in polar power play

By Ryan McMorrow

Beijing (AFP) Nov 8, 2017

The Chinese ice-breaker Xuelong steamed south from Shanghai on Wednesday bound for Antarctica, where it will establish China's newest base as Beijing strives to become a polar power.

China is a latecomer in the race for pole position but its interest in Antarctica has grown along with its economic might. The new station will be the fifth Chinese foothold on the frozen continent, more than some nations which got there earlier.

China is ploughing money into polar exploration and research as other countries like the United States pull back under funding constraints and a glut of other global priorities.

An international treaty suspends all territorial claims to Antarctica, essentially setting it aside as a scientific preserve.

That "provides a precious opportunity to quickly develop China's polar bases", Qin Weijia, director of the China Arctic and Antarctic Administration, said at an annual meeting on the poles last month.

China has rapidly built up activities on the continent, building new bases and commissioning polar-capable ships and aircraft. Officials say it intends to become a "polar power."

"The fact that China has coined this new term and has made it an important part of their foreign policy shows the level of ambition and forward thinking that China has," said Anne-Marie Brady, a Global Fellow at the Wilson Center.

Brady's research, published in her book "China as a Polar Great Power", shows that China is already the pre-eminent spender on Antarctic programmes, when its logistics, infrastructure and research funding are added together.

The multilateral Antarctic Treaty bars mineral exploitation on the continent, but that may change in 2048, when the rules governing the treaty come up for review.

Some researchers worry that resource-hungry China's interest in the South Pole is a thinly veiled cover to allow mapping of the continent in preparation for a future when mining and drilling may be allowed.

Lin Shanqing, deputy director of the State Oceanic Administration which oversees China's polar programmes, said as much last week.

China must speed up development of "polar prospecting and extraction equipment", Lin said at the administration's annual meeting.

The 334-person crew of the Xuelong, which means "Snow Dragon", will establish a temporary 206-square-meter base on rocky Inexpressible Island, a leader of the expedition told the China Daily.

This will eventually be developed into China's fifth base, with work expected to be completed around 2022.

China has a growing collection of outposts, with its largest -- the Great Wall station -- able to pack in 80 researchers in the summer months. The base was not built until 1985, more than 80 years after Argentina established Antarctica's first base, on Laurie Island in 1904.

"China will be one of the few countries with a considerable number of bases spread out over the region," said Marc Lanteigne, a lecturer at Massey University Albany in New Zealand.

"It demonstrates China is a major player on the continent."

The United States, in contrast, operates three permanent bases relying in part on decades-old equipment. Argentina tops the list with six permanent bases.

Equally important are the expensive ice-breakers, whose sturdy hulls are crucial for getting supplies to iced-in Antarctic outposts.

Russia has more than 40, while the US has just two, one of which is years past its prime. China has two ice-breakers, including the red-hulled Xuelong, with a third under construction.

For China, it is more than a strategic priority, Brady said.

The projects in Antarctica are the latest to showcase and bolster the Communist Party's case that it is leading the nation to "rejuvenation".

"It's also about stirring up patriotism and confidence, which is very important to this government," Brady said.

Some Chinese coal ash too radioactive for reuse

Radiation levels 43 times higher than UN safety standards

Date: November 9, 2017

Source: Duke University

Summary:

Many manufacturers use coal ash from power plants as a low-cost binding agent in concrete and other building materials. But a new study finds that coal ash from high-uranium deposits in China is too radioactive for this use. Some coal ash analyzed in the study contained radiation 43 times higher than the maximum safe limit set for residential building materials by the U.N. Scientific Committee on the Effects of Atomic Radiation.

Manufacturers are increasingly using encapsulated coal ash from power plants as a low-cost binding agent in concrete, wallboard, bricks, roofing and other building materials. But a new study by U.S. and Chinese scientists cautions that coal ash from high-uranium deposits in China may be too radioactive for this use.

"While most coals in China and the U.S. have typically low uranium concentrations, in some areas in China we have identified coals with high uranium content," said Avner Vengosh, professor of geochemistry and water quality at Duke University's Nicholas School of the Environment. "Combustion of these coals leaves behind the uranium and radium and generates coal ash with very high levels of radiation."

The level of radiation in this coal ash could pose human health risks, particularly if it is recycled for use in residential building materials, he said.

Some of the coal ash samples analyzed in the new study contained radiation levels more than 43 times higher than the maximum safe limit established for residential building materials by the United Nations Scientific Committee on the Effects of Atomic Radiation.

"The magnitude of radiation we found in some of the Chinese coal ash far exceeds safe standards for radiation in building materials," said Shifeng Dai, professor of geochemistry at the state key laboratory of coal resources and safe mining at China University of Mining and Technology (CUMT) in Beijing and Xuzhou, China. "This calls into question the use of coal ash originating from uranium-rich coals for these purposes."

Vengosh, Dai and their teams published their findings Nov. 8 in the peer-reviewed journal Environmental Science and Technology.

The new paper is part of an ongoing collaboration between researchers at CUMT, Duke and Duke Kunshan University to identify the environmental impacts of coal and coal ash in China. Vengosh holds a secondary faculty appointment at Duke Kunshan, which is in China.

To conduct their study, the scientists measured naturally occurring radioactivity in high-uranium coals from 57 sites in China. They also measured radiation levels in coal ash residues produced from this coal, and in soil collected from four sites.

"By comparing the ratio of uranium in the coal to the radioactivity of the coal ash, we identified a threshold at which uranium content in coal becomes too high to allow coal ash produced from it to be used safely in residential building construction," said Nancy Lauer, a Ph.D. student at Duke's Nicholas School, who led the research.

This threshold -- roughly 10 parts per million of uranium -- is applicable to high-uranium coal deposits worldwide, not just in China, "and should be considered when deciding whether to allow coal ash to be recycled into building materials," she said.
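As a purely illustrative sketch of how such a screening threshold might be applied in practice, the short Python snippet below checks coal sources against the roughly 10 parts per million figure reported in the study; the function name and the sample values are hypothetical, not from the paper.

```python
# Hypothetical sketch: screening coal sources against the ~10 ppm uranium
# threshold reported in the study before allowing ash reuse in building materials.

URANIUM_THRESHOLD_PPM = 10.0  # approximate threshold identified by the researchers

def ash_reuse_acceptable(uranium_ppm: float) -> bool:
    """Return True if coal uranium content falls below the reported threshold."""
    return uranium_ppm < URANIUM_THRESHOLD_PPM

# Illustrative (made-up) coal samples, uranium content in parts per million
samples = {"Mine A": 2.3, "Mine B": 9.8, "Mine C": 31.5}

for name, ppm in samples.items():
    verdict = "may be considered" if ash_reuse_acceptable(ppm) else "should not be used"
    print(f"{name}: {ppm} ppm uranium -> ash {verdict} for residential building materials")
```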

All radionuclide measurements of coal and coal ash samples were conducted in Vengosh's lab at Duke.

"Since our findings demonstrate that using ash from this high-uranium coal is not suitable in building materials, the challenge becomes, how do we dispose of it in ways that limit any potential water or air contamination," Vengosh said. "This question requires careful consideration."

Story Source:

Materials provided by Duke University.

Journal Reference:

Nancy Lauer, Avner Vengosh, Shifeng Dai. Naturally Occurring Radioactive Materials in Uranium-Rich Coals and Associated Coal Combustion Residues from China. Environmental Science & Technology, 2017; DOI: 10.1021/acs.est.7b0347

By using surplus food, U.S. cities could tackle hunger, waste problems

Sophie Hares

TEPIC, Mexico (Thomson Reuters Foundation) - Apples, bread, pasta and coffee are high on the list of the $218 billion worth of food that Americans dump in the bin or pour down the drain each year, costing them and the environment dear, a green group said on Wednesday.

Under growing pressure to deal with the 40 percent of food that households, restaurants, grocers and others throw away, U.S. cities must find new ways to stop waste going into landfill and get edible food to those who need it, the Natural Resources Defense Council (NRDC) said in two reports.

“This is food that is surplus... It’s not food that’s coming off people’s plates,” said Darby Hoover, a senior resource specialist at the NRDC, a U.S.-based environmental non-profit. “It’s food that’s packaged and prepared, and did not end up getting served.”

More than two-thirds of food thrown away by households is potentially edible, with uneaten food costing an average American family of four at least $1,500 a year and causing major environmental damage, said the research supported by The Rockefeller Foundation.

Meanwhile food that goes unsold in grocery stores, schools, restaurants and other consumer businesses could be redirected towards the one in eight Americans who lack a steady supply, it said.

Denver, Nashville and New York, the three cities at the center of the research, could dish up as many as 68 million extra meals a year if surplus food were donated, said the NRDC.

“If we could distribute just 30 percent of the food we currently discard, it would equate to enough food to provide the total diet for 49 million Americans,” said one of the reports.

While donating more food could help vulnerable people amid rising income inequality, it would not solve the underlying causes of poverty such as unemployment or low wages, the research added.

Cities need to find better ways to match food donations, especially from businesses, to poor communities, and reduce the food waste that takes up almost a quarter of landfill space and emits nearly 3 percent of U.S. greenhouse gases, the research said.

Improved planning and more education could help cut the amount of food people throw away, often because it has gone moldy or because they do not want to eat leftovers, said Hoover.

“Making the most of our food supply has wide-reaching benefits - helping to feed people and save money, water and energy in one fell swoop,” NRDC senior scientist Dana Gunders said in a statement.

Reporting by Sophie Hares; editing by Megan Rowling.

Research shows ice sheets as large as Greenland's melted fast in a warming climate

Purdue University

New research published in Science shows that climate warming reduced the mass of the Cordilleran Ice Sheet by half in as little as 500 years, indicating the Greenland Ice Sheet could have a similar fate.

The Cordilleran Ice Sheet covered large parts of North America during the Pleistocene - or Last Ice Age - and was similar in mass to the Greenland Ice Sheet. Previous research estimated that it covered much of western Canada as late as 12,500 years ago, but new data shows that large areas in the region were ice-free as early as 1,500 years earlier. This confirms that once ice sheets start to melt, they can do so very quickly.

The melting of the Cordilleran Ice Sheet likely caused about 20 feet of sea level rise and big changes in ocean temperature and circulation. Because fresh meltwater is less dense than salty seawater, it spreads across the ocean surface rather than sinking, disrupting the "global conveyor belt" of ocean circulation and changing climate.

Researchers used geologic evidence and ice sheet models to construct a timeline of the Cordilleran's advance and retreat. They mapped and dated moraines throughout western Canada using beryllium-10, a rare isotope that accumulates in rock surfaces exposed to cosmic rays and can therefore be used to date how long a surface has been free of ice. Measurements were made in Purdue University's PRIME Lab, a research facility dedicated to accelerator mass spectrometry.

"We have one group of beryllium-10 measurements, which is 14,000 years old, and another group, which is 11,500 years old, and the difference in these ages is statistically significant," said Marc Caffee, a professor of physics in Purdue's College of Science and director of PRIME Lab. "The only way this would happen is if the ice in that area had completely gone away and then advanced."

Around 14,000 years ago the Earth started warming, and the effects were significant - ice completely left the tops of the mountains in western Canada, and where there were ice sheets, they probably thinned a lot. About a thousand years later, the climate cooled again, and glaciers started to advance, then retreated as conditions warmed at the onset of the Holocene. If the Cordilleran Ice Sheet had still been there when the climate started cooling during a period known as the Younger Dryas, cirque and valley glaciers wouldn't have advanced during that time. This indicates a rapid disappearance rather than a gradual melting of the ice sheet.

Reconstructing precise chronologies of past climate helps researchers establish cause and effect. Some have wondered whether the melting of the Cordilleran Ice Sheet caused the Younger Dryas cooling, but it's unlikely; the cooling started too early for that to be true, according to the study. What caused the cooling is still up for debate.

Creating a timeline of glacial retreat also provides insight into how the first people got to North America. Current estimates place human migration to the south of the Cordilleran and Laurentide Ice Sheets between 14,600 and 18,000 years ago, but how they got there isn't clear. Some say humans could have crossed through an opening between the ice sheets, but these new findings show that passage was likely closed until 13,400 years ago.

This paper should serve as motivation for further studies, said Caffee. Continental ice sheets don't disappear in a simple, monolithic way - it's an extremely complicated process. The more we know about the retreat of the Cordilleran Ice Sheet, the better we'll be able to predict what's to come for the Greenland Ice Sheet.

Special Report: The decisions behind Monsanto's weed-killer crisis

Emily Flitter

(Reuters) - In early 2016, agri-business giant Monsanto faced a decision that would prove pivotal in what since has become a sprawling herbicide crisis, with millions of acres of crops damaged.

Monsanto had readied new genetically modified soybean seeds. They were engineered for use with a powerful new weed-killer that contained a chemical called dicamba but aimed to control the substance’s main shortcoming: a tendency to drift into neighboring farmers’ fields and kill vegetation.

The company had to choose whether to immediately start selling the seeds or wait for the U.S. Environmental Protection Agency (EPA) to sign off on the safety of the companion herbicide.

The firm stood to lose a lot of money by waiting. Because Monsanto had bred the dicamba-resistant trait into its entire stock of soybeans, the only alternative would have been “to not sell a single soybean in the United States” that year, Monsanto Vice President of Global Strategy Scott Partridge told Reuters in an interview.

Betting on a quick approval, Monsanto sold the seeds, and farmers planted a million acres of the genetically modified soybeans in 2016. But the EPA’s deliberations on the weed-killer dragged on for another 11 months because of concerns about dicamba’s historical drift problems.

That delay left farmers who bought the seeds with no matching herbicide and three bad alternatives: Hire workers to pull weeds; use the less-effective herbicide glyphosate; or illegally spray an older version of dicamba at the risk of damage to nearby farms.

The resulting rash of illegal spraying that year damaged 42,000 acres of crops in Missouri, among the hardest hit areas, as well as swaths of crops in nine other states, according to an August 2016 advisory from the U.S. Environmental Protection Agency. The damage this year has covered 3.6 million acres in 25 states, according to Kevin Bradley, a University of Missouri weed scientist who has tracked dicamba damage reports and produced estimates cited by the EPA.

The episode highlights a hole in a U.S. regulatory system that has separate agencies approving genetically modified seeds and their matching herbicides.

Monsanto has blamed farmers for the illegal spraying and argued it could not have foreseen that the disjointed approval process would set off a crop-damage crisis.

But a Reuters review of regulatory records and interviews with crop scientists shows that Monsanto was repeatedly warned by crop scientists, starting as far back as 2011, of the dangers of releasing a dicamba-resistant seed without an accompanying herbicide designed to reduce drift to nearby farms.

In 2015, just before Monsanto released its soybean seeds, Arkansas regulators notified the firm of damage from illegal spraying of its dicamba-resistant cotton seeds. Some cotton farmers chose to illegally spray old versions of dicamba because other herbicides approved for use on the seeds were far less effective.

The EPA did not approve the new dicamba formulation that Monsanto now sells for use with cotton and soybean seeds - XtendiMax with Vapor Grip - until after the 2016 growing season.

Monsanto’s Partridge acknowledged that the company misjudged the regulatory timeline for approval of its new herbicide.

“The EPA process was lengthier than usual,” Partridge said.

Monsanto, however, denies culpability for the crisis that followed the two-stage approval.

“The illegal misuse of old dicamba herbicides with Xtend seeds was not foreseeable,” the company’s attorneys said in a response to one class action suit filed by farmers in Missouri. “Even if it were foreseeable that farmers would illegally apply old dicamba to their Xtend crops, which it was not, Monsanto is not liable for harms caused by other manufacturers’ products.”

Monsanto’s Partridge said in a written statement that the reports of damage from illegal spraying of dicamba on its cotton seeds in 2015 were “extremely isolated.”

“Those who applied dicamba illegally should be held responsible,” Partridge said.

Monsanto’s handling of the delayed herbicide approval may cause the firm legal and public relations damage, but it has boosted the company’s business considerably. Instead of halting seed sales while waiting on herbicide approval, Monsanto captured a quarter of the nation’s massive soybean market by the start of 2017, according to the U.S. Department of Agriculture.

Even the damage from dicamba may have boosted sales. Some farmers whose crops were harmed said in interviews that they bought Monsanto’s new dicamba-resistant seeds as a defense against drift from nearby spraying.

State regulators believe the illegal spraying of dicamba-tolerant cotton and soybean crops continued in 2017 - after the EPA approved Monsanto’s new herbicide. Farmers would still benefit from using old versions of dicamba because it is cheaper than XtendiMax. Many growers also have dicamba on hand because it is legal to use for limited purposes.

Regulators have not yet determined, however, how much damage came from illegal spraying and how much came from the legal application of XtendiMax, which weed scientists say still vaporizes under certain conditions.

Monsanto concedes that XtendiMax has caused crop damage, but blames farmers who the company says did not properly follow directions for applying the herbicide.

The EPA, after delaying a decision on XtendiMax, gave the herbicide a limited two-year approval - as opposed to the standard 20 years - in case drift issues arose.

A U.S. Department of Agriculture spokesman, Rick Corker, acknowledged in a statement to Reuters that the release of an engineered seed before its companion herbicide caused problems. The department, he said, is now in talks with the EPA about whether to coordinate approvals of paired seeds and chemicals.

“USDA and EPA are in discussions regarding the timing of our deregulations,” Corker said in a statement.

The EPA did not comment on whether it planned any policy changes in response to the dicamba crisis.

EARLY WARNINGS

Dicamba is cheap, plentiful, and has been used as a weed killer for decades. But its tendency to damage nearby fields had caused U.S. regulators to limit its use to the task of clearing fields before planting or after harvest, when there are no crops to damage and cooler temperatures make it less likely the substance will migrate.

Farmers who illegally sprayed dicamba during growing season are now facing fines of up to $1,000 for each violation of EPA rules limiting the use of dicamba, which are enforced by state regulators.

Farmers with damaged crops have filed at least seven lawsuits — five class-action suits and two by individuals — seeking compensation from Monsanto. The suits claim the company should have known that releasing the seeds without a paired herbicide would cause problems.

Monsanto officials had been repeatedly warned of the potential for damage from illegal spraying of dicamba on seeds designed to resist the chemical.

In October 2011, five scientists from Ohio State University addressed a conference in Columbus focused on the future of dicamba. In attendance were agriculture researchers from across the country as well as representatives of the companies Monsanto, Dow Chemical and BASF.

According to Douglas Doohan, one of the conference’s organizers, three Monsanto employees, including Industry Affairs Director Douglas Rushing, attended the meeting. Monsanto had a keen interest in the topic because the company was far along in developing its new line of dicamba products at the time.

In their introduction to the symposium, Doohan and his colleagues outlined what they called an increased risk of illegal dicamba spraying by farmers once dicamba-resistant seeds became available. They also argued that dicamba-resistant seeds - and the illegal spraying that might accompany them - would lead farmers whose crops were damaged to buy their own dicamba-tolerant seeds to protect themselves from further drift, according to conference records.

Monsanto’s Rushing gave his own presentation about dicamba to the symposium, according to conference records. Rushing explained the need for a new herbicide-and-seed combination to replace those that had grown less effective as weeds become more tolerant to certain chemicals, according to slides outlining Rushing’s conference presentation. He raised the issue of damage from dicamba drift, but said the risks could be reduced by using certain kinds of sprayers and taking other precautions.

Rushing could not be reached for comment. Monsanto did not directly respond to questions about the symposium.

DAMAGE REPORTS

Years later, some of what the scientists outlined in their presentations was becoming reality.

Monsanto released its dicamba-resistant cotton seed in the summer of 2015. The seed was compatible with two other legally available herbicides, giving farmers options for dealing with weeds before the EPA’s approval of XtendiMax.

But farmers started digging into their dicamba stockpiles anyway, and damage reports started to trickle in. Monsanto officials were among the first to see those reports, according to minutes of an Arkansas Plant Board committee meeting in July 2015.

Jammy Turner, a Monsanto salesman, was on the Arkansas Plant Board, the agricultural regulator that investigated the complaints. He and Duane Simpson, a Monsanto lobbyist, attended the committee meeting. There, the board’s Pesticide Division Director Susie Nichols gave a report about drift damage complaints linked to the new seed technology.

At that meeting, lobbyist Simpson was asked by the board what Monsanto was doing about drift damage complaints, according to the minutes. Simpson told the committee that the firm had been telling farmers not to spray dicamba illegally, even over crops specifically designed to withstand it. He said the company would consider pulling whatever licenses Monsanto had given to offending farmers to use its technology.

At an Aug. 8, 2016 meeting of the same committee, Simpson was asked again how Monsanto was dealing with farmers illegally spraying dicamba on Xtend crops. This time, he responded that Monsanto saw no way to pull farmers’ seed licenses over the issue.

Monsanto did not comment on Simpson’s statements in response to written questions from Reuters. The company said it would consider revoking a particular farmer’s license if asked to do so by state regulators “when they have investigated and adjudicated an egregious violation.”

Larry Steckel, a weed scientist and professor at the University of Tennessee, said those early damage reports should have been a red flag to Monsanto against releasing its soybean seeds the following year.

“It turned out to be a precursor of what was to come,” he said.

Neither Turner nor Simpson responded to calls and emails seeking comment. Monsanto did not comment on the company’s involvement in the Arkansas investigations.

By the end of 2015, the damage reports linked to the Xtend cotton seeds were making the rounds among scientists. Weed scientist Michael Owen, a professor at Iowa State University, said he warned at the ISU Annual Integrated Crop Management Conference on Dec. 2-3, 2015 that no dicamba formulations had been approved for use on Xtend crops. He told attendees it wasn’t clear when the formulas would be greenlighted, and that the situation was cause for concern.

Monsanto representatives attended that conference, according to ISU Program Services Coordinator Brent Pringnitz, who handled the registration. He would not identify them.

Owen said he also repeated his warning directly to Monsanto officials around the same time, but did not name them.

Monsanto did not comment on Owen’s assertion that he warned the company about dicamba spraying.

FLAWED ASSUMPTIONS

Farmers who spoke to Reuters and others who gave testimony recorded in state records described a variety of reasons why they purchased Xtend seeds before XtendiMax was available.

One farmer in Arkansas, Doug Masters, planted Xtend cotton in 2015 and was caught illegally spraying dicamba, according to Arkansas Plant Board records. He said the Monsanto salesman who sold him the Xtend seeds told him that, by the time the plants came up in the summer of 2015, it would be legal to spray dicamba and he should go ahead and do it, according to the board records.

Masters declined to identify the Monsanto salesman to Arkansas regulators, records show. He declined again, when reached by Reuters, to identify the representative. Masters admitted that he illegally sprayed dicamba, and the Plant Board fined him $4,000, assessing the maximum penalty allowed for four violations.

Monsanto did not comment on Masters’ testimony to the plant board.

Ritchard Zolman, another Arkansas farmer caught illegally spraying his cotton in 2015, said in a disciplinary hearing held by the Arkansas Plant Board that he’d planted Xtend seeds because he thought he could spray dicamba legally over his fields 14 days before his crops came up from the ground, according to records.

But the Plant Board ruled it was illegal to spray dicamba onto a field where planted crops had not yet sprouted, disciplinary records show. Zolman was fined $3,000 for three dicamba spraying violations, records show.

Zolman declined to comment.

In Missouri, Gary Dalton Murphy III said he and his family planted Xtend soybeans in 2016 after “hearing” that dicamba would be legal to spray by the summer. He did not say who told him dicamba would be legal that season.

When Murphy learned XtendiMax would not be available, the family got rid of their weeds by hand, hiring extra workers to help.

(Additional reporting by Steve Barnes in Little Rock, Arkansas; Editing by Rich Valdmanis and Brian Thevenot)

As Wind Power Sector Grows, Turbine Makers Feel the Squeeze

By Stanley Reed, Nov. 9, 2017

Wind power is increasing in importance and is getting cheaper, a boon for customers but a challenge for the companies that make the equipment.

Vestas Wind Systems and Siemens Gamesa are giants of the wind-power industry, building mammoth turbines that rise high into the air and power more and more homes. But disappointing earnings reports from the two companies this week indicated that even they are struggling to adapt to a fast-changing sector.

Wind power is an increasingly important source of electricity around the world, and prices for the technology are dropping fast. But belt-tightening governments across Europe and North America are phasing out subsidies and tax incentives that had helped the industry grow, squeezing companies like Vestas and Siemens Gamesa in the process.

On Thursday, Vestas, the world’s largest maker of wind turbines, said its revenue in the third quarter fell 6 percent compared with the same period a year ago, to 2.7 billion euros, or $3.1 billion. Profit dropped 18 percent, to about €250 million, Vestas said. The figures sent the Danish company’s shares plummeting by as much as 20 percent.

The Vestas results came just days after Siemens Gamesa Renewable Energy — the recently formed company combining the wind-power units of Siemens, the German conglomerate, and Gamesa of Spain — reported a €147 million loss for the third quarter. The Madrid-listed company also said it would have to shed 6,000 jobs.

Executives and analysts blamed several factors for the two companies’ poor results.

In addition to the phasing out of tax credits and guaranteed prices by some governments, prices for solar power have fallen rapidly, making it a competitor to wind in some parts of the world. Perhaps more significant, countries like Britain, Chile and Germany are using competitive auctions more often to award enormous wind and solar power projects, helping push down costs.

Such auctions have helped lower by 15 percent the costs per unit of electricity generated by onshore wind projects set to come online over the next five years, and by a third for offshore wind projects in the same period, according to the International Energy Agency, an organization based in Paris.

“The overall theme that affects all companies is that you are seeing a transition to power auctions that are quite competitive,” said Brian Gaylord, a senior analyst at MAKE, a market research firm.

As an example of the tougher conditions, Vestas said on Thursday that the price it gets for its turbines has fallen sharply. The company received around €800,000 per megawatt, a unit of power capacity, for the orders it booked in the third quarter of the year. By comparison, it received €950,000 per megawatt in late 2016.

Those cost differences are significant given the size of the wind turbines that Vestas produces. Its largest onshore turbine can pump out 4.2 megawatts of power, enough to provide electricity to roughly 5,000 homes.
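For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch using only the figures quoted above; the homes-per-turbine conversion is the article's own rough estimate, not a precise engineering figure.

```python
# Quick check of the figures cited above (values taken from the article).

price_2016 = 950_000   # euros per megawatt, orders booked in late 2016
price_2017 = 800_000   # euros per megawatt, orders booked in Q3 2017

decline = (price_2016 - price_2017) / price_2016
print(f"Price per megawatt fell by about {decline:.0%}")   # roughly 16%

turbine_mw = 4.2              # Vestas' largest onshore turbine, in megawatts
homes_served = 5_000          # article's rough estimate for one such turbine
homes_per_mw = homes_served / turbine_mw
print(f"Roughly {homes_per_mw:.0f} homes served per megawatt of capacity")
```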

In a conference call with analysts on Thursday, Anders Runevad, the company’s chief executive, described a landscape of competitive electricity auctions that started in Latin America and have spread across much of the globe. Vestas said in a statement that it was “seeing accelerated competition and decreasing profitability.”

Things could get worse. Proposals being considered by lawmakers in the United States, one of the world’s biggest markets for wind power, could sharply reduce the value of tax credits for new wind power projects. That would potentially reduce the sector’s growth, and create uncertainty for manufacturers.

“We believe challenges” for wind turbine makers “are mounting,” Sean McLoughlin, an analyst at the bank HSBC, wrote in a note to clients on Monday.

Still, there are positive signs. A close look at the data appears to show that the industry is not in a fatal swoon but needs to adapt to current trends. Vestas, for example, reported a 48 percent rise in orders for the third quarter, a key metric for the company, compared with the same period a year ago.

And while there is little evidence that a respite is coming for manufacturers, the industry trends have been positive for consumers. Wind energy accounted for 20 percent of new global power capacity in 2016, according to the International Energy Agency. And the falling costs are a boon for households.

The shift, Mr. Gaylord said, “is ultimately good for consumers, and the power industry.”

NJ Sets 1st State Limits for Perfluorinated Chemicals in Drinking Water

"Besides lead, no contaminant in drinking water has provoked as loud a public outcry in the last two years in the United States as a class of chemicals known as perfluorinated compounds.

New Jersey regulators are taking the strongest action to date on the man-made chemicals that are used in scores of household and industrial products. The state will be the first to require utilities to test for two compounds and remove them from drinking water.

The New Jersey Department of Environmental Protection announced on November 1 that it accepted a recommendation from state water quality experts to set a legal limit for PFOA of 14 parts per trillion. The announcement follows a proposal in August to allow drinking water to contain no more than 13 parts per trillion of PFNA, a related perfluorinated compound used at plastics manufacturing facilities in New Jersey and found in public water systems."

Plastic fibres found in tap water around the world, study reveals

Exclusive: Tests show billions of people globally are drinking water contaminated by plastic particles, with 83% of samples found to be polluted

Damian Carrington, Guardian Environment editor

Tuesday 5 September 2017 19.01 EDT

Microplastic contamination has been found in tap water in countries around the world, leading to calls from scientists for urgent research on the implications for health.

Scores of tap water samples from more than a dozen nations were analysed by scientists for an investigation by Orb Media, who shared the findings with the Guardian. Overall, 83% of the samples were contaminated with plastic fibres.

The US had the highest contamination rate, at 94%, with plastic fibres found in tap water sampled at sites including Congress buildings, the US Environmental Protection Agency’s headquarters, and Trump Tower in New York. Lebanon and India had the next highest rates.

European nations including the UK, Germany and France had the lowest contamination rate, but this was still 72%. The average number of fibres found in each 500ml sample ranged from 4.8 in the US to 1.9 in Europe.

The new analyses indicate the ubiquitous extent of microplastic contamination in the global environment. Previous work has been largely focused on plastic pollution in the oceans, which suggests people are eating microplastics via contaminated seafood.

“We have enough data from looking at wildlife, and the impacts that it’s having on wildlife, to be concerned,” said Dr Sherri Mason, a microplastic expert at the State University of New York in Fredonia, who supervised the analyses for Orb. “If it’s impacting [wildlife], then how do we think that it’s not going to somehow impact us?”

A separate small study in the Republic of Ireland released in June also found microplastic contamination in a handful of tap water and well samples. “We don’t know what the [health] impact is and for that reason we should follow the precautionary principle and put enough effort into it now, immediately, so we can find out what the real risks are,” said Dr Anne Marie Mahon at the Galway-Mayo Institute of Technology, who conducted the research.

Mahon said there were two principal concerns: very small plastic particles and the chemicals or pathogens that microplastics can harbour. “If the fibres are there, it is possible that the nanoparticles are there too that we can’t measure,” she said. “Once they are in the nanometre range they can really penetrate a cell and that means they can penetrate organs, and that would be worrying.” The Orb analyses caught particles of more than 2.5 microns in size, 2,500 times bigger than a nanometre.

Microplastics can attract bacteria found in sewage, Mahon said: “Some studies have shown there are more harmful pathogens on microplastics downstream of wastewater treatment plants.”

Microplastics are also known to contain and absorb toxic chemicals and research on wild animals shows they are released in the body. Prof Richard Thompson, at Plymouth University, UK, told Orb: “It became clear very early on that the plastic would release those chemicals and that actually, the conditions in the gut would facilitate really quite rapid release.” His research has shown microplastics are found in a third of fish caught in the UK.

The scale of global microplastic contamination is only starting to become clear, with studies in Germany finding fibres and fragments in all of the 24 beer brands they tested, as well as in honey and sugar. In Paris in 2015, researchers discovered microplastic falling from the air, which they estimated deposits three to 10 tonnes of fibres on the city each year, and that it was also present in the air in people’s homes.

This research led Frank Kelly, professor of environmental health at King’s College London, to tell a UK parliamentary inquiry in 2016: “If we breathe them in they could potentially deliver chemicals to the lower parts of our lungs and maybe even across into our circulation.” Having seen the Orb data, Kelly told the Guardian that research is urgently needed to determine whether ingesting plastic particles is a health risk.

The new research tested 159 samples using a standard technique to eliminate contamination from other sources and was performed at the University of Minnesota School of Public Health. The samples came from across the world, including from Uganda, Ecuador and Indonesia.

How microplastics end up in drinking water is for now a mystery, but the atmosphere is one obvious source, with fibres shed by the everyday wear and tear of clothes and carpets. Tumble dryers are another potential source, with almost 80% of US households having dryers that usually vent to the open air.

“We really think that the lakes [and other water bodies] can be contaminated by cumulative atmospheric inputs,” said Johnny Gasperi, at the University Paris-Est Créteil, who did the Paris studies. “What we observed in Paris tends to demonstrate that a huge amount of fibres are present in atmospheric fallout.”

Plastic fibres may also be flushed into water systems, with a recent study finding that each cycle of a washing machine could release 700,000 fibres into the environment. Rains could also sweep up microplastic pollution, which could explain why the household wells used in Indonesia were found to be contaminated.

In Beirut, Lebanon, the water supply comes from natural springs but 94% of the samples were contaminated. “This research only scratches the surface, but it seems to be a very itchy one,” said Hussam Hawwa, at the environmental consultancy Difaf, which collected samples for Orb.

This planktonic arrow worm, Sagitta setosa, has eaten a blue plastic fibre about 3mm long. Plankton support the entire marine food chain. Photograph: Richard Kirby/Courtesy of Orb Media

Current standard water treatment systems do not filter out all of the microplastics, Mahon said: “There is nowhere really where you can say these are being trapped 100%. In terms of fibres, the diameter is 10 microns across and it would be very unusual to find that level of filtration in our drinking water systems.”

Bottled water may not provide a microplastic-free alternative to tap water, as plastic fibres were also found in a few samples of commercial bottled water tested in the US for Orb.

Almost 300m tonnes of plastic is produced each year and, with just 20% recycled or incinerated, much of it ends up littering the air, land and sea. A report in July found 8.3bn tonnes of plastic has been produced since the 1950s, with the researchers warning that plastic waste has become ubiquitous in the environment.

“We are increasingly smothering ecosystems in plastic and I am very worried that there may be all kinds of unintended, adverse consequences that we will only find out about once it is too late,” said Prof Roland Geyer, from the University of California, Santa Barbara, who led the study.

Mahon said the new tap water analyses raise a red flag, but that more work is needed to replicate the results, find the sources of contamination and evaluate the possible health impacts.

She said plastics are very useful, but that management of the waste must be drastically improved: “We need plastics in our lives, but it is us that is doing the damage by discarding them in very careless ways.”

Report by Orb Media:

https://orbmedia.org/stories/Invisibles_plastics

Cooling with propane is more energy efficient

Cooling homes and small office spaces could become less costly and more efficient with new early stage technology developed by Oak Ridge National Laboratory. Researchers designed a window air conditioning unit that uses propane as the refrigerant, cooling the air with 17 percent higher efficiency than the best ENERGY STAR® commercial units. "Propane offers superior thermodynamic properties and creates 700 percent less pollution than standard refrigerants," said ORNL's Brian Fricke. "We developed a system that takes advantage of these qualities and reduces global warming potential." The team's early-stage technology includes a novel heat exchanger, compressor and controls that require less propane than similar units used overseas. The team's laboratory evaluations demonstrate the prototype unit is the first propane window air conditioner to meet U.S. building safety standards.

Supporting hurricane damage assessments

Geospatial scientists at Oak Ridge National Laboratory have developed a novel method to quickly gather building structure datasets that support emergency response teams assessing properties damaged by Hurricanes Harvey and Irma. By coupling deep learning with high-performance computing, ORNL collected and extracted building outlines and roadways from high-resolution satellite and aerial images. As hurricanes formed in the Gulf Coast and the Atlantic Ocean, ORNL activated their technique. "During devastating weather events, it's difficult and time consuming to assess damage manually," said ORNL's Mark Tuttle. "Our method supports emergency response efforts by providing preliminary building structure data--which can be categorized for residential, multi-family and commercial properties--on the county level, and this has been applied for hurricane-impacted areas of Texas, Florida, Puerto Rico and other U.S. Caribbean territories." During Hurricane Harvey, ORNL analyzed nearly 2,000 images covering nearly 26,000 square miles of building structures in Texas' coastal counties in just 24 hours, a process that would typically take up to nine months.

Study identifies additional hurdle to widespread planting of bioenergy crops

Indiana University

INDIANAPOLIS -- A study examining how certain decisions impact what farmers plant and harvest identified one crucial factor that researchers believe needs to be added to the list of decision variables when considering bioenergy crops: the option value.

Most studies have not examined the role of the option value, which has to do with farmers waiting to see how bioenergy crop prices will change in the future, said Jerome Dumortier, an associate professor in the Indiana University School of Public and Environmental Affairs at IUPUI and a co-author of the study.

The study, "Production and spatial distribution of switchgrass and miscanthus in the United States under uncertainty and sunk cost," was published in the journal Energy Economics and is one of the first to consider adding the option value to the list of barriers to widespread planting of bioenergy crops. It also shows the spatial distribution of potential bioenergy crops in the U.S. when considering the option value.

"Farmers take into account the uncertainty associated with price fluctuations with bioenergy crops," Dumortier said. "They don't know how the price is going to evolve. Hence, they have to wait. Hence, fewer bioenergy crops are produced."

Previous studies have focused on comparing the costs of traditional crops -- corn, soybeans, wheat -- to planting bioenergy crops -- switchgrass, miscanthus -- and the cost of harvesting agricultural residue.

Dumortier said bioenergy crops have a higher energy density than other crops but are more expensive. There is also the issue of displacing acreage used to produce food, he said.

The study also found:

- Bioenergy crops are less likely to be produced in the Midwest, because productivity for traditional crops is so high that converting acreage to dedicated bioenergy crops would not be cost-effective.
- Agricultural residues, or the organic material left in the fields after crops are harvested, are abundant enough in the Corn Belt to supply the quantity of cellulosic ethanol mandated by the U.S. Renewable Fuel Standard.

There are two main sources for the cellulosic ethanol called for in the fuel standard. One is bioenergy crops. The second is agricultural residue.

"The fuel standard mandate is covered by agricultural residue, and planting of switchgrass or miscanthus is not necessary," Dumortier said.

Harvesting agricultural residue also has its problems, including either slowing harvesting as farmers gather the crops and the residue in a single pass through a field or increasing labor as farmers harvest their fields twice, once for crops and once for residue.

All of these issues play a role in determining the cost of the option value.

"This study shows that the option value cost now must be added to the higher costs of bioenergy crops and the cost of harvesting agricultural residue," Dumortier said.

Co-authors of the study are Nathan Kauffman, Federal Reserve Bank of Kansas City, and Dermot Hayes, Department of Economics, Iowa State University.

Energy efficiency labeling for homes has little effect on purchase price

Labeling doesn't appear to convince home buyers to pay more for energy efficient homes.

Norwegian University of Science and Technology

Most buyers aren't thinking about energy performance certification when they're house shopping. That's the conclusion of a team of researchers from the Norwegian University of Science and Technology (NTNU) after they conducted a thorough assessment of how the labels affected home pricing.

"Energy labeling has zero effect on the price. The scheme doesn't seem to be achieving its intended purpose," says Professor Olaf Olaussen at the NTNU Business School.

The Norwegian energy labeling system for homes and dwellings was implemented in 2010. One of the arguments for the system was that a good energy performance rating would be an advantage for the seller as well.

Energy performance is rated from A to G. It's intended as a guide for buyers so that they know just how much energy a home requires. With energy-efficient housing, you can save a lot of money over the years, and thus you may be willing to pay more for the property. The EU uses the same system.

"We found that some European studies, especially a Dutch one, showed that the energy labeling made a big difference to the price, but it seemed strange," says Olaussen.

These studies had weaknesses. They only showed the impact after energy labeling was introduced, not before, and usually only relied on data from a single year.

Olaussen and his colleagues Are Oust and Jan Tore Solstad believed that a price premium might be due to something other than a high energy performance rating. They set out to test this.

In Norway, energy labeling was not implemented gradually. It was launched in full in July 2010. Norwegian data on the prices of most home sales are easy to access. This makes it relatively easy - albeit time consuming - to compare price developments both before and after energy labelling was introduced.

The Norwegian researchers did find an apparent effect of the energy labeling system when they used the same method as the Dutch study. But this effect disappeared when they used a more thorough procedure.

Instead, the researchers took the home sales figures from 2000 to July 2010 and from July 2010 to 2014, which gave a picture of how prices for houses developed over the long term. They also compared houses with similar characteristics, such as homes in the same area and of the same type.
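A highly simplified sketch of that before-and-after comparison might look like the following; the file name and column names are hypothetical, and the study's actual econometric analysis is more involved than this grouping of average prices.

```python
import pandas as pd

# Simplified sketch of the comparison described above (hypothetical data layout).
# 'sales' is assumed to hold one row per transaction, with sale date, price,
# an energy rating, and matching attributes such as area and dwelling type.
sales = pd.read_csv("home_sales_2000_2014.csv", parse_dates=["sale_date"])

labeling_start = pd.Timestamp("2010-07-01")   # energy labeling introduced in Norway
sales["period"] = (sales["sale_date"] >= labeling_start).map(
    {True: "after", False: "before"}
)
sales["high_rating"] = sales["energy_rating"].isin(["A", "B", "C"])

# Compare average prices for comparable homes (same area and dwelling type)
# before and after labeling, split by whether the home has a good rating.
summary = (
    sales.groupby(["area", "dwelling_type", "period", "high_rating"])["price"]
         .mean()
         .unstack("high_rating")
)
print(summary.head())
```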

"We found that the homes that had a price premium after energy labeling was introduced had that advantage before it, too," says Olaussen.

The advantage had to come from something else. Perhaps, homes with better energy performance were generally of higher quality. Or maybe something else was a factor.

What may be more important than whether your home has an energy rating of A or G is whether it is in a child-friendly area, or near shops, or has ocean views or other things that buyers are willing to pay a premium for.

Quick bidding rounds may play a role

"I don't think most people care about the energy rating when they buy a home. Other factors play a role, especially in a market like ours [in Norway], with fast bidding rounds. Then you're not thinking about whether 'this property has an A-rating and I can afford a little more'," says Olaussen.

Simply stated, Norway's home sales system works like this: potential buyers go to a house showing, and sign a list with their names and mobile phone numbers if they are interested in participating in the bidding for the house. Once the bidding process begins, it can be fast and furious, especially if there are many bidders eager to buy a single property.

This kind of bidding on housing isn't common in very many other countries. You find it mostly in Norway, Sweden, Australia, New Zealand and to some extent in Scotland and Ireland.

In other countries, you usually work from a fixed price, with a little bit of wiggle room. You often have much more time when buying. Maybe energy labelling would have greater significance in that scenario. But new studies from Europe aren't indicating that. They match the results from NTNU.

Researchers at the NTNU Business School are now studying whether variations in the price of electricity affect how the energy rating impacts house sales.

But if the energy label doesn't matter for sales, how are you going to get people to make their homes more energy efficient?

"You can require energy labeling for home sales. You can give other financial benefits for upgrading homes. You can actively use available support schemes and create different incentives," says Olaussen.

Norwegians can apply for financial support from a government funded entity called Enova if they want to make their home more energy efficient.

And you can still upgrade your home for other reasons, such as a desire to be more environmentally friendly or to save electricity. But you definitely don't want to invest your money here if you simply want to sell your home for a higher price.

Reference: Olaussen, Jon Olaf; Oust, Are; Solstad, Jan Tore (2017). Energy performance certificates: Informing the informed or the indifferent? Energy Policy, Vol. 111.

As It Looks to Go Green, China Keeps a Tight Lid on Dissent

Chinese President Xi Jinping has vowed to clean up the nation’s air and water and create an “ecological civilization.” Yet even as it carries out major environmental reforms, China’s government is making sure activist green movements stay under its control.

By Michael Standaert • November 2, 2017

China’s “chairman of everything,” Xi Jinping, has just solidified as much power as any Chinese leader since Mao Zedong, with his ideas for a “new era” for his nation enshrined in the constitution at last month’s 19th Communist Party Congress.

Among his new priorities are the creation of an “ecological civilization,” with Xi pledging to clean up three decades of environmental degradation, protect the country’s ecosystems, stringently enforce environmental laws and regulations, and create a “green economy.” Xi also has pledged to forge ahead with the country’s commitments under the Paris climate agreement, as China’s once-soaring CO2 emissions have finally plateaued in the past several years. And the Xi administration is expected to launch a national carbon trading system next month, creating the largest such regime in the world.

But as Xi — responding to years of growing public outrage and activism over the country’s egregious air, water, and soil pollution — casts himself as the nation’s chief protector of the environment, he and his administration have made it clear that the nation’s environmental problems will be tackled on the government’s terms and timetable. Given the Chinese leadership’s overriding concern about societal stability, the Xi administration has not hesitated to clamp down when it perceives that an environmental campaign is gaining broad popular support and taking on a life of its own. Nor has it been shy about appropriating environmental causes championed by the country’s NGOs and green activists.

That heavy-handed response to burgeoning green movements was on display earlier this year with the release of the film, “Plastic China,” which took a hard look at the dirty business of recycling imported plastic, including an image of a baby being born on a mound of trash. The documentary, directed by filmmaker Wang Jiuliang, went viral in early 2017, but the government soon blocked it on the Internet. The film may have helped prompt the government to speed up plans to ban imports on many categories of foreign waste, although the central government has never acknowledged why it decided to enact the ban so quickly last July.

The swift reaction to “Plastic China” echoed the response to journalist Chai Jing’s “Under the Dome,” a vivid account of the toll of air pollution on China’s citizens. That video went viral and was viewed an estimated 150 million to 300 million times within the first three days of its release. Its soaring popularity — an implicit rebuke to local and central government officials who had failed to crack down on polluters — occurred just before the annual National People’s Congress meetings in Beijing in 2015. Authorities quickly blocked the film on the Chinese Internet. Nevertheless, the government is moving aggressively to reduce air pollution in Beijing and other cities.

Pollution of the soil from noxious factories has also become a significant concern over the past several years, but the government has yet to release the results of tests it has performed on the nation’s tainted soils. Last year, a scandal erupted when 500 students from an elite high school near Shanghai were afflicted with numerous ailments caused by pollution and illegal dumping from three nearby pesticide factories. But news of that situation has largely disappeared from public view.

Environmental concerns are one of the leading causes of protests in China as citizens take to the Internet or the streets to vent their anger and frustration over runaway pollution and a lack of transparency on environmental issues. Roughly five years ago, grassroots protests sprang up throughout China as residents of so-called “cancer villages” — locales near chemical plants and factories whose residents suffer from high incidences of cancer — took to the streets. But fewer environmental protests take place in China today, and those that do occur — such as a demonstration in late 2016 protesting severe air pollution in the industrial city of Chengdu — are often quickly broken up by police.

Many of the anti-pollution policies the leadership in Beijing has rolled out in recent years have been at least partly aimed at heading off social instability by giving citizens a sense that leaders are going in the right direction, thus letting the air out of potentially explosive public outbursts.

Protestors in Beijing following the release of the viral documentary about Chinese air pollution "Under the Dome" in 2015.

“I think it follows a familiar pattern,” said Calvin Quek, the head of Sustainable Finance for Greenpeace East Asia. “The government reads the tea leaves, monitoring social networks like Weibo and WeChat, and when something really blows up, they acknowledge it, but they never want any independent sources of momentum. They are basically saying, ‘Thank you very much, we’ll take care of it now, we’re in charge of it.’ They have become very smart about getting information and controlling the narrative. They’ve been able to say, ‘We want to be final arbitrators of the public discussion.’ And I think the public is fine with it.”

Daniel Gardner, a professor at Smith College and author of Environmental Pollution in China: What Everyone Needs to Know — soon to be published by Oxford University Press — says that vested economic and political interests have long been concerned over grassroots environmental phenomena. This was evident in the crackdown on “Plastic China” and, especially, “Under the Dome,” which pointed to official corruption and collusion with fossil fuel companies as root causes of the air pollution problem.

“While Beijing may tolerate the common street protests of 100, 2,000, even 10,000 people, largely because they are of a familiar NIMBY-style and take as their targets local industry and local government, the specter of 200 million people from all around the country tuning in to ‘Under the Dome,’ and perhaps realizing they have common cause for building a national movement, was too unsettling,” Gardner said in an interview.

Ma Jun, one of China’s best-known environmental figures and director of the Institute of Public & Environmental Affairs, which pushes for greater transparency of pollution data, said that in many ways, both documentaries “achieved their target” and “prepared the public to support some very tough policy choices” related to air pollution controls and bans on imports of trash and toxic materials for recycling.

“When [a campaign] moves beyond the environment, and becomes an event of its own, there are lines.”

But when these movements became phenomena in and of themselves, China’s leaders grew worried about their ability to control such movements, Ma said in an interview.

“These documentaries show the space is there for discussion, but they also show there are some lines here and there,” Ma said. “When it moves beyond the environment, and becomes an event of its own, there are lines.”

Numerous examples have arisen in recent years in which Xi Jinping’s administration has taken action on the heels of environmental campaigns by activists and NGOs.

Earlier this year, China launched a major cleanup campaign along one of the sources of the Yellow River and ousted several local officials for their complicity in allowing a large open-pit coal mine to extensively damage a nature reserve. Yet it was Greenpeace China that first broke the news of this environmental scandal.

Also this year, the government imposed new rules on the use, transport, and storage of chemicals, a move that was seen, in part, as a response to a 2015 disaster in which improperly stored chemicals set off massive explosions in the port city of Tianjin that killed an estimated 170 people and injured hundreds more. The government attempted to censor reporting on the disaster in the news media and on Web sites such as Weibo, but word of the explosions nevertheless filtered out through social media.

Government moves to end the illegal importation of ivory into China only really took off after WildAid launched a campaign in 2013 using basketball star Yao Ming and other celebrities in ads. (The government has vowed to ban the ivory trade by the end of this year.) Other instances of crackdowns on the illegal trade in wildlife, such as pangolins, have been prompted in part by campaigns launched by stars like Jackie Chan.

Similarly, the air pollution crises in Beijing and other large cities in recent years have left the government scrambling to stay out in front on environmental issues after initially bungling its response. As recently as 2012, cities across China, particularly Beijing, were releasing inadequate air quality readings — basically saying a day was either good, bad, or moderately bad. The state-run press often referred to extreme air pollution as “bad fog.”

Public outcry has spurred government action on air pollution in Beijing (above) and other Chinese cities. Lei Han / Flickr

But with the U.S. Embassy in Beijing updating fine particulate matter readings hourly on Twitter, and Beijing residents living through a worsening series of “airpocalypses,” the central government was forced to shift into high gear. It soon began releasing accurate air pollution readings for major cities and pushed through a 10-point action plan to deal with airborne emissions, largely spurred by increasingly vocal protests online — and in the streets — over choking air pollution.

Today, the government, at least at the top levels, understands the pollution challenges and is actively communicating them. And it is trying to reach not only average Chinese citizens, but government officials, as well.

“The audience for [Xi’s ecological civilization] campaign is officials, in my view — officials in the central government who continue to hold a ‘pollute first, clean up later’ economic policy,” said Gardner. “And, critically, it’s to signal to local officials at all levels that the central government wants cooperation from them.”

Messages demonstrating the central government’s concern for the environment have figured prominently in propaganda efforts over the past several months, with a prime-time broadcast devoted to “ecological civilization” reforms airing in July. Xi’s administration also has pushed through a variety of other high-profile environmental reforms in recent years: commitments announced in July to block imports of a variety of solid waste, including consumer plastics; halting plans for hydropower facilities along the country’s last free-flowing major river, the Nujiang (Salween); the creation of a national park system modeled on U.S. national parks; and a surge in new environmental laws, including an amended environmental protection law, an amended water pollution law, and the creation of the country’s first soil pollution law.

Add to all this a major push to develop renewable energy, deploy millions of electric vehicles, and reform energy markets, and Xi Jinping is increasingly looking, to many Chinese, like an environmental reformer.

The machinery of state censorship vigilantly restricts information on certain environmental issues.

Throughout, however, China’s government is carefully managing the debate. The machinery of state censorship, which actively monitors online discussion, vigilantly restricts information on certain environmental issues. The California-based Web site, China Digital Times, regularly publishes leaked directives from China’s state censors on issues such as deleting a report on air pollution deaths, quashing smog forecasts, and prohibiting reporting on lawsuits filed against provincial officials for failing to regulate pollution.

The government also is tightening restrictions on national and foreign NGOs out of fear that their activities might foment environmental activism. These restrictions include the passage last year of a Charity Law overseeing domestic NGOs, as well as adoption of a foreign NGO law. Foreign NGOs have been hampered since the start of this year by requirements that they obtain institutional sponsors and register with public security authorities.

As a result, it’s difficult these days to get staff members at foreign NGOs to go on the record or talk without having quotes vetted by communications staff. The process of going through registration for the foreign NGO law “is making us more careful,” said a source at a prominent international NGO in Beijing.

“When you have to register, you’re going to think about how what you say is perceived” by authorities, said the source, who asked not to be identified. “They want to be in control of the message.”

Michael Standaert is a freelance journalist based out of South China, primarily covering environment, energy and climate change policy and regulatory news for Bloomberg BNA, and contributing to other publications, most recently including MIT Technology Review and South China Morning Post. He has resided in China since 2007.

Down hundreds of staff, Weather Service ‘teetering on the brink of failure,’ labor union says

By Jason Samenow October 26

A National Weather Service meteorologist in Norman, Okla., tracks a super cell tornado outbreak. (National Weather Service)

After the onslaught of devastating hurricanes and wildfires, the United States is enduring one of its costliest years for extreme weather. A near-record 16 separate billion-dollar weather disasters have ravaged the nation. Meanwhile, the National Weather Service workforce is spread razor thin, with hundreds of vacant forecast positions.

The National Weather Service Employees Organization, its labor union, said the lack of staff is taking a toll on forecasting operations and that the agency is “for the first time in its history teetering on the brink of failure.” Managers are being forced to scale back certain operations, and staff are stressed and overworked.

“It’s gotten so bad that we’re not going to be able to provide service that two years ago we were able to provide to public, emergency managers and media,” said Dan Sobien, the president of the union. “We’ve never been in that position before.”

As one example of an overburdened Weather Service office, the team of 15 forecasters serving the Washington and Baltimore region will be short five full-time staff heading into the winter months, according to Ray Martin, a union representative who works there. He said the office is short a senior forecaster, a general forecaster, two junior forecasters, and the lead for its weather observation program — a position that has remained vacant for two years.

Martin said staff morale is in the tank. “Some people have been denied vacations, because there are not enough bodies to fill shifts,” he said. “I, myself, worked a 15-hour day about a week ago. You get a lot less sleep. You start to wonder if you’re safe on the road. You don’t see your loved ones, which eats into family life.”

Martin added that the office is cutting shifts and that one afternoon and evening forecasting desk, charged with analyzing weather radar, won’t be staffed all winter long “because we just don’t have the bodies.” Forecasters staffing other desks will have to monitor radar by committee, in addition to their other responsibilities.

Whether the cutbacks will affect the quality of forecasts and warnings, “I can’t say for certain,” Martin said. “You’re working people double shifts, some people aren’t getting days off, and you’re grinding people down. There is that potential [to affect the forecast quality]. The longer this goes on, the more the potential rises. It’s a long winter ahead.”

The Burlington Free-Press reported similar circumstances at the forecast office serving its region. “Given our staffing, our ability to fill our mission of protecting life and property would be nearly impossible if we had a big storm,” Brooke Taber, a Weather Service forecaster and union steward, told the paper.

Susan Buchanan, a spokeswoman for the National Weather Service, pushed back on any notion the organization is neglecting obligations to constituents and the health of its staff. “Let me state emphatically that we would never take an action that would jeopardize the services we provide to emergency managers and the public,” she said. “NWS is taking definitive steps to ensure the health and well-being of our employees through guidance to local managers on scheduling and flexibility.”

For the past five years, the union has loudly voiced concerns about staff vacancies and their consequences, even filing grievances. The Weather Service has faced different obstacles in trying to fill positions, including the 2013 budget sequester, related hiring freezes, and changes in administrations. In 2012, challenged to find funding to compensate its workers, it was embroiled in a “reprogramming” scandal, in which it moved around funds to cover payroll, without congressional approval.

“The NWS leadership has been incapable of placing their budget priorities correctly,” the union said in a news release this week. “In fact, the NWS nationwide has not had full staffing levels for at least seven years.”

A union fact sheet on the vacancy issues stressed “understaffing is not due to underfunding,” stating that Congress has fully funded the Weather Service since the 2013 fiscal year and that there have been “unspent carry-over funds” in the tens of millions of dollars. Nevertheless, those funds haven’t been earmarked for staffing, and the vacancy problem has worsened.

An independent report from the Government Accountability Office showed staff vacancies increased 57 percent from 2014 to 2016. The overall vacancy rate reached 11 percent (455 positions) at the end of 2016, up from 5 percent (211 positions) at the end of 2010, the report said.

The union believes the number of vacancies is even higher, closer to 700.

Buchanan asserts the union is exaggerating the number of vacancies, which she said is closer to 226 or 5 percent. “The 700 vacancy figure cited by the union was based on an old organization table that does not reflect the agency’s current staffing profile,” she said.

But Sobien said the Weather Service’s numbers are misleading unless the agency cut several hundred jobs in the last seven years, which it had no mandate to do. “They’re just not filling positions and saying they don’t exist anymore,” he said. “They’re moving vacancies around the country and not filling them with new bodies. They’re playing a game with these numbers. I don’t even know if it’s legal.”

A senior official with the National Oceanic and Atmospheric Administration (NOAA), which oversees the National Weather Service, said the number of positions the agency can fill has necessarily decreased due to congressional appropriations. The official said the agency has not eliminated positions. While the “ceiling” for the total number of positions is 4,890 per the National Weather Service Table of Organization, the official said the number of positions is 4,453 people under the 2017 fiscal year appropriations act. “We believe this level is adequate to achieve National Weather Service core functions,” the official said.

In sum, the severity of the vacancy situation depends on whether it is based on the total number of positions in the legacy organizational table or the total number of positions for which Congress has provided funding. The union and the Government Accountability Office base their vacancy numbers on the organizational table, which yields a much higher total of unfilled positions than the Weather Service reports.
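
The two counts can be roughly reconciled from the figures cited in this story. The sketch below is only an illustration: it assumes the union measures open slots against the legacy table of 4,890 positions while the Weather Service measures them against the 4,453 funded positions, and neither side has published its exact accounting.

```python
# Rough reconciliation of the competing vacancy counts, using only figures
# cited in this story. Assumption: the union counts open slots against the
# legacy organizational table, while NWS counts them against positions
# funded under the fiscal 2017 appropriation.

table_positions = 4890    # "ceiling" in the NWS Table of Organization
funded_positions = 4453   # positions funded per the FY2017 appropriations act
nws_vacancies = 226       # vacancies NWS acknowledges against funded slots

staff_on_board = funded_positions - nws_vacancies       # about 4,227 people
vacancies_vs_table = table_positions - staff_on_board   # about 663 positions

print(f"Vacancies vs. funded positions: {nws_vacancies} "
      f"({nws_vacancies / funded_positions:.1%})")
print(f"Vacancies vs. legacy table:     {vacancies_vs_table} "
      f"({vacancies_vs_table / table_positions:.1%})")
# About 5 percent against funded positions (the NWS figure) versus roughly
# 663 against the legacy table, in the ballpark of the union's ~700 estimate.
```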

Buchanan stressed the Weather Service is aggressively moving to fill open slots. “We’re working with NOAA to prioritize NWS field hiring actions to address the most critical need and to streamline the security clearance process to expedite hiring,” she said. “We are also releasing a nationwide announcement for lead forecaster positions this week and are planning batch hires of meteorologist interns three times per year.”

The vacancy situation has not gone unnoticed by Congress. In its 2018 fiscal year budget markup for the Weather Service, the Senate Appropriations Committee wrote that the “extended vacancies are unacceptable — particularly when the Committee has provided more than adequate resources and direction to fill vacancies expeditiously for the past several years.”

It directed NOAA to fully account for all filled and open positions in its fiscal year 2018 spending plan. It acknowledged some Weather Service positions “may be redundant,” but instructed the Weather Service to develop justifications for eliminating any positions. “Until such time as a plan to eliminate those vacancies is approved, NWS is directed to continue to fill all vacancies as expeditiously as possible,” the committee budget markup said.

"Arsenic Reductions in Drinking Water Tied to Fewer Cancer Deaths"

"The Environmental Protection Agency’s revised rule on arsenic contamination in drinking water has resulted in fewer lung, bladder and skin cancers.

In 2006, the E.P.A. reduced the arsenic maximum in public water systems to 10 micrograms per liter, from the previous level of 50 micrograms. The rule does not apply to private wells.

Using data from a continuing nationwide health survey, researchers compared urinary arsenic levels in 2003, before the new rule went into effect, with those in 2014, after it had been fully implemented. There were 14,127 participants in the study, and the scientists adjusted for arsenic contributions from tobacco and dietary sources. The report is in Lancet Public Health."

Bat Poop: A Reliable Source of Climate Change

University of South Florida (USF Health)

People have long known that bat guano - the polite term for what the flying mammals leave on the floors of caves where they live worldwide - is a valuable source of fuel and fertilizer, but newly published research from University of South Florida geoscientists shows that the refuse is also a reliable record of climate change.

In a new paper published this week in the research journal Scientific Reports, USF geochemistry Professor Bogdan Onac and PhD student Daniel Cleary report that isotopes found in bat guano over the last 1,200 years can provide scientists with information on how the climate was and is changing.

The scientists examined bat guano from a cave in northwestern Romania to produce new insight into how the climate in east-central Europe has changed since the Medieval Warm Period, about 850 AD.

Nitrogen cycling within temperate forests is very sensitive to changes in the amount of winter precipitation received each year. As nitrogen isotopes have shifted in response to variation in winter precipitation over the past 2,000 years, that signature has been transferred from the soil to plant leaves, to insects, to bats, and ultimately to the guano.

"Luckily for scientists, the statement 'you are what you eat' also applies to bats," Onac said.

Scientists frequently examine chemical records in natural substances to document how the climate has changed in the past, and to lend insight into how rapidly it is changing now. They drill mud cores from the sediments under the oceans and ice cores in the Arctic and Antarctica, examine tree rings, or use the chemistry of cave formations (stalagmites) as climate proxies.

Bat guano is rich in nitrogen, and scientists know that nitrogen moves through the food chain and through animals, where it is returned to the environment. When bats return to the same location within a cave, the guano piles beneath their roost can reach sizable dimensions. In Măgurici Cave in Romania, the researchers found a three-meter-tall pile of bat guano that has been accumulating for more than a thousand years.

Isotopic analysis of the guano pile in the Măgurici Cave resulted in a near annual record of winter precipitation for the region. The location of this cave in the foreland of the East Carpathian Mountains means winter precipitation is modulated by the North Atlantic Oscillation (NAO), with wetter conditions influencing the availability of nitrogen within the surrounding forest system. Using historical records of precipitation, a relationship between winter precipitation and NAO phases was established. Through this work, past phases of the NAO could then be reconstructed back to 1600 AD, Cleary said.

The work represents the first study to provide a paleo-record of this large scale atmospheric circulation pattern for East-Central Europe using cave bat guano. The USF researchers collaborated with Babes-Bolyai University in Cluj, Romania, and the University of Bremen in Germany.

'Scars' left by icebergs record West Antarctic ice retreat

University of Cambridge

Thousands of marks on the Antarctic seafloor, caused by icebergs which broke free from glaciers more than ten thousand years ago, show how part of the Antarctic Ice Sheet retreated rapidly at the end of the last ice age as it balanced precariously on sloping ground and became unstable. Today, as the global climate continues to warm, rapid and sustained retreat may be close to happening again, and could trigger runaway ice retreat into the interior of the continent, which in turn would cause sea levels to rise even faster than currently projected.

Researchers from the University of Cambridge, the British Antarctic Survey and Stockholm University imaged the seafloor of Pine Island Bay, in West Antarctica. They found that, as seas warmed at the end of the last ice age, Pine Island Glacier retreated to a point where its grounding line - the point where it enters the ocean and starts to float - was perched precariously at the end of a slope.

Break up of a floating 'ice shelf' in front of the glacier left tall ice 'cliffs' at its edge. The height of these cliffs made them unstable, triggering the release of thousands of icebergs into Pine Island Bay, and causing the glacier to retreat rapidly until its grounding line reached a restabilising point in shallower water.

Today, as warming waters caused by climate change flow underneath the floating ice shelves in Pine Island Bay, the Antarctic Ice Sheet is once again at risk of losing mass from rapidly retreating glaciers. Significantly, if ice retreat is triggered, there are no relatively shallow points in the ice sheet bed along the course of Pine Island and Thwaites glaciers to prevent possible runaway ice retreat into the interior of West Antarctica. The results are published in the journal Nature.

"Today, the Pine Island and Thwaites glaciers are grounded in a very precarious position, and major retreat may already be happening, caused primarily by warm waters melting from below the ice shelves that jut out from each glacier into the sea," said Matthew Wise of Cambridge's Scott Polar Research Institute, and the study's first author. "If we remove these buttressing ice shelves, unstable ice thicknesses would cause the grounded West Antarctic Ice Sheet to retreat rapidly again in the future. Since there are no potential restabilising points further upstream to stop any retreat from extending deep into the West Antarctic hinterland, this could cause sea-levels to rise faster than previously projected."

Pine Island Glacier and the neighbouring Thwaites Glacier are responsible for nearly a third of total ice loss from the West Antarctic Ice Sheet, and this contribution has increased greatly over the past 25 years. In addition to basal melt, the two glaciers also lose ice by breaking off, or calving, icebergs into Pine Island Bay.

Today, the icebergs that break off from Pine Island and Thwaites glaciers are mostly large table-like blocks, which cause characteristic 'comb-like' ploughmarks as these large multi-keeled icebergs grind along the sea floor. By contrast, during the last ice age, hundreds of comparatively smaller icebergs broke free of the Antarctic Ice Sheet and drifted into Pine Island Bay. These smaller icebergs had a v-shaped structure like the keel of a ship, and left long and deep single scars in the sea floor.

High-resolution imaging techniques, used to investigate the shape and distribution of ploughmarks on the sea floor in Pine Island Bay, allowed the researchers to determine the relative size and drift direction of icebergs in the past. Their analysis showed that these smaller icebergs were released due to a process called marine ice-cliff instability (MICI). More than 12,000 years ago, Pine Island and Thwaites glaciers were grounded on top of a large wedge of sediment, and were buttressed by a floating ice shelf, making them relatively stable even though they rested below sea level.

Eventually, the floating ice shelf in front of the glaciers 'broke up', which caused them to retreat onto land sloping downward from the grounding lines to the interior of the ice sheet. This exposed tall ice 'cliffs' at their margin with an unstable height, and resulted in rapid retreat of the glaciers from marine ice cliff instability between 12,000 and 11,000 years ago. This occurred under climate conditions that were relatively similar to those of today.

"Ice-cliff collapse has been debated as a theoretical process that might cause West Antarctic Ice Sheet retreat to accelerate in the future," said co-author Dr Robert Larter, from the British Antarctic Survey. "Our observations confirm that this process is real and that it occurred about 12,000 years ago, resulting in rapid retreat of the ice sheet into Pine Island Bay."

Today, the two glaciers are getting ever closer to the point where they may become unstable, resulting once again in rapid ice retreat.

Permits invalidated for big Washington state methanol plant

Phuong Le, Associated Press

September 19, 2017 Updated: September 19, 2017 1:17pm

SEATTLE (AP) — U.S. environmental groups opposed to the Pacific Northwest becoming an international fossil fuels gateway scored a major victory when a Washington state board invalidated two permits for a $2 billion project to manufacture methanol from natural gas and export it to China.

Last week's decision by the state Shorelines Hearings Board is a setback for the project by Northwest Innovation Works on the Columbia River in the small city of Kalama.

The China-backed consortium wants to build the refinery that would produce up to 10,000 metric tons a day of methanol from natural gas piped in from North American sources. The methanol sent to China would be used to make plastics and other consumer goods.

The conservation groups Columbia Riverkeeper, Sierra Club and the Center for Biological Diversity had challenged local permits issued for the project, arguing that the environmental review wrongly concluded the project's greenhouse gas emissions would not be significant.

Washington's state Shorelines Hearings Board agreed, ruling Friday that officials from Cowlitz County and the Port of Kalama failed to fully analyze the impacts of greenhouse gas emissions from the project, including emissions from offsite sources.

The panel reversed two shoreline permits and sent the review back to the county and port for further analysis.

"We are disappointed in the Shorelines Hearings Board's order because the EIS followed both the letter and intent of (the Department of) Ecology's guidance," Vee Godley, president of Northwest Innovation Works, said in an emailed statement Monday night.

He said the company looked forward to working with others "to do our part to provide greater certainty and positive impact for our state's economic and environmental goals."

The review did not factor in impacts of greenhouse gas emissions beyond the facility site, including from producing and transporting natural gas, said Brett VandenHeuvel, executive director of Columbia Riverkeeper, which has fought other Northwest fossil fuel projects.

The groups said the project would add a new, enormous source of carbon pollution at a time when Washington state is trying to reduce its carbon footprint.

Janette Brimmer, an attorney for Earthjustice representing the groups, said a more accurate analysis will give a clear picture of the project's true carbon pollution and prompt measures to address those impacts.

Methanol, a wood alcohol, is used to make olefins, a component in everyday products such as eyeglasses, insulin pumps and fleece jackets. Developers have said the project would reduce greenhouse gas emissions globally by producing methanol from natural gas rather than coal.

Environmental groups are also fighting a major crude-oil terminal proposed about 30 miles (48 kilometers) upstream from the proposed methanol plant.

The oil-by-rail terminal proposed by Vancouver Energy would handle about 360,000 barrels of crude oil a day.

Oil would arrive by train to be stored and then loaded onto tankers and ships bound for West Coast refineries. A state energy panel is reviewing the project and Washington Gov. Jay Inslee, a Democrat, has the final say over whether it will be built.

Northwest Innovation Works in April 2016 canceled plans to build a $3.4 billion methanol plant in Tacoma, Washington, following vocal opposition from people concerned about potential environmental and health impacts.

The company's methanol refinery operation in Kalama is estimated to produce more than 1 million metric tons of greenhouse gas emissions each year and increase the state's total emissions by 1 percent.

The company had argued that no state or federal agency has published rules concerning quantifying indirect greenhouse gas emissions in an environmental review and that the Department of Ecology's guidance was the only source about the issue.

The Ecology Department approved the permit for the project in June. But the agency determined the greenhouse gas emissions were significant and imposed conditions including requiring the facility to reduce its emissions by 1.7 percent each year, under the state's new rule to cap pollution.

Elaine Placido, director of building and planning for Cowlitz County, said Tuesday that the county is planning to discuss the next steps with the project developer.

Ohio bills would ease restrictive setbacks for new wind farms

Written By Kathiann M. Kowalski

Two bills in the Ohio Senate aim to ease restrictions in a 2014 law that have effectively blocked the development of new commercial wind farms in the state.

That 2014 budget bill amendment tripled the previous property line setbacks for wind turbines and has stymied all but a few projects that had already gone through the permitting process under prior law.

“We have stopped further development of wind projects in the state of Ohio while everyone else around the country seems to be progressing,” said Cliff Hite (R-Findlay), who introduced Senate Bill 188 on September 14. “Unless we do something, we’re not going to compete with these other states.”

Hite’s bill is similar to a budget amendment that the Ohio Senate passed in June, but which was rejected by the Ohio House of Representatives in conference committee negotiations. Under Hite’s bill, the property line setback would be 120 percent of the turbine’s height, which is still more than the pre-2014 law required.

The other bill, SB 184, would restore the pre-2014 property line setback of 110 percent of the turbine height. Michael Skindell (D-Lakewood) introduced that bill on August 31.

Between them, SB 184 and 188 have 20 co-sponsors, which is more than a majority of the 33 lawmakers elected to the state senate. “I think we’ll get it done in the Senate,” Hite said. “Then we’ll see what happens in the House.”

Measuring up

Before the law changed in 2014, the minimum setbacks for commercial wind farm turbines were about 1,300 feet from the nearest habitable building and 1.1 times the turbine height from the property line.

At a minimum, that meant that a turbine would be about a quarter mile from any home, explained Andrew Gohn at the American Wind Energy Association. And if a turbine fell down, it still wouldn’t land on an adjoining property.

A last-minute budget bill amendment in 2014 made the property line setback the same as the former residence setback. Rep. Bill Seitz (R-Cincinnati), then in the state senate, was the only person who spoke in favor of the change when it was briefly discussed on the senate floor before passage.
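
To put the competing rules on a common footing, here is a rough comparison. The 500-foot tip height is an illustrative assumption, not a figure from this article; the 110 percent, 120 percent, and roughly 1,300-foot distances are the ones described above.

```python
# Rough comparison of the Ohio property-line setback rules discussed above.
# The 500-foot tip height is an illustrative assumption; the multipliers and
# the ~1,300-foot figure come from this article.

turbine_tip_height_ft = 500  # assumed height to the blade tip

setbacks_ft = {
    "Pre-2014 rule / SB 184 (110% of height)":       1.1 * turbine_tip_height_ft,
    "SB 188 proposal (120% of height)":              1.2 * turbine_tip_height_ft,
    "Current 2014 rule (former residence setback)":  1300,
}

for rule, distance in setbacks_ft.items():
    print(f"{rule:48s} ~{distance:,.0f} ft from the property line")
# For a turbine of this assumed height, the current rule requires more than
# twice the distance of either proposed alternative, which is why developers
# say it effectively blocks projects on typical parcels.
```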

Skindell was the only lawmaker who spoke out against the more restrictive setback provision at that time. Among other things, Skindell noted that there had been no hearings or testimony on the merits of the change.

Any arguments that the residence setback should apply to the property line for health and safety reasons are “specious,” said Gohn. The current requirements are “essentially a de facto zoning from the legislature, which is certainly something that’s not welcomed by a lot of communities that want to host wind projects.”

‘Wind is our shale’

Those communities have been “the real losers here,” said Peter Kelley, also at the American Wind Energy Association. Companies that have been unable to move forward with projects in Ohio usually take their projects elsewhere, he continued. “So really, who’s missing out here are the communities in Ohio that would like to have this option for their economic development.”

The benefits of that development can be huge. According to a May 2017 study by the Wind Energy Foundation, the current law acts as a market barrier and has blocked over $4.2 billion in lifetime economic benefits for Ohio.

Among other things, those benefits would include “millions for your county that will be invested into your schools and your local county government,” Hite explained.

For example, he noted, Van Wert County has already received millions of dollars for its schools and other needs. Another project in Hardin County should bring in about $17 million for the local government and schools over a 20-year period, he added.

“Wind is our shale,” Hite stressed. The reference compares wind resources for his district in the northwestern part of the state to the shale gas resources that have boosted revenues for some parts of southeastern Ohio since 2008.

Other revenues and jobs may be at stake as well.

Amazon, Facebook, General Motors and other companies have announced goals of sourcing their electricity needs from 100 percent renewable energy. For example, on September 19, Starwood Energy Group Global announced a deal in which General Motors agreed to buy all the electricity to be produced by a commercial wind farm in northwest Ohio.

That project is the last grandfathered project that was permitted under the prior law, noted Trish Demeter at the Ohio Environmental Council. If Ohio’s lawmakers don’t see the opportunities from wind energy and fix the law, “Ohio will be left in the dust in the clean energy era,” she said.

Indeed, cities and states often compete with each other to attract new facilities that will add jobs. “Those opportunities are not going to wait around forever,” Kelley said.

Although some wind energy critics argue about “trespass zoning” if setbacks are relaxed, supporters say current setbacks “really interfere with property rights of all those people who would like to host a turbine,” Kelley said.

“Farmers love this because it’s a cash crop they can rely on when commodity prices go south or when there’s a drought,” Kelley continued. “When it becomes difficult to make the family farm work, having turbines on their land can be a huge boon to a way of life that was disappearing. And yet their property rights are negated by this overzealous setback rule.”

Hite says he wants “a compromise between protecting the property rights of people who don’t want [turbines on their land], but also protecting the rights of people who do want them. And I think this bill does that.”

County commissioners could still vote not to have wind energy in their area if that’s what their people wanted. However, Hite noted, those areas would also be unable to take advantage of some incentives to attract new facilities. And they would forego the economic benefits from wind farms.

“I just think it is right for a state to use the resources that it has to try to help make the state better,” Hite continued. “Where we live, we’ve got wind and we want to harvest it and use it.”

“And,” he added, “I don’t have a problem reducing the carbon footprint in the state of Ohio for my granddaughters.”

Wind farms in Ohio pit environmentalists against some neighbors tired of noise, view

By Dan Gearino, The Columbus Dispatch

Posted Sep 24, 2017 at 5:45 AM Updated Sep 24, 2017 at 5:54 AM

PAYNE, Ohio — From the ground, the narrow aluminum ladder might as well extend to infinity. Actual height: 290 feet.

A Dispatch videographer straps on a protective harness, hard hat and safety glasses, joined by two employees of the farm’s operator, EDP Renewables. They are about to climb inside one of 55 wind turbines at Timber Road II wind farm in Paulding County.

The first steps are easy, even with 10 pounds of cameras and other gear.

Just resist the urge to look up, or down.

As the climbers ascend, their only rest is on three metal platforms, which are spaced out within the tower to break up the journey and catch any falling objects.

“The first thing that really hits you (is) the size in general, the gravity of just how much machinery goes into putting these things together,” said Jeremy Chenoweth, an EDP operations manager whose territory includes all of Ohio, and who made the climb.

Wind farms are a big, and growing, business in Ohio. They’re a part of the state’s clean-energy economy that has gone from near zero to more than $1 billion worth of spending in the past 10 years, with the potential to grow fourfold if every announced project is built.

But some neighbors view the turbines as an affront, spoiling the landscape with noise, the flicker of shadows from turbine blades and blinking red lights.

This is the gut-level underpinning of a Statehouse battle over rules on where turbines can be placed, a debate that will determine how much building will be allowed to occur.

On one side are the wind-energy industry, environmentalists and companies that want to increase the supply of clean power. On the other are some of the neighboring residents, along with a patchwork of conservative-leaning groups.

The state’s wind farms are all in northwestern Ohio, but regulators have approved others just outside of the Columbus metro area, with projects planned for Crawford, Champaign, Hardin and Logan counties, and still more in the pre-development stage.

So the debate about wind energy could be coming to your neighborhood, even if you live nowhere near northwestern Ohio.

Top of the world

Inside the wind tower, the climb takes about an hour, and the final steps are a strain. Muscles ache. Clothes are soaked with sweat.

But there is a reward. At the very top is a schoolbus-size room that holds a generator and control equipment.

On the ceiling is a clear plastic hatch that one of the EDP guys pops open.

Then, blue sky.

The three climbers step onto the roof for a gobsmacking view. The safety harness remains in place. It’s safe to stand.

Fort Wayne, Indiana, is visible, 22 miles to the west. And, if you stop to focus, you see the tiny rectangles of houses, on farms and along rural highways.

The turbine is turned off whenever somebody is working in it, so there is no electricity being generated. When active, the carbon-fiber blades slice through the air at speeds that can reach 185 mph at the tips. The top room and its contents weigh a crushing 70-plus tons, and send electricity down the tower through a series of high-voltage lines. The lines then go underground to connect with substations, and then meet up with interstate powerlines that feed into the country’s power grid.
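
The quoted tip speed is consistent with simple rotational arithmetic. The check below assumes a rotor radius of about 50 meters and a rotor speed of about 16 revolutions per minute, which are typical values for turbines of this class rather than figures reported in this story.

```python
import math

# Back-of-the-envelope check of the ~185 mph tip speed. The rotor radius and
# rotation rate are assumptions typical of a turbine in this size class, not
# figures reported in this story.

rotor_radius_m = 50   # assumed blade length (rotor radius)
rotor_rpm = 16        # assumed rotor speed near rated wind

tip_speed_m_per_s = 2 * math.pi * rotor_radius_m * rotor_rpm / 60
tip_speed_mph = tip_speed_m_per_s * 2.23694  # meters/second to miles/hour

print(f"Tip speed: {tip_speed_m_per_s:.0f} m/s, about {tip_speed_mph:.0f} mph")
# Roughly 84 m/s, or close to 190 mph, consistent with the figure above.
```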

Only from a distance, which is how most people see wind farms, does this giant machine look like a pinwheel.

Not many jobs

All 255 of the wind-farm turbines operating in Ohio have been built along a stretch of Paulding and Van Wert counties, where the land is flat and the winds are some of the most brisk. If you include turbines at homes and businesses, the statewide total is 302, according to the American Wind Energy Association, a trade group.

A typical turbine in northwest Ohio is 1.5 to 2 megawatts; the Timber Road II wind farm has a total of 99 megawatts. For some perspective, a 2 megawatt turbine in moderate wind generates enough electricity in a year to provide for the needs of more than 500 houses, based on estimates cited by EDP.
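
EDP's estimate is easy to sanity-check. The sketch below assumes a capacity factor of about 33 percent and average household use of about 10,800 kilowatt-hours per year; both are illustrative assumptions, not EDP figures.

```python
# Sanity check of "a 2 megawatt turbine ... more than 500 houses."
# The capacity factor and household consumption are illustrative assumptions,
# not EDP figures.

turbine_mw = 2.0
capacity_factor = 0.33            # assumed, typical for onshore Midwest wind
household_kwh_per_year = 10_800   # assumed average U.S. household usage

annual_output_mwh = turbine_mw * 8760 * capacity_factor
households_served = annual_output_mwh * 1000 / household_kwh_per_year

print(f"Annual output: {annual_output_mwh:,.0f} MWh")
print(f"Households served: {households_served:,.0f}")
# Roughly 530 households, consistent with "more than 500 houses."
```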

When a wind farm is developed, there is a flurry of spending for parts and construction services. After that, the costs are minimal. The fuel — wind — is free, and the developer needs only a few people to do ongoing work.

One of those jobs belongs to Benjamin Werkowski, 28, local operations manager for EDP. He’s the first person up the ladder.

“I used to drive a dump truck, right out of high school, because I didn’t know what I wanted to do,” he said. “And then I was hauling stone and dirt at the first wind farm they put in Indiana, and it just sparked my interest, and I went from there.”

He lives in Van Wert, one of 23 full-time EDP workers who live in the area.

This small employment footprint means that there are no throngs of wind-industry workers to advocate for their business the way there would be for a power plant that runs on coal or nuclear and might employ hundreds of people. And EDP’s headquarters is nowhere nearby; the company is based in Spain, with its main U.S. office in Houston.

EDP primarily interacts with its Ohio neighbors financially — lease payments to landowners, taxes to local governments, and charitable giving — and visibly, given the near-constant sight of the turbines.

This creates a dynamic that some people see as a conflict between the haves and have-nots, with some residents surrounded by turbines but receiving little or no money.

“It’s just very annoying, very unpleasant,” said Brenda DeLong, 61, interviewed on her front porch.

She lives on a lot that has been in her family for generations and was once part of her parents’ farm. Now, she has a view of Blue Creek Wind Farm, the state’s largest, along with parts of the Timber Road farms.

Near dusk on a Thursday, she begins to count the turbines. After a walk around the house, she is finished with 114, 115, 116.

“And that’s about all I see,” she said.

In other words, she can see nearly half of all the turbines on all of Ohio’s wind farms without leaving her 1-acre property.

She is a retired fourth-grade teacher and now spends most of her time volunteering for 4-H, the Red Cross and her church. She has three children and three grandchildren.

From her porch, she hears a near-constant sound, like a plane flying overhead, from the turbines. On some mornings and evenings, when the sun is behind the turbines, she sees a flicker of shadows on the walls of her house.

She feels like an essential part of her life — the outdoors around her home — has been taken away from her.

Rex McDowell, 69, a neighbor of DeLong’s, describes the area as the “Red Light District.” By way of explanation, he walks to the backyard, where each of the turbines has a red light that goes on for a moment, then off, and then on again. The lights act in unison. Together, they are bright enough to cast a red glow for miles.

The lights are there to help prevent collisions with aircraft and are required by federal aviation rules.

DeLong is part of a local group of opponents of wind energy, called Citizens for Clear Skies. One of the organizers is Jeremy Kitson, 41, who lives in a nearby township and is a high school teacher.

“I love how (wind-energy supporters) think we’re getting paid off by the Koch brothers,” Kitson said, referring to the politically active family behind the Kansas-based conglomerate Koch Industries. “Citizens for Clear Skies is just a lot of regular folks kicking in 25 bucks at a time.”

At state and local hearings dealing with wind energy, he is often in the audience and ready to speak, a familiar face for officials.

While he seems to relish the debate, DeLong is much more hesitant. But that hasn’t prevented her from making her views known. She even made a trip to Columbus in June to testify before an Ohio Senate panel about wind-energy regulations.

“Many who are pro-wind will never live near a turbine,” she told lawmakers.

A few miles north and east of DeLong’s house is Wayne Trace High School, a rural school consolidated from three communities. Here, wind energy is a godsend, providing 11 percent of local tax revenue in the K-12 budget.

On a recent afternoon, one of the middle-school football teams practices on a field just west of the complex that houses the high school, middle school and district offices. The nearest wind farm is barely visible to the south.

“We just don’t have a lot of people seeking out the district for large industry,” said Ben Winans, the district’s superintendent. “The wind industry is one thing we can have.”

His district received $706,923 in taxes from wind turbines in the 2016-17 budget year. The wind money has allowed the district to increase staffing without raising taxes. The new hiring includes several reading specialists who work with students struggling to read at their grade level.

Winans grew up in the area and is an alumnus of the district he now runs. Though the wind turbines are barely visible from the school, he knows what it is like to be right next to them. His house, where he lives with his wife and children, has several turbines close enough to cast shadows on his walls.

He doesn’t mind.

“I’ve got one (turbine) on each end of my property,” he said. “I really don’t notice it anymore.”

Most wind-energy development has been in sparsely populated areas. But as demand grows, developers are moving closer to major cities.

The 255 active turbines in Ohio could be joined by 620 turbines at wind farms that regulators have approved, some of which are already under construction. In addition, there are at least a half-dozen other projects that are awaiting state approval, or are in a pre-application stage, based on filings and interviews with industry officials. The new wind farms represent more than $4 billion in spending.

With this pace of growth, researchers expect to see an increase in people who don’t like the turbines.

“It’s important to remember that this kind of opposition to any kind of infrastructure development is normal,” said Joseph Rand, a scholar who specializes in energy policy at Lawrence Berkeley National Laboratory in California.

He has taken a close look at how communities respond to renewable energy. Surveys show that a large majority of Americans support wind energy. At the same time, “people are inherently protective of place, of their landscape,” he said.

There are stereotypes at play. Wind supporters say the critics are uninformed or even delusional. Opponents say that wind supporters are motivated by money.

“Rarely do you see a nuanced perspective that has a fair story from both sides,” Rand said.

In his work, he hopes to determine the underlying motivations and find out how developers and communities can better respond.

State and local

One way to protect residents’ interests is through state and local regulations. Opponents of wind farms give the impression that the projects essentially receive a rubber stamp, with little consideration of the effects on residents and little skepticism of developers’ assertions. Also, local governments have a limited role in the process.

And yet, public records show an exhaustive and expensive review. For example, Blue Creek Wind Farm was approved by the Ohio Power Siting Board, with a multiyear review before the project went online in 2012. The board has broad authority on any utility-scale wind farm.

The board’s docket has 4,460 pages for Blue Creek, not counting two related cases, with extensive testing to estimate levels of noise, shadow flicker and other ramifications. The developer is a company that has since changed its name to Avangrid Renewables and has U.S. headquarters in Portland, Oregon.

According to filings, the developer projected that 39 houses would have shadow flicker of at least 30 hours per year. There is no legal standard for acceptable flicker, but 30 hours is often the guideline used by regulators.

Most of the residents of those houses have signed on to the project by either leasing property for the turbines, or by signing so-called “good neighbor” agreements.

With leases, property owners are signing a long-term contract to allow their land to be used to build a turbine and access roads. The payments vary, but are often in the range of $10,000 per turbine per year. The wind trade group says that annual lease payments in Ohio add up to more than $1 million, but less than $5 million.
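
Those two figures hang together arithmetically, as the quick check below shows; the $10,000-per-turbine number is the "often in the range of" estimate above, not an exact payment.

```python
# Consistency check of the lease-payment figures cited above.

turbines_in_ohio = 255              # utility-scale turbines now operating
typical_lease_per_turbine = 10_000  # dollars per turbine per year (typical)

estimated_statewide_total = turbines_in_ohio * typical_lease_per_turbine
print(f"Estimated statewide lease payments: "
      f"${estimated_statewide_total:,.0f} per year")
# About $2.6 million, inside the trade group's "more than $1 million, but
# less than $5 million" range.
```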

Another type of contract is a neighbor agreement, which is for people who are near wind farms but have no turbines on their land. The residents are saying they will not object to the project in exchange for annual payments that are often in the $1,000 range. Winans’ family, for example, has such an agreement that pays $1,000 per year.

Avangrid gave special attention to the fewer than 10 households that did not sign any agreement and had more than 30 hours of flicker. In some cases, the company agreed to reduce the effects by shutting off certain turbines at certain times.

DeLong’s property is one of the majority that experience fewer than 30 hours. She does not recall any contact with the developer about flicker, noise or anything else.

“Nobody came to my door,” she said. It was an inauspicious start to what has been a bad relationship.

Paul Copleman, an Avangrid spokesman, had this response:

“This is a project that has roughly 250 participating landowners and that reflects a lot of effort to talk to a lot of people in the community, at kitchen tables and in living rooms and in public meetings,” he said. “We think the onus is on us to develop responsibly and talk to people about the benefits we feel the project delivers to everybody in the community.”

The competing interests collide in an ongoing debate about how much distance should be required between the turbines and nearby property lines.

In 2014, Ohio Senate Republican leaders expanded the required distance by making a last-minute amendment to an unrelated budget bill. The provision was largely in response to citizen concerns about wind farms.

Wind-industry advocates warned that the result would be a virtual halt in development. However, projects that already were approved could go forward using the old rules, which has accounted for nearly all construction since then.

Supporters of wind energy, a mix of Republicans and Democrats, have repeatedly tried and failed to pass rules that are more wind-friendly and warned that investment might soon shift to other states. The current proposal is Senate Bill 188, sponsored by Sen. Cliff Hite, R-Findlay, whose district includes most of the wind farms. The bill would allow construction of wind turbines within about 600 feet of property lines, which is down from about 1,300 feet under the 2014 amendment.

Hite is confident the bill can pass the Senate. His problem is in the House, where Majority Leader Bill Seitz, R-Cincinnati, is one of the most outspoken critics of wind energy.

“If wind farms cannot be developed without borrowing or stealing their neighbors’ nonresidential property in order to satisfy the setback, health and safety requirements, then perhaps they should not be developed at all,” Seitz said in a 2016 letter.

For DeLong and her friends, Hite has become the face of the pro-wind crowd. She notes that there is no wind farm near his home.

“If he lived near them, it would be different,” she said.

Hite had this response: “I would put one in my backyard if I could.”

People will disagree about whether that would be a pleasant view. Meanwhile, the opposite view, from the top of the turbine, is breathtaking.

Dispatch photographer and videographer Doral Chenoweth III contributed to this story.

Majority of Americans now say climate change makes hurricanes more intense

By Emily Guskin and Brady Dennis September 28

A NASA satellite image shows Hurricane Irma slamming into the French Caribbean islands on Sept. 6. (NASA/GOES Project/AFP)

A majority of Americans say that global climate change contributed to the severity of recent hurricanes in Florida and Texas, according to a new Washington Post-ABC News poll. That marks a significant shift of opinion from a dozen years ago, when a majority of the public dismissed the role of global warming and said such severe weather events just happen from time to time.

In a 2005 Post-ABC poll, taken a month after Hurricane Katrina ravaged the Gulf Coast and devastated New Orleans, 39 percent of Americans said they believed climate change helped to fuel the intensity of hurricanes. Today, 55 percent believe that.

The shift may in part reflect scientists’ increasing confidence — and their increasing amount of data — in linking certain extreme weather events such as hurricanes to climate change. Many researchers have been unequivocal that while hotter oceans, rising seas and other factors are not the sole cause of any event, the warming climate is contributing to more intense storms and more frequent, more crippling storm surges and flooding.

“Harvey was not caused by climate change, yet its impacts — the storm surge and especially the extreme rainfall — very likely worsened due to human-caused global warming,” Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research said in a statement after the hurricane.

In a follow-up email to The Washington Post last month, Rahmstorf said that the explanation is just basic physics: The atmosphere holds more water vapor when it is warmer, setting the stage for more rain.

Yet for many Americans, the role of climate change has become as much about political beliefs as scientific findings.

Over the past decade, Democrats and independents accounted for most of the growing percentage of Americans who said climate change was a culprit in the intensity of hurricanes. While fewer than half of Democrats said climate change was a significant factor behind the intensity of hurricanes in 2005, that figure rose to 78 percent this month after hurricanes Harvey and Irma. Similarly, 42 percent of independents saw climate change as contributing to the severity of hurricanes in 2005 — a season in which hurricanes Rita and Wilma also hit the United States — while 56 percent now do.

By contrast, attitudes among Republicans have hardly changed over the same period. About 72 percent currently say that the severity of recent storms was merely the result of the kind of intense weather events that materialize periodically. In 2005, 70 percent had that view.

Younger adults are much more likely to attribute a role to climate change in the intensity of major storms, and their opinions appear to have shifted sharply over the past decade or so. The new poll finds that 67 percent of adults younger than 30 say climate change contributed to the recent hurricanes’ power, as do 64 percent of those in their 30s. Meanwhile, only about half of Americans ages 40 to 64 say that. The percentage is lower still for people older than 65.

Despite the damage their states suffered, the Post-ABC poll finds that residents in Texas and Florida appear more skeptical of linking climate change and a storm’s punch than Americans nationally. The poll included 134 respondents living in one of the two states, and just under half say the severity of the recent hurricanes was typical. Forty-four percent say their intensity was made worse by Earth’s changing climate.

The Washington Post-ABC News poll was conducted Sept. 18-21 among a random national sample of 1,002 adults reached on cellular and landline phones. The margin of sampling error for overall results is plus or minus 3.5 percentage points.
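
The reported sampling error is close to what the standard formula gives for samples of this size. The check below assumes a 95 percent confidence level and simple random sampling; the Post's published figure also reflects weighting and design effects.

```python
import math

# Rough check of the reported +/- 3.5-point sampling error, assuming a
# 95 percent confidence level and simple random sampling. The published
# figure also reflects weighting and design effects.

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

full_sample = 1002      # national sample
tx_fl_subsample = 134   # respondents living in Texas or Florida

print(f"Full sample:   +/- {margin_of_error(full_sample) * 100:.1f} points")
print(f"TX/FL subset:  +/- {margin_of_error(tx_fl_subsample) * 100:.1f} points")
# About 3.1 points for the full sample (3.5 once design effects are included)
# and about 8.5 points for the 134-person Texas/Florida subgroup, which is
# why the state-level results deserve extra caution.
```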

Emily Guskin is the polling analyst at The Washington Post, specializing in public opinion about politics, election campaigns and public policy. Follow @emgusk

Brady Dennis is a national reporter for The Washington Post, focusing on the environment and public health issues. Follow @brady_dennis

Power company kills nuclear plant, plans $6 billion in solar, battery investment

Duke Energy Florida is just the latest utility to walk away from nuclear.

On Tuesday, power provider Duke Energy Florida announced a settlement with the state’s public service commission (PSC) to cease plans to build a nuclear plant in western Florida. The utility instead intends to invest $6 billion in solar panels, grid-tied batteries, grid modernization projects, and electric vehicle charging areas. The new plan involves the installation of 700MW of solar capacity over four years in the western Florida area.

There's excitement from the solar industry, but the announcement is more bad news for the nuclear industry. Earlier this year, nuclear reactor company Westinghouse declared bankruptcy as construction of its new AP1000 reactors suffered from contractor issues and a stringent regulatory environment. Two plants whose construction was already underway—the Summer plant in South Carolina and the Vogtle plant in Georgia—found their futures in question immediately.

At the moment, Summer’s owners are considering abandoning the plant, and Vogtle’s owners are weighing whether they will do the same or attempt to salvage the project.

Duke Energy Florida hadn’t started building the Levy nuclear plant, but it did have plans to order two AP1000 reactors from Westinghouse. Now that Westinghouse is in dire financial straits, the Florida utility decided that its money is better spent elsewhere.

Just last week, Duke told its PSC that it would have to increase rates by more than eight percent due to increased fuel costs. But with the new settlement that directs the utility toward solar and storage, customers will see that rate hike cut to 4.6 percent.

The Levy nuclear plant was proposed in 2008 and ran into hurdles early on. With cheap natural gas in 2013, Duke Energy Florida became nervous that it might not recuperate costs spent on the nuclear plant, especially with regulatory delays. The company cancelled its engineering and construction agreements in 2013 but said that it was holding open the possibility of returning to Levy someday. Over nine years, about $800 million had been spent on preparatory work for the plant.

With Tuesday’s announcement, those costs are now sunk. But overall, the changes will save residential customers future nuclear-related rate increases. Those customers will see a cost reduction of $2.50 per megawatt-hour (MWh) “through the removal of unrecovered Levy Nuclear Project costs,” the utility said.

The 700MW of solar won’t exactly cover the nameplate capacity of the Levy plant, which was supposed to deliver 2.2 gigawatts to the region. But the Tampa Bay Times wrote that Duke “is effectively giving up its long-held belief that nuclear power is a key component to its Florida future and, instead, making a dramatic shift toward more solar power.”
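
For a rough sense of scale, and these are assumed capacity factors rather than figures from Duke or the Times, nameplate megawatts understate the gap because solar and nuclear plants run very different fractions of the year at full output:

# Back-of-envelope annual energy comparison (assumed capacity factors,
# not figures from Duke Energy or the Tampa Bay Times).
HOURS_PER_YEAR = 8760

solar_mw, solar_cf = 700, 0.25    # planned solar, assumed ~25% capacity factor
levy_mw, levy_cf = 2200, 0.90     # cancelled Levy plant, assumed ~90% capacity factor

solar_gwh = solar_mw * solar_cf * HOURS_PER_YEAR / 1000
levy_gwh = levy_mw * levy_cf * HOURS_PER_YEAR / 1000

print(f"Solar: ~{solar_gwh:,.0f} GWh/yr; Levy: ~{levy_gwh:,.0f} GWh/yr")
print(f"Solar would supply roughly {solar_gwh / levy_gwh:.0%} of Levy's expected output")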

Duke Energy Florida serves 1.8 million people, and Florida relies mostly on (currently cheap) natural gas. Duke said this week that it wants to raise solar to eight percent of its generating capacity in the next four years.

This isn’t the only planned nuclear plant to be cancelled in the last few days, either. The parent company of Duke Energy Florida—that is, Duke Energy—also pulled the plug on another planned nuclear plant in North Carolina last week, according to GreenTechMedia. There, the power company asked for a rate increase to cover more than $500 million in sunk costs related to the unbuilt Lee nuclear power plant. (It should be noted, though, that more than half of the rate increase was to cover costs related to cleaning up coal ash sites.)

While Duke said that its existing nuclear power plants were a “vital component” of the power mix, the company also said that “cancellation of the [Lee] project is the best option for customers.”

Experts don’t foresee a lot of nuclear investment in the next decade or so, according to UtilityDive. But experts are divided on whether focusing solely on renewable energy is a good thing. While opponents cite nuclear’s expense and safety risk, proponents say we need all kinds of greenhouse gas-free power online as quickly as possible.

"A Sea of Health and Environmental Hazards in Houston’s Floodwaters"

"Officials in Houston are just beginning to grapple with the health and environmental risks that lurk in the waters dumped by Hurricane Harvey, a stew of toxic chemicals, sewage, debris and waste that still floods much of the city.

Flooded sewers are stoking fears of cholera, typhoid and other infectious diseases. Runoff from the city’s sprawling petroleum and chemicals complex contains any number of hazardous compounds. Lead, arsenic and other toxic and carcinogenic elements may be leaching from some two dozen Superfund sites in the Houston area.

Porfirio Villarreal, a spokesman for the Houston Health Department, said the hazards of the water enveloping the city were self-evident.

“There’s no need to test it,” he said. “It’s contaminated. There’s millions of contaminants.”

Antidepressants found in fish brains in Great Lakes region

The drugs enter rivers and lakes from treatment plants and sewage overflows, threatening aquatic life, scientists say

University at Buffalo

BUFFALO, N.Y. -- Human antidepressants are building up in the brains of bass, walleye and several other fish common to the Great Lakes region, scientists say.

In a new study, researchers detected high concentrations of these drugs and their metabolized remnants in the brain tissue of 10 fish species found in the Niagara River.

This vital conduit connects two of the Great Lakes, Lake Erie and Lake Ontario, via Niagara Falls. The discovery of antidepressants in aquatic life in the river raises serious environmental concerns, says lead scientist Diana Aga, PhD, the Henry M. Woodburn Professor of chemistry in the University at Buffalo College of Arts and Sciences.

"These active ingredients from antidepressants, which are coming out from wastewater treatment plants, are accumulating in fish brains," Aga says. "It is a threat to biodiversity, and we should be very concerned.

"These drugs could affect fish behavior. We didn't look at behavior in our study, but other research teams have shown that antidepressants can affect the feeding behavior of fish or their survival instincts. Some fish won't acknowledge the presence of predators as much."

If changes like these occur in the wild, they have the potential to disrupt the delicate balance between species that helps to keep the ecosystem stable, says study co-author Randolph Singh, PhD, a recent UB graduate from Aga's lab.

"The levels of antidepressants found do not pose a danger to humans who eat the fish, especially in the U.S., where most people do not eat organs like the brain," Singh says. "However, the risk that the drugs pose to biodiversity is real, and scientists are just beginning to understand what the consequences might be."

The research team included other scientists from UB, Ramkhamhaeng University and Khon Kaen University, both in Thailand, and SUNY Buffalo State. The study was published on Aug. 16 in the journal Environmental Science and Technology.

A dangerous cocktail of antidepressants in the water

Aga has spent her career developing techniques for detecting contaminants such as pharmaceuticals, antibiotics and endocrine disrupters in the environment.

This is a field of growing concern, especially as the use of such chemicals expands. The percentage of Americans taking antidepressants, for instance, rose 65 percent between 1999-2002 and 2011-14, according to the National Center for Health Statistics.

Wastewater treatment facilities have failed to keep pace with this growth, typically ignoring these drugs, which are then released into the environment, Aga says.

Her new study looked for a variety of pharmaceutical and personal care product chemicals in the organs and muscles of 10 fish species: smallmouth bass, largemouth bass, rudd, rock bass, white bass, white perch, walleye, bowfin, steelhead and yellow perch.

Antidepressants stood out as a major problem: These drugs or their metabolites were found in the brains of every fish species the scientists studied.

The highest concentration of a single compound was found in a rock bass, which had about 400 nanograms of norsertraline -- a metabolite of sertraline, the active ingredient in Zoloft -- per gram of brain tissue. This was in addition to a cocktail of other compounds found in the same fish, including citalopram, the active ingredient in Celexa, and norfluoxetine, a metabolite of the active ingredient in Prozac and Sarafem.

More than half of the fish brain samples had norsertraline levels of 100 nanograms per gram or higher. In addition, like the rock bass, many of the fish had a medley of antidepressant drugs and metabolites in their brains.

Evidence that antidepressants can change fish behavior generally comes from laboratory studies that expose the animals to higher concentrations of drugs than what is found in the Niagara River. But the findings of the new study are still worrisome: The antidepressants that Aga's team detected in fish brains had accumulated over time, often reaching concentrations that were several times higher than the levels in the river.

In the brains of smallmouth bass, largemouth bass, rock bass, white bass and walleye, sertraline was found at levels that were estimated to be 20 or more times higher than levels in river water. Levels of norsertraline, the drug's breakdown product, were even greater, reaching concentrations that were often hundreds of times higher than that found in the river.
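
One way to read the "hundreds of times higher than the river" comparison is as a simple ratio of tissue concentration to water concentration on a common basis. The sketch below uses placeholder concentrations for illustration only; they are not measurements from the UB study:

# Illustrative ratio behind "hundreds of times higher than the river".
# Both concentrations are placeholders, NOT measurements from the UB study,
# and are expressed on the same basis (ng per gram of brain tissue vs.
# ng per gram, i.e. roughly ng per milliliter, of river water).
norsertraline_in_brain = 100.0   # ng/g; the study reports 100 ng/g or more in many fish
norsertraline_in_water = 0.3     # ng/g; hypothetical river-water concentration

ratio = norsertraline_in_brain / norsertraline_in_water
print(f"Brain concentration is ~{ratio:.0f} times the river concentration")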

Scientists have not done enough research yet to understand what amount of antidepressants poses a risk to animals, or how multiple drugs might interact synergistically to influence behavior, Aga says.

Wastewater treatment is behind the times

The study raises concerns regarding wastewater treatment plants, whose operations have not kept up with the times, says Aga, a member of the UB RENEW (Research and Education in eNergy, Environment and Water) Institute.

In general, wastewater treatment focuses narrowly on killing disease-causing bacteria and on extracting solid matter such as human excrement. Antidepressants, which are found in the urine of people who use the drugs, are largely ignored, along with other chemicals of concern that have become commonplace, Aga says.

"These plants are focused on removing nitrogen, phosphorus, and dissolved organic carbon but there are so many other chemicals that are not prioritized that impact our environment," she says. "As a result, wildlife is exposed to all of these chemicals. Fish are receiving this cocktail of drugs 24 hours a day, and we are now finding these drugs in their brains."

The problem is exacerbated, Singh says, by sewage overflows that funnel large quantities of untreated water into rivers and lakes. In August, for example, The Buffalo News reported that since May of 2017, a half billion gallons of combined sewage and storm water had flowed into local waterways, including the Niagara River.

New study: Shifting school start times could contribute $83 billion to US economy within a decade

Even after just two years, the study projects a national economic gain of $8.6 billion, which would already outweigh the costs per student from delaying school start times to 8:30 a.m.

RAND Corporation

The RAND Corporation and RAND Europe have released the first-ever, state-by-state analysis (in 47 states) of the economic implications of a shift in school start times in the U.S., showing that a nationwide move to 8:30 a.m. could contribute $83 billion to the U.S. economy within a decade.

Even after just two years, the study projects a national economic gain of $8.6 billion, which would already outweigh the costs per student from delaying school start times to 8:30 a.m. The costs per student are largely due to transportation, such as rescheduling bus routes and times, which would be affected by the school start time change.

The study used a novel macroeconomic model to project economic gains to the U.S. economy over 20 years from 2017, reaching around $140 billion after 15 years. On average, that corresponds to an annual gain of about $9.3 billion, which is roughly the annual revenue of Major League Baseball.

The economic gains projected through the study's model would be realised through the higher academic and professional performance of students, and reduced car crash rates among adolescents.

Previous estimates suggest that one additional hour of sleep increases, on average, the probability of high school graduation by 13.3 percent and the college attendance rate by 9.6 percent. These positive effects shape the jobs that adolescents are able to obtain in the future and, in turn, have a direct effect on how much a particular person contributes toward the economy in future financial earnings.

Data on car crash fatalities reveal that around 20 percent involve a driver impaired by sleepiness, drowsiness or fatigue. Car crashes and young adults dying prematurely also reduce the future labor supply of an economy.

From a health perspective, the American Academy of Pediatrics and the American Medical Association recommend that middle and high schools start no earlier than 8:30 a.m. to accommodate the known biological shift in adolescent sleep-wake schedules. However, a previous study from the federal Centers for Disease Control and Prevention estimated that 82 percent of middle and high schools in the U.S. start before 8:30 a.m., with an average start time of 8:03 a.m.

School start times are increasingly becoming a hotly debated topic in school districts across the U.S., while the California legislature is considering a proposal to mandate that start times for middle and high schools be no earlier than 8:30 a.m.

Wendy Troxel, senior behavioral and social scientist at the RAND Corporation, says, "For years we've talked about inadequate sleep among teenagers being a public health epidemic, but the economic implications are just as significant. From a policy perspective, the potential implications of the study are hugely important. The significant economic benefits from simply delaying school start times to 8.30 a.m. would be felt in a matter of years, making this a win-win, both in terms of benefiting the public health of adolescents and doing so in a cost-effective manner."

Marco Hafner, a senior economist at RAND Europe, the European affiliate of the RAND Corporation, says, "A small change could result in big economic benefits over a short period of time for the U.S. In fact, the level of benefit and period of time it would take to recoup the costs from the policy change is unprecedented in economic terms."

Even though it is recommended that teenagers get an average of 8 to 10 hours of sleep each night, up to 60 percent of middle school and high school students report weeknight sleep duration of less than seven hours.

Previous evidence has shown that a lack of sleep among adolescents is associated with numerous adverse outcomes, including poor physical and mental health, behavioural problems, suicidal thoughts and attempts, and attention and concentration problems.

"Throughout the cost-benefit projections, we have taken a conservative approach when establishing the economic gains," Hafner said. "We have not included other effects from insufficient sleep, such as higher suicide rates, increased obesity and mental health issues, which are all difficult to quantify precisely. Therefore, it is likely that the reported economic and health benefits from delaying school start times could be even higher across many U.S. states."

The study follows a previous piece of research from RAND Europe in November 2016 that showed that the U.S. sustains economic losses of up to $411 billion a year (2.28 percent of its GDP) due to insufficient sleep among its workforce.

Climate action window could close as early as 2023

University of Michigan

ANN ARBOR--As the Trump administration repeals the U.S. Clean Power Plan, a new study from the University of Michigan underscores the urgency of reducing greenhouse gas emissions--from both environmental and economic perspectives.

For the U.S.'s most energy-hungry sectors--automotive and electricity--the study identifies timetables for action, after which the researchers say it will be too late to stave off a climate tipping point.

And the longer the nation waits, the more expensive it will be to move to cleaner technologies in those sectors--a finding that runs contrary to conventional economic thought because prices of solar, wind and battery technologies are rapidly falling, they say.

Steps outlined in the Clean Power Plan, as well as in the 2016 Paris climate accord, would not have been enough to meet the goal of keeping global temperature increase to 2 degrees Celsius by the end of this century, the study shows.

To achieve the 70-percent reduction target for carbon dioxide emissions used in the study, additional steps would be needed--and before 2023. The window for effective action could close that early.

"If we do not act to reduce greenhouse gas emissions forcefully prior to the 2020 election, costs ?to reduce emissions at a magnitude and timing consistent with averting dangerous human interference with the climate will skyrocket," said Steven Skerlos, U-M professor of mechanical engineering. "That will only make the inevitable shift to renewable energy less effective in maintaining a stable climate system throughout the lives of children already born."

Before Trump's reversal of both the domestic and international climate plans, the Intergovernmental Panel on Climate Change had recommended a 70-percent cut in carbon dioxide emissions from industrialized nations such as the U.S., where nearly half of emissions come from the electric and automotive sectors.

Using a custom, state-of-the-art model of these sectors, the researchers showed that the window for initiating additional climate action would close between 2023 and 2025 for the automotive sector and between 2023 and 2026 for the electric sector.

"That's true under even the most optimistic assumptions for clean technology advancements in vehicles and power plants," said study lead author Sarang Supekar, a mechanical engineering postdoctoral fellow at U-M.

Withdrawal from the accord and the EPA's plan to repeal the Clean Power Plan will only make the chances of achieving the goal more remote, the researchers say.

"In the absence of a government mandate, and if there is encouragement for coal to come back, then there's no way we can meet the target," Supekar said.

To arrive at their findings, Supekar and Skerlos calculated the future greenhouse gas contributions of the auto and power industries based on two approaches going forward--"business as usual" and "climate action." Their calculations relied on the lowest-cost technologies in each sector.

In the "business as usual" scenario, the auto industry followed its current rate of vehicle diversification--utilizing efficient internal combustion, electric and hybrid models, and the power sector utilized mostly natural gas and renewable plants. In the "climate action" scenario, those sectors relied on a greater percentage of cleaner automotive and power technologies to meet the IPCC climate goals.

"At some point, likely by 2023, you actually can't build the newer, cleaner power plants fast enough or sell enough fuel-efficient cars fast enough to be able to achieve the 70-percent target," Skerlos said.

Added Supekar, "The year-on-year emission reduction rate in such dramatic technology turnovers will exceed 5 percent after about 2020, which makes the 70-percent target infeasible for all practical purposes."

The analysis found no evidence to justify delaying climate action in the name of reducing technological costs, even under the most optimistic trajectories for improvements in fuel efficiency, demand, and technology costs in the U.S. auto and electric sectors. In fact, the study found that waiting another four years to initiate measures on track with the 70 percent target would take the total cost for both sectors from about $38 billion a year to $65 billion a year.

"You could take this same model or a different model and arrive at different cost numbers using a your own set of assumptions for "business as usual" or interests rates, for instance," Supekar said. "But the point is, regardless of whether the cost of climate action today is $38 billion or $100 billion, this cost will rise sharply in three to four years from now."

The IPCC has determined that in order to keep Earth's average temperature from rising more than 2 degrees Celsius above pre-industrial times by the end of the century, global greenhouse gas emissions must be reduced between 40 percent and 70 percent by 2050. The U.S. is the largest cumulative emitter of greenhouse gases, and the electric and auto industries account for nearly half of the country's annual output. Fossil fuel combustion accounts for 95 percent of those industries' emissions.

The study, "Analysis of Costs and Time Frame for Reducing CO2 Emissions by 70% in the U.S. Auto and Energy Sectors by 2050," is published in Environmental Science and Technology. It was funded by the U-M Energy Institute and the National Science Foundation.

Baltic clams and worms release as much greenhouse gas as 20,000 dairy cows

New study shows that sea sediments containing worms and clams release up to eight times more methane into the atmosphere than sediments without them

Cardiff University

Scientists have shown that ocean clams and worms are releasing a significant amount of potentially harmful greenhouse gas into the atmosphere.

The team, from Cardiff University and Stockholm University, have shown that the ocean critters are producing large amounts of two potent greenhouse gases - methane and nitrous oxide - from the bacteria in their guts.

Methane gas is making its way into the water and then finally out into the atmosphere, contributing to global warming - methane has 28 times greater warming potential than carbon dioxide.

A detailed analysis showed that around 10 per cent of total methane emissions from the Baltic Sea may be due to clams and worms.

The researchers estimate that this is equivalent to the methane given off by 20,000 dairy cows - about 10 per cent of the entire Welsh dairy cow population and 1 per cent of the entire UK dairy cow population.
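
The cow comparison is essentially a unit conversion. The sketch below shows the shape of that calculation using an assumed per-cow emission rate and the 28-times warming potential cited above; the per-cow figure is a round-number assumption, not a value from the study:

# Shape of the "as much methane as 20,000 dairy cows" comparison.
# The per-cow emission rate is an assumed round number, not a value from the study.
CH4_PER_COW_KG_PER_YEAR = 120   # assumed enteric methane per dairy cow
GWP_CH4 = 28                    # 100-year warming potential cited above

cows = 20_000
methane_tonnes = cows * CH4_PER_COW_KG_PER_YEAR / 1000   # tonnes CH4 per year
co2e_tonnes = methane_tonnes * GWP_CH4                   # tonnes CO2-equivalent per year

print(f"~{methane_tonnes:,.0f} t CH4/yr, or ~{co2e_tonnes:,.0f} t CO2e/yr")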

The findings, which have been published in the journal Scientific Reports, point to a so far neglected source of greenhouse gases in the sea and could have a profound impact on decision makers.

It has been suggested that farming oysters, mussels and clams could be an effective solution against human pressures on the environment, such as eutrophication caused by the run-off of fertilisers into our waters.

The authors warn that stakeholders should consider these potential impacts before deciding whether to promote shellfish farming to large areas of the ocean.

Co-author of the study Dr Ernest Chi Fru, from Cardiff University's School of Earth and Ocean Sciences, said: "What is puzzling is that the Baltic Sea makes up only about 0.1% of Earth's oceans, implying that globally, apparently harmless bivalve animals at the bottom of the world's oceans may in fact be contributing ridiculous amounts of greenhouse gases to the atmosphere that is unaccounted for."

Lead author of the study Dr Stefano Bonaglia, from Stockholm University, said: "It sounds funny but small animals in the seafloor may act like cows in a stable, both groups being important contributors of methane due to the bacteria in their gut.

"These small yet very abundant animals may play an important, but so far neglected, role in regulating the emissions of greenhouse gases in the sea."

To arrive at their results the team analysed trace gas, isotopes and molecules from the worms and clams, known as polychaetes and bivalves respectively, taken from ocean sediments in the Baltic Sea.

The team analysed both the direct and indirect contribution that these groups were having on methane and nitrous oxide production in the sea. The results showed that sediments containing clams and worms increased methane production by a factor of eight compared to completely bare sediments.

Winter cold extremes linked to high-altitude polar vortex weakening

by Staff Writers

Potsdam, Germany (SPX) Sep 25, 2017

"Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions." The effect is stronger over Asia and Europe than over the US.

When the strong winds that circle the Arctic slacken, cold polar air can escape and cause extreme winter chills in parts of the Northern hemisphere. A new study finds that these weak states have become more persistent over the past four decades and can be linked to cold winters in Russia and Europe.

It is the first study to show that changes in winds high up in the stratosphere substantially contributed to the observed winter cooling trend in northern Eurasia. While it is still a subject of research how the Arctic under climate change impacts the rest of the world, this study lends further support to the idea that a changing Arctic impacts the weather across large swaths of the Northern Hemisphere's population centers.

"In winter, the freezing Arctic air is normally 'locked' by strong circumpolar winds several tens of kilometers high in the atmosphere, known as the stratospheric polar vortex, so that the cold air is confined near the pole," says Marlene Kretschmer from PIK, lead-author of the study to be published in the Bulletin of the American Meteorological Society.

"We found that there's a shift towards more-persistent weak states of the polar vortex. This allows frigid air to break out of the Arctic and threaten Russia and Europe with cold extremes. In fact this can explain most of the observed cooling of Eurasian winters since 1990."

Warm Arctic, cold continents

Despite global warming, recent winters in the Northeastern US, Europe and especially Asia were anomalously cold - some regions like Western Siberia even show a downward temperature trend in winter. In stark contrast, the Arctic has been warming rapidly. Paradoxically, both phenomena are likely linked: when sea ice north of Scandinavia and Russia melts, the uncovered ocean releases more warmth into the atmosphere, and this can disturb the stratospheric polar vortex up to about 30 kilometers above the surface.

Weak states of the high-altitude winds circling the Arctic then favor the occurrence of cold spells in the mid-latitudes. Previous work by Kretschmer and colleagues identified this causal pathway in observational data, and it is further supported by several climate computer simulation studies.

"Our latest findings not only confirm the link between a weak polar vortex and severe winter weather, but also calculated how much of the observed cooling in regions like Russia and Scandinavia is linked to the weakening vortex. It turns out to be most," says co-author Judah Cohen from Atmospheric and Environmental Research/Massachusetts Institute of Technology (US).

"Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions." The effect is stronger over Asia and Europe than over the US.

Circulation patterns drive our weather"

"It is very important to understand how global warming affects circulation patterns in the atmosphere," says co-author Dim Coumou from Vrije Universiteit Amsterdam, Netherlands. "Jet Stream changes can lead to more abrupt and surprising disturbances to which society has to adapt. The uncertainties are quite large, but global warming provides a clear risk given its potential to disturb circulation patterns driving our weather - including potentially disastrous extremes."

Students, researchers turn algae into renewable flip-flops

by Brooks Hays

Washington (UPI) Oct 5, 2017

A team of researchers and students at the University of California, San Diego is trying to curb the number of petroleum-based flip flops -- currently 3 billion every year -- that end up in landfills.

Their solution is the world's first algae-based, eco-friendly pair of flip flops.

"Even though a flip flop seems like a minor product, a throwaway that everyone wears, it turns out that this is the number one shoe in the world," Stephen Mayfield, a professor of biology at UCSD, said in a news release.

In India, China and Africa, the flip flop is the most popular shoe.

"These are the shoes of a fisherman and a farmer," Mayfield said.

Flip flops are responsible for a large amount of the polyurethane that ends up in the ocean. But researchers originally set their sights on the pollution caused by surfboards.

Two years ago, Mayfield and Skip Pomeroy, a professor of chemistry and biochemistry, worked with their students to design an algae-based, biodegradable surfboard. They teamed up with a local surfboard blanks manufacturer, Arctic Foam of Oceanside, to produce the board.

Soon, Mayfield and Pomeroy realized their technology could be used to replace other petroleum-based, polyurethane products.

"The algae surfboard was the first obvious product to make, but when you really look at the numbers you realize that making a flip-flop or shoe sole like this is much more important," says Mayfield. "Depending on how you do the chemistry, you can make hard foams or soft foams from algae oil. You can make algae-based, renewable surfboards, flip-flops, polyurethane athletic shoes, car seats or even tires for your car."

Petroleum is formed from ancient plant material, much of it algae, and the same chemical building blocks are present in living algae. By making polyurethane from algae oil rather than from fossil petroleum, the researchers are pulling carbon from the atmosphere instead of taking it out of reserves in the ground. This is what makes their algae-based polyurethane more sustainable.

Researchers are also working to make the carbon bonds in their polyurethane more biodegradable, so that microorganisms can more easily break them down.

In an effort to take their research to market, researchers have spun their work off into a new company called Algenesis Materials. Their first product is the Triton flip flop. The researchers hope to continue perfecting the chemistry behind the Triton flip flop and use it to make more eco-friendly shoe soles, car seats and more.

"The idea we're pursuing is to make these flip-flops in a way that they can be thrown into a compost pile and they will be eaten by microorganisms," said Mayfield.

Algae-based surfboards have already become popular in the surfing industry, and researchers hope their latest efforts will have a similar impact on the shoe industry.

"It's going to be a little while before you can buy one of these flip-flops in the store, but not too long," says Mayfield. "Our plan is that in the next year, you'll be able to go into the store and buy an Algenesis flip-flop that is sustainable, biodegradable and that was invented by students at UC San Diego."

Cleanup bill for toxic firefighting chemicals at military bases could reach $2 billion

UPDATED: Wed., Sept. 6, 2017, 11:07 p.m.

The Airway Heights Public Works Department flushes potentially contaminated water from a fire hydrant into Aspen Grove Park in Airway Heights in May. The city worked to remove contaminants from its water pipes after chemicals used for fire suppression at Fairchild Air Force Base entered the water supply. A U.S. Senate letter said it may cost $2 billion to clean up the water pollution at more than 400 military bases across the country. (Colin Mulvany / The Spokesman-Review)

It may cost up to $2 billion to clean up toxic firefighting chemicals that have leaked from more than 400 U.S. military installations, including Fairchild Air Force Base, a group of Democratic senators said Tuesday in a letter to the Senate Appropriations Committee.

The senators, including Patty Murray and Maria Cantwell of Washington, attributed that cost estimate to U.S. Department of Defense officials.

The senators requested a study of the chemicals known as PFOS and PFOA, which were key ingredients in a foam that was used for decades to douse aircraft fires at military bases and civilian airports.

In recent months, it was revealed that the chemicals contaminated groundwater at many of those sites. Dozens of residents who live near Fairchild have learned that the chemical concentrations in their drinking water exceed recommendations from the U.S. Environmental Protection Agency.

Other senators who signed the letter include Michael Bennet of Colorado, Jeanne Shaheen and Maggie Hassan of New Hampshire, Kirsten Gillibrand of New York and Bob Casey of Pennsylvania.

They asked that funds be included in the 2018 budget for the Centers for Disease Control, the EPA and the Department of Defense to study the spread of the chemicals, the health effects and viable alternatives for the toxic firefighting foam.

The chemicals have been linked to cancer, thyroid problems and immune system disorders, although scientists aren’t sure exactly how they interact in the human body.

“Residents of our states are concerned about exposure, and what this means for their health and safety,” the senators wrote. “We urge the Committee to include language directing DOD and the military services to budget robustly for assessment, investigation, and remediation activities in the upcoming fiscal years.”

Chicago Just Repealed Its Brand New Soda Tax

October 12, 2017 By Elaine S. Povich

Matt Gill, manager of Tischler Finer Foods in Brookfield, Illinois, says the soda tax has cut into his business. The Cook County Board repealed the tax Wednesday.

The Cook County (Chicago) Board of Commissioners voted Wednesday to repeal the controversial penny-an-ounce tax on sweetened beverages, following a pitched advertising battle between advocates who cited public health concerns and opponents who said the tax was too burdensome on business.

The repeal, on a voice vote, was taken just two months after the tax went into effect Aug. 2. The repeal goes into effect Dec. 1. Cook County Board President Toni Preckwinkle, a supporter of the tax, said the county will now have to plug a $200 million annual hole in the budget – without the money the tax was expected to raise.

The American Beverage Association and local store owners pushed for repeal with millions of dollars in television advertising. They were countered by an equally expensive campaign by billionaire public health advocate and former New York Mayor Michael Bloomberg, who in 2012 pushed through a municipal cap on jumbo sugary drinks only to have it overturned by a state court in 2014.

Cook County’s complicated tax on sweetened beverages covered any nonalcoholic beverage that was sweetened, whether artificially or with sugar, and whether it came in bottles and cans or from a fountain. The tax applied to regular and diet soda, sports and energy drinks, and juice products that weren’t 100 percent fruit or vegetable juice. It also applied to ready-to-drink sweetened coffees and teas — as long as they were less than half milk. It did not apply to flavored bubbly waters without sugar but did cover sparkling waters that were sweetened with sugar.

"AEP to Spend $4.5 Billion on the Largest Wind Farm in the U.S."

Growth-Starved Utilities Have Found a New Way to Make Money

By Chris Martin, Jim Polson, and Mark Chediak

July 26, 2017, 5:10 PM EDT (updated July 27, 2017, 10:55 AM EDT)

AEP seeking permission to invest $4.5 billion in wind power

Utilities now looking to own renewable energy assets outright

There’s a new model emerging for growth-starved utilities looking to profit from America’s solar and wind power boom.

American Electric Power Co. is using it for a $4.5 billion deal that’ll land the U.S. utility owner a massive wind farm in Oklahoma and a high-voltage transmission line to deliver the power. NextEra Energy Inc.’s Florida unit is using it to build solar farms. And in April, the chief executive officer of Xcel Energy Inc. said he’d use it to help add 3.4 gigawatts of new wind energy over the next five years.

Here’s how it works: Some utilities that for years contracted to buy electricity from wind and solar farm owners are now shifting away from these so-called power purchase agreements, or PPAs. They’re instead seeking approval from state regulators to buy the assets outright and recover the costs from customers through rates. While the takeovers are being branded as a cheaper way of securing power, saving ratepayers millions in the end, they also guarantee profits for utilities.

“We keep wondering why utilities are always signing PPAs that pass the cost through to customers,” said Amy Grace, an analyst at Bloomberg New Energy Finance. “If you put it in your rate base, you can get a guaranteed return on it. There’s a big upside to ownership.”

With the cost of building solar and wind farms sliding and electricity demand weakening, owning renewables is a more attractive proposition than ever for utilities.

“The price of wind has come down enough that it’s going to be competitive with anything else you’re probably going to propose to build out there,” Kit Konolige, a New York-based utility analyst for Bloomberg Intelligence, said by phone Wednesday.

Biggest Hurdle

The catch, of course, is regulatory buy-in. AEP will need approval from Arkansas, Louisiana, Oklahoma, Texas and federal regulators to purchase the 2,000-megawatt Wind Catcher farm from developer Invenergy LLC, build a 350-mile (563-kilometer) transmission line and bake the costs into customer rates. The company expects to file these plans with regulators on July 31.

American Electric wants regulatory approvals by April, Chief Executive Officer Nicholas Akins said on the company’s second-quarter earnings call Thursday. Central to the project are $2.7 billion of U.S. tax credits for wind production. No rate increase will be needed, he said.

American Electric shares rose as much as 1.9 percent to $70.80 in intraday trading in New York, the highest in almost a month, and were at $70.32 at 10:52 a.m.

Benjamin Fowke, chief executive officer of Minneapolis-based Xcel Energy, said in April that the company would seek approval to add 3.4 gigawatts of new wind energy and bill customers for as much as 80 percent of that investment through rates. He said at the time that wind energy is now the cheapest new form of generation.

NextEra has already gotten Florida’s go-ahead to recover the costs of the solar farms its Florida utility is building through customer rates.

Should AEP gain similar approvals, the utility will be embarking on its most expensive renewable energy project yet, underscoring a dramatic shift in America’s power mix. Once the largest consumer of coal in the U.S., AEP is now shuttering money-losing plants burning the fuel and diversifying its resources along with the rest of the utility sector.

The Wind Catcher farm in Oklahoma is set to become the largest wind farm in the nation and the second-biggest in the world, according to Invenergy.

AEP estimated that the low cost of wind power will save customers $7 billion over 25 years. Construction on the farm began in 2016, and the plant is scheduled to go into service in mid-2020, Invenergy said by email.

Plastic Garbage Patch Bigger Than Mexico Found in Pacific

Yet another floating mass of microscopic plastic has been discovered in the ocean, and it is mind-blowingly vast.

By Shaena Montanari

PUBLISHED JULY 25, 2017

Water, water, everywhere—and most of it is filled with plastic.

A new discovery of a massive amount of plastic floating in the South Pacific is yet another piece of bad news in the fight against ocean plastic pollution. This patch was recently discovered by Captain Charles Moore, founder of the Algalita Research Foundation, a non-profit group dedicated to solving the issue of marine plastic pollution.

Moore, who was the first one to discover the famed North Pacific garbage patch in 1997, estimates this zone of plastic pollution could be upwards of a million square miles in size. (Read: A Whopping 91% of Plastic Isn’t Recycled.)

The team is currently processing the data and weighing the plastic so they can get a handle on exactly how much garbage they’ve discovered in this area off the coast of Chile and Peru.

The term “patch” referring to the plastic pollution in oceanic gyres can be misleading. The pieces of plastic are not necessarily floating bottles, bags, and buoys, but teeny-tiny pieces of plastic resembling confetti, making them almost impossible to clean up.

These microplastic particles may not be visible floating on the surface, but in this case, they were detected after collecting water samples on Moore’s recent six-month expedition to the remote area that had only been explored for plastic once before. (See a map of plastic in the ocean.)

On the first transect of the South Pacific gyre in 2011, Marcus Eriksen, marine plastic expert and research director at the 5 Gyres Institute, did not spot much plastic. In only six years, according to the new data collected by Moore, things have changed drastically.

Henderson Island, located in this South Pacific region, was recently crowned the most plastic-polluted island on Earth, as researchers discovered it is covered in roughly 38 million pieces of trash.

The problem of plastic pollution is becoming ubiquitous in the oceans, with 90 percent of sea birds consuming it and over eight million tons of new plastic trash finding its way into the oceans every year.

50% Rise in Renewable Energy Needed to Meet Ambitious State Standards

States are powerful forces for renewable electricity. About half the growth since 2000 has come from their Renewable Portfolio Standards, a national lab finds.

BY PHIL MCKENNA

JUL 27, 2017

Hawaii, California and New York have some of the most ambitious renewable portfolio standards in the United States. Hawaii is aiming for 100 percent of electricity from renewable sources by 2045. Credit: David McNew/Getty

Renewable electricity generation will have to increase by 50 percent by 2030 to meet ambitious state requirements for wind, solar and other sources of renewable power, according to a new report from Lawrence Berkeley National Laboratory.

The report looked at Renewable Portfolio Standards (RPSs)—commitments set by states to increase their percentage of electricity generated from sources of renewable energy, typically not including large-scale hydropower. Twenty-nine states and Washington, D.C., currently have such standards, covering 56 percent of all retail electricity sales in the country.

"I think that the industry is quite capable of meeting that objective cost-competitively and, actually, then some," said Todd Foley, senior vice president of policy and government affairs at the American Council on Renewable Energy.

What's Needed to Meet States' Renewable Power Goals?

Seven states—Maryland, Michigan, New York, Rhode Island, Massachusetts, Illinois and Oregon—as well as Washington, D.C., have increased their RPS requirements for new wind and solar projects since the start of 2016. No states weakened their RPS policies during this time. Some of the most ambitious requirements are in California and New York, which require 50 percent of electricity to come from renewable sources by 2030, and Hawaii, which requires 100 percent from renewables by 2045.

RPS policies have driven roughly half of all growth in U.S. renewable electricity generation and capacity since 2000 to its current level of 10 percent of all electricity sales, the national lab's report shows. In parts of the country, the mandates have had an even larger effect—they accounted for 70-90 percent of new renewable electricity capacity additions in the West, Mid-Atlantic and Northeast regions in 2016.
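
Those headline shares can be combined with simple arithmetic. The sketch below only restates the report's published figures and assumes flat total electricity sales; it is not an independent estimate:

# Arithmetic restatement of the report's published shares (assumes flat total sales).
current_re_share = 0.10      # non-hydro renewables' share of U.S. electricity sales
rps_driven_fraction = 0.5    # roughly half of growth since 2000 attributed to RPS policies
required_growth = 0.5        # ~50% more renewable generation needed by 2030

rps_attributable_share = current_re_share * rps_driven_fraction
implied_2030_share = current_re_share * (1 + required_growth)

print(f"RPS-attributable share today: ~{rps_attributable_share:.0%} of sales")
print(f"Implied renewable share by 2030: ~{implied_2030_share:.0%} of sales")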

"They have been hugely important over the years to help diversify our power mix and send a signal to investors and developers alike to put their resources in the deployment of renewable energy," Foley said.

Nationally, however, the role of RPS policies in driving renewable energy development is beginning to decrease as corporate contracts from companies that have committed to getting 100 percent of their electricity from renewables, and lower costs of wind and solar, play an increasing role.

From 2008 to 2014, RPS policies drove 60-70 percent of renewable energy capacity growth in the U.S., according to the report. In 2016, the impact dropped to just 44 percent of added renewable energy capacity.

29 States and D.C. Have Set Renewable Portfolio Standards

The increasing role market forces are playing in driving renewable energy generation is seen in a number of states with no RPS policies.

In Kansas, for example, wind energy provided 24 percent of net electricity generation in 2015, up from less than 1 percent in 2005, according to the U.S. Energy Information Administration. Similarly, wind power provides roughly one quarter of net electricity generation in Oklahoma and South Dakota, states that also lack RPS policies. Some of the generation in each of these states may be serving RPS demand in other states, or, in the case of Kansas, may be partly a result of an RPS that was repealed in 2015, lead author Galen Barbose said.

With some states considering further increases in their renewable energy standards, the policies are likely to continue to play a significant role in renewable energy development, Foley said.

"They have been very important," he said, "and I think they'll continue to be."

ABOUT THE AUTHOR

Phil McKenna is a Boston-based reporter for InsideClimate News. Before joining ICN in 2016, he was a freelance writer covering energy and the environment for publications including The New York Times, Smithsonian, Audubon and WIRED. Uprising, a story he wrote about gas leaks under U.S. cities, won the AAAS Kavli Science Journalism Award and the 2014 NASW Science in Society Award. Phil has a master's degree in science writing from the Massachusetts Institute of Technology and was an Environmental Journalism Fellow at Middlebury College.

Don’t just fear the power-grid hack. Fear how little the U.S. knows about it

By Tim Johnson

July 27, 2017 4:16 PM

LAS VEGAS

The robust and aggressive takedown of part of Ukraine’s power grid by hackers served as a wakeup call for cyber experts and exposed just how much America does not know about foreign operatives’ ability to strike critical U.S. infrastructure.

Electrical grids are on the minds of those gathered at Black Hat, the world’s biggest hacker convention that entered its final day Thursday. The confab draws 16,000 hackers and information technology experts from around the globe.

Worries about infrastructure have grown since presumed Russian government-linked hackers last December shut down power to a section of the Ukrainian capital of Kiev, marking the second straight year they’ve blacked out part of the country as part of Russia’s drive to bring Ukraine back under its geopolitical wing.

Hackers routinely come to the Black Hat convention to demonstrate how to break into electronic systems embedded in medical devices, ATMs, cars, routers and mobile phones. This year, at the 20th annual gathering, one security researcher walked attendees through a hack of a wind farm.

“Wind farm control networks are extremely susceptible to attack,” said the researcher, Jason Staggs, who is affiliated with the University of Tulsa.

Hackers would only have to get access to a single turbine to implant malware that would spread across the wind farm, Staggs said, noting that he had hacked into turbines at multiple wind farms with the permission of operators.

“We can turn the turbine on or turn the turbine off or put it in an idle state,” he said, demonstrating the method for taking digital control of a turbine.

Wind farms provided 4.7 percent of the nation’s electricity in 2015, Staggs said, a percentage that is likely to climb to 20 percent by 2030.

When a 250-megawatt wind farm is left idle due to a malicious hack, the downtime can cost an electric utility between $10,000 and $35,000 an hour, Staggs said.
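
That cost range follows directly from the lost energy multiplied by an assumed power price. The prices below are assumptions chosen to bracket the figures Staggs cited, not numbers from his talk, and the calculation assumes the farm would otherwise have run at full output:

# Rough downtime cost for an idled 250 MW wind farm.
# The wholesale prices are assumptions chosen to bracket the quoted range.
farm_mw = 250
for price_per_mwh in (40, 140):               # assumed $/MWh
    lost_per_hour = farm_mw * price_per_mwh   # $ per hour, assuming full output lost
    print(f"At ${price_per_mwh}/MWh: ${lost_per_hour:,} per hour")
# Prints $10,000 and $35,000 per hour, matching the range cited.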

While wind farms may face vulnerabilities, some experts said the nation’s complex electrical grids are more robust and capable of withstanding a cyberattack.

“The reporting around hacking the grid is often over hyped,” said Tom Parker, a cofounder of FusionX, a firm specializing in cyber threat profiling that was bought in 2015 by Accenture Security, where he is group technology officer.

“Whenever anyone ever says someone could hack the grid, there’s no such thing as a single grid. There are multiple grids in the United States,” Parker said in an interview. “It’s a very complex system. There’s a lot of robustness built into it.”

Any cyberattack on U.S. power plants and electrical grids would only come if a foreign power launched a simultaneous military attack on the United States, Parker said.

Others suggested a lesser attack might occur, affecting only a section of the nation.

“If somebody really wanted to send a message, it hurts to have three or four days of no power to the Eastern Seaboard. That could be done. And there’s also no quick fix,” Sam Curry, chief product officer at Cybereason, a Boston firm, said on the sidelines of the conference.

Late last month, the FBI and the Department of Homeland Security issued an alert saying that foreign hackers were targeting the nuclear, power and critical infrastructure sectors.

Authorities did not say who the hackers were but added that they were not successful at breaching any of their targets, including an unidentified nuclear plant.

“We see this probing going on all the time,” said Joe Slowik, a senior threat intelligence researcher at Dragos Inc., an industrial cybersecurity company based in Fulton, Maryland.

A former member of an elite counterterrorism cyber unit at the National Security Agency, Jay Kaplan, said in an interview that he believes some U.S. rivals have already planted malicious code in the nation’s infrastructure, perhaps including nuclear facilities.

“I do believe that there is a certain percentage of our critical infrastructure that’s already compromised,” said Kaplan, who is chief executive of Synack, a Menlo Park, California, firm that deploys security teams to hack into the networks of clients to test for vulnerabilities.

“This is prepositioning. Should we ever go to war with another nation state, they can leverage this malware for their benefit and basically cripple the economy,” Kaplan said. “I just think it’s reality. … Right now, I don’t feel confident at all.”

The cyberattacks on Ukraine’s power system in 2015 and 2016 have sharpened worldwide focus on hacking aimed at crippling infrastructure.

In a December 2015 attack, hackers cut power to about 225,000 customers serviced by three energy distribution companies. The attack in late 2016 struck a transmission station north of Kiev, blacking out a portion of the capital. Attackers used malware designed to delete data and physically damage industrial control systems, Cybereason said in a report this month.

The team, allegedly Russian, behind the second attack was formidable.

“It looked like there were around 20 people involved in the operation,” said Robert M. Lee, chief executive of Dragos.

Dragos partnered with ESET, a Slovakian security company, to examine the malicious code used in the industrial attack on the Ukrainian power supply.

While the malware was more sophisticated in the 2016 attack, Lee said it could not be easily repurposed to attack U.S. electrical grids, which he described as far more complex.

“Please don’t go reading Ted Koppel’s ‘Lights Out’ book and think you may need to build a bunker,” Lee said. “Things will be okay.”

But Lee added that authorities routinely overestimate their cyber defensive abilities and do not know for certain if malicious worms already reside within the infrastructure.

“The scary part of it is we simply do not know because the government tends to think it has more control over certain infrastructure than it actually does,” Lee said.

Congress Is Still Fighting Over Energy-Efficient Light Bulbs

By Ari Natter

July 27, 2017, 11:18 AM EDT

Amendment bars Energy Department enforcement of bulb standards

Light bulb makers have vowed to comply with rules regardless

Congressional Republicans renewed their effort to save the traditional light bulb, passing a measure to block federal energy standards that have come to symbolize government overreach for many consumers but are largely embraced by manufacturers as the cost of the newer bulbs has plummeted.

The House passed an amendment to a spending bill late Wednesday to block enforcement of Energy Department rules requiring manufacturers to phase out sales of incandescent light bulbs to cut power use. While the measure faces an uncertain future in the Senate, its sponsor called it an important victory for freedom.

"Congress should fight to preserve the free market," Representative Michael Burgess, a Texas Republican, said on the House floor before the vote. Burgess said he had heard from tens of thousands of people about how the regulations "will take away consumer choice when constituents are deciding which light bulbs they will use in their homes."

The measure takes aim at rules that were issued under bipartisan energy legislation signed into law in 2007 by Republican President George W. Bush. The standards, which were phased in over time starting in 2012, have curtailed the use of incandescent light bulbs.

If it becomes law, the measure could have some unintended consequences, critics warn. While it won’t change light bulb standards, which have been in effect for years, it could prevent authorities from blocking imports of incandescent bulbs from China or other nations that don’t meet the requirements, said Steven Nadel, executive director of the American Council for an Energy-Efficient Economy, a non-profit group that promotes energy conservation.

When Republicans attacked these standards, manufacturers said they would still voluntarily comply with them, and the LED share of the market has surged over the last three years as the cost of those bulbs dropped.

LED light bulbs, which made up just 1 percent of global market share in 2010, are estimated to make up 95 percent of that market by 2025, according to research by Goldman Sachs. They accounted for more than a quarter of U.S. sales last year, it said. Other bulbs that have been used to meet the requirements include the ubiquitous curly compact fluorescent, or CFL, bulbs and halogen lights.

When a similar measure passed into law as a rider in government funding legislation in 2011, the National Electrical Manufacturers Association, which represents companies such as Philips Lighting NV and General Electric Co., said light bulb makers planned to comply with the standards regardless. Tracy Cullen, a spokeswoman for the trade group, did not respond to an email and phone message seeking comment.

"It’s mind boggling to me," Elizabeth Noll, legislative director of the Natural Resources Defense Council’s energy and transportation program, said. "It just adds uncertainty to their products and the rules of the road they are complying with."

The requirements, which have been put in place over time starting with the 100-watt light bulb in 2012, will cut the nation’s electricity costs by $12.5 billion annually and avoid the equivalent of 30 new power plants, the environmental group estimates.

"It’s a an ideological objection to government interference in the market place," said Kateri Callahan, president of the Alliance to Save Energy, a Washington-based nonprofit. "All the things that Dr. Burgess and other opponents to those standards said would happen -- limited choice by consumer, high prices -- none of that has borne out."

Global diet and farming methods 'must change for environment's sake'

IOP Publishing

Reducing meat consumption and using more efficient farming methods globally are essential to stave off irreversible damage to the environment, a new study says.

The research, from the University of Minnesota, also found that future increases in agricultural sustainability are likely to be driven by dietary shifts and increases in efficiency, rather than changes between food production systems.

Researchers examined more than 740 production systems for more than 90 different types of food, to understand the links between diets, agricultural production practices and environmental degradation. Their results are published today in the journal Environmental Research Letters.

Lead author Dr Michael Clark said: "If we want to reduce the environmental impact of agriculture, but still provide a secure food supply for a growing global population, it is essential to understand how these things are linked."

Using life cycle assessments - which detail the input, output and environmental impact of a food production system - the researchers analysed the comparative environmental impacts of different food production systems (e.g. conventional versus organic; grain-fed versus grass-fed beef; trawling versus non-trawling fisheries; and greenhouse-grown versus open-field produce), different agricultural input efficiencies (such as feed and fertilizer), and different foods.

The impacts they studied covered levels of land use, greenhouse gas emissions (GHGs), fossil fuel energy use, eutrophication (nutrient runoff) and acidification potential.

Dr Clark said: "Although high agricultural efficiency consistently correlated with lower environmental impacts, the detailed picture we found was extremely mixed. While organic systems used less energy, they had higher land use, did not offer benefits in GHGs, and tended to have higher eutrophication and acidification potential per unit of food produced. Grass-fed beef, meanwhile, tended to require more land and emit more GHGs than grain-fed beef."

However, the authors note that these findings do not imply conventional practices are sustainable. Instead, they suggest that combining the benefits of different production systems, for example organic's reduced reliance on chemicals with the high yields of conventional systems, would result in a more sustainable agricultural system.

Dr Clark said: "Interestingly, we also found that a shift away from ruminant meats like beef - which have impacts three to 10 times greater than other animal-based foods - towards nutritionally similar foods like pork, poultry or fish would have significant benefits, both for the environment and for human health.

"Larger dietary shifts, such as global adoption of low-meat or vegetarian diets, would offer even larger benefits to environmental sustainability and human health."

Co-author Professor David Tilman said: "It's essential we take action through policy and education to increase public adoption of low-impact and healthy foods, as well as the adoption of low-impact, high-efficiency agricultural production systems.

"A lack of action would result in massive increases in agriculture's environmental impacts including the clearing of 200 to 1000 million hectares of land for agricultural use, an approximately three-fold increase in fertilizer and pesticide applications, an 80 per cent increase in agricultural GHG emissions and a rapid rise in the prevalence of diet-related diseases such as obesity and diabetes.

Professor Tilman added: "The steps we have outlined, if adopted individually, offer large environmental benefits. Simultaneous adoption of these and other solutions, however, could prevent any increase in agriculture's environmental impacts. We must make serious choices, before agricultural activities cause substantial, and potentially irreversible, environmental damage."

Optimizing feeding is necessary to maintain milk production in organic herds

Decisions on pasture use and feed management affect greenhouse gas emission, according to a new study from the Journal of Dairy Science®

Philadelphia, PA, June 15, 2017 - Consumer demand for organic milk recently surpassed the available supply, with sales of organic products reaching $35 billion in 2014 and continuing to rise. As farms transition to organic production to meet demand, feeding strategies will need to be adapted to meet USDA National Organic Program requirements. Currently, agriculture accounts for approximately 9% of total US greenhouse gas (GHG) emissions; the US dairy industry has committed to a 25% reduction of GHG by 2020 relative to 2009. By varying diet formulation and the associated crop production to supply the diet, farmers can affect the quantity of GHG emissions of various feeding systems. Therefore, researchers from the University of Wisconsin-Madison created a study to compare the effects of feeding strategies and the associated crop hectares on GHG emissions of Wisconsin certified organic dairy farms.

"Herd feeding strategies and grazing practices influence on-farm GHG emissions not only through crop production, but also by substantially changing the productivity of the herd," lead author Di Liang said. "Managing more land as pasture, and obtaining more of the herd feed requirements from pasture, can increase the GHG emissions if pasture and feed management are not optimized to maintain milk production potential."

The authors identified four feeding strategies that typified those used on farms in Wisconsin, with varying degrees of grazing, land allocated for grazing, and diet supplementation. A 16-year study was used for robust estimates of the yield potential on organically managed crop land in southern Wisconsin as well as nitrous oxide and methane emissions and soil carbon.

Production of organic corn resulted in the greatest nitrous oxide emissions and represented about 8% of total GHG emissions; corn also had the highest carbon dioxide emissions per hectare. Emissions decreased as the proportion of soybeans in the diet increased, as soybeans require less nitrogen fertilization than corn grain. More intensive grazing practices led to higher GHG emissions per metric tonne of milk. However, allowing cows more time on pasture resulted in lower emissions associated with cropland. Manure management and replacement heifers accounted for 26.3% and 20.1% of GHG emissions, respectively.

Based on their findings, the authors determined that a holistic approach to farm production is necessary. Organic dairy farms with well-managed grazing practices and adequate levels of concentrate in diet can both increase farm profitability and reduce GHG emission per kilogram of milk.

"Consumers often equate more dependence on pasture with environmentally friendly farming, but this study demonstrated that low milk production per cow is a major factor associated with high GHG emission. Managing both pasture and supplementation to increase milk production per cow will substantially reduce GHG emissions," said Journal of Dairy Science Editor-in-Chief Matt Lucy.

Factors such as dairy cow breed and nonproduction variables may also have an effect on GHG emissions on organic dairy farms. Thus, future studies are needed in this area to elucidate the effects of grazing management and feeding systems. With more research, however, crop and milk production, GHG emissions, and farm profitability can be optimized on organic dairy farms.

EPA Announces Student Award Winners

The U.S. Environmental Protection Agency (EPA) today announced the winners of the 2016 President’s Environmental Youth Award (PEYA). The program recognizes outstanding environmental stewardship projects by K-12 youth. These students demonstrate the initiative, creativity, and applied problem-solving skills needed to tackle environmental problems and find sustainable solutions.

Fifteen projects are being recognized this year, from 13 states: California, Colorado, Connecticut, Florida, Nebraska, New Jersey, Michigan, Pennsylvania, South Carolina, Texas, Utah, Virginia, and Washington.

“Today, we are pleased to honor these impressive young leaders, who demonstrate the impact that a few individuals can make to protect our environment,” said EPA Administrator Scott Pruitt. “These students are empowering their peers, educating their communities, and demonstrating the STEM skills needed for this country to thrive in the global economy.”

Each year the PEYA program honors environmental awareness projects developed by young individuals, school classes (kindergarten through high school), summer camps, public interest groups and youth organizations.

This year’s PEYA winners conducted a wide range of activities, such as:

developing a biodegradable plastic using local agricultural waste product;

designing an efficient, environmentally friendly mosquito trap using solar power and compost by-product;

saving approximately 2,000 tadpoles to raise adult frogs and toads;

implementing a hydroponics and aquaculture project in a high school;

repurposing over 25,000 books;

creating an environmental news YouTube channel;

organizing recycling programs to benefit disaster victims and underserved community members;

reclaiming and repurposing over 4,000 discarded pencils within their school;

promoting food waste reduction;

creating a small, portable tool to prevent air strikes of migratory birds;

engaging their community in a program to save a threatened bird, the Western Snowy Plover;

testing grey water to encourage water conservation;

promoting bee health;

uniting their schools to address local environmental issues.

The PEYA program promotes awareness of our nation’s natural resources and encourages positive community involvement. Since 1971, the President of the United States has joined with EPA to recognize young people for protecting our nation’s air, water, land and ecology. It is one of the most important ways EPA and the Administration demonstrate commitment to environmental stewardship efforts created and conducted by our nation’s youth.

EPA Honors Winners of the 2017 Green Chemistry Challenge Awards

The U.S. Environmental Protection Agency (EPA) is recognizing landmark green chemistry technologies developed by industrial pioneers and leading scientists that turn potential environmental issues into business opportunities, spurring innovation and economic development.

“We congratulate those who bring innovative solutions that will help solve problems and help American businesses,” said EPA Administrator Scott Pruitt. “These innovations encourage smart and safe practices, while cutting manufacturing costs and sparking investments. Ultimately, these manufacturing processes and products spur economic growth and are safer for health and the environment.”

The Green Chemistry Challenge Award winners will be honored on June 12 at a ceremony in Washington, DC. The winners and their innovative technologies are:

Professor Eric Schelter, University of Pennsylvania, for developing a simple, fast, and low-cost technology to help recycle mixtures of rare earth elements. Reducing the costs to recover these materials creates economic opportunity by turning a waste stream, currently only recycled at a rate of 1%, into a potential revenue stream. About 17,000 metric tons of rare earth oxides are used in the U.S. annually in materials such as wind turbines, catalysts, lighting phosphors, electric motors, batteries, cell phones, and many others. Mining, refining, and purification of rare earths are extraordinarily energy and waste intensive and carry a significant environmental burden.

Dow Chemical Company, Collegeville, Pennsylvania, in partnership with Papierfabrik August Koehler SE, Germany, for developing a thermal printing paper that eliminates the need for chemicals used to create an image, such as bisphenol A (BPA) or bisphenol S (BPS). Thermal paper is used broadly throughout the world for cash register receipts, tickets, tags, and labels. This technology reduces costs by creating records that do not fade, even under severe sunlight, allowing the original document to be preserved for long term storage. The paper is compatible with thermal printers currently in commercial use around the world.

Merck Research Laboratories, Rahway, New Jersey, for successfully applying green chemistry design principles to Letermovir, an antiviral drug candidate currently in phase III clinical trials. The improvements to the way the drug is made, including use of a better chemical catalyst, increase the overall yield by more than 60%, reduce raw material costs by 93%, and reduce water usage by 90%.

Amgen Inc., Cambridge, Massachusetts, in partnership with Bachem, Switzerland, for improving the process used to manufacture the active ingredient in ParsabivTM, a drug for the treatment of secondary hyperparathyroidism in adult patients with chronic kidney disease. This improved peptide manufacturing process reduces chemical solvent use by 71%, manufacturing operating time by 56%, and manufacturing cost by 76%. These innovations could increase profits and eliminate 1,440 cubic meters of waste or more, including over 750 cubic meters of aqueous waste annually.

UniEnergy Technologies, LLC (UET), Mukilteo, Washington, in partnership with Pacific Northwest National Laboratory (PNNL), for an advanced vanadium redox flow battery, originally developed at the PNNL and commercialized by UET. The battery, when used by utility, commercial and industrial customers, allows cities and businesses more access to stored energy. It also lasts longer and works in a broad temperature range with one-fifth the footprint of previous flow battery technologies. The electrolyte is water-based and does not degrade, and the batteries are non-flammable and recyclable, thus helping meet the increasing demand of electrical energy storage in the electrical power market, from generation, transmission, and distribution to the end users of electricity.

During the 22 years of the program, EPA has received more than 1600 nominations and presented awards to 114 technologies that spur economic growth, reduce costs, and decrease waste. The agency estimates winning technologies are responsible for annually reducing the use or generation of more than 826 million pounds of hazardous chemicals, saving 21 billion gallons of water, and eliminating 7.8 billion pounds of carbon dioxide equivalent releases to air.

An independent panel of technical experts convened by the American Chemical Society Green Chemistry Institute formally judged the 2017 submissions from among scores of nominated technologies and made recommendations to EPA for the 2017 winners. The 2017 awards event will be held in conjunction with the 21st Annual Green Chemistry and Engineering Conference.

More information: www.epa.gov/greenchemistry

Ph.D. student pioneers storytelling strategies for science communication

By Tianyi Dong | June 6, 2017

Ph.D. student Sara ElShafie has created workshops to teach graduate students how to tell stories about science. (UC Berkeley photo by Brittany Hosea-Small)

Sara ElShafie used to struggle to explain her research to her family in a meaningful way. A UC Berkeley graduate student in integrative biology with an emphasis on how climate change affects animals over time, she says she “would always get lost in the details, and it was not doing justice to what is so amazing about natural history.”

But she doesn’t have that challenge anymore. Today, 28-year-old ElShafie is one of the few people in the country who focus on adapting storytelling strategies from the film industry to science communication. For the past year and a half, she has been leading workshops for scientists — primarily graduate students — on how to tell stories about their research that resonate with a broader audience.

ElShafie found her storytelling solution at Pixar Animation Studios, in Emeryville. A Pixar fan her entire life, she emailed the company’s outreach department in 2015 and asked if anyone there would like to talk with students at the UC Museum of Paleontology about how to adapt strategies for filmmaking to communicating science to people outside the scientific community.

“I just thought, ‘Why not?’” she says. “Communication skills require training, just like any other skills. Good communication requires good storytelling. Maybe we can learn from professional storytellers.”

To her surprise, two story artists at Pixar were interested and volunteered their time for the project. They collaborated with ElShafie to present a pilot seminar at the museum that attracted not only students, but faculty and staff. Seeing the project’s potential, ElShafie worked over the following year to develop a series of workshops inspired by Pixar. Last March, together with a Pixar collaborator, ElShafie presented the first public workshop on the Berkeley campus. Although the studio is not formally involved, additional artists at Pixar have been generously contributing their time and feedback.

Drawing on examples from Pixar films and scientific studies, the workshops illustrate storytelling strategies that make science accessible by reframing research into a story with characters, obstacles and revelations. According to ElShafie, these techniques help to overcome scientists’ difficulty with communicating science about non-human subjects. Pixar films — often about animals — are great models, she adds, because they have compelling themes and emotional depth and appeal to a broad audience.

A Pixar fan since childhood, ElShafie has a collection of Pixar figurines and toys in her lab in the Valley Life Sciences Building. (UC Berkeley photo by Brittany Hosea-Small) Pixar characters are copyright Disney/Pixar.

In her workshops with scientists, ElShafie leads by example, turning her research on fossil lizards into a story about how the conflict of climate change affects animal characters over time. “Lizards alone aren’t going to be that interesting to most audiences,” ElShafie says, “but thinking about time, space and their connection with a major societal problem is.”

ElShafie’s workshops not only help participants frame a message, but develop a conceptual framework for introducing scientific topics in the form of a story. With a worksheet she’s created to illustrate the parallel between filmmaking and science, participants leave the workshop having made an outline of a story about their own research. Extended workshops also explore how to use the visual language of animated films in scientific presentations.

“She’s really enlightened the whole museum community on ways we can strengthen our own writing by applying storytelling techniques,” says Lisa White, assistant director of education and public programs at the UC Museum of Paleontology. White says she watched how ElShafie polished and perfected her workshop over the past school year and how the audience grew threefold, to nearly 200 people at workshops this spring.

ElShafie offered her first official workshop, “Science Through Story: Strategies to Engage Any Audience,” in November 2016 at the annual meeting of the Western Society of Naturalists in Monterey. It was a hit, and she started to receive invitations to present it at other institutions around the state and country.

Mallory Rice, a UC Santa Barbara graduate student who attended the Monterey workshop, invited ElShafie to present the workshop at her campus in April. “It was great to see the example of how she turned her own questions and research into a story, rather than thinking about (just presenting) it to the public with results and data,” Rice says.

ElShafie hopes to hold the workshop annually at Berkeley.

Sara ElShafie (far left) at a March 2017 workshop she led on campus for 200 people that included (left to right) Eric Rodenbeck, CEO of Stamen Design; Erica Bree Rosenblum, UC Berkeley associate professor of Environmental Science, Policy and Management; Lisa White, director of education at the UC Museum of Paleontology; and Daniel McCoy, technical director at Pixar Animation Studios. (UC Museum of Paleontology photo by Helina Chin)

Upon arriving at Berkeley as a Ph.D. student in 2014, ElShafie knew she wanted a career in informal education leadership and eventually to become director of a major science museum. To achieve this, she sought scientific training as well as experience in communications and outreach.

As an undergraduate at the University of Chicago, ElShafie had been involved in Project Exploration, a Chicago nonprofit that makes science accessible to underrepresented groups. One of its programs brought students from the inner city to campus for a weeklong intensive summer course. She remembered how the students were overwhelmed at the beginning, but completely transformed by the end of the week, confident and dying to share with others what they had learned.

“That’s when I realized science is not just about learning science, but about discovering your potential and learning about the world around you,” ElShafie says.

ElShafie chose Berkeley for her doctoral program because it could offer her training in both scientific research and informal education, and was especially drawn to the UC Museum of Paleontology, with its emphasis on outreach and public service. She got involved in the museum’s educational programming, giving tours to school groups and assisting with Cal Day activities. Her lecture at Cal Day 2017, “Real-World Fantastic Beasts and Where To Find Them,” was another opportunity for her to apply her skills in explaining science to the public.

“Museums are often the first opportunity for young people to experience hands-on science,” White says, “and for Ph.D. students, a museum career is particularly rewarding, because you not only do research, but also share your specialty as well as the importance of science broadly with the public.”

ElShafie studies lizard fossils and how climate change has affected animals throughout history. (UC Berkeley photo by Brittany Hosea-Small)

For ElShafie, balancing science communication workshops with her dissertation research has not been easy. But she spends whatever spare time she has on this passion project and is running workshops across the country. She has given six workshops in the past year and has four more coming up, including one in June at the Smithsonian National Museum of Natural History.

She says, “I enjoy it immensely and have learned a ton — how to put materials together in a cohesive way; how to articulate a vision for a new initiative and get other people on board; how to create a niche for yourself by looking at what everyone else has done, what’s missing, and coming up with novel ideas to address a need.”

Job-seekers with communications expertise are in high demand in areas such as interaction with policy-makers and data visualization for scientists, says ElShafie.

“It has never been more critical for scientists to be able to explain science to the public effectively, and the backbone of all communication is a story,” ElShafie says. “Story training benefits scientists, because it helps us to communicate it in a clear and compelling way to any audience.”

She says the story also humanizes the storyteller, which can help with the misconception that scientists have their own agenda or are out of touch. “Storytelling is an introspective process,” says ElShafie. “It helps scientists to articulate their professional motivations, about why they become scientists, why they choose their particular field and why it is important.”

For ElShafie, paleontology is “the perfect gateway to other sciences” and also involves art and imagination. She enjoys the feeling of holding the evidence — like a fossil — in her hand, imagining the animals and their past lives, and then bringing them to life for other people.

“I learned that there are many parallels between the creative process of filmmakers and of scientists,” ElShafie says, adding that stepping outside one’s profession and looking at it from another lens, for a fresh perspective, is always valuable.

Wildfires Burning in West Greenland

By Brian Kahn

Published: August 7th, 2017

It’s not just the American West and British Columbia burning up. A fire has sparked in western Greenland, an odd occurrence for an island known more for ice than fire.

A series of blazes is burning roughly in the vicinity of Kangerlussuaq, a small town that serves as a basecamp for researchers in the summer to access Greenland’s ice sheet and western glaciers. The largest fire has burned roughly 3,000 acres and sent smoke spiraling a mile into the sky, prompting hunting and hiking closures in the area, according to local news reports.

The Sentinel-2 satellite captured a wildfire burning in western Greenland.

There’s no denying that it’s weird to be talking about wildfires in Greenland because ice covers the majority of the island. Forests are basically nonexistent and this fire appears to be burning through grasses, willows and other low-slung vegetation on the tundra that makes up the majority of the land not covered by ice.

Data for Greenland fires is hard to come by, but there is some context for fires in other parts of the northern tier of the world. The boreal forest sprawls across Canada, Russia, Alaska and northern Europe, and provides a longer-term record for researchers to dig into. That record shows that the boreal forest is burning at a rate unprecedented in the past 10,000 years.

Stef Lhermitte, a remote sensing expert at Delft University of Technology in the Netherlands, said there is evidence of fires burning in Greenland over the past 17 years of MODIS satellite records kept by NASA. But because of how NASA’s algorithms interpret the satellite data, there’s low confidence that every fire on the map actually occurred.

Jason Box, an ice sheet researcher with the Geological Survey of Denmark and Greenland, said he observed a lightning-sparked fire in the late 1990s, but that otherwise, fires are rare. Looking at the MODIS record, he said one of the only other high-confidence fires was actually a trash burn in 2013, though other satellites show evidence of other fires.

Box also noted that temperatures in the area rose in late July just before the fire was first observed, spiking to above 53°F (12°C) on July 27. While not exactly balmy, the temperature rise may have helped the blazes to spread.

According to La Croix, a French newspaper, there’s no precedent for a fire this size in the European Union’s forest fire system. Looking beyond the satellite record for context specific to Greenland is all but impossible as there are basically no records to refer to.

“There does not appear to be a reliable long-term record of observed wildfires in Greenland,” researchers with the Danish Meteorological Institute’s Greenland monitoring program tweeted.

Ultimately, it’s not the burning of Greenland’s tundra that’s the biggest climate change concern. It’s the island’s massive store of ice, which, if melted, would be enough to raise sea levels 20 feet.

The ice has been melting at a quickening pace since 2000, partly due to wildfires in other parts of the world. The uptick in boreal forest fires has kicked up more ash in the atmosphere where prevailing winds have steered it toward the ice sheet.

The dark ash traps more energy from the sun, which has warmed the ice sheet and caused more widespread melting. Soot from massive wildfires in Siberia caused 95 percent of the Greenland ice sheet surface to melt in 2012, a phenomenon that could become a yearly occurrence by 2100 as the planet warms and northern forest fires become more common.

Recycling dirty aluminum foil might not be practical — but it could be used to make biofuel

By Cody Boteler

Published: Aug. 2, 2017

Dirty aluminum foil can be converted into a catalyst for biofuel production, according to a researcher at Queen's University Belfast. Recycling aluminum from all feedstocks is critical, according to the paper (published in Scientific Reports), because bauxite mining, the main source of aluminum, is environmentally damaging.

The researchers were able to dissolve the foil in a solution that turned it into crystals, and then used a second mixture to purify the crystals into pure aluminum salts, according to Mental Floss. Those pure salts can be used in creating alumina catalyst, which is a key ingredient in making dimethyl ether (DME), a diesel substitute.

According to the author of the paper, producing the alumina catalyst from dirty aluminum foil costs about $142 per kilogram (just over 2 lbs.), while the commercial cost for the alumina catalyst is around $360 per kilogram — demonstrating significant savings.
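
For rough context, the saving implied by those figures works out to about 60 percent per kilogram of catalyst; this is a back-of-the-envelope calculation based on the prices above, not a number quoted in the paper:

\[ \frac{\$360 - \$142}{\$360} \approx 0.61 \]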

The industrial process of chemically and physically transforming dirty, used aluminum foil into usable, pure aluminum crystals may be complicated — but the implications are clear. Instead of landfilling used aluminum and taking up valuable airspace, that aluminum can be turned into a valuable catalyst for biofuel production.

As more areas pursue organics diversion goals, solutions should be explored on all steps of the food chain, from packaging to waste processing. While clean aluminum can be utilized by recyclers or scrap sellers, dirty aluminum can't do much more than add to the weight of a landfill delivery, and can contaminate organics collection. Converting dirty aluminum to a usable industrial material, then, could become a valuable cottage industry, if the paper's findings are viable.

Queen's University said, in a press release, that the author of the paper, Ahmed Osman, plans to continue researching ways to further improve the catalysts that produce DME, to make production more commercially viable. Osman's research could also include looking at ways to use the alumina crystals produced from the dirty aluminum foil in the catalytic converters of vehicles that run on natural gas. Given the industry trend, both from companies and governments, away from traditional fuel in collection trucks, these developments are worth keeping an eye on. The New York Department of Sanitation has been testing a DME-powered Mack Truck, with results expected sometime soon.

China’s ageing solar panels are going to be a big environmental problem

The issue of how to dispose of hazardous waste from ageing panels casts a shadow over the drive towards renewable energy and away from fossil fuels

PUBLISHED : Sunday, 30 July, 2017, 4:03pm, UPDATED : Sunday, 30 July, 2017, 10:12pm

China will have the world’s worst problem with ageing solar panels in less than two decades, according to a recent industry estimate.

Lu Fang, secretary general of the photovoltaics division of the China Renewable Energy Society, wrote in an article circulating on mainland social media this month that the country’s cumulative capacity of retired panels would reach up to 70 gigawatts (GW) by 2034.

That is three times the scale of the Three Gorges Dam, the world’s largest hydropower project, by power production.

By 2050 these waste panels would add up to 20 million tonnes, or 2,000 times the weight of the Eiffel Tower, according to Lu.

“In fair weather, prepare for foul,” she warned.
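
As a rough check on those comparisons, take the Three Gorges Dam’s installed capacity as about 22.5GW and the Eiffel Tower’s weight as about 10,000 tonnes (reference figures assumed here, not drawn from Lu’s report):

\[ \frac{70\ \mathrm{GW}}{22.5\ \mathrm{GW}} \approx 3.1, \qquad \frac{20{,}000{,}000\ \mathrm{t}}{10{,}000\ \mathrm{t}} = 2{,}000 \]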

Lu could not be immediately reached for comment. A staff member from the society, which was formerly known as the Chinese Solar Energy Society, confirmed that the figures were in a report presented by Lu at an industrial conference in Xian, Shaanxi in May.

China hosts the world’s largest number of solar power plants, with a total capacity of close to 80GW as of last year, according to the International Energy Agency. Its installed capacity is nearly twice that of the US.

Nearly half of the nation’s total capacity was added last year. Industrial experts have also predicted that new solar farms completed this year will exceed 2016’s record, according to Bloomberg.

This breakneck pace has been driven by the government’s push to diversify the country’s energy supply structure, which at present relies heavily on fossil fuels such as coal and imported oil.

But the solar plants are relatively short-lived, and the government does not have any retirement plan for them yet.

A panel’s lifespan ranges from 20 to 30 years, depending on the environment in which it is used, according to the US Department of Energy. High temperatures can accelerate the ageing process for solar cells, while other negative factors – such as the weight of snow or dust storms – could cause material fatigue on the surface and in the internal electric circuits, gradually reducing the panel’s power output.

Tian Min, general manager of Nanjing Fangrun Materials, a recycling company in Jiangsu province that collects retired solar panels, said the solar power industry was a ticking time bomb.

“It will explode with full force in two or three decades and wreck the environment, if the estimate is correct,” he said.

“This is a huge amount of waste and they are not easy to recycle,” Tian added.

A solar panel contains metals such as lead and copper and also has an aluminium frame. The solar cells are made up of pure, crystallised silicon wrapped under a thick layer of plastic membrane for protection.

In Europe, some companies are reported to have developed sophisticated technology to reclaim more than 90 per cent of the materials.

But such Western technology might be a hard sell in China, according to Tian.

China’s solar power plants are mostly located in poor, remote regions such as the Gobi in Inner Mongolia, while the majority of recycling industries are in developed areas along the Pacific coast.

Transporting these bulky panels over long distances could be very costly, Tian said.

Another cost comes from separating and purifying the waste materials, an industrial process that not only requires plenty of labour and electricity input, but also chemicals such as acids that could cause harm to the environment.

“If a recycling plant carries out every step by the book to achieve low pollutant emission, their products can end up being more expensive than new raw materials,” he said.

The cost of a kilogram of silicon crystal stands at about US$13 this year. The price is expected to drop by 30 per cent over the next decade, according to industrial estimates. This would make recycled silicon even harder to sell, according to Tian.

A sales manager of a solar power recycling company believes there could be a way to dispose of China’s solar junk, nonetheless.

“We can sell them to the Middle East,” said the manager, who requested not to be named.

“Our customers there make it very clear that they don’t want perfect or brand new panels. They just want them cheap,” he said.

“They are re-selling these panels to household users living in deserts. There, there is lots of land to install a large amount of panels to make up for their low performance,” the manager added.

“Everyone is happy with the result,” he added.

Flood insurance is a mess, and Harvey won’t make Congress any more likely to fix it

By Stuart Leavenworth

September 01, 2017 4:11 PM

WASHINGTON

Hurricane Harvey is sure to add more crushing debt to the National Flood Insurance Program, which is already $25 billion in the red. So when Congress resumes on Tuesday, will it immediately act to fix this troubled program?

Don’t count on it.

Entrenched regional and partisan divisions have sidelined flood insurance reform for years. Those divides could intensify this month as lawmakers confront three tasks: formulating an aid package for hurricane victims, raising the debt ceiling and reauthorizing the flood insurance program, which is set to expire by Sept. 30.

Given Congress’ track record of action in 2016, few observers expect lawmakers to act on comprehensive flood insurance reform legislation pending in the House and Senate.

“Not likely to happen,” said Steve Ellis, vice president of Taxpayers for Common Sense, a budget watchdog group and longtime advocate for flood insurance reform.

“If we had a Lyndon B. Johnson in office, we could get these deals made,” said R.J. Lehmann, co-founder of the R Street Institute, a conservative think tank that also advocates for flood insurance reform. “Unfortunately, we don’t have an LBJ.”

Ellis and others say the best hope now is that lawmakers will pass a three- or six-month extension of the National Flood Insurance Program, effectively kicking the debate down the road.

“That is all fine and good,” said Jimi Grande, senior vice president of government affairs for the National Association of Mutual Insurance Companies. “But after all this time, no one should get a ribbon for a last-minute extension.”

Ellis, Lehmann and Grande are part of a strange-bedfellow coalition of taxpayer groups, insurance companies, environmental organizations and fiscal hawks. Over the last decade, these groups have lobbied to phase out subsidies for flood insurance and institute other reforms. Members of the SmartSafer coalition, they argue that the NFIP is both financially unsustainable and an enabler, encouraging people to build and rebuild in dangerous floodplains.

The coalition has scored some successes, including a 2012 reauthorization that reduced FEMA’s debt and led to better mapping of flood hazards in states such as North Carolina. “The eyes get a little wider (in Washington) when lawmakers see a broad coalition of interests working together,” said Ellis.

Yet progress has been slow. Roughly five million households nationwide hold federal flood insurance policies. More than half of those are in three states — Florida, Texas and Louisiana — that enjoy sizable clout in Congress and have resisted jacking up premiums on policy holders. In 2014, they helped pass legislation to reverse some of the reforms that Congress had passed two years earlier.

In addition, powerful lobbying groups such as the National Association of Realtors and National Association of Homebuilders tend to oppose legislative changes that could make flood insurance significantly less affordable.

It used to be that support for flood insurance reform was more of a regional issue than a partisan issue, said Lehmann. But now, he said, even lawmakers of opposite parties wanting to reduce NFIP subsidies refuse to cooperate. “Our coalition faces challenges we never had when we started,” he said.

The partisan divide was evident in June, when the House Financial Services Committee passed a legislative package that would reauthorize the flood insurance program for five years, ramp up premiums, fund more flood mitigation programs and make it easier for private companies to offer their own insurance policies.

Democrats opposed several parts of that package, which was shepherded out of committee by Chairman Jeb Hensarling, a Republican who represents a portion of Texas away from the flood-prone coast. The package has since stalled amid concerns from House Republicans with large numbers of floodplain constituents, namely House Majority Whip Steve Scalise of Louisiana.

In the Senate, political opposites such as Marco Rubio of Florida and Elizabeth Warren of Massachusetts are supporting an NFIP reauthorization bill. To the dismay of flood reform advocates, it would cap annual premium increases, perpetuating the subsidies.

Each year on average, the flood insurance program pays out $3 billion more than it collects in premiums, a key reason it has accumulated a debt of nearly $25 billion. In 2005, FEMA borrowed more than $17 billion from the U.S. Treasury to pay claims related to Hurricane Katrina, and Hurricane Sandy triggered another $9 billion in borrowing.

Harvey’s flooding will add to the program’s woes. Texas’ Harris County, which includes Houston, has nearly 250,000 insurance policies. The hurricane has likely caused tens of billions of dollars in damages, a portion of which the NFIP will be obligated to cover.

For Congress, one immediate task is dealing with FEMA’s cash flow. As of Wednesday, FEMA had $1.4 billion in its major disaster account, which could quickly become exhausted as the agency pays for temporary housing and other costs. Tom Bossert, the White House Homeland Security advisor, said on Wednesday the president would be asking Congress for an initial supplement to FEMA’s budget “shortly,” followed by a larger disaster relief request.

If Congress doesn’t reauthorize the National Flood Insurance Program, the program would “lapse” and FEMA would be unable to sell more policies. The last time that happened, in 2010, an estimated 46,800 home sales transactions were interrupted or canceled, according to the National Association of Realtors.

Grande, the insurance industry lobbyist, said he seriously doubts that would happen again. “The politics of Harvey right now won’t allow any politician to let the insurance program lapse,” said Grande, who expects Congress to pass a simple short-term extension of the NFIP, as it has several times before.

But Lehmann fears that even a simple extension of the insurance program could get ground up in the sausage making that is about to ensue. To secure votes to raise the debt ceiling, congressional leaders may attempt to tie that measure to Hurricane Harvey aid, a possibility that is already drawing resistance.

Renewal of the NFIP could also be drawn into that deal making, he said.

“It should be easy to get a continuing resolution on flood insurance,” he said. “But these days, nothing is easy in Washington.”

Could These Robotic Kelp Farms Give Us An Abundant Source Of Carbon-Neutral Fuel?

By using elevators to grow kelp farther out in ocean waters, Marine BioEnergy thinks it can grow enough seaweed to make a dent in the fuel market.

Off the coast of Catalina Island near Los Angeles, a prototype of a new “kelp elevator”–a long tube with seaweed growing on it that can be moved up and down in the water to access sunlight and nutrients–will soon begin tests.

If the study works as hoped, the startup behind it, Marine BioEnergy, wants to use similar technology, driven by robotic submarines, to begin farming large tracts of the open ocean between California and Hawaii. Then it plans to harvest the kelp and convert it into carbon-neutral biocrude that could be used to make gasoline or jet fuel.

“We think we can make fuel at a price that’s competitive with the fossil fuel that’s in use today,” says Cindy Wilcox, who cofounded Marine BioEnergy with her husband Brian Wilcox, who manages space robotics technology in his day job at NASA’s Jet Propulsion Laboratory at California Institute of Technology.

Other biofuels, such as ethanol made from plant waste on corn fields, have struggled to become commercially viable, particularly after oil prices crashed. Solazyme, a company that planned to make biofuel from algae (and predicted in 2009 that it would be cost-competitive with fossil fuels within two or three years), ended up pivoting to make food products under the name TerraVia, and has now declared bankruptcy.

Kelp might have a chance of faring better. Unlike plants on land, it has little lignin or cellulose, fibers that make processing more difficult and expensive. In the right conditions, it can grow more than a foot a day, without the need for the irrigation or pesticides that might be used on land.

A key to the company’s concept is farming in the open ocean, where there is room to grow vast quantities of kelp. “You’re going to need a lot of kelp in order to make it cost-competitive with something like coal, fossil fuels, or natural gas,” says Diane Kim, a scientist at the University of Southern California’s Wrigley Institute for Environmental Studies, which is helping run the proof-of-concept study of Marine BioEnergy’s technology at Catalina. “In order to grow that much kelp, you really have to move outside the normal range of where kelp is found, which is along the coast.”

Kelp doesn’t typically grow in the open ocean since it needs both sunlight found near the surface of the water and nutrients that are found near the ocean floor (it also needs to anchor itself to something). In the 1970s, during the oil embargo, the U.S. Navy began investigating the possibility of farming kelp in the open ocean, pumping deep ocean water filled with nutrients to kelp anchored near the surface. But anchors often failed in ocean currents, and after the embargo ended, government interest faded.

In shallow coastal waters, where kelp naturally has access to both sunlight and nutrients, it’s a challenge to grow the seaweed at scale. Attempts to cultivate kelp gardens for food only have succeeded in relatively small areas.

But Brian Wilcox, who happens to be the son of the researcher who led the early work with the Navy, believed that kelp farming in the open ocean still might be possible. “My husband just kept thinking about this–here’s the right feedstock, and we just aren’t using it,” says Cindy Wilcox. He began considering a new approach: moving kelp up and down in a process he calls depth cycling, which gives the kelp access to both the nutrient-rich deep water and the light near the surface.

In 2015, Marine BioEnergy got a grant from the U.S. Department of Energy’s ARPA-E to test the proof of concept that is now in early stages with the Wrigley Institute researchers. Long lines stretch in a net-like pattern in the water, with kelp attached; the kelp can be raised in a saltwater nursery on land, where it is seeded into twine, and then later tied to the floating farm. At the end of the farm, underwater drones can pull the whole system up and down, both to maximize growth and to avoid ship traffic or storms near the surface. When the kelp is ready for harvest, the drones can tow the farm to a nearby ship.

The startup is also working with Pacific Northwest National Laboratory, which has developed a process to convert kelp to biocrude. The team is evaluating whether it’s more economical to make the crude on a ship–the processing center could fit on a container ship, powered by the process’s own fuel–or to bring the harvested kelp back to land.

The resulting fuel should be carbon neutral, because the carbon dioxide released when the fuel is burned will equal the carbon dioxide taken in by the kelp as it grows. Still, some argue that biofuels are not an ideal choice for powering transportation. Mark Jacobson, a Stanford University professor who has calculated that it’s feasible to get all energy needs from wind, hydropower, and solar, says that a car running on renewable electricity makes more sense than a car running on biofuel.

“I believe liquid biofuels for transportation (or any other combustion purpose) are a bad idea because they still require combustion, resulting in air pollution, which using electricity generated from clean, renewable sources for transportation avoids,” Jacobson says.

But air transportation is unlikely to run on electricity anytime soon, and–despite some predictions about the rapid demise of gas cars–biofuel could serve a practical purpose in the near term. The kelp biocrude, which could be processed at existing refineries, could also be used to make plastics that are typically made from fossil fuels.

The first step is proving that the kelp can grow and thrive as it’s pulled up and down. “Part of this project for the next two years is to really figure out, using the depth cycling strategy, if it works at all, and what are the parameters,” says Kim. “Theoretically, it should work.”

If the proof of concept is successful, Marine BioEnergy wants to go big: To cover 10% of the transportation fuel needs in the U.S., they’ll have to have enough kelp farms to cover an area of the Pacific roughly the size of Utah.

About the author

Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.

Resort town gets high-priced help to take aim at turbines

Josh Kurtz, E&E News reporter

Climatewire: Friday, October 20, 2017

The Block Island Wind Farm off Rhode Island's coast was the first U.S. offshore wind farm. There's a fight in Maryland over turbines. Dennis Schroeder/NREL/Flickr

This article was updated at 10:52 p.m. EDT.

Already targeted by an appropriations bill in Congress, two proposed wind energy projects off the coast of Maryland could face a sneak attack in the Free State's upcoming legislative session.

Officials in the resort town of Ocean City, Md., fearful that wind turbines will damage their lucrative tourist economy, have hired a plugged-in Annapolis lobbyist to help them push the two projects farther offshore. The lobbyist, Bruce Bereano, said in an interview this week that he was hired to focus on Congress and the federal approval process.

But some stakeholders in the long battle to bring offshore wind energy to Maryland are skeptical. Bereano, by his own admission, has limited experience lobbying Congress. He is, however, the most successful lobbyist in the history of the Maryland State House.

"You don't hire Bruce Bereano to go to Capitol Hill when he has spent 40 years in Annapolis," said Mike Tidwell, executive director of the Chesapeake Climate Action Network.

So the inevitable question becomes: Will there be a push in the 2018 Maryland General Assembly session, which begins Jan. 10, to kill or alter the offshore wind projects? Could a symbolic election-year "messaging" bill emerge, even if it has no chance of passing? The answers to these questions are fraught with political implications.

Ocean City Mayor Rick Meehan could not be reached for comment yesterday about the town's recently inked $65,000-a-year contract with Bereano. But Ocean City Councilman Tony DeLuca seemed to hint at the reason for Bereano's hiring when he told Ocean City Today last month, "He knows everyone in Annapolis, he has the reputation, and he's the right man for the job."

At issue are two wind energy projects the Maryland Public Service Commission approved in May. The decision enabled two companies, U.S. Wind Inc. and Skipjack Offshore Energy LLC, to collectively construct 368 megawatts of capacity.

The decision to approve both projects was a surprise to most stakeholders — including the companies themselves, which assumed they were competing against each other. U.S. Wind was allowed to build 61 turbines 12 to 15 miles offshore, roughly parallel to downtown Ocean City. Skipjack's 15-turbine project would be slightly farther to the north, in waters at the Maryland-Delaware border, 17 to 21 miles offshore.

Both companies are working within the parameters of Maryland legislation that passed in 2013, approving offshore wind energy. According to a Public Service Commission consultant, Maryland residential ratepayers would pay no more than $1.40 a month on their utility bills to help subsidize the projects.

By most accounts, Ocean City officials were not very active during the three-year debate that resulted in the enabling state legislation, and they weren't especially vocal during most of the PSC approval process. But then, just days before the PSC ruling, they saw an artist's rendering of what U.S. Wind's turbines would look like from the beach, and they freaked out.

Rep. Andy Harris (R-Md.) came quickly to their aid. Harris, who represents Ocean City in Congress, attached an amendment to a House appropriations bill in July that said if a Maryland wind project isn't at least 24 miles from the shore, the federal government cannot spend money to evaluate it (Energywire, July 27).

"Ocean City's economy heavily relies on its real estate and tourism sectors, and there has not yet been a proper examination on whether construction of these wind turbines will have a negative economic impact on the community," Harris said at the time. "If construction of these turbines too close to the shoreline will reduce property value or tourism, then the turbines may cause more issues than they solve."

In response, Paul Rich, U.S. Wind's director of project development, said Harris' amendment "effectively ... kills our project."

Coincidentally, Harris held a fundraiser in Ocean City on Wednesday evening at Seacrets, a popular entertainment complex on the bay.

'A very strong and emotional issue'

Whether Harris' amendment remains in the final omnibus spending package is anybody's guess. But it provides a marker for Ocean City officials.

"They are not opposed to wind power as one of several renewable energy sources, but they strongly and very solidly oppose the placement and location of the wind turbines, because by all accounts these turbines are clearly visible and noticeable on the beaches of Ocean City as well as when you're on any floors of buildings and hotels and condominiums," Bereano said.

Bereano noted that Dominion Energy is installing two wind turbines on an experimental basis off the coast of Virginia Beach — 27 miles from the shore.

"They need to be moved to a location where they are not visible at all from Ocean City," he said. "This is a very strong and emotional issue."

Bereano said he had been hired to monitor the "significant federal [approval] process" that follows the state PSC's approval of the two sites.

A spokesperson for the Bureau of Ocean Energy Management said this week that there is no timetable yet for the agency to review the two proposals.

In his long career as a lobbyist, Bereano has barely prowled the halls of Congress or federal regulatory agencies professionally. He conceded that he'd spoken to members of the Maryland congressional delegation "three or four times" on behalf of in-state clients who had issues before the federal government but had otherwise not spent much time on Capitol Hill beyond internships in the Senate when he was in college and law school.

Which is why Bereano's next moves will be watched closely by every stakeholder.

Bereano is a legendary and controversial figure in Annapolis. He was the first million-dollar earner among Maryland lobbyists, achieving that milestone in the early 1990s, and he was the second-highest-paid lobbyist in the state between Nov. 1, 2016, and April 30 of this year, clearing more than $1.6 million.

Bereano is also a convicted felon.

He was found guilty of federal mail fraud in 1994 after overbilling clients and funneling the money into a political action committee that provided campaign contributions to lawmakers. He served a 10-month sentence and was later barred from practicing law in Maryland and Washington, D.C. But that ban did not extend to lobbying activities, and Bereano kept his robust business going, even working for his clients from a pay phone outside his Baltimore halfway house.

Bereano remains an energetic and ubiquitous presence in the State House and at political events throughout the state. He is also an enthusiastic champion of Gov. Larry Hogan (R) — who appointed Bereano's son to a district court judgeship last year.

So if Bereano is determined to find a state lawmaker to introduce a bill benefiting one of his clients, he will. Whether he has the juice to slow or halt the development of offshore wind is another matter entirely, given the fact that the Public Service Commission has already acted.

Rich, the U.S. Wind executive, did not respond to a phone message yesterday.

State Delegate Dereck Davis (D), the chairman of the powerful House Economic Matters Committee in Annapolis, which has jurisdiction over energy issues, said yesterday that he had not yet discussed the matter with Bereano and hadn't heard of any pending legislation. Davis said he sympathized with the worries of Ocean City leaders but noted that "the state is committed to the clean energy industry."

Davis said he was open to addressing the town's concerns but wasn't sure what could be done legally. "We've got two very important interests here," he added.

Whether or not a bill is introduced in the Legislature next year, wind energy is popular in Maryland, and advocates believe they can defeat any measure in the State House to slow it.

A poll of 671 Maryland residents by Goucher College in Baltimore last month found that 75 percent of those surveyed said seeing wind turbines from the beach would make "no difference" when deciding whether to vacation in Ocean City. Eleven percent said seeing windmills on the horizon would make them "less likely" to vacation in Ocean City, while 12 percent said it would make them "more likely" to stay there.

"It's a big-tent coalition that got this moving in the first place, and it's only gotten bigger," said Tidwell of the Chesapeake Climate Action Network.

'Is this the right issue?'

But with an election year looming, there are several political cross-currents at play in Maryland.

Hogan is up for re-election, and while he's popular, he could be vulnerable if 2018 turns into a big Democratic year nationally.

Hogan has not said much publicly about the wind energy proposals off Ocean City, but he has been generally supportive of environmental measures — though not always to the satisfaction of green groups and Democrats. Hogan embraced a measure to ban hydraulic fracturing in Maryland earlier this year, and supporters note that Hogan's appointees on the PSC joined the appointees of former Gov. Martin O'Malley (D) to unanimously approve both wind projects. That included PSC Commissioner Anthony O'Donnell, a former state House minority leader who voted against O'Malley's wind energy legislation when he served in the General Assembly.

Some stakeholders said they wouldn't be surprised to see Republicans introduce legislation next session to alter the wind energy law — even if it doesn't have a prayer of passing — as a slap to O'Malley, who continues to lurk on the fringes of the 2020 Democratic presidential conversation.

One of Hogan's top priorities for 2018, beyond re-election, is to get more Republicans elected to the state Senate. Democrats hold a robust 33-14 majority in that chamber, and it takes 29 Senate votes to override a governor's veto. Republicans are working hard to pick up the five seats they'll need to forestall any veto overrides, and state Sen. Jim Mathias (D), who represents Ocean City, is one of their top targets. In 2014, Harris, the congressman, funneled tens of thousands of dollars into efforts to defeat Mathias in his conservative Eastern Shore district.

Mathias — an ex-Ocean City mayor and former small business owner there — voted to authorize the wind turbines in 2013, after opposing the measure the year before. He fully expects the wind turbines to be a major issue during his tough re-election fight against state Delegate Mary Beth Carozza (R), a former congressional aide and George W. Bush administration official.

Carozza, who did not respond to a request for comment yesterday, wasn't in the Legislature when the enabling legislation for offshore wind passed. But Republicans could introduce a bill on wind energy just to force another uncomfortable vote for Mathias and other Democrats seeking re-election in conservative districts.

"Is this going to become an issue against me?" Mathias mused. "Everything is going to become an issue. Is this the right issue?"

Real estate industry blocks sea-level warnings that could crimp profits on coastal properties

By Stuart Leavenworth

September 13, 2017 3:35 PM

NAGS HEAD, N.C. All along the coast of the southeast United States, the real estate industry confronts a hurricane. Not the kind that swirls in the Atlantic, but a storm of scientific information about sea-level rise that threatens the most lucrative, commission-boosting properties.

These studies warn that Florida, the Carolinas and other southeastern states face the nation’s fastest-growing rates of sea level rise and coastal erosion — as much as 3 feet by the year 2100, depending on how quickly Antarctic ice sheets melt. In a recent report, researchers for Zillow estimated that nearly 2 million U.S. homes could be literally underwater by 2100, if worst-case projections become reality.

This is not good news for people who market and build waterfront houses. But real estate lobbyists aren’t going down without a fight. Some are teaming up with climate change skeptics and small government advocates to block public release of sea-level rise predictions and ensure that coastal planning is not based on them.

“This is very concerning,” said Willo Kelly, who represents both the Outer Banks Home Builders Association and the Outer Banks Association of Realtors and led a six-year battle against state sea-level-rise mapping in North Carolina. “There’s a fear that some think tank is going to come in here and tell us what to do.”

The flooding and destruction caused by Hurricanes Irma and Harvey have again highlighted the risks of owning shoreline property. But coastal real estate development remains lucrative, and in recent months and years, the industry has successfully blocked coastal planning policies based on ever-higher oceans.

Last month, President Donald Trump rescinded an Obama-era executive order that required the federal government to account for climate change and sea level rise when building infrastructure, such as highways, levees and floodwalls. Trump’s move came after lobbying from the National Association of Home Builders, which called the Obama directive “an overreaching environmental rule that needlessly hurt housing affordability.”

Whose job is it to save the beach?

The Atlantic Ocean is eroding parts of North Topsail Beach by about five feet per year. The town of 800 residents is running out of cash and solutions in its efforts to protect its north shore. Whose job is it to save this popular North Carolina tourist destination?

In North Carolina, Kelly teamed up with homebuilders and Realtors to pass state legislation in 2012 that prevented coastal planners from basing policies on a benchmark of a 39-inch sea-level rise by 2100.

The legislation, authored by Republican Rep. Pat McElraft, a coastal Realtor, banned the state from using scientific projections of future sea level rise for a period of four years. It resulted in the state later adopting a 30-year forecast, which projects the sea rising a mere 8 inches.

Stan Riggs, a geologist who served on the North Carolina science panel that recommended the 39-inch benchmark, said the 2012 legislation was a blow for long-term coastal planning.

“The state is completely not dealing with this,” said Riggs, a professor of geology at East Carolina University and author of The Battle for North Carolina’s Coast. “They are approaching climate change with sand bags and pumping sand onto beaches, which is just a short-term answer.”

Todd Miller, executive director of the North Carolina Coastal Federation, agrees the state is not doing enough to prepare for climate change. But he says the power play by builders and real estate agents may have backfired, drawing national attention — including being spoofed by Stephen Colbert — to an otherwise obscure policy document.

“The controversy did more to educate people about climate issues than if the report had just been quietly released and kept on the shelves,” said Miller, who heads an environmental organization of 15,000 members.

In Texas, a similar attempt to sideline climate change science also triggered blowback. In 2011, the Texas Commission on Environmental Quality, then under the administration of Gov. Rick Perry, attempted to remove references to climate in a chapter in “State of the Bay,” a report on the ecological health of Galveston Bay.

The chapter, written by Rice University oceanographer John B. Anderson, analyzed the expected impacts of sea-level rise and described rising seas as “one of the main impacts of global climate change.” When TCEQ officials attempted to edit out such references, Anderson and other scientists objected.

“The whole story went viral,” he said in a recent telephone interview. “Then they backed off.”

Since that time, Texas officials haven’t interfered in other scientific reports, said Anderson, but neither have they consulted with academics on how to manage rising seas, erosion and the prospect of stronger storms.

“Texas is pretty much in a state of climate change denial,” Anderson said. “There’s very little outreach to the research community to address the challenges we face.”

Anderson’s lament is one shared by other scientists in the Southeast, where Republicans control nearly all state houses and are generally dismissive of the scientific consensus on climate change.

In Florida, where Republican Rick Scott is governor, state environmental employees have been told not to use terms such as “climate change” or “global warming” in official communications, according to the Florida Center for Investigative Reporting.

In South Carolina, the state Department of Natural Resources in 2013 was accused of keeping secret a draft report on climate change impacts. In Texas, the 2016 platform of the state Republican Party states that climate change “is a political agenda promoted to control every aspect of our lives.”

Anderson said it's surprising that sea level rise is sparking controversy because, in his view, it is the least-contested aspect of climate change science.

It’s well accepted, he said, that global temperatures are rising, and that as the oceans warm, the seawater expands, a process called “thermal expansion.” Melting glaciers and ice sheets also contribute to rising sea levels, as does coastal subsidence caused by natural forces and manmade activities, such as excessive groundwater extraction.

According to the National Academy of Sciences, sea level has risen along most of the U.S. coast the past 50 years, with some areas along the Atlantic and Gulf coasts seeing increases of more than 8 inches. By 2100, average sea levels are expected to rise 1.6 to 3.3 feet, with some studies showing 6 feet of rise, according to the NAS.

Last year, Zillow matched up its database of 110 million homes nationwide with maps prepared by the National Oceanic and Atmospheric Administration showing a projected sea level rise of 6 feet by the end of the century.

Zillow found that, without efforts to mitigate against sea level rise — such as building floodwalls or elevating structures — some 1.9 million homes were at risk, worth $882 billion. This included 934,000 houses in Florida, 83,000 in South Carolina, 57,000 in North Carolina and 46,800 in Texas.
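For readers curious how such an estimate is assembled, the matching step described above is essentially a spatial join between a table of home locations and the projected inundation polygons. The sketch below shows one minimal way to do that in Python with geopandas; the file names, column name and data are illustrative assumptions, not Zillow's or NOAA's actual datasets.

    # Minimal sketch of the spatial-join step described above: count homes whose
    # locations fall inside projected 6-foot inundation polygons. File and column
    # names are illustrative assumptions, not the actual Zillow or NOAA products.
    import geopandas as gpd

    homes = gpd.read_file("homes.geojson")              # one point and value per home
    flood_zone = gpd.read_file("noaa_slr_6ft.geojson")  # projected 6-ft inundation polygons

    homes = homes.to_crs(flood_zone.crs)                # align coordinate reference systems

    at_risk = gpd.sjoin(homes, flood_zone, how="inner", predicate="intersects")

    print(len(at_risk), "homes at risk")
    print("combined value: $", at_risk["home_value"].sum())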

“This isn’t just an academic issue,” said Svenja Gudell, chief economist for Zillow, noting that some climate skeptics pushed back against the report, but public response was largely positive. “This can have a really big impact on people and their homes and livelihoods.”

With more than 3,600 miles of coastline, tidal shoreline and white-sand beaches, Eastern North Carolina is particularly vulnerable to sea-level rise. It also supports a burgeoning real estate industry, serving retirees and vacationers who want to be near the water.

In 2010, a group called NC-20 — named for the 20 coastal counties of North Carolina — was formed to advocate for this industry and others in the region. It originally protested new state stormwater rules. Then it took aim at projections of a one-meter sea level rise, calling it “a myth promoted by man-made global warming advocates on expectations of melting ice.”

Kelly, the group’s current president, said NC-20 was alarmed by a 2010 state document that said the 39-inch benchmark should be used “for land use planning and to assist in designing development and conservation projects.” The coalition also feared it could lead to a state website where prospective home buyers could research future water levels.

“That is nothing more than a SimCity video game,” Kelly fumed during a recent interview at the Outer Banks Home Builders Association office near Nags Head. “It all depends on the inputs you put into it.”

With little state guidance on climate change and flooding threats, some local governments are taking matters into their own hands. In Florida, the Tampa Bay Regional Planning Council produced a report this year on the multi-billion-dollar impacts of a projected 3-foot sea level rise by 2060.

In Eastern North Carolina, geologist Riggs resigned from the N.C. Coastal Resources Commission’s science panel in 2016, citing legislative interference. He has since teamed up with local governments on the Pamlico and Albemarle Sounds to address problems of flooding, windblown tides and saltwater intrusion, a threat to local farming.

John Trent, chair of the Bertie County Commission, said that he has been collaborating with Riggs on a range of projects to prevent flooding in Windsor, the county seat. Three severe floods have hit Windsor since 1999, and Trent says he is wary about what sea-level rise could bring over the long term.

“Absolutely it concerns us,” said Trent, a transplant from Florida who moved to the county in 2000. “We are the lowest of the low in this region.”

Further east, the Hyde County town of Swan Quarter has built a 17-mile dike around homes and farms to protect $62 million in flood-threatened property. The dike helped prevent windblown flooding during recent storms, but county officials have some concerns about the future.

“In Hyde County, you’d likely get a consensus that sea-level rise is real and that it is happening,” said Daniel Brinn, a resource specialist with the Hyde County Soil and Water District. “You might get an argument on why it is happening, but that is about it.”

Anderson, the Rice University oceanographer, said that coastal homeowners nationwide should pay attention to sea-level rise projections, even if they live in a home that has never flooded before.

During Hurricane Harvey, Anderson’s home in Houston was inundated with roughly a foot of water, the first time that had ever happened. “Given what I’ve been through in recent weeks, I am acutely aware of how important one foot can be,” he said.

Corrected: In an earlier version of this story, the final paragraph referenced the wrong hurricane.

Stuart Leavenworth: 202-383-6070, @sleavenworth

Read more here: http://www.mcclatchydc.com/news/nation-world/national/article173114221.html

====================================================================

Renewable diesel use in California moves to fast track

By Isha Salian

September 14, 2017 Updated: September 14, 2017 8:03am

Photo: Amtrak machinist Darrion Brown takes a sample of renewable diesel from the fuel tank on a Capitol Corridor locomotive to be tested. (Lea Suzuki, The Chronicle)

Renewable diesel sounds like a contradiction in terms. But planners for the Capitol Corridor trains, which run between the Bay Area and the Sacramento region, see it as a way to slash climate-warming emissions.

“It’s pretty exciting for our industry,” said Jim Allison, manager of planning for the Capitol Corridor Joint Powers Authority. On Aug. 28, a train between Oakland and Auburn began running entirely on the fuel from Golden Gate Petroleum of Martinez — part of a test that, if successful, could herald its use throughout the Capitol Corridor system and in trains statewide.

The Bay Area is rapidly becoming a center for renewable diesel, which can be made from vegetable oils, restaurants’ oil waste or animal fats — material known in the industry as biomass. Unlike biodiesel, which has to be mixed in with petroleum diesel, the new fuel can go into a tank at 100 percent strength. The cities of San Francisco and Oakland and the San Jose Unified School District have begun using it in their vehicle fleets, and operators of San Francisco Bay ferries are looking at it too.

“We can sell every drop of renewable diesel we make,” said Eric Bowen, head of corporate business development for the Renewable Energy Group. The company sells the fuel mostly to the California market, he said. Its Louisiana plant can produce 75 million gallons of renewable diesel each year, and it is looking to expand its facilities.

Demand for renewable diesel has soared. In 2011, 2 million gallons of the fuel were used in California, according to the Air Resources Board, which sets rules for emissions and climate change in the state. In 2016, it was 250 million gallons — about 7 percent of all liquid diesel used.

California’s interest comes largely because of a policy called the Low Carbon Fuel Standard, which requires a 10 percent reduction in the carbon intensity of transportation fuels by 2020. Simon Mui, senior scientist at the Natural Resources Defense Council, said that depending on what the renewable diesel is made from, it can cut lifecycle greenhouse gas emissions by 15 to 80 percent. (Lifecycle emissions include growing the raw materials that turn into fuel; transporting and processing it; and finally burning it.) Renewable diesel also reduces fine particle pollution and other types of emissions, like nitrogen oxides and carbon monoxide, Mui said.
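As a rough illustration of how a lower-carbon fuel moves the whole diesel pool toward the Low Carbon Fuel Standard target, the sketch below blends an assumed renewable-diesel pathway into the pool at roughly the 2016 share cited above. The carbon-intensity values are illustrative assumptions, not official Air Resources Board pathway numbers.

    # Back-of-the-envelope sketch: blending a lower-carbon renewable diesel into
    # the diesel pool lowers the pool's average lifecycle carbon intensity.
    # The intensity values are assumptions for illustration only.
    petroleum_ci = 100.0    # petroleum diesel lifecycle carbon intensity (indexed to 100)
    renewable_ci = 40.0     # assumed pathway cutting lifecycle emissions by 60%
    renewable_share = 0.07  # roughly the 2016 share of diesel cited above

    pool_ci = (1 - renewable_share) * petroleum_ci + renewable_share * renewable_ci
    reduction_pct = 100 * (petroleum_ci - pool_ci) / petroleum_ci
    print(f"average pool carbon intensity: {pool_ci:.1f}")             # ~95.8
    print(f"reduction vs. petroleum-only pool: {reduction_pct:.1f}%")  # ~4.2%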

Some experts question how much the renewable diesel market can grow.

“The more we study biofuels, the less clear it is there’s very much of a sustainable supply,” said Daniel Kammen, a UC Berkeley professor who helped write California’s Low Carbon Fuel Standard. Biofuels are always less efficient than electric vehicles, he said.

Kammen studies carbon emissions using a lifecycle analysis, analyzing the ecological impact of the entire process. From that perspective, “it really doesn’t matter the source of your material,” he said. “Getting large supplies of a truly sustainable biomass is a challenge.”

Proponents of renewable diesel point out that while passenger vehicles (which mostly run on gasoline, not diesel) seem likely to go electric in the near future, a more sustainable diesel may be a good option to reduce emissions for larger, heavier vehicles — at least until more powerful batteries are developed to make electric trucks and buses cost-effective.

“We don’t meet our state goals unless we have a full array of those electrified fleets as well as liquid-fuel-based fleets,” Mui said. “The key here is enabling all of these technologies to be on a level playing field, so a winner or winners can be determined by the market.”

Sam Wade, chief of the transportation fuels branch at the Air Resources Board, said that while heavy-duty vehicles may eventually switch to electric power, “there’s not that many applications that are fully viable today. We see (renewable diesel) as a nice near-term opportunity.”

Neste, a Finnish company that is the world’s leading producer of renewable diesel, makes between 800 million and 900 million gallons each year. The company said a major limit to growth is competing with the traditional diesel industry. “They’ve got an 80-year head start on us,” head of North American public affairs Dayne Delahoussaye said.

Not every company is able to build renewable diesel into a successful business. South San Francisco nutrition company TerraVia, formerly called Solazyme, struck an agreement in 2015 with UPS to supply its trucks with renewable diesel derived from algae oil. But the company, which sold its assets in a bankruptcy sale this week, had shifted away from renewable fuels partly, it said, due to a decline in crude oil prices. (UPS continues to use renewable diesel in some of its trucks, mostly in California.)

Propel Fuels, a Sacramento company that operates 32 retail locations selling renewable diesel, says it keeps its prices competitive with petroleum diesel. “We don’t think a market exists for premium-price renewable diesel,” said CEO Rob Elam.

Elam said Propel has a large presence in disadvantaged communities, serving customers who want to choose cleaner fuels but cannot afford electric vehicles. Other renewable diesel customers pick the fuel for higher power and mileage, he said.

Richmond resident James Clappier has been using renewable diesel from Propel Fuels in his 1994 Dodge Ram 2500 since the company began stocking the fuel in 2015. “I need a large truck for my business, but feel bad for using such an inefficient vehicle,” he said.

Capitol Corridor trains have been testing diesel alternatives for a few years, and its switch last month to 100 percent renewable diesel for the Oakland-Auburn run makes it the first train in the state to run on the fuel, Allison said. If further tests go well, all Capitol Corridor trains could switch over to renewable diesel by next summer, according to Dean Shepherd, manager of mechanical services.

Allison estimates that shifting to renewable diesel in all of its locomotives, which each need about 70 gallons of fuel per hour, would reduce the trains’ greenhouse gas emissions by two-thirds. He is optimistic that testing will go well.
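To give a sense of scale, the sketch below turns the 70-gallons-per-hour figure into an annual CO2 estimate for a single locomotive. The annual operating hours and the CO2 factor for diesel combustion are assumptions added for illustration; only the fuel-burn rate and the two-thirds reduction come from the article.

    # Rough scale check for one locomotive. Operating hours and the CO2 factor
    # are illustrative assumptions, not Capitol Corridor figures.
    gal_per_hour = 70          # fuel burn cited in the article
    hours_per_year = 3000      # assumed annual operating hours
    kg_co2_per_gallon = 10.2   # approximate combustion CO2 for petroleum diesel

    baseline_t = gal_per_hour * hours_per_year * kg_co2_per_gallon / 1000
    avoided_t = baseline_t * 2 / 3
    print(f"baseline: ~{baseline_t:,.0f} t CO2/yr; avoided at a two-thirds cut: ~{avoided_t:,.0f} t")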

“If there were any issues, we’d see them quickly,” he said.

Isha Salian is a San Francisco Chronicle staff writer. Email: business@sfchronicle.com Twitter: @Salian_Isha

==========================================================

Cost of U.S. Solar Drops 75 percent in Six Years, Ahead of Federal Goal

A 250-MW solar project on the Moapa Band of Paiutes' Moapa River Indian Reservation in southern Nevada, the first utility-scale solar project on tribal land in the U.S. (Photo: First Solar/DOE)

The Trump administration has announced that a federal goal to slash the cost of utility-scale solar energy to 6 cents per kilowatt-hour by 2020 has been met early. The goal, set by the Obama administration in 2011 and known as the SunShot Initiative, represents a 75 percent reduction in the cost of U.S. solar in just six years. It makes solar energy cost-competitive with electricity generated by fossil fuels.
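A quick arithmetic check: if 6 cents per kilowatt-hour represents a 75 percent reduction, the implied 2011 baseline was roughly 24 cents per kilowatt-hour.

    # One-line check of the implied 2011 baseline cost.
    target = 0.06        # $/kWh, the SunShot goal reached in 2017
    reduction = 0.75
    print(f"implied 2011 baseline: ${target / (1 - reduction):.2f}/kWh")  # -> $0.24/kWh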

The Department of Energy attributed achieving the goal so quickly to the rapidly declining cost of solar hardware, such as photovoltaic panels and mounts. And it said it will next focus its efforts on addressing “solar energy’s critical challenges of grid reliability, resilience, and storage,” according to a press release.

The DOE also announced $82 million in new funding for solar research, particularly for research into “concentrating solar” — which uses mirrors to direct sunlight to generate thermal energy — and into improved grid technology. It set a new goal to reduce the cost of solar even further: 3 cents per kilowatt-hour by 2030.

==========================================================

Geothermal energy: Why hasn't it caught on yet?

Despite being one of the lowest-cost and most reliable renewable energy sources, heat from the Earth is barely harnessed outside Iceland. But leaders meeting in Italy this week are trying to change that.

One of the most famous tourist sites in Iceland is the Blue Lagoon, a man-made lake close to Reykjavík Airport that is fed and heated by a nearby geothermal power plant.

Such power plants are common across Iceland, but little-known in the rest of the world. For many of the swimmers in the lagoon, it is the first time they have ever heard of this power source.

Political leaders from 25 countries gathered at a sumptuous palace in Florence, Italy, this week are hoping to change that.

Yesterday, government ministers and 29 partner institutions from the private sector signed the Florence Declaration, committing to a 500-percent increase in global installed capacity for geothermal power generation by 2030.

Although that may sound like a lot, it's starting from a low baseline. Geothermal energy today accounts for just 0.3 percent of globally installed renewable energy capacity.

This is despite its huge potential - for both lowering greenhouse gas emissions and saving money. Geothermal is one of the lowest-cost energy sources available, after startup costs are met. The global potential for geothermal is estimated to be around 200 gigawatts.

"Geothermal's vast potential is currently untapped," said Italian Environment Minister Gian Luca Galleti at the Florence summit. "We must develop new technologies and encourage new investments to ensure we cover this gap."

The summit was organized by the Global Geothermal Alliance, which was launched at the United Nations climate summit in Paris in 2015.

Run by the International Renewable Energy Agency (IRENA), it is bringing governments and companies together to try to speed up deployment. But significant hurdles remain.

To tap the heat of the hot rock and molten magma beneath the Earth's surface, water is pumped down an injection well. It then filters through cracks in the hot rock, heats up, and returns under pressure through the "recovery well" in the form of steam. That steam is captured and used to drive electric generators or heat homes.
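The electrical output of such a plant scales with the steam flow returning from the recovery well, the usable enthalpy drop across the turbine and the conversion efficiency. The numbers below are assumptions chosen purely to illustrate the order of magnitude, not data for any particular plant.

    # Order-of-magnitude sketch of the steam-to-electricity step described above.
    # All values are illustrative assumptions.
    steam_flow_kg_s = 40.0        # steam mass flow from the recovery well
    enthalpy_drop_kj_kg = 600.0   # usable enthalpy drop across the turbine
    efficiency = 0.80             # combined turbine/generator efficiency

    power_mw = steam_flow_kg_s * enthalpy_drop_kj_kg * efficiency / 1000
    print(f"electrical output: ~{power_mw:.0f} MW")   # -> ~19 MW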

The ring of fire

The two main hurdles have been geographic and financial.

Italy wanted to host this week's summit because it is keen to increase its use of geothermal energy. Delegates were able to tour Italy's first-ever geothermal energy production plant in Lardarello, not far from Florence.

Italy has had the historic misfortune of being situated above some very hot earth - resulting from tectonic activity that causes earthquakes and volcanic eruptions. But that heat underground can also be harnessed to generate power.

Across the world, 90 countries possess proven geothermal resources with the potential to be harnessed, and they are mostly located in regions of tectonic activity. That means that the potential is low in most of Europe, but huge in the Asia-Pacific region.

Yet capital for funding projects in this region has been hard to come by, especially for projects at this scale.

"Right now, we may only be harvesting 6 percent of proven geothermal energy potential," said IRENA Director-General Adnan Z. Amin.

He called this week's Florence Declaration "a milestone that, in the strongest possible terms, demonstrates renewed will to unlock the potential of geothermal."

More money, more transparency

Following the signing of yesterday's declaration, IRENA released a new report, which found that access to capital for surface exploration and drilling remains the main barrier to geothermal development.

The report also found that more transparent government regulations that avoid delays are needed to provide a stable environment for developers and investors.

Representatives of African Union countries, as well as the AU's commissioner for infrastructure and energy, Amani Abou-Zeid, were at the Florence summit pledging to provide this transparency. Abou-Zeid said the technology can help Africa decarbonize, while also providing jobs.

"Geothermal energy is emerging as a hidden gem of Africa's renewable energy resources and we must work together, across nations, to ensure this resource achieves its potential."

One country in which investment commitments are not lacking is Indonesia, which is planning 4,013 megawatts of additional capacity in the coming years. This puts it far ahead of all other countries. The United States, Turkey and Kenya follow, with a little over 1,000 megawatts of additional capacity each planned.

Amin says such government commitments can encourage private investment in developing these energy sources, which is capital-intensive at the start. "If we can identify and implement mechanisms that deliver a greater level of certainty to investors and developers, then we will move beyond meaningful dialogue to decisive action," he said.

If the countries gathered in Florence maintain their commitments, sites such as the Blue Lagoon may not be unique to Iceland any more.

============================================================================

Study: Food scrap recycling can work in cities of any size, though PAYT helps

By Cole Rosengren (@ColeRosengren)

Published Sept. 11, 2017

The financial and logistical challenges of starting a food scrap diversion program can seem daunting for smaller cities. A newly published study from MIT shows that no one characteristic is a prerequisite for taking the leap.

The study, published in the October 2017 edition of the journal Resources, Conservation and Recycling, was written by a team of three researchers from MIT's Department of Urban Studies and Planning and Department of Materials Science and Engineering. They set out to understand where food scrap diversion programs were happening in mid-size and large cities. Their results were based on 115 responses from cities with between 100,000 and 1 million people — about 28% of the U.S. population.

The researchers had seen a growth in awareness about food scrap programs (FSP), but noted that little research had been done on what type of cities were pursuing them and whether this had any similarities to the growth of curbside recycling in the 1990s.

"The spread of recycling offers an analogue to current trends in municipal food scrap diversion programs. Today, most U.S. municipalities offer some kind of recycling infrastructure, whether through curbside collection or drop-off facilities. Food scrap recycling, on the other hand, is still in its early days," they wrote.

Where it's happening

Based on the survey, conducted in 2015, the MIT researchers found that 40% of respondents had some type of existing FSP. The survey defined an FSP as "educational programs, including home visits by composting experts; free or discounted barrels for home composting; free or discounted bins for food scrap collection in the home; drop-off facilities; and curbside collection."

Among these respondents, 36% had educational programs. Free or discounted backyard bins were the second most common at 19%, and 18% of cities had curbside collection. Some respondents also had more than one type of FSP in place. The researchers chose to keep the identities of these cities anonymous, though they did offer breakdowns by size and EPA region within their research.

"Within these cities, this particular size range, it really seems like being on the smaller end or being on the larger end doesn’t make a huge difference," Lily Baum Pollans, one of the paper's co-authors who is now an assistant professor of urban policy and planning at Hunter College. "That was a little bit of a surprise for us."

In terms of which cities had any type of FSP, population size and socioeconomic status weren't found to be significant drivers. Out of various factors that have been found to correlate with successful recycling programs in previous research — median income, educational attainment, density, age, and housing characteristics — only educational attainment was found to have any connection.

As for curbside collection, cities with higher population densities were more likely to offer the service in some form.

Why it's happening

While no one type of city was a clear fit for having these programs, the MIT team did find that certain waste policies played a role.

Cities that could build on existing infrastructure and already had waste reduction policies in place, such as pay-as-you-throw (PAYT) programs, were more likely to have FSPs of some kind. The existence of yard waste collection programs and PAYT together correlated with other FSPs being in place as well.

Looking at cities with curbside programs, the researchers found that PAYT was also a factor. Yard waste collection wasn't as relevant because it is often seasonal. Cities with source-separated recycling were less likely to have curbside organics, possibly because if they hadn't followed the more widespread single-stream trend they wouldn't be prioritizing organics collection yet. Interestingly, the presence of a municipal diversion rate goal wasn't found to be a major factor.
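The associations the researchers describe are the kind that can be checked with a simple contingency-table test on the survey responses. The sketch below uses invented counts, not the MIT data, just to show the shape of such a test.

    # Illustrative association test: do cities with PAYT more often have food
    # scrap programs? The 2x2 counts are invented for illustration, not the
    # MIT survey data.
    from scipy.stats import chi2_contingency

    #                  has FSP   no FSP
    table = [[30, 10],   # cities with PAYT
             [16, 59]]   # cities without PAYT

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.1f}, p = {p_value:.4f}")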

In the cities that offered curbside programs, the researchers didn't delve into whether they were being serviced by municipal or private collection.

“From more general research we haven’t actually observed much of a pattern," said Pollans. “What really matters is that there’s a public sector agency taking leadership around the issue."

Next steps

According to this study, making food scrap recycling a priority is possible for any city, regardless of its location or condition.

"There is no magic here: regardless of most socio-economic characteristics, any city could move in this direction if it is willing to make the commitment," concluded the authors.

As noted in the study, and during Waste Dive's conversation with Pollans, determining the success of these programs is a different question. The fact that many of the mid-size respondents only had educational programs could be taken as a sign that they're not doing enough. And just because a city has curbside collection doesn't mean residents are participating fully or correctly. Further research is also underway on the politics and policies behind starting composting programs and broader waste management programs in general.

“We have a lot more work to do," said Pollans.

When factoring in larger cities not included in this study, the number offering curbside collection becomes much larger. BioCycle's latest national survey identified 198 communities with curbside service in 19 states. A new edition of the survey is currently being finalized and may show that number has grown. So far this year, cities such as Falls Church, VA and Boise, ID have launched new programs. As existing programs expand in larger cities such as New York, Los Angeles and Austin, more people are gaining access on a regular basis.

Opinions may differ on the most cost-effective way to address food waste — and state or local regulations are still big drivers for where programs are happening — but it's clear that awareness is growing at a municipal level around the country.

More information: http://news.mit.edu/2017/study-food-waste-recycling-policy-key-0817

Garbage From Irma Will Fuel Florida’s Power Grid

As long as they’re throwing stuff away, many counties find, they may as well make electricity out of it.

By Eric Roston

September 18, 2017, 4:00 AM EDT

When it comes to garbage, geography is destiny.

Look at Texas and Florida, recovering from Hurricanes Harvey and Irma. Homeowners and businesses not incapacitated by the storm have begun the arduous and emotional work of separating destroyed possessions and materials by type and placing them curbside. Cities have begun the intimidating logistics of picking it up and transporting it to its final destination.

And what is that destination? Texas’s waste-disposal strategy takes advantage of the state’s vast land. Harris County alone, which includes Houston, has 14 active landfills.

Florida, by contrast, is a peninsula with a longer coastline than any state other than Alaska, and much less room for trash. Many coastal Florida counties burn theirs, with waste incinerators particularly common around the state’s populous southern lip and up the Gulf Coast. It’s a two-fer. Combustion reduces the solid waste to ash, and the heat that’s produced runs steam generators. Much of the waste left in Irma’s path will burn, the energy released adding to local communities’ electricity.

Florida burns a disproportionate amount of U.S. trash. Ten “waste-to-energy” plants turned 4.5 million tons of trash into 3.5 million megawatt-hours statewide in 2016. That’s about 2 percent of the state’s overall power, and a large majority of its renewable energy. Burning trash makes up less than 0.5 percent of overall U.S. electricity production, according to the Energy Information Administration.
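Dividing the statewide figures gives the implied yield per ton of trash burned:

    # Implied yield from the 2016 statewide figures above.
    tons_burned = 4_500_000
    mwh_generated = 3_500_000
    print(f"~{mwh_generated / tons_burned:.2f} MWh per ton of trash")  # ~0.78 MWh/ton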

But the main point is to make stuff disappear. Incineration reduces the solid mass of trash by up to 90 percent. The leftover ash can then be more efficiently dumped in a landfill where space is precious. In 2016, Florida burned 12 percent of its trash, recycled 44 percent and sent another 44 percent to landfills. With all that, Florida has historically been an exporter of garbage to other states, the 10th largest in a 2007 study by the Congressional Research Service.

Waste-fueled power plants were built mostly in the 1980s and early 1990s, encouraged by a 1978 federal law. Environmental scrutiny in later years led to widespread retrofits of pollution-control technologies to remove mercury and dioxin.

Burning stuff up doesn’t make it entirely disappear, even once the ash is disposed of. Part of the mass of the original garbage is converted into carbon dioxide and released into the atmosphere. But this pollution may beat the alternative. If the same volume of waste were tossed into landfills, eventual emissions of methane, a more powerful greenhouse gas, would be even worse for the atmosphere.

That makes garbage an attractive, if marginal, alternative to fossil fuels in some areas. The Intergovernmental Panel on Climate Change, the authoritative scientific group backed by the United Nations, supported waste-to-energy plants as a low-carbon technology back in 2007. Before its next report, in 2014, costs for plants had fallen 15 percent globally.

Before Irma hit, Florida Department of Environmental Protection staff worked with local governments and facilities to anoint disaster-debris sites, sort of a purgatory for trash before it’s moved to incinerators. (After storms, the DEP coordinates with multiple state and federal agencies, including the U.S. Army Corps of Engineers, FEMA, and the Florida Fish and Wildlife Conservation Commission.) Fuel for these incinerator power plants stood at high levels before Irma struck. In anticipation of the hurricane, Miami residents, for example, had doubled the amount of stuff they threw out in the days before it arrived. Already, some county authorities are seeing a spike in solid waste.

“We’ve seen about a 20 percent increase,” said Kimberly Byer, solid waste director for Hillsborough County, which includes Tampa. “That’s just an initial increase, and it’s only been a couple of days.”

The county’s 565,000 tons of trash a year produces about 45 megawatts of power, or enough to run about 30,000 homes. “It pays for itself,” Byer said of Hillsborough’s waste-to-energy facility.
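As a rough consistency check on those figures (assuming the 45 megawatts is produced more or less around the clock):

    # Rough consistency check, assuming near-continuous output.
    mw = 45
    homes = 30_000
    mwh_per_year = mw * 8760                       # ~394,000 MWh per year
    print(f"~{mwh_per_year / homes:.1f} MWh per home per year")  # ~13 MWh, roughly in line
    # with typical U.S. household electricity use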

Hurricane or no hurricane, people are comfortable with garbage.

U.S. coastal growth continues despite lessons of past storms

Sun., Sept. 17, 2017, 10:56 a.m.

By Jeff Donn for Associated Press

Rising sea levels and fierce storms have failed to stop relentless population growth along U.S. coasts in recent years, a new Associated Press analysis shows. The latest punishing hurricanes scored bull’s-eyes on two of the country’s fastest growing regions: coastal Texas around Houston and resort areas of southwest Florida.

Nothing seems to curb America’s appetite for life near the sea, especially in the warmer climates of the South. Coastal development destroys natural barriers such as islands and wetlands, promotes erosion and flooding, and positions more buildings and people in the path of future destruction, according to researchers and policy advisers who study hurricanes.

“History gives us a lesson, but we don’t always learn from it,” said Graham Tobin, a disaster researcher at the University of South Florida in Tampa. That city took a glancing hit from Hurricane Irma – one of the most intense U.S. hurricanes in years – but suffered less flooding and damage than some other parts of the state.

In 2005, coastal communities took heed of more than 1,800 deaths and $108 billion in damages from Hurricane Katrina, one of the worst disasters in U.S. history. Images of New Orleans under water elicited solemn resolutions that such a thing should never happen again – until Superstorm Sandy inundated lower Manhattan in 2012. Last year, Hurricane Matthew spread more deaths, flooding and blackouts across Florida, Georgia and the Carolinas. From 2010 to 2016, major hurricanes and tropical storms were blamed for more than 280 deaths and $100 billion in damages, according to data from the federal National Centers for Environmental Information.

Harvey, another historically big hurricane, flooded sections of Houston in recent weeks. Four counties around Houston, where growth has been buoyed by the oil business, took the full force of the storm. The population of those counties expanded by 12 percent from 2010 to 2016, to a total of 5.3 million people, the AP analysis shows.

During the same years, two of Florida’s fastest-growing coastline counties – retirement-friendly Lee and Manatee, both south of Tampa – welcomed 16 percent more people. That area took a second direct hit from Irma after it made first landfall in the Florida Keys, where damage was far more devastating.

Overall growth of 10 percent in Texas Gulf counties and 9 percent along Florida’s coasts during the same period was surpassed only by South Carolina. Its seaside population, led by the Myrtle Beach area of Horry County, ballooned by more than 13 percent.

Nationally, coastline counties grew an average of 5.6 percent since 2010, while inland counties gained just 4 percent. This recent trend tracks with decades of development along U.S. coasts. Between 1960 and 2008, the national coastline population rose by 84 percent, compared with 64 percent inland, according to the Census Bureau.

Cindy Gerstner, a retiree from the inland mountains of upstate New York, moved to a new home in January in Dunedin, Florida, west of Tampa. The ranch house sits on a flood plain three blocks from a sound off the Gulf of Mexico. She was told it hadn’t flooded in 20 years – and she wasn’t worried anyway.

“I never gave it a thought,” she said during a visit back to New York as Irma raked Florida. “I always wanted to live down there. I always thought people who lived in California on earthquake faults were foolish.”

Her enthusiasm for her new home was undiminished by Irma, which broke her fence and knocked out power but left her house dry.

In Horry County, where 19 percent growth has led all of South Carolina's coastline counties, Irma caused only minor coastal flooding. The county's low property taxes are made possible by rapid development and tourism fees, allowing retirees from the North and Midwest to live more cheaply. Ironically, punishing hurricanes farther south in recent years have pushed some Northerners known locally as “half-backers” to return halfway home from Florida and resettle in coastal South Carolina.

Add the area’s moderate weather, appealing golf courses, and long white strands – the county is home to Myrtle Beach – and maybe no one can slow development there. “I don’t see how you do it,” said Johnny Vaught, vice chairman of the county council. “The only thing you can do is modulate it, so developments are well designed.”

Strong building codes with elevation and drainage requirements, careful emergency preparations, and a good network of roads for evacuation help make the area more resilient to big storms, said the council chairman, Mark Lazarus. Such measures give people “a sense of comfort,” said Laura Crowther, CEO of the local Coastal Carolina Association of Realtors.

Risk researchers say more is needed. “We’re getting better at emergency response,” said Tobin at the University of South Florida. “We’re not so good at long-term control of urban development in hazardous areas.”

The Federal Emergency Management Agency helps recovery efforts with community relief and flood insurance payments. The agency did not immediately respond to a request for comment. It provides community grants for projects aimed at avoiding future losses. Some projects elevate properties, build flood barriers, or strengthen roofs and windows against high winds. Others purchase properties subject to repeated damage and allow owners to move.

But coastline communities face more storm threats in the future.

Global warming from human-generated greenhouse gases is melting polar ice and elevating sea levels at an increasing pace, according to the National Oceanic and Atmospheric Administration. That amplifies storm surges and other flooding. Also, some climate models used by scientists predict stronger, more frequent hurricanes as another effect of global warming in coming decades.

“There will be some real challenges for coastal towns,” predicted Jamie Kruse, director of the Center for Natural Hazards Research at East Carolina University in Greenville, North Carolina. “We’ll see some of these homes that are part of their tax base becoming unlivable.”

Hazard researchers said they see nothing in the near term to reverse the trend toward bigger storm losses. As a stopgap, communities should cease building new high-rises on the oceanfront, said Robert Young, director of the Program for the Study of Developed Shorelines at Western Carolina University in Cullowhee, North Carolina.

He said big changes probably will not happen unless multiple giant storms overwhelm federal and state budgets.

“The reason why this development still continues is that people are making money doing it,” he said. “Communities are still increasing their tax base – and that’s what politicians like.”

Religious communities are taking on climate change

Churches that have long played a role in social justice are stepping up.

Sarah Tory Sept. 18, 2017

Before Pastor Jim Therrien, 49, moved to New Mexico, he rarely thought about environmental issues. Back in Kansas, where he was born and raised, the grass outside his home was always green, and though the state had an active oil industry, companies fenced off well sites properly and promptly cleaned up spills. But then he and his family saw the impacts of energy development on the Southwestern landscape and their new church community. Therrien began to think about the connection between the local environment and the broader issue of climate change.

Every day, Therrien, a blond, ruddy and tattooed man of Irish descent, looked out his window and saw a dry land getting drier. Residents told him that winters used to be much colder and snowier. The hotter temperatures thickened the methane haze, and oil and gas traffic tore up the dirt roads. Therrien started to see these problems as injustices that conflicted with Christian values. So he decided to take a stand. Churches have long played a crucial role in social movements, from the civil rights era to immigration reform. Why not environmental activism?

“I don’t ever consider myself an environmentalist,” he told me one afternoon at the Lybrook Community Ministries, on a remote stretch of Highway 550, between the Navajo Nation and the Jicarilla Apache Reservation. “I’m more of a people person.”

Therrien’s congregation, mostly Navajo, had spent years living with the San Juan Basin’s drilling boom, and the last thing they needed was a sermon about climate change. So instead of lecturing, he created a garden to reduce the church’s use of fossil fuels to transport food. Then he began fundraising for solar installations on homes around the mission and urging lawmakers to tighten regulations on methane, a powerful greenhouse gas released by oil and gas drilling.

Last year, he joined the Interfaith Power & Light campaign, “a religious response to global warming” composed of churches and faith communities across the U.S. Since 2001, the network had expanded its membership from 14 congregations in California to some 20,000 in over 40 states. The group provides resources to churches and other faith communities for cutting carbon emissions — helping install solar panels, for instance, and sharing sermons on the importance of addressing climate change.

Therrien says he is merely “following the Scripture.” In the process, however, he has joined a growing environmental movement that brings a religious dimension to the problem of climate change.

Members of the New Community Project tour the Navajo Nation near now-dry Washington Lake to learn how oil and gas extraction is affecting the Navajo people who live nearby.

The green religious movement is gaining momentum. In May, nine Catholic organizations announced plans to divest from fossil fuel corporations, a move inspired by Pope Francis’ 2015 encyclical, Laudato Si’: On Care for Our Common Home.

In June, President Donald Trump announced plans to withdraw from the Paris climate accord, a decision that Catholic Bishop Oscar Cantú, of Las Cruces, New Mexico, called “deeply troubling.” “The Scriptures affirm the value of caring for creation and caring for each other in solidarity,” said Cantú, who is the chairman of the U.S. Bishops’ Committee on International Justice and Peace. “The Paris agreement is an international accord that promotes these values.”

In July, the United Church of Christ delivered a similar message, urging the clergy to preach on “the moral obligation of our generation to protect God’s creation” and exhorting individuals to take political action.

For these churches, climate change connects a long list of social and economic injustices they care deeply about, from food insecurity to the global refugee crisis.

Pope Francis stands in front of a statue of St. Francis of Assisi. The pope, who takes his name from St. Francis, the patron saint of animals and ecosystems, has led a call to action on climate change for the Catholic community.

“Climate change is the biggest ethical, moral and spiritual challenge of our day,” Joan Brown, the executive director of New Mexico Interfaith Power & Light, told me.

She pointed to St. Francis of Assisi, the patron saint of animals and ecology, a medieval Italian monk who spent much of each year living in hermitages, caves and on mountainsides, praying in the wilderness. “St. Francis speaks of everything being connected,” she said, “of there being no separation between the human and natural world.” When Jorge Mario Bergoglio was elected pope, he chose his name in honor of St. Francis. Brown credited Pope Francis with helping reframe climate change as a moral concern.

Laudato Si’ describes the relationship between global poverty, inequality and environmental destruction, and issues a call to action. When Francis visited the White House in 2015, he declared: “Climate change is a problem which can no longer be left to a future generation. I would like all men and women of goodwill in this great nation to support the efforts of the international community to protect the vulnerable in our world.”

Among non-Catholics, too, the pontiff’s message has had an effect: Polling from the Yale Project on Climate Change Communication shows that the share of Americans who say they think global warming will harm the world’s poor rose from 49 to 61 percent; the share who say the issue has become “very” or “extremely important” to them personally jumped from 19 to 26 percent.

A little over a year later, during the Paris climate negotiations, Brown recalled how during a breakout session for faith leaders, one of the Paris organizers praised Laudato Si’ and the similar documents released by Muslim, Jewish and Buddhist leaders. It was the first time, Brown said, quoting the organizer, “that ethical and moral imperatives are front and center with delegates.”

Here at his hardscrabble New Mexico parish, Therrien continues to practice what he preaches. On a hot day in July, he herded 28 visitors into the mission’s two white vans for a drive out onto the Navajo Nation. The group, mostly Easterners, ranged in age from 8 to over 60 and had traveled to the Lybrook mission as part of a weeklong fact-finding trip. Like Therrien, many were members of the Church of the Brethren, a Protestant denomination with a history of activism. More recently, their focus had shifted to environmental issues — especially climate change.

“It’s concerning that our government is pulling back from what we should be focusing on,” one of them, Jim Dodd, told me. Recently, the giant Larsen Ice Shelf had broken off from Antarctica, and Dodd was worried. “Villages already at sea level are going to get flooded,” he said.

Leading the group was David Radcliff, director of the New Community Project, a Vermont-based organization. “It’s a fairness issue for the rest of God’s creatures,” he told me. Radcliff has led “learning tours” around social and environmental justice issues for church groups, most recently, to the Ecuadorian Amazon.

Radcliff, a small, wiry man with an intense blue gaze, wore a white T-shirt with a very long slogan on the back. “Earth is a mess,” it said, and “God’s not amused.” If you aren’t satisfied, it added, “do something about it.”

For Radcliff, discussing the facts of climate change isn’t enough. That’s where religion comes in. “At a certain point, you have to talk about the consequences, and past that it becomes a conversation about morality,” he said. Take moose in the Northeast: They are dying from tick infestations exacerbated by a warming climate, caused by humans taking more from the Earth than they need, he said. “We are stealing from other living things.”

As we drove over bumpy dirt roads west of the mission, the Easterners stared in awe at the crumpled mesas and the vast New Mexican landscape. Navajo homesteads peeked out of the sagebrush. Every so often, a large semi-truck carrying pipes and other equipment roared past in a cloud of dust, heading for one of the scattered well pads or towering rigs marking fracking operations.

Therrien stopped the van at one of the well sites and ushered everyone out, gesturing toward a row of high metal storage tanks. Under federal and state regulations, the company should have built fencing around it, but out here on Navajo land, Therrien noted, the rules weren’t always enforced. Last year, several oil storage tanks north of Lybrook exploded, forcing Navajo families living nearby to evacuate their homes. Since moving to the mission, he often thought about how much easier it was to ignore the consequences of oil and gas development — and of climate change — if people weren’t involved.

Back in Kansas, Therrien had recycled cans and kept a compost pile, but when it came to climate change, he felt mostly resigned. “I used to think, ‘What can one person do?’ ” he told me. Now, as a pastor on the Navajo Nation, he felt a new sense of urgency and purpose.

Last January, he spoke at a rally outside the Bureau of Land Management’s office in Santa Fe, New Mexico, protesting the agency’s decision to lease 844 acres of land for drilling. The rally brought together Navajo activists, environmental groups and religious leaders from the state chapter of Interfaith Power & Light. Although they failed to stop the sale, Therrien remained hopeful. “Through the church, I realized there was this network of people fighting for the same things,” he said.

Therrien stopped the van at a low rise overlooking what was once Washington Lake. Families once gathered water here for themselves and their livestock. Four years ago, it dried up.

Everyone piled out, while Radcliff explained how aquifers were losing water. “They’re dropping all over the world,” including in the West, he said. He paused and knelt to pick up a can that was left on the side of the road and brandished it above him. “No other creature makes trash,” he said. “So what’s progress?” The people gazed past him, staring at the dusty lakebed, where patches of dry grass swayed in the heat.

Correspondent Sarah Tory writes from Paonia, Colorado. She covers Utah, environmental justice and water issues.

What if America Had a Detective Agency for Disasters?

The commissions are coming. Hurricane season hasn’t ended, but forensics waits for no one, so the after-action reports on Harvey and Irma have to get started. The relevant agencies—local and perhaps federal, plus maybe some national academies and disaster responders—will all no doubt look at themselves and others to see what went right or wrong. Was Houston right not to evacuate? Was Florida right to evacuate? What led to Florida’s electrical outage? Is Houston’s flood control infrastructure tough enough to withstand climate change-powered storm surges?

That’s as it should be. The science of disaster prediction and response only gets a few laboratories a year, and extracting lessons from them is a big job. Big enough, in fact, that the sheer number of reports can mean those lessons get lost. So Scott Knowles, a historian at Drexel University who studies disasters, has an idea: a federal agency whose job it is to centrally organize that detective work and get the word out. A National Disaster Investigation Board.

Stipulated, this idea would be a tough sell even if the Trump administration wasn’t anti-agency and anti-regulation. But just playing with the notion says something about what people do and don’t learn from disasters. “It’s an index of what a total nerd I am that this is a thrilling idea,” Knowles says. “We have all these agencies that we spend a fair amount of money on who do this work, but what’s missing is a technically sophisticated cop who can come in and say, 'this worked, this didn’t, this was a repeat, this is an area for research.' And then put a report out and have it be interesting, so there’s press.”

As a model, Knowles cites agencies like the National Transportation Safety Board, but also pop-up commissions like the one after the space shuttle Challenger exploded, or the 9/11 Commission. Both were riveting. The Rogers Commission, investigating the Challenger, had that thundering moment when Richard Feynman put an O-ring in ice water to show that cold made it brittle, the technical cause of the shuttle’s explosion. The 9/11 final report, controversial though it remains, was a bestseller and finalist for the National Book Award—and led to sweeping changes in United States intelligence and the military. That power to effect real, lasting change would be critical to the NDIB's success.

What’s missing is essentially a technically sophisticated cop who can come in and say, "this worked, this didn’t, this was a repeat, this is an area for research."

Scott Knowles, historian, Drexel University

So what would the NDIB actually do? Disasters are “multiple interlocking problems,” as Knowles says. The Fukushima Dai-Ichi meltdown was a human-made disaster that resulted from a natural one, a tsunami. The Challenger O-ring failed, but the decision to launch was a management failure. So the first task might be to figure out what problems to unlock. For Harvey and Houston, that might be evacuation communication and dealing with toxic sites. For Irma it might be the electrical grid.

Public, possibly even televised hearings might be another component. They’d recruit public pressure to make necessary changes, and also have a naming-and-shaming component. The post-Katrina study did that. “They said, look, the Coast Guard did good and everybody else failed,” Knowles says. “That was a spur at FEMA that produced pretty significant changes and led to legislation.”

And, as a colleague of Knowles’ suggested when he tossed out the idea of the NDIB on Twitter, the investigation should include people from among the affected. That’d give it a social science aspect, but also help make sure its findings represented the widest possible constituency.

So as long as we’re dreaming up mythical science detective agencies, who should be the stars of CSI: Disaster? “If we’re going to get serious about building this up as a body of expertise with policy juice, I think you have to have a standing body with people committed to it, and professional staff,” Knowles says. He suggests people like fire protection engineers, the cranky people in the back of the room who know what’s unsafe and aren’t afraid to say so, but can also navigate legal systems.

Knowles thought ex-FEMA administrators like James Lee Witt or Craig Fugate might be great at the job. Of Fugate, he said: “He’s setting himself up as a kind of policy analyst, and having been a FEMA administrator he’s way qualified to walk into Houston and say what he’s worried about.”

So I pinged Fugate. What does he think about a federal disaster investigation team? “It’s a good idea, but will be resisted by FEMA and others. A formal review by outside experts not subject to the agencies under review will be key,” Fugate says. But he adds that you’d still want to somehow integrate the views and actions of local, state, and private sector actors. And they’re “not likely to volunteer unless required by law.”

Short of starting up a Legion of Super-Heroes-style Science Police Force, that seems unlikely.

Worse, an NDIB might be redundant. “After Ebola, there were at least 20 after-action commissions to study what happened and what went wrong,” says Larry Brilliant, chair of the Skoll Global Threats Fund and a pandemic fighter since his days on the World Health Organization team that eradicated smallpox. (Brilliant also cofounded the lynchpin internet community The Well.) Brilliant was on at least one of the commissions himself. “It’s not that there’s any absence of intentionality,” he says.

A bigger problem: “There’s no such thing as a ‘disaster,’” Brilliant says. A famine is not a flood is not an explosion is not a refugee crisis is not a disease outbreak. “Each one has its own particular ways to screw up.”

Corporate forces would resist this new agency even harder than anti-bureaucratic ones. The FIRE sector—the finance, insurance, and real estate industries—has become the biggest contributor to federal-level political campaigns. That lobby doesn’t want to hear about uninsuring coastlines or rezoning wilderness. “I call them the lending complex,” Knowles says. “They’re going to push back on this because they think we should be able to build anyplace—wildland interface, Jersey Shore, wherever.”

But whether all those objections make a standing Federal agency with investigative powers a terrible idea (because it’d be redundant) or a great idea (because it’d clarify) isn’t clear. “Almost anything would be better than what we have now, which is duplicative studies and some Senate committee, and the stuff just goes on a shelf,” Knowles says. “Studies are going to happen. We may as well innovate around how they’re done, because they’re going to get more and more expensive.” And so are the disasters.

More than 8.3 billion tons of plastics made: Most has now been discarded

Date: July 19, 2017

Source: University of Georgia

Summary: Humans have created 8.3 billion metric tons of plastics since large-scale production of the synthetic materials began in the early 1950s, and most of it now resides in landfills or the natural environment, according to a study.

Humans have created 8.3 billion metric tons of plastics since large-scale production of the synthetic materials began in the early 1950s, and most of it now resides in landfills or the natural environment, according to a study published in the journal Science Advances.

Led by a team of scientists from the University of Georgia, the University of California, Santa Barbara and Sea Education Association, the study is the first global analysis of the production, use and fate of all plastics ever made.

The researchers found that by 2015, humans had generated 8.3 billion metric tons of plastics, 6.3 billion tons of which had already become waste. Of that waste total, only 9 percent was recycled, 12 percent was incinerated and 79 percent accumulated in landfills or the natural environment.

If current trends continue, roughly 12 billion metric tons of plastic waste will be in landfills or the natural environment by 2050. Twelve billion metric tons is about 35,000 times as heavy as the Empire State Building.
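
Those headline numbers can be checked with a few lines of arithmetic. The sketch below is illustrative only: the percentages and tonnages come from the study as quoted above, while the Empire State Building's mass of roughly 340,000 metric tons is an assumed round figure, not a number from the paper.

```python
# Rough arithmetic check of the study's headline figures (illustrative only).
total_produced_mt = 8.3e9   # metric tons of plastics produced, 1950-2015
total_waste_mt = 6.3e9      # metric tons that had become waste by 2015

recycled = 0.09 * total_waste_mt     # share recycled
incinerated = 0.12 * total_waste_mt  # share incinerated
discarded = 0.79 * total_waste_mt    # share in landfills or the environment

projected_2050_mt = 12e9    # projected plastic waste by 2050
empire_state_mt = 340_000   # assumed Empire State Building mass (metric tons) -- round figure

print(f"Waste share of all plastic produced: {total_waste_mt / total_produced_mt:.0%}")
print(f"Recycled: {recycled / 1e9:.2f} Bt, incinerated: {incinerated / 1e9:.2f} Bt, "
      f"discarded: {discarded / 1e9:.2f} Bt")
print(f"2050 projection: about {projected_2050_mt / empire_state_mt:,.0f} Empire State Buildings")
```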

"Most plastics don't biodegrade in any meaningful sense, so the plastic waste humans have generated could be with us for hundreds or even thousands of years," said Jenna Jambeck, study co-author and associate professor of engineering at UGA. "Our estimates underscore the need to think critically about the materials we use and our waste management practices."

The scientists compiled production statistics for resins, fibers and additives from a variety of industry sources and synthesized them according to type and consuming sector.

Global production of plastics increased from 2 million metric tons in 1950 to over 400 million metric tons in 2015, according to the study, outgrowing most other human-made materials. Notable exceptions are materials that are used extensively in the construction sector, such as steel and cement.

But while steel and cement are used primarily for construction, plastics' largest market is packaging, and most of those products are used once and discarded.

"Roughly half of all the steel we make goes into construction, so it will have decades of use -- plastic is the opposite," said Roland Geyer, lead author of the paper and associate professor in UCSB's Bren School of Environmental Science and Management. "Half of all plastics become waste after four or fewer years of use."

And the pace of plastic production shows no signs of slowing. Of the total amount of plastics produced from 1950 to 2015, roughly half was produced in just the last 13 years.

"What we are trying to do is to create the foundation for sustainable materials management," Geyer said. "Put simply, you can't manage what you don't measure, and so we think policy discussions will be more informed and fact based now that we have these numbers."

The same team of researchers led a 2015 study published in the journal Science that calculated the magnitude of plastic waste going into the ocean. They estimated that 8 million metric tons of plastic entered the oceans in 2010.

"There are people alive today who remember a world without plastics," Jambeck said. "But they have become so ubiquitous that you can't go anywhere without finding plastic waste in our environment, including our oceans."

The researchers are quick to caution that they do not seek the total removal of plastic from the marketplace, but rather a more critical examination of plastic use and its end-of-life value.

"There are areas where plastics are indispensable, especially in products designed for durability," said paper co-author Kara Lavender Law, a research professor at SEA. "But I think we need to take a careful look at our expansive use of plastics and ask when the use of these materials does or does not make sense."

Story Source:

Materials provided by University of Georgia. Original written by James Hataway. Note: Content may be edited for style and length.

Journal Reference:

Roland Geyer et al. Production, use, and fate of all plastics ever made. Science Advances, July 2017. DOI: 10.1126/sciadv.1700782

Localized rapid warming of West Antarctic subsurface waters by remote winds in East Antarctica

Paul Spence, Ryan M. Holmes, Andrew McC. Hogg, Stephen M. Griffies, Kial D. Stewart & Matthew H. England

Nature Climate Change (2017) doi:10.1038/nclimate3335

Received 25 January 2017 Accepted 08 June 2017 Published online 17 July 2017

The highest rates of Antarctic glacial ice mass loss are occurring to the west of the Antarctic Peninsula in regions where warming of subsurface continental shelf waters is also largest. However, the physical mechanisms responsible for this warming remain unknown. Here we show how localized changes in coastal winds off East Antarctica can produce significant subsurface temperature anomalies (>2 °C) around much of the continent. We demonstrate how coastal-trapped barotropic Kelvin waves communicate the wind disturbance around the Antarctic coastline. The warming is focused on the western flank of the Antarctic Peninsula because the circulation induced by the coastal-trapped waves is intensified by the steep continental slope there, and because of the presence of pre-existing warm subsurface water offshore. The adjustment to the coastal-trapped waves shoals the subsurface isotherms and brings warm deep water upwards onto the continental shelf and closer to the coast. This result demonstrates the vulnerability of the West Antarctic region to a changing climate.

The increasing rate of global mean sea-level rise during 1993–2014

Xianyao Chen, Xuebin Zhang, John A. Church, Christopher S. Watson, Matt A. King, Didier Monselesan, Benoit Legresy & Christopher Harig

Nature Climate Change 7, 492–495 (2017) doi:10.1038/nclimate3325

Received 19 October 2016 Accepted 22 May 2017 Published online 26 June 2017

Global mean sea level (GMSL) has been rising at a faster rate during the satellite altimetry period (1993–2014) than in previous decades, and is expected to accelerate further over the coming century. However, the accelerations observed over century and longer periods have not been clearly detected in altimeter data spanning the past two decades. Here we show that the rise, from the sum of all observed contributions to GMSL, increases from 2.2 ± 0.3 mm yr⁻¹ in 1993 to 3.3 ± 0.3 mm yr⁻¹ in 2014. This is in approximate agreement with the observed increase in the rate of GMSL rise, from 2.4 ± 0.2 mm yr⁻¹ (1993) to 2.9 ± 0.3 mm yr⁻¹ (2014), in satellite observations that have been adjusted for a small systematic drift, particularly affecting the first decade of satellite observations. The mass contributions to GMSL increase from about 50% in 1993 to 70% in 2014, with the largest, and statistically significant, increase coming from the contribution of the Greenland ice sheet, which is less than 5% of the GMSL rate during 1993 but more than 25% during 2014. The suggested acceleration and improved closure of the sea-level budget highlight the importance and urgency of mitigating climate change and formulating coastal adaptation plans to mitigate the impacts of ongoing sea-level rise.

Feds release final environmental impact statement for Atlantic Coast Pipeline

Jul 24, 2017, 6:30am EDT

Lauren K. Ohnesorge, Triangle Business Journal

Federal regulators have released their final environmental impact statement for the Atlantic Coast Pipeline, a 600-mile natural gas line planned to cut through North Carolina by way of West Virginia and Virginia.

The Federal Energy Regulatory Commission concludes that, while the project’s construction and implementation are likely to result in “some adverse effects,” the ACP can reduce those impacts “to less-than significant levels” by following mitigation recommendations.

A group of residents in Wilson County have spoken out against the project.

According to FERC, negative effects of ACP’s construction and operation could include its impact on steep slopes and adjacent bodies of water. The document points to a handful of endangered species that could be impacted by the project, from the Roanoke logperch to the clubshell mussel to the Indiana bat.

In a prepared statement after the release, Leslie Hartz, Dominion Energy's vice president of engineering and construction, said the report "provides a clear path for final approval of the Atlantic Coast Pipeline this fall."

"The report concludes that the project can be built safely and with minimal long-term impacts to the environment," Hartz states.

The report comes a day after seven mayors, spearheaded by Linwood Parker of Four Oaks, put out a joint statement calling for the project to be built.

“We just wanted to make sure everybody is aware of the importance to eastern North Carolina to have natural gas and an abundant supply,” Parker said Friday in an interview. “The debate is not about whether you have gas, it’s about whether eastern North Carolina has gas.”

Parker and other advocates of the project hope it spurs economic development, specifically the interest of manufacturers who rely on natural gas to operate. While companies aren’t going to be connecting directly to the pipeline, they could benefit from cheaper Duke Energy prices and a more resilient system, advocates have said.

Parker said many critics live in the Triangle region, where there’s “plenty of natural gas.” “Yet my community doesn’t have it,” he said.

But, to several property owners along the route, the ACP means unwelcome trenching. Of the 2,900 landowners impacted by the project, 1,000 are in North Carolina.

A group of homeowners along Exum Road in Wilson County are among those fighting against the project, worried about its impact on their safety and property values.

They point to the demographics. Other than Johnston, the counties at play have a greater minority makeup than the state average. That includes Robeson County, which is unique because of its Lumbee Indian population.

“What are we all? Black folks who are retired,” said Alice Freeman, who expects to see the pipeline construction from her bedroom window. “They’re going to these communities because we are the people who have the least clout. We don’t have the money to fight them. We’re easy prey. And nobody is going to come to our defense.”

But proponents of the pipeline – including several economic developers across the state – point to different demographics.

Of the eight counties tapped for the Atlantic Coast Pipeline, only Johnston County is not among the most economically distressed counties in the state. And the pipeline, they say, is critical to boosting their economies.

“For us, the overriding factor is the long-term economic development potential,” Patrick Woodie, president of the North Carolina Rural Economic Development Center, said earlier this year, adding that it “outweighs” homeowner impact.

The project is a joint effort by Dominion (NYSE:D), Duke Energy (NYSE:DUK), Piedmont Natural Gas and Southern Co. (NYSE:SO).

Solar Energy Boom Sets New Records, Shattering Expectations

Overall, clean energy sources accounted for two-thirds of new electricity capacity worldwide in 2016, a new report says.

By Georgina Gustin

Oct 4, 2017

Driven largely by a boom in solar power, renewable energy expansion has hit record-breaking totals across the globe and is shattering expectations, especially in the United States, where projections were pessimistic just a decade ago.

In 2016, almost two-thirds of new power capacity came from renewables, surpassing net coal generation growth globally for the first time. Most of the expansion came from a 50 percent growth in solar, much of it in China.

In the U.S., solar power capacity doubled compared to 2015—itself a record-breaking year—with the country adding 14.5 gigawatts of solar power, far outpacing government projections. In the first half of 2017, wind and solar accounted for 10 percent of monthly electricity generation for the first time.

Two reports—one from the International Energy Agency (IEA), which looked at growth in renewables globally, and one from the Natural Resources Defense Council (NRDC), which tracked growth in the U.S.—were published this week, both telling the same story.

"We had very similar findings: 2016, from a U.S. perspective was a great year for renewable energy and energy efficiency," said Amanda Levin, a co-author of the NRDC report. "China is still the largest source of new power, but in the U.S., we're seeing an increase in renewables year over year."

Growth Shatters Past Expectations

The numbers are far higher than the U.S. Energy Information Administration (EIA) predicted a decade earlier. The agency forecast in 2006 that solar power would amount to only about 0.8 gigawatts of capacity by 2016.

Instead, installed solar by 2016 was 46 times that estimate, the NRDC points out. EIA's prediction for wind power was also off—the agency predicted 17 gigawatts, but actual wind capacity reached 82 gigawatts, nearly five times the forecast.
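
The multiples quoted above are easy to verify. The snippet below is a minimal sketch using only the figures cited in this article, comparing the 2006 forecasts with the capacity in place by 2016:

```python
# Compare EIA's 2006 forecasts with the capacity actually installed by 2016
# (figures as quoted in the article; illustrative arithmetic only).
solar_forecast_gw = 0.8
solar_actual_gw = 0.8 * 46          # "46 times that estimate" is roughly 37 GW
wind_forecast_gw, wind_actual_gw = 17, 82

print(f"Solar: forecast {solar_forecast_gw} GW, actual about {solar_actual_gw:.0f} GW")
print(f"Wind: forecast {wind_forecast_gw} GW, actual {wind_actual_gw} GW "
      f"({wind_actual_gw / wind_forecast_gw:.1f}x the forecast)")
```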

The agency, likewise, didn't predict a drop in coal-fired power generation, which plummeted by nearly 45 percent.

Globally, according to the report from the IEA—not to be confused with the EIA—solar was the fastest-growing source of new energy, bypassing all other energy sources, including coal. Overall, the IEA found, new solar energy capacity rose by 50 percent globally—tracking with the rise in the U.S. Adding in other renewable sources, including wind, geothermal and hydropower, clean energy sources accounted for two-thirds of new electricity capacity. The IEA also increased its forecast for future renewable energy growth, saying it now expects renewable electricity capacity will grow 43 percent, or more than 920 gigawatts, by 2022.

Solar's U.S. Growth Could Hit a Speed Bump

In the U.S., the prospects are similarly positive, despite the Trump administration's efforts to bolster the coal industry and roll back Obama-era clean energy legislation.

Levin noted one potential damper on that growth. Last month, the U.S. International Trade Commission ruled in favor of two solar manufacturers that are seeking tariffs on cheap imported solar panels. Ultimately, any tariff decision would be made by the Trump administration.

"It would mean a much higher price for solar panels, and it could put a large reduction in new solar being added over the next two to three years," Levin said.

"States and cities are moving forward on clean energy," she said. "We think the investments made by states and cities, to not only hedge on gas prices, but to meet clean energy standards, will continue to drive solar even with the decision."

About the Author: Georgina Gustin

Georgina Gustin is a Washington-based reporter who has covered food policy, farming and the environment for more than a decade. She started her journalism career at The Day in New London, Conn., then moved to the St. Louis Post-Dispatch, where she launched the "food beat," covering agriculture, biotech giant Monsanto and the growing "good food" movement. At CQ Roll Call, she covered food, farm and drug policy and the intersections between federal regulatory agencies and Congress. Her work has also appeared in The New York Times, Washington Post and National Geographic's The Plate, among others.

Santa Fe School Board Opposes State Science Education Standards

Critics of the proposed curriculum say it leaves out important information relating to climate change and evolution.

By Ashley P. Taylor | October 4, 2017

The Santa Fe school board voted yesterday (October 3) to oppose the state’s new science education standards, which, the board says, water down or leave out key information about evolution and climate change, The Santa Fe New Mexican reports. The school board has written a letter in opposition to the state’s Public Education Department (PED), which it plans to send by week’s end. The board is also planning a “teach-in” at the PED building in Santa Fe for next Friday (October 13).

The letter, shared with the New Mexican, asks why the new standards omit the age of the Earth, the time when the first unicellular organism appeared, evolution, and global warming, while emphasizing the oil and gas industries. The letter also criticizes the state’s refusal to disclose the authors of the standards, saying that the secrecy “leaves open for question the authenticity of the proposed replacement,” according to the newspaper.

The Los Alamos school board has already opposed the new standards, and the Taos board is expected to follow suit, the New Mexican reports.

The Santa Fe board is “deeply troubled,” according to the letter, that the new state standards—an update from those established in 2003—do not align with the Next Generation Science Standards, which the National Research Council, the National Science Teachers Association, and others developed. Eighteen states have adopted the Next Generation standards in full. Others, like New Mexico, are using them as a framework for their own standards, the New Mexican reports.

While the authorship of the Next Generation standards is public, that of the state standards is not, and the state seems to be actively withholding the information. The New Mexican submitted a public records request in mid-September seeking the names of individuals and groups that developed the new standards. The education department’s public records custodian replied that the department had no pertinent records, as the New Mexican reports in a separate article. PED spokeswoman Lida Alikhani told the paper that those who weighed in on the standards did so with the understanding that their contributions would remain confidential.

The board has planned a teach-in for October 13, days before a public hearing about the new standards.

“Our state should adopt the original proposal, not the revisions,” writes the New Mexican in an editorial.

“Future climate scientists should understand what human activity is doing to the planet, and most of all, how science might be able to save our collective futures,” the editorial continues. “Without the proper foundation, those scientists will not be coming from New Mexico. That would be a shame.”

Industry Lawsuits Try to Paint Environmental Activism as Illegal Racket

Logging and pipeline companies are using a new legal tactic to seek damages from Greenpeace and other groups. The long-shot cases are having a chilling effect.

By Nicholas Kusnetz

Oct 5, 2017

On a bright afternoon in May 2016, two men in a silver SUV pulled into Kelly Martin's driveway. One of them, tall and beefy with a crew cut, walked up to her front door.

"The guy said, 'Is Joshua Martin home?' and I said, 'No, who are you?" recalled Kelly. "He said, 'I'm with a company that's talking to current and former employees of ForestEthics, and I'm wondering if he still works there'."

Joshua had left ForestEthics, which was renamed Stand last year, to run the Environmental Paper Network. Kelly asked to see the stranger's ID and to snap a picture on her phone. Instead, the man retreated to the SUV and "they literally peeled out of the driveway."

Around the same time, Aaron Sanger, another former employee of Stand, also received a visit from two men asking similar questions. So did others, some of them former employees of Greenpeace.

Then, on the last day of that month, Greenpeace and Stand were hit with an unusual lawsuit brought by Resolute Forest Products, one of Canada's largest logging and paper companies, that could cost the groups hundreds of millions of dollars if Resolute wins.

The timber company said the organizations, which for years had campaigned against Resolute's logging in Canada's boreal forest, were conspiring illegally to extort the company's customers and defraud their own donors.

Invoking the Racketeer Influenced and Corrupt Organizations Act, or RICO, a federal conspiracy law devised to ensnare mobsters, the suit accuses the organizations, as well as several green campaigners individually and numerous unidentified "co-conspirators," of running what amounts to a giant racket.

"Maximizing donations, not saving the environment, is Greenpeace's true objective," the complaint says. "Its campaigns are consistently based on sensational misinformation untethered to facts or science, but crafted instead to induce strong emotions and, thereby, donations." Dozens of the group's campaign emails and tweets, it said, constituted wire fraud.

"As an NGO, that is a deeply chilling argument," said Carroll Muffett, president of the Center for International Environmental Law (CIEL), which joined eight other groups to file an amici curiae brief supporting a move to dismiss Resolute's case.

The far-reaching lawsuit has seized attention across the environmental advocacy and legal communities. Arguments are to be heard in court next week.

The Resolute lawsuit was unprecedented. Then several months ago, former employees of Greenpeace and the environmental advocacy group 350.org were similarly visited by investigators. In August, the company behind the Dakota Access Pipeline filed a separate RICO suit against Greenpeace and two smaller groups, Banktrack and Earth First!. The complaint echoes Resolute's claims: a broad conspiracy against a major company, advocacy groups running an illegal "enterprise" to further their own interests while damaging the company, Energy Transfer Partners. It even alleges support for eco-terrorism, a violation of the Patriot Act, and drug trafficking.

It was filed by the same lawyer who represents Resolute—Michael J. Bowe, of the firm Kasowitz Benson Torres LLP, who is also a member of President Donald Trump's personal legal team.

"The Energy Transfer Partners lawsuit against Greenpeace is perhaps the most aggressive SLAPP-type suit that I've ever seen," said Michael Gerrard, faculty director of the Sabin Center for Climate Change Law at Columbia University, using the acronym for a lawsuit that aims to silence political advocacy. "The paper practically bursts into flames in your hands."

While the second lawsuit names only three defendants, together with unnamed "co-conspirators," it also accuses many of the nation's leading environmental groups, including the Sierra Club, Earthjustice and 350.org, of participating in a sprawling enterprise to disrupt business and defraud donors.

"This was precisely what we were concerned we would see," said Muffett of CIEL. Dozens of organizations, American Indian tribes and countless individuals were involved in the protests against the Dakota Access Pipeline. By naming a handful of them and unnamed co-conspirators, the suit may cause anyone with any ties to the movement to think twice before sending the next campaign email or launching a new effort, Muffett said.

"Those groups will be looking at this and trying to decide on how to respond and what it means for their campaigns going forward, not only on Dakota Access but other campaigns as well. And that is, in all likelihood, a core strategy of a case like this."

Kasowitz Benson Torres has earned a reputation as a particularly brash New York law firm. The waiting room of its Manhattan offices displays a brochure of media mentions, including several about the firm's close relationship with Trump. When Timothy O'Brien reported in his 2005 book, "TrumpNation: The Art of Being the Donald," that the future president was worth between $150 million and $250 million, not the billions he boasted, Trump hired Kasowitz to sue O'Brien for $5 billion. The case dragged on for five years before being dismissed.

It may have been the firm's work for a Canadian insurance company, however, that captured the attention of Resolute.

In 2006, Fairfax Financial Holdings Limited hired Bowe and Kasowitz to sue several hedge funds, accusing them of operating a criminal enterprise to drive down the company's share price and boost their short-sale profits. The case relied on New Jersey's RICO statute, and while it was dismissed in 2012, Fairfax appealed and in April won a ruling saying it could proceed with some of its claims. Bradley Martin, Fairfax's vice president of strategic investments, is chairman of Resolute's board of directors.

Bowe said that while his clients suffered damages, "defamation cases also typically involve an element of restoring your reputation. In fact, sometimes plaintiffs who don't have substantial damages sue primarily to clear their reputations," he said. Behind Bowe's desk hangs an American flag from the battle of Fallujah, among the bloodiest of the Iraq War. "I think the filing of a suit in court sends the most credible message that what they're saying about you is not true."

The Resolute case has its origins in a long-running environmental campaign focusing on Canada's boreal forest, a thick green stripe across much of the country. It's one of the world's largest stores of carbon, and environmental groups say it's under increasing pressure from logging operations. In 2012, Greenpeace Canada released a report purporting to show that Resolute was operating in parts of the forest it had agreed to stay out of. It turned out, however, that the area in question was not part of the protected forest. Greenpeace later issued an apology and said it would retract any material that drew on the assessment.

There's some disagreement on whether Resolute's logging practices stand out from its peers—many environmentalists say they do, but some people familiar with its work dismiss the claim. What everyone interviewed for this article agreed on is that unlike many of its peers, Resolute has resisted settling with environmentalists. In 2013, it sued Greenpeace Canada in an ongoing Canadian defamation suit over the boreal campaign. And after Greenpeace continued its work, and targeted many of Resolute's American customers, Resolute eventually filed its U.S. RICO claim, targeting the organization's U.S. and international arms, as well.

In 124 pages of detailed and passionate language, the complaint alleges not only that the defendants lied in order to harm Resolute's business, but that their primary objective is not to protect the environment but to raise money. It describes a "blitzkrieg attack" of threats against the company's customers unless they dropped Resolute. Some, like Kimberly-Clark, UPM, Best Buy and News Corp have either ended contracts or requested the contracts be altered to alleviate concerns, according to the complaint.

The lawsuit says the groups impersonated company employees to gain proprietary information and suggests Greenpeace was associated with a hack of the websites of Resolute and Best Buy. Beyond the 2012 error that Greenpeace retracted, the lawsuit says Greenpeace defamed the company by, among other things, calling it a "forest destroyer." In addition to Greenpeace and Stand, the lawsuit names several of the organizations' employees personally as defendants. The co-conspirators—or John and Jane Does—it says, are people of unknown identity who assisted Greenpeace's work, including "cyber-hacktivists, environmental activists, and certain foundations directing funds to the defendants." The suit seeks damages to be determined in a trial, but cites a claim by Greenpeace to have cost the company at least $80 million. RICO claims entitle plaintiffs to recover triple damages, in this case potentially more than $240 million.

On Oct. 10, Greenpeace will ask a federal judge in California to dismiss the case. The group submitted a similar motion last year in Georgia, where the suit was originally filed. The Georgia judge later moved the case to California, where two of the defendants are based, saying Resolute had not provided any "factual basis from which to infer that defendants committed fraud or extortion" in Georgia. "Rather, the allegations in the complaint, at best, support the inference that the defendants organized and held a protest in Augusta."

"It is very alarming that you can have plaintiffs like this, representing corporate interests attacking legitimate critics doing advocacy work by just drafting a complaint, throwing whatever in there, stretching racketeering law and going after constitutionally protected free speech by throwing labels out there basically trying to criminalize legitimate advocacy work," said Tom Wetterer, Greenpeace's general counsel.

Over the past year, several energy corporations have filed lawsuits against critics. Murray Energy Corp., an Ohio-based coal company, filed a libel claim against The New York Times in May after it published an editorial critical of the company, and the following month filed another suit against comedian John Oliver, HBO, Time Warner and some staff members of his show, "Last Week Tonight." In August, Cabot Oil & Gas Corp., which has been accused of polluting residential water wells in Pennsylvania, sued one of the residents and his lawyers, saying they had harmed the company by filing a lawsuit, later withdrawn, that Cabot considered frivolous. The same month, a mining company run by the son of West Virginia Gov. Jim Justice sued two Kentucky regulators—personally—alleging that they interfered with the company's business.

The suits against Greenpeace, however, are broader in scope, and because of their use of racketeering law, potentially much more damaging. "They can change the tenor of the debate," said Joshua Galperin, who runs the Environmental Protection Clinic at Yale Law School. The cases, he said, are not simply legal acts, but political too.

"Now wouldn't be a bad time for certain industries to follow Resolute's lead and say, 'let's try to change the conversation. Let's see if we can get the Trump administration to intervene in these lawsuits, to get behind us, to bring perhaps even criminal RICO charges against groups like Greenpeace," he said. "Now wouldn't be a bad time to try these aggressive tactics, unfortunately."

The day the Resolute lawsuit was filed, Jonathan Adler, a conservative legal scholar at Case Western Reserve University School of Law, wrote about it in a blog published by the Washington Post. The Wall Street Journal's editorial board offered conditional support. Others, including Steve Forbes, welcomed the lawsuit in Investor's Business Daily and the National Review.

Brandon Phillips, of the National Fisheries Institute, an industry group, wrote a letter published in the Wall Street Journal saying, "Bravo to Resolute Forest Products and here's hoping the discovery process in the litigation shines a light on the misconduct that environmental journalists have apparently ignored for far too long."

Marita Noon, in a column for Breitbart and the American Spectator, said: "Hopefully other companies will now tune into the public's change in attitude and with firmness and determination also fight back to protect shareholders and workers." She placed Resolute's fight in the political context of growing support for Trump's presidential campaign.

H. Sterling Burnett, of the conservative Heartland Institute, well known for its denial of mainstream climate science, said in an interview that other companies have been too quick to cave to the demands of environmentalists, and that the RICO suits may change that. "The Resolute case is one small thread," he said, "but you can take a thread and unravel a whole sweater."

Adler, a contributing editor to the National Review Online, spent years at the conservative Competitive Enterprise Institute, whose litigation has marked climate policy debates. "There is a view in the free market community that corporations don't fight back enough and are too quick to write the terms of their own surrender," he said.

Adler said he's not a fan of using RICO in the way that Kasowitz Benson Torres has, but that he has little sympathy for Greenpeace. And he has little doubt that companies are watching. "If weapons are lying around, someone's going to pick them up."

Bowe said he's spoken with several other companies interested in filing similar lawsuits—both before and after Energy Transfer Partners filed its case in August—though he would not disclose any details about the companies or potential targets. Speaking from his office, with plate glass windows looking over midtown Manhattan, he said his firm is not shopping the case to potential clients, that he's not allowed to do so.

A Perfect Boogeyman

In December 2014, as negotiators from around the world gathered in Lima, Peru, for the United Nations-sponsored climate change talks, a group of Greenpeace activists laid out a pro-renewable energy message in giant yellow fabric letters next to some of the nation's famous Nazca Lines, an archeological site, and photographed the episode from above. Many Peruvians, as well as diplomats at the talks, were outraged by the stunt, and the Peruvian government said the activists damaged the site.

Since its founding in 1971, Greenpeace has embraced high-profile antics of this sort, which self-consciously skirt the edges of legality and politesse. They also make Greenpeace the perfect boogeyman for conservatives and corporate leaders who have long pushed back against environmentalists, considering them out-of-touch radicals bent on crippling economic growth. (The Nazca Lines episode, for which Greenpeace later apologized, is listed in Resolute's complaint as evidence that the organization does more harm than good.)

Greenpeace is international in reach, nipping at corporate heels no matter where they tread, and in recent decades it has become one of the most effective groups at pressing companies to reform. For any practice that Greenpeace wants changed—clearing Amazonian forests for soy or Indonesian ones for palm oil—the group searches the supply chain for a major consumer brand, asks it to change suppliers, and threatens to paint the brand as complicit in environmental pillage if it refuses. The group has sent protesters dressed as chickens to McDonald's franchises in London to protest logging in the Amazon, deftly mimicked a Dove soap public service announcement in Canada to link its parent company, Unilever, to deforestation in Indonesia, and dropped banners across the front of Procter & Gamble's corporate headquarters for its sourcing of palm oil. The rhetoric is blunt and hyperbolic. And it's been remarkably effective, winning concessions, cooperation and even grudging respect from some of the world's best-known corporations.

Kimberly-Clark agreed to boost its use of recycled and environmentally certified wood fiber in 2009 after a years-long Greenpeace campaign. McDonald's agreed to work with the group to reduce deforestation in the Amazon.

The case brought by Energy Transfer Partners is an explicit attack not just on Greenpeace and the other defendants, but on this method, which it calls the "Greenpeace Model."

The pipeline company says the groups damaged its business through their protests against the pipeline, and also through their campaigns to pressure banks to cut off funding. It says they lied about the company's practices—claiming it hadn't consulted with Indian tribes and that the project hadn't received a thorough environmental review. (In December, the Obama administration ordered a more comprehensive environmental review of alternative routes, but the Trump administration later approved the pipeline before the review was finished.) The suit also names nine other groups, including the Sierra Club, Earthjustice and 350.org, as members of this "criminal enterprise," though most are not included as defendants. Together with the Resolute suit, which names Stand, the cases go after much of this uncompromising wing of environmentalism.

"An attack against Greenpeace and Stand, two groups that have been really at the forefront of corporate campaigns, is not just an attack on those groups but is an attack on the strategy that NGOs have used to really bring corporations back under control in terms of their social and environmental behavior," said Michael Marx, who runs CorpEthics, an environmental consulting firm that works with several of the named groups. The two complaints list other campaigns that Greenpeace has engaged in, including its work against genetically modified crops, Canada's tar sands and a campaign on sustainable fishing. "If Resolute is successful in a case like this, they're basically taking away what's become what I think is the real driver in the social responsibility movement."

The lawsuits make extraordinary claims, comparing the named environmental organizations to "parasites" and a "predatory pack preying on legitimate businesses."

Many legal experts and environmentalists say it's too soon to say whether the suits will pose a serious threat to the groups or to the advocacy community more broadly. Several, including Galperin, say the cases are unlikely to prevail in court. They may not need to.

The Resolute suit has now been running for nearly a year and a half, requiring Greenpeace and Stand to spend money and time filing briefs, responding to motions and traveling to court. Defendants in the Energy Transfer Partners case face a similar road ahead, with the filing of motions scheduled to last into March.

"We're confident they're not going to win, we're confident they're going to spend probably 10 times as much money as we're going to spend defending it, but this is definitely a difficult thing to deal with," said Todd Paglia, Stand's executive director, who was also named as a defendant. Paglia said the group has already lost support from one funder who was wary of being dragged into a lawsuit, accounting for about a quarter of the funding for its boreal campaign. "If we weren't really aggressive in trying to control costs, in trying to maintain our focus on our campaigns and revenue, it would be very easy for this lawsuit to turn into not just harassment but a death threat."

Greenpeace is much larger, with nearly $400 million in revenue from all its international branches last year. If it lost in court, the payout could be substantial. (Energy Transfer Partners didn't specify a figure, saying it had lost hundreds of millions.) But Tom Wetterer, Greenpeace's general counsel, said that's not the goal, or the danger, of the cases. "I don't think it's really about the money," he said. "I think they have to realize that their claims are not strong, that it's the process that they're really focused on, regardless of the end. It's really a means to drag us through the legal process."

The groups have tried to capitalize on the lawsuits, too. Greenpeace launched a new campaign asking authors and publishers—whom Resolute supplies with paper—to denounce the company's tactics, while Stand has asked supporters to rally behind it, adding an appeal for donations at the bottom of the message.

Energy Transfer Partners declined to comment, referring questions to Bowe. In an interview in August, Kelcy Warren, the company's chief executive, told a North Dakota television station: "We've created this kind of tolerance where, oh my gosh, you can't challenge these people in fear that someone is going to say you're not a friend of the environment. That's nonsense," he said. "Could we get some monetary damages out of this thing, and probably will we? Yeah, sure. Is that my primary objective? Absolutely not. It's to send a message, you can't do this, this is unlawful and it's not going to be tolerated in the United States."

Seth Kursman, Resolute's vice president for corporate communications, said in a written statement to InsideClimate News that "we decided to draw the line, unapologetically and forcefully defending our integrity."

The case has already provided a document trail that's proved useful to the company and its allies. In a January filing, Greenpeace wrote that some of its statements were "heated rhetoric" and opinion, and not actionable. Richard Garneau, the company's chief executive, drew on the quote in a column in the National Review, calling it an admission of lying. The cry was echoed in the Daily Caller, the oil and gas industry website Energy in Depth, and a site run by the prominent corporate public relations consultant Richard Berman, whose firm's tagline is "Change the Debate." Energy Transfer Partners cited the filing in its complaint.

A dozen media and advocacy organizations, in a second amici curiae brief supporting Greenpeace and Stand, warned that using RICO was "clearly an attempt at an end-run around the protections of the First Amendment." For a more common defamation case to proceed, plaintiffs must prove a defendant acted with "actual malice," meaning they either knew what they said was false or they should have known. It's unclear, some legal experts say, whether the same standard would apply under RICO, potentially making it easier for the case to proceed to discovery.

Among those who signed the brief was the First Amendment Coalition, which until January was led by Peter Scheer. He said he doubts Resolute will win the RICO case, but that if it drags on long enough, it could provide a model for silencing defendants, particularly smaller groups.

"It is something that is greatly subject to abuse because it gives the plaintiff so much leverage that it can force the defendant to settle," he said. "It's like an extortion law."

Bowe, the lawyer representing the companies, dismissed the idea that RICO would weaken First Amendment protections.

"Certain elements of the statute allow you to hold people accountable who might not be held accountable in a defamation case," he said. "If you're dealing with a group acting in concert, you want to hold accountable those who are directing and funding and calling the shots, not just the people doing the acts."

Enter the John and Jane Does. "When you get to discovery, the whole iceberg becomes revealed," Bowe said. "There might be people there who you weren't aware were involved. It could be funders, it could be foundations." A foundation could be implicated, he said, if it was closely involved in the campaign and its messaging.

Marx, of CorpEthics, worries that the suits have already intimidated some of his peers. "Even the threat that companies might do it could have a chilling effect both on the foundations that fund this work and on the groups that do these kinds of campaigns," he said.

Several people interviewed for this article said that threat quietly murmurs in the back of their mind. One of them is Joshua Martin, among the targets of the private investigators. The investigators never identified themselves as working for the Kasowitz firm; Bowe would not confirm or deny that he sent them.

"As soon as that lawsuit came out, I read the whole thing and I looked for my name," Martin said. "I just think that psychologically that's interesting in terms of the potentially intimidating impact that the visits have."

Facing Tougher Regulations, TransCanada Scraps $12 Billion Oil Pipeline

"CALGARY, Alberta - TransCanada Corp abandoned its C$15.7 billion ($12.52 billion) cross-country Energy East pipeline on Thursday amid mounting regulatory hurdles, dealing a blow to the country’s oil export ambitions.

The demise of the pipeline comes less than a year after the Canadian government rejected another export pipeline, Enbridge Inc’s Northern Gateway, and is a further setback for Canada’s oil industry which is already hurting from low global crude prices.

The vast majority of Canadian crude exports go to the United States, and Energy East would have shipped 1.1 million barrels a day to east coast ports for loading onto tankers destined for higher-priced markets in Europe and Asia.

Panelists Say Inaction on Rising Seas Is Dangerous

10/09/2017 by Ashita Gona

High tide brings the ocean over the dunes in Ocean Isle during an October 2015 storm. File photo by Christopher Surigao

RALEIGH — “Bottom line is we should not be building big buildings next to the beach.”

These were the words of Orrin Pilkey, an expert on coastal geology, during a panel discussion at a community forum on the effects of sea level rise on North Carolina last week. Pilkey and other panelists voiced strong opinions on how little state and local officials are doing to adapt to and prevent damage from sea level rise in the coming decades.

The forum, “Rising Seas: How will climate change affect the NC Coast?,” was part of a Community Voices series hosted by The News & Observer and WTVD-TV of Raleigh. The discussion took place at the North Carolina Museum of History on the evening of Wednesday, Sept. 27.

In addition to Pilkey, professor emeritus of earth and ocean sciences at the Nicholas School of the Environment at Duke University, panelists included the following:

Astrid Caldas, senior climate scientist with the Climate and Energy program at the Union of Concerned Scientists.

Stanley R. Riggs, professor of geology at East Carolina University.

Greg “Rudi” Rudolph, Carteret County shore protection officer and a member of the Coastal Resources Commission’s science panel.

Todd Miller, executive director of the North Carolina Coastal Federation.

Ned Barnett, an opinion column and blog contributor at The News & Observer, moderated the panel. Barnett noted at the beginning of the forum that the panel included no climate change doubters or those who reject mainstream climate science.

“We cannot devote the little time we have tonight to a debate that there is even a problem to discuss,” he said.

Barnett emphasized the topic of sea level rise is relevant to the state and not talked about enough.

“The rise can seem too small to matter, a few inches a decade,” he said, “but that rise becomes more ominous when we consider that it is relatively new, starting around 1800, that it is accelerating, and that its effects can be compounded by the other consequences of global warming, heavier rainfall and more powerful storms.”

A report released in July by the Union of Concerned Scientists, an organization that, Caldas joked, calls itself the union of pissed-off scientists, found that some coastal regions, which currently see only a few floods a year, will likely see frequent and destructive floods in coming decades.

“Science is telling us that lots of localized floods are going to occur in the near term,” Caldas said, “and that substantial areas are going to become part of the tidal zone in the long term.”

She said that the research projected that Wilmington will go from experiencing a few tidal floods a year to as many as 150 by 2035. By 2045, the number of floods may be in excess of 350. She added that Duck is projected to see about 30 flooding events by 2035, and that the water will cover extensive expanses of land. By 2045, she said, Duck may see up to 150 extensive floods annually.

One of the biggest problems, Caldas said, will be the effect of this phenomenon on poor and minority communities, who feel the effects of sea level rise in different ways. Vulnerable communities, she said, tend to live in distant areas. Flooding on roads can make it difficult to get to resources and to their jobs.

“Many socially, economically vulnerable communities are at the frontlines of this whole mess, with very few resources to cope,” she said.

Rudolph said that people in poor agricultural communities are also vulnerable to sea level rise as they may not be able to afford increasingly expensive flood insurance premiums.

“They’re in the flood zones,” he said, “if not, they’re going to be. Premiums going up because of storms. I see that as a big socioeconomic crisis, and it’s going to impact North Carolina hard.”

Pilkey said that the scientific consensus is that sea level will likely rise 3 feet by the end of the century and possibly another 6 inches, depending on the behavior of the west Antarctic ice sheet. In theory, he said, a foot of sea level rise could create 2,000 feet of shoreline retreat, or as much as 10,000 feet along parts of the Outer Banks.

“Which means that 1 foot of sea level rise could bring the shoreline back 2 miles,” he said regarding the Outer Banks.
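
Pilkey’s figures follow a simple multiplier: horizontal shoreline retreat equals vertical sea level rise times a slope-dependent factor. The sketch below only restates that arithmetic with the numbers from his remarks (2,000 and 10,000 feet of retreat per foot of rise); it is not a general coastal-erosion formula.

```python
# Shoreline-retreat rule of thumb as described by Pilkey (illustrative only):
# horizontal retreat = sea level rise x slope-dependent multiplier.
rise_ft = 1.0
typical_multiplier = 2_000       # ~2,000 ft of retreat per foot of rise
outer_banks_multiplier = 10_000  # up to ~10,000 ft per foot on parts of the Outer Banks

retreat_typical_ft = rise_ft * typical_multiplier
retreat_outer_banks_ft = rise_ft * outer_banks_multiplier

print(f"Typical: {retreat_typical_ft:,.0f} ft of retreat per foot of rise")
print(f"Outer Banks: {retreat_outer_banks_ft:,.0f} ft, or about "
      f"{retreat_outer_banks_ft / 5280:.1f} miles")  # roughly 2 miles
```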

As sea level rises, Riggs said the coast is becoming increasingly vulnerable to storm surges, but that society must develop new economies around the natural dynamics. Storms are a part of life, he said, and we must learn to live in harmony with them.

Natural approaches to storm management are still possible in North Carolina, as only half of the state’s coast is developed, Rudolph said.

“There’s a huge what’s called opportunity,” he said, “to advance a mixture of natural flood management, living shorelines and new infrastructure.”

The panel agreed that steps to prevent damage from sea level rise should be taken now rather than after a devastating event.

“Don’t expect enlightened policymaking in the aftermath of a storm,” Miller said, “everybody’s attention at that point is on recovery.”

Pilkey said that he believes that after 2 feet of sea level rise, beach re-nourishment will no longer be possible, leaving communities with two choices: Move buildings back or build a seawall. The best we can do now, he said, is to stop building large structures that cannot be moved.

As for people interested in owning coastal property, Riggs said a lack of information may leave them vulnerable to making risky purchases.

“Let’s at least require some statements on a deed that’s out there in the high-hazard areas,” Riggs said, “so a person who’s not familiar knows what they’re buying into.”

The final version of the state’s five-year update to the original 2010 sea-level rise report was released in March 2015.

Panel members expressed frustration at the gap between science and policy, with Pilkey saying the Coastal Resources Commission appears to be doing everything it can to promote development on the state’s coast.

Riggs referred to the 2010 sea level rise report produced by the CRC science panel, which predicted 39 inches of rise during the next century. The North Carolina General Assembly famously rejected the report, prompting nationwide attention and late-night TV jokes about the state “outlawing climate change,” and the panel was instructed to create a new report in 2015 that looked only 30 years into the future.

Riggs resigned from the panel in 2016, saying “I believe the once highly respected and effective science panel has been subtly defrocked and is now an ineffective body.”

He said during the panel discussion that education is key to putting pressure on policymakers to use science to craft coastal management policies.

“If we don’t have an educated public, we’ll never get off the ground with any of this,” Riggs said.

After hurricanes, Congress ponders future of flood insurance program

Congress is beginning to consider how to overhaul the flood insurance plan that many Texans are relying on to rebuild after Hurricane Harvey.

by Abby Livingston Oct. 9, 2017


WASHINGTON - The devastating hit Houston took from Hurricane Harvey has exacerbated — and highlighted — the enormous financial jam facing the National Flood Insurance Program.

Thanks to the recent onslaught of hurricanes hitting Texas, Florida and Puerto Rico, there has never been a greater need for the program. But that need has also set off a new round of calls to dramatically overhaul a program that hasn't been able to sustain itself without major subsidies from the U.S. Treasury.

Republican U.S. Rep. Pete Olson's Sugar Land-based district suffered some of the most intensive flooding in the state. He said he is open to some changes, but not if it risks payouts on Harvey claims his constituents are filing. He is quick to underscore the desperation in his suburban Houston district and doesn't want changes to make things worse.

"It should be part of the package but not a do-all, end-all," Olson said of any potential overhaul.


Established in 1968 to help homeowners living in flood-prone areas that private insurers wouldn't cover, the program has never been on steady financial footing, and continued construction in low-lying areas — as well as more frequent and powerful storms attributed to climate change — have put the NFIP deeply in the red.

As a result, Congress repeatedly finds itself re-authorizing new money to support the program. Even before Hurricanes Harvey and Irma, the program was set to expire on Sept. 30, and no new insurance policies can be written once it lapses until Congress reauthorizes it.

Few home insurance policies cover flood damage, and nearly all U.S. flood insurance policies are issued through the program. To qualify for national flood insurance, a home must be in a community that has agreed to adopt and enforce various policies to reduce flooding risk. The program has more than 584,000 active policies in Texas, more than any state but Florida.

It's hard to find a member of Congress who will call for an outright abolition of the program, which would likely destabilize real estate markets and property tax bases in those areas. So the aim among some legislators is to pass laws that will encourage homeowners to move into the private market.

One option is to raise the premiums for government insurance to help sustain the program, discourage new construction in high-risk areas and hope that as rates rise, more private companies will enter the flood insurance market.

The fear of many lawmakers who represent these homeowners is that a substantial rise in rates will be more than some can afford.


The issue had the potential to become a crisis as congressional insiders worried that re-authorizing the program could get tangled up in fights over raising the debt ceiling and funding the government. But to the surprise of nearly everyone, President Trump cut a deal with Democratic leaders to re-authorize the program for the short-term and push all of those big decisions into December.

Now, activists and members of Congress who want to overhaul the flood insurance program have an opportunity to make their case over the next couple of months.

They argue that government-subsidized insurance encourages more people to build in flood-prone areas — which then forces the government to rebuild their homes after every flood at taxpayer expense.

"We keep rebuilding areas that are at a very expensive cost to taxpayers," said U.S. Rep. Vicente Gonzalez, D-McAllen.

Those advocating an overhaul include taxpayer watchdogs, environmentalists, insurance companies and members of Congress who oppose bailing out areas that allowed building in flood-prone areas. They're pushing for legislation that requires better flood plain mapping that takes climate change into account, stricter building regulations requiring measures like elevating homes and buildings to reduce flood risk, and setting sustainable insurance rates that won't shock the market.

A powerful trifecta of interest groups made up of bankers, real estate agents and home construction companies has fought these efforts.

Back in 2012, former President Obama signed into law a major Congressional overhaul of the flood insurance program. Among the changes: eliminating subsidies for homes that are repeatedly damaged by flooding.

But some homeowners and their representatives in Congress protested the steep price increases. In early 2014, Congress and Obama reversed course, passing a law that capped premium increases and mostly unwound the 2012 changes.


Now, many House members are pushing to let the private market take over the job of insuring properties in flood zones.

Gonzalez serves on the U.S. House Financial Services Committee, which oversees flood insurance. A former attorney, he is no fan of what he describes as the program's drawn-out claims process.

"I think government shouldn't, probably, be in the business of insurance," he said.

U.S. Rep. Jeb Hensarling, R-Dallas and chair of the House Financial Services Committee, is a key player in this debate. He has pushed for reauthorization of the program, but wants to raise the premiums for government-sponsored policies to encourage homeowners to seek private insurance.

“As unfortunately the NFIP faces an uncertain future, we must ensure people have more affordable flood insurance options as they begin to rebuild," he said last week.

The first stab at changes came last week when two Floridians in Congress, U.S. Reps. Kathy Castor, a Democrat, and Dennis Ross, a Republican, attached legislation to an unrelated bill that would allow mortgage lenders to accept homeowners' use of private flood insurance instead of government insurance.

The Castor-Ross measure passed the U.S. House, but the Senate quickly axed it out of the bill. That drew a rebuke from Hensarling, who said the Senate was "letting an opportunity slip through its fingers to give flood victims and homeowners better and more affordable flood insurance options."

Supporters of the legislation in the House say they are undeterred, believing it's the most popular proposal for changing the program and will inevitably pass.

For now, no one on Capitol Hill seems inclined to increase the misery of those affected in Houston by drastically changing the flood insurance program for those who are currently filing claims. And the Florida and Texas delegations have vowed to combine their legislative firepower to protect their constituencies — members from both parties say protecting the program is a key priority.

"We've got to fix it because it's going bankrupt," said Olson, the congressman from Sugar Land. "People depend on it."

This story was produced in partnership with the Ravitch Fiscal Reporting Program at the CUNY Graduate School of Journalism.

Genetically boosting the nutritional value of corn could benefit millions

Rutgers scientists discover way to reduce animal feed and food production costs by increasing a key nutrient in corn

Rutgers University

Rutgers scientists have found an efficient way to enhance the nutritional value of corn - the world's largest commodity crop - by inserting a bacterial gene that causes it to produce a key nutrient called methionine, according to a new study.

The Rutgers University-New Brunswick discovery could benefit millions of people in developing countries, such as in South America and Africa, who depend on corn as a staple. It could also significantly reduce worldwide animal feed costs.

"We improved the nutritional value of corn, the largest commodity crop grown on Earth," said Thomas Leustek, study co-author and professor in the Department of Plant Biology in the School of Environmental and Biological Sciences. "Most corn is used for animal feed, but it lacks methionine - a key amino acid - and we found an effective way to add it."

The study, led by Jose Planta, a doctoral student at the Waksman Institute of Microbiology, was published online today in the Proceedings of the National Academy of Sciences.

Methionine, found in meat, is one of the nine essential amino acids that humans get from food, according to the National Center for Biotechnology Information. It is needed for growth and tissue repair, improves the tone and flexibility of skin and hair, and strengthens nails. The sulfur in methionine protects cells from pollutants, slows cell aging and is essential for absorbing selenium and zinc.

Every year, synthetic methionine worth several billion dollars is added to field corn seed, which lacks the substance in nature, said study senior author Joachim Messing, a professor who directs the Waksman Institute of Microbiology. The other co-author is Xiaoli Xiang of the Rutgers Department of Plant Biology and Sichuan Academy of Agricultural Sciences in China.

"It is a costly, energy-consuming process," said Messing, whose lab collaborated with Leustek's lab for this study. "Methionine is added because animals won't grow without it. In many developing countries where corn is a staple, methionine is also important for people, especially children. It's vital nutrition, like a vitamin."

Chicken feed is usually prepared as a corn-soybean mixture, and methionine is the sole essential sulfur-containing amino acid that's missing, the study says.

The Rutgers scientists inserted an E. coli bacterial gene into the corn plant's genome and grew several generations of corn. The E. coli enzyme - 3′-phosphoadenosine-5′-phosphosulfate reductase (EcPAPR) - spurred methionine production in just the plant's leaves instead of the entire plant to avoid the accumulation of toxic byproducts, Leustek said. As a result, methionine in corn kernels increased by 57 percent, the study says.

Then the scientists conducted a chicken feeding trial at Rutgers and showed that the genetically engineered corn was nutritious for them, Messing said.

"To our surprise, one important outcome was that corn plant growth was not affected," he said.

In the developed world, including the U.S., meat proteins generally have lots of methionine, Leustek said. But in the developing world, subsistence farmers grow corn for their family's consumption.

"Our study shows that they wouldn't have to purchase methionine supplements or expensive foods that have higher methionine," he said.

Huge energy potential in open ocean wind farms in the North Atlantic

In wintertime, North Atlantic wind farms could provide sufficient energy to meet all of civilization's current needs

Carnegie Institution for Science

Washington, DC -- There is considerable opportunity for generating wind power in the open ocean, particularly the North Atlantic, according to new research from Carnegie's Anna Possner and Ken Caldeira. Their work is published by Proceedings of the National Academy of Sciences.

Because wind speeds are higher on average over ocean than over land, wind turbines in the open ocean could in theory intercept more than five times as much energy as wind turbines over land. This presents an enticing opportunity for generating renewable energy through wind turbines. But it was unknown whether the faster ocean winds could actually be converted to increased amounts of electricity.
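The "more than five times" figure follows from the physics of wind power: the kinetic energy flux through a turbine's swept area grows with the cube of wind speed, so even modestly faster winds carry much more extractable energy. A minimal sketch of that relationship (the wind speeds below are illustrative assumptions, not values from the study):

    # Wind power density scales with the cube of wind speed: P/A = 0.5 * rho * v**3
    rho = 1.225                     # air density at sea level, kg/m^3
    v_land, v_ocean = 7.0, 12.0     # assumed mean wind speeds, m/s (illustrative only)
    p_land = 0.5 * rho * v_land ** 3
    p_ocean = 0.5 * rho * v_ocean ** 3
    print(p_ocean / p_land)         # ~5.0, i.e. roughly five times the available power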

"Are the winds so fast just because there is nothing out there to slow them down? Will sticking giant wind farms out there just slow down the winds so much that it is no better than over land?" Caldeira asked.

Most of the energy captured by large wind farms originates higher up in the atmosphere and is transported down to the surface where the turbines may extract this energy. Other studies have estimated that there is a maximum rate of electricity generation for land-based wind farms, and have concluded that this maximum rate of energy extraction is limited by the rate at which energy is moved down from faster, higher up winds.

"The real question is," Caldeira said, "can the atmosphere over the ocean move more energy downward than the atmosphere over land is able to?"

Possner and Caldeira's sophisticated modeling tools compared the productivity of large Kansas wind farms to massive, theoretical open-ocean wind farms and found that in some areas ocean-based wind farms could generate at least three times more power than the ones on land.

In the North Atlantic, in particular, the drag introduced by wind turbines would not slow down winds as much as they would on land, Possner and Caldeira found. This is largely due to the fact that large amounts of heat pour out of the North Atlantic Ocean and into the overlying atmosphere, especially during the winter. This contrast in surface warming along the U.S. coast drives the frequent generation of cyclones, or low-pressure systems, that cross the Atlantic and are very efficient in drawing the upper atmosphere's energy down to the height of the turbines.

"We found that giant ocean-based wind farms are able to tap into the energy of the winds throughout much of the atmosphere, whereas wind farms onshore remain constrained by the near-surface wind resources," Possner explained.

However, this tremendous wind power is highly seasonal. In winter, North Atlantic wind farms could provide sufficient energy to meet all of civilization's current needs, but in summer they could generate only enough power to cover the electricity demand of Europe, or possibly the United States alone.

Wind power production in the deep waters of the open ocean is still in its commercial infancy. The huge wind power resources identified by the Possner and Caldeira study provide strong incentives to develop lower-cost technologies that can operate in the open-ocean environment and transmit this electricity to land where it can be used.

This study was supported by the Fund for Innovative Climate and Energy Research and the Carnegie Institution for Science endowment.

The Carnegie Institution for Science is a private, nonprofit organization headquartered in Washington, D.C., with six research departments throughout the U.S. Since its founding in 1902, the Carnegie Institution has been a pioneering force in basic scientific research. Carnegie scientists are leaders in plant biology, developmental biology, astronomy, materials science, global ecology, and Earth and planetary science.

The hidden environmental costs of dog and cat food

By Karin Brulliard August 4

The kibble in a dog’s bowl has a significant environmental pawprint, a study found. (Bigstock)

Gregory Okin is quick to point out that he does not hate dogs and cats. Although he shares his home with neither — he is allergic, so his pets are fish — he thinks it is fine if you do. But if you do, he would like you to consider what their meat-heavy kibble and canned food are doing to the planet.

Okin, a geographer at UCLA, recently did that, and the numbers he crunched led to some astonishing conclusions. America’s 180 million or so Rovers and Fluffies gulp down about 25 percent of all the animal-derived calories consumed in the United States each year, according to Okin’s calculations. If these pets established a sovereign nation, it would rank fifth in global meat consumption.

Needless to say, producing that meat — which requires more land, water and energy and pollutes more than plant-based food — creates a lot of greenhouse gases: as many as 64 million tons annually, or about the equivalent of driving more than 12 million cars around for a year. That doesn’t mean pet-keeping must be eschewed for the sake of the planet, but “neither is it an unalloyed good,” Okin wrote in a study published this week in PLOS One.
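The car comparison can be sanity-checked with simple arithmetic (a sketch that assumes a typical U.S. passenger vehicle emits on the order of 4.6 metric tons of CO2-equivalent per year; that per-car figure is an assumption, not a number from Okin's paper):

    # Rough equivalence between pet-food emissions and passenger cars
    pet_food_emissions_tons = 64_000_000   # tons CO2-equivalent per year, from the study
    per_car_tons = 4.6                      # assumed emissions of a typical car per year
    print(pet_food_emissions_tons / per_car_tons)   # ~13.9 million cars, consistent with "more than 12 million"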

“If you are worried about the environment, then in the same way you might consider what kind of car you buy … this is something that might be on your radar,” Okin said in an interview. “But it’s not necessarily something you want to feel terrible about.”

This research was a departure for Okin, who typically travels the globe to study deserts — things such as wind erosion, dust production and plant-soil interactions. But he said the backyard chicken trend in Los Angeles got him thinking about “how cool it is” that pet chickens make protein, while dogs and cats eat protein. And he discovered that even as interest grows in the environmental impact of our own meat consumption, there has been almost no effort to quantify the part our most common pets play.

To do that, Okin turned to dog and cat population estimates from the pet industry, average animal weights, and ingredient lists in popular pet foods. The country’s dogs and cats, he determined, consume about 19 percent as many calories as the human population, or about as much as 62 million American people. But because their diets are higher in protein, the pets’ total animal-derived calorie intake amounts to about 33 percent of that of humans.
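The population-equivalent figure follows from the same kind of arithmetic (a sketch assuming a 2017 U.S. population of roughly 325 million; that population figure is not taken from the paper):

    # Pets consume roughly 19 percent as many calories as the U.S. human population
    us_population = 325_000_000     # assumed 2017 U.S. population
    calorie_share = 0.19            # Okin's estimate for dogs and cats combined
    print(us_population * calorie_share)   # ~62 million people's worth of calories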

Okin’s numbers are estimates, but they do “a good job of giving us some numbers that we can talk about,” said Cailin Heinze, a veterinary nutritionist at Tufts University’s Cummings School of Veterinary Medicine who has written about the environmental impact of pet food. “They bring up a really interesting discussion.”

Okin warns that the situation isn’t likely to improve any time soon. Pet ownership is on the rise in developing countries such as China, which means the demand for meaty pet food is, too. And in the United States, the growing idea of pets as furry children has led to an expanding market of expensive, gourmet foods that sound like Blue Apron meals. That means not just kale and sweet potato in the ingredient list, but grain-free and “human-grade” concoctions that emphasize their use of high-quality meat rather than the leftover “byproducts” that have traditionally made up much of our pets’ food.

“The trend is that people will be looking for more good cuts of meat for their animals and more high-protein foods for their animals,” Okin said.

Cats are ‘obligate carnivores,’ which means they need more protein than dogs. (iStock)

What to do about this? That’s the hard part. Heinze said one place to start is by passing on the high-protein or human-grade foods. Dogs and cats do need protein — and cats, which are obligate carnivores, really do need meat, she said. But the idea that they should dine on the equivalent of prime rib and lots of it comes from what she calls “the pet food fake news machine.” There’s no need to be turned off by some plant-based proteins in a food’s ingredients, she said, and dog owners in particular can look for foods with lower percentages of protein.

The term human-grade implies that a product is using protein that humans could eat, she added. Meat byproducts — all the organs and other animal parts that don’t end up at the supermarket — are perfectly fine, she said.

“Dogs and cats happily eat organ meat,” Heinze said. “Americans do not.”

Okin has some thoughts about that. The argument that pet foods’ use of byproducts is an “efficiency” in meat production is based on the premise that offal and organs are gross, he says. (Look no further than the collective gag over a finely textured beef product known as “pink slime.”) But if we would reconsider that, his study found, about one-quarter of all the animal-derived calories in pet food would be sufficient for all the people of Colorado.

“I’ve traveled around the world and I’m cognizant that what is considered human edible is culture-specific,” he said. “Maybe we need to have a conversation about what we will eat.”

In the meantime, Okin suggests that people thinking about getting a dog might consider a smaller one — a terrier rather than a Great Dane, say. Or, if you think a hamster might fulfill your pet desires, go that route.

Heinze, for her part, sometimes offers the same counsel to vegetarian or vegan clients who want their pets to go meat-free. They are typically motivated by animal welfare concerns, not environmental ones, she said, but such diets are not always best for dogs, and they never are for cats.

“There have been a few times in my career where I’ve honestly said to my clients, ‘We need to find a new home for your pet,'” she said, “‘and you need to get a rabbit or a guinea pig or something like that.’”

Materials emitted by a water pipe-repair method may pose health risks, new safeguards and research needed

WEST LAFAYETTE, Ind. — New research is calling for immediate safeguards and the study of a widely used method for repairing sewer-, storm-water and drinking-water pipes to understand the potential health and environmental concerns for workers and the public.

The procedure, called cured-in-place pipe repair, or CIPP, was invented in the 1970s. It involves inserting a resin-impregnated fabric tube into a damaged pipe and curing it in place with hot water or pressurized steam, sometimes with ultraviolet light. The result is a new plastic pipe manufactured inside the damaged one. The process can emit chemicals into the air, sometimes in visible plumes, and can expose workers and the public to a mixture of compounds that can pose potential health hazards, said Andrew Whelton, an assistant professor in Purdue University’s Lyles School of Civil Engineering and the Environmental and Ecological Engineering program.

He led a team of researchers who conducted a testing study at seven steam-cured CIPP installations in Indiana and California. The researchers captured the emitted materials and measured their concentrations, including styrene, acetone, phenol, phthalates and other volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs).

Results from their air testing study are detailed in a paper appearing on July 26 in Environmental Science & Technology Letters. The study files are open-access and can be downloaded freely, and the paper is available at http://pubs.acs.org/doi/10.1021/acs.estlett.7b00237.

Findings show that the chemical plume, commonly thought of as harmless steam, was actually a complex mixture of organic vapor, water vapor, particulates of condensable vapor and partially cured resin, and liquid droplets of water and organic chemicals. A YouTube video is available at https://youtu.be/rBMOoa2XcJI.

“CIPP is the most popular water-pipe rehabilitation technology in the United States,” Whelton said. “Short- and long-term health impacts caused by chemical mixture exposures should be immediately investigated. Workers are a vulnerable population, and understanding exposures and health impacts to the general public is also needed.”

The researchers have briefed the National Institute for Occupational Health and Safety (NIOSH) about their findings. NIOSH is part of the Centers for Disease Control and Prevention and has occupational safety and health experts who can investigate workplace hazards.

Purdue researchers captured the chemical plume materials from two sanitary sewer-pipe installations and five storm-water pipe installations. Samples were analyzed using gas chromatography, thermal and spectroscopic techniques. Chemicals found included hazardous air pollutants, suspected endocrine disrupting chemicals, and known and suspected carcinogens. Emissions were sometimes highly concentrated and affected by wind direction, speed and the worker’s activities, Whelton said.

A waxy substance was found in the air, and materials engineers determined it was partially cured plastic, styrene monomer, acetone, and unidentified chemicals.

No respiratory protection was used by CIPP workers, and a review of online videos, images and construction contracts indicates respiratory safety equipment use was not typical, he said.

To evaluate chemical plume toxicity, pulmonary toxicologist and assistant professor Jonathan Shannahan and a graduate student exposed mouse lung cells to the captured materials. Plume samples from two of the four sites tested displayed toxic effects and two did not.

Workers repair a damaged pipe using the CIPP process, which emits a white plume containing a complex mixture of vapors and chemical droplets. The workers wore no respiratory protection.

“This suggests that there are operational conditions that may decrease the potential for hazardous health effects,” Shannahan stated. “Since exposures can be highly variable in chemical composition, concentration, and exposure duration, our findings demonstrate the need for further investigation.”

At the same time, existing testing methods are not capable of documenting this multi-phase chemical exposure, Whelton said.

Findings Contradict Conventional Thinking:

In 2017, Whelton completed a one-and-a-half-day CIPP construction inspector course for consulting and municipal engineers and contractors as part of a different research project working with state transportation agencies.

“CIPP workers, the public, water utilities, and engineers think steam is emitted,” he said. “What we found was not steam. Even when the chemical plume was not visible, our instruments detected that we were being chemically exposed.”

The paper was authored by graduate students Seyedeh Mahboobeh Teimouri Sendesi, Kyungyeon Ra, Mohammed Nuruddin, and Lisa M. Kobos; undergraduate student Emily N. Conkling; Brandon E. Boor, an assistant professor of civil engineering; John A. Howarter, an assistant professor of materials engineering and environmental and ecological engineering; Jeffrey P. Youngblood, a professor of materials engineering; Shannahan; Chad T. Jafvert, a professor of civil engineering and environmental and ecological engineering; and Whelton.

The new research also contains results from the team’s Freedom of Information Act requests to cities and utilities. This information is contained in the supporting-information section of the research paper. Forty-nine public reports of chemical air contamination associated with CIPP activities were found. The documents also describe complaints from homeowners and businesses who anecdotally reported strong, lingering chemical odors and illness symptoms.

Additional research is needed, particularly because the procedure has not been well studied for health and environmental risks, said Howarter.

“The CIPP process is actually a brilliant technology,” Howarter said. “Health and safety concerns, though, need to be addressed. We are not aware of any study that has determined what exposure limit to the chemical mixture is safe. We are not aware of any study that indicates that skin exposure or inhaling the multi-phase mixture is safe. We also are not aware of any study that has examined the persistence of this multi-phase mixture in the environment.”

Whelton said, “In the documents we reviewed, contractors, utilities, and engineering companies have told people who complain about illness symptoms that their exposures are not a health risk. It’s unclear what data are used for these declarations.”

Utilities and municipalities sometimes cite worker chemical exposure standards established by NIOSH as acceptable for the general public.

“That comparison is wrong and invalid,” Whelton said. “Worker safety exposure standards should not be cited as acceptable exposures for the general public. For example, children have less of an ability to handle a chemical exposure than healthy adults, and worker exposure standards do not account for the multi-phase exposure. CIPP workers can be exposed to more than one chemical at a time, in addition to droplets, vapor and particulates.”

Testing Results Show Need For Worksite Changes

Because of the air testing results obtained at CIPP installations on the Purdue campus – two of the seven test sites studied – Purdue required the faculty and students to better protect themselves from inhaling the chemicals emitted. The researchers were required to wear full-facemask carbon filter respirators during their testing in California.

Meanwhile, the uncured chemicals also might pose hazards: The nitrile gloves Whelton wore on one occasion deteriorated from contact with the uncured resin tube. The researchers observed that workers sometimes did not wear gloves while handling the materials. Some images and videos online also show workers not wearing gloves while handling the chemicals, he said.

“Workers should always wear appropriately thick chemically resistant gloves while handling uncured resin tubes,” Whelton said. “No one should handle these materials with their bare hands.”

He said health officials must be alerted when people complain about odors near CIPP sites and illness so they can investigate.

“Health officials we have spoken with were unaware of illness and odor complaints from CIPP activities,” Whelton said. “Local and state health departments should be involved in all public chemical exposure and illness complaints. They are trained on medical assessments.”

Often, the public instead speaks with CIPP contractors and engineers about their illness complaints, which Whelton said must change.

“Engineers should act immediately because it is their professional responsibility to hold paramount the safety, health, and welfare of the public according to their code of ethics,” he said. “We have seen evidence that companies and utilities do not understand what materials are created and emitted by CIPP processes or the consequences of exposure. Our new study indicates workers, the public, and the environment need to be better protected from harm.”

The researchers are working to develop safeguards, including a new type of handheld analytical device that would quickly indicate whether the air at a worksite is safe. A patent application has been filed through the Purdue Research Foundation’s Office of Technology Commercialization.

The research was primarily funded by a RAPID response grant from the National Science Foundation. Additional support was provided by public donations and by Purdue University.

The study follows a discovery three years ago when researchers reported that chemicals released by CIPP activities into waterways have been linked to fish kills, contaminated drinking water supplies, and negative impacts to wastewater treatment plants. A previous research paper, available at http://pubs.acs.org/doi/abs/10.1021/es5018637, demonstrated that chemicals released into water by CIPP sites can be toxic, and the CIPP waste dissolved freshwater test organisms within 24 hours at room temperature.

Donations to continue this research can be made at http://Giving.Purdue.edu/WaterPipeSafety.

Writer: Emil Venere, 765-494-4709, venere@purdue.edu

Source: Andrew Whelton, 765-494-2160, awhelton@purdue.edu

ABSTRACT

Worksite Chemical Air Emissions and Worker Exposure during Sanitary Sewer and Stormwater Pipe Rehabilitation Using Cured-in-Place-Pipe (CIPP)

Seyedeh Mahboobeh Teimouri Sendesi†, Kyungyeon Ra‡, Emily N. Conkling‡, Brandon E. Boor†, Md. Nuruddin§, John A. Howarter‡§, Jeffrey P. Youngblood§, Lisa M. Kobos∥, Jonathan H. Shannahan∥, Chad T. Jafvert†‡, and Andrew J. Whelton*†‡

† Lyles School of Civil Engineering, Purdue University, West Lafayette, Indiana 47907, United States

‡ Division of Environmental and Ecological Engineering, Purdue University, West Lafayette, Indiana 47907, United States

§ School of Materials Engineering, Purdue University, West Lafayette, Indiana 47907, United States

∥ School of Health Sciences, Purdue University, West Lafayette, Indiana 47907, United States

​Chemical emissions were characterized for steam-cured cured-in-place-pipe (CIPP) installations in Indiana (sanitary sewer) and California (stormwater). One pipe in California involved a low-volatile organic compound (VOC) non-styrene resin, while all other CIPP sites used styrene resins. In Indiana, the uncured resin contained styrene, benzaldehyde, butylated hydroxytoluene (BHT), and unidentified compounds. Materials emitted from the CIPP worksites were condensed and characterized. An emitted chemical plume in Indiana was a complex multiphase mixture of organic vapor, water vapor, particulate (condensable vapor and partially cured resin), and liquid droplets (water and organics). The condensed material contained styrene, acetone, and unidentified compounds. In California, both styrene and low-VOC resin condensates contained styrene, benzaldehyde, benzoic acid, BHT, dibutyl phthalate, and 1-tetradecanol. Phenol was detected only in the styrene resin condensate. Acetophenone, 4-tert-butylcyclohexanol, 4-tert-butylcyclohexanone, and tripropylene glycol diacrylate were detected only in the low-VOC condensate. Styrene in the low-VOC condensate was likely due to contamination of contractor equipment. Some, but not all, condensate compounds were detected in uncured resins. Two of four California styrene resin condensates were cytotoxic to mouse alveolar type II epithelial cells and macrophages. Real-time photoionization detector monitoring showed emissions varied significantly and were a function of location, wind direction, and worksite activity.

Human-caused warming likely led to recent streak of record-breaking temperatures

American Geophysical Union

WASHINGTON D.C. - It is "extremely unlikely" 2014, 2015 and 2016 would have been the warmest consecutive years on record without the influence of human-caused climate change, according to the authors of a new study.

Temperature records were first broken in 2014, when that year became the hottest year since global temperature records began in 1880. These temperatures were then surpassed in 2015 and 2016, making last year the hottest year ever recorded. In 2016, the average global temperature across land and ocean surface areas was 0.94 degrees Celsius (1.69 degrees Fahrenheit) above the 20th century average of 13.9 degrees Celsius (57.0 degrees Fahrenheit), according to NOAA.

Combining historical temperature data and state-of-the-art climate model simulations, the new study finds the likelihood of experiencing consecutive record-breaking global temperatures from 2014 to 2016 without the effects of human-caused climate change is no greater than 0.03 percent and the likelihood of three consecutive record-breaking years happening any time since 2000 is no more than 0.7 percent. When anthropogenic warming is considered, the likelihood of three consecutive record-breaking years happening any time since 2000 rises to as high as 50 percent, according to the new study.

That means human-caused climate change is very likely to blame for the three consecutive record-hot years, according to the new study accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

"With climate change, this is the kind of thing we would expect to see. And without climate change, we really would not expect to see it," said Michael Mann, a climate scientist at Pennsylvania State University in State College, Pennsylvania, and lead author of the new study.

Greenhouse gases, like carbon dioxide and methane, accumulate in the atmosphere and trap heat that would otherwise escape into space. Excess greenhouse gases from industrial activities, like burning fossil fuels, are trapping additional heat in the atmosphere, causing the Earth's temperatures to rise. The average surface temperature of the planet has risen about 1.1 degrees Celsius (2.0 degrees Fahrenheit) since the late 19th century, and the past 35 years have seen a majority of the warming, with 16 of the 17 warmest years on record occurring since 2001, according to NASA.

Scientists are now trying to characterize the relationship between yearly record high temperatures and human-caused global warming.

In response to the past three years' record-breaking temperatures, authors of the new study calculated the likelihood of observing a three-year streak of record high temperatures since yearly global temperature records began in the late 19th century and the likelihood of seeing such a streak since 2000, when much of the warming has been observed. The study's authors determined how likely this kind of event was to happen both with and without the influence of human-caused warming.

The new study considers that each year is not independent of the ones coming before and after it, in contrast to previous estimates that assumed individual years are statistically independent from each other. There are both natural and human events that make temperature changes cluster together, such as climate patterns like El Niño, the solar cycle and volcanic eruptions, according to Mann.

When this dependency is taken into account, the likelihood of these three consecutive record-breaking years occurring since 1880 is about 0.03 percent in the absence of human-caused climate change. When the long-term warming trend from human-caused climate change is considered, the likelihood of 2014-2016 being the hottest consecutive years on record since 1880 rises to between 1 and 3 percent, according to the new study.

The probability that this series of record-breaking years would be observed at some point since 2000 is less than 0.7 percent without the influence of human-caused climate change, but between 30 and 50 percent when the influence of human-caused climate change is considered, the new study finds.

If human-caused climate change is not considered, the warming observed in 2016 would have about a 1-in-a-million chance of occurring, compared with a nearly 1-in-3 chance when anthropogenic warming is taken into account, according to the study.
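The gap between those probabilities can be illustrated with a toy Monte Carlo in the spirit of the study's approach: simulate many runs of annual temperature anomalies, with and without a long-term warming trend, and count how often the final three years of a run are each new records. The following is a simplified sketch with assumed noise, autocorrelation and trend parameters, not the authors' actual model:

    import numpy as np

    rng = np.random.default_rng(0)

    def prob_three_record_years(trend_per_year, n_years=117, n_sims=20_000, sigma=0.1, ar=0.5):
        """Fraction of simulated runs in which the last three years are each new records.

        Anomalies follow an AR(1) process (autocorrelation `ar`, noise sd `sigma`)
        plus a linear trend. All parameter values are illustrative assumptions.
        """
        hits = 0
        for _ in range(n_sims):
            noise = np.empty(n_years)
            noise[0] = rng.normal(0.0, sigma)
            for t in range(1, n_years):
                noise[t] = ar * noise[t - 1] + rng.normal(0.0, sigma)
            temps = trend_per_year * np.arange(n_years) + noise
            running_max = np.maximum.accumulate(temps)
            # each of the last three years must exceed the maximum of all earlier years
            if all(temps[-k] > running_max[-k - 1] for k in (3, 2, 1)):
                hits += 1
        return hits / n_sims

    print(prob_three_record_years(0.0))     # no warming trend: a three-year record streak is rare
    print(prob_three_record_years(0.008))   # ~0.9 C of warming over the period: far more likely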

The results make it difficult to ignore the role human-caused climate change is having on temperatures around the world, according to Mann. Rising global temperatures are linked to more extreme weather events, such as heat waves, floods, and droughts, which can harm humans, animals, agriculture and natural resources, he said.

"The things that are likely to impact us most about climate change aren't the averages, they're the extremes," Mann said. "Whether it's extreme droughts, or extreme floods, or extreme heat waves, when it comes to climate change impacts ... a lot of the most impactful climate related events are extreme events. The events are being made more frequent and more extreme by human-caused climate change."

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing 60,000 members in 137 countries. Join the conversation on Facebook, Twitter, YouTube, and our other social media channels.

Scientists discover 91 new volcanoes below Antarctic ice sheet

This is in addition to 47 already known about and eruption would melt more ice in region affected by climate change

Saturday 12 August 2017 18.11 EDT


Scientists have uncovered the largest volcanic region on Earth – two kilometres below the surface of the vast ice sheet that covers west Antarctica.

The project, by Edinburgh University researchers, has revealed almost 100 volcanoes – with the highest as tall as the Eiger, which stands at almost 4,000 metres in Switzerland.

Geologists say this huge region is likely to dwarf that of east Africa’s volcanic ridge, currently rated the densest concentration of volcanoes in the world.

And the activity of this range could have worrying consequences, they have warned. “If one of these volcanoes were to erupt it could further destabilise west Antarctica’s ice sheets,” said glacier expert Robert Bingham, one of the paper’s authors. “Anything that causes the melting of ice – which an eruption certainly would – is likely to speed up the flow of ice into the sea.

“The big question is: how active are these volcanoes? That is something we need to determine as quickly as possible.”

The Edinburgh volcano survey, reported in the Geological Society’s special publications series, involved studying the underside of the west Antarctic ice sheet for hidden peaks of basalt rock similar to those produced by the region’s other volcanoes, whose tips lie above the ice and have been spotted by polar explorers over the past century.

But how many lie below the ice? This question was originally asked by the team’s youngest member, Max Van Wyk de Vries, an undergraduate at the university’s school of geosciences and a self-confessed volcano fanatic. He set up the project with the help of Bingham. Their study involved analysing measurements made by previous surveys, which involved the use of ice-penetrating radar, carried either by planes or land vehicles, to survey strips of the west Antarctic ice.

The results were then compared with satellite and database records and geological information from other aerial surveys. “Essentially, we were looking for evidence of volcanic cones sticking up into the ice,” Bingham said.

After the team had collated the results, it reported a staggering 91 previously unknown volcanoes, adding to the 47 others that had been discovered over the previous century of exploring the region.

These newly discovered volcanoes range in height from 100 to 3,850 metres. All are covered in ice, which sometimes lies in layers that are more than 4km thick in the region. These active peaks are concentrated in a region known as the west Antarctic rift system, which stretches 3,500km from Antarctica’s Ross ice shelf to the Antarctic peninsula.

“We were amazed,” Bingham said. “We had not expected to find anything like that number. We have almost trebled the number of volcanoes known to exist in west Antarctica. We also suspect there are even more on the bed of the sea that lies under the Ross ice shelf, so that I think it is very likely this region will turn out to be the densest region of volcanoes in the world, greater even than east Africa, where mounts Nyiragongo, Kilimanjaro, Longonot and all the other active volcanoes are concentrated.”

The discovery is particularly important because the activity of these volcanoes could have crucial implications for the rest of the planet. If one erupts, it could further destabilise some of the region’s ice sheets, which have already been affected by global warming. Meltwater outflows into the Antarctic ocean could trigger sea level rises. “We just don’t know about how active these volcanoes have been in the past,” Bingham said.

However, he pointed to one alarming trend: “The most volcanism that is going on in the world at present is in regions that have only recently lost their glacier covering – after the end of the last ice age. These places include Iceland and Alaska.

“Theory suggests that this is occurring because, without ice sheets on top of them, there is a release of pressure on the regions’ volcanoes and they become more active.”

And this could happen in west Antarctica, where significant warming in the region caused by climate change has begun to affect its ice sheets. If they are reduced significantly, this could release pressure on the volcanoes that lie below and lead to eruptions that could further destabilise the ice sheets and enhance sea level rises that are already affecting our oceans.

“It is something we will have to watch closely,” Bingham said.

Federal Court rejects Sierra Club challenge to Texas natural gas export project

By Devin Henry - 08/15/17 11:09 AM EDT

A federal court has rejected a challenge to a major liquefied natural gas (LNG) export terminal in Texas.

In its Tuesday decision, the Court of Appeals for the District of Columbia ruled that the Department of Energy (DOE) conducted the necessary environmental and economic reviews before it approved the Freeport LNG export terminal project.

The Sierra Club had challenged DOE’s review of the project, saying the agency didn’t comply with federal environmental laws before approving the Freeport terminal in 2014, and that its determination that the terminal is in the “public interest” was flawed.

But the court batted down both contentions on Tuesday, ruling DOE followed procedures outlined in the National Environmental Policy Act (NEPA) when it approved the project. The court rejected the Sierra Club’s argument that DOE should have considered indirect environmental impacts of increasing LNG exports.

“Under our limited and deferential review, we cannot say that the Department failed to fulfill its obligations under NEPA by declining to make specific projections about environmental impacts stemming from specific levels of export-induced gas production,” the court determined in an opinion written by Judge Robert Wilkins, an Obama appointee.

The court also ruled that DOE’s determination that the project is in the public interest properly considered domestic economic impacts, foreign policy goals and energy security measures. All exports to countries without a free-trade agreement with the United States must receive a public interest determination.

In a statement, the Sierra Club said it was “disappointed with the court's refusal to require DOE to use available tools to inform communities of the impact of this additional fracking prior to approving exports."

“This LNG export approval creates unnecessary risks for the people of Freeport, Texas, and for every community that is saddled with fracking rigs next to their homes, schools and public spaces,” Nathan Matthews, a Sierra Club staff attorney, said.

Tuesday’s decision is the latest development in a lengthy legal fight over the Freeport terminal, which is due to come online next year, and other natural gas export facilities.

The D.C. Circuit had previously rejected environmentalists’ complaints over the Federal Energy Regulatory Commission’s approval decisions for the Freeport project and another terminal in Louisiana.

Natural Gas flaring rises despite N.D. regulations

Mike Lee, E&E News reporter for Energywire: Tuesday, August 15, 2017

Gas flaring in North Dakota's Bakken Shale oil field has been creeping back up, blunting the state's efforts to tame a problem that came to symbolize the excesses of the oil boom.

The volume of gas burned in flares reached 222 million cubic feet a day in June, a 31 percent increase from the same month last year, when the volume was 170 million cubic feet. That's still far lower than the peak in 2014, but critics said the turnaround shows the limits of North Dakota's industry-friendly regulations.

The flares release carbon dioxide and raw methane, both of which contribute to global climate change. Methane from flares is the biggest source of greenhouse gas emissions from North Dakota's oil and gas sector, according to the Environmental Defense Fund.

Landowners have objected to the waste of gas from their property, and residents who live in the oil field worry that the emissions from flaring could cause health problems, said Don Morrison, president of the Dakota Resource Council, whose members include farmers and ranchers in the Bakken region.

"What we can do about it is not issue a permit to drill until there is some place for the gas to go," Morrison said. "Other states have figured it out. I'm not sure why North Dakota can't figure it out, too."

Like other oil fields, the Bakken Shale produces large amounts of gas along with its crude. Unlike other fields, though, it has historically lacked the pipelines and processing plants needed to get gas to market. While the crude can profitably be trucked to collection points, producers argued they had no option but to burn the gas in flares at individual well sites.

Most other oil-producing states don't allow long-term flaring.

The practice drew widespread attention in 2011, when The New York Times featured a front-page story with a picture of a flare lighting up the surrounding prairie. At the time, producers in North Dakota were burning about 180 million cubic feet per day — more than a third of the gas produced in the state.

The publicity led then-Gov. Jack Dalrymple's (R) administration in 2014 to pass regulations aimed at limiting flaring by 2020, and the oil industry invested billions of dollars in infrastructure to move the gas to market.

The state Industrial Commission, which the governor leads, set a goal of reducing flaring to 10 percent of total gas production and later reduced the final level to 9 percent. The state also required companies to submit plans for how they'll capture gas when they apply for an oil well permit (Energywire, July 2, 2014).

The Sierra Club and Environmental Defense Fund argued at the time that the state should set a limit on the total volume of gas that can be flared. Setting a percentage limit allows the volume to rise when production rises.

"The planet doesn't care what percentage of gas you're wasting — the planet cares about the volume," said Dan Grossman, director of state programs at the Environmental Defense Fund.

The amount of gas burned in flares in the Bakken Shale oil field has crept back up in the last year, despite state regulations intended to limit the practice. North Dakota Department of Mineral Resources/North Dakota Pipeline Authority

Regulators at the state Department of Mineral Resources considered a volume limit when they wrote the rules in 2014 but rejected the idea because it would be too hard to enforce, said Alison Ritter, a spokeswoman for the agency.

Overall, the program has worked, she said. At the peak of the oil boom in December 2014, producers in North Dakota were flaring 374 million cubic feet a day, or 25 percent of production. By April 2016, the volume was down to 144 million cubic feet a day, or 8.8 percent of production.

In the last year, though, the oil bust has forced producers to shift their drilling into the most productive parts of the field, which also produce the most gas, according to the Department of Mineral Resources.

The increase in production means that more gas has been flared, even though the percentage has stayed roughly flat.

By May of this year, production grew to 1.85 billion cubic feet a day and flaring hit 203 million cubic feet a day, or about 10.9 percent of production.

This June, shutdowns at processing plants and a pipeline operator pushed up both the volume and percentage of flaring — to 222 million cubic feet and 12 percent — even though production stayed flat, according to the Department of Mineral Resources.

The industry has also had problems obtaining rights of way for new gas lines, particularly across the Fort Berthold Indian Reservation, said Ron Ness, president of the North Dakota Petroleum Council.

Two new gas-processing plants are planned for the field, which should help reduce the amount of flaring. Processing plants strip out byproducts and impurities from the gas stream so that the gas can meet the standards set by interstate pipelines.

Environmentalists hope that the state will rethink its approach to the regulations during the current downturn in drilling.

"Hopefully the Industrial Commission learned from the last boom cycle to be proactive so that it doesn't get away from us again," said Wayde Schafer, the North Dakota organizer for the Sierra Club.

Governor Cuomo Announces Winners of 76West Clean Energy Competition

August 16, 2017 Albany, NY

Dallas-Based Skyven Technologies Named $1 Million Grand Prize Winner - Will Expand Business to the Southern Tier; Grow Southern Tier's Clean Energy Economy

Competition Complements "Southern Tier Soaring" - the Region's Strategic Plan to Generate Robust Economic Growth and Community Development

Governor Andrew M. Cuomo today announced the six winning companies of the 76West Clean Energy Competition, one of the largest competitions in the country that focuses on supporting and growing clean-energy businesses and economic development. Skyven Technologies, a solar heating company from Dallas, Texas, has been awarded the $1 million grand prize, and will expand its operations in the Southern Tier. The competition complements "Southern Tier Soaring," the region's comprehensive strategy to generate robust economic growth and community development.

"New York is setting a national precedent in growing clean-energy businesses and battling climate change, and this momentum continues with the 76West Competition," Governor Cuomo said. "Skyven Technologies and the rest of the winners from this competition will bring jobs and economic growth to the Southern Tier and beyond, ensuring that New York remains at the forefront of the new clean energy economy."

A total of $2.5 million was awarded to six innovative companies from New York and across the United States. Lieutenant Governor Kathy Hochul announced the winners at an awards ceremony in downtown Binghamton today, joined by more than 100 elected officials, entrepreneurs and local business leaders. The event also named a $500,000 winner and four $250,000 winners.

As a condition of the award, companies must either move to the Southern Tier or establish a direct connection with the Southern Tier, such as a supply chain, job development with Southern Tier companies, or other strategic relationships with Southern Tier entities that increase wealth creation and create jobs. If the companies are already in the Southern Tier, they must commit to substantially growing their business and employment in the region.

Lieutenant Governor Kathy Hochul said, "Modeled after the wildly successful 43 North business plan competition, 76West is another example of creative economic development strategies spurring start-ups in the industries of the future. Incentivizing clean energy innovation in the Southern Tier through this competition is further evidence of Governor Cuomo's commitment to creating even more jobs and opportunities in this region of the State. Investing in renewable energy and other clean energy solutions also ensures a cleaner environment and stronger economy for future generations of New Yorkers. Congratulations to all the 76West winners who are bringing their business and clean energy technologies to the Southern Tier."

The 76West winners are:

$1 million grand prize winner

Skyven Technologies - Dallas, TX: uses solar heating techniques to reduce fossil fuel consumption in agriculture and other industries

$500,000 award winner

SunTegra - Port Chester, NY (Mid-Hudson Valley): develops solar products that are integrated into the roof to provide clean energy and an alternate look to conventional solar panels

$250,000 award winners

Biological Energy - Spencer, NY (Southern Tier): provides technology that increases wastewater treatment capacity while reducing energy use

EthosGen - Wilkes-Barre, PA: captures and transforms waste heat into resilient, renewable on-site electric power

SolarKal - New York, NY (New York City): provides a brokerage service to help businesses simplify the process of going solar

Visolis - Berkeley, CA: produces high-value chemicals from biomass instead of petroleum, which reduces greenhouse gases

The second round of the 76West Competition was launched in December 2016 and received applications from around the world, including Switzerland, South Africa and Israel. Of these, 15 semifinalists were chosen and participated in a marathon pitch session on July 11 at Alfred State College. Judges selected eight finalists, who then pitched their companies to a different panel of judges on July 13 in Corning, New York, where the six winners were selected.

In addition to announcing the 76West winners, Lt. Governor Hochul recognized Binghamton for being the first city in the Southern Tier designated as a Clean Energy Community by the New York State Energy Research and Development Authority. The Clean Energy Communities initiative recognizes municipalities that complete four of 10 high-impact clean energy actions. To earn the designation, Binghamton converted all city street lights to energy efficient LED technology, installed an electric vehicle charging station, completed energy code enforcement training, and streamlined the approvals process for local solar projects. The city is now eligible to apply for $250,000 in grants to fund additional clean energy projects.

NYSERDA administers the 76West Competition, which supports New York's nation-leading Clean Energy Standard requiring that 50 percent of the state's electricity come from renewable energy by 2030, under Reforming the Energy Vision, the state's comprehensive strategy to build a cleaner, more resilient and affordable energy system for all New Yorkers. 76West fosters cleantech economic development and leadership and expands innovative entrepreneurship in the Southern Tier.

New York State Energy and Finance Chairman Richard Kauffman said, "We're thrilled to see so many entrepreneurs and different types of start-up companies involved in the 76West Competition and interested in joining the robust community of the Southern Tier. Under Governor Cuomo, all New Yorkers stand to benefit from these clean technologies that are being developed and brought to the marketplace and I look forward to watching the winners of the 76West Competition grow and help New York build its clean, resilient and affordable energy system."

Alicia Barton, President and CEO, NYSERDA said, "The 76West Competition builds on Governor Cuomo's commitment to the Southern Tier by supporting an entrepreneurial community that is integral to New York's thriving clean economy. I congratulate all of the finalists on their success in this competition and I look forward to seeing each of them continue to grow in the Southern Tier."

This is the second year of 76West, a $20 million competition and support program administered by NYSERDA that will run once a year from 2016 to 2019. Each year applicants compete for a $1 million grand prize, a $500,000 award and four $250,000 awards. In total, 76West will provide $10 million in awards and $10 million for business support, marketing and administration through the Regional Greenhouse Gas Initiative and the Clean Energy Fund.

The six winners from the first round, announced last November, have already integrated themselves into the Southern Tier, raised $20 million in private capital for future growth, and created new jobs.

New York State has made a significant investment in the Southern Tier to attract a talented workforce, grow business and drive innovation. It is already home to nation-leading companies and universities which are spurring innovation and leading global research. Now, with the addition of New York's six new 76West winners, the Southern Tier is quickly becoming a leading region for clean tech development and a model for other regions across the state and nationally.

Southern Tier Regional Economic Development Council Co-Chairs Tom Tranter, President & CEO of Corning Enterprises and Harvey Stenger, President of Binghamton University said, "76West is helping the Southern Tier to soar by attracting top-notch entrepreneurs to the clean energy sector and creating jobs in the region. The effort is adding momentum to our region's ongoing revitalization. Thank you to Governor Cuomo, NYSERDA and to all of our government partners for creating this competition that is resulting in exciting opportunities for the Southern Tier."

Western New York Regional Economic Development Council Co-Chair and President of SolEpoxy Inc., Jeff Belt said, "It's exciting to see the role entrepreneurs are playing in growing New York State's clean energy sector. I congratulate the winners of the 76West Competition who know what it takes to build the clean, resilient and affordable energy systems of the future."

Western New York Regional Economic Development Council Co-Chair and President of the State University of New York at Fredonia, Dr. Virginia Horvath said, "76West provides the funding to take many of the brilliant ideas that are being developed at academic institutions to the next step. The cleantech competition keeps some of the brightest students living and working in our state, which is committed to reaching its clean energy goals."

State Senator Fred Akshar said, "I'd like to congratulate all of the finalists and winners of the 76West Clean Energy Competition, especially our Southern Tier finalist Biological Energy. Our region needs economic development, job growth and clean energy - which is exactly what the 76West Clean Energy Competition and Southern Tier Soaring are working to accomplish. I applaud everyone's hard work and am very impressed by the thoughtfulness and innovation of the people of this great state."

Assemblywoman Donna Lupardo said, "Innovation has always been the key to economic development in the Southern Tier. The 76West Competition continues this tradition by helping grow the clean energy sector in our state and adds to our region's ongoing revitalization. I'm looking forward to seeing the positive results that 76West continues to produce across New York."

Today's announcement complements "Southern Tier Soaring" - the region's comprehensive blueprint to generate robust economic growth and community development. New York State has already invested more than $4.6 billion in the region since 2012 to lay the groundwork for the plan - attracting a talented workforce, growing business and driving innovation. Today, unemployment across the state is near the lowest levels since before the Great Recession; personal and corporate income taxes are down; and businesses are choosing places like the Southern Tier as a destination in which to grow and invest.

Now, the region is accelerating "Southern Tier Soaring" with a $500 million investment through New York's Upstate Revitalization Initiative, announced by Governor Cuomo in December 2015. The state's $500 million investment will incentivize private business to invest well over $2.5 billion - and the region's plan, as submitted, projects up to 10,200 new jobs.

To learn more about REV, including the Governor's $5 billion investment in clean energy technology and innovation, please visit www.ny.gov/REV4NY and follow us at @Rev4NY.

Pope Francis inspires Iowa churches to tackle climate change

USA Today

Kevin Hardy, kmhardy@dmreg.com Published 9:47 p.m. CT Aug. 21, 2017 | Updated 12:18 p.m. CT Aug. 22, 2017


NORWALK, Ia. ― As the sun beat down on a recent Saturday afternoon, the mammoth air conditioning system at St. John the Apostle Catholic Church offered welcome relief to parishioners trickling in for evening Mass.

While churchgoers filled the pews and hymns filled the sanctuary, 206 solar panels overhead converted the sun's energy into power for lights, the public address system and the crisp cold air.

"We're running on solar right now," said Rev. John Ludwig.

The church is the first in the Catholic Diocese of Des Moines to roll out a large scale solar energy initiative. But Catholic leaders say it won't be the last.

Bishop Richard Pates credits grassroots work at the Norwalk church for bringing about the solar project, but the inspiration came from the church's highest-ranking figure. Iowa Catholics, inspired by Pope Francis' call for the faithful to actively combat the harmful effects of global climate change, are looking for ways to conserve energy and transition to renewable forms of energy.

"It's exciting to me because I feel like they’re sort of walking the walk," said the Rev. Susan Hendershot Guy, executive director of Iowa Interfaith Power & Light, a religious group aimed at responding to global warming. "They're not just saying isn’t this great. They’re really advocating for congregations to do something."

But the Catholic church isn't alone. Across the state, leaders from various faith traditions ― who often worship in vast, aging and inefficient facilities ― are pushing forward with energy efficiency efforts and solar installations, Guy said.

In Kalona, a Mennonite congregation invested in a community solar effort. A Soto Zen Buddhist temple in Dorchester now gets about half its energy from its solar array. And a Methodist church in Ames will host a solar workshop later this month.

"I think it is getting out to all types of denominations, congregations, faith traditions really across the conservative-liberal spectrum," Guy said. "It’s a really practical way to live out that message of how we care for the world."

In Norwalk, St. John the Apostle partnered with a parishioner's for-profit company, which, unlike untaxed churches, is able to take advantage of lucrative renewable energy tax credits. The company secured investors to pay for the upfront costs. It sells the energy generated back to the church at a rate lower than what the church would pay for power on the wider grid. Ludwig says the deal will save the church $2,000 per year in energy costs.

"We got the deal of the century," Ludwig said, "because we didn't have to pay anything for it."

'We cannot abandon what God has given us'

In June 2015, Pope Francis made international news after releasing his environmental encyclical Laudato Si', subtitled "On Care for Our Common Home." While citing the messages of previous popes, Francis captured the world's attention by criticizing climate change deniers and calling for sweeping changes in political action and personal behaviors to reduce emissions and better care for the environment.

"These situations have caused sister Earth, along with all the abandoned of our world, to cry out, pleading that we take another course," he wrote. "Never have we so hurt and mistreated our common home as we have in the last 200 years."

In May, Francis gave President Donald Trump, who has called climate change a "hoax," a copy of his 192-page document during the president's Vatican visit.

Pates says the Bible's Book of Genesis demands that Christians act as stewards of the Earth and "provide for those who come after us."

"We cannot put our head in the sand and say we're going to continue to have children but snuff out their existence," he said. "That is why the pope is so concerned."

While environmental activists often are viewed as a bloc within the political left, Pates said the church's embrace of the issue isn't meant to make a political stance. Yet it does highlight how the church's positions on abortion, immigration, poverty and the environment often split between the platforms of the partisan left and right.

"The pope regards this as a moral issue, not a political issue, especially given the consensus of scientists," he said. "The timeline involved is very serious. We can do something. We can change it."

Pates says the diocese won't force parishes to make changes, but it is encouraging them to explore solar installations and other green initiatives. He also hopes a soon-to-form task force will spur meaningful change for two of the area's largest Catholic institutions: Dowling Catholic High School and Mercy Medical Center.

Though Catholics, and all Christians, hold faith in the prospect of life after death, Pates said that outlook can't be used as a crutch to forsake the planet's air, land and waters.

"In the meanwhile, this is what we are entrusted with," he said. "So we cannot abandon what God has given us."

Solar panels can be seen on the roof Saturday, Aug. 12, 2017, at St. John the Apostle Catholic Church in Norwalk. Church leaders worked with church member Terry Dvorak, owner of Red Lion Renewables, to get the panels installed at no upfront cost to the church. (Photo: Michael Zamora/The Register)

'Our kids deserve to have clean air and sunshine'

After years in the construction business, Terry Dvorak suddenly got very interested in renewable energy in 2010 while traveling around heavily polluted cities in China.

In Shanghai, it felt like he was inhaling paint fumes, he said.

"That kind of started it for me," he said. "After that trip, it solidified that I needed to do something."

He thought of the countless children who live their whole lives without escaping that tainted environment.

"Our kids deserve to have clean air and sunshine," he said, "and so does the rest of the world."

He started Red Lion Renewables, a solar development firm that installs solar arrays on commercial and residential property as well as school, church and city buildings.

A member of St. John the Apostle in Norwalk, Dvorak approached church leaders about a possible church solar project. After securing approval from the parish priest and the bishop, Dvorak and his investors put up about $200,000 to purchase 206 panels. He said the parish only spent about $100 out of pocket for a legal review of its agreement with Red Lion.

The black panels, each about 40 inches by 70 inches, sit on sloped and flat spans of the church building's southern-facing roof.

Solar delivers the biggest economic benefit to homeowners and businesses who can offset a portion of the up-front investment with tax credits, Dvorak says. But that can leave school districts, churches and local governments with little incentive to go green.

By using a third-party company, investors can earn a healthy return, while the not-for-profit entity realizes immediate energy savings.

The 69 kW array in Norwalk is expected to produce enough energy to power about 10 typical homes, Dvorak says, while avoiding about 3 million pounds of carbon dioxide emissions.

Through a power-purchase agreement, the church purchases back the electricity the panels generate at a rate lower than what MidAmerican Energy would charge, Dvorak says. The estimated $2,000 annual savings will allow the church to invest more in its charitable work with groups like Habitat for Humanity or Meals from the Heartland.
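For readers who want to sanity-check those figures, here is a minimal back-of-the-envelope sketch in Python. The capacity factor, household consumption and the two electricity rates are assumptions chosen only for illustration; none of them appear in the article, and the actual terms of the power-purchase agreement are not public.

    # Rough sketch of the solar power-purchase-agreement arithmetic described above.
    # Every input below except the array size is an assumption for illustration.
    array_kw = 69.0               # nameplate capacity of the Norwalk array (from the article)
    capacity_factor = 0.17        # assumed annual capacity factor for a rooftop array in Iowa
    hours_per_year = 8760

    annual_kwh = array_kw * capacity_factor * hours_per_year   # roughly 103,000 kWh
    avg_home_kwh = 10_000         # assumed annual use of a typical home
    homes_powered = annual_kwh / avg_home_kwh                  # roughly 10 homes

    utility_rate = 0.11           # assumed $/kWh charged by the utility
    ppa_rate = 0.09               # assumed $/kWh the church pays under the agreement
    annual_savings = annual_kwh * (utility_rate - ppa_rate)    # roughly $2,000

    print(f"{annual_kwh:,.0f} kWh/year, about {homes_powered:.0f} homes, "
          f"saving roughly ${annual_savings:,.0f}/year")

With those assumed inputs the output and savings land close to the figures Dvorak cites; the point is the shape of the arithmetic, not a precise model of the deal.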

"Now they can indirectly save money right off the bat on their energy bills," he said, "use their limited budgets for their own mission and help the environment in the process."

The Norwalk-based company Red Lion Renewables installed solar panels on the roof of the St. John the Apostle Catholic Church in Norwalk. Red Lion CEO Terry Dvorak is a parishioner at the church. (Photo: Michael Rolands/Record-Herald)

'The church definitely needs to lead the community'

About a decade ago, church leaders started worrying about the aging heating system at First Lutheran Church in Decorah. The congregation spent about $22,000 each year to keep the lights, air conditioning and heat running in its 150-year-old building.

"We had a discussion that the money could be better spent on ministry and the church could set a better example with climate change," said Larry Grimstad, a member of the congregation's "green team."

After an energy audit, the church in 2010 added new insulation and invested hundreds of thousands of dollars in a new, efficient HVAC system.

Then, in 2014, the church unveiled a $35,000 solar installation. The 26 panels cover much of the southern portion of the roof, where sun exposure is greatest.

Most recently, in 2015, First Lutheran started changing out lights to more efficient LED fixtures.

The combined efforts reduced the church's energy costs by 38 percent and reduced its carbon footprint by 42 percent, he said. Now, annual utility bills run about $13,000.

But Grimstad said the church wants to reduce its carbon emissions even more. It's exploring a solar canopy project for the parking lot and Grimstad says he's interested in a community geothermal project.

"The real goal has to be zero," he said. "So that’s where we’re going."

While churches can reap economic benefits from energy efficiency efforts, Grimstad said faith groups have a moral calling to tackle the global threat of climate change.

"We have a responsibility to do this," he said. "The church definitely needs to lead the community to do this type of thing."

Breakthrough on nitrogen generation by researchers to impact climate-change research

Rob Csernyik, Published on: August 24, 2017 | Last Updated: August 24, 2017 1:33 PM MDT

New research has characterized an ammonia-oxidizing microbe called Nitrospira inopinata. The process used to make fertilizer, and its overapplication in agriculture, has huge environmental implications, researchers said. John Lucas / Postmedia

A research collaboration between the University of Alberta and the University of Vienna has yielded a discovery that could significantly impact climate-change research, said a professor of biology in Edmonton.

Currently the world's nitrogen cycle is unbalanced because humans are now responsible for adding more fixed nitrogen, through ammonium, to the environment than natural sources do, said researcher Lisa Stein of the University of Alberta. Excess ammonium has implications for the climate and environment, from dead zones in oceans to a greenhouse gas effect 300 times that of carbon dioxide on a molecule-to-molecule basis.

Stein and Dimitri Kits, a University of Alberta PhD grad now working at the University of Vienna, identified and characterized the ammonia-oxidizing microbe Nitrospira inopinata.

Stein said this bacterium is important because oxidizing ammonium previously required two different microbes, each carrying out part of the process. Nitrospira inopinata oxidizes ammonium all the way to nitrate on its own.

“This is the first time we’ve been able to demonstrate that it can with a single source,” she said. “If we can find a way to replace bad microbes with this one, it can help mitigate the problem.”

The process of making fertilizer, known as the Haber-Bosch process, adds massive quantities of fixed nitrogen to the environment. Stein also said the over-application of fertilizer causes a problem.

Stein said about 50 per cent of applied fertilizer isn’t delivered to the plant. Instead microbes start chewing on the fertilizer and release nitrous oxide into the environment. The new discovery could help mitigate this problem.

“The high efficiency of (Nitrospira inopinata) means it is likely producing less nitrous oxide,” she said.

There is still further research to be done, said Kits, who was the lead author on the paper, which was published in the scientific journal Nature.

“So far in this paper we’ve just isolated (Nitrospira inopinata) and characterized some of its basic physiology,” he said.

“The next step is going to be to measure the actual quantities of nitrous oxide that it makes (and) how it might influence the nitrogen cycle where it’s found.”

Stein said this will be done by getting together funding and researchers to measure how much nitrous oxide is produced under various conditions.

But Stein added that problems with the Earth's nitrogen cycle are not something humans can engineer themselves out of.

She said remedying the problem will involve changing behaviours, such as overfertilization in the agricultural industry.

“You can’t just change the microbes and expect the Earth to be all right, you have to stop the feeding.”

rcsernyik@postmedia.com

How 139 countries could be powered by 100 percent wind, water, and solar energy by 2050

Cell Press

The latest roadmap to a 100% renewable energy future from Stanford's Mark Z. Jacobson and 26 colleagues is the most specific global vision yet, outlining infrastructure changes that 139 countries can make to be entirely powered by wind, water, and sunlight by 2050 after electrification of all energy sectors. Such a transition could mean less worldwide energy consumption due to the efficiency of clean, renewable electricity; a net increase of over 24 million long-term jobs; a decrease of 4-7 million air pollution deaths per year; stabilization of energy prices; and annual savings of over $20 trillion in health and climate costs. The work appears August 23 in the journal Joule, Cell Press's new publication focused on sustainable energy.

The challenge of moving the world toward a low-carbon future in time to avoid exacerbating global warming and to create energy self-sufficient countries is one of the greatest of our time. The roadmaps developed by Jacobson's group provide one possible endpoint. For each of the 139 nations, they assess the raw renewable energy resources available to each country, the number of wind, water, and solar energy generators needed to be 80% renewable by 2030 and 100% by 2050, how much land and rooftop area these power sources would require (only around 1% of total available, with most of this open space between wind turbines that can be used for multiple purposes), and how this approach would reduce energy demand and cost compared with a business-as-usual scenario.

"Both individuals and governments can lead this change. Policymakers don't usually want to commit to doing something unless there is some reasonable science that can show it is possible, and that is what we are trying to do," says Jacobson, director of Stanford University's Atmosphere and Energy Program and co-founder of the Solutions Project, a U.S. non-profit educating the public and policymakers about a transition to 100% clean, renewable energy. "There are other scenarios. We are not saying that there is only one way we can do this, but having a scenario gives people direction."

The analyses specifically examined each country's electricity, transportation, heating/cooling, industrial, and agriculture/forestry/fishing sectors. Of the 139 countries--selected because data for them were publicly available from the International Energy Agency and because together they emit over 99% of all carbon dioxide worldwide--those with a greater share of land per person (e.g., the United States, China, the European Union) are projected to have the easiest time making the transition to 100% wind, water, and solar. The study also found that the most difficult places to transition may be highly populated, very small countries surrounded by lots of ocean, such as Singapore, which may require an investment in offshore solar to convert fully.

As a result of a transition, the roadmaps predict a number of collateral benefits. For example, by eliminating oil, gas, and uranium use, the energy associated with mining, transporting and refining these fuels is also eliminated, reducing international power demand by around 13%. Because electricity is more efficient than burning fossil fuels, demand should go down another 23%. The changes in infrastructure would also mean that countries wouldn't need to depend on one another for fossil fuels, reducing the frequency of international conflict over energy. Finally, communities currently living in energy deserts would have access to abundant clean, renewable power.
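To make that demand arithmetic concrete, here is a small sketch. The roughly 13% and 23% reductions come from the article; treating them as successive multiplicative cuts to a normalized baseline is an assumption made only for illustration.

    # Sketch of how the roadmap's two demand-reduction effects might combine.
    # The ~13% and ~23% figures are from the article; applying them multiplicatively
    # to a normalized baseline is an assumption for illustration.
    baseline = 1.00                                         # business-as-usual demand, normalized
    after_fuel_chain = baseline * (1 - 0.13)                # drop energy used to mine, move and refine fuels
    after_electrification = after_fuel_chain * (1 - 0.23)   # electricity beats combustion on efficiency

    print(f"Remaining demand: {after_electrification:.2f} of baseline "
          f"({1 - after_electrification:.0%} lower)")       # roughly a third lower overall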

"Aside from eliminating emissions and avoiding 1.5 degrees Celsius global warming and beginning the process of letting carbon dioxide drain from the Earth's atmosphere, transitioning eliminates 4-7 million air pollution deaths each year and creates over 24 million long-term, full-time jobs by these plans," Jacobson says. "What is different between this study and other studies that have proposed solutions is that we are trying to examine not only the climate benefits of reducing carbon but also the air pollution benefits, job benefits, and cost benefits"

The Joule paper is an expansion of 2015 roadmaps to transition each of the 50 United States to 100% clean, renewable energy (doi:10.1039/C5EE01283J) and an analysis of whether the electric grid can stay stable upon such a transition (doi: 10.1073/pnas.1510028112). Not only does this new study cover nearly the entire world, there are also improved calculations on the availability of rooftop solar energy, renewable energy resources, and jobs created versus lost.

The 100% clean, renewable energy goal has been criticized by some for focusing only on wind, water, and solar energy and excluding nuclear power, "clean coal," and biofuels. However, the researchers intentionally exclude nuclear power because of its 10-19 years between planning and operation, its high cost, and the acknowledged meltdown, weapons proliferation, and waste risks. "Clean coal" and biofuels are neglected because they both cause heavy air pollution, which Jacobson and coworkers are trying to eliminate, and emit over 50 times more carbon per unit of energy than wind, water, or solar power.

The 100% wind, water, solar studies have also been questioned for depending on some technologies, such as underground heat storage in rocks, that exist only in a few places, and on the proposed use of electric and hydrogen fuel cell aircraft, which exist only as small planes at this time. Jacobson counters that underground heat storage is not required but is certainly a viable option, since it is similar to district heating, which provides 60% of Denmark's heat. He also says that space shuttles and rockets have been propelled with hydrogen, and aircraft companies are now investing in electric airplanes. Wind, water, and solar can also face daily and seasonal fluctuation, making it possible that they could miss large demands for energy, but the new study refers to a new paper that suggests these stability concerns can be addressed in several ways.

These analyses have also been criticized for the massive investment it would take to move a country to the desired goal. Jacobson says that the overall cost to society (the energy, health, and climate cost) of the proposed system is one-fourth of that of the current fossil fuel system. In terms of upfront costs, most of these would be needed in any case to replace existing energy, and the rest is an investment that far more than pays itself off over time by nearly eliminating health and climate costs.

"It appears we can achieve the enormous social benefits of a zero-emission energy system at essentially no extra cost," says co-author Mark Delucchi, a research scientist at the Institute of Transportation Studies, University of California, Berkeley. "Our findings suggest that the benefits are so great that we should accelerate the transition to wind, water, and solar, as fast as possible, by retiring fossil-fuel systems early wherever we can."

"This paper helps push forward a conversation within and between the scientific, policy, and business communities about how to envision and plan for a decarbonized economy," writes Mark Dyson of Rocky Mountain Institute, in an accompanying preview of the paper. "The scientific community's growing body of work on global low-carbon energy transition pathways provides robust evidence that such a transition can be accomplished, and a growing understanding of the specific levers that need to be pulled to do so. Jacobson et al.'s present study provides sharper focus on one scenario, and refines a set of priorities for near-term action to enable it."

Technique could aid mass production of biodegradable plastic

University of Nebraska-Lincoln

Introducing a simple step to the production of plant-derived, biodegradable plastic could improve its properties while overcoming obstacles to manufacturing it commercially, says new research from the University of Nebraska-Lincoln and Jiangnan University.

That step? Bringing the heat.

Nebraska's Yiqi Yang and colleagues found that raising the temperature of bio-plastic fibers to several hundred degrees Fahrenheit, then slowly allowing them to cool, greatly improved the bio-plastic's normally lackluster resistance to heat and moisture.

The thermal approach also allowed the team to bypass solvents and other expensive, time-consuming techniques typically needed to manufacture a commercially viable bio-plastic, the study reported.

Yang said the approach could allow manufacturers of corn-derived plastic -- such as a Cargill plant in Blair, Nebraska -- to continuously produce the biodegradable material on a scale that at least approaches petroleum-based plastic, the industry standard. Recent research estimates that about 90 percent of U.S. plastic goes unrecycled.

"This clean technology makes possible (the) industrial-scale production of commercializable bio-based plastics," the authors reported.

The approach uses polylactic acid, or polylactide, a component of biodegradable plastic that can be fermented from corn starch, sugarcane and other plants. Though most plastics are made from petroleum, polylactide has emerged as an environmentally friendlier alternative.

Yet polylactide's susceptibility to heat and moisture, particularly during the manufacturing process, has limited its use in textiles and other industries. In searching for ways to address the issue, researchers long ago discovered that mixing mirror-image polylactide molecules -- generally referred to as "L" and "D" -- could yield stronger molecular interactions and better performance than using just the L or D alone.

But there was another catch. Convincing a reasonable proportion of the L and D molecules to permanently pair up is difficult, often forcing researchers to concoct costly and complicated matchmaking schemes. Some of the most common involve the use of solvents or other chemical agents whose disposal can cause environmental issues of their own.

"The problem is that people couldn't find a way to make it work so that you could use it on large scales," said Yang, Charles Bessey Professor of biological systems engineering and of textiles, merchandising and fashion design. "People use nasty solvent or other additives. But those are not good for continuous production.

"We don't want to dissolve the polymers and then try to evaporate the solvents, and then have to consider reusing them. That's just too expensive (and) not realistic."

Yang and his colleagues decided to pursue another approach. After mixing pellets of the L and D polylactide and spinning them into fibers, the team rapidly heated them to as hot as 400 degrees Fahrenheit.

The resulting bio-plastic resisted melting at temperatures more than 100 degrees higher than did plastics containing only the L or D molecules. It also maintained its structural integrity and tensile strength after being submersed in water at more than 250 degrees, approximating the conditions that bio-plastics must endure when being incorporated into dyed textiles.

The textile industry produces about 100 million tons of fibers annually, Yang said, meaning that a feasible green alternative to petroleum-based manufacturing could pay off both environmentally and financially.

"So we just used a cheap way that can be applied continuously, which is a big part of the equation," Yang said. "You have to be able to do it continuously in order to have large-scale production. Those are important factors."

Though the team has demonstrated continuous production on a smaller scale in Yang's lab, he said it will soon ramp up to further illustrate how the approach might be integrated into existing industrial processes.

The team's findings will be published in a November print edition of Chemical Engineering Journal. Yang authored the study with Helan Xu, a former Nebraska researcher now at Jiangnan University; Bingnan Mu, graduate student in textiles, merchandising and fashion design at Nebraska; and Jiangnan University's Gangwei Pan, Bomou Ma and Jing Yang.

The researchers received support from the U.S. Department of Agriculture's National Institute of Food and Agriculture, the Agricultural Research Division at the University of Nebraska-Lincoln, and the China Scholarship Council.

NY officials criticize Hudson River cleanup to EPA

ALBANY, N.Y. — The $1.7 billion Superfund cleanup of the Hudson River is not protecting the public’s health and the river as initially promised, New York’s environmental commissioner contended Wednesday.

Commissioner Basil Seggos criticized the six-year dredging project performed by General Electric Co. in a letter to Environmental Protection Agency Administrator Scott Pruitt released in the waning days of a crucial public comment period. Seggos was particularly scornful of an EPA assessment this summer that it could take 55 years or more before all species of fish in the river are clean enough to eat once a week.

“This is unacceptable,” Seggos wrote, echoing previous criticism from the Cuomo administration. “A remedy that will take generations to safeguard public health and the environment is clearly not protective.”

Boston-based GE removed 2.75 million cubic yards of polychlorinated biphenyl-contaminated sediment from a 40-mile stretch of the river north of Albany under an agreement with the EPA.

The EPA released a review of the work this summer that found, based on the data so far, the cleanup will protect human health and the environment in the long term. Critics pushing for a broader cleanup have noted that a large amount of PCB-contaminated sediment remains in the river.

Seggos said the state is nearing completion of its own sampling program to measure the true extent of contamination.

An EPA spokeswoman said Wednesday the agency would consider Seggos' comments along with all the others it is receiving.

The EPA is accepting public comment on its review through Friday.

GE spokesman Mark Behan questioned New York’s critical stance in an email that said PCB levels in the upper-Hudson water declined in some spots by as much as 73 percent in the first year after dredging.

“New York State approved and oversaw the dredging project and was instrumental in every major decision related to the project,” Behan wrote. “Its criticism flies in the face of the most up-to-date scientific data from the river itself.”

How an Illinois utility used wind, solar and storage to power a microgrid for 24 hours

Written By David J. Unger

A Midwest utility has taken a critical step closer toward a distributed, decentralized power grid.

Ameren, in partnership with S&C Electric, a Chicago-based smart-grid engineering firm, successfully completed a 24-hour “islanding” test earlier this month at the utility’s newly built microgrid in Champaign, Illinois, using wind, solar and battery storage.

“Islanding” refers to a microgrid — itself a network of power generation and consumption — temporarily disconnecting itself from the broader, regional power grid. Many analysts believe the power grid of the future will consist of these kinds of localized, networked microgrids over the centralized “hub-and-spoke” model that has existed for over a century. The ability to island is seen as a core function of microgrids.

Ameren's test, although limited in scale, offers a glimpse into the utility of the future. The microgrid is one of the few in the world to operate at utility-scale voltages — between 4 kilovolts and 34.5 kilovolts.

“This successful test provided tangible proof that the system can accomplish what it was designed to do,” Ron Pate, senior vice president of operations and technical services at Ameren Illinois, said in a press release. “The microgrid isn’t theoretical and our tests don’t need to be lab simulations. We were able to prove that this technology works and can provide key benefits to our customers.”

Ameren’s microgrid lies adjacent to the campus of University of Illinois Urbana-Champaign and consists of a 100-kilowatt (kW) wind turbine, a 125-kW solar array, a 250-kW battery and two 500-kW natural gas generators. While the entire system is capable of supplying regular Ameren customers, this month’s islanding test focused on the 50-kW segment of the microgrid that serves an onsite Ameren-owned research facility.

Notably, the microgrid was able to complete the 24-hour test without relying on fossil-fuel generation, making it a 100 percent-renewable islanding event.

A chart released by S&C Electric visualizes how wind, solar and battery storage worked in concert to meet the research facility’s energy demands throughout the 24-hour period.

The test began at 8 a.m. with a 97 percent charge on the battery. For about two hours, the microgrid ran on stored energy alone. Once the battery levels hit 90 percent at around 10 a.m., the system automatically dispatched its wind turbine and solar array. The renewable assets took over powering the load while using excess energy to recharge the partially depleted battery.

In less than an hour, the battery returned to a 97 percent charge, enabling the system to dial down the wind and solar and resume running on stored energy alone. This cycling pattern continued, according to S&C’s data, until about 7 p.m. With the sun setting, the system switched off the solar inverter, leaving only the wind turbine to carry the load through the night.

Strong winds kept the microgrid powered throughout the night — keeping battery levels relatively stable — and even adding charge when wind speeds peaked. S&C Electric's PureWave SMS-250 Storage Management System helped ensure voltage remained stable throughout the process. Starting at about 7 a.m. the next day, the rising sun brought solar back online until the test's completion at 8 a.m., when the microgrid reconnected to the regional grid without a hitch.

Throughout it all, the battery's state of charge never dropped below 88 percent, according to S&C. What's more, the system's automated control system managed these shifts along the way, without the need for human controllers to make adjustments. There's still a long way to go before utilities are relying on microgrids to power whole cities or even neighborhoods, but the test shows how wind, solar, storage and automated, intelligent controls can complement one another to deliver local, consistent, reliable, 100 percent renewable power.
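The control behavior described above (run on the battery until its state of charge falls to a threshold, dispatch wind and solar to carry the load and recharge, then idle them again once the battery recovers) can be pictured with a short, hypothetical rule-based loop. The thresholds, time step and usable battery energy below are assumptions for illustration only; this is not S&C Electric's actual storage-management logic.

    # Hypothetical sketch of the battery-first dispatch behavior described above.
    # Thresholds and the battery's usable energy are assumed; not S&C's real control code.
    DISPATCH_SOC = 0.90   # bring renewables online when state of charge falls to 90%
    RESTORE_SOC = 0.97    # idle them again once the battery is back near full

    def step(soc, load_kwh, wind_kwh, solar_kwh, renewables_on, battery_kwh=500.0):
        """Advance one interval; return the new state of charge and dispatch flag."""
        if not renewables_on and soc <= DISPATCH_SOC:
            renewables_on = True        # renewables carry the load and recharge the battery
        elif renewables_on and soc >= RESTORE_SOC:
            renewables_on = False       # charge restored; run on stored energy alone
        generation = (wind_kwh + solar_kwh) if renewables_on else 0.0
        net = generation - load_kwh     # surplus charges the battery, deficit drains it
        soc = min(1.0, max(0.0, soc + net / battery_kwh))
        return soc, renewables_on

    # Example: a steady 50 kW load over one-hour intervals, starting at a 97% charge.
    soc, on = 0.97, False
    for hour in range(24):
        wind, solar = 40.0, (30.0 if 8 <= hour <= 18 else 0.0)   # crude day/night solar profile
        soc, on = step(soc, load_kwh=50.0, wind_kwh=wind, solar_kwh=solar, renewables_on=on)
        print(f"hour {hour:2d}: soc={soc:.2f}, renewables_on={on}")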

“This system is designed to be completely automated,” says Chris Evanich, S&C’s applications director of microgrids. The storage management system decides which generation sources to use and when they should go offline if they’re not needed, all to “keep everything balanced and up and running so that the customer doesn’t even realize they’re running on 100 percent renewables.”

Illinois is becoming something of a hub for microgrid innovation. The Illinois Institute of Technology, a private research university on Chicago’s South Side, was an early pioneer in the technology, launching an $18.5 million plan in 2008 to build “the world’s first self-healing and efficient smart microgrid distribution system.” Today, the university manages an islandable, 9-megawatt network of gas turbines, large-scale batteries, a wind turbine and various smart-efficiency technologies.

ComEd, Illinois’ largest utility, is planning to develop its own microgrid in Chicago’s Bronzeville neighborhood, adjacent and connected to the Institute of Technology. The utility has received more than $5 million in grants from the U.S. Department of Energy to help complete the microgrid. ComEd says the project would be the “most advanced clustered urban microgrid in the United States,” providing the neighborhood with an added layer of reliability, cybersecurity and resiliency in the face of broader grid disruptions.

Late last month, ComEd submitted its official proposal for the cluster to the Illinois Commerce Commission, which must approve the plan.

“The electric industry in Illinois, and around the world, is changing rapidly,” wrote Joseph Svachula, ComEd’s vice president of smart grid and technology in testimony to the Illinois Commerce Commission. “This ‘clustering’ of microgrids is an important step in the journey to a more decentralized grid because it enables us to understand how microgrids interface, and how resources from neighboring microgrids can be used to keep power flowing to customers.”

Harvey Could Reshape How and Where Americans Build Homes

By Christopher Flavelle, for Bloomberg, August 30, 2017

Storm comes as U.S. flood insurance program is up for renewal

Texas has one of the most relaxed approaches to building codes

Jerry Garcia’s home in Corpus Christi missed the worst of Hurricane Harvey by just a few miles and lost nothing more than some shingles and his backyard pier, which turned up further down Oso Bay. A 5-foot bulkhead and sloping lawn shielded it from the flooding that’s devastated parts of Texas.

A home builder, Garcia said he built this place, like all the houses he builds, "above code" -- stronger than the standards required by law, which in Texas tend to be weaker than in most states. But he doesn't think those tougher standards should be imposed on every builder.

"You’ve got to find that medium, to build affordable housing," Garcia, 65, said sitting in his house as Harvey’s rains continued to pound the coast further north. Tougher mandatory codes mean higher costs, which means fewer homes. And if insurers had their way, he added, every home would be "a box with two doors and no windows."

Hurricane Harvey has highlighted a climate debate that had mostly stayed out of public view -- a debate that's separate from the battle over greenhouse gas emissions, but more consequential to the lives of many Americans. At the core of that fight is whether the U.S. should respond to the growing threat of extreme weather by changing how, and even where, homes are built.

That debate pits insurers, who favor tighter building codes and fewer homes in vulnerable locations, against homebuilders and developers, who want to keep homes as inexpensive as possible. As the costs of extreme weather increase, that fight has spilled over into politics: Federal budget hawks want local policies that will reduce the cost of disasters, while many state and local officials worry about the lost tax revenue that might accompany tighter restrictions on development.

Harvey slammed ashore in Texas Friday with historic levels of rain and flooding. On Wednesday, the storm returned, making landfall a second time in southwestern Louisiana, which was devastated by Hurricane Katrina in 2005. At least 15 people have died so far in Texas, according to a count by the Austin American-Statesman newspaper. Thousands more have been displaced from their homes. Early estimates on damages range from $42 billion to more than $100 billion.

Contributing to the high losses is the fact that Texas, despite being one of the states most vulnerable to storms, has one of the most relaxed approaches to building codes, inspections, and other protections. It’s one of just four states along the Gulf and Atlantic coasts with no mandatory statewide building codes, according to the Insurance Institute for Business & Home Safety, and it has no statewide program to license building officials.


Texas policies leave home-building decisions to cities, whose record is mixed: Corpus Christi uses codes that reflect national standards, minus the requirement that homes be built one foot above expected 100-year-flood levels, according to the Federal Alliance for Safe Homes. Nueces County, which encompasses Corpus Christi, has no residential building code whatsoever.

The consequence of loose or non-existent codes is that storm damage is often worse than it needs to be. "Disasters don't have to be devastating," said Eleanor Kitzman, who was Texas's state insurance commissioner from 2011 to 2013 and now runs a company in South Carolina that constructs and finances coastal homes built above code. "We can't prevent the event, but we can mitigate the damage."

Any proposal that would increase costs in Texas draws pushback from home builders, a powerful group in a state where people despise red tape about as much as they hate taxes.

"They are not big on regulation," said Julie Rochman, chief executive officer of the insurance institute. That skepticism about government was on display in 2013, when the state’s two senators voted against additional federal funding to clean up after Superstorm Sandy. But it can be applied selectively: Governor Greg Abbott requested federal money for Hurricane Harvey before it even made landfall.

Building codes elicit little support in Austin. At the end of this year’s state legislative session, the Texas Association of Builders posted a document boasting of its success at killing legislation it didn’t like. That included a bill that would have let cities require residential fire sprinklers, and another that would have given counties with 100,000 people or more authority over zoning, land use and oversight of building standards--something the builders’ group called “onerous.”

Ned Muñoz, vice president of regulatory affairs for the Texas Association of Builders, said cities already do a good job choosing which parts of the building code are right for them. And he argued that people who live outside of cities don’t want the higher prices that come with land-use regulations.

Muñoz said his association’s target is "unnecessary and burdensome government regulations, which increase the price of a home."

The fight in Texas is a microcosm of a national battle. The International Code Council, a Washington nonprofit made up of government officials and industry representatives, updates its model codes every three years, inviting state and local governments to adopt them. Last year, the National Association of Home Builders boasted of its prowess at stopping the 2018 codes it didn’t like.

"Only 6 percent of the proposals that NAHB opposed made it through the committee hearings intact," the association wrote on its blog. Some of the new codes that the home builders blocked had been proposed by the Federal Emergency Management Agency -- the same agency that’s on the hook when homes collapse, flood or wash away. And when FEMA is on the hook, it’s really the taxpayer.

Ron Jones, an NAHB board member and builder in Colorado who is critical of the organization’s views on codes and regulations, said that while the focus now should be on helping people hurt by Harvey, he hoped the storm would also force new thinking.

"There’s no sort of national leadership involved," said Jones. "For them it’s just, ’Hell, we’ll rebuild these houses as many times as you’ll pay us to do it.’"

The home builders demonstrated their power again this year, when President Donald Trump reversed an Obama administration initiative restricting federally funded building projects in flood plains. "This is a huge victory for NAHB and its members," the association wrote on its blog.

Yet on other issues, Trump’s administration appears to be siding with those who favor tougher local policies. In an interview just before Harvey, FEMA chief Brock Long expressed support for an Obama administration proposal to spur more local action on resilience, such as better building codes, as a condition of getting first-dollar disaster relief from Washington.

"I don’t think the taxpayer should reward risk," Long told Bloomberg in the interview, four days before Harvey slammed into Texas.

It may seem surprising that a Republican administration would side with its Democratic predecessor on anything, especially something related to climate change. But the prompt is less ideological than practical. Over the past decade, the federal government spent more than $350 billion on disaster recovery, a figure that will almost certainly increase without major changes in local building codes and land use practices.

And much of that money goes to homes that keep getting hit. That’s true for the National Flood Insurance Program, which Congress must reauthorize by the end of next month; some lawmakers, and Long himself, have said homes that repeatedly flood should be excluded from coverage. But there are also 1.3 million households that have applied for federal disaster assistance money at least twice since 1998 -- many of them in the same areas hit hardest by Hurricane Harvey.

In his interview, Long said his focus as FEMA’s administrator will be working with cities and states to ensure that areas hit by disasters get rebuilt differently, in a way that’s more resilient.

Some state lawmakers acknowledge the need to at least consider doing things differently. Todd Hunter, who represents this part of the coast in the Texas House of Representatives, said Harvey will ignite a conversation about the need for statewide building codes.

"The discussion needs to start," Hunter said in an interview at his law office overlooking downtown Corpus Christi, where many of the buildings visible out his window were still without power. And that discussion could go beyond codes: "We need to take a look at where structures are being built."

Others are less optimistic. If people living along the Texas coast had to pay the full cost of the risk they face, some parts of that coast might empty out. That’s what worries Albert Betts Jr, executive director of the Texas Insurers Council, the trade association representing insurance companies. And it’s why he thinks Hurricane Harvey won’t shift public policy.

It’s not that Betts doesn’t like a fight. At his office off a freeway in Austin, his desk is adorned with a hammer on one side and a miniature punching bag on the other. But Betts said the price of real change could be too high.

"I can’t sit here and tell you, as a Texan, that I don’t want that area developed," Betts said. "Smarter people than me have yet to figure that out."

— With assistance by Peter Coy

Love of Coastal Living Is Draining U.S. Disaster Funds

Welcome to Alabama’s Dauphin Island, where it floods all the time.

By Christopher Flavelle, August 31, 2017

Dauphin Island has one of the highest concentrations of chronically flooded homes in the U.S. Photographer: Nicole Craine/Bloomberg

About 400 miles east of Houston, off the coast of Alabama, Dauphin Island was spared the initial impact of Hurricane Harvey. Yet this tiny sliver of land near the mouth of Mobile Bay is just as important as the battered metropolis to the debate over American disaster policy. That’s because the 14-mile-long island, home to only about 1,300 people, floods again and again, hurricane or no hurricane. And again and again, those homes are repaired and rebuilt, largely at the expense of U.S. taxpayers.

The homes on Dauphin Island are among the 5 million that are covered by the National Flood Insurance Program. Founded in 1968 to make sure homeowners in flood-prone areas could get affordable insurance, the program ends up paying most residential flood insurance claims in the U.S. Partly as a result, development along coasts and riverbanks and in flood plains has exploded over the past 50 years. So have claims for flood damages. The NFIP is now about $25 billion in debt.

On Sept. 30, the program is also set to expire. Some members of Congress would like to use a reauthorization vote to enact reforms, including perhaps kicking the most flood-exposed homes off the rolls. Given the pressure to deliver disaster relief quickly post-Harvey, it’s improbable that any initial agreement to extend the NFIP past Sept. 30 will lead to significant reforms. More likely, a short-term extension may fund it through the end of the year. But the problem won’t go away. And the debate is just getting started.

The issues surrounding the NFIP go beyond just insurance and straight to the costs of climate change—specifically, whether the government will concede that the most vulnerable places simply can’t be protected. While hurricanes contribute greatly to costs, putting a sudden spotlight on the insurance issue, it’s the chronic flooding that happens away from the public eye, in places such as Dauphin Island, that slowly drains the NFIP. The island has one of the country’s highest concentrations of houses that the Federal Emergency Management Agency calls “severe repetitive loss”—those that flood most often. The owners of those 84 properties have gotten almost $17 million since 1978, an average of $199,244 each.

While many areas that depend on flood insurance are poor, the overwhelming majority of buildings on Dauphin’s most vulnerable beaches are vacation homes and rentals. When Texas Republican Jeb Hensarling, chairman of the House Financial Services Committee, argued this year against single mothers being “forced to pay for insurance for some millionaire’s beachfront vacation home,” he could’ve been talking about this town.

Federal payments to places such as Dauphin Island can seem like a reward for stubbornness. In 1979, Hurricane Frederic destroyed the only bridge to the island; in 2004 and 2005, Ivan and Katrina wiped out about 500 homes. Federal funds replaced the bridge and the houses. But officials on the island argue they’ve been left behind, getting just enough money to limp along but not enough to survive the next big storm. After Sandy, Washington agreed to spend almost $1 billion to fortify a single stretch of the New Jersey shore with dunes. Jeff Collier, mayor of Dauphin Island, would like the same treatment. “Nobody’s ever come in and forced sand down my throat,” he says. “I wish they would. Why are we more expendable than anybody else?”

As climate change worsens, it’s getting harder for Congress to pretend there’s enough money to save everyone. John Kilcullen, Mobile County’s director of plans and operations for emergency management, says he thinks the best way to stop history from repeating itself on Dauphin is for the federal government to buy houses and tear them down, especially along the island’s low-lying western section—the same stretch that boasts the biggest, most expensive properties. He’s pessimistic about how many of those homeowners would be willing to sell. “People don’t like being told they can’t live somewhere,” Kilcullen says. “I don’t think it’s politically feasible.”

Explosive costs have a way of changing people’s minds. The U.S. spent more than $350 billion over the past decade responding to natural disasters, yet FEMA budgeted only $90 million this year to prepare communities for those catastrophes. “These storms are breaking the back of FEMA,” says Roberta Swann, director of the Mobile Bay National Estuary Program. “If FEMA starts to say, ‘If a hurricane comes and takes your house, we’re done’—that is going to change the face of coastal communities.”

And yet, if the U.S. wants to stop racking up taxpayer-funded losses, those communities do have to change. The NFIP paid out about $16.3 billion after Katrina in 2005 and $8.4 billion for Sandy in 2012, according to the Insurance Information Institute. That same year, Congress passed a law that overhauled the program, making it significantly more expensive to insure a home in a flood-prone area. The law, known as the Biggert-Waters Flood Insurance Reform Act of 2012, was the product of a rare alliance between environmentalists and Tea Party activists. After it went into effect, some coastal homeowners saw their annual flood insurance premiums spike to $20,000 or more from as low as $600. Two years later, as opposition grew, Congress rolled back most of the reforms.

“Nobody’s ever come in and forced sand down my throat,” says Dauphin Island Mayor Jeff Collier. “I wish they would. Why are we more expendable than anybody else?” Photographer: Nicole Craine/Bloomberg

Last year the Natural Resources Defense Council won a lawsuit seeking to uncover how many homes FEMA has designated severe repetitive loss. The data the agency was forced to release showed that about 30,000 properties had cost taxpayers $5.5 billion since 1978. “Repeatedly flooded properties cost the nation billions in avoidable losses and repeated trauma to the families that live there,” says Rob Moore, a water-policy expert at the council. “We should be offering assistance to move to higher ground before they run out of options, not after.”

In 2016, FEMA bought 960 severe repetitive loss homes to tear down; at that rate, it would take until midcentury to buy all 30,000. By then, as many as 265 towns and cities will suffer what the Union of Concerned Scientists in a July report called “chronic inundation”—flooding that covers at least 10 percent of their land an average of twice a month. “The human inclination to stay in place, and to resist change at all costs, is pretty strong,” says Erika Spanger-Siegfried, an analyst at the Union of Concerned Scientists and a lead author of the report. “It will catch up with those locations,” she says, “and they will need to move eventually—but at a greater cost.”
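The "until midcentury" claim is straightforward arithmetic, sketched below using the 960-homes-per-year pace and the 30,000-property total given above; it assumes, for simplicity, that no new properties join the severe-repetitive-loss list in the meantime.

    # Rough check of the buyout-pace arithmetic: 30,000 properties at 960 buyouts per year.
    properties = 30_000
    buyouts_per_year = 960
    years_needed = properties / buyouts_per_year              # about 31 years
    print(f"{years_needed:.0f} years -> around {2016 + round(years_needed)}")  # ~2047, i.e. midcentury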

On Dauphin Island, beachfront homes are still going up. Yet some residents may be ready to leave. After Katrina, 35 homeowners applied for federal buyouts, says Mayor Collier, but none were approved. If the government doesn’t either buy out homes or fortify the beach, it will just cost more when the next hurricane hits, Collier argues, while standing amid homes under construction. Bobcat loaders zoom by, returning sand blown off the ever-shrinking beach. “Pay me now,” he says, “or pay me later.”

BOTTOM LINE - While big storms grab headlines, chronic flooding in Dauphin Island and places like it is slowly draining the NFIP dry, forcing hard questions about where we live.

Now Comes the Uncomfortable Question: Who Gets to Rebuild After Harvey?

Kate Aronoff for Bloomberg, August 30, 2017, 1:50 p.m.

In the decade-plus since Hurricane Katrina, Houston has been debating what it needs to protect itself from a catastrophic weather event. While that conversation has gone on, the city did what most cities do: carried on business as usual, constructing more than 7,000 residential buildings directly in Harris County’s Federal Emergency Management Agency-designated flood plain since 2010.

That kind of construction spree, which was detailed in a prescient investigation by ProPublica and the Texas Tribune last year, is not just head-slapping in hindsight; it actually produces its own greater risks, as it means paving over the kinds of wetlands that can help buffer against extreme flooding events.

But from a financial perspective, there was a logic behind those 7,000 buildings: each is eligible for subsidized flood protection through the National Flood Insurance Program, which happens to expire at the end of September, meaning debate over its reauthorization will compete for time with everything else on a packed fall congressional calendar.

“Harvey should be a wake-up call” for deep flood insurance reform, says Rob Moore, a senior policy analyst with the Natural Resources Defense Council’s water team. “One of the biggest shortcomings is that the program has always been intended first and foremost to be a rebuilding program.”

While environmental groups, including the Environmental Defense Fund and the Natural Resources Defense Council, contend that the NFIP encourages irresponsible forms of coastal development, they have odd company in a coalition fighting for the program’s reform – insurance companies and free market groups that just want the government out.

Hurricane Harvey is likely to focus attention on questions the federal government and the American public have in large part worked hard to avoid, because they don’t cut cleanly along ideological or partisan lines. And because they’re not much fun to think about. What does home ownership look like in an age of climate change? When is it OK to rebuild, and when is it time to retreat?

The 30 Texas counties declared disaster areas by Gov. Greg Abbott are home to nearly 450,000 policies underwritten by the National Flood Insurance Program, a program administered by FEMA. The NFIP was $24.6 billion in debt before Harvey made landfall, and can borrow just $5.8 billion more from the Treasury before the program’s current round of congressional reauthorization expires. Waters are still rising, and it will be several days or even weeks before damage totals are assessed, but early insurance industry estimates say the figure could match costs incurred from Hurricane Katrina, the most expensive disaster in U.S. history. Homeowners hit by the storm will now enter into a months- and potentially years-long claims fulfillment process, attempting to rebuild what they lost.

As lawmakers return to Washington early next month to determine the future of the beleaguered agency, there are bigger questions on the table than how to make the NFIP financially solvent. Both sides of the aisle agree that the program is unsustainable as is; FEMA is predicting that the NFIP’s rolls could increase by 130 percent by century’s end. Now — at what might be the worst possible moment to do so — Congress is essentially tasked with deciding who gets to live on the country’s most vulnerable coastlines.

When it was founded in 1968, the NFIP set out to fill a gap left by the private market. Unlike most other kinds of insurance, which protect policyholders against the costs of things like car accidents and medical expenses, the payouts for floods come all at once and in huge numbers, meaning many insurers lack the cash on hand necessary to keep up with an influx of claims. While NFIP policies are underwritten by the federal government, they’re administered by private insurers, who earn a percentage on the policies they write. Homeowners in flood plains with federally backed mortgages are required to purchase flood insurance, though anyone can purchase an NFIP policy. For those required to take out policies, failure to pay can result in foreclosure.

Starting with hurricanes Katrina and Rita in 2005, the program has been hit with a barrage of massive payouts that have sunk it deeper and deeper into the red. Flooding in the Midwest compounded that debt in 2008, followed soon after by Hurricane Ike. Sandy tacked on another $8 billion in 2012, and there have been several major flooding events since.

Of course, the NFIP is just one program in a web of agencies dealing with disaster mitigation and response, ranging from FEMA’s emergency relief operations to the Department of Housing and Urban Development, which administers block grants to help residential buildings recover from destructive storms. Beyond the fact that its services are relatively difficult for homeowners to access — requiring a special kind of patience and finesse to navigate layers of bureaucratic red tape — the NFIP’s benefits aren’t immediately recognizable just after a storm hits. While few question how much FEMA is spending to respond to a storm like Harvey, the fact that the NFIP’s benefits play out quietly and over the longer term makes it easier to point to its debt overhang as an example of funds poorly spent.

The key tension now plaguing the NFIP is that it has always had to balance two often competing goals. “It’s really a conflict between the actuarial tendency of the program and a general desire among policymakers that the prices not be too high,” says Rebecca Elliott, a London School of Economics sociologist who researches the program. Severe Repetitive Loss Properties — those that have been consistently damaged and rebuilt — account for just 1 percent of policies but as much as 30 percent of the funds paid out in claims. Many of the residential properties that carry NFIP policies have also been “grandfathered” in, meaning that policyholders can keep paying existing rates as their flood risk increases.

Among the program’s biggest problems is just how high-risk its ratepayers are. In the case of health care, for instance, insurance operates on the basic principle that healthy policyholders with cheaper coverage should help pay the way for those who are sicker and more expensive. Referencing that system, Elliott says, “Right now we have the sickest people in the pool. The people who are required to have insurance are only those in the high-risk flood zones. We know that’s not a great recipe for insurance pools.”

Yet whereas most kinds of public assistance in America benefit poor and working-class people, what’s saved the NFIP from outright attacks like the ones visited on Medicaid has been the fact that its beneficiaries also include the 1 percent and their waterfront homes. Consequently, NFIP reauthorization has always tended to be a rare spot of bipartisan alignment in Congress. “Democrats and Republicans both live in floodplains,” Elliott explains. If anything, the splits are regional, with representatives from flood-prone and politically divergent states like New York, Louisiana, and California coming together to keep rates artificially low.

In part because of who the NFIP serves, reform efforts have proven controversial. The last major overhaul of the program was passed into law just before Hurricane Sandy struck New York and New Jersey in 2012. That legislation, Biggert-Waters, sailed through Congress uncontroversially, and would have brought NFIP policies up to market rate on a rapid timeline and eliminated the program’s grandfather clause.

Sanitation workers throw out debris from a flood-damaged home in Oakwood Beach in Staten Island, N.Y., on Feb. 5, 2013. Photo: Spencer Platt/Getty Images

Post-Sandy, those measures got much less popular. “Families who had their lives destroyed were going to rebuild and finding out that … if they were going to rebuild their homes, they might not be able to afford to insure them,” Elliott told me. Deductibles would have jumped by 25 percent per year. Blowback was so intense that one of the bill’s sponsors, Maxine Waters, D-Calif., became a leading voice in the successful push against its implementation. The result of that fracas was the Homeowners Flood Insurance Affordability Act in 2014, which reintroduced grandfathering and allowed rates to rise more gradually. “It appeased the protest for the time being, but the rates are still going up,” Elliott says. “All of the things that homeowners were upset about are still taking place, just on a slower timeline.”

One thing that bill also lifted from Biggert-Waters was a study of how those rate hikes would impact low-income owners. “HFIAA was a recommitment to the idea that the NFIP was now going to be in the business of adjudicating who truly needed support to pay for their flood insurance. Who was getting a good deal but didn’t actually need it? It’s basically a way of introducing means tests to a program that didn’t have them before,” Elliott adds.

The larger balance of forces surrounding flood insurance reform might be among the most novel that American politics has to offer. Pushing against an overhaul are major developers and real estate interests, like the National Association of Realtors. One of the biggest voices pushing for it has been a coalition called Smarter Safer, an active proponent of Biggert-Waters that encompasses everyone from big environmental groups to insurance companies to conservative free market think tanks. For different reasons, all of them want to see rates that are more actuarially sound and less heavily subsidized by taxpayers.

“We’re concerned about the cost to taxpayers,” says Steve Ellis, vice president of the group Taxpayers for Common Sense, who’s worked on multiple rounds of NFIP reauthorization. He says that for many, Biggert-Waters was “too much too fast,” as well as a case of bad timing. Ultimately, though, he hopes the reforms it outlined come to pass. “We want to see a program that charges closer to risk-based rates and sheds as much of the risk off the program and into the private sector,” with the private sector playing an increasingly large role in underwriting policies. Eventually, Ellis hopes the NFIP will become a flood insurer of last resort.

Other Smarter Safer members are more gung-ho. Groups like the Heartland Institute spinoff R Street, a free market think tank, see the current structure of the NFIP as an expensive entitlement and want to get the government out of the flood insurance business entirely. Among the other voices calling on the federal government to shoulder less of the cost is current FEMA Administrator Brock Long, who said just before Harvey made landfall that he doesn’t “think the taxpayer should reward risk going forward.”

How exactly Harvey will impact the NFIP’s reauthorization remains to be seen, though even before the storm the chances for wholesale overhaul were slim. Two fledgling proposals in the Senate — one from Sens. Bill Cassidy, R-La., Kirsten Gillibrand, D-N.Y., and Shelley Moore Capito, R-W.Va., and another from Sen. Bob Menendez, D-N.J., with several co-sponsors — would each extend the reauthorization beyond five years, as well as increase oversight of the private insurers who write FEMA-backed policies. If passed, the Cassidy-Gillibrand bill would “[remove] barriers to privatization,” according to draft language, by allowing the companies that currently administer the policies to start underwriting them as well. Another proposal, passed through the House Financial Services Committee and praised by Smarter Safer, would also see the private sector play a larger role.

While few people are opposed outright to giving the private sector a bigger role to play, experts are leery of viewing privatization as a solution to what ails the NFIP. For one, private insurers’ involvement thus far has been hugely problematic. A report released last year by New York Attorney General Eric Schneiderman accused FEMA-contracted policy writers of widespread fraud, noting that a “lack of transparency and accountability can and does lead to inflated costs for services.” An earlier investigation by NPR found that private insurers made $400 million in Sandy’s aftermath, and that nearly a third of all NFIP funds spent between 2011 and 2014 went directly to private insurers instead of to homeowners. The more structural danger of privatization is that insurance companies could simply skim off the most profitable rates, leaving the NFIP to pick up the most costly policies. In health care terms, then, privatization could render an already sick pool even sicker.

The even bigger policy question is whether higher and more competitive rates will actually discourage people from living along high-risk coastlines, or just leave the shore open only to those wealthy homeowners and developers who can afford higher rates and round after round of rebuilding. President Donald Trump also repealed an Obama-era rule requiring federally funded construction in flood-prone areas to be built to stricter flood-protection standards, so there’s no guarantee that new shorefront structures will be able to withstand future damage. The result of higher rates, Elliott predicts, “is the socioeconomic transformation along with the physical transformation of the coastlines.”

Of course, the elephant wading through the flood is the fact that there are now millions of people living in areas that shouldn’t be inhabited at all, no matter the cost. “There’s the uncertainty of living at risk,” Elliott says, “and there’s the uncertainty of what it means to stay in your community when in the near to medium term, it’s going to become more expensive for you to do so — and in the long term, physically impossible.”

People are stranded on a roof due to flood waters from Hurricane Katrina in New Orleans, on Aug. 30, 2005. Photo: Vincent Laforet/Pool/AFP/Getty Images

Virtually no one thinks that the program’s rate structure is politically sustainable as is, and transformational changes to American coastlines will happen regardless of what happens to the NFIP. Beyond destructive storms becoming more common as a result of climate change, the sea level rise likely to happen over the next century will make several American cities more vulnerable to everyday flooding. Miami Beach today floods on a sunny afternoon. By 2100, for instance, Savannah, Georgia, could face two crippling floods per month. Along the same timeline, the Bronx may become the only New York City borough spared from a similar form of chronic inundation, a term coined by the Union of Concerned Scientists to describe persistent flood risk. In all, sea level rise could force as many as 13 million Americans to relocate by the end of the century.

What’s now up for debate is when and how the retreat happens, and on whose terms. The NRDC and others are advocating to incorporate buyout programs more fully into the NFIP, allowing homeowners in persistent-risk properties to sell their homes back to the government at pre-flood prices, after which point the land is returned to wetlands, waterfront parks, or some other kind of absorbent public space. “The temptation is to always try to identify some one-size-fits-all solution,” Moore says. “We’re at a point where we know that’s not the case for these issues. For NRDC, the real emphasis needs to be on how we help low- and middle-income homeowners live in a place that is far from flooding.”

There were some 40,000 buyouts of high-risk properties between 1993 and 2011, mostly for homes along rivers in inland states. When people receive relocation funds, though, it tends to be through state or locally run pilot projects, and there’s not yet enough data to determine how effective these programs have been in the long run. So while there is some precedent in the U.S. for helping coastal residents move away from high-risk flood zones, there is still no federal plan to make moving a realistic possibility for low- and middle-income homeowners. Examples of how to do it well are few and far between.

What MIT postdoctoral fellow Liz Koslov has found in her research into planned retreat is that there are plenty of communities ready and willing to move — at least under the right circumstances. “I was shocked about how positive they were about the process,” she says of the people she met doing field work on Staten Island, where two neighborhoods successfully lobbied Albany for relocation funds. “I expected a lot more negative criticisms, but people talked about it as a process they found very empowering. It started as a bottom-up effort, without any sort of government relocation plan. … It really helped that people were hearing about it from their neighbors, in community meetings.”

Still, buyouts are no quick fix, and scaling them up at the national level would take careful planning. “From the federal and state governments’ point of view, buyouts are great. They save a huge amount of money, absorb water, and can keep nearby properties safer,” Koslov says. At the local level, though, relocation plans run the risk of bringing “all the cost and none of the benefits.” Especially for cities without a tax base as big as New York City’s, relocation can erode funds for things like schools and public services, and potentially even hollow out whole communities. That’s why New York state’s post-Sandy relocation plan included a 5 percent bonus for homeowners who chose to relocate within the county. Blue Acres, a similar post-Sandy buyout program in New Jersey, has faced hostility from towns wary of a population drain.

Even those communities that do want to relocate face serious barriers at the federal level. FEMA currently requires a disaster to have been declared in order for the agency to release relocation funds. That’s why New York and New Jersey were each able to launch relatively successful buyout programs post-Sandy, but why Alaska — where tribal nations’ land is under rapid threat from erosion — has been unable to do the same despite years of organizing.

The Staten Island neighborhoods that ended up being relocated also only received their buyouts after “months of grassroots lobbying and protests and petitions and an enormous amount of collective effort,” Koslov says. In both neighborhoods, she adds, the main populations were white, middle-class, and home-owning public sector employees (cops, firefighters, postal workers) who had retired but were young enough to use their time actively — ”the last people you would expect to collectively organize for climate change adaptation,” Koslov says. Both areas voted overwhelmingly for Trump. Aside from some obvious concerns for renters — whose landlords would have to be bought out — it remains to be seen whether working-class neighborhoods of color — like many of those likely to be worst hit by flooding — would see the same results.

There are few easy answers for how coastal communities can adapt to rising tides and more volatile weather, let alone for how to make the NFIP more sustainable in that context. What does seem clear is that leaving the responsibility of planning for and adapting to climate change up to individuals may be the wrong path for disaster policy in the 21st century. “Individuals vary tremendously in their ability to respond rationally to incentives,” Elliott says. “My sense is that when it comes to the problem of what inhabited places are going to look like in a future defined by climate change, that’s a collective problem that’s going to require more collective solutions.”

In terms of policy, that could mean making it easier for communities to organize and access funding like Staten Island residents have post-Sandy, or even attaching a rider to different types of insurance policies, spreading the financial risk of flooding around to more than just the people most likely to be affected in the next several years.

The need for a more collective response has implications beyond specific measures, though, and extends well outside the scope of flood insurance. Amid the chaotic reports of the situation unfolding in Houston are also those of good Samaritans rescuing total strangers, the kind of post-disaster solidarity writer Rebecca Solnit has called “a paradise built in hell.” Paradise may not be on the docket as the NFIP’s future gets determined this month, but Harvey may be a chance for lawmakers to start thinking more holistically about how to deal with and prevent the kinds of hell climate change is all too likely to keep dishing out.

Full report

https://projects.propublica.org/houston-cypress/?utm_campaign=sprout&utm_medium=social&utm_source=twitter&utm_content=1503846942

Floods in India, Bangladesh and Nepal kill 1,200 and leave millions homeless

Authorities say the monsoon flooding is one of the worst the region has seen in years

Chloe Farand, Wednesday 30 August 2017 17:41 BST

The Independent Online

At least 1,200 people have been killed and millions have been left homeless following devastating floods that have hit India, Bangladesh and Nepal, in one of the worst flooding disasters to have affected the region in years.

International aid agencies said thousands of villages have been cut off by flooding with people being deprived of food and clean water for days.

South Asia suffers from frequent flooding during the monsoon season, which lasts from June to September, but authorities have said this year's floods have been much worse.

In the eastern Indian state of Bihar, the death toll has risen to more than 500, the Straits Times reported, quoting disaster management officials.

The paper said the ongoing floods had so far affected 17 million people in India, with thousands sheltered in relief camps.

Anirudh Kumar, a disaster management official in Patna, the capital of Bihar, a poor state known for its mass migration from rural areas to cities, said this year's farming had collapsed because of the floods, which will lead to a further rise in unemployment in the region.

In the northern state of Uttar Pradesh, reports said more than 100 people had died and 2.5 million have been affected.

In Mumbai, authorities struggled to evacuate people living in the financial capital's low-lying areas as transport links were paralysed and downpours led to water rising up to five feet in some parts of the city.

Weather officials are forecasting heavy rains to continue over the next 24 hours and have urged people to stay indoors.

Partially submerged houses are seen at a flood-affected village in Morigaon district in the northeastern state of Assam, India. (REUTERS/Anuwar Hazarika)

In neighbouring Bangladesh, at least 134 people have died in monsoon flooding which is believed to have submerged at least a third of the country.

More than 600,000 hectares of farmland have been partially damaged and in excess of 10,000 hectares have been completely washed away, according to the disaster minister.

Bangladesh's economy is dependent on farming and the country lost around a million tonnes of rice in flash floods in April.

"Farmers are left with nothing, not event with clean drinking water," said Matthew Marek, the head of disaster response in Bangladesh for the International Federation of Red Cross and Red Crescent.

In Nepal, 150 people have been killed and 90,000 homes have been destroyed in what the UN has called the worst flooding incident in the country in a decade.

According to the Red Cross, at least 7.1 million people have been affected in Bangladesh - more than the population of Scotland - and around 1.4 million people have been affected in Nepal.

The disaster comes as headlines have focused on the floods in Houston, Texas, which authorities have described as "unprecedented".

Officials in Texas have said the death toll now stands at 15 in the wake of Hurricane and Tropical Storm Harvey, with thousands forced to flee their homes.

The rise in extreme weather events such as hurricanes and floods has been identified by climate scientists as the hallmark of man-made climate change.

The US has seen two of its worst storms ever, Hurricane Harvey and Hurricane Katrina, in just over a decade.

India's Prime Minister, Narendra Modi, has said climate change and new weather patterns are having "a big negative impact".

Third-hottest June puts 2017 on track to make hat-trick of hottest years

June 2017 was beaten only by June in 2015 and 2016, leaving experts with little hope for limiting warming to 1.5C or even 2C

The latest figures cement estimations that warming is now at levels not seen for 115,000 years.

First published on Wednesday 19 July 2017 00.17 EDT

Last month was the third-hottest June on record globally, temperature data suggest, confirming 2017 will almost certainly make a hat-trick of annual climate records, with 2015, 2016 and 2017 being the three hottest years since records began.

The figures also cement estimations that warming is now at levels not seen for 115,000 years, and leave some experts with little hope for limiting warming to 1.5C or even 2C.

According to new figures from the US National Oceanic and Atmospheric Administration (Noaa), June 2017 was the third-hottest June on record, beaten only by the two preceding Junes in 2015 and 2016.

The Noaa data show combined land and sea-surface temperatures for June 2017 were 0.82C above the 20th century average, making a string of 41 consecutive Junes above that average.

June 2016 still holds the record at 0.92C above the 20th century average, followed by June 2015 which was 0.89C above the baseline.

The data line up closely with Nasa figures released last week, which are calculated slightly differently, finding the month was the fourth-hottest on record – with June 1998 also being warmer in their data set.

Based on the Nasa data, climate scientist and director of Nasa’s Goddard Institute for Space Studies Gavin Schmidt estimated that 2017 was probably going to be the second-warmest year on record after 2016, but would almost certainly be among the top three hottest years.

Gavin Schmidt

(@ClimateOfGavin)

With update to Jun, 2017 will almost certainly be a top 3 year in the GISTEMP record (most likely 2nd warmest ~57% chance). pic.twitter.com/jiR6cCv1x8

July 15, 2017

With June's figures in, each of the first six months of 2017 ranks among the three warmest on record for that month, making it the second-hottest first half of a year on record – again, beaten only by the previous year.

The near-record temperatures continued this year despite the passing of El Niño, which normally warms the globe, and despite its opposite, La Niña, currently suppressing temperatures.

The warming trend is almost certainly caused by greenhouse gas emissions – mostly the result of burning fossil fuels – with many studies showing such warm years would be almost impossible without that effect.

Last year, Michael Mann from Pennsylvania State University published a paper showing the then-record temperatures in 2014 would have had less than a one in a million chance of occurring naturally.

“We have a follow-up article that we’ve submitted showing that the likelihood of three consecutive record-breaking years such as we saw in 2015-2017 was similarly unlikely,” he told the Guardian over email. “In short, we can only explain the onslaught of record warm years by accounting for human-caused warming of the planet.”

Andy Pitman from the University of New South Wales in Sydney, Australia said the onslaught of very rapid warming in the past few years is likely a result of the climate system “catching up” after a period of relative slow warming caused by natural variability – the so-called “hiatus”.

“I do not think the recent anomalies change anything from a science perspective,” he said. “The Earth is warming at about the long-term rates that were expected and predicted [by models].”

But Pitman said the ongoing trend was “entirely inconsistent” with the target of keeping warming at just 1.5C above pre-industrial temperatures.

Current trends suggest the 1.5C barrier would be breached in the 2040s, with some studies suggesting it might happen much sooner.

“In my view, to limit warming to 2C requires both deep and rapid cuts and a climate sensitivity on the lower end of the current range,” Pitman said. “I see no evidence that the climate sensitivity is on the lower end of the current range, unfortunately.”

“It would be a good idea to cut greenhouse gas emissions rather faster than we are.”

"Heritage at Risk: How Rising Seas Threaten Ancient Coastal Ruins"

"The shores of Scotland’s Orkney Islands are dotted with ruins that date to the Stone Age. But after enduring for millennia, these archaeological sites – along with many others from Easter Island to Jamestown – are facing an existential threat from climate change."

"Perched on the breathtaking Atlantic coast of Mainland, the largest island in Scotland’s Orkney archipelago, are the remains of the Stone Age settlement of Skara Brae, dating back 5,000 years. Just feet from the sea, Skara Brae is one of the best preserved Stone Age villages in the world — a complex of ancient stone house foundations, walls, and sunken corridors carved out of the dunes by the shore of the Bay of Skaill. Fulmars and kittiwakes from the vast seabird colonies on Orkney’s high cliffs wheel above the coastal grassland of this rugged island, 15 miles from the northern coast of the Scottish mainland. On a sunny day, the surrounding bays and inlets take on a sparkling aquamarine hue.

Older than the Egyptian pyramids and Stonehenge, Skara Brae is part of a UNESCO World Heritage site that also includes two iconic circles of standing stones — the Ring of Brodgar and the Stones of Stenness — and Maeshowe, an exquisitely structured chambered tomb famous for its Viking graffiti and the way its Stone Age architects aligned the entrance to catch the sun’s rays at the winter solstice. These sites, situated just a few miles from Skara Brae, are part of an elaborate ceremonial landscape built by Orkney’s earliest farmers.

Skara Brae and the neighboring sites have weathered thousands of years of Orkney’s wild winters and ferocious storms, but they may not outlive the changing climate of our modern era. As seas rise, storms intensify, and wave heights in this part of the world increase, the threat grows to Skara Brae, where land at each end of its protective sea wall — erected in the 1920s — is being eaten away. Today, as a result of climate change, Skara Brae is regarded by Historic Environment Scotland, the government agency responsible for its preservation, as among Scotland’s most vulnerable historic sites."

Satellite snafu masked true sea-level rise for decades

Revised tallies confirm that the rate of sea-level rise is accelerating as the Earth warms and ice sheets thaw.

Jeff Tollefson on 17 July 2017

As the Greenland ice sheet thaws, it is helping to raise the world's sea levels.

The numbers didn’t add up. Even as Earth grew warmer and glaciers and ice sheets thawed, decades of satellite data seemed to show that the rate of sea-level rise was holding steady — or even declining.

Now, after puzzling over this discrepancy for years, scientists have identified its source: a problem with the calibration of a sensor on the first of several satellites launched to measure the height of the sea surface using radar. Adjusting the data to remove that error suggests that sea levels are indeed rising at faster rates each year.

“The rate of sea-level rise is increasing, and that increase is basically what we expected,” says Steven Nerem, a remote-sensing expert at the University of Colorado Boulder who is leading the reanalysis. He presented the as-yet-unpublished analysis on 13 July in New York City at a conference sponsored by the World Climate Research Programme and the International Oceanographic Commission, among others.

Nerem's team calculated that the rate of sea-level rise increased from around 1.8 millimetres per year in 1993 to roughly 3.9 millimetres per year today as a result of global warming. In addition to the satellite calibration error, his analysis also takes into account other factors that have influenced sea-level rise in the last several decades, such as the eruption of Mount Pinatubo in the Philippines in 1991 and the recent El Niño weather pattern.

The results align with three recent studies that have raised questions about the earliest observations of sea-surface height, or altimetry, captured by the TOPEX/Poseidon spacecraft, a joint US–French mission that began collecting data in late 1992. Those measurements continued with the launch of three subsequent satellites.

“Whatever the methodology, we all come up with the same conclusions,” says Anny Cazenave, a geophysicist at the Laboratory for Studies in Space Geophysics and Oceanography (LEGOS) in Toulouse, France.

In an analysis published in Geophysical Research Letters1 in April, Cazenave’s team tallied up the various contributions to sea-level rise, including expansion resulting from warming ocean waters and from ice melt in places such as Greenland. Their results suggest that the satellite altimetry measurements were too high during the first six years that they were collected; after this point, scientists began using TOPEX/Poseidon's back-up sensor. The error in those early measurements distorted the long-term trend, masking a long-term increase in the rate of sea-level rise.

The problem was first identified in 2015 by a group that included John Church, an oceanographer at the University of New South Wales in Sydney, Australia. Church and his colleagues identified a discrepancy between sea-level data collected by satellites and those from tide gauges scattered around the globe2. In a second paper published in June in Nature Climate Change3, the researchers adjusted the altimetry records for the apparent bias and then calculated sea-level rise rates using a similar approach to Cazenave’s team. The trends lined up, Church says.

Rising tide

Still, Nerem wanted to know what had gone wrong with the satellite measurements. His team first compared the satellite data to observations from tide gauges that showed an accelerating rate of sea-level rise. Then the researchers began looking for factors that could explain the difference between the two data sets.

The team eventually identified a minor calibration that had been built into TOPEX/Poseidon's altimeter to correct any flaws in its data that might be caused by problems with the instrument, such as ageing electronic components. Nerem and his colleagues were not sure that the calibration was necessary — and when they removed it, measurements of sea-level rise in the satellite's early years aligned more closely with the tide-gauge data. The adjusted satellite data showed an increasing rate of sea-level rise over time.

“As records get longer, questions come up,” says Gavin Schmidt, a climate scientist who heads NASA’s Goddard Institute for Space Studies in New York City. But the recent spate of studies suggests that scientists have homed in on an answer, he says. “It’s all coming together.”

If sea-level rise continues to accelerate at the current rate, Nerem says, the world’s oceans could rise by about 75 centimetres over the next century. That is in line with projections made by the Intergovernmental Panel on Climate Change in 2013.
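
As a rough consistency check on that figure, one can extrapolate Nerem's reported rates forward assuming a constant acceleration. The short sketch below is illustrative only and is not the team's published method:

# Back-of-envelope extrapolation, assuming sea-level rise keeps accelerating
# at the average rate implied by the 1993 and present-day figures quoted above.
rate_1993 = 1.8                          # mm per year in 1993
rate_now = 3.9                           # mm per year today
acceleration = (rate_now - rate_1993) / (2017 - 1993)   # ~0.09 mm per year squared

years = 100
rise_mm = rate_now * years + 0.5 * acceleration * years ** 2
print(round(rise_mm / 10))               # ~83 cm, the same order as the ~75 cm projection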

“All of this gives us much more confidence that we understand what is happening,” Church says, and the message to policymakers is clear enough. Humanity needs to reduce its output of greenhouse-gas emissions, he says — and quickly. “The decisions we make now will have impacts for hundreds, and perhaps thousands, of years.”

Nature 547, 265–266 (20 July 2017) doi:10.1038/nature.2017.22312

China’s outdated coal thermal power technology sold in Vietnam

VietNamNet Bridge - China’s policy on exporting its coal thermal power technology has raised concerns among environmentalists, who say Vietnam could become the destination for the outdated technology.

Nguyen Dinh Tuan, former rector of the HCMC University of Natural Resources and the Environment, said that serious pollution in China was behind the country's decision to export the technology.

“Previously, when China was poor, they had to accept this situation. But now, as China has become the world's second largest economy with foreign reserves of more than $3.2 trillion, it wants to stop this,” he said.

“One of the measures is shutting down coal-fired thermopower plants. And once it doesn’t need the technology anymore, it will export the technology to seek profit,” he explained.

The Chinese government is restructuring energy sources with priority given to clean renewable energy.

Meanwhile, developing coal-fired plants remains the choice of Vietnam.

Vietnamese state management agencies say that most of Vietnam’s coal-fired thermal power projects have EPC contractors from China.

Tuan said while many countries offer coal thermal power technology, Vietnam is still choosing technology and EPC contractors from China.

“Chinese contractors are winning bids by offering very low prices. After winning the bids, they cite many reasons to raise investment capital,” he explained.

In some cases, Chinese contractors were chosen because the contractors offered commissions to Vietnamese officials.

The expert pointed out that China may follow a ‘dual strategy’ – both exporting technology and selling coal.

Vietnam’s own coal is not well suited to coal-fired power plants because burning it is uneconomical. Therefore, Vietnam sells its high-quality coal and imports lower-quality coal to run its thermal power plants. Because the imported coal is of lower quality, the resulting pollution is high.

China has been following the coal import policy for many years and has been buying coal from Vietnam. Meanwhile, China sells coal suitable to Chinese-technology power plants to Vietnam.

“It would be bad if Vietnam imports both outdated, polluting technology and polluting material,” Tuan said.

According to Nguy Thi Khanh, director of GreenID, Vietnam would have to import 85 million tons of coal by 2030 to run the power plants envisioned under the seventh power development plan.

Power plants now in operation, with a combined capacity of about 13,000 MW, discharge 15 million tons of ash a year that still cannot be treated, causing serious pollution.

The pollution will be even more serious if Vietnam increases coal-fired thermal power capacity to 55,000 MW, roughly four times the current level.

Methane Seeps Out as Arctic Permafrost Starts to Resemble Swiss Cheese

Measurements over Canada's Mackenzie River Basin suggest that thawing permafrost is starting to free greenhouse gases long trapped in oil and gas deposits.

By Bob Berwyn, InsideClimate News

Jul 19, 2017

In parts of northern Canada's Mackenzie River Delta, seen here by satellite, scientists are finding high levels of methane near deeply thawed pockets of permafrost.

Global warming may be unleashing new sources of heat-trapping methane from layers of oil and gas that have been buried deep beneath Arctic permafrost for millennia. As the Earth's frozen crust thaws, some of that gas appears to be finding new paths to the surface through permafrost that's starting to resemble Swiss cheese in some areas, scientists said.

In a study released today, the scientists used aerial sampling of the atmosphere to locate methane sources from permafrost along a 10,000-square-kilometer swath of the Mackenzie River Delta in northwestern Canada, an area known to have oil and gas deposits.

Deeply thawed pockets of permafrost, the research suggests, are releasing 17 percent of all the methane measured in the region, even though these emissions hotspots make up only 1 percent of the surface area.

In those areas, the peak concentrations of methane emissions were found to be 13 times higher than levels usually caused by bacterial decomposition—a well-known source of methane emissions from permafrost—which suggests the methane is likely also coming from geological sources, seeping up along faults and cracks in the permafrost, and from beneath lakes.

The findings suggest that global warming will "increase emissions of geologic methane that is currently still trapped under thick, continuous permafrost, as new emission pathways open due to thawing permafrost," the authors wrote in the journal Scientific Reports. Along with triggering bacterial decomposition in permafrost soils, global warming can also trigger stronger emissions of methane from fossil gas, contributing to the carbon-climate feedback loop, they concluded.

"This is another methane source that has not been included so much in the models," said the study's lead author, Katrin Kohnert, a climate scientist at the GFZ German Research Centre for Geosciences in Potsdam, Germany. "If, in other regions, the permafrost becomes discontinuous, more areas will contribute geologic methane," she said.

Similar Findings Near Permafrost Edges

The findings are based on two years of detailed aerial atmospheric sampling above the Mackenzie River Delta. It was one of the first studies to look for sources of deep methane across such a large region.

Previous site-specific studies in Alaska have looked at single sources of deep methane, including beneath lakes. A 2012 study made similar findings near the edge of permafrost areas and around melting glaciers.

Now, there is more evidence that "the loss of permafrost and glaciers opens conduits for the release of geologic methane to the atmosphere, constituting a newly identified, powerful feedback to climate warming," said the 2012 study's author, Katey Walter Anthony, a permafrost researcher at the University of Alaska Fairbanks.

"Together, these studies suggest that the geologic methane sources will likely increase in the future as permafrost warms and becomes more permeable," she said.

"I think another critical thing to point out is that you do not have to completely thaw thick permafrost to increase these geologic methane emissions," she said. "It is enough to warm permafrost and accelerate its thaw. Permafrost that starts to look like Swiss cheese would be the type that could allow substantially more geologic methane to escape in the future."

Róisín Commane, a Harvard University climate researcher, who was not involved with the study but is familiar with Kohnert's work, said, "The fluxes they saw are much larger than any biogenic flux ... so I think a different source, such as a geologic source of methane, is a reasonable interpretation."

Commane said the study makes a reasonable assumption that the high emissions hotspots are from geologic sources, but that without more site-specific data, like isotope readings, it's not possible to extrapolate the findings across the Arctic, or to know for sure if the source is from subsurface oil and gas deposits.

"There doesn't seem to be any evidence of these geogenic sources at other locations in the Arctic, but it's something that should be considered in other studies," she said. There may be regions with pockets of underground oil and gas similar to the Mackenzie River Delta that haven't yet been mapped.

Speed of Methane Release Remains a Question

The Arctic is on pace to release a lot more greenhouse gases in the decades ahead. In Alaska alone, the U.S. Geological Survey recently estimated that 16-24 percent of the state's vast permafrost area would melt by 2100.

In February, another research team documented rapidly degrading permafrost across a 52,000-square-mile swath of the northwest Canadian Arctic.

What's not clear yet is whether the rapid climate warming in the Arctic will lead to a massive surge in releases of methane, a greenhouse gas that is about 28 times as powerful at trapping heat as carbon dioxide but does not persist as long in the atmosphere. Most recent studies suggest a more gradual increase in releases, but the new research adds a missing piece of the puzzle, according to Ted Schuur, a permafrost researcher at Northern Arizona University.

Since the study only covered two years, it doesn't show long-term trends, but it makes a strong argument that there is significant methane escaping from trapped layers of oil and gas, Schuur said.

"As for current and future climate impact, what matters is the flux to the atmosphere and if it is changing ... if there is methane currently trapped by permafrost, we could imagine this source might increase as new conduits in permafrost appear," he said.

Solve Antarctica’s sea-ice puzzle

John Turner & Josefino Comiso on 19 July 2017

John Turner and Josefino Comiso call for a coordinated push to crack the baffling rise and fall of sea ice around Antarctica.

Antarctica's disappearing sea ice, from its peak in August 2016 to a record low on March 3, 2017.

Different stories are unfolding at the two poles of our planet. In the Arctic, more than half of the summer sea ice has disappeared since the late 1970s1. The steady decline is what global climate models predict for a warming world2. Meanwhile, in Antarctic waters, sea-ice cover has been stable, and even increasing, for decades3. Record maxima were recorded in 2012, 2013 and 2014.

So it came as a surprise to scientists when on 1 March 2017, Antarctic sea-ice cover shrank to a historic low. Its extent was the smallest observed since satellite monitoring began in 1978 (see ‘Poles apart’) — at about 2 million square kilometres, or 27% below the mean annual minimum.

Researchers are struggling to understand these stark differences5. Why do Antarctica’s marked regional and seasonal patterns of sea-ice change differ from the more uniform decline seen around most of the Arctic? Why has Antarctica managed to keep its sea ice until now? Is the 2017 Antarctic decline a brief anomaly or the start of a longer-term shift6, 7? Is sea-ice cover more variable than we thought? Pressingly, why do even the most highly-rated climate models have Antarctic sea ice decreasing rather than increasing in recent decades? We need to know whether crucial interactions and feedbacks between the atmosphere, ocean and sea ice are missing from the models, and to what extent human influences are implicated6.

What happens in the Antarctic affects the whole planet. The Southern Ocean has a key role in global ocean circulation; a frozen sea surface alters the exchange of heat and gases, including carbon dioxide, between ocean and atmosphere. Sea ice reflects sunlight and influences weather systems, the formation of clouds and precipitation patterns. These in turn affect the mass of the Antarctic ice sheet and its contribution to sea-level rise. Sea ice is also crucial to marine ecosystems. A wide range of organisms, including krill, penguins, seals and whales, depend on its seasonal advance and retreat.

It is therefore imperative that researchers understand the fate of Antarctic sea ice, especially in places where its area and thickness are changing, and why. This requires bringing together understandings of the drivers behind the movement of the ice (through drift and deformation) as well as those that control its growth and melt (thermodynamics). Such knowledge underpins climate models; these need to better represent the complex interactions between sea ice and the atmosphere, ocean and ice sheet. What’s required now is a focused and coordinated international effort across the scientific disciplines that observe and model global climate and the polar regions.

Satellites provide the best spatial information on sea ice around Antarctica. Regular observations reveal how ice cover varies over days, years and decades8. Weather, especially storms with high winds, has a daily influence, as well as a seasonal one. Longer-term changes are driven by larger patterns in the temperature and circulation of the atmosphere and oceans.

But near-continuous satellite observations only reach back about four decades. Longer records are essential to link sea-ice changes to trends in climate. Information from ships’ logbooks, coastal stations, whale-catch records, early satellite imagery and chemical analyses of ice cores hint that sea-ice coverage might have been up to 25% greater in the 1940s to 1960s6.

‘Poles apart’ chart source: US National Snow and Ice Data Center

Collecting more ice cores and historical records, and synthesizing the information they contain, could reveal local trends. These would help to identify which climatic factors drive Antarctic sea-ice changes6. For instance, in 2017, the area most depleted of sea ice was south of the Eastern Pacific Ocean. This region has strong links to the climate of the tropics, including the El Niño–Southern Oscillation, suggesting that sea ice is sensitive to conditions far from the poles.

Another issue is how the balance between dynamics and thermodynamics drives the advance and retreat of ice. The thickness and volume of ice depend on many factors, including the flow of heat from the ocean to the atmosphere and to the ice. Sea ice influences the saltiness of the ocean. As the ocean freezes, salt enters the water; as the ice melts, fresh water returns to the sea. Such processes are difficult to measure precisely, with uncertainties of 50–100% of the signal. And they are hard to model.

“Sea-ice coverage might have been up to 25% greater in the 1940s to 1960s.”

Satellite altimeters can measure the distance between the surfaces of the sea ice and ocean accurately, and this distance can be used to calculate the ice thickness. But it is hard to interpret these data without knowing how much snow is on the ice, its density and whether the snow’s weight pushes the ice surface below sea level (as is often the case). Calibrating and validating satellite data are crucial, as is developing algorithms to merge and analyse information from a variety of sources.
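
The conversion from freeboard to thickness follows from hydrostatic balance: the weight of the floating ice plus its snow load must equal the weight of the seawater it displaces. The sketch below is a minimal illustration of that relationship; the density values are typical textbook assumptions, not figures from this article:

# Illustrative freeboard-to-thickness conversion under hydrostatic balance:
#   rho_ice * h_ice + rho_snow * h_snow = rho_water * (h_ice - freeboard)
def ice_thickness(freeboard_m, snow_depth_m,
                  rho_water=1024.0, rho_ice=915.0, rho_snow=320.0):
    """Estimate sea-ice thickness (m) from ice freeboard and snow depth.
    Densities in kg/m^3 are assumed typical values, not measured ones."""
    return (rho_water * freeboard_m + rho_snow * snow_depth_m) / (rho_water - rho_ice)

# Example: 0.10 m of freeboard under 0.30 m of snow implies roughly 1.8 m of ice.
# A heavy enough snow load makes the freeboard negative, i.e. the ice surface
# sits below sea level, as the article notes is often the case in the Antarctic.
print(round(ice_thickness(0.10, 0.30), 1))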

Ice, ocean and air must be sampled at appropriate intervals over a wide enough area to establish how they interact. Research ice-breaker cruises are essential for collecting in situ observations; one such was the US PIPERS (Polynyas, Ice Production and seasonal Evolution in the Ross Sea) campaign in 2017. But ships travel only along narrow routes and for a short time, typically 1–3 months.

Increasingly, autonomous instruments and vehicles — underwater, on-ice and airborne — are providing data throughout the year and from inaccessible or dangerous regions. These robotic systems are giving revolutionary new information and insights into the formation, evolution, thickness and melting of sea ice. Sensors mounted on marine mammals (such as elephant seals), or on floats and gliders, also beam back data on temperature, salinity and other physical and biogeochemical parameters. But to operate continuously, these instruments need to be robust enough to withstand the harsh Antarctic marine environment.

Improve models

Current climate models struggle to simulate the seasonal and regional variability seen in Antarctic sea ice. Most models have biases, even in basic features such as the size and spatial patterns of the annual cycle of Antarctic sea-ice growth and retreat, or the amount of heat input to the ice from the ocean. The models fail to simulate even gross changes2, such as those driven by tropical influences on regional winds9. Because ice and climate are closely coupled, even small errors multiply quickly.

Leopard seals live, hunt and breed among the pack ice in Antarctic waters.

Features that need to be modelled more accurately include the belt of strong westerly winds that rings Antarctica, and the Amundsen Sea Low — a stormy area southwest of the Antarctic Peninsula. Models disagree, for example, on whether persistent westerly winds should increase or decrease sea-ice coverage around Antarctica. Simulations of clouds and precipitation are also inadequate. These cannot currently reproduce the correct amounts of snowfall or sea surface temperature of the Southern Ocean (the latter is widely overestimated by the models).

Climate models must also include the mixing of waters by surface winds and the impact of waves on the formation and break-up of sea ice. Precipitation and meltwater from ice sheets and icebergs influence the vertical structure of the ocean and how it holds heat, which also affects the growth and decay of sea ice. Researchers need to develop models of the atmosphere–ocean–sea-ice environment at high spatial resolution.

Explaining the recent variability in Antarctic sea ice, and improving projections of its future in a changing climate, requires projects that bridge many disciplines. For example, the research communities involved in ice-core analysis, historical data rescue and climate modelling need to collaborate to track sea-ice variability over timescales longer than the satellite record.

“Because ice and climate are closely coupled, even small errors multiply quickly.”

Some gaps in our knowledge can be filled through nationally funded research. More demanding cross-disciplinary work must be supported through international collaboration. Leading the way are organizations such as the Scientific Committee on Antarctic Research, the Scientific Committee on Oceanic Research, the World Climate Research Programme’s Climate and Cryosphere project and the Past Global Changes project. But essential work remains to be done, including: more detailed model comparisons and assessments; more research cruises; and the continuity and enhancement of satellite observing programmes relevant to sea ice. These organizations should partner with funding agencies to make that happen.

Better representations of the Southern Ocean and its sea ice must now be a priority for modelling centres, which have been focused on simulating the loss of Arctic sea ice. Such models will be crucial to the next assessment of the Intergovernmental Panel on Climate Change, which is due around 2020–21. A good example of the collaborative projects needed is the Great Antarctic Climate Hack (see go.nature.com/2ttpzcd). This brings together diverse communities with an interest in Antarctic climate to assess the performance of models.

Nature 547, 275–277 (20 July 2017) doi:10.1038/547275a

Europe's contribution to deforestation set to rise despite pledge to halt it

Europe’s consumption of products such as beef, soy and palm oil could increase its contribution to global deforestation by more than a quarter by 2030, analysis shows

Arthur Neslen in Brussels, Friday 30 June 2017 12.35 EDT, Last modified on Friday 30 June 2017 12.37 EDT

Europe’s contribution to global deforestation may rise by more than a quarter by 2030, despite a pledge to halt such practices by the end of this decade, according to a leaked draft EU analysis.

An estimated 13m hectares (Mha) of the world’s forestland is lost each year, a figure projected to spiral in the next 30 years with the Amazon, Greater Mekong and Borneo bearing the brunt of tree clearances.

But despite signing several international pledges to end deforestation by this decade’s end, more than 5Mha of extra forest land will be needed annually by 2030 to meet EU demand for agricultural products, a draft EU feasibility study predicts.

“The EU pledged to stop deforestation by 2020, not to increase it,” said Sebastian Risso, a spokesman for Greenpeace. “Deforestation is a conveyor belt to devastating climate change and species loss that the world must stop, and fast.

“It is hypocritical for Europe to portray itself as a climate leader while doing little to slow its own insatiable appetite for the fruits of destroyed forests.”

Land clearances for crop, livestock and biofuel production are by far the biggest causes of deforestation, and Europe is the end destination for nearly a quarter of such products – beef, soy, palm oil and leather – which have been cultivated on illegally deforested lands.

According to the draft EU feasibility study, which is meant to provide officials with policy options, the “embodied” deforestation rate – which is directly linked to EU consumption – will increase from between 250,000-500,000ha in 2015 to 340,000-590,000ha in 2030.

The figures do not encompass forest degradation or the impacts of lending by EU financial institutions.

The Guardian understands that the report’s methodology has also been criticised within the European commission, where some officials say that its projections are too conservative and do not account for European bioenergy demand.

A deforestation action plan that could include binding legislation is currently under consideration by the European commission.

The EU has signed several pledges to halt deforestation by 2020 within accords that include: the UN convention on biological diversity, the UN sustainable development goals, the UN forum on forests, the New York declaration on forests, and the Amsterdam declaration.

Valero refinery sues PG&E over outage that idled gas making, spewed pollution

Plumes of dark smoke flow from the Valero oil refinery in Benicia May 5 after a power outage. (Chris Riley/Times-Herald)

By Denis Cuff | dcuff@bayareanewsgroup.com | Bay Area News Group

PUBLISHED: July 3, 2017 at 3:02 pm | UPDATED: July 3, 2017 at 6:26 pm

BENICIA — The Valero oil refinery is seeking $75 million from PG&E in a federal lawsuit that blames the utility for a May 5 power outage that disrupted gasoline production for a month, damaged equipment, and showered tons of air pollution on downwind areas.

Some Benicia residents have criticized the oil refinery over pollution from hours of flaring that led to an industrial park being evacuated and an air quality agency issuing at least four violation notices against the oil company.

Valero, however, pointed the finger at PG&E, saying in a federal lawsuit that the utility abruptly pulled the plug on the refinery in a “reckless” 18-minute power failure.

PG&E cut off the flow on both of two transmission lines, one a regular power source for the refinery and the second a backup source designed to avoid shutdowns, according to the lawsuit filed Friday in U.S. District Court in Sacramento.

“PG&E must take responsibility for the damages it caused to ensure reckless power outages with potentially significant consequences never happen again,” Valero said in a statement.

PG&E apologized for the outage during a Benicia City Council meeting, and said it is taking measures to prevent power failures from happening again.

In response to the lawsuit, PG&E said it has commissioned an outside engineering firm, Exponent, to investigate the cause of the outage.

“We continue to partner with Valero and the city of Benicia to prevent similar power disruptions,” PG&E said. “The safety of our customers, employees and the general public is always our top priority.”

One of the transmission lines leading to Valero was taken offline on May 5 as part of planned maintenance, the utility said. For reasons being investigated, a breaker opened that morning, shutting down the backup line to Valero.

While the breaker opening is a safety measure to protect equipment, PG&E spokeswoman Deanna Contreras said the utility is trying to determine why it needed to open.

Valero said losing both transmission lines “simultaneously without any prior notice” put plant workers and neighbors at risk.

“PG&E knew or should have known that unplanned power dips or outages can present serious safety issues to refinery employees and members of the community, including the risk of explosions, fires, and serious injury,” attorneys for the oil company wrote in the lawsuit.

Valero said the outage damaged refinery equipment, and caused the refinery to lose money because it couldn’t produce gasoline and other fuels for about the month it took to make repairs and bring the refinery back into production.

Valero also said it deserves damages because its reputation was harmed.

Trump’s ‘drill, baby, drill’ energy policies are being met by ‘sue, baby, sue’

BY STUART LEAVENWORTH

sleavenworth@mcclatchydc.com

President Trump promised to grow jobs by rolling back Obama-era energy and pollution rules. And he’s fulfilling his pledge, but not how he intended. In just six months, Trump’s policies have resulted in a surge in employment — for environmental lawyers.

Since Trump took office, environmental groups and Democratic state attorneys general have filed more than four dozen lawsuits challenging his executive orders and decisions by the U.S. Environmental Protection Agency, Interior Department and other agencies. Environmental organizations are hiring extra lawyers. Federal agencies are requesting bigger legal defense budgets.

The first round of legal skirmishes has mostly gone to the environmentalists. Earlier this month, the U.S. Court of Appeals for the D.C. Circuit blocked the administration from delaying Obama-era rules aimed at reducing oil industry releases of methane, a potent greenhouse gas. On June 14, a federal judge ruled against permits issued to complete construction of the Dakota Access pipeline, a partial victory for the Standing Rock Sioux tribe, which claims its hunting and fishing grounds are threatened by the oil pipeline.

Abigail Dillen, a vice president of litigation for Earthjustice, a nonprofit organization that filed the Dakota Access lawsuit, said her group and others are girding for years of court battles over bedrock environmental laws, such as the Clean Water Act and Clean Air Act. Earthjustice has increased its staff of lawyers since November to 100, and is planning to hire another 30 in coming months.

“What we are seeing is unprecedented,” said Dillen, whose California-based organization was previously known as the Sierra Club Legal Defense Fund. “With this administration, we face a disregard to basic rules of environmental decision making.”

Trump swept to office vowing to boost coal and petroleum production, while ridding the nation of “job-killing regulations.” He’s worked to rapidly deliver on those promises. But the speed of his regulatory rollback — without a full legal team in place to vet new policies — has created a target-rich environment for litigation.

To cap off his first 100 days in office, Trump signed an executive order that will expand offshore oil drilling in federal waters and open other areas that were previously off limits to new oil and gas exploration.

Altogether, environmental groups have sued the administration more than 50 times in federal court since Trump took office five months ago, according to a McClatchy review of the federal Public Access to Court Electronic Records (PACER) database. One group alone, the Center for Biological Diversity, has filed or joined in 25 suits, including ones seeking documents related to Trump’s planned border wall with Mexico and Trump’s approval of the Keystone oil pipeline.

In March, the group bragged on Twitter, “We’ve sued #Trump every week since 1/22. 3 suits today brings total to 15!”

During the Obama years, environmental groups also filed suits, both to toughen administration regulations and also to defend policies challenged by industry. Now, said Dillen, “the old work hasn’t gone away, and this administration has created new work for us.”

Environmental law groups have enjoyed a surge in contributions since Trump’s election, much like the ACLU and other organizations fighting Trump’s immigration policies. Earthjustice saw its fundraising increase 160 percent from the election through January, compared to the same period last year.

Federal agencies recognize they are flying into a suit storm. Several are seeking extra budget resources — or at least trying to avoid cuts — to defend the administration’s policies in court.

The Interior Department’s Office of the Solicitor, for instance, recently told Congress that it will need “substantial attorney resources” to handle an expected influx of cases. In its $65 million budget justification for Fiscal Year 2018, the solicitor’s office said: “We can also expect litigation on virtually every major permitting decision authorizing energy development on federal lands.”

It is a reasonable expectation. On Monday, Earthjustice filed suit against Interior’s attempt to delay an Obama rule limiting releases of methane from oil and gas operations on public lands. The lawsuit was filed in U.S. District Court in California on behalf of tribal groups and several environmental organizations including the Sierra Club, Natural Resources Defense Council and Wilderness Society.

Interior also faces lawsuits on plans to open tens of thousands of acres of public land to coal industry leasing, and the proposed removal of bans on new offshore oil drilling in the Arctic and Atlantic oceans.

The suits are piling up at the EPA, too. Environmentalists are suing over the agency’s refusal to ban a controversial pesticide, chlorpyrifos; its delay of rules aimed at preventing chemical plant accidents; and its delay in implementing rules on coal plant discharges into public waters.

During the Obama administration, industry groups and Republican state attorneys general engaged in their own litigation storm. As the top legal officer of Oklahoma, Scott Pruitt solicited industry contributions to build up the war chest of the Republican Attorneys General Association. He also filed or helped file 14 lawsuits against the EPA. In January, Trump tapped Pruitt to serve as EPA administrator, putting him in a position to unravel Obama-era policies he previously had litigated against.

Democratic attorneys general are now ramping up their own fundraising and are filing or joining in suits against Trump’s environmental, immigration and education policies. AGs such as California’s Xavier Becerra, Pennsylvania’s Josh Shapiro and Washington state’s Bob Ferguson are all raising their profiles with these lawsuits. All are seen as contenders for future seats as governor or U.S. senator.

The litigation surge is not going unnoticed by congressional Republicans. They’ve restarted a campaign to counter what they call “abusive” use of the courts by environmentalists and other groups. On June 28, the GOP-controlled House held a hearing focusing on lawsuits against the Interior Department, which manages the nation’s vast public lands.

“A legal sub-industry has thrived from endless environmental litigation, while burdening the livelihoods of countless citizens,” said U.S. Rep. Mike Johnson, R-La., vice chair of the House Natural Resources Subcommittee on Oversight and Investigations. Excessive litigation, he added, “drains our taxpayer dollars away from good stewardship of our natural resources.”

At the EPA, Pruitt finds himself in the awkward position of wishing the courts would dismiss litigation he filed against the agency while serving as Oklahoma attorney general. One of those suits was against Obama’s Clean Power Plan, which would set the nation’s first limits on carbon emissions from power plants.

In March, Trump’s Justice Department asked the U.S. Court of Appeals for the D.C. Circuit to indefinitely suspend the lawsuit Pruitt had filed. Apparently, the administration was concerned that if the litigation continued, the court could rule in favor of the Clean Power Plan, forcing the White House to implement a climate initiative it opposes.

In a partial victory for the administration, the appeals court in April put the litigation on hold, and asked for more information before it decides on the case going forward. Depending on the court’s final ruling, the White House will likely be allowed to go ahead with full revocation of the regulations.

Revocation of the Clean Power Plan, however, could prompt another lawsuit. Environmental groups contend the EPA is obligated to address climate pollution under the Clean Air Act.

McClatchy’s Ben Wieder contributed to this story.

Flood insurance rates could rise for hundreds of thousands of homeowners under proposal

BY STUART LEAVENWORTH

sleavenworth@mcclatchydc.com

Congress is considering sweeping changes to the debt-laden National Flood Insurance Program that could jack up flood insurance rates for hundreds of thousands of homeowners under a bill that a Florida real estate group called “devastating.”

The proposal, part of a flood-insurance package with a Sept. 30 deadline, could prove costly to homeowners in flood-prone regions ranging from Florida to Texas to California’s Central Valley. It would primarily affect homeowners with low “grandfathered” rates based on flood maps that have changed since they purchased their homes.

“This could be devastating to so many people,” said Maria S. Wells, president of the Florida Realtors trade group, which has 175,000 members statewide. “People don’t realize they could be remapped into a much more high-risk zone.”

Rep. Sean Duffy, a Wisconsin Republican who introduced the 21st Century Flood Reform Act, agreed the change could deal a financial blow to some homeowners, over time. But the current practice, he argued, is unfair to households that live outside of floodplains. Through their tax dollars, they subsidize a relatively small number of homeowners who own property near the coasts, rivers and other waterways.

Duffy said he’s open to amending his legislation to address affordability concerns — but not if it keeps the current subsidy system in place.

“We are not talking about the poorest people in the country,” said Duffy, who represents a wide swath of northern Wisconsin. “You have subsidies going to these wealthy homeowners on the coast.”

Congress has until the end of September to reauthorize the National Flood Insurance Program, otherwise the program will lapse, preventing people from buying flood insurance. When that impasse last happened for a month in 2010, an estimated 46,800 home sales transactions were interrupted or canceled, according to the National Realtors Association.

Last month, a key House committee approved seven pieces of legislation aimed at improving the solvency of the insurance program, which owes $25 billion to the U.S. Treasury.

The full House is expected to consolidate and vote on the whole package — including the part that would increase insurance rates — before the August recess, but affordability may hamper its passage. If the House and Senate can’t pass and reconcile their two reauthorization bills, Congress could miss the Sept. 30 deadline, and the housing market could see a repeat of 2010.

“We need to gently move these people to the point they are paying premiums commensurate with the risk.”

— Sean Duffy, R-Wisconsin, author of the debated flood insurance bill

At issue is the insurance program’s practice of “grandfathering” premiums, which allows homeowners to continue paying relatively low insurance rates even when the Federal Emergency Management Agency, or FEMA, changes its maps to account for newly discovered risks. The practice is based on the concept that people with homes built to the required codes at the time of purchase shouldn’t be penalized when those codes change.

Under Duffy’s bill, as now written, no new grandfathering would be allowed after four years, and premiums on existing grandfathered policies would rise 8 percent yearly until they reach full-risk rates.

No one knows for sure how many households could be affected by the change, but Duffy said FEMA has told him it could number 500,000 or higher. The increased premium costs could be sizable.

For example, FEMA’s rate tables show that a home in an “A Zone” of Special Flood Hazard Area — typically near a lake, river or coastline — that now costs $3,000 a year in insurance premiums could rise to $5,000 a year if FEMA determined that expected flood elevations were two feet higher than previously mapped.
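
For a rough sense of how quickly that escalation compounds, the short sketch below applies the bill’s 8 percent annual increases to the premiums quoted above. This is only an illustration: the dollar figures are the article’s examples, not output from FEMA’s rate tables.

    # Rough sketch: apply the bill's 8% annual increases to the example premiums
    # quoted above ($3,000 rising toward a hypothetical $5,000 full-risk rate).
    # Illustrative only; not based on FEMA's actual rate tables.
    def years_to_full_risk(current=3000.0, full_risk=5000.0, annual_increase=0.08):
        years, premium = 0, current
        while premium < full_risk:
            premium = min(premium * (1 + annual_increase), full_risk)
            years += 1
        return years

    print(years_to_full_risk())  # prints 7

Under those assumptions, a grandfathered policyholder would reach the full-risk rate after about seven years of increases.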

Wells, a real estate broker with offices in Florida and Pennsylvania, said one frequent criticism of the program — that rich people benefit disproportionately — is a myth. Under the program, flood damage claims are capped at $250,000, much less than full replacement for a wealthy person’s home.

“People talk about beach-side mansions, but I can tell you, they don’t use NFIP,” she said. “They use Lloyd’s of London.”

Proponents of the legislation, however, say that construction of ever-more-valuable homes along the coast and on floodplains adds to the risks being covered by the taxpayer-subsidized flood insurance program. Subsidies, they argue, encourage more people to build and buy homes on floodplains, perpetuating the program’s debt.

“We need to transition to sound actuarial rates,” said Rep. Jeb Hensarling, R-Tex., chairman of the House Financial Services Committee, which approved the legislative package last month. “Otherwise, we are putting more people in harm’s way.”

Federal flood insurance has long been the subject of a congressional tug-of-war, especially following Hurricane Katrina in 2005 and Hurricane Sandy in 2012. In 2005, FEMA borrowed more than $17 billion from the U.S. Treasury to pay claims damages, and Hurricane Sandy triggered another $9 billion in borrowing.

As of late 2016, close to five million households nationwide held federal flood insurance policies, according to FEMA. Florida had the most, with nearly 1.8 million policyholders, followed by Texas with 608,000, Louisiana with 472,000 and California with 295,000.

The issue is complicated by the fact that many U.S. cities were originally built along coastlines and rivers and are now deemed vulnerable to extreme flooding. Scientists say that with sea level rise and stronger storms expected from global climate change, the U.S. Treasury is sure to take big hits from future flood disasters.

To reduce the mounting debt, Congress in 2012 passed the Biggert-Waters Flood Insurance Reform Act, which was “designed to allow premiums to rise to reflect the true risk of living in high-flood areas.” But two years later, following an outcry from affected homeowners, Congress scaled back the premium rate increases.

After President Trump took office, federal debt hawks saw a chance to revive provisions in the 2012 legislation, including ending “grandfathering.” That provision is part of the 21st Century Flood Reform Act introduced by Duffy.

Some of the House package has wide support from consumer, environmental and taxpayer groups. These include provisions aimed at removing obstacles for homeowners to purchase private flood insurance, improving the FEMA claims process, discouraging new development in floodplains and helping communities “mitigate” existing risks, such as elevating structures above expected flood depths.

There’s also agreement that FEMA needs to update its process for revising flood maps, although that goal could be complicated by Trump’s proposed 2018 budget, which calls for cuts to the agency’s mapping program.

Officials with the National Realtors Association fear the House will approve legislation to phase out grandfathering with no solid information on how many households would be affected and at what cost.

“We know for sure that homeowners will be hit hard if the NFIP is allowed to expire in September, but we need better data from FEMA before we can have a fully-informed discussion on flood insurance,” William E. Brown, president of the National Realtors Association, said in a statement to McClatchy. Without data, he added, “determining costs and risks both for homeowners and for taxpayers [is] tremendously difficult.”

Duffy, a former prosecutor and ESPN commentator, said his legislation is based on the principle that people should approach home purchases knowing that insurance conditions could change. He likened it to a consumer buying car insurance, knowing that premiums could go up if a teenager is added to the policy.

But Duffy also said he’s aware his bill faces opposition and he’s working to address the concerns about grandfathering.

“I am not a purist on this,” he said. “I want to get a bill that can pass.”

Nearly 200 environmental defenders murdered last year, says report

By National Observer in News, Energy, Politics | July 13th 2017

Environmental protesters are increasingly being treated like criminals and slaughtered when they are defending land from the encroachment of pipelines, mining, and other resource development, says a new report released on Thursday.

This revelation was included in a sweeping report by Global Witness on the state of “environmental defenders” in 2016—a year marked, for example, by ongoing protests of oil and gas pipelines in North America and many other forms of environmental protest around the world.

Criminalizing environmental protesters, the report argues, means people who oppose development projects face “trumped-up and aggressive criminal and civil cases.”

This translates to “silencing dissent,” scaring people, “tarnish[ing] their reputations,” and wasting court resources, the report said.

Though the report’s authors found criminalization was particularly prevalent in Latin America, they also highlighted Canada’s new anti-terrorism legislation and noted National Observer reporter Bruce Livesey's coverage of RCMP surveillance of Indigenous and environmental activists.

"In Canada, environmental groups and First Nation Peoples fear new anti-terrorism legislation will be used to step up the surveillance of protesters opposed to oil and mining projects," the report said. "The Canadian media has also reported on several government agencies that are systematically spying on environmental organizations."

The group calls for governments to reform laws “that unjustly target environmental activism and the right to protest,” and urged companies to “stop abusing the judicial process to silence defenders.”

Global Witness also calls on governments to ensure activists “can peacefully voice their opinions without facing arrest, and are guaranteed due process when charges are brought against them.”

Global Witness is an international advocacy organization that bills itself as investigating—and making public—links between resource extraction, conflict, and corruption. The not-for-profit research group is based in London, UK.

The bulk of its new report deals with the slayings of environmental activists.

They found nearly 200 people were murdered around the world last year in connection with mining, logging and other extraction efforts, or while protecting national parks.

"Almost 1,000 murders have been recorded by Global Witness since 2010, with many more facing threats, attacks, harassment, stigmatization, surveillance and arrest," the report said. "Clearly governments are failing to protect activists, while a lack of accountability leaves the door open to further attacks. By backing extractive and infrastructure projects imposed on local communities without their consent, governments, companies and investors are complicit in this crisis."

Worst offenders:

Brazil: 49

Colombia: 37

Philippines: 28

India: 16

Deforestation soars in Colombia after Farc rebels' demobilization

Area of deforestation climbed 44% in 2016 compared with the year before, as criminal groups have swooped in to promote illegal logging and mining

Sibylla Brodzinsky in Bogotá, Tuesday 11 July 2017 05.00 EDT, Last modified on Tuesday 11 July 2017 17.00 EDT

Colombia has seen an alarming surge in deforestation after leftwing Farc rebels relinquished control over vast areas of the country as part of a historic peace deal.

The area of deforestation jumped 44% in 2016 to 178,597 hectares (690 sq miles) compared to the year before, according to official figures released this month – and most of the destruction was in remote rainforest areas once controlled by the Revolutionary Armed Forces of Colombia (Farc).

The rebel army was a violent illegal armed group, but for decades the guerrillas enforced strict limits on logging by civilians – in part to protect their cover from air raids by government warplanes.

But last year, as the Farc began to move toward demobilization, criminal groups moved in, taking advantage of the vacuum left behind to promote illegal logging, mining and cattle ranching. Civilians who had been ordered by the Farc to maintain 20% of their land with forest cover began expanding their farms.

“The Farc would limit logging to two hectares a year in the municipality,” said Jaime Pacheco, mayor of the town of Uribe, in eastern Meta province. “In one week [last year], 100 hectares were cleared and there is little we can do about it.”

Over their 53-year existence, the Farc were far from environmental angels. While in some areas the guerrilla presence helped maintain the forests, in others the rebels promoted clear-cuts to make way for the planting of coca, the raw material used in cocaine, or illegal gold mining, both sources of income for the group.

Bombings of oil pipelines dumped millions of gallons of oil in waterways and jungles. Between 1991 and 2013, 58% of the deforestation in Colombia was seen in conflict areas, according to a 2016 report by Colombia’s planning ministry.

“They weren’t environmentalists but they did regulate activity, and – since they had the guns – people complied,” says Susana Muhamad, an environmental activist who conducted a diagnostic study of the environmental risks of the rebel retreat.

“We told the government that it would need to establish control in these areas quickly, but it hasn’t,” she said. “It’s like the wild west now, a land rush.”

Colombia, which has the world’s eighth-largest forest cover, committed under the Paris climate accords to reaching zero net deforestation by 2020 and halting the loss of all natural forest by 2030.

Norway is donating about $3.5m over two years to a pilot project that hopes to stem deforestation by offering paid jobs to former Farc fighters and communities to safeguard forests by tracking and reporting illegal logging, adopting sustainable farming methods and undertaking eco-tourism projects.

“We hope this project can be the way for more activities whereby peace comes with green dividends,” said Vidar Helgesen, Norway’s environment minister, on a recent visit to Colombia.

That is the kind of incentive farmers in the town of Uribe, once a Farc stronghold, say they need to keep the forest on their land intact. “We are willing to leave the forests if we get some sort of subsidy to do it,” one farmer, Noel Segura, said in a phone interview.

Wendy Arenas, of the government’s post-conflict department, said the new deforestation figures were an alert. “We knew it could happen,” she told el Espectador newspaper. “It is no secret that the zones left behind by the Farc are territories where other actors are taking over and profiting from logging.”

Solar Power Gets $46 Million Boost From U.S. Energy Department

By Chris Martin, July 12, 2017

Technology grants awarded to 48 universities and developers

Arizona State leads awards with new module testing methods

The U.S. Energy Department awarded $46.2 million in research grants to improve solar energy technologies and reduce costs to 3 cents per kilowatt-hour by 2030.

The money will be partly matched by the 48 projects awarded to laboratories and universities, including Arizona State, which plans to use $1.6 million to develop an X-ray test to evaluate the performance of thin-film modules under harsh conditions, according to an emailed statement Wednesday.

“These projects ensure there’s a pipeline of knowledge, human resources, transformative technology solutions and research to support the industry,” Charlie Gay, director of the Energy Department’s SunShot Initiative, said in the statement.

Other awards include:

$1.37 million to Stanford University to improve a perovskite-on-silicon module design

$1.13 million to Colorado State University to improve thin-film manufacturing

$2 million to SolarReserve Inc. to reduce costs of molten salt storage

Swiss firm Climeworks begins sucking carbon dioxide from the atmosphere in fight against climate change

Company's 'super ambitious' plan is to capture one per cent of annual emissions by 2025

Ian Johnston Environment Correspondent

Friday 23 June 2017 11:47 BST

Humans have pumped vast amounts of carbon into the atmosphere, but we could soon be removing significant amounts. (Corbis)

A Swiss company has opened what is believed to be the world’s first ‘commercial’ plant that sucks carbon dioxide from the atmosphere, a process that could help reduce global warming.

Climeworks, which created the plant in Hinwil, near Zurich, told the Carbon Brief website that the process was currently not making money.

However, the firm expressed confidence it could bring the cost down from $600 per tonne of the greenhouse gas to $200 in three to five years, with a longer-term target of $100.

And its “super-ambitious” vision is to capture one per cent of annual global carbon emissions by 2025, which would involve building hundreds of thousands of the devices.

The captured gas is currently being sold, appropriately enough, to a greenhouse that grows fruit and vegetables.

However, it could also be used to make fizzy drinks and renewable fuels, or stored underground.

One of the company’s founders, Christoph Gebald, told Carbon Brief: “With this plant, we can show costs of roughly $600 per tonne, which is, of course, if we compare it to a market price, very high.

“But, if we compare it to studies which have been done previously, projecting the costs of direct air capture, it’s a sensation.”

Previous research into the idea had assumed a cost of $1,000 per tonne.

“We are very confident that, once we build version two, three and four of this plant, we can bring down costs,” Mr Gebald said.

“We see a factor [of] three cost reduction in the next three to five years, so a final cost of $200 per tonne. The long-term target price for what we do is clearly $100 per tonne of CO2.”

He said such a carbon capture system could start to make an impact on emissions on a global scale, but this might require a price to be put on carbon emissions. The European Union and some other countries around the world have put a price on carbon for some major emitters, but environmentalists have complained it is too low and fails to reflect the true cost.

“The vision of our company is to capture [one] per cent of global emissions by 2025, which is super ambitious, but which is something that is feasible,” Mr Gebald said.

“Reaching one per cent of global emissions by 2025 is currently not possible without political will, without a price on carbon, for example. So it’s not possible by commercial means only.”

Want to fight climate change? Have fewer children

Next best actions are selling your car, avoiding flights and going vegetarian, according to study into true impacts of different green lifestyle choices

Damian Carrington Environment editor

Wednesday 12 July 2017 00.45 EDT, Last modified on Wednesday 12 July 2017 13.34 EDT

The greatest impact individuals can have in fighting climate change is to have one fewer child, according to a new study that identifies the most effective ways people can cut their carbon emissions.

The next best actions are selling your car, avoiding long flights, and eating a vegetarian diet. These reduce emissions many times more than common green activities, such as recycling, using low energy light bulbs or drying washing on a line. However, the high impact actions are rarely mentioned in government advice and school textbooks, researchers found.

Carbon emissions must fall to two tonnes of CO2 per person by 2050 to avoid severe global warming, but in the US and Australia emissions are currently 16 tonnes per person and in the UK seven tonnes. “That’s obviously a really big change and we wanted to show that individuals have an opportunity to be a part of that,” said Kimberly Nicholas, at Lund University in Sweden and one of the research team.

The new study, published in Environmental Research Letters, sets out the impact of different actions on a comparable basis. By far the biggest ultimate impact is having one fewer child, which the researchers calculated equated to a reduction of 58 tonnes of CO2 for each year of a parent’s life.

The figure was calculated by totting up the emissions of the child and all their descendants, then dividing this total by the parent’s lifespan. Each parent was ascribed 50% of the child’s emissions, 25% of their grandchildren’s emissions and so on.
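
As a minimal sketch of that accounting, the snippet below uses hypothetical round numbers (16 tonnes of CO2 per person per year, an 80-year lifespan, one child per person, ten generations counted) rather than the study’s country-specific emissions and fertility data, so it will not reproduce the 58-tonne figure; it only illustrates how the ascribed share halves with each generation.

    # Minimal sketch of the descendant-weighted accounting described above.
    # All inputs are assumed, illustrative values -- not the study's data.
    ANNUAL_EMISSIONS = 16.0   # tonnes CO2 per person per year (assumed)
    LIFESPAN = 80             # assumed lifespan in years
    GENERATIONS = 10          # generations of descendants to count

    lifetime = ANNUAL_EMISSIONS * LIFESPAN   # one person's lifetime emissions
    total, share = 0.0, 0.5                  # 50% of the child, 25% of grandchildren, ...
    for _ in range(GENERATIONS):
        total += share * lifetime            # one descendant per generation assumed
        share *= 0.5

    print(total / LIFESPAN)   # tonnes CO2 ascribed to the parent per year of life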

“We recognise these are deeply personal choices. But we can’t ignore the climate effect our lifestyle actually has,” said Nicholas. “It is our job as scientists to honestly report the data. Like a doctor who sees the patient is in poor health and might not like the message ‘smoking is bad for you’, we are forced to confront the fact that current emission levels are really bad for the planet and human society.”

“In life, there are many values on which people make decisions and carbon is only one of them,” she added. “I don’t have children, but it is a choice I am considering and discussing with my fiance. Because we care so much about climate change that will certainly be one factor we consider in the decision, but it won’t be the only one.”

Overpopulation has been a controversial factor in the climate change debate, with some pointing out that an American is responsible for 40 times the emissions produced by a Bangladeshi and that overconsumption is the crucial issue. The new research comes a day after researchers blamed overpopulation and overconsumption for the “biological annihilation” of wildlife which has started a mass extinction of species on the planet.

Nicholas said that many of the choices had positive effects as well, such as a healthier diet, as meat consumption in developed countries is about five times higher than recommended by health authorities. Cleaner transport also cuts air pollution, and walking and cycling can reduce obesity. “It is not a sacrifice message,” she said. “It is trying to find ways to live a good life in a way that leaves a good atmosphere for the planet. I’ve found it really positive to make many of these changes.”

The researchers analysed dozens of sources from Europe, North America and Japan to calculate the carbon savings individuals in richer nations can make. They found getting rid of a car saved 2.4 tonnes a year, avoiding a return transatlantic flight saved 1.6 tonnes and becoming vegetarian saved 0.8 tonnes a year.

These actions saved the same carbon whichever country an individual lived in, but others varied. The savings from switching to an electric car depend on how green electricity generation is, meaning big savings can be made in Australia but the savings in Belgium are six times lower. Switching your home energy supplier to a green energy company also varied, depending on whether the green energy displaces fossil fuel energy or not.

Nicholas said the low-impact actions, such as recycling, were still worth doing: “All of those are good things to do. But they are more of a beginning than an end. They are certainly not sufficient to tackle the scale of the climate challenge that we face.”

The researchers found that government advice in the US, Canada, EU and Australia rarely mentioned the high impact actions, with only the EU citing eating less meat and only Australia citing living without a car. None mentioned having one fewer child. In an analysis of school textbooks in Canada, only 4% of the recommendations were high impact.

Chris Goodall, an author on low carbon living and energy, said: “The paper usefully reminds us what matters in the fight against global warming. But in some ways it will just reinforce the suspicion of the political right that the threat of climate change is simply a cover for reducing people’s freedom to live as they want.

“Population reduction would probably reduce carbon emissions but we have many other tools for getting global warming under control,” he said. “Perhaps more importantly, cutting the number of people on the planet will take hundreds of years. Emissions reduction needs to start now.”

Full Report

http://iopscience.iop.org/article/10.1088/1748-9326/aa7541/pdf

OIRA works quietly on updating social cost of carbon

Hannah Hess, E&E News reporter, Greenwire: Thursday, June 15, 2017

Some federal employees are still working on the social cost of carbon.

Not far from the White House, some of the federal government's most influential number crunchers are still working on the social cost of carbon.

President Trump's executive order on energy independence effectively signaled "pencils down" on federal work to estimate the monetary damage of greenhouse gas emissions, disbanding the interagency working group that calculated the dollar value on the effect of greenhouse gas emissions on the planet and society.

But his order didn't eliminate the metric entirely.

The Office of Information and Regulatory Affairs' Jim Laity, a career staffer who leads the Natural Resources and Environment Branch, said yesterday his office is "actively working on thinking about the guidance" Trump gave in March.

With the Trump agenda focused on regulatory rollback, federal agencies haven't yet issued rules that require valuations of carbon emissions, "although they are working on something coming in the not-too-distant future," Laity told an audience attending the National Academy of Sciences' seminar on valuing climate change impacts.

Employees from U.S. EPA, the Interior Department and the Department of Energy were in the crowd, along with academics and prominent Washington think tank scholars.

Greens fear the Trump administration could wipe out the social cost of carbon valuation, currently set around $40 per metric ton of carbon dioxide, as part of what they portray as a war on climate science. Revisions to the White House estimates have raised hackles among Republicans and conservatives, who allege they are part of a secret power grab.

Laity would play a key role in that effort as top overseer of energy and environmental rules.

Addressing the summit via webcast yesterday, Laity walked the audience through a decade of actions related to the calculation and influential feedback, including from a 13-member panel of the National Academies of Sciences, Engineering and Medicine. He stressed the outcome is still uncertain.

The Trump administration "looked at the work we had done, and they looked at the criticisms, and they decided we needed to have a pause to kind of rethink what we were doing a little bit in this area," Laity said, previewing the executive order that was praised by the oil and gas industry.

The metric flew under the radar during President Obama's first term. However, a significant jump in values in 2013 set the stage for a clash between then-OIRA Director Howard Shelanski and Republican lawmakers who alleged the estimate was the product of a closed-door process (E&E Daily, July 19, 2013).

A few months after the hearing, OIRA announced it would provide opportunity for public comment on the estimate. Critics took issue with the discount rate the government used to account for damage and expressed concern that the finding, though based on peer-reviewed literature, had not been peer-reviewed.

"I think we also realized that we needed to pay more attention to some of these issues than maybe we had in the past," Laity said.

OIRA received 140 unique comments, and some 39,000 letters, which Laity said mostly supported having some kind of cost-benefit analysis of the cost to society of each ton of emissions in property damage, health care costs, lost agricultural output and other expenses.

"We took some of those concerns to heart," he said.

In July 2015, the White House slightly revised its estimate. It also announced that the executive branch would seek independent expert advice from the National Academies to inform yet another revision of the estimate, though federal agencies would continue to use the current figure in their rulemakings until that revision could be made (Greenwire, July 6).

Part one of the two-phase study, released in January 2016, blessed the figure. It found the White House did not need to review its estimates in the short term. Part two, released a few weeks before Obama left office, recommended a new framework for making the estimate and more research into long-term climate damage.

Within three months, the Trump administration formally disbanded the interagency working group that would have implemented those recommendations. The executive order also formally withdrew the technical support documents OIRA had used in the past.

The think tank Resources for the Future recently announced the start of an initiative focused on responding to the scientists' recommendations (Greenwire, June 9).

Guidance for OIRA

Laity noted that the next paragraph of Trump's executive order acknowledged agencies would need to continue monetizing greenhouse gas emissions damage in their regulations, to the extent that the rules affect emissions.

As an interim measure, the order directed OIRA to offer some guidance about how to do that, directing that any values that were used would be consistent with the guidance of a document published by the Office of Management and Budget in September 2003, Circular A-4.

There are two main areas of concern: the discount rate and the global focus of the current estimate.

"A-4 has very specific advice," Laity explained. The document says pretty unequivocally that the main factor in weighing regulations should be cost and benefits to the U.S. If a regulation does have significant impacts outside the U.S. that are important to consider, these should be "clearly segregated out and reported separately," he said.

Economists clashed during a recent hearing of the House Science, Space and Technology Committee on how best to estimate the figure.

Former Obama administration official Michael Greenstone, who attended yesterday's summit and gave a presentation, argues the benefits of emission reductions play out in part in international politics.

Greenstone, chief economist for the Council of Economic Advisers in 2009 and 2010, has warned that reverting to a social cost of carbon that only considers domestic benefits of emission reductions is "essentially asking the rest of the world to ramp up their emissions" (E&E Daily, March 1).

Said Laity: "All I can say right now is that we are actively working on thinking about the guidance that we have been given in the new executive order and, sort of technically, how best to implement that in upcoming ... rulemaking."

Aging Oil Pipeline Under the Great Lakes Should Be Closed, Michigan AG Says

Enbridge’s Line 5 lies exposed on the floor of the Straits of Mackinac and has been losing chunks of its outer coating. A new report suggests ways to deal with it.

BY NICHOLAS KUSNETZ, JUN 30, 2017

Enbridge's Line 5 carries 540,000 barrels of oil per day from Canada to Ohio, going under the Straits of Mackinac, where Lake Michigan and Lake Huron meet.

Michigan Attorney General Bill Schuette called for a deadline to close a controversial portion of an oil pipeline that runs along the bottom of the Straits of Mackinac, a channel that connects two of the Great Lakes. The pipeline has had more than two dozen leaks over its lifespan, and parts of its outer coating have come off.

The announcement came as the state released a report looking at alternatives for that section of the Enbridge pipeline, called Line 5.

The report's suggestions include drilling a tunnel under the straits for a new line, selecting an alternate route or using rail cars to transport the oil instead. It also left open the possibility that the existing pipeline could continue to operate indefinitely.

"The Attorney General strongly disagrees" with allowing the existing pipeline to continue operating, said a statement released by Schuette's office on Thursday. "A specific and definite timetable to close Line 5 under the straits should be established."

Schuette did not, however, specify when that deadline should be, or how it should be set.

For years, environmentalists and a local Indian tribe have been calling for the closure of this short stretch of the pipeline. Built in 1953, it sits exposed above the lakebed where Lake Huron meets Lake Michigan. Earlier this year, Enbridge acknowledged that an outer coating had fallen off of the line in places, and it has sprung at least 29 leaks in its 64-year history. The 645-mile line carries about 540,000 barrels per day of light crude, including synthetic crude from Canada's tar sands, as well as natural gas liquids, from Superior, Wisconsin, to Sarnia, Ontario.

Schuette, a Republican, had said before that this section of the line should close eventually, but he hasn't taken any action to hasten a closure. Advocacy groups have asked the state to revoke Enbridge's easement to pass through the straits.

"It's great that he's reasserting his commitment to shut down Line 5," said Mike Shriberg, Great Lakes executive director for the National Wildlife Federation. "The question now is, is there enough evidence for him to take action right away."

The state had commissioned two studies on the line to be paid for by Enbridge, one that was released yesterday and another that was to produce a risk analysis for the pipeline. Last week, however, the state cancelled the risk analysis after discovering that someone who had contributed to it had subsequently done work for Enbridge.

Michael Barnes, an Enbridge spokesman, said the company would need time to review the report before giving specific comments, but that it "remains committed to protecting the Great Lakes and meeting the energy needs of Michigan through the safe operation of Line 5."

Shriberg said that now that the report on alternatives is out, it's time for the state to act.

"Ultimately, the attorney general and the governor have a decision to make," he said. "They've been saying for years that they've been waiting for the full information to come in."

California invested heavily in solar power. Now there's so much that other states are sometimes paid to take it

By IVAN PENN on JUNE 22, 2017 for LA Times

On 14 days during March, Arizona utilities got a gift from California: free solar power.

Well, actually better than free. California produced so much solar power on those days that it paid Arizona to take excess electricity its residents weren’t using to avoid overloading its own power lines.

It happened on eight days in January and nine in February as well. All told, those transactions helped save Arizona electricity customers millions of dollars this year, though grid operators declined to say exactly how much. And California also has paid other states to take power.

The number of days that California dumped its unused solar electricity would have been even higher if the state hadn’t ordered some solar plants to reduce production — even as natural gas power plants, which contribute to greenhouse gas emissions, continued generating electricity.

Solar and wind power production was curtailed a relatively small amount — about 3% in the first quarter of 2017 — but that’s more than double the same period last year. And the surge in solar power could push the number even higher in the future.

Why doesn’t California, a champion of renewable energy, use all the solar power it can generate?

The answer, in part, is that the state has achieved dramatic success in increasing renewable energy production in recent years. But it also reflects sharp conflicts among major energy players in the state over the best way to weave these new electricity sources into a system still dominated by fossil-fuel-generated power.

City officials and builders in Redondo Beach want a mixed-use development to replace the current natural gas facility. They say there is no need to overhaul the power plant when there is an abundance of clean alternatives. (Rick Loomis/Los Angeles Times)

No single entity is in charge of energy policy in California. This has led to a two-track approach that has created an ever-increasing glut of power and is proving costly for electricity users. Rates have risen faster here than in the rest of the U.S., and Californians now pay about 50% more than the national average.

Perhaps the most glaring example: The California Legislature has mandated that one-half of the state’s electricity come from renewable sources by 2030; today it’s about one-fourth. That goal once was considered wildly optimistic. But solar panels have become much more efficient and less expensive. So solar power is now often the same price or cheaper than most other types of electricity, and production has soared so much that the target now looks laughably easy to achieve.

At the same time, however, state regulators — who act independently of the Legislature — until recently have continued to greenlight utility company proposals to build more natural gas power plants.

State Senate Leader Kevin de Leon (D-Los Angeles) wants California to produce 100% of its electricity from clean energy sources such as solar and wind by 2045.

These conflicting energy agendas have frustrated state Senate Leader Kevin de Leon (D-Los Angeles), who opposes more fossil fuel plants. He has introduced legislation that would require the state to meet its goal of 50% of its electricity from renewable sources five years earlier, by 2025. Even more ambitiously, he recently proposed legislation to require 100% of the state’s power to come from renewable energy sources by 2045.

“I want to make sure we don’t have two different pathways,” de Leon said. Expanding clean energy production and also building natural gas plants, he added, is “a bad investment.”

Environmental groups are even more critical. They contend that building more fossil fuel plants at the same time that solar production is being curtailed shows that utilities — with the support of regulators — are putting higher profits ahead of reducing greenhouse gas emissions.

“California and others have just been getting it wrong,” said Leia Guccione, an expert in renewable energy at the Rocky Mountain Institute in Colorado, a clean power advocate. “The way [utilities] earn revenue is building stuff. When they see a need, they are perversely [incentivized] to come up with a solution like a gas plant.”

Regulators and utility officials dispute this view. They assert that the transition from fossil fuel power to renewable energy is complicated and that overlap is unavoidable.

They note that electricity demand fluctuates — it is higher in summer in California, because of air conditioning, and lower in the winter — so some production capacity inevitably will be underused in the winter. Moreover, the solar power supply fluctuates as well. It peaks at midday, when the sunlight is strongest. Even then it isn’t totally reliable.

Because no one can be sure when clouds might block sunshine during the day, fossil fuel electricity is needed to fill the gaps. Utility officials note that solar production is often cut back first because starting and stopping natural gas plants is costlier and more difficult than shutting down solar panels.

In the Mojave Desert at the California/Nevada border, the Ivanpah Solar Electric Generating System uses 347,000 garage-door-sized mirrors to heat water that powers steam generators. This solar thermal plant is one of the clean energy facilities that help produce 10% of the state’s electricity. (Mark Boster / Los Angeles Times)

Eventually, unnecessary redundancy of electricity from renewables and fossil fuel will disappear, regulators, utilities and operators of the electric grid say.

“The gas-fired generation overall will show decline,” said Neil Millar, executive director of infrastructure at CAISO, the California Independent System Operator, which runs the electric grid and shares responsibility for preventing blackouts and brownouts. “Right now, as the new generation is coming online and the older generation hasn’t left yet, there is a bit of overlap.”

Utility critics acknowledge these complexities. But they counter that utilities and regulators have been slow to grasp how rapidly technology is transforming the business. A building slowdown is long overdue, they argue.

Despite a growing glut of power, however, authorities only recently agreed to put proposals for some of the new natural gas power plants that utilities want to build on hold while they reconsider whether the plants are needed.

A key question in the debate, and one regulators are studying, is when California will be able to rely on renewable power for most or all of its needs and safely phase out fossil fuel plants.

The answer depends in large part on how fast battery storage improves, so it is cheaper and can store power closer to customers for use when the sun isn’t shining. Solar proponents say the technology is advancing rapidly, making reliance on renewables possible far sooner than previously predicted, perhaps two decades or even less from now — which means little need for new power plants with a life span of 30 to 40 years.

Calibrating this correctly is crucial to controlling electricity costs.

“It’s not the renewables that’s the problem. It’s the state’s renewable policy that’s the problem,” said Gary Ackerman, president of the Western Power Trading Forum, an association of independent power producers. “We’re curtailing renewable energy in the summertime months. In the spring, we have to give people money to take it off our hands.”

Not long ago, solar was barely a rounding error for California’s energy producers.

In 2010, power plants in the state generated just over 15% of their electricity production from renewable sources. But that was mostly wind and geothermal power, with only a scant 0.5% from solar. Now that overall amount has grown to 27%, with solar power accounting for 10%, or most of the increase. The solar figure doesn’t include the hundreds of thousands of rooftop solar systems that produce an additional 4 percentage points, a share that is ever growing.

The share of the state’s power generated by solar utilities and rooftop panels has skyrocketed in recent years: in 2016, utility-scale solar supplied 9.6% of California’s electricity and rooftop panels another 4.2%. (Rooftop panels were not tracked by the federal government prior to 2014.)

Behind the rapid expansion of solar power: its plummeting price, which makes it highly competitive with other electricity sources. In part that stems from subsidies, but much of the decline comes from the sharp drop in the cost of making solar panels and their increased efficiency in converting sunlight into electricity.

The average cost of solar power for residential, commercial and utility-scale projects declined 73% between 2010 and 2016. Solar electricity now costs 5 to 6 cents per kilowatt-hour — the amount needed to light a 100-watt bulb for 10 hours — to produce, or about the same as electricity produced by a natural gas plant and half the cost of a nuclear facility, according to the U.S. Energy Information Administration.

Fly over the Carrizo Plain in California’s Central Valley near San Luis Obispo and you’ll see that what was once barren land is now a sprawling solar farm, with panels covering more than seven square miles — one of the world’s largest clean-energy projects. When the sun shines over the Topaz Solar Farm, the shimmering panels produce enough electricity to power all of the residential homes in a city the size of Long Beach, population 475,000.

The Topaz Solar Farm, one of the world’s largest solar plants, blankets the Carrizo Plain in the Central Valley. It supplies electricity to Pacific Gas & Electric Co. (NASA)

Other large-scale solar operations blanket swaths of the Mojave Desert, which has increasingly become a sun-soaking energy hub. The Beacon solar project covers nearly two square miles and the Ivanpah plant covers about five and a half square miles.

The state’s three big shareholder-owned utilities now count themselves among the biggest solar power producers. Southern California Edison produces or buys more than 7% of its electricity from solar generators, Pacific Gas & Electric 13% and San Diego Gas & Electric 22%.

Similarly, fly over any sizable city and you’ll see warehouses, businesses and parking lots with rooftop solar installations, and many homes as well.

With a glut of solar power at times, CAISO has two main options to avoid a system overload: order some solar and wind farms to temporarily halt operations or divert the excess power to other states.

That’s because too much electricity can overload the transmission system and result in power outages, just as too little can. Complicating matters is that even when CAISO requires large-scale solar plants to shut off panels, it can’t control solar rooftop installations that are churning out electricity.

CAISO is being forced to juggle this surplus more and more.

In 2015, solar and wind production were curtailed about 15% of the time on average during a 24-hour period. That rose to 21% in 2016 and 31% in the first few months of this year. The surge in solar production accounts for most of this, though heavy rainfall has increased hydroelectric power production in the state this year, adding to the surplus of renewables.

California’s clean energy supply is growing so fast that solar and wind producers are increasingly being ordered to halt production.

Even when solar production is curtailed, the state can produce more than it uses, because it is difficult to calibrate supply and demand precisely. As more homeowners install rooftop solar, for example, their panels can send more electricity to the grid than anticipated on some days, while the state’s overall power usage might fall below what was expected.

This means that CAISO increasingly has excess solar and wind power it can send to Arizona, Nevada and other states.

When those states need more electricity than they are producing, they pay California for the power. But California has excess power on a growing number of days when neighboring states don’t need it, so California has to pay them to take it. CAISO calls that “negative pricing.”

Why does California have to pay rather than simply give the power away free?

When there isn’t demand for all the power the state is producing, CAISO needs to quickly sell the excess to avoid overloading the electricity grid, which can cause blackouts. Basic economics kick in. Oversupply causes prices to fall, even below zero. That’s because Arizona has to curtail its own sources of electricity to take California’s power when it doesn’t really need it, which can cost money. So Arizona will use power from California at times like this only if it has an economic incentive — which means being paid.
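
A minimal sketch of that incentive, assuming a hypothetical curtailment cost for the importing utility (this is illustrative arithmetic, not CAISO's actual market model):

    # Illustrative negative-pricing logic: an out-of-state buyer accepts surplus power
    # only if the payment covers the cost of backing down its own generators.
    def importer_accepts(offered_price_per_mwh: float, curtailment_cost_per_mwh: float) -> bool:
        """A negative offered price means California pays the buyer to take the power."""
        payment = -offered_price_per_mwh
        return payment >= curtailment_cost_per_mwh

    # Hypothetical numbers: suppose Arizona spends $20/MWh ramping down its own plants.
    print(importer_accepts(offered_price_per_mwh=-25, curtailment_cost_per_mwh=20))  # True: worth taking
    print(importer_accepts(offered_price_per_mwh=-10, curtailment_cost_per_mwh=20))  # False: not worth it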

In the first two months of this year, CAISO paid to send excess power to other states seven times more often than in the same period in 2014. “Negative pricing” happened in an average of 18% of all sales, versus about 2.5% in the same period in 2014.

Most “negative pricing” typically has occurred for relatively short periods at midday, when solar production is highest.

But what happened in March shows how the growing supply of solar power could have a much greater impact in the future. The periods of “negative pricing” lasted longer than in the past — often for six hours at a time, and once for eight hours, according to a CAISO report.

The excess power problem will ease somewhat in the summer, when electricity usage is about 50% higher in California than in the winter.

But CAISO concedes that curtailments and “negative pricing” are likely to happen even more often in the future as solar power production continues to grow, unless action is taken to better manage the excess electricity.

The sprawling Ivanpah Solar Electric Generating System, owned by NRG Energy and BrightSource Energy, occupies 5.5 square miles in the Mojave Desert. The plant can supply electricity to 180,000 Pacific Gas & Electric and Southern California Edison customers. (Mark Boster/Los Angeles Times)

Arizona’s largest utility, Arizona Public Service, is one of the biggest beneficiaries of California’s largesse because it is next door and the power can easily be sent there on transmission lines.

On days that Arizona is paid to take California’s excess solar power, Arizona Public Service says it has cut its own solar generation rather than fossil fuel power. So California’s excess solar isn’t reducing greenhouse gases when that happens.

CAISO says it does not calculate how much it has paid others so far this year to take excess electricity. But its recent oversupply report indicated that it frequently paid buyers as much as $25 per megawatt-hour to get them to take excess power, according to the Energy Information Administration.

That’s a good deal for Arizona, which uses what it is paid by California to reduce its own customers’ electricity bills. Utility buyers typically pay an average of $14 to $45 per megawatt-hour for electricity when there isn’t a surplus from high solar power production.

With solar power surging so much that it is sometimes curtailed, does California need to spend $6 billion to $8 billion to build or refurbish eight natural gas power plants that have received preliminary approval from regulators, especially as legislative leaders want to accelerate the move away from fossil fuel energy?

The answer depends on whom you ask.

Utilities have repeatedly said yes. State regulators have agreed until now, approving almost all proposals for new power plants. But this month, citing the growing electricity surplus, regulators announced plans to put on hold the earlier approvals of four of the eight plants to determine if they really are needed.

Big utilities continue to push for all of the plants, maintaining that building natural gas plants doesn’t conflict with expanding solar power. They say both paths are necessary to ensure that California has reliable sources of power — wherever and whenever it is needed.

The biggest industrial solar power plants, they note, produce electricity in the desert, in some cases hundreds of miles from population centers where most power is used.

At times of peak demand, transmission lines can get congested, like Los Angeles highways. That’s why CAISO, utilities and regulators argue that new natural gas plants are needed closer to big cities. In addition, they say, the state needs ample electricity sources when the sun isn’t shining and the wind isn’t blowing enough.

Utility critics agree that some redundancy is needed to guarantee reliability, but they contend that the state already has more than enough.

California has so much surplus electricity that existing power plants run, on average, at slightly less than one-third of capacity. And some plants are being closed decades earlier than planned.

As for congestion, critics note that the state already is crisscrossed with an extensive network of transmission lines. Building more plants and transmission lines wouldn’t make the power system much more reliable, but would mean higher profits for utilities, critics say.

That is what the debate is about, said Jaleh Firooz, a power industry consultant who previously worked as an engineer for San Diego Gas & Electric for 24 years and helped in the formation of CAISO.

“They have the lopsided incentive of building more,” she said.

Jaleh Firooz, who worked 24 years as an engineer for San Diego Gas & Electric Co., says utilities seeking higher profits “have the lopsided incentive of building more” power plants and transmission lines. (Robert Gauthier/Los Angeles Times)

The reason: Once state regulators approve new plants or transmission lines, their cost is built into the rates that the utility can charge electricity users, no matter how much or how little the new capacity is used.

Given that technology is rapidly tilting the competitive advantage toward solar power, there are less expensive and cleaner ways to make the transition toward renewable energy, she said.

To buttress her argument, Firooz pointed to a battle in recent years over a natural gas plant in Redondo Beach.

Independent power producer AES Southland in 2012 proposed replacing an aging facility there with a new one. The estimated cost: $250 million to $275 million, an amount that customers would pay off with higher electricity bills.

CAISO and Southern California Edison, which was going to buy power from the new plant, supported it as necessary to protect against potential power interruptions. Though solar and wind power production was increasing, they said those sources couldn’t be counted on because their production is variable, not constant.

The California Public Utilities Commission approved the project, agreeing that it was needed to meet the long-term electricity needs in the L.A. area.

But the California Coastal Conservancy, a conservation group opposed to the plant, commissioned an analysis by Firooz to determine how vital it was. Her conclusion: not at all.

Firooz calculated that the L.A. region already had excess power production capacity — even without the new plant — at least through 2020.

Along with the cushion, her report found, a combination of improved energy efficiency, local solar production, storage and other planning strategies would be more than sufficient to handle the area’s power needs even as the population grew.

She questioned utility arguments.

“In their assumptions, the amount of capacity they give to the solar is way, way undercut because they have to say, ‘What if it’s cloudy? What if the wind is not blowing?’ ” Firooz explained. “That’s how the game is played. You build these scenarios so that it basically justifies what you want.”

Undeterred, AES Southland pressed forward with its proposal. In 2013, Firooz updated her analysis at the request of the city of Redondo Beach, which was skeptical that a new plant was needed. Her findings remained the same.

Nonetheless, the state Public Utilities Commission approved the project in March 2014 on the grounds that it was needed. But the California Energy Commission, another regulatory agency whose approval for new plants is required along with the PUC’s, sided with the critics. In November 2015 it suspended the project, effectively killing it.

Asked about the plant, AES said it followed the appropriate processes in seeking approval. It declined to say whether it still thinks that a new plant is needed.

The existing facility is expected to close in 2020.

A March 2017 state report showed why critics are confident that the area will be fine without a new plant: The need for power from Redondo Beach’s existing four natural gas units has been so low, the state found, that the units have operated at less than 5% of their capacity during the last four years.

Exxon Makes a Biofuel Breakthrough

By Jennifer A Dlouhy, June 19, 2017

J. Craig Venter’s Synthetic Genomics teamed with Exxon Mobil

Technique could lead to commercialization of algae-based fuels

It’s the holy grail for biofuel developers hoping to coax energy out of algae: Keep the organism fat enough to produce oil but spry enough to grow quickly.

J. Craig Venter, the scientist who mapped the human genome, just helped Exxon Mobil Corp. strike that balance, with a breakthrough that could enable widespread commercialization of algae-based biofuels. Exxon and Venter’s Synthetic Genomics Inc. are announcing the development at a conference in San Diego on Monday.

They used advanced cell engineering to more than double the fatty lipids inside a strain of algae. The technique may be replicated to boost lipid levels in other species too.

"Tackling the inner workings of algae cells has not been trivial," Venter said. "Nobody’s really ever been there before; there’s no guideline to go by."

Venter, who co-founded Synthetic Genomics and sequenced the human genome in the 1990s, says the development is a significant advancement in the quest to make algae a renewable energy source. The discovery is being published in the July issue of the journal Nature Biotechnology.

It’s taken eight years of what Venter called tedious research to reach this point.

When Exxon Mobil announced its $600 million collaboration with Synthetic Genomics in 2009, the oil company predicted it might yield algae-based biofuels within a decade. Four years later, Exxon executives conceded a better estimate might be within a generation.

Developing strains that reproduce and generate enough of the raw material to supply a refinery meant the venture might not succeed for at least another 25 years, former chief executive and current U.S. Secretary of State Rex Tillerson said at the time.

Even with this newest discovery, commercialization of this kind of modified algae is decades away.

Venter says the effort has "been a real slog."

"It’s to the team’s credit -- it’s to Exxon’s credit -- that they believed the steps in the learning were actually leading some place," he said. "And they have."

The companies forged on -- renewing their joint research agreement in January amid promising laboratory results.

Exxon declined to disclose how much the Irving, Texas-based company has invested in the endeavor so far. Vijay Swarup, a vice president at ExxonMobil Research and Engineering Co., says the collaboration is part of the company’s broad pursuit of "more efficient ways to produce the energy and chemicals" the world needs and "mitigate the impacts of climate change."

Carbon Consumer

Where Exxon’s chief products -- oil and natural gas -- generate carbon dioxide emissions that drive the phenomenon, algae is a CO2 consumer, Swarup said.

Most renewable fuels today are made from plant material, including corn, corn waste and soybean oil. Algae has long been considered a potentially more sustainable option; unlike those traditional biofuels, it can grow in salt water and thrive under harsh environmental conditions. And the oil contained in algae potentially could be processed in conventional refineries.

The Exxon and Synthetic Genomics team found a way to regulate the expression of genes controlling the accumulation of lipids, or fats, in the algae -- and then use it to double the strain’s lipid productivity while retaining its ability to grow.

"To my knowledge, no other group has achieved this level of lipid production by modifying algae, and there’s no algae in production that has anything like this level," Venter said in a telephone interview. It’s "our first super-strong indication that there is a path to getting to where we need to go."

Nitrogen Starved

They searched for the needed genetic regulators after observing what happened when cells were starved of nitrogen -- a tactic that generally drives more oil accumulation. Using the CRISPR-Cas9 gene-editing technique, the researchers were able to winnow a list of about 20 candidates to a single regulator -- they call it ZnCys -- and then to modulate its expression.

Test strains were grown under conditions mimicking an average spring day in southern California.

Rob Brown, Ph.D., senior director of genome engineering at Synthetic Genomics, likened the tactic to forcing an agile algae athlete to sit on the bench.

"We basically take an athlete and make them sit on a couch and get fat," Brown said. "That’s the switch — you grab this guy off the track and you put him on a couch and he turns into a couch potato. So everything he had in his body that was muscle, sinew, carbohydrates -- we basically turn that into a butterball. That’s what we’re basically doing with this system.”

Without the change, most algae growing in this environment would produce about 10 to 15 percent oil. The Exxon and Synthetic Genomics collaboration yielded a strain with more than 40 percent.
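
Expressed as simple arithmetic, the gain looks like this (an illustrative calculation based only on the percentages quoted above):

    # The reported improvement in oil content, expressed as a multiple.
    typical_oil_fractions = (0.10, 0.15)   # "about 10 to 15 percent oil" for unmodified algae
    engineered_oil_fraction = 0.40         # "more than 40 percent" for the modified strain

    for typical in typical_oil_fractions:
        ratio = engineered_oil_fraction / typical
        print(f"{ratio:.1f}x the oil content of a {typical:.0%} strain")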

Venter, who is also working on human longevity research, views the development as a significant step toward the sustainable energy he believes humans need as they live longer, healthier lives. The study also is proof, he says, that "persistence pays."

"You have to believe in what you’re doing and that where you’re headed really is the right direction," he said, "and sometimes, like this, it takes a long time to really prove it.”

Hundreds of scientists call for caution on anti-microbial chemical use

More than 200 scientists outline a broad range of concerns for triclosan and triclocarban and call for reduced use worldwide

June 20, 2017, By Brian Bienkowski, Environmental Health News

Two ingredients used in thousands of products to kill bacteria, fungi and viruses linger in the environment and pose a risk to human health, according to a statement released today by more than 200 scientists and health professionals.

The scientists say the possible benefits in most uses of triclosan and triclocarban—used in some soaps, toothpastes, detergents, paints, carpets—are not worth the risk. The statement, published today in the Environmental Health Perspectives journal, urges “the international community to limit the production and use of triclosan and triclocarban and to question the use of other antimicrobials.”

“Triclosan and triclocarban have been permitted for years without definitive proof they’re providing benefits.” -Avery Lindeman, Green Policy Institute

They also call for warning labels on any product containing triclosan and triclocarban and for bolstered research of the chemicals' environmental toll.

The statement says evidence that the compounds are accumulating in water, land, wildlife and humans is sufficient to merit action.

“We want to draw attention to the increasing use of antimicrobials in a lot of products,” said Dr. Ted Schettler, science director of the Science and Environmental Health Network and one of the signatories of the statement. “Triclosan and triclocarban rose to the top because they’ve been in use for so long and exposures are so widespread.”

The chemicals are used to kill microbes such as bacteria and viruses that make people ill. However, both chemicals affect animals’ hormone systems, causing reproductive and developmental problems. And there is nascent evidence that the impacts may extend to humans as well: the compounds have been linked to reduced growth of fetuses, earlier births, and smaller head circumference in boys at birth.

The compounds are used in an estimated 2,000 products but are being phased out of some uses. In February the EU banned triclosan in hygiene products. U.S. manufacturers are phasing out triclosan from hand soaps after the Food and Drug Administration banned it last year amid concerns that the compound disrupted the body's hormone systems.

The FDA noted in the restriction that antibacterial hand soaps were no more effective than non-antibacterial soap and water at preventing illness.

“Triclosan and triclocarban have been permitted for years without definitive proof they’re providing benefits,” said Avery Lindeman, deputy director of the Green Policy Institute and one of the signatories of the statement. The compounds, she added, serve as little more than a “marketing ploy” for many products, such as antimicrobial cutting boards and socks.

Despite soap bans, triclosan remains in Colgate Total toothpaste, some cleaning products and cosmetics. More worrisome, Lindeman said, some manufacturers of personal care products are simply substituting other antimicrobials for triclosan—some of which may pose the same risks to people and the environment.

Triclosan and triclocarban also show up in odd places, such as building products, Schettler said. “Some building materials are subject to microbial degradation, attacked by things like fungi that break them down, so manufacturers will put antimicrobials in there to reduce the risk,” he said.

Because of the widespread use, most people have some levels of triclosan in them. A 2008 study of U.S. residents found it in the urine of about 75 percent of people tested.

Once the compounds get into the environment, they don’t readily go away. Researchers have detected triclosan and triclocarban in water and sediment all over the world—including drinking water, oceans and streams. The U.S. Geological Survey found triclosan in 60 percent of U.S. streams. Studies have shown triclosan to be toxic to some algae, fish and crustaceans.

The compounds impact hormones in animal studies. And there’s evidence that they may do the same to developing babies. Properly functioning hormones are critical for babies’ proper development. Last month Brown University researchers reported that mothers’ triclosan exposure during pregnancy was linked to lower birth weights, smaller heads and earlier births. They also found that as the children aged, triclosan levels spiked after they brushed their teeth or washed their hands.

In addition to endocrine disruption concerns, Lindeman and other signers outline two other potential human health impacts from exposure to triclosan: heightened sensitivity to allergens, and antibiotic resistance.

Large studies of children in the United States and Norway have linked triclosan to allergies and worsening asthma. And there is evidence bacteria that develop resistance to triclosan also become resistant to other antibacterial compounds.

Companies such as Colgate have defended triclosan use. The American Chemistry Council, which represents chemical manufacturers, declined to comment on the statement.

The authors of the statement recognize the need for antimicrobials and didn’t call for a total ban, Schettler said. Toothpastes with triclosan can help people with gum disease, and in hospitals germ-killing soaps are crucial before surgery and for use around people with weakened immune systems, he said.

But the everyday use could be curbed, he added, and the hope is the statement starts a broader conversation.

“Whether it will have an influence on the policy level remains to be seen, but a lot can come together through consumer interest and concerns causing manufacturers to slowly shift directions,” Schettler said.

Hygiene leaves kids with loads of triclosan

Seemingly healthy actions like brushing teeth and washing hands leave kids with increasing levels of a controversial endocrine disrupter. Are the anti-bacterial benefits worth the risk?

June 1, 2017, By Brian Bienkowski, Environmental Health News

Levels of a controversial chemical meant to kill bacteria spike in the bodies of young children after they brush their teeth or wash their hands, according to a new study.

U.S. manufacturers are phasing triclosan out of hand soaps after a Food and Drug Administration ban took effect last year amid concerns that the compound disrupts the body's hormone systems. It remains in Colgate Total toothpaste, some cleaning products and cosmetics. Health experts say exposure is best avoided for babies in the womb and developing children.

The latest study, published in the journal Environmental Science & Technology, is one of the first to show that children’s levels rise through their first few years of life. Hand washing and teeth brushing have a speedy, significant impact on levels, the researchers found.

“There’s very little data on the exposure in those first years of life,” said the senior author of the study, Joe Braun, an assistant professor and researcher at Brown University. “There are a lot of behavioral changes in those years, and environmental chemicals can play a role.”

Braun and colleagues tested the urine of 389 mothers and their children from Cincinnati, collecting samples from the women three times during pregnancy and from the children periodically between 1 and 8 years old.

They found triclosan in more than 70 percent of the samples. Among 8-year-olds, levels were 66 percent higher in those who used hand soap. And more washing left the children with higher loads: those who reported washing their hands more than five times per day had more than four times the triclosan concentration of those who washed once a day or less.

Children who had brushed their teeth within the last day had levels 2.5 times higher than those who had a toothpaste-free 24-hour span.

“It’s a thorough, well-done analysis,” said Isaac Pessah, a researcher and professor at University of California, Davis, who was not involved in the study. “Given the high concentrations [of triclosan] in personal care products, you’re seeing that the concentrations in the end user are also quite high.”

Braun said the levels of triclosan rose as the children aged, eventually leveling off. “Their levels were almost to moms’ levels by the time they reached 5 to 8 years of age.”

"There is evidence that triclosan is positive for dental health but if it's not recommended from your dentist that you use this product, then don't."-Joe Braun, Brown University

This, he said, is likely due to more frequent use of personal care products as the kids aged. Despite the hand soap ban, triclosan remains on the market because it is effective at fighting plaque and gingivitis.

Colgate uses 0.3 percent of the antibacterial in Total to “fight harmful plaque germs.”

Colgate’s parent company, Colgate-Palmolive, did not respond to requests to comment on Braun’s study. The company has, however, maintained triclosan’s safety in its toothpaste and its role in fighting plaque and gum disease.

“As for claims of endocrine disruption, the World Health Organization defines an endocrine disruptor as something harmful to one’s health, and that is simply not the case here,” wrote Colgate’s head of R&D, Patricia Verduin, in response to media reports on triclosan’s endocrine impacts.

Braun, however, said there is “quite compelling” evidence from animal studies that triclosan decreases thyroid hormone levels. Properly functioning thyroid hormones are critical for brain development.

Just last month, using the same mothers and children, Braun and others reported that mothers’ triclosan exposure during pregnancy was linked to lower birth weights, smaller heads and earlier births.

In addition, Pessah and colleagues reported triclosan hinders proper muscle development. The researchers used mice and fish, finding that triclosan affects the process responsible for muscle contraction.

Braun said the study was limited in that the researchers didn’t know what brands of toothpaste or soap the kids were using. Also, product use was self-reported, and mothers may not have wanted to admit their kids weren’t brushing their teeth, he said.

Researchers do not dispute triclosan’s effectiveness at reducing bacteria. And Pessah said it has a role in protecting oral health. “Perhaps if we would have made triclosan available by prescription for some people, like those with secondary infection on their gums from diabetes, we wouldn’t reach the contamination levels we now see.”

Braun said people might take pause next time they’re browsing for bathroom and cleaning products.

“There is evidence that triclosan is positive for dental health, but if it’s not recommended from your dentist that you use this product, then don’t,” Braun said. “Especially if you’re a young child or pregnant woman.”

In heart of Southwest, natural gas leaks fuel a methane menace

By Tom Knudson and Gayathri Vaidyanathan / June 14, 2017

BLANCO, N.M. – Most evenings, the quiet is almost intoxicating.

The whoosh of the wind through the junipers, the whinny of horses in their stalls, the raspy squawking of ravens – those are the sounds Don and Jane Schreiber have grown to love on their remote Devil’s Spring Ranch.

The views are mesmerizing, too. Long, lonesome ridges of khaki-colored rocks, dome-like outcrops and distant mesas rise from a sea of sage and rabbitbrush.

The ranch and surrounding countryside are a surprising setting for an enduring climate change problem: a huge cloud of methane – a potent, heat-trapping gas – that is 10 times larger than the city of Chicago. The main sources, scientists say, are leaks from about 25,000 active and abandoned wells and 10,000 miles of pipelines that snake across the San Juan Basin, providing about $3 billion worth of natural gas per year.

Home to stunning prehistoric ruins, cathedral-like canyons and silt-laden rivers, this basin in the Four Corners region of the Southwest emits substantially more methane per unit of energy produced than most major gas-producing areas, according to a Reveal from The Center for Investigative Reporting analysis of industry data reported to the federal government.

ConocoPhillips has been the region’s largest source of methane emissions, mostly from thousands of gas wells drilled on public lands managed by the Department of the Interior. BP America is the second largest source.

“This is the Land of Enchantment. I love this state,” said Don Schreiber, whose ranch and 5,760 acres of nearby public grazing land are dotted with more than 120 natural gas wells. “But it is a black mark on our reputation that we allow this to persist.”

Every year, an estimated half million tons of methane loft into the air over the basin, a geologic area located primarily in northwestern New Mexico, according to a recent study published in the journal Environmental Science & Technology. That’s equivalent to the annual greenhouse gas emissions of almost 3 million cars, “a substantial amount of methane emitted from one region,” said Eric Kort, an atmospheric scientist at the University of Michigan and co-author of the study.

Although methane accounts for only 9 percent of U.S. greenhouse gases, it traps heat far more effectively than carbon dioxide. That’s why controlling it is considered essential to addressing climate change.
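
A rough reconstruction of that cars-equivalent comparison (a sketch using commonly cited conversion factors as assumptions, not the study's own method): methane's 100-year global warming potential is roughly 28, and a typical passenger car emits on the order of 4.7 metric tons of CO2 per year.

    # Back-of-the-envelope check of the "almost 3 million cars" comparison.
    methane_tons_per_year = 500_000   # estimated annual emissions over the basin (from the study)
    gwp_100yr = 28                    # assumed 100-year global warming potential of methane
    co2e_tons = methane_tons_per_year * gwp_100yr

    car_co2_tons_per_year = 4.7       # assumed emissions of a typical passenger car
    cars_equivalent = co2e_tons / car_co2_tons_per_year
    print(f"Roughly {cars_equivalent / 1e6:.1f} million cars")   # ~3.0 million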

And while methane isn’t harmful to human health, other gases, such as benzene, that come from the wells are linked to cancer and other effects. Studies have linked the proximity of gas wells to a higher risk of birth defects in Colorado and increased hospital visits and respiratory and skin problems in Pennsylvania.

The San Juan Basin and Four Corners region – where Utah, Colorado, New Mexico and Arizona meet – are home to large expanses of Native American lands, including the 27,400-square-mile Navajo Nation, where some residents say the gases are making them sick, too.

“I feel nausea sometimes. I feel weak,” said Sam Dee, a 59-year-old rodeo announcer on the Navajo reservation in Utah just outside the basin. Near his home are gas and oil plants where pollution is flared into the air, contributing to the Four Corners methane cloud. “I am concerned for my grandkids, for their health and their climate.”

EPA suspends methane rules

Cleaner-burning than coal and oil, natural gas is touted as a steppingstone to a more climate-friendly future. Getting it out of the ground, though, isn’t easy. Imagine a pressure cooker filled with mud, water, crude oil, natural gas and other substances. That pressure cooker is the earth. When companies sink wells, the natural gas – which is mostly methane – comes hurtling out along with everything else.

Inevitably leaks happen. In addition, large amounts of gas are vented and flared into the air intentionally to relieve pressure and force water out of wells.

Controlling methane was a priority of the Obama administration in an effort to combat climate change and air pollution. But President Donald Trump has moved quickly to suspend, and perhaps rescind, rules he considers burdensome to the energy industry.

In May 2016, the Environmental Protection Agency adopted a regulation aimed at reducing emissions from new and modified wells on private land by a half million tons by 2025. A few months later, oil and gas groups sued the agency to block the rule, calling it “excessive, uneconomic and threatening to the long-term production of oil and natural gas.”

Two months after taking office, Trump issued an executive order directing federal agencies to reconsider restrictions on energy development. In response, EPA Administrator Scott Pruitt suspended the methane regulation and launched a review. He also ended an effort to gather information about leaks from old wells.

Pruitt, who was Oklahoma’s attorney general for six years, has close ties to energy companies. Oil and gas companies donated more than $100,000 over the past two years to a political action committee tied to Pruitt, according to Federal Election Commission records. Documents and emails also show that Pruitt has had a long-time alliance with Devon Energy, the San Juan Basin’s second-largest oil and gas producer. Devon attorneys drafted a letter that Pruitt, as attorney general, sent to President Barack Obama’s EPA to challenge the methane rules.

Another Obama administration regulation targeted methane emissions from wells on public lands. But the Interior Department announced last month that it would “suspend, revise or rescind (the rule) given its significant regulatory burden that encumbers American energy production, economic growth and job creation.”

Nationally, oil and gas companies would have to spend between $125 million and $161 million a year to reduce leaks and flaring on public lands, although the recovered gas would have offset much of that cost, according to the Bureau of Land Management.

Tracking down leaks

A massive plume of methane (brownish area at center of the accompanying map) has been discovered near the Four Corners region that scientists attribute to natural gas production. The map shows 2012 methane emissions from natural gas and oil production, the most recent data available. (Sources: Gridded National Inventory of U.S. Methane Emissions; Stamen Design. Credit: Eric Sagara/Reveal)

Methane is a stealth pollutant. You can’t see it. At Four Corners, the first hint of trouble came from space in 2003, when a spectrometer on a satellite detected an enormous cloud of methane northeast of the Grand Canyon, stretching across 2,500 square miles. Each time the satellite orbited the Earth, the cloud was there.

In 2014, scientists identified the oil and gas fields of the San Juan Basin as the possible source. Energy companies pushed back; there are other sources, they pointed out, including natural seeps. But the scientists weren’t finished. They mounted a device on a plane that could spot individual leaks. They drove to many sites and recorded videos.

Of 250 plumes, four were from natural geologic seeps. One was from a coal mine. Everything else was from oil and gas drilling.

The companies maintain that these studies are too narrow in scope to be conclusive.

Although the San Juan Basin produces just under 4 percent of the nation’s natural gas, it emitted 10 percent of all methane pollution from oil and gas operations in 2015, according to industry data. One reason why: The region is larger than most national parks. Reaching wells and pipelines to repair leaks takes time. A lot can go wrong: Gaskets freeze, valves malfunction and welds fail, according to reports filed with the New Mexico Oil Conservation Division.

Some venting and flaring is unavoidable. But around 40 percent could be captured “with currently available control technologies,” the U.S. Government Accountability Office reported in 2010.

To the north, in Wyoming, where state officials have enacted methane rules, the upper Green River Basin produces twice as much gas as the San Juan Basin but emits only about half as much methane, according to Jon Goldstein, senior energy policy manager at the Environmental Defense Fund.

New Mexico has no state methane regulation.

“We live under the largest methane cloud in the entire country thanks to over 35,000 wells that are just across the line in New Mexico,” said Gwen Lachelt, a La Plata County, Colorado, commissioner who lobbied in the U.S. Senate to retain the Bureau of Land Management’s methane measure for public land.

Dirtier here than elsewhere

Every year oil and gas companies report methane emissions and production data to the federal government. That information, obtained and analyzed by Reveal, identifies the polluters.

ConocoPhillips, the basin’s largest oil and gas producer with about 10,000 wells, emitted methane in 2015 at a rate more than three times higher than its rate for the rest of the country.

For BP America, its methane emissions rate in the basin was 17 times higher than its national rate. For Devon Energy, it was twice as high.
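
The comparisons above are emission rates: methane released per unit of gas produced, computed separately for the basin and for a company's operations nationwide. A minimal sketch of that calculation with made-up inputs (the companies' actual reported figures are not reproduced here):

    # Illustrative emission-intensity comparison of the kind described above.
    def emission_rate(methane_tons: float, gas_produced_mcf: float) -> float:
        """Methane emitted per thousand cubic feet (Mcf) of gas produced."""
        return methane_tons / gas_produced_mcf

    # Hypothetical inputs, chosen only to show the arithmetic:
    basin_rate = emission_rate(methane_tons=120_000, gas_produced_mcf=400_000_000)
    national_rate = emission_rate(methane_tons=300_000, gas_produced_mcf=4_000_000_000)
    print(f"Basin rate is {basin_rate / national_rate:.1f}x the company's national rate")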

“The industry there is dirtier than the industry in the country as a whole,” said David McCabe, an atmospheric scientist with the nonprofit environmental group Clean Air Task Force. “You need standards so the public can have some assurance companies will take care of this.”

ConocoPhillips spokesman Daren Beaudo defended the company’s San Juan Basin operations, which it sold in April to Texas-based Hilcorp Energy Co. and The Carlyle Group in Washington, D.C., for $3 billion. Nationwide, the company nearly halved the rate at which it emitted methane between 2012 and 2015, according to the data analyzed by Reveal.

“We’re concerned about emissions and we’ve been taking action to reduce them for more than eight years,” Beaudo said. “With a large number of operating wells, even small emissions add up.”

Living with gas wells

Few people are more familiar with the environmental costs of gas wells than Don and Jane Schreiber. Several wells are actually on their land courtesy of a federal “split estate” law that allows companies access to private land.

They’ve battled over road construction, dust, soil erosion, hazardous waste disposal and loss of wildlife habitat. “Never have I said, ‘Don’t drill,’” said Don Schreiber. “What I’ve said is, ‘Don’t drill this way.’ ”

“Natural gas is a fine fuel,” he added. “But it is finite. That we would allow ourselves to waste it, cause local health problems and global climate change, there is everything in the world wrong with that.”

Not long ago, he drove around the countryside in his pickup, pointing out drilling sites and scars. “We’re down to fighting for the air we breathe,” he said.

Driving past one well on public land, he stepped on the brake. Under cumulus clouds as white as pearls and surrounded by pinyon pine, juniper and sagebrush, the setting was majestic. The air smelled like a gas station.

Fungal toxins easily become airborne, creating potential indoor health risk

American Society for Microbiology

Washington, DC - June 23, 2017 - Toxins produced by three different species of fungus growing indoors on wallpaper may become aerosolized, and easily inhaled. The findings, which likely have implications for "sick building syndrome," were published in Applied and Environmental Microbiology, a journal of the American Society for Microbiology.

"We demonstrated that mycotoxins could be transferred from a moldy material to air, under conditions that may be encountered in buildings," said corresponding author Jean-Denis Bailly, DVM, PhD, Professor of Food Hygiene, National Veterinary School of Toulouse, France. "Thus, mycotoxins can be inhaled and should be investigated as parameters of indoor air quality, especially in homes with visible fungal contamination."

The impetus for the study was the dearth of data on the health risk from mycotoxins produced by fungi growing indoors. (Image: a microscopic view of a sporulating Aspergillus, showing numerous light spores that can be easily aerosolized and inhaled together with mycotoxins. Credit: Sylviane Bailly.)

In the study, the investigators built an experimental bench that can simulate airflow over a piece of contaminated wallpaper, controlling the speed and direction of the air. Then they analyzed the resulting bioaerosol.

"Most of the airborne toxins are likely to be located on fungal spores, but we also demonstrated that part of the toxic load was found on very small particles -- dust or tiny fragments of wallpaper, that could be easily inhaled," said Bailly..

The researchers used three fungal species in their study: Penicillium brevicompactum, Aspergillus versicolor, and Stachybotrys chartarum. These species, long studied as sources of food contaminants, also "are frequent indoor contaminants," said Bailly. He noted that they produce different mycotoxins, and their mycelia are different from one another, likely leading to differences in the quantity of mycotoxins they loft into the air. (Mycelia are the thread-like projections of fungi that seek nutrition and water from the environment.)

The findings raised two new scientific questions, said Bailly. First, "There is almost no data on toxicity of mycotoxins following inhalation," he said, noting that most research has focused on such toxins as food contaminants.

Second, the different fungal species put different quantities of mycotoxins in the air, "probably related to mycelium organization," but also possibly related to the mechanisms by which mycotoxins from different fungi become airborne -- for example via droplets of exudate versus accumulation in spores. Such knowledge could help in prioritizing those species that may be of real importance in wafting mycotoxins, he said.

Bailly noted that the push for increasingly energy efficient homes may aggravate the problem of mycotoxins indoors. Such homes "are strongly isolated from the outside to save energy," but various water-using appliances such as coffee makers "could lead to favorable conditions for fungal growth," he said.

"The presence of mycotoxins in indoors should be taken into consideration as an important parameter of air quality," Bailly concluded.

The American Society for Microbiology is the largest single life science society, composed of over 50,000 scientists and health professionals. ASM's mission is to promote and advance the microbial sciences.

How the climate can rapidly change at tipping points

A new study shows: Gradual changes in the atmospheric CO2 concentration can induce abrupt climate changes

Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research

During the last glacial period, within only a few decades the influence of atmospheric CO2 on the North Atlantic circulation resulted in temperature increases of up to 10 degrees Celsius in Greenland -- as indicated by new climate calculations from researchers at the Alfred Wegener Institute and Cardiff University. Their study is the first to confirm that there have been situations in our planet's history in which gradually rising CO2 concentrations have set off abrupt changes in ocean circulation and climate at "tipping points". These sudden changes, referred to as Dansgaard-Oeschger events, have been observed in ice cores collected in Greenland. The results of the study have just been released in the journal Nature Geoscience.

Previous glacial periods were characterised by several abrupt climate changes in the high latitudes of the Northern Hemisphere. However, the cause of these past phenomena remains unclear. In an attempt to better grasp the role of CO2 in this context, scientists from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) recently conducted a series of experiments using a coupled atmosphere-ocean-sea ice model.

First author Xu Zhang explains: "With this study, we've managed to show for the first time how gradual increases of CO2 triggered rapid warming." This temperature rise is the result of interactions between ocean currents and the atmosphere, which the scientists used the climate model to explore. According to their findings, the increased CO2 intensifies the trade winds over Central America, as the eastern Pacific is warmed more than the western Atlantic. This in turn produces increased moisture transport from the Atlantic, and with it, an increase in the salinity and density of the surface water. Finally, these changes lead to an abrupt amplification of the large-scale overturning circulation in the Atlantic. "Our simulations indicate that even small changes in the CO2 concentration suffice to change the circulation pattern, which can end in sudden temperature increases," says Zhang.

Further, the study's authors reveal that rising CO2 levels are the dominant cause of changed ocean currents during the transitions between glacial and interglacial periods. As climate researcher Gerrit Lohmann explains, "We can't say for certain whether rising CO2 levels will produce similar effects in the future, because the framework conditions today differ from those in a glacial period. That being said, we've now confirmed that there have definitely been abrupt climate changes in the Earth's past that were the result of continually rising CO2 concentrations."

Cut US commercial building energy use 29 percent with widespread controls

Programming, maintaining building controls can lower national power bill

DOE/Pacific Northwest National Laboratory

RICHLAND, Wash. - Like driving a car despite a glowing check-engine light, large buildings often chug along without maintenance being performed on the building controls designed to keep them running smoothly.

And sometimes those controls aren't used to their full potential, similar to a car at high speed in first gear. Instead of an expensive visit to the mechanic, the result for a commercial building is a high power bill.

A new report finds that if commercial buildings fully used controls nationwide, the U.S. could slash its energy consumption by the equivalent of what is currently used by 12 to 15 million Americans.

The report examines how 34 different energy efficiency measures, most of which rely on various building controls, could affect energy use in commercial buildings such as stores, offices and schools. Researchers at the Department of Energy's Pacific Northwest National Laboratory found the measures could cut annual commercial building energy use by an average of 29 percent. This would result in between 4 and 5 quadrillion British thermal units in national energy savings, which is about 4 to 5 percent of the energy consumed nationwide.

"Most large commercial buildings are already equipped with building automation systems that deploy controls to manage building energy use," said report co-author and PNNL engineer Srinivas Katipamula. "But those controls often aren't properly programmed and are allowed to deteriorate over time, creating unnecessarily large power bills.

"Our research found significant nationwide energy savings are possible if all U.S. commercial building owners periodically looked for and corrected operational problems such as air-conditioning systems running too long."

An easy, low-cost fix

The report offers the first detailed, national benefit analysis of multiple energy efficiency measures to address building operational problems. Many of these problems can be corrected with very little effort. Unlike other practices that require expensive new technologies, most of the measures evaluated improve energy efficiency by enabling already-installed equipment to work better.

Roughly 20 percent of America's total energy use goes toward powering commercial buildings. And about 15 percent of U.S. commercial buildings have building automation systems that deploy controls, such as sensors that turn on lights or heat a room only when it's occupied. As a result, helping commercial buildings better use their controls could profoundly slash America's overall energy consumption.
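
Putting the report's headline numbers together as rough arithmetic (the national total of about 97 quadrillion Btu per year is an assumption added here; the other figures come from the text above):

    # Back-of-the-envelope cross-check of the PNNL savings estimate.
    us_total_quads = 97           # assumed total U.S. energy use, quadrillion Btu per year
    commercial_share = 0.20       # "roughly 20 percent" goes to commercial buildings
    average_savings = 0.29        # average savings across measures found by PNNL

    commercial_quads = us_total_quads * commercial_share      # ~19 quads
    savings_quads = commercial_quads * average_savings        # ~5.6 quads
    print(f"~{savings_quads:.1f} quads saved, ~{savings_quads / us_total_quads:.0%} of national use")

The rough figure lands in the same neighborhood as the report's 4 to 5 quadrillion Btu estimate.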

Katipamula and his colleagues examined the potential impact of 34 individual energy efficiency measures that can improve commercial building performance, including:

Fixing broken sensors that read temperatures and other measurements

Turning off power-using devices like printers and monitors when a room isn't occupied

Dimming lights in areas with natural lighting

Because combining individual measures can increase energy savings, the researchers also estimated the impacts of packaging energy efficiency measures together. PNNL designed packages of combined measures based on the needs of three different building conditions: buildings already efficient and with little room for improvement, inefficient buildings with a lot of room for improvement, and typical buildings in the middle.

PNNL used computer models of nine prototypical commercial buildings, and extrapolated them to represent five other, similar buildings so it could evaluate energy use in a total of 14 building types. The research team used these prototypical building models with DOE's EnergyPlus building software, which calculated potential energy use given local weather and whichever energy efficiency measures were applied.

Of the individual efficiency measures studied, those with the greatest energy-saving potential nationwide were:

Lowering daytime temperature setpoints for heating, increasing them for cooling, and lowering nighttime heating setpoints: about 8 percent reduction

Reducing the minimum rate for air to flow through variable-air-volume boxes: about 7 percent reduction

Limiting heating and cooling to when a building is most likely to be occupied: about 6 percent reduction

Though the study found all commercial buildings across all climates could have an average total energy savings of 29 percent, some building types were found to have the potential to save more, such as:

Secondary schools: about 49 percent

Standalone retail stores & auto dealerships: about 41 percent

As expected, researchers found inefficient buildings have the greatest potential to save energy. After estimating how common each building condition is in the U.S., researchers found combined efficiency measure packages have the following potential national energy saving ranges:

Inefficient buildings: 30 to 59 percent

Typical buildings: 26 to 56 percent

Efficient buildings: 4 to 19 percent

The Department of Energy's Office of Energy Efficiency and Renewable Energy funded this research.

REFERENCE: N. Fernandez, S. Katipamula, W. Wang, Y. Xie, M. Zhao, C. Corgin, "Impacts of Commercial Building Controls on Energy Savings and Peak Load Reduction," PNNL report to DOE, May 2017, http://buildingretuning.pnnl.gov/publications/PNNL-25985.pdf.

Luxury-quality materials made from waste

By Laura Houseley, CNN

Updated 5:31 AM ET, Thu June 22, 2017

Recycling gets a designer makeover

(CNN)Recycling is a concept as old as trash itself. By now, we're used to seeing useful materials, such as glass and paper, reprocessed into lower-grade versions of themselves, and discarded products upcycled into entirely new designs. (Emeco's 111 Navy chair, made from 111 used Coca-Cola bottles, is a good example.)

But today we're witnessing the emergence of a new recycling trend, driven by the luxury design industry: high-quality materials made from waste. These versatile materials, substitutes for conventional woods, plastics and stone, come in sheet or tile form, ready to be cut, shaped and manipulated by architects and designers.

Perhaps because they're being developed by the very designers who are meant to use them, rather than by the manufacturing industry, they're decidedly decorative and attractive as well as strong, economical and easy to use. They bear all the attributes of the materials they might substitute and in some cases, more.

One of the companies at the forefront is Really, a Danish company that transforms used textiles into a sheet material similar to plywood. This past April, the brand revealed its debut collection, a series of benches by designer Max Lamb, at the Salone del Mobile design fair in Milan. Really's warm reception and critical success proved not only the creative potential of these new materials, but also that there is a healthy appetite for them in the design community.

How a Teen's Day at the Beach Turned Into a Climate Breakthrough

Ethan Novek's brainstorm led him to research at Yale while he was still in high school.

By Christina Nunez

PUBLISHED June 30, 2017

Transformational ideas can come from anywhere. From anyone.

Ethan Novek was digging a hole in the sand at a beach in Connecticut when he noticed something that got him thinking. The seawater that seeped into the hole was rising at the same rate as the tides offshore. What if that seepage could be harnessed for energy in wells onshore, rather than out in the trickier ocean environment?

That idea kicked off the first in a chain of experiments that eventually landed Novek at Yale University, leading a research project while he was still in high school. Starting after that summer at the beach before his freshman year, Novek ran successive experiments at his school's lab. By the time he was a sophomore, he had arrived at a simple and economical way to capture carbon dioxide from the air and convert it into useful products. He recently became a semifinalist in the $20 million NRG COSIA Carbon XPRIZE.

In conversation, Novek is almost breathless as he describes his research. There's a lot to cover. His "weird route," as he calls it, from that seaside brainstorm to carbon capture began with filing a patent for a tidal energy system, which then led him to research the energy potential of salinity gradients between freshwater and ocean water, which led him to explore parallel reactions between ammonia and carbon dioxide. Fully appreciating his rapid-fire account of the multiple threads of inquiry that led to his carbon capture idea might require an advanced education in chemistry.

But Novek is certainly not alone in his fascination with carbon capture, which offers a tantalizing but elusive prospect: If we can siphon planet-warming carbon dioxide emissions away from the smokestack before they hit the atmosphere, then maybe the imperative to phase out fossil fuels becomes a bit less dire.

Despite the world's wealth of carbon dioxide, extracting it on a commercial scale— where it can be stored underground or recycled into other products—has been a tough nut to crack. Billions of dollars have disappeared into high-profile "clean coal" projects in Canada and Mississippi (work on the latter was just suspended) without producing the hoped-for results.

"I've always been inspired by the concept of turning waste products into valuable materials," says Novek. "I love the concept of being able to bring everyone in the world to a higher standard of living without running out of the resources we have on Earth."

To him, carbon capture has less to do with extending the use of coal for electricity and more to do with industrial pollution. He points out that even if we could manage to get all our electricity from renewable sources such as wind and solar, the world would still be left with carbon dioxide emissions from steel and cement factories, for example, which aren't nearly as far along as the power sector in moving away from fossil fuels.

"How do you deal with that aspect?" he asks of those industrial sites. "That’s where CO2 capture comes into play."

As Novek read papers on salinity gradients for his tidal energy idea, he saw a recurring name: Menachem Elimelech. So he wrote to the Yale professor, expressing interest in his research and peppering him with questions via email. He got no reply.

In the meantime, Novek kept pursuing his experiments, which eventually led him to carbon dioxide. After winning awards including first place at Intel's International Science and Engineering Fair for his mechanism to turn carbon dioxide and ammonia into urea, a main component of fertilizer, he wrote again to Elimelech with an update.

This time, he got a reply. Congratulating Novek on his persistence, Elimelech wanted to know whether the teen would present his research at Yale and publish a peer-reviewed journal article on the topic.

Novek spent his last two years of high school on Yale's campus, shifting from the urea idea to his carbon capture and reuse concept, which focuses on organic solvents that release carbon dioxide from flue gas at room temperature, using 96 percent less energy than existing carbon capture processes. He is also working on an idea to use waste hydrogen sulfide from oil and gas operations to convert carbon dioxide into carbon monoxide, which is used to produce chemicals and plastics.

Working with Elimelech and graduate students, he published a peer-reviewed paper and entered the XPRIZE competition while earning his high school degree remotely.

"I have had many, many bright students come and go," says Andrew Bramante, who led Novek's Independent Science Research class at Greenwich High School in Connecticut. "Ethan, however, is a breed apart."

Bramante guesses Novek "likely spent his class time dreaming of his inventions, rather than completing the requisite class work." Novek confirms as much, saying that before he moved to New Haven, he was having trouble balancing the time commitments of high school with his Yale research.

But Novek says the early freedom to experiment in his high school lab and at home is what helped him succeed. His parents aren't science-minded, he says, so there was no one to tell him not to bother with an experiment that might be bound to fail but would give him valuable insight.

"I actually experienced grit, in a sense—to really be able to realize why something doesn't work," he says, "instead of someone telling me why something doesn't work." He also relied heavily on Web research, looking up papers in journals: "If the Internet didn't exist, I don't know where I would be right now."

Now 18, Novek has founded a company, Innovator Energy, to pursue his carbon capture and conversion techniques and has relocated to San Antonio, Texas, where he is building a prototype of his system with a team at the Southwest Research Institute. He says his technology can capture carbon dioxide at $8 per ton, well below the current market price (around $13 per ton in California, for example).
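
In simple terms, the claim amounts to a cost advantage of about $5 per ton at the cited California price. An illustrative calculation (the 100,000-ton volume is a made-up example, not a figure from the article):

    # Margin implied by the per-ton figures quoted above (illustrative only).
    capture_cost_per_ton = 8.0     # claimed cost of Novek's process
    market_price_per_ton = 13.0    # approximate California carbon price cited
    example_tons = 100_000         # hypothetical annual capture volume

    margin_per_ton = market_price_per_ton - capture_cost_per_ton
    print(f"${margin_per_ton:.2f} per ton, or ${margin_per_ton * example_tons:,.0f} on {example_tons:,} tons")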

Though recently admitted to Yale, where he plans to study chemical engineering, Novek is taking a gap year to focus on his carbon research. With that year, he says, he aims to "create the biggest impact in the shortest period of time."

What makes people dislike wind? It's not NIMBY

Christa Marshall, E&E News reporter, Greenwire: Thursday, June 29, 2017

The "not in my backyard" barrier to wind development has been overstated, and other factors play much more of a role in whether communities support the resource, according to a report from Lawrence Berkeley National Laboratory.

The report surveys three decades of research on the factors driving pockets of opposition to wind farms and influencing community "acceptance." In general, 70 to 90 percent of Americans have favorable views of wind power, but local conflicts can develop, researchers said.

"The rapid growth of wind energy in recent years has increased its footprint, bringing the issue of community acceptance to the forefront. Cooperation between wind development actors and those in the host communities is critical to successful deployment processes, and therefore understanding local attitudes and factors driving acceptance and opposition is an essential first step," the lab said in a release.

Despite frequent citations of NIMBY fights, the research does not support the idea that such objections matter much, the report said.

Instead, socioeconomic factors such as how wind alters job development in a given area, or how landowners are compensated for wind farms, play a significant role in whether opposition or support develops.

Compensation to landowners "may be correlated to acceptance, but it also can create community conflict, exacerbate inequality, and be seen as bribery," the report stated.

Also, the sound and visual impacts of wind turbines can be strongly tied to "annoyance and opposition," according to the report. Ignoring those concerns can exacerbate conflict, it said.

"Negative attitudes stemming from the visual impacts of wind turbines may not occur simply because people dislike how turbines look; people also have become accustomed to an electricity system that is essentially 'invisible' to consumers owing to centralized infrastructure typically sited far from population centers," the report said.

Other studies have correlated opposition with concerns about turbine noise and worries about health risks. Research has consistently found no health risks or impacts on sleep quality, but the perception that such risks exist can still create pushback, according to the report. Making the public more aware of the actual effects could alter perceptions, it said.

Environmental factors matter, too, although perhaps less than other ones. "The direction of the correlation remains unclear," the lab said.

Environmental factors can cut both ways, according to various studies. Concerns about climate change, for instance, can increase support, while the perception that birds could be harmed — even if untrue — can generate backlash.

Overall, there needs to be a lot more research on what drives community acceptance, since there are contradictory findings, the report said. Research over the past 30 years has also been slow to transition into practice, it said.

It's still unclear, for instance, whether placing turbines a certain distance from individuals matters, and if so, exactly what that distance should be.

What is known is that an area's demographics provide little insight into likely acceptance of wind farms.

"Demographic variables such as gender, income, and education level do little to explain variation in wind energy attitudes. Some studies have shown contradictory evidence," the report said.

Ensuring that a community feels like the wind planning process was fair, and that residents were given a say — or were at least made aware of all developments — can also make a big difference.

"In some wind development models, local citizens have been entirely removed from the planning and design of wind developments. This may lead to feelings of injustice among local residents, who perceive that government and corporate decision-making ... takes place in faraway boardrooms," the report said.

----------------------------------------------------------------

Thirty years of North American wind energy acceptance research: What have we learned?

Authors: Joseph Rand; Ben Hoen

Date Published: 06/2017

Abstract:

Thirty years of North American research on public acceptance of wind energy has produced important insights, yet knowledge gaps remain. This review synthesizes the literature, revealing the following lessons learned. (1) North American support for wind has been consistently high. (2) The NIMBY explanation for resistance to wind development is invalid. (3) Socioeconomic impacts of wind development are strongly tied to acceptance. (4) Sound and visual impacts of wind facilities are strongly tied to annoyance and opposition, and ignoring these concerns can exacerbate conflict. (5) Environmental concerns matter, though less than other factors, and these concerns can both help and hinder wind development. (6) Issues of fairness, participation, and trust during the development process influence acceptance. (7) Distance from turbines affects other explanatory variables, but alone its influence is unclear. (8) Viewing opposition as something to be overcome prevents meaningful understandings and implementation of best practices. (9) Implementation of research findings into practice has been limited. The paper also identifies areas for future research on wind acceptance. With continued research efforts and a commitment toward implementing research findings into developer and policymaker practice, conflict and perceived injustices around proposed and existing wind energy facilities might be significantly lessened.

Could a Trade Dispute with China Bring an End to U.S. Solar Boom?

Low-cost solar cells produced in China have helped power the recent surge in the U.S. solar industry. But a case now before the federal International Trade Commission could lead to tariffs that would jeopardize U.S. solar’s rapid growth.

By Marc Gunther • June 27, 2017

Cheap Chinese solar cells have powered a boom in the U.S. solar industry. They have helped drive down the cost of making electricity from sunlight by about 70 percent since 2010, leading to double-digit growth rates in rooftop and utility-scale installations, according to the industry. Last year, for the first time, solar added more generating capacity to the electricity grid than any other fuel, including natural gas. That’s welcome news to those who worry about climate change.

Now, though, the solar boom may be in jeopardy. The U.S. International Trade Commission, an independent federal agency, has begun an investigation that could lead to sweeping trade protections against the imports that would raise the costs of solar power and could bring a halt to solar’s rapid U.S. growth.

If the trade commission finds that imports caused serious harm to U.S. solar manufacturers, it will recommend trade remedies, which could potentially include tariffs on all imported solar products. President Donald Trump, a champion of U.S. manufacturing, would get the final word on any action — a prospect that has the solar industry in a tizzy.

The prospect of global tariffs “poses an existential threat to the broad solar industry and its 260,000 American jobs,” says Abigail Ross Hopper, the chief executive of the Solar Energy Industries Association, the industry’s largest trade organization. Most solar jobs in the United States are in sales and installation, not manufacturing, but tariffs could drive up the cost of solar and make it less competitive.

The trade investigation began in response to a petition filed by Suniva, a bankrupt manufacturer of solar cells and panels based in suburban Atlanta, with factories in Georgia and Michigan. Suniva, the second largest U.S. solar panel maker by volume, has been joined in the case by SolarWorld Americas, the largest U.S. solar panel manufacturer, which has a factory in Oregon.

U.S. solar manufacturers “simply cannot survive” in a market where foreign imports “have unexpectedly exploded and prices have collapsed,” Suniva said in its petition. SolarWorld Americas said it decided to join with Suniva because “massive overproduction” of Chinese solar cells and panels has “led to the near-destruction of remaining solar producers in America.”

The domestic solar firms have asked the Trump administration to impose steep tariffs on all imported solar cells, which are the devices inside solar panels that convert sunlight into electricity, and to set a floor price on solar panels containing imports. Those measures would roughly double the cost of imported panels, analysts say.

Some industry analysts say higher costs for solar will slow the industry’s growth. According to a report from analyst IHS Markit, demand for U.S. solar photovoltaics could be reduced by 60 percent over the next three years if the trade commission grants Suniva’s petition.

Hugh Bromley, an industry analyst with Bloomberg New Energy Finance, said in a note to clients that Suniva’s accusations are “riddled with holes and hypocrisies.” Still, he adds, “Those may not matter if the case makes its way to President Trump’s desk.”

As a candidate and as president, Trump has vowed to enforce U.S. trade laws as a way to strengthen the nation’s manufacturing base. Slapping tariffs on imported solar panels would benefit not only domestic solar manufacturers but traditional energy producers, including the coal industry, that compete with solar.

The International Trade Commission (ITC) has already made one statement about the Suniva petition — that, within the meaning of trade law, it is “extraordinarily complicated.” About that, no one disagrees. For starters, Suniva is majority-owned by Shunfeng International Clean Energy, a Chinese company that opposes Suniva’s petition, and SolarWorld Americas is a subsidiary of an insolvent German firm. Yet both are taking a stance against imports into the U.S.

How can that be? In Suniva’s case, the petition is being driven by SQN Capital Management, a New York-based asset manager that made $51 million in loans to Suniva and spent another $4 million on legal fees. In a letter to the China Chamber of Commerce for Import & Export of Machinery & Electronic Products, SQN offered to drop the petition if a buyer could be found for Suniva’s manufacturing equipment, which SQN says is worth $55 million. The Chinese declined to make a deal, and the issue became moot when SolarWorld Americas entered the case and the ITC decided to investigate.

This isn’t the first time that U.S. solar manufacturers have sought trade sanctions. In 2012, the Obama administration imposed modest tariffs on Chinese imports after finding that the Chinese government provided illegal export subsidies to its manufacturers. Two years later, it extended the tariffs to Taiwan. Those moves were prompted by cases brought by SolarWorld Americas.

Nevertheless, solar imports to the U.S. continued to surge — from $5.1 billion in 2012 to $8.3 billion in 2016, according to Suniva — as Chinese companies built factories in Thailand, Vietnam, and Malaysia, which were unaffected by the tariffs. Last year, the U.S. imported $520 million in panels from Thailand, up from almost nothing in 2012, and another $514 million from Vietnam, up from less than $1 million in 2012, according to Suniva.

That’s why Suniva now wants tariffs imposed globally. “Without global relief, the domestic industry will be playing ‘whack-a-mole’ against [solar cells] and modules from particular countries,” says Matthew McConkey, a lawyer for Suniva, in the petition to the ITC.

Trade experts agree that China has subsidized its giant solar manufacturers. Beijing and provincial governments provided free or low-cost loans; artificially cheap raw materials, components, and land; support for research and development; and a demand that was artificially driven by domestic regulation, according to Usha C.V. Haley and George Haley, wife-and-husband authors of a 2013 book, “Subsidies to Chinese Industry: State Capitalism, Business Strategy and Trade Policy.”

“Production in China is still heavily subsidized,” says Usha Haley, a professor of management at West Virginia University. “There is little doubt in my mind that it is going to become a monopoly producer — and then, of course, they will raise prices.”

What’s more, a solar industry dominated by a handful of Chinese companies will have little incentive to innovate, argues Stephen Ezell, a vice president at the Information Technology & Innovation Foundation, a Washington think tank. The U.S. industry, which invented solar photovoltaics and still leads the world in solar patents, simply will not have the resources it needs to invest in research.

“These industries are fundamentally about generating the next-generation product,” Ezell says. “We’re getting locked into a lower level of technological development.” In the long run, that would make it more difficult for the global solar industry to dislodge its fossil-fuel competitors.

Still, U.S. firms that complain about subsidies run the risk of being called hypocrites. Suniva, for instance, enjoyed state tax incentives for its Michigan plant, and “many other U.S. solar manufacturers have received tax, grant, and loan guarantee incentives,” says analyst Hugh Bromley. SolarCity, a unit of Tesla, is building a $900 million factory in Buffalo, New York, to make solar panels, with major subsidies.

Other solar manufacturers headquartered in the U.S., including SunPower and First Solar, have located a majority of their manufacturing offshore and so would be affected by any tariffs. In fact, SunPower has joined with the Solar Energy Industries Association to oppose the tariffs. SunPower, First Solar, and SolarCity all declined to comment on the trade issue.

Some analysts believe that the industry will be able to adjust to the tariffs. Setting a floor price for panels will lead developers to choose higher-efficiency, higher-cost panels that will enable other price reductions along the supply chain, says Roberto Rodriguez Labastida, an analyst with Navigant Research. “There will be some shake-ups and adjustments,” he says, but nothing like the meltdown being forecast by some.

The ITC will decide in September whether U.S. manufacturers have been injured. Shara Aranoff, a former chair of the commission who is now a corporate lawyer, says the nonpartisan ITC commissioners will be guided by the law and the facts. “It is one of the most independent agencies in the federal government,” she said.

If the commission finds harm, the issue moves into the political arena. Already, Daniel Kildee, a Democratic congressman from Michigan, and Rob Woodall, a Republican congressman from Georgia, have called for trade remedies. The solar industry is arguing that tariffs will kill many more jobs than they will save, and that there are better ways to protect U.S. manufacturing.

In making any decision, the Trump administration would be free to take anything into account — jobs, the impact of higher solar prices on consumers, and, at least in theory, the environment. Few would expect the environment to be high on the administration’s priority list here. But what will the president do? As with so many issues in Washington these days, that’s anybody’s guess.

Marc Gunther has reported on business and sustainability for Fortune, The Guardian, and GreenBiz. He now writes about foundations, nonprofits, and global development on his blog, Nonprofit Chronicles.

Southern Co.'s clean coal plant hits a dead end

Kristi E. Swartz, E&E News reporter

Energywire: Thursday, June 22, 2017

Southern Co.'s $7.5 billion clean coal plant in Mississippi should run as a natural gas plant, state regulators said yesterday, throwing a gut punch to the utility's hopes of recovering billions of dollars in cost overruns and paving the way for next-generation coal plants.

The Kemper County Energy Facility would be the first large coal-burning power plant in the United States to capture and store the majority of its carbon dioxide emissions. The power plant is supposed to gasify lignite coal into synthetic gas, capturing the CO2 for use in enhanced oil recovery.

The plant has been running on natural gas for years, and each of Kemper's two gasifiers has successfully produced electricity from synthetic gas. But Mississippi Power, the Southern Co. unit building the plant, has struggled to keep the plant's complex systems running nonstop. That has delayed its full startup.

The surprisingly aggressive move by the three-member panel of Mississippi regulators is a significant financial and political setback for Southern, which has promoted the idea of building a first-of-its-kind clean coal plant at scale.

Mississippi Power, for its part, could be stuck with a massive bill the state won't allow it to pass on to electricity customers. Instead, the state Public Service Commission wants the utility to plan on running on gas.

Kemper originally was supposed to be operating in 2014, and its original price tag was a little under $3 billion. Now at $7.5 billion and a projected startup date of the end of the month, Kemper is at a crossroads. Mississippi Power filed a new rate plan for Kemper at the beginning of the month, as required by the PSC, but the utility had to hold off on formally asking to collect $4.3 billion from customers because the plant is not yet operating.

It was that rate filing that triggered the commission's response, which came after a special meeting and a closed executive session. The PSC directed the utility to work with the staff attorney and other stakeholders to come up with a settlement within 45 days.

The settlement should allow Kemper to operate on natural gas only and not raise customer rates, the commission said. What's more, Mississippi Power should find a way to lower rates, especially for its residential class of customers.

"No rate increases, period," said PSC Chairman Brandon Presley. "And, none of the gasifier assets will ever be included in rate base."

The PSC is scheduled to review an order addressing its directives at a July meeting.

"It's obvious to anybody that this project has had its challenges," Presley said. "Because they made the rate filing, we felt it was time for the commission to give guidance to what we felt the public interest is."

'Other options'

If the utility and PSC staff cannot negotiate a settlement within 45 days, "we'll resort to other options," Presley said.

"We look forward to reviewing the order," Mississippi Power replied after the PSC meeting.

The ability to run nonstop as a clean coal plant is key for the utility to recoup costs from customers under its current regulatory certificate. As a result, the project's operation date has been moved several times, including a series of one-month delays that started at the end of last year.

"I applaud the Mississippi Public Service Commissioners with their decision today to pull the plug on big bets gone bad, which has resulted in the most expensive power plant built in the United States that still does not run as advertised," said Louie Miller, state director for the Sierra Club's Mississippi chapter.

Kemper has operated for about 200 days using lignite, according to Mississippi Power. But the company said it may need to redesign a critical part for its next-generation coal power plant, and that may take 18 months to two years to complete.

"If the project isn't working at this point in time, it's not surprising that the commission is concerned about ratepayers paying for it," said Paul Patterson, a utility analyst with Glenrock Associates LLC.

Patterson said he doesn't think it means the utility cannot recover all of the costs in question. But it has to work out those details with the PSC staff and other stakeholders.

"The real question can be whether they will reach a settlement or whether there will be some litigation," he said.

Not being able to recover the bulk of the outstanding costs could cripple Mississippi Power financially. Regulators agreed to $159 million in emergency rate relief in August 2015 after the utility's CEO testified that the company was on the brink of bankruptcy and would run out of cash by the end of the year.

Kemper was running as a natural gas plant at that time. Costs continued to rise, and an earlier settlement with regulators capped what the utility eventually could ask to recover from its customers.

That agreement, in part, has led to Southern's shareholders absorbing roughly $3 billion from Kemper at this point. Mississippi Power's current liabilities exceeded current assets by about $1.2 billion as of the end of March, according to a recent Securities and Exchange Commission filing.

"Southern intends to provide Mississippi Power with loans and/or equity contributions sufficient to fund the remaining indebtedness scheduled to mature and other cash needs over the next 12 months" if sufficient funds are not available from external sources, the filing said.

A campaign to eliminate plastic straws is sucking in thousands of converts

Discarded plastic straws pose an environmental hazard. There is a growing movement to eliminate them altogether. (Helen Lockhart/Two Oceans Aquarium)

By Darryl Fears June 24

It started so innocently. A kid ordered a soda in a restaurant.

“It came with a plastic straw in it,” Milo Cress recalled. He glared at the straw for a while. “It seemed like such a waste.”

Not only did Cress yank the plastic from his drink, but he also launched a campaign, “Be Straw Free,” targeting all straws as needless pollution. He knocked on the doors of restaurants in Burlington, Vt., where he lived at the time, and asked managers not to offer straws unless patrons asked. He was 9 years old.

Today Cress, 15, is one of the faces of a growing movement to eliminate plastic straws. They have been found wedged in the nose of a sea turtle, littering the stomachs of countless dead marine animals and scattered across beaches with tons of other plastics.

Why single out pollution as small and slim as a drinking straw?

Straws are among the most common plastic items volunteers clean from beaches, along with bottles, bags and cups, conservationists say. Americans use half a billion straws every day, at least according to an estimate by Be Straw Free, based on information from straw manufacturers. That many straws could wrap around the Earth 2½ times.
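
As a rough back-of-the-envelope check of that "2½ times around the Earth" figure, here is a minimal sketch; the average straw length of about 20 centimeters is an assumption for illustration, not a number from the article:

    # Rough check of the "wrap around the Earth 2.5 times" claim.
    # Assumption (not from the article): an average straw is ~0.20 m long.
    straws_per_day = 500_000_000          # Be Straw Free estimate cited above
    straw_length_m = 0.20                 # assumed average straw length
    earth_circumference_m = 40_075_000    # equatorial circumference of the Earth
    wraps = straws_per_day * straw_length_m / earth_circumference_m
    print(round(wraps, 1))                # ~2.5 wraps around the Earth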

The slightest wind lifts plastic straws from dinner tables, picnic blankets and trash dumps, depositing them far and wide, including in rivers and oceans, where animals often mistake them for food.

And they are ubiquitous. Nearly every chain restaurant and coffee shop offers straws. They’re in just about every movie theater and sit-down restaurant. Theme parks and corner stores and ice cream shops and school cafeterias freely hand them out.

But they are starting to disappear because of the awareness campaign Cress and dozens of conservation groups are waging. Walt Disney World’s Animal Kingdom bans them, as do the food concession areas of Smithsonian Institution museums.

Keith Christman, a managing director for plastics markets at the American Chemistry Council, which promotes plastics manufacturers and fights attempts to ban plastic, said in a National Geographic article two months ago that the group would do the same for attempts to eliminate plastic straws.

But a spokeswoman for the council said “we won’t be able to offer comment” or say whether the group backs Christman’s claim.

The movement was growing at a slow, steady pace when Cress joined it six years ago, but it exploded after a YouTube video of a sea turtle with a straw stuck in its nose went viral in 2015. The cringe-inducing effort to pull the plastic out of a bloody nostril outraged viewers — 11.8 million so far.

Cress has launched a website on the issue, partnered with several organizations that support the cause and testified against straws in the Vermont legislature. Colorado Gov. John Hickenlooper (D) cited Cress’s activism in a 2013 proclamation that made July 11 a straw-free day in the state.

Manhattan Beach outside Los Angeles banned all disposable plastics, including straws. Berkeley, Calif., is considering a ban. Restaurants in San Diego; Huntington Beach, Calif.; Asbury Park, N.J.; New York; Miami; Bradenton, Fla.; London; and British Columbia have pledged to ban straws or withhold them until patrons ask for them.

The Plastic Pollution Coalition estimates that 1,800 “restaurants, organizations, institutions and schools worldwide have gotten rid of plastic straws or implemented a serve-straws-upon-request policy,” said Jackie Nunez, founder of a group called the Last Plastic Straw.

More than 20 such restaurants near Wrightsville Beach, N.C., signed up last year to be certified by a coalition of groups as establishments that won’t serve straws unless they’re requested.

Ginger Taylor, a volunteer who cleans trash from the five-mile beach, said the campaign is working, at least anecdotally.

“I’ve been picking up straws on Monday morning on that same stretch of beach for five years,” she said. Four years ago, she picked up 248 straws in about two weeks. The next two years, she collected about 500. But the number fell to 158 after the awareness campaign started last year.

Diana Lofflin, founder of a group called Straw Free, said the turtle video inspired her year-old organization. Her volunteers persuaded California’s Joshua Tree Music Festival to go straw-free in May. They also knock on the doors of Orange County, Calif., homeowners who grow bamboo to ask whether they can harvest a little and make reusable straws from the plant. Like several other groups, Straw Free sells reusable bamboo straws online, theirs in packs of 10 for $1.50.

Xanterra Parks & Resorts, a concessions company that partners with the National Park Service to provide food and lodging at Rocky Mountain National Park, the Grand Canyon and other national parks, offers straws at dispensers but posts fliers asking patrons not to use them.

“Humans didn’t really evolve around straws. It’s not like we have to consume fluids with this appendage. What really, what is this?” said Catherine Greener, vice president of sustainability for the company.

The prevailing notion says disposable straws were invented in the late 19th century by Marvin Stone, a D.C. man who didn’t like how the traditional ryegrass straw people used for drinking would disintegrate and leave gritty residue in his mint juleps. Stone wrapped strips of paper around a pencil, glued the strips together and test-marketed the contraption, and in 1888, the disposable straw was born, according to the Smithsonian’s Lemelson Center for the Study of Invention and Innovation.

The new paper straw was limited mostly to use in hospitals, which used the innovation to avoid spreading disease. Usage widened during the polio epidemic that began in 1900 as people avoided putting their mouths on others’ drinking glasses. Finally in the 1960s, restaurants offered a new invention: a disposable plastic straw.

It’s a convenience people seem to use arbitrarily. Millions drink soda with a straw, but hardly any suck beer through one. Hot-coffee drinkers gulp directly from cups but stick straws in iced coffee. Bar hoppers drink highballs from a glass, but mixed cocktails come with a straw.

“There are plenty of times when straws just aren’t necessary,” said Aaron Pastor, a restaurant consultant and one of dozens of vendors who sell stainless steel, bamboo and other reusable straws online.

“I’ve sold thousands of [reusable] straws,” Pastor said, but it’s not a booming business. “This isn’t paying my mortgage.”

Pastor said chastising plastic straw users isn’t his style. “If your goal isn’t to preach and come across as ‘I’m better than you,’ that’s best. I just say they’re wasteful, they end up in oceans and, hey, do you really need one.”

At the Smithsonian’s National Museum of Natural History in Washington, Melissa and Brian Charon said no. The parents, visiting from Boston with small children who grabbed at food spread on a table, shrugged when told the museum doesn’t offer straws.

“It’s fine with me. I don’t really miss it,” Melissa said. Then Brian added: “I look in the drawer in our kitchen every day and say, ‘Why do we have so many straws?’ ”

At Xanterra’s national parks concessions, “We want people to think about this throwaway society, especially in these beautiful places,” Greener said. “They can take to the air. It’s easy for them to get blown around.”

The anti-straw message is also getting blown around. Greener was looking for composting tips a few years ago when she came across a profile of Cress, who had partnered with a recycling center in Colorado called Eco-Cycle.

Greener wanted to talk to the kid, who by then was powering the anti-straw movement in Colorado, where his family had moved. Based on their conversation, Greener decided to promote straw awareness at Xanterra’s concessions.

“He’s obviously a gifted teen. He’s probably running for Congress. He was very inspirational and innovative,” Greener said. “All I wanted to do at his age was get my driver’s license. I look at kids like that, and it makes me very hopeful.”

Cress isn’t running for office — yet. He said he’s enjoying a six-year passion that has taken him to Australia, Portugal, Germany, France, Boston, Washington and many high schools all over to deliver speeches.

“My favorite part about it has been getting to talk to other kids and listening to their ideas,” Cress said. “It’s really cool, and I think it’s really empowering. I certainly feel like I’m listened to and valued in a larger community, and I really appreciate that.”

California to list herbicide as cancer-causing; Monsanto vows fight

By Karl Plume

Glyphosate, an herbicide and the active ingredient in Monsanto Co's popular Roundup weed killer, will be added to California's list of chemicals known to cause cancer effective July 7, the state's Office of Environmental Health Hazard Assessment (OEHHA) said on Monday.

Monsanto vowed to continue its legal fight against the designation, required under a state law known as Proposition 65, and called the decision "unwarranted on the basis of science and the law."

The listing is the latest legal setback for the seeds and chemicals company, which has faced increasing litigation over glyphosate since the World Health Organization's International Agency for Research on Cancer said that it is "probably carcinogenic" in a controversial ruling in 2015.

Earlier this month, Reuters reported that the scientist leading the IARC’s review knew of fresh data showing no link between glyphosate and cancer. But he never mentioned it, and the agency did not take the information into account because it had yet to be published in a scientific journal. The IARC classed glyphosate as a “probable carcinogen,” the only major health regulator to do so.

Dicamba, a weed killer designed for use with Monsanto's next generation of biotech crops, is under scrutiny in Arkansas after the state's plant board voted last week to ban the chemical.

OEHHA said the designation of glyphosate under Proposition 65 will proceed following an unsuccessful attempt by Monsanto to block the listing in trial court and after requests for stay were denied by a state appellate court and the California Supreme Court.

Monsanto's appeal of the trial court's ruling is pending.

"This is not the final step in the process, and it has no bearing on the merits of the case. We will continue to aggressively challenge this improper decision," Scott Partridge, Monsanto's vice president of global strategy, said.

Listing glyphosate as a known carcinogen under California's Proposition 65 would require companies selling the chemical in the state to add warning labels to packaging. Warnings would also be required if glyphosate is being sprayed at levels deemed unsafe by regulators.

Users of the chemical include landscapers, golf courses, orchards, vineyards and farms.

Monsanto and other glyphosate producers would have roughly a year from the listing date to re-label products or remove them from store shelves if further legal challenges are lost.

Monsanto has not calculated the cost of any re-labeling effort and does not break out glyphosate sales data by state, Partridge said.

Environmental groups cheered OEHHA's move to list the chemical.

"California's decision makes it the national leader in protecting people from cancer-causing pesticides," said Nathan Donley, a senior scientist at the Center for Biological Diversity.

Ozone hole recovery threatened by rise of paint stripper chemical

The restoration of the ozone hole, which blocks harmful radiation, will be delayed by decades if fast-rising emissions of dichloromethane are not curbed

Dichloromethane, found in paint-stripping chemicals, has a relatively short lifespan, so action to cut its emissions would have rapid benefits.

Tuesday 27 June 2017 11.00 EDT, Last modified on Tuesday 27 June 2017 20.05 EDT

The restoration of the globe’s protective shield of ozone will be delayed by decades if fast-rising emissions of a chemical used in paint stripper are not curbed, new research has revealed.

Atmospheric levels of the chemical have doubled in the last decade and its use is not restricted by the Montreal protocol that successfully outlawed the CFCs mainly responsible for the ozone hole. The ozone-destroying chemical is called dichloromethane and is also used as an industrial solvent, an aerosol spray propellant and a blowing agent for polyurethane foams. Little is known about where it is leaking from or why emissions have risen so rapidly.

The loss of ozone was discovered in the 1980s and is greatest over Antarctica. But Ryan Hossaini, of Lancaster University in the UK, who led the new work, said: “It is important to remember that ozone depletion is a global phenomenon, and that while the peak depletion occurred over a decade ago, it is a persistent environmental problem and the track to recovery is expected to be a long and bumpy one.”

“Ozone shields us from harmful levels of UV radiation that would otherwise be detrimental to human, animal and plant health,” he said.

The new research, published in the journal Nature Communications, analysed the level of dichloromethane in the atmosphere and found it rose by 8% a year between 2004 and 2014. The scientists then used sophisticated computer models to find that, if this continues, the recovery of the ozone layer would be delayed by 30 years, until about 2090.
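
As a quick consistency check of the two figures reported here (the doubling over the last decade and the 8 percent annual rise), compounding 8 percent growth over ten years does indeed roughly double the concentration. This is arithmetic on the quoted numbers only, not additional data:

    # Consistency check: does 8% annual growth over a decade roughly double levels?
    annual_growth = 0.08
    years = 10
    print((1 + annual_growth) ** years)   # ~2.16, i.e. roughly a doubling over a decade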

The chemical was not included in the 1987 Montreal protocol because it breaks down relatively quickly in the atmosphere, usually within six months, and had not therefore been expected to build up. In contrast, CFCs persist for decades or even centuries.

But the short lifespan of dichloromethane does mean that action to cut its emissions would have rapid benefits. “If policies were put in place to limit its production, then this gas could be flushed out of the atmosphere relatively quickly,” said Hossaini.

If the dichloromethane in the atmosphere were held at today’s level, the recovery of the ozone layer would be delayed by only five years, the scientists found. There was a surge in emissions in the period 2012-14, and if growth continued at that very high rate, the ozone recovery would be postponed indefinitely, but Hossaini said this extreme scenario is unlikely: “Our results still show the ozone hole will recover.”

Grant Allen, an atmospheric physicist at the University of Manchester, said: “Whatever the source of this gas, we must act now to stop its release to the atmosphere in order to prevent undoing over 30 years of exemplary science and policy work which has undoubtedly saved many lives.”

Jonathan Shanklin, one of the scientists at the British Antarctic Survey (BAS) who discovered the ozone hole in 1985, said: “The Montreal protocol has proved very effective at reducing the emissions of substances that can harm the ozone layer. I am sure that the warning made in this paper will be heeded and that dichloromethane will be brought within the protocol in order to prevent further damage to the ozone layer.”

There are other short-lived gases containing the chlorine that destroys ozone, but few measurements have been taken of their levels in the atmosphere. “Unfortunately there is no long-term record of these, only sporadic data, but these do indicate they are a potentially significant source of chlorine in the atmosphere,” said Hossaini, adding that further research on this was needed.

Anna Jones, a scientist at BAS, said: “The new results underline the critical importance of long-term observations of ozone-depleting gases and expanding the Montreal protocol to mitigate new threats to the ozone layer.”

Overall the Montreal protocol is seen as very successful in cutting ozone losses, with estimates indicating that without the protocol the Antarctic ozone hole would have been 40% larger by 2013. Scientists discovered four “rogue” CFCs in 2014 that were increasing in concentration in the atmosphere and contributing to ozone-destruction.

Researchers Found They Could Hack Entire Wind Farms

Over two years, University of Tulsa researchers performed penetration tests on five different wind farms. (Not this one.) Ross Mantle for WIRED

On a sunny day last summer, in the middle of a vast cornfield somewhere in the large, windy middle of America, two researchers from the University of Tulsa stepped into an oven-hot, elevator-sized chamber within the base of a 300-foot-tall wind turbine. They’d picked the simple pin-and-tumbler lock on the turbine’s metal door in less than a minute and opened the unsecured server closet inside.

Jason Staggs, a tall 28-year-old Oklahoman, quickly unplugged a network cable and inserted it into a Raspberry Pi minicomputer, the size of a deck of cards, that had been fitted with a Wi-Fi antenna. He switched on the Pi and attached another Ethernet cable from the minicomputer into an open port on a programmable automation controller, a microwave-sized computer that controlled the turbine. The two men then closed the door behind them and walked back to the white van they’d driven down a gravel path that ran through the field.

Staggs sat in the front seat and opened a MacBook Pro while the researchers looked up at the towering machine. Like the dozens of other turbines in the field, its white blades—each longer than a wing of a Boeing 747—turned hypnotically. Staggs typed into his laptop's command line and soon saw a list of IP addresses representing every networked turbine in the field. A few minutes later he typed another command, and the hackers watched as the single turbine above them emitted a muted screech like the brakes of an aging 18-wheel truck, slowed, and came to a stop.

For the past two years, Staggs and his fellow researchers at the University of Tulsa have been systematically hacking wind farms around the United States to demonstrate the little-known digital vulnerabilities of an increasingly popular form of American energy production. With the permission of wind energy companies, they’ve performed penetration tests on five different wind farms across the central US and West Coast that use the hardware of five wind power equipment manufacturers.

As part of the agreement that legally allowed them to access those facilities, the researchers say they can't name the wind farms’ owners, the locations they tested, or the companies that built the turbines and other hardware they attacked. But in interviews with WIRED and a presentation they plan to give at the Black Hat security conference next month, they're detailing the security vulnerabilities they uncovered. By physically accessing the internals of the turbines themselves—which often stood virtually unprotected in the middle of open fields—and planting $45 in commodity computing equipment, the researchers carried out an extended menu of attacks on not only the individual wind turbine they'd broken into but all of the others connected to it on the same wind farm's network. The results included paralyzing turbines, suddenly triggering their brakes to potentially damage them, and even relaying false feedback to their operators to prevent the sabotage from being detected.

“When we started poking around, we were shocked. A simple tumbler lock was all that stood between us and the wind farm control network,” says Staggs. “Once you have access to one of the turbines, it’s game over.”

In their attacks, the Tulsa researchers exploited an overarching security issue in the wind farms they infiltrated: While the turbines and control systems had limited or no connections to the internet, they also lacked almost any authentication or segmentation that would prevent a computer within the same network from sending valid commands. Two of the five facilities encrypted the connections from the operators’ computers to the wind turbines, making those communications far harder to spoof. But in every case the researchers could nonetheless send commands to the entire network of turbines by planting their radio-controlled Raspberry Pi in the server closet of just one of the machines in the field.

“They don’t take into consideration that someone can just pick a lock and plug in a Raspberry Pi,” Staggs says. The turbines they broke into were protected only by easily picked standard five-pin locks, or by padlocks that took seconds to remove with a pair of bolt cutters. And while the Tulsa researchers tested connecting to their minicomputers via Wi-Fi from as far as fifty feet away, they note they could have just as easily used another radio protocol, like GSM, to launch attacks from hundreds or thousands of miles away.

The researchers developed three proof-of-concept attacks to demonstrate how hackers could exploit the vulnerable wind farms they infiltrated. One tool they built, called Windshark, simply sent commands to other turbines on the network, disabling them or repeatedly slamming on their brakes to cause wear and damage. Windworm, another piece of malicious software, went further: It used telnet and FTP to spread from one programmable automation controller to another, until it infected all of a wind farm's computers. A third attack tool, called Windpoison, used a trick called ARP cache poisoning, which exploits how control systems locate and identify components on a network. Windpoison spoofed those addresses to insert itself as a man-in-the-middle in the operators’ communications with the turbines. That would allow hackers to falsify the signals being sent back from the turbines, hiding disruptive attacks from the operators’ systems.

While the Tulsa researchers shut off only a single turbine at a time in their tests, they point out that their methods could easily paralyze an entire wind farm, cutting off as much as hundreds of megawatts of power.

Wind farms produce less energy than their coal or nuclear equivalents, and grid operators expect them to be less reliable, given their dependence on the real-time ebb and flow of wind currents. That means even taking out a full farm may not dramatically impact the grid overall, says Ben Miller, a researcher at the critical-infrastructure security startup Dragos Inc. and a former engineer at the North American Electric Reliability Council.

More concerning than attacks to stop turbines, Miller says, are those intended to damage them. The equipment is designed for lightness and efficiency, and is often fragile as a result. That, along with the high costs of going even temporarily offline, make the vulnerabilities potentially devastating for a wind farm owner. "It would all probably be far more impactful to the operator of the wind farm than it would be to the grid," Miller says.

Staggs argues that this potential to cause costly downtime for wind farms leaves their owners open to extortion or other kinds of profit-seeking sabotage. "This is just the tip of the iceberg," he says. "Imagine a ransomware scenario."

While the Tulsa researchers were careful not to name any of the manufacturers of the equipment used in the wind farms they tested, WIRED reached out to three major wind farm suppliers for comment on their findings: GE, Siemens Gamesa, and Vestas. GE and Siemens Gamesa didn't respond. But Vestas spokesperson Anders Riis wrote in an email that "Vestas takes cyber security very seriously and continues to work with customers and grid operators to build products and offerings to improve security levels in response to the shifting cyber security landscape and evolving threats." He added that it offers security measures that include "physical breach and intrusion detection and alert; alarm solutions at turbine, plant, and substation level to notify operators of a physical intrusion; and mitigation and control systems that quarantine and limit any malicious impact to the plant level, preventing further impact to the grid or other wind plants.”

Researchers have demonstrated the vulnerabilities of wind turbines before, albeit on a far smaller scale. In 2015, the US Industrial Control System Computer Emergency Response Team issued a warning about hundreds of wind turbines, known as the XZERES 442SR, whose controls were openly accessible via the internet. But that was a far smaller turbine aimed at residential and small business users, with blades roughly 12 feet in length—not the massive, multimillion-dollar versions the Tulsa researchers tested.

The Tulsa team also didn't attempt to hack its targets over the internet. But Staggs speculates it might be possible to remotely compromise them too—perhaps by infecting the operators' network, or the laptop of one of the technicians who services the turbines. But other hypothetical vulnerabilities pale next to the very real distributed, unprotected nature of turbines themselves, says David Ferlemann, another member of the Tulsa team. "A nuclear power plant is hard to break into," he points out. "Turbines are more distributed. It’s much easier to access one node and compromise the entire fleet."

The researchers suggest that, ultimately, wind farm operators need to build authentication into the internal communications of their control systems—not just isolate them from the internet. And in the meantime, a few stronger locks, fences, and security cameras on the doors of the turbines themselves would make physical attacks far more difficult.
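
To illustrate what command authentication could look like, here is a minimal sketch using a shared-secret HMAC, so that a device merely plugged into the control network cannot issue commands that verify. The command format and key handling below are hypothetical examples, not drawn from the researchers or any turbine vendor, and a real deployment would also need per-device keys and replay protection (for example, message counters or nonces):

    # Minimal sketch: authenticate control messages with an HMAC tag.
    import hmac, hashlib, os

    SECRET_KEY = os.urandom(32)   # in practice, provisioned securely per site/device

    def sign_command(command: bytes) -> bytes:
        # Compute a tag from the shared secret and the command bytes.
        return hmac.new(SECRET_KEY, command, hashlib.sha256).digest()

    def verify_command(command: bytes, tag: bytes) -> bool:
        # Recompute the tag and compare in constant time.
        expected = hmac.new(SECRET_KEY, command, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

    cmd = b"TURBINE_07 SET brake=released"      # hypothetical command format
    tag = sign_command(cmd)
    print(verify_command(cmd, tag))              # True: genuine command
    print(verify_command(cmd, b"\x00" * 32))     # False: forged or tampered command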

For now, wind farms produce less than 5 percent of America's energy, Staggs says. But as wind power grows as a fraction of US electric generation, he hopes their work can help secure that power source before a large fraction of Americans comes to depend on it.

"If you're an attacker bent on trying to influence whether the lights are on or not," says Staggs, "that becomes a more and more attractive target for you to go after."

Solving a sweet problem for renewable biofuels and chemicals

Arizona State University

Whether or not society shakes its addiction to oil and gasoline will depend on a number of profound environmental, geopolitical and societal factors.

But with current oil prices hovering around $50 a barrel, it won't likely be anytime soon.

Despite several major national research initiatives, no one has been able to come up with the breakthrough renewable biofuel technology that would lead to a cheaper alternative to gasoline.

That research challenge led ASU scientists, Reed Cartwright and Xuan Wang, to enter the fray, teaming up to try to break through the innovation bottleneck for the renewable bioproduction of fuels and chemicals.

"My lab has been very interested in converting biomass such as agricultural wastes and even carbon dioxide into useful and renewable bio-based products," said Wang, an assistant professor in the School of Life Sciences. "As a microbiologist, I'm interested in manipulating microbes as biocatalysts to do a better job."

To do so, they've looked into a new approach -- harnessing the trial-and-error power of evolution to coax nature into revealing the answer.

By growing bacteria over generations under specially controlled conditions in fermentation tanks, they have test-tube evolved bacteria to better ferment sugars derived from biomass -- a rich, potential renewable energy source for the production of biofuels and chemicals.

Their results appeared recently in the online edition of PNAS (doi: 10.1073/pnas.1700345114).

The research team includes postdoctoral scholar Christian Sievert, Lizbeth Nieves, Larry Panyon, Taylor Loeffler and Chandler Morris, and was led by Reed Cartwright and Xuan Wang, in a collaboration between the ASU's School of Life Sciences and the Biodesign Institute.

A sweet problem

The appeal of plants is obvious: just add a little carbon dioxide, water and plentiful sunshine, and presto! Society has a rich new source of renewable carbons to use.

Corn ethanol (using starch from corn for alcohol production primarily in the U.S.) has been one major biofuel avenue, and sugarcane another alternative (abundant in Brazil), but there is a big drawback. Turning the sugar-rich kernels of corn or sugarcane into ethanol competes with the food supply.

So scientists over the past few decades have migrated to research on conversion of non-food based plant materials into biofuels and chemicals. These so-called lignocellulosic biomasses, like tall switchgrasses and the inedible parts of corn and sugarcane (stovers, husks, bagasses, etc.) are rich in xylose, a five-carbon, energy-rich sugar relative of glucose.

Lignocellulosic biomass has an abundance of glucose and xylose, but industrial E. coli strains can't make full use of the xylose because when glucose is available, it turns off the use of xylose. And so, to date, it has been inefficient and costly to fully harvest and convert the xylose to biofuels.

Benchtop evolution

Wang and Cartwright wanted to squeeze out more energy from xylose sugars. To do so, they challenged E. coli bacteria that thrive comfortably on glucose, switching out the growth medium so the cells had to grow solely on xylose.

The bacteria would be forced to adapt to the new food supply or lose the growth competition.

They started with a single colony of bacteria that were genetically identical and ran three separate evolution experiments with xylose. At first, the bacteria grew very slowly. But remarkably, in no more than 150 generations, the bacteria adapted and eventually learned to thrive in the xylose broth.

Next, they isolated the DNA from the bacteria and used next-generation DNA sequencing technology to examine the changes within the bacteria genomes. When they read out the DNA data, they could identify the telltale signs of evolution in action, mutations.

Nature finds a way

The bacteria, when challenged, randomly mutated their DNA until it could adapt to the new conditions. They held on to the fittest mutations over generations until they became fixed beneficial mutations.

And in each case, when challenged with xylose, the bacteria could grow well. Their next task was to find out what these beneficial mutations were and how they worked. To grow better on xylose, the three E. coli lines had "discovered" different sets of mutations in the same genes. The single mutations the research team identified could all enhance xylose fermentation by changing bacterial sugar metabolism.

"This suggests that there are potentially multiple evolutionary solutions for the same problem, and a bacterium's genetic background may predetermine its evolutionary trajectories," said Cartwright, a researcher at ASU's Biodesign Institute and assistant professor in the School of Life Sciences.

The most interesting mutation happened in a regulatory protein called XylR whose normal function is to control xylose utilization. Just two amino acid switches in the XylR could enhance xylose utilization and release the glucose repression, even in the non-mutated original hosts.

Through some clever genetic tricks, when the XylR mutant was placed back in a normal "wild-type" strain or an industrial E. coli biocatalyst, it could also now grow on xylose and glucose, vastly improving the yield. Wang's team saw up to a 50 percent increase in the product after 4 days of fermentation.

Wang and Cartwright's work has now significantly boosted the potential of industrial E. coli to be used for biofuel production from lignocellulosic materials. In addition, the same genetic approach could be applied to other E. coli strains to make different products.

Arizona Technology Enterprises (AzTE) is filing a non-provisional patent for their discovery. Wang hopes they can partner with industry to scale up their technology and see if this invention will increase economic viability for bioproduction.

"With these new results, I believe we've solved one big, persistent bottleneck in this field," concluded Wang.

Cellulosic biofuels can benefit the environment if managed correctly

Michigan State University

Could cellulosic biofuels - or liquid energy derived from grasses and wood - become a green fuel of the future, providing an environmentally sustainable way of meeting energy needs? In Science, researchers at the U.S. Department of Energy-funded Great Lakes Bioenergy Research Center say yes, but with a few important caveats.

"The climate benefit of cellulosic biofuels is actually much greater than was originally thought," said Phil Robertson, University Distinguished Professor of Ecosystem Science at Michigan State University and lead author on the study. "But that benefit depends crucially on several different factors, all of which we need to understand to get right."

Although not yet a market force, cellulosic biofuels are routinely factored into future climate mitigation scenarios because of their potential to both displace petroleum use and mitigate greenhouse gas emissions. Those benefits, however, are complicated by the need for vast amounts of land to produce cellulosic biofuels on a large scale.

"The sustainability question is largely about the impact of using millions of acres of U.S. land to grow biofuel crops," Robertson said. "Can we do that without threatening global food security, diminishing biodiversity, or reducing groundwater supplies? How much more fertilizer would we use? What are the tradeoffs for real climate benefit, and are there synergies we can promote?"

Drawing from ten years of empirical research, Robertson and GLBRC colleagues from MSU, the University of Wisconsin and the University of Maryland identify several emerging principles for managing the complex environmental tradeoffs of cellulosic biofuel.

First, the researchers show how growing native perennial species on marginal lands -- land not used for food production because of low fertility or other reasons -- avoids competition with food security and provides the greatest potential for climate mitigation and biodiversity benefits.

Second, crop choice is key. Native perennial species offer superior environmental outcomes to annual crops, but no single crop appears to be ideal for all locations. In fact, in some cases mixed species crops provide superior benefits. Third, nitrogen fertilizer use should be avoided or minimized because of its global warming and other environmental impacts.

According to the researchers, these principles (as well as four more outlined in the paper) are enough to begin guiding sound policy decisions for producing sustainable biofuels. Looking forward, however, the team calls for more research on designing landscapes to provide the optimal suite of energy, climate and environmental benefits. They say that understanding how best to integrate benefits and tradeoffs will be key to the future success of cellulosic biofuels.

"With biofuels, the stakes are high," Robertson said. "But the returns are also high, and if we take key principles into account we can begin shaping the policies and practices that could help make cellulosic biofuels a triple win for the economy, the climate and for environmental sustainability in general."

Greenland now a major driver of rising seas: study

By Marlowe HOOD, Paris (AFP) June 26, 2017

Ocean levels rose 50 percent faster in 2014 than in 1993, with meltwater from the Greenland ice sheet now supplying 25 percent of total sea level increase compared with just five percent 20 years earlier, researchers reported Monday.

The findings add to growing concern among scientists that the global watermark is climbing more rapidly than forecast only a few years ago, with potentially devastating consequences.

Hundreds of millions of people around the world live in low-lying deltas that are vulnerable, especially when rising seas are combined with land sinking due to depleted water tables, or a lack of ground-forming silt held back by dams.

Major coastal cities are also threatened, while some small island states are already laying plans for the day their drowning nations will no longer be livable.

"This result is important because the Intergovernmental Panel on Climate Change (IPCC)" -- the UN science advisory body -- "makes a very conservative projection of total sea level rise by the end of the century," at 60 to 90 centimetres (24 to 35 inches), said Peter Wadhams, a professor of ocean physics at the University of Oxford who did not take part in the research.

That estimate, he added, assumes that the rate at which ocean levels rise will remain constant.

"Yet there is convincing evidence -- including accelerating losses of mass from Greenland and Antarctica -- that the rate is actually increasing, and increasing exponentially."

Greenland alone contains enough frozen water to lift oceans by about seven metres (23 feet), though experts disagree on the global warming threshold for irreversible melting, and how long that would take once set in motion.

"Most scientists now expect total rise to be well over a metre by the end of the century," Wadhams said.

The new study, published in Nature Climate Change, reconciles for the first time two distinct measurements of sea level rise.

The first looked one-by-one at three contributions: ocean expansion due to warming, changes in the amount of water stored on land, and loss of land-based ice from glaciers and ice sheets in Greenland and Antarctica.

- 'A major warning' -

The second was from satellite altimetry, which gauges heights on the Earth's surface from space.

The technique measures the time taken by a radar pulse to travel from a satellite antenna to the surface, and then back to a satellite receiver.

Up to now, altimetry data had shown little acceleration in sea level rise over the last two decades, even though other measurements left little doubt that oceans were rising at a quickening pace.

"We corrected for a small but significant bias in the first decade of the satellite record," co-author Xuebin Zhang, a professor at Qingdao National Laboratory of Marine Science and Technology in China's Shandong Province, told AFP.

Overall, the pace of global average sea level rise went up from about 2.2 millimetres a year in 1993, to 3.3 millimetres a year two decades later.
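
As a quick sanity check on the figures above, the two rates quoted are enough to reproduce the "50 percent faster" figure in the opening paragraph. The short Python sketch below simply restates that arithmetic; the only inputs are the two rates reported in the study.

rate_1993 = 2.2  # mm per year, pace of sea level rise in 1993
rate_2014 = 3.3  # mm per year, pace two decades later

relative_increase = (rate_2014 - rate_1993) / rate_1993
print(f"Relative increase in the rate of rise: {relative_increase:.0%}")  # -> 50%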

In the early 1990s, they found, thermal expansion accounted for fully half of the added millimetres. Two decades later, that figure was only 30 percent.

Andrew Shepherd, director of the Centre for Polar Observation and Modelling at the University of Leeds in England, urged caution in interpreting the results.

"Even with decades of measurements, it is hard to be sure whether there has been a steady acceleration in the rate of global sea level rise during the satellite era because the change is so small," he said.

Disentangling single sources -- such as the massive chunk of ice atop Greenland -- is even harder.

But other researchers said the study should sound an alarm.

"This is a major warning about the dangers of a sea level rise that will continue for many centuries, even after global warming is stopped," said Brian Hoskins, chair of the Grantham Institute at Imperial College London.

Scientists throw light on mysterious ice age temperature jumps

Cardiff, UK (SPX) Jun 22, 2017

Scientists believe they have discovered the reason behind mysterious changes to the climate that saw temperatures fluctuate by up to 15C within just a few decades during the ice age periods. In a new study, the researchers show that rising levels of CO2 could have reached a tipping point during these glacial periods, triggering a series of chain events that caused temperatures to rise abruptly.

The findings, which have been published in the journal Nature Geoscience, add to mounting evidence suggesting that gradual changes such as rising CO2 levels can lead to sudden surprises in our climate, which can be triggered when a certain threshold is crossed.

Previous studies have shown that an essential part of the natural variability of our climate during glacial times is the repeated occurrence of abrupt climate transitions, known as Dansgaard-Oeschger events. These events are characterized by drastic temperature changes of up to 15C within a few decades in the high latitudes of the Northern Hemisphere. This was the case during the last glacial period around 100,000 to 20,000 years ago.

It is commonly believed that this was a result of sudden floods of freshwater across the North Atlantic, perhaps as a consequence of melting icebergs.

Co-author of the study Professor Stephen Barker, from Cardiff University's School of Earth and Ocean Sciences, said: "Our results offer an alternative explanation to this phenomenon and show that a gradual rise of CO2 within the atmosphere can hit a tipping point, triggering abrupt temperature shifts that drastically affect the climate across the Northern Hemisphere in a relatively short space of time.

"These findings add to mounting evidence suggesting that there are sweet spots or 'windows of opportunity' within climate space where so-called boundary conditions, such as the level of atmospheric CO2 or the size of continental ice sheets, make abrupt change more likely to occur. Of course, our study looks back in time and the future will be a very different place in terms of ice sheets and CO2 but it remains to be seen whether or not Earth's climate becomes more or less stable as we move forward from here".

Using climate models to understand the physical processes that were at play during the glacial periods, the team were able to show that a gradual rise in CO2 strengthened the trade winds across Central America by inducing an El Nino-like warming pattern with stronger warming in the East Pacific than the Western Atlantic.

As a result, there was an increase in moisture transport out of the Atlantic, which effectively increased the salinity and density of the ocean surface, leading to an abrupt increase in circulation strength and a rapid rise in temperature.

"This does not necessarily mean that a similar response would happen in the future with increasing CO2 levels, since the boundary conditions are different from the ice age," added by Professor Gerrit Lohmann, leader of the Paleoclimate Dynamics group at the Alfred Wegener Institute.

"Nevertheless, our study shows that climate models have the ability of simulating abrupt changes by gradual forcing as seen in paleoclimate data."

Building on this study, the team intend to produce a new reconstruction of global ice volume across the last glacial cycle, which will help to validate their proposition that certain boundaries can define windows of instability within the climate system.

Jury awards $218 mn to farmers in Syngenta GMO corn lawsuit

Chicago (AFP) June 23, 2017

A US federal jury on Friday ordered Swiss agribusiness giant Syngenta to pay nearly $218 million to 7,000 Kansas farmers after selling them genetically-modified corn seeds not approved for export to China.

The farmers suffered profound economic damage in 2013 when Chinese authorities refused imports of corn grown with Syngenta's bioengineered seeds, causing prices to plummet, according to a lawyer for the plaintiffs.

"The verdict is great news for corn farmers in Kansas and corn growers throughout the country who were seriously hurt by Syngenta's actions," Pat Stueve, a lawyer for the farmers, said in a statement.

The jury in Kansas found Syngenta negligent in the matter, and awarded $217.7 million in compensation to the farmers, court papers showed. The company said the case is "without merit."

The verdict came down after only a half day of deliberations, but covers only one of eight lawsuits targeting Syngenta over the matter, Stueve said.

"This is only the beginning. We look forward to pursuing justice for thousands more corn farmers in the months ahead."

Other cases involve farmers in agricultural states such as Nebraska, Iowa, Illinois and Ohio, with nationwide losses exceeding $5 billion, he said.

The company said it was "disappointed" with the verdict and will appeal.

The ruling "will only serve to deny American farmers access to future technologies even when they are fully approved in the US," adding that the two strains of corn seeds in the case had approval of US regulators and "in the key import markets recommended at the time by the National Corn Growers Association (NCGA) and other industry associations."

"American farmers shouldn't have to rely on a foreign government to decide what products they can use on their farms," the company said in a statement.

China's $10 billion strategic project in Myanmar sparks local ire

by Reuters on Friday, 9 June 2017 09:40 GMT

KYAUK PYU, Myanmar, June 9 (Reuters) - Days before the first supertanker carrying 140,000 tonnes of Chinese-bound crude oil arrived in Myanmar's Kyauk Pyu port, local officials confiscated Nyein Aye's fishing nets.

The 36-year-old fisherman was among hundreds banned from fishing a stretch of water near the entry point for a pipeline that pumps oil 770 km (480 miles) across Myanmar to southwest China and forms a crucial part of Beijing's "Belt and Road" project to deepen its economic links with Asia and beyond.

"How can we make a living if we're not allowed to catch fish?" said Nyein Aye, who bought a bigger boat just four months ago but now says his income has dropped by two-thirds due to a decreased catch resulting from restrictions on when and where he can fish. Last month he joined more than 100 people in a protest demanding compensation from pipeline operator Petrochina .

The pipeline is part of the nearly $10 billion Kyauk Pyu Special Economic Zone, a scheme at the heart of fast-warming Myanmar-China relations and whose success is crucial for the Southeast Asian nation's leader Aung San Suu Kyi.

Embattled Suu Kyi needs a big economic win to stem criticism that her first year in office has seen little progress on reform. China's support is also key to stabilising their shared border, where a spike in fighting with ethnic armed groups threatens the peace process Suu Kyi says is her top priority.

China's state-run CITIC Group, the main developer of the Kyauk Pyu Special Economic Zone, says it will create 100,000 jobs in the northwestern state of Rakhine, one of Myanmar's poorest regions.

But many local people say the project is being rushed through without consultation or regard for their way of life.

Suspicion of China runs deep in Myanmar, and public hostility due to environmental and other concerns has delayed or derailed Chinese mega-projects in the country in the past.

China says the Kyauk Pyu development is based on "win-win" co-operation between the two countries.

Since Beijing signalled it may abandon the huge Myitsone Dam hydroelectric project in Myanmar earlier this year, it has pushed for concessions on other strategic undertakings - including the Bay of Bengal port at Kyauk Pyu, which gives it an alternative route for energy imports from the Middle East.

Internal planning documents reviewed by Reuters and more than two dozen interviews with officials show work on contracts and land acquisition has already begun before the completion of studies on the impact on local people and the environment, which legal experts said could breach development laws.

The Kyauk Pyu Special Economic Zone will cover more than 4,200 acres (17 sq km). It includes the $7.3 billion deep sea port and a $2.3 billion industrial park, with plans to attract industries such as textiles and oil refining.

A Reuters tally based on internal planning documents and census data suggests 20,000 villagers, most of whom now depend on agriculture and fishing, are at risk of being relocated to make way for the project.

"There will be a huge project in the zone and many buildings will be built, so people who live in the area will be relocated," said Than Htut Oo, administrator of Kyauk Pyu, who also sits on the management committee of the economic zone.

He said the government has not publicly announced the plan, because it didn't want to "create panic" while it was still negotiating with the Chinese developer.

In April, Myanmar's President Htin Kyaw signed two agreements on the pipeline and the Kyauk Pyu port with his Chinese counterpart Xi Jinping, as Beijing pushed to revive a project that had stalled since its inception in 2009.

The agreements call for environmental and social assessments to be carried out as soon as possible.

While the studies are expected to take up to 15 months and have not yet started, CITIC has asked Myanmar to finalise contract terms by the end of this year so that the construction can start in 2018, said Soe Win, who leads the Myanmar management committee of the zone.

Such a schedule has alarmed experts who fear the project is being rushed.

"The environmental and social preparations for a project of these dimensions take years to complete and not months," said Vicky Bowman, head of the Myanmar Centre for Responsible Business and a former British ambassador to the country.

CITIC said in an email to Reuters it would engage "a world-renowned consulting firm" to carry out assessments.

Although large-scale land demarcation for the project has not yet started, 26 families have been displaced from farmland due to acquisitions that took place in 2014 for the construction of two dams, according to land documents and the land owners.

Experts say this violates Myanmar's environmental laws.

"Carrying out land acquisition before completing environmental impact assessments and resettlement plans is incompatible with national law," said Sean Bain, Myanmar-based legal consultant for human rights watchdog International Commission of Jurists.

CITIC says it will build a vocational school to provide training for skills needed by companies in the economic zone. It has given $1.5 million to local villages to develop businesses.

Reuters spoke to several villagers who had borrowed small sums from the village funds set up with this money.

"The CITIC money was very useful for us because most people in the village need money," said fisherman Thar Sai Aung, who borrowed $66 to buy new nets.

Chinese investors say they also plan to spend $1 million during the first five years of the development, and $500,000 per year thereafter to improve local living standards.

But villagers in Kyauk Pyu say they fear the project would not contribute to the development of the area because the operating companies employ mostly Chinese workers.

Of the more than 3,000 people living on Maday Island, the entry point for the oil pipeline, only 47 have landed jobs with Petrochina, while the number of Chinese workers stood at more than double that figure, data from labour authorities showed.

Petrochina did not respond to requests for comment. In a recent report it said Myanmar citizens made up 72 percent of its workforce in the country overall and it would continue to hire locally.

"I don't think there's hope for me to get a job at the zone," said fisherman Nyein Aye. He had been turned down 12 times for job applications with the pipeline operator.

"Chinese companies said they would develop our village and improve our livelihoods, but it turned out we are suffering every day."

Ambiguous pledges leave large uncertainty under Paris climate agreement

International Institute for Applied Systems Analysis

Under the pledges countries have made as part of the Paris Agreement on climate change, greenhouse gas emissions could range from 47 to 63 billion metric tons of CO2 equivalent (GtCO2e) per year in 2030, compared to about 52 GtCO2e in 2015, according to a new analysis. That range has critical consequences for the feasibility of achieving the goal of keeping warming "well below 2°C" over preindustrial levels, according to the study, published in the journal Nature Communications.

The pledges, known as Nationally Determined Contributions (NDCs), lay out a roadmap of how individual countries will reduce their emissions, with the intention of adding up to a global emissions reduction sufficient to achieve the Paris targets. Yet the new study shows that these individual roadmaps leave out key details that would enable policymakers to see if they are headed in the right direction.

"Countries have put forward pledges to limit and reduce their emissions. But in many cases the actions described in these pledges are ambiguous or imprecise," says IIASA researcher Joeri Rogelj, who led the study. For example, some pledges focus on improving "emissions intensity," meaning reducing the emissions per dollar of economic output, but assumptions about socioeconomic growth are often implicit or unknown. Other countries focus on absolute emissions reductions, which are simpler to understand, or propose renewable energy targets, which can be expressed in different ways. Questions also remain about how much land-use-related climate mitigation will contribute, such as reducing deforestation or preserving forests.

The study finds that the emissions implied by the current NDCs can vary by -10 to +20% around the median estimate of 52 GtCO2e/yr in 2030. A previous study, also led by IIASA, had found that the emissions reductions set out in the NDCs would not put the world on track to achieve the Paris targets.
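
For readers wanting to connect the two sets of numbers in this article, the -10 to +20 percent spread around the 52 GtCO2e/yr median roughly reproduces the 47-63 GtCO2e range quoted in the opening paragraph. The Python sketch below restates that arithmetic using only figures taken from the article; small differences against the quoted bounds reflect rounding of the median and the percentages.

median_2030 = 52.0       # GtCO2e per year, median emissions estimate for 2030
low, high = -0.10, 0.20  # spread implied by ambiguities in the NDCs

print(f"Lower bound: {median_2030 * (1 + low):.0f} GtCO2e/yr")   # ~47
print(f"Upper bound: {median_2030 * (1 + high):.0f} GtCO2e/yr")  # ~62, vs. the 63 quoted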

The new study confirms this finding. It shows in a quantitative way that in order to keep warming to below 2°C, countries should either increase the stringency of their NDCs by 2030 or consider scaling up their ambition after 2030 by a factor of 4 to 25. If the ambition of NDCs is not further increased by 2030, the study finds no pathways for returning warming to 1.5°C by the end of the century.

"The new results allow us to more precisely understand what is driving the uncertainty in emissions estimates implied by the Paris pledges," says Rogelj. "With this information at hand, policymakers can formulate solutions to remediate this issue."

"This is the first global study to systematically explore the range of emissions outcomes under the current pledges. Our study allows us to identify the key contributors to the overall uncertainty as well as potential clarifications by countries that would be most promising to reduce the overall uncertainty," says IIASA Energy Program Director Keywan Riahi, a study coauthor.

The researchers find that uncertainty could be reduced by 10% with simple, technical clarifications, and could be further reduced by clearer guidelines for countries on building their NDCs. The study highlights the importance of a thorough and robust tracking process of progress made by countries towards the achievement of their NDCs and the Paris Agreement goals as a whole.

Reference

Rogelj J, Fricko O, Meinshausen M, Krey V, Zilliacus JJJ, Riahi K (2017). Understanding the origin of Paris Agreement emission uncertainties. Nature Communications. [doi: 10.1038/ncomms15748]

Offshore wind turbines vulnerable to Category 5 hurricane gusts

University of Colorado at Boulder

Offshore wind turbines built according to current standards may not be able to withstand the powerful gusts of a Category 5 hurricane, creating potential risk for any such turbines built in hurricane-prone areas, new University of Colorado Boulder-led research shows.

The study, which was conducted in collaboration with the National Center for Atmospheric Research in Boulder, Colorado and the U.S. Department of Energy's National Renewable Energy Laboratory in Golden, Colorado, highlights the limitations of current turbine design and could provide guidance for manufacturers and engineers looking to build more hurricane-resilient turbines in the future.

Offshore wind-energy development in the U.S. has ramped up in recent years, with projects either under consideration or already underway in most Atlantic coastal states from Maine to the Carolinas, as well as the West Coast and Great Lakes. The country's first utility-scale offshore wind farm, consisting of five turbines, began commercial operation in December 2016 off the coast of Rhode Island.

Turbine design standards are governed by the International Electrotechnical Commission (IEC). For offshore turbines, no specific guidelines for hurricane-force winds exist. Offshore turbines can be built larger than land-based turbines, however, owing to a manufacturer's ability to transport larger molded components such as blades via freighter rather than over land by rail or truck.

For the study, CU Boulder researchers set out to test the limits of the existing design standard. Due to a lack of observational data across the height of a wind turbine, they instead used large-eddy simulations to create a powerful hurricane with a computer.

"We wanted to understand the worst-case scenario for offshore wind turbines, and for hurricanes, that's a Category 5," said Rochelle Worsnop, a graduate researcher in CU Boulder's Department of Atmospheric and Oceanic Sciences (ATOC) and lead author of the study.

These uniquely high-resolution simulations showed that under Category 5 conditions, mean wind speeds near the storm's eyewall reached 90 meters per second, well in excess of the 50-meter-per-second threshold set by current standards.

"Wind speeds of this magnitude have been observed in hurricanes before, but in only a few cases, and these observations are often questioned because of the hazardous conditions and limitations of instruments," said George Bryan of NCAR and a co-author of the study. "By using large-eddy simulations, we are able to show how such winds can develop and where they occur within hurricanes."

Furthermore, current standards do not account for veer, a measure of the change in wind direction across a vertical span. In the simulation, wind direction changed by as much as 55 degrees between the tip of the rotor and its hub, creating a potentially dangerous strain on the blade.

The findings could be used to help wind farm developers improve design standards as well as to help stakeholders make informed decisions about the costs, benefits and risks of placing turbines in hurricane-prone areas.

"The study will help inform design choices before offshore wind energy development ramps up in hurricane-prone regions," said Worsnop, who received funding from the National Science Foundation Graduate Research Fellowship Program to conduct this research. "We hope that this research will aid wind turbine manufacturers and developers in successfully tapping into the incredibly powerful wind resource just beyond our coastlines."

"Success could mean either building turbines that can survive these extreme conditions, or by understanding the overall risk so that risks can be mitigated, perhaps with financial instruments like insurance," said Professor Julie Lundquist of ATOC and CU Boulder's Renewable and Sustainable Energy Institute (RASEI), a co-author of the study. "The next stage of this work would be to assess how often these extreme winds would impact an offshore wind farm on the Atlantic coast over the 20-to-30-year lifetime of a typical wind farm."

The findings were recently published online in the journal Geophysical Research Letters, a publication of the American Geophysical Union.

Climate Change Misconceptions Common Among Teachers, Study Finds

June 07, 2017, Nathan Hurst, hurstn@missouri.edu, 573-882-6217

COLUMBIA, Mo. – Recent studies have shown that misconceptions about climate change and the scientific studies that have addressed climate change are pervasive among the U.S. public. Now, a new study by Benjamin Herman, assistant professor in the Department of Learning, Teaching and Curriculum in the University of Missouri College of Education, shows that many secondary school science teachers also possess several of these same misconceptions.

In the study, Herman surveyed 220 secondary science teachers in Florida and Puerto Rico to determine their knowledge about climate change science. The survey asked questions regarding things that do contribute to climate change, such as greenhouse gas emissions, and things that do not significantly contribute, such as the depletion of the ozone layer and the use of pesticides. The survey also asked whether controlled scientific experiments are required to validate climate change.

While the majority of the surveyed teachers accurately responded that fossil fuel use, automobiles and industry emissions were major causes of climate change, they also exhibited notable climate change misconceptions. For instance, nearly all of the Puerto Rico teachers and more than 70 percent of Florida teachers believed incorrectly that ozone layer depletion and pesticide use were at least minor, yet significant, causes of climate change. Additionally, Herman says that nearly 50 percent of Florida teachers and nearly 70 percent of Puerto Rico teachers think that climate change science must be studied through controlled experiments to be valid.

Herman says the teachers in his study exhibited climate change science misconceptions at a similar rate to average Americans. He says these results are understandable given that teachers are often overworked and not afforded professional development opportunities that would deepen their climate change science knowledge.

“Teachers want and need support to keep them abreast of scientific discoveries and developments and how scientists come to their well-established claims regarding climate change,” Herman said. “Climate change science involves many different types of science methods stemming from disciplines, including physics, biology, atmospheric science and earth science. Science teachers also need professional development directed at assisting them in their efforts to accurately and effectively engage students on this important issue. Because of existing misconceptions and misinformation regarding climate change, science teachers have a crucial professional and ethical responsibility to accurately convey to their students how climate change is studied and why scientists believe the climate is changing.”

The study, “Florida and Puerto Rico Secondary Science Teachers’ Knowledge and Teaching of Climate Change Science,” was published in the International Journal of Science and Mathematics Education. The study was coauthored by Allan Feldman and Vanessa Vernaza-Hernandez from the University of South Florida. The study was funded by a National Science Foundation Coastal Areas Climate Change Education Partnership Award grant.

Jobs in Wind Energy in 2016

Published: March 29th, 2017 By Climate Central

We’ve reached the end of the windiest month of the year. But in other months, wind will continue to play an increasingly large role in the U.S. power mix. At the end of last year, wind capacity surpassed hydroelectric capacity for the first time in the U.S.

Over the past decade, wind power has exploded in the U.S. Over that time, generating capacity from wind has increased by a factor of seven, surpassing 82,000 megawatts, or enough to power 24 million homes.

Wind is most consistent in the Great Plains and on the Front Range of the Rockies. The prevailing west winds come rushing off of the mountains, making the area especially conducive for generating electricity, as higher wind speeds produce disproportionately more power from a wind turbine.
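
The claim that higher wind speeds produce disproportionately more power reflects the standard result that the power available in the wind grows with the cube of wind speed (P = 0.5 * rho * A * v^3, for air density rho and rotor swept area A, before turbine losses). The Python sketch below is a generic illustration of that scaling and is not drawn from the article; the air density and rotor diameter are assumed, illustrative values.

import math

rho = 1.225            # kg/m^3, air density near sea level (assumed)
rotor_diameter = 100.0 # metres, a typical utility-scale turbine (assumed)
area = math.pi * (rotor_diameter / 2) ** 2

def available_wind_power_kw(v):
    """Kinetic power in the wind crossing the rotor, in kW.
    Ignores the Betz limit and turbine efficiency."""
    return 0.5 * rho * area * v ** 3 / 1000

for v in (5, 7, 10):  # wind speed in m/s
    print(f"{v} m/s -> {available_wind_power_kw(v):,.0f} kW")
# Doubling the wind speed from 5 to 10 m/s yields eight times the available power,
# which is why the consistently windy Plains and Front Range are so productive.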

The winds in the Southeast are not as consistent, especially in the doldrums of summer. Despite the lack of a consistent breeze across the region as a whole, the Southeast coast and offshore areas in particular allow for a modest amount of power generation.

Wind power is about more than electricity. The rapid installation of wind turbines has spurred job growth in the U.S.:

More than 100,000 jobs are related to the wind energy sector (planning, construction, operations)

40 states now have utility-scale wind energy projects

Texas leads the country in the number of jobs in the wind energy sector

The power provided by wind also means that the need for fossil fuels is declining, effectively slowing the increase of heat-trapping carbon dioxide emitted into the atmosphere.

Methodology: Windiest locations are based on a height of 80 meters (260 feet), typical tower height of wind turbines. Data provided by AWS Truepower via National Renewable Energy Laboratory. Job data provided by American Wind Energy Association for 2016. Carbon dioxide savings based on total installed wind capacity for each state, an operating capacity factor of 35 percent, and estimated carbon dioxide emissions avoided per unit of wind electricity in each state.
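
To make the methodology above concrete, the short Python sketch below applies the stated 35 percent operating capacity factor to the roughly 82,000 megawatts of installed capacity mentioned earlier in the piece. The per-household consumption figure is an assumption (about 10.8 MWh a year, a commonly cited U.S. average) used only to show how a "homes powered" estimate in the neighborhood of the 24 million quoted above can be derived.

installed_capacity_mw = 82_000  # U.S. wind capacity cited above, in megawatts
capacity_factor = 0.35          # operating capacity factor from the methodology
hours_per_year = 8_760

annual_generation_mwh = installed_capacity_mw * capacity_factor * hours_per_year
home_use_mwh = 10.8             # assumed average annual electricity use per U.S. home

print(f"Annual generation: {annual_generation_mwh / 1e6:.0f} million MWh")
print(f"Homes powered: {annual_generation_mwh / home_use_mwh / 1e6:.1f} million")
# Prints roughly 251 million MWh and about 23 million homes; the exact "homes"
# figure depends on the per-home consumption assumed.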

Organisms in ballast water increasing despite discharge measures

New actions contemplated as larger ships, changes in traffic patterns may be offsetting current procedures

By Karl Blankenship on June 06, 2017

Ships arriving in Chesapeake Bay ports bring more than just cargo — in 2013 they also inadvertently released an estimated 10 billion live zooplankton from other parts of the world, a finding that surprised the researchers who recently reported the results.

Regulations aimed at reducing the risk of aquatic invasions went into effect more than a decade ago, and a team from the Smithsonian Environmental Research Center had expected to see a decrease in live organisms being released from ballast holds of ships.

Instead, they found that concentrations of coastal zooplankton discharged into Bay waters had increased nearly fivefold from releases checked before the new regulations took effect in 2004.

“It wasn’t quite what we expected to see,” said Greg Ruiz, SERC senior marine biologist and a co-author of the study. “We wanted to know how things had changed, and they changed in a way that we didn’t expect.”

So-called “biological pollution” from ships has been a concern since the 1980s, when releases of ballast water led to the devastating invasion of zebra mussels in the Great Lakes, causing billions of dollars in damages, harm to water quality and alterations to the lakes’ food chain.

Ballast water has also been blamed for introducing the Chinese mitten crab, the rapa whelk and other exotic species into the Chesapeake Bay. An Asian strain of Vibrio bacteria, which sickened two people who ate raw Bay oysters in 2010, is suspected to have arrived in a ship’s ballast tank.

Large ships often pump huge amounts of water into ballast tanks when leaving a port to help stabilize the vessel during its voyage. A single ship can hold 30,000–40,000 cubic meters of ballast water.

Until about a decade ago, that ballast water — along with any organisms in it — was routinely discharged into the water at destination ports. Starting in 2004, the Coast Guard required ships to exchange ballast water while they are more than 200 nautical miles offshore, replacing water drawn in at ports with ocean water. That was supposed to reduce the density of coastal species in the ballast tanks and replace them with oceanic species less likely to survive in the fresher water of ports.

But Smithsonian scientists were shocked when they examined ballast water collected in recent years from ships arriving in the ports of Norfolk and Baltimore and found that coastal zooplankton concentrations had dramatically increased when compared with samples gathered from 1993 to 2000, before ballast water exchange was required.

The scientists, who reported their findings in the journal PLOS ONE, say that several factors likely contributed to the increases, including the fact that the total amount of ballast water being discharged increased nearly fivefold during that time.

While overall shipping traffic has remained fairly steady in the Bay in recent decades, the amount of traffic by “bulkers,” primarily transporting coal, has increased steadily since 2000, figures in the study show.

Container ships both pick up and drop off cargo, and therefore often do not have to take on as much ballast. But bulkers leave the Bay full, typically taking coal to other countries, and return empty. That means they must take on large amounts of ballast water for their return voyage to stabilize the empty vessel, and most of that gets released when the ship returns to pick up another load.

From 2005 through 2013, the amount of ballast water being discharged into the Bay increased 374 percent, the scientists said. The increase was driven by bulkers, which accounted for more than three-quarters of the total ballast discharge in the final years of the study.

As a result, the effectiveness of the ballast exchange requirement was reduced simply because significantly more water was being released.

But the scientists said that alone did not account for the overall increase in zooplankton concentrations, which are supposed to be greatly diluted by ballast water exchanges.

They suggested several other factors may be playing a role. Zooplankton concentrations have been poorly studied in other areas, the Smithsonian team said, and may have increased in ports where Bay-bound ships drew in water, especially because the origins of those ships changed over time. In the earlier years of the study, almost all of the ships arriving in the Bay came from the Eastern Mediterranean, but none of the vessels calling in more recent years originated there, as shipping patterns have changed.

Also, the length of voyages has decreased as ships have become faster, increasing the chance that zooplankton taken on in the port of origin would survive the trip.

And while all ships reported conducting ballast water exchanges, the paper noted that it is impossible to verify how well those were performed. Highly effective ballast water exchanges can eliminate 90 percent of the organisms from the port of origin, but the scientists said there’s no way of knowing whether the ships achieved that level of efficiency.
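
One way to see how rising discharge volumes can erode the benefit of mid-ocean exchange is to treat the number of coastal organisms delivered as discharge volume times source-water concentration times the fraction that survives exchange. The Python sketch below is purely illustrative: the baseline volume and concentration are arbitrary placeholders, while the 90 percent removal figure and the roughly 374 percent increase in discharge volume come from the article.

def organisms_delivered(volume_m3, conc_per_m3, exchange_efficiency):
    """Coastal organisms released at the destination port after ballast
    exchange, under a deliberately simplified model."""
    return volume_m3 * conc_per_m3 * (1 - exchange_efficiency)

# Placeholder baseline: pre-2004 discharge, no exchange requirement.
baseline = organisms_delivered(1_000_000, 500, 0.0)

# Later years: roughly 4.7 times the discharge volume, with a 90% effective exchange.
with_exchange = organisms_delivered(4_700_000, 500, 0.9)

print(f"Delivered load relative to baseline: {with_exchange / baseline:.2f}")  # -> 0.47
# Even a highly effective exchange only roughly halves the delivered load once
# volumes rise this much, and higher source concentrations or imperfect exchanges
# erode the benefit further.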

Most likely, the Smithsonian team concluded, all of those factors may have contributed to the increase. “There are so many interacting factors, there is no way to pinpoint one,” said Jenny Carney, the lead author on the paper.

The findings don’t mean that ballast water exchange does not help. “The zooplankton have increased surprisingly, but it would be even higher if we didn’t have ballast exchange,” Ruiz said.

But, the scientists said, research shows how shifts in trade and trade patterns can diminish the effectiveness of efforts to reduce the risk of biological invasions through ballast water exchange.

This has implications for the future, the authors said. While the demand for coal has dropped in the past few years, other factors could lead to the importation of more ballast water. A recent expansion of the Panama Canal has led to a new generation of even larger shipping vessels, the authors noted, and the ports of Baltimore and Norfolk are among the few along the Atlantic Coast capable of handling those ships. The expected opening late this year of a new terminal in Maryland to export liquefied natural gas could also lead to more shipping and ballast water.

“All of these factors combined can lead to highly dynamic changes in [ballast water] delivery to Chesapeake Bay,” the scientists wrote.

That could increase the risk of further invasions by nonnative species in the Bay, the scientists said.

“People should be concerned about this,” Ruiz said. “We know that invasions are a really major force of change, ecologically, economically and in terms of health, too. We see that with invasions around the world, including in Chesapeake Bay.”

Ballast water is considered to be the primary mechanism through which nonnative species colonize coastal waters around the globe. Zebra mussels, and the closely related quagga mussel, may be the poster children for the impact such invasions can cause. The filter-feeding mussels latch onto solid substrates in such huge numbers that they clog water intakes and have sunk navigational buoys.

They have spread avian botulism, which has killed thousands of birds in the Great Lakes, and they have caused the near extinction of some native mussel species. They filter algae from water, but are selective in the types that they consume, which has resulted in the overconsumption of some algae sought by fish — leading to the collapse of some populations — while allowing algae that contribute to poor water quality to persist.

They also accumulate toxic contaminants, which are passed up the food chain when the mussels are consumed by other species. They are blamed for more than $7 billion in damage to the Great Lakes fisheries alone, and have been spreading across the country. They reached the upper Chesapeake in the last few years, probably by attaching to recreational boats, though it’s unclear whether they will thrive in the Bay’s saltier environment.

But the Bay also has suspected ballast water invaders of its own, including the rapa whelk, a native of the western Pacific Ocean, which consumes native mollusks. It was first discovered in the Virginia portion of the Bay in the 1990s and appears to now be established.

The Chinese mitten crab, which was confirmed to be in Maryland’s portion of the Bay in 2005, has become an economic and ecological problem in other areas where it has become established and abundant.

Although coastal waters, including the Bay, are constantly subjected to new species from ballast water, it’s difficult to predict which will survive and potentially become problems. While those species thought to have entered the Bay through ballast water have not caused ecosystem-altering changes like zebra mussels in the Great Lakes, scientists note that some species may remain at low levels for years, even decades, until the right conditions allow their populations to explode.

“We are not saying that all species delivered in ballast tanks are going to have major impacts, but some subset of them can and will, and we see good evidence for that,” Ruiz said. “So at some point, with business as usual, one of those species will have a pretty big effect” in the Bay.

The risk might be reduced, over time, by new regulations that will be phased in starting this fall. Instead of relying on ballast water exchange, those rules require that ballast water be treated with approved technologies — typically chlorination or exposure to ultraviolet light — to greatly reduce the number of live organisms that could be discharged.

But it will be years before those technologies are installed on all new and existing ships. Also, they may not perform as well in the real world as they do in tests, said Mario Tamburri, of the University of Maryland Center for Environmental Science, who has been working on the new techniques.

“It is still going to remove more organisms than just doing an exchange or doing nothing,” Tamburri said. “Whether it is actually meeting the intent of the regulations, the discharge standard, that’s what’s questionable.”

Karl Blankenship is editor of the Bay Journal and executive director of Chesapeake Media Service. He has served as editor of the Bay Journal since its inception in 1991.

Environmental groups try to block Montana mine expansion

By Matt Volz, Associated Press | Posted Jun 8th, 2017 @ 5:45pm

HELENA, Mont. (AP) — Environmental advocacy groups launched a new attempt Thursday to halt the expansion of Montana's largest coal mine over its effects on climate change, after federal officials said it wouldn't contribute significantly to the nation's greenhouse gas emissions.

WildEarth Guardians and the Montana Environmental Information Center say in their lawsuit that the U.S. Interior Department's environmental review was shoddy, and argue the federal official who authorized the Spring Creek Mine expansion didn't have the authority to do so.

"From our perspective, they fell miserably short in accounting for the environmental implications of rubber-stamping more coal mining," said Jeremy Nichols, WildEarth Guardians' climate and energy program director. "It's scandalous that the coal industry seems able to survive only because of the federal government cutting corners and turning its back on climate change."

The Spring Creek Mine is the seventh-largest coal mine in the nation, and the largest in Montana. It is part of the Powder River Basin of Montana and Wyoming, which produces 40 percent of the nation's coal annually. The coal in the 1.7 square mile (4.4 square kilometer) expansion area, like most of the coal mined in the area, comes from publicly owned reserves leased to companies by the federal government.

The expansion could double Cloud Peak Energy's production of 18 million tons of coal a year from Spring Creek, according to the lawsuit. That coal would spew millions of tons of carbon dioxide into the air, which would contribute to climate change and air pollution, the environmental groups argue.

It would also mean an increase in train cars carrying the coal through towns and cities on their way to market, which the environmental review didn't consider, according to the lawsuit.

Interior Department spokeswoman Heather Swift referred questions about the lawsuit to the U.S. Department of Justice. That agency did not return a request for comment.

WildEarth Guardians led a previous attempt to block the expansion when it was first proposed in 2012. In that case, U.S. District Judge Susan Watters of Billings ordered federal officials to re-examine the environmental impacts, but did not halt the expansion.

Last fall, Interior Department officials determined the expansion would have only a minor impact on the nation's greenhouse gas emissions. That prompted the new lawsuit, which argues a more thorough environmental review is needed.

This time, the groups added a new argument to their environmental claims. They say the acting field office manager who approved the expansion did not have the authority to do so.

The groups cited recent decisions that set aside approved coal leases in Wyoming and Colorado because lower-level officials had signed off on them.

In those cases, the Interior Board of Land Appeals ruled that coal lease modifications could only be approved by higher-ranking officials, such as the Bureau of Land Management's deputy state director.

Federal Judge Denies Trump Admin Appeal In Youth Climate Lawsuit

"A federal judge has denied the Trump administration's appeal in a climate change lawsuit, paving the way for the unprecedented suit to go to trial.

The case -- Juliana v. United States -- pits a group of youth climate plaintiffs against the federal government and the fossil fuel industry. The plaintiffs allege that the federal government, through its actions and coordination with the fossil fuel industry, has violated their constitutional right to a livable climate. It is the first climate lawsuit to rely on a version of the public trust doctrine -- known as atmospheric trust -- to make its case, and adds to a growing number of attempts to force climate action through the judicial branch.

The lawsuit was initially filed in August 2015 against the Obama administration. The Obama administration, as well as three fossil fuel industry groups acting as intervenors, filed motions to have the lawsuit dismissed, which were denied in November by U.S. Federal Judge Ann Aiken. In February, after President Donald Trump was sworn in, the youth plaintiffs filed notice with the court that they would be replacing Obama with Trump.

EPA delays chemical safety rule until 2019 while it evaluates necessity

By Devin Henry - 06/12/17 11:36 AM EDT

The Environmental Protection Agency (EPA) will delay implementation of an Obama-era chemical safety rule for nearly two years while it reassesses the necessity of the regulation.

The EPA announced on Monday that Administrator Scott Pruitt signed a directive last Friday delaying the chemical plant safety standards until at least Feb. 20, 2019.

The move comes after the EPA delayed the regulation in March amid discussions over the rule’s impact on businesses.

“We are seeking additional time to review the program, so that we can fully evaluate the public comments raised by multiple petitioners and consider other issues that may benefit from additional public input," Pruitt said in a statement.

Obama regulators in December finalized a rule beefing up safety standards at chemical production plants, calling for new emergency requirements for manufacturers regulated by the EPA.

Officials moved to overhaul chemical safety standards after a 2013 explosion at a chemical plant in Texas killed 15 people. Their rule would require companies to better prepare for accidents and expand the EPA's investigative and auditing powers.

But chemical companies wrote in a letter to Pruitt shortly after his February confirmation that the rule would raise “significant security concerns and compliance issues that will cause irreparable harm."

The EPA sought public comment in March on a proposal to delay the rule while considering those objections. The agency said it received 54,117 comments before Pruitt formally moved to delay the rule.

Pollution 'devastating' China's vital ecosystem, research shows

The startling extent to which man-made pollution is devastating the ability of China's vital ecosystems to offset damaging carbon emissions has been revealed.

UNIVERSITY OF EXETER

A pioneering new international study, led by the University of Exeter, has looked at the true impact air pollutants have in impeding the local vegetation's ability to absorb and store carbon from the atmosphere.

The study looked at the combined effects that surface ozone and aerosol particles - two of the primary atmospheric pollutants linked to public health and climate change - have on China's plant communities' ability to act as a carbon sink.

It found that ozone vegetation damage - which weakens leaf photosynthesis by oxidizing plant cells - far outweighs any positive impact aerosol particles may have in promoting carbon uptake by scattering sunlight and cooling temperatures.

While the damage caused to these vital ecosystems in China is not irreversible, the team of experts has warned that only drastic action will offer protection against long-term global warming.

The study is published in the journal Atmospheric Chemistry and Physics.

Professor Nadine Unger, from the University of Exeter's Mathematics department and co-author of the paper, said: "We know that China suffers from the highest levels of air pollution in the world, and the adverse effects this has on human health and climate change are well documented.

"What is less clearly understood, however, is the impact it has on the regional carbon balance. The land ecosystems in China are thought to provide a natural carbon sink, but we didn't know whether air pollution inhibited or promoted carbon uptake.

"What is clear from this study is that the negative ozone vegetation damage far outstrips any benefits that an increase in aerosol particles may have. It is a stark warning that action needs to be taken now to tackle the effects man-made pollution is having on this part of the world before it is too late."

The team used state-of-the-art Earth System computer models, together with a vast array of existing measurement datasets, to assess the separate and combined effects of man-made ozone and aerosol pollution in Eastern China.

The study found that the Net Primary Productivity (NPP) - or the amount of carbon plants in an ecosystem can take in - is significantly reduced when the amount of surface ozone increases.

Crucially, this reduction is significantly greater than the effect aerosol particles have in encouraging plants to increase carbon intake through reducing canopy temperatures and increasing the scattering of light.

Professor Unger added: "Essentially, our results reveal a strong 'dampening effect' of air pollution on the land carbon uptake in China today.

"This is significant for a number of reasons, not least because the increase in surface ozone produced by man-made pollution in the region will continue to grow over the next 15 years unless something is done.

"If - and it is of course a big 'if' - China reduce their pollution to the maximum levels, we could reduce the amount of damage to the ecosystems by up to 70 per cent - offering protection of this critical ecosystem service and the mitigation of long-term global warming."

Cold conversion of food waste into renewable energy and fertilizer

A Concordia study shows bacteria in low-temperature environments could reduce Canada's carbon footprint

PUBLIC RELEASE: 31-MAY-2017

Researchers from Concordia's Department of Building, Civil and Environmental Engineering (BCEE) in collaboration with Bio-Terre Systems Inc. are taking the fight against global warming to colder climes.

Their weapon of choice? Cold-loving bacteria.

In a study published in Process Safety and Environmental Protection, authors Rajinikanth Rajagopal, David Bellavance and Mohammad Saifur Rahaman demonstrate the viability of using anaerobic digestion in a low-temperature (20°C) environment to convert solid food waste into renewable energy and organic fertilizer.

They employed psychrophilic bacteria -- which thrive in relatively low temperatures -- to break down food waste in a specially designed bioreactor. In doing so, they produced a specific methane yield comparable to that of more energy-intensive anaerobic digestion processes.

"There is enormous potential here to reduce the amount of fuel that we use for solid waste treatment," Rahaman explains.

"Managing and treating food waste is a global challenge, particularly for cold countries like Canada where the temperature often falls below -20°C and energy demands related to heating are high."

He adds that the most commonly used forms of anaerobic digestion require large amounts of energy to heat the bioreactors and maintain temperatures for the bacteria's optimal performance.

"What we've learned is that we can now use adapted psychrophilic bacteria to produce a level of methane comparable to those more common forms, while using less energy."

'A promising new research direction'

Globally, more than 1.3 billion tonnes of municipal waste are created each year, and that number is expected to increase to 2.2 billion tonnes by 2025. Most of it ends up in landfills where it biodegrades over time, producing biogas, a mixture largely composed of carbon dioxide, methane and hydrogen sulfide whose methane content makes it a potent contributor to warming.

Left alone, this methane-rich biogas poses a significant climate threat, as methane carries a global warming potential that is 21 times greater than that of carbon dioxide.
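
The 21-times figure above is what makes capturing the gas worthwhile in climate terms. A tonne of methane vented to the atmosphere counts as roughly 21 tonnes of CO2-equivalent, whereas burning that tonne of methane for energy yields about 2.75 tonnes of CO2 (the 44:16 ratio of the molar masses of CO2 and CH4). The Python sketch below restates that comparison; the one-tonne quantity is just an illustrative input.

GWP_CH4 = 21            # global warming potential of methane cited above
CO2_PER_CH4 = 44 / 16   # tonnes of CO2 produced per tonne of CH4 combusted

methane_tonnes = 1.0    # illustrative quantity of captured methane

vented = methane_tonnes * GWP_CH4         # ~21 t CO2e if released untreated
combusted = methane_tonnes * CO2_PER_CH4  # ~2.75 t CO2 if captured and burned

print(f"Vented:    {vented:.1f} t CO2e")
print(f"Combusted: {combusted:.2f} t CO2")
# Burning the methane cuts its direct warming impact by nearly 90 percent, before
# counting any fossil fuel the recovered energy displaces.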

But, according to the researchers, engineered anaerobic digestion techniques can also be adapted to capture such gases and transform them into renewable energy.

By employing devices such as biogas storage domes, biofilters or combined heat and power co-generation systems, for instance, methane can be collected, cleaned and converted into heat or electricity that can then be substituted for most fossil fuels.

At an agronomic level, the process also contributes leftover nitrogen- and phosphorus-rich digestate material that can be subsequently recovered and used as plant fertilizer.

The process for feeding the bioreactor is unique. It involves a semi-continuously fed constant volume overflow approach: the amount of food waste fed into the bottom opening necessitates the removal of an equal amount of treated effluent from the top.

The researchers performed various tests on the extracted material to determine its physicochemical characteristics as well as to monitor the biogas quality and quantity.

"There aren't many studies that look into developing new applications for treating food waste," Rajagopal says. "We hope that this study will mark the beginning of a promising new research direction."

Why Don’t Green Buildings Live Up to Hype on Energy Efficiency?

Analysts call it the “energy performance gap” — the difference between promised energy savings in green buildings and the actual savings delivered. The problem, researchers say, is inept modeling systems that fail to capture how buildings really work.

By Richard Conniff • May 25, 2017

Not long ago in the southwest of England, a local community set out to replace a 1960s-vintage school with a new building using triple-pane windows and super-insulated walls to achieve the highest possible energy efficiency. The new school proudly opened on the same site as the old one, with the same number of students, and the same head person—and was soon burning more energy in a month than the old building had in a year.

The underfloor heating system in the new building was so badly designed that the windows automatically opened to dump heat several times a day even in winter. A camera in the parking lot somehow got wired as if it were a thermal sensor, and put out a call for energy any time anything passed in front of the lens. It was “a catalogue of disasters,” according to David Coley, a University of Bath specialist who came in to investigate.

Many of the disasters were traceable to the building energy model, a software simulation of energy use that is a critical step in designing any building intended to be green. Among other errors, the designers had extrapolated their plan from a simplified model of an isolated classroom set in a flat landscape, with full sun for much of the day. That dictated window tinting and shading to reduce solar gain. Nobody seems to have noticed that the new school actually stood in a valley surrounded by shade trees and needed all the solar gain it could get. The classrooms were so dark the lights had to be on all day.

It was an extreme case. But it was also a good example, according to Coley, of how overly optimistic energy modeling helps cause the “energy performance gap,” a problem that has become frustratingly familiar in green building projects. The performance gap refers to the failure of energy improvements, often undertaken at great expense, to deliver some (or occasionally all) of the promised savings. A study last year of refurbished apartment buildings in Germany, for instance, found that they missed the predicted energy savings by anywhere from 5 to 28 percent. In Britain, an evaluation of 50 “leading-edge modern buildings,” from supermarkets to health care centers, reported that they “were routinely using up to 3.5 times more energy than their design had allowed for” — and producing on average 3.8 times the predicted carbon emissions.

Buildings account for 40 percent of climate change emissions and are the fastest growing source of emissions.

The performance gap is “a vast, terrible enormous problem,” in the words of one building technology specialist, and that’s not an exaggeration. Though much of the public concern about energy consumption and climate change focuses on automotive miles-per-gallon, the entire transport sector — including trains, planes, ships, trucks, and cars — accounts for just 26 percent of U.S. climate change emissions. Buildings come in at 40 percent, and they are the fastest growing source of emissions, according to the U.S. Green Building Council.

Eliminating the performance gap matters particularly for European Union nations, which have a legally binding commitment to reduce emissions by 80 to 95 percent below 1990 levels by mid-century. But knowing with confidence what savings will result matters for anybody trying to figure out how much to invest in a particular energy improvement.

Researchers have generally blamed the performance gap on careless work by builders, overly complicated energy-saving technology, or the bad behaviors of the eventual occupants of a building. But in a new study, Coley and his co-authors put much of the blame on inept energy modeling. The title of the study asks the provocative question “Are Modelers Literate?” Even more provocatively, a press release from the University of Bath likens the misleading claims about building energy performance to the Volkswagen emissions scandal, in which actual emissions from diesel engine cars were up to 40 times higher than “the performance promised by the car manufacturer.”

For their study, Coley and his co-authors surveyed 108 building industry professionals — architects, engineers, and energy consultants — who routinely use energy performance models. To keep the problem simple, the researchers asked participants to look at a typical British semi-detached home recently updated to meet current building codes. Then they asked test subjects to rank which improvements made the most difference to energy performance. Their answers had little correlation with objective reality, as determined by a study monitoring the actual energy performance of that home hour-by-hour over the course of a year. A quarter of the test subjects made judgments “that appeared worse than a person responding at random,” according to the study, which concluded that the sample of modelers, “and by implication the population of building modelers, cannot be considered modeling literate.”

‘We have cases where modelers will come up with a savings measure that is more than the energy use of the house,’ says one scientist.

Predictably, that conclusion raised hackles. “The sample seems odd to me,” said Evan Mills, a building technology specialist at Lawrence Berkeley National Laboratory, “to include so many people who are junior in the practice, and then to be criticizing the industry at large.” He noted that almost two-thirds of the 108 test subjects had five years or less experience in construction. But Coley and his co-authors found that even test subjects with “higher-level qualifications, or having many years of experience in modeling,” were no more accurate than their juniors.

In any case, Mills acknowledged, “the performance gap is real, and we must be aware of models not properly capturing things. We have cases where modelers will come up with a savings measure that is more than the energy use of the house, because they are just working with the model,” and not paying attention to the real house.

That sort of problem — energy models showing unreasonable results — also turns up at the preliminary stage on 50 percent of projects going through the LEED certification process, said Gail Hampsmire of the U.S. Green Building Council. Designers have a tendency to take a “black box” approach, providing whatever inputs a particular energy model requires and then accepting the outputs “without evaluating the reasonability of those results,” she said. “You always have the issue of garbage in/garbage out, and the capability of the modeler to identify whether they are getting garbage out is critical.”

So what’s the fix? The current accreditation requirements for energy modelers are “very gentle,” said Coley, but “when you’re trying to get something off the ground relatively quickly, you can’t send everybody back to college for three years.” In any case, the problem isn’t really education in the formal sense.

“It has to do with feedback,” he said, or the lack of it. The culture of building construction says it’s perfectly reasonable for architects, but not energy modelers, to travel hundreds of miles to see how the actual building compares with what they designed. For energy modelers, there’s not even an expectation that they’ll get on the phone with the building manager at year one and ask how energy usage compares with the original model. As a result, said Coley, energy modeling can become like theoretical physics: “You can very easily create a whole web of theories, and then you find yourself studying the physics of your theories, not the physics of the real world.”

The organization that gives LEED certification is now requiring that developers post actual energy usage on an online database.

The answer, he suggested, is a regulatory requirement that modelers follow up on their work by routinely checking their predictions against a building’s actual energy consumption. A system of modest inducements could also make that feedback more broadly available — for instance, by promising to take three weeks off the planning permissions process for developers who commit to posting actual energy usage to an online database. The Green Building Council has begun to require that sort of reporting for projects seeking LEED certification, said Hampsmire, with an online platform now in development “for building owners to track their own performance and compare it with other buildings.”

A second problem, according to Coley, is the tendency of government agencies to require simplified energy models at the start of the design process. The requirements often include certain uniform assumptions about energy use, making it easier to compare one building with another. “Because you have to do that at the start, it becomes the default, and this sets up a kind of ‘Alice in Wonderland’ world, and it’s not surprising that modelers model this artificial world.” But at least in the United States that has become less of a problem in recent years, according to Hampsmire. Current building code requirements are “fairly good,” she said. “They don’t say, ‘Model energy use for a building occupied eight hours a day,’” or some other arbitrary standard. Instead, “they specifically state that all energy use has to be modeled as anticipated.”

The takeaway from all this isn’t to discredit energy modeling but to improve it. Builders increasingly need realistic modeling, said Coley, by people with a deep knowledge of building physics and at least as much experience with real buildings as with energy models. Without that, the result will be even more $500-million office blocks with too much glass on the southern exposure, causing everybody inside to bake on a hot summer afternoon. Without smart energy modeling, the result will be a world spinning even faster into out-of-control climate change.

“This isn’t rocket science,” said the Berkeley Laboratory’s Mills. But then he added, “It’s harder than rocket science.”

Richard Conniff is a National Magazine Award-winning writer whose articles have appeared in The New York Times, Smithsonian, The Atlantic, National Geographic, and other publications. His latest book is House of Lost Worlds: Dinosaurs, Dynasties, and the Story of Life on Earth.

Rising Seas May Wipe Out These Jersey Towns, but They're Still Rated AAA

by Christopher Flavelle

May 25, 2017

Few parts of the U.S. are as exposed to the threats from climate change as Ocean County, New Jersey. It was here in Seaside Heights that Hurricane Sandy flooded an oceanfront amusement park, leaving an inundated roller coaster as an iconic image of rising sea levels. Scientists say more floods and stronger hurricanes are likely as the planet warms.

Yet last summer, when Ocean County wanted to sell $31 million in bonds maturing over 20 years, neither of its two rating companies, Moody’s Investors Service and S&P Global Ratings, asked any questions about the expected effect of climate change on its finances.

"It didn’t come up, which says to me they’re not concerned about it," says John Bartlett, the Ocean County representative who negotiated with the rating companies. Both gave the bonds a perfect triple-A rating.

The same rating companies that were caught flat-footed by the downturn in the mortgage market during the global financial crisis that ended in 2009 may be underestimating the threat of climate change to coastal communities. If repeated storms and floods are likely to send property values -- and tax revenue -- sinking while spending on sea walls, storm drains or flood-resistant buildings goes up, investors say bond buyers should be warned.

"They are supposed to identify risk to investors," said Eric Glass, a fixed-income portfolio manager at Alliance Bernstein, a New York investment management firm that handles $500 billion in assets. "This is a material risk."

Breckinridge Capital Advisors, a Boston-based firm specializing in fixed-income investments, is already accounting for those risks internally: Last year, it downgraded a borrower in Florida due to climate risk, citing the need for additional capital spending because of future flooding.

Rob Fernandez, its director of environmental, social and governance research, said rating companies should do the same. "Either incorporate these factors, or, if you say that you are, tell us how you’re doing it," he said.

S&P and Moody’s say they’re working on how to incorporate the risk to bonds from severe or unpredictable weather. Moody’s released a report about climate impacts on corporate bond ratings last November and is preparing a similar report on municipal bonds now.

Fitch Ratings Ltd. is more skeptical.

"Some of these disasters, it’s going to sound callous and terrible, but they’re not credit-negative," Amy Laskey, managing director for the local government group at Fitch, said in an interview. "They rebuild, and the new facilities are of higher quality and higher value than the old ones."

For more than a century, rating companies have published information helping investors gauge the likelihood that companies and governments will be able to pay back the money they borrow. Investors use those ratings to decide which bonds to buy and gauge the risk of their portfolio. For most of that time, the determinants of creditworthiness were fairly constant, including revenue, debt levels and financial management. And municipal defaults are rare: Moody’s reports fewer than 100 defaults by municipal borrowers it rated between 1970 and 2014.

Climate change introduces a new risk, especially for coastal cities, as storms and floods increase in frequency and intensity, threatening to destroy property and push out residents. That, in turn, can reduce economic activity and tax revenue. Rising seas exacerbate those threats and pose new ones, as expensive property along the water becomes more costly to protect -- and, in some cases, may get swallowed up by the ocean and disappear from the property-tax rolls entirely.

Just as a shrinking auto industry slowly crippled Detroit, leading to an exodus of residents and, eventually, its bankruptcy in 2013, other cities could face the accumulating risks of storms or floods -- and then suddenly encounter a crisis.

"One of the first questions that we’re going to ask when confronted with an issuer along the coast of Texas, or on the coast of Florida, is: How are you going about addressing, mitigating the impacts of climate change?" Glass, Alliance Bernstein, said. And if local officials don’t have a good answer to that question, he added, "We will not invest, period."

When asked by Bloomberg, none of the big three bond raters could cite an example of climate risk affecting the rating of a city’s bonds.

Kurt Forsgren, a managing director at S&P, said its municipal ratings remain "largely driven by financial performance." He said the company was looking for ways to account for climate change in ratings, including through a city’s ability to access insurance.

Henry Shilling, Moody’s senior vice president for environment, social and governance risks, said the company is planning to issue a report this summer that explains how it will incorporate climate change into its municipal ratings. "It’s a bit of a journey," he said.

Last September, when Hilton Head Island in South Carolina issued bonds that mature over 20 years, Moody’s gave the debt a triple-A rating. In January 2016, all three major bond companies gave triple-A ratings to long-term bonds issued by the city of Virginia Beach, which the U.S. Navy has said faces severe threats from climate change.

The threat isn’t limited to smaller cities. The World Bank called Boston one of the 10 cities globally that are most financially exposed to flooding. But in March, when Boston issued $150 million in bonds maturing over 20 years, Moody’s and S&P each gave those bonds top ratings.

Of course, predictions are hard, especially about the future. While scientists are generally united about the science of climate change, its pace remains uncertain. And what all of that will mean for communities and their likelihood of paying back bonds is not a simple calculation. Ocean County continued to pay back its current debt load after Sandy, and will still have a lot of oceanfront property if its current coast is swamped. The oceanfront just won’t be in the same place.

The storms or floods "might be so severe that it’s going to wipe out the taxation ability," said Bob Buhr, a former vice president at Moody’s who retired last year as a director at Societe Generale SA. "I think this is a real risk."

In May 2016, 117 investors with $19 trillion in assets signed a statement calling for credit ratings to include "systematic and transparent consideration" of environmental and other factors. Signatories also included rating companies from China, the U.S. and elsewhere, including Moody’s and S&P.

Laskey, of Fitch, was skeptical that rating companies could or should account for climate risk in municipal ratings.

"We’re not emergency-preparedness experts," she said in a phone interview. "Unless we see reason to think, ‘Oh, they’re not paying attention,’ we assume that they’re competent, and they’re doing what they need to do in terms of preparedness."

That view is at odds with the picture painted by engineers, safety advocates and insurers. Timothy Reinhold, senior vice president for research at the Insurance Institute for Business & Home Safety, a group funded by insurers, said local officials aren’t doing enough to prepare for the threats of climate change.

"While most coastal communities and cities have weather-related disaster response plans, many older, existing structures within these communities are not as durable, or as resilient as they could and should be," Reinhold wrote in an email.

Politically Fraught

The types of actions that cities could take to reduce their risk -- including tougher building codes, fewer building permits near the coast and buying out the most vulnerable properties -- are politically fraught. And the benefits of those policies are typically years away, long after today’s leaders have retired.

The weakness of other incentives leaves the risk of a credit downgrade as one of the most effective prompts available to spur cities to deal with the threat, according to Craig Fugate, director of the Federal Emergency Management Agency under President Barack Obama.

"They need cheap money to finance government," Fugate said in a phone interview. If climate considerations meant higher interest rates, "not only will you have their attention. You’ll actually see changes."

Fugate also said rating companies were wrong to assume that cities are well prepared for climate change, or that their revenue will necessarily recover after a natural disaster.

As an example, he cited the case of Homestead, a city south of Miami that bore the worst damage from Hurricane Andrew in 1992. The city’s largest employer, Homestead Air Force Base, was destroyed in the storm; rather than rebuild it, the federal government decided to include the facility in its base closures.

Fugate said climate change increases the risk that something similar could happen to other places along the coast -- and they won’t ever be able to bounce back as Homestead eventually did.

"If that tax base does not come back," he warned, "they cannot service their debt." Asked about rating companies’ insistence that such risks are remote, Fugate scoffed. "Weren’t these the same people telling us that the subprime mortgage crisis would never happen?"

UM scientists receive grant to investigate heat trapped in Greenland's snow cover

The University of Montana

MISSOULA - A new $1.54 million grant from the National Science Foundation will fund University of Montana geoscientists as they study the deep layer of compacted snow covering most of Greenland's ice sheet.

Lead investigator Joel Harper and co-investigator Toby Meierbachtol, both from UM's Department of Geosciences, received $760,000 for the research at UM and $400,000 for chartering aircraft. A collaborator at the University of Wyoming also received $380,000 for the project.

The research will investigate the development of the snow layer covering Greenland as it melts during the summer months. Meltwater percolates into the underlying snow and refreezes to form deep layers of ice. The melting absorbs heat from the atmosphere, and the refreezing releases the heat into the ice sheet.
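The thermal effect of that refreezing is large, and a back-of-the-envelope calculation shows why. The Python sketch below uses standard textbook constants for the latent heat of fusion of water and the heat capacity of ice; the scenario is illustrative and is not taken from the UM project.

# Rough estimate of how much cold firn the refreezing of meltwater can warm.
# Constants are standard textbook values; the scenario is illustrative only.
LATENT_HEAT_FUSION_J_PER_KG = 334_000.0   # heat released when 1 kg of water freezes
ICE_HEAT_CAPACITY_J_PER_KG_K = 2_100.0    # heat needed to warm 1 kg of ice by 1 C

def firn_warmed_kg(meltwater_kg, temperature_rise_c):
    """Mass of firn (kg) that refreezing meltwater could warm by the given amount."""
    heat_released = meltwater_kg * LATENT_HEAT_FUSION_J_PER_KG
    return heat_released / (ICE_HEAT_CAPACITY_J_PER_KG_K * temperature_rise_c)

# Refreezing a single kilogram of meltwater releases enough heat to warm roughly
# 16 kg of surrounding firn by 10 C, which is why percolating meltwater is such an
# efficient way of moving atmospheric heat into the ice sheet.
print(round(firn_warmed_kg(1.0, 10.0), 1))   # ~15.9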

The icy snow builds up over years to form a layer of compacted dense snow called firn, which can reach up to 90 meters thick. The firn is a key component of the ice sheet; as its porous structure absorbs meltwater, it changes ice sheet elevation through compaction and influences heat exchanges between the ice sheet and the climate system.

Little is known about the firn layer's structure, temperature or thickness, and two contrasting theories explain how it has developed over time - one suggesting the layer still has the ability to absorb more meltwater and the other suggesting it does not. Research is critical to understanding whether future melt will percolate downward and refreeze in the empty pore space or be routed off the ice sheet, which has implications for sea level rise and heat transfer from the atmosphere to the ice sheet.

The project will quantify the structural and thermal frameworks of Greenland's firn layer by drilling boreholes through it to conduct measurements and experiments deep within the layer.

"The mixture of cold snow with scattered pockets of wet snow makes challenging drilling conditions because the cold drill tends to freeze to the wet snow," Harper said. "That's why there is so little known about this area."

The researchers have designed a new type of drill they hope will solve this problem, and they will use the data collected in the holes to guide computer modeling.

The team will visit the ice sheet six times in three years. A modified DC-3 airplane with skis and powerful engines will deliver the researchers near the top of the ice sheet, and from there they will make two long traverses with their equipment. The project will involve both graduate and undergraduate students.

Tiny shells indicate big changes to global carbon cycle

Future conditions not only stress marine creatures but also may throw off ocean carbon balance

University of California - Davis

Experiments with tiny, shelled organisms in the ocean suggest big changes to the global carbon cycle are underway, according to a study from the University of California, Davis.

For the study, published in the journal Scientific Reports, scientists raised foraminifera -- single-celled organisms about the size of a grain of sand -- at the UC Davis Bodega Marine Laboratory under future, high CO2 conditions.

These tiny organisms, commonly called "forams," are ubiquitous in marine environments and play a key role in food webs and the ocean carbon cycle.

STRESSED UNDER FUTURE CONDITIONS

After exposing them to a range of acidity levels, UC Davis scientists found that under high CO2, or more acidic, conditions, the foraminifera had trouble building their shells and making spines, an important feature of their shells.

They also showed signs of physiological stress, reducing their metabolism and slowing their respiration to undetectable levels.

This is the first study of its kind to show the combined impact of shell building, spine repair, and physiological stress in foraminifera under high CO2 conditions. The study suggests that stressed and impaired foraminifera could indicate a larger scale disruption of carbon cycling in the ocean.

OFF BALANCE

As a marine calcifier, foraminifera use calcium carbonate to build their shells, a process that plays an integral part in balancing the carbon cycle.

Normally, healthy foraminifera calcify their shells and sink to the ocean floor after they die, taking the calcite with them. This moves alkalinity, which helps neutralize acidity, to the seafloor.

When foraminifera calcify less, their ability to neutralize acidity also lessens, making the deep ocean more acidic.

But what happens in the deep ocean doesn't stay in the deep ocean.

IMPACTS FOR THOUSANDS OF YEARS

"It's not out-of-sight, out-of-mind," said lead author Catherine Davis, a Ph.D. student at UC Davis during the study and currently a postdoctoral associate at the University of South Carolina. "That acidified water from the deep will rise again. If we do something that acidifies the deep ocean, that affects atmospheric and ocean carbon dioxide concentrations on time scales of thousands of years."

Davis said the geologic record shows that such imbalances have occurred in the world's oceans before, but only during times of major change.

"This points to one of the longer time-scale effects of anthropogenic climate change that we don't understand yet," Davis said.

UPWELLING BRINGS 'FUTURE' TO SURFACE

One way acidified water returns to the surface is through upwelling, when strong winds periodically push nutrient-rich water from the deep ocean up to the surface. Upwelling supports some of the planet's most productive fisheries and ecosystems. But additional anthropogenic, or human-caused, CO2 in the system is expected to impact fisheries and coastal ecosystems.

UC Davis' Bodega Marine Laboratory in Northern California is near one of the world's most intense coastal upwelling areas. At times, it experiences conditions most of the ocean isn't expected to experience for decades or hundreds of years.

"Seasonal upwelling means that we have an opportunity to study organisms in high CO2, acidic waters today -- a window into how the ocean may look more often in the future," said co-author Tessa Hill, an associate professor in earth and planetary sciences at UC Davis. "We might have expected that a species of foraminifera well-adapted to Northern California wouldn't respond negatively to high CO2 conditions, but that expectation was wrong. This study provides insight into how an important marine calcifier may respond to future conditions, and send ripple effects through food webs and carbon cycling."

The study's other co-authors include Emily Rivest from UC Davis and Virginia Institute of Marine Science, UC Davis professors Brian Gaylord and Eric Sanford, and UC Davis associate research scientist Ann Russell.

The study was supported by the National Science Foundation and the Cushman Foundation Johanna M. Resig Fellowship.

Nagoya University researchers break down plastic waste

Nagoya University-based research team develops new highly efficient catalyst for breaking resistant chemical bonds, paving the way for easier recycling of plastic waste

Nagoya, Japan - What do proteins and Kevlar have in common? Both feature long chain molecules that are strung together by amide bonds. These strong chemical bonds are also common to many other naturally occurring molecules as well as man-made pharmaceuticals and plastics. Although amide bonds can give great strength to plastics, when it comes to their recycling at a later point, the difficulty of breaking these bonds usually prevents recovery of useful products. Catalysts are widely used in chemistry to help speed up reactions, but breaking the kinds of amide bonds found in plastics such as nylon, and in other materials, requires harsh conditions and large amounts of energy.

Building on their previous work, a research team at Nagoya University recently developed a series of organometallic ruthenium catalysts to break down even the toughest amide bonds effectively under mild conditions.

"Our previous catalysts could hydrogenate most amide bonds, but the reactions needed a long time at high temperature and high pressure. This new ruthenium catalyst can hydrogenate difficult substrates under much milder conditions," says lead author Takashi Miura.

Hydrogenation is the key step leading to breakdown of amide bonds. The catalyst features a ruthenium atom supported in an organic framework. This ruthenium atom can adsorb hydrogen and deliver it to the amide bond to initiate the breakdown. The team probed the position of hydrogen on the catalyst in the reaction pathway and modified the shape of the supporting framework. By making sure that the hydrogen molecule was in the best possible position for interaction with amide bonds, the team achieved much more effective hydrogenation.

Group leader Susumu Saito says, "The changes we made to the catalyst allowed some tricky amide bonds to be selectively cleaved for the first time. This catalyst has great potential for making designer peptides for pharmaceutics and could also be used to recover materials from waste plastics to help realize an anthropogenic chemical carbon cycle."

The article, "Multifaceted catalytic hydrogenation of amides via diverse activation of a sterically confined bipyridine-ruthenium framework" was published in Scientific Reports at DOI: 10.1038/s41598-017-01645-z

Losing sleep over climate change

University of California - San Diego

Climate change may keep you awake -- and not just metaphorically. Nights that are warmer than normal can harm human sleep, researchers show in a new paper, with the poor and elderly most affected. According to their findings, if climate change is not addressed, temperatures in 2050 could cost people in the United States millions of additional nights of insufficient sleep per year. By 2099, the figure could rise by several hundred million more nights of lost sleep annually.

The study was led by Nick Obradovich, who conducted much of the research as a doctoral student in political science at the University of California San Diego. He was inspired to investigate the question by the heat wave that hit San Diego in October of 2015. Obradovich was having trouble sleeping. He tossed and he turned, the window AC in his North Park home providing little relief from the record-breaking temperatures. At school, he noticed that fellow students were also looking grumpy and bedraggled, and it got him thinking: Had anyone looked at what climate change might do to sleep?

Published by Science Advances, the research represents the largest real-world study to date to find a relationship between reports of insufficient sleep and unusually warm nighttime temperatures. It is the first to apply the discovered relationship to projected climate change.

"Sleep has been well-established by other researchers as a critical component of human health. Too little sleep can make a person more susceptible to disease and chronic illness, and it can harm psychological well-being and cognitive functioning," Obradovich said. "What our study shows is not only that ambient temperature can play a role in disrupting sleep but also that climate change might make the situation worse by driving up rates of sleep loss."

Obradovich is now a postdoctoral fellow at Harvard's Kennedy School of Government and a research scientist at the MIT Media Lab. He is also a fellow of the Center for Marine Biodiversity and Conservation at UC San Diego's Scripps Institution of Oceanography. Obradovich worked on the study with Robyn Migliorini, a student in the San Diego State University/UC San Diego Joint Doctoral Program in Clinical Psychology, and sleep researcher Sara Mednick of UC Riverside. Obradovich's dissertation advisor, social scientist James Fowler of UC San Diego, is also a co-author.

The study starts with data from 765,000 U.S. residents between 2002 and 2011 who responded to a public health survey, the Behavioral Risk Factor Surveillance System from the Centers for Disease Control and Prevention. The study then links data on self-reported nights of insufficient sleep to daily temperature data from the National Centers for Environmental Information. Finally, it combines the effects of unusually warm temperatures on sleep with climate model projections.

The main finding is that anomalous increases in nighttime temperature by 1 degree Celsius translate to three nights of insufficient sleep per 100 individuals per month. To put that in perspective: If we had a single month of nightly temperatures averaging 1 degree Celsius higher than normal, that is equivalent to 9 million more nights of insufficient sleep in a month across the population of the United States today, or 110 million extra nights of insufficient sleep annually.
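That scaling is easy to reproduce. The short Python sketch below follows the article's arithmetic; the population figure is an assumption of roughly the 2017 U.S. population, not a number taken from the paper.

# Reproduce the article's back-of-the-envelope scaling.
# The population figure is an assumption for illustration, not from the study.
NIGHTS_PER_100_PEOPLE_PER_MONTH = 3       # effect of a month averaging +1 C at night
US_POPULATION = 305_000_000               # assumed, roughly the 2017 U.S. population

extra_nights_per_month = NIGHTS_PER_100_PEOPLE_PER_MONTH / 100 * US_POPULATION
extra_nights_per_year = extra_nights_per_month * 12   # if every month ran +1 C warmer

print(round(extra_nights_per_month / 1e6, 1))   # ~9.2 million nights in one such month
print(round(extra_nights_per_year / 1e6))       # ~110 million nights per year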

The negative effect of warmer nights is most acute in summer, the research shows. It is almost three times as high in summer as during any other season.

The effect is also not spread evenly across all demographic groups. Those whose income is below $50,000 and those who are aged 65 and older are affected most severely. For older people, the effect is twice that of younger adults. And for the lower-income group, it is three times worse than for people who are better off financially.

Using climate projections for 2050 and 2099 by NASA Earth Exchange, the study paints a bleak picture of the future if the relationship between warmer nights and disrupted sleep persists. Warmer temperatures could cause six additional nights of insufficient sleep per 100 individuals by 2050 and approximately 14 extra nights per 100 by 2099.

"The U.S. is relatively temperate and, in global terms, quite prosperous," Obradovich said. "We don't have sleep data from around the world, but assuming the pattern is similar, one can imagine that in places that are warmer or poorer or both, what we'd find could be even worse."

The research was supported in part by the National Science Foundation, grants no. DGE0707423 and TG-SES130013 to Obradovich, DGE1247398 to Migliorini, and BCS1439210 to Mednick. Mednick is also funded by the National Institute on Aging (R01AG046646) and the Department of Defense (Office of Naval Research Young Investigator Award).

As a student, Obradovich was a Gramercy Fellow in the UC San Diego Division of Social Sciences.

The authors also acknowledge the assistance of the San Diego Supercomputer Center and the Human Nature Group at UC San Diego.


