MAKING OUR GREEN ECONOMY COME ALIVE AND THRIVE™


ENVIRONMENTAL NEWS

Remember to also click on and check out the "Green Local 175 News" for environmental and green news within a 175-mile radius of Utica-Rome. Organizations and companies can send their press releases to info@greenlocal175.com. Express your viewpoint on a particular issue by sending an e-mail to our opinion page, "Urge to be Heard". Lastly, for more environmental information, listen to our Green Local 175 Radio & Internet Show.


France to revise carbon emissions target after missing 2016 goal

PARIS (Reuters) - France will revise its carbon emissions target by the end of this year to align it with its pledges in the Paris climate agreement after failing to meet the goal to cut greenhouse gas emissions in 2016, the ecology minister said on Monday.

France is leading efforts to keep the momentum going on the landmark 2015 climate agreement, but its carbon dioxide emissions rose 3.6 percent over the targeted 447 million tonnes of CO2 emissions equivalent, the ministry said.

Ecology Minister Nicolas Hulot said in the statement that the revision of France’s low-carbon strategy to set new targets will take into account the plan to further reduce emissions and make France carbon neutral by 2050.

Hulot said he will present measures at the end of the month as part of the strategic revision, which will help accelerate France’s energy transition.

He added that policies in the transport, housing and forestry sectors, where emissions targets were largely missed in 2016, would need to be substantially reinforced if France were to meet its emissions target.

Austria to sue EU over allowing expansion of Hungary nuclear plant

Shadia Nasralla

VIENNA (Reuters) - Austria is planning to sue the European Commission for allowing Hungary to expand its Paks atomic plant, it said on Monday, not viewing nuclear energy as the way to combat climate change or as being in the common European interest.

The country, which shares a border with Hungary, prides itself on supporting environmentally sound energy. It has for decades opposed nuclear power, which triggers huge disagreements about cleanliness, safety, and renewability.

The anti-nuclear position was reiterated in a coalition agreement struck last month between Chancellor Sebastian Kurz’s conservatives and the far-right Freedom Party.

“We in the government have agreed that there are sufficient reasons to sue (the Commission),” a spokesman for Austrian Sustainability Minister Elisabeth Koestinger said.

“EU assistance is only permissible when it is built on common interest. For us, nuclear energy is neither a sustainable form of energy supply, nor is it an answer to climate change.”

Last March, EU state aid regulators approved Hungary’s plan to build two new reactors at its Paks nuclear site with the help of Russia’s Rosatom, saying Hungarian authorities had agreed to several measures to ensure fair competition.

The two new reactors will double the plant’s nominal capacity of 2,000 megawatts. Hungary aims to start construction on the reactors this year, with the first facility expected to be completed in 2025.

“The Paks nuclear plant is the guarantee for providing a cheap, reliable and safe supply of electricity to Hungarian people and businesses,” Hungarian Prime Minister Viktor Orban’s office said in a statement.

“Therefore, the Hungarian government will stick to its plan to ensure the maintenance of capacity at the Paks plant,” it said, adding that the lawsuit would not affect work on the project.

The deadline for filing a suit to challenge the executive EU Commission’s decision at the European Court of Justice is Feb. 25, the spokesman for Austria’s Koestinger said.

In a majority of such complex cases, the European Court of Justice in Luxembourg has found in favor of the Commission.

Austria launched a similar legal action against the European Commission in 2015 over its backing of British plans for a 16 billion pound ($22.24 billion) development of the Hinkley Point nuclear power plant.

($1 = 0.7195 pounds)

Reporting By Shadia Nasralla; additional reporting by Krisztina Than in Budapest; Editing by Gareth Jones/Jeremy Gaunt

Iowa city moving towards goal of energy independence

Written By Karen Uhlenhuth

January 19, 2018

The city of Bloomfield, Iowa is at least 10 percent of the way toward its pledge to reach energy independence by 2030 with the recent completion of a 1.86-megawatt solar array.

The city turned on the solar array Dec. 29, just ahead of a Dec. 31 deadline for collecting a state production tax credit, which might reduce the city’s costs by as much as 8 percent.

“We’re still tweaking it,” said Chris Ball, the city’s energy efficiency director.

Ball estimated that the array, which tracks the sun throughout the day to maximize total energy production, will provide about 10 percent of the power used by the city’s 2,600 residents. The city chose a more costly tracking system because it is expected to produce 18 percent more energy than a stationary system and because it will produce during the utility’s peak times — late afternoon in the summer and morning in the winter.

In 2015, after an encouraging analysis by energy consultants and the Iowa Association of Municipal Utilities, the Bloomfield City Council adopted a goal of attaining energy self-sufficiency within 15 years. Ball expects the city will eventually produce much more of the power it requires.

The energy researchers studied Bloomfield and another Iowa city, Algona, to evaluate their potential for energy self-sufficiency. They concluded Algona could meet half of its electric needs with locally produced renewable energy, and that Bloomfield could meet 100 percent through a combination of efficiency, direct load control, and renewable power. The researchers pointed out that municipal utilities typically are freer than investor-owned utilities to experiment because they don’t have the same pressure to maintain or increase sales of electricity.

Iowa’s Economic Development Authority financed most of the initial study and recently awarded Bloomfield $100,000 in federal money to help the city move forward and support creating a template other communities could follow to take more control of their energy needs.

Bloomfield “has accomplished a lot in a very short time frame following the study that first showed it’s possible,” said Jeff Geerts, the economic development authority’s special projects manager. “It’s amazing.”

But the city is at a juncture where it needs to strategize about the best path forward, he said. The study presented several options including varying levels of investment in renewable generation. Some large questions remain unanswered, Geerts said.

“The discussion would be how much wind, how much solar, who owns how much wind and how much solar? And is it owned by the city or entities outside the city? How much is distributed versus how much is community wind or solar? And how much is achieved through energy-efficiency measures and what is the role of the wastewater-treatment plant? Is there an opportunity not only to reduce energy use there, but how does that potentially become a generation source?”

In short, “What’s the mix look like, and what are the policies to make it happen?”

The study put more emphasis on reducing energy use than it did on developing local renewable generation, and efficiency improvements have been happening throughout the city with the help of a team of 14 workers from Americorps and Vista. Last summer they knocked on the door of every home in town — at least three times, according to Ball — and offered to replace up to 10 incandescent bulbs with LEDs. Ball said they swapped out bulbs in about 450 out of 1,100 homes. Workers also replaced incandescent bulbs with LEDs throughout the city’s elementary and middle schools.

“We’re saving them about $2,000 a month that they can spend on teachers and books,” Ball said.

The same workers performed energy audits and some basic weatherization work in some homes. Although most of the workers have returned to college, two remain and are developing a nonprofit organization aimed at keeping the energy-efficiency improvements coming.

The city government has enhanced its own operations by upgrading nearly all of its light fixtures to LEDs, adding insulation and weather sealing, and getting more systematic about maintenance tasks such as replacing filters in heating and cooling systems. Staff members inventoried the city’s 15 buildings and made a list of needed efficiency upgrades to be done in coming years.

Ball said they are considering a liquid cover for the swimming pool, which would reduce evaporation and keep heat in the pool.

The Davis County Community School District, which serves Bloomfield, is also pursuing energy efficiency along with a broader agenda of sustainability. Led by an AmeriCorps worker, Paul Fleetwood, students and staff are devising a way to compost organic waste.

Danny Roberts, the district’s director of support services, is talking with four solar contractors about putting arrays on the district’s larger buildings. His aim is to generate as much solar power as they use in a year.

“We know where we want to be,” he said, “but getting there, we’re not quite sure how.”

The district is also looking at various strategies for cutting energy use in schools. It would like to install motion sensors in middle and elementary schools to automate lights and heating and cooling, as was done in the high school in 2009.

And then there’s the human factor. Over Christmas break, Fleetwood led a team of students through the district’s buildings, counting the number of machines eating electricity “that didn’t need to be” while the buildings were empty.

The count: 771.

Getting people habituated to unplugging them at appropriate times is also on Fleetwood’s to-do list.

“We’re helping them to understand the effects of wasted energy on the environment, and the cost of energy waste. We’re talking about how students make choices that can either be destructive or supportive of what we’re trying to achieve as a school system,” he said.

Much remains for the district and city to accomplish to reach the 2030 goal. Geerts, with the state’s economic development agency, is confident the lofty goal is within reach.

“There’s a lot of momentum in that community, a lot of support, a lot of optimism for achieving it,” Geerts said.

Peru passes law allowing roads through pristine Amazon rainforest

New roads in border areas could affect five indigenous reserves

Dan Collyns in Lima @yachay_dc

Mon 22 Jan 2018 15.18 EST, Last modified on Mon 22 Jan 2018 16.49 EST

Peru has approved a law that would allow roads to be built in the most remote and pristine region of its Amazon rainforest, a haven for isolated indigenous groups and an area of primary forest rich in mahogany trees.


The law, which declares the construction of roads in border zones of "national priority and interest", was announced in Peru’s official gazette just hours after Pope Francis ended a visit to the country in which he warned that the Amazon and its peoples had never been so under threat.

In an address in the jungle city of Puerto Maldonado on Friday, the pope railed against “pressure being exerted by big business interests” which were destroying a natural habitat vital for the entire planet.

But the law which promotes the construction of roads in Purus – an Amazon region near the border with Brazil – had already been approved by Peru’s congress and passed into law on Monday after no objections were raised by the country’s executive. The area encompasses four national parks and could affect five reserves for indigenous peoples living in “voluntary isolation”.

“The government clearly hasn’t reflected on the pope’s words,” said Lizardo Cauper, head of Peru’s federation of native Amazon peoples, Aidesep.

“These projects don’t benefit indigenous people. This is an area with isolated people who are extremely vulnerable,” he told the Guardian.

“Roads bring outsiders who traffic our land, log our timber, as well as drug traffickers and illegal miners,” he added.

The law contravenes several international commitments made by Peru including climate change pledges and trade agreements with the US and Europe.

“We have a state that makes decisions with its back to the Amazon and its indigenous peoples,” said Iván Lanegra, a former minister for indigenous affairs.

Julia Urrunaga, Peru director for the Environmental Investigation Agency, said 95% of deforestation happens less than 6km from a road, adding that the new law contradicted a court ruling that declared the protection of the forest in the national interest.

The network of roads, including the main 172-mile (277km) highway connecting Puerto Esperanza and Iñapari on the Brazilian border, could result in the deforestation of 2,750 sq km, according to satellite mapping by the Monitoring of the Andean Amazon Project.

“This law makes a mockery of Peru’s climate change commitments and the recent visit by the pope,” said Laura Furones of Global Witness.

Putting the ‘farm’ back in solar farms: Study to test ag potential at PV sites

Written By Frank Jossi

Minnesota will be included in a study to help federal researchers test the potential of pollinator-friendly habitat and fruit and vegetable crops around solar arrays.

The National Renewable Energy Laboratory (NREL) will plant vegetation this year at three Minnesota solar installations owned by Enel Green Power. The sites are among 15 around the country that will be part of the research project.

“We want to figure out what plants and what type of species will thrive, and how, and why, in these environments,” said Jordan Macknick, an NREL analyst. The plants can provide economic and environmental benefits over covering the ground with gravel or turf grass, as is often done.

The stakes for both the industry and environment will only grow as the amount of land used for solar projects also expands. NREL predicts 3 million acres will be devoted to solar farms by 2030, and 6 million by 2050.

In Minnesota, researchers will test vegetation on three 1-acre plots within Enel’s 100-megawatt Aurora project, which includes installations at several locations in the state. A third of each plot will be devoted to prairie plants and grasses. Another third will contain pollinator-friendly habitat. And the remaining third will feature a mix of agricultural crops.

Minnesota is already a leader in pollinator-friendly solar, having passed a 2016 law that encourages developers to follow siting guidelines and incorporate native grasses and wildflowers. Fresh Energy, which publishes Midwest Energy News, was involved in advocating for the pollinator-friendly solar legislation.

“There’s been a lot of exciting things happening in Minnesota and I feel we’re ahead of the game,” said Megan Benage, a regional ecologist with the Minnesota Department of Natural Resources.

The idea of growing crops among solar installations is newer but not unprecedented. Macknick said researchers in other states have grown beans, kale, melon, squash, lettuce, peppers and broccoli under or near solar panels. Crops could be harvested by hand or by smaller farm equipment, he said.

Minnesota’s solar pollinator standard is voluntary, but state utility regulators require pollinator-friendly vegetation at developments larger than 50 megawatts. As a result, half of the state’s 4,000 acres of solar farms have pollinator habitats, Benage said. A commercial beekeeping company, Bolton Bees, even operates on a solar garden near the Twin Cities and sells a line of products called “solar honey.”

NREL’s research project, known as Innovative Site Preparation and Impact Reductions on the Environment (InSPIRE), might reveal new opportunities to add value at solar farms. It could also help make the case that vegetation can lower the cost of developing and maintaining solar projects, Macknick said.

Site preparation accounts for about 20 percent of the cost of solar projects, Macknick said. Developers typically clear all vegetation before leveling the site and putting down gravel or turf prior to panel installation. Leaving existing vegetation in place could reduce the cost of land clearing, soil compaction, stormwater management and herbicide spraying.

“We tell developers if they vegetate the site it will be easier for them to manage in the long term,” Benage said. “The neighbors like it more because the site is showier, with more flowers and more interest than a mowed lawn – unless you’re a fan of mowed lawns.”

And the right vegetation can even improve production of solar panels by lowering the temperature.

“Having vegetation underneath the panels reduces the temperature of the panels,” Macknick said. “Panels with cooler temperatures have increased solar output.”

Travis Bolton, of Bolton Bees, points out that better environments will result in more resilient bees. In Minnesota, which lost half its bee population a couple of years ago, that’s an important virtue.

Any effort to create pollinator-friendly habitat is welcome, said Minnesota’s best-known bee advocate, MacArthur Foundation Fellow and University of Minnesota Distinguished McKnight Prof. Marla Spivak.

“These plantings can be great for bees,” Spivak wrote in an email. “If diverse, native flowers are planted, they are great for native bees, and somewhat helpful for honey bees. If cover crops with legumes are planted, they are great for honey bees and somewhat for native bees.”

2017 was the warmest year on record for the global ocean

Institute of Atmospheric Physics, Chinese Academy of Sciences

The oceans in the upper 2000 m were 1.51 × 10²² J warmer than the second warmest year of 2015 and 19.19 × 10²² J above the 1981-2010 climatological reference period. Owing to its large heat capacity, the ocean accumulates the warming derived from human activities; indeed, more than 90% of Earth's residual heat related to global warming is absorbed by the ocean. As such, the global ocean heat content record robustly represents the signature of global warming and is impacted less by weather-related noise and climate variability such as El Niño and La Niña events. According to the IAP ocean analysis, the last five years have been the five warmest years in the ocean. Therefore, the long-term warming trend driven by human activities continued unabated.

The increase in ocean heat content for 2017 occurred in most regions of the world. The human greenhouse gas footprint continues to impact the Earth system. Increases in ocean temperature cause ocean volume expansion, which contributes to global mean sea level rise. The increase in ocean heat of 1.51 × 10²² J in 2017 resulted in a 1.7 mm rise in global sea level. Other consequences include declining ocean oxygen, bleaching of coral reefs, and melting sea ice and ice shelves.
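As a rough back-of-the-envelope check of that last figure, the short sketch below estimates thermal-expansion sea level rise from the reported heat gain. Only the heat value comes from the article; the expansion coefficient, seawater density, specific heat and ocean surface area are generic textbook values assumed for illustration, not numbers from the IAP analysis.

```python
# Rough sanity check of the thermal-expansion figure quoted above.
# Only Q (the 2017 heat gain) comes from the article; every other value is an
# assumed, generic constant, not a number from the IAP analysis.

Q = 1.51e22       # J, 2017 ocean heat gain in the upper 2000 m (from the article)
alpha = 1.5e-4    # 1/K, assumed mean thermal expansion coefficient of seawater
rho = 1025.0      # kg/m^3, assumed mean seawater density
cp = 3990.0       # J/(kg K), assumed specific heat of seawater
area = 3.6e14     # m^2, assumed global ocean surface area

# Warming of a column of depth H is dT = Q / (rho * cp * area * H); the expansion
# is alpha * dT * H, so H cancels and the rise is alpha * Q / (rho * cp * area).
rise_mm = 1000 * alpha * Q / (rho * cp * area)
print(f"Estimated thermal-expansion sea level rise: {rise_mm:.1f} mm")
# ~1.5 mm, the same order as the 1.7 mm reported above
```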

Thorium reactors may dispose of enormous amounts of weapons-grade plutonium

Power, thermal energy and desalinated water from weapons-grade plutonium

Tomsk Polytechnic University

Scientists from the School of Nuclear Science & Engineering of Tomsk Polytechnic University are developing a technology enabling the creation of high-temperature gas-cooled low-power reactors with thorium fuel. TPU scientists propose to burn weapons-grade plutonium in these units, converting it into power and thermal energy. Thermal energy generated at thorium reactors may be used in industrial hydrogen production. The technology also makes it possible to desalinate water.

The results of the study were published in Annals of Nuclear Energy.

Thorium reactors can be deployed in areas without large bodies of water or rivers, whose presence is an obligatory condition for building a classical reactor. For example, they can be used in arid areas, as well as in remote areas of Siberia and the Arctic.

Associate Professor Sergey Bedenko from the School of Nuclear Science & Engineering explains: 'As a rule, a nuclear power plant is constructed on the riverside. Water is taken from the river and used in the active zone of the reactor for cooling. In thorium reactors, helium is applied instead of water, as well as carbon dioxide (CO2) or hydrogen. Thus, water is not required.'

The mixture of thorium and weapons-grade plutonium is the fuel for the new kind of reactors.

Sergey Bedenko continues: 'Large amounts of weapons-grade plutonium were accumulated in the Soviet era. The cost of storing this fuel is enormous, and it needs to be disposed of. In the US, it is chemically processed and burned, and in Russia, it is burned in reactors. However, some amount of plutonium still remains, and it needs to be disposed of in radioactive waste landfills. Our technology addresses this drawback, since it allows burning 97% of weapons-grade plutonium. When all weapons-grade plutonium is disposed of, it will be possible to use uranium-235 or uranium-233 in thorium reactors.'

Notably, the plant is capable of operating at low capacity (from 60 MW), the cores of thorium reactors require little fuel, and the percentage of fuel burnup is higher than at currently used reactors. The remaining 3% of processed weapons-grade plutonium will no longer present a nuclear hazard. At the output, a mixture of graphite, plutonium and decay products is formed, which is difficult to apply for other purposes. These wastes can only be buried.

Sergey Bedenko summarizes: 'The main advantage of such plants will be their multi-functionality. Firstly, we efficiently dispose of one of the most dangerous radioactive fuels in thorium reactors, secondly, we generate power and heat, thirdly, with its help, it will be possible to develop industrial hydrogen production.'

The authors of the study note that the advantages of such reactors include a higher level of safety in comparison with traditional designs, enhanced efficiency (up to 40-50%), absence of phase transitions of the coolant, increased corrosion resistance of working surfaces, the possibility of using different fuels and reloading them during operation, and simplified management of spent nuclear fuel.

Thorium fuel can be used both in thorium reactors and in the widely used VVER-1000 reactors. The scientists expect these reactors to operate for at least 10-20 years, and when this fuel is spent, the reactor core may either be reloaded or disposed of.

In addition, water can be desalinated at thorium reactors.

Microwaves could be as bad for the environment as cars, suggests new research

Manchester UK (SPX)

Microwave usage across the EU alone emits as much carbon dioxide as nearly seven million cars, according to a new study by The University of Manchester.

Researchers at the University have carried out the first ever comprehensive study of the environmental impacts of microwaves, considering their whole life cycle, from 'cradle to grave'. Microwaves account for the largest percentage of sales of all types of ovens in the European Union (EU), with numbers set to reach nearly 135 million by 2020. Despite this, the scale of their impacts on the environment was not known until now.

The study used life cycle assessment (LCA) to estimate the impacts of microwaves, taking into account their manufacture, use and end-of-life waste management. Altogether, the research team investigated 12 different environmental factors, including climate change, depletion of natural resources and ecological toxicity. They found, for example, that the microwaves used across the EU emit 7.7 million tonnes of carbon dioxide equivalent per year. This is equivalent to the annual emission of 6.8 million cars.

The research shows that the main environmental 'hotspots' are materials used to manufacture the microwaves, the manufacturing process and end-of-life waste management. For example, the manufacturing process alone contributes more than 20% to depletion of natural resources and to climate change.

However, it is electricity consumption by microwaves that has the biggest impact on the environment, taking into account the whole life cycle, from production of fuels to generation of electricity. In total, microwaves across the EU consume an estimated 9.4 terawatt hours (TWh) of electricity every year. This is equivalent to the annual electricity generation of three large gas power plants.

The study found that, on average, an individual microwave uses 573 kilowatt hours (kWh) of electricity over its lifetime of eight years. That is equivalent to the electricity consumed by a 7-watt LED light bulb left on continuously for almost nine years. This is despite the fact that microwaves spend more than 90% of their lifetime idle, in stand-by mode.
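The LED equivalence quoted above is easy to verify from the two figures given (573 kWh lifetime consumption, a 7 W bulb); the sketch below simply runs that arithmetic.

```python
# Quick arithmetic check of the LED comparison above, using only the two figures
# stated in the article (573 kWh lifetime use, a 7 W LED bulb).
lifetime_energy_kwh = 573            # kWh used by one microwave over ~8 years
led_power_kw = 7 / 1000              # 7 W LED bulb expressed in kW

hours = lifetime_energy_kwh / led_power_kw   # hours the bulb could run on that energy
years = hours / (24 * 365)

print(f"{hours:,.0f} hours, or about {years:.1f} years of continuous use")
# ~81,857 hours, roughly 9.3 years, consistent with "almost nine years"
```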

The study's authors suggest that efforts to reduce consumption should focus on improving consumer awareness and behaviour to use appliances more efficiently. For example, electricity consumption by microwaves can be reduced by adjusting the time of cooking to the type of food.

Waste is another major problem. Due to their relatively low cost and ease of manufacture, consumers are throwing more electrical and electronic (EE) equipment away than ever before, including microwaves.

In 2005, across the EU, 184,000 tonnes of EE waste was generated from discarded microwaves. By 2025 this is estimated to rise to 195,000 tonnes, or 16 million individual units being sent for disposal.

Dr Alejandro Gallego-Schmid, from the School of Chemical Engineering and Analytical Science, explains: 'Rapid technological developments and falling prices are driving the purchase of electrical and electronic appliances in Europe.

'Consumers now tend to buy new appliances before the existing ones reach the end of their useful life as electronic goods have become fashionable and 'status' items.

'As a result, discarded electrical equipment, such as microwaves, is one of the fastest growing waste streams worldwide.'

Another major contributing factor to the waste is the reduced lifespan of microwaves, now nearly seven years shorter than it was almost 20 years ago. Research shows that a microwave's lifespan has decreased from around 10 to 15 years in the late 1990s to between six and eight years today.

Dr Gallego-Schmid added: 'Given that microwaves account for the largest percentage of sales of all types of ovens in the EU, it is increasingly important to start addressing their impact on resource use and end-of-life waste.'

The study also shows that existing regulation will not be sufficient to reduce the environmental impacts of microwaves. It recommends that it will be necessary to develop specific regulations for these devices targeting their design. This will help to reduce the amount of resources used to make microwaves and waste generated at the end of their lifetime.

The study found:

+ Microwaves emit 7.7 million tonnes of carbon dioxide equivalent per year in the EU. This is equivalent to the annual emissions of 6.8 million cars.

+ Microwaves across the EU consume an estimated 9.4 terawatt hours (TWh) of electricity every year. This is equivalent to the annual electricity generated by three large gas power plants.

+ Efforts to reduce consumption should focus on improving consumer awareness and behaviour to use appliances more efficiently

A moss that removes lead from water

Tokyo, Japan (SPX)

Researchers at the RIKEN Center for Sustainable Resource Science (CSRS) in Japan have demonstrated that moss can be a green alternative for decontaminating polluted water and soil. Published in PLOS ONE, the study shows that, in particular, the moss Funaria hygrometrica tolerates and absorbs an impressive amount of lead (Pb) from water.

Lead-contaminated water is a serious environmental concern that has recently proved to be disastrous when left untreated. Compounding the problem, the typical way to remove lead or other heavy metals from water requires fossil fuels and a tremendous amount of energy. As an alternative to these typical processes, phytoremediation is a method that uses photosynthesizing organisms to clean up soil or water contamination.

The CSRS researchers began their search for a phytoremediation-based removal method by looking at F. hygrometrica, a moss that is known to grow well in sites contaminated with metals like copper, zinc, and lead.

"We found that the moss can function as an excellent lead absorbent when in the protonema stage of development," says first author Misao Itouga.

"This valuable ability means that moss protonema will likely make exceptional wastewater cleaners in mining and chemical industries."

To characterize the metal-absorbing ability of the moss, the team first prepared solutions with varying concentrations of 15 different metals and exposed them to F. hygrometrica protonema. After 22 hours of exposure, mass-spectrometer analysis showed that the moss cells had absorbed lead up to 74% of their dry weight, which is quite high and much higher than any of the other metals.

Knowing where the lead accumulates is important for understanding how absorption occurs and for developing the most efficient phytoremediation. Analysis showed that within the moss protonema cells, more than 85% of the lead had accumulated in the cell walls, with smaller amounts found in organelle membranes and inside the chloroplasts where photosynthesis occurs.

Focusing on the cell walls, the team found that they absorbed lead even after being removed from living moss. This means that there is something special about the cell walls of this species of moss that allows them to thrive in environments that are toxic to other plants.

Analysis with two-dimensional nuclear magnetic resonance indicated that polygalacturonic acid in the cell walls was responsible for absorbing the lead.

"We compared F. hygrometrica data with those from land plants and seaweeds", explains Itouga, "and found that the presence of polygalacturonic acid in the cell wall is one of the characteristics that separated this type of moss from other plants."

They next determined that the protonema cells absorbed lead well at pH values between 3 and 9, which is important because the acidity of metal-polluted water can vary.

"Our findings show that F. hygrometrica is a useful bio-material for recovering lead from aqueous solutions," says Group leader Hitoshi Sakakibara.

"and will contribute to the Sustainable Development Goals set by the United Nations, specifically the Life on Land goal. We are currently exploring opportunities to work with recycling-oriented companies."

Itouga M, Hayatsu M, Sato M, Tsuboi Y, Kato Y, Toyooka K, et al. (2017) Protonema of the moss Funaria hygrometrica can function as a lead (Pb) adsorbent. PLOS ONE 12(12): e0189726. doi: 10.1371/journal.pone.0189726

The wave power farm off Mutriku needs to improve its efficiency

Bilbao, Spain (SPX)

The offshore power plant or wave farm at Mutriku is the only commercial facility (it is not a prototype) in the world that operates by regularly feeding the grid with electrical power produced by waves. It has been operating since 2011 and the study by the UPV/EHU's EOLO group analysed its behaviour during the 2014-2016 period.

"It is important to find out how the wave power farm is actually performing, to analyse how the technology used is behaving, and to observe what deficiencies or advantages it has in order to help to improve it," said Gabriel Ibarra-Berastegi, the lead author of the study.

"Extracting energy from the waves is in its early stages and various types of devices and technologies are currently being developed. They include the OWC (Oscillating Water Column) technology used at Mutriku," he added.

In OWC technology the turbines are not driven by the waves directly but by a mass of compressed air. It is a structure in which the upper part forms an air chamber and the lower part is submerged in the water. That way the turbine takes advantage of the movement produced by the wave both when it advances and recedes, and the generator to which it is coupled feeds the power into the grid.

"The turbines generate electricity which is regularly sold to the electricity grid. In the case of Mutriku this happens 75% of the time. The plant shuts down from time to time when the waves are very calm or even when they are too rough," explained Ibarra.

The research focussed on the study and analysis of the operational data provided by the Basque Energy Agency, which manages the plant. After analysing and ordering these data, "we saw that one output indicator is the Capacity Factor (CF), which allows different technologies for producing electricity to be compared," explained the lead researcher of the article.

"In this case we calculated the CF of the Mutriku farm and its value is 0.11, while wind energy facilities have a CF in the region of 0.2-0.3, and solar plants 0.4. That indicates that the OWC technology at Mutriku needs to improve its CF to put it on a par with the values of the remaining renewable energy sources," said Ibarra.
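To make the Capacity Factor comparison concrete, here is a minimal sketch of the calculation. The rated capacity used below (296 kW) is the figure commonly reported for the Mutriku plant but is not stated in this article, so treat it, and the back-solved annual output, as assumptions for illustration only.

```python
# Minimal capacity-factor sketch. The rated capacity (296 kW) is an assumed value
# commonly reported for Mutriku, not a number given in this article; the annual
# energy is back-solved from the CF of 0.11 quoted above.

def capacity_factor(energy_kwh: float, rated_kw: float, hours: float) -> float:
    """Energy actually delivered divided by the energy a plant running flat-out would deliver."""
    return energy_kwh / (rated_kw * hours)

rated_kw = 296.0                 # assumed rated capacity of the wave farm
hours_per_year = 8760.0
annual_energy_kwh = 0.11 * rated_kw * hours_per_year   # implied by CF = 0.11

print(f"CF = {capacity_factor(annual_energy_kwh, rated_kw, hours_per_year):.2f}")  # 0.11
print(f"Implied annual output: {annual_energy_kwh / 1000:.0f} MWh")                # ~285 MWh
```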

"We believe that the way to achieve this is to improve the regulation and control of the speed at which the turbines rotate, in other words, to properly manage the speed at which the turbine rotates in relation to the advancing waves," he concluded.

According to Gabriel Ibarra, "these conclusions drawn from the data on a real farm like that of Mutriku constitute a step forward whereby it is possible to focus and identify the next steps to be taken so that OWC technology can attain a level of maturity, thus facilitating the introduction and deployment of these farms".

G. Ibarra-Berastegi, J. Sáenz, A. Ulazia, P. Serras, G. Esnaola, C. García-Soto (2018). "Electricity production, capacity factor, and plant efficiency index at the Mutriku wave farm (2014-2016)". Ocean Engineering, Vol. 147, pages 20-29.

"US Hits Record For Costly Weather Disasters: $306 Billion"

"WASHINGTON — With three strong hurricanes, wildfires, hail, flooding, tornadoes and drought, the United States tallied a record high bill last year for weather disasters: $306 billion.

The U.S. had 16 disasters last year with damage exceeding a billion dollars, the National Oceanic and Atmospheric Administration said Monday. That ties 2011 for the number of billion-dollar disasters, but the total cost blew past the previous record of $215 billion in 2005.

Costs are adjusted for inflation and NOAA keeps track of billion-dollar weather disasters going back to 1980.

Three of the five most expensive hurricanes in U.S. history hit last year."

NASA study: First direct proof of ozone hole recovery due to chemicals ban

NASA/GODDARD SPACE FLIGHT CENTER

For the first time, scientists have shown through direct satellite observations of the ozone hole that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion.

Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing manmade chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005 -- the first year that measurements of chlorine and ozone during the Antarctic winter were made by NASA's Aura satellite.

"We see very clearly that chlorine from CFCs is going down in the ozone hole, and that less ozone depletion is occurring because of it," said lead author Susan Strahan, an atmospheric scientist from NASA's Goddard Space Flight Center in Greenbelt, Maryland.

CFCs are long-lived chemical compounds that eventually rise into the stratosphere, where they are broken apart by the Sun's ultraviolet radiation, releasing chlorine atoms that go on to destroy ozone molecules. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plant life.

Two years after the discovery of the Antarctic ozone hole in 1985, nations of the world signed the Montreal Protocol on Substances that Deplete the Ozone Layer, which regulated ozone-depleting compounds. Later amendments to the Montreal Protocol completely phased out production of CFCs.

Past studies have used statistical analyses of changes in the ozone hole's size to argue that ozone depletion is decreasing. This study is the first to use measurements of the chemical composition inside the ozone hole to confirm that not only is ozone depletion decreasing, but that the decrease is caused by the decline in CFCs.

The study was published in the journal Geophysical Research Letters.

The Antarctic ozone hole forms during September in the Southern Hemisphere's winter as the returning sun's rays catalyze ozone destruction cycles involving chlorine and bromine that come primarily from CFCs. To determine how ozone and other chemicals have changed year to year, scientists used data from the Microwave Limb Sounder (MLS) aboard the Aura satellite, which has been making measurements continuously around the globe since mid-2004. While many satellite instruments require sunlight to measure atmospheric trace gases, MLS measures microwave emissions and, as a result, can measure trace gases over Antarctica during the key time of year: the dark southern winter, when the stratospheric weather is quiet and temperatures are low and stable.

The change in ozone levels above Antarctica from the beginning to the end of southern winter -- early July to mid-September -- was computed daily from MLS measurements every year from 2005 to 2016. "During this period, Antarctic temperatures are always very low, so the rate of ozone destruction depends mostly on how much chlorine there is," Strahan said. "This is when we want to measure ozone loss."

They found that ozone loss is decreasing, but they needed to know whether a decrease in CFCs was responsible. When ozone destruction is ongoing, chlorine is found in many molecular forms, most of which are not measured. But after chlorine has destroyed nearly all the available ozone, it reacts instead with methane to form hydrochloric acid, a gas measured by MLS. "By around mid-October, all the chlorine compounds are conveniently converted into one gas, so by measuring hydrochloric acid we have a good measurement of the total chlorine," Strahan said.

Nitrous oxide is a long-lived gas that behaves just like CFCs in much of the stratosphere. The CFCs are declining at the surface but nitrous oxide is not. If CFCs in the stratosphere are decreasing, then over time, less chlorine should be measured for a given value of nitrous oxide. By comparing MLS measurements of hydrochloric acid and nitrous oxide each year, they determined that the total chlorine levels were declining on average by about 0.8 percent annually.
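The 0.8 percent figure is a fitted annual trend. The sketch below illustrates the kind of calculation involved, fitting a straight line to a yearly chlorine series and expressing the slope as a percentage per year; the input series here is a synthetic placeholder constructed for illustration, not the MLS data.

```python
# Illustrative only: estimating a percent-per-year decline from a yearly series
# with a linear fit. The series below is a synthetic placeholder built to fall
# ~0.8%/yr; it is NOT the MLS HCl/N2O-derived chlorine record.
import numpy as np

years = np.arange(2005, 2017)
chlorine = 3.2 * (1 - 0.008) ** (years - 2005)   # hypothetical chlorine proxy, arbitrary units

slope, intercept = np.polyfit(years, chlorine, 1)
percent_per_year = 100 * slope / chlorine.mean()
print(f"Fitted trend: {percent_per_year:.2f} % per year")   # about -0.8 %/yr
```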

The 20 percent decrease in ozone depletion during the winter months from 2005 to 2016 as determined from MLS ozone measurements was expected. "This is very close to what our model predicts we should see for this amount of chlorine decline," Strahan said. "This gives us confidence that the decrease in ozone depletion through mid-September shown by MLS data is due to declining levels of chlorine coming from CFCs. But we're not yet seeing a clear decrease in the size of the ozone hole because that's controlled mainly by temperature after mid-September, which varies a lot from year to year."

Looking forward, the Antarctic ozone hole should continue to recover gradually as CFCs leave the atmosphere, but complete recovery will take decades. "CFCs have lifetimes from 50 to 100 years, so they linger in the atmosphere for a very long time," said Anne Douglass, a fellow atmospheric scientist at Goddard and the study's co-author. "As far as the ozone hole being gone, we're looking at 2060 or 2080. And even then there might still be a small hole."

Plain Language Summary

The Antarctic ozone hole is healing slowly because levels of the man-made chemicals causing the hole have long lifetimes. We use Microwave Limb Sounder (MLS) satellite data to measure O3 over Antarctica at the beginning of winter and then compare it to O3 near the end of winter to calculate depletion. During this period, nearly all O3 change is due to depletion. MLS also measures HCl, and when ozone levels are very low, nearly all the reactive chlorine species (Cly) are converted to HCl. Cly varies a lot from year to year because of atmospheric motions. Fortunately, MLS measures nitrous oxide (N2O), a long-lived gas that also varies with the motions. Using the ratio of Cly to N2O, we find that there is less chlorine now than 9 years ago and that Cly has decreased on average by about 25 parts per trillion per year (0.8%/yr). The O3 depletion we calculate from MLS data responds to changes in the Cly levels, and the ratio of the change in ozone loss to the change in Cly matches model calculations. All of this is evidence that the Montreal Protocol is working—the Cly is decreasing in the Antarctic stratosphere and the ozone destruction is decreasing along with it.

To read the study abstract, visit: http://onlinelibrary.wiley.com/doi/10.1002/2017GL074830/abstract

U.S. FDA halts use of triclosan in health care antiseptics

Agency defers action on two quaternary ammonium compounds

By Cheryl Hogue

The antibacterial compound triclosan, already banned in the U.S. from consumer soaps, will no longer be allowed in antiseptic products used in hospitals and other health care settings. The U.S. Food & Drug Administration on Dec. 20 deemed triclosan and 23 other antiseptic ingredients to not be generally recognized as safe and effective.

“There was a lack of sufficient safety and efficacy data” for the 24 affected chemicals, explains FDA Commissioner Scott Gottlieb.

The new regulation affects hand washes and rubs used by health care professionals, surgical hand scrubs and rubs, and antiseptic preparations used on patients’ skin before injections or surgery.

FDA’s action stems from a settlement the agency made with the Natural Resources Defense Council (NRDC). The environmental group sued the agency in 2010 for failing to finalize a 1978 proposal to ban triclosan in soaps.

In the new regulation, FDA put off deciding whether six other ingredients are safe and effective in antiseptic health care products, giving manufacturers more time to provide data to the agency. These substances are: two quaternary ammonium compounds, benzalkonium chloride and benzethonium chloride; chloroxylenol; ethyl alcohol; isopropyl alcohol; and povidone-iodine.

The American Cleaning Institute, which represents producers of soaps and detergents, is pleased that the agency deferred action on the six chemicals.

“Manufacturers need sufficient time to provide FDA with additional safety and efficacy data to support the continued use of these products in health care facilities,” the industry group says. “The active ingredients used in health care antiseptic drug products have very favorable benefit/risk ratios demonstrated over many years of extensive use.”

But Mae Wu, a senior attorney with NRDC, is concerned about the delay. Companies have had decades to provide FDA with the safety and efficacy data, she points out.

NRDC is particularly focused on the two quaternary ammonium compounds, saying the data suggests that these chemicals may cause health problems. Hand sanitizers containing these chemicals are marketed as alcohol-free and are widely used in schools as well as in health care settings, Wu says.

Wu is worried about how long FDA will take to make a determination about the six antibacterial substances.

“I don’t want us to wait another 40 years for FDA to make a decision on these other chemicals,” she says, noting the time that elapsed between FDA’s proposal to ban triclosan and its final action.

FDA banned triclosan and certain other antiseptic chemicals in consumer soaps and body washes more than a year ago. It found that companies haven’t shown the compounds to be safe for long-term daily use, and said that manufacturers failed to show that products with the substances were more effective at preventing the spread of germs than plain soap and water. FDA also cited concern about potential hormonal effects and antibiotic resistance associated with the chemicals.

How Much Food Do We Waste? Probably More Than You Think

By Somini Sengupta

Dec. 12, 2017

Globally, we throw out about 1.3 billion tons of food a year, or a third of all the food that we grow.

That’s important for at least two reasons. First, the less the world wastes, the easier it will be to meet the food needs of the global population in coming years. Second, cutting back on waste could go a long way toward reducing greenhouse gas emissions.

How do we manage to waste so much?

Food waste is a glaring measure of inequality. In poor countries, most of the food waste is on the farm or on its way to market. In South Asia, for instance, half of all the cauliflower that’s grown is lost because there’s not enough refrigeration, according to Rosa Rolle, an expert on food waste and loss at the United Nations Food and Agriculture Organization. Tomatoes get squished if they are packed into big sacks. In Southeast Asia, lettuce spoils on the way from farms to city supermarkets. Very little food in poor countries is thrown out by consumers. It’s too precious.

But in wealthy countries, especially in the United States and Canada, around 40 percent of wasted food is thrown out by consumers.

That number, from the F.A.O., is the result of several factors. We buy too much food. We don’t finish our plates. We spend a far smaller share of our income on food.

“As you get higher and higher income, you get more and more profligacy in food waste,” said Paul A. Behrens, an assistant professor of energy and environmental sciences at Leiden University in the Netherlands.

The United States as a whole wastes more than $160 billion in food a year.

According to the United States Department of Agriculture, which tracks food loss, dairy products account for the largest share of food wasted, about $91 billion.

“In the developed world, food is more abundant but it costs much less,” Ms. Rolle said. “In a sense people don’t value food for what it represents.”

Food waste and loss has a huge carbon footprint: 3.3 billion tons of carbon equivalent.

And that’s not all, according to a 2014 report from the F.A.O. Wasting that much means a lot of water is wasted, too — the equivalent of three times the size of Lake Geneva, as the report puts it.

“Food waste — it’s kind of the tip of the iceberg,” said Jason Clay, a senior vice president in charge of food policy at the World Wildlife Fund, a Washington-based advocacy group. “It’s the most obvious place to start.”

Ms. Rolle of the Food and Agriculture Organization said some of the most basic fixes are at the bottom end of the supply chain: Metal grain silos have helped against fungus ruining grain stocks in countries in Africa. In India, the F.A.O. is encouraging farmers to collect tomatoes in plastic crates instead of big sacks; they squish and rot less.

Higher up the food chain, supermarkets are trying to make a dent by changing the way best-before labels are used — making them specific to various food categories to discourage consumers from throwing out food that is safe to eat — or trying to sell misshapen fruits and vegetables rather than discarding them.

Some countries are trying to regulate food waste. France requires retailers to donate food that is at risk of being thrown out but is still safe to eat. European Union lawmakers are pushing for binding targets to curb food waste by 50 percent by 2030, echoing a United Nations development goal; negotiations have been underway since June. Some countries are pushing back on the idea of continent-wide targets.

What if we just ate less?

That would make a difference, but not as much as one might think. Dr. Behrens of Leiden University addressed the issue in a recent study:

Cutting waste would have “at least the same impact or more than changing diets.”

If Americans ate according to our nationally recommended dietary guidelines (each country’s are somewhat different), that would go some distance toward cutting our emissions footprint. Changing eating habits is tough, though. Experts say tackling food waste is still critical.

The most consequential environmental stories of 2017

By Brady Dennis and Darryl Fears

President Trump made his mark in the energy and environment world during his first year in Washington. Many of his actions aimed to undo work from the Obama era. Trump all but abandoned the nation’s efforts to combat climate change, and he shrank national monuments that President Barack Obama had established or sought to preserve. Trump scaled back regulations on the fossil fuel industry and pushed for more drilling on land and at sea.

And in turn, much of the world pushed back. Protesters descended on Washington to oppose his policies and campaign against what they saw as an attack on science. Other nations denounced his decision to back out of an international climate agreement, leaving the United States at odds with the rest of the globe.

Meanwhile, extreme weather nationwide wrought devastation. Hurricanes leveled homes, triggered floods and upended lives from Puerto Rico to Texas. Wildfires ravaged California, burning entire neighborhoods to ashes. It was a tumultuous year. Here are some of the most consequential environmental stories we covered along the way.

1. Withdrawal from the Paris climate accord. “I was elected to represent the citizens of Pittsburgh, not Paris,” Trump proclaimed from the Rose Garden in June. With those words, he declared his intention to withdraw the nation from a global effort to cut greenhouse gas emissions in an attempt to fend off the worst effects of climate change. The Obama administration had led the charge for the landmark deal in late 2015, helping to persuade other world powers — and major polluters — such as China and India to pledge to reduce their emissions in coming years.

Trump reversed course, despite widespread criticism from world leaders, claiming that the Paris accord was a bad deal for the United States that would disadvantage American workers. The United States is now the only nation in the world to reject the deal. While the U.S. withdrawal from the Paris agreement cannot officially be finalized until late 2020, the action sent a clear message: Climate action has little place in the Trump administration.

2. A sea change at the Environmental Protection Agency. “The future ain’t what it used to be at the EPA,” the agency’s administrator, Scott Pruitt, is fond of saying. That’s certainly true. In nominating Pruitt to head the agency that Trump once promised to reduce to “little tidbits,” the president chose a man who had long been one of its most outspoken adversaries. As Oklahoma attorney general, Pruitt sued the EPA 14 times, challenging its authority to regulate toxic mercury pollution, smog, carbon emissions from power plants and the quality of wetlands and other waters.

Now, as EPA’s leader, he has acted aggressively to reduce the agency’s reach, pause or reverse numerous environmental rules, and shrink its workforce to Reagan-era levels. He has begun to dismantle Obama’s environmental legacy, in part by rolling back the Clean Power Plan — a key attempt to combat climate change by regulating carbon emissions from the nation’s power plants. Along the way, Pruitt has become one of Trump’s most effective Cabinet members, as well as a lightning rod for criticism from public health and environmental groups.

3. The fight over national monuments. Trump issued an executive order in April to review 27 land and marine monuments. But it was clear that two particular monuments were in his crosshairs: Bears Ears and Grand Staircase-Escalante. Utah’s congressional delegation and its governor had lobbied Trump’s inner circle to reverse the monument designations of these parks in their state even before he was elected.

Utah Republicans called the designations by Obama and President Bill Clinton overzealous land grabs, and shortly after he took office, Trump adopted some of the same language. He promised to end what he called presidential “abuses” and give control of the land “back to the people.” In the end, Trump shrank both monuments by nearly 2 million acres last month, and Interior Secretary Ryan Zinke said the borders of other monuments in the Atlantic and Pacific oceans, as well as in the West, are being reviewed. Native American groups that had requested a Bears Ears designation are leading a wave of lawsuits against the Trump administration’s decision.

4. Drill, baby, drill. Drilling platforms already dot the Gulf of Mexico, where the fossil fuel industry has extracted oil and gas for decades. But the Trump administration wanted to make history. In early November, it did so by announcing the largest gulf lease offering for oil and gas exploration in U.S. history: 77 million acres.

The move was consistent with Trump’s push for “energy dominance.” He and Zinke are also opening more land to coal excavation in the West. One of Zinke’s first acts as interior secretary was to remove a bright and colorful picture of a western landscape from the Bureau of Land Management’s website and replace it with a black wall of coal. Oil prices are climbing after reaching record lows in recent years, but coal is struggling to make a comeback after the rise of natural gas. The Gulf of Mexico promises more oil, but it also might promise disaster. It’s the scene of one of the nation’s worst environmental disasters, the Deepwater Horizon oil spill, which fouled beaches and killed untold numbers of marine animals when oil spewed into the water for months.

Is drilling in the pristine Arctic National Wildlife Refuge next? The Republican-controlled Congress greenlighted leases for exploration in the recently passed tax bill, which was approved completely along party lines. But let the buyer beware. Royal Dutch Shell drilled a $7 billion hole in the Chukchi Sea in 2014 and has nothing to show for it.

5. Action on the Dakota Access and Keystone XL pipelines. As winter began to fade, it became clear that camps of protesters in Canon Ball, N.D., who for months had fought a pipeline that they argued could threaten the drinking water and cultural sites of the Standing Rock Sioux tribe, had lost this particular battle. Days after Trump took office, he signed executive orders to revive two controversial pipelines that the Obama administration had put on hold — the 1,172-mile Dakota Access and the 1,700-mile Keystone XL oil pipeline, which would extend from the Canadian tar sands region to refineries on the Texas Gulf Coast.

Oil is now flowing through the Dakota Access pipeline. And the company behind the Keystone XL this fall cleared a key regulatory hurdle in its quest to complete the northern half of the pipeline, running from Alberta to Steele City, Neb., when it received approval from the Nebraska Public Service Commission. Opponents of both projects have vowed to continue legal fights, as well as to protest any other pipelines they view as a threat to public health or the environment. But Trump shows few signs of backing down, calling his actions “part of a new era of American energy policy that will lower costs for American families — and very significantly — reduce our dependence on foreign oil, and create thousands of jobs right here in America.”

6. Attacks on the Endangered Species Act. It is arguably one of the most powerful environmental laws in the world, credited with saving at least a dozen animal and plant species from extinction. But who will save the Endangered Species Act, which is under attack by political conservatives inside and outside Washington? Led by Rep. Rob Bishop (R-Utah), chairman of the House Natural Resources Committee, who said he wants to “invalidate” the 44-year-old act, some Republicans say the law interferes with commercial development, private landowner rights and excavation of natural resources such as coal and natural gas.

Bishop’s committee passed five bills that would weaken protections for wolves, force federal workers who enforce the law to consider economic impact when deciding how to save animals and strip away a provision of the law that requires the federal government to reimburse conservation groups that prevail in court. The bills have set up a potentially titanic battle between wildlife advocates and lawmakers supporting farmers, housing developers and the oil and gas industry. It’s not the first time that conservatives have attempted to weaken the act, but it is the first time a presidential administration and the department that oversees the act appear willing to go along.

7. Epic hurricanes and wildfires. Last year around this time, a strange wildfire rushed through the Tennessee mountains, killing 14 people, destroying homes and apartment buildings, and threatening a major recreation area in Gatlinburg. The 2017 fire disasters, some of which are still burning, were much more monstrous than that Great Smoky Mountain inferno. Two California fires, the Sonoma fire that burned north of San Francisco and the Thomas fire that burned north of Los Angeles, driven by fierce Santa Ana winds, have combined to kill 45 people, burn more than a half-million acres, destroy nearly 2,000 structures and cost hundreds of millions of dollars to fight. The Thomas fire appears to be finally contained near Santa Barbara after burning the second-most acreage in state history.

But fire wasn’t even the costliest disaster this year. Hurricane Harvey killed nearly twice as many people in and around Houston as the two fires combined, and sent 30,000 in search of shelter. Miami, Jacksonville and Naples, Fla., were devastated by Hurricane Irma, which immediately followed Harvey. Irma was followed in turn by Hurricane Maria, which leveled much of Puerto Rico and left at least 50 people dead, but that is probably a drastic undercount and the toll could be as high as 500.

8. Criminal charges mount in the Flint water crisis. In June, Michigan Attorney General Bill Schuette charged the director of the state’s health department and four other public officials with involuntary manslaughter for their roles in the Flint water crisis, which has stretched into its fourth year. In addition to ongoing worries that thousands of young children were exposed to dangerous levels of lead in the city’s contaminated water supply, the crisis has been linked to an outbreak of Legionnaires’ disease that contributed to at least a dozen deaths. The manslaughter charges were the latest reckoning.

According to Schuette’s office, the investigation into the decisions that led to tainted water for a city of nearly 100,000 people has resulted in 51 criminal charges for 15 state and local officials. It remains unclear how many of the charges will stick. But the cases serve as a reminder of the human toll of the tragedy and how, even today, many residents in the largely low-income, majority-minority city trust neither the water from their taps nor the public officials charged with ensuring it is safe.

9. Climate march on Washington. It didn’t draw nearly the crowd that the Women’s March did in January. And it didn’t get as much national attention as the March for Science that came only a week earlier. Even so, on a sweltering Saturday in April, tens of thousands of demonstrators descended on Washington to mark Trump’s first 100 days in office. Their plea: Stop the rollback of environmental protections and take climate change seriously.

Building on a massive demonstration three years earlier in New York, the People’s Climate March brought its message — and its many clever signs — to the White House. “Don’t destroy the Earth. I buy my tacos here,” one read. “Good planets are hard to find,” another read. “Make Earth Great Again!” read another. Trump wasn’t around that day to witness the protests on his doorstep, and the march’s organizers didn’t expect to change his mind. But they were gearing up for a long fight ahead. By the next morning, some participants met to discuss how to get more allies to run for public office. “It can’t just be a march,” one activist said. “It has to be a movement.”

Brady Dennis is a national reporter for The Washington Post, focusing on the environment and public health issues.

Darryl Fears has worked at The Washington Post for more than a decade, mostly as a reporter on the National staff. He currently covers the environment, focusing on the Chesapeake Bay and issues affecting wildlife.

Calif. Regulators OK Closure of Diablo Canyon, State's Last Nuke

"In a landmark decision, the California Public Utilities Commission has decided PG&E can close Diablo Canyon nuclear power plant in 2025 — and San Luis Obispo County still won’t get its much-desired $85 million mitigation settlement to support the community through the shutdown.

PG&E will get more than expected for its employee retention and retraining program, however.

The commission unanimously voted to approve the application Thursday, saying the utility company presented a reasonable pathway toward a more energy-efficient future."

Oil Giants See a Future in Offshore Wind Power; Suppliers Too

"Analysts forecast a sixfold increase in offshore wind power capacity by 2030, but while Europe's market booms, U.S. growth has been slow. That may be changing."

"Transporting an offshore wind array from the factory floor to the ocean floor is no easy feat. Giant, specialized marine vessels must carry the blades and turbines—which sit atop rigs hundreds of feet tall—out miles from shore. Steel or concrete foundations are built to hold them in place, and underwater cables are laid on the seabed to transfer the power to land.

One other industry has spent decades constructing and maintaining such massive energy infrastructure that can survive the storms of the open ocean: oil and gas. Now, with global demand for wind power growing, major oil and gas companies like Shell and Statoil are diversifying their portfolios by developing offshore wind, and the companies that provide services to offshore fossil fuel platforms are seeing a new market rising in their wake.

"Offshore wind developing seemed like a natural skill set for offshore oil and gas companies," said Stephen Bull, senior vice president of wind and carbon capture storage for Statoil, a Norwegian oil and gas company. "From the Gulf of Mexico to Brazil and beyond, we see a similar supply chain and skill set and can grow within this area."

A Prescription for Reducing Wasted Health Care Spending

A ProPublica series has illustrated the many ways the U.S. health care system leaks money. Health care leaders and policymakers suggest ways to plug the holes.

by Marshall Allen Dec. 21, 2017, 5 a.m. EST

Unused medical supplies sit in storage at a Partners for World Health facility in Portland, Maine.

Wasted Medicine, Squandered Health Care Dollars

Earlier this year, the Gallup organization set out to identify the top concerns everyday Americans have about money. Researchers asked more than a thousand people across the country, “What is the most important financial problem facing your family today?” Their top answer: the cost of health care.

Increases in medical costs have substantially outpaced economic growth for decades. In recent months, ProPublica has shown that it doesn’t have to be this way. It’s been estimated that the U.S. health care system wastes about $765 billion a year — about a quarter of what’s spent. We’ve identified ways that tens of billions of dollars are being wasted, some of them overlooked even by many experts and academics studying this problem.

It’s possible to reduce or eliminate some of the waste, but there are also formidable forces that benefit from it. Excess spending generates revenue and profit for what some have called the “medical industrial complex,” said Dr. H. Gilbert Welch, professor of medicine at the Dartmouth Institute for Health Policy & Clinical Practice. “There are a number of people who can imagine ways to solve things,” Welch said of the wasted spending. “But the political will and the forces at work can stop them pretty easily.”

Still, wasting fewer health care dollars could drive down insurance premiums and out-of-pocket costs and maybe even free up resources for education, retirement and wage increases, among other things.

Here are the ways ProPublica found the medical industry is needlessly gobbling up money, along with steps health care leaders or policymakers say we could take right now to reduce the waste.

What Hospitals Waste

The nation’s health care tab is sky-high. We’re tracking down the reasons. First stop: a look at all the perfectly good stuff hospitals throw away.

What ProPublica found: Hospitals routinely toss out brand-new supplies and gently used equipment. Most of it goes to the dump, but some gets picked up by nonprofit organizations that ship the goods to the developing world.

Partners for World Health ships America’s unwanted medical supplies from their warehouse to countries with under-developed health care systems like Greece, Syria and Uganda. (Tristan Spinski, special to ProPublica)

How much money is wasted: No one tracks the total, but one charity in Maine had about $20 million worth of discarded goods filling its warehouses. Similar nonprofits operate around the country. The University of California, San Francisco Medical Center studied how much it wasted during neurosurgery operations in one year. The discarded supplies were worth $2.9 million.

How to stop the squandering: UCSF reduced waste by reviewing the lists of supplies each surgeon wanted prepped for an operation. Many items could be taken off the lists. That reduced the number of supplies opened to set up each procedure, said Dr. R. Adams Dudley, director of the UCSF Center for Healthcare Value.

Hospitals could reduce wasted supplies by tracking everything thrown away, said Dr. Robert Pearl, former CEO of The Permanente Medical Group, the country’s largest medical group, and author of the book “Mistreated.” The amount of waste could be reported when each patient is discharged, he said. Seeing the waste quantified would motivate people to prevent it, he said.

Several experts said paying hospitals a lump sum for everything involved in a particular procedure, instead of a la carte for each item, would mean they make more profit by cutting the amount of wasted supplies and equipment.

Every year nursing homes nationwide flush, burn or throw out tons of valuable prescription drugs. Iowa collects them and gives them to needy patients for free. Most other states don’t.

What ProPublica found: Nursing home patients typically have their drugs dispensed a month at a time. So whenever a drug gets discontinued — if a patient dies, or moves out, or has a reaction — there’s excess. Most nursing homes throw away the leftover drugs. They flush them down the toilet, put them in the trash or pay to have them incinerated. Iowa started a nonprofit organization to recover the excess drugs, inspect them and dispense them for free to patients.

How much money is wasted: It’s estimated that hundreds of millions of dollars a year are wasted by throwing out nursing home medications. The CEO of a long-term care pharmacy in Florida said his company incinerates about $2.5 million in valuable medication every year. He estimated the total is $50 million statewide. Colorado officials said the state’s long-term care facilities toss out 17.5 tons of potentially reusable drugs each year, worth about $10 million. Iowa is on pace to recover $6 million in drugs from nursing homes this year.

Boxes of unused drugs burn in the opening of an incinerator at Curtis Bay Medical Waste Services in Baltimore, Maryland. (Matt Roth for ProPublica)

Other harm to the public: Flushing drugs down the toilet contaminates the water supply. Trace levels of pharmaceuticals have been detected in water throughout the country.

How to stop the squandering: Many states have laws that allow drug donation, but they have not invested in a program to help the process. ProPublica’s story prompted lawmakers in Florida and New Hampshire to introduce legislation to create a program like Iowa’s. Leaders in the Vermont medical community have also shown interest in starting a drug donation program.

How Two Common Medications Became One $455 Million Specialty Pill

After I was prescribed a brand-name drug I didn’t need and given a coupon to cover the out-of-pocket costs, I discovered another reason Americans pay too much for health care.

What ProPublica found: A drug company combined two over-the-counter drugs, naproxen, which goes by the brand name Aleve, and esomeprazole magnesium, also known as Nexium, to create a new pill called Vimovo. My doctor prescribed it to me. The company that makes it, Horizon Pharma, marketed the single pill as an innovation because it was easier to take one pill than two. A month’s supply of the two inexpensive drugs costs about $40. The company billed insurance $3,252 for the Vimovo.

How much money is wasted: My insurance company rejected the bill, but Vimovo has racked up net sales of more than $455 million since 2014. Horizon brought in $465 million more in net sales from a similar drug, Duexis, which combines ibuprofen and famotidine, aka Advil and Pepcid.

How to stop the squandering: Spurred by ProPublica’s story, Connecticut now requires doctors to get prior authorization to prescribe Horizon’s drugs to the 200,000 public employees, retirees and their dependents covered by a state insurance plan. The Connecticut comptroller also urged the attorney general’s office to investigate Horizon’s relationship with pharmacies and physicians. Other insurers have removed the drugs from their formularies.

Welch, the Dartmouth doctor, suggested tongue-in-cheek “warning labels” on drugs that aren’t actually big innovations. In the case of Vimovo, the warning could say that the “specialty” pill is actually just a combination of two cheaper ingredients. “There may need to be public service advertisements that counter the advertising that says everything is the best thing since sliced bread,” he said.

In cases where there are dramatic price increases, state lawmakers could require pharmaceutical companies to justify the higher costs, said Dr. Steven Pearson, president of the Institute for Clinical and Economic Review.

Secret deals and rebates currently cloak the actual price of drugs. Drug companies should have to publish the real prices of their products, said Linda Cahn, an attorney who advises corporations, unions and other payers on how to reduce their costs. Cahn recently proposed in an op-ed in The Hill what she calls a “bid day,” where any drug company that wants to sell its products to Medicare beneficiaries would be required to submit the net cost of each medication. All the companies would be required to submit their price at the same time, say twice a year, and then stick with the price, which Medicare would make public for all to see. The bidding would inform consumers and force competition, she said. The same process could be used for all insurance plans, she said.

The Myth of Drug Expiration Dates

Hospitals and pharmacies are required to toss expired drugs, no matter how expensive or vital. Meanwhile the FDA has long known that many remain safe and potent for years longer.

Pharmacist and toxicologist Lee Cantrell tested drugs that had been expired for decades. Most of them were still potent enough to be on the shelves today.

What ProPublica found: The term “expiration date” is a misnomer. The dates on drug labels are the point up to which pharmaceutical companies guarantee their effectiveness. But that does not mean the drugs are ineffective or dangerous the moment after they “expire.” The Food and Drug Administration and Defense Department created the Shelf Life Extension Program to test the stability of drugs in the federal government’s stockpiles and then extend their expiration dates, when possible. A 2006 study of 122 drugs tested by the program showed that two-thirds of them were stable every time a lot was tested. Each of them had their expiration dates extended, on average, by more than four years. But the same type of drugs in hospitals or pharmacies get thrown away when they “expire.”

How much money is wasted: The total is unknown, but one mid-size hospital in Boston had to destroy about $200,000 in expired drugs in 2016. If hospitals nationwide throw out similar amounts of drugs annually, the total would be about $800 million trashed each year by hospital pharmacies alone.
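
For scale, the $800 million estimate is roughly what the single-hospital figure implies when extended across the country's hospitals. The line below is an illustrative back-of-envelope only; the multiplier of about 4,000 comparably sized "hospital-equivalents" is an assumption inferred from the two published numbers, not a figure reported by ProPublica.

$\$200{,}000 \text{ per hospital per year} \times 4{,}000 \text{ hospital-equivalents} \approx \$800 \text{ million per year}$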

Other harm to the public: Some expired drugs are in short supply and difficult to replace. On occasion, a pharmaceutical company will extend the expiration date of drugs for which there are shortages, but they are not required to do so.

How to stop the squandering: Drug companies could be required to do studies to determine how long their products actually last, and report the information, said Lee Cantrell, a pharmacist who helps run the California Poison Control System. Also, the government could publish data from the Shelf Life Extension Program, which is funded by taxpayers. The information would help people see “many types of medications are safe and effective much longer than their original expiration dates,” Cantrell said.

Drug Companies Make Eyedrops Too Big — And You Pay for the Waste

The makers of cancer drugs also make vials with too much medication for many patients. The excess drugs are tossed in the trash — another reason health care costs are so high.

What ProPublica found: Drug companies make eyedrops much larger than what the eye can hold. That means patients pay for the excess from each drop, which runs down their cheeks. Vials of cancer drugs are also larger than necessary. The leftover cancer medication is billed to the patient — and thrown in the trash can.

Gregory Matthews, who is partially blind because of glaucoma, uses eyedrops every day to preserve his remaining sight. (Matt Roth for ProPublica)

How much money is wasted: It’s unknown how much it costs to waste a portion of each eyedrop, but the industry is huge. Last year, drug companies brought in about $3.4 billion in the U.S. alone for dry eyes and glaucoma drops, according to the research firm Market Scope. A 2016 study estimated that, of the top 20 cancer drugs packaged in single-use vials, 10 percent of the medication is wasted, at a cost of $1.8 billion a year.
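
Read together, those two figures also hint at the size of the underlying market. The division below is an illustrative inference that assumes the wasted share is priced the same as the rest of the medication; it is not a statistic from the study.

$\$1.8 \text{ billion wasted} \div 0.10 \approx \$18 \text{ billion in annual spending on those top 20 vial-packaged cancer drugs}$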

How to stop the squandering: Citing ProPublica’s story, two U.S. senators introduced legislation that would require federal agencies to stop the waste associated with eyedrops and single-use cancer drug vials. Drugmakers could reduce the waste of cancer medication by making vials in varying sizes, so that less would be left over from each patient, said Dr. Peter Bach, director of the Center for Health Policy and Outcomes at Memorial Sloan Kettering Cancer Center in New York. Bach led a team that estimated the waste associated with vials of cancer drugs.

A Hospital Charged $1,877 to Pierce a 5-Year-Old’s Ears. This Is Why Health Care Costs So Much.

An epidemic of unnecessary treatment is wasting billions of health care dollars a year. Patients and taxpayers are paying for it.

What ProPublica found: It’s common for medical providers to deliver care that’s not needed, or that costs more than necessary, and patients get stuck with the bills. In one case, a Colorado mom said her surgeon offered to pierce her daughter’s ears as an add-on to a different operation. The mom agreed, assuming it would be free. But the hospital stuck her with a bill for $1,877 for “operating room services.” ProPublica also highlighted unnecessary imaging tests, like extra mammograms and ultrasounds, and found it’s common for hospital intensive care to be delivered to patients who are too sick or too healthy to benefit.

How much money is wasted: Unnecessary or needlessly expensive care wastes about $210 billion a year, according to the National Academy of Medicine. The cost of false-positive tests and overdiagnosed breast cancer is $4 billion a year, according to a 2015 Health Affairs study. Unnecessary intensive care costs about $137 million a year for about 100 hospitals in two states, one study estimated. That would put the tab of non-beneficial intensive care in the billions nationwide.

Dr. Dong Chang, director of the medical intensive care unit at Harbor-UCLA Medical Center, has found that many patients are too sick or too healthy to benefit from intensive care.

How to stop the squandering: Medicare has a fee schedule that sets prices and makes them public, said Welch, the Dartmouth physician who advocates against overtreatment. States could pass laws that establish similar fee schedules, he said, or insurance companies could implement them. Establishing set prices would reduce administrative overhead, he said, and allow patients to shop for the best deals.

Several experts said policymakers need to move medical providers from payment based on volume of services to payment based on value. Right now “fee for service” payment is common, so providers get a fee for everything they do. This gives them the incentive to do more, sometimes more than is necessary, or to provide care that’s more expensive than it has to be. If providers were paid a lump sum to care for individuals or a group of patients, and the outcomes of their performance were measured to ensure quality didn’t suffer, they would cut the waste. “You want to empower organizations to be responsible for improving patient health and reducing costs,” said Dr. Elliott Fisher, director of The Dartmouth Institute for Health Policy and Clinical Practice.

Community Supported Canning Gets Locavores Through Winter

December 26, 2017, 4:34 PM ET

Allie Hill got really serious about eating local food about eight years ago. She was cooking for three young children. "I was able to go to the farmers' market and find my produce — fruits and veggies," she says. "I was able to find meat, and even some dairy."

She simply couldn't find local versions of other foods, though. These are foods that fill her pantry, like marinara sauce, apple sauce and everything else that comes to us preserved in sealed jars and cans.

The technology of canning, which brings those foods to us, was invented 200 years ago, and it was life-changing. With heat to kill disease-causing bacteria and a vacuum-sealed lid to prevent contamination, you could keep food edible for years.

These days, cans are everywhere, but the act of canning has vanished inside the walls of huge factories. People don't do it as much at home anymore, and Allie Hill couldn't find many local farmers doing it in central Virginia.

Then she discovered Prince Edward County's public cannery, a place where anybody can walk in with bags of produce from their garden and walk out with preserved food.

Places like this once were common. "It used to be, every county in the commonwealth [of Virginia] would have a cannery," says Wade Bartlett, the administrator of Prince Edward County.

Across the country, there were hundreds of them. Most were set up during the New Deal and the Second World War. Many were part of the agricultural extension service charged with bringing technical know-how to rural America. They brought almost industrial-scale food preservation to small towns.

Most of these public canneries have since disappeared, shut down when funding got tight. But Bartlett says that "the board of supervisors for Prince Edward County always thought it was an important part of the fabric of this county."

From the outside, the cannery doesn't look like much. It's just a long, single-story brick structure on the outskirts of the town of Farmville. But Patty Gulick, who manages the place, says what happens inside is like magic.

"If everybody could see what goes on in the cannery, the world wold be a much happier place," she says. People work side by side, help each other out, trade advice and recipes. "It doesn't matter if you're old, if you're young, your race, your wealth; everybody's the same in the cannery!"

Half a dozen people are here today, both men and women, black and white. They're dropping sweet potatoes, collard greens, and tomato soup into huge cooking kettles that can hold up to 60 gallons. Cooked food then goes into steel cans of various sizes. Lids are attached by machines, and batches of cans then are lowered into giant pressure cookers.

Rhonda Mayberry is canning what she calls creasy greens — similar to watercress. You won't find these greens in a supermarket, and they have a special place in Mayberry's heart because they helped get her through nursing school.

"My husband and I lived in a little mobile home, and I was broke as a convict," she recalls. She spied these greens growing in a nearby corn field "and so I took his good hunting knife, and I found a patch of creasies, and sent them off to sell. And that was my gas money to get back and forth to my nursing courses!"

For all the memories, though, the number of people who use this facility is modest, and the county has thought about closing it.

But then Allie Hill, the local food enthusiast, discovered the cannery. She set up a non-profit group — Virginia Food Works — that now operates this cannery on days when the home gardeners aren't using it. And it's bringing in a whole different group of cannery users. This is the farmers' market crowd: Internet-savvy young farmers and small food companies that sell to people who'll pay extra for food grown nearby.

"It almost seems to me that we are part of saving this facility. We are allowing this facility to be used in new ways," Hill says.

Companies are using it to make apple sauce, ketchup, and barbecue sauces, packaging them in glass jars, with artsy labels. And in the process, they're reinventing an old tradition.

Shipping-container farming that’s said to have price parity with farms


Local Roots: Farm-in-a-box coming to a distribution center near you

Diana Gitig - 12/16/2017, 12:00 PM

Eric and Matt could not be more earnest in their quest to feed the world.

These two fresh-faced LA boys founded Local Roots four years ago. Their first purchases were broken-down, 40-foot shipping containers—this is apparently easy to do, since it is cheaper for shipping companies to just churn out new ones rather than fix broken ones. Local Roots then upcycles them into modular, shippable, customizable farms, each of which can grow as much produce as five acres of farmland. The idea is to supplement, not supplant, outdoor agriculture. And Ars got a look at one of these "farms" when it was set up in New York City recently.

Every aspect of the TerraFarm, as the repurposed shipping containers have been dubbed, has been designed and optimized. The gently pulsing LED lights are purplish—apparently, that’s what lettuce likes—and the solution in which the plants are grown is clean and clear. The "farm" is bright and vibrant, and it smells great in there.

This environment came about because Local Roots consulted a lot of experts. It employs horticulturalists, mechanical, electrical, and environmental engineers, software and AI developers, and data and nutrition scientists. The company does this to ensure that the growing conditions and produce are always optimal—both for the plants' growth and their nutritional content.

TerraFarms use no pesticides, herbicides, or fertilizers—they don’t have to. This means they generate no toxic runoff, and the produce fits most definitions of organic food. They use 99 percent less water and obviously much less land than outdoor farms. Since the farms are indoors, they are not subject to the vagaries of weather, be it the extreme temperatures, storms, and droughts brought on by climate change or the more mundane conditions of heat, cold, or dryness that exist outside of LA.

They can be moved anywhere—desert, tundra, underground, and even Mars, as both Eric and Matt pointed out independently of each other. Wherever the TerraFarms are, their conditions will be constantly monitored by the experts back at HQ in Vernon, California, just outside of downtown LA, where Local Roots recently built a huge new facility.

Most of the crops that we grow today have been bred for the stability of the final product, whether a fruit or leaf or root. This way, the produce can last for the two weeks it takes to truck it from where it's grown (California, for example, which produces more food than any other state) to wherever it's headed. But TerraFarms are intended to reside and be staffed near distribution centers for major retailers, never further than 50 miles from the consumers eating the produce. So most of that same two-week period will elapse while the produce is in your fridge.

Regardless of their location, TerraFarms will provide people with fresh, local, organic produce all year long. Local Roots thus seems to have managed to attain both the benefits of small organic farms—i.e., fresh, local produce—while keeping the benefits of large, industrialized agriculture, like technical expertise and centralized distribution.

Local Roots already provides food to Tender Greens and Mendocino Farms, and the United Nations World Food Programme has just purchased TerraFarms to provide produce to developing areas of the world; although the Food Programme supplies essentials like rice and beans, about two million people still suffer from micronutrient deficiencies, which fresh produce can help alleviate.

A solution like this in a developing economy doesn't seem to make much sense on the surface. But the company is now claiming that it has achieved cost parity with traditional, outdoor farming. It's the first in the indoor/urban/vertical farming model to have done so, possibly because the shipping containers allow them to generate more farmland more quickly and more cheaply than can be done in a warehouse or other indoor systems.

Thus far, Local Roots has concentrated on growing greens—lettuces and some herbs. Since these are highly perishable, they benefit the most from being grown locally and getting to consumers quickly. But in principle, each TerraFarm can be customized to grow anything, anywhere. Which might be a very good thing, as climate change is not going to be good for the coffee crop.

New, Major Evidence That Fracking Harms Human Health

A child born very close to a well is likely to be smaller and less healthy than a child born farther away.

Robinson Meyer Dec 13, 2017 Science

Updated on December 13 at 6:30 p.m.

Hydraulic fracturing, or fracking, may pose a significant—but very local—harm to human health, a new study finds. Mothers who live very close to a fracking well are more likely to give birth to a less healthy child with a low birth weight—and low birth weight can lead to poorer health throughout a person’s life.

The research, published in Science Advances, is the largest study ever conducted on fracking’s health effects.

“I think this is the most convincing evidence that fracking has a causal effect on local residents,” said Janet Currie, an economist at Princeton University and one of the authors of the study.

The researchers took the birth records for every child born in Pennsylvania from 2004 to 2013—more than 1.1 million infants in total—and looked at the mother’s proximity to a fracking site, using the state of Pennsylvania’s public inventory of fracking-well locations. They used private state records that showed the mother’s address, allowing them to pinpoint where every infant spent its nine months in utero.

They found significant, but very local, consequences. Infants born to mothers who lived within two miles of a fracking well are less healthy and more underweight than babies born to mothers who lived even a little further away. Babies born to mothers who lived between three and 15 miles from a fracking well—that is, still close enough to benefit financially from the wells—resembled infants born throughout the rest of the state.

While birth weight may seem like just a number, it can affect the path of someone’s life. Children with a low birth weight have been found to have lower test scores, lower lifetime earnings, and higher rates of reliance on welfare programs throughout their lives. In a previous study, a different team of researchers examined twins in Norway whose birth weight diverged by 10 percent or more. The lighter twin was 1 percent less likely to graduate from high school and earned 1 percent less than their sibling through their life.

“Hydraulic fracturing has widely dispersed benefits—we are all paying lower natural-gas bills for heating, we’re all paying lower electricity prices, we’re all paying less for cheaper gasoline at the pump. And even if health was all that you care about, we’re all benefitting from decreased air pollution that’s widely dispersed, because coal plants are closing,” said Michael Greenstone, a professor of economics at the University of Chicago and another author of the paper.

But those benefits come at a cost, he said, and it is borne by the local communities that live extremely close to hydraulic fracturing wells. “There’s this interesting trade off between the greater good and what are the costs and benefits for local communities,” he told me.

Oil and gas lobbying groups rushed to criticize the study. “This report highlights a legitimate health issue across America that has nothing to do with natural gas and oil operations. It fails to consider important factors like family history, parental health, lifestyle habits, and other environmental factors and ignores the body of scientific research that has gone into child mortality and birthweight,” said Reid Porter, a spokesman for the American Petroleum Institute, a trade organization that represents the oil and gas industry.

In the fracking study, researchers tried to separate the effects of fracking from socioeconomic status and parental health in several ways. First, they compared the birth weights of babies born near fracking wells with those of babies born just a little farther away, which they believe accounts for differences in the wealth of the surrounding communities.

Second, they found that the connection held for siblings who were or were not exposed to a fracking well. “We follow the same mother over time and ask whether on average, children born after fracking starts have worse outcomes than their siblings born before fracking starts,” Currie told me. “In this case, since we follow the same woman over time, we are controlling for her underlying characteristics.”

Babies who gestated near a well had a reliably lower birth weight than their siblings who were not exposed to the well.
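
The sibling comparison is, in effect, a mother fixed-effects design. The snippet below is a minimal sketch of that idea, not the authors' code; the table layout, the column names (mother_id, near_well, birth_weight) and the values are all hypothetical, invented for illustration.

import pandas as pd

# Hypothetical birth records: one row per infant, identifying the mother,
# whether the pregnancy occurred close to an active fracking well, and the
# birth weight in grams. All values are invented.
births = pd.DataFrame({
    "mother_id":    [1, 1, 2, 2, 3, 3],
    "near_well":    [0, 1, 0, 1, 0, 1],   # 0 = sibling born before fracking began, 1 = after
    "birth_weight": [3400, 3250, 3100, 3020, 3600, 3480],
})

# Demean birth weight within each mother so that anything constant about her
# (health, income, neighborhood) cancels out of the comparison.
births["bw_within"] = births["birth_weight"] - \
    births.groupby("mother_id")["birth_weight"].transform("mean")

# Average gap between exposed and unexposed siblings, in grams.
gap = births.loc[births["near_well"] == 1, "bw_within"].mean() - \
      births.loc[births["near_well"] == 0, "bw_within"].mean()
print(f"Within-mother birth-weight gap: {gap:.0f} g")

The actual study applies this logic at much larger scale with additional statistical controls, but the demeaning step is what lets fixed characteristics of the mother drop out of the estimate.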

The researchers don’t yet know why this link between fracking and low birth weight exists, though they suggest that air pollution could be a possible contributor. The process of fracking may release chemicals into the air, for one, but many wells also run multiple diesel engines at once, and they can be a hub of local activity, with trucks regularly commuting to the sites.

While environmental activists and some researchers have proposed that fracking chemicals may leak into groundwater, most studies have failed to find lasting and widespread water pollution near wells. The birth-weight study seems to suggest that air, not water, pollution may instead be the threat that fracking sites pose to human health.

Greenstone believes the next step for this research is to figure out exactly what is driving the babies’ low birth weight. “Is it the trucks? Is it the diesel generators?” he said. “If you knew the channel, you might be able to devise a light-touch regulatory approach.”

East Antarctic Ice Has a Wild Past. It May Be a Harbinger

The East Antarctic ice sheet has fluctuated wildly in the past, a study finds—adding to concerns of a dramatic meltback in the future.

By Douglas Fox, PUBLISHED December 13, 2017

Scientists sounding the seabed off Antarctica have uncovered some surprising episodes from the continent’s history: The East Antarctic Ice Sheet, they say, experienced a series of dramatic retreats in the distant past—retreats that were often punctuated by catastrophic floods of meltwater that erupted from beneath the ice sheet and left deep scars in the seafloor.

The ice covering East Antarctica, more than 12,000 feet thick in many places, has long been considered more stable and permanent than the West Antarctic Ice Sheet —and thus more likely to weather global warming unscathed. But the new research, published this week in Nature by Sean Gulick of the University of Texas and his colleagues, reinforces a growing concern that large swaths of East Antarctica are more vulnerable than once thought.

In recent years scientists have mapped the East Antarctic bedrock with ice-penetrating radar and found that, like West Antarctica, it includes large regions that plunge thousands of feet below sea level. Because they sit on such low ground, those areas of the ice sheet are susceptible to melting by deep, warm ocean currents.

“There’s a lot of ice in these basins” of East Antarctica, says Robert DeConto, a glaciologist at the University of Massachusetts - Amherst. They are going to be “a big, honking potential contributor to sea level.”

One of the biggest is the Aurora Subglacial Basin. Much of the ice there flows to the sea through the Totten Glacier, which has been found to be thinning rapidly. To investigate the history of that part of the ice sheet, Gulick and his colleagues collected evidence just offshore.

Sounding the Past

In February 2014, the researchers steamed along the Sabrina Coast of East Antarctica, picking their way among icebergs, in the icebreaker Nathaniel B. Palmer. A low thud resounded in the air every few seconds – the muffled burst of submerged air guns towed by the ship. The sound waves penetrated as far as 1,500 feet into the seabed before reflecting off internal boundaries in the sediments. Detectors at the surface recorded those echoes.

These seismic profiles, as they’re called, revealed hundreds of layers of mud, sand, and gravel that had been eroded off the Antarctic continent over millions of years by the slow action of glaciers, then carried out to sea. The deeper layers showed that Antarctic ice first appeared around 50 million years ago—just 15 million years after the dinosaurs died off, and “a lot earlier than we thought,” says paleo-oceanographer Amelia Shevenell of the University of South Florida, who was on the ship.

The succession of layers allowed Shevenell and her colleagues to reconstruct the ice sheet’s alternating episodes of expansion and retreat over the tens of millions of years that followed its initial formation.

During cold periods, the researchers found, the ice advanced 100 miles or more beyond its present-day edge, bulldozing out onto the continental shelf. In warm times, the ice shrank to less than its present-day size, retreating far into the Aurora Basin.

But the team was most intrigued by deep scars they found buried hundreds of feet beneath the seafloor. The V-shaped notches, cut into underlying layers of sand and mud, looked like “tunnel valleys”: Grooves cut by meltwater flowing under the ice sheet while it was sitting on the seafloor.


The East Antarctic ice sheet was previously considered stable and permanent, but scientists are now increasingly concerned about its vulnerability to warming.

Photograph by George Steinmetz, National Geographic Creative

Scientists are familiar with tunnel valleys from North America and Europe, where they formed as the northern ice sheets were collapsing at the end of the last ice age, just over 10,000 years ago. But seeing such valleys off Antarctica “caught us completely by surprise,” says Shevenell.

At over half a mile wide and 500 feet deep, the largest of these grooves could have carried as much water as the Mississippi River – for a geologic instant. The researchers think they might have been formed by catastrophic events, when vast lakes of meltwater on the glacier’s surface suddenly drained through cracks in the ice.

“For a short period of time you have a Niagara Falls amount of water pouring down to the base of the ice sheet,” says Slawek Tulaczyk, a glaciologist at the University of California, Santa Cruz, who was not part of the research team but who studies the flow of water beneath ice sheets. When that water hits soft sediments under the ice, “it will just chew into that and erode a tunnel quite readily.”

The Past Is Prologue?

One or more of these floods occurred during at least 11 episodes in the period between 30 million and six million years ago, the new study finds. The floods likely happened during periods of tumultuous transition, when the ice sheet had expanded to well beyond its present-day size. The ice was then hit by a rapidly warming climate, with hot summers causing vast lakes of meltwater to form on its surface—and it roared into retreat.

The air above most of the East and West Antarctic ice sheets today is too cold for them to melt at the surface. They are mainly shrinking from the underside, in spots where the edge of the ice is strafed by deep, warm ocean currents.

But DeConto says that things won’t necessarily stay that way. “There is this growing recognition,” he says, “that in Antarctica it’s not going to be just about the ocean-ice interaction, and that atmospheric processes” – warm summer air – “could play a role if greenhouse gases get high enough.”

When might that happen? DeConto cautions against drawing too strong a conclusion from floods that happened millions of years ago, when the very land under the Antarctic ice might have been shaped differently, and, owing to changes in Earth’s orbit, the continent might have been getting more sunlight and heat during summer than it does today.

Even so, there is one concerning thing about the new results, says DeConto. During much of the time that Antarctica was repeatedly losing so much ice off its topside, levels of carbon dioxide in the atmosphere were similar to what they are today – perhaps only a little higher. By the end of the century, DeConto says, “we’re going to be climbing way outside of that range.”

Atlantic hurricanes' rapid growth spurts are intensifying

14 December 2017

Over the past three decades, wind speeds in the strongest storms have increased more rapidly in some areas.

NEW ORLEANS, Louisiana

Hurricane Maria, which devastated the Caribbean in September, took just three days to grow from a tropical depression to a fully fledged category 5 storm as it marched across the Atlantic Ocean. A study now suggests that this type of dramatic growth spurt has become more pronounced in hurricanes over some parts of the Atlantic during the past three decades.

Understanding rapid intensification — in which hurricanes increase their maximum sustained winds by 30 knots or more in 24 hours — is important because higher wind speeds cause more damage if a storm makes landfall. “This is how strong a hurricane can become,” says Ruby Leung, an atmospheric scientist at the Pacific Northwest National Laboratory in Richland, Washington. “Rapid intensification has happened with larger amplitude over the last 30 years.”

Both human-caused global warming and natural patterns of climate variability may be to blame, she told a meeting of the American Geophysical Union in New Orleans, Louisiana, on 13 December.

In 2017, four Atlantic hurricanes — Harvey, Irma, Jose and Maria — underwent rapid intensification. All went on to lay waste to parts of the Caribbean or the United States.

Inspired by this year’s wild storm season, Leung decided to study long-term trends in rapid intensification. Warming ocean waters are expected to lead to storms intensifying more frequently and more severely, says Kerry Emanuel, a meteorologist at the Massachusetts Institute of Technology in Cambridge who has published on the topic. “We’ve shown that theoretically.”

One study this year looked at whether the number of Atlantic hurricanes that undergo rapid intensification had increased between 1950 and 2014; it found no evidence to support that idea. But Leung wanted to explore the magnitude of such changes, rather than how frequently they occurred.

She and her colleagues went through a US National Hurricane Center database that captures the paths and strengths of hurricanes over time. The scientists examined how much each storm between 1986 and 2015 intensified as it moved across the Atlantic.

In general, there was no significant difference in the amount of intensification storms showed over the years. But that changed when Leung and her colleagues looked only at the storms that intensified the most. For these, the average intensification increased by nearly 0.4 knots per day each year.
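
Taken at face value, that trend implies a substantial cumulative shift over the 30-year study window. The arithmetic below is an illustrative extrapolation from the reported rate, not a figure from the study itself.

$0.4 \text{ knots per day, per year} \times 30 \text{ years} \approx 12 \text{ knots per day}$

In other words, the most rapidly strengthening storms would now be gaining on the order of a dozen more knots of wind speed per day of intensification than comparable storms did in the mid-1980s.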

Changing patterns

This trend held for storms in the central and eastern tropical Atlantic, but not in the western Atlantic. Leung isn't yet sure why that is the case, but it may have something to do with a long-term climate pattern called the Atlantic Multidecadal Oscillation (AMO), which affects interactions between the ocean and atmosphere. Over the past three decades, the AMO has been shifting phase in a way that causes sea surface temperatures in parts of the Atlantic to rise. Warmer ocean waters mean more energy for hurricanes to feed off.

Global warming probably also plays a part in the increased intensification, Leung says. Next, her team wants to try to untangle how much of the change comes from global warming and how much from natural patterns such as the AMO.

“Knowing why what happened happened is crucial to understanding what’s going to happen next,” says Gabriel Vecchi, a climate scientist at Princeton University in New Jersey. “If it’s connected to global warming, we can expect it to go forward, and that’s something we need to keep an eye on.”

Leung says she expects this to be the case. “I wouldn’t say that next year we would have another 2017,” she says. “But the likelihood of this is increasing over time.”

U.S. solar installations to fall more than expected in 2017

Nichola Groom, December 14, 2017

(Reuters) - U.S. solar installations will fall more than expected this year, according to an industry report released on Thursday, due to weakened demand for residential systems and delays on large projects over concerns that President Donald Trump will impose tariffs on imported panels, increasing costs.

In a quarterly report, GTM Research said it expected U.S. installations to fall 22 percent to 11.8 gigawatts (GW) this year, down from a prior forecast that called for a decline of 17 percent. GTM conducted the analysis for the Solar Energy Industries Association trade group.

In the third quarter, market installations fell 51 percent to 2.03 GW. Last year was a particularly strong year for installations as developers raced to capture a federal tax credit that had been scheduled to expire at the end of 2016. The solar market is also expected to fall next year but will resume growth in 2019, GTM said.

Many utility-scale project developers are delaying projects due to tight solar panel supplies, the report said. Uncertainty regarding a trade case brought by U.S. manufacturer Suniva has raised the price of panels and prompted some developers to buy in advance of their needs, GTM said.

Solar module prices averaged 45 cents per watt in the third quarter, up from 40 cents in the prior quarter.

Trump is expected to make a final decision on the trade case next month. While Suniva wants less foreign competition, developers rely on low-priced imports to compete with fossil fuels and have argued against tariffs.

GTM cut its 2017 utility-scale forecast to 7.6 GW from 8.1 GW but lifted its view of the market for 2018 slightly to 6.6 GW from 6.5 GW.

The industry’s residential sector dropped 18 percent from last year because of weakness in big markets like California. The residential market has suffered this year due to a pullback by top installer SolarCity after its acquisition by Tesla Inc last year. Residential solar installations are expected to be down 13 percent for the year, the market’s first-ever annual decline.

Bucking the downward trend was the commercial sector. GTM’s non-residential category, which includes projects for businesses and small communities, rose 22 percent from the third quarter of 2016 thanks to favorable policies in California, Massachusetts and New York. Installations also rose in Minnesota due to Xcel Energy Inc’s community solar program.

"Tax Bill Largely Preserves Incentives for Wind and Solar Power"

"WASHINGTON — The final text of the Republican tax bill made public Friday largely preserves key tax credits for wind and solar power and electric vehicles, reversing language in earlier versions that could have slowed the growth of renewable energy across the United States.

The last-minute changes, made as lawmakers reconciled the House and Senate versions of the tax legislation, reflect the growing political clout of the wind and solar industries, which now provide more than 7 percent of the nation’s electricity and are two of the fastest-growing energy sources.

“As wind and solar projects have soared in the U.S., in both red and blue states, so has the industry’s influence in Washington, D.C., on both sides of the aisle,” said Dan W. Reicher, director of the Center for Energy Policy and Finance at Stanford."

6 companies ordered to continue cleanup of USS Lead site at estimated cost of $26.25 million

Sarah Reese sarah.reese@nwi.com, 219-933-3351, Dec 18, 2017

EPA works to address concerns as it begins excavation

EAST CHICAGO — The U.S. Environmental Protection Agency said Monday it issued two unilateral administrative orders to six companies to clean up soil and indoor dust at the USS Lead Superfund site.

The soil cleanup ordered for zone 2 — the second of three residential cleanup areas at the Superfund site — could cost the potentially responsible parties $24 million, EPA said. The projected cost to complete indoor dust cleanup in zones 2 and 3 was $2.25 million.

EPA issues unilateral orders under several circumstances, including when potentially responsible parties don’t agree to perform cleanup work through a judicial consent decree or an administrative order of consent. Unilateral orders also are issued if potentially responsible parties refuse to perform work they previously agreed to under a settlement agreement.

EPA did not immediately respond to questions about circumstances leading up to the unilateral orders for the USS Lead site.

The potentially responsible parties include USS Lead, Atlantic Richfield Co., E.I. du Pont de Nemours and Co., The Chemours Co., U.S. Metals Refining Co. and Mueller Industries. The companies are liable for implementing all activities and will be responsible for ensuring the work performed by their contractors and subcontractors meets all requirements, according to the orders.

Atlantic Richfield is a successor to Anaconda Lead Products Co., International Lead Refining Co. and International Smelting and Refining. Mueller Industries is a successor to U.S. Smelting and Refining and Mining Co., which later changed its name to UV Industries Inc., and Sharon Steel Corp.

EPA completed soil cleanups at 109 properties in zone 2 this year and cleaned indoor dust at more than 64 homes with funds from a $16 million March 2017 administrative settlement agreement with Atlantic Richfield, DuPont, Chemours and Metals Refining. The agreement expires early next year.

The companies have five days from Dec. 14 to request a conference with EPA, after which time the orders become effective. The companies also must submit a written notice of their intent to comply on or before the effective date, the orders say.

Residents remain concerned

The new administrative orders direct the companies to continue soil cleanup in zone 2 and indoor dust cleanup in zones 2 and 3, EPA said.

"The soil cleanup in zone 2 is expected to satisfy all of the requirements outlined in EPA's 2012 Record of Decision," according to a news release. "EPA will also ensure that the potentially responsible parties follow agency protocol on interior dust sampling and cleanup."

If EPA determines additional cleanup work is necessary to protect public health, welfare or the environment, the agency can modify or issue new orders.

Debbie Chizewer, an attorney with the Environmental Law Clinic at Northwestern University's Pritzker School of Law, said she is encouraged that EPA is pursuing additional soil and indoor dust cleanups.

“Residents remain concerned that even if the cleanup is executed as promised, their properties and homes will not be safe,” she said.

The current plan omits indoor dust cleanups in hundreds of homes, and the standards EPA uses to determine which homes receive dust cleanups remain a concern, she said.

EPA last week said the estimated cost for soil cleanup in zones 2 and 3 has nearly quadrupled from $22.8 million in 2012 to $84.9 million, according to a proposed explanation of significant differences document.

The federal agency plans to continue soil cleanups in zone 3 under a $26 million consent decree reached in 2014. The consent decree covered soil cleanups in zones 1 and 3.

Comment periods still open

The $84.9 million in estimated costs do not include indoor dust cleanup or excavation work in zone 1, the site of the now-empty West Calumet Housing Complex.

Excavation work in zone 1 has been put on hold as the East Chicago Housing Authority prepares to demolish the complex and EPA continues work on a revised feasibility study.

EPA said the remedy, originally selected in a 2012 Record of Decision, for zone 1 will change along with the estimated cost. Therefore, the agency plans to document the changes in a Record of Decision amendment.

EPA started a 60-day comment period Monday on the estimated cost increase. The agency also plans to schedule a public meeting next month.

The agency also is accepting public comments on a proposed $22.6 million plan to clean up and mitigate exposures at the former DuPont industrial site, which is southeast of zones 2 and 3.

Several residents have been fighting for a say in court on the 2014 consent decree for the USS Lead site. Oral arguments on their motion to intervene are scheduled for 1 p.m. Jan. 16 before U.S. District Judge Philip Simon.

EPA Administrator Scott Pruitt on Dec. 8 named the USS Lead site to a list of 21 Superfund sites in the U.S. to be targeted for "immediate and intense attention."

Will China’s crackdown on ‘foreign garbage’ force wealthy countries to recycle more of their own waste?

December 13, 2017, 6:25 a.m. EST

Author, Kate O'Neill

Associate Professor, Global Environmental Politics, University of California, Berkeley


With holidays approaching, many of us are mindful of the need to collect and recycle all the additional plastic, paper and other waste that we are about to generate. This year, however, there are questions about where that waste will end up. China, the world’s largest importer of scrap, is looking to clean up its act.

In July 2017 China, which is by far the world’s largest importer and recycler of scrap metals, plastic and paper, notified the World Trade Organization that it planned to effectively ban imports of 24 types of scrap, which its environment ministry called “foreign garbage,” by the end of the year. Immediately, organizations such as the U.S.-based Institute of Scrap Recycling Industries and the Bureau of International Recycling warned that China’s action would cause job losses, shut down many U.S. recycling facilities and send more waste to landfills.

These worries are not unfounded. Global recycling markets are easily prone to disruption, and developed countries have underinvested in recycling infrastructure for years. Beijing has delayed implementation by a few months and eased its stringent new contamination limit, but its shift continues to send shock waves through the industry. Waste Dive, the must-read daily bulletin of waste-related news, named the initiative “Disruptor of the Year.” China’s action could reshape an overlooked but critical segment of the global economy: the cross-border flows of scrap that underpin recycling markets worldwide.

A 2017 study projected that if current global use patterns and waste management trends continue, by 2050 the world will have recycled 9 billion metric tons of plastic waste, incinerated 12 billion metric tons and discarded 12 billion metric tons in landfills or the natural environment. Geyer et al., Science Advances, July 19, 2017, CC BY-NC

Scrap exports to China took off in the early 2000s following the lifting of broader trade restrictions. In 2012 China received nearly half of all the plastic waste that Americans sent abroad for recycling and about one-third of the European Union’s plastic waste exports. According to one 2014 study, China received 56 percent by weight of global scrap plastic exports.

This trade makes economic sense all around. Shipping is cheap: Cargo ships carry goods from China to Western countries and carry scrap back, a process known as reverse haulage. China’s booming industries are located near major ports and hungry for plastics they do not yet produce at home, so they willingly pay for high-quality imported scrap to reuse. For U.S.-based waste collectors, selling scrap to a broker to be shipped to China is cheaper than sending it to recycling facilities at home.

Plastic scrap is especially problematic. It has low economic value and is hard to recycle. It also breaks down extremely slowly in the environment, as evidenced by the buildup of plastic debris in the world’s oceans. Few are aware that up to half of the plastic waste we throw into recycling bins in Berkeley, New York or Omaha has wound up on container ships to China.

Chart: data from the UN Comtrade Database; color indicates sum of value in U.S. dollars, size indicates sum of weight in kilograms. Kate O'Neill, CC BY-ND

Information about the fate of plastic scrap once it gets to China is sketchy, and available statistics are inconsistent. China’s plastics recycling rate in 2013 was about 22 percent – far higher than that of the United States, which averages about 9 percent annually. This figure, representing around 13.6 million metric tons, includes both international and domestic scrap.

Still, this means that much of the scrap plastic shipped to China is not recycled, or is recycled under hazardous conditions. Nongovernment organizations and other observers have expressed concern about how much of this imported scrap – especially if it is contaminated or low-quality – is either diverted to sub-par incinerators for energy recovery or winds up in the oceans.

China demands quality control

To be recycled, bales of scrap should be clean, contaminant-free and sorted. Beijing has already cracked down twice on contaminated plastic and paper scrap.

In a 2013 initiative called Operation Green Fence, China sharply increased inspections of imported bales, shipping back substandard scrap at exporters’ expense and forcing them to pay more attention to quality. Almost immediately, shippers began diverting scrap to other ports for cleaning or possible disposal. Vietnam and Malaysia saw sharp spikes in plastic scrap imports. In March 2017 China launched Operation National Sword, further increasing inspections of incoming shipments, then followed with its WTO filing in July.

Chinese leaders have very real concerns over the nation’s environmental crisis and its high-profile image as the “world’s dumpsite.” Noted filmmaker Wang Jiuliang spotlighted the scrap issue in an award-winning 2016 documentary, “Plastic China,” which focuses on an unschooled 11-year-old girl who lives and works with her family in a plastic recycling workshop. The film went viral online in China after its release, then was quickly deleted from China’s internet.

Beijing is working to replace China’s informal recycling sector with cleaner, high-tech “eco-industrial parks.” However, local authorities around entry ports strongly opposed Green Fence, which cut into local businesses’ revenues, and are likely to resist the scrap ban. China’s struggle to police clandestine imports of electronic waste suggests that it will also have trouble shutting out smuggled trash.

Some observers, such as journalist Adam Minter, think the scrap restrictions could backfire. In their view, China’s high recycling rates – up to 70 percent for scrap paper – avert deforestation, mineral extraction and fossil fuel use. Domestically produced scrap is generally of far lower quality than the “foreign garbage” that China imports, and is likely to be more polluting.

Will source nations step up?

Since July, Beijing has delayed the start date for the scrap restrictions to March 2018 and raised the maximum contamination level for plastics and other scrap from 0.3 percent to 0.5 percent – still far below normal global trade standards. Under Green Fence, authorities allowed up to 1.5 percent contamination.

Although the global scrap industry is fighting back, China’s actions are forcing industrialized nations to rethink their dependence on overseas disposal. In its 2017 infrastructure report card, the American Society of Civil Engineers criticized the U.S. solid waste industry for failing to innovate and improve recycling rates.

The United States has not built a new high-quality plastics recycling facility since 2003, and very few of its existing plants can cost-effectively process harder-to-recycle, often dirty post-consumer plastics. Europe recycles 30 percent of its plastics, compared to 9 percent in the United States, but the majority of waste plastic still winds up in landfills and in the oceans. Moves are already underway to improve U.S. capacity, but they will take years to implement.

Ultimately, recycling does not work because of technology, values or good intentions alone. It requires strong and stable markets for scrap and recycled goods. If China closes its scrap market, nations will divert plastics to other jurisdictions that are even less equipped to recycle and utilize them, which will send more plastics into landfills and dumps.

Mass production of bioplastics is a long-term solution, but is probably years off. For now, entities such as the Closed Loop Fund, which supports research on technologies and initiatives to build a circular economy, are working to scale up recycling infrastructure and capacity in the United States. Other priorities include expanding markets for recycled products and improving consumer education. In my view, the prospect of losing China as a consumer of Western scrap could and should finally spur industrialized nations to take more responsibility for the waste they generate.

As Greenland Melts, Where’s the Water Going?

BY HENRY FOUNTAIN AND DEREK WATKINS DEC. 5, 2017

Each year, Greenland loses 270 billion tons of ice as the planet warms. New research shows that some of the meltwater may be trapped in the ice sheet, which could change how scientists think about global sea levels.

On Greenland’s vast ice sheet, meltwater flows in rivulets as sunlight and warm air heat the surface ice. Surface variations create thousands of low-lying areas where meltwater collects and eventually drains, and the rivulets feed larger rivers that carve natural paths across the ice sheet. Over years, impurities in the ice become concentrated, making it darker; those areas absorb more sunlight, causing more melting, and beneath the surface the ice develops holes, like Swiss cheese. One such river reached a large depression in the ice and formed a lake, which in turn fed another fast-moving river on the opposite side. That river disappeared down a hole in the ice, called a moulin.

Sea level rise depends on how much water drains from the ice sheet into the ocean, and how fast. New research suggests that more of the water may be trapped inside the ice sheet than previously thought.

In the summer of 2015, two New York Times journalists joined a team of researchers in Greenland that was conducting a unique experiment: directly measuring a river of meltwater runoff on the top of the ice.

Now, the scientists have published the results of that work. A key finding — that not as much meltwater flows immediately through the ice sheet and drains to the ocean as previously estimated — may have implications for sea-level rise, one of the major effects of climate change.

The scientists say it appears that some of the meltwater is retained in porous ice instead of flowing to the bottom of the ice sheet and out to sea.

“It’s always treated as a parking lot, water runs straight off,” said Laurence C. Smith, a geographer at the University of California, Los Angeles, who led the field work in 2015. “What we found is that it appears there is water retention.”

“It’s plausible that this is quite an important process, which could render sea-level projections too high,” he added.

There’s still much that remains unknown about the ice sheet, which at roughly 650,000 square miles is more than twice the size of Texas. The sheet, up to two miles thick, contains enough ice to raise oceans around the world by 24 feet if it all melted. Precisely how the ice melts — half or more by warming on the surface, the rest by ice sheet movement toward the sea, where it melts or calves off as icebergs — can greatly affect how much and how fast the seas rise.

Greenland is currently losing an average of about 260 billion tons of ice per year; at this rate, it would contribute about two inches to sea level rise by the end of the century. This ice loss is estimated through gravity measurements by satellites, but computer models that simulate physical processes are used to estimate the surface runoff. The field study was meant to improve those models by providing on-the-ground data on the flow of meltwater.
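As a rough check on that two-inch figure, the arithmetic can be sketched in a few lines. The 260-billion-ton loss rate comes from the article; the conversion of ice mass to sea level (roughly 360 billion tons of melted ice per millimeter of global sea level rise) is a separate, commonly used approximation and should be treated here as an assumption.

```python
# Back-of-the-envelope check: ~260 billion tons of ice loss per year,
# continued to 2100, versus "about two inches" of sea level rise.
# Assumption (not from the article): ~360 billion tons of melted ice
# raise global mean sea level by about 1 millimeter.

ICE_LOSS_GT_PER_YEAR = 260        # billion metric tons per year, from the article
GT_PER_MM_SEA_LEVEL = 360         # assumed conversion factor
YEARS_REMAINING = 2100 - 2017     # roughly "the end of the century"

rise_mm = ICE_LOSS_GT_PER_YEAR / GT_PER_MM_SEA_LEVEL * YEARS_REMAINING
rise_inches = rise_mm / 25.4

print(f"~{rise_mm:.0f} mm, or about {rise_inches:.1f} inches by 2100")
# Roughly 60 mm (~2.4 inches), consistent with the article's estimate
# if today's loss rate simply continued unchanged.
```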

The work involved setting up camp near a glacial river that drains meltwater from a surrounding catchment area — in this case about 24 square miles of the ice sheet — into a moulin, a hole that drains to the base of the ice sheet. The researchers suspended a device in the river that uses acoustic signals to gauge the flow, and used satellite and drone images to precisely calculate the catchment area.

The flow data, collected over 72 hours, showed that current models are overestimating the amount of runoff by 20 percent to nearly 60 percent. Were the models wrong? Or were they right about the total amount of melting, but some of the water was not running off?

When he first sent the results to modelers, Dr. Smith said, “they couldn’t believe it.” After months of back-and-forth, Dr. Smith and his colleagues concluded that the model estimates were accurate, but there was something else going on with some of the meltwater. “What is missing,” he said, “is a physical process that is not currently considered by the models — water retention in ice.”

Sunlight hitting the ice sheet melts the surface, Dr. Smith said, but some of the light reaches deeper into the ice and causes some melting there. The ice develops a rotted, porous texture — and can, the findings suggest, hold on to some of the meltwater.

Marco Tedesco, a modeler at the Lamont-Doherty Earth Observatory, part of Columbia University, and a co-author of the paper on the work in The Proceedings of the National Academy of Sciences, said that when comparing models with actual field measurements, “you always look at the energy involved.”

“If there’s a mismatch between observation and model,” Dr. Tedesco said, “that means the model is moving the mass in one way or another and not respecting the way things happen in the real world.”

Further work on the ice sheet will try to directly measure how much water is retained in the ice. More direct flow measurements are needed as well, as conditions can vary greatly over the huge icecap. (As the original Times article revealed, gauging stream flow on an ice sheet is far more complicated and dangerous than doing it on land.)

But surface runoff is only one area where much remains to be learned. What happens at the bottom of the ice when meltwater reaches it is understood largely through modeling, and those processes can affect how fast the ice sheet moves — and thus how much ice calves off into the ocean.

Among other aspects that scientists are studying is bioalbedo, which is how the growth of microorganisms in the ice can darken the surface and affect how fast it melts.

Thomas P. Wagner, a program scientist at NASA who directs the agency’s efforts to study polar regions, said that studies like Dr. Smith’s — and research on the far larger Antarctic ice sheet — are critical to understanding how climate change will affect the globe.

“Sea level rising is 3 millimeters a year right now,” Dr. Wagner said. “But in a hundred years, we could have one to five feet of rise.”

“Every day we are improving our understanding of how ice is getting into the ocean,” he added. “The overall context is, we’re on this.”

Why the Prospect of ‘Peak Oil' Is Hotly Debated

By Jess Shankleman and Hayley Warren

December 21, 2017, 7:01 PM EST From Bloomberg News

The world is turning its back on oil. But how quickly? Technological advances driven by the threat of climate change could mean the world’s thirst for petroleum tops out sooner than companies such as Exxon Mobil Corp. or giant producers like Saudi Arabia are banking on. “Peak oil” has been debated for decades, but today it means something very different.

1. Why is oil’s future in doubt?

About 60 percent of oil is used in transportation, which is also where the biggest technological changes are emerging. The impact of electric carmakers such as Tesla Inc. could be turbocharged by advances in related fields such as self-driving vehicles and ride-hailing apps, which make it possible for people to switch from owning cars to relying on rides from more efficient fleets. The culmination of these trends could transform how people travel and prompt more revisions to forecasts for when oil consumption will peak.

2. Is the world running out of oil?

No. The peak oil that’s talked about today is quite different from the concept that emerged in the 1950s, when M. King Hubbert, a Royal Dutch Shell Plc geologist, predicted that U.S. oil production would crest in the 1970s and the world would physically run out of oil. That never happened, and new discoveries and efficiency gains at existing fields mean oil supplies will abound for a long time to come. So the discussion has shifted to peak demand — whether people will simply use less petroleum and reserves that are considered valuable assets today will wind up being left in the ground.

3. What do experts say about that?

Forecasts for long-term oil demand have been coming down. The International Energy Agency, which advises rich countries on policies, has been steadily revising its predictions lower over the past 20 years as renewable energy has taken off and more power utilities have switched to cleaner-burning natural gas. Solar power, for example, has picked up steam as its cost tumbled much faster than expected, with prices falling 50 percent since 2009. That’s already upending the business model of utilities, which were designed to deliver fossil-fuel energy from large power plants to homes and businesses.

4. Is oil usage still a threat to global warming?

Yes. To limit global warming to “well below” 2 degrees Celsius (3.6 degrees Fahrenheit) — the target set by the United Nations-sponsored climate change treaty — the IEA predicted in 2016 that demand for oil would need to peak in the next few years. That’s unlikely to happen: The agency’s main demand scenario still sees oil use expanding for the next two decades.

5. So when will demand for oil peak?

There’s a range of about 25 years between the earliest and latest predictions. The most aggressive ones are based on the rapid expansion of electric vehicles, energy efficiency improvements and policy changes to curb greenhouse gas pollution. That scenario leads Statoil ASA and some forecasters to predict that oil demand could peak as soon as the late 2020s. Ben van Beurden, Shell’s chief executive, has said that if electric cars become really popular, the zenith could arrive in the next 15 years.

6. Do all oil companies have the same view?

No. Most oil companies see a peak around 2040. Others say their industry will enjoy decades of growth as it feeds the energy needs of the world’s expanding middle class. Saudi Arabia and Russia, the world’s two largest exporters, don’t foresee a top until 2050 at the earliest. Even when production does peak, it’s likely to lead to a plateau rather than a steep fall.

7. Why such differences in forecasting?

There’s a lively debate about how rapidly electric vehicles will catch on. Falling costs for batteries could make them as affordable as internal combustion engine cars over the next 10 years, according to Bloomberg New Energy Finance. Meanwhile, some of the world’s largest auto markets plan to phase out vehicles powered by fossil fuels to clean up dirty air. By 2030, India wants all new vehicles sold to be electric; the U.K. and France will ban the sale of diesel- and gasoline-fueled cars by 2040. The impact of the U.S. is a wild card because President Donald Trump is disrupting efforts to tackle global warming.

8. What will become of oil companies?

Many are speeding up efforts to diversify, investing more in natural gas and cleaner technologies such as hydrogen fuel cells. Even now these companies are doing huge business turning crude into chemicals used for everything from plastics to fertilizer. Still, peak oil has to be a concern, since it can take a decade or more for multibillion-dollar oil exploration projects to come to fruition.

9. What will happen to car companies?

They are furiously preparing for the shift. Volvo said all its new models will include electric drive by 2019, while Volkswagen wants 25 percent of its sales to be electric by 2025. Daimler and BMW are aiming for 15 percent to 25 percent by 2025.

10. What happens to countries that depend on oil revenue?

That’s a big unknown. Peak oil could cause political turmoil in so-called petro-states, which rely on oil revenue to keep government finances afloat. Saudi Arabia plans a partial privatization of its state oil company, Aramco, to raise funds to diversify its economy for the post-hydrocarbon age. Other petro-states, such as Russia, Venezuela and Nigeria, have yet to lay out plans for the future.

327 toxic Superfund sites in climate change, flooding bulls-eyes

AP , December 22, 2017, 5:46 AM

TARPON SPRINGS, Fla. -- Anthony Stansbury propped his rusty bike against a live oak tree and cast his fishing line into the rushing waters of Florida's Anclote River.

When he bought a house down the street last year, Stansbury says he wasn't told that his slice of paradise had a hidden problem. The neighborhood is adjacent to the Stauffer Chemical Co. Superfund site, a former chemical manufacturing plant that is on the list of the nation's most polluted places. That 130-acre lot on the river's edge is also located in a flood zone.

"Me and my kids fish here a couple times a week. Everyone who lives on this coast right here, they fish on this water daily," said the 39-year-old father of three.

Stansbury is among nearly 2 million people in the U.S. who live within a mile of 327 Superfund sites in areas prone to flooding or vulnerable to sea-level rise caused by climate change, according to an Associated Press analysis of flood zone maps, census data and U.S. Environmental Protection Agency records.

This year's historic hurricane season exposed a little-known public health threat: Highly polluted sites that can be inundated by floodwaters, potentially spreading toxic contamination.

In Houston, more than a dozen Superfund sites were flooded by Hurricane Harvey, with breaches reported at two. In the Southeast and Puerto Rico, Superfund sites were battered by driving rains and winds from Irma and Maria.

The vulnerable sites highlighted by the AP's review are scattered across the nation, but Florida, New Jersey and California have the most, and the most people living near them. They are in largely low-income, heavily minority neighborhoods, the data show.

Many of the 327 sites have had at least some work done to help mitigate the threat to public health, including fencing them off and covering them in plastic sheeting to help keep out rain water.

The Obama administration assessed some of these at-risk places and planned to harden them against harsher weather and rising seas. EPA's 2014 Climate Adaptation Plan said prolonged flooding at low-lying Superfund sites could cause extensive erosion, carrying away contaminants as waters recede.

President Trump, however, has called climate change a hoax, and his administration has worked to remove references from federal reports and websites linking carbon emissions to the warming planet.

"Site managers had started reviewing climate and environmental trends for each Superfund site, including the potential for flooding," said Phyllis Anderson, who worked for 30 years as an EPA attorney and associate director of the division that manages Superfund cleanups until her retirement in 2013. "The current administration appears to be trying to erase these efforts in their climate change denials, which is a shame."

EPA Administrator Scott Pruitt has said he intends to focus on cleaning up Superfund sites, and he appointed a task force that developed a list of sites considered the highest priority. The Stauffer site in Florida is not on it.

Like Mr. Trump, Pruitt rejects the consensus of climate scientists that man-made carbon emissions are driving global warming. His task force's 34-page report makes no mention of the flood risk to Superfund sites from stronger storms or rising seas, but eight of the 21 sites on EPA's priority list are in areas of flood risk.

Despite EPA's announced emphasis on expediting cleanups, the Trump administration's proposed spending plan for the current 2018 fiscal year seeks to slash Superfund program funding by nearly one-third. Congress has not yet approved new spending plans for the fiscal year, which began Oct. 1.

Pruitt's office declined to comment this week on the key findings of AP's analysis or why the agency appears to no longer recognize an increasing flood risk to toxic sites posed by the changing climate.

However, Jahan Wilcox, an EPA spokesman, said, "Despite fear-mongering from the Associated Press, not a single dollar has actually been eliminated, as Congress still hasn't passed a budget."

Many flood-prone Superfund sites identified through AP's analysis are located in low-lying, densely populated urban areas. In New Jersey, several polluted sites have more than 50,000 people living within one mile.

In Hoboken, across the Hudson River from New York City, the site of a former manufacturing plant for mercury vapor lamps sits within a mile of almost 100,000 residents, including 7,000 children under 5.

The Martin Aaron Inc. Superfund site is in the heart of Camden's Waterfront South, a low-income neighborhood of crumbling row houses and industrial facilities stretching along the Delaware River.

The 2.5-acre lot, which takes up most of a city block, has been home to a succession of factories dating back to 1886 that included a leather tannery. The air around the fenced site hangs heavy with the nose-stinging odor of solvents. Testing found that soil and groundwater under the site contained a witch's brew of highly toxic chemicals, including PCBs and pesticides.

Earlier this month, workers used heavy machinery to remove contaminated soil and to pump polluted water from deep underground. Long-range plans approved by EPA call for eventually covering the land and restricting its future use.

Just around the corner, Mark Skinner and his niece Cherise Skinner pushed her 1-year-old son in a stroller in front of their rented row house. Mark Skinner shrugged when asked about the work at the former industrial site.

"It's really contaminated, there's a lot of stuff in the ground, but I don't know what all it is," said Skinner, 53, who works at a nearby scrap metal yard and has lived in Waterfront South since he was a teenager.

Foul-smelling water filled the streets there during Superstorm Sandy in 2012, flooding many basements, long-time residents said. Census data show about 17,250 people live within a mile of the Martin Aaron site - 65 percent are black and 36 percent are Latino.

Across the nation, more than 800,000 homes are located near flood-prone toxic sites. Houses are at risk of contamination if intense flooding brings water into them, and many more people could be affected if the contamination seeps into the ground, finding its way into drinking water.

Mustafa Ali, who resigned in March as EPA's senior adviser and assistant associate administrator for environmental justice, said it's no accident that many of the nation's most polluted sites are also located in some of the poorest neighborhoods.

"We place the things that are most dangerous in sacrifice zones, which in many instances are communities of color where we haven't placed as much value on their lives," said Ali, who worked at EPA for 24 years.

The Stauffer site in Florida is a scrubby green field along the Anclote River, ringed on its other three sides by chain-link fences with "No Trespassing" signs. Testing showed the 130-acre lot's soils were contaminated with radium, the long-banned pesticide DDT, arsenic, lead and other pollutants that over the years have fouled the area's groundwater and the river.

Environmental regulators say the site now poses no threat to people or the environment because the current owner, the pharmaceutical company AstraZeneca, paid to treat contaminated soils, and cover the pollution with a "cap" of clean earth and grass. Still, residential development and use of groundwater on the site are prohibited because of the legacy pollution.

Covering toxic waste is often a cheaper option than completely removing the pollutants, but the installations are not always as long-lasting as the chemicals buried beneath them, said Jeff Cunningham, a civil engineering professor at the University of South Florida.

"As a long-term strategy, capping only works if the contaminants degrade to safe levels before the capping system eventually fails. What if it takes centuries for some of these contaminants to degrade to safe levels?" Cunningham said.

Damage to a protective cap from storm-fueled flooding has already occurred at least once this year.

In October, the EPA said dioxins from the San Jacinto River Waste Pits Superfund site near Houston were released after the cap was damaged by Harvey-related flooding. Tests afterward measured the toxins at 2,300 times the level that would normally trigger a new cleanup. Pruitt has since ordered an accelerated cleanup of the site.

Seventy-six-year-old Tony Leisner has lived near Florida's Stauffer chemical site all his life. He told the AP he is seeing damage to docks and riverside properties from the ever-rising waters in the neighborhood, and is concerned about what more flooding could mean for the Superfund lot. Although monitoring wells do test local groundwater for contamination from the site, some in Leisner's neighborhood said they're fearful enough to drink only bottled water.

The Anclote River is listed as an "impaired waterway" because it fails to meet state clean water criteria, though how much of that is due to the Stauffer site's legacy is unclear. The state has issued a warning about eating bass out of the river, but there are no signs at the popular fishing spot warning anglers even though tests show heightened levels of mercury in fish.

Leisner said barrels of chemicals at the Stauffer site self-ignited while crews were working. He said he's disappointed neither the company nor EPA removed the pollutants, especially since rising waters are already threatening the neighborhood.

"Burying things rarely helps. And if you've got a chemical that is that toxic ... I think you need to find a way to reuse, recycle and remove (it), to a place where it's not going to contaminate groundwater," he said.

Climate Change Is Driving People From Home. So Why Don’t They Count as Refugees?

By SOMINI SENGUPTA, DEC. 21, 2017

UNITED NATIONS — More than 65 million people are displaced from their homes, the largest number since the Second World War, and nearly 25 million of them are refugees and asylum seekers living outside their own country.

But that number doesn’t include people displaced by climate change.

Under international law, only those who have fled their countries because of war or persecution are entitled to refugee status. People forced to leave home because of climate change, or who leave because climate change has made it harder for them to make a living, don’t qualify.

The law doesn’t offer them much protection at all unless they can show they are fleeing a war zone or face a fear of persecution if they are returned home.

Is a legal definition outdated?

That’s not surprising, perhaps: The treaty that defines the status of refugees was written at the end of World War II.

A research paper, published Thursday in Science magazine, suggests that weather shocks are spurring people to seek asylum in the European Union. The researchers found that over a 15-year period, asylum applications in Europe increased along with “hotter-than-normal temperatures” in the countries where the asylum seekers had come from.

They predict that many more people will seek asylum in Europe as temperatures in their home countries are projected to rise.

The paper, by Anouch Missirian and Wolfram Schlenker, looked at weather patterns in the countries of origin for asylum applicants between 2000 and 2014. It found that “weather shocks on agricultural regions in 103 countries around the globe directly influence emigration” to Europe.

“Part of the flow,” said Dr. Schlenker, a professor at the School of International and Public Affairs at Columbia University and a co-author of the study, “we can explain by what happens to the weather in the source country.”

Why isn’t anyone proposing a new law?

For starters, refugee advocates fear that if the 1951 refugee treaty were opened for renegotiation, politicians in various countries would try to weaken the protections that exist now. That includes the Trump administration, which has barred people from eight countries — including refugees from war-torn Syria and Yemen — from coming into the country altogether.

A group of academics and advocates has spent the last two years proposing an entirely new treaty, with new categories to cover those who are forcibly displaced, including by the ravages of climate change. Michael W. Doyle, a Columbia professor leading the effort to draft a new treaty, said he didn’t expect a new treaty to be embraced anytime soon, but insisted that those conversations should start as record numbers of people leave their home countries and end up displaced in others, often without legal status.

Trial balloons have been floated

A New Zealand minister recently proposed a special visa category for people displaced by climate change. “One of the options is a special humanitarian visa to allow people who are forced to migrate because of climate change,” the minister, James Shaw, said in an interview from a global climate summit in Bonn, Germany, in November.

Mr. Shaw has said nothing since then about when legislation might be proposed, and it’s far from clear whether it would pass.

Several countries have offered humanitarian visas in the aftermath of calamitous natural disasters, as the United States did after hurricanes and earthquakes, including the 2010 earthquake in Haiti. (The Trump administration ended that so-called protected status for Haitians in November.)

There’s a bigger problem

As Elizabeth Ferris, a professor at Georgetown University, points out, most people whose lands and livelihoods are ravaged by either natural disaster or the slow burn of climate change aren’t likely to leave their countries. Many more will move somewhere else within their own country — from the countryside to cities, for instance, or from low-lying areas prone to flooding to higher elevation.

Indeed, natural disasters displaced an estimated 24 million people within the borders of their own countries in 2016, according to the latest report by the Internal Displacement Monitoring Center.

Louisiana, Sinking Fast, Prepares to Empty Out Its Coastal Plain

By Christopher Flavelle, December 22, 2017, 4:00 AM EST

State weighs buyouts, prohibiting new development, tax hikes

Policy could become template for climate adaptation nationwide

Louisiana is finalizing a plan to move thousands of people from areas threatened by the rising Gulf of Mexico, effectively declaring uninhabitable a coastal area larger than Delaware.

A draft of the plan, the most aggressive response to climate-linked flooding in the U.S., calls for prohibitions on building new homes in high-risk areas, buyouts of homeowners who live there now and hikes in taxes on those who won’t leave. Commercial development would still be allowed, but developers would need to put up bonds to pay for those buildings’ eventual demolition.

“Not everybody is going to live where they are now and continue their way of life,” said Mathew Sanders, the state official in charge of the program, which has the backing of Governor John Bel Edwards. “And that is an emotional, and terrible, reality to face.”

Months of community meetings on the program wrapped up this week.

The draft plan, a portion of which was obtained by Bloomberg News, is part of a state initiative funded by the federal government to help Louisiana plan for the effects of coastal erosion. That erosion is happening faster in Louisiana than anywhere in the U.S., due to a mix of rising seas and sinking land caused in part by oil and gas extraction. State officials say they hope the program, called Louisiana Strategic Adaptations for Future Environments, or LA SAFE, becomes a model for coastal areas around the country and the world threatened by climate change.

While the state hasn’t come up with a cost estimate, the buyouts and resettlement could add up to billions of dollars. The federal grant for the initial phase was $40 million.

The idea hasn’t gone over well with all the people it’s supposed to help, some of whom want the government to do more to protect their communities instead of abandoning them.

“Are we doing every single damn thing we can? I don’t think we are,” Richie Blink, 31, said over a bowl of gumbo in Empire, a town 60 miles south of New Orleans on the bank of the Mississippi River. He paused, then said he didn’t mean to get worked up. “This stuff wears you out emotionally.”

Empire lost half its population after Hurricane Katrina, and now has fewer than 1,000 people. Blink, a community organizer for the National Wildlife Federation, said he understands the dilemma political leaders face, but wishes they would do more to keep the area habitable longer.

Empire’s harbor has a flood gate to protect the boats inside from extreme weather. “When I was a kid, it was a big deal to see the flood gates closed,” said Blink. This year, he said, those gates were closed for 100 days.

Sanders is working to complete the plan by early next year, at which point it will be up to federal, state and local officials to decide if they will implement it. Edwards, the Democratic governor, announced his support for the program in March. If he backs its recommendations, the state could create a buyout program or eliminate the homestead exemption for homes in high-risk areas, which would mean higher property taxes for many residents.

In a statement, the governor’s office said he is “following the progress being made by the LA SAFE team intently and looking for ways to build upon their success as we determine the next steps for the program.”

Rob Moore, a flood policy expert at the Natural Resources Defense Council in Chicago, said that if the state goes ahead with the plan, “then every coastal state in the country should be asking themselves, ‘If Louisiana can do this, why aren’t we?’”

The LA SAFE program defines as “high risk” land where, five decades from now, the expected depth of a 100-year flood will reach more than six feet. According to data provided by state officials, 94 percent of the land in Plaquemines Parish, which includes Empire, falls into that category.

A state map shows areas of southern Louisiana at the greatest risk of flooding. (Source: Louisiana state map)

Across the six coastal parishes covered by the LA SAFE program, more than 59,000 people live in those high-risk areas. The proposed buyouts and restrictions on future development would only apply to the parts of that land that aren’t protected by levees, and the state isn’t sure how many that may be.

Another town on the wrong side of the state’s risk map is Leeville, a cluster of houses and trailer homes west of Empire that state officials say will soon be underwater. On a recent afternoon, Opie Griffin stood on the front deck of his family’s gas station and restaurant and wondered what he’s supposed to do then.

“There’s no way they can protect this,” Griffin said, sweeping his hand out over what’s left of the town, the bayou seeping in from all sides. “I see more and more water every year." He put the store up on stilts, perched 16 feet in the air, after Hurricane Gustav blew through in 2008. Now his parking lot floods nearly every day. “Nothing I could do, except come to work on a boat.”

However untenable Leeville becomes, Griffin is more worried about what’s waiting for him somewhere else. “I’m almost forty. Do I want to start a whole new chapter in my life?” he asked. “Where do you want me to go?”

The possibility of backlash makes it hard to predict whether the state’s recommendations will take effect. Other ideas in the plan would require cooperation from different levels of government.

The report calls for adding Louisiana’s high-risk communities to what is called the Coastal Barrier Resource System, a federal designation that prevents homes and businesses there from getting most types of federal assistance. That change would require federal legislation, and President Donald Trump’s signature.

Perhaps the biggest challenge will be local governments, which have the most authority over land-use and zoning decisions. Sanders said his office’s strategy was to demonstrate to local officials that the plan’s recommendations have the support of the people who would be affected. The state, working with a nonprofit called the Foundation for Louisiana, held some 60 public events, which together drew more than 3,500 people.

The meetings presented the science on sea-level rise and future flood risks so that people could understand what it meant for where they live, according to Liz Williams, who runs the foundation’s coastal resilience program.

An LA SAFE Asian community meeting. (Photographer: Derick E. Hingle/Bloomberg)

According to Sanders, the lesson from those months of meetings is that the people most at risk along Louisiana’s coast know they’re in danger, and they want more than just assurances that life can go on as normal.

“‘They will string you from a tree, burn the room down, throw rotten vegetables at you, slash the tires of your car,’” Sanders recalls being warned before he started talking to residents about the idea of emptying out their neighborhoods. “None of that has happened. People appreciate honesty.”

One of them is Ly Chan.

Two years ago, Chan moved from Richmond, Virginia, to Buras, a town south of Empire, after she and her husband lost their jobs. The couple had a friend in Buras who owned a shrimp boat; he told them there was work. They bought a single-wide trailer and the land underneath it for $50,000 and planted banana, mango and papaya trees. It was almost paradise.

But the drawbacks of living along the Louisiana coast soon caught up to them. The area around their home floods constantly; in October, Chan, her husband and their teenage son had to evacuate north ahead of Hurricane Nate. Chan, who has worked with the LA SAFE program as a translator for her Cambodian neighbors, said that if the state offered her a buyout she would take it.

“It’s not just going to be one time – it’s going to be again and again,” Chan said of storms like Nate. “God gives you wisdom. You have to use it.”

Minnesota joins U.S. states limiting controversial farm chemical

Tom Polansek

CHICAGO (Reuters) - Minnesota became the latest U.S. state on Tuesday to restrict controversial weed killers made by Monsanto Co and BASF SE that were linked to widespread crop damage, while Arkansas took a step back from imposing new limits.

The United States has faced an agricultural crisis this year caused by new versions of the herbicides, which are based on a chemical known as dicamba. Farmers and weed experts say the products harm crops that cannot resist dicamba because the herbicides evaporate and drift away from where they are applied, a process known as volatilization.

Monsanto and BASF say the products are safe when used properly.

Monsanto is banking on its dicamba-based herbicide and soybean seeds engineered to resist it, called Xtend, to dominate soybean production in the United States, the world’s second-largest exporter.

However, Minnesota will prohibit summertime sprayings of dicamba-based herbicides after June 20 in a bid to prevent a repeat of damage seen across the U.S. farm belt this year, according to the state’s agriculture department.

Farmers also will not be able to apply dicamba-based herbicides if temperatures top 85 degrees Fahrenheit because research shows high temperatures increase crop damage from volatilization, according to Minnesota.
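Taken together, the Minnesota restrictions reduce to two conditions on any planned application. A minimal sketch of that check follows; only the June 20 cutoff and the 85-degree limit come from the article, while the function and its inputs are hypothetical.

```python
# Sketch of the two Minnesota restrictions described above: no dicamba
# application after June 20, and none when temperatures top 85 degrees
# Fahrenheit. The cutoff date and temperature are from the article;
# the function itself is purely illustrative.
from datetime import date

CUTOFF_DATE = date(2018, 6, 20)
MAX_TEMP_F = 85.0

def application_allowed(spray_date: date, forecast_high_f: float) -> bool:
    """Return True only if both state restrictions would be satisfied."""
    return spray_date <= CUTOFF_DATE and forecast_high_f <= MAX_TEMP_F

print(application_allowed(date(2018, 6, 10), 78.0))  # True
print(application_allowed(date(2018, 6, 25), 78.0))  # False: past the cutoff
print(application_allowed(date(2018, 6, 10), 90.0))  # False: too hot
```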

“We will be closely monitoring the herbicide’s performance with these restrictions in 2018,” said Dave Frederickson, Minnesota’s agriculture commissioner.

U.S. farmers planted 90 million acres of soybeans this year, and about 4 percent showed signs of damage linked to dicamba, according to University of Missouri data.

Missouri and North Dakota have separately announced deadlines for spraying dicamba-based herbicides next year.

In Arkansas, a legislative panel advised a state plant board to review its proposal to ban the use of dicamba herbicides after April 15, in a win for the agri-chemical companies.

The panel recommended that the board consider science-based evidence, among other factors, in revising the proposal, the state said.

Monsanto believes there is no scientific evidence to support cutoff dates for spraying dicamba herbicides, said Scott Partridge, vice president of global strategy.

“They have essentially hit the pause button here,” Partridge said about Arkansas.

Monsanto, which is being acquired by Bayer AG for $63.5 billion, has sued Arkansas to prevent the state from prohibiting sprayings after April 15.

The company and BASF have said that proposal would hurt Arkansas growers by denying them access to products designed to be sprayed on dicamba-resistant soybeans and cotton during the summer growing season.

Arctic Report Card: Lowest Sea Ice on Record, 2nd Warmest Year

Climate scientists say the magnitude and rate of sea ice loss this century is unprecedented in 1,500 years and issue a warning on the impacts of a changing climate.

BY SABRINA SHANKMAN, DEC 12, 2017

Older Arctic sea ice is being replaced by thinner, younger ice. NOAA reports that multiyear ice accounts for just 21 percent of the ice cover in 2017. Credit: Thomas Newman/CICS-MD/NOAA

The Arctic experienced its second-warmest year on record in 2017, behind only 2016, and not even a cooler summer and fall could help the sea ice rebound, according to the latest Arctic Report Card.

"This year's observations confirm that the Arctic shows no signs of returning to the reliably frozen state that it was in just a decade ago," said Jeremy Mathis, director of the Arctic program at National Oceanic and Atmospheric Administration (NOAA), which publishes the annual scientific assessment.

"These changes will impact all of our lives," Mathis said. "They will mean living with more extreme weather events, paying higher food prices and dealing with the impacts of climate refugees."

The sea ice in the Arctic has been declining this century at rates not seen in at least 1,500 years, and the region continued to warm this year at about twice the global average, according to the report. Temperatures were 1.6° Celsius above the 1981-2010 historical average despite the lack of an El Niño, which brings warmer air to the Arctic, and despite summer and fall temperatures more in line with historical averages.

Among the report's other findings:

When the sea ice hit its maximum extent on March 7, it was the lowest in the satellite record, which goes back to 1979. When sea ice hit its minimum extent in September, it was the eighth lowest on record, thanks in part to the cooler summer temperatures.

Thick, older sea ice continues to be replaced by thin, young ice. NOAA reported that multiyear ice accounts for just 21 percent of the ice cover, compared with 45 percent in 1985.

Sea surface temperatures in the Barents and Chukchi seas in August were up to 4°C warmer than the 1982-2010 average.

Permafrost temperatures in 2016 (the most recent set of complete observations) were among the highest on record.

The report card's findings were announced at the annual conference of the American Geophysical Union, an organization of more than 60,000 Earth and space scientists. The report card is peer-reviewed; 85 scientists from 12 countries contributed to it.

Timothy Gallaudet, a retired Navy admiral who is the acting NOAA administrator, told the audience of scientists that the findings were important for three main reasons. The first reason, he said, was that "unlike Las Vegas, what happens in the Arctic doesn't stay in the Arctic."

The next two reasons, he said, "directly relate to the priorities of this administration": national security and economic security.

"From a national security standpoint, this information is absolutely critical to allow our forces to maintain their advantage," Gallaudet said.

From an economic one, the changes in the Arctic bring challenges—like those faced by Alaskan communities threatened by coastal erosion—but also opportunity. "Our information will help inform both of those as we approach the changing Arctic," he said.

The compelling case for capturing carbon emissions and burying them underground

WRITTEN BY: Akshat Rathi

December 13, 2017, Livermore, California

Over my year of reporting on carbon-capture technology, I made it a habit to ask everyone I interviewed to suggest three others I should talk to next. The strategy yielded many useful leads, but one name got mentioned again and again.

Julio Friedmann has worked in the private sector, spending five years at ExxonMobil; in the research sector, most recently as the chief energy technologist at the Lawrence Livermore National Laboratory; and in the US government, as principal deputy assistant secretary in the Office of Fossil Energy during the Obama years. He has learned the language needed to traverse all these worlds, and it gives him a unique voice.

Friedmann’s bona fides are impeccable, but so are many others’. What makes Friedmann unique is that he is the clearest thinker on carbon capture I encountered.

For the uninitiated, carbon capture and storage (CCS) is a technology that stops carbon dioxide emissions from fossil-fuel power plants and chemical industries from entering the atmosphere, and buries them safely underground. Ultimately, the world will need to be powered by 100% renewable energy. But CCS is an essential stopgap; without it, experts agree, there isn’t an economically feasible way to attain the goals laid out in the Paris climate agreement. (You can read more about the technology in our investigative feature.)

I met Friedmann on a cool day in September outside his favorite coffee shop in Livermore, California, a small city some 40 miles (65 km) east of San Francisco. He arrived on a recumbent bike, wearing a faded Southwest-patterned shirt. “Cool bike,” I said. Without hesitation, he replied: “And it’s also very efficient.” We got coffee, and talked about why CCS is crucial to reach global climate goals.

The interview has been edited for clarity and length.

Quartz: You argue that carbon capture and storage is necessary. Why?

Julio Friedmann: A clarifying aspect of the Paris agreement, with 197 countries part of it, is that there is no market in the world where carbon emissions aren’t an issue. Each government is trying to deal with them in one way or another. One of the things they are grappling with is that, despite the progress on renewables and electric vehicles, we are simply not on the right trajectory to reduce our emissions.

For the past three years, CO2 emissions in the energy sector have been flat. That’s a plus. The dark cloud inside that silver lining is that global emissions continue to rise. The math remains grim. As countries look around and ask “What can we do to hit climate targets,” the CCS option keeps coming back.

Q: Is the technology ready?

There are plenty of people on the left and on the right that think this technology is not ready for prime time. That’s just hogwash. The first carbon-capture device was built in 1938. So it’s a common myth that needs to be debunked and dispelled. It is a proven, robust technology.

We are injecting [underground] tens of millions of tons of CO2 every year by using CCS. We’ve been doing large-scale carbon capture and storage for over 20 years. There are a dozen companies that will sell you a unit with a performance guarantee. It’s not that the technology costs too much; it’s that you can’t finance it.

Q: Why won’t anyone finance CCS projects?

To get one unit of carbon capture fit on a fossil-fuel power plant, it costs a lot of money. It’s not like wind or solar, where you can build small units. There is a big capital outlay at the front for CCS. So you’re committed to a billion dollars or more. That’s why it’s not something a lot of private investors are willing to take on.

If you went to a bank and said you wanted to build wind turbines, they would lend you money. In 2014, the US Congress approved $44 billion in wind-production tax credits. The bank would say, “we know how we’ll get our money back, so we’ll give you the loan.”

CCS needs that kind of policy support to create a market. If there is no market, the investment never comes.

From the US perspective, we now have two bills—one each in the house and in the senate. In both cases, the bills say there will be a performance tax credit for doing CCS. If those bills go through, they will be the biggest and clearest support for CCS. They will provide between $30 and $50 per metric ton of CO2 stored in tax credits. With that kind of pricing, a lot of CCS will get done immediately.
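For a rough sense of why a credit in that range matters to lenders, consider a hypothetical plant. The $30-to-$50-per-ton range is from the interview; the capture rate and capital cost below are illustrative assumptions, not figures from any real project.

```python
# Hypothetical illustration of the tax-credit economics described above.
# The $30-$50-per-ton range is from the interview; the plant size and
# capital cost are assumptions chosen only for illustration.

capture_tons_per_year = 1_000_000     # assumed capture unit: 1 million tons/year
capital_cost = 1_000_000_000          # assumed up-front outlay: ~$1 billion

for credit_per_ton in (30, 50):
    annual_credit = credit_per_ton * capture_tons_per_year
    payback_years = capital_cost / annual_credit
    print(f"${credit_per_ton}/ton -> ${annual_credit / 1e6:.0f}M per year, "
          f"~{payback_years:.0f} years to recover a $1B outlay")
# A predictable $30M-$50M annual revenue stream is the kind of cash flow
# a lender can underwrite, which is the financing gap Friedmann describes.
```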

Q: If the technology is ready, why haven’t we started deploying it at the scale required to achieve our climate goals?

For the most part, nobody knows what we’re talking about. People know what a windmill looks like. But they don’t actually know what CCS is. So it’s hard to get traction in a policy context.

No constituencies are advocating for it. There are advocates for renewables, but there are not enough advocates for CCS. Only very recently have oil and gas companies begun to advocate for CCS. Through platforms like the Oil and Gas Climate Initiative, which plans to invest $1 billion in these technologies, they are starting to do something.

Finally, CCS doesn’t make anything new. You just don’t have emissions. People don’t tend to see it as something additive; they tend to see it as something subtractive. That’s why people tend to look at it as a cost, as opposed to an investment. There’s a perception around CCS that it adds a burden, as opposed to accomplishing a goal. That needs to change.

Q: How could we go about achieving that change?

We don’t understand today how the market will value CO2 products. We don’t know if people are really prepared to pay more. They might do so for vodka made from CO2 or a Nike made from the air. That’s not a global climate solution, but what about gasoline made from the air? We know that it will cost more than the gasoline we get from underground, but will people pay more for the sustainable option? We don’t know.

Until we develop these technologies and create these products, we can’t know. We need to run the social and market experiments first. This is one of those things that we just have to try.

Hockey-stick problems, like climate change, need hockey-stick approaches to solving them. We can’t solve the problem by doing the same stuff over and over again. We gotta try radically different ideas. Converting CO2 into products is something we should try.

Q: Some people claim that CO2 products can be a trillion-dollar market and capture billions of tons of CO2. Is that possible?

There’s an awful amount of controversy on how big the market for CO2 conversion—some call it utilization—could be. For any chemical industry—be it cement or steel—you could imagine they could reduce emissions by about 1 billion tons per year. But that’s not a global climate solution, because we need to reduce emissions by tens of billions of tons. Nor will CO2 conversion substitute for CCS, because we’ll still have to bury the remaining tons.

But there’s another way to look at it: 1 billion tons of emissions reduction is great. We need all the reduction we can get. If you can get these reductions with a credible pathway to revenues, it becomes more palatable and more actionable to a larger number of people.

The long-term trend for renewable power is that it will keep getting cheaper. That means using the abundant renewable cheap energy to do CO2 conversion is no longer crazy. It was crazy three years ago. It’s not crazy now.

It looks like a more useful and more profitable undertaking than batteries. With batteries, you’re taking low-cost power and selling it back as low-cost power. In CO2 utilization, you’re taking low-cost power and turning it into something valuable, while reducing emissions.

Q: Many environmentalists argue that investing in CCS is essentially a form of fossil-fuel subsidies.

There’s intentional misdirection on fossil-fuel subsidies. A recent study said that fossil fuel companies get trillions of dollars in subsidies. In fact, most of that wasn’t subsidies. It was that we’re allowing them to emit (which has health and environmental costs), which the study counted as a subsidy. But that’s debatable.

Within that study, there was $700 billion of direct subsidies. That’s what we are paying them to provide fossil fuels at a lower price or giving them land for free. When Saudi Arabia says that we’re going to charge you less for gas than the cost of producing it, that’s a proper subsidy. We can all agree that now is not the right time to subsidize fossil energy.

But I reject the notion that incentivizing the development of CCS is a fossil-fuel subsidy. That’s utterly ridiculous. I don’t know how people feel comfortable saying that. In this case, the money is not going to the fossil-fuel companies, it goes to whoever reduces emissions. It’s the thing we want. We’re incentivizing good behavior.

Q: What about geoengineering as a solution? If we don’t reduce emissions now, some experts say there are technologies that we could use to reduce the amount of solar heat the planet traps. That way we can keep global average temperatures from rising.

One of the things that needs to be said over and over again is that CCS and even removing carbon dioxide from the atmosphere is not geoengineering. There are plenty of people who don’t want to keep them separate and make a pig’s breakfast of it. But they are radically different things.

Interestingly, the Paris agreement threw down the gauntlet for geoengineering. The goal isn’t to keep the world’s carbon dioxide levels below a certain threshold. The goal is to keep global average temperatures below 2°C. And it’s looking like we won’t hit that goal. So, in a way, the Paris agreement has forced the geoengineering question: If we aren’t going to hit our temperature targets by managing emissions, then we have to meet our targets through geoengineering, such as solar-radiation management. It has accelerated the timeline along which policymakers have to think about these technologies. We need to look at questions not just about the technologies, but about governance, economics, and social impacts. And we need to do tests.

The fundamental issues around geoengineering have not changed. First, global governance is a mess. Second, the technology keeps getting cheaper and easier, which means someone’s going to try it at some point. I think about geoengineering the way I think about gastric bypass surgery. In an extreme case, it may be necessary. But also, I’m glad doctors did research on it before trying.

Large, valuable, private properties will ‘fall into the sea’ because of climate change

Ireland may have to endure temperature rise of over four degrees, conference told

Tue, Dec 12, 2017, 18:24 Updated: Tue, Dec 12, 2017, 18:28

Kevin O'Sullivan

The conference heard that the lessons of Storm Ophelia needed to be absorbed in order to ensure vital infrastructure is more resilient.

Houses, including some large valuable private properties, will fall into the sea and it is not in the public interest to save them, a climate change conference has heard.

Prof John FitzGerald told the Met Éireann conference, entitled Making Ireland Weather and Climate Prepared, that State investment would have to be prioritised to adapt to the inevitable consequences of climate change.

Prof FitzGerald, who is chairman of the Government’s Climate Change Advisory Council, said difficult decisions would have to be taken beyond “do not build in areas likely to flood” and that protecting some areas would not be possible.

Houses, including some large valuable private properties, were going to fall into the sea and it was not in the public interest to provide protection for them, Prof FitzGerald added.

“But if Cork is going to be flooded, that has different implications,” he said.

The lessons of the recent Storm Ophelia needed to be learned in making vital infrastructure more resilient with all relevant agencies working closer together, especially those providing electricity, water and sewerage.

After three instances where the Dodder river in Dublin burst its banks, it should now be technically possible to give a one- to two-hour warning of such events, he said. “They have done it in Vietnam. They have done it in Minneapolis. Why not Ireland?” he asked.

A Met Éireann official told the conference on Tuesday that temperatures in Ireland could rise by up to 4.5 degrees by the end of the century as a result of climate change and sea levels could go up by as much as a metre.

Séamus Walsh, head of climatology and observations with the forecaster, said there was a case for more strongly acknowledging climate change effects in day-to-day weather forecasting.

He said more detail on climate change impacts and the science behind it could potentially be provided.

In addition to better forecasting now being available, the science around likely climate scenarios was becoming clearer, he told the conference.

Mr Walsh said the most likely climate change scenario for Ireland would see the island experience a temperature rise this century of between 1.5 and 4.5 degrees. He said sea level rises would be between 0.5m and 1m but that the final outcome depended on the success of global actions to reduce carbon emissions.

Rising sea levels and warmer seas would result in more severe but less frequent storms, he added, but there would be more winter days when rainfall exceeded 30mm.

Mr Walsh said the new colour coding system for weather warnings had increased understanding among the public of the likely impact of such events.

He said Met Éireann planned to make a smartphone app available “to get warnings out there in a more timely manner” and to use social media more.

Former president and climate justice campaigner Mary Robinson said Met Éireann should utilise its expertise to help those living on the front lines of climate change.

“In some of the poorest and most climate vulnerable countries, weather data is often unreliable or completely lacking,” she said. “Many do not have the capacity to provide risk information to their own citizens and are unable to manage disaster risk effectively.

“Your expertise in understanding, not just how our climate system is evolving, but also in assisting with developing community resilience in the face of climate impacts, will help to save the lives and livelihoods of those worst affected.”

Ms Robinson said weather forecasting and climate change analysis were complex and there was a need for better communicating the realities to the public.

“The technical jargon of the climate-change community – a world of mitigation, adaptation, market mechanisms and nationally determined contributions – is meaningless to most people and only serves to further alienate.”

Met Éireann’s head of forecasting, Gerald Fleming, outlined how weather forecasting and emergency management have improved in dealing with events like Storm Ophelia.

Forecasting was now possible within the 10-day period in advance of extreme weather and this allowed agencies to anticipate impacts and plan the actions they might take, he said.

Why Downtown Parking Garages May Be Headed for Extinction

December 12, 2017 By Mindy Fetterman for Stateline

Astros fans watch from a parking garage during a parade honoring the World Series champions in Houston. The rise of ride-hailing services and autonomous cars may make huge downtown garages a thing of the past.

For decades, providing downtown parking was a top priority for urban planners. Huge parking garages for commuters’ cars occupied prime real estate that otherwise might have been used for housing, stores or offices.

But ride-hailing services and autonomous cars are going to revolutionize parking in cities across the country — in garages, in lots and along curbs. By 2030, 15 percent of new cars sold will be totally autonomous, according to one estimate. One in 10 will be shared. And as it becomes easier for people to summon shared or autonomous cars when they need them, fewer people will want to own their own vehicle, meaning fewer cars overall.

The bottom line: We’re going to need much less space to store cars. Some cities are gearing up to take advantage of the shift.

“Urban parking lots are dead or dying, and how we use the curb is changing,” said Rich Barone, vice president of transportation for the Regional Plan Association of New York, New Jersey and Connecticut. The RPA released a report in October urging cities to “get ready” for autonomous vehicles. It predicts that by 2045, 70 to 90 percent of all cars in major urban areas could be autonomous.

Parking in downtowns is going to “morph from being big massive surface lots and garages to much smaller areas configured for pickup and drop-off of autonomous vehicles,” Barone said. “Cities will be more walkable, more people-friendly, and there will be more space for parks and other amenities.”

Joe Schwieterman, a professor of transportation at DePaul University in Chicago, agrees. “The whole view of the function of streets has had a metamorphosis,” Schwieterman said. “It’s made us rethink the opportunity cost of plopping a parking garage in prime downtown property.”

In addition, curbside parking will be redesigned. The National Association of City Transportation Officials suggests in a recent study that reuses could include bike parking, small green spaces, called “park-lets,” for pedestrians to enjoy, and pickup and drop-off areas for driverless vehicles and ride-hailing services like Uber and Lyft.

“We always thought of curbs as the place where parking meters are, but it’s a different space now,” said Faye DiMassimo, general manager of Renew Atlanta, a 5-year, $252 million infrastructure bond program approved by voters in 2015. “Now we find sidewalks and trails, and technologies like ride-sharing and autonomous cars pulling up and dropping off people. The curbside has to be managed differently than in the past.”

Roadways are getting updates too. In September, Atlanta unveiled its first “smart road” to install technology for traffic control and eventually driverless cars, and it tested a driverless transit shuttle. The $3 million North Avenue Smart Corridor Project has five miles of adaptive traffic signals and other sensors that make adjustments based on traffic demand, communicate with autonomous vehicles, and open traffic signals to emergency vehicles like fire trucks and police cars, DiMassimo said.

Parking Limits

As urban planners start to shift focus from making it easy to park a car downtown to making it harder, cities such as Buffalo, Miami and Seattle are reducing or dropping their minimum parking requirements for new developments. Instead, they are capping the amount of parking that developers may provide.

Arlington, Virginia, a suburb of Washington, D.C., just approved reducing the number of parking spots it requires for housing developments based on how close the developments are to its mass transit system. The requirements will drop from 0.8 parking spaces per housing unit to 0.2 for units close to the Metro subway system.

One hope is that allowing developers to forgo parking lots will let them build more affordable housing.

“Affordable housing is a really big issue, and Arlington is a very expensive place to live,” said Dennis Leach, Arlington County’s director of transportation. A single underground garage space can cost $35,000 to $45,000 to build, he said. Underground parking in denser urban areas can cost much more.
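To put those numbers together (an illustrative calculation using the figures above, not one from the Stateline report): for a hypothetical 200-unit building near the Metro, the old requirement of 0.8 spaces per unit works out to 160 spaces, while the new 0.2 ratio requires just 40. At $35,000 to $45,000 per underground space, building 120 fewer spaces saves roughly $4.2 million to $5.4 million in construction costs, money that can instead go toward the housing itself.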

Oakland, California, also recently adopted new regulations that allow apartment buildings to be built with fewer dedicated parking spaces. Instead, the new rules require developers in areas near public transit systems to offer transit passes.

“Oakland realized that its conventional approach to parking was failing in every way,” said Jeffrey Tumlin, principal of Nelson\Nygaard, a San Francisco-based transportation consulting firm that helped Oakland design the regulations.

“Drivers are realizing it’s cheaper to take an Uber than drive,” he said. “Developers see autonomous cars coming and think, ‘Am I really going to build this garage at $70,000 a space?’ ”

Donald Shoup, a professor of urban planning at UCLA and the author of “The High Cost of Free Parking,” notes that in Los Angeles, parking takes up 40 percent more land than all the freeways and streets combined. Shoup predicts that parking garages and surface lots in his city, and many others, will be torn down and redeveloped into housing, office and retail space.

The private developer of one 475-unit apartment complex in downtown Los Angeles is taking another approach: building a garage that is designed to be converted into shops, a gym and a theater when it is no longer needed for parking.

The garage, in the city’s trendy Arts District, will hold 842 vehicles, but Virginia-based AvalonBay Communities Inc. is constructing it with level rather than inclined floors and floors that can be easily removed so it will be ready for a second life. AvalonBay, one of the country’s biggest apartment developers, operates nearly 84,000 apartments in 10 states.

“What happens to all this parking we are building today if the demand decreases?” said Mark Janda, senior vice president of development for Avalon in Southern California. “One of the things we can do is to make these parking garages adapt.” Other regions in his company are considering new parking solutions too, he said.

Curbing an ‘Addiction’

Parking in the United States is “an addiction,” said Tumlin of Nelson\Nygaard. So even as technology makes parking less essential, some cities are giving residents an extra push to kick the habit.

To discourage downtown parking and to push people toward options other than driving their own cars, cities such as Boston, San Francisco and Seattle are experimenting with parking meters that charge more at peak times. The city council in Washington, D.C., is debating new rules that would require companies that provide free parking for their employees to also provide free public transit passes and bike parking.

In Columbus, Ohio, downtown business owners, who want parking spaces freed up for customers, are chipping in $5 million initially for a program that pays for their workers to take a bus to work rather than drive. More than 40,000 workers would qualify starting in June. A three-month test last year resulted in 12.2 percent of downtown workers taking the bus, nearly double the 6.4 percent who did before.

“These were people who had ingrained habits of driving into work,” said Cleve Ricksecker, executive director of the Capital Crossroads Special Improvement District. “The owners are pretty jazzed.”

Pollution-choked India buying dirty US oil byproduct

PUBLISHED: December 3, 2017 at 9:00 am | UPDATED: December 3, 2017 at 9:03 am

By Tammy Webber and Katy Daigle | Associated Press

NEW DELHI — U.S. oil refineries that are unable to sell a dirty fuel waste product at home are exporting vast quantities of it to India instead.

Petroleum coke, the bottom-of-the-barrel leftover from refining Canadian tar sands crude and other heavy oils, is cheaper and burns hotter than coal. But it also contains more planet-warming carbon and far more heart- and lung-damaging sulfur — a key reason few American companies use it.

Refineries instead are sending it around the world, especially to energy-hungry India, which last year got almost a fourth of all the fuel-grade “petcoke” the U.S. shipped out, an Associated Press investigation found. In 2016, the U.S. sent more than 8 million metric tons of petcoke to India. That’s about 20 times more than in 2010, and enough to fill the Empire State Building eight times.

The petcoke being burned in countless factories and plants is contributing to dangerously filthy air in India, which already has many of the world’s most polluted cities.

Delhi resident Satye Bir does not know all the reasons Delhi’s air is so dirty, but he says he feels both fury and resignation.

“My life is finished. … My lungs are finished,” said the 63-year-old Bir, wheezing as he pulls an asthma inhaler out of his pocket. “This is how I survive. Otherwise, I can’t breathe.”

Laboratory tests on imported petcoke used near New Delhi found it contained 17 times more sulfur than the limit set for coal, and a staggering 1,380 times more than for diesel, according to India’s court-appointed Environmental Pollution Control Authority. India’s own petcoke, produced domestically, adds to the pollution.

Industry officials say petcoke has been an important and valuable fuel for decades, and its use recycles a waste product. Health and environmental advocates, though, say the U.S. is simply exporting an environmental problem. The U.S. is the world’s largest producer and exporter of petcoke, federal and international data show.

“We should not become the dust bin of the rest of the world,” said Sunita Narain, a member of the pollution authority who also heads the Delhi-based Center for Science and the Environment. “We certainly can’t afford it; we’re choking to death already.”

Embracing tar sands

For more than a century, oil refining has served as a lifeline in America’s industrial heartland, where thousands of manufacturing jobs have been lost in recent decades.

In gritty northwest Indiana, a sprawling oil refinery and steel mills dominate the Lake Michigan shoreline. Freight trains chug through working-class neighborhoods. And smokestacks and distillation towers still symbolize opportunity.

Local officials and workers cheered when the BP Whiting refinery invested $4.2 billion so it could process crude extracted from tar sands in the boreal forest of Alberta, Canada.

U.S. refineries embraced tar sands oil and other heavy crudes when domestic oil production was stagnant before the hydraulic fracturing boom. Some of the biggest built expensive units called cokers to process the gunky crude into gasoline, diesel, ship fuel and asphalt, a process that leaves huge amounts of petroleum coke as waste. When BP Whiting’s coker was finished in 2013, the refinery’s petcoke output tripled, to 2.2 million tons a year.

Petcoke traditionally was used in the U.S. to make aluminum and steel after its impurities were removed. But when those mills closed or moved to other countries, the need for petcoke waned, although some power plants still use it. Other industries that had burned petcoke did not want to invest in costly upgrades to control higher emissions of sulfur and other pollutants or switched to cleaner and cheaper natural gas.

The American Fuel and Petrochemical Manufacturers, a petroleum industry trade group, released a statement to the AP saying that cokers “allow the United States to export petroleum coke to more than 30 countries to meet growing market demand.”

“Petroleum coke is used globally as a cost-effective fuel, as well as an integral component in manufacturing,” AFPM said.

But experts say it’s not market forces that are driving U.S. refiners to make this waste product from heavy oil refining. The refineries just need to get rid of it, and are willing to discount it steeply — or even take a loss — which helps drive the demand in developing countries.

“It’s a commodity that defies explanation (because) there’s not a financial market,” said Stuart Ehrenreich, an oil industry analyst who once managed petcoke export terminals for Koch Industries. “But at the end of the day, the coke has got to move.”

So it’s usually priced cheaper than even coal, sold around the world through a network of businesses — from boat captains and stevedores to buyers, brokers and middlemen — and sent on an epic, weeks-long journey by rail, barge and ship.

There are fewer than a dozen big traders globally. Among the largest are Oxbow Energy Solutions and Koch Carbon, both led by members of the politically conservative and climate-skeptical Koch family. Neither they nor a dozen U.S. oil companies and traders contacted by the AP would talk about petcoke. They cited past controversies over the mountains of the waste stored at Midwest refineries, or said they wanted to avoid angering business partners.

In India, no factory managers would allow AP access, and federal officials did not respond to repeated requests for interviews.

With the petcoke market volatile and competitive, industry holds information close, hoping to maintain an edge and make a profit.

“It’s like the Wild West,” said Ehrenreich.

Petcoke, critics say, is making a bad situation worse across India. About 1.1 million Indians die prematurely as a result of outdoor air pollution every year, according to the Health Effects Institute, a nonprofit funded by the U.S. Environmental Protection Agency and industry.

In the capital of New Delhi, pollution has sharply increased over the past decade with more cars, a construction boom, seasonal crop burning and small factories on the outskirts that burn dirty fossil fuels with little oversight. In October and November, for the second year in a row, city air pollution levels were so high they couldn’t be measured by the city’s monitoring equipment. People wore masks to venture out into gray air, and newspaper headlines warned of an “Airpocalypse.”

“Fifty percent of children in Delhi have abnormalities in their lung function — asthma, bronchitis, a recurring spasmodic cough. That’s 2.2 million children, just in Delhi,” said Dr. Sai Kiran Chaudhuri, head of the pulmonary department at the Delhi Heart & Lung Institute.

The country has seen a dramatic increase in sulfur dioxide and nitrogen dioxide emissions in recent years, concentrated in areas where power plants and steel factories are clustered. Those pollutants are converted into microscopic particles that lodge deep in the lungs and enter the bloodstream, causing breathing and heart problems.

It’s impossible to gauge precisely how much is from petcoke versus coal, fuel oil, vehicles and other sources. But experts say it certainly is contributing.

A doctor examines Jagat Singh, 59, at the Delhi Heart and Lung Institute in New Delhi, India. (Tsering Topgyal/Associated Press)

Indian purchases of U.S. fuel-grade petcoke skyrocketed two years ago after China threatened to ban the import of high-sulfur fuels. Although Indian factories and plants buy some petcoke from Saudi Arabia and other countries, 65 percent of imports in 2016 were from the U.S., according to trade data provider Export Genius.

“It is definitely alarming,” Chaudhuri said. “The government should know what they’re getting, what they’re using and what are its harmful effects.”

In the north Indian industrial district of Moradabad, several hours’ drive from the capital, villagers see the skies getting dingier but have little information about what happens behind factory gates.

Only four factories are on record as using petcoke. But dozens buy it from middlemen running open-air fuel depots, according to Sarvesh Bansal, a natural gas distributor in the north Indian city who leads the ad-hoc local environmental group called WatAir.

“We want the factories moved very far away from here,” said a 25-year-old rice farmer named Mohammad Sarfaraz, who lives in nearby Farid Nagar. He and others aren’t sure what pollutants are being spewed, but they nevertheless protested at nearby factories a few years ago until shooed away by guards. “Many illnesses occur because of the factories. Small kids and old people fall sick very easily. There is breathlessness, heart disease, pain in the hands and legs.”

India’s cement companies were first to bring in petcoke, and still import the most, though cement experts say some sulfur is absorbed during manufacturing.

As word spread of the cheap, high-heat fuel, other industries began using it in their furnaces — producing everything from paper and textiles to brakes, batteries and glass, according to import records compiled by Export Genius. The government was caught off guard by the shift, and there are scant records of how much petcoke is being burned.

Petcoke’s use was further encouraged by low import tariffs and a lack of regulations on its most potent pollutants.

Industries also like that petcoke, which is around 90 percent carbon, burns hot. So they can use less of it to produce the same heat as coal — though coal still overshadows petcoke in factory furnaces.
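For rough scale (the heating values here are typical published figures, not numbers from the AP investigation): fuel-grade petcoke delivers roughly 32 to 35 megajoules of heat per kilogram, compared with about 15 to 22 megajoules per kilogram for the lower-grade coal widely burned in India. On those figures, a furnace can get the same heat from roughly a third to a half less petcoke by weight, a large part of its appeal to cost-conscious factories.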

Within a decade, India’s petcoke appetite grew so voracious that it began producing and selling its own, and Indian refineries today are making about as much as the country is importing. One of the biggest refiners — Mumbai-based Reliance Industries Ltd., owned by India’s wealthiest businessman, Mukesh Ambani — has ramped up petcoke production.

Still, U.S. petcoke remains popular.

Indians typically buy petcoke with about 6-7 percent sulfur — more than double the level in most coal — because it’s the least expensive, said Vedanth Vasanth, director of Viva Carbon Pvt. Ltd., a supplier based in the southern city of Chennai that helps broker petcoke contracts between Indian buyers and sellers abroad.

J.P. Gupta, whose factory in Moradabad district makes acrylic fibers used in clothing, said his factory burns through some 4,000 metric tons of Indian-made petcoke every month.

The factory spent about $300,000 on equipment to control sulfur, he said, but would have spent 50 percent more on pollution control if it had opted for U.S. petcoke, which he says is dirtier.

“We rejected the imports,” he said. “But there are some who are not bothering about the pollution.”

At an open-air brick kiln just six miles down the road, workers shoveled a mix of petcoke and coal into a fiery furnace. Other than thick wooden sandals to protect their feet from the heat, they wore no safety gear or breathing masks. And there was no equipment to control the gases or soot billowing from the chimney.

Such small factories operating off the electricity grid in India’s vast informal sector account for 25 to 30 percent of the country’s total energy generation. Often crammed into city outskirts, these outfits manufacturing everything from plastic bangles to metal screws rely on fossil fuels to keep their furnaces afire — the cheaper, the better.

Few adhere to pollution standards, said Ajay Mathur, head of The Energy and Resources Institute, a nonprofit policy research organization in New Delhi. “This is an area where we need to have regulations sooner rather than later,” he said.

Although petcoke has been an industrial resource since the 1930s, its high sulfur content and sheer volume — along with growing concern about climate change and particle pollution — could eventually restrict or halt its production, experts said.

Governments could decide to tax high-carbon fuels such as petcoke. They could ban high-sulfur or high-carbon fuels. Or they could set pollution limits that make petcoke use impractical.

In India, judges of the National Green Tribunal demanded in May that the government investigate the environmental and health impacts of petcoke.

“The government was not doing anything,” said the WatAir leader Bansal, whose environmental group launched the lawsuit. “There is no law in India, no control. So the whole world’s petcoke is coming to India, and it’s getting consumed here.”

The government’s environment ministry has dismissed the idea that petcoke threatens public health in the nation’s capital. But the country’s Supreme Court, which has consistently demanded or enacted tougher pollution control measures, recently banned petcoke use by some industries as of Nov. 1 in the three states surrounding pollution-choked New Delhi. It also demanded tighter pollution standards that — if enforced — could further limit its use nationwide.

“This is a completely disgusting state of affairs,” the judges said in their Oct. 24 ruling, “and this is hardly the way in which the Ministry ought to function if it is expected to perform its duties sincerely, honestly and with dedication.”

The court last month also urged all states across India to pass similar bans.

The ministry refused months of requests for interviews, both before and after the court’s ruling. But analysts say that, short of a nationwide ban, petcoke use could be mostly unaffected.

“The petcoke markets grew so fast across the country that a ban around New Delhi isn’t going to put a huge dent in the overall demand for petcoke,” said Jeffrey McDonald, an analyst at S&P Global Platts.

Refineries could choose to stop producing petcoke, by using more expensive refining methods that would essentially convert all the heavy oil to other products.

But it’s more likely that if new pollution limits do affect its use, U.S. refiners will just find new petcoke customers in other developing nations, especially in Asia and Africa, experts and environmentalists said.

“It’s a classic case of environmental dumping,” said Lorne Stockman, director of the environmental group Oil Change International. “They need to get rid of it, so it’s dumped into a poor, developing country.”

Stella McCartney calls for overhaul of 'incredibly wasteful' fashion industry

UK fashion designer backs Ellen MacArthur foundation campaign to stop the global fashion industry consuming a quarter of the world’s annual carbon budget by 2050

A Stella McCartney campaign shot in a Scottish landfill site to raise awareness of waste and over-consumption. Photograph: Harley Weir and Urs Fischer for Stella

Sandra Laville

Tuesday 28 November 2017 01.01 EST

Last modified on Tuesday 28 November 2017 04.13 EST

Clothes must be designed differently, worn for longer and recycled as much as possible to stop the global fashion industry consuming a quarter of the world’s annual carbon budget by 2050.

Fashion designer Stella McCartney condemned her industry as “incredibly wasteful and harmful to the environment” as she joined forces with round-the-world sailor and environmental campaigner Dame Ellen MacArthur to call for a systemic change to the way clothing is produced and used.

In a report published on Tuesday, MacArthur’s foundation exposes the scale of the waste, and how the throwaway nature of fashion has created a business which creates greenhouse emissions of 1.2bn tonnes a year – larger than that of international flights and shipping combined.

It warns that “if the industry continues on its current path, by 2050, it could use more than 26% of the carbon budget associated with a 2C pathway.”

The report also reveals that:

less than 1% of material used to make clothing is recycled into new clothing;

the estimated cost to the UK economy of landfilling clothing and household textiles each year is about £82m;

a truckload of clothing is wasted every second across the world;

the average number of times a garment is worn before it ceases to be used has decreased by 36% in 15 years;

half a million tonnes of plastic microfibres are released per year from washed clothes – 16 times more than plastic microbeads from cosmetics – contributing to ocean pollution.

MacArthur, who gained the support of industry leaders including the C&A Foundation, H&M, and Nike for her report, is calling for a circular textile economy to be created to make fashion more sustainable.

The report calls for four actions to be taken: to phase out substances of concern and microfibre release; to increase clothing utilisation, for example by the industry supporting and promoting short-term clothing rental businesses; to radically improve recycling; and to move to renewable materials.

McCartney said the ideas in the report provided solutions for an industry that was incredibly wasteful and harmful to the environment.

“The report … opens up the conversation that will allow us to find a way to work together to better our industry for the future of fashion and for the future of the planet,” she said.

MacArthur acknowledged the scale of the challenge to turn around the $2.4tn industry.

“Today’s textile industry is built on an outdated linear, take-make-dispose model and is hugely wasteful and polluting,” said MacArthur. “We need a new textile economy in which clothes are designed differently, worn longer, and recycled and reused much more often.”

Figures in the report reveal the throwaway nature of today’s fashion industry, which is based on a faster turnaround model, with more new collections released per year, at lower prices.

The report said more than half of “fast” fashion produced is disposed of in less than a year. In the US, clothes are only worn for around a quarter of the global average. The same pattern is emerging in China, where clothing utilisation has decreased by 70% over the last 15 years. Sixty percent of German and Chinese citizens admit to owning more clothes than they need.

Globally, customers miss out on $460bn of value each year by throwing away clothes that they could continue to wear.

The report said: “The textiles industry relies mostly on non-renewable resources – 98m tonnes in total per year – including oil to produce synthetic fibres, fertilisers to grow cotton, and chemicals to produce, dye, and finish fibres and textiles.

“Textiles production (including cotton farming) also uses around 93bn cubic metres of water annually, contributing to problems in some water-scarce regions.

“With its low rates of utilisation … and low levels of recycling, the current wasteful, linear system is the root cause of this massive and ever expanding pressure on resources.”

Ecofriendly knickers at London fashion week. Photograph: David Levene for the Guardian

Clothing production has nearly doubled in the last 15 years, and the growth is not just confined to the west. “Demand for clothing is continuing to grow quickly, driven particularly by emerging markets, such as Asia and Africa,” the report said.

“Should growth continue as expected, total clothing sales would reach 160m tonnes in 2050 – more than three times today’s amount.”

Greg Stanton, the mayor of Phoenix, Arizona, who is endorsing the report, said his city was attempting to create a circular economy out of textile waste: “Each year more than 18,000 tonnes of textiles find their way into … waste and recycling streams. Our city is working on creative solutions to redirect textiles from the waste stream … as a valuable resource, to ultimately stimulate the local economy.”

In east London, Cyndi Rhoades, founder of WornAgain, is working on the development of a new technology to separate and recapture polyester and cotton from textiles to be reintroduced back into the supply chain as new, raw materials.

She said: “We already have enough clothing and textiles in existence today to satisfy our annual demand of new raw materials for new clothing – all we have to do is make sure it doesn’t end up in the bin, and processes like ours are scaled as rapidly as possible.”

Europe gives controversial weed killer a 5-year lease on life

By Erik Stokstad | Nov. 27, 2017, 2:30 PM

The world's most popular herbicide can be used by European farmers for another 5 years. After several indecisive votes, a technical committee of the European Commission today approved a 5-year license renewal for glyphosate, the active ingredient in Monsanto's Roundup weed killer and similar products. The license was scheduled to expire on 15 December.

Glyphosate is less toxic to mammals than other herbicides. But it became highly controversial in 2015 after it was deemed probably carcinogenic by the International Agency for Research on Cancer, part of the World Health Organization. The European Food Safety Authority and other regulators have concluded it is safe to use.

The chemical’s EU license most recently expired in June 2016. But the commission’s Standing Committee on Plants, Animals, Food and Feed (PAFF), made up of representatives from the EU’s 28 member nations, couldn’t agree on the length of a renewed license. PAFF initially proposed a 15-year renewal, then a 9-year renewal, and eventually settled on an 18-month extension.

Over the past 2 months, the committee again debated an extension, but no proposal secured the necessary “qualified majority” of PAFF members—16 countries representing 65% of the European Union’s population. But today 18 countries voted in favor of a 5-year renewal, including Germany and three others that had abstained in the previous vote. France, Italy, and Austria were among the nine countries that voted against the license, and Portugal abstained.

Stéphane Travert, France’s agriculture minister, told France's international public radio: “These are 5 years during which we will work to search for alternatives, 5 years during which we will mobilize research and innovation so that tomorrow we can modify agricultural practices for our farmers and for the environment.”

Christopher Connolly, a neurobiologist at the University of Dundee in the United Kingdom, told the Science Media Center that scientists must also redouble their efforts: "We must make the next 5 years count, so that an evidence-based decision may be made at the end of this period."

With reporting by Tania Rabesandratana.

Tapping sewage as a source of useful materials

With sometimes offbeat technology, innovators seek to extract certain chemicals from municipal waste

By Alex Scott

Sewage treatment plants are chemical factories in waiting. By combining engineering expertise with chemistry and biology, plant operators can convert the solid sludge they generate into an array of useful chemical products. Some wastewater treatment plants already make phosphate and cellulose. But this is no cakewalk. There are major hurdles still to jump, and project failure is very much a reality.

Sewage stinks. It causes pollution. Few people care where it goes. Fewer chemists want to work with it. And yet, with the help of the right technology, sewage could become a reliable, low-cost feedstock for chemicals and other materials.

Commercial products that could be made from municipal liquid waste include phosphorus, nitrogen-rich fertilizers, fuels, biodegradable plastics, thickeners for paints and other products, and oils for making a range of chemical intermediates.

Sounds great, if a little icky, but a few not-so-small hurdles stand in the way. One is how to economically extract useful chemicals from municipal wastewater when more than 99% of it is, well, water.

The presence of pathogens, heavy metals, and toxic compounds such as solvents adds complexity. And social taboos associated with the use of material that was once sewage could prevent investors and regulators from supporting technologies in this field. One thing is for sure: Companies aren’t going to shout about where the raw materials have come from.

These challenges have led to project failures. Nevertheless, the opportunity is substantial. More than 300 km3—yes, cubic kilometers—of municipal wastewater, and more than 600 km3 of industrial wastewater, are generated around the world each year. And more is on the way: Only 8% of domestic and industrial wastewater is treated in developing countries, compared with 70% in high-income countries.

In developed countries, solid sludge resulting from bacterial digestion in wastewater treatment plants is commonly incinerated or applied to farmland, where it can contaminate soil. In the U.S., about half of the 7 million dry metric tons of sewage sludge generated annually is applied to farmland.

But so much more can be done with this stinking stuff. Taking out high-value chemicals reduces solid-waste generation and subsequent waste treatment costs. The practice can also benefit water utilities looking to repurpose sewage into industrial or potable water.

Recovering chemicals from wastewater is at the nexus of water, energy, and food security. “Interest in using wastewater is only going to increase,” says Shrinivas Tukdeo, an analyst who covers water treatment for the market research firm Frost & Sullivan.

Nutrients make up less than 1% of municipal wastewater.

The Netherlands, a small, low-lying nation where water management is part of daily life, is leading the push to extract products from wastewater. It has programs to recover energy, phosphorus, and cellulose at more than a dozen wastewater plants.

Cellulose can be readily recovered from wastewater in a simple pretreatment phase using fine sieves (Water Res. 2013, DOI: 10.1016/j.watres.2012.08.023), says Enna Klaversma, a technology expert at Energy & Raw Materials Factory, a company that was formed by Dutch municipal water treatment companies and is known by its Dutch acronym, EFGF.

Klaversma studied environmental science at Wageningen University & Research, which has become something of a Dutch center for wastewater treatment research. Being an expert in sewage sludge and biogas generation isn’t for everyone, but she has been involved in a number of wastewater-to-chemical projects in the Netherlands, as well as volunteering for water management projects in Indonesia.

After the recovered cellulosic material is pressed to remove water, it can be used as feedstock for biofuels or higher-value industrial chemicals. Removing cellulose can also cut the energy consumed in conventional sludge treatment by up to 40%.

In 2014 in Amsterdam, EFGF helped establish one of Europe’s first facilities to recover phosphorus from sewage. Another plant opened in Amersfoort, the Netherlands, in 2016.

The Amersfoort facility’s one-step process was developed by the University of British Columbia and commercialized by Vancouver-based Ostara Nutrient Recovery Technologies. A few other phosphorus technologies are also in use around the world, including Berlin Waterworks’ AirPrex process.

Sewage sludge with a solid content of about 30% is formed at the Amersfoort facility by passing the water through a centrifuge. The sludge is then pressed, releasing a nutrient-rich liquid. Magnesium chloride is added, causing phosphorus to crystallize into magnesium ammonium phosphate (NH4MgPO4), which can be readily separated and used as a fertilizer.
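In simplified terms, this is the struvite precipitation reaction long used in the wastewater industry (the hexahydrate shown here is the form normally reported in the literature; the article gives the anhydrous formula): Mg2+ + NH4+ + PO4(3-) + 6H2O → NH4MgPO4·6H2O. The added magnesium chloride supplies the magnesium ion, while the ammonium and phosphate come from the nutrient-rich liquid pressed out of the sludge.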

Ostara’s NH4MgPO4 crystals, branded as Crystal Green, dissolve slowly over about eight months once applied to farmland. Ostara’s technology is now being used in 15 wastewater treatment plants across North America and Europe.

The Amersfoort plant produces about 900 metric tons per year of NH4MgPO4 and generates 1,250 kW of energy in the form of biogas. The plant is fed with wastewater generated by a population of 315,000.
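Taken together, those figures imply a yield of a little under 3 kg of NH4MgPO4 per person served per year (900 metric tons divided by a population of 315,000), a back-of-the-envelope number rather than one stated by the plant’s operators.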

Ostara’s process at the waste treatment plant in Amersfoort yields magnesium ammonium phosphate crystals for applying directly onto fields.

Harvesting phosphorus cuts sewage sludge volume at Amersfoort by more than 50%, according to Eliquo Water & Energy, the engineering contractor for the project. Sludge disposal costs are about $50 per metric ton. “This can reduce wastewater plant operating costs in the Netherlands—where almost all sewage sludge is burned—by hundreds of thousands of dollars per year,” Klaversma says.

But even the enthusiastic Dutch are cautious about selling sewage-derived products on the open market. Under Dutch law, products generated from sewage plants—including NH4MgPO4—are still classified as a waste, and the market is restricted to companies licensed to receive waste.

Dutch regulators are concerned about contamination by pathogens as well as contaminants such as drug metabolites. It could be years before the government recognizes materials derived from wastewater treatment as conventional products, Klaversma acknowledges.

Although the Dutch are particularly keen on extracting chemicals from waste, officials from across Europe are promoting such technology because of the high cost of waste disposal on the Continent and a desire to shift to a circular economy in which resources are reused.

The European Commission is helping fund a string of efforts to make products from wastewater. Among them is Waste2NeoAlginate, a project aimed at alginatelike exopolymers, high-molecular-weight polymers made of exopolysaccharides, a sugar residue generated by bacteria that populate wastewater treatment plants.

The exopolymers are similar to alginate, a fairly expensive polymer extracted from seaweed and used in applications such as food and paint thickening.

In a world where the human population is projected to grow to 9.7 billion by 2050 and demand for resources will rise, the generation of products and clean water from wastewater will be essential, a United Nations report affirmed earlier this year. Recovering phosphorus—an essential element for growing food that could be in short supply in coming decades—is particularly important, the UN concluded.

In a four-year project that began in August, Waste2NeoAlginate’s partners, including Delft University of Technology, are testing a two-step process of extracting and refining alginatelike exopolymers from sewage sludge.

Modest volumes of exopolysaccharides are normally secreted by the sewage sludge bacteria, but plant operators can tune conditions in treatment tanks to promote the process. A metal cation is then added to the sludge to form an insoluble alginate-like gel. The process is similar to that used to recover alginate from seaweed.

In October, Water Authority Rijn en IJssel, a Waste2NeoAlginate partner, began building the first plant that will use the technology. Located in Zutphen, the Netherlands, the plant is due to open in early 2019 with the capacity to make 400 metric tons per year of alginatelike exopolymers from wastewater generated by nearby dairy factories. The partners anticipate recovering 20 kg of exopolymers from every 100 kg of sewage sludge.
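Those numbers are roughly self-consistent: at 20 kg of exopolymer recovered per 100 kg of sludge, producing 400 metric tons of polymer a year implies processing on the order of 2,000 metric tons of sewage sludge annually, a rough check rather than a figure given by the partners.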

The process is expected to reduce sludge generation by about 30%. “The regional water authorities expect to save costs on wastewater treatment by recovering the alginatelike exopolymer,” says Philip Schyns, senior project adviser for Rijn en IJssel. “They can sell it, which generates money, and they have less sewage sludge to process, which saves money.”

The Zutphen plant will cost about $13 million, paid for by the European Commission and the Dutch government. Return on investment is expected within five to 10 years of operation. Waterschap Vallei en Veluwe also plans a second wastewater-to-alginatelike exopolymers plant in Epe, the Netherlands.

About 35,000 metric tons of alginate is derived annually from seaweed, but relatively high extraction costs mean that it is not used in low-value applications. The alginatelike exopolymer’s lower production costs mean it will open up markets such as ink thickening for textile dyes, according to the water technology firm Royal HaskoningDHV, a partner of Rijn en IJssel.

Waste2Aromatics, another consortium funded partly by the EC, has made good progress since its launch in 2015. Featuring 12 partners, including the chemical maker SABIC, the consortium says it will soon unveil a blueprint for two pilot processes to generate furanic building blocks—intermediates for a range of plastics—from waste.

Both processes were developed primarily by the Dutch technology institute TNO. A steam-based process uses a feedstock of diaper fill or sieved sewage combined with household organic waste. A second, biphasic-reactor-based process uses manure as feedstock. On Nov. 30, waste management companies in the Waste2Aromatics consortium will pick one of them as the basis for a pilot plant in the Netherlands.

But it’s no easy thing to move from simple phosphate and cellulose recovery to polymers or polymer intermediate projects that require substantial capital investment.

For example, last year the Paris-based wastewater treatment firm Veolia halted an effort to make polyhydroxyalkanoate (PHA) from municipal wastewater and shut down a pilot plant in Belgium.

PHA is a biodegradable polymer with properties similar to polylactic acid, a bioplastic commonly used to make disposable cutlery. Bacteria present in wastewater can generate PHA within their cells.

Despite Veolia’s exit, research for making PHA from wastewater appears to be gathering pace. “Waste to PHA was a very interesting field of research in the 1970s when there was an oil crisis. And now with an interest in the bioeconomy rising, it’s coming back in a big way,” says Jeroen Hugenholtz, R&D manager for industrial microbiology at Wageningen University & Research.

The big hurdle to making PHA from wastewater is the cost of isolating a solid from a very dilute stream of water. “It is all about this,” Hugenholtz says.

Scientists are starting to get around the problem, although some of the projects are pretty far-out. For example, Mark A. Blenner, a Clemson University chemical engineer and synthetic biologist, has developed a process for astronauts to recycle their urine into PHA when they are on space missions.

Blenner’s approach is to engineer Yarrowia lipolytica yeast to digest urine. By splicing genes from algae and phytoplankton into the yeast and by adding a source of carbon in the form of CO2 from astronauts’ breath, Blenner has developed a route to PHA.

The idea is that PHA can be fed into a three-dimensional printer to make whatever tools or components astronauts might require. By modifying certain Yarrowia lipolytica genes, Blenner can also generate omega-3 fatty acids, essential components of the human diet that can’t be made in the body.

While the omega-3 process would be unpalatable for many consumers on terra firma, Blenner’s technology could provide the basis for PHA production in wastewater treatment plants.

New dewatering technologies are emerging that could markedly improve the economics of making PHA and other products at wastewater treatment plants.

Centrifuges and membrane filtration systems currently separate solids from water, but centrifuges use substantial amounts of energy, while filters and membranes can clog. U.K. start-up uFraction8 says it has developed a microfluidic sieve made from plastic or steel that consumes a quarter of the energy of centrifuges (Sci. Rep. 2016, DOI: 10.1038/srep36386). And uFraction8 claims its curved channels don’t clog.

The start-up’s sieves have channels of specific geometry so that at a given flow speed, only certain-sized particles will go through each aperture, says Managing Director Brian Miller. A series of channels separates and concentrates particles into size fractions. Pathogens can be removed using the microfluidic sieve, according to Miller. “We know we can do a single micrometer target, and we suspect we can go down to in the region of 350 nm,” he says.

A broad portfolio of processes for making chemical products from wastewater is starting to surface.

uFraction8 is working with technology partners to capture phosphate crystals in municipal wastewater. It’s also set to participate in a consortium that’s designing a system to recover crystallized dye particles from dye wastewater generated by a plant in India.

Meanwhile, the efficiency of membrane filtration is also improving. Operating costs are falling, making filtration membranes the long-term bet for processes that convert wastewater into products, claims Jens Lipnizki, head of technical marketing for membranes at Lanxess.

Membranes can be used in a cascade system to separate target particles by size. As with uFraction8’s technology, they can also filter out pathogens.

Membrane fouling is currently a problem in wastewater filtration. But Lanxess says it has developed a prototype membrane element featuring a compound designed to disrupt communication between bacteria, discouraging them from forming biofilms. “We’re not adding biocides. We don’t want to kill the bacteria, just make it uncomfortable for them,” Lipnizki says.

While companies such as Lanxess seek to ease the processing of municipal wastewater, other waste streams are gaining attention because of their heavy loading of nutrients.

This is the case in Scotland, which generates just 115,000 metric tons per year of sewage sludge and significantly more whisky waste. Roger Kilburn, chief executive officer of Scotland’s Industrial Biotechnology Innovation Centre, sees an opportunity to take the 2 million metric tons of waste—known as pot ale—generated by Scotland’s whisky distilleries and combine it with CO2 from whisky fermentation to generate syngas and, from there, high-value chemicals.

A hybrid approach is to combine wastewater with other waste streams. This is the strategy being pursued by California-based start-up T2 Energy. It’s harnessing algae to convert municipal wastewater, glycerin, and CO2 from power plants into oleic acid—a building block chemical—or fuels.

By weight, half of T2 Energy’s algae cells are biomass, and half are lipids and triglycerides, says CEO Mark Randall. “Oleic acid—a triglyceride—costs in the $400–$500 range per metric ton. We can produce it for less than $100,” Randall says.

T2 Energy is in talks with potential investors to raise money for a demonstration facility. Should it go ahead, the company hopes to bring in the chemical maker AkzoNobel—a major consumer of oleic acid—to assist with the project’s management.

“Many of the inputs are free or low cost. We are ready to go to commercial scale,” Randall says.

While T2 Energy is looking to enjoy the economies of scale afforded by municipal wastewater systems and power plants, some companies want to play with more modest volumes of waste, such as that from a village or a small brewery.

The Santa Clara Valley, Calif.-based start-up NuLeaf Tech is taking this route. The firm’s Nutree device, which it calls a vertical wetland, is intended to process just 2,000 L of microbrewery wastewater into liquid fertilizer each day. The technology can be scaled up by adding modules.

Nutree is a 3-m3, treelike structure that uses microbial fuel cells to pump wastewater to its crown. Plants and microbes commonly found in wetlands digest the waste and convert it into a liquid fertilizer while releasing electrons to power the fuel cells.

The major attraction of such a system over traditional wastewater treatment plants is cost: NuLeaf estimates that it will be able to produce Nutrees for about $10,000 each. At this price the Nutree may prove attractive in developing countries, says CEO and cofounder Rachel Major. The firm is currently testing prototypes with a handful of small breweries in the U.S.

Further down the line, the effluent could even be cleaned up, used as drinking water, or reused to make beer if local water availability is an issue, Major says.

Frost & Sullivan’s Tukdeo likes the potential of artificial wetlands. Using such an approach, researchers could recover and potentially recycle hydrocarbons, fatty acids, nutrients for making fertilizers, and even dyes. This strategy would keep capital and operational costs low—advantages in developing countries, Tukdeo says.

Treelike structures that convert human waste into valuable chemicals and electricity sound like fantasies from some utopian future. The reality is that the technical and economic hurdles for such inventions are myriad. Certainly not all the technology in development will pan out.

But rising water scarcity, and the need to recover clean water from wastewater, is encouraging the adoption of such technology. By 2050, when an additional 2 billion people will be consumers, global water demand will have increased by 55%, the UN predicts.

Already, chemists and waste technologists are establishing processes with commercial potential. The merger of the wastewater and chemical industries is under way.

Are petite poplars the future of biofuels? UW studies say yes

November 15, 2017

Michelle Ma, UW News

In the quest to produce affordable biofuels, poplar trees are one of the Pacific Northwest’s best bets — the trees are abundant, fast-growing, adaptable to many terrains and their wood can be transformed into substances used in biofuel and high-value chemicals that we rely on in our daily lives.

But even as researchers test poplars’ potential to morph into everything from ethanol to chemicals in cosmetics and detergents, a commercial-scale processing plant for poplars has yet to be achieved. This is mainly because production costs still are not competitive with the current price of oil.

A University of Washington team is trying to make poplar a viable competitor by testing the production of younger poplar trees that could be harvested more frequently — after only two or three years — instead of the usual 10- to 20-year cycle. These trees, essentially juveniles compared with fully grown adults, are planted closer together and cut in such a way that more branches sprout up from the stump after each harvest, using the same root systems for up to 20 years. This method is called “coppicing,” and the trees are known as poplar coppice.

The team is the first to try converting the entire young tree — including leaves, bark and stems — into bio oil, a biologically derived oil product, and ethanol using two separate processes. Their results, published this summer in two papers — one in ACS Sustainable Chemistry & Engineering and the other in Biotechnology for Biofuels — point to a promising future for using poplar coppice for biofuel.

“Our research proved that poplar coppice can be a good option to meet the cheap, high-volume criteria of biofuel feedstock,” said Chang Dou, lead author on both papers and a doctoral student in the UW’s Bioresource Science and Engineering program. “Our findings are significant for the future biofuel industry, and the ultimate goal is to make poplar coppice biofuel a step closer to the pump.”

Trees from standard poplar farms are older and taller before being harvested.

Poplar woodchips from older trees have been the focus of most research, mainly because wood parts contain the highest concentration of sugar, which is important for making ethanol and chemicals. Earlier studies show that poplar woodchips are a viable biofuel source, but costs still don’t pencil out, especially since trees are cut just once every 10-plus years. Additionally, other tree parts go to waste when only the trunk is used, making the process more inefficient and wasteful.

However, if poplar were planted close together like an agriculture crop, and whole trees were harvested on a much quicker cycle, it could make sense from a cost perspective and offer a short return on investment — and be more attractive for farmers.

Alternative fuels must make economic sense, the researchers stress, for biofuels to make a dent in the petroleum-driven market.

“We have the environmental incentives to produce fuels and chemicals from renewable resources, but right now, they aren’t enough to compete with low oil prices. That’s the problem,” said Renata Bura, a UW associate professor in the School of Environmental and Forest Sciences and the senior author.

Bura’s research is part of the Advanced Hardwood Biofuels Northwest project funded by U.S. Department of Agriculture’s National Institute of Food and Agriculture. The project, directed by UW professor Rick Gustafson, is a consortium of universities and industries led by the UW whose goal is to lay the foundation for a Pacific Northwest biofuels and bio-based industry based on poplar feedstock. For this study, trees in Jefferson, Oregon — one of the four study sites — were planted in rows close together in spring of 2012 and harvested less than two years later before the leaves had fallen.

The UW team first tested whether entire young poplar trees could be converted into sugar by a process that uses high temperature, pressure and enzymes to break down the wood materials into sugar. From there, it is possible to make ethanol, acetic acid, lactic acid and other valuable chemicals by fermenting the sugar.

After processing the trees, the researchers found that leaves are poor performers and lowered the overall sugar output, not just because leaves are naturally low in sugar, but also because they contain other chemicals that impede the sugar-releasing process. When scaled up to a commercial operation, leaves should be removed and may be used for other purposes, such as feed for animals.

They also tested whole poplar trees from the same plot in another conversion process that uses much higher heat — upwards of 500 degrees Celsius — to transform the tree materials directly to bio oil in a process called “pyrolysis.” Research is underway to convert this dark brown oil to a transportation fuel that resembles gasoline or diesel.

In the experiment, the researchers found that including leaves didn’t make a big difference to the quality of the resulting bio oil. When scaled up, producers could ultimately save time and money by not separating leaves from branches to achieve similar quality oil.

Future poplar production plants could leverage both methods, weighing factors like the current cost of materials or the dollar value of the products being made to determine which method makes more financial sense, Dou explained.

The young poplars used in the study have similar properties to shoots that would sprout from a stump in a true coppicing operation. Using that cutting method, it is possible to harvest trees every two years for up to 20 years without the added effort and cost of pulling up roots, preparing the soil and planting new trees that is required in usual planting regimes.

Ultimately, the researchers say that coppice poplar is likely the best balance of cost and reliability for Pacific Northwest growers to produce biofuel.

“Currently, we are looking at how we can grow poplar for monetized ecosystem services,” Bura said. “In the future, we envision a bio-based industry that will provide multiple environmental benefits, will invigorate rural communities and will serve as a bridge to a fully developed biofuels industry.”

Other co-authors on the papers are Fernando Resende, a UW assistant professor of environmental and forest sciences; Devin Chandler, a UW graduate student in the Bioresource Science and Engineering program; Wilian Marcondes, a UW exchange student from the University of São Paulo-Brazil; and Jessica Djaja, a UW undergraduate student.

The research was funded by a grant from the U.S. Department of Agriculture’s National Institute of Food and Agriculture.

Reducing phosphorus runoff

Public Release: 22-Nov-2017

Researchers study which incentives prompt farmers to improve environment

University of Delaware

Throughout the United States, toxic algal blooms are wreaking havoc on bodies of water, causing pollution and having harmful effects on people, fish and marine mammals.

One of the main contributors to these algal blooms is excess phosphorus that runs off from agricultural fields, and while farmers have made many efforts in recent years to improve agricultural management, the problem persists and there is still a lot of work to be done.

In a paper published recently in the Journal of Soil and Water Conservation, the University of Delaware's Leah Palm-Forster describes meeting with farmers in northwest Ohio to test different incentives that would promote the use of best management practices (BMPs) to help curb excess phosphorus runoff from their fields.

Palm-Forster, assistant professor in the Department of Applied Economics and Statistics in UD's College of Agriculture and Natural Resources, collected the data for the study in 2013 while she was a doctoral student at Michigan State University. Palm-Forster and her co-authors--Scott Swinton, professor, and Robert Shupp, associate professor, both at Michigan State University--travelled to four different locations and spoke with 49 farmers, looking specifically at farms that could have an impact on Lake Erie, which was hit earlier this year with an algal bloom that stretched over 700 square miles.

The researchers tested four different incentives in their study--a cash payment, a cash payment with BMP insurance, a tax credit and a certification price premium--comparing them by the cost per pound of phosphorus runoff reduction to see which incentives the farmers most preferred.

"For this study, we used an artificial reverse auction, meaning that farmers didn't have to go back to their farm and actually do any of these practices," said Palm-Forster. "We were trying to pilot test these incentives in a controlled environment, so although it was artificial, they actually were receiving real cash payments based on how they performed during the session."

The farmers were assigned mock farms designed to be typical of farms in the Lake Erie watershed. They were given information about their baseline management practices and then offered three different practices that they could bid on.
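
For illustration only, the logic of ranking such bids can be sketched in a few lines of Python. The farms, bid amounts, phosphorus-reduction estimates and budget below are hypothetical, not data from the study; the point is simply that bids are ordered by cost per pound of phosphorus kept out of the water and funded until the budget runs out.

# Hypothetical sketch of ranking reverse-auction bids by cost-effectiveness.
# All numbers are made up for illustration; none come from the Palm-Forster study.
bids = [
    {"farm": "A", "practice": "cover crop",           "payment": 1200.0, "p_reduced_lbs": 150.0},
    {"farm": "B", "practice": "subsurface placement", "payment": 2000.0, "p_reduced_lbs": 320.0},
    {"farm": "C", "practice": "buffer strip",         "payment": 900.0,  "p_reduced_lbs": 60.0},
]
budget = 3500.0  # assumed program budget, $

# Rank bids from cheapest to most expensive per pound of phosphorus reduced.
ranked = sorted(bids, key=lambda b: b["payment"] / b["p_reduced_lbs"])

funded, spent = [], 0.0
for bid in ranked:
    if spent + bid["payment"] > budget:
        break  # stop once the budget is exhausted
    funded.append(bid)
    spent += bid["payment"]

for bid in funded:
    cost_per_lb = bid["payment"] / bid["p_reduced_lbs"]
    print(f"Fund farm {bid['farm']} ({bid['practice']}): ${cost_per_lb:.2f} per lb of phosphorus reduced")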

"We learned a couple interesting things," said Palm-Forster. "First of all, there didn't seem to be a lot of differences between the bids for a cash payment or a tax credit which is interesting because it means we might have some flexibility in how we design programs. If there were the ability to create a tax credit that would be comparable, then we may be able to motivate this kind of management change through that mechanism instead of giving cash payments."

Another surprising result was that the farmers asked for more money for the incentive where they were given a cash payment plus insurance.

"You would expect them to bid less because you're giving them this insurance for free, so you would expect that they would request less cash in order to adopt a practice but they were very skeptical about how insurance would work in this particular setting," Palm-Forster said. "We learned in focus groups afterward that they assumed that there were going to be more transaction costs--time, effort, money being spent trying to comply with program rules and just maintain eligibility--and they didn't view that as being attractive at all."

Farmers also seemed willing to accept the certification price premium as long as it would be comparable to an equivalent amount of cash payment. Palm-Forster said that the issue there is that if it happened in real life, it wouldn't be targeted towards only environmentally vulnerable areas.

"If you imagine there's this certification price premium, and any farmer who is willing to do these practices could be eligible for the premium, that means a farmer that's on a piece of land that's not as sensitive in an environmental sense would be getting the same price premium as a farmer that was on a really environmentally vulnerable piece of land, which is not going to result in the most cost-effective use of those dollars," Palm-Forster said.

One of the most important aspects of this research, according to Palm-Forster, was that the researchers went out into the world and interacted with actual farmers to hear their preferences.

"Talking to the real decision makers is key," Palm-Forster said. "It can be difficult to get farmers to engage with you but it's really important and we learned so much from working with them in that setting. After we did the experiments, we had focus group discussions which let us understand why they were making these decisions in the experiment. This particular paper was enriched by having that understanding of where the farmers were coming from, which was facilitated by the focus groups."

While this study focused on Lake Erie, it can be applicable to other areas of the country such as the Chesapeake Bay and the Mississippi River Basin.

These sorts of economic experiments are important as policy makers need to get as much information as possible from actual farmers to hopefully one day roll out incentive programs that the majority of farmers prefer.

"You want to do all these things before you try to roll out this type of program because you need to learn what would work and what wouldn't," Palm-Forster said. "This would be one piece of all of that ground work. There are a lot of projects right now in the western basin, a lot of researchers are thinking about this problem, and a lot of farmers are engaging in regional programs to help improve the lake but it's still just not enough,"

The research was funded by a grant from the Great Lakes Protection Fund.

A new way to store thermal energy

Boston MA (SPX) Nov 23, 2017

MIT postdoc Grace Han handles a new chemical composite that could provide an alternative to fuel by functioning as a kind of thermal battery.

In large parts of the developing world, people have abundant heat from the sun during the day, but most cooking takes place later in the evening when the sun is down, using fuel - such as wood, brush or dung - that is collected with significant time and effort.

Now, a new chemical composite developed by researchers at MIT could provide an alternative. It could be used to store heat from the sun or any other source during the day in a kind of thermal battery, and it could release the heat when needed, for example for cooking or heating after dark.

A common approach to thermal storage is to use what is known as a phase change material (PCM), where input heat melts the material and its phase change - from solid to liquid - stores energy. When the PCM is cooled back down below its melting point, it turns back into a solid, at which point the stored energy is released as heat.

There are many examples of these materials, including waxes or fatty acids used for low-temperature applications, and molten salts used at high temperatures. But all current PCMs require a great deal of insulation, and they pass through that phase change temperature uncontrollably, losing their stored heat relatively rapidly.

Instead, the new system uses molecular switches that change shape in response to light; when integrated into the PCM, the phase-change temperature of the hybrid material can be adjusted with light, allowing the thermal energy of the phase change to be maintained even well below the melting point of the original material.

The new findings, by MIT postdocs Grace Han and Huashan Li and Professor Jeffrey Grossman, are reported this week in the journal Nature Communications.

"The trouble with thermal energy is, it's hard to hold onto it," Grossman explains. So his team developed what are essentially add-ons for traditional phase change materials, or, "little molecules that undergo a structural change when light shines on them."

The trick was to find a way to integrate these molecules with conventional PCM materials to release the stored energy as heat, on demand. "There are so many applications where it would be useful to store thermal energy in a way that lets you trigger it when needed," he says.

The researchers accomplished this by combining the fatty acids with an organic compound that responds to a pulse of light. With this arrangement, the light-sensitive component alters the thermal properties of the other component, which stores and releases its energy.

The hybrid material melts when heated, and after being exposed to ultraviolet light, it stays melted even when cooled back down. Next, when triggered by another pulse of light, the material resolidifies and gives back the thermal phase-change energy.

"By integrating a light-activated molecule into the traditional picture of latent heat, we add a new kind of control knob for properties such as melting, solidification, and supercooling," says Grossman, who is the Morton and Claire Goulder and Family Professor in Environmental Systems as well as professor of materials science and engineering.

The system could make use of any source of heat, not just solar, Han says.

"The availability of waste heat is widespread, from industrial processes, to solar heat, and even the heat coming out of vehicles, and it's usually just wasted." Harnessing some of that waste could provide a way of recycling that heat for useful applications.

"What we are doing technically," Han explains, "is installing a new energy barrier, so the stored heat cannot be released immediately."

In its chemically stored form, the energy can remain for long periods until the optical trigger is activated. In their initial small-scale lab versions, they showed the stored heat can remain stable for at least 10 hours, whereas a device of similar size storing heat directly would dissipate it within a few minutes. And "there's no fundamental reason why it can't be tuned to go higher," Han says.

In the initial proof-of-concept system "the temperature change or supercooling that we achieve for this thermal storage material can be up to 10 degrees C (18 F), and we hope we can go higher," Grossman says.

Already, in this version, "the energy density is quite significant, even though we're using a conventional phase-change material," Han says. The material can store about 200 joules per gram, which she says is "very good for any organic phase-change material." And already, "people have shown interest in using this for cooking in rural India," she says. Such systems could also be used for drying agricultural crops or for space heating.
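
For a rough sense of what that energy density means, the back-of-the-envelope arithmetic below compares the heat stored in one kilogram of the material with warming the same mass of water; the 1 kg sample size is an arbitrary assumption, and water's specific heat is a standard textbook value.

# Back-of-the-envelope illustration of the reported ~200 J/g storage density.
# The 1 kg sample mass is an arbitrary assumption for illustration.
energy_density_j_per_g = 200.0        # reported for the hybrid phase-change material
sample_mass_g = 1000.0                # assume 1 kg of material

stored_energy_j = energy_density_j_per_g * sample_mass_g   # 200,000 J = 200 kJ

# For scale: how much would that energy warm 1 kg of water?
water_mass_g = 1000.0
water_specific_heat = 4.186           # J/(g*K), standard value for liquid water
temp_rise_c = stored_energy_j / (water_mass_g * water_specific_heat)

print(f"Stored energy: {stored_energy_j / 1000:.0f} kJ per kg of material")
print(f"Roughly enough to warm 1 kg of water by {temp_rise_c:.0f} degrees C")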

"Our interest in this work was to show a proof of concept," Grossman says, "but we believe there is a lot of potential for using light-activated materials to hijack the thermal storage properties of phase change materials."

Indonesian mosques to take up the mantle of fighting climate change

by Mongabay.com on 21 November 2017

Indonesia will establish 1,000 “eco-mosques,” the country’s vice president announced at this month’s UN climate summit in Bonn.

The Southeast Asian nation is home to some 260 million people, fourth after China, India and the U.S. Nearly 90 percent of them identify as Muslim, according to 2010 census data.

Indonesia also has some of the greatest expanses of rainforests, peatlands and mangroves — carbon-rich environments that are rapidly disappearing as industry expands.

“The environmentally friendly mosque or ‘eco-mosque’ program is expected to instill mosques with a concern about the mutual relationship between living things and the environment for the sustainable livelihoods of us all,” Vice President Jusuf Kalla said in a statement.

Practically, the initiative “will help the mosques to source renewable energy, manage their water and food needs sustainably, reduce and recycle waste and provide environmental education,” Thomson Reuters Foundation reported.

More broadly, it aims to cultivate among worshippers a sense of stewardship toward the natural world, in part through education programs that frame the environmental movement as a moral challenge.

Indonesia’s Muslim institutions have addressed the environment before, issuing religious edicts forbidding the trafficking of wildlife or the setting of illegal forest fires.

Hening Parlan, coordinator for environment and disaster management at Aisyiyah, the women’s wing of Indonesia’s second-largest Islamic organization Muhammadiyah, said an eco-mosque movement could unite Indonesian Muslims to fight climate change.

“Because merely adapting to climate change isn’t enough,” she told Mongabay. “This movement is aimed to make all Muslims aware that climate change is threatening our lives.”

The planned 1,000 eco-mosques will be located in urban areas as well as disadvantaged areas lacking water and energy, according to Hening.

“Because mosques in rich areas will not be lacking in water and energy,” she said.

Some of the eco-mosques will be located in the national capital of Jakarta because the city faces water scarcity, with only 50 percent of its households having piped water, the lowest in Indonesia.

As a result, most Jakarta residents rely on raw water drawn from the groundwater beneath the city.

The world’s first “negative emissions” plant has begun operation—turning carbon dioxide into stone

WRITTEN BY Akshat Rath on October 12, 2017

There’s a colorless, odorless, and largely benign gas that humanity just can’t get enough of. We produce 40 trillion kg of carbon dioxide each year, and we’re on track to cross a crucial emissions threshold that will cause global temperature rise to pass the dangerous 2°C limit set by the Paris climate agreement.

But, in hushed tones, climate scientists are already talking about a technology that could pull us back from the brink. It’s called direct-air capture, and it consists of machines that work like a tree does, sucking carbon dioxide (CO2) out from the air, but on steroids—capturing thousands of times more carbon in the same amount of time, and, hopefully, ensuring we don’t suffer climate catastrophe.

There are at least two reasons that, to date, conversations about direct air capture have been muted. First, climate scientists have hoped global carbon emissions would come under control, and we wouldn’t need direct air capture. But most experts believe that ship has sailed. That brings up the second issue: to date, all estimates suggest direct air capture would be exorbitantly expensive to deploy.

For the past decade, a group of entrepreneurs—partly funded by billionaires like Bill Gates of Microsoft, Edgar Bronfman Jr. of Warner Music, and the late Gary Comer of Land’s End—have been working to prove those estimates wrong. Three companies—Switzerland’s Climeworks, Canada’s Carbon Engineering, and the US’s Global Thermostat—are building machines that, at reasonable costs, can capture CO2 directly from the air. (A fourth company, Kilimanjaro Energy, closed shop due to a lack of funding.)

Over the past year, I’ve been tracking the broader field of carbon capture and storage, which aims to capture emissions from sources such as power plants and chemical factories. Experts in the field look at these direct-air-capture entrepreneurs as the rebellious kids in the class. Instead of going after the low-hanging fruit, one expert told me, these companies are taking moonshots—and setting themselves up for failure.

Climeworks just proved the cynics wrong. On Oct. 11, at a geothermal power plant in Iceland, the startup inaugurated the first system that does direct air capture and verifiably achieves negative carbon emissions. Although it’s still at pilot scale—capturing only 50 metric tons CO2 from the air each year, about the same emitted by a single US household or 10 Indian households—it’s the first system to convert the emissions into stone, thus ensuring they don’t escape back into the atmosphere for the next millions of years.

The difficulty of direct air capture can be illustrated by simple physics. Imagine if you were allowed to eat as many M&M’s as you wanted—as long as you ate only red ones. If in a bag of M&M’s there was one red M&M for every 10 pieces of candy, it would be easy to find them and eat them with glee. But imagine if the concentration fell to one in every 2,500. You might give up searching for even a single M&M.

Direct air capture unit along with the cooling towers of the geothermal power plant in Hellisheidi, Iceland. (Climeworks/Zev Starr-Tambor)

At a coal-power plant, the exhaust flue gas contains about 10% carbon dioxide (i.e., about one in 10 gas molecules are CO2). Capturing the greenhouse gas at these relatively high concentrations requires less energy than capturing it from the air, where it is present at just 0.04% concentration (about one in 2,500 gas molecules). A 2011 report from the American Physical Society estimated that it may cost between $600 and $1,000 per metric ton of CO2 captured from the air. Capturing it at the source—at a coal-burning plant, for example—could cost less than one-tenth of that.
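
The arithmetic behind that comparison is simple enough to check directly; the short sketch below just restates the concentrations and cost figures quoted above.

# Quick arithmetic on the concentration and cost figures quoted in the article.
co2_in_flue_gas = 0.10        # ~1 in 10 molecules of coal-plant flue gas is CO2
co2_in_air = 0.0004           # ~1 in 2,500 molecules of ambient air is CO2

dilution_factor = co2_in_flue_gas / co2_in_air
print(f"CO2 is about {dilution_factor:.0f} times more dilute in air than in flue gas")

# 2011 American Physical Society estimate vs. "less than one-tenth of that" at the source.
air_capture_cost = (600, 1000)                        # $ per metric ton of CO2
point_source_cost = tuple(c / 10 for c in air_capture_cost)
print(f"Air capture: ${air_capture_cost[0]}-${air_capture_cost[1]}/t; "
      f"point-source capture: under roughly ${point_source_cost[0]:.0f}-${point_source_cost[1]:.0f}/t")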

But air capture still matters. First, we currently have no practical way to deal with the CO2 released by cars, ships, and planes. Second, because we are on track to emit more CO2 than is compatible with staying under the 2°C limit, we will likely need a means of sucking some of that extra greenhouse gas back out of the air.

Most systems to capture CO2 depend on a process called “reversible absorption.” The idea is to run a mixture of gases (air being the prime example) over a material that selectively absorbs CO2. Then, in a separate process, that material is manipulated to pull the CO2 out of it. The separated CO2 can then be compressed and injected underground. Typically there’s a limited supply of the absorbing material, so it will get put through the cycle once again to capture more of the greenhouse gas.

Climeworks and Global Thermostat have piloted systems in which they coat plastics and ceramics, respectively, with an amine, a type of chemical that can absorb CO2. Carbon Engineering uses a liquid system, with calcium oxide and water. The companies say it’s too early in the development of these technologies to predict what costs will be at scale. “It’s like if you asked someone in 1960 what the cost of commercial rockets would be today,” says David Keith of Carbon Engineering. What they are willing to share are cost targets.

Jan Wurzbacher, Climeworks’s director, says it hopes to bring costs down to about $100 per metric ton of carbon dioxide. That’s close to the price Carbon Engineering is targeting, according to Geoffrey Holmes, the company’s business development manager. Peter Eisenberger, co-founder of Global Thermostat, says their technology will be even cheaper: when scaled up, he says, costs will drop to as low as $50 per metric ton. (Intriguingly, the startup’s projected capture cost seems to be inversely proportional to the money each says it has raised: about $15 million in private investment for Climeworks and Carbon Engineering, and $50 million for Global Thermostat.)

Each of the startups has built a functional pilot plant to prove its technology, with the ability to capture hundreds of kilograms of CO2. And all boast that their tech is modular, meaning they can build a direct air capture plant as small or large as somebody is ready to pay for. Even at a capture cost of $50 per metric ton, if we have to capture as much as 10 billion metric tons per year by 2050, we are looking at spending $500 billion each year capturing carbon dioxide from the air. It seems outrageous, but it may not be if climate change’s other damages are put in perspective—and that’s what these startups are betting on.

Global Thermostat’s pilot plant at the Stanford Research Institute in Palo Alto was sitting idle when I visited it. Carbon Engineering is waiting to build out the technology more—including the ability to convert captured CO2 into fuels—before it starts scaling up its business. Only Climeworks has been able to show success in commercial applications.

In May this year, Climeworks set up its first commercial unit near Zurich, Switzerland, capturing about 1,000 metric tons of CO2 from the air each year (equivalent to 20 US households’ annual emissions). The captured CO2 is supplied to a nearby greenhouse, where a high concentration of the gas boosts crop yield by 20%.

But the company’s newest installation in Iceland is even more impressive, because it’s the first true “negative emissions” plant.

Buried once and for all

To build it, Climeworks first needed a “carbon-neutral” power plant. They found one in Hellisheidi, Iceland, where the public utility company Reykjavik Energy runs a geothermal power plant. The Hellisheidi plant, about 15 miles (25 km) southeast of the capital, Reykjavik, uses naturally occurring heat from a volcanically active region to produce electricity and heat, by pumping water through an underground network of pipes. As the water travels underground, it heats up and turns into steam, which can be used to run turbines. The plant produces about 300 MW of electricity (enough to power 200,000 American households) and about 130 MW of heat.

Though geothermal is a clean energy source, the process of recovering heat releases gases—a mixture of carbon dioxide, hydrogen sulfide, and hydrogen. It’s not a massive amount of CO2—for each unit of energy produced, a geothermal plant produces 3% of the carbon emissions of a coal-fired power plant—but it’s still something.

In 2014, Reykjavik Energy, with the help of academics in the US, Europe, and Iceland, formed a project called CarbFix to test technology to get rid of the small amount of gases it does produce. And since the Paris climate agreement, the utility company has been ramping up its emissions-reduction efforts.

Reykjavik Energy’s geothermal power plant in Hellisheidi, Iceland.

Instead of using reversible absorption, CarbFix uses Iceland’s plentiful natural resources to capture carbon dioxide. Every time you open a can of soda, you are drinking CO2 dissolved in water. The Hellisheidi plant works on a similar principle. When the gases released from geothermal vents are mixed with water, they get absorbed ever so slightly. The mixture—27 kg of fresh water for each kg of CO2—is then injected 700 meters underground.

In a 2016 study, scientists found that the carbon dioxide in these water mixtures was reacting with Iceland’s vast basaltic rock—dark igneous rock usually found under ocean floors—to form minerals. This process usually takes hundreds or thousands of years, but what was surprising in Iceland was that the mineralization occurred in less than two years. The speed probably has something to do with the unique local geology. Sandstone aquifers—which have been the most well-studied type of rock system when it comes to carbon dioxide-injection systems—react very slowly with CO2. Basalt rock, on the other hand, seems to react much more quickly, likely because of the presence of metals like iron and aluminum.

The study showed, for the first time, that storing carbon dioxide underground is easier and safer than it has been made out to be. Once locked into the minerals, the carbon dioxide cannot enter the atmosphere for millions of years. Better still, this sort of basalt rock is present in large deposits around the world—enough to take in many decades of fossil-fuel emissions, direct air capture, and more.

Over the past three years, more than 18,000 metric tons of CO2 have been injected into the ground as part of the project, according to Edda Aradóttir, a CarbFix geologist. Moreover, they’ve been able to do it for less than $30 per metric ton of CO2.
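
Multiplying out the figures CarbFix reports gives a sense of the scale involved; the sketch below assumes the 27-to-1 water ratio held across all of the injected CO2, which the article implies but does not state outright.

# Scale of the CarbFix injections, using the figures quoted in the article.
water_per_unit_co2 = 27.0      # kg of fresh water per kg of CO2 (the same ratio holds in tonnes)
co2_injected_t = 18_000        # metric tons of CO2 injected over about three years
cost_per_t = 30.0              # reported injection cost, $ per metric ton of CO2

water_needed_t = co2_injected_t * water_per_unit_co2
total_cost = co2_injected_t * cost_per_t

print(f"Fresh water required: about {water_needed_t:,.0f} t for {co2_injected_t:,} t of CO2")
print(f"Injection cost: about ${total_cost:,.0f} at ${cost_per_t:.0f} per metric ton")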

This month, Climeworks installed a unit that captures carbon dioxide directly from the air and transfers it to CarbFix to inject underground. Because CarbFix has been monitoring the injection sites for the last three years, they can be sure there will be no leakage. And once mineralized, the CO2 will remain trapped for thousands or millions of years. This makes the Climeworks-CarbFix system the world’s first verified “negative emissions” plant.

Basalt rock containing carbonates, which are solids that have trapped carbon dioxide. (Sandra O Snaebjornsdottir)

(A chemical plant in Chicago, where Archer Daniels Midland ferments corn to produce ethanol, claims to be a negative-emissions plant. The company says that because corn consumes CO2 to grow in the first place, even though fermentation releases CO2, it is carbon neutral—simply returning what it took from nature. If you then capture and inject those emissions underground, you could technically consider it negative emissions. But experts haven’t yet run the numbers to verify Archer Daniels Midland’s claim.)

Climeworks says it is now looking for customers who want to buy their way into programs that cut their emissions. The delivery company DHL, for example, has committed to reaching zero emissions by 2050. But even if DHL moves its entire road fleet to electric vehicles, there is currently no technology to cut emissions from the airplanes it relies on. The hope is that DHL will pay Climeworks to bury those excess emissions underground.

Academics used to think that direct air capture would be too expensive for any practical purposes. They still tend to think that for carbon capture and storage more broadly. But what Climeworks and its competitors are showing is that, if direct air capture can be made cheap enough for there to be commercial interest, then the economics of carbon capture at point-sources will likely work, too. And if nothing else, the existence of direct air capture gives humanity a high-premium insurance policy against what would surely be a much more expensive disaster.

Nebraska Allows Keystone XL Pipeline, but Picks a Different Path

By MITCH SMITH, NOV. 20, 2017

LINCOLN, Neb. — The nearly decade-long battle over Keystone XL has come to symbolize much more than what the disputed pipeline would actually be: a subterranean tube, 36 inches in diameter, carrying crude oil from Canada to Nebraska.

So when state regulators here said Monday that the project could proceed, their decision was initially seen as a hard-won validation for President Trump and the American laborer, and lamented as a grave threat to pristine farmland and the groundwater below.

But while Monday had been expected to provide a clear and final answer on Keystone XL’s future, it may have only created more uncertainty. The regulators in Nebraska rejected the pipeline company’s preferred route, approving the project only on an alternate path. TransCanada, the pipeline company, later issued a short statement that did not say whether it would move forward with construction, leaving people on both sides of the issue unsure of how to react.

“They do not get their preferred route, the route that we have been fighting in courts over for eight years,” Jane Kleeb, the longtime leader of Nebraska’s anti-pipeline efforts, said of TransCanada. “What is wrong — and what we will continue to fight — is that this pipeline is still on the table.”

Over the course of two presidencies, Democrats and environmentalists have stood against the pipeline at every turn, making it a potent emblem of a broader battle to stave off climate change and protect drinking water, an effort that included the mobilization of thousands of protesters last year to oppose a different pipeline in North Dakota.

Just as persistently, Republicans, business groups and labor unions have long pressed the Keystone XL proposal as a proxy for wider goals of creating jobs and expanding the energy sector. Some of them saw Monday’s decision as a boost for those efforts.

Tim Huelskamp, a former Republican congressman from Kansas now working for the Heartland Institute, a conservative group, released a statement calling the approval a sign that “no longer do fake environmental concerns hold up projects such as these.”

Karen Harbert, an official with the United States Chamber of Commerce, said she was “pleased that the project has cleared this final hurdle” but that TransCanada still needed to decide whether to move forward.

“This industry is a very practical industry, and they have learned over time that it’s not necessarily in your best interest to take a big victory lap knowing that the path forward is still fraught with problems,” Ms. Harbert said. If TransCanada is “going to give the greenlight to this, they’ve got to make sure they’ve got a pathway there to actually build it and operate it.”

The newly approved route for the pipeline is slightly longer and brings potential challenges for the company, which would need to negotiate new easements with landowners not along the preferred route. TransCanada had already paid many landowners along the preferred route for access to their properties.

TransCanada officials declined an interview request. The company’s president, Russ Girling, said in a statement that it would “conduct a careful review” and assess “how the decision would impact the cost and schedule of the project.”

Monday’s decision was hardly the first time that Nebraska, a conservative state with an independent streak, has muddied the prospects of Keystone XL, which would run more than 1,100 miles from Alberta, Canada, to southern Nebraska and connect there with existing pipelines. Permits and land-use easements had long been in place along the pipeline’s route through Canada, Montana and South Dakota, leaving Nebraska as the last major obstacle to construction.

In 2011, an uproar led TransCanada to reroute its proposed path around the state’s ecologically sensitive Sandhills. Later, President Barack Obama used a case pending before the Nebraska Supreme Court as a reason to delay deciding on a federal permit. Mr. Obama later rejected that permit, citing climate change, but Nebraska opponents remobilized this year when Mr. Trump resurrected the project.

In recent days, with the Nebraska decision looming, many of Keystone XL’s opponents here pointed to a spill last week of 210,000 gallons of oil from another TransCanada pipeline in South Dakota as a grim example of what was at stake.

“They’re going to rape and pillage our soil,” said Art Tanderup, whose farm near Neligh, Neb., is along the approved route, and who once hosted an anti-pipeline concert on his land featuring Willie Nelson. “We will do everything in our power to make sure it doesn’t happen.”

But supporters of the project have long touted its economic benefits, and seemed to win over the majority of the Nebraska Public Service Commission, the regulatory board that voted 3-2 to approve the alternate route.

The three commissioners who voted to approve the permit, all Republicans, said in a written opinion that they were very cognizant of the “impacts to the natural resources of the state,” but that there was “no utopian option” and that building the pipeline would bring needed tax revenue to rural governments.

The approved route enters Nebraska at the same spot and leads to the same end point as the company’s preferred option. But in between, the alternate route veers east to follow the path of an existing pipeline, a switch that regulators say could make emergency responses to either pipeline more efficient.

Opponents of the Keystone XL pipeline, including many farmers and ranchers, had packed the small hearing room on Monday for the announcement. When the vote was tallied, some landowners learned that their land would no longer be on the pipeline route and voiced relief. Others discovered that a path through their property had been approved.

Even before the decision, some questioned whether there would still be enough interest among oil shippers to support the pipeline. But TransCanada reiterated its support of the project in early November. “We anticipate commercial support for the project to be substantially similar to that which existed when we first applied for a Keystone XL pipeline permit,” a company statement said.

But now the company must consider the new route, the potential for new legal challenges and a different political landscape.

Last year, thousands of protesters gathered near the Standing Rock Sioux reservation to protest the Dakota Access oil pipeline’s path through North Dakota, at times clashing with law enforcement. Since then, smaller demonstrations have targeted pipeline projects in Minnesota, Wisconsin and Texas. Activists have vowed to assemble again if construction ever begins on Keystone XL.

“One thing we’ve learned through this whole process is we take our victories as we can get them, no matter how big or how small,” Randy Thompson, a Nebraska rancher who has been among the most visible pipeline opponents, said on Monday. “And this today is another victory for us because the damn pipe is not in the ground, and they said 10 years ago they were going to have it in the ground.”

New Analysis: EPA Is Ignoring a Lot of Toxics in US Drinking Water

The law and regulations make it hard for the U.S. EPA to regulate new chemicals in drinking water. A lot of testing is required to show the chemical is harmful and found widely in water. New analysis of existing data shows a family of chemicals that includes perfluorooctanoic acid, or PFOA, may be doing much more harm than EPA admits.

"It takes a lot to convince the US Environmental Protection Agency to limit how much of a toxin can legally show up in America’s drinking water. The threshold for determining probable human harm is very high, and even if harm is detected, the toxin has to show up in enough public water sources with enough frequency, and at levels sufficiently high, before the EPA considers it significant enough to regulate. And even after all the exhaustive studies are done, the decision ultimately comes down to “the sole judgment of the Administrator,” the head of the agency, who may or may not be swayed by the data.

Under the Safe Drinking Water Act, the EPA is responsible for determining when a chemical needs to be regulated in the US water supply, but it hasn’t added a new toxin to its list since 1996. (Even the Government Accountability Office thinks that’s a sign of a broken system.) In the past two decades, tens of thousands of new chemicals have come onto the market, and plenty of others that pre-date 1996 have been discovered to harm human health.

For example, newly released lab results show perfluorooctanoic acid, or PFOA (an ingredient in Teflon, the chemical used to make non-stick cookware) is far more prevalent in American drinking water than previously thought. Exposure to PFOA has been linked to a range of health risks including cancer, immune system issues, and developmental problems in fetuses."

Community Solar Heads for Rooftops of NYC’s Public Housing Projects

New York City’s public housing authority is taking bids in a plan to lease its roofs for community solar projects that could power thousands of urban homes.

By Lyndsey Gilpin, InsideClimate News

Nov 22, 2017

Miguel Rodriguez, who grew up in public housing in New York, is developing skills as a solar installer through a public housing-connected program. Credit: New York City Housing Authority

When you look out across New York City from the top of the Empire State Building, thousands of empty rooftops come into view. They could be ripe for solar panels, but the overwhelming majority of residents and business owners inside are renters with no control over those sunny patches of real estate.

The city's public housing authority, the largest public housing landlord in the United States, recognizes the potential, and it has a plan to put hundreds of those rooftops to work.

In January, the authority will start reviewing bids for phase one of a project to increase the amount of solar power generated in the city. It's a small step, but one that could help grow the market for urban solar power. The goal is to install 25 megawatts of solar panels atop the city's public housing buildings, enough capacity to power 6,600 households, as part of New York City's 100 percent renewable commitment.

There's one catch: The New York City Public Housing Authority (NYCHA) can't directly use that power. It already has a deal with the electric utility Con Edison.

Instead, the authority plans to lease its rooftops for community solar projects―an arrangement that will allow companies to install solar panels in one location and sell the energy to customers who can't install their own.

"Our goal is to help solar power be accessible by anybody in New York City, which is not the case currently," said Daphne Boret-Camguilhem, senior program manager for energy and sustainability at NYCHA. By expanding the use of rootop solar, New York City would not only reduce its carbon footprint―the city has a goal to cut emissions 80 percent by 2050, and buildings are its largest sources of greenhouse gases―but also create renewable energy jobs for low-income residents and connect more communities to cleaner, cheaper power.

A view across New York City's Roosevelt Island shows expanses of unused roof space. Credit: Saul Loeb/AFP/Getty Images

Solstice, a company that connects customers to community solar projects, says projects like this can start to fill a massive gap in the solar market: the 77 percent of Americans who cannot access rooftop solar because they have a shaded roof, rent their property, or have low income or a credit history that prevents them from purchasing panels.

"Many people are skeptical of the idea of community solar because it sounds too good to be true: you're saving money and switching to supporting clean energy, without having to put anything on your roof," said Kelly Roache, senior program manager for Solstice. "We get people to come learn about how that works, through relationships and trust-building. We're peeling back the layers to see what has prevented them from accessing it."

How Community Solar Works

Most community solar projects work in one of two ways: community solar "gardens" are organized by cooperatives or communities that own a solar farm, or space for solar panels is leased from large developers or landowners.

Financing for these projects varies depending on state regulations, but one common model is for a utility or other company to buy about 40 percent of the power, with the remaining 60 percent purchased by communities, typically through their local utility, Roache said. People who participate can save 10 to 20 percent on their bills, aren't penalized if they move, and there's usually a waiting list to join the project if someone drops out.
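
As a rough illustration of that model, the sketch below works through the split and the savings range; the 2-megawatt project size, 15 percent capacity factor and $100 monthly bill are assumptions chosen only for the example, while the 40/60 split and 10-20 percent savings come from the description above.

# Illustrative community-solar arithmetic; the project size, capacity factor and the
# $100 monthly bill are assumptions; the 40/60 split and 10-20% savings are from the text.
project_mw = 2.0
capacity_factor = 0.15                       # assumed for fixed rooftop solar
annual_mwh = project_mw * capacity_factor * 8760

utility_share_mwh = annual_mwh * 0.40        # ~40% bought by a utility or other company
community_share_mwh = annual_mwh * 0.60      # ~60% bought by community subscribers

monthly_bill = 100.0                         # assumed subscriber bill, $
savings_low, savings_high = monthly_bill * 0.10, monthly_bill * 0.20

print(f"Annual output: about {annual_mwh:,.0f} MWh "
      f"({utility_share_mwh:,.0f} MWh utility / {community_share_mwh:,.0f} MWh community)")
print(f"Subscriber savings on a ${monthly_bill:.0f} bill: ${savings_low:.0f}-${savings_high:.0f} per month")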

The rooftop of one restaurant supply company's giant warehouse in the Bronx holds 1.5 megawatts of solar panels. Credit: Don Emmert/AFP/Getty Images

Groups like Solstice play an important role by building communities of solar power users. The company reaches potential customers by canvassing door-to-door, building relationships with faith-based organizations, community centers, and environmental groups, and teaching energy literacy, Roache said. So far, it has generated 23 megawatts of solar demand by connecting customers to solar projects in three states and Washington, D.C.

NYCHA's first phase will install up to 7 megawatts of community solar on the roofs and parking lot canopies in 14 public housing developments, enough to power up to 1,600 homes. The companies chosen will pay NYCHA to rent the rooftop space, and those companies will deliver solar power to customers through Con Edison.
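
Dividing the stated capacities by the number of homes they are expected to serve gives a sense of the solar capacity assumed per household; the figures below come straight from the numbers quoted in this article.

# Implied solar capacity per household, from the figures quoted above.
phase_one_kw = 7_000        # up to 7 MW in phase one
phase_one_homes = 1_600
overall_goal_kw = 25_000    # 25 MW overall goal
overall_goal_homes = 6_600

print(f"Phase one: about {phase_one_kw / phase_one_homes:.1f} kW of panels per home")
print(f"Overall goal: about {overall_goal_kw / overall_goal_homes:.1f} kW per household")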

Building a Solar Market with Public Housing

In order to create greater access for low-income customers, there have to be high-profile programs―like the public housing solar project in New York―that bring in investment and create a market, said Sean Gallagher, vice president of state affairs at Solar Energy Industries Association.

Other public housing authorities around the country are also exploring community solar, but with different approaches. Among them:

Denver's housing authority plans to have its own community solar garden ready for operation by the end of the year on 74 acres at a solar test facility in Aurora. The project is expected to power up to 700 public housing units and low-income homes, while cutting energy bills by about 20 percent, offsetting over 54,000 tons of carbon emissions, and providing job training.

St. Paul, Minnesota's public housing agency launched community solar gardens early this year outside of the Twin Cities to meet the electricity needs of 10 of its public housing high-rises. The solar gardens are expected to save the housing agency $130,000 a year.

Those and other projects contribute to the Department of Housing and Urban Development's Renew300 Initiative, which aims to install 300 megawatts of solar on federally assisted housing by 2020.

Not all regions and utilities are as open to community solar, however, Gallagher said. Many utilities have net metering limits for the size of solar projects or don't allow third-party purchasing agreements, and electricity rates vary by state and utility company.

New York's community solar market hit a bump in September, when the New York Public Service Commission approved a plan to replace net metering with a complex metric for large-scale community solar projects. Solar advocates say it may undercut the community solar market by allowing utilities to decide the value of proposals.

"It's the tricky part of what New York is trying to hit―can you construct a compensation mechanism that's rational, fair, and gives customer some opportunity to save some money on their bills," Gallagher said.

Ripple Effects: Training and Jobs

Community solar is still a nascent market, but that's starting to shift. There are projects in 26 states, and according to GTM Research, 410 megawatts of community solar will be installed in the U.S. in 2017, and by 2019, there will be some 500 megawatts installed each year.

In projects involving public housing, cities are aiming for more than just cheap, clean power―they see benefits in job creation and training, too. In New York, 30 percent of the hires for the NYCHA project have to be NYCHA residents.

Miguel Rodriguez, who grew up in the Lillian Wald Houses, a public housing project on the Lower East Side of Manhattan, was used to frequent blackouts, when the complex would suddenly lose power. After Hurricane Sandy flooded the neighborhood, Green City Force came in to help with restoration efforts. Rodriguez enrolled in the program and became interested in solar. After getting certified and working as an installer, Rodriguez, who is now 24, got involved with the NYCHA project.

"It would be huge," he said. "People would have a reason to get together for something positive in the neighborhood, and the social value of the neighborhood would go up."

Now finishing up his associate's degree in New York, Rodriguez is already thinking about how to use the skills he's learned to build community solar projects outside the U.S. "My family is from the Dominican Republic, and there's shortages and energy problems," he said. "I'm thinking about learning more and then going in my own backyard."

New nuclear power cannot rival windfarms on price, energy boss says

Innogy Renewables chief claims future reactors will not be competitive as offshore windfarms become even cheaper

Adam Vaughan

Wednesday 22 November 2017 10.37 EST

Last modified on Thursday 23 November 2017 05.03 EST

New nuclear power stations in the UK can no longer compete with windfarms on price, according to the boss of a German energy company’s green power arm.

Hans Bunting, the chief operating officer of renewables at Innogy SE, part of the company that owns the UK energy supplier npower, said offshore windfarms had become mainstream and were destined to become even cheaper because of new, bigger turbines.

Asked whether nuclear groups that want to build new reactors in the UK could compete with windfarms on cost, even when their intermittency was taken into account, Bunting replied: “Obviously they can’t.”

His comments came after MPs criticised the £30bn cost to consumers for EDF Energy’s Hinkley Point C nuclear power station, and said ministers should revisit the case for new nuclear before proceeding with more projects.

Innogy recently secured a subsidy of £74.75 per megawatt hour of power to build a windfarm off the Lincolnshire coast, which is £17.75 cheaper than Hinkley and should be completed about three years earlier.
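
A quick check of the arithmetic implied by those figures is below; the Hinkley Point C price is derived here by adding the stated £17.75 gap to the windfarm's subsidy, not quoted directly in this article.

# Price comparison implied by the figures above (the Hinkley price is derived, not quoted).
wind_strike = 74.75                   # £ per MWh, Innogy's Lincolnshire offshore windfarm
stated_gap = 17.75                    # £ per MWh cheaper than Hinkley, per the article
implied_hinkley = wind_strike + stated_gap

print(f"Implied Hinkley Point C price: £{implied_hinkley:.2f}/MWh")
print(f"The windfarm is {stated_gap / implied_hinkley:.0%} cheaper at £{wind_strike:.2f}/MWh")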

“What we see now [with prices] is with today’s technology. It’s not about tomorrow’s technology, which is about [to come in] 2025, 2027, when Hinkley will most likely come to the grid ... and then it [windfarms] will be even cheaper.”

While the company is planning to use the most powerful turbines in the world today for the Lincolnshire windfarm, Bunting said even bigger ones in development would drive costs down further.

“A few years ago everyone thought 10MW [turbines] was the maximum, now we’re talking about 15[MW]. It seems the sky is the limit,” he said. “[It] means less turbines for the same capacity, less steel in the ground, less cables, even bigger rotors catching more wind, so it will become cheaper.”

However, EDF argued that nuclear was also on a path to lower costs.

“Early offshore wind projects started at around £150 per MWh and developers have shown they can offer lower prices by repeating projects with an established supply chain – the same is true for nuclear,” an EDF spokesman said.

“EDF Energy’s follow-on nuclear projects at Sizewell and Bradwell will remain competitive with other low-carbon options and we are confident they can be developed at a significantly lower price than Hinkley Point C.”

In an interview with the Guardian, Bunting said Innogy was strongly committed to the UK despite its subsidiary npower merging with the big-six supplier SSE. A third of the group’s renewables staff are based in the UK.

“The npower and SSE merger does not for us mean we are going to leave the UK. No way. We’re going to stay here, and grow here,” he said.

He argued the new supplier would be good for billpayers, contrary to consumer groups’ fears. “There is an industrial logic in it. I think at the end of the day it will help competition because then you have two large players on the market, and they will be more efficient.”

Bunting said he would like to build onshore windfarms in the UK too, if the government rethought its ban on subsidies for them.

He said the political argument against them – public opposition in Tory shires – no longer stood because potential windfarms in Scotland and Wales were more likely to win subsidies.

“England shouldn’t worry because England doesn’t have such good wind conditions … in an auction [for subsidies] the English sites would anyway struggle to qualify against Welsh and Scottish sites.”

Innogy would also take a close interest in building large solar power plants, if ministers reopened support for them, he added.

Bunting rejected the idea that the subsidy costs of paying for clean power should be shifted off energy bills and into general taxation, as British Gas’s boss has argued for.

Such a change would make the cost of clean power less transparent and deter households and businesses from taking steps to save energy, he said. “If part of the energy [costs] is tax-financed it will become completely intransparent,” he said.

A spokesman for the Department for Business, Energy and Industrial Strategy said: “We need a diverse energy mix to ensure that demand for energy can always be met, and both nuclear and renewables will play an important role in this for many years to come.”

A Power Plant Is Burning H&M Clothes Instead of Coal

By Jesper Starn

November 23, 2017, 7:01 PM EST Updated on November 24, 2017, 3:21 AM EST

Trash-burning utility uses discarded clothes as fuel

Plant’s last coal delivery arrived on Tuesday by boat

Burning discarded clothing from retail chain Hennes & Mauritz AB is helping a Swedish power plant replace coal for good.

The combined heat and power station in Vasteras, northwest of Stockholm, is converting from oil- and coal-fired generation to become a fossil fuel-free facility by 2020. That means burning recycled wood and trash, including clothes H&M can’t sell.

“For us it’s a burnable material,” said Jens Neren, head of fuel supplies at Malarenergi AB, a utility which owns and operates the 54-year-old plant about 100 kilometers (62 miles) from Stockholm. “Our goal is to use only renewable and recycled fuels.”

While Sweden prides itself on an almost entirely emission-free power system thanks to its hydro, nuclear and wind plants, some local municipalities still use coal and oil to heat homes and offices during cold winter days. By converting old plants to burn biofuels and garbage, the biggest Nordic economy is hoping to edge out the last of its fossil fuel units by the end of this decade.

Malarenergi has a deal with the neighboring city of Eskilstuna to burn their trash, some of which comes from H&M’s central warehouse in the same city. The refuse wasn’t specified as clothing until it was highlighted in a Swedish national television program on Tuesday.

“H&M does not burn any clothes that are safe to use,” Johanna Dahl, head of communications for H&M in Sweden, said by email. “However it is our legal obligation to make sure that clothes that contain mold or do not comply with our strict restriction on chemicals are destroyed.”

The Vasteras plant has burned about 15 tons of discarded clothes from H&M so far in 2017, compared with about 400,000 tons of trash, Neren said. Malarenergi has deals with several nearby cities to receive rubbish and even imports waste from Britain to fuel its main boiler.

The facility, which supplies power to about 150,000 households, burned as much as 650,000 tons of coal at its peak in 1996.
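
Put against the plant's other fuel figures, the clothing is a tiny share of what gets burned; the sketch below simply compares the quantities quoted in this article, assuming the 15 tons of clothes and 400,000 tons of trash refer to the same 2017 period.

# Rough proportions from the figures quoted in the article.
clothes_t = 15           # tons of discarded H&M clothing burned so far in 2017
trash_t = 400_000        # tons of trash burned over the same period (assumed)
peak_coal_t = 650_000    # tons of coal burned at the plant's 1996 peak

print(f"Clothing share of the 2017 fuel mix: {clothes_t / (clothes_t + trash_t):.5%}")
print(f"2017 trash volume as a share of peak coal use: {trash_t / peak_coal_t:.0%}")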

On Tuesday, the last coal ship docked in Vasteras to supply the plant’s two remaining fossil-fuel generators from the 1960s with enough supplies to last until 2020. That’s when a new wood-fired boiler will be added to supplement the facility’s existing biofuel and trash burning units.

The Dirty Secret Hiding in Your Carbon Mountain Bike

Sarah Max, Nov 20, 2017

Riding bikes may be green, but the manufacturing behind them can be far from it

Last January, Leo Kokkonen, founder of Finland-based Pole Bicycles, visited China in search of a frame manufacturer, the final step of a two-year-long project to design, test, and launch the company’s first carbon-fiber mountain bike.

Founded in 2013, Pole quickly made a name for itself among critics after its Evolink bikes—known for their long wheelbase—took top honors in the industry’s 2017 Design and Innovation Awards. The rollout of a carbon frame was supposed to be the next logical step to put Kokkonen’s startup on the international map in an industry where carbon is the new king.

By the time Kokkonen boarded his flight home to Finland, however, he’d made up his mind to pull the plug on the whole project.

“It wasn’t just one thing. It was many,” he says, recounting how his lungs burned after an easy ride outside Dongguan, China, where coal-fired power plants cast a cloud over the industrial city. At the factory that was to make his frames, Kokkonen saw just how much water, electricity, and human labor is required to lay, mold, and bond the carbon, a process that entails working with toxic resins. Then there’s the waste. “We knew that carbon fiber is not recyclable, but our idea was to create a frame that was indestructible, so at least we could increase the product lifespan,” Kokkonen says. But when he asked what the facility did with the excess carbon trimmed off each frame—about a third of every carbon sheet is wasted—he was shocked by the answer: “They said they dump it in the ocean.”

Kokkonen isn’t the first person in the bicycling industry to question the environmental effects of carbon. Whereas the more commonly used aluminum can be easily melted down and reformed, the source of carbon’s strength—microscopic fibers covered in epoxy resin and cured at high temperatures—makes it prohibitively difficult to reclaim.

Researchers are working on ways to improve carbon recycling, but the field is still nascent, and the process far from perfect. The resin typically used to bond the carbon fibers together is not only toxic, but also has to be burned off or chemically dissolved to return the fiber to a reusable state. And because recycled carbon fiber can’t bear the same loads as virgin material, it can only be used by the aerospace, automobile, and bicycling industries for nonstructural, injection-molded parts.

Carbon recycling facilities have cropped up in the United States and Europe, but they are not widespread in Asia, where 99 percent of bikes sold in the U.S. market are made, according to the National Bicycle Dealers Association. So when waste gets trimmed during the manufacturing process, odds are it ends up in a landfill—or, as Kokkonen learned, the ocean—where the nonbiodegradable material simply remains. Forever.

That’s why, in 2010, Wisconsin-based Trek Bikes partnered with the South Carolina recycler Carbon Conversions to recycle the carbon prototypes, damaged frames, noncompliant parts, and waste coming from its U.S. manufacturing facility. Even so, the carbon Trek manages to recycle each year represents a small fraction of the amount it uses: Most of its bikes are made in Asia, where the company says it is actively investigating recycling programs.

California’s Specialized Bicycle Components also piloted a carbon recycling program in 2015 but ended it the next year because there wasn’t enough demand to make it profitable for the recycler. The company has set up a pilot program in Germany with a recycler that has developed a way to preserve the original fiber, and Specialized is exploring new recycling options for the U.S. market. “Until then, any frame returned to SBC or our dealers is being stored in our Salt Lake City warehouse,” says Troy Jones, the company’s corporate social responsibility manager, formerly a compliance manager at Outdoor Research and REI and the first chairperson of the Outdoor Industry Association’s fair labor working group. “No frames returned to SBC are being disposed of in a landfill.”

In 2014, Specialized partnered with bicycle component manufacturers SRAM and DT Swiss and Duke University’s Nicholas School of the Environment to research the environmental impact of bicycle manufacturing. Unlike some corporate-backed reports, this one did not give its partners a pass, noting that the industry had “done little to address” the environmental impact of making bikes.

The study compared Specialized’s carbon Roubaix road frame with its aluminum Allez. Based on waste, water use, and electricity use, the results slightly favored the aluminum frame, and the researchers recommended that bike manufacturers collaborate, as other outdoor product makers have, to create best practices, keep better data, and measure their impact.

To that end, Specialized and SRAM are part of a handful of brands to join the World Federation of Sporting Goods Industry’s Responsible Sport Initiative, launched in 2013, to share information about suppliers and use their collective influence to effect change. The brands receive a list of facilities where they overlap and then divide and conquer by auditing those plants. When the group identifies an area that needs improvement, “the request carries more weight because the factories know it is coming from more than one brand,” says Specialized’s Jones.

While the group is focused on a range of issues, including the environment, most of the changes so far have been focused on labor, health, and safety. For example, the Responsible Sport Initiative required manufacturers to provide protective equipment for their workers and identified ways to minimize the carbon-fiber dust churned up in the sanding process.

Still, with a relatively small list of active members, including Accell Group, FSA, MEC, Pon, and REI, the bike-specific program has only scratched the surface. Of the roughly 260 suppliers Specialized has identified in its supply chain, only 41 overlap with other bike and component manufacturers.

Bike and component makers of any size can make a difference, says Jones, if the industry works together. “I applaud Pole for starting the conversation,” he says. “A lot of their concerns are valid…but I think they would do more if they stayed and tried to make an impact.”

When it comes to the world’s worst polluters, the bike industry is hardly the biggest offender. Bikes and other sporting goods made with carbon represent a sliver of total carbon demand. According to data from Composites Forecasts and Consulting, only 11 percent of carbon fiber goes to consumer products. The rest goes into automobiles, pressure vessels, airplanes, and other industrial applications.

“One blade of a wind turbine could produce a year’s worth of carbon frames for us,” says Eric Bjorling, a spokesperson for Trek. The company did its own environmental study, examining everything from the distance between suppliers to its packaging materials. “It’s not one big thing,” Bjorling says. “It’s the sum of all the things.”

Meanwhile, most consumers probably aren’t giving it that much thought. On the sales floor of River City Bicycles in Portland, Oregon, questions about carbon footprints and supply chains don’t come up at all, says owner Dave Guettler. “The issue has been raised on a minor level from the industry, but consumers aren’t putting pressure on the industry,” he says. “I think most people, when they get into bikes and how they’re made, would have a hard time pointing fingers, considering the environmental impact most people’s lifestyles produce.”

That may be, but it doesn’t exonerate companies, says Kokkonen. Before he pulled the plug on Pole’s carbon frames, Kokkonen looked back at his original business plan. “I wrote a line that says: ‘We don’t want to do unnecessary harm,’” he says. He tries to apply that philosophy to every part of his operations, from keeping marketing materials and swag to a minimum to shipping finished products in repurposed packaging from suppliers.

“We aren’t going to save the world by making Pole noncarbon,” Kokkonen says. “But this is an ethical choice.”

Jay Townley, a partner in the Gluskin Townley Group, a U.S. consultancy that researches the bicycle market, shares that sentiment. When he visited a factory in the late 1980s, the tour came to the area where the resin was applied. Townley said his guide took him by the elbow, turned him around, and said, “We are not going in there without a respirator.” The employees, Townley noted, were not wearing respirators.

Townley has since visited facilities in Taiwan, where he says conditions have improved and employees are now properly equipped. Still, as production has shifted to other countries, including China, Vietnam, Cambodia, and Bangladesh, he says, there is no guarantee that any such standards are in place. “I made a conscious decision I won’t buy carbon unless I’m absolutely sure [of its origins],” Townley says, acknowledging that for most consumers, this is impossible. “I don’t think there is any excuse or reason for a product to be made that not only harms the people who make it, [but also] probably can’t be recycled.”

When he visited China, Kokkonen says, it was made clear that he would have little say in how his frames would be made, let alone how employees would be treated or how the waste would be disposed. Instead, Kokkonen, who outlined Pole’s position in a September blog post, is doubling down on aluminum and has plans in place to start building high-performance aluminum bikes on demand in Pole’s own facility in Jyväskylä, Finland, by early next year. The success of his Evolink series, he argues, is proof that aluminum, if designed right, can go head-to-head with carbon on the trail.

Which is not to say that Kokkonen faults anyone who does buy carbon. “We don’t want to be part of the problem,” he says.

Sarah Max covers business and cycling from Bend, Oregon. Eric Nyquist (@ericbnyquist) is an American artist working in Los Angeles.

Monsanto, U.S. farm groups sue California over glyphosate warnings

Tom Polansek

November 15, 2017

CHICAGO (Reuters) - Monsanto Co and U.S. farm groups sued California on Wednesday to stop the state from requiring cancer warnings on products containing the widely used weed killer glyphosate, which the company sells to farmers to apply to its genetically engineered crops.

Monsanto Co's Roundup is shown for sale in Encinitas, California, U.S., June 26, 2017. REUTERS/Mike Blake

The government of the most populous U.S. state added glyphosate, the main ingredient in Monsanto’s herbicide Roundup, to its list of cancer-causing chemicals in July and will require that products containing glyphosate carry warnings by July 2018.

California acted after the World Health Organization’s International Agency for Research on Cancer (IARC) concluded in 2015 that glyphosate was “probably carcinogenic”.

For more than 40 years, farmers have applied glyphosate to crops, most recently as they have cultivated genetically modified corn and soybeans. Roundup and Monsanto’s glyphosate-resistant seeds would be less attractive to customers if California required warnings on products containing the chemical.

California’s Office of Environmental Health Hazard Assessment (OEHHA), which is named in the federal lawsuit, said it stands by the decision to include glyphosate on the state’s list of products known to cause cancer and believes it followed proper legal procedures.

Monsanto and groups representing corn, soy and wheat farmers reject the claim that glyphosate causes cancer. They say in the lawsuit that California’s requirement for warnings would force sellers of products containing the chemical to spread false information.

“Such warnings would equate to compelled false speech, directly violate the First Amendment, and generate unwarranted public concern and confusion,” Scott Partridge, Monsanto’s vice president of global strategy, said in a statement.

The controversy is an additional headache for Monsanto as it faces a crisis around another herbicide based on a chemical known as dicamba that was linked to widespread U.S. crop damage this summer. The company, which is being acquired by Bayer AG for $63.5 billion, developed the product as a replacement for glyphosate following an increase of weeds resistant to the chemical.

Monsanto has already suffered damage to its investment of hundreds of millions of dollars in glyphosate products since California added the chemical to its list of cancer-causing products, according to the lawsuit.

U.S. farmers apply glyphosate to fields to kill weeds before planting corn fed to livestock, spray it on genetically engineered soybeans while they are growing and sometimes on wheat before it is harvested. The crops are then shipped across the country in food products.

“Everything that we grow is probably going to have to be labeled,” said Blake Hurst, president of the Missouri Farm Bureau, a plaintiff in the lawsuit.

Certain goods that meet a standard for containing low amounts of glyphosate, known as a No Significant Risk Level (NSRL), could be sold without warnings under a proposal California is considering, said Sam Delson, a state spokesman.

“We do not anticipate that food products would cause exposures that exceed the proposed NSRL,” he said. “However, we cannot say that with certainty at this point and businesses make the determination.”

A large, long-term study on glyphosate use by U.S. agricultural workers, published last week as part of a project known as the Agricultural Health Study (AHS), found no firm link between exposure to the chemical and cancer.

Reuters reported in June that an influential scientist was aware of new AHS research data while he was chairing a panel of experts reviewing evidence on glyphosate for IARC in 2015. He did not tell the panel about it because the data had not been published, and IARC’s review did not take it into account.

A 2007 study by OEHHA also concluded the chemical was unlikely to cause cancer.

Still, flour mills have started asking farmers to test wheat for glyphosate in anticipation of California’s requirement, said Gordon Stoner, president of the National Association of Wheat Growers, another plaintiff.

Such tests add costs for farmers and could push up food prices or unnecessarily scare consumers away from buying products that contain crops grown with glyphosate, he said.

The case is National Association of Wheat Growers et al v. Lauren Zeise, director of OEHHA, et al, U.S. District Court, Eastern District of California, No. 17-at-01224.

Reporting by Tom Polansek; Editing by Tom Brown

Author Describes Writing Controversial DOE Grid Reliability Report

Nov 12, 2017

Jeff McMahon, Contributor

Alison Silverstein when she was chair of the American Council for an Energy Efficient Economy.

When Energy Secretary Rick Perry set out this year to prove that Obama-era regulations were killing coal plants and undermining grid reliability, he turned to a veteran energy consultant from his home state of Texas to write the report.

Alison Silverstein was known for her expertise in energy efficiency and the effects of renewables, and she had worked in Washington as a senior advisor to a George W. Bush appointee. When Perry brought Silverstein back to Washington in May, she reviewed data for several weeks and began to outline the "Staff Report on Electricity Markets and Reliability." On Friday in Chicago, she described what happened when her supervisors reviewed her work:

I got to DOE, and I set up a storyboard, and I started writing. And they called me in one day and they said, 'There’s nothing in here about regulation, the horrible effects of regulation on all of these plants.'

I said, 'Well, it wasn’t a big deal.'

And they said, 'But you’re biased against regulation.'

And I said, 'Bring me the data. I’ve been in the building three or four weeks now, you guys are the ones who own the issue. Prove to me, bring me all of your research on how regulation has killed these.'

'Well, we don’t have any.'

'Then how am I supposed to do this?'

The Staff Report Silverstein authored finds that regulation is not among the greatest contributing factors to the closure of coal and nuclear power plants. Neither, it says, are renewables. Perry nonetheless cites the report in a proposed rule he submitted to the Federal Energy Regulatory Commission (FERC) in September, calling for subsidies to prop up coal and nuclear plants. Perry's proposed rule diverges so sharply from the Staff Report, Silverstein said, that it's "as though they had never read it."

Silverstein said she studied the effect of regulations like the Mercury and Air Toxics Standard, a rule passed in 2011 that required power plants to control emissions of certain pollutants by the end of 2015.

"So I looked a lot into regulation and here’s what I discovered: (in) the early days of all of the Obama Administration regulations, everyone said the sky is falling, we’re going to have to fix all of these plants simultaneously, the walls will fall apart as we know it. Um, not so much. It turns out that when people have to actually do a job they find cheaper ways to do it," she said Friday at a panel sponsored by the Energy Policy Institute at the University of Chicago (EPIC).

"EIA analysis has indicated that in fact, significantly fewer plants had to close (than predicted), and that a lot of the modifications for these regs were done faster and cheaper, instead of the whole disaster that people had anticipated."

The coal plants that did close were already failing economically, Silverstein said, and their operators used the regulations' 2015 compliance deadline as a logical date to close them.

"These guys made a perfectly logical economic operations decision," she said of the coal-plant operators. "They said it doesn’t make sense to spend more money to retrofit a plant that’s already out of money. So what they decided to do—they decided this in 2013—(they decided) to do the Thelma-and-Louise strategy. That regulation date, the compliance deadline, is the edge of the cliff, and I’m just going to gun it and run this flat-out as fast as I can and get every possible kilowatt hour out of it before I have to shut it down—Boom!—and go out in a blaze of glory, literally. So that’s why you saw all those retirements at that deadline. It’s just common sense operational strategy not to piss away money."

That explains why the closures clustered around the compliance date of the new regulations, but it doesn't explain why plants were closing. Coal and nuclear plants were losing money not because of regulations or renewables, Silverstein said, but because of competition from cheap natural gas and the flattening demand for electricity.

"Wholesale competition works. It did exactly what it was supposed to do. It shut down inefficient plants. It shut down unneeded stuff. It made it much clearer that you don’t piss away good money after bad. If prices are going to keep falling and you’re not in the game, get out early," she said.

"The most important thing: the coal and nuclear plants that retired were old, really old, and really inefficient, and they had earned their AARP cards, and it was time for them to get off the playing field and go enjoy a nice retirement somewhere. A whole lot of these plants retired well before renewables got big."

Renewables have had some economic impact, too, but their greatest impact—combined with advances in information and communication technology—has been to speed up the grid. Since distributed solar, smart meters and energy apps have blossomed across America, the grid moves faster, Silverstein said—too fast for coal and nuclear.

Coal plants become more expensive to operate when they cycle on and off, "and a nuclear plant, what you want to do is turn it on and walk away and come back 18 months later and refuel it. You don’t want to cycle that puppy at all.

"They cannot provide the essential resiliency and reliability services that we need, like voltage support, reg-up, reg-down, primary frequency response, secondary frequency response. Coal and nuclear plants are just not good at anything but spinning reserve. They can’t do anything except generate electricity that was once cheap and now ain’t so cheap relative to the other stuff."

Silverstein finished writing the Staff Report on July 7 and returned to Texas on July 8. DOE held onto the report for another seven weeks before releasing it with a list of recommendations and a cover letter. Silverstein sarcastically said those documents were "kind of related" to the report in that they used some of the same words, like "baseload" and "reliability." Silverstein released her own set of recommendations in October in an opinion piece for UtilityDive. Despite the divergent recommendations, she said most of the technical report remains as she wrote it.

"What DOE mostly did was pretty up the graphics and mush up some of the language in the technical study," she said.

DOE Administrators may have been hamstrung in any effort to revise the report in July and August because Silverstein's draft had been leaked to Bloomberg in late June.

Even though the report does not seem to support the case for subsidizing coal and nuclear, in September Perry wrote a letter to FERC proposing subsidies and citing the Staff Report as support.

"I was nowhere near this," she said. "It got cooked up long after I was back in Texas cooling my heels."

Silverstein has a record with FERC, having served as a senior advisor to FERC chairman Pat Wood III, a Bush appointee, from 2001-04. She followed Wood from the Texas Public Utility Commission.

The Department of Energy did not immediately respond to a request for comment.

In proposing the rule, Perry invoked an obscure power the Department of Energy had never used before: the ability to set an agenda item before FERC. EPIC Director Sam Ori asked Silverstein why DOE would take that unusual step instead of following FERC's usual procedure for regulating electricity markets.

"If you work for an administration that is making a big deal about helping coal, and you have a lot of senior staff people who don’t have a significant amount of expertise, either about the industry or about the administrative and policy issues that you are dealing with, I think what you do is say, 'Heck, we’ll send over something that’s a Hail Mary, does all the right political stuff about coal and nuclear. We’ll be the heroes, and it’s FERC’s job to do the dirty work,'" she said.

"Perry and his team are doing all the right things for the cause, and if it works, Great! If it doesn’t work, it’s someone else’s fault. I think that’s the raw political answer."

"African-Americans Taking Brunt Of Oil Industry Pollution: Report"

"WASHINGTON - African-Americans face a disproportionate risk of health problems from pollution caused by the oil and gas industry, and the situation could worsen as President Donald Trump dismantles environmental regulations, according to a report issued on Tuesday by a pair of advocacy groups.

The report, issued by the National Association for the Advancement of Colored People civil rights group and the Clean Air Task Force, said more than a million African-Americans live within half a mile (0.8 km) of an oil and gas operation, and more than 6.7 million live in a county that is home to a refinery.

“African-Americans are exposed to 38 percent more polluted air than Caucasian Americans, and they are 75 percent more likely to live in fence-line communities than the average American,” the report said, referring to neighborhoods adjacent to industrial facilities."

No one satisfied with new Vermont wind power sound rules

By Wilson Ring, Associated Press

MONTPELIER, Vt. — Nov 12, 2017, 12:17 PM ET

An effort by Vermont utility regulators to settle the long-standing, contentious issue of how much noise neighbors of industrial wind projects should have to endure has ended up upsetting both proponents of wind power and those who say the noise poses a health risk to people who live near turbines.

Proponents of using industrial wind projects as part of Vermont's long-term goal of getting 90 percent of its power from renewable sources by 2050 say the new wind rules will make achieving that goal more difficult, if not impossible.

"These rules will certainly have a chilling effect on wind energy in Vermont," said Austin Davis, a spokesman for the renewable energy trade group Renewable Energy Vermont. "However, that doesn't do away with the fact that wind energy currently is the cheapest renewable energy available to New England."

Opponents counter that the noise limits are still too high and that even a level among the lowest in the country would create an unreasonable burden for people who live near the turbines.

"The wind noise rule as... approved is not going to protect Vermonters from the harm that we have already experienced from industrial wind turbines," said Annette Smith, the head of the group Vermonters for a Clean Environment and a long-time critic of industrial wind projects. "It is a step in the right direction."

On Tuesday, Vermont's Public Utilities Commission gave final approval to the rules that set a daytime limit of 42 decibels of sound from turbines near a home and 39 decibels at night. The rules grew out of a 2016 law that directed the commission to set sound standards. The new rules only apply to new projects.

The decibel level measures sound intensity. Experts say 40 decibels is the rough equivalent of a library while a rural area is about 30 decibels.
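
For context, decibels are logarithmic, so small numerical changes correspond to sizable changes in sound intensity. As a rough back-of-the-envelope illustration using the standard decibel relation (not anything specified in the Vermont rule), the intensity ratio between two sound levels L_1 and L_2 is

\[ I_2 / I_1 = 10^{(L_2 - L_1)/10} \]

so the 3-decibel gap between the 42-decibel daytime limit and the 39-decibel nighttime limit amounts to roughly a doubling of sound intensity (10^{0.3} ≈ 2), and 42 decibels carries roughly 16 times the intensity of a 30-decibel rural background, even though the numbers look close.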

The noise debate is something that has followed industrial wind power as it has spread throughout Vermont, New Hampshire and Maine. Although scientific studies have shown no link between wind turbine noise and human health, the noise can be annoying, especially to people who were accustomed to living in quiet areas.

Lisa Linowes, of Lyman, New Hampshire, executive director of the Windaction Group, says people in urban areas might not even notice wind turbine noise.

"If you take that same project and put it in a rural area, the area has been permanently altered, for the wildlife, for the birds, for the people," she said.

Nationally, developers work hard to ensure projects are sited so the sound doesn’t bother neighbors, and thousands of people across the country live near wind farms without any issues, said Mike Speerschneider, the senior director of permitting for the American Wind Energy Association.

"Individuals have a wide range of reactions to sound of all kinds, including wind turbine sound," he said.

Chris Recchia, the top utility regulator in the administration of former Democratic Gov. Peter Shumlin and now a private consultant, said he felt Vermont's new standards were reasonable, but new technology would have to be developed so turbines could operate at full capacity and meet the new 39 decibel standard before there are any new applications for wind projects in Vermont.

"I don't think this is a veto of wind in Vermont," Recchia said. "I think it's a challenge."

Maine, which has the greatest number of turbines in New England, has also been the subject of complaints over noise from industrial wind power projects. Criticism led the state in 2012 to lower the nighttime sound limit for wind power projects from 45 decibels to 42 decibels.

On Maine's Vinalhaven Island, David and Sally Wylie said they were shocked when the three offshore turbines were turned on in 2009 before the new rules were enacted.

"We'd been told the wind and the trees would cancel the noise out. We stood outside as the wind turbines came on, it was just whack, whack, whack! It was so loud," David Wylie said. They built a soundproof bedroom with 12-inch thick walls just so they could sleep.

Associated Press reporter David Sharp in Portland, Maine, contributed to this report.

One in ten historic coastal landfill sites in England is at risk of erosion

Coastal erosion may release waste from ten per cent of England's historic coastal landfills in the next forty years, according to research from Queen Mary University of London and the Environment Agency.

There are at least 1,215 historic coastal landfill sites in England, mostly clustered around estuaries with major cities, including Liverpool, London, and Newcastle upon Tyne. An investigation by researchers, published today (Thursday 16 November) in WIREs Water, finds that 122 sites are at risk of starting to erode into coastal waters by 2055 if not adequately protected.

Historically it was common practice to dispose of landfill waste in low-lying estuarine and coastal areas where land had limited value due to the risk of flooding. Historic landfills are frequently unlined, with no leachate management and inadequate records of the waste they contain, which means there is a very limited understanding of the environmental risk posed if the waste erodes into estuarine or coastal waters.

The researchers show that more than one-third of England's historic coastal landfills are in close proximity to designated environmental sites, and half of them are in or near to areas influencing bathing water quality. Some historic coastal landfills, such as the East Tilbury landfill in the Thames Estuary, have already started to erode.

Dr James Brand, Research Fellow at Queen Mary University of London and lead-author of the research, said: "Although the majority of England's 'at risk' historic coastal landfills are currently being protected from erosion, some have already started to erode and release waste. Climate change effects, e.g. sea level rise and more intensive storms, are likely to increase the number of historic coastal landfills that erode."

Strategies to mitigate the risk of contaminant release from historic landfills such as excavation and relocation or incineration of waste are prohibitively expensive.

James added: "Currently obtaining funding for remediation works is challenging and very limited. We need to understand the risk of pollution from historic coastal landfills and determine appropriate management options for the future. It's essential to identify which sites pose the greatest pollution risk in order that resources can be prioritised."

Kate Spencer, Professor of Environmental Geochemistry at QMUL, added: "Unfortunately, there are a lot of unknowns here. Our research helps with the first piece of the puzzle: we now know where the high-risk sites are. What we know less about is what will happen when these landfills fail. Some of these sites include everything from coal ash, to micro plastics, to household waste. We don't necessarily know what's in there, and we can't say with confidence what will happen when the contents are exposed." Professor Spencer led the Environment Agency funded project into historic landfills at QMUL.

Shape of Lake Ontario generates white-out blizzards, study shows

So much snow that researchers needed the help of tractor-mounted snowblowers

National Science Foundation, Public Release: 15-Nov-2017, Credit: University of Utah

A 6-foot-wide snow blower mounted on a tractor makes a lot of sense when you live on the Tug Hill Plateau. Tug Hill, in upstate New York, is one of the snowiest places in the Eastern U.S. and experiences some of the most intense snowstorms in the world. This largely rural region, just east of Lake Ontario, gets an average of 20 feet of snow a year.

Hence the tractor-mounted snow blower.

The region's massive snow totals are due to lake-effect snowstorms and, it turns out, to the shape of Lake Ontario.

Lake-effect storms begin when a cold mass of air moves over relatively warm water. The heat and moisture from the water destabilize the air mass and cause intense, long-lasting storms. Lake-effect snow is common in the Great Lakes region and in areas downwind of large bodies of water, including the Great Salt Lake.

Researchers, including the University of Utah's Jim Steenburgh and University of Wyoming's Bart Geerts, now report that these intense snowstorms are fueled by air circulation driven by the heat released by the lake, and that the shoreline geography of Lake Ontario affects the formation and location of this circulation. The result? Very heavy snowfall.

The findings, published in three papers, show how the shorelines of lakes may help forecasters determine the impacts of lake-effect storms.

"Lake Ontario's east-west orientation allows intense bands of snow to form," said Ed Bensman, a program director in the National Science Foundation's (NSF) Division of Atmospheric and Geospace Sciences, which funded the research. "This study found that the shape of the lake's shoreline can have an important influence on the low-level winds that lead to bands of snow for long periods of time -- and to heavy snow totals. The research team analyzed the strength of these snow bands, and their formation and persistence. Snow bands were often active for several days."

When land breezes move offshore from places where the coastline bulges out into a lake, unstable air masses form and drive a narrow band of moist air that dumps its moisture as snow on a strip of land downwind of the lake.

Steenburgh said it's long been known that breezes coming from the shore onto a lake help initiate and direct the formation of snow bands. Steenburgh and Geerts, and colleagues from universities in Illinois, Pennsylvania and upstate New York, traveled to Lake Ontario as part of an NSF-funded project called Ontario Winter Lake-effect Systems (OWLeS). The scientists investigated several questions about lake-effect systems:

What environmental factors have the greatest influence on the amount of snowfall and location of snowbands over and near Lake Ontario?

How does the interplay between wind and clouds produce long-lived snowbands far downstream of open water?

How does the local terrain influence the strength and longevity of these systems?

To find out, Geerts' team flew a Wyoming King Air research plane through winter storms, and Steenburgh's group set up weather monitoring equipment, including profiling radars and snow-measurement stations, to monitor the arrival of lake-effect storms near Tug Hill.

The researchers witnessed the region's intense snowfall, including one storm that dropped 40 inches in 24 hours. Snowfall rates often exceeded 4 inches per hour. "That's an amazing rate," Steenburgh said. "It's just an explosion of snow."

Wyoming Cloud Radar aboard the King Air plane detected an intense secondary air circulation across the main snow band. "This circulation had a narrow updraft, creating and lifting snow like a fountain in a narrow strip that dumped heavy snow where it made landfall," Geerts said. Using a weather model, Steenburgh's team found that this circulation's origin was a land breeze generated by the lake's uneven shoreline geography.

In some cases, another land breeze generated a second snow band that merged with the first. "The intense secondary circulation, with updrafts up to 20 miles per hour, had never been observed before," Geerts said.

One particular shoreline feature played a large role: a gentle, broad bulge along Lake Ontario's southern shore that extends from about Niagara Falls in the west to Rochester, New York, in the east.

"This bulge was important in determining where the lake-effect snow bands developed," Steenburgh said. "A bulge near Oswego, New York, on the southeast shore, also contributed to an increase in the precipitation downstream of Lake Ontario over Tug Hill."

Steenburgh says the residents of the region take the heavy snowfall in stride. Roads are kept plowed, and the team found that on many days, the biggest challenge was just getting out of the driveway of the house they stayed in. Once the tractor-snow blower was fired up, however, the researchers had a clear shot.

"We're a bunch of snow geeks," Steenburgh said. "We love to see it snowing like that. It's really pretty incredible. And our friends on Tug Hill made sure we could do our research."

Incorporating considerations of shoreline geography into weather forecast models can help predict which communities might be most affected by snowstorms, Steenburgh said. Understanding the effect of breezes that arise from the shore's shape is the key.

"If we want to pinpoint where the lake-effect is going to be, we're going to have to do a very good job of simulating what's happening along these coastal areas," he said.

Climate change denial or indifference are 'perverse attitudes': pope

Philip Pullella

VATICAN CITY (Reuters) - Denying climate change or being indifferent to its effects are “perverse attitudes” that block research and dialogue aimed at protecting the future of the planet, Pope Francis said on Thursday.

Francis, a strong defender of environmental protection, made his comments in a message to ministers meeting in Bonn to work out a rule book for implementing the 2015 Paris Agreement, which aims to move the world economy off fossil fuels.

“We have to avoid falling into these four perverse attitudes, which certainly do not help honest research and sincere and productive dialogue on building the future of our planet: negation, indifference, resignation and trust in inadequate solutions,” he said.

Francis called climate change “one of the worst phenomena that our humanity is witnessing”.

He praised the Paris accord, which U.S. President Donald Trump said the United States planned to leave, for indicating what he called a “clear path of transition towards a model of economic development with little or no carbon consumption”.

The United States is the only country out of 195 signatories to have announced its intention to withdraw from the accord, which aims to cut emissions blamed for the rise in temperatures.

Trump announced the decision in June shortly after visiting the pope. At the time, a Vatican official said the move was a “slap in the face” for the pope and the Vatican.

Before his election, Trump labeled climate change a “hoax”.

The pope has sided with those who believe global warming is at least partly man-made and has praised scientists who are working to keep the earth’s temperature under control.

Last September, he said the spate of hurricanes at the time should prompt people to understand that humanity will “go down” if it does not address climate change and history will judge those who deny the science on its causes.

High cognitive ability not a safeguard from conspiracies, paranormal beliefs

by Staff Writers

Chicago IL (SPX) Nov 16, 2017

The moon landing and global warming are hoaxes. The U.S. government had advance knowledge of the 9/11 attacks. A UFO crashed in Roswell, New Mexico.

Is skepticism toward these kinds of unfounded beliefs just a matter of cognitive ability? Not according to new research by a University of Illinois at Chicago social psychologist.

In an article published online and in the February 2018 issue of the journal Personality and Individual Differences, Tomas Stahl reports on two studies that examined why some people are inclined to believe in various conspiracies and paranormal phenomena.

"We show that reasonable skepticism about various conspiracy theories and paranormal phenomena does not only require a relatively high cognitive ability, but also strong motivation to be rational," says Stahl, UIC visiting assistant professor of psychology and lead author of the study.

"When the motivation to form your beliefs based on logic and evidence is not there, people with high cognitive ability are just as likely to believe in conspiracies and paranormal phenomena as people with lower cognitive ability."

Previous work in this area has indicated that people with higher cognitive ability - or a more analytic thinking style - are less inclined to believe in conspiracies and the paranormal.

Stahl and co-author Jan-Willem van Prooijen of Vrije Universiteit Amsterdam conducted two online surveys with more than 300 respondents each to assess analytic thinking and other factors that might promote skepticism toward unfounded beliefs.

The first survey found that an analytic cognitive style was associated with weaker paranormal beliefs, conspiracy beliefs and conspiracy mentality. However, this was only the case among participants who strongly valued forming their beliefs based on logic and evidence.

Among participants who did not strongly value a reliance on logic and evidence, having an analytic cognitive style was not associated with weaker belief in the paranormal or in various conspiracy theories.

In the second survey, the researchers examined whether these effects were uniquely attributable to having an analytic cognitive style or whether they were explained by more general individual differences in cognitive ability. Results were more consistent with a general cognitive ability account.

The article notes that despite a century of better educational opportunities and increased intelligence scores in the U.S. population, unfounded beliefs remain pervasive in contemporary society.

"Our findings suggest that part of the reason may be that many people do not view it as sufficiently important to form their beliefs on rational grounds," Stahl said.

From linking vaccines with autism to climate change skepticism, these widespread conspiracy theories and other unfounded beliefs can lead to harmful behavior, according to Stahl.

"Many of these beliefs can, unfortunately, have detrimental consequences for individuals' health choices, as well as for society as a whole," he said.

Toxic algae: Once a nuisance, now a severe nationwide threat

MONROE, Mich. (AP) — Competing in a bass fishing tournament two years ago, Todd Steele cast his rod from his 21-foot motorboat — unaware that he was being poisoned.

A thick, green scum coated western Lake Erie. And Steele, a semipro angler, was sickened by it.

Driving home to Port Huron, Michigan, he felt lightheaded, nauseous. By the next morning he was too dizzy to stand, his overheated body covered with painful hives. Hospital tests blamed toxic algae, a rising threat to U.S. waters.

“It attacked my immune system and shut down my body’s ability to sweat,” Steele said. “If I wasn’t a healthy 51-year-old and had some type of medical condition, it could have killed me.”

He recovered, but Lake Erie hasn’t. Nor have other waterways choked with algae that’s sickening people, killing animals and hammering the economy. The scourge is escalating from occasional nuisance to severe, widespread hazard, overwhelming government efforts to curb a leading cause: fertilizer runoff from farms.

Pungent, sometimes toxic blobs are fouling waterways from the Great Lakes to Chesapeake Bay, from the Snake River in Idaho to New York’s Finger Lakes and reservoirs in California’s Central Valley.

Last year, Florida’s governor declared a state of emergency and beaches were closed when algae blooms spread from Lake Okeechobee to nearby estuaries. More than 100 people fell ill after swimming in Utah’s largest freshwater lake. Pets and livestock have died after drinking algae-laced water, including 32 cattle on an Oregon ranch in July. Oxygen-starved “dead zones” caused by algae decay have increased 30-fold since 1960, causing massive fish kills. This summer’s zone in the Gulf of Mexico was the biggest on record.

Tourism and recreation have suffered. An international water skiing festival in Milwaukee was canceled in August; scores of swimming areas were closed nationwide.

Algae are essential to food chains, but these tiny plants and bacteria sometimes multiply out of control. Within the past decade, outbreaks have been reported in every state, a trend likely to accelerate as climate change boosts water temperatures.

This June 29, 2016 aerial photo shows blue-green algae in an area along the St. Lucie River in Stuart, Fla.

“It’s a big, pervasive threat that we as a society are not doing nearly enough to solve,” said Don Scavia, a University of Michigan environmental scientist. “If we increase the amount of toxic algae in our drinking water supply, it’s going to put people’s health at risk. Even if it’s not toxic, people don’t want to go near it. They don’t want to fish in it or swim in it. That means loss of jobs and tax revenue.”

Many monster blooms are triggered by an overload of agricultural fertilizers in warm, calm waters, scientists say. Chemicals and manure intended to nourish crops are washing into lakes, streams and oceans, providing an endless buffet for algae.

Government agencies have spent billions of dollars and produced countless studies on the problem. But an Associated Press investigation found little to show for their efforts:

—Levels of algae-feeding nutrients such as nitrogen and phosphorus are climbing in many lakes and streams.

—A small minority of farms participate in federal programs that promote practices to reduce fertilizer runoff. When more farmers want to sign up, there often isn’t enough money.

—Despite years of research and testing, it’s debatable how well these measures work.

The AP’s findings underscore what many experts consider a fatal flaw in government policy: Instead of ordering agriculture to stem the flood of nutrients, regulators seek voluntary cooperation, an approach not afforded other big polluters.

Farmers are asked to take steps such as planting “cover crops” to reduce off-season erosion, or installing more efficient irrigation systems — often with taxpayers helping foot the bill.

The U.S. Natural Resources Conservation Service, part of the Department of Agriculture, says it has spent more than $29 billion on voluntary, incentive-based programs since 2009 to make some 500,000 operations more environmentally friendly.

Jimmy Bramblett, deputy chief for programs, told AP the efforts had produced “tremendous” results but acknowledged only about 6 percent of the nation’s roughly 2 million farms are enrolled at any time.

In response to a Freedom of Information Act request, the agency provided data about its biggest spending initiative, the Environmental Quality Incentives Program, or EQIP, which contracts with farmers to use pollution-prevention measures and pays up to 75 percent of their costs.

An AP analysis shows the agency paid out more than $1.8 billion between 2009 and 2016 to share costs for 45 practices designed to cut nutrient and sediment runoff or otherwise improve water quality.

A total of $2.5 billion was pledged during the period. Of that, $51 million was targeted for Indiana, Michigan and Ohio farmers in the watershed flowing into western Lake Erie, where fisherman Steele was sickened.

Yet some of the lake’s biggest algae blooms showed up during those seven years. The largest on record appeared in 2015, blanketing 300 square miles — the size of New York City. The previous year, an algae toxin described in military texts as being as lethal as a biological weapon forced a two-day tap water shutdown for more than 400,000 customers in Toledo. This summer, another bloom oozed across part of the lake and up a primary tributary, the Maumee River, to the city’s downtown for the first time in memory.

The type of phosphorus fueling the algae outbreak has doubled in western Lake Erie tributaries since EQIP started in the mid-1990s, according to research scientist Laura Johnson of Ohio’s Heidelberg University. Scientists estimate about 85 percent of the Maumee’s phosphorus comes from croplands and livestock operations.

NRCS reports, meanwhile, claim that conservation measures have prevented huge volumes of nutrient and sediment losses from farm fields.

Although the federal government and most states refuse to make such anti-pollution methods mandatory, many experts say limiting runoff is the only way to rein in rampaging algae. A U.S.-Canadian panel seeking a 40 percent cut in Lake Erie phosphorus runoff wants to make controlling nutrients a condition for receiving federally subsidized crop insurance.

“We’ve had decades of approaching this issue largely through a voluntary framework,” said Jon Devine, senior attorney for the Natural Resources Defense Council. “Clearly the existing system isn’t working.”

Farmers, though, say they can accomplish more by experimenting and learning from each other than following government dictates.

“There’s enough rules already,” said John Weiser, a third-generation dairyman with 5,000 cows in Brown County, Wisconsin, where nutrient overload causes algae and dead zones in Lake Michigan’s Green Bay. “Farmers are stewards of the land. We want to fix the problem as much as anybody else does.”

The Environmental Protection Agency says indirect runoff from agriculture and other sources, such as urban lawns, is now the biggest source of U.S. water pollution. But a loophole in the Clean Water Act of 1972 prevents the government from regulating runoff as it does pollution from sewage plants and factories that release waste directly into waterways. They are required to get permits requiring treatment and limiting discharges, and violators can be fined or imprisoned.

Those rules don’t apply to farm fertilizers that wash into streams and lakes when it rains. Congress has shown no inclination to change that.

Without economic consequences for allowing runoff, farmers have an incentive to use all the fertilizer needed to produce the highest yield, said Mark Clark, a University of Florida wetland ecologist. “There’s nothing that says, ‘For every excessive pound I put on, I’ll have to pay a fee.’ There’s no stick.”

Some states have rules, including fertilizer application standards intended to minimize runoff. Minnesota requires 50-foot vegetation buffers around public waterways. Farmers in Maryland must keep livestock from defecating in streams that feed the Chesapeake Bay, where agriculture causes about half the nutrient pollution of the nation’s biggest estuary.

But states mostly avoid challenging the powerful agriculture industry.

Wisconsin issues water quality permits for big livestock farms, where 2,500 cows can generate as much waste as a city of 400,000 residents. But its Department of Natural Resources was sued by a dairy group this summer after strengthening manure regulations.

The state’s former head of runoff management, Gordon Stevenson, is among those who doubt that the voluntary approach will be enough to make headway with the algae problem.

“Those best-management practices are a far cry from the treatment that a pulp and paper mill or a foundry or a cannery or a sewage plant has to do before they let the wastewater go,” he said. “It’s like the Stone Age versus the Space Age.”

Do the anti-pollution measures subsidized by the government to the tune of billions of dollars actually work?

Agriculture Department studies of selected watersheds, based largely on farmer surveys and computer models, credit them with dramatic cutbacks in runoff. One found nitrogen flows from croplands in the Mississippi River watershed to the Gulf of Mexico would be 28 percent higher without those steps being taken.

Critics contend such reports are based mostly on speculation, rather than on actually testing the water flowing off fields.

Although there is not a nationwide evaluation, Bramblett said “edge of field” monitoring the government started funding in 2013 points to the success of the incentives program in certain regions.

Federal audits and scientific reports raise other problems: Decisions about which farms get funding are based too little on what’s best for the environment; there aren’t enough inspections to ensure the measures taken are done properly; farm privacy laws make it hard for regulators to verify results.

It’s widely agreed that such pollution controls can make at least some difference. But experts say lots more participation is needed.

“The practices are completely overwhelmed,” said Stephen Carpenter, a University of Wisconsin lake ecologist. “Relying on them to solve the nation’s algae bloom problem is like using Band-Aids on hemorrhages.”

The AP found that the incentives program pledged $394 million between 2009 and 2016 for irrigation systems intended to reduce runoff — more than for any other water protection effort.

In arid western Idaho, where phosphorus runoff is linked to algae blooms and fish kills in the lower Snake River, government funding is helping farmer Mike Goodson install equipment to convert to “drip irrigation” rather than flooding all of his 550 acres with water diverted from rivers and creeks.

But only 795 water protection contracts were signed by Idaho farmers between 2014 and 2016, accounting for just over 1 percent of the roughly 11.7 million farmland acres statewide. Even if many farmers are preventing runoff without government subsidies, as Bramblett contends, the numbers suggest there’s a long way to go.

Goodson says forcing others to follow his example would backfire.

“Farmers have a bad taste for regulatory agencies,” he said, gazing across the flat, wind-swept landscape. “We pride ourselves on living off the land, and we try to preserve and conserve our resources.”

But allowing farmers to decide whether to participate can be costly to others. The city of Boise completed a $20 million project last year that will remove phosphorus flowing off irrigated farmland before it reaches the Snake River.

Brent Peterson spends long days in a mud-spattered pickup truck, promoting runoff prevention in eastern Wisconsin’s Lower Fox River watershed, where dairy cows excrete millions of gallons of manure daily — much of it sprayed onto cornfields as fertilizer.

The river empties into algae-plagued Green Bay, which contains less than 2 percent of Lake Michigan’s water but receives one-third of the entire lake’s nutrient flow. Farmers in the watershed were pledged $10 million from 2009 to 2016 to help address the problem, the AP found.

Peterson, employed by two counties with many hundreds of farms, has lined up six “demonstration farms” to use EQIP-funded runoff prevention, especially cover crops.

“This is a big step for a lot of these guys,” he said. “It’s out of their comfort zone.”

And for all the money devoted to EQIP, only 23 percent of eligible applications for grants were funded in 2015, according to the National Sustainable Agriculture Coalition.

Funding of the incentives program has risen from just over $1 billion in 2009 to $1.45 billion last year. The Trump administration’s 2018 budget proposes a slight cut.

“It sounds like a lot, but the amount of money we’re spending is woefully inadequate,” said Johnson of Heidelberg University.

While there’s no comprehensive tally of algae outbreaks, many experts agree they’re “quickly becoming a global epidemic,” said Anna Michalak, an ecologist at the Carnegie Institution for Science at Stanford University.

A rising number of water bodies across the U.S. have excessive levels of nutrients and blue-green algae, according to a 2016 report by the Environmental Protection Agency and U.S. Geological Survey. The algae-generated toxin that sickened Steele in Lake Erie was found in one-third of the 1,161 lakes and reservoirs the agencies studied.

California last year reported toxic blooms in more than 40 lakes and waterways, the most in state history. New York created a team of specialists to confront the mounting problem in the Finger Lakes, a tourist magnet cherished for sparkling waters amid lush hillsides dotted with vineyards. Two cities reported algal toxins in their drinking water in 2016, a first in New York.

More than half the lakes were smeared with garish green blooms this summer.

“The headlines were basically saying, ‘Don’t go into the water, don’t touch the water,’” said Andy Zepp, executive director of the Finger Lakes Land Trust, who lives near Cayuga Lake in Ithaca. “I have an 11-year-old daughter, and I’m wondering, do I want to take her out on the lake?”

The U.S. Centers for Disease Control and Prevention is developing a system for compiling data on algae-related illnesses. A 2009-10 study tallied at least 61 victims in three states, a total the authors acknowledged was likely understated.

Anecdotal reports abound — a young boy hospitalized after swimming in a lake near Alexandria, Minnesota; a woman sickened while jet-skiing on Grand Lake St. Marys in western Ohio.

Signs posted at boat launches in the Hells Canyon area along the Idaho-Oregon line are typical of those at many recreation areas nationwide: “DANGER: DO NOT GO IN OR NEAR WATER” if there’s algae.

In Florida, artesian springs beloved by underwater divers are tainted by algae that causes a skin rash called “swimmer’s itch.” Elsewhere, domestic and wild animals are dying after ingesting algae-tainted water.

A year ago, shortly after a frolic in Idaho’s Snake River, Briedi Gillespie’s 11-year-old Chesapeake Bay retriever stopped breathing. Her respiratory muscles were paralyzed, her gums dark blue from lack of air.

Gillespie, a professor of veterinary medicine, and her veterinarian husband performed mouth-to-nose resuscitation and chest massage while racing their beloved Rose to a clinic. They spent eight hours pumping oxygen into her lungs and steroids into her veins. She pulled through.

The next day, Gillespie spotted Rose’s paw prints in a purplish, slimy patch on the riverbank and took samples from nearby water. They were laced with algae toxins.

“It was pretty horrendous,” Gillespie said. “This is my baby girl. How thankful I am that we could recognize what was going on and had the facilities we did, or she’d be gone.”

Associated Press data journalist Angeliki Kastanis reported from Los Angeles.

Death by Killer Algae

When 343 sei whales died from a harmful algal bloom in Chilean Patagonia, they opened a window into the effect changing climate is having on marine mammals, our oceans, and us.

by Claudia Geib, Published November 21, 2017

They didn’t think much of the first dead whale. Dwarfed by the rugged cliffs of Patagonia’s high green fjords, the team of biologists had sailed into a gulf off the Pacific Ocean searching for the ocean’s smaller animals, the marine invertebrates they were there to inventory. That night, while hunting for an anchorage in a narrow bay, the team spotted a large, dead whale floating on the water’s surface. But for the biologists, death—even of such an enormous animal—didn’t seem so unusual.

Not so unusual, that is, until they found the second whale, lying on the beach. And a third. And a fourth. In all, they found seven in that bay alone. Over the next day, they counted a total of 25 dead whales in the fjord.

As the team of five researchers from Chile’s Huinay Scientific Field Station sailed south across the Golfo de Penas, the dead were there, too: 200 kilometers away, they found four more whales on the beaches of the exposed, outer coast. At one point, someone’s dog rolled in one of the corpses. The scent of dead whale hung in the boat for weeks.

“Everybody was clear about it—this is not normal,” says Vreni Häussermann, director of Huinay station and the leader of the group that made the discovery in April 2015. Häussermann and her team found themselves drawn into a whodunit worthy of a detective show: they’d become accidental witnesses to a mass killing. But what had caused it, and just how many had fallen victim?

Sure that it was too much of a coincidence to find two groups of dead whales so far apart, Häussermann and her colleague Carolina Gutstein of the University of Chile went to the National Geographic Society for the funds to run a survey flight. Winter was arriving in Patagonia, and the window for a small plane to fly was shrinking fast. Seizing on two days of good weather in June, Häussermann, Gutstein, and one of Gutstein’s students crammed in next to the pilot and started counting dead whales.

“When we flew over the first of two fjords, we saw more than 70,” Häussermann recalls. “We were just all silent. Someone said: ‘Oh, shit, this is a nightmare.’”

By the end of the second day, after bad weather moved in and forced the airsick group to the ground, they knew they had something big on their hands. Häussermann thought they had marked down about 150 whales. It wasn’t until they went through the GPS points that they realized the number was 360.

Based on size, shape, and species known to frequent the region, the team posited that at least 343 of the dead were sei whales, the third-largest baleen whale and an endangered species. It was the largest baleen whale mass mortality event ever recorded.

In a paper published in May 2017, Häussermann, Gutstein, and 10 colleagues from across several disciplines attribute the deaths of these enormous animals, which grow longer than a semi trailer, to something very small: a toxic species of marine alga.

Most species of algae are harmless; indeed, they are an essential part of the food chain. Those that do produce toxins usually do so in small amounts. However, under the right conditions—warm water and a boost of nutrients—algae can grow so explosively that those toxins become a problem, creating what’s called a harmful algal bloom, or HAB. The toxins end up in filter feeders, such as shellfish, which draw algae out of the water, and in the stomachs of zooplankton and other small animals that feed on algae. As larger animals eat these organisms, algal toxins get passed up the food chain.

Depending on the algal species, the impacts of toxins on animals will differ: from respiratory distress, to confusion and seizures, to full nervous system paralysis. The further up the food chain toxins accumulate, the more concentrated they become. The largest animals tend to receive the strongest dose, a phenomenon known as biomagnification.
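
To make the arithmetic behind biomagnification concrete, here is a minimal illustrative sketch in Python. The starting toxin level and the per-step concentration factors are invented for illustration only; they are not measurements from the Chilean bloom or from any study mentioned here.

# Illustrative only: how a toxin can become more concentrated as it moves
# up a food chain. The starting level and multipliers below are hypothetical.
def biomagnify(start_concentration, factors):
    """Return the toxin concentration at each trophic level."""
    levels = [start_concentration]
    for factor in factors:
        levels.append(levels[-1] * factor)
    return levels

# Hypothetical chain: algae -> zooplankton -> small fish -> whale
chain = ["algae", "zooplankton", "small fish", "whale"]
concentrations = biomagnify(1.0, [3.0, 4.0, 2.5])

for name, value in zip(chain, concentrations):
    print(f"{name}: {value:.1f}x the level in the algae")
# In this made-up chain, the animal at the top carries 30x the
# concentration found in the algae themselves.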

Though HABs are familiar fare for marine biologists—having been documented as far back as 1672—whale deaths by algae are not, making the Huinay team’s discovery in 2015 largely unprecedented. Hoping to delve more into the mystery, an international group of seven scientists ventured twice to the Golfo de Penas—in February and May 2016—to gather more information about the role harmful algae had played in the whales’ deaths and to gauge whether such a mass mortality might happen again.

With its wide maw open to the Pacific Ocean, the Golfo de Penas—the Gulf of Sorrows—is inaccessible by land and forbidding by sea. The weather in Chilean Patagonia is infamously unpredictable, with tides that can change daily by seven meters and a prevailing westerly wind that makes its exposed coastline one of the most wave-impacted on Earth.

On February 1, 2016, after a full day of delays, the scientific team made it at last to Puerto Edén, the gateway to the Golfo de Penas. There they boarded the Saoirse, a private vessel captained by Keri-Lee Pashuk and Greg Landreth. The Huinay field station team had been aboard Saoirse when they first discovered the whales the year before, and Pashuk and Landreth had taken a particular interest in the case of the dead sei whales; over the 10 months after the troubling discovery, they’d chased down much of the funding needed to welcome this new party of scientists.

Wending through the gulf’s maze of fjords, the crew of Saoirse found themselves in a land of the dead. Whales stranded the previous year were obvious—stripped clean to bone, or with mummified remnants clinging like shredded canvas to a fluke or a rib—but they weren’t alone: freshly stranded whales had joined them, some only a few weeks gone.

The science proceeded smoothly, and the beauty of the region and the novelty of living at sea were a welcome distraction. Yet a sense of unease slowly settled over the crew.

Researchers search for whale carcasses during the February 2016 follow-up expedition in the Golfo de Penas. Photo by Keri-Lee Pashuk

“I feel what I can only explain as a profound sadness, the disassociation of the war photographer having worn thin,” Pashuk wrote on the ship’s blog after two and a half weeks of ferrying scientists to land.

On shore, the team spent long days measuring, sampling, and photographing carcasses, working through high winds, rainstorms, and clouds of flies. From the boat, they tested the temperature, salinity, oxygen levels, and plankton content in every fjord they threaded through, seeking information that could be linked to the mortality. The normally chilly gulf was so much warmer than usual that one of the researchers lingered in the water after repairing a blocked drain to collect squat lobsters, a favorite food of sei whales, in a jar. (Normally the team netted the crustaceans from the deck.)

That unusually warm water in the Golfo de Penas was a major clue in solving the mystery of the whales’ deaths, and it reflected an event of global magnitude.

From January 2015 through June 2016, the planet experienced an El Niño: a period in which easterly winds across the Pacific Ocean weaken, allowing warm water to flow into the space between Oceania and South America.

The weather associated with El Niño and its effects are well documented: unusual rainfall in the Peruvian desert, droughts in Indonesia and Australia, and disastrous drops in South American fish populations, to name just a few. Only in the past few decades has another trend emerged: HABs, which seem to coincide with the changes in water temperature and nutrient availability brought on by El Niño.

Scientists often trace the rise of HABs back to 1997–98, previously the strongest El Niño year on record. In May of 1998, stranded California sea lions began having seizures on beaches all along central California. After an intensive investigation, researchers discovered the seizures were caused by domoic acid—a neurotoxin produced by a relatively common alga called Pseudo-nitzschia. Both humans and marine mammals exposed to domoic acid can suffer brain lesions, seizures, and memory loss.

Scientists think that following the 1998 bloom Pseudo-nitzschia established itself more permanently along the west coast, causing small blooms nearly every spring and summer—and with them, annual sea lion seizures. Widespread ocean warming plays a role in that trend, as does nutrient runoff from human activities. But during warm weather events, like El Niño, the number of sick sea lions tends to skyrocket.

When they initially discovered the dead sei whales in 2015, Häussermann and her crew from the Huinay Scientific Field Station hadn’t considered a HAB as the cause of death; they were experts in marine invertebrates, not in whales or HABs. Additionally, there was only one recent precedent for a HAB killing a group of whales: in 1987, 14 humpback whales off Cape Cod, Massachusetts, were killed by the paralyzing algal toxin saxitoxin. Before that, the closest analog seemed to be an ancient one. Gutstein had investigated a site in the Atacama Desert, some 2,500 kilometers north of the Golfo de Penas, in which dozens of baleen whales and other large mammals were believed to have been killed by a HAB between six and nine million years ago.

Toxic algae was likely responsible for four distinct mass stranding events that occurred millions of years ago at a site in the Atacama Desert of northern Chile. Photo by Adam Metallo/Smithsonian Institution

But by the time the second group of scientists boarded the Saoirse in 2016, harmful algae were their main suspects. Over the course of the previous year, as samples from the first expedition returned positive results for two different marine toxins, the researchers had watched as effects from El Niño spread across the western hemisphere.

“[2015] kind of rewrote what we understand about how these blooms work,” says Raphael Kudela, an algae researcher at the University of California, Santa Cruz.

Kudela’s lab was sampling algae in Monterey Bay, California, in April 2015—just before the Huinay team discovered the dead whales—when he began noticing rising levels of algal toxins. “We didn’t think it would be that large,” he says. “We happened to be in just the right spot and saw the toxins starting to show up. From there it just kept going.”

By the end of the summer, the bloom had spread from Santa Barbara in Southern California to Alaska’s Aleutian Islands. The bloom broke all the records: it was the largest, the longest lasting, and the most toxic researchers had ever seen.

“There was literally these layers where it looked like straw or something, thick with all these cells,” Kudela says. There was a mix of algal species in the bloom, but it was dominated by Pseudo-nitzschia. In the lab, healthy Pseudo-nitzschia can form chains of 20 to 30 of their golden-brown, needle-shaped cells; according to Kudela, wild chains in the 2015 bloom were 100 to 300 cells long.

As the toxin progressed through the ecosystem, Kudela’s lab found it everywhere they looked. The prevailing idea was that after small fish, such as anchovies, ate the algal cells, the water-soluble toxin would reside only in their stomachs until it was excreted, usually within 24 hours. But during the 2015 bloom, Kudela’s lab found domoic acid integrated into anchovy muscle, brains, and gills. At local fish markets, they found it in salmon tissue and in squid, neither of which consume phytoplankton directly.

In late May 2015, the Department of Fish and Wildlife in Washington State discovered a sea lion having a seizure on a Washington beach. It had been poisoned by domoic acid—the farthest north that algal poisoning had ever been confirmed in a marine mammal.

A sea lion suffers a seizure near Long Beach, Washington, after ingesting domoic acid produced by algae. Video by Dan Ayres, WDFW, courtesy of the National Oceanic and Atmospheric Administration

And soon after, to the far north, along the coast of Alaska, dead whales began appearing, floating offshore. They ran the gamut: humpbacks, fin whales, gray whales, animals so decomposed that they couldn’t be identified; multiple adults and at least one calf. They showed up near Anchorage’s busy port and the remote rocky shores of the Alaska Peninsula, totaling 30 between May and August. Word of six more came from British Columbia down the coast.

Unfortunately, many of the whales were too decomposed to test, or couldn’t be retrieved. The US National Oceanic and Atmospheric Administration (NOAA) said that the summer’s coast-wide HAB was likely involved in the whales’ deaths in Alaska, though the evidence to confirm it simply wasn’t there.

A year later, in May 2016, threads of evidence from throughout the Pacific would start to come together when a landmark paper was published on domoic acid in the region: NOAA researcher Kathi Lefebvre—a member of the team who first made the link between domoic acid and sea lion seizures in 1998—detected the toxin in the tissues of 13 species of marine mammal in Alaska, from sea otters to massive bowhead whales. She had known the algal cells could survive in the chilly north; yet this showed that they had established a significant foothold, one strong enough that it was detectable high up on the food chain.

In the late hours of one rainy February night, during the first 2016 expedition to the Golfo de Penas, Saoirse’s captain Keri-Lee Pashuk was woken by a thunk that echoed through the ship’s hull. She lay in the dark, listening as an eerie chorus of whistles and chirps resonated through the water around her.

Earlier that evening, Pashuk and the crew had watched a group of killer whales pursuing at least one living sei whale into the fjord where the science team was collecting samples from a young, dead whale. The killer whales were relentless, biting the sei whale and breaching to land on top of it. Peering from the beach through a gathering curtain of rain until darkness fell, the crew watched as the sei whale tried frantically to escape, even if that meant nearly beaching itself.

As she lay in the dark, Pashuk had a feeling that the ghostly sounds had something to do with the whales. She was right—when the team rose the next morning, they found a sei whale freshly dead on the beach.

As that research trip wrapped up in early March, the killer whale attack continued to disturb the crew. They couldn’t say for sure that the attack had anything to do with the stranding, or that killer whales were responsible for the other stranded whales they’d found—but then again, they couldn’t rule them out, either.

This is the problem investigators face with marine mammal strandings: so many factors are in constant interplay, and conclusions often come down to an elaborate process of elimination.

“It was a little bit like CSI, or an Agatha Christie novel,” says David Cassis, an expert in HABs and phytoplankton, who worked on the sei whale investigation, “where you have the body of the dead person in a room with a locked door, no murder weapon, and no suspects.”

Thanks to Gutstein’s expertise in taphonomy—which uses the condition of an animal’s remains to infer how it died—the scientists knew that 90 percent of the sei whales had died within a period of months, and that they had died at sea: the whales were found lying on their backs or sides, indicating they had been floating, not swimming, when deposited on land.

Additionally, ocean current models suggest the bodies came from at least five different locations around the region, meaning that the killer must have been able to affect those individuals across a distance of hundreds of kilometers.

Faced with the evidence from their observations and research, the team labored to rule out several potential suspects. Underwater explosions, which can disorient whales and cause them to beach, were one possibility; but that would have left a mark on the whales’ inner ears, which the scientists had found no evidence of. Death by disease, though difficult to disprove, likely would have depleted the whales’ fat as they stopped feeding—but these whales were fat with blubber and had full stomachs at their time of death. And as shocking as the killer whale attacks had been, the researchers learned that most killer whales attack baleen whales to eat their tongues, the easiest organ to access. All of the sei whales they saw up close had tongues intact. Plus, it was near impossible for killer whales to have killed so many whales almost simultaneously.

Scientists work on a whale necropsy in February 2016, gathering samples of tissue and bone to help to solve the mystery of the massive whale die-off in Patagonia.

That left only one plausible culprit—a killer alga. Two whales, as well as mussels sampled just after the initial 2015 Patagonia stranding, tested positive for both domoic acid and paralytic shellfish toxin (PST), which can cause muscle paralysis in mammals. Phytoplankton also tested positive for PST at the mouth of Seno Newman, a long fjord near the Golfo de Penas where 149 of the dead whales were found.

The team’s paper, published in May 2017, states it definitively: “Here, we show that the synchronous death of at least 343, primarily sei whales can be attributed to HABs during a building El Niño.”

HABs that persisted in Chile throughout the investigation carried out from the Saoirse have continued to support this theory, and buoy comparisons to the bloom seen in the northeast Pacific Ocean that was linked to sick sea lions and dead whales. In early February and March 2016, at the same time the scientific team was investigating the Golfo de Penas, a massive bloom was steamrolling northern Patagonia, killing at least 39 million salmon and shuttering shellfish harvests. Chile’s director of National Fisheries and Aquaculture attributed the blooms primarily to the warm waters of El Niño. In January 2017, yet another bloom around the Golfo de Penas killed around 170,000 salmon in farms there.

On the opposite coast, harmful algae have also been named as a potential suspect for years of high mortality for southern right whale calves in Argentina.

The El Niños most often linked with these HABs are cyclic, infrequent events. Yet years’ worth of dogged work around the world suggests that HABs aren’t just an occasional occurrence; they are increasingly becoming regular events, and are even called the “new normal” for certain areas.

On a warming planet, this killer has an accomplice.

As the effects of climate change become more apparent, scientists have broadly begun connecting HABs with rising ocean temperatures. Kudela, for instance, found that large blooms off the west coast of North America have been associated with unusually warm water. Published studies have also made this association; a recent modeling study led by scientists at Stony Brook University in New York shows that ocean warming makes expansive areas in the North Atlantic and parts of the North Pacific more conducive to HABs by two species, Alexandrium fundyense and Dinophysis acuminata.

However, ocean warming is not uniform, and local climates can greatly impact how an area responds to broader ocean changes. As such, scientists don’t expect the oceans will permanently resemble what the Pacific Ocean saw in 2015–16 any time soon; climate change is not quite so black and white. Yet some research suggests that climate change will make extreme events more extreme, making the fallout of landmark El Niño years like 2015–16 more common. In some areas, this could also lead to an increased frequency of HABs.

In the United States, a recent bill approved by the Senate Committee on Commerce, Science, and Transportation would make areas hardest hit by HABs eligible for emergency disaster relief. To advocates of this sort of legislation, HABs—given the havoc they wreak on wildlife, and on the seafoods that humans harvest—are just as much a natural disaster as a hurricane or a tsunami.

In this way, the health of marine mammals and humans is knotted together. HABs may signal broad changes happening in our ocean, changes so subtle that humans could easily overlook them. Because marine mammals are usually the first to show noticeable effects, they serve as sentinel early warning systems, indicating changes to our food supply and marine ecosystems.

“On another level, you could say these are signals, these are canaries in the coal mine of what’s in the food web,” says Kathi Lefebvre. “What’s going to impact marine mammals can certainly impact humans. It’s not necessarily a direct correlation, but the same biology occurs.”

Lefebvre was a graduate student in 1998, when Monterey Bay’s sea lions first brought the impact of algal toxins on mammals into the public and scientific eye. The course of that investigation—which, like the sei whale mortality, mystified scientists for weeks—hooked Lefebvre. She was one of the first scientists to suggest domoic acid was behind the sea lion strandings and has since dedicated her career to untangling the effects that algal toxins can have, from the ecosystem level to the individual creature.

Some of her most recent research suggests that marine mammal strandings could signal even more insidious threats to humans than we realize. Lefebvre found that chronic, low-level exposures to domoic acid can have long-term effects on the brain. After about six months, mice injected with a low dose of domoic acid once a week showed significant learning deficits and hyperactivity, even though their brains appeared normal beneath a microscope.

Though she can’t yet say how these findings translate to human brains, Lefebvre’s biggest worry for human health is algal toxins establishing an unseen presence in communities where seafood has always been a safe resource, particularly where local diets are supplemented with subsistence shellfish harvesting. These communities could be regularly consuming small amounts of these toxins. As warming waters move toward the poles, many remote northern communities are expected to encounter harmful algae—and large blooms—more often.

“You can really sort of stress any living system, but you can only stress things until they break,” she says. Lefebvre’s research found that the mice’s brains could recover from the impacts of long-term toxin exposure, but only if they had no exposure to the toxin for at least nine weeks. But larger exposures, even of only a single dose, can have permanent and even fatal effects on both marine mammals and humans. After a HAB near Prince Edward Island, Canada in 1987, 12 of 102 people who became violently ill after eating shellfish experienced short-term memory loss and amnesia for months after the incident. Three of the afflicted died.

The past few years have made scientists certain of this: as climate change continues, HABs will be bigger, more toxic, and present in areas they have never been seen before. Yet beyond that, we are sailing on foreign seas—harmful algae could cause subtle, even far-reaching, side effects we’ve not yet discovered. The long-term impact they will have on marine life, from the tiniest of plankton to the largest of mammals, continues to unfold.

Here on Earth, we’re all a bit like the science team on the Saoirse, sailing through unfamiliar fjords. We’re on the lookout for something on the surface—but we don’t know what may emerge with the next tide.

Chinese icebreaker steams for Antarctica in polar power play

By Ryan McMorrow

Beijing (AFP) Nov 8, 2017

The Chinese icebreaker Xuelong steamed south from Shanghai on Wednesday bound for Antarctica, where it will establish China's newest base as Beijing strives to become a polar power.

China is a latecomer in the race for pole position but its interest in Antarctica has grown along with its economic might. The new station will be the fifth Chinese foothold on the frozen continent, more than some nations which got there earlier.

China is ploughing money into polar exploration and research as other countries like the United States pull back under funding constraints and a glut of other global priorities.

An international treaty suspends all territorial claims to Antarctica, essentially setting it aside as a scientific preserve.

That "provides a precious opportunity to quickly develop China's polar bases", Qin Weijia, director of the China Arctic and Antarctic Administration, said at an annual meeting on the poles last month.

China has rapidly built up activities on the continent, building new bases and commissioning polar-capable ships and aircraft. Officials say it intends to become a "polar power."

"The fact that China has coined this new term and has made it an important part of their foreign policy shows the level of ambition and forward thinking that China has," said Anne-Marie Brady, a Global Fellow at the Wilson Center.

Brady's research, published in her book "China as a Polar Great Power", shows that China is already the pre-eminent spender on Antarctic programmes, when its logistics, infrastructure and research funding are added together.

The multilateral Antarctic Treaty bars mineral exploitation on the continent, but that may change in 2048, when the rules governing mining come up for review.

Some researchers worry that resource-hungry China's interest in the South Pole is a thinly veiled cover to allow mapping of the continent in preparation for a future when mining and drilling may be allowed.

Lin Shanqing, deputy director of the State Oceanic Administration which oversees China's polar programmes, said as much last week.

China must speed up development of "polar prospecting and extraction equipment", Lin said at the administration's annual meeting.

The 334-person crew of the Xuelong, which means "Snow Dragon", will establish a temporary 206-square-meter base on rocky Inexpressible Island, a leader of the expedition told the China Daily.

This will eventually be developed into China's fifth base, with work expected to be completed around 2022.

China has a growing collection of outposts, with its largest -- the Great Wall station -- able to pack in 80 researchers in the summer months. The base was not built until 1985, more than 80 years after Argentina established Antarctica's first base, on Laurie Island in 1904.

"China will be one of the few countries with a considerable number of bases spread out over the region," said Marc Lanteigne, a lecturer at Massey University Albany in New Zealand.

"It demonstrates China is a major player on the continent."

The United States, in contrast, operates three permanent bases relying in part on decades-old equipment. Argentina tops the list with six permanent bases.

Equally important are the expensive icebreakers, whose sturdy hulls are crucial for getting supplies to iced-in Antarctic outposts.

Russia has more than 40, while the US has just two, one of which is years past its prime. China has two icebreakers, including the red-hulled Xuelong, and a third under construction.

For China, it is more than a strategic priority, Brady said.

The projects in Antarctica are the latest to showcase and bolster the Communist Party's case that it is leading the nation to "rejuvenation".

"It's also about stirring up patriotism and confidence, which is very important to this government," Brady said.

Some Chinese coal ash too radioactive for reuse

Radiation levels 43 times higher than UN safety standards

Date: November 9, 2017

Source: Duke University

Summary:

Many manufacturers use coal ash from power plants as a low-cost binding agent in concrete and other building materials. But a new study finds that coal ash from high-uranium deposits in China is too radioactive for this use. Some coal ash analyzed in the study contained radiation 43 times higher than the maximum safe limit set for residential building materials by the U.N. Scientific Committee on the Effects of Atomic Radiation.

Manufacturers are increasingly using encapsulated coal ash from power plants as a low-cost binding agent in concrete, wallboard, bricks, roofing and other building materials. But a new study by U.S. and Chinese scientists cautions that coal ash from high-uranium deposits in China may be too radioactive for this use.

"While most coals in China and the U.S. have typically low uranium concentrations, in some areas in China we have identified coals with high uranium content," said Avner Vengosh, professor of geochemistry and water quality at Duke University's Nicholas School of the Environment. "Combustion of these coals leaves behind the uranium and radium and generates coal ash with very high levels of radiation."

The level of radiation in this coal ash could pose human health risks, particularly if it is recycled for use in residential building materials, he said.

Some of the coal ash samples analyzed in the new study contained radiation levels more than 43 times higher than the maximum safe limit established for residential building materials by the United Nations Scientific Committee on the Effects of Atomic Radiation.

"The magnitude of radiation we found in some of the Chinese coal ash far exceeds safe standards for radiation in building materials," said Shifeng Dai, professor of geochemistry at the state key laboratory of coal resources and safe mining at China University of Mining and Technology (CUMT) in Beijing and Xuzhou, China. "This calls into question the use of coal ash originating from uranium-rich coals for these purposes."

Vengosh, Dai and their teams published their findings Nov. 8 in the peer-reviewed journal Environmental Science and Technology.

The new paper is part of an ongoing collaboration between researchers at CUMT, Duke and Duke Kunshan University to identify the environmental impacts of coal and coal ash in China. Vengosh holds a secondary faculty appointment at Duke Kunshan, which is in China.

To conduct their study, the scientists measured naturally occurring radioactivity in high-uranium coals from 57 sites in China. They also measured radiation levels in coal ash residues produced from this coal, and in soil collected from four sites.

"By comparing the ratio of uranium in the coal to the radioactivity of the coal ash, we identified a threshold at which uranium content in coal becomes too high to allow coal ash produced from it to be used safely in residential building construction," said Nancy Lauer, a Ph.D. student at Duke's Nicholas School, who led the research.

This threshold -- roughly 10 parts per million of uranium -- is applicable to high-uranium coal deposits worldwide, not just in China, "and should be considered when deciding whether to allow coal ash to be recycled into building materials," she said.
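
As a rough illustration of how such a screening threshold might be applied, here is a minimal Python sketch. The roughly 10 parts per million cutoff is the figure reported above; the sample names and uranium values are hypothetical, and a real assessment would rest on direct radionuclide measurements of the coal and ash rather than this simple screen.

# Illustrative screen using the roughly 10 ppm uranium threshold reported
# for the coal feeding a given ash stream. Sample values are made up.
URANIUM_THRESHOLD_PPM = 10.0

def suitable_for_building_materials(uranium_ppm):
    """Return True if the coal's uranium content falls below the threshold."""
    return uranium_ppm < URANIUM_THRESHOLD_PPM

# Hypothetical coal samples: (name, uranium content in parts per million)
samples = [("sample_A", 2.3), ("sample_B", 9.8), ("sample_C", 46.0)]

for name, ppm in samples:
    if suitable_for_building_materials(ppm):
        verdict = "ash may be considered for reuse"
    else:
        verdict = "ash should not go into residential building materials"
    print(f"{name}: {ppm} ppm uranium -> {verdict}")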

All radionuclide measurements of coal and coal ash samples were conducted in Vengosh's lab at Duke.

"Since our findings demonstrate that using ash from this high-uranium coal is not suitable in building materials, the challenge becomes, how do we dispose of it in ways that limit any potential water or air contamination," Vengosh said. "This question requires careful consideration."

Story Source:

Materials provided by Duke University. Note: Content may be edited for style and length.

Journal Reference:

Nancy Lauer, Avner Vengosh, Shifeng Dai. Naturally Occurring Radioactive Materials in Uranium-Rich Coals and Associated Coal Combustion Residues from China. Environmental Science & Technology, 2017; DOI: 10.1021/acs.est.7b0347

By using surplus food, U.S. cities could tackle hunger, waste problems

Sophie Hares

TEPIC, Mexico (Thomson Reuters Foundation) - Apples, bread, pasta and coffee are high on the list of the $218 billion worth of food Americans dump in the bin or pour down the drain each year, costing them and the environment dearly, a green group said on Wednesday.

Under growing pressure to deal with the 40 percent of food that households, restaurants, grocers and others throw away, U.S. cities must find new ways to stop waste going into landfill and get edible food to those who need it, the Natural Resources Defense Council (NRDC) said in two reports.

“This is food that is surplus... It’s not food that’s coming off people’s plates,” said Darby Hoover, a senior resource specialist at the NRDC, a U.S.-based environmental non-profit. “It’s food that’s packaged and prepared, and did not end up getting served.”

More than two-thirds of food thrown away by households is potentially edible, with uneaten food costing an average American family of four at least $1,500 a year and causing major environmental damage, said the research supported by The Rockefeller Foundation.

Meanwhile food that goes unsold in grocery stores, schools, restaurants and other consumer businesses could be redirected towards the one in eight Americans who lack a steady supply, it said.

Denver, Nashville and New York, the three cities at the center of the research, could dish up as many as 68 million extra meals a year if surplus food were donated, said the NRDC.

“If we could distribute just 30 percent of the food we currently discard, it would equate to enough food to provide the total diet for 49 million Americans,” said one of the reports.

While donating more food could help vulnerable people amid rising income inequality, it would not solve the underlying causes of poverty such as unemployment or low wages, the research added.

Cities need to find better ways to match food donations, especially from businesses, to poor communities, and reduce the food waste that takes up almost a quarter of landfill space and emits nearly 3 percent of U.S. greenhouse gases, the research said.

Improved planning and more education could help cut the amount of food people throw away, often because it has gone moldy or because they do not want to eat leftovers, said Hoover.

“Making the most of our food supply has wide-reaching benefits - helping to feed people and save money, water and energy in one fell swoop,” NRDC senior scientist Dana Gunders said in a statement.

Reporting by Sophie Hares; editing by Megan Rowling.

Research shows ice sheets as large as Greenland's melted fast in a warming climate

Purdue University

New research published in Science shows that climate warming reduced the mass of the Cordilleran Ice Sheet by half in as little as 500 years, indicating the Greenland Ice Sheet could have a similar fate.

The Cordilleran Ice Sheet covered large parts of North America during the Pleistocene - or Last Ice Age - and was similar in mass to the Greenland Ice Sheet. Previous research estimated that it covered much of western Canada as late as 12,500 years ago, but new data shows that large areas in the region were ice-free as early as 1,500 years earlier. This confirms that once ice sheets start to melt, they can do so very quickly.

The melting of the Cordilleran Ice Sheet likely caused about 20 feet of sea level rise and big changes in ocean temperature and circulation. Because fresh meltwater is less dense than salty seawater, it tends to spread across the ocean surface rather than sink, disrupting the "global conveyor belt" of ocean circulation and changing climate.

Researchers used geologic evidence and ice sheet models to construct a timeline of the Cordilleran's advance and retreat. They mapped and dated moraines throughout western Canada using beryllium-10, a rare isotope that accumulates in rock surfaces exposed to cosmic rays and so records how long those surfaces have been free of ice. Measurements were made in Purdue University's PRIME Lab, a research facility dedicated to accelerator mass spectrometry.

"We have one group of beryllium-10 measurements, which is 14,000 years old, and another group, which is 11,500 years old, and the difference in these ages is statistically significant," said Marc Caffee, a professor of physics in Purdue's College of Science and director of PRIME Lab. "The only way this would happen is if the ice in that area had completely gone away and then advanced."

Around 14,000 years ago the Earth started warming, and the effects were significant - ice completely left the tops of the mountains in western Canada, and where there were ice sheets, they probably thinned a lot. About a thousand years later, the climate cooled again, and glaciers started to advance, then retreated as conditions warmed at the onset of the Holocene. If the Cordilleran Ice Sheet had still been there when the climate started cooling during a period known as the Younger Dryas, cirque and valley glaciers wouldn't have advanced during that time. This indicates a rapid disappearance rather than a gradual melting of the ice sheet.

Reconstructing precise chronologies of past climate helps researchers establish cause and effect. Some have wondered whether the melting of the Cordilleran Ice Sheet caused the Younger Dryas cooling, but it's unlikely; the cooling started too early for that to be true, according to the study. What caused the cooling is still up for debate.

Creating a timeline of glacial retreat also provides insight into how the first people got to North America. Current estimates place human migration to the south of the Cordilleran and Laurentide Ice Sheets between 14,600 and 18,000 years ago, but how they got there isn't clear. Some say humans could have crossed through an opening between the ice sheets, but these new findings show that passage was likely closed until 13,400 years ago.

This paper should serve as motivation for further studies, said Caffee. Continental ice sheets don't disappear in a simple, monolithic way - it's an extremely complicated process. The more we know about the retreat of the Cordilleran Ice Sheet, the better we'll be able to predict what's to come for the Greenland Ice Sheet.

Special Report: The decisions behind Monsanto's weed-killer crisis

Emily Flitter

(Reuters) - In early 2016, agri-business giant Monsanto faced a decision that would prove pivotal in what since has become a sprawling herbicide crisis, with millions of acres of crops damaged.

Monsanto had readied new genetically modified soybean seeds. They were engineered for use with a powerful new weed-killer containing a chemical called dicamba, formulated to control the substance’s main shortcoming: a tendency to drift into neighboring farmers’ fields and kill vegetation.

The company had to choose whether to immediately start selling the seeds or wait for the U.S. Environmental Protection Agency (EPA) to sign off on the safety of the companion herbicide.

The firm stood to lose a lot of money by waiting. Because Monsanto had bred the dicamba-resistant trait into its entire stock of soybeans, the only alternative would have been “to not sell a single soybean in the United States” that year, Monsanto Vice President of Global Strategy Scott Partridge told Reuters in an interview.

Betting on a quick approval, Monsanto sold the seeds, and farmers planted a million acres of the genetically modified soybeans in 2016. But the EPA’s deliberations on the weed-killer dragged on for another 11 months because of concerns about dicamba’s historical drift problems.

That delay left farmers who bought the seeds with no matching herbicide and three bad alternatives: Hire workers to pull weeds; use the less-effective herbicide glyphosate; or illegally spray an older version of dicamba at the risk of damage to nearby farms.

The resulting rash of illegal spraying that year damaged 42,000 acres of crops in Missouri, among the hardest hit areas, as well as swaths of crops in nine other states, according to an August 2016 advisory from the U.S. Environmental Protection Agency. The damage this year has covered 3.6 million acres in 25 states, according to Kevin Bradley, a University of Missouri weed scientist who has tracked dicamba damage reports and produced estimates cited by the EPA.

The episode highlights a hole in a U.S. regulatory system that has separate agencies approving genetically modified seeds and their matching herbicides.

Monsanto has blamed farmers for the illegal spraying and argued it could not have foreseen that the disjointed approval process would set off a crop-damage crisis.

But a Reuters review of regulatory records and interviews with crop scientists shows that Monsanto was repeatedly warned by crop scientists, starting as far back as 2011, of the dangers of releasing a dicamba-resistant seed without an accompanying herbicide designed to reduce drift to nearby farms.

In 2015, just before Monsanto released its soybeans seeds, Arkansas regulators notified the firm of damage from illegal spraying of its dicamba-resistant cotton seeds. Some cotton farmers chose to illegally spray old versions of dicamba because other herbicides approved for use on the seeds were far less effective.

The EPA did not approve the new dicamba formulation that Monsanto now sells for use with cotton and soybean seeds - XtendiMax with Vapor Grip - until after the 2016 growing season.

Monsanto’s Partridge acknowledged that the company misjudged the regulatory timeline for approval of its new herbicide.

“The EPA process was lengthier than usual,” Partridge said.

Monsanto, however, denies culpability for the crisis that followed the two-stage approval.

“The illegal misuse of old dicamba herbicides with Xtend seeds was not foreseeable,” the company’s attorneys said in a response to one class action suit filed by farmers in Missouri. “Even if it were foreseeable that farmers would illegally apply old dicamba to their Xtend crops, which it was not, Monsanto is not liable for harms caused by other manufacturers’ products.”

Monsanto’s Partridge said in a written statement that the reports of damage from illegal spraying of dicamba on its cotton seeds in 2015 were “extremely isolated.”

“Those who applied dicamba illegally should be held responsible,” Partridge said.

Monsanto’s handling of the delayed herbicide approval may cause the firm legal and public relations damage, but it has boosted the company’s business considerably. Instead of halting seed sales while waiting on herbicide approval, Monsanto captured a quarter of the nation’s massive soybean market by the start of 2017, according to the U.S. Department of Agriculture.

Even the damage from dicamba may have boosted sales. Some farmers whose crops were harmed said in interviews that they bought Monsanto’s new dicamba-resistant seeds as a defense against drift from nearby spraying.

State regulators believe the illegal spraying of dicamba-tolerant cotton and soybean crops continued in 2017 - after the EPA approved Monsanto’s new herbicide. Farmers would still benefit from using old versions of dicamba because it is cheaper than XtendiMax. Many growers also have dicamba on hand because it is legal to use for limited purposes.

Regulators have not yet determined, however, how much damage came from illegal spraying and how much came from the legal application of XtendiMax, which weed scientists say still vaporizes under certain conditions.

Monsanto concedes that XtendiMax has caused crop damage, but blames farmers who the company says did not properly follow directions for applying the herbicide.

The EPA, after delaying a decision on XtendiMax, gave the herbicide a limited two-year approval - as opposed to the standard 20 years - in case drift issues arose.

A U.S. Department of Agriculture spokesman, Rick Corker, acknowledged in a statement to Reuters that the release of an engineered seed before its companion herbicide caused problems. The department, he said, is now in talks with the EPA about whether to coordinate approvals of paired seeds and chemicals.

“USDA and EPA are in discussions regarding the timing of our deregulations,” Corker said in a statement.

The EPA did not comment on whether it planned any policy changes in response to the dicamba crisis.

EARLY WARNINGS

Dicamba is cheap, plentiful, and has been used as a weed killer for decades. But its tendency to damage nearby fields had caused U.S. regulators to limit its use to the task of clearing fields before planting or after harvest, when there are no crops to damage and cooler temperatures make it less likely the substance will migrate.

Farmers who illegally sprayed dicamba during the growing season are now facing fines of up to $1,000 for each violation of EPA rules limiting the use of dicamba, which are enforced by state regulators.

Farmers with damaged crops have filed at least seven lawsuits — five class-action suits and two by individuals — seeking compensation from Monsanto. The suits claim the company should have known that releasing the seeds without a paired herbicide would cause problems.

Monsanto officials had been repeatedly warned of the potential for damage from illegal spraying of dicamba on seeds designed to resist the chemical.

In October 2011, five scientists from Ohio State University addressed a conference in Columbus focused on the future of dicamba. In attendance were agriculture researchers from across the country as well as representatives of the companies Monsanto, Dow Chemical and BASF.

According to Douglas Doohan, one of the conference’s organizers, three Monsanto employees, including Industry Affairs Director Douglas Rushing, attended the meeting. Monsanto had a keen interest in the topic because the company was far along in developing its new line of dicamba products at the time.

In their introduction to the symposium, Doohan and his colleagues outlined what they called an increased risk of illegal dicamba spraying by farmers once dicamba-resistant seeds became available. They also argued that dicamba-resistant seeds - and the illegal spraying that might accompany them - would lead farmers whose crops were damaged to buy their own dicamba-tolerant seeds to protect themselves from further drift, according to conference records.

Monsanto’s Rushing gave his own presentation about dicamba to the symposium, according to conference records. Rushing explained the need for a new herbicide-and-seed combination to replace those that had grown less effective as weeds become more tolerant to certain chemicals, according to slides outlining Rushing’s conference presentation. He raised the issue of damage from dicamba drift, but said the risks could be reduced by using certain kinds of sprayers and taking other precautions.

Rushing could not be reached for comment. Monsanto did not directly respond to questions about the symposium.

DAMAGE REPORTS

Years later, some of what the scientists outlined in their presentations was becoming reality.

Monsanto released its dicamba-resistant cotton seed in the summer of 2015. The seed was compatible with two other legally available herbicides, giving farmers options for dealing with weeds before the EPA’s approval of XtendiMax.

But farmers started digging into their dicamba stockpiles anyway, and damage reports started to trickle in. Monsanto officials were among the first to see those reports, according to minutes of an Arkansas Plant Board committee meeting in July 2015.

Jammy Turner, a Monsanto salesman, was on the Arkansas Plant Board, the agricultural regulator that investigated the complaints. He and Duane Simpson, a Monsanto lobbyist, attended the committee meeting. There, the board’s Pesticide Division Director Susie Nichols gave a report about drift damage complaints linked to the new seed technology.

At that meeting, lobbyist Simpson was asked by the board what Monsanto was doing about drift damage complaints, according to the minutes. Simpson told the committee that the firm had been telling farmers not to spray dicamba illegally, even over crops specifically designed to withstand it. He said the company would consider pulling whatever licenses Monsanto had given to offending farmers to use its technology.

At an Aug. 8, 2016 meeting of the same committee, Simpson was asked again how Monsanto was dealing with farmers illegally spraying dicamba on Xtend crops. This time, he responded that Monsanto saw no way to pull farmers’ seed licenses over the issue.

Monsanto did not comment on Simpson’s statements in response to written questions from Reuters. The company said it would consider revoking a particular farmer’s license if asked to do so by state regulators “when they have investigated and adjudicated an egregious violation.”

Larry Steckel, a weed scientist and professor at the University of Tennessee, said those early damage reports should have been a red flag to Monsanto against releasing its soybean seeds the following year.

“It turned out to be a precursor of what was to come,” he said.

Neither Turner nor Simpson responded to calls and emails seeking comment. Monsanto did not comment on the company’s involvement in the Arkansas investigations.

By the end of 2015, the damage reports linked to the Xtend cotton seeds were making the rounds among scientists. Weed scientist Michael Owen, a professor at Iowa State University, said he warned at the ISU Annual Integrated Crop Management Conference on Dec. 2-3, 2015 that no dicamba formulations had been approved for use on Xtend crops. He told attendees it wasn’t clear when the formulas would be greenlighted, and that the situation was cause for concern.

Monsanto representatives attended that conference, according to ISU Program Services Coordinator Brent Pringnitz, who handled the registration. He would not identify them.

Owen said he also repeated his warning directly to Monsanto officials around the same time, but did not name them.

Monsanto did not comment on Owen’s assertion that he warned the company about dicamba spraying.

FLAWED ASSUMPTIONS

Farmers who spoke to Reuters and others who gave testimony recorded in state records described a variety of reasons why they purchased Xtend seeds before XtendiMax was available.

One farmer in Arkansas, Doug Masters, planted Xtend cotton in 2015 and was caught illegally spraying dicamba, according to Arkansas Plant Board records. He said the Monsanto salesman who sold him the Xtend seeds told him that, by the time the plants came up in the summer of 2015, it would be legal to spray dicamba and he should go ahead and do it, according to the board records.

Masters declined to identify the Monsanto salesman to Arkansas regulators, records show. He declined again, when reached by Reuters, to identify the representative. Masters admitted that he illegally sprayed dicamba, and the Plant Board fined him $4,000, assessing the maximum penalty allowed for four violations.

Monsanto did not comment on Masters’ testimony to the plant board.

Ritchard Zolman, another Arkansas farmer caught illegally spraying his cotton in 2015, said in a disciplinary hearing held by the Arkansas Plant Board that he’d planted Xtend seeds because he thought he could spray dicamba legally over his fields 14 days before his crops came up from the ground, according to records.

But the Plant Board ruled it was illegal to spray dicamba onto a field where planted crops had not yet sprouted, disciplinary records show. Zolman was fined $3,000 for three dicamba spraying violations, records show.

Zolman declined to comment.

In Missouri, Gary Dalton Murphy III said he and his family planted Xtend soybeans in 2016 after “hearing” that dicamba would be legal to spray by the summer. He did not say who told him dicamba would be legal that season.

When Murphy learned XtendiMax would not be available, the family got rid of their weeds by hand, hiring extra workers to help.

(Additional reporting by Steve Barnes in Little Rock, Arkansas; Editing by Rich Valdmanis and Brian Thevenot)

As Wind Power Sector Grows, Turbine Makers Feel the Squeeze

By Stanley Reed, Nov. 9, 2017

Wind power is increasing in importance and is getting cheaper, a boon for customers but a challenge for the companies that make the equipment.

Vestas Wind Systems and Siemens Gamesa are giants of the wind-power industry, building mammoth turbines that rise high into the air and power more and more homes. But disappointing earnings reports from the two companies this week indicated that even they are struggling to adapt to a fast-changing sector.

Wind power is an increasingly important source of electricity around the world, and prices for the technology are dropping fast. But belt-tightening governments across Europe and North America are phasing out subsidies and tax incentives that had helped the industry grow, squeezing companies like Vestas and Siemens Gamesa in the process.

On Thursday, Vestas, the world’s largest maker of wind turbines, said its revenue in the third quarter fell 6 percent compared with the same period a year ago, to 2.7 billion euros, or $3.1 billion. Profit dropped 18 percent, to about €250 million, Vestas said. The figures sent the Danish company’s shares plummeting by as much as 20 percent.

The Vestas results came just days after Siemens Gamesa Renewable Energy — the recently formed company combining the wind-power units of Siemens, the German conglomerate, and Gamesa of Spain — reported a €147 million loss for the third quarter. The Madrid-listed company also said it would have to shed 6,000 jobs.

Executives and analysts blamed several factors for the two companies’ poor results.

In addition to the phasing out of tax credits and guaranteed prices by some governments, prices for solar power have fallen rapidly, making it a competitor to wind in some parts of the world. Perhaps more significant, countries like Britain, Chile and Germany are using competitive auctions more often to award enormous wind and solar power projects, helping push down costs.

Such auctions have helped lower by 15 percent the costs per unit of electricity generated by onshore wind projects set to come online over the next five years, and by a third for offshore wind projects in the same period, according to the International Energy Agency, an organization based in Paris.

“The overall theme that affects all companies is that you are seeing a transition to power auctions that are quite competitive,” said Brian Gaylord, a senior analyst at MAKE, a market research firm.

As an example of the tougher conditions, Vestas said on Thursday that the price it gets for its turbines has fallen sharply. The company received around €800,000 per megawatt, a unit of power capacity, for the orders it booked in the third quarter of the year. By comparison, it received €950,000 per megawatt in late 2016.

Those cost differences are significant given the size of the wind turbines that Vestas produces. Its largest onshore turbine can pump out 4.2 megawatts of power, enough to provide electricity to roughly 5,000 homes.
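
For a sense of how a turbine's nameplate capacity translates into households served, here is a back-of-the-envelope sketch in Python. The capacity factor and the average household consumption below are assumptions chosen only for illustration, not figures from Vestas or from the article; under those assumptions the result lands in the same ballpark as the roughly 5,000 homes cited.

# Back-of-the-envelope conversion from turbine capacity to homes served.
# The capacity factor and household consumption are illustrative assumptions.
TURBINE_CAPACITY_MW = 4.2       # nameplate capacity cited in the article
CAPACITY_FACTOR = 0.35          # assumed average share of full output over a year
HOURS_PER_YEAR = 8760
HOUSEHOLD_KWH_PER_YEAR = 2900   # assumed average household electricity use

annual_output_kwh = TURBINE_CAPACITY_MW * 1000 * CAPACITY_FACTOR * HOURS_PER_YEAR
homes_served = annual_output_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual output: {annual_output_kwh / 1e6:.1f} GWh")
print(f"Homes served under these assumptions: {homes_served:,.0f}")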

In a conference call with analysts on Thursday, Anders Runevad, the company’s chief executive, described a landscape of competitive electricity auctions that started in Latin America and have spread across much of the globe. Vestas said in a statement that it was “seeing accelerated competition and decreasing profitability.”

Things could get worse. Proposals being considered by lawmakers in the United States, one of the world’s biggest markets for wind power, could sharply reduce the value of tax credits for new wind power projects. That would potentially reduce the sector’s growth, and create uncertainty for manufacturers.

“We believe challenges” for wind turbine makers “are mounting,” Sean McLoughlin, an analyst at the bank HSBC, wrote in a note to clients on Monday.

Still, there are positive signs. A close look at the data appears to show that the industry is not in a fatal swoon but needs to adapt to current trends. Vestas, for example, reported a 48 percent rise in orders for the third quarter, a key metric for the company, compared with the same period a year ago.

And while there is little evidence that a respite is coming for manufacturers, the industry trends have been positive for consumers. Wind energy accounted for 20 percent of new global power capacity in 2016, according to the International Energy Agency. And the falling costs are a boon for households.

The shift, Mr. Gaylord said, “is ultimately good for consumers, and the power industry.”

NJ Sets 1st State Limits for Perfluorinated Chemicals in Drinking Water

"Besides lead, no contaminant in drinking water has provoked as loud a public outcry in the last two years in the United States as a class of chemicals known as perfluorinated compounds.

New Jersey regulators are taking the strongest action to date on the man-made chemicals that are used in scores of household and industrial products. The state will be the first to require utilities to test for two compounds and remove them from drinking water.

The New Jersey Department of Environmental Protection announced on November 1 that it accepted a recommendation from state water quality experts to set a legal limit for PFOA of 14 parts per trillion. The announcement follows a proposal in August to allow drinking water to contain no more than 13 parts per trillion of PFNA, a related perfluorinated compound used at plastics manufacturing facilities in New Jersey and found in public water systems."

Plastic fibres found in tap water around the world, study reveals

Exclusive: Tests show billions of people globally are drinking water contaminated by plastic particles, with 83% of samples found to be polluted

Damian Carrington, Guardian Environment editor

Tuesday 5 September 2017 19.01 EDT

Microplastic contamination has been found in tap water in countries around the world, leading to calls from scientists for urgent research on the implications for health.

Scores of tap water samples from more than a dozen nations were analysed by scientists for an investigation by Orb Media, who shared the findings with the Guardian. Overall, 83% of the samples were contaminated with plastic fibres.

The US had the highest contamination rate, at 94%, with plastic fibres found in tap water sampled at sites including Congress buildings, the US Environmental Protection Agency’s headquarters, and Trump Tower in New York. Lebanon and India had the next highest rates.

European nations including the UK, Germany and France had the lowest contamination rate, but this was still 72%. The average number of fibres found in each 500ml sample ranged from 4.8 in the US to 1.9 in Europe.

The new analyses indicate the ubiquitous extent of microplastic contamination in the global environment. Previous work has been largely focused on plastic pollution in the oceans, which suggests people are eating microplastics via contaminated seafood.

“We have enough data from looking at wildlife, and the impacts that it’s having on wildlife, to be concerned,” said Dr Sherri Mason, a microplastic expert at the State University of New York in Fredonia, who supervised the analyses for Orb. “If it’s impacting [wildlife], then how do we think that it’s not going to somehow impact us?”

A separate small study in the Republic of Ireland released in June also found microplastic contamination in a handful of tap water and well samples. “We don’t know what the [health] impact is and for that reason we should follow the precautionary principle and put enough effort into it now, immediately, so we can find out what the real risks are,” said Dr Anne Marie Mahon at the Galway-Mayo Institute of Technology, who conducted the research.

Mahon said there were two principal concerns: very small plastic particles and the chemicals or pathogens that microplastics can harbour. “If the fibres are there, it is possible that the nanoparticles are there too that we can’t measure,” she said. “Once they are in the nanometre range they can really penetrate a cell and that means they can penetrate organs, and that would be worrying.” The Orb analyses caught particles of more than 2.5 microns in size, 2,500 times bigger than a nanometre.

Microplastics can attract bacteria found in sewage, Mahon said: “Some studies have shown there are more harmful pathogens on microplastics downstream of wastewater treatment plants.”

Microplastics are also known to contain and absorb toxic chemicals and research on wild animals shows they are released in the body. Prof Richard Thompson, at Plymouth University, UK, told Orb: “It became clear very early on that the plastic would release those chemicals and that actually, the conditions in the gut would facilitate really quite rapid release.” His research has shown microplastics are found in a third of fish caught in the UK.

The scale of global microplastic contamination is only starting to become clear, with studies in Germany finding fibres and fragments in all of the 24 beer brands they tested, as well as in honey and sugar. In Paris in 2015, researchers discovered microplastic falling from the air, which they estimated deposits three to 10 tonnes of fibres on the city each year, and that it was also present in the air in people’s homes.

This research led Frank Kelly, professor of environmental health at King’s College London, to tell a UK parliamentary inquiry in 2016: “If we breathe them in they could potentially deliver chemicals to the lower parts of our lungs and maybe even across into our circulation.” Having seen the Orb data, Kelly told the Guardian that research is urgently needed to determine whether ingesting plastic particles is a health risk.

The new research tested 159 samples using a standard technique to eliminate contamination from other sources and was performed at the University of Minnesota School of Public Health. The samples came from across the world, including from Uganda, Ecuador and Indonesia.

How microplastics end up in drinking water is for now a mystery, but the atmosphere is one obvious source, with fibres shed by the everyday wear and tear of clothes and carpets. Tumble dryers are another potential source, with almost 80% of US households having dryers that usually vent to the open air.

“We really think that the lakes [and other water bodies] can be contaminated by cumulative atmospheric inputs,” said Johnny Gasperi, at the University Paris-Est Créteil, who did the Paris studies. “What we observed in Paris tends to demonstrate that a huge amount of fibres are present in atmospheric fallout.”

Plastic fibres may also be flushed into water systems, with a recent study finding that each cycle of a washing machine could release 700,000 fibres into the environment. Rains could also sweep up microplastic pollution, which could explain why the household wells used in Indonesia were found to be contaminated.

In Beirut, Lebanon, the water supply comes from natural springs but 94% of the samples were contaminated. “This research only scratches the surface, but it seems to be a very itchy one,” said Hussam Hawwa, at the environmental consultancy Difaf, which collected samples for Orb.

This planktonic arrow worm, Sagitta setosa, has eaten a blue plastic fibre about 3mm long. Plankton support the entire marine food chain. Photograph: Richard Kirby/Courtesy of Orb Media

Current standard water treatment systems do not filter out all of the microplastics, Mahon said: “There is nowhere really where you can say these are being trapped 100%. In terms of fibres, the diameter is 10 microns across and it would be very unusual to find that level of filtration in our drinking water systems.”

Bottled water may not provide a microplastic-free alternative to tap water, as plastic fibres were also found in a few samples of commercial bottled water tested in the US for Orb.

Almost 300m tonnes of plastic is produced each year and, with just 20% recycled or incinerated, much of it ends up littering the air, land and sea. A report in July found 8.3bn tonnes of plastic has been produced since the 1950s, with the researchers warning that plastic waste has become ubiquitous in the environment.

“We are increasingly smothering ecosystems in plastic and I am very worried that there may be all kinds of unintended, adverse consequences that we will only find out about once it is too late,” said Prof Roland Geyer, from the University of California, Santa Barbara, who led the study.

Mahon said the new tap water analyses raise a red flag, but that more work is needed to replicate the results, find the sources of contamination and evaluate the possible health impacts.

She said plastics are very useful, but that management of the waste must be drastically improved: “We need plastics in our lives, but it is us that is doing the damage by discarding them in very careless ways.”

Report by Orb Media:

https://orbmedia.org/stories/Invisibles_plastics

Cooling with propane more energy efficient

Cooling homes and small office spaces could become less costly and more efficient with new early stage technology developed by Oak Ridge National Laboratory. Researchers designed a window air conditioning unit that uses propane as the refrigerant, cooling the air with 17 percent higher efficiency than the best ENERGY STAR® commercial units. "Propane offers superior thermodynamic properties and creates 700 percent less pollution than standard refrigerants," said ORNL's Brian Fricke. "We developed a system that takes advantage of these qualities and reduces global warming potential." The team's early-stage technology includes a novel heat exchanger, compressor and controls that require less propane than similar units used overseas. The team's laboratory evaluations demonstrate the prototype unit is the first propane window air conditioner to meet U.S. building safety standards. [Contact: Kim Askey, (865) 946-1861; askeyka@ornl.gov]

Supporting hurricane damage assessments

Geospatial scientists at Oak Ridge National Laboratory have developed a novel method to quickly gather building structure datasets that support emergency response teams assessing properties damaged by Hurricanes Harvey and Irma. By coupling deep learning with high-performance computing, ORNL collected and extracted building outlines and roadways from high-resolution satellite and aerial images. As hurricanes formed along the Gulf Coast and in the Atlantic Ocean, ORNL activated its technique. "During devastating weather events, it's difficult and time consuming to assess damage manually," said ORNL's Mark Tuttle. "Our method supports emergency response efforts by providing preliminary building structure data--which can be categorized for residential, multi-family and commercial properties--on the county level, and this has been applied for hurricane-impacted areas of Texas, Florida, Puerto Rico and other U.S. Caribbean territories." During Hurricane Harvey, ORNL analyzed nearly 2,000 images covering about 26,000 square miles of building structures in Texas' coastal counties in just 24 hours, a process that would typically take up to nine months.

Study identifies additional hurdle to widespread planting of bioenergy crops

Indiana University

INDIANAPOLIS -- A study examining how certain decisions impact what farmers plant and harvest identified one crucial factor that researchers believe needs to be added to the list of decision variables when considering bioenergy crops: the option value.

Most studies have not examined the role of the option value, which has to do with farmers waiting to see how bioenergy crop prices will change in the future, said Jerome Dumortier, an associate professor in the Indiana University School of Public and Environmental Affairs at IUPUI and a co-author of the study.

The study, "Production and spatial distribution of switchgrass and miscanthus in the United States under uncertainty and sunk cost," was published in the journal Energy Economics and is one of the first to consider adding the option value to the list of barriers to widespread planting of bioenergy crops. It also shows the spatial distribution of potential bioenergy crops in the U.S. when considering the option value.

"Farmers take into account the uncertainty associated with price fluctuations with bioenergy crops," Dumortier said. "They don't know how the price is going to evolve. Hence, they have to wait. Hence, fewer bioenergy crops are produced."

Previous studies have focused on comparing the costs of traditional crops -- corn, soybeans, wheat -- to planting bioenergy crops -- switchgrass, miscanthus -- and the cost of harvesting agricultural residue.

Dumortier said bioenergy crops have a higher energy density than other crops but are more expensive. There is also the issue of displacing acreage used to produce food, he said.

The study also found:

Bioenergy crops are less likely to be produced in the Midwest. Productivity for traditional crops is so high that the opportunity costs of converting to dedicated bioenergy crops would not be cost-effective. Agricultural residues, or the organic material left in the fields after crops are harvested, are abundant enough in the Corn Belt to supply the quantity of cellulosic ethanol mandated by the U.S. Renewable Fuel Standard.

There are two main sources for the cellulosic ethanol called for in the fuel standard. One is bioenergy crops. The second is agricultural residue.

"The fuel standard mandate is covered by agricultural residue, and planting of switchgrass or miscanthus is not necessary," Dumortier said.

Harvesting agricultural residue also has its problems: it either slows the harvest, as farmers gather the crops and the residue in a single pass through a field, or increases labor, as farmers harvest their fields twice, once for crops and once for residue.

All of these issues play a role in determining the cost of the option value.

"This study shows that the option value cost now must be added to the higher costs of bioenergy crops and the cost of harvesting agricultural residue," Dumortier said.

Co-authors of the study are Nathan Kauffman, Federal Reserve Bank of Kansas City, and Dermot Hayes, Department of Economics, Iowa State University.

Energy efficiency labeling for homes has little effect on purchase price

Labeling doesn't appear to convince home buyers to pay more for energy efficient homes.

Norwegian University of Science and Technology

Most buyers aren't thinking about energy performance certification when they're house shopping. That's the conclusion of a team of researchers from the Norwegian University of Science and Technology (NTNU) after they conducted a thorough assessment of how the labels affected home pricing.

"Energy labeling has zero effect on the price. The scheme doesn't seem to be achieving its intended purpose," says Professor Olaf Olaussen at the NTNU Business School.

The Norwegian energy labeling system for homes and dwellings was implemented in 2010. One of the arguments for the system was that a good energy performance rating would be an advantage for the seller as well.

Energy performance is rated from A to G. It's intended as a tip for buyers so that they know just how much energy a home requires. With energy-efficient housing, you can save a lot of money over the years and thus, you may be willing to pay more for the property. The EU uses the same system.

"We found that some European studies, especially a Dutch one, showed that the energy labeling made a big difference to the price, but it seemed strange," says Olaussen.

These studies had weaknesses. They only showed the impact after energy labeling was introduced, not before, and usually only relied on data from a single year.

Olaussen and his colleagues Are Oust and Jan Tore Solstad believed that a price premium might be due to something other than a high energy performance rating. They set out to test this.

In Norway, energy labeling was not implemented gradually. It was launched in full in July 2010. Norwegian data on the prices of most home sales are easy to access. This makes it relatively easy - albeit time consuming - to compare price developments both before and after energy labelling was introduced.

The Norwegian researchers did find an apparent effect of the energy labeling system when they used the same method as the Dutch study. But this effect disappeared when they used a more thorough procedure.

Instead, the researchers took the home sales figures from 2000 to July 2010 and from July 2010 to 2014, which gave a picture of how prices for houses developed over the long term. They also compared houses with similar characteristics, such as homes in the same area and of the same type.

"We found that the homes that had a price premium after energy labeling was introduced had that advantage before it, too," says Olaussen.

The advantage had to come from something else. Perhaps, homes with better energy performance were generally of higher quality. Or maybe something else was a factor.

What may be more important than whether your home has an energy rating of A or G is whether it is in a child-friendly area, or near shops, or has ocean views or other things that buyers are willing to pay a premium for.

Quick bidding rounds may play a role

"I don't think most people care about the energy rating when they buy a home. Other factors play a role, especially in a market like ours [in Norway], with fast bidding rounds. Then you're not thinking about whether 'this property has an A-rating and I can afford a little more'," says Olaussen.

Simply stated, Norway's home sales system works like this: potential buyers go to a house showing, and sign a list with their names and mobile phone numbers if they are interested in participating in the bidding for the house. Once the bidding process begins, it can be fast and furious, especially if there are many bidders eager to buy a single property.

This kind of bidding on housing isn't common in very many other countries. You find it mostly in Norway, Sweden, Australia, New Zealand and to some extent in Scotland and Ireland.

In other countries, you usually work from a fixed price, with a little bit of wiggle room. You often have much more time when buying. Maybe energy labelling would have greater significance in that scenario. But new studies from Europe aren't indicating that. They match the results from NTNU.

Researchers at the NTNU Business School are now studying whether variations in the price of electricity affect how the energy rating impacts house sales.

But if the energy label doesn't matter for sales, how are you going to get people to make their homes more energy efficient?

"You can require energy labeling for home sales. You can give other financial benefits for upgrading homes. You can actively use available support schemes and create different incentives," says Olaussen.

Norwegians can apply for financial support from a government funded entity called Enova if they want to make their home more energy efficient.

And you can still upgrade your home for other reasons, such as a desire to be more environmentally friendly or to save electricity. But an energy upgrade is a poor investment if your only goal is to sell your home at a higher price.

Reference: Olaussen, Jon Olaf; Oust, Are; Solstad, Jan Tore (2017). Energy performance certificates: Informing the informed or the indifferent? Energy Policy, Vol. 111.

As It Looks to Go Green, China Keeps a Tight Lid on Dissent

Chinese President Xi Jinping has vowed to clean up the nation’s air and water and create an “ecological civilization.” Yet even as it carries out major environmental reforms, China’s government is making sure activist green movements stay under its control.

By Michael Standaert • November 2, 2017

China’s “chairman of everything,” Xi Jinping, has just solidified as much power as any Chinese leader since Mao Zedong, with his ideas for a “new era” for his nation enshrined in the constitution at last month’s 19th Communist Party Congress.

Among his new priorities are the creation of an “ecological civilization,” with Xi pledging to clean up three decades of environmental degradation, protect the country’s ecosystems, stringently enforce environmental laws and regulations, and create a “green economy.” Xi also has pledged to forge ahead with the country’s commitments under the Paris climate agreement, as China’s once-soaring CO2 emissions have finally plateaued in the past several years. And the Xi administration is expected to launch a national carbon trading system next month, creating the largest such regime in the world.

But as Xi — responding to years of growing public outrage and activism over the country’s egregious air, water, and soil pollution — casts himself as the nation’s chief protector of the environment, he and his administration have made it clear that the nation’s environmental problems will be tackled on the government’s terms and timetable. Given the Chinese leadership’s overriding concern about societal stability, the Xi administration has not hesitated to clamp down when it perceives that an environmental campaign is gaining broad popular support and taking on a life of its own. Nor has it been shy about appropriating environmental causes championed by the country’s NGOs and green activists.

That heavy-handed response to burgeoning green movements was on display earlier this year with the release of the film, “Plastic China,” which took a hard look at the dirty business of recycling imported plastic, including an image of a baby being born on a mound of trash. The documentary, directed by filmmaker Wang Jiuliang, went viral in early 2017, but the government soon blocked it on the Internet. The film may have helped prompt the government to speed up plans to ban imports on many categories of foreign waste, although the central government has never acknowledged why it decided to enact the ban so quickly last July.

The swift reaction to “Plastic China” echoed the response to journalist Chai Jing’s “Under the Dome,” a vivid account of the toll of air pollution on China’s citizens. That video went viral and was viewed an estimated 150 million to 300 million times within the first three days of its release. Its soaring popularity — an implicit rebuke to local and central government officials who had failed to crack down on polluters — occurred just before the annual National People’s Congress meetings in Beijing in 2015. Authorities quickly blocked the film on the Chinese Internet. Nevertheless, the government is moving aggressively to reduce air pollution in Beijing and other cities.

Pollution of the soil from noxious factories has also become a significant concern over the past several years, but the government has yet to release the results of tests it has performed on the nation’s tainted soils. Last year, a scandal erupted when 500 students from an elite high school near Shanghai were afflicted with numerous ailments caused by pollution and illegal dumping from three nearby pesticide factories. But news of that situation has largely disappeared from public view.

Environmental concerns are one of the leading causes of protests in China as citizens take to the Internet or the streets to vent their anger and frustration over runaway pollution and a lack of transparency on environmental issues. Roughly five years ago, grassroots protests sprung up throughout China as residents of so-called “cancer villages” — locales near chemical plants and factories whose residents suffer from high incidences of cancer — took to the streets. But fewer environmental protests take place in China today, and those that do occur — such as a demonstration in late 2016 protesting severe air pollution in the industrial city of Chengdu — are often quickly broken up by police.

Many of the anti-pollution policies the leadership in Beijing has rolled out in recent years have been at least partly aimed at heading off social instability by giving citizens a sense that leaders are going in the right direction, thus letting the air out of potentially explosive public outbursts.

Protestors in Beijing following the release of “Under the Dome,” the viral documentary about Chinese air pollution, in 2015.

“I think it follows a familiar pattern,” said Calvin Quek, the head of Sustainable Finance for Greenpeace East Asia. “The government reads the tea leaves, monitoring social networks like Weibo and WeChat, and when something really blows up, they acknowledge it, but they never want any independent sources of momentum. They are basically saying, ‘Thank you very much, we’ll take care of it now, we’re in charge of it.’ They have become very smart about getting information and controlling the narrative. They’ve been able to say, ‘We want to be final arbitrators of the public discussion.’ And I think the public is fine with it.”

Daniel Gardner, a professor at Smith College and author of Environmental Pollution in China: What Everyone Needs to Know — soon to be published by Oxford University Press — says that vested economic and political interests have long been concerned over grassroots environmental phenomena. This was evident in the crackdown on “Plastic China” and, especially, “Under the Dome,” which pointed to official corruption and collusion with fossil fuel companies as root causes of the air pollution problem.

“While Beijing may tolerate the common street protests of 100, 2,000, even 10,000 people, largely because they are of a familiar NIMBY-style and take as their targets local industry and local government, the specter of 200 million people from all around the country tuning in to ‘Under the Dome,’ and perhaps realizing they have common cause for building a national movement, was too unsettling,” Gardner said in an interview.

Ma Jun, one of China’s best-known environmental figures and director of the Institute of Public & Environmental Affairs, which pushes for greater transparency of pollution data, said that in many ways, both documentaries “achieved their target” and “prepared the public to support some very tough policy choices” related to air pollution controls and bans on imports of trash and toxic materials for recycling.

But when these movements became phenomena in and of themselves, China’s leaders grew worried about their ability to control such movements, Ma said in an interview.

“These documentaries show the space is there for discussion, but they also show there are some lines here and there,” Ma said. “When it moves beyond the environment, and becomes an event of its own, there are lines.”

Numerous examples have arisen in recent years in which Xi Jinping’s administration has taken action on the heels of environmental campaigns by activists and NGOs.

Earlier this year, China launched a major cleanup campaign along one of the sources of the Yellow River and ousted several local officials for their complicity in allowing a large open-pit coal mine to extensively damage a nature reserve. Yet it was Greenpeace China that first broke the news of this environmental scandal.

Also this year, the government imposed new rules on the use, transport, and storage of chemicals, a move that was seen, in part, as a response to a 2015 disaster in which improperly stored chemicals set off massive explosions in the port city of Tianjin that killed an estimated 170 people and injured hundreds more. The government attempted to censor reporting on the disaster in the news media and on Web sites such as Weibo, but word of the explosions nevertheless filtered out through social media.

Government moves to end the illegal importation of ivory into China only really took off after WildAid launched a campaign in 2013 using basketball star Yao Ming and other celebrities in ads. (The government has vowed to ban the ivory trade by the end of this year.) Other instances of crackdowns on the illegal trade in wildlife, such as pangolins, have been prompted in part by campaigns launched by stars like Jackie Chan.

Similarly, the air pollution crises in Beijing and other large cities in recent years have left the government scrambling to stay out in front on environmental issues after initially bungling its response. As recently as 2012, cities across China, particularly Beijing, were releasing inadequate air quality readings — basically saying a day was either good, bad, or moderately bad. The state-run press often referred to extreme air pollution as “bad fog.”

Public outcry has spurred government action on air pollution in Beijing (above) and other Chinese cities. Lei Han / Flickr

But with the U.S. Embassy in Beijing updating fine particulate matter readings hourly on Twitter, and Beijing residents living through a worsening series of “airpocalypses,” the central government was forced to shift into high gear. It soon began releasing accurate air pollution readings for major cities and pushed through a 10-point action plan to deal with airborne emissions, largely spurred by increasingly vocal protests online — and in the streets — over choking air pollution.

Today, the government, at least at the top levels, understands the pollution challenges and is actively communicating them. And it is trying to reach not only average Chinese citizens, but government officials, as well.

“The audience for [Xi’s ecological civilization] campaign is officials, in my view — officials in the central government who continue to hold a ‘pollute first, clean up later’ economic policy,” said Gardner. “And, critically, it’s to signal to local officials at all levels that the central government wants cooperation from them.”

Messages demonstrating the central government’s concern for the environment have figured prominently in propaganda efforts over the past several months, with a prime-time broadcast devoted to “ecological civilization” reforms airing in July. Xi’s administration also has pushed through a variety of other high-profile environmental reforms in recent years: commitments announced in July to block imports of a variety of solid waste, including consumer plastics; halting plans for hydropower facilities along the country’s last free-flowing major river, the Nujiang (Salween); the creation of a national park system modeled on U.S. national parks; and a surge in new environmental laws, including an amended environmental protection law, an amended water pollution law, and the creation of the country’s first soil pollution law.

Add to all this a major push to develop renewable energy, deploy millions of electric vehicles, and reform energy markets, and Xi Jinping is increasingly looking, to many Chinese, like an environmental reformer.

Throughout, however, China’s government is carefully managing the debate. The machinery of state censorship, which actively monitors online discussion, vigilantly restricts information on certain environmental issues. The California-based Web site, China Digital Times, regularly publishes leaked directives from China’s state censors on issues such as deleting a report on air pollution deaths, quashing smog forecasts, and prohibiting reporting on lawsuits filed against provincial officials for failing to regulate pollution.

The government also is tightening restrictions on national and foreign NGOs out of fear that their activities might foment environmental activism. These restrictions include the passage last year of a Charity Law overseeing domestic NGOs, as well as adoption of a foreign NGO law. Foreign NGOs have been hampered since the start of this year by requirements that they obtain institutional sponsors and register with public security authorities.

As a result, it’s difficult these days to get staff members at foreign NGOs to go on the record or talk without having quotes vetted by communications staff. The process of going through registration for the foreign NGO law “is making us more careful,” said a source at a prominent international NGO in Beijing.

“When you have to register, you’re going to think about how what you say is perceived” by authorities, said the source, who asked not to be identified. “They want to be in control of the message.”

Michael Standaert is a freelance journalist based out of South China, primarily covering environment, energy and climate change policy and regulatory news for Bloomberg BNA, and contributing to other publications, most recently including MIT Technology Review and South China Morning Post. He has resided in China since 2007.

Down hundreds of staff, Weather Service ‘teetering on the brink of failure,’ labor union says

By Jason Samenow October 26

A National Weather Service meteorologist in Norman, Okla., tracks a super cell tornado outbreak. (National Weather Service)

After the onslaught of devastating hurricanes and wildfires, the United States is enduring one of its most costly years for extreme weather. A near-record 16 separate billion-dollar weather disasters have ravaged the nation. Meanwhile, the National Weather Service workforce is spread razor thin, with hundreds of vacant forecast positions.

The National Weather Service Employees Organization, its labor union, said the lack of staff is taking a toll on forecasting operations and that the agency is “for the first time in its history teetering on the brink of failure.” Managers are being forced to scale back certain operations, and staff are stressed and overworked.

“It’s gotten so bad that we’re not going to be able to provide service that two years ago we were able to provide to public, emergency managers and media,” said Dan Sobien, the president of the union. “We’ve never been in that position before.”

As one example of an overburdened Weather Service office, the team of 15 forecasters serving the Washington and Baltimore region will be short five full-time staff heading into the winter months, according to Ray Martin, a union representative who works there. He said the office is short a senior forecaster, a general forecaster, two junior forecasters, and the lead for its weather observation program — a position that has remained vacant for two years.

Martin said staff morale is in the tank. “Some people have been denied vacations, because there are not enough bodies to fill shifts,” he said. “I, myself, worked a 15-hour day about a week ago. You get a lot less sleep. You start to wonder if you’re safe on the road. You don’t see your loved ones, which eats into family life.”

Martin added that the office is cutting shifts and that one afternoon and evening forecasting desk, charged with analyzing weather radar, won’t be staffed all winter long “because we just don’t have the bodies.” Forecasters staffing other desks will have to monitor radar by committee, in addition to their other responsibilities.

Whether the cutbacks will affect the quality of forecasts and warnings, “I can’t say for certain,” Martin said. “You’re working people double shifts, some people aren’t getting days off, and you’re grinding people down. There is that potential [to affect the forecast quality]. The longer this goes on, the more the potential rises. It’s a long winter ahead.”

The Burlington Free Press reported similar circumstances at the forecast office serving its region. “Given our staffing, our ability to fill our mission of protecting life and property would be nearly impossible if we had a big storm,” Brooke Taber, a Weather Service forecaster and union steward, told the paper.

Susan Buchanan, a spokeswoman for the National Weather Service, pushed back on any notion the organization is neglecting obligations to constituents and the health of its staff. “Let me state emphatically that we would never take an action that would jeopardize the services we provide to emergency managers and the public,” she said. “NWS is taking definitive steps to ensure the health and well-being of our employees through guidance to local managers on scheduling and flexibility.”

For the past five years, the union has loudly voiced concerns about staff vacancies and their consequences, even filing grievances. The Weather Service has faced different obstacles in trying to fill positions, including the 2013 budget sequester, related hiring freezes, and changes in administrations. In 2012, challenged to find funding to compensate its workers, it was embroiled in a “reprogramming” scandal, in which it moved around funds to cover payroll, without congressional approval.

“The NWS leadership has been incapable of placing their budget priorities correctly,” the union said in a news release this week. “In fact, the NWS nationwide has not had full staffing levels for at least seven years.”

A union fact sheet on the vacancy issues stressed “understaffing is not due to underfunding,” stating that Congress has fully funded the Weather Service since the 2013 fiscal year and that there have been “unspent carry-over funds” in the tens of millions of dollars. Nevertheless, those funds haven’t been earmarked for staffing, and the vacancy problem has worsened.

An independent report from the Government Accountability Office showed staff vacancies increased 57 percent from 2014 to 2016. The overall vacancy rate reached 11 percent, or 455 positions, at the end of 2016, up from just 5 percent (211 positions) at the end of 2010, the report said.

The union believes the number of vacancies is even higher, closer to 700.

Buchanan asserts the union is exaggerating the number of vacancies, which she said is closer to 226, or 5 percent. “The 700 vacancy figure cited by the union was based on an old organization table that does not reflect the agency’s current staffing profile,” she said.

But Sobien said that, unless the Weather Service cut several hundred jobs in the last seven years without a mandate to do so, its numbers are misleading. “They’re just not filling positions and saying they don’t exist anymore,” he said. “They’re moving vacancies around the country and not filling them with new bodies. They’re playing a game with these numbers. I don’t even know if it’s legal.”

A senior official with the National Oceanic and Atmospheric Administration (NOAA), which oversees the National Weather Service, said the number of positions the agency can fill has necessarily decreased due to congressional appropriations. The official said the agency has not eliminated positions. While the “ceiling” for the total number of positions is 4,890 per the National Weather Service Table of Organization, the official said the number of funded positions is 4,453 under the 2017 fiscal year appropriations act. “We believe this level is adequate to achieve National Weather Service core functions,” the official said.

In sum, the severity of the vacancy situation depends on whether it is based on the total number of positions in the legacy organizational table or the total number of positions for which Congress has provided funding. The union and the Government Accountability Office base their vacancy numbers on the organizational table, which leads to a much higher total of unfilled positions than the Weather Service reports.
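A short calculation makes the denominator question concrete. It uses only the position counts quoted in this article (4,890 positions in the legacy table, 4,453 funded positions, and the agency's estimate of about 226 open funded slots); it is a sketch, not an official reconciliation.

# Vacancy rate under each denominator, using only figures quoted in this article.
table_positions = 4890                     # legacy Table of Organization "ceiling"
funded_positions = 4453                    # positions funded for fiscal year 2017
filled = funded_positions - 226            # agency estimate: about 226 funded slots open

vacancies_vs_funded = funded_positions - filled    # 226
vacancies_vs_table = table_positions - filled      # 663
print(vacancies_vs_funded, round(100 * vacancies_vs_funded / funded_positions))  # 226, ~5 percent
print(vacancies_vs_table, round(100 * vacancies_vs_table / table_positions))     # 663, ~14 percent
# Measured against the legacy table, the shortfall lands near the union's estimate of
# roughly 700 vacancies; measured against funded positions, it matches the agency's
# figure of about 5 percent.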

Buchanan stressed the Weather Service is aggressively moving to fill open slots. “We’re working with NOAA to prioritize NWS field hiring actions to address the most critical need and to streamline the security clearance process to expedite hiring,” she said. “We are also releasing a nationwide announcement for lead forecaster positions this week and are planning batch hires of meteorologist interns three times per year.”

The vacancy situation has not gone unnoticed by Congress. In its 2018 fiscal year budget markup for the Weather Service, the Senate Appropriations Committee wrote that the “extended vacancies are unacceptable — particularly when the Committee has provided more than adequate resources and direction to fill vacancies expeditiously for the past several years.”

It directed NOAA to fully account for all filled and open positions in its fiscal year 2018 spending plan. It acknowledged some Weather Service positions “may be redundant,” but instructed the Weather Service to develop justifications for eliminating any positions. “Until such time as a plan to eliminate those vacancies is approved, NWS is directed to continue to fill all vacancies as expeditiously as possible,” the committee budget markup said.

"Arsenic Reductions in Drinking Water Tied to Fewer Cancer Deaths"

"The Environmental Protection Agency’s revised rule on arsenic contamination in drinking water has resulted in fewer lung, bladder and skin cancers.

In 2006, the E.P.A. reduced the arsenic maximum in public water systems to 10 micrograms per liter, from the previous level of 50 micrograms. The rule does not apply to private wells.

Using data from a continuing nationwide health survey, researchers compared urinary arsenic levels in 2003, before the new rule went into effect, with those in 2014, after it had been fully implemented. There were 14,127 participants in the study, and the scientists adjusted for arsenic contributions from tobacco and dietary sources. The report is in Lancet Public Health."

Bat Poop: A Reliable Record of Climate Change

University of South Florida (USF Health)

People have long known that bat guano - the polite term for what the flying mammals leave on the floors of caves where they live worldwide - is a valuable source of fuel and fertilizer, but newly published research from University of South Florida geoscientists now shows that the refuse is also a reliable record of climate change.

In a new paper published this week in the research journal Scientific Reports, USF geochemistry Professor Bogdan Onac and PhD student Daniel Cleary report that isotopes found in bat guano over the last 1,200 years can provide scientists with information on how the climate was and is changing.

The scientists examined bat guano from a cave in northwestern Romania to produce new insight into how the climate in east-central Europe has changed since the Medieval Warm Period, about 850 AD.

Nitrogen cycling within temperate forests is very sensitive to changes in the amount of winter precipitation received each year. When nitrogen isotopes change in response to variation in winter precipitation over the past 2,000 years, this signature is transferred from the soil to plant leaves, to insects, to bats, and ultimately to the guano.

"Luckily for scientists, the statement 'you are what you eat' also applies to bats," Onac said.

Scientists frequently examine chemical records in natural substances to document how the climate has changed in the past, and to lend insight into how rapidly it is changing now. They drill mud cores from the sediments under the oceans and ice cores in the Arctic and Antarctica, examine tree rings, or use the chemistry found in caves (stalagmites) as climatic proxies.

Bat guano is rich with nitrogen, and scientists know that nitrogen moves through the food chain and through animals, where it is returned to the environment. When bats return to the same location within a cave, guano piles beneath their roost can reach sizable dimensions. In Măgurici Cave in Romania, the researchers found a large, three-meter-high pile of bat guano that has been accumulating for more than a thousand years.

Isotopic analysis of the guano pile in the Măgurici Cave resulted in a near annual record of winter precipitation for the region. The location of this cave in the foreland of the East Carpathian Mountains means winter precipitation is modulated by the North Atlantic Oscillation (NAO), with wetter conditions influencing the availability of nitrogen within the surrounding forest system. Using historical records of precipitation, a relationship between winter precipitation and NAO phases was established. Through this work, past phases of the NAO could then be reconstructed back to 1600 AD, Cleary said.

The work represents the first study to provide a paleo-record of this large scale atmospheric circulation pattern for East-Central Europe using cave bat guano. The USF researchers collaborated with Babes-Bolyai University in Cluj, Romania, and the University of Bremen in Germany.

'Scars' left by icebergs record West Antarctic ice retreat

University of Cambridge

Thousands of marks on the Antarctic seafloor, caused by icebergs which broke free from glaciers more than ten thousand years ago, show how part of the Antarctic Ice Sheet retreated rapidly at the end of the last ice age as it balanced precariously on sloping ground and became unstable. Today, as the global climate continues to warm, rapid and sustained retreat may be close to happening again, and could trigger runaway ice retreat into the interior of the continent, which in turn would cause sea levels to rise even faster than currently projected.

Researchers from the University of Cambridge, the British Antarctic Survey and Stockholm University imaged the seafloor of Pine Island Bay, in West Antarctica. They found that, as seas warmed at the end of the last ice age, Pine Island Glacier retreated to a point where its grounding line - the point where it enters the ocean and starts to float - was perched precariously at the end of a slope.

Break up of a floating 'ice shelf' in front of the glacier left tall ice 'cliffs' at its edge. The height of these cliffs made them unstable, triggering the release of thousands of icebergs into Pine Island Bay, and causing the glacier to retreat rapidly until its grounding line reached a restabilising point in shallower water.

Today, as warming waters caused by climate change flow underneath the floating ice shelves in Pine Island Bay, the Antarctic Ice Sheet is once again at risk of losing mass from rapidly retreating glaciers. Significantly, if ice retreat is triggered, there are no relatively shallow points in the ice sheet bed along the course of Pine Island and Thwaites glaciers to prevent possible runaway ice retreat into the interior of West Antarctica. The results are published in the journal Nature.

"Today, the Pine Island and Thwaites glaciers are grounded in a very precarious position, and major retreat may already be happening, caused primarily by warm waters melting from below the ice shelves that jut out from each glacier into the sea," said Matthew Wise of Cambridge's Scott Polar Research Institute, and the study's first author. "If we remove these buttressing ice shelves, unstable ice thicknesses would cause the grounded West Antarctic Ice Sheet to retreat rapidly again in the future. Since there are no potential restabilising points further upstream to stop any retreat from extending deep into the West Antarctic hinterland, this could cause sea-levels to rise faster than previously projected."

Pine Island Glacier and the neighbouring Thwaites Glacier are responsible for nearly a third of total ice loss from the West Antarctic Ice Sheet, and this contribution has increased greatly over the past 25 years. In addition to basal melt, the two glaciers also lose ice by breaking off, or calving, icebergs into Pine Island Bay.

Today, the icebergs that break off from Pine Island and Thwaites glaciers are mostly large table-like blocks, which cause characteristic 'comb-like' ploughmarks as these large multi-keeled icebergs grind along the sea floor. By contrast, during the last ice age, hundreds of comparatively smaller icebergs broke free of the Antarctic Ice Sheet and drifted into Pine Island Bay. These smaller icebergs had a v-shaped structure like the keel of a ship, and left long and deep single scars in the sea floor.

High-resolution imaging techniques, used to investigate the shape and distribution of ploughmarks on the sea floor in Pine Island Bay, allowed the researchers to determine the relative size and drift direction of icebergs in the past. Their analysis showed that these smaller icebergs were released due to a process called marine ice-cliff instability (MICI). More than 12,000 years ago, Pine Island and Thwaites glaciers were grounded on top of a large wedge of sediment, and were buttressed by a floating ice shelf, making them relatively stable even though they rested below sea level.

Eventually, the floating ice shelf in front of the glaciers 'broke up', which caused them to retreat onto land sloping downward from the grounding lines to the interior of the ice sheet. This exposed tall ice 'cliffs' at their margin with an unstable height, and resulted in rapid retreat of the glaciers from marine ice cliff instability between 12,000 and 11,000 years ago. This occurred under climate conditions that were relatively similar to those of today.

"Ice-cliff collapse has been debated as a theoretical process that might cause West Antarctic Ice Sheet retreat to accelerate in the future," said co-author Dr Robert Larter, from the British Antarctic Survey. "Our observations confirm that this process is real and that it occurred about 12,000 years ago, resulting in rapid retreat of the ice sheet into Pine Island Bay."

Today, the two glaciers are getting ever closer to the point where they may become unstable, resulting once again in rapid ice retreat.

Permits invalidated for big Washington state methanol plant

Phuong Le, Associated Press

September 19, 2017 Updated: September 19, 2017 1:17pm

SEATTLE (AP) — U.S. environmental groups opposed to the Pacific Northwest becoming an international fossil fuels gateway scored a major victory when a Washington state board invalidated two permits for a $2 billion project to manufacture methanol from natural gas and export it to China.

Last week's decision by the state Shorelines Hearings Board is a setback for the project by Northwest Innovation Works on the Columbia River in the small city of Kalama.

The China-backed consortium wants to build the refinery that would produce up to 10,000 metric tons a day of methanol from natural gas piped in from North American sources. The methanol sent to China would be used to make plastics and other consumer goods.

The conservation groups Columbia Riverkeeper, Sierra Club and the Center for Biological Diversity had challenged local permits issued for the project, arguing that the environmental review wrongly concluded the project's greenhouse gas emissions would not be significant.

Washington's state Shorelines Hearings Board agreed, ruling Friday that officials from Cowlitz County and the Port of Kalama failed to fully analyze the impacts of greenhouse gas emissions from the project, including emissions from offsite sources.

The panel reversed two shoreline permits and sent the review back to the county and port for further analysis.

"We are disappointed in the Shorelines Hearings Board's order because the EIS followed both the letter and intent of (the Department of) Ecology's guidance," Vee Godley, president of Northwest Innovation Works, said in an emailed statement Monday night.

He said the company looked forward to working with others "to do our part to provide greater certainty and positive impact for our state's economic and environmental goals."

The review did not factor in impacts of greenhouse gas emission beyond the facility site, including from producing and transporting natural gas, said Brett VandenHeuvel, executive director of Columbia Riverkeeper, which has fought other Northwest fossil fuel projects.

The groups said the project would add a new, enormous source of carbon pollution at a time when Washington state is trying to reduce its carbon footprint.

Janette Brimmer, an attorney for Earthjustice representing the groups, said a more accurate analysis will give a clear picture of the project's true carbon pollution and prompt measures to address those impacts.

Methanol, a wood alcohol, is used to make olefins, a component in everyday products such as eyeglasses, insulin pumps and fleece jackets. Developers have said the project would reduce greenhouse gas emissions globally by producing methanol from natural gas rather than coal.

Environmental groups are also fighting a major crude-oil terminal proposed about 30 miles (48 kilometers) upstream from the proposed methanol plant.

The oil-by-rail terminal proposed by Vancouver Energy would handle about 360,000 barrels of crude oil a day.

Oil would arrive by train to be stored and then loaded onto tankers and ships bound for West Coast refineries. A state energy panel is reviewing the project and Washington Gov. Jay Inslee, a Democrat, has the final say over whether it will be built.

Northwest Innovation Works in April 2016 canceled plans to build a $3.4 billion methanol plant in Tacoma, Washington, following vocal opposition from people concerned about potential environmental and health impacts.

The company's methanol refinery operation in Kalama is estimated to produce more than 1 million metric tons of greenhouse gas emissions each year and increase the state's total emissions by 1 percent.

The company had argued that no state or federal agency has published rules concerning quantifying indirect greenhouse gas emissions in an environmental review and that the Department of Ecology's guidance was the only source about the issue.

The Ecology Department approved the permit for the project in June. But the agency determined the greenhouse gas emissions were significant and imposed conditions including requiring the facility to reduce its emissions by 1.7 percent each year, under the state's new rule to cap pollution.

Elaine Placido, director of building and planning for Cowlitz County, said Tuesday that the county is planning to discuss the next steps with the project developer.

Ohio bills would ease restrictive setbacks for new wind farms

Written By Kathiann M. Kowalski

Two bills in the Ohio Senate aim to ease restrictions in a 2014 law that have effectively blocked the development of new commercial wind farms in the state.

That 2014 budget bill amendment tripled the previous property line setbacks for wind turbines and has stymied all but a few projects that had already gone through the permitting process under prior law.

“We have stopped further development of wind projects in the state of Ohio while everyone else around the country seems to be progressing,” said Cliff Hite (R-Findlay), who introduced Senate Bill 188 on September 14. “Unless we do something, we’re not going to compete with these other states.”

Hite’s bill is similar to a budget amendment that the Ohio Senate passed in June, but which was rejected by the Ohio House of Representatives in conference committee negotiations. Under Hite’s bill, the property line setback would be 120 percent of the turbine’s height, which is still more than the pre-2014 law required.

The other bill, SB 184, would restore the pre-2014 property line setback of 110 percent of the turbine height. Michael Skindell (D-Lakewood) introduced that bill on August 31.

Between them, SB 184 and 188 have 20 co-sponsors, which is more than a majority of the 33 lawmakers elected to the state senate. “I think we’ll get it done in the Senate,” Hite said. “Then we’ll see what happens in the House.”

Measuring up

Before the law changed in 2014, the minimum setbacks for commercial wind farm turbines were about 1,300 feet from the nearest habitable building and 1.1 times the turbine height from the property line.

At a minimum, that meant that a turbine would be about a quarter mile from any home, explained Andrew Gohn at the American Wind Energy Association. And if a turbine fell down, it still wouldn’t land on an adjoining property.

A last-minute budget bill amendment in 2014 made the property line setback the same as the former residence setback. Rep. Bill Seitz (R-Cincinnati), then in the state senate, was the only person who spoke in favor of the change when it was briefly discussed on the senate floor before passage.
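To put the competing setback rules in concrete terms, the sketch below works through a hypothetical turbine with a 500-foot total tip height; the height is an illustrative assumption, not a figure taken from either bill.

# Property-line setbacks under the rules discussed above, for a hypothetical
# turbine with a 500-foot tip height (an assumed value used only for illustration).
tip_height_ft = 500

pre_2014_setback_ft = 1.10 * tip_height_ft   # prior law; SB 184 would restore it
sb_188_setback_ft = 1.20 * tip_height_ft     # setback proposed in Hite's SB 188
current_setback_ft = 1300                    # roughly the former residence setback,
                                             # which the 2014 amendment applied to
                                             # the property line (per this article)

print(pre_2014_setback_ft, sb_188_setback_ft, current_setback_ft)  # 550.0 600.0 1300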

Skindell was the only lawmaker who spoke out against the more restrictive setback provision at that time. Among other things, Skindell noted that there had been no hearings or testimony on the merits of the change.

Any arguments that the residence setback should apply to the property line for health and safety reasons are “specious,” said Gohn. The current requirements are “essentially a de facto zoning from the legislature, which is certainly something that’s not welcomed by a lot of communities that want to host wind projects.”

‘Wind is our shale’

Those communities have been “the real losers here,” said Peter Kelley, also at the American Wind Energy Association. Although some companies have been unable to move forward with projects in Ohio, most usually go elsewhere, he continued. “So really, who’s missing out here are the communities in Ohio that would like to have this option for their economic development.”

The benefits of that development can be huge. According to a May 2017 study by the Wind Energy Foundation, the current law acts as a market barrier and has blocked over $4.2 billion in lifetime economic benefits for Ohio.

Among other things, those benefits would include “millions for your county that will be invested into your schools and your local county government,” Hite explained.

For example, he noted, Van Wert County has already received millions of dollars for its schools and other needs. Another project in Hardin County should bring in about $17 million for the local government and schools over a 20-year period, he added.

“Wind is our shale,” Hite stressed. The reference compares wind resources for his district in the northwestern part of the state to the shale gas resources that have boosted revenues for some parts of southeastern Ohio since 2008.

Other revenues and jobs may be at stake as well.

Amazon, Facebook, General Motors and other companies have announced goals of sourcing their electricity needs from 100 percent renewable energy. For example, on September 19, Starwood Energy Group Global announced a deal in which General Motors agreed to buy all the electricity to be produced by a commercial wind farm in northwest Ohio.

That project is the last grandfathered project that was permitted under the prior law, noted Trish Demeter at the Ohio Environmental Council. If Ohio’s lawmakers don’t see the opportunities from wind energy and fix the law, “Ohio will be left in the dust in the clean energy era,” she said.

Indeed, cities and states often compete with each other to attract new facilities that will add jobs. “Those opportunities are not going to wait around forever,” Kelley said.

Although some wind energy critics argue about “trespass zoning” if setbacks are relaxed, supporters say current setbacks “really interfere with property rights of all those people who would like to host a turbine,” Kelley said.

“Farmers love this because it’s a cash crop they can rely on when commodity prices go south or when there’s a drought,” Kelley continued. “When it becomes difficult to make the family farm work, having turbines on their land can be a huge boon to a way of life that was disappearing. And yet their property rights are negated by this overzealous setback rule.”

Hite says he wants “a compromise between protecting the property rights of people who don’t want [turbines on their land], but also protecting the rights of people who do want them. And I think this bill does that.”

County commissioners could still vote not to have wind energy in their area if that’s what their people wanted. However, Hite noted, those areas would also be unable to take advantage of some incentives to attract new facilities. And they would forego the economic benefits from wind farms.

“I just think it is right for a state to use the resources that it has to try to help make the state better,” Hite continued. “Where we live, we’ve got wind and we want to harvest it and use it.”

“And,” he added, “I don’t have a problem reducing the carbon footprint in the state of Ohio for my granddaughters.”

Wind farms in Ohio pit environmentalists against some neighbors tired of noise, view

By Dan Gearino, The Columbus Dispatch

Posted Sep 24, 2017 at 5:45 AM Updated Sep 24, 2017 at 5:54 AM

PAYNE, Ohio — From the ground, the narrow aluminum ladder might as well extend to infinity. Actual height: 290 feet.

A Dispatch videographer straps on a protective harness, hard hat and safety glasses, joined by two employees of the farm’s operator, EDP Renewables. They are about to climb inside one of 55 wind turbines at Timber Road II wind farm in Paulding County.

The first steps are easy, even with 10 pounds of cameras and other gear.

Just resist the urge to look up, or down.

As the climbers ascend, their only rest is on three metal platforms, which are spaced out within the tower to break up the journey and catch any falling objects.

“The first thing that really hits you (is) the size in general, the gravity of just how much machinery goes into putting these things together,” said Jeremy Chenoweth, an EDP operations manager whose territory includes all of Ohio, and who made the climb.

Wind farms are a big, and growing, business in Ohio. They’re a part of the state’s clean-energy economy that has gone from near zero to more than $1 billion worth of spending in the past 10 years, with the potential to grow fourfold if every announced project is built.

But some neighbors view the turbines as an affront, spoiling the landscape with noise, the flicker of shadows from turbine blades and blinking red lights.

This is the gut-level underpinning of a Statehouse battle over rules on where turbines can be placed, a debate that will determine how much building will be allowed to occur.

On one side are the wind-energy industry, environmentalists and companies that want to increase the supply of clean power. On the other are some of the neighboring residents, along with a patchwork of conservative-leaning groups.

The state’s wind farms are all in northwestern Ohio, but regulators have approved others just outside of the Columbus metro area, with projects planned for Crawford, Champaign, Hardin and Logan counties, and still more in the pre-development stage.

So the debate about wind energy could be coming to your neighborhood, even if you live nowhere near northwestern Ohio.

Top of the world

Inside the wind tower, the climb takes about an hour, and the final steps are a strain. Muscles ache. Clothes are soaked with sweat.

But there is a reward. At the very top is a schoolbus-size room that holds a generator and control equipment.

On the ceiling is a clear plastic hatch that one of the EDP guys pops open.

Then, blue sky.

The three climbers step onto the roof for a gobsmacking view. The safety harness remains in place. It’s safe to stand.

Fort Wayne, Indiana, is visible, 22 miles to the west. And, if you stop to focus, you see the tiny rectangles of houses, on farms and along rural highways.

The turbine is turned off whenever somebody is working in it, so there is no electricity being generated. When active, the carbon-fiber blades slice through the air at speeds that can reach 185 mph at the tips. The top room and its contents weigh a crushing 70-plus tons, and send electricity down the tower through a series of high-voltage lines. The lines then go underground to connect with substations, and then meet up with interstate powerlines that feed into the country’s power grid.

Only from a distance, which is how most people see wind farms, does this giant machine look like a pinwheel.

Not many jobs

All 255 of the wind-farm turbines operating in Ohio have been built along a stretch of Paulding and Van Wert counties, where the land is flat and the winds are some of the most brisk. If you include turbines at homes and businesses, the statewide total is 302, according to the American Wind Energy Association, a trade group.

A typical turbine in northwest Ohio is 1.5 to 2 megawatts; the Timber Road II wind farm has a total of 99 megawatts. For some perspective, a 2 megawatt turbine in moderate wind generates enough electricity in a year to provide for the needs of more than 500 houses, based on estimates cited by EDP.
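
As a rough plausibility check of that figure, the sketch below multiplies nameplate capacity by an assumed capacity factor and divides by an assumed average household consumption. Both assumptions are illustrative, not numbers from EDP.

```python
# Back-of-envelope check of the "more than 500 houses" claim for a 2 MW turbine.
# Capacity factor and household consumption are assumptions, not figures from EDP.
CAPACITY_MW = 2.0
CAPACITY_FACTOR = 0.30           # assumed for a moderate-wind site
HOURS_PER_YEAR = 8760
HOUSEHOLD_KWH_PER_YEAR = 10_800  # approximate average U.S. household use

annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR
homes_powered = annual_mwh * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"~{annual_mwh:,.0f} MWh per year, roughly {homes_powered:,.0f} homes")
```

Under those assumptions the turbine produces about 5,300 megawatt-hours a year, or roughly 490 homes' worth of electricity, in line with the estimate cited by EDP.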

When a wind farm is developed, there is a flurry of spending for parts and construction services. After that, the costs are minimal. The fuel — wind — is free, and the developer needs only a few people to do ongoing work.

One of those jobs belongs to Benjamin Werkowski, 28, local operations manager for EDP. He’s the first person up the ladder.

“I used to drive a dump truck, right out of high school, because I didn’t know what I wanted to do,” he said. “And then I was hauling stone and dirt at the first wind farm they put in Indiana, and it just sparked my interest, and I went from there.”

He lives in Van Wert, one of 23 full-time EDP workers who live in the area.

This small employment footprint means that there are no throngs of wind-industry workers to advocate for their business the way there would be for a power plant that runs on coal or nuclear and might employ hundreds of people. And EDP’s headquarters is nowhere nearby: the company is based in Spain, with its main U.S. office in Houston.

EDP primarily interacts with its Ohio neighbors financially — lease payments to landowners, taxes to local governments, and charitable giving — and visibly, given the near-constant sight of the turbines.

This creates a dynamic that some people see as a conflict between the haves and have-nots, with some residents surrounded by turbines but receiving little or no money.

“It’s just very annoying, very unpleasant,” said Brenda DeLong, 61, interviewed on her front porch.

She lives on a lot that has been in her family for generations and was once part of her parents’ farm. Now, she has a view of Blue Creek Wind Farm, the state’s largest, along with parts of the Timber Road farms.

Near dusk on a Thursday, she begins to count the turbines. After a walk around the house, she is finished with 114, 115, 116.

“And that’s about all I see,” she said.

In other words, she can see nearly half of all the turbines on all of Ohio’s wind farms without leaving her 1-acre property.

She is a retired fourth-grade teacher and now spends most of her time volunteering for 4-H, the Red Cross and her church. She has three children and three grandchildren.

From her porch, she hears a near-constant sound, like a plane flying overhead, from the turbines. On some mornings and evenings, when the sun is behind the turbines, she sees a flicker of shadows on the walls of her house.

She feels like an essential part of her life — the outdoors around her home — has been taken away from her.

Rex McDowell, 69, a neighbor of DeLong’s, describes the area as the “Red Light District.” By way of explanation, he walks to the backyard, where each of the turbines has a red light that goes on for a moment, then off, and then on again. The lights act in unison. Together, they are bright enough to cast a red glow for miles.

The lights are there to help prevent collisions with aircraft and are required by federal aviation rules.

DeLong is part of a local group of opponents of wind energy, called Citizens for Clear Skies. One of the organizers is Jeremy Kitson, 41, who lives in a nearby township and is a high school teacher.

“I love how (wind-energy supporters) think we’re getting paid off by the Koch brothers,” Kitson said, referring to the politically active family behind the Kansas-based conglomerate Koch Industries. “Citizens for Clear Skies is just a lot of regular folks kicking in 25 bucks at a time.”

At state and local hearings dealing with wind energy, he is often in the audience and ready to speak, a familiar face for officials.

While he seems to relish the debate, DeLong is much more hesitant. But that hasn’t prevented her from making her views known. She even made a trip to Columbus in June to testify before an Ohio Senate panel about wind-energy regulations.

“Many who are pro-wind will never live near a turbine,” she told lawmakers.

A few miles north and east of DeLong’s house is Wayne Trace High School, a rural school consolidated from three communities. Here, wind energy is a godsend, providing 11 percent of local tax revenue in the K-12 budget.

On a recent afternoon, one of the middle-school football teams practices on a field just west of the complex that houses the high school, middle school and district offices. The nearest wind farm is barely visible to the south.

“We just don’t have a lot of people seeking out the district for large industry,” said Ben Winans, the district’s superintendent. “The wind industry is one thing we can have.”

His district received $706,923 in taxes from wind turbines in the 2016-17 budget year. The wind money has allowed the district to increase staffing without raising taxes. The new hiring includes several reading specialists who work with students struggling to read at their grade level.

Winans grew up in the area and is an alumnus of the district he now runs. Though the wind turbines are barely visible from the school, he knows what it is like to be right next to them. His house, where he lives with his wife and children, has several turbines close enough to cast shadows on his walls.

He doesn’t mind.

“I’ve got one (turbine) on each end of my property,” he said. “I really don’t notice it anymore.”

Most wind-energy development has been in sparsely populated areas. But as demand grows, developers are moving closer to major cities.

The 255 active turbines in Ohio could be joined by 620 turbines at wind farms that regulators have approved, some of which are already under construction. In addition, there are at least a half-dozen other projects that are awaiting state approval, or are in a pre-application stage, based on filings and interviews with industry officials. The new wind farms represent more than $4 billion in spending.

With this pace of growth, researchers expect to see an increase in people who don’t like the turbines.

“It’s important to remember that this kind of opposition to any kind of infrastructure development is normal,” said Joseph Rand, a scholar who specializes in energy policy at Lawrence Berkeley National Laboratory in California.

He has taken a close look at how communities respond to renewable energy. Surveys show that a large majority of Americans support wind energy. At the same time, “people are inherently protective of place, of their landscape,” he said.

There are stereotypes at play. Wind supporters say the critics are uninformed or even delusional. Opponents say that wind supporters are motivated by money.

“Rarely do you see a nuanced perspective that has a fair story from both sides,” Rand said.

In his work, he hopes to determine the underlying motivations and find out how developers and communities can better respond.

State and local

One way to protect residents’ interests is through state and local regulations. Opponents of wind farms give the impression that the projects essentially receive a rubber stamp, with little consideration of the effects on residents and little skepticism of developers’ assertions. Also, local governments have a limited role in the process.

And yet, public records show an exhaustive and expensive review. For example, Blue Creek Wind Farm was approved by the Ohio Power Siting Board, with a multiyear review before the project went online in 2012. The board has broad authority on any utility-scale wind farm.

The board’s docket has 4,460 pages for Blue Creek, not counting two related cases, with extensive testing to estimate levels of noise, shadow flicker and other ramifications. The developer is a company that has since changed its name to Avangrid Renewables and has U.S. headquarters in Portland, Oregon.

According to filings, the developer projected that 39 houses would have shadow flicker of at least 30 hours per year. There is no legal standard for acceptable flicker, but 30 hours is often the guideline used by regulators.

Most of the residents of those houses have signed on to the project by either leasing property for the turbines, or by signing so-called “good neighbor” agreements.

With leases, property owners are signing a long-term contract to allow their land to be used to build a turbine and access roads. The payments vary, but are often in the range of $10,000 per turbine per year. The wind trade group says that annual lease payments in Ohio add up to more than $1 million, but less than $5 million.

Another type of contract is a neighbor agreement, which is for people who are near wind farms but have no turbines on their land. The residents are saying they will not object to the project in exchange for annual payments that are often in the $1,000 range. Winans’ family, for example, has such an agreement that pays $1,000 per year.

Avangrid gave special attention to the fewer than 10 households that did not sign any agreement and had more than 30 hours of flicker. In some cases, the company agreed to reduce the effects by shutting off certain turbines at certain times.

DeLong’s property is one of the majority that experience fewer than 30 hours. She does not recall any contact with the developer about flicker, noise or anything else.

“Nobody came to my door,” she said. It was an inauspicious start to what has been a bad relationship.

Paul Copleman, an Avangrid spokesman, had this response:

“This is a project that has roughly 250 participating landowners and that reflects a lot of effort to talk to a lot of people in the community, at kitchen tables and in living rooms and in public meetings,” he said. “We think the onus is on us to develop responsibly and talk to people about the benefits we feel the project delivers to everybody in the community.”

The competing interests collide in an ongoing debate about how much distance should be required between the turbines and nearby property lines.

In 2014, Ohio Senate Republican leaders expanded the required distance by making a last-minute amendment to an unrelated budget bill. The provision was largely in response to citizen concerns about wind farms.

Wind-industry advocates warned that the result would be a virtual halt in development. However, projects that already were approved could go forward using the old rules, which has accounted for nearly all construction since then.

Supporters of wind energy, a mix of Republicans and Democrats, have repeatedly tried and failed to pass rules that are more wind-friendly and warned that investment might soon shift to other states. The current proposal is Senate Bill 188, sponsored by Sen. Cliff Hite, R-Findlay, whose district includes most of the wind farms. The bill would allow construction of wind turbines within about 600 feet of property lines, down from about 1,300 feet under the 2014 amendment.

Hite is confident the bill can pass the Senate. His problem is in the House, where Majority Leader Bill Seitz, R-Cincinnati, is one of the most outspoken critics of wind energy.

“If wind farms cannot be developed without borrowing or stealing their neighbors’ nonresidential property in order to satisfy the setback, health and safety requirements, then perhaps they should not be developed at all,” Seitz said in a 2016 letter.

For DeLong and her friends, Hite has become the face of the pro-wind crowd. She notes that there is no wind farm near his home.

“If he lived near them, it would be different,” she said.

Hite had this response: “I would put one in my backyard if I could.”

People will disagree about whether that would be a pleasant view. Meanwhile, the opposite view, from the top of the turbine, is breathtaking.

Dispatch photographer and videographer Doral Chenoweth III contributed to this story.

Majority of Americans now say climate change makes hurricanes more intense

By Emily Guskin and Brady Dennis September 28

A NASA satellite image shows Hurricane Irma slamming into the French Caribbean islands on Sept. 6. (NASA/GOES Project/AFP)

A majority of Americans say that global climate change contributed to the severity of recent hurricanes in Florida and Texas, according to a new Washington Post-ABC News poll. That marks a significant shift of opinion from a dozen years ago, when a majority of the public dismissed the role of global warming and said such severe weather events just happen from time to time.

In a 2005 Post-ABC poll, taken a month after Hurricane Katrina ravaged the Gulf Coast and devastated New Orleans, 39 percent of Americans said they believed climate change helped to fuel the intensity of hurricanes. Today, 55 percent believe that.

The shift may in part reflect scientists’ increasing confidence — and their increasing amount of data — in linking certain extreme weather events such as hurricanes to climate change. Many researchers have been unequivocal that while hotter oceans, rising seas and other factors are not the sole cause of any event, the warming climate is contributing to more intense storms and more frequent, more crippling storm surges and flooding.

“Harvey was not caused by climate change, yet its impacts — the storm surge and especially the extreme rainfall — very likely worsened due to human-caused global warming,” Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research said in a statement after the hurricane.

In a follow-up email to The Washington Post last month, Rahmstorf said that the explanation is just basic physics: The atmosphere holds more water vapor when it is warmer, setting the stage for more rain.

Yet for many Americans, the role of climate change has become as much about political beliefs as scientific findings.

Over the past decade, Democrats and independents accounted for most of the growing percentage of Americans who said climate change was a culprit in the intensity of hurricanes. While fewer than half of Democrats said climate change was a significant factor behind the intensity of hurricanes in 2005, that figure rose to 78 percent this month after hurricanes Harvey and Irma. Similarly, 42 percent of independents saw climate change as contributing to the severity of hurricanes in 2005 — a season in which hurricanes Rita and Wilma also hit the United States — while 56 percent now do.

By contrast, attitudes among Republicans have hardly changed over the same period. About 72 percent currently say that the severity of recent storms was merely the result of the kind of intense weather events that materialize periodically. In 2005, 70 percent had that view.

Younger adults are much more likely to attribute a role to climate change in the intensity of major storms, and their opinions appear to have shifted sharply over the past decade or so. The new poll finds that 67 percent of adults younger than 30 say climate change contributed to the recent hurricanes’ power, as do 64 percent of those in their 30s. Meanwhile, only about half of Americans ages 40 to 64 say that. The percentage is lower still for people older than 65.

Despite the damage their states suffered, the Post-ABC poll finds that residents in Texas and Florida appear more skeptical of linking climate change and a storm’s punch than Americans nationally. The poll included 134 respondents living in one of the two states, and just under half say the severity of the recent hurricanes was typical. Forty-four percent say their intensity was made worse by Earth’s changing climate.

The Washington Post-ABC News poll was conducted Sept. 18-21 among a random national sample of 1,002 adults reached on cellular and landline phones. The margin of sampling error for overall results is plus or minus 3.5 percentage points.

Emily Guskin is the polling analyst at The Washington Post, specializing in public opinion about politics, election campaigns and public policy. Follow @emgusk

Brady Dennis is a national reporter for The Washington Post, focusing on the environment and public health issues. Follow @brady_dennis

Power company kills nuclear plant, plans $6 billion in solar, battery investment

Duke Energy Florida is just the latest utility to walk away from nuclear.

On Tuesday, power provider Duke Energy Florida announced a settlement with the state’s public service commission (PSC) to cease plans to build a nuclear plant in western Florida. The utility instead intends to invest $6 billion in solar panels, grid-tied batteries, grid modernization projects, and electric vehicle charging areas. The new plan involves the installation of 700MW of solar capacity over four years in the western Florida area.

There's excitement from the solar industry, but the announcement is more bad news for the nuclear industry. Earlier this year, nuclear reactor company Westinghouse declared bankruptcy as construction of its new AP1000 reactors suffered from contractor issues and a stringent regulatory environment. Two plants whose construction was already underway—the Summer plant in South Carolina and the Vogtle plant in Georgia—found their futures in question immediately.

At the moment, Summer’s owners are considering abandoning the plant, and Vogtle’s owners are weighing whether they will do the same or attempt to salvage the project.

Duke Energy Florida hadn’t started building the Levy nuclear plant, but it did have plans to order two AP1000 reactors from Westinghouse. Now that Westinghouse is in dire financial straits, the Florida utility has decided that its money is better spent elsewhere.

Just last week, Duke told its PSC that it would have to increase rates by more than eight percent due to increased fuel costs. But with the new settlement that directs the utility toward solar and storage, customers will see that rate hike cut to 4.6 percent.

The Levy nuclear plant was proposed in 2008 and ran into hurdles early on. With cheap natural gas in 2013, Duke Energy Florida became nervous that it might not recuperate costs spent on the nuclear plant, especially with regulatory delays. The company cancelled its engineering and construction agreements in 2013 but said that it was holding open the possibility of returning to Levy someday. Over nine years, about $800 million had been spent on preparatory work for the plant.

With Tuesday’s announcement, those costs are sunk costs now. But overall, the changes will save residential customers future nuclear-related rate increases. Those customers will see a cost reduction of $2.50 per megawatt-hour (MWh) “through the removal of unrecovered Levy Nuclear Project costs,” the utility said.

The 700MW of solar won’t exactly cover the nameplate capacity of the Levy plant, which was supposed to deliver 2.2 gigawatts to the region. But the Tampa Bay Times wrote that Duke “is effectively giving up its long-held belief that nuclear power is a key component to its Florida future and, instead, making a dramatic shift toward more solar power.”

Duke Energy Florida serves 1.8 million people, and Florida relies mostly on (currently cheap) natural gas. Duke said this week that it wants to raise its solar power capacity to eight percent of generating power in the next four years.

This isn’t the only planned nuclear plant to be cancelled in the last few days, either. The parent company of Duke Energy Florida—that is, Duke Energy—also pulled the plug on another planned nuclear plant in North Carolina last week, according to GreenTechMedia. There, the power company asked for a rate increase to cover more than $500 million in sunk costs related to the unbuilt Lee nuclear power plant. (It should be noted, though, that more than half of the rate increase was to cover costs related to cleaning up coal ash sites.)

While Duke said that its existing nuclear power plants were a “vital component” of the power mix, the company also said that “cancellation of the [Lee] project is the best option for customers.”

Experts don’t foresee a lot of nuclear investment in the next decade or so, according to UtilityDive. But experts are divided on whether focusing solely on renewable energy is a good thing. While opponents cite nuclear’s expense and safety risk, proponents say we need all kinds of greenhouse gas-free power online as quickly as possible.

"A Sea of Health and Environmental Hazards in Houston’s Floodwaters"

"Officials in Houston are just beginning to grapple with the health and environmental risks that lurk in the waters dumped by Hurricane Harvey, a stew of toxic chemicals, sewage, debris and waste that still floods much of the city.

Flooded sewers are stoking fears of cholera, typhoid and other infectious diseases. Runoff from the city’s sprawling petroleum and chemicals complex contains any number of hazardous compounds. Lead, arsenic and other toxic and carcinogenic elements may be leaching from some two dozen Superfund sites in the Houston area.

Porfirio Villarreal, a spokesman for the Houston Health Department, said the hazards of the water enveloping the city were self-evident.

“There’s no need to test it,” he said. “It’s contaminated. There’s millions of contaminants.”

Antidepressants found in fish brains in Great Lakes region

The drugs enter rivers and lakes from treatment plants and sewage overflows, threatening aquatic life, scientists say

University at Buffalo

BUFFALO, N.Y. -- Human antidepressants are building up in the brains of bass, walleye and several other fish common to the Great Lakes region, scientists say.

In a new study, researchers detected high concentrations of these drugs and their metabolized remnants in the brain tissue of 10 fish species found in the Niagara River.

This vital conduit connects two of the Great Lakes, Lake Erie and Lake Ontario, via Niagara Falls. The discovery of antidepressants in aquatic life in the river raises serious environmental concerns, says lead scientist Diana Aga, PhD, the Henry M. Woodburn Professor of chemistry in the University at Buffalo College of Arts and Sciences.

"These active ingredients from antidepressants, which are coming out from wastewater treatment plants, are accumulating in fish brains," Aga says. "It is a threat to biodiversity, and we should be very concerned.

"These drugs could affect fish behavior. We didn't look at behavior in our study, but other research teams have shown that antidepressants can affect the feeding behavior of fish or their survival instincts. Some fish won't acknowledge the presence of predators as much."

If changes like these occur in the wild, they have the potential to disrupt the delicate balance between species that helps to keep the ecosystem stable, says study co-author Randolph Singh, PhD, a recent UB graduate from Aga's lab.

"The levels of antidepressants found do not pose a danger to humans who eat the fish, especially in the U.S., where most people do not eat organs like the brain," Singh says. "However, the risk that the drugs pose to biodiversity is real, and scientists are just beginning to understand what the consequences might be."

The research team included other scientists from UB, Ramkhamhaeng University and Khon Kaen University, both in Thailand, and SUNY Buffalo State. The study was published on Aug. 16 in the journal Environmental Science and Technology.

A dangerous cocktail of antidepressants in the water

Aga has spent her career developing techniques for detecting contaminants such as pharmaceuticals, antibiotics and endocrine disrupters in the environment.

This is a field of growing concern, especially as the use of such chemicals expands. The percentage of Americans taking antidepressants, for instance, rose 65 percent between 1999-2002 and 2011-14, according to the National Center for Health Statistics.

Wastewater treatment facilities have failed to keep pace with this growth, typically ignoring these drugs, which are then released into the environment, Aga says.

Her new study looked for a variety of pharmaceutical and personal care product chemicals in the organs and muscles of 10 fish species: smallmouth bass, largemouth bass, rudd, rock bass, white bass, white perch, walleye, bowfin, steelhead and yellow perch.

Antidepressants stood out as a major problem: These drugs or their metabolites were found in the brains of every fish species the scientists studied.

The highest concentration of a single compound was found in a rock bass, which had about 400 nanograms of norsertraline -- a metabolite of sertraline, the active ingredient in Zoloft -- per gram of brain tissue. This was in addition to a cocktail of other compounds found in the same fish, including citalopram, the active ingredient in Celexa, and norfluoxetine, a metabolite of the active ingredient in Prozac and Sarafem.

More than half of the fish brain samples had norsertraline levels of 100 nanograms per gram or higher. In addition, like the rock bass, many of the fish had a medley of antidepressant drugs and metabolites in their brains.

Evidence that antidepressants can change fish behavior generally comes from laboratory studies that expose the animals to higher concentrations of drugs than what is found in the Niagara River. But the findings of the new study are still worrisome: The antidepressants that Aga's team detected in fish brains had accumulated over time, often reaching concentrations that were several times higher than the levels in the river.

In the brains of smallmouth bass, largemouth bass, rock bass, white bass and walleye, sertraline was found at levels that were estimated to be 20 or more times higher than levels in river water. Levels of norsertraline, the drug's breakdown product, were even greater, reaching concentrations that were often hundreds of times higher than that found in the river.
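
The comparison the researchers are making is a simple ratio of concentrations expressed in the same units. The sketch below illustrates that arithmetic with hypothetical values; neither number is a measurement from the study.

```python
# Minimal sketch of the comparison described above: how many times higher a drug's
# concentration is in fish brain tissue than in the surrounding river water.
# Both inputs are hypothetical and must be expressed in the same units
# (e.g., nanograms per kilogram, treating a liter of water as roughly one kilogram).

def accumulation_ratio(brain_conc, water_conc):
    """Return how many times higher the brain concentration is than the water concentration."""
    return brain_conc / water_conc

# Hypothetical example: 400,000 ng/kg in brain tissue vs. 2,000 ng/kg in river water.
print(accumulation_ratio(400_000, 2_000))  # -> 200.0, i.e. 200 times higher
```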

Scientists have not done enough research yet to understand what amount of antidepressants poses a risk to animals, or how multiple drugs might interact synergistically to influence behavior, Aga says.

Wastewater treatment is behind the times

The study raises concerns regarding wastewater treatment plants, whose operations have not kept up with the times, says Aga, a member of the UB RENEW (Research and Education in eNergy, Environment and Water) Institute.

In general, wastewater treatment focuses narrowly on killing disease-causing bacteria and on extracting solid matter such as human excrement. Antidepressants, which are found in the urine of people who use the drugs, are largely ignored, along with other chemicals of concern that have become commonplace, Aga says.

"These plants are focused on removing nitrogen, phosphorus, and dissolved organic carbon but there are so many other chemicals that are not prioritized that impact our environment," she says. "As a result, wildlife is exposed to all of these chemicals. Fish are receiving this cocktail of drugs 24 hours a day, and we are now finding these drugs in their brains."

The problem is exacerbated, Singh says, by sewage overflows that funnel large quantities of untreated water into rivers and lakes. In August, for example, The Buffalo News reported that since May of 2017, a half billion gallons of combined sewage and storm water had flowed into local waterways, including the Niagara River.

New study: Shifting school start times could contribute $83 billion to US economy within a decade

Even after just two years, the study projects a national economic gain of $8.6 billion, which would already outweigh the costs per student from delaying school start times to 8:30 a.m.

RAND Corporation

The RAND Corporation and RAND Europe have released the first-ever, state-by-state analysis (in 47 states) of the economic implications of a shift in school start times in the U.S., showing that a nationwide move to 8:30 a.m. could contribute $83 billion to the U.S. economy within a decade.

Even after just two years, the study projects a national economic gain of $8.6 billion, which would already outweigh the costs per student from delaying school start times to 8:30 a.m. The costs per student are largely due to transportation, such as rescheduling bus routes and times, which would be affected by the school start time change.

The study used a novel macroeconomic model to project economic gains to the U.S. economy over 20 years from 2017, with the total reaching around $140 billion after 15 years. On average, that corresponds to an annual gain of about $9.3 billion, which is roughly the annual revenue of Major League Baseball.
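
As a quick arithmetic check, the per-year averages quoted here follow directly from dividing the cumulative projections by the number of years. The sketch below reproduces that division; the cumulative figures are the ones reported in this article, and the averages are simple calculations, not separate outputs of the study's model.

```python
# Quick arithmetic check of the averages implied by the RAND projections cited above.
cumulative_gains = {2: 8.6e9, 10: 83e9, 15: 140e9}  # years -> projected U.S. gain, dollars

for years, gain in cumulative_gains.items():
    print(f"{years:>2} years: ${gain / 1e9:5.1f}B total, "
          f"~${gain / years / 1e9:.1f}B per year on average")
```

Spreading the 15-year figure of roughly $140 billion evenly gives about $9.3 billion a year, matching the average quoted above.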

The economic gains projected through the study's model would be realised through the higher academic and professional performance of students, and reduced car crash rates among adolescents.

Previous estimates show that one additional hour of sleep is, on average, estimated to increase the probability of high school graduation by 13.3 percent and college attendance rate by 9.6 percent. These positive effects impact the jobs that adolescents are able to obtain in the future and, in turn, have a direct effect on how much a particular person contributes toward the economy in future financial earnings.

Data for car crash fatalities reveal that around 20 percent involved a driver impaired by sleepiness, drowsiness or fatigue. The impact of car crashes and young adults dying prematurely has a negative impact on the future labor supply of an economy.

From a health perspective, the American Academy of Pediatrics and the American Medical Association recommend that middle and high schools start no earlier than 8:30 a.m. to accommodate the known biological shift in adolescent sleep-wake schedules. However, a previous study from the federal Centers for Disease Control and Prevention estimated that 82 percent of middle and high schools in the U.S. start before 8:30 a.m., with an average start time of 8:03 a.m.

School start times are increasingly becoming a hotly debated topic in school districts across the U.S., while the California legislature is considering a proposal to mandate that start times for middle and high schools be no earlier than 8:30 a.m.

Wendy Troxel, senior behavioral and social scientist at the RAND Corporation, says, "For years we've talked about inadequate sleep among teenagers being a public health epidemic, but the economic implications are just as significant. From a policy perspective, the potential implications of the study are hugely important. The significant economic benefits from simply delaying school start times to 8.30 a.m. would be felt in a matter of years, making this a win-win, both in terms of benefiting the public health of adolescents and doing so in a cost-effective manner."

Marco Hafner, a senior economist at RAND Europe, the European affiliate of the RAND Corporation, says, "A small change could result in big economic benefits over a short period of time for the U.S. In fact, the level of benefit and period of time it would take to recoup the costs from the policy change is unprecedented in economic terms."

Although teenagers are advised to get an average of 8 to 10 hours of sleep each night, up to 60 percent of middle school and high school students report weeknight sleep duration of less than seven hours.

Previous evidence has shown that a lack of sleep among adolescents is associated with numerous adverse outcomes, including poor physical and mental health, behavioural problems, suicidal thoughts and attempts, and attention and concentration problems.

"Throughout the cost-benefit projections, we have taken a conservative approach when establishing the economic gains," Hafner said. "We have not included other effects from insufficient sleep, such as higher suicide rates, increased obesity and mental health issues, which are all difficult to quantify precisely. Therefore, it is likely that the reported economic and health benefits from delaying school start times could be even higher across many U.S. states."

The study follows a previous piece of research from RAND Europe in November 2016 that showed that the U.S. sustains economic losses of up to $411 billion a year (2.28 percent of its GDP) due to insufficient sleep among its workforce.

Climate action window could close as early as 2023

University of Michigan

ANN ARBOR--As the Trump administration repeals the U.S. Clean Power Plan, a new study from the University of Michigan underscores the urgency of reducing greenhouse gas emissions--from both environmental and economic perspectives.

For the U.S.'s most energy-hungry sectors--automotive and electricity--the study identifies timetables for action, after which the researchers say it will be too late to stave off a climate tipping point.

And the longer the nation waits, the more expensive it will be to move to cleaner technologies in those sectors--a finding that runs contrary to conventional economic thought because prices of solar, wind and battery technologies are rapidly falling, they say.

Steps outlined in the Clean Power Plan, as well as in the 2016 Paris climate accord, would not have been enough to meet the goal of keeping global temperature increase to 2 degrees Celsius by the end of this century, the study shows.

To achieve the 70-percent reduction target for carbon dioxide emissions used in the study, additional steps would be needed--and before 2023. The window for effective action could close that early.

"If we do not act to reduce greenhouse gas emissions forcefully prior to the 2020 election, costs ?to reduce emissions at a magnitude and timing consistent with averting dangerous human interference with the climate will skyrocket," said Steven Skerlos, U-M professor of mechanical engineering. "That will only make the inevitable shift to renewable energy less effective in maintaining a stable climate system throughout the lives of children already born."

Before Trump's reversal of both the domestic and international climate plans, the Intergovernmental Panel on Climate Change had recommended a 70-percent cut in carbon dioxide emissions from industrialized nations such as the U.S., where nearly half of emissions come from the electric and automotive sectors.

Using a custom, state-of-the-art model of these sectors, the researchers showed that the window for initiating additional climate action would close between 2023 and 2025 for the automotive sector and between 2023 and 2026 for the electric sector.

"That's true under even the most optimistic assumptions for clean technology advancements in vehicles and power plants," said study lead author Sarang Supekar, a mechanical engineering postdoctoral fellow at U-M.

Withdrawal from the accord and the EPA's plan to repeal the Clean Power Plan will only make the chances of achieving the goal more remote, the researchers say.

"In the absence of a government mandate, and if there is encouragement for coal to come back, then there's no way we can meet the target," Supekar said.

To arrive at their findings, Supekar and Skerlos calculated the future greenhouse gas contributions of the auto and power industries based on two approaches going forward--"business as usual" and "climate action." Their calculations relied on the lowest-cost technologies in each sector.

In the "business as usual" scenario, the auto industry followed its current rate of vehicle diversification--utilizing efficient internal combustion, electric and hybrid models, and the power sector utilized mostly natural gas and renewable plants. In the "climate action" scenario, those sectors relied on a greater percentage of cleaner automotive and power technologies to meet the IPCC climate goals.

"At some point, likely by 2023, you actually can't build the newer, cleaner power plants fast enough or sell enough fuel-efficient cars fast enough to be able to achieve the 70-percent target," Skerlos said.

Added Supekar, "The year-on-year emission reduction rate in such dramatic technology turnovers will exceed 5 percent after about 2020, which makes the 70-percent target infeasible for all practical purposes."

The analysis found no evidence to justify delaying climate action in the name of reducing technological costs, even under the most optimistic trajectories for improvement in fuel efficiency, demand, and technology costs in the U.S. auto and electric sectors. In fact, the study found that waiting another four years to initiate measures on track with the 70 percent target would take the total cost for both sectors from about $38 billion a year to $65 billion a year.

"You could take this same model or a different model and arrive at different cost numbers using a your own set of assumptions for "business as usual" or interests rates, for instance," Supekar said. "But the point is, regardless of whether the cost of climate action today is $38 billion or $100 billion, this cost will rise sharply in three to four years from now."

The IPCC has determined that in order to keep Earth's average temperature from rising more than 2 degrees Celsius above pre-industrial times by the end of the century, global greenhouse gas emissions must be reduced between 40 percent and 70 percent by 2050. The U.S. is the largest cumulative emitter of greenhouse gases, and the electric and auto industries account for nearly half of the country's annual output. Fossil fuel combustion accounts for 95 percent of those industries' emissions.

The study, "Analysis of Costs and Time Frame for Reducing CO2 Emissions by 70% in the U.S. Auto and Energy Sectors by 2050," is published in Environmental Science and Technology. It was funded by the U-M Energy Institute and the National Science Foundation.

Baltic clams and worms release as much greenhouse gas as 20,000 dairy cows

New study shows that oceans with worms and clams enhance the release of methane into the atmosphere up to eight times more than oceans without them

Cardiff University

Scientists have shown that ocean clams and worms are releasing a significant amount of potentially harmful greenhouse gas into the atmosphere.

The team, from Cardiff University and Stockholm University, have shown that the ocean critters are producing large amounts of potent greenhouse gases - methane and nitrous oxide - from the bacteria in their guts.

Methane gas is making its way into the water and then finally out into the atmosphere, contributing to global warming - methane has 28 times greater warming potential than carbon dioxide.

A detailed analysis showed that around 10 per cent of total methane emissions from the Baltic Sea may be due to clams and worms.

The researchers estimate that this is equivalent to the methane given off by 20,000 dairy cows - as many as 10 per cent of all dairy cows in Wales and 1 per cent of those in the UK.

The findings, which have been published in the journal Scientific Reports, point to a so far neglected source of greenhouse gases in the sea and could have a profound impact on decision makers.

It has been suggested that farming oysters, mussels and clams could be an effective solution against human pressures on the environment, such as eutrophication caused by the run-off of fertilisers into our waters.

The authors warn that stakeholders should consider these potential impacts before deciding whether to promote shellfish farming to large areas of the ocean.

Co-author of the study Dr Ernest Chi Fru, from Cardiff University's School of Earth and Ocean Sciences, said: "What is puzzling is that the Baltic Sea makes up only about 0.1% of Earth's oceans, implying that globally, apparently harmless bivalve animals at the bottom of the world's oceans may in fact be contributing ridiculous amounts of greenhouse gases to the atmosphere that is unaccounted for."

Lead author of the study Dr Stefano Bonaglia, from Stockholm University, said: "It sounds funny but small animals in the seafloor may act like cows in a stable, both groups being important contributors of methane due to the bacteria in their gut.

"These small yet very abundant animals may play an important, but so far neglected, role in regulating the emissions of greenhouse gases in the sea."

To arrive at their results the team analysed trace gas, isotopes and molecules from the worms and clams, known as polychaetes and bivalves respectively, taken from ocean sediments in the Baltic Sea.

The team analysed both the direct and indirect contribution that these groups were having on methane and nitrous oxide production in the sea. The results showed that sediments containing clams and worms increased methane production by a factor of eight compared to completely bare sediments.

Winter cold extremes linked to high-altitude polar vortex weakening

by Staff Writers

Potsdam, Germany (SPX) Sep 25, 2017

"Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions." The effect is stronger over Asia and Europe than over the US.

When the strong winds that circle the Arctic slacken, cold polar air can escape and cause extreme winter chills in parts of the Northern hemisphere. A new study finds that these weak states have become more persistent over the past four decades and can be linked to cold winters in Russia and Europe.

It is the first to show that changes in winds high up in the stratosphere substantially contributed to the observed winter cooling trend in northern Eurasia. While it is still a subject of research how the Arctic under climate change impacts the rest of the world, this study lends further support that a changing Arctic impacts the weather across large swaths of the Northern Hemisphere population centers.

"In winter, the freezing Arctic air is normally 'locked' by strong circumpolar winds several tens of kilometers high in the atmosphere, known as the stratospheric polar vortex, so that the cold air is confined near the pole," says Marlene Kretschmer from PIK, lead-author of the study to be published in the Bulletin of the American Meteorological Society.

"We found that there's a shift towards more-persistent weak states of the polar vortex. This allows frigid air to break out of the Arctic and threaten Russia and Europe with cold extremes. In fact this can explain most of the observed cooling of Eurasian winters since 1990."

Warm Arctic, cold continents

Despite global warming, recent winters in the Northeastern US, Europe and especially Asia were anomalously cold - some regions like Western Siberia even show a downward temperature trend in winter. In stark contrast, the Arctic has been warming rapidly. Paradoxically, both phenomena are likely linked: when sea ice north of Scandinavia and Russia melts, the uncovered ocean releases more warmth into the atmosphere, and this can reach up to about 30 kilometers high into the stratosphere, disturbing the polar vortex.

Weak states of the high-altitude wind circling the Arctic then favor the occurrence of cold spells in the mid-latitudes. Previous work by Kretschmer and colleagues identified this causal pathway in observational data, and it is further supported by several climate computer simulation studies.

"Our latest findings not only confirm the link between a weak polar vortex and severe winter weather, but also calculated how much of the observed cooling in regions like Russia and Scandinavia is linked to the weakening vortex. It turns out to be most," says co-author Judah Cohen from Atmospheric and Environmental Research/Massachusetts Institute of Technology (US).

"Several types of weather extremes are on the rise with climate change, and our study adds evidence that this can also include cold spells, which is an unpleasant surprise for these regions." The effect is stronger over Asia and Europe than over the US.

Circulation patterns drive our weather

"It is very important to understand how global warming affects circulation patterns in the atmosphere," says co-author Dim Coumou from Vrije Universiteit Amsterdam, Netherlands. "Jet Stream changes can lead to more abrupt and surprising disturbances to which society has to adapt. The uncertainties are quite large, but global warming provides a clear risk given its potential to disturb circulation patterns driving our weather - including potentially disastrous extremes."

Students, researchers turn algae into renewable flip-flops

by Brooks Hays

Washington (UPI) Oct 5, 2017

A team of researchers and students at the University of California, San Diego are trying to curb the number of petroleum-based flip flops -- currently, 3 billion every year -- that end up in landfills.

Their solution is the world's first algae-based, eco-friendly pair of flip flops.

"Even though a flip flop seems like a minor product, a throwaway that everyone wears, it turns out that this is the number one shoe in the world," Stephen Mayfield, a professor of biology at UCSD, said in a news release.

In India, China and Africa, the flip flop is the most popular shoe.

"These are the shoes of a fisherman and a farmer," Mayfield said.

Flip flops are responsible for a large amount of the polyurethane that ends up in the ocean. But researchers originally set their sights on the pollution caused by surfboards.

Two years ago, Mayfield and Skip Pomeroy, a professor of chemistry and biochemistry, worked with their students to design an algae-based, biodegradable surfboard. They teamed up with a local surfboard blanks manufacturer, Arctic Foam of Oceanside, to produce the board.

Soon, Mayfield and Pomeroy realized their technology could be used to replace other petroleum-based, polyurethane products.

"The algae surfboard was the first obvious product to make, but when you really look at the numbers you realize that making a flip-flop or shoe sole like this is much more important," says Mayfield. "Depending on how you do the chemistry, you can make hard foams or soft foams from algae oil. You can make algae-based, renewable surfboards, flip-flops, polyurethane athletic shoes, car seats or even tires for your car."

Petroleum is formed from ancient plant material like algae, but the chemical potential for petroleum is present in living algae. By converting living algae into petroleum and then into polyurethane, researchers are pulling carbon from the atmosphere instead of taking it out of reserves in the ground. This is what makes their algae-based polyurethane more sustainable.

Researchers are also working to make the carbon bonds in their polyurethane more biodegradable, so that microorganisms can more easily break them down.

In an effort to take their research to market, researchers have spun their work off into a new company called Algenesis Materials. Their first product is the Triton flip flop. The researchers hope to continue perfecting the chemistry behind the Triton flip flop and use it to make more eco-friendly shoe soles, car seats and more.

"The idea we're pursuing is to make these flip-flops in a way that they can be thrown into a compost pile and they will be eaten by microorganisms," said Mayfield.

Algae-based surfboards have already become popular in the surfing industry, and researchers hope their latest efforts will have a similar impact on the shoe industry.

"It's going to be a little while before you can buy one of these flip-flops in the store, but not too long," says Mayfield. "Our plan is that in the next year, you'll be able to go into the store and buy an Algenesis flip-flop that is sustainable, biodegradable and that was invented by students at UC San Diego."

Cleanup bill for toxic firefighting chemicals at military bases could reach $2 billion

UPDATED: Wed., Sept. 6, 2017, 11:07 p.m.

Airway Heights Public Works Department flushes potentially contaminated water from a fire hydrant into Aspen Grove Park in Airway Heights in May. The city worked to remove contaminants from the water pipes caused when chemicals used for fire suppression at Fairchild Air Force Base entered the water supply. A U.S. Senate report said it may cost $2 billion to clean up the water pollution at more than 400 military bases across the country. (Colin Mulvany / The Spokesman-Review)

It may cost up to $2 billion to clean up toxic firefighting chemicals that have leaked from more than 400 U.S. military installations, including Fairchild Air Force Base, a group of Democratic senators said Tuesday in a letter to the Senate Appropriations Committee.

The senators, including Patty Murray and Maria Cantwell of Washington, attributed that cost estimate to U.S. Department of Defense officials.

The senators requested a study of the chemicals known as PFOS and PFOA, which were key ingredients in a foam that was used for decades to douse aircraft fires at military bases and civilian airports.

In recent months, it was revealed that the chemicals contaminated groundwater at many of those sites. Dozens of residents who live near Fairchild have learned that the chemical concentrations in their drinking water exceed recommendations from the U.S. Environmental Protection Agency.

Other senators who signed the letter include Michael Bennet of Colorado, Jeanne Shaheen and Maggie Hassan of New Hampshire, Kirsten Gillibrand of New York and Bob Casey of Pennsylvania.

They asked that funds be included in the 2018 budget for the Centers for Disease Control, the EPA and the Department of Defense to study the spread of the chemicals, the health effects and viable alternatives for the toxic firefighting foam.

The chemicals have been linked to cancer, thyroid problems and immune system disorders, although scientists aren’t sure exactly how they interact in the human body.

“Residents of our states are concerned about exposure, and what this means for their health and safety,” the senators wrote. “We urge the Committee to include language directing DOD and the military services to budget robustly for assessment, investigation, and remediation activities in the upcoming fiscal years.”

Chicago Just Repealed Its Brand New Soda Tax

October 12, 2017 By Elaine S. Povich

Matt Gill, manager of Tischler Finer Foods in Brookfield, Illinois, says the soda tax has cut into his business. The Cook County Board repealed the tax Wednesday.

The Cook County (Chicago) Board of Commissioners voted Wednesday to repeal the controversial penny-an-ounce tax on sweetened beverages, following a pitched advertising battle between advocates who cited public health concerns and opponents who said the tax was too burdensome on business.

The repeal, on a voice vote, was taken just two months after the tax went into effect Aug. 2. The repeal goes into effect Dec. 1. Cook County Board President Toni Preckwinkle, a supporter of the tax, said the county will now have to plug a $200 million annual hole in the budget – without the money the tax was expected to raise.

The American Beverage Association and local store owners pushed for repeal with millions of dollars in television advertising. They were countered by an equally expensive campaign by billionaire public health advocate and former New York Mayor Michael Bloomberg, who in 2012 pushed through a municipal cap on jumbo sugary drinks only to have it overturned by a state court in 2014.

The county’s complicated tax on sweetened beverages covered any nonalcoholic beverage that was sweetened, whether artificially or with sugar, and whether it came in bottles and cans or from a fountain. The tax applied to regular and diet soda, sports and energy drinks, and juice products that weren’t 100 percent fruit or vegetable juice. It also applied to ready-to-drink sweetened coffees and teas — as long as they were less than half milk. It did not apply to flavored bubbly waters without sugar but did cover sparkling waters that were sweetened with sugar.

"AEP to Spend $4.5 Billion on the Largest Wind Farm in the U.S."

Growth-Starved Utilities Have Found a New Way to Make Money

By Chris Martin , Jim Polson , and Mark Chediak

July 26, 2017, 5:10 PM EDT; updated July 27, 2017, 10:55 AM EDT

AEP seeking permission to invest $4.5 billion in wind power

Utilities now looking to own renewable energy assets outright

There’s a new model emerging for growth-starved utilities looking to profit from America’s solar and wind power boom.

American Electric Power Co. is using it for a $4.5 billion deal that’ll land the U.S. utility owner a massive wind farm in Oklahoma and a high-voltage transmission line to deliver the power. NextEra Energy Inc.’s Florida unit is using it to build solar farms. And in April, the chief executive officer of Xcel Energy Inc. said he’d use it to help add 3.4 gigawatts of new wind energy over the next five years.

Here’s how it works: Some utilities that for years contracted to buy electricity from wind and solar farm owners are now shifting away from these so-called power purchase agreements, or PPAs. They’re instead seeking approval from state regulators to buy the assets outright and recover the costs from customers through rates. While the takeovers are being branded as a cheaper way of securing power, saving ratepayers millions in the end, they also guarantee profits for utilities.

“We keep wondering why utilities are always signing PPAs that pass the cost through to customers,” said Amy Grace, an analyst at Bloomberg New Energy Finance. “If you put it in your rate base, you can get a guaranteed return on it. There’s a big upside to ownership.”

With the cost of building solar and wind farms sliding and electricity demand weakening, owning renewables is a more attractive proposition than ever for utilities.

“The price of wind has come down enough that it’s going to be competitive with anything else you’re probably going to propose to build out there,” Kit Konolige, a New York-based utility analyst for Bloomberg Intelligence, said by phone Wednesday.

Biggest Hurdle

The catch, of course, is regulatory buy-in. AEP will need approval from Arkansas, Louisiana, Oklahoma, Texas and federal regulators to purchase the 2,000-megawatt Wind Catcher farm from developer Invenergy LLC, build a 350-mile (563-kilometer) transmission line and bake the costs into customer rates. The company expects to file these plans with regulators on July 31.

American Electric wants regulatory approvals by April, Chief Executive Officer Nicholas Akins said on the company’s second-quarter earnings call Thursday. Central to the project are $2.7 billion of U.S. tax credits for wind production. No rate increase will be needed, he said.

American Electric shares rose as much as 1.9 percent to $70.80 in intraday trading in New York, the highest in almost a month, and were at $70.32 at 10:52 a.m.

Benjamin Fowke, chief executive officer of Minneapolis-based Xcel Energy, said in April that the company would seek approval to add 3.4 gigawatts of new wind energy and bill customers for as much as 80 percent of that investment through rates. He said at the time that wind energy is now the cheapest new form of generation.

NextEra has already gotten Florida’s go-ahead to recover the costs of the solar farms its Florida utility is building through customer rates.

Should AEP gain similar approvals, the utility will be embarking on its most expensive renewable energy project yet, underscoring a dramatic shift in America’s power mix. Once the largest consumer of coal in the U.S., AEP is now shuttering money-losing plants burning the fuel and diversifying its resources along with the rest of the utility sector.

The Wind Catcher farm in Oklahoma is set to become the largest wind farm in the nation and the second-biggest in the world, according to Invenergy.

AEP estimated that the low cost of wind power will save customers $7 billion over 25 years. Construction on the farm began in 2016, and the plant is scheduled to go into service in mid-2020, Invenergy said by email.

Plastic Garbage Patch Bigger Than Mexico Found in Pacific

Yet another floating mass of microscopic plastic has been discovered in the ocean, and it is mind-blowingly vast.

By Shaena Montanari

PUBLISHED JULY 25, 2017

Water, water, everywhere—and most of it is filled with plastic.

A new discovery of a massive amount of plastic floating in the South Pacific is yet another piece of bad news in the fight against ocean plastic pollution. This patch was recently discovered by Captain Charles Moore, founder of the Algalita Research Foundation, a non-profit group dedicated to solving the issue of marine plastic pollution.

Moore, who was the first one to discover the famed North Pacific garbage patch in 1997, estimates this zone of plastic pollution could be upwards of a million square miles in size. (Read: A Whopping 91% of Plastic Isn’t Recycled.)

The team is currently processing the data and weighing the plastic so they can get a handle on exactly how much garbage they’ve discovered in this area off the coast of Chile and Peru.

The term “patch” referring to the plastic pollution in oceanic gyres can be misleading. The pieces of plastic are not necessarily floating bottles, bags, and buoys, but teeny-tiny pieces of plastic resembling confetti, making them almost impossible to clean up.

These microplastic particles may not be visible floating on the surface, but in this case, they were detected after collecting water samples on Moore’s recent six-month expedition to the remote area that had only been explored for plastic once before. (See a map of plastic in the ocean.)

On the first transect of the South Pacific gyre in 2011, Marcus Eriksen, marine plastic expert and research director at the 5 Gyres Institute, did not spot much plastic. In only six years, according to the new data collected by Moore, things have changed drastically.

Henderson Island, located in this South Pacific region, was recently crowned the most plastic-polluted island on Earth, as researchers discovered it is covered in roughly 38 million pieces of trash.

The problem of plastic pollution is becoming ubiquitous in the oceans, with 90 percent of sea birds consuming it and over eight million pounds of new plastic trash finding its way into the oceans every year.

50% Rise in Renewable Energy Needed to Meet Ambitious State Standards

States are powerful forces for renewable electricity. About half the growth since 2000 has come from their Renewable Portfolio Standards, a national lab finds.

BY PHIL MCKENNA

JUL 27, 2017

Wind turbines at San Gorgonio Pass, California. Credit: David McNew/Getty

Hawaii, California and New York have some of the most ambitious renewable portfolio standards in the United States. Hawaii is aiming for 100 percent of electricity from renewable sources by 2045. Credit: David McNew/Getty

Renewable electricity generation will have to increase by 50 percent by 2030 to meet ambitious state requirements for wind, solar and other sources of renewable power, according to a new report from Lawrence Berkeley National Laboratory.

The report looked at Renewable Portfolio Standards (RPSs)—commitments set by states to increase their percentage of electricity generated from sources of renewable energy, typically not including large-scale hydropower. Twenty-nine states and Washington, D.C., currently have such standards, covering 56 percent of all retail electricity sales in the country.

"I think that the industry is quite capable of meeting that objective cost-competitively and, actually, then some," said Todd Foley, senior vice president of policy and government affairs at the American Council on Renewable Energy.

What's Needed to Meet States' Renewable Power Goals?

Seven states—Maryland, Michigan, New York, Rhode Island, Massachusetts, Illinois and Oregon—as well as Washington, D.C., have increased their RPS requirements for new wind and solar projects since the start of 2016. No states weakened their RPS policies during this time. Some of the most ambitious requirements are in California and New York, which require 50 percent of electricity to come from renewable sources by 2030, and Hawaii, which requires 100 percent from renewables by 2045.

RPS policies have driven roughly half of all growth in U.S. renewable electricity generation and capacity since 2000 to its current level of 10 percent of all electricity sales, the national lab's report shows. In parts of the country, the mandates have had an even larger effect—they accounted for 70-90 percent of new renewable electricity capacity additions in the West, Mid-Atlantic and Northeast regions in 2016.

"They have been hugely important over the years to help diversify our power mix and send a signal to investors and developers alike to put their resources in the deployment of renewable energy," Foley said.

Nationally, however, the role of RPS policies in driving renewable energy development is beginning to decrease, as corporate contracts from companies that have committed to getting 100 percent of their electricity from renewables, along with the falling costs of wind and solar, play an increasing role.

From 2008 to 2014, RPS policies drove 60-70 percent of renewable energy capacity growth in the U.S., according to the report. In 2016, the impact dropped to just 44 percent of added renewable energy capacity.

29 States and D.C. Have Set Renewable Portfolio Standards

The increasing role market forces are playing in driving renewable energy generation is seen in a number of states with no RPS policies.

In Kansas, for example, wind energy provided 24 percent of net electricity generation in 2015, up from less than 1 percent in 2005, according to the U.S. Energy Information Administration. Similarly, wind power provides roughly one quarter of net electricity generation in Oklahoma and South Dakota, states that also lack RPS policies. Some of the generation in each of these states may be serving RPS demand in other states, or, in the case of Kansas, may be partly a result of an RPS that was repealed in 2015, lead author Galen Barbose said.

With some states considering further increases in their renewable energy standards, the policies are likely to continue to play a significant role in renewable energy development, Foley said.

"They have been very important," he said, "and I think they'll continue to be."

ABOUT THE AUTHOR

Phil McKenna is a Boston-based reporter for InsideClimate News. Before joining ICN in 2016, he was a freelance writer covering energy and the environment for publications including The New York Times, Smithsonian, Audubon and WIRED. Uprising, a story he wrote about gas leaks under U.S. cities, won the AAAS Kavli Science Journalism Award and the 2014 NASW Science in Society Award. Phil has a master's degree in science writing from the Massachusetts Institute of Technology and was an Environmental Journalism Fellow at Middlebury College.

Don’t just fear the power-grid hack. Fear how little the U.S. knows about it

By Tim Johnson

July 27, 2017 4:16 PM

LAS VEGAS

The aggressive takedown of part of Ukraine’s power grid by hackers served as a wake-up call for cyber experts and exposed just how much America does not know about foreign operatives’ ability to strike critical U.S. infrastructure.

Electrical grids are on the minds of those gathered at Black Hat, the world’s biggest hacker convention, which entered its final day Thursday. The confab draws 16,000 hackers and information technology experts from around the globe.

Worries about infrastructure have grown since presumed Russian government-linked hackers last December shut down power to a section of the Ukrainian capital of Kiev, marking the second straight year they’ve blacked out part of the country as part of Russia’s drive to bring Ukraine back under its geopolitical wing.

Hackers routinely come to the Black Hat convention to demonstrate how to break into electronic systems embedded in medical devices, ATMs, cars, routers and mobile phones. This year, at the 20th annual gathering, one security researcher walked attendees through a hack of a wind farm.

“Wind farm control networks are extremely susceptible to attack,” said the researcher, Jason Staggs, who is affiliated with the University of Tulsa.

Hackers would only have to get access to a single turbine to implant malware that would spread across the wind farm, Staggs said, noting that he had hacked into turbines at multiple wind farms with the permission of operators.

“We can turn the turbine on or turn the turbine off or put it in an idle state,” he said, demonstrating the method for taking digital control of a turbine.

Wind farms provided 4.7 percent of the nation’s electricity in 2015, Staggs said, a percentage that is likely to climb to 20 percent by 2030.

When a 250-megawatt wind farm is left idle due to a malicious hack, the downtime can cost an electric utility between $10,000 and $35,000 an hour, Staggs said.

While wind farms may face vulnerabilities, some experts said the nation’s complex electrical grids are more robust and capable of withstanding a cyberattack.

“The reporting around hacking the grid is often over hyped,” said Tom Parker, a cofounder of FusionX, a firm specializing in cyber threat profiling that was bought in 2015 by Accenture Security, where he is group technology officer.

“Whenever anyone ever says someone could hack the grid, there’s no such thing as a single grid. There are multiple grids in the United States,” Parker said in an interview. “It’s a very complex system. There’s a lot of robustness built into it.”

Any cyberattack on U.S. power plants and electrical grids would only come if a foreign power launched a simultaneous military attack on the United States, Parker said.

Others suggested a lesser attack might occur, affecting only a section of the nation.

“If somebody really wanted to send a message, it hurts to have three or four days of no power to the Eastern Seaboard. That could be done. And there’s also no quick fix,” Sam Curry, chief product officer at Cybereason, a Boston firm, said on the sidelines of the conference.

Late last month, the FBI and the Department of Homeland Security issued an alert saying that foreign hackers were targeting the nuclear, power and critical infrastructure sectors.

Authorities did not say who the hackers were but added that they were not successful at breaching any of their targets, including an unidentified nuclear plant.

“We see this probing going on all the time,” said Joe Slowik, a senior threat intelligence researcher at Dragos Inc., an industrial cybersecurity company based in Fulton, Maryland.

A former member of an elite counterterrorism cyber unit at the National Security Agency, Jay Kaplan, said in an interview that he believes some U.S. rivals have already planted malicious code in the nation’s infrastructure, perhaps including nuclear facilities.

“I do believe that there is a certain percentage of our critical infrastructure that’s already compromised,” said Kaplan, who is chief executive of Synack, a Menlo Park, California, firm that deploys security teams to hack into the networks of clients to test for vulnerabilities.

“This is prepositioning. Should we ever go to war with another nation state, they can leverage this malware for their benefit and basically cripple the economy,” Kaplan said. “I just think it’s reality. … Right now, I don’t feel confident at all.”

The cyberattacks on Ukraine’s power system in 2015 and 2016 have sharpened worldwide focus on hacking aimed at crippling infrastructure.

In a December 2015 attack, hackers cut power to about 225,000 customers serviced by three energy distribution companies. The attack in late 2016 struck a transmission station north of Kiev, blacking out a portion of the capital. Attackers used malware designed to delete data and physically damage industrial control systems, Cybereason said in a report this month.

The team, allegedly Russian, behind the second attack was formidable.

“It looked like there were around 20 people involved in the operation,” said Robert M. Lee, chief executive of Dragos.

Dragos partnered with ESET, a Slovakian security company, to examine the malicious code used in the industrial attack on the Ukrainian power supply.

While the malware was more sophisticated in the 2016 attack, Lee said it could not be easily repurposed to attack U.S. electrical grids, which he described as far more complex.

“Please don’t go reading Ted Koppel’s ‘Lights Out’ book and think you may need to build a bunker,” Lee said. “Things will be okay.”

But Lee added that authorities routinely overestimate their cyber defensive abilities and do not know for certain if malicious worms already reside within the infrastructure.

“The scary part of it is we simply do not know because the government tends to think it has more control over certain infrastructure than it actually does,” Lee said.

Congress Is Still Fighting Over Energy-Efficient Light Bulbs

By Ari Natter

July 27, 2017, 11:18 AM EDT

Amendment bars Energy Department enforcement of bulb standards

Light bulb makers have vowed to comply with rules regardless

Congressional Republicans renewed their effort to save the traditional light bulb, passing a measure to block federal energy standards that have come to symbolize government overreach for many consumers but are largely embraced by manufacturers as the cost of the newer bulbs has plummeted.

The House passed an amendment to a spending bill late Wednesday to block enforcement of Energy Department rules requiring manufacturers to phase out sales of incandescent light bulbs to cut power use. While the measure faces an uncertain future in the Senate, its sponsor called it an important victory for freedom.

"Congress should fight to preserve the free market," Representative Michael Burgess, a Texas Republican, said on the House floor before the vote. Burgess said he had heard from tens of thousands of people about how the regulations "will take away consumer choice when constituents are deciding which light bulbs they will use in their homes."

The measure takes aim at rules that were issued under bipartisan energy legislation signed into law in 2007 by Republican President George W. Bush. The standards, which were phased in over time starting in 2012, have curtailed the use of incandescent light bulbs.

If it becomes law, the measure could have some unintended consequences, critics warn. While it won’t change light bulb standards that have been in effect for years, it could prevent authorities from blocking imports of incandescent bulbs from China or other nations that don’t meet the requirements, said Steven Nadel, executive director of the American Council for an Energy-Efficient Economy, a non-profit group that promotes energy conservation.

When Republicans attacked these standards, manufacturers said they would still voluntarily comply with them, and the LED share of the market has surged over the last three years as the cost of those bulbs dropped.

LED light bulbs, which made up just 1 percent of global market share in 2010, are estimated to make up 95 percent of that market by 2025, according to research by Goldman Sachs. They accounted for more than a quarter of U.S. sales last year, it said. Other bulbs that have been used to meet the requirements include the ubiquitous curly compact fluorescent, or CFL, bulbs and halogen lights.

When a similar measure passed into law as a rider in government funding legislation in 2011, the National Electrical Manufacturers Association, which represents companies such as Philips Lighting NV and General Electric Co., said light bulb makers planned to comply with the standards regardless. Tracy Cullen, a spokeswoman for the trade group, did not respond to an email and phone message seeking comment.

"It’s mind boggling to me," Elizabeth Noll, legislative director of the Natural Resources Defense Council’s energy and transportation program, said. "It just adds uncertainty to their products and the rules of the road they are complying with."

The requirements, which have been put in place over time starting with the 100-watt light bulb in 2012, will cut the nation’s electricity costs by $12.5 billion annually and avoid the equivalent of 30 new power plants, the environmental group estimates.

“It’s an ideological objection to government interference in the marketplace,” said Kateri Callahan, president of the Alliance to Save Energy, a Washington-based nonprofit. “All the things that Dr. Burgess and other opponents to those standards said would happen -- limited choice by consumers, high prices -- none of that has borne out.”

Global diet and farming methods 'must change for environment's sake'

IOP Publishing

Reducing meat consumption and using more efficient farming methods globally are essential to stave off irreversible damage to the environment, a new study says.

The research, from the University of Minnesota, also found that future increases in agricultural sustainability are likely to be driven by dietary shifts and increases in efficiency, rather than changes between food production systems.

Researchers examined more than 740 production systems for more than 90 different types of food, to understand the links between diets, agricultural production practices and environmental degradation. Their results are published today in the journal Environmental Research Letters.

Lead author Dr Michael Clark said: "If we want to reduce the environmental impact of agriculture, but still provide a secure food supply for a growing global population, it is essential to understand how these things are linked."

Using life cycle assessments - which detail the input, output and environmental impact of a food production system - the researchers analysed the comparative environmental impacts of different food production systems (e.g. conventional versus organic; grain-fed versus grass-fed beef; trawling versus non-trawling fisheries; and greenhouse-grown versus open-field produce), different agricultural input efficiencies (such as feed and fertilizer), and different foods.

The impacts they studied covered levels of land use, greenhouse gas emissions (GHGs), fossil fuel energy use, eutrophication (nutrient runoff) and acidification potential.

Dr Clark said: "Although high agricultural efficiency consistently correlated with lower environmental impacts, the detailed picture we found was extremely mixed. While organic systems used less energy, they had higher land use, did not offer benefits in GHGs, and tended to have higher eutrophication and acidification potential per unit of food produced. Grass-fed beef, meanwhile, tended to require more land and emit more GHGs than grain-fed beef."

However, the authors note that these findings do not imply conventional practices are sustainable. Instead, they suggest that combining the benefits of different production systems, for example organic systems' reduced reliance on chemicals with the high yields of conventional systems, would result in a more sustainable agricultural system.

Dr Clark said: "Interestingly, we also found that a shift away from ruminant meats like beef - which have impacts three to 10 times greater than other animal-based foods - towards nutritionally similar foods like pork, poultry or fish would have significant benefits, both for the environment and for human health.

"Larger dietary shifts, such as global adoption of low-meat or vegetarian diets, would offer even larger benefits to environmental sustainability and human health."

Co-author Professor David Tilman said: "It's essential we take action through policy and education to increase public adoption of low-impact and healthy foods, as well as the adoption of low-impact, high-efficiency agricultural production systems.

"A lack of action would result in massive increases in agriculture's environmental impacts including the clearing of 200 to 1000 million hectares of land for agricultural use, an approximately three-fold increase in fertilizer and pesticide applications, an 80 per cent increase in agricultural GHG emissions and a rapid rise in the prevalence of diet-related diseases such as obesity and diabetes.

Professor Tilman added: "The steps we have outlined, if adopted individually, offer large environmental benefits. Simultaneous adoption of these and other solutions, however, could prevent any increase in agriculture's environmental impacts. We must make serious choices, before agricultural activities cause substantial, and potentially irreversible, environmental damage."

Optimizing feeding is necessary to maintain milk production in organic herds

Decisions on pasture use and feed management affect greenhouse gas emission, according to a new study from the Journal of Dairy Science®

Philadelphia, PA, June 15, 2017 - Consumer demand for organic milk recently surpassed the available supply, with sales of organic products reaching $35 billion in 2014 and continuing to rise. As farms transition to organic production to meet demand, feeding strategies will need to be adapted to meet USDA National Organic Program requirements. Currently, agriculture accounts for approximately 9% of total US greenhouse gas (GHG) emissions; the US dairy industry has committed to a 25% reduction of GHG by 2020 relative to 2009. By varying diet formulation and the associated crop production to supply the diet, farmers can affect the quantity of GHG emissions of various feeding systems. Therefore, researchers from the University of Wisconsin-Madison created a study to compare the effects of feeding strategies and the associated crop hectares on GHG emissions of Wisconsin certified organic dairy farms.

"Herd feeding strategies and grazing practices influence on-farm GHG emissions not only through crop production, but also by substantially changing the productivity of the herd," lead author Di Liang said. "Managing more land as pasture, and obtaining more of the herd feed requirements from pasture, can increase the GHG emissions if pasture and feed management are not optimized to maintain milk production potential."

The authors identified four feeding strategies that typified those used on farms in Wisconsin, with varying degrees of grazing, land allocated for grazing, and diet supplementation. A 16-year study was used for robust estimates of the yield potential on organically managed crop land in southern Wisconsin as well as nitrous oxide and methane emissions and soil carbon.

Production of organic corn resulted in the greatest nitrous oxide emissions and represented about 8% of total GHG emissions; corn also had the highest carbon dioxide emissions per hectare. Emissions decreased as the proportion of soybeans in the diet increased, as soybeans require less nitrogen fertilization than corn grain. More intensive grazing practices led to higher GHG emissions per metric tonne. However, allowing cows more time on pasture resulted in lower emissions associated with cropland. Manure management and replacement heifers accounted for 26.3% and 20.1% of GHG emissions, respectively.

Based on their findings, the authors determined that a holistic approach to farm production is necessary. Organic dairy farms with well-managed grazing practices and adequate levels of concentrate in diet can both increase farm profitability and reduce GHG emission per kilogram of milk.

"Consumers often equate more dependence on pasture with environmentally friendly farming, but this study demonstrated that low milk production per cow is a major factor associated with high GHG emission. Managing both pasture and supplementation to increase milk production per cow will substantially reduce GHG emissions," said Journal of Dairy Science Editor-in-Chief Matt Lucy.

Factors such as dairy cow breed and nonproduction variables may also have an effect on GHG emissions on organic dairy farms. Thus, future studies are needed in this area to elucidate the effects of grazing management and feeding systems. With more research, however, crop and milk production, GHG emissions, and farm profitability can be optimized on organic dairy farms.

EPA Announces Student Award Winners

The U.S. Environmental Protection Agency (EPA) today announced the winners of the 2016 President’s Environmental Youth Award (PEYA). The program recognizes outstanding environmental stewardship projects by K-12 youth. These students demonstrate the initiative, creativity, and applied problem-solving skills needed to tackle environmental problems and find sustainable solutions.

Fifteen projects are being recognized this year, from 13 states: California, Colorado, Connecticut, Florida, Nebraska, New Jersey, Michigan, Pennsylvania, South Carolina, Texas, Utah, Virginia, and Washington.

“Today, we are pleased to honor these impressive young leaders, who demonstrate the impact that a few individuals can make to protect our environment,” said EPA Administrator Scott Pruitt. “These students are empowering their peers, educating their communities, and demonstrating the STEM skills needed for this country to thrive in the global economy.”

Each year the PEYA program honors environmental awareness projects developed by young individuals, school classes (kindergarten through high school), summer camps, public interest groups and youth organizations.

This year’s PEYA winners conducted a wide range of activities, such as:

developing a biodegradable plastic using local agricultural waste product;

designing an efficient, environmentally friendly mosquito trap using solar power and compost by-product;

saving approximately 2,000 tadpoles to raise adult frogs and toads;

implementing a hydroponics and aquaculture project in a high school;

repurposing over 25,000 books;

creating an environmental news YouTube channel;

organizing recycling programs to benefit disaster victims and underserved community members;

reclaiming and repurposing over 4,000 discarded pencils within their school;

promoting food waste reduction;

creating a small, portable tool to prevent air strikes of migratory birds;

engaging their community in a program to save a threatened bird, the Western Snowy Plover;

testing grey water to encourage water conservation;

promoting bee health;

uniting their schools to address local environmental issues.

The PEYA program promotes awareness of our nation’s natural resources and encourages positive community involvement. Since 1971, the President of the United States has joined with EPA to recognize young people for protecting our nation’s air, water, land and ecology. It is one of the most important ways EPA and the Administration demonstrate commitment to environmental stewardship efforts created and conducted by our nation’s youth.

EPA Honors Winners of the 2017 Green Chemistry Challenge Awards

The U.S. Environmental Protection Agency (EPA) is recognizing landmark green chemistry technologies developed by industrial pioneers and leading scientists that turn potential environmental issues into business opportunities, spurring innovation and economic development.

“We congratulate those who bring innovative solutions that will help solve problems and help American businesses,” said EPA Administrator Scott Pruitt. “These innovations encourage smart and safe practices, while cutting manufacturing costs and sparking investments. Ultimately, these manufacturing processes and products spur economic growth and are safer for health and the environment.”

The Green Chemistry Challenge Award winners will be honored on June 12 at a ceremony in Washington, DC. The winners and their innovative technologies are:

Professor Eric Schelter, University of Pennsylvania, for developing a simple, fast, and low-cost technology to help recycle mixtures of rare earth elements. Reducing the costs to recover these materials creates economic opportunity by turning a waste stream, currently only recycled at a rate of 1%, into a potential revenue stream. About 17,000 metric tons of rare earth oxides are used in the U.S. annually in materials such as wind turbines, catalysts, lighting phosphors, electric motors, batteries, cell phones, and many others. Mining, refining, and purification of rare earths are extraordinarily energy and waste intensive and carry a significant environmental burden.

Dow Chemical Company, Collegeville, Pennsylvania, in partnership with Papierfabrik August Koehler SE, Germany, for developing a thermal printing paper that eliminates the need for chemicals used to create an image, such as bisphenol A (BPA) or bisphenol S (BPS). Thermal paper is used broadly throughout the world for cash register receipts, tickets, tags, and labels. This technology reduces costs by creating records that do not fade, even under severe sunlight, allowing the original document to be preserved for long term storage. The paper is compatible with thermal printers currently in commercial use around the world.

Merck Research Laboratories, Rahway, New Jersey, for successfully applying green chemistry design principles to Letermovir, an antiviral drug candidate currently in phase III clinical trials. The improvements to the way the drug is made, including use of a better chemical catalyst, increase the overall yield by more than 60%, reduce raw material costs by 93%, and reduce water usage by 90%.

Amgen Inc., Cambridge, Massachusetts, in partnership with Bachem, Switzerland, for improving the process used to manufacture the active ingredient in ParsabivTM, a drug for the treatment of secondary hyperparathyroidism in adult patients with chronic kidney disease. This improved peptide manufacturing process reduces chemical solvent use by 71%, manufacturing operating time by 56%, and manufacturing cost by 76%. These innovations could increase profits and eliminate 1,440 cubic meters of waste or more, including over 750 cubic meters of aqueous waste annually.

UniEnergy Technologies, LLC (UET), Mukilteo, Washington, in partnership with Pacific Northwest National Laboratory (PNNL), for an advanced vanadium redox flow battery, originally developed at the PNNL and commercialized by UET. The battery, when used by utility, commercial and industrial customers, allows cities and businesses more access to stored energy. It also lasts longer and works in a broad temperature range with one-fifth the footprint of previous flow battery technologies. The electrolyte is water-based and does not degrade, and the batteries are non-flammable and recyclable, thus helping meet the increasing demand of electrical energy storage in the electrical power market, from generation, transmission, and distribution to the end users of electricity.

During the 22 years of the program, EPA has received more than 1600 nominations and presented awards to 114 technologies that spur economic growth, reduce costs, and decrease waste. The agency estimates winning technologies are responsible for annually reducing the use or generation of more than 826 million pounds of hazardous chemicals, saving 21 billion gallons of water, and eliminating 7.8 billion pounds of carbon dioxide equivalent releases to air.

An independent panel of technical experts convened by the American Chemical Society Green Chemistry Institute formally judged the 2017 submissions from among scores of nominated technologies and made recommendations to EPA for the 2017 winners. The 2017 awards event will be held in conjunction with the 21st Annual Green Chemistry and Engineering Conference.

More information: www.epa.gov/greenchemistry

Ph.D. student pioneers storytelling strategies for science communication

By Tianyi Dong | June 6, 2017

Ph.D. student Sara ElShafie has created workshops to teach graduate students how to tell stories about science. (UC Berkeley photo by Brittany Hosea-Small)

Sara ElShafie used to struggle to explain her research to her family in a meaningful way. A UC Berkeley graduate student in integrative biology with an emphasis on how climate change affects animals over time, she says she “would always get lost in the details, and it was not doing justice to what is so amazing about natural history.”

But she doesn’t have that challenge anymore. Today, 28-year-old ElShafie is one of the few people in the country who focus on adapting storytelling strategies from the film industry to science communication. For the past year and a half, she has been leading workshops for scientists — primarily graduate students — on how to tell stories about their research that resonate with a broader audience.

ElShafie found her storytelling solution at Pixar Animation Studios, in Emeryville. A Pixar fan her entire life, she emailed the company’s outreach department in 2015 and asked if anyone there would like to talk with students at the UC Museum of Paleontology about how to adapt strategies for filmmaking to communicating science to people outside the scientific community.

“I just thought, ‘Why not?’” she says. “Communication skills require training, just like any other skills. Good communication requires good storytelling. Maybe we can learn from professional storytellers.”

To her surprise, two story artists at Pixar were interested and volunteered their time for the project. They collaborated with ElShafie to present a pilot seminar at the museum that attracted not only students, but faculty and staff. Seeing the project’s potential, ElShafie worked over the following year to develop a series of workshops inspired by Pixar. Last March, together with a Pixar collaborator, ElShafie presented the first public workshop on the Berkeley campus. Although the studio is not formally involved, additional artists at Pixar have been generously contributing their time and feedback.

Drawing on examples from Pixar films and scientific studies, the workshops illustrate storytelling strategies that make science accessible by reframing research into a story with characters, obstacles and revelations. According to ElShafie, these techniques help to overcome scientists’ difficulty with communicating science about non-human subjects. Pixar films — often about animals — are great models, she adds, because they have compelling themes and emotional depth and appeal to a broad audience.

A Pixar fan since childhood, ElShafie has a collection of Pixar figurines and toys in her lab in the Valley Life Sciences Building. (UC Berkeley photo by Brittany Hosea-Small) Pixar characters are copyright Disney/Pixar.

In her workshops with scientists, ElShafie leads by example, turning her research on fossil lizards into a story about how the conflict of climate change affects animal characters over time. “Lizards alone aren’t going to be that interesting to most audiences,” ElShafie says, “but thinking about time, space and their connection with a major societal problem is.”

ElShafie’s workshops not only help participants frame a message, but develop a conceptual framework for introducing scientific topics in the form of a story. With a worksheet she’s created to illustrate the parallel between filmmaking and science, participants leave the workshop having made an outline of a story about their own research. Extended workshops also explore how to use the visual language of animated films in scientific presentations.

“She’s really enlightened the whole museum community on ways we can strengthen our own writing by applying storytelling techniques,” says Lisa White, assistant director of education and public programs at the UC Museum of Paleontology. White says she watched how ElShafie polished and perfected her workshop over the past school year and how the audience grew threefold, to nearly 200 people at workshops this spring.

ElShafie offered her first official workshop, “Science Through Story: Strategies to Engage Any Audience,” in November 2016 at the annual meeting of the Western Society of Naturalists in Monterey. It was a hit, and she started to receive invitations to present it at other institutions around the state and country.

Mallory Rice, a UC Santa Barbara graduate student who attended the Monterey workshop, invited ElShafie to present the workshop at her campus in April. “It was great to see the example of how she turned her own questions and research into a story, rather than thinking about (just presenting) it to the public with results and data,” Rice says.

ElShafie hopes to hold the workshop annually at Berkeley.

Sara ElShafie (far left) at a March 2017 workshop she led on campus for 200 people that included (left to right) Eric Rodenbeck, CEO of Stamen Design; Erica Bree Rosenblum, UC Berkeley associate professor of Environmental Science, Policy and Management; Lisa White, director of education at the UC Museum of Paleontology; and Daniel McCoy, technical director at Pixar Animation Studios. (UC Museum of Paleontology photo by Helina Chin)

Upon arriving at Berkeley as a Ph.D. student in 2014, ElShafie knew she wanted a career in informal education leadership and eventually to become director of a major science museum. To achieve this, she sought scientific training as well as experience in communications and outreach.

As an undergraduate at the University of Chicago, ElShafie had been involved in Project Exploration, a Chicago nonprofit that makes science accessible to underrepresented groups. One of its programs brought students from the inner city to campus for a weeklong intensive summer course. She remembered how the students were overwhelmed at the beginning, but completely transformed by the end of the week, confident and dying to share with others what they had learned.

“That’s when I realized science is not just about learning science, but about discovering your potential and learning about the world around you,” ElShafie says.

ElShafie chose Berkeley for her doctoral program because it could offer her training in both scientific research and informal education, and was especially drawn to the UC Museum of Paleontology, with its emphasis on outreach and public service. She got involved in the museum’s educational programming, giving tours to school groups and assisting with Cal Day activities. Her lecture at Cal Day 2017, “Real-World Fantastic Beasts and Where To Find Them,” was another opportunity for her to apply her skills in explaining science to the public.

“Museums are often the first opportunity for young people to experience hands-on science,” White says, “and for Ph.D. students, a museum career is particularly rewarding, because you not only do research, but also share your specialty as well as the importance of science broadly with the public.”

ElShafie studies lizard fossils and how climate change has affected animals throughout history. (UC Berkeley photo by Brittany Hosea-Small)

For ElShafie, balancing science communication workshops with her dissertation research has not been easy. But she spends whatever spare time she has on this passion project and is running workshops across the country. She has given six workshops in the past year and has four more coming up, including one in June at the Smithsonian National Museum of Natural History.

She says, “I enjoy it immensely and have learned a ton — how to put materials together in a cohesive way; how to articulate a vision for a new initiative and get other people on board; how to create a niche for yourself by looking at what everyone else has done, what’s missing, and coming up with novel ideas to address a need.”

Job-seekers with communications expertise are in high demand in areas such as interaction with policy-makers and data visualization for scientists, says ElShafie.

“It has never been more critical for scientists to be able to explain science to the public effectively, and the backbone of all communication is a story,” ElShafie says. “Story training benefits scientists, because it helps us to communicate it in a clear and compelling way to any audience.”

She says the story also humanizes the storyteller, which can help with the misconception that scientists have their own agenda or are out of touch. “Storytelling is an introspective process,” says ElShafie. “It helps scientists to articulate their professional motivations, about why they become scientists, why they choose their particular field and why it is important.”

For ElShafie, paleontology is “the perfect gateway to other sciences” and also involves art and imagination. She enjoys the feeling of holding the evidence — like a fossil — in her hand, imagining the animals and their past lives, and then bringing them to life for other people.

“I learned that there are many parallels between the creative process of filmmakers and of scientists,” ElShafie says, adding that stepping outside one’s profession and looking at it from another lens, for a fresh perspective, is always valuable.

Wildfires Burning in West Greenland

By Brian Kahn

Published: August 7th, 2017

It’s not just the American West and British Columbia burning up. A fire has sparked in western Greenland, an odd occurrence for an island known more for ice than fire.

A series of blazes is burning roughly in the vicinity of Kangerlussuaq, a small town that serves as a basecamp for researchers in the summer to access Greenland’s ice sheet and western glaciers. The largest fire has burned roughly 3,000 acres and sent smoke spiraling a mile into the sky, prompting hunting and hiking closures in the area, according to local news reports.

The Sentinel-2 satellite captured a wildfire burning in western Greenland.

There’s no denying that it’s weird to be talking about wildfires in Greenland because ice covers the majority of the island. Forests are basically nonexistent and this fire appears to be burning through grasses, willows and other low-slung vegetation on the tundra that makes up the majority of the land not covered by ice.

Data for Greenland fires is hard to come by, but there is some context for fires in other parts of the northern tier of the world. The boreal forest sprawls across Canada, Russia, Alaska and northern Europe, and provides a longer-term record for researchers to dig into. That record shows that the boreal forest is burning at a rate unprecedented in the past 10,000 years.

Stef Lhermitte, a remote sensing expert at Delft University of Technology in the Netherlands, said there is evidence of fires burning in Greenland over the past 17 years of MODIS satellite records kept by NASA. But because of how NASA’s algorithms interpret the satellite data, there’s low confidence that every fire on the map actually occurred.

Jason Box, an ice sheet researcher with the Geological Survey of Denmark and Greenland, said he observed a lightning-sparked fire in the late 1990s, but that otherwise, fires are rare. Looking at the MODIS record, he said one of the only other high-confidence fires was actually a trash burn in 2013, though other satellites show evidence of other fires.

Box also noted that temperatures in the area rose in late July just before the fire was first observed, spiking to above 53°F (12°C) on July 27. While not exactly balmy, the temperature rise may have helped the blazes to spread.

According to La Croix, a French newspaper, there’s no precedent for a fire this size in the European Union’s forest fire system. Looking beyond the satellite record for context specific to Greenland is all but impossible as there are basically no records to refer to.

“There does not appear to be a reliable long-term record of observed wildfires in Greenland,” researchers with the Danish Meteorological Institute’s Greenland monitoring program tweeted.

Ultimately, it’s not the burning of Greenland’s tundra that’s the biggest climate change concern. It’s the island’s massive store of ice, which, if melted, would be enough to raise sea levels 20 feet.

The ice has been melting at a quickening pace since 2000, partly due to wildfires in other parts of the world. The uptick in boreal forest fires has kicked up more ash in the atmosphere where prevailing winds have steered it toward the ice sheet.

The dark ash traps more energy from the sun, which has warmed the ice sheet and caused more widespread melting. Soot from massive wildfires in Siberia caused 95 percent of the Greenland ice sheet surface to melt in 2012, a phenomenon that could become a yearly occurrence by 2100 as the planet warms and northern forest fires become more common.

Recycling dirty aluminum foil might not be practical — but it could be used to make biofuel

By Cody Boteler

Published Aug. 2, 2017

Dirty aluminum foil can be converted into a catalyst for biofuel production, according to a researcher at Queen's University Belfast. Recycling aluminum from all feedstocks is critical, according to the paper (published in Scientific Reports), because bauxite mining, the main source of aluminum, is environmentally damaging.

The researchers were able to dissolve the foil in a solution that turned it into crystals, and then used a second mixture to purify the crystals into pure aluminum salts, according to Mental Floss. Those pure salts can be used in creating alumina catalyst, which is a key ingredient in making dimethyl ether (DME), a diesel substitute.

According to the author of the paper, producing the alumina catalyst from dirty aluminum foil costs about $142 per kilogram (just over 2 lbs.), while the commercial cost for the alumina catalyst is around $360 per kilogram — demonstrating significant savings.

The industrial process of chemically and physically transforming dirty, used aluminum foil into usable, pure aluminum crystals may be complicated — but the implications are clear. Instead of landfilling used aluminum and taking up valuable airspace, that aluminum can be turned into a valuable biofuel feedstock.

As more areas pursue organics diversion goals, solutions should be explored on all steps of the food chain, from packaging to waste processing. While clean aluminum can be utilized by recyclers or scrap sellers, dirty aluminum can't do much more than add to the weight of a landfill delivery, and can contaminate organics collection. Converting dirty aluminum to a usable industrial material, then, could become a valuable cottage industry, if the paper's findings are viable.

Queen's University said, in a press release, that the author of the paper, Ahmed Osman, plans to continue researching ways to further improve the catalysts that produce DME, to make production more commercially viable. Osman's research could also include looking at ways to use the alumina crystals produced from the dirty aluminum foil in the catalytic converters of vehicles that run on natural gas. Given the industry trend, both from companies and governments, away from traditional fuel in collection trucks, these developments are worth keeping an eye on. The New York Department of Sanitation has been testing a DME-powered Mack Truck, with results expected sometime soon.

China’s ageing solar panels are going to be a big environmental problem

The issue of how to dispose of hazardous waste from ageing panels casts a shadow over the drive towards renewable energy and away from fossil fuels

PUBLISHED : Sunday, 30 July, 2017, 4:03pm, UPDATED : Sunday, 30 July, 2017, 10:12pm

China will have the world’s worst problem with ageing solar panels in less than two decades, according to a recent industry estimate.

Lu Fang, secretary general of the photovoltaics division of the China Renewable Energy Society, wrote in an article circulating on mainland social media this month that the country’s cumulative capacity of retired panels would reach up to 70 gigawatts (GW) by 2034.

That is three times the scale of the Three Gorges Dam, the world’s largest hydropower project, by power production.

By 2050 these waste panels would add up to 20 million tonnes, or 2,000 times the weight of the Eiffel Tower, according to Lu.

“In fair weather, prepare for foul,” she warned.

China flips the switch on world’s biggest floating solar farm

Lu could not be immediately reached for comment. A staff member from the society, which was formerly known as the Chinese Solar Energy Society, confirmed that the figures were in a report presented by Lu at an industrial conference in Xian, Shaanxi in May.

China currently hosts the world’s largest number of solar power plants, with a total capacity of close to 80GW last year, according to the International Energy Agency. That installed capacity is nearly twice that of the United States.

Nearly half of the nation’s total capacity was added last year. Industrial experts have also predicted that new solar farms completed this year will exceed 2016’s record, according to Bloomberg.

This breakneck pace was driven by the government’s push to diversify the country’s energy supply structure, which at present relies heavily on fossil fuels such as coal and imported oil.

But the solar plants are relatively short-lived, and the government does not have any retirement plan for them yet.

A panel’s lifespan ranges from 20 to 30 years, depending on the environment in which it is used, according to the US Department of Energy. High temperatures can accelerate the ageing process for solar cells, while other negative factors – such as the weight of snow or dust storms – could cause material fatigue on the surface and in internal electric circuits, gradually reducing the panel’s power output.

China’s world-beating solar farm is almost as big as Macau, Nasa satellite images reveal

Tian Min, general manager of Nanjing Fangrun Materials, a recycling company in Jiangsu province that collects retired solar panels, said the solar power industry was a ticking time bomb.

“It will explode with full force in two or three decades and wreck the environment, if the estimate is correct,” he said.

“This is a huge amount of waste and they are not easy to recycle,” Tian added.

A solar panel contains metals such as lead and copper and also has an aluminium frame. The solar cells are made up of pure, crystallised silicon wrapped under a thick layer of plastic membrane for protection.

In Europe, some companies are reported to have developed sophisticated technology to reclaim more than 90 per cent of the materials.

But the western technology might face a hard sell in China, according to Tian.

China sets up laboratory to research building solar power station in space

China’s solar power plants are mostly located in poor, remote regions such as the Gobi in Inner Mongolia, while the majority of recycling industries are in developed areas along the Pacific coast.

Transporting these bulky panels over long distances could be very costly, Tian said.

Another cost comes from separating and purifying the waste materials, an industrial process that not only requires plenty of labour and electricity input, but also chemicals such as acids that could cause harm to the environment.

“If a recycling plant carries out every step by the book to achieve low pollutant emission, their products can end up being more expensive than new raw materials,” he said.

The cost of a kilogram of silicon crystal stands at about US$13 this year. The price is expected to drop by 30 per cent over the next decade, according to industrial estimates. This would make recycled silicon even harder to sell, according to Tian.

A sales manager of a solar power recycling company believes there could be a way to dispose of China’s solar junk, nonetheless.

Nuclear scientist predicts China could be using fusion power in 50 years

“We can sell them to Middle East,” said the manager who requested not to be named.

“Our customers there make it very clear that they don’t want perfect or brand new panels. They just want them cheap,” he said.

“They are re-selling these panels to household users living in deserts. There, there is lots of land to install a large amount of panels to make up for their low performance,” the manager added.

“Everyone is happy with the result,” he added.

Flood insurance is a mess, and Harvey won’t make Congress any more likely to fix it

By Stuart Leavenworth

September 01, 2017 4:11 PM

WASHINGTON

Hurricane Harvey is sure to add more crushing debt to the National Flood Insurance Program, which is already $25 billion in the red. So when Congress resumes on Tuesday, will it immediately act to fix this troubled program?

Don’t count on it.

Entrenched regional and partisan divisions have sidelined flood insurance reform for years. Those divides could intensify this month as lawmakers confront three tasks: formulating an aid package for hurricane victims, raising the debt ceiling and reauthorizing the flood insurance program, which is set to expire by Sept. 30.

Given Congress’ track record of action in 2016, few observers expect lawmakers to act on comprehensive flood insurance reform legislation pending in the House and Senate.

“Not likely to happen,” said Steve Ellis, vice president of Taxpayers for Common Sense, a budget watchdog group and longtime advocate for flood insurance reform.

“If we had a Lyndon B. Johnson in office, we could get these deals made,” said R.J. Lehmann, co-founder of the R Street Institute, a conservative think tank that also advocates for flood insurance reform. “Unfortunately, we don’t have an LBJ.”

Ellis and others say the best hope now is that lawmakers will pass a three- or six-month extension of the National Flood Insurance Program, effectively kicking the debate down the road.

“That is all fine and good,” said Jimi Grande, senior vice president of government affairs for the National Association of Mutual Insurance Companies. “But after all this time, no one should get a ribbon for a last-minute extension.”

Ellis, Lehmann and Grande are part of a strange-bedfellow coalition of taxpayer groups, insurance companies, environmental organizations and fiscal hawks. Over the last decade, these groups have lobbied to phase out subsidies for flood insurance and institute other reforms. Members of the SmartSafer coalition, they argue that the NFIP is both financially unsustainable and an enabler that encourages people to build and rebuild in dangerous floodplains.

The coalition has scored some successes, including a 2012 reauthorization that reduced FEMA’s debt and led to better mapping of flood hazards in states such as North Carolina. “The eyes get a little wider (in Washington) when lawmakers see a broad coalition of interests working together,” said Ellis.

Yet progress has been slow. Roughly five million households nationwide hold federal flood insurance policies. More than half of those are in three states — Florida, Texas and Louisiana — that enjoy sizable clout in Congress and have resisted jacking up premiums on policy holders. In 2014, they helped pass legislation to reverse some of the reforms that Congress had passed two years earlier.

In addition, powerful lobbying groups such as the National Association of Realtors and National Association of Homebuilders tend to oppose legislative changes that could make flood insurance significantly less affordable.

It used to be that support for flood insurance reform was more of a regional issue than a partisan issue, said Lehmann. But now, he said, even lawmakers of opposite parties wanting to reduce NFIP subsidies refuse to cooperate. “Our coalition faces challenges we never had when we started,” he said.

The partisan divide was evident in June, when the House Financial Services Committee passed a legislative package that would reauthorize the flood insurance program for five years, ramp up premiums, fund more flood mitigation programs and make it easier for private companies to offer their own insurance policies.

Democrats opposed several parts of that package, which was shepherded out of committee by Chairman Jeb Hensarling, a Republican who represents a portion of Texas away from the flood-prone coast. The package has since stalled amid concerns from House Republicans with large numbers of floodplain constituents, namely House Majority Whip Steve Scalise of Louisiana.

In the Senate, political opposites such as Marco Rubio of Florida and Elizabeth Warren of Massachusetts are supporting an NFIP reauthorization bill. To the dismay of flood reform advocates, it would cap annual premium increases, perpetuating the subsidies.

Each year on average, the flood insurance program pays out $3 billion more than it collects in premiums, a key reason it has accumulated a debt of nearly $25 billion. In 2005, FEMA borrowed more than $17 billion from the U.S. Treasury to pay claims related to Hurricane Katrina, and Hurricane Sandy triggered another $9 billion in borrowing.

Harvey’s flooding will add to the program’s woes. Texas’ Harris County, which includes Houston, has nearly 250,000 insurance policies. The hurricane has likely caused tens of billions of dollars in damages, a portion of which the NFIP will be obligated to cover.

For Congress, one immediate task is dealing with FEMA’s cash flow. As of Wednesday, FEMA had $1.4 billion in its major disaster account, which could quickly become exhausted as the agency pays for temporary housing and other costs. Tom Bossert, the White House Homeland Security advisor, said on Wednesday the president would be asking Congress for an initial supplement to FEMA’s budget “shortly,” followed by a larger disaster relief request.

If Congress doesn’t reauthorize the National Flood Insurance Program, the program would “lapse” and FEMA would be unable to sell more policies. The last time that happened, in 2010, an estimated 46,800 home sales transactions were interrupted or canceled, according to the National Association of Realtors.

Grande, the insurance industry lobbyist, said he seriously doubts that would happen again. “The politics of Harvey right now won’t allow any politician to let the insurance program lapse,” said Grande, who expects Congress to pass a simple short-term extension of the NFIP, as it has several times before.

But Lehmann fears that even a simple extension of the insurance program could get ground up in the sausage making that is about to ensue. To secure votes to raise the debt ceiling, congressional leaders may attempt to tie that measure to Hurricane Harvey aid, a possibility that is already drawing resistance.

Renewal of the NFIP could also be drawn into that deal making, he said.

“It should be easy to get a continuing resolution on flood insurance,” he said. “But these days, nothing is easy in Washington.”

Could These Robotic Kelp Farms Give Us An Abundant Source Of Carbon-Neutral Fuel?

By using elevators to grow kelp farther out in ocean waters, Marine BioEnergy thinks it can grow enough seaweed to make a dent in the fuel market.

Off the coast of Catalina Island near Los Angeles, a prototype of a new “kelp elevator”–a long tube with seaweed growing on it that can be moved up and down in the water to access sunlight and nutrients–will soon begin tests.

If the study works as hoped, the startup behind it, Marine BioEnergy, wants to use similar technology, driven by robotic submarines, to begin farming large tracts of the open ocean between California and Hawaii. Then it plans to harvest the kelp and convert it into carbon-neutral biocrude that could be used to make gasoline or jet fuel.

“We think we can make fuel at a price that’s competitive with the fossil fuel that’s in use today,” says Cindy Wilcox, who cofounded Marine BioEnergy with her husband Brian Wilcox, who manages space robotics technology in his day job at NASA’s Jet Propulsion Laboratory at California Institute of Technology.

Other biofuels, such as ethanol made from plant waste on corn fields, have struggled to become commercially viable, particularly after oil prices crashed. Solazyme, a company that planned to make biofuel from algae (and predicted in 2009 that it would be cost-competitive with fossil fuels within two or three years), ended up pivoting to make food products under the name TerraVia, and has now declared bankruptcy.

Kelp might have a chance of faring better. Unlike plants on land, it has little lignin or cellulose, fibers that make processing more difficult and expensive. In the right conditions, it can grow more than a foot a day, without the need for the irrigation or pesticides that might be used on land.

A key to the company’s concept is farming in the open ocean, where there is room to grow vast quantities of kelp. “You’re going to need a lot of kelp in order to make it cost-competitive with something like coal, fossil fuels, or natural gas,” says Diane Kim, a scientist at the University of Southern California’s Wrigley Institute for Environmental Studies, which is helping run the proof-of-concept study of Marine BioEnergy’s technology at Catalina. “In order to grow that much kelp, you really have to move outside the normal range of where kelp is found, which is along the coast.”

Kelp doesn’t typically grow in the open ocean since it needs both sunlight found near the surface of the water and nutrients that are found near the ocean floor (it also needs to anchor itself to something). In the 1970s, during the oil embargo, the U.S. Navy began investigating the possibility of farming kelp in the open ocean, pumping deep ocean water filled with nutrients to kelp anchored near the surface. But anchors often failed in ocean currents, and after the embargo ended, government interest faded.

In shallow coastal waters, where kelp naturally has access to both sunlight and nutrients, it’s a challenge to grow the seaweed at scale. Attempts to cultivate kelp gardens for food have succeeded only in relatively small areas.

But Brian Wilcox, who happens to be the son of the researcher who led the early work with the Navy, believed that kelp farming in the open ocean still might be possible. “My husband just kept thinking about this–here’s the right feedstock, and we just aren’t using it,” says Cindy Wilcox. He began considering a new approach: moving kelp up and down in a process he calls depth cycling, which gives the kelp access to both the nutrient-rich deep water and the light near the surface.

In 2015, Marine BioEnergy got a grant from the U.S. Department of Energy’s ARPA-E to test the proof of concept that is now in early stages with the Wrigley Institute researchers. Long lines stretch in a net-like pattern in the water, with kelp attached; the kelp can be raised in a saltwater nursery on land, where it is seeded into twine, and then later tied to the floating farm. At the end of the farm, underwater drones can pull the whole system up and down, both to maximize growth and to avoid ship traffic or storms near the surface. When the kelp is ready for harvest, the drones can tow the farm to a nearby ship.

The startup is also working with Pacific Northwest National Laboratory, which has developed a process to convert kelp to biocrude. The team is evaluating whether it’s more economical to make the crude on a ship (the processing center could fit on a container ship, powered by the process’s own fuel) or to bring the harvested kelp back to land.

The resulting fuel should be carbon neutral, because the carbon dioxide released when the fuel is burned will equal the carbon dioxide taken in by the kelp as it grows. Still, some argue that biofuels are not an ideal choice for powering transportation. Mark Jacobson, a Stanford University professor who has calculated that it’s feasible to get all energy needs from wind, hydropower, and solar, says that a car running on renewable electricity makes more sense than a car running on biofuel.

“I believe liquid biofuels for transportation (or any other combustion purpose) are a bad idea because they still require combustion, resulting in air pollution, which using electricity generated from clean, renewable sources for transportation avoids,” Jacobson says.

But air transportation is unlikely to run on electricity anytime soon, and–despite some predictions about the rapid demise of gas cars–biofuel could serve a practical purpose in the near term. The kelp biocrude, which could be processed at existing refineries, could also be used to make plastics that are typically made from fossil fuels.

The first step is proving that the kelp can grow and thrive as it’s pulled up and down. “Part of this project for the next two years is to really figure out, using the depth cycling strategy, if it works at all, and what are the parameters,” says Kim. “Theoretically, it should work.”

If the proof of concept is successful, Marine BioEnergy wants to go big: To cover 10% of the transportation fuel needs in the U.S., they’ll have to have enough kelp farms to cover an area of the Pacific roughly the size of Utah.

About the author

Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.

Resort town gets high-priced help to take aim at turbines

Josh Kurtz, E&E News reporter

Climatewire: Friday, October 20, 2017

The Block Island Wind Farm off Rhode Island's coast was the first U.S. offshore wind farm. There's a fight in Maryland over turbines. Dennis Schroeder/NREL/Flickr

This article was updated at 10:52 p.m. EDT.

Already targeted by an appropriations bill in Congress, two proposed wind energy projects off the coast of Maryland could face a sneak attack in the Free State's upcoming legislative session.

Officials in the resort town of Ocean City, Md., fearful that wind turbines will damage their lucrative tourist economy, have hired a plugged-in Annapolis lobbyist to help them push the two projects farther offshore. The lobbyist, Bruce Bereano, said in an interview this week that he was hired to focus on Congress and the federal approval process.

But some stakeholders in the long battle to bring offshore wind energy to Maryland are skeptical. Bereano, by his own admission, has limited experience lobbying Congress. He is, however, the most successful lobbyist in the history of the Maryland State House.

"You don't hire Bruce Bereano to go to Capitol Hill when he has spent 40 years in Annapolis," said Mike Tidwell, executive director of the Chesapeake Climate Action Network.

So the inevitable question becomes: Will there be a push in the 2018 Maryland General Assembly session, which begins Jan. 10, to kill or alter the offshore wind projects? Could a symbolic election-year "messaging" bill emerge, even if it has no chance of passing? The answers to these questions are fraught with political implications.

Ocean City Mayor Rick Meehan could not be reached for comment yesterday about the town's recently inked $65,000-a-year contract with Bereano. But Ocean City Councilman Tony DeLuca seemed to hint at the reason for Bereano's hiring when he told Ocean City Today last month, "He knows everyone in Annapolis, he has the reputation, and he's the right man for the job."

At issue are two wind energy projects the Maryland Public Service Commission approved in May. It enabled two companies, U.S. Wind Inc. and Skipjack Offshore Energy LLC, to collectively construct 368 megawatts of capacity.

The decision to approve both projects was a surprise to most stakeholders — including the companies themselves, which assumed they were competing against each other. U.S. Wind was allowed to build 61 turbines 12 to 15 miles offshore, roughly parallel to downtown Ocean City. Skipjack's 15-turbine project would be slightly farther to the north, in waters at the Maryland-Delaware border, 17 to 21 miles offshore.

Both companies are working within the parameters of Maryland legislation that passed in 2013, approving offshore wind energy. According to a Public Service Commission consultant, Maryland residential ratepayers would pay no more than $1.40 a month on their utility bills to help subsidize the projects.

By most accounts, Ocean City officials were not very active during the three-year debate that resulted in the enabling state legislation, and they weren't especially vocal during most of the PSC approval process. But then, just days before the PSC ruling, they saw an artist's rendering of what U.S. Wind's turbines would look like from the beach, and they freaked out.

Rep. Andy Harris (R-Md.) came quickly to their aid. Harris, who represents Ocean City in Congress, attached an amendment to a House appropriations bill in July that said if a Maryland wind project isn't at least 24 miles from the shore, the federal government cannot spend money to evaluate it (Energywire, July 27).

"Ocean City's economy heavily relies on its real estate and tourism sectors, and there has not yet been a proper examination on whether construction of these wind turbines will have a negative economic impact on the community," Harris said at the time. "If construction of these turbines too close to the shoreline will reduce property value or tourism, then the turbines may cause more issues than they solve."

In response, Paul Rich, U.S. Wind's director of project development, said Harris' amendment "effectively ... kills our project."

Coincidentally, Harris held a fundraiser in Ocean City on Wednesday evening at Seacrets, a popular entertainment complex on the bay.

'A very strong and emotional issue'

Whether Harris' amendment remains in the final omnibus spending package is anybody's guess. But it provides a marker for Ocean City officials.

"They are not opposed to wind power as one of several renewable energy sources, but they strongly and very solidly oppose the placement and location of the wind turbines, because by all accounts these turbines are clearly visible and noticeable on the beaches of Ocean City as well as when you're on any floors of buildings and hotels and condominiums," Bereano said.

Bereano noted that Dominion Energy is installing two wind turbines on an experimental basis off the coast of Virginia Beach — 27 miles from the shore.

"They need to be moved to a location where they are not visible at all from Ocean City," he said. "This is a very strong and emotional issue."

Bereano said he had been hired to monitor the "significant federal [approval] process" that follows the state PSC's approval of the two sites.

A spokesperson for the Bureau of Ocean Energy Management this week said there is no timetable yet for the agencies to review the two proposals.

In his long career as a lobbyist, Bereano has barely prowled the halls of Congress or federal regulatory agencies professionally. He conceded that he'd spoken to members of the Maryland congressional delegation "three or four times" on behalf of in-state clients who had issues before the federal government but had otherwise not spent much time on Capitol Hill beyond internships in the Senate when he was in college and law school.

Which is why Bereano's next moves will be watched closely by every stakeholder.

Bereano is a legendary and controversial figure in Annapolis. He was the first million-dollar earner among Maryland lobbyists, achieving that milestone in the early 1990s, and he was the second-highest-paid lobbyist in the state between Nov. 1, 2016, and April 30 of this year, clearing more than $1.6 million.

Bereano is also a convicted felon.

He was found guilty of federal mail fraud in 1994 after overbilling clients and funneling the money into a political action committee that provided campaign contributions to lawmakers. He served a 10-month sentence and was later barred from practicing law in Maryland and Washington, D.C. But that ban did not extend to lobbying activities, and Bereano kept his robust business going, even working for his clients from a pay phone outside his Baltimore halfway house.

Bereano remains an energetic and ubiquitous presence in the State House and at political events throughout the state. He is also an enthusiastic champion of Gov. Larry Hogan (R) — who appointed Bereano's son to a district court judgeship last year.

So if Bereano is determined to find a state lawmaker to introduce a bill benefiting one of his clients, he will. Whether he has the juice to slow or halt the development of offshore wind is another matter entirely, given the fact that the Public Service Commission has already acted.

Rich, the U.S. Wind executive, did not respond to a phone message yesterday.

State Delegate Dereck Davis (D), the chairman of the powerful House Economic Matters Committee in Annapolis, which has jurisdiction over energy issues, said yesterday that he had not yet discussed the matter with Bereano and hadn't heard of any pending legislation. Davis said he sympathized with the worries of Ocean City leaders but noted that "the state is committed to the clean energy industry."

Davis said he was open to addressing the town's concerns but wasn't sure what could be done legally. "We've got two very important interests here," he added.

Whether or not a bill is introduced in the Legislature next year, wind energy is popular in Maryland, and advocates believe they can defeat any measure in the State House to slow it.

A poll of 671 Maryland residents by Goucher College in Baltimore last month found that 75 percent of those surveyed said seeing wind turbines from the beach would make "no difference" when deciding whether to vacation in Ocean City. Eleven percent said seeing windmills on the horizon would make them "less likely" to vacation in Ocean City, while 12 percent said it would make them "more likely" to stay there.

"It's a big-tent coalition that got this moving in the first place, and it's only gotten bigger," said Tidwell of the Chesapeake Climate Action Network.

'Is this the right issue?'

But with an election year looming, there are several political cross-currents at play in Maryland.

Hogan is up for re-election, and while he's popular, he could be vulnerable if 2018 turns into a big Democratic year nationally.

Hogan has not said much publicly about the wind energy proposals off Ocean City, but he has been generally supportive of environmental measures — though not always to the satisfaction of green groups and Democrats. Hogan embraced a measure to ban hydraulic fracturing in Maryland earlier this year, and supporters note that Hogan's appointees on the PSC joined the appointees of former Gov. Martin O'Malley (D) to unanimously approve both wind projects. That included PSC Commissioner Anthony O'Donnell, a former state House minority leader who voted against O'Malley's wind energy legislation when he served in the General Assembly.

Some stakeholders said they wouldn't be surprised to see Republicans introduce legislation next session to alter the wind energy law — even if it doesn't have a prayer of passing — as a slap to O'Malley, who continues to lurk on the fringes of the 2020 Democratic presidential conversation.

One of Hogan's top priorities for 2018, beyond re-election, is to get more Republicans elected to the state Senate. Democrats hold a robust 33-14 majority in that chamber, and it takes 29 Senate votes to override a governor's veto. Republicans are working hard to pick up the five seats they'll need to forestall any veto overrides, and state Sen. Jim Mathias (D), who represents Ocean City, is one of their top targets. In 2014, Harris, the congressman, funneled tens of thousands of dollars into efforts to defeat Mathias in his conservative Eastern Shore district.

Mathias — an ex-Ocean City mayor and former small business owner there — voted to authorize the wind turbines in 2013, after opposing the measure the year before. He fully expects the wind turbines to be a major issue during his tough re-election fight against state Delegate Mary Beth Carozza (R), a former congressional aide and George W. Bush administration official.

Carozza, who did not respond to a request for comment yesterday, wasn't in the Legislature when the enabling legislation for offshore wind passed. But Republicans could introduce a bill on wind energy just to force another uncomfortable vote for Mathias and other Democrats seeking re-election in conservative districts.

"Is this going to become an issue against me?" Mathias mused. "Everything is going to become an issue. Is this the right issue?"

Real estate industry blocks sea-level warnings that could crimp profits on coastal properties

By Stuart Leavenworth

September 13, 2017 3:35 PM

NAGS HEAD, N.C. All along the coast of the southeast United States, the real estate industry confronts a hurricane. Not the kind that swirls in the Atlantic, but a storm of scientific information about sea-level rise that threatens the most lucrative, commission-boosting properties.

These studies warn that Florida, the Carolinas and other southeastern states face the nation’s fastest-growing rates of sea level rise and coastal erosion — as much as 3 feet by the year 2100, depending on how quickly Antarctic ice sheets melt. In a recent report, researchers for Zillow estimated that nearly 2 million U.S. homes could be literally underwater by 2100, if worst-case projections become reality.

This is not good news for people who market and build waterfront houses. But real estate lobbyists aren’t going down without a fight. Some are teaming up with climate change skeptics and small government advocates to block public release of sea-level rise predictions and ensure that coastal planning is not based on them.

“This is very concerning,” said Willo Kelly, who represents both the Outer Banks Home Builders Association and the Outer Banks Association of Realtors and led a six-year battle against state sea-level-rise mapping in North Carolina. “There’s a fear that some think tank is going to come in here and tell us what to do.”

The flooding and destruction caused by Hurricanes Irma and Harvey have again highlighted the risks of owning shoreline property. But coastal real estate development remains lucrative, and in recent years the industry has successfully blocked coastal planning policies based on projections of ever-higher oceans.

Last month, President Donald Trump rescinded an Obama-era executive order that required the federal government to account for climate change and sea level rise when building infrastructure, such as highways, levees and floodwalls. Trump’s move came after lobbying from the National Association of Home Builders, which called the Obama directive “an overreaching environmental rule that needlessly hurt housing affordability.”

In North Carolina, Kelly teamed up with homebuilders and Realtors to pass state legislation in 2012 that prevented coastal planners from basing policies on a benchmark of a 39-inch sea-level rise by 2100.

The legislation, authored by Republican Rep. Pat McElraft, a coastal Realtor, banned the state from using scientific projections of future sea level rise for a period of four years. It resulted in the state later adopting a 30-year forecast, which projects the sea rising a mere 8 inches.

Stan Riggs, a geologist who served on the North Carolina science panel that recommended the 39-inch benchmark, said the 2012 legislation was a blow for long-term coastal planning.

“The state is completely not dealing with this,” said Riggs, a professor of geology at East Carolina University and author of The Battle for North Carolina’s Coast. “They are approaching climate change with sand bags and pumping sand onto beaches, which is just a short-term answer.”

Todd Miller, executive director of the North Carolina Coastal Federation, agrees the state is not doing enough to prepare for climate change. But he says the power play by builders and real estate agents may have backfired, drawing national attention — including being spoofed by Stephen Colbert — to an otherwise obscure policy document.

“The controversy did more to educate people about climate issues than if the report had just been quietly released and kept on the shelves,” said Miller, who heads an environmental organization of 15,000 members.

In Texas, a similar attempt to sideline climate change science also triggered blowback. In 2011, the Texas Commission on Environmental Quality, then under the administration of Gov. Rick Perry, attempted to remove references to climate in a chapter in “State of the Bay,” a report on the ecological health of Galveston Bay.

The chapter, written by Rice University oceanographer John B. Anderson, analyzed the expected impacts of sea-level rise and described rising seas as “one of the main impacts of global climate change.” When TCEQ officials attempted to edit out such references, Anderson and other scientists objected.

“The whole story went viral,” he said in a recent telephone interview. “Then they backed off.”

Since that time, Texas officials haven’t interfered in other scientific reports, said Anderson, but neither have they consulted with academics on how to manage rising seas, erosion and the prospect of stronger storms.

“Texas is pretty much in a state of climate change denial,” Anderson said. “There’s very little outreach to the research community to address the challenges we face.”

Anderson’s lament is one shared by other scientists in the Southeast, where Republicans control nearly all state houses and are generally dismissive of the scientific consensus on climate change.

In Florida, where Republican Rick Scott is governor, state environmental employees have been told not to use terms such as “climate change” or “global warming” in official communications, according to the Florida Center for Investigative Reporting.

In South Carolina, the state Department of Natural Resources in 2013 was accused of keeping secret a draft report on climate change impacts. In Texas, the 2016 platform of the state Republican Party states that climate change “is a political agenda promoted to control every aspect of our lives.”

Anderson said it’s surprising that sea level rise is sparking controversy because, in his view, it is the least-contested aspect of climate change science.

It’s well accepted, he said, that global temperatures are rising, and that as they rise, ocean water expands, a process called “thermal expansion.” Melting glaciers and ice sheets also contribute to rising sea levels, as does coastal subsidence caused by natural forces and manmade activities, such as excessive groundwater extraction.

According to the National Academy of Sciences, sea level has risen along most of the U.S. coast the past 50 years, with some areas along the Atlantic and Gulf coasts seeing increases of more than 8 inches. By 2100, average sea levels are expected to rise 1.6 to 3.3 feet, with some studies showing 6 feet of rise, according to the NAS.

Last year, Zillow matched up its database of 110 million homes nationwide with maps prepared by the National Oceanic and Atmospheric Administration showing a projected sea level rise of 6 feet by the end of the century.

Zillow found that, without efforts to mitigate against sea level rise — such as building floodwalls or elevating structures — some 1.9 million homes were at risk, worth $882 billion. This included 934,000 houses in Florida, 83,000 in South Carolina, 57,000 in North Carolina and 46,800 in Texas.

“This isn’t just an academic issue,” said Svenja Gudell, chief economist for Zillow, noting that some climate skeptics pushed back against the report, but public response was largely positive. “This can have a really big impact on people and their homes and livelihoods.”

With more than 3,600 miles of coastline, tidal shoreline and white-sand beaches, Eastern North Carolina is particularly vulnerable to sea-level rise. It also supports a burgeoning real estate industry, serving retirees and vacationers who want to be near the water.

In 2010, a group called NC-20 — named for the 20 coastal counties of North Carolina — was formed to advocate for this industry and others in the region. It originally protested new state stormwater rules. Then it took aim at projections of a one-meter sea level rise, calling it “a myth promoted by man-made global warming advocates on expectations of melting ice.”

Kelly, the group’s current president, said NC-20 was alarmed by a 2010 state document that said the 39-inch benchmark should be used “for land use planning and to assist in designing development and conservation projects.” The coalition also feared it could lead to a state website where prospective home buyers could research future water levels.

“That is nothing more than a SimCity video game,” Kelly fumed during a recent interview at the Outer Banks Home Builders Association office near Nags Head. “It all depends on the inputs you put into it.”

With little state guidance on climate change and flooding threats, some local governments are taking matters into their own hands. In Florida, the Tampa Bay Regional Planning Council produced a report this year on the multi-billion-dollar impacts of a projected 3-foot sea level rise by 2060.

In Eastern North Carolina, geologist Riggs resigned from the N.C. Coastal Resources Commission’s science panel in 2016, citing legislative interference. He has since teamed up with local governments on the Pamlico and Albemarle Sounds to address problems of flooding, windblown tides and saltwater intrusion, a threat to local farming.

John Trent, chair of the Bertie County Commission, said that he has been collaborating with Riggs on a range of projects to prevent flooding in Windsor, the county seat. Three severe floods have hit Windsor since 1999, and Trent says he is wary about what sea-level rise could bring over the long term.

“Absolutely it concerns us,” said Trent, a transplant from Florida who moved to the county in 2000. “We are the lowest of the low in this region.”

Further east, the Hyde County town of Swan Quarter has built a 17-mile dike around homes and farms to protect $62 million in flood-threatened property. The dike helped prevent windblown flooding during recent storms, but county officials have some concerns about the future.

“In Hyde County, you’d likely get a consensus that sea-level rise is real and that it is happening,” said Daniel Brinn, a resource specialist with the Hyde County Soil and Water District. “You might get an argument on why it is happening, but that is about it.”

Anderson, the Rice University oceanographer, said that coastal homeowners nationwide should pay attention to sea-level rise projections, even if they live in a home that has never flooded before.

During Hurricane Harvey, Anderson’s home in Houston was inundated with roughly a foot of water, the first time that had ever happened. “Given what I’ve been through in recent weeks, I am acutely aware of how important one foot can be,” he said.

Corrected: In an earlier version of this story, the final paragraph referenced the wrong hurricane.

Stuart Leavenworth: 202-383-6070, @sleavenworth

Read more here: http://www.mcclatchydc.com/news/nation-world/national/article173114221.html

====================================================================

Renewable diesel use in California moves to fast track

By Isha Salian

September 14, 2017 Updated: September 14, 2017 8:03am

Photo: Amtrak machinist Darrion Brown takes a sample of renewable diesel from the fuel tank on a Capitol Corridor locomotive to be tested. (Lea Suzuki, The Chronicle)

Renewable diesel sounds like a contradiction in terms. But planners for the Capitol Corridor trains, which run between the Bay Area and the Sacramento region, see it as a way to slash climate-warming emissions.

“It’s pretty exciting for our industry,” said Jim Allison, manager of planning for the Capitol Corridor Joint Powers Authority. On Aug. 28, a train between Oakland and Auburn began running entirely on the fuel from Golden Gate Petroleum of Martinez — part of a test that, if successful, could herald its use throughout the Capitol Corridor system and in trains statewide.

The Bay Area is rapidly becoming a center for renewable diesel, which can be made from vegetable oils, restaurants’ oil waste or animal fats — material known in the industry as biomass. Unlike biodiesel, which has to be mixed in with petroleum diesel, the new fuel can go into a tank at 100 percent strength. The cities of San Francisco and Oakland and the San Jose Unified School District have begun using it in their vehicle fleets, and operators of San Francisco Bay ferries are looking at it too.

“We can sell every drop of renewable diesel we make,” said Eric Bowen, head of corporate business development for the Renewable Energy Group. The company sells the fuel mostly to the California market, he said. Its Louisiana plant can produce 75 million gallons of renewable diesel each year, and it is looking to expand its facilities.

Demand for renewable diesel has soared. In 2011, 2 million gallons of the fuel were used in California, according to the Air Resources Board, which sets rules for emissions and climate change in the state. In 2016, it was 250 million gallons — about 7 percent of all liquid diesel used.

California’s interest comes largely because of a policy called the Low Carbon Fuel Standard, which requires a 10 percent reduction in the carbon intensity of transportation fuels by 2020. Simon Mui, senior scientist at the Natural Resources Defense Council, said that depending on what the renewable diesel is made from, it can cut lifecycle greenhouse gas emissions by 15 to 80 percent. (Lifecycle emissions include growing the raw materials that turn into fuel; transporting and processing it; and finally burning it.) Renewable diesel also reduces fine particle pollution and other types of emissions, like nitrogen oxides and carbon monoxide, Mui said.

Some experts question how much the renewable diesel market can grow.

“The more we study biofuels, the less clear it is there’s very much of a sustainable supply,” said Daniel Kammen, a UC Berkeley professor who helped write California’s Low Carbon Fuel Standard. Biofuels are always less efficient than electric vehicles, he said.

Kammen studies carbon emissions using lifecycle analysis, which assesses the ecological impact of the entire process. From that perspective, “it really doesn’t matter the source of your material,” he said. “Getting large supplies of a truly sustainable biomass is a challenge.”

Proponents of renewable diesel point out that while passenger vehicles (which mostly run on gasoline, not diesel) seem likely to go electric in the near future, a more sustainable diesel may be a good option to reduce emissions for larger, heavier vehicles — at least until more powerful batteries are developed to make electric trucks and buses cost-effective.

“We don’t meet our state goals unless we have a full array of those electrified fleets as well as liquid-fuel-based fleets,” Mui said. “The key here is enabling all of these technologies to be on a level playing field, so a winner or winners can be determined by the market.”

Sam Wade, chief of the transportation fuels branch at the Air Resources Board, said that while heavy-duty vehicles may eventually switch to electric power, “there’s not that many applications that are fully viable today. We see (renewable diesel) as a nice near-term opportunity.”

Neste, a Finnish company that is the world’s leading producer of renewable diesel, makes between 800 million and 900 million gallons each year. The company said a major limit to growth is competing with the traditional diesel industry. “They’ve got an 80-year head start on us,” head of North American public affairs Dayne Delahoussaye said.

Not every company is able to build renewable diesel into a successful business. South San Francisco nutrition company TerraVia, formerly called Solazyme, struck an agreement in 2015 with UPS to supply its trucks with renewable diesel derived from algae oil. But the company, which sold its assets in a bankruptcy sale this week, had shifted away from renewable fuels partly, it said, due to a decline in crude oil prices. (UPS continues to use renewable diesel in some of its trucks, mostly in California.)

Propel Fuels, a Sacramento company that operates 32 retail locations selling renewable diesel, says it keeps its prices competitive with petroleum diesel. “We don’t think a market exists for premium-price renewable diesel,” said CEO Rob Elam.

Elam said Propel has a large presence in disadvantaged communities, serving customers who want to choose cleaner fuels but cannot afford electric vehicles. Other renewable diesel customers pick the fuel for higher power and mileage, he said.

Richmond resident James Clappier has been using renewable diesel from Propel Fuels in his 1994 Dodge Ram 2500 since the company began stocking the fuel in 2015. “I need a large truck for my business, but feel bad for using such an inefficient vehicle,” he said.

Capitol Corridor trains have been testing diesel alternatives for a few years, and its switch last month to 100 percent renewable diesel for the Oakland-Auburn run makes it the first train in the state to run on the fuel, Allison said. If further tests go well, all Capitol Corridor trains could switch over to renewable diesel by next summer, according to Dean Shepherd, manager of mechanical services.

Allison estimates that shifting to renewable diesel in all of its locomotives, which each need about 70 gallons of fuel per hour, would reduce the trains’ greenhouse gas emissions by two-thirds. He is optimistic that testing will go well.

“If there were any issues, we’d see them quickly,” he said.

Isha Salian is a San Francisco Chronicle staff writer. Email: business@sfchronicle.com Twitter: @Salian_Isha

==========================================================

Cost of U.S. Solar Drops 75 percent in Six Years, Ahead of Federal Goal

A 250-MW solar project on the Moapa River Indian Reservation of the Moapa Band of Paiute Indians in southern Nevada. It is the first utility-scale solar project on tribal land in the U.S. First Solar/DOE

The Trump administration has announced that a federal goal to slash the cost of utility-scale solar energy to 6 cents per kilowatt-hour by 2020 has been met early. The goal, set by the Obama administration in 2011 and known as the SunShot Initiative, represents a 75 percent reduction in the cost of U.S. solar in just six years. It makes solar energy cost-competitive with electricity generated by fossil fuels.

The Department of Energy attributed achieving the goal so quickly to the rapidly declining cost of solar hardware, such as photovoltaic panels and mounts. And it said it will next focus its efforts on addressing “solar energy’s critical challenges of grid reliability, resilience, and storage,” according to a press release.

The DOE also announced $82 million in new funding for solar research, particularly for research into “concentrating solar” — which uses mirrors to direct sunlight to generate thermal energy — and into improved grid technology. It set a new goal to reduce the cost of solar even further: 3 cents per kilowatt-hour by 2030.

==========================================================

Geothermal energy: Why hasn't it caught on yet?

Despite being one of the lowest-cost and most reliable renewable energy sources, heat from the Earth is rarely harnessed outside Iceland. But leaders meeting in Italy this week are trying to change that.

One of the most famous tourist sites in Iceland is the Blue Lagoon, a man-made lake close to Reykjavík Airport that is fed and heated by a nearby geothermal power plant.

Such power plants are common across Iceland, but little-known in the rest of the world. For many of the swimmers in the lagoon, it is the first time they have ever heard of this power source.

Political leaders from 25 countries gathered at a sumptuous palace in Florence, Italy, this week are hoping to change that.

Yesterday, government ministers and 29 partner institutions from the private sector signed the Florence Declaration, committing to a 500-percent increase in global installed capacity for geothermal power generation by 2030.

Although that may sound like a lot, it's starting from a low baseline. Geothermal energy today accounts for just 0.3 percent of globally installed renewable energy capacity.

This is despite its huge potential - for both lowering greenhouse gas emissions and saving money. Geothermal is one of the lowest-cost energy sources available, after startup costs are met. The global potential for geothermal is estimated to be around 200 gigawatts.

"Geothermal's vast potential is currently untapped," said Italian Environment Minister Gian Luca Galleti at the Florence summit. "We must develop new technologies and encourage new investments to ensure we cover this gap."

The summit was organized by the Global Geothermal Alliance, which was launched at the United Nations climate summit in Paris in 2015.

Run by the International Renewable Energy Agency (IRENA), it is bringing governments and companies together to try to speed up deployment. But significant hurdles remain.

To get heat from the layer of hot and molten magma under the Earth, water is pumped down an injection well. It filters through cracks in the hot rock, then returns under pressure via the "recovery well" in the form of steam. That steam is captured and used to drive electric generators or heat homes.

The ring of fire

The two main hurdles have been geographic and financial.

Italy wanted to host this week's summit because it is keen to increase its use of geothermal energy. Delegates were able to tour Italy's first-ever geothermal energy production plant in Lardarello, not far from Florence.

Italy has had the historic misfortune of being situated above some very hot earth - resulting from tectonic activity that causes earthquakes and volcanic eruptions. But that heat underground can also be harnessed to generate power.

Across the world, 90 countries possess proven geothermal resources with the potential to be harnessed, and they are mostly located in regions of tectonic activity. That means that the potential is low in most of Europe, but huge in the Asia-Pacific region.

Yet capital for funding projects in this region has been hard to come by, especially for projects at this scale.

"Right now, we may only be harvesting 6 percent of proven geothermal energy potential," said IRENA Director-General Adnan Z. Amin.

He called this week's Florence Declaration "a milestone that, in the strongest possible terms, demonstrates renewed will to unlock the potential of geothermal."

More money, more transparency

Following the signing of yesterday's declaration, IRENA released a new report, which found that access to capital for surface exploration and drilling remains the main barrier to geothermal development.

The report also found that more transparent government regulations that avoid delays are needed to provide a stable environment for developers and investors.

Representatives of African Union countries, as well as the AU's commissioner for infrastructure and energy Amani Abou-Zeid, were at the Florence summit pledging to provide this transparency. Abou-Zeid said the technology can help Africa decarbonize, while also providing jobs.

"Geothermal energy is emerging as a hidden gem of Africa's renewable energy resources and we must work together, across nations, to ensure this resource achieves its potential."

One country in which investment commitments are not lacking is Indonesia, which is planning 4,013 megawatts of additional capacity in the coming years. This puts it far ahead of all other countries. The United States, Turkey and Kenya follow, with a little over 1,000 megawatts of additional capacity each planned.

Amin says such government commitments can encourage private investment in developing these energy sources, which is capital-intensive at the start. "If we can identify and implement mechanisms that deliver a greater level of certainty to investors and developers, then we will move beyond meaningful dialogue to decisive action," he said.

If the countries gathered in Florence maintain their commitments, sites such as the Blue Lagoon may not be unique to Iceland any more.

============================================================================

Study: Food scrap recycling can work in cities of any size, though PAYT helps

By Cole Rosengren, @ColeRosengren

Published Sept. 11, 2017

The financial and logistical challenges of starting a food scrap diversion program can seem daunting for smaller cities. A newly published study from MIT shows that no one characteristic is a prerequisite for taking the leap.

The study, published in the October 2017 edition of the journal Resources, Conservation and Recycling, was written by a team of three researchers from MIT's Department of Urban Studies and Planning and Department of Materials Science and Engineering. They set out to understand where food scrap diversion programs were happening in mid-size and large cities. Their results were based on 115 responses from cities with between 100,000 and 1 million people — about 28% of the U.S. population.

The researchers had seen a growth in awareness about food scrap programs (FSP), but noted that little research had been done on what type of cities were pursuing them and whether this had any similarities to the growth of curbside recycling in the 1990s.

"The spread of recycling offers an analogue to current trends in municipal food scrap diversion programs. Today, most U.S. municipalities offer some kind of recycling infrastructure, whether through curbside collection or drop-off facilities. Food scrap recycling, on the other hand, is still in its early days," they wrote.

Where it's happening

Based on the survey, conducted in 2015, the MIT researchers found that 40% of respondents had some type of existing FSP. The survey defined an FSP as "educational programs, including home visits by composting experts; free or discounted barrels for home composting; free or discounted bins for food scrap collection in the home; drop-off facilities; and curbside collection."

Among these respondents, 36% had educational programs. Free or discounted backyard bins were the second most common at 19%, and 18% of cities had curbside collection. Some respondents also had more than one type of FSP in place. The researchers chose to keep the identities of these cities anonymous, though they did offer breakdowns by size and EPA region within their research.

“Within these cities, this particular size range, it really seems like being on the smaller end or being on the larger end doesn’t make a huge difference,” said Lily Baum Pollans, one of the paper's co-authors, who is now an assistant professor of urban policy and planning at Hunter College. “That was a little bit of a surprise for us.”

In terms of which cities had any type of FSP, population size and socioeconomic status weren't found to be significant drivers. Out of various factors that have been found to correlate with successful recycling programs in previous research — median income, educational attainment, density, age, and housing characteristics — only educational attainment was found to have any connection.

As for curbside collection, cities with higher population densities were more likely to offer the service in some form.

Why it's happening

While no one type of city was a clear fit for having these programs, the MIT team did find that certain waste policies played a role.

Cities that could build on existing infrastructure and already had waste reduction policies in place, such as pay-as-you-throw (PAYT) programs, were more likely to have FSPs of some kind. The existence of yard waste collection programs and PAYT together correlated with other FSPs being in place as well.

Looking at cities with curbside programs, the researchers found that PAYT was also a factor. Yard waste collection wasn't as relevant because it is often seasonal. Cities with source-separated recycling were less likely to have curbside organics, possibly because if they hadn't followed the more widespread single-stream trend they wouldn't be prioritizing organics collection yet. Interestingly, the presence of a municipal diversion rate goal wasn't found to be a major factor.

In the cities that offered curbside programs, the researchers didn't delve into whether they were being serviced by municipal or private collection.

“From more general research we haven’t actually observed much of a pattern," said Pollans. “What really matters is that there’s a public sector agency taking leadership around the issue."

Next steps

According to this study, making food scrap recycling a priority is possible for any city, regardless of its location or condition.

"There is no magic here: regardless of most socio-economic characteristics, any city could move in this direction if it is willing to make the commitment," concluded the authors.

As noted in the study, and during Waste Dive's conversation with Pollans, determining the success of these programs is a different question. The fact that many of the mid-size respondents only had educational programs could be taken as a sign that they're not doing enough. Though just because a city has curbside collection doesn't mean residents are participating fully or correctly. Further research is also underway about the politics and policies behind starting composting programs and broader waste management programs in general.

“We have a lot more work to do," said Pollans.

When factoring in larger cities not included in this study, the number offering curbside collection becomes much larger. BioCycle's latest national survey identified 198 communities with curbside service in 19 states. A new edition of the survey is currently being finalized and may show that number has grown. So far this year, cities such as Falls Church, VA and Boise, ID have launched new programs. As existing programs expand in larger cities such as New York, Los Angeles and Austin, more people are gaining access on a regular basis.

Opinions may differ on the most cost-effective way to address food waste — and state or local regulations are still big drivers for where programs are happening — but it's clear that awareness is growing at a municipal level around the country.

More information:

http://news.mit.edu/2017/study-food-waste-recycling-policy-key-0817

Garbage From Irma Will Fuel Florida’s Power Grid

As long as they’re throwing stuff away, many counties find, they may as well make electricity out of it.

By Eric Roston

September 18, 2017, 4:00 AM EDT

When it comes to garbage, geography is destiny.

Look at Texas and Florida, recovering from Hurricanes Harvey and Irma. Homeowners and businesses not incapacitated by the storm have begun the arduous and emotional work of separating destroyed possessions and materials by type and placing them curbside. Cities have begun the intimidating logistics of picking it up and transporting it to its final destination.

And what is that destination? Texas’s waste-disposal strategy takes advantage of the state’s vast land. Harris County alone, which includes Houston, has 14 active landfills.

Florida, by contrast, is a peninsula with a longer coastline than any state other than Alaska, and much less room for trash. Many coastal Florida counties burn theirs, with waste incinerators particularly common around the state’s populous southern lip and up the Gulf Coast. It’s a two-fer. Combustion reduces the solid waste to ash, and the heat that’s produced runs steam generators. Much of the waste left in Irma’s path will burn, the energy released adding to local communities’ electricity.

Florida burns a disproportionate amount of U.S. trash. Ten “waste-to-energy” plants turned 4.5 million tons of trash into 3.5 million megawatt-hours statewide in 2016. That’s about 2 percent of the state’s overall power, and a large majority of its renewable energy. Burning trash makes up less than 0.5 percent of overall U.S. electricity production, according to the Energy Information Administration.
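As a rough sanity check on those figures, the arithmetic can be reproduced in a few lines of Python. The statewide generation total used below (on the order of 230 million megawatt-hours a year) is an assumption for illustration, not a number from the article.

```python
# Back-of-envelope check of Florida's 2016 waste-to-energy figures.
trash_tons = 4.5e6            # tons of trash burned in waste-to-energy plants
energy_mwh = 3.5e6            # megawatt-hours generated from that trash
state_total_mwh = 230e6       # assumed annual statewide generation (~230 TWh), illustrative only

mwh_per_ton = energy_mwh / trash_tons            # ~0.78 MWh recovered per ton of trash
share_of_state = energy_mwh / state_total_mwh    # ~1.5%, the same ballpark as "about 2 percent"

print(f"{mwh_per_ton:.2f} MWh per ton, {share_of_state:.1%} of assumed statewide generation")
```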

But the main point is to make stuff disappear. Incineration reduces the solid mass of trash by up to 90 percent. The leftover ash can then be more efficiently dumped in a landfill where space is precious. In 2016, Florida burned 12 percent of its trash, recycled 44 percent, and sent another 44 percent to landfills. Even so, Florida has historically been an exporter of garbage to other states, ranking 10th largest in a 2007 study by the Congressional Research Service.

Waste-fueled power plants were built mostly in the 1980s and early 1990s, encouraged by a 1978 federal law. Environmental scrutiny in later years led to widespread retrofits of pollution-control technologies to remove mercury and dioxin.

Burning stuff up doesn’t make it entirely disappear, even once the ash is disposed of. Part of the mass of the original garbage is converted into carbon dioxide and released into the atmosphere. But this pollution may beat the alternative. If the same volume of waste were tossed into landfills, eventual emissions of methane, a more powerful greenhouse gas, would be even worse for the atmosphere.

That makes garbage an attractive, if marginal, alternative to fossil fuels in some areas. The Intergovernmental Panel on Climate Change, the authoritative scientific group backed by the United Nations, supported waste-to-energy plants as a low-carbon technology back in 2007. By the time of its next report, in 2014, costs for such plants had fallen 15 percent globally.

Before Irma hit, Florida Department of Environmental Protection staff worked with local governments and facilities to anoint disaster-debris sites, sort of a purgatory for trash before it’s moved to incinerators. (After storms, the DEP coordinates with multiple state and federal agencies, including the U.S. Army Corps of Engineers, FEMA, and the Florida Fish and Wildlife Conservation Commission.) Fuel for these incinerator power plants stood at high levels before Irma struck. In anticipation of the hurricane, Miami residents, for example, had doubled the amount of stuff they threw out in the days before it arrived. Already, some county authorities are seeing a spike in solid waste.

“We’ve seen about a 20 percent increase,” said Kimberly Byer, solid waste director for Hillsborough County, which includes Tampa. “That’s just an initial increase, and it’s only been a couple of days.”

The county’s 565,000 tons of trash a year produces about 45 megawatts of power, or enough to run about 30,000 homes. “It pays for itself,” Byer said of Hillsborough’s waste-to-energy facility.
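Those Hillsborough County numbers are easy to cross-check. A minimal sketch, assuming (as a benchmark not given in the article) that a typical U.S. household uses on the order of 10,000 kWh of electricity a year:

```python
# Consistency check of the Hillsborough County waste-to-energy figures.
trash_tons_per_year = 565_000
plant_output_mw = 45
homes_served = 30_000
hours_per_year = 8_760

annual_mwh = plant_output_mw * hours_per_year        # ~394,000 MWh generated per year
mwh_per_ton = annual_mwh / trash_tons_per_year       # ~0.7 MWh per ton, close to the statewide figure
kwh_per_home = annual_mwh * 1_000 / homes_served     # ~13,000 kWh per home per year

print(round(annual_mwh), round(mwh_per_ton, 2), round(kwh_per_home))
```

The implied ~13,000 kWh per household per year is in the same ballpark as typical U.S. residential consumption, so the "30,000 homes" figure holds up.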

Hurricane or no hurricane, people are comfortable with garbage.

U.S. coastal growth continues despite lessons of past storms

Sun., Sept. 17, 2017, 10:56 a.m.

By Jeff Donn for Associated Press

Rising sea levels and fierce storms have failed to stop relentless population growth along U.S. coasts in recent years, a new Associated Press analysis shows. The latest punishing hurricanes scored bull’s-eyes on two of the country’s fastest growing regions: coastal Texas around Houston and resort areas of southwest Florida.

Nothing seems to curb America’s appetite for life near the sea, especially in the warmer climates of the South. Coastal development destroys natural barriers such as islands and wetlands, promotes erosion and flooding, and positions more buildings and people in the path of future destruction, according to researchers and policy advisers who study hurricanes.

“History gives us a lesson, but we don’t always learn from it,” said Graham Tobin, a disaster researcher at the University of South Florida in Tampa. That city took a glancing hit from Hurricane Irma – one of the most intense U.S. hurricanes in years – but suffered less flooding and damage than some other parts of the state.

In 2005, coastal communities took heed of more than 1,800 deaths and $108 billion in damages from Hurricane Katrina, one of the worst disasters in U.S. history. Images of New Orleans under water elicited solemn resolutions that such a thing should never happen again – until Superstorm Sandy inundated lower Manhattan in 2012. Last year, Hurricane Matthew spread more deaths, flooding and blackouts across Florida, Georgia and the Carolinas. From 2010-2016, major hurricanes and tropical storms are blamed for more than 280 deaths and $100 billion in damages, according to data from the federal National Centers for Environmental Information.

Harvey, another historically big hurricane, flooded sections of Houston in recent weeks. Four counties around Houston, where growth has been buoyed by the oil business, took the full force of the storm. The population of those counties expanded by 12 percent from 2010 to 2016, to a total of 5.3 million people, the AP analysis shows.

During the same years, two of Florida’s fastest-growing coastline counties – retirement-friendly Lee and Manatee, both south of Tampa – welcomed 16 percent more people. That area took a second direct hit from Irma after it made first landfall in the Florida Keys, where damage was far more devastating.

Overall growth of 10 percent in Texas Gulf counties and 9 percent along Florida’s coasts during the same period was surpassed only by South Carolina. Its seaside population, led by the Myrtle Beach area of Horry County, ballooned by more than 13 percent.

Nationally, coastline counties grew an average of 5.6 percent since 2010, while inland counties gained just 4 percent. This recent trend tracks with decades of development along U.S. coasts. Between 1960 and 2008, the national coastline population rose by 84 percent, compared with 64 percent inland, according to the Census Bureau.

Cindy Gerstner, a retiree from the inland mountains of upstate New York, moved to a new home in January in Dunedin, Florida, west of Tampa. The ranch house sits on a flood plain three blocks from a sound off the Gulf of Mexico. She was told it hadn’t flooded in 20 years – and she wasn’t worried anyway.

“I never gave it a thought,” she said during a visit back to New York as Irma raked Florida. “I always wanted to live down there. I always thought people who lived in California on earthquake faults were foolish.”

Her enthusiasm for her new home was undiminished by Irma, which broke her fence and knocked out power but left her house dry.

In Horry County, where 19 percent growth leads all of South Carolina's coastline counties, Irma caused only minor coastal flooding. The county's low property taxes are made possible by rapid development and tourism fees, allowing retirees from the North and Midwest to live more cheaply. Ironically, punishing hurricanes farther south in recent years have pushed some Northerners known locally as "half-backers" to return halfway home from Florida and to resettle in coastal South Carolina.

Add the area’s moderate weather, appealing golf courses, and long white strands – the county is home to Myrtle Beach – and maybe no one can slow development there. “I don’t see how you do it,” said Johnny Vaught, vice chairman of the county council. “The only thing you can do is modulate it, so developments are well designed.”

Strong building codes with elevation and drainage requirements, careful emergency preparations, and a good network of roads for evacuation help make the area more resilient to big storms, said the council chairman, Mark Lazarus. Such measures give people “a sense of comfort,” said Laura Crowther, CEO of the local Coastal Carolina Association of Realtors.

Risk researchers say more is needed. “We’re getting better at emergency response,” said Tobin at the University of South Florida. “We’re not so good at long-term control of urban development in hazardous areas.”

The Federal Emergency Management Agency helps recovery efforts with community relief and flood insurance payments. It also provides community grants for projects aimed at avoiding future losses. Some projects elevate properties, build flood barriers, or strengthen roofs and windows against high winds. Others purchase properties subject to repeated damage and allow owners to move. The agency did not immediately respond to a request for comment.

But coastline communities face more storm threats in the future.

Global warming from human-generated greenhouse gases is melting polar ice and elevating sea levels at an increasing pace, according to the National Oceanic and Atmospheric Administration. That amplifies storm surges and other flooding. Also, some climate models used by scientists predict stronger, more frequent hurricanes as another effect of global warming in coming decades.

“There will be some real challenges for coastal towns,” predicted Jamie Kruse, director of the Center for Natural Hazards Research at East Carolina University in Greenville, North Carolina. “We’ll see some of these homes that are part of their tax base becoming unlivable.”

Hazard researchers said they see nothing in the near term to reverse the trend toward bigger storm losses. As a stopgap, communities should cease building new high-rises on the oceanfront, said Robert Young, director of the Program for the Study of Developed Shorelines at Western Carolina University in Cullowhee, North Carolina.

He said big changes probably will not happen unless multiple giant storms overwhelm federal and state budgets.

“The reason why this development still continues is that people are making money doing it,” he said. “Communities are still increasing their tax base – and that’s what politicians like.”

Religious communities are taking on climate change

Churches that have long played a role in social justice are stepping up.

Sarah Tory Sept. 18, 2017

Before Pastor Jim Therrien, 49, moved to New Mexico, he rarely thought about environmental issues. Back in Kansas, where he was born and raised, the grass outside his home was always green, and though the state had an active oil industry, companies fenced off well sites properly and promptly cleaned up spills. But then he and his family saw the impacts of energy development on the Southwestern landscape and their new church community. Therrien began to think about the connection between the local environment and the broader issue of climate change.

Every day, Therrien, a blond, ruddy and tattooed man of Irish descent, looked out his window and saw a dry land getting drier. Residents told him that winters used to be much colder and snowier. The hotter temperatures thickened the methane haze, and oil and gas traffic tore up the dirt roads. Therrien started to see these problems as injustices that conflicted with Christian values. So he decided to take a stand. Churches have long played a crucial role in social movements, from the civil rights era to immigration reform. Why not environmental activism?

“I don’t ever consider myself an environmentalist,” he told me one afternoon at the Lybrook Community Ministries, on a remote stretch of Highway 550, between the Navajo Nation and the Jicarilla Apache Reservation. “I’m more of a people person.”

Therrien’s congregation, mostly Navajo, had spent years living with the San Juan Basin’s drilling boom, and the last thing they needed was a sermon about climate change. So instead of lecturing, he created a garden to reduce the church’s use of fossil fuels to transport food. Then he began fundraising for solar installations on homes around the mission and urging lawmakers to tighten regulations on methane, a powerful greenhouse gas released by oil and gas drilling.

Last year, he joined the Interfaith Power & Light campaign, “a religious response to global warming” composed of churches and faith communities across the U.S. Since 2001, the network had expanded its membership from 14 congregations in California to some 20,000 in over 40 states. The group provides resources to churches and other faith communities for cutting carbon emissions — helping install solar panels, for instance, and sharing sermons on the importance of addressing climate change.

Therrien says he is merely “following the Scripture.” In the process, however, he has joined a growing environmental movement that brings a religious dimension to the problem of climate change.

Members of the New Community Project tour the Navajo Nation near now-dry Washington Lake to learn how oil and gas extraction is affecting the Navajo people who live nearby.

The green religious movement is gaining momentum. In May, nine Catholic organizations announced plans to divest from fossil fuel corporations, a move inspired by Pope Francis’ 2015 encyclical, Laudato Si’: On Care for Our Common Home.

In June, President Donald Trump announced plans to withdraw from the Paris climate accord, a decision that Catholic Bishop Oscar Cantú, of Las Cruces, New Mexico, called “deeply troubling.” “The Scriptures affirm the value of caring for creation and caring for each other in solidarity,” said Cantú, who is the chairman of the U.S. Bishops’ Committee on International Justice and Peace. “The Paris agreement is an international accord that promotes these values.”

In July, the United Church of Christ delivered a similar message, urging the clergy to preach on “the moral obligation of our generation to protect God’s creation” and exhorting individuals to take political action.

For these churches, climate change connects a long list of social and economic injustices they care deeply about, from food insecurity to the global refugee crisis.

Pope Francis stands in front of a statue of St. Francis of Assisi. The pope, who takes his name from St. Francis, the patron saint of animals and ecosystems, has led a call to action on climate change for the Catholic community.

“Climate change is the biggest ethical, moral and spiritual challenge of our day,” Joan Brown, the executive director of New Mexico Interfaith Power & Light, told me.

She pointed to St. Francis of Assisi, the patron saint of animals and ecology, a medieval Italian monk who spent much of each year living in hermitages, caves and on mountainsides, praying in the wilderness. “St. Francis speaks of everything being connected,” she said, “of there being no separation between the human and natural world.” When Jorge Mario Bergoglio was elected pope, he chose his name in honor of St. Francis. Brown credited Pope Francis with helping reframe climate change as a moral concern.

Laudato Si’ describes the relationship between global poverty, inequality and environmental destruction, and issues a call to action. When Francis visited the White House in 2015, he declared: “Climate change is a problem which can no longer be left to a future generation. I would like all men and women of goodwill in this great nation to support the efforts of the international community to protect the vulnerable in our world.”

Among non-Catholics, too, the pontiff's message has had an effect: Polling from the Yale Project on Climate Change Communication shows that the share of Americans who say they think global warming will harm the world's poor rose from 49 to 61 percent; the share who say the issue has become "very" or "extremely important" to them personally jumped from 19 to 26 percent.

A little over a year later, during the Paris climate negotiations, Brown recalled how during a breakout session for faith leaders, one of the Paris organizers praised Laudato Si’ and the similar documents released by Muslim, Jewish and Buddhist leaders. It was the first time, Brown said, quoting the organizer, “that ethical and moral imperatives are front and center with delegates.”

Here at his hardscrabble New Mexico parish, Therrien continues to practice what he preaches. On a hot day in July, he herded 28 visitors into the mission’s two white vans for a drive out onto the Navajo Nation. The group, mostly Easterners, ranged in age from 8 to over 60 and had traveled to the Lybrook mission as part of a weeklong fact-finding trip. Like Therrien, many were members of the Church of the Brethren, a Protestant denomination with a history of activism. More recently, their focus had shifted to environmental issues — especially climate change.

“It’s concerning that our government is pulling back from what we should be focusing on,” one of them, Jim Dodd, told me. Recently, the giant Larsen Ice Shelf had broken off from Antarctica, and Dodd was worried. “Villages already at sea level are going to get flooded,” he said.

Leading the group was David Radcliff, director of the New Community Project, a Vermont-based organization. “It’s a fairness issue for the rest of God’s creatures,” he told me. Radcliff has led “learning tours” around social and environmental justice issues for church groups, most recently, to the Ecuadorian Amazon.

Radcliff, a small, wiry man with an intense blue gaze, wore a white T-shirt with a very long slogan on the back. “Earth is a mess,” it said, and “God’s not amused.” If you aren’t satisfied, it added, “do something about it.”

For Radcliff, discussing the facts of climate change isn’t enough. That’s where religion comes in. “At a certain point, you have to talk about the consequences, and past that it becomes a conversation about morality,” he said. Take moose in the Northeast: They are dying from tick infestations exacerbated by a warming climate, caused by humans taking more from the Earth than they need, he said. “We are stealing from other living things.”

As we drove over bumpy dirt roads west of the mission, the Easterners stared in awe at the crumpled mesas and the vast New Mexican landscape. Navajo homesteads peeked out of the sagebrush. Every so often, a large semi-truck carrying pipes and other equipment roared past in a cloud of dust, heading for one of the scattered well pads or towering rigs marking fracking operations.

Therrien stopped the van at one of the well sites and ushered everyone out, gesturing toward a row of high metal storage tanks. Under federal and state regulations, the company should have built fencing around it, but out here on Navajo land, Therrien noted, the rules weren’t always enforced. Last year, several oil storage tanks north of Lybrook exploded, forcing Navajo families living nearby to evacuate their homes. Since moving to the mission, he often thought about how much easier it was to ignore the consequences of oil and gas development — and of climate change — if people weren’t involved.

Back in Kansas, Therrien had recycled cans and kept a compost pile, but when it came to climate change, he felt mostly resigned. “I used to think, ‘What can one person do?’ ” he told me. Now, as a pastor on the Navajo Nation, he felt a new sense of urgency and purpose.

Last January, he spoke at a rally outside the Bureau of Land Management’s office in Santa Fe, New Mexico, protesting the agency’s decision to lease 844 acres of land for drilling. The rally brought together Navajo activists, environmental groups and religious leaders from the state chapter of Interfaith Power & Light. Although they failed to stop the sale, Therrien remained hopeful. “Through the church, I realized there was this network of people fighting for the same things,” he said.

Therrien stopped the van at a low rise overlooking what was once Washington Lake. Families once gathered water here for themselves and their livestock. Four years ago, it dried up.

Everyone piled out, while Radcliff explained how aquifers were losing water. “They’re dropping all over the world,” including in the West, he said. He paused and knelt to pick up a can that was left on the side of the road and brandished it above him. “No other creature makes trash,” he said. “So what’s progress?” The people gazed past him, staring at the dusty lakebed, where patches of dry grass swayed in the heat.

Correspondent Sarah Tory writes from Paonia, Colorado. She covers Utah, environmental justice and water issues.

What if America Had a Detective Agency for Disasters?

The commissions are coming. Hurricane season hasn’t ended, but forensics waits for no one, so the after-action reports on Harvey and Irma have to get started. The relevant agencies—local and perhaps federal, plus maybe some national academies and disaster responders—will all no doubt look at themselves and others to see what went right or wrong. Was Houston right not to evacuate? Was Florida right to evacuate? What led to Florida’s electrical outage? Is Houston’s flood control infrastructure tough enough to withstand climate change-powered storm surges?

That’s as it should be. The science of disaster prediction and response only gets a few laboratories a year, and extracting lessons from them is a big job. Big enough, in fact, that the sheer number of reports can mean those lessons get lost. So Scott Knowles, a historian at Drexel University who studies disasters, has an idea: a federal agency whose job it is to centrally organize that detective work and get the word out. A National Disaster Investigation Board.

Stipulated, this idea would be a tough sell even if the Trump administration wasn’t anti-agency and anti-regulation. But just playing with the notion says something about what people do and don’t learn from disasters. “It’s an index of what a total nerd I am that this is a thrilling idea,” Knowles says. “We have all these agencies that we spend a fair amount of money on who do this work, but what’s missing is a technically sophisticated cop who can come in and say, 'this worked, this didn’t, this was a repeat, this is an area for research.' And then put a report out and have it be interesting, so there’s press.”

As a model, Knowles cites agencies like the National Transportation Safety Board, but also pop-up commissions like the one after the space shuttle Challenger exploded, or the 9/11 Commission. Both were riveting. The Rogers Commission, investigating the Challenger, had that thundering moment when Richard Feynman put an O-ring in ice water to show that cold made it brittle, the technical cause of the shuttle’s explosion. The 9/11 final report, controversial though it remains, was a bestseller and finalist for the National Book Award—and led to sweeping changes to the United States intelligence and military. That power to effect real, lasting change would be critical to the NDIB's success.


So what would the NDIB actually do? Disasters are “multiple interlocking problems,” as Knowles says. The Fukushima Dai-Ichi meltdown was a human-made disaster that resulted from a natural one, a tsunami. The Challenger O-ring failed, but the decision to launch was a management failure. So the first task might be to figure out what problems to unlock. For Harvey and Houston, that might be evacuation communication and dealing with toxic sites. For Irma it might be the electrical grid.

Public, possibly even televised hearings might be another component. They’d recruit public pressure to make necessary changes, and also have a naming-and-shaming component. The post-Katrina study did that. “They said, look, the Coast Guard did good and everybody else failed,” Knowles says. “That was a spur at FEMA that produced pretty significant changes and led to legislation.”

And, as a colleague of Knowles’ suggested when he tossed out the idea of the NDIB on Twitter, the investigation should include people from among the affected. That’d give it a social science aspect, but also help make sure its findings represented the widest possible constituency.

So as long as we’re dreaming up mythical science detective agencies, who should be the stars of CSI: Disaster? “If we’re going to get serious about building this up as a body of expertise with policy juice, I think you have to have a standing body with people committed to it, and professional staff,” Knowles says. He suggests people like fire protection engineers, the cranky people in the back of the room who know what’s unsafe and aren’t afraid to say so, but can also navigate legal systems.

Knowles thought ex-FEMA administrators might be great at the job, like James Lee Witt or Craig Fugate. “He’s setting himself up as a kind of policy analyst, and having been a FEMA administrator he’s way qualified to walk into Houston and say what he’s worried about.”

So I pinged Fugate. What does he think about a federal disaster investigation team? “It’s a good idea, but will be resisted by FEMA and others. A formal review by outside experts not subject to the agencies under review will be key,” Fugate says. But he adds that you’d still want to somehow integrate the views and actions of local, state, and private sector actors. And they’re “not likely to volunteer unless required by law.”

Short of starting up a Legion of Super-Heroes-style Science Police Force, that seems unlikely.

Worse, an NDIB might be redundant. “After Ebola, there were at least 20 after-action commissions to study what happened and what went wrong,” says Larry Brilliant, chair of the Skoll Global Threats Fund and a pandemic fighter since his days on the World Health Organization team that eradicated smallpox. (Brilliant also cofounded the lynchpin internet community The Well.) Brilliant was on at least one of the commissions himself. “It’s not that there’s any absence of intentionality,” he says.

A bigger problem: “There’s no such thing as a ‘disaster,’” Brilliant says. A famine is not a flood is not an explosion is not a refugee crisis is not a disease outbreak. “Each one has its own particular ways to screw up.”

Corporate forces would resist this new agency even harder than anti-bureaucratic ones. The FIRE sector—the finance, insurance, and real estate industries—has become the biggest contributor to federal-level political campaigns. That lobby doesn't want to hear about uninsuring coastlines or rezoning wilderness. "I call them the lending complex," Knowles says. "They're going to push back on this because they think we should be able to build anyplace—wildland interface, Jersey Shore, wherever."

But whether all those objections make a standing Federal agency with investigative powers a terrible idea (because it’d be redundant) or a great idea (because it’d clarify) isn’t clear. “Almost anything would be better than what we have now, which is duplicative studies and some Senate committee, and the stuff just goes on a shelf,” Knowles says. “Studies are going to happen. We may as well innovate around how they’re done, because they’re going to get more and more expensive.” And so are the disasters.

More than 8.3 billion tons of plastics made: Most has now been discarded

Date: July 19, 2017

Source: University of Georgia

Summary: Humans have created 8.3 billion metric tons of plastics since large-scale production of the synthetic materials began in the early 1950s, and most of it now resides in landfills or the natural environment, according to a study.


Humans have created 8.3 billion metric tons of plastics since large-scale production of the synthetic materials began in the early 1950s, and most of it now resides in landfills or the natural environment, according to a study published in the journal Science Advances.

Led by a team of scientists from the University of Georgia, the University of California, Santa Barbara and Sea Education Association, the study is the first global analysis of the production, use and fate of all plastics ever made.

The researchers found that by 2015, humans had generated 8.3 billion metric tons of plastics, 6.3 billion tons of which had already become waste. Of that waste total, only 9 percent was recycled, 12 percent was incinerated and 79 percent accumulated in landfills or the natural environment.

If current trends continue, roughly 12 billion metric tons of plastic waste will be in landfills or the natural environment by 2050. Twelve billion metric tons is about 35,000 times as heavy as the Empire State Building.
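The Empire State Building comparison can be checked directly: dividing the projected waste by 35,000 implies a building mass of roughly 340,000 tonnes, in line with commonly cited estimates of the building's weight. A minimal sketch:

```python
# What the "35,000 Empire State Buildings" comparison implies.
projected_plastic_waste_t = 12e9   # metric tons of plastic waste projected by 2050
multiple = 35_000                  # "about 35,000 times as heavy"

implied_building_mass_t = projected_plastic_waste_t / multiple
print(f"Implied Empire State Building mass: {implied_building_mass_t:,.0f} tonnes")  # ~343,000 tonnes
```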

"Most plastics don't biodegrade in any meaningful sense, so the plastic waste humans have generated could be with us for hundreds or even thousands of years," said Jenna Jambeck, study co-author and associate professor of engineering at UGA. "Our estimates underscore the need to think critically about the materials we use and our waste management practices."

The scientists compiled production statistics for resins, fibers and additives from a variety of industry sources and synthesized them according to type and consuming sector.

Global production of plastics increased from 2 million metric tons in 1950 to over 400 million metric tons in 2015, according to the study, outgrowing most other human-made materials. Notable exceptions are materials that are used extensively in the construction sector, such as steel and cement.

But while steel and cement are used primarily for construction, plastics' largest market is packaging, and most of those products are used once and discarded.

"Roughly half of all the steel we make goes into construction, so it will have decades of use -- plastic is the opposite," said Roland Geyer, lead author of the paper and associate professor in UCSB's Bren School of Environmental Science and Management. "Half of all plastics become waste after four or fewer years of use."

And the pace of plastic production shows no signs of slowing. Of the total amount of plastics produced from 1950 to 2015, roughly half was produced in just the last 13 years.

"What we are trying to do is to create the foundation for sustainable materials management," Geyer said. "Put simply, you can't manage what you don't measure, and so we think policy discussions will be more informed and fact based now that we have these numbers."

The same team of researchers led a 2015 study published in the journal Science that calculated the magnitude of plastic waste going into the ocean. They estimated that 8 million metric tons of plastic entered the oceans in 2010.

"There are people alive today who remember a world without plastics," Jambeck said. "But they have become so ubiquitous that you can't go anywhere without finding plastic waste in our environment, including our oceans."

The researchers are quick to caution that they do not seek the total removal of plastic from the marketplace, but rather a more critical examination of plastic use and its end-of-life value.

"There are areas where plastics are indispensable, especially in products designed for durability," said paper co-author Kara Lavender Law, a research professor at SEA. "But I think we need to take a careful look at our expansive use of plastics and ask when the use of these materials does or does not make sense."

Story Source:

Materials provided by University of Georgia. Original written by James Hataway. Note: Content may be edited for style and length.

Journal Reference:

Roland Geyer et al. Production, use, and fate of all plastics ever made. Science Advances, July 2017 DOI: 10.1126/sciadv.1700782

Localized rapid warming of West Antarctic subsurface waters by remote winds in East Antarctica

Paul Spence, Ryan M. Holmes, Andrew McC. Hogg, Stephen M. Griffies, Kial D. Stewart & Matthew H. England

Nature Climate Change (2017) doi:10.1038/nclimate3335

Received 25 January 2017; Accepted 08 June 2017; Published online 17 July 2017

The highest rates of Antarctic glacial ice mass loss are occurring to the west of the Antarctic Peninsula in regions where warming of subsurface continental shelf waters is also largest. However, the physical mechanisms responsible for this warming remain unknown. Here we show how localized changes in coastal winds off East Antarctica can produce significant subsurface temperature anomalies (>2 °C) around much of the continent. We demonstrate how coastal-trapped barotropic Kelvin waves communicate the wind disturbance around the Antarctic coastline. The warming is focused on the western flank of the Antarctic Peninsula because the circulation induced by the coastal-trapped waves is intensified by the steep continental slope there, and because of the presence of pre-existing warm subsurface water offshore. The adjustment to the coastal-trapped waves shoals the subsurface isotherms and brings warm deep water upwards onto the continental shelf and closer to the coast. This result demonstrates the vulnerability of the West Antarctic region to a changing climate.

The increasing rate of global mean sea-level rise during 1993–2014

Xianyao Chen, Xuebin Zhang, John A. Church, Christopher S. Watson, Matt A. King, Didier Monselesan, Benoit Legresy & Christopher Harig

Nature Climate Change 7, 492–495 (2017) doi:10.1038/nclimate3325

Received 19 October 2016; Accepted 22 May 2017; Published online 26 June 2017

Global mean sea level (GMSL) has been rising at a faster rate during the satellite altimetry period (1993–2014) than previous decades, and is expected to accelerate further over the coming century. However, the accelerations observed over century and longer periods have not been clearly detected in altimeter data spanning the past two decades. Here we show that the rise, from the sum of all observed contributions to GMSL, increases from 2.2 ± 0.3 mm yr⁻¹ in 1993 to 3.3 ± 0.3 mm yr⁻¹ in 2014. This is in approximate agreement with the observed increase in GMSL rise, 2.4 ± 0.2 mm yr⁻¹ (1993) to 2.9 ± 0.3 mm yr⁻¹ (2014), from satellite observations that have been adjusted for small systematic drift, particularly affecting the first decade of satellite observations. The mass contributions to GMSL increase from about 50% in 1993 to 70% in 2014, with the largest, and statistically significant, increase coming from the contribution from the Greenland ice sheet, which is less than 5% of the GMSL rate during 1993 but more than 25% during 2014. The suggested acceleration and improved closure of the sea-level budget highlights the importance and urgency of mitigating climate change and formulating coastal adaptation plans to mitigate the impacts of ongoing sea-level rise.
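To put those rates in context, they can be integrated over the altimetry period. The sketch below assumes, purely for illustration, that the rate rises linearly from 2.2 mm/yr in 1993 to 3.3 mm/yr in 2014; the paper reports the endpoint rates, not this interpolation.

```python
# Illustrative cumulative rise implied by the GMSL rates quoted above,
# assuming (as a simplification) a linear increase in rate from 1993 to 2014.
import numpy as np

years = np.arange(1993, 2015)                    # 1993 through 2014
rates_mm_per_yr = np.linspace(2.2, 3.3, len(years))

total_rise_mm = rates_mm_per_yr.sum()            # ~60 mm over the 22-year period
print(f"~{total_rise_mm:.0f} mm of cumulative rise, 1993-2014")
```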

Feds release final environmental impact statement for Atlantic Coast Pipeline

Jul 24, 2017, 6:30am EDT

Lauren K. Ohnesorge, Triangle Business Journal

Federal regulators have released their final environmental impact statement for the Atlantic Coast Pipeline, a 600-mile natural gas line planned to cut through North Carolina by way of West Virginia and Virginia.

The Federal Energy Regulatory Commission concludes that, while the project's construction and implementation are likely to result in "some adverse effects," by following mitigation recommendations, the ACP can reduce those impacts "to less-than significant levels."

A group of residents in Wilson County have spoken out against the project.

According to FERC, negative effects of ACP's construction and operation could include its impact on steep slopes and adjacent bodies of water. The document points to a handful of endangered species that could be impacted by the project, from the Roanoke logperch to the clubshell mussel to the Indiana bat.

In a prepared statement after the release, Leslie Hartz, Dominion Energy's vice president of engineering and construction, said the report "provides a clear path for final approval of the Atlantic Coast Pipeline this fall."

"The report concludes that the project can be built safely and with minimal long-term impacts to the environment," Hartz states.

The report comes a day after seven mayors, spearheaded by Linwood Parker of Four Oaks, put out a joint statement calling for the project to be built.

“We just wanted to make sure everybody is aware of the importance to eastern North Carolina to have natural gas and an abundant supply,” Parker said Friday in an interview. “The debate is not about whether you have gas, it’s about whether eastern North Carolina has gas.”

Parker and other advocates of the project hope it spurs economic development, specifically the interest of manufacturers who rely on natural gas to operate. While companies aren’t going to be connecting directly to the pipeline, they could benefit from cheaper Duke Energy prices and a more resilient system, advocates have said.

Parker said many critics live in the Triangle region, where there’s “plenty of natural gas.” “Yet my community doesn’t have it,” he said.

But, to several property owners along the route, the ACP means unwelcome trenching. Of the 2,900 landowners impacted by the project, 1,000 are in North Carolina.

A group of homeowners along Exum Road in Wilson County are among those fighting against the project, worried about its impact on their safety and property values.

They point to the demographics. Other than Johnston, the counties at play have a greater minority makeup than the state average. That includes Robeson County, which is unique because of its Lumbee Indian population.

“What are we all? Black folks who are retired,” said Alice Freeman, who expects to see the pipeline construction from her bedroom window. “They’re going to these communities because we are the people who have the least clout. We don’t have the money to fight them. We’re easy prey. And nobody is going to come to our defense.”

But proponents of the pipeline – including several economic developers across the state – point to different demographics.

Of the eight counties tapped for the Atlantic Coast Pipeline, only Johnston County is not among the most economically distressed counties in the state. And the pipeline, they say, is critical to boosting their economies.

"For us, the overriding factor is the long-term economic development potential," Patrick Woodie, president of the North Carolina Rural Economic Development Center, said earlier this year, adding that it "outweighs" homeowner impact.

The project is a joint effort by Dominion (NYSE:D), Duke Energy (NYSE:DUK), Piedmont Natural Gas and Southern Co. (NYSE:SO).

Solar Energy Boom Sets New Records, Shattering Expectations

Overall, clean energy sources accounted for two-thirds of new electricity capacity worldwide in 2016, a new report says.

By Georgina Gustin

Oct 4, 2017

Driven largely by a boom in solar power, renewable energy expansion has hit record-breaking totals across the globe and is shattering expectations, especially in the United States, where projections were pessimistic just a decade ago.

In 2016, almost two-thirds of new power capacity came from renewables, bypassing net coal generation growth globally for the first time. Most of the expansion came from a 50 percent growth in solar, much of it in China.

In the U.S., solar power capacity doubled compared to 2015—itself a record-breaking year—with the country adding 14.5 gigawatts of solar power, far outpacing government projections. In the first half of 2017, wind and solar accounted for 10 percent of monthly electricity generation for the first time.

Two reports—one from the International Energy Agency (IEA), which looked at growth in renewables globally, and one from the Natural Resources Defense Council (NRDC), which tracked growth in the U.S.—were published this week, both telling the same story.

"We had very similar findings: 2016, from a U.S. perspective was a great year for renewable energy and energy efficiency," said Amanda Levin, a co-author of the NRDC report. "China is still the largest source of new power, but in the U.S., we're seeing an increase in renewables year over year."

Growth Shatters Past Expectations

The numbers are far higher than the U.S. Energy Information Administration (EIA) predicted a decade earlier. The agency forecast in 2006 that solar power would amount to only about 0.8 gigawatts of capacity by 2016.

Instead, installed solar by 2016 was 46 times that estimate, the NRDC points out. EIA's prediction for wind power was also off—the agency predicted 17 gigawatts of wind power, but that figure actually rose nearly fivefold, to 82 gigawatts of capacity.
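The multiples quoted from the NRDC comparison are straightforward to verify. A quick check using only the numbers in the article:

```python
# Checking the quoted multiples of actual 2016 capacity vs. EIA's 2006 projections.
solar_projected_gw = 0.8
solar_actual_gw = solar_projected_gw * 46             # "46 times that estimate" -> ~37 GW installed

wind_projected_gw = 17
wind_actual_gw = 82
wind_multiple = wind_actual_gw / wind_projected_gw     # ~4.8, i.e. "nearly fivefold"

print(f"solar ~{solar_actual_gw:.0f} GW, wind {wind_multiple:.1f}x the projection")
```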

The agency, likewise, didn't predict a drop in coal-fired power generation, which plummeted by nearly 45 percent.

Globally, according to the report from the IEA—not to be confused with the EIA—solar was the fastest-growing source of new energy, bypassing all other energy sources, including coal. Overall, the IEA found, new solar energy capacity rose by 50 percent globally—tracking with the rise in the U.S. Adding in other renewable sources, including wind, geothermal and hydropower, clean energy sources accounted for two-thirds of new electricity capacity. The IEA also increased its forecast for future renewable energy growth, saying it now expects renewable electricity capacity will grow 43 percent, or more than 920 gigawatts, by 2022.

Solar's U.S. Growth Could Hit a Speed Bump

In the U.S., the prospects are similarly positive, despite the Trump administration's efforts to bolster the coal industry and roll back Obama-era clean energy legislation.

Levin noted one potential damper on that growth. Last month, the U.S. International Trade Commission ruled in favor of two solar manufacturers that are seeking tariffs on cheap imported solar panels. Ultimately, any tariff decision would be made by the Trump administration.

"It would mean a much higher price for solar panels, and it could put a large reduction in new solar being added over the next two to three years," Levin said.

"States and cities are moving forward on clean energy," she said. "We think the investments made by states and cities, to not only hedge on gas prices, but to meet clean energy standards, will continue to drive solar even with the decision."

About the Author: Georgina Gustin

Georgina Gustin is a Washington-based reporter who has covered food policy, farming and the environment for more than a decade. She started her journalism career at The Day in New London, Conn., then moved to the St. Louis Post-Dispatch, where she launched the "food beat," covering agriculture, biotech giant Monsanto and the growing "good food" movement. At CQ Roll Call, she covered food, farm and drug policy and the intersections between federal regulatory agencies and Congress. Her work has also appeared in The New York Times, Washington Post and National Geographic's The Plate, among others.

Santa Fe School Board Opposes State Science Education Standards

Critics of the proposed curriculum say it leaves out important information relating to climate change and evolution.

By Ashley P. Taylor | October 4, 2017

The Santa Fe school board voted yesterday (October 3) to oppose the state’s new science education standards, which, the board says, water down or leave out key information about evolution and climate change, The Santa Fe New Mexican reports. The school board has written a letter in opposition to the state’s Public Education Department (PED), which it plans to send by week’s end. The board is also planning a “teach-in” at the PED building in Santa Fe for next Friday (October 13).

The letter, shared with the New Mexican, asks why the new standards omit the age of the Earth, the time when the first unicellular organism appeared, evolution, and global warming, while emphasizing the oil and gas industries. The letter also criticizes the state’s refusal to disclose the authors of the standards, saying that the secrecy “leaves open for question the authenticity of the proposed replacement,” according to the newspaper.

The Los Alamos school board has already opposed the new standards, and the Taos board is expected to follow suit, the New Mexican reports.

The Santa Fe board is "deeply troubled," according to the letter, that the new state standards—an update from those established in 2003—do not align with the Next Generation Science Standards, which the National Research Council, the National Science Teachers Association, and others developed. Eighteen states have adopted the Next Generation standards in full. Others, like New Mexico, are using them as a framework for their own standards, the New Mexican reports.

While the authorship of the Next Generation standards is public, that of the state standards is not, and the state seems to be actively withholding the information. The New Mexican submitted a public records request in mid-September seeking the names of individuals and groups that developed the new standards. The education department’s public records custodian replied that the department had no pertinent records, as the New Mexican reports in a separate article. PED spokeswoman Lida Alikhani told the paper that those who weighed in on the standards did so with the understanding that their contributions would remain confidential.

The board has planned a teach-in for October 13, days before a public hearing about the new standards.

“Our state should adopt the original proposal, not the revisions,” writes the New Mexican in an editorial.

“Future climate scientists should understand what human activity is doing to the planet, and most of all, how science might be able to save our collective futures,” the editorial continues. “Without the proper foundation, those scientists will not be coming from New Mexico. That would be a shame.”

Industry Lawsuits Try to Paint Environmental Activism as Illegal Racket

Logging and pipeline companies are using a new legal tactic to seek damages from Greenpeace and other groups. The long-shot cases are having a chilling effect.

By Nicholas Kusnetz

Oct 5, 2017

On a bright afternoon in May 2016, two men in a silver SUV pulled into Kelly Martin's driveway. One of them, tall and beefy with a crew cut, walked up to her front door.

"The guy said, 'Is Joshua Martin home?' and I said, 'No, who are you?'" recalled Kelly. "He said, 'I'm with a company that's talking to current and former employees of ForestEthics, and I'm wondering if he still works there.'"

Joshua had left ForestEthics, which was renamed Stand last year, to run the Environmental Paper Network. Kelly asked to see the stranger's ID and to snap a picture on her phone. Instead, the man retreated to the SUV and "they literally peeled out of the driveway."

Around the same time, Aaron Sanger, another former employee of Stand, also received a visit from two men asking similar questions. So did others, some of them former employees of Greenpeace.

Then, on the last day of that month, Greenpeace and Stand were hit with an unusual lawsuit brought by Resolute Forest Products, one of Canada's largest logging and paper companies, that could cost the groups hundreds of millions of dollars if Resolute wins.

The timber company said the organizations, which for years had campaigned against Resolute's logging in Canada's boreal forest, were conspiring illegally to extort the company's customers and defraud their own donors.

Invoking the Racketeer Influenced and Corrupt Organizations Act, or RICO, a federal conspiracy law devised to ensnare mobsters, the suit accuses the organizations, as well as several green campaigners individually and numerous unidentified "co-conspirators," of running what amounts to a giant racket.

"Maximizing donations, not saving the environment, is Greenpeace's true objective," the complaint says. "Its campaigns are consistently based on sensational misinformation untethered to facts or science, but crafted instead to induce strong emotions and, thereby, donations." Dozens of the group's campaign emails and tweets, it said, constituted wire fraud.

"As an NGO, that is a deeply chilling argument," said Carroll Muffett, president of the Center for International Environmental Law (CIEL), which joined eight other groups to file an amici curiae brief supporting a move to dismiss Resolute's case.

The far-reaching lawsuit has seized attention across the environmental advocacy and legal communities. Arguments are to be heard in court next week.

The Resolute lawsuit was unprecedented. Then several months ago, former employees of Greenpeace and the environmental advocacy group 350.org were similarly visited by investigators. In August, the company behind the Dakota Access Pipeline filed a separate RICO suit against Greenpeace and two smaller groups, Banktrack and Earth First!. The complaint echoes Resolute's claims: a broad conspiracy against a major company, advocacy groups running an illegal "enterprise" to further their own interests while damaging the company, Energy Transfer Partners. It even alleges support for eco-terrorism, a violation of the Patriot Act, and drug trafficking.

It was filed by the same lawyer who represents Resolute—Michael J. Bowe, of the firm Kasowitz Benson Torres LLP, who is also a member of President Donald Trump's personal legal team.

"The Energy Transfer Partners lawsuit against Greenpeace is perhaps the most aggressive SLAPP-type suit that I've ever seen," said Michael Gerrard, faculty director of the Sabin Center for Climate Change Law at Columbia University, using the acronym for a lawsuit that aims to silence political advocacy. "The paper practically bursts into flames in your hands."

While the second lawsuit names only three defendants, together with unnamed "co-conspirators," it also accuses many of the nation's leading environmental groups, including the Sierra Club, Earthjustice and 350.org, of participating in a sprawling enterprise to disrupt business and defraud donors.

"This was precisely what we were concerned we would see," said Muffett of CIEL. Dozens of organizations, American Indian tribes and countless individuals were involved in the protests against the Dakota Access Pipeline. By naming a handful of them and unnamed co-conspirators, the suit may cause anyone with any ties to the movement to think twice before sending the next campaign email or launching a new effort, Muffett said.

"Those groups will be looking at this and trying to decide on how to respond and what it means for their campaigns going forward, not only on Dakota Access but other campaigns as well. And that is, in all likelihood, a core strategy of a case like this."

Kasowitz Benson Torres has earned a reputation as a particularly brash New York law firm. The waiting room of its Manhattan offices displays a brochure of media mentions, including several about the firm's close relationship with Trump. When Timothy O'Brien reported in his 2005 book, "TrumpNation: The Art of Being the Donald," that the future president was worth between $150 million and $250 million, not the billions he boasted, Trump hired Kasowitz to sue O'Brien for $5 billion. The case dragged on for five years before being dismissed.

It may have been the firm's work for a Canadian insurance company, however, that captured the attention of Resolute.

In 2006, Fairfax Financial Holdings Limited hired Bowe and Kasowitz to sue several hedge funds, accusing them of operating a criminal enterprise to drive down the company's share price and boost their short-sale profits. The case relied on New Jersey's RICO statute, and while it was dismissed in 2012, Fairfax appealed and in April won a ruling saying it could proceed with some of its claims. Bradley Martin, Fairfax's vice president of strategic investments, is chairman of Resolute's board of directors.

Bowe said that while his clients suffered damages, "defamation cases also typically involve an element of restoring your reputation. In fact, sometimes plaintiffs who don't have substantial damages sue primarily to clear their reputations," he said. Behind Bowe's desk hangs an American flag from the battle of Fallujah, among the bloodiest of the Iraq War. "I think the filing of a suit in court sends the most credible message that what they're saying about you is not true."

The Resolute case has its origins in a long-running environmental campaign focusing on Canada's boreal forest, a thick green stripe across much of the country. It's one of the world's largest stores of carbon, and environmental groups say it's under increasing pressure from logging operations. In 2012, Greenpeace Canada released a report purporting to show that Resolute was operating in parts of the forest it had agreed to stay out of. It turned out, however, that the area in question was not part of the protected forest. Greenpeace later issued an apology and said it would retract any material that drew on the assessment.

There's some disagreement on whether Resolute's logging practices stand out from its peers—many environmentalists say they do, but some people familiar with its work dismiss the claim. What everyone interviewed for this article agreed on is that unlike many of its peers, Resolute has resisted settling with environmentalists. In 2013, it sued Greenpeace Canada in an ongoing Canadian defamation suit over the boreal campaign. And after Greenpeace continued its work, and targeted many of Resolute's American customers, Resolute eventually filed its U.S. RICO claim, targeting the organization's U.S. and international arms, as well.

In 124 pages of detailed and passionate language, the complaint alleges not only that the defendants lied in order to harm Resolute's business, but that their primary objective is not to protect the environment but to raise money. It describes a "blitzkrieg attack" of threats against the company's customers unless they dropped Resolute. Some, like Kimberly-Clark, UPM, Best Buy and News Corp, have either ended contracts or requested the contracts be altered to alleviate concerns, according to the complaint.

The lawsuit says the groups impersonated company employees to gain proprietary information and suggests Greenpeace was associated with a hack of the websites of Resolute and Best Buy. Beyond the 2012 error that Greenpeace retracted, the lawsuit says Greenpeace defamed the company by, among other things, calling it a "forest destroyer." In addition to Greenpeace and Stand, the lawsuit names several of the organizations' employees personally as defendants. The co-conspirators—or John and Jane Does—it says, are people of unknown identity who assisted Greenpeace's work, including "cyber-hacktivists, environmental activists, and certain foundations directing funds to the defendants." The suit seeks damages to be determined in a trial, but cites a claim by Greenpeace to have cost the company at least $80 million. RICO claims entitle plaintiffs to recover triple damages, in this case potentially more than $240 million.

On Oct. 10, Greenpeace will ask a federal judge in California to dismiss the case. The group submitted a similar motion last year in Georgia, where the suit was originally filed. The Georgia judge later moved the case to California, where two of the defendants are based, saying Resolute had not provided any "factual basis from which to infer that defendants committed fraud or extortion" in Georgia. "Rather, the allegations in the complaint, at best, support the inference that the defendants organized and held a protest in Augusta."

"It is very alarming that you can have plaintiffs like this, representing corporate interests attacking legitimate critics doing advocacy work by just drafting a complaint, throwing whatever in there, stretching racketeering law and going after constitutionally protected free speech by throwing labels out there basically trying to criminalize legitimate advocacy work," said Tom Wetterer, Greenpeace's general counsel.

Over the past year, several energy corporations have filed lawsuits against critics. Murray Energy Corp., an Ohio-based coal company, filed a libel claim against The New York Times in May after it published an editorial critical of the company, and the following month filed another suit against comedian John Oliver, HBO, Time Warner and some staff members of his show, "Last Week Tonight." In August, Cabot Oil & Gas Corp., which has been accused of polluting residential water wells in Pennsylvania, sued one of the residents and his lawyers, saying they had harmed the company by filing a lawsuit, later withdrawn, that Cabot considered frivolous. The same month, a mining company run by the son of West Virginia Gov. Jim Justice sued two Kentucky regulators—personally—alleging that they interfered with the company's business.

The suits against Greenpeace, however, are broader in scope, and because of their use of racketeering law, potentially much more damaging. "They can change the tenor of the debate," said Joshua Galperin, who runs the Environmental Protection Clinic at Yale Law School. The cases, he said, are not simply legal acts, but political too.

"Now wouldn't be a bad time for certain industries to follow Resolute's lead and say, 'let's try to change the conversation. Let's see if we can get the Trump administration to intervene in these lawsuits, to get behind us, to bring perhaps even criminal RICO charges against groups like Greenpeace," he said. "Now wouldn't be a bad time to try these aggressive tactics, unfortunately."

The day the Resolute lawsuit was filed, Jonathan Adler, a conservative legal scholar at Case Western Reserve University School of Law, wrote about it in a blog published by the Washington Post. The Wall Street Journal's editorial board offered conditional support. Others, including Steve Forbes, welcomed the lawsuit in Investor's Business Daily and the National Review.

Brandon Phillips, of the National Fisheries Institute, an industry group, wrote a letter published in the Wall Street Journal saying, "Bravo to Resolute Forest Products and here's hoping the discovery process in the litigation shines a light on the misconduct that environmental journalists have apparently ignored for far too long."

Marita Noon, in a column for Breitbart and the American Spectator, said: "Hopefully other companies will now tune into the public's change in attitude and with firmness and determination also fight back to protect shareholders and workers." She placed Resolute's fight in the political context of growing support for Trump's presidential campaign.

H. Sterling Burnett, of the conservative Heartland Institute, well known for its denial of mainstream climate science, said in an interview that other companies have been too quick to cave to the demands of environmentalists, and that the RICO suits may change that. "The Resolute case is one small thread," he said, "but you can take a thread and unravel a whole sweater."

Adler, a contributing editor to the National Review Online, spent years at the conservative Competitive Enterprise Institute, whose litigation has shaped climate policy debates. "There is a view in the free market community that corporations don't fight back enough and are too quick to write the terms of their own surrender," he said.

Adler said he's not a fan of using RICO in the way that Kasowitz Benson Torres has, but that he has little sympathy for Greenpeace. And he has little doubt that companies are watching. "If weapons are lying around, someone's going to pick them up."

Bowe said he's spoken with several other companies interested in filing similar lawsuits—both before and after Energy Transfer Partners filed its case in August—though he would not disclose any details about the companies or potential targets. Speaking from his office, with plate glass windows looking over midtown Manhattan, he said his firm is not shopping the case to potential clients, that he's not allowed to do so.

A Perfect Boogeyman

In December 2014, as negotiators from around the world gathered in Lima, Peru, for the United Nations-sponsored climate change talks, a group of Greenpeace activists laid out a pro-renewable energy message in giant yellow fabric letters next to some of the nation's famous Nazca Lines, an archeological site, and photographed the episode from above. Many Peruvians, as well as diplomats at the talks, were outraged by the stunt, and the Peruvian government said the activists damaged the site.

Since its founding in 1971, Greenpeace has embraced high-profile antics of this kind, self-consciously skirting the edges of legality and politesse. Such stunts also make Greenpeace the perfect boogeyman for conservatives and corporate leaders who have long pushed back against environmentalists, considering them out-of-touch radicals bent on crippling economic growth. (The Nazca Lines episode, for which Greenpeace later apologized, is listed in Resolute's complaint as evidence that the organization does more harm than good.)

Greenpeace is international in reach, nipping at corporate heels no matter where they tread, and in recent decades it has become one of the most effective groups at pressing companies to reform. For any practice that Greenpeace wants changed—clearing Amazonian forests for soy or Indonesian ones for palm oil—the group searches the supply chain for a major consumer brand, asks it to change suppliers, and threatens to paint the brand as complicit in environmental pillage if it refuses. The group has sent protesters dressed as chickens to McDonald's franchises in London to protest logging in the Amazon, deftly mimicked a Dove soap public service announcement in Canada to link its parent company, Unilever, to deforestation in Indonesia, and dropped banners across the front of Procter & Gamble's corporate headquarters for its sourcing of palm oil. The rhetoric is blunt and hyperbolic. And it's been remarkably effective, winning concessions, cooperation and even grudging respect from some of the world's best-known corporations.

Kimberly-Clark agreed to boost its use of recycled and environmentally certified wood fiber in 2009 after a years-long Greenpeace campaign. McDonald's agreed to work with the group to reduce deforestation in the Amazon.

The case brought by Energy Transfer Partners is an explicit attack not just on Greenpeace and the other defendants, but on this method, which it calls the "Greenpeace Model."

The pipeline company says the groups damaged its business through their protests against the pipeline, and also through their campaigns to pressure banks to cut off funding. It says they lied about the company's practices—claiming it hadn't consulted with Indian tribes and that the project hadn't received a thorough environmental review. (In December, the Obama administration ordered a more comprehensive environmental review of alternative routes, but the Trump administration later approved the pipeline before the review was finished.) The suit also names nine other groups, including the Sierra Club, Earthjustice and 350.org, as members of this "criminal enterprise," though most are not included as defendants. Together with the Resolute suit, which names Stand, the cases go after much of this uncompromising wing of environmentalism.

"An attack against Greenpeace and Stand, two groups that have been really at the forefront of corporate campaigns, is not just an attack on those groups but is an attack on the strategy that NGOs have used to really bring corporations back under control in terms of their social and environmental behavior," said Michael Marx, who runs CorpEthics, an environmental consulting firm that works with several of the named groups. The two complaints list other campaigns that Greenpeace has engaged in, including its work against genetically modified crops, Canada's tar sands and a campaign on sustainable fishing. "If Resolute is successful in a case like this, they're basically taking away what's become what I think is the real driver in the social responsibility movement."

The lawsuits make extraordinary claims, comparing the named environmental organizations to "parasites" and a "predatory pack preying on legitimate businesses."

Many legal experts and environmentalists say it's too soon to say whether the suits will pose a serious threat to the groups or to the advocacy community more broadly. Several, including Galperin, say the cases are unlikely to prevail in court. They may not need to.

The Resolute suit has now been running for nearly a year and a half, requiring Greenpeace and Stand to spend money and time filing briefs, responding to motions and traveling to court. Defendants in the Energy Transfer Partners case face a similar road ahead, with the filing of motions scheduled to last into March.

"We're confident they're not going to win, we're confident they're going to spend probably 10 times as much money as we're going to spend defending it, but this is definitely a difficult thing to deal with," said Todd Paglia, Stand's executive director, who was also named as a defendant. Paglia said the group has already lost support from one funder who was wary of being dragged into a lawsuit, accounting for about a quarter of the funding for its boreal campaign. "If we weren't really aggressive in trying to control costs, in trying to maintain our focus on our campaigns and revenue, it would be very easy for this lawsuit to turn into not just harassment but a death threat."

Greenpeace is much larger, with nearly $400 million in revenue from all its international branches last year. If it lost in court, the payout could be substantial. (Energy Transfer Partners didn't specify a figure, saying it had lost hundreds of millions.) But Tom Wetterer, Greenpeace's general counsel, said that's not the goal, or the danger, of the cases. "I don't think it's really about the money," he said. "I think they have to realize that their claims are not strong, that it's the process that they're really focused on, regardless of the end. It's really a means to drag us through the legal process."

The groups have tried to capitalize on the lawsuits, too. Greenpeace launched a new campaign asking authors and publishers—whom Resolute supplies with paper—to denounce the company's tactics, while Stand has asked supporters to rally behind it, adding an appeal for donations at the bottom of the message.

Energy Transfer Partners declined to comment, referring questions to Bowe. In an interview in August, Kelcy Warren, the company's chief executive, told a North Dakota television station: "We've created this kind of tolerance where, oh my gosh, you can't challenge these people in fear that someone is going to say you're not a friend of the environment. That's nonsense," he said. "Could we get some monetary damages out of this thing, and probably will we? Yeah, sure. Is that my primary objective? Absolutely not. It's to send a message, you can't do this, this is unlawful and it's not going to be tolerated in the United States."

Seth Kursman, Resolute's vice president for corporate communications, said in a written statement to InsideClimate News that "we decided to draw the line, unapologetically and forcefully defending our integrity."

The case has already provided a document trail that's proved useful to the company and its allies. In a January filing, Greenpeace wrote that some of its statements were "heated rhetoric" and opinion, and not actionable. Richard Garneau, the company's chief executive, drew on the quote in a column in the National Review, calling it an admission of lying. The cry was echoed in the Daily Caller, the oil and gas industry website Energy in Depth, and a site run by the prominent corporate public relations consultant Richard Berman, whose firm's tagline is "Change the Debate." Energy Transfer Partners cited the filing in its complaint.

A dozen media and advocacy organizations, in a second amici curiae brief supporting Greenpeace and Stand, warned that using RICO was "clearly an attempt at an end-run around the protections of the First Amendment." For a more common defamation case to proceed, plaintiffs must prove a defendant acted with "actual malice," meaning they either knew what they said was false or they should have known. It's unclear, some legal experts say, whether the same standard would apply under RICO, potentially making it easier for the case to proceed to discovery.

Among those who signed the brief was the First Amendment Coalition, which until January was led by Peter Scheer. He said he doubts Resolute will win the RICO case, but that if it drags on long enough, it could provide a model for silencing defendants, particularly smaller groups.

"It is something that is greatly subject to abuse because it gives the plaintiff so much leverage that it can force the defendant to settle," he said. "It's like an extortion law."

Bowe, the lawyer representing the companies, dismissed the idea that RICO would weaken First Amendment protections.

"Certain elements of the statute allow you to hold people accountable who might not be held accountable in a defamation case," he said. "If you're dealing with a group acting in concert, you want to hold accountable those who are directing and funding and calling the shots, not just the people doing the acts."

Enter the John and Jane Does. "When you get to discovery, the whole iceberg becomes revealed," Bowe said. "There might be people there who you weren't aware were involved. It could be funders, it could be foundations." A foundation could be implicated, he said, if it was closely involved in the campaign and its messaging.

Marx, of CorpEthics, worries that the suits have already intimidated some of his peers. "Even the threat that companies might do it could have a chilling effect both on the foundations that fund this work and on the groups that do these kinds of campaigns," he said.

Several people interviewed for this article said that threat quietly murmurs in the back of their mind. One of them is Joshua Martin, among the targets of the private investigators. The investigators never identified themselves as working for the Kasowitz firm; Bowe would not confirm or deny that he sent them.

"As soon as that lawsuit came out, I read the whole thing and I looked for my name," Martin said. "I just think that psychologically that's interesting in terms of the potentially intimidating impact that the visits have."

Facing Tougher Regulations, TransCanada Scraps $12 Billion Oil Pipeline

"CALGARY, Alberta - TransCanada Corp abandoned its C$15.7 billion ($12.52 billion) cross-country Energy East pipeline on Thursday amid mounting regulatory hurdles, dealing a blow to the country’s oil export ambitions.

The demise of the pipeline comes less than a year after the Canadian government rejected another export pipeline, Enbridge Inc’s Northern Gateway, and is a further setback for Canada’s oil industry which is already hurting from low global crude prices.

The vast majority of Canadian crude exports go to the United States, and Energy East would have shipped 1.1 million barrels a day to east coast ports for loading onto tankers destined for higher-priced markets in Europe and Asia.

Panelists: Dangerous Inaction on Rising Seas

10/09/2017 by Ashita Gona

High tide brings the ocean over the dunes in Ocean Isle during an October 2015 storm. File photo by Christopher Surigao

RALEIGH — “Bottom line is we should not be building big buildings next to the beach.”

These were the words of Orrin Pilkey, an expert on coastal geology, during a panel discussion at a community forum on the effects of sea level rise on North Carolina last week. Pilkey and other panelists voiced strong opinions on how little state and local officials are doing to adapt to and prevent damage from sea level rise in the coming decades.


The forum, “Rising Seas: How will climate change affect the NC Coast?,” was part of a Community Voices series hosted by The News & Observer and WTVD-TV of Raleigh. The discussion took place at the North Carolina Museum of History on the evening of Wednesday, Sept. 27.

In addition to Pilkey, professor emeritus of earth and ocean sciences at Nicholas School of the Environment at Duke University, panelists included the following:

Astrid Caldas, senior climate scientist with the Climate and Energy program at the Union of Concerned Scientists.

Stanley R. Riggs, professor of geology at East Carolina University.

Greg “Rudi” Rudolph, Carteret County shore protection officer and a member of the Coastal Resources Commission’s science panel.

Todd Miller, executive director of the North Carolina Coastal Federation.

Ned Barnett, an opinion columnist and blog contributor at The News & Observer, moderated the panel. Barnett noted at the beginning of the forum that the panel included no one who doubts or rejects mainstream climate science.

“We cannot devote the little time we have tonight to a debate that there is even a problem to discuss,” he said.

Barnett emphasized the topic of sea level rise is relevant to the state and not talked about enough.

“The rise can seem too small to matter, a few inches a decade,” he said, “but that rise becomes more ominous when we consider that it is relatively new, starting around 1800, that it is accelerating, and that its effects can be compounded by the other consequences of global warming, heavier rainfall and more powerful storms.”

A report released in July by the Union of Concerned Scientists, an organization that, Caldas joked, calls itself the union of pissed-off scientists, found that some coastal regions that currently see only a few floods a year will likely see frequent and destructive floods in coming decades.


“Science is telling us that lots of localized floods are going to occur in the near term,” Caldas said, “and that substantial areas are going to become part of the tidal zone in the long term.”

She said that the research projected that Wilmington will go from experiencing a few tidal floods a year to as many as 150 by 2035. By 2045, the number of floods may be in excess of 350. She added that Duck is projected to see about 30 flooding events by 2035, and that the water will cover extensive expanses of land. By 2045, she said, Duck may see up to 150 extensive floods annually.

One of the biggest problems, Caldas said, will be the effect of this phenomenon on poor and minority communities, who feel the effects of sea level rise in different ways. Vulnerable communities, she said, tend to live in remote areas, and flooding on roads can make it difficult for them to reach resources and their jobs.

“Many socially, economically vulnerable communities are at the frontlines of this whole mess, with very few resources to cope,” she said.

Rudolph said that people in poor agricultural communities are also vulnerable to sea level rise as they may not be able to afford increasingly expensive flood insurance premiums.


“They’re in the flood zones,” he said, “if not, they’re going to be. Premiums going up because of storms. I see that as a big socioeconomic crisis, and it’s going to impact North Carolina hard.”

Pilkey said that the scientific consensus is that sea level will likely rise 3 feet by the end of the century and possibly another 6 inches, depending on the behavior of the west Antarctic ice sheet. In theory, he said, a foot of sea level rise could create 2,000 feet of shoreline retreat, or as much as 10,000 feet along parts of the Outer Banks.

“Which means that 1 foot of sea level rise could bring the shoreline back 2 miles,” he said regarding the Outer Banks.
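Pilkey's rule of thumb is simple enough to check with a few lines of arithmetic. The short sketch below is only an illustration of that back-of-the-envelope conversion, using the 2,000- and 10,000-feet-of-retreat-per-foot-of-rise multipliers he cited; it is not taken from any panel material.

```python
# Illustrative conversion of sea level rise to estimated shoreline retreat,
# using the rule-of-thumb multipliers cited by Pilkey (not an official model).

FEET_PER_MILE = 5280

def retreat_range_feet(rise_ft, low_mult=2_000, high_mult=10_000):
    """Return (low, high) estimated shoreline retreat in feet for a given rise."""
    return rise_ft * low_mult, rise_ft * high_mult

low, high = retreat_range_feet(1.0)
print(f"1 ft of rise: {low:,.0f} to {high:,.0f} ft of retreat, "
      f"or up to about {high / FEET_PER_MILE:.1f} miles on parts of the Outer Banks")
```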

As sea level rises, Riggs said the coast is becoming increasingly vulnerable to storm surges, but that society must develop new economies around the natural dynamics. Storms are a part of life, he said, and we must learn to live in harmony with them.

Natural approaches to storm management are still possible in North Carolina, as only half of the state’s coast is developed, Rudolph said.

“There’s a huge what’s called opportunity,” he said, “to advance a mixture of natural flood management, living shorelines and new infrastructure.”

The panel agreed that efforts to prevent damage from sea level rise should begin now rather than after a devastating event.

“Don’t expect enlightened policymaking in the aftermath of a storm,” Miller said, “everybody’s attention at that point is on recovery.”

Pilkey said that he believes that after 2 feet of sea level rise, beach re-nourishment will no longer be possible, leaving communities with two choices: Move buildings back or build a seawall. The best we can do now, he said, is to stop building large structures that cannot be moved.

As for people interested in owning coastal property, Riggs said a lack of information may leave them vulnerable to making risky purchases.

“Let’s at least require some statements on a deed that’s out there in the high-hazard areas,” Riggs said, “so a person who’s not familiar knows what they’re buying into.”

The final version of the state’s five-year update to the original 2010 sea-level rise report was released in March 2015.

Panel members expressed frustration at the gap between science and policy, with Pilkey saying the Coastal Resources Commission appears to be doing everything it can to promote development on the state’s coast.

Riggs referred to the 2010 sea level rise report produced by the CRC science panel, which predicted 39 inches of rise during the next century. The North Carolina General Assembly famously rejected the report, prompting nationwide attention and late-night TV jokes about the state “outlawing climate change,” and the panel was instructed to create a new report in 2015 that looked only 30 years into the future.

Riggs resigned from the panel in 2016, saying “I believe the once highly respected and effective science panel has been subtly defrocked and is now an ineffective body.”

He said during the panel discussion that education is key to putting pressure on policymakers to use science to craft coastal management policies.

“If we don’t have an educated public, we’ll never get off the ground with any of this,” Riggs said.

After hurricanes, Congress ponders future of flood insurance program

Congress is beginning to consider how to overhaul the flood insurance plan that many Texans are relying on to rebuild after Hurricane Harvey.

by Abby Livingston, Oct. 9, 2017


WASHINGTON - The devastating hit Houston took from Hurricane Harvey has exacerbated — and highlighted — the enormous financial jam facing the National Flood Insurance Program.

Thanks to the recent onslaught of hurricanes hitting Texas, Florida and Puerto Rico, there has never been a greater need for the program. But that need has also set off a new round of calls to dramatically overhaul a program that hasn't been able to sustain itself without major subsidies from the U.S. Treasury.

Republican U.S. Rep. Pete Olson's Sugar Land-based district suffered some of the most intensive flooding in the state. He said he is open to some changes, but not if it risks payouts on Harvey claims his constituents are filing. He is quick to underscore the desperation in his suburban Houston district and doesn't want changes to make things worse.

"It should be part of the package but not a do-all, end-all," Olson said of any potential overhaul.


Established in 1968 to help homeowners living in flood-prone areas that private insurers wouldn't cover, the program has never been on steady financial footing, and continued construction in low-lying areas — as well as more frequent and powerful storms attributed to climate change — have put the NFIP deeply in the red.

As a result, Congress repeatedly finds itself re-authorizing new money to support the program. Even before Hurricanes Harvey and Irma, the program was set to expire on Sept. 30, and no new insurance policies can be written until it is re-authorized.

Few home insurance policies cover flood damage, and nearly all U.S. flood insurance policies are issued through the program. To qualify for national flood insurance, a home must be in a community that has agreed to adopt and enforce various policies to reduce flooding risk. The program has more than 584,000 active policies in Texas, more than any state but Florida.

It's hard to find a member of Congress who will call for an outright abolition of the program, which would likely destabilize real estate markets and property tax bases in those areas. So the aim among some legislators is to pass laws that will encourage homeowners to move into the private market.

One option is to raise the premiums for government insurance to help sustain the program, discourage new construction in high-risk areas and hope that as rates rise, more private companies will enter the flood insurance market.

The fear of many lawmakers who represent these homeowners is that a substantial rise in rates will be more than some can afford.


The issue had the potential to become a crisis as congressional insiders worried that re-authorizing the program could get tangled up in fights over raising the debt ceiling and funding the government. But to the surprise of nearly everyone, President Trump cut a deal with Democratic leaders to re-authorize the program for the short-term and push all of those big decisions into December.

Now, activists and members of Congress who want to overhaul the flood insurance program have an opportunity to make their case over the next couple of months.

They argue that government-subsidized insurance encourages more people to build in flood-prone areas — which then forces the government to rebuild their homes after every flood at taxpayer expense.

"We keep rebuilding areas that are at a very expensive cost to taxpayers," said U.S. Rep. Vicente Gonzalez, D-McAllen.

Those advocating an overhaul include taxpayer watchdogs, environmentalists, insurance companies and members of Congress who oppose bailing out areas that allowed building in flood-prone areas. They're pushing for legislation that requires better flood plain mapping that takes climate change into account, stricter building regulations requiring measures like elevating homes and buildings to reduce flood risk, and setting sustainable insurance rates that won't shock the market.

A powerful trifecta of interest groups made up of bankers, real estate agents and home construction companies has fought these efforts.

In 2012, then-President Obama signed into law a major congressional overhaul of the flood insurance program. Among the changes: eliminating subsidies for homes that are repeatedly damaged by flooding.

But some homeowners and their representatives in Congress protested the steep price increases. In early 2014, Congress and Obama reversed course, passing into law a cap on premium increases that mostly unwound the 2012 changes.


Now, many House members are pushing to let the private market take over the job of insuring properties in flood zones.

Gonzalez serves on the U.S. House Financial Services Committee, which oversees flood insurance. A former attorney, he is no fan of what he describes as the program's drawn-out claims process.

"I think government shouldn't, probably, be in the business of insurance," he said.

U.S. Rep. Jeb Hensarling, R-Dallas and chair of the House Financial Services Committee, is a key player in this debate. He has pushed for reauthorization of the program, but wants to raise the premiums for government-sponsored policies to encourage homeowners to seek private insurance.

“As unfortunately the NFIP faces an uncertain future, we must ensure people have more affordable flood insurance options as they begin to rebuild," he said last week.

The first stab at changes came last week when two Floridians in Congress, U.S. Reps. Kathy Castor, a Democrat, and Dennis Ross, a Republican, attached legislation to an unrelated bill that would allow mortgage lenders to accept homeowners' use of private flood insurance instead of government insurance.

The Castor-Ross measure passed the U.S. House, but the Senate quickly axed it out of the bill. That drew a rebuke from Hensarling, who said the Senate was "letting an opportunity slip through its fingers to give flood victims and homeowners better and more affordable flood insurance options."

Supporters of the legislation in the House say they are undeterred, believing it's the most popular proposal for changing the program and will inevitably pass.

For now, no one on Capitol Hill seems inclined to increase the misery of those affected in Houston by drastically changing the flood insurance program for those who are currently filing claims. And the Florida and Texas delegations have vowed to combine their legislative firepower to protect their constituencies — members from both parties say protecting the program is a key priority.

"We've got to fix it because it's going bankrupt," said Olson, the congressman from Sugar Land. "People depend on it."

This story was produced in partnership with the Ravitch Fiscal Reporting Program at the CUNY Graduate School of Journalism.

Genetically boosting the nutritional value of corn could benefit millions

Rutgers scientists discover way to reduce animal feed and food production costs by increasing a key nutrient in corn

Rutgers University

Rutgers scientists have found an efficient way to enhance the nutritional value of corn - the world's largest commodity crop - by inserting a bacterial gene that causes it to produce a key nutrient called methionine, according to a new study.

The Rutgers University-New Brunswick discovery could benefit millions of people in developing countries, such as in South America and Africa, who depend on corn as a staple. It could also significantly reduce worldwide animal feed costs.

"We improved the nutritional value of corn, the largest commodity crop grown on Earth," said Thomas Leustek, study co-author and professor in the Department of Plant Biology in the School of Environmental and Biological Sciences. "Most corn is used for animal feed, but it lacks methionine - a key amino acid - and we found an effective way to add it."

The study, led by Jose Planta, a doctoral student at the Waksman Institute of Microbiology, was published online today in the Proceedings of the National Academy of Sciences.

Methionine, found in meat, is one of the nine essential amino acids that humans get from food, according to the National Center for Biotechnology Information. It is needed for growth and tissue repair, improves the tone and flexibility of skin and hair, and strengthens nails. The sulfur in methionine protects cells from pollutants, slows cell aging and is essential for absorbing selenium and zinc.

Every year, synthetic methionine worth several billion dollars is added to field corn seed, which lacks the substance in nature, said study senior author Joachim Messing, a professor who directs the Waksman Institute of Microbiology. The other co-author is Xiaoli Xiang of the Rutgers Department of Plant Biology and Sichuan Academy of Agricultural Sciences in China.

"It is a costly, energy-consuming process," said Messing, whose lab collaborated with Leustek's lab for this study. "Methionine is added because animals won't grow without it. In many developing countries where corn is a staple, methionine is also important for people, especially children. It's vital nutrition, like a vitamin."

Chicken feed is usually prepared as a corn-soybean mixture, and methionine is the sole essential sulfur-containing amino acid that's missing, the study says.

The Rutgers scientists inserted an E. coli bacterial gene into the corn plant's genome and grew several generations of corn. The E. coli enzyme - 3′-phosphoadenosine-5′-phosphosulfate reductase (EcPAPR) - spurred methionine production in just the plant's leaves instead of the entire plant to avoid the accumulation of toxic byproducts, Leustek said. As a result, methionine in corn kernels increased by 57 percent, the study says.

Then the scientists conducted a chicken feeding trial at Rutgers and showed that the genetically engineered corn was nutritious for the chickens, Messing said.

"To our surprise, one important outcome was that corn plant growth was not affected," he said.

In the developed world, including the U.S., meat proteins generally have lots of methionine, Leustek said. But in the developing world, subsistence farmers grow corn for their family's consumption.

"Our study shows that they wouldn't have to purchase methionine supplements or expensive foods that have higher methionine," he said.

Huge energy potential in open ocean wind farms in the North Atlantic

In wintertime, North Atlantic wind farms could provide sufficient energy to meet all of civilization's current needs

Carnegie Institution for Science

Washington, DC -- There is considerable opportunity for generating wind power in the open ocean, particularly the North Atlantic, according to new research from Carnegie's Anna Possner and Ken Caldeira. Their work is published by Proceedings of the National Academy of Sciences.

Because wind speeds are higher on average over ocean than over land, wind turbines in the open ocean could in theory intercept more than five times as much energy as wind turbines over land. This presents an enticing opportunity for generating renewable energy through wind turbines. But it was unknown whether the faster ocean winds could actually be converted to increased amounts of electricity.
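The "more than five times as much energy" figure reflects how strongly wind power depends on wind speed: the power available to a turbine scales roughly with the cube of the speed. The sketch below is a back-of-the-envelope illustration of that relationship, not a reproduction of the study's calculations.

```python
# Wind power density scales roughly as P ~ 0.5 * rho * v**3 per unit of swept area,
# so a fivefold difference in available energy implies a wind-speed ratio equal to
# the cube root of five. Purely illustrative; not a figure from the study.

energy_ratio = 5.0
speed_ratio = energy_ratio ** (1 / 3)
print(f"A {energy_ratio:.0f}x energy advantage implies average winds about "
      f"{speed_ratio:.2f} times faster over the open ocean than over land.")
```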

"Are the winds so fast just because there is nothing out there to slow them down? Will sticking giant wind farms out there just slow down the winds so much that it is no better than over land?" Caldeira asked.

Most of the energy captured by large wind farms originates higher up in the atmosphere and is transported down to the surface where the turbines may extract this energy. Other studies have estimated that there is a maximum rate of electricity generation for land-based wind farms, and have concluded that this maximum rate of energy extraction is limited by the rate at which energy is moved down from faster, higher up winds.

"The real question is," Caldeira said, "can the atmosphere over the ocean move more energy downward than the atmosphere over land is able to?"

Possner and Caldeira's sophisticated modeling tools compared the productivity of large Kansas wind farms to massive, theoretical open-ocean wind farms and found that in some areas ocean-based wind farms could generate at least three times more power than the ones on land.

In the North Atlantic, in particular, the drag introduced by wind turbines would not slow down winds as much as they would on land, Possner and Caldeira found. This is largely due to the fact that large amounts of heat pour out of the North Atlantic Ocean and into the overlying atmosphere, especially during the winter. This contrast in surface warming along the U.S. coast drives the frequent generation of cyclones, or low-pressure systems, that cross the Atlantic and are very efficient in drawing the upper atmosphere's energy down to the height of the turbines.

"We found that giant ocean-based wind farms are able to tap into the energy of the winds throughout much of the atmosphere, whereas wind farms onshore remain constrained by the near-surface wind resources," Possner explained.

However, this tremendous wind power is very seasonal. While North Atlantic wind farms could provide enough energy in winter to meet all of civilization's current needs, in the summer they could generate only enough power to cover the electricity demand of Europe, or possibly the United States alone.

Wind power production in the deep waters of the open ocean is in its infancy of commercialization. The huge wind power resources identified by the Possner and Caldeira study provide strong incentives to develop lower-cost technologies that can operate in the open-ocean environment and transmit this electricity to land where it can be used.

This study was supported by the Fund for Innovative Climate and Energy Research and the Carnegie Institution for Science endowment.

The Carnegie Institution for Science is a private, nonprofit organization headquartered in Washington, D.C., with six research departments throughout the U.S. Since its founding in 1902, the Carnegie Institution has been a pioneering force in basic scientific research. Carnegie scientists are leaders in plant biology, developmental biology, astronomy, materials science, global ecology, and Earth and planetary science.

The hidden environmental costs of dog and cat food

By Karin Brulliard August 4

The kibble in a dog’s bowl has a significant environmental pawprint, a study found. (Bigstock)

Gregory Okin is quick to point out that he does not hate dogs and cats. Although he shares his home with neither — he is allergic, so his pets are fish — he thinks it is fine if you do. But if you do, he would like you to consider what their meat-heavy kibble and canned food are doing to the planet.

Okin, a geographer at UCLA, recently did that, and the numbers he crunched led to some astonishing conclusions. America’s 180 million or so Rovers and Fluffies gulp down about 25 percent of all the animal-derived calories consumed in the United States each year, according to Okin’s calculations. If these pets established a sovereign nation, it would rank fifth in global meat consumption.

Needless to say, producing that meat — which requires more land, water and energy and pollutes more than plant-based food — creates a lot of greenhouse gases: as many as 64 million tons annually, or about the equivalent of driving more than 12 million cars around for a year. That doesn’t mean pet-keeping must be eschewed for the sake of the planet, but “neither is it an unalloyed good,” Okin wrote in a study published this week in PLOS One.

“If you are worried about the environment, then in the same way you might consider what kind of car you buy … this is something that might be on your radar,” Okin said in an interview. “But it’s not necessarily something you want to feel terrible about. ”

This research was a departure for Okin, who typically travels the globe to study deserts — things such as wind erosion, dust production and plant-soil interactions. But he said the backyard chicken trend in Los Angeles got him thinking about “how cool it is” that pet chickens make protein, while dogs and cats eat protein. And he discovered that even as interest grows in the environmental impact of our own meat consumption, there has been almost no effort to quantify the part our most common pets play.

To do that, Okin turned to dog and cat population estimates from the pet industry, average animal weights, and ingredient lists in popular pet foods. The country’s dogs and cats, he determined, consume about 19 percent as many calories as the human population, or about as much as 62 million American people. But because their diets are higher in protein, the pets’ total animal-derived calorie intake amounts to about 33 percent of that of humans.
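Okin's headline comparisons can be roughly reproduced with simple arithmetic. In the sketch below, the U.S. population and the per-car emissions figure are outside assumptions (about 325 million people and roughly 4.6 metric tons of CO2-equivalent per passenger car per year), not numbers from the study itself.

```python
# Rough sanity check of two comparisons reported in Okin's study.
# US_POPULATION and CO2E_PER_CAR_TONS are assumed reference values, not study data.

US_POPULATION = 325e6            # assumed U.S. population, circa 2017
CO2E_PER_CAR_TONS = 4.6          # assumed annual emissions of a typical passenger car

calorie_share = 0.19             # pets eat ~19% as many calories as the human population
pet_food_emissions_tons = 64e6   # up to 64 million tons CO2-equivalent per year

people_equivalent = calorie_share * US_POPULATION
car_equivalent = pet_food_emissions_tons / CO2E_PER_CAR_TONS

print(f"Pet calorie intake is comparable to ~{people_equivalent / 1e6:.0f} million people")
print(f"Pet food emissions are comparable to ~{car_equivalent / 1e6:.0f} million cars a year")
```

With those assumptions the output lands near the article's figures: about 62 million people and roughly 14 million cars, consistent with "more than 12 million."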

Okin’s numbers are estimates, but they do “a good job of giving us some numbers that we can talk about,” said Cailin Heinze, a veterinary nutritionist at Tufts University’s Cummings School of Veterinary Medicine who has written about the environmental impact of pet food. “They bring up a really interesting discussion.”

Okin warns that the situation isn’t likely to improve any time soon. Pet ownership is on the rise in developing countries such as China, which means the demand for meaty pet food is, too. And in the United States, the growing idea of pets as furry children has led to an expanding market of expensive, gourmet foods that sound like Blue Apron meals. That means not just kale and sweet potato in the ingredient list, but grain-free and “human-grade” concoctions that emphasize their use of high-quality meat rather than the leftover “byproducts” that have traditionally made up much of our pets’ food.

“The trend is that people will be looking for more good cuts of meat for their animals and more high-protein foods for their animals,” Okin said.

Cats are ‘obligate carnivores,’ which means they need more protein than dogs. (iStock)

What to do about this? That’s the hard part. Heinze said one place to start is by passing on the high-protein or human-grade foods. Dogs and cats do need protein — and cats, which are obligate carnivores, really do need meat, she said. But the idea that they should dine on the equivalent of prime rib and lots of it comes from what she calls “the pet food fake news machine.” There’s no need to be turned off by some plant-based proteins in a food’s ingredients, she said, and dog owners in particular can look for foods with lower percentages of protein.

The term human-grade implies that a product is using protein that humans could eat, she added. Meat byproducts — all the organs and other animal parts that don’t end up at the supermarket — are perfectly fine, she said.

“Dogs and cats happily eat organ meat,” Heinze said. “Americans do not.”

Okin has some thoughts about that. The argument that pet foods’ use of byproducts is an “efficiency” in meat production is based on the premise that offal and organs are gross, he says. (Look no further than the collective gag over a finely textured beef product known as “pink slime.”) But if we would reconsider that, his study found, about one-quarter of all the animal-derived calories in pet food would be sufficient for all the people of Colorado.

“I’ve traveled around the world and I’m cognizant that what is considered human edible is culture-specific,” he said. “Maybe we need to have a conversation about what we will eat.”

In the meantime, Okin suggests that people thinking about getting a dog might consider a smaller one — a terrier rather than a Great Dane, say. Or, if you think a hamster might fulfill your pet desires, go that route.

Heinze, for her part, sometimes offers the same counsel to vegetarian or vegan clients who want their pets to go meat-free. They are typically motivated by animal welfare concerns, not environmental ones, she said, but such diets are not always best for dogs, and they never are for cats.

“There have been a few times in my career where I’ve honestly said to my clients, ‘We need to find a new home for your pet,'” she said, “‘and you need to get a rabbit or a guinea pig or something like that.’”

Materials emitted by a water pipe-repair method may pose health risks, new safeguards and research needed

WEST LAFAYETTE, Ind. — New research is calling for immediate safeguards and the study of a widely used method for repairing sewer-, storm-water and drinking-water pipes to understand the potential health and environmental concerns for workers and the public.

The procedure, called cured-in-place pipe repair, or CIPP, was invented in the 1970s. It involves inserting a resin-impregnated fabric tube into a damaged pipe and curing it in place with hot water or pressurized steam, sometimes with ultraviolet light. The result is a new plastic pipe manufactured inside the damaged one. The process can emit chemicals into the air, sometimes in visible plumes, and can expose workers and the public to a mixture of compounds that can pose potential health hazards, said Andrew Whelton, an assistant professor in Purdue University’s Lyles School of Civil Engineering and the Environmental and Ecological Engineering program.

He led a team of researchers who conducted a testing study at seven steam-cured CIPP installations in Indiana and California. The researchers captured the emitted materials and measured their concentrations, including styrene, acetone, phenol, phthalates and other volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs).

Results from their air testing study are detailed in a paper appearing on July 26 in Environmental Science & Technology Letters. The study files can freely be downloaded and are open-access, and the paper is available at http://pubs.acs.org/doi/10.1021/acs.estlett.7b00237.

Findings show that the chemical plume, commonly thought of as harmless steam, was actually a complex mixture of organic vapor, water vapor, particulates of condensable vapor and partially cured resin, and liquid droplets of water and organic chemicals. A YouTube video is available at https://youtu.be/rBMOoa2XcJI.

“CIPP is the most popular water-pipe rehabilitation technology in the United States,” Whelton said. “Short- and long-term health impacts caused by chemical mixture exposures should be immediately investigated. Workers are a vulnerable population, and understanding exposures and health impacts to the general public is also needed.”

The researchers have briefed the National Institute for Occupational Safety and Health (NIOSH) about their findings. NIOSH is part of the Centers for Disease Control and Prevention and has occupational safety and health experts who can investigate workplace hazards.

Purdue researchers captured the chemical plume materials from two sanitary sewer-pipe installations and five storm-water pipe installations. Samples were analyzed using gas chromatography, thermal and spectroscopic techniques. Chemicals found included hazardous air pollutants, suspected endocrine disrupting chemicals, and known and suspected carcinogens. Emissions were sometimes highly concentrated and affected by wind direction, speed and the worker’s activities, Whelton said.

A waxy substance was found in the air, and materials engineers determined it was partially cured plastic, styrene monomer, acetone, and unidentified chemicals.

No respiratory protection was used by CIPP workers, and a review of online videos, images and construction contracts indicates respiratory safety equipment use was not typical, he said.

To evaluate chemical plume toxicity, pulmonary toxicologist and assistant professor Jonathan Shannahan and a graduate student exposed captured materials to mouse lung cells. Plume samples from two of four sites tested displayed toxicity effects and two did not.

Workers repair a damaged pipe using the CIPP process, which emits a white plume containing a complex mixture of vapors and chemical droplets. The workers wore no respiratory protection.

“This suggests that there are operational conditions that may decrease the potential for hazardous health effects,” Shannahan said. “Since exposures can be highly variable in chemical composition, concentration, and exposure duration, our findings demonstrate the need for further investigation.”

At the same time, existing testing methods are not capable of documenting this multi-phase chemical exposure, Whelton said.

Findings Contradict Conventional Thinking

In 2017, Whelton completed a one-and-a-half-day CIPP construction inspector course for consulting and municipal engineers and contractors as part of a different research project working with state transportation agencies.

“CIPP workers, the public, water utilities, and engineers think steam is emitted,” he said. “What we found was not steam. Even when the chemical plume was not visible, our instruments detected that we were being chemically exposed.”

The paper was authored by graduate students Seyedeh Mahboobeh Teimouri Sendesi, Kyungyeon Ra, Mohammed Nuruddin, and Lisa M. Kobos; undergraduate student Emily N. Conkling; Brandon E. Boor, an assistant professor of civil engineering; John A. Howarter, an assistant professor of materials engineering and environmental and ecological engineering; Jeffrey P. Youngblood, a professor of materials engineering; Shannahan; Chad T. Jafvert, a professor of civil engineering and environmental and ecological engineering; and Whelton.

The new research also contains results from the team’s Freedom of Information Act requests to cities and utilities. This information is contained in the supporting-information section of the research paper. Forty-nine public reports of chemical air contamination associated with CIPP activities were found. Complaints filed by homeowners and businesses, who anecdotally described strong and lingering chemical odors and illness symptoms, were also described.

Additional research is needed, particularly because the procedure has not been well studied for health and environmental risks, said Howarter.

“The CIPP process is actually a brilliant technology,” Howarter said. “Health and safety concerns, though, need to be addressed. We are not aware of any study that has determined what exposure limit to the chemical mixture is safe. We are not aware of any study that indicates that skin exposure or inhaling the multi-phase mixture is safe. We also are not aware of any study that has examined the persistence of this multi-phase mixture in the environment.”

Whelton said, “In the documents we reviewed, contractors, utilities, and engineering companies have told people who complain about illness symptoms that their exposures are not a health risk. It’s unclear what data are used for these declarations.”

Utilities and municipalities sometimes cite worker chemical exposure standards established by NIOSH as acceptable for the general public.

“That comparison is wrong and invalid,” Whelton said. “Worker safety exposure standards should not be cited as acceptable exposures for the general public. For example, children have less of an ability to handle a chemical exposure than healthy adults, and worker exposure standards do not account for the multi-phase exposure. CIPP workers can be exposed to more than one chemical at a time, in addition to droplets, vapor and particulates.”

Testing Results Show Need For Worksite Changes

Because of the air testing results obtained at CIPP installations on the Purdue campus – two of the seven test sites studied – Purdue required the faculty and students to better protect themselves from inhaling the chemicals emitted. The researchers were required to wear full-facemask carbon filter respirators during their testing in California.

Meanwhile, the uncured chemicals also might pose hazards: The nitrile gloves Whelton wore on one occasion deteriorated from contact with the uncured resin tube. The researchers observed that workers sometimes did not wear gloves while handling the materials. Some images and videos online also show workers not wearing gloves while handling the chemicals, he said.

“Workers should always wear appropriately thick chemically resistant gloves while handling uncured resin tubes,” Whelton said. “No one should handle these materials with their bare hands.”

He said health officials must be alerted when people complain about odors near CIPP sites and illness so they can investigate.

“Health officials we have spoken with were unaware of illness and odor complaints from CIPP activities,” Whelton said. “Local and state health departments should be involved in all public chemical exposure and illness complaints. They are trained on medical assessments.”

Often, the public instead speaks with CIPP contractors and engineers about their illness complaints, which Whelton said must change.

“Engineers should act immediately because it is their professional responsibility to hold paramount the safety, health, and welfare of the public according to their code of ethics,” he said. “We have seen evidence that companies and utilities do not understand what materials are created and emitted by CIPP processes or the consequences of exposure. Our new study indicates workers, the public, and the environment needs to be better protected from harm.”

The researchers are working to develop safeguards, including a new type of handheld analytical device that would quickly indicate whether the air at a worksite is safe. A patent application has been filed through the Purdue Research Foundation’s Office of Technology Commercialization.

The research was primarily funded by a RAPID response grant from the National Science Foundation. Additional support was provided by public donations and by Purdue University.

The study follows a discovery three years ago when researchers reported that chemicals released by CIPP activities into waterways have been linked to fish kills, contaminated drinking water supplies, and negative impacts to wastewater treatment plants. A previous research paper, available at http://pubs.acs.org/doi/abs/10.1021/es5018637, demonstrated that chemicals released into water by CIPP sites can be toxic, and the CIPP waste dissolved freshwater test organisms within 24 hours at room temperature.

Donations to continue this research can be made at http://Giving.Purdue.edu/WaterPipeSafety.

Writer: Emil Venere, 765-494-4709, venere@purdue.edu

Source: Andrew Whelton, 765-494-2160, awhelton@purdue.edu

ABSTRACT

Worksite Chemical Air Emissions and Worker Exposure during Sanitary Sewer and Stormwater Pipe Rehabilitation Using Cured-in-Place-Pipe (CIPP)

Seyedeh Mahboobeh Teimouri Sendesi†, Kyungyeon Ra‡, Emily N. Conkling‡, Brandon E. Boor†, Md. Nuruddin§, John A. Howarter‡§, Jeffrey P. Youngblood§, Lisa M. Kobos∥, Jonathan H. Shannahan∥, Chad T. Jafvert†‡, and Andrew J. Whelton*†‡

† Lyles School of Civil Engineering, Purdue University, West Lafayette, Indiana 47907, United States

‡ Division of Environmental and Ecological Engineering, Purdue University, West Lafayette, Indiana 47907, United States

§ School of Materials Engineering, Purdue University, West Lafayette, Indiana 47907, United States

∥ School of Health Sciences, Purdue University, West Lafayette, Indiana 47907, United States

​Chemical emissions were characterized for steam-cured cured-in-place-pipe (CIPP) installations in Indiana (sanitary sewer) and California (stormwater). One pipe in California involved a low-volatile organic compound (VOC) non-styrene resin, while all other CIPP sites used styrene resins. In Indiana, the uncured resin contained styrene, benzaldehyde, butylated hydroxytoluene (BHT), and unidentified compounds. Materials emitted from the CIPP worksites were condensed and characterized. An emitted chemical plume in Indiana was a complex multiphase mixture of organic vapor, water vapor, particulate (condensable vapor and partially cured resin), and liquid droplets (water and organics). The condensed material contained styrene, acetone, and unidentified compounds. In California, both styrene and low-VOC resin condensates contained styrene, benzaldehyde, benzoic acid, BHT, dibutyl phthalate, and 1-tetradecanol. Phenol was detected only in the styrene resin condensate. Acetophenone, 4-tert-butylcyclohexanol, 4-tert-butylcyclohexanone, and tripropylene glycol diacrylate were detected only in the low-VOC condensate. Styrene in the low-VOC condensate was likely due to contamination of contractor equipment. Some, but not all, condensate compounds were detected in uncured resins. Two of four California styrene resin condensates were cytotoxic to mouse alveolar type II epithelial cells and macrophages. Real-time photoionization detector monitoring showed emissions varied significantly and were a function of location, wind direction, and worksite activity.

Human-caused warming likely led to recent streak of record-breaking temperatures

American Geophysical Union

WASHINGTON D.C. - It is "extremely unlikely" 2014, 2015 and 2016 would have been the warmest consecutive years on record without the influence of human-caused climate change, according to the authors of a new study.

Temperature records were first broken in 2014, when that year became the hottest year since global temperature records began in 1880. These temperatures were then surpassed in 2015 and 2016, making last year the hottest year ever recorded. In 2016, the average global temperature across land and ocean surface areas was 0.94 degrees Celsius (1.69 degrees Fahrenheit) above the 20th century average of 13.9 degrees Celsius (57.0 degrees Fahrenheit), according to NOAA.

Combining historical temperature data and state-of-the-art climate model simulations, the new study finds the likelihood of experiencing consecutive record-breaking global temperatures from 2014 to 2016 without the effects of human-caused climate change is no greater than 0.03 percent and the likelihood of three consecutive record-breaking years happening any time since 2000 is no more than 0.7 percent. When anthropogenic warming is considered, the likelihood of three consecutive record-breaking years happening any time since 2000 rises to as high as 50 percent, according to the new study.

That means human-caused climate change is very likely to blame for the three consecutive record-hot years, according to the new study accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

"With climate change, this is the kind of thing we would expect to see. And without climate change, we really would not expect to see it," said Michael Mann, a climate scientist at Pennsylvania State University in State College, Pennsylvania, and lead author of the new study.

Greenhouse gases, like carbon dioxide and methane, accumulate in the atmosphere and trap heat that would otherwise escape into space. Excess greenhouse gases from industrial activities, like burning fossil fuels, are trapping additional heat in the atmosphere, causing the Earth's temperatures to rise. The average surface temperature of the planet has risen about 1.1 degrees Celsius (2.0 degrees Fahrenheit) since the late 19th century, and the past 35 years have seen a majority of the warming, with 16 of the 17 warmest years on record occurring since 2001, according to NASA.

Scientists are now trying to characterize the relationship between yearly record high temperatures and human-caused global warming.

In response to the past three years' record-breaking temperatures, authors of the new study calculated the likelihood of observing a three-year streak of record high temperatures since yearly global temperature records began in the late 19th century and the likelihood of seeing such a streak since 2000, when much of the warming has been observed. The study's authors determined how likely this kind of event was to happen both with and without the influence of human-caused warming.

The new study considers that each year is not independent of the ones coming before and after it, in contrast to previous estimates that assumed individual years are statistically independent from each other. There are both natural and human events that make temperature changes cluster together, such as climate patterns like El Niño, the solar cycle and volcanic eruptions, according to Mann.

When this dependency is taken into account, the likelihood of these three consecutive record-breaking years occurring since 1880 is about 0.03 percent in the absence of human-caused climate change. When the long-term warming trend from human-caused climate change is considered, the likelihood of 2014-2016 being the hottest consecutive years on record since 1880 rises to between 1 and 3 percent, according to the new study.

The probability that this series of record-breaking years would be observed at some point since 2000 is less than 0.7 percent without the influence of human-caused climate change, but between 30 and 50 percent when the influence of human-caused climate change is considered, the new study finds.

If human-caused climate change is not considered, the warming observed in 2016 would have about a 1-in-a-million chance of occurring, compared with a nearly 1-in-3 chance when anthropogenic warming is taken into account, according to the study.
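To make that contrast concrete, here is a minimal Monte Carlo sketch, written in Python, of the reasoning described above: it asks how often the last three years of an 1880-2016 temperature series each set a new record, first with trendless year-to-year noise and then with a steady warming trend added. The trend size, the noise level and the use of independent noise are illustrative assumptions for this sketch only; the study itself combined historical data with climate-model simulations and accounted for year-to-year dependence, so the numbers printed here are not the study's estimates.

import numpy as np

# Toy illustration only: independent Gaussian "temperatures," with and without a
# linear warming trend. This is not the method, and does not reproduce the numbers,
# of the study described above.
rng = np.random.default_rng(42)
years = np.arange(1880, 2017)
trend = 1.1 * (years - years[0]) / (years[-1] - years[0])  # assumed ~1.1 C warming over the record
sigma = 0.12                                               # assumed interannual noise, degrees C

def streak_probability(with_trend, n_sims=20000):
    """Fraction of simulated series in which the last three years each set a record."""
    hits = 0
    for _ in range(n_sims):
        temps = rng.normal(0.0, sigma, years.size)
        if with_trend:
            temps = temps + trend
        if (temps[-3] > temps[:-3].max()
                and temps[-2] > temps[:-2].max()
                and temps[-1] > temps[:-1].max()):
            hits += 1
    return hits / n_sims

# Without a trend, a record streak in the final three years of a 137-year series is
# vanishingly rare (well under one in a million), so a run of this size will usually
# print 0.0; with the trend added, the streak becomes orders of magnitude more likely.
print("without trend:", streak_probability(False))
print("with trend:   ", streak_probability(True))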

The results make it difficult to ignore the role human-caused climate change is having on temperatures around the world, according to Mann. Rising global temperatures are linked to more extreme weather events, such as heat waves, floods, and droughts, which can harm humans, animals, agriculture and natural resources, he said.

"The things that are likely to impact us most about climate change aren't the averages, they're the extremes," Mann said. "Whether it's extreme droughts, or extreme floods, or extreme heat waves, when it comes to climate change impacts ... a lot of the most impactful climate related events are extreme events. The events are being made more frequent and more extreme by human-caused climate change."

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing 60,000 members in 137 countries.

Scientists discover 91 new volcanoes below Antarctic ice sheet

This is in addition to the 47 already known about, and an eruption would melt more ice in a region already affected by climate change

Saturday 12 August 2017 18.11 EDT


Scientists have uncovered the largest volcanic region on Earth – two kilometres below the surface of the vast ice sheet that covers west Antarctica.

The project, by Edinburgh University researchers, has revealed almost 100 volcanoes – with the highest as tall as the Eiger, which stands at almost 4,000 metres in Switzerland.

Geologists say this huge region is likely to dwarf east Africa’s volcanic ridge, currently rated the densest concentration of volcanoes in the world.

And the activity of this range could have worrying consequences, they have warned. “If one of these volcanoes were to erupt it could further destabilise west Antarctica’s ice sheets,” said glacier expert Robert Bingham, one of the paper’s authors. “Anything that causes the melting of ice – which an eruption certainly would – is likely to speed up the flow of ice into the sea.

“The big question is: how active are these volcanoes? That is something we need to determine as quickly as possible.”

The Edinburgh volcano survey, reported in the Geological Society’s special publications series, involved studying the underside of the west Antarctica ice sheet for hidden peaks of basalt rock similar to those produced by the region’s other volcanoes. Their tips actually lie above the ice and have been spotted by polar explorers over the past century.

But how many lie below the ice? This question was originally asked by the team’s youngest member, Max Van Wyk de Vries, an undergraduate at the university’s school of geosciences and a self-confessed volcano fanatic. He set up the project with the help of Bingham. Their study involved analysing measurements made by previous surveys, which involved the use of ice-penetrating radar, carried either by planes or land vehicles, to survey strips of the west Antarctic ice.

The results were then compared with satellite and database records and geological information from other aerial surveys. “Essentially, we were looking for evidence of volcanic cones sticking up into the ice,” Bingham said.

After the team had collated the results, it reported a staggering 91 previously unknown volcanoes, adding to the 47 others that had been discovered over the previous century of exploring the region.

These newly discovered volcanoes range in height from 100 to 3,850 metres. All are covered in ice, which sometimes lies in layers that are more than 4km thick in the region. These active peaks are concentrated in a region known as the west Antarctic rift system, which stretches 3,500km from Antarctica’s Ross ice shelf to the Antarctic peninsula.

“We were amazed,” Bingham said. “We had not expected to find anything like that number. We have almost trebled the number of volcanoes known to exist in west Antarctica. We also suspect there are even more on the bed of the sea that lies under the Ross ice shelf, so that I think it is very likely this region will turn out to be the densest region of volcanoes in the world, greater even than east Africa, where mounts Nyiragongo, Kilimanjaro, Longonot and all the other active volcanoes are concentrated.”

The discovery is particularly important because the activity of these volcanoes could have crucial implications for the rest of the planet. If one erupts, it could further destabilise some of the region’s ice sheets, which have already been affected by global warming. Meltwater outflows into the Antarctic ocean could trigger sea level rises. “We just don’t know about how active these volcanoes have been in the past,” Bingham said.

However, he pointed to one alarming trend: “The most volcanism that is going on in the world at present is in regions that have only recently lost their glacier covering – after the end of the last ice age. These places include Iceland and Alaska.

“Theory suggests that this is occurring because, without ice sheets on top of them, there is a release of pressure on the regions’ volcanoes and they become more active.”

And this could happen in west Antarctica, where significant warming in the region caused by climate change has begun to affect its ice sheets. If they are reduced significantly, this could release pressure on the volcanoes that lie below and lead to eruptions that could further destabilise the ice sheets and enhance sea level rises that are already affecting our oceans.

“It is something we will have to watch closely,” Bingham said.

Federal Court rejects Sierra Club challenge to Texas natural gas export project

By Devin Henry - 08/15/17 11:09 AM EDT

A federal court has rejected a challenge to a major liquefied natural gas (LNG) export terminal in Texas.

In its Tuesday decision, the Court of Appeals for the District of Columbia ruled that the Department of Energy (DOE) conducted the necessary environmental and economic reviews before it approved the Freeport LNG export terminal project.

The Sierra Club had challenged DOE’s review of the project, saying the agency didn’t comply with federal environmental laws before approving the Freeport terminal in 2014, and that its determination that the terminal is in the “public interest” was flawed.

But the court batted down both contentions on Tuesday, ruling DOE followed procedures outlined in the National Environmental Policy Act (NEPA) when it approved the project. The court rejected the Sierra Club’s argument that DOE should have considered indirect environmental impacts of increasing LNG exports.

“Under our limited and deferential review, we cannot say that the Department failed to fulfill its obligations under NEPA by declining to make specific projections about environmental impacts stemming from specific levels of export-induced gas production,” the court determined in an opinion written by Judge Robert Wilkins, an Obama appointee.

The court also ruled that DOE’s determination that the project is in the public interest properly considered domestic economic impacts, foreign policy goals and energy security measures. Exports to countries that do not have a free-trade agreement with the United States must obtain such a public interest determination.

In a statement, the Sierra Club said it was “disappointed with the court's refusal to require DOE to use available tools to inform communities of the impact of this additional fracking prior to approving exports."

“This LNG export approval creates unnecessary risks for the people of Freeport, Texas, and for every community that is saddled with fracking rigs next to their homes, schools and public spaces,” Nathan Matthews, a Sierra Club staff attorney, said.

Tuesday’s decision is the latest development in a lengthy legal fight over the Freeport terminal, which is due to come online next year, and other natural gas export facilities.

The D.C. Circuit had previously rejected environmentalists’ complaints over the Federal Energy Regulatory Commission’s approval decisions for the Freeport project and another terminal in Louisiana.

Natural Gas flaring rises despite N.D. regulations

Mike Lee, E&E News reporter for Energywire: Tuesday, August 15, 2017

Gas flaring in North Dakota's Bakken Shale oil field has been creeping back up, blunting the state's efforts to tame a problem that came to symbolize the excesses of the oil boom.

The volume of gas burned in flares reached 222 million cubic feet a day in June, a 31 percent increase from the same month last year, when the volume was 170 million cubic feet. That's still far lower than the peak in 2014, but critics said the turnaround shows the limits of North Dakota's industry-friendly regulations.

The flares release carbon dioxide and raw methane, both of which contribute to global climate change. Methane from flares is the biggest source of greenhouse gas emissions from North Dakota's oil and gas sector, according to the Environmental Defense Fund.

Landowners have objected to the waste of gas from their property, and residents who live in the oil field worry that the emissions from flaring could cause health problems, said Don Morrison, president of the Dakota Resource Council, whose members include farmers and ranchers in the Bakken region.

"What we can do about it is not issue a permit to drill until there is some place for the gas to go," Morrison said. "Other states have figured it out. I'm not sure why North Dakota can't figure it out, too."

Like other oil fields, the Bakken Shale produces large amounts of gas along with its crude. Unlike other fields, though, it has historically lacked the pipelines and processing plants needed to get gas to market. While oil companies can still turn a profit by trucking their crude to collection points, they argued they had no option but to burn the gas in flares at individual well sites.

Most other oil-producing states don't allow long-term flaring.

The practice drew widespread attention in 2011, when The New York Times featured a front-page story with a picture of a flare lighting up the surrounding prairie. At the time, producers in North Dakota were burning about 180 million cubic feet per day — more than a third of the gas produced in the state.

The publicity led the administration of then-Gov. Jack Dalrymple (R) in 2014 to pass regulations aimed at limiting flaring by 2020, and the oil industry invested billions of dollars in infrastructure to move the gas to market.

The state Industrial Commission, which the governor leads, set a goal of reducing flaring to 10 percent of total gas production and later reduced the final level to 9 percent. The state also required companies to submit plans for how they'll capture gas when they apply for an oil well permit (Energywire, July 2, 2014).

The Sierra Club and Environmental Defense Fund argued at the time that the state should set a limit on the total volume of gas that can be flared. Setting a percentage limit allows the volume to rise when production rises.

"The planet doesn't care what percentage of gas you're wasting — the planet cares about the volume," said Dan Grossman, director of state programs at the Environmental Defense Fund.

The amount of gas burned in flares in the Bakken Shale oil field has crept back up in the last year, despite state regulations intended to limit the practice. North Dakota Department of Mineral Resources/North Dakota Pipeline Authority

Regulators at the state Department of Mineral Resources considered a volume limit when they wrote the rules in 2014 but rejected the idea because it would be too hard to enforce, said Alison Ritter, a spokeswoman for the agency.

Overall, the program has worked, she said. At the peak of the oil boom in December 2014, producers in North Dakota were flaring 374 million cubic feet a day, or 25 percent of production. By April 2016, the volume was down to 144 million cubic feet a day, or 8.8 percent of production.

In the last year, though, the oil bust has forced producers to shift their drilling into the most productive parts of the field, which also produce the most gas, according to the Department of Mineral Resources.

The increase in production means that more gas has been flared, even though the percentage has stayed roughly flat.

By May of this year, production grew to 1.85 billion cubic feet a day and flaring hit 203 million cubic feet a day, or about 10.9 percent of production.

This June, shutdowns at processing plants and a pipeline operator pushed flaring up to 222 million cubic feet a day, or about 12 percent of production, even though production stayed flat, according to the Department of Mineral Resources.
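The arithmetic behind those percentages is simple, and it also illustrates the critics' point that a percentage cap lets the flared volume grow with production. The short Python sketch below uses only the volumes quoted in this article; the function name, and the example production levels in the final loop, are illustrative.

# Flared share computed from the volumes quoted above (million cubic feet per day).
def flared_share(flared_mmcfd, production_mmcfd):
    """Flared gas as a percentage of total gas production."""
    return 100.0 * flared_mmcfd / production_mmcfd

production = 1850.0                            # roughly 1.85 billion cubic feet a day
print(flared_share(203.0, production))         # May 2017: about 11 percent
print(flared_share(222.0, production))         # June 2017: about 12 percent

# Under a percentage cap, the permitted flare volume rises with production; these
# production levels are hypothetical round numbers.
for p_mmcfd in (1500.0, 1850.0, 2500.0):
    print(p_mmcfd, 0.10 * p_mmcfd)             # the state's 10 percent goal: 150, 185, 250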

The industry has also had problems obtaining rights of way for new gas lines, particularly across the Fort Berthold Indian Reservation, said Ron Ness, president of the North Dakota Petroleum Council.

Two new gas-processing plants are planned for the field, which should help reduce the amount of flaring. Processing plants strip out byproducts and impurities from the gas stream so that the gas can meet the standards set by interstate pipelines.

Environmentalists hope that the state will rethink its approach to the regulations during the current downturn in drilling.

"Hopefully the Industrial Commission learned from the last boom cycle to be proactive so that it doesn't get away from us again," said Wayde Schafer, the North Dakota organizer for the Sierra Club.

Governor Cuomo Announces Winners of 76West Clean Energy Competition

August 16, 2017 Albany, NY

Dallas-Based Skyven Technologies Named $1 Million Grand Prize Winner - Will Expand Business to the Southern Tier; Grow Southern Tier's Clean Energy Economy

Competition Complements "Southern Tier Soaring" - the Region's Strategic Plan to Generate Robust Economic Growth and Community Development

Governor Andrew M. Cuomo today announced the six winning companies of the 76West Clean Energy Competition, one of the largest competitions in the country that focuses on supporting and growing clean-energy businesses and economic development. Skyven Technologies, a solar heating company from Dallas, Texas, has been awarded the $1 million grand prize, and will expand its operations in the Southern Tier. The competition complements "Southern Tier Soaring," the region's comprehensive strategy to generate robust economic growth and community development.

"New York is setting a national precedent in growing clean-energy businesses and battling climate change, and this momentum continues with the 76West Competition," Governor Cuomo said. "Skyven Technologies and the rest of the winners from this competition will bring jobs and economic growth to the Southern Tier and beyond, ensuring that New York remains at the forefront of the new clean energy economy."

A total of $2.5 million was awarded to six innovative companies from New York and across the United States. Lieutenant Governor Kathy Hochul announced the winners at an awards ceremony in downtown Binghamton today, joined by more than 100 elected officials, entrepreneurs and local business leaders. The event also named a $500,000 winner and four $250,000 winners.

As a condition of the award, companies must either move to the Southern Tier or establish a direct connection with the Southern Tier, such as a supply chain, job development with Southern Tier companies, or other strategic relationships with Southern Tier entities that increase wealth creation and create jobs. If the companies are already in the Southern Tier, they must commit to substantially growing their business and employment in the region.

Lieutenant Governor Kathy Hochul said, "Modeled after the wildly successful 43 North business plan competition, 76West is another example of creative economic development strategies spurring start-ups in the industries of the future. Incentivizing clean energy innovation in the Southern Tier through this competition is further evidence of Governor Cuomo's commitment to creating even more jobs and opportunities in this region of the State. Investing in renewable energy and other clean energy solutions also ensures a cleaner environment and stronger economy for future generations of New Yorkers. Congratulations to all the 76West winners who are bringing their business and clean energy technologies to the Southern Tier."

The 76West winners are:

$1 million grand prize winner

Skyven Technologies - Dallas, TX: uses solar heating techniques to reduce fossil fuel consumption in agriculture and other industries

$500,000 award winner

SunTegra - Port Chester, NY (Mid-Hudson Valley): develops solar products that are integrated into the roof to provide clean energy and an alternate look to conventional solar panels

$250,000 award winners

Biological Energy - Spencer, NY (Southern Tier): provides technology that increases wastewater treatment capacity while reducing energy use

EthosGen - Wilkes-Barre, PA: captures and transforms waste heat into resilient, renewable on-site electric power

SolarKal - New York, NY (New York City): provides a brokerage service to help businesses simplify the process of going solar

Visolis - Berkeley, CA: produces high-value chemicals from biomass instead of petroleum, which reduces greenhouse gases

The second round of the 76West Competition was launched in December 2016 and received applications from around the world, including Switzerland, South Africa and Israel. Of these, 15 semifinalists were chosen and participated in a marathon pitch session on July 11 at Alfred State College. Judges selected eight finalists, who then pitched their companies to a different panel of judges on July 13 in Corning, New York, where the six winners were selected.

In addition to announcing the 76West winners, Lt. Governor Hochul recognized Binghamton for being the first city in the Southern Tier designated as a Clean Energy Community by the New York State Energy Research and Development Authority. The Clean Energy Communities initiative recognizes municipalities that complete four of 10 high-impact clean energy actions. To earn the designation, Binghamton converted all city street lights to energy efficient LED technology, installed an electric vehicle charging station, completed energy code enforcement training, and streamlined the approvals process for local solar projects. The city is now eligible to apply for $250,000 in grants to fund additional clean energy projects.

NYSERDA administers the 76West Competition, which supports New York's nation-leading Clean Energy Standard requiring that 50 percent of the state's electricity come from renewable energy by 2030, under Reforming the Energy Vision, the state's comprehensive strategy to build a cleaner, more resilient and affordable energy system for all New Yorkers. 76West fosters cleantech economic development and leadership and expands innovative entrepreneurship in the Southern Tier.

New York State Energy and Finance Chairman Richard Kauffman said, "We're thrilled to see so many entrepreneurs and different types of start-up companies involved in the 76West Competition and interested in joining the robust community of the Southern Tier. Under Governor Cuomo, all New Yorkers stand to benefit from these clean technologies that are being developed and brought to the marketplace and I look forward to watching the winners of the 76West Competition grow and help New York build its clean, resilient and affordable energy system."

Alicia Barton, President and CEO, NYSERDA said, "The 76West Competition builds on Governor Cuomo's commitment to the Southern Tier by supporting an entrepreneurial community that is integral to New York's thriving clean economy. I congratulate all of the finalists on their success in this competition and I look forward to seeing each of them continue to grow in the Southern Tier."

This is the second year of 76West, a $20 million competition and support program administered by NYSERDA that will run once a year from 2016 to 2019. Each year applicants compete for a $1 million grand prize, a $500,000 award and four $250,000 awards. In total, 76West will provide $10 million in awards and $10 million for business support, marketing and administration through the Regional Greenhouse Gas Initiative and the Clean Energy Fund.

The six winners from the first round, announced last November, have already integrated themselves into the Southern Tier, raised $20 million in private capital for future growth, and created new jobs.

New York State has made a significant investment in the Southern Tier to attract a talented workforce, grow business and drive innovation. It is already home to nation-leading companies and universities which are spurring innovation and leading global research. Now, with the addition of New York's six new 76West winners, the Southern Tier is quickly becoming a leading region for clean tech development and a model for other regions across the state and nationally.

Southern Tier Regional Economic Development Council Co-Chairs Tom Tranter, President & CEO of Corning Enterprises and Harvey Stenger, President of Binghamton University said, "76West is helping the Southern Tier to soar by attracting top-notch entrepreneurs to the clean energy sector and creating jobs in the region. The effort is adding momentum to our region's ongoing revitalization. Thank you to Governor Cuomo, NYSERDA and to all of our government partners for creating this competition that is resulting in exciting opportunities for the Southern Tier."

Western New York Regional Economic Development Council Co-Chair and President of SolEpoxy Inc., Jeff Belt said, "It's exciting to see the role entrepreneurs are playing in growing New York State's clean energy sector. I congratulate the winners of the 76West Competition who know what it takes to build the clean, resilient and affordable energy systems of the future."

Western New York Regional Economic Development Council Co-Chair and President of the State University of New York at Fredonia, Dr. Virginia Horvath said, "76West provides the funding to take many of the brilliant ideas that are being developed at academic institutions to the next step. The cleantech competition keeps some of the brightest students living and working in our state, which is committed to reaching its clean energy goals."

State Senator Fred Akshar said, "I'd like to congratulate all of the finalists and winners of the 76West Clean Energy Competition, especially our Southern Tier finalist Biological Energy. Our region needs economic development, job growth and clean energy - which is exactly what the 76West Clean Energy Competition and Southern Tier Soaring are working to accomplish. I applaud everyone's hard work and am very impressed by the thoughtfulness and innovation of the people of this great state."

Assemblywoman Donna Lupardo said, "Innovation has always been the key to economic development in the Southern Tier. The 76West Competition continues this tradition by helping grow the clean energy sector in our state and adds to our region's ongoing revitalization. I'm looking forward to seeing the positive results that 76West continues to produce across New York."

Today's announcement complements "Southern Tier Soaring" - the region's comprehensive blueprint to generate robust economic growth and community development. New York State has already invested more than $4.6 billion in the region since 2012 to lay the groundwork for the plan - attracting a talented workforce, growing business and driving innovation. Today, unemployment across the state is near the lowest levels since before the Great Recession; personal and corporate income taxes are down; and businesses are choosing places like the Southern Tier as a destination in which to grow and invest.

Now, the region is accelerating "Southern Tier Soaring" with a $500 million investment through New York's Upstate Revitalization Initiative, announced by Governor Cuomo in December 2015. The state's $500 million investment will incentivize private business to invest well over $2.5 billion - and the region's plan, as submitted, projects up to 10,200 new jobs.

To learn more about REV, including the Governor's $5 billion investment in clean energy technology and innovation, please visit www.ny.gov/REV4NY and follow us at @Rev4NY.

Pope Francis inspires Iowa churches to tackle climate change

USA Today

Kevin Hardy, kmhardy@dmreg.com Published 9:47 p.m. CT Aug. 21, 2017 | Updated 12:18 p.m. CT Aug. 22, 2017

See how St. John the Apostle Catholic Church in Norwalk is using solar energy to cut costs and help the environment.

NORWALK, Ia. ― As the sun beat down on a recent Saturday afternoon, the mammoth air conditioning system at St. John the Apostle Catholic Church offered welcome relief to parishioners trickling in for evening Mass.

While churchgoers filled the pews and hymns filled the sanctuary, 206 solar panels overhead converted the sun's energy into power for lights, the public address system and the crisp cold air.

"We're running on solar right now," said Rev. John Ludwig.

The church is the first in the Catholic Diocese of Des Moines to roll out a large scale solar energy initiative. But Catholic leaders say it won't be the last.

Bishop Richard Pates credits grassroots work at the Norwalk church for bringing about the solar project, but inspiration came from the church's highest-ranking official. Iowa Catholics, inspired by Pope Francis' call for the faithful to actively combat the harmful effects of global climate change, are looking for ways to conserve energy and transition to renewable forms of energy.

"It's exciting to me because I feel like they’re sort of walking the walk," said the Rev. Susan Hendershot Guy, executive director of Iowa Interfaith Power & Light, a religious group aimed at responding to global warming. "They're not just saying isn’t this great. They’re really advocating for congregations to do something."

But the Catholic church isn't alone. Across the state, leaders from various faith traditions ― who often worship in vast, aging and inefficient facilities ― are pushing forward with energy efficiency efforts and solar installations, Guy said.

In Kalona, a Mennonite congregation invested in a community solar effort. A Soto Zen Buddhist temple in Dorchester now gets about half its energy from its solar array. And a Methodist church in Ames will host a solar workshop later this month.

"I think it is getting out to all types of denominations, congregations, faith traditions really across the conservative-liberal spectrum," Guy said. "It’s a really practical way to live out that message of how we care for the world."

In Norwalk, St. John the Apostle partnered with a parishioner's for-profit company, which, unlike untaxed churches, is able to take advantage of lucrative renewable energy tax credits. The company secured investors to pay for the upfront costs. It sells the energy generated back to the church at a rate lower than what the church would pay for power on the wider grid. Ludwig says the deal will save the church $2,000 per year in energy costs.

"We got the deal of the century," Ludwig said, "because we didn't have to pay anything for it."

'We cannot abandon what God has given us'

In June 2015, Pope Francis made international news after releasing his environmental encyclical Laudato Si', subtitled "On Care for Our Common Home." While citing the messages of previous popes, Francis captured the world's attention by criticizing climate change deniers and calling for sweeping changes in political action and personal behaviors to reduce emissions and better care for the environment.

"These situations have caused sister Earth, along with all the abandoned of our world, to cry out, pleading that we take another course," he wrote. "Never have we so hurt and mistreated our common home as we have in the last 200 years."

In May, Francis gave President Donald Trump, who has called climate change a "hoax," a copy of his 192-page document during the president's Vatican visit.

Pates says the Bible's Book of Genesis demands that Christians act as stewards of the Earth and "provide for those who come after us."

"We cannot put our head in the sand and say we're going to continue to have children but snuff out their existence," he said. "That is why the pope is so concerned."

While environmental activists often are viewed as a bloc within the political left, Pates said the church's embrace of the issue isn't meant to take a political stance. Yet it does highlight how the church's positions on abortion, immigration, poverty and the environment often split between platforms of the partisan left and right.

"The pope regards this as a moral issue, not a political issue, especially given the consensus of scientists," he said. "The timeline involved is very serious. We can do something. We can change it."

Pates says the diocese won't force parishes to make changes, but it is encouraging them to explore solar installations and other green initiatives. He also hopes a soon-to-form task force will spur meaningful change for two of the area's largest Catholic institutions: Dowling Catholic High School and Mercy Medical Center.

Though Catholics, and all Christians, hold faith in the prospect of life after death, Pates said that outlook can't be used as a crutch to forsake the planet's air, land and waters.

"In the meanwhile, this is what we are entrusted with," he said. "So we cannot abandon what God has given us."

Solar panels can be seen on the roof Saturday, Aug. 12, 2017, at St. John the Apostle Catholic Church in Norwalk. Church leaders worked with church member Terry Dvorak, owner of Red Lion Renewables, to get the panels installed at no upfront cost to the church. (Photo: Michael Zamora/The Register)

'Our kids deserve to have clean air and sunshine'

After years in the construction business, Terry Dvorak suddenly got very interested in renewable energy in 2010 while traveling around heavily polluted cities in China.

In Shanghai, it felt like he was inhaling paint fumes, he said.

"That kind of started it for me," he said. "After that trip, it solidified that I needed to do something."

He thought of the countless children who live their whole lives without escaping that tainted environment.

"Our kids deserve to have clean air and sunshine," he said, "and so does the rest of the world."

He started Red Lion Renewables, a solar development firm that installs solar arrays on commercial and residential property as well as school, church and city buildings.

A member of St. John the Apostle in Norwalk, Dvorak approached church leaders about a possible church solar project. After securing approval from the parish priest and the bishop, Dvorak and his investors put up about $200,000 to purchase 206 panels. He said the parish only spent about $100 out of pocket for a legal review of its agreement with Red Lion.

The black panels, each about 40 inches by 70 inches, sit on sloped and flat spans of the church building's southern-facing roof.

Solar delivers the biggest economic benefit to homeowners and businesses, who can offset a portion of the up-front investment with tax credits, Dvorak says. But that can leave school districts, churches and local governments with little incentive to go green.

By using a third-party company, investors can earn a healthy return, while the not-for-profit entity realizes immediate energy savings.

The 69 kW array in Norwalk is expected to produce enough energy to power about 10 typical homes, Dvorak says, while avoiding an estimated 3 million pounds of carbon dioxide emissions.

Through a power-purchase agreement, the church purchases back the electricity the panels generate at a rate lower than what MidAmerican Energy would charge, Dvorak says. The estimated $2,000 annual savings will allow the church to invest more in its charitable work with groups like Habitat for Humanity or Meals from the Heartland.

"Now they can indirectly save money right off the bat on their energy bills," he said, "use their limited budgets for their own mission and help the environment in the process."

The Norwalk-based company Red Lion Renewables installed solar panels on the roof of the St. John the Apostle Catholic Church in Norwalk. Red Lion CEO Terry Dvorak is a parishioner at the church. (Photo: Michael Rolands/Record-Herald)

'The church definitely needs to lead the community'

About a decade ago, church leaders started worrying about the aging heating system at First Lutheran Church in Decorah. The congregation spent about $22,000 each year to keep the lights, air conditioning and heat running in its 150-year-old building.

"We had a discussion that the money could be better spent on ministry and the church could set a better example with climate change," said Larry Grimstad, a member of the congregation's "green team."

After an energy audit, the church in 2010 added new insulation and invested hundreds of thousands of dollars in a new, efficient HVAC system.

Then, in 2014, the church unveiled a $35,000 solar installation. The 26 panels cover much of the southern portion of the roof, where sun exposure makes solar generation most effective.

Most recently, in 2015, First Lutheran started changing out lights to more efficient LED fixtures.

The combined efforts reduced the church's energy costs by 38 percent and reduced its carbon footprint by 42 percent, he said. Now, annual utility bills run about $13,000.

But Grimstad said the church wants to reduce its carbon emissions even more. It's exploring a solar canopy project for the parking lot and Grimstad says he's interested in a community geothermal project.

"The real goal has to be zero," he said. "So that’s where we’re going."

While churches can reap economic benefits from energy efficiency efforts, Grimstad said faith groups have a moral calling to tackle the global threat of climate change.

"We have a responsibility to do this," he said. "The church definitely needs to lead the community to do this type of thing."

Breakthrough on nitrogen cycle by researchers to impact climate-change research

Rob Csernyik, Published on: August 24, 2017 | Last Updated: August 24, 2017 1:33 PM MDT

New research has identified and characterized an ammonia-oxidizing microbe called Nitrospira inopinata. The process used to make fertilizer, and its overapplication in agriculture, has huge environmental implications, researchers said. John Lucas / Postmedia

A research collaboration between the University of Alberta and the University of Vienna has yielded a discovery that could significantly impact climate-change research, said a professor of biology in Edmonton.

Currently the world’s nitrogen cycle is unbalanced because humans now add more fixed nitrogen, largely as ammonium, to the environment than natural sources do, said researcher Lisa Stein of the University of Alberta. Excess ammonium has implications for the climate and environment, from dead zones in oceans to emissions of nitrous oxide, a greenhouse gas roughly 300 times more potent than carbon dioxide on a molecule-for-molecule basis.

Stein and Dimitri Kits, a University of Alberta PhD grad now working at the University of Vienna, identified and characterized the ammonia-oxidizing microbe Nitrospira inopinata.

Stein said this bacterium is important because ammonium oxidation previously required two different microbes, each carrying out part of the process. Nitrospira inopinata oxidizes ammonium all the way to nitrate on its own.

“This is the first time we’ve been able to demonstrate that it can with a single source,” she said. “If we can find a way to replace bad microbes with this one, it can help mitigate the problem.”

The process of making fertilizer, known as the Haber-Bosch process, adds massive quantities of fixed nitrogen to the environment. Stein also said the over-application of fertilizer causes a problem.

Stein said about 50 per cent of applied fertilizer isn’t delivered to the plant. Instead microbes start chewing on the fertilizer and release nitrous oxide into the environment. The new discovery could help mitigate this problem.

“The high efficiency of (Nitrospira inopinata) means it is likely producing less nitrous oxide,” she said.

There is still further research to be done, said Kits, who was the lead author on the paper, which was published in the scientific journal Nature.

“So far in this paper we’ve just isolated (Nitrospira inopinata) and characterized some of its basic physiology,” he said.

“The next step is going to be to measure the actual quantities of nitrous oxide that it makes (and) how it might influence the nitrogen cycle where it’s found.”

Stein said this will be done by getting together funding and researchers to measure how much nitrous oxide is produced under various conditions.

But Stein added that problems with the Earth’s nitrogen cycle are not something humans can engineer themselves out of.

She said remedying the problem will involve changing behaviours, such as overfertilization in the agricultural industry.

“You can’t just change the microbes and expect the Earth to be all right, you have to stop the feeding.”

rcsernyik@postmedia.com

How 139 countries could be powered by 100 percent wind, water, and solar energy by 2050

Cell Press

The latest roadmap to a 100% renewable energy future from Stanford's Mark Z. Jacobson and 26 colleagues is the most specific global vision yet, outlining infrastructure changes that 139 countries can make to be entirely powered by wind, water, and sunlight by 2050 after electrification of all energy sectors. Such a transition could mean lower worldwide energy consumption due to the efficiency of clean, renewable electricity; a net increase of over 24 million long-term jobs; a decrease of 4-7 million air pollution deaths per year; stabilization of energy prices; and annual savings of over $20 trillion in health and climate costs. The work appears August 23 in the journal Joule, Cell Press's new publication focused on sustainable energy.

The challenge of moving the world toward a low-carbon future in time to avoid exacerbating global warming and to create energy self-sufficient countries is one of the greatest of our time. The roadmaps developed by Jacobson's group provide one possible endpoint. For each of the 139 nations, they assess the raw renewable energy resources available to each country, the number of wind, water, and solar energy generators needed to be 80% renewable by 2030 and 100% by 2050, how much land and rooftop area these power sources would require (only around 1% of total available, with most of this open space between wind turbines that can be used for multiple purposes), and how this approach would reduce energy demand and cost compared with a business-as-usual scenario.

"Both individuals and governments can lead this change. Policymakers don't usually want to commit to doing something unless there is some reasonable science that can show it is possible, and that is what we are trying to do," says Jacobson, director of Stanford University's Atmosphere and Energy Program and co-founder of the Solutions Project, a U.S. non-profit educating the public and policymakers about a transition to 100% clean, renewable energy. "There are other scenarios. We are not saying that there is only one way we can do this, but having a scenario gives people direction."

The analyses specifically examined each country's electricity, transportation, heating/cooling, industrial, and agriculture/forestry/fishing sectors. The 139 countries were selected because data for them were publicly available from the International Energy Agency and because they collectively emit over 99% of all carbon dioxide worldwide. The study showed that countries with a greater share of land relative to population (e.g., the United States, China, the European Union) are projected to have the easiest time making the transition to 100% wind, water, and solar. It also found that the most difficult places to transition may be highly populated, very small countries surrounded by lots of ocean, such as Singapore, which may require an investment in offshore solar to convert fully.

As a result of a transition, the roadmaps predict a number of collateral benefits. For example, by eliminating oil, gas, and uranium use, the energy associated with mining, transporting and refining these fuels is also eliminated, reducing international power demand by around 13%. Because electricity is more efficient than burning fossil fuels, demand should go down another 23%. The changes in infrastructure would also mean that countries wouldn't need to depend on one another for fossil fuels, reducing the frequency of international conflict over energy. Finally, communities currently living in energy deserts would have access to abundant clean, renewable power.

"Aside from eliminating emissions and avoiding 1.5 degrees Celsius global warming and beginning the process of letting carbon dioxide drain from the Earth's atmosphere, transitioning eliminates 4-7 million air pollution deaths each year and creates over 24 million long-term, full-time jobs by these plans," Jacobson says. "What is different between this study and other studies that have proposed solutions is that we are trying to examine not only the climate benefits of reducing carbon but also the air pollution benefits, job benefits, and cost benefits"

The Joule paper is an expansion of 2015 roadmaps to transition each of the 50 United States to 100% clean, renewable energy (doi:10.1039/C5EE01283J) and an analysis of whether the electric grid can stay stable upon such a transition (doi: 10.1073/pnas.1510028112). Not only does this new study cover nearly the entire world, there are also improved calculations on the availability of rooftop solar energy, renewable energy resources, and jobs created versus lost.

The 100% clean, renewable energy goal has been criticized by some for focusing only on wind, water, and solar energy and excluding nuclear power, "clean coal," and biofuels. However, the researchers intentionally exclude nuclear power because of its 10-19 years between planning and operation, its high cost, and the acknowledged meltdown, weapons proliferation, and waste risks. "Clean coal" and biofuels are neglected because they both cause heavy air pollution, which Jacobson and coworkers are trying to eliminate, and emit over 50 times more carbon per unit of energy than wind, water, or solar power.

The 100% wind, water, solar studies have also been questioned for depending on some technologies such as underground heat storage in rocks, which exists only in a few places, and the proposed use of electric and hydrogen fuel cell aircraft, which exist only in small planes at this time. Jacobson counters that underground heat storage is not required but certainly a viable option since it is similar to district heating, which provides 60% of Denmark's heat. He also says that space shuttles and rockets have been propelled with hydrogen, and aircraft companies are now investing in electric airplanes. Wind, water, and solar can also face daily and seasonal fluctuation, making it possible that they could miss large demands for energy, but the new study refers to a new paper that suggests these stability concerns can be addressed in several ways.

These analyses have also been criticized for the massive investment it would take to move a country to the desired goal. Jacobson says that the overall cost to society (the energy, health, and climate cost) of the proposed system is one-fourth of that of the current fossil fuel system. In terms of upfront costs, most of these would be needed in any case to replace existing energy, and the rest is an investment that far more than pays itself off over time by nearly eliminating health and climate costs.

"It appears we can achieve the enormous social benefits of a zero-emission energy system at essentially no extra cost," says co-author Mark Delucchi, a research scientist at the Institute of Transportation Studies, University of California, Berkeley. "Our findings suggest that the benefits are so great that we should accelerate the transition to wind, water, and solar, as fast as possible, by retiring fossil-fuel systems early wherever we can."

"This paper helps push forward a conversation within and between the scientific, policy, and business communities about how to envision and plan for a decarbonized economy," writes Mark Dyson of Rocky Mountain Institute, in an accompanying preview of the paper. "The scientific community's growing body of work on global low-carbon energy transition pathways provides robust evidence that such a transition can be accomplished, and a growing understanding of the specific levers that need to be pulled to do so. Jacobson et al.'s present study provides sharper focus on one scenario, and refines a set of priorities for near-term action to enable it."

Technique could aid mass production of biodegradable plastic

University of Nebraska-Lincoln

Introducing a simple step to the production of plant-derived, biodegradable plastic could improve its properties while overcoming obstacles to manufacturing it commercially, says new research from the University of Nebraska-Lincoln and Jiangnan University.

That step? Bringing the heat.

Nebraska's Yiqi Yang and colleagues found that raising the temperature of bio-plastic fibers to several hundred degrees Fahrenheit, then slowly allowing them to cool, greatly improved the bio-plastic's normally lackluster resistance to heat and moisture.

The thermal approach also allowed the team to bypass solvents and other expensive, time-consuming techniques typically needed to manufacture a commercially viable bio-plastic, the study reported.

Yang said the approach could allow manufacturers of corn-derived plastic -- such as a Cargill plant in Blair, Nebraska -- to continuously produce the biodegradable material on a scale that at least approaches petroleum-based plastic, the industry standard. Recent research estimates that about 90 percent of U.S. plastic goes unrecycled.

"This clean technology makes possible (the) industrial-scale production of commercializable bio-based plastics," the authors reported.

The approach uses polylactic acid, or polylactide, a component of biodegradable plastic that can be fermented from corn starch, sugarcane and other plants. Though most plastics are made from petroleum, polylactide has emerged as an environmentally friendlier alternative.

Yet polylactide's susceptibility to heat and moisture, particularly during the manufacturing process, has limited its use in textiles and other industries. In searching for ways to address the issue, researchers long ago discovered that mixing mirror-image polylactide molecules -- generally referred to as "L" and "D" -- could yield stronger molecular interactions and better performance than using just the L or D alone.

But there was another catch. Convincing a reasonable proportion of the L and D molecules to permanently pair up is difficult, often forcing researchers to concoct costly and complicated matchmaking schemes. Some of the most common involve the use of solvents or other chemical agents whose disposal can cause environmental issues of their own.

"The problem is that people couldn't find a way to make it work so that you could use it on large scales," said Yang, Charles Bessey Professor of biological systems engineering and of textiles, merchandising and fashion design. "People use nasty solvent or other additives. But those are not good for continuous production.

"We don't want to dissolve the polymers and then try to evaporate the solvents, and then have to consider reusing them. That's just too expensive (and) not realistic."

Yang and his colleagues decided to pursue another approach. After mixing pellets of the L and D polylactide and spinning them into fibers, the team rapidly heated them to as hot as 400 degrees Fahrenheit.

The resulting bio-plastic resisted melting at temperatures more than 100 degrees higher than did plastics containing only the L or D molecules. It also maintained its structural integrity and tensile strength after being submersed in water at more than 250 degrees, approximating the conditions that bio-plastics must endure when being incorporated into dyed textiles.

The textile industry produces about 100 million tons of fibers annually, Yang said, meaning that a feasible green alternative to petroleum-based manufacturing could pay off both environmentally and financially.

"So we just used a cheap way that can be applied continuously, which is a big part of the equation," Yang said. "You have to be able to do it continuously in order to have large-scale production. Those are important factors."

Though the team has demonstrated continuous production on a smaller scale in Yang's lab, he said it will soon ramp up to further illustrate how the approach might be integrated into existing industrial processes.

The team's findings will be published in a November print edition of Chemical Engineering Journal. Yang authored the study with Helan Xu, a former Nebraska researcher now at Jiangnan University; Bingnan Mu, graduate student in textiles, merchandising and fashion design at Nebraska; and Jiangnan University's Gangwei Pan, Bomou Ma and Jing Yang.

The researchers received support from the U.S. Department of Agriculture's National Institute of Food and Agriculture, the Agricultural Research Division at the University of Nebraska-Lincoln, and the China Scholarship Council.

NY officials criticize Hudson River cleanup to EPA

ALBANY, N.Y. — The $1.7 billion Superfund cleanup of the Hudson River is not protecting the public’s health and the river as initially promised, New York’s environmental commissioner contended Wednesday.

Commissioner Basil Seggos criticized the six-year dredging project performed by General Electric Co. in a letter to Environmental Protection Agency Administrator Scott Pruitt released in the waning days of a crucial public comment period. Seggos was particularly scornful of an EPA assessment this summer that it could take 55 years or more before all species of fish in the river are clean enough to eat once a week.

“This is unacceptable,” Seggos wrote, echoing previous criticisms from the Cuomo administration. “A remedy that will take generations to safeguard public health and the environment is clearly not protective.”

Boston-based GE removed 2.75 million cubic yards of polychlorinated biphenyl-contaminated sediment from a 40-mile stretch of the river north of Albany under an agreement with the EPA.

The EPA released a review of the work this summer that found, based on the data so far, the cleanup will protect human health and the environment in the long term. Critics pushing for a broader cleanup have noted that a large amount of PCB-contaminated sediment remains in the river.

Seggos said the state is nearing completion of its own sampling program to measure the true extent of contamination.

An EPA spokeswoman said Wednesday the agency would consider Seggos’ comments along with all the others they are receiving.

The EPA is accepting public comment on its review through Friday.

GE spokesman Mark Behan questioned New York’s critical stance in an email that said PCB levels in the upper-Hudson water declined in some spots by as much as 73 percent in the first year after dredging.

“New York State approved and oversaw the dredging project and was instrumental in every major decision related to the project,” Behan wrote. “Its criticism flies in the face of the most up-to-date scientific data from the river itself.”

How an Illinois utility used wind, solar and storage to power a microgrid for 24 hours

Written By David J. Unger

A Midwest utility has taken a critical step closer toward a distributed, decentralized power grid.

Ameren, in partnership with S&C Electric, a Chicago-based smart-grid engineering firm, successfully completed a 24-hour “islanding” test earlier this month at the utility’s newly built microgrid in Champaign, Illinois, using wind, solar and battery storage.

“Islanding” refers to a microgrid — itself a network of power generation and consumption — temporarily disconnecting itself from the broader, regional power grid. Many analysts believe the power grid of the future will consist of these kinds of localized, networked microgrids over the centralized “hub-and-spoke” model that has existed for over a century. The ability to island is seen as a core function of microgrids.

Ameren’s test, although limited in scale, offers a glimpse into the utility of the future. The microgrid is one of the few in the world to operate at utility-scale voltages — between 4 kilovolts and 34.5 kilovolts.

“This successful test provided tangible proof that the system can accomplish what it was designed to do,” Ron Pate, senior vice president of operations and technical services at Ameren Illinois, said in a press release. “The microgrid isn’t theoretical and our tests don’t need to be lab simulations. We were able to prove that this technology works and can provide key benefits to our customers.”

Ameren’s microgrid lies adjacent to the campus of University of Illinois Urbana-Champaign and consists of a 100-kilowatt (kW) wind turbine, a 125-kW solar array, a 250-kW battery and two 500-kW natural gas generators. While the entire system is capable of supplying regular Ameren customers, this month’s islanding test focused on the 50-kW segment of the microgrid that serves an onsite Ameren-owned research facility.

Notably, the microgrid was able to complete the 24-hour test without relying on fossil-fuel generation, making it a 100 percent-renewable islanding event.

A chart released by S&C Electric visualizes how wind, solar and battery storage worked in concert to meet the research facility’s energy demands throughout the 24-hour period.

The test began at 8 a.m. with a 97 percent charge on the battery. For about two hours, the microgrid ran on stored energy alone. Once the battery levels hit 90 percent at around 10 a.m., the system automatically dispatched its wind turbine and solar array. The renewable assets took over powering the load while using excess energy to recharge the partially depleted battery.

In less than an hour, the battery returned to a 97 percent charge, enabling the system to dial down the wind and solar and resume running on stored energy alone. This cycling pattern continued, according to S&C’s data, until about 7 p.m. With the sun setting, the system switched off the solar inverter, leaving only the wind turbine to carry the load through the night.

Strong winds kept the microgrid powered throughout the night — keeping battery levels relatively stable — and even storing charge when wind speeds peaked. S&C Electric’s PureWave SMS-250 Storage Management System helped ensure voltage remained stable throughout the process. Starting at about 7 a.m. the next day, the rising sun brought solar back online; the test concluded at 8 a.m., with the microgrid reconnecting to the grid without a hitch.

Throughout it all, the battery’s state of charge never dropped below 88 percent, according to S&C. What’s more, the automated control system managed these shifts along the way, without the need for human operators to make adjustments. There’s still a long way to go before utilities are relying on microgrids to power whole cities or even neighborhoods, but the test shows how wind, solar, storage and automated, intelligent controls can complement one another in an effort to deliver local, consistent, reliable, 100 percent renewable power.

“This system is designed to be completely automated,” says Chris Evanich, S&C’s applications director of microgrids. The storage management system decides which generation sources to use and when they should go offline if they’re not needed, all to “keep everything balanced and up and running so that the customer doesn’t even realize they’re running on 100 percent renewables.”
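
The cycling pattern described above amounts to a simple hysteresis controller: run the islanded load from the battery until the state of charge dips to a lower threshold, dispatch wind and solar to carry the load and recharge the battery, then idle the renewables once the charge recovers. The sketch below is purely illustrative; the thresholds, battery capacity and function names are assumptions, not S&C's actual PureWave SMS-250 control code.

```python
# Minimal, hypothetical sketch of the islanding dispatch behaviour described
# in the article. All values and structure are illustrative assumptions.

BATTERY_KWH = 500.0              # assumed usable energy capacity (not stated in the article)
LOAD_KW = 50.0                   # islanded research-facility load during the test
SOC_LOW, SOC_HIGH = 0.90, 0.97   # switching thresholds inferred from the article


def dispatch(soc: float, renewable_kw: float, renewables_on: bool):
    """Return (renewables_on, battery_kw); battery_kw > 0 means discharging."""
    if soc <= SOC_LOW:
        renewables_on = True         # battery has dipped: dispatch wind + solar
    elif soc >= SOC_HIGH:
        renewables_on = False        # battery topped up: idle the renewables

    if renewables_on:
        # Renewables serve the load; any surplus recharges the battery.
        battery_kw = LOAD_KW - renewable_kw
    else:
        # Stored energy alone carries the load.
        battery_kw = LOAD_KW
    return renewables_on, battery_kw


# Tiny minute-by-minute simulation of a few hours of islanded operation.
soc, renewables_on = 0.97, False
for minute in range(6 * 60):
    available_kw = 120.0             # assume ample wind + solar for simplicity
    renewables_on, battery_kw = dispatch(soc, available_kw, renewables_on)
    soc -= (battery_kw / 60.0) / BATTERY_KWH   # kW over one minute -> kWh
    soc = max(0.0, min(soc, 1.0))
```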

Illinois is becoming something of a hub for microgrid innovation. The Illinois Institute of Technology, a private research university on Chicago’s South Side, was an early pioneer in the technology, launching an $18.5 million plan in 2008 to build “the world’s first self-healing and efficient smart microgrid distribution system.” Today, the university manages an islandable, 9-megawatt network of gas turbines, large-scale batteries, a wind turbine and various smart-efficiency technologies.

ComEd, Illinois’ largest utility, is planning to develop its own microgrid in Chicago’s Bronzeville neighborhood, adjacent and connected to the Institute of Technology. The utility has received more than $5 million in grants from the U.S. Department of Energy to help complete the microgrid. ComEd says the project would be the “most advanced clustered urban microgrid in the United States,” providing the neighborhood with an added layer of reliability, cybersecurity and resiliency in the face of broader grid disruptions.

Late last month, ComEd submitted its official proposal for the cluster to the Illinois Commerce Commission, which must approve the plan.

“The electric industry in Illinois, and around the world, is changing rapidly,” wrote Joseph Svachula, ComEd’s vice president of smart grid and technology in testimony to the Illinois Commerce Commission. “This ‘clustering’ of microgrids is an important step in the journey to a more decentralized grid because it enables us to understand how microgrids interface, and how resources from neighboring microgrids can be used to keep power flowing to customers.”

Harvey Could Reshape How and Where Americans Build Homes

By Christopher Flavelle, for Bloomberg, August 30, 2017

Storm comes as U.S. flood insurance program is up for renewal

Texas has one of the most relaxed approaches to building codes

Jerry Garcia’s home in Corpus Christi missed the worst of Hurricane Harvey by just a few miles and lost nothing more than some shingles and his backyard pier, which turned up further down Oso Bay. A 5-foot bulkhead and sloping lawn shielded it from the flooding that’s devastated parts of Texas.

A home builder, Garcia said he built this place, like all the houses he builds, "above code" -- stronger than the standards required by law, which in Texas tend to be weaker than in most states. But he doesn’t think those tougher standards should be imposed on every builder.

"You’ve got to find that medium, to build affordable housing," Garcia, 65, said sitting in his house as Harvey’s rains continued to pound the coast further north. Tougher mandatory codes mean higher costs, which means fewer homes. And if insurers had their way, he added, every home would be "a box with two doors and no windows."

Hurricane Harvey has highlighted a climate debate that had mostly stayed out of public view -- a debate that’s separate from the battle over greenhouse gas emissions, but more consequential to the lives of many Americans. At the core of that fight is whether the U.S. should respond to the growing threat of extreme weather by changing how, and even where, homes are built.

That debate pits insurers, who favor tighter building codes and fewer homes in vulnerable locations, against homebuilders and developers, who want to keep homes as inexpensive as possible. As the costs of extreme weather increase, that fight has spilled over into politics: Federal budget hawks want local policies that will reduce the cost of disasters, while many state and local officials worry about the lost tax revenue that might accompany tighter restrictions on development.

Harvey slammed ashore in Texas Friday with historic levels of rain and flooding. On Wednesday, the storm returned, making landfall a second time in southwestern Louisiana, which was devastated by Hurricane Katrina in 2005. At least 15 people have died so far in Texas, according to a count by the Austin American-Statesman newspaper. Thousands more have been displaced from their homes. Early estimates on damages range from $42 billion to more than $100 billion.

Contributing to the high losses is the fact that Texas, despite being one of the states most vulnerable to storms, has one of the most relaxed approaches to building codes, inspections, and other protections. It’s one of just four states along the Gulf and Atlantic coasts with no mandatory statewide building codes, according to the Insurance Institute for Business & Home Safety, and it has no statewide program to license building officials.


Texas policies leave home-building decisions to cities, whose record is mixed: Corpus Christi uses codes that reflect national standards, minus the requirement that homes be built one foot above expected 100-year-flood levels, according to the Federal Alliance for Safe Homes. Nueces County, which encompasses Corpus Christi, has no residential building code whatsoever.

The consequence of loose or non-existent codes is that storm damage is often worse than it needs to be. "Disasters don’t have to be devastating," said Eleanor Kitzman, who was Texas’s state insurance commissioner from 2011 to 2013 and now runs a company in South Carolina that constructs and finances coastal homes that are above code. "We can’t prevent the event, but we can mitigate the damage."

Any proposal that would increase costs in Texas draws pushback from home builders, a powerful group in a state where people despise red tape about as much as they hate taxes.

"They are not big on regulation," said Julie Rochman, chief executive officer of the insurance institute. That skepticism about government was on display in 2013, when the state’s two senators voted against additional federal funding to clean up after Superstorm Sandy. But it can be applied selectively: Governor Greg Abbott requested federal money for Hurricane Harvey before it even made landfall.

Building codes elicit little support in Austin. At the end of this year’s state legislative session, the Texas Association of Builders posted a document boasting of its success at killing legislation it didn’t like. That included a bill that would have let cities require residential fire sprinklers, and another that would have given counties with 100,000 people or more authority over zoning, land use and oversight of building standards--something the builders’ group called “onerous.”

Ned Muñoz, vice president of regulatory affairs for the Texas Association of Builders, said cities already do a good job choosing which parts of the building code are right for them. And he argued that people who live outside of cities don’t want the higher prices that come with land-use regulations.

Muñoz said his association’s target is "unnecessary and burdensome government regulations, which increase the price of a home."

The fight in Texas is a microcosm of a national battle. The International Code Council, a Washington nonprofit made up of government officials and industry representatives, updates its model codes every three years, inviting state and local governments to adopt them. Last year, the National Association of Home Builders boasted of its prowess at stopping the 2018 codes it didn’t like.

"Only 6 percent of the proposals that NAHB opposed made it through the committee hearings intact," the association wrote on its blog. Some of the new codes that the home builders blocked had been proposed by the Federal Emergency Management Agency -- the same agency that’s on the hook when homes collapse, flood or wash away. And when FEMA is on the hook, it’s really the taxpayer.

Ron Jones, an NAHB board member and builder in Colorado who is critical of the organization’s views on codes and regulations, said that while the focus now should be on helping people hurt by Harvey, he hoped the storm would also force new thinking.

"There’s no sort of national leadership involved," said Jones. "For them it’s just, ’Hell, we’ll rebuild these houses as many times as you’ll pay us to do it.’"

The home builders demonstrated their power again this year, when President Donald Trump reversed an Obama administration initiative restricting federally funded building projects in flood plains. "This is a huge victory for NAHB and its members," the association wrote on its blog.

Yet on other issues, Trump’s administration appears to be siding with those who favor tougher local policies. In an interview just before Harvey, FEMA chief Brock Long expressed support for an Obama administration proposal to spur more local action on resilience, such as better building codes, as a condition of getting first-dollar disaster relief from Washington.

"I don’t think the taxpayer should reward risk," Long told Bloomberg in the interview, four days before Harvey slammed into Texas.

It may seem surprising that a Republican administration would side with its Democratic predecessor on anything, especially something related to climate change. But the prompt is less ideological than practical. Over the past decade, the federal government spent more than $350 billion on disaster recovery, a figure that will almost certainly increase without major changes in local building codes and land-use practices.

And much of that money goes to homes that keep getting hit. That’s true for the National Flood Insurance Program, which Congress must reauthorize by the end of next month; some lawmakers, and Long himself, have said homes that repeatedly flood should be excluded from coverage. But there are also 1.3 million households that have applied for federal disaster assistance money at least twice since 1998 -- many of them in the same areas hit hardest by Hurricane Harvey.

In his interview, Long said his focus as FEMA’s administrator will be working with cities and states to ensure that areas hit by disasters get rebuilt differently, in a way that’s more resilient.

Some state lawmakers acknowledge the need to at least consider doing things differently. Todd Hunter, who represents this part of the coast in the Texas House of Representatives, said Harvey will ignite a conversation about the need for statewide building codes.

"The discussion needs to start," Hunter said in an interview at his law office overlooking downtown Corpus Christi, where many of the buildings visible out his window were still without power. And that discussion could go beyond codes: "We need to take a look at where structures are being built."

Others are less optimistic. If people living along the Texas coast had to pay the full cost of the risk they face, some parts of that coast might empty out. That’s what worries Albert Betts Jr, executive director of the Texas Insurers Council, the trade association representing insurance companies. And it’s why he thinks Hurricane Harvey won’t shift public policy.

It’s not that Betts doesn’t like a fight. At his office off a freeway in Austin, his desk is adorned with a hammer on one side and a miniature punching bag on the other. But Betts said the price of real change could be too high.

"I can’t sit here and tell you, as a Texan, that I don’t want that area developed," Betts said. "Smarter people than me have yet to figure that out."

— With assistance by Peter Coy

Love of Coastal Living Is Draining U.S. Disaster Funds

Welcome to Alabama’s Dauphin Island, where it floods all the time.

By Christopher Flavelle, August 31, 2017

Dauphin Island has one of the highest concentrations of chronically flooded homes in the U.S. Photographer: Nicole Craine/Bloomberg

About 400 miles east of Houston, off the coast of Alabama, Dauphin Island was spared the initial impact of Hurricane Harvey. Yet this tiny sliver of land near the mouth of Mobile Bay is just as important as the battered metropolis to the debate over American disaster policy. That’s because the 14-mile-long island, home to only about 1,300 people, floods again and again, hurricane or no hurricane. And again and again, those homes are repaired and rebuilt, largely at the expense of U.S. taxpayers.

The homes on Dauphin Island are among the 5 million that are covered by the National Flood Insurance Program. Founded in 1968 to make sure homeowners in flood-prone areas could get affordable insurance, the program ends up paying most residential flood insurance claims in the U.S. Partly as a result, development along coasts and riverbanks and in flood plains has exploded over the past 50 years. So have claims for flood damages. The NFIP is now about $25 billion in debt.

On Sept. 30, the program is also set to expire. Some members of Congress would like to use a reauthorization vote to enact reforms, including perhaps kicking the most flood-exposed homes off the rolls. Given the pressure to deliver disaster relief quickly post-Harvey, it’s improbable that any initial agreement to extend the NFIP past Sept. 30 will lead to significant reforms. More likely, a short-term extension may fund it through the end of the year. But the problem won’t go away. And the debate is just getting started.

The issues surrounding the NFIP go beyond just insurance and straight to the costs of climate change—specifically, whether the government will concede that the most vulnerable places simply can’t be protected. While hurricanes contribute greatly to costs, putting a sudden spotlight on the insurance issue, it’s the chronic flooding that happens away from the public eye, in places such as Dauphin Island, that slowly drains the NFIP. The island has one of the country’s highest concentrations of houses that the Federal Emergency Management Agency calls “severe repetitive loss”—those that flood most often. The owners of those 84 properties have gotten almost $17 million since 1978, an average of $199,244 each.

While many areas that depend on flood insurance are poor, the overwhelming majority of buildings on Dauphin’s most vulnerable beaches are vacation homes and rentals. When Texas Republican Jeb Hensarling, chairman of the House Financial Services Committee, argued this year against single mothers being “forced to pay for insurance for some millionaire’s beachfront vacation home,” he could’ve been talking about this town.

Federal payments to places such as Dauphin Island can seem like a reward for stubbornness. In 1979, Hurricane Frederic destroyed the only bridge to the island; in 2004 and 2005, Ivan and Katrina wiped out about 500 homes. Federal funds replaced the bridge and the houses. But officials on the island argue they’ve been left behind, getting just enough money to limp along but not enough to survive the next big storm. After Sandy, Washington agreed to spend almost $1 billion to fortify a single stretch of the New Jersey shore with dunes. Jeff Collier, mayor of Dauphin Island, would like the same treatment. “Nobody’s ever come in and forced sand down my throat,” he says. “I wish they would. Why are we more expendable than anybody else?”

As climate change worsens, it’s getting harder for Congress to pretend there’s enough money to save everyone. John Kilcullen, Mobile County’s director of plans and operations for emergency management, says he thinks the best way to stop history from repeating itself on Dauphin is for the federal government to buy houses and tear them down, especially along the island’s low-lying western section—the same stretch that boasts the biggest, most expensive properties. He’s pessimistic about how many of those homeowners would be willing to sell. “People don’t like being told they can’t live somewhere,” Kilcullen says. “I don’t think it’s politically feasible.”

Explosive costs have a way of changing people’s minds. The U.S. spent more than $350 billion over the past decade responding to natural disasters, yet FEMA budgeted only $90 million this year to prepare communities for those catastrophes. “These storms are breaking the back of FEMA,” says Roberta Swann, director of the Mobile Bay National Estuary Program. “If FEMA starts to say, ‘If a hurricane comes and takes your house, we’re done’—that is going to change the face of coastal communities.”

And yet, if the U.S. wants to stop racking up taxpayer-funded losses, those communities do have to change. The NFIP paid out about $16.3 billion after Katrina in 2005 and $8.4 billion for Sandy in 2012, according to the Insurance Information Institute. That same year, Congress passed a law that overhauled the program, making it significantly more expensive to insure a home in a flood-prone area. The law, known as the Biggert-Waters Flood Insurance Reform Act of 2012, was the product of a rare alliance between environmentalists and Tea Party activists. After it went into effect, some coastal homeowners saw their annual flood insurance premiums spike to $20,000 or more from as low as $600. Two years later, as opposition grew, Congress rolled back most of the reforms.

“Nobody’s ever come in and forced sand down my throat,” says Dauphin Island Mayor Jeff Collier. “I wish they would. Why are we more expendable than anybody else?” Photographer: Nicole Craine/Bloomberg

Last year the Natural Resources Defense Council won a lawsuit seeking to uncover how many homes FEMA has designated severe repetitive loss. The data the agency was forced to release showed that about 30,000 properties had cost taxpayers $5.5 billion since 1978. “Repeatedly flooded properties cost the nation billions in avoidable losses and repeated trauma to the families that live there,” says Rob Moore, a water-policy expert at the council. “We should be offering assistance to move to higher ground before they run out of options, not after.”

In 2016, FEMA bought 960 severe repetitive loss homes to tear down; at that rate, it would take until midcentury to buy all 30,000. By then, as many as 265 towns and cities will suffer what the Union of Concerned Scientists in a July report called “chronic inundation”—flooding that covers at least 10 percent of their land an average of twice a month. “The human inclination to stay in place, and to resist change at all costs, is pretty strong,” says Erika Spanger-Siegfried, an analyst at the Union of Concerned Scientists and a lead author of the report. “It will catch up with those locations,” she says, “and they will need to move eventually—but at a greater cost.”
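
A quick back-of-the-envelope check of that midcentury figure, under the simplifying assumption that FEMA keeps buying homes at its 2016 pace:

```python
# Rough check of the buyout timeline cited above; assumes (simplistically)
# that FEMA keeps purchasing homes at its 2016 rate.
total_homes = 30_000        # properties designated severe repetitive loss
bought_in_2016 = 960        # homes bought for demolition that year

years_needed = total_homes / bought_in_2016          # ~31 years
print(f"~{years_needed:.0f} years -> around {2016 + round(years_needed)}")
# -> ~31 years -> around 2047, i.e. roughly midcentury
```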

On Dauphin Island, beachfront homes are still going up. Yet some residents may be ready to leave. After Katrina, 35 homeowners applied for federal buyouts, says Mayor Collier, but none were approved. If the government doesn’t either buy out homes or fortify the beach, it will just cost more when the next hurricane hits, Collier argues, while standing amid homes under construction. Bobcat loaders zoom by, returning sand blown off the ever-shrinking beach. “Pay me now,” he says, “or pay me later.”

BOTTOM LINE - While big storms grab headlines, chronic flooding on Dauphin Island and in places like it is slowly draining the NFIP dry, forcing hard questions about where we live.

Now Comes the Uncomfortable Question: Who Gets to Rebuild After Harvey?

Kate Aronoff for Bloomberg, August 30, 2017, 1:50 p.m.

In the decade-plus since Hurricane Katrina, Houston has been debating what it needs to protect itself from a catastrophic weather event. While that conversation has gone on, the city did what most cities do: carried on business as usual, constructing more than 7,000 residential buildings directly in Harris County’s Federal Emergency Management Agency-designated flood plain since 2010.

That kind of construction spree, which was detailed in a prescient investigation by ProPublica and the Texas Tribune last year, is not just head-slapping in hindsight; it actually produces its own greater risks, as it means paving over the kinds of wetlands that can help buffer against extreme flooding events.

But from a financial perspective, there was a logic behind those 7,000 buildings, as each is eligible for subsidized flood protection through the National Flood Insurance Program, which just so happens to be expiring at the end of September, meaning debate over its reauthorization will compete for time with everything else on a packed fall congressional calendar.

“Harvey should be a wake-up call” for deep flood insurance reform, says Rob Moore, a senior policy analyst with the Natural Resources Defense Council’s water team. “One of the biggest shortcomings is that the program has always been intended first and foremost to be a rebuilding program.”

While environmental groups, including the Environmental Defense Fund and the Natural Resources Defense Council, contend that the NFIP encourages irresponsible forms of coastal development, they have odd company in a coalition fighting for the program’s reform – insurance companies and free market groups that just want the government out.

Hurricane Harvey is likely to focus attention on questions the federal government and the American public have in large part worked hard to avoid, because they don’t cut cleanly along ideological or partisan lines. And because they’re not much fun to think about. What does home ownership look like in an age of climate change? When is it OK to rebuild, and when is it time to retreat?

The 30 Texas counties declared disaster areas by Gov. Greg Abbott are home to nearly 450,000 policies underwritten by the National Flood Insurance Program, which is administered by FEMA. The NFIP was $24.6 billion in debt before Harvey made landfall, and can borrow just $5.8 billion more from the Treasury before the program’s current round of congressional reauthorization expires. Waters are still rising, and it will be several days or even weeks before damage totals are assessed, but early insurance industry estimates say the figure could match costs incurred from Hurricane Katrina, the most expensive disaster in U.S. history. Homeowners hit by the storm will now enter into a months- and potentially years-long claims fulfillment process, attempting to rebuild what they lost.

As lawmakers return to Washington early next month to determine the future of the beleaguered agency, there are bigger questions on the table than how to make the NFIP financially solvent. Both sides of the aisle agree that the program is unsustainable as is; FEMA is predicting that the NFIP’s rolls could increase by 130 percent by century’s end. Now — at what might be the worst possible moment to do so — Congress is essentially tasked with deciding who gets to live on the country’s most vulnerable coastlines.

When it was founded in 1968, the NFIP set out to fill a gap left by the private market. Unlike most other kinds of insurance, which protect policyholders against the costs of things like car accidents and medical expenses, the payouts for floods come all at once and in huge numbers, meaning many insurers lack the cash on hand necessary to keep up with an influx of claims. While NFIP policies are underwritten by the federal government, they’re administered by private insurers, who earn a percentage on the policies they write. Homeowners in flood plains with federally backed mortgages are required to purchase flood insurance, though anyone can purchase an NFIP policy. For those required to take out policies, failure to pay can result in foreclosure.

Starting with hurricanes Katrina and Rita in 2005, the program has been hit with a barrage of massive payouts that have sunk it deeper and deeper into the red. Flooding in the Midwest compounded that debt in 2008, followed soon after by Hurricane Ike. Sandy tacked on another $8 billion in 2012, and there have been several major flooding events since.

Of course, the NFIP is just one program in a web of agencies dealing with disaster mitigation and response, ranging from FEMA’s emergency relief operations to the Department of Housing and Urban Development, which administers block grants to help residential buildings recover from destructive storms. Beyond the fact that its services are relatively difficult for homeowners to access — requiring a special kind of patience for and finesse with navigating layers of bureaucratic red tape — the NFIP’s benefits aren’t immediately recognizable just after a storm hits. Where few question how much FEMA is spending to respond to a storm like Harvey, the fact that the NFIP’s benefits play out quietly and over the longer term makes it easier to point to its debt overhang as an example of funds poorly spent.

The key tension now plaguing the NFIP is that it has always had to balance two often competing goals. “It’s really a conflict between the actuarial tendency of the program and a general desire among policymakers that the prices not be too high,” says Rebecca Elliott, a London School of Economics sociologist who researches the program. Severe Repetitive Loss Properties — those that have been consistently damaged and rebuilt — account for just 1 percent of policies but as much as 30 percent of the funds paid out in claims. Many of the residential properties that carry NFIP policies have also been “grandfathered” in, meaning that policyholders can keep paying existing rates as their flood risk increases.

Among the program’s biggest problems is just how high-risk its ratepayers are. In the case of health care, for instance, insurance operates on the basic principle that healthy policyholders with cheaper coverage should help pay the way for those who are sicker and more expensive. Referencing that system, Elliott says, “Right now we have the sickest people in the pool. The people who require insurance are only the high-risk flood zone. We know that’s not a great recipe for insurance pools.”

Yet whereas most kinds of public assistance in America benefit poor and working-class people, what’s saved the NFIP from outright attacks like the ones visited on Medicaid has been the fact that its beneficiaries also include the 1 percent and their waterfront homes. Consequently, NFIP reauthorization has always tended to be a rare spot of bipartisan alignment in Congress. “Democrats and Republicans both live in floodplains,” Elliott explains. If anything, the splits are regional, with representatives from flood-prone and politically divergent states like New York, Louisiana, and California coming together to keep rates artificially low.

In part because of who the NFIP serves, reform efforts have proven controversial. The last major overhaul of the program was passed into law just before Hurricane Sandy struck New York and New Jersey in 2012. That legislation, Biggert-Waters, sailed through Congress uncontroversially, and would have brought NFIP policies up to market rate on a rapid timeline and eliminated the program’s grandfather clause.

Sanitation workers throw out debris from a flood-damaged home in Oakwood Beach in Staten Island, N.Y., on Feb. 5, 2013. Photo: Spencer Platt/Getty Images

Post-Sandy those measures got much less popular. “Families who had their lives destroyed were going to rebuild and finding out that … if they were going to rebuild their homes, they might not be able to afford to insure them,” Elliott told me. Deductibles would have jumped by 25 percent per year. Blowback was so intense that one of the bill’s sponsors, Maxine Waters, D-Calif., became a major voice to rally successfully against its implementation. The result of that fracas was the Homeowners Flood Insurance Affordability Act in 2014, which reintroduced grandfathering and allowed rates to rise more gradually. “It appeased the protest for the time being, but the rates are still going up,” Elliott says. “All of the things that homeowners were upset about are still taking place, just on a slower timeline.”

One thing that bill also lifted from Biggert-Waters was a study of how those rate hikes would impact low-income owners. “HFIAA was a recommitment to the idea that the NFIP was now going to be in the business of adjudicating who truly needed support to pay for their flood insurance. Who was getting a good deal but didn’t actually need it? It’s basically a way of introducing means tests to a program that didn’t have it before,” Elliott adds.

The larger balance of forces surrounding flood insurance reform might be among the most novel that American politics has to offer. Pushing against an overhaul are major developers and real estate interests, like the National Realtors Association. One of the biggest voices pushing for it has been a coalition called Smarter Safer, an active proponent for Biggert-Waters that encompasses everyone from big environmental groups to insurance companies to conservative free market think tanks. For different reasons, all of them want to see rates that are more actuarially sound and less heavily subsidized by taxpayers.

“We’re concerned about the cost to taxpayers,” says Steve Ellis, vice president of the group Taxpayers for Common Sense, who’s worked on multiple rounds of NFIP reauthorization. He says that for many, Biggert-Waters was “too much too fast,” as well as a case of bad timing. Ultimately, though, he hopes the reforms it outlined come to pass. “We want to see a program that charges closer to risk-based rates and sheds as much of the risk off the program and into the private sector,” with the private sector playing an increasingly large role in underwriting policies. Eventually, Ellis hopes the NFIP will become a flood insurer of last resort.

Other Smarter Safer members are more gung-ho. Groups like the Heartland Institute spinoff R Street, a free market think tank, see the current structure of the NFIP as an expensive entitlement and want to get the government out of the flood insurance business entirely. Among the other voices calling on the federal government to shoulder less of the cost is current FEMA Administrator Brock Long, who said just before Harvey made landfall that he doesn’t “think the taxpayer should reward risk going forward.”

How exactly Harvey will impact the NFIP’s reauthorization remains to be seen, though even before the storm the chances for wholesale overhaul were slim. Two fledgling proposals in the Senate — one from Sens. Bill Cassidy, R-La., Kirsten Gillibrand, D-N.Y., and Shelley Moore Capito, R-W.Va., and another from Sen. Bob Menendez, D-N.J., with several co-sponsors — would each extend the reauthorization beyond five years, as well as increase oversight of the private insurers who write FEMA-backed claims. If passed, the Cassidy-Gillibrand bill would “[remove] barriers to privatization,” according to draft language, by allowing the companies that currently administer the policies to start underwriting them as well. Another proposal, passed through the House Financial Services Committee and praised by Smarter Safer, would also see the private sector play a larger role.

While few people are opposed outright to giving the private sector a bigger role to play, experts are leery of viewing privatization as a solution to what ails the NFIP. For one, private insurers’ involvement thus far has been hugely problematic. A report released last year by New York Attorney General Eric Schneiderman accused FEMA-contracted policy writers of widespread fraud, noting that a “lack of transparency and accountability can and does lead to inflated costs for services.” An earlier investigation by NPR found that private insurers made $400 million in Sandy’s aftermath, and that nearly a third of all NFIP funds spent between 2011 and 2014 went directly to private insurers instead of to homeowners. The more structural danger of privatization is that insurance companies could simply skim off the most profitable rates, leaving the NFIP to pick up the most costly policies. In health care terms, then, privatization could render an already sick pool even sicker.

The even bigger policy question is whether higher and more competitive rates will actually incentivize fewer people to live along high-risk coastlines, or just leave the shore open only to those wealthy homeowners and developers who can afford higher rates and round after round of rebuilding. President Donald Trump also repealed an Obama-era mandate for flood-prone construction, so there’s no guarantee that new shorefront structures will be able to withstand future damage. The result of higher rates, Elliott predicts, “is the socioeconomic transformation along with the physical transformation of the coastlines.”

Of course, the elephant wading through the flood is the fact that there are now millions of people living in areas that shouldn’t be inhabited at all, no matter the cost. “There’s the uncertainty of living at risk,” Elliott says, “and there’s the uncertainty of what it means to stay in your community when in the near to medium term, it’s going to become more expensive for you to do so — and in the long term, physically impossible.”

People are stranded on a roof due to flood waters from Hurricane Katrina in New Orleans, on Aug. 30, 2005. Photo: Vincent Laforet/Pool/AFP/Getty Images

Virtually no one thinks that the program’s rate structure is politically sustainable as is, and transformational changes to American coastlines will happen regardless of what happens to the NFIP. Beyond destructive storms becoming more common as a result of climate change, the sea level rise likely to happen over the next century will make several American cities more vulnerable to everyday flooding. Miami Beach today floods on a sunny afternoon. By 2100, for instance, Savannah, Georgia, could face two crippling floods per month. Along the same timeline, the Bronx may become the only New York City borough spared from a similar form of chronic inundation, a term coined by the Union of Concerned Scientists to describe persistent flood risk. In all, sea level rise could force as many as 13 million Americans to relocate by the end of the century.

What’s now up for debate is when and how the retreat happens, and on whose terms. The NRDC and others are advocating to incorporate buyout programs more fully into the NFIP, allowing homeowners in persistent-risk properties to sell their homes back to the government at pre-flood prices, after which point the land is returned to wetlands, waterfront parks, or some other kind of absorbent public space. “The temptation is to always try to identify some one-size-fits-all solution,” Moore says. “We’re at a point where we know that’s not the case for these issues. For NRDC, the real emphasis needs to be on how we help low- and middle-income homeowners live in a place that is far from flooding.”

There were some 40,000 buyouts of high-risk properties between 1993 and 2011, mostly for homes along rivers in inland states. When people receive relocation funds, though, it tends to be through state or locally run pilot projects, and there’s not yet enough data to determine how effective these programs have been in the long run. So while there is some precedent in the U.S. for helping coastal residents move away from high-risk flood zones, there is still no federal plan to make moving a realistic possibility for low- and middle-income homeowners. Examples of how to do it well are few and far between.

What MIT postdoctoral fellow Liz Koslov has found in her research into planned retreat is that there are plenty of communities ready and willing to move — at least under the right circumstances. “I was shocked about how positive they were about the process,” she says of the people she met doing field work on Staten Island, where two neighborhoods successfully lobbied Albany for relocation funds. “I expected a lot more negative criticisms, but people talked about it as a process they found very empowering. It started as a bottom-up effort, without any sort of government relocation plan. … It really helped that people were hearing about it from their neighbors, in community meetings.”

Still, buyouts are no quick fix, and scaling them up at the national level would take careful planning. “From the federal and state governments’ point of view, buyouts are great. They save a huge amount of money, absorb water, and can keep nearby properties safer,” Koslov says. At the local level, though, relocation plans run the risk of bringing “all the cost and none of the benefits.” Especially for cities without a tax base as big as New York City’s, relocation can erode funds for things like schools and public services, and potentially even whole communities. That’s why New York state’s post-Sandy relocation plan included a 5 percent bonus for homeowners who chose to relocate within the county. Blue Acres, a similar post-Sandy buyout program in New Jersey, has faced hostility from towns wary of a population drain.

Even those communities that do want to relocate face serious barriers at the federal level. FEMA currently requires a disaster to have been declared in order for the agency to release relocation funds. That’s why New York and New Jersey were each able to launch relatively successful buyout programs post-Sandy, but why Alaska — where tribal nations’ land is under rapid threat from erosion — has been unable to do the same despite years of organizing.

The Staten Island neighborhoods that ended up being relocated also only received their buyouts after “months of grassroots lobbying and protests and petitions and an enormous amount of collective effort,” Koslov says. In both neighborhoods, she adds, the main populations were white, middle-class, and home-owning public sector employees (cops, firefighters, postal workers) who had retired but were young enough to use their time actively — ”the last people you would expect to collectively organize for climate change adaptation,” Koslov says. Both areas voted overwhelmingly for Trump. Aside from some obvious concerns for renters — whose landlords would have to be bought out — it remains to be seen whether working-class neighborhoods of color — like many of those likely to be worst hit by flooding — would see the same results.

There are few easy answers for how coastal communities can adapt to rising tides and more volatile weather, let alone for how to make the NFIP more sustainable in that context. What does seem clear is that leaving the responsibility of planning for and adapting to climate change up to individuals may be the wrong path for disaster policy in the 21st century. “Individuals vary tremendously in their ability to respond rationally to incentives,” Elliott says. “My sense is that when it comes to the problem of what inhabited places are going to look like in a future defined by climate change, that’s a collective problem that’s going to require more collective solutions.”

In terms of policy, that could mean making it easier for communities to organize and access funding like Staten Island residents have post-Sandy, or even attaching a rider to different types of insurance policies, spreading the financial risk of flooding around to more than just the people most likely to be affected in the next several years.

The need for a more collective response has implications beyond specific measures though, and extends well outside the scope of flood insurance. Amid the chaotic reports of the situation unfolding in Houston are also those of good Samaritans rescuing total strangers, the kind of post-disaster solidarity writer Rebecca Solnit has called “a paradise built in hell.” Paradise may not be on the docket as the NFIP’s future gets determined this month, but Harvey may be a chance for lawmakers to start thinking more holistically about how to deal with and prevent the kinds of hell climate change is all too likely to keep dishing out.

Full report

https://projects.propublica.org/houston-cypress/?utm_campaign=sprout&utm_medium=social&utm_source=twitter&utm_content=1503846942

Floods in India, Bangladesh and Nepal kill 1,200 and leave millions homeless

Authorities say the monsoon flooding is one of the worst in the region in years

Chloe Farand, Wednesday 30 August 2017 17:41 BST

The Independent Online

At least 1,200 people have been killed and millions have been left homeless following devastating floods that have hit India, Bangladesh and Nepal, in one of the worst flooding disasters to have affected the region in years.

International aid agencies said thousands of villages have been cut off by flooding with people being deprived of food and clean water for days.

South Asia suffers from frequent flooding during the monsoon season, which lasts from June to September, but authorities have said this year's floods have been much worse.

In the eastern Indian state of Bihar, the death toll has risen to more than 500, the Straits Times reported, quoting disaster management officials.

The paper said the ongoing floods had so far affected 17 million people in India, with thousands sheltered in relief camps.

Anirudh Kumar, a disaster management official in Patna, the capital of Bihar, a poor state known for its mass migration from rural areas to cities, said this year's farming had collapsed because of the floods, which will lead to a further rise in unemployment in the region.

In the northern state of Uttar Pradesh, reports said more than 100 people had died and 2.5 million have been affected.

In Mumbai, authorities struggled to evacuate people living in the financial capital's low-lying areas as transport links were paralysed and downpours led to water rising up to five feet in some parts of the city.

Weather officials are forecasting heavy rains to continue over the next 24 hours and have urged people to stay indoors.

Partially submerged houses are seen at a flood-affected village in Morigaon district in the northeastern state of Assam, India. (REUTERS/Anuwar Hazarika)

In neighbouring Bangladesh, at least 134 people have died in monsoon flooding, which is believed to have submerged at least a third of the country.

More than 600,000 hectares of farmland have been partially damaged and in excess of 10,000 hectares have been completely washed away, according to the disaster minister.

Bangladesh's economy is dependent on farming and the country lost around a million tonnes of rice in flash floods in April.

"Farmers are left with nothing, not event with clean drinking water," said Matthew Marek, the head of disaster response in Bangladesh for the International Federation of Red Cross and Red Crescent.

In Nepal, 150 people have been killed and 90,000 homes have been destroyed in what the UN has called the worst flooding incident in the country in a decade.

According to the Red Cross, at least 7.1 million people have been affected in Bangladesh - more than the population of Scotland - and around 1.4 million people have been affected in Nepal.

The disaster comes as headlines have focused on the floods in Houston, Texas, which authorities have described as "unprecedented".

Officials in Texas have said the death toll now stands at 15 in the wake of Hurricane and Tropical Storm Harvey, with thousands forced to flee their homes.

The rise in extreme weather events such as hurricanes and floods has been identified by climate scientists as the hallmark of man-made climate change.

The US has seen two of its worst storms ever, Hurricane Harvey and Hurricane Katrina, in just over a decade.

India's Prime Minister, Narendra Modi, has said climate change and new weather patterns are having "a big negative impact".

Third-hottest June puts 2017 on track to make hat-trick of hottest years

June 2017 was beaten only by June in 2015 and 2016, leaving experts with little hope for limiting warming to 1.5C or even 2C

The latest figures cement estimations that warming is now at levels not seen for 115,000 years.

First published on Wednesday 19 July 2017 00.17 EDT

Last month was the third-hottest June on record globally, temperature data suggest, confirming 2017 will almost certainly make a hat-trick of annual climate records, with 2015, 2016 and 2017 being the three hottest years since records began.

The figures also cement estimations that warming is now at levels not seen for 115,000 years, and leave some experts with little hope for limiting warming to 1.5C or even 2C.

According to new figures from the US National Oceanic and Atmospheric Administration (Noaa), June 2017 was the third-hottest June on record, beaten only by the two preceding Junes in 2015 and 2016.

The Noaa data show combined land and sea-surface temperatures for June 2017 were 0.82C above the 20th century average, making it the 41st consecutive June above that average.

June 2016 still holds the record at 0.92C above the 20th century average, followed by June 2015 which was 0.89C above the baseline.

The data line up closely with Nasa figures released last week, which are calculated slightly differently, finding the month was the fourth-hottest on record – with June 1998 also being warmer in their data set.

Based on the Nasa data, climate scientist and director of Nasa’s Goddard Institute for Space Studies Gavin Schmidt estimated that 2017 was probably going to be the second-warmest year on record after 2016, but would almost certainly be among the top three hottest years.

Gavin Schmidt (@ClimateOfGavin) tweeted on July 15, 2017: “With update to Jun, 2017 will almost certainly be a top 3 year in the GISTEMP record (most likely 2nd warmest ~57% chance).” pic.twitter.com/jiR6cCv1x8

With the June figures in, each of the first six months of 2017 ranks among the three warmest on record for its respective month, making it the second-hottest first half of a year on record – again, beaten only by the previous year.

The near-record temperatures continued this year despite the passing of El Niño, which normally warms the globe, and despite its opposite – La Niña – currently suppressing temperatures.

The warming trend is almost certainly caused by greenhouse gas emissions – mostly the result of burning fossil fuels – with many studies showing such warm years would be almost impossible without that effect.

Last year, Michael Mann from Pennsylvania State University published a paper showing the then-record temperatures in 2014 would have had less than a one in a million chance of occurring naturally.

“We have a follow-up article that we’ve submitted showing that the likelihood of three consecutive record-breaking years such as we saw in 2015-2017 was similarly unlikely,” he told the Guardian over email. “In short, we can only explain the onslaught of record warm years by accounting for human-caused warming of the planet.”

Andy Pitman from the University of New South Wales in Sydney, Australia said the onslaught of very rapid warming in the past few years is likely a result of the climate system “catching up” after a period of relative slow warming caused by natural variability – the so-called “hiatus”.

“I do not think the recent anomalies change anything from a science perspective,” he said. “The Earth is warming at about the long-term rates that were expected and predicted [by models].”

But Pitman said the ongoing trend was “entirely inconsistent” with the target of keeping warming at just 1.5C above pre-industrial temperatures.

Current trends suggest the 1.5C barrier would be breached in the 2040s, with some studies suggesting it might happen much sooner.

“In my view, to limit warming to 2C requires both deep and rapid cuts and a climate sensitivity on the lower end of the current range,” Pitman said. “I see no evidence that the climate sensitivity is on the lower end of the current range, unfortunately.”

“It would be a good idea to cut greenhouse gas emissions rather faster than we are.”

"Heritage at Risk: How Rising Seas Threaten Ancient Coastal Ruins"

"The shores of Scotland’s Orkney Islands are dotted with ruins that date to the Stone Age. But after enduring for millennia, these archaeological sites – along with many others from Easter Island to Jamestown – are facing an existential threat from climate change."

"Perched on the breathtaking Atlantic coast of Mainland, the largest island in Scotland’s Orkney archipelago, are the remains of the Stone Age settlement of Skara Brae, dating back 5,000 years. Just feet from the sea, Skara Brae is one of the best preserved Stone Age villages in the world — a complex of ancient stone house foundations, walls, and sunken corridors carved out of the dunes by the shore of the Bay of Skaill. Fulmars and kittiwakes from the vast seabird colonies on Orkney’s high cliffs wheel above the coastal grassland of this rugged island, 15 miles from the northern coast of the Scottish mainland. On a sunny day, the surrounding bays and inlets take on a sparkling aquamarine hue.

Older than the Egyptian pyramids and Stonehenge, Skara Brae is part of a UNESCO World Heritage site that also includes two iconic circles of standing stones — the Ring of Brodgar and the Stones of Stenness — and Maeshowe, an exquisitely structured chambered tomb famous for its Viking graffiti and the way its Stone Age architects aligned the entrance to catch the sun’s rays at the winter solstice. These sites, situated just a few miles from Skara Brae, are part of an elaborate ceremonial landscape built by Orkney’s earliest farmers.

Skara Brae and the neighboring sites have weathered thousands of years of Orkney’s wild winters and ferocious storms, but they may not outlive the changing climate of our modern era. As seas rise, storms intensify, and wave heights in this part of the world increase, the threat grows to Skara Brae, where land at each end of its protective sea wall — erected in the 1920s — is being eaten away. Today, as a result of climate change, Skara Brae is regarded by Historic Environment Scotland, the government agency responsible for its preservation, as among Scotland’s most vulnerable historic sites.

Satellite snafu masked true sea-level rise for decades

Revised tallies confirm that the rate of sea-level rise is accelerating as the Earth warms and ice sheets thaw.

Jeff Tollefson on 17 July 2017

As the Greenland ice sheet thaws, it is helping to raise the world's sea levels.

The numbers didn’t add up. Even as Earth grew warmer and glaciers and ice sheets thawed, decades of satellite data seemed to show that the rate of sea-level rise was holding steady — or even declining.

Now, after puzzling over this discrepancy for years, scientists have identified its source: a problem with the calibration of a sensor on the first of several satellites launched to measure the height of the sea surface using radar. Adjusting the data to remove that error suggests that sea levels are indeed rising at faster rates each year.

“The rate of sea-level rise is increasing, and that increase is basically what we expected,” says Steven Nerem, a remote-sensing expert at the University of Colorado Boulder who is leading the reanalysis. He presented the as-yet-unpublished analysis on 13 July in New York City at a conference sponsored by the World Climate Research Programme and the International Oceanographic Commission, among others.

Nerem's team calculated that the rate of sea-level rise increased from around 1.8 millimetres per year in 1993 to roughly 3.9 millimetres per year today as a result of global warming. In addition to the satellite calibration error, his analysis also takes into account other factors that have influenced sea-level rise in the last several decades, such as the eruption of Mount Pinatubo in the Philippines in 1991 and the recent El Niño weather pattern.

The results align with three recent studies that have raised questions about the earliest observations of sea-surface height, or altimetry, captured by the TOPEX/Poseidon spacecraft, a joint US–French mission that began collecting data in late 1992. Those measurements continued with the launch of three subsequent satellites.

“Whatever the methodology, we all come up with the same conclusions,” says Anny Cazenave, a geophysicist at the Laboratory for Studies in Space Geophysics and Oceanography (LEGOS) in Toulouse, France.

In an analysis published in Geophysical Research Letters in April, Cazenave’s team tallied up the various contributions to sea-level rise, including expansion resulting from warming ocean waters and from ice melt in places such as Greenland. Their results suggest that the satellite altimetry measurements were too high during the first six years that they were collected; after this point, scientists began using TOPEX/Poseidon's back-up sensor. The error in those early measurements distorted the long-term trend, masking a long-term increase in the rate of sea-level rise.

The problem was first identified in 2015 by a group that included John Church, an oceanographer at the University of New South Wales in Sydney, Australia. Church and his colleagues identified a discrepancy between sea-level data collected by satellites and those from tide gauges scattered around the globe. In a second paper published in June in Nature Climate Change, the researchers adjusted the altimetry records for the apparent bias and then calculated sea-level rise rates using a similar approach to Cazenave’s team. The trends lined up, Church says.

Rising tide

Still, Nerem wanted to know what had gone wrong with the satellite measurements. His team first compared the satellite data to observations from tide gauges that showed an accelerating rate of sea-level rise. Then the researchers began looking for factors that could explain the difference between the two data sets.

The team eventually identified a minor calibration that had been built into TOPEX/Poseidon's altimeter to correct any flaws in its data that might be caused by problems with the instrument, such as ageing electronic components. Nerem and his colleagues were not sure that the calibration was necessary — and when they removed it, measurements of sea-level rise in the satellite's early years aligned more closely with the tide-gauge data. The adjusted satellite data showed an increasing rate of sea-level rise over time.

“As records get longer, questions come up,” says Gavin Schmidt, a climate scientist who heads NASA’s Goddard Institute for Space Studies in New York City. But the recent spate of studies suggests that scientists have homed in on an answer, he says. “It’s all coming together.”

If sea-level rise continues to accelerate at the current rate, Nerem says, the world’s oceans could rise by about 75 centimetres over the next century. That is in line with projections made by the Intergovernmental Panel on Climate Change in 2013.
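
As a rough cross-check of that century-scale figure, one can extrapolate the two reported rates forward assuming the acceleration between 1993 and 2017 stays constant. This is a simplified illustration, not Nerem's actual method, but it lands in the same ballpark as the figure he quotes.

```python
# Crude constant-acceleration extrapolation of the reported sea-level rates.
# Illustration only; the published analysis is more involved.
rate_1993 = 1.8                  # mm per year
rate_2017 = 3.9                  # mm per year
accel = (rate_2017 - rate_1993) / (2017 - 1993)   # ~0.0875 mm per year^2

years = 100
rise_mm = rate_2017 * years + 0.5 * accel * years ** 2
print(f"Projected rise over the next century: ~{rise_mm / 10:.0f} cm")
# -> ~83 cm, the same ballpark as the ~75 cm figure quoted above
```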

“All of this gives us much more confidence that we understand what is happening,” Church says, and the message to policymakers is clear enough. Humanity needs to reduce its output of greenhouse-gas emissions, he says — and quickly. “The decisions we make now will have impacts for hundreds, and perhaps thousands, of years.”

Nature 547, 265–266 (20 July 2017) doi:10.1038/nature.2017.22312

China’s outdated coal thermal power technology sold in Vietnam

VietNamNet Bridge - China’s policy on exporting its coal thermal power technology has raised concerns among environmentalists, who say Vietnam could become the destination for the outdated technology.

Nguyen Dinh Tuan, former rector of the HCMC University of Natural Resources and the Environment, said that serious pollution in China was behind the country's decision to export the technology.

“Previously, when China was poor, they had to accept this situation. But now, as China has become the world's second largest economy with foreign reserves of more than $3.2 trillion, it wants to stop this,” he said.

“One of the measures is shutting down coal-fired thermal power plants. And once it doesn’t need the technology anymore, it will export the technology to seek profit,” he explained.

The Chinese government is restructuring energy sources with priority given to clean renewable energy.

Meanwhile, developing coal-fired plants remains the choice of Vietnam.

China’s policy on exporting its coal thermal power technology has raised concerns among environmentalists, who say Vietnam could become the destination for the outdated technology.

Vietnamese state management agencies say that most of Vietnam’s coal-fired thermal power projects have EPC contractors from China.

Tuan said while many countries offer coal thermal power technology, Vietnam is still choosing technology and EPC contractors from China.

“Chinese contractors are winning bids by offering very low prices. After winning the bids, they cite many reasons to raise investment capital,” he explained.

In some cases, Chinese contractors were chosen because the contractors offered commissions to Vietnamese officials.

The expert pointed out that China may follow a ‘dual strategy’ – both exporting technology and selling coal.

Vietnam’s own coal is considered uneconomical to burn in its coal-fired plants. Therefore, Vietnam sells its high-quality coal and imports lower-quality coal to run its thermal power plants. Because the imported coal is of lower quality, the pollution level is higher.

China has been following the coal import policy for many years and has been buying coal from Vietnam. Meanwhile, China sells coal suitable to Chinese-technology power plants to Vietnam.

“It would be bad if Vietnam both imports old polluting technology and polluting material,” Tuan said.

According to Nguy Thi Khanh, director of GreenID, Vietnam would have to import 85 million tons of coal by 2030 to run power plants as shown in the seventh power development program.

Power plants now in operation, with a combined capacity of 13,000 MW, discharge 15 million tons of ash a year that still cannot be treated, causing serious pollution.

The pollution will be even more serious if Vietnam increases coal-fired thermal power capacity to 55,000 MW, about four times the current level.

Methane Seeps Out as Arctic Permafrost Starts to Resemble Swiss Cheese

Measurements over Canada's Mackenzie River Basin suggest that thawing permafrost is starting to free greenhouse gases long trapped in oil and gas deposits.

By Bob Berwyn, InsideClimate News

Jul 19, 2017

In parts of northern Canada's Mackenzie River Delta, seen here by satellite, scientists are finding high levels of methane near deeply thawed pockets of permafrost.

Global warming may be unleashing new sources of heat-trapping methane from layers of oil and gas that have been buried deep beneath Arctic permafrost for millennia. As the Earth's frozen crust thaws, some of that gas appears to be finding new paths to the surface through permafrost that's starting to resemble Swiss cheese in some areas, scientists said.

In a study released today, the scientists used aerial sampling of the atmosphere to locate methane sources from permafrost along a 10,000 square-kilometer swath of the Mackenzie River Delta in northwestern Canada, an area known to have oil and gas deposits.

Deeply thawed pockets of permafrost, the research suggests, are releasing 17 percent of all the methane measured in the region, even though these emissions hotspots make up only 1 percent of the surface area.

In those areas, the peak concentrations of methane emissions were found to be 13 times higher than levels usually caused by bacterial decomposition—a well-known source of methane emissions from permafrost—which suggests the methane is likely also coming from geological sources, seeping up along faults and cracks in the permafrost, and from beneath lakes.

The findings suggest that global warming will "increase emissions of geologic methane that is currently still trapped under thick, continuous permafrost, as new emission pathways open due to thawing permafrost," the authors wrote in the journal Scientific Reports. Along with triggering bacterial decomposition in permafrost soils, global warming can also trigger stronger emissions of methane from fossil gas, contributing to the carbon-climate feedback loop, they concluded.

"This is another methane source that has not been included so much in the models," said the study's lead author, Katrin Kohnert, a climate scientist at the GFZ German Research Centre for Geosciences in Potsdam, Germany. "If, in other regions, the permafrost becomes discontinuous, more areas will contribute geologic methane," she said.

Similar Findings Near Permafrost Edges

The findings are based on two years of detailed aerial atmospheric sampling above the Mackenzie River Delta. It was one of the first studies to look for sources of deep methane across such a large region.

Previous site-specific studies in Alaska have looked at single sources of deep methane, including beneath lakes. A 2012 study made similar findings near the edge of permafrost areas and around melting glaciers.

Now, there is more evidence that "the loss of permafrost and glaciers opens conduits for the release of geologic methane to the atmosphere, constituting a newly identified, powerful feedback to climate warming," said the 2012 study's author, Katey Walter Anthony, a permafrost researcher at the University of Alaska Fairbanks.

"Together, these studies suggest that the geologic methane sources will likely increase in the future as permafrost warms and becomes more permeable," she said.

"I think another critical thing to point out is that you do not have to completely thaw thick permafrost to increase these geologic methane emissions," she said. "It is enough to warm permafrost and accelerate its thaw. Permafrost that starts to look like Swiss cheese would be the type that could allow substantially more geologic methane to escape in the future."

Róisín Commane, a Harvard University climate researcher, who was not involved with the study but is familiar with Kohnert's work, said, "The fluxes they saw are much larger than any biogenic flux ... so I think a different source, such as a geologic source of methane, is a reasonable interpretation."

Commane said the study makes a reasonable assumption that the high emissions hotspots are from geologic sources, but that without more site-specific data, like isotope readings, it's not possible to extrapolate the findings across the Arctic, or to know for sure if the source is from subsurface oil and gas deposits.

"There doesn't seem to be any evidence of these geogenic sources at other locations in the Arctic, but it's something that should be considered in other studies," she said. There may be regions with pockets of underground oil and gas similar to the Mackenzie River Delta that haven't yet been mapped.

Speed of Methane Release Remains a Question

The Arctic is on pace to release a lot more greenhouse gases in the decades ahead. In Alaska alone, the U.S. Geological Survey recently estimated that 16-24 percent of the state's vast permafrost area would melt by 2100.

In February, another research team documented rapidly degrading permafrost across a 52,000-square-mile swath of the northwest Canadian Arctic.

What's not clear yet is whether the rapid climate warming in the Arctic will lead to a massive surge in releases of methane, a greenhouse gas that is about 28 times more powerful at trapping heat than carbon dioxide but does not persist as long in the atmosphere. Most recent studies suggest a more gradual increase in releases, but the new research adds a missing piece of the puzzle, according to Ted Schuur, a permafrost researcher at Northern Arizona University.

Since the study only covered two years, it doesn't show long-term trends, but it makes a strong argument that there is significant methane escaping from trapped layers of oil and gas, Schuur said.

"As for current and future climate impact, what matters is the flux to the atmosphere and if it is changing ... if there is methane currently trapped by permafrost, we could imagine this source might increase as new conduits in permafrost appear," he said.

Solve Antarctica’s sea-ice puzzle

John Turner & Josefino Comiso on 19 July 2017

John Turner and Josefino Comiso call for a coordinated push to crack the baffling rise and fall of sea ice around Antarctica.

Antarctica's disappearing sea ice, from its peak in August 2016 to a record low on March 3, 2017.

Different stories are unfolding at the two poles of our planet. In the Arctic, more than half of the summer sea ice has disappeared since the late 1970s. The steady decline is what global climate models predict for a warming world. Meanwhile, in Antarctic waters, sea-ice cover has been stable, and even increasing, for decades. Record maxima were recorded in 2012, 2013 and 2014.

So it came as a surprise to scientists when on 1 March 2017, Antarctic sea-ice cover shrank to a historic low. Its extent was the smallest observed since satellite monitoring began in 1978 (see ‘Poles apart’) — at about 2 million square kilometres, or 27% below the mean annual minimum.

Researchers are struggling to understand these stark differences. Why do Antarctica’s marked regional and seasonal patterns of sea-ice change differ from the more uniform decline seen around most of the Arctic? Why has Antarctica managed to keep its sea ice until now? Is the 2017 Antarctic decline a brief anomaly or the start of a longer-term shift? Is sea-ice cover more variable than we thought? Pressingly, why do even the most highly rated climate models have Antarctic sea ice decreasing rather than increasing in recent decades? We need to know whether crucial interactions and feedbacks between the atmosphere, ocean and sea ice are missing from the models, and to what extent human influences are implicated.

What happens in the Antarctic affects the whole planet. The Southern Ocean has a key role in global ocean circulation; a frozen sea surface alters the exchange of heat and gases, including carbon dioxide, between ocean and atmosphere. Sea ice reflects sunlight and influences weather systems, the formation of clouds and precipitation patterns. These in turn affect the mass of the Antarctic ice sheet and its contribution to sea-level rise. Sea ice is also crucial to marine ecosystems. A wide range of organisms, including krill, penguins, seals and whales, depend on its seasonal advance and retreat.

It is therefore imperative that researchers understand the fate of Antarctic sea ice, especially in places where its area and thickness are changing, and why. This requires bringing together understandings of the drivers behind the movement of the ice (through drift and deformation) as well as those that control its growth and melt (thermodynamics). Such knowledge underpins climate models; these need to better represent the complex interactions between sea ice and the atmosphere, ocean and ice sheet. What’s required now is a focused and coordinated international effort across the scientific disciplines that observe and model global climate and the polar regions.

Satellites provide the best spatial information on sea ice around Antarctica. Regular observations reveal how ice cover varies over days, years and decades. Weather, especially storms with high winds, has a daily influence, as well as a seasonal one. Longer-term changes are driven by larger patterns in the temperature and circulation of the atmosphere and oceans.

But near-continuous satellite observations only reach back about four decades. Longer records are essential to link sea-ice changes to trends in climate. Information from ships’ logbooks, coastal stations, whale-catch records, early satellite imagery and chemical analyses of ice cores hint that sea-ice coverage might have been up to 25% greater in the 1940s to 1960s.

Source: US National Snow and Ice Data Center

Collecting more ice cores and historical records, and synthesizing the information they contain, could reveal local trends. These would help to identify which climatic factors drive Antarctic sea-ice changes. For instance, in 2017, the area most depleted of sea ice was south of the eastern Pacific Ocean. This region has strong links to the climate of the tropics, including the El Niño–Southern Oscillation, suggesting that sea ice is sensitive to conditions far from the poles.

Another issue is how the balance between dynamics and thermodynamics drives the advance and retreat of ice. The thickness and volume of ice depend on many factors, including the flow of heat from the ocean to the atmosphere and to the ice. Sea ice influences the saltiness of the ocean. As the ocean freezes, salt enters the water; as the ice melts, fresh water returns to the sea. Such processes are difficult to measure precisely, with uncertainties of 50–100% of the signal. And they are hard to model.

“Sea-ice coverage might have been up to 25% greater in the 1940s to 1960s.”

Satellite altimeters can measure the distance between the surfaces of the sea ice and ocean accurately, and this distance can be used to calculate the ice thickness. But it is hard to interpret these data without knowing how much snow is on the ice, its density and whether the snow’s weight pushes the ice surface below sea level (as is often the case). Calibrating and validating satellite data are crucial, as is developing algorithms to merge and analyse information from a variety of sources.
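As a rough illustration of that calculation, the sketch below assumes simple hydrostatic balance: the weight of the floating ice and its snow load equals the weight of the water it displaces. The densities and snow depth are typical illustrative values, not measurements from any particular satellite mission.

# Illustrative sketch: converting altimeter freeboard to ice thickness
# under hydrostatic balance. All parameter values are assumptions.
def ice_thickness(freeboard_m, snow_depth_m,
                  rho_water=1024.0, rho_ice=917.0, rho_snow=320.0):
    # Balance: rho_water * (thickness - freeboard) = rho_ice * thickness + rho_snow * snow_depth
    # Rearranged for thickness; a heavy snow load can push the freeboard negative.
    return (rho_water * freeboard_m + rho_snow * snow_depth_m) / (rho_water - rho_ice)

print(f"{ice_thickness(0.10, 0.20):.2f} m of ice")  # about 1.56 m for this example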

Ice, ocean and air must be sampled at appropriate intervals over a wide enough area to establish how they interact. Research ice-breaker cruises are essential for collecting in situ observations; one such was the US PIPERS (Polynyas, Ice Production and seasonal Evolution in the Ross Sea) campaign in 2017. But ships travel only along narrow routes and for a short time, typically 1–3 months.

Increasingly, autonomous instruments and vehicles — underwater, on-ice and airborne — are providing data throughout the year and from inaccessible or dangerous regions. These robotic systems are giving revolutionary new information and insights into the formation, evolution, thickness and melting of sea ice. Sensors mounted on marine mammals (such as elephant seals), or on floats and gliders, also beam back data on temperature, salinity and other physical and biogeochemical parameters. But to operate continuously, these instruments need to be robust enough to withstand the harsh Antarctic marine environment.

Improve models

Current climate models struggle to simulate the seasonal and regional variability seen in Antarctic sea ice. Most models have biases, even in basic features such as the size and spatial patterns of the annual cycle of Antarctic sea-ice growth and retreat, or the amount of heat input to the ice from the ocean. The models fail to simulate even gross changes, such as those driven by tropical influences on regional winds. Because ice and climate are closely coupled, even small errors multiply quickly.

Leopard seals live, hunt and breed among the pack ice in Antarctic waters.

Features that need to be modelled more accurately include the belt of strong westerly winds that rings Antarctica, and the Amundsen Sea Low — a stormy area southwest of the Antarctic Peninsula. Models disagree, for example, on whether persistent westerly winds should increase or decrease sea-ice coverage around Antarctica. Simulations of clouds and precipitation are also inadequate. These cannot currently reproduce the correct amounts of snowfall or sea surface temperature of the Southern Ocean (the latter is widely overestimated by the models).

Climate models must also include the mixing of waters by surface winds and the impact of waves on the formation and break-up of sea ice. Precipitation and meltwater from ice sheets and icebergs influence the vertical structure of the ocean and how it holds heat, which also affects the growth and decay of sea ice. Researchers need to develop models of the atmosphere–ocean–sea-ice environment at high spatial resolution.

Explaining the recent variability in Antarctic sea ice, and improving projections of its future in a changing climate, requires projects that bridge many disciplines. For example, the research communities involved in ice-core analysis, historical data rescue and climate modelling need to collaborate to track sea-ice variability over timescales longer than the satellite record.

“Because ice and climate are closely coupled, even small errors multiply quickly.”

Some gaps in our knowledge can be filled through nationally funded research. More demanding cross-disciplinary work must be supported through international collaboration. Leading the way are organizations such as the Scientific Committee on Antarctic Research, the Scientific Committee on Oceanic Research, the World Climate Research Programme’s Climate and Cryosphere project and the Past Global Changes project. But essential work remains to be done, including: more detailed model comparisons and assessments; more research cruises; and the continuity and enhancement of satellite observing programmes relevant to sea ice. These organizations should partner with funding agencies to make that happen.

Better representations of the Southern Ocean and its sea ice must now be a priority for modelling centres, which have been focused on simulating the loss of Arctic sea ice. Such models will be crucial to the next assessment of the Intergovernmental Panel on Climate Change, which is due around 2020–21. A good example of the collaborative projects needed is the Great Antarctic Climate Hack (see go.nature.com/2ttpzcd). This brings together diverse communities with an interest in Antarctic climate to assess the performance of models.

Nature 547, 275–277 (20 July 2017) doi:10.1038/547275a

Europe's contribution to deforestation set to rise despite pledge to halt it

Europe’s consumption of products such as beef, soy and palm oil could increase its contribution to global deforestation by more than a quarter by 2030, analysis shows

Arthur Neslen in Brussels, Friday 30 June 2017 12.35 EDT, Last modified on Friday 30 June 2017 12.37 EDT

Europe’s contribution to global deforestation may rise by more than a quarter by 2030, despite a pledge to halt such practices by the end of this decade, according to a leaked draft EU analysis.

An estimated 13m hectares (Mha) of the world’s forestland is lost each year, a figure projected to spiral in the next 30 years with the Amazon, Greater Mekong and Borneo bearing the brunt of tree clearances.

But despite signing several international pledges to end deforestation by this decade’s end, more than 5Mha of extra forest land will be needed annually by 2030 to meet EU demand for agricultural products, a draft EU feasibility study predicts.


“The EU pledged to stop deforestation by 2020, not to increase it,” said Sebastian Risso, a spokesman for Greenpeace. “Deforestation is a conveyor belt to devastating climate change and species loss that the world must stop, and fast.

“It is hypocritical for Europe to portray itself as a climate leader while doing little to slow its own insatiable appetite for the fruits of destroyed forests.”

Land clearances for crop, livestock and biofuel production are by far the biggest causes of deforestation, and Europe is the end destination for nearly a quarter of such products – beef, soy, palm oil and leather – which have been cultivated on illegally deforested lands.

According to the draft EU feasibility study, which is meant to provide officials with policy options, the “embodied” deforestation rate – which is directly linked to EU consumption – will increase from 250,000–500,000 hectares in 2015 to 340,000–590,000 hectares in 2030.

The figures do not encompass forest degradation or the impacts of lending by EU financial institutions.

The Guardian understands that the report’s methodology has also been criticised within the European commission, where some officials say that its projections are too conservative and do not account for European bioenergy demand.

A deforestation action plan that could include binding legislation is currently under consideration by the European commission.

The EU has signed several pledges to halt deforestation by 2020 within accords that include: the UN convention on biological diversity, the UN sustainable development goals, the UN forum on forests, the New York declaration on forests, and the Amsterdam declaration.

Valero refinery sues PG&E over outage that idled gas making, spewed pollution

Plumes of dark smoke flow from the Valero oil refinery in Benicia May 5 after a power outage. (Chris Riley/Times-Herald)

By Denis Cuff | dcuff@bayareanewsgroup.com | Bay Area News Group

PUBLISHED: July 3, 2017 at 3:02 pm | UPDATED: July 3, 2017 at 6:26 pm

BENICIA — The Valero oil refinery is seeking $75 million from PG&E in a federal lawsuit that blames the utility for a May 5 power outage that disrupted gasoline production for a month, damaged equipment, and showered tons of air pollution on downwind areas.

Some Benicia residents have criticized the oil refinery over pollution from hours of flaring that led to an industrial park being evacuated and an air quality agency issuing at least four violation notices against the oil company.

Valero, however, pointed the finger at PG&E, saying in a federal lawsuit that the utility abruptly pulled the plug on the refinery in a “reckless” 18-minute power failure.

PG&E cut off the flow on both of two transmission lines, one a regular power source for the refinery and the second a backup source designed to avoid shutdowns, according to the lawsuit filed Friday in U.S. District Court in Sacramento.

“PG&E must take responsibility for the damages it caused to ensure reckless power outages with potentially significant consequences never happen again,” Valero said in a statement.

PG&E apologized for the outage during a Benicia City Council meeting, and said it is taking measures to prevent power failures from happening again.

In response to the lawsuit, PG&E said it has commissioned an outside engineering firm, Exponent, to investigate the cause of the outage.

“We continue to partner with Valero and the city of Benicia to prevent similar power disruptions,” PG&E said. “The safety of our customers, employees and the general public is always our top priority.”

One of the transmission lines leading to Valero was taken off line on May 5 as part of planned maintenance, the utility said. For reasons being investigated, a breaker opened that morning, shutting down the backup line to Valero.

While the breaker opening is a safety measure to protect equipment, PG&E spokeswoman Deanna Contreras said the utility is trying to determine why it needed to open.

Valero said losing both transmission lines “simultaneously without any prior notice” put plant workers and neighbors at risk.

“PG&E knew or should have known that unplanned power dips or outages can present serious safety issues to refinery employees and members of the community, including the risk of explosions, fires, and serious injury,” attorneys for the oil company wrote in the lawsuit.

Valero said the outage damaged refinery equipment, and caused the refinery to lose money because it couldn’t produce gasoline and other fuels for about the month it took to make repairs and bring the refinery back into production.

Valero also said it deserves damages because its reputation was harmed.

Trump’s ‘drill, baby, drill’ energy policies are being met by ‘sue, baby, sue’

BY STUART LEAVENWORTH

sleavenworth@mcclatchydc.com

President Trump promised to grow jobs by rolling back Obama-era energy and pollution rules. And he’s fulfilling his pledge, but not how he intended. In just six months, Trump’s policies have resulted in a surge in employment — for environmental lawyers.

Since Trump took office, environmental groups and Democratic state attorneys general have filed more than four dozen lawsuits challenging his executive orders and decisions by the U.S. Environmental Protection Agency, Interior Department and other agencies. Environmental organizations are hiring extra lawyers. Federal agencies are requesting bigger legal defense budgets.

The first round of legal skirmishes has mostly gone to the environmentalists. Earlier this month, the U.S. Court of Appeals for the D.C. Circuit blocked the administration from delaying Obama-era rules aimed at reducing oil industry releases of methane, a potent greenhouse gas. On June 14, a federal judge ruled against permits issued to complete construction of the Dakota Access pipeline, a partial victory for the Standing Rock Sioux tribe, which claims its hunting and fishing grounds are threatened by the oil pipeline.

Abigail Dillen, a vice president of litigation for Earthjustice, a nonprofit organization that filed the Dakota Access lawsuit, said her group and others are girding for years of court battles over bedrock environmental laws, such as the Clean Water Act and Clean Air Act. Earthjustice has increased its staff of lawyers since November to 100, and is planning to hire another 30 in coming months.

“What we are seeing is unprecedented,” said Dillen, whose California-based organization was previously known as the Sierra Club Legal Defense Fund. “With this administration, we face a disregard to basic rules of environmental decision making.”

Trump swept to office vowing to boost coal and petroleum production, while ridding the nation of “job-killing regulations.” He’s worked to rapidly deliver on those promises. But the speed of his regulatory rollback — without a full legal team in place to vet new policies — has created a target-rich environment for litigation.

To cap off his 100 days in office, Trump signed an executive order that will expand offshore oil drilling in federal waters and open other areas that were previously off limits to new oil and gas exploration.

Altogether, environmental groups have sued the administration more than 50 times in federal court since Trump took office five months ago, according to a McClatchy review of the federal Public Access to Court Electronic Records (PACER) database. One group alone, the Center for Biological Diversity, has filed or joined in 25 suits, including ones seeking documents related to Trump’s planned border wall with Mexico and Trump’s approval of the Keystone oil pipeline.

In March, the group bragged on Twitter, “We’ve sued #Trump every week since 1/22. 3 suits today brings total to 15!”


During the Obama years, environmental groups also filed suits, both to toughen administration regulations and also to defend policies challenged by industry. Now, said Dillen, “the old work hasn’t gone away, and this administration has created new work for us.”

Environmental law groups have enjoyed a surge in contributions since Trump’s election, much like the ACLU and other organizations fighting Trump’s immigration policies. Earthjustice saw its fundraising increase 160 percent from the election through January, compared to the same period last year.

Federal agencies recognize they are flying into a suit storm. Several are seeking extra budget resources — or at least trying to avoid cuts — to defend the administration’s policies in court.

The Interior Department’s Office of the Solicitor, for instance, recently told Congress that it will need “substantial attorney resources” to handle an expected influx of cases. In its $65 million budget justification for Fiscal Year 2018, the solicitor’s office said: “We can also expect litigation on virtually every major permitting decision authorizing energy development on federal lands.”

It is a reasonable expectation. On Monday, Earthjustice filed suit against Interior’s attempt to delay an Obama rule limiting releases of methane from oil and gas operations on public lands. The lawsuit was filed in U.S. District Court in California on behalf of tribal groups and several environmental organizations including the Sierra Club, Natural Resources Defense Council and Wilderness Society.

Interior also faces lawsuits on plans to open tens of thousands of acres of public land to coal industry leasing, and the proposed removal of bans on new offshore oil drilling in the Arctic and Atlantic oceans.

The suits are piling up at the EPA, too. Environmentalists are suing over the agency’s refusal to ban a controversial pesticide, chlorpyrifos; its delay of rules aimed at preventing chemical plants accidents; and its delay on implementing rules on coal plant discharges into public waters.

During the Obama administration, industry groups and Republican state attorneys general engaged in their own litigation storm. As the top legal officer of Oklahoma, Scott Pruitt solicited industry contributions to build up the war chest of the Republican Attorneys General Association. He also filed or helped file 14 lawsuits against the EPA. In January, Trump tapped Pruitt to serve as EPA administrator, putting him in a position to unravel Obama-era policies he previously had litigated against.

Democratic attorneys general are now ramping up their own fundraising and are filing or joining in suits against Trump’s environmental, immigration and education policies. AGs such as California’s Xavier Becerra, Pennsylvania’s Josh Shapiro and Washington state’s Bob Ferguson are all raising their profiles with these lawsuits. All are seen as contenders for future seats as governor or U.S. senator.

The litigation surge is not going unnoticed by congressional Republicans. They’ve restarted a campaign to counter what they call “abusive” use of the courts by environmentalists and other groups. On June 28, the GOP-controlled House held a hearing focusing on lawsuits against the Interior Department, which manages the nation’s vast public lands.

“A legal sub-industry has thrived from endless environmental litigation, while burdening the livelihoods of countless citizens,” said U.S. Rep. Mike Johnson, R-La., vice chair of the House Natural Resources Subcommittee on Oversight and Investigations. Excessive litigation, he added, “drains our taxpayer dollars away from good stewardship of our natural resources.”

At the EPA, Pruitt finds himself in the awkward position of wishing the courts would dismiss litigation he filed against the agency while serving as Oklahoma attorney general. One of those suits was against Obama’s Clean Power Plan, which would set the nation’s first limits on carbon emissions from power plants.

In March, Trump’s Justice Department asked the U.S. Court of Appeals in the D.C. Circuit to indefinitely suspend the lawsuit Pruitt had filed. Apparently, the administration was concerned that if the litigation continued, the court could rule in favor of the Clean Power Plan, forcing the White House to implement a climate initiative it opposes.

In a partial victory for the administration, the appeals court in April put the litigation on hold, and asked for more information before it decides on the case going forward. Depending on the court’s final ruling, the White House will likely be allowed to go ahead with full revocation of the regulations.

Revocation of the Clean Power Plan, however, could prompt another lawsuit. Environmental groups contend the EPA is obligated to address climate pollution under the Clean Air Act.

McClatchy’s Ben Wieder contributed to this story.

Flood insurance rates could rise for hundreds of thousands of homeowners under proposal

BY STUART LEAVENWORTH

sleavenworth@mcclatchydc.com

Congress is considering sweeping changes to the debt-laden National Flood Insurance Program that could jack up flood insurance rates for hundreds of thousands of homeowners under a bill that a Florida real estate group called “devastating.”

The proposal, part of a flood-insurance package with a Sept. 30 deadline, could prove costly to homeowners in flood-prone regions ranging from Florida to Texas to California’s Central Valley. It would primarily affect homeowners with low “grandfathered” rates based on flood maps that have changed since they purchased their homes.

“This could be devastating to so many people,” said Maria S. Wells, president of the Florida Realtors trade group, which has 175,000 members statewide. “People don’t realize they could be remapped into a much more high-risk zone.”

Rep. Sean Duffy, a Wisconsin Republican who introduced the 21st Century Flood Reform Act, agreed the change could deal a financial blow to some homeowners, over time. But the current practice, he argued, is unfair to households that live outside of floodplains. Through their tax dollars, they subsidize a relatively small number of homeowners who own property near the coasts, rivers and other waterways.

Duffy said he’s open to amending his legislation to address affordability concerns — but not if it keeps the current subsidy system in place.

“We are not talking about the poorest people in the country,” said Duffy, who represents a wide swath of northern Wisconsin. “You have subsidies going to these wealthy homeowners on the coast.”

Congress has until the end of September to reauthorize the National Flood Insurance Program, otherwise the program will lapse, preventing people from buying flood insurance. When that impasse last happened for a month in 2010, an estimated 46,800 home sales transactions were interrupted or canceled, according to the National Realtors Association.

Last month, a key House committee approved seven pieces of legislation aimed at improving the solvency of the insurance program, which owes $25 billion to the U.S. Treasury.

The full House is expected to consolidate and vote on the whole package — including the part that would increase insurance rates — before the August recess, but affordability may hamper its passage. If the House and Senate can’t pass and reconcile their two reauthorization bills, Congress could miss the Sept. 30 deadline, and the housing market could see a repeat of 2010.

“We need to gently move these people to the point they are paying premiums commensurate with the risk.”

Sean Duffy, R-Wisconsin, author of the debated flood insurance bill

At issue is the insurance program’s practice of “grandfathering” premiums, which allows homeowners to continue paying relatively low insurance rates even when the Federal Emergency Management Agency, or FEMA, changes its maps to account for newly discovered risks. The practice is based on the concept that people with homes built to the required codes at the time of purchase shouldn’t be penalized when those codes change.

Under Duffy’s bill, as now written, no new grandfathering would be allowed after four years, and premiums on existing grandfathered policies would rise 8 percent yearly until they reach full-risk rates.

No one knows for sure how many households could be affected by the change, but Duffy said FEMA has told him it could number 500,000 or higher. The increased premium costs could be sizable.

For example, FEMA’s rate tables show that a home in an “A Zone” Special Flood Hazard Area — typically near a lake, river or coastline — that now carries a $3,000 annual insurance premium could see that premium rise to $5,000 a year if FEMA determined that expected flood elevations were two feet higher than previously mapped.
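To put the proposed 8 percent annual escalation in concrete terms, here is a hypothetical calculation using the article's example figures (the $5,000 full-risk rate is the illustrative number above, not a value taken from the bill):

# Hypothetical sketch: years for a grandfathered premium rising 8% a year
# to reach a full-risk rate, using the example figures in this article.
premium = 3000.0        # current grandfathered annual premium, dollars
full_risk = 5000.0      # assumed full-risk annual premium, dollars
annual_increase = 0.08  # escalation rate proposed in the bill
years = 0
while premium < full_risk:
    premium *= 1 + annual_increase
    years += 1
print(f"Reaches the full-risk rate after about {years} years (${premium:,.0f})")
# Prints about 7 years for this example.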

Wells, a real estate broker with offices in Florida and Pennsylvania, said one frequent criticism of the program — that rich people benefit disproportionately — is a myth. Under the program, flood damage claims are capped at $250,000, much less than full replacement for a wealthy person’s home.

“People talk about beach-side mansions, but I can tell you, they don’t use NFIP,” she said. “They use Lloyds of London.”

Proponents of the legislation, however, say that construction of ever-more-valuable homes along the coast and on floodplains adds to the risks being covered by the taxpayer-subsidized flood insurance program. Subsidies, they argue, encourage more people to build and buy homes on floodplains, perpetuating the program’s debt.

“We need to transition to sound actuarial rates,” said Rep. Jeb Hensarling, R-Tex., chairman of the House Financial Services Committee, which approved the legislative package last month. “Otherwise, we are putting more people in harm’s way.”

Federal flood insurance has long been the subject of a congressional tug-of-war, especially following Hurricane Katrina in 2005 and Hurricane Sandy in 2012. In 2005, FEMA borrowed more than $17 billion from the U.S. Treasury to pay claims damages, and Hurricane Sandy triggered another $9 billion in borrowing.

As of late 2016, close to five million households nationwide held federal flood insurance policies, according to FEMA. Florida was the largest, with nearly 1.8 million policy holders, followed by Texas with 608,000, Louisiana with 472,000 and California with 295,000.

The issue is complicated by the fact that many U.S. cities were originally built along coastlines and rivers and are now deemed vulnerable to extreme flooding. Scientists say that with sea level rise and stronger storms expected from global climate change, the U.S. Treasury is sure to take future big hits from flood disasters.

To reduce the mounting debt, Congress in 2012 passed the Biggert-Waters Flood Insurance Reform Act, which was “designed to allow premiums to rise to reflect the true risk of living in high-flood areas.” But two years later, following an outcry from affected homeowners, Congress scaled back the premium rate increases.

After President Trump took office, federal debt hawks saw a chance to revive provisions in the 2012 legislation, including ending “grandfathering.” That provision is part of the 21st Century Flood Reform Act introduced by Duffy.

Some of the House package has wide support from consumer, environmental and taxpayer groups. These include provisions aimed at removing obstacles for homeowners to purchase private flood insurance, improving the FEMA claims process, discouraging new development in floodplains and helping communities “mitigate” existing risks, such as elevating structures above expected flood depths.

There’s also agreement that FEMA needs to update its process for revising flood maps, although that goal could be complicated by Trump’s proposed 2018 budget, which calls for cuts to the agency’s mapping program.

Officials with the National Realtors Association fear the House will approve legislation to phase out grandfathering with no solid information on how many households would be affected and at what cost.

“We know for sure that homeowners will be hit hard if the NFIP is allowed to expire in September, but we need better data from FEMA before we can have a fully-informed discussion on flood insurance,” William E. Brown, president of the National Realtors Association, said in a statement to McClatchy. Without data, he added, “determining costs and risks both for homeowners and for taxpayers [is] tremendously difficult.”

Duffy, a former prosecutor and ESPN commentator, said his legislation is based on the principle that people should approach home purchases knowing that insurance conditions could change. He likened it to a consumer buying car insurance, knowing that premiums could go up if a teenager is added to the policy.

But Duffy also said he’s aware his bill faces opposition and he’s working to address the concerns about grandfathering.

“I am not a purist on this,” he said. “I want to get a bill that can pass.”

Nearly 200 environmental defenders murdered last year, says report

By National Observer in News, Energy, Politics | July 13th 2017

Environmental protesters are increasingly being treated like criminals and slaughtered when they are defending land from the encroachment of pipelines, mining, and other resource development, says a new report released on Thursday.

This revelation was included in a sweeping report by Global Witness on the state of “environmental defenders” in 2016—a year marked, for example, by ongoing protests of oil and gas pipelines in North America and many other forms of environmental protest around the world.

Criminalizing environmental protesters, the report argues, means people who oppose development projects face “trumped-up and aggressive criminal and civil cases.”

This translates to “silencing dissent,” scaring people, "tarnish[ing] their reputations," and wasting court resources, the report said.

Though the report’s authors found criminalization was particularly prevalent in Latin America, they highlight Canada’s work on ushering in new anti-terror legislation and noted National Observer reporter Bruce Livesey's coverage of RCMP surveillance of Indigenous and environmental activists.

"In Canada, environmental groups and First Nation Peoples fear new anti-terrorism legislation will be used to step up the surveillance of protesters opposed to oil and mining projects," the report said. "The Canadian media has also reported on several government agencies that are systematically spying on environmental organizations."

The group calls for governments to reform laws “that unjustly target environmental activism and the right to protest,” and urged companies to “stop abusing the judicial process to silence defenders.”

Global Witness also calls on governments to ensure activists “can peacefully voice their opinions without facing arrest, and are guaranteed due process when charges are brought against them.”

Global Witness is an international advocacy organization that bills itself as investigating—and making public—links between resource extraction, conflict, and corruption. The not-for-profit research group is based in London, UK.

The bulk of its new report deals with the slayings of environmental activists.

They found nearly 200 people were murdered all over the world last year in connection to mining, logging, and other extraction efforts, or while protecting national parks.

"Almost 1,000 murders have been recorded by Global Witness since 2010, with many more facing threats, attacks, harassment, stigmatization, surveillance and arrest," the report said. "Clearly governments are failing to protect activists, while a lack of accountability leaves the door open to further attacks. By backing extractive and infrastructure projects imposed on local communities without their consent, governments, companies and investors are complicit in this crisis."

Worst offenders:

Brazil = 49

Colombia = 37

Philippines = 28

India = 16

Deforestation soars in Colombia after Farc rebels' demobilization

Area of deforestation climbed 44% in 2016 compared with the year before, as criminal groups have swooped in to promote illegal logging and mining

Sibylla Brodzinsky in Bogotá, Tuesday 11 July 2017 05.00 EDT, Last modified on Tuesday 11 July 2017 17.00 EDT

Colombia has seen an alarming surge in deforestation after the leftwing rebels relinquished control over vast areas of the country as a part of a historic peace deal.

The area of deforestation jumped 44% in 2016 to 178,597 hectares (690 sq miles) compared to the year before, according to official figures released this month – and most of the destruction was in remote rainforest areas once controlled by the Revolutionary Armed Forces of Colombia (Farc).

The rebel army was a violent illegal armed group, but for decades the guerrillas enforced strict limits on logging by civilians – in part to protect their cover from air raids by government warplanes.

But last year, as the Farc began to move toward demobilization, criminal groups moved in, taking advantage of the vacuum left behind to promote illegal logging and mining and cattle ranching. Civilians who had been ordered by the Farc to maintain 20% of their land with forest cover began expanding their farms.

“The Farc would limit logging to two hectares a year in the municipality,” said Jaime Pacheco, mayor of the town of Uribe, in eastern Meta province. “In one week [last year], 100 hectares were cleared and there is little we can do about it.”

Over their 53-year existence, the Farc were far from environmental angels. While in some areas the guerrilla presence helped maintain the forests, in others the rebels promoted clear-cuts to make way for the planting of coca, the raw material used in cocaine, or illegal gold mining, both sources of income for the group.

Bombings of oil pipelines dumped millions of gallons of oil in waterways and jungles. Between 1991 and 2013, 58% of the deforestation in Colombia was seen in conflict areas, according to a 2016 report by Colombia’s planning ministry.

“They weren’t environmentalists but they did regulate activity, and – since they had the guns – people complied,” says Susana Muhamad, an environmental activist who conducted a diagnostic study of the environmental risks of the rebel retreat.

“We told the government that it would need to establish control in these areas quickly, but it hasn’t,” she said. “It’s like the wild west now, a land rush.”

Colombia, which has the world’s eighth-largest forest cover, committed under the Paris climate accords to reaching zero net deforestation by 2020 and halting the loss of all natural forest by 2030.

Norway is donating about $3.5m over two years to a pilot project that hopes to stem deforestation by offering paid jobs to former Farc fighters and communities to safeguard forests by tracking and reporting illegal logging, adopting sustainable farming methods and undertaking eco-tourism projects.

“We hope this project can be the way for more activities whereby peace comes with green dividends,” said Vidar Helgesen, Norway’s environment minister, on a recent visit to Colombia.

That is the kind of incentive farmers in the town of Uribe, once a Farc stronghold, say they need to keep the forest on their land intact. “We are willing to leave the forests if we get some sort of subsidy to do it,” one farmer, Noel Segura, said in a phone interview.

Wendy Arenas, of the government’s post-conflict department, said the new deforestation figures were an alert. “We knew it could happen,” she told el Espectador newspaper. “It is no secret that the zones left behind by the Farc are territories where other actors are taking over and profiting from logging.”

Solar Power Gets $46 Million Boost From U.S. Energy Department

By Chris Martin, July 12, 2017

Technology grants awarded to 48 universities and developers

Arizona State leads awards with new module testing methods

The U.S. Energy Department awarded $46.2 million in research grants to improve solar energy technologies and reduce costs to 3 cents per kilowatt-hour by 2030.

The money will be partly matched by the 48 projects awarded to laboratories and universities, including Arizona State, which plans to use $1.6 million to develop an X-ray test to evaluate the performance of thin-film modules under harsh conditions, according to an emailed statement Wednesday.

“These projects ensure there’s a pipeline of knowledge, human resources, transformative technology solutions and research to support the industry,” Charlie Gay, director of the Energy Department’s SunShot Initiative, said in the statement.

Other awards include:

$1.37 million to Stanford University to improve a perovskite-on-silicon module design

$1.13 million to Colorado State University to improve thin-film manufacturing

$2 million to SolarReserve Inc. to reduce costs of molten salt storage

Swiss firm Climeworks begins sucking carbon dioxide from the atmosphere in fight against climate change

Company's 'super ambitious' plan is to capture one per cent of annual emissions by 2025

Ian Johnston Environment Correspondent

Friday 23 June 2017 11:47 BST

Humans have pumped vast amounts of carbon into the atmosphere, but we could soon be removing significant amounts (Photo: Corbis)

A Swiss company has opened what is believed to be the world’s first ‘commercial’ plant that sucks carbon dioxide from the atmosphere, a process that could help reduce global warming.

Climeworks, which created the plant in Hinwil, near Zurich, told the Carbon Brief website that the process was currently not making money.

However, the firm expressed confidence it could bring down the cost from $600 per tonne of the greenhouse gas to $200 in three to five years, with a longer-term target of $100.

And its “super-ambitious” vision is to capture one per cent of annual global carbon emissions by 2025, which would involve building hundreds of thousands of the devices.
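A back-of-the-envelope sketch of where “hundreds of thousands” comes from is below. Both inputs are assumptions for illustration — roughly 40 billion tonnes of CO2 emitted globally each year, and a capture capacity on the order of 900 tonnes a year for a Hinwil-type plant — rather than figures supplied by the company.

# Back-of-the-envelope sketch of the scale-up implied by capturing 1% of
# global CO2 emissions. Both inputs are illustrative assumptions.
global_emissions_t = 40e9     # assumed annual global CO2 emissions, tonnes
per_plant_capture_t = 900.0   # assumed annual capture per Hinwil-type plant, tonnes
target_t = 0.01 * global_emissions_t
plants_needed = target_t / per_plant_capture_t
print(f"Plants needed: {plants_needed:,.0f}")  # on the order of 400,000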

The captured gas is currently being sold, appropriately enough, to a greenhouse that grows fruit and vegetables.

However it could also be used to make fizzy drinks and renewable fuels or stored underground.

One of the company’s founders, Christoph Gebald, told Carbon Brief: “With this plant, we can show costs of roughly $600 per tonne, which is, of course, if we compare it to a market price, very high.

“But, if we compare it to studies which have been done previously, projecting the costs of direct air capture, it’s a sensation.”


Previous research into the idea had assumed a cost of $1,000 per tonne.

“We are very confident that, once we build version two, three and four of this plant, we can bring down costs,” Mr Gebald said.

“We see a factor [of] three cost reduction in the next three to five years, so a final cost of $200 per tonne. The long-term target price for what we do is clearly $100 per tonne of CO2.”

He said such a carbon capture system could start to make an impact on emissions on a global scale, but this might require a price to be put on carbon emissions. The European Union and some other countries around the world have put a price on carbon for some major emitters, but environmentalists have complained it is too low and fails to reflect the true cost.

“The vision of our company is to capture [one] per cent of global emissions by 2025, which is super ambitious, but which is something that is feasible,” Mr Gebald said.

“Reaching one per cent of global emissions by 2025 is currently not possible without political will, without a price on carbon, for example. So it’s not possible by commercial means only.”

Want to fight climate change? Have fewer children

Next best actions are selling your car, avoiding flights and going vegetarian, according to study into true impacts of different green lifestyle choices

Damian Carrington Environment editor

Wednesday 12 July 2017 00.45 EDT, Last modified on Wednesday 12 July 2017 13.34 EDT

The greatest impact individuals can have in fighting climate change is to have one fewer child, according to a new study that identifies the most effective ways people can cut their carbon emissions.

The next best actions are selling your car, avoiding long flights, and eating a vegetarian diet. These reduce emissions many times more than common green activities, such as recycling, using low energy light bulbs or drying washing on a line. However, the high impact actions are rarely mentioned in government advice and school textbooks, researchers found.

Carbon emissions must fall to two tonnes of CO2 per person by 2050 to avoid severe global warming, but in the US and Australia emissions are currently 16 tonnes per person and in the UK seven tonnes. “That’s obviously a really big change and we wanted to show that individuals have an opportunity to be a part of that,” said Kimberly Nicholas, at Lund University in Sweden and one of the research team.

The new study, published in Environmental Research Letters, sets out the impact of different actions on a comparable basis. By far the biggest ultimate impact is having one fewer child, which the researchers calculated equated to a reduction of 58 tonnes of CO2 for each year of a parent’s life.

The figure was calculated by totting up the emissions of the child and all their descendants, then dividing this total by the parent’s lifespan. Each parent was ascribed 50% of the child’s emissions, 25% of their grandchildren’s emissions and so on.
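A minimal sketch of that accounting is below. Every parameter is an illustrative assumption (constant per-person emissions, a fixed lifespan, two children per descendant, and a five-generation cutoff); the study's own inputs differ, which is why its figure is higher, but the halving of responsibility with each generation is the structure described above.

# Minimal sketch of the descendant-weighted carbon accounting described
# above. All parameter values are illustrative assumptions.
annual_emissions_t = 16.0    # assumed per-person emissions, tonnes CO2 per year
lifespan_yr = 80             # assumed lifespan of each person
children_per_person = 2      # assumed number of children per descendant
generations = 5              # how far down the family tree to count
lifetime_emissions_t = annual_emissions_t * lifespan_yr
legacy_t = 0.0
for g in range(1, generations + 1):
    descendants = children_per_person ** (g - 1)  # children, grandchildren, ...
    weight = 0.5 ** g                             # 50%, 25%, 12.5%, ...
    legacy_t += descendants * weight * lifetime_emissions_t
print(f"Carbon legacy: {legacy_t:,.0f} t, or {legacy_t / lifespan_yr:.0f} t per parent-year")
# Prints 3,200 t, or 40 t per parent-year, under these toy assumptions.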

“We recognise these are deeply personal choices. But we can’t ignore the climate effect our lifestyle actually has,” said Nicholas. “It is our job as scientists to honestly report the data. Like a doctor who sees the patient is in poor health and might not like the message ‘smoking is bad for you’, we are forced to confront the fact that current emission levels are really bad for the planet and human society.”

“In life, there are many values on which people make decisions and carbon is only one of them,” she added. “I don’t have children, but it is a choice I am considering and discussing with my fiance. Because we care so much about climate change that will certainly be one factor we consider in the decision, but it won’t be the only one.”

Overpopulation has been a controversial factor in the climate change debate, with some pointing out that an American is responsible for 40 times the emissions produced by a Bangladeshi and that overconsumption is the crucial issue. The new research comes a day after researchers blamed overpopulation and overconsumption for the “biological annihilation” of wildlife that has started a mass extinction of species on the planet.

Nicholas said that many of the choices had positive effects as well, such as a healthier diet, as meat consumption in developed countries is about five times higher than recommended by health authorities. Cleaner transport also cuts air pollution, and walking and cycling can reduce obesity. “It is not a sacrifice message,” she said. “It is trying to find ways to live a good life in a way that leaves a good atmosphere for the planet. I’ve found it really positive to make many of these changes.”

The researchers analysed dozens of sources from Europe, North America and Japan to calculate the carbon savings individuals in richer nations can make. They found getting rid of a car saved 2.4 tonnes a year, avoiding a return transatlantic flight saved 1.6 tonnes and becoming vegetarian saved 0.8 tonnes a year.

These actions saved the same carbon whichever country an individual lived in, but others varied. The savings from switching to an electric car depend on how green electricity generation is, meaning big savings can be made in Australia but the savings in Belgium are six times lower. Switching your home energy supplier to a green energy company also varied, depending on whether the green energy displaces fossil fuel energy or not.

Nicholas said the low-impact actions, such as recycling, were still worth doing: “All of those are good things to do. But they are more of a beginning than an end. They are certainly not sufficient to tackle the scale of the climate challenge that we face.”

The researchers found that government advice in the US, Canada, EU and Australia rarely mentioned the high impact actions, with only the EU citing eating less meat and only Australia citing living without a car. None mentioned having one fewer child. In an analysis of school textbooks in Canada, only 4% of the recommendations were high impact.

Chris Goodall, an author on low carbon living and energy, said: “The paper usefully reminds us what matters in the fight against global warming. But in some ways it will just reinforce the suspicion of the political right that the threat of climate change is simply a cover for reducing people’s freedom to live as they want.

“Population reduction would probably reduce carbon emissions but we have many other tools for getting global warming under control,” he said. “Perhaps more importantly, cutting the number of people on the planet will take hundreds of years. Emissions reduction needs to start now.”

Full Report

http://iopscience.iop.org/article/10.1088/1748-9326/aa7541/pdf

OIRA works quietly on updating social cost of carbon

Hannah Hess, E&E News reporter, Greenwire: Thursday, June 15, 2017

Some federal employees are still working on the social cost of carbon.

Not far from the White House, some of the federal government's most influential number crunchers are still working on the social cost of carbon.

President Trump's executive order on energy independence effectively signaled "pencils down" on federal work to estimate the monetary damage of greenhouse gas emissions, disbanding the interagency working group that put a dollar value on the harm those emissions cause to the planet and society.

But his order didn't eliminate the metric entirely.

The Office of Information and Regulatory Affairs' Jim Laity, a career staffer who leads the Natural Resources and Environment Branch, said yesterday his office is "actively working on thinking about the guidance" Trump gave in March.

With the Trump agenda focused on regulatory rollback, federal agencies haven't yet issued rules that require valuations of carbon emissions, "although they are working on something coming in the not-too-distant future," Laity told an audience attending the National Academy of Sciences' seminar on valuing climate change impacts.

Employees from U.S. EPA, the Interior Department and the Department of Energy were in the crowd, along with academics and prominent Washington think tank scholars.

Greens fear the Trump administration could wipe out the social cost of carbon valuation, currently set around $40 per metric ton of carbon dioxide, as part of what they portray as a war on climate science. Revisions to the White House estimates have raised hackles among Republicans and conservatives, who allege they are part of a secret power grab.

Laity would play a key role in that effort as top overseer of energy and environmental rules.

Addressing the summit via webcast yesterday, Laity walked the audience through a decade of actions related to the calculation and influential feedback, including from a 13-member panel of the National Academies of Sciences, Engineering and Medicine. He stressed the outcome is still uncertain.

The Trump administration "looked at the work we had done, and they looked at the criticisms, and they decided we needed to have a pause to kind of rethink what we were doing a little bit in this area," Laity said, previewing the executive order that was praised by the oil and gas industry.

The metric flew under the radar during President Obama's first term. However, a significant jump in values in 2013 set the stage for a clash between then-OIRA Director Howard Shelanski and Republican lawmakers who alleged the estimate was the product of a closed-door process (E&E Daily, July 19, 2013).

A few months after the hearing, OIRA announced it would provide opportunity for public comment on the estimate. Critics took issue with the discount rate the government used to account for damage and expressed concern that the finding, though based on peer-reviewed literature, had not been peer-reviewed.

"I think we also realized that we needed to pay more attention to some of these issues than maybe we had in the past," Laity said.

OIRA received 140 unique comments, and some 39,000 letters, which Laity said mostly supported having some kind of cost-benefit analysis of the cost to society of each ton of emissions in property damage, health care costs, lost agricultural output and other expenses.

"We took some of those concerns to heart," he said.

In July 2015, the White House slightly revised its estimate. It also announced that the executive branch would seek independent expert advice from the National Academies to inform yet another revision of the estimate, though federal agencies would continue to use the current figure in their rulemakings until that revision could be made (Greenwire, July 6).

Part one of the two-phase study, released in January 2016, blessed the figure. It found the White House did not need to review its estimates in the short term. Part two, released a few weeks before Obama left office, recommended a new framework for making the estimate and more research into long-term climate damage.

Within three months, the Trump administration formally disbanded the interagency working group that would have implemented those recommendations. The executive order also formally withdrew the technical support documents OIRA had used in the past.

The think tank Resources for the Future recently announced the start of an initiative focused on responding to the scientists' recommendations (Greenwire, June 9).

Guidance for OIRA

Laity noted that the next paragraph of Trump's executive order acknowledged agencies would need to continue monetizing greenhouse gas emissions damage in their regulations, to the extent that the rules affect emissions.

As an interim measure, the order directed OIRA to offer some guidance about how to do that, directing that any values that were used would be consistent with the guidance of a document published by the Office of Management and Budget in September 2003, Circular A-4.

There are two main areas of concern: the discount rate and the global focus of the current estimate.

"A-4 has very specific advice," Laity explained. The document says pretty unequivocally that the main factor in weighing regulations should be cost and benefits to the U.S. If a regulation does have significant impacts outside the U.S. that are important to consider, these should be "clearly segregated out and reported separately," he said.

Economists clashed during a recent hearing of the House Science, Space and Technology Committee on how best to estimate the figure.

Former Obama administration official Michael Greenstone, who attended yesterday's summit and gave a presentation, argues the benefits of emission reductions play out in part in international politics.

Greenstone, chief economist for the Council of Economic Advisers in 2009 and 2010, has warned that reverting to a social cost of carbon that only considers domestic benefits of emission reductions is "essentially asking the rest of the world to ramp up their emissions" (E&E Daily, March 1).

Said Laity: "All I can say right now is that we are actively working on thinking about the guidance that we have been given in the new executive order and, sort of technically, how best to implement that in upcoming ... rulemaking."

Aging Oil Pipeline Under the Great Lakes Should Be Closed, Michigan AG Says

Enbridge’s Line 5 lies exposed on the floor of the Straits of Mackinac and has been losing chunks of its outer coating. A new report suggests ways to deal with it.

BY NICHOLAS KUSNETZ, JUN 30, 2017

Enbridge's Line 5 carries 540,000 barrels of oil per day from Canada to Ohio, going under the Straits of Mackinac, where Lake Michigan and Lake Huron meet.

Michigan Attorney General Bill Schuette called for a deadline to close a controversial portion of an oil pipeline that runs along the bottom of the Straits of Mackinac, a channel that connects two of the Great Lakes. The pipeline has had more than two dozen leaks over its lifespan, and parts of its outer coating have come off.

The announcement came as the state released a report looking at alternatives for that section of the Enbridge pipeline, called Line 5.

The report's suggestions include drilling a tunnel under the straits for a new line, selecting an alternate route or using rail cars to transport the oil instead. It also left open the possibility that the existing pipeline could continue to operate indefinitely.

"The Attorney General strongly disagrees" with allowing the existing pipeline to continue operating, said a statement released by Schuette's office on Thursday. "A specific and definite timetable to close Line 5 under the straits should be established."

Schuette did not, however, specify when that deadline should be, or how it should be set.

For years, environmentalists and a local Indian tribe have been calling for the closure of this short stretch of the pipeline. Built in 1953, it sits exposed above the lakebed where Lake Huron meets Lake Michigan. Earlier this year, Enbridge acknowledged that an outer coating had fallen off of the line in places, and it has sprung at least 29 leaks in its 64-year history. The 645-mile line carries about 540,000 barrels per day of light crude, including synthetic crude from Canada's tar sands, as well as natural gas liquids, from Superior, Wisconsin, to Sarnia, Ontario.

Schuette, a Republican, had said before that this section of the line should close eventually, but he hasn't taken any action to hasten a closure. Advocacy groups have asked the state to revoke Enbridge's easement to pass through the straits.

"It's great that he's reasserting his commitment to shut down Line 5," said Mike Shriberg, Great Lakes executive director for the National Wildlife Federation. "The question now is, is there enough evidence for him to take action right away."

The state had commissioned two studies on the line to be paid for by Enbridge, one that was released yesterday and another that was to produce a risk analysis for the pipeline. Last week, however, the state cancelled the risk analysis after discovering that someone who had contributed to it had subsequently done work for Enbridge.

Michael Barnes, an Enbridge spokesman, said the company would need time to review the report before giving specific comments, but that it "remains committed to protecting the Great Lakes and meeting the energy needs of Michigan through the safe operation of Line 5."

Shriberg said that now that the report on alternatives is out, it's time for the state to act.

"Ultimately, the attorney general and the governor have a decision to make," he said. "They've been saying for years that they've been waiting for the full information to come in."

California invested heavily in solar power. Now there's so much that other states are sometimes paid to take it

By IVAN PENN on JUNE 22, 2017 for LA Times

On 14 days during March, Arizona utilities got a gift from California: free solar power.

Well, actually better than free. California produced so much solar power on those days that it paid Arizona to take excess electricity its residents weren’t using to avoid overloading its own power lines.

It happened on eight days in January and nine in February as well. All told, those transactions helped save Arizona electricity customers millions of dollars this year, though grid operators declined to say exactly how much. And California also has paid other states to take power.

The number of days that California dumped its unused solar electricity would have been even higher if the state hadn’t ordered some solar plants to reduce production — even as natural gas power plants, which contribute to greenhouse gas emissions, continued generating electricity.

Solar and wind power production was curtailed a relatively small amount — about 3% in the first quarter of 2017 — but that’s more than double the same period last year. And the surge in solar power could push the number even higher in the future.

Why doesn’t California, a champion of renewable energy, use all the solar power it can generate?

The answer, in part, is that the state has achieved dramatic success in increasing renewable energy production in recent years. But it also reflects sharp conflicts among major energy players in the state over the best way to weave these new electricity sources into a system still dominated by fossil-fuel-generated power.

City officials and builders in Redondo Beach want a mixed-use development to replace the current natural gas facility. They say there is no need to overhaul the power plant when there is an abundance of clean alternatives. (Rick Loomis/Los Angeles Times)

No single entity is in charge of energy policy in California. This has led to a two-track approach that has created an ever-increasing glut of power and is proving costly for electricity users. Rates have risen faster here than in the rest of the U.S., and Californians now pay about 50% more than the national average.

Perhaps the most glaring example: The California Legislature has mandated that one-half of the state’s electricity come from renewable sources by 2030; today it’s about one-fourth. That goal once was considered wildly optimistic. But solar panels have become much more efficient and less expensive. So solar power is now often the same price or cheaper than most other types of electricity, and production has soared so much that the target now looks laughably easy to achieve.

At the same time, however, state regulators — who act independently of the Legislature — until recently have continued to greenlight utility company proposals to build more natural gas power plants.

State Senate Leader Kevin de Leon (D-Los Angeles) wants California to produce 100% of its electricity from clean energy sources such as solar and wind by 2045.

These conflicting energy agendas have frustrated state Senate Leader Kevin de Leon (D-Los Angeles), who opposes more fossil fuel plants. He has introduced legislation that would require the state to meet its goal of 50% of its electricity from renewable sources five years earlier, by 2025. Even more ambitiously, he recently proposed legislation to require 100% of the state’s power to come from renewable energy sources by 2045.

“I want to make sure we don’t have two different pathways,” de Leon said. Expanding clean energy production and also building natural gas plants, he added, is “a bad investment.”

Environmental groups are even more critical. They contend that building more fossil fuel plants at the same time that solar production is being curtailed shows that utilities — with the support of regulators — are putting higher profits ahead of reducing greenhouse gas emissions.

“California and others have just been getting it wrong,” said Leia Guccione, an expert in renewable energy at the Rocky Mountain Institute in Colorado, a clean power advocate. “The way [utilities] earn revenue is building stuff. When they see a need, they are perversely [incentivized] to come up with a solution like a gas plant.”

Regulators and utility officials dispute this view. They assert that the transition from fossil fuel power to renewable energy is complicated and that overlap is unavoidable.

They note that electricity demand fluctuates — it is higher in summer in California, because of air conditioning, and lower in the winter — so some production capacity inevitably will be underused in the winter. Moreover, the solar power supply fluctuates as well. It peaks at midday, when the sunlight is strongest. Even then it isn’t totally reliable.

Because no one can be sure when clouds might block sunshine during the day, fossil fuel electricity is needed to fill the gaps. Utility officials note that solar production is often cut back first because starting and stopping natural gas plants is costlier and more difficult than shutting down solar panels.

In the Mojave Desert at the California/Nevada border, the Ivanpah Solar Electric Generating System uses 347,000 garage-door-sized mirrors to heat water that powers steam generators. This solar thermal plant is one of the clean energy facilities that help produce 10% of the state’s electricity. (Mark Boster / Los Angeles Times)

Eventually, unnecessary redundancy of electricity from renewables and fossil fuel will disappear, regulators, utilities and operators of the electric grid say.

“The gas-fired generation overall will show decline,” said Neil Millar, executive director of infrastructure at CAISO, the California Independent System Operator, which runs the electric grid and shares responsibility for preventing blackouts and brownouts. “Right now, as the new generation is coming online and the older generation hasn’t left yet, there is a bit of overlap.”

Utility critics acknowledge these complexities. But they counter that utilities and regulators have been slow to grasp how rapidly technology is transforming the business. A building slowdown is long overdue, they argue.

Despite a growing glut of power, however, authorities only recently agreed to put on hold proposals for some of the new natural gas power plants that utilities want to build to reconsider whether they are needed.

A key question in the debate, which regulators are studying, is when California will be able to rely on renewable power for most or all of its needs and safely phase out fossil fuel plants.

The answer depends in large part on how fast battery storage improves, so it is cheaper and can store power closer to customers for use when the sun isn’t shining. Solar proponents say the technology is advancing rapidly, making reliance on renewables possible far sooner than previously predicted, perhaps two decades or even less from now — which means little need for new power plants with a life span of 30 to 40 years.

Calibrating this correctly is crucial to controlling electricity costs.

“It’s not the renewables that’s the problem. It’s the state’s renewable policy that’s the problem,” said Gary Ackerman, president of the Western Power Trading Forum, an association of independent power producers. “We’re curtailing renewable energy in the summertime months. In the spring, we have to give people money to take it off our hands.”

Not long ago, solar was barely a rounding error for California’s energy producers.

In 2010, power plants in the state generated just over 15% of their electricity production from renewable sources. But that was mostly wind and geothermal power, with only a scant 0.5% from solar. Now that overall amount has grown to 27%, with solar power accounting for 10%, or most of the increase. The solar figure doesn’t include the hundreds of thousands of rooftop solar systems that produce an additional 4 percentage points, a share that is ever growing.

[Chart: The share of the state’s power generated by solar utilities and rooftop panels has skyrocketed in recent years. In 2016, utility-scale solar supplied 9.6% of the state’s power and rooftop panels 4.2%. Note: rooftop panels were not tracked by the federal government prior to 2014.]

Behind the rapid expansion of solar power: its plummeting price, which makes it highly competitive with other electricity sources. In part that stems from subsidies, but much of the decline comes from the sharp drop in the cost of making solar panels and their increased efficiency in converting sunlight into electricity.

The average cost of solar power for residential, commercial and utility-scale projects declined 73% between 2010 and 2016. Solar electricity now costs 5 to 6 cents per kilowatt-hour — the amount needed to light a 100-watt bulb for 10 hours — to produce, or about the same as electricity produced by a natural gas plant and half the cost of a nuclear facility, according to the U.S. Energy Information Administration.
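
As a quick back-of-the-envelope check of the bulb comparison above (a simple arithmetic sketch, with 5.5 cents taken as the midpoint of the quoted range):

```python
# Sanity check of the kilowatt-hour comparison above: a 100-watt bulb lit for
# 10 hours consumes exactly 1 kWh. The 5.5 cents/kWh price is simply the
# midpoint of the 5-6 cent range quoted from the EIA.
bulb_watts = 100
hours = 10
energy_kwh = bulb_watts * hours / 1000        # = 1.0 kWh
cost_cents = energy_kwh * 5.5                 # roughly 5 to 6 cents of solar power
print(f"{energy_kwh} kWh of solar power costs about {cost_cents:.1f} cents to produce")
```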

Fly over the Carrizo Plain in California’s Central Valley near San Luis Obispo and you’ll see that what was once barren land is now a sprawling solar farm, with panels covering more than seven square miles — one of the world’s largest clean-energy projects. When the sun shines over the Topaz Solar Farm, the shimmering panels produce enough electricity to power all of the residential homes in a city the size of Long Beach, population 475,000.

The Topaz Solar Farm, one of the world’s largest solar plants, blankets the Carrizo Plain in the Central Valley. It supplies electricity to Pacific Gas & Electric Co. (NASA)

Other large-scale solar operations blanket swaths of the Mojave Desert, which has increasingly become a sun-soaking energy hub. The Beacon solar project covers nearly two square miles and the Ivanpah plant covers about five and a half square miles.

The state’s three big shareholder-owned utilities now count themselves among the biggest solar power producers. Southern California Edison produces or buys more than 7% of its electricity from solar generators, Pacific Gas & Electric 13% and San Diego Gas & Electric 22%.

Similarly, fly over any sizable city and you’ll see warehouses, businesses and parking lots with rooftop solar installations, and many homes as well.

With a glut of solar power at times, CAISO has two main options to avoid a system overload: order some solar and wind farms to temporarily halt operations or divert the excess power to other states.

That’s because too much electricity can overload the transmission system and result in power outages, just as too little can. Complicating matters is that even when CAISO requires large-scale solar plants to shut off panels, it can’t control solar rooftop installations that are churning out electricity.

CAISO is being forced to juggle this surplus more and more.

In 2015, solar and wind production were curtailed about 15% of the time on average during a 24-hour period. That rose to 21% in 2016 and 31% in the first few months of this year. The surge in solar production accounts for most of this, though heavy rainfall has increased hydroelectric power production in the state this year, adding to the surplus of renewables.

California’s clean energy supply is growing so fast that solar and wind producers are increasingly being ordered to halt production.

Even when solar production is curtailed, the state can produce more than it uses, because it is difficult to calibrate supply and demand precisely. As more homeowners install rooftop solar, for example, their panels can send more electricity to the grid than anticipated on some days, while the state’s overall power usage might fall below what was expected.

This means that CAISO increasingly has excess solar and wind power it can send to Arizona, Nevada and other states.

When those states need more electricity than they are producing, they pay California for the power. But California has excess power on a growing number of days when neighboring states don’t need it, so California has to pay them to take it. CAISO calls that “negative pricing.”

Why does California have to pay rather than simply give the power away free?

When there isn’t demand for all the power the state is producing, CAISO needs to quickly sell the excess to avoid overloading the electricity grid, which can cause blackouts. Basic economics kick in. Oversupply causes prices to fall, even below zero. That’s because Arizona has to curtail its own sources of electricity to take California’s power when it doesn’t really need it, which can cost money. So Arizona will use power from California at times like this only if it has an economic incentive — which means being paid.
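
A minimal sketch of that incentive follows; the curtailment cost is an assumption for illustration, not an Arizona Public Service figure.

```python
# Sketch of the "negative pricing" trade-off described above. Arizona only
# accepts California's surplus when the payment it receives at least covers
# the cost of backing down its own generation. The curtailment cost is an
# assumed figure, not actual utility data.

def worth_taking(payment_per_mwh, own_curtailment_cost_per_mwh):
    """True if being paid to import surplus power beats running local plants."""
    return payment_per_mwh >= own_curtailment_cost_per_mwh

assumed_curtailment_cost = 15.0    # $/MWh, illustrative
for payment in (0.0, 10.0, 25.0):  # the article later notes CAISO at times paid up to $25/MWh
    print(f"${payment:>5.2f}/MWh payment -> import surplus:",
          worth_taking(payment, assumed_curtailment_cost))
```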

In the first two months of this year, CAISO paid to send excess power to other states seven times more often than in the same period in 2014. “Negative pricing” occurred in an average of 18% of all sales, versus about 2.5% in the same period in 2014.

Most “negative pricing” typically has occurred for relatively short periods at midday, when solar production is highest.

But what happened in March shows how the growing supply of solar power could have a much greater impact in the future. The periods of “negative pricing” lasted longer than in the past — often for six hours at a time, and once for eight hours, according to a CAISO report.

The excess power problem will ease somewhat in the summer, when electricity usage is about 50% higher in California than in the winter.

But CAISO concedes that curtailments and “negative pricing” are likely to happen even more often in the future as solar power production continues to grow, unless action is taken to better manage the excess electricity.

The sprawling Ivanpah Solar Electric Generating System, owned by NRG Energy and BrightSource Energy, occupies 5.5 square miles in the Mojave Desert. The plant can supply electricity to 180,000 Pacific Gas & Electric and Southern California Edison customers. (Mark Boster/Los Angeles Times)

Arizona’s largest utility, Arizona Public Service, is one of the biggest beneficiaries of California’s largesse because it is next door and the power can easily be sent there on transmission lines.

On days that Arizona is paid to take California’s excess solar power, Arizona Public Service says it has cut its own solar generation rather than fossil fuel power. So California’s excess solar isn’t reducing greenhouse gases when that happens.

CAISO says it does not calculate how much it has paid others so far this year to take excess electricity. But its recent oversupply report indicated that it frequently paid buyers as much as $25 per megawatt-hour to get them to take excess power, according to the Energy Information Administration.

That’s a good deal for Arizona, which uses what it is paid by California to reduce its own customers’ electricity bills. Utility buyers typically pay an average of $14 to $45 per megawatt-hour for electricity when there isn’t a surplus from high solar power production.

With solar power surging so much that it is sometimes curtailed, does California need to spend $6 billion to $8 billion to build or refurbish eight natural gas power plants that have received preliminary approval from regulators, especially as legislative leaders want to accelerate the move away from fossil fuel energy?

The answer depends on whom you ask.

Utilities have repeatedly said yes. State regulators have agreed until now, approving almost all proposals for new power plants. But this month, citing the growing electricity surplus, regulators announced plans to put on hold the earlier approvals of four of the eight plants to determine if they really are needed.

Big utilities continue to push for all of the plants, maintaining that building natural gas plants doesn’t conflict with expanding solar power. They say both paths are necessary to ensure that California has reliable sources of power — wherever and whenever it is needed.

The biggest industrial solar power plants, they note, produce electricity in the desert, in some cases hundreds of miles from population centers where most power is used.

At times of peak demand, transmission lines can get congested, like Los Angeles highways. That’s why CAISO, utilities and regulators argue that new natural gas plants are needed closer to big cities. In addition, they say, the state needs ample electricity sources when the sun isn’t shining and the wind isn’t blowing enough.

Utility critics agree that some redundancy is needed to guarantee reliability, but they contend that the state already has more than enough.

California has so much surplus electricity that existing power plants run, on average, at slightly less than one-third of capacity. And some plants are being closed decades earlier than planned.

As for congestion, critics note that the state already is crisscrossed with an extensive network of transmission lines. Building more plants and transmission lines wouldn’t make the power system much more reliable, but would mean higher profits for utilities, critics say.

That is what the debate is about, said Jaleh Firooz, a power industry consultant who previously worked as an engineer for San Diego Gas & Electric for 24 years and helped in the formation of CAISO.

“They have the lopsided incentive of building more,” she said.

Jaleh Firooz, who worked 24 years as an engineer for San Diego Gas & Electric Co., says utilities seeking higher profits “have the lopsided incentive of building more” power plants and transmission lines. (Robert Gauthier/Los Angeles Times)

The reason: Once state regulators approve new plants or transmission lines, the cost is now built into the amount that the utility can charge electricity users — no matter how much or how little it is used.

Given that technology is rapidly tilting the competitive advantage toward solar power, there are less expensive and cleaner ways to make the transition toward renewable energy, she said.

To buttress her argument, Firooz pointed to a battle in recent years over a natural gas plant in Redondo Beach.

Independent power producer AES Southland in 2012 proposed replacing an aging facility there with a new one. The estimated cost: $250 million to $275 million, an amount that customers would pay off with higher electricity bills.

CAISO and Southern California Edison, which was going to buy power from the new plant, supported it as necessary to protect against potential power interruptions. Though solar and wind power production was increasing, they said those sources couldn’t be counted on because their production is variable, not constant.

The California Public Utilities Commission approved the project, agreeing that it was needed to meet the long-term electricity needs in the L.A. area.

But the California Coastal Conservancy, a conservation group opposed to the plant, commissioned an analysis by Firooz to determine how vital it was. Her conclusion: not at all.

Firooz calculated that the L.A. region already had excess power production capacity — even without the new plant — at least through 2020.

Along with the cushion, her report found, a combination of improved energy efficiency, local solar production, storage and other planning strategies would be more than sufficient to handle the area’s power needs even as the population grew.

She questioned utility arguments.

“In their assumptions, the amount of capacity they give to the solar is way, way undercut because they have to say, ‘What if it’s cloudy? What if the wind is not blowing?’ ” Firooz explained. “That’s how the game is played. You build these scenarios so that it basically justifies what you want.”

Undeterred, AES Southland pressed forward with its proposal. In 2013, Firooz updated her analysis at the request of the city of Redondo Beach, which was skeptical that a new plant was needed. Her findings remained the same.

Nonetheless, the state Public Utilities Commission approved the project in March 2014 on the grounds that it was needed. But the California Energy Commission, another regulatory agency whose approval for new plants is required along with the PUC’s, sided with the critics. In November 2015 it suspended the project, effectively killing it.

Asked about the plant, AES said it followed the appropriate processes in seeking approval. It declined to say whether it still thinks that a new plant is needed.

The existing facility is expected to close in 2020.

A March 2017 state report showed why critics are confident that the area will be fine without a new plant: The need for power from Redondo Beach’s existing four natural gas units has been so low, the state found, that the units have operated at less than 5% of their capacity during the last four years.

Exxon Makes a Biofuel Breakthrough

By Jennifer A Dlouhy, June 19, 2017

J. Craig Venter’s Synthetic Genomics teamed with Exxon Mobil

Technique could lead to commercialization of algae-based fuels

It’s the holy grail for biofuel developers hoping to coax energy out of algae: Keep the organism fat enough to produce oil but spry enough to grow quickly.

J. Craig Venter, the scientist who mapped the human genome, just helped Exxon Mobil Corp. strike that balance, with a breakthrough that could enable widespread commercialization of algae-based biofuels. Exxon and Venter’s Synthetic Genomics Inc. are announcing the development at a conference in San Diego on Monday.

They used advanced cell engineering to more than double the fatty lipids inside a strain of algae. The technique may be replicated to boost numbers on other species too.

"Tackling the inner workings of algae cells has not been trivial," Venter said. "Nobody’s really ever been there before; there’s no guideline to go by."

Venter, who co-founded Synthetic Genomics and sequenced the human genome in the 1990s, says the development is a significant advancement in the quest to make algae a renewable energy source. The discovery is being published in the July issue of the journal Nature Biotechnology.

Prior story: Exxon’s Algae Venture May Take Quarter Century to Yield Fuel

It’s taken eight years of what Venter called tedious research to reach this point.

When Exxon Mobil announced its $600 million collaboration with Synthetic Genomics in 2009, the oil company predicted it might yield algae-based biofuels within a decade. Four years later, Exxon executives conceded a better estimate might be within a generation.

Developing strains that reproduce and generate enough of the raw material to supply a refinery meant the venture might not succeed for at least another 25 years, former chief executive and current U.S. Secretary of State Rex Tillerson said at the time.

Even with this newest discovery, commercialization of this kind of modified algae is decades away.

Venter says the effort has "been a real slog."

"It’s to the team’s credit -- it’s to Exxon’s credit -- that they believed the steps in the learning were actually leading some place," he said. "And they have."

The companies forged on -- renewing their joint research agreement in January amid promising laboratory results.

Exxon declined to disclose how much the Irving, Texas-based company has invested in the endeavor so far. Vijay Swarup, a vice president at ExxonMobil Research and Engineering Co., says the collaboration is part of the company’s broad pursuit of "more efficient ways to produce the energy and chemicals" the world needs and "mitigate the impacts of climate change."

Carbon Consumer

Where Exxon’s chief products -- oil and natural gas -- generate carbon dioxide emissions that drive the phenomenon, algae is a CO2 consumer, Swarup said.

Most renewable fuels today are made from plant material, including corn, corn waste and soybean oil. Algae has long been considered a potentially more sustainable option; unlike those traditional biofuels, it can grow in salt water and thrive under harsh environmental conditions. And the oil contained in algae potentially could be processed in conventional refineries.

The Exxon and Synthetic Genomics team found a way to regulate the expression of genes controlling the accumulation of lipids, or fats, in the algae -- and then use it to double the strain’s lipid productivity while retaining its ability to grow.

"To my knowledge, no other group has achieved this level of lipid production by modifying algae, and there’s no algae in production that has anything like this level," Venter said in a telephone interview. It’s "our first super-strong indication that there is a path to getting to where we need to go."

Nitrogen Starved

They searched for the needed genetic regulators after observing what happened when cells were starved of nitrogen -- a tactic that generally drives more oil accumulation. Using the CRISPR-Cas9 gene-editing technique, the researchers were able to winnow a list of about 20 candidates to a single regulator -- they call it ZnCys -- and then to modulate its expression.

Test strains were grown under conditions mimicking an average spring day in southern California.

Rob Brown, Ph.D., senior director of genome engineering at Synthetic Genomics, likened the tactic to forcing an agile algae athlete to sit on the bench.

"We basically take an athlete and make them sit on a couch and get fat," Brown said. "That’s the switch — you grab this guy off the track and you put him on a couch and he turns into a couch potato. So everything he had in his body that was muscle, sinew, carbohydrates -- we basically turn that into a butterball. That’s what we’re basically doing with this system.”

Without the change, most algae growing in this environment would produce about 10 to 15 percent oil. The Exxon and Synthetic Genomics collaboration yielded a strain with more than 40 percent.

Venter, who is also working on human longevity research, views the development as a significant step toward the sustainable energy he believes humans need as they live longer, healthier lives. The study also is proof, he says, that "persistence pays."

"You have to believe in what you’re doing and that where you’re headed really is the right direction," he said, "and sometimes, like this, it takes a long time to really prove it.”

Hundreds of scientists call for caution on anti-microbial chemical use

More than 200 scientists outline a broad range of concerns for triclosan and triclocarban and call for reduced use worldwide

June 20, 2017, By Brian Bienkowski, Environmental Health News

Two ingredients used in thousands of products to kill bacteria, fungi and viruses linger in the environment and pose a risk to human health, according to a statement released today by more than 200 scientists and health professionals.

The scientists say the possible benefits in most uses of triclosan and triclocarban—used in some soaps, toothpastes, detergents, paints, carpets—are not worth the risk. The statement, published today in the Environmental Health Perspectives journal, urges “the international community to limit the production and use of triclosan and triclocarban and to question the use of other antimicrobials.”

“Triclosan and triclocarban have been permitted for years without definitive proof they’re providing benefits.” -Avery Lindeman, Green Policy Institute

They also call for warning labels on any product containing triclosan and triclocarban and for bolstered research into the chemicals' environmental toll.

The statement says evidence that the compounds are accumulating in water, land, wildlife and humans is sufficient to merit action.

“We want to draw attention to the increasing use of antimicrobials in a lot of products,”