Remember to also click on and check out the "Green Local 175 News" for environmental and green news within a 175-mile radius of Utica-Rome. Organizations and companies can send their press releases to . Express your viewpoint on a particular issue by sending an e-mail to our opinion page, "Urge to be Heard". Lastly, for more environmental information, listen to our Green Local 175 Radio & Internet Show.


"EPA Plans Soil Removal At Lead-Tainted Indiana Complex"

"EAST CHICAGO, Ind. — The U.S. Environmental Protection Agency is moving ahead with plans for a 2-foot-deep removal of lead- and arsenic-contaminated soil at the site of a northwestern Indiana public housing complex that’s been evacuated and demolished over health concerns.

The agency estimates the cleanup project at East Chicago’s West Calumet Housing Complex will cost about $26 million. It would involve removing more than 160,000 cubic yards of soil contaminated over decades by a lead-products factory and replacing it with clean soil and seed or sod.

More than 1,000 people, including about 700 children, were forced from the housing complex after 2016 tests found high lead levels in blood samples from some children and some yards with lead levels over 70 times the U.S. safety standard."

U.S. plans new limits on heavy-duty truck emissions

David Shepardson, November 12, 2018 / 11:55 AM

WASHINGTON (Reuters) - The U.S. Environmental Protection Agency will announce plans to propose new rules to significantly decrease emissions of smog-forming nitrogen oxide from diesel-powered heavy-duty trucks, an agency official said.

The U.S. Environmental Protection Agency (EPA) sign is seen on the podium at EPA headquarters in Washington, U.S., July 11, 2018. REUTERS/Ting Shen

Industry groups and state environmental officials have urged the EPA to set new nationwide rules as the state of California has been moving forward with plans to set new state emissions limits. California also wants nationwide rules, in part because more than half of all trucks delivering goods in the state are registered in other states.

The EPA said in a statement it had scheduled a formal announcement on Tuesday with industry executives and state environmental officials regarding its “Cleaner Trucks Initiative,” but did not immediately disclose details. The effort to impose a new regulatory limit by the EPA comes as the Trump administration has generally touted its efforts to eliminate regulations. But the effort on nitrogen oxide (NOx) is backed by industry, which wants to avoid a patchwork of federal and state standards, the official said.

The official asked not to be identified because the announcement was still pending.

In December 2016, the Obama administration’s EPA, responding to petitions to impose new standards, acknowledged “a need for additional NOx reductions from on-highway heavy-duty engines, particularly in areas of the country with elevated levels of air pollution” and said it planned to propose new rules that could begin in the 2024 model year.

Local and state air quality agencies and other bodies, including those in New York City, New Hampshire, Rhode Island, Los Angeles and Washington state, had petitioned for the rules.

Another administration official said Monday the new proposed emissions rules may not be written and announced until 2020.

Nitrogen oxide emissions are linked to significant health impacts and can exacerbate asthma attacks, the EPA has said.

The current heavy-duty truck rules for NOx were adopted in 2000 and took effect over the following decade.

In the aftermath of Volkswagen AG’s (VOWG_p.DE) light-duty diesel emissions scandal, in which the German automaker admitted to secretly using software to evade emissions rules, the EPA has taken steps to ensure that diesel cars and SUVs are meeting emissions requirements in on-road use.

The new NOx heavy-duty truck rules may also include new tests or other regulatory steps to ensure that vehicles and their engines are complying during real-world driving, the official said.

Reporting by David Shepardson; Editing by Steve Orlofsky and Tom Brown

Are artisanal foodie brands ruining a California national park?

Herds of cows provide meat and dairy for influential purveyors, but environmentalists say they despoil the landscape

Laura Fraser in Pt Reyes Station, California

Fri 9 Nov 2018 06.00 EST Last modified on Fri 9 Nov 2018 19.08 EST

An hour north of San Francisco lie two dozen dairy and meat farms that have produced some of the most beloved artisanal brands in northern California – along with a farm-fresh, locally sourced foodie ethos that has become globally influential.

All the dairies in Point Reyes are organic, and the beef is grass-fed. They are models of sustainable farming, providing the raw ingredients for cheesemaker Cowgirl Creamery, the Straus Family Creamery, and Marin Sun Farms meats, to name a few.

Yet some national environmentalists are taking a stand against these ranchers, who have farmed for generations on grasslands that are now part of Point Reyes National Seashore, claiming they are despoiling a landscape visited by 2.5 million people every year and should be ejected. Cows, the environmentalists argue, do not belong in a national park.

“Visitors come to Point Reyes seeking a wild part of the coastline of California, in order to view wildlife, walk on sand beaches, and tour dramatic ocean cliffs unhindered by private property and development,” said Erik Molvar, executive director of Western Watersheds Project. “They do not come here to see herds of cattle on overgrazed weed plots.”

His Idaho-based group is one of three that sued the National Park Service in 2016, claiming that Point Reyes cows were causing environmental damage, interfering with recreation and harming the herds of tule elk that roam the landscape.

Now a local Democratic congressman, Jared Huffman, has teamed up with an unlikely Republican ally from Utah, Rob Bishop, to introduce legislation to protect the ranchers, whose leases are expiring. Huffman has proposed extending them for 20 years. While Huffman’s environmental record is pristine, Bishop’s decidedly is not; he led efforts to eliminate Utah’s Bears Ears and Grand Staircase-Escalante National Monuments, opening them to the fossil fuel industry.

“Multi-generational ranching in a small portion of the seashore is part of the culture and the landscape and the character that was always meant to be protected,” Huffman said.

Sue Conley, one of the founders of Cowgirl Creamery, agreed. “The ranches have contributed significantly to the sustainable food scene” in the area, she said. “It’s a great model to have working farms in a national seashore, connecting consumers with farmers. There’s a consciousness that comes from being around nature and farming that’s really important to urban life.”

Lying along the San Andreas fault, Point Reyes National Seashore is a foggy 111-square-mile peninsula of wilderness, grasslands, rocky windswept beaches – and dairy and cattle ranches. Cows and hikers have coexisted uneasily at Point Reyes since 1962, when President John F Kennedy designated it a national seashore. To create the park, the government bought the ranches, which had been there since the Gold Rush, leasing them back long-term to families who had been working them for generations. It was a lifeline for the small producers, who faced threats including commercial agricultural enterprises and urban sprawl.

The argument over cattle in Point Reyes actually started with oysters, which had been farmed in an estuary in the national park since the 1930s. Like the ranchers, a local company also had a long-term lease – and when it expired, chef Alice Waters, Senator Dianne Feinstein and author Michael Pollan sided with the oyster farmers in their bid to stay put. Nevertheless, in 2014, the then–US interior secretary, Ken Salazar, decided that the oyster lease, which was in an area designated as wilderness, would not be renewed.

Environmentalists started to wonder: if oysters could be evicted, why not cows?

Last week Bob McClure surveyed a rare sunny sky and two tule elk bulls in the distance on his grazing land. “We’ve been led to believe that we would always stay here,” said McClure, a fourth-generation farmer whose family gave McClure Beach to the national seashore before the second world war. His 1,400-acre dairy is perched above a hiking trail and hills that tumble to a seaside lagoon. “We’ve worked cooperatively with environmentalists for 50 years,” he said. “We get that this is public ground, and we’re stewards of it.”

Both sides argue over the original intent of Congress in setting up this unusual park with historic ranches within its boundaries. “This wasn’t supposed to be Yellowstone or Yosemite,” said environmental writer John Hart, who has written histories of the area. The original plan was for the seashore to be developed as a sprawling beach recreation area, crisscrossed with roads, boardwalks and beach businesses.

Regional environmental and agricultural groups have mainly sided with the ranchers, whose organic, sustainable products are mainstays of the vibrant local food scene. They say that efforts to rid national parks of cattle may make sense elsewhere in the arid west, but Point Reyes has been grasslands for centuries owing to grazing by tule elk.

“The ranchers are making improvements that actually help protect and restore the lands,” said Linda Novy, president of the Marin Conservation League, which supports Huffman’s bill. “By grazing, the cattle maintain the rare coastal prairie ecosystem.” Without the cattle, the grasslands that characterize the hills of Point Reyes would quickly turn into dense coyote brush.

David Lewis, county director of the UC Cooperative Extension in Marin County, estimates that the Point Reyes ranches contribute as much as 20% of the county’s $110m in annual agricultural production. Given the industries that support agriculture – feed companies, veterinary services, a grass-fed beef butchery – the overall economic output of the ranches may be three times that amount. If the ranches closed, Lewis says, “You’d be losing about $60m a year in production.” The ranchers also contribute more than 5,000 jobs in the region, on and off the farms.

Albert Straus, founder and CEO of Straus Family Creamery, an organic local dairy that supplies milk to Cowgirl Creamery and others, said the ranchers are the linchpin of the local agricultural economy. Without them, the viability of the small farms in the county is at risk, he said, because the county needs a critical mass of ranches to get essential services, such as feed deliveries. “Eliminating or reducing the farms in the park would have a huge impact on the few farms that are left” in the area, he said. “It would devastate our dwindling schools and community.”

Cheeses are being made at Cowgirl Creamery inside the Tomales Bay Foods company in Point Reyes. Photograph: Talia Herman for the Guardian

For the national environmental groups, though, these commercial concerns are secondary to protecting wilderness. They raise eyebrows at the fact that Huffman reached across the aisle to a Republican like Bishop to introduce the bill. “It’s shocking that Representative Huffman would team up with anti-public lands zealot Rob Bishop to undermine the public planning process already under way at Point Reyes, hand these public lands over to ranch and dairy operations that have already been paid to get off the park, and turn a blind eye to the ecological damage this land use is doing to this special place,” said Chance Cutrano, director of Strategy at Resource Renewal Institute.

Whether or not the legislation passes, the National Park Service is continuing its own long-term planning process. While options for ridding the park of ranchers are on the table, its initial proposal to the public is to give the existing ranchers 20-year leases.

As his cows munched on grass, McClure, the rancher, suggested a solution that sounded both old-world and downright practical.

“We need to collaborate,” he said. “We all need to sit down at the Point Reyes fire station and hammer these things out.”

This story has been published in collaboration with Pacific Standard and the Missoulian.

City tests confirm some Chicago homes with water meters have lead in tap water

City Hall under Mayor Rahm Emanuel has denied for years that Chicago has a widespread problem with lead in its tap water.

John Byrne and Michael Hawthorne

Chicago Tribune

City testing of Chicago homes with water meters found nearly 1 in 5 sampled had brain-damaging lead in their tap water, but Mayor Rahm Emanuel’s water commissioner acknowledged Thursday that the city continued installing new meters after learning about the alarming results in June.

Disclosure of the previously secret study of 296 metered homes comes after more than five years of denials by Emanuel and his aides that the nation’s third-largest city has a widespread lead problem, even as the scandal in Flint, Mich., drew national attention to the hazards and other research in Chicago consistently found the toxic metal in drinking water.

The Emanuel administration’s sudden reversal, outlined at a hastily organized City Hall news conference, adds Chicago to a growing list of cities that are distributing water filters to homes with lead service lines, which in Chicago were required by the city’s plumbing code until Congress banned the practice in 1986.

Randy Conner, the city’s water commissioner, and Julie Morita, the health commissioner, said all 165,000 Chicago homes with water meters are eligible for city-provided water filters. Money collected through water bills will cover the cost of $60 kits that include a pitcher and six replacement filters, Conner said.

“It was just determined that this was the appropriate way of action between myself, Morita and the scientists,” Conner said when asked why the city took so long to address the well-documented health risks.

The Chicago Tribune first reported in 2013 that the city water department and the U.S. Environmental Protection Agency had found high levels of lead in Chicago tap water after lead service lines had been disturbed by street work or plumbing repairs, including the installation of water meters.

Emanuel dramatically expanded that type of work after taking office in 2011. His administration has borrowed more than $481 million for water conservation projects, including the installation of household meters and new water mains citywide. The city has steadily raised water rates to pay back the 20-year loans.

None of the money has been earmarked to replace lead service lines.

In April, a Tribune analysis revealed that lead was found in water samples drawn from nearly 70 percent of the 2,797 homes that returned free testing kits provided by the city during the past two years. The toxic metal turned up in samples collected throughout the city, the newspaper found. Tap water in 3 of every 10 homes tested had lead concentrations above 5 parts per billion, the maximum allowed in bottled water by the U.S. Food and Drug Administration.

As recently as September, mayoral aides and water department officials continued to insist that it is up to individual homeowners to protect themselves from mostly invisible particles leaching out of lead pipes the city required by law for decades. On Wednesday, Emanuel himself declared Chicago’s drinking water is safe while opposing plans introduced in the City Council to finance the replacement of lead service lines. He accused the measure’s authors of treating homeowners “as an ATM machine” by proposing to pay for the project with a 1 percent tax on sales of Chicago homes worth more than $750,000.

“I believe in science in forming good policy decisions,” Emanuel said.

A day later, Conner and Morita announced the city would begin distributing water filters shortly before the water commissioner was scheduled to appear at a City Council budget hearing, a setting during which aldermen could slam the Emanuel administration for not doing more about the lead problem.

Revealing the plan to distribute water filters provided Conner with a response to blunt the criticism. Yet there is no guarantee any other action will be taken beyond conducting another study.

“What we’re committed to doing is taking a look at this thing holistically, and understanding what this is going to take to tackle this issue, from the feasibility, the framework and a funding perspective,” Conner said about a $750,000 contract with the global engineering firm CDM Smith, which is required to submit a new review of the municipal water system before Emanuel leaves office in the spring.

Lead is unsafe at any level, according to the EPA and the Centers for Disease Control and Prevention. Emanuel and his aides this week continued their technically true but misleading defense that Chicago drinking water is safe because it meets federal standards.

Water utilities are considered to be in compliance with federal water quality regulations as long as 90 percent of the homes tested have lead levels below 15 ppb, a 1991 standard the EPA acknowledges is based not on the dangers of lead but on the agency’s belief that the limit could be met with corrosion-inhibiting chemicals.
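The 90th-percentile compliance test described above can be sketched in a few lines of Python. The sample values below are hypothetical, and the rank convention (take the sample at position 0.9 × n when results are sorted lowest to highest) is a simplified reading of the rule for a ten-sample set:

```python
# Sketch of the federal lead compliance test described above: a utility
# passes if the 90th-percentile sample is below the 15 ppb action level.
# Sample values are HYPOTHETICAL, chosen only to illustrate the mechanism.
samples_ppb = [1, 2, 2, 3, 3, 4, 5, 6, 9, 40]   # results from 10 homes

samples_ppb.sort()
# Simplified rank convention: the 90th-percentile value is the sample
# at position 0.9 * n in the sorted list (here, the 9th of 10 results).
p90 = samples_ppb[int(0.9 * len(samples_ppb)) - 1]
print(p90, "ppb ->", "compliant" if p90 < 15 else "action level exceeded")
```

Note how the utility passes even though one home tested at 40 ppb, which is the gap between systemwide compliance and individual-home safety that the article describes.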

Chicago conducts this type of testing in just 50 homes every three years — the minimum required — and typically doesn’t find anything wrong. Most of the Chicago homes tested for regulatory purposes during the past decade were owned by water department employees or retirees living on the Far Northwest and Far Southwest sides.

Aldermen who have been calling for the removal of lead service lines held their own City Hall news conference Thursday. They criticized the Emanuel administration for not immediately publicizing the results of the city’s study of metered homes and for failing to stop installing meters after learning the work could be putting Chicagoans at risk.

“It’s dangerous, it’s irresponsible and it’s unacceptable,” said Ald. Chris Taliaferro, 29th.

“I think the lack of transparency and the communication is what’s lacking here,” said Ald. Gilbert Villegas, 36th, one of the sponsors of the proposed transfer tax Emanuel opposes. “They need to do a better job. Let’s get ahead of this thing.”

Paul Vallas, one of the candidates to replace Emanuel as mayor, has urged Illinois Attorney General Lisa Madigan to investigate.

“Since June, I have been calling on the city to take more aggressive action to address our lead in the drinking water problem, but the Emanuel administration has dismissed me as a panic peddler,” Vallas said in a statement that accused the mayor and his aides of “an unbelievable level of cynicism” in their public statements on the issue.

Morita, the city health commissioner, noted that the number of Chicago children with elevated levels of lead in their blood has steadily declined citywide for years. “First and foremost, there is no public health crisis,” she said.

Most lead exposure comes from ingesting dust in homes built before 1978 with lead-based paint. A 2015 Tribune investigation found that while the rate of childhood lead poisoning has declined citywide, more than a fifth of the children tested in some of the poorest parts of Chicago still had levels of the toxic metal in their blood that exceeded CDC guidelines.

The city later announced it would begin testing water in the homes of poisoned children for the first time.

"Fast-Rising Demand for Air Conditioning Is Adding to Global Warming"

"With window units set to more than triple by 2050, home air conditioning is on pace to add half a degree Celsius to global warming this century, a new report says."

"Increasing demand for home air conditioning driven by global warming, population growth and rising incomes in developing countries could increase the planet's temperatures an additional half a degree Celsius by the end of the century, according to a new report by the Rocky Mountain Institute.

The demand is growing so fast that a "radical change" in home-cooling technology will be necessary to neutralize its impact, writes RMI, an energy innovation and sustainability organization.

The problem with air-conditioning comes from two sources: the amount of energy used, much of which is still powered by carbon-emitting coal, oil and gas generation, and the leaking of hydrofluorocarbon (HFC) coolants, which are short-lived climate pollutants many times more potent than carbon dioxide."

More Evidence Points to China as Source of Ozone-Depleting Gas

By Chris Buckley

Nov. 3, 2018

BEIJING — An environmental group says it has new evidence showing that China is behind the resurgence of a banned industrial gas that not only destroys the planet’s protective ozone layer but also contributes to global warming.

The gas, trichlorofluoromethane, or CFC-11, is supposed to be phased out worldwide under the Montreal Protocol, the global agreement to protect the ozone layer. In May, however, scientists published research showing that CFC-11 levels in the atmosphere had begun falling more slowly. Their findings suggested significant new emissions of the gas, most likely from East Asia.

Evidence then uncovered by The New York Times and the Environmental Investigation Agency pointed to rogue factories in China as a likely major source.

Now, the E.I.A. has prepared a report that it says bolsters the finding that Chinese factories are behind the return of CFC-11.

Independent laboratory tests “clearly confirm the use of CFC-11 in three enterprises” in China, the agency said in the report. It plans to submit the work this week in Quito, Ecuador, where delegates from nearly 200 countries are attending a Montreal Protocol meeting on the status of efforts to repair the ozone layer.

Avipsa Mahapatra, head of the climate change campaign at the E.I.A., said the Chinese authorities should make thorough regulatory changes that make underground CFC-11 production impossible. “Simply clamping down a few enterprises without systemic changes could mean that similar illegal enterprises pop up in other regions,” she said in an email.

But definitive answers and solutions to the problem of CFC-11 appear to be some way off.

Chinese officials have said they have already acted vigorously to close rogue chemical makers. They have also asserted that the CFC-11 emissions in question are too large to be solely from those operations.

Tracing the source of a banned chemical

Scientists have said that they need more time and data to pin down the causes of the CFC-11 resurgence.

“When it comes to definitive answers, I think we have to first emphasize that this mystery has yet to be solved,” said Keith Weller, a spokesman for the United Nations Environment Program, which helps organize the ozone layer talks.

The CFC-11 mystery has wide implications. The ozone layer has been healing, but the return of a banned substance is an alarming breach in one of the world’s most effective environmental pacts and could slow the layer’s recovery.

CFC-11 is also a potent greenhouse gas. If it is leaking directly into the atmosphere from factories, even more gas may be held in the products made in those factories — for example, in insulation foam — and may enter the atmosphere when those products are eventually destroyed.

Scientists discovered decades ago that CFC-11 and other manufactured chemicals used as refrigerants and aerosols and in the production of insulating foams were destroying the ozone layer, which shields humans, crops and animals from the most damaging solar rays.

In 1987, countries agreed on the Montreal Protocol to phase out such gases, steadily replacing them with ever safer substitutes. The protocol has been praised as a model environmental initiative.

The Chinese government has said it will investigate and stamp out any illicit production of CFC-11, and Chinese industrial associations have vowed that their businesses will not use the chemical.

Officials announced last month that the police had broken up an illegal CFC-11 plant in Henan, a rural heartland province, and found more than 30 metric tons of the chemical on the site, according to an official report.

A spokesman for the Chinese Ministry of Ecology and Environment, Liu Youbin, told a news conference on Wednesday that inspectors had checked 1,172 businesses over recent months and found evidence of CFC-11 in only 10.

“If it was just those small, illegal roaming producers, the volume could not be that much,” Chen Liang, an official with the ministry who oversees international cooperation, including in ozone layer policy, said in an interview.

Mr. Weller also said the estimated new emissions of CFC-11, on the order of 13,000 metric tons per year, appeared to be too great to come from illegal production alone.

Yet Mr. Chen also said there were daunting barriers to regulating China’s vast numbers of chemical and foam-making businesses. By his count, there were about 3,000 businesses in the foam sector. But the numbers of scattered, under-the-radar plants could be much higher.

Furthermore, Mr. Chen said, local inspectors across China can lack the equipment to quickly measure levels of CFC-11 or the chemicals used to make it. Fly-by-night chemical producers were hard to uncover and punish, he added.

“After they finish up production, everyone leaves,” Mr. Chen said. “This is very difficult to attack.”

CFC-11 was also once widely used as a refrigerant for fridges and air-conditioners, and Mr. Chen said that the pickup in CFC-11 might come from leakage when appliances were improperly scrapped. But an expert panel that advises Montreal Protocol member governments has assembled numbers that put into doubt the assertion that leakage can explain the persistent CFC-11, at least on its own.

The panel noted that it would take the destruction of 13 million large fridges a year to account for 13,000 tons of CFC-11 released in the atmosphere. China, where fridges tend to be smaller, disposes of about 1.5 million of them a year, the panel said.

The delegates meeting in Ecuador will receive a new United Nations assessment of the health of the ozone layer that confirms the resurgence of CFC-11 emissions, and some are likely to press China for more answers. But scientists have said it will take longer before they can confidently track down the source or sources of the pollution.

In September, a team of scientists published research confirming that China has been emitting unaccountably high volumes of carbon tetrachloride, an ozone-destroying chemical with uses that include the production of CFC-11. But Mark Lunt, a researcher at the University of Edinburgh who was a co-author of the study, stressed it was too early to identify the precise sources of the carbon tetrachloride and say if there was a link with making CFC-11.

The team of scientists would soon submit a research paper on CFC-11, Mr. Lunt said. Other efforts to study the resurgence of CFC-11 are also afoot.

“There’s a huge variety of questions we need to answer,” Paul A. Newman, chief scientist for earth sciences at NASA’s Goddard Space Flight Center, said in a telephone interview. “I just think there’s a real lack of information at this point.”

Henry Fountain contributed reporting from New York.

Chris Buckley covers China, where he has lived for more than 20 years after growing up in Australia. Before joining The Times in 2012, he was a correspondent for Reuters. @ChuBailiang

Bitcoin can push global warming above 2 degrees C in a couple decades


It alone could produce enough emissions to raise global temperatures by 2°C as soon as 2033


A new study published in the peer-reviewed journal Nature Climate Change finds that if Bitcoin is adopted at rates similar to those at which other technologies have been incorporated, it alone could produce enough emissions to raise global temperatures by 2°C as soon as 2033.

"Bitcoin is a cryptocurrency with heavy hardware requirements, and this obviously translates into large electricity demands," said Randi Rollins, a master's student at the University of Hawaii at Manoa and coauthor of the paper.

Purchasing with bitcoins and several other cryptocurrencies, which are forms of currency that exist digitally through encryption, requires large amounts of electricity. Bitcoin purchases create transactions that are recorded and processed by a group of individuals referred to as miners. Miners group every Bitcoin transaction made during a specific timeframe into a block. Blocks are then added to the chain, which is the public ledger. The verification process by miners, who compete to decipher a computationally demanding proof-of-work in exchange for bitcoins, requires large amounts of electricity.
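The proof-of-work search the paragraph describes can be illustrated with a toy sketch. Real Bitcoin mining uses double SHA-256 over binary block headers at a vastly higher difficulty; this simplified version only shows why the competition wastes so much computation:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so the block's SHA-256 hash starts with
    `difficulty` zero hex digits -- a toy version of Bitcoin's proof-of-work."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed attempt is electricity spent and discarded

# Even at this trivial difficulty, tens of thousands of hashes are typically
# tried and thrown away; Bitcoin's real difficulty is billions of times higher.
nonce = mine("block of transactions", difficulty=4)
print(nonce)
```

Because difficulty adjusts upward as more miners compete, the electricity consumed per block grows with the network, which is the link to emissions the study examines.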

The electricity requirements of Bitcoin have created considerable difficulties, and extensive online discussion, about where to put the facilities or rigs that compute the proof-of-work of Bitcoin. A somewhat less discussed issue is the environmental impact of producing all that electricity.

A team of UH Manoa researchers analyzed information such as the power efficiency of computers used by Bitcoin mining, the geographic location of the miners who likely computed the Bitcoin, and the CO2 emissions of producing electricity in those countries. Based on the data, the researchers estimated that the use of bitcoins in the year 2017 emitted 69 million metric tons of CO2.
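The estimation logic the researchers describe can be sketched as a weighted sum: mining electricity use multiplied by the carbon intensity of the grid in each region where miners operate. The shares and intensities below are illustrative placeholders, not the paper's data:

```python
# Sketch of the study's estimation logic: total emissions equal mining
# energy use weighted by grid carbon intensity where the miners are located.
# All numbers below are ILLUSTRATIVE assumptions, not the paper's data.
mining_energy_twh = 30.0  # hypothetical annual electricity use of the network

# (share of mining activity, grid carbon intensity in Mt CO2 per TWh)
regions = [
    (0.60, 0.70),   # coal-heavy grid
    (0.25, 0.40),   # mixed grid
    (0.15, 0.10),   # hydro-heavy grid
]

emissions_mt = sum(share * intensity * mining_energy_twh
                   for share, intensity in regions)
print(round(emissions_mt, 1))  # Mt CO2 for these assumed inputs
```

The same structure, fed with the study's measured hardware efficiencies and per-country emission factors, yields the 69 million metric ton figure for 2017.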

Researchers also studied how other technologies have been adopted by society, and created scenarios to estimate the cumulative emissions of Bitcoin should it grow at the rate that other technologies have been incorporated.

The team found that if Bitcoin is adopted even at the slowest rate at which other technologies have been incorporated, its cumulative emissions will be enough to warm the planet above 2°C in just 22 years. If adopted at the average rate of other technologies, it is closer to 16 years.

"Currently, the emissions from transportation, housing and food are considered the main contributors to ongoing climate change. This research illustrates that Bitcoin should be added to this list," said Katie Taladay, a UH Manoa master's student and coauthor of the paper.

"We cannot predict the future of Bitcoin, but if implemented at a rate even close to the slowest pace at which other technologies have been incorporated, it will spell very bad news for climate change and the people and species impacted by it," said Camilo Mora, associate professor of Geography in the College of Social Sciences at UH Manoa and lead author of the study.

"With the ever-growing devastation created by hazardous climate conditions, humanity is coming to terms with the fact that climate change is as real and personal as it can be," added Mora. "Clearly, any further development of cryptocurrencies should critically aim to reduce electricity demand, if the potentially devastating consequences of 2°C of global warming are to be avoided."

Study reconciles persistent gap in natural gas methane emissions measurements


The Colorado State University-led study resulted from a large, multi-institutional field campaign

A new study offers answers to questions that have puzzled policymakers, researchers and regulatory agencies through decades of inquiry and evolving science: How much total methane, a greenhouse gas, is being emitted from natural gas operations across the U.S.? And why have different estimation methods, applied in various U.S. oil and gas basins, seemed to disagree?

The Colorado State University-led study, published Oct. 29 in Proceedings of the National Academy of Sciences, resulted from a large, multi-institutional field campaign called the Basin Methane Reconciliation Study. The researchers found that episodic releases of methane that occur mostly during daytime-only maintenance operations, at a few facilities on any given day, may explain why total emissions accountings have not agreed in past analyses.

With invaluable assistance from industry partners, the researchers have significantly advanced basin-level emission quantification methods and shed new light on important emissions processes.

"Our study is the first of its kind, in its scope and approach," said Dan Zimmerle, senior author of the PNAS study, and a senior research associate at the CSU Energy Institute. "It utilized concurrent ground and aircraft measurements and on-site operations data, and as a result reduces uncertainties of previous studies."

The Basin Methane Reconciliation Study included scientists from CSU, Colorado School of Mines, University of Colorado Boulder, the National Oceanic and Atmospheric Administration, and the National Renewable Energy Laboratory. Other scientific partners were University of Wyoming, Aerodyne, AECOM, Scientific Aviation and GHD. The field campaign took place in 2015 in the Fayetteville shale gas play of Arkansas' Arkoma Basin.

The campaign involved more than 60 researchers making coordinated facility- and device-level measurements of key natural gas emissions sources. The campaign also included a series of aircraft flyovers to collect measurements during the same period when researchers were taking measurements on the ground. The flights took place when meteorological conditions allowed accurate regional emissions estimates.

The research team set out to investigate the persistent gap between two widely used methods of estimating methane emissions from natural gas operations. "Bottom-up" estimates, such as those used in the EPA Inventory of U.S. Greenhouse Gas Emissions and Sinks, are developed by measuring emissions from a representative sample of devices, scaled up by the number of devices or emission events. In contrast, "top-down" measurements can be performed at a regional scale, such as flying an aircraft upwind and downwind of a study area to derive total emissions from methane entering or leaving a basin.
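The two accounting styles contrasted above can be sketched in a few lines. The emission factors, device counts, and fluxes below are illustrative assumptions for the sketch, not data from the study:

```python
# --- Bottom-up: scale per-device emission factors by device counts ---
emission_factors_kg_per_hr = {      # assumed average emission rate per device
    "well_pad": 0.5,
    "compressor_station": 12.0,
    "gathering_pipeline_km": 0.05,
}
device_counts = {                   # assumed basin inventory
    "well_pad": 4000,
    "compressor_station": 60,
    "gathering_pipeline_km": 3000,
}
bottom_up_kg_per_hr = sum(
    emission_factors_kg_per_hr[k] * device_counts[k] for k in device_counts
)

# --- Top-down: aircraft mass balance across the basin ---
# Total emissions = methane flux leaving the basin (downwind transect)
# minus flux entering it (upwind transect).
downwind_flux_kg_per_hr = 3500.0    # assumed, from concentration x wind speed
upwind_flux_kg_per_hr = 400.0
top_down_kg_per_hr = downwind_flux_kg_per_hr - upwind_flux_kg_per_hr

print(bottom_up_kg_per_hr, top_down_kg_per_hr)
```

With these made-up inputs the two totals disagree, which is exactly the kind of gap the study set out to reconcile.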

In the past, most aircraft-based, basin-scale emissions estimates have been statistically higher than estimates based on bottom-up accounting.

"The key to our efforts was having everyone out in the same field, at the same time," said Gabrielle Petron, a research scientist from CIRES at CU Boulder and NOAA who was the principal investigator for the top-down measurement team. "By comparing the merits and pitfalls of multiple methods of measurement, we were able to paint a much more comprehensive, and we believe accurate, picture of the methane emissions landscape for natural gas infrastructure."

The PNAS paper utilized a comprehensive set of results published in lead-up papers evaluating emissions from natural gas facilities, including well pads, gathering stations, gathering pipelines, and transmission and distribution sectors. The teams made simultaneous measurements using multiple methods at well pads and compressor stations - the largest emissions sources identified in the basin - which highlighted the strengths and weaknesses of various on-site and downwind methods. Natural gas distribution systems were only a small fraction of total natural gas emissions. The team also estimated methane emissions from biogenic sources, such as agriculture and landfills.

The entirety of those research efforts culminated in the CSU-led PNAS paper that synthesized all the data taken by the research teams. This capstone paper compared a bottom-up estimate that accounted for the location and timing of emissions, with a top-down estimate developed via aircraft measurement - an analysis that had never been done before. One of the study's key insights was the importance of understanding the timing of measurements and daytime maintenance activities, which likely explains the persistent gap between previous top-down and bottom-up estimates.

Routine maintenance activities occur in the daytime and can cause short periods of high emissions in the middle of the day. One of these activities is called "manual liquids unloading," which removes liquids buildup at a natural gas well in order to restore gas production flows. The unloading process can temporarily divert the flow of natural gas from the well to an atmospheric vent.

These activities typically happen during the day, around the same time a research aircraft would be conducting measurements. This diurnal variability of methane emissions, the researchers concluded, may help explain why estimates from aircraft measurements have previously been higher than estimates based on annual bottom-up inventories.
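The timing effect described above is simple arithmetic. With assumed (not measured) numbers, episodic daytime releases inflate a midday snapshot relative to the 24-hour average that a bottom-up inventory reports:

```python
# Toy arithmetic showing why a midday aircraft snapshot can exceed an
# annual-average bottom-up inventory when episodic events (e.g., manual
# liquids unloadings) are concentrated in daytime work hours.

baseline_kg_per_hr = 1000.0         # assumed steady emissions, all 24 hours
episodic_total_kg_per_day = 4800.0  # assumed daily total from maintenance
work_hours = 8                      # events concentrated in an 8-hour window

# Bottom-up style: spread the episodic total over the full day
daily_avg_kg_per_hr = baseline_kg_per_hr + episodic_total_kg_per_day / 24

# Top-down style: an afternoon flight samples the rate during work hours
midday_rate_kg_per_hr = baseline_kg_per_hr + episodic_total_kg_per_day / work_hours

print(daily_avg_kg_per_hr, midday_rate_kg_per_hr)
```

Both numbers are "correct" measurements of the same basin; they differ only because they sample different windows in time.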

In this study, for the first time, the researchers showed that the east-to-west variation of methane emissions across the basin, derived from aircraft measurements, was reproduced by the high-resolution, bottom-up emissions model developed by the CSU team.

Industry partnerships and public/private funding

Working with industry technical experts gave the study team invaluable full site access for bottom-up measurements and hourly and spatially resolved operational data. The data improved the understanding of the magnitude and timing of emissions, particularly for episodic events like maintenance operations.

"What we have found is that we have two good methods of measurements. If you want to compare them, you have to account for timing and location of emissions," said PNAS lead author Tim Vaughn, a CSU research scientist.

Applicability of these results to other production basins is not known, the researchers say, without understanding the timing of emissions in those other basins.

Funding for the PNAS study was provided by the Department of Energy Research Partnership to Secure Energy for America/National Energy Technology Laboratory contract No. 12122-95/DE-AC26-07NT42677 to the Colorado School of Mines. Cost share for the project was provided by Colorado Energy Research Collaboratory, the National Oceanic and Atmospheric Administration, Southwestern Energy, XTO Energy, a subsidiary of ExxonMobil, Chevron, Equinor and the American Gas Association, many of whom also provided operational data and/or site access. Additional data and/or site access was also provided by CenterPoint, Enable Midstream Partners, Kinder Morgan, and BHP Billiton.

Most Americans underestimate minorities' environmental concerns -- even minorities



ITHACA, N.Y. - A new study shows most Americans underestimate just how concerned minorities and lower-income people are about environmental threats, a misperception shared even by members of those groups themselves.

These misperceptions contradict significant research that shows racial and ethnic minorities and the poor are consistently among the most worried about environmental challenges, said co-author Jonathon Schuldt, associate professor of communication at Cornell University.

"What really surprised us was just how paradoxical the results were," he said. "We found a very consistent pattern that if the American public thought a group was very low in concern, in fact that same group was reporting high levels of concern."

The study, "Diverse Segments of the U.S. Public Underestimate the Environmental Concerns of Minority and Low-Income Americans," was published in the Proceedings of the National Academy of Sciences. It also found most Americans associate the term "environmentalist" most closely with whites and the well-educated.

Schuldt and his co-authors attribute the findings to stereotypes in American culture. For example, there's a misperception that people with lower incomes have more pressing needs and don't have the luxury to worry about environmental threats. However, poorer people and people of color consistently report the opposite, perhaps because they are typically the hardest hit by environmental challenges.

The researchers conducted an online survey of a nationally representative sample of 1,200 Americans about their levels of concern for the environment, whether they identified as an environmentalist, and the age, socioeconomic class and race they associated with the term "environmentalist."

The findings could have practical implications for environmental advocacy and policy. If policymakers, scholars and practitioners endorse similar views, these misperceptions may influence which groups' perspectives get prioritized and may contribute to the historical marginalization of minority and lower-income populations.

Co-authors on the paper were Adam Pearson of Pomona College, Rainer Romero-Canyas of the Environmental Defense Fund and Columbia University, Matthew Ballew of Yale University and Dylan Larson-Konar of the Environmental Defense Fund and the University of Florida, Gainesville. The research was funded in part by a grant from Cornell's Atkinson Center for a Sustainable Future and the Environmental Defense Fund.

Alterations to seabed raise fears for future

Ocean acidification caused by high levels of human-made CO2 is dissolving the seafloor



Normally the deep sea bottom is a chalky white. It's composed, to a large extent, of the mineral calcite (CaCO3) formed from the skeletons and shells of many planktonic organisms and corals. The seafloor plays a crucial role in controlling the degree of ocean acidification. The dissolution of calcite neutralizes the acidity of the CO2, and in the process prevents seawater from becoming too acidic. But these days, at least in certain hotspots such as the North Atlantic and the Southern Ocean, the ocean's chalky bed is becoming more of a murky brown. As a result of human activities the level of CO2 in the water is so high, and the water is so acidic, that the calcite is simply being dissolved.
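The neutralization described above is the standard carbonate dissolution equilibrium, written out here for reference (a textbook equation, not taken from the study):

```latex
\mathrm{CaCO_3(s)} + \mathrm{CO_2(aq)} + \mathrm{H_2O}
\;\rightleftharpoons\;
\mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^{-}}
```

Dissolving calcite consumes dissolved CO2, which is why an intact chalky seafloor buffers acidification and a dissolved-away, brown one cannot.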

The McGill-led research team, which published its results this week in a study in PNAS, believes that what it is seeing today is only a foretaste of how the ocean floor will most likely be affected in the future.

Long-lasting repercussions

"Because it takes decades or even centuries for CO2 to drop down to the bottom of the ocean, almost all the CO2 created through human activity is still at the surface. But in the future, it will invade the deep-ocean, spread above the ocean floor and cause even more calcite particles at the seafloor to dissolve," says lead author Olivier Sulpis who is working on his PhD in McGill's Dept. of Earth and Planetary Sciences. "The rate at which CO2 is currently being emitted into the atmosphere is exceptionally high in Earth's history, faster than at any period since at least the extinction of the dinosaurs. And at a much faster rate than the natural mechanisms in the ocean can deal with, so it raises worries about the levels of ocean acidification in future."

In future work, the researchers plan to look at how this deep ocean bed dissolution is likely to evolve over the coming centuries, under various potential future CO2 emission scenarios. They believe that it is critical for scientists and policy makers to develop accurate estimates of how marine ecosystems will be affected, over the long-term, by acidification caused by humans.

How the work was done

Because it is difficult and expensive to obtain measurements in the deep-sea, the researchers created a set of seafloor-like microenvironments in the laboratory, reproducing abyssal bottom currents, seawater temperature and chemistry as well as sediment compositions. These experiments helped them to understand what controls the dissolution of calcite in marine sediments and allowed them to quantify precisely its dissolution rate as a function of various environmental variables. By comparing pre-industrial and modern seafloor dissolution rates, they were able to extract the anthropogenic fraction of the total dissolution rates.
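The comparison of pre-industrial and modern rates amounts to a simple bookkeeping identity (a sketch of the idea, not the authors' exact formulation):

```latex
f_{\text{anthropogenic}}
\;=\;
\frac{r_{\text{modern}} - r_{\text{pre-industrial}}}{r_{\text{modern}}}
```

where $r$ denotes the calcite dissolution rate measured under the corresponding seawater chemistry.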

The speed estimates for ocean-bottom currents came from a high-resolution ocean model developed by University of Michigan physical oceanographer Brian Arbic and a former postdoctoral fellow in his laboratory, David Trossman, who is now a research associate at the University of Texas-Austin.

"When David and I developed these simulations, applications to the dissolution of geological material at the bottom of the oceans were far from our minds. It just goes to show you that scientific research can sometimes take unexpected detours and pay unexpected dividends," said Arbic, an associate professor in the University of Michigan Department of Earth and Environmental Sciences.

Trossman adds: "Just as climate change isn't just about polar bears, ocean acidification isn't just about coral reefs. Our study shows that the effects of human activities have become evident all the way down to the seafloor in many regions, and the resulting increased acidification in these regions may impact our ability to understand Earth's climate history."

"This study shows that human activities are dissolving the geological record at the bottom of the ocean," says Arbic. "This is important because the geological record provides evidence for natural and anthropogenic changes."

Waste plastic as a fuel? Neste, ReNew and Licella launch collaboration

Michael Holder

Thursday, August 30, 2018 - 3:30am

Square bales of wrapped plastic bottles ready for the melting process.

A group of chemicals, recycling and technology companies has announced a joint effort to explore using mixed plastic waste as a raw material for fuels, chemicals and new plastics.

The collaboration, announced Aug. 16, will see oil refining and chemicals specialist Neste team up with U.K. recycling firm ReNew ELP and Australian technology developer Licella to develop new industrial and commercial uses for mixed plastic waste.

In addition to studying the feasibility and sustainability of using liquefied waste plastic as a refinery raw material, the three companies also seek regulatory acceptance for new chemical recycling processes.

The collaboration forms part of Neste's ambition to introduce liquefied waste plastic as a future alternative raw material to fossil fuel refining. The renewable diesel manufacturer is targeting the use of more than 1 million tonnes of waste plastic a year as raw material by 2030.


Matti Lehmus, executive VP of Neste's oil products business, said he believed the collaboration could accelerate commercialization of waste plastic-based products. "Neste has a strong legacy in refining, as well as raw material and pretreatment research, but we still need development of technologies, value chains and supporting legislation for plastic waste-based products to become a reality at industrial scale," he said.

Separately, Neste is also working with retailer IKEA to develop durable and recyclable plastics made from bio-based materials such as waste fats and oil, with a view to launching commercial scale production for the first time later this year.

U.K. recycler ReNew ELP, meanwhile, is constructing a new chemical recycling plant in Teesside designed to recycle end-of-life plastic into a raw material for a range of petrochemical products.

The plant is being developed by Licella, and although it is not included in the wider collaboration project with Neste, it will "nevertheless contribute to a common goal of enabling more efficient waste plastic utilisation in the future," the firms explained.

Len Humphreys, CEO of Licella Holdings, said he was "excited" by both the new ReNew facility and the wider collaboration involving Neste as important means of meeting the "significant challenge of end-of-life plastic."

"The collaboration with Neste and ReNew ELP will help to create markets for recycled carbon fuels and chemicals at a critical time as Europe pushes towards a circular economy," he said.

Swedish school installs saltwater battery, highlights environmental benefits

Saltwater batteries contain no lead or lithium and can be fully recycled, earning a "cradle to cradle" certificate. For some, this outweighs their disadvantages: saltwater batteries are considerably bigger and heavier, and have a lower discharging current, than lithium-ion batteries.

Hansjörg Weisskopf (CEO, BlueSky Energy), Maria Gardfjell (head commissioner of Uppsala), Marie Utter (political secretary, Uppsala Municipality).

Austria-based saltwater battery storage company BlueSky Energy has announced a new project in Sweden. The company will install a 24 kWh saltwater battery alongside a 100 kW solar PV system at a school building in Uppsala.

Uppsala’s Foundation for Collaboration between universities, businesses and the public sector advised the local municipality to opt for a saltwater storage system, citing environmental and safety benefits.

“Uppsala Municipality has become a test bed for new technologies thus creating new knowledge and expertise which can be spread. In Uppsala, there is a unique collaboration between the public sector, private sector, and universities which allows these kinds of exciting new energy solutions to take root,” said Marlene Burwick (Social Democratic Party), Municipal Chairman for the Board of Directors and Commissioner of STUNS Board.

As the school building was completely rebuilt, the municipality sought to install a PV plus storage system to increase self-consumption and improve the school’s carbon footprint.

The current system will cover about 10-12% of the school's electricity consumption. A spokesperson for BlueSky told pv magazine that the system was sized for an initial trial of the concept. Additional roof space is available, and the current set-up can be scaled up to increase self-consumption.

Moreover, the system aims to increase the share of renewable energy in the public building’s energy mix; feeding into the grid is not planned. The school’s 800 pupils will also be taught about climate change, the environment and renewable energy, using the PV plus storage system as an educational resource.

Saltwater battery storage systems can be an alternative to the more widely adopted lithium-ion storage systems; BlueSky cites environmental and safety benefits as its product's main advantages.

Lithium mining can have adverse effects on the adjacent environment, and so far, there are only limited options to fully recycle lithium from existing batteries. The material's structure, which enables it to store energy, degrades over time, and recycling processes that restore this structure, rather than destroying it, are still in the research phase.

Because of saltwater batteries' lower performance in weight and power density, their application is limited to stationary storage systems; they cannot be used in EVs, for example. Nonetheless, they have another advantage over lithium-ion based systems: they cannot explode in a fire and are thus less hazardous to firefighters and occupants in such an event.

BlueSky highlights that it uses Aquion’s saltwater batteries, which reportedly are the only battery storage systems to receive a “cradle to cradle” certificate. For many, this trumps other potential advantages lithium-ion storage systems have.

“It is thanks to the Municipality’s high-set climate goals and ambitions to be completely nontoxic in its school environments that we are now testing this battery, whose main content is saltwater, at Tiunda school. Uppsala welcomes the opportunity to share its experiences regarding smart climate and environmental solutions with other municipalities in Sweden and internationally,” said Maria Gardfjell (Environmental Party), Commissioner responsible for Environmental and Climate Issues.


September 12, 2018 - Laura B., Illinois

Presented by Nancy Holm, Assistant Director, and Jennifer Martin, Environmental Program Development Specialist, both from the Illinois Sustainable Technology Center, a division of the Prairie Research Institute at the University of Illinois at Urbana-Champaign. Presentation slides are available here.

The new Illinois Future Energy Jobs Act, which went into effect on June 1, 2017, requires an updated Renewable Portfolio Standard (RPS) in Illinois. This new RPS directs that solar energy use be expanded across the state, increasing Illinois’ solar capacity from about 75 MW to about 2,700 MW by 2030. With these new requirements for increased solar installations, there will be a critical need to examine the disposal and recycling or repurposing of used and broken solar panels to ensure protection of the environment and also to explore the economic benefits of recycling/repurposing solar panels in the state.

As the solar energy industry continues to grow in the state as well as nationally and globally, there is a looming waste management issue at the end-of-life of these panels. The average lifespan of a panel is 30 years. Given this, the International Renewable Energy Agency estimated that there will be a surge in solar panel disposal in the early 2030s, and by 2050, there will be 60 to 78 million cumulative tons of solar panel waste globally. Some countries, particularly in Europe, have established recycling networks. However, in the U.S. we are just beginning to develop a network of recyclers and, as yet, there is not a strong recycling infrastructure in place in Illinois. These used or broken solar panels should be properly recycled to prevent toxic compounds from leaching into the environment, as well as to avoid disposal of valuable and finite resources into landfills.

The Illinois Sustainable Technology Center is working with the Illinois Environmental Protection Agency, Solar Energy Industry Association, Illinois Solar Energy Association, and other entities to create awareness around these issues, as well as establish the necessary policy and standards required to build a network of solar panel recyclers in Illinois. Details on solar panel waste and recycling will be discussed in this presentation.


National Teachers Group Confronts Climate Denial: Keep the Politics Out of Science Class

The association urged science teachers at all levels to emphasize that 'no scientific controversy exists regarding the basic facts of climate change.'

Phil McKenna


SEP 13, 2018

"If there are controversies or other issues to deal with, we want them to have a good solid foundation in evidence-based knowledge to carry out that conversation," said David Evans, executive director of the National Science Teachers Association. Credit: Peter Macdiarmid/Getty Images

In response to what it sees as increasing efforts to undermine the teaching of climate science, the nation's largest science teachers association took the unusual step Thursday of issuing a formal position statement in support of climate science education.

In its position statement, the National Science Teachers Association (NSTA) calls on science teachers from kindergarten through high school to emphasize to students that "no scientific controversy exists regarding the basic facts of climate change."

"Given the solid scientific foundation on which climate change science rests, any controversies regarding climate change and human-caused contributions to climate change that are based on social, economic, or political arguments—rather than scientific arguments—should not be part of a science curriculum," it says.

It also urges teachers to "reject pressures to eliminate or de-emphasize" climate science in their curriculum. And it urges school administrators to provide science teachers with professional development opportunities to strengthen their understanding of climate science.

"Now, more than ever, we really feel that educators need the support of a national organization, of their educational colleagues and their scientist colleagues, because they have encountered a lot of resistance," David Evans, the executive director of NSTA, said.

"In climate science, as in other areas, we really emphasize the importance that students learn the science in science class, and if there are controversies or other issues to deal with, we want them to have a good solid foundation in evidence-based knowledge to carry out that conversation," he said.

Judy Braus, executive director of the North American Association for Environmental Education, said her organization fully supports the NSTA position statement. "We feel that it's important to address the misinformation that's out there about climate" change, she said.

Only Evolution Draws This Kind of Response

NSTA has issued position statements in the past on topics such as safety, gender equity and the responsible use of animals in the classroom, but this is only the second focused on the teaching of subject matter that is controversial not for scientific reasons but for societal or political ones.

"Over the last five years, the two issues that have had the most controversy with them have been evolution on a continuing basis and climate change, and there has been more controversy around climate change," Evans said.

Teachers and school boards have been under pressure from organizations that oppose climate policies, including some that have promoted misinformation and argued for climate change to be removed from state science curricula. Last year, the Heartland Institute, a conservative advocacy organization with close ties to the fossil fuel industry, mailed approximately 300,000 copies of its publication "Why Scientists Disagree About Global Warming" to middle, high school and college science teachers around the country.

Evans said Thursday's position statement was not a direct response to the Heartland mailings but was precipitated by attacks on climate science curriculum that have been building since the National Research Council recommended climate science be included in K–12 science education in 2012.

Pressure to Change State Science Standards

Battles have erupted in recent years in states including Texas, Louisiana and Idaho, over the role climate science should play in new state science standards.

Glenn Branch, deputy director of the National Center for Science Education, a nonprofit that defends the integrity of science education against ideological interference, said the position statement comes at a key time: Arizona is now devising new science standards and arguing over climate change. The draft standards have not yet been approved by the state Board of Education, but he said "the latest revision deletes a whole slew of high school level standards on climate change."

Branch, who was not involved in developing NSTA's position statement, said the document should help classroom teachers who may feel political or societal pressure to eliminate climate science instruction.

"A teacher who is being pressured by a parent or an administrator can say 'look, I'm a professional, I'm trained for this, both before I became a teacher and through continuing education, I have responsibilities to my profession, and my professional organization, the NSTA says this is what I should be doing,'" Branch said. "I think that will be empowering for many teachers."

Nebraska researchers to lead largest drone-based study of storms



The University of Nebraska-Lincoln will lead the most ambitious drone-based investigation of severe storms and tornadoes ever conducted.

More than 50 scientists and students from four universities will participate in the TORUS (Targeted Observation by Radars and UAS of Supercells) study, deploying a broad suite of cutting-edge instrumentation into the US Great Plains during the 2019 and 2020 storm seasons. The project will exceed $2.5 million, with the National Science Foundation awarding a three-year, $2.4 million grant and the National Oceanic and Atmospheric Administration providing additional financial support.

It is the largest-ever study of its kind, based upon the geographical area covered and the number of drones and other assets deployed, according to Nebraska's Adam Houston.

The project will involve four unmanned aircraft systems or drones, a NOAA P3 manned aircraft, eight mesonet trucks equipped with meteorological instruments, three mobile radar systems, a mobile LIDAR system, and three balloon-borne sensor launchers. The University of Colorado Boulder, Texas Tech University and the University of Oklahoma, along with the National Severe Storms Laboratory, also are participating.

"We are flying more aircraft at the same time," said Houston, associate professor of Earth and atmospheric sciences and one of seven principal investigators. "We've only flown one drone in the past, now we're going to fly four. We can fly in more parts of the storm at the same time, get more data and answer a more extensive set of questions."

The TORUS team is authorized to operate over approximately 367,000 square miles -- virtually all of the Central Plains from North Dakota to Texas, Iowa to Wyoming and Colorado. The aim is to collect high resolution data from within the storm, to improve forecasting for tornadoes and severe storms, Houston said.

The research goal is to use the collected data to improve the conceptual model of supercell thunderstorms, the parent storms of the most destructive tornadoes, by exposing how small-scale structures within these storms might lead to tornado formation. These structures are hypothesized to be nearly invisible to all but the most precise research-grade instruments. By revealing the hidden composition of severe storms and associating it with known characteristics of the regularly observed larger-scale environment, the TORUS project could improve supercell and tornado forecasts.

"There are fundamental problems with tornado warnings," Houston said. "The false alarm rate is high -- 75 percent of the time we don't get a tornado. Yet if you reduce the false alarm rate, you also reduce the rate of detection. We need to improve that gap to save lives. We can do that if we can improve our understanding of small-scale structures and small-scale processes that lead up to storms.

"We want to know how the storm influences the environment, and vice versa, in the seconds, minutes and hours leading up to the storm and afterwards."

The Central Plains, aka "Tornado Alley," serves as a great laboratory to better understand severe storms, he said.

"Every place in the United States is vulnerable to super cell thunderstorms. What we learn in this laboratory called the Central Plains is applicable to everywhere -- it's geographically agnostic."

Research preparations began Sept. 1. The TORUS team will begin fieldwork May 13, deploying wherever they see a promising weather pattern developing. Their fieldwork will continue until June 16, with a goal of staying out as long as possible.

It is dangerous work, Houston said.

"There's no getting around it. We're putting ourselves in the path of these serious storms."

Hurricane raises questions about rebuilding along North Carolina's coast

Anna Mehler Paperny

RODANTHE, N.C. (Reuters) - When Florence was raging last Friday on North Carolina’s Outer Banks, the hurricane tore a 40-foot (12-meter) chunk from a fishing pier that juts into the ocean at the state’s most popular tourist destination.

The privately owned Rodanthe pier has already undergone half a million dollars' worth of renovations over the past seven years, and the owners started a new round of repairs this week.

“The maintenance and upkeep on a wooden fishing pier is tremendous,” said co-owner Terry Plumblee. “We get the brunt of the rough water here.”

Scientists have warned that such rebuilding efforts are futile as sea levels rise and storms chew away the coastline, but protests from developers and the tourism industry have led North Carolina to pass laws that disregard the predictions.

The Outer Banks, a string of narrow barrier islands where Rodanthe is situated, may have been spared the worst of Florence, which flooded roads, smashed homes and killed at least 36 people across the eastern seaboard.

Still, the storm showed North Carolinians on this long spindly finger of land that ignoring the forces of nature to cling to their homes and the coast’s $2.4 billion economy may not be sustainable.

Some have called for halting oceanside development altogether.


“We need to actually begin an organized retreat from the rising seas,” said Duke University geologist Orrin Pilkey.

In a government study published in 2010, scientists warned that sea levels could rise 39 inches by 2100.

Higher sea level will cause more flooding and render some communities uninhabitable, as well as affect the ocean vegetation, jeopardize the dune systems that help stabilize the barrier islands, and cause more intense erosion when storms like Florence make landfall, scientists said.

Developers said the study was too theoretical to dictate policy.

Some argue policymakers do not need a 90-year projection to know something needs to change.

“When we have a hurricane, that shows everybody where their vulnerabilities are today, forget 100 years from now, but right now,” said Rob Young, a geologist at Western Carolina University who co-authored the study by the state’s Coastal Resources Commission (CRC).

Young said he would like to see development move back from the ocean’s edge and laments that homeowners and developers rebuild almost any structure damaged or destroyed by a bad storm.

But the idea of retreating is a tough sell for the people who live there and have invested in property.

“You’re asking us to say, ‘Hey, 4,000 or 5,000 people on little Hatteras Island, it’s time for you to pack up and move,’ and that’s not a reasonable expectation,” said Bobby Outten, manager for Dare County on the Outer Banks.

Opponents of using the CRC study to set policy said that most Outer Banks homeowners recognize the risks.

“If you’re buying on the coast, anyone that buys in an area surrounded by water, you’re always taking a risk that you’re going to have storm damage,” said Willo Kelly, who has worked in real estate for more than a decade.

Even though she acknowledges that sea levels are rising, Kelly is also among those who opposed making state policy decisions, including anything affecting home insurance or property values, based on the study’s dire 90-year forecast of sea-level rise.

Kelly supported a 2012 state law that banned North Carolina from using the 90-year prediction on rising sea levels to influence coastal development policy.

The CRC released a second report in 2015 predicting sea-level rise over a 30-year period, instead of 90 years. The new report, which said sea levels would rise 1.9 to 10.6 inches, was praised by developers as more realistic.

The 2012 law was welcomed by the development community and panned by scientists whose warnings, they felt, were going unheeded. Members of the legislature who sponsored the bill did not return requests for comment.

After this year’s storm demolished the sandy protective berms that stand between the water and the main coastal road, the state sent backhoes to rebuild them and officials to assess damage to bridges and roads.

“There’s also a sense of denial,” said Gavin Smith, director of the University of North Carolina’s Coastal Resilience Center, adding that with rising seas and more intense storms it will be more costly and more difficult to replace infrastructure.

Rodanthe Pier was lucky this time, sustaining only moderate damage, said Clive Thompson, 58, who works at the pier. In the past, nor’easters have ripped its end from the ground and torn pilings from the sand.

The beach was not so lucky. The ocean ate away about 50 meters of what used to be dry sand above the high-tide line, he said.

“It’s a waste of man hours, time and money, having to do this over and over,” he said. “One day I hope people understand the power of water. ... It don’t play.”

California First State To Bar Restaurants From Handing Out Straws

"A new law bars California restaurants from automatically giving out single-use plastic straws with drinks, starting next year.

The bill, which only applies to full-service restaurants, was signed by California Gov. Jerry Brown on Thursday. Customers who want a plastic straw will still be able to request one from the restaurant under the law.

“Plastic has helped advance innovation in our society, but our infatuation with single-use convenience has led to disastrous consequences,” Brown wrote in a signing message, according to the Los Angeles Times.

Oklahoma: "Insurer Says Oil Companies Caused Quakes, Wants Money Back"

"An insurance company is suing oil companies in Oklahoma, saying their disposal operations caused the state's largest earthquake.

Steadfast Insurance Co. paid about $325,000 to the Pawnee Nation for damage it sustained in a magnitude 5.8 earthquake in September 2016. It's suing to recover the money.

Steadfast, a subsidiary of Zurich Holding Company of America Inc., filed suit in U.S. District Court in Tulsa last month against EnerVest Operating LLC and six other oil companies that operate in the area where the quake was centered. The tribe has also sued some of the same companies."

The world’s first hydrogen-powered train makes (almost) complete sense

By Akshat Rathi, September 20, 2018

In a world worried about climate change, hydrogen is a magical fuel. It has all the features of natural gas: it’s a fuel that’s easy to access, transport, and burn. And unlike natural gas, when you burn hydrogen, you only get water—no carbon dioxide. It’s the reason why, as the world continues to fail to hit climate goals, it turns back to hydrogen time and again.

The latest example is the world’s first noise-free, zero-emissions, hydrogen-powered train running about 100 km (about 60 miles) between Cuxhaven and Buxtehude in Germany. Built by the French multinational rail company Alstom with funds from the German government, the train can run at a maximum speed of 140 km per hour (about 90 miles per hour) for about 1,000 km on a full tank of hydrogen.

Currently, most train routes in the world run on diesel, which doesn’t just produce carbon dioxide but a lot of particulate pollution too. You might ask: instead of burning diesel, why not electrify these routes and power them with wind or solar? After all, we already have some trains that run efficiently on electricity.

And it does indeed make sense to do that in many cases. But there are edge cases, such as that of Cuxhaven and Buxtehude, a route that runs from the North Sea to just outside Hamburg, where building the necessary electric infrastructure is prohibitively expensive. That may be because the location is too far from an electric grid or because the line will run too infrequently to recoup the investment needed. Alstom estimates that electrifying a railway line costs around €1.2 million ($1.4 million) per km.

One other option to run zero-emission trains would be to power them on batteries instead of using electricity from the grid. Here, however, physics becomes a problem. Batteries are bulky. They may be efficient at transporting four or six people in a car, but moving hundreds in a train is much harder. The heavier the train becomes, the more batteries it needs to move; the more batteries we add, the heavier the train becomes.

That may change if batteries are able to pack more energy for the same weight. Until then, the next-best option is hydrogen-powered fuel cells.

There’s just one problem: hydrogen doesn’t have the ready supply chain of diesel. Over the past century, we’ve built out a large infrastructure to produce and transport diesel, and we’ve designed ever-better diesel engines. To use hydrogen at the same or lower cost, we will need to build a similar network.

Currently, the cheapest way to produce hydrogen is from natural gas. In large reactors, natural gas is exposed to high-temperature steam to produce hydrogen and carbon dioxide. Using this hydrogen to run the train defeats the purpose somewhat. The new train itself may be zero emissions at the point of use because it only produces water and no particulate pollution. But the hydrogen supply chain doesn’t entirely eliminate greenhouse gases.
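The process described above is standard steam-methane reforming; as a textbook sketch (not specific to any supplier named in the article), the chemistry is:

```latex
% Steam-methane reforming: natural gas (methane) plus high-temperature steam
\mathrm{CH_4 + H_2O \;\rightarrow\; CO + 3\,H_2}
% The water-gas shift step yields additional hydrogen and the CO2 mentioned above
\mathrm{CO + H_2O \;\rightarrow\; CO_2 + H_2}
```

The second reaction is where the carbon dioxide in the supply chain comes from, which is why hydrogen made this way is not emissions-free overall.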

The hope is that, as the use of hydrogen becomes widespread, the technology to produce hydrogen from water (essentially reversing the reaction occurring inside the train) by using renewable electricity coming from wind, solar, or nuclear power will become cheaper. Alstom has an order to build 14 more such trains across Germany. Other countries, including France, Denmark, Norway, Italy, Canada, the Netherlands, and the UK are all looking into hydrogen trains.

Just as the steam engine kick-started the age of coal, the German train could kick-start the age of hydrogen.

North Carolina Flooding Soaks Federal Taxpayers All Over Again

By Christopher Flavelle

September 21, 2018, 4:00 AM EDT

One home in Nags Head received insurance payouts 28 times

Billions for rebuilding homes, but little to help owners move

When Hurricane Florence reached Belhaven, North Carolina, last weekend, the flooding that followed was unlike anything Ricky Credle, the mayor of the tiny coastal town, had ever experienced.

“I’ve been here 45 years,” Credle said in a phone interview. “I’ve never seen this much water.”

For federal taxpayers, however, the flooding in Belhaven was all too familiar.

Since 1978, 120 homes in the town of 1,600 people have cost federal taxpayers $13.4 million in National Flood Insurance Program payouts, according to data the Natural Resources Defense Council obtained from the Federal Emergency Management Agency; 36 received more money than their total value. Five of those homes have gotten federal flood insurance payments ten times or more, including one that received money 15 times.

The result: homes that flood keep getting rebuilt with public money, only to flood again. That leaves people living in vulnerable areas who might rather receive money to move away -- and the flood insurance program, already more than $20 billion in debt, going deeper underwater.

“We spend all this money to rebuild these homes, and we spend very little money helping people get out of these homes -- even when that’s what they want,” said Rob Moore, a senior policy analyst at the Natural Resources Defense Council. “Efforts to help people move somewhere safer are seen as a last option, instead of a first option.”

Encouraging people to keep rebuilding in vulnerable places reflects the design of the flood insurance program. Not only does it subsidize people’s premiums, it imposes no limit on the number of times a homeowner can make a claim. At the same time, federal programs designed to pay people to move out of flood-damaged homes often take years to result in offers, by which time many people have repaired their homes and moved back in.

There are more than 1,100 so-called severe repetitive loss properties across North Carolina, the NRDC data shows. The federal government has paid out $163.9 million in flood insurance for those homes, which is almost 60 percent of their combined total value. More than 400 have gotten more in federal insurance claims than the home is worth.

One home, in Nags Head, received flood insurance payments 28 times before it was torn down.

Severe repetitive loss properties account for just two percent of all flood insurance policies, but as much as one-third of all claims, according to R.J. Lehmann, director of insurance policy for the R Street Institute, a Washington think tank.

The result is that the program relies ever more heavily on taxpayers. Last year, the Congressional Budget Office calculated that annual costs exceed premiums by about one-third -- and that was before Hurricanes Harvey, Irma and Maria. In October, Congress forgave $16 billion in debt that the program owed the U.S. Treasury.

“The NFIP was not designed to retain funding to cover claims for truly extreme events,” the Congressional Research Service wrote in a report released this year.

Belhaven demonstrates that trend as well as any place. For all the money the federal government has spent in the aftermath of previous flooding there, it has spent very little on reducing the threat from the next flood. Just one-third of the town’s severe repetitive loss properties have gotten money for what the government calls “mitigation” -- a category that includes flood-proofing or raising the home on stilts.

The most effective type of mitigation is tearing down the house, and paying its owner to move elsewhere. But that step is also the most rare. The NRDC’s data show that of the 120 homes, none have been torn down through a buyout program.

Credle, the Belhaven mayor, said moving some residents to safer locations should be considered as part of the response to Florence. He said it didn’t matter if that meant his town shrinks, along with its property tax revenue.

“You always want the tax base to be good,” Credle said. “But you also want people to be safe.”

Washington, a North Carolina town at the mouth of the Pamlico River that was also flooded by Florence, has 114 severe repetitive loss properties. Together, those 114 homes have received $12.6 million in federal insurance payments since 1978. Yet at most 28 of those homes have received federal money to reduce their flood exposure, and only one has been torn down. One of those homes, with a total value of $102,000, has gotten federal flood insurance payments 11 times.

In New Bern, a town 45 minutes south of Washington, 29 severe repetitive loss homes have received $2.7 million in federal flood insurance payments. The federal government has acted to protect just six against flood damage. None of those homes have been bought out and demolished.

As climate change gets worse, federal policies have effectively reduced the cost of living in flood-prone areas, increasing the number of homes that continue to flood at taxpayer expense.

By subsidizing flood insurance and allowing an unlimited number of claims, the insurance program serves to increase other types of federal disaster costs, Lehmann said. “If people weren’t living there, then there would be less disaster assistance necessary,” he said.

FEMA didn’t immediately respond to a request for comment.

Congressional Action

Congress worried about the perverse incentives of the flood insurance program long before Florence. In 2012, Congress passed legislation phasing out the subsidies offered to homeowners, increasing premiums in places like coastal North Carolina. Homeowners objected, and Congress rolled back those changes in 2014.

Then, last year, the House passed a bill that would limit the amount of money a home could get from the flood insurance program to three times the value of that home.

The bill didn’t clear the Senate.

“People that live in this cycle of flooding and rebuilding may never even be offered assistance to move somewhere safer,” the NRDC’s Moore said. “If they have flood insurance, they’re always offered the option to rebuild in the same location.”

Florence: Only about 10 percent on hard-hit Carolina coast have flood insurance

Updated Sep 17, 5:51 PM; Posted Sep 17, 5:37 PM

By The Washington Post

As Americans in North and South Carolina make it out of the Florence floodwaters, they face another daunting task: figuring out whether they can afford to rebuild.

Few have flood insurance in the areas with the worst destruction. Home insurance does not typically cover flooding, a fact many realize the hard way. People have to purchase a separate flood insurance policy at least a month in advance of a major storm to be eligible for reimbursement.

Only about one in 10 homes have flood insurance in the counties hit by Florence, according to a Washington Post analysis comparing the number of policies in National Flood Insurance Program data with the number of housing units in counties hit by the storm. Milliman, an actuarial firm, found similar results.

In Craven County, North Carolina, which is home to New Bern, a city that has dominated headlines for severe flooding and hundreds of rescues, 9.9 percent have flood insurance. In Wilmington, which is located in New Hanover County and has also seen devastating flooding, slightly more -- 14.2 percent -- have flood insurance, according to The Washington Post analysis. Florence has caused historic flooding around New Bern and Wilmington, two parts of the southeast coast with some of the lowest take-up rates for flood insurance.

The federal government requires flood insurance in parts of the country that are designated as "special flood hazard areas," but many homeowners and renters still do not get it because it is too expensive or they do not believe they are truly at risk. The government does little to enforce the requirement, leaving it mainly up to banks to mandate flood insurance as a condition of getting a home mortgage.

Some wrongly believe they do not need insurance because they can rely on Federal Emergency Management Agency (FEMA) grants, but those cover only up to $33,000 in damages. The typical payout is a few thousand dollars. Flood insurance, in contrast, covers up to $250,000 for the home and another $100,000 for possessions.

Shawn Spencer in Wilmington, North Carolina, spent much of Saturday raking debris out of the ditches in front of his home, hoping that would keep his property from flooding. Like many, he was not sure whether he had flood insurance for his home and the bike shop he runs as a small business.

"I'm not 100 percent certain if I've got it," said Spencer, 47, who has lived in Wilmington his entire life and cannot remember flooding this bad. "The government is forever moving the flood plain."

A key issue with Hurricane Florence and Hurricane Harvey last year is that some of the worst flooding occurred inland, sometimes in areas that were not FEMA-designated flood hazard zones. In some coastal communities, such as the Outer Banks, flood insurance take-up rates are upward of 60 percent, but the rate of people with flood insurance drops off sharply just a few miles in from the coastline.

"When people hear they aren't in a FEMA flood zone, they often think there's no risk their area will flood," said John Rollins, an actuary at Milliman who specializes in insurance for catastrophe events.

As catastrophic flooding continues to occur beyond the coastline, FEMA is trying to warn Americans that anywhere it rains can flood.

"Just one inch of floodwater in a home can cause $25,000 worth of damage," David Maurstad, FEMA's deputy associate administrator for insurance and mitigation, said on CNBC on Monday. "People think their homeowner policy may cover them from flooding, and it doesn't."

Low-income families are particularly at risk. They are even less likely to have flood insurance and lack a financial cushion to pay for hotel rooms to ride out the storm, let alone funds to rebuild. Many low-income families will turn to FEMA for aid money, but the average FEMA grant last year in the wake of Hurricane Harvey was only $4,300. That is not enough to replace a trailer home, and it is far below the average flood insurance claim of $115,000.

The federal government's other main program to help people who have severely damaged homes from flooding is a Small Business Administration loan, but low-income people often fail to qualify for the low-interest SBA loans because they do not have good credit, says Carolyn Kousky, director of policy research at the Wharton Risk Center.

"FEMA grants are a few thousand dollars. They won't get your home back to pre-disaster conditions," Kousky said. "When people see headlines about Congress passing $15 billion in hurricane aid, they think it's going to households, but most of it is going to infrastructure and local governments."

Recycling, Once Embraced by Businesses and Environmentalists, Now Under Siege

The U.S. recycling industry is breaking down as prices for scrap paper and plastic have collapsed.

Workers at Cal-Waste Recovery Systems pre-sort raw recycling. The company has been struggling to sell its mixed-paper recycling to its usual customer, China.

The U.S. recycling industry is breaking down.

Prices for scrap paper and plastic have collapsed, leading local officials across the country to charge residents more to collect recyclables and send some to landfills. Used newspapers, cardboard boxes and plastic bottles are piling up at plants that can’t make a profit processing them for export or domestic markets.

“Recycling as we know it isn’t working,” said James Warner, chief executive of the Solid Waste Management Authority in Lancaster County, Pa. “There’s always been ups and downs in the market, but this is the biggest disruption that I can recall.”

U.S. recycling programs took off in the 1990s as calls to bury less trash in landfills coincided with China’s demand for materials such as corrugated cardboard to feed its economic boom. Shipping lines eagerly filled containers that had brought manufactured goods to the U.S. with paper, scrap metal and plastic bottles for the return trip to China.

As cities aggressively expanded recycling programs to keep more discarded household items out of landfills, the purity of U.S. scrap deteriorated as more trash infiltrated the recyclables. Discarded food, liquid-soaked paper and other contaminants recently accounted for as much as 20% of the material shipped to China, according to Waste Management Inc.’s estimates, double from five years ago.

The tedious and sometimes dangerous work of separating out that detritus at processing plants in China prompted officials there to slash the contaminants limit this year to 0.5%. China early this month suspended all imports of U.S. recycled materials until June 4, regardless of the quality. The recycling industry interpreted the move as part of the growing rift between the U.S. and China over trade policies and tariffs.

The changes have effectively cut off exports from the U.S., the world’s largest generator of scrap paper and plastic. Collectors, processors and the municipal governments that hire them are reconsidering what they will accept to recycle and how much homeowners will pay for that service. Many trash haulers and city agencies that paid for curbside collection by selling scrap said they are now losing money on almost every ton they handle.

The upended economics are likely to permanently change the U.S. recycling business, said William Moore, president of Moore & Associates, a recycled-paper consultancy in Atlanta.


“It’s going to take domestic demand to replace what China was buying,” he said. “It’s not going to be a quick turnaround. It’s going to be a long-term issue.”

The waste-management authority in Lancaster County this spring more than doubled the charge per ton that residential trash collectors must pay to deposit recyclables at its transfer station, starting June 1. The higher cost is expected to be passed on to residents through a 3% increase in the fees that haulers charge households for trash collection and disposal.

The additional transfer-station proceeds will help offset a $40-a-ton fee that the authority will start paying this summer to a company to process the county’s recyclables. Before China raised its quality standards at the beginning of this year, that company was paying Lancaster County $4 for every ton of recyclables.

Mr. Warner may limit the recyclable items collected from Lancaster County’s 500,000 residents to those that have retained some value, such as cans and corrugated cardboard. He said mixed plastic isn’t worth processing.

“You might as well put it in the trash from the get-go,” he said.

Environmentalists are hoping landfills are only a stopgap fix for the glut of recyclables while the industry finds new markets and reduces contaminants.

“Stuff is definitely getting thrown away in landfills. Nobody is happy about it,” said Dylan de Thomas, vice president of industry collaboration for the Recycling Partnership in Virginia. “There are very few landfill owners that don’t operate recycling facilities, too. They’d much rather be paid for those materials.”

Pacific Rim Recycling in Benicia, Calif., slowed operations at its plant early this year to meet China’s new standard. But company President Steve Moore said the more intensive sorting process takes too long to process scrap profitably. Pacific Rim idled its processing plant in February and furloughed 40 of its 45 employees.

“The cost is impossible. We can’t make money at it,” Steve Moore said. “We quit accepting stuff.”

China stopped taking shipments of U.S. mixed paper and mixed plastic in January. Steve Moore said mixed-paper shipments to other Asian countries now fetch $5 a ton, down from as much as $150 last year. Other buyers such as Vietnam and India have been flooded with scrap paper and plastic that would have been sold to China in years past.

Dave Vaccarezza, president of Cal-Waste Recovery Systems near Sacramento, Calif., intends to invest more than $6 million in new sorting equipment to produce cleaner bales of recyclables.

“It’s going to cost the rate payer to recycle,” he said. “They’re going to demand we make our best effort to use those cans and bottles they put out.”


Sacramento County, which collects trash and recyclables from 151,000 homes, used to earn $1.2 million a year selling the scrap to Waste Management and another processor. Now, the county is paying what will amount to about $1 million a year, or roughly $35 a ton, to defray the processors’ costs. Waste Management paid the county $250,000 to break the revenue-sharing contract and negotiate those terms.

County waste management director Doug Sloan expects those costs to keep climbing. “We’ve been put on notice that we need to do our part,” he said. The county hasn’t yet raised residential fees.

Some recyclers said residents and municipalities need to give up the “single-stream” approach of lumping used paper and cardboard together with glass, cans and plastic in one collection truck. Single-stream collections took hold in the waste-hauling industry about 20 years ago and continue to be widely used. Collecting paper separately would make curbside recycling service more expensive but cut down on contamination.

“We’re our own worst enemies,” said Michael Barry, president of Mid America Recycling, a processing-plant operator in Des Moines, Iowa, of single-stream recycling. “It’s almost impossible to get the paper away from the containers.”

Even relatively pure loads of paper have become tough to sell, Mr. Barry said, noting the domestic market for paper is saturated as well. He stockpiled paper bales at Mid America’s warehouse, hoping prices would improve. They didn’t. He has trucked 1,000 tons of paper to a landfill in recent weeks.

“We had to purge,” he said. “There’s no demand for it.”

World's first electrified road for charging vehicles opens in Sweden

Stretch of road outside Stockholm transfers energy from two tracks of rail in the road, recharging the batteries of electric cars and trucks

Supported by The Skoll Foundation

Daniel Boffey in Brussels

Thu 12 Apr 2018 11.55 EDT First published on Thu 12 Apr 2018 08.40 EDT

The world’s first electrified road that recharges the batteries of cars and trucks driving on it has been opened in Sweden.

About 2km (1.2 miles) of electric rail has been embedded in a public road near Stockholm, but the government’s roads agency has already drafted a national map for future expansion.

Sweden’s target of achieving independence from fossil fuels by 2030 requires a 70% reduction in transport-sector emissions.

The technology behind the electrification of the road linking Stockholm Arlanda airport to a logistics site outside the capital city aims to solve the thorny problems of keeping electric vehicles charged, and the manufacture of their batteries affordable.

Energy is transferred from two tracks of rail in the road via a movable arm attached to the bottom of a vehicle. The design is not dissimilar to that of a Scalextric track, although should the vehicle overtake, the arm is automatically disconnected.

The electrified road is divided into 50m sections, with an individual section powered only when a vehicle is above it.

When a vehicle stops, the current is disconnected. The system is able to calculate the vehicle’s energy consumption, which enables electricity costs to be debited per vehicle and user.
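The billing scheme described above, metering each vehicle's draw over powered 50 m sections and debiting costs per user, can be sketched roughly as follows. This is a hypothetical illustration, not eRoadArlanda's actual software; all names and the tariff are assumptions.

```python
# Hypothetical sketch of per-vehicle billing on an electrified road.
# A section is only live while a vehicle's arm is in contact, so the
# metered draws below are per-section energy readings for one vehicle.

SECTION_LENGTH_M = 50  # the road is divided into 50 m sections


def bill_vehicle(energy_draws_kwh, price_per_kwh):
    """Return the cost debited to one vehicle.

    energy_draws_kwh: kWh metered in each section the vehicle passed over
                      (0.0 where the arm was disconnected, e.g. overtaking).
    price_per_kwh:    assumed tariff; the article gives no real figure.
    """
    total_kwh = sum(energy_draws_kwh)
    return total_kwh * price_per_kwh


# Example: a truck draws power over four sections at an assumed 0.20/kWh.
cost = bill_vehicle([1.2, 1.1, 0.0, 1.3], price_per_kwh=0.20)
print(round(cost, 2))  # 0.72
```

The point of per-section metering is that disconnection (a stop, or an overtake) simply shows up as a zero draw, so billing follows actual consumption rather than distance travelled.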

The “dynamic charging” – as opposed to the use of roadside charging posts – means the vehicle’s batteries can be smaller, along with their manufacturing costs.

A former diesel-fuelled truck owned by the logistics firm, PostNord, is the first to use the road.

Hans Säll, chief executive of the eRoadArlanda consortium behind the project, said both current vehicles and roadways could be adapted to take advantage of the technology.

In Sweden there are roughly half a million kilometres of roadway, of which 20,000km are highways, Säll said.

“If we electrify 20,000km of highways that will definitely be enough,” he added. “The distance between two highways is never more than 45km and electric cars can already travel that distance without needing to be recharged. Some believe it would be enough to electrify 5,000km.”

Building the eRoadArlanda: the government’s roads agency has already drafted a national map for future expansion. Photograph: Joakim Kröger/eRoadArlanda

At a cost of €1m per kilometre, electrification is said to be 50 times cheaper than constructing an urban tram line.

Säll said: “There is no electricity on the surface. There are two tracks, just like an outlet in the wall. Five or six centimetres down is where the electricity is. But if you flood the road with salt water then we have found that the electricity level at the surface is just one volt. You could walk on it barefoot.”

National grids are increasingly moving away from coal and oil, and battery storage is seen as crucial to changing the source of the energy used in transportation.

The Swedish government, represented by a minister at the formal inauguration of the electrified road on Wednesday, is in talks with Berlin about a future network. In 2016, a 2km stretch of motorway in Sweden was adapted with similar technology but through overhead power lines at lorry level, making it unusable for electric cars.

Lawsuits filed against L.A. County, lenders over green energy program

Andrew Khouri


APR 12, 2018 | 1:45 PM

Homeowner Reginald Nemore says he can't afford his PACE loan for solar panels and is worried he will lose his house. (Wally Skalij / Los Angeles Times)

Attorneys representing homeowners filed lawsuits Thursday against Los Angeles County, alleging a county program that funds solar panels and other energy-efficient home improvements is a "plague" that's ruined the finances of many borrowers by saddling them with loans they cannot afford.

The twin suits seeking class-action status were filed in Los Angeles County Superior Court against the county, as well as Renovate America and Renew Financial, the county's private lender partners for the Property Assessed Clean Energy program.

The complaints say borrowers are now at risk of losing their homes because the loans are liens on a house, lacked adequate consumer protections, and were marketed and sold by unscrupulous contractors that were not properly monitored.

"Many PACE participants are living hand-to-mouth to hold onto their homes, fearful of what is yet to come," the nearly identical suits say. They note that many of those in trouble have low incomes, are elderly or don't speak English as their first language.

L.A. County representatives did not respond to a request for comment.

In a statement, Renew Financial said it couldn't comment on pending litigation, but said it's worked with government authorities to improve consumer protections, including new state laws that took effect this year.

Renovate America said it similarly supported the enhanced protections and finds "no merit in the allegations in the complaints."

Specifically, the lawsuits allege the county and lenders have committed financial elder abuse, while the lenders charged inflated interest rates and broke a county contract that said they were to provide "best in class" protections against predatory lending and special safeguards for seniors.

While the lenders have said they checked borrowers for previous bankruptcies or missed mortgage payments before approval, until recently they did not ask for borrowers' incomes, basing approvals largely on home equity.

The lawsuits said the county knew, or should have known, its program would harm vulnerable homeowners and has looked the other way as problems piled up.

The suits seek class-action status for borrowers who took out loans between March 2015 and March 2018 and that had excessive debt-to-income ratios or were left with very little residual income after making their loan payments. The lawsuits said the class size is unknown, but argued PACE has been a "disaster for thousands of vulnerable homeowners."

The suits, brought by Irell & Manella, Public Counsel and Bet Tzedek, ask that loans for borrowers in the class be canceled and that homeowners be returned any money paid.

"We can't keep up with the number of complaints about this program," said Jennifer H. Sperling, an attorney with Bet Tzedek. "This is a systemic problem."

The class time frame was chosen because as of April 1, under a new state law, PACE lenders must ask for a borrower's income and make a "good faith determination" of a borrower's ability to repay.

Another new bill mandated that PACE providers have a phone conversation with all homeowners before they take out the loan to ensure they understand the terms. Renew Financial has said it's always had such calls and Renovate America said it started doing so before the law passed.

The bills, which included other reforms as well, were signed by Gov. Jerry Brown last year following repeated complaints over unscrupulous contractors who market and sign people up for the loans on tablet computers and smartphones.

"If they had let me know from Day One this is what [you are] going to get into ... there is no way I would have signed," said Reginald Nemore, a 58-year-old former bus driver who had to retire after a back injury.

He took out a Renovate America loan for solar panels and attic insulation in 2016. Nemore said before a contractor handed him a smartphone to sign, the individual didn't explain to him exactly how much he would be paying. He said he was told he'd qualify for a $7,000 government check for going green, but found out it isn't available to him.

Nemore said he wasn't told he could lose his house if he didn't pay and only found out the true cost when paperwork arrived in the mail after the loan was finalized. He now owes roughly $240 a month for 25 years, even though he said he and his wife, who suffers from multiple sclerosis, sometimes only have $50 or less in their checking account each month.

Nemore's attorney said he and his wife only take in around $2,475 from Social Security disability payments.

"I don't want to be uprooted," Nemore said. "But I don't know if it is going to be up to me to have that choice."
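The arithmetic behind a situation like Nemore's can be sketched in a few lines. This is a minimal illustration using only the round figures reported in this article (a $240 monthly payment over 25 years against roughly $2,475 in monthly disability income); it ignores interest structure, tax-bill mechanics, and any escalation in payments:

```python
# Rough affordability arithmetic for the PACE loan described above.
# Figures are the approximate ones reported in this article;
# the calculation is illustrative only.
monthly_payment = 240    # dollars per month, for 25 years
term_years = 25
monthly_income = 2475    # Social Security disability, per his attorney

total_paid = monthly_payment * 12 * term_years
share_of_income = monthly_payment / monthly_income

print(f"Total paid over the term: ${total_paid:,}")        # $72,000
print(f"Share of monthly income: {share_of_income:.1%}")   # ~9.7%
```

On these assumptions, the loan consumes nearly a tenth of the household's fixed income every month for a quarter century.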

First started in 2008, PACE programs are established by government authorities, which allow the privately financed loans to be repaid as line items on property tax bills. In addition to solar panels and energy-efficient air conditioners, the loans in California can be used for items such as low-flow toilets that save water.

In Southern California, Los Angeles, Riverside, San Bernardino and San Diego counties have approved PACE programs and contracted with private lenders to fund and largely operate the efforts.

If PACE loans go unpaid, a homeowner can be foreclosed upon. However, Renovate America and Renew Financial — which partner with Los Angeles County and issue loans under the Hero and California First programs, respectively — told The Times last year they hadn't foreclosed on anyone for nonpayment of PACE loans.

L.A. County said it had set up reserve funds to cover missed borrower payments for a time, making a quick foreclosure unlikely for those who've missed PACE payments.

Program proponents, including the lenders, have praised them as a tool for saving energy and combating global warming. And they say most customers have come away happy.

A recent study from the Lawrence Berkeley National Laboratory found PACE programs boosted the deployment of residential solar photovoltaic systems in California cities with the program by 7% to 12% between 2010 and 2015.

But consumer groups say contractors who serve as de-facto mortgage brokers too often misrepresent how the financing works, sticking people with loans they can neither understand nor afford. And they note mortgage servicers will often cover the PACE bills for a time and might be the ones foreclosing upon a delinquent homeowner.

In the lawsuits filed Thursday, attorneys allege on behalf of their clients that Renovate America and Renew Financial failed to screen and monitor their network of contractors and encouraged predatory lending and aggressive marketing.

Often, the suit says, contractors served as a homeowner's "primary" source of information about the loans. Lenders told contractors they didn't need to determine if customers could afford the loan and rubber stamped payment to contractors, without regard to whether they followed certain guidelines, the lawsuits allege.

Renovate America has said it's tried to weed out bad contractors, kicking them out when they don't follow the rules. In October, the company announced its chief executive was stepping down and it retained a law firm to "conduct and make public a third-party review of practices and procedures."

In its statement Thursday, the company said a different outside review of its loans for the Western Riverside Council of Governments found it followed the applicable consumer protections that the agency set up more than 99% of the time.

The Climate and the Cross: evangelical Christians debate climate change

The latest Guardian documentary asks whether a group of Christian Americans might offer salvation for the country’s attitude towards climate change

Charlie Phillips

Fri 13 Apr 2018 07.00 EDT Last modified on Fri 13 Apr 2018 10.11 EDT

A new Guardian documentary, The Climate and the Cross, explores a battle among US evangelicals over whether climate change is real, and therefore a divine call to protect the Earth that should be welcomed as the work of God, or whether it does not exist at all.

Evangelicals have traditionally been the bedrock of conservative politics in the US, including on climate change. But a heated debate is taking place across the country, with some Christians protesting in the name of protecting the Earth. One group has built a chapel in the way of a pipeline, and a radical pastor has encouraged his congregation to put themselves in the way of diggers. Meanwhile, a firm supporter of Donald Trump crisscrosses the country promoting solar power.

But traditional resistance remains: a climate scientist who denies that the world is warming, and a preacher in Florida who interprets the flooding he experienced as a positive sign of a divine presence.

The Climate and the Cross

The film-makers, Chloe White and Will Davies, make documentaries, animated content, and educational, promotional and fundraising films. Their previous documentary for the Guardian, Quiet Videos, was about the makers of ASMR videos that are said to give viewers “head orgasms”. It toured international film festivals to much acclaim.


Running out of Lake Erie options, Ohio looks to get tougher on farmers

By Tom Henry | Blade Staff Writer | Published on April 13, 2018 | Updated 6:20 p.m.

Nearly four years after Toledo’s 2014 water crisis, the Maumee River and other western Lake Erie tributaries are still so besieged by algae-forming phosphorus that the Kasich administration now agrees it has no recourse but to impose tough, new regulations on farmers.

Ohio Environmental Protection Agency Director Craig Butler during an interview with The Blade left no doubt what’s being contemplated would be a fundamental shift — one the state’s powerful agricultural industry was hoping to avoid — away from the administration’s longstanding emphasis on voluntary incentives.

Such incentives are designed to entice farmers to be better stewards of their land through the usual suite of buffer strips, cover crops, drainage-control structures, and windbreaks. They certainly will continue to be encouraged, because they do reduce nutrient-enriched agricultural runoff to some degree, he said.

But voluntary incentives are not yielding nearly enough results and are not being embraced widely enough for Ohio to achieve the goal it made with Michigan and Ontario to reduce phosphorus levels 40 percent over 2008 levels by 2025, Mr. Butler said.

“We don't see the trend line moving big enough or fast enough. It's time for us to consider the next step,” Mr. Butler said.

Joe Cornely, Ohio Farm Bureau Federation spokesman, said he’s disappointed by the recent developments.

“This is a massive shift. It comes as a bit of a surprise, given the Kasich administration's track record of trying to be thoughtful and deliberate in terms of regulating any industry, including agriculture,” he said. “Where is the wisdom in simply slapping on one more set of regulations? The fact there is no bill introduced yet probably tells you something.”

The non-binding agreement Ohio made with Michigan and Ontario two years ago also includes an interim goal of 20 percent less phosphorus by 2020.

The Ohio EPA is quietly courting legislators, lobbyists, and public officials to throw their support behind a yet-to-be-introduced bill that Mr. Butler claims would provide him with a powerful enforcement tool.

One key part of it: Expanding the state’s definition of “agricultural pollution” to include farm runoff.

That would direct the state agriculture department to establish rules for streams classified as “watersheds in distress” because of fertilizer that escapes from fields.

It would also make it easier for the Ohio EPA to move in with requirements for strict nutrient-management plans in those areas. Those plans would, among other things, limit the spread of manure and commercial fertilizer. Applications would not be banned but would be much more regulated based on soil tests, Mr. Butler said.

“It is critical to change that definition,” he said.

Another goal of the bill is to put a statewide cap on phosphorus discharges from all sewage-treatment plants across Ohio at 1.0 part per million or less.

Toledo’s permit, which expires July 31, 2019, allows the city to discharge no more than 1.0 ppm. But many other plants, especially smaller ones, don’t have to adhere to that amount.

The goal of reducing phosphorus levels by 40 percent was promoted by area scientists on a task force created a few years ago by the U.S.-Canada International Joint Commission, a 109-year-old government agency that helps the two nations resolve boundary water issues.

The year 2008 was selected as the benchmark because it was seen as one of the more typical years for agricultural runoff over the past decade. Even with a 40 percent reduction, though, scientists believe the region will continue to have some algae, just not as often or as much.

Ohio Rep. Steve Arndt (R., Port Clinton) is being courted as the lead sponsor of the bill Mr. Butler wants.

But Mr. Arndt said he has some concerns about it.

He said the Ohio EPA first approached him about sponsoring legislation a year ago, but that the part about distressed watersheds didn’t come up until last month. He said he’s now working on the fourth version of the bill.

He agrees with Mr. Butler, though, that Ohio is likely on the verge of getting tougher with farmers because it has few options left.

“We are not going to get there [to the 40 percent reduction] with any of the present strategies in place,” Mr. Arndt said.

The Ohio EPA’s 2017 Lake Erie tributary report showed high concentrations of phosphorus and other nutrients; 2017 was the fourth-worst year for algae since 2002.

The report is a compendium of data collected by the U.S. Geological Survey, Heidelberg University, the Ohio Department of Natural Resources, and the Ohio EPA.

By May 1 of last year, the Maumee River had 860 metric tons of phosphorus recorded at Heidelberg University’s Waterville station, a figure the river is not supposed to surpass for an entire year if it is going to achieve a 40 percent reduction. It ended up recording 1,800 metric tons for the year.

Mr. Butler said 860 metric tons should be the Maumee’s “new normal,” but it’s not. The figures have remained consistently high since the early 2000s, except in drier-than-usual springs such as 2012.

“This is all about the amount of phosphorus coming down the Maumee,” he said. “The numbers are pretty telling.”
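The numbers above can be checked with simple arithmetic. One assumption is needed that the article implies but does not state: that the 860-metric-ton annual cap represents exactly a 40 percent cut, i.e. 60 percent of the 2008 baseline load. A back-of-envelope sketch:

```python
# Back-of-envelope check on the Maumee phosphorus figures above.
# Assumption (implied, not stated, by the article): the 860-ton
# annual cap is 60% of the 2008 baseline load.
target = 860          # metric tons/year, the 40%-reduction cap
actual_2017 = 1800    # metric tons recorded for the year

implied_baseline = target / 0.60
overshoot = actual_2017 / target

print(f"Implied 2008 baseline: ~{implied_baseline:.0f} metric tons")  # ~1433
print(f"2017 load vs. target: {overshoot:.1f}x")                      # ~2.1x
```

On that reading, the river carried roughly twice the phosphorus the target allows, which is consistent with Mr. Butler's assessment that voluntary measures are not moving the trend line fast enough.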

Mr. Butler said the new strategy is unrelated to the state’s recent agreement to designate western Lake Erie’s open water as impaired. The impairment requirements will be dealt with later, once agency officials have a better idea what they’re supposed to do to comply with them, he said.

“You can’t confuse it with the impairment designation,” Mr. Butler said.

He said almost 90 percent of the region’s phosphorus pollution now is believed to come from nonpoint sources, with agriculture by far being the largest of those.

A mass balance report coming out soon is expected to help the Ohio EPA better identify where most of the phosphorus is coming from, Mr. Butler said.

Karl Gebhardt, Ohio Lake Erie Commission executive director and Ohio EPA deputy director for water resources, said some of the ideas for the Lake Erie watershed have been inspired by some of the success the agency has had in limiting manure-spreading near Grand Lake St. Marys since a massive algal bloom there in 2010.

Data near Grand Lake St. Marys “shows water going into it is much better now than it was before when it was a watershed in distress,” he said.

US judge blocks weed-killer warning label in California

SAN FRANCISCO (AP) — A U.S. judge blocked California from requiring that the popular weed-killer Roundup carry a label stating that it is known to cause cancer, saying the warning is misleading because almost all regulators have concluded there is no evidence that the product’s main ingredient is a carcinogen.

U.S. District Judge William Shubb in Sacramento issued a preliminary injunction on Monday in a lawsuit challenging the state’s decision last year to list glyphosate as a chemical known to cause cancer.

The listing triggered the warning label requirement for Roundup that was set to go into effect in July.

Glyphosate is not restricted by the U.S. Environmental Protection Agency and has been widely used since 1974 to kill weeds while leaving crops and other plants alive.

The International Agency for Research on Cancer, based in Lyon, France, has classified the chemical as a “probable human carcinogen.” That prompted the California Office of Environmental Health Hazard Assessment to add glyphosate to its cancer list.

Shubb said a “reasonable consumer would not understand that a substance is ‘known to cause cancer’ where only one health organization had found that the substance in question causes cancer.”

“On the evidence before the court, the required warning for glyphosate does not appear to be factually accurate and uncontroversial because it conveys the message that glyphosate’s carcinogenicity is an undisputed fact,” he said.

Sam Delson, a spokesman for the state’s office of environmental health hazard assessment, noted that the judge did not block the state from putting glyphosate on its cancer list, but only from requiring the warning label.

“We are pleased that the listing of glyphosate remains in effect, and we believe our actions were lawful,” he said.

He said the office had not decided yet whether to appeal the ruling on Monday.

The ruling came in a lawsuit filed by the national wheat and corn growers associations, state agriculture and business organizations in Iowa, Missouri, North Dakota and South Dakota, and a regional group representing herbicide sellers in California, Arizona and Hawaii. The plaintiffs also include St. Louis-based Monsanto Co., which makes Roundup.

“Every regulatory body in the world that has reviewed glyphosate has found it safe for use, and no available product matches glyphosate with a comparable health and environmental safety profile,” Chandler Goule, CEO of the National Association of Wheat Growers, said in a statement.

The lawsuit will continue with Shubb’s injunction in place.

A brief history of plastic, design’s favorite material

More and more Americans reckon with their wild overuse of plastic. Here’s how we got so dependent on the stuff in the first place.


From its early beginnings during and after World War II, the commercial industry for polymers–long-chain synthetic molecules for which “plastics” is a common misnomer–has grown rapidly. In 2015, over 320 million tons of polymers, excluding fibers, were manufactured across the globe.

Until the last five years, polymer product designers have typically not considered what will happen after the end of their product’s initial lifetime. This is beginning to change, and this issue will require increasing focus in the years ahead.

“Plastic” has become a somewhat misguided way to describe polymers. Typically derived from petroleum or natural gas, these are long-chain molecules with hundreds to thousands of links in each chain. Long chains convey important physical properties, such as strength and toughness, that short molecules simply cannot match.

“Plastic” is actually a shortened form of “thermoplastic,” a term that describes polymeric materials that can be shaped and reshaped using heat.

The modern polymer industry was effectively created by Wallace Carothers at DuPont in the 1930s. His painstaking work on polyamides led to the commercialization of nylon, as a wartime shortage of silk forced women to look elsewhere for stockings.

When other materials became scarce during World War II, researchers looked to synthetic polymers to fill the gaps. For example, the supply of natural rubber for vehicle tires was cut off by the Japanese conquest of Southeast Asia, leading to a synthetic polymer equivalent.

Curiosity-driven breakthroughs in chemistry led to further development of synthetic polymers, including the now widely used polypropylene and high-density polyethylene. Some polymers, such as Teflon, were stumbled upon by accident.

Eventually, the combination of need, scientific advances, and serendipity led to the full suite of polymers that you can now readily recognize as “plastics.” These polymers were rapidly commercialized, thanks to a desire to reduce products’ weight and to provide inexpensive alternatives to natural materials like cellulose or cotton.


The production of synthetic polymers globally is dominated by the polyolefins–polyethylene and polypropylene.

Polyethylene comes in two types: “high density” and “low density.” On the molecular scale, high-density polyethylene looks like a comb with regularly spaced, short teeth. The low-density version, on the other hand, looks like a comb with irregularly spaced teeth of random length–somewhat like a river and its tributaries if seen from high above. Although they’re both polyethylene, the differences in shape make these materials behave differently when molded into films or other products.

Polyolefins are dominant for a few reasons. First, they can be produced using relatively inexpensive natural gas. Second, they’re the lightest synthetic polymers produced at large scale; their density is so low that they float. Third, polyolefins resist damage by water, air, grease, cleaning solvents–all things that these polymers could encounter when in use. Finally, they’re easy to shape into products, while robust enough that packaging made from them won’t deform in a delivery truck sitting in the sun all day.

However, these materials have serious downsides. They degrade painfully slowly, meaning that polyolefins will survive in the environment for decades to centuries. Meanwhile, wave and wind action mechanically abrades them, creating microparticles that can be ingested by fish and animals, making their way up the food chain toward us.

Recycling polyolefins is not as straightforward as one would like owing to collection and cleaning issues. Oxygen and heat cause chain damage during reprocessing, while food and other materials contaminate the polyolefin. Continuing advances in chemistry have created new grades of polyolefins with enhanced strength and durability, but these cannot always mix with other grades during recycling. What’s more, polyolefins are often combined with other materials in multilayer packaging. While these multilayer constructs work well, they are impossible to recycle.

Polymers are sometimes criticized for being produced from increasingly scarce petroleum and natural gas. However, the fraction of either natural gas or petroleum used to produce polymers is very low; less than 5% of either oil or natural gas produced each year is employed to generate plastics. Further, ethylene can be produced from sugarcane ethanol, as is done commercially by Braskem in Brazil.


Depending upon the region, packaging consumes 35% to 45% of all synthetic polymer produced, a market the polyolefins dominate. Polyethylene terephthalate, a polyester, dominates the market for beverage bottles and textile fibers.

Building and construction consumes another 20% of the total polymers produced, where PVC pipe and its chemical cousins dominate. PVC pipes are lightweight, can be glued rather than soldered or welded, and greatly resist the damaging effects of chlorine in water. Unfortunately, the chlorine atoms that give PVC this advantage make it very difficult to recycle; most is discarded at the end of its life.

Polyurethanes, an entire family of related polymers, are widely used in foam insulation for homes and appliances, as well as in architectural coatings.

The automotive sector uses increasing amounts of thermoplastics, primarily to reduce weight and thus meet stricter fuel-efficiency standards. The European Union estimates that 16% of the weight of an average automobile is plastic, most notably in interior components.

Over 70 million tons of thermoplastics per year are used in textiles, mostly clothing and carpeting. More than 90% of synthetic fibers, largely polyethylene terephthalate, are produced in Asia. The growth in synthetic fiber use in clothing has come at the expense of natural fibers like cotton and wool, which require significant amounts of farmland to be produced. The synthetic fiber industry has seen dramatic growth for clothing and carpeting, thanks to interest in special properties like stretch, moisture-wicking, and breathability.

As in the case of packaging, textiles are not commonly recycled. The average U.S. citizen generates over 90 pounds of textile waste each year. According to Greenpeace, the average person in 2016 bought 60% more items of clothing every year than the average person did 15 years earlier, and keeps the clothes for a shorter period of time.

This post originally appeared on The Conversation. Eric Beckman is professor of chem/petroleum engineering at the University of Pittsburgh.

7 years later, new study shows East Chicago kids exposed to more lead because of flawed government report

Lead contamination forces families from their homes in East Chicago

About 1,000 residents were ordered to leave a housing complex in East Chicago tainted with lead and arsenic and find new lodging.

Michael Hawthorne, Contact Reporter

After federal officials assessed potential health hazards in East Chicago seven years ago, they declared young children could play safely in neighborhoods built on or near former industrial sites contaminated with brain-damaging lead.

Now the same government agency confirms it woefully misled parents and city officials.

Kids living in two of the contaminated neighborhoods actually were nearly three times more likely to suffer lead poisoning during the past decade than if they lived in other parts of the heavily industrialized northwest Indiana city, according to a report unveiled last week by an arm of the U.S. Centers for Disease Control and Prevention.

Written in dry, bureaucratic language, the mea culpa is the latest acknowledgement that federal and state officials repeatedly failed to protect residents in the low-income, predominantly Hispanic and African-American city, despite more than three decades of warnings about toxic pollution left by the USS Lead smelter and other abandoned factories.

The new report comes as the U.S. Environmental Protection Agency oversees the excavation of contaminated soil from dozens of East Chicago yards, a project delayed in part because the CDC branch — the Agency for Toxic Substances and Disease Registry — advised the EPA in 2011 that “breathing the air, drinking tap water or playing in soil in neighborhoods near the USS Lead Site is not expected to harm people’s health.”

Many current and former residents are angry the hazards weren’t identified and removed years ago. Some had no idea they were living on toxic land until East Chicago’s mayor ordered the 2016 evacuation of the former West Calumet Housing Complex, built on the grounds of another lead smelter.

“Nobody ever told us about the lead,” said Akeeshea Daniels, a lifelong city resident who moved into the taxpayer-funded complex a month after her now-14-year-old son was born. “It’s been two years since they forced us to move, and we still haven’t gotten our questions answered.”

With the help of public interest lawyers working on behalf of former tenants, Daniels said she discovered medical records that showed her son had elevated levels of lead in his blood when he was a toddler, a possible explanation for the attention deficit hyperactivity disorder he was diagnosed with a few years later.

The EPA found lead levels in Daniels’ apartment as high as 32,000 parts per million — 80 times the federal safety limit for areas where children play.
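Those two figures pin down the underlying standard. As a quick sanity check (a sketch, using only the numbers in the paragraph above), 32,000 parts per million described as 80 times the federal safety limit implies a limit of 400 ppm, which is the EPA's residential soil-lead hazard standard for children's play areas:

```python
# Sanity check on the lead-concentration figures above:
# 32,000 ppm, described as 80 times the federal safety limit
# for areas where children play, implies a 400 ppm limit
# (matching the EPA's play-area soil-lead hazard standard).
measured_ppm = 32_000
ratio = 80

implied_limit = measured_ppm / ratio
print(f"Implied federal limit: {implied_limit:.0f} ppm")  # 400 ppm
```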

Lead is unsafe to consume at any level, according to the EPA and CDC. Ingesting tiny concentrations can permanently damage the developing brains of children and contribute to heart disease, kidney failure and other health problems later in life. A recent peer-reviewed study estimated more than 400,000 deaths a year in the U.S. are linked to lead exposure — or 18 percent of all deaths.

According to the new report by the toxic substances registry, more than 27 percent of West Calumet children ages 5 and younger who were tested between 2005 and 2015 had lead levels at or above 5 micrograms per deciliter of blood, the CDC’s current threshold for medical intervention and home inspections. Roughly the same percentage of kids were poisoned in the residential neighborhood immediately east of the former housing complex.

Lead poisoning

Two plastic swans help hold up a sign that reads: "Do not play in the dirt or around the mulch" at a housing unit at the West Calumet Housing Complex in East Chicago in 2016. (Antonio Perez/Chicago Tribune)

By contrast, 11.5 percent of young kids in the rest of East Chicago and 7.8 percent of children statewide had lead levels at or above the CDC threshold in the same period, according to the report.

Following the infamous public health disaster in Flint, Mich., where actions by state-appointed officials triggered alarming levels of lead in drinking water, about 4.9 percent of children tested there had high lead levels.

Federal health officials based their flawed 2011 assessment of East Chicago on lead poisoning rates throughout the city rather than data from specific neighborhoods targeted by the EPA’s Superfund program for highly contaminated industrial sites.

Relying on the wrong data led to erroneous or misleading conclusions such as “declining blood lead levels in small children appear to confirm that they are no longer exposed to lead from any source.”

Since the EPA tends to prioritize cleanups the same way battlefield medics assess the wounded — concentrating on immediate or obvious risks first — agency officials have said the 2011 report factored heavily into their decision to focus initially on a gradual approach at the East Chicago site.

The toxic substances registry is required by law to assess risks to public health at every Superfund site. Its new report pins the blame for its East Chicago snafu on Indiana state officials who provided data used in the 2011 report.

After promising a response, a spokesperson for the health agency did not answer emailed questions from the Tribune. In a statement, the EPA said it has removed contaminated soil from more than 400 yards so far and “will continue to partner and coordinate with ATSDR to protect human health and the environment.”

Local, state and federal officials had plenty of reasons to act years ago.

They always knew the public housing complex had been built on the site of a former lead smelter. In 1994, the toxic substances registry recommended extensive soil testing and blood screening, and four years later the agency reported unusually high rates of childhood lead poisoning in the area.

Yet a 2016 Tribune investigation found officials were slow to propose a cleanup or inform residents about potential hazards. Only a single soil sample from the public housing complex had been collected and analyzed before the EPA declared the development and surrounding neighborhoods a Superfund site in 2009, the newspaper found.

Since then, lead poisoning rates have been declining in neighborhoods within the Superfund site boundaries, as well as in all of East Chicago, the rest of Indiana and the nation as a whole, according to the new report from the federal health agency. But the percentage of kids tested throughout Indiana also has declined dramatically in recent years, suggesting an untold number of poisoned children aren’t accounted for in official statistics.

“These results reflect the harms that stem from environmental injustice,” said Debbie Chizewer, a lawyer with the Environmental Advocacy Center at Northwestern University's Bluhm Legal Center, who has been working with East Chicago residents to ensure they have safe housing. “Government agencies at all levels all failed this community.”

Full Report:


21 AUGUST 2018

A Macquarie PhD student believes he’s come up with a way to turn coffee waste into biodegradable plastic coffee cups.

He’s developed a method to turn coffee grounds into lactic acid, which can then be used to produce biodegradable plastics, and is now refining the process as he finishes his PhD.

“Australians consume six billion cups of coffee every year, and the coffee grounds used to make these coffees are used only once and then discarded,” says researcher Dominik Kopp.

“In Sydney alone, over 920 cafes and coffee shops produce nearly 3,000 tonnes of waste coffee grounds every year.

“Ninety-three per cent of this waste ends up in landfill, where it produces greenhouse gases that contribute to global warming.”

However, 50 per cent of coffee grounds are made up of sugars, which are ideal candidates to convert into valuable bio-based chemicals, or chemicals derived from plant- or animal-based feedstocks rather than crude oil.

“Our group is looking for new ways to convert biowaste—whether that be agricultural, garden, paper or commercial food waste—into valuable raw materials that can be used to produce high-value compounds in more environmentally-friendly ways,” says Associate Professor Anwar Sunna, Dominik’s supervisor and head of the Sunna Lab which is using the rapidly growing field of synthetic biology to address biotechnology and biomedical challenges.

Dominik sourced coffee grounds from one of the coffee shops on Macquarie’s campus and took them back to the lab.

“We assembled a synthetic pathway to convert the most abundant sugar in the coffee grounds, mannose, into lactic acid,” he says.

“Lactic acid can be used in the production of biodegradable plastics, offering a more sustainable and environmentally-friendly alternative to fossil fuel-derived plastics.

“You could use such plastics to make anything from plastic coffee cups to yoghurt containers to compost bags to sutures in medicine.”

Their method was inspired by a metabolic pathway that is thought to exist in an evolutionarily ancient organism, which lives in hot and extremely acidic environments.

Dominik was awarded the INOFEA Early Career Award for Applied Biocatalysis or Nanobiotechnology for the poster he presented on his research at the 18th European Congress on Biotechnology last month.

His next step will be to further refine his conversion pathway, and improve the yield of lactic acid.

“I think my project is one of many interesting approaches on how to use synthetic biology in a responsible manner for the development of a more sustainable and greener industry that doesn’t rely on crude oil,” says Dominik.

“The simple idea that we are converting waste into a valuable and sustainable product is extremely exciting!”

Kopp, D., Care, A., Bergquist, P.L., Willows, R. and Sunna, A. Cell-free synthetic pathway for the conversion of spent coffee grounds into lactic acid. Poster presented at: 18th European Congress on Biotechnology; 2018 July 1-4; Geneva, Switzerland.


Biorenewable, biodegradable plastic alternative synthesized by CSU chemists


The team describes synthesis of a polymer called bacterial poly(3-hydroxybutyrate), or P3HB


Colorado State University polymer chemists have taken another step toward a future of high-performance, biorenewable, biodegradable plastics.

Publishing in Nature Communications, the team led by Professor of Chemistry Eugene Chen describes chemical synthesis of a polymer called bacterial poly(3-hydroxybutyrate), or P3HB. The compound shows early promise as a substitute for petroleum plastics in major industrial uses.

P3HB is a biomaterial, typically produced by bacteria, algae and other microorganisms, and is used in some biomedical applications. Its high production costs and limited volumes render the material impractical in more widespread commodity applications, however.

The team, which includes the paper's first author and research scientist Xiaoyan Tang, used a starting material called succinate, an ester form of succinic acid. This acid is produced via fermentation of glucose and is first on the U.S. Department of Energy's list of top 12 biomass-derived compounds best positioned to replace petroleum-derived chemicals.

The researchers' new chemical synthesis route produces P3HB that's similar in performance to bacterial P3HB, but their route is faster and offers potential for larger-scale, cost-effective production for commodity plastic applications. This new route is enabled by a class of powerful new catalysts they have designed and synthesized. They have filed a provisional patent through CSU Ventures for the new technology.


‘Infinitely’ recyclable polymer shows practical properties of plastics

Apr, 2018

By Anne Manning

Eugene Chen’s lab at Colorado State University has developed a completely recyclable polymer. Credit: Bill Cotton/Colorado State University

The world fell in love with plastics because they’re cheap, convenient, lightweight and long-lasting. For these same reasons, plastics are now trashing the Earth.

Colorado State University chemists have announced in the journal Science another major step toward waste-free, sustainable materials that could one day compete with conventional plastics. Led by Eugene Chen, professor in the Department of Chemistry, they have discovered a polymer with many of the same characteristics we enjoy in plastics, such as light weight, heat resistance, strength and durability. But the new polymer, unlike typical petroleum plastics, can be converted back to its original small-molecule state for complete chemical recyclability. This can be accomplished without the use of toxic chemicals or intensive lab procedures.

Polymers are a broad class of materials characterized by long chains of chemically bonded, repeating molecular units called monomers. Synthetic polymers today include plastics, as well as fibers, ceramics, rubbers, coatings, and many other commercial products.

The work builds on a previous generation of a chemically recyclable polymer Chen’s lab first demonstrated in 2015. Making the old version required extremely cold conditions that would have limited its industrial potential. The previous polymer also had low heat resistance and molecular weight, and, while plastic-like, was relatively soft.

But the fundamental knowledge gained from that study was invaluable, Chen said. It led to a design principle for developing future-generation polymers that not only are chemically recyclable, but also exhibit robust practical properties.

The new, much-improved polymer structure resolves the issues of the first-generation material. The monomer can be conveniently polymerized under environmentally friendly, industrially realistic conditions: solvent-free, at room temperature, with just a few minutes of reaction time and only a trace amount of catalyst. The resulting material has a high molecular weight, thermal stability and crystallinity, and mechanical properties that perform very much like a plastic. Most importantly, the polymer can be recycled back to its original, monomeric state under mild lab conditions, using a catalyst. Without need for further purification, the monomer can be re-polymerized, thus establishing what Chen calls a circular materials life cycle.

This piece of innovative chemistry has Chen and his colleagues excited for a future in which new, green plastics, rather than surviving in landfills and oceans for millions of years, can be simply placed in a reactor and, in chemical parlance, de-polymerized to recover their value – not possible for today’s petroleum plastics. Back at its chemical starting point, the material could be used over and over again – completely redefining what it means to “recycle.”

“The polymers can be chemically recycled and reused, in principle, infinitely,” Chen said.

Chen stresses that the new polymer technology has only been demonstrated at the academic lab scale. There is still much work to be done to perfect the patent-pending monomer and polymer production processes he and colleagues have invented.

With the help of a seed grant from CSU Ventures, the chemists are optimizing their monomer synthesis process and developing new, even more cost-effective routes to such polymers. They’re also working on scalability issues on their monomer-polymer-monomer recycling setup, while further researching new chemical structures for even better recyclable materials.

“It would be our dream to see this chemically recyclable polymer technology materialize in the marketplace,” Chen said.

The paper’s first author is CSU research scientist Jian-Bo Zhu. Co-authors are graduate students Eli Watson and Jing Tang.


New study finds US oil & gas methane emissions 60 percent higher than estimated


High emissions findings undercut the case that gas offers substantial climate advantage over coal


The U.S. oil and gas industry emits 13 million metric tons of the potent greenhouse gas methane from its operations each year, 60 percent more than estimated by the U.S. Environmental Protection Agency, according to a new study published today in the journal Science.

Significantly, researchers found most of the emissions came from leaks, equipment malfunctions and other "abnormal" operating conditions. The climate impact of these leaks in 2015 was roughly the same as the climate impact of carbon dioxide emissions from all U.S. coal-fired power plants operating in 2015, they found.

"This study provides the best estimate to date on the climate impact of oil and gas activity in the United States," said co-author Jeff Peischl, a CIRES scientist working in NOAA's Chemical Sciences Division in Boulder, Colorado. "It's the culmination of 10 years of studies by scientists across the country, many of which were spearheaded by CIRES and NOAA."

The new paper assessed measurements made at more than 400 well pads in six oil and gas production basins and scores of midstream facilities; measurements from valves, tanks and other equipment; and aerial surveys covering large swaths of the U.S. oil and gas infrastructure. The research was organized by the Environmental Defense Fund and drew on science experts from 16 research institutions including the University of Colorado Boulder and the University of Texas at Austin.

Methane, the main ingredient of natural gas, is a potent greenhouse gas that has more than 80 times the warming impact of carbon dioxide over the first 20 years after its release. The new study estimates total US emissions at 2.3 percent of production, enough to erode the potential climate benefit of switching from coal to natural gas over the past 20 years. The methane lost to leakage is worth an estimated $2 billion, according to the Environmental Defense Fund, enough to heat 10 million homes in the U.S.
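The article's figures can be cross-checked with simple arithmetic. The sketch below is illustrative only; it uses the numbers reported in the article (13 million metric tons per year, 60 percent above the EPA estimate, a 2.3 percent leak rate), not the underlying Science paper, and the implied EPA-inventory and production figures it derives are back-of-the-envelope inferences, not quoted values.

```python
# Back-of-the-envelope check on the article's methane figures (illustrative
# only; all inputs below come from the article, not the underlying paper).
study_estimate_mt = 13.0   # million metric tons of CH4 per year (new study)
excess_over_epa = 0.60     # study is "60 percent more than estimated" by EPA

# Implied EPA inventory figure: study = EPA * (1 + excess)
epa_estimate_mt = study_estimate_mt / (1 + excess_over_epa)

# A 2.3 percent leak rate implies the total gas produced/handled:
leak_rate = 0.023
implied_production_mt = study_estimate_mt / leak_rate

print(round(epa_estimate_mt, 1))     # ~8.1 million metric tons
print(round(implied_production_mt))  # ~565 million metric tons
```

The implied EPA figure of roughly 8 million metric tons is consistent with the "60 percent higher" framing in the headline.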

The assessment does suggest that repairing leaks and addressing other conditions that result in the accidental release of saleable methane could be effective. "Natural gas emissions can, in fact, be significantly reduced if properly monitored," said co-author Colm Sweeney, an atmospheric scientist in NOAA's Global Monitoring Division. "Identifying the biggest leakers could substantially reduce emissions that we have measured."

New data confirm increased frequency of extreme weather events

Public Release: 21-Mar-2018

European national science academies urge further action on climate change adaptation

European Academies' Science Advisory Council, Leopoldina - Nationale Akademie der Wissenschaften

New data show that extreme weather events have become more frequent over the past 36 years, with a significant uptick in floods and other hydrological events compared even with five years ago, according to a new publication, "Extreme weather events in Europe: Preparing for climate change adaptation: an update on EASAC's 2013 study" by the European Academies' Science Advisory Council (EASAC), a body made up of 27 national science academies in the European Union, Norway, and Switzerland. Given the increase in the frequency of extreme weather events, EASAC calls for stronger attention to climate change adaptation across the European Union: leaders and policy-makers must improve the adaptability of Europe's infrastructure and social systems to a changing climate.

Globally, according to the new data, the number of floods and other hydrological events has quadrupled since 1980 and doubled since 2004, highlighting the urgency of adaptation to climate change. Climatological events, such as extreme temperatures, droughts, and forest fires, have more than doubled since 1980. Meteorological events, such as storms, have doubled since 1980 (Figure 2.1 in the 2013 report; Figure 1 in the 2018 update).

These extreme weather events carry substantial economic costs. In the updated data (Figure 2 in the 2018 update), thunderstorm losses in North America have doubled, from under US$10 billion in 1980 to almost $20 billion in 2015. On a more positive note, river flood losses in Europe show a near-static trend (despite their increased frequency), indicating that protection measures that have been implemented may have stemmed flood losses.

Professor Michael Norton, EASAC's Environment Programme Director states, "Our 2013 Extreme Weather Events report - which was based on the findings of the Norwegian Academy of Science and Letters and the Norwegian Meteorological Institute - has been updated and the latest data supports our original conclusions: there has been and continues to be a significant increase in the frequency of extreme weather events, making climate proofing all the more urgent. Adaptation and mitigation must remain the cornerstones of tackling climate change. This update is most timely since the European Commission is due to release its evaluation of its climate strategy this year."

Is a contemporary shutdown of the Gulf Stream (AMOC) possible?

The update also reviews evidence on key drivers of extreme events. A major point of debate remains whether the Gulf Stream, or Atlantic Meridional Overturning Circulation (AMOC), will merely decline or could 'switch off' entirely, with substantial implications for Northwest Europe's climate. Recent monitoring does suggest a significant weakening, but debate continues over whether the AMOC may "switch off" as a result of increased flows of fresh water from northern-latitude rainfall and melting of the Greenland icecap. EASAC notes the importance of continuing to use emerging oceanographic monitoring data to provide a more reliable forecast of the impacts of global warming on the AMOC. The update also notes recent evidence suggesting an association between the rapid rate of Arctic warming and extreme cold events further south (including in Europe and the Eastern USA) due to a weakened and meandering jet stream.

Federal report on climate change: High-tide flooding could become routine by 2100

Updated 1:57 PM; Posted 1:57 PM, March 28, 2018 US News & World Report

By The Washington Post

High-tide flooding, which can wash water over roads and inundate homes and businesses, was once a rare event in coastal areas. But its frequency has increased rapidly in recent years due to sea-level rise, not just during storms but increasingly on sunny days, too.

Years ago, the late Margaret Davidson, a coastal programs director at the National Oceanic and Atmospheric Administration, warned it wouldn't be long until such flooding becomes routine. "Today's flood will become tomorrow's high tide," she said.

A new NOAA report contains startling new projections that affirm Davidson's warning. Due to sea-level rise, high-tide flooding is already accelerating along many parts of the U.S. coastline, including the Mid-Atlantic.

By 2100, the report says, "high tide flooding will occur 'every other day' (182 days/year) or more often" even under an "intermediate low scenario" in coastal areas along the East Coast and Gulf of Mexico. This scenario works under the assumption that greenhouse gas emissions - which warm the climate and speed up sea-level rise - are curbed.

For a more aggressive "intermediate" scenario, in which greenhouse gas emissions carry on at today's pace, high tide flooding is forecast to occur 365 days per year.

In February, scientists published a separate study concluding that they had already detected an acceleration in sea-level rise and that it is likely to speed up more in the coming decades as ice sheets disintegrate and ocean waters expand.

The implications of the steepening sea-level rise on high-tide flooding are already apparent on the ground.

William Sweet, the lead author of the NOAA report, has witnessed an uptick in flooding firsthand. "I moved to Annapolis (Maryland) last year," he said in an interview. "Since I moved there, I've seen six or seven days when waters hit storefronts. What we're finding is along the East Coast, and the Mid-Atlantic in particular, already the flooding is accelerating."

In just 15 years, the incidence of high-tide flooding in the Mid-Atlantic has doubled, from an average of three days per year in 2000 to six in 2015, according to the NOAA report.
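Sweet's point that the change is "measured in leaps and bounds" can be illustrated with a naive doubling extrapolation. This is purely an illustration of compound growth from the two data points in the article (3 days/year in 2000, 6 in 2015); NOAA's actual projections are driven by sea-level-rise scenarios, not this simple extrapolation.

```python
# Naive doubling sketch, for illustration only: the NOAA report's real
# projections use sea-level scenarios, not simple extrapolation.
# Inputs: Mid-Atlantic high-tide flooding averaged 3 days/yr in 2000 and
# 6 days/yr in 2015 (a doubling in 15 years), per the NOAA report.
days_per_year, year = 3, 2000
projection = []
while year < 2100:
    projection.append((year, days_per_year))
    days_per_year *= 2
    year += 15

for y, d in projection:
    print(y, d)
# The naive doubling reaches 192 days/yr by 2090, past the report's
# "every other day" (182 days/yr) threshold.
```

Even this crude sketch shows why an early doubling matters: a few more doublings turn a minor nuisance into a near-daily condition.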

"That's a very important outcome," Sweet said. "What it means is that the change is not a gradual linear change, but measured in leaps and bounds. By the time you realize there is a flooding issue, impacts will become chronic rather quickly."

A similar acceleration in coastal flooding has been seen in other locations along the East Coast, including Florida, the Carolinas and in the Northeast.

Practically every time tides are intensified by the lunar cycle in South Florida, media reports show inundation from "sunny day flooding," which seldom occurred mere decades ago.

This past winter, Boston observed its first and third highest tides in recorded history as nor'easters battered southern New England in January and March.

"The record-breaking event of January 2018 would not have broken the record had it not been for relative sea-level rise that carried the tide and storm surge above the level reached in 1978," Andrew Kemp, an assistant professor of earth and ocean sciences at Tufts University, told the Boston Globe.

The prospect of high-tide flooding occurring every day or even every other day late this century is difficult to fathom.

Michael Lowry, a visiting scientist at the University Corporation for Atmospheric Research, expressed shock on Twitter after seeing these projections. "It's hard to overstate the significance of this," he said. "That isn't even the intermediate high, high, or extreme scenarios that bring us 365 (days per year) high-tide flooding in my lifetime. It's crazy."

Even by 2050, just over 30 years from now, the report projects high-tide flooding will occur between 50 and 250 days per year along the East Coast, depending on the emissions scenario.

"This is such a huge deal," tweeted Alan Gerard, a division chief at the National Severe Storms Laboratory. "And even the increase by 2040. That's not that far away."

Astrid Caldas, a senior climate scientist at the Union of Concerned Scientists who tracks the impacts of sea-level rise, is extremely worried about the projected flooding increases.

"(B)y mid-century, the frequency of this type of 'minor' flooding would become so disruptive that business as usual would be practically impossible without significant adaptation measures," Caldas said. "Without planning for flooding with measures such as protecting, elevating, accommodating the water, or even moving stuff out of the way, the impacts on the cities, their economy, and their residents would be immense."

She added: "Just imagine seeing streets (and property) flooded every other day. That gives a completely new meaning to the term 'nuisance flooding.' Or actually, it completely obliterates the concept, as flooding would become much more than a nuisance, but a rather serious problem requiring significant resources and innovative policies."

NOAA's Sweet said informing decision-makers and planners about the escalating flood risk was impetus for the report. "(Floodwaters) getting up to storefronts in Annapolis becomes the new tide (by late this century)," he said. "When you project out that far, the assumption is that society will adapt. That's the real reason we're doing this: to inform decision-making so the best science is available to plan accordingly."

In the D.C. region, significant infrastructure is at risk from rising seas, including assets around the National Mall, the Southwest Waterfront, Old Town Alexandria in Virginia, and Annapolis.

To respond to this risk, an interagency team known as the District of Columbia Silver Jackets formed in 2014. It includes members from regional and federal agencies, such as the D.C. government, U.S. Army Corps of Engineers, National Park Service and National Weather Service, as well as academia.

The team convened a flood summit in 2016, has developed an online flood inundation mapping tool, and conducted a study for vulnerable neighborhoods along Watts Branch, a tributary of the Anacostia River.

"While our team has made great strides in identifying and helping to mitigate flood risk in the region, there is still so much work to be done, including taking a more comprehensive, integrated and regional approach to flood risk management," the Silver Jackets team said in a statement. "The statistics in this report are staggering, and really hit home on the fact that time is of the essence as we plan for the future and take steps now to ensure our nation's resources and treasures are not impacted by flooding."

Caldas stressed global action is needed in addition to local measures to lower the risk of high-tide flooding. "(A)dhering to the commitments of the Paris (climate) Agreement is exceptionally important, because emissions for the next couple of decades, up to mid-century, will define how much sea level rise we will see in the end of the century - and how much tidal flooding we will see: every other day or every day," she said.

Washington bans firefighting chemicals that may cause cancer

UPDATED: Tue., March 27, 2018, 5:39 p.m.

Water that may be contaminated with chemicals used in firefighting foam on Fairchild Air Force Base is flushed from a fire hydrant in Airway Heights last December. Washington is banning those chemicals for civilian fire departments, although it has no authority to extend the ban to military bases.

By Jim Camden


OLYMPIA – Firefighting foam with a chemical thought to cause cancer and other health problems will be banned in two years for local fire departments and districts in Washington.

A new law signed Tuesday bans the group of chemicals that are contaminating some wells in Airway Heights and other water sources near military bases, although it won’t directly affect that contamination.

Per- and polyfluoroalkyl substances, or PFAS, are a key ingredient in some foams used to extinguish fuel fires, and are also commonly applied to firefighters’ protective gear. They last a long time, are almost indestructible under most natural conditions and travel easily through the soil into underground water supplies.

They may also be responsible for a high rate of cancers in firefighters, lawmakers were told in hearings for the bill signed by Gov. Jay Inslee on Tuesday. Washington may be the first state to impose such a ban, which takes effect in 2020.

In Airway Heights, the contamination is linked to firefighters practicing with the foam on Fairchild Air Force Base. The state can’t tell the Defense Department not to buy foam with PFAS to put out aviation-fuel fires on its bases, but the foam can’t be used for training on bases or at airports. The bill is directed at local fire departments and fire districts, some of which already are finding substitutes or altering training to reduce the use of the foam except in real emergencies.

Water Scarcity Looms in London’s Future

March 28, 2018

The city’s water demand is expected to exceed supply within the next decade.

London draws a majority of its water from the Thames and Lea rivers.

In recent years, London has grappled with water shortages and sewage overflows as the city’s population steadily increases. Water demand is expected to exceed supply within the next decade, and severe water shortages could affect Britain’s capital by 2040.

London’s water stress is influenced by several factors. The metropolitan area receives roughly 600 mm of rain each year, less than cities such as Istanbul, Turkey, and Sydney, Australia. The relatively low rainfall, coupled with the growing likelihood of hotter, drier summers, is putting pressure on the city’s water supply. At the same time, London’s population is expanding by 100,000 people each year.

London’s key water sources–the Thames and Lea rivers–are also facing pollution problems. Plastic and other debris infiltrate the waterways and clog the city’s sewer system. In addition, the rivers are sometimes deluged with wastewater when heavy rainfall overwhelms the sewage system’s treatment capacity. In order for London to avoid extreme water stress, advocates say, the city needs to cut consumption and make river preservation a priority.

“Pressure on our rivers and water supplies is only going to rise. We need to act now to ensure we have enough water for people and nature in years to come.” –Rose O’Neill, freshwater programme manager at WWF-UK, in reference to the need for careful water stewardship in London.

By The Numbers

13.4 million Possible population of London by 2050, according to the London Assembly Environment Committee. The population is currently nearing 9 million.

20 percent Amount that demand for water could outstrip London’s supply by 2040.

65 percent Proportion of London’s water that is pumped from rivers. The other 35 percent is drawn from aquifers.

£20 million ($26 million) Amount that Thames Water, London’s main water provider, was charged in March 2017 for discharging sewage into the River Thames. The leaks occurred over a 2-year period and amounted to 1.4bn litres of sewage.

70 percent Proportion of flounder in the River Thames that have bits of plastic in their guts, according to a 2015 study.

25,000 tonnes Amount of debris that Thames Water removes from its sewage system every year. Recently, fatbergs–congealed masses of cooking fat, diapers, and wipes–have been obstructing sewer pipes in London. The fatbergs increase the likelihood of raw sewage flowing into streets.

According to a 2016 study by Water UK, there is a 20 percent chance that London residents will need to queue for water at standpipes during a summer drought in the next 25 years. The researchers ran models based on climate change, population growth, and future environmental protections, and found that UK summers will likely become increasingly hot and dry. The most extreme models predicted that summer droughts could result in economic costs of up to £1.3 billion per day throughout England and Wales.

One recommendation for improving London’s water outlook is to utilize a “twin track” approach to water management. This method would focus on preserving the city’s current water supply while also implementing policies to lower demand. New infrastructure is needed as well, especially to help improve the city’s aging sewage system.

If London cannot incentivize residents to reduce water use, then extra storage capacity will likely be needed. Thames Water sold 25 reservoirs after it was privatized in the late 1980s. The utility insists that the reservoirs were “service” rather than “storage” reservoirs, but the sale has drawn scrutiny, especially as London’s water stress grows. Either way, the company now faces buying new land to increase water storage capacity, an expensive endeavor in modern-day London.

After 7 years, Madison compost program trashed

CHRIS RICKERT Jun 12, 2018

The city's organics collection program will end next week. Popular with participants, the program suffered from too much contaminated waste and a lack of a biodigester to collect the gas it produced.

Madison’s organics-collection pilot program will end next week, a victim of people putting too many non-compostables into their compostables carts and a city that couldn’t afford its own biodigester, the city’s recycling coordinator announced Monday.

Bryan Johnson emphasized that collecting food scraps and possibly other compostable materials citywide continues to be a long-term goal, and ending the current seven-year-long program will give officials time to study programs elsewhere that have successfully been able to process waste to create biogas for sale to utilities.

“We are committed to making this work,” he said, but “what we’ve been getting out of the bins really is just incompatible with the processing options.”

There are currently about 1,100 households and 40 businesses in the program.

While the load of food scraps, soiled paper and other material delivered to a Middleton digester in March had fewer plastic bags, metals and other noncompostables than the prior load, “the contamination within the material created a very labor-intensive and slow process, which also requires thousands of gallons of water,” he said.

That contamination included lumber scraps and two seemingly full bottles of liquor, he said, and the extra water and labor needed to separate debris and process the compostables meant a $200-per-ton fee from the digester’s operator, GL Dairy Biogas, which is “far too expensive.” By contrast, the county landfill charges a dumping fee of $50 a ton.
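The cost gap between the two disposal routes is the crux of the program's failure. A minimal sketch of the arithmetic, using the per-ton fees quoted in the article; note the annual tonnage below is a hypothetical round number chosen for scale, not a figure from the article.

```python
# Cost comparison using the per-ton fees quoted in the article; the annual
# tonnage is a hypothetical round number for scale, NOT from the article.
digester_fee = 200  # $/ton charged by GL Dairy Biogas for contaminated loads
landfill_fee = 50   # $/ton county landfill dumping fee

ratio = digester_fee / landfill_fee
print(ratio)  # 4.0 -- composting the contaminated stream costs 4x landfilling

hypothetical_tons = 500  # assumed annual tonnage, for illustration only
extra_cost = (digester_fee - landfill_fee) * hypothetical_tons
print(extra_cost)  # 75000 -- extra dollars per year at that tonnage
```

At a 4x premium, every ton diverted from landfill to the digester costs the city $150 more, which is why contamination that drives up processing labor made the program untenable.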

Johnson said that when the program began in 2011 with about 500 participants, the plan was for the city to open its own biodigester in 2016, but “this did not happen.”

Constructing its own biodigester would allow the city to tailor it to the city’s waste, Johnson said.

But at a cost of $30 million to $35 million, it would also be “way beyond the scope of this city,” said Mayor Paul Soglin.

Ultimately, Soglin said, he’d like to see the city focus on collecting waste from larger producers — such as cafeterias, grocery stores and large employers — and then find a way to work with the Madison Metropolitan Sewerage District to process it.

MMSD process and research engineer Matt Seib said the city reached out in early 2016 looking for ways to bring its curbside collection to the district’s South Side Madison site, which doesn’t currently have the facilities to handle such waste.

“In addition to studying the potential for energy conversion, his research team has been looking at opportunities to create valuable industrial products from a waste stream composed of organic material,” MMSD communications manager Jennifer Sereno said in an email.

Those products could include fertilizer or organic molecules for industrial applications, Seib said, but there are no specific plans for what MMSD could do with Madison’s organics or how soon it could be doing it.

“We remain open to potential future collaboration in this area,” Sereno said.

Johnson said sending Madison’s organics to MMSD “is not a near-term solution considering the way everything is currently performed.” As part of his research, he said he plans to learn more about a private curbside composting service for commercial customers in Milwaukee County.

Jaclyn Lawton, chairwoman of the city’s Solid Waste Advisory Committee, which oversaw the pilot program, said focusing on large producers could make sense.

“Large producers are also more able to comply with the materials to be composted,” she said. “Households, even those volunteering for the pilot, did not have uncontaminated materials as a group.”

Soglin said other factors to consider in large-scale composting are how consistent the compostable material is and how much utilities are willing to pay for the gas it produces.

“The cost of collecting individual household substrate compared to the benefit is astronomical,” he said.

The rules on what could be put in the curbside carts changed as the city changed processors in its pilot program. At one point, pet waste and diapers were compostable, and some participants have not changed what they’ve put into the compost stream as the rules have changed, Johnson said.

Ending the pilot and starting fresh would allow the city to set strict limits in any new program, he said.

Households that participate in the program will get a letter this week announcing its end, and Streets Division crews will collect participants’ black refuse carts as soon as they can after final collections next week, Johnson said.

Nothing should be put in the carts between the final scheduled collection and when the carts themselves are collected next week.

MADISON, Wis. (AP) — Madison is ending its compost collection program because residents were putting too many non-compostable items in their carts and the city can’t afford its own biodigester.

Bryan Johnson, the city’s recycling coordinator, told the Wisconsin State Journal that ending the program will give officials time to study other options for collecting food scraps and other compostable materials.

The program began in 2011 with about 500 participants, he said. It currently has about 1,100 households and 40 businesses involved. Participants will receive a letter about the program's end, and crews will collect refuse carts after the final collection next week, Johnson said.

“We are committed to making this work,” Johnson said. “(But) what we’ve been getting out of the bins really is just incompatible with the processing options.”

Separating non-compostable materials, such as plastic bags and metals, is a labor-intensive and slow process that requires additional water, Johnson said. The digester’s operator, GL Dairy Biogas, charges a $200-per-ton fee to separate debris from compostable material.

The county landfill has a $50-per-ton dumping fee.
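The gap between those two fees is the economic core of the city's problem. A minimal sketch of the comparison, using the per-ton figures quoted above; the annual tonnage is a hypothetical number chosen purely for illustration, not a figure from the article:

```python
# Per-ton fees quoted in the article:
DIGESTER_FEE_PER_TON = 200   # GL Dairy Biogas fee to separate debris from compostables
LANDFILL_FEE_PER_TON = 50    # county landfill dumping fee

def annual_cost(tons_per_year: float, fee_per_ton: float) -> float:
    """Simple per-ton disposal cost model."""
    return tons_per_year * fee_per_ton

tons = 500  # hypothetical yearly tonnage, for illustration only
digester = annual_cost(tons, DIGESTER_FEE_PER_TON)
landfill = annual_cost(tons, LANDFILL_FEE_PER_TON)
print(f"Digester: ${digester:,.0f}  Landfill: ${landfill:,.0f}  "
      f"Premium: {DIGESTER_FEE_PER_TON / LANDFILL_FEE_PER_TON:.0f}x")
```

At any tonnage, contaminated organics cost four times as much to process as they would to landfill, which is why a contaminated stream keeps ending up in the landfill.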

Initial program plans had called for the city building its own biodigester in 2016, but the project never happened, Johnson said.

Constructing a biodigester would cost up to $35 million, which is beyond the city’s ability, said Mayor Paul Soglin.

Soglin said he hopes the city can find ways to work with larger producers before integrating the process into the Madison Metropolitan Sewerage District.

The district’s process and research engineer, Matt Seib, is studying ways to create industrial products from organic material waste, MMSD communications manager Jennifer Sereno said in an email. Products could include fertilizer or organic molecules for industrial applications, she said.


Another Viewpoint :

Madison’s composting pilot program struggles with contamination


Part of Colin Thompson’s job at Batch Bakehouse is hauling garbage and recycling to the curb. For Thompson, this involves an extra layer of sorting.

Each week, Thompson also puts out eight bins full of the bakery’s food waste. “It reduces waste on the whole and increases the overall sustainability of our business and lessens our environmental impact,” Thompson says. “If local gardeners can end up using it, it builds the community.”

Batch is part of a group of 40 businesses and 1,100 residents participating in a pilot composting project that began in 2011. Although the project doesn’t take much effort, it requires vigilance.

Every once in a while, Thompson catches someone tossing compost into a garbage bag. Or, worse, throwing plastic into one of the composting bins. “I usually catch it but I can see the potential for slip-ups like that,” Thompson says.

Those slip-ups are putting a wrench in Madison’s ambitions to join San Francisco, Seattle, Portland and other cities that have a curbside composting collection. Contamination in the pilot project has left the city with no place to send its composting waste, says Madison recycling coordinator Bryan Johnson.

“The most persistent contamination has been plastic, like plastic bags, plastic bottles, and other plastic containers like food service boxes or salad containers,” Johnson says. “On the surface of it, you would think it would be easy to only put food scraps or compostables into a container, but everyone makes mistakes from time to time and a plastic bottle or a plastic utensil winds up in the wrong container.”

Johnson says a small amount of plastic can contaminate a lot of composting. “Organics are routinely shredded prior to composting, so the plastic bottles or plastic bags would also be shredded and spread throughout the material being processed,” Johnson says. “There are screens to get some of the debris out, but tiny flecks of plastic, metal, or glass are very difficult to remove with just screens.”

Initially, the city sent composting to a digester at UW-Oshkosh. After that site stopped accepting the material, the city turned to digesters owned by Gundersen-Lutheran in Middleton and Blue Ribbon Organics in Caledonia — both have since stopped doing business with the city due to persistent contamination.

The companies have to be picky because they’re trying to sell a product to home gardeners. “Marketable compost does not have room for error when it comes to plastic or other debris in the product,” Johnson says.

Although the city’s composting currently has nowhere to go, Johnson is asking residents in the pilot to keep participating. “We are hanging on to the organics until we find a place to go with them,” he says. But, he adds, “In instances where the pile is too foul to keep holding, we have no choice but to send it to the landfill.”

The city could create its own composting site or biodigester, which would be more forgiving since it would not be trying to sell the end product to home gardeners. Instead, its composting could be used for agriculture or construction, which allows for higher levels of contamination.

For years, the city has proposed building its own biodigester, but the project keeps getting postponed because of high costs and tight budgets. A biodigester would cost $12-$18 million to build and require more trucks and workers to collect and process the composting. However, the project would slowly pay for itself, by reducing the amount of waste the city pays to dump at the landfill.

Ald. Marsha Rummel, Common Council president, says several alders support the project, but ultimately the mayor sets the budget. “Recently we were faced with a new police station, fire station and other projects so the biodigester got pushed back by the mayor,” Rummel says. “The council supports the pilot and I believe alders see the benefits of a biodigester.”

The city is looking for possible partners — such as the Madison Metropolitan Sewer District — to help pay for it, Johnson says.

“Exploring partnerships is a way to help share the costs, but it is still difficult to predict when the economics of the situation will align to allow for the continued investment,” he says. “However, when we can, we will.”

Despite the challenges, Johnson sees curbside composting in Madison as inevitable. “The food we toss out is a valuable resource when we manage and process it correctly. I have no doubts we will be there.”

Thompson is hoping the city figures it out soon.

“I’d like to see it expanded and brought into my neighborhood,” he says. “Even on an individual basis it cuts down on a lot of waste.”

Exclusive: U.S. Army forms plan to test 40,000 homes for lead following Reuters report

AUGUST 27, 2018 / 2:21 PM

Joshua Schneyer, Andrea Januta

NEW YORK (Reuters) - The U.S. Army has drafted a plan to test for toxic lead hazards in 40,000 homes on its bases, military documents show, in a sweeping response to a Reuters report that found children at risk of lead poisoning in military housing.

The inspection program, if implemented, would begin quickly and prioritize thousands of Army post homes occupied by small children, who are most vulnerable to lead exposure. Ingesting the heavy metal can stunt brain development and cause lifelong health impacts.

The lead inspections would cost up to $386 million and target pre-1978 homes to identify deteriorating lead-based paint and leaded dust, water or soil, according to the military documents.

A draft Army Execution Order says the program’s mission is to mitigate all identified lead hazards in Army post homes in the United States. In homes where dangers are detected, the Army would offer soldiers’ families “temporary or permanent relocation” to housing safe from lead hazards, it says.

The Army's mobilization comes after Reuters published an investigation on August 16 describing lead paint poisoning hazards in privatized military base homes. It documented at least 1,050 small children who tested high for lead at base clinics in recent years. Their results often weren't being reported to state health authorities as required, Reuters found.

Behind the numbers were injured families, including that of a decorated Army colonel, J. Cale Brown, whose son JC was poisoned by lead while living at Fort Benning, in Georgia.

The article drew a quick response from lawmakers, with eight U.S. senators demanding action to protect military families living in base housing.

Read Reuters' reports exposing the hidden hazards of lead poisoning across America.

The Army’s planned response is laid out in military documents, including the draft Execution Order, minutes from a private meeting attended by top Army brass, and other materials.

One priority, detailed by Under Secretary of the Army Ryan McCarthy in an August 22 meeting, is for the military’s response to counter any sense “that we … are not taking care of children of Soldiers and are not taking appropriate action quickly enough,” meeting minutes say. “The Army will remain focused on the actions to assess, inspect, and mitigate risks to Soldiers and Families,” the minutes say, citing McCarthy and Vice Chief of Staff General James C. McConville.

Army spokeswoman Colonel Kathleen Turner acknowledged plans are being formulated but said no decisions have been made. “Out of an abundance of caution, we are going above and beyond current requirements to ensure the safety of our soldiers and their families who work and live on all of our installations,” Turner said in a statement. “We are currently evaluating all options to address these concerns.”

Old lead-based paint becomes a poisoning hazard when it deteriorates, and poor maintenance of military base homes can leave legions at risk. About 30 percent of service families – including some 100,000 small children – live in U.S. military housing owned and operated by private companies in business with the military.

There are nearly 100,000 homes on U.S. Army bases, and the lead inspections are expected to focus on the approximately 40,000 built before a 1978 U.S. ban on the sale of lead paint.

The plans depart from guidance that appeared on the Army Public Health Center's website as recently as last week, which "discouraged" lead-based paint inspections in Army homes. The website has since been updated and omits that language.

Under the plans, the documents show, the Army would:

- Inspect all pre-1978 Army family housing units nationwide, including visual lead-based paint assessments by certified personnel, swipe-testing for toxic lead paint dust, and testing of tap water. Some homes will also receive soil testing. This phase alone, described as “near term actions,” will cost between $328 million and $386 million, the Army’s Installation Services director estimated.

- Temporarily or permanently relocate families when hazards are found. “If a Family or Soldier are concerned with potential negative impacts from lead; the U.S. Army will offer them a chance to relocate to a new residence,” the documents say. “We must do everything we can to maintain that trust.”

- Conduct town hall meetings on Army posts to address residents’ lead concerns. The Army intends to do so with “empathy,” the meeting minutes say. “Tone is key and can be just as important as the actions we take.”

The documents leave some questions unanswered. They don’t say how long it would take to inspect all 40,000 homes. Also unclear is whether the Army has funds immediately available for the program, or would need Congressional authorization to set them aside.

The Army would ensure that the private contractors who operate base housing “are meeting their obligations” to maintain base homes, the documents say, and would require them to show compliance with lead safety standards through independent audits.

The documents do not discuss whether private housing contractors would bear any of the costs of the lead inspections, or how any repairs would be funded.

In most cases, Army post homes are now majority-owned by private real estate companies. Under their 50-year agreements with the Army, corporate landlords operating military housing agreed to control lead, asbestos, mold, and other toxic risks present in some homes, particularly historic ones.


The Army plans come as base commanders and housing contractors face a wave of complaints about potential home lead hazards, and a rush of military families seeking lead tests for their children.

Last week, the hospital at Fort Benning, where Reuters reported that at least 31 small children had tested high for lead exposure in recent years, began offering “walk-in” lead testing. Some concerned families are already being relocated; in other homes, maintenance workers are using painter’s tape to mark peeling paint spots that residents found contained lead by using store-bought testing kits.

Lead poisoning is preventable, and its prevalence in the United States has declined sharply in recent decades. Still, a 2016 Reuters investigation documented thousands of remaining exposure hotspots, mostly in civilian neighborhoods.

Last week, eight senators, including Republican Johnny Isakson of Georgia and Democrat Claire McCaskill of Missouri, pushed amendments to legislation to examine and address the military's handling of lead exposure risks.

In coming weeks, Army officials plan to meet with lawmakers to address their concerns, the military documents show.

Reuters Investigative Report:

Arizona clean-energy ballot measure can go on November ballot, judge rules

Ryan Randazzo, Arizona Republic Published 6:12 p.m. MT Aug. 27, 2018

A Maricopa County Superior Court judge on Monday allowed a clean-energy initiative to go on the November ballot despite a challenge from opponents who argued the initiative did not gather enough legal signatures.

An opposition group backed by the parent company of Arizona Public Service Co. vowed to appeal the ruling to the Arizona Supreme Court.

The Clean Energy for a Healthy Arizona measure, now called Proposition 127, would require electric companies such as APS and Tucson Electric Power Co. to get half their power from renewable sources like solar and wind by 2030.

"Proposition 127 will give voters a chance to finally utilize Arizona’s greatest asset — its sunshine," said DJ Quinlan, spokesman for the campaign.

The existing state standard, set by the Arizona Corporation Commission, requires those companies to get 15 percent of their power from renewables by 2025, and the utilities are on track to meet that. The goal increases annually and is 8 percent this year.

APS spending millions to defeat measure

The parent company of APS, Pinnacle West Capital Corp., has contributed $11 million to a political action committee combating the initiative.

Arizona Public Service Co. says a state constitutional amendment on renewable energy pushed by Clean Energy for a Healthy Arizona would dangerously overstimulate solar- and wind-power development. (Photo: Mike McPheeters)

“Proposition 127 would fundamentally alter the Arizona Constitution and implement costly new regulations to raise electric rates for Arizona families and businesses," said a response from the opposition group Monday.

"Before we proceed any further down this path, it is only prudent to be certain the initiative has met the bare standards necessary to be on the ballot.”

APS officials say that the easiest way to comply with the standard, should voters approve it, will be to build many new solar power plants, and the cost of those plants will double the average household utility bill.

The company also says that a lot of solar power on the grid will create a glut of power in mild-weather months when the solar plants generate power but customers don't use air conditioning.

That glut would force the company to curtail output at its coal and nuclear plants, forcing them to close ahead of schedule, APS said.

Measure backers: APS 'lying' about ballot proposal

Proponents of Prop. 127 doubt those APS claims.

“APS has been lying about the validity of our signatures for months,” Quinlan said. “If we can’t trust them on this, why should voters trust any of their other profit-motivated attacks against clean energy?”

An opposition group funded by APS called Arizonans for Affordable Electricity has repeatedly challenged the signature-gathering efforts of the ballot measure, and found some of the people registered to collect signatures had felony crime convictions.

Those signature gatherers were inconsequential to the hundreds of thousands of signatures collected, but the opposition group advertised the felony records and even conducted robocalls to warn voters that the people collecting signatures might pose a threat.

The clean-energy committee on July 5 submitted more than 480,000 signatures. It needed only about 226,000 to qualify for the ballot.

A sample review of the signatures by the county recorders found more than 72 percent of the signatures to be valid, or about 328,908 signatures, which is more than enough to qualify for the ballot.

The plaintiffs in the case, led by state Rep. Vince Leach, R-Tucson, subpoenaed 1,180 of the signature collectors to testify at trial the week of Aug. 20, and more than 900 showed up.

However, Judge Daniel Kiley threw out the signatures gathered by 235 of the signature gatherers who did not show up to court, and threw out more from some who showed up but left early, or did not return to court as directed later in the week.

Those actions did not eliminate enough signatures to prevent the measure from making the ballot.

Popular French environment minister quits in blow to Macron

Laurence Frost, Geert De Clercq

PARIS (Reuters) - French Environment Minister Nicolas Hulot resigned on Tuesday in frustration over sluggish progress on climate goals and nuclear energy policy, dealing a major blow to President Emmanuel Macron’s already tarnished green credentials.

Hulot, a former TV presenter and green activist who consistently scored high in opinion polls, quit during a live radio interview following what he called an “accumulation of disappointments”.

“I don’t want to lie to myself any more, or create the illusion that we’re facing up to these challenges,” Hulot said on France Inter. “I have therefore decided to leave the government.”

Hulot was among Macron’s first ministerial appointments following his May 2017 election victory. His inclusion helped to sustain a green image France had earned 18 months earlier by brokering the Paris Agreement to combat global greenhouse gas emissions.

But the centrist president has watered down a series of campaign pledges on the environment, including a commitment to cut the share of nuclear power in French electricity to 50 percent by 2025 and boost renewable energy.

Those policy shifts have been a repeated source of frustration for Hulot. Since a post-election honeymoon period, they have been accompanied by a sharp slide in Macron’s ratings, which touched new lows after his bodyguard was filmed assaulting demonstrators last month.

Hulot said he had not informed Macron before announcing his resignation.

“This may not be the right protocol, but if I had warned them they might have talked me out of it yet again,” Hulot said. His cabinet portfolio also included energy.

Government spokesman Benjamin Griveaux said the cabinet “regretted” his resignation, but also described it as “a blow from which we’ll recover”.

“I don’t understand why he is stepping down when we had many successes in the first year that are to his credit,” Griveaux told BFM Television. “He didn’t win all his battles but that’s the way it goes for ministers.”

Greenpeace France director Jean-Francois Julliard said that while Macron had “made some fine speeches” and stood up to U.S. President Donald Trump on climate change, he had “never turned these words to concrete action” at home.


Shares in power utility EDF, which is on the hook for the cost of decommissioning older nuclear plants, surged 2.7 percent in early trading before settling back to 14.26 euros at 0901 GMT, still up 1 percent on Monday’s close.

Hulot had also voiced disappointment after Macron wavered on a commitment to ban the weedkiller glyphosate, sold under Monsanto’s Roundup trademark, and failed to prevent Total’s La Mede refinery producing biofuel from imported palm oil linked to deforestation.

His announcement came a day after the government promised to relax hunting laws, in a move widely seen as an attempt to bolster Macron’s appeal in rural areas.

Prime Minister Edouard Philippe said he would discuss the “composition of the government” with Macron following Hulot’s departure, leaving open the possibility of other changes.

Macron’s office sought to attribute the resignation to the “frustration, even exhaustion” of a ministerial novice suddenly confronted with the slow-moving machinery of government.

“Political and administrative timeframes aren’t necessarily as one hopes, coming from activism,” an Elysee official said, insisting that Hulot’s deep commitment to environmentalism was evidenced by the government’s policy track record.

In his radio interview, however, Hulot emphasized the inadequacy of “mini-steps” on climate change by France and other nations, voicing hope that his exit might “provoke deep introspection in our society about the reality of the world”.

His doubts about remaining in government had grown over the summer as devastating droughts were met with a tepid political response, he said.

Alain Juppe, a conservative former prime minister and presidential contender, said he was impressed by Hulot’s “high-mindedness and by the nobility of his act”.

“Beyond the inevitable political buzz, I hope this decision encourages us all to think and to change,” he said on Twitter.

Reporting by Laurence Frost and Geert De Clercq; Additional reporting by Yann Le Guernigou, Richard Lough and Michel Rose; Editing by Richard Lough, John Stonestreet and Peter Graff

German e-cars still hampered by lack of charging stations

Sales of electric cars in Europe have just crossed the 1 million mark, as more consumers are persuaded to opt for green mobility. Can the infrastructure to support the transition from fuel-powered vehicles keep up?

Manufacturers of fully electric vehicles (EVs) and plug-in hybrids are delighted; sales of the more climate-friendly cars and vans across Europe have grown 42 percent in the first half of the year.

Industry analysts EV Volumes report that a million EVs are now plying the continent's roads, and that Germany is set to overtake Norway — which until now has led the way in the transition to electric cars — in total sales for the year.

Until now, earlier adopters of battery-powered cars have found it relatively easy to keep their vehicles charged up, as long as they don't travel too far. After all, most charging is done at home, while state subsidies have led to public charging stations springing up at shopping malls and city center car parks.

Although Germany has enough charging points for the current number of EVs on the roads, many industry analysts question whether governments, the private sector, and even apartment block owners will be able to cope with future demand for charging infrastructure.

"If the number (of EVs) increases significantly, problems will arise," warned transport expert and blogger Martin Randelhoff.

He told DW that while workplaces and those living in detached houses can install new charging infrastructure relatively easily, those in multi-story apartment blocks, where most Germans live, will struggle to manage the changeover.

Finding a parking spot on Berlin's crowded inner-city streets can be a nightmare. Building charging stations for most of the cars is an illusion.

Even for those apartments that do have underground car parks, the cost of installing charging points for most or every parking space is likely to be a heavy burden for many residents' associations.

Although government subsidies have been generous until now, many European countries continue to face public sector cuts in the wake of the 2008 financial crisis. So despite ambitious targets to ban the most polluting vehicles in the medium term, consumers cannot rely on receiving similar cash incentives as EVs become mainstream.

Two saving graces include the financial backing by many corporations to build charging stations at their headquarters, and a European Union directive that will ensure that all newly constructed buildings will automatically be EV-ready.

"(Under EU rules), the cost of the charging points in all new commercial and residential buildings, as well as all buildings undergoing major renovations is now expected to be included in the overall budget for such developments," Liana Cipcigan, Co-Director of the Electric Vehicle Centre of Excellence at Cardiff University in Wales, told DW.

Demand for public charging stations is set to become more urgent as more drivers opt for electric vehicles, and many crowded cities will struggle to provide them in the most convenient spots.

"The question is ... whether they are in the right places, and can be supported with the right charging rates to suit driver requirements and behaviors," said John Massey, managing director of the British green energy firm Grey Cells Energy.

Even bigger challenges lie ahead for national power grids, for example in Britain, whose electricity infrastructure may struggle to cope with increased demand from EVs without huge investment.

"An unavoidable fact is that we can't have millions of electric cars all plugging in and charging ... all at the same time (especially during the early evening peak), without having to spend billions on distribution network upgrades," Massey told DW.

He predicts that as more drivers transition to EVs, the demand for electricity will spread more evenly during the day, as more charging is done at work. Premium pricing for home charging at peak times, meanwhile, will also help nudge consumer behavior.

Public EV charging stations present their own challenges, as they will require EVs to come with fast-charging batteries, so they'll be able to service tens of thousands of drivers each day.

High-end brands like Tesla and Porsche build in ultra-fast battery technology, but most electric vehicles can't currently handle the maximum speeds available from fast and ultra-fast charging points.

Last year, Germany installed the second largest number of public charging stations in the world after France — it already has some 13,500 charging points, according to figures from the utility lobby group BDEW. That compares to some 14,300 petrol (gas) stations, with 85,000 pumps.

But it'll be years before EV charging points are as efficient as their fossil fuel equivalents, and therefore as profitable.

Even the planned fast-charging EV service station, being built by Sortimo on the A8 autobahn between Stuttgart and Munich, will only enable 4,000 charging operations daily, even with its impressive 144 charging points.

Assuming an 18-hour (not 24-hour) day, that works out to roughly 1.5 cars charged by each electric "pump" per hour. Compare that to a typical motorway petrol station, which can serve up to 12-15 vehicles every few minutes.
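The throughput estimate above follows directly from the figures quoted for the planned station. A back-of-envelope check, using only numbers from the article:

```python
# Figures quoted for the planned Sortimo fast-charging station on the A8:
charging_points = 144
daily_charges   = 4000
operating_hours = 18   # the comparison assumes an 18-hour, not 24-hour, day

cars_per_point_per_hour = daily_charges / (charging_points * operating_hours)
print(f"{cars_per_point_per_hour:.1f} cars per charging point per hour")
```

The result is roughly 1.5 cars per point per hour, while a single petrol pump can turn over an order of magnitude more vehicles in the same time, which is the gap the article is pointing at.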

"There are significant challenges for batteries to accommodate fast charging but we can see progress coming from brands like BMW, VW and Audi," Cipcigan told DW, noting that Germany's automakers are leading the way in developing a common ultra-fast charging standard.

German retail giant Aldi has inaugurated the first of 28 ultra-fast charging stations along the busy A5 highway this year. The hope is to lure more customers into its shops. But it's uncertain if clever marketing alone will make the investment pay off in the end.

So although some charging stations are capable of delivering 160 kilometers (100 miles) of range in 10 minutes, most EV owners won't benefit from the advanced technology for now.

In other words, there's no alternative but for tens of thousands of charging stations around Europe to continue to sit unused, Massey told DW.

"It almost certainly needs to be a case of the infrastructure leading the vehicles, so that consumers get used to seeing fast chargers wherever they normally stop, and they don't worry about being able to access a charger."

Colorado regulators green-light Xcel’s plan boosting renewables, cutting coal

Critics say plan to retire coal-fired plants early will cost ratepayers millions

Project coordinator Tim Farmer walks past cooling towers for Comanche 3, which will add 750 megawatts of power. Critics cite the plant’s cost and its use of coal, and they question whether, with energy demand dropping, it is even needed.


PUBLISHED: August 27, 2018 at 9:01 pm | UPDATED: August 27, 2018 at 9:03 pm

A plan by Xcel Energy Colorado to boost the share of power it gets from wind and solar and retire a third of its coal generation was green-lighted Monday by state regulators.

The Colorado Public Utilities Commission voted 2-1 in support of what Xcel calls the Colorado Energy Plan, which the company says will cut carbon dioxide emissions by nearly 60 percent, increase renewable energy sources to 55 percent of its mix by 2026 and save customers about $213 million.

As part of the plan, Xcel, Colorado’s largest electric utility, will phase out its Comanche 1 and 2 coal-fired plants in Pueblo about a decade earlier than the original target date of 2035. Xcel says the plan will channel $2.5 billion of investment into eight counties, with the customer savings driven by the declining costs of renewable energy.

The commission is expected to issue a written decision approving the plan in the first week of September.

Utilities commissioners, staff and supporters said the low costs of wind and solar and the improving technology of batteries that store energy from renewable sources present “a rare opportunity” to advance an energy program that will reduce greenhouse gas emissions while promoting a clean-energy economy.

“The Colorado Energy Plan Portfolio is a transformative plan that delivers on our vision of long-term, low-cost clean renewable energy for our customers, stimulating economic development in rural Colorado, and substantially reducing our carbon emissions,” Alice Jackson, Xcel Energy Colorado president, said in a written statement. “We are excited to move forward.”

Xcel’s plan, filed in June, will significantly boost power from renewable energy sources and phase out 660 megawatts of coal power by retiring the two coal-fired units in Pueblo. The utility will add about 1,100 megawatts of wind, 700 megawatts of solar, 275 megawatts of battery storage and 380 megawatts from existing natural gas sources.

One megawatt provides power to 1,100 Colorado homes.
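Applying the article's one-megawatt rule of thumb to the capacity figures above gives a rough sense of scale. This is a loose illustration only: nameplate capacity is not constant output, and the homes-per-megawatt conversion applies only approximately to intermittent sources and storage.

```python
# Capacity figures from the article, converted with its stated
# rule of thumb that one megawatt powers 1,100 Colorado homes.
HOMES_PER_MW = 1100

additions_mw = {
    "wind": 1100,
    "solar": 700,
    "battery storage": 275,
    "existing natural gas": 380,
}
retired_coal_mw = 660  # Comanche 1 and 2

for source, mw in additions_mw.items():
    print(f"{source}: {mw} MW ~ {mw * HOMES_PER_MW:,} homes")
print(f"retired coal: {retired_coal_mw} MW ~ {retired_coal_mw * HOMES_PER_MW:,} homes")
```

By this crude measure, the roughly 2,455 MW of additions comfortably exceed the 660 MW of retired coal capacity, which is the trade the plan makes.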

An alternate plan, which the Colorado Office of Consumer Counsel supported, would have slightly reduced the solar and battery storage capacity. Commissioner Wendy Moser, who supported the alternate, said shutting down the coal plants earlier than planned isn’t “going to be free.”

“Customers will pay higher rates” to cover the costs, said Moser, who does agree with closing the coal plants.

Moser and Commissioner Frances Koncilja also noted that jobs will be lost when the coal plants close. Xcel has said it will try to find other jobs for displaced workers.

The utilities commission staff and others, including a coalition led by the Independence Institute, a libertarian think tank, questioned Xcel’s estimates of savings for customers. Commission adviser Bob Bergman said the staff believes the $213 million projected savings is “likely overstated.”

“Most savings won’t occur until after 2034,” Bergman said.

Amy Oliver Cooke, the Independence Institute’s executive vice president, said the ratepayers’ coalition analysis projected no cost savings until 2046 and up to $500 million in rate increases to pay for Xcel’s plan. She said because wind and solar prices keep dropping, it would make better economic sense to wait to build new facilities until prices drop even further.

In 2016, Xcel sought bids from energy companies as it was developing its electric resource plan. Xcel received more than 400 bids, many of them at historically low prices for wind and solar energy. Bergman called the number of bids and the prices unprecedented. He said it's important to take advantage of the low prices, existing tax credits and environmental benefits.

Western Resource Advocates, a Boulder-based conservation group, said Xcel’s plan will dramatically cut carbon pollution, create hundreds of jobs and invest in Colorado’s rural economy.

“Colorado’s bold decision to invest in clean energy and a healthier future for the next generation shows what the public – and the marketplace – already know, that conservation and clean energy go hand in hand with a growing, healthy economy,” said Jon Goldin-Dubois, president of Western Resource Advocates.

Numerous children have been poisoned by lead in homes approved by D.C. housing inspectors

By Terrence McCoy, Reporter

August 15

Chanelle Mattocks remembers everything about that night in 2014, when lead poisoned her son.

She was giving Alonzo, then 3, a bath in a tub that her landlord had just painted to pass a housing inspection. She turned to find a washcloth, and when she swiveled back, she found the boy with bits of peeling paint in his mouth. She tried to get it out, but it was too late.

The lead tests came back positive: Alonzo had more than double what the government defines as “elevated,” and he hasn’t been the same since.

Between March 2013 and March 2018, at least 41 families discovered that their homes, subsidized by a housing voucher and approved by city inspectors, contained lead contaminants, according to a tabulation requested by The Washington Post through the Freedom of Information Act.

The District Department of Energy and Environment, which performed the count and the testing, said it inspected about half of the homes because a child living at the property, or visiting it often, had tested positive for elevated levels of lead; the other homes were investigated following a tip about possible lead hazards. The agency said that the list wasn’t exhaustive and that there may be more.

The findings again highlight key weaknesses in federal guidelines established by the U.S. Department of Housing and Urban Development, which the District and other cities follow. Many rental properties supported by housing vouchers in the city receive inspections under these standards. But they require only visual inspections for peeling paint and don’t mandate lead testing, unlike states such as Maryland and Rhode Island.

“You cannot detect with any certainty that a house does not contain toxic lead dust without doing a dust test, period,” said Ruth Ann Norton, president of Baltimore’s Green & Healthy Homes Initiative and one of the nation’s foremost experts on lead-poisoning prevention.

Since 2013, the District has subsidized and inspected more than 18,900 properties, all while it tries to meet a crisis in homelessness and affordable housing. In the first seven months of 2018, the D.C. Department of Human Services placed 367 homeless families — nearly three times as many as it did in 2013, according to city statistics.

Rick White, a spokesman for the District Housing Authority, which performs many of the inspections for subsidized properties, said that most of the voucher properties in the tabulation were overseen by the agency. After hazardous lead was found in the homes, some families moved out when their landlords did not abate the contamination. Other landlords cleared the properties of lead hazards and provided documentation to city authorities, and the families stayed. It is the landlords’ responsibility, he said, to ensure that the homes are free of hazardous lead.

“I do not want you, or your newspaper, mistakenly believing or inaccurately reporting that DCHA is not fully meeting its legal obligations,” he said, adding that the city is also reviewing how cities that have made strides in lead remediation, such as Baltimore, conduct their lead inspections. “Rest assured that if federal laws or regulations are amended, then we will adjust our operating practices accordingly. . . . In all cases, DCHA immediately takes appropriate actions against any private property owner where a DCHA inspector identifies peeling paint.”

The fix for peeling paint, however, often includes another coat of paint. But superficial and cosmetic fixes, according to housing advocates, lawyers and tenants, do little to address more significant and underlying issues, such as plumbing problems or leaking roofs, that can cause paint to crack and peel again. And that’s when lead paint, effectively banned in 1978, becomes dangerous.

“Sometimes families chose housing that may not be great because they feel like they don’t have any other options,” said Kathy Zeisel of the Children’s Law Center. “They may believe the coat of paint has resolved the issue, but by the end of the month, the paint is peeling all over again, and the water is coming through the walls.”

It was a problem for Donna Black. She moved into a house on Rittenhouse Street in Northwest Washington with her housing voucher in 2013, while she was pregnant. When she first saw the home, she didn’t feel good about it but didn’t want to seem “choosy.” Plus, the inspectors had said it was okay, so she assumed it was safe.

“That was very false,” she said.

The roof started leaking. The paint started peeling. She gave birth. She named the baby Damion.

A year later, his blood carried twice the amount of lead the government calls elevated, although most advocates and scientists say any trace of lead in a child’s system can lead to diminished cognitive function.

Four years after that, Black is homeless, living in a Holiday Inn Express with Damion, whose needs her life revolves around. “My son is not a normal 3-year-old,” she said.

A lot of days, she’s filled with anger.

“We’re very upset with the city,” she said. “The city is the number one reason why this has happened to my son. . . . They let our family move in there, and it was fixed up to the point where it could look like it was okay, but it really wasn’t.”

Mattocks, too, has trouble understanding how to raise a child who is different from her seven other children. Alonzo, now 7, is always behind in his schooling, and she worries about what sort of life he will have. “I’m worried that, as an African American male, they’re already having so many issues with police brutality and being discriminated against that I’m fearful . . . that this will be another barrier that he’ll have to try to get through.”

Mattocks and Black filed lawsuits against the housing authority and their landlords in District Superior Court in 2016, but the housing authority was dismissed from the cases after arguing that it wasn’t liable, although that decision is being appealed. “There really should be stricter standards to protect the children,” said Alan Mensh, the attorney representing the two.

Scott Muchow, the landlord for Mattocks’s property, declined to address specific questions about Alonzo’s lead poisoning. “In late 2016, I received notice of a lawsuit for lead paint related issues at the property from Ms. Mattocks, but during discovery, Ms. Mattocks chose to voluntarily dismiss the case,” Muchow said in a statement.

The lawsuit against Black’s landlord, Jerome Lindsey, who could not be reached to comment, is pending.

Mattocks and Black said they were less interested in money than a sense of justice. They moved into homes that were supposed to be safe but turned out to be anything but, and now they’re raising children whose needs exceed their means. And no one, they say, wants to take responsibility.

“So who do we hold responsible?” Mattocks said. “We have to hold the city accountable, and the landlords accountable, we have to hold all of these people accountable . . . so that the children we call our future, we take care of these children. . . . But how do we do that if we don’t hold them accountable?”

Terrence McCoy covers poverty, inequality and social justice in urban and rural America. He is the recipient of numerous awards, including the 2016 George Polk Award for stories that showed how companies in an obscure industry made millions of dollars from exploitative deals with the poor and disabled. He has served in the Peace Corps in Cambodia.

St. Louis suburb suspends curbside recycling collection


Katie Pyzyk

Aug. 21, 2018

Officials in the St. Louis, Missouri suburb of Kirkwood have decided to suspend the city's curbside recycling collection program, starting on October 22.

Leaders made the decision because they have been unable to find a company to process mixed recyclables after their current processor, Resource Management, notified the city that it would no longer accept single-stream materials.

The situation is being blamed on unfavorable market conditions brought on by China's regulatory measures for tighter contamination standards, according to the St. Louis Post-Dispatch.

Kirkwood leaders expect that other St. Louis-area municipalities will be forced to follow suit and suspend their curbside recycling programs because of unfavorable market conditions. In another suburb, Hazelwood, Republic Services operated one of its two major processing facilities for the region's recyclables. Paper makes up about one-third of the recycling stream at that facility, and Republic indicates the paper market has all but dried up since China enacted its stricter standards.

A statement on Kirkwood's website says employees are actively seeking alternatives that would allow the curbside program to continue, but so far none have materialized. "The City at this time has been unable to find another company in the region that can process our mixed recyclables," the statement says.

Prior to the China-induced market trouble, Kirkwood was receiving $15 per ton from Resource Management for the single-stream materials, but the hauler recently began charging the city $35 per ton for material dropped off at its facility. That change is projected to cost the city an additional $13,000 per year, though leaders say the curbside program cancellation isn't simply a matter of finances.
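
The article's figures are internally consistent; a quick back-of-the-envelope reading (illustrative only, not a figure Kirkwood reported) shows what the $13,000 projection implies about the city's annual tonnage:

```python
# Back-of-the-envelope check of the reported Kirkwood figures.
# All values come from the article; the implied tonnage is a derived
# illustration, not a number the city published.

old_rate = 15.0    # $/ton the city formerly *received* from Resource Management
new_rate = -35.0   # $/ton the city now effectively earns (it pays $35/ton)

swing_per_ton = old_rate - new_rate          # total change: $50 per ton
extra_cost_per_year = 13_000.0               # city's projected added cost, $/yr

implied_tons = extra_cost_per_year / swing_per_ton  # tons of material per year
```

At a $50-per-ton swing, the projected $13,000 annual cost corresponds to roughly 260 tons of curbside material per year.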

“We have to have a place that will accept the collected mixed recyclables,” said Russ Hawes, Kirkwood’s chief administrative officer, in a statement. “This decision was not based on costs.”

If the city cannot find a processor, it will retrofit the depository where residents used to drop off separated recyclables prior to the curbside program launch in 2011. Residents will be able to bring source-separated paper, glass, aluminum, tin and some plastics.

The city will launch an educational campaign next month to inform residents about these changes and the importance of reducing contamination. The existing curbside recycling carts will be relabeled and used for traditional trash pick-up. Although the city says it is trying to salvage its curbside recycling program, the measures it's taking to move in another direction may signal a point of no return.

“We have to prepare a plan of action if we cannot find a facility to take our mixed recyclables," said Kirkwood Mayor Tim Griffin in a statement. “We are doing everything we can to save our curbside recycling program, in light of the sudden and drastic recycling market changes.”

Democratic National Committee Backtracks On Its Ban Of Fossil Fuel Donations

The move comes just two months after the party adopted a resolution to prohibit oil, gas and coal company contributions.

By Alexander C. Kaufman

Tom Perez, chair of the Democratic National Committee, introduced the new resolution to "support fossil fuel workers."

The Democratic National Committee passed a resolution Friday afternoon that activists say effectively reverses a ban on fossil fuel company donations.

The resolution introduced by DNC Chair Tom Perez states that the party “support[s] fossil fuel workers” and will accept donations from “employers’ political action committees.” It was approved by a 30-2 vote just two months after the committee adopted another resolution prohibiting donations from fossil fuel companies by a unanimous vote.

The new resolution nods to “forward-looking employers” that are “powering America’s all-of-the-above energy economy and moving us towards a future fueled by clean and low-emissions energy technology, from renewables to carbon capture and storage to advanced nuclear technology.”

“I am furious that the DNC would effectively undo a resolution passed just two months ago just as the movement to ban fossil fuel corporate PAC money is growing (and Democrats are winning),” said R.L. Miller, president of the super PAC Climate Hawks Vote, who co-sponsored the original resolution.

DNC spokeswoman Xochitl Hinojosa said in an email that the new resolution was “not a reversal,” noting in a statement after the vote that “any review of our current donations reflects” the Democrats’ “commitment” to turn away the fossil fuel industry. The DNC has not accepted any fossil fuel donations since adopting the ban.

The key difference could be if the new resolution applies only to campaigns ― in which case, it may not annul the original resolution but would “repudiate the spirit” of the earlier one, according to Jerald Lentini, deputy director of the Democratic fundraising group It Starts Today.

“Smart Democrats are very good at splitting hairs and nitpicking,” Miller said. “It’s trying to manufacture distinctions out of whole cloth.”

Party activists ― including Christine Pelosi, the main author of the first resolution and House Minority Leader Nancy Pelosi’s daughter ― hoped the DNC would consider a second proposal this month to stop accepting contributions over $200 from individuals who work for the fossil fuel industry. That, the thinking went, would limit the influence of high-paid executives while remaining open to the working class that Democrats aim to champion. The original resolution, which Perez voted for in June, barred the DNC from accepting contributions from corporate PACs tied to oil, gas and coal companies. But it allowed for the DNC to continue accepting individual donations from workers in those industries.

Instead, the party appears to be backtracking.

The DNC’s new resolution “reaffirms its unwavering and unconditional commitment to the workers, unions and forward-looking employers that power the American economy,” according to the text.

As such, it states that the DNC “will continue to welcome the longstanding and generous contributions of workers, including those in energy and related industries, who organize and donate to Democratic candidates individually or through their unions’ or employers’ political action committees.”

The resolution, proposed as historic wildfires are scorching California, makes no mention of climate change.

Hinojosa said the resolution came in response to “concerns from Labor” that the original fossil fuel donations ban “was an attack on workers.” Just 4.4 percent of workers in the mining sector ― including coal, oil and gas ― are union members, according to the Union Membership and Coverage Database. The International Brotherhood of Electrical Workers, however, remains a strong supporter of building pipelines and donated more than $305,000 to the DNC this year.

To put a fine point on it: this proposal isn't to let union members keep donating to the DNC. It's to let fossil fuel executives keep donating and selling influence among Democrats. Certain unions, including some building trades, see their interests as aligned with those of executives.

The strength of the fossil fuel donations ban seemed in question almost immediately after it passed. The DNC refused to announce the resolution, declining to comment to HuffPost for a story that made the vote public.

At the Texas Democratic Party’s convention two weeks later, a state party official opposed a state-level proposal to ban fossil fuel donations and oppose new gas extraction, arguing that the DNC’s own resolution was not set in stone.

A.J. Durrani, a retired engineer and manager at the oil giant Shell who recently joined the national party committee, said the DNC did not include the earlier vote in the minutes from its last executive committee meeting.

“There was no mention in it,” Durrani said by phone in June. As far as he was concerned, he said, “As of right now, the DNC has not voted.”

Durrani did not immediately respond to a request for comment on Friday.

Texas Democrats ultimately voted down their proposed resolution.

In February 2017, DNC rank-and-file rejected another Pelosi resolution to forbid “registered, federal corporate lobbyists” from serving as “DNC chair-appointed, at-large members” and to reinstate former President Barack Obama’s ban on corporate PAC donations.

Obama halted contributions from PACs and lobbyists in 2008 after winning the party’s presidential nomination. But then-DNC Chair Debbie Wasserman Schultz loosened the restrictions in July 2015 before completely rolling back the ban in February 2016, nine months before that year’s presidential election.

The energy and natural resource sectors, including fossil fuel producers and mining companies, gave $2.6 million to the DNC in 2016, according to data collected by the nonpartisan Center for Responsive Politics. That’s less than 5 percent of the $56.1 million that the finance and real estate sectors ― the DNC’s largest corporate donors ― contributed that year.

Oil and gas companies spent a record $7.6 million on Democratic races in 2016. That’s a pittance compared to the $53.7 million in direct donations to Republicans, who received 88 percent of the industry’s contributions during that election cycle. Republicans have taken in 89 percent of the industry’s donations so far in 2018. That figure rises to 95 percent of the coal sector’s largesse this year.

Coal miner Michael Nelson shakes hands with President Donald Trump as Trump prepares to sign Resolution 38, which nullifies the coal industry’s “stream protection rule.”

But even as fossil fuel companies entrench with Republicans and the Trump administration continues deregulating drilling, mining and emissions, Democrats remain slow to mount a serious challenge to the industry most responsible for anthropogenic global warming.

On the state level, Washington Gov. Jay Inslee (D) failed to pass the nation’s first carbon tax even as his party enjoys complete control over the deep-blue state. New York Gov. Andrew Cuomo (D) has refused to swear off fossil fuel donations, halt new fracked gas infrastructure in the state, or support a widely hailed climate bill despite an aggressive primary challenge from progressive Cynthia Nixon and intense pressure from environmentalists.

On the national level, most bills from Democrats who purport to be Congress’ biggest climate hawks amount to half measures, either exempting major polluters such as the meat industry, directing carbon tax revenues that could be used to mitigate the effects of climate change to lowering taxes, or waiting until 2050 to end fossil fuel use.

There are some bright spots. Last September, Rep. Tulsi Gabbard (D-Hawaii) introduced the Off Fossil Fuels for a Better Future Act ― considered one of the most progressive climate bills yet proposed ― which calls for ending oil, gas and coal use by 2035, cutting all subsidies to drilling, mining and refining companies, and funding programs to help workers transition into new industries.

Alexandria Ocasio-Cortez, the likely next representative for New York’s 14th Congressional District, has vowed to push for a “Green New Deal,” a federal plan to spur “the investment of trillions of dollars and the creation of millions of high-wage jobs.” Other progressive candidates are now rallying around the Green New Deal concept and calling for nascent proposals for a jobs guarantee program to be married to renewable energy targets. That, proponents say, may be the key to wooing unions away from supporting lucrative fossil fuel projects.

NASA launching Advanced Laser to measure Earth's changing ice

Washington DC (SPX) Aug 23, 2018

NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2) spacecraft arrives at the Astrotech Space Operations facility at Vandenberg Air Force Base in California ahead of its scheduled launch on Sept. 15, 2018.

Next month, NASA will launch into space the most advanced laser instrument of its kind, beginning a mission to measure - in unprecedented detail - changes in the heights of Earth's polar ice.

NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2) will measure the average annual elevation change of land ice covering Greenland and Antarctica to within the width of a pencil, capturing 60,000 measurements every second.

"The new observational technologies of ICESat-2 - a top recommendation of the scientific community in NASA's first Earth science decadal survey - will advance our knowledge of how the ice sheets of Greenland and Antarctica contribute to sea level rise," said Michael Freilich, director of the Earth Science Division in NASA's Science Mission Directorate.

ICESat-2 will extend and improve upon NASA's 15-year record of monitoring the change in polar ice heights, which started in 2003 with the first ICESat mission and continued in 2009 with NASA's Operation IceBridge, an airborne research campaign that kept track of the accelerating rate of change.

A Technological Leap

ICESat-2 represents a major technological leap in our ability to measure changes in ice height. Its Advanced Topographic Laser Altimeter System (ATLAS) measures height by timing how long it takes individual light photons to travel from the spacecraft to Earth and back.

"ATLAS required us to develop new technologies to get the measurements needed by scientists to advance the research," said Doug McLennan, ICESat-2 project manager at NASA's Goddard Space Flight Center.

"That meant we had to engineer a satellite instrument that not only will collect incredibly precise data, but also will collect more than 250 times as many height measurements as its predecessor."

ATLAS will fire 10,000 times each second, sending hundreds of trillions of photons to the ground in six beams of green light. The roundtrip of individual laser photons from ICESat-2 to Earth's surface and back is timed to the billionth of a second to precisely measure elevation.
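
The geometry behind that nanosecond timing is simple ranging: the photon's round-trip time fixes the distance from the spacecraft to the surface, and subtracting that range from the orbital altitude gives the surface elevation. The sketch below is purely illustrative (the ~496 km altitude and the example times are assumptions, not mission data or NASA code); it also shows why billionth-of-a-second timing matters, since one nanosecond of timing error corresponds to about 15 centimeters of height.

```python
# Illustrative photon-ranging sketch (not ICESat-2 flight or processing code).
# Assumption: a nominal orbital altitude of ~496 km above the reference surface.

C = 299_792_458.0           # speed of light in vacuum, m/s
ORBIT_ALTITUDE_M = 496_000.0  # assumed nominal ICESat-2 altitude, m

def elevation_from_round_trip(t_seconds: float) -> float:
    """Surface elevation above the reference level, given a photon's
    round-trip (down-and-back) travel time in seconds."""
    one_way_range = C * t_seconds / 2.0   # distance spacecraft -> surface
    return ORBIT_ALTITUDE_M - one_way_range

# Why nanosecond timing: 1 ns of round-trip error maps to ~0.15 m of height.
height_error_per_ns = C * 1e-9 / 2.0
```

A photon returning from a surface exactly at the reference level would take about 3.3 milliseconds for the round trip; shaving even a nanosecond off that time shifts the inferred elevation by roughly the width of a hand.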

With so many photons returning from multiple beams, ICESat-2 will get a much more detailed view of the ice surface than its predecessor, ICESat. In fact, if the two satellites were flown over a football field, ICESat would take only two measurements - one in each end zone - whereas ICESat-2 would collect 130 measurements between each end zone.

As it circles Earth from pole to pole, ICESat-2 will measure ice heights along the same path in the polar regions four times a year, providing seasonal and annual monitoring of ice elevation changes.

Tracking Ice Melt

Hundreds of billions of tons of land ice melt or flow into the oceans annually, contributing to sea level rise worldwide. In recent years, contributions of melt from the ice sheets of Greenland and Antarctica alone have raised global sea level by more than a millimeter a year, accounting for approximately one-third of observed sea level rise, and the rate is increasing.

ICESat-2 data documenting the ongoing height change of ice sheets will help researchers narrow the range of uncertainty in forecasts of future sea level rise and connect those changes to climate drivers.

ICESat-2 also will make the most precise polar-wide measurements to date of sea ice freeboard, which is the height of sea ice above the adjacent sea surface. This measurement is used to determine the thickness and volume of sea ice.

Satellites routinely measure the area covered by sea ice and have observed an Arctic sea ice area decline of about 40 percent since 1980, but precise, region-wide sea ice thickness measurements will improve our understanding of the drivers of sea ice retreat and loss.

Although floating sea ice doesn't change sea level when it melts, its loss has different consequences. The bright Arctic ice cap reflects the Sun's heat back into space. When that ice melts away, the dark water below absorbs that heat. This alters wind and ocean circulation patterns, potentially affecting Earth's global weather and climate.

Beyond the poles, ICESat-2 will measure the height of ocean and land surfaces, including forests. ATLAS is designed to measure both the tops of trees and the ground below, which - combined with existing datasets on forest extent - will help researchers estimate the amount of carbon stored in the world's forests. Researchers also will investigate the height data collected on ocean waves, reservoir levels, and urban areas.

Potential data users have been working with ICESat-2 scientists to connect the mission science to societal needs. For example, ICESat-2 measurements of snow and river heights could help local governments plan for floods and droughts.

Forest height maps, showing tree density and structure, could improve computer models that firefighters use to forecast wildfire behavior. Sea ice thickness measurements could be integrated into forecasts the U.S. Navy issues for navigation and sea ice conditions.

“Because ICESat-2 will provide measurements of unprecedented precision with global coverage, it will yield not only new insight into the polar regions, but also unanticipated findings across the globe,” said Thorsten Markus, an ICESat-2 project scientist at Goddard. “The capacity and opportunity for true exploration is immense.”

Environmental groups fight back against corporate lawsuits

BISMARCK, N.D. (AP) — Twenty environmental and civil liberties groups are fighting back against lawsuits they believe are aimed at limiting free speech and silencing critics.

The “Protect the Protest” task force, announced Tuesday, targets what are known as strategic lawsuits against public participation, or SLAPPs, which use legal action and the threat of financial risk to deter people and groups from speaking out against something they oppose.

“We know from our own experience that this legal bullying tactic will work if it’s not shut down,” said Katie Redford, co-founder and director of EarthRights International.

The effort will include billboard advertisements, training sessions for journalists and nonprofits, panel discussions and rallies outside the corporate offices of companies the groups believe use such lawsuits.

Rallies are planned next week in San Francisco, New York City and Dallas. Dallas is the base for Energy Transfer Partners, the company that built the Dakota Access oil pipeline and sued Greenpeace, Earth First and BankTrack for up to $1 billion for allegedly working to undermine the $3.8 billion project to move North Dakota oil to a shipping point in Illinois.

Greenpeace and the Center for Constitutional Rights, which also is involved in helping defend against that lawsuit, are among the Protect the Protest participants.

Spokeswomen for ETP did not immediately respond to a request for comment Tuesday.

The company’s lawsuit filed a year ago alleges the environmental groups disseminated false and misleading information about the project and interfered with its construction. ETP maintains that the groups’ actions interfered with its business, facilitated crimes and acts of terrorism, incited violence, targeted financial institutions that backed the project, and violated defamation and racketeering laws. The groups maintained the lawsuit was an attack on free speech.

U.S. District Judge Billy Roy Wilson this summer dismissed both BankTrack and Earth First as defendants. He said ETP failed to make a case that Earth First is an entity that can be sued, and that BankTrack’s actions in imploring banks not to fund the pipeline did not amount to radical ecoterrorism.

EarthRights International helped defend BankTrack, assistance that Redford said exemplifies the type of collective effort the task force will bring.

Wilson also ordered ETP to clarify its claims against Greenpeace, and has given that group until Sept. 4 to file its response to ETP’s amended complaint.

Greenpeace USA Executive Director Annie Leonard on Tuesday said a $300 million lawsuit filed against the group by the Canadian timber industry over its forest protection advocacy is another example of the type of lawsuits the task force hopes to battle.

“A healthy democracy is a precondition for a healthy environment, and we can’t have a healthy democracy without informed, engaged public dissent,” she said.

Algae Bloom in Lake Superior Raises Worries on Climate Change, Tourism

"In 19 years of piloting his boat around Lake Superior, Jody Estain had never observed the water change as it has this summer. The lake has been unusually balmy and cloudy, with thick mats of algae blanketing the shoreline.

“I have never seen it that warm,” said Mr. Estain, a former Coast Guard member who guides fishing, cave and kayak tours year-round. “Everybody was talking about it.”

But it was not just recreational observers along the shores of the lake who noticed the changes with concern. Lake Superior, the largest of the Great Lakes with more than 2,700 miles of shoreline, is the latest body of water to come under increased scrutiny by scientists after the appearance this summer of the largest mass of green, oozing algae ever detected on the lake."

Portland tidal energy startup entering rough investment waters

Buoyed by a recent $6 million infusion, Ocean Renewable Power Co. hopes to raise $12 million this year and $18 million later, but venture funds have been favoring wind and solar projects.

By Peter McGuire, Staff Writer

A tugboat maneuvers Ocean Renewable Power Co.'s RivGen Power System into place on the Kvichak River in Alaska. Over a two-year trial, the system provided about a third of the power for a remote village nearby. Courtesy of Ocean Renewable Power Co.

Portland-based alternative energy startup Ocean Renewable Power Co. is stepping into rough financial waters as it tries to secure investments worth tens of millions of dollars to advance its unique marine power projects.

The company wants to raise $12 million in private investment by the end of the year to set up improved versions of its RivGen and TidGen underwater turbines, said founder and CEO Christopher Sauer. Ocean Power, which recently got a $6 million infusion, has developed a sustainable technology that allows underwater turbines to generate electricity from flowing rivers and shifting tides.


An initial funding round will be followed by a push for another $18 million intended to start production for the commercial market and reach profitability in the next four years.

“We are getting closer and closer to commercialization. We have a clear path how to get there,” Sauer said.

But the company is fighting investment headwinds. Despite the promise of tidal energy, big investors are shying away from marine projects in favor of wind turbines and solar panels, cheaper proven technologies with relatively low risk and high yields.

“It is a tough world out there,” said Angus McCrone, chief editor of Bloomberg New Energy Finance. “Most of the big funds want to invest in wind and solar projects, not tech developments.”

Recent memories of well-funded, but ultimately failed, wave energy companies like Scotland-based Pelamis Wave Power make venture funds wary of fronting new marine power attempts, McCrone said.

“It doesn’t mean it is impossible to raise money, but it is more difficult than it was 10 years ago,” he said.


The marine energy sector attracted just $200 million in total new investment worldwide in 2017, a 14 percent drop from the year before. In contrast, investors sank $161 billion into solar and $107 billion into wind last year, according to the 2018 annual renewable energy investment report by the United Nations Environment Program and Bloomberg New Energy Finance.

“From an investor perspective, tidal has been very much a sideshow and is of lower importance,” said Edward Guinness, lead manager for the alternative energy fund at Guinness Atkinson Asset Management, a firm with offices in Los Angeles and London.

The fund manages about $400 million in energy investments, with about $30 million in alternative energy, Guinness said.

“The thing about tidal is that it is a very nice idea, it is very compelling,” he said. “But it has not been providing cost reductions that other technologies have and it is unlikely to do so.”

Marine power ventures tend to be expensive, risky and unproven – prone to escalating costs and damage from saltwater and heavy seas, Guinness said.

“We are not really looking to invest in early-stage tech companies,” he said. “We are looking for reasonably mature investment opportunities.”

Sauer is clear-eyed about the realities of lining up money to move ORPC ahead, but believes the company has the right plan.

It intends to install optimized versions of its turbines on the Kvichak River in Alaska and the St. Lawrence River near Quebec City in Canada – models intended to demonstrate the possibility of power generation in remote communities.

“We think once those systems are up and operating that it is a good break point to raise the rest of the money we need to be cash flow-positive by 2022,” Sauer said.

The 14-year-old company has received millions of dollars in federal and state funding and private investment on its path to making commercially viable hydrokinetic power systems. In 2012, ORPC became the first company in the U.S. to deliver electricity to the power grid from a tidal turbine, in Cobscook Bay, in easternmost Maine.

Since that experiment, the company has branched out. In 2014, it tested a river version of its turbine on the Kvichak River, and over a two-year trial proved it could provide about a third of the power for Igiugig, a remote village nearby.

In June, the Igiugig Village Council and ORPC were awarded $2.3 million from the Department of Energy for further tests of the river turbine. ORPC received a separate $650,000 award for research and development and has pulled in about $3 million from private investors this year, Sauer said.

A focus on remote communities in North America may give ORPC the base it needs to make the leap into commercial markets, Sauer said. Energy costs are typically very high in isolated villages and townships that depend on generators and fuel. Transportable underwater turbines are a cheaper and cleaner alternative and one that may provide local employment opportunities.

“The early adoption market for us are these remote communities because our systems today can save them money,” Sauer said.

If more communities buy into the product, it could help ORPC overcome the production cost barrier and provide a springboard into European and South American markets.

“Once these are operating and have benchmark performance, we are then in a sales mode,” he said.

The company also plans a new tidal project in Eastport, Maine, with the prospect of generating 5 megawatts.

Raising money for marine power technology has always been tough, Sauer said. Most of the private investment his company has received has come from independent family funds focused on “impact” investing – putting money into companies that match investors’ ethical goals.

Recently, however, parties that have been on the sidelines, such as private equity firms and renewable energy funds, have shown more interest in marine power innovations, Sauer said.

“We are making progress,” he said. “I am hopeful that in our next ‘raise’ here we will have all types of investors involved.”


‘Toilet to tap’ water nearly matches bottled H2O in taste test, California university researchers discover

David Downey

PUBLISHED: August 17, 2018 at 8:49 am | UPDATED: August 17, 2018 at 1:19 pm

UC Riverside researcher Daniel Harmon with drinking water samples at the Riverside campus on Tuesday, Aug. 14, 2018. Harmon recently conducted a taste study on the difference between recycled water, tap water and commercial bottled water.

Saddled with the “toilet to tap” label, recycled water still has a bit of an image problem. But in a blind taste test, UC Riverside researchers found that people prefer its flavor over tap water and that they like it as much as bottled water.

Intuitively, that may sound crazy. But it makes sense, suggests UCR’s Daniel Harmon, lead author of a study analyzing the taste test, published recently in the journal Appetite.

“Bottled water and recycled water go through more or less identical purification processes,” Harmon said. Both, experts said, are subjected to reverse osmosis, which removes most contaminants.

The study is encouraging, water officials say, because it comes at a time when Southern California must rely increasingly on recycled water, and not just for turf and crop irrigation. As the planet warms, droughts become more severe and water supplies shrink. It also comes as state officials are expanding the ways agencies can filter recycled water and add it to drinking supplies. UCR’s research may help set the stage for one day piping it directly into drinking-water systems.

UC Riverside researcher and study lead author Daniel Harmon holds up clear plastic cups of drinking-water samples at the Riverside campus on Tuesday, Aug. 14, to demonstrate how his team conducted an experiment to see whether participants — college students between the ages of 18 and 28 — could tell the difference between recycled, tap and commercial bottled water in a blind taste test. (Photo by Watchara Phomicinda, The Press-Enterprise/SCNG)

“It’s inevitable that we’re going to have to use this resource more and more,” said Harmon, a doctoral candidate in developmental psychology at UCR.

Kevin Pearson, a spokesman for Eastern Municipal Water District, which supplies drinking water to more than 800,000 people in Riverside County, termed the results encouraging.

“This goes to show that people are willing to use this as a water source,” Pearson said.

Harmon’s team brought in 143 UCR students, ages 18 to 28, one at a time.

“We wanted to figure out whether people could tell the difference between recycled water, tap water and commercial bottled water,” Harmon said. “They were presented with three clear cups labeled A, B and C. They were completely blind to the source of any water.”

After tasting the water, participants rated the samples on a scale of one to five, with one indicating strong dislike and five strong liking. Harmon said bottled water received the highest score at 3.79, but recycled water nearly matched it at 3.77. The groundwater-based tap water sample scored 3.45.

“We were surprised that the groundwater was less liked,” he said.

Harmon said researchers also evaluated personalities and analyzed whether that factored into preferences. Their conclusion? It did.

They found that people who are open to new experiences tended to like the three samples the same. But people who are more nervous or anxious preferred bottled water, Harmon said.

“What we learned is, purity and freshness is king in water preference,” he said.

Harmon said the taste test was conducted in 2015, but the study was published this year. Researchers are considering a follow-up study, he said, though he declined to hint at what it might entail.

The implications are huge. The state is moving toward more extensive reuse of the waste water that flows through our sewer lines. It’s already an important part of our supply.

Southern California consumes 3.5 million acre-feet of water annually, and of that, 460,000 acre-feet comes from recycled waste water, said Deven Upadhyay, Metropolitan Water District assistant general manager and chief operating officer.

“It’s been on an uptick for years,” he said.

A common measuring unit in the water world, an acre-foot is the volume that would cover an acre one foot deep: about 325,000 gallons, or roughly what three Southern California families use over the course of a year.
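The arithmetic behind those figures is easy to verify; here is a quick sketch (the unit constants are standard U.S. conversions, not from the article):

```python
# An acre-foot is the volume of water covering one acre to a depth of one foot.
ACRE_SQFT = 43_560        # square feet in one acre
GAL_PER_CUFT = 7.48052    # U.S. gallons per cubic foot

acre_foot_gallons = ACRE_SQFT * 1 * GAL_PER_CUFT
print(round(acre_foot_gallons))   # 325851 -- the "about 325,000 gallons" figure

# Share of Southern California's 3.5 million acre-feet that is recycled waste water
recycled_share = 460_000 / 3_500_000
print(f"{recycled_share:.0%}")    # 13%
```

So recycled waste water already supplies roughly one-eighth of the region's annual use.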

Upadhyay said some recycled water irrigates parks, golf courses and sport fields. Some is used by industry. And some is used for drinking, showering and washing dishes.

Orange County residents are already doing the latter — on a large scale.

Orange County Water District, in partnership with the Orange County Sanitation District, just celebrated its 10th anniversary of operating the world’s largest recycled-water plant. It generates 100 million gallons of fresh water daily from waste water, said Denis Bilodeau, district president. And his agency is expanding the plant.

Orange County Water District officials were so confident in the purity of their recycled waste water that they handed out bottles of it in Hollywood in March 2017. (Photo courtesy of Orange County Water District)

The purified sewer water represents 30 percent of all water produced by the water district for residents of northern and central Orange County, Bilodeau said.

It’s injected into the ground.

“It goes through several hundred feet of sand and soil,” where it is subjected to additional filtration, he said. “And eventually it is taken up through drinking water wells.”

A similar plant is coming to Los Angeles County. Upadhyay said Metropolitan is partnering with the Sanitation Districts of Los Angeles County on a demonstration project that will be completed in 2019 and serve as a precursor to a large-scale operation.

Taking the process a step further, the San Diego area is preparing to add recycled water to a reservoir. The State Water Resources Control Board cleared the way for that through a roll-out of regulations in March.

By 2023, the board anticipates unveiling rules that would set the stage for piping recycled water directly into drinking systems.

George Tchobanoglous, a retired UC Davis professor of civil and environmental engineering who recently served on a statewide expert water panel, said it may be 10 years before that level of recycling arrives.

“I think that’s a ways off,” he said.

Tchobanoglous said some issues likely will have to be resolved and officials will have to secure the public’s confidence. It’s unfortunate, he said, that a number of years ago someone popularized the phrase “toilet to tap.”

But, he said, “It is inevitable.”

He said direct piping of recycled water into homes is done in several water-starved places around the globe now. One such example is South Africa.

One can also look to the stars to find an example. At the International Space Station, Bilodeau said, every drop is captured. “Even the perspiration is recycled.”

“This isn’t a science fiction thing,” he said. “It already occurs.”

US Government Security Agencies play tug of war over pipeline protection

Blake Sobczak, Sam Mintz and Peter Behr, E&E News reporters

Energywire: Thursday, August 23, 2018

A natural gas well pad operated by Cabot Oil and Gas Corp. in northern Pennsylvania is pictured. Blake Sobczak/E&E News

Natural gas pipeline companies are being pulled in three different directions as federal agencies mull how to handle new security threats to an increasingly vital resource.

Should the U.S. government bail out competitors to natural gas to ease the power grid's reliance on the fuel, as called for by a leaked plan from the Department of Energy?

Should policymakers preserve the status quo, counting on voluntary cooperation from the sector and a slim staff of specialists to gain a window into pipeline security, as the Department of Homeland Security favors?

Or should U.S. lawmakers consider beefing up gas security oversight and moving it out of DHS's hands, an idea raised in the halls of the Federal Energy Regulatory Commission?

Shared among all three agencies — and the energy firms lobbying them — is a sense that cyberthreats to the gas pipeline networks are only set to rise as companies digitize operations and hackers backed by foreign intelligence services grow more intrusive.

"We now are dealing with nation states," said Dave McCurdy, CEO of the American Gas Association (AGA), at a July 31 cybersecurity conference in New York City. "The government isn't necessarily organized for this 21st-century paradigm ... you've got some challenges with the federal agencies, if you're in industry."

The path chosen will inevitably reverberate in the bulk power grid, which in recent years has grown to rely on natural gas more than any other fuel source for generating electricity.

"There's more concern about what is the impact of what would happen if there is an interruption in the gas supply," McCurdy said.

The gas industry has pushed back against proposals to require baseline security standards for large pipelines and related infrastructure.

"The threat evolves too quickly for a regulation or mandate to be the most effective method of maintaining the highest level of safety," McCurdy said. Beyond opposing mandatory security standards, the gas industry is also opposing any independent assessments of whether its cyber and physical defenses adequately protect its networks.

The DOE plan

To Department of Energy strategists, grid security concerns already justify intervention, through use of the Federal Power Act and a 1950s-era defense statute, to prop up alternatives to natural gas.

In a draft policy memo leaked in June, DOE claimed that pipelines' distributed nature, coupled with menacing online threats to their digital control systems, makes them harder to secure from attack.

DOE has floated propping up economically ailing coal and nuclear plants to pre-empt a future in which the country's power grid relies overwhelmingly on "just-in-time" supplies from gas pipelines. DOE has not yet disclosed how many coal and nuclear plants it would support, how they would be chosen, and what subsidies would cost.

Because coal and nuclear plants have on-site fuel, the thinking goes, they would be more resilient in the face of cyber or physical attacks.

Energy Secretary Rick Perry has played up this argument, laying the rhetorical groundwork for a policy that has pre-emptively drawn fierce opposition from environmentalist and some energy industry quarters.

"Wind and solar are interruptible, and so [are] gas pipelines. The only forms that are not interruptible are coal and nuclear — because they've got fuel on-site," Perry said at a conference in Texas earlier this month (Energywire, Aug. 6).

Perry's critics say his rationale is a policy proposal in search of a security problem. A major attack on U.S. grid infrastructure is as likely to focus on high-voltage transmission systems or local power distribution utilities as on pipelines. If adversaries take down part of the grid, power flow from all generation is halted, experts note.

Nuclear plants are the only "unique" generators from a security standpoint, and are likely to have the most stringent cybersecurity and physical defenses imposed by the Nuclear Regulatory Commission, said Dewan Chowdury, founder and CEO of the cybersecurity company MalCrawler and a consultant to major gas and electric utilities.

The DHS plan

The Transportation Security Administration, better known for its role guarding the nation's airports, is charged with ensuring that vital gas pipelines are adequately protected against various threats.

E&E News reported last year that the DHS agency has assigned six full-time staffers to oversee the more than 300,000 miles of gas transmission lines crisscrossing the nation (Energywire, May 23, 2017). A TSA spokesperson confirmed that the number of full-time employees working on pipeline security remains the same.

"TSA has exercised security responsibilities over pipelines since 2002 and continues to exercise those responsibilities while working in conjunction with industry partners," the agency said in a statement.

The office relies on voluntary cooperation with large pipeline companies and industry groups like the AGA to gain a window into the sector's security practices and defenses.

Earlier this year, TSA published updates to nonbinding pipeline security guidelines that urge companies to lock down their corporate and operational networks from hackers.

But compliance with the guidelines is not enforced, and agency officials have said there is no specific timeline for pipeline firms to complete "enhanced cybersecurity measures" for their most critical facilities.

An E&E News report in 2017 documented TSA's lack of cybersecurity staffing and the absence of any systematic review of gas pipeline cyberdefenses, either by TSA, FERC or the industry itself.

An update of that reporting showed that the lack of oversight and accountability on cyber vulnerabilities has not significantly changed.

Meanwhile, homeland security officials have warned of Russia-linked hackers probing U.S. critical infrastructure networks across the country (Energywire, March 16).

While no cyberattack is known to have disrupted the flow of gas or electricity anywhere in the United States, hackers have interrupted third-party billing and document-sharing services used by large gas and power utilities. Several years ago, hackers thought to be linked to the Chinese government also launched a series of cyber intrusions into gas pipelines' corporate networks, according to law enforcement officials and DHS briefings.

"I do think you need to have this different conversation both with TSA, with DOE, with [FERC], with NERC and with the gas industry about how you get that door protected," said Steven Naumann, vice president of transmission and NERC policy at utility giant Exelon Corp. "I know it's an old adage, but if you're a burglar, you're going to go to the easy target. And if you're spending all this money, all this time, all this effort on protecting the cyber health of the electric system, and then you can attack the gas system, water system, railroad system ... then we're so vulnerable."

Gas industry representatives contend that the nature of the commodity — slowly moving through pipelines, rather than the near-instantaneous path of electrons — intrinsically helps protect gas pipelines from certain threats.

Jennifer O'Shea, AGA vice president for communications, noted that the dispersed, redundant design of the U.S. pipeline system heads off risks from single points of failure carrying major consequences.

She said the industry "actively partners with multiple federal and law enforcement agencies, exchanging threat intelligence with the government through the Downstream Natural Gas Information Sharing and Analysis Center (DNG-ISAC)."

Still, the extent of the gas industry's voluntary compliance with TSA standards isn't clear. E&E News asked AGA and the Interstate Natural Gas Association of America (INGAA) whether the industry has put in place any comprehensive review procedures of how well interstate pipelines are complying with TSA's voluntary guidelines. The query asked how many interstate pipeline companies had received TSA cyber reviews this year, and what the industry's expectations were for the timetable and scope of TSA future reviews of critical pipelines. Neither organization answered these questions.

The FERC plan

FERC has signaled that the independent agency wants to take on a larger role in ensuring the security of the pipelines that feed America's growing fleet of gas-fired power generators, although officials have been careful to avoid stepping on the toes of other entities.

"There should be no aspect of our nation's energy infrastructure that is left unprotected in a cyber sense, by whatever means we need to do that," said Republican FERC Chairman Kevin McIntyre at a June press conference.

He said he had not formulated a personal view on which government agency should be assigned oversight, or whether there should be mandatory standards, but did suggest that FERC could take on more authority in the future.

"I am not speaking for a formal FERC initiative or anything like that, but I wouldn't be terribly surprised to see if we were to move in that direction at some appropriate time; we're just not there now," he said.

His statements were noticed by others, including fellow Republican Commissioner Neil Chatterjee, who had suggested in a joint op-ed with Democratic Commissioner Richard Glick earlier this year that pipeline cybersecurity oversight be moved from the TSA to DOE.

"[McIntyre] wasn't ready to commit to commission action, but he did indicate he could see us taking action in the area," Chatterjee said in a recent interview.

Chatterjee, who called TSA's pipeline oversight office "clearly undermanned," noted that FERC is responsible for grid reliability, which could be endangered by cyberthreats to pipelines.

"FERC has a very serious role to play in this, but by suggesting the [oversight move to] DOE and not at FERC, this isn't just some jurisdictional grab," he said.

The PJM Interconnection, an eastern U.S. grid operator, has called on FERC to insist that all gas pipelines provide more information about operations and vulnerabilities to the power generators that depend on them. "In PJM's view, confidential information sharing should be both uniform and mandatory when the information is identified as needed to enhance the reliability" of grid and gas systems, said the grid operator.

"PJM urges the commission to drive further coordination through the exercise of its authority over both natural gas pipelines and the electric industry," the organization said (Energywire, April 17).

Now the gas industry must hold its breath to see how far FERC will intervene, possibly by requiring independent assessments of pipelines' vulnerability to cyber or physical attacks, or natural disasters, that would cut off fuel supply to essential gas-fired power generators.

FERC Chief of Staff Anthony Pugliese highlighted the possibility of intervention earlier this month when he told an industry audience that FERC's staff was working with DOE, the Department of Defense and the National Security Council "to identify the plants that we think would be absolutely critical to ensuring that not only our military bases, but things like hospitals and other critical infrastructure are able to be maintained, regardless of what natural or man-made disaster might occur."

Pugliese singled out pipelines as a target for state-sponsored cyberattacks. "More and more, you have adversarial countries ... who see pipelines, for example, as an area of great opportunity; let's put it that way."

The INGAA challenged Pugliese's statements about pipeline vulnerability, with INGAA spokeswoman Cathy Landry pointing to a fact sheet on the industry's preparation.

"It appears that Mr. Pugliese could use a refresher on some basic facts and our industry's commitment to this very serious issue," Landry said in a statement.

INGAA also released a statement from its president, Don Santa, who said the DOE plan "represents a solution to a problem that does not exist. If the Energy Department acts, consumers will be saddled with as much as $11.8 billion to pay for the uneconomic coal and nuclear plants.

"That might be justifiable if these facilities increased the reliability of the grid. But they


U.S. Wind Power Is ‘Going All Out’ with Bigger Tech, Falling Prices, Reports Show

Three new government reports detail how the wind industry is expanding — offshore and onshore — and the role corporations, technology and tax credits are playing.


AUG 23, 2018

New technology is allowing for bigger, more efficient wind turbines that make wind energy possible even in areas with lower wind speeds.

Wind power capacity has tripled across the United States in just the last decade as prices have plunged and the technology has become more muscular, the federal government's energy labs report.

Three new reports released Thursday on the state of U.S. wind power show how the industry is expanding onshore with bigger, more powerful turbines that make wind energy possible even in areas with lower wind speeds.

Offshore, the reports describe a wind industry poised for a market breakthrough.

"Right now it's going full bore," said Mark Bolinger, a research scientist at the Lawrence Berkeley National Laboratory and co-author of one of the new reports. "The industry is really going all out."

Some of the key findings:

The country's wind energy capacity has tripled since 2008, reaching 88,973 megawatts by the end of 2017. Wind contributed 6.3 percent of the nation's electricity supply last year.

The average price of wind power sales agreements is now about 2 cents per kilowatt-hour, down from a high of 9 cents in 2009 and low enough to be competitive with natural gas in some areas.

State renewable energy requirements once were the leading contributor to demand for new wind farms, but they were responsible for just 23 percent of new project capacity last year due to rising demand for clean energy from corporate customers, like Google and General Motors, and others.

Offshore wind is going from almost nothing, with just five wind turbines and 30 megawatts of capacity off Rhode Island, to 1,906 megawatts that developers have announced plans to complete by 2023.

"The short story is wind is doing well in the markets, has been doing well, and looks like it will continue to do well," said Michael Webber, deputy director of the energy institute at the University of Texas at Austin, who was not involved with the reports.

"It's despite a lot of these policy shifts that have happened under the Trump administration," he said, referring to proposed rules aimed at boosting fossil fuels. "It's as if the markets have spoken, and they've chosen wind."

Texas is the country's wind energy leader by a long stretch, with 22,599 megawatts of capacity, including 2,035 megawatts newly installed last year, which also led the nation.

Oklahoma was second for total capacity and new capacity last year, with 7,495 megawatts and 851 megawatts, respectively. That's likely to slow, though, after Gov. Mary Fallin signed legislation in April to end a key state tax incentive for wind power three years early.

Nationally, an important driver for onshore wind power expansion has been the federal Production Tax Credit, which is also phasing out. The ticking clock for incentives has led to a surge in projects heading into 2020, the final year when developers can get the full value of the credit.

Onshore Wind Boosted By Bigger Turbines

Turbines at land-based wind farms are getting larger and more efficient, according to the Lawrence Berkeley National Laboratory report. This has pushed average turbine capacity to 2.32 megawatts, up 8 percent from the prior year and up more than 200 percent since the late 1990s.

Rotors are getting larger, blades are getting longer, and turbines are getting taller, with more projects exceeding 500 feet—the level at which federal aviation regulators need to issue a special permit.

Increases in size and other design improvements are allowing turbines to operate at levels closer to their full capacity. Turbines built between 2014 and 2016 had an average "capacity factor" of 42 percent, compared to an average capacity factor of 31.5 percent for projects built from 2004 to 2011. (For a capacity factor of 100 percent, a power plant would be running at full capacity around the clock.)
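Capacity factor is simply the energy a plant actually generates divided by what it would produce running flat out over the same period. A minimal sketch, using the report's 2.32 MW average turbine size with a hypothetical annual output (the 8,536 MWh figure is illustrative, not from the report):

```python
def capacity_factor(mwh_generated: float, nameplate_mw: float, hours: float) -> float:
    """Fraction of the maximum possible output actually produced."""
    return mwh_generated / (nameplate_mw * hours)

# Hypothetical: a 2.32 MW turbine producing 8,536 MWh over a year (8,760 hours)
cf = capacity_factor(8_536, 2.32, 8_760)
print(f"{cf:.1%}")   # 42.0% -- in line with the 2014-2016 fleet average cited above
```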

The U.S. industry has a track record of market cycles that are tied to qualifying for federal tax credits. This is what led to a still-standing record for capacity additions in 2012, followed by a drop-off in 2013.

Current tax credits are likely to lead to another banner year in 2020 and 2021. "After that, it's less certain," Bolinger said.

But the annual ups and downs do not provide a complete picture, because wind farms take years to develop. For instance, looking at 2017 in isolation, new project completions were slightly down for the second year, yet it was the third consecutive year of 7,000 or more megawatts of new capacity, the first time that had happened in the U.S. market.

Offshore Wind Energy on the Rise

The U.S. offshore wind market is gearing up for major growth, with the tiny Block Island Wind Farm in Rhode Island about to get a lot of company, the new report from the National Renewable Energy Laboratory suggests.

Thirteen states have offshore wind projects in some phase of development, or areas that are likely to be open for leasing: California, Connecticut, Delaware, Hawaii, Maine, Maryland, Massachusetts, New Jersey, New York, North Carolina, Ohio, Rhode Island and Virginia.

In Massachusetts, the Vineyard Wind project, expected to be started next year and completed in 2021, stands to become the country's largest offshore wind farm for at least a few years at 800 megawatts.

Massachusetts is one of several states in the Northeast that have passed laws to encourage offshore wind, setting off a competition as states try to attract the companies that will serve the industry.

Offshore wind is an attractive option there, in part because it would be close to major population centers in a way that would be difficult for onshore wind or solar projects in densely populated areas.

But even with projects planned, the United States is far behind the offshore wind leaders. The United Kingdom and Germany have 5,824 and 4,667 megawatts, respectively, followed by China, with 1,823 megawatts.

Smartphones may be used to better predict the weather

Data could be harnessed to forecast flash floods and other natural disasters, Tel Aviv University researchers say


Flash floods occur with little warning. Earlier this year, a flash flood that struck Ellicott City, MD, demolished the main street, swept away parked cars, pummeled buildings and left one man dead.

A recent Tel Aviv University study suggests that weather patterns that lead to flash floods may one day be tracked and anticipated by our smartphones.

"The sensors in our smartphones are constantly monitoring our environment, including gravity, the earth's magnetic field, atmospheric pressure, light levels, humidity, temperatures, sound levels and more," said Prof. Colin Price of TAU's Porter School of the Environment and Earth Sciences, who led the research. "Vital atmospheric data exists today on some 3 to 4 billion smartphones worldwide. This data can improve our ability to accurately forecast the weather and other natural disasters that are taking so many lives every year."

Prof. Price collaborated with TAU master's student Ron Maor and TAU doctoral student Hofit Shachaf for the study, which was published in the Journal of Atmospheric and Solar-Terrestrial Physics.

Smartphones measure raw data, such as atmospheric pressure, temperatures and humidity, to assess atmospheric conditions. To understand how the smartphone sensors work, the researchers placed four smartphones around TAU's expansive campus under controlled conditions and analyzed the data to detect phenomena such as "atmospheric tides," which are similar to ocean tides. They also analyzed data from a UK-based app called WeatherSignal.

"By 2020, there will be more than six billion smartphones in the world," Prof. Price said. "Compare this with the paltry 10,000 official weather stations that exist today. The amount of information we could be using to predict weather patterns, especially those that offer little to no warning, is staggering.

"In Africa, for example, there are millions of phones but only very basic meteorological infrastructure. Analyzing data from 10 phones may be of little use, but analyzing data from millions of phones would be a game changer. Smartphones are getting cheaper, with better quality and more availability to people around the world."

The same smartphones may be used to provide real-time weather alerts through a feedback loop, Prof. Price said. The public can provide atmospheric data to the "cloud" via a smartphone application. This data would then be processed into real-time forecasts and returned to the users with a forecast or a warning to those in danger zones.
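That feedback loop can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not code from the study: phone readings of (latitude, longitude, pressure) are binned into coarse grid cells, and a sharp drop in a cell's median pressure between two snapshots flags that cell for a warning. The grid size and the 3 hPa threshold are invented for the example.

```python
from statistics import median

# Hypothetical sketch of the crowdsourced feedback loop: bin phone
# readings into grid cells, then flag cells whose median pressure
# drops sharply between snapshots. Names and thresholds are invented.

def grid_cell(lat, lon, size_deg=0.1):
    """Bucket a coordinate into a coarse lat/lon grid cell."""
    return (round(lat / size_deg), round(lon / size_deg))

def cell_medians(readings):
    """readings: iterable of (lat, lon, pressure_hPa) -> {cell: median hPa}."""
    cells = {}
    for lat, lon, p in readings:
        cells.setdefault(grid_cell(lat, lon), []).append(p)
    return {c: median(ps) for c, ps in cells.items()}

def alerts(earlier, later, drop_hpa=3.0):
    """Cells whose median pressure fell by more than drop_hpa between snapshots."""
    return [c for c in later if c in earlier and earlier[c] - later[c] > drop_hpa]

# Two snapshots an hour apart: one cell shows a ~5 hPa drop, the other is steady.
earlier = cell_medians([(32.1, 34.8, 1013.0), (32.1, 34.8, 1012.6),
                        (40.0, -74.0, 1016.2)])
later = cell_medians([(32.1, 34.8, 1008.0), (32.1, 34.8, 1007.8),
                      (40.0, -74.0, 1016.0)])
```

The median makes each cell's estimate robust to a few phones in pockets or air-conditioned rooms, which is one reason crowdsourced readings only become useful at scale.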

The study may lead to better monitoring and predictions of hard-to-predict flash floods. "We're observing a global increase in intense rainfall events and downpours, and some of these cause flash floods," Prof. Price said. "The frequency of these intense floods is increasing. We can't prevent these storms from happening, but soon we may be able to use the public's smartphone data to generate better forecasts and give these forecasts back to the public in real time via their phones."

This study was first made available online in April 2018. It will be published in the print edition of Journal of Atmospheric and Solar-Terrestrial Physics in September 2018.

Tiny fern holds big environmental promise



ITHACA, N.Y. - A tiny fern - with each leaf the size of a gnat - may have a global impact by drawing down atmospheric carbon dioxide, fixing nitrogen in agriculture and shooing pesky insects from crops. The fern's full genome has been sequenced by a Cornell University and Boyce Thompson Institute (BTI) scientist and his colleagues around the world, as reported in the journal Nature Plants.

Azolla filiculoides is a water fern often found fertilizing rice paddies in Asia, but its ancestry goes much further back.

"Fifty million years ago, Earth was a much warmer place. Azolla, this fast-growing bloom that once covered the Arctic Circle, pulled in 10 trillion tons of carbon dioxide from our planet's atmosphere, and scientists think it played a key role in transitioning Earth from a hot house to the cool place it is today," said Fay-Wei Li, a plant evolutionary biologist at BTI, adjunct assistant professor of biology at Cornell and the lead author of the work, "Fern Genomes Elucidate Land Plant Evolution and Cyanobacterial Symbioses."

Li and senior author Kathleen M. Pryer of Duke University led a group of more than 40 scientists from around the world to sequence the genome completely. As the group sequenced the genome, it identified a fern-specific gene shown to provide insect resistance.

"In general, insects don't like ferns, and scientists wondered why," said Li, who explained that one of the fern's genes likely transferred from a bacterium. "It's a naturally modified gene, and now that we've found it, it could have huge implications for agriculture."

Nitrogen fixation is the process by which plants use the chemical element as a fertilizer. While plants cannot fix nitrogen by themselves, Li said, the genome reveals a symbiotic relationship with cyanobacteria, a blue-green phylum of bacteria that obtain their energy through photosynthesis and produce oxygen. Special cavities in the Azolla leaf host cyanobacteria to fix nitrogen, while the plant provides sugary fuel for the cyanobacteria.

"With this first genomic data from ferns, science can gain vital intelligence for understanding plant genes," said Li. "We can now research its properties as a sustainable fertilizer and perhaps gather carbon dioxide from the atmosphere."

Ferns are notorious for having large genomes, some as large as 148 gigabases - the equivalent of 148 billion base pairs of DNA. On average, fern genomes run 12 gigabases - one reason why scientists had not sequenced one until now. The Azolla genome, by contrast, is a comparatively compact 0.75 gigabases.

Funding came from the National Science Foundation, the German Research Foundation and the Beijing Genomics Institute. About $22,000 was provided through crowdfunding.

OFFSHORE WIND: Nascent American industry could fully arrive under Trump

Saqib Rahim, E&E News reporter Energywire: Thursday, July 12, 2018

Interior Secretary Ryan Zinke is seen speaking at an offshore wind conference in New Jersey in April. @SecretaryZinke/Twitter

In April, Interior Secretary Ryan Zinke came to an offshore wind conference in Princeton, N.J., to give remarks on "energy dominance."

It was a charged moment to be giving the speech. Three months prior, the Trump administration had proposed to open 90 percent of federal waters to oil and gas leasing. Zinke had offered to exempt Florida, prompting an outcry from liberal Northeast states like New Jersey that wanted to move away from fossil fuels.

What Zinke said, to many raised eyebrows in the audience, was that offshore wind, just like oil and gas, fit into the "energy dominance" framework. He said if a state chose wind, it had a friend in the White House.

One audience member said the tone was "almost apologetic."

"Of the current energy portfolio, probably wind has the greatest opportunity for growth," said Zinke. "Let's make American energy great. Let's make sure we make wind energy great."

One could easily have imagined a different tone. Zinke, after all, worked for an administration that has cast doubt on climate science, rolled back Obama-era regulations on carbon pollution, and worked to subsidize coal and nuclear plants.

Offshore wind, after more than a decade of development under Presidents George W. Bush and Obama, could hardly have been a juicier target. The young industry hadn't yet found a footing in the U.S., and it would need heavy subsidies to get started. For the Northeast states advancing it, most of which had sued over Trump's climate policies, climate wasn't a side issue; it was the point.

But instead of following the same pattern of conflict and lawsuits, offshore wind is on the brink of arrival. A year and a half into the Trump administration, with upheaval all around the U.S. energy world, offshore wind is benefiting from an oddly cooperative dynamic between states and the federal government. The projects envisioned under Obama are moving toward fruition, and if trends hold, the U.S.'s first utility-scale offshore wind project could be under construction by the end of Trump's first term and operational in 2021.

In Maryland and Massachusetts, regulators have approved financing for two installations of 368 and 800 megawatts, respectively. New Jersey and New York are laying the policy groundwork for 5,900 MW more. California, Connecticut, Delaware, Rhode Island and Virginia are exploring projects and policies of their own.

It has happened not despite the U.S. government but with its help, offshore wind advocates admit.

"Yeah, I can't explain it myself," said Nathanael Greene, director of renewable energy policy at the Natural Resources Defense Council. "I think as a whole, this administration has a lot of people that are probably not actually anti-renewable as much as 'all-of-the-above' advocates ... [t]hey don't care about the environmental aspect of it. That's not what's driving them. But this is [potentially] a big American industry."

"I think the administration rightly sees offshore wind as a power source that can help us achieve energy independence and security ... and also dominance in the global economy," said Stephanie McClellan, director of the Special Initiative on Offshore Wind at the University of Delaware. "I don't want to say it's not a surprise, but it's fitting."

Offshore wind fits 'energy dominance'

On Nov. 8, 2016, the day Trump was elected, many friends of offshore wind had their doubts.

During the Obama administration, Catherine Bowes, program director for offshore wind at the National Wildlife Federation, had spent years advocating to open up U.S. waters to the industry. With large-scale wind farms going up off European shores, the Obama administration leased more than a million acres of federal waters as potentially eligible for the same.

Supporters of the industry expected the next step to occur under a Hillary Clinton presidency: proposing, and ultimately building, real projects. Instead, the president-elect was Donald Trump, who had sued to block a wind project visible from a golf course he owned in Scotland. "I want to see the ocean, I do not want to see windmills," he said in 2006.

"Oh, my god, this is game over under the federal administration now," Bowes remembered thinking that night. "We've worked so hard to get to this place, and now we need them."

If Trump had wanted to crush offshore wind, he would have had plenty of options.

Wind turbines do not end up in federal waters by accident; they require a long chain of federal and state approvals. It can take years just to set up the conditions for an offshore wind developer to lease an area — and, once it has, five to seven years before there are windmills in the water.

The Bureau of Ocean Energy Management, which is in the Interior Department, is the lead office in charge of the federal permitting. But throughout the permitting process, many other offices get to touch the ball, including EPA, the Navy and Coast Guard, the Army Corps of Engineers, the Federal Aviation Administration, NOAA, and the Fish and Wildlife Service.

And before building a project, a developer must get federal approval for a host of highly regulated activities, such as measuring wind and water conditions at the site. When the company's ready to build, its plan goes through a full environmental review under the National Environmental Policy Act.

As some in the offshore wind business observe, it would have been easy for the Trump administration to barricade the regulatory process, whether by denying permits or just stalling. That might have "killed the industry in the cradle" by casting doubt over the process, said one lawyer with offshore wind clients.

Instead, it appears the Trump administration has been attracted to offshore wind not for the reason states like it — climate — but because it fits its brand of "energy dominance."

"The assumption was that they weren't for it. My alternative posture back to you would be that I think that they looked at it through the lens of American energy," said Jon Hrobsky, a policy director at Brownstein Hyatt Farber Schreck and former Interior official under the George W. Bush administration. "If you're only willing to look at offshore wind through the lens of climate, then I understand [that assumption], but that's not how this administration appears to be approaching it."

The proof is in the regulatory pudding. In March, the companies behind Bay State Wind, a 400-800 MW installation proposed for Massachusetts waters, said the project was the first-ever offshore wind project to receive "FAST 41" status — a special designation meant to accelerate the federal permitting process. Bay State is a joint project of Denmark-based Ørsted, the largest developer of offshore wind in the world, and U.S. utility Eversource Energy.

And in June, an Interior royalty committee suggested that Interior direct its wind leasing program to deploy at least 20 gigawatts of offshore wind — beyond what's currently on deck — by 2024 (Energywire, June 7).

Inspired by Denmark

It is unclear whether the Trump administration had this position from the start or whether it evolved into it. Either way, it seems that Denmark, which has 1.3 GW of offshore wind, played a part.

According to Zinke's public schedules, in October of last year, he met with the Danish ambassador to the U.S., Lars Gert Lose, and Dong Energy, a Danish company that once focused on oil and gas but has become the largest developer of offshore wind in the world. (Dong, which has since changed its name to Ørsted A/S, has projects under development in Massachusetts, New Jersey and Virginia.) Later that month, Zinke met with Denmark's energy minister, Lars Christian Lilleholt.

In January, Zinke released a plan to open up 90 percent of federal waters to oil and gas leasing. Coastal states revolted, saying Florida had been unfairly exempted and demanding the same exemption. In New York, Gov. Andrew Cuomo (D) said if oil explorers came to its coast, he'd lead a flotilla of boats to block them (Energywire, May 7).

With the controversy brewing, Zinke's counselor for energy policy, Vincent DeVito, traveled to Denmark for a two-day "fact-finding" tour focused on offshore wind. "We exchanged best practices for leasing and flexible permitting," DeVito said in an email to E&E News.

Less than three months later, Zinke was in New Jersey touting offshore wind, not for climate reasons but as a nascent industry that could produce jobs and infrastructure in America. He followed it up with an April op-ed in the Boston Globe that said offshore wind "will play an increasing role in sustaining American energy dominance," and he elaborated further in a June interview with the Washington Examiner.

"When the president said energy dominance, it was made without reference to a type of energy," he said. "It was making sure as a country we are American energy first and that includes offshore wind. There is enormous opportunity, especially off the East Coast, for wind. I am very bullish."

Many offshore wind advocates admit to being pleasantly baffled by this bullishness. But they have ways to make sense of it.

Some note that offshore wind hits the alpha-male notes of Trumpian "dominance," with its need for manufacturing, steel, concrete, ports and power lines.

Others observe that while offshore wind requires subsidies, it's the states, not the federal government, that actually have to produce most of the money.

Dan Kish of the Institute for Energy Research, a think tank with ties to the Koch family, is no advocate for the industry. Still, he said, if states want to raise their own citizens' electricity bills, the Trump administration might not want to get mixed up in it.

"If [states are] choosing to spend more on electricity deliberately as a result of policies, I don't think the administration is necessarily going to stand in their way," he said. "Sometimes you let the child touch the hot stove after repeatedly attempting to get his attention."

Proponents fire back that costs are falling, thanks to projects in Europe. Last year, Dong won an auction in Germany with a project it claimed would require zero subsidy.

Developments like these have encouraged Northeast states that the industry's cost promises are credible; Massachusetts' legislation requires project costs to fall over time (Energywire, Dec. 15, 2016).

The sun rises over Block Island, the United States' first offshore wind farm. Deepwater Wind

Another potential draw for Trump: the connection to Big Oil. As many observed, offshore wind opens up business opportunities for companies that can build infrastructure in deep water. Putting up a turbine means drilling into the ocean floor, laying down a massive foundation and managing logistics on the rolling waters the whole time — core competencies for companies that work in the U.S. Gulf of Mexico.

"There's a lot of big corporations that are interested in offshore wind," said Joe Martens, director of the New York Offshore Wind Alliance. "The same businesspeople that they are logically sympathetic to are in the offshore wind space."

And in some places, it's already making the federal government money. In December 2016, the month after Trump's election, Statoil ASA (which has since changed its name to Equinor ASA) paid the federal government more than $42 million to lease waters off the Long Island coast. The company estimates the site could accommodate a gigawatt of offshore wind.

Royal Dutch Shell PLC, which is a partner in developing a project off the Dutch coast, has expressed interest in New York, as well.

If offshore wind is indeed emerging as a major business interest, there's a telltale sign in Washington: lobbying. Last year, Republican Rep. Andy Harris of Maryland tried to slot language into a budget bill to cut off agency funding for projects proposed to go up within 24 nautical miles of Maryland.

That prompted an opposition letter from the National Ocean Industries Association and the American Wind Energy Association — the lead groups representing the offshore oil and wind industries. The language was cut from the final budget bill.

But as veterans of the offshore wind world know, it's too early to declare success. A single tweet could always throw the industry into disarray. And in the coming years, the industry will be dependent on the federal government in two key areas: opening up more water for development and approving the projects already proposed.

BOEM is currently managing 12 active commercial offshore wind leases. At a congressional hearing last month, James Bennett, chief of BOEM's renewable energy office, said the program can currently fulfill its "primary objectives" — managing these leases and creating new leases — despite proposed budget cuts for BOEM. Down the road, he said, "Some of those activities will not happen as quickly if we don’t have the resources to pursue them."

In a worst-case scenario, one offshore wind consultant said, the administration could take years to evaluate a project's construction proposal, only to reject it.

"Shhh," said the consultant. "Don't jinx it."

Can ‘Vaccines’ for Crops Help Cut Pesticide Use and Boost Yields?

Researchers are using gene editing to develop biodegradable vaccines that protect crops from pathogens. As the world looks to feed more and more people, this and other emerging technologies hold promise for producing more food without using chemical pesticides.


When European researchers recently announced a new technique that could potentially replace chemical pesticides with a natural “vaccine” for crops, it sounded too good to be true. Too good partly because agriculture is complicated, and novel technologies that sound brilliant in the laboratory often fail to deliver in the field. And too good because agriculture’s “Green Revolution” faith in fertilizers, fungicides, herbicides, and other agribusiness inputs has proved largely unshakable up to now, regardless of the effects on public health or the environment.

And yet emerging technologies — including plant vaccines, gene editing (as distinct from genetic engineering), and manipulation of plant-microbe partnerships — hold out the tantalizing promise that it may yet be possible to farm more sustainably and boost yields at the same time. In truth, there’s little choice this time but to deliver on that promise, because of the need to feed 10 billion people later in this century and to do so on a limited land area and in spite of unpredictable climate change.

The idea of vaccines to protect plants from pathogens has kicked around in agribusiness for decades without much result. But it has moved closer to reality in recent years as research has demonstrated that it’s possible to tweak a natural genetic messenger to target specific crop pathogens. Double-stranded RNA (or dsRNA) is a key activator of the immune system in plants and many other organisms. One current line of research uses genetic engineering to modify this natural defense system — resulting, for instance, in a new cotton cultivar with dsRNA targeting plant-sucking insects that have become a pest in China.

Organisms produced by gene editing contain nothing that could not in theory have resulted from natural breeding.

Genetic engineering has, however, faced technological and regulatory limits reducing its ability to respond quickly to an emerging pathogen and it can also be unpopular with the public, especially in food crops. A study published last month in Plant Biotechnology Journal instead proposes a method to take a pathogen from a crop, sequence it in the lab, and quickly produce dsRNA with a long nucleotide sequence specifically targeting that pathogen. Bacterial fermentation chambers could then rapidly produce enough of that targeted dsRNA to spray on crops. Senior author Manfred Heinlein, a plant molecular biologist at the French National Center for Scientific Research, estimates that the process could move from identification of a new pathogen to commercial application of a targeted spray in six months or less.

In contrast to chemical pesticides, Heinlein and his co-authors write, “dsRNA agents are biocompatible and biodegradable compounds that are part of nature and occur ubiquitously inside and outside organisms as well as in food.” Because the targeted dsRNA matches a long nucleotide sequence in the pathogen, it remains effective up to a point even as parts of that sequence in the pathogen mutate. So it’s unlikely to face the evolved resistance that typically renders chemical treatments ineffective.
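That mutation tolerance can be illustrated numerically. In the toy Python model below, a long dsRNA is treated as a pool of 21-nt small RNAs (the size class the plant's silencing machinery produces), and a point mutation in the pathogen knocks out only the windows that overlap it. The sequences and window size are invented for illustration; real RNA silencing is far more complex than exact string matching.

```python
# Toy model (not from the paper): a long dsRNA diced into 21-nt windows
# keeps matching a pathogen even after a point mutation, because the
# mutation only destroys the windows that overlap its position.

def matching_windows(dsrna, target, k=21):
    """Count k-nt windows shared between a dsRNA and a target sequence."""
    windows = {dsrna[i:i + k] for i in range(len(dsrna) - k + 1)}
    hits = {target[i:i + k] for i in range(len(target) - k + 1)}
    return len(windows & hits)

# A made-up 50-nt pathogen sequence and a dsRNA matching it end to end.
pathogen = "AUGGCUACGUUAGCCGUAACGGAUUCCAGGCUAACGGUUACCGAUGCUAA"
dsrna = pathogen

# One point mutation (position 10, U -> C) eliminates only the 11 windows
# overlapping that position; the other 19 still match, so silencing
# degrades gracefully rather than failing outright.
mutated = pathogen[:10] + "C" + pathogen[11:]
```

A single-site chemical inhibitor, by contrast, can be defeated by one well-placed mutation, which is the contrast the authors draw.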

A plant (left) infected with the tobacco mosaic virus, visible in fluorescent markings, compared to a plant (right) treated with a double-stranded RNA "vaccine," which activated its immune system. HEINLEIN ET AL. 2018, PLANT BIOTECHNOLOGY JOURNAL

Heinlein adds a few caveats: It’s not technically a vaccine. Instead of eliciting persistent immunity as a vaccine would do, it works only if the pathogen is present and even then only for the 10 or 20 days it takes for the dsRNA to biodegrade. The research also needs to progress from successful testing in the laboratory against the tobacco mosaic virus to field tests against other common crop pathogens. That includes learning whether dsRNA imposes unexpected stresses on the plant or produces non-target effects — for instance, against beneficial insects. If those tests go well, though, Heinlein speculated that the technology could be replacing chemical pesticides in some applications within as little as five years.

CRISPR gene editing technology, Science magazine’s 2015 Breakthrough of the Year, has already begun to put altered crops in the field, and the pace of introductions is likely to accelerate. It’s a more palatable alternative to genetic engineering, which has faced regulatory limits and a “frankenfood” reputation because it introduces genes of one species into another. Gene editing, by contrast, isn’t transgenic. Instead, using a technology called CRISPR/Cas9, scientists precisely and inexpensively snip minute genetic sequences from the genome of an organism, or add in sequences from another individual of the same species. Such organisms contain nothing that could not in theory have resulted from natural breeding, and they have so far entered the market without regulatory delays.
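For a concrete (and heavily simplified) picture of the "precise snip," the sketch below scans one DNA strand for candidate SpCas9 cut sites: a 20-nt protospacer immediately followed by an NGG PAM motif, the pattern Cas9 requires before cutting. The sequence is invented, and a real guide-design tool would also check the reverse strand and score off-target matches across the genome.

```python
import re

# Simplified, invented example of where SpCas9 can cut: a 20-nt
# protospacer immediately followed by an NGG PAM on the given strand.

def cas9_sites(dna):
    """List (position, protospacer) for each 20-nt site followed by an NGG PAM."""
    # A lookahead keeps overlapping candidate sites visible to the scan.
    return [(m.start(), m.group(1))
            for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna)]

# Toy sequence containing exactly one PAM-adjacent site.
dna = "TT" + "ACGTACGTACGTACGTACGT" + "TGG" + "CC"
```

The PAM requirement is what makes the cut "addressable": the guide RNA supplies the 20-nt address, and editing happens only where that address sits next to the motif.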


Much of the appeal of gene editing has to do with the potential to get plants to do work currently done by pesticides and other costly inputs. Wheat, for instance, typically requires heavy use of fungicides against powdery mildew. But researchers in China have already developed a gene-edited variety resistant to the disease. Gene editing technology is cheap enough that it could also potentially help developing regions that cannot afford, or otherwise lack access to, commercial pesticides, and where genetic engineering is now the other likely way to address diseases that can devastate rice, maize, cassava, and other staple crops.

It may, however, be a more realistic indicator of things to come that the first commercially available gene-edited crop planted in U.S. fields, in 2015, was an oilseed rape modified for herbicide resistance. It’s a reminder that past improvements in agricultural technology have at times served only to bind farmers even more inextricably to chemical inputs — for instance, with the 15-fold increase in use of the herbicide glyphosate brought on by the introduction of genetically-engineered Roundup-ready crops in the mid-1990s.

Scientists have developed a gene-edited wheat variety resistant to powdery mildew, caused by the fungus Blumeria graminis. THOMAS LUMPKIN/CIMMYT

“It’s a curious vision of sustainable agriculture … that sees overcoming resistance to agrochemicals as progress,” commented Maywa Montenegro, a University of California at Berkeley environmental policy researcher. “Should we really be enabling farmers to spray more glyphosate into their fields when the World Health Organization has found the chemical to be a ‘probable’ carcinogen and when it’s been associated with collapsing populations of monarch butterflies?”

One answer to that question came in March from the agribusiness giant (and glyphosate manufacturer) Monsanto. It committed $100 million to found a startup called Pairwise focused, according to a Reuters report, on using gene editing “to alter commodity crops, including corn, soy, wheat, cotton, and canola, exclusively for Monsanto.”

New genomic tools have for the first time allowed detailed analysis of microbial communities living in and around plants — the agricultural counterpart to the human microbiome. That has made the study of these mutualistic microbes a hot topic of academic and commercial research, with the potential to substitute biopesticides and biofertilizers for agrochemical inputs. Researchers have touted microbes that can help various crops pull phosphorus, nitrogen, and other nutrients from the environment; survive droughts, soil salinity, heat, cold, and other forms of stress; resist insect pests and disease; and improve photosynthesis, among other things. But these promising results often prove disappointing in the field, where hundreds of microbial species may interact with one another and a plant in ways scientists are only beginning to understand.

The classic example of a plant-microbe partnership is of course nitrogen fixation by peanuts, chickpeas, and other legumes. These plants house and feed specialized bacteria in nodules on their roots. The bacteria in turn pull nitrogen from the air and turn it into food for the plant. This partnership seems as if it ought to be easy enough to manipulate, and farmers have for decades been spending a little extra cash at planting time to boost the nitrogen-fixing process by inoculating soybeans and other seeds with an extra dose of these beneficial bacteria.

“We want to know if we can breed a plant to basically control the way a fungus forages for a nutrient,” says one scientist.

“It works great the first time you grow soybeans in a field,” says R. Ford Denison, an agricultural ecologist at the University of Minnesota. But after that, for reasons scientists don’t entirely understand, the soil builds up a population of less efficient bacteria. They demand more support from the plant that houses them, and supply less nitrogen in return. Farmers paradoxically end up adding nitrogen fertilizer to plants designed by evolution to provide nitrogen on their own.

But research into the agricultural microbiome is really just beginning and it will inevitably develop more sophisticated ways to put microbes to work in the cause of sustainable agriculture. For instance, in 2003, Toby Kiers, then a post-doctoral student in Denison’s lab, discovered that nitrogen-fixing plants have the ability to detect and starve out inefficient bacteria. She’s now working at Amsterdam’s Free University to develop plants with more nuanced control over their microbial partners.

“We want to know,” says Kiers, “if we can breed a plant to basically control the way a fungus forages for a nutrient. If a plant is deficient in a nutrient, can we align it so it uses the fungi to get what it needs?” The fungi she studies are already completely dependent on the host plant, so “we think we can tweak the partnership to make sure the plant is even more in control.” The eventual goal is a plant-microbe partnership that can essentially farm itself, so that “no matter where you put it, or what the conditions are, that partnership is able to maximize uptake.”

Up to now, all the attention in agriculture has gone to the above-ground portion of crops, and to the pesticides, herbicides, fertilizers, and other inputs farmers could add to make plants grow faster. “And now,” says Kiers, “I think people are realizing that the real revolution is going to happen underground.”

Richard Conniff is a National Magazine Award-winning writer whose articles have appeared in The New York Times, Smithsonian, The Atlantic, National Geographic, and other publications. His latest book is "House of Lost Worlds: Dinosaurs, Dynasties, and the Story of Life on Earth."

Why This Insurer Wants to Put the Spotlight on Growing ‘Ocean Risk’

Chip Cunliffe, director for sustainable development at XL Catlin, a global insurance and reinsurance group, explains why his company is hosting the first-ever Ocean Risk Summit in May and what it hopes to accomplish.


Jessica Leber


April 19, 2018

When a coastal community floods or a drought kills crops, people, businesses and communities lose out – but so can the insurers who cover them. That’s why, for years, the insurance industry has been at the forefront of thinking about the avalanche of new risks that come with a warming climate.

XL Catlin, a leading insurance and reinsurance group that suffered a net loss in 2017 after a year of “severe natural catastrophes,” believes more attention also needs to be paid to how the ocean is changing – and how communities can build resilience to warming, acidifying or overfished waters. The company is hosting the first Ocean Risk Summit in Bermuda, where it’s headquartered, on May 8–10, along with several scientific and Bermuda-based organizational partners.

To Chip Cunliffe, director for sustainable development for XL Catlin, the conference “is very much a first” in its effort to engage the insurance community and broader business, finance and government sectors in these issues. While the company has a history of taking part in ocean research – in the past decade, it has sponsored major scientific surveys of global coral reef health, Arctic sea ice and the deep ocean and is a member of the Bermuda Institute of Ocean Science’s Risk Prediction Initiative – increasingly, it is also focusing on oceans as an industry imperative, Cunliffe said.

For instance, at a conference in Mexico in March, XL Catlin executives backed efforts to write the first-ever insurance policy for a coral reef – in this case, the Mesoamerican Reef that protects the Yucatan peninsula and its $9 billion tourist industry from hurricanes.

Oceans Deeply spoke with Cunliffe about how XL Catlin thinks about ocean risk and what it hopes to achieve at the summit.

Oceans Deeply: What does XL Catlin mean by ‘ocean risk’?

Chip Cunliffe: The oceans are changing very rapidly. We are interested in understanding how the changes in the oceans are impacting the future risk landscape.

What we’re finding is that it means quite a few things. We’re looking at sea-level rise, because as the ocean warms, water expands, but you also have a corresponding likelihood of increased storm intensity. Looking at storm intensity over the last couple of years, you’ve seen it increase. You take those two things together and you’re likely to see greater impacts from storm surges, so [there is] coastal flooding and inundation, and damage to coastal infrastructure. But not only that, you’re also likely to see impacts to human health.

We have illegal and unreported fishing and overfishing of fish stocks around the world. But we’re also seeing a fairly big problem in that the fish are moving poleward because of the warming ocean. If those fish stocks are moving, what does that mean for the fishing community and what does that mean for food security? Another area is aquaculture: the potential impact a warming ocean might have, and whether the areas currently used for aquaculture might need to move.

We’re also seeing Arctic sea-ice loss, and there may be some weather system changes related to that. We’re also seeing militarization of the Arctic.

Oceans Deeply: There’s clearly a lot going on.

Cunliffe: From a risk perspective, there’s a lot of stuff going on. These risks are far too big for just one business to get a hold of. So [we’re looking at] identifying ways in which the insurance industry could potentially help identify solutions to mitigate some of those risks. It’s also looking at how to build resilience and identify where resilience needs to be built.

Oceans Deeply: What do you hope to achieve with this conference?

Cunliffe: Number one is to put ocean risk up the agenda. Oceans are really only just coming into the political consciousness, if I may say that, given that it’s taken 25 years or so for the Intergovernmental Panel on Climate Change to add it to its AR5 [Fifth Assessment] report on climate change – and [there’s] just one chapter on it.

Partly, it’s about communicating the fact that the oceans are changing and we need to be cognizant of that. Ultimately, the big thing for this conference is to identify solutions. But it can’t be done just through insurance. It can’t be done just by policy. We need a multi-sectoral approach to mitigating these risks.

That obviously comes across from a financial point of view: Where might funds come from to help build that resilience? We certainly could focus in on the legal aspects. We also need to be focusing on regulation and technology and data. Everyone says we all need more data, and I think that is particularly true in the ocean space.

Oceans Deeply: Has XL Catlin quantified the future costs of ocean risk or even seen its effects in the claims you’re seeing today?

Cunliffe: As a business, and probably more globally, no. Ultimately this is the first conference of its kind looking at these kinds of risks. Out of the back end of this we want to move that area of focus forward.

Oceans Deeply: Are ocean issues viewed not just as a potential cost but also as a future business opportunity?

Cunliffe: At the moment, and up until now, it’s certainly been a focus for us from a corporate responsibility perspective. That, for us, is really key. In fact, it started out as a sponsorship opportunity but it quickly became obvious to us that it wasn’t just about branding and, in fact, that became fairly secondary to really being a good corporate citizen.

The ocean risk perspective has come from the fact that we have done all these different expeditions, and we’ve focused on this area for quite a long time. As we’ve been listening to the likes of the United Nations and the IPCC report and others … there’s a whole lot more communication now about risk and resilience. And so it makes sense for us to work with others to identify how we could potentially help mitigate those risks.

Oceans Deeply: Anything else you want to mention?

Cunliffe: This is quite a new narrative, really. It’s an area of science and risk that is not particularly well known. It seems right that we should be doing this now, when the oceans are coming more front and center in the national and international news, related to plastics and fishing and other things. Things like the BBC’s “Blue Planet” and other efforts to communicate. We’re all running in the same direction and all helping each other along.

Fewer than 1 percent of offshore drilling tracts auctioned by Trump receive bids

BY MIRANDA GREEN - 08/15/18 02:27 PM EDT

Oil and gas companies bid on fewer than one percent of the offshore tracts made available by the Trump administration during an auction Wednesday.

Of the 14,622 tracts made available by the Interior Department for bidding in federal waters in the Gulf of Mexico, only 144 received bids, covering 801,288 acres.

The percentage of tracts bid on was slightly lower than in the previous lease sale in March, when just over 1 percent of the available tracts were leased.

The March sale was the biggest offshore lease sale in United States history, with 77.3 million acres made available. It saw 33 companies bid $124.8 million on plots off the coasts of Texas, Louisiana, Mississippi, Alabama and Florida. Of the 14,474 tracts available for bidding, only 148 received any bids.

The Obama administration held much smaller sales that focused only on areas where oil companies had expressed interest.

The administration Wednesday nevertheless hailed the latest sale as a success, promoting in a press release the nearly $180 million in sales it generated.

"Today’s lease sale is yet another step our nation has taken to achieve economic security and energy dominance,” Interior Deputy Secretary Bernhardt said in a statement. “The results from the lease sale will help secure well-paying offshore jobs for rig and platform workers, support staff onshore, and related industry jobs, while generating much-needed revenue to fund everything from conservation to infrastructure.”

Despite the revenue, the limited interest is a considerable blow to an administration that has been preaching the need to expand oil and gas drilling on public lands and waters in order to increase U.S. energy independence and revenue. Fewer than 30 companies submitted bids.

In January, Interior Secretary Ryan Zinke announced a desire to expand offshore drilling to new areas off the coast. The call met resistance from a number of state leaders opposed to opening up drilling off their coastlines. Zinke has yet to formally announce where new drilling might occur, but has acknowledged that oil and gas companies have more interest in plots located on land than offshore.

Critics say the bids show a lack of interest by fossil fuel companies in investing in offshore drilling, which is expensive, risky and not always profitable.

Environmental groups have also jumped on the lackluster results as proof that drilling is overall not desirable.

"When will Ryan Zinke finally get the message that it’s time to scrap his reckless offshore drilling plans? Millions of Americans and elected officials from both sides of the aisle have made it clear that the public does not want dangerous drilling off our coasts, and even corporate polluters aren’t buying what Zinke’s selling," Athan Manuel, director of the Sierra Club’s land protection program, said in a statement.

Electric Bills Skyrocket Due to Costs of Coal in Appalachia as Region’s Economy Collapses

As natural gas and renewables get cheaper elsewhere, residents in Appalachia are stuck paying for coal-fired power plants that no longer make economic sense.

By James Bruggers, Aug 14, 2018

WHITESBURG, Kentucky — Buddy Sexton opened a community meeting about the financial troubles of local volunteer fire departments with his head bowed: "Let people understand what a need we have in this county, in the mighty name of Jesus, we pray."

Sexton then delivered some grim news to the gathering of about two dozen eastern Kentucky residents sitting in folding chairs set up in the engine bays of the Mayking Fire Department.

Just like the townspeople, his tiny department is facing crushing electric bills. They have gone up by several hundred dollars a month during the winter. That's a lot in a local budget that is already strained as revenues from taxes on mining go steadily down.

"I've got financial reports here that I am afraid to look at anymore," said Sexton, the department's treasurer. "We're in danger of shutting down."

As coal mining has collapsed across Appalachia, residents in eastern Kentucky and West Virginia have been socked with a double whammy—crippling electric bills to go along with a declining economy.

The pain in a region once known for cheap power has been felt in homes, businesses, schools and even at volunteer fire departments.

The problem of rising power bills has many causes.

Mines and other businesses have shut down and people have moved away—mining jobs are off 70 percent since 2010—so utilities are selling less power and spreading their fixed costs across fewer customers.

Electricity customers are shouldering the costs of shuttering old coal-burning power plants and cleaning up the toxic messes they leave behind.

And experts say it hasn't helped that utilities in eastern Kentucky and West Virginia have continued to invest in burning coal.

"They're doubling down on coal at a time when coal is not competitive," said James M. Van Nostrand, a professor at the West Virginia University College of Law with decades of experience in the energy field. "It's really tragic."

AEP's Rising Rates and Surcharges

The common denominator is American Electric Power, one of the nation's largest utilities. It owns Kentucky Power, along with two subsidiaries in neighboring West Virginia, Wheeling Power and Appalachian Power.

In May, Wheeling Power and Appalachian Power requested permission from the Public Service Commission of West Virginia to boost monthly residential bills by 11 percent because of declining sales. That was on top of a 29 percent increase between 2014 and 2018.

Customers in both states are furious that the regulators are going along.

"Our jobs available in this state are not a living wage, and many are working two, three jobs just to make it," wrote Elizabeth Bland of Beckley, West Virginia, in her protest letter to the commission. "Please turn down this request from Appalachian Power for the sake of all West Virginians."

Rising rates are just part of the problem.

Kentucky Power's monthly bill also includes surcharges, and a line for each customer's share of the utility's fixed costs. Those charges add up, in dollars that are precious here.

Kentucky Power's customers received a bit of a break in January after vigorous protests. State regulators rejected most of the utility's proposed rate increases and they reduced some surcharges. That will save an average residential customer about $7 a month. Even so, that bill has nearly doubled in the past decade and is now about $151 plus local government fees and taxes, state regulators said.

The average bill per customer at Kentucky Power has been among the highest in the nation for an investor-owned utility, according to 2016 numbers from the U.S. Energy Information Administration, the most recent comparisons available.

"We're hit hard," Alice Craft, a Whitesburg-area resident, told InsideClimate News. "The power companies, they are just greedy, greedy, greedy."

Surcharges that pay for retiring a coal plant are especially irritating, said state Rep. Angie Hatton, a lawyer and Democrat from Whitesburg.

"They are charging us for shutting down our coal-fired plants that were keeping us all employed," she said.

Hatton, who is the daughter of one coal miner and wife of another, also sees the value of solar power as an option for businesses and residents with high power bills. She helped defeat legislation in Kentucky this spring that would have devalued rooftop solar power. Utilities backed it, but it was rejected by lawmakers driven by constituents' rage about power bills.

With Poverty High, Something Has to Give

Allison Barker, a spokeswoman for Kentucky Power, blamed last winter's unusually cold weather for part of the high electric bills and suggested customers could turn their thermostats down.

"Keeping the thermostat the same during extreme weather can significantly increase usage and bills," she said.

But the underlying problem, she acknowledged, was the loss of hundreds of industrial and commercial customers and thousands of residential customers since 2014. During that time, she said, energy use has dropped 15 percent, and the company has faced new costs to comply with environmental protections.

AEP's West Virginia utility, Appalachian Power, said in May that its electricity demand had dropped 14 percent since 2013 and it had lost 11,000 customers.

Poverty also comes into play.

Kentucky Power's service area includes 10 of the 100 poorest counties in the United States, with poverty rates as high as three times the national average. The region has a lot of substandard housing with poor insulation, adding to the expense of electric heat.

Something has to give when people on small, fixed incomes are hit with higher power bills, said Roger McCann, executive director of Community Action Kentucky, an umbrella group for poverty-fighting agencies with services including home-energy assistance.

"Seniors will cut their medicine in half, or they will go without food," he said. "If you don't pay the bill, the lights go out."

McCann said he knew of one woman in a mobile home who had a $600 monthly electric bill while living on a Social Security benefit of just $660 a month.

"Some mobile homes are not insulated and the heat just goes through them," he said.

Shifting Coal Costs to Consumers

Kentucky State Rep. Rocky Adkins, the Democratic minority floor leader and a potential candidate for governor, says high bills and the region's economic woes both stem from a 2012 decision by Kentucky Power, made as Adkins and others were trying to salvage the utility's coal-burning operations.

Together, AEP and the Kentucky Public Service Commission reached a decision to pull the plug on nearly $1 billion in environmental upgrades at the utility's Big Sandy power plant. It stopped burning coal three years later. One unit was converted to natural gas.

Meanwhile the Kentucky regulators steered the company to invest more than $500 million in the nearby coal-fired Mitchell plant, which had already been upgraded. That solution, the commission argued, would save customers money.

At the time, Adkins railed that this was waving the "white flag" of surrender to natural gas, environmentalists and regulators.

But the rout was already under way.

In the years that followed, the Mitchell plant's coal boilers have been undercut on the regional grid by cheaper natural gas. Kentucky Power found itself with an unneeded mountain of 400,000 tons of coal there. In June, Kentucky regulators gave the utility permission to sell the surplus, raising $17.6 million.

"We lost the power plant and the rates have gone through the roof," Adkins told ICN. "It has happened in a region of Kentucky that is more challenged than any other region of the country because of the downturn in the coal economy."

But there's another way of looking at the outcome, said Cathy Kunkel, a West Virginia-based analyst for the Institute for Energy Economics and Financial Analysis.

She explained how the maneuver supported coal-fired power while shifting its costs from shareholders to consumers.

There are two main business models for selling electricity. In deregulated wholesale markets, merchant vendors compete to sell power to the grid, and high-cost generators lose out. In regulated markets, the rates are set by the state and utilities are guaranteed a profit.

Before 2012, the Mitchell plant had been owned by a deregulated AEP company which sold its power in the open marketplace. The 2012 deal put half of Mitchell under a regulated AEP subsidiary, and that gave it economic protection, at least for a time. Two years later, West Virginia regulators allowed Wheeling Power, also regulated, to buy the other half.

Acting together, Kunkel said, AEP and its regulators had shielded the struggling coal-fired plant from the pressures of the marketplace and shifted the costs onto customers. "The problem when the plant is uncompetitive is that it ends up not running very much, and not producing much revenue for customers, who end up losing out because they're still stuck paying for all of the fixed costs and operating costs of the plant," she said.

It's a method the company has used elsewhere.

"Those plants are not competitive in the wholesale market," said Van Nostrand, the law professor. "We are buying them and sticking them on the backs of the ratepayers," he said.

A spokesman for the Kentucky regulatory agency defended its 2012 actions, saying that exhaustive analysis showed that it diversified AEP's fuel mix and avoided an even more costly debacle at the Big Sandy plant.

A spokesman for West Virginia's regulators provided ICN a recent staff report to lawmakers. The state's electric prices are still competitive compared to elsewhere, it asserted, but they are rising, "because of our nearly 100 percent reliance on coal."

'These People Have Been Sold a Bill of Goods'

The solution, Kunkel said, should include more federal help for an economic transition away from coal.

"If you were to do that in a smart way it would include money for energy efficiency and money for rooftop solar," she said.

Some eastern Kentucky residents blame coal's slide on environmental regulations in general, and former President Barack Obama in particular. Van Nostrand said that's just not the case.

"These people have been sold a bill of goods for the last 10 years that everything in the coal industry would have been fine if the (U.S. Environmental Protection Agency) had just left us alone," Van Nostrand said. "It's been cheap natural gas and it's been cheap renewables." The EPA is further down on the list of the decline in the coal industry, he added.

At the Mayking Fire Department community meeting, residents were told that coal severance taxes—payments to their county from taxes collected on coal as it leaves the ground—had fallen from about $2 million a year to about $180,000 a year, and that the county budget had been slashed from $11 million to $6 million—all in the last five years. That left little money to help keep local volunteer fire departments afloat.

Wintertime power bills at the station just outside Whitesburg have risen from as low as about $800 a month to about $1,400 a month in recent years, Fire Chief Tony Fugate told ICN.

By the time the Mayking department, with its approximately $25,000-a-year budget, pays its bills for insurance and equipment, "there's nothing left for the power bill, unless you have fundraisers," he said.

But cookouts and other fundraisers, even asking drivers at intersections to drop money in a fireman's boot, can only go so far, he said.

"You can only have so many roadblocks and hot dog dinners," Fugate said.

James Bruggers covers the U.S. Southeast, part of ICN's National Environment Reporting Network. He came to InsideClimate News in May 2018 from Louisville's Courier Journal, where he covered energy and the environment for more than 18 years. He has also worked as a correspondent for USA Today and was a member of the USA Today Network environment team. Before moving to Kentucky, Bruggers worked as a journalist in Montana, Alaska, Washington and California, covering a variety of issues including the environment. Bruggers' work has won numerous recognitions, including the National Press Foundation's Thomas Stokes Award for energy reporting. He served on the board of directors of the Society of Environmental Journalists for 13 years, including two years as president. He lives in Louisville with his wife, Christine Bruggers, and their cat, Lucy.

James can be reached at

How Energy Companies and Allies Are Turning the Law Against Protesters

In at least 31 states, lawmakers and governors have introduced bills and orders since Standing Rock that target protests, particularly opposition to pipelines.

By Nicholas Kusnetz

AUG 22, 2018

A version of this ICN story was published in the Washington Post.

The activists were ready for a fight. An oil pipeline was slated to cross tribal lands in eastern Oklahoma, and Native American leaders would resist. The Sierra Club and Black Lives Matter pledged support.

The groups announced their plans at a press conference in January 2017 at the State Capitol. Ashley McCray, a member of a local Shawnee tribe, stood in front of a blue "Water is Life" banner, her hair tied back with an ornate clip, and told reporters that organizers were forming a coalition to protect native lands.

They would establish a rural encampment, like the one that had drawn thousands of people to Standing Rock in North Dakota the previous year to resist the Dakota Access Pipeline.

The following week, an Oklahoma state lawmaker introduced a bill to stiffen penalties for interfering with pipelines and other "critical infrastructure." It would impose punishments of up to 10 years in prison and $100,000 in fines—and up to $1 million in penalties for any organization "found to be a conspirator" in violating the new law. Republican Rep. Scott Biggs, the bill's sponsor, said he was responding to those same Dakota Access Pipeline protests.

The activists established the camp in March, and within weeks the federal Department of Homeland Security and state law enforcement wrote a field analysis identifying "environmental rights extremists" as the top domestic terrorist threat to the Diamond Pipeline, planned to run from Oklahoma to Tennessee. The analysis said protesters could spark "criminal trespassing events resulting in violence." It told authorities to watch for people dressed in black.

An FBI team arrived to train local police on how to handle the protest camp.

McCray recalls a surveillance plane and helicopters whirring above the Oka Lawa camp. Demonstrators were pulled over and questioned on their way in or out, though the local sheriff said people were only pulled over for violating traffic laws.

In May the governor signed the bill to protect critical infrastructure. Merely stepping onto a pipeline easement suddenly risked as much as a year in prison.

"That was really pretty successful in thwarting a lot of our efforts to continue any activism after that," McCray said.

Oka Lawa never drew the kind of participation and attention that made the Dakota Access Pipeline a national cause, and the Diamond Pipeline was completed quietly later that year.

McCray has since channeled her energy toward politics, running for a seat on the Oklahoma Corporation Commission, which regulates the energy industry. But even as a candidate, McCray says, she now watches her words and her Facebook posts, afraid of being implicated as a conspirator if someone were to violate the law, even if she doesn't know the person. "I don't feel safe, honestly," she said.

Ashley McCray says Oklahoma's new law has made protesters think carefully about what they have the freedom to say.

Across the country, activists like McCray are feeling increasingly under assault as energy companies and their allies in government have tried to turn the law—and law enforcement—against them.

In Louisiana, which enacted a similar law in May, at least nine activists have been arrested on felony charges stemming from the new law since it went into effect on Aug. 1. In one incident, three people were pulled off a canoe and kayak after they maneuvered the boats on a bayou to protest construction of an oil pipeline. The arrests were conducted by off-duty officers with the state Department of Public Safety and Corrections who were armed and in uniform, but at the time were working for a private security firm hired by the pipeline developer.

Dozens of bills and executive orders have been introduced in at least 31 states since January 2017 that aim to restrict high-profile protests that have ramped up as environmentalists focus on blocking fossil fuel projects.

In addition to Oklahoma's infrastructure bill and similar legislation enacted in two other states, these bills would expand definitions of rioting and terrorism, and even increase penalties for blocking traffic. Twelve have been enacted, according to the International Center for Not-for-Profit Law. The bills have all come since the election of President Donald Trump, who openly suggested violence as a way to handle protesters on the campaign trail, and once in office called the nation's leading news organizations "the enemy of the American people."

At the same time, law enforcement and private companies have conducted surveillance on campaigners, while some federal and state officials have suggested pipeline protesters who break laws be charged as terrorists. Corporations have hit landowners and environmental groups with restraining orders and hundred-million-dollar lawsuits.

Some pipeline opponents have conducted dangerous and illegal stunts, cutting pipelines with oxyacetylene torches or closing valves. But most protests have been peaceful. If they've broken laws by trespassing, activists say, they've done so as part of a tradition of civil disobedience that stretches to the nation's colonial roots.

"All of the social progress we've made has depended, over the entire history of this nation, from the very beginning, on that ability to speak out against things that are wrong, things that are legal but should not be," said Carroll Muffett, president of the Center for International Environmental Law. "This country, for all its failing, has long respected the importance of that. These bills put that fundamental element of our democracy in jeopardy."

The modern environmental movement sprung out of mass protest, when millions took to the streets for the first Earth Day in 1970. In the decade that followed, civil disobedience emerged as a core tactic as greens joined with anti-war protesters to launch anti-nuclear campaigns.

The movement returned to those roots with its fight to stop the Keystone XL oil pipeline. Beginning in 2011, activists staged sit-ins ending in arrests that galvanized the movement and drew national attention to what had been the mundane work of building pipelines.

Across the country, people began physically obstructing fossil fuel infrastructure. Protesters blocked train tracks carrying coal and oil in Washington state, halted trucks at the gates of a gas storage facility in upstate New York and kayaked in front of an oil rig in Seattle. This was before hundreds were arrested at Standing Rock.

In the absence of a clear federal energy plan, fossil fuel projects effectively became the policy, locking in oil, gas and coal infrastructure—and their greenhouse gas emissions —for generations. They became the focus of environmental groups, who found common cause with tribes and landowners fighting to protect their land and water. The groups launched campaigns that delayed or blocked several major coal export facilities, pipelines and other projects, costing energy companies millions or even billions of dollars.

To push back, the industry turned to its supporters in government.

Soon after Oklahoma's critical infrastructure bill passed last year, the conservative American Legislative Exchange Council (ALEC) used it to write model legislation for other states.

In at least six more states, lawmakers introduced similar bills that would impose steep penalties for trespassing on, or tampering with, pipeline property and other infrastructure. Two were enacted this year. Two others are pending.

In Wyoming, one bill was openly proposed on behalf of energy companies. In other states the ties are barely hidden.

Louisiana state Rep. Major Thibaut, a Democrat, introduced a bill that followed ALEC's model by adding pipelines to a list of critical infrastructure facilities. When he presented it to a Senate committee in April, he brought Tyler Gray, a lobbyist for the Louisiana Mid-Continent Oil and Gas Association. During the hearing, Gray answered most of the questions. At one point, he leaned over to Thibaut to recommend that he accept an amendment. Neither Thibaut nor Gray responded to requests for comment.

Gov. John Bel Edwards signed it into law in May, after an amendment removed the threat of conspiracy charges.

Iowa enacted a law that makes "critical infrastructure sabotage" a felony. Lawmakers in Wyoming and Minnesota also approved critical infrastructure bills this year, but the governors there vetoed the legislation. Similar bills are pending in Ohio and Pennsylvania.

The bills may get their first test soon in Louisiana after the arrests of the three activists on the bayou. Activists had been conducting tree-sits and other actions for weeks and were careful to stay off the pipeline easements once the new law took effect, said Cherri Foytlin, an activist organizer in the state. The three who were taken from their boats, for example, claim to have been in navigable waters, which are supposed to be public. The definition of what's navigable has been the subject of much debate and legal wrangling in the state, however, and the off-duty officers pulled the activists off their boats to arrest them.

"It was really scary," said Cindy Spoon, one of the activists, in a video posted to Facebook. "They grabbed my wrist, grabbed my waist and they started to pull one of my arms behind my back and put me in a stress position."

The local district attorney has not yet formally filed charges. If the case proceeds, the activists will challenge the law itself, said William Quigley, a professor at Loyola University College of Law in New Orleans who is representing them pro bono. They could face up to five years in prison if convicted.

Two more incidents occurred over the weekend involving activists who had erected a stand high in the trees to block construction. Quigley said the group had permission from property owners to be on the land. But deputies with the St. Martin Parish Sheriff's Office arrested six people, including a journalist. The property is co-owned by hundreds of parties, and some of them have not signed easement agreements with Energy Transfer Partners, the primary owner of the pipeline.

The St. Martin Parish Sheriff's Office did not return requests for comment on the weekend arrests.

"I think this shows how ridiculous this law is if this is the way it's going to be applied," Quigley said.

The legislation is not without precedent. Since 1990, nearly a dozen states have passed bills known as "ag-gag" laws that prohibit surreptitiously recording inside feedlots and breeding facilities. Many came after exposés of horrific conditions and animal abuse. Three of those laws were subsequently overturned by courts.

But environmentalists and free speech advocates say the new bills are part of a broader effort to recast environmental activists as criminals, even terrorists.

For example:

In May 2017, the American Fuel and Petrochemical Manufacturers published a blog post about activists who vandalized pipelines, under the headline: "Pipelines Are Critical Infrastructure—and Attacking Them Is Terrorism."

Last year, Energy Transfer Partners filed a federal lawsuit against Greenpeace and other groups seeking hundreds of millions of dollars. The complaint accuses some of the nation's leading environmental organizations of operating an "eco-terrorist" conspiracy at Standing Rock. (A judge dismissed the case against one of the defendants last month, but has yet to rule on a motion to dismiss the case against Greenpeace.)

Another industry group launched a database to track "criminal acts on critical energy infrastructure," claiming eco-terrorism incidents were on the rise.

In Congress, 84 members wrote a letter to Attorney General Jeff Sessions asking if protesters who tamper with pipelines could be prosecuted as domestic terrorists. Sponsors of the state pipeline bills have also invoked terrorism.

Supporters of the bills say they do not suppress lawful protests.

"There's a legal process to stop something," said Oklahoma state Rep. Mark McBride, who sponsored another bill that assigns civil liability to anyone who pays protesters to trespass. "But if you're chaining yourself to a bulldozer or you're standing in the way of a piece of equipment digging a ditch or whatever it might be, yes, you're causing harm to the project and to the person that's contracted to do that job."

Advocacy groups say the bills are unnecessary—trespassing and vandalism are already unlawful, and protesters who have disrupted operations have largely been charged under existing statutes. They fear the legislation uses a handful of dangerous incidents as a pretext to intimidate mainstream advocates and target more widespread acts of peaceful civil disobedience, like temporarily blocking access to a construction site. Several environmental and civil liberties organizations are now in talks about how to respond to the industry's actions.

"The clear attempt there is to bring environmental justice, environmental advocacy organizations into a realm of criminal liability," said Pamela Spees, a senior staff attorney at the Center for Constitutional Rights, which represents activists in Louisiana. "They're basically trying to silence and minimize the impact of environmental organizations."

Ever since the state's first gusher began spurting crude in 1897, oil has dominated life in Oklahoma. Today, it is home to fracking pioneers, including Continental Resources. The town of Cushing, the self-dubbed "pipeline crossroads of the world," is a critical oil trading hub. "Environmental activist" is not a badge many wear openly in Oklahoma.

Dakota Raynes spent the past few years at Oklahoma State University writing his dissertation about how people have responded to a fracking boom that's literally shaken the state. As drillers began injecting more wastewater into wells from 2010-2015, the number of earthquakes jumped more than 20-fold to 888.

Raynes interviewed dozens of activists, lawmakers, regulators and ordinary citizens. He found that almost everyone is reluctant to speak publicly against the industry. They fear neighbors or colleagues or parents of their children's friends will catch wind and shun them. Raynes said he's attended events with activists who have returned to their cars afterward to find screws driven into tires.

"Many people read Oklahoma as a hostile context in which to engage in any kind of pro-environmental work," he said. "And they all link that back to the phenomenal amount of power that the oil and gas industry has, both as a cultural force, as a legal force, as a political force."

Standing Rock changed that deference, said Mekasi Camp Horinek, a member of the state's Ponca tribe.

Horinek grew up on tribal land near an oil refinery and a factory that produces carbon black, a sooty petroleum product. He blames the facilities for contributing to disease and deaths in his community. When the call came to join the resistance at Standing Rock, he became a leading organizer. (The Ponca's ancestral land in Nebraska lies in the path of the Keystone XL pipeline, another project he's fought.)

"I think the state of Oklahoma felt threatened and was worried that people would rise up," he said about the critical infrastructure bill. "And they should be."

Horinek says he remains undaunted, but he said the new law had an immediate effect on others.

People were suddenly threatened with long prison terms if they crossed onto pipeline property. "That definitely weighs heavy on a person's thoughts," he said.

Oklahoma has 39 American Indian tribes, most forced to move there during the Trail of Tears in the 19th century, and indigenous activists led the fight against the Diamond Pipeline. "They can't risk a six-month sentence for trespassing. Who's going to take care of their kids, their parents, their grandparents?" he said.

Biggs, the state representative who sponsored the bill, now works for the federal Department of Agriculture. He declined to comment for this article. The Oklahoma Oil and Gas Association also declined to make anyone available for an interview.

Activists say the bill was a clear assault.

[Graphic: 31 States That Have Proposed Laws Targeting Protests]

The Sierra Club has a policy against engaging in civil disobedience. But the bill's conspiracy element—which says a group can be charged with 10 times the fines given to a person who violates the provisions—worries the organization's Oklahoma director, Johnson Grimm-Bridgwater. He was among those who spoke at the 2017 press conference. While Grimm-Bridgwater says participating in a press conference should qualify as constitutionally protected speech, there's no telling how a prosecutor might apply the new statute.

"The law is punitive and is designed to create friction and divisions among groups who normally wouldn't have a second thought at working together," he said.

Brady Henderson, legal director for the Oklahoma chapter of the American Civil Liberties Union, said there's widespread concern in both liberal and libertarian circles about the law, and he's fielded questions from several nonprofits about the conspiracy clause.

The ACLU in Oklahoma is considering legal challenges, but may have to wait until a district attorney tries to use the new law. Still, Henderson said, the language is so broad that it can apply to almost anything.

"By equating those kinds of things together, essentially political speech on one end and on the other end outright terrorism," he said, "the bill is a pretty gross instrument."

Protesters Under Surveillance

As activists were fighting the Diamond Pipeline in Oklahoma, Energy Transfer Partners was planning another oil pipeline at the southern end of its network.

The Bayou Bridge pipeline is slated to run from Texas to St. James Parish, Louisiana, already home to many petrochemical facilities. Along the way, it crosses through the Atchafalaya Basin, the nation's largest river swamp and a center of the state's crawfish industry.

Environmentalists were alarmed by the risks the project posed to residents of St. James Parish and the swamp's fragile ecosystem. And they quickly felt as if law enforcement agencies were working against them. For one thing, some officials openly supported the project. Joseph Lopinto, now Jefferson Parish sheriff, spoke at a hearing last year on behalf of the National Sheriffs' Association and urged regulators to approve the pipeline.

Anne Rolfes, an organizer of the pipeline resistance and founder of the advocacy group Louisiana Bucket Brigade, said activists also suspected they were being watched.

During the protests at Standing Rock, a private security company hired by Energy Transfer Partners had compiled daily intelligence briefings and coordinated with local law enforcement, as detailed in reports by The Intercept.

Rolfes' group this year also obtained a handful of documents through a public records request that indicate state officials were tracking their efforts.

One state police report from November said the agency sent an investigator from its criminal intelligence unit to a hearing where activists had allegedly planned a protest, and that a local sheriff's office planned to send a plainclothes officer. Rolfes said no protest was planned; activists merely intended to speak at a public hearing.

In the following months, an intelligence officer with the Governor's Office of Homeland Security and Emergency Preparedness sent two emails to colleagues describing environmental groups' priorities, quoting Rolfes' newsletter and an article about the groups and adding a photograph of her.

Louisiana State Police declined to comment. Mike Steele, a spokesman for the state Homeland Security Department, rejected the notion that his agency spies on activists. He said this type of open-source tracking was similar to what they conduct ahead of football games or festivals. "We do that with any type of event, because safety is the number one concern," he said.

Energy Transfer Partners spokeswoman Alexis Daniel issued a statement saying, "any claims that our company or our security contractors have inappropriately monitored protestors is false." It's unclear what surveillance, if any, the company or its contractors have deployed in Louisiana. Energy Transfer Partners has also been accused of spying by a family in Pennsylvania.

Anne Rolfes, founder of the advocacy group Louisiana Bucket Brigade, speaks at a protest in 2017. A similar photo of her was circulated by a law enforcement agency. Credit: Julie Dermansky

Rolfes said Louisiana officials are treating activists like criminals. "We're going to be followed while we participate in the democratic process?" she said.

The arrests in August of the three protesters on boats who were detained by off-duty officers only deepened the activists' feelings that the state and energy industry were working against them. While the arresting officers were working as contractors, they wore their state uniforms and badges and carried weapons.

Spees, with the Center for Constitutional Rights, said the incident represented a problematic melding of public and private authorities. After the industry-supported bill was enacted by the state, she said, state employees acted on behalf of an energy company to detain activists under the new law.

"It's as though they've become the hired hands for private oil companies and pipelines companies," she said.

Activists in Virginia, where two gas pipelines are drawing protests, have also been tracked by that state's Fusion Center—which shares information across agencies to combat terrorism and crime—according to a report by the Richmond Times-Dispatch.

In Washington state, a local sheriff's office has monitored activists opposed to a major pipeline project that connects with Canada's tar sands, sharing information with that state's Fusion Center, documents obtained by The Intercept show. Protest groups there already tried to block tanker traffic associated with the project, and local authorities have conducted trainings to prepare for mass protests.

The nation's intelligence community has a dark history of tracking political movements, including the FBI's COINTELPRO, which snooped on Martin Luther King, Jr., and others. Earlier this decade, FBI agents monitored opponents of the Keystone XL pipeline, according to The Guardian, and the bureau had at least one informant at Standing Rock.

Even if local law enforcement agencies today are doing little more than collecting news articles and other open source information, covert surveillance risks equating political protest with criminal activity, said Keith Mako Woodhouse, a historian at Northwestern University who wrote a book about radical environmentalists.

"It gives a sense that law enforcement is, if not on the side of, at least more sympathetic to the industries and practices that are being protested," he said. "Presumably they're not surveilling the energy companies."

Pipeline companies have cultivated that relationship. They have reimbursed state law enforcement millions of dollars for providing security for their projects in North Dakota, Iowa and Massachusetts, for example. Most major pipeline companies also run charitable campaigns that have contributed millions of dollars to local emergency responders and other agencies and groups along their routes. The National Sheriffs' Association, in turn, supported the Bayou Bridge and Dakota Access pipelines while it called protesters "terrorists."

A Worrying Trend

Over the past decade, the sense of urgency around climate change has intensified. Scientists say there is little time before the planet is committed to potentially devastating warming. The only practical way to avoid that is to largely eliminate fossil fuel use over the next several decades.

The government's response has been minimal, "so the next step is civil disobedience," said Woodhouse, the historian. "And if anything, it's surprising that it hasn't happened sooner and at a greater scale."

Woodhouse said efforts to suppress environmental protest trace back to the rise of radical environmental groups in the 1980s and '90s, when public officials slapped names like "eco-terrorism" and "ecological terrorism" on activists who vandalized equipment or animal testing facilities.

Now, Woodhouse said, environmentalists are threatening one of the country's most powerful industries. Some advocates say the new push against activists seems more widespread and unabashed.

"I think there is a very worrying trend—and the trend is on both sides of the political aisle to be perfectly frank and candid—a trend of trying to suppress political speech that you don't agree with," said David Snyder, executive director of the First Amendment Coalition.

"I think a lot of people feel a lot more threatened than they have in a long time," he said, "and I think that's true on all sides of the political spectrum. And when people feel threatened, they tend to lash out."

ICN reporter Georgina Gustin contributed to this story.

Flint water crisis: Michigan's top health official to face trial over deaths

State’s health director to stand trial for involuntary manslaughter in two deaths linked to legionnaires’ disease in the Flint area

Associated Press in Flint

Tue 21 Aug 2018 10.44 EDT Last modified on Tue 21 Aug 2018 11.03 EDT

Special agent Jeff Seipenko walks out of the courtroom with signed paperwork in hand after a judge authorized charges on 14 June 2017 in Flint, Michigan for Nick Lyon in relation to the water crisis. Photograph: Jake May/AP

A judge has ordered Michigan’s health director to stand trial for involuntary manslaughter over two deaths linked to legionnaires’ disease in the Flint area, the highest-ranking official to face criminal charges as a result of the city’s tainted water scandal.

Nick Lyon is accused of failing to issue a timely alert about the outbreak. District court judge David Goggins said deaths probably could have been prevented if the outbreak had been publicly known. He said keeping the public in the dark was “corrupt”.

Goggins found probable cause for a trial in Genesee county court, a legal standard that is not as high as beyond a reasonable doubt. Lyon also faces a charge of misconduct in office.

When the judge announced his decision, a woman in the gallery said, “Yes, yes, yes.”

“It’s a long way from over,” Lyon told the Associated Press. He declined further comment.

Some experts have blamed the legionnaires’ outbreak on Flint’s water, which was not properly treated when it was drawn from the Flint river in 2014 and 2015. Legionella bacteria can emerge through misting and cooling systems, triggering a severe form of pneumonia, especially in people with weakened immune systems.

At least 90 cases of legionnaires’ occurred in Genesee county, including 12 deaths. More than half of the people had a common thread: they spent time at McLaren hospital, which was on the Flint water system.

‘Nothing to worry about. The water is fine’: how Flint poisoned its people – podcast

The outbreak was announced by the governor, Rick Snyder, and Lyon in January 2016, although Lyon concedes that he knew that cases were being reported many months earlier. He is director of the health and human services department.

Nonetheless, he denies wrongdoing. Lyon’s attorneys said there was much speculation about the exact cause of legionnaires’ and not enough solid information to share earlier with the public.

The investigation by the office of the state attorney general, Bill Schuette, is part of a larger investigation into how Flint’s water system became contaminated when the city used Flint river water for 18 months. The water was not treated to reduce corrosion. As a result, lead leached from old pipes.

“We’re not looking at today as a win or a loss. We’re looking at today as the first step and the next step for justice for the moms, dads and kids of Flint,” said Schuette’s spokeswoman, Andrea Bitely, who specifically mentioned the families of two men whose deaths the prosecution blames on Lyon – 85-year-old Robert Skidmore and 83-year-old John Snyder.

An additional 14 current or former state and local officials have been charged with crimes, either related to legionnaires’ or lead in the water. Four agreed to misdemeanor plea deals; the other cases are moving slowly.

“Normally we don’t see government officials accused of manslaughter based on what they didn’t do,” said Peter Henning, a professor at Wayne State University law school in Detroit. “That does make it an unusual case, and it will make government officials be much more cautious. Maybe that’s the message here.”

Defense attorney John Bursch said the judge’s decision was “mystifying”. Goggins spent more than two hours summarizing evidence from weeks of testimony, but he did not specifically explain what swayed him to send Lyon to trial.

“We had 20 pages of argument in our legal brief that he didn’t address,” Bursch said outside court. “He didn’t talk about the law at all.”

A trial would be many months away, beginning after Snyder’s term as governor ends on 1 January. He said Lyon “has my full faith and confidence” and will remain as Michigan’s health director.

A courtroom spectator, Karina Petri, 30, of Milwaukee said sending a senior official to trial is “long overdue”.

“He withheld the truth. There’s no excuse,” said Petri, who wore a “Flint Lives Matter” shirt. “He could have changed hundreds of lives.”

Shopping for Community Solar? Contract Terms Are Getting Friendlier

Solstice introduces a “no-risk” contract, following a trend of increasingly flexible terms.

More flexible terms could lower the barrier to entry for consumers interested in solar power.

Community solar is supposed to make investing in PV easier. But many contracts do exactly the opposite.

Common contract terms put customers on the hook for cancellation fees or signup periods stretching into two decades. The lack of flexibility is generally a turnoff for customers, limiting signups from the 50 to 75 percent of U.S. consumers who can’t access traditional rooftop solar.

Solstice, a community solar organization focused on customer management, recently introduced a “no-risk” contract tied to a new 2.73-megawatt Delaware River Solar project in New York. The contract includes no cancellation fee and lasts just one year. Solstice called the release a “milestone in U.S. solar accessibility” and said the terms “allow renters to participate without fear of getting stuck with a contract that they can’t take with them if they move.” The project will serve 400 households after its estimated Q4 completion.

Other community solar providers are moving to simplify contract terms as well, but this particular offering is “rare,” according to Wood Mackenzie Power & Renewables Senior Solar Analyst Michelle Davis.


“This new Solstice contract is a great example of program designers meeting customers’ demands,” said Davis. “Community solar customers want community solar to be easy, not fraught with lengthy contracts, cancellation fees, or requirements to transfer a contract in the same utility territory.”

The offering from Solstice follows a trend of community solar developers moving beyond traditional power-purchase agreements. Increasingly, community solar companies are looking to tailor flexible options for consumers rather than using the template set out by the solar industry’s long contract commitments.

Other companies, like Arcadia Power, have also shunned cancellation fees.

In an interview with Greentech Media earlier this month, Arcadia’s CEO Kiran Bhatraju said the company is working to make good on the promise of community solar, which should look more like “universal access” than a rooftop solar sale. Arcadia’s data shows that on average Americans move every 5.5 years.

According to EnergySage, many of Arcadia’s term lengths last 20 years, but the company has no cancellation fee, and its partnerships with utilities nationwide mean the company can transfer a customer if they move.

That’s more flexible than offerings from other companies like Renovus Solar, with projects in New York that have 25-year contracts and don’t allow cancellations. EnergySage data identifies other developers like the Clean Energy Collective that require customers to transfer contracts to other consumers if they move outside the service area, or pay a cancellation fee. A past Clean Energy Collective contract included a 50-year term and its currently available contracts last up to 25 years.

Other companies have more favorable terms, such as those offered by Solstice and Arcadia. Community Power Partners offers a three-year term, but requires a cancellation fee in the first year. BlueWave Solar has a 20-year contract, but no termination fee, and asks for 90 days' notice. Nexamp offers a pay-as-you-go model with no cancellation fee, though the company asks for a notification period that differs by state, ranging from 90 days in states including New York to six months in Massachusetts.

The flexibility of the Solstice model, though, could come with added cost. Davis said the turnover from one-year contracts will also mean faster churn on Solstice’s waiting list and customer pipeline.

“I can see this contract structure leading to an increased need for customer management for this specific project,” Davis said. “This is not an insignificant cost for community solar projects, and many developers outsource customer acquisition and management since it takes substantial resources and expertise. Time will tell if the benefits of increased customer engagement will outweigh the costs of managing customer wait lists and pipelines.”

In a conversation with Greentech Media in July, Solstice co-founder and CEO Steph Speirs said narrowing the customer base through unappealing contract terms makes customer acquisition costs spike. She said Solstice's wait list is designed specifically to lessen the impact of any churn, with more consumers waiting to fill demand.

As customer-friendly options become the norm, Speirs said it should push bigger players like NRG, with 20-year contracts and a required transfer, to follow suit.

“People aren’t signing up for marriage for 20 years,” Speirs said. “Why are we trying to force people to sign up for community solar for 20 years?”

TimberFish Launches IndieGoGo to Raise Trout on Brewery Waste and Wood Chips

By Catherine Lamb - July 12, 2018

It’s no secret that wild-caught seafood is fraught, what with its declining supply and associations with inhumane labor practices. Many tout farmed fish as a more ethical and sustainable (not to mention cheaper) way to satisfy our seafood cravings, which is why aquaculture is the fastest growing food-producing sector. In fact, as of 2016, aquaculture produced half of all fish for human consumption.

While aquaculture doesn’t lead to overfishing of limited ocean resources, it can have other unsavory consequences. Farmed fish produce a lot of waste (AKA fish poop), and can sometimes cause chemicals to leak into our drinking water.

And then there’s the fish food. Often, farmed fish are fed pellets of corn, soy, unwanted chicken parts, or even fish meal. Sometimes people even catch smaller, less popular fish from the ocean and grind them up to feed their farmed brethren. Obviously, it takes a lot of energy and environmental resources to create all this fish food, and even more to filter out waste from fish enclosures.

TimberFish Technologies’ eponymous technology promises to offer a more palatable alternative to conventional aquaculture. The company launched in 2008 and has so far raised or won $260K, which it used to build a test facility at the Five & 20 Spirits & Brewing facility in Westfield, New York.

There, they feed their fish not with animal parts or corn, but with a combination of nutrient-rich wastewater from food processors (such as breweries, distilleries, and wineries) and woodchips. Microbes grow on the woodchips, small invertebrates (like worms and snails) eat the microbes, and the fish eat the invertebrates. The fish poop is grub for the microbes, and the whole cycle starts again.

In addition to seafood, the TimberFish system’s only outputs are clean water and spent wood chips, which can be used as a biofuel or soil supplement. Another benefit is that TimberFish can build their aquaculture farms close to cities, shortening the supply chain and guaranteeing fresher fish.

This is obviously not as idyllic as plucking salmon from the Alaskan seas or catching trout in a mountain stream, but, as aquaculture operations go, it’s not bad. And it’s certainly cost-efficient, diverting a waste product and turning it into profit.

This week TimberFish Technologies launched an IndieGoGo campaign to raise funds for their no-waste, sustainable aquaculture system. If they reach their $10,000 goal, they’ll use the funds to design plans for a larger commercial facility, which they estimate could produce 2 to 3 million pounds of fish per year.

Why recycling is now a money loser, not a moneymaker for Philly

Updated: JULY 23, 2018 — 5:00 AM EDT

by Frank Kummer, @FrankKummer

A truck loaded with curbside recycling opened its giant maw and five tons of plastic, cardboard, and metal spewed to the ground next to a 25-foot-high pile of recyclables recently dumped at the Republic Services facility in Philadelphia.

By the end of the day, the Grays Ferry location had received up to 500 tons of recyclables — processed at a rate of 25 tons an hour. It’s a staggering amount to move, sort, bale, and ship on a daily basis. About 20 percent goes to landfills because of non-recyclable items such as hoses, rigid plastic containers, and pizza boxes mixed in.

Five years ago, Philadelphia was getting paid good money by contractors for recyclables — about $65 a ton.

By last year, it was paying $4 a ton just to get rid of it. Now, it’s paying $38 a ton.

The change has not gone unnoticed. That plastics and other scraps meant for recycling are ending up in landfills — at a cost to taxpayers — was an issue raised through Curious Philly, our new question-and-response forum that allows readers to submit questions about their community in need of further examination.

“We made the argument that recycling was so much cheaper, but that’s getting closer to not being true,” said Nic Esposito, director of Philadelphia’s Zero Waste and Litter Cabinet. “That’s scary.”

City taxpayers will likely chip in about $2 million to cover the increase this year, according to budget figures.

If the trend continues, recyclables could become just as expensive to get rid of as ordinary waste, which means items once recycled will go straight to landfills, undoing decades of work to get residents and businesses to go green. (It costs about $63 a ton to dispose of waste at a landfill.)
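The per-ton figures above imply a sizable budget swing. As a rough, illustrative sketch of the arithmetic (the yearly tonnage below is hypothetical, not a number from the article):

```python
def annual_cost(tons, cost_per_ton):
    """Yearly disposal cost in dollars; a negative cost means the city is paid."""
    return tons * cost_per_ton

# Per-ton figures reported in the article:
PAID_2013 = -65   # five years ago, contractors paid the city about $65/ton
PAYING_NOW = 38   # now the city pays $38/ton to unload recyclables
LANDFILL = 63     # landfill disposal costs about $63/ton

tons = 100_000    # hypothetical yearly tonnage, for illustration only

# The swing from earning $65/ton to paying $38/ton is $103/ton:
swing = annual_cost(tons, PAYING_NOW) - annual_cost(tons, PAID_2013)
print(swing)                   # 10,300,000 dollars per 100,000 tons

# Recycling still beats landfilling, but the gap has narrowed to $25/ton:
print(LANDFILL - PAYING_NOW)   # 25
```

The point the article makes falls out directly: once that $25/ton gap closes, recycling loses its cost advantage over ordinary disposal.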

All the eggs in one basket

What many call a recycling crisis began in 2013, when China shifted away from its role as the world’s major recycler. As part of its crackdown on pollution, China began demanding ever-purer loads of recyclables. Previously, Chinese facilities might have accepted loads that were 10 or 20 percent contaminated by non-recyclable plastics or by paper and cardboard that was wet or dirty.

But the last year has been particularly painful for recycling. China is now demanding that loads be no more than half a percent contaminated, and contain no mixed paper (such as office paper mixed in with newspaper) — an impossible standard to meet for most municipalities. In Philadelphia, as much as one-fifth of each load might be contaminated.

Recycling companies had to figure out how to unload millions of tons elsewhere. Recyclers have been able to find some domestic and overseas markets such as Vietnam and India for plastics, but those countries are now overwhelmed.

China strained markets even more when it refused all recycling imports from May 4 through June 4, forcing recyclers to stockpile loads.

“We put so many eggs in the basket in China,” Esposito said.

Bales of aluminum cans and cardboard stacked high off the floor at Republic Services’ processing plant in the Grays Ferry neighborhood of Philadelphia.

Philadelphia and many municipalities use contractors to process single-stream recycling. Contractors are now appealing to residents to use less plastic and be cautious of what they put in recycling bins.

The goal is to get loads as pure as possible.

“We weren’t exporting plastic to China in recent years,” said Frank Chimera, a senior manager with Republic Services, the city’s contractor to process recycling. “But we were exporting mixed paper, cardboard. Prior to all this we were sending about 60 percent to China. As of March 1, we stopped sending anything.”

The company has had to stockpile mixed paper loads at times until it found domestic buyers. Some loads are being shipped to Indonesia and India. But that poses other problems.

“The cost to ship is greater than the value of paper,” Chimera said.

The crunch has caused turmoil for recyclers, who have always borne the brunt of processing but made up for it by selling sorted recyclables as a commodity on the open market.

“Recycling economically does not make sense right now,” he said. “So companies like Republic are propping up the recycling industry.”

Shift lead Tim Spross stands between aisles of landfill waste (left) and bales of cardboard at Republic Services’ processing plant in Grays Ferry.

Plastic bags a big problem

On Tuesday, Chimera stood with Tim Spross, also of Republic Services, high atop a steel platform where thousands of pieces of paper and cardboard rushed down a conveyor belt.

Workers scrambled to pick as many plastic bags, pieces of dirty cardboard, and other non-paper items as possible from the swift-moving stream. Plastic bags, almost ubiquitous, are especially onerous because they clog machinery, forcing lines to shut down.

“Plastic bags are one of our biggest problems,” Spross said.

He walked to a wall of bales of non-recyclables that were removed from the trucks and conveyors. The wall ran 10 feet tall and 50 feet long. All of it will go to a landfill.

Employees at a Waste Management recycling processing facility work to free a machine clogged with plastic bags. Plastic bags are a significant source of machine shutdowns.

Contamination drives up costs

John Hambrose, a spokesman for Waste Management, which contracts with other municipalities, says his company is experiencing similar issues. Waste Management has a recycling processing facility on Bleigh Avenue in Holmesburg.

Hambrose said the situation is not at the point where recyclables are going straight to landfills. But he said costs are being passed down to municipalities and, ultimately, taxpayers.

“In the last few months, we have been working with municipalities,” Hambrose said. “We’re evaluating the quality of materials they are sending us. When we find high levels of contamination, that’s when additional fees may incur.”

Contamination occurs when residents either don’t know what’s recyclable or simply don’t care.

Food waste gets tossed in with plastic, contaminating an entire load. Liquids left in bottles drip down, ruining paper. Pizza boxes stained with grease are unusable. Heavy rigid plastics, such as crates and bread carriers, also get tossed in, although they are not recyclable. Some residents toss in dirty diapers.

“If there’s too much trash in recycling, then it’s basically all trash,” Hambrose said. “You’re really trying to do this efficiently and make a business out of it. But contaminated loads drive up your costs. At the end, you’ve got low commodity values. So we’re really caught at both ends.”

Recycling companies such as Republic Services and Waste Management are sometimes locked into longer-term contracts with municipalities and have to eat the additional costs.

For example, Camden County has a multiyear contract with Republic Services that locks in a cooperative agreement with 37 municipalities to pay $5 a ton to unload recycling — $33 less per ton than Philly.

“Other than the fact that we were getting money for recyclables, and now we are paying money, it hasn’t hit us as hard as other municipalities because of our participation in the co-op,” said Thomas Cardis, business administrator for Gloucester Township in Camden County. “If we weren’t in that bid, then we would have some serious concerns right now. But, ultimately, we’re concerned we will eventually be susceptible to the market.”

Back at the Republic Services facility in Grays Ferry, workers were sorting through the just-delivered load from the recycling truck. They loaded the non-recyclables into carts and wheeled them to a 12-foot-high hill of debris waiting to be sent to a landfill. As soon as workers clear the pile, it starts to rise anew.

What’s recyclable in Philly?

Plastics: food containers, bottles, jars, detergent and shampoo bottles, and pump and spray bottles – all rinsed. (Philly takes all plastics from #1 to #7, which might be stamped on the bottom with the recycling symbol)

Paper: Newspapers, magazines, brochures, junk mail, envelopes, scrap paper, paper bags, paperback books – any plastic sleeves removed and not soiled with food waste.

Cartons: Milk, juice, wine and soup – all rinsed.

Aluminum: Cans, paint cans (dried latex), aluminum baking dishes, foil

Glass: bottles and jars – rinsed.

Cardboard: boxes, paper towel rolls, egg cartons and dry food and shipping boxes.

What can’t be recycled in Philly:

Plastic Bags

Disposable Plates, Cups, and Takeout Containers

Greasy or Food-Soiled Paper & Cardboard

Tissues, Paper Towels, and Napkins

Light Bulbs

Cassette Tapes (VHS and audio)

Garden Hoses

Needles and Syringes

Propane Tanks

Pots & Pans

NOTE: What gets recycled may vary from town to town, but most have a list on their municipal websites.

Global warming set to exceed 1.5°C, slow growth - U.N. draft

Alister Doyle Environment Correspondent

OSLO (Reuters) - Global warming is on course to exceed the most stringent goal set in the Paris agreement by around 2040, threatening economic growth, according to a draft report that is the U.N.’s starkest warning yet of the risks of climate change.

Governments can still cap temperatures below the strict 1.5 degrees Celsius (2.7 degrees Fahrenheit) ceiling agreed in 2015, but only with “rapid and far-reaching” transitions in the world economy, according to the U.N.’s Intergovernmental Panel on Climate Change (IPCC).

The final government draft, obtained by Reuters and dated June 4, is due for publication in October in South Korea after revisions and approval by governments.

It will be the main scientific guide for combating climate change.

“If emissions continue at their present rate, human-induced warming will exceed 1.5°C by around 2040,” according to the report, which broadly reaffirms findings in an earlier draft in January but is more robust, after 25,000 comments from experts and a wider pool of scientific literature.

The Paris climate agreement, adopted by almost 200 nations in 2015, set a goal of limiting warming to “well below” a rise of 2°C above pre-industrial times while “pursuing efforts” for the tougher 1.5° goal.

The deal has been weakened after U.S. President Donald Trump decided last year to pull out and promote U.S. fossil fuels.

Temperatures are already up about 1°C (1.8°F) and are rising at a rate of about 0.2°C a decade, according to the draft, requested by world leaders as part of the Paris Agreement.

“Economic growth is projected to be lower at 2°C warming than at 1.5° for many developed and developing countries,” it said, with growth drained by impacts such as floods or droughts that can undermine crops, or by an increase in human deaths from heatwaves.

In a plus-1.5°C world, for instance, sea level rise would be 10 centimeters (3.94 inches) less than with 2°C, exposing about 10 million fewer people in coastal areas to risks such as floods, storm surges or salt spray damaging crops.

It says current government pledges in the Paris Agreement are too weak to limit warming to 1.5°C.

IPCC spokesman Jonathan Lynn said it did not comment on the contents of draft reports while work was still ongoing.

A man holds an umbrella as he walks next to his buffalo on the banks of the river Ganga on a hot summer day in Allahabad, India May 24, 2016. REUTERS/Jitendra Prakash

“It’s all a bit punchier,” said one official with access to the report who said it seemed slightly less pessimistic about prospects of limiting a rise in global temperatures that will affect the poorest nations hardest.

The report outlines one new scenario to stay below 1.5°C, for instance, in which technological innovations and changes in lifestyles could mean sharply lower energy demand by 2050 even with rising economic growth.

And there is no sign that the draft has been watered down by Trump’s doubts that climate change is driven by man-made greenhouse gases.

The draft says renewable energies, such as wind, solar and hydro power, would have to surge by 60 percent from 2020 levels by 2050 to stay below 1.5°C “while primary energy from coal decreases by two-thirds”.

By 2050, that meant renewables would supply between 49 and 67 percent of primary energy.

The report says governments may have to find ways to extract vast amounts of carbon from the air, for instance by planting huge forests, to turn down the global thermostat if warming overshoots the 1.5°C target.

It omits radical geo-engineering fixes such as spraying chemicals high into the atmosphere to dim sunlight, saying such measures “face large uncertainties and knowledge gaps.”

Boston has new rules to help buildings withstand climate change

A parking lot on Long Wharf was one of many stretches of downtown Boston that flooded during a storm in January.

By Jon Chesto GLOBE STAFF, JUNE 15, 2018

Memories of the storm last winter that flooded parts of downtown Boston are still fresh at City Hall. Now, the Walsh administration is pushing developers to make their buildings better able to withstand another watery apocalypse.

The Boston Planning & Development Agency on Thursday approved new rules to make big buildings more resilient to the effects of climate change. City officials hope the measures will help minimize flooding, keep the lights on in more buildings during power outages, and make it easier to upgrade street lights and other public works.

“We think we’ve identified a way forward that appears to be the first of its kind in the nation,” said Brian Golden, the agency’s director.

The rules are initially being tested for a two-year period and differ for projects based on size. For the largest developments — at least 1.5 million square feet — developers will need to assess installing an on-site power plant, and build one if it’s financially feasible. They will also have to consolidate all wiring for cable, Internet, and other telecom services into one underground tube, so there is less disruption to streets and sidewalks during repairs.

Any new development above 100,000 square feet will have to retain more rainfall than currently required, to help prevent runoff during storms from contributing to floods in the surrounding area. In 2017, the planning agency received applications for 39 projects over 100,000 square feet.

Meanwhile, developers for projects above 50,000 square feet would need to install extra wiring and technology for “smart” traffic signals and street lights if the projects require new or improved signals or lights.

The agency’s work on this policy began about two years ago. But recent storms have added to the urgency.

Golden said the rules could also help reduce traffic jams caused by construction, by requiring more coordination for underground utility work.

He said developers offered input during the process; HYM Investment Group, for example, expects to use these standards for a complex it will build at the old Suffolk Downs track.

But the development industry’s chief local lobbying group, NAIOP Massachusetts, is pushing back, opposing both the on-site power and stormwater retention requirements.

“To increase this holding capacity is a bit problematic,” chief executive David Begelfer said of the new rainwater rule. “Boston’s an old urban city. It’s very dense.”

Matthew Kiefer, a development lawyer with Goulston & Storrs, said it could be hard to connect on-site power supplies with the local electricity grid. “We’ve talked to clients who have tried to do it in different settings,” Kiefer said. “It’s kind of challenging.”

While Kiefer applauded some of the agency’s goals, he said city officials should in exchange allow developers to build bigger projects to compensate for the added expense.

“I would be careful about adding too many costs to developments,” Kiefer said. “When the market starts to soften a little bit, can you really support the costs of these things?”

As Nuclear Struggles, A New Generation Of Engineers Is Motivated By Climate Change

June 15, 2018, 5:01 AM ET


Entering the control room at Three Mile Island Unit 1 is like stepping back in time. Except for a few digital screens and new counters, much of the equipment is original to 1974, when the plant began generating electricity.

The number of people graduating with nuclear engineering degrees has more than tripled since a low point in 2001, and many are passionate about their motivation.

"I'm here because I think I can save the world with nuclear power," Leslie Dewan told the crowd at a 2014 event as she pitched her company's design for a new kind of reactor.

Dewan says climate change, and the fact that nuclear plants emit no greenhouse gases, are the big reasons she became a nuclear engineer. And she is not the only one.

"The reason that almost all of our students come into this field is climate change," says Dennis Whyte, head of the Department of Nuclear Science and Engineering at the Massachusetts Institute of Technology.

The surge in new engineers comes as the nuclear industry, just like coal, is struggling to compete against cheaper natural gas and renewable energy. The Nuclear Energy Institute estimates that more than half of the nation's 99 nuclear reactors are at risk of closing in the next decade.

President Trump has ordered Energy Secretary Rick Perry to take steps to help financially troubled coal and nuclear power plants, though he has cited the reason as grid resilience and national security.

But nuclear plant operators are echoing young engineers like Dewan as they lobby for public subsidies to keep reactors open.

"If you are concerned about climate change, or concerned about the environment, you should be very concerned about the future of [Three Mile Island]," says David Fein, Exelon's senior vice president of state governmental and regulatory affairs.

TMI parent company Exelon announced last year it will close Three Mile Island Unit 1 in 2019 unless there are policy changes that would make the plant profitable again. A different reactor on the site near Middletown, Pa. — Unit 2 — was involved in the country's worst nuclear accident in 1979.

Fein is among those who argue that nuclear plants should be recognized as clean energy and paid for the public benefit of not emitting greenhouse gases or other pollutants. It's a strategy that has worked in other states: Illinois, New York and — most recently — New Jersey.

Opponents of new subsidies include anti-nuclear activist Eric Epstein with the watchdog group Three Mile Island Alert. "If you consider nuclear green then you have to ignore high-level radioactive waste," he says.

The federal government still doesn't have permanent storage for that waste, and Epstein says there are the environmental costs of uranium mining to consider as well.

Others question giving nuclear plants public money that could be used for renewable energy instead.

"It really is this sort of philosophical battle: Are we building the energy economy of the future? Or are we just, sort of, keeping with the status quo?" says Abe Silverman, vice president of the regulatory affairs group and deputy general counsel for NRG, which has opposed state subsidy proposals.

The nuclear industry thinks it has a good chance of persuading people to support its side of this debate.

Ann Bisconti, who does opinion research for the industry, says a lot fewer people oppose nuclear energy now than just after the Three Mile Island accident.

"People have moved, very much, into middle positions — they're very mushy on nuclear energy," Bisconti says. And that creates an opportunity to win them over by talking about the need for nuclear to limit the effects of climate change, she says.

"You can't get there without nuclear in the fuel mix," says Chris Wolfe, who works as a generation planning engineer at South Carolina Electric & Gas and is on the board of North American Young Generation in Nuclear.

Despite the industry's struggles, the U.S. still gets a fifth of its electricity from nuclear, and there are jobs to be had.

The number of nuclear engineering degrees awarded each year peaked in the late 1970s. Then it dropped steadily after the Three Mile Island partial meltdown in Pennsylvania and the Chernobyl disaster in the former Soviet Union. Numbers started to climb again in 2001 as talk of climate change increased. That left an age gap in the nuclear workforce, with many now ready to retire.

"Baby boomers are leaving the industry, so there are a lot of job opportunities," says Wolfe.

Wolfe chose to work at a utility for job security, but she is cheering on nuclear entrepreneurs like Dewan, who co-founded a startup business called Transatomic Power. She says the company has raised about $6 million from investors.

"We're adapting something that's called a molten salt reactor that was first developed in the very, very early days of the nuclear industry — back in the 1950s and 1960s," Dewan says. With modern materials and a few changes, she believes such a reactor can be financially viable today.

Her company's first design aimed to use spent nuclear fuel to generate electricity. But it turned out some of the calculations were wrong.

Still, she says this new design would produce about half the waste of existing nuclear plants. And Dewan says it's safer because the effects of a meltdown would be limited to the plant site.

"Part of why we started this company is that we wanted the type of nuclear reactor that people would want to have in their communities," says Dewan.

That's a hard sell to anti-nuclear activists like Epstein. "I'm seeing the same arrogance I saw in the '70s," he says. "I think the new generation is like the old generation, in that they view themselves as flawless high priests of technology."

Dewan sees that skepticism as a legacy her generation will have to address. But she thinks the problem of climate change is too important to give up on nuclear energy now.



Seattle has amassed a trove of data on its dockless bike share program, which could help it better manage the 10,000 new bikes now active on its streets.

SEAN HEALY WAS driving for Uber in Seattle when a passenger offered him an intriguing job. It wasn't the first time a rider had proposed a new line of work; in the bustling tech scene in Seattle, his passengers often seemed to be scouting for people to hire. But last summer, the person in the backseat of his family’s minivan was the general manager of a company called Ofo.

THE MAN PITCHED Healy on a new gig-economy job transporting bikes around the city. The work itself sounded just so-so, but the man painted it as something bigger—a vision of a new kind of city, with fewer cars clogging the landscape and a metropolis made safe for people on two wheels. Kind of like Europe. "He seemed like a decent guy," says Healy, 33. "He wanted to do things that were not just environmentally sustainable but ethically sustainable."

Ofo's business is dockless bike sharing, and it was about to launch its US operations in Seattle. Dockless bike share is just the latest of a dozen new approaches to urban mobility in increasingly congested cities. Ride-hailing services, app-powered carpools, on-demand car rentals, electric bikes, scooters, and even self-driving taxis are all jockeying for riders on the streets of American cities. Together they are reinventing the way we navigate urban environments, reducing private car usage, improving traffic and commute times, and cutting emissions.

But where alternatives to car ownership are well-established in the US's major metropolises, bike shares are still finding their niche. Paris, London, and New York have all adopted bike share programs that use docks, bulky stations that are built into parking spaces that dictate where the bikes' users must start and end rides. Though they cost a fraction of a more traditional, multibillion-dollar transit project, the stations are still expensive to install and maintain, and their fixed locations limit the number of riders they can attract.

What makes a dockless bike share program appealing is that, beyond the bikes themselves, it doesn't need any infrastructure. With nothing to build, a city can introduce a new way of getting around virtually overnight. A smartphone app tells users where cheap, GPS-enabled bikes are located and lets them rent one. Upon arriving at a destination, a rider simply leaves the self-locking bike there for the next user. Dallas, Los Angeles, Washington, DC, and several small Florida cities, among other places, have all embraced small dockless bike share programs.

Seattle could use a transportation reboot. The home of a thriving tech sector, Seattle is a fast-growing city and home to some of the country's worst traffic. Last year, when it decided to give dockless bikes a chance, it didn't have a bike share program of any kind. Success here could set up dockless bike share for a nationwide roll-out. Failure could mean more cars, more fumes, and more traffic jams for all. But what looked on the surface like an easy win ended up revealing the limits of a startup-led revolution. As Seattle residents discovered, just because the city could get bikes on the streets with little investment didn't mean it should.

WHEN DOCKLESS BIKE share began in Chinese cities three years ago, the downsides of the idea soon became apparent. Tens of thousands of broken or stranded bikes littered those cities before their governments cracked down, impounding bikes and setting limits on their use. Seattle's Department of Transportation wanted to avoid that mess.

So last July, the city allowed three companies—Ofo, LimeBike, and Spin—to deploy up to 4,000 bikes each in a six-month trial, in return for a deluge of data about their customers and operations. Seattle planners wanted to understand in granular detail how the systems would work, and how its citizens would use them. Now the data is in, much of it sourced by WIRED through a series of public records requests.

Seattleites have taken to dockless bike sharing like nowhere else in the country. Not only does Seattle have nearly a quarter of all the nation's dockless bikes, its bikes get three times the usage of those elsewhere in the US. More than 350,000 riders have covered more than a million miles in the scheme's first five months, and 74 percent of the city is in favor of them, according to a transportation department survey. Three-quarters of rides are being used to access public transit, helping to fill in the gaps left by those established systems.

When Healy started his new gig in September, he was tasked with leading a team of workers redistributing Ofo's bright yellow bikes around the city and bringing in damaged bicycles for repair. Bikes can pile up at popular destinations, block sidewalks, or end up stranded in low-traffic parts of town. Healy's job was to keep the bikes deployed in the places where they are most likely to get used, and prevent them from becoming a menace to the city.

At first, he loved his work. An engineering student with four children who once built a basic ion thruster for satellites for fun, Healy used the work to cover his bills while taking classes. Not only was he helping to bring a cheap, healthy transportation option to Seattle, he liked the company itself. Ofo was hiring people with housing and addiction challenges through a local employment charity. "My manager was taking people from the bottom, helping them to grow," Healy says.

But a few months later, that manager was promoted, and things started to change. The three companies selected by Seattle were in the thick of a price war to lure more riders. Every bike company has an internal goal, the number of rides per bike per day, that it uses to woo investors and predict future earnings. While the companies were expanding in Seattle, they did everything they could to hit that figure—including regularly driving down the cost of a ride to zero. To attract and retain riders, the companies had to make sure their bikes were always available where customers looked for them. Keeping teams like Healy's in constant motion became essential to the program's success.

Healy noticed that Ofo kept deploying more and more bikes. "They weren't hiding their strategy, which was to overrun the city," he says. "They wanted a bike on every corner." The work was hard, involving lifting the 42-pound bikes into and out of vans many times a day, he recalls. Workers rode in the back of the van alongside poorly secured bikes, and they lacked protective gloves.

Eventually, the pressure to keep deploying bikes to desirable locations led to a new rule, Healy says: Badly damaged bikes would no longer be painstakingly cannibalized for parts but simply thrown out. "I wondered, why are we working to save the Earth with bike share if we're just going to create more trash?" he says. (Ofo says that irreparably damaged bicycles are now recycled.)

The thousands of new bikes in circulation inevitably led to conflict with residents. According to the feedback collected by Seattle's transportation department, car owners are blaming shared bikes for scraping their vehicles. Residents are peeved that unsightly bikes are clogging up sidewalks, parks, and driveways, making the streets less navigable for pedestrians and annoying local businesses. Vandals have been systematically cutting brake cables of bikes from all three companies. Some activists are now trying to oust the bike companies.

Dockless bike share was meant to be the opposite of Uber—green, healthy, equitable, and affordable. But as 10,000 new bikes sprawl over the city, Seattle's latest transport revolution is pitting residents against each other, with the programs' fans and foes arguing over its effect on the city. Seattle residents want access to bikes, that much is clear—but couldn't the bikes be just a little less annoying?

FOR SEATTLE, A bike share program was a gamble from the start. One year earlier, a different attempt at a bike share had failed in this hilly, drizzly city. Between 2014 and 2017, a nonprofit called Pronto ran a docked bike share program in Seattle that got pretty much everything wrong. It installed too few stations, which were located away from bike paths and tourist attractions. Rides were expensive and required a helmet sourced from unappealing communal bins. The program launched just as the weather turned at the end of a glorious summer, dooming it to a slow start.

There are currently no penalties for riders who violate the parking rules for Seattle's dockless bikes, such as blocking a sidewalk. JENNY RIFFLE

When Pronto went bust in early 2016, the city bought it but changed little about its operation. Early last year, Seattle decided to cut its losses and close Pronto down. "Seattle has this love-hate relationship with bikes," says Andrew Glass-Hastings, director of transit and mobility at the Seattle Department of Transportation. Despite a mild climate, decent cycling infrastructure, and traffic jams among the worst in the world, only about 3 percent of commuters used a bike to get to work in 2017. By contrast, 62 percent of commuters in the world's top cycling city, Copenhagen, ride to work. Some Seattle residents blame the rain, others the hills or a tough helmet law.

But maybe they were just waiting for something cheaper, simpler, and more convenient. Around that same time, dockless bike share programs were starting to take off in cities outside the US. Ofo was one of the first companies to find early success in this market. In 2014, members of the Peking University cycling club started a campus project they called Ofo, chosen because the word looks like a cyclist. The idea was that some students would share their bikes with others who would pay to use them, but the founders soon realized their fellow students were more interested in conveniently renting bikes than sharing their own.

In a world of carbon fiber frames and electric motors, Ofo's heavy steel bikes were lumbering and basic. The company's innovation was to package the entire tracking and rental system on the bike itself. A solar-powered unit mounted on the rear wheel powered a cellular link, GPS receiver, and lock. When a user scanned a unique QR code on the bike's frame using a smartphone app, the system would send an unlock signal to the bike and charge the rider's credit card.

Ofo was an immediate success. From its launch in 2015 with 20,000 bikes in Beijing, it now operates more than 10 million bicycles in more than 250 cities globally. If it has taken you 10 minutes to read this far, another quarter of a million people have taken one of its yellow bikes for a spin. That scale and speed of growth caught the attention of Didi Chuxing, Alibaba, and others, who have provided Ofo with more than $2 billion in funding. Soon, other dockless bike share companies popped up: Chinese copycat Mobike, bankrolled by Alibaba rival Tencent, claims to be the world's biggest dockless operator. In the US, dockless bike share companies include Spin, LimeBike, and Jump (which was acquired by Uber).

"When Pronto shut down, we found ourselves the largest city in the country without a bike share system," says Glass-Hastings, which he says created ripe conditions for a dockless company to step in. Once Ofo, Spin, and LimeBike began offering rides costing just $1 for 30 minutes (or longer, in some cases) last July, the program immediately surpassed Pronto both in volume and in trips per bike. As the summer turned to fall, the number of riders climbed steadily, topping 4,000 in October before the rains caused it to slump.

Even with the drizzle, by December, dockless bike share riders had covered more than a million miles. If all those rides had replaced car journeys, around 400 fewer tons of carbon dioxide would have been emitted, to say nothing of the time and emissions saved by relieving congestion on the roads.
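A rough back-of-the-envelope check makes the ~400-ton figure plausible, assuming an emissions factor of roughly 404 grams of CO2 per passenger-vehicle mile (the EPA's commonly cited average; the article does not say which factor it used):

```python
# Sanity-check the "around 400 fewer tons of CO2" estimate.
# Assumption: ~404 g CO2 per passenger-vehicle mile, the EPA's
# oft-cited average for a typical car; the article's exact factor
# is unknown.
GRAMS_PER_MILE = 404
GRAMS_PER_METRIC_TON = 1_000_000

miles_replaced = 1_000_000  # "more than a million miles" by December
tons_avoided = miles_replaced * GRAMS_PER_MILE / GRAMS_PER_METRIC_TON
print(f"{tons_avoided:.0f} metric tons")  # on the order of 400 tons
```

Any plausible per-mile factor in the 350-450 gram range lands near the article's figure, so the estimate holds up even with different assumptions.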

The pilot program is now officially in an evaluation stage, and Glass-Hastings hinted strongly that the transportation department will recommend that it continue when the current permits expire in July. "For the first time, Capitol Hill recently ran out of bikes," he says of a popular downtown neighborhood. "There is a great deal of pent-up demand. It's really rewarding to see people use a system that is a great addition to the transportation system at almost zero cost to the city. It's been a win-win."

THE DOCKLESS BIKES have equally passionate critics, however. As head of Washington state's Department of Transportation from 2001 to 2007, Doug McDonald's job was all about keeping his part of the country moving. Now retired and living in Seattle, McDonald describes himself as a pedestrian activist. He has long been fighting a campaign against a city rule that allows cyclists on sidewalks.

Doug McDonald has been sending several photos a day of badly parked dockless bikes to Seattle's transportation department. JENNY RIFFLE

The arrival of dockless bike share got McDonald even more worked up. "If you watch people on yellow bikes, they tend to be inexperienced cyclists. There is crazy weaving in and out," he says. "And whatever profit comes from lending public rights of way to the bikeshare companies, Seattle gets not a dime. That's why my hair is on fire."

He finds the idea that there's no cost to the city ridiculous. "That ignores the cost to everybody for whom the bicycles are in the way or an annoyance," he says. Full safety statistics have not been made available, but the city says that five collisions with pedestrians were reported during the program. McDonald himself sends about five pictures a day of badly parked dockless bikes to Seattle's transportation department, and he regularly fires off emails debating the finer points of city and state laws on who has the right of way.

McDonald is the most outspoken of the program's critics, but he is not the only one. "The entire city is starting to look like the backyard of ill-behaved 7-year-olds who refuse to pick up after themselves," reads one complaint filed to the city in early September. Another one reads, "It is as if the main priority in Seattle now is to be sure that no one ever has to be farther than a half-block from a bike. Must we remain a nursery school for entitled children?"

Defenders of Seattle's parks—the city's natural treasures (Seattle spends more than three times as much on parks and recreation per capita as New York City or Washington, DC)—also take issue with the bikes. "I see bikes by all three companies parked haphazardly all over [Discovery] Park, blocking trails, crushing native plants, etc.," reads a complaint from December. "What is the city doing to ensure these companies and their customers follow city rules?"

In September, a LimeBike employee got into an altercation outside a downtown café when dropping off bikes, according to another complaint. When a worker from the café confronted him about the bikes parked out front, the LimeBike operative told him that moving a bike was a felony and that he would be charged if he did. (The city told LimeBike that this was incorrect and "unacceptable" behavior.)

The city’s rules on where to park these bikes are strict and clear. Bikes should not be left on corners, driveways, or ramps, nor blocking building entrances, benches, bus stops, or fire hydrants, and they should always leave six feet clear for pedestrians on sidewalks. Companies have two hours to move bikes that are reported as being incorrectly parked. But while the smartphone apps communicate these restrictions to users, nothing prevents riders from leaving a bike wherever they want, nor are there currently any penalties for doing so.

Bikes started to pile up at popular places like the ferry dock and light rail stations, clogging up the walkways and obstructing commuters. They were also showing up in more worrying places. Bikes left in the road were getting hit by cars, and a mangled Spin bike was found near a train track, presumably damaged by a passing locomotive.

Mike Hemion, a diver, has been finding bikes in the water on a regular basis.

They were also getting tossed into lakes. "As soon as I saw the bikes on the roads, that same week we started seeing them in the water," says scuba instructor Mike Hemion, who teaches commercial divers in Seattle's bays and lakes. "Three out of four times when we dive downtown on the waterfront now, there's a bike in the water." In the early days of these water retrievals, workers were expected to fish them out themselves; LimeBike workers even cobbled together a makeshift grappling hook to snare them. Now the three bike share companies just call Hemion.

The three dockless bike startups call Hemion when they need their bikes retrieved. JENNY RIFFLE

McDonald thinks expanding the fleets, and especially the addition of electric bikes (LimeBike has already replaced nearly half of its manual bikes with ebikes), will only make matters worse. "I think there's going to be a bad, bad accident," he says. McDonald wants to tie approval of future dockless permits to a new rule banning all bikes from sidewalks, saying, "The stink I've made is going to get bigger."

FOR THE COMPANIES themselves, gaining significant ridership in Seattle has come at a cost. LimeBike and Spin have significantly less investor cash in their coffers than Ofo. But even the Chinese company's billions might not last forever in an all-out price war. Although all three dockless bike companies in Seattle officially charge at least $1 per ride, the average cost has probably been closer to zero. The companies give out free rides to new customers, and Ofo has only recently started charging riders at all. In Singapore, Ofo riders can even earn cryptocurrency with every ride.

LimeBike has been attempting to lure riders with a range of incentives it calls Bonus Bikes, which unlock free rides and (for a while) entered users into a drawing for a free iPhone X. Spin has shunned price cuts so far—and has probably suffered for it. The smallest of the three, Spin has deployed around 2,000 bikes in Seattle, only half as many as its competitors. "We're not here to ride a hype train," says CEO Derrick Ko. "We want to be a permanent fixture in the US."

To get there, it will need to attract many more riders. A Spin document, submitted for a new bike share scheme, says the company has a target of two rides per bike per day, "a level at which bikes are adequately available to riders and the system can operate profitably." However, city data shows that Seattle users only briefly reached that number, during warmer months. Its average over the pilot was 0.84 rides per bike per day—and the US average for dockless companies was a meager 0.3 last year.

The tight competition has put a strain on the three companies' street teams, like the one run by Sean Healy. They start early in the morning and work well into the night, moving bikes around the city. According to an Ofo document, the company rebalances about 10 percent of its fleet—400 bikes—every day.

One day, while moving a new bike, Healy sliced his fingers on a metal edge. "It was razor sharp and cut right into my hand," he says. "There was no Band-Aid, no first aid kit, no procedure. I got fired up." He started to notice the other ways Ofo seemed to be cutting corners on safety. At that time the company had set a goal of deploying 310 bikes each day, which meant moving dozens of bikes at a time per vehicle. "We were working pretty hard," Healy says. "Lifting, extending your arms, lowering bikes off a truck 60 times. One guy injured his back doing that because we were doing so many. We weren't thinking about ergonomics, we weren't thinking about safety."

Ofo says that all the workers, even the ones like Healy who were recruited elsewhere, were actually employees of the homeless charity, which the company says was responsible for providing their protective equipment. The charity says that it is now providing gloves and other protective gear, although it remains Ofo’s responsibility to ensure the workplace itself is not hazardous.

The more Healy thought about it, the more his view of the company, especially its treatment of its disadvantaged workers, changed. What had seemed like a socially conscious move was looking more and more to him like exploitation. "There was a power dynamic here," says Healy. "This was a billion-dollar company and these guys off the streets couldn't get gloves?"

In December, he staged a walkout to protest the working conditions. Although none of his homeless coworkers joined in (for fear of losing their jobs, according to Healy), social media coverage of his action forced some changes. The employment charity conducted a safety audit of Ofo in January. It would not say whether it found any problems, only that it provided Ofo with recommendations on best safety practices. Ofo complied. Workers no longer ride in the back of vans with unsecured bikes, and the company now supplies personal protective equipment like gloves and offers safety workshops.

Healy, however, has gone back to driving for Uber.

THE DOCKLESS BIKE share companies faced their biggest showdown with the University of Washington. Yet that altercation also led to a possible solution that could help the bikes coexist peacefully on crowded city streets, in Seattle and beyond.

Tensions with the university’s Seattle campus first arose over Ofo workers using its bathrooms, so the company instructed its workers to stop relieving themselves on campus. Then, in November, the university banned Ofo altogether from dropping off bikes on its grounds. Early this year, the school revised its position and decided to start charging companies for the privilege of serving its students and staff. It recently put out a request for proposals for bike share schemes that came with strict contractual terms. They include offering a 50 percent discount on campus and allowing the university to impound any bike "at any time for any reason or for no reason."

A key requirement of the university, which all three companies agreed to, is to set up geo-fences to force bikes to be parked only in permitted areas. If a bike's GPS senses it is being locked outside an authorized area, the app will warn the user. Spin and LimeBike said they might impose a fine on riders who disobey, and all three said they would restrict or even ban habitual bad parkers. Good behavior, on the other hand, could be rewarded with free rides. "We are currently testing a carrots/sticks approach for smart parking, including bonuses for parking in a preferred location," wrote LimeBike in its response to the university. All the companies agreed to a one-hour response time for badly parked bikes, at the university's command.

They also agreed to pay the university for operating on campus, albeit at different rates. LimeBike suggested an annual $5 per bike fee to cover the university's costs for managing the parking infrastructure and other aspects of the scheme. Spin proposed handing over 10 percent of net profits; Ofo was more generous, offering 10 percent of its revenue from rentals on campus. The university says it is still negotiating the contracts but hopes to sign them soon.

The school’s negotiations with the three companies stand in contrast to the adversarial relationships that cities and new transportation networks have fallen into in recent years. If a city tries to impose restrictions on Uber and Lyft, for example, they threaten to leave, as has happened in Austin, Houston, Quebec, and Seattle. (In Austin, the companies followed through on their promise and left.) Banning new services, as San Francisco recently did with electric scooters, means losing any benefits they also bring.

Yet the willingness—even eagerness—of Seattle's three dockless bike sharing companies to enter into expensive and restrictive agreements with the university suggests that another way is possible. Instead of squeezing into gaps between the rules, these companies are entering into operational and financial partnerships with a jurisdiction, albeit a small one.

"It's a good sign that public entities are negotiating with private services that are using the public right of way," says Jennifer Dill, professor of Urban Studies and Planning at Portland State University. "Most cities did not do so with ride-hailing services. But it's a fast-changing landscape. If companies consolidate or the economic model shifts, so could the agreements."

Some people are always going to ride too fast on sidewalks, park in front of bus stops or throw bikes into lakes. Agreements holding companies accountable for the jerks who use their services help discourage those behaviors, while revenue sharing gives cities the resources to deal with other issues that emerge—and defend against accusations of selling off the public realm. Seattle has gathered the data showing that people want dockless bike sharing. Now, it also has evidence that companies are willing to bear responsibility for its drawbacks, putting the city in the perfect position to drive a harder bargain for its permanent program. If it works, it could provide a model for other cities around the world struggling to adapt to new transportation platforms.

But whether Seattle’s program gets extended, revised, or killed outright, Mike Hemion pictures a busy few months ahead. "In the summer, almost every other time we dive, we find a bike," he says. He doesn't seem to mind. "The bikes are pretty cool. I use them all the time myself."

Nuclear Power Won’t Survive Without A Government Handout

By Maggie Koerth-Baker

Nuclear power plants in the U.S. aren’t as profitable as they used to be. Without government support, plants are likely to close. GETTY IMAGES

Once upon a time, if you were an American who didn’t like nuclear energy, you had to stage sit-ins and marches and chain yourself to various inanimate objects in hopes of closing the nation’s nuclear power plants. Today … all you have to do is sit back and wait.

There are 99 nuclear reactors producing electricity in the United States today. Collectively, they’re responsible for producing about 20 percent of the electricity we use each year. But those reactors are, to put it delicately, of a certain age. The average age of a nuclear power plant in this country is 38 years old (compared with 24 years old for a natural gas power plant). Some are shutting down. New ones aren’t being built. And the ones still operational can’t compete with other sources of power on price. Just last week, several outlets reported on a leaked memo detailing a proposed Trump administration plan directing electric utilities to buy more from nuclear generators and coal plants in an effort to prop up the two struggling industries. The proposal is likely to butt up against political and legal opposition, even from within the electrical industry, in part because it would involve invoking Cold War-era emergency powers that constitute an unprecedented level of federal intervention in electricity markets. But without some type of public assistance, the nuclear industry is likely headed toward oblivion.

“Is [nuclear power] dying under its own weight? Yeah, probably,” said Granger Morgan, professor of engineering and public policy at Carnegie Mellon University.

Morgan isn’t pleased by this situation. He sees nuclear energy as a crucial part of our ability to reduce the risks of climate change because it is our single largest source of carbon emissions-free electricity. Morgan has researched what the U.S. could do to get nuclear energy back on track, but all he’s come up with is bad news (or good news, depending on your point of view).

The age of the nuclear fleet is partly to blame. That’s not because America’s nuclear reactors are falling apart — they’re regularly inspected, and almost all of them have now gone through the process of renewing their original 40-year operating licenses for 20 more years, said David McIntyre, a public affairs officer at the U.S. Nuclear Regulatory Commission. A few, including the Turkey Point Nuclear Generating Station in Florida, have even put in for a second round of renewals that could give them the ability to operate through their 80th birthdays.

Instead, it’s the cost of upkeep that’s prohibitive. Things do fall apart — especially things exposed to radiation on a daily basis. Maintenance and repair, upgrades and rejuvenation all take a lot of capital investment. And right now, that means spending lots of money on power plants that aren’t especially profitable. Historically, nuclear power plants were expensive to build but could produce electricity more cheaply than fossil fuels, making them a favored source of low-cost electricity. That changed with the fracking boom, Morgan told me. “Natural gas from fracking has gotten so cheap, [nuclear plants] aren’t as high up in the dispatch stack,” he said, referring to the order of resources utilities choose to buy electricity from. “So many of them are now not very attractive economically.”

Meanwhile, new nuclear power plants are looking even less fetching. Since 1996, only one plant has opened in the U.S. — Tennessee’s Watts Bar Unit 2 in 2016. At least 10 other reactor projects have been canceled in the past decade. Morgan and other researchers are studying the economic feasibility of investment in newer kinds of nuclear power plants — including different ways of designing the mechanical systems of a reactor and building reactors that are smaller and could be put together on an assembly line. Currently, reactors must be custom-built to each site. Their research showed that new designs are unlikely to be commercially viable in time to seriously address climate change. And in a new study that has not yet been published, they found that the domestic U.S. market for nuclear power isn’t robust enough to justify the investments necessary to build a modular reactor industry.

Combine age and economic misfortune, and you get shuttered power plants. Twelve nuclear reactors have closed in the past 22 years. Another dozen have formally announced plans to close by 2025. Those closures aren’t set in stone, however. While President Trump’s plan to tell utilities that they must buy nuclear power has received criticism as being an overreach of federal powers, states have offered subsidies to keep some nuclear power plants in business — and companies like Exelon, which owns 22 nuclear reactors across the country, have been happy to accept them. “Exelon informed us that they were going to close a couple plants in Illinois,” McIntyre said. “And then the legislature gave them subsidies and they said, ‘Never mind, we’ll stay open.’”

So intervention can work to keep nuclear afloat. But as long as natural gas is cheap, the industry can’t do without the handouts.

Maggie Koerth-Baker is a senior science writer for FiveThirtyEight.

Study of Treated Wastewater Detects Chemicals from Drug Production Facilities

Nationwide study shows how manufacturing facilities can increase levels of medical compounds in waterways.

Thursday, June 14, 2018 - 14:15

Anna Katrina Hunter, Contributor

(Inside Science) -- With many a daily flush of a toilet, a dilute cocktail of chemical compounds we excrete, from hormones to antidepressants to caffeine to nicotine, is sluiced into the sewers of American life, eventually making its way into municipal wastewater treatment plants.

Most facilities typically process and remove the majority of sediment and organic solids in the waste stream, before releasing the treated water out into the ocean or waterways.

However, they don't remove everything. In the last 15 years, there have been numerous reports in the United States about pharmaceutical chemicals found in waterways and drinking water supplies. Wastewater treatment plants are a well-known source for the discharge of these chemicals into the environment.

Scientists from the U.S. Geological Survey recently visited 20 wastewater treatment plants in nine states and Puerto Rico, to learn whether the drug manufacturers themselves are a major source of contamination in waterways.

Although there were sparse, early indications more than 30 years ago that manufacturers were a source of pharmaceuticals in waterways, it wasn’t until the 1990s, when estrogen in liquid sewage was found to cause the feminization of fish, that interest grew in understanding how pharmaceuticals reach the environment.

“We were seeing papers actually coming out of India, where they were seeing astronomically high concentrations [of pharmaceuticals],” said Dana Kolpin, a research hydrologist with the U.S. Geological Survey who has worked in water quality research for the past 30 years.

Kolpin said that there was initial skepticism that the same problem existed here in the United States, but their findings suggest otherwise. While levels were not quite as elevated as in India, the USGS study found "pretty hefty concentrations" of pharmaceutical chemicals coming from manufacturers, he said.

In a 2010 study in New York state, USGS scientists found that two municipal wastewater treatment plants receiving waste from pharmaceutical manufacturing facilities had concentrations of opioids in treated water up to 1,000 times higher than in the discharge from wastewater plants processing the waste stream coming from domestic households.

“We’ve gotten to the point where it’s not breaking news to find pharmaceuticals in this river or that stream or lake,” said Tia-Marie Scott, a physical scientist with the U.S. Geological Survey. They can even be found in pristine environments, she added.

Scientists recently embarked on a new, wider study to identify the sources of such pollution nationwide.

The USGS scientists tested the treated discharge coming from each of 20 wastewater treatment plants in nine states across the U.S. and Puerto Rico over a 24-hour period once a year from 2012 to 2014, for 200 pharmaceuticals, nonpharmaceuticals and other chemicals of concern.

Thirteen of the plants received wastewater from pharmaceutical manufacturers and domestic households, while six received water from domestic households only. The remaining plant initially qualified for the former category, but a nearby factory closed during the study.

The team selected their study sites based on nearby manufacturing of seven commonly used pharmaceuticals, including an antidepressant, an opioid, an antibiotic, an anticonvulsant, a synthetic estrogen, a breast cancer drug and an immunosuppressant.

Of those seven common pharmaceutical chemicals, only bupropion, the active ingredient in an antidepressant drug known as Wellbutrin, was found at high concentrations in the treated discharge coming from plants receiving waste from pharmaceutical manufacturers.

The concentrations of bupropion were as much as 452 times higher in the treated discharge from one of these wastewater plants as compared to even the highest concentration found in discharge from plants receiving only domestic wastewater.

This discharged water carried nearly 90 micrograms of bupropion per liter, which is roughly 35 times the anticipated therapeutic dose for fish such as rainbow trout. That much bupropion, even if subsequently diluted in the waterways, could impact the mating behavior and survival of fish populations.

The work was published online in Science of the Total Environment in April.

Of the 200 compounds tested, the scientists also found elevated levels of as many as 33 less commonly used pharmaceuticals in the treated water of plants receiving waste from drug manufacturers.

The researchers also found much higher concentrations of some pharmaceuticals in the discharge from plants receiving wastewater from drug manufacturers than from plants with no wastewater coming from these sources.

One such pharmaceutical, an antifungal medication called fluconazole, was present in the discharge of all 20 wastewater treatment plants. But the concentration was up to 3,000 times higher in the treated discharge from one plant receiving waste from a nearby drug manufacturer.

The researchers suggest that when drug manufacturers produce large amounts of one particular drug, it can produce these dramatic concentrations in the eventual wastewater discharge.

“This is another interesting study from the USGS group,” said Bryan Brooks, an environmental scientist at Baylor University, in Waco, Texas, and an expert in water quality and toxicology. “The elevated levels as they note in their conclusions really call for additional research to determine what consequences might be presented to organisms living in waterways receiving these effluents.”

The notion that any one treatment could remove all pharmaceuticals from wastewater discharge is “pie in the sky,” said Kolpin. Beyond the millions of dollars such treatment would cost, he said, scientists and managers should figure out which compounds are the most threatening and prioritize accordingly.

“All of these pharmaceuticals, and almost every emerging contaminant that we are looking at these days are not regulated,” said Tia-Marie Scott. “There are no thresholds for what is and what is not considered safe.”

Here’s another climate change concern: Superheated bugs in the soil, belching carbon

By Stuart Leavenworth

Updated August 01, 2018 10:36 AM


Rising temperatures that are contributing to wildfires and droughts are also changing the world’s soil so that it pumps out more carbon dioxide, a “feedback loop” that could aggravate climate change, according to a study published Wednesday in the journal Nature.

Increased heat is activating microbes in the soil, converting organic matter into carbon dioxide at a heightened rate, much like a compost heap that decomposes quickly when exposed to a lengthy heat wave.

Scientists say this feedback loop has implications for understanding the build-up of greenhouse gases in the atmosphere. In the past, many researchers assumed that increased carbon dioxide would trigger a boost in growth of forests and vegetation that would capture carbon and counteract impacts of more rapid soil decay.

This week’s study casts doubt on that theory.

“One thing we show here is not only a higher rate of respiration from the soil, but it is rising relative to the growth of vegetation,” said Ben Bond-Lamberty, the study’s lead author and a forest ecologist at the Pacific Northwest National Laboratory, a U.S. Department of Energy facility in Richland, Wash.

Numerous scientists say the atmospheric buildup of carbon dioxide, primarily from industrial emissions, is a prime driver of recent record-hot conditions that have contributed to deadly wildfires, from Northern California to Greece. Charting future impacts not only involves tracking man-made emissions but understanding how ecosystems absorb carbon and are affected by warming temperatures.

In the last year, fires have devastated neighborhoods in the Northern California wine country city of Santa Rosa, the Southern California beach city of Ventura and, now, the inland city of Redding. Hotter weather from changing climates is drying out vegetation, creating more intense fires that spread quickly from rural areas to city subdivisions, climate and fire experts say. But they also blame cities for expanding into previously undeveloped areas susceptible to fire.

Forests and grasslands are obvious reservoirs of carbon, but the planet’s soils are actually larger storehouses — unseen and out of mind. These soils include a mix of living roots and decaying layers of leaves, twigs and other matter, their decomposition affected by temperature, amounts of available oxygen and the work of bacteria, fungi and other microbes.

Scientists have long known that certain soils worldwide were increasing outputs of carbon dioxide, but this week’s study is the first to synthesize all of that research and provide a global estimate of the increase. The Nature study — a product of the Joint Global Change Research Institute, a partnership between the University of Maryland and the Pacific Northwest laboratory — found that conversion of soil carbon to carbon dioxide has increased 1.2 percent worldwide over a quarter century, from 1990 through 2014.

While 1.2 percent may not sound like a significant increase, soil respiration worldwide emits about five times more carbon dioxide than human activities, said Bond-Lamberty. So “a small percentage of a large number is still a very large number,” he said.
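Bond-Lamberty’s point can be checked with back-of-the-envelope arithmetic using only the two figures reported in the article (the roughly 5-to-1 ratio of soil respiration to human emissions, and the 1.2 percent rise); the short script below is merely an illustration of that calculation, not part of the study:

```python
# Illustrating "a small percentage of a large number is still a very
# large number", using only the ratios quoted in the article.

# Soil respiration emits roughly five times more CO2 per year than
# human activities, per Bond-Lamberty.
soil_to_human_ratio = 5.0

# Soil respiration rose 1.2 percent worldwide from 1990 through 2014.
soil_increase = 0.012

# The increase alone, expressed as a fraction of annual human emissions:
increase_vs_human = soil_to_human_ratio * soil_increase
print(f"{increase_vs_human:.0%} of annual human CO2 emissions")  # prints "6%"
```

In other words, even a 1.2 percent rise in soil respiration is comparable to adding several percent on top of all human emissions each year.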

The paper’s authors relied upon a database that drew from 1,500 soil respiration studies worldwide, as well as data from FLUXNET, an international network of sites that monitor the interactions of carbon from land to air. The research was funded by DOE’s Office of Science.

“This study asks the question on a global scale,” said Vanessa Bailey, a soil scientist at the Pacific Northwest laboratory who contributed to the research. “We are talking about a huge quantity of carbon.”

Not all that carbon is the same. Some of it is new, the product of recently fallen leaves and other vegetation. Some of it has built up over time, ancient and sequestered.

“Imagine a compost bin that you don’t turn over completely,” said Bond-Lamberty. “You end up with some bottom layers where decomposition doesn’t happen. Then, all of a sudden one summer, the temperature spikes and some of those long stored layers will get decomposed.”

Numerous scientists worldwide are studying climate change impact on soils, from the Arctic tundra to the loose layers of the tropics. At least one study has suggested that supercharged microbes are robbing nutrients from the soil, which could ultimately stunt the growth of vegetation. These studies contradict the claims of climate change doubters who say that industrial increases of CO2 are “greening the planet,” benefiting humankind with faster-growing crops and vegetation.

In a blog post last year, the Competitive Enterprise Institute, which has received funding from fossil fuel industries, celebrated a study that documented how plants have increased their uptake of carbon, because of increased CO2 emissions. “So-called carbon pollution has done much more to expand and invigorate the planet’s greenery than all the climate policies of all the world’s governments combined,” wrote a senior fellow for the think tank.

Bond-Lamberty and his co-authors acknowledge that uncertainties remain about carbon emissions from soil, including how emissions vary around the globe, as well as the shortage of data from the poles and the tropics. While their study documents soil emissions rising faster than what global vegetation is absorbing, that finding is likely to be scrutinized.

“Our result does not square well with a number of other studies suggesting the land is a really robust carbon sink,” he said. “So the question is: Are those studies overestimating the strength of the land carbon sink? Or are the studies we used not representative of what is happening globally? How do we reconcile that?”

Plastics Emit Greenhouse Gases as They Degrade

The materials are a previously unaccounted-for source of methane and ethylene, researchers find.

Aug 2, 2018

Shawna Williams

Need another reason to ditch straws? A study published yesterday (August 1) in PLOS ONE reports that plastics—ranging from construction materials to plastic bags—release the greenhouse gases methane and ethylene after being exposed to sunlight and beginning to degrade.

“Our results show that plastics represent a heretofore unrecognized source of climate-relevant trace gases that are expected to increase as more plastic is produced and accumulated in the environment,” the study authors write in their paper.

The researchers, all based at the University of Hawai‘i at Manoa, tested the emissions of seven types of plastics as they degraded: polyethylene terephthalate, polycarbonate, high-density polyethylene and low-density polyethylene (LDPE), acrylic, polypropylene, and polystyrene. All gave off methane and ethylene in the days after being exposed to sunlight, they found, but polyethylene, which is used to make plastic bags, was the worst offender.

See “Plastic Pollutants Pervade Water and Land”

Jennifer Provencher, a plastic pollution researcher at Acadia University in Canada who was not involved in the work, tells Reuters that the results pointed to “another piece of evidence suggesting that losing plastic to the environment is not good.”

The authors note that plastics degradation has not been accounted for in climate change research. “Based on the rates measured in this study and the amount of plastic produced worldwide CH4 [methane] production by plastics is likely to be an insignificant component of the global CH4 budget,” they write. “However, for the other hydrocarbon gases with much lower global emissions to the atmosphere compared to CH4, the production from the plastics might have more environmental and global relevance.”

Heat Wave Reveals German Mines, Grenades and Live Explosives Submerged Since World War II

By David Brennan On 8/3/18 at 5:13 AM

As Europe wilts under its most intense heat wave in many years, residents in the eastern part of Germany are facing an additional hazard—World War II explosives uncovered by retreating water.

German police have warned people living in the states of Saxony-Anhalt and Saxony that low water levels in the Elbe River are uncovering deadly treasures in multiple locations, Deutsche Welle reported.

So far, 22 grenades, mines or other types of explosives have been found this year. Grit Merker, a spokeswoman for Saxony-Anhalt police, said the authorities “ascribe that to the low water level. That's pretty clear.”

This week, the water level of the Elbe fell as low as 20 inches in Magdeburg, just above the historic low of 18.8 inches recorded in 1934. July was Germany’s hottest month since records began, and July 31 was the warmest day on record, with temperatures hitting 103 degrees in Saxony-Anhalt.

When explosives are found, Merker said, people are always warned to report the materials and to stay well away from them. Explosives specialists can then either transport the deadly finds or—if they are too unstable—detonate them in place. Though 70 years on the riverbed has covered the bombs in a protective layer of sediment, police always urge caution first and foremost.

But some people apparently do not quite understand the danger of military-grade explosives. Saxony police spokeswoman Wibke Sperling told Deutsche Welle, “Today there was a photo in a newspaper of someone holding up pieces in their hands. That is a classic example of what gives weapons disposal experts a fright.”

Unexploded ordnance is common in Germany, where some of the most brutal fighting of World War II took place. Remnants of the most destructive war in human history still litter the country. From the air alone, British and American air forces dropped 2.7 million tons of bombs on Germany between 1940 and 1945, and that is before ground combat on two fronts is taken into account.

The Elbe stretches from the Czech Republic all the way to the North Sea coast, traversing battlefields where millions fought. Like the rest of Germany, it carries mementoes of the country’s tragic history, and many munitions were dumped into the river at the close of the war. On Saturday, for example, two anti-tank mines found in the Elbe were blown up as they were considered too dangerous to move.

While urging caution, Merker said the discoveries were not a surprise. “I don't think people from the weapons disposal service find it a glaring anomaly,” she said.

Around 5,500 unexploded munitions are discovered in Germany each year, with a daily average of 15. Most are bombs dropped by aircraft. It is estimated there are around 100,000 unexploded bombs buried around Germany, each with deadly potential. Police often have to evacuate large numbers of people while uncovered bombs are cleared. The largest post-war evacuation took place in Frankfurt in 2017, when 70,000 people had to leave their homes to allow authorities to defuse a 1.4 ton British bomb.

Green Energy Producers Just Installed Their First Trillion Watts

The next trillion will cost $1.2 trillion by 2023, almost half the price tag of the first, according to Bloomberg New Energy Finance

By Jeremy Hodges

August 2, 2018

Global wind and solar developers took 40 years to install their first trillion watts of power generation capacity, but the second trillion may be finished within just five years.

That’s the conclusion of research by BloombergNEF, which estimated the industry reached the 1-terawatt milestone sometime in the first half of the year. That’s almost as much generation capacity as the entire U.S. power fleet, although renewables work less often than traditional coal and nuclear plants and therefore yield less electricity over time.

The findings illustrate the scale of the green energy boom, which has drawn $2.3 trillion of investment to deploy wind and solar farms at the scale operating today. BloombergNEF estimates that the falling costs of those technologies mean the next terawatt of capacity will cost about half as much – $1.23 trillion – and arrive sometime in 2023.

"Hitting one terawatt is a tremendous achievement for the wind and solar industries, but as far as we’re concerned, it’s just the start," said Albert Cheung, BloombergNEF’s head of analysis in London. "Wind and solar are winning the battle for cost-supremacy, so this milestone will be just the first of many."

The world had a total of about 6.2 terawatts of installed capacity in 2016, about 1 terawatt of that being coal plants in China, according to the research group. Like all milestones, reaching 1 terawatt is an arbitrary mark that only scratches the surface of the debate about how much renewables will contribute to the world's energy system.

Each power plant works at a different "capacity factor," a measure capturing both the efficiency of the facility in generating electricity and how often it works. On average, wind farms have a capacity factor of about 34 percent worldwide, meaning they work about a third of the time, according to BloombergNEF. Some of the best sites have factors above 60 percent. For solar photovoltaics that track the sun, those readings range from 10 percent in the U.K. to 19 percent in the U.S. and 24 percent in Chile's Atacama desert. By comparison, coal plants have a 40 percent capacity factor and nuclear sometimes double that.
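The arithmetic behind a capacity factor can be sketched in a few lines of Python (an illustration only; the capacity factors are the worldwide averages quoted above):

```python
# Energy a plant actually delivers = installed capacity x capacity factor x hours.
HOURS_PER_YEAR = 8760

def annual_gwh(capacity_gw: float, capacity_factor: float) -> float:
    """Gigawatt-hours delivered per year by a given installed capacity."""
    return capacity_gw * capacity_factor * HOURS_PER_YEAR

# Capacity factors cited in the article (BloombergNEF averages).
for source, cf in [("wind, global avg", 0.34), ("coal", 0.40), ("solar PV, U.S.", 0.19)]:
    print(f"{source}: {annual_gwh(1.0, cf):,.0f} GWh per year per GW installed")
```

A gigawatt of average wind capacity thus yields well under half of what a plant running flat out would, which is why a terawatt of renewables is not directly comparable to a terawatt of conventional generation.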

Even so, the terawatt of installed capacity for renewables marks substantial growth for an industry that barely existed at the start of the century. More than 90 percent of all that capacity was installed in the past 10 years, reflecting incentives that Germany pioneered in the early 2000s that made payouts for green power transparent for investors and bankers alike.

Asian nations absorbed 44 percent of the new wind and 58 percent of solar developments to date, with China accounting for about a third of all those installations.

Wind made up 54 percent of the first terawatt, but solar is expected to overtake wind in early 2020. China has led the world in installing solar power over the last five years, holding 34 percent of global solar capacity, and it’ll continue to be the world’s largest market for both power sources, reaching 1.1 terawatts in the country by 2050.

"As we get into the second and third terawatts, energy storage is going to become much more important," Cheung said. "That’s where we see a lot of investment and innovation right now."

A big analysis of environmental data strengthens the case for plant-based diets

The same foods grown in various ways can also have less impact on the planet, scientists say


FOOD CHOICE MATTERS A new study calculates the bonus for the planet of choosing more foods from plants.

From beef to beer, coffee to chocolate, there are environmental costs in what humanity chooses to eat and drink. Now a new study that quantifies the impact on the planet of producing and selling 40 different foods shows how these choices make a difference.

Agricultural data from 38,700 farms plus details of processing and retailing in 119 countries show wide differences in environmental impacts — from greenhouse gas emissions to water used — even between producers of the same product, says environmental scientist Joseph Poore of the University of Oxford. The amount of climate-warming gases released in the making of a pint of beer, for example, can more than double under high-impact production scenarios. For dairy and beef cattle combined, high-impact providers released about 12 times as many greenhouse gases as low-impact producers, Poore and colleague Thomas Nemecek report in the June 1 Science.

Those disparities mean that there is room for high-impact producers to tread more lightly, Poore says. If consumers could track such differences, he argues, purchasing power could push for change.

The greatest changes in the effect of a person’s diet on the planet, however, would still come from choosing certain kinds of food over others. On average, producing 100 grams of protein from beef leads to the release of 50 kilograms of greenhouse gas emissions, which the researchers calculated as a carbon-dioxide equivalent. By comparison, 100 grams of protein from cheese releases 11 kilograms in production, from poultry 5.7 kilograms and from tofu 2 kilograms.
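The relative footprints of those protein sources can be checked with quick arithmetic (a sketch using the per-100-gram figures reported above):

```python
# kg of CO2-equivalent released to produce 100 g of protein, per the study.
emissions_kg_co2e = {"beef": 50.0, "cheese": 11.0, "poultry": 5.7, "tofu": 2.0}

beef = emissions_kg_co2e["beef"]
for food in ("cheese", "poultry", "tofu"):
    kg = emissions_kg_co2e[food]
    # Ratio of beef's emissions to each alternative's.
    print(f"beef emits about {beef / kg:.0f}x as much CO2e as {food}")
```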

Proteins’ consequences

Proteins are not equal in the amount of climate-warming gases, classified as CO2 equivalent, emitted during the farm-to-retail production chain. Wide variation exists even for a single type of protein depending on the producer, a new study finds.

Replacing meat and dairy foods from producers with above-average environmental effects with plant-based products could make a notable difference in greenhouse gas emissions. If cuts came from these higher-impact suppliers, replacing half of each kind of animal product with something from a plant could reduce food’s share of emissions by 35 percent. That’s not too far from the 49 percent drop that could be achieved if the whole world, somehow, went vegan.

The case for switching to a plant-based diet was already pretty powerful, says Ron Milo of the Weizmann Institute of Science in Rehovot, Israel, who studies cell metabolism and environmental sustainability. The new data “make it even stronger, which is an important thing given how strongly we tend to adhere to our food choices,” he says.

Production matters

Depending on the food, greenhouse gas emissions can differ considerably between low-impact and high-impact producers for a variety of foods, including dark chocolate and rice.

In their study, Poore and Nemecek, of the Swiss government research organization Agroscope in Zurich, also considered the amounts of water and land used as well as nutrient runoff and air pollution created from food production. For such an unusually broad analysis, the researchers crunched the numbers from 570 studies of what are called life-cycle assessments for 40 kinds of food. These studies calculated the environmental impacts of the whole process from growing or processing to transporting and retailing each food.

Producing food overall accounts for 26 percent of global climate-warming emissions, and takes up about 43 percent of the land that’s not desert or covered in ice, the researchers found. Out of the total carbon footprint from food, 57 percent comes from field agriculture, livestock and farmed fish. Clearing land for agriculture accounts for 24 percent and transporting food accounts for another 6 percent.

After the first year of putting the study together, Poore himself gave up eating animal products.

CO2 Levels Break Another Record, Exceeding 411 Parts Per Million

Two years of CO2 measurements at the Mauna Loa Observatory, showing how seasonal highs and lows are steadily rising.

Levels of carbon dioxide in the atmosphere exceeded 411 parts per million (ppm) in May, the highest monthly average ever recorded at the Mauna Loa Observatory in Hawaii, home to the world’s longest continuous CO2 record. In addition, scientists found that the rate of CO2 increase is accelerating, from an average 1.6 ppm per year in the 1980s and 1.5 ppm per year in the 1990s to 2.2 ppm per year during the last decade.
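A naive linear extrapolation from those figures shows why the acceleration matters (illustrative only; it ignores seasonal cycles and any future change in emissions):

```python
# Growth rates cited above: 1.6 ppm/yr in the 1980s, 1.5 in the 1990s,
# and 2.2 over the last decade. Project forward from the May 2018 peak.
may_2018_ppm = 411.0
recent_rate = 2.2   # ppm per year

for years in (5, 10, 20):
    print(f"+{years} years: about {may_2018_ppm + recent_rate * years:.0f} ppm")
```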

“Many of us had hoped to see the rise of CO2 slowing by now, but sadly that isn’t the case,” said Ralph Keeling, director of the University of California San Diego’s Scripps CO2 Program, which maintains the Mauna Loa record with the National Oceanic and Atmospheric Administration. “It could still happen in the next decade or so if renewables replace enough fossil fuels.”

Annual CO2 concentrations ebb and flow depending on the season. The lowest levels are generally recorded in late August or early September, when vegetation growth in the Northern Hemisphere is at its peak. The highest concentrations are generally measured in May, following winter months with little or no plant growth and just before the springtime boost in productivity.

From 2016 to 2017, the global CO2 average increased by 2.3 ppm — the sixth consecutive year-over-year increase greater than 2 ppm, according to Scripps researchers. Prior to 2012, back-to-back increases of 2 ppm or greater had occurred only twice.

“CO2 levels are continuing to grow at an all-time record rate because emissions from coal, oil, and natural gas are also at record high levels,” Pieter Tans, lead scientist of NOAA’s Global Greenhouse Gas Reference Network, said in a statement. “Today’s emissions will still be trapping heat in the atmosphere thousands of years from now.”

A 1,000-year flood in Maryland shows the big problem with so much asphalt

By Greta Jochem on Jun 5, 2018

The rain started to fall in Ellicott City, Maryland on the afternoon of May 27. Nearby tributaries of the Patapsco River were already dangerously swollen from last month’s steady precipitation. The storm intensified, and floodwaters soon tore through Ellicott City’s main street, submerging the first floors of buildings, sweeping away cars, and killing at least one person.

The storm was a so-called “1,000 year flood,” meaning it had a 0.1 percent chance of occurring this year. But this “exceptionally rare” event is deja vu for residents — they’re still picking up the pieces from a similar flood that destroyed the area back in July 2016.

After that big flood, Robin Holliday spent months rebuilding her business, HorseSpirit Arts Gallery. She didn’t expect a flood like that to happen again, but she also didn’t think the proposed watershed management plan was strong enough. Discouraged, she started to think about leaving. The recent flood solidified her decision.

So what’s behind the propensity for floods in Ellicott City? Part of the problem is its vulnerable location: the town lies at the foot of a hill where river branches meet the Patapsco River. And, of course, climate change makes storms wetter and increases the frequency of severe, record-breaking weather. But there’s another thing people are pointing out: concrete.

When hard, impermeable concrete replaces absorbent green spaces, it’s much easier for floodwaters to overwhelm stormwater drainage. “That’s what happened in Ellicott City,” says Marccus Hendricks, an assistant professor at the University of Maryland School of Architecture, Planning, and Preservation.

In Ellicott City, development has flourished.

“Nearly one-third of the Tiber-Hudson sub-watershed that feeds into historic Ellicott City is now covered by roads, rooftops, sidewalks and other hard surfaces that don’t absorb water,” the Baltimore Sun wrote in 2016.

In a press release, the Sierra Club’s Maryland Chapter called for a stop to development in the Tiber-Hudson watershed: “We may not have control over severe weather events (except by fighting climate change), [but] we can take ownership over the role that development played in this disaster.”

At a recent press conference, a local county official said that Howard County, home to Ellicott City, has been taking steps to prepare for more floods.

“We’re focusing on making sure that what has been approved is being done by the code and by law, making sure that stormwater regulations are being abided by,” said Allan Kittleman, the Howard County executive. Since the flood in 2016, he said the county has designed and engineered more stormwater retention facilities, but larger projects will take time.

This is far from the first time that development and asphalt have had a violent run-in with climate change. Last summer, Hurricane Harvey drenched sprawling Houston with trillions of gallons of water and caused $125 billion in damage. The area saw a 25 percent increase in paved surfaces between 1996 and 2011, according to Texas A&M professor Samuel Brody. Brody found that every square meter of Houston’s pavement cost about $4,000 more in flood damage.

And, rapidly developing or not, our cities are full of these paved surfaces. In the majority of the country, surfaces like pavement or brick make up just 1 percent of the land. Yet in cities, hardscapes account for upwards of 40 percent of land area.

Environmental change coupled with development will likely make this issue one of major national importance, Brody tells Grist.

“Every week, there’s some urbanized area that floods. We look up and say, ‘Oh that’s never happened before and it’s never going to happen again.’ But if you look at the big picture, it’s happening all the time with increasing severity.”

Oil trains on the rebound to Northeast refineries, federal data show

Curtis Tate, Staff Writer, @tatecurtis Published 6:18 a.m. ET June 5, 2018

After a steady decline throughout 2017, the volume of crude oil moving on trains to Northeast refineries is on the rise, federal data show.

After peaking in 2014 and 2015, rail shipments of crude oil from Midwestern sources to the Northeast slowed to a trickle late last summer. Since then, numbers from the U.S. Energy Information Administration show a rebound.

Northeast refiners consumed 3.1 million barrels of Midwestern oil shipped by rail in March, according to EIA data, the most since January 2017, and about 10 times what they received in August and September.

The most recent numbers are still well below the peak of 13.8 million barrels shipped to the Northeast by rail in November 2014. Each barrel contains 42 gallons. An entire unit train of crude oil can carry about 3 million gallons.
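Those volume figures can be cross-checked with the conversion the article gives (a rough sketch; the 3-million-gallon train capacity is the article's round number):

```python
GALLONS_PER_BARREL = 42          # per the article
TRAIN_CAPACITY_GALLONS = 3e6     # one unit train of crude, roughly

march_barrels = 3.1e6            # Midwestern crude to Northeast refiners in March
march_gallons = march_barrels * GALLONS_PER_BARREL
print(f"{march_gallons:,.0f} gallons, or roughly "
      f"{march_gallons / TRAIN_CAPACITY_GALLONS:.0f} unit trains that month")
```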

Northeast refineries also continue to receive crude oil by rail from Canada, EIA data show. They consumed 881,000 barrels of Canadian oil in March, after receiving nearly none in June and July of 2017.

Crude by rail shipments declined after 2015 when benchmark oil prices plunged from more than $100 a barrel to below $30. In recent weeks, the price has been hovering around $70 a barrel.

Northeast refiners had turned away from North American oil, importing crude instead from the North Sea and West Africa.

The construction of new pipelines in the Midwest caused a precipitous drop in oil shipments by rail to the Gulf Coast, home to about half of U.S. refining capacity.

A series of fiery derailments from 2013 to 2016 across North America prompted a major overhaul in rail safety on both sides of the northern border.

Next month will mark the fifth anniversary of the massive oil train derailment and fire in Lac-Megantic, Quebec, that killed 47 people and leveled the lakeside town's center.

The crashes resulted in stronger tank cars, more training for emergency responders, improved inspections of track and equipment and new testing requirements for the oil itself.

Still, railroads have fought against greater transparency for shipments of flammable liquids in several states.

Last summer, then-Gov. Chris Christie vetoed legislation that would have imposed greater disclosure requirements on railroads shipping crude oil through New Jersey after a lobbying effort from the state's biggest freight haulers.

One CSX executive who lobbied Christie's office in opposition to the bill, Howard "Skip" Elliott, became chief of the U.S. Pipeline and Hazardous Materials Safety Administration, the agency in Washington that regulates the shipment of crude oil by rail.

Schumer demands tougher safety rules for oil-by-train shipments in NY

Updated 1:47 PM; Posted 1:47 PM

By Rick Moriarty

Syracuse, N.Y. -- U.S. Sen. Charles Schumer is renewing his calls for stronger safety standards for the shipment of highly explosive crude oil through New York by rail, accusing the Trump administration of dragging its feet on new safety rules.

Schumer, D-NY, demanded in a letter to the U.S. Department of Transportation and the Department of Energy this week that they quickly finalize new federal standards for the shipment of crude oil by train.

He said current law allows dangerous Bakken crude oil from North Dakota to be shipped by rail to refineries in the Northeast without first being stabilized, making explosions more likely.

"Oil tank cars go through our state with regular frequency, and they pass through, because of the way the tracks were laid out 100 years ago, some of our major population centers - Buffalo, Rochester, Syracuse, Utica and Albany, and even Plattsburgh and east of Glens Falls," he said in a teleconference with reporters. "And then they pass down through the Hudson Valley, down into Rockland County before they head to New Jersey."

The amount of oil shipped by train to refineries in the Northeast is increasing after slowing last year. Northeast refineries processed 3.1 million barrels of Midwestern oil shipped by rail in March, the most since January 2017, according to the U.S. Energy Information Administration.

The safety of oil-by-rail shipments became an issue when a train carrying 77 tank cars full of Bakken crude oil from North Dakota derailed and exploded in the town of Lac-Megantic in Quebec province, killing 47 people and destroying 30 buildings in July 2013.

Though no such disaster has occurred in New York, "it will sooner or later if we don't do something," Schumer said.

"The bottom line is, any time you are transporting volatile chemicals, there is a risk of explosion," he said. "Things like safer tank cars, better braking, lower speed limits - they all help make the rails safer."

The Department of Transportation and the Department of Energy are studying how crude oil properties affect its combustibility in rail accidents, information that could be used to set new safety rules. However, Schumer said the study is taking too long to complete.

"I am calling on the DOT today to stop dragging their feet and immediately release the necessary study," he said. "The new administration, particularly with some of the people in charge, has been far less open to having the government step in and do some of this."

He said light Bakken crude from North Dakota is more flammable than heavier crude oil but can be made safer through a heating process that forces out some of its more volatile gases before shipment.

"Judge Throws Out New York Climate Lawsuit"

"A federal judge has dismissed a suit brought by the city that would have forced fossil fuel companies to pay for some costs of climate change."

"A federal judge has rejected New York City’s lawsuit to make fossil fuel companies help pay the costs of dealing with climate change.

Judge John F. Keenan of United States District Court for the Southern District of New York wrote that climate change must be addressed by the executive branch and Congress, not by the courts.

While climate change “is a fact of life,” Judge Keenan wrote, “the serious problems caused thereby are not for the judiciary to ameliorate. Global warming and solutions thereto must be addressed by the two other branches of government.”"

California hit its climate goal early — but its largest source of pollution keeps rising


JUL 22, 2018 | 3:00 AM

California hit its target to reduce greenhouse gas emissions below 1990 levels four years early, a milestone regulators and environmentalists are cheering as more proof that you can cut pollution while growing the economy.

But a closer look at data released by the state Air Resources Board shows California’s planet-warming emissions aren’t declining across the board.

While emissions from electricity generation have plunged, transportation pollution is rising, and other key industries are flat.

That uneven progress shows the big challenges that loom as California advances toward its more ambitious goal: slashing greenhouse gas emissions another 40% by 2030.

Growth in renewable energy was the main reason California met its 2020 climate goal in 2016, the emissions report released this month shows.

“We’ve seen a substantial increase in solar and wind power, particularly rooftop solar installations,” says Dave Edwards, chief of the Air Resources Board’s greenhouse gas and toxics emissions inventory branch.

Driving the shift are early compliance with the state’s mandate that 33% of electricity come from renewable sources by 2020 and the falling cost of solar panels, which has spurred more commercial and rooftop installation. By 2016, the state was already at 46% renewable electricity. Solar electricity grew 33% in 2016, while natural gas decreased by more than 15%.

California also got an assist from the weather. After five years of punishing drought, rains swelled rivers and generated more hydroelectric power. During drier years, the state relied more on natural gas.

Overall, the growth in renewables combined with waning imports of coal power to send emissions from electricity generation plunging 18% in 2016 compared to 2015.

Now for the downside.

Emissions from cars and trucks, already California’s biggest source of greenhouse gases, have been on the rise for the past few years in step with post-recession economic growth. Increased driving is the main reason why transportation pollution ticked up another 2% in 2016.

“The deep reductions from electric power generation are compensating for lackluster performance in other sectors of the economy, including an uptick in the transportation sector where we know we have our work cut out for us,” said Alex Jackson, a senior attorney at the Natural Resources Defense Council who tracks California climate policy.

To blame for the increase in vehicle pollution is a combination of low gas prices, a growing economy, consumers’ preference for roomier, less efficient vehicles and a slower-than-anticipated transition to electric models. Those factors are essentially wiping out gains from the state’s emissions-cutting regulations.

Other key sectors of the economy, such as oil refineries, residential heating and agriculture, saw greenhouse gas emissions remain relatively flat or even rise slightly in 2016.

State regulators say that in itself is a triumph. Pollution from transportation and industry, they contend, would have been much higher without California’s climate change policies, which functioned as a lid keeping emissions in check even as its economy grew.

Air Resources Board officials played down the significance of rising car and truck pollution. Instead they are crediting measures such as cap-and-trade and the low-carbon fuel standard — market-based programs the state is using to push industry to cut pollution and shift to cleaner transportation fuels — with preventing emissions from rising even higher.

“All the indicators we’re looking at are moving in the right direction,” Edwards said. “We think that we’re on the right trajectory right now toward 2030.”


For now, California’s reductions in greenhouse gas emissions remain modest, and are broadly consistent with a nationwide decrease in recent years. The trend across the U.S. is the result of an economic shift: We’re getting less electricity from dirty, coal-fired power plants and more from cheaper, lower-polluting natural gas.

Yet while California’s greenhouse emissions dipped below 1990 levels in 2016, the national rate remained 2.4% above 1990 levels, according to the Environmental Protection Agency.

And though California’s electricity grid is powered increasingly by renewables, it was cleaner than the nation’s to begin with. California’s per-capita greenhouse gas emissions today are about half that of the nation as a whole. And they keep dropping, from a peak of 14 metric tons per person in 2001 to 10.8 in 2016.

That was viewed as an obstacle when California adopted its landmark 2006 climate law, AB 32, which enshrined the goal of cutting greenhouse gases below 1990 levels by the year 2020. At the time, industry and other critics argued that California’s relatively clean power grid would make its climate goals painful and prohibitively costly to reach. That fear, regulators and environmentalists say, has now been proven wrong.

To reach its tougher 2030 goal, however, California will have to pick up the pace and roughly double its greenhouse gas reductions. That feat, environmentalists say, will require not only a cleaner electrical grid but a rapid shift to zero-emission vehicles powered by it.

“Looking at the electricity sector 10 years ago, solar in that time went from exotic to conventional,” says Jimmy O’Dea, senior vehicles analyst for the Union of Concerned Scientists. “And that’s where we’re at with electric vehicles going forward. They’re going to go from exotic to conventional in a short period of time.”


Tony Barboza is a reporter who covers air quality and the environment with a focus on Southern California. He has been on staff at the Los Angeles Times since 2006, is a graduate of Pomona College and completed a Ted Scripps Fellowship in Environmental Journalism at the University of Colorado.

Earth's resources consumed in ever greater destructive volumes

Study says the date by which we consume a year’s worth of resources is arriving faster

Jonathan Watts

Sun 22 Jul 2018 19.01 EDT

Humanity is devouring our planet’s resources in increasingly destructive volumes, according to a new study that reveals we have consumed a year’s worth of carbon, food, water, fibre, land and timber in a record 212 days.

As a result, the Earth Overshoot Day – which marks the point at which consumption exceeds the capacity of nature to regenerate – has moved forward two days to 1 August, the earliest date ever recorded.

To maintain our current appetite for resources, we would need the equivalent of 1.7 Earths, according to Global Footprint Network, an international research organisation that makes an annual assessment of how far humankind is falling into ecological debt.
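The "1.7 Earths" figure and the overshoot date are two views of the same ratio, which a few lines of arithmetic make explicit (an illustrative sketch, not Global Footprint Network's actual methodology):

```python
import datetime

earths_needed = 1.7
budget_days = 365 / earths_needed   # ~214.7; the study reports 212 days

# Count 212 days forward from New Year's Day 2018.
overshoot = datetime.date(2018, 1, 1) + datetime.timedelta(days=212)
print(f"annual budget lasts ~{budget_days:.0f} days; overshoot falls on {overshoot}")
```

Counting 212 days from January 1 lands on August 1, the overshoot date the article reports.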

The overshoot began in the 1970s, when rising populations and increasing average demands pushed consumption beyond a sustainable level. Since then, the day at which humanity has busted its annual planetary budget has moved forward.

Thirty years ago, the overshoot was on 15 October. Twenty years ago, 30 September. Ten years ago, 15 August. There was a brief slowdown, but the pace has picked back up in the past two years. On current trends, next year could mark the first time the planet’s budget is busted in July.

While ever greater food production, mineral extraction, forest clearance and fossil-fuel burning bring short-term (and unequally distributed) lifestyle gains, the long-term consequences are increasingly apparent in terms of soil erosion, water shortages and climate disruption.

The day of reckoning is moving nearer, according to Mathis Wackernagel, chief executive and co-founder of Global Footprint Network.


“Our current economies are running a Ponzi scheme with our planet,” he said. “We are borrowing the Earth’s future resources to operate our economies in the present. Like any Ponzi scheme, this works for some time. But as nations, companies, or households dig themselves deeper and deeper into debt, they eventually fall apart.”

The situation is reversible. Research by the group indicates political action is far more effective than individual choices. It notes, for example, that replacing 50% of meat consumption with a vegetarian diet would push back the overshoot date by five days. Efficiency improvements in building and industry could make a difference of three weeks, and a 50% reduction of the carbon component of the footprint would give an extra three months of breathing space.

In the past, economic slowdowns – which tend to reduce energy consumption – have also shifted the ecological budget in a positive direction. The 2007-08 financial crisis saw the date push back by five days. Recessions in the 90s and 80s also lifted some of the pressure, as did the oil shock of the mid 1970s.


But the overall trend is of costs increasingly being paid by planetary support systems.

Separate scientific studies over the past year have revealed that a third of land is now acutely degraded, while tropical forests have become a source rather than a sink of carbon. Scientists have also raised the alarm about increasingly erratic weather, particularly in the Arctic, and worrying declines in populations of bees and other insect pollinators, which are essential for crops.

The $3 Billion Plan to Turn Hoover Dam Into a Giant Battery

By Ivan Penn

July 24, 2018

Hoover Dam helped transform the American West, harnessing the force of the Colorado River — along with millions of cubic feet of concrete and tens of millions of pounds of steel — to power millions of homes and businesses. It was one of the great engineering feats of the 20th century.

Now it is the focus of a distinctly 21st-century challenge: turning the dam into a vast reservoir of excess electricity, fed by the solar farms and wind turbines that represent the power sources of the future.

The Los Angeles Department of Water and Power, an original operator of the dam when it was erected in the 1930s, wants to equip it with a $3 billion pipeline and a pump station powered by solar and wind energy. The pump station, downstream, would help regulate the water flow through the dam’s generators, sending water back to the top to help manage electricity at times of peak demand.

The net result would be a kind of energy storage — performing much the same function as the giant lithium-ion batteries being developed to absorb and release power.


The Hoover Dam project may help answer a looming question for the energy industry: how to come up with affordable and efficient power storage, which is seen as the key to transforming the industry and helping curb carbon emissions.

Because the sun does not always shine, and winds can be inconsistent, power companies look for ways to bank the electricity generated from those sources for use when their output slacks off. Otherwise, they have to fire up fossil-fuel plants to meet periods of high demand.

And when solar and wind farms produce more electricity than consumers need, California utilities have had to find ways to get rid of it — including giving it away to other states — or risk overloading the electric grid and causing blackouts.

“I think we have to look at this as a once-in-a-century moment,” said Mayor Eric M. Garcetti of Los Angeles. “So far, it looks really possible. It looks sustainable, and it looks clean.”

The target for completion is 2028, and some say the effort could inspire similar innovations at other dams. Enhancing energy storage could also affect plans for billions of dollars in wind projects being proposed by the billionaires Warren E. Buffett and Philip F. Anschutz.

But the proposal will have to contend with political hurdles, including environmental concerns and the interests of those who use the river for drinking, recreation and services.

In Bullhead City, Ariz., and Laughlin, Nev. — sister cities on opposite sides of the Colorado, about 90 miles south of the dam — water levels along certain stretches depend on when dams open and close, and some residents see a change in its flow as a disruption, if not a threat.

“Any idea like this has to pass much more than engineering feasibility,” said Peter Gleick, a co-founder of the Pacific Institute, a think tank in Oakland, Calif., and a member of the National Academy of Sciences who is internationally known for his work on climate issues. “It has to be environmentally, politically and economically vetted, and that’s likely to prove to be the real problem.”

Housed inside Hoover Dam’s 726-foot structure are massive power-generating units.

Using Hoover Dam to help manage the electricity grid has been mentioned informally over the last 15 years. But no one pursued the idea seriously until about a year ago, as California began grappling with the need to better manage its soaring alternative-electricity production — part of weaning itself from coal-fired and nuclear power plants.

In California, by far the leading state in solar power production, that has sometimes meant paying other states to take excess electricity. Companies like Tesla have gotten into the picture, making lithium-ion batteries that are deployed by some utilities, but that form of storage generally remains pricey.

Lazard, the financial advisory and asset management firm, has estimated that utility-scale lithium-ion batteries cost 26 cents a kilowatt-hour, compared with 15 cents for a pumped-storage hydroelectric project. The typical household pays about 12.5 cents a kilowatt-hour for electricity.
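Those per-kilowatt-hour estimates make the comparison easy to quantify. A short sketch using only the Lazard figures quoted above; the annual energy volume in the example is purely illustrative:

```python
# Compare storage costs using the Lazard estimates quoted above.
LITHIUM_ION = 0.26   # $/kWh, utility-scale lithium-ion batteries
PUMPED_HYDRO = 0.15  # $/kWh, pumped-storage hydroelectric

def annual_savings(mwh_cycled_per_year):
    """Dollar savings per year from storing a given annual volume of
    energy with pumped hydro instead of lithium-ion batteries."""
    kwh = mwh_cycled_per_year * 1000
    return kwh * (LITHIUM_ION - PUMPED_HYDRO)

# Cycling 100,000 MWh a year (an illustrative volume) would save
# about $11 million annually at these rates.
savings = annual_savings(100_000)
```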

Some dams already provide a basis for the Hoover Dam proposal. Los Angeles operates a hydroelectric plant at Pyramid Lake, about 50 miles northwest of the city, that stores energy by using the electric grid to spin a turbine backward and pump water back into the lake.

But the Hoover Dam proposal would operate differently. The dam, with its towering 726-foot concrete wall and its 17 power generators that came online in 1936, would not be touched. Instead, engineers propose building a pump station about 20 miles downstream from the main reservoir, Lake Mead, the nation’s largest artificial lake. A pipeline would run partly or fully underground, depending on the location ultimately approved.

“Hoover Dam is ideal for this,” said Kelly Sanders, an assistant professor of civil and environmental engineering at the University of Southern California. “It’s a gigantic plant. We don’t have anything on the horizon as far as batteries of that magnitude.”

Sri Narayan, a chemistry professor at the university, said his studies of lithium-ion batteries showed that they simply weren’t ready to store the loads needed to manage all of the wind and solar power coming online.

“With lithium-ion batteries, you have durability issues,” Mr. Narayan said. “If they last five to 10 years, that would be a stretch, especially because we expect to use these facilities at full capacity. It has to be 10 times more durable than it is today.”

Mr. Narayan said he felt the Hoover Dam project should be given serious consideration because pumped-storage projects had been tested and proven for decades. In a comparison with lithium-ion batteries, he said, “I think the argument is very good.”

An aerial view of the Colorado River downstream from Lake Mead and Hoover Dam. Harnessing the river’s force, the dam helped transform the American West.

The Los Angeles Department of Water and Power, the nation’s largest municipal utility, says its proposal would increase the productivity of the dam, which operates at just 20 percent of its potential, to avoid releasing too much water at once and flooding towns downstream.

Engineers have conducted initial feasibility studies, including a review of locations for the pump station that would have as little adverse impact on the environment and nearby communities as possible.

But because Hoover Dam sits on federal land and operates under the Bureau of Reclamation, part of the Interior Department, the bureau must back the project before it can proceed.

“We’re aware of the concept, but at this point our regional management has not seen the concept in enough detail to know where we would stand on the overall project,” said Doug Hendrix, a bureau spokesman.

If the bureau agrees to consider the project, the National Park Service will review the environmental, scientific and aesthetic impact on the downstream recreation area. If the Los Angeles utility receives approval, Park Service officials have told it, the agency wants the pumping operation largely invisible to the public, which could require another engineering feat.

Among the considerations is the effect on bighorn sheep that roam Black Canyon, just below the dam, and on drinking water for places like Bullhead City. Some environmentalists worry that adding a pump facility would impair water flow farther downstream, in particular at the Colorado River Delta, a mostly dry riverbed in Mexico that no longer connects to the sea.

Another concern is that the pump station would draw water from or close to Lake Mohave, where water enthusiasts boat, fish, ride Jet-Skis, kayak and canoe.

Keri Simons, a manager of Watercraft Adventures, a 27-year-old rental business in Laughlin, said water levels already fluctuated in stretches of the Colorado close to the river towns. The smaller Davis Dam, just north of Laughlin, shuts off the flow overnight.

One morning this year, the water level just outside town dropped so low that you could walk across the riverbed, Ms. Simons said. “We couldn’t put any boats out until noon,” she said. “Half the river was a sandbar.”

Even if no water is lost because of the pumping project, the thought of any additional stress on the system worries Toby Cotter, the city manager of Bullhead City.

The town thrives on the summer tourism that draws some two million visitors to the area for recreation on the greenish-blue waters, Mr. Cotter said. “That lake is the lifeblood of this community,” he said. “It’s not uncommon to see 100 boats on that lake.”

Despite the possible benefits of the project, there are concerns among community members and business owners in the area.

Environmentalists have been pushing Los Angeles to stop using fossil fuels and produce electricity from alternative sources like solar and wind power. And Mayor Garcetti said he would like his city to be the first in the nation to operate solely on clean energy, while maintaining a reliable electric system.

“Our challenge is: How do we get to 100 percent green?” he said. “Storage helps. There’s no bigger battery in our system than Hoover Dam.”

But old wounds are still raw with some along the Colorado. A coal-fired power plant in Laughlin that the Department of Water and Power and other utilities operated was shut down in 2006, costing 500 jobs and causing local economies to buckle. And a decision long ago to allot Nevada a small fraction of the water that California and Arizona can draw remains a sore point.

“There’s nothing going on in California with power that has given people who are dealing with them any comfort,” said Joseph Hardy, a Nevada state senator. “I think from a political standpoint, we would have to allay the fears of California, Nevada and Arizona. There will be a myriad of concerns.”

The decision to close the coal plant angered many residents. They wanted the utility to simply add emission-control features known as scrubbers to reduce carbon pollution. The community later hoped a natural-gas plant would replace the coal facility, but Los Angeles could not agree with the local communities on a site.

The 2,500-acre parcel where the coal plant stood remains largely vacant. “There’s still some sting here,” said Mr. Cotter, the Bullhead City official.

There have been local efforts to convert the site into a development of housing and businesses — or to build a solar farm on a plot of land, if Los Angeles would buy the power.

Mr. Garcetti said other states and cities had worked with Los Angeles to build economic development projects for their communities, so he would like to consider similar ideas for the Hoover Dam project, as well as ways to benefit the entire region. “I’m all open ears to what their needs are,” he said.

Mr. Hardy is wary of big-city promises. The Department of Water and Power has treated Nevada so cavalierly, he said, that a security guard at the old coal plant site once refused to return a ball to children after it bounced over the property’s fence. He said the guard had told the children’s parents that they could file a claim to get it back — a process that would take two to three years.

“Not the kindest neighbor,” Mr. Hardy said.

But he said he was willing to meet with Los Angeles officials to make the project successful.

“The hurdles are minimal and the negotiations simple, as long as everybody agrees with Nevada,” Mr. Hardy said. “It would be nice if there was a table that they would come to. I’ll provide the table.”

Caribbean islands rev up for electric car revolution

Given the high costs of imported fuel for energy, several Caribbean islands are considering increasing the number of electric vehicles on their roads. But they face several barriers, including high initial costs and stiff import tariffs on vehicles.

July 25, 2018

By Sophie Hares Thomson Reuters Foundation


With her foot down to show off the acceleration of the zippy electric car, Joanna Edghill spins around the car park before plugging the vehicle into a charging point beneath rows of solar panels converting Caribbean rays into power for the grid.

In the five years since she and her husband started their company Megapower, it has sold 300 electric vehicles and set up 50 charging stations plus a handful of solar car-ports on the 21-mile-long island of Barbados.

They are now expanding elsewhere in the Caribbean.

"The main factor with islands is we don't have range anxiety. I can develop and roll out a charging network in Barbados where customers are never more than a few kilometers from a charging point," said Ms. Edghill, who previously worked in international development.

"We have at least 220 days of pure sunlight every year, so why not take advantage of the resources that we do have here?"

Costa Rica's president elect promises zero-carbon transport

Burdened by a costly dependence on imported fuel for energy, Barbados and many other Caribbean islands are considering boosting the number of electric vehicles on their roads.

But they face barriers, including high initial costs, stiff import duties on electric vehicles, and a lack of regulatory support, say people working in the sector.

Globally, the number of electric vehicles topped 3 million in 2017, according to the International Energy Agency (IEA), which predicts there will be 125 million in use by 2030.

That figure could go as high as 220 million if action to meet global climate targets and other sustainability goals becomes more ambitious, says the IEA.

More research, policies and incentives are needed to drive further uptake, it notes.

In Barbados, the island's electricity utility, government departments, and private firms are among the customers buying the right-hand-drive electric cars and delivery vans Megapower imports from Britain, said Edghill.

It has also built solar car-ports, which power charging points, for the Barbados arm of courier giant DHL and the government of St. Vincent and the Grenadines.

Megapower's other solar panels feed into the grid, offsetting the equivalent of the non-renewable power used by 400 electric cars.

"The Caribbean is ripe for the electrification of transportation," said Curtis Boodoo, assistant professor at the University of Trinidad and Tobago who also works on electric vehicles with the CARICOM regional group of 15 countries.

"If you invest in electric vehicles, you are able to use your existing electrical infrastructure, and save the costs of the transportation fuel that you have to import."

Using fuel to generate electricity is two to three times more efficient than burning it directly in car engines, he said, meaning more electric vehicles on the roads could help shave the region's hefty bills.
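Mr. Boodoo's two-to-three-times efficiency range translates directly into avoided fuel imports. A rough sketch, where the efficiency ratio comes from his statement and the spending figure is purely illustrative:

```python
# Rough fuel-import savings if transport fuel is burned in power
# plants to charge EVs rather than directly in car engines.
# The 2-3x efficiency range is from the article; the spend is illustrative.
def fuel_savings(transport_fuel_spend, efficiency_ratio):
    """Fraction of transport-fuel spending avoided if the same miles
    are driven on electricity generated `efficiency_ratio` times more
    efficiently than combustion engines use fuel."""
    return transport_fuel_spend * (1 - 1 / efficiency_ratio)

# At 2x efficiency, half the transport-fuel bill disappears;
# at 3x, two-thirds does.
low = fuel_savings(100.0, 2)   # 50.0
high = fuel_savings(100.0, 3)  # roughly 66.7
```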

For many Caribbean countries, more than half of the fuel they import is used for transport. Barbados spent $300 million last year on fuel imports, government data shows.

Crippled with public debt and dogged by rising oil prices, the island's new government wants to make its bus network electric, and eventually switch all government transport too.

Mr. Boodoo said increased state investment in electric buses would help upgrade transport systems, while cutting climate-changing emissions and paving the way for consumers to follow.

Plug-in vehicles could also "piggy-back" on a push to inject more power into the grid from renewables like solar, wind, and hydro, said Devon Gardner, CARICOM's energy program manager.

Costs are high, however, and on some islands, import duties for electric vehicles are higher than on combustion-engine cars.

Trinidad and Tobago has scrapped taxes and import duties for most electric cars, but taxes elsewhere can add up to 100 percent depending on the model.

High purchase prices mean vehicles remain unaffordable for most Caribbean drivers.

A Nissan Leaf electric car, for example, costs about $50,000 in Barbados, compared with $30,000 in Britain.

Heavily indebted Caribbean countries are torn between collecting much-needed revenue from car imports and supporting the roll-out of private electric vehicles, said Mr. Gardner.

"The Caribbean doesn't have the luxury of using some of the levers of incentives that were used by the richer, more developed countries," he said.

Nonetheless, power utilities in the Bahamas, Turks and Caicos, and St. Lucia are starting to install charging networks, which could help the sector expand, said Megapower's Edghill.

"Privately owned utilities want people buying electricity, so every person that is plugging in is a person not buying gas or diesel, but buying their product," she said.

John Felder, who founded Cayman Automotive, plans to open an office soon in Havana and anticipates a healthy market in Cuba for electric bikes and scooters which start at $800.

Low import duties on electric vehicles in Cuba make them cheaper to buy, said Mr. Felder. He has sold about 60 electric cars in the Cayman Islands – which has cut import duties – and installed 15 charging stations he wants to convert to solar.

"The eco-system is very fragile – there are no freeways where you can go 70, 80 miles per hour for hundreds of miles," he said. "Electric vehicles are perfect for the Caribbean."

While fast-improving battery technology is making electric cars more attractive globally, on hurricane-prone Caribbean islands, emerging vehicle-to-grid technology could use power stored in batteries to keep the lights on if disaster strikes.

Power stored in one electric bus could provide energy for up to 50 homes for a day, or power shelters and community centers if overhead electricity cables are knocked out, said Boodoo.
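That figure implies a simple sizing calculation. A sketch in which both the bus battery capacity and the per-home daily consumption are assumptions chosen to reproduce the rough 50-home estimate; neither number appears in the article:

```python
# Vehicle-to-grid sizing sketch: how many homes one electric bus
# battery could power for a day. Both inputs are assumptions.
def homes_powered(bus_battery_kwh, home_daily_kwh):
    """Number of homes a fully charged bus battery could supply
    for 24 hours, ignoring inverter and distribution losses."""
    return bus_battery_kwh // home_daily_kwh

# A 300 kWh bus battery and homes drawing 6 kWh a day yields
# the article's figure of about 50 homes.
n = homes_powered(300, 6)  # 50
```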

Driving down prices might be key to kick-starting an electric car revolution. But some bet islands will gradually wake up to the benefits plug-in vehicles can bring by improving public transport and taming expensive diesel habits.

"I'd like to say that within five years, 10 percent of the [Barbados] population will be driving electric vehicles – I think that's realistic," said Edghill.

The world's largest solar farm rises in the remote Egyptian desert


JUL 30, 2018 | 3:00 AM

KarmSolar's Tayebat Workers' Village in Egypt's Bahariya Oasis was built using local sandstone as a model of a sustainable off-grid structure that combined green energy technology with traditional local building methods. It houses up to 500 seasonal farmworkers. (Courtesy of KarmSolar)

In 1913 on the outskirts of Cairo, an inventor from Philadelphia named Frank Shuman built the world’s first solar thermal power station, using the abundant Egyptian sunshine to pump 6,000 gallons of water a minute from the Nile to irrigate a nearby cotton field.

World War I and the discovery of cheap oil derailed Shuman’s dream of replicating his “sun power plant” on a grand scale and eventually producing enough energy to challenge the world’s dependence on coal.

More than a century later, that vision has been resurrected. The world’s largest solar park, the $2.8-billion Benban complex, is set to open next year 400 miles south of Cairo in Egypt’s Western Desert.

It will single-handedly put Egypt on the clean energy map.

That is no small feat for a country that’s been hobbled by its longtime addiction to cheap, state-subsidized fossil fuels and currently gets more than 90% of its electricity from oil and natural gas.

But the prospects for green energy here have never been better as the government has been scaling back fossil-fuel subsidies in line with an International Monetary Fund-backed reform program that aims to rescue an economy ravaged by political upheaval. Meanwhile, the rapidly falling cost of equipment for solar and wind power has increased their allure.

“This is a big deal,” said Benjamin Attia, a solar analyst with U.S.-based Wood Mackenzie, talking about the Benban complex. “I can’t think of another example where so many big players have come together to fill the gap.”

Officials and international finance organizations tout the potential of Egypt’s renewables sector to create jobs and growth as well as reduce emissions in a country whose capital was recently named the second-most polluted large city on Earth by the World Health Organization.

The government’s aim is that by 2025 Egypt will get 42% of its electricity from renewable sources.

The Benban complex, which will be operated by major energy companies from around the world, is expected to generate as much as 1.8 gigawatts of electricity, or enough to power hundreds of thousands of homes and businesses. It will consist of 30 separate solar plants, the first of which began running in December, and employ 4,000 workers.
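The 1.8 gigawatts is peak capacity; how many homes that serves depends on how often the panels produce and how much each home uses. A sketch of the arithmetic behind a "hundreds of thousands of homes" estimate, with the capacity factor and household consumption assumed for illustration:

```python
# Rough homes-served estimate for a solar park.
# The 1.8 GW capacity is from the article; the 25% capacity factor
# and 4 MWh/year household usage are illustrative assumptions.
HOURS_PER_YEAR = 8760

def homes_served(capacity_gw, capacity_factor=0.25, home_annual_mwh=4.0):
    """Average number of homes a plant of `capacity_gw` gigawatts
    can supply over a year at the given capacity factor."""
    annual_mwh = capacity_gw * 1000 * HOURS_PER_YEAR * capacity_factor
    return int(annual_mwh / home_annual_mwh)

# At these assumed numbers, Benban's 1.8 GW works out to roughly
# a million modest households.
n = homes_served(1.8)  # 985500
```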

The U.S. government is backing a local program to train hundreds of technical school students in solar and wind energy.

Hatem Tawfik of Cairo Solar with the "solar flowers" his company installed in the courtyard of a bank in New Cairo. As energy prices have gone up, so has demand for solar.

Last week, Egyptian President Abdel Fattah Sisi inaugurated several big electricity projects, including the expansion of massive wind farms on the gusty Gulf of Suez in the Red Sea. Russia has promised to help build and finance a $21-billion nuclear power plant on Egypt’s north coast.

Driving all these projects is the not-so-distant memory of the electricity crisis that gripped Egypt in the years following the 2011 revolution. Factory and shop closures from rolling blackouts fed mounting public anger that culminated in the 2013 ouster of President Mohamed Morsi, whose prime minister once infamously suggested that Egyptians confront the power shortage by wearing cotton and sleeping in one room.

Today residents no longer face nightly outages, but Egypt — once a gas exporter — must import expensive liquefied natural gas to meet the energy needs of its 96 million people. And power demand is expected to more than double by 2030, much faster than in any other country in the region, according to Victoria Cuming of Bloomberg New Energy Finance.

In preparation, Egypt launched a scheme in 2014 to enable private players to sell power to the public grid, jump-starting its clean energy market, which last year saw a 500% increase in investment.

But if flashy green energy megaprojects like Benban are drawing much of the attention, small businesses are responsible for 80% of private-sector jobs, said Khaled Gasser, who founded the Solar Energy Development Assn., a local industry group. “This is the real market,” he said.

Driven by the idealism and lack of compelling job opportunities that followed the 2011 revolution, several clean-energy entrepreneurs have quietly established a grass-roots market in Egypt.

Ahmed Zahran, 38, started KarmSolar with four friends working out of a cafe after his old bosses fled the country. He had tried in vain to persuade the private equity firm to invest in clean energy. “We were fired by an … who represented everything we hated about this country, so we decided, let’s do it ourselves,” said Zahran, who figured solar power was a no-brainer in a nation that is more than 90% desert.

With private investments, the company began by making solar water pumps for off-grid desert farms that traditionally rely on diesel to pull water from beneath the sand. Now with more than 80 employees, it also builds solar stations to power poultry factories and malls under so-called power-purchasing agreements.

Zahran has also delved into green architecture, designing off-grid eco-lodges and a sustainable “worker’s village” for seasonal farmhands in Egypt’s Bahariya Oasis, a desert outpost of olive and date groves.

As the government has made good on its promise to gradually cut energy subsidies — on July 1 electricity prices rose an average of 26% — more companies are exploring the possibility of adding solar systems to the apartment buildings where most of the country’s urban population lives.

The country’s green energy start-ups are grappling with the same problems that have bedeviled small businesses here for years. In an effort to confront a big one — lack of financing — the Central Bank launched a low-interest loan program in 2016 that aimed to encourage small players, especially in key industries like renewable energy.

Hatem Gamal used it to grow his waste-to-energy startup, Empower, which currently operates two plants, at a wastewater treatment facility and a beef farm, that turn human sludge and animal waste into biogas that he sells to the government. The company has four more plants under construction.

One thing that hasn’t changed: the layers of bureaucracy that businesses must navigate. Gamal had to get licenses or permissions from at least 10 government agencies. He’s become so adept at dealing with red tape that he has a printer in his car.

“People say, ‘You must know someone’ or ‘You must have bribed someone’ or ‘That 5% loan isn’t real,’ ” Gamal said. “But the opportunities are there. I just never take ‘no’ for an answer.”

Toxic algae spreads in Baltic waters in biggest bloom in years

Reuters Staff

HELSINKI/SOPOT, Poland (Reuters) - A huge bloom of toxic algae has spread in the Baltic Sea, forcing people off beaches but delighting scientists who research cancer and antibiotics.

Toxic algae are seen on the beach in Gdynia, Poland, July 3, 2015. Lukasz Glowala/Agencja Gazeta/via REUTERS

The bloom of blue-green algae, or cyanobacteria, is primarily caused by an excess of nutrients in the water, such as the nitrogen and phosphorus used in food production.

Exceptionally hot weather and a lack of wind have exacerbated the effect, scientists say. The Finnish Meteorological Institute said that this July was the hottest on record.

“The environment is giving us back what we have put into it, so it works as a kind of boomerang,” said Hanna Mazur-Marzec, a professor at the Polish Academy of Sciences’ Oceanology Institute in Sopot on the Baltic sea’s southern shore.

The Finnish environment institute SYKE said the outbreak, which has hit particularly hard around the Gulf of Finland and stretches down to Poland’s shores, is among the worst in the past decade.

“In the lakes, we’ve had a five-week period that’s more extensive than the average in the last 20 years,” SYKE said. “Climate change and warming in the Baltic Sea area and lakes will increase the risk of cyanobacterial blooms.”

Cyanobacteria toxins are a risk to the marine ecosystem - especially for mussels and the flounder fish - and also pose a threat to human health, SYKE said.

People in Poland, Lithuania and Sweden have been advised by the authorities not to swim in waters where the algae is blooming.

“We are here until Saturday and it is a real pity that we are not allowed to swim in the Baltic Sea, but what can we do?” said Bron Starby, a tourist from Sweden, who is spending his holidays in Sopot, a popular resort in Poland.

But some scientists are pleased, because cyanobacteria are a known source of unique compounds, providing material for research into substances that bacteria remain sensitive to, a line of work aimed at countering the growing prevalence of bacterial resistance.

“It’s a great opportunity to collect material (for research). In recent years, we had trouble collecting enough,” said Mazur-Marzec. “Now we can simply fill up a bucket.”

Last year was warmest ever that didn't feature an El Niño, report finds

State of the climate report found 2017 was the third warmest with a record high sea level and destructive coral bleaching

Oliver Milman in New York

Wed 1 Aug 2018 12.08 EDT Last modified on Wed 1 Aug 2018 15.40 EDT

Last year was the warmest ever recorded on Earth that didn’t feature an El Niño, a periodic climatic event that warms the Pacific Ocean, according to the annual state of the climate report by 500 climate scientists from around the world, overseen by the National Oceanic and Atmospheric Administration (Noaa) and released by the American Meteorological Society.

Climate change cast a long shadow in 2017, with the planet experiencing soaring temperatures, retreating sea ice, a record high sea level, shrinking glaciers and the most destructive coral bleaching event on record.

Overall, 2017 was the third warmest year on record, Noaa said, behind 2016 and 2015. Countries including Spain, Bulgaria, Mexico and Argentina all broke their annual high temperature records.

Puerto Madryn in Argentina reached 43.4C (110.12F), the warmest temperature ever recorded so far south in the world, while Turbat in Pakistan baked in 53.5C (128.3F), the global record temperature for May.

Concentrations of planet-warming carbon dioxide continued on an upward march, reaching 405 parts per million in the atmosphere. This is 2.2ppm greater than 2016 and is the highest level discernible in modern records, as well as ice cores that show CO2 levels back as far as 800,000 years. The growth rate of CO2 has quadrupled since the early 1960s.

The consequences of this heat, which follows a string of warm years, was felt around the world in 2017.

In March of last year, ice extent in the Arctic reached its lowest maximum level in the 37-year satellite record, covering 8% less area than the long-term average. The Arctic experienced the sort of warmth that scientists say hasn’t been present in the region for the last 2,000 years, with some regions 3 or 4 degrees Celsius hotter than the average recorded since 1982. Antarctic sea ice was also below average throughout 2017.

Land-based ice mirrored these reversals, with the world’s glaciers losing mass for the 38th consecutive year on record. According to the report, the total ice loss since 1980 is equivalent to slicing 22 metres off the top of the average glacier.

Prolonged warmth in the seas helped spur a huge coral bleaching event, which is when coral reefs become stressed by high temperatures and expel their symbiotic algae. This causes them to whiten and, in some cases, die off.

A three-year stretch to May 2017 was the “longest, most widespread and almost certainly most destructive” coral bleaching event on record, the report states, taking a notable toll on places such as the Great Barrier Reef in Australia. Global average sea levels reached the highest level in the 25-year satellite record, 7.2cm (3in) above the 1993 average.

“I find it quite stunning, really, how these record temperatures have affected ocean ecosystems,” said Gregory Johnson, an oceanographer at Noaa.

Heatwave made more than twice as likely by climate change, scientists find

There were several major rainfall events in 2017 contributing to a wetter than normal year, with the Indian monsoon season claiming around 800 lives and devastating floods occurring in Venezuela and Nigeria. Global fire activity was at the lowest level since 2003, however.

While exceptionally warm years could occur without human influence, the rapidly advancing field of climate change attribution science has made it clear the broad sweep of changes taking place on Earth would be virtually impossible without greenhouse gas emissions from human activity.

The loss of glaciers and coral reefs threaten the food and water supplies of hundreds of millions of people, while heatwaves, flooding, wildfires and increasingly powerful storms are also a severe risk to human life.

These dangers have been highlighted in stunning fashion this year, with a scorching global heatwave causing multiple deaths from Canada to Japan, while wildfires have caused further fatalities in places such as Greece and the western US.

How summer heat has hit Nordic nuclear plants

Lefteris Karagiannopoulos

OSLO (Reuters) - This year’s unusually warm summer in the Nordic region has increased sea water temperatures and forced some nuclear reactors to curb power output or shut down altogether, with more expected to follow suit.

The summer has been 6-10 degrees Celsius above the seasonal average so far and has depleted the region’s hydropower reservoirs, driving power prices to record highs, boosting energy imports from continental Europe and driving up consumer energy bills.

Nuclear plants in Sweden and Finland are the region’s second largest power source after hydropower dams and have a combined capacity of 11.4 gigawatts (GW).

Reactors need cold sea water for cooling but when the temperature gets too high it can make the water too warm for safe operations, although the threshold varies depending on the reactor type and age.

Unscheduled power output cuts in Swedish and Finnish reactors could push prices even higher, said Vegard Willumsen, section manager at Norway’s energy regulator NVE.

“If nuclear reactors in the Nordics shut down or reduce power due to the heatwave, it could also put pressure on the supply and consequently on the Nordic power prices,” he added.

The Nordic region’s nuclear plants comprise either pressurized water reactors (PWR) or boiling water reactors (BWR) - and both can be affected by warm sea water.

Typically, the 12 reactors would reduce power after a certain temperature threshold has been reached and then shut down fully at a higher threshold.

BWRs can keep operating for longer and would only shut down after a several-degree rise in water temperatures from the moment power reductions are triggered.

However, PWRs require a shorter time to shut down after they start reducing power.

Utility Vattenfall, which operates seven reactors in Sweden, shut a 900 megawatt (MW) PWR unit - one of the four located at its Ringhals plant - this week as water temperatures exceeded 25 degrees Celsius.

The firm’s second plant at Forsmark consists of three BWRs and Vattenfall had to reduce output by 30-40 megawatts per reactor earlier in July as the sea water in the area exceeded 23 degrees Celsius.

Finland’s Fortum reduced power at its Loviisa plant last week when water temperatures reached 32 degrees C, close to a threshold of 34 degrees.

The extent to which water temperature affects nuclear plants also depends on the depth from which they draw water; deeper water is colder.

It also depends on how warm the water is after being used in the reactors and released back into the sea. If used water exceeds 34 degrees Celsius, it can cause major output reductions or shutdowns for certain plants due to safety regulations.

Sweden’s biggest reactor - 1.4 GW Oskarshamn 3 - should be less vulnerable to very hot summers due to the depth of water, said a spokesman for operator OKG, a unit of Uniper Energy (UN01.DE).

“Water intake (is) at a depth of 18 metres where the water naturally is cooler than on the surface ... should it be too hot, we would of course reduce the capacity accordingly,” he said.

Oskarshamn 3 will reduce power if sea water reaches 25 degrees but it was below 20 degrees on Tuesday.

Similarly, Teollisuuden Voima’s Olkiluoto plant in Finland draws water from greater depths, keeping it below the plant’s 27-degree threshold.

TVO has also built an additional safety mechanism - a canal - which it can use under certain conditions to release used warm water at the other side of the Olkiluoto island.

Teachers lacking educational background in science use inquiry-oriented instruction least


Not using the pedagogical approach heralded by the National Research Council could be linked to unfilled STEM jobs


A new study shows that eighth-grade science teachers without an educational background in science are less likely to practice inquiry-oriented science instruction, a pedagogical approach that develops students' understanding of scientific concepts and engages students in hands-on science projects. This research offers new evidence for why U.S. middle-grades students may lag behind their global peers in scientific literacy. Inquiry-oriented science instruction has been heralded by the National Research Council and other experts in science education as best practice for teaching students 21st-century scientific knowledge and skills.

Published in The Elementary School Journal, the study investigated whether the educational backgrounds of 9,500 eighth-grade science teachers in 1,260 public schools were predictive of the extent to which they engaged in inquiry-oriented instruction. The authors found that, nationwide, there are two distinct groups of middle-grades science teachers: 1) those with very little formal education in science or engineering, and 2) those with degrees and substantial coursework in science.

Teachers with both education and science degrees were more likely to use inquiry-based teaching, and those with graduate-level degrees in science were the most likely to teach this way. However, nationally, just half of teachers had these preferred credentials, and nearly one-quarter of eighth-grade teachers had an education-related degree with no formal educational background in science or engineering.

The study's findings raise the question: are middle-grades teachers well-prepared to engage in the kinds of teaching that have been shown to improve student engagement, interest, and preparation for STEM careers? Study author Tammy Kolbe, assistant professor of educational leadership and policy studies at the University of Vermont, says results "point toward the disparate nature of middle-level teachers' educational backgrounds as a possible leverage point for change."

Another key finding was that teachers with undergraduate or graduate degrees in science continued to use inquiry-oriented instruction throughout their careers at a higher rate than their peers. That said, novice teachers with undergraduate minors in science who initially were less likely to teach this way eventually caught up to their peers with stronger educational backgrounds in science. "This suggests that even having an undergraduate minor in science better positions a teacher to adopt and integrate reform-oriented science teaching, compared to teachers with little-to-no formal education in science or engineering," says Kolbe.

"We cannot expect that goals for reforming science education in the United States can be achieved without carefully examining how teachers are prepared," says Kolbe, who co-authored the study with Simon Jorgenson, assistant professor in the Department of Education at UVM. "We show that teachers' educational backgrounds matter for how they teach science, and suggest that teachers' degrees and coursework are valid proxies for what teachers know and can do in the classroom. The study's findings call into question existing state policies and teacher preparation programs that minimize content knowledge requirements for middle-level teachers."

Climate change is making night-shining clouds more visible



WASHINGTON -- Increased water vapor in Earth's atmosphere due to human activities is making shimmering high-altitude clouds more visible, a new study finds. The results suggest these strange but increasingly common clouds seen only on summer nights are an indicator of human-caused climate change, according to the study's authors.

Noctilucent, or night-shining, clouds are the highest clouds in Earth's atmosphere. They form in the middle atmosphere, or mesosphere, roughly 80 kilometers (50 miles) above Earth's surface. The clouds form when water vapor freezes around specks of dust from incoming meteors.

Humans first observed noctilucent clouds in 1885, after the eruption of Krakatoa volcano in Indonesia spewed massive amounts of water vapor in the air. Sightings of the clouds became more common during the 20th century, and in the 1990s scientists began to wonder whether climate change was making them more visible.

In a new study, researchers used satellite observations and climate models to simulate how the effects of increased greenhouse gases from burning fossil fuels have contributed to noctilucent cloud formation over the past 150 years. Extracting and burning fossil fuels delivers carbon dioxide, methane and water vapor into the atmosphere, all of which are greenhouse gases.

The study's results suggest methane emissions have increased water vapor concentrations in the mesosphere by about 40 percent since the late 1800s, which has more than doubled the amount of ice that forms in the mesosphere. They conclude human activities are the main reason why noctilucent clouds are significantly more visible now than they were 150 years ago.

"We speculate that the clouds have always been there, but the chance to see one was very, very poor, in historical times," said Franz-Josef Lübken, an atmospheric scientist at the Leibniz Institute of Atmospheric Physics in Kühlungsborn, Germany and lead author of the new study in Geophysical Research Letters, a journal of the American Geophysical Union.

The results suggest noctilucent clouds are a sign that human-caused climate change is affecting the middle atmosphere, according to the authors. Whether thicker, more visible noctilucent clouds could influence Earth's climate themselves is the subject of future research, Lübken said.

"Our methane emissions are impacting the atmosphere beyond just temperature change and chemical composition," said Ilissa Seroka, an atmospheric scientist at the Environmental Defense Fund in Washington, D.C. who was not connected to the new study. "We now detect a distinct response in clouds."

Conditions must be just right for noctilucent clouds to be visible. The clouds can only form at mid to high latitudes in the summertime, when mesospheric temperatures are cold enough for ice crystals to form. And they're only visible at dawn and dusk, when the Sun illuminates them from below the horizon.

Humans have injected massive amounts of greenhouse gases into the atmosphere by burning fossil fuels since the start of the industrial period 150 years ago. Researchers have wondered what effect, if any, this has had on the middle atmosphere and the formation of noctilucent clouds.

In the new study, Lübken and colleagues ran computer simulations to model the Northern Hemisphere's atmosphere and noctilucent clouds from 1871 to 2008. They wanted to simulate the effects of increased greenhouse gases, including water vapor, on noctilucent cloud formation over this time period.

The researchers found the presence of noctilucent clouds fluctuates from year to year and even from decade to decade, depending on atmospheric conditions and the solar cycle. But over the whole study period, the clouds have become significantly more visible.

The reasons for this increased visibility were surprising, according to Lübken. Carbon dioxide warms Earth's surface and the lower part of the atmosphere, but actually cools the middle atmosphere where noctilucent clouds form. In theory, this cooling effect should make noctilucent clouds form more readily.

But the study's results showed increasing carbon dioxide concentrations since the late 1800s have not made noctilucent clouds more visible. It seems counterintuitive, but when the middle atmosphere becomes colder, more ice particles form but they are smaller and therefore harder to see, Lübken explained.

"Keeping water vapor constant and making it just colder means that we would see less ice particles," he said.

Instead, the study found that more water vapor in the middle atmosphere is making ice crystals larger and noctilucent clouds more visible. Water vapor in the middle atmosphere comes from two sources: water vapor from Earth's surface that is transported upward, and methane, a potent greenhouse gas that produces water vapor through chemical reactions in the middle atmosphere.

The study found the increase in atmospheric methane since the late 1800s has significantly increased the amount of water vapor in the middle atmosphere. This more than doubled the amount of mesospheric ice present in the mid latitudes from 1871 to 2008, according to the study.

People living in the mid to high latitudes now have a good chance of seeing noctilucent clouds several times each summer, Lübken said. In the 19th century, they were probably visible only once every several decades or so, he said.

"The result was rather surprising that, yes, on these time scales of 100 years, we would expect to see a big change in the visibility of clouds," Lübken said.

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing 60,000 members in 137 countries.

Company makes old polystyrene new again

Jessie Darland Monday, April 30, 2018

A company in Tigard has created a way to recycle polystyrene into new polystyrene. Very little polystyrene is currently recycled; most of it ends up in landfills or the ocean.

Portland has an extensive waste and recycling program for garbage, bottles, cans, food waste, yard waste and paper — but where do all the plastics go? Things like Styrofoam, or packing peanuts that fill boxes from online deliveries or electronics purchases?

Polystyrene is a commonly used material that often gets thrown into landfills. According to Agilyx, an environmental solutions company located in Tigard, the U.S. recycles just 1.3 percent of polystyrene used. The company has developed the first way of recycling polystyrene back into polystyrene. It held a ribbon-cutting ceremony Thursday, April 19, to celebrate its new direction.

"Polystyrene is a very versatile, cost effective and valuable polymer used in our everyday lives," said Joe Vaillancourt, chief executive officer of Agilyx. "However, it is one of the least recycled materials in today's recycling programs. We are proud to have commercialized the first chemical recycling solution, creating a more sustainable end-of-life solution for polystyrene."

Polystyrene can be identified by its "resin code" — a number six within the recycling symbol on most materials. Some material doesn't have this code, but can still be recycled.

"The easiest way to identify polystyrene foam is it looks like a bunch of balls stuck together and it will break or flake off if bent," said John Desmarteau, project engineer at Agilyx.

Agilyx receives this material and melts it down back into liquid. It's sent to a refiner to be cleaned, then is sent to a manufacturer to make new products out of the material. By recycling the material, less plastic will end up in landfills and the ocean, and by recycling the oil used in the production of these plastics, less will be taken out of the ground to make future products.

"Styrofoam (polystyrene foam) has a useful life of mere hours or days, but exists in the environment for hundreds or thousands of years," said Jeanne Roy, co-director of the Center for Earth Leadership. "It can be carried by the wind, washed into storm drains, and make its way to the ocean, contributing to a mass of floating plastic debris. Marine mammals mistake it for food, and it remains in their guts, often causing death by starvation."

Polystyrene is used to make items including fast food containers, packaging, bike helmets, parts of cars, and coolers. The molecule used in the process of making polystyrene can be used again and again, which is what Agilyx is taking advantage of.

Tyrone Neighbors, owner of Active Daily Fitness, has partnered with Agilyx to give them all of the Styrofoam that comes with exercise equipment he orders. Within the past year, he started bringing Agilyx truckloads of packing material from treadmills and strength equipment.

"It gives me peace of mind that's it's not ending up in a landfill," Neighbors said. He gets the material prepped and bagged, then Agilyx takes it from there.

According to Agilyx, this process has a 50 percent smaller carbon footprint than traditional polystyrene manufacturing.

"They're doing something even the biggest petrochemical companies aren't doing. It's impressive," Vaillancourt said at the ribbon cutting. At the event, attendees could get coffee in Styrofoam cups, then the cups were thrown into the polystyrene compactor and eventually put through the entire system. There was also a tour, where a member of the Agilyx team explained the process and the equipment used.

If you’re puzzling over the recycling codes on your plastics, here’s the scoop on what those codes mean. If you want to know which plastics you can toss in your curbside bin, please check with your local recycling service provider, as the types of plastics accepted differ by location.

[Infographic: plastic recycling resin codes]
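The standard resin identification codes mentioned above can be summarized in a small lookup table. This is an illustrative sketch; the code-to-resin mapping is the standard set, but the function name is an assumption:

```python
# Standard plastic resin identification codes: the number inside the
# recycling symbol on a plastic item.
RESIN_CODES = {
    1: "PET (polyethylene terephthalate)",
    2: "HDPE (high-density polyethylene)",
    3: "PVC (polyvinyl chloride)",
    4: "LDPE (low-density polyethylene)",
    5: "PP (polypropylene)",
    6: "PS (polystyrene)",
    7: "Other",
}

def resin_name(code):
    """Return the resin type for a recycling-symbol number."""
    return RESIN_CODES.get(code, "unknown code")

print(resin_name(6))  # PS (polystyrene) -- the material Agilyx recycles
```

As the article notes, whether a given code is accepted curbside still depends on the local recycling service provider.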

Why Vietnam is shutting out some materials

Posted on May 30, 2018

by Colin Staub

Vietnam's Tan Cang port

Vietnamese authorities have boosted inspections of scrap imports and plan to halt shipments to key ports next month.

According to a number of sources, the changes are a result of a glut of scrap paper and plastic imports and numerous instances of customs violations.

Because of China’s restrictions, Vietnam has become a larger destination for exports of U.S. recyclables. According to the U.S. Census Bureau, U.S. exporters sent more than 173,000 short tons of recovered fiber to Vietnam from January through March, up dramatically from 57,000 short tons during the prior-year period. U.S. ports sent Vietnam nearly 79 million pounds of recovered plastics during the first quarter, up from 40 million pounds during the prior-year period.
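The Census Bureau figures above imply a steep year-over-year jump, which a quick back-of-the-envelope calculation makes concrete (the variable names are illustrative; the numbers are those quoted in this article):

```python
# First-quarter U.S. exports to Vietnam, per the Census Bureau figures above.
fiber_2018, fiber_2017 = 173_000, 57_000                 # short tons of recovered fiber
plastic_2018, plastic_2017 = 79_000_000, 40_000_000      # pounds of recovered plastics

fiber_growth = fiber_2018 / fiber_2017       # ~3.0x year over year
plastic_growth = plastic_2018 / plastic_2017  # ~2.0x year over year
print(f"fiber: {fiber_growth:.1f}x, plastic: {plastic_growth:.1f}x")
```

In other words, recovered-fiber shipments roughly tripled and recovered-plastic shipments roughly doubled over the prior-year period.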

In recent months, exporters have cited the Southeast Asian country as a major alternative market that’s increased in the past year. And the country has been a destination for Chinese scrap processors looking to invest in Southeast Asia as a way to work around China’s restrictions.

Now, Vietnamese officials are restricting imports, similar to the actions taken by the Chinese government over the past year.

Giant wave of imports

A May 21 letter from Vietnam’s Tan Cang-Cai Mep terminal to shipping companies, obtained and shared by the Institute of Scrap Recycling Industries (ISRI), describes a major increase in containers of scrap paper and plastic coming through the port.

The letter notes that a different terminal, Tan Cang-Cat Lai, which is among the largest Vietnamese shipping terminals, has amassed more than 8,000 TEUs (twenty-foot equivalent units, a measurement for container quantity) of scrap plastic and paper on-site as of May 21. Tan Cang-Cai Mep, a smaller terminal, has stockpiled 1,132 TEUs that can’t be moved to the larger terminal due to the buildup. Both terminals are operated by Saigon Newport Corporation (SNP).

The larger terminal has already stopped accepting scrap materials, and the smaller one will stop accepting all containers of scrap plastic June 25 through Oct. 15. The terminals will require more thorough documentation beginning June 15, according to the letter. This means shipments must be accompanied by valid import permits and a written guarantee of when the container will be picked up by the buyer.

“This notice comports with rumors that Vietnamese customers had no more room for imported materials and that the build-up of containers of recovered paper and plastic scrap diverted from China were causing delays at Vietnam’s main import terminal,” ISRI wrote.

ISRI added it is “confident this is more to do with port capacity constraints and not a permanent regulatory shift.”

Violations discovered

A handful of recent announcements from Vietnamese customs officials confirm growing concerns over scrap imports since the beginning of 2018. They describe activities similar to those targeted by Chinese customs officials over the past year.

In April, the General Department of Vietnam Customs (GDVC) described numerous recent violations involving scrap paper and plastic coming into the country. The violations included material not meeting Vietnamese quality standards, mislabeling, the use of false import permits and a lack of permits altogether. In response, customs officials have ramped up inspections. According to GDVC, in some areas agents began inspecting 100 percent of scrap paper and plastic imports.

Most recently, GDVC said it had implemented a “plan on risk management for scrap imports,” which includes continuing the heightened inspections and taking stock of all the containers of recyclables sitting at ports.

GDVC alluded to the Chinese material ban, and said it presented “the potential danger for Vietnam and other Southeast Asian countries to become scrap destinations.”

Restricted import permits

An announcement from SNP in late April shows the overcapacity problem has been growing since the beginning of the year.

On April 23, SNP wrote to its customers that the volume of scrap paper and plastic had “soared” since China banned most of the material. Because of that, Vietnam’s Ministry of Natural Resources and Environment and the Ministry of Finance “restricted the import permits for plastic/paper scrap into Vietnam,” according to the SNP letter.

“Consequentially, consignees are not able to complete customs clearance for many shipments of plastic/paper scrap which have arrived at Vietnam seaports,” SNP wrote.

At that time, 1,000 TEUs of material that Cai Mep had sent to the larger Cat Lai port were stuck with nowhere to go.

“If there is no punctual action, this volume will escalate and cause high yard occupancy at ports” and other storage areas, SNP wrote, “affecting routine operation of both ports and shipping lines.”

China says its waste restrictions will spur US job growth

Posted on June 19, 2018

by Colin Staub

Mixed recyclables

Chinese officials have responded to concerns from other nations about recent import restrictions. The Chinese comments directly address the “waste versus scrap” debate as well as global economic repercussions of National Sword.

The country’s World Trade Organization (WTO) Notification and Enquiry Center, an office of the Ministry of Commerce (MOFCOM), responded this month to comments from several WTO members and industry stakeholders. China’s note was shared by the Institute of Scrap Recycling Industries (ISRI).

Over the past year, some U.S. recycling stakeholders have expressed frustration that Chinese regulators do not understand the impact of the policies they are enacting, and do not understand that imports of scrap material have value. But the new response offers evidence to the contrary.

Some of the key takeaways from the letter include the following:

Spurring domestic investment: China says its restrictions will spur economic development within the U.S. The letter explains that restricting imports into China will lead to increasing volumes kept within the U.S., “thereby spurring a new round of investment in the relevant processing industries.”

“Apart from solving environmental problems faced by both countries, this could also bring more employment opportunities to the U.S. recycling industry and create a win-win situation,” Chinese officials wrote.

Waste versus scrap: Chinese regulators say they are aware of the difference between waste and scrap, a point ISRI and others have questioned many times over the past year. The letter states there is “not any globally recognized standard for scrap materials and recyclable materials,” so the Chinese government went with the internationally recognized HS code (harmonized commodity description) system to define the materials that would be restricted from import.

“China would like to reiterate that solid wastes are different from raw materials in general and inherently pollutive,” the letter states.

‘Disposal to the nearest’: The MOFCOM office noted that ISRI’s comments described the infrastructure for processing recovered materials in the U.S. as a “highly efficient system that incorporates the latest sorting technologies and quality control methods.”

Because of that technical advancement, the Chinese officials believe the U.S. has the ability to manage its recyclables domestically. Doing so, the letter states, would be in line with what China describes as the principles of “waste producers responsibility” and “disposal to the nearest,” meaning countries process their own waste material domestically.

“Over decades, enterprises from other countries including the United States have exported large quantities of solid wastes to China and derived huge financial gains,” Chinese officials wrote. “We earnestly hope that these enterprises could also actively fulfil their international social responsibility and contribute to the alleviation of the environmental and health problems in China.”

Adjusting actions: The letter notes that China modified some of the policy changes due to concerns from affected stakeholders. For example, the letter points to the lower 0.5 percent contamination threshold requirement being deferred until March 1.

“The Chinese government made this decision in consideration of the fact that the relevant industries need transitioning and make adjustments to adapt to the new standards, while the risks of pollution resulting from the accumulation of large quantities of solid waste imports during the transitioning period must also be avoided,” the agency wrote.

0.5 percent is realistic: Also on the contamination standards, the letter notes that plastics have been subject to the 0.5 percent contamination limit since 2005. The letter brings that point up to display that China views the 0.5 percent figure, which this year was applied to paper and most other recyclables, as a realistic limit.

“As our past experience with implementing the standards and inspecting waste imports over more than ten years has shown, imported wastes can meet China’s standards if they are properly categorized or pretreated at their sources,” Chinese officials wrote. “Therefore, this standard is not intended to impose any de facto import restriction.”

Global warming may be twice what climate models predict

Past warming events suggest climate models fail to capture true warming under business-as-usual scenarios

University of New South Wales

Future global warming may eventually be twice as large as projected by climate models under business-as-usual scenarios, and even if the world meets the 2°C target, sea levels may rise six metres or more, according to an international team of researchers from 17 countries.

The findings published last week in Nature Geoscience are based on observational evidence from three warm periods over the past 3.5 million years when the world was 0.5°C-2°C warmer than the pre-industrial temperatures of the 19th Century.

The research also revealed how large areas of the polar ice caps could collapse and significant changes to ecosystems could see the Sahara Desert become green and the edges of tropical forests turn into fire dominated savanna. "Observations of past warming periods suggest that a number of amplifying mechanisms, which are poorly represented in climate models, increase long-term warming beyond climate model projections," said lead author, Prof Hubertus Fischer of the University of Bern.

"This suggests the carbon budget to avoid 2°C of global warming may be far smaller than estimated, leaving very little margin for error to meet the Paris targets."

To get their results, the researchers looked at three of the best-documented warm periods, the Holocene thermal maximum (5000-9000 years ago), the last interglacial (129,000-116,000 years ago) and the mid-Pliocene warm period (3.3-3 million years ago).

The warming of the first two periods was caused by predictable changes in the Earth's orbit, while the mid-Pliocene event was the result of atmospheric carbon dioxide concentrations that were 350-450ppm - much the same as today.

Combining a wide range of measurements from ice cores, sediment layers, fossil records, dating using atomic isotopes and a host of other established paleoclimate methods, the researchers pieced together the impact of these climatic changes.

In combination, these periods give strong evidence of how a warmer Earth would appear once the climate had stabilized. By contrast, today our planet is warming much faster than any of these periods as human caused carbon dioxide emissions continue to grow. Even if our emissions stopped today, it would take centuries to millennia to reach equilibrium.

The changes to the Earth under these past conditions were profound - there were substantial retreats of the Antarctic and Greenland ice sheets and as a consequence sea-levels rose by at least six metres; marine plankton ranges shifted reorganising entire marine ecosystems; the Sahara became greener and forest species shifted 200 km towards the poles, as did tundra; high altitude species declined, temperate tropical forests were reduced and in Mediterranean areas fire-maintained vegetation dominated.

"Even with just 2°C of warming - and potentially just 1.5°C - significant impacts on the Earth system are profound," said co-author Prof Alan Mix of Oregon State University.

"We can expect that sea-level rise could become unstoppable for millennia, impacting much of the world's population, infrastructure and economic activity."

Yet these significant observed changes are generally underestimated in climate model projections that focus on the near term. Compared to these past observations, climate models appear to underestimate long-term warming and the amplification of warmth in polar regions.

"Climate models appear to be trustworthy for small changes, such as for low emission scenarios over short periods, say over the next few decades out to 2100. But as the change gets larger or more persistent, either because of higher emissions, for example a business-as-usual scenario, or because we are interested in the long-term response of a low emission scenario, it appears they underestimate climate change," said co-author Prof Katrin Meissner, Director of the University of New South Wales Climate Change Research Centre.

"This research is a powerful call to act. It tells us that if today's leaders don't urgently address our emissions, global warming will bring profound changes to our planet and way of life - not just for this century but well beyond."

China’s global infrastructure spree rings alarm bells

by Basten Gokkon on 17 July 2018

Governments across Southeast Asia have embraced billions of dollars in construction projects backed by China as they rely on infrastructure-building to drive their economic growth.

But there are worries that this building spree, under China’s Belt and Road Initiative (BRI), makes no concessions for environmental protections, and even deliberately targets host countries with a weak regulatory climate.

Beijing has also been accused of going on a debt-driven grab for natural resources and geopolitical clout, through the terms under which it lends money to other governments for the infrastructure projects.

In parallel, China is also building up its green finance system, potentially as a means to channel more funding into its Belt and Road Initiative.

KUCHING, Malaysia — As governments in Southeast Asia target economic growth through infrastructure development, China, the world’s second-largest economy, has emerged as a ready funder for some of the most ambitious and expensive projects.

Regional leaders have been quick to seize the opportunity offered by Beijing, but environmental experts warn that many of these projects could cause irreversible environmental damage in highly biodiverse areas.

The infrastructure push has come in two waves this past decade, both aimed at establishing new roads, ports and railways across the region in pursuit of improved trade and logistics. In 2010, the Association of Southeast Asian Nations (Asean) announced its Master Plan for Connectivity, which seeks to boost regional integration among the 10 member countries of the bloc through infrastructure projects. Three years later, China inaugurated its Belt and Road Initiative (BRI), a $1 trillion transportation and energy infrastructure construction juggernaut aimed at giving Beijing a strong presence in markets across the region as well as Africa and Europe. The initiative is slated for completion in 2049.

Representatives of governments pose for a photo at the Belt and Road International Forum in May 2017 in Beijing. The BRI has the backing of dozens of nations that are home to billions of people. Image courtesy of the Russian Presidential Press and Information Office (by CC 4.0).

China’s Belt and Road Initiative, showing China in red and the members of the Asian Infrastructure Investment Bank in orange. The BRI comprises six proposed corridors of the Silk Road Economic Belt, a land transportation route running from China to southern Europe via Central Asia and the Middle East; and the 21st-Century Maritime Silk Road, a sea route connecting the port of Shanghai to Venice, via India and Africa. Image by Lommes (by CC 4.0).

Overall, these and other initiatives would see paved roads in Asia’s developing nations double in length in the next few years, according to a 2017 report.

Asean is one of China’s largest trading partners, and given its vast natural resources and geographical proximity to China, the association of fast-growing nations is a key priority for the BRI.

More than 60 percent of Chinese overseas direct investment (ODI) to BRI countries from 2013 to 2015 went to Asean member states, according to The Economist Intelligence Unit. China’s total ODI in BRI countries was $21.4 billion in 2015, up from $13.6 billion in 2014 and $12.6 billion in 2013. Much of this money has gone to finance large, and often highly controversial, infrastructure projects.

One of them, for instance, is a high-speed rail line in Indonesia, which the government in Jakarta shelved due to a lack of proper environmental impact studies and conflicts with local zoning plans. Also in Indonesia, a massive dam project supported by the Bank of China and Sinohydro, China’s hydropower authority, has been widely criticized for threatening the only known habitat of the world’s rarest great ape, the Tapanuli orangutan.

There have also been complaints in Laos, Vietnam and Cambodia about potential damage to the environment and communities from Chinese-backed hydropower projects along the Mekong River.

What’s at stake?

The BRI may appear to be an ambitious take on globalization, by covering 68 countries through trade and commerce and linking them to China, but there’s more to it than that, says Mason Campbell, a postdoctoral research fellow at the Centre for Tropical Environmental and Sustainability Science at James Cook University in Australia.

“It’s basically under the guise of internationalizing China, but it’s more about resource extraction and attaining materials,” Campbell told Mongabay on the sidelines of the 2018 conference of the Association for Tropical Biodiversity Conservation in Kuching, Malaysia, in early July.

Wood that was cut for charcoal production near Katha, northern Myanmar. Villages in this area used to cut teak and other valuable trees for export, but have since started producing charcoal. The charcoal being exported to China can be made from any tree, regardless of size or species. This amount of trees would produce about 150 bags of charcoal. Image by Nathan Siegel for Mongabay.

Campbell, whose work has focused on the Pan Borneo Highway — a project linking the two Malaysian states on the island of Borneo with the nation of Brunei — said China’s way of asserting its influence in most infrastructure projects was typically by pushing for Chinese companies to work abroad or finance a development project in a foreign country that would eventually benefit China.

China has pursued minerals, fossil fuels, agricultural commodities and timber from other nations under this model. Even as a net importer of coal, China has more than 1,600 coal plants scheduled to be built by Chinese firms in more than 62 countries, including in Southeast Asia. The country is also investing $100 billion annually in Africa in extractive mineral industries and the associated transportation and energy infrastructure. The effort to secure these resources has spawned its own infrastructure boom, typically involving large-scale roads, railways and other infrastructure to transport commodities from interior areas to coastal ports for export.

In an analysis considering only the backbone BRI projects, and not these various side projects, the WWF estimates that the Chinese initiative will directly impact 265 threatened species, including endangered tigers, giant pandas, gorillas, orangutans and saiga antelopes.

“There is significant potential overlap between the terrestrial BRI corridors and areas that are important for biodiversity conservation and for the provision of social and economic benefits to people,” the wildlife NGO wrote in the report. “These overlaps indicate risk areas for potentially negative impacts of infrastructure development.”

It said the major BRI corridors would cut through or broadly overlap with 1,739 Important Bird Areas or Key Biodiversity Areas, as well as 46 biodiversity hotspots or Global 200 Ecoregions.

“It’s huge. It’s as big as oil palm,” Alex Lechner, a researcher from the School of Environmental and Geographical Sciences at the University of Nottingham Malaysia Campus, told Mongabay.

In a report published earlier this year, Lechner highlighted the rampant biodiversity loss in Asia from the BRI, which crosses several terrestrial and marine biodiversity hotspots, wilderness areas and other key conservation areas, such as Southeast Asia’s Coral Triangle.

Road development, the report said, will create direct and indirect impacts, such as habitat loss and fragmentation, and illegal activities such as poaching and logging. In the marine environment, increased sea traffic exacerbates the movement of invasive species and pollution. Poorly planned infrastructure also risks locking in undesirable environmental practices for decades to come, it added.

“BRI could have disastrous consequences for biodiversity,” Lechner wrote in the report.

This map provides an overview of the land areas slated for infrastructure development, and likely to be at highest environmental risk, as a result of the BRI corridors. According to a WWF analysis, the BRI is likely to impact threatened species, environmentally important areas, protected areas, water-related ecosystem services, and areas with important wilderness characteristics. Image by WWF.

China is not the only nation promoting its own economic interests over those of other countries and their environmental health. It’s a story that has played out going back to the colonial period and earlier, when European nations ruthlessly exploited the resources and people of Africa, Asia and the Americas.

The difference with China is both scale and speed, says William Laurance, a professor at James Cook University, whose work in the past two decades has primarily focused on infrastructure projects and their impact on the environment.

“No nation has ever changed the planet so rapidly, on such a large scale, and with such single-minded determination,” Laurance wrote in an op-ed last year. “It is difficult to find a corner of the developing world where China is not having a significant environmental impact.”

He also noted that China currently lacks other key factors that would make it easier to track and mitigate the impacts of projects, such as a free press or laws regulating the practice of Chinese businesses abroad.

According to a major World Bank analysis of nearly 3,000 projects, Chinese foreign investors and companies often predominate in poorer nations with weak environmental regulations and controls. This, Laurance said, makes those nations prime “pollution havens” for China and Chinese enterprises, as the latter wouldn’t take any blame for the environmental damage wrought by their activities in the host countries.

Mohammed Alamgir, an environmental scientist at James Cook University, described China’s BRI as also a way of buying long-term political influence across the globe.

“In this arena of political economy, this huge investment from China, they’re the only player in that field,” he said.

Campbell said the political favors sought by China would become clearer once the country receiving the funding for infrastructure projects struggled to pay back the money. A prominent case is that of the $1.5 billion Hambantota port in Sri Lanka. With the host government overleveraging itself to finance the project, it has had to give China a 99-year lease on the port in exchange for debt relief.

The scale of China’s international resource exploitation is only likely to increase. The Beijing-backed Asian Infrastructure Investment Bank (AIIB), the primary investor in BRI projects, is heavily capitalized and moving rapidly to fund overseas projects with “streamlined” environmental and social safeguards. However, host nations can, in theory, negotiate for much more stringent standards for individual projects.

In the meantime, the World Bank in 2016 announced new environmental and social safeguards to remain competitive with the AIIB. But those new standards have been described by some experts, including Laurance, as weaker than the lender’s previous framework.

Laurance said the AIIB and other Chinese development banks could force a “race to the bottom” among multilateral lenders — with potentially grave consequences for the global environment.

In 2016, Chinese President Xi Jinping called for a “green, healthy, intelligent and peaceful” Silk Road. He said the participating countries should “deepen cooperation in environmental protection, intensify ecological preservation and build a green Silk Road.” Over the past decade, Chinese government ministries have released a series of “green papers” outlining lofty environmental and social guidelines for China’s overseas ventures and corporations.

But researchers remain skeptical about this commitment.

“The Chinese government readily admits that compliance with its guidelines is poor, but accepts no blame for this. Instead, it insists that it has little control over its corporations and blames the host nations themselves for not controlling Chinese corporations more carefully,” Laurance said.

“If China really wanted to rein in its freewheeling corporations, it could easily do so by making some strong official statements and visibly punishing a few extravagant sinners. It hasn’t done this for one simple reason: Despite their often-egregious environmental activities, China’s corporations operating overseas are enormously profitable.”

Alamgir said there was still hope for China to improve its environmental commitment by imposing more stringent strategic environmental assessments throughout the project development and funding processes.

“At the end of the day, it’s political will,” he said. “If the Chinese government had the political will to do that.”

Lechner called for more scientists from the global conservation community to look into the environmental impacts of the BRI, and conduct joint research with colleagues in China to better understand the dynamics of the initiative. He said researchers in China had produced about 90 papers reviewing the BRI, but they were in Chinese and only a small percentage were related to environmental aspects.

“I think there’s a lot to be learned about what’s happening internally [in China], because maybe we think [that] it’s a vacuum because it’s not in English, but maybe there are things going on internally,” Lechner said.

Green finance framework

In parallel with the BRI push, China appears to be developing its green financing system. A discussion involving 120 policymakers, financial regulators and practitioners from more than 35 countries in Asia, Africa and Latin America took place in China in May.

The six-day event was also attended by experts from more than 50 international organizations and commercial entities, including the Asian Development Bank, the Climate Bonds Initiative and the Commercial Bank of China. It highlighted Chinese roadmaps for green financial systems and the barriers related to regulatory policy, green definitions, awareness-raising, capacity-building and data collection.

“It was extremely important for senior government leaders at the highest level to send a strong policy signal to regulators and market participants on the importance of green finance to the economy,” said Ma Jun, director of the Center for Finance and Development at Tsinghua University in Beijing, who also led the drafting of China’s green finance guidelines in 2015-2016.

Sean Kidney, CEO of the Climate Bonds Initiative, said the green bond market has taken off in the past couple of years, with China and the United States leading the global market.

The Chinese experience, Kidney said, demonstrates that developing a clear green bond taxonomy, verifiers, and disclosure rules will help avoid the problem of greenwashing and also paves the way for smooth accreditation and verification of bond issuances and proceeds management.

Financial institutions based in Shanghai are also seen as driving green finance through product innovation, according to Clair Liu, head of international business at the Shanghai Stock Exchange. She cited the exchange’s experience in promoting environmental information disclosure, green bonds, and green index development.

To achieve a successful green financial system, Ma called on governments to do more to encourage private capital, and to train and develop human resources in the sector.

“Policy coordination among ministries, development of taxonomies, and information disclosure are also key to success,” said Ma, whose current research priorities include investment in the BRI.

Editor’s note: William Laurance is a member of Mongabay’s advisory board.

Company takes meat off the menu at employee events

By The Associated Press

Posted at 6:41 PM, Updated at 7:24 PM

NEW YORK — Office space sharing company WeWork says it is no longer serving red or white meat at company events.

In an email to employees Thursday, co-founder and Chief Creative Officer Miguel McKelvey said the company won’t serve pork, poultry or red meat, and it won’t allow employees to expense meals that include those meats to the company. Fish will stay on the menu.

McKelvey said the change means WeWork will use less water, produce less carbon dioxide and save the lives of animals.

The company said employees are welcome to bring whatever food they want to work.

The policy is effective immediately and also applies to the company’s Summer Camp gathering in the United Kingdom in August. McKelvey wrote that WeWork could save 10,000 animals by eliminating meat at the upcoming Summer Camp event. It has 6,000 employees and some 5,000 attended the event in 2017.

WeWork has locations in 22 countries and a total of 75 cities, including 24 in the U.S. The company declined to say how much water it expects to save in 2018 or 2019 from the new policy, or how much carbon dioxide it could save in the next two years. The company gave longer term projections based on its estimates for future growth over five years.

WeWork’s meat policy may be unique, but it joins a growing group of companies that have recently looked for ways to reduce their impact on the environment. Coffee chain Starbucks, airlines including American and Alaska, and the Hilton and Hyatt hotel chains have all recently announced that they will stop using plastic straws so they produce less plastic waste. Some cities have banned the straws as well.

This San Francisco Brand Claims To Have The World's Most Eco-Friendly Jeans

Esha Chhabra , Contributor for Forbes

May 15, 2018 @ 01:19 PM

Amour Vert's new lineup of jeans claims to be the most sustainable denim in the world.

For the past seven years, San Francisco-based fashion brand Amour Vert has focused on manufacturing clothes in the Bay Area made from natural, eco-friendly materials. Now, in collaboration with LA-based denim label AGOLDE, it has released a collection of jeans that may well be the world's most eco-friendly.

The average American woman owns seven pairs of jeans at any one time, and recent research puts the average quantity of water used to produce a single pair at 1,800 gallons (jeans are washed repeatedly to achieve the right color and feel). For perspective, that works out to roughly 30 years' worth of drinking water for the average American. In the face of a global water crisis, the fashion industry's insatiable thirst is one of its most damaging attributes, and one that Amour Vert's new approach aims to combat.
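As a rough sanity check, the 30-year figure follows from the 1,800-gallon estimate only under an assumption about daily drinking water. The per-person figure below (about 0.16 gallons, or 0.6 liters, a day) is our assumption, not a number from the article:

```python
# Back-of-envelope check of the article's figures. GALLONS_PER_PAIR is the
# article's number; DRINKING_GAL_PER_DAY is an assumed daily drinking-water
# intake chosen to illustrate the claim.
GALLONS_PER_PAIR = 1_800
DRINKING_GAL_PER_DAY = 0.16  # ~0.6 liters/day (assumption)

days = GALLONS_PER_PAIR / DRINKING_GAL_PER_DAY
years = days / 365
print(f"{years:.0f} years of drinking water")  # prints "31 years of drinking water"
```

At a higher intake assumption (say half a gallon a day), the same 1,800 gallons works out to roughly a decade instead, so the equivalence depends heavily on the baseline chosen.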

Produced at the AGOLDE factory in downtown Los Angeles using an ozone wash technology, the AGOLDE x Amour Vert jeans achieve the desired vintage look, they argue, without the high environmental cost usually associated with denim manufacturing.

The ozone wash procedure relies on a combination of electricity and gas through which denim can be aged in a natural and non-polluting process. It reportedly consumes less than one-tenth of the water used in traditional production methods. It also eliminates the need for harmful chemical processes such as bleaching. Subsequently, lasers are used to ensure that the washes and finishes are consistent. Amanda Halper Salinas, director of Marketing at Amour Vert, says this significantly reduces the quantity of water and the number of wash cycles needed to achieve the desired vintage-look of the denim.

The fashion industry is a major contributor to global pollution, and though the industry seems to be waking up to the impact it's having globally, few brands are making tangible efforts to reduce their carbon footprint and provide customers with a more ethical alternative. Amour Vert has been chipping away at that goal for nearly a decade -- and jeans, Salinas acknowledges, have been one of the hardest products to create in an eco-friendly manner.

The jeans are produced in Los Angeles using more eco-friendly approaches.

Founded by Parisian native Linda Balti and her husband Christoph Frehsee in 2011 on the principle that “a woman should never have to sacrifice style for sustainability,” the company embraces a zero-waste philosophy and has worked alongside American Forests to plant over 160,000 trees to offset its manufacturing footprint.

Amour Vert chose to collaborate with AGOLDE, the LA-based denim manufacturer, because they had worked for years on the technology and had the know-how, Salinas says, to transform denim from a polluting garment to a sustainable one.

Citizens of Humanity, also known for its jeans, is the parent company of AGOLDE. Federico Pagnetti, chief operating officer at Citizens of Humanity, says, “We have employees who have been in the business for nearly 15 years and during those years have had time to master their craft. They’ve been constantly evolving and are continuously on the lookout for new sustainable technology and methods to implement.”

Then there are the materials themselves: the new lineup uses certified organic cotton, sourced globally, because conventional cotton reportedly accounts for 16 percent of the world’s insecticide use and an estimated two billion dollars of pesticides annually. Overall, the company opts to source organic materials where possible. Because organic cotton is grown without insecticides and pesticides, the company estimates this yields a 46 percent reduction in CO2 production, a 91 percent reduction in water consumption and a significantly diminished contribution to the acidification of nearby land and water.

Amour Vert is one of a growing number of brands focusing on ethical fashion.

Made in what Amour Vert calls one of the country’s only vertically integrated denim production facilities, the jeans come in three cuts and nine washes. With prices between $148 and $198, they’re definitely investment pieces, but the female-led startup maintains that the premium pricing reflects the craftsmanship and quality that go into the garments, rather than the customer covering the costs of sustainability.

Pagnetti adds: “We will never price the garments higher because they are more ecological than other jeans. As a company, we chose to invest in establishing ourselves as a sustainable brand because that is what we believe in and aligns with our goal to make as little of a footprint on our environment. We don’t want our customer to pay for our choice to be sustainable, we just hope they believe in the product and intention as much as we do.”

So does this new denim range demonstrate that ethical fashion need not involve a compromise on style or a huge financial commitment? Could this indicate a change in the fashion industry’s relationship with environmental and social sustainability?

Frankly, there have been numerous denim brands focused on using more eco-friendly materials, like organic cotton, or repurposing waste fabrics. Brands such as Nudie Jeans have been tracking the transparency of their supply chain and producing modern cuts of raw denim for decades; SOURCE Denim uses a biodegradable material (made of crab shells of all things) for its finish, cutting down on water usage and eliminating chemicals. In fact, even Levi’s announced this year that it’s going to incorporate lasers in its dyeing process to cut down on chemical usage.

Nevertheless, Amour Vert offers a certain fit, look, and style that will appeal to women, giving them something that’s chic and yet eco-friendly, which has been a gap in the denim market.

Flint estimates 14,000 lead water service lines still in the ground

Updated May 27; Posted May 27

By Ron

FLINT, MI -- The city estimates 14,000 damaged lead and galvanized water service lines remain buried, about 15 percent more than past projections.

Director of Public Works Robert Bincsik provided the update in a May 16 letter to the U.S. Environmental Protection Agency, basing the projection on what crews have uncovered so far in work for the Flint Action and Sustainability Team (FAST) Start project.

City spokeswoman Kristin Moore said she doesn't believe the new estimate will require work to continue beyond next year and said crews are running ahead of schedule with work so far in 2018.

"During the first four phases of FAST Start, the city has conducted excavations at approximately 8,843 homes," Bincsik's letter says. "Of those ... 6,256 lead or galvanized steel service lines were identified or replaced ... approximately 30 percent of the lines were identified as already being copper from the water main to the house."

With no state bottled water, Flint pipe replacement starts with new urgency

"They said the (water points of distribution) would stay open until (service line removals were complete), and yet again they have backed off of their word and what they said they would do," Mayor Karen Weaver said in a statement released by the city earlier this week. "This is exactly why the people's trust has not been restored."

Experts have said lead and galvanized lines became damaged by corrosive Flint River water used by the city from April 2014 until October 2015. Those lines essentially absorbed -- and have the potential to release -- lead particles.

The city has about 30,000 water accounts, 28,400 of which are residential.

Bincsik's letter says that if the percentage of copper service lines still in the ground holds as it has so far, Flint still has about 14,000 lines to replace.

The city's past shoddy record-keeping has slowed the replacement of galvanized and lead lines, forcing the use of hydro-excavation contractors to determine which lines need replacement and which can remain.

Flint data on lead water lines stored on 45,000 index cards

FLINT, MI — The city knows which homes in Flint have pipes most likely to leach lead into tap water but can't easily access the information because it's kept on about 45,000 index cards.

City officials have said previously that information about the composition of the service lines was often stored on blueprints or index cards, and in some cases, no information on the lines was maintained by the city.

In February, the EPA asked the city to update its efforts to rebuild its service line inventory, prompting Bincsik's letter.

Bay mussels in Puget Sound show traces of oxycodone

May 9, 2018

By Jeff Rice

The opioid epidemic has now hit the waters of Puget Sound. State agencies tracking pollution levels in Puget Sound have discovered traces of oxycodone in the tissues of native bay mussels (Mytilus trossulus) from Seattle and Bremerton area harbors.

The mussels were part of the state’s Puget Sound Mussel Monitoring Program. Every two years, scientists at the Washington Department of Fish and Wildlife (WDFW) transplant uncontaminated mussels from an aquaculture source on Whidbey Island to various locations in Puget Sound to study pollution levels. Mussels, which are filter feeders, concentrate contaminants from the local marine environment into their tissues. After two to three months at the transplant site, scientists analyze the contaminants in the collected mussel tissues.

The areas where the oxycodone-tainted mussels were sampled are considered highly urbanized and are not near any commercial shellfish beds. “You wouldn’t want to collect (and eat) mussels from these urban bays,” explained Andy James of the Puget Sound Institute (PSI), who assisted with the study. The oxycodone was found in amounts thousands of times lower than a therapeutic dose for humans and would not be expected to affect the mussels, which likely don’t metabolize the drug, James said. The findings may raise concerns for fish, however, which are known to respond to opioids. Lab studies show that zebrafish will learn to dose themselves with opioids, and scientists say salmon and other Puget Sound fish might have a similar response.

Scientists typically find many chemical compounds in Puget Sound waters, ranging from pharmaceuticals to illicit drugs such as cocaine, but this is the first time that opioids have been discovered in local shellfish. The contaminants in this case are thought to be passed into Puget Sound through discharge from wastewater treatment plants. Even filtered wastewater can potentially include traces of thousands of chemicals known as contaminants of emerging concern (CECs). Runoff from agriculture and stormwater are also common sources of CECs.

In addition to oxycodone, the mussels also showed high levels of the chemotherapy drug Melphalan, a potential carcinogen due to its interactions with DNA. The drug was found at “levels where we might want to look at biological impacts,” said James. The mussels had ingested amounts of Melphalan that, relative to body weight, approached a recommended dose for humans.

These Puget Sound mussel monitoring studies occur every two years and are currently funded by WDFW, the state’s Stormwater Action Monitoring program, and various other regional partners. The monitoring is led by Jennifer Lanksbury of WDFW’s Toxics-focused Biological Observing System (TBiOS), along with help from a host of citizen science volunteers from across Puget Sound. PSI’s Andy James worked with TBiOS on the chemical analysis and presented the findings at last month’s Salish Sea Ecosystem Conference.

James and his team at the University of Washington’s Center for Urban Waters in Tacoma are now using high resolution mass spectrometry to look for additional chemical exposures in the mussel tissues and to evaluate potential biological impacts on Puget Sound species.

Waste Heat: Innovators Turn to an Overlooked Resource

Nearly three-quarters of all the energy produced by humanity is squandered as waste heat. Now, large businesses, high-tech operations such as data centers, and governments are exploring innovative technologies to capture and reuse this vast energy source.

By Nicola Jones • May 29, 2018

When you think of Facebook and “hot air,” a stream of pointless online chatter might be what comes to mind. But the company will soon be putting its literal hot air — the waste heat pumped out by one of its data centers — to good environmental use. That center, in Odense, Denmark, plans to channel its waste heat to warm nearly 7,000 homes when it opens in 2020.

Waste heat is everywhere. Every time an engine runs, a machine clunks away, or any work is done by anything, heat is generated. That’s a law of thermodynamics. More often than not, that heat gets thrown away, dribbling out into the atmosphere. The scale of this invisible garbage is huge: About 70 percent of all the energy produced by humanity gets chucked as waste heat.

“It’s the biggest source of energy on the planet,” says Joseph King, one of the program directors for the U.S. government’s Advanced Research Projects Agency-Energy (ARPA-E), an agency started in 2009 with the mission of funding high-risk technology projects with high potential benefit. One of the agency’s main missions is to hike up energy efficiency, which means both avoiding making so much waste heat in the first place, and making the most of the heat that’s there. ARPA-E has funded a host of innovative projects in that realm, including a $3.5 million grant for RedWave Energy, which aims to capture the low-temperature wasted heat from places like power plants using arrays of innovative miniature antennae.

“The attitude has been that the environment can take this waste,” says one expert. “Now we have to be more efficient.”

The problem is not so much that waste heat directly warms the atmosphere — the heat we throw into the air accounts for just 1 percent of climate change. Instead, the problem is one of wastage. If the energy is there, we should use it. For a long time, says Simon Fraser University engineer Majid Bahrami, many simply haven’t bothered. “The attitude has been that the environment can take this waste; we have other things to worry about,” he says. “Now we have to be more efficient. This is the time to have this conversation.”

The global demand for energy is booming — it’s set to bump up nearly 30 percent by 2040. And every bit of waste heat recycled into energy saves some fuel — often fossil fuels — from doing the same job. Crunching the exact numbers on the projected savings is hard to do, but the potential is huge. One study showed that the heat-needy United Kingdom, for example, could prevent 10 million tons of carbon dioxide emissions annually (about 2 percent of the country’s total) just by diverting waste heat from some of the UK’s biggest power stations to warm homes and offices. And that’s not even considering any higher-tech solutions for capturing and using waste heat, many of which are now in the offing.

To help reduce carbon emissions — not to mention saving money and lessening reliance on foreign fuel imports — governments are increasingly pushing for policies and incentives to encourage more waste heat usage, big businesses like IBM are exploring innovative technologies, and start-ups are emerging to sell technologies that turn lukewarm heat into usable electricity.

For more than a century, waste heat has been used for its most obvious application: heat (think of your car, which uses waste heat from its engine to heat your interior). In 1882, when Thomas Edison built the world’s first commercial power plant in Manhattan, he sold its steam to heat nearby buildings. This co-generation of electricity and usable heat is remarkably efficient. Today, in the United States, most fossil fuel-burning power plants are about 33 percent efficient, while combined heat and power (CHP) plants are typically 60 to 80 percent efficient.
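The efficiency gap can be made concrete with a rough sketch. Using the article's figures (33 percent for a typical fossil-fuel plant, roughly 70 percent for CHP) plus one assumed number not in the article (a 90 percent-efficient boiler producing the heat separately), co-generation saves on the order of 30 percent of the fuel:

```python
# Rough fuel-savings sketch. Plant efficiencies (0.33, 0.70) are from the
# article; the 0.90 boiler efficiency and the 33/37 power-to-heat demand
# split are illustrative assumptions.
electricity = 33.0  # units of electricity demanded
heat = 37.0         # units of useful heat demanded

fuel_separate = electricity / 0.33 + heat / 0.90  # power plant + boiler
fuel_chp = (electricity + heat) / 0.70            # one CHP plant covers both
savings = 1 - fuel_chp / fuel_separate
print(f"fuel saved by CHP: {savings:.0%}")  # prints "fuel saved by CHP: 29%"
```

The exact percentage shifts with the assumed demand split and boiler efficiency, but under any plausible inputs the co-generation route burns substantially less fuel for the same delivered energy.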

When it opens in 2020, Facebook's new data center in Odense, Denmark will channel its waste heat to warm nearly 7,000 homes. Facebook

That seems to make co-generation a no-brainer. But heat is harder to transport than electricity — the losses over piped distances are huge — and there isn’t always a ready market for heat sitting next to a power plant or industrial facility. Today, only about 10 percent of electricity generation in the U.S. produces both power and usable heat; the Department of Energy has a program specifically to boost CHP, and considers 20 percent a reasonable target by 2030.

Other countries have an easier time thanks to existing district heating infrastructure, which typically pipes locally produced heat into homes as hot water. Denmark is a leader here. In response to the 1970s oil crisis, the country began switching to other energy sources, including burning biomass, which lend themselves to district heating. As a result, Denmark has an array of innovative waste-heat capture projects that can be added onto existing systems, including the upcoming Facebook data center.

In 2010, for example, Aalborg’s crematorium started using its waste heat to warm Danish homes (after the Danish Council of Ethics judged it a moral thing to do). Others are joining in. In Cologne, Germany, the heat of sewage warms a handful of schools. In London, the heat from the underground rail system is being channelled to heat homes in Islington. An IBM data center in Switzerland is being used to heat a nearby swimming pool. “Data centers crop up again and again as having huge potential,” says Tanja Groth, an energy manager and economist with the UK’s Carbon Trust, a non-profit that aims to reduce carbon emissions.

An alternative option is to turn waste heat into easier-to-transport electricity. While many power plants do that already, regulators striving for energy security are keen to push this idea for independent power producers like large manufacturers, says Groth. Businesses that make their own power would reduce carbon emissions by getting any extra electrical juice they need by squeezing it out of their waste heat, rather than buying it from the grid.

Waste heat is a problem of a thousand cuts, requiring a mass of different innovations.

Several companies have popped up to help do just this. One of the largest, Turboden, based in Brescia, Italy, sells a mechanical system based on the Organic Rankine Cycle. This is a type of external combustion engine — an idea that pre-dates the internal combustion engine used in cars. Rankine engines and similar technologies are contained, closed-loop systems in which a working fluid expands from liquid to gas to do work, driven by a temperature difference across the outside of the system — so a power-generating engine can run off waste heat. When a cement plant in Bavaria, for example, added a Rankine engine to its system a decade ago, it reduced its electricity demand by 12 percent and its CO2 emissions by about 7,000 tons.
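For readers wondering why low-temperature waste heat is so hard to exploit, the Carnot limit is the governing formula: no heat engine, Rankine-cycle or otherwise, can convert a larger fraction of heat to work. The temperatures below are illustrative assumptions, not figures from the article:

```python
# The Carnot limit: the theoretical maximum fraction of heat an engine can
# turn into work, set purely by the hot and cold temperatures (in kelvin).
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum heat-to-work conversion fraction; inputs in degrees Celsius."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1 - t_cold / t_hot

# Low-grade waste heat (assumed 120 C exhaust against 20 C ambient)
print(f"{carnot_efficiency(120, 20):.0%}")  # prints "25%"
# Higher-grade industrial heat (assumed 400 C source)
print(f"{carnot_efficiency(400, 20):.0%}")  # prints "56%"
```

Real engines achieve only a fraction of these ceilings, which is why lukewarm streams that carry enormous total energy are so difficult to turn into useful electricity.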

Since 2010, Turboden says it has sold systems for waste heat recovery to 28 production plants, with seven more under construction now. Turboden is just one of many; the Swedish company Climeon, for example, endorsed by spaceflight entrepreneur Richard Branson, uses a related technique to make an efficient heat engine that can be bolted onto anything industrial, from cement plants to steel mills, in order to recycle their waste heat.

Waste heat is a problem of a thousand cuts, requiring a mass of innovations to tackle different slices of the problem: a system that works for one temperature range, for example, might not work for another, and some waste heat streams are contaminated with corrosive pollutants. “We aren’t looking for a silver bullet,” says Bahrami. “There are so many different things that can be done and should be done.”

About 70 percent of all the energy produced globally gets discarded as waste heat.

Bahrami and others are pursuing solid-state systems for waste heat recovery, which, with no moving parts, can in theory be smaller and more robust than mechanical engines. There is a wide array of ways to do this, based on different bits of physics: thermoacoustics, thermionics, thermophotovoltaics, and more, each with pros and cons in terms of their efficiency, cost, and suitability to different conditions.

“Thermoelectrics have been the major player in this space for years,” says Lane Martin, a materials scientist at the University of California, Berkeley and Lawrence Berkeley National Laboratory. Seiko released a “thermic watch” in 1998 that ran off the heat of your wrist, for example, and you can buy a little thermoelectric unit that will charge your cell phone off your campfire. Researchers are trying hard to increase the efficiency of such devices so they make economic sense for wide-scale use. That means screening thousands of promising new materials to find ones that work better than today’s semiconductors, or tweaking the microstructure of how they’re built.

The biggest technological challenge is to pull energy from the lukewarm end of the spectrum of waste heat: More than 60 percent of global waste heat is under the boiling point of water, and the cooler it is, the harder it is to pull usable energy from it. Martin’s group is tackling this by investigating pyroelectrics (which, unlike thermoelectrics, work by exploiting electron polarization). This isn’t near commercial application yet; it’s still early days in the lab. But the thin-film materials that Martin’s team is investigating can be tuned to work best at specific temperatures, while thermoelectrics always work better the larger the temperature difference. Martin imagines future systems that stack thermoelectric materials to suck up some of the warmer waste heat, say above 212 degrees Fahrenheit, and then use pyroelectrics to mop up the rest. Martin says his recent work on such materials drummed up interest from a few bitcoin mining operations. “They have a real problem with waste heat,” says Martin. “Unfortunately, I had to tell them it’s a little early; I don’t have a widget I can sell them. But it’s coming.”
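The physics behind that difficulty can be made concrete with the Carnot limit, which caps the fraction of heat any engine can convert to work based on the temperature difference between the heat source and its surroundings. A minimal sketch (the temperatures below are illustrative examples, not figures from the article):

```python
# Carnot limit: no heat engine can convert more than 1 - T_cold/T_hot of the
# heat it absorbs into work (temperatures in kelvin). This is why lukewarm
# waste heat, below the boiling point of water, is so hard to exploit.

def carnot_efficiency(t_hot_c: float, t_cold_c: float = 25.0) -> float:
    """Upper bound on heat-to-work conversion; inputs in degrees Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Hot industrial exhaust versus lukewarm waste heat (example temperatures):
for t_c in (400.0, 100.0, 60.0):
    print(f"{t_c:5.0f} C source: at most {carnot_efficiency(t_c):.0%} of the heat can become work")
```

Real devices recover well under this bound; the point is that the ceiling itself collapses as the source temperature falls toward ambient, dropping from over half for hot exhaust to roughly a tenth for 60-degree-Celsius streams.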

Perhaps one of the best applications for waste heat is, ironically, cooling. Air conditioners and fans already account for about 10 percent of global energy consumption, and demand is set to triple by 2050. In urban areas, air conditioners can actually heat the local air by nearly 2 degrees F, in turn driving up the demand for more cooling.

One solution is to use waste heat rather than electricity to cool things down: absorption or adsorption coolers use the energy from heat (instead of electrically driven compression) to condense a refrigerant. Again, this technology exists — absorption refrigerators are often found in recreational vehicles, and tri-generation power plants use such technology to make usable electricity, heat, and cooling all at once. “Dubai and Abu Dhabi are investing heavily in this because, well, they’re not stupid,” says Groth.

But such systems are typically bulky and expensive to install, so again research labs are on a mission to improve them. Project THRIVE, led in part by IBM Research in Rüschlikon, Switzerland, is one player aiming to improve sorption materials for both heating and cooling. They have already shown how to shrink some systems down to a reasonable size. Bahrami’s lab, too, is working on better ways to use waste heat to cool everything from long-haul trucks to electronics.

It’s very hard to know which strategies or companies will pan out. But whatever systems win out, if these researchers have their way, every last drop of usable energy will be sucked from our fuel and mechanical systems. “Waste heat is often an afterthought,” says King. “We’re trying to make it a forethought.”

Climate change course aims to persuade evangelicals to go green

Christian Today staff writer Tue 29 May 2018 11:23 BST

A new climate change course is targeting evangelical Christians, hoping to persuade the notoriously sceptical group to go green.

Only four per cent of Christians think the environment is one of the most important issues facing the UK, according to 2015 polling, and conservative evangelicals are much less likely than other Christians to be concerned by the impact of climate change in the future.

Plastic waste is increasingly recognised as an environmental disaster, in part thanks to the BBC's Blue Planet documentary.

Tenants of the King is a new course designed to persuade climate-sceptic Christians that the New Testament teaches them to live sustainably.

'We wanted to provide something for Christians who struggle to see the connection between environmental issues, the teaching of the Bible and their daily walk as Christians,' said Stephen Edwards, from Operation Noah, a Christian climate change charity launching the programme.

'We've put together a Bible-based, Jesus-focused study guide which we hope can help communicate environmental issues from a Christian perspective, equipping believers to take confident personal and political action.'

It is backed by the bishop of Kensington, Graham Tomlin, as well as Rev Mark Melluish, from New Wine, Dr Ruth Valerio, of Tearfund, and Dr Justin Thacker, of Cliff College, who have contributed video reflections for the small group discussions.

Valerio said: 'We live in an amazing world that God has placed us in and has asked us to look after. As a church we have got a part to play in responding to God's call. I want to recommend this resource as a way to learn about the calling God has given us to take care of this world.'

Liberals to buy Trans Mountain pipeline for $4.5B to ensure expansion is built

Canadian public could also incur millions to construct expansion project with estimated price tag of $7.4B

Kathleen Harris · CBC News · Posted: May 29, 2018 8:15 AM ET

Finance Minister Bill Morneau announced today the government is buying the Trans Mountain pipeline for $4.5 billion.

The Liberal government will buy the Trans Mountain pipeline and related infrastructure for $4.5 billion, and could spend billions more to build the controversial expansion.

Finance Minister Bill Morneau announced details of the agreement reached with Kinder Morgan at a news conference with Natural Resources Minister Jim Carr this morning, framing the short-term purchase agreement as financially sound and necessary to ensure a vital piece of energy infrastructure gets built.

"Make no mistake, this is an investment in Canada's future," Morneau said.

Morneau said the project is in the national interest, and proceeding with it will preserve jobs, reassure investors and get resources to world markets. He said he couldn't state exactly what additional costs will be incurred by the Canadian public to build the expansion, but suggested a toll paid by oil companies could offset some costs and that there would be a financial return on the investment.

Kinder Morgan had estimated the cost of building the expansion would be $7.4 billion, but Morneau insisted that the project will not have a fiscal impact, or "hit."

He said the government does not intend to be a long-term owner, and at the appropriate time, the government will work with investors to transfer the project and related assets to a new owner or owners. Investors such as Indigenous groups and pension funds have already expressed interest, he said.

Until then, the project will proceed under the ownership of a Crown corporation. The agreement, which must still be approved by Kinder Morgan's shareholders, is expected to close in August.

A senior government official, speaking on background, said the government hopes to get a new commercial buyer for the pipeline by August, but if that doesn't happen, it will put up the $4.5 billion to purchase the assets.

The government won't publicly discuss construction cost for the expansion because it wants private companies to carry out their own assessments, then bid on the project, the official said.

Conservative Leader Andrew Scheer said today's decision does nothing to advance the project, since the legal questions and obstacles still remain. He said the government has failed to take action to ensure certainty around the expansion by resolving jurisdictional issues.

"This is a very, very sad day for Canada's energy sector. The message that is being sent to the world is that in order to get a big project built in this country, the federal government has to nationalize a huge aspect of it," he said.

NDP Leader Jagmeet Singh called it a "bad deal that will solve nothing." Pushing ahead with the pipeline betrays the government's promise to ease reliance on fossil fuels, he said.

"Climate change leaders don't spend $4.5 billion dollars on pipelines," he said. "We need a government with a vision that takes our future seriously."

The pipeline expansion project has faced intense opposition from the B.C. government, environmental activists and some Indigenous groups.

Carr said the plan does not sacrifice the environment for economic benefits.

"Canadians want both and we can have both," he said.

Kinder Morgan issued a statement that says the deal represents the best way forward for shareholders and Canadians.

"The outcome we have reached represents the best opportunity to complete Trans Mountain Expansion Project and thereby realize the great national economic benefits promised by that project," said chairman and CEO Steve Kean.

"Our Canadian employees and contractors have worked very hard to advance the project to this critical stage, and they will now resume work in executing this important Canadian project."

Green Party Leader Elizabeth May, who pleaded guilty Monday to criminal contempt for protesting the pipeline, tweeted that Kinder Morgan is "laughing all the way to the bank."

She called it a bad public policy decision that future generations will regret.

"Historically, I'm quite certain, this will go down as an epic financial, economic boondoggle that future students of political science will say, 'Why on earth did they do that? That made no sense,'" she said.

Alberta Premier Rachel Notley called it "a major step forward for all Canadians." She believes any efforts to "harass" the project will have less effect with the federal government as the owner, because it will have Crown immunity in legal proceedings.

She said the pipeline remains a commercially viable project that will turn a profit. She conceded that governments could be on the hook if there is a spill, but said spills are becoming less frequent.

"Just like any project, there is risk. In this case, the risk is very low," she said.

Prime Minister Justin Trudeau took to Twitter to praise the deal.

"Today, we've taken action to create and protect jobs in Alberta and B.C., and restart construction on the TMX pipeline expansion, a vital project in the national interest," his post says.

Under the arrangement, the government will indemnify a potential buyer for additional costs caused by provincial or municipal attempts to delay or obstruct the expansion. It also promises to underwrite costs if the proponent abandons the project because of an adverse judicial decision, or because it can't be completed by a predetermined date despite "commercially reasonable efforts."

Under either of those scenarios, the government will have the option to re-purchase the pipeline before the expansion is abandoned.

Notley has been locked in a bitter dispute over the pipeline with B.C. Premier John Horgan.

Today, Horgan said a change of ownership doesn't alter his concerns about the risk of a spill that could harm the coastal environment and said he'll proceed with a legal challenge.

"The good news is I think I have a better chance of progress with a Crown corporation and a government that is responsive to people rather than a company that is only responsive to its shareholders," he told CBC in Vancouver.

In a news conference, Horgan said the dispute should have been resolved through a joint reference to the Supreme Court.

"Now we have both Ottawa and Alberta, rather than going to court to determine jurisdiction, they're making financial decisions that affect taxpayers, and they'll have to be accountable for that."

The twinning of the 1,150-kilometre-long Trans Mountain pipeline will nearly triple its capacity to an estimated 890,000 barrels a day and increase traffic off B.C.'s coast from approximately five tankers to 34 tankers a month. (CBC News)

The federal government had looked at three options for moving the project forward:

compensating Kinder Morgan — or any other company — for financial losses caused by British Columbia's attempts to block the project;

buying and building the expansion itself, and then selling it once the work is complete; or

buying the project from Kinder Morgan, then putting it on the market for investors willing to pick up the project and build it themselves.

Morneau's announcement comes just two days before a deadline that had been set by Kinder Morgan. The company had said it needed clarity on a path forward for the project by May 31 or it would walk away from construction.

The original Trans Mountain pipeline was built in 1953. The expansion would be a twinning of the existing 1,150-kilometre pipeline between Strathcona County (near Edmonton), Alta., and Burnaby, B.C. It would add 980 kilometres of new pipeline and increase capacity from 300,000 barrels a day to 890,000 barrels a day.
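The capacity figures above can be sanity-checked with a line of arithmetic (all numbers as reported in the article):

```python
# Trans Mountain expansion figures as reported: twinning the existing
# 1,150-kilometre line raises capacity from 300,000 to 890,000 barrels a day.
current_bpd = 300_000   # barrels per day, existing pipeline
expanded_bpd = 890_000  # barrels per day, after the expansion

factor = expanded_bpd / current_bpd
print(f"Capacity increase: {factor:.2f}x")  # just under threefold, matching "nearly triple"
```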

According to Kinder Morgan's project website, the construction and the first 20 years of expanded operations would mean a combined government revenue of $46.7 billion, with $5.7 billion for B.C., $19.4 billion for Alberta and $21.6 billion for the rest of Canada.

At Two Power Plants, Scientists Are Racing Each Other To Turn Carbon Into Dollars

The 10 finalists in the four-year-long Carbon XPRIZE have been selected, and will embark on a two-year project to show that there’s a market for products made from captured carbon.

By Eillie Anzilotti

It sounds like something out of science fiction: 10 teams of scientists and innovators working on plans to convert carbon emissions into useful products will ship out to two carbon-dioxide-emitting power plants. Five teams will travel to a natural gas-fired plant in Alberta, Canada, and the five others will go to a coal-powered plant in Gillette, Wyoming. There, they’ll have two years to prove the validity of their models.

This is, in fact, the final stage of the Carbon XPRIZE–a four-and-a-half-year-long, $20 million global competition to develop and scale models for converting carbon emissions into valuable products like enhanced concrete, liquid fuel, plastics, and carbon fiber. XPRIZE runs various competitions around topics ranging from water quality to public health. From 47 ideas first submitted to the challenge, a panel of eight energy and sustainability experts whittled the list down to the final 10. The two winners–one from the Canada track, and the other from Wyoming–will receive a $7.5 million grand prize to bring their innovation to market.

“We give the teams literally the pipes coming out of the power plants, and they can bring whatever technology they’re developing to plug into that source,” says Marcius Extavour, XPRIZE senior director of Energy and Resources and the lead on the Carbon XPRIZE competition. Teams will be judged on how much CO2 they convert, and the net value of their innovations.

The finalists headed to Wyoming include C4X, a team from Suzhou, China, producing bio-foamed plastics, and Carbon Capture Machine from Aberdeen, Scotland, which is making solid carbonates potentially to be used in building materials. Carbon Cure from Dartmouth, Canada, and Carbon Upcycling UCLA from Los Angeles are both experimenting with CO2-infused concrete, and Breathe from Bangalore is making methanol, which can be used as fuel.

In Alberta, Carbicrete from Montreal is making concrete with captured CO2 emissions and waste from steel production, and Carbon Upcycling Technologies from Calgary is producing nanoparticles that can strengthen concrete and polymers. CERT from Toronto is making ingredients for industrial chemicals, C2CNT from Ashburn, Virginia, is making tubing that can serve as a lighter alternative to metal, say for batteries, and Newlight from Huntington Beach, California, is making bioplastics.

“This XPRIZE is about climate change, sustainability, and getting to a low-carbon future,” Extavour says. “The idea is to take emissions that are already being produced, and preventing them from leaking out into the atmosphere or oceans or soil, and converting them, chemically, into valuable material.”

Carbon capture is not a new idea. The concept of trapping carbon emissions before they seep out from a power plant by sequestering them in the ground, or sucking them out of the air, as a facility in Zurich does, has been around for years, but not without controversy. If innovators are able to scale carbon-capture and conversion models, will it stop the push toward renewables?

The two entities sponsoring the prize certainly make it seem like that’s a possibility. One, NRG, is a large energy company that manages power plants across the U.S.; the other is Canada’s Oil Sands Innovation Alliance, a consortium of oil sands producers. (NRG has made efforts to reduce its emissions; it’s retiring three natural gas-fired plants across California over the next year.)

But Extavour does not see carbon conversion as antithetical to reducing emissions overall. Rather, “I think it’s complementary,” he says. “We just don’t have the option of turning off our CO2-emitting resources today.” While emission-free options like solar, wind, and geothermal are scaling, he says, they’re not doing so fast enough to completely replace carbon. “There’s still a hard core of emissions from sectors like manufacturing that we have to get our hands around,” Extavour says.

“This isn’t about a proposal anymore,” Extavour says. “This is about: Can you build it in a way that works and is reliable? And can you do it in a way that’s not just climate and carbon sustainable, but economically sustainable? Can you build a business around this technology? Because if you can, that’s how we can get emitters of CO2 today to actually adopt these solutions and scale them up, and really take a bite out of emissions.”

Burning Wood as Renewable Energy Threatens Europe’s Climate Goals

Scientists say a new EU policy on biomass is 'simplistic and misleading' and will increase emissions. U.S. forests are being turned into wood pellets to feed demand.


JUN 22, 2018

The European Union declared this week that it could make deeper greenhouse gas cuts than it has already pledged under the Paris climate agreement. But its scientific advisors are warning that the EU's new renewable energy policy fails to fully account for the climate impacts of burning wood for fuel.

By counting forest biomass, such as wood pellets used in power plants, as carbon-neutral, the new rules could make it impossible for Europe to achieve its climate goals, the European Academy of Sciences Advisory Council (EASAC) wrote in a strongly worded statement.

The council said the renewable energy policy's treatment of biomass is "simplistic and misleading" and could actually add to Europe's greenhouse gas emissions over the next 20 to 30 years.

That bump in emissions would come just as the planet's carbon emissions budget is running out, said William Gillett, EASAC's energy director. The Paris agreement aims to reduce net emissions from energy systems to zero within the next several decades.

"The Paris agreement put the time dimension into stark focus," Gillett said. "We don't have 200 years to get to carbon balance. We only have 10 to 20 years. Our carbon budget is nearly used up, and burning trees uses up the budget even faster," he said.

The Math Doesn't Add Up

The countries in the Paris treaty have been encouraged to adopt new, more ambitious goals in the next few years to further reduce the greenhouse gas emissions that are driving global warming.

So, European nations, among the treaty's strongest backers, have engaged in prolonged negotiations toward deeper emissions cuts. Over the past two weeks, they agreed to increase renewable energy to 32 percent of the power mix and set a goal of 32 percent energy efficiency savings.

The EU's climate commissioner, Miguel Arias Cañete, told a meeting of environment leaders from Europe, Canada and China on Wednesday that the new policies would mean the European Union could increase its emissions reduction target from 40 percent to just over 45 percent by 2030.

But the renewable energy policy includes burning wood for fuel. Over a year ago, the EU's science advisors published a comprehensive report debunking the logic behind treating all wood fuel as beneficial to the climate. Because burning wood gives off more CO2 than coal per unit of electricity produced, the climate math doesn't add up, scientists say.

Large-scale forest harvests have a climate warming effect for at least 20 to 35 years, said University of Helsinki climate and forest scientist Jaana Bäck, who noted that scores of evidence-based studies all say basically the same thing.

"And if we look at the Paris targets, we are in critical times at the moment. We need to reduce emissions now, not in 50 or 100 years," she said.

Of particular concern is the harvesting of mature trees. Converting waste wood or fast-growing agricultural products has less climate impact.

The adoption of the rules reflects partly a disconnect between science and policy, and partly the political compromise required in the EU confederation, which strives for consensus. Some countries, including forest-rich Scandinavia, pushed for the rules in their current form, Gillett said.

EASAC noted that while it may be too late to change the EU directive itself, each country can now decide how to implement it. The national science academies will be advising policy makers in their respective countries on how to implement the rules without adding to emissions, Gillett said.

What About Sustainable Forests?

In creating the energy policy, the EU attempted to address the climate impacts of wood-burning by setting standards, such as requiring biomass supply chains in the heat and power sector to emit 80 to 85 percent less greenhouse gas than fossil fuels. And the forest biomass is supposed to come from certified sustainable forests.

But that doesn't cover the emissions from burning the biomass or the loss of stored carbon when trees are harvested, "in other words, all the main things," said Alex Mason, who tracks EU energy policy for the World Wide Fund for Nature Europe, formerly the World Wildlife Fund.

Other proposed climate safeguards, such as limiting subsidies to wood-burning facilities that produce both electricity and heat, were watered down during the negotiations. In the final version, the low standards for subsidies will encourage more inefficient biofuel projects with high emissions, Mason said.

"That has potentially disastrous consequences for the climate and for global forests. That's precisely why nearly 800 scientists wrote to the members of the European Parliament in January—but they were ignored," he said. He said watchdog groups like his would continue to press the EU and its member countries to change course.

U.S. Southeast Is a Major Biomass Source

Biomass rules in Europe have a direct impact on the United States, as well.

European subsidies have been driving deforestation in the Southeastern U.S. since 2009, when the EU adopted its first renewable energy standards, said David Carr, with the Southern Environmental Law Center.

Logging in areas like northeastern North Carolina and adjacent parts of Virginia has spread so fast that environmental groups haven't been able to compile an accurate regional picture. But they do know that some of the logging is happening in ecologically valuable wetlands forests.

In one part of North Carolina, the wood fuel industry has been logging about 50,000 acres per year (about the size of Washington, D.C.) to meet demand at four wood pellet factories for export to Europe.

"The pellet producers say they are taking residues, but they're taking trees up to 2 feet in diameter, big trees that store a lot of carbon," Carr said. "You're burning that immediately and putting all that carbon in the atmosphere."

"There's no commitment those forests will regrow, no legal obligation to replant them, and it's nearly all being exported," he said. "We are the third world on this one."

Recycled plastic could supply three-quarters of UK demand, report finds

Circular economy could recycle more plastic and meet industry demand for raw materials, finds Green Alliance research

Sandra Laville

Thu 14 Jun 2018 01.01 EDT Last modified on Thu 14 Jun 2018 01.40 EDT

Recycled plastic products: the UK does not have an adequate system to capture, recycle and re-use plastic materials, the report found.

Plastic recycled in the UK could supply nearly three-quarters of domestic demand for products and packaging if the government took action to build the industry, a new report said on Thursday.

The UK consumes 3.3m tonnes of plastic annually, the report says, but exports two-thirds to be recycled. It is only able to recycle 9% domestically.

Measures including increased taxes on products made with virgin plastic, and mandatory targets for using recycled plastic in packaging, could encourage an additional 2m tonnes of plastic to be recycled in the UK, the report from Green Alliance said.

The analysis said simply collecting plastic and sending it abroad for recycling does not solve the problem of the global scourge of plastic pollution.

“The UK does not have an adequate system to capture, recycle and re-use plastic materials,” the report said.

It recommends three new measures to ensure more plastic is recovered in the UK and used as raw material in manufacturing. These are:

Mandatory recycled content requirements for all plastic products and packaging;

Short-term support to kickstart the plastic reprocessing market; and

A fund to stabilise the market for companies investing in recycling plastic domestically.

Green Alliance produced the report for a group of businesses that have formed a circular economy taskforce.

Peter Maddox, director of Wrap UK, said the UK had to take more responsibility for its own waste.

“Our mission is to create a world where resources are used sustainably. To make this happen in the UK, we need to design circular systems for plastics and other materials that are sustainable both economically and environmentally. This will require some fundamental changes from all of us.”

The report said government action is necessary to create and support a secondary plastic market in the UK. “The government is uniquely placed to address the market failures that have led to unnecessary reliance on virgin materials to the detriment of the environment, industry and the economy.”

UK businesses including supermarkets recently signed up to a pact to cut plastic.

But voluntary pacts were not enough, the report said, and government action was needed.

“A secondary plastic market ... could recycle an additional 2m tonnes in the UK and fulfil 71% of UK manufacturing’s raw material demand ... Voluntary initiatives like the UK plastic pact ... only thrive when supported by a credible prospect of government regulation if industry does not deliver.”
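The Green Alliance figures hang together arithmetically (all numbers from the article; note the report's 71% is measured against manufacturing's raw material demand, which the piece treats as roughly in line with the 3.3m tonnes consumed):

```python
# Cross-check of the quoted UK plastics figures.
uk_consumption_mt = 3.3          # million tonnes of plastic consumed per year
domestic_recycling_share = 0.09  # 9% currently recycled within the UK
additional_mt = 2.0              # extra tonnage the proposed measures could add

future_mt = uk_consumption_mt * domestic_recycling_share + additional_mt
share_of_consumption = future_mt / uk_consumption_mt
print(f"{future_mt:.2f}m tonnes, about {share_of_consumption:.0%} of consumption")
```

That lands at roughly 70 percent, consistent with the report's "71% of UK manufacturing's raw material demand" and the headline's "nearly three-quarters."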

First Solar to build new manufacturing plant in northwest Ohio

By Sarah Elms | Blade Staff Writer, Published on April 26, 2018 | Updated 2:18 p.m.

First Solar Inc. plans to build a new solar panel manufacturing facility near its North American factory complex in Perrysburg Township, a move that’s expected to create 500 jobs.

The Tempe, Ariz.-based company in an announcement Thursday said the expansion plan calls for a $400 million, 1 million-square-foot facility in Lake Township, just east of Perrysburg Township in Wood County. Construction is expected to begin mid-year and the site should be able to reach full production in late 2019, a news release said.

The company began in Toledo and has its flagship site in Perrysburg Township, the largest solar panel manufacturing facility in the U.S. First Solar also has manufacturing facilities in Malaysia and Vietnam.

“First Solar started in the Toledo area and our first manufacturing facility was in Perrysburg,” spokesman Steve Krum said. “We’re really excited to be able to bring U.S. manufacturing growth to northwest Ohio.”

The new facility will produce the company’s relatively new Series 6 solar panel, which has thin-film solar panels that are three times the size of the previous model, the Series 4. Last year the company invested $175 million into retooling its existing Perrysburg Township plant to produce the Series 6 product.

“The continued, sustained demand validates that we can make and sell more, so that’s what’s driving this decision to build an additional factory that will also make the Series 6,” Mr. Krum said.

Once the new plant is complete, First Solar will produce enough panels at its two sites each year to provide 1.8 gigawatts of generating capacity. That’s three times the 600 megawatts’ worth of panels it produces now.

The company laid off 350 workers when it slowed production last year to convert to building the new Series 6, but Mr. Krum said there soon will be a need for those jobs again, and then some.

Glenn Richardson, JobsOhio managing director for advanced manufacturing, called the expansion “great news for the area.”

“While it’s always difficult to see companies have to reduce their force, it’s great to see that they were able to retool so quickly and produce components that the market is now looking at as very much in demand,” he said. “I think the 500 jobs is just a start.”

Mr. Krum said the new jobs will be a combination of professional engineers and manufacturing technicians, with an average annual salary of $60,000.

The Lake Township plant is contingent on confirmation of state and local incentive packages still under negotiation, Mr. Krum said.

First Solar’s panels are for commercial, industrial, and utility-scale use, and are used by large utility solar farms, on public buildings, and at off-grid industrial sites.

The company does not make panels for personal consumer use or on rooftops of private homes.

Nationally, solar energy is expected to more than double over the next five years, according to the Solar Energy Industries Association.

Although the 10.6 gigawatts of solar-generated capacity in 2017 was down from the record-setting 15 gigawatts in 2016, it still marked a 40 percent increase over 2015’s installation total and was the second consecutive year for double-digit increases. As of now, the industry has installed 53.3 gigawatts, enough to power 10.1 million American homes, the trade group said recently.

Abigail Ross Hopper, SEIA president and chief executive officer, said she was especially encouraged by the “increasing geographic diversity in states deploying solar, from the Southeast to the Midwest, that led to a double-digit increase in total capacity.”

Ohio ranked 29th for new solar installations in 2017, according to SEIA’s latest report.

Recent applications for new construction filed with the Ohio Power Siting Board show solar is coming on strong in this state, with supporters pointing to Gov. John Kasich’s decision in late 2016 not to extend the controversial two-year freeze on renewable energy mandates he signed into law in 2014.

Bowling Green currently has the state’s largest solar farm, with 85,000 solar panels erected across 165 acres.

But the state power board has approved plans by Hardin Solar Energy LLC to build one in Hardin County, to be known as the Hardin Solar Center, that will be seven times larger.

It also has approved plans by Blue Planet Renewable Energy LLC for a project east of Cincinnati called Hillcrest Solar I that is to be six times larger than the Bowling Green project. The board is also reviewing an application for a large project that would be in Vinton County, southeast of Columbus.

Staff writer Tom Henry contributed to this report.

New York City Aims for All-Electric Bus Fleet by 2040

NYC has more than 5,700 MTA buses. Taking the fleet electric would reduce climate-warming emissions and cut fuel, maintenance and health costs.

Phil McKenna, APR 26, 2018

New York City plans to take its buses electric. Credit: Chris Hondros/Getty Images

New York City is testing 10 electric buses this year and plans to go all electric by 2040. How quickly it reaches that goal depends in part on technology and EV charging infrastructure.

New York City plans to convert its public bus system to an all-electric fleet by 2040, a new target announced this week by NYC Transit President Andy Byford.

"It does depend on the maturity of the technology—both the bus technology and the charging technology—but we are deadly serious about moving to an all-electric fleet," Byford, who became head of NYC Transit in January, said at a Metropolitan Transit Authority board meeting on Wednesday.

Byford's comments follow an ambitious action plan released on Monday that seeks to address flagging ridership and sluggish service on the nation's largest municipal bus network. The average speed of an MTA bus in Manhattan is among the slowest of large metropolitan systems at 5.7 miles per hour. That means pollution from idling engines is much higher per mile than if the buses were going faster.

The plan calls for a "transition to a zero-emissions fleet to improve air quality and reduce greenhouse gas emissions."

Environmental and community advocates applauded the plan.

"It's a surprising development and a big deal because this is the largest transit fleet in the country, with over 5,000 buses—that is the equivalent to over 100,000 electric cars," Kenny Bruno, a clean energy consultant, said. "It's a big deal on climate change and public health. All New Yorkers will benefit, not just drivers and passengers but everyone who lives along bus routes and depots, a lot of whom have high asthma rates."

A report released earlier this month by New York City Environmental Justice Alliance found 75 percent of bus depots in New York City are located in communities of color. It noted that fossil-fuel-powered buses emit air pollution linked to respiratory distress, asthma and hospitalization for people of all ages.

"These communities have been overburdened by noxious emissions for too long," Eddie Bautista, executive director of the New York City Environmental Justice Alliance, said in a statement. The announcement by the MTA "signals to us that the Authority has heard our call for a clean bus fleet. We are pleased to receive MTA's commitment to zero emissions and applaud their efforts."

A study in 2016 by a researcher at Columbia University found that if New York shifted from diesel to electric buses, it could reduce health costs from respiratory and other illnesses by roughly $150,000 per bus. The study also showed that fuel and maintenance costs would drop by $39,000 per year by shifting to electric vehicles, and the city could cut carbon dioxide emissions across the fleet by 575,000 metric tons per year.
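The study's per-bus figures scale straightforwardly across the MTA's fleet of more than 5,700 buses. The sketch below is illustrative back-of-envelope arithmetic using only the numbers quoted in this article, not a calculation from the study itself:

```python
# Back-of-envelope scaling of the Columbia study's per-bus figures
# across the MTA's roughly 5,700-bus fleet (all numbers from the article).

FLEET_SIZE = 5700
HEALTH_SAVINGS_PER_BUS = 150_000     # dollars, respiratory and other illness costs
FUEL_MAINT_SAVINGS_PER_BUS = 39_000  # dollars per year

fleet_health_savings = FLEET_SIZE * HEALTH_SAVINGS_PER_BUS
fleet_fuel_maint_savings = FLEET_SIZE * FUEL_MAINT_SAVINGS_PER_BUS

print(f"Health-cost savings: ${fleet_health_savings:,}")                  # $855,000,000
print(f"Annual fuel/maintenance savings: ${fleet_fuel_maint_savings:,}")  # $222,300,000
```

Even these rough totals, in the hundreds of millions of dollars, suggest why advocates frame electrification as a public-health investment as much as a climate one.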

The MTA, which has more than 5,700 buses in its fleet, already is testing 10 all-electric buses and has plans to purchase 60 more by 2019. With these purchases representing only 1 percent of the entire fleet, the agency would have to significantly increase its electric bus purchases to meet its 2040 target.

Los Angeles is also shifting to electric buses. The city's public transportation agency agreed last year to spend $138 million to purchase 95 electric buses, taking it closer to its goal of having a zero-emissions fleet, comprising some 2,300 buses, by 2030.

Details about the planned conversion to electric vehicles and how the New York agency will pay for the new buses and charging stations were not included in this week's report. The MTA will release a full modernization plan for New York City transit in May, Byford said.

Hawaii upends utility model, adds incentives for solar

Anne C. Mulkern, E&E News reporter Energywire: Thursday, April 26, 2018

Hawaii is overhauling how utilities get paid, upending a century-old business model and ordering incentives for affordability, renewable power and helping homeowners add rooftop solar.

Gov. David Ige (D) on Tuesday signed legislation that directs the state's Public Utilities Commission to create a framework for rewarding utilities based on performance. Criteria for rewards or penalties include "rapid integration of renewable energy sources, including quality interconnection of customer-sited resources."

Utilities also face evaluation on electricity reliability, on timely execution of adding power purchased through contracts and on customer satisfaction — including on their options for managing electricity costs. The measure, S.B. 2939 S.D. 2, states that the change is needed to "break the direct link between allowed revenues and investment levels," such as investments in power plants.

It's the latest move as the state pushes toward clean energy. Hawaii in 2015 passed a mandate to hit 100 percent renewable electricity by 2045, becoming the first state in the country to approve that goal. The Aloha State at the time had no blueprint for how to make it happen. Much remains in the planning stage, though leaders argue it's achievable.

The law appears to be the first under which utilities statewide could earn rewards tied to renewable energy projects located in communities, known as distributed energy generation. In other states, such rules affect only some utilities. Rhode Island allows a financial incentive for utility National Grid PLC, based on the value of its long-term renewable energy contracts. National Grid also earns a percentage of payments made to distributed generation facilities. In New York, National Grid was allowed an earnings mechanism tied to the total energy from distributed energy resources.

Hawaii largely burns fuel oil to generate electricity, making consumer bills costly and volatile. That has triggered a rooftop solar boom in the state, one so large that the utility on Oahu — the most populous island — severely limited new connections for months. The state later revised its rooftop solar benefits by eliminating net energy metering, where residents earn bill credit for excess power sent to the grid.

S.B. 2939 S.D. 2 argues that a realignment of utility customer and company interests is needed to prevent "economic and environmental harm" by the electrical system. The rewrite is also critical "to ensure the ongoing viability of the State's regulated electric utilities, as they face increasing need to rapidly adapt business models" that enable new technologies and customer choice, it says.

Hawaiian Electric Co. runs electrical power in the state through subsidiaries on Oahu, Maui and the Big Island. The parent company said it welcomes performance-based evaluations.

"We're getting a lot done, including tripling the amount of clean, renewable energy on our grids," Shannon Tangonan, director of Hawaiian Electric's corporate communications, said in a statement. "We're already well below the state's 2020 target for greenhouse gas emissions. The companies have been advocating this new performance-based approach for 20 years."

Hawaiian Electric is at 27 percent renewables across its five-island territory. It's on track to reach 30 percent as required in 2020, it said.

Oil use is dropping, the company said. It used 8.55 million barrels last year, down from 10.7 million seven years ago. Across the five islands, Hawaiian Electric said, one-third of single-family home residents have rooftop solar.

The company said it has added few company-owned power plants in recent years, and those were at the request of the military. Hawaiian Electric will own and operate an 80,760-panel solar facility under construction at Pearl Harbor on Oahu. The Navy sought the project. Additionally, a 50-megawatt biomass plant is on land the Army leased to Hawaiian Electric.

A report issued last week from the Rhodium Group and Smart Growth America, commissioned by Elemental Excelerator, a nonprofit that funds energy and other startups, said Hawaii could reach more than 80 percent clean energy in 2030. One move that would help, it said, is changing the utility model so that the companies get paid for accelerating the renewables transition (Energywire, April 20).

S.B. 2939 S.D. 2 will give consumers "improved electric services with more options for innovative renewables and batteries," said Hawaii state Rep. Chris Lee (D), chairman of the Legislature's Committee on Energy and Environmental Protection. It's also "a responsible step forward helping our utilities transition to a sustainable business model that can survive disruption in the energy market," he said.

Anne Hoskins, chief policy officer at Sunrun Inc., said other states should see it as an example.

"The time to make these changes is now, before billions of dollars are spent in rebuilding our outdated electrical networks," Hoskins said. Rooftop solar and home batteries allow choosing a system "that maximizes public benefits, not utility shareholder profits."


Forced to Move: Climate Change Already Displacing U.S. Communities

Benjamin Goulet for KCET

April 26, 2018


Extreme drought pushes rural inhabitants of a Middle Eastern nation into the cities, leading to social stressors and a devastating civil war. Decades of drought conditions and war send African migrants into neighboring countries by the tens of thousands. Residents of a small Alaskan village decide to vote in favor of relocating their entire population, as their living area crumbles around them due to storm surges.

These scenarios aren’t cribbed from a dystopian novel or the plot of an apocalyptic big-budget movie; they’re real life. And they’re happening – now.

The role of climate change in human displacement and migration is being cited by experts as the number one global security threat of the 21st century. According to the United Nations Refugee Agency, annually 21.5 million people are, in their words, “forcibly displaced by weather-related sudden onset hazards – such as floods, storms, wildfires, [and] extreme temperatures.” These threats are not just concerning climate scientists and environmental activists; the U.S. State Department and the Global Military Advisory Council have both stated that global instability and uprisings, such as the Arab Spring, the war in Syria, and Boko Haram’s terrorism, all have links to climate change.


And while the global displacement and migration of lower and middle-income countries make headlines, the U.S. is not immune to these situations. In 2016, two U.S. communities — both very different in terms of geography and demographics — made history for relocating their respective towns because climate change greatly altered their land.

The Alaskan village of Shishmaref is an Inupiat community of about 600 people located on an island called Sarichef, north of the Bering Strait. The island has been shrinking for over 40 years as storm surges have become more and more powerful. In a relocation study published in February 2016, experts calculated that warmer sea water is eroding the permafrost and undercutting protective embankments, which then topple into the ocean. In short, the island is shrinking.

The report states that “the Community has continued to shrink over the past 40 years with erosion eating away over 200 feet since 1969. A few storms have caused 25-30 foot erosion losses from a single storm.” After much debate, in August of 2016 the town voted to relocate completely, and decided to move the village about five miles away from its current location.

More than 6,000 miles away, the community of Isle de Jean Charles, a narrow piece of land in Terrebonne Parish, Louisiana, is in the process of completely relocating. With a $48.3 million grant, the U.S. Department of Housing and Urban Development has begun planning the logistics of resettling the residents as a whole community. The official website for the project states:

In the face of rising sea levels, subsiding land and frequent flood events, the south Louisiana residents of Isle de Jean Charles need to move to higher, safer ground. With the loss of more than 98 percent of the community’s land during the past 60 years, only 320 of the island’s original 22,400 acres remain. Although the island has been both a home and a historically significant landmark for nearly 200 years, community resettlement is inevitable. The only question is how.

As the historical homeland and burial ground of the tribe of the Isle de Jean Charles Band of Biloxi-Chitimacha-Choctaw Indians, HUD officials explicitly recognize the sensitive nature of the move, asserting that the resettlement must, in their words, “protect the island’s American Indian culture and support future generations.” In late 2017, a 515-acre farm 40 miles north of the island was chosen as the site for relocation of the residents. The new location is considered to be more convenient, as it is closer to schools, work and shopping. The island currently has no services and its one bridge is frequently impassable due to flooding, a major problem since most residents commute off the island for work.


Many other states are grappling with how to respond to stronger storms, decaying infrastructure and population shifts. In the wake of 2012’s Hurricane Sandy, a powerful storm whose strength is still debated as a possible result of climate-related changes in the atmosphere, insurance companies rewrote home insurance plans for homeowners living in coastal areas. The Federal Emergency Management Agency’s National Flood Insurance Program, created by Congress in 1968, is now – no pun intended – drowning in debt of $25 billion, as payouts to homeowners have skyrocketed in the face of more powerful hurricanes.

As sea levels rise, will residents eventually walk away from their million-dollar homes for higher land? The state of Florida is especially vulnerable to these possibilities. In a 2017 study by the independent organization Climate Central, nine out of the top 10 cities most vulnerable to coastal flooding by 2050 were located in Florida (New York was number one). There are reports that some Floridians aren’t waiting for the floods to arrive and are actively selling their homes.

Even as the 2016 Presidential race gained steam and “fake news” became candidate Donald Trump’s slur du jour against climate change, the U.S. military was already sounding the alarm. In July of 2015, the Department of Defense published a report titled “National Security Implications of Climate-Related Risks and a Changing Climate.” The report, compiled at the request of Congress, did not hold back in its findings, unambiguously listing a cascading series of events, such as drought, flooding, melting sea ice and global displacement, as major national and global security threats. The agency states that these threats and their unpredictability will precipitate a greater response from the DoD in future crises around the globe. The report cites the conflict in Syria as just one example of this looming scenario:

[Climate change] could result in increased intra-and inter-state migration, and generate other negative effects on human security. For example, from 2006-2011, a severe multi-year drought affected Syria and contributed to massive agriculture failures and population displacements. Large movements of rural dwellers to city centers coincided with the presence of large numbers of Iraqi refugees in Syrian cities, effectively overwhelming institutional capacity to respond constructively to the changing service demands. These kinds of impacts in regions around the world could necessitate greater DoD involvement in the provision of humanitarian assistance and other aid.

The DoD sees climate change as a “threat multiplier,” exacerbating already existing issues, such as economic inequalities, social vulnerability, drought, disease and more, leading countries to suffer “systemic breakdowns.” These breakdowns, according to the DoD, will trigger population displacement, creating a dangerous cycle of instability around the globe.

In 2007, the United Nations published a report that cited climate change as a major contributor to conflict in Darfur. According to the report, rainfall in the area dropped 30 percent over the last 40 years, and extreme drought conditions inflamed tensions between farmers and herders as pasture land disappeared.

Beginning in 2003, the war in Darfur resulted in the deaths of over 200,000 and created a migration crisis, as Sudanese migrants spilled over into neighboring Chad, attempting to escape widespread ethnic cleansing. Although a peace accord was signed in 2005, Darfur is still cited as one of the first modern examples of climate change triggering global conflict across regions. In fact, rural tensions continue to threaten a new war between North and South Sudan as the drought continues.

Barrier islands such as this one off the Louisiana coast are one of the first lines of defense against dangerous storms. | Nicky Milne/Thomson Reuters Foundation

Looking forward, as many as 200 million people could be displaced by 2050 because of climate change, according to a 2007 report. What will the displaced be called? “Climate refugees”? “Climate migrants”? This phenomenon is still so new and disorienting, there isn’t an official designation for people who are displaced by climate change in their communities. According to the United Nations Refugee Agency, “many of those who are displaced across borders as a result of climate change may not meet the refugee definition.”

But climate displacement is happening and help is needed, regardless of the lack of an official designation. As the planet warms and displacements accelerate, an official term is likely to emerge.

"EU Agrees Total Ban On Bee-Harming Pesticides"

"The world’s most widely used insecticides will be banned from all fields within six months, to protect both wild and honeybees that are vital to crop pollination"

"The European Union will ban the world’s most widely used insecticides from all fields due to the serious danger they pose to bees.

The ban on neonicotinoids, approved by member nations on Friday, is expected to come into force by the end of 2018 and will mean they can only be used in closed greenhouses.

Bees and other insects are vital for global food production as they pollinate three-quarters of all crops. The plummeting numbers of pollinators in recent years has been blamed, in part, on the widespread use of pesticides. The EU banned the use of neonicotinoids on flowering crops that attract bees, such as oil seed rape, in 2013."

"Jury Awards Hog Farm Neighbors $50 Million"

"RALEIGH - A North Carolina jury awarded $50 million to neighbors of a 15,000-hog farm in Eastern North Carolina in a case being closely watched across the country by environmentalists and the hog farm industry.

The verdict, revealed late Thursday after a jury deliberated less than two days, is the first to come in a series of federal lawsuits filed against Murphy-Brown/Smithfield Foods, the world’s largest pork producer.

In this case decided in a federal courtroom in Raleigh, 10 neighbors contended that industrial-scale hog operations have known for decades that the open-air sewage pits on their properties were the source of noxious, sickening and overwhelming odors. The stench was so thick, the neighbors argued, that it was impossible to get it out of their clothes."

Minnesota team looks for synergy between solar and electric vehicles


Frank Jossi

April 27, 2018

A $150,000 federal grant will pay for a study of solar-powered charging stations in Minnesota.

A Minneapolis nonprofit has received a federal grant to study potential synergies between distributed solar and electric vehicle charging stations.

And it doesn’t have to look farther than its own rooftop for an example.

The Great Plains Institute is the lead organization on a $150,000 grant from the National Renewable Energy Laboratory’s Solar Energy Innovation Network.

With a growing expectation that EVs will someday represent a significant share of the car market, the government and utilities are using this study and others to learn how to manage rising electricity loads while maximizing the growing amount of renewable energy on the grid.

“We’re looking to create a roadmap for how we would deploy a solar-plus-EV technology in the marketplace,” said Brian Ross, senior program director for the Great Plains Institute.

A 30-kilowatt solar array on the institute’s rooftop feeds power to three electric chargers in its parking lot. A minimal charge is always available but when the sun is shining the chargers ramp up to a Level 2 power setting. If a station is not in use, power is transferred to the other chargers.

Ross said their own solar charging station pilot is an example of what could be used for park-and-ride lots, business and government fleets, homes, apartments and even retail establishments.

Another nearby example comes from Lake Region Electric Cooperative, which offers members a solar program called “Go West,” in which solar panels are tilted southwest at 240 degrees to maximize production during summer peak demand. The utility uses a Go West solar installation next to its headquarters to reduce peak demand; during off-peak daytime hours, it uses the solar energy to charge a Chevy Bolt the cooperative purchased.

Other partners on the research grant include the Minnesota Department of Commerce and ZEF Energy, which operates the largest independently owned and operated fast charging network in Minnesota and Wisconsin. The company is building out a network of Level 2 chargers to link transportation corridors and reduce range anxiety among EV drivers.

CEO and founder Matthew Blackler said ZEF has developed technology allowing utilities to adjust or program Level 2 chargers based on electricity demand or time of day. A utility could fully charge vehicles when solar is plentiful, for example, and then limit charging during periods of peak demand for electricity.

“You could turn (charging) down to 25 percent or switch it off and on every 15 minutes” during peak demand hours, Blackler said.

The idea is to synchronize solar availability with EV charging and peak demand to improve grid stability, he added.
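That synchronization could be expressed as a simple setpoint rule. The sketch below is a hypothetical illustration of the behavior Blackler describes, not ZEF's actual implementation; the peak window, power levels, and function name are all assumptions:

```python
from datetime import time

def charger_setpoint(now, solar_plentiful, peak_start=time(16, 0),
                     peak_end=time(21, 0), full_kw=6.6):
    """Illustrative sketch of utility-side Level 2 charger control:
    full-rate charging when solar is plentiful, throttled to 25 percent
    during the evening peak (a duty-cycling variant would switch the
    charger off and on instead). Window and power levels are assumed."""
    if peak_start <= now <= peak_end:
        return 0.25 * full_kw   # "turn it down to 25 percent" during peak
    if solar_plentiful:
        return full_kw          # charge fully while solar output is high
    return 0.5 * full_kw        # moderate rate otherwise

print(charger_setpoint(time(12, 0), solar_plentiful=True))   # 6.6
print(charger_setpoint(time(18, 0), solar_plentiful=False))  # 1.65
```

In practice such a rule would run on the utility side and be pushed to networked chargers, so drivers plug in normally while the grid operator shapes when the energy actually flows.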

The group is one of nine teams chosen across the country. Others are looking at solar and grid resiliency, impacts on rate design, storage, and distribution generation in rural areas.


Reid Frazier, April 27, 2018, for The Allegheny Front


The natural gas boom was supposed to help the electric sector lower its carbon footprint by replacing old, carbon dioxide-spewing coal plants with newer, lower-emitting natural gas plants. And to a degree, that’s happened. As coal plants have retired, carbon emissions from the power sector have decreased.

But the flood of cheap natural gas is also threatening the nation’s biggest source of carbon-free electricity: nuclear power.

In Pennsylvania and Ohio, four nuclear plants have said they are shutting down, years ahead of their scheduled retirement dates, unless they receive state aid. Those include Three Mile Island, near Harrisburg, and Beaver Valley, in Shippingport, Pa., as well as the Davis-Besse and Perry plants in Ohio. If those plants close, that’s bad news for climate hawks.

“If your concern is about climate change and you view it as an urgent and perhaps even existential threat to human society, then you should be very concerned about the closure of nuclear power plants,” said Jesse Jenkins, a Ph.D. candidate at MIT studying the electric grid.

That’s because the power currently generated by soon-to-close nuclear plants will likely be replaced mainly by natural gas, which produces about half the carbon dioxide of coal.

If nuclear plants close, emissions likely to rise

Together, the four plants provide enough electricity to power 4 million homes. A recent industry-funded report found the plants provide more electricity than all the wind and solar in the PJM Interconnection, the electric grid that serves 65 million people in the Mid-Atlantic region. The report estimates that replacing the four plants with coal and natural gas would produce the same amount of carbon pollution as putting 4.5 million more cars on the road.

“Each one of those nuclear power plants is a significant source of low-carbon, emissions-free electricity, and losing each one is a significant step back at a time we should be making rapid progress towards a zero carbon energy system,” Jenkins said.

This Brattle Group chart shows four nuclear power plants in Ohio and Pennsylvania create more carbon-free electricity than all renewable sources in PJM Interconnection, the mid-Atlantic electric grid. The consultancy used funding from the nuclear industry to analyze planned closures.

But keeping these plants online might be difficult in an electricity market flooded by cheap natural gas. Jenkins’ research found that low natural gas prices were the main reason for the struggles of the nuclear industry.

To stay in business, the owners of nuclear plants are turning to states and the federal government for help. FirstEnergy has asked the federal government to declare a grid emergency to keep its coal and nuclear plants running.

New York and Illinois are subsidizing struggling nuclear plants through ‘zero emissions credits’ — subsidies that pay plants based on the amount of carbon pollution they save.

Matt Wald, a spokesman with the Nuclear Energy Institute, an industry trade group, says the subsidies are necessary.

“We are seeking to have the market recognize the things that we provide to the system,” Wald said. “It doesn’t do that now.”

Environmental groups conflicted on nuclear power

Nuclear’s precarious situation creates a dilemma for environmental groups, which have long opposed nuclear power.

“If they’re replaced by plants that burn fracked gas, then that’s definitely a problem,” said Tom Schuster, a senior campaign representative for the Sierra Club in Pennsylvania.

But, Schuster said, nuclear energy carries its own environmental problems: radioactive waste, impacts from uranium mining, and the risks of a Fukushima-like disaster. Also, he says, money spent keeping nuclear plants open could be spent building out the renewables sector.

“If policies to bail nuclear plants out are structured such that we’re spending a lot of money to keep plants online that we could otherwise be spending on truly clean sources of energy, then that’s problematic.”

But nuclear is still the country’s top source of carbon-free electricity. So the Sierra Club supported a nuclear bailout in Illinois that also promoted renewables, Schuster says.

“I kind of think about it as thinking about nuclear plant policy as a phase out rather than a bailout,” Schuster said.

In New Jersey, the state will spend $300 million a year to keep three nuclear plants online, while it also raised its renewable energy targets to 50 percent by the year 2030, and took separate actions to boost solar power and energy storage.

States take lead on nuclear bailouts

Schuster says the best way to lower carbon emissions is to simply tax them. That would make coal and natural gas more expensive, he said, which would increase the use of renewables and nuclear energy. But a federal tax on carbon emissions is unlikely. So most of the activity to keep nuclear plants online is in the states.

Any action in Pennsylvania to bail out its nuclear plants would face stiff opposition. Pennsylvania’s large natural gas industry opposes any such idea.

Stephanie Catarino Wissman, executive director of the Associated Petroleum Industries of Pennsylvania, said in an email that her organization “opposes any effort to subsidize one form of energy over another because when the government picks winners and losers in the electricity markets, consumers pay the price.”

The state legislature has a bipartisan nuclear caucus, which is looking for policy solutions to keep the nuclear plants operating. But to date, the caucus has not introduced any bills to keep the state’s nuclear plants running.

U.S.-U.K. science armada to target vulnerable Antarctic ice sheet

By Paul Voosen, Apr. 30, 2018, 6:00 AM

An armada of 100 scientists will soon be descending on West Antarctica, and understanding the future of global sea levels might depend on what they find. Today, after several years of planning, the U.S. and U.K. science agencies announced the details of a joint $50 million (or more) plan to study the Thwaites glacier, the Antarctic ice sheet most at risk of near-term melting.

The International Thwaites Glacier Collaboration plans to deploy six teams to the remote ice sheet, where they will study it using a host of tools, including instrument-carrying seals and earth-sensing seismographs. The researchers will concentrate their work in the Antarctic summers of 2019–20 and 2020–21. An additional two teams will channel the findings of the field teams into global models.

Overall, the collaboration is the largest joint effort between the two nations in Antarctica since the 1940s. “We’ll see what until now has been inferred playing out right in front of our sensors,” says Ted Scambos, a glaciologist at the National Snow and Ice Data Center in Boulder, Colorado, who is serving as the lead U.S. scientific coordinator.

Over the past decade, thanks to a variety of satellite and aircraft observations and modeling insights—including signs that the glacier’s ice has started thinning and flowing faster toward the ocean—scientists have been paying special attention to Thwaites. It is, they believe, the Antarctic ice sheet most at risk of accelerated melting in the next century, making it the wild card in projections of sea level rise. But its remote location, 1600 kilometers from the nearest research station, has made it inaccessible to scientists seeking to understand these risks up close.

Thwaites, a 182,000-square-kilometer glacier in the Amundsen Sea, acts as a plug, blocking the rest of the West Antarctic Ice Sheet from flowing into the ocean. Melt from the glacier already accounts for 4% of modern sea level rise, an amount that has doubled since the 1990s. Scientists are concerned that if it retreats, it could become unstable, making the collapse of the ice sheet irreversible and ultimately increasing sea levels by 3.3 meters over the span of centuries or millennia.

“It could contribute to sea level in our lifetimes in a large way, in a scale of a meter of sea level rise,” says Sridhar Anandakrishnan, a glacial seismologist at Pennsylvania State University in State College who is co-leading one Thwaites project. “Which is just an unthinkable possibility.”

Over the past decade, the Thwaites glacier has risen to the forefront of scientists' Antarctic melt concerns. JEREMY HARBECK

Overall, the U.S. National Science Foundation and the United Kingdom’s Natural Environment Research Council will spend $25 million on the science, with each of the eight teams led by researchers from both countries. Funders expect to spend another $25 million or more on the logistics of moving so much heavy equipment toward the shelf. Several ships will work off the coastline while scientists will be based at the former drilling site of the West Antarctic Ice Sheet Divide ice core, flying from there to their research sites in small airplanes or helicopters.

The scientific teams will focus on what puts Thwaites particularly at risk. Researchers have noticed that shifts in winds seem to be pushing warm, deep ocean waters on to Antarctica’s continental shelf at the base of its glaciers. Thwaites is perched on a ridge that holds these waters back, but beyond that ridge, the land under the glacier slopes downward, creating an inland bowl that is below sea level. But the researchers are uncertain about the composition and slipperiness of that geologic bowl.

One study, co-led by Anandakrishnan, will seek to understand the actual composition of the bowl and a second ridge, 70 kilometers inland, on which the glacier might catch. “In some models [the melting glacier] stabilizes” on that ridge, Anandakrishnan says. “In some it doesn’t.” By detonating explosives on the surface of the ice sheet and using seismic sensors to measure their reflections, his team will tease out whether the rock underlying Thwaites, including this critical ridge, is soft and pliable or hard and crystalline.

Another project will target the warm intruding waters. “We plan to have a pincer movement,” says Karen Heywood, an oceanographer at the University of East Anglia in Norwich, U.K., who will be co-leading the team. One effort will involve drilling sensors through the ice and then driving several autonomous submersibles and gliders toward these stations. Another, starting this year, will involve outfitting 10 seals a year with scientific instruments, using the animals to make routine, repeated studies of the ocean. The technique should produce a torrent of sustained data. They’ll be running similar measures on the nearby Dotson Ice Shelf, seeking to understand whether differences in the ocean waters explain why Thwaites is retreating so rapidly compared with Dotson.

Scientists also expect to explore the glacier’s grounding zone, planning to drill through 800 meters of ice to observe over several seasons where the triangle of ice, ocean, and rock meet. That will include dropping a new autonomous vehicle, developed by the Georgia Institute of Technology in Atlanta, that can deploy through a borehole and explore the grounding zone—an unprecedented view.

Other projects will seek geological evidence of whether Thwaites has previously retreated and reformed in the past 10,000 years, giving a clue to whether the modern melting threat is truly unprecedented. And another team will examine the glacier’s connections to the broader ice sheet—its shear margins—using radar and seismic reflections to detect whether neighboring ice helps hold it in place or lets it go, like a game of high-stakes red rover.

For now, the researchers are eager to get started. The United States and United Kingdom announced the opportunity almost a year and a half ago and it took some time to come together. This project is expected to launch a new generation of Antarctic researchers and, in the process, might reduce some uncertainty about the future of climate change. “No doubt we’re going to learn something that’s important to refining those predictions,” Scambos says. “This is kind of the missing piece right now.”

"US Judge Scraps Oakland, California, Ban On Coal Shipments"

"SAN FRANCISCO — A federal judge in California on Tuesday struck down the city of Oakland’s ban on coal shipments at a proposed cargo terminal, siding with a developer who wants to use the site to transport Utah coal to Asia.

In a scathing ruling, U.S. District Judge Vince Chhabria in San Francisco said the information the city relied on to conclude that coal operations would pose a substantial health or safety danger to the public was “riddled with inaccuracies” and “faulty analyses, to the point that no reliable conclusion about health or safety dangers could be drawn from it.”

The decision cheered coal proponents while opponents said they would continue to fight for cleaner air. Oakland is reviewing its options and may appeal, said Justin Berton, a spokesman for Mayor Libby Schaaf."

Enbridge likely to wrap protective sleeves around damaged Line 5 portions

Updated May 14; Posted May 14

By Michael

LANSING, MI -- Protective sleeves will likely be wrapped around dented portions of the Line 5 oil and gas pipeline, which was damaged last month by a suspected anchor strike.

A timeline for the fix is not currently available, as Enbridge Energy still has to confirm plans with regulatory officials, according to company spokesperson Ryan Duffy.

A section of the Line 5 pipeline crosses the Straits of Mackinac in Lake Michigan.

The Line 5 damage occurred April 1 when, according to a lawsuit filed by Michigan Attorney General Bill Schuette, a boat owned by an Escanaba-based shipping company dragged its anchor through the Straits, causing the damage.

Although visible "impacts" to the company's twin pipelines crossing the Straits of Mackinac haven't compromised line integrity, they did warrant a precautionary reduction of maximum operating pressure, Duffy said.

The line's eastern leg sustained a dent of a little more than three-fourths of an inch and an abrasion to the outer coating, according to Peter Holran, Enbridge director of U.S. government affairs.

The two dents on the western leg were a little less than three-quarters of an inch and under a half-inch, he said. On both legs, the impacts were spaced about 24 inches apart.

The pipes are 20 inches in diameter and have a wall thickness of 0.8 inches, Holran said.

An underwater photograph of Enbridge Line 5's eastern leg shows what the company calls "apparent contact areas," which are circled, believed to be damage resulting from an April 1 incident in the Straits of Mackinac.

Once the protective sleeve is implemented, the maximum operating pressure precaution will be lifted, Duffy said.

The suspected strike also damaged American Transmission Company's high-voltage power cables running through the straits. An estimated 600 gallons of dielectric fluid leaked into the water as a result.

Jerome Popiel, a representative from the U.S. Coast Guard, categorized the spill as "minor" with negligible environmental effects.

"It wasn't a big deal in terms in what actually happened," Popiel said. "What could've happened is getting everyone's attention."

Broken cables capped as Straits of Mackinac spill response continues

The broken ends from which an estimated 600 gallons of toxic fluid leaked were capped last week, 25 days after the spill began.

Popiel declined to give further information about the incident, citing the ongoing investigation by the U.S. Coast Guard.

The Line 5 pipeline, built in 1953, runs 645 miles from Superior, Wisconsin, to Sarnia, Canada, and transports up to 540,000 barrels of light crude oil and natural gas liquids per day.

The April 1 incident and the possibility of a major spill renewed calls for the aging pipeline to be shut down.

Study puts $6.3 billion price tag on potential Mackinac Straits oil spill

Prepared for advocacy group For Love of Water (FLOW), the study includes natural resource damages and various economic impacts.

There are no restrictions to dropping or dragging anchor in the Straits of Mackinac, only an advisory.

"Mariners should use extreme caution when operating vessels in depths of water comparable to their draft in areas where pipelines and cables may exist and when anchoring, dragging or trawling," the National Oceanic and Atmospheric Administration states.

At the Michigan Pipeline Safety Advisory Board's May 14 meeting, the first since the Straits incident, the possibility of a "no-anchor zone" to protect from further, and possibly worse, anchor-strike incidents was briefly mentioned.

"I think that's definitely being looked at," said Scott Dean, a spokesperson for the Michigan Department of Environmental Quality. "I think everyone is very focused on never seeing this happen again."

A problem with any anchor-drop ban, Dean said, is there must be stipulations that ensure protection of human life in cases where, for example, a ship is headed for collision with the Mackinac Bridge.

Dean cited awareness and warning campaigns as additional options.

Holran declined to say whether he's in favor of an anchor-drop ban in the Straits.

He told the board that Enbridge is working with the state and looking into safeguards which ensure vessels aren't acting "recklessly" or "negligently."

Mike Shriberg, Great Lakes executive director for the National Wildlife Federation and a member of Michigan's Pipeline Safety Advisory Board, said he supports an anchor-drop ban in the Straits, with the only exceptions being emergencies of life or death.

"We know that this is the No. 1 risk factor," Shriberg said. "We don't know how many times anchors have been dropped in the Straits."

Toxic Algae Blooms Happen More Often, May Involve Climate Feedback Loop

"The blooms, primarily fed by farm runoff but exacerbated by warming, release methane and CO2. Lake Erie is a 'poster child' for the challenge. "

"Blooms of harmful algae in the nation's waters appear to be occurring much more frequently than in the past, increasing suspicions that the warming climate may be exacerbating the problem.

The Environmental Working Group (EWG) published newly collected data on Tuesday reporting nearly 300 large blooms since 2010. Last year alone, 169 were reported. While NOAA issues forecasts for harmful algal blooms in certain areas, the advocacy group called its report the first attempt to track the blooms on a nationwide scale.

The study comes as scientists have predicted proliferation of these blooms as the climate changes, and amid increasing attention by the news media and local politicians to the worst cases.

Just as troubling, these blooms could not only worsen with climate change, but also contribute significantly to greenhouse gas emissions."

Electric School Buses Can Be Backup Batteries For the US Power Grid

American school buses guzzle $3.2B of carcinogen-laden diesel a year, but they could do a lot of good if electrified.

Tracey Lindeman

May 15 2018, 10:30am

We often think of electricity as a one-way transaction. Need to toast a bagel, wash the sheets, or charge your phone? Your fuse box sends you the juice you need. Electric vehicles, though, have the capacity to send power back to the electrical grid using vehicle-to-grid (V2G) technology—and that’s good news for an aging grid already operating at full capacity.

Vehicle-to-grid does this by letting electric vehicle (EV) batteries switch between providing and consuming energy on an as-needed basis. As EV adoption rates steadily climb, this technology could help stabilize the electrical grid, lessen the need for new power plants, and reduce kids’ exposure to cancer-causing exhaust. Essentially, electric vehicles can be like a backup battery for your phone, but for the entire US power grid.

“You’re looking for a vehicle that’s plugged in a decent amount of hours each day,” Marc Trahand, chief operating officer of San Diego-based V2G startup Nuvve, told me on the phone. The bigger the battery, the bigger the potential to help the electrical grid and the people using it.

Enter school buses. There are approximately 500,000 buses in the US alone, and the majority of them are basically rolling cancer machines due to their diesel engines. In 2012, the World Health Organization said diesel exhaust can definitely cause lung cancer, and might also be associated with an increased risk of bladder cancer. Meanwhile, according to the American School Bus Council, the US’s school buses consume a combined $3.2 billion worth of diesel a year.

Lance Noel, a postdoctoral V2G researcher at the Center for Energy Technologies at Aarhus University in Denmark, said school buses are not only ripe for electrification, but also V2G technology because they’re only in use for a few hours at a time. “It’s a giant battery sitting in a parking lot for at least 18 hours a day,” he told me on the phone.

Using a bidirectional charger—that is, one that is able to charge and discharge power on command dynamically using smart software—these batteries can supplement the grid in peak-demand times, and act as storage when demand is low.
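The charge-or-discharge decision that the smart software makes can be pictured as a simple dispatch rule. The sketch below is purely illustrative: the thresholds, battery capacity, charge rate, and reserve fraction are hypothetical values of ours, not Nuvve's actual control logic.

```python
# Illustrative V2G dispatch rule for a parked electric school bus.
# All numbers are hypothetical; a real aggregator uses market signals.

def dispatch(grid_demand_mw, peak_threshold_mw, soc_kwh,
             capacity_kwh=100.0, rate_kw=50.0):
    """Decide whether a parked bus battery should charge or discharge.

    Returns a tuple ("discharge" | "charge" | "idle", power_kw).
    """
    reserve_kwh = 0.3 * capacity_kwh  # keep 30% of charge for the morning route
    if grid_demand_mw > peak_threshold_mw and soc_kwh > reserve_kwh:
        return "discharge", rate_kw   # feed stored energy back during peak demand
    if grid_demand_mw <= peak_threshold_mw and soc_kwh < capacity_kwh:
        return "charge", rate_kw      # soak up power while demand is low
    return "idle", 0.0
```

A fleet aggregator would run a rule like this across many buses at once, since, as noted below, single vehicles are too small to participate in the market on their own.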

The technology was created by Willett Kempton, a Nuvve co-founder and professor who has been working on V2G at the University of Delaware since 1996. Today Nuvve, which wants to be the intermediary between EV owners and grids, is working with automakers such as Nissan and Mitsubishi to integrate bidirectionality into batteries and chargers.

The Japanese automakers’ interest was partly piqued by the 2011 Fukushima nuclear disaster that resulted in blackouts, according to Trahand. “They realized, ‘We have all these electric cars with quite a big capacity,’” he said, adding that a Nissan Leaf could probably light up 10 houses for an evening, or a single house for far longer.

At its core, V2G advocates for the democratization of energy markets by allowing small-time players to use their vehicles to provide electricity—either to the grid, to their neighbors, or to their own homes—and potentially earn a little bit of revenue while doing so. According to PJM Interconnection, a grid operator and wholesale electricity market in the eastern half of the US, V2G tests done with electric BMW Minis earned each car user about $100 a month. Electric-vehicle owners in the US who drive 15,000 miles a year and charge exclusively at home can expect to pay $540/year in electricity costs, according to Plug In America.

Single EVs can’t participate in this market on their own because they don’t produce enough power; they’d have to be aggregated with other vehicles to participate in the market. This is why fleets are especially attractive for V2G technology. School buses are a particularly interesting fleet to work with, not only because of their V2G capability and revenue-generating potential, but also because of their primary clientele: kids.

In the US, 25 million children collectively ride 62 billion miles a year in school buses. Many of these use diesel engines, which again have been shown to cause cancer. Kids of color and children from low-income households in particular are more likely to take transit or the school bus. Research published in the American Journal of Respiratory and Critical Care Medicine showed that diesel pollution is far worse inside of bus cabins than in the air around the vehicle. That same research shows that cleaner fuel policies positively impact children’s health.

While a grid that still largely depends on coal isn’t an ideal power source for any electric vehicle, V2G’s storage capabilities could actually help promote renewable energy like solar and wind—the output of which fluctuates depending on time of day and weather conditions—by offering unique energy storage solutions that are lacking on the existing grid.

Given how skittish some people still are about electric cars, it might take some cajoling to convince school boards and PTAs they’ve got enough range and power to safely deliver their kids to school and back.

In North America, at least a couple of major school-bus manufacturers—Blue Bird and Lion—are working on proving the benefits of electrification and vehicle-to-grid technology. Electric school buses are a particularly good candidate for V2G, said Noel and Trahand, because their range far exceeds most route distances. The lower fuel and maintenance costs make up for their higher sticker price, they added.

Noted Trahand, “It’s going to be cheaper to go electric.”

Illinois to sue EPA for exempting Foxconn plant from pollution controls

Valerie Volcovici

WASHINGTON (Reuters) - Illinois’ Attorney General said on Friday she plans to sue the U.S. Environmental Protection Agency for allowing a proposed Foxconn Technology Co Ltd plant in neighboring Wisconsin to operate without stringent pollution controls.

On Tuesday, the EPA identified 51 areas in 22 states that do not meet federal air quality requirements for ozone, a step toward enforcing the standards issued in 2015.

One exempted area was Racine County, Wisconsin, just north of the Illinois border, which is known to have heavily polluted air and where Taiwan-based Foxconn is building a $10 billion liquid-crystal display plant.

Pollution monitoring data show the county’s ozone levels exceed the 70 parts per billion (ppb) limit. If Racine County had been designated a “non-attainment” area, it would have required Foxconn to install stringent pollution control equipment.

Attorney General Lisa Madigan said she would file a lawsuit in the District of Columbia Circuit Court of Appeals challenging the EPA’s ozone designations, saying its failure to name Racine County a “non-attainment” area puts people at risk.

“Despite its name, the Environmental Protection Agency now operates with total disregard for the quality of our air and water, and in this case, the U.S. EPA is putting a company’s profit ahead of our natural resources and the public’s health,” Madigan said in a statement.

The EPA, under Administrator Scott Pruitt, left Racine County off its non-attainment list despite an agency staff analysis of ozone levels in Wisconsin published in December, which found that the county’s air exceeded federal ozone limits.

Wisconsin Governor Scott Walker, who supports bringing Foxconn to Wisconsin, tweeted on Tuesday that the state would work with EPA “to implement a plan that continues to look out for the best interest of Wisconsin.”

Wisconsin’s Republican-controlled state Assembly last year voted to approve a bill that paves the way for a $3 billion incentives package for the plant proposed by Foxconn.

Tourism's carbon impact three times larger than estimated

By Matt McGrath, Environment correspondent

Travellers from affluent countries are a key part of emissions growth in tourism

A new study says global tourism accounts for 8% of carbon emissions, around three times greater than previous estimates.

The new assessment is bigger because it includes emissions from travel, plus the full life-cycle of carbon in tourists' food, hotels and shopping.

Driving the increase are visitors from affluent countries who travel to other wealthy destinations.

The US tops the rankings followed by China, Germany and India.

Tourism is a huge and booming global industry worth over $7 trillion, and employs one in ten workers around the world. It's growing at around 4% per annum.

Previous estimates of the impact of all this travel on carbon suggested that tourism accounted for 2.5-3% of emissions.

However, in what is claimed to be the most comprehensive assessment to date, this new study examines the global carbon flows between 160 countries between 2009 and 2013. It shows that the total is closer to 8% of the global figure.

As well as air travel, the authors say they have included an analysis of the energy needed to support the tourism system, including all the food, beverage, infrastructure construction and maintenance as well as the retail services that tourists enjoy.

"It definitely is eye opening," Dr Arunima Malik from the University of Sydney, who's the lead author of the study, told BBC News.

"We looked at really detailed information about tourism expenditure, including consumables such as food from eating out and souvenirs. We looked at the trade between different countries and also at greenhouse gas emissions data to come up with a comprehensive figure for the global carbon footprint for tourism."

The researchers also looked at the impacts in both the countries where tourists came from and where they travelled. They found that the most important element was relatively well-off people from affluent countries travelling to other well-to-do destinations.

In the leading countries, US, China, Germany and India, much of the travel was domestic.

Travellers from Canada, Switzerland, the Netherlands and Denmark exert a much higher carbon footprint elsewhere than in their own countries.

Small island states like the Maldives are hugely dependent on long distance tourism

When richer people travel they tend to spend more on higher-carbon transportation, food and pursuits, says Dr Malik.

"If you have visitors from high income countries then they typically spend heavily on air travel, on shopping and hospitality where they go to. But if the travellers are from low income countries then they spend more on public transport and unprocessed food, the spending patterns are different for the different economies they come from."

When measuring per capita emissions, small island destinations such as the Maldives, Cyprus and the Seychelles emerge as the leading lights. In these countries tourism is responsible for up to 80% of their annual emissions.

"The small island states are in a difficult position because we like travelling to these locations and those small island states very much rely on tourist income but they are also at the same time vulnerable to the effects of rising seas and climate change," said Dr Malik.

Demand for international tourism is also being seen in emerging countries like Brazil, India, China and Mexico, highlighting a fundamental problem - wealth.

The report underlines the fact that when people earn more than $40,000 per annum, their carbon footprint from tourism increases 13% for every 10% rise in income. The consumption of tourism does "not appear to satiate as incomes grow," the report says.
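A 13% footprint rise for every 10% income rise corresponds to an income elasticity of roughly 1.3. As a back-of-the-envelope illustration of that relationship (the function, its name, and the example numbers are ours, not the study's):

```python
# Scale a tourism carbon footprint with income growth, assuming a constant
# income elasticity of ~1.3 as reported for incomes above $40,000/year.

def tourism_footprint(base_footprint_t, income_growth_pct, elasticity=1.3):
    """Return a tourism footprint (tonnes CO2) after income grows by the given %."""
    growth = 1 + income_growth_pct / 100
    return base_footprint_t * growth ** elasticity

# A 10% income rise turns a 2-tonne footprint into about 2.26 tonnes,
# i.e. roughly a 13% increase.
```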

The World Travel and Tourism Council (WTTC) has welcomed the research but doesn't accept that the industry's efforts to cut carbon have been a flop.

As countries get wealthier their citizens' appetite for global travel rapidly increases

"It would be unfair to say that the industry is not doing anything," said Rochelle Turner, director of research at WTTC.

"We've seen a growing number of hotels, airports and tour operators that have all become carbon neutral so there is a momentum."

Experts say that offsetting, where tourists spend money on planting trees to mitigate their carbon footprint, will have to increase, despite reservations about its effectiveness.

Awareness is also key. The WTTC says that the recent water crisis in Cape Town has helped people recognise that changes in climate can impact resources like water.

"There is a real need for people to recognise what their impact is in a destination," said Rochelle Turner, "and how much water, waste and energy you should be using compared to the local population."

"All of this will empower tourists to make better decisions and only through those better decisions that we'll be able to tackle the issue of climate change."

Full Report:

DOE’s maverick climate model is about to get its first test

By Gabriel Popkin, May 3, 2018, 1:20 PM

The world's growing collection of climate models has a high-profile new entry. Last week, after nearly 4 years of work, the U.S. Department of Energy (DOE) released computer code and initial results from an ambitious effort to simulate the Earth system. The new model is tailored to run on future supercomputers and designed to forecast not just how climate will change, but also how those changes might stress energy infrastructure.

Results from an upcoming comparison of global models may show how well the new entrant works. But so far it is getting a mixed reception, with some questioning the need for another model and others saying the $80 million effort has yet to improve predictions of the future climate. Even the project's chief scientist, Ruby Leung of the Pacific Northwest National Laboratory (PNNL) in Richland, Washington, acknowledges that the model is not yet a leader. "We really don't expect that our model will be wowing the world," she says.

Since the 1960s, climate modelers have used computers to build virtual globes. They break the atmosphere and ocean into thousands of boxes and assign weather conditions to each one. The toy worlds then evolve through simulated centuries, following the laws of physics. Historically, DOE's major role in climate modeling was contributing to the Community Earth System Model (CESM), an effort based at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. But in July 2014, DOE launched its Accelerated Climate Model for Energy. The goal was to predict how storms and rising seas could affect power plants, dams, and other energy infrastructure, and to focus on regions such as North America or the Arctic. DOE officials also wanted a model that could run on a generation of megapowerful "exascale" computers expected to turn on around 2021.

The project pulled in researchers from eight DOE national labs. It began as a carbon copy of the CESM and retains similar atmosphere and land models, but includes new ocean, sea-ice, river, and soil biochemistry simulations. The DOE team doubled the number of vertical layers, extended the atmosphere higher, and adopted a number-crunching method that is computationally intensive but may be easier to break into chunks and run in parallel on the anticipated exascale machines. "For them, it makes a lot of sense to go in that direction," says Richard Neale, a climate scientist at NCAR.

In 2017, after President Donald Trump took office and pulled the nation out of the Paris climate accords, DOE dropped "climate" from the project name. The new name, the Energy Exascale Earth System Model (E3SM), better reflects the model's focus on the entire Earth system, says project leader David Bader of Lawrence Livermore National Laboratory in California.

The E3SM's first results highlight its potential; they include model runs with ultrasharp, 25-kilometer-wide grid cells—fine enough to simulate small-scale features such as ocean eddies and mountain snow packs. But this sharp picture is still too coarse to resolve individual clouds and atmospheric convection, major factors limiting models' precision. And some scientists doubt it will improve forecasts. The last intercomparison effort, which ended in 2014, included 26 modeling groups—nine more than the previous round—yet yielded collective predictions that were no more precise. "Just having more models—I don't think there's any evidence that that's key to advancing the field," says Bjorn Stevens, a climate scientist at the Max Planck Institute for Meteorology in Hamburg, Germany, and co-leader of the new intercomparison, code-named CMIP6.

Gavin Schmidt, who heads NASA's Goddard Institute for Space Studies in New York City, which also produces a global climate model, questions the new model's rationale, given that DOE's exascale computers do not yet exist. "No one knows what these machines will even look like, so it's hard to build models for them ahead of time," he wrote on Twitter. And the computational intensity of the E3SM has drawbacks, says Hansi Singh, a PNNL climate scientist who uses the CESM for her research. The sheer number of calculations needed to get a result with the E3SM would overwhelm most university clusters, limiting outside scientists' ability to use it.

One preliminary result, on the climate's sensitivity to carbon dioxide (CO2), will "raise some eyebrows," Bader says. Most models estimate that, for a doubling of CO2 above preindustrial levels, average global temperatures will rise between 1.5°C and 4.5°C. The E3SM predicts a strikingly high rise of 5.2°C, which Leung suspects is due to the way the model handles aerosols and clouds. And like many models, the E3SM produces two bands of rainfall in the tropics, rather than the one seen in nature near the equator.
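Those sensitivity figures can be turned into warming estimates with the standard assumption that equilibrium warming scales with the number of CO2 doublings. The sketch below is a textbook simplification for context, not how the E3SM itself computes warming.

```python
import math

# Equilibrium warming implied by a climate sensitivity figure, under the
# standard assumption that warming scales with the number of CO2 doublings.

def warming(c_ppm, c0_ppm=280.0, ecs_per_doubling=3.0):
    """Equilibrium warming (deg C) at CO2 level c_ppm relative to c0_ppm."""
    doublings = math.log(c_ppm / c0_ppm, 2)
    return ecs_per_doubling * doublings

# At a doubling (560 ppm vs. a ~280 ppm preindustrial level), warming()
# simply returns the sensitivity: 1.5-4.5 deg C for most models,
# 5.2 deg C using the E3SM's reported value.
```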

The first test of the E3SM will be its performance in CMIP6. Nearly three dozen modeling groups, including newcomers from South Korea, India, Brazil, and South Africa, are expected to submit results to the intercomparison between now and 2020. Each group will devote thousands of computer-hours to standard scenarios, such as simulating the impact of a 1% per year CO2 increase and an abrupt quadrupling of it.

But given the plodding rate of improvement since previous intercomparisons, few are expecting the E3SM or any other model to yield revolutionary insights. Stevens hopes to revise the exercise to encourage innovations, such as modeling the climate at the 1-kilometer resolution needed to make out individual clouds, or campaigns to gather new kinds of data. "The whole premise of CMIP is trying to get everyone to do the same thing," he says. "Everyone knows that breakthroughs come from getting people to do different things."

Port Harcourt: Anger, anxiety as soot takes over skyline, environs

By Ann Godwin (Port Harcourt)

06 May 2018 | 4:35 am

The ranking of Port Harcourt as the most polluted city in the world has not just heightened panic among residents but has forced some wealthy residents, and those who have alternative homes, to begin relocating from the city.

On April 29, 2018, AirVisual ranked Port Harcourt as the most polluted city in the world with an air quality index of 188, followed by Beijing, China, at 182 and Delhi, India, at 181, among others. Close monitoring has shown that Port Harcourt's index has since risen from 188 to 200 and above.

In fact, all parts of Port Harcourt, including neighbouring Local Government Areas like Eleme, Oyigbo, Ikwerre and Obio/Akpor and the four Local Councils in Ogoni, have usually been covered with dark clouds since the deposition of black soot started. Preliminary test samples showed that the soot was caused by “incomplete combustion of hydrocarbons, as well as asphalt processing and illegal artisanal refining operations”.

The data for each city's air quality index is analysed every day by AirVisual, a global air quality monitor founded in 2015 “on the belief that data enables action and that without air quality data, the hazards of air pollution will remain invisible and unbeatable.”

The soot is not like rain or sun, which can be shielded against outdoors; it filters through locked doors and closed windows into rooms, cars, offices, even underground spaces. In fact, there is no safe place for anyone residing in Port Harcourt, as its effect is indiscriminate.

Current findings show that people now fall sick often in the city. Coughs, catarrh and sore throats are now common. Most hospital beds are occupied by sick children, women, men and the elderly.

Undoubtedly, the black soot portends great danger to people's lives, as it has been associated with asthma, heart disease, bronchitis and other respiratory illnesses; its toxicity can also cause cancer, which may lead to premature death.

Panicked residents took to the streets three weeks ago to protest the unending black soot in the onetime Garden City, which might now be likened to a dark city.

The hundreds of protesters, including pregnant women, children, youths and the elderly, as well as labour leaders and civil society organisations, marched from Isaac Boro Park as early as 8am to Government House, Port Harcourt, and the State House of Assembly, demanding clean air to breathe.

The leader of the Stop-the-Soot Campaign, Tunde Bello, in a 14-point protest letter, said it was high time the government prioritised the issue of the environment, adding that millions of Rivers residents risk cancer, among other diseases, if the soot is not urgently stopped.

He also stressed the need for the state government to conduct an environmental audit on all the oil bearing communities and the operations going on in such areas.

Part of the letter reads, “There should be a street to street health campaign, and awareness on the health issue by local governments and various stakeholders.”

He added: "We want Governor Wike to lead a bi-partisan campaign on behalf of the state and its citizens, requesting that the state assembly and national assembly should change the rules governing the issue of the environment and initiate the part of the Petroleum Industry Bill on the environment."

Meanwhile, the chief organiser of the protest, Eugene Abels, said the campaign would be sustained until something concrete is done by the Federal and State governments to stop it.

He disclosed that the group would kick off a public health campaign next week to enlighten and educate the populace about what to do to stay safe, adding that on June 5 this year another massive protest would be staged against the soot.

Abels said the soot is injurious to health and kills gradually. He maintained that serious steps must be taken to draw the needed attention, to avoid it going the way the United Nations Environment Programme (UNEP) report on Ogoniland is going.

However, the Deputy Governor, Dr. Ipalibo Banigo, received the letter from the protesters and promised to pass the message to the State Governor, Nyesom Wike.

Meanwhile, Governor Wike, while inaugurating the Neighbourhood Safety Corps Agency at Government House, Port Harcourt, urged the protesters to channel their protest to the Federal Government, stating that the government at the centre was responsible for the degradation and pollution of the Rivers environment.

Wike said the Rivers government has done all it is supposed to do, including advocacy and enlightenment campaigns, with no action from the federal government.

He said: "Help us demonstrate against the Federal Government, because we have no control over the sources of the soot. If I go to shut down the refinery, will they not say it is economic sabotage?

"Do I even have the security? Do I control the police and the army to go and stop production at the refinery? Not even the navy."

However, during the Governor's meeting with the United Nations delegation on the soot, he tabled the concerns of the state government and the people before the international community.

While addressing the UN delegation, he called on the United Nations to prevail on the Federal Government to act on the soot.

It is, however, devastating that despite the warnings by medical experts on the effects of the soot menace, nothing serious has been done to bring a permanent solution to it. NESREA, the federal agency responsible for the environment, has remained silent about the soot.

Though the State Ministry of Environment, supported by a task force set up by the state government, has shut down an asphalt processing plant operating in the city, sealed off a Chinese company for breach of environmental laws and seized abandoned tyres, thick smoke still fills the air on a daily basis. And instead of uniting to find a solution, the All Progressives Congress (APC) and the Peoples Democratic Party (PDP) have kept politicking with the health of the people.

The state government has taken no further step to address the challenge; rather, it blames the Federal Government, while the state chapter of the APC exonerates the Federal Government and instead blames Governor Wike for causing the soot.

When The Guardian visited an Ogoni community last week in Khana Local Government Area of the State, the rain water was completely dark.

No one can wash with, drink or bathe in such water, and since their water and land are polluted and no alternative supply has been provided, residents have resorted to buying water from vendors at high prices.

An environmental expert, Akuro Gobo, Professor of Applied Meteorology and Environmental Management at Rivers State University, Port Harcourt, has advised the Federal Government to work in synergy with the state government to find a solution to the soot menace.

He further stated that a state of emergency should be declared in the state's environment sector so that the necessary funds would be released to tackle the soot problem.

Gobo said: "There is need for synergy. The federal and state governments, the National Assembly and the State Assembly, the Federal Ministry of Environment and the State Ministry of Environment, and the Federal Emergency Management Agency and the State Emergency Management Agency should work together to enable a review of the laws that can bring a permanent solution. There should be a better approach to it and encouragement of relevant studies."

Similarly, the Senator representing Rivers South East Senatorial District, Magnus Abe, said in an interview with The Guardian that it was time to declare a national health emergency in Rivers State.

He urged all to set politics aside and come together to find a lasting solution to the soot, saying it should be the real concern of everyone in Rivers State.

On his part, the Chairman of the NUPENG and PENGASSAN Joint National Committee on the PIB, Chika Onuegbu, who also joined the soot protest, said the more devastating issue is that there is no health facility in Rivers State, or any other part of Nigeria, equipped to manage the medical emergency the soot has already unleashed.

He said: “Look my friends, the Soot is like rain. It does not select who will breathe it. Whether you are in APC, PDP, Labour party etc does not matter here, let’s join hands and fight it.

"Just because we are in the gestation period does not mean that all is well. That is the way cancer operates. So for me I don't care what people say. I will continue to sensitize people and put pressure on all concerned to #STOPTHESOOT".

Also, the National Coordinator of Ken Saro-Wiwa Associates, Chief Gani Tobpa, urged government at all levels to work together to achieve a permanent solution to the soot menace.

Is the revolving door syndrome harming Europe's climate change fight?

By Chris Harris

last updated: 06/05/2018

Campaigners have sounded the alarm over climate change policy in Europe after uncovering evidence about the closeness of the relationship between governments and major energy firms.

A new report claims to have found 88 cases of senior politicians or staff moving from the public sector to the energy industry or vice versa.

Greens-European Free Alliance, the political grouping in the European Parliament that commissioned the research, says it opens the door to fossil fuel-friendly firms having an influence on Europe’s climate change policy.

It’s part of a wider ‘revolving doors’ issue in Brussels that sees former European Commissioners and other high-ranking staff move into the private sector to work for firms lobbying to shape EU legislation.

The report calls for safeguards to be put in place to help prevent conflicts of interest arising, because climate change is "one of the greatest and most urgent challenges of our time".

“It is difficult to measure the effect the revolving door has on climate policy,” reads the report’s executive summary.

“We highlight numerous cases of revolving doors between fossil fuel companies and the public sector but more research is needed before a definitive link with its effect on climate policy can be made.

“Nevertheless, the cases documented highlight the major potential for conflicts of interest, and when one takes into account what is at stake for large fossil fuel companies, and how much lobbying they conduct on climate policy more generally, weak revolving door policy provides another avenue of influence for private fossil fuel interests to exploit.”

The report looks at the revolving door issue in 13 countries across Europe.

It highlighted scores of cases, among them Charles Hendry, the UK’s former minister for climate change, who since leaving office has netted three jobs with oil firms.


Scientists Find New Climate ‘Feedback Loop’ in Lakes

May 4, 2018

Reeds in a lake near Sudbury, Ontario, in the Canadian Boreal Shield. Andrew Tanentzap

Warming global temperatures have helped boost the growth of freshwater plants like cattails in the world’s lakes in recent decades. Now, scientists have found that this surge in aquatic plant growth could double the methane being emitted from lakes — already significant sources of methane — over the next 50 years.

The research, published in the journal Nature Communications, reveals a previously unknown climate feedback loop, where warming triggers the release of greenhouse gases, which in turn triggers more warming — similar to what is happening with the Arctic's melting permafrost. Freshwater lakes currently contribute as much as 16 percent of the world's methane emissions, compared with just 1 percent from oceans.

Lakes produce methane when plant debris is buried in sediment and consumed by microbes. The scientists studied differences in methane production between biomass that originated in lakeside forests and dead aquatic plants growing in the water. They found that forest-derived biomass helped trap carbon in the lake sediment, reducing methane emissions, while aquatic plant biomass fueled methane production. Lake sediment full of decaying cattails produced over 400 times as much methane as sediment with plant debris from coniferous trees, and almost 2,800 times as much as sediment filled with deciduous tree debris.

“The organic matter that runs into lakes from the forest trees acts as a latch that suppresses the production of methane within lake sediment,” Erik Emilson, an ecologist at Natural Resources Canada and lead author of the study, said in a statement. “Forests have long surrounded the millions of lakes in the northern hemisphere, but [they] are now under threat. At the same time, changing climates are providing favorable conditions for the growth and spread of aquatic plants such as cattails, and the organic matter from these plants promotes the release of even more methane from the freshwater ecosystems of the global north.”

Using models of the Boreal Shield, a lake-filled ecosystem that stretches across central and eastern Canada, Emilson and his colleagues calculated that the number of lakes colonized by just the common cattail (Typha latifolia) could double in the next 50 years — resulting in a 73 percent increase in lake-produced methane in that part of the world alone.


Avoiding meat and dairy is ‘single biggest way’ to reduce your impact on Earth

Biggest analysis to date reveals huge footprint of livestock - it provides just 18% of calories but takes up 83% of farmland

Damian Carrington Environment editor

Thu 31 May 2018 14.00 EDT Last modified on Fri 1 Jun 2018 08.20 EDT

Avoiding meat and dairy products is the single biggest way to reduce your environmental impact on the planet, according to the scientists behind the most comprehensive analysis to date of the damage farming does to the planet.

The new research shows that without meat and dairy consumption, global farmland use could be reduced by more than 75% – an area equivalent to the US, China, European Union and Australia combined – and still feed the world. Loss of wild areas to agriculture is the leading cause of the current mass extinction of wildlife.

The new analysis shows that while meat and dairy provide just 18% of calories and 37% of protein, they use the vast majority – 83% – of farmland and produce 60% of agriculture's greenhouse gas emissions. Other recent research shows 86% of all land mammals are now livestock or humans. The scientists also found that even the very lowest impact meat and dairy products still cause much more environmental harm than the least sustainable vegetable and cereal growing.

The study, published in the journal Science, created a huge dataset based on almost 40,000 farms in 119 countries and covering 40 food products that represent 90% of all that is eaten. It assessed the full impact of these foods, from farm to fork, on land use, climate change emissions, freshwater use and water pollution (eutrophication) and air pollution (acidification).

“A vegan diet is probably the single biggest way to reduce your impact on planet Earth, not just greenhouse gases, but global acidification, eutrophication, land use and water use,” said Joseph Poore, at the University of Oxford, UK, who led the research. “It is far bigger than cutting down on your flights or buying an electric car,” he said, as these only cut greenhouse gas emissions.

Humans just 0.01% of all life but have destroyed 83% of wild mammals – study

“Agriculture is a sector that spans all the multitude of environmental problems,” he said. “Really it is animal products that are responsible for so much of this. Avoiding consumption of animal products delivers far better environmental benefits than trying to purchase sustainable meat and dairy.”

The analysis also revealed a huge variability between different ways of producing the same food. For example, beef cattle raised on deforested land result in 12 times more greenhouse gases and use 50 times more land than those grazing rich natural pasture. But the comparison of beef with plant protein such as peas is stark, with even the lowest impact beef responsible for six times more greenhouse gases and 36 times more land.

The large variability in environmental impact from different farms does present an opportunity for reducing the harm, Poore said, without needing the global population to become vegan. If the most harmful half of meat and dairy production was replaced by plant-based food, this still delivers about two-thirds of the benefits of getting rid of all meat and dairy production.

Cutting the environmental impact of farming is not easy, Poore warned: “There are over 570m farms all of which need slightly different ways to reduce their impact. It is an [environmental] challenge like no other sector of the economy.” But he said at least $500bn is spent every year on agricultural subsidies, and probably much more: “There is a lot of money there to do something really good with.”

Labels that reveal the impact of products would be a good start, so consumers could choose the least damaging options, he said, but subsidies for sustainable and healthy foods and taxes on meat and dairy will probably also be necessary.

One surprise from the work was the large impact of freshwater fish farming, which provides two-thirds of such fish in Asia and 96% in Europe, and was thought to be relatively environmentally friendly. “You get all these fish depositing excreta and unconsumed feed down to the bottom of the pond, where there is barely any oxygen, making it the perfect environment for methane production,” a potent greenhouse gas, Poore said.

The research also found grass-fed beef, thought to be relatively low impact, was still responsible for much higher impacts than plant-based food. “Converting grass into [meat] is like converting coal to energy. It comes with an immense cost in emissions,” Poore said.

The new research has received strong praise from other food experts. Prof Gidon Eshel, at Bard College, US, said: “I was awestruck. It is really important, sound, ambitious, revealing and beautifully done.”

He said previous work on quantifying farming’s impacts, including his own, had taken a top-down approach using national level data, but the new work used a bottom-up approach, with farm-by-farm data. “It is very reassuring to see they yield essentially the same results. But the new work has very many important details that are profoundly revealing.”

Giving up beef will reduce carbon footprint more than cars, says expert

Prof Tim Benton, at the University of Leeds, UK, said: “This is an immensely useful study. It brings together a huge amount of data and that makes its conclusions much more robust. The way we produce food, consume and waste food is unsustainable from a planetary perspective. Given the global obesity crisis, changing diets – eating less livestock produce and more vegetables and fruit – has the potential to make both us and the planet healthier.”

Dr Peter Alexander, at the University of Edinburgh, UK, was also impressed but noted: “There may be environmental benefits, eg for biodiversity, from sustainably managed grazing and increasing animal product consumption may improve nutrition for some of the poorest globally. My personal opinion is we should interpret these results not as the need to become vegan overnight, but rather to moderate our [meat] consumption.”

Poore said: “The reason I started this project was to understand if there were sustainable animal producers out there. But I have stopped consuming animal products over the last four years of this project. These impacts are not necessary to sustain our current way of life. The question is how much can we reduce them and the answer is a lot.”

‘Sunny day flooding’ worsens at NC beaches — a sign sea rise is decades too soon, studies say

By Abbie Bennett

May 03, 2018 10:31 PM


Living in cities threatened by sea-level rise could be like living near an active volcano, according to NOAA oceanographer William Sweet.

Some parts of the Earth are seeing sea levels rise far beyond average, and it's just a waiting game before some areas are inundated with sea water, studies show.

The East Coast of the U.S. is experiencing "sunny day flooding" that scientists didn't expect for decades yet.

Sea levels are rising at a rate of about an inch per year (5 inches from 2011-15) in some areas along the East Coast, from North Carolina to Florida, according to one study — that's faster than researchers expected.

Residents of coastal communities most often feel the effects of sea level rise during tidal flooding.

Tidal flooding, also known as "sunny day flooding," is the temporary inundation of low-lying areas, such as roads, during high-tide events — especially during "king tides," the highest tides of the year.

King tides aren't caused by sea level rise in and of themselves, but because they are the annual peak tides, they demonstrate how sea level has already risen over the past 100 years.

Sea levels aren't rising equally "like water in a bathtub," according to a report from Yale Environment 360. "The oceans are more akin to a rubber kiddie pool where the water sloshes around unevenly, often considerably higher on one side than another."

More flooding, higher costs

Climate scientists view sea level rise as one of the most obvious signals of a warming planet. Sea water expands as it warms, and melting land-based ice sheets add to rising water levels.

There are neighborhoods that now flood on sunny days, but didn’t years ago even during especially high tides, according to the National Oceanic and Atmospheric Administration.

And as sea levels continue to rise, the frequency, depth and extent of coastal flooding will continue to worsen, according to NOAA.

In 2016, Charleston saw 50 days of tidal flooding.

Fifty years ago? Just four days.

Flooding projections are set at about 25 percent above average for 2017-18 for areas including Wilmington, according to a recent NOAA report.

Wilmington had 84 days of high-tide flooding in 2016, according to NOAA.

"It is important for planning purposes that U.S. coastal cities become better informed about the extent that high-tide flooding is increasing and will likely increase in the coming decades," according to the February 2018 NOAA report.

Sea levels rise and waters inundate storm drains and wash over flood barricades — flooding buildings and streets.

While flooding impacts might be limited or not obvious in those areas right now, stormwater systems are reported to be degraded, "increasing the risk of compound flooding during heavy rains," according to NOAA.

And coastal cities should be particularly concerned that the cost of dealing with an increase in many smaller floods will be greater than that of major, but much rarer, flood events, NOAA said.

A 2017 report "When Rising Seas Hit Home: Hard Choices Ahead for Hundreds of U.S. Coastal Communities" analyzed three projected scenarios of when towns and cities along U.S. coasts can expect to see the ocean rise enough to disrupt daily life.

That report found that as many as 20 North Carolina communities could be submerged by sea water in the next 15 years.

The report was created by the Union of Concerned Scientists, a U.S. nonprofit science advocacy group founded in 1969 by faculty and students of the Massachusetts Institute of Technology.

20 NC coastal communities could be flooded by sea water in 15 years, new report says

The Union of Concerned Scientists report predicts that by 2035, 13 communities clustered mostly on the mainland side of Pamlico Sound will be “chronically inundated.” The study defines that as the point at which 10 percent of a community’s usable land floods at least 26 times a year.
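The report's threshold can be expressed as a simple check. The sketch below is purely illustrative (the function name and sample figures are hypothetical, not from the study's methodology):

```python
# Illustrative sketch of the Union of Concerned Scientists' definition:
# a community is "chronically inundated" when at least 10 percent of its
# usable land floods at least 26 times a year.

def chronically_inundated(flooded_land_pct: float, floods_per_year: int) -> bool:
    """Return True if the community meets the report's threshold."""
    return flooded_land_pct >= 10.0 and floods_per_year >= 26

# Hypothetical examples: 11% of land flooding 30 times a year qualifies;
# the same extent flooding only 20 times a year does not.
print(chronically_inundated(11.0, 30))  # True
print(chronically_inundated(11.0, 20))  # False
```

Both conditions must hold at once: widespread but rare flooding, or frequent but highly localized flooding, would not meet the definition.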

By 2060, that number rises to 25 communities. By 2100, it says, people in 49 communities may be forced to adapt to rising water or move out.

Familiar vacation spots on North Carolina’s Outer Banks would suffer, the report said. Nags Head would have 11 percent of its land area, and Hatteras 14 percent, chronically inundated by 2045 under the highest sea level rise scenario. By 2060, it predicts, flooded areas would grow to 19 percent of the land at Nags Head and 28 percent at Hatteras.

Projections of global sea level rise range widely, from as little as 2 feet to more than 6 feet by 2100.

The Intergovernmental Panel on Climate Change estimated in a 2013 report that sea level will rise between 10.2 inches and nearly 39 inches by 2100, depending in part on future greenhouse gas emission scenarios and the effect of greenhouse gas concentrations on global temperatures and thermal expansion.

Sea-level rise scenarios have prompted opposition from some economic development interests on North Carolina’s coast that say long-range forecasts could be wrong.

When a state science panel reported in 2010 that seas on the coastline could rise by as much as 39 inches over the next century, legislators passed a law forbidding communities from using the report to make new rules.

A new report in 2015 looked only 30 years to the future and forecast a rise of 2 to 10 inches, depending on location.

Satellites and tide gauges have been used to report sea level at regular intervals in North Carolina.

That data shows that the sea level has been gradually rising consistently along the North Carolina coast for the past 30 years or more since those gauges were installed, according to the N.C. Department of Environmental Quality.

Tide gauge measurements show that relative sea-level rise is higher in the northern coastal plain (north of Cape Lookout) than in the southern coastal plain, according to NCDEQ. This is at least partly because the northern coastal plain is subsiding at a higher rate than the southern plain.

The NCDEQ said possible impacts of sea-level rise in the state include:

▪ Accelerated coastal erosion.

▪ Higher storm surge flooding and property damage.

▪ Contamination of drinking water with seawater.

▪ Increased likelihood of flooding during heavy rain.

▪ More frequent flooding and drainage issues.

▪ Saltwater intrusion and salinity changes.

▪ Changes in the availability and distribution of fish.

Possible causes of East Coast flooding

Scientists have isolated several factors that appear to make the U.S. southeastern coast a hot spot of sea level rise.

One is the role of the Gulf Stream, a warm and fast Atlantic Ocean current that runs from the Gulf of Mexico to the tip of Florida, and then follows the East Coast. The Gulf Stream influences the East Coast's climate.

An example of the Gulf Stream's effect on sea level rise can be seen in 2016's Hurricane Matthew, the first Category 5 Atlantic hurricane since 2007's Felix. Matthew caused massive flooding, power outages and millions of dollars in damage throughout North Carolina because of its relentless rain and high sea levels that blocked drainage in North Carolina and Virginia.

Matthew slowed in the Gulf Stream, stalling out over parts of the Southeast — worsening its effects.

Scientists credit the rapidly rising sea levels from Cape Hatteras in North Carolina to Miami from 2011 to 2015 (as much as 5 inches in some areas) to El Niño and other weather phenomena, including wind patterns that lead to higher water on the Eastern seaboard.

Charlotte Observer staff writer Bruce Henderson contributed to this report.

Weeds take over kelp in high CO2 oceans

Adelaide, Australia (SPX) May 04, 2018

Kelp forests (left image) are increasingly being replaced by extensive covers of weeds (right image) where local pollution occurs. Global pollution, in the form of carbon emissions, is likely to accelerate this switch.

Weedy plants will thrive and displace long-lived, ecologically valuable kelp forests under forecast ocean acidification, new research from the University of Adelaide shows.

Published in the journal Ecology, the researchers describe how kelp forests are displaced by weedy marine plants in high CO2 conditions, equivalent to those predicted for the turn of the century.

Carbon emissions will fuel the growth of small weedlike species, but not kelps - allowing weeds to take over large tracts of coastal habitats, the researchers say.

"Carbon emissions might boost plant life in the oceans, but not all plant life will benefit equally," says project leader Professor Sean Connell, from the University of Adelaide's Environment Institute. "Weedy species are quicker to capitalise on nutrients, such as carbon, and can grow faster than their natural predators can consume them.

"Unfortunately, the CO2 that humans are pumping into the atmosphere by burning fossil fuels gets absorbed by the ocean and favours weedy turfs, which replace kelp forests that support higher coastal productivity and biodiversity."

Led by the University of Adelaide, the international team, with members from Europe, Canada, the USA and Hong Kong, used natural volcanic CO2 seeps to compare today's growth of weeds and kelps with growth under the levels of CO2 predicted for the turn of the century.

"In our study, we found that while elevated CO2 caused some weeds to be eaten in greater amounts, the dominant sea urchin predator ate these weeds at reduced amounts. This enabled the weeds to escape their natural controls and expand across coasts near the elevated CO2," says Professor Connell.

Fellow researchers Dr Zoe Doubleday and Professor Ivan Nagelkerken, from the University's Southern Seas Ecology Laboratories, visited the volcanic vents with Professor Connell.

"We could clearly see the effect of CO2 on promoting the dominance of weedy species and the suppression of their natural predators," says Dr Doubleday.

Professor Nagelkerken says: "Under the level of acidification we will find in oceans in a few decades, marine life is likely to be dominated by fast-growing and opportunistic species at the expense of longer-lived species with specialist lifestyles, unless we can set some change in place.

"We need to consider how natural enemies might be managed so that those weedy species are kept under control," Professor Nagelkerken says.

Europe's first solar panel recycling plant opens in France

by Reuters

Tuesday, 26 June 2018 06:32 GMT

The plant is set to recycle 1,300 tonnes of solar panels in 2018 - virtually all solar panels that will reach their end of life in France this year

* Veolia to recycle virtually all French solar panels

* PV panels for now recycled mainly with old glass

* Renewables agency expects huge growth in PV recycling value

* Veolia aims to build PV recycling plants abroad

By Geert De Clercq

PARIS, June 25 (Reuters) - French water and waste group Veolia has opened what it says is Europe's first recycling plant for solar panels and aims to build more as thousands of tonnes of ageing solar panels are set to reach the end of their life in coming years.

The new plant in Rousset, southern France, has a contract with solar industry recycling organisation PV Cycle France to recycle 1,300 tonnes of solar panels in 2018 - virtually all solar panels that will reach their end of life in France this year - and is set to ramp up to 4,000 tonnes by 2022.

"This is the first dedicated solar panel recycling plant in Europe, possibly in the world," Gilles Carsuzaa, head of electronics recycling at Veolia, told reporters.

The first ageing photovoltaic (PV) panels - which have lifespans of around 25 years - are just now beginning to come off rooftops and solar plants in volumes sufficiently steady and significant to warrant building a dedicated plant, Veolia said.

Up until now, ageing or broken solar panels have typically been recycled in general-purpose glass recycling facilities, where only their glass and aluminium frames are recovered and their specialty glass is mixed in with other glass. The remainder is often burned in cement ovens.

Employees work at Veolia’s solar panel recycling plant in Rousset, France, June 25, 2018. At the plant, photovoltaic panels are disassembled and their constituent parts such as glass, aluminium, silicon and plastics are recycled. REUTERS/Jean-Paul Pelissier

In a 2016 study on solar panel recycling, the International Renewable Energy Agency (IRENA) said that in the long term, building dedicated PV panel recycling plants makes sense. It estimates that recovered materials could be worth $450 million by 2030 and exceed $15 billion by 2050.

The robots in Veolia's new plant disassemble the panels to recover glass, silicon, plastics, copper and silver, which are crushed into granulates that can be used to make new panels.

A typical crystalline silicon solar panel is made up of 65-75 percent glass, 10-15 percent aluminium for the frame, 10 percent plastic and just 3-5 percent silicon. The new plant does not recycle thin-film solar panels, which make up just a small percentage of the French market.
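Using the composition figures above, the recoverable mass per tonne of panels can be sketched roughly. The midpoint percentages below are assumptions for illustration only; real yields depend on panel design and process losses:

```python
# Rough, illustrative breakdown of one tonne (1,000 kg) of typical
# crystalline silicon panels, using midpoints of the ranges quoted
# in the article (65-75% glass, 10-15% aluminium, 10% plastic,
# 3-5% silicon). Percentages here are assumed midpoints.
composition_pct = {
    "glass": 70.0,
    "aluminium": 12.5,
    "plastic": 10.0,
    "silicon": 4.0,
}

tonne_kg = 1000.0
for material, pct in composition_pct.items():
    # Mass of each material recoverable from one tonne of panels.
    print(f"{material}: {pct / 100 * tonne_kg:.0f} kg")
```

On these assumptions, a tonne of panels yields roughly 700 kg of glass and 125 kg of aluminium, which is why glass recovery dominates the economics of panel recycling.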

Veolia said it aims to recycle all decommissioned PV panels in France and wants to use this experience to build similar plants abroad.

Installed solar capacity is growing 30 to 40 percent per year in France, with 53,000 tonnes installed in 2016 and 84,000 tonnes in 2017, Veolia said.

Worldwide, Veolia expects the tonnage of decommissioned PV panels to grow to several tens of millions of tonnes by 2050.

IRENA estimated that global PV waste streams will grow from 250,000 tonnes at the end of 2016 - less than one percent of installed capacity - to more than five million tonnes by 2050. By then, the amount of PV waste will almost match the mass contained in new installations, it said.

Justice Kennedy’s Retirement Could Reshape the Environment

A new justice will likely weaken the Clean Air Act, Clean Water Act, and Endangered Species Act.


JUN 27, 2018

Justice Anthony Kennedy arrives at the funeral of fellow justice Antonin Scalia in February 2016. CARLOS BARRIA / REUTERS

The retirement of Justice Anthony Kennedy, announced Wednesday in a letter hand-delivered to President Trump, could bring about sweeping changes to U.S. environmental law, endangering the federal government’s authority to fight climate change and care for the natural world.

With Kennedy gone, a more conservative Supreme Court could overhaul key aspects of the Clean Air Act, the Clean Water Act, and the Endangered Species Act, legal scholars say. And any new justice selected by President Trump would likely seek to weaken the Environmental Protection Agency, curtail its ability to fight global warming, and weaken its protections over wetlands.

The reason has to do with simple math. As on many other issues, Kennedy has functioned as the court’s swing vote on the environment, occasionally joining with the court’s four more liberal justices to preserve some aspect of green law.

“He’s been on the court just over 30 years, and he’s been in the majority in every single environmental case but one. You don’t win without Kennedy,” said Richard Lazarus, a law professor at Harvard who has argued 14 cases in front of the Supreme Court.

“I think more than the other more conservative justices, Kennedy seemed open to embracing the idea that tough national laws were necessary to address some types of problems,” he told me. “He was concerned about private-property rights and the marketplace, but open to the necessity of tough environmental laws.”

Other legal scholars and environmental advocates agreed.

“The loss of Kennedy is not good news for environmental regulation,” said Ann Carlson, a law professor at UCLA and the co-director of the Emmett Institute on Climate Change and the Environment.

The nation’s highest court would now “almost certainly” be more hostile to environmental law than it has been since the founding of the EPA in 1970, said Jonathan Z. Cannon, a law professor at the University of Virginia.

“With the departure of Justice Kennedy, this is no time to mince words: We are in for the fight of our lives,” said Trip Van Noppen, the president of the environmental-legal-advocacy group Earthjustice, in a statement. “Trump intends to fill this Supreme Court vacancy with someone who will put corporations, the wealthy, and the powerful above the rest of us. We must do everything in our power to resist this.”

Experts said that Kennedy’s departure could change the outcome of near-term rulings on three different questions: Can the government fight climate change? How broadly can it regulate clean water? And does the Constitution even allow it to protect endangered species and regulate pollution?

Kennedy provided the crucial fifth vote in Massachusetts v. EPA, which is the most important court case in U.S. climate law. In that decision, the Supreme Court said the EPA could regulate greenhouse gases under the Clean Air Act.

The ruling meant that the president could regulate carbon dioxide and other heat-trapping gases without requiring Congress to pass a new law. It set the stage for President Obama’s broad set of climate-targeted regulations, including his fuel-economy rules for cars and anti-carbon plan for the electricity sector.

Cannon, the University of Virginia law professor whose legal arguments shaped the case, told me that Kennedy’s vote in Massachusetts was “an environmentalist triumph that would not have happened if he had held ranks with other conservatives.”

Had Kennedy not joined with the liberals, then the EPA would likely not have authority over greenhouse gases. With Kennedy gone, this may soon come to pass.

“It’s easy to think about the loss of Kennedy leading to either the repeal of Mass. v. EPA or a serious restriction to the Clean Air Act’s ability to regulate greenhouse gases,” Carlson told me.

Lazarus, the Harvard professor, disagreed that a more conservative court would overturn Massachusetts v. EPA. “I assume [the decision] itself, that greenhouse gases are air pollutants, will hold. That’s a Constitutional law question … I don’t think we’re going to run roughshod over that,” he told me.

But he worried a future court could limit the ability of environmentalists to gain standing. That is, it could seriously restrict the ability of private Americans to sue the federal government for failing to respond to climate change.

Even more likely than these changes to climate law, experts said, is that Kennedy’s successor will curtail the Clean Water Act. Specifically, he or she would make it much easier to—and this is not a joke—drain the swamps.

The Clean Water Act, passed over President Nixon’s veto in 1972, allows the government to regulate the conservation and protection of rivers, streams, and any other “waters of the United States.” It also prevents companies from draining wetlands or dumping pollutants in those waters without a permit.

The only problem: The phrase “the waters of the United States” has never been completely defined. This means that it isn’t totally clear which bodies of water—and especially which wetlands, which tend to touch many different rivers and streams—are subject to EPA conservation and pollution control.

When the Supreme Court last examined that problem, in 2006, the justices ruled in an unusual way: 4-1-4. The four liberal justices wanted to preserve broad protections for wetlands; four conservative justices wanted to narrow them.

Kennedy wound up in the middle. He wrote his own opinion, recommending that wetlands be subject to federal regulation only if they had a “significant nexus” with navigable waters. This argument would have preserved much of the federal government’s ability to regulate wetlands.

A few years later, when the Obama administration tried to define “waters of the United States” once and for all, EPA lawyers looked to Kennedy’s opinion. “The Obama administration’s approach to regulating wetlands basically came straight out of Kennedy’s reasoning in that case,” Carlson told me.

It was a savvy bit of rule-writing: Since Kennedy would likely rule on a case about their rule, why not adopt his legal thinking? The only problem: “Obviously, now, Kennedy will be gone,” Lazarus said.

Had he stayed on the court for another year or two, Kennedy likely would have ruled on this very question. In January, the EPA Administrator Scott Pruitt announced that the Trump administration would suspend the Obama administration’s Kennedy-inspired wetlands rule and replace it with a far weaker policy. States and environmental groups promptly sued Pruitt, setting up a legal fight that has a good chance of reaching the high court.

“I think Kennedy would have struck down a serious attempt to repeal federal jurisdiction [over wetlands],” Carlson told me. “Kennedy had a much more sophisticated view of why the environmental protection of wetlands made sense—and not just for the wetlands themselves. Losing him could wreak havoc.”

Finally, Kennedy’s retirement could allow the high court to rule on a broad Constitutional question about whether the government even has the power to make and enforce environmental policy in the first place.

In October, for instance, the Supreme Court will hear arguments in Weyerhaeuser Company v. United States, which could address whether parts of the Endangered Species Act are unconstitutional. Plaintiffs in that case argue that the federal government doesn’t have Constitutional authority to force land owners to protect the habitats of endangered species.

“That is a hotly contentious issue in the Supreme Court, and it’s a case that one would expect going in would have a good chance of coming out 5-4,” Lazarus said. “So replacing Kennedy with anybody else makes a big difference.”

His absence from the bench could also reopen the Supreme Court to rule on the “takings clause,” which limits the federal government’s ability to take private property “for public use, without just compensation.” During the 1980s, conservative justices pushed for ever more aggressive readings of the takings clause, and decisions increasingly embraced the idea that the government should compensate companies it regulated.

But when Kennedy joined the court in 1988, the speed of those rulings slowed. A more conservative court could push the idea again. “If you got a Supreme Court more interested in protecting private-property rights, then you could really get a curtailment of the government’s right to regulate under the Endangered Species Act and the Clean Water Act,” Carlson said.

A more conservative court could also end the legal principle, established during the Reagan administration, that government agencies like the EPA should have a wide latitude in interpreting the laws that affect them. The idea, named “Chevron deference” after a key court case, has undergirded many Democratic policy victories in the past three decades. Now, says Lazarus, it is “certainly at risk.”

He told me he expects deference to be a big issue during the confirmation hearings for Kennedy's replacement.

No matter who Trump picks for the Supreme Court, it is clear that an era of American environmental law—built around the preferences of one man—is coming to an end. Environmental lawyers may no longer work as hard to make their arguments Kennedy-esque: respectful of private property, but comfortable with a muscular government.

“Kennedy shared the right’s concerns about the scope of federal power, but he was reluctant to draw sharp bright lines limiting such power,” said Jonathan Adler, a law professor at Case Western Reserve, in an email. “He was similarly reluctant to endorse a view of standing that would significantly constrain environmental claims. These tendencies meant he was often the swing vote, and his solo opinions often determined the contours of the law.”

I asked Lazarus whether Kennedy’s departure made him more worried to argue another environmental case in front of the court. He waved me off.

“No, not at all, you work with what you’re given,” he said. “It’s the American voters’ job to elect the right people to the White House, and then it’s my job to argue the best case I can in front of the justices that are presented.

“And these sorts of moments,” he added, “underscore the stakes.”

ROBINSON MEYER is a staff writer at The Atlantic, where he covers climate change and technology.

Our Planet Lost 40 Football Fields of Tree Cover Every Single Minute in 2017


Last year, 39 million acres of forest cover were lost from the world's tropics.

The good news is that this figure is a little lower than the record amounts of canopy destroyed in 2016. But that's pretty much the only silver lining here. A less generous interpretation of the data suggests there's no sign of the trend reversing.

Data gathered by the University of Maryland as part of the US-based World Resources Institute's Global Forest Watch was used in a snapshot describing how much tropical forest lost significant cover in 2017.

To be precise, lost tree cover isn't quite the same thing as deforestation, which actually – thankfully – seems to be declining.

A reduction in forest cover here describes the removal of 30 percent of the canopy in both managed and wild wooded ecosystems, most commonly as a result of natural disasters or fires set by humans.

If 39 million acres is hard to wrap your head around, it's close to 160,000 square kilometres (about 60,000 square miles).

Still can't picture it? Nepal covers 147,181 square kilometres. So this is bigger than Nepal.

To lose that amount of cover you could strip leaves from an area of 40 standard American football fields. Every minute. For a year.

When we picture tropical forest, it's hard not to think of the Amazon first. And in 2016, Brazil lost 9.1 million acres of its portion of Amazonian tree cover - three times more than the previous year.

This jump was caused by widespread fires rather than deforestation, though fires still do a thorough job of reducing biodiversity and stored biomass.

Considering the Amazon region had more fires in 2017 than in any other year since recording began in 1999, any ground gained in locking up carbon through past deforestation laws was well and truly set back.

Climate change is a significant contributing factor not just towards large scale fires, but various tree-stripping weather events. On the island of Dominica, an extreme hurricane season stripped bare a third of its tree cover in 2017. Similarly, Puerto Rico lost 10 percent of its island's canopy.

The biggest loser in the report was Colombia, which saw close to a 50 percent spike in tree cover loss. Yet the cause of this decline was more political than climate-related, and just as challenging to resolve.

The recent disarming of the major guerrilla movement known as the Revolutionary Armed Forces of Colombia (FARC) has seen the group lose control over large sections of remote forest.

Where once the rebel faction kept commercial interests out of the wilderness, their removal has opened the way for illegal land clearance for coca plantations, logging, and pastures.

It's important to note that while this year's report technically shows an improvement, an average taken over the past three years still shows the problem is worsening. And if we want to see a trend, that's the more accurate number to go for.

Not that it's doom and gloom everywhere. Indonesia saw such a big reduction in 2017's tree cover loss that its three-year average has improved as well.

A 2016 moratorium on converting peatland for agriculture is thought to have played a role in the drop. Combined with ongoing extensions of a 2011 freeze on licenses to exploit primary forest, that is worth cheering.

Indonesia's efforts show the rest of the world what can be done, which is a step in the right direction.

Measures to limit deforestation are to be applauded, and seem to be working. But curbing the impact of global warming on our climate and reining in the poverty that encourages people to clear forest cover for crops and pastures remain important challenges.

Ironically, we need healthy forests to play a role in managing both of these issues. It's a vicious cycle, and one that we quickly need to put the brakes on.

Green electricity isn't enough to curb global warming

by Brooks Hays, Washington (UPI) Jun 26, 2018

The adoption of clean energies to power electric grids won't be sufficient to meet the Paris climate targets established by the United Nations.

According to new research published in the journal Nature Climate Change, the continued use of fossil fuels for a variety of industrial processes, to power vehicles and heat buildings, is likely to push CO2 emissions beyond manageable levels.

"We focused on the role of fossil fuel emissions that originate in industries like cement or steel making, fuel our transport sector from cars to freight to aviation and goes into heating our buildings," Shinichiro Fujimori, researcher from the National Institute for Environmental Studies and Kyoto University in Japan, said in a news release. "These sectors are much more difficult to decarbonize than our energy supply, as there are no such obvious options available as wind and solar electricity generation."

According to Fujimori and his colleagues, green transportation is essential to meeting CO2 emissions targets.

The Paris agreement called on nations to progressively curtail CO2 emissions in order to limit global warming to 1.5 degrees Celsius. To meet this target, scientists suggest no more than 200 gigatons of CO2 can be released between now and 2100. If current fossil fuel-use trends continue, however, 4,000 gigatons of CO2 will have been emitted by the end of the century.
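The gap between those two figures can be checked with back-of-the-envelope arithmetic. The sketch below is illustrative only; the assumed current global emissions rate of roughly 40 gigatons of CO2 per year is not from the article:

```python
# Rough carbon-budget arithmetic for the figures above.
# Assumption (not from the article): current global emissions ~40 Gt CO2/year.
BUDGET_GT = 200            # remaining budget for 1.5 C, per the study
ANNUAL_GT = 40             # assumed current global emissions rate
YEARS_TO_2100 = 2100 - 2018

# How long the 1.5 C budget lasts at the assumed current rate.
years_until_budget_spent = BUDGET_GT / ANNUAL_GT

# Average annual rate implied by 4,000 Gt emitted between now and 2100.
implied_avg_rate = 4000 / YEARS_TO_2100

print(f"Budget exhausted in ~{years_until_budget_spent:.0f} years at current rates")
print(f"4,000 Gt by 2100 implies ~{implied_avg_rate:.0f} Gt/yr on average")
```

Under these assumptions the budget would be spent within a handful of years, while business-as-usual trends run at roughly twenty times the budget - which is why the authors argue pledges must be strengthened sooner rather than later.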

Authors of the new study argue that relying on carbon capture and storage technologies is a dangerous strategy. Pulling carbon from the atmosphere is likely a necessity, scientists admit, but major industries, including the transportation industry, must also end their use of fossil fuels.

Researchers argue that climate pledges made by individual countries must be strengthened sooner rather than later in order to prevent continued investments in fossil fuel infrastructure, investments that lock in continued CO2 emissions.

"Climate mitigation might be a complex challenge, but it boils down to quite simple math in the end: If the Paris targets are to be met, future CO2 emissions have to be kept within a finite budget," said Elmar Kriegler, a scientist at the Potsdam-Institute for Climate Impact Research. "The more the budget is overrun, the more relevant will carbon dioxide removal technologies become, and those come with great uncertainties."

According to Kriegler, the precise CO2 budget may be difficult to calculate, but the solution to the threat of global warming is clear.

Last straw for McDonald's, Burger King in Mumbai plastic ban

by Staff Writers

Mumbai (AFP) June 26, 2018

Burger King, McDonald's and Starbucks are among dozens of companies fined for violating a new ban on single-use plastics in India's commercial capital Mumbai, an official said Tuesday.

The rules, in force since Saturday, prohibit the use of disposable plastic items such as bags, cutlery, cups and bottles under a certain size.

Businesses and residents face fines ranging from 5,000 rupees ($73) for a first-time offence to 25,000 rupees ($367), or even three months in jail, for repeat offending.

Some 250 officials, wearing blue uniforms and dubbed Mumbai's "anti-plastic squad", have been deployed to carry out inspections of restaurants and shops across the teeming coastal city of 20 million.

Nidhi Choudhari, a deputy municipal commissioner in charge of enforcing the ban, said 660,000 rupees ($9,684) in fines had been collected during the first three days.

She said 132 premises had been issued with penalties including outlets of Burger King, McDonald's and Starbucks.

A branch of Godrej Nature's Basket, a high-end Indian supermarket, had also been penalised, Choudhari added.

"All were fined for using banned plastic straws and disposable cutlery etc," she told AFP.

A spokesperson for Starbucks in India said the company complies with local laws in all of its markets and was committed to "environmental sustainability".

Hardcastle Restaurants, which runs the McDonald's franchise in Mumbai, said it had "successfully transitioned from plastic to eco-friendly and biodegradable alternatives" such as wooden cutlery.

Authorities hope the ban will help clean up Mumbai's beaches and streets, which like other cities in India are awash with vast mountains of plastic rubbish.

Plastic has also been blamed for blocking drains and contributing to flooding during the city's four-month-long summer monsoon.

Authorities first announced the ban -- which covers the whole of Maharashtra state, of which Mumbai is the capital -- three months ago to allow businesses to prepare.

The majority of India's 29 states have a full or partial ban on single-use plastics but the law is rarely enforced.

Choudhari said more than 8,000 businesses had been searched in Mumbai alone and at least 700 kilogrammes (1,500 pounds) of plastic seized.

Small traders, however, have claimed that the crackdown threatens their livelihoods.

Retailers' associations say confusion over what is and isn't allowed has led small grocery stores to remain closed for fear of being fined.

The Plastic Bags Manufacturers Association of India estimates that 300,000 people employed in the industry could lose their jobs.

The United Nations warned earlier this month that the world could be awash with 12 billion tonnes of plastic trash by the middle of the century if use is maintained at current levels.

Prime Minister Narendra Modi recently pledged to make India, which was the host of this year's International Environment Day, free of single-use plastic by 2022.

Industrial microbes could feed cattle, pigs and chicken with less damage to the environment

Potsdam, Germany (SPX) Jun 26, 2018

Deforestation, greenhouse gas emissions, biodiversity loss, nitrogen pollution - today's agricultural feed cultivation for cattle, pigs and chicken comes with tremendous impacts for the environment and climate. Cultivating feed in industrial facilities instead of on croplands might help to alleviate the critical implications in the agricultural food supply chain.

Protein-rich microbes, produced in large-scale industrial facilities, are likely to increasingly replace traditional crop-based feed. A new study now published in the journal Environmental Science and Technology for the first time estimates the economic and environmental potential of feeding microbial protein to pigs, cattle and chicken on a global scale. The researchers find that replacing only 2 percent of livestock feed with protein-rich microbes could cut agricultural greenhouse gas emissions, global cropland area and global nitrogen losses by more than 5 percent each.

"Chicken, pigs and cattle munch away about half of the protein feed cultivated on global croplands," says Benjamin Leon Bodirsky, author of the study from the Potsdam Institute for Climate Impact Research (PIK, member of the Leibniz Association). Without drastic changes to the agro-food system, the rising food and animal feed demand that comes with our meat-based diets will lead to continuous deforestation, biodiversity loss, nutrient pollution, and climate-impacting emissions.

"However, a new technology has emerged that might avoid these negative environmental impacts: Microbes can be cultivated with energy, nitrogen and carbon in industrial facilities to produce protein powders, which are then fed instead of soybeans to animals. Cultivating feed protein in labs instead of using croplands might be able to mitigate some environmental and climatic impacts of feed production. And our study expects that microbial protein will emerge even without policy support, as it is indeed economically profitable".

Small feed changes could have a substantial environmental impact

The study is based on computer simulations that assess the economic potential and environmental impacts of microbial protein production until the middle of the century. The simulations show that globally between 175 and 307 million tons of microbial protein could replace conventional concentrate feed like soybeans. So by replacing just roughly 2 percent of livestock feed, pressure on deforestation, agricultural greenhouse gas emissions and nitrogen losses from cropland could be decreased by more than 5 percent - namely 6 percent for global cropland area, 7 percent for agricultural greenhouse gas emissions and 8 percent for global nitrogen losses.

"In practice, breeding microbes like bacteria, yeast, fungi or algae could substitute protein-rich crops like soybeans and cereals. This method was originally developed during the cold war for space travel and uses energy, carbon and nitrogen fertilizers to grow protein-rich microbes in the lab," explains Ilje Pikaar from the University of Queensland in Australia.

For their new study, the researchers considered five different ways to breed microbes. By using natural gas or hydrogen, feed production could be completely decoupled from cultivating cropland. This landless production avoids any pollution due to agricultural production, but it also comes with a huge energy demand. Other processes that make use of photosynthesis by upgrading sugar, biogas or syngas from agricultural origin to high-value protein result in lower environmental benefits; some eventually even in an increase in nitrogen pollution and greenhouse gas emissions.

Microbial protein alone will not be enough for making our agriculture sustainable

"Feeding microbial protein would not affect livestock productivity," stresses author Isabelle Weindl from PIK. "In contrast, it could even have positive effects on animal growth performance or milk production". But even though the technology is economically profitable, the adoption of this new technology might still face constraints such as habitual factors in farm management, risk-aversion towards new technologies, or lacking market access. "However, pricing environmental damages in the agricultural sector could make this technology even more economically competitive," says Weindl.

"Our findings clearly highlight that the switch to microbial protein alone will not be enough for sustainably transforming our agriculture," says co-author Alexander Popp from PIK. To reduce the environmental impact of the food supply chain, major structural changes in the agro-food system are required as well as changes in human dietary patterns towards more vegetables.

"For our environment and the climate as well as our own health, it might actually be another considerable option to reduce or even skip the livestock ingredient in the food supply chain. After further advances in technology, microbial protein could also become a direct part of the human diet - using space food for people's own nutrition."

Article: Ilje Pikaar, Silvio Matassa, Benjamin L. Bodirsky, Isabelle Weindl, Florian Humpenoder, Korneel Rabaey, Nico Boon, Michele Bruschi, Zhiguo Yuan, Hannah van Zanten, Mario Herrero, Willy Verstraete, Alexander Popp (2018): Decoupling Livestock from Land Use through Industrial Feed Production Pathways. Environmental Science and Technology [DOI:10.1021/acs.est.8b00216]

Waste Heat: Innovators Turn to an Overlooked Energy Resource

Nearly three-quarters of all the energy produced by humanity is squandered as waste heat. Now, large businesses, high-tech operations such as data centers, and governments are exploring innovative technologies to capture and reuse this vast renewable energy source.

By Nicola Jones • May 29, 2018

When you think of Facebook and “hot air,” a stream of pointless online chatter might be what comes to mind. But the company will soon be putting its literal hot air — the waste heat pumped out by one of its data centers — to good environmental use. That center, in Odense, Denmark, plans to channel its waste heat to warm nearly 7,000 homes when it opens in 2020.

Waste heat is everywhere. Every time an engine runs, a machine clunks away, or any work is done by anything, heat is generated. That’s a law of thermodynamics. More often than not, that heat gets thrown away, dribbling out into the atmosphere. The scale of this invisible garbage is huge: About 70 percent of all the energy produced by humanity gets chucked as waste heat.

“It’s the biggest source of energy on the planet,” says Joseph King, one of the program directors for the U.S. government’s Advanced Research Projects Agency-Energy (ARPA-E), an agency started in 2009 with the mission of funding high-risk technology projects with high potential benefit. One of the agency’s main missions is to hike up energy efficiency, which means both avoiding making so much waste heat in the first place, and making the most of the heat that’s there. ARPA-E has funded a host of innovative projects in that realm, including a $3.5 million grant for RedWave Energy, which aims to capture the low-temperature wasted heat from places like power plants using arrays of innovative miniature antennae.

The problem is not so much that waste heat directly warms the atmosphere — the heat we throw into the air accounts for just 1 percent of climate change. Instead, the problem is one of wastage. If the energy is there, we should use it. For a long time, says Simon Fraser University engineer Majid Bahrami, many simply haven’t bothered. “The attitude has been that the environment can take this waste; we have other things to worry about,” he says. “Now we have to be more efficient. This is the time to have this conversation.”

The global demand for energy is booming — it’s set to bump up nearly 30 percent by 2040. And every bit of waste heat recycled into energy saves some fuel — often fossil fuels — from doing the same job. Crunching the exact numbers on the projected savings is hard to do, but the potential is huge. One study showed that the heat-needy United Kingdom, for example, could prevent 10 million tons of carbon dioxide emissions annually (about 2 percent of the country’s total) just by diverting waste heat from some of the UK’s biggest power stations to warm homes and offices. And that’s not even considering any higher-tech solutions for capturing and using waste heat, many of which are now in the offing.

To help reduce carbon emissions — not to mention saving money and lessening reliance on foreign fuel imports — governments are increasingly pushing for policies and incentives to encourage more waste heat usage, big businesses like IBM are exploring innovative technologies, and start-ups are emerging to sell technologies that turn lukewarm heat into usable electricity.

For more than a century, waste heat has been used for its most obvious application: heat (think of your car, which uses waste heat from its engine to heat your interior). In 1882, when Thomas Edison built the world’s first commercial power plant in Manhattan, he sold its steam to heat nearby buildings. This co-generation of electricity and usable heat is remarkably efficient. Today, in the United States, most fossil fuel-burning power plants are about 33 percent efficient, while combined heat and power (CHP) plants are typically 60 to 80 percent efficient.
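The efficiency gap translates directly into fuel savings. A simple illustrative comparison using the article's figures (33 percent for a conventional plant, and 70 percent as the midpoint of the 60 to 80 percent CHP range):

```python
# Fuel needed to deliver the same useful energy at each efficiency level.
# Uses the article's 33% figure and the midpoint (70%) of the 60-80% CHP range.
conventional_eff = 0.33
chp_eff = 0.70
useful_energy = 100.0   # arbitrary units of delivered energy

fuel_conventional = useful_energy / conventional_eff
fuel_chp = useful_energy / chp_eff
savings = 1 - fuel_chp / fuel_conventional

print(f"Conventional plant burns ~{fuel_conventional:.0f} units of fuel")
print(f"CHP plant burns ~{fuel_chp:.0f} units, about {savings:.0%} less")
```

On these numbers, a CHP plant burns roughly half the fuel of a conventional plant for the same useful output, which is what makes co-generation look like a no-brainer on paper.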

When it opens in 2020, Facebook's new data center in Odense, Denmark will channel its waste heat to warm nearly 7,000 homes. Facebook

That seems to make co-generation a no-brainer. But heat is harder to transport than electricity — the losses over piped distances are huge — and there isn’t always a ready market for heat sitting next to a power plant or industrial facility. Today, only about 10 percent of electricity generation in the U.S. produces both power and usable heat; the Department of Energy has a program specifically to boost CHP, and considers 20 percent a reasonable target by 2030.

Other countries have an easier time thanks to existing district heating infrastructure, which typically uses locally produced heat to pipe hot water into homes. Denmark is a leader here. In response to the 1970s oil crisis, the country began switching to other energy sources, including burning biomass, which lend themselves to district heating. As a result, Denmark has an array of innovative waste-heat capture projects that can be added onto existing systems, including the upcoming Facebook data center.

In 2010, for example, Aalborg’s crematorium started using its waste heat to warm Danish homes (after the Danish Council of Ethics judged it a moral thing to do). Others are joining in. In Cologne, Germany, the heat of sewage warms a handful of schools. In London, the heat from the underground rail system is being channelled to heat homes in Islington. An IBM data center in Switzerland is being used to heat a nearby swimming pool. “Data centers crop up again and again as having huge potential,” says Tanja Groth, an energy manager and economist with the UK’s Carbon Trust, a non-profit that aims to reduce carbon emissions.

An alternative option is to turn waste heat into easier-to-transport electricity. While many power plants do that already, regulators striving for energy security are keen to push this idea for independent power producers like large manufacturers, says Groth. Businesses that make their own power would reduce carbon emissions by getting any extra electrical juice they need by squeezing it out of their waste heat, rather than buying it from the grid.


Several companies have popped up to help do just this. One of the largest, Turboden, based in Brescia, Italy, sells a mechanical system based on the Organic Rankine Cycle. This is a type of external combustion engine — an idea that pre-dates the internal combustion engine used in cars. Rankine engines and similar technologies have contained, closed-loop systems of liquid that expand to gas to do work, thanks to a temperature difference on the outside of the system — so you can drive a power-generating engine off waste heat. When a cement plant in Bavaria, for example, added a Rankine engine to its system a decade ago, it reduced its electricity demand by 12 percent and its CO2 emissions by about 7,000 tons.

Since 2010, Turboden says it has sold waste heat recovery systems to 28 production plants, with seven more under construction. Turboden is just one of many; the Swedish company Climeon, for example, endorsed by spaceflight entrepreneur Richard Branson, uses a related technique to make an efficient heat engine that can be bolted onto almost anything industrial, from cement plants to steel mills, in order to recycle their waste heat.


Waste heat is a problem of a thousand cuts, requiring a mass of innovations to tackle different slices of the problem: a system that works for one temperature range, for example, might not work for another, and some waste heat streams are contaminated with corrosive pollutants. “We aren’t looking for a silver bullet,” says Bahrami. “There are so many different things that can be done and should be done.”

Heat radiates from the Grangemouth Oil Refinery in Scotland. About 70 percent of all energy produced globally gets discarded as waste heat.

Bahrami and others are pursuing solid-state systems for waste heat recovery, which, with no moving parts, can in theory be smaller and more robust than mechanical engines. There are a wide array of ways to do this, based on different bits of physics: thermoacoustics, thermionics, thermophotovoltaics, and more, each with pros and cons in terms of their efficiency, cost, and suitability to different conditions.

“Thermoelectrics have been the major player in this space for years,” says Lane Martin, a material scientist at the University of California, Berkeley and Lawrence Berkeley National Laboratory. Seiko released a “thermic watch” in 1998 that ran off the heat of your wrist, for example, and you can buy a little thermoelectric unit that will charge your cell phone off your campfire. Researchers are trying hard to increase the efficiency of such devices so they make economic sense for wide-scale use. That means screening thousands of promising new materials to find ones that work better than today’s semiconductors, or tweaking the microstructure of how they’re built.

The biggest technological challenge is to pull energy from the lukewarm end of the spectrum of waste heat: More than 60 percent of global waste heat is under the boiling point of water, and the cooler it is, the harder it is to pull usable energy from it. Martin’s group is tackling this by investigating pyroelectrics (which, unlike thermoelectrics, work by exploiting electron polarization). This isn’t near commercial application yet; it’s still early days in the lab. But the thin-film materials that Martin’s team is investigating can be tuned to work best at specific temperatures, while thermoelectrics always work better the larger the temperature difference. Martin imagines future systems that stack thermoelectric materials to suck up some of the warmer waste heat, say above 212 degrees Fahrenheit, and then use pyroelectrics to mop up the rest. Martin says his recent work on such materials drummed up interest from a few bitcoin mining operations. “They have a real problem with waste heat,” says Martin. “Unfortunately, I had to tell them it’s a little early; I don’t have a widget I can sell them. But it’s coming.”
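The physics behind "cooler heat is harder to use" is the Carnot limit: no heat engine can convert more than a temperature-dependent fraction of heat into work. A rough illustration, assuming a 293 K (about 68 degrees F) ambient heat sink, which is my assumption rather than a figure from the article:

```python
# Carnot upper bound on the fraction of heat convertible to work.
# Temperatures in kelvin; the 293 K ambient sink is an illustrative
# assumption. Real engines recover well below this theoretical limit.

def carnot_efficiency(t_hot_k, t_cold_k=293.0):
    """Maximum fraction of heat from a source at t_hot_k that any
    engine rejecting heat at t_cold_k can turn into work."""
    return 1.0 - t_cold_k / t_hot_k

for celsius in (60, 100, 300):
    eta = carnot_efficiency(celsius + 273.15)
    print(f"{celsius:>4} C source: at most {eta:.0%} of the heat is usable")
```

Even before engineering losses, a 60 C stream offers only around a tenth of its heat as recoverable work, while a 300 C stream offers nearly half, which is why low-grade heat is the hard end of the problem.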


Perhaps one of the best applications for waste heat is, ironically, cooling. Air conditioners and fans already account for about 10 percent of global energy consumption, and demand is set to triple by 2050. In urban areas, air conditioners can raise the local air temperature by nearly 2 degrees F, in turn driving up the demand for more cooling.

One solution is to use waste heat rather than electricity to cool things down: absorption or adsorption coolers use the energy from heat (instead of electrically driven compression) to condense a refrigerant. Again, this technology exists — absorption refrigerators are often found in recreational vehicles, and tri-generation power plants use such technology to make usable electricity, heat, and cooling all at once. “Dubai and Abu Dhabi are investing heavily in this because, well, they’re not stupid,” says Groth.

But such systems are typically bulky and expensive to install, so again research labs are on a mission to improve them. Project THRIVE, led in part by IBM Research in Rüschlikon, Switzerland, is one player aiming to improve sorption materials for both heating and cooling. They have already shown how to shrink some systems down to a reasonable size. Bahrami’s lab, too, is working on better ways to use waste heat to cool everything from long-haul trucks to electronics.

It’s very hard to know which strategies or companies will pan out. But whatever systems win out, if these researchers have their way, every last drop of usable energy will be sucked from our fuel and mechanical systems. “Waste heat is often an afterthought,” says King. “We’re trying to make it a forethought.”

Nicola Jones is a freelance journalist based in Pemberton, British Columbia, just outside of Vancouver. With a background in chemistry and oceanography, she writes about the physical sciences, most often for the journal Nature. She has also contributed to Scientific American, Globe and Mail, and New Scientist and serves as the science journalist in residence at the University of British Columbia.

South Georgia declared rat-free after centuries of rodent devastation

World’s biggest project to kill off invasive species to protect native wildlife is hailed a success

Fiona Harvey Environment correspondent

Tue 8 May 2018 19.01 EDT

Last modified on Wed 9 May 2018 05.31 EDT

Scientists check the island for rodents. The last of the poisoned bait was dropped more than two years ago.

The world’s biggest project to eradicate a dangerous invasive species has been declared a success, as the remote island of South Georgia is now clear of the rats and mice that had devastated its wildlife for nearly 250 years.

Rats and mice were inadvertently introduced to the island, off the southern tip of South America and close to Antarctica, by ships that stopped there, usually on whaling expeditions. The effect on native bird populations was dramatic. Unused to predators, they laid their eggs on the ground or in burrows, easily accessible to the rodents.

Two species of birds unique to the island, the South Georgia pipit and pintail, were largely confined to a few tiny islands off the coast, which the rodents could not reach, and penguins and other seabird populations were also threatened.

Mike Richardson, the chair of the decade-long £10m project, said: “No rodents were discovered in the [final] survey. To the best of our knowledge, for the first time in two and a half centuries this island is rodent-free. It has been a long, long haul.”

The South Georgia pipit was one of two native birds that had largely been confined to a few tiny islands off the coast as a result of the rodent invasion.

The last of the poisoned bait was dropped more than two years ago, but scientists have spent the intervening period monitoring the island for rodents. Two experienced dog handlers from New Zealand walked three dogs – named Will, Ahu and Wai – across nearly 1,550 miles (2,500km) in often extreme weather, beset by heavy rain and often fierce storms, seeking out signs of rats and mice.

Only when none were found over the whole period and in every area was the project deemed a success according to international standards. The leaders of the eradication effort declared the island free of rats and mice on Wednesday, and said the air resounded with the once-rare song of the native pipit.

The project was led by the South Georgia Heritage Trust, a charity set up to protect the island, and an associate organisation, the US-based Friends of South Georgia Island. The UK government also played a role, but the bulk of the funding came from private fundraising and philanthropy.

Scientists hope the success could become an inspiration and model for other projects around the world to eliminate invasive species, which in the worst cases can drive native animals close to extinction.

The island’s inaccessibility added to the difficulty of the project.

Lord Gardiner, the parliamentary under secretary at the Department for Environment, Food and Rural Affairs, said: “We must not rest on our laurels. In our overseas territories, which make up 90% of the UK’s biodiversity, [many species] are highly vulnerable.”

The South Georgia programme involved dropping hundreds of tonnes of poisoned bait on areas known to be infested with rodents. Harsh weather, mountainous terrain and the island’s limited accessibility made the project fraught with danger.

At times, when scientists believed they had eradicated rats from one area, the rodents returned from a nearby section of the island, meaning the poisoning regime had to begin again. Richardson paid tribute to the teams’ endurance and bravery.

South Georgia, one of the UK’s numerous overseas territories, was first noted by Captain Cook in 1775 on one of his voyages of discovery. It became a stop for the hundreds of whalers which plied the southern seas, providing shelter, dry land and a meeting site for ships that may have spent weeks or months without sight of land.

About 2,000 people lived on the island during whaling’s heyday, but today the main activities are centred on two scientific research stations run by the British Antarctic Survey. It was claimed for Argentina during the Falklands war, and a UK garrison was only withdrawn in 2001. The island is also known as the burial ground of the Antarctic explorer Sir Ernest Shackleton, who died there in 1922.

One of the dogs used to hunt for any remaining rodents. Photograph: Oliver Prince/South Georgia Heritage Trust

About 100 miles long, the island covers about 350,000 hectares (865,000 acres) and much of it is covered in snow and ice. The remainder is very mountainous. Some of the coastal regions have vegetation, which has provided a haven for seabirds, and the pipits and pintails.

Other species on the island include seals – 98% of the world’s population of fur seals breed here, and it is home to about half the global population of elephant seals – and four penguin species, including 450,000 breeding pairs of king penguins. All four of the penguin species are listed as threatened. About 30 million birds are thought to nest and raise chicks on the island, with 81 species recorded.

Invasive species are one of the worst threats to biodiversity around the world. When non-native species are introduced, often inadvertently but also sometimes as pets or ornamental features, they can disrupt natural ecosystems evolved over millennia to the detriment of the native species.

China Has Refused To Recycle The West's Plastics. What Now?

June 28, 20184:02 PM ET by SARA KILEY WATSON

For more than 25 years, many developed countries, including the U.S., have been sending massive amounts of plastic waste to China instead of recycling it on their own.

Some 106 million metric tons — about 45 percent — of the world's plastics set for recycling have been exported to China since reporting to the United Nations Comtrade Database began in 1992.

But in 2017, China passed the National Sword policy banning plastic waste from being imported — for the protection of the environment and people's health — beginning in January 2018.

Now that China won't take it, what's happening to the leftover waste?

According to the authors of a new study, it's piling up.

"We have heard reports of waste accumulating in these places that depend on China," says Amy Brooks, a doctoral student of engineering at the University of Georgia and the lead author of the study published in Science Advances last week.

She says some of it is ending up in landfills, being incinerated or sent to other countries "that lack the infrastructure to properly manage it."

By 2030, an estimated 111 million metric tons of plastic waste will be displaced because of China's new law, the study estimates. This is equal to nearly half of all plastic waste that has been imported globally since 1988.


The rapid expansion of disposable plastics and single-use containers in the 1990s drove imports up sharply. Yearly global imports grew 723 percent, to around 15 million metric tons, from 1993 to 2016.

For developed nations like the United States, it can be more economical to push plastics out of the country rather than recycling them, says Jenna Jambeck, an associate professor of engineering at the University of Georgia and another of the study's authors.

The U.S., Japan and Germany are all at the top of the list when it comes to exporting their used plastic. In the U.S. alone, some 26.7 million tons were sent out of the country between 1988 and 2016.

Hong Kong is the biggest exporter of plastic waste, at 56.1 million tons. But it has acted as an entry point to China — having imported 64.5 million tons from 1988 to 2016 from places like the U.S. (which sent more than 372,000 metric tons there in 2017) and then having sent most of that on to China.

Other nations can buy and recycle these plastics and manufacture more goods for sale or export, as China did, making it profitable for them as well, the study notes. Industry publication Waste Dive reports that the U.S. sent 137,044 metric tons to Vietnam in 2017, up from 66,747 in 2016.

But nations like Malaysia, Thailand or Vietnam, which have picked up some of what China is leaving behind, don't have as well-developed waste management systems. Jambeck says Vietnam has already reached a cap on how much waste it can handle: The country has announced it will not accept any more imports of plastic scraps until October.

"Not one country alone has the capacity to take what China was taking," Jambeck says. "What we need to do is take responsibility in making sure that waste is managed in a way that is responsible, wherever that waste goes — responsible meaning both environmentally and socially."

Marian Chertow, the director of the program on solid waste policy at Yale, who was not involved in the study, says these new findings are useful because they confirm what experts thought they knew — nearly half of plastic waste exports have ended up being recycled in China.

"There's a tremendous shift in the market when China won't take half of these plastics. I really think that this export mindset that has developed in the U.S. is one that has to change," she said.

Seattle bans plastic straws, utensils at restaurants, bars

SEATTLE (AP) — Looking for a plastic straw to sip your soda? It’s no longer allowed in Seattle bars and restaurants.

Neither are plastic utensils in the latest push to reduce waste and prevent marine plastic pollution. Businesses that sell food or drinks won’t be allowed to offer the plastic items under a rule that went into effect Sunday.

Seattle is believed to be the first major U.S. city to ban single-use plastic straws and utensils in food service, according to Seattle Public Utilities. The eco-conscious city has been an environmental leader in the U.S., working to aggressively curb the amount of trash that goes into landfills by requiring more options that can be recycled or composted.

The city’s 5,000 restaurants - including Seattle-based Starbucks outlets - will now have to use reusable or compostable utensils, straws and cocktail picks, though the city is encouraging businesses to consider not providing straws altogether or switch to paper rather than compostable plastic straws.

“Plastic pollution is surpassing crisis levels in the world’s oceans, and I’m proud Seattle is leading the way and setting an example for the nation by enacting a plastic straw ban,” Seattle Public Utilities General Manager Mami Hara said in a statement last month.

Proposals to ban plastic straws are being considered in other cities, including New York and San Francisco.

California’s Legislature is considering statewide restrictions, but not an outright ban, on single-use plastic straws. It would block restaurants from providing straws as a default but would still allow a customer to request one. It’s passed the state Assembly and now awaits action in the Senate.

In the United Kingdom, Prime Minister Theresa May announced in April a plan to ban the sale of plastic straws, drink stirrers and plastic-stemmed cotton buds. She called plastic waste “one of the greatest environmental challenges facing the world.”

Smaller cities in California, including Malibu and San Luis Obispo, have restricted the use of plastic straws. San Luis Obispo requires single-use straws only be provided in restaurants, bars and cafes when customers ask for them. City officials said most customers will say “no” if asked if they want a straw.

Business groups have opposed the idea in Hawaii, where legislation to ban plastic straws died this year, the Honolulu Star-Advertiser reported Sunday, with the Hawaii Restaurant Association and Hawaii Food Industry Association testifying against the measure.

Seattle’s ban is part of a 2008 ordinance that requires restaurants and other food-service businesses to find recyclable or compostable alternatives to disposable containers, cups, straws, utensils and other products.

Businesses had time to work toward complying with the ban, said Jillian Henze, a spokeswoman for the Seattle Restaurant Alliance, an industry trade group.

“We’ve almost had a year to seek out products to protect the environment and give customers a good experience (with alternatives),” she said.

The city had allowed exemptions for some products until alternatives could be found. With multiple manufacturers offering alternatives, the city let the exemption for plastic utensils and straws run out over the weekend.

Environmental advocates have been pushing for restaurants and other businesses to ditch single-use straws, saying they can’t be recycled and end up in the ocean, polluting the water and harming sea life.

A “Strawless in Seattle” campaign last fall by the Lonely Whale, in which more than 100 businesses voluntarily participated, helped remove 2.3 million single-use plastic straws.

Supporters say it will take more than banning plastic straws to curb ocean pollution but that ditching them is a good first step and a way to start a conversation about waste and ocean conservation.

Seattle urged businesses to use up their existing inventory of plastic utensils and straws before Sunday. Those who weren’t able to use up their supply have been told to work with the city on a compliance schedule.

Businesses that don’t comply may face a fine of up to $250, but city officials say they will work with businesses to make the changes.

Climate Change Will Leave Many Pacific Islands Uninhabitable by Mid-Century, Study Says

As sea level rises, waves will more frequently wash over many of the atolls—and the U.S. military facilities built there—damaging water supplies and infrastructure.


Storm-driven waves wash over man-made barriers on Roi-Namur island in 2014, showing the risk as sea level rises. Credit: U.S. Geological Survey

"Even if we don't worry about the groundwater issue, when seawater floods through cars and buildings and infrastructure every year, that's going to have an impact," U.S. Geological Survey scientist Curt Storlazzi said. Credit: USGS

Long before the Pacific Ocean subsumes thousands of low-lying islands, waves will begin washing over them frequently enough to ruin groundwater supplies and damage crops and fragile infrastructure. A new report says this likely will render many coral atolls uninhabitable within decades—before 2030 in a worst case scenario and by 2065 in a more optimistic one.

"When you're walking around the islands and you see kids there, you realize this isn't something that's next generation or the generation after that," said Curt Storlazzi, a scientist with the U.S. Geological Survey and the study's lead author. "If these scenarios play out, we're talking about those kids' generation."

Some of these reef-lined islands are home to hundreds of thousands of people and American military sites.

In the study, commissioned by the Defense Department and published Wednesday in the journal Science Advances, the scientists combined climate projections with weather and wave modeling to look at the impacts of rising seas on an island in Kwajalein Atoll. Kwajalein is part of the Marshall Islands and home to the Ronald Reagan Ballistic Missile Defense Test Site, which has been used to test U.S. defenses against a nuclear attack.

The Air Force spent nearly $1 billion in recent years building a facility there to track space debris. But a report by the Associated Press found in 2016 that the Defense Department and contractor Lockheed Martin paid little attention to projections for rising seas when planning and constructing the site, which is intended to have a 25-year life span.

John Conger, who was Assistant Secretary of Defense for Energy, Installations and Environment under President Obama and is now director of the Center for Climate and Security, said the project illustrates how much money the Pentagon could waste if it fails to adequately plan for rising seas and climate change.

"You've got to think this through," he said. "A lot of people have asked me in the past about how much the department is going to invest in dealing with climate change, and I think it's the wrong question to ask. I think climate change is an important factor to study to save money."

Lessons from a 2008 Storm Surge

As the planet warms, rising seas will pose a risk to hundreds of the United States' coastal military sites, with more than 200 already reporting effects from storm surges. The new study suggests that without global efforts to rapidly cut emissions or expensive mitigation projects, the facilities on low-lying islands may have even shorter lifespans than sea level rise projections would indicate.

Storlazzi began studying the issue after a winter storm hundreds of miles away sent water surging over the Marshall Islands in 2008.

"It had destroyed the groundwater so much, they had to fill up tankers and take freshwater out to a lot of these atoll islands," he said.

Storlazzi and his colleagues found that with about 16 inches of sea level rise, such waves would wash saltwater over Roi-Namur island on Kwajalein Atoll about once a year. When that starts happening annually, the island's aquifer will be unable to recover, the authors said.

"Even if we don't worry about the groundwater issue, when seawater floods through cars and buildings and infrastructure every year, that's going to have an impact," Storlazzi said. While the study was specific to Kwajalein's topography, the effects of sea level rise will likely be similar on thousands of other atolls and reef-lined islands across the tropics.

Temporary Adaptation — at a Cost

At the Defense Department's request, the authors considered three scenarios for rising seas: one in which emissions stabilize mid-century and two in which they continue to rise. In the most optimistic scenario, annual flooding would still likely begin between 2055 and 2065.

The island can be protected, at least for a time, with measures like building seawalls and shipping in water. But the costs will continue to rise, Conger said.

Kwajalein Atoll.

"The nearest-term impact is going to be on drinking water," he said. "How expensive is it going to be to get drinking water out there?"

Most of the atolls, of course, do not host military bases, and governments of these island nations may not have the money to pay for coastal protections, leaving residents exposed.

Military Studies Show Bases Face Climate Risks

The Pentagon has been studying the impact of climate change on its facilities and national security for more than a decade. In 2014, an Army Corps of Engineers study determined that about 1.5 feet of sea level rise could bring a "tipping point" for the Navy's largest base, in Norfolk, Virginia, where the risk of damage to infrastructure would increase dramatically. That level is expected to be reached in the next few decades.

In January, the Defense Department published the results of a survey of nearly 1,700 military installations that found about half reported experiencing at least one weather-related effect associated with climate change, such as wildfires, drought and flooding. Last year, the Government Accountability Office said the military was failing to adequately plan for climate change at overseas facilities.

Now, the military's work may be caught up in the Trump administration's hostility toward action on climate change. The Defense Department's response published with the GAO report, for example, said that blaming infrastructure damage from weather on climate change was "speculative at best and misleading." Two key strategy documents issued by the Trump administration this year—the National Defense Strategy and National Security Strategy—omitted references to climate change that had been included in earlier versions.

But where the administration may be pushing one way, Congress has begun to push in the other. Last year, lawmakers included language in an annual defense budget bill that declared climate change to be a national security threat and required the Pentagon to issue a report to Congress this year assessing vulnerabilities to climate change over the next 20 years, including a list of the 10 most at-risk facilities in each service.

Trump plan to tackle lead in drinking water criticized as 'empty exercise'

Sources within the EPA tell the Guardian that the proposals are threadbare and muddled – ‘they are just making it up as they go along’

Oliver Milman in New York, Thu 26 Apr 2018 06.00 EDT

Four years ago officials chose to switch Flint’s water to the Flint river, without lead corrosion controls, prompting the public health crisis.

Donald Trump has overseen an onslaught against environmental regulations while insisting, in the wake of the Flint lead crisis, that he would ensure “crystal-clean water” for Americans.

The federal government says it is currently drawing up a new plan to tackle lead contamination, which the Environmental Protection Agency says will be unveiled in June.

Flint activists still waiting as governor escapes fallout of water crisis

Public details of the plans are, however, scant. And some sources inside the EPA, speaking to the Guardian anonymously, said they were skeptical about whether what was being developed could meet such a challenge. “It’s a fig leaf,” one source claimed.

Scott Pruitt, the administrator of the EPA, which is spearheading the new strategy, has vowed to eliminate lead from drinking water and banish the specter of Flint. Wednesday this week was the fourth anniversary of the decision to switch the Michigan city’s water supply to the Flint river, without lead corrosion controls, prompting the public health crisis.

In February, Pruitt met fellow Trump cabinet members, including Alex Azar, the health and human services secretary, and Ben Carson, the housing and urban development secretary, and other agencies to tout a new approach aimed at reducing lead exposure in children. “I really believe that we can eradicate lead from our drinking supply within a decade,” said Pruitt, who has touted a “back-to-basics approach” that has steered the EPA towards toxic clean-ups and away from challenges such as climate change.

Pruitt needs a fig leaf to suggest he’s doing something for the environment and has landed on eradicating lead

The agency’s administrator warned the “mental-acuity levels of our children are being impacted adversely” by lead and called for a coordinated approach to ensure the disaster in Flint is not replicated.

One of the few concrete proposals put forward by Pruitt is to replace the millions of lead lines that funnel drinking water to Americans’ homes, a process that could cost about $45bn. An assessment of the nation’s water utilities found that nearly half a trillion dollars of investment is required to restore crumbling drinking water systems and ensure lead and other pollutants don’t endanger the public.

Some agency staff, while pleased that the administration is raising the profile of lead poisoning, described the new plan as threadbare and muddled.

“Everyone was running around talking about a war on lead, but there was no conversation about how it will work, which is typical of this administration,” said one senior EPA official. “The lead problem is huge and multifaceted and they are just making it up as they go along,” the source said.

Asked about progress of the plan, this week an EPA spokeswoman declined to discuss details but said: “The 17 federal agencies that comprise the president’s taskforce on environmental health risks and safety risks to children is developing a new federal strategy to address childhood lead exposure and expects the strategy to be finalized and made public in mid-June.”

Pruitt has, according to staff sources, used meetings to demand a “single standard” for lead, which has caused confusion as the EPA has an actionable level of 15 parts per billion in water samples, although the agency and other health bodies concur there is no safe limit of lead exposure.

The Flint disaster came about after the city switched its drinking water supply to the Flint river on 25 April 2014 but then failed to add proper controls to prevent lead, a known neurotoxin, leaching from pipes and joints into the water.

Lead levels in children’s blood soared and a further dozen people died from a Legionnaires’ disease outbreak linked to the contaminated water. A total of 15 people have been charged by the Michigan attorney general over the disaster.

Nationally, the risk of a similar outcome is vast – about 18 million Americans are served by 5,300 water systems that have violated the EPA’s lead and copper rule by, among other things, failing to properly test water for lead or neglecting to adequately treat systems to avoid lead contamination.

Many cities, as the Guardian revealed, have used techniques known to mask the true level of lead in drinking water. Lead can also be present in soil and paint, particularly in older buildings.

Lead line replacement – which Pruitt has backed – would provide an overdue upgrade to a huge tangle of ageing underground pipes, many of which were installed in the wake of the Civil War.

But experts have warned that tearing up lead lines, even if funding could be found, could exacerbate the problem by dislodging lead during replacement work. Utilities only have jurisdiction over pipes up to the curbside, with residents potentially either unwilling or unable to afford to pay for their share of pipe replacement.

“It can be incredibly expensive to replace lines but also complicated because so much pipe is privately owned,” said Maura Allaire, a water resources expert at the University of California.

“The other problem is we don’t even know where the lead service lines are. You could spend a ton of money and make the problem worse. There are smarter ways of doing things rather than hodge-podge lead-line replacements.”

Allaire said the EPA could demand improvements to lead corrosion treatments and tighten up an unusual patchwork testing regime that relies on individual utilities getting volunteer homeowners to test their own water. Another glaring problem is the EPA’s lack of data on what the vast range of water utilities, some serving just a few hundred people, are doing.

Other uncertainties abound. Trump’s policy of repealing two federal rules for every new regulation enacted could dampen any enthusiasm for introducing new standards. And Pruitt’s own future as EPA head remains unclear because of a series of controversies over his use of taxpayer money for first-class flights, office furniture and personal security.

“I think this is likely to be an empty exercise,” said a former EPA water official, who declined to be named. “Pruitt needs a fig leaf to suggest he’s doing something protective of the environment and has landed on eradicating lead. It wouldn’t surprise me if we got to the end of the administration and there’s no significant change to the rules.”

Tim Epp, a lawyer with no prior experience in dealing with lead in water, has been handed the role within the EPA of coordinating the new plan. That move, along with the administration’s previous attempt to slash funding for lead clean-ups, has led to some cynicism about the new strategy within and outside the EPA. But staff insist that any effort to address the issue is worthwhile.

“It could’ve been handled better but it’s good it’s out there, it can build momentum,” said an EPA source. “Everyone in the building wants to work on lead, we all want progress on this. No one gives a shit about dissenting with the administration over this. If it’s lead, we’re all in.”

Bin liners to takeaway containers – ideas to solve your plastic conundrums

Plastic has become an environmental disaster. Microplastic pollution has been found in our waterways, fish stocks, salt, tap water and even the air we breathe. Reducing our reliance on plastic by refusing it wherever possible has never been more important, especially as Australia’s recycling system is in crisis.

Yet there are conundrums that continue to defeat even those dedicated to going plastic-free. From bin liners to takeaway containers, Guardian Australia has tried to solve them. And we want to hear from you: share your plastic conundrums and the solutions. We’ll round up the best ideas for a follow-up article.

What should I do about plastic liners for the kitchen bin?

Many people justify their continued use of plastic bags by arguing each one is reused in the kitchen bin. But that’s not actually recycling – it’s barely even reusing, as each bag is still destined for landfill. Others buy bin liners, spending good money on a product made to be ditched after just one use.

Skip the unnecessary plastic bag altogether and simply rinse your bin after emptying it into the council bin. To help cut down on smelly garbage “juice”, separate out food scraps for composting (community compost groups help if you don’t have space) or throw them in the green waste bin. If a naked bin still revolts you, try lining it with newspaper instead.

How can I avoid plastic packaging in supermarkets?

The ultimate solution is filling up your own reusable containers at bulk food stores (check out Sustainable Table’s bulk food directory), but strategic shopping at major supermarkets also helps.

Skip plastic bags altogether – including the smaller produce bags – by taking your own reusable bags. Prioritise nude food: Lebanese cucumbers over plastic-wrapped continental versions, for example. Buy staples such as flour and sugar in paper bags, rather than plastic. Buy cheese and meat at deli counters, using your own reusable container. And if staff refuse BYO containers, or too much food is unnecessarily packaged, write to management demanding change.

Can I take my plastic somewhere to be recycled and transformed directly, rather than rely on nontransparent recycling systems?

Unfortunately not, in most cases. But do go the extra mile to save soft plastics – they can be recycled into sturdy outdoor furniture, bollards and signage. You’ll need to collect them separately, as soft plastics such as shopping bags, courier satchels, bubble wrap and chip packets can’t be processed in council recycling systems. Instead, drop soft plastics – essentially anything that can be scrunched into a ball – at REDcycle bins. They are then sent on to Australian manufacturer Replas to be made into long-lasting recycled products.

What to do about getting takeaway food without plastic containers?

Discarded food and beverage containers made up almost half of the 15,552 ute loads of rubbish collected during last year’s Clean Up Australia Day. Opt out by taking your own reusable containers to the local takeaway. Trashless Takeaway helps consumers find Aussie eateries happy to fill reusable containers, while Fair Food Forager and Responsible Cafes highlight low-waste locations. Boycott places that refuse to embrace sustainable practices – as zero-waste lifestyle advocate Tammy Logan explains, refusing BYO containers is most often a business decision, not a legal requirement.

What’s the best alternative to a plastic water bottle?

Refuse bottled water point blank – plastic bottles are derived from crude oil and take thousands of years to break down in landfill. But select reusable water bottles with care, as many have plastic components, such as lids. Instead, go for brands such as Pura Stainless; it makes stainless steel bottles with medical-grade silicone lids. While you’re at it, grab a reusable coffee cup and stainless steel straw, too.

How do I clean and disinfect the house without using disposable wipes and cleaners sold in plastic bottles?

Most cleaning products are packaged in plastic – and packed with toxic chemicals – but almost the entire home can be cleaned with just three ingredients. Use Castile soap on floors and sinks, baking soda for scrubbing jobs, and vinegar for mould. (Bea Johnson from Zero Waste Home has an extensive natural cleaners recipe list.) Microfibre cloths can also clean windows and more with water only, no products needed.

Avoid all types of cleaning wipes, as they’re designed to be thrown away. As Adelaide zero waster Niki Wallace says: “We don’t need [wipes] for protection from germs and we don’t need them for their cleaning capability or even for their convenience.” Instead use cloths or old clothes cut to size.

Is it possible to get toothpaste, shampoo and conditioner without packaging?

Buying in bulk or making your own are the only real ways to access packaging-free toiletries. Toothpaste is easy to make from coconut oil, bicarb soda and salt, though some dentists baulk at using abrasive materials on tooth enamel. The easiest plastic-free shampoo and conditioner options sidestep bottles completely – choose soap bars instead, preferably unwrapped versions.

Should I be worried about mini toiletries offered in hotels?

Go easy on free toiletries when travelling, as millions of half-used bottles are ditched each day. Take your own or support hotels with more sustainable practices, such as refillable wall dispensers. Also keep an eye out for hotels involved with Melbourne non-profit Soap Aid, which recycles hotel soaps for distribution in disadvantaged communities.

How can I clean my cat litter without plastic bags?

The key is using natural litter, such as Oz-Pets’ wood pellet litter, made from waste plantation sawdust that would otherwise be dumped. The wood’s natural eucalyptus oils help kill bacteria and keep smells under control. After use, the lot can go straight into the compost (ensure it’s not subsequently used for food-producing crops, though), on to the garden as mulch, or into council green bins. Recycled paper pellets are also an option. Scoop the poop and soiled litter into a small bucket rather than a plastic bag, before turfing into compost or green bins. Disinfect litter trays with boiling water and vinegar before refilling.

How can I ensure my clothes don’t add to plastic pollution?

Microfibres shed by synthetic clothing during washing are a growing concern. Synthetic fabrics such as polyester and nylon shed thousands of these tiny plastic particles with each wash, which end up polluting our rivers and oceans. Guppy Friend filter bags, developed in Berlin, are an emerging solution. Washing less, using front-loading machines, and opting for natural fabrics and fibres wherever possible also helps.

Going plastics-free is as easy as calico bags and reusable coffee cups

Australians throw away a lot of plastic, often after only one use. Here’s how to give it up

Koren Helbig

Fri 23 Mar 2018 17.51 EDT Last modified on Wed 28 Mar 2018 00.33 EDT

Say no to single-use plastics such as water bottles, straws and takeaway containers.

It’s almost everywhere you look – and it’s undeniably destroying our planet.

Over the past half a century, plastic has infiltrated modern life to such an extent that our oceans may have more of the stuff than fish by 2050.

Once hailed as an innovation, plastic is now clearly very bad news. Non-renewable fossil fuels are needed for production, belching out greenhouse gases in the process. Once tossed – often after mere minutes of use – plastic then takes hundreds of years to break down, emitting toxic methane gas as it gradually breaks into smaller and smaller pieces. The end point is too often our waterways: an estimated one million seabirds and 100,000 mammals are killed every year by marine rubbish, much of it plastic.

Recycling helps, but a far more powerful solution is reducing and even eliminating single-use plastics altogether. That’s actually easier than one might think – read on for a life with less plastic.

Say no to single-use plastics

Refuse to use plastic bags, for a start. Australians go through 13 million new bags each day and around 50 million of them end up in landfill each year – enough to cover Melbourne’s CBD, according to Clean Up Australia.

Be cautious with so-called “green bags” though; those commonly sold at Coles and Woolworths are made of polypropylene, a type of plastic. Instead, opt for calico, canvas, jute or hessian. Or join the Boomerang Bags movement, from Queensland’s Burleigh Heads, which encourages volunteers to make fabric bags from recycled materials.

Skip other single-use plastics too, such as coffee cups, takeaway containers, water bottles and straws – even the Queen last month banned the latter two across her royal estates.

A little pre-purchase prep makes saying no easier. Online stores including Brisbane’s Biome and Sydney’s Onya stock reusable alternatives, such as stainless steel water bottles, reusable coffee cups, bamboo toothbrushes, mesh produce bags and material food covers. Consider buying in bulk – The Source Bulk Foods has stores around the country and is Australia’s biggest bulk food retailer.

“A reusable coffee cup only needs to be used 15 times to break even on its life cycle, including cleaning. Every use after that is a bonus for the planet,” says Biome founder Tracey Bailey. “The cumulative use of reusable products makes it easy for individuals to have a long-term environmental impact. Within 12 months, the Biome community saved the waste of over 6m single-use plastic items.”

If plastics can’t be avoided, look to recycle as much as possible. Soft, scrunchable plastics, for example, are too often turfed, but can actually be collected at home then dropped at REDcycle bins in major supermarkets.

Avoid microplastics and reduce microfibre shedding

Some of the most environmentally damaging plastics can barely be seen by the human eye, yet are used daily. Microbeads are tiny plastic pellets used in cosmetics and household products, such as exfoliating face scrubs, whitening toothpastes and deep-cleaning washing powders. Flushed down the drain, microbeads are too small to be captured by wastewater treatment plant filters and end up in our waterways. According to the Australian Marine Conservation Society, marine wildlife that mistake microbeads for fish eggs often end up starving to death.

The Australian government has ordered a voluntary phase-out of microbeads by mid-2018, but many believe an outright ban is needed. Consumers can sign a Surfrider Foundation petition calling for a ban, and consult the Australian Good Scrub Guide or Beat the Microbead app for help to choose microbead-free products.

Microfibres shed by synthetic clothing during washing are a growing concern. Synthetic fabrics such as polyester and nylon shed thousands of these tiny plastic particles with each wash, which end up polluting our rivers and oceans. Guppy Friend filter bags, developed in Berlin, are an emerging solution. Washing less, using front-loading machines and opting for natural fabrics and fibres wherever possible also helps.

A whopping 269,000 tonnes of rubbish is now estimated to be floating in our oceans – weighing more than 1,300 blue whales combined – and about 80% of it comes from land, according to the Australian Marine Conservation Society. The Pacific ocean is already besieged by more plastic than plankton.

To turn that around, humans can’t just reduce our current consumption; we must also clean up the mess already created. Sydney-based non-profit Take 3 has inspired thousands to pick up three pieces of rubbish each when leaving the beach or waterways. Clean Up Australia Day, held annually each March, last year attracted 590,350 volunteers, who collected an estimated 15,500 ute loads of rubbish.

Perth surfers Andrew Turton and Pete Ceglinski have created the floating Seabin rubbish bin for marinas and ports, which moves with the tide, collecting about 1.5kg of floating rubbish each day. Dutch inventor Boyan Slat went further, inventing The Ocean Cleanup passive drifting systems – due to launch this year – which he believes will clean up half the Great Pacific Garbage Patch within just five years.

Despite how thoroughly plastic permeates modern life, it is possible to largely avoid it – with some effort. In Melbourne, Erin Rhoads has lived a “zero waste” existence since June 2014; her entire trash output since fits inside one old coffee jar. (Other notables on similar paths include Gippsland’s Tammy Logan, Adelaide’s Niki Wallace and Perth-based Lindsay Miles.)

“We have the power to dictate how things are packaged and presented to us by participating in everyday activism through our purchases,” Rhoads says. “The more businesses see and hear us saying no to plastics, the more likely we will see changes that will lay the foundation to a cleaner, safer and healthier planet.”

While going fully plastic-free remains a pipe dream for most, Rhoads advocates small shifts that build into big lifestyle changes. On her blog, The Rogue Ginger, Rhoads outlines five easy first steps, and encourages consumers to embrace Plastic-Free Tuesday or Plastic-Free July (which began in Perth).

“Going plastic-free helps us take responsibility for our actions, while reminding us that most of the plastic we use today is not necessary at all,” Rhoads says. “We should focus on preserving our earth’s resources for something other than a single-use plastic fork.”

Additional research and reporting by Nicole Lutze

The great Australian garbage map: 75% of beach rubbish made of plastic

Data compiled from rubbish collected by volunteers aims to encourage industry to control plastic pollution at the source

Maine PUC should not sink ocean wind project


Saving customers a few cents would not be worth further damaging the state's credibility as a business partner.


VolturnUS generates power off Castine in 2013. The prototype is a scale model of the floating turbines to be used in a wind project planned for deep water off Monhegan Island – if the state doesn’t renege on an agreement to buy clean, home-grown energy.

The Maine Public Utilities Commission usually sets prices for electricity, natural gas, telecommunications and water systems. It’s not every day that it also regulates the state’s credibility.

But that’s what will be happening when the PUC reconsiders a power purchase agreement it signed in 2013 with Maine Aqua Ventus, a public-private partnership involving the University of Maine that is trying to develop the nation’s first deep-water wind power generator.

The three PUC commissioners, all appointed by Gov. LePage since the power purchase agreement was signed, question whether Maine ratepayers should have to pay higher-than-market prices for electricity that’s produced by the demonstration project, and they’ve hinted that they could pull Maine out of the contract.

They say they would be looking out for Maine consumers, who could save as much as 73 cents a month on their electric bills. But those savings would come at a tremendous cost.

Ripping up the contract would probably kill the project. In addition to losing the money from the energy sales, it would almost certainly result in the loss of $87 million in grant money from the U.S. Department of Energy aimed at getting this technology ready to market commercially.

The end of this project would be bad news not only for the Maine Aqua Ventus partners. It would also hurt the Maine companies that would have helped build and supply a new manufacturing facility, and it would be bad news for the people who would have been hired to fill newly created jobs.

But the biggest loser would be Maine itself. A decade-long effort to take a leadership role in a home-grown energy industry – one that could sell clean power to states that don’t have the capacity to make their own – would go down the drain. To save consumers that potential 73 cents a month, the state would throw away millions already invested, including an $11 million bond (which passed in 2010 with nearly 60 percent of the vote).

The ultimate price would come at the expense of the state’s credibility as a business partner, giving investors a very good reason to look elsewhere before deciding to put their money to work here. They would have to doubt whether Maine could be trusted to keep its word through inevitable shifts in the political climate. That’s not a good position to be in for a state that cites a lack of capital as an obstacle to economic growth.

The agreement looks fairly straightforward: Maine Aqua Ventus would build a 12-megawatt wind project off Monhegan Island, and the PUC agreed to commit the state’s ratepayers to buy the electricity it produces at 23 cents per kilowatt-hour in the first year – as much as $187 million over 20 years.

That price is roughly three times the average price of power now, which, to some eyes, makes the agreement look like a bad deal for Maine.

But this is not just a power purchase agreement – it’s also a research and development project.

The over-market energy sales enable Maine Aqua Ventus to develop a new technology – floating platforms for wind-driven turbines – and prove that it can work. The Legislature envisioned this financing system a decade ago when it created the Ocean Energy Task Force in 2009 and adopted its recommendations in statute a year later. The law set a limit on how much the research project could add to electric bills, and this term sheet is below that limit.

This is not the only source of revenue for this project. It has been heavily subsidized by the federal Energy Department, as well as Maine taxpayers, who have approved two bonds.

And while the kilowatt-hour price is high, the amount of energy that ratepayers would have to buy from this experimental project is very small. It’s likely that consumers would not see any noticeable change in their electric bills because of this project.
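A quick back-of-envelope check of the figures quoted above (our own arithmetic, not the editorial’s; the capacity factor it derives is an inference from the quoted numbers, not a reported figure) supports the point that the ratepayer commitment is modest for a 12-megawatt project:

```python
# Sanity-check the quoted contract figures:
# $187M over 20 years at $0.23/kWh, from a 12 MW project.

contract_total = 187_000_000   # dollars over 20 years (quoted maximum)
years = 20
price_per_kwh = 0.23           # dollars, quoted first-year price
capacity_mw = 12               # quoted project size

annual_cost = contract_total / years           # $9.35M per year
annual_kwh = annual_cost / price_per_kwh       # ~40.7 million kWh per year
max_kwh = capacity_mw * 1000 * 8760            # output if turbines ran flat out
capacity_factor = annual_kwh / max_kwh         # ~0.39, plausible for offshore wind

print(f"Annual cost:     ${annual_cost / 1e6:.2f}M")
print(f"Annual energy:   {annual_kwh / 1e6:.1f} GWh")
print(f"Implied capacity factor: {capacity_factor:.0%}")
```

An implied capacity factor near 39 percent is consistent with real-world offshore wind performance, which suggests the $187 million figure is a ceiling on energy actually delivered rather than a padded number.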

What they are more likely to see is the economic benefit that would come from Maine becoming a supply hub for an emerging industry. Private companies would buy made-in-Maine equipment or license patented technology that is being developed here. The university would cement its status as a leader in the field, making it eligible for more federal research funds that percolate through the state’s economy.

But what do we get if the PUC pulls its support?

Five years ago, Maine gave itself a black eye by reneging on a power purchase agreement with Statoil, the Norwegian energy business, which was working on a different ocean wind idea.

The company took its $120 million and invested it in Scotland instead of Maine. If the PUC repeats that sorry episode, the damage to Maine’s reputation will be complete.

Anything as complicated as the transition from fossil fuels to renewable energy will take time. It can’t be accomplished by one Legislature or one governor. Leaders come and go, but the commitments they make to a shared vision should not shift with the political winds.

Maine has put too much into ocean wind power to abandon it. The PUC should not sink this proposal, or the state’s credibility.

Fracking chemicals “imbalance” the immune system

Brian Bienkowski , May 01, 2018

Mice exposed to fracking chemicals during pregnancy were less able to fend off diseases; scientists say this could have major implications for people near oil and gas sites

Chemicals commonly found in groundwater near fracked oil and gas wells appear to impair the proper functioning of the immune system, according to a lab study released today.

The study, published today in the journal Toxicological Sciences, is the first to find a link between fracking chemicals and immune system problems and suggests that baby girls born to mothers near fracking wells may not fight diseases later in life as well as they could have with a pollution-free pregnancy.

"This is a really important study, especially since the work started with the idea of identifying what's out there in the environment, how much people are exposed to," Andrea Gore, a professor of pharmacology and toxicology in the College of Pharmacy at the University of Texas at Austin, told EHN.

"So it's all based on this model that has been determined by a real world situation," said Gore, who was not involved in the study.

The implications are far-reaching: More than 17 million people in the U.S. live within a mile of an oil or gas well. Hydraulically fractured wells now account for about half of U.S. oil and two-thirds of the nation’s natural gas, according to the U.S. Energy Information Administration.

Fracking, short for hydraulic fracturing, is a method of drilling in which millions of gallons of water and chemicals are pumped underground at high pressure to fracture shale or coal bed layers and release otherwise unreachable oil and gas deposits. The chemicals used in the process remain proprietary; however, industry has reported that more than 1,000 chemicals are used in fracked oil and gas wells, and researchers have found more than 200 of these compounds in wastewater near extraction sites.

The researchers tested 23 chemicals commonly found in groundwater near fracking operations. The chemicals chosen had recently been associated with reproductive and developmental impacts in mice.

The researchers exposed mice – mostly females – to the chemical mixture in the womb, at levels commonly found near fracking sites. Exposed mice had “abnormal responses” to diseases when they were older – specifically an allergic disease, a certain type of flu, and a disease similar to multiple sclerosis.

"The mice whose moms drank water containing the mixture had faster disease onset and more severe disease," lead author Paige Lawrence told EHN in a phone interview. Lawrence is a researcher and chair of Environmental Medicine at the University of Rochester Medical Center.

Some of the observed changes were subtle — "such as alterations in the number or percentage of certain cell types, whereas other changes were more manifest, such as advancement in the onset and severity of disease," the authors wrote.

Lawrence said human and mice immune systems are "more similar than they are different."

"This provides information as to what to look for in people," she said.

It's not entirely clear how the mixture altered the mouse immune system, but Lawrence said the chemicals may be altering pathways that control the immune cells that would fight off diseases. Some of the compounds in the mixture—benzene and styrene—are considered toxic to mammals' immune system.

Susan Nagel, a researcher and associate professor of obstetrics, gynecology and women's health at the University of Missouri School of Medicine and study co-author, said the 23 chemicals they used are known endocrine disruptors, meaning they interfere with the proper functioning of hormones.

Properly functioning hormones are crucial for immune system development.

There's probably "some overlap of chemicals that perturb the endocrine system, some that perturb the immune system and probably some that do both," Gore said.

Female mice had more severe changes to their immune systems and abilities to fight off disease. "Immune responses of males and females are inherently different, and ... sex affects the timing, magnitude or penetrance of many diseases," the authors wrote, but cautioned that this study alone doesn't conclude that females are more sensitive to fracking chemicals than males.

The study is just the latest concern for people near fracking sites. Previous studies have found associations between living near fracking sites and birth defects, prostate and breast cancer, asthma, and acute lymphocytic leukemia.

Over the past decade the U.S. Environmental Protection Agency has identified dozens of chemicals used in fracking as health hazards, according to a report from the Partnership for Policy Integrity and Earthworks. However the report found the agency allowed the chemicals to be made and used, and hasn't disclosed the chemicals to the public.

The EPA would not comment on the new study — a spokesperson said the agency is reviewing it.

Earth’s atmosphere just crossed another troubling climate change threshold

By Chris Mooney, May 3 at 1:46 PM

For the first time since humans have been monitoring, atmospheric concentrations of carbon dioxide have exceeded 410 parts per million averaged across an entire month, a threshold that pushes the planet ever closer to warming beyond levels that scientists and the international community have deemed “safe.”

The reading from the Mauna Loa Observatory in Hawaii finds that concentrations of the climate-warming gas averaged above 410 parts per million throughout April. The first time readings crossed 410 at all occurred on April 18, 2017, or just about a year ago.

Carbon dioxide concentrations — whose “greenhouse gas effect” traps heat and drives climate change — were around 280 parts per million circa 1880, at the dawn of the industrial revolution. They’re now 46 percent higher.

As the famed “saw-toothed curve”, more formally known as the Keeling Curve, shows, concentrations have ticked upward in an unbroken progression for many decades. But they also go up and down on an annual cycle that’s controlled by the patterns and seasonality of plant growth around the planet.

The rate of growth is about 2.5 parts per million per year, said Ralph Keeling, who directs the CO2 program at the Scripps Institution of Oceanography, which monitors the readings. The rate has been increasing, with the decade of the 2010s rising faster than the 2000s.

“It’s another milestone in the upward increase in CO2 over time,” Keeling said of the newest measurements. “It puts us closer to some targets we don’t really want to get to, like getting over 450 or 500 ppm. That’s pretty much dangerous territory.”
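The arithmetic behind these milestones is simple enough to check (a linear extrapolation of our own; since the growth rate has been accelerating, the projected years are upper bounds on the time remaining):

```python
# Arithmetic behind the figures quoted above: the 46 percent rise since
# preindustrial times, and how soon 450 and 500 ppm arrive at the
# current growth rate. Linear extrapolation only, as an illustration.

preindustrial = 280   # ppm, circa 1880
current = 410         # ppm, the April monthly average at Mauna Loa
growth = 2.5          # ppm per year, per Ralph Keeling

increase = (current - preindustrial) / preindustrial   # ~0.46
years_to_450 = (450 - current) / growth                # 16 years
years_to_500 = (500 - current) / growth                # 36 years

print(f"Rise since 1880: {increase:.0%}")
print(f"Years to 450 ppm at current rate: {years_to_450:.0f}")
print(f"Years to 500 ppm at current rate: {years_to_500:.0f}")
```

At 2.5 ppm per year, 450 ppm is only about 16 years away and 500 ppm about 36, which is what Keeling means by limited headroom.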

“As a scientist, what concerns me the most is not that we have passed yet another round-number threshold but what this continued rise actually means: that we are continuing full speed ahead with an unprecedented experiment with our planet, the only home we have,” Katharine Hayhoe, a climate scientist at Texas Tech University, said in a statement on the milestone.

Planetary carbon dioxide levels have been this high or even higher in the planet’s history — but it has been a long time. And scientists are concerned that the rate of change now is far faster than what Earth has previously been used to.

In the mid-Pliocene warm period more than 3 million years ago, they were also around 400 parts per million — but Earth’s sea level is known to have been 66 feet or more higher, and the planet was still warmer than now.

As a recent federal climate science report (co-authored by Hayhoe) noted, the 400 parts per million carbon dioxide level in the Pliocene “was sustained over long periods of time, whereas today the global CO2 concentration is increasing rapidly.” In other words, Earth’s movement toward Pliocene-like conditions may play out in the decades and centuries ahead of us.

Even farther back, in the Miocene era between 14 million and 23 million years ago, carbon dioxide concentrations in the atmosphere are believed to have reached 500 parts per million. Antarctica lost tens of meters of ice then, probably corresponding to a sea level rise once again on the scale of that seen in the Pliocene.

Farther back still, at the Eocene-Oligocene boundary around 34 million years ago, Antarctica is believed to have had no ice at all, with atmospheric carbon dioxide concentrations of 750 parts per million.

These data points help show why it is that scientists believe that planetary temperatures, sea levels and carbon dioxide levels all tend to rise and fall together — and thus, why Earth is now headed back toward a period like the mid-Pliocene or even, perhaps, the Miocene, if current trends continue.

Keeling said that the planet, currently at 1 degree Celsius (1.8 degrees Fahrenheit) above preindustrial levels, is probably not yet committed to a warming of 1.5 or 2 degrees Celsius, but it’s getting closer all the time — particularly for 1.5 C. “We don’t have a lot of headroom,” he said.

“It’s not going to be a sudden breakthrough, either,” Keeling continued. “We’re just moving further and further into dangerous territory.”

Hawaii poised to ban sale of some sunscreens

By SOPHIA YAN Associated Press | Thursday, May 3, 2018

HONOLULU — Many sunscreen makers could soon be forced to change their formulas or be banned from selling the lotions in Hawaii.

State lawmakers passed a measure this week that would ban the sale of sunscreens containing oxybenzone and octinoxate by 2021 in an effort to protect coral reefs. Scientists have found the two substances can be toxic to coral, which are a vital part of the ocean ecosystem and a popular draw for tourists.

Consumers would only be allowed to buy sunscreen with the chemicals if prescribed by a health care provider, though the measure itself doesn’t ban online purchases or tourists from bringing their own to Hawaii.

Hawaii would become the first state to enact a ban on the chemicals if Democratic Gov. David Ige signs the bill; he has not indicated whether he will.

Similar legislation failed last year, after it pitted environmental scientists against businesses and trade groups that benefit from the $2 billion market for sun care products in the U.S.

This is “a first step to help our reef and protect it from deterioration,” said state Sen. Donna Mercado Kim, a fellow Democrat who introduced the measure. Although other factors contribute to reef degradation …

Report Says Solving Houston flooding woes will require wholesale strategy overhaul

Limiting development and telling buyers about a home's true flood risk are among the recommendations by the Greater Houston Flood Mitigation Consortium.



A new report by a group of leading Texas researchers details a variety of shortcomings with the Houston area’s current — and proposed — approach to flood control and calls on civil leaders to pursue a multifaceted and regional strategy that ensures that all communities receive better protection regardless of socioeconomic status.

The report by the so-called Greater Houston Flood Mitigation Consortium — a group of scientists from universities and other institutions across the state formed just after Hurricane Harvey battered the region last August — says current regulations aimed at lessening the impacts of flooding are patchwork and often subpar, doing little to effectively corral floodwater.

The consortium is funded by several major philanthropic foundations connected to the likes of Walmart and Kinder Morgan.

“As greater Houston considers strategies and tactics for protecting residents from future storms, consortium members believe it is essential to acknowledge paradigm shifts that have occurred and to recognize that we are entering a new era in flood mitigation that will require bold thinking and actionable solutions,” the report says.

It repeatedly stresses the effectiveness of limiting development in certain areas so grasslands can absorb flood water and ease stress on neighborhood drainage systems and the region’s vast bayou network — a recommendation sure to be met with stiff resistance in a famously pro-development region. The report also says there is a need to better inform the public through measures like a countywide flood alert system and homebuyer disclosures detailing flood risks.

A 2016 investigation by The Texas Tribune and ProPublica explored how unchecked development has worsened the region’s flood vulnerability.

The report’s release comes the day after the Houston City Council voted 9-7 in favor of stricter floodplain building regulations; starting Sept. 1, Houston homes built or significantly expanded inside a 500-year floodplain will have to be elevated 2 feet above the floodplain.

The current standard is 1 foot above the 100-year floodplain, an area that is much more likely to flood.

The vote came after a heated, hours-long debate in which conservative, pro-business council members expressed concerns about the impact to development and the economy. Developers and realtors opposed the rule, proposed by Houston Mayor Sylvester Turner.

Turner has said more is needed to address flooding risks, and he has plans to push additional regulations.

Harris County enacted similar floodplain building regulations months ago, but the report notes that the county is more limited in what it can do than the city, which has wide-ranging rulemaking authority.

Still, such building regulations can only do so much on their own, according to the report, which acknowledges that political will — along with funding — will pose limitations.

It is still unclear how billions in federal recovery dollars will be divvied up among storm-ravaged regions and which projects will be funded.

One major Houston project state and local leaders have rallied around is the construction of another major reservoir. The report criticizes one version of that plan, saying it was designed to spur development and would not address any of the issues exposed by Harvey.

It calls on the U.S. Army Corps of Engineers to provide more information on the condition and operations of two existing major reservoirs, known as Addicks and Barker, which filled to historic levels during Harvey. That spurred the Army Corps to release a deluge to ease stress on the 70-year-old dams, which are considered to be in poor condition, although it is largely unclear how great their risk of failure is.

The report is also critical of the current approach to buying out homes in flood-prone areas, saying an already expansive buyout program should be incorporated into a larger strategy that considers flood control infrastructure and opportunities to incorporate new parks, open space and housing stock. If buyouts are not incorporated into a housing plan, the report says the supply of affordable housing in the region — already too low — could further diminish.

Poorer communities have already been impacted by cost-benefit policies that give greater weight to the value of homes that would be protected, the report indicates.

And it brings up an issue that looms over the entire effort to protect Houston from future storms: Official floodplain maps have not proven to be a good measure of risk, casting doubt on the efficacy of any regulations on which they are based.

Disclosure: Walmart Stores Inc. has been a financial supporter of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune's journalism.

The swamp’s tug-o-war over America’s ethanol mandate

A biofuels standard Congress passed more than a decade ago in the name of rural development, energy security and fighting climate change has devolved into an arcane fight over market share that has nothing to do with those initial three goals.

Why it matters: The law — the renewable fuel standard, which requires refineries to blend biofuels into gasoline — is a textbook example of how regulations create winners, losers and unintended consequences.

The level of attention President Trump has given to this policy is remarkable given the chaos emanating from him in the West Wing. It reflects the important competing interests of corn farmers in Iowa and refiners in Pennsylvania.

“He’s taking a very personal involvement in it,” Republican Sen. Chuck Grassley of Iowa, the most influential congressional backer of the policy, told me in an interview last week. “When you have the president himself, you don’t need to worry about the chaotic conditions at the White House.”

Trump and his top advisers have been meeting in recent months with companies that refine oil and those that produce corn ethanol, as well as their allies in Congress to find elusive middle ground over the mandate’s compliance costs.

Gritty details:

Some refineries are facing high costs to comply because they don’t have the capacity to blend ethanol. So they have to buy credits called Renewable Identification Numbers (RINs) from others that do have blending capacity, including other oil companies.

These refineries, which include Northeast-based firms PBF Energy and bankrupt Philadelphia Energy Solutions, want the mandate relaxed so their costs go down.

One proposal by Republican Sen. Ted Cruz of Texas would cap the amount of RINs companies can trade.

Ethanol companies want the policy expanded to allow more blending, which they argue would lower compliance costs.

“The last six months have been about RINs and most people have no idea what that is and why we are talking about RINs,” said Emily Skor, CEO of Growth Energy, a coalition of ethanol companies.

Grassley tweeted to Trump late last week: “I want to shake what u might be planning abt a RINS cap for a short period. It will be CATASTROPHIC to ethanol.”

That prompted responses on Twitter like “How many of you even know what RIN is ? Without goggle ? [sic] LOL” and “What is RINS cap?”

This arcane fight is a classic battle for market share. Corn ethanol’s share of the fuel market is growing and testing some refiners’ ability to comply with the mandate. Nearly every gallon of gasoline now has 10% ethanol blended into it.
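The credit mechanics described above can be sketched with a toy calculation (a hypothetical illustration only; the function, volumes and RIN price here are invented for the sketch, not figures from the article):

```python
# Hypothetical sketch of RIN compliance economics: a refiner owes RINs in
# proportion to the fuel it produces, and any obligation it cannot meet by
# blending ethanol itself must be covered by buying RINs on the market.

def rin_compliance_cost(gallons_produced, obligation_pct, gallons_blended, rin_price):
    """Cost of buying RINs to cover the unblended share of the obligation."""
    obligation = gallons_produced * obligation_pct    # RINs owed
    shortfall = max(obligation - gallons_blended, 0)  # RINs that must be bought
    return shortfall * rin_price

# A merchant refiner with no blending capacity pays for its full obligation...
print(rin_compliance_cost(100_000_000, 0.10, 0, 0.75))           # → 7500000.0
# ...while an integrated refiner that blends its own ethanol pays nothing.
print(rin_compliance_cost(100_000_000, 0.10, 10_000_000, 0.75))  # → 0.0
```

This asymmetry is why refiners without blending capacity want the mandate relaxed or RIN trading capped, while companies that can blend are content with the status quo.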

When Congress created the mandate in 2005 and expanded it in 2007, lawmakers predicted increasing gasoline demand and decreasing oil production. On both fronts, the opposite occurred: Oil production skyrocketed and demand for gasoline leveled off. Companies are fighting for their piece of a stagnant transportation fuel mix.

“It is now a battle about winners and losers,” PBF Energy CEO Tom Nimbley told me at an energy conference in Houston earlier this month. “It’s a fight over money and market share.”

Other oil companies, ranging from giants Shell and BP to independent refiners like Andeavor, are better situated to comply with the mandate. That's either because they've had a long-standing ability to blend ethanol with gasoline or because they changed their strategy over the last decade to do so.

“Some people do long-range planning better than others,” said Jack Gerard, president of the American Petroleum Institute, whose members represent all parts of the oil and refining sectors. “That’s what we call free-market competition.”

Nimbley said his company hasn’t been able to change its operations to blend ethanol due to infrastructure and geographical constraints.

“It’s an expedient argument to say ‘you just don’t have the right business model,’ ” Nimbley said. “That’s just not factual. That’s just the hyperbole that gets into this situation.”

While refineries and ethanol companies battle it out over the mandate’s compliance costs, Congress’s third goal of combating climate change is lost in the noise. Lawmakers envisioned the development of biofuels made from plant material that is cleaner than corn ethanol. That hasn’t happened.

For context:

The law envisioned 5.5 billion gallons of biofuels made from cellulosic material like switchgrass being produced last year.

Instead, just 13 million gallons of this liquid type of biofuels came online in 2017, according to Michael McAdams, head of the Advanced Biofuels Association. This is due to several unforeseen factors, including the 2008 economic recession and regulatory uncertainty during the Obama administration.

“We had high expectations,” said Henry Waxman, a former California Democratic congressman who helped pass the 2007 bill. “But we are very much disappointed by the way this law has worked out.”

What’s next: More tug-o-war. Grassley and other Republicans representing corn-producing states sent a letter to Trump Thursday requesting another meeting to talk about Cruz’s proposal to cap RINs. Lawmakers are also working on legislation. That remains a long-shot because ethanol policy divides the Republican Party controlling Congress.

“RINS and repeat,” as a Twitter user told Grassley.

Global energy giants forced to adapt to rise of renewables

Companies face world where falling cost of solar and wind power pushes down prices

Adam Vaughan

Sat 17 Mar 2018 03.51 EDT | First published on Fri 16 Mar 2018 12.00 EDT

Seven years after an earthquake off Japan’s eastern coast led to three meltdowns at the Fukushima Daiichi nuclear power station, the aftershocks are still being felt across the world. The latest came last Saturday when E.ON and RWE announced a huge shakeup of the German energy industry, following meetings that ran into the early hours.

Under a complex asset and shares swap, E.ON will be reshaped to focus on supplying energy to customers and managing energy grids. The company will leave renewables. RWE will focus on power generation and energy trading, complementing its existing coal and gas power stations with a new portfolio of windfarms that will make it Europe’s third-biggest renewable energy producer.

The major change comes two years after both groups split their green and fossil-fuel energy businesses, a result of the plan by the German chancellor, Angela Merkel, to phase out nuclear by 2022, and also the Energiewende, Germany’s speeded-up transition to renewables after Fukushima.

Coming so soon after 2016’s drastic overhaul, last week’s shakeup raises the question of what a successful energy utility looks like in Europe today. How do companies adapt to a world where the rapid growth of renewables pushes down wholesale prices, and the electrification of cars begins to be felt on power grids?

Peter Atherton, an analyst at Cornwall Insight, said the deal showed that E.ON and RWE did not get their reorganisation right two years ago. It marks a decisive break with the old, traditional model of a vertically integrated energy company that generates energy, transports it and sells it.

In the noughties, the conglomerate model was seen as a way for energy firms to succeed, leading to a wave of mergers. Some, such as Italy’s Enel and Spain’s Iberdrola (ScottishPower’s owner) are still pursuing this “do everything” model.

But by and large, companies are being broken up and becoming more specialised. “What you’re certainly seeing is companies taking bets about where the value will be,” said Atherton.

RWE argues that today the only way to compete in European government auctions for renewable energy subsidies is to go big. Rolf Martin Schmitz, the group’s chief executive, said: “Critical mass is the key in renewable energy. Before this transaction, neither RWE nor E.ON was in this position.”

After the deal, RWE will have around 8GW of renewable capacity and another 5GW in the pipeline, which will together account for 60% of its earnings by 2020.

Schmitz also said that a new type of company was needed to thrive in a world where windfarms and other green energy projects would soon have to succeed on market prices, not government subsidies. “Renewables will evolve from a regulated business to market competition,” he said.

E.ON, meanwhile, is majoring on supplying people with energy and services, and will grow from 31 million customers to roughly 50 million after the deal. It will also have a much greater proportion of its earnings – 80%, up from 65% – coming from the regulated, lower-return but lower-risk business of energy networks.

John Feddersen, chief executive of Aurora Energy Research, said the two firms were going in very different directions, but the path E.ON had taken was less well trodden.

“This is to some extent a question of try it and see what works. [For E.ON], owning grids and lobbying government for good regulatory outcomes is a well-understood business. However, the [customer] services side is untried,” he said.

The changing nature of power generation in Europe has been felt most keenly in Germany because of the Energiewende, but industry-watchers say the same pattern is driving companies to transform themselves across the continent. In the UK, British Gas owner Centrica is halfway through a sometimes painful reinvention of itself as a customer-centric energy company, divesting its old, large power stations to focus on selling services such as smart heating systems, as well as gas and electricity.

The UK’s second-biggest energy firm, SSE, is moving in the opposite direction. It is getting out of domestic energy supply, banking instead on regulated networks and renewable power generation, where prices are guaranteed.

The picture is further complicated by the entrance of big oil, which is taking serious steps to diversify out of oil and gas and into the world of energy utilities. Norway’s Statoil last week rebranded itself as Equinor to reflect its transformation into a “broad energy” company that deploys windfarms as well as oil rigs.

Shell recently bought the UK’s biggest independent household energy supplier, First Utility, and has also acquired firms in electric car infrastructure. That puts it in direct competition with E.ON, which promised to roll out charging points faster as a result of the asset swap.

Investors seem to like the paths E.ON and RWE have taken, with big bumps in the share prices of both after the deal. But no one knows if we will be back here in two years’ time. “I would view this very much as a test of the right structure for an energy company,” said Feddersen.

California Has Banned Soda Taxes. How a New Industry Strategy Is Succeeding.

The bill to pre-empt taxes is part of a wider industry effort to skip cities and go straight to the states.

“The soda industry has gone completely rogue,” said Scott Wiener, a state senator.

For years, the soda industry had an ironclad strategy when a city wanted to enact a soda tax: Spend a lot of money, rally local businesses, and shoot it down.

That strategy worked again and again, until it didn’t. In 2014, Berkeley, Calif., passed the nation’s first tax on sugary drinks, which have been linked to heart disease, obesity and tooth decay. Since then, eight communities, including three more cities in California, enacted similar bills.

Now the beverage industry has a new approach. Instead of fighting the ordinances city by city, it is turning to states, trying to pass laws preventing any local governments from taxing their products.

In California, the legislature passed a bill Thursday that will pre-empt any new local beverage or food taxes for 12 years. Arizona and Michigan have passed similar laws. In Oregon, the state’s grocers have collected enough signatures to bring a ballot initiative barring any taxes on grocery items. And legislators are considering pre-emption bills in other states, including Pennsylvania, New Mexico and Washington.

“It’s a little bit like, instead of playing a game of whack-a-mole, you could just put a sheet of plywood over all the holes,” said Franco Ripple, a spokesman for the Campaign to Defend Local Solutions, a group that was founded last year to oppose state laws that limit local autonomy.

The beverage industry says the statewide measures are an appropriate way to protect local businesses and consumers from higher taxes. “People across the country are taxed enough, and they can’t afford new taxes on what they eat and drink,” said William Dermody, the vice president for media and public affairs at the American Beverage Association, an industry trade group.

In California, the arrival of the bill to pre-empt soda taxes, which was championed by the soda industry and introduced over the weekend, came as a shock to public health advocates and many state lawmakers. The state has passed more soda taxes than any other, shepherded by progressive lawmakers who see them as a source of revenue for schools and public services and a tool to fight obesity and diabetes. Scott Wiener, a state senator who represents San Francisco and parts of San Mateo County, said the beverage industry had successfully held California “hostage.”

Bill Monning, the Senate majority leader, was one of a handful of Democrats who voted against the bill. He called its passage “unprecedented” and said it would stop cities and counties “from being able to take steps to protect the health of their residents.” But he also said that state lawmakers would redouble their efforts to curb consumption of sugary drinks. He pointed out that the Senate health committee had just voted in favor of a bill that would require warning labels on sugary drinks indicating that they contribute to various health problems.

“It’s a sad day for democracy in California,” he said. “But ever the optimist I think that the outrage of Big Soda blackmailing the state legislature and the people of California is going to boomerang.”

Beverage companies spent at least $7 million to get an initiative on the ballot this November that would have prevented local communities from raising taxes without approval from two-thirds of voters or an elected body, rather than a simple majority. Such a change would have made it much more difficult for localities to pay for police, fire, transit and other public services.

According to several state senators, the industry then went to lawmakers in Sacramento with a proposal: Pass a bill banning soda and food taxes, and the industry would drop its November ballot initiative.

“They sent us a ransom note that they will drop this horrible ballot measure if we put a 12-year moratorium on local soda taxes,” said Mr. Wiener, who has long supported soda taxes and voted against the bill that now bars them. “It’s a classic case of picking your poison. The soda industry has gone completely rogue.”

Several top lawmakers said they opposed the measure banning soft drink taxes, known as Senate Bill 872, but many felt obliged to support it because they were so worried about the effects of the broader ballot initiative.

Nancy Skinner, a state senator who represents Richmond, Calif., as well as three cities that passed soda taxes in 2016, voted against the measure but said she recognized that her colleagues who supported it were in a bind. The law will not overturn soda taxes enacted before 2018, but it will prevent any new ones, including a soda tax on the ballot later this year in Richmond. “It is totally wrong to deny the residents of Richmond a vote,” she said.

Gov. Jerry Brown avoided taking a position on the bill until Thursday but has now signed it into law. Critics of the bill pointed to a photo that shows him posing with executives from Coca-Cola, PepsiCo and the American Beverage Association at a recent private dinner at the governor’s mansion. The governor’s spokesman said the meeting was unrelated to the soda tax measure.

The push for the pre-emption law began to materialize this year when an industry-financed group, the California Business Roundtable, started collecting signatures for the ballot initiative making local tax increases harder. The American Beverage Association California PAC contributed heavily to the effort, raising $6 million from Coke and Pepsi, $1 million from Dr Pepper Snapple and $100,000 from Red Bull.

The deal, reported by The Sacramento Bee on Sunday, which called it “a shakedown,” provoked a fierce outcry across the state and the country. The American Heart Association, the American Diabetes Association, the American Cancer Society and more than 20 other groups issued a joint statement on Tuesday calling on Mr. Brown to oppose it and chiding the soda industry for “resorting to backroom deals and underhanded efforts to preserve its profits.”

The beverage association, which supported the California legislation, said its turn to pre-emption simply reflected its view that taxes on drinks represented an unfair burden — not just on them, but on retailers, bottlers, shipping companies and consumers.

Mr. Dermody noted that, in several states, the pre-emption bills are backed by a coalition of business interests, including grocery stores, and, in some cases, labor unions.

The California measure is part of a broader national strategy by the beverage industry to head off local soda taxes, which have passed in eight communities. Berkeley’s passage of a major soda tax paved the way for other cities to act. Philadelphia, San Francisco, Oakland, Calif., and the county that includes Chicago imposed their own such measures in 2016, followed by Seattle in 2017. The soda industry spent at least $38 million fighting the taxes in 2016, which were backed by public health advocates and the billionaires Laura and John Arnold and Michael R. Bloomberg, former mayor of New York, who contributed $20 million to the Bay Area soda tax campaign.

The experience of the places that adopted soda taxes has been mixed. Research in Berkeley, Philadelphia and Mexico, which has a similar tax, shows that the measures appear to increase beverage prices and reduce sugary drink sales.

But there has been backlash. Cook County in Illinois repealed its tax in the face of voter complaints and industry pressure. And Mayor Jim Kenney of Philadelphia has had to back away from his initial claims that beverage tax revenue would provide enough money to pay for a universal prekindergarten program for the city, as revenue has come in below projections. Definitive evidence about public health effects of the bills is still a long way off.

The beverage companies have borrowed a tactic from the tobacco industry, which used state pre-emption laws in the 1980s to ban cigarette taxes and other municipal antismoking ordinances. Republican-controlled state legislatures have also increasingly used state pre-emption laws to prevent cities and towns from adopting a wide range of local measures, including anti-discrimination laws, bans on natural gas fracking, higher minimum wages, and restrictions on the use of plastic bags.

“The irony is that the soda companies screamed very loudly about government overreach when soda taxes began to get passed,” said Kelly Brownell, the dean of the Sanford School of Public Policy at Duke University. “But now they are looking for the ultimate government overreach when it works in their favor.”

Tucson Electric looking at biomass power generation

By David Wichner Arizona Daily Star, Jul 2, 2018

After boosting its solar power generation and adding wind-energy projects in recent years, Tucson Electric Power Co. is looking at biomass generation from burning wood or other organic matter.

TEP said Friday that it is seeking information about forest biomass generation systems that could generate power for customers while improving the health of Arizona forests.

The utility issued a request for information about technologies, costs, environmental benefits, construction requirements and interconnection requirements of forest biomass energy projects.

The information will be used to “help determine the feasibility of using forest feedstocks as a renewable resource in Arizona,” TEP said.

Biomass power plants use the heat produced from the combustion of wood or other organic materials to generate electricity.

Fuels may include forest residues, crop residues, primary and secondary mill residues, and urban wood waste.

TEP said it will coordinate with other Arizona utilities, as appropriate, to consider joint efforts to develop a forest biomass energy project.

The Novo BioPower Plant in Snowflake burns chipped wood from Arizona’s forests to produce 27 megawatts of power, which is under contract by Arizona Public Service Co. and the Salt River Project to meet part of the utilities’ renewable-energy requirements.

While some scientists consider biomass generation carbon-neutral, some environmentalists disagree and others are concerned that it could increase other kinds of pollution and lead to over-cutting of forests.

TEP said it anticipates filing a forest biomass proposal with the Arizona Corporation Commission in 2019. The commission held a workshop on biomass generation last December.

The utility says it is working to deliver at least 30 percent of its power from renewable resources by 2030, doubling the state’s 2025 goal.

Nearly 13 percent of TEP’s power came from solar, wind and other renewable resources last year, well above Arizona’s 7 percent requirement for 2017, the company said.

The request for information process is being managed by New Hampshire-based Accion Group. A copy of the request can be found at

TEP provides electric service to about 424,000 customers in Southern Arizona.

Climate change soon to be main cause of heat waves in West, Great Lakes

A new analysis of heat wave patterns appearing today in Nature Climate Change concludes that climate change driven by the buildup of human-caused greenhouse gases will overtake natural variability as the main cause of heat waves in the western United States by the late 2020s and by the mid-2030s in the Great Lakes region.

“These are the years that climate change outweighs natural variability as the cause of heat waves in these regions,” said Hosmay Lopez, a meteorologist at NOAA’s Atlantic Oceanographic Meteorological Laboratory and the University of Miami Cooperative Institute for Marine and Atmospheric Studies, and lead author of the study. “Without human influence, half of the extreme heat waves projected to occur in the future wouldn’t happen.”

The research also found that climate change would replace natural variability as the main cause of heat waves in the northern and southern Plains in the 2050s and 2070s, respectively. Researchers defined heat waves as three or more consecutive days when temperatures rose to levels among the top five hottest days of the year for a region.
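The definition used in the study lends itself to a simple sketch (an illustration only, not the researchers’ actual code; the function name and the sample temperatures are invented):

```python
# Sketch of the study's heat wave definition: three or more consecutive
# days whose temperatures rank among the five hottest days of the year
# for a region.

def heat_waves(daily_temps):
    """Return (start_index, length) for each run of 3+ consecutive days
    whose temperature is among the year's five hottest days."""
    threshold = sorted(daily_temps, reverse=True)[4]  # 5th-hottest day
    waves, run_start = [], None
    for i, t in enumerate(daily_temps + [float("-inf")]):  # sentinel closes last run
        if t >= threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= 3:
                waves.append((run_start, i - run_start))
            run_start = None
    return waves

# A 10-day toy series: one 4-day hot spell (days 2-5) and one isolated hot day
temps = [20, 21, 35, 36, 35, 22, 20, 34, 19, 18]
print(heat_waves(temps))  # → [(2, 4)]
```

The isolated hot day (index 7) ranks among the five hottest but is discarded because the run is shorter than three days.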

The new research is part of a larger effort to better predict heat waves in the United States. With a growing population, improved heat wave prediction can help inform adaptation and mitigation to protect human health.

This map shows when climate change, driven by human-caused greenhouse gas emissions, is predicted to be the dominant cause of heat waves in four regions, according to new NOAA and University of Miami research. Hosmay Lopez/NOAA/University of Miami.

“Research has tended to focus on predicting extremes such as tornadoes and hurricanes when heat waves are actually causing many more weather-related deaths in our country,” said Lopez. “We want to help fill that gap and develop effective heat wave predictions to help communities be better prepared.”

Extreme heat has been the leading weather-related cause of death in the United States for the past 30 years, according to US natural hazard statistics.

Lopez and colleagues used climate models along with historical climate data to project future heat wave patterns. They based their findings on the projection for greenhouse gas emissions this century, known as the RCP8.5 scenario. This assumes high population with modest rates of technological change and energy conservation improvements and is often called the “business as usual” scenario. Lopez said he based the research on this climate scenario because historical greenhouse gas emissions have to date aligned with this projection.

Honey laundering: Honey may be one of the nation's most fraudulent foods

Laura Reiley, Times food critic

Published: June 6, 2018 | Updated: June 6, 2018 at 09:57 AM

Fifteen years ago, Hackenberg Apiaries, with hives in Pennsylvania and Florida, produced 300 barrels of honey a year. According to Davey Hackenberg, that number dipped to 80 barrels two years ago and 40 last year.

Nationwide, many beekeepers report their harvests have been cut by half. The wholesale price of honey has nearly doubled in the past 10 years, now hovering around $8 per pound, according to the National Honey Board.

If honeybees are dying and the bees that remain spend most of their time rented out as pollinators, and if American beekeepers routinely report lower honey production and diminished vigor among their honeybees, then we must have a shortage of honey. Right?

Not exactly.

We do have a honey problem, but it’s sticky.

According to Larry Olmsted, author of Real Food, Fake Food, honey is among the most faked foods. Last year, the United States imported 275 million pounds of honey and produced less than 120 million pounds. A lot of this honey is ultra-filtered, a high-temperature, high-pressure process that filters out identifying pollen, to hide this fact: It is honey from China that has been transshipped, which means it’s sent to an intermediate country like Canada and relabeled as a product of that country to circumvent tariffs.

But why, in general, is Chinese honey bad? It is often diluted with high-fructose corn syrup, rice syrup, beet sugar and other sweeteners, and sometimes contains antibiotics and dangerous chemicals. According to the Center for Infectious Disease Research and Policy, the antibiotics may be a holdover from a Chinese bee disease from 2001 that they fought with antibiotics like chloramphenicol, which is a carcinogen.

The Food and Drug Administration tests only about 5 percent of imported honey. And domestically, the USDA has created a voluntary honey grading system that allows producers to label their own products with Grade A, Grade B or Grade C, with no inspection or certification.

Why isn’t there more oversight? According to Fort Meade beekeeper Jim Doan, testing is not a priority for the FDA because the honeys in question "are not contaminated, they’re adulterated."

He sees the large-production honey packers as part of the problem.

"Some have been indicted, but the fine is so negligible. It’s considered a harmless crime," he says. "You’ll get a food buyer for a big cereal company and they’ll say to the packer on the down-low, ‘We want the special blended product,’ because it’s cheaper. It’s such a small ingredient in the finished cereal that no one knows."

So how can you be a better honey buyer? Here are some tips:

• You want a honey that still has its pollen, for health reasons but also because honey with its pollen can be "tracked." A study by Food Safety News found that 75 percent of honey sold in big-box retailers and grocery stores contained no pollen. Nearly all of the honey bought at farmers markets and natural grocers had it. Buy local — even better if it’s from a beekeeper him- or herself who seems enthusiastic about their bees. But read labels: Not every honey sold at a Tampa Bay outdoor market is locally produced.

• "Raw honey" means it’s unprocessed, unheated, its live enzymes preserved. "Pure honey" is a legally ambiguous term, generally meaning that there are no additives, but says very little about what’s in the jar.

• Look for a label for a certification program called True Source honey, an organization of producers that helps prevent illegal trade in honey that circumvents U.S. law. If you don’t see the True Source logo on your bottle of honey, check the True Source website to see if the brand participates.

• If you see a label that makes the claim of organic, be skeptical. A standard jar of honey requires a bee to make a million flower visits, and a bee can fly 5 miles to do so. Unless it is in a geographically remote area or that bee has a drone following its every movement, that’s a difficult claim to make good on. Did that bee dip into a Roundup-laden blossom? No telling.

• In general, darker honeys are graded higher and have bolder tastes. Figure out what you like (light or dark; a specific flower like tupelo or orange blossom; monofloral versus multifloral) and purchase accordingly. Recognize that some special honeys, like manuka honey produced in New Zealand, come with a hefty price tag. Let your wallet and your taste buds duke it out.

"Underreporting Of Toxic Waste At Hog Farms Prompts Inquiry"

"Testing of 55 North Carolina lagoons showed large discrepancies in levels of key pollutants compared to what was self-reported"

"Authorities in North Carolina have launched an investigation into widespread underreporting of dangerous toxins in dozens of feces-filled cesspools on giant hog farms that dot the eastern part of the state.

Testing of 55 waste lagoons at 35 hog-raising operations by regulators showed large discrepancies in levels of key pollutants compared with what was self-reported to the state by farmers. Excessive nutrients such as phosphorus and nitrogen, which can poison the water supply, were, in many cases, much higher than that reported by the farms.

The presence of hazardous heavy metals such as zinc and copper in the waste lagoons was even greater – zinc levels were as much as 101,108% higher in the regulators’ testing compared with what was reported to them."

U.S. Coastal Flooding Breaks Records as Sea Level Rises, NOAA Report Shows

The frequency of high-tide flooding has doubled in 30 years. Some cities faced more than 20 days of it in the past year, and not just during hurricanes.


JUN 6, 2018


In Miami Beach, high tides are creating street flooding problems more often as sea level rises. The U.S. Southeast is seeing the fastest acceleration in high-tide flooding days.

The nation's coasts broke records for tidal flooding over the past year as storms combined with rising seas to inundate downtown areas of Miami, Boston and other major cities, according to a federal report released Wednesday.

While some of the flooding coincided with hurricanes and nor'easters, much of it was driven mainly by sea level rise fueled by climate change, scientists with the National Oceanic and Atmospheric Administration (NOAA) write.

The oceans are rising about 3 millimeters a year on average, driven primarily by melting land ice and warming water, which expands. That rate is accelerating, and it has led to a steady increase in U.S. coastal flooding in recent decades, the report shows. Several cities—including Boston, Atlantic City, and Sabine Pass, Texas—saw more than 20 days of high-tide flooding between May 2017 and April 2018, the "meteorological year" covered by the report.

"Though year‐to‐year and regional variability exist, the underlying trend is quite clear," the report says. "Due to sea level rise, the national average frequency of high tide flooding is double what it was 30 years ago."

The report measured data from tidal gauges at 98 locations along the nation's coasts to see how often water levels rose above a point that typically inundates roads, infiltrates stormwater systems or otherwise disrupts daily life.

Building Barriers in Boston: Costs Add Up

While many cities are beginning to address the threat posed by rising seas, the findings highlight the fact that they aren't keeping pace with the problem.

Boston is two years into a city-wide initiative to protect itself from the effects of climate change, including rising seas. It has begun efforts to shield some vulnerable areas by building new flood walls and elevating streets. Another proposal envisions a harbor-wide barrier system that would close during major storms and cost billions of dollars. But last week, a city-backed study recommended against building a barrier, saying the money would be better spent on smaller-scale measures, such as flood walls or green infrastructure.

Tidal Flooding Is Rising with the Sea

A series of winter storms this year underscored the urgency of the problem: Boston matched its previous 12-month record, set in 2009, with 22 days of tidal flooding from May through April. One of those storms, in January, led to the highest tide ever recorded in the city and pushed seawater through downtown streets.

Coastal Cities Rethink Urban Planning

New York, which also tied its record of 15 days of tidal flooding, has launched various initiatives, including localized efforts to build "resilient neighborhoods", protecting subways by elevating ventilation grates and writing new guidelines for developers on how they can build more resilient structures. But a signature coastal project that would erect berms and walls to protect lower Manhattan—known as "The Big U"—has faced delays and been scaled back due to its high costs.

Norfolk, Virginia—with 14 days of tidal flooding over the 12-month period—has tried to weave adaptation to rising seas into all of its planning and operations, rewriting the zoning code to require that new buildings are built higher up and are more resilient to flooding, for example, and publishing an innovative planning document that considers risks out to the end of the century. It's also begun a range of infrastructure projects like building coastal wetlands to absorb flood waters and raising some streets, and is looking at a potential $1.8 billion protection plan proposed by the Army Corps of Engineers.

Miami and Miami Beach are planning to spend hundreds of millions of dollars each in coming years.

Tidal flooding is worsening in most places, and it's currently most frequent in the Northeast, primarily because of the storms that regularly lash the coast each winter. But it's getting worse fastest in the Southeast. The coast in that region is flat, so rising seas are exposing a relatively large area to new flooding.

"With a little bit of sea level rise you can really get a big jump in the number of days" of flooding, said William Sweet, an oceanographer with NOAA and the lead author of the report.

This past March, the National Guard had to carry in emergency workers as high-tide flooding swamped cars and coastal neighborhoods in Quincy, Massachusetts, during a nor'easter.

Earlier this year, NOAA published a report with detailed predictions for how tidal flooding could increase as seas rise. While Miami currently experiences only a few days of tidal flooding per year, for example, it could see inundation every other day by 2060 under an intermediate scenario for rising seas. The report said that some cities that are dry today could have daily flooding by the end of the century.

Gregory Dusek, chief scientist at NOAA's Center for Operational Oceanographic Products and Services and a co-author of the report, noted that rising seas have pushed the baseline high enough so that the type of common storm that hits a place like Charleston, South Carolina, several times a year can now induce flooding that once came only with hurricanes.

"The flooding that used to occur only during major storms, and maybe once in a decade," he said, "now occurs with regularity."

KFC goes vegetarian with plans to test plant-based chicken

Zlati Meyer, USA TODAY Published 10:58 a.m. ET June 6, 2018 | Updated 2:33 p.m. ET June 6, 2018

The company's United Kingdom and Ireland unit last week announced that it plans to test the faux chicken with customers later this year. If successful, it could launch the product in 2019.

The move isn't being billed as a concession to animal-rights activists. Rather, the vegetarian option is part of a plan to reduce calories by 20% per serving by 2025.

The vegetarian option is believed to be the first time a major fast-food chain is putting fake chicken on the menu.

"We always look to respond to the latest changes in lifestyle and dining habits of our customers and a key part of that for our business in the UK is offering lighter options and more choice," the company said in an e-mail. "That’s why we’re looking into vegetarian options that would offer the great taste of KFC to new and existing customers who are changing their dining habits."

KFC UK & Ireland's foray into vegetarian dishes stems from a growing trend not just in Europe, but also in the U.S. More people want to have healthier lifestyles and to know what's in their food, experts say. Consumers increasingly seek out organic and less-processed foods, while those with concerns about antibiotics in meat and poultry and animal welfare are turning to plant-based alternatives.

Some 14% of U.S. consumers, or 43 million people, regularly use plant-based alternatives to traditional foods, such as almond milk, tofu and veggie burgers, according to research firm NPD Group. Of those, 86% don't consider themselves vegan or vegetarian.

Other big chains are trying their hand at faux animal proteins, too.

In April, White Castle introduced a plant-based equivalent of its Cheese Slider, made with Impossible Foods plant-based meat. It's for sale in the New York, New Jersey and greater Chicago areas, but the burger chain said that it could be available nationally later on. And late last year, McDonald's added its faux beef burger, called the McVegan, to menus in Sweden and Finland.

In February, Ikea announced it would start selling vegetarian hot dogs in Malmo, Sweden, with plans to bring them to the rest of Europe in August and the U.S. in 2019. In 2015, the company added a vegan equivalent to its iconic meatballs to its menu.

Orlando-based global restaurant consultant Aaron Allen predicts large restaurant companies will increasingly unveil more vegetarian options, like KFC's.

"How sharply has vegetarianism increased in the U.S.? Double and triple digits," he said. "We’re talking about millions of people dramatically changing their diets. Chains will take note of that and change their menu offerings."



We chose to go the Moon because it was hard, but it turns out difficulty might not be a compelling enough argument to sway Americans these days.

That's according to a new poll released by the Pew Research Center, a non-partisan polling group that studies a range of fields including science and U.S. government policy. Perhaps the most interesting set of results came when the researchers asked how NASA should prioritize the suite of tasks the agency oversees.

The top two priorities out of the list given were to watch Earth's climate and to keep an eye on asteroids that could theoretically slam into the planet. Each of those tasks was ranked as a top priority by almost two-thirds of the survey respondents, with another quarter considering it an important but somewhat lower priority.

A handful of other tasks also drew fairly strong support from the respondents. Those include answering basic science questions about space, creating new technologies, studying how humans are affected physically by traveling in space, looking for natural resources that could be brought to Earth and looking for extraterrestrial life.

Out of the options the researchers provided (of course, NASA does many other things as well), sending astronauts to Mars and the Moon fared worst. Only between one in 10 and one in five respondents thought each of those goals should be a top priority, though almost half of all respondents thought each was still important.

Astronaut Jack Schmitt takes a photo of Earth from the Moon's surface during humanity's last trip there. According to respondents, NASA should watch Earth's climate and keep an eye on asteroids that could theoretically slam into the planet.

But more than a third of respondents thought sending astronauts to Mars wasn't important or shouldn't be done by NASA, and almost half said the same about sending astronauts to the Moon. And overall, only about six in 10 respondents said sending humans into space is essential at all; the rest thought we could get the job done with robots alone.

NASA's existence itself was even questioned by some of the respondents, about a third of whom said that private companies would be able to carry the burden of space exploration. It's unclear whether they realize that a significant portion of commercial space work is financed by NASA's willingness to be their customer.

Some respondents also raised concerns that commercial space companies would prioritize profit over safety issues, managing space junk and doing basic research—issues on which NASA has a generally strong track record.

Costa Rica poised to become world's first fossil fuel-free country


May 11, 2018, 1:39 p.m.

Clean electricity production rules in Costa Rica. The nation's fossil fuel-dependent transportation sector, however, is a whole other story.

Dutifully picking up where his carbon neutrality-aspiring predecessor left off, newly elected Costa Rica President Carlos Alvarado made quite the pledge at his inauguration ceremony: by 2021 — the year of Costa Rica's bicentennial — the preternaturally happy Central American nation will have completely weaned itself off the use of fossil fuels.

"Decarbonisation is the great task of our generation and Costa Rica must be one of the first countries in the world to accomplish it, if not the first," proclaimed Alvarado, a 38-year-old former journalist and member of the left-leaning Citizens' Action Party (PAC). "We have the titanic and beautiful task of abolishing the use of fossil fuels in our economy to make way for the use of clean and renewable energies."

For Costa Rica, with its ironclad conservation laws and booming ecotourism industry, reaching such a formidable goal within such a relatively short time frame may not seem entirely loco. After all, the country is famed for producing roughly 99 percent of its electricity using renewable sources — predominately hydropower but also solar, wind, biomass and geothermal. In 2017, Costa Rica broke its own record by using only clean energy for 300 consecutive days. (By comparison, 66 percent of electricity in the United States comes from coal and natural gas, while roughly 15 percent comes from renewable sources. The remaining 19 percent is nuclear-sourced.)

President Carlos Alvarado is dedicated to leading his already progressive country into an even cleaner and greener future.

And for this, Costa Rica, a country of 5 million, deserves all the accolades thrown at it. But abolishing fossil fuels in just three short years isn't as effortless as it might appear when you consider the one area in which the ultra-progressive country isn't light-years ahead: transportation.

As reported by the Independent, public transportation is not one of Costa Rica's strong suits. In turn, gas- and diesel-powered private cars largely rule the road and are only growing in number. Per data from the country's National Registry, there were twice as many cars registered as babies born in 2016. The previous year, Costa Rica's automobile industry grew a staggering 25 percent, making it one of the fastest-growing auto markets in Latin America.

With a weak public transportation network and a growing number of cars hitting the road, roughly two-thirds of Costa Rica's annual emissions come from transport. Still, Alvarado, who arrived at his own inauguration ceremony via a hydrogen-powered bus, is undaunted: "When we reach 200 years of independent life, we will take Costa Rica forward and celebrate ... that we've removed gasoline and diesel from our transportation," he proclaimed.

Core to Alvarado's campaign were promises to clean up and modernize Costa Rica's gasoline-reliant public transportation system, promote research into new, sustainable fuel sources and outlaw oil and gas exploration in the country. He also vowed to continue former President Luis Guillermo Solís' embrace of electric vehicles. (In 2016, hybrids and EVs represented less than 1 percent of the country's total vehicles.)

The Costa Rican capital of San Jose, home to roughly 300,000 people, as seen from above. Despite Costa Rica's squeaky-clean reputation, the city experiences air pollution caused by automobile traffic.

While many experts applaud Costa Rica's ambitious goals, they point out that a fossil fuel-free transport sector by 2021 is a long shot that may end up being more symbolic than anything. It can — and should — happen, just perhaps not in time for the country's bicentennial.

"If there's no previous infrastructure, competence, affordable prices and waste management, we'd be leading this process to failure," Oscar Echeverría, president of the Vehicle and Machinery Importers Association, tells Reuters. "We need to be careful."

One considerable economic roadblock is the fact that, per Ministry of Treasury data, roughly 22 percent of the government's income currently comes from taxes on fossil fuels. Completely phasing out the import of the gasoline that a vast number of motorists depend on would, for example, force the debt-ridden government to radically rethink how and what it taxes. Again, not a negative but a dramatic shift nonetheless.

More aggressive taxes on carbon emissions seem an obvious path for the Alvarado administration to take to make up for the loss, although that too isn't so straightforward. As recently noted by Nobel laureate Joseph Stiglitz:

Because Costa Rica is already so green, a carbon tax would not raise as much money as elsewhere. But, because virtually all of the country's electricity is clean, a shift to electric cars would be more effective in reducing carbon dioxide emissions. Such a tax could help Costa Rica become the first country where electric cars dominate, moving it still closer to the goal of achieving a carbon-neutral economy.

And even if Costa Rica doesn't achieve such a miraculous feat in such a limited time, there's hope that other countries will take note and follow.

"Getting rid of fossil fuels is a big idea coming from a small country," economist Mónica Araya of Costa Rica Limpia explains to Reuters. "This is an idea that's starting to gain international support with the rise of new technologies. Tackling resistance to change is one of the most important tasks we have right now."

Possible conflict of interest clouds West Virginia's natural gas deal with China

By Ken Ward Staff writer Jun 15, 2018 (3)

Editor's note: This article was produced in partnership with the ProPublica Local Reporting Network. ProPublica is supporting seven local and regional newsrooms this year, including the Gazette-Mail, as they work on important investigative projects affecting their communities.

A member of West Virginia’s negotiating team on the $80 billion natural gas investment deal with China was asked to repay $23,000 in travel expenses after the Justice administration raised questions about a potential conflict of interest, the governor revealed Friday.

Last November, President Donald Trump and Chinese President Xi Jinping looked on in Beijing as officials from the Mountain State and a Chinese energy company signed what was hailed as a landmark deal for West Virginia.

Under the deal, China Energy Investment Corp. would invest more than $80 billion over the next 20 years in West Virginia’s natural gas industry.

Gov. Jim Justice and other state leaders have been banking on the China deal, predicting it will create tens of thousands of additional jobs in the state. It also was described as a victory for Trump, the largest in a series of Chinese investments in the United States that totaled $250 billion.

But on Friday, Justice revealed an ethical cloud over the China deal: At least one member of the state’s trade delegation — an industry executive — was also working to help his private company.

Brian Abraham, the governor’s general counsel, said the state was “using someone who probably shouldn’t have been involved in the negotiations” as part of its trade delegation.

“People that were there in China maybe representing their own special interests, we didn’t think was right,” the governor added.

West Virginia officials are eager to see the fruits of the China Energy investment, as a cornerstone of the natural gas industry's continued growth in the state. But along the way, some lawmakers and watchdogs are questioning whether the state is putting the industry's interests ahead of public concerns about broadening the state's economic base. This year, ProPublica is partnering with the Charleston Gazette-Mail to examine those issues.

At a news conference Friday, neither Justice nor Abraham would name the individual or his company. In an interview later, Abraham confirmed that the man was Steven B. Hedrick, who is CEO of Appalachia Development Group LLC and also CEO of the Mid-Atlantic Technology, Research and Innovation Center, or MATRIC, a nonprofit that partners with industry on various research and development efforts.

Appalachia Development Group has been seeking a loan guarantee from the U.S. Department of Energy as part of an effort to build a natural gas “storage hub” for various natural gas liquid byproducts that can be used in a wide variety of manufacturing.

Abraham said the state Commerce Department paid for Hedrick’s travel for the China negotiations because it considered him acting as a state official, part of a special Commerce Department program in which certain executives are “loaned” to the state.

The Governor’s Office, though, discovered later that Hedrick had not joined the program and when asked to do so after the trip, he declined, Abraham said. Had he joined the program, Hedrick would have been required to sign an agreement to abide by the state ethics law’s prohibition on using public office for private gain.

“Why is this person behind the curtain at Commerce if they’re an individual on the outside?” Abraham said. “That created an ethical dilemma.”

Also, Abraham cited an incident in which state officials were later told that Hedrick asked China Energy officials to specifically target some of their investment in his company’s natural gas storage hub. Abraham said that, on one trip, Hedrick stayed an extra day to pitch his project.

Abraham said Hedrick was asked to repay the state $23,000 in travel expenses and that the repayment had been made.

A spokeswoman for Hedrick said he was not available for comment, but she issued a short email statement that said Hedrick was “grateful to respond to the request of the state of West Virginia to support the Commerce Department’s mission to attract business to the state.”

The statement said MATRIC “promptly paid any expenses invoiced by the state.”

Although officials signed a memorandum of understanding in China, the state has refused to release the text of the agreement and few details have been made public. The China deal and the natural gas storage hub are considered by many state officials as key and related economic development projects for West Virginia’s future.

The state’s natural gas industry has already greatly expanded, and backers of the China deal say it will provide huge amounts of capital that could fund processing plants, pipelines and other facilities that will turn natural gas byproducts into crucial ingredients for a wide variety of plastics manufacturers. These kinds of “downstream” developments will allow West Virginia to capture far more jobs and economic growth than just drilling for gas and shipping it out of state.

The revelations about the China deal came just one day after Justice asked for and received the resignation of Commerce Secretary Woody Thrasher, whose agency bungled the state’s implementation of a federally funded flood-relief program.

Thrasher was the top state official who traveled to China last November as part of the trade delegation.

Justice said Friday that discussions toward realizing the Chinese natural gas investments are ongoing, and repeated his earlier statements that the deal “came into being” because of his personal friendship with Trump.


America’s Last-Ditch Climate Strategy of Retreat Isn’t Going So Well

After two vicious floods, Sidney, N.Y., decided to pull back from the river. Seven years later, things haven’t turned out as planned.

May 2, 2018, 5:00 AM EDT

Corrine Spry had no way of knowing, on the day Tropical Storm Lee ruined her house seven years ago, that she was about to become part of a radical experiment to transform how the U.S. protects itself against climate change. All she knew was that it had been raining for days, so she’d better get some cash.

Spry set out in her car through Sidney, N.Y., a fading village of roughly 4,000 along the Susquehanna River, swollen from the storm and rising fast. On the way home from the bank, she saw firefighters huddled in front of their station. They’d already taken out Sidney’s rescue boat. Spry, who’d lived down the street for 30 years, knew most by name. She pulled over to ask how much time she had. Get your stuff, the firefighters told her, and get out now.

A small woman who moves and speaks in quick bursts, Spry rushed home and called in some favors. Pickup trucks soon arrived. She and her husband, Lynn, moved what they could and drove to higher ground. Six feet of water filled her neighborhood, swallowing front porches and flowing into living rooms.

More than 400 homes and businesses ended up underwater in Sidney, affecting more than 2,000 people. It was months before Spry and her neighbors could move back in. It was also the second time in five years that the Susquehanna had wrecked half the village. People had just finished rebuilding. When Spry walked back into her soggy house, the street outside reeking with the rancid smell of garbage and fuel, she was hit first, she remembers, by the sight of her brand-new hardwood floor, completely buckled.

Spry didn’t want to rebuild again, and neither did local officials; everyone knew the river would keep flooding homes and businesses. So Sidney decided to try something else: It would use federal and state money to demolish Spry’s neighborhood while creating a new one away from the flood plain for displaced residents. Sidney would be on the forefront of U.S. disaster policy, a case study in what’s known as managed retreat—and the many ways it can go wrong.

Until recently, the guiding philosophy behind attempts to protect U.S. homes and cities against the effects of climate change was to build more defenses. Houses can be perched on stilts, surrounded by barriers, buttressed with stormproof windows and roofs. Neighborhoods can be buffered by seawalls for storm surges, levees for floods, firebreaks for wildfires. Defenses are an instinctive response for a species that’s evolved by taming the natural world.

But sometimes the natural world won’t be tamed. Or, more precisely, sometimes engineered solutions can no longer withstand the unrelenting force of more water, more rain, more fires, more wind. Within 20 years, says the Union of Concerned Scientists, 170 cities and towns along the U.S. coast will be “chronically inundated,” which the group defines as flooding of at least 10 percent of a land area, on average, twice a month. By the end of the century, that category will grow to include more than half of the communities along the Eastern Seaboard and Gulf Coast—and that’s if the rate of climate change doesn’t accelerate. In their less guarded moments, officials in charge of this country’s disaster programs have begun to acknowledge the previously unthinkable: Sometimes the only effective way to protect people from climate change is to give up. Let nature reclaim the land and move a neighborhood out of harm’s way while it still is one.

But even when all the most obvious ingredients are in place—from federal money and local buy-in to cheap, dry land right next door—moving is hard. Sidney has yet to remove more than a few dozen homes from the flood plain or break ground on land away from the river. Its failure so far illustrates how unprepared the U.S. is politically, financially, and emotionally to re-create even a single community away from rising waters in an organized way, preserving some semblance of its character and history. The alternative, though, is towns and cities morphing haphazardly, with people staying in ever more dangerous and decaying neighborhoods as long as possible. And if pulling back is hard in Sidney, imagine it in Miami or Boston or New York.

“Every time the river starts to rise, I panic,” says Spry, now 67, who still lives in the same house where she’d found her curtains rotted three feet up from the floor. “Why not buy us out and be done?”

On a frigid morning in December, John Redente drives around the former future of Sidney. Perched on a cliff, Riverlea Farm was going to be the site of 165 single-family homes, a civic center, housing for seniors, a fire and police station, stores, a hotel with a conference center, and village offices. And all of it would be safely nestled 90 feet above the Susquehanna. Today, the area still consists of cornfields ringed by trees. Redente, 69, became the village’s grant administrator eight weeks after the 2011 flood. His responsibility is to ensure that Sidney’s grand plans go somewhere. All these years later, there’s a hint of anger beneath his small-town charm as he talks about the urgency of rebuilding Sidney away from the river. “If we don’t, it’s going to die,” he says.

Sidney’s fate wasn’t always so tenuous. The conditions that threaten to erase it are what first brought settlers here: flat plains, their soil fed by the intersection of the Susquehanna, which starts in Cooperstown, N.Y., and flows 450 miles south to the Chesapeake Bay, and the smaller Unadilla River. Those waterways formed natural highways for a village nestled among the hills at the northeastern tip of Appalachia.

But the toll of flooding has been worsened by decisions about where and how to build. As the cresting Susquehanna rushed past Sidney in 2011, it hit a concrete bridge that became an accidental dam, pushing the water back onto the banks of the village and into homes. Residents blame bad planning for their predicament as much as they blame increased precipitation.

That skepticism colored residents’ initial reaction to Mayor Andrew Matviak’s plans. With his mustache and steady stare, Matviak has the look of a kind but stern uncle, and the flat cadence of his voice evokes somebody accustomed to delivering bad news. When he was elected mayor in the spring of 2011, he expected to spend his time worrying about the things small towns usually struggle with, such as Sidney’s mix of Victorian and postwar homes falling into disrepair and its aging, shrinking population. Six months later, when his village flooded for the second time in five years, he realized he had a far greater problem. Matviak began to worry there was no practical way to protect almost 200 homes from the next inundation. But doing nothing, he decided, would mean slow decay as people drifted away from endangered houses. So he settled on a fix. Sidney would pursue federal funds for sweeping buyouts along the river, erasing more than a tenth of its housing stock to make a “green plain” to better absorb future floods. The village would also try to replace those homes by acquiring the farm.

Right away, the notion of uprooting a neighborhood generated confusion. When town officials first described the proposal, some residents assumed their houses would be transported up the hill by helicopter. Others believed they were being asked to choose which lot they wanted on the farm. “Residents took it upon themselves to go to the property and wander around, much to the displeasure of the property owners,” says Shelly Johnson-Bennett, planning director for Delaware County, which includes Sidney.

But retreating to the farm initially had the support of the people who mattered most: the state officials everyone hoped would pay for it. “When the government first came down and looked at that land, they were so impressed,” Matviak says. “And everybody was gung-ho.”

Killing and simultaneously resurrecting entire neighborhoods is almost unheard of in U.S. climate policy. The greatest obstacles to managed retreat, disaster policy experts say, are inertia and moral hazard. Local officials often oppose buyouts because they will reduce their tax base, and they know that after a town floods, the federal government will pay to rebuild. National programs remain designed around rebuilding in place, even though $1 spent reducing exposure to disasters (for example, buying homes in the flood plain) saves $6 later, according to a federally funded report released in January. Congress spent $350 billion on disaster recovery in the past decade; February’s budget deal alone included more than $80 billion in relief for last year’s hurricanes. Most of that will go toward temporary housing, as well as repairing or reconstructing homes and infrastructure—almost always in the exact same place as before the storm.

Of the 30,000 homes across the country that flooded multiple times from 1978 to 2015, fewer than 9 percent were bought out, according to the Natural Resources Defense Council. Those buyouts have typically focused on small numbers of the most vulnerable homes. Demand from homeowners has often exceeded the number that have actually received buyouts. For example, Harris County, Texas, which includes Houston, has 3,000 homeowners seeking money to leave their houses, according to Ed Emmett, the county’s chief executive officer. “We won’t get enough money to buy them all out, so we’ll have to prioritize,” he says. Sidney’s 2006 flooding inundated hundreds of houses, but officials bought and tore down only nine.

After the second flood, “we realized that we can’t operate the way we’ve operated before and say it’s not going to happen again,” Matviak says. Far from Sidney, two things made his proposal suddenly seem within reach. In the summer of 2012, President Obama signed a bill phasing out subsidies for federal flood insurance. In Sidney, according to Johnson-Bennett, that could mean $7,500 a year in premiums, often more than people’s mortgages. The insurance subsidies would disappear entirely when homes changed owners, making sales all but impossible.

A few months later, Superstorm Sandy smashed into the Northeast. The federal government sent $60 billion in emergency aid, and New York state was suddenly awash in disaster money, along with federal pressure to seek solutions for flood-prone areas. Governor Andrew Cuomo made rebuilding a signature issue: In the summer of 2013, he created the Governor’s Office of Storm Recovery, which announced $750 million in grants for towns that pursued “innovative rebuilding plans,” including Sidney. It wasn’t the first time Cuomo had gotten involved in the town’s recovery. Less than three months after the 2011 floods, the governor announced $20 million to help Amphenol Aerospace, a major military contractor and Sidney’s largest employer, relocate its flooded factory away from the river. Amphenol broke ground on a new plant 18 months later.

Sidney was promised $3 million—enough, local officials hoped, to purchase the farmland and entice a developer to build homes. Meanwhile, the village needed to get the federal government to buy up its soggiest neighborhood before people started to abandon their homes and the town.

Federal buyouts are never fast. But what happened in Sidney made the usual programs seem downright zippy. After a flood, it usually takes six to eight months before the Federal Emergency Management Agency announces a buyout program, says Johnson-Bennett, who is shepherding Sidney through the labyrinth of federal and state disaster programs and who comes across as an almost bewilderingly upbeat force, considering that her job is to manage the aftermath of disasters. Another five or six months usually pass before FEMA approves an application from local authorities. Then the real work starts: contracts, land surveys, house appraisals. Somebody in Johnson-Bennett’s office has to find out if the homes that applied for buyouts have liens against them or foreclosure notices or any other complications before money changes hands. “Two to three years,” she says of how long a typical FEMA-funded buyout takes after a disaster. “Shortest.”

Sure enough, almost three years passed after the 2011 floods before Delaware County was able to buy and demolish a single home with FEMA money. But Sidney’s main problem was that only 32 families initially applied for buyouts, not nearly enough for Matviak’s plans to work.

The reason came down to money, Johnson-Bennett says. FEMA pays only 75 percent of the cost of buying out a home. In Sidney, with housing values already heavily diminished because of the 2006 floods, FEMA’s offers would neither pay off a mortgage nor cover the price of a new home somewhere else. So the state decided to mix disaster programs, adding grants from the U.S. Department of Housing and Urban Development that require no local contribution.

On one level, it worked: An additional 127 families applied for buyouts, enough to turn big chunks of Sidney’s most vulnerable neighborhood into the green plain. But it also meant residents would remain in danger for years while FEMA, HUD, and the state figured out the rules for mixing the two funding streams. It took until 2017 for the federal agencies to agree on rules for what happens if a homeowner appeals the amount of money offered for her house, Johnson-Bennett says. Six years had passed since the floods.

The federal government knows it’s not doing enough to move people away from risky areas. Two years ago, Sean Becketti, chief economist at Freddie Mac, the government-backed mortgage investor, warned that the “economic losses and social disruption” caused by flooding and sea-level rise were likely to exceed the turmoil of the Great Recession.

The Obama administration took notice, but too late. One of its final climate initiatives was asking 11 agencies to set up a working group to “coordinate federal assistance for managed retreat and relocation,” according to an internal memo. A few weeks later, Donald Trump, who’s called climate change a hoax, became president. The group never held a single meeting.

As federal agencies argued over details, the local decay that Matviak was trying to prevent had already started. Families slipped away, and the tax base shrank. Today, the part of Sidney that was underwater in 2011 retains an eerie sense of emptiness. Some homes are dilapidated, others in pristine condition. Most look somewhere in between.

Residents who could neither sell nor afford to make their homes livable a second time abandoned them. Spry says squatters occupied the house across from hers, selling what was left inside. For a time, she thought it was being used by drug dealers.

Living in a neighborhood that’s not supposed to exist anymore creates quotidian concerns as well: How much is the right amount to put into a house that will get torn down? “You have to keep it looking respectable,” says Spry, who, with her husband, paints the trim regularly and maintains the garden. “But you don’t want to spend any more than you have to.”

Some of her neighbors bought homes elsewhere in Sidney after the state announced the large-scale buyouts funded by HUD. James and Shareen Bonner found a house a mile and a half up the hill, one of the few available on dry ground. But they continue to own their former home, a yellow brick house in the flood plain. “We were told that within one to two years, that house would be torn down,” James says. “That was three and a half years ago.” They’re stuck paying property taxes and insurance on two houses, wondering how long they can keep it up.

Others tried to find buyers but couldn’t. Not far from the Bonners’ empty house, Bridget and Steve Bargher tried selling their single-story home in 2012. Potential buyers backed out when they heard about the floods. By 2013, the Barghers gave up. Now, like most of their neighbors, they’re left waiting for the buyouts that were supposed to come years ago. And when that happens—if it happens—they’ve had enough of Sidney.

“If they had moved faster, then probably we would have stayed,” Bridget says. And anyway, she adds, where in Sidney are they supposed to move? “All the houses are garbage,” she says.

Nongarbage houses were supposed to be going up at Riverlea Farm, dry and safe from floodwaters. Sidney reached an agreement with the family who owned the farm for an option to buy the land for $1.3 million. A developer drew up plans. The village paid an engineering firm for infrastructure estimates.

Then, in 2015, everything fell apart. The engineers determined it would cost $4 million to extend water and sewer service to the farm, more than the state wanted to pay. The developer gave up. The owners put the property back on the market. Four years after the flood, Sidney was back to square one. State officials blame high costs, but Redente thinks the problem ran deeper. “Anything you do with a state person that’s out of the ordinary, they get very nervous,” he says.

Matviak and Redente decided to pursue a second, smaller plot of land closer to the village—one that wouldn’t hold as many new homes but also wouldn’t cost as much to connect to Sidney’s sewer and water systems. This dry land, however, came with neighbors.

Under the rules that govern the HUD grants, low- and moderate-income families must benefit. In Sidney, that distinction is mostly a formality: The cutoff is about $37,000 for a household of four, Johnson-Bennett says, which includes many of the families in the village. But when Brenda Philpott found out that the homes Sidney planned to erect behind her house would go to low- and moderate-income families, she got worried. “We do not want more low income housing that will create a future ‘SKID ROW’ effect in the Village of Sidney,” read a petition that she circulated. By last summer, Philpott had garnered some 300 signatures.

The petition has served to slow things down further, according to Johnson-Bennett, but she doesn’t think it will stop the new housing from getting built. The greater problem is that the warnings about skid row soured many of Sidney’s residents on the new development—including some of the same residents who were supposed to move there.

“If they put all welfare people in there, we don’t want to do that,” explains Shareen Bonner.

“I do believe that it’s going to be a welfare situation,” Corrine Spry says.

“I don’t want to live in a housing project,” says Steve Bargher.

Both Redente and Matviak maintain that Sidney’s struggles ought to concern people across the country in areas that can no longer be protected from climate change. “John and I talk all the time about Houston, Texas, or about Puerto Rico,” Matviak says. “When you have hundreds or thousands of homes, how are they going to make this happen?”

Homeowners, Redente adds, are led to believe that the creation of a federally funded buyout program means “bing-bang-boom, you’re all set to go, let’s high-five it, everything’s taken care of. Five years later, they’re still waiting.”

FEMA didn’t respond to requests for comment on the Sidney project. HUD referred questions about Sidney to state officials. When asked about the pace of the buyouts, Catie Marshall, a spokeswoman for the Governor’s Office of Storm Recovery, notes it was just created in 2013. “It’s disheartening to hear that people think that GOSR’s efforts are less than adequate,” she says. “Nobody likes that it takes time.”

Johnson-Bennett remains optimistic, saying the county has made offers to two homeowners near the river, with more to come. “It moves ever so slow, but we are finally making real progress,” she says. Still, nearly seven years have passed since the 2011 floods, and not a single buyout funded with the HUD money has been completed. And Johnson-Bennett doesn’t expect to begin building new houses until next year at the earliest.

As Sidney keeps missing its twin goals of emptying a neighborhood that can no longer be protected and building a new one on dry land, the risk increases that more people will drift away, taking Sidney’s future with them. The delays also tarnish the model this village was supposed to provide the rest of the country. Instead, Sidney underscores how hard it is to overcome the U.S. disaster policy’s emphasis on rebuilding in place. “I was hopeful that a project like this could show something could change,” Johnson-Bennett says. Does she think it worked? “No,” she replies after a pause. She’s been sitting for more than an hour at the McDonald’s up the hill from the village, talking about an unlikely effort to prove the government didn’t have to keep repeating the same mistakes. For the first time, she sounds defeated.

Clean energy sector swings Republican with U.S. campaign donations

Nichola Groom

(Reuters) - U.S. solar and wind energy companies have donated far more money to Republicans than Democrats in congressional races this election cycle, according to a Reuters analysis of campaign finance data, an unprecedented tilt to the right for an industry long associated with the environmental left.

While the money is modest compared with that donated by fossil fuel interests, the support provides GOP candidates with added credibility on clean energy, an issue polling shows swing voters care about.

Renewable energy has typically depended on government subsidies and policies to help fuel its growth, and the donations come at a time when Republicans control both houses of Congress as well as a majority of state houses across the country. Republicans have so far left subsidies for the industry largely intact.

Comprehensive coverage of the global oil and gas markets, policy changes impacting climate and the latest developments in solar and wind power.

“We support those leaders who share our vision,” said Arthur Haubenstock, vice president of policy and strategy at 8minutenergy Renewables LLC, a California-based solar project developer, and treasurer of a newly formed employee-funded political action committee that shares the company’s name. So far, the PAC has donated only to Republicans.

Overall, political action committees representing solar and wind companies have donated nearly $400,000 to candidates and PACs in the 2018 election cycle, including $247,000 to Republicans, $139,300 to Democrats, and $7,500 to independents, according to the Reuters analysis.

That marks a record. During the 2016 presidential elections, the first cycle during which the clean energy industry gave more to the GOP than to Democrats, Republicans received just over half of the combined $695,470 in political contributions from major wind and solar PACs.

Before that, solar and wind companies mainly donated to Democrats, who were broadly seen as more supportive of policies that could help the nascent sector grow. In 2014, 70 percent of the contributions from seven major wind and solar PACs went to Democrats.

The U.S. solar and wind industries have expanded greatly over the last decade and now employ some 300,000 workers nationwide, nearly six times more than coal mining. The hottest growth has been in states that voted heavily for President Donald Trump in 2016.

That has helped strengthen the industry’s appeal to Republican lawmakers, allowing it to rebrand as a jobs engine for the heartland, instead of as a tool for combating global warming, an issue that played better with Democrats.

“Solar is creating economic activity in so many districts,” said Abigail Ross Hopper, head of the Solar Energy Industries Association (SEIA), whose donations have tilted heavily toward the GOP. Global warming “is certainly not our lead talking point,” she said.

SEIA has contributed more than twice as much to Republicans as to Democrats this cycle, $56,500 versus $26,700.

Prior to 2016, SEIA’s contributions to Democrats were reliably double what the group gave to Republicans. The American Wind Energy Association’s PAC, too, has shifted its giving. In 2014 it gave Democrats twice as much money as Republicans, while in the current cycle it has given $87,500 to Republicans compared to $67,500 for Democrats.


Polls have found widespread support for renewable energy among voters, including among Republicans. Most recently, a Gallup poll from early March found 73 percent of adults favor an emphasis on alternatives like wind and solar over traditional fossil fuels. Just over half of Republicans - 51 percent - favored alternatives, compared with 88 percent of Democrats, Gallup said.

Among moderate Republicans and voters who lean Republican, there is even wider support for renewable energy. A poll conducted by Pew Research Center in early 2017 found that nearly two-thirds of that group favored alternative energy sources over fossil fuels.

The polls also found that attitudes toward clean energy are not necessarily linked to those about climate change. The Gallup poll, for instance, found just 35 percent of Republicans think climate change is caused by human activities, and 69 percent think the seriousness of global warming is exaggerated.

“Clean energy works every time and it doesn’t alienate the base,” said Jay Faison, chief executive of ClearPath, a group that aims to help elect Republicans supporting clean power. Independent-minded voters view support for alternative energy as a signal that a candidate is “not an errand boy for the party leadership,” he added.

Nevada incumbent Senator Dean Heller is among the chief Republican beneficiaries of support from the clean energy industry. His re-election effort has drawn more than $15,000 in backing from solar and wind this election cycle.

Nevada ranks fourth in the nation in solar installations and generates more than 11 percent of its electricity from the sun. One in every 203 people is employed by the solar industry in Nevada, putting it second only to ultra-green California.

Heller told Reuters he supports clean energy because of the jobs it has brought to his state. Nevada has been able to attract employers like Tesla, he said, in part because its abundant sunshine can produce renewable power for factories and other business operations.

He added he doesn’t see a conflict with supporting both solar energy and fossil fuel interests: “I’m very pro ‘all of the above,’ and I think that’s where the GOP is,” he said.

Other Republicans receiving solar and wind donations include Kevin Brady of Texas, Carlos Curbelo of Florida, George Holding of North Carolina and Tom Reed of New York, all members of the House Committee on Ways and Means, which is responsible for writing tax policy.

All four hail from states with sizeable solar markets, and Curbelo is also co-chair of the Climate Solutions Caucus, a bipartisan group of House lawmakers working on policies to address climate change.

House Majority Leader Kevin McCarthy has received about $13,500 in contributions from the renewable energy PACs, and SEIA held a fundraiser for him in his California district last year. McCarthy’s congressional district, which includes a swath of the Mojave desert, boasts more solar capacity than any other district in the nation, most of it in large-scale projects for utilities.

Democratic and environmental groups downplay the clean energy industry’s shifting financial support, saying companies are simply trying to protect their interests by supporting the party in power.

“AWEA and SEIA are trade associations representing the financial business interests of their member companies,” said Sara Chieffo, vice president of government affairs at the League of Conservation Voters, which has found itself at odds with solar and wind PACs on many candidates, including Heller.

The league gives Heller rock-bottom marks for his environmental voting record and has endorsed his Democratic opponent, Jacky Rosen, in the Nevada senate race.

Most environmental groups still primarily back Democrats. The League of Conservation Voters, for example, has contributed $1.3 million to Democratic congressional candidates this cycle but has not supported a single Republican. Billionaire Tom Steyer’s NextGen Climate Action Super PAC has spent tens of millions of dollars in recent election cycles on campaigns against Republicans and for Democratic candidates.

Democratic National Committee spokeswoman Sabrina Singh questions why solar and wind companies would support Republicans over Democrats.

Republicans, she said, “consistently seek to defund efforts to promote clean energy,” while “Democrats at both the federal and state level have been fighting to promote renewable energy.”

The Republican National Committee did not respond to a request for comment.

Lab-Grown Meat Is Getting Cheap Enough For Anyone To Buy

A company called Future Meat Technologies, which just got a large investment from meat giant Tyson Foods, is perfecting a process that could soon see us grilling meat grown in a bioreactor.


In 2013, producing the first lab-grown burger cost $325,000. By 2015, though the cost had dropped to around $11, Mark Post, the Dutch researcher who created the burger, thought that it might take another two or three decades before it was commercially viable. But the first so-called “clean meat,” produced from animal cells without an actual animal, may be in restaurants by the end of 2018.

The Israel-based startup Future Meat Technologies aims to begin selling its first products later this year. The startup’s costs are still very high–around $363 a pound–but it believes that it can cut the cost of cellular agriculture to about $2.30 to $4.50 a pound by 2020. (Post’s $11 burger came in at $37 a pound; as of April, the average wholesale value of beef in the U.S. was $3.28 a pound, though no directly comparable production cost is available.) Today, the startup announced a $2.2 million seed round of investment, co-led by Tyson Ventures, the venture capital arm of the meat giant Tyson Foods.

“Right now, growing cells as meat instead of animals is a very expensive process,” says Yaakov Nahmias, founder and chief scientist of Future Meat Technologies. The startup’s new process is designed to reduce that cost in a few ways. The biggest expense in cellular agriculture is the medium–made of sugars, salts, and amino acids–used for growing cells, which typically has to be replaced as the cells grow. The startup uses a process that cleans and recycles the medium, similar to the way that an animal’s liver and kidneys clean and recirculate blood.

The process also avoids using serums, which are made from animal blood, and which have been used by other companies working in the field, and are both expensive and unappealing to consumers who want to avoid animal products entirely. In addition, rather than using the same type of huge bioreactors that are used in the pharmaceutical industry–and are also very expensive–the company plans to use small units that can be distributed to existing farms.

“If we start small and stay small, we can essentially dramatically reduce the cost, and the capital burden drops by an order of magnitude or more,” Nahmias says. “With these two plays–a more efficient bioreactor and a distributed manufacturing model–we can essentially drop the cost down to about $5 a kilogram [$2.27 a pound]. This is where it starts getting interesting, because the distributed model also allows you to use the current economics.”

Farmers, he suggests, could begin to shift from animal agriculture to cellular agriculture. “These distributive models allow us to grow organically and essentially replace chicken coops with these bioreactors,” he says. “This, I think, is a reasonable way of actually taking over and replacing this industry sustainably.”

The company plans to supply farmers with a small collection of cells or a piece of tissue roughly the size of a coffee capsule, along with the nutrients to feed the cells and the equipment for growing them (the platform can use cells from any animal). Ten to 18 days later, after the tissue has grown, it will be sent to processing plants where it can be turned into “clean meat” for consumers. Turning protein into something with the shape, texture, and mouthfeel of meat has become relatively easy, he says, as companies like Beyond Meat and others have shown with soy and other plant-based proteins.

To address flavor, Future Meat Technologies creates lab-grown fat cells along with muscle cells (it first grows connective tissue cells, which can be induced to turn into either fat or muscle cells). “Most companies today are growing muscle cells . . . they’re producing this mass of protein, but one of the things that’s missing is fat,” says Nahmias. “Fat is what we really crave. Fat is what gives meat its distinct aroma and flavor.” The end result tastes like meat from an animal, with the same nutrition–although because the process is controlled, it’s possible to tweak the ratio of saturated to unsaturated fat to make it healthier.

If a move away from animals seems unlikely, Nahmias argues that it’s necessary as the world adapts to a growing demand for meat at the same time as traditional meat production strains the environment. A quarter of the world’s land, apart from Antarctica, is used for pasture; most deforestation in the Amazon basin happened because of cattle ranching. Livestock is responsible for an estimated 14.5% of all human-caused greenhouse gas emissions. Producing a single pound of beef can take around 1,800 gallons of water. As the world population swells to 10 billion by 2050, we’ll need to produce 70% more food calories–and between 2006 and 2050, demand for animal-based protein is expected to grow 80%.

“Cultured meat is a transformative concept in general,” Nahmias says. “We’re running out of land, we’re running out of water resources, and if you want to continue feeding a growing population not only in the West but also in China and India, where people are moving toward Western-style diets in increasing numbers, then we need to fundamentally change the way we produce meat.”

In 2017, China made a $300 million trade deal with Israel to collaborate on clean-tech projects, including both clean energy and clean meat. “There is a strong interest in China to move into sustainable agriculture in multiple ways,” he says. “This trade deal is definitely part of it and we can definitely take advantage of it if we get to market fast enough.” Chinese VC firm Bits x Bites, the country’s first food tech accelerator, is one of Future Meat Technologies’ investors.

Tyson, which is one of the world’s largest food companies and is known for chicken, sausage, hot dogs, and other meats, doesn’t expect traditional meat production to go away. But the company, which also invested in the lab-grown meat company Memphis Meats, sees an opportunity for new kinds of protein. “It will be a part of the story, over time, in our estimation,” says Justin Whitmore, executive vice president of corporate strategy and chief sustainability officer of Tyson Foods. “The biggest question is how much time, when you’re talking about lab-grown meat.”

One key to widespread adoption, of course, is cost. The Good Food Institute, an organization that focuses on alternatives to animal products, has done research into the financial viability of the clean meat sector as a whole. “I was tasked with figuring out if this clean meat thing is worthy of our time or should we put all of our focus into plant-based meat,” says Liz Specht, the nonprofit’s senior scientist. “I actually came into that exercise quite skeptical because I’ve done animal cell culture for biomedical R&D, and I knew what the high costs were associated with that. Through doing this analysis I actually did end up quite optimistic.”

Other startups in the space, she says, are also finding ways to drive down the cost of production, though because the products are not yet in production, companies’ specific cost projections are still highly speculative. At least one other company–Just, formerly known as Hampton Creek–also plans to launch its first clean meat product by the end of 2018. The company is aiming for a price within 30% of the conventional product (other companies are also working on recycling the medium and eliminating the use of serum).

Nahmias believes that clean meat will eventually be cheaper than traditional meat from animals. Along with other advantages, such as avoiding the risk of contaminants like salmonella, and eliminating the need for antibiotics, which are heavily used in animal agriculture today, he thinks that clean meat could someday largely replace the traditional version.

“That’s not to say that there are not going to be specialty restaurants producing meat traditionally–more expensive restaurants–but I think the burgers that we’re going to put on the grill, and the chicken nuggets that we’re going to eat at McDonald’s, and the barbecued chicken that we’re going to eat in Chipotle is mainly going to be cultured meat decades from now,” he says.

UN forest accounting loophole allows CO2 underreporting by EU, UK, US

by Justin Catanoso on 2 May 2018

Emissions accounting helps determine whether or not nations are on target to achieve their voluntary Paris Agreement reduction goals. Ideally, the global community’s CO2 pledges, adjusted downward over time, would, taken together, help keep the world from heating up by 1.5 degrees Celsius (2.7 degrees Fahrenheit) by 2100 from a 1900 baseline.

But scientists are raising the alarm that this goal may already be beyond reach. One reason: a carbon accounting loophole within UN Intergovernmental Panel on Climate Change (IPCC) guidelines accepting the burning of wood pellets (biomass) as a carbon neutral replacement for coal — with wood now used in many European Union and United Kingdom power plants.

Scientists warn, however, that their research shows that replacing coal with wood pellets in power plants is not carbon neutral. That’s partly because burning wood, which is celebrated by governments as a renewable and sustainable energy resource, is less efficient than coal burning, so it actually produces more CO2 emissions than coal.

Also, while wood burning and tree replanting over hundreds of years will end up carbon neutral, that doesn’t help right now. Over a short timeframe, at a historical moment when we require aggressive greenhouse gas reductions, wood burning is adding to global emissions. Analysts say that this loophole needs to be closed, and soon, to avoid further climate chaos.

Pine forests throughout the southeastern US, especially in North Carolina, are farmed, pulped and turned into wood pellets that are shipped overseas.

For the past ten years, Mary Booth, an ecologist with the Partnership for Policy Integrity in Pelham, Massachusetts, has immersed herself in the complex, nuanced, politically charged world of international carbon emissions accounting models as if the planet’s fate depends on it.

In many ways, it does.

Booth studies how countries count and report their emissions. In particular, she evaluates whether generating energy via the burning of wood pellets, or biomass, puts less carbon into the atmosphere than burning coal. In a rising trend, countries, especially in the European Union and United Kingdom, are converting existing coal-fired power plants to burn wood — a renewable, albeit controversial, fuel source.

Emissions accounting helps determine whether or not nations are on target to achieve their voluntary Paris Agreement reduction goals. That agreement also represents the global community’s pledge to keep the world from warming more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above a 1900 baseline by 2100 (we’ve already warmed 1 degree Celsius).

Emissions tallies are reported regularly to the United Nations Intergovernmental Panel on Climate Change, or IPCC. But those figures aren’t just numbers on paper or political aspirations. The data, if accurately calculated, tell us how much greenhouse gas nations are actually putting into the air, and those combined totals help us know whether we are on target to avert climate change catastrophe.

Booth is darkly pessimistic — a price she pays for knowing too much, she told me.

“This is a message that no one has said yet. It’s what I believe to be true: there may not be a pathway to 1.5 [degrees] anymore — at all. Carbon capture and storage is a fantasy,” Booth told me in a series of interviews for Mongabay. “Growing forests may not work fast enough. We’re not reducing emissions fast enough. The sooner that story gets told, the sooner people understand what’s really required to keep the earth from burning up.”

Is this the hyperbole of a passionate but potentially cynical climate researcher? Perhaps. But as the saying goes, nature is indifferent to our opinions; it only responds to our actions.

Some of those actions to curb climate change have been positive: wind and solar continue to grow, and those technologies have never been cheaper per kilowatt hour. Coal use continues to fall. Electric cars are growing in popularity, especially in China.

But tellingly, global temperatures continue rising year after year. And in 2017, planetary carbon emissions hit a record high, even as major industrialized nations, like the United States, reported reduced carbon emissions, according to IPCC statistics.

Nature’s response? More extreme weather. More frequent and ferocious storms. More drought. More wildfires. More Arctic ice melting, along with Antarctic ice melt, too. More sea level rise. More coral reefs dying. More ecosystems imperiled.

Among the largest users of woody biomass for energy production are the Drax power stations in the United Kingdom. Shown here is the so-called Drax biomass dome, which once burned coal. The UK has nearly eliminated burning coal for energy, cutting its official United Nations IPCC emissions, but is ramping up its burning of woody biomass.

A blind eye to carbon cheating?

Booth’s research — Not carbon neutral: Assessing the net emissions impact of residues burned for bioenergy, published this February in the journal Environmental Research Letters — helps answer some thorny questions critical to our energy and carbon future.

Her study examines the net CO2 emissions of biomass burned to replace coal at the UK’s massive Drax power stations and other EU power plants. Combined, those energy facilities consume millions of tons of wood pellets each year.

One major finding, right out of the gate: Booth reports that — contrary to a largely accepted view — wood pellets aren’t sourced mainly from fallen limbs and lumber waste called residue, but rather from whole trees. However, she based her study on residue-derived wood pellets anyway because the biomass industry “so often claims residues are a main pellet source.”

Even based on the false assumption that only wood waste, not whole trees, is being burnt, Booth found that “up to 95 percent of cumulative CO2 emitted [by the biomass burning power plants] represent a net addition to the atmosphere over decades.” In other words, biomass is not carbon neutral.

More disturbing: Booth’s research opens up the IPCC to charges that its policymaking decisions regarding emissions accounting have been politicized — crafted by negotiators to include built-in loopholes that allow nations to underreport certain emissions while appearing to achieve their carbon-reduction targets.

The optimism following the signing in December 2015 of the landmark Paris Agreement has waned as the world’s largest carbon emitters have moved slowly to meet their Paris carbon reduction pledges. But even if those pledges were met, IPCC carbon counting loopholes could prevent the Paris Agreement from effectively curbing emissions and the worst impact of climate chaos. Image by Justin Catanoso

In particular, both the UK and EU appear to have slipped through a large loophole in order to “disappear” real emissions from their carbon accounting, as one source told me, thus undermining the Paris Agreement’s critically important carbon-mitigation strategies.

“Why does the IPCC appear to accept inaccurate emissions accounting?” Professor Doreen Stabinsky asked me, then answered: Because “IPCC scientists are technocrats. It is not a neutral body. There is a lot of politics behind the positions of individuals on the IPCC. Their meetings are often loudly political.” Stabinsky speaks from firsthand knowledge: she studies the nexus between environmental policy and politics at College of the Atlantic, Maine.

Bioenergy representatives, Stabinsky points out, are IPCC members and help write UN emissions guidelines. Likewise, countries with large areas of forest, such as the United States and Brazil, lobby to avoid counting or undercounting forest-related carbon emissions, including that from biomass burning.

Doreen Stabinsky is a professor of global environmental politics at the College of the Atlantic in Maine. She has closely observed the IPCC and its members responsible for approving international carbon-accounting models. “The IPCC isn’t a neutral body… Their meetings are loudly political,” she told Mongabay.

Repeated calls and emails seeking comment from US-based wood pellet-producing companies such as Maryland-based Enviva and its trade group, the American Forest and Paper Association, went unreturned.

As for the IPCC, its reports have tens if not hundreds of authors, and the panel has no official spokesperson. But contributing IPCC panelists have spoken out on this issue. William Moomaw, a Tufts University professor of international environmental policy and a former lead IPCC author, wrote a scathing report to the panel in 2011 called “The Myth of Carbon Neutrality of Biomass.” He also spoke forcefully on the topic directly to the European Union Parliament in 2015.

“The EU, including the UK, counts biomass used for electric power as carbon neutral by definition,” Moomaw wrote in 2011. “This means that biomass is counted on the same basis as solar or wind, which clearly are low-carbon sources of energy. This is not only incorrect, but ironic, given that developing countries that use wood for fuel that leads to deforestation is counted as contributing to climate change, while Europe and most states in the US count emissions from ‘modern biofuels’ as carbon neutral.”

Pine forests cut to provide wood pellets for power plants are replanted, so this energy resource could technically be called carbon neutral, but only over the long term. It might take hundreds of years for those new trees to become mature and for the carbon equation to balance out.

Another irony: The IPCC states on its website that its “guidelines do not automatically consider biomass used for energy as ‘carbon neutral,’ even if the biomass is thought to be produced sustainably.” Yet, IPCC carbon accounting models appear to ignore the panel’s own guidelines, a contradiction which brings us to the essence of Booth’s research, and to a better understanding of what she calls “the controversy in climate modeling.”

North Carolina, where I live, is nicknamed the Tar Heel State because of its abundance of longleaf, white and loblolly pine. It is a leading US producer of wood pellets from fallen limbs, tree tops, lumber mill waste, but mostly, farmed pine trees. Cargo vessels filled with 45,000 metric tons of wood pellets per shipment are sent to the UK and EU regularly and burned for energy.

All those trees and residue in North Carolina are counted as carbon emissions produced by the United States, with the assumption — built into IPCC accounting models — that the organic matter would eventually die, rot and decompose there anyway, thus releasing its stored carbon.

To avoid so-called “double counting,” when those wood pellets are burned overseas, the CO2 sent into the atmosphere over Europe is not counted because of another assumption: new trees are quickly replanted in North Carolina, thus theoretically and immediately soaking up the CO2 emitted across the Atlantic.

And there you have it — carbon neutrality. Except, as Booth fairly screams: “That’s a lie!”

“If in year one, you burn 10 tons of wood pellets, you put 10 tons of carbon into the atmosphere,” Booth explained. “If it were rotting, only a fraction would decompose [over that same period]. You are comparing two pots of carbon” without taking the timeline into consideration.
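
Booth’s timing argument can be put in numbers. Here is a minimal Python sketch of the two “pots of carbon” she describes; the 5%-per-year decay rate is an illustrative assumption for this example, not a figure from her study:

```python
# Compare year-one carbon release from burning 10 tons of wood pellets
# against letting the same wood decompose in the forest.
# The decay rate is an assumed, illustrative value.

def burned_carbon(tons_wood: float) -> float:
    """Carbon released in year one if the wood is burned: effectively all of it."""
    return tons_wood

def decayed_carbon(tons_wood: float, years: int, decay_rate: float = 0.05) -> float:
    """Carbon released after `years` of first-order decay at `decay_rate` per year."""
    remaining = tons_wood * (1 - decay_rate) ** years
    return tons_wood - remaining

wood = 10.0  # tons of wood pellets
print(burned_carbon(wood))                       # burning: all 10 tons in year one
print(round(decayed_carbon(wood, years=1), 2))   # decay: 0.5 tons in year one
print(round(decayed_carbon(wood, years=50), 2))  # decay: ~9.23 tons after 50 years
```

Whatever the exact decay rate, the shape of the comparison holds: combustion front-loads decades’ worth of emissions into a single year, which is the timeline problem the carbon-neutrality assumption glosses over.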

Put bluntly, the IPCC is allowing some creative accounting around biomass burning. This potentially makes us all victims of magical thinking, as we collectively agree that the CO2 from all those North Carolina trees is not flooding into the atmosphere over a short timeframe, thus contributing to climate change right now — at the moment when we most need to reduce carbon emissions.

Clouds of water vapor rising from the cooling towers of the Drax power station in the United Kingdom. Wood burned there in large amounts is considered carbon neutral for the country’s purposes of carbon emissions accounting, and reporting to the United Nations IPCC.

Rich Birdsey retired from the US Forest Service in 2016 after 40 years. Today, he is a senior scientist at Woods Hole Research Center in Massachusetts, where he studies the use of biomass for energy. He explained the problem with the IPCC’s creative biomass accounting further: newly planted trees take decades to absorb from the atmosphere the emissions created by just a few days of burning woody biomass.

“The carbon neutrality assumption at some point will be correct,” Birdsey said. “But it might be hundreds of years in the future when acres full of new trees mature. Yet, that idea of neutrality became fairly well ingrained, before people realized that emissions accounting was more complicated than that. The idea took hold 20 years ago, and countries and industries don’t want to give it up — especially those with a vested interest in wood and energy production.”

The news gets worse: both Booth and Birdsey explained that wood burning is not nearly as efficient in generating energy as coal burning. As a result, more wood is burned to produce the same amount of kilowatt hours. Burning wood, which is celebrated by governments as a sustainable energy resource, actually produces more CO2 emissions than coal.
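
The efficiency point can be sketched with rough numbers. The emission factors and plant efficiencies below are assumed, illustrative values, not figures from Booth or Birdsey, but they show why lower efficiency plus a higher combustion emission factor yields more CO2 per kilowatt hour:

```python
# Illustrative sketch: CO2 emitted per kWh of electricity for coal- vs
# wood-fired generation. Emission factors (kg CO2 per GJ of fuel energy)
# and plant efficiencies are rough assumed values for illustration.

KWH_PER_GJ = 277.78  # kWh of energy in one gigajoule

def co2_per_kwh(emission_factor_kg_per_gj: float, plant_efficiency: float) -> float:
    """kg of CO2 emitted per kWh of electricity delivered."""
    return emission_factor_kg_per_gj / (plant_efficiency * KWH_PER_GJ)

coal = co2_per_kwh(emission_factor_kg_per_gj=94.6, plant_efficiency=0.38)
wood = co2_per_kwh(emission_factor_kg_per_gj=112.0, plant_efficiency=0.33)

print(f"coal: {coal:.2f} kg CO2/kWh")  # ~0.90
print(f"wood: {wood:.2f} kg CO2/kWh")  # ~1.22
```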

“It’s true that the UK and EU are decreasing their emissions of fossil fuels in shifting to wood,” Birdsey said. “But they are not decreasing their greenhouse gas emissions. Yet with a claim of carbon neutrality, they are counting their emissions as if they are.”

The real danger, Booth warned: while wood burning now represents a small part of the UK, EU and US energy mix, “bioenergy is projected to increase 1,000 percent in the coming years. We’re talking about a massive upscaling of wood burning, as much as 9 gigatons a year. And IPCC models allow bioenergy as having the same emissions as wind and solar — zero. And that’s not true.”

To reemphasize the key point: nature is not being fooled.

Burning biomass for energy is a small part of the US energy production mix. But EPA Administrator Scott Pruitt has reminded state governors that President Trump signed an executive order in April 2017 declaring the burning of biomass “carbon neutral.” That’s a policy originally crafted by the Obama administration.

Interviews with former US Environmental Protection Agency officials revealed that the issue of biomass and carbon neutrality was a conundrum for the Obama Administration. The science was clear; the politics was murky.

Obama’s EPA, in creating the now-stalled Clean Power Plan, bowed to political pressure from biomass industry lobbyists and members of Congress. In a dubious compromise, the administration called for a five-year study of biomass and its supposed carbon neutrality, and in the meantime allowed continued biomass burning, with all emissions counted as zero.

“Sen. Angus King of Maine [a major timber-producing state] threatened to withdraw his support of the Clean Power Plan if biomass wasn’t deemed carbon neutral,” Booth told me. “And he claims to be a climate champion.”

In February, President Trump’s EPA administrator, Scott Pruitt, wrote a two-page letter to Republican Governor Chris Sununu of New Hampshire hailing the state’s burning of biomass for energy. He reminded Sununu that Trump’s executive order 13777 makes clear that, for emission-accounting purposes, burning biomass should be considered carbon neutral.

“As you and I both recognize,” Pruitt wrote, “continuing to be responsible stewards of our nation’s forests and lands while utilizing all domestic forms of biomass to meet our energy needs are mutually compatible goals.”

On April 23, 2018, Pruitt’s EPA went a step further, issuing “a statement of policy making it clear that future regulatory actions on biomass from managed forests will be treated as carbon neutral when used for energy production.”

Mary Booth is director of the Partnership for Policy Integrity in Pelham, Massachusetts. She has studied IPCC carbon-emissions accounting models related to burning woody biomass for a decade. Her recent study found that, far from being carbon neutral, burning woody biomass emits more CO2 per kilowatt generated than burning coal.

Is there hope?

Here’s where Mary Booth’s pessimism concerning the fate of the planet hits home: if the world’s largest carbon emitters — the US, UK, EU, and quite possibly China and India — “disappear emissions” through creative accounting loopholes tolerated by the IPCC, what hope is there for the Paris Agreement of slowing climate change with all its horrific consequences?

“It is depressing. But it’s reality,” said Stabinsky. “Life is political. The IPCC space is highly political. Climate change is a really difficult challenge we’re facing. It takes a long time to fix things. And governments don’t want to fix things they don’t think are broken. There is a reason to try to hide biomass emissions.”

Echoing Booth, I asked: Is the Paris pathway to 1.5 degrees Celsius hopeless?

“Actually, I still think we have a good chance,” Stabinsky responded. “I am a pessimist of the intellect and an optimist of the will. If we just look at the numbers, I think, ‘Wow, the human race is doomed.’ But as an optimist of the will, I think, ‘We can do it.’ We’ve ended slavery. We’ve ended apartheid in South Africa. We have made massive systems changes that have had huge economic impacts.

“Climate change at its essence is a political challenge. We have to completely retool our energy system, which will be massively disruptive. But we know what to do. The technology in wind and solar is available and dropping in price. We can do it. But no one is leading. It’s too easy to hide your head in the sand. Which is what’s happening with biomass accounting.”

OK, I agreed. Optimism is critical in the face of such a monumental global challenge, and there is historical precedent for a sea change in political will — with the Paris Agreement itself as evidence of that kind of shift. Also, developed nations have said they will strive to make major policy decisions at COP 24, the next UN climate summit, in Katowice, Poland, this December.

But, one last question: do we have enough time?

“I don’t know,” Stabinsky told me.

I don’t either, I responded.

“Right, nobody does,” she said.


Booth, M.S. (2018). Not carbon neutral: Assessing the net emissions impact of residues burned for bioenergy. Environmental Research Letters, 13(3), 035001.

Justin Catanoso, a regular Mongabay contributor, is a professor of journalism at Wake Forest University in North Carolina.

UN experts denounce 'myth' pesticides are necessary to feed the world

Report warns of catastrophic consequences and blames manufacturers for ‘systematic denial of harms’ and ‘unethical marketing tactics’

Damian Carrington

Tue 7 Mar 2017 06.17 EST First published on Tue 7 Mar 2017 01.00 EST

The global pesticides market is worth $50bn and companies lobby heavily to resist reforms and regulations.

The idea that pesticides are essential to feed a fast-growing global population is a myth, according to UN food and pollution experts.

A new report, being presented to the UN human rights council on Wednesday, is severely critical of the global corporations that manufacture pesticides, accusing them of the “systematic denial of harms”, “aggressive, unethical marketing tactics” and heavy lobbying of governments which has “obstructed reforms and paralysed global pesticide restrictions”.

The report says pesticides have “catastrophic impacts on the environment, human health and society as a whole”, including an estimated 200,000 deaths a year from acute poisoning. Its authors said: “It is time to create a global process to transition toward safer and healthier food and agricultural production.”

The world’s population is set to grow from 7 billion today to 9 billion in 2050. The pesticide industry argues that its products – a market worth about $50bn (£41bn) a year and growing – are vital in protecting crops and ensuring sufficient food supplies.

“It is a myth,” said Hilal Elver, the UN’s special rapporteur on the right to food. “Using more pesticides is nothing to do with getting rid of hunger. According to the UN Food and Agriculture Organisation (FAO), we are able to feed 9 billion people today. Production is definitely increasing, but the problem is poverty, inequality and distribution.”

Elver said many of the pesticides are used on commodity crops, such as palm oil and soy, not the food needed by the world’s hungry people: “The corporations are not dealing with world hunger, they are dealing with more agricultural activity on large scales.”

The new report, which is co-authored by Baskut Tuncak, the UN’s special rapporteur on toxics, said: “While scientific research confirms the adverse effects of pesticides, proving a definitive link between exposure and human diseases or conditions or harm to the ecosystem presents a considerable challenge. This challenge has been exacerbated by a systematic denial, fuelled by the pesticide and agro-industry, of the magnitude of the damage inflicted by these chemicals, and aggressive, unethical marketing tactics.”

Elver, who visited the Philippines, Paraguay, Morocco and Poland as part of producing the report, said: “The power of the corporations over governments and over the scientific community is extremely important. If you want to deal with pesticides, you have to deal with the companies – that is why [we use] these harsh words. They will say, of course, it is not true, but also out there is the testimony of the people.”

She said some developed countries did have “very strong” regulations for pesticides, such as the EU, which she said based their rules on the “precautionary principle”. The EU banned the use of neonicotinoid pesticides, which harm bees, on flowering crops in 2013, a move strongly opposed by the industry. But she noted that others, such as the US, did not use the precautionary principle.

Elver also said that while consumers in developed countries are usually better protected from pesticides, farm workers often are not. In the US, she said, 90% of farm workers were undocumented, and their consequent lack of legal protections and health insurance put them at risk from pesticide use.

“The claim that it is a myth that farmers need pesticides to meet the challenge of feeding 7 billion people simply doesn’t stand up to scrutiny,” said a spokesman for the Crop Protection Association, which represents pesticide manufacturers in the UK. “The UN FAO is clear on this – without crop protection tools, farmers could lose as much as 80% of their harvests to damaging insects, weeds and plant disease.”

“The plant science industry strongly agrees with the UN special rapporteurs that the right to food must extend to every global citizen, and that all citizens have a right to food that has been produced in a way that is safe for human health and for the environment,” said the spokesman. “Pesticides play a key role in ensuring we have access to a healthy, safe, affordable and reliable food supply.”

The report found that just 35% of developing countries had a regulatory regime for pesticides and even then enforcement was problematic. It also found examples of pesticides banned from use in one country still being produced there for export.

It recommended a move towards a global treaty to govern the use of pesticides and a move to sustainable practices including natural methods of suppressing pests and crop rotation, as well as incentivising organically produced food.

The report said: “Chronic exposure to pesticides has been linked to cancer, Alzheimer’s and Parkinson’s diseases, hormone disruption, developmental disorders and sterility.” It also highlighted the risk to children from pesticide contamination of food, citing 23 deaths in India in 2013 and 39 in China in 2014. Furthermore, the report said, recent Chinese government studies indicated that pesticide contamination meant farming could not continue on about 20% of arable land.

The latest weak attacks on EVs and solar panels

The powerful few who benefit from the fossil fuel status quo are exerting their influence

Dana Nuccitelli

Mon 4 Jun 2018 06.00 EDT Last modified on Mon 4 Jun 2018 06.01 EDT

Over the past two weeks, media attacks on solar panels and electric vehicles have been followed by Trump administration policies aimed at boosting their fossil fueled rivals.

Efforts to undermine solar power

The first salvo came via a Forbes article written by Michael Shellenberger, who’s running a doomed campaign for California governor and really loves nuclear power. Shellenberger’s critique focused on the problem of potential waste at the end of a solar panel lifespan when the modules must be disposed or recycled. It’s a somewhat ironic concern from a proponent of nuclear power, which has a rather bigger toxic waste problem.

About 80% of a solar panel module can be recycled, but some portions cannot, and create potentially hazardous waste due to the presence of metals like cadmium and lead. The Electric Power Research Institute notes that long-term storage of used panels until recycling technologies become available may be the best option for dealing with this waste stream. Ultimately, it’s an issue that will need to be addressed as solar panels become more widespread and reach the end of their 25-plus year lifespan, much like the issue of nuclear waste. But it’s an issue that we should be able to resolve with smart policies and technologies.

It’s also not a big near-term concern, unlike the urgent need to deploy low-carbon energy, or an immediate pollution problem like for example the environmental crises that result when oil rigs fail or coal barges sink into rivers.

Shellenberger also raised concerns about the possibility “that cadmium can be washed out of solar modules by rainwater.” But that’s only a problem for broken panels, which are relatively rare except perhaps in the wake of a natural disaster like a hurricane or earthquake. In a disaster area, leaching of metals from some broken solar panels is the least of a city’s problems.

In short, it’s valid to note that end-of-life solar panel recycling and disposal is an issue that we’ll have to address smartly, but unlike climate change, it’s not a big or urgent concern. Meanwhile, the Trump administration is planning to order grid operators to buy electricity from struggling coal power plants to extend their lives. This is the latest in the administration’s misguided campaign to save the dying coal industry, which can no longer compete in the free market against cheaper, cleaner alternatives. As a result, coal power plants have continued to shut down at a rapid rate. Those coal plants pose a much greater threat of pollution, including from heavy metals.

Denying the imminent transition to EVs

In the New York Times, conservative opinion columnist Bret Stephens devoted an editorial last week to attacking Tesla specifically, and electric cars in general. There are valid reasons to criticize Tesla – the company regularly falls short of its ambitious production goals – but Stephens’ piece went far beyond what’s fair. For example, it called the Tesla Model 3 “a lemon” because Consumer Reports initially did not recommend the vehicle due primarily to issues with braking distance during its tests. Within about a week, Tesla issued a software update to correct the braking problem, and the Model 3 earned its Consumer Reports recommendation. Worse yet, Stephens declared that gasoline is the fuel of the future:

The terrible idea is that electric cars are the wave of the future, at least for the mass market. Gasoline has advantages in energy density, cost, infrastructure and transportability that electricity doesn’t and won’t for decades.

That’s equivalent to saying in 1910 that horses have advantages over automobiles, and will continue to dominate the transportation market. The experts disagree. The International Energy Agency estimated that global electric vehicle ownership increased 54% from 2016 to 2017, to 3.1 million EVs, and projects that number will increase 40-fold to 125 million by 2030. Virtually every major automaker is developing electric cars. GM plans to launch 20 new all-electric models by 2023 and the company “believes in an all-electric future.” Nissan plans to launch 8 new all-electric models by 2020 and hopes to sell 1 million EVs per year by that date.

As for the infrastructure disadvantage, California, New York, and New Jersey are spending a combined $1.3bn on EV charging to address that problem. But more than 80% of EV charging happens at home. The Nissan Leaf and Chevrolet Bolt have 150- and 200-mile ranges, respectively, and the average American only drives 30 miles per day. An MIT study in 2016 found that EVs with a 74-mile range could meet 87% of American car owners’ needs with only overnight charging at home; Nissan and GM have doubled and nearly tripled that range. Infrastructure is no longer a big obstacle to EV adoption for many people.

Costs are difficult to compare; while electric cars receive federal and state rebates, fossil fuels receive tremendous subsidies, particularly in the form of deferred climate change costs. Recent research suggests that the ‘social cost of carbon’ is around $200 per ton, which equates to an effective subsidy of close to $2 per gallon of gasoline. Doing an apples-to-apples comparison by either accounting for current fuel prices and federal tax EV credits, or excluding tax credits but adding $2 per gallon to fuel costs, electric vehicles already have similar lifetime costs to comparable gasoline-powered sedans. And due to improving battery technology, the price per mile of EV range is falling rapidly.
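
The “close to $2 per gallon” figure checks out as simple arithmetic. A quick Python sketch, using the standard EPA estimate of roughly 8.89 kg of CO2 per gallon of burned gasoline and the article’s $200-per-ton social cost of carbon:

```python
# Back-of-the-envelope check of the "$200/ton social cost of carbon
# equates to roughly $2/gallon" claim from the article.

CO2_KG_PER_GALLON = 8.89       # kg CO2 from burning one gallon of gasoline (EPA estimate)
SOCIAL_COST_PER_TONNE = 200.0  # dollars per metric ton of CO2 (from the article)

subsidy_per_gallon = SOCIAL_COST_PER_TONNE * (CO2_KG_PER_GALLON / 1000.0)
print(f"${subsidy_per_gallon:.2f} per gallon")  # $1.78, i.e. "close to $2"
```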

The Trump administration is working on behalf of the fossil fuel industry to stall the transition to electric cars, advancing regulations that will freeze fuel efficiency requirements. However, California will be taking the Trump and Pruitt EPA to court in order to preserve the state’s ability to set its own vehicle pollution standards.

In a world where leaders in every country outside America accept the reality of dangerous human-caused climate change, these clean technologies are the future. Working to undermine their progress in the US will just allow competing countries like China to gain a bigger advantage in the growing green economy of the future.

U.S. industries ask Trump administration to endorse global hydrofluorocarbon deal

With EPA regulation overturned in court, companies seek political backing for reducing HFC use domestically

by Cheryl Hogue, JUNE 4, 2018

Fix-a-Flat is among U.S. product makers that have already stopped using HFCs.

A group of unlikely allies wants the Trump administration to require reduced U.S. use of a class of synthetic chemicals that are potent greenhouse gases, in line with an international agreement.

The organizations lobbying for the restrictions include rival international chemical producers Arkema, Chemours, Honeywell, and Mexichem Fluor as well as a U.S. business that builds refrigerated trailers for shipping. Then there are environmental advocates, domestic manufacturers of home refrigerators, and a company that makes aerosol tire-inflating products.

The organizations hold different views on how the U.S. should implement a 2016 treaty on hydrofluorocarbons (HFCs)—chemicals used as refrigerants, aerosol propellants, and blowing agents that puff up plastic into foams. But they are delivering a unified appeal to the Trump administration, which has generally disparaged international deals, from trade pacts to the Paris Agreement to combat climate change. Their message is simple: Back the global accord on HFCs.

HFCs replaced chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons, two classes of chemicals that were used for the above applications and that break down stratospheric ozone. In contrast, HFCs are harmless to the ozone layer but have large potential to contribute to global warming.

To address the climate change threat from continued use of HFCs, countries met in Kigali, Rwanda, in 2016 and set a global schedule to phase down use of these greenhouse gases. They acted by tacking an amendment onto a 1989 treaty, the Montreal Protocol on Substances That Deplete the Ozone Layer.

Implementing the Kigali Amendment on HFCs hasn’t been high on the Trump administration’s agenda. To do so, the administration must seek formal consent from the Senate to ratify the international treaty. Besides, the U.S. already had a plan in place for ratcheting down use of HFCs—a 2015 regulation from the Environmental Protection Agency under former president Barack Obama.

But HFC manufacturers Mexichem Fluor and Arkema challenged that rule in court. They argued that the agency improperly relied on part of the Clean Air Act that requires manufacturers to replace substances that deplete stratospheric ozone with safer substitutes. Last August, a federal court agreed with them and struck down the rule, delivering a blow to EPA and its supporters in the case, Honeywell International, Chemours, and environmental activists. Honeywell and Chemours make HFC alternatives.

The decision threw the market for HFCs and alternatives into a whirl of uncertainty about whether, when, and how the U.S. will pare back the use of HFCs. At a recent public meeting with EPA, representatives from a variety of industries warned that U.S. businesses might lose money from investments in HFC-free technology because the 2015 regulation is kaput.

For instance, Fix-a-Flat, a company that produces aerosol cans that can inflate vehicle tires, has already switched from HFCs, marketing its products as containing a “zero global warming potential propellant.”

But some makers of other aerosol products are wondering whether they can switch from newer, more climate-friendly chemicals back to HFCs, said Douglas Raymond, a consultant who spoke at the EPA meeting on behalf of the National Aerosol Association.

Companies can now legally use HFCs in aerosols sold in the U.S., responded Cindy Newberg, who manages EPA’s program that approves the use of chemicals as substitutes for ozone-depleting substances such as CFCs. However, she warned that EPA will eventually restrict HFCs through a new regulation, although the agency is just in the starting phase of creating that rule.

While EPA works on the rule, the Trump administration should reduce uncertainty by endorsing the Kigali Amendment as a political signal that the U.S. still plans to crank back its HFC use despite the court ruling, said Kevin Fay, executive director of the Alliance for Responsible Atmospheric Policy. That group represents manufacturers, businesses, and trade associations that make or use fluorinated gases, including fluorochemical producers AGC Chemicals, Arkema, Chemours, Daikin, Honeywell, Mexichem Fluor, and Solvay. Individual representatives of Arkema and Mexichem Fluor, the companies that successfully argued in court against EPA’s HFC regulation, also said at the meeting that those companies back U.S. participation in the Kigali Amendment even though they questioned the agency’s legal basis for the rule.

Officials throughout the administration, including representatives from the Commerce Department and the military, have held discussions on the Kigali Amendment, said William Wehrum, who leads EPA’s Office of Air & Radiation. They are trying to reach a well-reasoned, deliberative decision on whether to back the global HFC deal, he said.

“You have billions of dollars on the line,” Fay stated. “If you don’t speak up soon, that investment from our companies [in HFC-free technology] is going to go elsewhere.”

President Donald J. Trump has often chosen to change or withdraw from—rather than embrace—international agreements. But failure to support the Kigali Amendment would work at odds with some of his priorities, such as expanding U.S. jobs and cutting the country’s trade deficit, warns a recent study by the Alliance for Responsible Atmospheric Policy and the Air-Conditioning, Heating & Refrigeration Institute. Domestic implementation of the Kigali deal is expected to create 33,000 U.S. jobs in manufacturing by 2027, the study says. At the same time, working with other countries to decrease global HFC use would boost the U.S. share of the world market in heating, ventilation, air-conditioning, and refrigeration equipment to 9.0%, up from 7.2% in 2016, thus increasing the nation’s exports. Not embracing the Kigali deal will mean economic losses for the U.S., the study finds.

Meanwhile, the lack of a federal path to reduce HFCs is leading to a situation that industry detests—regulation of HFCs by individual states, Fay and others said. California is already moving to ban HFCs in certain new equipment, such as frozen-yogurt dispensers, and in foams as a backstop to the overturned EPA rule.

Many sectors fear California’s move will lead to a patchwork of state requirements for HFCs, causing headaches for manufacturers that sell products across the U.S. The federal government, rather than individual states, needs to set policy on HFCs, said Helen Walter-Terrinoni, who works on regulatory strategy for Chemours.

Environmental groups, including the Natural Resources Defense Council and others, have asked the U.S. Supreme Court to review the decision that overturned EPA’s HFC regulation. However, the nation’s highest court receives far more requests for review than it accepts. If the Supreme Court does accept the HFC case, a ruling may not emerge for nearly a year—and even then, the Supreme Court might agree with the lower court’s ruling.

Overall, the situation is “a mess” that may take Congress and EPA several years to rectify, predicted Charles A. Samuels, general counsel of the Association of Home Appliance Manufacturers. The association in 2016 set voluntary goals to stop using HFCs in foam insulation by 2020 and in household refrigerators and freezers after 2024.

The U.S. Senate will have to grant formal agreement to the Kigali Amendment. Both houses of Congress will have to pass legislation giving EPA new authority to implement an HFC phasedown. Then the agency will have to go through a time-consuming procedure to issue a regulation to implement the agreement, a process that usually takes about three years. In the meantime, Samuels said, states have ample opportunity to create a hodgepodge of HFC regulations.

This, Samuels concluded, is “a rather dismal scenario.”

Pennsylvania, for first time, sets methane requirements on natural gas wells

JUNE 07, 2018 | 06:02 PM

Wolf touts the rules, which apply only to new wells

Reid Frazier, Joe Ulrich / WITF

Methane leaks throughout the entire process of developing natural gas: from well sites to storage and processing facilities to pipelines.

Pennsylvania has joined a growing number of states in issuing new methane requirements for natural gas wells.

On Thursday, the Pennsylvania Department of Environmental Protection said it has finalized permits for new natural gas wells and processing facilities.

“We are uniquely positioned to be a national leader in addressing climate change while supporting and ensuring responsible energy development, while protecting public health and our environment,” Gov. Tom Wolf said in a news release.

Pennsylvania joins states like Ohio and Colorado, which already have methane requirements for the oil and gas industry. Methane is a powerful greenhouse gas, and is the main component of natural gas. The oil and gas industry is the largest source of methane in the U.S.

The permits apply to new and modified wells and related compressor stations. The number of new wells drilled in Pennsylvania fluctuates every year, but is generally between 500 and 1,000. DEP records show the gas industry drilled 809 unconventional gas wells in 2017.

Wolf has said he will issue rules for the state’s 11,000 existing Marcellus shale gas wells, but hasn’t proposed any yet.

The permits for new wells will mandate companies do more frequent leak detection and repair at well pads than currently required by state or federal law, and allow the DEP to penalize companies that don’t comply.

“That’s why it’s critical the state is taking the lead on controlling methane pollution,” said Robert Routh, staff attorney at Clean Air Council, who praised Wolf for finalizing the new permits.

Leann Leiter of the environmental group Earthworks said the permits will help control not only methane, but other hazardous air pollutants, like benzene, a known carcinogen, which can be released alongside methane at a leaky well.

“We certainly are going to be impacted and are being impacted by global climate change, but I think it’s really key to realize that these permits are (going to be good) for Pennsylvanians,” Leiter said.

Leiter said the state needs similar rules that govern the state’s existing shale wells.

“Those are emitting significant amounts of pollution on a daily basis and so we need those sources to be regulated in a similar way,” Leiter said.

The question of how much methane and other pollutants are escaping from oil and gas wells has been hotly pursued by scientists in recent years. According to the United Nations Intergovernmental Panel on Climate Change, methane is up to 86 times more potent at trapping heat in the atmosphere than carbon dioxide over a 20-year period.
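That 20-year multiplier lets a methane release be expressed in carbon dioxide-equivalent terms. A minimal arithmetic sketch in Python; the 86× factor is the IPCC GWP-20 value cited above, while the 1,000-ton leak size is a purely hypothetical input:

```python
# Express a methane release in CO2-equivalent terms over a 20-year horizon.
# GWP20 = 86 is the IPCC figure cited in the article; the 1,000-ton
# leak below is illustrative, not a reported emission.
GWP20_METHANE = 86

def co2_equivalent(methane_tons: float, gwp: float = GWP20_METHANE) -> float:
    """Convert tons of methane to tons of CO2-equivalent."""
    return methane_tons * gwp

if __name__ == "__main__":
    leak = 1_000  # tons of methane (hypothetical)
    print(f"{leak:,} tons CH4 ~ {co2_equivalent(leak):,.0f} tons CO2e over 20 years")
```

The same one-line conversion underlies comparisons like the DEP emissions figures discussed below, though agencies sometimes use the 100-year multiplier instead, which is much lower.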

The DEP’s most recent data shows overall methane emissions from Pennsylvania’s shale gas industry are rising, while other air pollutants from the industry are falling.

Scientists at the Environmental Defense Fund recently calculated that Pennsylvania’s Marcellus shale industry is emitting twice as much methane as companies are reporting.

In a statement, Marcellus Shale Coalition President Dave Spigelmyer said the gas industry is working to keep leaks of methane to a minimum.

“Despite this positive and continued progress, we remain concerned about imposing additional requirements through operating permits, particularly those that exceed DEP’s statutory authority,” the statement said.

The group did not immediately respond to questions about whether it would sue to block the new permits.

Canadian company testing system to convert carbon dioxide into fuel


Thu., June 7, 2018

A Canadian startup is testing a system that sucks carbon dioxide from the air and converts it into fuel for cars and other vehicles.

Carbon Engineering’s technique combines several common manufacturing processes and will eventually be able to produce fuel for about $4 (U.S.) a gallon, according to David Keith, a Harvard University professor and co-founder of the company.


With oil prices climbing and U.S. gasoline following suit, that’s a level that could make this alternative fuel competitive. Many companies have developed ways to make fuel from plants, trees, sugarcane waste and other substances instead of petroleum, but the challenge has always been the cost. Carbon Engineering’s technique was developed specifically to address this.

“This isn’t some new clever piece of science or weird chemical we synthesized in some fancy lab,” Keith said in an interview. “The key thing that the company’s done from the beginning is focus on doing this in a way that is industrially scalable.”

The “direct air capture” process starts with common industrial cooling systems and a solution that draws carbon from the air, according to a paper published Thursday in the journal Joule. The carbon is combined with hydrogen to make motor fuel, through a technique used at pulp mills. The most expensive part is the electricity used to extract hydrogen from water.

While Carbon Engineering’s $4 price point is about 40 per cent higher than fossil fuels, it may be competitive in markets like California that require cleaner low-carbon fuels, Keith said. The company has been sucking carbon dioxide from the air since 2015 and producing fuel at a pilot plant in Squamish, B.C., since the end of last year.

Power plants and transportation are the biggest sources of greenhouse gas emissions blamed for climate change. Keith and other scientists are advocating for more resources to study geo-engineering techniques to manipulate the climate. Some argue it may be the only way to meet the Paris climate accord’s goal of holding the increase in the average global temperature “well below” 2 degrees C above pre-industrial levels.

Direct air capture of carbon dioxide could help governments meet pollution targets outlined in the Paris accords that otherwise are likely out of reach by just cutting emissions from power plants and cars, Keith said.

While past estimates of direct air capture have run as high as $1,000 a metric ton, according to the paper, Carbon Engineering’s technique comes in at $94 to $232 a ton.
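The per-ton capture costs reported in the paper can be translated into a rough CO2-feedstock cost per gallon of synthetic fuel. A back-of-the-envelope sketch; the ~8.9 kg of CO2 per gallon is an assumed gasoline-combustion figure, not from the article, and this covers only the captured CO2, not the hydrogen or synthesis steps:

```python
# Rough CO2-feedstock cost per gallon of synthetic fuel, derived from the
# capture costs reported in the Joule paper ($94-$232 per metric ton).
# Assumption (not from the article): one gallon of gasoline-like fuel
# embodies roughly the ~8.9 kg of CO2 that burning a gallon emits.
CO2_KG_PER_GALLON = 8.9  # assumed combustion/feedstock figure

def feedstock_cost_per_gallon(cost_per_ton: float) -> float:
    """CO2-capture cost attributable to one gallon of fuel, in dollars."""
    return cost_per_ton * CO2_KG_PER_GALLON / 1000.0

if __name__ == "__main__":
    for cost in (94, 232):
        print(f"${cost}/ton capture -> "
              f"${feedstock_cost_per_gallon(cost):.2f}/gallon of CO2 feedstock")
```

Under these assumptions the captured CO2 contributes roughly $0.84 to $2.06 per gallon, which is at least consistent in scale with the company's roughly $4-a-gallon target once electricity, hydrogen, and synthesis costs are added on top.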

“Our fundamental value proposition for fuel is that we can decarbonize any vehicle,” said Steve Oldham, chief executive officer of Carbon Engineering.

Oldham said he’s in talks with oil and natural gas companies interested in using his fuel in markets with carbon restrictions. A plant that can produce 2,000 barrels of fuel a day would cost about $300 million and take about 3 years to build.

“We have signed agreements with a U.S. company that has very cheap wind power,” Keith said. “They like the investment proposition of taking that now very-low-value wind power, and doing a high capital cost investment in us to make this fuel that’s a very high-value fuel. In some ways you can call it energy arbitrage.”

Chile Bans Plastic Bags at Retail Businesses

The ban is President Sebastián Piñera’s first legislative victory since taking office in March.

By Pascale Bonnefoy, June 1, 2018

SANTIAGO, Chile — Chile will become the first nation in the Americas to ban retail businesses from using plastic bags, an initiative aimed at protecting the country’s 4,000-mile coastline.

The measure, which Congress approved unanimously this week, gives large retailers and supermarkets six months to comply.

Small and medium-size businesses, including neighborhood shops, will have two years to abide by the new rules. In the meantime, they may hand out up to two plastic bags per client.

The measure is President Sebastián Piñera’s first legislative victory since taking office in March. Mr. Piñera expanded the scope of a bill introduced last year by his predecessor, Michelle Bachelet, that sought to prohibit the use of plastic bags in more than 100 coastal towns.

“We are convinced that our coast imposes an obligation to be leaders in cleaning up our oceans,” said Marcela Cubillos, the environment minister.

Almost 80 municipal governments around the country have already adopted restrictions over the past few years, while some beach and lakeside communities have outright bans.

Chileans use more than 3.4 billion plastic bags a year, according to the environment ministry, and most get dumped in landfills or make their way into the ocean.

The measure represents Chile’s latest effort to be a model on environmental issues. Congress passed legislation in 2016 to reduce waste and promote recycling, and Chile has overhauled its electricity sector in recent years to rely more heavily on renewable energy sources.

“It is a positive step,” Flavia Liberona, director of the environmental group Terram, said of the ban. “It opens the door to move forward and discuss other related problems, like the use of plastic food packaging and recycling.”

Hundreds of Golf Courses Tee Up to Help Monarch Butterflies

Golf course managers across North America have begun planting milkweed and wildflower habitat to attract butterflies, bees and … golfers.

May 31, 2018

(TROY, NY) Audubon International and Environmental Defense Fund (EDF) recently partnered to launch Monarchs in the Rough, a program to assist golf courses in the United States, Canada and Mexico in creating habitat for monarch butterflies and other pollinators in out-of-play areas.

The program first rolled out in January 2018 with a goal of enrolling 100 courses. Today, the program has far surpassed its initial goal by enrolling more than 250 courses. The program has set a new goal of enrolling 500 additional courses, and launched a new website to feature participating courses.

“The response from the golf community to helping pollinators recover from dramatic declines in recent years has been tremendous,” said Christine Kane, CEO of Audubon International. “Habitat loss is a key driver of the monarch butterfly’s decline, and golf courses are uniquely positioned to help create new habitat and turn things around for this iconic species.”

Golf course properties occupy approximately 2.5 million acres in the United States. Audubon International estimates there are at least 100,000 acres that have the potential to become suitable habitat for butterflies and bees, if managed appropriately.

Monarchs in the Rough encourages golf courses to adopt conservation practices such as planting milkweed and other wildflowers that monarchs need to breed and feed, in addition to changing mowing practices to support the timing of the monarch’s migration, and protecting sites from pesticide treatments.

“This program is not only helping turn things around for the monarch – it’s also an opportunity for the golf community to change the assumptions many people have about golf courses being unsustainable,” said Yank Moore, land manager for the Jekyll Island Authority and Golf Club in Jekyll Island, Georgia. “We have a real opportunity here to showcase the stewardship ethic of golf course managers and superintendents, and to educate the public about conservation practices that support monarchs and other pollinators.”

Monarchs in the Rough provides course superintendents and staff with the information they need to incorporate monarch habitat into the unique layout of each course.

“We bring the scientific expertise and the technical support, and the golf courses bring the land and the staff who are already well positioned to implement conservation practices,” said Daniel Kaiser, senior manager of habitat markets at EDF. “As an avid golfer and conservationist, I couldn’t be more excited about this partnership and the potential it has to help change the trajectory for the monarch butterfly.”

“I can’t wait to hear what our golfers have to say about these conservation efforts,” said Isaac Breuer, golf course superintendent at the A.L. Gustin Golf Course at the University of Missouri. Breuer is an early program participant who has incorporated wildlife habitat management into the university course since 2010.

“We know that our golfers notice and appreciate every effort we make to improve the natural beauty and sustainability of the course, because it makes the whole experience more enjoyable for both the golfer and the butterfly,” Breuer said.

For more information about Monarchs in the Rough, including a resource guide, please visit:

Mexico recyclers see opportunity in China's ban on imported waste


Mexico City — The head of recycling at Mexico's plastics industry association says the country should take advantage of China's ban on all imported waste material by increasing its recycling capacity.

"The China situation has opened a window of opportunity for Mexico, and we need to make adequate investments," Eduardo Martínez Hernández told a May 15 Asociación Nacional de Industrias del Plástico AC (Anipac) recycling forum in Mexico City.

Martínez, who is also the commercial director of a major collector of recyclable material in Mexico, cautioned it was imperative first to convince packaging manufacturers and others in the country not to contaminate plastics that have recycling potential.

Speaking after Martínez's presentation, Santiago Lobeira, commercial director of Mexico City-based Innovative Group México, told Plastics News his company would definitely strive to fill the gap left by China.

"There is a lot of potential in Mexico," Lobeira said. "But many companies have not invested because they don't have the resources."

He said his company is still in the process of educating its 30-or-so waste generators, adding: "We need to make society and business people aware that they should not pollute film, which is 100 percent recyclable, and we need to convince big companies with Anipac's help about this issue."

"Plastic is a great invention," he added. "The problem is we have used it in such a wasteful way."

Some packaging manufacturers, he said, combine plastic film and paper, which renders the plastic unsuitable for recycling.

"Mexican [recycling] companies don't have the infrastructure to get rid of pollutants," he said.

Asked whether the manufacturing companies he deals with will respond to the call for a more thoughtful use of plastics, Lobeira replied: "I think they will respond, especially if you affect their wallet."

Guadalupe Chávez, purchasing director of Plasticos Panamericanos SA de CV, a large maker of crates and pallets that uses recycled plastics, agreed with Lobeira, telling the forum that manufacturers in Mexico "must stop treating plastic scrap as garbage."

Innovative is a subsidiary of Houston recycler Avangard Innovative LP, which recycles the material that Innovative México and other affiliates ship to it.

Martínez is the president of Anipac's recyclers commission.

"If you have a couple of million dollars hidden away, now is the time to invest it," he jokingly told the forum audience of about 100.

China was the world's largest importer of waste until it announced a ban on 24 types of scrap, including plastic, starting in January. The new policy has left major waste exporters, such as the United States, Japan, the European Union and the United Kingdom, scurrying to find alternative destinations.


June 5, 2018. Read the full story at Waste360.

Textiles are now the fastest-growing type of waste, accounting for about 16.2 million tons in the U.S. Most of this waste is landfilled or incinerated, with about 16 percent of it recycled. Each year, Americans throw out more, a pattern that comes with large environmental and financial impacts, with municipalities especially feeling the consequences. Meanwhile, there is almost no infrastructure to manage these materials, with the biggest hurdles being a lack of sophisticated sorting methods and the absence of scalable collection systems.

“Textiles are outpacing almost every waste type in our stream,” says Marisa Adler, RRS senior consultant. “It’s grown 71 percent in 15 years, while the overall stream has grown by 6 percent. So, we have a textiles crisis, and it doesn’t seem to be subsiding because consumers continue purchasing at the same rate with clothes becoming more affordable and people keeping them and other textiles for less time. They are starting to treat them as disposables.”

A big question, according to Anne Johnson, vice president of RRS, is how we can recover more.

From boats to barns, Midwest salt bed could shift seafood market inland

Shrimp farming in Indiana

Based in Fowler, Ind., RDM Aquaculture produces about 500 pounds of shrimp a month.

Tony Briscoe, Chicago Tribune

Alongside East State Road 18, a two-lane rural byway that cuts through Fowler, Ind., agricultural machines rove over barren fields, stirring up dirt clouds as they prepare the land for growing season.

The drive is a looping sequence of grain silos, wind turbines and barns on a flat, far-reaching horizon. But at Karlanea and Darryl Brown’s farm, a sign breaks through the repetition, advertising “Fresh Farm Raised Shrimp.”

“I’m surprised there hasn’t been an accident out there,” Karlanea Brown said. “People literally slam on their brakes, back up their cars and come in to see what we’re about. ‘Really? You raise shrimp here?’”

On a sunny May morning, in the front of their shop, Karlanea Brown, 56, bags a pound of live shrimp by the handful, packing them with ice and handing them to a customer. For nearly eight years, the Browns have raised Pacific white shrimp in tarped swimming pools fixed with tubing and PVC pipes for aeration, a homemade system that won them the 2015 Indiana Innovation Award presented by Indianapolis-based nonprofit Centric Inc.

The Midwest has long been considered America’s breadbasket, a bountiful agricultural hub cultivating corn, soybeans, wheat and oats. Rearing cows and hogs has been a family tradition for some farmers who support a market of fresh meat and dairy.

Great Lakes states like Illinois support a moderate market for freshwater fish despite decades of overfishing and competition with invasive species. But be it shrimp or sea bass, the coasts have been the envy of the nation’s heartland when it comes to locally sourced seafood.

Scientists have been exploring the viability of raising saltwater species in landlocked states, an approach that might help the United States — a country that imports more than 80 percent of its seafood (fresh and saltwater) — scale back a $15.6 billion trade deficit for seafood, by far the largest deficit in the food sector and second only to crude oil among natural resources. Experts say it would also alleviate the pressures of overfishing at a time when species are being harvested at unsustainable rates and some habitats are increasingly being threatened by climate change.

The process of raising seafood, known as aquaculture, has become one of the fastest-growing food production industries in the world, and 2014 marked a significant milestone as it was the first time that more fish were farmed globally for human consumption than caught, according to the Food and Agriculture Organization of the United Nations.

“Half of the seafood we’re already eating is farmed,” said Julie Qiu, an oyster connoisseur and marketing director for Massachusetts-based Australis Aquaculture, which farms barramundi, an Asian sea bass, in the United States and Vietnam. “If we’re absolutely meeting demand, we won’t have enough wild seafood to feed everyone in the world. To put it in perspective, we should be having two servings of seafood a week, according to health experts. If we were to achieve that by only relying on wild catch, we’d need four planet Earths to sustain that. Farm-raised seafood has to step in and take on some of that volume.”

The U.S. ranked as the second-largest consumer and the third-largest producer of wild-caught seafood in the most recent United Nations report on global fishing and aquaculture commerce. However, the U.S. was listed 13th in aquaculture production, one spot below North Korea.

Production is scant in the Midwest, which produces less than 1 percent of the farm-raised seafood consumed in the U.S. Some operations have struggled to shoulder hefty operational costs, such as synthetic salt to replicate ocean waters.

Researchers say the solution to that problem could be right under their feet. About 540 million years ago, much of modern-day United States was the site of an ancient sea. Illinois, then located near the equator, was brimming with marine species.

While much of the seawater evaporated, researchers say considerable amounts of brine from that period remain trapped underneath Illinois. If drawn to the surface, this saltwater may be able to once again sustain saltwater species, establishing a locally sourced market in the Midwest and helping would-be farmers to drastically reduce the upfront costs, according to a study published by Illinois Sustainable Technology Center.

While this saltwater is five to six times saltier than the ocean in some places, researchers believe that if it were diluted, it would be suitable to raise striped bass.
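The dilution the researchers describe is simple to estimate with a mass balance. A sketch assuming a typical ocean salinity of about 35 grams of salt per liter, a standard oceanographic value that is not stated in the article:

```python
# Freshwater needed to dilute Midwest brine down to ocean salinity.
# Ocean salinity of ~35 g/L is an assumed standard value; the article
# says the brine is five to six times saltier in places.
OCEAN_SALINITY = 35.0  # grams of salt per liter (assumed)

def freshwater_per_liter(brine_multiple: float,
                         target: float = OCEAN_SALINITY) -> float:
    """Liters of freshwater per liter of brine to reach the target salinity."""
    brine_salinity = brine_multiple * OCEAN_SALINITY
    # Salt mass balance: brine_salinity * 1 L = target * (1 L + x L)
    return brine_salinity / target - 1.0

if __name__ == "__main__":
    for mult in (5, 6):
        print(f"{mult}x-ocean brine needs "
              f"{freshwater_per_liter(mult):.0f} L freshwater per liter")
```

Under these assumptions, each liter of brine would need roughly four to five liters of freshwater to reach ocean-strength salinity, before accounting for any treatment of contaminants.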

Saltwater from these aquifers is already brought to the surface regularly in oil and natural gas recovery, and some is generated from coal processing and other industrial activities. The cost of obtaining the water and possibly treating it for contaminants is unclear.

“It’s kind of strange,” Srirupa Ganguly, an engineer at the center, acknowledged. “Saltwater is not something someone would imagine in a landlocked state.”

There are about 70 freshwater aquaculture facilities licensed in Illinois, down from 101 in 2011, according to the Illinois Department of Natural Resources. The vast majority raise native species commonly stocked in lakes and ponds, like largemouth bass, bluegill and channel catfish. Tilapia, a fast-growing freshwater fish native to Africa and the Middle East, has been a popular choice for indoor operations. Eight of the 11 aquaculture facilities in Cook County, which has the largest number of operations, cultivate tilapia.

It’s unclear how many saltwater aquaculture operations are in the state because the Illinois Department of Natural Resources doesn’t require permits for facilities that raise species that cannot survive in freshwater, such as lobster, saltwater shrimp and oysters. Over the years, the department has fielded calls from those expressing interest in raising lobster and barramundi, according to Nathan Grider, an aquaculture program specialist for the department.

“There is quite a bit of interest in aquaculture because of the concerns with ocean fisheries being depleted,” Grider said. “So, there are public and even corporate interests in aquaculture technologies and tackling the food shortage issues. With the overarching issue of sustainable fisheries, I could see more opportunities for expansion inland where the production is really only a drop in the bucket compared to the potential.”

To this point, many of the aquaculture companies remain near the coasts or the Gulf of Mexico, some nurturing fish in large, open-water pens in the oceans, cages near shore and tanks close to the water. The high cost and limited availability of coastal land, in addition to negative effects on habitat from waste, could push aquaculture inland, researchers say.

In the early 1990s, when pork prices bottomed out, Darryl Brown, a lifelong hog farmer, began experimenting with farming tilapia. After years of trial and error, the Browns decided to shift their focus to shrimp, pooling together $500,000 from their inheritances for a six-tank system.

The early going was rough. The couple, who import fingerling shrimp from a hatchery in Florida, had failures of their pump and filter systems, a weather-related power outage and other issues, causing them to lose their entire first two batches and 70 percent of their third.

That motivated them to design their own system using regenerative blowers to pump air from outside. They’ve mostly staved off the cost of replenishing water and salts because they implement a closed-loop system with virtually no discharge. The farm also performs daily tests on water, gauging temperature, dissolved oxygen, salinity, alkalinity, nitrites, nitrates, ammonia, carbon dioxide and pH.
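Daily monitoring across that many parameters lends itself to a simple range check. A sketch of the idea; the acceptable ranges below are hypothetical placeholders for illustration, not RDM Aquaculture's actual operating targets:

```python
# Flag out-of-range readings in a daily water-quality test.
# All parameter ranges here are hypothetical placeholders,
# not RDM Aquaculture's actual targets.
RANGES = {
    "temperature_c": (26.0, 30.0),
    "dissolved_oxygen_mg_l": (5.0, 9.0),
    "salinity_ppt": (10.0, 20.0),
    "ph": (7.0, 8.5),
}

def out_of_range(readings: dict) -> list:
    """Return (parameter, value) pairs that fall outside their allowed range."""
    flagged = []
    for name, value in readings.items():
        low, high = RANGES[name]
        if not low <= value <= high:
            flagged.append((name, value))
    return flagged

if __name__ == "__main__":
    today = {"temperature_c": 28.1, "dissolved_oxygen_mg_l": 4.2,
             "salinity_ppt": 15.0, "ph": 7.8}
    print(out_of_range(today))  # flags the low dissolved-oxygen reading
```

In a closed-loop system with virtually no discharge, a check like this is what turns daily measurements into an early warning before a batch is lost.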

Today, RDM Aquaculture has expanded to 19 production tanks for shrimp, seven intermediate tanks, 10 nursery tanks, 25 crayfish tanks and a tilapia tank. The operation produces about 500 pounds of shrimp a month, some of which has been sold at Harvest Market in Champaign.

After nearly eight years in the shrimp business, RDM has helped set up 32 other shrimp farms across the country.

“I get asked all the time: Why are you helping set other farmers up?” Karlanea Brown said. “I’m not naive enough to think that I can produce all the shrimp in the U.S. I need help. With all the shrimp farmers coming up, a lot of people are worried about another shrimp farmer close by. They don’t understand, you can’t supply the whole market. I don’t care how large you are.”

About half of U.S. seafood imports are from aquaculture, according to the National Oceanic and Atmospheric Administration. In announcing $9.3 million in research grants for projects around the country to develop the nation’s marine and coastal aquaculture industry in October, Secretary of Commerce Wilbur Ross took aim at the disparity.

“This country, with its abundant coastline, should not have to import billions of pounds of seafood each year,” Ross said.

While the growing demand for seafood is undeniable, the aquaculture industry still faces many roadblocks, namely the stigma of farm-raised seafood.

Coastal shrimp farms have been blamed for destroying mangroves in several countries in Central and South America and using slave labor in Thailand. Some overseas operations have been known to use hormones for control of reproduction and growth, and antibiotics to treat diseases caused by excessive bacteria.

“People think farm-raised seafood is terrible. It’s actually not,” Karlanea Brown said, noting their shrimp are hormone- and antibiotic-free. “I always tell people when I get that comment, ‘You know what, I work darn hard to produce a clean product, because that’s my livelihood. I’m going to work harder than the guy who’s pulling it off the boat.’”

In recent years, seafood consumers have become more sensitive about what’s in their food and where it comes from, according to Sean Dimin, co-founder and CEO of Sea to Table, a supplier of traceable and sustainably caught seafood. Growing consumer knowledge has been driving change for seafood in the same way it brought about antibiotic-free chicken or grass-fed beef, Dimin said.

The company, which does business with Chicago-based restaurants including Frontera Grill and Girl & the Goat, touts a menu sourced exclusively from wild-caught, domestic fishing operations. Dimin, whose appreciation for seafood grew from his time fishing, said he remains skeptical of farm-raised products and the overall viability of aquaculture.
“You can trace the roots of aquaculture back to the ancient Romans who were farming oysters,” Dimin said. “But it hadn’t really been done at scale until the 1970s. Unfortunately, then, it was built for volume through an agri-industrial model, and there wasn’t enough regard for the environment. There are small companies, even some in the U.S., who are farming fish the right way — and that’s fantastic. But very few are doing it at scale.”

Some in the industry note that aquaculture has been a lifeline for wild fish, and that the two industries are now intertwined.

Between 80 and 90 percent of salmon caught in the Pacific Northwest started their lives in hatcheries, according to NOAA. Even with efforts to rebuild, wild stocks haven't been reliable, said Matt Mixter, owner of seafood shop Wixter Market in Wicker Park.

“It makes no sense to promote one or the other,” Mixter said. “One wouldn’t exist without the other. Both will be there in the future and, if we only relied on one, no one would be able to afford to eat seafood except the top 1 percent. (Fishermen) should all support aquaculture because it relieves the pressure on wild stock.”

In addition to overfishing, scientists are concerned about the growing amount of carbon dioxide in the atmosphere, which they suggest is warming waterways and lowering their pH, changes that can be disruptive for some species.

Seafood retailers, Mixter included, have had to closely monitor some species to make sure their products are being harvested at sustainable rates. Mixter said he’s traveled extensively scouting the products — farm-raised and wild caught — that he sells: mahi mahi from Ecuador, yellowtail from Hawaii, barramundi from Vietnam, Scottish salmon.

The primary benefit of having saltwater aquaculture facilities in the Midwest would be lower transportation costs and a lower carbon footprint. For now, the main deterrent for many businesses has been pricing, Mixter said.

A pound of shrimp from Southeast Asia could be priced at $6, whereas some shrimp operations would need to sell at about $13 a pound to break even, according to studies by Kwamena Quagrainie, an aquaculture marketing expert at Purdue University who has studied consumers’ interest in farm-raised seafood in the Midwest.

The tides have begun turning in the past decade, and now consumers are willing to pay more for locally sourced food, Quagrainie said.

“With the local food movement, the perception has changed,” Quagrainie said. “Local is the new term for organic; it’s the trendy thing around food.”

Despite the odds, RDM has pressed forward, most recently with a goal to start a shrimp and oyster hatchery in their attempt to become a one-stop seafood shop in the midst of corn and soy country.

“I’ve been told by so many people, ‘It can’t be done,’” Brown said. “I say, ‘Y’all told us we couldn’t raise shrimp, either.’ But here we are, 7½ years later and we’re still running.”

LEED must be updated to address climate change

Greg Kats

Thursday, May 24, 2018 - 1:45am

This is part one in a pair of stories, adapted from a set of comments published by the author on an online discussion group from October to April.

Over the last 20 years, LEED has become the dominant U.S. green building design standard and is the most influential standard of its kind globally. This is an extraordinary achievement and has made for healthier, more productive and greener homes, workplaces, schools, hospitals and public spaces for tens of millions of families, students and workers.

But LEED has not kept up with the accelerating urgency of climate change, or with the availability of low- and no-cost ways to deeply cut carbon. Steep declines in the cost of clean energy (residential solar, for example, costs 60 percent less than in 2010) have made it the cheapest electricity source in most states. The rapid growth in the ability to buy onsite and offsite solar and wind under a power purchase agreement (PPA) structure allows LEED building owners to buy carbon-free power at a fixed price at or below conventional utility rates. These onsite and offsite wind and solar options allow most LEED buildings to switch to 100 percent zero-carbon power at low or no cost, and in many cases can reduce the future cost of electricity.

However, in a world of accelerating climate change and fossil-fuel-funded denial, LEED has failed to maintain a carbon leadership role. LEED v4, the current version launched in 2013, was not stringent enough even for 2013, let alone for 2018. Many buildings receiving LEED Silver, Gold and even Platinum ratings deliver an anemic 15 or 20 percent reduction in energy use and CO2. Science dictates that serious green building standards today must deliver large reductions in CO2, and LEED must step up to this.

To address this urgent need, Kevin Hydes of the Integral Group, Emma Stewart of WRI, Mary Ann Lazarus of the Cameron MacAllister Group and I developed a proposal, "LEEDing on Climate Change," for adoption in the current LEED V4.1 upgrade process. The proposal would enable LEED to take a leadership role on climate change. It has been signed by more than 150 longtime green building leaders including David Gottfried, founder of both the USGBC and World Green Building Council, and LEED founder Rob Watson, and has been endorsed by groups including National Grid, Amalgamated Bank and HOK.

The proposal, submitted to USGBC, has broad support on the LEED steering and advisory committees, as well as among USGBC staff. It sets out minimum levels of carbon reduction for each LEED certification level, reflecting the growing scientific urgency of climate change.

It would also address some outstanding LEED weaknesses, including the reliance on zero-impact renewable energy certificates (RECs).

A 2017 Scientific American article, "The Window Is Closing to Avoid Dangerous Global Warming," warns that "deadly climate change could threaten most of the world's human population by the end of this century without efforts well beyond those captured in the Paris Agreement." The article quotes a 2017 report published in the Proceedings of the National Academy of Sciences: "We are quickly running out of time to prevent hugely dangerous, expensive and perhaps unmanageable climate change."

The moral as well as the scientific dimensions of climate change have their deniers. Some deny the science. Others argue that responsibility for global warming can be left to future generations, who will bear the largest costs of climate change but may have more money or better technologies to manage or mitigate it. Leading moral figures have spoken directly to the question of when we must take responsibility for our own contamination of the earth.

At the end of 2016, Pope Francis stated, "Every year the problems are getting worse. We are at the limits. If I may use a strong word, I would say that we are at the limits of suicide." Francis warned, "There is an urgent need to develop policies so that, in the next few years, the emissions of carbon dioxide and other highly polluting gases can be drastically reduced."

Current LEED CO2 requirements do not meet the call of scientists or the pope. But LEED has