Tag archive: Climate change

Bill Gates and the problem with climate solutionism (MIT Technology Review)

Focusing on technological solutions to climate change looks like an attempt to sidestep the more challenging political obstacles.

By MIT Technology Review, April 6, 2021

In his new book How to Avoid a Climate Disaster, Bill Gates takes a technological approach to understanding the climate crisis. Gates starts with the 51 billion tons of greenhouse gases produced each year. He breaks that pollution down into sectors by their impact, moving from electricity, industry and agriculture to transportation and buildings. Throughout, Gates proves adept at cutting through the complexities of the climate challenge, giving the reader useful heuristics for distinguishing the bigger technological problems (cement) from the smaller ones (aircraft).

Present at the 2015 Paris climate negotiations, Gates and dozens of wealthy individuals launched Breakthrough Energy, an interlinked venture capital fund and lobbying effort committed to driving research. Gates and his fellow investors argued that both the federal government and the private sector are underinvesting in energy innovation. Breakthrough aims to fill that gap, investing in everything from next-generation nuclear technology to plant-based meat that tastes like beef. The fund’s first US$ 1 billion round had some early successes, such as Impossible Foods, a maker of plant-based burgers. The fund announced a second round of the same size in January.

A parallel effort, an international agreement called Mission Innovation, says it has persuaded its members (the European Union’s executive arm along with 24 countries including China, the US, India and Brazil) to invest an additional US$ 4.6 billion a year since 2015 in clean energy research and development.

These various initiatives are the through line of Gates’s latest book, written from a techno-optimist perspective. “Everything I’ve learned about climate and technology makes me optimistic… if we act fast enough, [we can] avoid a climate catastrophe,” he writes in the opening pages.

As many have pointed out, much of the necessary technology already exists; much can be done now. While Gates does not dispute this, his book focuses on the technological challenges he believes must still be overcome to achieve deeper decarbonization. He spends less time on the political hurdles, writing that he thinks “more like an engineer than a political scientist.” Yet politics, in all its messiness, is the main impediment to progress on climate change. And engineers ought to understand how complex systems can have feedback loops that go wrong.

Yes, Minister

Kim Stanley Robinson, on the other hand, does think like a political scientist. His latest novel, The Ministry for the Future (not yet translated into Portuguese), opens just a few years in the future, in 2025, when a massive heat wave hits India, killing millions of people. The book’s protagonist, Mary Murphy, runs a UN agency charged with representing the interests of future generations, in an attempt to unite the world’s governments behind a climate solution. Throughout the book, intergenerational equity and various forms of distributive politics remain in focus.

If you have seen the scenarios the Intergovernmental Panel on Climate Change (IPCC) develops for the future, Robinson’s book will feel familiar. His story works through the policies needed to solve the climate crisis, and he has clearly done his homework. Although it is an exercise in imagination, there are moments when the novel reads more like a graduate social science seminar than a work of escapist fiction. The climate refugees who are central to the story illustrate how the consequences of pollution hit the world’s poorest hardest. Yet it is the rich who produce far more carbon.

Reading Gates after Robinson highlights the inextricable connection between inequality and climate change. Gates’s efforts on the climate question are laudable. But when he tells us that the combined wealth of the people backing his investment fund is US$ 170 billion, we are left somewhat puzzled that they have devoted only US$ 2 billion to climate solutions, less than 2% of their assets. That fact alone is an argument for taxing wealth: the climate crisis demands government action. It cannot be left to the whims of billionaires.

As billionaires go, Gates is arguably one of the good ones. He tells stories about how he uses his fortune to help the poor and the planet. The irony of writing a book about climate change while flying in a private jet and owning a 6,132 m² mansion is not lost on the reader, nor on Gates, who calls himself an “imperfect messenger on climate change.” Still, he is unquestionably an ally of the climate movement.

But by focusing on technological innovation, Gates downplays the role of fossil fuel interests in obstructing that progress. Curiously, climate denial is not mentioned in the book. Washing his hands of political polarization, Gates never draws the connection to his fellow billionaires Charles and David Koch, who made their fortune in petrochemicals and have played a prominent role in propagating climate denial.

For example, Gates marvels that for the vast majority of Americans electric heaters are actually cheaper than continuing to burn fossil fuels. To him, it is a puzzle that people do not adopt these cheaper, more sustainable options. But it isn’t. As the journalists Rebecca Leber and Sammy Roth have reported in Mother Jones and the Los Angeles Times, the gas industry is funding advocates and mounting marketing campaigns to oppose electrification and keep people hooked on fossil fuels.

These opposing forces are easier to see in Robinson’s book than in Gates’s. Gates would have benefited from drawing on the work that Naomi Oreskes, Eric Conway, Geoffrey Supran and others have done to document fossil fuel companies’ persistent efforts to sow public doubt about climate science.

One thing Gates and Robinson do have in common, however, is the view that geoengineering, monumental interventions aimed at the symptoms rather than the causes of climate change, may prove inevitable. In The Ministry for the Future, solar geoengineering, which means spraying fine particles into the atmosphere to reflect more of the sun’s heat back into space, is deployed in the aftermath of the deadly heat wave that opens the story. Later, scientists head to the poles and devise elaborate methods for pumping meltwater out from under glaciers to keep them from sliding into the sea. Despite some setbacks, they prevent several meters of sea level rise. It is easy to imagine Gates appearing in the novel as one of the early funders of these efforts. As he notes in his own book, he has been funding research into solar geoengineering for years.

The worst part

The title of Elizabeth Kolbert’s new book, Under a White Sky (not yet translated into Portuguese), refers to this nascent technology, since deploying it at scale could turn the sky from blue to white.
Kolbert notes that the first report on climate change landed on President Lyndon Johnson’s desk in 1965. That report did not argue that we should cut carbon emissions by moving away from fossil fuels. Instead, it advocated changing the climate through solar geoengineering, though the term had not yet been coined. It is worrying that some leap straight to these risky solutions rather than addressing the root causes of climate change.

Reading Under a White Sky, we are reminded of the ways such interventions can go wrong. For example, the scientist and writer Rachel Carson advocated importing non-native species as an alternative to using pesticides. The year after her book Silent Spring was published in 1962, the US Fish and Wildlife Service brought Asian carp to America for the first time, in order to control aquatic algae. That approach solved one problem but created another: the spread of this invasive species threatened native ones and caused environmental damage.

As Kolbert observes, her book is about “people trying to solve problems created by people trying to solve problems.” Her account covers examples including ill-fated efforts to stop the spread of the carp, the pumping stations in New Orleans that accelerate the city’s sinking, and attempts to selectively breed corals that can tolerate higher temperatures and ocean acidification. Kolbert has a sense of humor and a keen eye for unintended consequences. If you like your apocalypse with a bit of comedy, she will make you laugh while Rome burns.

By contrast, although Gates is aware of the potential pitfalls of technological solutions, he still celebrates inventions such as plastic and fertilizer as vital. Tell that to the sea turtles swallowing plastic waste, or to the fertilizer-driven algal blooms wrecking the Gulf of Mexico’s ecosystem.

With dangerous levels of carbon dioxide in the atmosphere, geoengineering may indeed prove necessary, but we should not be naive about the risks. Gates’s book has plenty of good ideas and is worth reading. But for a complete picture of the crisis we face, be sure to read Robinson and Kolbert as well.

Cherry trees bloom at their earliest in Japan in 1,200 years (Folha de S.Paulo)

f5.folha.uol.com.br

Kazuhiro Nogi – 24.mar.2021/AFP


São Paulo

The blooming of the famous white and pink cherry trees draws thousands to Japan’s streets and parks to watch the phenomenon, which lasts only a few days and has been revered for more than a thousand years. But this year the early flowering has worried scientists, because it points to the impact of climate change.

According to records from Osaka Prefecture University, in 2021 the famous white and pink cherry trees reached full bloom in Kyoto on March 26, the earliest date in 12 centuries. The earliest blooms previously on record fell on March 27, in the years 1612, 1409 and 1236.

The institution was able to identify how early the phenomenon came because it keeps a comprehensive database of bloom records across the centuries. The records go back to the year 812 and include court documents from Imperial Kyoto, the former capital of Japan, and medieval diaries.

Yasuyuki Aono, a professor of environmental science at Osaka Prefecture University who is responsible for compiling the database, told the Reuters news agency that the phenomenon usually occurs in April, but as temperatures rise, flowering starts earlier.

Kazuhiro Nogi – 24.mar.2021/AFP

“Cherry blossoms are very sensitive to temperature. Flowering and full bloom can come earlier or later depending solely on the temperature. Temperatures were low in the 1820s, but they have risen by about 3.5 degrees Celsius since then,” he said.

According to him, this year’s seasons in particular influenced the bloom dates. The winter was very cold, but spring arrived fast and exceptionally warm, so “the buds are completely awake after sufficient rest.”

In the capital, Tokyo, cherry trees hit peak bloom on March 22, the second-earliest date on record. “As global temperatures rise, the last spring frosts are occurring earlier and flowering is occurring earlier,” Lewis Ziska of Columbia University told CNN.

The Japan Meteorological Agency also tracks 58 “benchmark” cherry trees across the country. This year, 40 have already reached peak bloom and 14 did so in record time. The trees normally flower for about two weeks each year. “We can say it is most likely because of the impact of global warming,” said Shunji Anbe, an official in the agency’s observations division.

Data from the World Meteorological Organization released in January show that global temperatures in 2020 were among the highest ever recorded, rivaling 2016 as the warmest year of all time.

Cherry blossoms have deep historical and cultural roots in Japan, heralding spring and inspiring artists and poets through the centuries. Their fragility is seen as a symbol of life, death and rebirth.

Today, people gather under the cherry blossoms every spring for hanami (flower-viewing) parties, strolling through parks, picnicking beneath the branches and taking plenty of selfies. But this year the cherry blossom season came and went in the blink of an eye.

With the end of the state of emergency imposed to contain the Covid-19 pandemic in all regions of Japan, many people crowded popular viewing spots over the weekend, although the numbers were smaller than in normal years.

NOAA Acknowledges the New Reality of Hurricane Season (Gizmodo)

earther.gizmodo.com

Molly Taft, March 2, 2021


This combination of satellite images provided by the National Hurricane Center shows 30 hurricanes that occurred during the 2020 Atlantic hurricane season.

We’re one step closer to officially moving up hurricane season. The National Hurricane Center announced Tuesday that it would formally start issuing its hurricane season tropical weather outlooks on May 15 this year, bumping it up from the traditional start of hurricane season on June 1. The move comes after a recent spate of early-season storms has raked the Atlantic.

Atlantic hurricane season runs from June 1 to November 30. That’s when conditions are most conducive to storm formation owing to warm air and water temperatures. (The Pacific Ocean has its own hurricane season, which covers the same timeframe, but since its waters are colder, fewer hurricanes tend to form there than in the Atlantic.)

Storms have begun forming in the Atlantic earlier as ocean and air temperatures have increased due to climate change. Last year, Hurricane Arthur roared to life off the East Coast on May 16. That storm made 2020 the sixth hurricane season in a row to have a storm that formed earlier than the June 1 official start date. While the National Oceanic and Atmospheric Administration won’t be moving up the start of the season just yet, the earlier outlooks address the recent history.

“In the last decade, 10 storms have formed in the weeks before the traditional start of the season, which is a big jump,” said Sean Sublette, a meteorologist at Climate Central, who pointed out that the 1960s through 2010s saw between one and three storms each decade before the June 1 start date on average.

It might be tempting to ascribe this earlier season entirely to climate change warming the Atlantic. But technology also has a role to play, with more observations along the coast as well as satellites that can spot storms far out to sea.

“I would caution that we can’t just go, ‘hah, the planet’s warming, we’ve had to move the entire season!’” Sublette said. “I don’t think there’s solid ground for attribution of how much of one there is over the other. Weather folks can sit around and debate that for awhile.”

Earlier storms don’t necessarily mean more harmful ones, either. In fact, hurricanes earlier in the season tend to be weaker than the monsters that form in August and September when hurricane season is at its peak. But regardless of their strength, these earlier storms have generated discussion inside the NHC on whether to move up the official start date for the season, when the agency usually puts out two reports per day on hurricane activity. Tuesday’s step is not an official announcement of this decision, but an acknowledgement of the increased attention on early hurricanes.

“I would say that [Tuesday’s announcement] is the National Hurricane Center being proactive,” Sublette said. “Like hey, we know that the last few years it’s been a little busier in May than we’ve seen in the past five decades, and we know there is an awareness now, so we’re going to start issuing these reports early.”

While the jury is still out on whether climate change is pushing the season earlier, research has shown that the strongest hurricanes are becoming more common, and that climate change is likely playing a role. A study published last year found the odds of a storm becoming a major hurricane—those Category 3 or stronger—have increased 49% in the basin since satellite monitoring began in earnest four decades ago. And when storms make landfall, sea level rise allows them to do more damage. So regardless of whether climate change is pushing the Atlantic hurricane season earlier or not, the risks are increasing. Now, at least, we’ll have better warnings before early storms do hit.

Texas Power Grid Run by ERCOT Set Up the State for Disaster (New York Times)

nytimes.com

Clifford Krauss, Manny Fernandez, Ivan Penn, Rick Rojas – Feb 21, 2021


Texas has refused to join interstate electrical grids and railed against energy regulation. Now it’s having to answer to millions of residents who were left without power in last week’s snowstorm.

The cost of a free market electrical grid became painfully clear last week, as a snowstorm descended on Texas and millions of people ran out of power and water.
Credit: Nitashia Johnson for The New York Times

HOUSTON — Across the plains of West Texas, the pump jacks that resemble giant bobbing hammers define not just the landscape but the state itself: Texas has been built on the oil-and-gas business for the last 120 years, ever since the discovery of oil on Spindletop Hill near Beaumont in 1901.

Texas, the nation’s leading energy-producing state, seemed like the last place on Earth that could run out of energy.

Then last week, it did.

The crisis could be traced to that other defining Texas trait: independence, both from big government and from the rest of the country. The dominance of the energy industry and the “Republic of Texas” ethos became a devastating liability when energy stopped flowing to millions of Texans who shivered and struggled through a snowstorm that paralyzed much of the state.

Part of the responsibility for the near-collapse of the state’s electrical grid can be traced to the decision in 1999 to embark on the nation’s most extensive experiment in electrical deregulation, handing control of the state’s entire electricity delivery system to a market-based patchwork of private generators, transmission companies and energy retailers.

The energy industry wanted it. The people wanted it. Both parties supported it. “Competition in the electric industry will benefit Texans by reducing monthly rates and offering consumers more choices about the power they use,” George W. Bush, then the governor, said as he signed the top-to-bottom deregulation legislation.

Mr. Bush’s prediction of lower-cost power generally came true, and the dream of a free-market electrical grid worked reasonably well most of the time, in large part because Texas had so much cheap natural gas as well as abundant wind to power renewable energy. But the newly deregulated system came with few safeguards and even fewer enforced rules.

With so many cost-conscious utilities competing for budget-shopping consumers, there was little financial incentive to invest in weather protection and maintenance. Wind turbines are not equipped with the de-icing equipment routinely installed in the colder climes of the Dakotas, and power lines have little insulation. The possibility of more frequent cold-weather events was never built into infrastructure plans in a state where climate change remains an exotic, disputed concept.

“Deregulation was something akin to abolishing the speed limit on an interstate highway,” said Ed Hirs, an energy fellow at the University of Houston. “That opens up shortcuts that cause disasters.”

The state’s entire energy infrastructure was walloped with glacial temperatures that even under the strongest of regulations might have frozen gas wells and downed power lines.

But what went wrong was far broader: Deregulation meant that critical rules of the road for power were set not by law, but rather by a dizzying array of energy competitors.

Utility regulation is intended to compensate for the natural monopolies that occur when a single electrical provider serves an area; it keeps prices down while protecting public safety and guaranteeing fair treatment to customers. Yet many states have flirted with deregulation as a way of giving consumers more choices and encouraging new providers, especially alternative energy producers.

California, one of the early deregulators in the 1990s, scaled back its initial foray after market manipulation led to skyrocketing prices and rolling blackouts.

States like Maryland allow customers to pick from a menu of producers. In some states, competing private companies offer varied packages like discounts for cheaper power at night. But no state has gone as far as Texas, which has not only turned over the keys to the free market but has also isolated itself from the national grid, limiting the state’s ability to import power when its own generators are foundering.

Consumers themselves got a direct shock last week when customers who had chosen variable-rate electricity contracts found themselves with power bills of $5,000 or more. While they were expecting extra-low monthly rates, many may now face huge bills as a result of the upswing in wholesale electricity prices during the cold wave. Gov. Greg Abbott on Sunday said the state’s Public Utility Commission has issued a moratorium on customer disconnections for non-payment and will temporarily restrict providers from issuing invoices.

A family in Austin, Texas, kept warm by a fire outside their apartment on Wednesday. They lost power early Monday morning.
Credit: Tamir Kalifa for The New York Times

There is regulation in the Texas system, but it is hardly robust. One nonprofit agency, the Electric Reliability Council of Texas, or ERCOT, was formed to manage the wholesale market. It is supervised by the Public Utility Commission, which also oversees the transmission companies that offer customers an exhaustive array of contract choices laced with more fine print than a credit card agreement.

But both agencies are nearly unaccountable and toothless compared to regulators in other regions, where many utilities have stronger consumer protections and submit an annual planning report to ensure adequate electricity supply. Texas energy companies are given wide latitude in their planning for catastrophic events.

One example of how Texas has gone it alone is its refusal to enforce a “reserve margin” of extra power available above expected demand, unlike all other power systems around North America. With no mandate, there is little incentive to invest in precautions for events, such as a Southern snowstorm, that are rare. Any company that took such precautions would put itself at a competitive disadvantage.

A surplus supply of natural gas, the dominant power fuel in Texas, near power plants might have helped avoid the cascade of failures in which power went off, forcing natural gas production and transmission offline, which in turn led to further power shortages.

In the aftermath of the dayslong outages, ERCOT has been criticized by both Democratic and Republican residents, lawmakers and business executives, a rare display of unity in a fiercely partisan and Republican-dominated state. Mr. Abbott said he supported calls for the agency’s leadership to resign and made ERCOT reform a priority for the Legislature. The reckoning has been swift — this week, lawmakers will hold hearings in Austin to investigate the agency’s handling of the storm and the rolling outages.

For ERCOT operators, the storm’s arrival was swift and fierce, but they had anticipated it and knew it would strain their system. They asked power customers across the state to conserve, warning that outages were likely.

But late on Sunday, Feb. 14, it rapidly became clear that the storm was far worse than they had expected: Sleet and snow fell, and temperatures plunged. In the council’s command center outside Austin, a room dominated by screens flashing with maps, graphics and data tracking the flow of electricity to 26 million people in Texas, workers quickly found themselves fending off a crisis. As weather worsened into Monday morning, residents cranked up their heaters and demand surged.

Power plants began falling offline in rapid succession as they were overcome by the frigid weather or ran out of fuel to burn. Within hours, 40 percent of the power supply had been lost.

The entire grid — carrying 90 percent of the electric load in Texas — was barreling toward a collapse.

Much of Austin lost power last week due to rolling blackouts.
Credit: Tamir Kalifa for The New York Times

In the electricity business, supply and demand need to be in balance. Imbalances lead to catastrophic blackouts. Recovering from a total blackout would be an agonizing and tedious process, known as a “black start,” that could take weeks, or possibly months.

And in the early-morning hours last Monday, the Texas grid was “seconds and minutes” away from such a collapse, said Bill Magness, the president and chief executive of the Electric Reliability Council.

“If we had allowed a catastrophic blackout to happen, we wouldn’t be talking today about hopefully getting most customers their power back,” Mr. Magness said. “We’d be talking about how many months it might be before you get your power back.”

The outages and the cold weather touched off an avalanche of failures, but there had been warnings long before last week’s storm.

After a heavy snowstorm in February 2011 caused statewide rolling blackouts and left millions of Texans in the dark, federal authorities warned the state that its power infrastructure had inadequate “winterization” protection. But 10 years later, pipelines remained inadequately insulated and heaters that might have kept instruments from freezing were never installed.

During heat waves, when demand has soared during several recent summers, the system in Texas has also strained to keep up, raising questions about lack of reserve capacity on the unregulated grid.

And aside from the weather, there have been periodic signs that the system can run into trouble delivering sufficient energy, in some cases because of equipment failures, in others because of what critics called an attempt to drive up prices, according to Mr. Hirs of the University of Houston, as well as several energy consultants.

Another potential safeguard might have been far stronger connections to the two interstate power-sharing networks, East and West, that allow states to link their electrical grids and obtain power from thousands of miles away when needed to hold down costs and offset their own shortfalls.

But Texas, reluctant to submit to the federal regulation that is part of the regional power grids, made decisions as far back as the early 20th century to become the only state in the continental United States to operate its own grid — a plan that leaves it able to borrow only from a few close neighbors.

The border city of El Paso survived the freeze much better than Dallas or Houston because it was not part of the Texas grid but connected to the much larger grid covering many Western states.

But the problems that began with last Monday’s storm went beyond an isolated electrical grid. The entire ecosystem of how Texas generates, transmits and uses power stalled, as millions of Texans shivered in darkened, unheated homes.

A surplus supply of natural gas, the dominant power fuel in Texas, near power plants might have helped avoid the cascade of failures.
Credit: Eddie Seal/Bloomberg

Texans love to brag about natural gas, which state officials often call the cleanest-burning fossil fuel. No state produces more, and gas-fired power plants produce nearly half the state’s electricity.

“We are struggling to come to grips with the reality that gas came up short and let us down when we needed it most,” said Michael E. Webber, a professor of mechanical engineering at the University of Texas at Austin.

The cold was so severe that the enormous oil and natural gas fields of West Texas froze up, or could not get sufficient power to operate. Though a few plants had stored gas reserves, there was insufficient electricity to pump it.

The leaders of ERCOT defended the organization, its lack of mandated reserves and the state’s isolation from larger regional grids, and said the blame for the power crisis lies with the weather, not the overall deregulated system in Texas.

“The historic, just about unprecedented, storm was the heart of the problem,” Mr. Magness, the council’s chief executive, said, adding: “We’ve found that this market structure works. It demands reliability. I don’t think there’s a silver-bullet market structure that could have managed the extreme lows and generation outages that we were facing Sunday night.”

In Texas, energy regulation is as much a matter of philosophy as policy. Its independent power grid is a point of pride that has been an applause line in Texas political speeches for decades.

Deregulation is a hot topic among Texas energy experts, and there has been no shortage of predictions that the grid could fail under stress. But there has not been widespread public dissatisfaction with the system, although many are now wondering if they are being well served.

“I believe there is great value in Texas being on its own grid and I believe we can do so safely and securely and confidently going forward,” said State Representative Jeff Leach, a Republican from Plano who has called for an investigation into what went wrong. “But it’s going to take new investment and some new strategic decisions to make sure we’re protected from this ever happening again.”

Steven D. Wolens, a former Democratic lawmaker from Dallas and a principal architect of the 1999 deregulation legislation, said deregulation was meant to spur more generation, including from renewable energy sources, and to encourage the mothballing of older plants that were spewing pollution. “We were successful,” said Mr. Wolens, who left the Legislature in 2005.

But the 1999 legislation was intended as a first iteration that would evolve along with the needs of the state, he said. “They can focus on it now and they can fix it now,” he said. “The buck stops with the Texas Legislature and they are in a perfect position to determine the basis of the failure, to correct it and make sure it never happens again.”

Clifford Krauss reported from Houston, Manny Fernandez and Ivan Penn from Los Angeles, and Rick Rojas from Nashville. David Montgomery contributed reporting from Austin, Texas.

Texas Blackouts Point to Coast-to-Coast Crises Waiting to Happen (New York Times)

nytimes.com

Christopher Flavelle, Brad Plumer, Hiroko Tabuchi – Feb 20, 2021


Traffic at a standstill on Interstate 35 in Killeen, Texas, on Thursday. Credit: Joe Raedle/Getty Images
Continent-spanning storms triggered blackouts in Oklahoma and Mississippi, halted one-third of U.S. oil production and disrupted vaccinations in 20 states.

Even as Texas struggled to restore electricity and water over the past week, signs of the risks posed by increasingly extreme weather to America’s aging infrastructure were cropping up across the country.

The week’s continent-spanning winter storms triggered blackouts in Texas, Oklahoma, Mississippi and several other states. One-third of oil production in the nation was halted. Drinking-water systems in Ohio were knocked offline. Road networks nationwide were paralyzed and vaccination efforts in 20 states were disrupted.

The crisis carries a profound warning. As climate change brings more frequent and intense storms, floods, heat waves, wildfires and other extreme events, it is placing growing stress on the foundations of the country’s economy: Its network of roads and railways, drinking-water systems, power plants, electrical grids, industrial waste sites and even homes. Failures in just one sector can set off a domino effect of breakdowns in hard-to-predict ways.

Much of this infrastructure was built decades ago, under the expectation that the environment around it would remain stable, or at least fluctuate within predictable bounds. Now climate change is upending that assumption.

“We are colliding with a future of extremes,” said Alice Hill, who oversaw planning for climate risks on the National Security Council during the Obama administration. “We base all our choices about risk management on what’s occurred in the past, and that is no longer a safe guide.”

While it’s not always possible to say precisely how global warming influenced any one particular storm, scientists said, an overall rise in extreme weather creates sweeping new risks.

Sewer systems are overflowing more often as powerful rainstorms exceed their design capacity. Coastal homes and highways are collapsing as intensified runoff erodes cliffs. Coal ash, the toxic residue produced by coal-burning plants, is spilling into rivers as floods overwhelm barriers meant to hold it back. Homes once beyond the reach of wildfires are burning in blazes they were never designed to withstand.

A broken water main in McComb, Miss., on Thursday.
Credit: Matt Williamson/The Enterprise-Journal, via Associated Press

Problems like these often reflect an inclination of governments to spend as little money as possible, said Shalini Vajjhala, a former Obama administration official who now advises cities on meeting climate threats. She said it’s hard to persuade taxpayers to spend extra money to guard against disasters that seem unlikely.

But climate change flips that logic, making inaction far costlier. “The argument I would make is, we can’t afford not to, because we’re absorbing the costs” later, Ms. Vajjhala said, after disasters strike. “We’re spending poorly.”

The Biden administration has talked extensively about climate change, particularly the need to reduce greenhouse gas emissions and create jobs in renewable energy. But it has spent less time discussing how to manage the growing effects of climate change, facing criticism from experts for not appointing more people who focus on climate resilience.

“I am extremely concerned by the lack of emergency-management expertise reflected in Biden’s climate team,” said Samantha Montano, an assistant professor at the Massachusetts Maritime Academy who focuses on disaster policy. “There’s an urgency here that still is not being reflected.”

A White House spokesman, Vedant Patel, said in a statement, “Building resilient and sustainable infrastructure that can withstand extreme weather and a changing climate will play an integral role in creating millions of good paying, union jobs” while cutting greenhouse gas emissions.

And while President Biden has called for a major push to refurbish and upgrade the nation’s infrastructure, getting a closely divided Congress to spend hundreds of billions, if not trillions of dollars, will be a major challenge.

Heightening the cost to society, disruptions can disproportionately affect lower-income households and other vulnerable groups, including older people or those with limited English.

“All these issues are converging,” said Robert D. Bullard, a professor at Texas Southern University who studies wealth and racial disparities related to the environment. “And there’s simply no place in this country that’s not going to have to deal with climate change.”

Flooding around Edenville Township, Mich., last year swept away a bridge over the Tittabawassee River.
Credit: Matthew Hatcher/Getty Images

In September, when a sudden storm dumped a record of more than two inches of water on Washington in less than 75 minutes, the result wasn’t just widespread flooding, but also raw sewage rushing into hundreds of homes.

Washington, like many other cities in the Northeast and Midwest, relies on what’s called a combined sewer overflow system: If a downpour overwhelms storm drains along the street, they are built to overflow into the pipes that carry raw sewage. But if there’s too much pressure, sewage can be pushed backward, into people’s homes — where the forces can send it erupting from toilets and shower drains.

This is what happened in Washington. The city’s system was built in the late 1800s. Now, climate change is straining an already outdated design.

DC Water, the local utility, is spending billions of dollars so that the system can hold more sewage. “We’re sort of in uncharted territory,” said Vincent Morris, a utility spokesman.

The challenge of managing and taming the nation’s water supplies — whether in streets and homes, or in vast rivers and watersheds — is growing increasingly complex as storms intensify. Last May, rain-swollen flooding breached two dams in Central Michigan, forcing thousands of residents to flee their homes and threatening a chemical complex and toxic waste cleanup site. Experts warned it was unlikely to be the last such failure.

Many of the country’s 90,000 dams were built decades ago and were already in dire need of repairs. Now climate change poses an additional threat, bringing heavier downpours to parts of the country and raising the odds that some dams could be overwhelmed by more water than they were designed to handle. One recent study found that most of California’s biggest dams were at increased risk of failure as global warming advances.

In recent years, dam-safety officials have begun grappling with the dangers. Colorado, for instance, now requires dam builders to take into account the risk of increased atmospheric moisture driven by climate change as they plan for worst-case flooding scenarios.

But nationwide, there remains a backlog of thousands of older dams that still need to be rehabilitated or upgraded. The price tag could ultimately stretch to more than $70 billion.

“Whenever we study dam failures, we often find there was a lot of complacency beforehand,” said Bill McCormick, president of the Association of State Dam Safety Officials. But given that failures can have catastrophic consequences, “we really can’t afford to be complacent.”

Crews repaired switches on utility poles damaged by the storms in Texas.
Credit: Tamir Kalifa for The New York Times

If the Texas blackouts exposed one state’s poor planning, they also provide a warning for the nation: Climate change threatens virtually every aspect of electricity grids that aren’t always designed to handle increasingly severe weather. The vulnerabilities show up in power lines, natural-gas plants, nuclear reactors and myriad other systems.

Higher storm surges can knock out coastal power infrastructure. Deeper droughts can reduce water supplies for hydroelectric dams. Severe heat waves can reduce the efficiency of fossil-fuel generators, transmission lines and even solar panels at precisely the moment that demand soars because everyone cranks up their air-conditioners.

Climate hazards can also combine in new and unforeseen ways.

In California recently, Pacific Gas & Electric has had to shut off electricity to thousands of people during exceptionally dangerous fire seasons. The reason: Downed power lines can spark huge wildfires in dry vegetation. Then, during a record-hot August last year, several of the state’s natural gas plants malfunctioned in the heat, just as demand was spiking, contributing to blackouts.

“We have to get better at understanding these compound impacts,” said Michael Craig, an expert in energy systems at the University of Michigan who recently led a study looking at how rising summer temperatures in Texas could strain the grid in unexpected ways. “It’s an incredibly complex problem to plan for.”

Some utilities are taking notice. After Superstorm Sandy in 2012 knocked out power for 8.7 million customers, utilities in New York and New Jersey invested billions in flood walls, submersible equipment and other technology to reduce the risk of failures. Last month, New York’s Con Edison said it would incorporate climate projections into its planning.

As freezing temperatures struck Texas, a glitch at one of two reactors at a South Texas nuclear plant, which serves 2 million homes, triggered a shutdown. The cause: Sensing lines connected to the plant’s water pumps had frozen, said Victor Dricks, a spokesman for the federal Nuclear Regulatory Commission.

It’s also common for extreme heat to disrupt nuclear power. The issue is that the water used to cool reactors can become too warm to use, forcing shutdowns.

Flooding is another risk.

After a tsunami led to several meltdowns at Japan’s Fukushima Daiichi power plant in 2011, the U.S. Nuclear Regulatory Commission told the 60 or so working nuclear plants in the United States, many decades old, to evaluate their flood risk to account for climate change. Ninety percent showed at least one type of flood risk that exceeded what the plant was designed to handle.

The greatest risk came from heavy rain and snowfall exceeding the design parameters at 53 plants.

Scott Burnell, a Nuclear Regulatory Commission spokesman, said in a statement, “The NRC continues to conclude, based on the staff’s review of detailed analyses, that all U.S. nuclear power plants can appropriately deal with potential flooding events, including the effects of climate change, and remain safe.”

A section of Highway 1 along the California coastline collapsed in January amid heavy rains.
Credit: Josh Edelson/Agence France-Presse — Getty Images

The collapse of a portion of California’s Highway 1 into the Pacific Ocean after heavy rains last month was a reminder of the fragility of the nation’s roads.

Several climate-related risks appeared to have converged to heighten the danger. Rising seas and higher storm surges have intensified coastal erosion, while more extreme bouts of precipitation have increased the landslide risk.

Add to that the effects of devastating wildfires, which can damage the vegetation holding hillside soil in place, and “things that wouldn’t have slid without the wildfires, start sliding,” said Jennifer M. Jacobs, a professor of civil and environmental engineering at the University of New Hampshire. “I think we’re going to see more of that.”

The United States depends on highways, railroads and bridges as economic arteries for commerce, travel and simply getting to work. But many of the country’s most important links face mounting climate threats. More than 60,000 miles of roads and bridges in coastal floodplains are already vulnerable to extreme storms and hurricanes, government estimates show. And inland flooding could also threaten at least 2,500 bridges across the country by 2050, a federal climate report warned in 2018.

Sometimes even small changes can trigger catastrophic failures. Engineers modeling the collapse of bridges over Escambia Bay in Florida during Hurricane Ivan in 2004 found that the extra three inches of sea-level rise since the bridge was built in 1968 very likely contributed to the collapse, because of the added height of the storm surge and force of the waves.

“A lot of our infrastructure systems have a tipping point. And when you hit the tipping point, that’s when a failure occurs,” Dr. Jacobs said. “And the tipping point could be an inch.”

Crucial rail networks are at risk, too. In 2017, Amtrak consultants found that along parts of the Northeast corridor, which runs from Boston to Washington and carries 12 million people a year, flooding and storm surge could erode the track bed, disable the signals and eventually put the tracks underwater.

And there is no easy fix. Elevating the tracks would require also raising bridges, electrical wires and lots of other infrastructure, and moving them would mean buying new land in a densely packed part of the country. So the report recommended flood barriers, costing $24 million per mile, that must be moved into place whenever floods threaten.

A worker checked efforts to prevent coal ash from escaping into the Waccamaw River in South Carolina after Hurricane Florence in 2018.
Credit: Randall Hill/Reuters

A series of explosions at a flood-damaged chemical plant outside Houston after Hurricane Harvey in 2017 highlighted a danger lurking in a world beset by increasingly extreme weather.

The blasts at the plant came after flooding knocked out the site’s electrical supply, shutting down refrigeration systems that kept volatile chemicals stable. Almost two dozen people, many of them emergency workers, were treated for exposure to the toxic fumes, and some 200 nearby residents were evacuated from their homes.

More than 2,500 facilities that handle toxic chemicals lie in federal flood-prone areas across the country, about 1,400 of them in areas at the highest risk of flooding, a New York Times analysis showed in 2018.

Leaks from toxic cleanup sites, left behind by past industry, pose another threat.

Almost two-thirds of some 1,500 Superfund cleanup sites across the country are in areas with an elevated risk of flooding, storm surge, wildfires or sea level rise, a government audit warned in 2019. Coal ash, a toxic substance produced by coal power plants that is often stored as sludge in special ponds, has been particularly exposed. After Hurricane Florence in 2018, for example, a dam breach at the site of a power plant in Wilmington, N.C., released the hazardous ash into a nearby river.

“We should be evaluating whether these facilities or sites actually have to be moved or re-secured,” said Lisa Evans, senior counsel at Earthjustice, an environmental law organization. Places that “may have been OK in 1990,” she said, “may be a disaster waiting to happen in 2021.”

East Austin, Texas, during a blackout on Wednesday.  
Credit: Bronte Wittpenn/Austin American-Statesman, via Associated Press

Texas’s Power Crisis Has Turned Into a Disaster That Parallels Hurricane Katrina (TruthOut)

truthout.org

Sharon Zhang, Feb. 18, 2021


Propane tanks are placed in a line as people wait for the power to turn on to fill their tanks in Houston, Texas, on February 17, 2021. Mark Felix for The Washington Post via Getty Images

As many in Texas wake up still without power on Thursday morning, millions are now also having to contend with water shutdowns, boil advisories, and empty grocery shelves as cities struggle with keeping infrastructure powered and supply chains are interrupted.

As of Wednesday, an estimated 7 million Texans were under a boil advisory. Since then, Austin has also issued a citywide water-boil notice due to power loss at its biggest water treatment plant. Austin Water serves over a million customers, according to its website.

With hundreds of thousands of people still without power in the state, some reporting that they have no water coming out of their faucets at all, and others facing burst pipes leading to collapsed ceilings and other damage to their homes, the situation is dire for many Texans facing multiple problems at once.

Even as some residents are getting their power restored, the problems only continue to pile up: the few grocery stores left open quickly sold out of food and supplies. As many without power watched their refrigerated food spoil, lines to get into stores wrapped around blocks and buildings, and store shelves sat completely empty with no indication of when new shipments would come in. Food banks have had to cancel deliveries and schools have had to halt meal distribution to students, the Texas Tribune reports.

People experiencing homelessness, including a disproportionate number of Black residents, have especially suffered in the record cold temperatures across the state. There have been some reports of people being found dead in the streets because of a lack of shelter.

“Businesses are shut down. Streets are empty, other than a few guys sliding around in 4x4s and fire trucks rushing to rescue people who turn their ovens on to keep warm and poison themselves with carbon monoxide,” wrote Austin resident Jeff Goodell in Rolling Stone. “Yesterday, the line at our neighborhood grocery store was three blocks long. People wandering around with handguns on their hip adds to a sense of lawlessness (Texas is an open-carry state).”

The Texas agricultural commissioner has said that farmers and ranchers are having to throw away millions of dollars worth of goods because of a lack of power. “We’re looking at a food supply chain problem like we’ve never seen before, even with COVID-19,” he told one local news affiliate.

An energy analyst likened the power crisis to the fallout of Hurricane Katrina as it’s becoming increasingly clear that the situation in Texas is a statewide disaster.

As natural gas output declined dramatically in the state, Paul Sankey, who leads energy analyst firm Sankey Research, said on Bloomberg, “This situation to me is very reminiscent of Hurricane Katrina…. We have never seen a loss [of energy supply] at this scale” in mid-winter. This is “the biggest outage in the history [of] U.S. oil and gas,” Sankey said.

Many others online echoed Sankey’s words as “Katrina” trended on Twitter, saying that the situation is similar to the hurricane disaster in that it has been downplayed by politicians but may be uncovered to be even more serious in the coming weeks.

Experts say that the power outages have partially been caused by the deregulation of the state’s electric grid. The government, some say, favored deregulatory actions like not requiring electrical equipment upgrades or proper weatherization, instead relying on free market mechanisms that ultimately contributed to the current disaster.

Former Gov. Rick Perry faced criticism on Wednesday when he said that Texans would rather face the current disaster than have to be regulated by the federal government. And he’s not the only Republican currently catching heat — many have begun calling for the resignation of Gov. Greg Abbott for a failure of leadership. On Wednesday, as millions suffered without power and under boil-water advisories, the governor went on Fox to attack clean energy, which experts say was not a major contributor to the current crisis, and the Green New Deal.

After declaring a state of emergency in Texas over the weekend, the Biden administration announced on Wednesday that it would be sending generators and other supplies to the state.

The freeze in Texas exposes America’s infrastructural failings (The Economist)

economist.com

Feb 17th 2021

You ain’t foolin’ nobody with the lights out

WHEN IT RAINS, it pours, and when it snows, the lights turn off. Or so it goes in Texas. After a winter storm pummelled the Lone Star State with record snowfall and the lowest temperatures in more than 30 years, millions were left without electricity and heat. On February 16th 4.5m Texan households were cut off from power, as providers were overloaded with demand and tried to shuffle access to electricity so the whole grid did not go down.

Whole skylines, including Dallas’s, went dark to conserve power. Some Texans braved the snowy roads to check into the few hotels with remaining rooms, only for the hotels’ power to go off as they arrived. Others donned skiwear and remained inside, hoping the lights and heat would come back on. Across the state, what were supposed to be “rolling” blackouts lasted for days. It is still too soon to quantify the devastation. More than 20 people have died in motor accidents, from fires lit for warmth and from carbon-monoxide poisoning from using cars for heat. The storm has also halted deliveries of covid-19 vaccines and may prevent around 1m vaccinations from happening this week. Several retail electricity providers are likely to go bankrupt, after being hit with surging wholesale power prices.

Other states, including Tennessee, were also covered in snow, but Texas got the lion’s share and ground to a halt. Texans are rightly furious that residents of America’s energy capital cannot count on reliable power. Everyone is asking why.

The short answer is that the Electric Reliability Council of Texas (ERCOT), which operates the grid, did not properly forecast the demand for energy as a result of the storm. Some say that this was nearly impossible to predict, but there were warnings of the severity of the coming weather in the preceding week, and ERCOT’s projections were notably short. Brownouts last summer had already demonstrated the grid’s lack of excess capacity, says George O’Leary of Tudor, Pickering, Holt & Co (TPH), an energy investment bank.

Many Republican politicians were quick to blame renewable energy sources, such as wind power, for the blackouts, but that is not fair. Some wind turbines did indeed freeze, but natural gas, which accounts for around half of the state’s electricity generation, was the primary source of the shortfall. Plants broke down, as did the gas supply chain and pipelines. The cold also caused a reactor at one of the state’s two nuclear plants to go offline. Transmission lines may have also iced up, says Wade Schauer of Wood Mackenzie, an energy-research firm. In short, Texas experienced a perfect storm.

Some of the blame falls on the unique design of the electricity market in Texas. Of America’s 48 contiguous states, it is the only one with its own stand-alone electricity grid—the Texas Interconnection. This means that when power generators fail, the state cannot import electricity from outside its borders.

The state’s deregulated power market is also fiercely competitive. ERCOT oversees the grid, while power generators produce electricity for the wholesale market. Some 300 retail electricity providers buy that fuel and then compete for consumers. Because such cold weather is rare, energy companies do not invest in “winterising” their equipment, as this would raise their prices for consumers. Perhaps most important, the state does not have a “capacity market”, which would ensure that there was extra power available for surging demand. This acts as a sort of insurance policy so the lights will not go out, but it also means customers pay higher bills.

For years the benefits of Texas’s deregulated market structure were clear. At 8.6 cents per kilowatt hour, the state’s average retail price for electricity is around one-fifth lower than the national average and about half the cost of California’s. In 1999 the state set targets for renewables, and today it accounts for around 30% of America’s wind energy.

This disaster is prompting people to question whether Texas’s system is as resilient and well-designed as people previously believed. Greg Abbott, the governor, has called for an investigation into ERCOT. This storm “has exposed some serious weaknesses in our free-market approach in Texas”, says Luke Metzger of Environment Texas, a non-profit, who had been without power for 54 hours when The Economist went to press.

Wholly redesigning the power grid in Texas seems unlikely. After the snow melts, the state will need to tackle two more straightforward questions. The first is whether it needs to increase reserve capacity. “If we impose a capacity market here and a bunch of new cap-ex is required to winterise equipment, who bears that cost? Ultimately it’s the customer,” says Bobby Tudor, chairman of TPH. The second is how Texas can ensure the reliability of equipment in extreme weather conditions. After a polar vortex in 2014 hit the east coast, PJM, a regional transmission organisation, started making higher payments based on reliability of service, says Michael Weinstein of Credit Suisse, a bank. In Texas there is no penalty for systems going down, except for public complaints and politicians’ finger-pointing.

Texas is hardly the only state to struggle with blackouts. California, which has a more tightly regulated power market, is regularly plunged into darkness during periods of high heat, winds and wildfires. Unlike Texas, much of northern California is dependent on a single utility, PG&E. The company has been repeatedly sued for dismal, dangerous management. But, as in Texas, critics have blamed intermittent renewable power for blackouts. In truth, California’s blackouts share many of the same causes as those in Texas: extreme weather, power generators that failed unexpectedly, poor planning by state regulators and an inability (in California, temporary) to import power from elsewhere. In California’s blackouts last year, solar output naturally declined in the evening. But gas plants also went offline and weak rainfall lowered the output of hydroelectric dams.

In California, as in Texas, it would help to have additional power generation, energy storage to meet peak demand and more resilient infrastructure, such as buried power lines and more long-distance, high-voltage transmission. Weather events that once might have been dismissed as unusual are becoming more common. Without more investment in electricity grids, blackouts will be, too.

A Glimpse of America’s Future: Climate Change Means Trouble for Power Grids (New York Times)

nytimes.com

Brad Plumer, Feb. 17, 2021


Systems are designed to handle spikes in demand, but the wild and unpredictable weather linked to global warming will very likely push grids beyond their limits.
A street in Austin, Texas, without power on Monday evening.
Credit: Tamir Kalifa for The New York Times

Published Feb. 16, 2021; updated Feb. 17, 2021, 6:59 a.m. ET

Huge winter storms plunged large parts of the central and southern United States into an energy crisis this week, with frigid blasts of Arctic weather crippling electric grids and leaving millions of Americans without power amid dangerously cold temperatures.

The grid failures were most severe in Texas, where more than four million people woke up Tuesday morning to rolling blackouts. Separate regional grids in the Southwest and Midwest also faced serious strain. As of Tuesday afternoon, at least 23 people nationwide had died in the storm or its aftermath.

Analysts have begun to identify key factors behind the grid failures in Texas. Record-breaking cold weather spurred residents to crank up their electric heaters and pushed power demand beyond the worst-case scenarios that grid operators had planned for. At the same time, a large fraction of the state’s gas-fired power plants were knocked offline amid icy conditions, with some plants suffering fuel shortages as natural gas demand spiked. Many of Texas’ wind turbines also froze and stopped working.

The crisis sounded an alarm for power systems throughout the country. Electric grids can be engineered to handle a wide range of severe conditions — as long as grid operators can reliably predict the dangers ahead. But as climate change accelerates, many electric grids will face extreme weather events that go far beyond the historical conditions those systems were designed for, putting them at risk of catastrophic failure.

While scientists are still analyzing what role human-caused climate change may have played in this week’s winter storms, it is clear that global warming poses a barrage of additional threats to power systems nationwide, including fiercer heat waves and water shortages.

Measures that could help make electric grids more robust — such as fortifying power plants against extreme weather, or installing more backup power sources — could prove expensive. But as Texas shows, blackouts can be extremely costly, too. And, experts said, unless grid planners start planning for increasingly wild and unpredictable climate conditions, grid failures will happen again and again.

“It’s essentially a question of how much insurance you want to buy,” said Jesse Jenkins, an energy systems engineer at Princeton University. “What makes this problem even harder is that we’re now in a world where, especially with climate change, the past is no longer a good guide to the future. We have to get much better at preparing for the unexpected.”

Texas’ main electric grid, which largely operates independently from the rest of the country, has been built with the state’s most common weather extremes in mind: soaring summer temperatures that cause millions of Texans to turn up their air-conditioners all at once.

While freezing weather is rarer, grid operators in Texas have also long known that electricity demand can spike in the winter, particularly after damaging cold snaps in 2011 and 2018. But this week’s winter storms, which buried the state in snow and ice, and led to record-cold temperatures, surpassed all expectations — and pushed the grid to its breaking point.

Residents of East Dallas trying to warm up on Monday after their family home lost power.
Credit: Juan Figueroa/The Dallas Morning News, via Associated Press

Texas’ grid operators had anticipated that, in the worst case, the state would use 67 gigawatts of electricity during the winter peak. But by Sunday evening, power demand had surged past that level. As temperatures dropped, many homes were relying on older, inefficient electric heaters that consume more power.

The problems compounded from there, with frigid weather on Monday disabling power plants with capacity totaling more than 30 gigawatts. The vast majority of those failures occurred at thermal power plants, like natural gas generators, as plummeting temperatures paralyzed plant equipment and soaring demand for natural gas left some plants struggling to obtain sufficient fuel. A number of the state’s power plants were also offline for scheduled maintenance in preparation for the summer peak.

The state’s fleet of wind farms also lost up to 4.5 gigawatts of capacity at times, as many turbines stopped working in cold and icy conditions, though this was a smaller part of the problem.
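
A rough, back-of-the-envelope tally of the figures quoted above makes the scale of the gap concrete. This is illustrative only: the demand-overshoot number below is hypothetical, and ERCOT’s own accounting distinguishes forecast demand from available capacity.

```python
# Rough, illustrative arithmetic using the figures quoted above. These are not
# ERCOT's official numbers; they only show the scale of the shortfall.
winter_peak_planned_gw = 67.0   # worst-case winter demand ERCOT had planned for
thermal_outages_gw = 30.0       # thermal (mostly gas-fired) capacity knocked offline
wind_outages_gw = 4.5           # wind capacity lost to icing, at its worst

lost_capacity_gw = thermal_outages_gw + wind_outages_gw
print(f"Capacity lost to the storm: ~{lost_capacity_gw:.1f} GW")

# Demand ran above the planning scenario while supply fell, so the implied
# deficit is the lost capacity plus the overshoot (hypothetical figure below).
demand_overshoot_gw = 2.0
print(f"Implied shortfall: ~{lost_capacity_gw + demand_overshoot_gw:.1f} GW "
      f"against a {winter_peak_planned_gw:.0f} GW worst-case plan")
```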

In essence, experts said, an electric grid optimized to deliver huge quantities of power on the hottest days of the year was caught unprepared when temperatures plummeted.

While analysts are still working to untangle all of the reasons behind Texas’ grid failures, some have also wondered whether the unique way the state manages its largely deregulated electricity system may have played a role. In the mid-1990s, for instance, Texas decided against paying energy producers to hold a fixed number of backup power plants in reserve, instead letting market forces dictate what happens on the grid.

On Tuesday, Gov. Greg Abbott called for an emergency reform of the Electric Reliability Council of Texas, the nonprofit corporation that oversees the flow of power in the state, saying its performance had been “anything but reliable” over the previous 48 hours.

In theory, experts said, there are technical solutions that can avert such problems.

Wind turbines can be equipped with heaters and other devices so that they can operate in icy conditions — as is often done in the upper Midwest, where cold weather is more common. Gas plants can be built to store oil on-site and switch over to burning the fuel if needed, as is often done in the Northeast, where natural gas shortages are common. Grid regulators can design markets that pay extra to keep a larger fleet of backup power plants in reserve in case of emergencies, as is done in the Mid-Atlantic.

But these solutions all cost money, and grid operators are often wary of forcing consumers to pay extra for safeguards.

“Building in resilience often comes at a cost, and there’s a risk of both underpaying but also of overpaying,” said Daniel Cohan, an associate professor of civil and environmental engineering at Rice University. “It’s a difficult balancing act.”

In the months ahead, as Texas grid operators and policymakers investigate this week’s blackouts, they will likely explore how the grid might be bolstered to handle extremely cold weather. Some possible ideas include: Building more connections between Texas and other states to balance electricity supplies, a move the state has long resisted; encouraging homeowners to install battery backup systems; or keeping additional power plants in reserve.

The search for answers will be complicated by climate change. Over all, the state is getting warmer as global temperatures rise, and cold-weather extremes are, on average, becoming less common over time.

But some climate scientists have also suggested that global warming could, paradoxically, bring more unusually fierce winter storms. Some research indicates that Arctic warming is weakening the jet stream, the high-level air current that circles the northern latitudes and usually holds back the frigid polar vortex. This can allow cold air to periodically escape to the South, resulting in episodes of bitter cold in places that rarely get nipped by frost.

Credit: Jacob Ford/Odessa American, via Associated Press

But this remains an active area of debate among climate scientists, with some experts less certain that polar vortex disruptions are becoming more frequent, making it even trickier for electricity planners to anticipate the dangers ahead.

All over the country, utilities and grid operators are confronting similar questions, as climate change threatens to intensify heat waves, floods, water shortages and other calamities, all of which could create novel risks for the nation’s electricity systems. Adapting to those risks could carry a hefty price tag: One recent study found that the Southeast alone may need 35 percent more electric capacity by 2050 simply to deal with the known hazards of climate change.

And the task of building resilience is becoming increasingly urgent. Many policymakers are promoting electric cars and electric heating as a way of curbing greenhouse gas emissions. But as more of the nation’s economy depends on reliable flows of electricity, the cost of blackouts will become ever more dire.

“This is going to be a significant challenge,” said Emily Grubert, an infrastructure expert at Georgia Tech. “We need to decarbonize our power systems so that climate change doesn’t keep getting worse, but we also need to adapt to changing conditions at the same time. And the latter alone is going to be very costly. We can already see that the systems we have today aren’t handling this very well.”

John Schwartz, Dave Montgomery and Ivan Penn contributed reporting.

Climate change may be behind the Covid-19 pandemic; here’s why (Galileu)

revistagalileu.globo.com

Redação Galileu, Feb 5, 2021, 4:19 p.m. Updated Feb 5, 2021, 4:19 p.m.

A study indicates that bat species carrying different types of coronavirus have shifted regions over the years as a result of changes in the climate

The study indicates that 40 bat species migrated to China’s Yunnan province over the past century (Photo: Jackie Chin/Unsplash)
The world’s bat population carries around 3,000 different types of coronavirus (Photo: Jackie Chin/Unsplash)

Since the start of the novel coronavirus pandemic a year ago, some questions have remained unanswered, among them where Sars-CoV-2, the virus that causes Covid-19, came from. In late January, the journal Science of the Total Environment published a study pointing to a possible link between the pandemic and climate change.

According to the research, global greenhouse gas emissions over the past century favored the growth of bat habitat, making southern China a region conducive to the emergence and spread of the Sars-CoV-2 virus.

The analysis was based on a map of the world’s vegetation in the 20th century, using data on temperature, precipitation and cloud cover. The researchers analyzed the distribution of bats in the early 1900s and, comparing it with the current distribution, concluded that different species changed regions because of shifts in the planet’s climate.

“Understanding how the global distribution of bat species has shifted in response to climate change may be an important step in reconstructing the origin of the Covid-19 outbreak,” said Robert Beyer, a researcher in the University of Cambridge’s Department of Zoology in the UK and lead author of the study, in a statement.

Major changes were observed in the vegetation of China’s Yunnan province, Myanmar and Laos. Increases in temperature, sunlight and atmospheric carbon dioxide concentrations turned a habitat once made up of tropical shrubland into tropical savanna and temperate forest.

These new conditions created a favorable environment for 40 bat species to move into Yunnan province over the past century, concentrating more than 100 types of coronavirus in the area the data point to as the origin of the Sars-CoV-2 outbreak. The region is also home to pangolins, which are considered likely intermediate hosts in the pandemic.

“As climate change altered habitats, species left some areas and moved into others, taking their viruses with them. This not only changed the regions where the viruses are present but most likely allowed new interactions between animals and viruses, causing more dangerous viruses to be transmitted or to evolve,” Beyer explained.

The study also found that climate change has increased the number of bat species in other regions, such as Central Africa, South America and Central America. “The Covid-19 pandemic has caused tremendous social and economic damage. Governments must seize the opportunity to reduce the health risks posed by infectious diseases and act to mitigate climate change,” warned Professor Andrea Manica, of the University of Cambridge’s Department of Zoology.

The researchers also stress the need to limit the expansion of urban areas, farmland and hunting grounds into natural habitats in order to reduce contact between humans and disease-carrying animals.

This article is part of the #UmSóPlaneta initiative, a joint effort by 19 brands from Editora Globo, Edições Globo Condé Nast and CBN. Learn more at umsoplaneta.globo.com

COVID-19 lockdowns temporarily raised global temperatures, research shows (Science Daily)

Reductions in aerosol emissions had slight warming impact, study finds

Date: February 2, 2021

Source: National Center for Atmospheric Research/University Corporation for Atmospheric Research

Summary: The lockdowns and reduced societal activity related to the COVID-19 pandemic affected emissions of pollutants in ways that slightly warmed the planet for several months last year, according to new research. The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight.


The lockdowns and reduced societal activity related to the COVID-19 pandemic affected emissions of pollutants in ways that slightly warmed the planet for several months last year, according to new research led by the National Center for Atmospheric Research (NCAR).

The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight. When emissions of aerosols dropped last spring, more of the Sun’s warmth reached the planet, especially in heavily industrialized nations, such as the United States and Russia, that normally pump high amounts of aerosols into the atmosphere.

“There was a big decline in emissions from the most polluting industries, and that had immediate, short-term effects on temperatures,” said NCAR scientist Andrew Gettelman, the study’s lead author. “Pollution cools the planet, so it makes sense that pollution reductions would warm the planet.”

Temperatures over parts of Earth’s land surface last spring were about 0.2-0.5 degrees Fahrenheit (0.1-0.3 degrees Celsius) warmer than would have been expected with prevailing weather conditions, the study found. The effect was most pronounced in regions that normally are associated with substantial emissions of aerosols, with the warming reaching about 0.7 degrees F (0.37 C) over much of the United States and Russia.

The new study highlights the complex and often conflicting influences of different types of emissions from power plants, motor vehicles, industrial facilities, and other sources. While aerosols tend to brighten clouds and reflect heat from the Sun back into space, carbon dioxide and other greenhouse gases have the opposite effect, trapping heat near the planet’s surface and elevating temperatures.

Despite the short-term warming effects, Gettelman emphasized that the long-term impact of the pandemic may be to slightly slow climate change because of reduced emissions of carbon dioxide, which lingers in the atmosphere for decades and has a more gradual influence on climate. In contrast, aerosols — the focus of the new study — have a more immediate impact that fades away within a few years.

The study was published in Geophysical Research Letters. It was funded in part by the National Science Foundation, NCAR’s sponsor. In addition to NCAR scientists, the study was co-authored by scientists at Oxford University, Imperial College, and the University of Leeds.

Teasing out the impacts

Although scientists have long been able to quantify the warming impacts of carbon dioxide, the climatic influence of various types of aerosols — including sulfates, nitrates, black carbon, and dust — has been more difficult to pin down. One of the major challenges for projecting the extent of future climate change is estimating the extent to which society will continue to emit aerosols in the future and the influence of the different types of aerosols on clouds and temperature.

To conduct the research, Gettelman and his co-authors used two of the world’s leading climate models: the NCAR-based Community Earth System Model and a model known as ECHAM-HAMMOZ, which was developed by a consortium of European nations. They ran simulations on both models, adjusting emissions of aerosols and incorporating actual meteorological conditions in 2020, such as winds.

This approach enabled them to identify the impact of reduced emissions on temperature changes that were too small to tease out in actual observations, where they could be obscured by the variability in atmospheric conditions.
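
The approach described above is a counterfactual-ensemble comparison. Below is a minimal sketch of the bookkeeping, assuming hypothetical NetCDF file names and a near-surface temperature variable called "tas"; the actual CESM and ECHAM-HAMMOZ post-processing is considerably more involved.

```python
# Minimal sketch of a counterfactual-ensemble comparison: run the same 2020
# meteorology with baseline vs. reduced (COVID-era) aerosol emissions and
# difference the ensemble means. File names and variable layout are hypothetical.
import xarray as xr

baseline = xr.open_mfdataset("baseline_aerosols_member*.nc",
                             combine="nested", concat_dim="member")
covid = xr.open_mfdataset("covid_aerosols_member*.nc",
                          combine="nested", concat_dim="member")

# Ensemble-mean near-surface temperature for spring 2020 in each scenario.
base_t = baseline["tas"].sel(time=slice("2020-03", "2020-05")).mean(["member", "time"])
covid_t = covid["tas"].sel(time=slice("2020-03", "2020-05")).mean(["member", "time"])

# The warming attributable to the drop in aerosols is the difference; averaging
# many members suppresses the internal variability that would otherwise swamp
# a signal of only 0.1-0.3 degrees C.
aerosol_warming = covid_t - base_t
print(float(aerosol_warming.mean()))
```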

The results showed that the warming effect was strongest in the mid and upper latitudes of the Northern Hemisphere. The effect was mixed in the tropics and comparatively minor in much of the Southern Hemisphere, where aerosol emissions are not as pervasive.

Gettelman said the study will help scientists better understand the influence of various types of aerosols in different atmospheric conditions, helping to inform efforts to minimize climate change. Although the research illustrates how aerosols counter the warming influence of greenhouse gases, he emphasized that emitting more of them into the lower atmosphere is not a viable strategy for slowing climate change.

“Aerosol emissions have major health ramifications,” he said. “Saying we should pollute is not practical.”


Story Source:

Materials provided by National Center for Atmospheric Research/University Corporation for Atmospheric Research. Original written by David Hosansky. Note: Content may be edited for style and length.


Journal Reference:

  1. A. Gettelman, R. Lamboll, C. G. Bardeen, P. M. Forster, D. Watson‐Parris. Climate Impacts of COVID‐19 Induced Emission Changes. Geophysical Research Letters, 2021; 48 (3) DOI: 10.1029/2020GL091805

Climate crisis: world is at its hottest for at least 12,000 years – study (The Guardian)

theguardian.com

Damian Carrington, Environment editor @dpcarrington

Wed 27 Jan 2021 16.00 GMT

The world’s continuously warming climate is also revealed in contemporary ice melt at glaciers, such as this one in the Kenai mountains, Alaska (seen in September 2019). Photograph: Joe Raedle/Getty Images

The planet is hotter now than it has been for at least 12,000 years, a period spanning the entire development of human civilisation, according to research.

Analysis of ocean surface temperatures shows human-driven climate change has put the world in “uncharted territory”, the scientists say. The planet may even be at its warmest for 125,000 years, although data from that far back is less certain.

The research, published in the journal Nature, reached these conclusions by solving a longstanding puzzle known as the “Holocene temperature conundrum”. Climate models have indicated continuous warming since the last ice age ended 12,000 years ago and the Holocene period began. But temperature estimates derived from fossil shells showed a peak of warming 6,000 years ago and then a cooling, until the industrial revolution sent carbon emissions soaring.

This conflict undermined confidence in the climate models and the shell data. But it was found that the shell data reflected only hotter summers and missed colder winters, and so was giving misleadingly high annual temperatures.

“We demonstrate that global average annual temperature has been rising over the last 12,000 years, contrary to previous results,” said Samantha Bova, at Rutgers University–New Brunswick in the US, who led the research. “This means that the modern, human-caused global warming period is accelerating a long-term increase in global temperatures, making today completely uncharted territory. It changes the baseline and emphasises just how critical it is to take our situation seriously.”

The world may be hotter now than any time since about 125,000 years ago, which was the last warm period between ice ages. However, scientists cannot be certain as there is less data relating to that time.

One study, published in 2017, suggested that global temperatures were last as high as today 115,000 years ago, but that was based on less data.

The research examined temperature measurements derived from the chemistry of tiny shells and algal compounds found in cores of ocean sediments, and solved the conundrum by taking account of two factors.

First, the shells and organic materials had been assumed to represent the entire year but in fact were most likely to have formed during summer when the organisms bloomed. Second, there are well-known predictable natural cycles in the heating of the Earth caused by eccentricities in the orbit of the planet. Changes in these cycles can lead to summers becoming hotter and winters colder while average annual temperatures change only a little.

Combining these insights showed that the apparent cooling after the warm peak 6,000 years ago, revealed by shell data, was misleading. The shells were in fact only recording a decline in summer temperatures, but the average annual temperatures were still rising slowly, as indicated by the models.
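
A toy numerical illustration of that argument, using entirely synthetic numbers rather than the study’s data: when a proxy records only summer conditions, a shrinking seasonal cycle can produce an apparent peak and decline even while the annual mean keeps rising.

```python
# Toy illustration of the seasonal-bias argument, with synthetic numbers only.
# The annual mean warms through the Holocene while orbital cycles damp summers;
# a proxy that records only summers then shows a spurious peak-and-decline.
import numpy as np

elapsed_ka = np.linspace(0.0, 12.0, 121)                 # time since 12,000 years ago
annual_mean = 13.0 + (1.0 - np.exp(-elapsed_ka / 3.0))   # warms, then levels off
summer_amplitude = 6.0 - elapsed_ka / 6.0                # seasonal cycle shrinks
summer_proxy = annual_mean + summer_amplitude            # what summer-blooming shells record

peak_ka_bp = 12.0 - elapsed_ka[np.argmax(summer_proxy)]
print(f"Annual mean changes by {annual_mean[-1] - annual_mean[0]:+.2f} C (warming)")
print(f"Summer-only proxy peaks ~{peak_ka_bp:.1f} ka BP, then declines by "
      f"{summer_proxy.max() - summer_proxy[-1]:.2f} C")
```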

“Now they actually match incredibly well and it gives us a lot of confidence that our climate models are doing a really good job,” said Bova.

The study looked only at ocean temperature records, but Bova said: “The temperature of the sea surface has a really controlling impact on the climate of the Earth. If we know that, it is the best indicator of what global climate is doing.”

She led a research voyage off the coast of Chile in 2020 to take more ocean sediment cores and add to the available data.

Jennifer Hertzberg, of Texas A&M University in the US, said: “By solving a conundrum that has puzzled climate scientists for years, Bova and colleagues’ study is a major step forward. Understanding past climate change is crucial for putting modern global warming in context.”

Lijing Cheng, at the International Centre for Climate and Environment Sciences in Beijing, China, recently led a study that showed that in 2020 the world’s oceans reached their hottest level yet in instrumental records dating back to the 1940s. More than 90% of global heating is taken up by the seas.

Cheng said the new research was useful and intriguing. It provided a method to correct temperature data from shells and could also enable scientists to work out how much heat the ocean absorbed before the industrial revolution, a factor little understood.

The level of carbon dioxide today is at its highest for about 4m years and is rising at the fastest rate for 66m years. Further rises in temperature and sea level are inevitable until greenhouse gas emissions are cut to net zero.

‘Star Wars without Darth Vader’ – why the UN climate science story names no villains (Climate Home News)

Published on 12/01/2021, 4:10pm

As the next blockbuster science report on cutting emissions goes to governments for review, critics say it downplays the obstructive role of fossil fuel lobbying

Darth Vader: What would Star Wars be without its villain? (Pic: Pixabay)

By Joe Lo

On Monday, a weighty draft report on how to halt and reverse human-caused global warming will hit the inboxes of government experts. This is the final review before the Intergovernmental Panel on Climate Change (IPCC) issues its official summary of the science.

While part of the brief was to identify barriers to climate action, critics say there is little space given to the obstructive role of fossil fuel lobbying – and that’s a problem.

Robert Brulle, an American sociologist who has long studied institutions that promote climate denial, likened it to “trying to tell the story of Star Wars, but omitting Darth Vader”.

Tweeting in November, Brulle explained he declined an invitation to contribute to the working group three (WG3) report. “It became clear to me that institutionalized efforts to obstruct climate action was a peripheral concern. So I didn’t consider it worth engaging in this effort. It really deserves its own chapter & mention in the summary.”

In an email exchange with Climate Home News, Brulle expressed a hope the final version would nonetheless reflect his feedback. The significance of obstruction efforts should be reflected in the summary for policymakers and not “buried in an obscure part of the report,” he wrote.

His tweet sparked a lively conversation among scientists, with several supporting his concerns and others defending the IPCC, which aims to give policymakers an overview of the scientific consensus.

David Keith, a Harvard researcher into solar geoengineering, agreed the IPCC “tells a bloodless story, an abstract numb version of the sharp political conflict that will shape climate action”.

Social ecology and ecological economics professor Julia Steinberger, a lead author on WG3, said “there is a lot of self-censorship” within the IPCC. Where authors identify enemies of climate action, like fossil fuel companies, that content is “immediately flagged as political or normative or policy-prescriptive”.

The next set of reports is likely to be “a bit better” at covering the issue than previous efforts, Steinberger added, “but mainly because the world and outside publications have overwhelmingly moved past this, and the IPCC is catching up: not because the IPCC is leading.”

Politics professor Matthew Paterson was a lead author on WG3 for the previous round of assessment reports, published in 2014. He told Climate Home that Brulle is “broadly right” that lobbying hasn’t been given enough attention, although there is a “decent chunk” in the latest draft on corporations fighting for their interests and slowing down climate action.

Paterson said this was partly because the expertise of authors didn’t cover fossil fuel company lobbying and partly because governments would oppose giving the subject greater prominence. “Not just Saudi Arabia,” he said. “They object to everything. But the Americans [and others too]”.

While the IPCC reports are produced by scientists, government representatives negotiate the initial scope and have some influence over how the evidence is summarised before approving them for publication. “There was definitely always a certain adaptation – or an internalised sense of what governments are and aren’t going to accept – in the report,” said Paterson.

The last WG3 report in 2014 was nearly 1,500 pages long. Lobbying was not mentioned in its 32-page ‘summary for policymakers’ but lobbying against carbon taxes is mentioned a few times in the full report.

On page 1,184, the report says some companies “promoted climate scepticism by providing financial resources to like-minded think-tanks and politicians”. The report immediately balances this by saying “other fossil fuel companies adopted a more supportive position on climate science”.

One of the co-chairs of WG3, Jim Skea, rejected the criticisms as “completely unfair”. He told Climate Home News: “The IPCC produces reports very slowly because the whole cycle lasts seven years… we can’t respond on a 24/7 news cycle basis to ideas that come up.”

Skea noted there was a chapter on policies and institutions in the 2014 report which covered lobbying from industry and from green campaigners and their influence on climate policy. “The volume of climate change mitigation literature that comes out every year is huge and I would say that the number of references to articles which talk about lobbying of all kinds – including industrial lobbying and whether people had known about the science – it is in there and about the right proportions”, he said.

“We’re not an advocacy organisation, we’re a scientific organisation, it’s not our job to take up arms and take one side or another,” he said. “That’s the strength of the IPCC. If it oversteps its role, it will weaken its influence” and “undermine the scientific statements it makes”.

A broader, long-running criticism of the IPCC is that it downplays subjects like political science, development studies, sociology and anthropology and over-relies on economists and the people who put together ‘integrated assessment models’ (IAMs), which attempt to answer big questions like how the world can keep to 1.5C of global warming.

Paterson said the IPCC is “largely dominated by large-scale modellers or economists and the representation of other sorts of social scientists’ expertise is very thin”. A report he co-authored on the social make-up of that IPCC working group found that nearly half the authors were engineers or economists but just 15% were from social sciences other than economics. This dominance was sharper among the more powerful authors. Of the 35 Contributing Lead Authors, 20 were economists or engineers; there was one each from political science, geography and law, and none from the humanities.

Wim Carton, a lecturer in the political economy of climate change mitigation at Lund University, said that the IPCC (and scientific research in general) has been caught up in “adulation” of IAMs and this has led to “narrow techno-economic conceptualisations of future mitigation pathways”.

Skea said that there has been lots of material on political science and international relations and even “quite a bit” on moral philosophy. He told Climate Home: “It’s not the case that IPCC is only economics and modelling. Frankly, a lot of that catches attention because these macro numbers are eye-catching. There’s a big difference in the emphasis in [media] coverage of IPCC reports and the balance of materials when you go into the reports themselves.”

According to Skea’s calculations, the big models make up only 6% of the report contents, about a quarter of the summary and the majority of the press coverage. “But there’s an awful lot of bread-and-butter material in IPCC reports which is just about how you get on with it,” he added. “It’s not sexy material but it’s just as important because that’s what needs to be done to mitigate climate change.”

While saying their dominance had been amplified by the media, Skea defended the usefulness of IAMs. “Our audience are governments. Their big question is how you connect all this human activity with actual impacts on the climate. It’s very difficult to make that leap without actually modelling it. You can’t do it with lots of little micro-studies. You need models and you need scenarios to think your way through that connection.”

The IPCC has also been accused of placing too much faith in negative emissions technologies and geo-engineering. Carton calls these technologies ‘carbon unicorns’ because he says they “do not exist at any meaningful scale” and probably never will.

In a recent book chapter, Carton argues: “If one is to believe recent IPCC reports, then gone are the days when the world could resolve the climate crisis merely by reducing emissions. Avoiding global warming in excess of 2°C/1.5°C now also involves a rather more interventionist enterprise: to remove vast amounts of carbon dioxide from the atmosphere, amounts that only increase the longer emissions refuse to fall.”

When asked about carbon capture technologies, Skea said that in terms of deployment, “they haven’t moved on very much” since the last big IPCC report in 2014. He added that carbon capture and storage and bio-energy are “all things that have been done commercially somewhere in the world.”

“What has never been done”, he said, “is to connect the different parts of the system together and run them over all. That’s led many people looking at the literature to conclude that the main barriers to the adoption of some technologies are the lack of policy incentives and the lack of working out good business models to put what would be complex supply chains together – rather than anything that’s standing in the way technically.”

The next set of three IPCC assessment reports was originally due to be published in 2021, but work was delayed by the coronavirus pandemic. Governments and experts will have from 18 January to 14 March to read and comment on the draft for WG3. Dates for a final government review have yet to be set.

Inner Workings: Crop researchers harness artificial intelligence to breed crops for the changing climate (PNAS)

Carolyn Beans PNAS November 3, 2020 117 (44) 27066-27069; first published October 14, 2020; https://doi.org/10.1073/pnas.2018732117

Until recently, the field of plant breeding looked a lot like it did in centuries past. A breeder might examine, for example, which tomato plants were most resistant to drought and then cross the most promising plants to produce the most drought-resistant offspring. This process would be repeated, plant generation after generation, until, over the course of roughly seven years, the breeder arrived at what seemed the optimal variety.

Figure 1. Researchers at ETH Zürich use standard color images and thermal images collected by drone to determine how plots of wheat with different genotypes vary in grain ripeness. Image credit: Norbert Kirchgessner (ETH Zürich, Zürich, Switzerland).

Now, with the global population expected to swell to nearly 10 billion by 2050 (1) and climate change shifting growing conditions (2), crop breeder and geneticist Steven Tanksley doesn’t think plant breeders have that kind of time. “We have to double the productivity per acre of our major crops if we’re going to stay on par with the world’s needs,” says Tanksley, a professor emeritus at Cornell University in Ithaca, NY.

To speed up the process, Tanksley and others are turning to artificial intelligence (AI). Using computer science techniques, breeders can rapidly assess which plants grow the fastest in a particular climate, which genes help plants thrive there, and which plants, when crossed, produce an optimum combination of genes for a given location, opting for traits that boost yield and stave off the effects of a changing climate. Large seed companies in particular have been using components of AI for more than a decade. With computing power rapidly advancing, the techniques are now poised to accelerate breeding on a broader scale.

AI is not, however, a panacea. Crop breeders still grapple with tradeoffs such as higher yield versus marketable appearance. And even the most sophisticated AI cannot guarantee the success of a new variety. But as AI becomes integrated into agriculture, some crop researchers envisage an agricultural revolution with computer science at the helm.

An Art and a Science

During the “green revolution” of the 1960s, researchers developed new chemical pesticides and fertilizers along with high-yielding crop varieties that dramatically increased agricultural output (3). But the reliance on chemicals came with the heavy cost of environmental degradation (4). “If we’re going to do this sustainably,” says Tanksley, “genetics is going to carry the bulk of the load.”

Plant breeders lean not only on genetics but also on mathematics. As the genomics revolution unfolded in the early 2000s, plant breeders found themselves inundated with genomic data that traditional statistical techniques couldn’t wrangle (5). Plant breeding “wasn’t geared toward dealing with large amounts of data and making precise decisions,” says Tanksley.

In 1997, Tanksley began chairing a committee at Cornell that aimed to incorporate data-driven research into the life sciences. There, he encountered an engineering approach called operations research that translates data into decisions. In 2006, Tanksley cofounded the Ithaca, NY-based company Nature Source Improved Plants on the principle that this engineering tool could make breeding decisions more efficient. “What we’ve been doing almost 15 years now,” says Tanksley, “is redoing how breeding is approached.”

A Manufacturing Process

Such approaches try to tackle complex scenarios. Suppose, for example, a wheat breeder has 200 genetically distinct lines. The breeder must decide which lines to breed together to optimize yield, disease resistance, protein content, and other traits. The breeder may know which genes confer which traits, but it’s difficult to decipher which lines to cross in what order to achieve the optimum gene combination. The number of possible combinations, says Tanksley, “is more than the stars in the universe.”

An operations research approach enables a researcher to solve this puzzle by defining the primary objective and then using optimization algorithms to predict the quickest path to that objective given the relevant constraints. Auto manufacturers, for example, optimize production given the expense of employees, the cost of auto parts, and fluctuating global currencies. Tanksley’s team optimizes yield while selecting for traits such as resistance to a changing climate. “We’ve seen more erratic climate from year to year, which means you have to have crops that are more robust to different kinds of changes,” he says.

For each plant line included in a pool of possible crosses, Tanksley inputs DNA sequence data, phenotypic data on traits like drought tolerance, disease resistance, and yield, as well as environmental data for the region where the plant line was originally developed. The algorithm projects which genes are associated with which traits under which environmental conditions and then determines the optimal combination of genes for a specific breeding goal, such as drought tolerance in a particular growing region, while accounting for genes that help boost yield. The algorithm also determines which plant lines to cross together in which order to achieve the optimal combination of genes in the fewest generations.
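
As a deliberately tiny, hypothetical illustration of cross selection framed as an optimization problem (not Nature Source Improved Plants’ actual algorithm, which searches multi-generation crossing plans over an enormous space), one can score candidate crosses against a weighted breeding objective and pick the best pairing:

```python
# Hypothetical sketch of cross selection as optimization. Each line carries 0/1
# marker genotypes (made-up data); a cross is scored against a weighted breeding
# objective and the best pairing is found by brute force.
from itertools import combinations

lines = {
    "A": {"drought_tolerance": 1, "disease_resistance": 0, "yield_gene": 1},
    "B": {"drought_tolerance": 0, "disease_resistance": 1, "yield_gene": 1},
    "C": {"drought_tolerance": 1, "disease_resistance": 1, "yield_gene": 0},
    "D": {"drought_tolerance": 0, "disease_resistance": 0, "yield_gene": 1},
}

# Breeding objective: weights reflect the target environment (e.g. drought-prone).
weights = {"drought_tolerance": 3.0, "disease_resistance": 2.0, "yield_gene": 1.5}

def cross_score(parent1, parent2):
    """Score a cross by the weighted markers an offspring could inherit from either parent."""
    return sum(w * max(lines[parent1][g], lines[parent2][g]) for g, w in weights.items())

best_pair = max(combinations(lines, 2), key=lambda pair: cross_score(*pair))
print("Best cross:", best_pair, "score:", cross_score(*best_pair))
```

At realistic scale the brute-force search is replaced by optimization algorithms over multi-generation plans, but the objective-and-constraints framing is the same.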

Nature Source Improved Plants conducts, for example, a papaya program in southeastern Mexico where the once predictable monsoon season has become erratic. “We are selecting for varieties that can produce under those unknown circumstances,” says Tanksley. But the new papaya must also stand up to ringspot, a virus that nearly wiped papaya from Hawaii altogether before another Cornell breeder developed a resistant transgenic variety (6). Tanksley’s papaya isn’t as disease resistant. But by plugging “rapid growth rate” into their operations research approach, the team bred papaya trees that produce copious fruit within a year, before the virus accumulates in the plant.

“Plant breeders need operations research to help them make better decisions,” says William Beavis, a plant geneticist and computational biologist at Iowa State in Ames, who also develops operations research strategies for plant breeding. To feed the world in rapidly changing environments, researchers need to shorten the process of developing a new cultivar to three years, Beavis adds.

The big seed companies have investigated use of operations research since around 2010, with Syngenta, headquartered in Basel, Switzerland, leading the pack, says Beavis, who spent over a decade as a statistical geneticist at Pioneer Hi-Bred in Johnston, IA, a large seed company now owned by Corteva, which is headquartered in Wilmington, DE. “All of the soybean varieties that have come on the market within the last couple of years from Syngenta came out of a system that had been redesigned using operations research approaches,” he says. But large seed companies primarily focus on grains key to animal feed such as corn, wheat, and soy. To meet growing food demands, Beavis believes that the smaller seed companies that develop vegetable crops that people actually eat must also embrace operations research. “That’s where operations research is going to have the biggest impact,” he says, “local breeding companies that are producing for regional environments, not for broad adaptation.”

In collaboration with Iowa State colleague and engineer Lizhi Wang and others, Beavis is developing operations research-based algorithms to, for example, help seed companies choose whether to breed one variety that can survive in a range of different future growing conditions or a number of varieties, each tailored to specific environments. Two large seed companies, Corteva and Syngenta, and Kromite, a Lambertville, NJ-based consulting company, are partners on the project. The results will be made publicly available so that all seed companies can learn from their approach.

Figure 2. Nature Source Improved Plants (NSIP) speeds up its papaya breeding program in southeastern Mexico by using decision-making approaches more common in engineering. Image credit: Nature Source Improved Plants/Jesús Morales.

Drones and Adaptations

Useful farming AI requires good data, and plenty of it. To collect sufficient inputs, some researchers take to the skies. Crop researcher Achim Walter of the Institute of Agricultural Sciences at ETH Zürich in Switzerland and his team are developing techniques to capture aerial crop images. Every other day for several years, they have deployed image-capturing sensors over a wheat field containing hundreds of genetic lines. They fly their sensors on drones or on cables suspended above the crops or incorporate them into handheld devices that a researcher can use from an elevated platform (7).

Meanwhile, they’re developing imaging software that quantifies growth rate captured by these images (8). Using these data, they build models that predict how quickly different genetic lines grow under different weather conditions. If they find, for example, that a subset of wheat lines grew well despite a dry spell, then they can zero in on the genes those lines have in common and incorporate them into new drought-resistant varieties.
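
A hypothetical sketch of that last step, with synthetic numbers standing in for the image-derived growth series: regress each line’s growth on a dry-spell indicator and flag the lines whose growth barely responds.

```python
# Hypothetical sketch: estimate, per genetic line, how growth responds to dry
# spells, then flag lines that keep growing under drought. Synthetic data; the
# ETH pipeline uses image-derived canopy-cover time series instead.
import numpy as np

rng = np.random.default_rng(1)
n_lines, n_days = 5, 60
dry_spell = (rng.random(n_days) < 0.3).astype(float)   # 1 = dry day, 0 = wet day

drought_tolerant = []
for line in range(n_lines):
    true_sensitivity = rng.uniform(-0.8, 0.0)           # growth lost per dry day
    growth = 1.0 + true_sensitivity * dry_spell + rng.normal(0, 0.1, n_days)
    # Fit growth ~ a + b * dry_spell by least squares; b estimates drought sensitivity.
    X = np.column_stack([np.ones(n_days), dry_spell])
    _, sensitivity = np.linalg.lstsq(X, growth, rcond=None)[0]
    if sensitivity > -0.2:                               # barely slowed by drought
        drought_tolerant.append(line)

print("Candidate drought-tolerant lines:", drought_tolerant)
```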

Research geneticist Edward Buckler at the US Department of Agriculture and his team are using machine learning to identify climate adaptations in 1,000 species in a large grouping of grasses spread across the globe. The grasses include food and bioenergy crops such as maize, sorghum, and sugar cane. Buckler says that when people rank what are the most photosynthetically efficient and water-efficient species, this is the group that comes out at the top. Still, he and collaborators, including plant scientist Elizabeth Kellogg of the Donald Danforth Plant Science Center in St. Louis, MO, and computational biologist Adam Siepel of Cold Spring Harbor Laboratory in NY, want to uncover genes that could make crops in this group even more efficient for food production in current and future environments. The team is first studying a select number of model species to determine which genes are expressed under a range of different environmental conditions. They’re still probing just how far this predictive power can go.

Such approaches could be scaled up—massively. To probe the genetic underpinnings of climate adaptation for crop species worldwide, Daniel Jacobson, the chief researcher for computational systems biology at Oak Ridge National Laboratory in TN, has amassed “climatype” data for every square kilometer of land on Earth. Using the Summit supercomputer, they then compared each square kilometer to every other square kilometer to identify similar environments (9). The result can be viewed as a network of GPS points connected by lines that show the degree of environmental similarity between points.
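
The comparison described above boils down to an all-pairs similarity computation over environmental variables, thresholded into a network. Here is a small, hypothetical sketch of the idea; the real analysis covers every square kilometer of land and runs on the Summit supercomputer with explainable-AI methods.

```python
# Small, hypothetical sketch of the "climatype" idea: describe each grid cell by
# a vector of environmental variables, compute pairwise similarity, and keep
# edges above a threshold to form a network of environmentally similar places.
import numpy as np

rng = np.random.default_rng(0)
n_cells = 6
# Columns: mean temperature (C), annual precipitation (mm), soil pH (made-up values).
env = rng.normal([15.0, 800.0, 6.5], [8.0, 400.0, 0.8], size=(n_cells, 3))

# Standardize variables so no single unit dominates the distance.
z = (env - env.mean(axis=0)) / env.std(axis=0)

# Similarity = inverse of Euclidean distance in standardized climate space.
dists = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
similarity = 1.0 / (1.0 + dists)

threshold = 0.5
edges = [(i, j) for i in range(n_cells) for j in range(i + 1, n_cells)
         if similarity[i, j] > threshold]
print("Environmentally similar cell pairs:", edges)
```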

In collaboration with the US Department of Energy’s Center for Bioenergy Innovation, the team combines this climatype data with GPS coordinates associated with individual crop genotypes to project which genes and genetic interactions are associated with specific climate conditions. Right now, they’re focused on bioenergy and feedstocks, but they’re poised to explore a wide range of food crops as well. The results will be published so that other researchers can conduct similar analyses.

The Next Agricultural Revolution

Despite these advances, the transition to AI can be unnerving. Operations research can project an ideal combination of genes, but those genes may interact in unpredictable ways. Tanksley’s company hedges its bets by engineering 10 varieties for a given project in hopes that at least one will succeed.

On the other hand, such a directed approach could miss happy accidents, says Molly Jahn, a geneticist and plant breeder at the University of Wisconsin–Madison. “For me, breeding is much more like art. I need to see the variation and I don’t prejudge it,” she says. “I know what I’m after, but nature throws me curveballs all the time, and I probably can’t count the varieties that came from curveballs.”

There are also inherent tradeoffs that no algorithm can overcome. Consumers may prefer tomatoes with a leafy crown that stays green longer. But the price a breeder pays for that green calyx is one percent of the yield, says Tanksley.

Image recognition technology comes with its own host of challenges, says Walter. “To optimize algorithms to an extent that makes it possible to detect a certain trait, you have to train the algorithm thousands of times.” In practice, that means snapping thousands of crop images in a range of light conditions. Then there’s the ground-truthing. To know whether the models work, Walter and others must measure the trait they’re after by hand. Keen to know whether the model accurately captures the number of kernels on an ear of corn? You’d have to count the kernels yourself.

Despite these hurdles, Walter believes that computer science has brought us to the brink of a new agricultural revolution. In a 2017 PNAS Opinion piece, Walter and colleagues described emerging “smart farming” technologies—from autonomous weeding vehicles to moisture sensors in the soil (10). The authors worried, though, that only big industrial farms can afford these solutions. To make agriculture more sustainable, smaller farms in developing countries must have access as well.

Fortunately, “smart breeding” advances may have wider reach. Once image recognition technology becomes more developed for crops, which Walter expects will happen within the next 10 years, deploying it may be relatively inexpensive. Breeders could operate their own drones and obtain more precise ratings of traits like time to flowering or number of fruits in shorter time, says Walter. “The computing power that you need once you have established the algorithms is not very high.”

The genomic data so vital to AI-led breeding programs is also becoming more accessible. “We’re really at this point where genomics is cheap enough that you can apply these technologies to hundreds of species, maybe thousands,” says Buckler.

Plant breeding has “entered the engineered phase,” adds Tanksley. And with little time to spare. “The environment is changing,” he says. “You have to have a faster breeding process to respond to that.”

Published under the PNAS license.

References

1. United Nations, Department of Economic and Social Affairs, Population Division, World Population Prospects 2019: Highlights, (United Nations, New York, 2019).

2. N. Jones, “Redrawing the map: How the world’s climate zones are shifting” Yale Environment 360 (2018). https://e360.yale.edu/features/redrawing-the-map-how-the-worlds-climate-zones-are-shifting. Accessed 14 May 2020.

3. P. L. Pingali, Green revolution: Impacts, limits, and the path ahead. Proc. Natl. Acad. Sci. U.S.A. 109, 12302–12308 (2012).

4. D. Tilman, The greening of the green revolution. Nature 396, 211–212 (1998).

5. G. P. Ramstein, S. E. Jensen, E. S. Buckler, Breaking the curse of dimensionality to identify causal variants in Breeding 4. Theor. Appl. Genet. 132, 559–567 (2019).

6. D. Gonsalves, Control of papaya ringspot virus in papaya: A case study. Annu. Rev. Phytopathol. 36, 415–437 (1998).

7. N. Kirchgessner et al., The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 44, 154–168 (2016).

8. K. Yu, N. Kirchgessner, C. Grieder, A. Walter, A. Hund, An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping. Plant Methods 13, 15 (2017).

9. J. Streich et al., Can exascale computing and explainable artificial intelligence applied to plant biology deliver on the United Nations sustainable development goals? Curr. Opin. Biotechnol. 61, 217–225 (2020).

10. A. Walter, R. Finger, R. Huber, N. Buchmann, Opinion: Smart farming is key to developing sustainable agriculture. Proc. Natl. Acad. Sci. U.S.A. 114, 6148–6150 (2017).

UK to make climate risk reports mandatory for large companies (Guardian)

theguardian.com

Larry Elliott, Mon 9 Nov 2020 19.18 GMT. Last modified on Tue 10 Nov 2020 04.37 GMT

St. Paul’s Cathedral and buildings of the City of London financial district are seen as buses cross Waterloo bridge at sunset
Sunak said departure from the EU meant the UK’s financial services sector – which employs more than a million people – was entering a new chapter. Photograph: Toby Melville/Reuters

Large companies and financial institutions in the UK will have to come clean about their exposure to climate risks within five years under the terms of a tougher regime announced by the chancellor, Rishi Sunak.

In an attempt to demonstrate the government’s commitment to tackling global heating, Sunak said the UK would go further than an international taskforce had recommended and make disclosure by large businesses mandatory.

The chancellor also announced plans for Britain’s first green gilt – a bond that will be floated in the financial markets during 2021 with the money raised paying for investment in carbon-reducing projects and the creation of jobs across the country.

In a Commons statement, Sunak said departure from the EU meant the financial services sector – which employs more than a million people – was entering a new chapter.

“This new chapter means putting the full weight of private sector innovation, expertise and capital behind the critical global effort to tackle climate change and protect the environment.

“We’re announcing the UK’s intention to mandate climate disclosures by large companies and financial institutions across our economy, by 2025, going further than recommended by the Task Force on Climate-related Financial Disclosures, and the first G20 country to do so.”

The Treasury said the new disclosure rules and regulations would cover a significant portion of the economy, including listed commercial companies, UK-registered large private companies, banks, building societies, insurance companies, UK-authorised asset managers, life insurers, pension schemes regulated by the Financial Conduct Authority and occupational pension schemes.

The government plans to make Britain a net-zero-carbon country by 2050 and the previous governor of the Bank of England, Mark Carney, told a London conference that the Covid-19 pandemic illustrated the dangers of ill-preparation and of underestimating risks.

Climate change was “a crisis that involves the whole world and from which no one will be able to self-isolate”, Carney said on Monday.

His successor at Threadneedle Street, Andrew Bailey, said the decision to issue a green bond underlined the UK’s commitment to combating climate change – as did Sunak’s announcement that disclosures related to climate change risk would be mandatory by 2025.

Sunak, Carney and Bailey were all speakers at the Green Horizon summit, which took place in London on what would have been the first day of the UN climate change conference in Glasgow had Covid-19 not forced the postponement of the event.

Bailey said: “Our goal is to build a UK financial system resilient to the risks from climate change and supportive of the transition to a net-zero economy. In the aftermath of the financial crisis we took far-reaching action to make the financial system more resilient against crises – Covid is the first real test of those changes.”

Doug Parr, Greenpeace UK’s policy director, said: “Tackling climate change means the corporate sector is not just green round the edges but green right to its core. The chancellor’s plans to make disclosure mandatory for companies is right if the rules are compulsory and thorough.

“The real win would be to make all financial institutions put in place plans to meet the Paris climate agreement by the end of next year, steadily choking off the supply of cash to planet-wrecking activities. Disclosure is a route to making that happen, but not an end in itself.”

Roger Barker, the director of policy and corporate governance at the Institute of Directors, said: “What gets measured gets changed. The problem is there’s a hundred and one different ways of measuring climate impact out there right now. It’s a confusing landscape for companies and investors alike, so bringing in common standards is absolutely the right thing to do.”

Fran Boait, the executive director of the campaign group Positive Money, said: “We desperately need more green public investment if we are to have a fair, green transition, so it’s positive that the government has signalled that it is finally taking this more seriously, by issuing green gilts for the first time.”

Solar geoengineering should not be ruled out, scientists say (TecMundo)

November 3, 2020, 7:00 p.m. (3 min read)


Reinaldo Zaruvni

Once viewed with suspicion by the scientific community, methods of artificially intervening in the environment to curb the devastating effects of global warming are now being considered as last-resort options, since initiatives to cut greenhouse gas emissions depend directly on collective action and take decades to produce any benefit. We may not have that much time, according to some researchers in the field, who have been attracting investment and a great deal of attention.

Belonging to a field also referred to as solar geoengineering, most of these methods rely on the controlled release of particles into the atmosphere, which block part of the energy our planet receives and redirect it back into space, creating a cooling effect similar to that produced by volcanic eruptions.

Even though they do nothing about pollution itself, scientists argue that, in the face of increasingly aggressive storms, fire tornadoes, floods and other natural disasters, such measures would be worthwhile while more effective solutions are still being developed.

Michael Gerrard, director of the Sabin Center for Climate Change Law at Columbia Law School and editor of a book on the technology and its legal implications, summed up the situation in an interview with The New York Times: “We are facing an existential threat. That is why we need to analyze all the options.”

“Gosto de comparar a geoengenharia a uma quimioterapia para o planeta: se todo o resto estiver falhando, resta apenas tentar”, ele defendeu.

Natural disasters caused by global warming make interventions urgent, researchers argue. Source: Unsplash

Double standards

Among the most prominent initiatives is one undertaken by a non-governmental organization called SilverLining, which has awarded US$ 3 million to several universities and other institutions to pursue answers to practical questions. One example is finding the ideal altitude at which to apply aerosols and how to inject the most suitable amount, while checking the effects on the world’s food production chain.

Chris Sacca, co-founder of Lowercarbon Capital, an investment group that is one of SilverLining’s funders, declared in an alarmed tone: “Decarbonization is necessary, but it will take 20 years or more to happen. If we don’t explore climate interventions such as solar reflection now, we will be condemning countless lives, species and ecosystems to heat.”

Another recipient of substantial sums was the National Oceanic and Atmospheric Administration, which received US$ 4 million from the US Congress precisely to develop technologies of this kind, as well as to monitor the secret use of such solutions by other countries.

Douglas MacMartin, a researcher in mechanical and aerospace engineering at Cornell University, said that “it is certain that humanity has the power to cool things down, but what is not clear is what comes next.”

While the planet can be cooled artificially, no one knows what will come next. Source: Unsplash

Is there a way?

To clarify the possible consequences of interventions of this magnitude, MacMartin will develop models of the specific climate effects of injecting aerosols into the atmosphere above different parts of the globe and at different altitudes. “Depending on where you put [the substance], you will have different effects on the monsoons in Asia and on Arctic sea ice,” he pointed out.

The National Center for Atmospheric Research in Boulder, Colorado, also funded by SilverLining, believes it has the ideal system for the job, considered the most sophisticated in the world. It will be used to run hundreds of simulations so that specialists can search for what they call the sweet spot, at which the amount of artificial cooling that can reduce extreme weather events does not cause broader changes in regional precipitation patterns or similar impacts.

“Is there a way, at least in our model world, to see whether we can achieve one without triggering too much of the other?” asked Jean-François Lamarque, director of the institution’s Climate and Global Dynamics laboratory. There is no answer to that question yet, but sustainable approaches are being examined by Australian researchers, who would spray sea water to make clouds more reflective, and early tests have shown promising results.

If so, perhaps the losses of reef corals we have been witnessing will have an end in sight. As for the rest, well, only time will tell.

Natural disasters must be unusual or deadly to prompt local climate policy change (Science Daily)

Date: August 28, 2020

Source: Oregon State University

Summary: Natural disasters alone are not enough to motivate local communities to engage in climate change mitigation or adaptation, a new study has found. Rather, policy change in response to extreme weather events appears to depend on a combination of factors, including fatalities, sustained media coverage, the unusualness of the event and the political makeup of the community.

Natural disasters alone are not enough to motivate local communities to engage in climate change mitigation or adaptation, a new study from Oregon State University found.

Rather, policy change in response to extreme weather events appears to depend on a combination of factors, including fatalities, sustained media coverage, the unusualness of the event and the political makeup of the community.

Climate scientists predict that the frequency and severity of extreme weather events will only continue to increase in coming decades. OSU researchers wanted to understand how local communities are reacting.

“There’s obviously national and state-level climate change policy, but we’re really interested in what goes on at the local level to adapt to these changes,” said lead author Leanne Giordono, a post-doctoral researcher in OSU’s College of Public Health and Human Sciences. “Local communities are typically the first to respond to extreme events and disasters. How are they making themselves more resilient — for example, how are they adapting to more frequent flooding or intense heat?”

For the study, which was funded by the National Science Foundation, Giordono and co-authors Hilary Boudet of OSU’s College of Liberal Arts and Alexander Gard-Murray at Harvard University examined 15 extreme weather events that occurred around the U.S. between March 2012 and June 2017, and any subsequent local climate policy change.

These events included flooding, winter weather, extreme heat, tornadoes, wildfires and a landslide.

The study, published recently in the journal Policy Sciences, found there were two “recipes” for local policy change after an extreme weather event.

“For both recipes, experiencing a high-impact event — one with many deaths or a presidential disaster declaration — is a necessary condition for future-oriented policy adoption,” Giordono said.

In addition to a high death toll, the first recipe consisted of Democrat-leaning communities where there was focused media coverage of the weather event. These communities moved forward with adopting policies aimed at adapting in response to future climate change, such as building emergency preparedness and risk management capacity.

The second recipe consisted of Republican-leaning communities with past experiences of other uncommon weather events. In these locales, residents often didn’t engage directly in conversation about climate change but still worked on policies meant to prepare their communities for future disasters.

In both recipes, policy changes were fairly modest and reactive, such as building fire breaks, levees or community tornado shelters. Giordono referred to these as “instrumental” policy changes.

“As opposed to being driven by ideology or a shift in thought process, it’s more a means to an end,” she said. “‘We don’t want anyone else to die from tornadoes, so we build a shelter.’ It’s not typically a systemic response to global climate change.”

In their sample, the researchers didn’t find any evidence of mitigation-focused policy response, such as communities passing laws to limit carbon emissions or require a shift to solar power. And some communities did not make any policy changes at all in the wake of extreme weather.

The researchers suggest that in communities that are ideologically resistant to talking about climate change, it may be more effective to frame these policy conversations in other ways, such as people’s commitment to their community or the community’s long-term viability.

Without specifically examining communities that have not experienced extreme weather events, the researchers cannot speak to the status of their policy change, but Giordono said it is a question for future study.

“In some ways, it’s not surprising that you see communities that have these really devastating events responding to them,” Giordono said. “What about the vast majority of communities that don’t experience a high-impact event — is there a way to also spark interest in those communities?”

“We don’t want people to have to experience these types of disasters to make changes.”


Story Source:

Materials provided by Oregon State University. Original written by Molly Rosbach. Note: Content may be edited for style and length.


Journal Reference:

  1. Leanne Giordono, Hilary Boudet, Alexander Gard-Murray. Local adaptation policy responses to extreme weather events. Policy Sciences, 2020; DOI: 10.1007/s11077-020-09401-3

Global warming is changing our plant communities (Science Daily)

Date: August 17, 2020

Source: University of Miami

Summary: In a comprehensive study of nearly 20,000 species, research shows that plant communities are shifting to include more heat-loving species as a result of climate change.

Although Live Oak trees are common in South Florida today, Ken Feeley, a University of Miami biology professor, said their time here may be fleeting. With climate change pushing up temperatures, the oaks, which favor cooler conditions, could soon decline in the region and be replaced with more tropical, heat-loving species such as Gumbo Limbo or Mahogany trees.

“Live Oaks occur throughout the southeast and all the way up to coastal Virginia, so down here we are in one of the very hottest places in its range,” said Feeley, who is also the University’s Smathers Chair of Tropical Tree Biology. “As temperatures increase, it may simply get too hot in Miami for oaks and other temperate species.”

Likewise, in Canada, as temperatures increase, sugar maple trees — which are used to produce maple syrup — are losing their habitats. And in New York City, trees that are more typical of the balmy South, such as Magnolias, are increasing in abundance and blooming earlier each year, news reports indicate.

These are just a few examples of a larger trend happening across the Americas — from Hudson Bay to Tierra del Fuego — as plant communities shift their ranges and respond to changing climates, Feeley pointed out. In his newest study, published in Nature Climate Change, Feeley, along with three of his graduate students and a visiting graduate student from the National University of Colombia, analyzed more than 20 million records of more than 17,000 plant species from throughout the Western Hemisphere. They found that since the 1970s, entire plant ecosystems have changed directionally over time to include more and more of the species that prefer warmer climates. This process is called thermophilization.

“Almost anywhere you go, the types of species that you encounter now are different than what you would have found in that same spot 40 years ago, and we believe that this pattern is the direct result of rising temperatures and climate change,” Feeley said.

The research of Feeley and his students demonstrates that entire ecosystems are consistently losing the plant species that favor cold temperatures, and that those plants are being replaced by more heat-tolerant species that can withstand the warming climate. Plants favoring cool temperatures are either moving to higher elevations and latitudes, or some species may even be going locally extinct. Feeley and his students are now exploring key focal species that may offer more insight into these processes.

“Some of these changes can be so dramatic that we are shifting entire habitat types from forests to grasslands or vice versa — by looking at all types of plants over long periods of time and over huge areas, we were able to observe those changes,” he explained.

In addition to the effects of rising temperatures, the researchers also looked at how plant communities are being affected by changes in rainfall during the past four decades. Feeley and his team observed shifts in the amounts of drought-tolerant versus drought-sensitive plant species. But in many cases, the observed changes were not connected to the changes in rainfall. In fact, in many areas that are getting drier, the drought-sensitive species have become more common during the past decades. According to Feeley, this may be because of a connection between the species’ heat tolerances and their water demands. Heat-tolerant species are typically less drought-tolerant, so as rising temperatures favor the increase of heat-tolerant species, they may also indirectly prompt a rise in water-demanding species. Feeley stressed that this can create dangerous situations in some areas where the plant communities are pushed out of equilibrium with their climate.

“When drought hits, it will be doubly bad for these ecosystems that have lost their tolerance to drought,” he said, adding that “for places where droughts are becoming more severe and frequent — like in California — this could make things a lot worse.”

But the implications of thermophilization go far beyond just the loss of certain plants, according to Feeley. Plants are at the base of the food chain and provide sustenance and habitat for wildlife — so if the plant communities transform, so will the animals that need them.

“All animals — including humans — depend on the plants around them,” Feeley said. “If we lose some plants, we may also lose the insects, birds, and many other forms of wildlife that we are used to seeing in our communities and that are critical to our ways of life. When people think of climate change, they need to realize that it’s not just about losing ice in Antarctica or rising sea levels — climate change affects almost every natural system in every part of the planet.”


Story Source:

Materials provided by University of Miami. Original written by Janette Neuwahl Tannen. Note: Content may be edited for style and length.


Journal Reference:

  1. K. J. Feeley, C. Bravo-Avila, B. Fadrique, T. M. Perez, D. Zuleta. Climate-driven changes in the composition of New World plant communities. Nature Climate Change, 2020; DOI: 10.1038/s41558-020-0873-2

Climate scientists increasingly ignore ecological role of indigenous peoples (EurekAlert!)

News Release 20-Jul-2020

Penn State

UNIVERSITY PARK, Pa. — In their zeal to promote the importance of climate change as an ecological driver, climate scientists increasingly are ignoring the profound role that indigenous peoples played in fire and vegetation dynamics, not only in the eastern United States but worldwide, according to a Penn State researcher.

“In many locations, evidence shows that indigenous peoples actively managed vast areas and were skilled stewards of the land,” said Marc Abrams, professor of forest ecology and physiology. “The historical record is clear, showing that for thousands of years indigenous peoples set frequent fires to manage forests to produce more food for themselves and the wildlife that they hunted, and practiced extensive agriculture.”

Responding to an article published earlier this year in a top scientific journal that claimed fires set by Native Americans were rare in southern New England and Long Island, New York, and played minor ecological roles, Abrams said there is significant evidence to the contrary.

In an article published today (July 20) in Nature Sustainability, Abrams, who has been studying the historical use of fire in eastern U.S. forests for nearly four decades, refutes those contentions.

“The palaeoecological view — based on a science of analyzing pollen and charcoal in lake sediments — that has arisen over the last few decades, contending that anthropogenic fires were rare and mostly climate-driven, contradicts the proud legacy and heritage of land use by indigenous peoples, worldwide,” he said.

In his article, Abrams, the Nancy and John Steimer Professor of Agricultural Sciences in the College of Agricultural Sciences, argues that the authors of the previous paper assumed that the scarcity of charcoal indicated that there had not been burning. But frequent, low-intensity fires do not create the amount of charcoal that intense, crown-level, forest-consuming wildfires do, he pointed out.

“Surface fires set by indigenous people in oak and pine forests, which dominate southern New England, often produced insufficient charcoal to be noticed in the sediment,” said Abrams. “The authors of the earlier article did not consider charcoal types, which distinguish between crown and surface fires, and charcoal size — macro versus micro — to differentiate local versus regional fires.”

Also, lightning in New England could not account for the ignition of so many fires, Abrams argues. In southern New England, lightning-strike density is low and normally is associated with rain events.

“The region lacks dry lightning needed to sustain large fires,” he said. “Moreover, lightning storms largely are restricted to the summer when humidity is high and vegetation flammability is low, making them an unlikely ignition source.”

Early explorers and colonists of southern New England routinely described open, park-like forests and witnessed, firsthand, Native American vegetation management, Abrams writes in his article, adding that oral history and numerous anthropological studies indicate long-term burning and land-use for thousands of years by indigenous people.

Burning near Native American villages and along their extensive trail systems constitutes large land areas, and fires would have kept burning as long as fuel, weather and terrain allowed, he explained. Following European settlement, these open oak and pine woodlands increasingly became closed by trees that previously were held in check by frequent fire.

The authors of the previous paper also argued that fire should not be used as a present-day management tool, a view that Abrams does not support.

The role of anthropogenic fires is front and center in the long-running climate-disturbance debate, according to Abrams, who notes that fires increased with the rise of human populations. The world would be a very different place without those fires, he contends.

“Surprisingly, the importance of indigenous peoples burning in vegetation-fire dynamics is increasingly downplayed among paleoecologists,” he writes. “This applies to locations where lightning-caused fires are rare.”

Abrams points out that he is not denying the importance of climate in vegetation and fire dynamics or its role in enhancing the extent of human fires. “However,” he writes, “in oak-pine forests of southern New England, Native American populations were high enough, lightning-caused fires rare enough, vegetation flammable enough and the benefits of burning and agriculture great enough for us to have confidence in the importance of historic human land management.”


Gregory Nowacki, a scientist in the U.S. Department of Agriculture’s Eastern Regional Forest Service Office in Milwaukee, Wisconsin, contributed to the article.

Related Journal Article

http://dx.doi.org/10.1038/s41893-019-0466-0

The Ugly History of Climate Determinism Is Still Evident Today (Scientific American)

To fix climate injustice, we have to face our implicit biases about people living in different regions of the world

By Simon Donner on June 24, 2020

Boy swims in seawater in the flooded village of Eita in Kiribati. Credit: Jonas Gratzer Getty Images

When you perform a Google Image search for “victims of climate change,” the faces you see are those of Black and brown people in the tropics. The images depict small reef islands in the Pacific Ocean, arid landscapes in East Africa and flooded villages in South Asia.

At some level, this association seems to make sense: Climate change disproportionately affects indigenous people and people of color, both within North America and around the world. The people and the nations least responsible for the problem are, in many cases, the most threatened.

But the search results also reflect the assumption, common among individuals in the global north—notably Europe and North America—that people in tropical climates are helpless victims who lack the capacity to cope with climate change. Beneath that assumption lies a long, ugly history of climate determinism: the racially motivated notion that the climate influences human intelligence and societal development. In the wake of George Floyd’s death, we need to reckon with systemic racism in all forms. Understanding how climate determinism unfolded historically can reveal the racial biases in present-day conversations about climate change. And it can better prepare us to address the very real inequalities of climate injury and adaptation—which I have seen firsthand while working with colleagues in the Pacific island nation of Kiribati over the past 15 years.

The story starts with the term “climate.” It comes from the ancient Greek word klima, which describes latitude. Greek philosophers deduced that the temperature at a given time of year varies roughly with the klima, because latitude determines how much energy a region receives from the sun. Eratosthenes defined bands of klima, later to be called climata, going from frigid belts in the high latitudes, where there was permanent night in winter, to a hot band near the equator.

That is the sanitized origin story in most old textbooks. Here’s the part you’re not told: Using the concept of the “golden mean,” the ideal balance between two extremes, Greek philosophers argued that civilization thrived the most in climates that were neither too hot nor too cold. Greece, not so coincidentally, happened to be in the middle climate. The Greek people used climate to argue that they were healthier and more advanced than their neighbors to the north and south.

Scan the ancient Greek and Roman texts taught in philosophy classes—from Aristotle, Herodotus, Hippocrates, Plato, Cicero, Ptolemy, Pliny the Elder—and you will find passages that use climate differences to make harsh claims about other races. Hippocrates, often called the father of modern medicine, argued that more equatorial civilizations to the south were inferior because the climate was too hot for creativity. In Politics, Aristotle argued that people in cold climates were “full of spirit, but wanting in intelligence and skill,” whereas those in warm climates were “wanting in spirit, and therefore they are always in a state of subjection and slavery.” These ideas were parroted by scholars across Europe and the Islamic world throughout the Middle Ages. Ibn al-Faqih, a Persian geographer in the 10th century, asserted people in equatorial climates were born “either like uncooked pastry or like things so thoroughly cooked as to be burnt.” Another suggested northerners tended toward albinism.

During the Enlightenment scientists used stories of heat and disease from tropical expeditions to build faux empirical evidence for racially loaded climate determinism. Scholars, as well as national leaders, advanced the European colonial view of tropical people and society as a lesser other, a practice referred to today as “tropicality.” Immanuel Kant argued in the late 1700s that a mix of warm and cold weather made Europeans smarter, harder-working, prettier, more civilized and wittier than the “exceptionally lethargic” people in the tropics. David Hume claimed that southern people were more sexual because their blood was heated. These arguments were also adapted to claim superiority over indigenous people of North America: writers such as Montesquieu claimed that the more extreme weather of the American colonies created degeneracy.

Blatant climate determinism persisted in the academic literature well into the last century. Geographer Ellsworth Huntington’s 1915 book Civilization and Climate featured tropicality-infused maps of “climatic energy” and “level of civilization”—which the journal Nature highlighted as showing “remarkably close agreement” when it published his 1947 obituary.

Although such claims no longer appear in textbooks, tropicality lives on in Western popular culture and everyday discourse. Think of how movies, books and vacation ads depict tropical islands as unsophisticated places where you can get your mind off the busy tasking of advanced society. The subtext: it is hot here, so no one works hard. Movies and TV programs also present certain tropical locations as dangerous, dirty and disease-ridden because of the climate. The 2020 Netflix film Extraction literally imposes a yellow filter on images of Bangladesh to disguise the blue skies.

Research and media coverage of climate are not immune from this cultural history. Well-intentioned efforts to document and spread stories about the inequalities of climate change inadvertently call up implicit climatic biases. For example, story after story presents the people of low-lying Pacific island nations such as Kiribati and Tuvalu as potential climate “refugees”—victims in need of hand-holding rather than resilient individuals able to make conscious and informed decisions about their future. Warnings that climate change could help trigger mass migration, violence and political instability focus almost exclusively on the global south—Latin America, Africa, and South and Southeast Asia—and ignore the likelihood of similar unrest in Europe and North America, despite current events. This implicit bias rears its head in coverage of climate change in the U.S. Deep South, too: Why were Black Americans who were moving to Houston after Hurricane Katrina called “refugees,” a word applied to international migrants, when they had not left the country?

The problem persists, in part, because predicting the impacts of climate change inevitably requires assumptions about people and society. Science can project future climate conditions and those conditions’ impacts on the environment at any latitude. But there is no formula for calculating how individuals, communities or society at large will respond. This inexactness opens the door for implicit biases, developed over hundreds of years of deterministic thinking, to influence conclusions about who is, and who is not, capable of adapting to climate change. For example, a 2018 study in Nature Climate Change found that research on climate and conflict suffers from a “streetlight” effect: researchers tend to conclude that climate change will cause violence in Africa and the Middle East because that is where they choose to look.

This legacy of deterministic thinking matters because it can do real-world damage. Rhetoric about climate refugees robs people of agency in their own future. Adaptation is possible, even in low-lying atoll nations. But good luck to them in securing international investment, loans or assistance for adaptation when every story says their community is doomed. People across the tropics have been pushing back. The slogan of the Pacific islands’ youth group the Pacific Climate Warriors—“We are not drowning. We are fighting.”—is an explicit rebuke to reflexive, racially loaded assumptions that the people of the Pacific are not up to the challenge.

Climate change is about legacy. We are seeing the legacy of past greenhouse gas emissions in today’s warming climate. And we are seeing the legacy of systemic racism and inequality in the impacts of that warming: indigenous peoples in the Arctic losing a way of life; people of color in the U.S. being disproportionately affected by extreme heat; slum dwellers in Chennai, India, suffering from floods. We need to spread the word about the inequalities of climate change. To do so responsibly, we must be willing to check our implicit biases before drawing conclusions about people living in different places.

Praising Emissions Reductions Due to Coronavirus Plays Into Right-Wing Strategy (Truthout)

Construction workers wear protective face masks in the streets of Bushwick, New York, on May 19, 2020.

By Sharon Zhang, Truthout – Published May 25, 2020

Part of the Series Despair and Disparity: The Uneven Burdens of COVID-19

A study published in Nature Climate Change recently found that, in early April, daily global carbon dioxide emissions decreased by 17 percent compared to the 2019 mean levels. Because of shelter-in-place rules and businesses being closed, people have been driving and flying less, leading to lower emissions.

Shortly after emissions started dropping in March, the climate community was careful to apply nuance to the emissions reduction discussion: Lower carbon dioxide emissions, while good, should not be celebrated when they are caused by a global pandemic. In other words, while this time may show us the extent to which people can come together in action, the ends don’t justify the means — the means here being a global financial crisis and hundreds of thousands of people dead. As climate scientist Carl-Friedrich Schleussner said in Carbon Brief, “The narrative that the economic catastrophe caused by the coronavirus is ‘good’ for the climate is dangerously misleading and could undermine support for climate action.”

Though the climate community quickly dismissed this narrative, the right wing latched onto the idea that progressives were celebrating COVID-19 for its environmental benefits. Quickly, commentators on the right asserted that the world as it is under the pandemic is the world that climate advocates want under policies like the Green New Deal. The British libertarian web magazine Spiked wrote that “Covid-19 is a frightening dress rehearsal of the climate agenda.” Spiked is, incidentally, funded by the Koch foundation. Meanwhile, figures like Alex Epstein, who wrote a book entitled The Moral Case for Fossil Fuels and whose organization Center for Industrial Progress has ties to the coal industry, have said that the recession caused by the pandemic is a preview of the Green New Deal.

This argument is incorrect in many ways, not least because the temporary emissions reduction isn’t nearly enough: The UN has said that emissions need to drop by 7.5 percent each year. That drop needs to be permanent.

“[Right-wingers] are grasping at straws. And they’re actually trying to spin a couple pieces of straw into silk,” says Anthony Leiserowitz, director of the Yale Program on Climate Change Communication. “I don’t see anybody in the climate community actually making that argument” that coronavirus is a good thing.

Yet for climate deniers and delayers, a straw man argument is often enough. “It’s totally in character with the entire [denier] community to make shit up and try to pin it on their opponents,” says Leiserowitz. They just need to sow enough confusion that their benefactors — usually the fossil fuel industry — can thrive under deregulation and the status quo.


So, while these are unprecedented times, this line of attack on the climate community already has a long history. Linking the global pandemic with an imaginary environmental agenda is just part of a quiet but consistent decades-long strategy to attack climate policy. This particular argument strives to equate any climate action with suffering — or, as the reactionary right might put it, a loss of “freedom” or “liberty.”

This narrative has taken many forms over the years; the idea that climate regulation kills jobs was a prominent one for several decades. Now, as the progressive left and climate action have gained ideological ground, the right has had to adapt and warp its arguments accordingly. The new argument, it seems, is that climate regulation kills not just jobs but the entire economy, as conservative pundits and politicians argued in early 2019 when the Green New Deal became popularized.

A particular fear that deniers like to stoke — one that they also play upon in their pandemic and climate regulation fear mongering — is that people’s everyday lives will change drastically when we actually start to address the climate crisis. Consider the rhetoric of extremist “reopen” protesters and the bizarre conservative claim that Rep. Alexandria Ocasio-Cortez (D-New York) wanted to take away people’s burgers.

The irony is that many climate policies are built to be as non-disruptive as possible. “The kinds of changes that we’re going to make in our lives — some of them, we’re not even going to notice,” says Leiserowitz. Light switches will still work, people will still be free to roam outside, meat will still be available to eat; in many ways, without the most oppressive effects of climate change, life will be better.

But deniers take advantage of the fact that climate communicators haven’t quite articulated the vast amounts of life-improving changes that climate action will bring and fill in the gaps with conspiratorial scare tactics.

“The thing that intrigued us [at DeSmogBlog] about the overlap between COVID misinformation and climate denial is that we couldn’t have one without the other,” says Brendan DeMelle, executive director of DeSmogBlog. DeSmogBlog has been documenting the overlap between those who deny or downplay the effects of the coronavirus and known climate deniers. The overlap is vast, with climate denialist figures such as Alex Jones and Charlie Kirk and organizations such as the Heartland Institute, The Daily Caller, The Federalist and PragerU participating in various COVID-19 denial tactics.

“This echo chamber [on the right] is rapidly spreading misinformation through The Daily Caller and through all kinds of outlets that wouldn’t exist if it weren’t for this strategy of undermining public trust in science and government leadership,” says DeMelle. Part of the efficacy of the radical right’s propaganda is that it politicizes and smears institutional authorities like scientists and journalists in order to push its counterfactual agenda.

One effective way to combat the narrative that environmentalists want to destroy freedom and liberty is “to paint the positive and inspiring picture of transitioning from polluting energy sources to clean energy,” says John Cook, a climate and cognitive science researcher at George Mason University, over email to Truthout.

After all, mitigating the climate crisis, living free of harmful air that chokes out entire communities, leaving behind the fear that rising sea levels will displace entire countries and casting off our dread of what an entirely new category of hurricane will bring — that would be true freedom.

Sharon Zhang is a development fellow at Truthout.

COVID-19 crisis causes 17 percent drop in global carbon emissions (Science Daily)

Date: May 19, 2020

Source: University of East Anglia

Summary: The COVID-19 global lockdown has had an ‘extreme’ effect on daily carbon emissions, but it is unlikely to last, according to a new analysis.

Coronavirus and world concept illustration (stock image). Credit: © Photocreo Bednarek / stock.adobe.com

The COVID-19 global lockdown has had an “extreme” effect on daily carbon emissions, but it is unlikely to last — according to a new analysis by an international team of scientists.

The study published in the journal Nature Climate Change shows that daily emissions decreased by 17% — or 17 million tonnes of carbon dioxide — globally during the peak of the confinement measures in early April compared to mean daily levels in 2019, dropping to levels last observed in 2006.

Emissions from surface transport, such as car journeys, account for almost half (43%) of the decrease in global emissions during peak confinement on April 7. Emissions from industry and from power together account for a further 43% of the decrease in daily global emissions.

Aviation is the economic sector most impacted by the lockdown, but it only accounts for 3% of global emissions, or 10% of the decrease in emissions during the pandemic.

The increase in the use of residential buildings from people working at home only marginally offset the drop in emissions from other sectors.
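
To make those shares concrete, here is a minimal back-of-envelope sketch in Python. The 17 MtCO2-per-day total and the sector percentages come from the figures quoted above; the calculation itself is only an illustration, not the study’s own method.

# Back-of-envelope check on the sector figures quoted above (illustrative only).
total_drop = 17.0  # MtCO2 per day below the 2019 mean at the early-April peak

share_of_drop = {
    "surface transport": 0.43,
    "industry and power combined": 0.43,
    "aviation": 0.10,
}

for sector, share in share_of_drop.items():
    print(f"{sector}: about {share * total_drop:.1f} MtCO2/day")
# The remaining few percent are spread across other sectors and are partly
# offset by the small rise in residential energy use mentioned above.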

In individual countries, emissions decreased by 26% on average at the peak of their confinement.

The analysis also shows that social responses alone, without increases in wellbeing and/or supporting infrastructure, will not drive the deep and sustained reductions needed to reach net zero emissions.

Prof Corinne Le Quéré of the University of East Anglia, in the UK, led the analysis. She said: “Population confinement has led to drastic changes in energy use and CO2 emissions. These extreme decreases are likely to be temporary though, as they do not reflect structural changes in the economic, transport, or energy systems.

“The extent to which world leaders consider climate change when planning their economic responses post COVID-19 will influence the global CO2 emissions paths for decades to come.

“Opportunities exist to make real, durable, changes and be more resilient to future crises, by implementing economic stimulus packages that also help meet climate targets, especially for mobility, which accounts for half the decrease in emissions during confinement.

“For example in cities and suburbs, supporting walking and cycling, and the uptake of electric bikes, is far cheaper and better for wellbeing and air quality than building roads, and it preserves social distancing.”

The team analysed government policies on confinement for 69 countries responsible for 97% of global CO2 emissions. At the peak of the confinement, regions responsible for 89% of global CO2 emissions were under some level of restriction. Data on activities indicative of how much each economic sector was affected by the pandemic was then used to estimate the change in fossil CO2 emissions for each day and country from January to April 2020.

The estimated total change in emissions from the pandemic amounts to 1048 million tonnes of carbon dioxide (MtCO2) until the end of April. Of this, the changes are largest in China where the confinement started, with a decrease of 242 MtCO2, then in the US (207 MtCO2), Europe (123 MtCO2), and India (98 MtCO2). The total change in the UK for January-April 2020 is an estimated 18 MtCO2.

The impact of confinement on 2020 annual emissions is projected to be around 4% to 7% compared to 2019, depending on the duration of the lockdown and the extent of the recovery. If pre-pandemic conditions of mobility and economic activity return by mid-June, the decline would be around 4%. If some restrictions remain worldwide until the end of the year, it would be around 7%.
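
The 4% and 7% figures follow from averaging the reduction over the whole year. The sketch below shows that arithmetic under assumed inputs: the 17% peak cut matches the headline number above, but the durations and the residual cut are invented solely to reproduce the reported range, not taken from the paper’s scenario definitions.

def annual_drop(peak_cut, peak_months, residual_cut, residual_months):
    # Average emissions reduction over 12 months, assuming a flat cut during
    # the peak of confinement and a smaller flat cut while some restrictions remain.
    return (peak_cut * peak_months + residual_cut * residual_months) / 12

# Restrictions largely gone by mid-June: roughly a 17% cut for ~3 months, then none.
print(f"{annual_drop(0.17, 3, 0.00, 0):.1%}")   # ~4%

# Some restrictions worldwide until year-end: the same peak plus a ~4% residual cut.
print(f"{annual_drop(0.17, 3, 0.04, 9):.1%}")   # ~7%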

This annual drop is comparable to the annual emission reductions needed year-on-year, sustained across decades, to achieve the climate objectives of the UN Paris Agreement.

Prof Rob Jackson of Stanford University and Chair of the Global Carbon Project who co-authored the analysis, added: “The drop in emissions is substantial but illustrates the challenge of reaching our Paris climate commitments. We need systemic change through green energy and electric cars, not temporary reductions from enforced behavior.”

The authors warn that the rush for economic stimulus packages must not make future emissions higher by delaying Green New Deals or weakening emissions standards.

‘Temporary reduction in daily global CO2 emissions during the COVID-19 forced confinement’, Corinne Le Quéré, Robert B. Jackson, Matthew W. Jones, Adam J. P. Smith, Sam Abernethy, Robbie M. Andrew, Anthony J. De-Gol, David R. Willis, Yuli Shan, Josep G. Canadell, Pierre Friedlingstein, Felix Creutzig, Glen P. Peters, is published in Nature Climate Change on May 19.

The research received support from the Royal Society, the European Commission projects 4C, VERIFY and CHE, the Gordon and Betty Moore Foundation, and the Australian National Environmental Science Program.


Story Source:

Materials provided by University of East Anglia. Note: Content may be edited for style and length.


Journal Reference:

  1. Corinne Le Quéré, Robert B. Jackson, Matthew W. Jones, Adam J. P. Smith, Sam Abernethy, Robbie M. Andrew, Anthony J. De-Gol, David R. Willis, Yuli Shan, Josep G. Canadell, Pierre Friedlingstein, Felix Creutzig, Glen P. Peters. Temporary reduction in daily global CO2 emissions during the COVID-19 forced confinement. Nature Climate Change, 2020; DOI: 10.1038/s41558-020-0797-x

The End of Meat Is Here (New York Times)

If you care about the working poor, about racial justice, and about climate change, you have to stop eating animals.

By Jonathan Safran Foer – May 21, 2020

Credit: Jun Cen

Is any panic more primitive than the one prompted by the thought of empty grocery store shelves? Is any relief more primitive than the one provided by comfort food?

Most everyone has been doing more cooking these days, more documenting of the cooking, and more thinking about food in general. The combination of meat shortages and President Trump’s decision to order slaughterhouses open despite the protestations of endangered workers has inspired many Americans to consider just how essential meat is.

Is it more essential than the lives of the working poor who labor to produce it? It seems so. An astonishing six out of 10 counties that the White House itself identified as coronavirus hot spots are home to the very slaughterhouses the president ordered open.

In Sioux Falls, S.D., the Smithfield pork plant, which produces some 5 percent of the country’s pork, is one of the largest hot spots in the nation. A Tyson plant in Perry, Iowa, had 730 cases of the coronavirus — nearly 60 percent of its employees. At another Tyson plant, in Waterloo, Iowa, there were 1,031 reported cases among about 2,800 workers.

Sick workers mean plant shutdowns, which has led to a backlog of animals. Some farmers are injecting pregnant sows to cause abortions. Others are forced to euthanize their animals, often by gassing or shooting them. It’s gotten bad enough that Senator Chuck Grassley, an Iowa Republican, has asked the Trump administration to provide mental health resources to hog farmers.

Despite this grisly reality — and the widely reported effects of the factory-farm industry on America’s lands, communities, animals and human health long before this pandemic hit — only around half of Americans say they are trying to reduce their meat consumption. Meat is embedded in our culture and personal histories in ways that matter too much, from the Thanksgiving turkey to the ballpark hot dog. Meat comes with uniquely wonderful smells and tastes, with satisfactions that can almost feel like home itself. And what, if not the feeling of home, is essential?

And yet, an increasing number of people sense the inevitability of impending change.

Animal agriculture is now recognized as a leading cause of global warming. According to The Economist, a quarter of Americans between the ages of 25 and 34 say they are vegetarians or vegans, which is perhaps one reason sales of plant-based “meats” have skyrocketed, with Impossible and Beyond Burgers available everywhere from Whole Foods to White Castle.

Our hand has been reaching for the doorknob for the last few years. Covid-19 has kicked open the door.

At the very least it has forced us to look. When it comes to a subject as inconvenient as meat, it is tempting to pretend unambiguous science is advocacy, to find solace in exceptions that could never be scaled and to speak about our world as if it were theoretical.

Some of the most thoughtful people I know find ways not to give the problems of animal agriculture any thought, just as I find ways to avoid thinking about climate change and income inequality, not to mention the paradoxes in my own eating life. One of the unexpected side effects of these months of sheltering in place is that it’s hard not to think about the things that are essential to who we are.

Credit: Jun Cen

We cannot protect our environment while continuing to eat meat regularly. This is not a refutable perspective, but a banal truism. Whether they become Whoppers or boutique grass-fed steaks, cows produce an enormous amount of greenhouse gas. If cows were a country, they would be the third-largest greenhouse gas emitter in the world.

According to the research director of Project Drawdown — a nonprofit organization dedicated to modeling solutions to address climate change — eating a plant-based diet is “the most important contribution every individual can make to reversing global warming.”

Americans overwhelmingly accept the science of climate change. A majority of both Republicans and Democrats say that the United States should have remained in the Paris climate accord. We don’t need new information, and we don’t need new values. We only need to walk through the open door.

We cannot claim to care about the humane treatment of animals while continuing to eat meat regularly. The farming system we rely on is woven through with misery. Modern chickens have been so genetically modified that their very bodies have become prisons of pain even if we open their cages. Turkeys are bred to be so obese that they are incapable of reproducing without artificial insemination. Mother cows have their calves ripped from them before weaning, resulting in acute distress we can hear in their wails and empirically measure through the cortisol in their bodies.

No label or certification can avoid these kinds of cruelty. We don’t need any animal rights activist waving a finger at us. We don’t need to be convinced of anything we don’t already know. We need to listen to ourselves.

We cannot protect against pandemics while continuing to eat meat regularly. Much attention has been paid to wet markets, but factory farms, specifically poultry farms, are a more important breeding ground for pandemics. Further, the C.D.C. reports that three out of four new or emerging infectious diseases are zoonotic — the result of our broken relationship with animals.

It goes without saying that we want to be safe. We know how to make ourselves safer. But wanting and knowing are not enough.

These are not my or anyone’s opinions, despite a tendency to publish this information in opinion sections. And the answers to the most common responses raised by any serious questioning of animal agriculture aren’t opinions.

Don’t we need animal protein? No.

We can live longer, healthier lives without it. Most American adults eat roughly twice the recommended intake of protein — including vegetarians, who consume 70 percent more than they need. People who eat diets high in animal protein are more likely to die of heart disease, diabetes and kidney failure. Of course, meat, like cake, can be part of a healthy diet. But no sound nutritionist would recommend eating cake too often.

If we let the factory-farm system collapse, won’t farmers suffer? No.

The corporations that speak in their name while exploiting them will. There are fewer American farmers today than there were during the Civil War, despite America’s population being nearly 11 times greater. This is not an accident, but a business model. The ultimate dream of the animal-agriculture industrial complex is for “farms” to be fully automated. Transitioning toward plant-based foods and sustainable farming practices would create many more jobs than it would end.

Don’t take my word for it. Ask a farmer if he or she would be happy to see the end of factory farming.

Isn’t a movement away from meat elitist? No.

A 2015 study found that a vegetarian diet is $750 a year cheaper than a meat-based diet. People of color disproportionately self-identify as vegetarian and disproportionately are victims of factory farming’s brutality. The slaughterhouse employees currently being put at risk to satisfy our taste for meat are overwhelmingly brown and black. Suggesting that a cheaper, healthier, less exploitative way of farming is elitist is in fact a piece of industry propaganda.

Can’t we work with factory-farming corporations to improve the food system? No.

Well, unless you believe that those made powerful through exploitation will voluntarily destroy the vehicles that have granted them spectacular wealth. Factory farming is to actual farming what criminal monopolies are to entrepreneurship. If for a single year the government removed its $38-billion-plus in props and bailouts, and required meat and dairy corporations to play by normal capitalist rules, it would destroy them forever. The industry could not survive in the free market.

Perhaps more than any other food, meat inspires both comfort and discomfort. That can make it difficult to act on what we know and want. Can we really displace meat from the center of our plates? This is the question that brings us to the threshold of the impossible. On the other side is the inevitable.

With the horror of pandemic pressing from behind, and the new questioning of what is essential, we can now see the door that was always there. As in a dream where our homes have rooms unknown to our waking selves, we can sense there is a better way of eating, a life closer to our values. On the other side is not something new, but something that calls from the past — a world in which farmers were not myths, tortured bodies were not food and the planet was not the bill at the end of the meal.

One meal in front of the other, it’s time to cross the threshold. On the other side is home.

Jonathan Safran Foer is the author of “Eating Animals” and “We Are the Weather.”

Not-so-slow burn. The world’s energy system must be transformed completely (The Economist)

The Economist

It has been changed before, but never as fast or fully as must happen now

May 23rd 2020 edition

FOR MORE than 100,000 years humans derived all their energy from what they hunted, gathered and grazed on or grew for themselves. Their own energy for moving things came from what they ate. Energy for light and heat came from burning the rest. In recent millennia they added energy from the flow of water and, later, air to the repertoire. But, important as water- and windmills were, they did little to change the overall energy picture. Global energy use broadly tracked the size of a population fed by farms and warmed by wood.

The combination of fossil fuels and machinery changed everything. According to calculations by Vaclav Smil, a scholar of energy systems at the University of Manitoba, between 1850 and 2000 the human world’s energy use increased by a factor of 15 or so.

The expansion was not homogeneous; over its course the mixture of fossil fuels used changed quite dramatically. These are the monumental shifts historians call “energy transitions”. They require huge amounts of infrastructure; they change the way the economy works; and they take place quite slowly.

James Watt patented his steam engine in 1769; coal did not exceed the share of total energy provided by “traditional biomass”—wood, peat, dung and the like—until the 1900s (see chart overleaf). It was not until the 1950s, a century after the first commercial oil well was drilled in Titusville, Pennsylvania, that crude oil came to represent 25% of humankind’s total primary energy. Energy transitions were slow largely because the growth in total energy use was fast. In the century it took oil to capture a quarter of the total, that total itself kept growing rapidly. Energy transitions are also always incomplete. New fuels may reduce the share of the pie that old fuels control, but they rarely reduce the total energy those fuels supply. Much more “traditional biomass” is burned by the world’s poor today than was burned by the whole world in 1900.

Giving the world a good chance of keeping global warming, measured against pre-coal temperatures, well below 2°C (3.6°F) will require an energy transition far larger and quicker than any before it. In the next 30-50 years, 90% or more of the share of the world’s energy now produced from fossil fuels will need to be provided by renewable-energy sources, nuclear power or fossil-fuel plants that bury their waste rather than exhaling it.

During this time, the pie will keep growing—but not necessarily as fast as it used to. The direct relationship between GDP and energy use, which held tight until the 1970s, has weakened over the past half century. It is possible for growth per person to continue without energy use per person increasing. Though the population is no longer growing as fast as it did at the 20th-century peak of its increase, it will still be the best part of 2bn higher by mid-century. And all those people should be able to aspire to modern energy services. Today more than 800m people still lack electricity—hence all that burning of traditional biomass.

The good news, however, is that governments say they are willing to push through the change. Previous transitions, though shaped by government policy at national levels, were mostly caused by the demand for new services that only a specific fuel could provide, such as petrol for engines.

The growth in renewable-generation capacity is the exception. It has not been driven by the fact that renewable electrons allow you to do things of which those from coal are not capable. It has largely been driven by government policy. This has not always had the near-term effects for which such policy should aim. Germany’s roll-out of renewables has been offset by its retreat from nuclear, and its emissions have risen. But subsidies there and elsewhere have boosted supply chains and lowered the cost of renewable technologies.

During the 2010s the levelised cost (that is the average lifetime cost of equipment, per megawatt hour of electricity generated) of solar, offshore wind and onshore wind fell by 87%, 62% and 56%, respectively, according to BloombergNEF, an energy-data outfit (see chart overleaf). This has allowed deployments that were unthinkable in the 2000s. Britain now has more than 2,000 offshore wind turbines. They are built by developers chosen based on how low a price they are willing to take for their electricity (the government pledges to make the cost up if the market price falls below it).

In 2015 winning bids were well over £100 ($123) per MWh, far higher than the cost of fossil-fuel electricity. Thanks to predictable policy, fierce competition and technical progress, a recent auction brought a bid as low as £39.65 per MWh, roughly the level of average wholesale power prices. Solar and onshore wind are even less expensive. About two-thirds of the world’s population live in countries where renewables represent the cheapest source of new power generation, says BloombergNEF.
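
For reference, the levelised cost defined above boils down to discounted lifetime costs divided by discounted lifetime generation. The sketch below shows that calculation with invented inputs for a hypothetical onshore wind farm; none of the numbers come from the article.

def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    # Levelised cost of electricity: discounted lifetime costs divided by
    # discounted lifetime generation. Real auctions also account for fuel,
    # degradation, taxes and financing structure.
    discounted_costs = capex + sum(
        annual_opex / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
    discounted_energy = sum(
        annual_mwh / (1 + discount_rate) ** t for t in range(1, lifetime_years + 1))
    return discounted_costs / discounted_energy

# Invented example: a 100 MW onshore wind farm, £1.2m/MW to build,
# £40,000/MW a year to run, 35% capacity factor, 25-year life, 6% discount rate.
mw = 100
print(lcoe(capex=1_200_000 * mw,
           annual_opex=40_000 * mw,
           annual_mwh=mw * 8760 * 0.35,
           lifetime_years=25,
           discount_rate=0.06))  # roughly £44 per MWh with these made-up inputs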

Solar power is the really spectacular achiever, outstripping the expectations of its most fervent boosters. Ramez Naam, a bullish solar investor, recently recalibrated his expectations to foresee a future of “insanely cheap” solar power. By 2030, he reckons, in sunny parts of the world, building large new solar installations from scratch will be a cheaper way of getting electricity than operating fully depreciated fossil-fuel plants, let alone building new ones. Michael Liebreich, a consultant on renewable energies, speculates about a “renewable singularity” in which cheap renewable electricity opens up new markets that demand new capacity which makes electricity cheaper still.

Even without such speculative wonders, the effect of renewables is appreciable. Together with natural gas, which America’s fracking revolution has made cheaper, solar and wind are already squeezing coal, the energy sector’s biggest emitter (a megawatt of coal produces a stream of emissions twice the size of that given off by a megawatt of gas). In 2018 coal’s share of global energy supply fell to 27%, the lowest in 15 years. The pressure that they can apply to oil is not yet as great, because oil mostly drives cars, and electric cars are still rare. But as that changes, renewables will come for oil, as they are already coming for gas.

There are stumbling blocks. Neither the sun nor the wind produces energy consistently. Germany’s solar-power installations produce five times more electricity in the summer than they do in the winter, when demand for electricity is at its peak. Wind strengths vary not just from day to day but from season to season and, to some extent, year to year. This amounts to a speed bump for renewables, not a blockade. Long transmission lines that keep losses low by working at very high voltages can move electricity from oversupplied areas to those where demand is surging. Lithium-ion batteries can store extra energy and release it as needed. The economic stimulus China announced in March includes both ultra-high-voltage grids and electric-vehicle-charging infrastructure.

Thou orb aloft, somewhat dazzling

As the sun and wind account for a larger share of power, renewables might store power by splitting water to create hydrogen to be burned later. More ambitiously, if technologies for pulling carbon dioxide back out of the air improve, such hydrogen could be combined with that scavenged carbon to make fossil-free fuels.

Such fuels might also help remedy the other problem with renewables: some sources of emissions cannot be displaced even by very cheap electricity. Lithium-ion batteries are too bulky to power big planes on long flights, which is where artificial fuels might come in. Some industrial processes, such as cement-making, give out carbon dioxide by their very nature. They may require technology that intercepts the carbon dioxide before it gets into the atmosphere and squirrels it away underground. When emissions cannot be avoided, as may be the case with some of those from farmland, they will need to be offset by removing carbon dioxide from the atmosphere, either with trees or with technology.

None of this happens, though, without investment. The International Renewable Energy Agency, an advisory group, estimates that $800bn of investment in renewables is needed each year until 2050 for the world to be on course for less than 2°C of warming, with more than twice that needed for electric infrastructure and efficiency. In 2019 investment in renewables was $250bn. The big oil and gas firms invested twice as much in fossil-fuel extraction.

If governments want to limit climate change, therefore, they must do more. They do not have to do everything. If policy makes clear that the road leads away from fossil fuels, private capital will follow. Investors are already wary of fossil-fuel companies, eyeing meagre returns and the possibility that action on climate change will leave firms with depreciating assets.

But governments need to make the signals clear. Around the world, they currently provide more than $400bn a year in direct support for fossil-fuel consumption, more than twice what they spend subsidising renewable production. A price on carbon, which hastens the day when new renewables are cheaper than old fossil-fuel plants, is another crucial step. So is research spending aimed at those emissions which are hard to electrify away. Governments have played a large role in the development of solar panels, wind turbines and fracking. There is a lot more to do.

However much they do, though, and however well they do it, they will not stop climate change at today’s level of 1°C above pre-industrial temperatures. Indeed, they will need to expand their efforts greatly to meet the 2°C target; on today’s policies, the rise by the end of the century looks closer to 3°C. This means that, as well as trying to limit climate change, the world also needs to learn how to adapt to it. ■

Correction (May 22nd 2020): This article previously stated that Britain had 10,000 offshore wind turbines. In fact that is the total number of turbines; only 2,016 are offshore. We’re sorry for the error.

This article appeared in the Schools brief section of the print edition under the headline “Not-so-slow burn”

Countries should seize the moment to flatten the climate curve (The Economist)


May 21st 2020


The pandemic shows how hard it will be to decarbonise—and creates an opportunity



FOLLOWING THE pandemic is like watching the climate crisis with your finger jammed on the fast-forward button. Neither the virus nor greenhouse gases care much for borders, making both scourges global. Both put the poor and vulnerable at greater risk than wealthy elites and demand government action on a scale hardly ever seen in peacetime. And with China’s leadership focused only on its own advantage and America’s as scornful of the World Health Organisation as it is of the Paris climate agreement, neither calamity is getting the co-ordinated international response it deserves.

The two crises do not just resemble each other. They interact. Shutting down swathes of the economy has led to huge cuts in greenhouse-gas emissions. In the first week of April, daily emissions worldwide were 17% below what they were last year. The International Energy Agency expects global industrial greenhouse-gas emissions to be about 8% lower in 2020 than they were in 2019, the largest annual drop since the second world war.

That drop reveals a crucial truth about the climate crisis. It is much too large to be solved by the abandonment of planes, trains and automobiles. Even if people endure huge changes in how they lead their lives, this sad experiment has shown, the world would still have more than 90% of the necessary decarbonisation left to do to get on track for the Paris agreement’s most ambitious goal, of a climate only 1.5°C warmer than it was before the Industrial Revolution.
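The "more than 90% left to do" point can be illustrated with a crude calculation. The simplifying assumption here, not stated in the article, is that the 1.5°C pathway ultimately requires cutting fossil-fuel emissions essentially to zero by mid-century:

```python
# Crude illustration of the ">90% left to do" point.
# The 1.5°C pathway is approximated as requiring net-zero CO2 by mid-century,
# i.e. roughly a 100% cut from today's fossil-fuel emissions (a simplification).
lockdown_drop = 0.08        # the IEA's estimated fall in 2020 emissions
required_cut = 1.00         # full decarbonisation, simplified

remaining = 1 - lockdown_drop / required_cut
print(f"Share of the decarbonisation job still to do: {remaining:.0%}")  # ~92%
```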

But as we explain this week (see article), the pandemic both reveals the size of the challenge ahead and creates a unique chance to enact government policies that steer the economy away from carbon at a lower financial, social and political cost than might otherwise have been the case. Rock-bottom energy prices make it easier to cut subsidies for fossil fuels and to introduce a tax on carbon. The revenues from that tax over the next decade can help repair battered government finances. The businesses at the heart of the fossil-fuel economy (oil and gas firms, steel producers, carmakers) are already going through the agony of shrinking their long-term capacity and employment. Getting economies in medically induced comas back on their feet is a circumstance tailor-made for investment in climate-friendly infrastructure that boosts growth and creates new jobs. Low interest rates make the bill smaller than ever.

Take carbon-pricing first. Long cherished by economists (and The Economist), such schemes use the power of the market to incentivise consumers and firms to cut their emissions, thus ensuring that the shift from carbon happens in the most efficient way possible. The timing is particularly propitious because such prices have the most immediate effects when they tip the balance between two already available technologies. In the past it was possible to argue that, although prices might entrench an advantage for cleaner gas over dirtier coal, renewable technologies were too immature to benefit. But over the past decade the costs of wind and solar power have tumbled. A relatively small push from a carbon price could give renewables a decisive advantage—one which would become permanent as wider deployment made them cheaper still. There may never have been a time when carbon prices could achieve so much so quickly.
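To see how even a modest carbon price can tip the balance between technologies that already exist, here is a sketch comparing effective generation costs with and without a price on CO2. All costs and emission intensities are assumed round numbers, not market data:

```python
# How a carbon price reorders the ranking of existing generation technologies.
# Generation costs and emission intensities are illustrative assumptions.
techs = {
    #            (cost £/MWh, tonnes CO2 per MWh)
    "coal":      (45.0, 0.95),
    "gas":       (50.0, 0.40),
    "new solar": (55.0, 0.00),
}

def effective_cost(cost, intensity, carbon_price):
    # Cost of generation once emissions are priced.
    return cost + intensity * carbon_price

for carbon_price in (0, 25, 50):   # £ per tonne of CO2
    ranked = sorted(techs, key=lambda t: effective_cost(*techs[t], carbon_price))
    print(f"£{carbon_price}/tCO2 -> cheapest first: {ranked}")
```

With these assumed numbers, coal is cheapest at a zero carbon price, but a price of £25 per tonne is already enough to push solar to the front of the queue.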

Carbon prices are not as popular with politicians as they are with economists, which is why too few of them exist. But even before covid-19 there were hints their time was coming. Europe is planning an expansion of its carbon-pricing scheme, the largest in the world; China is instituting a brand new one. Joe Biden, who backed carbon prices when he was vice-president, will do so again in the coming election campaign—and at least some on the right will agree with that. The proceeds from a carbon tax could raise over 1% of GDP early on and would then taper away over several decades. This money could either be paid as a dividend to the public or, as is more likely now, help lower government debts, which are already forecast to reach an average of 122% of GDP in the rich world this year, and will rise further if green investments are debt-financed.

Carbon pricing is only part of the big-bang response now possible. By itself, it is unlikely to create a network of electric-vehicle charging-points, more nuclear power plants to underpin the cheap but intermittent electricity supplied by renewables, programmes to retrofit inefficient buildings, or technologies aimed at reducing emissions that cannot simply be electrified away, such as those from large aircraft and some farms. In these areas subsidies and direct government investment are needed to ensure that tomorrow’s consumers and firms have the technologies which carbon prices will encourage.

Some governments have put their efforts into greening their covid-19 bail-outs. Air France has been told either to scrap domestic routes that compete with high-speed trains, powered by nuclear electricity, or to forfeit taxpayer assistance. But dirigisme disguised as a helping hand could have dangerous consequences: better to focus on insisting that governments must not skew their bail-outs towards fossil fuels. In other countries the risk is of climate-damaging policies. America has been relaxing its environmental rules further during the pandemic. China—whose stimulus for heavy industry sent global emissions soaring after the global financial crisis—continues to build new coal plants (see article).

Carpe covid

The covid-19 pause is not inherently climate-friendly. Countries must make it so. Their aim should be to show by 2021, when they gather to take stock of progress made since the Paris agreement and commit themselves to raising their game, that the pandemic has been a catalyst for a breakthrough on the environment.

Covid-19 has demonstrated that the foundations of prosperity are precarious. Disasters long talked about, and long ignored, can come upon you with no warning, turning life inside out and shaking all that seemed stable. The harm from climate change will be slower than the pandemic but more massive and longer-lasting. If there is a moment for leaders to show bravery in heading off that disaster, this is it. They will never have a more attentive audience. ■

This article appeared in the Leaders section of the print edition under the headline “Seize the moment”


Can covid help flatten the climate curve? (The Economist)

May 21st 2020



AMID COVID-19’s sweeping devastation, its effect on greenhouse gases has emerged as something of a bright spot. Between January and March demand for coal dropped by 8% and oil by 5%, compared with the same period in 2019. By the end of the year energy demand may be 6% down overall, according to the International Energy Agency (IEA), an intergovernmental forecaster, amounting to the largest drop it has ever seen.

Because less energy use means less burning of fossil fuels, greenhouse-gas emissions are tumbling, too. According to an analysis by the Global Carbon Project, a consortium of scientists, 2020’s emissions will be 2-7% lower than 2019’s if the world gets back to pre-pandemic conditions by mid-June; if restrictions stay in place all year, the estimated drop is 3-13% depending on how strict they are. The IEA’s best guess for the drop is 8%.

That is not enough to make any difference to the total warming the world can expect. Warming depends on the cumulative emissions to date; a fraction of one year’s toll makes no appreciable difference. But returning the world to the emission levels of 2010—for a 7% drop—raises the tantalising prospect of crossing a psychologically significant boundary. The peak in carbon-dioxide emissions from fossil fuels may be a lot closer than many assume. It might, just possibly, turn out to lie in the past.
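The point that warming depends on cumulative emissions, not on any single year's, is often expressed through the "transient climate response to cumulative emissions" (TCRE): roughly a fixed amount of warming per tonne of CO2 ever emitted. A sketch with an assumed mid-range TCRE value shows why one year's dip barely registers:

```python
# Warming scales approximately linearly with cumulative CO2 emissions.
# The TCRE value and emission figures are assumed round numbers for illustration.
tcre_per_1000_gt = 0.45      # °C of warming per 1,000 Gt of CO2, mid-range estimate
annual_emissions_gt = 35.0   # rough fossil-fuel CO2 emissions per year, in Gt

# Avoiding ~8% of one year's emissions:
avoided_gt = 0.08 * annual_emissions_gt
avoided_warming = avoided_gt * tcre_per_1000_gt / 1000
print(f"Warming avoided by 2020's dip ≈ {avoided_warming:.4f} °C")   # ~0.001 °C
```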

That emissions from fossil fuels have to peak, and soon, is a central tenet of climate policy. Precisely when they might do so, though, is so policy-dependent that many forecasters decline to give a straight answer. The IEA makes a range of projections depending on whether governments keep on with today’s policies or enact new ones. In the scenario which assumes that current policies stay in place, fossil-fuel demand rises by nearly 30% from 2018 to 2040, with no peak in sight.

The IEA, though, has persistently underestimated the renewable-energy sector. Others are more bullish. Carbon Tracker, a financial think-tank, predicted in 2018 that with impressive but plausible growth in renewable deployment and relatively slow growth in overall demand, even under current policy fossil-fuel emissions should peak in the 2020s—perhaps as early as 2023. Michael Liebreich, who founded BloombergNEF, an energy-data outfit, has also written about a possible peak in the mid 2020s. Depending on how the pandemic pans out he now thinks that it may be in 2023—or may have been in 2019.

Previously, drops in emissions caused by economic downturns have proved only temporary setbacks to the ongoing rise in fossil-fuel use. The collapse of the Soviet Union in 1991, the Asian financial crash in 1997 and the financial crisis of 2007-09 all saw emissions stumble briefly before beginning to rise again (see chart). But if a peak really was a near-term prospect before the pandemic, almost a decade’s worth of setback could mean that, though emissions will rise over the next few years, they never again reach the level they stood at last year.

The alternative, more orthodox pre-covid view was that the peak was both further off and destined to be higher. On this view, emissions will regain their pre-pandemic level within a few years and will climb right on past it. Covid’s damage to the economy probably means that the peak, when it arrives, will be lower than it might have been, says Roman Kramarchuk of S&P Global Platts Analytics, a data and research firm. But an economic dip is unlikely to bring it on sooner.

What, though, if covid does not merely knock demand back, but reshapes it? This shock, unlike prior ones, comes upon an energy sector already in the throes of change. The cost of renewables is dipping below that of new fossil-fuel plants in much of the world. After years of development, electric vehicles are at last poised for the mass market. In such circumstances covid-19 may spur decisions—by individuals, firms, investors and governments—that hasten fossil fuels’ decline.

So far, renewables have had a pretty good pandemic, despite some disruptions to supply chains. With no fuel costs and the preferential access to electricity grids granted by some governments, renewables demand jumped 1.5% in the first quarter, even as demand for all other forms of energy sank. America’s Energy Information Administration expects renewables to surpass coal’s share of power generation in America for the first time this year.

Coal prices have fallen, given the low demand, which may position the fuel well post-pandemic in some places. Even before covid, China was building new coal-fired plants (see article). But the cost of borrowing is also low, and likely to stay that way, which means installing renewables should stay cheap for longer. Renewable developers such as Iberdrola and Orsted, both of which have weathered covid-19 rather well so far, are keen to replace coal on an ever larger scale.

Those who see demand for fossil fuels continuing to climb as populations and economies grow have assumed demand for oil will be much more persistent than that for coal. Coal is almost entirely a source of electricity, which makes it ripe for replacement by renewables. Oil is harder to shift. Electric vehicles are sure to eat into some of its demand; but a rising appetite for petrochemicals and jet fuel, to which lithium-ion batteries offer no competition, was thought likely to offset the loss.

Breaking bounds

Now oil’s future looks much more murky, depending as it does on a gallimaufry of newly questionable assumptions about commuting, airline routes, government intervention, capital spending and price recovery. In the future more people may work from home, and commuting accounts for about 8% of oil demand. But those who do commute may prefer to do so alone in their cars, offsetting some of those gains. Chinese demand for oil has picked up again quickly in part because of reticence about buses and trains.

As to planes, Jeff Currie of Goldman Sachs estimates that demand for oil will recover to pre-crisis levels by the middle of 2022, but that demand for jet fuel may well stay 1.7m barrels a day below what it was as business travel declines. That is equivalent to nearly 2% of oil demand.
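A quick arithmetic check on the two paragraphs above, taking global oil demand as roughly 100m barrels a day (an assumed round figure, not from the article):

```python
# Rough shares of global oil demand affected by commuting and jet-fuel changes.
# The 100m barrels/day total is an assumed round figure for illustration.
global_oil_mbd = 100.0

commuting_share = 0.08                  # article: commuting is ~8% of oil demand
jet_fuel_shortfall_mbd = 1.7            # article: jet fuel may stay 1.7m b/d lower

print(f"Commuting: ~{commuting_share * global_oil_mbd:.0f}m barrels/day at stake")
print(f"Jet-fuel shortfall: ~{jet_fuel_shortfall_mbd / global_oil_mbd:.1%} "
      f"of demand")                     # ~1.7%, i.e. 'nearly 2%'
```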

Such uncertainty means more trouble for the oil sector, whose poor returns and climate risks have been repelling investors for a while. Companies are slashing spending on new projects. By the mid-2020s today’s underinvestment in oil may boost crude prices—making demand for electric vehicles grow all the faster.

Natural gas, the fossil fuel for which analysts have long predicted continued growth, has weathered the pandemic better than its two older siblings. But it, too, faces accelerating competition. One of gas’s niches is powering the “peaker” plants which provide quick influxes of energy when demand outstrips a grid’s supply. It looks increasingly possible for batteries to take a good chunk of that business.

Those hoping for fossil fuels’ imminent demise should not be overconfident. As lockdowns around the world end, use of dirty fuels will tick back up, as they have in China. Energy emissions no longer rise in lockstep with economic growth, but demand for fossil fuels remains tied to it. Mr Currie of Goldman Sachs, for one, is wary of declaring a permanent decoupling: “I’m not willing to say there is a structural shift in oil demand to GDP.” Even so, a peak of fossil fuels in the 2020s looks less and less farfetched—depending on what governments do next in their struggle with the pandemic. Of all the uncertainties in energy markets, none currently looms larger than that. ■