João Paulo Charleaux – Oct. 13, 2021 (updated Oct. 13, 2021 at 12:26 am)
Laurent-Henri Vignaud, a historian of science at the University of Burgundy, talks to 'Nexo' about the ideas, on both the right and the left, behind the anti-vaccine movement over the past 300 years
Resistance to vaccination is an old and persistent phenomenon that finds adherents on both the left and the right (always on the most extreme fringes of those camps), and it is linked not to a lack of education but to an excess of information and the difficulty of knowing what to believe, according to Laurent-Henri Vignaud, a historian of science at the University of Burgundy, in France.
The author of "Antivax: Resistance to Vaccines from the 18th Century to the Present Day" dissects, in this written interview granted to Nexo on Wednesday (6), the arguments of those around the world who still resist getting vaccinated against covid-19, and looks back at the anti-vaccine movement throughout history.
Vignaud will give a virtual lecture on the subject on October 14, as part of a series of talks on covid organized by the Consulate of France in São Paulo in partnership with Unesco, the United Nations agency for education and culture, and with Blogs de Ciência da Unicamp. The lecture will be streamed live, and the videos remain available on the consulate's online channels.
What are the arguments of those who oppose vaccination? How have those arguments varied over the past 300 years?
Laurent-Henri Vignaud These arguments are very diverse, as are "antivax" profiles. Many people have simple doubts about the quality of vaccines or about the conflicts of interest of those who promote them. Others develop extreme conspiracy theories, claiming that vaccines are designed to sicken, sterilize, kill or enslave. In between are those who "hesitate" for one reason or another.
Those who explicitly refuse one or more vaccines (the "antivax" in the strict sense) do so for religious, political, or alternative and naturalist reasons. In every religion there are rigorist currents that refuse vaccination in the name of a fatalist, providentialist principle, asserting the idea that man is not master of his own destiny.
Those who oppose vaccines for political reasons attack compulsory laws in the name of individual liberties and the free disposal of their own bodies, in the language of "my body belongs to me."
Others, very numerous today, dispute the efficacy of vaccines and advocate other therapies ranging from health regimens to herbal medicine and homeopathy, reflected in claims such as "natural immunity is superior to vaccine immunity" and "diseases make us stronger." Most of these arguments have been present since the start of the vaccine controversy in the late 18th century, but they resurface in different forms in each era.
Historically, is the anti-vaccination movement right-wing or left-wing? Is that something that has changed over time, or has it stayed the same?
Laurent-Henri Vignaud Today both tendencies exist: there is an "ecological" anti-vaccine stance that is fairly left-leaning and bourgeois, a model widespread, for example, in California among employees of digital companies. And there is a "libertarian" or "confessional" anti-vaccine stance on the right, present above all in the Americas, in conservative religious circles and among supporters of populist leaders such as [former U.S. president Donald] Trump or [Brazilian president Jair] Bolsonaro.
Historically, inoculation, the technique that preceded vaccines in the 18th century, was promoted by philosophers such as Voltaire [the French Enlightenment thinker, 1694-1778] and opposed by men of the Church. That opposition can therefore be classed as right-wing. In the 19th century, the harshness of compulsory vaccination measures sparked revolts among poorer groups who could not escape the needle. Vaccinism then appeared as social hygiene, while anti-vaccinism was championed by workers', feminist and animal-welfare movements, and was thus more markedly left-wing.
The 1904 Vaccine Revolt in Brazil was triggered by a forced vaccination campaign pursued by the young Republic, which set off riots among the working class. In the 20th century, anti-vaccinism is represented on both the right and the left, but almost always at the extremes.
How do you explain that France, a developed, rich, scientifically advanced country with no shortage of reliable sources of information, today shows such high resistance to vaccination, even among health professionals?
Laurent-Henri Vignaud That is a recent phenomenon. France is not free of the anti-vaccine tradition. In fact, it was quite a virulent tradition in Pasteur's day [the 19th century], to the point of delaying the introduction of compulsory smallpox vaccination, but it was not a widespread opinion until the early 2000s.
Our first "antivax" league, for example, appeared in 1954, after the BCG requirement came into force; by then, the British and the Americans had already had "antivax" leagues for nearly a century.
During the last smallpox outbreak in Brittany, in 1954-55, when the prefect ordered a reinforcement of compulsory vaccination, more than 90% of the residents concerned had already been vaccinated voluntarily.
That confidence was shaken during the debate over the hepatitis B vaccine in the mid-1990s, not least because politicians contradicted one another about its possible dangers. And during the influenza A crisis of 2009, the vaccination campaign failed. The French did not believe a pandemic was possible and did not understand why they should be vaccinated against a disease in which they saw no danger. Perhaps the shock of the covid pandemic will reverse that trend.
How do you explain the persistence of rumors, mysticism and irrationality, even in an age when science has advanced so far and formal education has reached so many? Is this attachment to conspiracy theories an inextinguishable human trait?
Laurent-Henri Vignaud Suspicion of technological risk (a vaccine is, after all, a manufactured product) feeds not on a lack of information but on its overflow. It is because we are flooded with information, and cannot process a tenth of it, that we doubt.
Who among us can explain, even roughly, how something as ubiquitous as a cell phone works? Faced with this superabundance of techno-scientific puzzles and of knowledge we cannot assimilate, citizens 2.0 shop around and believe what they want to believe according to what they judge to be in their interest.
Most people trust words of authority and the little they manage to understand of everything that reaches them. Some are dissatisfied with the answers they are given and come to doubt everything, going so far as to imagine parallel, paranoid universes. These beliefs rest not on ignorance, then, but on a "burden of proof" that weighs ever more heavily on the shoulders of contemporary citizens.
In this "risk society," contemporary citizens are increasingly urged to take responsibility for themselves and to judge for themselves what is true and what is false. In some, the critical spirit runs away with itself and leads to a form of radical skepticism of which anti-vaccinism is a good example.
Placing our faith in forecasting and science could save lives and money
October 14, 2021
2021 is shaping up to be a historically busy hurricane season. And while damage and destruction have been serious, there has been one saving grace — that the National Weather Service has been mostly correct in its predictions.
Thanks to remote sensing, Gulf Coast residents knew to prepare for the “life-threatening inundation,” “urban flooding” and “potentially catastrophic wind damage” that the Weather Service predicted for Hurricane Ida. Meteorologists nailed Ida’s strength, surge and location of landfall while anticipating that a warm eddy would make her intensify too quickly to evacuate New Orleans safely. Then, as her remnants swirled northeast, reports warned of tornadoes and torrential rain. Millions took heed, and lives were saved. While many people died, their deaths resulted from failures of infrastructure and policy, not forecasting.
The long history of weather forecasting and weather mapping shows that having access to good data can help us make better choices in our own lives. Trust in meteorology has made our communities, commutes and commerce safer — and the same is possible for climate science.
Two hundred years ago, the few who studied weather deemed any atmospheric phenomenon a “meteor.” The term, referencing Aristotle’s “Meteorologica,” essentially meant “strange thing in the sky.” There were wet things (hail), windy things (tornadoes), luminous things (auroras) and fiery things (comets). In fact, the naturalist Elias Loomis, who was among the first to spot Halley’s comet upon its return in 1835, thought storms behaved as cyclically as comets. So to understand “the laws of storms,” Loomis and the era’s other leading weatherheads began gathering observations. Master the elements, they reasoned, and you could safely sail the seas, settle the American West, plant crops with confidence and ward off disease.
In 1856, Joseph Henry, the Smithsonian Institution’s first director, hung a map of the United States in the lobby of its Washington headquarters. Every morning, he would affix small colored discs to show the nation’s weather: white for places with clear skies, blue for snow, black for rain and brown for cloud cover. An arrow on each disc allowed him to note wind direction, too. For the first time, visitors could see weather across the expanding country.
Although simple by today’s standards, the map belied the effort and expense needed to select the correct colors each day. Henry persuaded telegraph companies to transmit weather reports every morning at 10. Then he equipped each station with thermometers, barometers, weathervanes and rain gauges — no small task by horse and rail, as instruments often broke in transit.
For longer-term studies of the North American climate, Henry enlisted academics, farmers and volunteers from Maine to the Caribbean. Eager to contribute, “Smithsonian observers” took readings three times a day and posted them to Washington each month. At its peak in 1860, the Smithsonian Meteorological Project had more than 500 observers. Then the Civil War broke out.
Henry’s ranks thinned by 40 percent as men traded barometers for bayonets. Severed telegraph lines and the priority of war messages crippled his network. Then in January 1865, a fire in Henry’s office landed the fatal blow to the project. All of his efforts turned to salvaging what survived. With a vacuum of leadership in Washington, citizen scientists picked up the slack.
Although the Chicago Tribune lampooned Lapham, wondering “what practical value” a warning service would provide “if it takes 10 years to calculate the progress of a storm,” Rep. Halbert E. Paine (Wis.), who had studied storms under Loomis, rushed a bill into Congress before the winter recess. In early 1870, a joint resolution establishing a storm-warning service under the U.S. Army Signal Office passed without debate. President Ulysses S. Grant signed it into law the following week.
Despite the mandate for an early-warning system, an aversion to predictions remained. Fiscal hawks could not justify an investment in erroneous forecasts, religious zealots could not stomach the hubris, and politicians wary of a skeptical public could not bear the fallout. In 1893, Agriculture Secretary J. Sterling Morton cut the salary of one of the country’s top weather scientists, Cleveland Abbe, by 25 percent, making an example out of him.
While Moore didn’t face consequences for his dereliction of duty, the Weather Bureau’s hurricane-forecasting methods gradually improved as the network expanded and technologies like radio emerged. The advent of aviation increased insight into the upper atmosphere; military research led to civilian weather radar, first deployed at Washington National Airport in 1947. By the 1950s, computers were ushering in the future of numerical forecasting. Meanwhile, public skepticism thawed as more people and businesses saw it in their best interests to trust experts.
In September 1961, a local news team decided to broadcast live from the Weather Bureau office in Galveston, Tex., as Hurricane Carla angled across the Gulf of Mexico. Leading the coverage was a young reporter named Dan Rather. “There is the eye of the hurricane right there,” he told his audience as the radar sweep brought the invisible into view. At the time, no one had seen a radar weather map televised before.
Rather realized that for viewers to comprehend the storm’s size, location and imminent danger, people needed a sense of scale. So he had a meteorologist draw the Texas coast on a transparent sheet of plastic, which Rather laid over the radarscope. Years later, he recalled that when he said “one inch equals 50 miles,” you could hear people in the studio gasp. The sight of the approaching buzz saw persuaded 350,000 Texans to evacuate their homes in what was then the largest weather-related evacuation in U.S. history. Ultimately, Carla inflicted twice as much damage as the Galveston hurricane 60 years earlier. But with the aid of Rather’s impromptu visualization, fewer than 50 lives were lost.
In other words, weather forecasting wasn’t only about good science, but about good communication and visuals.
Data visualization helped the public better understand the weather shaping their lives, and this enabled them to take action. It also gives us the power to see deadly storms not as freak occurrences, but as part of something else: a pattern.
Two hundred years ago, a 10-day forecast would have seemed preposterous. Now we can predict if we’ll need an umbrella tomorrow or a snowplow next week. Imagine if we planned careers, bought homes, built infrastructure and passed policy based on 50-year forecasts as routinely as we plan our weeks by five-day ones.
Unlike our predecessors of the 19th or even 20th centuries, we have access to ample climate data and data visualization that give us the knowledge to take bold actions. What we do with that knowledge is a matter of political will. It may be too late to stop the coming storm, but we still have time to board our windows.
This month, storms forced the cancellation of more than 300 flights at Chicago's O'Hare airport and at Dallas/Fort Worth airport in Texas. In July, eight flights were canceled in Denver and another 300 were delayed because of the wildfires that struck the U.S. Pacific Northwest. Extreme heat affected takeoffs in Las Vegas and in Colorado earlier this summer [late June to late September in the northern hemisphere].
The disruptions fit a trend: weather-related flight cancellations and delays have become far more frequent in the United States and in Europe over the past two decades, data from regulators show. Although it is difficult to link any individual storm or heat wave to climate change, scientific studies have determined that such events will become more frequent or more intense as the planet warms.
ICAO (the International Civil Aviation Organization), the UN-linked body that sets standards for the industry, found in a 2019 survey of its member states that three quarters of respondents said their air transport sectors were already experiencing some impact from climate change.
"It is absolutely on our minds whether we will be able to keep up our flight schedule, especially considering the growth we have planned for the future," said David Kensick, vice president of global operations at United Airlines. "With climate change, we are seeing weather that is ever harder to predict, so we will have to get better at dealing with the situations it creates."
Airlines account for about 2% of global greenhouse gas emissions, though some studies indicate that, when other substances emitted by aircraft are taken into account, their impact on the climate may be even greater.
The potential impact of climate change on the industry is broad. In the short term, severe weather creates operational headaches. Forced diversions and flight cancellations raise costs for an industry that lost billions of dollars during the pandemic.
In the longer term, airlines believe that shifting weather patterns will alter flight routes and fuel consumption. Flights between Europe and the United States will probably take longer as the jet stream over the North Atlantic changes, for example.
"Aviation will be a victim of climate change, as well as being seen by many people as one of the villains," said Paul Williams, professor of atmospheric science at the University of Reading in the UK.
The number of delays attributed to bad weather in European airspace rose from 2.5 million in 2003 to a peak of 6.5 million in 2019, according to Eurocontrol data, though part of that increase can be attributed to the industry's growth. As a share of all causes of delay, weather problems rose from 23% to 27% over the same period.
The share of U.S. flights canceled because of weather rose from roughly 35% of the total in 2004 to 54% in 2019, according to the FAA (Federal Aviation Administration).
Mark Searle, global director of safety at the International Air Transport Association (IATA), said airlines had adapted to climate change over the years.
"There is an evolving situation, but it is not as if we were on the edge of a precipice," he said. "In fact, we are managing it very well."
For airports, that may mean preparing for higher sea levels. The new passenger terminal at Singapore's Changi airport was built just 5.5 meters above mean sea level. Avinor, which operates airports along Norway's coast, has determined that all new runways be built at least seven meters above sea level.
For airlines, technology will be required. American Airlines and United Airlines have improved their ability to predict approaching lightning, allowing ramp work to continue longer ahead of an incoming storm without endangering ground staff.
At several of its hub airports, Chicago-based United Airlines has also created automated taxiing systems that allow planes to be brought to the terminals even when storms prevent ramp agents from guiding them to the gates.
Severe weather requires extra staffing. Carriers are forced to pay overtime when their gate and call-center staff face the extra demand from passengers trying to rebook their trips. Companies will have to calculate whether it pays to cover the overtime, create extra shifts, or let passengers bear the consequences of the disruptions.
"There will be an extra cost either way if (and this is an open question) airlines decide they want to deal with it," said Jon Jager, an analyst at Cirium, an aviation research firm.
Although passengers typically blame airlines for the problems they encounter, rules in the United States, the UK and the European Union do not require carriers to compensate passengers for weather-related disruptions. "Mother Nature serves as an excuse to get airlines out of trouble," Jager said.
Disruptions arise not only from storms but from extreme heat. Planes struggle to take off in very high temperatures because hot air is less dense, which means the wings generate less lift. The hotter the temperature, the lighter a plane must be in order to take off, especially at airports with short runways and in hot regions.
Williams, the atmospheric scientist, published a study finding that, for an Airbus A320 taking off from the Greek island of Chios, the payload had to be reduced by about 130 kilograms a year over three decades, roughly equivalent to the weight of a passenger and their luggage.
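The physics behind these payload limits (hotter air is less dense, so wings generate less lift at a given takeoff speed) can be sketched with the ideal gas law. This is an illustrative estimate only, not the method of the cited study; the standard-day figures are textbook assumptions.

```python
# Illustrative sketch: air density from the ideal gas law, and the
# resulting lift deficit on a hot day. At a fixed takeoff speed,
# lift scales linearly with air density.

R_DRY_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_density(temp_c, pressure_pa=101325.0):
    """Density of dry air in kg/m^3 at the given temperature (ideal gas law)."""
    return pressure_pa / (R_DRY_AIR * (temp_c + 273.15))

rho_standard = air_density(15)  # standard day, 15 C at sea level
rho_hot = air_density(40)       # a 40 C heat-wave day

deficit = 1 - rho_hot / rho_standard
print(f"lift deficit on a 40 C day: {deficit:.1%}")  # roughly 8%
```

An 8% lift shortfall must be recovered somehow: a longer takeoff roll, a higher takeoff speed, or a lighter aircraft, which is why short, hot runways force payload cuts first.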
IATA is negotiating with its members this year over the adoption of new climate targets. The industry's current targets, adopted in 2009, include halving 2005 emission levels by 2050 and making all growth carbon-neutral from 2020 onward.
But in many parts of the industry, especially in Europe and the United States, there is a conviction that tougher targets, including a commitment to net-zero emissions, are needed.
"We think we should probably go further, and we are working on it," Alexandre de Juniac, the outgoing head of IATA, told the Financial Times a few months ago.
Williams said the aviation industry's approach to climate change seemed to be shifting.
"Historically, there were a lot of climate-change skeptics in the aviation industry, but I have noticed a change," he said. "The industry is much more honest now."
Opinion | The Climate Has a Gun (The Wall Street Journal)
Those who dismiss risk of climate change often appeal to uncertainty, but they have it backward.
Aug. 17, 2021 1:14 pm ET
In “Climate Change Brings a Flood of Hyperbole” (op-ed, Aug. 11), Steven Koonin put himself in the unenviable position of playing down climate change precisely while we are experiencing unprecedented heat waves, storms, fires, droughts, and floods that exceed model-based expectations.
Mr. Koonin claims that regional projections are “meant to scare people.” But the paper he cites for support addresses the “unfolding of what may become catastrophic changes to Earth’s climate” and argues that “being able to anticipate what would otherwise be surprises in extreme weather and climate variations” requires better models. In other words, our current models cannot rule out a catastrophic future.
Model uncertainty is two-edged. If we’d been lucky, we’d be discovering that we overestimated the danger. But all indicators suggest the opposite. Those who dismiss climate risk often appeal to uncertainty, but they have it backward. Climate uncertainty is like not knowing how many shots Dirty Harry fired from his .44-caliber Magnum. Now that it’s pointed at our head, it’s dawning on us that we’ve probably miscalculated. By the time we’re sure, it’s too late. We’ve got to ask ourselves one question: Do we feel lucky? Well, do we?
Adj. Prof. Mark Boslough – University of New Mexico
Opinion | Climate Change Brings a Flood of Hyperbole (The Wall Street Journal)
Despite constant warnings of catastrophe, things aren’t anywhere near as dire as the media say.
Steven E. Koonin – Aug. 10, 2021 6:33 pm ET
The Intergovernmental Panel on Climate Change has issued its latest report assessing the state of the climate and projecting its future. As usual, the media and politicians are exaggerating and distorting the evidence in the report. They lament an allegedly broken climate and proclaim, yet again, that we are facing the “last, best chance” to save the planet from a hellish future. In fact, things aren’t—and won’t be—anywhere near as dire.
The new report, titled AR6, is almost 4,000 pages, written by several hundred government-nominated scientists over the past four years. It should command our attention, especially because this report will be a crucial element of the coming United Nations Climate Change Conference in Glasgow. Leaders from 196 countries will come together there in November, likely to adopt more-aggressive nonbinding pledges to reduce greenhouse-gas emissions.
Previous climate-assessment reports have misrepresented scientific research in the “conclusions” presented to policy makers and the media. The summary of the most recent U.S. government climate report, for instance, said heat waves across the U.S. have become more frequent since 1960, but neglected to mention that the body of the report shows they are no more common today than they were in 1900. Knowledgeable independent scientists need to scrutinize the latest U.N. report because of the major societal and economic disruptions that would take place on the way to a “net zero” world, including the elimination of fossil-fueled electricity, transportation and heat, as well as complete transformation of agricultural methods.
It is already easy to see things in this report that you almost certainly won’t learn from the general media coverage. Most important, the model muddle continues. We are repeatedly told “the models say.” But the complicated computer models used to project future temperature, rainfall and so on remain deficient. Some models are far more sensitive to greenhouse gases than others. Many also disagree on the baseline temperature for the Earth’s surface.
The latest models also don’t reproduce the global climate of the past. The models fail to explain why rapid global warming occurred from 1910 to 1940, when human influences on the climate were less significant. The report also presents an extensive “atlas” of future regional climates based on the models. Sounds authoritative. But two experts, Tim Palmer and Bjorn Stevens, write in the Proceedings of the National Academy of Sciences that the lack of detail in current modeling approaches makes them “not fit” to describe regional climate. The atlas is mainly meant to scare people.
The social cost of carbon could guide us toward intelligent policies, but only if we knew what it was.
In contrast to the existential angst currently in fashion around climate change, there’s a cold-eyed calculation that its advocates, mostly economists, like to call the most important number you’ve never heard of.
It’s the social cost of carbon. It reflects the global damage of emitting one ton of carbon dioxide into the sky, accounting for its impact in the form of warming temperatures and rising sea levels. Economists, who have squabbled over the right number for a decade, see it as a powerful policy tool that could bring rationality to climate decisions. It’s what we should be willing to pay to avoid emitting that one more ton of carbon.
This story was part of our May 2019 issue
For most of us, it’s a way to grasp how much our carbon emissions will affect the world’s health, agriculture, and economy for the next several hundred years. Maximilian Auffhammer, an economist at the University of California, Berkeley, describes it this way: it’s approximately the damage done by driving from San Francisco to Chicago, assuming that about a ton of carbon dioxide spits out of the tailpipe over those 2,000 miles.
Common estimates of the social cost of that ton are $40 to $50. The cost of the fuel for the journey in an average car is currently around $225. In other words, you’d pay roughly 20% more to take the social cost of the trip into account.
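The back-of-envelope arithmetic above can be checked in a few lines; the $45 figure is an assumed midpoint of the commonly quoted $40 to $50 range.

```python
# Back-of-envelope check of the SF-to-Chicago example:
# about one ton of CO2 per trip, fuel at roughly $225, and the
# social cost taken at an assumed midpoint of the $40-$50 range.
fuel_cost = 225.0           # dollars of fuel for the ~2,000-mile drive
social_cost_per_ton = 45.0  # assumed midpoint of the quoted range
tons_co2 = 1.0              # approximate tailpipe emissions for the trip

extra_share = social_cost_per_ton * tons_co2 / fuel_cost
print(f"social cost adds {extra_share:.0%} to the trip")  # prints 20%
```

Swapping in the Trump administration's $1 to $7 range, or the $400-plus academic estimates, shifts that surcharge from under 3% to well over 100%, which is exactly why the number is so contested.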
The number is contentious, however. A US federal working group in 2016, convened by President Barack Obama, calculated it at around $40, while the Trump administration has recently put it at $1 to $7. Some academic researchers cite numbers as high as $400 or more.
Why so wide a range? It depends on how you value future damages. And there are uncertainties over how the climate will respond to emissions. But another reason is that we actually have very little insight into just how climate change will affect us over time. Yes, we know there’ll be fiercer storms and deadly wildfires, heat waves, droughts, and floods. We know the glaciers are melting rapidly and fragile ocean ecosystems are being destroyed. But what does that mean for the livelihood or life expectancy of someone in Ames, Iowa, or Bangalore, India, or Chelyabinsk, Russia?
For the first time, vast amounts of data on the economic and social effects of climate change are becoming available, and so is the computational power to make sense of it. Taking this opportunity to compute a precise social cost of carbon could help us decide how much to invest and which problems to tackle first.
“It is the single most important number in the global economy,” says Solomon Hsiang, a climate policy expert at Berkeley. “Getting it right is incredibly important. But right now, we have almost no idea what it is.”
That could soon change.
The cost of death
In the past, calculating the social cost of carbon typically meant estimating how climate change would slow worldwide economic growth. Computer models split the world into at most a dozen or so regions and then averaged the predicted effects of climate change to get the impact on global GDP over time. It was at best a crude number.
Over the last several years, economists, data scientists, and climate scientists have worked together to create far more detailed and localized maps of impacts by examining how temperatures, sea levels, and precipitation patterns have historically affected things like mortality, crop yields, violence, and labor productivity. This data can then be plugged into increasingly sophisticated climate models to see what happens as the planet continues to warm.
The wealth of high-resolution data makes a far more precise number possible—at least in theory. Hsiang is co-director of the Climate Impact Lab, a team of some 35 scientists from institutions including the University of Chicago, Berkeley, Rutgers, and the Rhodium Group, an economic research organization. Their goal is to come up with a number by looking at about 24,000 different regions and adding together the diverse effects that each will experience over the coming hundreds of years in health, human behavior, and economic activity.
It’s a huge technical and computational challenge, and it will take a few years to come up with a single number. But along the way, the efforts to better understand localized damages are creating a nuanced and disturbing picture of our future.
So far, the researchers have found that climate change will kill far more people than once thought. Michael Greenstone, a University of Chicago economist who co-directs the Climate Impact Lab with Hsiang, says that previous mortality estimates had looked at seven wealthy cities, most in relatively cool climates. His group looked at data gleaned from 56% of the world’s population. It found that the social cost of carbon due to increased mortality alone is $30, nearly as high as the Obama administration’s estimate for the social cost of all climate impacts. An additional 9.1 million people will die every year by 2100, the group estimates, if climate change is left unchecked (assuming a global population of 12.7 billion people).
However, while the Climate Impact Lab’s analysis showed that 76% of the world’s population would suffer from higher mortality rates, it found that warming temperatures would actually save lives in a number of northern regions. That’s consistent with other recent research; the impacts of climate change will be remarkably uneven.
The variations are significant even within some countries. In 2017, Hsiang and his collaborators calculated climate impacts county by county in the United States. They found that every degree of warming would cut the country’s GDP by about 1.2%, but the worst-hit counties could see a drop of around 20%.
If climate change is left to run unchecked through the end of the century, the southern and southwestern US will be devastated by rising rates of mortality and crop failure. Labor productivity will slow, and energy costs (especially due to air-conditioning) will rise. In contrast, the northwestern and parts of the northeastern US will benefit.
“It is a massive restructuring of wealth,” says Hsiang. This is the most important finding of the last several years of climate economics, he adds. By examining ever smaller regions, you can see “the incredible winners and losers.” Many in the climate community have been reluctant to talk about such findings, he says. “But we have to look [the inequality] right in the eye.”
The social cost of carbon is typically calculated as a single global number. That makes sense, since the damage of a ton of carbon emitted in one place is spread throughout the world. But last year Katharine Ricke, a climate scientist at UC San Diego and the Scripps Institution of Oceanography, published the social costs of carbon for specific countries to help parse out regional differences.
India is the big loser. Not only does it have a fast-growing economy that will be slowed, but it’s already a hot country that will suffer greatly from getting even hotter. “India bears a huge share of the global social cost of carbon—more than 20%,” says Ricke. It also stands out for how little it has actually contributed to the world’s carbon emissions. “It’s a serious equity issue,” she says.
Estimating the global social cost of carbon also raises a vexing question: How do you put a value on future damages? We should invest now to help our children and grandchildren avoid suffering, but how much? This is hotly and often angrily debated among economists.
A standard tool in economics is the discount rate, used to calculate how much we should invest now for a payoff years from now. The higher the discount rate, the less you value the future benefit. William Nordhaus, who won the 2018 Nobel Prize in economics for pioneering the use of models to show the macroeconomic effects of climate change, has used a discount rate of around 4%. The relatively high rate suggests we should invest conservatively now. In sharp contrast, a landmark 2006 report by British economist Nicholas Stern used a discount rate of 1.4%, concluding that we should begin investing much more heavily to slow climate change.
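The practical gulf between those two rates is easiest to see with a present-value calculation. A minimal sketch in Python (the $100 damage figure is an arbitrary illustration, not a number from either study):

```python
# Present value of $100 of climate damages occurring 100 years from now,
# discounted at roughly Nordhaus's 4% rate versus Stern's 1.4% rate.
def present_value(future_damage, rate, years):
    return future_damage / (1 + rate) ** years

for rate in (0.04, 0.014):
    pv = present_value(100, rate, 100)
    print(f"rate {rate:.1%}: worth ${pv:.2f} today")
```

At 4%, $100 of damages a century from now justifies only about $2 of spending today; at 1.4%, about $25, which is why the Stern report's low rate implies far heavier investment now.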
There’s an ethical dimension to these calculations. Wealthy countries whose prosperity has been built on fossil fuels have an obligation to help poorer countries. The climate winners can’t abandon the losers. Likewise, we owe future generations more than just financial considerations. What’s the value of a world free from the threat of catastrophic climate events—one with healthy and thriving natural ecosystems?
Enter the Green New Deal (GND). It’s the sweeping proposal issued earlier this year by Representative Alexandria Ocasio-Cortez and other US progressives to address everything from climate change to inequality. It cites the dangers of temperature increases beyond the UN goal of 1.5 °C and makes a long list of recommendations. Energy experts immediately began to bicker over its details: Is achieving 100% renewables in the next 12 years really feasible? (Probably not.) Should it include nuclear power, which many climate activists now argue is essential for reducing emissions?
In reality, the GND has little to say about actual policies and there’s barely a hint of how it will attack its grand challenges, from providing a secure retirement for all to fostering family farms to ensuring access to nature. But that’s not the point. The GND is a cry of outrage against what it calls “the twin crises of climate change and worsening income inequality.” It’s a political attempt to make climate change part of the wider discussion about social justice. And, at least from the perspective of climate policy, it’s right in arguing that we can’t tackle global warming without considering broader social and economic issues.
The work of researchers like Ricke, Hsiang, and Greenstone supports that stance. Not only do their findings show that global warming can worsen inequality and other social ills; they provide evidence that aggressive action is worth it. Last year, researchers at Stanford calculated that limiting warming to 1.5 °C would save upwards of $20 trillion worldwide by the end of the century. Again, the impacts were mixed—the GDPs of some countries would be harmed by aggressive climate action. But the conclusion was overwhelming: more than 90% of the world’s population would benefit. Moreover, the cost of keeping temperature increases limited to 1.5 °C would be dwarfed by the long-term savings.
Nevertheless, the investments will take decades to pay for themselves. Renewables and new clean technologies may lead to a boom in manufacturing and a robust economy, but the Green New Deal is wrong to paper over the financial sacrifices we’ll need to make in the near term.
That is why climate remedies are such a hard sell. We need a global policy—but, as we’re always reminded, all politics is local. Adding 20% to the cost of that San Francisco–Chicago trip might not seem like much, but try to convince a truck driver in a poor county in Florida that raising the price of fuel is wise economic policy. A much smaller increase sparked the gilets jaunes riots in France last winter. That is the dilemma, both political and ethical, that we all face with climate change.
The latest landmark climate science report goes much further than previous ones in providing estimates of how bad things might get as the planet heats up, even if a lack of data may mean it underestimates the perils.
Scientists have used the seven years since the previous assessment report of the Intergovernmental Panel on Climate Change (IPCC) to narrow the uncertainties around major issues, such as how much the planet will warm if we double atmospheric levels of carbon dioxide and other greenhouse gases.
While temperatures have risen largely in lockstep with rising CO2, this IPCC report examines in much more detail the risks of so-called abrupt changes, when relatively stable systems abruptly and probably irreversibly shift to a new state.
Michael Mann, director of Pennsylvania State University's Earth System Science Center and one of the world's most prominent climate researchers, says the models are not capturing all the risks as the climate heats up.
Perhaps the most prominent of these threats is a possible stalling of the Atlantic Meridional Overturning Circulation (AMOC). This current system, which includes the Gulf Stream, brings tropical water north from the Caribbean and keeps northern Europe much warmer than its latitude might otherwise suggest; if it slows or stops, the disruptions would be massive.
“Where the models have underestimated the impact is with projections of ice melt, the AMOC, and – I argue in my own work – the uptick on extreme weather events,” Professor Mann tells the Herald and The Age.
Stefan Rahmstorf, head of research at the Potsdam Institute for Climate Impact Research, agrees that climate models have not done a good job of reproducing the so-called cold blob forming in the subpolar Atlantic, where meltwater from Greenland's ice is cooling the ocean.
If they are not picking that blob up, “should we trust those models on AMOC stability?” Professor Rahmstorf asks.
The IPCC’s language, too, doesn’t necessarily convey the nature of the threat, much of which will be detailed in the second AR6 report on the impacts of climate change, scheduled for release next February.
“Like just stating the AMOC collapse by 2100 is ‘very unlikely’ – that was in a previous report – it sounds reassuring,” Professor Rahmstorf said. “Now the IPCC says they have ‘medium confidence’ that it won’t happen by 2100, whatever that means.”
West Antarctic melt
Another potential tipping point is the possible disintegration of the West Antarctic ice sheet. Much of the sheet lies below sea level, and as the Southern Ocean warms, the ice will melt, causing it to “flow” towards the sea in a process that is expected to be self-sustaining.
This so-called marine ice sheet instability is identified in the IPCC report as likely resulting in ice mass loss under all emissions scenarios. There is also “deep uncertainty in projections for above 3 degrees of warming”, the report states.
What happens to the ice sheet matters: it contains enough water to lift sea levels by 3.3 metres. As Andrew Mackintosh, an ice expert at Monash University, says, the understanding is limited: “We know more about the surface of Mars than the ice sheet bed under the ice.”
Permafrost not so permanent
Much has been made about the so-called “methane bomb” sitting under the permafrost in the northern hemisphere. As the Arctic has warmed at more than twice the pace of the globe overall, with heatwaves of increasing intensity and duration, it is not surprising that the IPCC has listed the release of so-called biogenic emissions from permafrost thaw as among potential tipping points.
These emissions could total up to 240 gigatonnes of CO2-equivalent which, if released, would add an unwanted warming boost.
The IPCC lists as “high” the probability of such releases during this century, adding there is “high confidence” that the process is irreversible at century scales.
“In some cases abrupt changes can occur in Earth System Models but don’t on the timescales of the projections (for example, an AMOC collapse),” said Peter Cox, a Professor of Climate System Dynamics at the UK’s University of Exeter. “In other cases the processes involved are not yet routinely included in ESMs [such as] CO2 and methane release from deep permafrost.”
“In the latter cases IPCC statements are made on the basis of the few studies available, and are necessarily less definitive,” he said.
From the Amazon rainforest to the boreal forests of Russia and Canada, there is a risk of fire and pests that could trigger dieback and transform those regions.
Australia’s bush faces an increased risk of bad fire weather days right across the continent, the IPCC notes. How droughts, heatwaves and heavy rain and other extreme events will play out at a local level is also not well understood.
Ocean acidification and marine heatwaves also mean the world’s coral reefs will be much diminished at more than 1.5 degrees of warming. “You can kiss it goodbye as we know it,” says Sarah Perkins-Kirkpatrick, a climate researcher at the University of NSW.
Global monsoons, which affect billions of people including those on the Indian subcontinent, are likely to increase their rainfall in most parts of the world, the IPCC said.
Andy Pitman, director of the ARC Centre of Excellence for Climate Extremes, said policymakers need to understand that much is riding on these tipping points not being triggered, as even one or two of them would have long-lasting and significant effects. “How lucky do you feel?” Professor Pitman says.
The biggest uncertainty
Christian Jakob, a Monash University climate researcher, said that while important uncertainties remain, science is steadily narrowing most of those risks down.
Much harder to gauge, though, is which emissions path humans will take. The uncertainty over which of the five scenarios, ranging from low to high emissions, the world will follow is “much larger than the uncertainty we have in the science,” Professor Jakob said.
The impact of rising average temperatures on Earth is planetary, including sea level rise and the transformation of entire ecosystems, among other changes.
Regional climate shifts, with more frequent extreme events, are already being felt and will intensify in the coming years, with direct consequences for everyone’s health.
In Brazil, some states will face more days of extreme heat, which can be harmful enough to cause the death of elderly people.
In others, intense rains will become more frequent, causing floods that raise the risk of disease, when they do not destroy entire neighborhoods and cities.
Finally, droughts are also expected to become more intense, which can aggravate respiratory problems.
In addition, both intense rains and droughts damage crops, driving up food prices.
A practical example of rising temperatures is found in Brazil’s Southeast and South. Under the IPCC’s most optimistic scenario, by 2040 the number of days per year with temperatures above 35°C will rise from 26 (the 1995-2014 average) to 32. Under an intermediate scenario, by the end of the century that number could reach 43, an increase of more than 65% over the recent baseline.
In the Center-West, the increase in extreme heat is even more severe. Under the IPCC’s intermediate scenario, the average of 53 days per year above 35°C jumps to about 72 by 2040 and to 108 by the end of the century, slightly more than a quarter of the year under extreme temperatures.
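The percentage figures quoted here can be checked with a few lines of arithmetic, a minimal sketch using only the IPCC-derived day counts cited in the article:

```python
# Days per year above 35°C, from the article's IPCC-derived figures.
baseline_southeast = 26      # 1995-2014 average, Southeast/South
end_century_southeast = 43   # intermediate scenario, end of century

increase = (end_century_southeast - baseline_southeast) / baseline_southeast
print(f"Southeast/South increase: {increase:.0%}")   # matches the >65% quoted

end_century_cw = 108         # Center-West, intermediate scenario, end of century
print(f"Share of the year: {end_century_cw / 365:.0%}")  # just over a quarter
```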
The health consequences are serious. Extreme heat waves can cause hyperthermia, which affects internal organs and damages the heart, muscle cells, and blood vessels. This damage can be fatal.
In June, a heat wave in the US states of Oregon and Washington cost hundreds of lives. According to The New York Times, about 600 excess deaths were recorded during the period.
Beyond the heat, the climate crisis is expected to make droughts and rainless days more frequent in many regions. The Amazon is one of them.
IPCC data indicate that, in the period 1995-2014, Brazil’s North region averaged 43 consecutive rainless days per year; that figure could rise to 51, with dry periods 10% drier, by 2040.
A similar situation is expected in the Center-West, which averaged 69 consecutive rainless days per year and could reach 76, with dry periods 13% drier.
Drier periods in these regions are worrying because of fires. In the Amazon, for example, the dry season is associated with intensified deforestation and burning.
Fires in the Amazon region are linked to worsening air quality and the respiratory problems that follow. Fiocruz and the NGO WWF-Brasil estimate that Amazonian states with high rates of burning spent nearly R$ 1 billion over ten years on hospitalizations for respiratory diseases likely related to smoke from the fires.
Last year, the Pantanal went through its worst drought in 60 years, a dry spell that could last up to five more years, Brazil’s National Secretariat for Civil Protection and Defense said at the time. The situation caused the number of fires in the region to explode.
The IPCC also points to an increase in the frequency and intensity of extreme rainfall and floods in several regions of Brazil.
Beyond the obvious damage to urban infrastructure, floods cause health problems. Hepatitis A (transmitted via the fecal-oral route, that is, through contaminated food and water) and leptospirosis (transmitted through contact with rat urine) are the usual suspects, but there is also a risk of accidents with venomous animals, since snakes and scorpions may seek shelter inside homes.
Manaus became a recent example of this kind of situation. The city faced a historic flood, the largest since measurements began 119 years ago. The waters of the Negro River flooded the main capital of the Amazon region for more than a month. Six of the ten largest floods ever recorded on the river occurred in the 21st century, that is, within the last two decades.
Streets near the port of Manaus had to be closed, and walkways were built over the flooded roads. Meanwhile, shopkeepers put up sandbag barriers and threw lime into the standing water to try to neutralize the smell of sewage.
As the igarapés (urban creeks) flooded, garbage accumulated until it covered the entire surface of the water. Inside their homes, residents used wooden platforms (called marombas) to raise furniture and appliances.
Floods are not exclusive to the Amazon. They also occur in the Southeast, in São Paulo and Rio de Janeiro, for example.
Shortly after the Manaus flood, Europe also saw intense rain concentrated in a short period cause severe flooding, especially in Germany. Beyond the destruction of roads and buildings, there were more than a hundred deaths.
According to Lincoln Alves, a researcher at Inpe (Brazil’s National Institute for Space Research) and a lead author of the IPCC Atlas, the tool aims to make normally complex information easier to access. “Climate change is visible,” the researcher says.
With the Atlas, Alves says, communities, companies, and even levels of government can look at the effects of the climate crisis on a more regional scale.
The tool lets users view Earth’s climate history and the projections for different variables under the different emissions and warming scenarios, such as 1.5°C and 2°C, laid out by the IPCC.
KEY CONCLUSIONS OF THE IPCC REPORT
Human-induced temperature increase from 1850-1900 to 2010-2019: 0.8°C to 1.21°C
The years 2016 to 2020 were the warmest five-year period from 1850 to 2020
Between 2021 and 2040, a 1.5°C temperature increase is at least likely to occur under any emissions scenario
Stabilizing Earth’s temperature could take 20 to 30 years if there are strong, sustained emissions reductions
The ocean is warming faster, including at depths greater than 2,000 m, than at any time since at least the last glacial transition. It is extremely likely that human activities are the main driver
The ocean will continue to warm throughout the 21st century and probably until 2300, even under low-emissions scenarios
Warming of the deep ocean and the melting of ice masses tend to raise sea level, a rise likely to persist for thousands of years
Over the next 2,000 years, global mean sea level is expected to rise 2 to 3 metres if warming is held to 1.5°C, 2 to 6 metres if it is held to 2°C, and 19 to 22 metres under 5°C of warming
As the Intergovernmental Panel on Climate Change (IPCC) released its Sixth Assessment Report, summarized nicely on these pages by Bob Henson, much of the associated media coverage carried a tone of inevitable doom.
These proclamations of unavoidable adverse outcomes center around the fact that in every scenario considered by IPCC, within the next decade average global temperatures will likely breach the aspirational goal set in the Paris climate agreement of limiting global warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit) above pre-industrial temperatures. The report also details a litany of extreme weather events like heatwaves, droughts, wildfires, floods, and hurricanes that will all worsen as long as global temperatures continue to rise.
While United Nations Secretary-General António Guterres rightly called the report a “code red for humanity,” tucked into it are details illustrating that if – BIG IF – top-emitting countries respond to the IPCC’s alarm bells with aggressive efforts to curb carbon pollution, the worst climate outcomes remain avoidable.
The IPCC’s future climate scenarios
In the Marvel film Avengers: Infinity War, the Dr. Strange character goes forward in time to view 14,000,605 alternate futures to see all the possible outcomes of the Avengers’ coming conflict. Lacking the fictional Time Stone used in this gambit, climate scientists instead ran hundreds of simulations of several different future carbon emissions scenarios using a variety of climate models. Like Dr. Strange, climate scientists’ goal is to determine the range of possible outcomes given different actions taken by the protagonists: in this case, various measures to decarbonize the global economy.
The scenarios considered by IPCC are called Shared Socioeconomic Pathways (SSPs). The best-case climate scenario, called SSP1, involves a global shift toward sustainable management of global resources and reduced inequity. The next scenario, SSP2, is more of a business-as-usual path with slow and uneven progress toward sustainable development goals and persisting income inequality and environmental degradation. SSP3 envisions insurgent nationalism around the world with countries focusing on their short-term domestic best interests, resulting in persistent and worsening inequality and environmental degradation. Two more scenarios, SSP4 and SSP5, consider even greater inequalities and fossil fuel extraction, but seem at odds with an international community that has agreed overwhelmingly to aim for the Paris climate targets.
The latest IPCC report’s model runs simulated two SSP1 scenarios that would achieve the Paris targets of limiting global warming to 1.5 and 2°C (2.7 and 3.6°F); one SSP2 scenario in which temperatures approach 3°C (5.4°F) in the year 2100; an SSP3 scenario with about 4°C (7.2°F) global warming by the end of the century; and one SSP5 ‘burn all the fossil fuels possible’ scenario resulting in close to 5°C (9°F), again by 2100.
The report’s SSP3-7.0 pathway (the latter number represents the eventual global energy imbalance caused by the increased greenhouse effect, in watts per square meter), is considered by many experts to be a realistic worst-case scenario, with global carbon emissions continuing to rise every year throughout the 21st century. Such an outcome would represent a complete failure of international climate negotiations and policies and would likely result in catastrophic consequences, including widespread species extinctions, food and water shortages, and disastrous extreme weather events.
Scenario SSP2-4.5 is more consistent with government climate policies that are currently in place. It envisions global carbon emissions increasing another 10% over the next decade before reaching a plateau that’s maintained until carbon pollution slowly begins to decline starting in the 2050s. Global carbon emissions approach but do not reach zero by the end of the century. Even in this unambitious scenario, the very worst climate change impacts might be averted, although the resulting climate impacts would be severe.
Most encouragingly, the report’s two SSP1 scenarios illustrate that the Paris targets remain within reach. To stay below the main Paris target of 2°C (3.6°F) warming, global carbon emissions in SSP1-2.6 plateau essentially immediately and begin to decline after 2025 at a modest rate of about 2% per year for the first decade, then accelerating to around 3% per year the next decade, and continuing along a path of consistent year-to-year carbon pollution cuts before reaching zero around 2075. The IPCC concluded that once global carbon emissions reach zero, temperatures will stop rising. Toward the end of the century, emissions in SSP1-2.6 move into negative territory as the IPCC envisions that efforts to remove carbon from the atmosphere via natural and technological methods (like sequestering carbon in agricultural soils and scrubbing it from the atmosphere through direct air capture) outpace overall fossil fuel emissions.
Meeting the aspirational Paris goal of limiting global warming to 1.5°C (2.7°F) in SSP1-1.9 would be extremely challenging, given that global temperatures are expected to breach this level within about a decade. This scenario similarly envisions that global carbon emissions peak immediately and that they decline much faster than in SSP1-2.6, at a rate of about 6% per year from 2025 to 2035 and 9% per year over the following decade, reaching net zero by around the year 2055 and becoming net negative afterwards.
For perspective, global carbon emissions fell by about 6-7% in 2020 as a result of restrictions associated with the COVID-19 pandemic and are expected to rebound by a similar amount in 2021. As IPCC report contributor Zeke Hausfather noted, this scenario also relies on large-scale carbon sequestration technologies that currently do not exist, without which global emissions would have to reach zero a decade sooner.
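The decline rates quoted for the two SSP1 pathways can be turned into rough emissions trajectories. A minimal sketch, assuming a constant percentage cut per year within each decade and normalizing 2025 emissions to 1.0 (the real pathways are more detailed than this):

```python
# Rough emissions trajectories for the two SSP1 pathways described above.
def trajectory(decade_rates, start=1.0):
    """Apply a constant annual percentage decline within each successive decade."""
    level, path = start, [start]
    for rate in decade_rates:
        for _ in range(10):
            level *= (1 - rate)
            path.append(level)
    return path

ssp1_26 = trajectory([0.02, 0.03])  # ~2%/yr for a decade, then ~3%/yr
ssp1_19 = trajectory([0.06, 0.09])  # ~6%/yr for a decade, then ~9%/yr

print(f"SSP1-2.6 level after 20 years: {ssp1_26[-1]:.2f}")  # ~0.60
print(f"SSP1-1.9 level after 20 years: {ssp1_19[-1]:.2f}")  # ~0.21
```

After two decades the 1.5°C pathway has cut emissions by roughly 80%, versus about 40% for the 2°C pathway, which illustrates why the report treats the former as so much more demanding.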
More warming means more risk
The new IPCC report details that, depending on the region, climate change has already worsened extreme heat, drought, fires, floods, and hurricanes, and those will only become more damaging and destructive as temperatures continue to rise. The IPCC’s 2018 “1.5°C Report” had detailed the differences in climate consequences in a 2°C vs. 1.5°C world, as summarized at this site by Bruce Lieberman.
Consider that in the current climate of just over 1°C (2°F) warmer than pre-industrial temperatures, 40 countries this summer alone have experienced extreme flooding, including more than a year’s worth of rain falling within 24 hours in Zhengzhou, China. Many regions have also experienced extreme heat, including the deadly Pacific Northwest heatwave and dangerously hot conditions during the Olympics in Tokyo. Siberia, Greece, Italy, and the US west coast are experiencing explosive wildfires, including the “truly frightening fire behavior” of the Dixie fire, which broke the record as the largest single wildfire on record in California. The IPCC report warned of “compound events” like heat exacerbating drought, which in turn fuels more dangerous wildfires, as is happening in California.
The IPCC report notes that the low-emissions SSP1 scenarios “would lead to substantially smaller changes” in these sorts of climate impact drivers than the higher-emissions scenarios. It also points out that, with the world currently at around 1°C of warming, the increase in extreme weather intensity over today’s conditions will be twice as large if temperatures reach 2°C (1°C hotter than today) than if warming is limited to 1.5°C (0.5°C hotter than today), and four times as large if global warming reaches 3°C (2°C hotter than today). For example, what was an extreme once-in-50-years heat wave in the late 1800s now occurs once per decade; that would rise to almost twice per decade at 1.5°C and nearly three times per decade at 2°C of global warming.
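Those frequency statements can be restated as average return periods, a rough conversion using only the per-decade counts given in the text, not IPCC data directly:

```python
# Convert "events per decade" into an average return period in years.
def return_period(events_per_decade):
    return 10 / events_per_decade

for label, count in [("~1°C (today)", 1), ("1.5°C", 2), ("2°C", 3)]:
    print(f"{label}: roughly one such heat wave every {return_period(count):.1f} years")
```

In other words, an event that was once-in-50-years in the late 1800s becomes roughly a once-in-10-years event today, once-in-5-years at 1.5°C, and once every three to four years at 2°C.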
Climate’s fate has yet to be written
At the same time, there is no tipping point temperature at which it becomes “too late” to curb climate change and its damaging consequences. Every additional bit of global warming above current temperatures will result in increased risks of worsening extreme weather of the sorts currently being experienced around the world. Achieving the aspirational 1.5°C Paris target may be politically infeasible, but most countries (137 total) have either committed to or are in the process of setting a target for net zero emissions by 2050 (including the United States) or 2060 (including China).
That makes the SSP1 scenarios and limiting global warming to less than 2°C a distinct possibility, depending on how successful countries are at following through with decarbonization plans over the coming three decades. And with its proposed bipartisan infrastructure and budget reconciliation legislative plans – for which final enactment of each remains another big IF – the United States could soon implement some of the bold investments and policies necessary to set the world’s second-largest carbon polluter on a track consistent with the Paris targets.
Again and again, assessment after assessment, the IPCC has already made it clear. Climate change puts at risk every aspect of human life as we know it … We are already starting to experience those risks today; but we know what we need to do to avoid the worst future impacts. The difference between a fossil fuel versus a clean energy future is nothing less than the future of civilization as we know it.
Back to the Avengers: They had only one chance in 14 million to save the day, and they succeeded. Time is running short, but policymakers’ odds of meeting the Paris targets remain much better than that. There are no physical constraints playing the role of Thanos in our story; only political barriers stand between humanity and a prosperous clean energy future, although those can sometimes be the most difficult types of barriers to overcome.
The new IPCC report is “a code red for humanity”, says UN Secretary-General António Guterres.
Established in 1988 by the United Nations Environment Programme (UNEP) and the World Meteorological Organisation (WMO), the Intergovernmental Panel on Climate Change (IPCC) assesses climate change science. Its new report is a warning sign for policymakers all over the world.
In this picture taken on 26 October, 2014, Peia Kararaua, 16, swims in the flooded area of Aberao village in Kiribati. Kiribati is one of the countries worst hit by the sea level rise since high tides mean many villages are inundated, making them uninhabitable. Image credit: UNICEF/Sokhin
This was the first time the approval meeting for the report was conducted online. Some 234 authors from around the world put in 186 hours working together to get the report released.
For the first time, the report offers an interactive atlas for people to see what has already happened and what may happen in the future to where they live.
“This report tells us that recent changes in the climate are widespread, rapid and intensifying, unprecedented in thousands of years,” said IPCC Vice-Chair Ko Barrett.
UNEP Executive Director Inger Andersen said that scientists have been issuing these messages for more than three decades, but the world hasn’t listened.
Here are the most important takeaways from the report:
Humans are to blame
Human activity is the cause of climate change, and this is an unequivocal fact. All the warming since pre-industrial times has been generated by the burning of fossil fuels such as coal, oil, wood, and natural gas.
Global temperatures have already risen by 1.1 degrees Celsius since the 19th century. They have reached their highest in over 100,000 years, and only a fraction of that increase has come from natural forces.
Michael Mann told the Independent that the effects of climate change will be felt in all corners of the world and will worsen, especially since “the IPCC has connected the dots on climate change and the increase in severe extreme weather events… considerably more directly than previous assessments.”
We will overshoot the 1.5°C mark
Across the report’s scenarios, from highly optimistic to reckless, even if we do everything right and start reducing emissions now, we will still overshoot the 1.5°C mark around 2030. Under the most ambitious pathway, however, temperatures would later fall back to around 1.4°C.
Control emissions, Earth will do the rest
According to the report, if we start working to bring our emissions under control, we will be able to decrease warming, even if we overshoot the 1.5°C limit.
The changes we are living through are unprecedented; however, they are reversible to a certain extent. And it will take a lot of time for nature to heal. We can do this by reducing our greenhouse gas (GHG) emissions. While we might see some benefits quickly, “it could take 20-30 years to see global temperatures stabilise” says the IPCC.
Sea level rise
Global oceans have risen about 20 centimetres (eight inches) since 1900, and the rate of increase has nearly tripled in the last decade. Crumbling and melting ice sheets in Greenland and Antarctica have replaced glacier melt as the main driver.
If global warming is capped at 2 C, the ocean watermark will go up about half a metre over the 21st century. It will continue rising to nearly two metres by 2300 — twice the amount predicted by the IPCC in 2019.
Because of uncertainty over ice sheets, scientists cannot rule out a total rise of two metres by 2100 in a worst-case emissions scenario.
CO2 is at all-time high
CO2 levels were greater in 2019 than they had been in “at least two million years.” Methane and nitrous oxide levels, the second and third major contributors of warming respectively, were higher in 2019 than at any point in “at least 800,000 years,” reported the Independent.
The report includes more data than ever before on methane (CH4), the second most important greenhouse gas after CO2, and warns that failure to curb emissions could undermine Paris Agreement goals.
Human-induced sources are roughly divided between leaks from natural gas production, coal mining and landfills on one side, and livestock and manure handling on the other.
CH4 lingers in the atmosphere only a fraction as long as CO2, but is far more efficient at trapping heat. CH4 levels are their highest in at least 800,000 years.
Natural allies are weakened
Since about 1960, forests, soil and oceans have absorbed 56 percent of all the CO2 humanity has released into the atmosphere — even as those emissions have increased by half. Without nature’s help, Earth would already be a much hotter and less hospitable place.
But these allies in our fight against global heating, known in this role as carbon sinks, are showing signs of saturation, and the percentage of human-induced carbon they soak up is likely to decline as the century unfolds.
Suck it out
The report suggests that warming could eventually be brought back down via “negative emissions”: cooling the planet by capturing and sequestering carbon from the atmosphere. Small-scale studies have tested the idea, but the technology is far from mature. The panel says such removal could begin around the middle of this century but doesn’t explain how, and many scientists are skeptical about its feasibility.
Cities will bear the brunt
Experts warn that the impact of some elements of climate change, like heat, floods and sea-level rise in coastal areas, may be exacerbated in cities. Furthermore, IPCC experts warn that low-probability scenarios, like an ice sheet collapse or rapid changes in ocean circulation, cannot be ruled out.
The perversity of denialism lies in swearing that one is saying the opposite of what one is actually saying. In this newspeak, denialism dresses itself up as anti-alarmism. Leandro Narloch’s argument in this Folha on Tuesday (10) manages to be tedious because it is stale. Stale because, as Michael Mann recounts in “The New Climate War,” it is nothing more than the same denialist rhetoric, version 2.0.
In essence, Narloch argues that there are climate-damaging activities that should be “celebrated and spread” because they make us “less vulnerable to nature.” Narloch is scientifically wrong. And he subscribes to one of the most pernicious forms of denialism: he masks it, selling solutions that not only fail to mitigate the climate crisis or adapt societies to it, but have the opposite effect. Blow up the Amazon to save it: that is the argument.
These and other denialist discourses had already been mapped by the Cambridge journal Global Sustainability in July 2020: they are not new. Instead of touching 21st-century taboos, they sell untruths as if they were science. Narloch gets the concept of vulnerability wrong: from the wildfires in California to the floods in Germany, we are not protected from nature, because we are embedded in it. He also ignores the vast IPCC literature on vulnerability.
Narloch disregards the climate-science concept of “feedback loops”: the climate crisis pulls a series of triggers of incalculable dimension, a chain reaction never seen before. Destroying the climate will not protect us from the climate, because it is the absence of a drastic energy transition that has deepened the climate crisis. Investing in the opposite is inefficient.
If the IPCC report has turned on the red light, journalism will not contribute to the issue with disinformation. Pluralism is a river in which ideas move within the banks of truth and science. Do not complain when the river runs dry, imploding the banks that journalism should have protected.
In your opinion, what has happened over the last hundred years to the total number of deaths caused by hurricanes, floods, droughts, heat waves, and other climate disasters? Choose one of the following alternatives:
a) It increased by more than 800%
b) It increased by about 50%
c) It remained constant
d) It decreased by about 50%
e) It decreased by more than 80%
Since the world’s population grew from 1.8 billion in 1921 to 8 billion in 2021, it would seem reasonable to bet on answers B or C, since more people should mean more victims. Many readers probably chose the first option, given the frightening news from this week’s IPCC report.
The correct alternative, however, is the last one. Deaths from natural disasters fell 87% between the 1920s and the 2010s, according to data compiled by Our World in Data.
They went from 540,000 per year to 68,000. The rate relative to population peaked at 63 deaths per 100,000 inhabitants in 1921 and at 176 in 1931. Today it stands at 0.15.
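These percentages can be checked with a quick back-of-the-envelope calculation. This is a sketch only; the 540,000 and 68,000 annual averages are the Our World in Data figures cited above, and the 1921 peak of roughly 1.13 million deaths is inferred from the stated rate of 63 per 100,000 against a population of 1.8 billion:

```python
# Rough check of the decline in deaths from natural disasters,
# using the decadal annual averages cited from Our World in Data.
deaths_1920s = 540_000   # average deaths per year, 1920s
deaths_2010s = 68_000    # average deaths per year, 2010s

decline = (deaths_1920s - deaths_2010s) / deaths_1920s
print(f"Decline: {decline:.0%}")  # 87%, matching the column

# Death *rate* per 100,000 people, to control for population growth:
def rate_per_100k(deaths: float, population: float) -> float:
    return deaths / population * 100_000

# ~1.13 million deaths against a 1.8 billion population gives the
# 1921 peak rate of roughly 63 per 100,000 cited above.
print(rate_per_100k(1_134_000, 1.8e9))
```

Expressing deaths as a rate per 100,000 is what makes the comparison fair: it strips out the four-fold population growth over the century.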
These numbers lead to two interesting paradoxes about the relationship between humanity and the climate. The first recalls Spencer’s Paradox, a reference to Herbert Spencer, for whom “the degree of public concern about a problem or social phenomenon varies inversely with its incidence.”
Just as the English became aware of poverty precisely when it was beginning to decline, during the Industrial Revolution, humanity is terrified of the climate’s misfortunes precisely after having learned to survive them.
The second paradox: at the same time that we pumped a great deal (a very great deal) of carbon into the atmosphere and caused a serious greenhouse problem, we also became less vulnerable to nature. In fact, protecting ourselves from the climate was one of the main reasons we polluted so much.
Take the case of construction. Producing cement consists, roughly speaking, of burning limestone and releasing carbon dioxide.
If the cement industry were a country, it would be the world’s third-largest emitter of greenhouse gases. But this polluting industry allowed people to move out of wattle-and-daub or wooden houses and sleep sheltered in safer structures.
Famine caused by drought, the leading cause of death from natural disasters in the 1920s, was solved by the invention of chemical fertilizers, irrigation systems, and the construction of dams and sanitation networks.
All of these activities caused global warming, yet they remain great human achievements, and they deserve to be celebrated and spread among the poor who still live at risk of dying in hurricanes, droughts, or floods.
Will the historic decline in deaths from natural disasters reverse in the coming years, making real the apocalyptic predictions of Greta Thunberg, for whom “billions of people will die if we do not take urgent action”?
The climate activist Michael Shellenberger, author of the brilliant “Apocalypse Never,” to be published in Brazil this month by LVM, thinks not.
I intend to say more about Shellenberger’s book in future columns, but here is one of its arguments in advance: environmental alarmism underestimates the human capacity to adapt and to solve problems.
“The Netherlands, for example, became a wealthy nation even though a third of its land lies below sea level, including areas that are no less than seven meters below the sea,” he says.
The fight against global warming does not need activists obsessed with the apocalypse (who generally dismiss obvious solutions, such as nuclear power). It needs technology, innovators, people who can give humanity more comfort and safety while interfering with nature less and less.
Amanda Shendruk, Tim McDonnell, David Yanofsky, Michael J. Coren
Published August 10, 2021
The most important takeaways from the new Intergovernmental Panel on Climate Change report are easily summarized: Global warming is happening, it’s caused by human greenhouse gas emissions, and the impacts are very bad (in some cases, catastrophic). Every fraction of a degree of warming we can prevent by curbing emissions substantially reduces this damage. It’s a message that hasn’t changed much since the first IPCC report in 1990.
The final Aug. 9 report is nearly 4,000 pages long. While much of it is written in inscrutable scientific jargon, if you want to understand the scientific case for man-made global warming, look no further. We’ve reviewed the data, summarized the main points, and created an interactive graphic showing a “heat map” of scientists’ confidence in their conclusions. The terms describing statistical confidence range from very high confidence (a 9 out of 10 chance) to very low confidence (a 1 in 10 chance).
Here’s your guide to the IPCC’s latest assessment.
CH 1: Framing, context, methods
The first chapter comes out swinging with a bold political charge: It concludes with “high confidence” that the plans countries so far have put forward to reduce emissions are “insufficient” to keep warming well below 2°C, the goal enshrined in the 2015 Paris Agreement. While unsurprising on its own, it is surprising for a document that had to be signed off on by the same government representatives it condemns. It then lists advancements in climate science since the last IPCC report, as well as key evidence behind the conclusion that human-caused global warming is “unequivocal.”
👀Scientists’ ability to observe the physical climate system has continued to improve and expand.
📈Since the last IPCC report, new techniques have provided greater confidence in attributing changes in extreme events to human-caused climate change.
🔬The latest generation of climate models is better at representing natural processes, and higher-resolution models that better capture smaller-scale processes and extreme events have become available.
CH 2: Changing state of the climate system
Chapter 2 looks backward in time to compare the current rate of climate change to changes that happened in the past. That comparison clearly reveals human fingerprints on the climate system. The last time global temperatures were comparable to today’s was 125,000 years ago, the concentration of atmospheric carbon dioxide is higher than at any time in the last 2 million years, and greenhouse gas emissions are rising faster than at any time in the last 800,000 years.
🥵Observed changes in the atmosphere, oceans, cryosphere, and biosphere provide unequivocal evidence of a world that has warmed. Over the past several decades, key indicators of the climate system are increasingly at levels unseen in centuries to millennia, and are changing at rates unprecedented in at least the last 2,000 years.
🧊Annual mean Arctic sea ice coverage levels are the lowest since at least 1850. Late summer levels are the lowest in the past 1,000 years.
🌊Global mean sea level (GMSL) is rising, and the rate of GMSL rise since the 20th century is faster than over any preceding century in at least the last three millennia. Since 1901, GMSL has risen by 0.20 [0.15–0.25] meters, and the rate of rise is accelerating.
CH 3: Human influence on the climate system
Chapter 3 leads with the IPCC’s strongest-ever statement on the human impact on the climate: “It is unequivocal that human influence has warmed the global climate system since pre-industrial times” (the last IPCC report said human influence was “clear”). Specifically, the report blames humanity for nearly all of the 1.1°C increase in global temperatures observed since the Industrial Revolution (natural forces played a tiny role as well), and the loss of sea ice, rising temperatures, and acidity in the ocean.
🌍Human-induced greenhouse gas forcing is the main driver of the observed changes in hot and cold extremes.
🌡️The likely range of warming in global-mean surface air temperature (GSAT) in 2010–2019 relative to 1850–1900 is 0.9°C–1.2°C. Of that, 0.8°C–1.3°C is attributable to human activity, while natural forces contributed −0.1°C–0.1°C.
😬Combining the attributable contributions from melting ice and the expansion of warmer water, it is very likely that human influence was the main driver of the observed global mean sea level rise since at least 1970.
CH 4: Future global climate: Scenario-based projections and near-term information
Chapter 4 holds two of the report’s most important conclusions: Climate change is happening faster than previously understood, and the likelihood that the global temperature increase can stay within the Paris Agreement goal of 1.5°C is extremely slim. The 2013 IPCC report projected that temperatures could exceed 1.5°C in the 2040s; here, that timeline has been advanced by a decade to the “early 2030s” in the median scenario. And even in the lowest-emission scenario, it is “more likely than not” to occur by 2040.
🌡️By 2030, in all future warming scenarios, globally averaged surface air temperature in any individual year could exceed 1.5°C relative to 1850–1900.
🌊Under all scenarios, it is virtually certain that global mean sea level will continue to rise through the 21st century.
💨Even if enough carbon were removed from the atmosphere that global emissions become net negative, some climate change impacts, such as sea level rise, will not be reversed for at least several centuries.
CH 5: Global carbon and other biochemical cycles and feedbacks
Chapter 5 quantifies how much atmospheric CO2 and methane concentrations have increased since 1750 (47% and 156%, respectively) and addresses the ability of oceans and other natural systems to soak those emissions up. The more emissions increase, the less they can be offset by natural sinks—and in a high-emissions scenario, the loss of forests from wildfires becomes so severe that land-based ecosystems become a net source of emissions, rather than a sink (this is already happening to a degree in the Amazon).
🌲The CO2 emitted from human activities during the decade of 2010–2019 was distributed between three Earth systems: 46% accumulated in the atmosphere, 23% was taken up by the ocean, and 31% was stored by vegetation.
📉The fraction of emissions taken up by land and ocean is expected to decline as the CO2 concentration increases.
💨Global temperatures rise in a near-linear relationship to cumulative CO2 emissions. In other words, to halt global warming, net emissions must reach zero.
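That near-linear relationship can be sketched numerically. As a rough illustration (the two input figures are the AR6 central estimates, not computed here): the report gives a transient climate response to cumulative emissions (TCRE) of about 0.45°C per 1,000 GtCO2, and historical cumulative emissions of about 2,390 GtCO2 over 1850–2019.

```python
# TCRE sketch: warming scales near-linearly with cumulative CO2 emitted.
TCRE = 0.45                # °C per 1000 GtCO2 (AR6 central estimate)
cumulative_gtco2 = 2390    # GtCO2 emitted 1850-2019 (AR6 central estimate)

warming = TCRE * cumulative_gtco2 / 1000
print(f"Implied warming: {warming:.2f} °C")  # ≈ 1.08 °C, close to the observed ~1.1 °C
```

The linearity is also why “net zero” is the condition for halting warming: once cumulative emissions stop growing, so does the temperature.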
CH 6: Short-lived climate forcers
Chapter 6 is all about methane, particulate matter, aerosols, hydrofluorocarbons, and other non-CO2 gases that don’t linger very long in the atmosphere (just a few hours, in some cases) but exert a tremendous influence on the climate while they do. In some cases that influence is cooling, but their net impact has been to contribute to warming. Because they are short-lived, the future abundance and impact of these gases vary widely across the different socioeconomic pathways considered in the report. These gases also have a huge impact on the respiratory health of people around the world.
⛽The sectors most responsible for warming from short-lived climate forcers are those dominated by methane emissions: fossil fuel production and distribution, agriculture, and waste management.
🧊In the next two decades, it is very likely that emissions from short-lived climate forcers will cause a warming relative to 2019, in addition to the warming from long-lived greenhouse gases like CO2.
🌏Rapid decarbonization leads to air quality improvements, but on its own is not sufficient to achieve, in the near term, air quality guidelines set by the World Health Organization, especially in parts of Asia and in some other highly polluted regions.
CH 7: The Earth’s energy budget, climate feedbacks, and climate sensitivity
Climate sensitivity is a measure of how much the Earth responds to changes in greenhouse gas concentrations. For every doubling of atmospheric CO2, temperatures eventually go up by about 3°C, this chapter concludes. That’s about the same level scientists have estimated for several decades, but over time the range of uncertainty around that estimate has narrowed. The energy budget is a calculation of how much energy is flowing into the Earth system from the sun. Put together, these metrics paint a picture of the human contribution to observed warming.
🐻❄️The Arctic warms more quickly than the Antarctic due to differences in radiative feedbacks and ocean heat uptake between the poles.
🌊Because of existing greenhouse gas concentrations, energy will continue to accumulate in the Earth system until at least the end of the 21st century, even under strong emissions reduction scenarios.
☁️The net effect of changes in clouds in response to global warming is to amplify human-induced warming. Compared to the last IPCC report, major advances in the understanding of cloud processes have increased the level of confidence in the cloud feedback cycle.
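Because radiative forcing grows roughly logarithmically with CO2 concentration, the 3°C-per-doubling figure can be turned into a crude equilibrium estimate. This is a sketch under simplifying assumptions: it ignores non-CO2 forcings and the lag between transient and equilibrium response, which is why it exceeds the ~1.1°C observed so far.

```python
import math

# Equilibrium warming for a given CO2 increase, assuming ~3 °C per
# doubling and logarithmic forcing: dT = S * log2(C / C0).
S = 3.0                # °C per CO2 doubling (chapter 7's central estimate)
co2_ratio = 1.47       # CO2 is up ~47% since 1750 (chapter 5)

dT = S * math.log2(co2_ratio)
print(f"Committed equilibrium warming: {dT:.2f} °C")  # ≈ 1.67 °C
```

The gap between this equilibrium figure and today’s observed warming is one way to see the “warming in the pipeline” from emissions already made.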
CH 8: Water cycle changes
This chapter catalogs what happens to water in a warming world. Although instances of drought are expected to become more common and more severe, wet parts of the world will get wetter as the warmer atmosphere is able to carry more water. Total net precipitation will increase, yet the thirstier atmosphere will make dry places drier. And within any one location, the difference in precipitation between the driest and wettest month will likely increase. But rainstorms are complex phenomena and typically happen at a scale smaller than the resolution of most climate models, so specific local predictions about monsoon patterns remain an area of relatively high uncertainty.
🌎Increased evapotranspiration will decrease soil moisture over the Mediterranean, southwestern North America, south Africa, southwestern South America, and southwestern Australia.
🌧️Summer monsoon precipitation is projected to increase for the South, Southeast and East Asian monsoon domains, while North American monsoon precipitation is projected to decrease. West African monsoon precipitation is projected to increase over the Central Sahel and decrease over the far western Sahel.
🌲Large-scale deforestation has likely decreased evapotranspiration and precipitation and increased runoff over the deforested regions. Urbanization has increased local precipitation and runoff intensity.
CH 9: Ocean, cryosphere, and sea level change
Most of the heat trapped by greenhouse gases is ultimately absorbed by the oceans. Warmer water expands, contributing significantly to sea level rise, and the slow, deep circulation of ocean water is a key reason why global temperatures don’t turn on a dime in relation to atmospheric CO2. Marine animals are feeling this heat, as scientists have documented that the frequency of marine heatwaves has doubled since the 1980s. Meanwhile, glaciers, polar sea ice, the Greenland ice sheet, and global permafrost are all rapidly melting. Overall sea levels have risen about 20 centimeters since 1900, and the rate of sea level rise is increasing.
📈Global mean sea level rose faster in the 20th century than in any prior century over the last three millennia.
🌡️The heat content of the global ocean has increased since at least 1970 and will continue to increase over the 21st century. The associated warming will likely continue until at least 2300 even for low-emission scenarios because of the slow circulation of the deep ocean.
🧊The Arctic Ocean will likely become practically sea ice–free during the seasonal sea ice minimum for the first time before 2050 in all considered SSP scenarios.
CH 10: Linking global to regional climate change
Since 1950, scientists have clearly detected how greenhouse gas emissions from human activity are changing regional temperatures. Climate models can predict regional climate impacts. Where data are limited, statistical methods help identify local impacts (especially in challenging terrain such as mountains). Cities, in particular, will warm faster as a result of urbanization. Global warming extremes in urban areas will be even more pronounced, especially during heatwaves. Although global models largely agree, it is more difficult to consistently predict regional climate impacts across models.
⛰️Some local-scale phenomena such as sea breezes and mountain wind systems cannot be well represented by the resolution of most climate models.
🌆The difference in observed warming trends between cities and their surroundings can partly be attributed to urbanization. Future urbanization will amplify the projected air temperature change in cities regardless of the characteristics of the background climate.
😕Statistical methods are improving to downscale global climate models to more accurately depict local or regional projections.
CH 11: Weather and climate extreme events in a changing climate
Better data collection, modeling, and analysis mean scientists are more confident than ever in their understanding of the role of rising greenhouse gas concentrations in weather and climate extremes. It is virtually certain that humans are behind the observed temperature extremes.
Human activity is making extreme weather and temperatures more intense and frequent, especially rain, droughts, and tropical cyclones. While even 1.5°C of warming will make events more severe, the intensity of extreme events is expected to at least double with 2°C of global warming compared with today’s conditions, and to quadruple with 3°C of warming. As global warming accelerates, historically unprecedented climatic events are likely to occur.
🌡️It is an established fact that human-induced greenhouse gas emissions have led to an increased frequency and/or intensity of some weather and climate extremes since pre-industrial time, in particular for temperature extremes.
🌎Even relatively small incremental increases in global warming cause statistically significant changes in extremes.
🌪️The occurrence of extreme events is unprecedented in the observed record, and will increase with increasing global warming.
⛈️Relative to present-day conditions, changes in the intensity of extremes would be at least double at 2°C, and quadruple at 3°C of global warming.
CH 12: Climate change information for regional impact and for risk assessment
Climate models are getting better, more precise, and more accurate at predicting regional impacts. We know a lot more than we did in 2014, when AR5 was released. Our climate is already different from the climate of the early or mid-20th century, and we’re seeing big changes to mean temperatures, growing seasons, extreme heat, ocean acidification and deoxygenation, and Arctic sea ice loss. Expect more changes by mid-century: more rain in the northern hemisphere, less rain in a few regions (the Mediterranean and South Africa), as well as sea-level rise along all coasts. Overall, there is high confidence that mean and extreme temperatures will rise over land and sea. Major widespread damages are expected, but benefits are also possible in some places.
🌏Every region of the world will experience concurrent changes in multiple climate impact drivers by mid-century.
🌱Climate change is already resulting in significant societal and environmental impacts and will induce major socio-economic damages in the future. In some cases, climate change can also lead to beneficial conditions which can be taken into account in adaptation strategies.
🌨️The impacts of climate change depend not only on physical changes in the climate itself, but also on whether humans take steps to limit their exposure and vulnerability.
What we did:
The visualization of confidence is only for the executive summary at the beginning of each chapter. If a sentence had a confidence associated with it, the confidence text was removed and a color applied instead. If a sentence did not have an associated confidence, that doesn’t mean scientists do not feel confident about the content; they may be using likelihood (or certainty) language in that instance instead. We chose to only visualize confidence, as it is used more often in the report. Highlights were drawn from the text of the report but edited and in some cases rephrased for clarity.
Everywhere from business to medicine to the climate, forecasting the future is a complex and absolutely critical job. So how do you do it—and what comes next?
February 26, 2020
Professor of atmospheric science, University of California, Berkeley
Prediction for 2030: We’ll light up the world… safely
I’ve spoken to people who want climate model information, but they’re not really sure what they’re asking me for. So I say to them, “Suppose I tell you that some event will happen with a probability of 60% in 2030. Will that be good enough for you, or will you need 70%? Or would you need 90%? What level of information do you want out of climate model projections in order to be useful?”
I joined Jim Hansen’s group in 1979, and I was there for all the early climate projections. And the way we thought about it then, those things are all still totally there. What we’ve done since then is add richness and higher resolution, but the projections are really grounded in the same kind of data, physics, and observations.
Still, there are things we’re missing. We still don’t have a real theory of precipitation, for example. But there are two exciting things happening there. One is the availability of satellite observations: looking at the cloud is still not totally utilized. The other is that there used to be no way to get regional precipitation patterns through history—and now there is. Scientists found these caves in China and elsewhere, and they go in, look for a nice little chamber with stalagmites, and then they chop them up and send them back to the lab, where they do fantastic uranium-thorium dating and measure oxygen isotopes in calcium carbonate. From there they can interpret a record of historic rainfall. The data are incredible: we have got over half a million years of precipitation records all over Asia.
I don’t see us reducing fossil fuels by 2030. I don’t see us reducing CO2 or atmospheric methane. Some 1.2 billion people in the world right now have no access to electricity, so I’m looking forward to the growth in alternative energy going to parts of the world that have no electricity. That’s important because it’s education, health, everything associated with a Western standard of living. That’s where I’m putting my hopes.
Anne Lise Kjaer
Futurist, Kjaer Global, London
Prediction for 2030: Adults will learn to grasp new ideas
As a kid I wanted to become an archaeologist, and I did in a way. Archaeologists find artifacts from the past and try to connect the dots and tell a story about how the past might have been. We do the same thing as futurists; we use artifacts from the present and try to connect the dots into interesting narratives in the future.
When it comes to the future, you have two choices. You can sit back and think “It’s not happening to me” and build a great big wall to keep out all the bad news. Or you can build windmills and harness the winds of change.
A lot of companies come to us and think they want to hear about the future, but really it’s just an exercise for them—let’s just tick that box, do a report, and put it on our bookshelf.
So we have a little test for them. We do interviews, we ask them questions; then we use a model called a Trend Atlas that considers both the scientific dimensions of society and the social ones. We look at the trends in politics, economics, societal drivers, technology, environment, legislation—how does that fit with what we know currently? We look back maybe 10, 20 years: can we see a little bit of a trend and try to put that into the future?
What’s next? Obviously with technology we can educate much better than we could in the past. But it’s a huge opportunity to educate the parents of the next generation, not just the children. Kids are learning about sustainability goals, but what about the people who actually rule our world?
Coauthor of Superforecasting and professor, University of Pennsylvania
Prediction for 2030: We’ll get better at being uncertain
At the Good Judgment Project, we try to track the accuracy of commentators and experts in domains in which it’s usually thought impossible to track accuracy. You take a big debate and break it down into a series of testable short-term indicators. So you could take a debate over whether strong forms of artificial intelligence are going to cause major dislocations in white-collar labor markets by 2035, 2040, 2050. A lot of discussion already occurs at that level of abstraction—but from our point of view, it’s more useful to break it down and to say: If we were on a long-term trajectory toward an outcome like that, what sorts of things would we expect to observe in the short term? So we started this off in 2015, and in 2016 AlphaGo defeated people in Go. But then other things didn’t happen: driverless Ubers weren’t picking people up for fares in any major American city at the end of 2017. Watson didn’t defeat the world’s best oncologists in a medical diagnosis tournament. So I don’t think we’re on a fast track toward the singularity, put it that way.
Forecasts have the potential to be either self-fulfilling or self-negating—Y2K was arguably a self-negating forecast. But it’s possible to build that into a forecasting tournament by asking conditional forecasting questions: i.e., How likely is X conditional on our doing this or doing that?
What I’ve seen over the last 10 years, and it’s a trend that I expect will continue, is an increasing openness to the quantification of uncertainty. I think there’s a grudging, halting, but cumulative movement toward thinking about uncertainty, and more granular and nuanced ways that permit keeping score.
Associate professor of economics, UCLA
Prediction for 2030: We’ll be more—and less—private
When I worked on Uber’s surge pricing algorithm, the problem it was built to solve was very coarse: we were trying to convince drivers to put in extra time when they were most needed. There were predictable times—like New Year’s—when we knew we were going to need a lot of people. The deeper problem was that this was a system with basically no control. It’s like trying to predict the weather. Yes, the amount of weather data that we collect today—temperature, wind speed, barometric pressure, humidity data—is 10,000 times greater than what we were collecting 20 years ago. But we still can’t predict the weather 10,000 times further out than we could back then. And social movements—even in a very specific setting, such as where riders want to go at any given point in time—are, if anything, even more chaotic than weather systems.
These days what I’m doing is a little bit more like forensic economics. We look to see what we can find and predict from people’s movement patterns. We’re just using simple cell-phone data like geolocation, but even just from movement patterns, we can infer salient information and build a psychological dimension of you. What terrifies me is I feel like I have much worse data than Facebook does. So what are they able to understand with their much better information?
I think the next big social tipping point is people actually starting to really care about their privacy. It’ll be like smoking in a restaurant: it will quickly go from causing outrage when people want to stop it to suddenly causing outrage if somebody does it. But at the same time, by 2030 almost every Chinese citizen will be completely genotyped. I don’t quite know how to reconcile the two.
Science fiction and nonfiction author, San Francisco
Prediction for 2030: We’re going to see a lot more humble technology
Every era has its own ideas about the future. Go back to the 1950s and you’ll see that people fantasized about flying cars. Now we imagine bicycles and green cities where cars are limited, or where cars are autonomous. We have really different priorities now, so that works its way into our understanding of the future.
Science fiction writers can’t actually make predictions. I think of science fiction as engaging with questions being raised in the present. But what we can do, even if we can’t say what’s definitely going to happen, is offer a range of scenarios informed by history.
There are a lot of myths about the future that people believe are going to come true right now. I think a lot of people—not just science fiction writers but people who are working on machine learning—believe that relatively soon we’re going to have a human-equivalent brain running on some kind of computing substrate. This is as much a reflection of our time as it is what might actually happen.
It seems unlikely that a human-equivalent brain in a computer is right around the corner. But we live in an era where a lot of us feel like we live inside computers already, for work and everything else. So of course we have fantasies about digitizing our brains and putting our consciousness inside a machine or a robot.
I’m not saying that those things could never happen. But they seem much more closely allied to our fantasies in the present than they do to a real technical breakthrough on the horizon.
We’re going to have to develop much better technologies around disaster relief and emergency response, because we’ll be seeing a lot more floods, fires, storms. So I think there is going to be a lot more work on really humble technologies that allow you to take your community off the grid, or purify your own water. And I don’t mean in a creepy survivalist way; I mean just in a this-is-how-we-are-living-now kind of way.
Associate professor of computer science, Harvard
Prediction for 2030: Humans and machines will make decisions together
In my lab, we’re trying to answer questions like “How might this patient respond to this antidepressant?” or “How might this patient respond to this vasopressor?” So we get as much data as we can from the hospital. For a psychiatric patient, we might have everything about their heart disease, kidney disease, cancer; for a blood pressure management recommendation for the ICU, we have all their oxygen information, their lactate, and more.
Some of it might be relevant to making predictions about their illnesses, some not, and we don’t know which is which. That’s why we ask for the large data set with everything.
There’s been about a decade of work trying to get unsupervised machine-learning models to do a better job at making these predictions, and none worked really well. The breakthrough for us was when we found that all the previous approaches for doing this were wrong in the exact same way. Once we untangled all of this, we came up with a different method.
We also realized that even if our ability to predict what drug is going to work is not always that great, we can more reliably predict what drugs are not going to work, which is almost as valuable.
I’m excited about combining humans and AI to make predictions. Let’s say your AI is right only 70% of the time and your human is also only right 70% of the time. Combining the two is difficult, but if you can fuse their successes, then you should be able to do better than either system alone. How to do that is a really tough, exciting question.
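The arithmetic behind fusing two 70%-accurate decision-makers can be sketched with a small simulation. This is only an illustration under an optimistic assumption (that the AI's and the human's errors are independent), not the lab's actual method:

```python
import random

random.seed(0)

def simulate(n=100_000, p_ai=0.7, p_human=0.7):
    """Binary decisions; each agent is independently right with prob p."""
    agree_right = agree_total = 0
    for _ in range(n):
        truth = random.random() < 0.5
        ai = truth if random.random() < p_ai else (not truth)
        human = truth if random.random() < p_human else (not truth)
        if ai == human:
            agree_total += 1
            agree_right += (ai == truth)
    return agree_right / agree_total, agree_total / n

acc_when_agree, coverage = simulate()
# Analytically: the two agree with prob 0.7*0.7 + 0.3*0.3 = 0.58, and
# agreement is correct with prob 0.49/0.58 ~ 0.84 -- better than either
# 70% agent alone, at the cost of only covering 58% of cases.
print(acc_when_agree, coverage)
```

The uncovered 42% of cases, where the two disagree, is exactly where the hard research question lives.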
All these predictive models were built and deployed and people didn’t think enough about potential biases. I’m hopeful that we’re going to have a future where these human-machine teams are making decisions that are better than either alone.
Abdoulaye Banire Diallo
Professor, director of the bioinformatics lab, University of Quebec at Montreal
Prediction for 2030: Machine-based forecasting will be regulated
When a farmer in Quebec decides whether to inseminate a cow or not, it might depend on the expectation of milk that will be produced every day for one year, two years, maybe three years after that. Farms have management systems that capture the data and the environment of the farm. I’m involved in projects that add a layer of genetic and genomic data to help forecasting—to help decision makers like the farmer to have a full picture when they’re thinking about replacing cows, improving management, resilience, and animal welfare.
With the emergence of machine learning and AI, what we’re showing is that we can help tackle problems in a way that hasn’t been done before. We are adapting it to the dairy sector, where we’ve shown that some decisions can be anticipated 18 months in advance just by forecasting based on the integration of this genomic data. I think in some areas such as plant health we have only achieved 10% or 20% of our capacity to improve certain models.
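As a hypothetical illustration of the kind of gain Diallo describes (invented data and features, not the Quebec project's model), one can compare a simple least-squares forecaster with and without a genomic layer:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical farm data: management features alone vs. management plus
# genomic features, for forecasting a per-cow milk-yield index.
n = 500
management = rng.normal(size=(n, 3))  # e.g. feed, herd size, barn temp
genomic = rng.normal(size=(n, 2))     # e.g. two breeding-value scores
noise = rng.normal(scale=0.5, size=n)
# Simulated "truth": yield depends on both kinds of features.
yield_ = management @ [1.0, 0.5, -0.3] + genomic @ [0.8, 0.6] + noise

def r2(X, y):
    X1 = np.column_stack([X, np.ones(len(X))])     # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # least-squares fit
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_mgmt = r2(management, yield_)
r2_full = r2(np.column_stack([management, genomic]), yield_)
print(r2_mgmt, r2_full)  # adding the genomic layer improves the fit
```

In this toy setup the genomic columns carry real signal, so the fuller model explains substantially more variance; the real systems Diallo describes face the much harder problem of whether such signal exists 18 months out.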
Until now AI and machine learning have been associated with domain expertise. It’s not a public-wide thing. But less than 10 years from now they will need to be regulated. I think there are a lot of challenges for scientists like me to try to make those techniques more explainable, more transparent, and more auditable.
Our model reveals the true course of the pandemic. Here is what to do next
May 15th 2021
THIS WEEK we publish our estimate of the true death toll from covid-19. It tells the real story of the pandemic. But it also contains an urgent warning. Unless vaccine supplies reach poorer countries, the tragic scenes now unfolding in India risk being repeated elsewhere. Millions more will die.
Using known data on 121 variables, from recorded deaths to demography, we have built a pattern of correlations that lets us fill in gaps where numbers are lacking. Our model suggests that covid-19 has already claimed 7.1m-12.7m lives. Our central estimate is that 10m people have died who would otherwise be living. This tally of “excess deaths” is over three times the official count, which nevertheless is the basis for most statistics on the disease, including fatality rates and cross-country comparisons.
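The "excess deaths" arithmetic itself is simple, even if estimating the baseline is not. A sketch with hypothetical numbers (not The Economist's data):

```python
# Excess deaths = observed all-cause deaths minus the baseline expected
# from pre-pandemic years. All figures below are invented for illustration.
baseline_2015_2019 = [51_200, 50_800, 52_100, 51_500, 52_400]
expected = sum(baseline_2015_2019) / len(baseline_2015_2019)  # 51,600

observed = 68_000        # hypothetical pandemic-period all-cause total
official_covid = 4_900   # hypothetical official covid-19 death count

excess = observed - expected
print(excess)                   # deaths above the pre-pandemic baseline
print(excess / official_covid)  # multiple of the official count
```

The model's contribution is filling in `observed` and `expected` where countries do not report them, by exploiting correlations with the 121 known variables.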
The most important insight from our work is that covid-19 has been harder on the poor than anyone knew. Official figures suggest that the pandemic has struck in waves, and that the United States and Europe have been hit hard. Although South America has been ravaged, the rest of the developing world seemed to get off lightly.
Our modelling tells another story. When you count all the bodies, you see that the pandemic has spread remorselessly from the rich, connected world to poorer, more isolated places. As it has done so, the global daily death rate has climbed steeply.
Death rates have been very high in some rich countries, but the overwhelming majority of the 6.7m or so deaths that nobody counted were in poor and middle-income ones. In Romania and Iran excess deaths are more than double the number officially put down to covid-19. In Egypt they are 13 times as big. In America the difference is 7.1%.
India, where about 20,000 are dying every day, is not an outlier. Our figures suggest that, in terms of deaths as a share of population, Peru’s pandemic has been 2.5 times worse than India’s. The disease is working its way through Nepal and Pakistan. Infectious variants spread faster and, because of the tyranny of exponential growth, overwhelm health-care systems and fill mortuaries even if the virus is no more lethal.
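The "tyranny of exponential growth" is easy to make concrete. With illustrative numbers (a hypothetical caseload doubling weekly against a fixed hospital ceiling):

```python
# With a fixed doubling time, a small caseload overwhelms a fixed
# capacity in a handful of doubling periods. Numbers are illustrative.
cases = 1_000
capacity = 50_000   # hypothetical hospital-relevant caseload ceiling
doubling_days = 7
days = 0
while cases <= capacity:
    cases *= 2
    days += doubling_days
print(days)  # -> 42: six weekly doublings take 1,000 cases past 50,000
```

A faster-spreading variant shortens `doubling_days`, which is why it can fill mortuaries even if it is no more lethal per case.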
Ultimately the way to stop this is vaccination. As an example of collaboration and pioneering science, covid-19 vaccines rank with the Apollo space programme. Within just a year of the virus being discovered, people could be protected from severe disease and death. Hundreds of millions of them have benefited.
However, in the short run vaccines will fuel the divide between rich and poor. Soon, the only people to die from covid-19 in rich countries will be exceptionally frail or exceptionally unlucky, as well as those who have spurned the chance to be vaccinated. In poorer countries, by contrast, most people will have no choice. They will remain unprotected for many months or years.
The world cannot rest while people perish for want of a jab costing as little as $4 for a two-dose course. It is hard to think of a better use of resources than vaccination. Economists’ central estimate for the direct value of a course is $2,900—if you include factors like long covid and the effect of impaired education, the total is much bigger. The benefit from an extra 1bn doses supplied by July would be worth hundreds of billions of dollars. Less circulating virus means less mutation, and so a lower chance of a new variant that reinfects the vaccinated.
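A back-of-the-envelope check of those figures (the $4 and $2,900 are from the leader; the one-fifth discount below is an assumption added for illustration, since the marginal course is worth less than the central estimate):

```python
cost_per_course = 4        # two-dose course, dollars (from the article)
value_per_course = 2_900   # economists' central estimate (from the article)

ratio = value_per_course / cost_per_course
print(ratio)  # -> 725.0: each dollar spent returns hundreds in value

# Even valuing each marginal course at a fifth of the central estimate,
# 1bn extra doses (500m two-dose courses) would be worth:
conservative_value = (1_000_000_000 // 2) * (value_per_course / 5)
print(conservative_value)  # hundreds of billions of dollars
```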
Supplies of vaccines are already growing. By the end of April, according to Airfinity, an analytics firm, vaccine-makers produced 1.7bn doses, 700m more than the end of March and ten times more than January. Before the pandemic, annual global vaccine capacity was roughly 3.5bn doses. The latest estimates are that total output in 2021 will be almost 11bn. Some in the industry predict a global surplus in 2022.
And yet the world is right to strive to get more doses in more arms sooner. Hence President Joe Biden has proposed waiving intellectual-property claims on covid-19 vaccines. Many experts argue that, because some manufacturing capacity is going begging, millions more doses might become available if patent-owners shared their secrets, including in countries that today are at the back of the queue. World-trade rules allow for a waiver. When should one be invoked, if not in the throes of a pandemic?
We believe that Mr Biden is wrong. A waiver may signal that his administration cares about the world, but it is at best an empty gesture and at worst a cynical one.
A waiver will do nothing to fill the urgent shortfall of doses in 2021. The head of the World Trade Organisation, the forum where it will be thrashed out, warns there may be no vote until December. Technology transfer would take six months or so to complete even if it started today. With the new mRNA vaccines made by Pfizer and Moderna, it may take longer. Supposing the tech transfer was faster than that, experienced vaccine-makers would be unavailable for hire and makers could not obtain inputs from suppliers whose order books are already bursting. Pfizer’s vaccine requires 280 inputs from suppliers in 19 countries. No firm can recreate that in a hurry.
In any case, vaccine-makers do not appear to be hoarding their technology—otherwise output would not be increasing so fast. They have struck 214 technology-transfer agreements, an unprecedented number. They are not price-gouging: money is not the constraint on vaccination. Poor countries are not being priced out of the market: their vaccines are coming through COVAX, a global distribution scheme funded by donors.
In the longer term, the effect of a waiver is unpredictable. Perhaps it will indeed lead to technology being transferred to poor countries; more likely, though, it will cause harm by disrupting supply chains, wasting resources and, ultimately, deterring innovation. Whatever the case, if vaccines are nearing a surplus in 2022, the cavalry will arrive too late.
A needle in time
If Mr Biden really wants to make a difference, he can donate vaccine right now through COVAX. Rich countries over-ordered because they did not know which vaccines would work. Britain has ordered more than nine doses for each adult, Canada more than 13. These will be urgently needed elsewhere. It is wrong to put teenagers, who have a minuscule risk of dying from covid-19, before the elderly and health-care workers in poor countries. The rich world should not stockpile boosters to cover the population many times over on the off-chance that they may be needed. In the next six months, this could yield billions of doses of vaccine.
Countries can also improve supply chains. The Serum Institute, an Indian vaccine-maker, has struggled to get parts such as filters from America because exports were gummed up by the Defence Production Act (DPA), which puts suppliers on a war-footing. Mr Biden authorised a one-off release, but he should be focusing the DPA on supplying the world instead. And better use needs to be made of finished vaccine. In some poor countries, vaccine languishes unused because of hesitancy and chaotic organisation. It makes sense to prioritise getting one shot into every vulnerable arm, before setting about the second.
Our model is not predictive. However it does suggest that some parts of the world are particularly vulnerable—one example is South-East Asia, home to over 650m people, which has so far been spared mass fatalities for no obvious reason. Covid-19 has not yet run its course. But vaccines have created the chance to save millions of lives. The world must not squander it. ■
This week, the C.D.C. acknowledged what scientists have been saying for months: The risk of catching the coronavirus from surfaces is low.
April 8, 2021
When the coronavirus began to spread in the United States last spring, many experts warned of the danger posed by surfaces. Researchers reported that the virus could survive for days on plastic or stainless steel, and the Centers for Disease Control and Prevention advised that if someone touched one of these contaminated surfaces — and then touched their eyes, nose or mouth — they could become infected.
Americans responded in kind, wiping down groceries, quarantining mail and clearing drugstore shelves of Clorox wipes. Facebook closed two of its offices for a “deep cleaning.” New York’s Metropolitan Transportation Authority began disinfecting subway cars every night.
“People can be infected with the virus that causes Covid-19 through contact with contaminated surfaces and objects,” Dr. Rochelle Walensky, the director of the C.D.C., said at a White House briefing on Monday. “However, evidence has demonstrated that the risk by this route of transmission is actually low.”
The admission is long overdue, scientists say.
“Finally,” said Linsey Marr, an expert on airborne viruses at Virginia Tech. “We’ve known this for a long time and yet people are still focusing so much on surface cleaning.” She added, “There’s really no evidence that anyone has ever gotten Covid-19 by touching a contaminated surface.”
During the early days of the pandemic, many experts believed that the virus spread primarily through large respiratory droplets. These droplets are too heavy to travel long distances through the air but can fall onto objects and surfaces.
In this context, a focus on scrubbing down every surface seemed to make sense. “Surface cleaning is more familiar,” Dr. Marr said. “We know how to do it. You can see people doing it, you see the clean surface. And so I think it makes people feel safer.”
But over the last year, it has become increasingly clear that the virus spreads primarily through the air — in both large and small droplets, which can remain aloft longer — and that scouring door handles and subway seats does little to keep people safe.
“The scientific basis for all this concern about surfaces is very slim — slim to none,” said Emanuel Goldman, a microbiologist at Rutgers University, who wrote last summer that the risk of surface transmission had been overblown. “This is a virus you get by breathing. It’s not a virus you get by touching.”
The C.D.C. has previously acknowledged that surfaces are not the primary way that the virus spreads. But the agency’s statements this week went further.
“The most important part of this update is that they’re clearly communicating to the public the correct, low risk from surfaces, which is not a message that has been clearly communicated for the past year,” said Joseph Allen, a building safety expert at the Harvard T.H. Chan School of Public Health.
Catching the virus from surfaces remains theoretically possible, he noted. But it requires many things to go wrong: a lot of fresh, infectious viral particles to be deposited on a surface, and then for a relatively large quantity of them to be quickly transferred to someone’s hand and then to their face. “Presence on a surface does not equal risk,” Dr. Allen said.
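Dr. Allen's point is essentially multiplicative: each link in the chain is a probability, and the product shrinks fast. With purely hypothetical numbers chosen only to show the structure:

```python
# Infection via a surface requires a chain of events, and (roughly
# independent) probabilities multiply. All values here are invented.
p_surface_contaminated = 0.01   # fresh infectious virus on this surface
p_transfer_to_hand = 0.1        # enough virus picked up by touching
p_hand_to_face = 0.3            # touched eyes/nose/mouth before washing
p_dose_sufficient = 0.1         # transferred dose enough to infect

p_infection = (p_surface_contaminated * p_transfer_to_hand
               * p_hand_to_face * p_dose_sufficient)
print(p_infection)  # each weak link shrinks the product further
```

However the individual links are estimated, one small factor anywhere in the chain keeps the overall risk low, which is why presence on a surface does not equal risk.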
In most cases, cleaning with simple soap and water — in addition to hand-washing and mask-wearing — is enough to keep the odds of surface transmission low, the C.D.C.’s updated cleaning guidelines say. In most everyday scenarios and environments, people do not need to use chemical disinfectants, the agency notes.
“What this does very usefully, I think, is tell us what we don’t need to do,” said Donald Milton, an aerosol scientist at the University of Maryland. “Doing a lot of spraying and misting of chemicals isn’t helpful.”
Still, the guidelines do suggest that if someone who has Covid-19 has been in a particular space within the last day, the area should be both cleaned and disinfected.
“Disinfection is only recommended in indoor settings — schools and homes — where there has been a suspected or confirmed case of Covid-19 within the last 24 hours,” Dr. Walensky said during the White House briefing. “Also, in most cases, fogging, fumigation and wide-area or electrostatic spraying is not recommended as a primary method of disinfection and has several safety risks to consider.”
And the new cleaning guidelines do not apply to health care facilities, which may require more intensive cleaning and disinfection.
Saskia Popescu, an infectious disease epidemiologist at George Mason University, said that she was happy to see the new guidance, which “reflects our evolving data on transmission throughout the pandemic.”
But she noted that it remained important to continue doing some regular cleaning — and maintaining good hand-washing practices — to reduce the risk of contracting not just the coronavirus but any other pathogens that might be lingering on a particular surface.
Dr. Allen said that the school and business officials he has spoken with this week expressed relief over the updated guidelines, which will allow them to pull back on some of their intensive cleaning regimens. “This frees up a lot of organizations to spend that money better,” he said.
Schools, businesses and other institutions that want to keep people safe should shift their attention from surfaces to air quality, he said, and invest in improved ventilation and filtration.
“This should be the end of deep cleaning,” Dr. Allen said, noting that the misplaced focus on surfaces has had real costs. “It has led to closed playgrounds, it has led to taking nets off basketball courts, it has led to quarantining books in the library. It has led to entire missed school days for deep cleaning. It has led to not being able to share a pencil. So that’s all that hygiene theater, and it’s a direct result of not properly classifying surface transmission as low risk.”
The U.S. National Academy of Sciences has published a new report (“Reflecting Sunlight“) on the topic of Geoengineering (that is, the deliberate manipulation of the global Earth environment in an effort to offset the effects of human carbon pollution-caused climate change). While I am, in full disclosure, a member of the Academy, I offer the following comments in an entirely independent capacity:
Let me start by congratulating the authors on their comprehensive assessment of the science. It is solid, as we would expect, since the author team and reviewers have the expertise to cover it well. The science underlying geoengineering is the true remit of the study. Chris Field, the lead author, is a duly qualified person to lead the effort, and did a good job making sure that the intricacies of the science are covered, including the substantial uncertainties and caveats when it comes to the potential environmental impacts of some of the riskier geoengineering strategies (i.e. stratospheric sulphate aerosol injection to block out sunlight).
I like the fact that there is a discussion of the importance of labels and terminology and how this can impact public perception. For example, the oft-used term “solar radiation management” is not favored by the report authors, as it can be misleading (we don’t have our hand on a dial that controls solar output). On the other hand, I think that the term they do choose to use, “solar geoengineering”, is still potentially problematic, because it still implies we’re directly modifying solar output—but that’s not the case. We’re talking about messing with Earth’s atmospheric chemistry; we’re not dialing down the sun, even though many of the modeling experiments assume that’s what we’re doing. It’s a bit of a bait and switch. Even the title of the report, “Reflecting Sunlight”, falls victim to this biased framing.
“They don’t actually put aerosols in the atmosphere. They turn down the Sun to mimic geoengineering. You might think that is relatively unimportant . . . [but] controlling the Sun is effectively a perfect knob. We know almost precisely how a reduction in solar flux will project onto the energy balance of a planet. Aerosol-climate interactions are much more complex.”
I have a deeper and more substantive concern though, and it really is about the entire framing of the report. A report like this is as much about the policy message it conveys as it is about the scientific assessment, for it will be used immediately by policy advocates. And here I’m honestly troubled at the fodder it provides for mis-framing of the risks.
I recognize that the authors are dealing with a contentious and still much-debated topic, and it’s a challenge to represent the full range of views within the community, but the opening of the report itself, in my view, really puts a thumb on the scales. It falls victim to the moral hazard that I warn about in “The New Climate War” when it states, as justification for potentially considering implementing these geoengineering schemes:
But despite overwhelming evidence that the climate crisis is real and pressing, emissions of greenhouse gases continue to increase, with global emissions of fossil carbon dioxide rising 10.8 percent from 2010 through 2019. The total for 2020 is on track to decrease in response to decreased economic activity related to the COVID-19 pandemic. The pandemic is thus providing frustrating confirmation of the fact that the world has made little progress in separating economic activity from carbon dioxide emissions.
First of all, the discussion of carbon emissions reductions there is misleading. Emissions flattened in the years before the pandemic, and the International Energy Agency (IEA) specifically attributed that flattening to a decrease in carbon emissions globally in the power generation sector. These reductions continue on, and contributed at least partly to the 7% decrease in global emissions last year. We will certainly need policy interventions favoring further decarbonization to maintain that level of decrease year after year, but if we can do that, we remain on a path to limiting warming below dangerous levels (decent chance less than 1.5C and very good chance less than 2C) without resorting to very risky geoengineering schemes. It is a matter of political willpower, not technology: we now have the technology necessary to decarbonize our economy.
The authors are basically arguing that because carbon reductions haven’t been great enough (thanks to successful opposition by polluters and their advocates) we should consider geoengineering. That framing (unintentionally, I realize) provides precisely the crutch that polluters are looking for.
As I explain in the book:
A fundamental problem with geoengineering is that it presents what is known as a moral hazard, namely, a scenario in which one party (e.g., the fossil fuel industry) promotes actions that are risky for another party (e.g., the rest of us), but seemingly advantageous to itself. Geoengineering provides a potential crutch for beneficiaries of our continued dependence on fossil fuels. Why threaten our economy with draconian regulations on carbon when we have a cheap alternative? The two main problems with that argument are that (1) climate change poses a far greater threat to our economy than decarbonization, and (2) geoengineering is hardly cheap—it comes with great potential harm.
So, in short, this report is somewhat of a mixed bag. The scientific assessment and discussion is solid, and there is a discussion of uncertainties and caveats in the detailed report. But the spin in the opening falls victim to moral hazard and will provide fodder for geoengineering advocates to use in leveraging policy decision-making.
Last Updated: March 10, 2021 at 5:59 p.m. ET First Published: March 10, 2021 at 8:28 a.m. ET
By Vincent H. Smith and Eric J. Belasco
Congress has reduced risk by underwriting crop prices and cash revenues
Bill Gates is now the largest owner of farmland in the U.S., having made substantial investments in at least 19 states throughout the country. He has apparently followed the advice of another wealthy investor, Warren Buffett, who in a February 24, 2014 letter to investors described farmland as an investment that has “no downside and potentially substantial upside.”
There is a simple explanation for this affection for agricultural assets. Since the early 1980s, Congress has consistently succumbed to pressures from farm interest groups to remove as much risk as possible from agricultural enterprises by using taxpayer funds to underwrite crop prices and cash revenues.
Over the years, three trends in farm subsidy programs have emerged.
The first and most visible is the expansion of the federally supported crop insurance program, which has grown from less than $200 million in 1981 to over $8 billion in 2021. In 1980, only a few crops were covered and the government’s goal was just to pay for administrative costs. Today taxpayers pay over two-thirds of the total cost of the insurance programs that protect farmers against drops in prices and yields for hundreds of commodities ranging from organic oranges to GMO soybeans.
The second trend is the continuation of longstanding programs to protect farmers against relatively low revenues because of price declines and lower-than-average crop yields. The subsidies, which on average cost taxpayers over $5 billion a year, are targeted to major Corn Belt crops such as soybeans and wheat. Also included are other commodities such as peanuts, cotton and rice, which are grown in congressionally powerful districts in Georgia, the Carolinas, Texas, Arkansas, Mississippi and California.
The third, more recent trend is a return over the past four years to a 1970s practice: annual ad hoc “one off” programs justified by political expediency with support from the White House and Congress. These expenditures were $5.1 billion in 2018, $14.7 billion in 2019, and over $32 billion in 2020, of which $29 billion came from COVID relief funds authorized in the CARES Act. An additional $13 billion for farm subsidies was later included in the December 2020 stimulus bill.
If you are wondering why so many different subsidy programs are used to compensate farmers multiple times for the same price drops and other revenue losses, you are not alone. Our research indicates that many owners of large farms collect taxpayer dollars from all three sources. For many of the farms ranked in the top 10% in terms of sales, recent annual payments exceeded a quarter of a million dollars.
Farms with average or modest sales received much less. Their subsidies ranged from close to zero for small farms to a few thousand dollars for average-sized operations.
So what does all this have to do with Bill Gates, Warren Buffett and their love of farmland as an investment? In a financial environment in which real interest rates have been near zero or negative for almost two decades, the annual average inflation-adjusted (real) rate of return in agriculture (over 80% of which consists of land) has been about 5% for the past 30 years, despite some ups and downs, as this chart shows. It is a very solid investment for an owner who can hold on to farmland for the long term.
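That 5% real return compounds substantially over a long holding period (the rate is the article's figure; the arithmetic is standard compounding):

```python
# Compound growth at a ~5% real (inflation-adjusted) annual return,
# held for 30 years, as the article describes for farmland.
real_rate = 0.05
years = 30
multiple = (1 + real_rate) ** years
print(round(multiple, 2))  # -> 4.32: real value roughly quadruples
```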
The overwhelming majority of farm owners can manage that because they have substantial amounts of equity (the sector-wide debt-to-equity ratio has been less than 14% for many years) and receive significant revenue from other sources.
Thus for almost all farm owners, and especially the largest 10% whose net equity averages over $6 million, as Buffett observed, there is little or no risk and lots of potential gain in owning and investing in agricultural land.
Returns from agricultural land stem from two sources: asset appreciation — increases in land prices, which account for the majority of the gains — and net cash income from operating the land. As is well known, farmland prices are closely tied to expected future revenue. And these include generous subsidies, which have averaged 17% of annual net cash incomes over the past 50 years. In addition, Congress often provides substantial additional one-off payments in years when net cash income is likely to be lower than average, as in 2000 and 2001 when grain prices were relatively low and in 2019 and 2020.
It is possible for small-scale investors to buy shares in real-estate investment trusts (REITs) that own and manage agricultural land. However, as with all such investments, how a REIT is managed can be a substantive source of risk unrelated to the underlying value of the land assets, not all of which may be farm land.
Thanks to Congress and the average less affluent American taxpayer, farmers and other agricultural landowners get a steady and substantial return on their investments through subsidies that consistently guarantee and increase those revenues.
While many agricultural support programs are meant to “save the family farm,” the largest beneficiaries of agricultural subsidies are the richest landowners with the largest farms who, like Bill Gates and Warren Buffett, are scarcely in any need of taxpayer handouts.
We’re one step closer to officially moving up hurricane season. The National Hurricane Center announced Tuesday that it would formally start issuing its hurricane season tropical weather outlooks on May 15 this year, bumping it up from the traditional start of hurricane season on June 1. The move comes after a recent spate of early season storms have raked the Atlantic.
Atlantic hurricane season runs from June 1 to November 30. That’s when conditions are most conducive to storm formation owing to warm air and water temperatures. (The Pacific ocean has its own hurricane season, which covers the same timeframe, but since waters are colder fewer hurricanes tend to form there than in the Atlantic.)
Storms have begun forming in the Atlantic earlier as ocean and air temperatures have increased due to climate change. Last year, Hurricane Arthur roared to life off the East Coast on May 16. That storm made 2020 the sixth hurricane season in a row to have a storm that formed earlier than the June 1 official start date. While the National Oceanic and Atmospheric Administration won’t be moving up the start of the season just yet, the earlier outlooks address the recent history.
“In the last decade, there have been 10 storms formed in the weeks before the traditional start of the season, which is a big jump,” said Sean Sublette, a meteorologist at Climate Central, who pointed out that the 1960s through 2010s saw between one and three storms each decade before the June 1 start date on average.
It might be tempting to ascribe this earlier season entirely to climate change warming the Atlantic. But technology also has a role to play, with more observations along the coast as well as satellites that can spot storms far out to sea.
“I would caution that we can’t just go, ‘hah, the planet’s warming, we’ve had to move the entire season!’” Sublette said. “I don’t think there’s solid ground for attribution of how much of one there is over the other. Weather folks can sit around and debate that for a while.”
Earlier storms don’t necessarily mean more harmful ones, either. In fact, hurricanes earlier in the season tend to be weaker than the monsters that form in August and September when hurricane season is at its peak. But regardless of their strength, these earlier storms have generated discussion inside the NHC on whether to move up the official start date for the season, when the agency usually puts out two reports per day on hurricane activity. Tuesday’s step is not an official announcement of this decision, but an acknowledgement of the increased attention on early hurricanes.
“I would say that [Tuesday’s announcement] is the National Hurricane Center being proactive,” Sublette said. “Like hey, we know that the last few years it’s been a little busier in May than we’ve seen in the past five decades, and we know there is an awareness now, so we’re going to start issuing these reports early.”
While the jury is still out on whether climate change is pushing the season earlier, research has shown that the strongest hurricanes are becoming more common, and that climate change is likely playing a role. A study published last year found the odds of a storm becoming a major hurricane—Category 3 or stronger—have increased 49% in the basin since satellite monitoring began in earnest four decades ago. And when storms make landfall, sea level rise allows them to do more damage. So regardless of whether climate change is pushing Atlantic hurricane season earlier or not, the risks are increasing. Now, at least, we’ll have better warnings before early storms do hit.
When the polio vaccine was declared safe and effective, the news was met with jubilant celebration. Church bells rang across the nation, and factories blew their whistles. “Polio routed!” newspaper headlines exclaimed. “An historic victory,” “monumental,” “sensational,” newscasters declared. People erupted with joy across the United States. Some danced in the streets; others wept. Kids were sent home from school to celebrate.
One might have expected the initial approval of the coronavirus vaccines to spark similar jubilation—especially after a brutal pandemic year. But that didn’t happen. Instead, the steady drumbeat of good news about the vaccines has been met with a chorus of relentless pessimism.
The problem is not that the good news isn’t being reported, or that we should throw caution to the wind just yet. It’s that neither the reporting nor the public-health messaging has reflected the truly amazing reality of these vaccines. There is nothing wrong with realism and caution, but effective communication requires a sense of proportion—distinguishing between due alarm and alarmism; warranted, measured caution and doombait; worst-case scenarios and claims of impending catastrophe. We need to be able to celebrate profoundly positive news while noting the work that still lies ahead. However, instead of balanced optimism since the launch of the vaccines, the public has been offered a lot of misguided fretting over new virus variants, subjected to misleading debates about the inferiority of certain vaccines, and presented with long lists of things vaccinated people still cannot do, while media outlets wonder whether the pandemic will ever end.
This pessimism is sapping people of energy to get through the winter, and the rest of this pandemic. Anti-vaccination groups and those opposing the current public-health measures have been vigorously amplifying the pessimistic messages—especially the idea that getting vaccinated doesn’t mean being able to do more—telling their audiences that there is no point in compliance, or in eventual vaccination, because it will not lead to any positive changes. They are using the moment and the messaging to deepen mistrust of public-health authorities, accusing them of moving the goalposts and implying that we’re being conned. Either the vaccines aren’t as good as claimed, they suggest, or the real goal of pandemic-safety measures is to control the public, not the virus.
Five key fallacies and pitfalls have affected public-health messaging, as well as media coverage, and have played an outsize role in derailing an effective pandemic response. These problems were deepened by the ways that we—the public—developed to cope with a dreadful situation under great uncertainty. And now, even as vaccines offer brilliant hope, and even though, at least in the United States, we no longer have to deal with the problem of a misinformer in chief, some officials and media outlets are repeating many of the same mistakes in handling the vaccine rollout.
The pandemic has given us an unwelcome societal stress test, revealing the cracks and weaknesses in our institutions and our systems. Some of these are common to many contemporary problems, including political dysfunction and the way our public sphere operates. Others are more particular, though not exclusive, to the current challenge—including a gap between how academic research operates and how the public understands that research, and the ways in which the psychology of coping with the pandemic has distorted our response to it.
Recognizing all these dynamics is important, not only for seeing us through this pandemic—yes, it is going to end—but also to understand how our society functions, and how it fails. We need to start shoring up our defenses, not just against future pandemics but against all the myriad challenges we face—political, environmental, societal, and technological. None of these problems is impossible to remedy, but first we have to acknowledge them and start working to fix them—and we’re running out of time.
The past 12 months were incredibly challenging for almost everyone. Public-health officials were fighting a devastating pandemic and, at least in this country, an administration hell-bent on undermining them. The World Health Organization was not structured or funded for independence or agility, but still worked hard to contain the disease. Many researchers and experts noted the absence of timely and trustworthy guidelines from authorities, and tried to fill the void by communicating their findings directly to the public on social media. Reporters tried to keep the public informed under time and knowledge constraints, which were made more severe by the worsening media landscape. And the rest of us were trying to survive as best we could, looking for guidance where we could, and sharing information when we could, but always under difficult, murky conditions.
Despite all these good intentions, much of the public-health messaging has been profoundly counterproductive. In five specific ways, the assumptions made by public officials, the choices made by traditional media, the way our digital public sphere operates, and communication patterns between academic communities and the public proved flawed.
One of the most important problems undermining the pandemic response has been the mistrust and paternalism that some public-health agencies and experts have exhibited toward the public. A key reason for this stance seems to be that some experts feared that people would respond to something that increased their safety—such as masks, rapid tests, or vaccines—by behaving recklessly. They worried that a heightened sense of safety would lead members of the public to take risks that would not just undermine any gains, but reverse them.
The theory that things that improve our safety might provide a false sense of security and lead to reckless behavior is attractive—it’s contrarian and clever, and fits the “here’s something surprising we smart folks thought about” mold that appeals to, well, people who think of themselves as smart. Unsurprisingly, such fears have greeted efforts to persuade the public to adopt almost every advance in safety, including seat belts, helmets, and condoms.
But time and again, the numbers tell a different story: Even if safety improvements cause a few people to behave recklessly, the benefits overwhelm the ill effects. In any case, most people are already interested in staying safe from a dangerous pathogen. Further, even at the beginning of the pandemic, sociological theory predicted that wearing masks would be associated with increased adherence to other precautionary measures—people interested in staying safe are interested in staying safe—and empirical research quickly confirmed exactly that. Unfortunately, though, the theory of risk compensation—and its implicit assumptions—continues to haunt our approach, in part because there hasn’t been a reckoning with the initial missteps.
Rules in Place of Mechanisms and Intuitions
Much of the public messaging focused on offering a series of clear rules to ordinary people, instead of explaining in detail the mechanisms of viral transmission for this pathogen. A focus on explaining transmission mechanisms, and updating our understanding over time, would have helped empower people to make informed calculations about risk in different settings. Instead, both the CDC and the WHO chose to offer fixed guidelines that lent a false sense of precision.
In the United States, the public was initially told that “close contact” meant coming within six feet of an infected individual, for 15 minutes or more. This messaging led to ridiculous gaming of the rules; some establishments moved people around at the 14th minute to avoid passing the threshold. It also led to situations in which people working indoors with others, but just outside the cutoff of six feet, felt that they could take their mask off. None of this made any practical sense. What happened at minute 16? Was seven feet okay? Faux precision isn’t more informative; it’s misleading.
All of this was complicated by the fact that key public-health agencies like the CDC and the WHO were late to acknowledge the importance of some key infection mechanisms, such as aerosol transmission. Even when they did so, the shift happened without a proportional change in the guidelines or the messaging—it was easy for the general public to miss its significance.
Frustrated by the lack of public communication from health authorities, I wrote an article last July on what we then knew about the transmission of this pathogen—including how it could be spread via aerosols that can float and accumulate, especially in poorly ventilated indoor spaces. To this day, I’m contacted by people who describe workplaces that are following the formal guidelines, but in ways that defy reason: They’ve installed plexiglass, but barred workers from opening their windows; they’ve mandated masks, but only when workers are within six feet of one another, while permitting them to be taken off indoors during breaks.
Perhaps worst of all, our messaging and guidelines elided the difference between outdoor and indoor spaces, where, given the importance of aerosol transmission, the same precautions should not apply. This is especially important because this pathogen is overdispersed: Much of the spread is driven by a few people infecting many others at once, while most people do not transmit the virus at all.
After I wrote an article explaining how overdispersion and super-spreading were driving the pandemic, I discovered that this mechanism had also been poorly explained. I was inundated by messages from people, including elected officials around the world, saying they had no idea that this was the case. None of it was secret—numerous academic papers and articles had been written about it—but it had not been integrated into our messaging or our guidelines despite its great importance.
Crucially, super-spreading isn’t equally distributed; poorly ventilated indoor spaces can facilitate the spread of the virus over longer distances, and in shorter periods of time, than the guidelines suggested, and help fuel the pandemic.
Outdoors? It’s the opposite.
There is a solid scientific reason for the fact that there are relatively few documented cases of transmission outdoors, even after a year of epidemiological work: The open air dilutes the virus very quickly, and the sun helps deactivate it, providing further protection. And super-spreading—the biggest driver of the pandemic—appears to be an exclusively indoor phenomenon. I’ve been tracking every report I can find for the past year, and have yet to find a confirmed super-spreading event that occurred solely outdoors. Such events might well have taken place, but if the risk were great enough to justify altering our lives, I would expect at least a few to have been documented by now.
And yet our guidelines do not reflect these differences, and our messaging has not helped people understand these facts so that they can make better choices. I published my first article pleading for parks to be kept open on April 7, 2020—but outdoor activities are still banned by some authorities today, a full year after this dreaded virus began to spread globally.
We’d have been much better off if we had given people a realistic intuition about this virus’s transmission mechanisms. Our public guidelines should have been more like Japan’s, which emphasize avoiding the three C’s—closed spaces, crowded places, and close contact—that are driving the pandemic.
Scolding and Shaming
Throughout the past year, traditional and social media have been caught up in a cycle of shaming—made worse by being so unscientific and misguided. How dare you go to the beach? newspapers have scolded us for months, despite lacking evidence that this posed any significant threat to public health. It wasn’t just talk: Many cities closed parks and outdoor recreational spaces, even as they kept open indoor dining and gyms. Just this month, UC Berkeley and the University of Massachusetts at Amherst both banned students from taking even solitary walks outdoors.
Even when authorities relax the rules a bit, they do not always follow through in a sensible manner. In the United Kingdom, after some locales finally started allowing children to play on playgrounds—something that was already way overdue—they quickly ruled that parents must not socialize while their kids have a normal moment. Why not? Who knows?
On social media, meanwhile, pictures of people outdoors without masks draw reprimands, insults, and confident predictions of super-spreading—and yet few note when super-spreading fails to follow.
While visible but low-risk activities attract the scolds, other actual risks—in workplaces and crowded households, exacerbated by the lack of testing or paid sick leave—are not as easily accessible to photographers. Stefan Baral, an associate epidemiology professor at the Johns Hopkins Bloomberg School of Public Health, says that it’s almost as if we’ve “designed a public-health response most suitable for higher-income” groups and the “Twitter generation”—stay home; have your groceries delivered; focus on the behaviors you can photograph and shame online—rather than provide the support and conditions necessary for more people to keep themselves safe.
And the viral videos shaming people for failing to take sensible precautions, such as wearing masks indoors, do not necessarily help. For one thing, fretting over the occasional person throwing a tantrum while going unmasked in a supermarket distorts the reality: Most of the public has been complying with mask wearing. Worse, shaming is often an ineffective way of getting people to change their behavior, and it entrenches polarization and discourages disclosure, making it harder to fight the virus. Instead, we should be emphasizing safer behavior and stressing how many people are doing their part, while encouraging others to do the same.
Amidst all the mistrust and the scolding, a crucial public-health concept fell by the wayside. Harm reduction is the recognition that if there is an unmet and yet crucial human need, we cannot simply wish it away; we need to advise people on how to do what they seek to do more safely. Risk can never be completely eliminated; life requires more than futile attempts to bring risk down to zero. Pretending we can will away complexities and trade-offs with absolutism is counterproductive. Consider abstinence-only education: Not letting teenagers know about ways to have safer sex results in more of them having sex without protection.
As Julia Marcus, an epidemiologist and associate professor at Harvard Medical School, told me, “When officials assume that risks can be easily eliminated, they might neglect the other things that matter to people: staying fed and housed, being close to loved ones, or just enjoying their lives. Public health works best when it helps people find safer ways to get what they need and want.”
Another problem with absolutism is the “abstinence violation” effect, Joshua Barocas, an assistant professor at the Boston University School of Medicine and Infectious Diseases, told me. When we set perfection as the only option, it can cause people who fall short of that standard in one small, particular way to decide that they’ve already failed, and might as well give up entirely. Most people who have attempted a diet or a new exercise regimen are familiar with this psychological state. The better approach is encouraging risk reduction and layered mitigation—emphasizing that every little bit helps—while also recognizing that a risk-free life is neither possible nor desirable.
Socializing is not a luxury—kids need to play with one another, and adults need to interact. “Your kids can play together outdoors, and outdoor time is the best chance to catch up with your neighbors” is not just a sensible message; it’s a way to decrease transmission risks. Some kids will play and some adults will socialize no matter what the scolds say or public-health officials decree, and they’ll do it indoors, out of sight of the scolding.
And if they don’t? Then kids will be deprived of an essential activity, and adults will be deprived of human companionship. Socializing is perhaps the most important predictor of health and longevity, after not smoking and perhaps exercise and a healthy diet. We need to help people socialize more safely, not encourage them to stop socializing entirely.
The Balance Between Knowledge and Action
Last but not least, the pandemic response has been distorted by a poor balance between knowledge, risk, certainty, and action.
Sometimes, public-health authorities insisted that we did not know enough to act, when the preponderance of evidence already justified precautionary action. Wearing masks, for example, posed few downsides, and held the prospect of mitigating the exponential threat we faced. The wait for certainty hampered our response to airborne transmission, even though there was almost no evidence for—and increasing evidence against—the importance of fomites, or objects that can carry infection. And yet, we emphasized the risk of surface transmission while refusing to properly address the risk of airborne transmission, despite increasing evidence. The difference lay not in the level of evidence and scientific support for either theory—which, if anything, quickly tilted in favor of airborne transmission, and not fomites, being crucial—but in the fact that fomite transmission had been a key part of the medical canon, and airborne transmission had not.
Sometimes, experts and the public discussion failed to emphasize that we were balancing risks, as in the recurring cycles of debate over lockdowns or school openings. We should have done more to acknowledge that there were no good options, only trade-offs between different downsides. As a result, instead of recognizing the difficulty of the situation, too many people accused those on the other side of being callous and uncaring.
And sometimes, the way that academics communicate clashed with how the public constructs knowledge. In academia, publishing is the coin of the realm, and it is often done through rejecting the null hypothesis—meaning that many papers do not seek to prove something conclusively, but instead, to reject the possibility that a variable has no relationship with the effect they are measuring (beyond chance). If that sounds convoluted, it is—there are historical reasons for this methodology and big arguments within academia about its merits, but for the moment, this remains standard practice.
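The null-hypothesis logic described above can be made concrete with a small worked example. The sketch below is a hypothetical illustration (not drawn from any study discussed here): it computes an exact one-sided binomial p-value using only the Python standard library. The point it demonstrates is the one in the paragraph—the test does not prove a treatment works; it only quantifies how unlikely the observed result would be if the treatment did nothing.

```python
from math import comb

def binom_p_value(successes: int, trials: int, p_null: float = 0.5) -> float:
    """One-sided exact binomial p-value: the probability of observing at
    least `successes` out of `trials` if the null rate `p_null` is true."""
    return sum(
        comb(trials, k) * p_null**k * (1 - p_null) ** (trials - k)
        for k in range(successes, trials + 1)
    )

# Hypothetical example: 16 recoveries among 20 treated patients, against
# a null hypothesis of a 50% background recovery rate.
p = binom_p_value(16, 20, 0.5)
print(f"p-value: {p:.4f}")  # a small p lets us reject the null at the 0.05 level
```

Rejecting the null here says only that chance alone is an implausible explanation—which, as the paragraph notes, is a far weaker (and more convoluted) claim than the conclusive proof readers often assume a published paper contains.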
At crucial points during the pandemic, though, this resulted in mistranslations and fueled misunderstandings, which were further muddled by differing stances toward prior scientific knowledge and theory. Yes, we faced a novel coronavirus, but we should have started by assuming that we could make some reasonable projections from prior knowledge, while looking out for anything that might prove different. That prior experience should have made us mindful of seasonality, the key role of overdispersion, and aerosol transmission. A keen eye for what was different from the past would have alerted us earlier to the importance of presymptomatic transmission.
Thus, on January 14, 2020, the WHO stated that there was “no clear evidence of human-to-human transmission.” It should have said, “There is increasing likelihood that human-to-human transmission is taking place, but we haven’t yet proven this, because we have no access to Wuhan, China.” (Cases were already popping up around the world at that point.) Acting as if there was human-to-human transmission during the early weeks of the pandemic would have been wise and preventive.
Later that spring, WHO officials stated that there was “currently no evidence that people who have recovered from COVID-19 and have antibodies are protected from a second infection,” producing many articles laden with panic and despair. Instead, it should have said: “We expect the immune system to function against this virus, and to provide some immunity for some period of time, but it is still hard to know specifics because it is so early.”
Similarly, since the vaccines were announced, too many statements have emphasized that we don’t yet know if vaccines prevent transmission. Instead, public-health authorities should have said that we have many reasons to expect, and increasing amounts of data to suggest, that vaccines will blunt infectiousness, but that we’re waiting for additional data to be more precise about it. That’s been unfortunate, because while many, many things have gone wrong during this pandemic, the vaccines are one thing that has gone very, very right.
As late as April 2020, Anthony Fauci was slammed as too optimistic for suggesting we might plausibly have vaccines in a year to 18 months. We had vaccines much, much sooner than that: The first two vaccine trials concluded a mere eight months after the WHO declared a pandemic in March 2020.
Moreover, they have delivered spectacular results. In June 2020, the FDA said a vaccine that was merely 50 percent efficacious in preventing symptomatic COVID-19 would receive emergency approval—that such a benefit would be sufficient to justify shipping it out immediately. Just a few months after that, the trials of the Moderna and Pfizer vaccines concluded by reporting not just a stunning 95 percent efficacy, but also a complete elimination of hospitalization or death among the vaccinated. Even severe disease was practically gone: The lone case classified as “severe” among 30,000 vaccinated individuals in the trials was so mild that the patient needed no medical care, and her case would not have been considered severe if her oxygen saturation had been a single percent higher.
These are exhilarating developments, because global, widespread, and rapid vaccination is our way out of this pandemic. Vaccines that drastically reduce hospitalizations and deaths, and that diminish even severe disease to a rare event, are the closest things we have had in this pandemic to a miracle—though of course they are the product of scientific research, creativity, and hard work. They are going to be the panacea and the endgame.
And yet, two months into an accelerating vaccination campaign in the United States, it would be hard to blame people if they missed the news that things are getting better.
Yes, there are new variants of the virus, which may eventually require booster shots, but at least so far, the existing vaccines are standing up to them well—very, very well. Manufacturers are already working on new vaccines or variant-focused booster versions, in case they prove necessary, and the authorizing agencies are ready for a quick turnaround if and when updates are needed. Reports from places that have vaccinated large numbers of individuals, and even trials in places where variants are widespread, are exceedingly encouraging, with dramatic reductions in cases and, crucially, hospitalizations and deaths among the vaccinated. Global equity and access to vaccines remain crucial concerns, but the supply is increasing.
Here in the United States, despite the rocky rollout and the need to smooth access and ensure equity, it’s become clear that toward the end of spring 2021, supply will be more than sufficient. It may sound hard to believe today, as many who are desperate for vaccinations await their turn, but in the near future, we may have to discuss what to do with excess doses.
So why isn’t this story more widely appreciated?
Part of the problem with the vaccines was the timing—the trials concluded immediately after the U.S. election, and their results got overshadowed in the weeks of political turmoil. The first, modest headline announcing the Pfizer-BioNTech results in The New York Times was a single column, “Vaccine Is Over 90% Effective, Pfizer’s Early Data Says,” below a banner headline spanning the page: “BIDEN CALLS FOR UNITED FRONT AS VIRUS RAGES.” That was both understandable—the nation was weary—and a loss for the public.
Just a few days later, Moderna reported a similar 94.5 percent efficacy. If anything, that provided even more cause for celebration, because it confirmed that the stunning numbers coming out of Pfizer weren’t a fluke. But, still amid the political turmoil, the Moderna report got a mere two columns on The New York Times’ front page with an equally modest headline: “Another Vaccine Appears to Work Against the Virus.”
So we didn’t get our initial vaccine jubilation.
But as soon as we began vaccinating people, articles started warning the newly vaccinated about all they could not do. “COVID-19 Vaccine Doesn’t Mean You Can Party Like It’s 1999,” one headline admonished. And the buzzkill has continued right up to the present. “You’re fully vaccinated against the coronavirus—now what? Don’t expect to shed your mask and get back to normal activities right away,” began a recent Associated Press story.
People might well want to party after being vaccinated. Those shots will expand what we can do, first in our private lives and among other vaccinated people, and then, gradually, in our public lives as well. But once again, the authorities and the media seem more worried about potentially reckless behavior among the vaccinated, and about telling them what not to do, than with providing nuanced guidance reflecting trade-offs, uncertainty, and a recognition that vaccination can change behavior. No guideline can cover every situation, but careful, accurate, and updated information can empower everyone.
Take the messaging and public conversation around transmission risks from vaccinated people. It is, of course, important to be alert to such considerations: Many vaccines are “leaky” in that they prevent disease or severe disease, but not infection and transmission. In fact, completely blocking all infection—what’s often called “sterilizing immunity”—is a difficult goal, and something even many highly effective vaccines don’t attain, but that doesn’t stop them from being extremely useful.
As Paul Sax, an infectious-disease doctor at Boston’s Brigham and Women’s Hospital, put it in early December, it would be enormously surprising “if these highly effective vaccines didn’t also make people less likely to transmit.” From multiple studies, we already knew that asymptomatic individuals—those who never developed COVID-19 despite being infected—were much less likely to transmit the virus. The vaccine trials were reporting 95 percent reductions in any form of symptomatic disease. In December, we learned that Moderna had swabbed some portion of trial participants to detect asymptomatic, silent infections, and found an almost two-thirds reduction even in such cases. The good news kept pouring in. Multiple studies found that, even in those few cases where breakthrough disease occurred in vaccinated people, their viral loads were lower—which correlates with lower rates of transmission. Data from vaccinated populations further confirmed what many experts expected all along: Of course these vaccines reduce transmission.
What went wrong? The same thing that’s going wrong right now with the reporting on whether vaccines will protect recipients against the new viral variants. Some outlets emphasize the worst or misinterpret the research. Some public-health officials are wary of encouraging the relaxation of any precautions. Some prominent experts on social media—even those with seemingly solid credentials—tend to respond to everything with alarm and sirens. So the message that got heard was that vaccines will not prevent transmission, or that they won’t work against new variants, or that we don’t know if they will. What the public needs to hear, though, is that based on existing data, we expect them to work fairly well—but we’ll learn more about precisely how effective they’ll be over time, and that tweaks may make them even better.
A year into the pandemic, we’re still repeating the same mistakes.
The top-down messaging is not the only problem. The scolding, the strictness, the inability to discuss trade-offs, and the accusations of not caring about people dying not only have an enthusiastic audience; portions of the public engage in these behaviors themselves. Maybe that’s partly because proclaiming the importance of individual actions makes us feel as if we are in the driver’s seat, despite all the uncertainty.
Psychologists talk about the “locus of control”—the strength of belief in control over your own destiny. They distinguish between people with more of an internal-control orientation—who believe that they are the primary actors—and those with an external one, who believe that society, fate, and other factors beyond their control greatly influence what happens to us. This focus on individual control goes along with something called the “fundamental attribution error”—when bad things happen to other people, we’re more likely to believe that they are personally at fault, but when they happen to us, we are more likely to blame the situation and circumstances beyond our control.
An individualistic locus of control is forged in the U.S. mythos—that we are a nation of strivers and people who pull ourselves up by our bootstraps. An internal-control orientation isn’t necessarily negative; it can facilitate resilience, rather than fatalism, by shifting the focus to what we can do as individuals even as things fall apart around us. This orientation seems to be common among children who not only survive but sometimes thrive in terrible situations—they take charge and have a go at it, and with some luck, pull through. It is probably even more attractive to educated, well-off people who feel that they have succeeded through their own actions.
You can see the attraction of an individualized, internal locus of control in a pandemic, as a pathogen without a cure spreads globally, interrupts our lives, makes us sick, and could prove fatal.
There have been very few things we could do at an individual level to reduce our risk beyond wearing masks, distancing, and disinfecting. The desire to exercise personal control against an invisible, pervasive enemy is likely why we’ve continued to emphasize scrubbing and cleaning surfaces, in what’s appropriately called “hygiene theater,” long after it became clear that fomites were not a key driver of the pandemic. Obsessive cleaning gave us something to do, and we weren’t about to give it up, even if it turned out to be useless. No wonder there was so much focus on telling others to stay home—even though it’s not a choice available to those who cannot work remotely—and so much scolding of those who dared to socialize or enjoy a moment outdoors.
And perhaps it was too much to expect a nation unwilling to release its tight grip on the bottle of bleach to greet the arrival of vaccines—however spectacular—by imagining the day we might start to let go of our masks.
The focus on individual actions has had its upsides, but it has also led to a sizable portion of pandemic victims being erased from public conversation. If our own actions drive everything, then some other individuals must be to blame when things go wrong for them. And throughout this pandemic, the mantra many of us kept repeating—“Wear a mask, stay home; wear a mask, stay home”—hid many of the real victims.
Study after study, in country after country, confirms that this disease has disproportionately hit the poor and minority groups, along with the elderly, who are particularly vulnerable to severe disease. Even among the elderly, though, those who are wealthier and enjoy greater access to health care have fared better.
The poor and minority groups are dying in disproportionately large numbers for the same reasons that they suffer from many other diseases: a lifetime of disadvantages, lack of access to health care, inferior working conditions, unsafe housing, and limited financial resources.
Many lacked the option of staying home precisely because they were working hard to enable others to do what they could not, by packing boxes, delivering groceries, producing food. And even those who could stay home faced other problems born of inequality: Crowded housing is associated with higher rates of COVID-19 infection and worse outcomes, likely because many of the essential workers who live in such housing bring the virus home to elderly relatives.
Individual responsibility certainly had a large role to play in fighting the pandemic, but many victims had little choice in what happened to them. By disproportionately focusing on individual choices, not only did we hide the real problem, but we failed to do more to provide safe working and living conditions for everyone.
For example, there has been a lot of consternation about indoor dining, an activity I certainly wouldn’t recommend. But even takeout and delivery can impose a terrible cost: One study of California found that line cooks are the highest-risk occupation for dying of COVID-19. Unless we provide restaurants with funds so they can stay closed, or provide restaurant workers with high-filtration masks, better ventilation, paid sick leave, frequent rapid testing, and other protections so that they can safely work, getting food to go can simply shift the risk to the most vulnerable. Unsafe workplaces may be low on our agenda, but they do pose a real danger. Bill Hanage, associate professor of epidemiology at Harvard, pointed me to a paper he co-authored: Workplace-safety complaints to OSHA—which oversees occupational-safety regulations—during the pandemic were predictive of increases in deaths 16 days later.
New data highlight the terrible toll of inequality: Life expectancy has decreased dramatically over the past year, with Black people losing the most from this disease, followed by members of the Hispanic community. Minorities are also more likely to die of COVID-19 at a younger age. But when the new CDC director, Rochelle Walensky, noted this terrible statistic, she immediately followed up by urging people to “continue to use proven prevention steps to slow the spread—wear a well-fitting mask, stay 6 ft away from those you do not live with, avoid crowds and poorly ventilated places, and wash hands often.”
Those recommendations aren’t wrong, but they are incomplete. None of these individual acts do enough to protect those to whom such choices aren’t available—and the CDC has yet to issue sufficient guidelines for workplace ventilation or to make higher-filtration masks mandatory, or even available, for essential workers. Nor are these proscriptions paired frequently enough with prescriptions: Socialize outdoors, keep parks open, and let children play with one another outdoors.
Vaccines are the tool that will end the pandemic. The story of their rollout combines some of our strengths and our weaknesses, revealing the limitations of the way we think and evaluate evidence, provide guidelines, and absorb and react to an uncertain and difficult situation.
But also, after a weary year, maybe it’s hard for everyone—including scientists, journalists, and public-health officials—to imagine the end, to have hope. We adjust to new conditions fairly quickly, even terrible new conditions. During this pandemic, we’ve adjusted to things many of us never thought were possible. Billions of people have led dramatically smaller, circumscribed lives, and dealt with closed schools, the inability to see loved ones, the loss of jobs, the absence of communal activities, and the threat and reality of illness and death.
Hope nourishes us during the worst times, but it is also dangerous. It upsets the delicate balance of survival—where we stop hoping and focus on getting by—and opens us up to crushing disappointment if things don’t pan out. After a terrible year, many things are understandably making it harder for us to dare to hope. But, especially in the United States, everything looks better by the day. Tragically, at least 28 million Americans have been confirmed to have been infected, but the real number is certainly much higher. By one estimate, as many as 80 million have already been infected with COVID-19, and many of those people now have some level of immunity. Another 46 million people have already received at least one dose of a vaccine, and we’re vaccinating millions more each day as the supply constraints ease. The vaccines are poised to reduce or nearly eliminate the things we worry most about—severe disease, hospitalization, and death.
Not all our problems are solved. We need to get through the next few months, as we race to vaccinate against more transmissible variants. We need to do more to address equity in the United States—because it is the right thing to do, and because failing to vaccinate the highest-risk people will slow the population impact. We need to make sure that vaccines don’t remain inaccessible to poorer countries. We need to keep up our epidemiological surveillance so that if we do notice something that looks like it may threaten our progress, we can respond swiftly.
And the public behavior of the vaccinated cannot change overnight—even if they are at much lower risk, it’s not reasonable to expect a grocery store to try to verify who’s vaccinated, or to have two classes of people with different rules. For now, it’s courteous and prudent for everyone to obey the same guidelines in many public places. Still, vaccinated people can feel more confident in doing things they may have avoided, just in case—getting a haircut, taking a trip to see a loved one, browsing for nonessential purchases in a store.
But it is time to imagine a better future, not just because it’s drawing nearer but because that’s how we get through what remains and keep our guard up as necessary. It’s also realistic—reflecting the genuine increased safety for the vaccinated.
Public-health agencies should immediately start providing expanded information to vaccinated people so they can make informed decisions about private behavior. This is justified by the encouraging data, and a great way to get the word out on how wonderful these vaccines really are. The delay itself has great human costs, especially for those among the elderly who have been isolated for so long.
Public-health authorities should also be louder and more explicit about the next steps, giving us guidelines for when we can expect easing in rules for public behavior as well. We need the exit strategy spelled out—but with graduated, targeted measures rather than a one-size-fits-all message. We need to let people know that getting a vaccine will almost immediately change their lives for the better, and why, and also when and how increased vaccination will change more than their individual risks and opportunities, and see us out of this pandemic.
We should encourage people to dream about the end of this pandemic by talking about it more, and more concretely: the numbers, hows, and whys. Offering clear guidance on how this will end can help strengthen people’s resolve to endure whatever is necessary for the moment—even if they are still unvaccinated—by building warranted and realistic anticipation of the pandemic’s end.
Hope will get us through this. And one day soon, you’ll be able to hop off the subway on your way to a concert, pick up a newspaper, and find the triumphant headline: “COVID Routed!”
Zeynep Tufekci is a contributing writer at The Atlantic and an associate professor at the University of North Carolina. She studies the interaction between digital technology, artificial intelligence, and society.
Many scientists are expecting another rise in infections. But this time the surge will be blunted by vaccines and, hopefully, widespread caution. By summer, Americans may be looking at a return to normal life.
Published Feb. 25, 2021; updated Feb. 26, 2021, 12:07 a.m. ET
Across the United States, and the world, the coronavirus seems to be loosening its stranglehold. The deadly curve of cases, hospitalizations and deaths has yo-yoed before, but never has it plunged so steeply and so fast.
Is this it, then? Is this the beginning of the end? After a year of being pummeled by grim statistics and scolded for wanting human contact, many Americans feel a long-promised deliverance is at hand.
Americans will win against the virus and regain many aspects of their pre-pandemic lives, most scientists now believe. Of the 21 interviewed for this article, all were optimistic that the worst of the pandemic is past. This summer, they said, life may begin to seem normal again.
But — of course, there’s always a but — researchers are also worried that Americans, so close to the finish line, may once again underestimate the virus.
So far, the two vaccines authorized in the United States are spectacularly effective, and after a slow start, the vaccination rollout is picking up momentum. A third vaccine is likely to be authorized shortly, adding to the nation’s supply.
But it will be many weeks before vaccinations make a dent in the pandemic. And now the virus is shape-shifting faster than expected, evolving into variants that may partly sidestep the immune system.
The latest variant was discovered in New York City only this week, and another worrisome version is spreading at a rapid pace through California. Scientists say a contagious variant first discovered in Britain will become the dominant form of the virus in the United States by the end of March.
The road back to normalcy is potholed with unknowns: how well vaccines prevent further spread of the virus; whether emerging variants remain susceptible enough to the vaccines; and how quickly the world is immunized, so as to halt further evolution of the virus.
But the greatest ambiguity is human behavior. Can Americans desperate for normalcy keep wearing masks and distancing themselves from family and friends? How much longer can communities keep businesses, offices and schools closed?
Covid-19 deaths will most likely never rise quite as precipitously as in the past, and the worst may be behind us. But if Americans let down their guard too soon — many states are already lifting restrictions — and if the variants spread in the United States as they have elsewhere, another spike in cases may well arrive in the coming weeks.
Scientists call it the fourth wave. The new variants mean “we’re essentially facing a pandemic within a pandemic,” said Adam Kucharski, an epidemiologist at the London School of Hygiene and Tropical Medicine.
The declines are real, but they disguise worrying trends.
The United States has now recorded 500,000 deaths amid the pandemic, a terrible milestone. As of Wednesday morning, at least 28.3 million people have been infected.
Yet the numbers are still at the horrific highs of November, scientists noted. At least 3,210 people died of Covid-19 on Wednesday alone. And there is no guarantee that these rates will continue to decrease.
“Very, very high case numbers are not a good thing, even if the trend is downward,” said Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health in Boston. “Taking the first hint of a downward trend as a reason to reopen is how you get to even higher numbers.”
In late November, for example, Gov. Gina Raimondo of Rhode Island limited social gatherings and some commercial activities in the state. Eight days later, cases began to decline. The trend reversed eight days after the state’s pause lifted on Dec. 20.
The virus’s latest retreat in Rhode Island and most other states, experts said, results from a combination of factors: growing numbers of people with immunity to the virus, either from having been infected or from vaccination; changes in behavior in response to the surges of a few weeks ago; and a dash of seasonality — the effect of temperature and humidity on the survival of the virus.
The vaccines were first rolled out to residents of nursing homes and to the elderly, who are at highest risk of severe illness and death. That may explain some of the current decline in hospitalizations and deaths.
But young people drive the spread of the virus, and most of them have not yet been inoculated. And the bulk of the world’s vaccine supply has been bought up by wealthy nations, which have amassed one billion more doses than needed to immunize their populations.
Vaccination cannot explain why cases are dropping even in countries where not a single soul has been immunized, like Honduras, Kazakhstan or Libya. The biggest contributor to the sharp decline in infections is something more mundane, scientists say: behavioral change.
Leaders in the United States and elsewhere stepped up community restrictions after the holiday peaks. But individual choices have also been important, said Lindsay Wiley, an expert in public health law and ethics at American University in Washington.
“People voluntarily change their behavior as they see their local hospital get hit hard, as they hear about outbreaks in their area,” she said. “If that’s the reason that things are improving, then that’s something that can reverse pretty quickly, too.”
The downward curve of infections with the original coronavirus disguises an exponential rise in infections with B.1.1.7, the variant first identified in Britain, according to many researchers.
“We really are seeing two epidemic curves,” said Ashleigh Tuite, an infectious disease modeler at the University of Toronto.
The B.1.1.7 variant is thought to be more contagious and more deadly, and it is expected to become the predominant form of the virus in the United States by late March. The number of cases with the variant in the United States has risen from 76 in 12 states as of Jan. 13 to more than 1,800 in 45 states now. Actual infections may be much higher because of inadequate surveillance efforts in the United States.
Buoyed by the shrinking rates over all, however, governors are lifting restrictions across the United States and are under enormous pressure to reopen completely. Should that occur, B.1.1.7 and the other variants are likely to explode.
“Everybody is tired, and everybody wants things to open up again,” Dr. Tuite said. “Bending to political pressure right now, when things are really headed in the right direction, is going to end up costing us in the long term.”
Another wave may be coming, but it can be minimized.
Looking ahead to late March or April, the majority of scientists interviewed by The Times predicted a fourth wave of infections. But they stressed that it is not an inevitable surge, if government officials and individuals maintain precautions for a few more weeks.
A minority of experts were more sanguine, saying they expected powerful vaccines and an expanding rollout to stop the virus. And a few took the middle road.
“We’re at that crossroads, where it could go well or it could go badly,” said Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases.
The vaccines have proved to be more effective than anyone could have hoped, so far preventing serious illness and death in nearly all recipients. At present, about 1.4 million Americans are vaccinated each day. More than 45 million Americans have received at least one dose.
A team of researchers at Fred Hutchinson Cancer Research Center in Seattle tried to calculate the number of vaccinations required per day to avoid a fourth wave. In a model completed before the variants surfaced, the scientists estimated that vaccinating just one million Americans a day would limit the magnitude of the fourth wave.
“But the new variants completely changed that,” said Dr. Joshua T. Schiffer, an infectious disease specialist who led the study. “It’s just very challenging scientifically — the ground is shifting very, very quickly.”
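The mechanic behind such estimates can be illustrated with a toy compartmental model. The sketch below is purely illustrative and is not the Fred Hutchinson model described above; the population, R0, starting immunity, and vaccination rates are all hypothetical, chosen only to show why a faster rollout blunts the peak of a hypothetical wave.

```python
# Toy SIR-style model with a constant daily vaccination rate.
# Illustrative only — not the Fred Hutchinson model; all parameters are assumptions.

def simulate(pop=330e6, r0=1.5, infectious_days=7,
             vax_per_day=1e6, initially_immune=0.25, days=120):
    """Return peak daily new infections under a constant vaccination rate."""
    beta = r0 / infectious_days          # per-day transmission rate
    gamma = 1 / infectious_days          # per-day recovery rate
    s = pop * (1 - initially_immune)     # susceptible people
    i = 100_000                          # currently infectious (assumed)
    peak_new = 0.0
    for _ in range(days):
        new_inf = beta * s * i / pop         # new infections this day
        s -= new_inf + min(vax_per_day, s)   # infection and vaccination both deplete S
        i += new_inf - gamma * i
        peak_new = max(peak_new, new_inf)
    return peak_new

# Faster vaccination shrinks the susceptible pool sooner,
# so the hypothetical wave peaks lower:
slow = simulate(vax_per_day=1e6)
fast = simulate(vax_per_day=3e6)
assert fast < slow
```

As the article notes, more transmissible variants effectively raise r0 in a model like this, which is why an estimate of "one million doses a day" can stop being enough when the ground shifts.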
Natalie Dean, a biostatistician at the University of Florida, described herself as “a little more optimistic” than many other researchers. “We would be silly to undersell the vaccines,” she said, noting that they are effective against the fast-spreading B.1.1.7 variant.
But Dr. Dean worried about the forms of the virus detected in South Africa and Brazil that seem less vulnerable to the vaccines made by Pfizer and Moderna. (On Wednesday, Johnson & Johnson reported that its vaccine was relatively effective against the variant found in South Africa.)
About 50 infections with those two variants have been identified in the United States, but that could change. Because of the variants, scientists do not know how many people who were infected and had recovered are now vulnerable to reinfection.
South Africa and Brazil have reported reinfections with the new variants among people who had recovered from infections with the original version of the virus.
“That makes it a lot harder to say, ‘If we were to get to this level of vaccinations, we’d probably be OK,’” said Sarah Cobey, an evolutionary biologist at the University of Chicago.
Yet the biggest unknown is human behavior, experts said. The sharp drop in cases now may lead to complacency about masks and distancing, and to a wholesale lifting of restrictions on indoor dining, sporting events and more. Or … not.
“The single biggest lesson I’ve learned during the pandemic is that epidemiological modeling struggles with prediction, because so much of it depends on human behavioral factors,” said Carl Bergstrom, a biologist at the University of Washington in Seattle.
Taking into account the counterbalancing rises in both vaccinations and variants, along with the high likelihood that people will stop taking precautions, a fourth wave is highly likely this spring, the majority of experts told The Times.
Kristian Andersen, a virologist at the Scripps Research Institute in San Diego, said he was confident that the number of cases will continue to decline, then plateau in about a month. After mid-March, the curve in new cases will swing upward again.
In early to mid-April, “we’re going to start seeing hospitalizations go up,” he said. “It’s just a question of how much.”
Summer will feel like summer again, sort of.
Now the good news.
Despite the uncertainties, the experts predict that the last surge will subside in the United States sometime in the early summer. If the Biden administration can keep its promise to immunize every American adult by the end of the summer, the variants should be no match for the vaccines.
Combine vaccination with natural immunity and the human tendency to head outdoors as weather warms, and “it may not be exactly herd immunity, but maybe it’s sufficient to prevent any large outbreaks,” said Youyang Gu, an independent data scientist, who created some of the most prescient models of the pandemic.
Infections will continue to drop. More important, hospitalizations and deaths will fall to negligible levels — enough, hopefully, to reopen the country.
“Sometimes people lose vision of the fact that vaccines prevent hospitalization and death, which is really actually what most people care about,” said Stefan Baral, an epidemiologist at the Johns Hopkins Bloomberg School of Public Health.
Even as the virus begins its swoon, people may still need to wear masks in public places and maintain social distance, because a significant percent of the population — including children — will not be immunized.
“Assuming that we keep a close eye on things in the summer and don’t go crazy, I think that we could look forward to a summer that is looking more normal, but hopefully in a way that is more carefully monitored than last summer,” said Emma Hodcroft, a molecular epidemiologist at the University of Bern in Switzerland.
Imagine: Groups of vaccinated people will be able to get together for barbecues and play dates, without fear of infecting one another. Beaches, parks and playgrounds will be full of mask-free people. Indoor dining will return, along with movie theaters, bowling alleys and shopping malls — although they may still require masks.
The virus will still be circulating, but the extent will depend in part on how well vaccines prevent not just illness and death, but also transmission. The data on whether vaccines stop the spread of the disease are encouraging, but immunization is unlikely to block transmission entirely.
“It’s not zero and it’s not 100 — exactly where that number is will be important,” said Shweta Bansal, an infectious disease modeler at Georgetown University. “It needs to be pretty darn high for us to be able to get away with vaccinating anything below 100 percent of the population, so that’s definitely something we’re watching.”
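Why that number needs to be "pretty darn high" follows from textbook herd-immunity arithmetic. This is a back-of-the-envelope sketch, not Bansal's analysis: the classic threshold is 1 − 1/R0, and if a vaccine blocks only a fraction VE of transmission, the required coverage rises to (1 − 1/R0)/VE. The R0 and VE values below are assumptions for illustration.

```python
# Herd-immunity coverage needed when a vaccine blocks only part of transmission.
# Textbook sketch with assumed numbers — not a published estimate.

def required_coverage(r0, ve_transmission):
    """Fraction of the population needing vaccination, capped at 1.0."""
    threshold = 1 - 1 / r0            # classic herd-immunity threshold
    return min(threshold / ve_transmission, 1.0)

# With an assumed R0 of 3 and a vaccine blocking 90% of transmission:
print(round(required_coverage(3, 0.9), 2))   # → 0.74

# If the vaccine blocks only 60% of transmission, even 100% coverage
# would fall short of the threshold:
print(required_coverage(3, 0.6))             # → 1.0
```

The gap between those two cases is the point: small changes in how well vaccines block transmission swing the required coverage dramatically, which is why the transmission data were being watched so closely.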
Over the long term — say, a year from now, when all the adults and children in the United States who want a vaccine have received them — will this virus finally be behind us?
Every expert interviewed by The Times said no. Even after the vast majority of the American population has been immunized, the virus will continue to pop up in clusters, taking advantage of pockets of vulnerability. Years from now, the coronavirus may be an annoyance, circulating at low levels, causing modest colds.
Many scientists said their greatest worry post-pandemic was that new variants may turn out to be significantly less susceptible to the vaccines. Billions of people worldwide will remain unprotected, and each infection gives the virus new opportunities to mutate.
“We won’t have useless vaccines. We might have slightly less good vaccines than we have at the moment,” said Andrew Read, an evolutionary microbiologist at Penn State University. “That’s not the end of the world, because we have really good vaccines right now.”
For now, every one of us can help by continuing to be careful for just a few more months, until the curve permanently flattens.
“Just hang in there a little bit longer,” Dr. Tuite said. “There’s a lot of optimism and hope, but I think we need to be prepared for the fact that the next several months are likely to continue to be difficult.”
Clifford Krauss, Manny Fernandez, Ivan Penn, Rick Rojas – Feb 21, 2021
Texas has refused to join interstate electrical grids and railed against energy regulation. Now it’s having to answer to millions of residents who were left without power in last week’s snowstorm.
HOUSTON — Across the plains of West Texas, the pump jacks that resemble giant bobbing hammers define not just the landscape but the state itself: Texas has been built on the oil-and-gas business for the last 120 years, ever since the discovery of oil on Spindletop Hill near Beaumont in 1901.
Texas, the nation’s leading energy-producing state, seemed like the last place on Earth that could run out of energy.
Then last week, it did.
The crisis could be traced to that other defining Texas trait: independence, both from big government and from the rest of the country. The dominance of the energy industry and the “Republic of Texas” ethos became a devastating liability when energy stopped flowing to millions of Texans who shivered and struggled through a snowstorm that paralyzed much of the state.
Part of the responsibility for the near-collapse of the state’s electrical grid can be traced to the decision in 1999 to embark on the nation’s most extensive experiment in electrical deregulation, handing control of the state’s entire electricity delivery system to a market-based patchwork of private generators, transmission companies and energy retailers.
The energy industry wanted it. The people wanted it. Both parties supported it. “Competition in the electric industry will benefit Texans by reducing monthly rates and offering consumers more choices about the power they use,” George W. Bush, then the governor, said as he signed the top-to-bottom deregulation legislation.
Mr. Bush’s prediction of lower-cost power generally came true, and the dream of a free-market electrical grid worked reasonably well most of the time, in large part because Texas had so much cheap natural gas as well as abundant wind to power renewable energy. But the newly deregulated system came with few safeguards and even fewer enforced rules.
With so many cost-conscious utilities competing for budget-shopping consumers, there was little financial incentive to invest in weather protection and maintenance. Wind turbines are not equipped with the de-icing equipment routinely installed in the colder climes of the Dakotas, and power lines have little insulation. The possibility of more frequent cold-weather events was never built into infrastructure plans in a state where climate change remains an exotic, disputed concept.
“Deregulation was something akin to abolishing the speed limit on an interstate highway,” said Ed Hirs, an energy fellow at the University of Houston. “That opens up shortcuts that cause disasters.”
The state’s entire energy infrastructure was walloped with glacial temperatures that even under the strongest of regulations might have frozen gas wells and downed power lines.
But what went wrong was far broader: Deregulation meant that critical rules of the road for power were set not by law, but rather by a dizzying array of energy competitors.
Utility regulation is intended to compensate for the natural monopolies that occur when a single electrical provider serves an area; it keeps prices down while protecting public safety and guaranteeing fair treatment to customers. Yet many states have flirted with deregulation as a way of giving consumers more choices and encouraging new providers, especially alternative energy producers.
California, one of the early deregulators in the 1990s, scaled back its initial foray after market manipulation led to skyrocketing prices and rolling blackouts.
States like Maryland allow customers to pick from a menu of producers. In some states, competing private companies offer varied packages like discounts for cheaper power at night. But no state has gone as far as Texas, which has not only turned over the keys to the free market but has also isolated itself from the national grid, limiting the state’s ability to import power when its own generators are foundering.
Consumers themselves got a direct shock last week when customers who had chosen variable-rate electricity contracts found themselves with power bills of $5,000 or more. While they were expecting extra-low monthly rates, many may now face huge bills as a result of the upswing in wholesale electricity prices during the cold wave. Gov. Greg Abbott on Sunday said the state’s Public Utility Commission has issued a moratorium on customer disconnections for non-payment and will temporarily restrict providers from issuing invoices.
There is regulation in the Texas system, but it is hardly robust. One nonprofit agency, the Electric Reliability Council of Texas, or ERCOT, was formed to manage the wholesale market. It is supervised by the Public Utility Commission, which also oversees the transmission companies that offer customers an exhaustive array of contract choices laced with more fine print than a credit card agreement.
But both agencies are nearly unaccountable and toothless compared to regulators in other regions, where many utilities have stronger consumer protections and submit an annual planning report to ensure adequate electricity supply. Texas energy companies are given wide latitude in their planning for catastrophic events.
Into a snowstorm with no reserves
One example of how Texas has gone it alone is its refusal to enforce a “reserve margin” of extra power available above expected demand, unlike all other power systems around North America. With no mandate, there is little incentive to invest in precautions for events, such as a Southern snowstorm, that are rare. Any company that took such precautions would put itself at a competitive disadvantage.
A surplus supply of natural gas, the dominant power fuel in Texas, near power plants might have helped avoid the cascade of failures in which power went off, forcing natural gas production and transmission offline, which in turn led to further power shortages.
In the aftermath of the dayslong outages, ERCOT has been criticized by both Democratic and Republican residents, lawmakers and business executives, a rare display of unity in a fiercely partisan and Republican-dominated state. Mr. Abbott said he supported calls for the agency’s leadership to resign and made ERCOT reform a priority for the Legislature. The reckoning has been swift — this week, lawmakers will hold hearings in Austin to investigate the agency’s handling of the storm and the rolling outages.
For ERCOT operators, the storm’s arrival was swift and fierce, but they had anticipated it and knew it would strain their system. They asked power customers across the state to conserve, warning that outages were likely.
But late on Sunday, Feb. 14, it rapidly became clear that the storm was far worse than they had expected: Sleet and snow fell, and temperatures plunged. In the council’s command center outside Austin, a room dominated by screens flashing with maps, graphics and data tracking the flow of electricity to 26 million people in Texas, workers quickly found themselves fending off a crisis. As weather worsened into Monday morning, residents cranked up their heaters and demand surged.
Power plants began falling offline in rapid succession as they were overcome by the frigid weather or ran out of fuel to burn. Within hours, 40 percent of the power supply had been lost.
The entire grid — carrying 90 percent of the electric load in Texas — was barreling toward a collapse.
In the electricity business, supply and demand need to be in balance. Imbalances lead to catastrophic blackouts. Recovering from a total blackout would be an agonizing and tedious process, known as a “black start,” that could take weeks, or possibly months.
And in the early-morning hours last Monday, the Texas grid was “seconds and minutes” away from such a collapse, said Bill Magness, the president and chief executive of the Electric Reliability Council.
“If we had allowed a catastrophic blackout to happen, we wouldn’t be talking today about hopefully getting most customers their power back,” Mr. Magness said. “We’d be talking about how many months it might be before you get your power back.”
Earlier warnings of trouble
The outages and the cold weather touched off an avalanche of failures, but there had been warnings long before last week’s storm.
After a heavy snowstorm in February 2011 caused statewide rolling blackouts and left millions of Texans in the dark, federal authorities warned the state that its power infrastructure had inadequate “winterization” protection. But 10 years later, pipelines remained inadequately insulated and heaters that might have kept instruments from freezing were never installed.
During heat waves, when demand has soared during several recent summers, the system in Texas has also strained to keep up, raising questions about lack of reserve capacity on the unregulated grid.
And aside from the weather, there have been periodic signs that the system can run into trouble delivering sufficient energy, in some cases because of equipment failures, in others because of what critics called an attempt to drive up prices, according to Mr. Hirs of the University of Houston, as well as several energy consultants.
Another potential safeguard might have been far stronger connections to the Eastern and Western Interconnections, the two major interstate power-sharing networks that allow states to link their electrical grids and obtain power from thousands of miles away when needed to hold down costs and offset their own shortfalls.
But Texas, reluctant to submit to the federal regulation that is part of the regional power grids, made decisions as far back as the early 20th century to become the only state in the continental United States to operate its own grid — a plan that leaves it able to borrow only from a few close neighbors.
The border city of El Paso survived the freeze much better than Dallas or Houston because it was not part of the Texas grid but connected to the much larger grid covering many Western states.
But the problems that began with last Monday’s storm went beyond an isolated electrical grid. The entire ecosystem of how Texas generates, transmits and uses power stalled, as millions of Texans shivered in darkened, unheated homes.
Texans love to brag about natural gas, which state officials often call the cleanest-burning fossil fuel. No state produces more, and gas-fired power plants produce nearly half the state’s electricity.
“We are struggling to come to grips with the reality that gas came up short and let us down when we needed it most,” said Michael E. Webber, a professor of mechanical engineering at the University of Texas at Austin.
The cold was so severe that the enormous oil and natural gas fields of West Texas froze up, or could not get sufficient power to operate. Though a few plants had stored gas reserves, there was insufficient electricity to pump it.
The leaders of ERCOT defended the organization, its lack of mandated reserves and the state’s isolation from larger regional grids, and said the blame for the power crisis lies with the weather, not the overall deregulated system in Texas.
“The historic, just about unprecedented, storm was the heart of the problem,” Mr. Magness, the council’s chief executive, said, adding: “We’ve found that this market structure works. It demands reliability. I don’t think there’s a silver-bullet market structure that could have managed the extreme lows and generation outages that we were facing Sunday night.”
In Texas, energy regulation is as much a matter of philosophy as policy. Its independent power grid is a point of pride that has been an applause line in Texas political speeches for decades.
Deregulation is a hot topic among Texas energy experts, and there has been no shortage of predictions that the grid could fail under stress. But there has not been widespread public dissatisfaction with the system, although many are now wondering if they are being well served.
“I believe there is great value in Texas being on its own grid and I believe we can do so safely and securely and confidently going forward,” said State Representative Jeff Leach, a Republican from Plano who has called for an investigation into what went wrong. “But it’s going to take new investment and some new strategic decisions to make sure we’re protected from this ever happening again.”
Steven D. Wolens, a former Democratic lawmaker from Dallas and a principal architect of the 1999 deregulation legislation, said deregulation was meant to spur more generation, including from renewable energy sources, and to encourage the mothballing of older plants that were spewing pollution. “We were successful,” said Mr. Wolens, who left the Legislature in 2005.
But the 1999 legislation was intended as a first iteration that would evolve along with the needs of the state, he said. “They can focus on it now and they can fix it now,” he said. “The buck stops with the Texas Legislature and they are in a perfect position to determine the basis of the failure, to correct it and make sure it never happens again.”
Clifford Krauss reported from Houston, Manny Fernandez and Ivan Penn from Los Angeles, and Rick Rojas from Nashville. David Montgomery contributed reporting from Austin, Texas.
Christopher Flavelle, Brad Plumer, Hiroko Tabuchi – Feb 20, 2021
Continent-spanning storms triggered blackouts in Oklahoma and Mississippi, halted one-third of U.S. oil production and disrupted vaccinations in 20 states.
Even as Texas struggled to restore electricity and water over the past week, signs of the risks posed by increasingly extreme weather to America’s aging infrastructure were cropping up across the country.
The week’s continent-spanning winter storms triggered blackouts in Texas, Oklahoma, Mississippi and several other states. One-third of oil production in the nation was halted. Drinking-water systems in Ohio were knocked offline. Road networks nationwide were paralyzed and vaccination efforts in 20 states were disrupted.
The crisis carries a profound warning. As climate change brings more frequent and intense storms, floods, heat waves, wildfires and other extreme events, it is placing growing stress on the foundations of the country’s economy: Its network of roads and railways, drinking-water systems, power plants, electrical grids, industrial waste sites and even homes. Failures in just one sector can set off a domino effect of breakdowns in hard-to-predict ways.
Much of this infrastructure was built decades ago, under the expectation that the environment around it would remain stable, or at least fluctuate within predictable bounds. Now climate change is upending that assumption.
“We are colliding with a future of extremes,” said Alice Hill, who oversaw planning for climate risks on the National Security Council during the Obama administration. “We base all our choices about risk management on what’s occurred in the past, and that is no longer a safe guide.”
Sewer systems are overflowing more often as powerful rainstorms exceed their design capacity. Coastal homes and highways are collapsing as intensified runoff erodes cliffs. Coal ash, the toxic residue produced by coal-burning plants, is spilling into rivers as floods overwhelm barriers meant to hold it back. Homes once beyond the reach of wildfires are burning in blazes they were never designed to withstand.
Problems like these often reflect an inclination of governments to spend as little money as possible, said Shalini Vajjhala, a former Obama administration official who now advises cities on meeting climate threats. She said it’s hard to persuade taxpayers to spend extra money to guard against disasters that seem unlikely.
But climate change flips that logic, making inaction far costlier. “The argument I would make is, we can’t afford not to, because we’re absorbing the costs” later, Ms. Vajjhala said, after disasters strike. “We’re spending poorly.”
The Biden administration has talked extensively about climate change, particularly the need to reduce greenhouse gas emissions and create jobs in renewable energy. But it has spent less time discussing how to manage the growing effects of climate change, and it has faced criticism from experts for not appointing more people who focus on climate resilience.
“I am extremely concerned by the lack of emergency-management expertise reflected in Biden’s climate team,” said Samantha Montano, an assistant professor at the Massachusetts Maritime Academy who focuses on disaster policy. “There’s an urgency here that still is not being reflected.”
A White House spokesman, Vedant Patel, said in a statement, “Building resilient and sustainable infrastructure that can withstand extreme weather and a changing climate will play an integral role in creating millions of good paying, union jobs” while cutting greenhouse gas emissions.
And while President Biden has called for a major push to refurbish and upgrade the nation’s infrastructure, getting a closely divided Congress to spend hundreds of billions, if not trillions of dollars, will be a major challenge.
Heightening the cost to society, disruptions can disproportionately affect lower-income households and other vulnerable groups, including older people or those with limited English.
“All these issues are converging,” said Robert D. Bullard, a professor at Texas Southern University who studies wealth and racial disparities related to the environment. “And there’s simply no place in this country that’s not going to have to deal with climate change.”
Many forms of water crisis
In September, when a sudden storm dumped a record-setting two inches of rain on Washington in less than 75 minutes, the result wasn’t just widespread flooding, but also raw sewage rushing into hundreds of homes.
Washington, like many other cities in the Northeast and Midwest, relies on what’s called a combined sewer overflow system: If a downpour overwhelms storm drains along the street, they are built to overflow into the pipes that carry raw sewage. But if there’s too much pressure, sewage can be pushed backward, into people’s homes — where the forces can send it erupting from toilets and shower drains.
This is what happened in Washington. The city’s system was built in the late 1800s. Now, climate change is straining an already outdated design.
DC Water, the local utility, is spending billions of dollars so that the system can hold more sewage. “We’re sort of in uncharted territory,” said Vincent Morris, a utility spokesman.
The challenge of managing and taming the nation’s water supplies — whether in streets and homes, or in vast rivers and watersheds — is growing increasingly complex as storms intensify. Last May, rain-swollen floodwaters breached two dams in Central Michigan, forcing thousands of residents to flee their homes and threatening a chemical complex and a toxic waste cleanup site. Experts warned it was unlikely to be the last such failure.
Many of the country’s 90,000 dams were built decades ago and were already in dire need of repairs. Now climate change poses an additional threat, bringing heavier downpours to parts of the country and raising the odds that some dams could be overwhelmed by more water than they were designed to handle. One recent study found that most of California’s biggest dams were at increased risk of failure as global warming advances.
In recent years, dam-safety officials have begun grappling with the dangers. Colorado, for instance, now requires dam builders to take into account the risk of increased atmospheric moisture driven by climate change as they plan for worst-case flooding scenarios.
But nationwide, there remains a backlog of thousands of older dams that still need to be rehabilitated or upgraded. The price tag could ultimately stretch to more than $70 billion.
“Whenever we study dam failures, we often find there was a lot of complacency beforehand,” said Bill McCormick, president of the Association of State Dam Safety Officials. But given that failures can have catastrophic consequences, “we really can’t afford to be complacent.”
Built for a different future
If the Texas blackouts exposed one state’s poor planning, they also provide a warning for the nation: Climate change threatens virtually every aspect of the country’s electricity grids, which weren’t designed to handle increasingly severe weather. The vulnerabilities show up in power lines, natural-gas plants, nuclear reactors and myriad other systems.
Higher storm surges can knock out coastal power infrastructure. Deeper droughts can reduce water supplies for hydroelectric dams. Severe heat waves can reduce the efficiency of fossil-fuel generators, transmission lines and even solar panels at precisely the moment that demand soars because everyone cranks up their air-conditioners.
Climate hazards can also combine in new and unforeseen ways.
In California recently, Pacific Gas & Electric has had to shut off electricity to thousands of people during exceptionally dangerous fire seasons. The reason: Downed power lines can spark huge wildfires in dry vegetation. Then, during a record-hot August last year, several of the state’s natural gas plants malfunctioned in the heat, just as demand was spiking, contributing to blackouts.
“We have to get better at understanding these compound impacts,” said Michael Craig, an expert in energy systems at the University of Michigan who recently led a study looking at how rising summer temperatures in Texas could strain the grid in unexpected ways. “It’s an incredibly complex problem to plan for.”
Some utilities are taking notice. After Superstorm Sandy in 2012 knocked out power for 8.7 million customers, utilities in New York and New Jersey invested billions in flood walls, submersible equipment and other technology to reduce the risk of failures. Last month, New York’s Con Edison said it would incorporate climate projections into its planning.
As freezing temperatures struck Texas, a glitch at one of two reactors at a South Texas nuclear plant, which serves 2 million homes, triggered a shutdown. The cause: Sensing lines connected to the plant’s water pumps had frozen, said Victor Dricks, a spokesman for the federal Nuclear Regulatory Commission.
It’s also common for extreme heat to disrupt nuclear power. The issue is that the water used to cool reactors can become too warm to use, forcing shutdowns.
Flooding is another risk.
After a tsunami led to several meltdowns at Japan’s Fukushima Daiichi power plant in 2011, the U.S. Nuclear Regulatory Commission told the 60 or so working nuclear plants in the United States, many decades old, to evaluate their flood risk to account for climate change. Ninety percent showed at least one type of flood risk that exceeded what the plant was designed to handle.
The greatest risk came from heavy rain and snowfall exceeding the design parameters at 53 plants.
Scott Burnell, a Nuclear Regulatory Commission spokesman, said in a statement, “The NRC continues to conclude, based on the staff’s review of detailed analyses, that all U.S. nuclear power plants can appropriately deal with potential flooding events, including the effects of climate change, and remain safe.”
On the coasts, several climate-related risks have converged to heighten the danger to roads and hillsides: Rising seas and higher storm surges have intensified coastal erosion, while more extreme bouts of precipitation have increased the landslide risk.
Add to that the effects of devastating wildfires, which can damage the vegetation holding hillside soil in place, and “things that wouldn’t have slid without the wildfires, start sliding,” said Jennifer M. Jacobs, a professor of civil and environmental engineering at the University of New Hampshire. “I think we’re going to see more of that.”
The United States depends on highways, railroads and bridges as economic arteries for commerce, travel and simply getting to work. But many of the country’s most important links face mounting climate threats. More than 60,000 miles of roads and bridges in coastal floodplains are already vulnerable to extreme storms and hurricanes, government estimates show. And inland flooding could also threaten at least 2,500 bridges across the country by 2050, a federal climate report warned in 2018.
Sometimes even small changes can trigger catastrophic failures. Engineers modeling the collapse of bridges over Escambia Bay in Florida during Hurricane Ivan in 2004 found that the extra three inches of sea-level rise since the bridge was built in 1968 very likely contributed to the collapse, because of the added height of the storm surge and force of the waves.
“A lot of our infrastructure systems have a tipping point. And when you hit the tipping point, that’s when a failure occurs,” Dr. Jacobs said. “And the tipping point could be an inch.”
Crucial rail networks are at risk, too. In 2017, Amtrak consultants found that along parts of the Northeast corridor, which runs from Boston to Washington and carries 12 million people a year, flooding and storm surge could erode the track bed, disable the signals and eventually put the tracks underwater.
And there is no easy fix. Elevating the tracks would require also raising bridges, electrical wires and lots of other infrastructure, and moving them would mean buying new land in a densely packed part of the country. So the report recommended flood barriers, costing $24 million per mile, that must be moved into place whenever floods threaten.
The blasts at the Arkema chemical plant in Crosby, Texas, came after flooding from Hurricane Harvey in 2017 knocked out the site’s electrical supply, shutting down refrigeration systems that kept volatile chemicals stable. Almost two dozen people, many of them emergency workers, were treated for exposure to the toxic fumes, and some 200 nearby residents were evacuated from their homes.
More than 2,500 facilities that handle toxic chemicals lie in federal flood-prone areas across the country, about 1,400 of them in areas at the highest risk of flooding, a New York Times analysis showed in 2018.
Leaks from toxic cleanup sites, left behind by past industry, pose another threat.
Almost two-thirds of some 1,500 superfund cleanup sites across the country are in areas with an elevated risk of flooding, storm surge, wildfires or sea level rise, a government audit warned in 2019. Coal ash, a toxic substance produced by coal power plants that is often stored as sludge in special ponds, has been particularly exposed. After Hurricane Florence in 2018, for example, a dam breach at the site of a power plant in Wilmington, N.C., released the hazardous ash into a nearby river.
“We should be evaluating whether these facilities or sites actually have to be moved or re-secured,” said Lisa Evans, senior counsel at Earthjustice, an environmental law organization. Places that “may have been OK in 1990,” she said, “may be a disaster waiting to happen in 2021.”
As many in Texas wake up still without power on Thursday morning, millions are also contending with water shutdowns, boil-water advisories and empty grocery shelves as cities struggle to keep infrastructure powered and supply chains are interrupted.
Even as some residents get their power restored, the problems continue to compound: The grocery stores that remain open are quickly selling out of food and supplies. As many without power watch their refrigerated food spoil, lines to get into stores wrap around blocks and buildings, and shelves sit completely empty with no indication of when new shipments will arrive. Food banks have had to cancel deliveries, and schools have halted meal distribution to students, the Texas Tribune reports.
People experiencing homelessness, including a disproportionate number of Black residents, have especially suffered in the record cold temperatures across the state. There have been some reports of people being found dead in the streets because of a lack of shelter.
“Businesses are shut down. Streets are empty, other than a few guys sliding around in 4x4s and fire trucks rushing to rescue people who turn their ovens on to keep warm and poison themselves with carbon monoxide,” wrote Austin resident Jeff Goodell in Rolling Stone. “Yesterday, the line at our neighborhood grocery store was three blocks long. People wandering around with handguns on their hip adds to a sense of lawlessness (Texas is an open-carry state).”
The Texas agricultural commissioner has said that farmers and ranchers are having to throw away millions of dollars worth of goods because of a lack of power. “We’re looking at a food supply chain problem like we’ve never seen before, even with COVID-19,” he told one local news affiliate.
An energy analyst likened the power crisis to the fallout of Hurricane Katrina as it’s becoming increasingly clear that the situation in Texas is a statewide disaster.
As natural gas output declined dramatically in the state, Paul Sankey, who leads the energy analysis firm Sankey Research, said on Bloomberg, “This situation to me is very reminiscent of Hurricane Katrina…. We have never seen a loss [of energy supply] at this scale” in mid-winter. This is “the biggest outage in the history [of] U.S. oil and gas,” Sankey said.
Experts say the power outages were caused in part by the deregulation of the state’s electric grid. The state government, some say, favored deregulatory measures, declining to require electrical equipment upgrades or proper weatherization, and relied instead on free-market mechanisms that ultimately contributed to the current disaster.
Former Gov. Rick Perry faced criticism on Wednesday when he said that Texans would rather face the current disaster than be regulated by the federal government. And he’s not the only Republican catching heat: many have begun calling for the resignation of Gov. Greg Abbott over a failure of leadership. On Wednesday, as millions suffered without power and under boil-water advisories, the governor went on Fox News to attack clean energy and the Green New Deal, even though experts say renewables were not a major contributor to the crisis.
After declaring a federal state of emergency for Texas over the weekend, the Biden administration announced on Wednesday that it would send generators and other supplies to the state.