Tag archive: Previsão (Forecasting)

The professionals who predict the future for a living (MIT Technology Review)

Everywhere from business to medicine to the climate, forecasting the future is a complex and absolutely critical job. So how do you do it—and what comes next?

Bobbie Johnson

February 26, 2020

Inez Fung

Professor of atmospheric science, University of California, Berkeley

Inez Fung
Leah Fasten

Prediction for 2030: We’ll light up the world… safely

I’ve spoken to people who want climate model information, but they’re not really sure what they’re asking me for. So I say to them, “Suppose I tell you that some event will happen with a probability of 60% in 2030. Will that be good enough for you, or will you need 70%? Or would you need 90%? What level of information do you want out of climate model projections in order to be useful?”

I joined Jim Hansen’s group in 1979, and I was there for all the early climate projections. And the way we thought about it then, those things are all still totally there. What we’ve done since then is add richness and higher resolution, but the projections are really grounded in the same kind of data, physics, and observations.

Still, there are things we’re missing. We still don’t have a real theory of precipitation, for example. But there are two exciting things happening there. One is the availability of satellite observations: looking at the clouds is still not fully utilized. The other is that there used to be no way to get regional precipitation patterns through history—and now there is. Scientists found these caves in China and elsewhere, and they go in, look for a nice little chamber with stalagmites, and then they chop them up and send them back to the lab, where they do fantastic uranium-thorium dating and measure oxygen isotopes in calcium carbonate. From there they can interpret a record of historic rainfall. The data are incredible: we have over half a million years of precipitation records all over Asia.

I don’t see us reducing fossil fuels by 2030. I don’t see us reducing CO2 or atmospheric methane. Some 1.2 billion people in the world right now have no access to electricity, so I’m looking forward to the growth in alternative energy going to parts of the world that have no electricity. That’s important because it’s education, health, everything associated with a Western standard of living. That’s where I’m putting my hopes.

Anne Lise Kjaer
Dvora Photography

Anne Lise Kjaer

Futurist, Kjaer Global, London

Prediction for 2030: Adults will learn to grasp new ideas

As a kid I wanted to become an archaeologist, and I did in a way. Archaeologists find artifacts from the past and try to connect the dots and tell a story about how the past might have been. We do the same thing as futurists; we use artifacts from the present and try to connect the dots into interesting narratives in the future.

When it comes to the future, you have two choices. You can sit back and think “It’s not happening to me” and build a great big wall to keep out all the bad news. Or you can build windmills and harness the winds of change.

A lot of companies come to us and think they want to hear about the future, but really it’s just an exercise for them—let’s just tick that box, do a report, and put it on our bookshelf.

So we have a little test for them. We do interviews, we ask them questions; then we use a model called a Trend Atlas that considers both the scientific dimensions of society and the social ones. We look at the trends in politics, economics, societal drivers, technology, environment, legislation—how does that fit with what we know currently? We look back maybe 10, 20 years: can we see a little bit of a trend and try to put that into the future?

What’s next? Obviously with technology we can educate much better than we could in the past. But it’s a huge opportunity to educate the parents of the next generation, not just the children. Kids are learning about sustainability goals, but what about the people who actually rule our world?

Philip Tetlock
Courtesy Photo

Philip Tetlock

Coauthor of Superforecasting and professor, University of Pennsylvania

Prediction for 2030: We’ll get better at being uncertain

At the Good Judgment Project, we try to track the accuracy of commentators and experts in domains in which it’s usually thought impossible to track accuracy. You take a big debate and break it down into a series of testable short-term indicators. So you could take a debate over whether strong forms of artificial intelligence are going to cause major dislocations in white-collar labor markets by 2035, 2040, 2050. A lot of discussion already occurs at that level of abstraction, but from our point of view, it’s more useful to break it down and to say: If we were on a long-term trajectory toward an outcome like that, what sorts of things would we expect to observe in the short term? So we started this off in 2015, and in 2016 AlphaGo defeated people in Go. But then other things didn’t happen: driverless Ubers weren’t picking people up for fares in any major American city at the end of 2017. Watson didn’t defeat the world’s best oncologists in a medical diagnosis tournament. So I don’t think we’re on a fast track toward the singularity, put it that way.

Forecasts have the potential to be either self-fulfilling or self-negating. Y2K was arguably a self-negating forecast. But it’s possible to build that into a forecasting tournament by asking conditional forecasting questions: i.e., how likely is X conditional on our doing this or doing that?

What I’ve seen over the last 10 years, and it’s a trend that I expect will continue, is an increasing openness to the quantification of uncertainty. I think there’s a grudging, halting, but cumulative movement toward thinking about uncertainty in more granular and nuanced ways that permit keeping score.
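
One standard way such forecasting tournaments keep score is the Brier score: the mean squared difference between the probabilities a forecaster assigned and what actually happened (0 is perfect; lower is better). A minimal sketch in Python, with made-up forecasts purely for illustration:

```python
# Brier score: mean squared difference between forecast probabilities and outcomes.
def brier_score(forecasts, outcomes):
    """forecasts: probabilities assigned to each event; outcomes: 1 if it happened, else 0."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical probabilities for four short-term indicator questions.
forecasts = [0.60, 0.90, 0.20, 0.75]
outcomes  = [1,    1,    0,    0]
print(brier_score(forecasts, outcomes))   # about 0.19

# A maximally hedged forecaster who always says 50% scores 0.25;
# consistently beating that is evidence of real forecasting skill.
print(brier_score([0.5] * 4, outcomes))   # 0.25
```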

Keith Chen
Ryan Young

Keith Chen

Associate professor of economics, UCLA

Prediction for 2030: We’ll be more—and less—private

When I worked on Uber’s surge pricing algorithm, the problem it was built to solve was very coarse: we were trying to convince drivers to put in extra time when they were most needed. There were predictable times—like New Year’s—when we knew we were going to need a lot of people. The deeper problem was that this was a system with basically no control. It’s like trying to predict the weather. Yes, the amount of weather data that we collect today—temperature, wind speed, barometric pressure, humidity data—is 10,000 times greater than what we were collecting 20 years ago. But we still can’t predict the weather 10,000 times further out than we could back then. And social movements—even in a very specific setting, such as where riders want to go at any given point in time—are, if anything, even more chaotic than weather systems.

These days what I’m doing is a little bit more like forensic economics. We look to see what we can find and predict from people’s movement patterns. We’re just using simple cell-phone data like geolocation, but even just from movement patterns, we can infer salient information and build a psychological dimension of you. What terrifies me is I feel like I have much worse data than Facebook does. So what are they able to understand with their much better information?

I think the next big social tipping point is people actually starting to really care about their privacy. It’ll be like smoking in a restaurant: it will quickly go from causing outrage when people want to stop it to suddenly causing outrage if somebody does it. But at the same time, by 2030 almost every Chinese citizen will be completely genotyped. I don’t quite know how to reconcile the two.

Annalee Newitz
Sarah Deragon

Annalee Newitz

Science fiction and nonfiction author, San Francisco

Prediction for 2030: We’re going to see a lot more humble technology

Every era has its own ideas about the future. Go back to the 1950s and you’ll see that people fantasized about flying cars. Now we imagine bicycles and green cities where cars are limited, or where cars are autonomous. We have really different priorities now, so that works its way into our understanding of the future.

Science fiction writers can’t actually make predictions. I think of science fiction as engaging with questions being raised in the present. But what we can do, even if we can’t say what’s definitely going to happen, is offer a range of scenarios informed by history.

There are a lot of myths about the future that people believe are going to come true right now. I think a lot of people—not just science fiction writers but people who are working on machine learning—believe that relatively soon we’re going to have a human-equivalent brain running on some kind of computing substrate. This is as much a reflection of our time as it is what might actually happen.

It seems unlikely that a human-equivalent brain in a computer is right around the corner. But we live in an era where a lot of us feel like we live inside computers already, for work and everything else. So of course we have fantasies about digitizing our brains and putting our consciousness inside a machine or a robot.

I’m not saying that those things could never happen. But they seem much more closely allied to our fantasies in the present than they do to a real technical breakthrough on the horizon.

We’re going to have to develop much better technologies around disaster relief and emergency response, because we’ll be seeing a lot more floods, fires, storms. So I think there is going to be a lot more work on really humble technologies that allow you to take your community off the grid, or purify your own water. And I don’t mean in a creepy survivalist way; I mean just in a this-is-how-we-are-living-now kind of way.

Finale Doshi-Velez
Noah Willman

Finale Doshi-Velez

Associate professor of computer science, Harvard

Prediction for 2030: Humans and machines will make decisions together

In my lab, we’re trying to answer questions like “How might this patient respond to this antidepressant?” or “How might this patient respond to this vasopressor?” So we get as much data as we can from the hospital. For a psychiatric patient, we might have everything about their heart disease, kidney disease, cancer; for a blood pressure management recommendation for the ICU, we have all their oxygen information, their lactate, and more.

Some of it might be relevant to making predictions about their illnesses, some not, and we don’t know which is which. That’s why we ask for the large data set with everything.

There’s been about a decade of work trying to get unsupervised machine-learning models to do a better job at making these predictions, and none worked really well. The breakthrough for us was when we found that all the previous approaches for doing this were wrong in the exact same way. Once we untangled all of this, we came up with a different method.

We also realized that even if our ability to predict what drug is going to work is not always that great, we can more reliably predict what drugs are not going to work, which is almost as valuable.

I’m excited about combining humans and AI to make predictions. Let’s say your AI is right only 70% of the time, and your human is also right only 70% of the time. Combining the two is difficult, but if you can fuse their successes, then you should be able to do better than either system alone. How to do that is a really tough, exciting question.
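
To see why fusing two imperfect predictors can pay off, here is a minimal simulation, not from Doshi-Velez's lab: it assumes the AI and the human make independent errors and expose confidence scores, and it shows that averaging those scores beats either predictor alone.

```python
# Two predictors that are each right about 70% of the time can beat 70% together,
# provided their errors are (at least partly) independent and they report confidences.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
y = rng.integers(0, 2, size=n)              # true binary outcome

sigma = 0.95                                 # noise tuned so each predictor is ~70% accurate
ai_score = y + rng.normal(0, sigma, n)       # "AI" confidence score
human_score = y + rng.normal(0, sigma, n)    # "human" confidence score, independent errors

ai_pred = ai_score > 0.5
human_pred = human_score > 0.5
fused_pred = (ai_score + human_score) / 2 > 0.5   # average confidences, then threshold

print("AI alone:   ", (ai_pred == y).mean())      # ~0.70
print("Human alone:", (human_pred == y).mean())   # ~0.70
print("Fused:      ", (fused_pred == y).mean())   # ~0.77, better than either alone
```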

All these predictive models were built and deployed and people didn’t think enough about potential biases. I’m hopeful that we’re going to have a future where these human-machine teams are making decisions that are better than either alone.

Abdoulaye Banire Diallo
Guillaume Simoneau

Abdoulaye Banire Diallo

Professor, director of the bioinformatics lab, University of Quebec at Montreal

Prediction for 2030: Machine-based forecasting will be regulated

When a farmer in Quebec decides whether to inseminate a cow or not, it might depend on the expectation of milk that will be produced every day for one year, two years, maybe three years after that. Farms have management systems that capture the data and the environment of the farm. I’m involved in projects that add a layer of genetic and genomic data to help with forecasting, so that decision makers like the farmer have a full picture when they’re thinking about replacing cows, improving management, resilience, and animal welfare.

With the emergence of machine learning and AI, what we’re showing is that we can help tackle problems in a way that hasn’t been done before. We are adapting it to the dairy sector, where we’ve shown that some decisions can be anticipated 18 months in advance just by forecasting based on the integration of this genomic data. I think in some areas such as plant health we have only achieved 10% or 20% of our capacity to improve certain models.

Until now AI and machine learning have been associated with domain expertise. It’s not a public-wide thing. But less than 10 years from now they will need to be regulated. I think there are a lot of challenges for scientists like me to try to make those techniques more explainable, more transparent, and more auditable.

This story was part of our March 2020 issue.

What AI still can’t do (MIT Technology Review)

Brian Bergstein

February 19, 2020

Machine-learning systems can be duped or confounded by situations they haven’t seen before. A self-driving car gets flummoxed by a scenario that a human driver could handle easily. An AI system laboriously trained to carry out one task (identifying cats, say) has to be taught all over again to do something else (identifying dogs). In the process, it’s liable to lose some of the expertise it had in the original task. Computer scientists call this problem “catastrophic forgetting.”

These shortcomings have something in common: they exist because AI systems don’t understand causation. They see that some events are associated with other events, but they don’t ascertain which things directly make other things happen. It’s as if you knew that the presence of clouds made rain likelier, but you didn’t know clouds caused rain.

Elias Bareinboim
Elias Bareinboim: AI systems are clueless when it comes to causation.

Understanding cause and effect is a big aspect of what we call common sense, and it’s an area in which AI systems today “are clueless,” says Elias Bareinboim. He should know: as the director of the new Causal Artificial Intelligence Lab at Columbia University, he’s at the forefront of efforts to fix this problem.

His idea is to infuse artificial-intelligence research with insights from the relatively new science of causality, a field shaped to a huge extent by Judea Pearl, a Turing Award–winning scholar who considers Bareinboim his protégé.

As Bareinboim and Pearl describe it, AI’s ability to spot correlations—e.g., that clouds make rain more likely—is merely the simplest level of causal reasoning. It’s good enough to have driven the boom in the AI technique known as deep learning over the past decade. Given a great deal of data about familiar situations, this method can lead to very good predictions. A computer can calculate the probability that a patient with certain symptoms has a certain disease, because it has learned just how often thousands or even millions of other people with the same symptoms had that disease.
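
A toy version of that simplest level, with invented counts: the "prediction" is just an observed frequency, and it says nothing about what causes what.

```python
# Prediction from frequencies alone: of 300 past patients with these symptoms,
# 240 turned out to have the disease. (Counts are invented for illustration.)
patients_with_symptoms = 300
of_those_with_disease = 240

p_disease_given_symptoms = of_those_with_disease / patients_with_symptoms
print(p_disease_given_symptoms)   # 0.8, a useful prediction with no causal claim attached
```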

But there’s a growing consensus that progress in AI will stall if computers don’t get better at wrestling with causation. If machines could grasp that certain things lead to other things, they wouldn’t have to learn everything anew all the time—they could take what they had learned in one domain and apply it to another. And if machines could use common sense we’d be able to put more trust in them to take actions on their own, knowing that they aren’t likely to make dumb errors.

Today’s AI has only a limited ability to infer what will result from a given action. In reinforcement learning, a technique that has allowed machines to master games like chess and Go, a system uses extensive trial and error to discern which moves will essentially cause it to win. But this approach doesn’t work in messier settings in the real world. It doesn’t even leave a machine with a general understanding of how it might play other games.

An even higher level of causal thinking would be the ability to reason about why things happened and ask “what if” questions. A patient dies while in a clinical trial; was it the fault of the experimental medicine or something else? School test scores are falling; what policy changes would most improve them? This kind of reasoning is far beyond the current capability of artificial intelligence.

Performing miracles

The dream of endowing computers with causal reasoning drew Bareinboim from Brazil to the United States in 2008, after he completed a master’s in computer science at the Federal University of Rio de Janeiro. He jumped at an opportunity to study under Judea Pearl, a computer scientist and statistician at UCLA. Pearl, 83, is a giant—the giant—of causal inference, and his career helps illustrate why it’s hard to create AI that understands causality.

Even well-trained scientists are apt to misinterpret correlations as signs of causation—or to err in the opposite direction, hesitating to call out causation even when it’s justified. In the 1950s, for example, a few prominent statisticians muddied the waters around whether tobacco caused cancer. They argued that without an experiment randomly assigning people to be smokers or nonsmokers, no one could rule out the possibility that some unknown—stress, perhaps, or some gene—caused people both to smoke and to get lung cancer.

Eventually, the fact that smoking causes cancer was definitively established, but it needn’t have taken so long. Since then, Pearl and other statisticians have devised a mathematical approach to identifying what facts would be required to support a causal claim. Pearl’s method shows that, given the prevalence of smoking and lung cancer, an independent factor causing both would be extremely unlikely.

Conversely, Pearl’s formulas also help identify when correlations can’t be used to determine causation. Bernhard Schölkopf, who researches causal AI techniques as a director at Germany’s Max Planck Institute for Intelligent Systems, points out that you can predict a country’s birth rate if you know its population of storks. That isn’t because storks deliver babies or because babies attract storks, but probably because economic development leads to more babies and more storks. Pearl has helped give statisticians and computer scientists ways of attacking such problems, Schölkopf says.
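
A small simulation of the stork example (the coefficients and noise are invented for illustration): a hidden development variable drives both storks and births, so the two correlate strongly, and the association largely vanishes once the confounder is accounted for.

```python
# Spurious correlation from a common cause, and what happens when you adjust for it.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
development = rng.normal(size=n)                  # hidden common cause
storks = 2.0 * development + rng.normal(size=n)   # more development -> more storks (in this toy model)
births = 1.5 * development + rng.normal(size=n)   # more development -> more births

print("raw correlation:", np.corrcoef(storks, births)[0, 1])   # clearly positive (~0.74)

# Regress the confounder out of both variables, then correlate the residuals.
storks_resid = storks - np.polyval(np.polyfit(development, storks, 1), development)
births_resid = births - np.polyval(np.polyfit(development, births, 1), development)
print("after adjusting for development:",
      np.corrcoef(storks_resid, births_resid)[0, 1])           # close to zero
```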

Judea Pearl
Judea Pearl: His theory of causal reasoning has transformed science.

Pearl’s work has also led to the development of causal Bayesian networks—software that sifts through large amounts of data to detect which variables appear to have the most influence on other variables. For example, GNS Healthcare, a company in Cambridge, Massachusetts, uses these techniques to advise researchers about experiments that look promising.

In one project, GNS worked with researchers who study multiple myeloma, a kind of blood cancer. The researchers wanted to know why some patients with the disease live longer than others after getting stem-cell transplants, a common form of treatment. The software churned through data with 30,000 variables and pointed to a few that seemed especially likely to be causal. Biostatisticians and experts in the disease zeroed in on one in particular: the level of a certain protein in patients’ bodies. Researchers could then run a targeted clinical trial to see whether patients with the protein did indeed benefit more from the treatment. “It’s way faster than poking here and there in the lab,” says GNS cofounder Iya Khalil.

Nonetheless, the improvements that Pearl and other scholars have achieved in causal theory haven’t yet made many inroads in deep learning, which identifies correlations without too much worry about causation. Bareinboim is working to take the next step: making computers more useful tools for human causal explorations.

One of his systems, which is still in beta, can help scientists determine whether they have sufficient data to answer a causal question. Richard McElreath, an anthropologist at the Max Planck Institute for Evolutionary Anthropology, is using the software to guide research into why humans go through menopause (we are the only apes that do).

The hypothesis is that the decline of fertility in older women benefited early human societies because women who put more effort into caring for grandchildren ultimately had more descendants. But what evidence might exist today to support the claim that children do better with grandparents around? Anthropologists can’t just compare the educational or medical outcomes of children who have lived with grandparents and those who haven’t. There are what statisticians call confounding factors: grandmothers might be likelier to live with grandchildren who need the most help. Bareinboim’s software can help McElreath discern which studies about kids who grew up with their grandparents are least riddled with confounding factors and could be valuable in answering his causal query. “It’s a huge step forward,” McElreath says.

The last mile

Bareinboim talks fast and often gestures with two hands in the air, as if he’s trying to balance two sides of a mental equation. It was halfway through the semester when I visited him at Columbia in October, but it seemed as if he had barely moved into his office—hardly anything on the walls, no books on the shelves, only a sleek Mac computer and a whiteboard so dense with equations and diagrams that it looked like a detail from a cartoon about a mad professor.

He shrugged off the provisional state of the room, saying he had been very busy giving talks about both sides of the causal revolution. Bareinboim believes work like his offers the opportunity not just to incorporate causal thinking into machines, but also to improve it in humans.

Getting people to think more carefully about causation isn’t necessarily much easier than teaching it to machines, he says. Researchers in a wide range of disciplines, from molecular biology to public policy, are sometimes content to unearth correlations that are not actually rooted in causal relationships. For instance, some studies suggest drinking alcohol will kill you early, while others indicate that moderate consumption is fine and even beneficial, and still other research has found that heavy drinkers outlive nondrinkers. This phenomenon, known as the “reproducibility crisis,” crops up not only in medicine and nutrition but also in psychology and economics. “You can see the fragility of all these inferences,” says Bareinboim. “We’re flipping results every couple of years.”

He argues that anyone asking “what if”—medical researchers setting up clinical trials, social scientists developing pilot programs, even web publishers preparing A/B tests—should start not merely by gathering data but by using Pearl’s causal logic and software like Bareinboim’s to determine whether the available data could possibly answer a causal hypothesis. Eventually, he envisions this leading to “automated scientist” software: a human could dream up a causal question to go after, and the software would combine causal inference theory with machine-learning techniques to rule out experiments that wouldn’t answer the question. That might save scientists from a huge number of costly dead ends.

Bareinboim described this vision while we were sitting in the lobby of MIT’s Sloan School of Management, after a talk he gave last fall. “We have a building here at MIT with, I don’t know, 200 people,” he said. How do those social scientists, or any scientists anywhere, decide which experiments to pursue and which data points to gather? By following their intuition: “They are trying to see where things will lead, based on their current understanding.”

That’s an inherently limited approach, he said, because human scientists designing an experiment can consider only a handful of variables in their minds at once. A computer, on the other hand, can see the interplay of hundreds or thousands of variables. Encoded with “the basic principles” of Pearl’s causal calculus and able to calculate what might happen with new sets of variables, an automated scientist could suggest exactly which experiments the human researchers should spend their time on. Maybe some public policy that has been shown to work only in Texas could be made to work in California if a few causally relevant factors were better appreciated. Scientists would no longer be “doing experiments in the darkness,” Bareinboim said.

He also doesn’t think it’s that far off: “This is the last mile before the victory.”

What if?

Finishing that mile will probably require techniques that are just beginning to be developed. For example, Yoshua Bengio, a computer scientist at the University of Montreal who shared the 2018 Turing Award for his work on deep learning, is trying to get neural networks—the software at the heart of deep learning—to do “meta-learning” and notice the causes of things.

As things stand now, if you wanted a neural network to detect when people are dancing, you’d show it many, many images of dancers. If you wanted it to identify when people are running, you’d show it many, many images of runners. The system would learn to distinguish runners from dancers by identifying features that tend to be different in the images, such as the positions of a person’s hands and arms. But Bengio points out that fundamental knowledge about the world can be gleaned by analyzing the things that are similar or “invariant” across data sets. Maybe a neural network could learn that movements of the legs physically cause both running and dancing. Maybe after seeing these examples and many others that show people only a few feet off the ground, a machine would eventually understand something about gravity and how it limits human movement. Over time, with enough meta-learning about variables that are consistent across data sets, a computer could gain causal knowledge that would be reusable in many domains.

For his part, Pearl says AI can’t be truly intelligent until it has a rich understanding of cause and effect. Although causal reasoning wouldn’t be sufficient for an artificial general intelligence, it’s necessary, he says, because it would enable the introspection that is at the core of cognition. “What if” questions “are the building blocks of science, of moral attitudes, of free will, of consciousness,” Pearl told me.

You can’t draw Pearl into predicting how long it will take for computers to get powerful causal reasoning abilities. “I am not a futurist,” he says. But in any case, he thinks the first move should be to develop machine-learning tools that combine data with available scientific knowledge: “We have a lot of knowledge that resides in the human skull which is not utilized.”

Brian Bergstein, a former editor at MIT Technology Review, is deputy opinion editor at the Boston Globe.

This story was part of our March 2020 issue.

Northeast Brazil's rain prophets embrace remote work (Folha de S.Paulo)

Douglas Gavras, April 10, 2021

Every year, the so-called rain prophets of the interior of Ceará gather to announce their predictions for the region's rainy season, the period that usually runs from January to June and is crucial to the livelihood of local smallholder farming families. With the pandemic of the novel coronavirus, however, even they have had to adopt "remote work."

The prophets are Northeasterners who claim to foresee the rains in the semiarid region by observing signs in nature, such as the flowering of certain plants, the behavior of birds, or the movement of insects. For some of them, for example, if the joão-de-barro (rufous hornero) is not in its nest, rain is on the way.

"The ant is the most intelligent animal there is," says farmer Titico Báia, 69, a prophet for 35 years. "They start cleaning out the anthill in the summer and throw the dry refuse away. Ninety days later, the rain comes. When the cururu toad gets ready to start singing, that is also a sign of water."

Báia is known for observing nature mainly on three days in September and three in October. The behavior of animals, the wind, and the moon on those days helps him predict what the rain will be like in the first half of the following year. "We live off the corn and bean harvest. And the regularity of the rain is what determines whether life will be easier or harder that year. In 2021, the only thing more eagerly awaited than the rain was the vaccine."

"At a time when we had no regular measurements in our region, these women and men were the farmers' compass. They would say the right time to plant and helped ensure a better harvest," says José Alves Neto, director of the Federal Institute of Ceará (IFCE) campus in Tauá (337 km from Fortaleza). He is one of the organizers of the rain prophets' meeting in the sertão dos Inhamuns region, held for the past five years. "Even today, the agencies that measure rainfall listen to what the prophets have to say."

With the Covid-19 pandemic, however, this group of nature observers, who recommend the best planting day and estimate what the rainy season will be like, could not meet, since most of them are over 60 and belong to the risk group.

To keep the tradition alive, the popular prophets adapted and held their first virtual meetings, with prophecies broadcast over the internet, more partnerships with local radio stations, and predictions exchanged with other farmers over WhatsApp. Technology, once seen as a rival to the popular tradition, now helps the nature observers stay active.

Farmer Totonho Alves, 64, is waiting for the vaccine before moving around the region again. He inherited his sensitivity for observing nature from his father and grandfather and began noticing the work of the ants and other signs 40 years ago. "My grandfather always said rain was coming when he saw the ants working harder than usual. He was the one who taught me that a flood was on the way when the owl sang by the edge of the creek."

Alves says the ants have rarely failed him. When they leave the lowlands and move to higher ground, it is a sign of rain. But if the anthill sits by the edge of the creeks, it is not a good sign. "I miss the in-person meeting, that bustle, being able to see friends again and exchange predictions. But with the pandemic, we do what we can," he says.

The oldest meeting in the state began in 1997, in the region of Quixadá, 164 km from Fortaleza. At the time, Hélder Cortez, an employee of the state water and sewage company (Cagece), realized that a growing number of people were interested in hearing the farmers who warned about the rains. Today there are 22 prophets in the municipality's region.

Last year, before the pandemic, 400 people gathered in an auditorium in Quixadá. "This year, over the internet, so many people took an interest in the prophets that several ideas came up to keep this culture from being lost. Once the pandemic is over, we intend to create a school to train prophets, in partnership with local institutions," says Cortez.

The online audience encouraged the prophets, and some municipalities want to keep the virtual meetings even after the vaccine. The event in the sertão dos Inhamuns, for example, which used to draw up to 90 people to school auditoriums in the region when it was held in person, already has more than 3,000 views on its official YouTube channel.

João Paulo Arcelino do Rêgo, director of the IFCE campus in Boa Viagem, in the Sertão de Canindé, who organized some of the recordings of the virtual prophecies, says that most of the prophets do not have regular internet access and end up not exchanging predictions throughout the year. The in-person meetings have always been the opportunity for that.

"This year, we took a film crew to each of their homes, made the videos, and the event grew in size. The most surprising thing is that the predictions matched, even though they had not talked to one another beforehand. It is no accident that many agronomists in the region call us to find out what the prophets have to say," he says.

"When they saw that the pandemic would drag on into 2021 as well and that the meetings would be affected, they were the first to demand a solution. These are people who make their living from the land and who are prophets by vocation, without earning anything for it. For them, passing on the predictions is the way to keep the tradition alive," says Neto.

Another hope is that technology will also help attract more young people interested in learning the techniques of observing nature, since passing this knowledge on to new generations is a constant concern of the group.

"Our intention is to pass the knowledge on to the next generations. My hope is one of my youngest sons, who is the one I take along to observe nature with me. When I am gone, he is the one who will carry the story forward," says Alves.

New study ties solar variability to the onset of decadal La Nina events (Science Daily)

[Linking solar activity to the onset of droughts in places like Northeast Brazil has historically been treated by mainstream meteorology as something that did not deserve attention. The El Niño Southern Oscillation – of which La Niña is part – was always presented as the main causal factor for droughts. This new study connects solar activity with La Niña. The interesting thing here is that many local farmers seen as knowledgeable about rains and drought in NE Brazil mention a 10-year period for the repetition of climate events. -RT]

Date: April 5, 2021

Source: National Center for Atmospheric Research/University Corporation for Atmospheric Research

Summary: A new study shows a correlation between the end of solar cycles and a switch from El Nino to La Nina conditions in the Pacific Ocean, suggesting that solar variability can drive seasonal weather variability on Earth.

A new study shows a correlation between the end of solar cycles and a switch from El Nino to La Nina conditions in the Pacific Ocean, suggesting that solar variability can drive seasonal weather variability on Earth.

If the connection outlined in the journal Earth and Space Science holds up, it could significantly improve the predictability of the largest El Nino and La Nina events, which have a number of seasonal climate effects over land. For example, the southern United States tends to be warmer and drier during a La Nina, while the northern U.S. tends to be colder and wetter.

“Energy from the Sun is the major driver of our entire Earth system and makes life on Earth possible,” said Scott McIntosh, a scientist at the National Center for Atmospheric Research (NCAR) and co-author of the paper. “Even so, the scientific community has been unclear on the role that solar variability plays in influencing weather and climate events here on Earth. This study shows there’s reason to believe it absolutely does and why the connection may have been missed in the past.”

The study was led by Robert Leamon at the University of Maryland-Baltimore County, and it is also co-authored by Daniel Marsh at NCAR. The research was funded by the National Science Foundation, which is NCAR’s sponsor, and the NASA Living With a Star program.

Applying a new solar clock

The appearance (and disappearance) of spots on the Sun — the outwardly visible signs of solar variability — has been observed by humans for hundreds of years. The waxing and waning of the number of sunspots takes place over approximately 11-year cycles, but these cycles do not have distinct beginnings and endings. This fuzziness in the length of any particular cycle has made it challenging for scientists to match up the 11-year cycle with changes happening on Earth.

In the new study, the researchers rely on a more precise 22-year “clock” for solar activity derived from the Sun’s magnetic polarity cycle, which they outlined as a more regular alternative to the 11-year solar cycle in several companion studies published recently in peer-reviewed journals.

The 22-year cycle begins when oppositely charged magnetic bands that wrap the Sun appear near the star’s polar latitudes, according to their recent studies. Over the cycle, these bands migrate toward the equator — causing sunspots to appear as they travel across the mid-latitudes. The cycle ends when the bands meet in the middle, mutually annihilating one another in what the research team calls a terminator event. These terminators provide precise guideposts for the end of one cycle and the beginning of the next.

The researchers superimposed these terminator events on records of sea surface temperatures in the tropical Pacific stretching back to 1960. They found that the five terminator events that occurred between that time and 2010-11 all coincided with a flip from an El Nino (when sea surface temperatures are warmer than average) to a La Nina (when the sea surface temperatures are cooler than average). The end of the most recent solar cycle — which is unfolding now — is also coinciding with the beginning of a La Nina event.

“We are not the first scientists to study how solar variability may drive changes to the Earth system,” Leamon said. “But we are the first to apply the 22-year solar clock. The result — five consecutive terminators lining up with a switch in the El Nino oscillation — is not likely to be a coincidence.”

In fact, the researchers did a number of statistical analyses to determine the likelihood that the correlation was just a fluke. They found there was only a 1 in 5,000 chance or less (depending on the statistical test) that all five terminator events included in the study would randomly coincide with the flip in ocean temperatures. Now that a sixth terminator event — and the corresponding start of a new solar cycle in 2020 — has also coincided with a La Nina event, the chance of a random occurrence is even more remote, the authors said.
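
The paper's statistical tests are more careful than this, but a rough Monte Carlo sketch conveys the idea: scatter five event dates at random across the study window and ask how often all of them land within a year of a flip. The flip dates below are placeholders chosen for illustration, not the values used in the study.

```python
# Null test: how often do 5 randomly timed events all coincide with a flip by chance?
import numpy as np

rng = np.random.default_rng(0)

flip_years = np.array([1964, 1973, 1988, 1998, 2007, 2010])  # placeholder flip dates
n_events = 5                # number of terminator events in the window studied
window = (1960.0, 2011.0)   # study period
tolerance = 1.0             # "coincides" = within one year of a flip

def all_coincide(event_years):
    return all(np.abs(flip_years - year).min() <= tolerance for year in event_years)

trials = 100_000
hits = sum(all_coincide(rng.uniform(*window, size=n_events)) for _ in range(trials))
print("chance probability of 5 out of 5 coincidences:", hits / trials)   # well under 1%
```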

The paper does not delve into what physical connection between the Sun and Earth could be responsible for the correlation, but the authors note that there are several possibilities that warrant further study, including the influence of the Sun’s magnetic field on the amount of cosmic rays that enter the solar system and ultimately bombard Earth. However, a robust physical link between cosmic ray variations and climate has yet to be determined.

“If further research can establish that there is a physical connection and that changes on the Sun are truly causing variability in the oceans, then we may be able to improve our ability to predict El Nino and La Nina events,” McIntosh said.

Story Source:

Materials provided by National Center for Atmospheric Research/University Corporation for Atmospheric Research. Original written by Laura Snider. Note: Content may be edited for style and length.

Journal Reference:

  1. Robert J. Leamon, Scott W. McIntosh, Daniel R. Marsh. Termination of Solar Cycles and Correlated Tropospheric Variability. Earth and Space Science, 2021; 8 (4) DOI: 10.1029/2020EA001223

Bill Gates and the problem with climate solutionism (MIT Technology Review)

Nature and space

Focusing on technological solutions to climate change looks like an attempt to sidestep the more challenging political obstacles.

By MIT Technology Review, April 6, 2021

In his new book How to Avoid a Climate Disaster, Bill Gates takes a technological approach to understanding the climate crisis. Gates starts with the 51 billion tons of greenhouse gases created each year. He breaks that pollution down into sectors according to their impact, moving from electricity, industry, and agriculture to transportation and buildings. Throughout, Gates proves adept at cutting through the complexity of the climate challenge, giving the reader useful heuristics for distinguishing the bigger technological problems (cement) from the smaller ones (aircraft).

Present at the Paris climate negotiations in 2015, Gates and dozens of wealthy individuals launched Breakthrough Energy, an interlinked venture capital fund and lobbying operation dedicated to driving research. Gates and his fellow investors argued that both the federal government and the private sector were underinvesting in energy innovation. Breakthrough aims to fill this gap, investing in everything from next-generation nuclear technology to plant-based meat that tastes like beef. The fund's first $1 billion round had some early successes, such as Impossible Foods, a maker of plant-based burgers. The fund announced a second round of the same size in January.

A parallel effort, an international agreement called Mission Innovation, says it has persuaded its members (the European Union's executive branch along with 24 countries including China, the US, India, and Brazil) to invest an additional $4.6 billion per year since 2015 in clean energy research and development.

These various initiatives are the through line of Gates's latest book, written from a techno-optimist perspective. "Everything I've learned about climate and technology makes me optimistic ... if we act fast enough, [we can] avoid a climate catastrophe," he writes in the opening pages.

As many have pointed out, much of the necessary technology already exists and much can be done now. While Gates does not dispute this, his book focuses on the technological challenges he believes still need to be overcome to achieve deeper decarbonization. He spends less time on the political hurdles, writing that he thinks "more like an engineer than a political scientist." Yet politics, with all its messiness, is the main impediment to progress on climate change. And engineers ought to understand how complex systems can have feedback loops that go awry.

Yes, Minister

Kim Stanley Robinson, on the other hand, does think like a political scientist. His latest novel, The Ministry for the Future (not yet translated into Portuguese), opens just a few years into the future, in 2025, when a massive heat wave strikes India, killing millions of people. The book's protagonist, Mary Murphy, runs a UN agency charged with representing the interests of future generations in an effort to unite the world's governments behind a climate solution. Throughout the book, intergenerational equity and various forms of distributive politics are in focus.

If you have seen the scenarios that the Intergovernmental Panel on Climate Change (IPCC) develops for the future, Robinson's book will feel familiar. His story probes the policies needed to solve the climate crisis, and he has certainly done his homework. Although it is an exercise in imagination, there are moments when the novel reads more like a graduate social science seminar than a work of escapist fiction. The climate refugees who are central to the story illustrate how the consequences of pollution hit the world's poorest people hardest. Yet the rich produce far more carbon.

Reading Gates after Robinson highlights the inextricable connection between inequality and climate change. Gates's efforts on climate are laudable. But when he tells us that the combined wealth of the people backing his investment fund is $170 billion, we are left a little puzzled that they have dedicated only $2 billion to climate solutions, less than 2% of their assets. That fact alone is an argument for taxing wealth: the climate crisis demands government action. It cannot be left to the whims of billionaires.

As billionaires go, Gates is arguably one of the good ones. He tells stories about how he uses his fortune to help the poor and the planet. The irony of writing a book about climate change while flying in a private jet and owning a 6,132-square-meter mansion is not lost on the reader, nor on Gates, who calls himself an "imperfect messenger on climate change." Still, he is unquestionably an ally of the climate movement.

But by focusing on technological innovation, Gates understates the role of fossil fuel interests in obstructing that progress. Oddly, climate denialism goes unmentioned in the book. Washing his hands of political polarization, Gates never draws the connection to his fellow billionaires Charles and David Koch, who made their fortune in petrochemicals and have played a prominent role in propagating climate denial.

For example, Gates marvels that for the vast majority of Americans, electric heaters are actually cheaper than continuing to use fossil fuels. To him, it is a puzzle that people do not adopt these cheaper, more sustainable options. But it is not. As journalists Rebecca Leber and Sammy Roth have reported in Mother Jones and the Los Angeles Times, the gas industry is funding advocates and running marketing campaigns to oppose electrification and keep people hooked on fossil fuels.

These opposing forces are more visible in Robinson's book than in Gates's. Gates would have benefited from drawing on the work that Naomi Oreskes, Eric Conway, Geoffrey Supran, and others have done to document fossil fuel companies' persistent efforts to sow public doubt about climate science.

One thing Gates and Robinson do have in common, however, is the view that geoengineering, meaning monumental interventions to combat the symptoms rather than the causes of climate change, may prove inevitable. In The Ministry for the Future, solar geoengineering, the spraying of fine particles into the atmosphere to reflect more of the sun's heat back into space, is used in the aftermath of the deadly heat wave that opens the story. Later, scientists travel to the poles and devise elaborate methods for removing meltwater from beneath glaciers to keep them from sliding into the sea. Despite some setbacks, they hold off several meters of sea level rise. One can imagine Gates appearing in the novel as an early funder of these efforts. As he notes in his own book, he has been funding research into solar geoengineering for years.

The worst part

The title of Elizabeth Kolbert's new book, Under a White Sky (not yet translated into Portuguese), is a reference to this nascent technology, since deploying it at scale could change the color of the sky from blue to white.

Kolbert notes that the first report on climate change landed on President Lyndon Johnson's desk in 1965. That report did not argue that we should cut carbon emissions by moving away from fossil fuels. Instead, it advocated changing the climate through solar geoengineering, although the term had not yet been coined. It is troubling that some would rush straight to these risky solutions rather than addressing the root causes of climate change.

Reading Under a White Sky, we are reminded of the ways such interventions can go wrong. For example, the scientist and writer Rachel Carson advocated importing non-native species as an alternative to using pesticides. In the year after her book Silent Spring was published in 1962, the US Fish and Wildlife Service brought Asian carp to America for the first time in order to control aquatic algae. This approach solved one problem but created another: the spread of the invasive species threatened native ones and caused environmental damage.

As Kolbert observes, her book is about "people trying to solve problems created by people trying to solve problems." Her account covers examples including ill-fated efforts to halt the spread of the carp, the pumping stations in New Orleans that accelerate the city's sinking, and attempts to selectively breed corals that can tolerate higher temperatures and ocean acidification. Kolbert has a sense of humor and a keen eye for unintended consequences. If you like your apocalypse with a dose of humor, she will make you laugh while Rome burns.

By contrast, although Gates is aware of the potential pitfalls of technological solutions, he still extols inventions such as plastic and fertilizer as vital. Tell that to the sea turtles swallowing plastic waste or to the fertilizer-driven algal blooms destroying the Gulf of Mexico's ecosystem.

With dangerous levels of carbon dioxide in the atmosphere, geoengineering may indeed prove necessary, but we should not be naive about the risks. Gates's book has many good ideas and is worth reading. But for a complete picture of the crisis we face, be sure to also read Robinson and Kolbert.

Cherry trees bloom earliest in Japan in 1,200 years (Folha de S.Paulo)

Kazuhiro Nogi – 24.mar.2021/AFP

São Paulo

The blossoming of the famous white and pink cherry trees draws thousands to Japan's streets and parks to watch the phenomenon, which lasts only a few days and has been revered for more than a thousand years. But this year the early bloom has worried scientists, because it points to the impact of climate change.

According to records from Osaka Prefecture University, in 2021 the famous white and pink cherry trees reached full bloom on March 26 in Kyoto, the earliest date in 12 centuries. The previous earliest blooms were recorded on March 27 in the years 1612, 1409, and 1236.

The institution was able to identify how early the phenomenon occurred because it has a complete database of bloom records across the centuries. The records begin in the year 812 and include documents from the imperial court in Kyoto, Japan's former capital, and medieval diaries.

Yasuyuki Aono, professor of environmental science at Osaka Prefecture University and the person responsible for compiling the database, told Reuters that the phenomenon usually occurs in April, but as temperatures rise, the bloom starts earlier.

Kazuhiro Nogi, 24.mar.2021/AFP

"Cherry blossoms are very sensitive to temperature. Flowering and full bloom can come earlier or later depending solely on the temperature. The temperature was low in the 1820s, but it has risen by about 3.5 degrees Celsius since then," he said.

According to him, this year's seasons in particular influenced the bloom dates. The winter was very cold, but spring came quickly and was exceptionally warm, so "the buds are completely awake after a sufficient rest."

In the capital, Tokyo, the cherry trees reached peak bloom on March 22, the second-earliest date on record. "As global temperatures rise, the last spring frosts are occurring earlier and flowering is occurring earlier," Lewis Ziska of Columbia University told CNN.

The Japan Meteorological Agency also tracks 58 "benchmark" cherry trees across the country. This year, 40 have already reached peak bloom and 14 did so in record time. The trees normally flower for about two weeks each year. "We can say it is most likely because of the impact of global warming," said Shunji Anbe, an official in the agency's observations division.

Data released in January by the World Meteorological Organization show that global temperatures in 2020 were among the highest ever recorded and rivaled 2016 as the hottest year on record.

Cherry blossoms have deep historical and cultural roots in Japan, heralding spring and inspiring artists and poets over the centuries. Their fragility is seen as a symbol of life, death, and rebirth.

Nowadays, people gather under the cherry blossoms every spring for hanami (flower-viewing) parties, strolling in parks, picnicking under the branches, and taking plenty of selfies. But this year, the cherry bloom came and went in the blink of an eye.

With the end of the state of emergency to contain the Covid-19 pandemic in all regions of Japan, many people flocked to popular viewing spots over the weekend, although the crowds were smaller than in normal years.

NOAA Acknowledges the New Reality of Hurricane Season (Gizmodo)

Molly Taft, March 2, 2021

This combination of satellite images provided by the National Hurricane Center shows 30 hurricanes that occurred during the 2020 Atlantic hurricane season.

We’re one step closer to officially moving up hurricane season. The National Hurricane Center announced Tuesday that it would formally start issuing its hurricane season tropical weather outlooks on May 15 this year, bumping it up from the traditional start of hurricane season on June 1. The move comes after a recent spate of early season storms have raked the Atlantic.

Atlantic hurricane season runs from June 1 to November 30. That’s when conditions are most conducive to storm formation owing to warm air and water temperatures. (The Pacific ocean has its own hurricane season, which covers the same timeframe, but since waters are colder fewer hurricanes tend to form there than in the Atlantic.)

Storms have begun forming in the Atlantic earlier as ocean and air temperatures have increased due to climate change. Last year, Hurricane Arthur roared to life off the East Coast on May 16. That storm made 2020 the sixth hurricane season in a row to have a storm that formed earlier than the June 1 official start date. While the National Oceanic and Atmospheric Administration won’t be moving up the start of the season just yet, the earlier outlooks address the recent history.

“In the last decade, there have been 10 storms formed in the weeks before the traditional start of the season, which is a big jump,” said Sean Sublette, a meteorologist at Climate Central, who pointed out that the 1960s through 2010s saw between one and three storms each decade before the June 1 start date on average.

It might be tempting to ascribe this earlier season entirely to climate change warming the Atlantic. But technology also has a role to play, with more observations along the coast as well as satellites that can spot storms far out to sea.

“I would caution that we can’t just go, ‘hah, the planet’s warming, we’ve had to move the entire season!’” Sublette said. “I don’t think there’s solid ground for attribution of how much of one there is over the other. Weather folks can sit around and debate that for awhile.”

Earlier storms don’t necessarily mean more harmful ones, either. In fact, hurricanes earlier in the season tend to be weaker than the monsters that form in August and September when hurricane season is at its peak. But regardless of their strength, these earlier storms have generated discussion inside the NHC on whether to move up the official start date for the season, when the agency usually puts out two reports per day on hurricane activity. Tuesday’s step is not an official announcement of this decision, but an acknowledgement of the increased attention on early hurricanes.

“I would say that [Tuesday’s announcement] is the National Hurricane Center being proactive,” Sublette said. “Like hey, we know that the last few years it’s been a little busier in May than we’ve seen in the past five decades, and we know there is an awareness now, so we’re going to start issuing these reports early.”

While the jury is still out on whether climate change is pushing the season earlier, research has shown that the strongest hurricanes are becoming more common, and that climate change is likely playing a role. A study published last year found the odds of a storm becoming a major hurricane—those Category 3 or stronger—have increased 49% in the basin since satellite monitoring began in earnest four decades ago. And when storms make landfall, sea level rise allows them to do more damage. So whether or not climate change is pushing Atlantic hurricane season earlier, the risks are increasing. Now, at least, we’ll have better warnings before early storms do hit.

5 Pandemic Mistakes We Keep Repeating (The Atlantic)

Zeynep Tufekci

February 26, 2021

We can learn from our failures.
Photo illustration showing a Trump press conference, a vaccine syringe, and Anthony Fauci
Alex Wong / Chet Strange/ Sarah Silbiger / Bloomberg / Getty / The Atlantic

When the polio vaccine was declared safe and effective, the news was met with jubilant celebration. Church bells rang across the nation, and factories blew their whistles. “Polio routed!” newspaper headlines exclaimed. “An historic victory,” “monumental,” “sensational,” newscasters declared. People erupted with joy across the United States. Some danced in the streets; others wept. Kids were sent home from school to celebrate.

One might have expected the initial approval of the coronavirus vaccines to spark similar jubilation—especially after a brutal pandemic year. But that didn’t happen. Instead, the steady drumbeat of good news about the vaccines has been met with a chorus of relentless pessimism.

The problem is not that the good news isn’t being reported, or that we should throw caution to the wind just yet. It’s that neither the reporting nor the public-health messaging has reflected the truly amazing reality of these vaccines. There is nothing wrong with realism and caution, but effective communication requires a sense of proportion—distinguishing between due alarm and alarmism; warranted, measured caution and doombait; worst-case scenarios and claims of impending catastrophe. We need to be able to celebrate profoundly positive news while noting the work that still lies ahead. However, instead of balanced optimism since the launch of the vaccines, the public has been offered a lot of misguided fretting over new virus variants, subjected to misleading debates about the inferiority of certain vaccines, and presented with long lists of things vaccinated people still cannot do, while media outlets wonder whether the pandemic will ever end.

This pessimism is sapping people of energy to get through the winter, and the rest of this pandemic. Anti-vaccination groups and those opposing the current public-health measures have been vigorously amplifying the pessimistic messages—especially the idea that getting vaccinated doesn’t mean being able to do more—telling their audiences that there is no point in compliance, or in eventual vaccination, because it will not lead to any positive changes. They are using the moment and the messaging to deepen mistrust of public-health authorities, accusing them of moving the goalposts and implying that we’re being conned. Either the vaccines aren’t as good as claimed, they suggest, or the real goal of pandemic-safety measures is to control the public, not the virus.

Five key fallacies and pitfalls have affected public-health messaging, as well as media coverage, and have played an outsize role in derailing an effective pandemic response. These problems were deepened by the ways that we—the public—developed to cope with a dreadful situation under great uncertainty. And now, even as vaccines offer brilliant hope, and even though, at least in the United States, we no longer have to deal with the problem of a misinformer in chief, some officials and media outlets are repeating many of the same mistakes in handling the vaccine rollout.

The pandemic has given us an unwelcome societal stress test, revealing the cracks and weaknesses in our institutions and our systems. Some of these are common to many contemporary problems, including political dysfunction and the way our public sphere operates. Others are more particular, though not exclusive, to the current challenge—including a gap between how academic research operates and how the public understands that research, and the ways in which the psychology of coping with the pandemic have distorted our response to it.

Recognizing all these dynamics is important, not only for seeing us through this pandemic—yes, it is going to end—but also to understand how our society functions, and how it fails. We need to start shoring up our defenses, not just against future pandemics but against all the myriad challenges we face—political, environmental, societal, and technological. None of these problems is impossible to remedy, but first we have to acknowledge them and start working to fix them—and we’re running out of time.

The past 12 months were incredibly challenging for almost everyone. Public-health officials were fighting a devastating pandemic and, at least in this country, an administration hell-bent on undermining them. The World Health Organization was not structured or funded for independence or agility, but still worked hard to contain the disease. Many researchers and experts noted the absence of timely and trustworthy guidelines from authorities, and tried to fill the void by communicating their findings directly to the public on social media. Reporters tried to keep the public informed under time and knowledge constraints, which were made more severe by the worsening media landscape. And the rest of us were trying to survive as best we could, looking for guidance where we could, and sharing information when we could, but always under difficult, murky conditions.

Despite all these good intentions, much of the public-health messaging has been profoundly counterproductive. In five specific ways, the assumptions made by public officials, the choices made by traditional media, the way our digital public sphere operates, and communication patterns between academic communities and the public proved flawed.

Risk Compensation

One of the most important problems undermining the pandemic response has been the mistrust and paternalism that some public-health agencies and experts have exhibited toward the public. A key reason for this stance seems to be that some experts feared that people would respond to something that increased their safety—such as masks, rapid tests, or vaccines—by behaving recklessly. They worried that a heightened sense of safety would lead members of the public to take risks that would not just undermine any gains, but reverse them.

The theory that things that improve our safety might provide a false sense of security and lead to reckless behavior is attractive—it’s contrarian and clever, and fits the “here’s something surprising we smart folks thought about” mold that appeals to, well, people who think of themselves as smart. Unsurprisingly, such fears have greeted efforts to persuade the public to adopt almost every advance in safety, including seat belts, helmets, and condoms.

But time and again, the numbers tell a different story: Even if safety improvements cause a few people to behave recklessly, the benefits overwhelm the ill effects. In any case, most people are already interested in staying safe from a dangerous pathogen. Further, even at the beginning of the pandemic, sociological theory predicted that wearing masks would be associated with increased adherence to other precautionary measures—people interested in staying safe are interested in staying safe—and empirical research quickly confirmed exactly that. Unfortunately, though, the theory of risk compensation—and its implicit assumptions—continue to haunt our approach, in part because there hasn’t been a reckoning with the initial missteps.

Rules in Place of Mechanisms and Intuitions

Much of the public messaging focused on offering a series of clear rules to ordinary people, instead of explaining in detail the mechanisms of viral transmission for this pathogen. A focus on explaining transmission mechanisms, and updating our understanding over time, would have helped empower people to make informed calculations about risk in different settings. Instead, both the CDC and the WHO chose to offer fixed guidelines that lent a false sense of precision.

In the United States, the public was initially told that “close contact” meant coming within six feet of an infected individual, for 15 minutes or more. This messaging led to ridiculous gaming of the rules; some establishments moved people around at the 14th minute to avoid passing the threshold. It also led to situations in which people working indoors with others, but just outside the cutoff of six feet, felt that they could take their mask off. None of this made any practical sense. What happened at minute 16? Was seven feet okay? Faux precision isn’t more informative; it’s misleading.

All of this was complicated by the fact that key public-health agencies like the CDC and the WHO were late to acknowledge the importance of some key infection mechanisms, such as aerosol transmission. Even when they did so, the shift happened without a proportional change in the guidelines or the messaging—it was easy for the general public to miss its significance.

Frustrated by the lack of public communication from health authorities, I wrote an article last July on what we then knew about the transmission of this pathogen—including how it could be spread via aerosols that can float and accumulate, especially in poorly ventilated indoor spaces. To this day, I’m contacted by people who describe workplaces that are following the formal guidelines, but in ways that defy reason: They’ve installed plexiglass, but barred workers from opening their windows; they’ve mandated masks, but only when workers are within six feet of one another, while permitting them to be taken off indoors during breaks.

Perhaps worst of all, our messaging and guidelines elided the difference between outdoor and indoor spaces, where, given the importance of aerosol transmission, the same precautions should not apply. This is especially important because this pathogen is overdispersed: Much of the spread is driven by a few people infecting many others at once, while most people do not transmit the virus at all.

After I wrote an article explaining how overdispersion and super-spreading were driving the pandemic, I discovered that this mechanism had also been poorly explained. I was inundated by messages from people, including elected officials around the world, saying they had no idea that this was the case. None of it was secret—numerous academic papers and articles had been written about it—but it had not been integrated into our messaging or our guidelines despite its great importance.

Crucially, super-spreading isn’t equally distributed; poorly ventilated indoor spaces can facilitate the spread of the virus over longer distances, and in shorter periods of time, than the guidelines suggested, and help fuel the pandemic.

Outdoors? It’s the opposite.

There is a solid scientific reason for the fact that there are relatively few documented cases of transmission outdoors, even after a year of epidemiological work: The open air dilutes the virus very quickly, and the sun helps deactivate it, providing further protection. And super-spreading—the biggest driver of the pandemic—appears to be an exclusively indoor phenomenon. I’ve been tracking every report I can find for the past year, and have yet to find a confirmed super-spreading event that occurred solely outdoors. Such events might well have taken place, but if the risk were great enough to justify altering our lives, I would expect at least a few to have been documented by now.

And yet our guidelines do not reflect these differences, and our messaging has not helped people understand these facts so that they can make better choices. I published my first article pleading for parks to be kept open on April 7, 2020—but outdoor activities are still banned by some authorities today, a full year after this dreaded virus began to spread globally.

We’d have been much better off if we gave people a realistic intuition about this virus’s transmission mechanisms. Our public guidelines should have been more like Japan’s, which emphasize avoiding the three C’s—closed spaces, crowded places, and close contact—that are driving the pandemic.

Scolding and Shaming

Throughout the past year, traditional and social media have been caught up in a cycle of shaming—made worse by being so unscientific and misguided. How dare you go to the beach? newspapers have scolded us for months, despite lacking evidence that this posed any significant threat to public health. It wasn’t just talk: Many cities closed parks and outdoor recreational spaces, even as they kept open indoor dining and gyms. Just this month, UC Berkeley and the University of Massachusetts at Amherst both banned students from taking even solitary walks outdoors.

Even when authorities relax the rules a bit, they do not always follow through in a sensible manner. In the United Kingdom, after some locales finally started allowing children to play on playgrounds—something that was already way overdue—they quickly ruled that parents must not socialize while their kids have a normal moment. Why not? Who knows?

On social media, meanwhile, pictures of people outdoors without masks draw reprimands, insults, and confident predictions of super-spreading—and yet few note when super-spreading fails to follow.

While visible but low-risk activities attract the scolds, other actual risks—in workplaces and crowded households, exacerbated by the lack of testing or paid sick leave—are not as easily accessible to photographers. Stefan Baral, an associate epidemiology professor at the Johns Hopkins Bloomberg School of Public Health, says that it’s almost as if we’ve “designed a public-health response most suitable for higher-income” groups and the “Twitter generation”—stay home; have your groceries delivered; focus on the behaviors you can photograph and shame online—rather than provide the support and conditions necessary for more people to keep themselves safe.

And the viral videos shaming people for failing to take sensible precautions, such as wearing masks indoors, do not necessarily help. For one thing, fretting over the occasional person throwing a tantrum while going unmasked in a supermarket distorts the reality: Most of the public has been complying with mask wearing. Worse, shaming is often an ineffective way of getting people to change their behavior, and it entrenches polarization and discourages disclosure, making it harder to fight the virus. Instead, we should be emphasizing safer behavior and stressing how many people are doing their part, while encouraging others to do the same.

Harm Reduction

Amidst all the mistrust and the scolding, a crucial public-health concept fell by the wayside. Harm reduction is the recognition that if there is an unmet and yet crucial human need, we cannot simply wish it away; we need to advise people on how to do what they seek to do more safely. Risk can never be completely eliminated; life requires more than futile attempts to bring risk down to zero. Pretending we can will away complexities and trade-offs with absolutism is counterproductive. Consider abstinence-only education: Not letting teenagers know about ways to have safer sex results in more of them having sex with no protections.

As Julia Marcus, an epidemiologist and associate professor at Harvard Medical School, told me, “When officials assume that risks can be easily eliminated, they might neglect the other things that matter to people: staying fed and housed, being close to loved ones, or just enjoying their lives. Public health works best when it helps people find safer ways to get what they need and want.”

Another problem with absolutism is the “abstinence violation” effect, Joshua Barocas, an assistant professor at the Boston University School of Medicine and Infectious Diseases, told me. When we set perfection as the only option, it can cause people who fall short of that standard in one small, particular way to decide that they’ve already failed, and might as well give up entirely. Most people who have attempted a diet or a new exercise regimen are familiar with this psychological state. The better approach is encouraging risk reduction and layered mitigation—emphasizing that every little bit helps—while also recognizing that a risk-free life is neither possible nor desirable.

Socializing is not a luxury—kids need to play with one another, and adults need to interact. “Your kids can play together outdoors, and outdoor time is the best chance to catch up with your neighbors” is not just a sensible message; it’s a way to decrease transmission risks. Some kids will play and some adults will socialize no matter what the scolds say or public-health officials decree, and they’ll do it indoors, out of sight of the scolding.

And if they don’t? Then kids will be deprived of an essential activity, and adults will be deprived of human companionship. Socializing is perhaps the most important predictor of health and longevity, after not smoking and perhaps exercise and a healthy diet. We need to help people socialize more safely, not encourage them to stop socializing entirely.

The Balance Between Knowledge and Action

Last but not least, the pandemic response has been distorted by a poor balance between knowledge, risk, certainty, and action.

Sometimes, public-health authorities insisted that we did not know enough to act, when the preponderance of evidence already justified precautionary action. Wearing masks, for example, posed few downsides, and held the prospect of mitigating the exponential threat we faced. The wait for certainty hampered our response to airborne transmission, even though there was almost no evidence for—and increasing evidence against—the importance of fomites, or objects that can carry infection. And yet, we emphasized the risk of surface transmission while refusing to properly address the risk of airborne transmission, despite increasing evidence. The difference lay not in the level of evidence and scientific support for either theory—which, if anything, quickly tilted in favor of airborne transmission, and not fomites, being crucial—but in the fact that fomite transmission had been a key part of the medical canon, and airborne transmission had not.

Sometimes, experts and the public discussion failed to emphasize that we were balancing risks, as in the recurring cycles of debate over lockdowns or school openings. We should have done more to acknowledge that there were no good options, only trade-offs between different downsides. As a result, instead of recognizing the difficulty of the situation, too many people accused those on the other side of being callous and uncaring.

And sometimes, the way that academics communicate clashed with how the public constructs knowledge. In academia, publishing is the coin of the realm, and it is often done through rejecting the null hypothesis—meaning that many papers do not seek to prove something conclusively, but instead, to reject the possibility that a variable has no relationship with the effect they are measuring (beyond chance). If that sounds convoluted, it is—there are historical reasons for this methodology and big arguments within academia about its merits, but for the moment, this remains standard practice.

At crucial points during the pandemic, though, this resulted in mistranslations and fueled misunderstandings, which were further muddled by differing stances toward prior scientific knowledge and theory. Yes, we faced a novel coronavirus, but we should have started by assuming that we could make some reasonable projections from prior knowledge, while looking out for anything that might prove different. That prior experience should have made us mindful of seasonality, the key role of overdispersion, and aerosol transmission. A keen eye for what was different from the past would have alerted us earlier to the importance of presymptomatic transmission.

Thus, on January 14, 2020, the WHO stated that there was “no clear evidence of human-to-human transmission.” It should have said, “There is increasing likelihood that human-to-human transmission is taking place, but we haven’t yet proven this, because we have no access to Wuhan, China.” (Cases were already popping up around the world at that point.) Acting as if there was human-to-human transmission during the early weeks of the pandemic would have been wise and preventive.

Later that spring, WHO officials stated that there was “currently no evidence that people who have recovered from COVID-19 and have antibodies are protected from a second infection,” producing many articles laden with panic and despair. Instead, it should have said: “We expect the immune system to function against this virus, and to provide some immunity for some period of time, but it is still hard to know specifics because it is so early.”

Similarly, since the vaccines were announced, too many statements have emphasized that we don’t yet know if vaccines prevent transmission. Instead, public-health authorities should have said that we have many reasons to expect, and increasing amounts of data to suggest, that vaccines will blunt infectiousness, but that we’re waiting for additional data to be more precise about it. That’s been unfortunate, because while many, many things have gone wrong during this pandemic, the vaccines are one thing that has gone very, very right.

As late as April 2020, Anthony Fauci was slammed as too optimistic for suggesting we might plausibly have vaccines in a year to 18 months. We had vaccines much, much sooner than that: The first two vaccine trials concluded a mere eight months after the WHO declared a pandemic in March 2020.

Moreover, they have delivered spectacular results. In June 2020, the FDA said a vaccine that was merely 50 percent efficacious in preventing symptomatic COVID-19 would receive emergency approval—that such a benefit would be sufficient to justify shipping it out immediately. Just a few months after that, the trials of the Moderna and Pfizer vaccines concluded by reporting not just a stunning 95 percent efficacy, but also a complete elimination of hospitalization or death among the vaccinated. Even severe disease was practically gone: The lone case classified as “severe” among 30,000 vaccinated individuals in the trials was so mild that the patient needed no medical care, and her case would not have been considered severe if her oxygen saturation had been a single percent higher.

These are exhilarating developments, because global, widespread, and rapid vaccination is our way out of this pandemic. Vaccines that drastically reduce hospitalizations and deaths, and that diminish even severe disease to a rare event, are the closest things we have had in this pandemic to a miracle—though of course they are the product of scientific research, creativity, and hard work. They are going to be the panacea and the endgame.

And yet, two months into an accelerating vaccination campaign in the United States, it would be hard to blame people if they missed the news that things are getting better.

Yes, there are new variants of the virus, which may eventually require booster shots, but at least so far, the existing vaccines are standing up to them well—very, very well. Manufacturers are already working on new vaccines or variant-focused booster versions, in case they prove necessary, and the authorizing agencies are ready for a quick turnaround if and when updates are needed. Reports from places that have vaccinated large numbers of individuals, and even trials in places where variants are widespread, are exceedingly encouraging, with dramatic reductions in cases and, crucially, hospitalizations and deaths among the vaccinated. Global equity and access to vaccines remain crucial concerns, but the supply is increasing.

Here in the United States, despite the rocky rollout and the need to smooth access and ensure equity, it’s become clear that toward the end of spring 2021, supply will be more than sufficient. It may sound hard to believe today, as many who are desperate for vaccinations await their turn, but in the near future, we may have to discuss what to do with excess doses.

So why isn’t this story more widely appreciated?

Part of the problem with the vaccines was the timing—the trials concluded immediately after the U.S. election, and their results got overshadowed in the weeks of political turmoil. The first, modest headline announcing the Pfizer-BioNTech results in The New York Times was a single column, “Vaccine Is Over 90% Effective, Pfizer’s Early Data Says,” below a banner headline spanning the page: “BIDEN CALLS FOR UNITED FRONT AS VIRUS RAGES.” That was both understandable—the nation was weary—and a loss for the public.

Just a few days later, Moderna reported a similar 94.5 percent efficacy. If anything, that provided even more cause for celebration, because it confirmed that the stunning numbers coming out of Pfizer weren’t a fluke. But, still amid the political turmoil, the Moderna report got a mere two columns on The New York Times’ front page with an equally modest headline: “Another Vaccine Appears to Work Against the Virus.”

So we didn’t get our initial vaccine jubilation.

But as soon as we began vaccinating people, articles started warning the newly vaccinated about all they could not do. “COVID-19 Vaccine Doesn’t Mean You Can Party Like It’s 1999,” one headline admonished. And the buzzkill has continued right up to the present. “You’re fully vaccinated against the coronavirus—now what? Don’t expect to shed your mask and get back to normal activities right away,” began a recent Associated Press story.

People might well want to party after being vaccinated. Those shots will expand what we can do, first in our private lives and among other vaccinated people, and then, gradually, in our public lives as well. But once again, the authorities and the media seem more worried about potentially reckless behavior among the vaccinated, and about telling them what not to do, than with providing nuanced guidance reflecting trade-offs, uncertainty, and a recognition that vaccination can change behavior. No guideline can cover every situation, but careful, accurate, and updated information can empower everyone.

Take the messaging and public conversation around transmission risks from vaccinated people. It is, of course, important to be alert to such considerations: Many vaccines are “leaky” in that they prevent disease or severe disease, but not infection and transmission. In fact, completely blocking all infection—what’s often called “sterilizing immunity”—is a difficult goal, and something even many highly effective vaccines don’t attain, but that doesn’t stop them from being extremely useful.

As Paul Sax, an infectious-disease doctor at Boston’s Brigham & Women’s Hospital, put it in early December, it would be enormously surprising “if these highly effective vaccines didn’t also make people less likely to transmit.” From multiple studies, we already knew that asymptomatic individuals—those who never developed COVID-19 despite being infected—were much less likely to transmit the virus. The vaccine trials were reporting 95 percent reductions in any form of symptomatic disease. In December, we learned that Moderna had swabbed some portion of trial participants to detect asymptomatic, silent infections, and found an almost two-thirds reduction even in such cases. The good news kept pouring in. Multiple studies found that, even in those few cases where breakthrough disease occurred in vaccinated people, their viral loads were lower—which correlates with lower rates of transmission. Data from vaccinated populations further confirmed what many experts expected all along: Of course these vaccines reduce transmission.

And yet, from the beginning, a good chunk of the public-facing messaging and news articles implied or claimed that vaccines won’t protect you against infecting other people or that we didn’t know if they would, when both were false. I found myself trying to convince people in my own social network that vaccines weren’t useless against transmission, and being bombarded on social media with claims that they were.

What went wrong? The same thing that’s going wrong right now with the reporting on whether vaccines will protect recipients against the new viral variants. Some outlets emphasize the worst or misinterpret the research. Some public-health officials are wary of encouraging the relaxation of any precautions. Some prominent experts on social media—even those with seemingly solid credentials—tend to respond to everything with alarm and sirens. So the message that got heard was that vaccines will not prevent transmission, or that they won’t work against new variants, or that we don’t know if they will. What the public needs to hear, though, is that based on existing data, we expect them to work fairly well—but we’ll learn more about precisely how effective they’ll be over time, and that tweaks may make them even better.

A year into the pandemic, we’re still repeating the same mistakes.

The top-down messaging is not the only problem. The scolding, the strictness, the inability to discuss trade-offs, and the accusations of not caring about people dying not only have an enthusiastic audience, but portions of the public engage in these behaviors themselves. Maybe that’s partly because proclaiming the importance of individual actions makes us feel as if we are in the driver’s seat, despite all the uncertainty.

Psychologists talk about the “locus of control”—the strength of belief in control over your own destiny. They distinguish between people with more of an internal-control orientation—who believe that they are the primary actors—and those with an external one, who believe that society, fate, and other factors beyond their control greatly influence what happens to them. This focus on individual control goes along with something called the “fundamental attribution error”—when bad things happen to other people, we’re more likely to believe that they are personally at fault, but when they happen to us, we are more likely to blame the situation and circumstances beyond our control.

An individualistic locus of control is forged in the U.S. mythos—that we are a nation of strivers and people who pull ourselves up by our bootstraps. An internal-control orientation isn’t necessarily negative; it can facilitate resilience, rather than fatalism, by shifting the focus to what we can do as individuals even as things fall apart around us. This orientation seems to be common among children who not only survive but sometimes thrive in terrible situations—they take charge and have a go at it, and with some luck, pull through. It is probably even more attractive to educated, well-off people who feel that they have succeeded through their own actions.

You can see the attraction of an individualized, internal locus of control in a pandemic, as a pathogen without a cure spreads globally, interrupts our lives, makes us sick, and could prove fatal.

There have been very few things we could do at an individual level to reduce our risk beyond wearing masks, distancing, and disinfecting. The desire to exercise personal control against an invisible, pervasive enemy is likely why we’ve continued to emphasize scrubbing and cleaning surfaces, in what’s appropriately called “hygiene theater,” long after it became clear that fomites were not a key driver of the pandemic. Obsessive cleaning gave us something to do, and we weren’t about to give it up, even if it turned out to be useless. No wonder there was so much focus on telling others to stay home—even though it’s not a choice available to those who cannot work remotely—and so much scolding of those who dared to socialize or enjoy a moment outdoors.

And perhaps it was too much to expect a nation unwilling to release its tight grip on the bottle of bleach to greet the arrival of vaccines—however spectacular—by imagining the day we might start to let go of our masks.

The focus on individual actions has had its upsides, but it has also led to a sizable portion of pandemic victims being erased from public conversation. If our own actions drive everything, then some other individuals must be to blame when things go wrong for them. And throughout this pandemic, the mantra many of us kept repeating—“Wear a mask, stay home; wear a mask, stay home”—hid many of the real victims.

Study after study, in country after country, confirms that this disease has disproportionately hit the poor and minority groups, along with the elderly, who are particularly vulnerable to severe disease. Even among the elderly, though, those who are wealthier and enjoy greater access to health care have fared better.

The poor and minority groups are dying in disproportionately large numbers for the same reasons that they suffer from many other diseases: a lifetime of disadvantages, lack of access to health care, inferior working conditions, unsafe housing, and limited financial resources.

Many lacked the option of staying home precisely because they were working hard to enable others to do what they could not, by packing boxes, delivering groceries, producing food. And even those who could stay home faced other problems born of inequality: Crowded housing is associated with higher rates of COVID-19 infection and worse outcomes, likely because many of the essential workers who live in such housing bring the virus home to elderly relatives.

Individual responsibility certainly had a large role to play in fighting the pandemic, but many victims had little choice in what happened to them. By disproportionately focusing on individual choices, not only did we hide the real problem, but we failed to do more to provide safe working and living conditions for everyone.

For example, there has been a lot of consternation about indoor dining, an activity I certainly wouldn’t recommend. But even takeout and delivery can impose a terrible cost: One study of California found that line cooks are the highest-risk occupation for dying of COVID-19. Unless we provide restaurants with funds so they can stay closed, or provide restaurant workers with high-filtration masks, better ventilation, paid sick leave, frequent rapid testing, and other protections so that they can safely work, getting food to go can simply shift the risk to the most vulnerable. Unsafe workplaces may be low on our agenda, but they do pose a real danger. Bill Hanage, associate professor of epidemiology at Harvard, pointed me to a paper he co-authored: Workplace-safety complaints to OSHA—which oversees occupational-safety regulations—during the pandemic were predictive of increases in deaths 16 days later.

New data highlight the terrible toll of inequality: Life expectancy has decreased dramatically over the past year, with Black people losing the most from this disease, followed by members of the Hispanic community. Minorities are also more likely to die of COVID-19 at a younger age. But when the new CDC director, Rochelle Walensky, noted this terrible statistic, she immediately followed up by urging people to “continue to use proven prevention steps to slow the spread—wear a well-fitting mask, stay 6 ft away from those you do not live with, avoid crowds and poorly ventilated places, and wash hands often.”

Those recommendations aren’t wrong, but they are incomplete. None of these individual acts do enough to protect those to whom such choices aren’t available—and the CDC has yet to issue sufficient guidelines for workplace ventilation or to make higher-filtration masks mandatory, or even available, for essential workers. Nor are these proscriptions paired frequently enough with prescriptions: Socialize outdoors, keep parks open, and let children play with one another outdoors.

Vaccines are the tool that will end the pandemic. The story of their rollout combines some of our strengths and our weaknesses, revealing the limitations of the way we think and evaluate evidence, provide guidelines, and absorb and react to an uncertain and difficult situation.

But also, after a weary year, maybe it’s hard for everyone—including scientists, journalists, and public-health officials—to imagine the end, to have hope. We adjust to new conditions fairly quickly, even terrible new conditions. During this pandemic, we’ve adjusted to things many of us never thought were possible. Billions of people have led dramatically smaller, circumscribed lives, and dealt with closed schools, the inability to see loved ones, the loss of jobs, the absence of communal activities, and the threat and reality of illness and death.

Hope nourishes us during the worst times, but it is also dangerous. It upsets the delicate balance of survival—where we stop hoping and focus on getting by—and opens us up to crushing disappointment if things don’t pan out. After a terrible year, many things are understandably making it harder for us to dare to hope. But, especially in the United States, everything looks better by the day. Tragically, at least 28 million Americans have been confirmed to have been infected, but the real number is certainly much higher. By one estimate, as many as 80 million have already been infected with COVID-19, and many of those people now have some level of immunity. Another 46 million people have already received at least one dose of a vaccine, and we’re vaccinating millions more each day as the supply constraints ease. The vaccines are poised to reduce or nearly eliminate the things we worry most about—severe disease, hospitalization, and death.

Not all our problems are solved. We need to get through the next few months, as we race to vaccinate against more transmissible variants. We need to do more to address equity in the United States—because it is the right thing to do, and because failing to vaccinate the highest-risk people will slow the population impact. We need to make sure that vaccines don’t remain inaccessible to poorer countries. We need to keep up our epidemiological surveillance so that if we do notice something that looks like it may threaten our progress, we can respond swiftly.

And the public behavior of the vaccinated cannot change overnight—even if they are at much lower risk, it’s not reasonable to expect a grocery store to try to verify who’s vaccinated, or to have two classes of people with different rules. For now, it’s courteous and prudent for everyone to obey the same guidelines in many public places. Still, vaccinated people can feel more confident in doing things they may have avoided, just in case—getting a haircut, taking a trip to see a loved one, browsing for nonessential purchases in a store.

But it is time to imagine a better future, not just because it’s drawing nearer but because that’s how we get through what remains and keep our guard up as necessary. It’s also realistic—reflecting the genuine increased safety for the vaccinated.

Public-health agencies should immediately start providing expanded information to vaccinated people so they can make informed decisions about private behavior. This is justified by the encouraging data, and a great way to get the word out on how wonderful these vaccines really are. The delay itself has great human costs, especially for those among the elderly who have been isolated for so long.

Public-health authorities should also be louder and more explicit about the next steps, giving us guidelines for when we can expect easing in rules for public behavior as well. We need the exit strategy spelled out—but with graduated, targeted measures rather than a one-size-fits-all message. We need to let people know that getting a vaccine will almost immediately change their lives for the better, and why, and also when and how increased vaccination will change more than their individual risks and opportunities, and see us out of this pandemic.

We should encourage people to dream about the end of this pandemic by talking about it more, and more concretely: the numbers, hows, and whys. Offering clear guidance on how this will end can help strengthen people’s resolve to endure whatever is necessary for the moment—even if they are still unvaccinated—by building warranted and realistic anticipation of the pandemic’s end.

Hope will get us through this. And one day soon, you’ll be able to hop off the subway on your way to a concert, pick up a newspaper, and find the triumphant headline: “COVID Routed!”

Zeynep Tufekci is a contributing writer at The Atlantic and an associate professor at the University of North Carolina. She studies the interaction between digital technology, artificial intelligence, and society.

The Coronavirus Is Plotting a Comeback. Here’s Our Chance to Stop It for Good. (New York Times)

Apoorva Mandavilli

Lincoln Park in Chicago. Scientists are hopeful, as vaccinations continue and despite the emergence of variants, that we’re past the worst of the pandemic. Credit: Lyndon French for The New York Times
Many scientists are expecting another rise in infections. But this time the surge will be blunted by vaccines and, hopefully, widespread caution. By summer, Americans may be looking at a return to normal life.

Published Feb. 25, 2021; updated Feb. 26, 2021, 12:07 a.m. ET

Across the United States, and the world, the coronavirus seems to be loosening its stranglehold. The deadly curve of cases, hospitalizations and deaths has yo-yoed before, but never has it plunged so steeply and so fast.

Is this it, then? Is this the beginning of the end? After a year of being pummeled by grim statistics and scolded for wanting human contact, many Americans feel a long-promised deliverance is at hand.

Americans will win against the virus and regain many aspects of their pre-pandemic lives, most scientists now believe. Of the 21 interviewed for this article, all were optimistic that the worst of the pandemic is past. This summer, they said, life may begin to seem normal again.

But — of course, there’s always a but — researchers are also worried that Americans, so close to the finish line, may once again underestimate the virus.

So far, the two vaccines authorized in the United States are spectacularly effective, and after a slow start, the vaccination rollout is picking up momentum. A third vaccine is likely to be authorized shortly, adding to the nation’s supply.

But it will be many weeks before vaccinations make a dent in the pandemic. And now the virus is shape-shifting faster than expected, evolving into variants that may partly sidestep the immune system.

The latest variant was discovered in New York City only this week, and another worrisome version is spreading at a rapid pace through California. Scientists say a contagious variant first discovered in Britain will become the dominant form of the virus in the United States by the end of March.

The road back to normalcy is potholed with unknowns: how well vaccines prevent further spread of the virus; whether emerging variants remain susceptible enough to the vaccines; and how quickly the world is immunized, so as to halt further evolution of the virus.

But the greatest ambiguity is human behavior. Can Americans desperate for normalcy keep wearing masks and distancing themselves from family and friends? How much longer can communities keep businesses, offices and schools closed?

Covid-19 deaths will most likely never rise quite as precipitously as in the past, and the worst may be behind us. But if Americans let down their guard too soon — many states are already lifting restrictions — and if the variants spread in the United States as they have elsewhere, another spike in cases may well arrive in the coming weeks.

Scientists call it the fourth wave. The new variants mean “we’re essentially facing a pandemic within a pandemic,” said Adam Kucharski, an epidemiologist at the London School of Hygiene and Tropical Medicine.

A patient received comfort in the I.C.U. of Marian Regional Medical Center in Santa Maria, Calif., last month. 
Credit: Daniel Dreifuss for The New York Times

The United States has now recorded 500,000 deaths amid the pandemic, a terrible milestone. As of Wednesday morning, at least 28.3 million people have been infected.

But the rate of new infections has tumbled by 35 percent over the past two weeks, according to a database maintained by The New York Times. Hospitalizations are down 31 percent, and deaths have fallen by 16 percent.

Yet the numbers are still at the horrific highs of November, scientists noted. At least 3,210 people died of Covid-19 on Wednesday alone. And there is no guarantee that these rates will continue to decrease.

“Very, very high case numbers are not a good thing, even if the trend is downward,” said Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health in Boston. “Taking the first hint of a downward trend as a reason to reopen is how you get to even higher numbers.”

In late November, for example, Gov. Gina Raimondo of Rhode Island limited social gatherings and some commercial activities in the state. Eight days later, cases began to decline. The trend reversed eight days after the state’s pause lifted on Dec. 20.

The virus’s latest retreat in Rhode Island and most other states, experts said, results from a combination of factors: growing numbers of people with immunity to the virus, either from having been infected or from vaccination; changes in behavior in response to the surges of a few weeks ago; and a dash of seasonality — the effect of temperature and humidity on the survival of the virus.

Parts of the country that experienced huge surges in infection, like Montana and Iowa, may be closer to herd immunity than other regions. But patchwork immunity alone cannot explain the declines throughout much of the world.

The vaccines were first rolled out to residents of nursing homes and to the elderly, who are at highest risk of severe illness and death. That may explain some of the current decline in hospitalizations and deaths.

A volunteer in the Johnson & Johnson vaccine trial received a shot in the Desmond Tutu H.I.V. Foundation Youth Center in Masiphumelele, South Africa, in December.
Credit: Joao Silva/The New York Times

But young people drive the spread of the virus, and most of them have not yet been inoculated. And the bulk of the world’s vaccine supply has been bought up by wealthy nations, which have amassed one billion more doses than needed to immunize their populations.

Vaccination cannot explain why cases are dropping even in countries where not a single soul has been immunized, like Honduras, Kazakhstan or Libya. The biggest contributor to the sharp decline in infections is something more mundane, scientists say: behavioral change.

Leaders in the United States and elsewhere stepped up community restrictions after the holiday peaks. But individual choices have also been important, said Lindsay Wiley, an expert in public health law and ethics at American University in Washington.

“People voluntarily change their behavior as they see their local hospital get hit hard, as they hear about outbreaks in their area,” she said. “If that’s the reason that things are improving, then that’s something that can reverse pretty quickly, too.”

The downward curve of infections with the original coronavirus disguises an exponential rise in infections with B.1.1.7, the variant first identified in Britain, according to many researchers.

“We really are seeing two epidemic curves,” said Ashleigh Tuite, an infectious disease modeler at the University of Toronto.

The B.1.1.7 variant is thought to be more contagious and more deadly, and it is expected to become the predominant form of the virus in the United States by late March. The number of cases with the variant in the United States has risen from 76 in 12 states as of Jan. 13 to more than 1,800 in 45 states now. Actual infections may be much higher because of inadequate surveillance efforts in the United States.

Buoyed by the shrinking rates over all, however, governors are lifting restrictions across the United States and are under enormous pressure to reopen completely. Should that occur, B.1.1.7 and the other variants are likely to explode.

“Everybody is tired, and everybody wants things to open up again,” Dr. Tuite said. “Bending to political pressure right now, when things are really headed in the right direction, is going to end up costing us in the long term.”

A fourth wave doesn’t have to be inevitable, scientists say, but the new variants will pose a significant challenge to averting that wave.
Credit: Lyndon French for The New York Times

Looking ahead to late March or April, the majority of scientists interviewed by The Times predicted a fourth wave of infections. But they stressed that it is not an inevitable surge, if government officials and individuals maintain precautions for a few more weeks.

A minority of experts were more sanguine, saying they expected powerful vaccines and an expanding rollout to stop the virus. And a few took the middle road.

“We’re at that crossroads, where it could go well or it could go badly,” said Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases.

The vaccines have proved to be more effective than anyone could have hoped, so far preventing serious illness and death in nearly all recipients. At present, about 1.4 million Americans are vaccinated each day. More than 45 million Americans have received at least one dose.

A team of researchers at Fred Hutchinson Cancer Research Center in Seattle tried to calculate the number of vaccinations required per day to avoid a fourth wave. In a model completed before the variants surfaced, the scientists estimated that vaccinating just one million Americans a day would limit the magnitude of the fourth wave.

“But the new variants completely changed that,” said Dr. Joshua T. Schiffer, an infectious disease specialist who led the study. “It’s just very challenging scientifically — the ground is shifting very, very quickly.”

Natalie Dean, a biostatistician at the University of Florida, described herself as “a little more optimistic” than many other researchers. “We would be silly to undersell the vaccines,” she said, noting that they are effective against the fast-spreading B.1.1.7 variant.

But Dr. Dean worried about the forms of the virus detected in South Africa and Brazil that seem less vulnerable to the vaccines made by Pfizer and Moderna. (On Wednesday, Johnson & Johnson reported that its vaccine was relatively effective against the variant found in South Africa.)

Coronavirus test samples in a lab for genomic sequencing at Duke University in Durham, N.C., earlier this month.
Credit: Pete Kiehart for The New York Times

About 50 infections with those two variants have been identified in the United States, but that could change. Because of the variants, scientists do not know how many people who were infected and had recovered are now vulnerable to reinfection.

South Africa and Brazil have reported reinfections with the new variants among people who had recovered from infections with the original version of the virus.

“That makes it a lot harder to say, ‘If we were to get to this level of vaccinations, we’d probably be OK,’” said Sarah Cobey, an evolutionary biologist at the University of Chicago.

Yet the biggest unknown is human behavior, experts said. The sharp drop in cases now may lead to complacency about masks and distancing, and to a wholesale lifting of restrictions on indoor dining, sporting events and more. Or … not.

“The single biggest lesson I’ve learned during the pandemic is that epidemiological modeling struggles with prediction, because so much of it depends on human behavioral factors,” said Carl Bergstrom, a biologist at the University of Washington in Seattle.

Taking into account the counterbalancing rises in both vaccinations and variants, along with the high likelihood that people will stop taking precautions, a fourth wave is highly likely this spring, the majority of experts told The Times.

Kristian Andersen, a virologist at the Scripps Research Institute in San Diego, said he was confident that the number of cases will continue to decline, then plateau in about a month. After mid-March, the curve in new cases will swing upward again.

In early to mid-April, “we’re going to start seeing hospitalizations go up,” he said. “It’s just a question of how much.”

Hospitalizations and deaths will fall to levels low enough to reopen the country — though mask-wearing may remain necessary as a significant portion of people, including children, won’t be immunized.
Credit: Kendrick Brinson for The New York Times

Now the good news.

Despite the uncertainties, the experts predict that the last surge will subside in the United States sometime in the early summer. If the Biden administration can keep its promise to immunize every American adult by the end of the summer, the variants should be no match for the vaccines.

Combine vaccination with natural immunity and the human tendency to head outdoors as weather warms, and “it may not be exactly herd immunity, but maybe it’s sufficient to prevent any large outbreaks,” said Youyang Gu, an independent data scientist, who created some of the most prescient models of the pandemic.

Infections will continue to drop. More important, hospitalizations and deaths will fall to negligible levels — enough, hopefully, to reopen the country.

“Sometimes people lose vision of the fact that vaccines prevent hospitalization and death, which is really actually what most people care about,” said Stefan Baral, an epidemiologist at the Johns Hopkins Bloomberg School of Public Health.

Even as the virus begins its swoon, people may still need to wear masks in public places and maintain social distance, because a significant percentage of the population — including children — will not be immunized.

“Assuming that we keep a close eye on things in the summer and don’t go crazy, I think that we could look forward to a summer that is looking more normal, but hopefully in a way that is more carefully monitored than last summer,” said Emma Hodcroft, a molecular epidemiologist at the University of Bern in Switzerland.

Imagine: Groups of vaccinated people will be able to get together for barbecues and play dates, without fear of infecting one another. Beaches, parks and playgrounds will be full of mask-free people. Indoor dining will return, along with movie theaters, bowling alleys and shopping malls — although they may still require masks.

The virus will still be circulating, but the extent will depend in part on how well vaccines prevent not just illness and death, but also transmission. The data on whether vaccines stop the spread of the disease are encouraging, but immunization is unlikely to block transmission entirely.

Self-swab testing for Covid at Duke University in February.
Credit: Pete Kiehart for The New York Times

“It’s not zero and it’s not 100 — exactly where that number is will be important,” said Shweta Bansal, an infectious disease modeler at Georgetown University. “It needs to be pretty darn high for us to be able to get away with vaccinating anything below 100 percent of the population, so that’s definitely something we’re watching.”

Over the long term — say, a year from now, when all the adults and children in the United States who want a vaccine have received them — will this virus finally be behind us?

Every expert interviewed by The Times said no. Even after the vast majority of the American population has been immunized, the virus will continue to pop up in clusters, taking advantage of pockets of vulnerability. Years from now, the coronavirus may be an annoyance, circulating at low levels, causing modest colds.

Many scientists said their greatest worry post-pandemic was that new variants may turn out to be significantly less susceptible to the vaccines. Billions of people worldwide will remain unprotected, and each infection gives the virus new opportunities to mutate.

“We won’t have useless vaccines. We might have slightly less good vaccines than we have at the moment,” said Andrew Read, an evolutionary microbiologist at Penn State University. “That’s not the end of the world, because we have really good vaccines right now.”

For now, every one of us can help by continuing to be careful for just a few more months, until the curve permanently flattens.

“Just hang in there a little bit longer,” Dr. Tuite said. “There’s a lot of optimism and hope, but I think we need to be prepared for the fact that the next several months are likely to continue to be difficult.”


Texas Blackouts Point to Coast-to-Coast Crises Waiting to Happen (New York Times)

Christopher Flavelle, Brad Plumer, Hiroko Tabuchi – Feb 20, 2021

Traffic at a standstill on Interstate 35 in Killeen, Texas, on Thursday. Credit: Joe Raedle/Getty Images
Continent-spanning storms triggered blackouts in Oklahoma and Mississippi, halted one-third of U.S. oil production and disrupted vaccinations in 20 states.

Even as Texas struggled to restore electricity and water over the past week, signs of the risks posed by increasingly extreme weather to America’s aging infrastructure were cropping up across the country.

The week’s continent-spanning winter storms triggered blackouts in Texas, Oklahoma, Mississippi and several other states. One-third of oil production in the nation was halted. Drinking-water systems in Ohio were knocked offline. Road networks nationwide were paralyzed and vaccination efforts in 20 states were disrupted.

The crisis carries a profound warning. As climate change brings more frequent and intense storms, floods, heat waves, wildfires and other extreme events, it is placing growing stress on the foundations of the country’s economy: Its network of roads and railways, drinking-water systems, power plants, electrical grids, industrial waste sites and even homes. Failures in just one sector can set off a domino effect of breakdowns in hard-to-predict ways.

Much of this infrastructure was built decades ago, under the expectation that the environment around it would remain stable, or at least fluctuate within predictable bounds. Now climate change is upending that assumption.

“We are colliding with a future of extremes,” said Alice Hill, who oversaw planning for climate risks on the National Security Council during the Obama administration. “We base all our choices about risk management on what’s occurred in the past, and that is no longer a safe guide.”

While it’s not always possible to say precisely how global warming influenced any one particular storm, scientists said, an overall rise in extreme weather creates sweeping new risks.

Sewer systems are overflowing more often as powerful rainstorms exceed their design capacity. Coastal homes and highways are collapsing as intensified runoff erodes cliffs. Coal ash, the toxic residue produced by coal-burning plants, is spilling into rivers as floods overwhelm barriers meant to hold it back. Homes once beyond the reach of wildfires are burning in blazes they were never designed to withstand.

A broken water main in McComb, Miss., on Thursday.
Credit: Matt Williamson/The Enterprise-Journal, via Associated Press

Problems like these often reflect an inclination of governments to spend as little money as possible, said Shalini Vajjhala, a former Obama administration official who now advises cities on meeting climate threats. She said it’s hard to persuade taxpayers to spend extra money to guard against disasters that seem unlikely.

But climate change flips that logic, making inaction far costlier. “The argument I would make is, we can’t afford not to, because we’re absorbing the costs” later, Ms. Vajjhala said, after disasters strike. “We’re spending poorly.”

The Biden administration has talked extensively about climate change, particularly the need to reduce greenhouse gas emissions and create jobs in renewable energy. But it has spent less time discussing how to manage the growing effects of climate change, facing criticism from experts for not appointing more people who focus on climate resilience.

“I am extremely concerned by the lack of emergency-management expertise reflected in Biden’s climate team,” said Samantha Montano, an assistant professor at the Massachusetts Maritime Academy who focuses on disaster policy. “There’s an urgency here that still is not being reflected.”

A White House spokesman, Vedant Patel, said in a statement, “Building resilient and sustainable infrastructure that can withstand extreme weather and a changing climate will play an integral role in creating millions of good paying, union jobs” while cutting greenhouse gas emissions.

And while President Biden has called for a major push to refurbish and upgrade the nation’s infrastructure, getting a closely divided Congress to spend hundreds of billions, if not trillions of dollars, will be a major challenge.

Heightening the cost to society, disruptions can disproportionately affect lower-income households and other vulnerable groups, including older people or those with limited English.

“All these issues are converging,” said Robert D. Bullard, a professor at Texas Southern University who studies wealth and racial disparities related to the environment. “And there’s simply no place in this country that’s not going to have to deal with climate change.”

Flooding around Edenville Township, Mich., last year swept away a bridge over the Tittabawassee River.
Credit: Matthew Hatcher/Getty Images

In September, when a sudden storm dumped a record of more than two inches of water on Washington in less than 75 minutes, the result wasn’t just widespread flooding, but also raw sewage rushing into hundreds of homes.

Washington, like many other cities in the Northeast and Midwest, relies on what’s called a combined sewer overflow system: If a downpour overwhelms storm drains along the street, they are built to overflow into the pipes that carry raw sewage. But if there’s too much pressure, sewage can be pushed backward, into people’s homes — where the forces can send it erupting from toilets and shower drains.

This is what happened in Washington. The city’s system was built in the late 1800s. Now, climate change is straining an already outdated design.

DC Water, the local utility, is spending billions of dollars so that the system can hold more sewage. “We’re sort of in uncharted territory,” said Vincent Morris, a utility spokesman.

The challenge of managing and taming the nation’s water supplies — whether in streets and homes, or in vast rivers and watersheds — is growing increasingly complex as storms intensify. Last May, rain-swollen flooding breached two dams in Central Michigan, forcing thousands of residents to flee their homes and threatening a chemical complex and toxic waste cleanup site. Experts warned it was unlikely to be the last such failure.

Many of the country’s 90,000 dams were built decades ago and were already in dire need of repairs. Now climate change poses an additional threat, bringing heavier downpours to parts of the country and raising the odds that some dams could be overwhelmed by more water than they were designed to handle. One recent study found that most of California’s biggest dams were at increased risk of failure as global warming advances.

In recent years, dam-safety officials have begun grappling with the dangers. Colorado, for instance, now requires dam builders to take into account the risk of increased atmospheric moisture driven by climate change as they plan for worst-case flooding scenarios.

But nationwide, there remains a backlog of thousands of older dams that still need to be rehabilitated or upgraded. The price tag could ultimately stretch to more than $70 billion.

“Whenever we study dam failures, we often find there was a lot of complacency beforehand,” said Bill McCormick, president of the Association of State Dam Safety Officials. But given that failures can have catastrophic consequences, “we really can’t afford to be complacent.”

Crews repaired switches on utility poles damaged by the storms in Texas.
Credit: Tamir Kalifa for The New York Times

If the Texas blackouts exposed one state’s poor planning, they also provide a warning for the nation: Climate change threatens virtually every aspect of electricity grids that aren’t always designed to handle increasingly severe weather. The vulnerabilities show up in power lines, natural-gas plants, nuclear reactors and myriad other systems.

Higher storm surges can knock out coastal power infrastructure. Deeper droughts can reduce water supplies for hydroelectric dams. Severe heat waves can reduce the efficiency of fossil-fuel generators, transmission lines and even solar panels at precisely the moment that demand soars because everyone cranks up their air-conditioners.

Climate hazards can also combine in new and unforeseen ways.

In California recently, Pacific Gas & Electric has had to shut off electricity to thousands of people during exceptionally dangerous fire seasons. The reason: Downed power lines can spark huge wildfires in dry vegetation. Then, during a record-hot August last year, several of the state’s natural gas plants malfunctioned in the heat, just as demand was spiking, contributing to blackouts.

“We have to get better at understanding these compound impacts,” said Michael Craig, an expert in energy systems at the University of Michigan who recently led a study looking at how rising summer temperatures in Texas could strain the grid in unexpected ways. “It’s an incredibly complex problem to plan for.”

Some utilities are taking notice. After Superstorm Sandy in 2012 knocked out power for 8.7 million customers, utilities in New York and New Jersey invested billions in flood walls, submersible equipment and other technology to reduce the risk of failures. Last month, New York’s Con Edison said it would incorporate climate projections into its planning.

As freezing temperatures struck Texas, a glitch at one of two reactors at a South Texas nuclear plant, which serves 2 million homes, triggered a shutdown. The cause: Sensing lines connected to the plant’s water pumps had frozen, said Victor Dricks, a spokesman for the federal Nuclear Regulatory Commission.

It’s also common for extreme heat to disrupt nuclear power. The issue is that the water used to cool reactors can become too warm to use, forcing shutdowns.

Flooding is another risk.

After a tsunami led to several meltdowns at Japan’s Fukushima Daiichi power plant in 2011, the U.S. Nuclear Regulatory Commission told the 60 or so working nuclear plants in the United States, many decades old, to evaluate their flood risk to account for climate change. Ninety percent showed at least one type of flood risk that exceeded what the plant was designed to handle.

The greatest risk came from heavy rain and snowfall exceeding the design parameters at 53 plants.

Scott Burnell, a Nuclear Regulatory Commission spokesman, said in a statement, “The NRC continues to conclude, based on the staff’s review of detailed analyses, that all U.S. nuclear power plants can appropriately deal with potential flooding events, including the effects of climate change, and remain safe.”

A section of Highway 1 along the California coastline collapsed in January amid heavy rains.
Credit: Josh Edelson/Agence France-Presse — Getty Images

The collapse of a portion of California’s Highway 1 into the Pacific Ocean after heavy rains last month was a reminder of the fragility of the nation’s roads.

Several climate-related risks appeared to have converged to heighten the danger. Rising seas and higher storm surges have intensified coastal erosion, while more extreme bouts of precipitation have increased the landslide risk.

Add to that the effects of devastating wildfires, which can damage the vegetation holding hillside soil in place, and “things that wouldn’t have slid without the wildfires, start sliding,” said Jennifer M. Jacobs, a professor of civil and environmental engineering at the University of New Hampshire. “I think we’re going to see more of that.”

The United States depends on highways, railroads and bridges as economic arteries for commerce, travel and simply getting to work. But many of the country’s most important links face mounting climate threats. More than 60,000 miles of roads and bridges in coastal floodplains are already vulnerable to extreme storms and hurricanes, government estimates show. And inland flooding could also threaten at least 2,500 bridges across the country by 2050, a federal climate report warned in 2018.

Sometimes even small changes can trigger catastrophic failures. Engineers modeling the collapse of bridges over Escambia Bay in Florida during Hurricane Ivan in 2004 found that the extra three inches of sea-level rise since the bridge was built in 1968 very likely contributed to the collapse, because of the added height of the storm surge and force of the waves.

“A lot of our infrastructure systems have a tipping point. And when you hit the tipping point, that’s when a failure occurs,” Dr. Jacobs said. “And the tipping point could be an inch.”

Crucial rail networks are at risk, too. In 2017, Amtrak consultants found that along parts of the Northeast corridor, which runs from Boston to Washington and carries 12 million people a year, flooding and storm surge could erode the track bed, disable the signals and eventually put the tracks underwater.

And there is no easy fix. Elevating the tracks would require also raising bridges, electrical wires and lots of other infrastructure, and moving them would mean buying new land in a densely packed part of the country. So the report recommended flood barriers, costing $24 million per mile, that must be moved into place whenever floods threaten.

A worker checked efforts to prevent coal ash from escaping into the Waccamaw River in South Carolina after Hurricane Florence in 2018.
Credit: Randall Hill/Reuters

A series of explosions at a flood-damaged chemical plant outside Houston after Hurricane Harvey in 2017 highlighted a danger lurking in a world beset by increasingly extreme weather.

The blasts at the plant came after flooding knocked out the site’s electrical supply, shutting down refrigeration systems that kept volatile chemicals stable. Almost two dozen people, many of them emergency workers, were treated for exposure to the toxic fumes, and some 200 nearby residents were evacuated from their homes.

More than 2,500 facilities that handle toxic chemicals lie in federal flood-prone areas across the country, about 1,400 of them in areas at the highest risk of flooding, a New York Times analysis showed in 2018.

Leaks from toxic cleanup sites, left behind by past industry, pose another threat.

Almost two-thirds of some 1,500 Superfund cleanup sites across the country are in areas with an elevated risk of flooding, storm surge, wildfires or sea level rise, a government audit warned in 2019. Coal ash, a toxic substance produced by coal power plants that is often stored as sludge in special ponds, has been particularly exposed. After Hurricane Florence in 2018, for example, a dam breach at the site of a power plant in Wilmington, N.C., released the hazardous ash into a nearby river.

“We should be evaluating whether these facilities or sites actually have to be moved or re-secured,” said Lisa Evans, senior counsel at Earthjustice, an environmental law organization. Places that “may have been OK in 1990,” she said, “may be a disaster waiting to happen in 2021.”

East Austin, Texas, during a blackout on Wednesday.  
Credit: Bronte Wittpenn/Austin American-Statesman, via Associated Press

A Glimpse of America’s Future: Climate Change Means Trouble for Power Grids (New York Times)

Brad Plumer, Feb. 17, 2021

Systems are designed to handle spikes in demand, but the wild and unpredictable weather linked to global warming will very likely push grids beyond their limits.
A street in Austin, Texas, without power on Monday evening.
Credit: Tamir Kalifa for The New York Times

Published Feb. 16, 2021; updated Feb. 17, 2021, 6:59 a.m. ET

Huge winter storms plunged large parts of the central and southern United States into an energy crisis this week, with frigid blasts of Arctic weather crippling electric grids and leaving millions of Americans without power amid dangerously cold temperatures.

The grid failures were most severe in Texas, where more than four million people woke up Tuesday morning to rolling blackouts. Separate regional grids in the Southwest and Midwest also faced serious strain. As of Tuesday afternoon, at least 23 people nationwide had died in the storm or its aftermath.

Analysts have begun to identify key factors behind the grid failures in Texas. Record-breaking cold weather spurred residents to crank up their electric heaters and pushed power demand beyond the worst-case scenarios that grid operators had planned for. At the same time, a large fraction of the state’s gas-fired power plants were knocked offline amid icy conditions, with some plants suffering fuel shortages as natural gas demand spiked. Many of Texas’ wind turbines also froze and stopped working.

The crisis sounded an alarm for power systems throughout the country. Electric grids can be engineered to handle a wide range of severe conditions — as long as grid operators can reliably predict the dangers ahead. But as climate change accelerates, many electric grids will face extreme weather events that go far beyond the historical conditions those systems were designed for, putting them at risk of catastrophic failure.

While scientists are still analyzing what role human-caused climate change may have played in this week’s winter storms, it is clear that global warming poses a barrage of additional threats to power systems nationwide, including fiercer heat waves and water shortages.

Measures that could help make electric grids more robust — such as fortifying power plants against extreme weather, or installing more backup power sources — could prove expensive. But as Texas shows, blackouts can be extremely costly, too. And, experts said, unless grid planners start planning for increasingly wild and unpredictable climate conditions, grid failures will happen again and again.

“It’s essentially a question of how much insurance you want to buy,” said Jesse Jenkins, an energy systems engineer at Princeton University. “What makes this problem even harder is that we’re now in a world where, especially with climate change, the past is no longer a good guide to the future. We have to get much better at preparing for the unexpected.”

Texas’ main electric grid, which largely operates independently from the rest of the country, has been built with the state’s most common weather extremes in mind: soaring summer temperatures that cause millions of Texans to turn up their air-conditioners all at once.

While freezing weather is rarer, grid operators in Texas have also long known that electricity demand can spike in the winter, particularly after damaging cold snaps in 2011 and 2018. But this week’s winter storms, which buried the state in snow and ice, and led to record-cold temperatures, surpassed all expectations — and pushed the grid to its breaking point.

Residents of East Dallas trying to warm up on Monday after their family home lost power.
Credit: Juan Figueroa/The Dallas Morning News, via Associated Press

Texas’ grid operators had anticipated that, in the worst case, the state would use 67 gigawatts of electricity during the winter peak. But by Sunday evening, power demand had surged past that level. As temperatures dropped, many homes were relying on older, inefficient electric heaters that consume more power.

The problems compounded from there, with frigid weather on Monday disabling power plants with capacity totaling more than 30 gigawatts. The vast majority of those failures occurred at thermal power plants, like natural gas generators, as plummeting temperatures paralyzed plant equipment and soaring demand for natural gas left some plants struggling to obtain sufficient fuel. A number of the state’s power plants were also offline for scheduled maintenance in preparation for the summer peak.

The state’s fleet of wind farms also lost up to 4.5 gigawatts of capacity at times, as many turbines stopped working in cold and icy conditions, though this was a smaller part of the problem.

In essence, experts said, an electric grid optimized to deliver huge quantities of power on the hottest days of the year was caught unprepared when temperatures plummeted.

While analysts are still working to untangle all of the reasons behind Texas’ grid failures, some have also wondered whether the unique way the state manages its largely deregulated electricity system may have played a role. In the mid-1990s, for instance, Texas decided against paying energy producers to hold a fixed number of backup power plants in reserve, instead letting market forces dictate what happens on the grid.

On Tuesday, Gov. Greg Abbott called for an emergency reform of the Electric Reliability Council of Texas, the nonprofit corporation that oversees the flow of power in the state, saying its performance had been “anything but reliable” over the previous 48 hours.

In theory, experts said, there are technical solutions that can avert such problems.

Wind turbines can be equipped with heaters and other devices so that they can operate in icy conditions — as is often done in the upper Midwest, where cold weather is more common. Gas plants can be built to store oil on-site and switch over to burning the fuel if needed, as is often done in the Northeast, where natural gas shortages are common. Grid regulators can design markets that pay extra to keep a larger fleet of backup power plants in reserve in case of emergencies, as is done in the Mid-Atlantic.

But these solutions all cost money, and grid operators are often wary of forcing consumers to pay extra for safeguards.

“Building in resilience often comes at a cost, and there’s a risk of both underpaying but also of overpaying,” said Daniel Cohan, an associate professor of civil and environmental engineering at Rice University. “It’s a difficult balancing act.”

In the months ahead, as Texas grid operators and policymakers investigate this week’s blackouts, they will likely explore how the grid might be bolstered to handle extremely cold weather. Some possible ideas include: Building more connections between Texas and other states to balance electricity supplies, a move the state has long resisted; encouraging homeowners to install battery backup systems; or keeping additional power plants in reserve.

The search for answers will be complicated by climate change. Over all, the state is getting warmer as global temperatures rise, and cold-weather extremes are, on average, becoming less common over time.

But some climate scientists have also suggested that global warming could, paradoxically, bring more unusually fierce winter storms. Some research indicates that Arctic warming is weakening the jet stream, the high-level air current that circles the northern latitudes and usually holds back the frigid polar vortex. This can allow cold air to periodically escape to the South, resulting in episodes of bitter cold in places that rarely get nipped by frost.

Credit: Jacob Ford/Odessa American, via Associated Press

But this remains an active area of debate among climate scientists, with some experts less certain that polar vortex disruptions are becoming more frequent, making it even trickier for electricity planners to anticipate the dangers ahead.

All over the country, utilities and grid operators are confronting similar questions, as climate change threatens to intensify heat waves, floods, water shortages and other calamities, all of which could create novel risks for the nation’s electricity systems. Adapting to those risks could carry a hefty price tag: One recent study found that the Southeast alone may need 35 percent more electric capacity by 2050 simply to deal with the known hazards of climate change.

And the task of building resilience is becoming increasingly urgent. Many policymakers are promoting electric cars and electric heating as a way of curbing greenhouse gas emissions. But as more of the nation’s economy depends on reliable flows of electricity, the cost of blackouts will become ever more dire.

“This is going to be a significant challenge,” said Emily Grubert, an infrastructure expert at Georgia Tech. “We need to decarbonize our power systems so that climate change doesn’t keep getting worse, but we also need to adapt to changing conditions at the same time. And the latter alone is going to be very costly. We can already see that the systems we have today aren’t handling this very well.”

John Schwartz, Dave Montgomery and Ivan Penn contributed reporting.

Climate crisis: world is at its hottest for at least 12,000 years – study (The Guardian)

Damian Carrington, Environment editor @dpcarrington

Wed 27 Jan 2021 16.00 GMT

The world’s continuously warming climate is revealed also in contemporary ice melt at glaciers, such as this one in the Kenai Mountains, Alaska (seen in September 2019). Photograph: Joe Raedle/Getty Images

The planet is hotter now than it has been for at least 12,000 years, a period spanning the entire development of human civilisation, according to research.

Analysis of ocean surface temperatures shows human-driven climate change has put the world in “uncharted territory”, the scientists say. The planet may even be at its warmest for 125,000 years, although the data that far back is less certain.

The research, published in the journal Nature, reached these conclusions by solving a longstanding puzzle known as the “Holocene temperature conundrum”. Climate models have indicated continuous warming since the last ice age ended 12,000 years ago and the Holocene period began. But temperature estimates derived from fossil shells showed a peak of warming 6,000 years ago and then a cooling, until the industrial revolution sent carbon emissions soaring.

This conflict undermined confidence in the climate models and the shell data. But it was found that the shell data reflected only hotter summers and missed colder winters, and so was giving misleadingly high annual temperatures.

“We demonstrate that global average annual temperature has been rising over the last 12,000 years, contrary to previous results,” said Samantha Bova, at Rutgers University–New Brunswick in the US, who led the research. “This means that the modern, human-caused global warming period is accelerating a long-term increase in global temperatures, making today completely uncharted territory. It changes the baseline and emphasises just how critical it is to take our situation seriously.”

The world may be hotter now than any time since about 125,000 years ago, which was the last warm period between ice ages. However, scientists cannot be certain as there is less data relating to that time.

One study, published in 2017, suggested that global temperatures were last as high as today 115,000 years ago, but that was based on less data.

The new research examined temperature measurements derived from the chemistry of tiny shells and algal compounds found in cores of ocean sediments, and solved the conundrum by taking account of two factors.

First, the shells and organic materials had been assumed to represent the entire year but in fact were most likely to have formed during summer when the organisms bloomed. Second, there are well-known predictable natural cycles in the heating of the Earth caused by eccentricities in the orbit of the planet. Changes in these cycles can lead to summers becoming hotter and winters colder while average annual temperatures change only a little.

Combining these insights showed that the apparent cooling after the warm peak 6,000 years ago, revealed by shell data, was misleading. The shells were in fact only recording a decline in summer temperatures, but the average annual temperatures were still rising slowly, as indicated by the models.

“Now they actually match incredibly well and it gives us a lot of confidence that our climate models are doing a really good job,” said Bova.

The study looked only at ocean temperature records, but Bova said: “The temperature of the sea surface has a really controlling impact on the climate of the Earth. If we know that, it is the best indicator of what global climate is doing.”

She led a research voyage off the coast of Chile in 2020 to take more ocean sediment cores and add to the available data.

Jennifer Hertzberg, of Texas A&M University in the US, said: “By solving a conundrum that has puzzled climate scientists for years, Bova and colleagues’ study is a major step forward. Understanding past climate change is crucial for putting modern global warming in context.”

Lijing Cheng, at the International Centre for Climate and Environment Sciences in Beijing, China, recently led a study that showed that in 2020 the world’s oceans reached their hottest level yet in instrumental records dating back to the 1940s. More than 90% of global heating is taken up by the seas.

Cheng said the new research was useful and intriguing. It provided a method to correct temperature data from shells and could also enable scientists to work out how much heat the ocean absorbed before the industrial revolution, a factor little understood.

The level of carbon dioxide today is at its highest for about 4m years and is rising at the fastest rate for 66m years. Further rises in temperature and sea level are inevitable until greenhouse gas emissions are cut to net zero.

How will AI shape our lives post-Covid? (BBC)

BBC, 09 Nov 2020

Audrey Azoulay: Director-General, Unesco

Covid-19 is a test like no other. Never before have the lives of so many people around the world been affected at this scale or speed.

Over the past six months, thousands of AI innovations have sprung up in response to the challenges of life under lockdown. Governments are mobilising machine-learning in many ways, from contact-tracing apps to telemedicine and remote learning.

However, as the digital transformation accelerates exponentially, it is highlighting the challenges of AI. Ethical dilemmas are already a reality – including privacy risks and discriminatory bias.

It is up to us to decide what we want AI to look like: there is a legislative vacuum that needs to be filled now. Principles such as proportionality, inclusivity, human oversight and transparency can create a framework allowing us to anticipate these issues.

This is why Unesco is working to build consensus among 193 countries to lay the ethical foundations of AI. Building on these principles, countries will be able to develop national policies that ensure AI is designed, developed and deployed in compliance with fundamental human values.

As we face new, previously unimaginable challenges – like the pandemic – we must ensure that the tools we are developing work for us, and not against us.

A Supercomputer Analyzed Covid-19 — and an Interesting New Theory Has Emerged (Medium/Elemental)

A closer look at the Bradykinin hypothesis

Thomas Smith, Sept 1, 2020

3D rendering of multiple coronavirus particles.
Photo: zhangshuang/Getty Images

Earlier this summer, the Summit supercomputer at Oak Ridge National Lab in Tennessee set about crunching data on more than 40,000 genes from 17,000 genetic samples in an effort to better understand Covid-19. Summit is the second-fastest computer in the world, but the process — which involved analyzing 2.5 billion genetic combinations — still took more than a week.

When Summit was done, researchers analyzed the results. It was, in the words of Dr. Daniel Jacobson, lead researcher and chief scientist for computational systems biology at Oak Ridge, a “eureka moment.” The computer had revealed a new theory about how Covid-19 impacts the body: the bradykinin hypothesis. The hypothesis provides a model that explains many aspects of Covid-19, including some of its most bizarre symptoms. It also suggests 10-plus potential treatments, many of which are already FDA approved. Jacobson’s group published their results in a paper in the journal eLife in early July.

According to the team’s findings, a Covid-19 infection generally begins when the virus enters the body through ACE2 receptors in the nose. (The receptors, which the virus is known to target, are abundant there.) The virus then proceeds through the body, entering cells in other places where ACE2 is also present: the intestines, kidneys, and heart. This likely accounts for at least some of the disease’s cardiac and GI symptoms.

But once Covid-19 has established itself in the body, things start to get really interesting. According to Jacobson’s group, the data Summit analyzed shows that Covid-19 isn’t content to simply infect cells that already express lots of ACE2 receptors. Instead, it actively hijacks the body’s own systems, tricking it into upregulating ACE2 receptors in places where they’re usually expressed at low or medium levels, including the lungs.

In this sense, Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house. Once inside, though, they don’t just take your stuff — they also throw open all your doors and windows so their accomplices can rush in and help pillage more efficiently.

The renin–angiotensin system (RAS) controls many aspects of the circulatory system, including the body’s levels of a chemical called bradykinin, which normally helps to regulate blood pressure. According to the team’s analysis, when the virus tweaks the RAS, it causes the body’s mechanisms for regulating bradykinin to go haywire. Bradykinin receptors are resensitized, and the body also stops effectively breaking down bradykinin. (ACE normally degrades bradykinin, but when the virus downregulates it, it can’t do this as effectively.)

The end result, the researchers say, is to release a bradykinin storm — a massive, runaway buildup of bradykinin in the body. According to the bradykinin hypothesis, it’s this storm that is ultimately responsible for many of Covid-19’s deadly effects. Jacobson’s team says in their paper that “the pathology of Covid-19 is likely the result of Bradykinin Storms rather than cytokine storms,” which had been previously identified in Covid-19 patients, but that “the two may be intricately linked.” Other papers had previously identified bradykinin storms as a possible cause of Covid-19’s pathologies.

Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house.

As bradykinin builds up in the body, it dramatically increases vascular permeability. In short, it makes your blood vessels leaky. This aligns with recent clinical data, which increasingly views Covid-19 primarily as a vascular disease, rather than a respiratory one. But Covid-19 still has a massive effect on the lungs. As blood vessels start to leak due to a bradykinin storm, the researchers say, the lungs can fill with fluid. Immune cells also leak out into the lungs, Jacobson’s team found, causing inflammation.

And Covid-19 has another especially insidious trick. Through another pathway, the team’s data shows, it increases production of hyaluronic acid (HA) in the lungs. HA is often used in soaps and lotions for its ability to absorb more than 1,000 times its weight in fluid. When it combines with fluid leaking into the lungs, the results are disastrous: It forms a hydrogel, which can fill the lungs in some patients. According to Jacobson, once this happens, “it’s like trying to breathe through Jell-O.”

This may explain why ventilators have proven less effective in treating advanced Covid-19 than doctors originally expected, based on experiences with other viruses. “It reaches a point where regardless of how much oxygen you pump in, it doesn’t matter, because the alveoli in the lungs are filled with this hydrogel,” Jacobson says. “The lungs become like a water balloon.” Patients can suffocate even while receiving full breathing support.

The bradykinin hypothesis also extends to many of Covid-19’s effects on the heart. About one in five hospitalized Covid-19 patients have damage to their hearts, even if they never had cardiac issues before. Some of this is likely due to the virus infecting the heart directly through its ACE2 receptors. But the RAS also controls aspects of cardiac contractions and blood pressure. According to the researchers, bradykinin storms could create arrhythmias and low blood pressure, which are often seen in Covid-19 patients.

The bradykinin hypothesis also accounts for Covid-19’s neurological effects, which are some of the most surprising and concerning elements of the disease. These symptoms (which include dizziness, seizures, delirium, and stroke) are present in as many as half of hospitalized Covid-19 patients. According to Jacobson and his team, MRI studies in France revealed that many Covid-19 patients have evidence of leaky blood vessels in their brains.

Bradykinin — especially at high doses — can also lead to a breakdown of the blood-brain barrier. Under normal circumstances, this barrier acts as a filter between your brain and the rest of your circulatory system. It lets in the nutrients and small molecules that the brain needs to function, while keeping out toxins and pathogens and keeping the brain’s internal environment tightly regulated.

If bradykinin storms cause the blood-brain barrier to break down, this could allow harmful cells and compounds into the brain, leading to inflammation, potential brain damage, and many of the neurological symptoms Covid-19 patients experience. Jacobson told me, “It is a reasonable hypothesis that many of the neurological symptoms in Covid-19 could be due to an excess of bradykinin. It has been reported that bradykinin would indeed be likely to increase the permeability of the blood-brain barrier. In addition, similar neurological symptoms have been observed in other diseases that result from an excess of bradykinin.”

Increased bradykinin levels could also account for other common Covid-19 symptoms. ACE inhibitors — a class of drugs used to treat high blood pressure — have a similar effect on the RAS system as Covid-19, increasing bradykinin levels. In fact, Jacobson and his team note in their paper that “the virus… acts pharmacologically as an ACE inhibitor” — almost directly mirroring the actions of these drugs.

By acting like a natural ACE inhibitor, Covid-19 may be causing the same effects that hypertensive patients sometimes get when they take blood pressure–lowering drugs. ACE inhibitors are known to cause a dry cough and fatigue, two textbook symptoms of Covid-19. And they can potentially increase blood potassium levels, which has also been observed in Covid-19 patients. The similarities between ACE inhibitor side effects and Covid-19 symptoms strengthen the bradykinin hypothesis, the researchers say.

ACE inhibitors are also known to cause a loss of taste and smell. Jacobson stresses, though, that this symptom is more likely due to the virus “affecting the cells surrounding olfactory nerve cells” than the direct effects of bradykinin.

Though still an emerging theory, the bradykinin hypothesis explains several other of Covid-19’s seemingly bizarre symptoms. Jacobson and his team speculate that leaky vasculature caused by bradykinin storms could be responsible for “Covid toes,” a condition involving swollen, bruised toes that some Covid-19 patients experience. Bradykinin can also mess with the thyroid gland, which could produce the thyroid symptoms recently observed in some patients.

The bradykinin hypothesis could also explain some of the broader demographic patterns of the disease’s spread. The researchers note that some aspects of the RAS system are sex-linked, with proteins for several receptors (such as one called TMSB4X) located on the X chromosome. This means that “women… would have twice the levels of this protein than men,” a result borne out by the researchers’ data. In their paper, Jacobson’s team concludes that this “could explain the lower incidence of Covid-19 induced mortality in women.” A genetic quirk of the RAS could be giving women extra protection against the disease.

The bradykinin hypothesis provides a model that “contributes to a better understanding of Covid-19” and “adds novelty to the existing literature,” according to scientists Frank van de Veerdonk, Jos WM van der Meer, and Roger Little, who peer-reviewed the team’s paper. It predicts nearly all the disease’s symptoms, even ones (like bruises on the toes) that at first appear random, and further suggests new treatments for the disease.

As Jacobson and team point out, several drugs target aspects of the RAS and are already FDA approved to treat other conditions. They could arguably be applied to treating Covid-19 as well. Several, like danazol, stanozolol, and ecallantide, reduce bradykinin production and could potentially stop a deadly bradykinin storm. Others, like icatibant, reduce bradykinin signaling and could blunt its effects once it’s already in the body.

Interestingly, Jacobson’s team also suggests vitamin D as a potentially useful Covid-19 drug. The vitamin is involved in the RAS system and could prove helpful by reducing levels of another compound, known as REN. Again, this could stop potentially deadly bradykinin storms from forming. The researchers note that vitamin D has already been shown to help those with Covid-19. The vitamin is readily available over the counter, and around 20% of the population is deficient. If indeed the vitamin proves effective at reducing the severity of bradykinin storms, it could be an easy, relatively safe way to reduce the severity of the virus.

Other compounds could treat symptoms associated with bradykinin storms. Hymecromone, for example, could reduce hyaluronic acid levels, potentially stopping deadly hydrogels from forming in the lungs. And timbetasin could mimic the mechanism that the researchers believe protects women from more severe Covid-19 infections. All of these potential treatments are speculative, of course, and would need to be studied in a rigorous, controlled environment before their effectiveness could be determined and they could be used more broadly.

Covid-19 stands out for both the scale of its global impact and the apparent randomness of its many symptoms. Physicians have struggled to understand the disease and come up with a unified theory for how it works. Though as of yet unproven, the bradykinin hypothesis provides such a theory. And like all good hypotheses, it also provides specific, testable predictions — in this case, actual drugs that could provide relief to real patients.

The researchers are quick to point out that “the testing of any of these pharmaceutical interventions should be done in well-designed clinical trials.” As to the next step in the process, Jacobson is clear: “We have to get this message out.” His team’s finding won’t cure Covid-19. But if the treatments it points to pan out in the clinic, interventions guided by the bradykinin hypothesis could greatly reduce patients’ suffering — and potentially save lives.

Exponential growth bias: The numerical error behind Covid-19 (BBC/Future)

A basic mathematical calculation error has fuelled the spread of coronavirus (Credit: Reuters)

By David Robson – 12th August 2020

A simple mathematical mistake may explain why many people underestimate the dangers of coronavirus, shunning social distancing, masks and hand-washing.

Imagine you are offered a deal with your bank, where your money doubles every three days. If you invest just $1 today, roughly how long will it take for you to become a millionaire?

Would it be a year? Six months? 100 days?

The precise answer is 60 days from your initial investment, when your balance would be exactly $1,048,576. Within a further 30 days, you’d have earnt more than a billion. And by the end of the year, you’d have more than $1,000,000,000,000,000,000,000,000,000,000,000,000 – an “undecillion” dollars.
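
For readers who want to check the arithmetic, a minimal Python sketch (mine, not the article’s) reproduces those figures; the only inputs are the assumed $1 starting balance and the three-day doubling period:

# A $1 balance that doubles every three days (illustrative sketch).
balance, day = 1, 0
while balance < 1_000_000:
    day += 3
    balance *= 2
print(day, balance)        # 60 days, $1,048,576
print(2 ** (90 // 3))      # after 90 days: 1,073,741,824 (past a billion)
print(2 ** (363 // 3))     # after roughly a year: about 2.7e36 dollars

Twenty doublings take the balance to $1,048,576, ten more push it past a billion, and roughly 121 doublings fit into a single year.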

If your estimates were way out, you are not alone. Many people consistently underestimate how fast the value increases – a mistake known as the “exponential growth bias” – and while it may seem abstract, it may have had profound consequences for people’s behaviour this year.

A spate of studies has shown that people who are susceptible to the exponential growth bias are less concerned about Covid-19’s spread, and less likely to endorse measures like social distancing, hand washing or mask wearing. In other words, this simple mathematical error could be costing lives – meaning that the correction of the bias should be a priority as we attempt to flatten curves and avoid second waves of the pandemic around the world.

To understand the origins of this particular bias, we first need to consider different kinds of growth. The most familiar is “linear”. If your garden produces three apples every day, you have six after two days, nine after three days, and so on.

Exponential growth, by contrast, accelerates over time. Perhaps the simplest example is population growth; the more people you have reproducing, the faster the population grows. Or if you have a weed in your pond that triples each day, the number of plants may start out low – just three on day two, and nine on day three – but it soon escalates (see diagram, below).

Many people assume that coronavirus spreads in a linear fashion, but unchecked it’s exponential (Credit: Nigel Hawtin)

Our tendency to overlook exponential growth has been known for millennia. According to an Indian legend, the brahmin Sissa ibn Dahir was offered a prize for inventing an early version of chess. He asked for one grain of wheat to be placed on the first square on the board, two for the second square, four for the third square, doubling each time up to the 64th square. The king apparently laughed at the humility of ibn Dahir’s request – until his treasurers reported that it would outstrip all the food in the land (18,446,744,073,709,551,615 grains in total).
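
The treasurers’ figure is easy to verify; a short Python check (added here for illustration, not part of the legend) sums one grain on the first square, two on the second, and so on up to the 64th:

# Grains of wheat on a 64-square board, doubling from one grain on the first square.
total = sum(2 ** square for square in range(64))   # 1 + 2 + 4 + ... + 2**63
print(total)                                       # 18,446,744,073,709,551,615
print(total == 2 ** 64 - 1)                        # True: the closed-form shortcut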

It was only in the late 2000s that scientists started to study the bias formally, with research showing that most people – like Sissa ibn Dahir’s king – intuitively assume that most growth is linear, leading them to vastly underestimate the speed of exponential increase.

These initial studies were primarily concerned with the consequences for our bank balance. Most savings accounts offer compound interest, for example, where you accrue additional interest on the interest you have already earned. This is a classic example of exponential growth, and it means that even low interest rates pay off handsomely over time. If you have a 5% interest rate, then £1,000 invested today will be worth £1,050 next year, and £1,102.50 the year after… which adds up to more than £7,000 in 40 years’ time. Yet most people don’t recognise how much more bang for their buck they will receive if they start investing early, so they leave themselves short for their retirement.
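
Those figures follow directly from compounding. A minimal Python sketch, assuming a flat 5% annual rate and no further deposits (assumptions for the example, not a financial model), shows how £1,000 grows:

# £1,000 compounding at 5% a year, no withdrawals or extra deposits (assumed).
balance = 1000.0
for year in range(1, 41):
    balance *= 1.05
    if year in (1, 2, 10, 40):
        print(year, round(balance, 2))
# year 1: 1050.0, year 2: 1102.5, year 10: ~1628.89, year 40: ~7039.99

After 40 years the balance passes £7,000, even though the yearly gain never looks dramatic along the way.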

If the number of grains on a chess board doubled for each square, the 64th would ‘hold’ 18 quintillion (Credit: Getty Images)

Besides reducing their savings, the bias also renders people more vulnerable to unfavourable loans, where debt escalates over time. According to one study from 2008, the bias increases someone’s debt-to-income ratio from an average of 23% to an average of 54%.

Surprisingly, a higher level of education does not prevent people from making these errors. Even mathematically trained science students can be vulnerable, says Daniela Sele, who researches economic decision making at the Swiss Federal Institute of Technology in Zurich. “It does help somewhat, but it doesn’t preclude the bias,” she says.

This may be because they are relying on their intuition rather than deliberative thinking, so that even if they have learned about things like compound interest, they forget to apply them. To make matters worse, most people will confidently report understanding exponential growth but then still fall for the bias when asked to estimate things like compound interest.

As I explored in my book The Intelligence Trap, intelligent and educated people often have a “bias blind spot”, believing themselves to be less susceptible to error than others – and the exponential growth bias appears to fall dead centre within it.

It was only this year – at the start of the Covid-19 pandemic – that researchers began to consider whether the bias might also influence our understanding of infectious diseases.

According to various epidemiological studies, without intervention the number of new Covid-19 cases doubles every three to four days, which was the reason that so many scientists advised rapid lockdowns to prevent the pandemic from spiralling out of control.
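
To make that doubling time concrete, here is an illustrative Python sketch; the 1,000-case starting point and the four-week window are assumptions for the example, not figures from the epidemiological studies:

# Unchecked spread with a 3.5-day doubling time (midpoint of the cited 3-4 day range).
doubling_time = 3.5   # days (assumed)
cases = 1_000         # assumed starting case count
for day in (7, 14, 21, 28):
    print(day, round(cases * 2 ** (day / doubling_time)))
# after 7 days ~4,000 cases; after 28 days ~256,000

A month of unchecked spread multiplies the caseload roughly 256-fold, which is the arithmetic behind the advice for rapid lockdowns.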

In March, Joris Lammers at the University of Bremen in Germany joined forces with Jan Crusius and Anne Gast at the University of Cologne to roll out online surveys questioning people about the potential spread of the disease. Their results showed that the exponential growth bias was prevalent in people’s understanding of the virus’s spread, with most people vastly underestimating the rate of increase. More importantly, the team found that those beliefs were directly linked to the participants’ views on the best ways to contain the spread. The worse their estimates, the less likely they were to understand the need for social distancing: the exponential growth bias had made them complacent about the official advice.

The charts that politicians show often fail to communicate exponential growth effectively (Credit: Reuters)

This chimes with other findings by Ritwik Banerjee and Priyama Majumdar at the Indian Institute of Management in Bangalore, and Joydeep Bhattacharya at Iowa State University. In their study (currently under peer review), they found susceptibility to the exponential growth bias can predict reduced compliance with the World Health Organization’s recommendations – including mask wearing, handwashing, the use of sanitisers and self-isolation.

The researchers speculate that some of the graphical representations found in the media may have been counter-productive. It’s common for the number of infections to be presented on a “logarithmic scale”, in which the figures on the y-axis increase by a power of 10 (so the gap between 1 and 10 is the same as the gap between 10 and 100, or 100 and 1000).

While this makes it easier to plot different regions with low and high growth rates, it means that exponential growth looks more linear than it really is, which could reinforce the exponential growth bias. “To expect people to use the logarithmic scale to extrapolate the growth path of a disease is to demand a very high level of cognitive ability,” the authors told me in an email. In their view, simple numerical tables may actually be more powerful.
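
The visual effect the researchers describe is easy to reproduce. The sketch below (added for illustration, using matplotlib and an assumed 3.5-day doubling time) plots the same exponential series on a linear and a logarithmic axis side by side:

# The same exponential curve on linear vs logarithmic axes.
import matplotlib.pyplot as plt

days = range(60)
cases = [100 * 2 ** (d / 3.5) for d in days]   # doubling every 3.5 days (assumed)

fig, (ax_lin, ax_log) = plt.subplots(1, 2, figsize=(8, 3))
ax_lin.plot(days, cases)
ax_lin.set_title("Linear scale")
ax_log.plot(days, cases)
ax_log.set_yscale("log")
ax_log.set_title("Logarithmic scale")
for ax in (ax_lin, ax_log):
    ax.set_xlabel("Days")
    ax.set_ylabel("Cases")
plt.tight_layout()
plt.show()

On the left the curve shoots upward; on the right the identical data forms a tidy straight line, which is exactly the flattening of perception the researchers worry about.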

The good news is that people’s views are malleable. When Lammers and colleagues reminded the participants of the exponential growth bias, and asked them to calculate the growth in regular steps over a two-week period, people hugely improved their estimates of the disease’s spread – and this, in turn, changed their views on social distancing. Sele, meanwhile, has recently shown that small changes in framing can matter. Emphasising the short amount of time it will take to reach a large number of cases, for instance – and the time that would be gained by social distancing measures – improves people’s understanding of accelerating growth more than simply stating the percentage increase each day does.

Lammers believes that the exponential nature of the virus needs to be made more salient in coverage of the pandemic. “I think this study shows how media and government should report on a pandemic in such a situation. Not only report the numbers of today and growth over the past week, but also explain what will happen in the next days, week, month, if the same accelerating growth persists,” he says.

He is confident that even a small effort to correct this bias could bring huge benefits. In the US, where the pandemic has hit hardest, it took only a few months for the virus to infect more than five million people, he says. “If we could have overcome the exponential growth bias and had convinced all Americans of this risk back in March, I am sure 99% would have embraced all possible distancing measures.”

David Robson is the author of The Intelligence Trap: Why Smart People Do Dumb Things (WW Norton/Hodder & Stoughton), which examines the psychology of irrational thinking and the best ways to make wiser decisions.

The Biblical Flood That Will Drown California (Wired)

Tom Philpott, 08.29.20 8:00 AM

The Great Flood of 1861–1862 was a preview of what scientists expect to see again, and soon.

This story originally appeared on Mother Jones and is part of the Climate Desk collaboration.

In November 1860, a young scientist from upstate New York named William Brewer disembarked in San Francisco after a long journey that took him from New York City through Panama and then north along the Pacific coast. “The weather is perfectly heavenly,” he enthused in a letter to his brother back east. The fast-growing metropolis was already revealing the charms we know today: “large streets, magnificent buildings” adorned by “many flowers we [northeasterners] see only in house cultivations: various kinds of geraniums growing of immense size, dew plant growing like a weed, acacia, fuchsia, etc. growing in the open air.”

Flowery prose aside, Brewer was on a serious mission. Barely a decade after being claimed as a US state, California was plunged in an economic crisis. The gold rush had gone bust, and thousands of restive settlers were left scurrying about, hot after the next ever-elusive mineral bonanza. The fledgling legislature had seen fit to hire a state geographer to gauge the mineral wealth underneath its vast and varied terrain, hoping to organize and rationalize the mad lunge for buried treasure. The potential for boosting agriculture as a hedge against mining wasn’t lost on the state’s leaders. They called on the state geographer to deliver a “full and scientific description of the state’s rocks, fossils, soils, and minerals, and its botanical and zoological productions, together with specimens of same.”

The task of completing the fieldwork fell to the 32-year-old Brewer, a Yale-trained botanist who had studied cutting-edge agricultural science in Europe. His letters home, chronicling his four-year journey up and down California, form one of the most vivid contemporary accounts of its early statehood.

They also provide a stark look at the greatest natural disaster known to have befallen the western United States since European contact in the 16th century: the Great Flood of 1861–1862. The cataclysm cut off telegraph communication with the East Coast, swamped the state’s new capital, and submerged the entire Central Valley under as much as 15 feet of water. Yet in modern-day California—a region that author Mike Davis once likened to a “Book of the Apocalypse theme park,” where this year’s wildfires have already burned 1.4 million acres, and dozens of fires are still raging—the nearly forgotten biblical-scale flood documented by Brewer’s letters has largely vanished from the public imagination, replaced largely by traumatic memories of more recent earthquakes.

When it was thought of at all, the flood was once considered a thousand-year anomaly, a freak occurrence. But emerging science demonstrates that floods of even greater magnitude occurred every 100 to 200 years in California’s precolonial history. Climate change will make them more frequent still. In other words, the Great Flood was a preview of what scientists expect to see again, and soon. And this time, given California’s emergence as an agricultural and economic powerhouse, the effects will be all the more devastating.

Barely a year after Brewer’s sunny initial descent from a ship in San Francisco Bay, he was back in the city, on a break. In a November 1861 letter home, he complained of a “week of rain.” In his next letter, two months later, Brewer reported jaw-dropping news: Rain had fallen almost continuously since he had last written—and now the entire Central Valley was underwater. “Thousands of farms are entirely underwater—cattle starving and drowning.”

Picking up the letter nine days later, he wrote that a bad situation had deteriorated. All the roads in the middle of the state are “impassable, so all mails are cut off.” Telegraph service, which had only recently been connected to the East Coast through the Central Valley, stalled. “The tops of the poles are under water!” The young state’s capital city, Sacramento, about 100 miles northeast of San Francisco at the western edge of the valley and the intersection of two rivers, was submerged, forcing the legislature to evacuate—and delaying a payment Brewer needed to forge ahead with his expedition.

The surveyor gaped at the sheer volume of rain. In a normal year, Brewer reported, San Francisco received about 20 inches. In the 10 weeks leading up to January 18, 1862, the city got “thirty-two and three-quarters inches and it is still raining!”

Brewer went on to recount scenes from the Central Valley that would fit in a Hollywood disaster epic. “An old acquaintance, a buccaro [cowboy], came down from a ranch that was overflowed,” he wrote. “The floor of their one-story house was six weeks under water before the house went to pieces.” Steamboats “ran back over the ranches fourteen miles from the [Sacramento] river, carrying stock [cattle], etc., to the hills,” he reported. He marveled at the massive impromptu lake made up of “water ice cold and muddy,” in which “winds made high waves which beat the farm homes in pieces.” As a result, “every house and farm over this immense region is gone.”

Eventually, in March, Brewer made it to Sacramento, hoping (without success) to lay hands on the state funds he needed to continue his survey. He found a city still in ruins, weeks after the worst of the rains. “Such a desolate scene I hope never to see again,” he wrote: “Most of the city is still under water, and has been for three months … Every low place is full—cellars and yards are full, houses and walls wet, everything uncomfortable.” The “better class of houses” were in rough shape, Brewer observed, but “it is with the poorer classes that this is the worst.” He went on: “Many of the one-story houses are entirely uninhabitable; others, where the floors are above the water are, at best, most wretched places in which to live.” He summarized the scene:

Many houses have partially toppled over; some have been carried from their foundations, several streets (now avenues of water) are blocked up with houses that have floated in them, dead animals lie about here and there—a dreadful picture. I don’t think the city will ever rise from the shock, I don’t see how it can.

Brewer’s account is important for more than just historical interest. In the 160 years since the botanist set foot on the West Coast, California has transformed from an agricultural backwater to one of the jewels of the US food system. The state produces nearly all of the almonds, walnuts, and pistachios consumed domestically; 90 percent or more of the broccoli, carrots, garlic, celery, grapes, tangerines, plums, and artichokes; at least 75 percent of the cauliflower, apricots, lemons, strawberries, and raspberries; and more than 40 percent of the lettuce, cabbage, oranges, peaches, and peppers.

And as if that weren’t enough, California is also a national hub for milk production. Tucked in amid the almond groves and vegetable fields are vast dairy operations that confine cows together by the thousands and produce more than a fifth of the nation’s milk supply, more than any other state. It all amounts to a food-production juggernaut: California generates $46 billion worth of food per year, nearly double the haul of its closest competitor among US states, the corn-and-soybean behemoth Iowa.

You’ve probably heard that ever more frequent and severe droughts threaten the bounty we’ve come to rely on from California. Water scarcity, it turns out, isn’t the only menace that stalks the California valleys that stock our supermarkets. The opposite—catastrophic flooding—also occupies a niche in what Mike Davis, the great chronicler of Southern California’s sociopolitical geography, has called the state’s “ecology of fear.” Indeed, his classic book of that title opens with an account of a 1995 deluge that saw “million-dollar homes tobogganed off their hill-slope perches” and small children and pets “sucked into the deadly vortices of the flood channels.”

Yet floods tend to be less feared than rival horsemen of the apocalypse in the state’s oft-stimulated imagination of disaster. The epochal 2011–2017 drought, with its missing-in-action snowpacks and draconian water restrictions, burned itself into the state’s consciousness. Californians are rightly terrified of fires like the ones that roared through the northern Sierra Nevada foothills and coastal canyons near Los Angeles in the fall of 2018, killing nearly 100 people and fouling air for miles around, or the current LNU Lightning Complex fire that has destroyed nearly 1,000 structures and killed five people in the region between Sacramento and San Francisco. Many people are frightfully aware that a warming climate will make such conflagrations increasingly frequent. And “earthquake kits” are common gear in closets and garages all along the San Andreas Fault, where the next Big One lurks. Floods, though they occur as often in Southern and Central California as they do anywhere in the United States, don’t generate quite the same buzz.

But a growing body of research shows there’s a flip side to the megadroughts Central Valley farmers face: megafloods. The region most vulnerable to such a water-drenched cataclysm in the near future is, ironically enough, California’s great arid, sinking food-production basin and the beleaguered behemoth of the US food system: the Central Valley. Bordered on all sides by mountains, the Central Valley stretches 450 miles long, is on average 50 miles wide, and occupies a land mass of 18,000 square miles, or 11.5 million acres—roughly equivalent in size to Massachusetts and Vermont combined. Wedged between the Sierra Nevada to the east and the Coast Ranges to the west, it’s one of the globe’s greatest expanses of fertile soil and temperate weather. For most Americans, it’s easy to ignore the Central Valley, even though it’s as important to eaters as Hollywood is to moviegoers or Silicon Valley is to smartphone users. Occupying less than 1 percent of US farmland, the Central Valley churns out a quarter of the nation’s food supply.

At the time of the Great Flood, the Central Valley was still mainly cattle ranches, the farming boom a ways off. Late in 1861, the state suddenly emerged from a two-decade dry spell when monster storms began lashing the West Coast from Baja California to present-day Washington state. In central California, the deluge initially took the form of 10 to 15 feet of snow dumped onto the Sierra Nevada, according to research by the UC Berkeley paleoclimatologist B. Lynn Ingram and laid out in her 2015 book, The West Without Water, cowritten with Frances Malamud-Roam. Ingram has emerged as a kind of Cassandra of drought and flood risks in the western United States. Soon after the blizzards came days of warm, heavy rain, which in turn melted the enormous snowpack. The resulting slurry cascaded through the Central Valley’s network of untamed rivers.

As floodwater gathered in the valley, it formed a vast, muddy, wind-roiled lake, its size “rivaling that of Lake Superior,” covering the entire Central Valley floor, from the southern slopes of the Cascade Mountains near the Oregon border to the Tehachapis, south of Bakersfield, with depths in some places exceeding 15 feet.

At least some of the region’s remnant indigenous population saw the epic flood coming and took precautions to escape devastation, Ingram reports, quoting an item in the Nevada City Democrat on January 11, 1862:

We are informed that the Indians living in the vicinity of Marysville left their abodes a week or more ago for the foothills predicting an unprecedented overflow. They told the whites that the water would be higher than it has been for thirty years, and pointed high up on the trees and houses where it would come. The valley Indians have traditions that the water occasionally rises 15 or 20 feet higher than it has been at any time since the country was settled by whites, and as they live in the open air and watch closely all the weather indications, it is not improbable that they may have better means than the whites of anticipating a great storm.

All in all, thousands of people died, “one-third of the state’s property was destroyed, and one home in eight was destroyed completely or carried away by the floodwaters.” As for farming, the 1862 megaflood transformed valley agriculture, playing a decisive role in creating today’s Anglo-dominated, crop-oriented agricultural powerhouse: a 19th-century example of the “disaster capitalism” that Naomi Klein describes in her 2007 book, The Shock Doctrine.

Prior to the event, valley land was still largely owned by Mexican rancheros who held titles dating to Spanish rule. The 1848 Treaty of Guadalupe Hidalgo, which triggered California’s transfer from Mexican to US control, gave rancheros US citizenship and obligated the new government to honor their land titles. The treaty terms met with vigorous resentment from white settlers eager to shift from gold mining to growing food for the new state’s burgeoning cities. The rancheros thrived during the gold rush, finding a booming market for beef in mining towns. By 1856, their fortunes had shifted. A severe drought that year cut production, competition from emerging US settler ranchers meant lower prices, and punishing property taxes—imposed by land-poor settler politicians—caused a further squeeze. “As a result, rancheros began to lose their herds, their land, and their homes,” writes the historian Lawrence James Jelinek.

The devastation of the 1862 flood, its effects magnified by a brutal drought that started immediately afterward and lasted through 1864, “delivered the final blow,” Jelinek writes. Between 1860 and 1870, California’s cattle herd, concentrated in the valley, plunged from 3 million to 630,000. The rancheros were forced to sell their land to white settlers at pennies per acre, and by 1870 “many rancheros had become day laborers in the towns,” Jelinek reports. The valley’s emerging class of settler farmers quickly turned to wheat and horticultural production and set about harnessing and exploiting the region’s water resources, both those gushing forth from the Sierra Nevada and those beneath their feet.

Despite all the trauma it generated and the agricultural transformation it cemented in the Central Valley, the flood quickly faded from memory in California and the broader United States. To his shocked assessment of a still-flooded and supine Sacramento months after the storm, Brewer added a prophetic coda:

No people can so stand calamity as this people. They are used to it. Everyone is familiar with the history of fortunes quickly made and as quickly lost. It seems here more than elsewhere the natural order of things. I might say, indeed, that the recklessness of the state blunts the keener feelings and takes the edge from this calamity.

Indeed, the new state’s residents ended up shaking off the cataclysm. What lesson does the Great Flood of 1862 hold for today? The question is important. Back then, just around 500,000 people lived in the entire state, and the Central Valley was a sparsely populated badland. Today, the valley has a population of 6.5 million people and boasts the state’s three fastest-growing counties. Sacramento (population 501,344), Fresno (538,330), and Bakersfield (386,839) are all budding metropolises. The state’s long-awaited high-speed train, if it’s ever completed, will place Fresno residents within an hour of Silicon Valley, driving up its appeal as a bedroom community.

In addition to the potentially vast human toll, there’s also the fact that the Central Valley has emerged as a major linchpin of the US and global food system. Could it really be submerged under fifteen feet of water again—and what would that mean?

In less than two centuries as a US state, California has maintained its reputation as a sunny paradise while also enduring the nation’s most erratic climate: the occasional massive winter storm roaring in from the Pacific; years-long droughts. But recent investigations into the geological record show that, by the standards of the region’s deeper past, the period since statehood has been relatively stable.

One avenue of this research is the study of the regular megadroughts, the most recent of which occurred just a century before Europeans made landfall on the North American west coast. As we are now learning, those decades-long arid stretches were just as regularly interrupted by enormous storms—many even grander than the one that began in December 1861. (Indeed, that event itself was directly preceded and followed by serious droughts.) In other words, the same patterns that make California vulnerable to droughts also make it ripe for floods.

Beginning in the 1980s, scientists including B. Lynn Ingram began examining streams and banks in the enormous delta network that together serve as the bathtub drain through which most Central Valley runoff has flowed for millennia, reaching the ocean at the San Francisco Bay. (Now-vanished Tulare Lake gathered runoff in the southern part of the valley.) They took deep-core samples from river bottoms, because big storms that overflow the delta’s banks transfer loads of soil and silt from the Sierra Nevada and deposit a portion of it in the Delta. They also looked at fluctuations in old plant material buried in the sediment layers. Plant species that thrive in freshwater suggest wet periods, as heavy runoff from the mountains crowds out seawater. Salt-tolerant species denote dry spells, as sparse mountain runoff allows seawater to work into the delta.

What they found was stunning. The Great Flood of 1862 was no one-off black-swan event. Summarizing the science, Ingram and USGS researcher Michael Dettinger deliver the dire news: a flood comparable to—and sometimes much more intense than—the 1861–1862 catastrophe occurred sometime during each of the periods 1235–1360, 1395–1410, 1555–1615, 1750–1770, and 1810–1820; “that is, one megaflood every 100 to 200 years.” They also discovered that the 1862 flood didn’t appear in the sediment record at some sites that showed evidence of multiple massive events—suggesting that it was actually smaller than many of the floods that have inundated California over the centuries.

During its time as a US food-production powerhouse, California has been known for its periodic droughts and storms. But Ingram and Dettinger’s work pulls the lens back to view the broader timescale, revealing the region’s swings between megadroughts and megastorms—ones more than severe enough to challenge concentrated food production, much less dense population centers.

The dynamics of these storms themselves explain why the state is so prone to such swings. Meteorologists have known for decades that the tempests that descend upon California over the winter—and from which the state receives the great bulk of its annual precipitation—carry moisture from the tropical Pacific. In the late 1990s, scientists discovered that these “pineapple expresses,” as TV weather presenters call them, are a subset of a global weather phenomenon: long, wind-driven plumes of vapor about a mile above the sea that carry moisture from warm areas near the equator on a northeasterly path to colder, drier regions toward the poles. They carry so much moisture—often more than 25 times the flow of the Mississippi River, over thousands of miles—that they’ve been dubbed “atmospheric rivers.”

In a pioneering 1998 paper, researchers Yong Zhu and Reginald E. Newell found that nearly all the vapor transport from the subtropics (the latitude belts just poleward of the tropics in each hemisphere) toward the poles occurred in just five or six narrow bands. And California, it turns out, is the prime spot on the western side of the northern hemisphere for catching them at full force during the winter months.

As Ingram and Dettinger note, atmospheric rivers are the primary vector for California’s floods. That includes pre-Columbian cataclysms as well as the Great Flood of 1862, all the way to the various smaller ones that regularly run through the state. Between 1950 and 2010, Ingram and Dettinger write, atmospheric rivers “caused more than 80 percent of flooding in California rivers and 81 percent of the 128 most well-documented levee breaks in California’s Central Valley.”

Paradoxically, they are at least as much a lifeblood as a curse. Between eight and 11 atmospheric rivers hit California every year, the great majority of them doing no major damage, and they deliver between 30 and 50 percent of the state’s rain and snow. But the big ones are damaging indeed. Other researchers are reaching similar conclusions. In a study released in December 2019, a team from the US Army Corps of Engineers and the Scripps Institution of Oceanography found that atmospheric-river storms accounted for 84 percent of insured flood damages in the western United States between 1978 and 2017; the 13 biggest storms wrought more than half the damage.

So the state—and a substantial portion of our food system—exists on a razor’s edge between droughts and floods, its annual water resources governed by massive, increasingly fickle transfers of moisture from the tropical Pacific. As Dettinger puts it, the “largest storms in California’s precipitation regime not only typically end the state’s frequent droughts, but their fluctuations also cause those droughts in the first place.”

We know that before human civilization began spewing millions of tons of greenhouse gases into the atmosphere annually, California was due “one megaflood every 100 to 200 years”—and the last one hit more than a century and a half ago. What happens to this outlook when you heat up the atmosphere by 1 degree Celsius—and are on track to hit at least another half-degree Celsius increase by midcentury?

That was the question posed by Daniel Swain and a team of researchers at UCLA’s Department of Atmospheric and Oceanic Sciences in a series of studies, the first of which was published in 2018. They took California’s long pattern of droughts and floods and mapped it onto the climate models based on data specific to the region, looking out to century’s end.

What they found isn’t comforting. As the tropical Pacific Ocean and the atmosphere just above it warm, more seawater evaporates, feeding ever bigger atmospheric rivers gushing toward the California coast. As a result, the potential for storms on the scale of the ones that triggered the Great Flood has increased “more than threefold,” they found. So an event expected to happen on average every 200 years will now happen every 65 or so. It is “more likely than not we will see one by 2060,” and it could plausibly happen again before century’s end, they concluded.
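
The arithmetic behind that shift is easier to grasp as odds over a fixed horizon. The sketch below is a back-of-envelope illustration only, not the UCLA team's model: it assumes a constant annual probability set by the average recurrence interval, whereas the study treats the risk as growing over the century, which is why its own "more likely than not by 2060" estimate runs higher than this stationary calculation.

```python
# Back-of-envelope only: treat a megastorm as an independent yearly event whose
# probability is fixed by the average recurrence interval. The recurrence figures
# (200 years historically, ~65 years projected) come from the article; the
# horizons below are illustrative choices.

def chance_of_at_least_one(recurrence_years: float, horizon_years: int) -> float:
    """Probability of seeing at least one event within the horizon."""
    annual_p = 1.0 / recurrence_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for label, interval in [("historical, ~1-in-200-year", 200), ("projected, ~1-in-65-year", 65)]:
    odds_40 = chance_of_at_least_one(interval, 40)   # roughly now to 2060
    odds_80 = chance_of_at_least_one(interval, 80)   # roughly now to 2100
    print(f"{label}: {odds_40:.0%} chance by ~2060, {odds_80:.0%} by ~2100")
```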

As the risk of a catastrophic event increases, so will the frequency of what they call “precipitation whiplash”: extremely wet seasons interrupted by extremely dry ones, and vice versa. The winter of 2016–2017 provides a template. That year, a series of atmospheric-river storms filled reservoirs and at one point threatened a major flood in the northern Central Valley, abruptly ending the worst multiyear drought in the state’s recorded history.

Swings on that magnitude normally occur a handful of times each century, but in the model by Swain’s team, “it goes from something that happens maybe once in a generation to something that happens two or three times,” he told me in an interview. “Setting aside a repeat of 1862, these less intense events could still seriously test the limits of our water infrastructure.” Like other efforts to map climate change onto California’s weather, this one found that drought years characterized by low winter precipitation would likely increase—in this case, by a factor of as much as two, compared with mid-20th-century patterns. But extreme-wet winter seasons, accumulating at least as much precipitation as 2016–2017, will grow even more: they could be three times as common as they were before the atmosphere began its current warming trend.

While lots of very wet years—at least the ones that don’t reach 1861–1862 levels—might sound encouraging for food production in the Central Valley, there’s a catch, Swain said. His study looked purely at precipitation, independent of whether it fell as rain or snow. A growing body of research suggests that as the climate warms, California’s precipitation mix will shift significantly in favor of rain over snow. That’s dire news for our food system, because the Central Valley’s vast irrigation networks are geared to channeling the slow, predictable melt of the snowpack into usable water for farms. Water that falls as rain is much harder to capture and bend to the slow-release needs of agriculture.

In short, California’s climate, chaotic under normal conditions, is about to get weirder and wilder. Indeed, it’s already happening.

What if an 1862-level flood, which is overdue and “more likely than not” to occur within a couple of decades, were to hit present-day California?

Starting in 2008, the USGS set out to answer just that question, launching a project called the ARkStorm (for “atmospheric river 1,000 storm”) Scenario. The effort was modeled on a previous USGS push to get a grip on another looming California cataclysm: a massive earthquake along the San Andreas Fault. In 2008, USGS produced the ShakeOut Earthquake Scenario, a “detailed depiction of a hypothetical magnitude 7.8 earthquake.” The study “served as the centerpiece of the largest earthquake drill in US history, involving over five thousand emergency responders and the participation of over 5.5 million citizens,” the USGS later reported.

That same year, the agency assembled a team of 117 scientists, engineers, public-policy experts, and insurance experts to model what kind of impact a monster storm event would have on modern California.

At the time, Lucy Jones served as the chief scientist for the USGS’s Multi Hazards Demonstration Project, which oversaw both projects. A seismologist by training, Jones spent her time studying the devastations of earthquakes and convincing policy makers to invest resources into preparing for them. The ARkStorm project took her aback, she told me. The first thing she and her team did was ask, What’s the biggest flood in California we know about? “I’m a fourth-generation Californian who studies disaster risk, and I had never heard of the Great Flood of 1862,” she said. “None of us had heard of it,” she added—not even the meteorologists knew about what’s “by far the biggest disaster ever in California and the whole Southwest” over the past two centuries.

At first, the meteorologists were constrained in modeling a realistic megastorm by a lack of data; solid rainfall-gauge measures go back only a century. But after hearing about the 1862 flood, the ARkStorm team dug into research from Ingram and others for information about megastorms before US statehood and European contact. They were shocked to learn that the previous 1,800 years had about six events that were more severe than 1862, along with several more that were roughly of the same magnitude. What they found was that a massive flood is every bit as likely to strike California, and as imminent, as a massive quake.

Even with this information, modeling a massive flood proved more challenging than projecting out a massive earthquake. “We seismologists do this all the time—we create synthetic seismographs,” she said. Want to see what a quake reaching 7.8 on the Richter scale would look like along the San Andreas Fault? Easy, she said. Meteorologists, by contrast, are fixated on accurate prediction of near-future events; “creating a synthetic event wasn’t something they had ever done.” They couldn’t just re-create the 1862 event, because most of the information we have about it is piecemeal, from eyewitness accounts and sediment samples.

To get their heads around how to construct a reasonable approximation of a megastorm, the team’s meteorologists went looking for well-documented 20th-century events that could serve as a model. They settled on two: a series of big storms in 1969 that hit Southern California hardest and a 1986 cluster that did the same to the northern part of the state. To create the ARkStorm scenario, they stitched the two together. Doing so gave the researchers a rich and regionally precise trove of data to sketch out a massive Big One storm scenario.

There was one problem: While the fictional ARkStorm is indeed a massive event, it’s still significantly smaller than the one that caused the Great Flood of 1862. “Our [hypothetical storm] only had total rain for 25 days, while there were 45 days in 1861 to ’62,” Jones said. They plunged ahead anyway, for two reasons. One was that they had robust data on the two 20th-century storm events, giving disaster modelers plenty to work with. The second was that they figured a smaller-than-1862 catastrophe would help build public buy-in, by making the project hard to dismiss as an unrealistic figment of scaremongering bureaucrats.

What they found stunned them—and should stun anyone who relies on California to produce food (not to mention anyone who lives in the state). The headline number: $725 billion in damage, nearly four times what the USGS’s seismology team arrived at for its massive-quake scenario ($200 billion). For comparison, the two most costly natural disasters in modern US history—Hurricane Katrina in 2005 and Harvey in 2017—racked up $166 billion and $130 billion, respectively. The ARkStorm would “flood thousands of square miles of urban and agricultural land, result in thousands of landslides, [and] disrupt lifelines throughout the state for days or weeks,” the study reckoned. Altogether, 25 percent of the state’s buildings would be damaged.

In their model, 25 days of relentless rains overwhelm the Central Valley’s flood-control infrastructure. Then large swaths of the northern part of the Central Valley go under as much as 20 feet of water. The southern part, the San Joaquin Valley, gets off lighter; but a miles-wide band of floodwater collects in the lowest-elevation regions, ballooning out to encompass the expanse that was once the Tulare Lake bottom and stretching to the valley’s southern extreme. Most metropolitan parts of the Bay Area escape severe damage, but swaths of Los Angeles and Orange Counties experience “extensive flooding.”

As Jones stressed to me in our conversation, the ARkStorm scenario is a cautious approximation; a megastorm that matches 1862 or its relatively recent antecedents could plausibly bury the entire Central Valley underwater, northern tip to southern. As the report puts it: “Six megastorms that were more severe than 1861–1862 have occurred in California during the last 1800 years, and there is no reason to believe similar storms won’t occur again.”

A 21st-century megastorm would fall on a region quite different from gold rush–era California. For one thing, it’s much more populous. While the ARkStorm reckoning did not estimate a death toll, it warned of a “substantial loss of life” because “flood depths in some areas could realistically be on the order of 10–20 feet.”

Then there’s the transformation of farming since then. The 1862 storm drowned an estimated 200,000 head of cattle, about a quarter of the state’s entire herd. Today, the Central Valley houses nearly 4 million beef and dairy cows. While cattle continue to be an important part of the region’s farming mix, they no longer dominate it. Today the valley is increasingly given over to intensive almond, pistachio, and grape plantations, representing billions of dollars of investments in crops that take years to establish, are expected to flourish for decades, and could be wiped out by a flood.

Apart from economic losses, “the evolution of a modern society creates new risks from natural disasters,” Jones told me. She cited electric power grids, which didn’t exist in mid-19th-century California. A hundred years ago, when electrification was taking off, extended power outages caused inconveniences. Now, loss of electricity can mean death for vulnerable populations (think hospitals, nursing homes, and prisons). Another example is the intensification of farming. When a few hundred thousand cattle roamed the sparsely populated Central Valley in 1861, their drowning posed relatively limited biohazard risks, although, according to one contemporary account, in post-flood Sacramento, there were a “good many drowned hogs and cattle lying around loose in the streets.”

Today, however, several million cows are packed into massive feedlots in the southern Central Valley, their waste often concentrated in open-air liquid manure lagoons, ready to be swept away and blended into a fecal slurry. Low-lying Tulare County houses nearly 500,000 dairy cows, with 258 operations holding on average 1,800 cattle each. Mature modern dairy cows are massive creatures, weighing around 1,500 pounds each and standing nearly 5 feet tall at the front shoulder. Imagine trying to quickly move such beasts by the thousands out of the path of a flood—and the consequences of failing to do so.

A massive flood could severely pollute soil and groundwater in the Central Valley, and not just from rotting livestock carcasses and millions of tons of concentrated manure. In a 2015 paper, a team of USGS researchers tried to sum up the myriad toxic substances that would be stirred up and spread around by massive storms and floods. The cities of 160 years ago could not boast municipal wastewater facilities, which filter pathogens and pollutants in human sewage, nor municipal dumps, which concentrate often-toxic garbage. In the region’s teeming 21st-century urban areas, those vital sanitation services would become major threats. The report projects that a toxic soup of “petroleum, mercury, asbestos, persistent organic pollutants, molds, and soil-borne or sewage-borne pathogens” would spread across much of the valley, as would concentrated animal manure, fertilizer, pesticides, and other industrial chemicals.

The valley’s southernmost county, Kern, is a case study in the region’s vulnerabilities. Kern’s farmers lead the entire nation in agricultural output by dollar value, annually producing $7 billion worth of foodstuffs like almonds, grapes, citrus, pistachios, and milk. The county houses more than 156,000 dairy cows in facilities averaging 3,200 head each. That frenzy of agricultural production means loads of chemicals on hand; every year, Kern farmers use around 30 million pounds of pesticides, second only to Fresno among California counties. (Altogether, five San Joaquin Valley counties use about half of the more than 200 million pounds of pesticides applied in California.)

Kern is also one of the nation’s most prodigious oil-producing counties. Its vast array of pump jacks, many of them located in farm fields, produce 70 percent of California’s entire oil output. It’s also home to two large oil refineries. If Kern County were a state, it would be the nation’s seventh-leading oil-producing one, churning out twice as much crude as Louisiana. In a massive storm, floodwaters could pick up a substantial amount of highly toxic petroleum and byproducts. Again, in the ARkStorm scenario, Kern County gets hit hard by rain but mostly escapes the worst flooding. The real “Other Big One” might not be so kind, Jones said.

In the end, the USGS team could not estimate the level of damage that would be visited upon the Central Valley’s soil and groundwater by a megaflood: too many variables, too many toxins and biohazards that could be sucked into the vortex. They concluded that “flood-related environmental contamination impacts are expected to be the most widespread and substantial in lowland areas of the Central Valley, the Sacramento–San Joaquin River Delta, the San Francisco Bay area, and portions of the greater Los Angeles metroplex.”

Jones said the initial reaction to the 2011 release of the ARkStorm report among California’s policymakers and emergency managers was skepticism: “Oh, no, that’s too big—it’s impossible,” they would say. “We got lots of traction with the earthquake scenario, and when we did the big flood, nobody wanted to listen to us,” she said.

But after years of patiently informing the state’s decisionmakers that such a disaster is just as likely as a megaquake—and likely much more devastating—the word is getting out. She said the ARkStorm message probably helped prepare emergency managers for the severe storms of February 2017. That month, the massive Oroville Dam in the Sierra Nevada foothills very nearly failed, threatening to send a 30-foot-tall wall of water gushing into the northern Central Valley. As the spillway teetered on the edge of collapse, officials ordered the evacuation of 188,000 people in the communities below. The entire California National Guard was put on notice to mobilize if needed—the first such order since the 1992 Rodney King riots in Los Angeles. Although the dam ultimately held up, the Oroville incident illustrates the challenges of moving hundreds of thousands of people out of harm’s way on short notice.

The evacuation order “unleashed a flood of its own, sending tens of thousands of cars simultaneously onto undersize roads, creating hours-long backups that left residents wondering if they would get to high ground before floodwaters overtook them,” the Sacramento Bee reported. Eight hours after the evacuation, highways were still jammed with slow-moving traffic. A California Highway Patrol spokesman summed up the scene for the Bee:

Unprepared citizens who were running out of gas and their vehicles were becoming disabled in the roadway. People were utilizing the shoulder, driving the wrong way. Traffic collisions were occurring. People fearing for their lives, not abiding by the traffic laws. All combined, it created big problems. It ended up pure, mass chaos.

Even so, Jones said the evacuation went as smoothly as could be expected and likely would have saved thousands of lives if the dam had burst. “But there are some things you can’t prepare for.” Obviously, getting area residents to safety was the first priority, but animal inhabitants were vulnerable, too. If the dam had burst, she said, “I doubt they would have been able to save cattle.”

As the state’s ever-strained emergency-service agencies prepare for the Other Big One, there’s evidence other agencies are struggling to grapple with the likelihood of a megaflood. In the wake of the 2017 near-disaster at Oroville, state agencies spent more than $1 billion repairing the damaged dam and bolstering it against future storms. Just as the work was being completed in fall 2018, the Federal Energy Regulatory Commission assessed the situation and found that a “probable maximum flood”—on the scale of the ARkStorm—would likely overwhelm the dam. FERC called on the state to invest in a “more robust and resilient design” to prevent a future cataclysm. The state’s Department of Water Resources responded by launching a “needs assessment” of the dam’s safety that’s due to wrap up in 2020.

Of course, in a state beset by the increasing threat of wildfires in populated areas as well as earthquakes, funds for disaster preparation are tightly stretched. All in all, Jones said, “we’re still much more prepared for a quake than a flood.” Then again, it’s hard to conceive of how we could effectively prevent a 21st century repeat of the Great Flood or how we could fully prepare for the low-lying valley that runs along the center of California like a bathtub—now packed with people, livestock, manure, crops, petrochemicals, and pesticides—to be suddenly transformed into a storm-roiled inland sea.

World population likely to shrink after mid-century, forecasting major shifts in global population and economic power (Science Daily)

Date: July 15, 2020

Source: The Lancet

Summary: With widespread, sustained declines in fertility, the world population will likely peak in 2064 at around 9.7 billion, and then decline to about 8.8 billion by 2100 — about 2 billion lower than some previous estimates, according to a new study.

Illustration of people forming a world map (stock image). Credit: © Mopic

Improvements in access to modern contraception and the education of girls and women are generating widespread, sustained declines in fertility, and world population will likely peak in 2064 at around 9.7 billion, and then decline to about 8.8 billion by 2100 — about 2 billion lower than some previous estimates, according to a new study published in The Lancet.

The modelling research uses data from the Global Burden of Disease Study 2017 to project future global, regional, and national population. Using novel methods for forecasting mortality, fertility, and migration, the researchers from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington’s School of Medicine estimate that by 2100, 183 of 195 countries will have a total fertility rate (TFR, the average number of children a woman delivers over her lifetime) below the replacement level of 2.1 births per woman. This means that in these countries populations will decline unless low fertility is offset by immigration.

The new population forecasts contrast with projections of ‘continuing global growth’ by the United Nations Population Division, and highlight the huge challenges to economic growth of a shrinking workforce, the high burden on health and social support systems of an aging population, and the impact on global power linked to shifts in world population.

The new study also predicts huge shifts in the global age structure, with an estimated 2.37 billion individuals over 65 years globally in 2100, compared with 1.7 billion under 20 years, underscoring the need for liberal immigration policies in countries with significantly declining working age populations.

“Continued global population growth through the century is no longer the most likely trajectory for the world’s population,” says IHME Director Dr. Christopher Murray, who led the research. “This study provides governments of all countries an opportunity to start rethinking their policies on migration, workforces and economic development to address the challenges presented by demographic change.”

IHME Professor Stein Emil Vollset, first author of the paper, continues, “The societal, economic, and geopolitical power implications of our predictions are substantial. In particular, our findings suggest that the decline in the numbers of working-age adults alone will reduce GDP growth rates that could result in major shifts in global economic power by the century’s end. Responding to population decline is likely to become an overriding policy concern in many nations, but must not compromise efforts to enhance women’s reproductive health or progress on women’s rights.”

Dr Richard Horton, Editor-in-Chief, The Lancet, adds: “This important research charts a future we need to be planning for urgently. It offers a vision for radical shifts in geopolitical power, challenges myths about immigration, and underlines the importance of protecting and strengthening the sexual and reproductive rights of women. The 21st century will see a revolution in the story of our human civilisation. Africa and the Arab World will shape our future, while Europe and Asia will recede in their influence. By the end of the century, the world will be multipolar, with India, Nigeria, China, and the US the dominant powers. This will truly be a new world, one we should be preparing for today.”

Accelerating decline in fertility worldwide

The global TFR is predicted to steadily decline, from 2.37 in 2017 to 1.66 in 2100 — well below the minimum rate (2.1) considered necessary to maintain population numbers (replacement level) — with rates falling to around 1.2 in Italy and Spain, and as low as 1.17 in Poland.
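
To make the replacement-level threshold concrete, here is a deliberately stylized calculation, not the IHME cohort model (which also projects mortality and migration country by country): with no migration, each generation is roughly TFR divided by 2.1 times the size of the one before it, since about 2.1 births per woman are needed just to replace both parents once childhood mortality and the slight excess of boys at birth are taken into account.

```python
# Stylized generational arithmetic, for illustration only (not the IHME model):
# with no migration, each generation shrinks by roughly the factor TFR / 2.1.

def generation_sizes(start: float, tfr: float, n_generations: int) -> list[float]:
    """Sizes of successive generations under a constant TFR and no migration."""
    sizes = [start]
    for _ in range(n_generations):
        sizes.append(sizes[-1] * tfr / 2.1)
    return sizes

# At the projected 2100 global TFR of 1.66, each generation is ~79% the size of
# the previous one, i.e. a decline of roughly a fifth per generation.
print([round(x) for x in generation_sizes(1_000_000, 1.66, 4)])
```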

Even slight changes in TFR translate into large differences in population size in countries below the replacement level — increasing TFR by as little as 0.1 births per woman is equivalent to around 500 million more individuals on the planet in 2100.

Much of the anticipated fertility decline is predicted in high-fertility countries, particularly those in sub-Saharan Africa where rates are expected to fall below the replacement level for the first time — from an average 4.6 births per woman in 2017 to just 1.7 by 2100. In Niger, where the fertility rate was the highest in the world in 2017 — with women giving birth to an average of seven children — the rate is projected to decline to around 1.8 by 2100.

Nevertheless, the population of sub-Saharan Africa is forecast to triple over the course of the century, from an estimated 1.03 billion in 2017 to 3.07 billion in 2100 — as death rates decline and an increasing number of women enter reproductive age. North Africa and the Middle East is the only other region predicted to have a larger population in 2100 (978 million) than in 2017 (600 million).

Many of the fastest-shrinking populations will be in Asia and central and eastern Europe. Populations are expected to more than halve in 23 countries and territories, including Japan (from around 128 million people in 2017 to 60 million in 2100), Thailand (71 to 35 million), Spain (46 to 23 million), Italy (61 to 31 million), Portugal (11 to 5 million), and South Korea (53 to 27 million). An additional 34 countries are expected to have population declines of 25 to 50%, including China (1.4 billion in 2017 to 732 million in 2100; see table).

Huge shifts in global age structure — with over 80s outnumbering under 5s two to one

As fertility falls and life expectancy increases worldwide, the number of children under 5 years old is forecast to decline by 41%, from 681 million in 2017 to 401 million in 2100, whilst the number of individuals older than 80 years is projected to increase sixfold, from 141 million to 866 million. Similarly, in countries with a population decline of more than 25%, the ratio of adults over 80 years to each person aged 15 years or younger is projected to rise from 0.16 in 2017 to 1.50 in 2100.

Furthermore, the global ratio of non-working adults to workers was around 0.8 in 2017, but is projected to increase to 1.16 in 2100 if labour force participation by age and sex does not change.

“While population decline is potentially good news for reducing carbon emissions and stress on food systems, with more old people and fewer young people, economic challenges will arise as societies struggle to grow with fewer workers and taxpayers, and countries’ abilities to generate the wealth needed to fund social support and health care for the elderly are reduced,” says Vollset.

Declining working-age populations could see major shifts in size of economies

The study also examined the economic impact of fewer working-age adults for all countries in 2017. While China is set to replace the USA in 2035 with the largest total gross domestic product (GDP) globally, rapid population decline from 2050 onward will curtail economic growth. As a result, the USA is expected to reclaim the top spot by 2098, if immigration continues to sustain the US workforce.

Although the number of working-age adults in India is projected to fall from 762 million in 2017 to around 578 million in 2100, India is expected to be one of the few, if not the only, major Asian powers to protect its working-age population over the century. It is expected to surpass China’s working-age population in the mid-2020s (China’s workforce is estimated to decline from 950 million in 2017 to 357 million in 2100), rising up the GDP rankings from 7th to 3rd.

Sub-Saharan Africa is likely to become an increasingly powerful continent on the geopolitical stage as its population rises. Nigeria is projected to be the only country among the world’s 10 most populated nations to see its working-age population grow over the course of the century (from 86 million in 2017 to 458 million in 2100), supporting rapid economic growth and its rise in GDP rankings from 23rd place in 2017 to 9th place in 2100.

While the UK, Germany, and France are expected to remain in the top 10 for largest GDP worldwide at the turn of the century, Italy (falling from 9th in 2017 to 25th in 2100) and Spain (from 13th to 28th) are projected to slide down the rankings, reflecting much greater population decline.

Liberal immigration could help sustain population size and economic growth

The study also suggests that population decline could be offset by immigration, with countries that promote liberal immigration better able to maintain their population size and support economic growth, even in the face of declining fertility rates.

The model predicts that some countries with fertility lower than replacement level, such as the USA, Australia, and Canada, will probably maintain their working-age populations through net immigration (see appendix 2 section 4), although the authors note that there is considerable uncertainty about these future trends.

“For high-income countries with below-replacement fertility rates, the best solutions for sustaining current population levels, economic growth, and geopolitical security are open immigration policies and social policies supportive of families having their desired number of children,” Murray says. “However, a very real danger exists that, in the face of declining population, some countries might consider policies that restrict access to reproductive health services, with potentially devastating consequences. It is imperative that women’s freedom and rights are at the top of every government’s development agenda.”

The authors note some important limitations, including that while the study uses the best available data, predictions are constrained by the quantity and quality of past data. They also note that past trends are not always predictive of what will happen in the future, and that some factors not included in the model could change the pace of fertility, mortality, or migration. For example, the COVID-19 pandemic has affected local and national health systems throughout the world, and caused over half a million deaths. However, the authors believe the excess deaths caused by the pandemic are unlikely to significantly alter longer term forecasting trends of global population.

Writing in a linked Comment, Professor Ibrahim Abubakar, University College London (UCL), UK, and Chair of Lancet Migration (who was not involved in the study), says: “Migration can be a potential solution to the predicted shortage of working-age populations. While demographers continue to debate the long-term implications of migration as a remedy for declining TFR, for it to be successful, we need a fundamental rethink of global politics. Greater multilateralism and a new global leadership should enable both migrant sending and migrant-receiving countries to benefit, while protecting the rights of individuals. Nations would need to cooperate at levels that have eluded us to date to strategically support and fund the development of excess skilled human capital in countries that are a source of migrants. An equitable change in global migration policy will need the voice of rich and poor countries. The projected changes in the sizes of national economies and the consequent change in military power might force these discussions.”

He adds: “Ultimately, if Murray and colleagues’ predictions are even half accurate, migration will become a necessity for all nations and not an option. The positive impacts of migration on health and economies are known globally. The choice that we face is whether we improve health and wealth by allowing planned population movement or if we end up with an underclass of imported labour and unstable societies. The Anthropocene has created many challenges such as climate change and greater global migration. The distribution of working-age populations will be crucial to whether humanity prospers or withers.”

The study was in part funded by the Bill & Melinda Gates Foundation. It was conducted by researchers at the University of Washington, Seattle, USA.

Story Source:

Materials provided by The Lancet. Note: Content may be edited for style and length.

Journal Reference:

  1. Stein Emil Vollset, Emily Goren, Chun-Wei Yuan, Jackie Cao, Amanda E Smith, Thomas Hsiao, Catherine Bisignano, Gulrez S Azhar, Emma Castro, Julian Chalek, Andrew J Dolgert, Tahvi Frank, Kai Fukutaki, Simon I Hay, Rafael Lozano, Ali H Mokdad, Vishnu Nandakumar, Maxwell Pierce, Martin Pletcher, Toshana Robalik, Krista M Steuben, Han Yong Wunrow, Bianca S Zlavog, Christopher J L Murray. Fertility, mortality, migration, and population scenarios for 195 countries and territories from 2017 to 2100: a forecasting analysis for the Global Burden of Disease Study. The Lancet, 2020; DOI: 10.1016/S0140-6736(20)30677-2

New model predicts the peaks of the COVID-19 pandemic (Science Daily)

Date: May 29, 2020

Source: Santa Fe Institute

Summary: Researchers describe a single function that accurately describes all existing available data on active COVID-19 cases and deaths — and predicts forthcoming peaks.

As of late May, COVID-19 has killed more than 325,000 people around the world. Even though the worst seems to be over for countries like China and South Korea, public health experts warn that cases and fatalities will continue to surge in many parts of the world. Understanding how the disease evolves can help these countries prepare for an expected uptick in cases.

This week in the journal Frontiers in Physics, researchers describe a single function that accurately describes all existing available data on active cases and deaths — and predicts forthcoming peaks. The tool uses q-statistics, a set of functions and probability distributions developed by Constantino Tsallis, a physicist and member of the Santa Fe Institute’s external faculty. Tsallis worked on the new model together with Ugur Tirnakli, a physicist at Ege University, in Turkey.

“The formula works in all the countries in which we have tested,” says Tsallis.

Neither physicist ever set out to model a global pandemic. But Tsallis says that when he saw the shape of published graphs representing China’s daily active cases, he recognized shapes he’d seen before — namely, in graphs he’d helped produce almost two decades ago to describe the behavior of the stock market.

“The shape was exactly the same,” he says. For the financial data, the function described probability distributions from stock exchanges; for COVID-19, it described the daily number of active cases (and fatalities) as a function of time.

Modeling financial data and tracking a global pandemic may seem unrelated, but Tsallis says they have one important thing in common. “They’re both complex systems,” he says, “and in complex systems, this happens all the time.” Disparate systems from a variety of fields — biology, network theory, computer science, mathematics — often reveal patterns that follow the same basic shapes and evolution.

The financial graph appeared in a 2004 volume co-edited by Tsallis and the late Nobelist Murray Gell-Mann. Tsallis developed q-statistics, also known as “Tsallis statistics,” in the late 1980s as a generalization of Boltzmann-Gibbs statistics to complex systems.

In the new paper, Tsallis and Tirnakli used data from China, where the active case rate is thought to have peaked, to set the main parameters for the formula. Then, they applied it to other countries including France, Brazil, and the United Kingdom, and found that it matched the evolution of the active cases and fatality rates over time.
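
The article does not spell out the formula, but curves of this kind are typically written as a power-law rise multiplied by a q-exponential decay. The sketch below assumes such a form and fits it to synthetic daily counts purely to illustrate the mechanics; the functional form, parameter values, and data are all placeholders, not the values from Tsallis and Tirnakli's paper.

```python
# Illustrative only: a peaked curve built from a power-law onset and a
# q-exponential-style tail, fitted to synthetic daily case counts. The actual
# functional form and parameters used by Tsallis and Tirnakli are in their
# paper; everything here (form, parameter values, data) is a placeholder.
import numpy as np
from scipy.optimize import curve_fit

def q_peak(t, C, alpha, beta, gamma, q):
    """Rises like t**alpha, then decays with a q-exponential-style tail."""
    t = np.asarray(t, dtype=float)
    return C * t**alpha / (1.0 + (q - 1.0) * beta * t**gamma) ** (1.0 / (q - 1.0))

# Synthetic "active cases per day" standing in for a real national time series.
days = np.arange(1, 121)
assumed = (50.0, 1.8, 0.02, 1.3, 1.4)  # C, alpha, beta, gamma, q (assumed values)
cases = q_peak(days, *assumed)
cases = cases * (1 + 0.05 * np.random.default_rng(0).standard_normal(days.size))

# Least-squares fit; the bounds keep q > 1 so the tail stays q-exponential.
popt, _ = curve_fit(q_peak, days, cases,
                    p0=(10.0, 1.5, 0.01, 1.2, 1.5),
                    bounds=([0.0, 0.5, 1e-4, 0.5, 1.01], [1e4, 4.0, 1.0, 3.0, 3.0]))
peak_day = days[np.argmax(q_peak(days, *popt))]
print("fitted parameters:", np.round(popt, 3), "| estimated peak day:", peak_day)
```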

The model, says Tsallis, could be used to create useful tools like an app that updates in real-time with new available data, and can adjust its predictions accordingly. In addition, he thinks that it could be fine-tuned to fit future outbreaks as well.

“The functional form seems to be universal,” he says, “Not just for this virus, but for the next one that might appear as well.”

Story Source:

Materials provided by Santa Fe Institute. Note: Content may be edited for style and length.

Journal Reference:

  1. Constantino Tsallis, Ugur Tirnakli. Predicting COVID-19 Peaks Around the World. Frontiers in Physics, 2020; 8 DOI: 10.3389/fphy.2020.00217

‘Se Brasil parar por duas semanas, é possível evitar as 125 mil mortes’, diz especialista (Folha de S.Paulo)

Marina Dias, 28 de maio de 2020

Ali Mokdad dirige parte das projeções feitas pelo IHME, instituto de métrica da Universidade de Washington utilizado pela Casa Branca como um dos principais modelos para monitorar Covid-19.

Desde o meio de maio, Mokdad e sua equipe acompanham o avanço da pandemia no Brasil e suas conclusões são bastantes sombrias. Na segunda-feira (25), o instituto atualizou para cima a expectativa de mortes pela doença no país: de 88 mil para mais de 125 mil óbitos previstos até agosto.

Em entrevista à Folha, Mokdad diz que a tendência de casos e mortes no país é de alta e que a situação pode ser ainda pior se governo e população não levarem a crise a sério e adotarem “lockdown” por duas semanas.

“As infeções e mortes vão crescer e, o mais assustador, haverá a sobrecarga total do sistema de saúde.” Caso cumpra o confinamento total por 14 dias, explica Mokdad, o Brasil conseguirá controlar a propagação do vírus e poderá fazer a reabertura das atividades econômicas de maneira estratégica –e até mais rapidamente.

Especialista em saúde pública, diz sofrer críticas por ter um modelo que varia bastante, mas, no caso da pandemia, prefere que suas projeções se ajustem com o tempo. “Se os brasileiros ficarem em casa por duas semanas, meus números vão baixar. E não porque fiz algo errado, mas porque os brasileiros fizeram algo certo.”

Qual a situação da pandemia no Brasil? Infelizmente o que vemos no Brasil é uma tendência de aumento de casos, que vai resultar no crescimento das mortes no país. Isso se dá por várias razões. Primeiro porque o país não entrou em “lockdown” cedo para impedir a propagação do vírus. O governo e a população brasileira não levaram isso a sério e não fizeram logo as coisas certas para impedir a transmissão do vírus.

Segundo, há muita disparidade no Brasil e a Covid-19 aumenta isso. Nesse caso, é preciso proteger não só os trabalhadores de saúde mas os trabalhadores de serviços essenciais, pessoas pobres que trabalham em funções que as obrigam a sair de casa. Elas não estão protegidas e estão morrendo. A terceira e mais importante preocupação é a sobrecarga do sistema de saúde. Se o país não agir, vai haver mais casos no inverno e não haverá tempo para se preparar. É perigoso e arriscado. Se você colocar tudo isso junto, o Brasil ainda vai enfrentar sérias dificuldades diante da Covid-19.

Em duas semanas, o IHME aumentou as projeções de morte no Brasil de 88 mil para mais de 125 mil até agosto. O que aconteceu? Adicionamos mais estados [de 11 para 19] na nossa projeção, isso é uma coisa. Mas estamos vendo no Brasil mais surtos e casos do que esperávamos. O país está testando mais e encontrando mais casos, mas, mesmo quando ajustamos para os testes, há uma tendência de alta.

No Brasil há também um erro de suposição quando falamos de circulação. Os dados [de mobilidade da população] são baseados no Facebook e no Google, ou seja, em smartphones, ou seja, em pessoas mais ricas. Percebemos que a circulação não parou nas favelas, por exemplo, em lugares onde pessoas mais pobres precisam sair para trabalhar. Se as pessoas se recusarem a levar isso a sério, infelizmente vamos ver mais casos e mortes.

Quais medidas precisam ser tomadas? Fechar escolas e universidades, impedir grandes aglomerações e encontros de pessoas, fechar os estabelecimentos não essenciais, igrejas, templos e locais religiosos. Nos locais essenciais, como mercados e farmácias, é preciso estabelecer regras, limitando o número de pessoas dentro, garantindo que elas se mantenham distantes umas das outras.

A última e mais importante coisa é pedir para quem precisa sair de casa—e sabemos que há quem precise— usar máscara e manter distância de 2 metros de outras pessoas. Para o sistema de saúde, é aumentar a capacidade de tratamento, de detectar cedo a chegada de um surto, fazendo rastreamento e o isolamento de casos, o que é um desafio para o Brasil, onde muitas vezes dez pessoas vivem em uma mesma casa.

Se o Brasil não cumprir essas medidas, qual é o pior cenário para o país? As infeções e mortes vão crescer e, a parte mais assustadora, haverá a sobrecarga total do sistema de saúde. Isso vai causar mais prejuízo à economia do que se fizer o isolamento por duas semanas. Se a população ficar em casa e levar isso a sério por duas semanas, registraremos diminuição da propagação do vírus e poderemos reabrir em fases. É preciso garantir que a retomada econômica seja feita de maneira estratégica, por setores.

É possível evitar o pico de 1.500 mortes diárias em julho e as 125 mil mortes até agosto se o país parar agora? Sim. O Brasil está em uma situação muito difícil e pode ser assim por muito tempo, mas ainda há esperança. Se o governo e a população pararem por duas semanas, podemos parar a circulação do vírus e reabrir o comércio. Se você olhar para estados americanos, como Nova York, depois que há o “lockdown”, as mortes e os casos diminuem. O “lockdown” salvou muitas vidas nos EUA. Fizemos as projeções para o Brasil de 125 mil mortes até 4 de agosto, mas não significa que vai acontecer, podemos parar isso. É preciso que cada brasileiro faça sua parte.

President Jair Bolsonaro opposes social distancing measures, compares Covid-19 to a little flu and champions a drug with no proven efficacy against the disease. How might that stance affect Brazil's situation? Here in the U.S. we unfortunately have a similar political situation. I am not a politician; I look at the numbers and give advice based on what I conclude from them. According to the data, Brazil needs coordinated action; otherwise, we will suffer many losses.

But we need to be clear about one thing: Covid-19 is not the flu. It causes more deaths than the flu, the flu does not cause strokes, and it does not attack the lungs the way Covid-19 does. There is no drug against Covid-19, period. There is no vaccine. Covid-19 and the flu cannot be compared. Doing so sends the wrong message. Telling the population that people can go out and we will see who catches the disease is unacceptable; it is a failure of leadership.

How do you earn the trust of governments and the public with projections that vary so much and with so many people working on data about this topic? There are many people making projections but, for the first time in the history of science, we all agree. The numbers may differ, but the most important message is the same: this is a lethal virus and we have to take it seriously. My numbers change because people change. If Brazilians stay home for two weeks, my numbers will come down, not because I did something wrong but because Brazilians did something right. We have learned that the model changes when new data appear.

Have you been accused of being alarmist or of producing fake news when your numbers change? “Accused” is too strong, but there are people who say my numbers are higher or lower than they should be, and I do not even respond to that, because it is not a scientific debate, it is a political one. In the scientific debate, everyone is on board with the same message.

Trump seems to have been convinced of the seriousness of the pandemic partly on the basis of your numbers. Is that right? Yes. In the U.S. and also in England, our numbers changed the stance of the head of government. Of course, over there the prime minister [Boris Johnson] caught Covid-19 himself.

What is it like to work with that in mind, with numbers that are so sensitive and so powerful? We are not sleeping much these days; it is a lot of work. It is very hard to say that 125,000 people will die in Brazil by August. That is not a number; those are families and friends. It is very hard.

Brazil coronavirus deaths could surpass 125,000 by August, U.S. study says (Reuters)

May 26, 2020 / 1:21 PM

Gravediggers work during a mass burial of people who passed away due to the coronavirus disease (COVID-19), at the Parque Taruma cemetery in Manaus, Brazil, May 26, 2020. Picture taken with a drone. REUTERS/Bruno Kelly

BRASILIA (Reuters) – As Brazil’s daily COVID-19 death rate climbs to the highest in the world, a University of Washington study is warning its total death toll could climb five-fold to 125,000 by early August, adding to fears it has become a new hot spot in the pandemic.

The forecast from the University of Washington’s Institute for Health Metrics and Evaluation (IHME), released as Brazil’s daily death toll climbed past that of the United States on Monday, came with a call for lockdowns that Brazil’s president has resisted.

“Brazil must follow the lead of Wuhan, China, as well as Italy, Spain, and New York by enforcing mandates and measures to gain control of a fast-moving epidemic and reduce transmission of the coronavirus,” wrote IHME Director Dr. Christopher Murray.

Without such measures, the institute’s model shows, Brazil’s daily death toll could keep climbing until mid-July, driving shortages of critical hospital resources in the country, he said in a statement accompanying the findings.

On Monday, Brazil’s coronavirus deaths reported in the last 24 hours were higher than fatalities in the United States for the first time, according to the health ministry. Brazil registered 807 deaths, compared with 620 in the United States.

The U.S. government on Monday moved up to midnight on Tuesday its enforcement of restrictions on travel to the United States from Brazil, as the South American country reported the highest death toll in the world for that day.

Washington’s ban applies to foreigners traveling to the United States if they had been in Brazil in the last two weeks. Two days earlier, Brazil overtook Russia as the world’s No. 2 coronavirus hot spot in number of confirmed cases, after the United States.

Murray said the IHME forecast captures the effects of social distancing mandates, mobility trends and testing capacity, so projections could shift along with policy changes.

The model will be updated regularly as new data is released on cases, hospitalizations, deaths, testing and mobility.

Reporting by Anthony Boadle; Editing by Brad Haynes and Steve Orlofsky

Modeling COVID-19 data must be done with extreme care (Science Daily)

Date: May 19, 2020

Source: American Institute of Physics

Summary: At the beginning of a new wave of an epidemic, extreme care should be used when extrapolating data to determine whether lockdowns are necessary, experts say.

As the infectious virus causing the COVID-19 disease began its devastating spread around the globe, an international team of scientists was alarmed by the lack of uniform approaches by various countries’ epidemiologists to respond to it.

Germany, for example, didn’t institute a full lockdown, unlike France and the U.K., and the decision in the U.S. by New York to go into a lockdown came only after the pandemic had reached an advanced stage. Data modeling to predict the numbers of likely infections varied widely by region, from very large to very small numbers, and revealed a high degree of uncertainty.

Davide Faranda, a scientist at the French National Centre for Scientific Research (CNRS), and colleagues in the U.K., Mexico, Denmark, and Japan decided to explore the origins of these uncertainties. This work is deeply personal to Faranda, whose grandfather died of COVID-19; Faranda has dedicated the work to him.

In the journal Chaos, from AIP Publishing, the group describes why modeling and extrapolating the evolution of COVID-19 outbreaks in near real time is an enormous scientific challenge that requires a deep understanding of the nonlinearities underlying the dynamics of epidemics.

Forecasting the behavior of a complex system, such as the evolution of epidemics, requires both a physical model for its evolution and a dataset of infections to initialize the model. To create a model, the team used data provided by Johns Hopkins University’s Center for Systems Science and Engineering, which is available online.

“Our physical model is based on assuming that the total population can be divided into four groups: those who are susceptible to catching the virus, those who have contracted the virus but don’t show any symptoms, those who are infected and, finally, those who recovered or died from the virus,” Faranda said.

To determine how people move from one group to another, it’s necessary to know the infection rate, incubation time and recovery time. Actual infection data can be used to extrapolate the behavior of the epidemic with statistical models.

“Because of the uncertainties in both the parameters involved in the models (infection rate, incubation period and recovery time) and the incompleteness of infection data in different countries, extrapolations can lead to an incredibly large range of uncertain results,” Faranda said. “For example, just assuming that the last data point in the infection counts is underestimated by 20% can lead to a change in the estimated total infections from a few thousand to a few million individuals.”

The group also showed that this uncertainty stems both from poor data quality and from the intrinsic nature of the dynamics, which is ultrasensitive to the parameters, especially during the initial growth phase. This means that extreme care is needed when extrapolating key quantities in order to decide whether to implement lockdown measures at the start of a new wave of the virus.

“The total final infection counts as well as the duration of the epidemic are sensitive to the data you put in,” he said.
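The amplification Faranda describes is easy to reproduce with a toy calculation. The sketch below is a minimal illustration under assumed numbers, not the model from the Chaos paper: it fits exponential growth to ten days of synthetic case counts and shows how a 20% undercount in just the most recent data point already changes a 60-day extrapolation by roughly a factor of two; with noisier data, uncertain parameters and longer horizons, the spread widens much further.

```python
# Toy illustration of sensitivity to the latest data point (assumed values,
# not the model or data of the Chaos paper).
import numpy as np

days = np.arange(10)
growth_rate = 0.2                          # assumed daily exponential growth rate
cases = 100 * np.exp(growth_rate * days)   # synthetic "reported" case counts

def extrapolate(counts, horizon=60):
    """Fit log-linear growth to the counts and project them 'horizon' days ahead."""
    slope, intercept = np.polyfit(np.arange(len(counts)), np.log(counts), 1)
    return np.exp(intercept + slope * horizon)

baseline = extrapolate(cases)

perturbed = cases.copy()
perturbed[-1] *= 0.8                       # the last count underestimates reality by 20%

print(f"day-60 projection, reported data:    {baseline:,.0f}")
print(f"day-60 projection, last point -20%:  {extrapolate(perturbed):,.0f}")
print(f"ratio between the two projections:   {baseline / extrapolate(perturbed):.2f}")
```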

The team’s model handles uncertainty in a natural way, so they plan to show how modeling of the post-confinement phase can be sensitive to the measures taken.

“Preliminary results show that implementing lockdown measures when infections are in a full exponential growth phase poses serious limitations for their success,” said Faranda.

Story Source:

Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

Journal Reference:

  1. Davide Faranda, Isaac Pérez Castillo, Oliver Hulme, Aglaé Jezequel, Jeroen S. W. Lamb, Yuzuru Sato, Erica L. Thompson. Asymptotic estimates of SARS-CoV-2 infection counts and their sensitivity to stochastic perturbation. Chaos: An Interdisciplinary Journal of Nonlinear Science, 2020; 30 (5): 051107 DOI: 10.1063/5.0008834

Opinion | Forty Years Later, Lessons for the Pandemic From Mount St. Helens (New York Times)

By Lawrence Roberts – May 17, 2020

The tensions we now face between science, politics and economics also arose before the country’s most destructive volcanic eruption.

Mr. Roberts is a former editor at ProPublica and The Washington Post.

Mount St. Helens erupted on May 18, 1980.
United Press International

When I met David A. Johnston, it was on a spring evening, about a month before he would be erased from existence by a gigantic cloud of volcanic ash boiling over him at 300 miles per hour. He was coming through the door of a makeshift command center in Vancouver, Wash., the closest city to the graceful snow-capped dome of Mount St. Helens, a volcano that had been dormant for 123 years. This was April 1980, and Mr. Johnston, a 30-year-old geologist, was one of the first scientists summoned to monitor new warning signs from the mountain — shallow earthquakes and periodic bursts of ash and steam.

As a young reporter I had talked my way into the command center. At first Mr. Johnston was wary; he wasn’t supposed to meet the press anymore. His supervisors had played down the chance that the smoking mountain was about to explode, and they had already reprimanded him for suggesting otherwise. But on this night he’d just been setting measuring equipment deep in the surrounding forest, and his runner-thin frame vibrated with excitement, his face flushed under his blond beard, and Mr. Johnston couldn’t help riffing on the likelihood of a cataclysmic event.

“My feeling is when it goes, it’s going to go just like that,” he told me, snapping his fingers. “Bang!” At best, he said, we’d have a couple of hours of warning.

Mr. Johnston was mostly right. Early on a Sunday morning several weeks later, the mountain did blow, in the most destructive eruption in U.S. history. But there was no warning. At his instrument outpost, on a ridge more than five miles from the summit, Mr. Johnston had only seconds to radio in a last message: “Vancouver! Vancouver! This is it!”

A photograph of David Johnston, who was killed when Mount St. Helens erupted.
Chris Sweda/Daily Southtown, via Associated Press

Monday, May 18, marks the 40th anniversary of the 1980 Mount St. Helens eruption, and as we now face our own struggle to gauge the uncertain risks presented by nature, to predict how bad things will get and how much and how long to protect ourselves, it may be useful to revisit the tension back then between science, politics and economics.

The drama played out on a much smaller stage — one region of one state, instead of the whole planet — but many of the same elements were present: Scientists provided a range of educated guesses, and public officials split on how to respond. Business owners and residents chafed at the restrictions put in place, many flouted them, and a few even threatened armed rebellion. In the end, the government mostly accepted the analyses of Mr. Johnston and his fellow geologists. As a result, while the eruption killed 57 people and flattened hundreds of square miles of dense Pacific Northwest forestland, the lives of hundreds, perhaps thousands, were spared.

At the first warning signs, state and federal officials moved to distance people from the mountain. They sought to block nonessential visitors from nearby Spirit Lake, ringed with scout camps and tourist lodges. Other than loggers, few people hung around the peak year-round, but the population surged in late spring and summer, when thousands hiked, camped and moved into vacation homes. Many regulars dismissed the risk. Slipping past roadblocks became a popular activity. Locals sold maps to sightseers and amateur photographers that showed how to take old logging roads up the mountain. The owner of a nearby general store shared a common opinion of the threat: “It’s just plain bull. I lived here 26 years, and nothing like this happened before.”

Like the probability of a pandemic, though, it was well-established that one of the dozen or so volcanoes in the 800-mile Cascade Range might soon turn active. Averaging two eruptions a century, they were overdue. A 1978 report by the U.S. Geological Survey, where Mr. Johnston worked, identified Mount St. Helens as most likely to blow next. Yet forecasting how big the event could be was a matter of art as well as science. Geologists could model only previous explosions and list the possible outcomes. (“That position was difficult for many to accept, because they believed we could and should make predictions,” a U.S.G.S. report said later.)

Some scientists suggested a much larger evacuation, but uncertainty, a hallmark of their discipline, can be difficult for those making real-time public policy. The guidelines from federal and state representatives camped out in Vancouver, and from Washington’s governor, Dixy Lee Ray, often seemed in conflict. Moreover, the Weyerhaeuser Company, which owned tens of thousands of acres of timber, opposed logging restrictions, even as some crews got nervous about working near the rumbling dome.

By mid-April, a bulge grew on the north flank, a clue that highly pressurized magma was trapped and expanding. If it burst, a landslide might bury Spirit Lake. The governor, a conservative Democrat who was a biologist by training, finally agreed to stronger measures. She ordered an inner “red zone” where only scientists and law enforcement personnel could enter, and a “blue zone” open to loggers and property owners with day passes. If the zones didn’t extend as far as many geologists hoped, they were certainly an improvement.

Then the mountain got deceptively quiet. The curve of seismic activity flattened and turned downward. Many grew complacent, and restless. On Saturday, May 17, people with property inside the red zone massed in cars and pickup trucks at the roadblock on State Highway 504. Hearing rumors that some carried rifles, the governor relented, allowing them through, with a police escort, to check on their homes and leave again. The state patrol chief, Robert Landon, told them, “We hope the good Lord will keep that mountain from giving us any trouble.” The property owners vowed to return the next day.

The next day was Sunday. At 8:32 a.m., a powerful quake shook loose the snow-covered north face of Mount St. Helens, releasing the superheated magma, which roared out of the mountain in a lateral blast faster than a bullet train, over the spot where Mr. Johnston stood, mowing down 230 square miles of trees, hurling trunks into the air like twigs. It rained down a suffocating storm of thick gray ash, “a burning sky-river wind of searing lava droplet hail,” as the poet Gary Snyder described it. Mudflows clogged the river valleys, setting off deadly floods. A column of ash soared 15 miles high and bloomed into a mushroom cloud 35 miles wide. Over two weeks, ash would circle the globe. Among the 57 dead were three aspiring geologists besides Mr. Johnston, as well as loggers, sightseers and photographers.

About a week later, the Forest Service took reporters up in a helicopter. I had seen the mountain from the air before the eruption. Now the sprawling green wilderness that appeared endless and permanent had disappeared in a blink. We flew for an hour over nothing but moonscape. The scientists had done their best, but nature flexed a power far more deadly than even they had imagined.

Lawrence Roberts, a former editor at ProPublica and The Washington Post, is the author of the forthcoming “Mayday 1971: A White House at War, a Revolt in the Streets, and the Untold History of America’s Biggest Mass Arrest.”

This Is the Future of the Pandemic (New York Times)

Covid-19 isn’t going away soon. Two recent studies mapped out the possible shapes of its trajectory.

Circles at Gare du Nord train station in Paris marked safe social distances on Wednesday. Credit: Ian Langsdon/EPA, via Shutterstock

By Siobhan Roberts – May 8, 2020

By now we know — contrary to false predictions — that the novel coronavirus will be with us for a rather long time.

“Exactly how long remains to be seen,” said Marc Lipsitch, an infectious disease epidemiologist at Harvard’s T.H. Chan School of Public Health. “It’s going to be a matter of managing it over months to a couple of years. It’s not a matter of getting past the peak, as some people seem to believe.”

A single round of social distancing — closing schools and workplaces, limiting the sizes of gatherings, lockdowns of varying intensities and durations — will not be sufficient in the long term.

In the interest of managing our expectations and governing ourselves accordingly, it might be helpful, for our pandemic state of mind, to envision this predicament — existentially, at least — as a soliton wave: a wave that just keeps rolling and rolling, carrying on under its own power for a great distance.

The Scottish engineer and naval architect John Scott Russell first spotted a soliton in 1834 as it traveled along the Union Canal. He followed on horseback and, as he wrote in his “Report on Waves,” overtook it rolling along at about eight miles an hour, at thirty feet long and a foot or so in height. “Its height gradually diminished, and after a chase of one or two miles I lost it in the windings of the channel.”

The pandemic wave, similarly, will be with us for the foreseeable future before it diminishes. But, depending on one’s geographic location and the policies in place, it will exhibit variegated dimensions and dynamics traveling through time and space.

“There is an analogy between weather forecasting and disease modeling,” Dr. Lipsitch said. Both, he noted, are simple mathematical descriptions of how a system works: drawing upon physics and chemistry in the case of meteorology; and on behavior, virology and epidemiology in the case of infectious-disease modeling. Of course, he said, “we can’t change the weather.” But we can change the course of the pandemic — with our behavior, by balancing and coordinating psychological, sociological, economic and political factors.

Dr. Lipsitch is a co-author of two recent analyses — one from the Center for Infectious Disease Research and Policy at the University of Minnesota, the other from the Chan School published in Science — that describe a variety of shapes the pandemic wave might take in the coming months.

The Minnesota study describes three possibilities:

Scenario No. 1 depicts an initial wave of cases — the current one — followed by a consistently bumpy ride of “peaks and valleys” that will gradually diminish over a year or two.

Scenario No. 2 supposes that the current wave will be followed by a larger “fall peak,” or perhaps a winter peak, with subsequent smaller waves thereafter, similar to what transpired during the 1918-1919 flu pandemic.

Scenario No. 3 shows an intense spring peak followed by a “slow burn” with less-pronounced ups and downs.

The authors conclude that whichever reality materializes (assuming ongoing mitigation measures, as we await a vaccine), “we must be prepared for at least another 18 to 24 months of significant Covid-19 activity, with hot spots popping up periodically in diverse geographic areas.”

In the Science paper, the Harvard team — infectious-disease epidemiologist Yonatan Grad, his postdoctoral fellow Stephen Kissler, Dr. Lipsitch, his doctoral student Christine Tedijanto and their colleague Edward Goldstein — took a closer look at various scenarios by simulating the transmission dynamics using the latest Covid-19 data and data from related viruses.

The authors conveyed the results in a series of graphs — composed by Dr. Kissler and Ms. Tedijanto — that project a similarly wavy future characterized by peaks and valleys.

One figure from the paper, reinterpreted below, depicts possible scenarios (the details would differ geographically) and shows the red trajectory of Covid-19 infections in response to “intermittent social distancing” regimes represented by the blue bands.

Social distancing is turned “on” when the number of Covid-19 cases reaches a certain prevalence in the population — for instance, 35 cases per 10,000, although the thresholds would be set locally, monitored with widespread testing. It is turned “off” when cases drop to a lower threshold, perhaps 5 cases per 10,000. Because critical cases that require hospitalization lag behind the general prevalence, this strategy aims to prevent the health care system from being overwhelmed.

The green graph represents the corresponding, if very gradual, increase in population immunity.

“The ‘herd immunity threshold’ in the model is 55 percent of the population, or the level of immunity that would be needed for the disease to stop spreading in the population without other measures,” Dr. Kissler said.
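The on/off logic in the figure can be sketched with a very simple compartmental model. The code below is a rough illustration under assumed rates, not the Harvard group's simulation: distancing switches on when prevalence crosses 35 infectious people per 10,000 and off when it falls below 5 per 10,000, the transmission rate drops while distancing is active, and immunity accumulates slowly across the resulting waves. Raising the on-threshold (for example to 70 per 10,000, as in the higher-capacity scenario discussed below) lengthens the breaks between distancing periods.

```python
# Toy SIR model with intermittent social distancing, loosely following the
# thresholds quoted in the article. All rates are illustrative assumptions,
# not parameters from the Science paper.

N = 10_000                              # population; infectious count == prevalence per 10,000
beta_open, beta_distancing = 0.30, 0.08 # assumed transmission rates without / with distancing
gamma = 0.10                            # assumed recovery rate (~10-day infectious period)
on_threshold, off_threshold = 35, 5     # infectious per 10,000

s, i, r = N - 10.0, 10.0, 0.0
distancing = False
days_distanced = 0

for day in range(730):                  # simulate two years
    if not distancing and i >= on_threshold:
        distancing = True
    elif distancing and i <= off_threshold:
        distancing = False
    days_distanced += distancing

    beta = beta_distancing if distancing else beta_open
    new_infections = beta * s * i / N
    new_recoveries = gamma * i
    s -= new_infections
    i += new_infections - new_recoveries
    r += new_recoveries

print(f"population immunity after two years: {r / N:.1%}")   # creeps up slowly
print(f"days spent under distancing:         {days_distanced}")
```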

Another iteration shows the effects of seasonality — a slower spread of the virus during warmer months. Theoretically, seasonal effects allow for larger intervals between periods of social distancing.

This year, however, the seasonal effects will likely be minimal, since a large proportion of the population will still be susceptible to the virus come summer. And there are other unknowns, since the underlying mechanisms of seasonality — such as temperature, humidity and school schedules — have been studied for some respiratory infections, like influenza, but not for coronaviruses. So, alas, we cannot depend on seasonality alone to stave off another outbreak over the coming summer months.

Yet another scenario takes into account not only seasonality but also a doubling of the critical-care capacity in hospitals. This, in turn, allows for social distancing to kick in at a higher threshold, say, at a prevalence of 70 cases per 10,000, and for even longer breaks between social distancing periods.

What is clear overall is that a one-time social distancing effort will not be sufficient to control the epidemic in the long term, and that it will take a long time to reach herd immunity.

“This is because when we are successful in doing social distancing — so that we don’t overwhelm the health care system — fewer people get the infection, which is exactly the goal,” said Ms. Tedijanto. “But if infection leads to immunity, successful social distancing also means that more people remain susceptible to the disease. As a result, once we lift the social distancing measures, the virus will quite possibly spread again as easily as it did before the lockdowns.”

So, lacking a vaccine, our pandemic state of mind may persist well into 2021 or 2022 — which surprised even the experts.

“We anticipated a prolonged period of social distancing would be necessary, but didn’t initially realize that it could be this long,” Dr. Kissler said.

Claudio Maierovitch Pessanha Henriques: The myth of the peak (Folha de S.Paulo)

Claudio Maierovitch Pessanha Henriques – May 6, 2020

Since the beginning of the epidemic caused by the new coronavirus (Covid-19), the big question has been: when will it end? The most varied projections of the famous disease curve, for individual countries and for the world, circulate frequently in the media and on social networks, some of them recent ones suggesting that new cases will stop appearing at the beginning of the second half of this year.

Such models start from the assumption that the disease has a history, a natural curve, which begins, rises, reaches a peak and starts to fall. Let us examine the reasoning behind this. Many acute transmissible diseases, when they reach a new population, spread rapidly, at a speed that depends on their so-called basic reproduction number, or R0 (“R zero”, which estimates how many people each carrier of an infectious agent transmits it to).

When a large number of people have fallen ill or been infected, even without symptoms, contacts between carriers and people who have not had the disease start to become rare. In a scenario in which those who survive the infection become immune to the agent, their share of the population grows and transmission becomes rarer and rarer. The curve that had been rising flattens out and begins to fall, and may even reach zero, the point at which the agent stops circulating.

In large populations it is very rare for a disease to be completely eliminated in this way, which is why its incidence rises again from time to time. When the number of people who were never infected, plus newborn babies and non-immune people arriving from elsewhere, becomes large enough, the curve climbs once more.

This, in simplified form, is how science understands the periodic occurrence of epidemics of acute infectious diseases. History provides countless examples, such as smallpox, measles, influenza, rubella, polio and mumps, among many others. Depending on the characteristics of the disease and of the society, these are cycles marked by suffering, sequelae and deaths. In such cases it really is possible to estimate how long an epidemic will last and, in some cases, even to predict the next ones.

Public health has a range of tools to intervene in many of these cases, suited to different transmission mechanisms, such as sanitation, hygiene measures, isolation, vector control, condom use, elimination of sources of contamination, vaccines and treatments capable of eliminating the microorganisms. Vaccination, the specific health measure considered most effective, simulates what happens naturally: it increases the number of immune people in the population until the disease stops circulating, without anyone having to fall ill.

In the case of Covid-19, estimates suggest that for the disease to stop circulating intensely, around 70% of the population will need to have been infected. This is called collective immunity (the unpleasant term “herd immunity” is also used). As for the current spread of the coronavirus Sars-CoV-2, the World Health Organization (WHO) calculates that by mid-April only 2% to 3% of the world's population will have been infected. Estimates for Brazil are slightly below that average.

In plain terms, for the disease to reach its peak naturally in the country and begin to decline, we would have to wait for 140 million people to become infected. The most conservative (lowest) fatality rate found in the literature on Covid-19 is 0.36%, roughly one twentieth of the rate implied by the official counts of cases and deaths. This means that by the time Brazil reached the peak we would be counting 500,000 deaths if the health system stayed within its limits, and a much larger number if it did not.
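The arithmetic behind those figures is simple to check directly. The snippet below is a back-of-the-envelope sketch using round numbers consistent with the article (a population of roughly 200 million, the 70% collective-immunity threshold and the 0.36% fatality rate cited above).

```python
# Back-of-the-envelope check of the article's figures (rounded assumptions).
population = 200_000_000          # Brazil, rounded to match the 140 million figure above
herd_immunity_fraction = 0.70     # share infected before the curve turns over naturally
fatality_rate = 0.0036            # most conservative rate cited in the article
# A 70% threshold corresponds to a basic reproduction number R0 of about 3.3,
# since the threshold equals 1 - 1/R0.

infected_at_peak = population * herd_immunity_fraction
expected_deaths = infected_at_peak * fatality_rate

print(f"infections needed to reach the peak: {infected_at_peak:,.0f}")  # ~140,000,000
print(f"deaths at a 0.36% fatality rate:     {expected_deaths:,.0f}")   # ~504,000
```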

Reaching the peak is synonymous with catastrophe. It is not an acceptable bet, above all when we see that hospital capacity has already been exhausted in several cities, such as Manaus, Rio de Janeiro and Fortaleza, with others heading the same way.

The only acceptable prospect is to avoid the peak, and the only way to do that is with rigorous physical distancing measures. The quota of contact between people should be reserved for essential activities, among them health care, security, and the supply chains for fuel, food, cleaning products, and health materials and equipment, plus cleaning, maintenance and a few other sectors. Some creativity may allow this range to be broadened a little, provided that public transport and public spaces remain empty enough for the minimum distance between people to be maintained.

Monitoring of case and death counts, which reflects transmission with a lag of two to three weeks, should be improved and used together with studies based on laboratory testing to guide how strict the isolation measures need to be.

If we manage to avoid the greater tragedy, we will live with a long period of restricted activity, more than a year, and we will have to learn to organize life and the economy in other ways, as well as going through a few periods of “lockdown”, each lasting about two weeks, whenever the curve points toward the peak again.

Today the situation is serious and tends to become critical. Brazil is the country with the highest transmission rate of the disease; it is time to stay home and, if going out is truly unavoidable, to make a mask an inseparable part of one's clothing and to follow rigorously all the recommended precautions.