Sensors everywhere. Infinite storage. Clouds of processors. Our ability to capture, warehouse, and understand massive amounts of data is changing science, medicine, business, and technology. As our collection of facts and figures grows, so will the opportunity to find answers to fundamental questions. Because in the era of big data, more isn’t just more. More is different.
Does big data have the answers? Maybe some, but not all, says Mark Graham
In 2008, Chris Anderson, then editor of Wired, wrote a provocative piece titled The End of Theory. Anderson was referring to the ways that computers, algorithms, and big data can potentially generate more insightful, useful, accurate, or true results than specialists or domain experts who traditionally craft carefully targeted hypotheses and research strategies.
This revolutionary notion has now entered not just the popular imagination, but also the research practices of corporations, states, journalists, and academics. The idea is that the data shadows and information trails of people, machines, commodities, and even nature can reveal secrets that we now have the power and prowess to uncover.
In other words, we no longer need to speculate and hypothesise; we simply need to let machines lead us to the patterns, trends, and relationships in social, economic, political, and environmental life.
It is quite likely that you yourself have been the unwitting subject of a big data experiment carried out by Google, Facebook and many other large Web platforms. Google, for instance, has been able to collect extraordinary insights into what specific colours, layouts, rankings, and designs make people more efficient searchers. They do this by slightly tweaking their results and website for a few million searches at a time and then examining the often subtle ways in which people react.
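The mechanics behind such an experiment can be sketched as a two-proportion comparison: did the tweaked layout lift click-through by more than chance would allow? All counts below are invented for illustration, not Google's data.

```python
import math

# Hypothetical A/B comparison: did the tweaked layout (B) beat the
# control (A) by more than chance? All counts are invented.
clicks_a, views_a = 5_120, 100_000   # control layout
clicks_b, views_b = 5_430, 100_000   # tweaked layout

p_a, p_b = clicks_a / views_a, clicks_b / views_b
p_pool = (clicks_a + clicks_b) / (views_a + views_b)

# Standard error of the difference under the pooled null hypothesis
se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
z = (p_b - p_a) / se

print(round(z, 2))   # |z| > 1.96: the lift is unlikely to be chance at the 5% level
```

Run at the scale of a few million searches, even tiny differences in behaviour clear this bar, which is what makes such constant tweaking informative.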
Most large retailers similarly analyse enormous quantities of data from their databases of sales (which are linked to you by credit card numbers and loyalty cards) in order to make uncanny predictions about your future behaviours. In a now famous case, the American retailer Target upset a Minneapolis man by knowing more about his teenage daughter's sex life than he did. Target predicted his daughter's pregnancy by monitoring her shopping patterns and comparing that information to an enormous database detailing billions of dollars of sales; the same approach lets the company make uncanny predictions about all of its shoppers.
More significantly, national intelligence agencies are mining vast quantities of non-public Internet data to look for weak signals that might indicate planned threats or attacks.
There can be no denying the significant power and potential of big data. And the huge resources being invested in both the public and private sectors to study it are a testament to this.
However, crucially important caveats are needed when using such datasets: caveats that, worryingly, seem to be frequently overlooked.
The raw informational material for big data projects is often derived from large user-generated or social media platforms (e.g. Twitter or Wikipedia). Yet, in all such cases we are necessarily relying only on information generated by an incredibly biased or skewed user base.
Gender, geography, race, income, and a range of other social and economic factors all play a role in how information is produced and reproduced. People from different places and different backgrounds tend to produce different sorts of information. And so we risk ignoring a lot of important nuance if relying on big data as a social/economic/political mirror.
We can of course account for such bias by segmenting our data. Take the case of using Twitter to gain insights into last summer's London riots. About a third of all UK Internet users have a Twitter profile; a subset of that group are the active tweeters who produce the bulk of content; and then a tiny subset of that group (about 1%) geocode their tweets (essential information if you want to know where your information is coming from).
Despite the fact that we have a database of tens of millions of data points, we are necessarily working with subsets of subsets of subsets. Big data no longer seems so big. Such data thus serves to amplify the information produced by a small minority (a point repeatedly made by UCL’s Muki Haklay), and skew, or even render invisible, ideas, trends, people, and patterns that aren’t mirrored or represented in the datasets that we work with.
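The shrinkage is easy to make concrete. Using the proportions quoted above plus two assumed figures (the size of the UK internet population and the share of profile-holders who actively tweet), a rough sketch:

```python
# Back-of-envelope arithmetic for the subsets-of-subsets problem.
# The Twitter and geocoding shares echo the proportions in the text;
# the population size and active share are assumptions for illustration.
uk_internet_users = 48_000_000   # assumed UK internet population
twitter_share = 1 / 3            # ~a third have a Twitter profile
active_share = 0.2               # assumed share of profiles that actively tweet
geocoded_share = 0.01            # ~1% geocode their tweets

geocoded = uk_internet_users * twitter_share * active_share * geocoded_share
print(f"{geocoded:,.0f}")        # roughly 32,000 people stand behind the 'big' dataset
```

Under these assumptions, a corpus of tens of millions of tweets with location data ultimately traces back to a population smaller than a mid-sized town.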
Big data is undoubtedly useful for addressing and overcoming many important issues faced by society. But we need to ensure that we aren't seduced by the promises of big data to render theory unnecessary.
We may one day get to the point where sufficient quantities of big data can be harvested to answer all of the social questions that most concern us. I doubt it though. There will always be digital divides; always be uneven data shadows; and always be biases in how information and technology are used and produced.
And so we shouldn’t forget the important role of specialists to contextualise and offer insights into what our data do, and maybe more importantly, don’t tell us.
“All models are wrong, but some are useful.”
So proclaimed statistician George Box 30 years ago, and he was right. But what choice did we have? Only models, from cosmological equations to theories of human behavior, seemed to be able to consistently, if imperfectly, explain the world around us. Until now. Today companies like Google, which have grown up in an era of massively abundant data, don’t have to settle for wrong models. Indeed, they don’t have to settle for models at all.
Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age.
The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies.
At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics. It calls for an entirely different approach, one that requires us to lose the tether of data as something that can be visualized in its totality. It forces us to view data mathematically first and establish a context for it later. For instance, Google conquered the advertising world with nothing more than applied mathematics. It didn’t pretend to know anything about the culture and conventions of advertising — it just assumed that better data, with better analytical tools, would win the day. And Google was right.
Google’s founding philosophy is that we don’t know why this page is better than that one: If the statistics of incoming links say it is, that’s good enough. No semantic or causal analysis is required. That’s why Google can translate languages without actually “knowing” them (given equal corpus data, Google can translate Klingon into Farsi as easily as it can translate French into German). And why it can match ads to content without any knowledge or assumptions about the ads or the content.
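That link-statistics philosophy can be illustrated with a toy version of PageRank, the algorithm behind Google's original ranking. The four-page graph below is invented; rank flows along incoming links with no semantic analysis at all.

```python
# Toy PageRank by power iteration on a four-page link graph.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}   # page -> pages it links to
n, d = 4, 0.85                                    # d is the usual damping factor

rank = [1.0 / n] * n
for _ in range(100):                              # iterate toward the fixed point
    new = [(1 - d) / n] * n
    for src, outs in links.items():
        for dst in outs:
            new[dst] += d * rank[src] / len(outs)
    rank = new

print([round(r, 3) for r in rank])   # page 2, with the most inlinks, ranks first
```

No page's content is ever inspected; the ranking is purely a property of the link statistics, which is exactly the point.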
Speaking at the O’Reilly Emerging Technology Conference this past March, Peter Norvig, Google’s research director, offered an update to George Box’s maxim: “All models are wrong, and increasingly you can succeed without them.”
This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
The big target here isn’t advertising, though. It’s science. The scientific method is built around testable hypotheses. These models, for the most part, are systems visualized in the minds of scientists. The models are then tested, and experiments confirm or falsify theoretical models of how the world works. This is the way science has worked for hundreds of years.
Scientists are trained to recognize that correlation is not causation, that no conclusions should be drawn simply on the basis of correlation between X and Y (it could just be a coincidence). Instead, you must understand the underlying mechanisms that connect the two. Once you have a model, you can connect the data sets with confidence. Data without a model is just noise.
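A minimal demonstration of why the warning matters: two synthetic series that share nothing but a common upward trend still correlate almost perfectly. The quantities and numbers are invented for illustration.

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient, computed from the definition."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(0)
years = range(50)
# Two causally unrelated quantities that both simply grow over time
ice_cream_sales = [100 + 3.0 * t + random.gauss(0, 5) for t in years]
shark_attacks   = [10 + 0.5 * t + random.gauss(0, 2) for t in years]

r = pearson(ice_cream_sales, shark_attacks)
print(round(r, 2))   # near-perfect correlation, zero causation
```

A model of the shared driver (here, the trend) explains the correlation instantly; the raw numbers alone never will.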
But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the “beautiful story” phase of a discipline starved of data) is that we don’t know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.
Now biology is heading in the same direction. The models we were taught in school about “dominant” and “recessive” genes steering a strictly Mendelian process have turned out to be an even greater simplification of reality than Newton’s laws. The discovery of gene-protein interactions and other aspects of epigenetics has challenged the view of DNA as destiny and even introduced evidence that environment can influence inheritable traits, something once considered a genetic impossibility.
In short, the more we learn about biology, the further we find ourselves from a model that can explain it.
There is now a better way. Petabytes allow us to say: “Correlation is enough.” We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.
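A toy version of "let the algorithm find the patterns": k-means clustering recovers structure from synthetic data with no hypothesis about what the groups are. The data and parameters are invented for illustration.

```python
import random

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest center, recompute, repeat."""
    # Deterministic init: spread the initial centers across the point list
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centers[c][0]) ** 2
                                        + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        # Recompute each center as its cluster mean (empty clusters dropped)
        centers = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                   for cl in clusters if cl]
    return centers, clusters

random.seed(0)
# Two synthetic populations the algorithm knows nothing about in advance
blob_a = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(100)]
blob_b = [(random.gauss(5, 0.5), random.gauss(5, 0.5)) for _ in range(100)]

centers, clusters = kmeans(blob_a + blob_b, k=2)
print(sorted(len(cl) for cl in clusters))   # the two hidden groups re-emerge
```

The algorithm was never told there were two populations, what they mean, or why they differ; the grouping falls out of the geometry of the data alone.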
The best practical example of this is the shotgun gene sequencing by J. Craig Venter. Enabled by high-speed sequencers and supercomputers that statistically analyze the data they produce, Venter went from sequencing individual organisms to sequencing entire ecosystems. In 2003, he started sequencing much of the ocean, retracing the voyage of Captain Cook. And in 2005 he started sequencing the air. In the process, he discovered thousands of previously unknown species of bacteria and other life-forms.
If the words “discover a new species” call to mind Darwin and drawings of finches, you may be stuck in the old way of doing science. Venter can tell you almost nothing about the species he found. He doesn’t know what they look like, how they live, or much of anything else about their morphology. He doesn’t even have their entire genome. All he has is a statistical blip — a unique sequence that, being unlike any other sequence in the database, must represent a new species.
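The "statistical blip" logic can be sketched as a similarity screen: if a fragment's k-mer overlap with everything in a reference database falls below a threshold, flag it as putatively new. The sequences and the threshold below are invented for illustration, not Venter's actual pipeline.

```python
def kmers(seq, k=4):
    """All overlapping substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard overlap of k-mer sets: a crude stand-in for sequence alignment."""
    ka, kb = kmers(a, k), kmers(b, k)
    return len(ka & kb) / len(ka | kb)

# Invented reference 'database' and query fragment
database = {
    "known_sp_1": "ATGGCGTACGTTAGCATCGATCGTAGCTAGCTAGGCT",
    "known_sp_2": "ATGGCGTACGTTAGCATCGATCGAAGCTAGCTAGGCT",
}
query = "TTTTACACGGGCCCTTAAGGCCAATTGGCCTTAAGGA"

best = max(similarity(query, ref) for ref in database.values())
is_novel = best < 0.3   # invented threshold: too unlike anything known
print(round(best, 3), is_novel)
```

Nothing about the organism is learned in the process; "new species" here means only "a sequence statistically unlike every sequence we have seen".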
This sequence may correlate with other sequences that resemble those of species we do know more about. In that case, Venter can make some guesses about the animals — that they convert sunlight into energy in a particular way, or that they descended from a common ancestor. But besides that, he has no better model of this species than Google has of your MySpace page. It’s just data. By analyzing it with Google-quality computing resources, though, Venter has advanced biology more than anyone else of his generation.
This kind of thinking is poised to go mainstream. In February, the National Science Foundation announced the Cluster Exploratory, a program that funds research designed to run on a large-scale distributed computing platform developed by Google and IBM in conjunction with six pilot universities. The cluster will consist of 1,600 processors, several terabytes of memory, and hundreds of terabytes of storage, along with the software, including IBM's Tivoli and open source versions of Google File System and MapReduce. Early CluE projects will include simulations of the brain and the nervous system and other biological research that lies somewhere between wetware and software.
Learning to use a “computer” of this scale may be challenging. But the opportunity is great: The new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world. Correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all.
There’s no reason to cling to our old ways. It’s time to ask: What can science learn from Google?
Until recently, the field of plant breeding looked a lot like it did in centuries past. A breeder might examine, for example, which tomato plants were most resistant to drought and then cross the most promising plants to produce the most drought-resistant offspring. This process would be repeated, plant generation after generation, until, over the course of roughly seven years, the breeder arrived at what seemed the optimal variety.
Now, with the global population expected to swell to nearly 10 billion by 2050 (1) and climate change shifting growing conditions (2), crop breeder and geneticist Steven Tanksley doesn’t think plant breeders have that kind of time. “We have to double the productivity per acre of our major crops if we’re going to stay on par with the world’s needs,” says Tanksley, a professor emeritus at Cornell University in Ithaca, NY.
To speed up the process, Tanksley and others are turning to artificial intelligence (AI). Using computer science techniques, breeders can rapidly assess which plants grow the fastest in a particular climate, which genes help plants thrive there, and which plants, when crossed, produce an optimum combination of genes for a given location, opting for traits that boost yield and stave off the effects of a changing climate. Large seed companies in particular have been using components of AI for more than a decade. With computing power rapidly advancing, the techniques are now poised to accelerate breeding on a broader scale.
AI is not, however, a panacea. Crop breeders still grapple with tradeoffs such as higher yield versus marketable appearance. And even the most sophisticated AI cannot guarantee the success of a new variety. But as AI becomes integrated into agriculture, some crop researchers envisage an agricultural revolution with computer science at the helm.
An Art and a Science
During the “green revolution” of the 1960s, researchers developed new chemical pesticides and fertilizers along with high-yielding crop varieties that dramatically increased agricultural output (3). But the reliance on chemicals came with the heavy cost of environmental degradation (4). “If we’re going to do this sustainably,” says Tanksley, “genetics is going to carry the bulk of the load.”
Plant breeders lean not only on genetics but also on mathematics. As the genomics revolution unfolded in the early 2000s, plant breeders found themselves inundated with genomic data that traditional statistical techniques couldn’t wrangle (5). Plant breeding “wasn’t geared toward dealing with large amounts of data and making precise decisions,” says Tanksley.
In 1997, Tanksley began chairing a committee at Cornell that aimed to incorporate data-driven research into the life sciences. There, he encountered an engineering approach called operations research that translates data into decisions. In 2006, Tanksley cofounded the Ithaca, NY-based company Nature Source Improved Plants on the principle that this engineering tool could make breeding decisions more efficient. “What we’ve been doing almost 15 years now,” says Tanksley, “is redoing how breeding is approached.”
A Manufacturing Process
Such approaches try to tackle complex scenarios. Suppose, for example, a wheat breeder has 200 genetically distinct lines. The breeder must decide which lines to breed together to optimize yield, disease resistance, protein content, and other traits. The breeder may know which genes confer which traits, but it’s difficult to decipher which lines to cross in what order to achieve the optimum gene combination. The number of possible combinations, says Tanksley, “is more than the stars in the universe.”
An operations research approach enables a researcher to solve this puzzle by defining the primary objective and then using optimization algorithms to predict the quickest path to that objective given the relevant constraints. Auto manufacturers, for example, optimize production given the expense of employees, the cost of auto parts, and fluctuating global currencies. Tanksley’s team optimizes yield while selecting for traits such as resistance to a changing climate. “We’ve seen more erratic climate from year to year, which means you have to have crops that are more robust to different kinds of changes,” he says.
For each plant line included in a pool of possible crosses, Tanksley inputs DNA sequence data, phenotypic data on traits like drought tolerance, disease resistance, and yield, as well as environmental data for the region where the plant line was originally developed. The algorithm projects which genes are associated with which traits under which environmental conditions and then determines the optimal combination of genes for a specific breeding goal, such as drought tolerance in a particular growing region, while accounting for genes that help boost yield. The algorithm also determines which plant lines to cross together in which order to achieve the optimal combination of genes in the fewest generations.
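The flavor of that optimization can be sketched in miniature: a brute-force search over possible crosses to maximize coverage of a breeding goal. Real operations-research solvers handle astronomically larger spaces with genuine optimization algorithms; the lines, alleles, and goal below are invented for illustration.

```python
from itertools import combinations

# Each line is modeled as a set of desirable alleles; a cross is scored by
# how many goal alleles the pair covers. All names and sets are invented.
lines = {
    "L1": {"drought_tol", "high_yield"},
    "L2": {"ringspot_res", "rapid_growth"},
    "L3": {"drought_tol", "ringspot_res"},
    "L4": {"high_yield"},
}
goal = {"drought_tol", "high_yield", "ringspot_res", "rapid_growth"}

def score(cross):
    a, b = cross
    return len((lines[a] | lines[b]) & goal)   # goal alleles covered by the cross

best = max(combinations(lines, 2), key=score)
print(best, score(best))   # ('L1', 'L2') covers all four goal alleles
```

With 200 real lines, multi-generation cross plans, and genotype-by-environment effects, the search space explodes, which is precisely why the problem is handed to optimization algorithms rather than enumeration.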
Nature Source Improved Plants conducts, for example, a papaya program in southeastern Mexico where the once predictable monsoon season has become erratic. “We are selecting for varieties that can produce under those unknown circumstances,” says Tanksley. But the new papaya must also stand up to ringspot, a virus that nearly wiped papaya from Hawaii altogether before another Cornell breeder developed a resistant transgenic variety (6). Tanksley’s papaya isn’t as disease resistant. But by plugging “rapid growth rate” into their operations research approach, the team bred papaya trees that produce copious fruit within a year, before the virus accumulates in the plant.
“Plant breeders need operations research to help them make better decisions,” says William Beavis, a plant geneticist and computational biologist at Iowa State in Ames, who also develops operations research strategies for plant breeding. To feed the world in rapidly changing environments, researchers need to shorten the process of developing a new cultivar to three years, Beavis adds.
The big seed companies have investigated the use of operations research since around 2010, with Syngenta, headquartered in Basel, Switzerland, leading the pack, says Beavis, who spent over a decade as a statistical geneticist at Pioneer Hi-Bred in Johnston, IA, a large seed company now owned by Corteva, which is headquartered in Wilmington, DE. "All of the soybean varieties that have come on the market within the last couple of years from Syngenta came out of a system that had been redesigned using operations research approaches," he says. But large seed companies primarily focus on grains key to animal feed such as corn, wheat, and soy. To meet growing food demands, Beavis believes that the smaller seed companies that develop vegetable crops that people actually eat must also embrace operations research. "That's where operations research is going to have the biggest impact," he says, "local breeding companies that are producing for regional environments, not for broad adaptation."
In collaboration with Iowa State colleague and engineer Lizhi Wang and others, Beavis is developing operations research-based algorithms to, for example, help seed companies choose whether to breed one variety that can survive in a range of different future growing conditions or a number of varieties, each tailored to specific environments. Two large seed companies, Corteva and Syngenta, and Kromite, a Lambertville, NJ-based consulting company, are partners on the project. The results will be made publicly available so that all seed companies can learn from their approach.
Drones and Adaptations
Useful farming AI requires good data, and plenty of it. To collect sufficient inputs, some researchers take to the skies. Crop researcher Achim Walter of the Institute of Agricultural Sciences at ETH Zürich in Switzerland and his team are developing techniques to capture aerial crop images. Every other day for several years, they have deployed image-capturing sensors over a wheat field containing hundreds of genetic lines. They fly their sensors on drones or on cables suspended above the crops or incorporate them into handheld devices that a researcher can use from an elevated platform (7).
Meanwhile, they’re developing imaging software that quantifies growth rate captured by these images (8). Using these data, they build models that predict how quickly different genetic lines grow under different weather conditions. If they find, for example, that a subset of wheat lines grew well despite a dry spell, then they can zero in on the genes those lines have in common and incorporate them into new drought-resistant varieties.
Research geneticist Edward Buckler at the US Department of Agriculture and his team are using machine learning to identify climate adaptations in 1,000 species in a large grouping of grasses spread across the globe. The grasses include food and bioenergy crops such as maize, sorghum, and sugar cane. Buckler says that when people rank the most photosynthetically efficient and water-efficient species, this group comes out on top. Still, he and collaborators, including plant scientist Elizabeth Kellogg of the Donald Danforth Plant Science Center in St. Louis, MO, and computational biologist Adam Siepel of Cold Spring Harbor Laboratory in NY, want to uncover genes that could make crops in this group even more efficient for food production in current and future environments. The team is first studying a select number of model species to determine which genes are expressed under a range of different environmental conditions. They're still probing just how far this predictive power can go.
Such approaches could be scaled up—massively. To probe the genetic underpinnings of climate adaptation for crop species worldwide, Daniel Jacobson, the chief researcher for computational systems biology at Oak Ridge National Laboratory in TN, has amassed "climatype" data for every square kilometer of land on Earth. Using the Summit supercomputer, he and his team then compared each square kilometer to every other square kilometer to identify similar environments (9). The result can be viewed as a network of GPS points connected by lines that show the degree of environmental similarity between points.
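In miniature, the climatype comparison looks like a pairwise distance computation over environmental feature vectors, with an edge drawn wherever two cells are similar enough. The cells, features, values, and threshold below are invented for illustration.

```python
import math

# Each grid cell carries a small environmental feature vector; cells whose
# vectors lie close together get linked. All values here are invented.
cells = {
    "desert_a":   (45.0, 50.0),     # (temperature proxy, rainfall proxy)
    "desert_b":   (44.0, 60.0),
    "rainforest": (27.0, 2500.0),
}

def env_distance(u, v):
    """Euclidean distance between two cells' environmental vectors."""
    return math.dist(cells[u], cells[v])

pairs = [(a, b) for a in cells for b in cells if a < b]
edges = [(a, b) for a, b in pairs if env_distance(a, b) < 100]   # invented threshold
print(edges)   # only the two desert cells are environmentally similar
```

The real computation does this for every square kilometer against every other, which is why it takes a supercomputer; the output network is conceptually the same edge list.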
In collaboration with the US Department of Energy’s Center for Bioenergy Innovation, the team combines this climatype data with GPS coordinates associated with individual crop genotypes to project which genes and genetic interactions are associated with specific climate conditions. Right now, they’re focused on bioenergy and feedstocks, but they’re poised to explore a wide range of food crops as well. The results will be published so that other researchers can conduct similar analyses.
The Next Agricultural Revolution
Despite these advances, the transition to AI can be unnerving. Operations research can project an ideal combination of genes, but those genes may interact in unpredictable ways. Tanksley’s company hedges its bets by engineering 10 varieties for a given project in hopes that at least one will succeed.
On the other hand, such a directed approach could miss happy accidents, says Molly Jahn, a geneticist and plant breeder at the University of Wisconsin–Madison. “For me, breeding is much more like art. I need to see the variation and I don’t prejudge it,” she says. “I know what I’m after, but nature throws me curveballs all the time, and I probably can’t count the varieties that came from curveballs.”
There are also inherent tradeoffs that no algorithm can overcome. Consumers may prefer tomatoes with a leafy crown that stays green longer. But the price a breeder pays for that green calyx is one percent of the yield, says Tanksley.
Image recognition technology comes with its own host of challenges, says Walter. “To optimize algorithms to an extent that makes it possible to detect a certain trait, you have to train the algorithm thousands of times.” In practice, that means snapping thousands of crop images in a range of light conditions. Then there’s the ground-truthing. To know whether the models work, Walter and others must measure the trait they’re after by hand. Keen to know whether the model accurately captures the number of kernels on an ear of corn? You’d have to count the kernels yourself.
Despite these hurdles, Walter believes that computer science has brought us to the brink of a new agricultural revolution. In a 2017 PNAS Opinion piece, Walter and colleagues described emerging “smart farming” technologies—from autonomous weeding vehicles to moisture sensors in the soil (10). The authors worried, though, that only big industrial farms can afford these solutions. To make agriculture more sustainable, smaller farms in developing countries must have access as well.
Fortunately, “smart breeding” advances may have wider reach. Once image recognition technology becomes more developed for crops, which Walter expects will happen within the next 10 years, deploying it may be relatively inexpensive. Breeders could operate their own drones and obtain more precise ratings of traits like time to flowering or number of fruits in shorter time, says Walter. “The computing power that you need once you have established the algorithms is not very high.”
The genomic data so vital to AI-led breeding programs is also becoming more accessible. “We’re really at this point where genomics is cheap enough that you can apply these technologies to hundreds of species, maybe thousands,” says Buckler.
Plant breeding has “entered the engineered phase,” adds Tanksley. And with little time to spare. “The environment is changing,” he says. “You have to have a faster breeding process to respond to that.”
3. P. L. Pingali, Green revolution: Impacts, limits, and the path ahead. Proc. Natl. Acad. Sci. U.S.A. 109, 12302–12308 (2012).
4. D. Tilman, The greening of the green revolution. Nature 396, 211–212 (1998).
5. G. P. Ramstein, S. E. Jensen, E. S. Buckler, Breaking the curse of dimensionality to identify causal variants in Breeding 4. Theor. Appl. Genet. 132, 559–567 (2019).
6. D. Gonsalves, Control of papaya ringspot virus in papaya: A case study. Annu. Rev. Phytopathol. 36, 415–437 (1998).
7. N. Kirchgessner et al., The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 44, 154–168 (2016).
8. K. Yu, N. Kirchgessner, C. Grieder, A. Walter, A. Hund, An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping. Plant Methods 13, 15 (2017).
9. J. Streich et al., Can exascale computing and explainable artificial intelligence applied to plant biology deliver on the United Nations sustainable development goals? Curr. Opin. Biotechnol. 61, 217–225 (2020).
10. A. Walter, R. Finger, R. Huber, N. Buchmann, Opinion: Smart farming is key to developing sustainable agriculture. Proc. Natl. Acad. Sci. U.S.A. 114, 6148–6150 (2017).
Once viewed with suspicion by the scientific community, methods of artificially intervening in the environment to slow the devastating effects of global warming are now being considered as last-resort options (since initiatives to reduce gas emissions depend directly on collective action and take decades to produce any benefit). According to some researchers in the field, who have been attracting investment and considerable attention, we may not have that much time.
Part of a field also known as solar geoengineering, most of these methods rely on the controlled release of particles into the atmosphere to block some of the energy our planet receives and redirect it back into space, producing a cooling effect similar to that generated by volcanic eruptions.
Although such measures would do nothing about pollution, for example, scientists argue that, in the face of increasingly violent storms, fire tornadoes, floods, and other natural disasters, they would be worthwhile stopgaps until more effective solutions are developed.
Michael Gerrard, director of the Sabin Center for Climate Change Law at Columbia Law School and editor of a book on the technology and its legal implications, summed up the situation in an interview with The New York Times: "We are facing an existential threat, so we need to examine all the options."
"I like to compare geoengineering to chemotherapy for the planet: if all else is failing, you have to try it," he argued.
Natural disasters brought on by global warming make such interventions urgent, researchers say. Source: Unsplash
Double Standards
Among the most prominent efforts is one by a non-governmental organization called SilverLining, which has awarded US$3 million to several universities and other institutions to pursue answers to practical questions. One example is finding the ideal altitude at which to release aerosols, and how to inject the right amount, while checking the effects on the global food supply chain.
Chris Sacca, cofounder of Lowercarbon Capital, an investment group that is one of SilverLining's funders, struck an alarmist note: "Decarbonization is necessary, but it will take 20 years or more to happen. If we don't explore climate interventions like solar reflection now, we will condemn countless lives, species, and ecosystems to the heat."
Another recipient of substantial funding is the National Oceanic and Atmospheric Administration, which received US$4 million from the US Congress precisely to develop such technologies, as well as to monitor the covert use of these methods by other countries.
Douglas MacMartin, a researcher in mechanical and aerospace engineering at Cornell University, stated that "humanity's power to cool things down is certain; what is not clear is what comes next."
While the planet can be cooled artificially, no one knows what would come next. Source: Unsplash
Is there a way?
To clarify the possible consequences of interventions on this scale, MacMartin will model the specific climate effects of injecting aerosols into the atmosphere at different altitudes and over different parts of the globe. "Depending on where you put [the substance], you will have different effects on the monsoons in Asia and on Arctic sea ice," he noted.
The National Center for Atmospheric Research in Boulder, Colorado, also funded by SilverLining, believes it has the ideal system for this, one considered the most sophisticated in the world. With it, hundreds of simulations will be run, letting specialists search for what they call the sweet spot: the amount of artificial cooling that can reduce extreme weather events without causing broader changes in regional precipitation patterns or similar impacts.
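The "sweet spot" search the specialists describe is, in effect, an optimization over many simulation runs: sweep a control parameter, score each run, keep the best trade-off. The sketch below is purely illustrative; the two trade-off functions are invented stand-ins, not NCAR model output.

```python
# Toy parameter sweep for a geoengineering "sweet spot" (illustrative only:
# the benefit and side-effect curves below are hypothetical stand-ins).

def extreme_event_reduction(cooling):
    # Benefit of cooling, assumed to saturate as cooling grows.
    return cooling / (1.0 + cooling)

def precipitation_disruption(cooling):
    # Side effects, assumed to grow faster than linearly.
    return 0.1 * cooling ** 2

# Sweep cooling levels 0.0 .. 3.0 and keep the best net score.
best = max(
    (c / 10 for c in range(0, 31)),
    key=lambda c: extreme_event_reduction(c) - precipitation_disruption(c),
)
print(best)  # the cooling level with the best benefit/side-effect trade-off
```

A real study would replace the two toy curves with hundreds of full climate-model runs; the structure of the search, however, is the same.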
"Is there a way, at least in our model world, to see whether we can achieve one without triggering too much of the other?" asked Jean-François Lamarque, director of the institution's Climate and Global Dynamics laboratory. There is no answer to that question yet, but sustainable approaches are being examined by Australian researchers, who would spray salt water to make clouds more reflective, with promising test results so far.
If so, perhaps the losses of reef corals we have been witnessing will finally have an end date. As for the rest, well, only time will tell.
Artificial intelligence (AI) algorithms act as curators of information, personalizing, for example, the responses of search platforms such as Google and the selection of what appears in each Facebook user's news feed. The activist Eli Pariser (The Filter Bubble, 2011) acknowledges the usefulness of relevance systems in delivering personalized content, but warns of the negative effects of the "bubbles" they form by reducing exposure to divergent opinions. For Cass Sunstein (#Republic, 2017), these systems are responsible for increasing cultural and political polarization, putting democracy at risk. There are many criticisms of these systems, some fair, some less so; the fact is that personalization, curation, clustering and persuasion mechanisms are nothing new. What deserves investigation is what has changed with AI.
The personalization of discourse, for example, goes back to Aristotle. The art of knowing the listener and adapting the speech to his profile, not to convince him rationally but to win him over through the "heart," is the subject of the Rhetoric. Of its three books, Book II is devoted to the emotional plane, listing the emotions a persuasive speech should engage: anger, calm, friendship, enmity, fear, confidence, shame, shamelessness, kindness, pity, indignation, envy and emulation. For the philosopher, everyone practices rhetoric in some way when sustaining their arguments. This work laid the foundations of Western rhetoric, which, through its mechanisms of persuasion, seeks to influence the interlocutor, be they user, consumer, customer or voter.
Each economic model has its own persuasion mechanisms, which go beyond commercial motivations to produce cultural and behavioral impacts. In the Industrial Economy, characterized by mass production and mass consumption of goods and services, advertising was the dominant means of influencing consumer decisions, with consumers initially treated as a "mass" of indistinguishable individuals. The advent of digital technologies made segmented communication possible, based on similar characteristics, profiles and preferences, but still far from the hypersegmentation enabled by AI technologies.
AI-driven hypersegmentation is based on mining large data sets (Big Data) with sophisticated analysis and prediction techniques, particularly the statistical models of neural networks/deep learning. These models extract information about users and/or consumers from the data and make predictions with a high degree of accuracy: desires, behaviors, interests, search patterns, where people go, as well as their ability to pay and even their state of health. AI algorithms turn the immensity of data generated by online activity into useful information.
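Mechanically, segmentation of this kind amounts to grouping users by the similarity of their behavioral features. A deliberately tiny sketch follows; the users, features and hand-picked segment centroids are all fictitious, and production systems learn segments from millions of signals rather than two:

```python
# Minimal audience-segmentation sketch: assign each user to the nearest
# segment centroid in a small feature space (illustrative data only).
import math
from collections import defaultdict

users = {
    "u1": (0.9, 0.1),  # (interest in sports, interest in cooking)
    "u2": (0.8, 0.2),
    "u3": (0.1, 0.9),
    "u4": (0.2, 0.8),
}
centroids = {"sports_fans": (1.0, 0.0), "home_cooks": (0.0, 1.0)}

def nearest_segment(features):
    # Pick the segment whose centroid is closest in Euclidean distance.
    return min(centroids, key=lambda seg: math.dist(features, centroids[seg]))

segments = defaultdict(list)
for user, features in users.items():
    segments[nearest_segment(features)].append(user)

print(dict(segments))  # each user grouped with the closest segment
```

Real hypersegmentation replaces the two hand-crafted features with learned representations over thousands of behavioral signals, but the grouping-by-similarity core is the same.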
In the view of Shoshana Zuboff (The Age of Surveillance Capitalism, 2019), the greatest threat lies not in the data we voluntarily produce in our digital interactions ("consented data") but in the "residual data" over which users of online platforms have no control. Until 2006, residual data was simply discarded; with the growing sophistication of AI predictive models, it became valuable: typing speed, grammatical errors, the format of texts, preferred colors and countless other details of user behavior are recorded and fed into vast databases, generating accurate projections of current and future human behavior. Another point Zuboff stresses is that technology platforms generally capture more data than their business models require, that is, more than is needed to improve products and services, and use the surplus to predict the behavior of specific groups ("behavioral surplus").
These persuasion processes operate at invisible levels, without the knowledge and/or consent of users, who are unaware of the reach and power of AI algorithms' predictions; at a more advanced level, these predictions cover personality, emotions, and sexual and political orientation, a set of information the user, in principle, never intended to reveal. Photos posted on social networks, for example, generate so-called "prediction signals," such as facial muscles and facial symmetry, information used to train AI image-recognition algorithms.
The current scale of data generation, storage and mining, combined with highly accurate personalization models, is one of the key elements in the changing nature of today's persuasion mechanisms. Comparing traditional models with AI-algorithm models shows the extent of that change:
1) from messages built on superficial, limited knowledge of the target audience, derived from broad category traits, to messages built on deep, detailed knowledge of the audience through hypersegmentation and personalization;
2) from correlations between variables chosen by the system's developer to correlations determined automatically from the data;
3) from limited means of linking offline and online behavior to the ability to capture and store offline behavioral data and merge it with online data into a single database that is more complete, more diverse and more precise;
4) from visible persuasion mechanisms (media advertising) and relatively visible ones (internet advertising) to invisible ones;
5) from low to high accuracy;
6) from limited to precise instruments for measuring and verifying results;
7) from predictive capacity limited to future trends to the ability to predict future scenarios and when they will occur, with average accuracy of around 80-90%; and
8) from a limited ability to distort image and voice to an enormous one: deepfakes.
As always, it is up to society to find a balance between the benefits and the threats of AI: in this case, between protecting fundamental human rights and fostering innovation and technological progress, and between information curation and the manipulation of consumption, of access to information, and of democratic processes.
*Dora Kaufman is a professor at TIDD PUC-SP, holds post-doctorates from COPPE-UFRJ and TIDD PUC-SP, and a doctorate from ECA-USP with a research period at the Université Paris-Sorbonne (Paris IV). She is the author of the books "O Despertar de Gulliver: os desafios das empresas nas redes digitais" and "A inteligência artificial irá suplantar a inteligência humana?", and a guest professor at Fundação Dom Cabral.
Acetaminophen, also known as paracetamol and sold widely under the brand names Tylenol and Panadol, also increases risk-taking, according to a new study that measured changes in people’s behaviour when under the influence of the common over-the-counter medication.
“Acetaminophen seems to make people feel less negative emotion when they consider risky activities – they just don’t feel as scared,” says neuroscientist Baldwin Way from The Ohio State University.
“With nearly 25 percent of the population in the US taking acetaminophen each week, reduced risk perceptions and increased risk-taking could have important effects on society.”
In a similar way, the new research suggests people’s affective ability to perceive and evaluate risks can be impaired when they take acetaminophen. While the effects might be slight, they’re definitely worth noting, given acetaminophen is the most common drug ingredient in America, found in over 600 different kinds of over-the-counter and prescription medicines.
In a series of experiments involving over 500 university students as participants, Way and his team measured how a single 1,000 mg dose of acetaminophen (the recommended maximum adult single dosage) randomly assigned to participants affected their risk-taking behaviour, compared against placebos randomly given to a control group.
In each of the experiments, participants had to pump up an uninflated balloon on a computer screen, with each single pump earning imaginary money. Their instructions were to earn as much imaginary money as possible by pumping the balloon as much as possible, but to make sure not to pop the balloon, in which case they would lose the money.
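The task is a version of the classic Balloon Analogue Risk Task, and its incentive structure is easy to see in a small simulation. Everything below is illustrative: the pop probability, payout per pump and pump counts are hypothetical stand-ins, not the study's actual parameters.

```python
# Illustrative simulation of the balloon-pumping risk task
# (hypothetical parameters, not the study's actual task settings).
import random

def balloon_trial(planned_pumps, pop_probability=0.05, cents_per_pump=5, rng=random):
    """Pump up to `planned_pumps` times; each pump risks popping the balloon.

    Returns imaginary earnings in cents: 0 if the balloon pops,
    otherwise cents_per_pump for every completed pump.
    """
    for _ in range(planned_pumps):
        if rng.random() < pop_probability:  # balloon bursts, money is lost
            return 0
    return planned_pumps * cents_per_pump

def average_earnings(planned_pumps, trials=10_000, **kwargs):
    return sum(balloon_trial(planned_pumps, **kwargs) for _ in range(trials)) / trials

# A cautious strategy banks small but reliable sums; an aggressive one
# earns more per surviving balloon but busts far more often.
print(average_earnings(5))   # few pumps: rarely pops
print(average_earnings(40))  # many pumps: often pops
```

Under assumptions like these, the interesting measure is not total earnings but how many pumps a participant risks before cashing out, which is exactly the behavior the researchers compared between the acetaminophen and placebo groups.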
The results showed that the students who took acetaminophen engaged in significantly more risk-taking during the exercise, relative to the more cautious and conservative placebo group. On the whole, those on acetaminophen pumped (and burst) their balloons more than the controls.
“If you’re risk-averse, you may pump a few times and then decide to cash out because you don’t want the balloon to burst and lose your money,” Way says.
“But for those who are on acetaminophen, as the balloon gets bigger, we believe they have less anxiety and less negative emotion about how big the balloon is getting and the possibility of it bursting.”
In addition to the balloon simulation, participants also filled out surveys during two of the experiments, rating the level of risk they perceived in various hypothetical scenarios, such as betting a day’s income on a sporting event, bungee jumping off a tall bridge, or driving a car without a seatbelt.
In one of the surveys, acetaminophen consumption did appear to reduce perceived risk compared to the control group, although in another similar survey, the same effect wasn’t observed.
Overall, however, based on an average of results across the various tests, the team concludes that there is a significant relationship between taking acetaminophen and choosing more risk, even if the observed effect is slight.
That said, they acknowledge the drug's apparent effects on risk-taking behaviour could also be interpreted via other kinds of psychological processes, such as reduced anxiety.
“It may be that as the balloon increases in size, those on placebo feel increasing amounts of anxiety about a potential burst,” the researchers explain.
“When the anxiety becomes too much, they end the trial. Acetaminophen may reduce this anxiety, thus leading to greater risk taking.”
Exploring such psychological alternative explanations for this phenomenon – as well as investigating the biological mechanisms responsible for acetaminophen’s effects on people’s choices in situations like this – should be addressed in future research, the team says.
While they’re at it, scientists no doubt will also have future opportunities to further investigate the role and efficacy of acetaminophen in pain relief more broadly, after studies in recent years found that in many medical scenarios, the drug can be ineffective at pain relief, and sometimes is no better than a placebo, in addition to inviting other kinds of health problems.
Humans are dismantling and disrupting natural ecosystems around the globe and changing Earth’s climate. Over the past 50 years, actions like farming, logging, hunting, development and global commerce have caused record losses of species on land and at sea. Animals, birds and reptiles are disappearing tens to hundreds of times faster than the natural rate of extinction over the past 10 million years.
Now the world is also contending with a global pandemic. In geographically remote regions such as the Brazilian Amazon, COVID-19 is devastating Indigenous populations, with tragic consequences for both Indigenous peoples and the lands they steward.
My research focuses on ecosystems and climate change from regional to global scales. In 2019, I worked with conservation biologist and strategist Eric Dinerstein and 17 colleagues to develop a road map for simultaneously averting a sixth mass extinction and reducing climate change by protecting half of Earth’s terrestrial, freshwater and marine realms by 2030. We called this plan “A Global Deal for Nature.”
Now we’ve released a follow-on called the “Global Safety Net” that identifies the exact regions on land that must be protected to achieve its goals. Our aim is for nations to pair it with the Paris Climate Agreement and use it as a dynamic tool to assess progress towards our comprehensive conservation targets.
What to protect next
The Global Deal for Nature provided a framework for the milestones, targets and policies across terrestrial, freshwater and marine realms required to conserve the vast majority of life on Earth. Yet it didn’t specify where exactly these safeguards were needed. That’s where the new Global Safety Net comes in.
We analyzed unprotected terrestrial areas that, if protected, could sequester carbon and conserve biodiversity as effectively as the 15% of terrestrial areas that are currently protected. Through this analysis, we identified an additional 35% of unprotected lands for conservation, bringing the total percentage of protected nature to 50%.
By setting aside half of Earth's lands for nature, nations can save our planet's rich biodiversity, prevent future pandemics and meet the Paris climate target of keeping warming in this century below 2.7 degrees F (1.5 degrees C). To meet these goals, 20 countries must contribute disproportionately. Much of the responsibility falls to Russia, the U.S., Brazil, Indonesia, Canada, Australia and China. Why? Because these countries contain massive tracts of land needed to reach the dual goals of reducing climate change and saving biodiversity.
Supporting Indigenous communities
Indigenous peoples make up less than 5% of the total human population, yet they manage or have tenure rights over a quarter of the world’s land surface, representing close to 80% of our planet’s biodiversity. One of our key findings is that 37% of the proposed lands for increased protection overlap with Indigenous lands.
As the world edges closer towards a sixth mass extinction, Indigenous communities stand to lose the most. Forest loss, ecotourism and devastation wrought by climate change have already displaced Indigenous peoples from their traditional territories at unprecedented rates. Now one of the deadliest pandemics in recent history poses an even graver additional threat to Indigenous lives and livelihoods.
To address and alleviate human rights questions, social justice issues and conservation challenges, the Global Safety Net calls for better protection for Indigenous communities. We believe our goals are achievable by upholding existing land tenure rights, addressing Indigenous land claims, and carrying out supportive ecological management programs with Indigenous peoples.
Preventing future pandemics
Tropical deforestation increases forest edges – areas where forests meet human habitats. These areas greatly increase the potential for contact between humans and animal vectors that serve as viral hosts.
The Global Safety Net’s policy milestones and targets would reduce the illegal wildlife trade and associated wildlife markets – two known sources of zoonotic diseases. Reducing contact zones between animals and humans can decrease the chances of future zoonotic spillovers from occurring.
Our framework also envisions the creation of a Pandemic Prevention Program, which would increase protections for natural habitats at high risk for human-animal interactions. Protecting wildlife in these areas could also reduce the potential for more catastrophic outbreaks.
Achieving the Global Safety Net’s goals will require nature-based solutions – strategies that protect, manage and restore natural or modified ecosystems while providing co-benefits to both people and nature. They are low-cost and readily available today.
The nature-based solutions that we spotlight include:
- Identifying biodiverse non-agricultural lands, particularly prevalent in tropical and sub-tropical regions, for increased conservation attention.
- Prioritizing ecoregions that optimize carbon storage and drawdown, such as the Amazon and Congo basins.
- Aiding species movement and adaptation across ecosystems by creating a comprehensive system of wildlife and climate corridors.
We estimate that protecting just 2.3% more land, in the right places, could save our planet's rarest plant and animal species within five years. Wildlife corridors connect fragmented wild spaces, providing wild animals the space they need to survive.
Leveraging technology for conservation
In the Global Safety Net study, we identified 50 ecoregions where additional conservation attention is most needed to meet the Global Deal for Nature’s targets, and 20 countries that must assume greater responsibility for protecting critical places. We mapped an additional 35% of terrestrial lands that play a critical role in reversing biodiversity loss, enhancing natural carbon removal and preventing further greenhouse gas emissions from land conversion.
But as climate change accelerates, it may scramble those priorities. Staying ahead of the game will require a satellite-driven monitoring system with the capability of tracking real-time land use changes on a global scale. These continuously updated maps would enable dynamic analyses to help sharpen conservation planning and help decision-making.
Earlier this summer, the Summit supercomputer at Oak Ridge National Lab in Tennessee set about crunching data on more than 40,000 genes from 17,000 genetic samples in an effort to better understand Covid-19. Summit is the second-fastest computer in the world, but the process — which involved analyzing 2.5 billion genetic combinations — still took more than a week.
When Summit was done, researchers analyzed the results. It was, in the words of Dr. Daniel Jacobson, lead researcher and chief scientist for computational systems biology at Oak Ridge, a “eureka moment.” The computer had revealed a new theory about how Covid-19 impacts the body: the bradykinin hypothesis. The hypothesis provides a model that explains many aspects of Covid-19, including some of its most bizarre symptoms. It also suggests 10-plus potential treatments, many of which are already FDA approved. Jacobson’s group published their results in a paper in the journal eLife in early July.
According to the team's findings, a Covid-19 infection generally begins when the virus enters the body through ACE2 receptors in the nose. (The receptors, which the virus is known to target, are abundant there.) The virus then proceeds through the body, entering cells in other places where ACE2 is also present: the intestines, kidneys, and heart. This likely accounts for at least some of the disease's cardiac and GI symptoms.
But once Covid-19 has established itself in the body, things start to get really interesting. According to Jacobson’s group, the data Summit analyzed shows that Covid-19 isn’t content to simply infect cells that already express lots of ACE2 receptors. Instead, it actively hijacks the body’s own systems, tricking it into upregulating ACE2 receptors in places where they’re usually expressed at low or medium levels, including the lungs.
In this sense, Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house. Once inside, though, they don’t just take your stuff — they also throw open all your doors and windows so their accomplices can rush in and help pillage more efficiently.
The renin–angiotensin system (RAS) controls many aspects of the circulatory system, including the body’s levels of a chemical called bradykinin, which normally helps to regulate blood pressure. According to the team’s analysis, when the virus tweaks the RAS, it causes the body’s mechanisms for regulating bradykinin to go haywire. Bradykinin receptors are resensitized, and the body also stops effectively breaking down bradykinin. (ACE normally degrades bradykinin, but when the virus downregulates it, it can’t do this as effectively.)
The end result, the researchers say, is to release a bradykinin storm — a massive, runaway buildup of bradykinin in the body. According to the bradykinin hypothesis, it’s this storm that is ultimately responsible for many of Covid-19’s deadly effects. Jacobson’s team says in their paper that “the pathology of Covid-19 is likely the result of Bradykinin Storms rather than cytokine storms,” which had been previously identified in Covid-19 patients, but that “the two may be intricately linked.” Other papers had previously identified bradykinin storms as a possible cause of Covid-19’s pathologies.
Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house.
As bradykinin builds up in the body, it dramatically increases vascular permeability. In short, it makes your blood vessels leaky. This aligns with recent clinical data, which increasingly views Covid-19 primarily as a vascular disease, rather than a respiratory one. But Covid-19 still has a massive effect on the lungs. As blood vessels start to leak due to a bradykinin storm, the researchers say, the lungs can fill with fluid. Immune cells also leak out into the lungs, Jacobson’s team found, causing inflammation.
And Covid-19 has another especially insidious trick. Through another pathway, the team’s data shows, it increases production of hyaluronic acid (HLA) in the lungs. HLA is often used in soaps and lotions for its ability to absorb more than 1,000 times its weight in fluid. When it combines with fluid leaking into the lungs, the results are disastrous: It forms a hydrogel, which can fill the lungs in some patients. According to Jacobson, once this happens, “it’s like trying to breathe through Jell-O.”
This may explain why ventilators have proven less effective in treating advanced Covid-19 than doctors originally expected, based on experiences with other viruses. “It reaches a point where regardless of how much oxygen you pump in, it doesn’t matter, because the alveoli in the lungs are filled with this hydrogel,” Jacobson says. “The lungs become like a water balloon.” Patients can suffocate even while receiving full breathing support.
The bradykinin hypothesis also extends to many of Covid-19’s effects on the heart. About one in five hospitalized Covid-19 patients have damage to their hearts, even if they never had cardiac issues before. Some of this is likely due to the virus infecting the heart directly through its ACE2 receptors. But the RAS also controls aspects of cardiac contractions and blood pressure. According to the researchers, bradykinin storms could create arrhythmias and low blood pressure, which are often seen in Covid-19 patients.
Bradykinin — especially at high doses — can also lead to a breakdown of the blood-brain barrier. Under normal circumstances, this barrier acts as a filter between your brain and the rest of your circulatory system. It lets in the nutrients and small molecules that the brain needs to function, while keeping out toxins and pathogens and keeping the brain’s internal environment tightly regulated.
If bradykinin storms cause the blood-brain barrier to break down, this could allow harmful cells and compounds into the brain, leading to inflammation, potential brain damage, and many of the neurological symptoms Covid-19 patients experience. Jacobson told me, “It is a reasonable hypothesis that many of the neurological symptoms in Covid-19 could be due to an excess of bradykinin. It has been reported that bradykinin would indeed be likely to increase the permeability of the blood-brain barrier. In addition, similar neurological symptoms have been observed in other diseases that result from an excess of bradykinin.”
Increased bradykinin levels could also account for other common Covid-19 symptoms. ACE inhibitors — a class of drugs used to treat high blood pressure — have a similar effect on the RAS system as Covid-19, increasing bradykinin levels. In fact, Jacobson and his team note in their paper that “the virus… acts pharmacologically as an ACE inhibitor” — almost directly mirroring the actions of these drugs.
By acting like a natural ACE inhibitor, Covid-19 may be causing the same effects that hypertensive patients sometimes get when they take blood pressure–lowering drugs. ACE inhibitors are known to cause a dry cough and fatigue, two textbook symptoms of Covid-19. And they can potentially increase blood potassium levels, which has also been observed in Covid-19 patients. The similarities between ACE inhibitor side effects and Covid-19 symptoms strengthen the bradykinin hypothesis, the researchers say.
ACE inhibitors are also known to cause a loss of taste and smell. Jacobson stresses, though, that this symptom is more likely due to the virus “affecting the cells surrounding olfactory nerve cells” than the direct effects of bradykinin.
Though still an emerging theory, the bradykinin hypothesis explains several other of Covid-19’s seemingly bizarre symptoms. Jacobson and his team speculate that leaky vasculature caused by bradykinin storms could be responsible for “Covid toes,” a condition involving swollen, bruised toes that some Covid-19 patients experience. Bradykinin can also mess with the thyroid gland, which could produce the thyroid symptoms recently observed in some patients.
The bradykinin hypothesis could also explain some of the broader demographic patterns of the disease’s spread. The researchers note that some aspects of the RAS system are sex-linked, with proteins for several receptors (such as one called TMSB4X) located on the X chromosome. This means that “women… would have twice the levels of this protein than men,” a result borne out by the researchers’ data. In their paper, Jacobson’s team concludes that this “could explain the lower incidence of Covid-19 induced mortality in women.” A genetic quirk of the RAS could be giving women extra protection against the disease.
The bradykinin hypothesis provides a model that “contributes to a better understanding of Covid-19” and “adds novelty to the existing literature,” according to scientists Frank van de Veerdonk, Jos WM van der Meer, and Roger Little, who peer-reviewed the team’s paper. It predicts nearly all the disease’s symptoms, even ones (like bruises on the toes) that at first appear random, and further suggests new treatments for the disease.
As Jacobson and team point out, several drugs target aspects of the RAS and are already FDA approved to treat other conditions. They could arguably be applied to treating Covid-19 as well. Several, like danazol, stanozolol, and ecallantide, reduce bradykinin production and could potentially stop a deadly bradykinin storm. Others, like icatibant, reduce bradykinin signaling and could blunt its effects once it’s already in the body.
Interestingly, Jacobson’s team also suggests vitamin D as a potentially useful Covid-19 drug. The vitamin is involved in the RAS system and could prove helpful by reducing levels of another compound, known as REN. Again, this could stop potentially deadly bradykinin storms from forming. The researchers note that vitamin D has already been shown to help those with Covid-19. The vitamin is readily available over the counter, and around 20% of the population is deficient. If indeed the vitamin proves effective at reducing the severity of bradykinin storms, it could be an easy, relatively safe way to reduce the severity of the virus.
Other compounds could treat symptoms associated with bradykinin storms. Hymecromone, for example, could reduce hyaluronic acid levels, potentially stopping deadly hydrogels from forming in the lungs. And timbetasin could mimic the mechanism that the researchers believe protects women from more severe Covid-19 infections. All of these potential treatments are speculative, of course, and would need to be studied in a rigorous, controlled environment before their effectiveness could be determined and they could be used more broadly.
Covid-19 stands out for both the scale of its global impact and the apparent randomness of its many symptoms. Physicians have struggled to understand the disease and come up with a unified theory for how it works. Though as of yet unproven, the bradykinin hypothesis provides such a theory. And like all good hypotheses, it also provides specific, testable predictions — in this case, actual drugs that could provide relief to real patients.
The researchers are quick to point out that “the testing of any of these pharmaceutical interventions should be done in well-designed clinical trials.” As to the next step in the process, Jacobson is clear: “We have to get this message out.” His team’s finding won’t cure Covid-19. But if the treatments it points to pan out in the clinic, interventions guided by the bradykinin hypothesis could greatly reduce patients’ suffering — and potentially save lives.
Summary: With widespread, sustained declines in fertility, the world population will likely peak in 2064 at around 9.7 billion, and then decline to about 8.8 billion by 2100 — about 2 billion lower than some previous estimates, according to a new study.
Improvements in access to modern contraception and the education of girls and women are generating widespread, sustained declines in fertility, and world population will likely peak in 2064 at around 9.7 billion, and then decline to about 8.8 billion by 2100 — about 2 billion lower than some previous estimates, according to a new study published in The Lancet.
The modelling research uses data from the Global Burden of Disease Study 2017 to project future global, regional, and national population. Using novel methods for forecasting mortality, fertility, and migration, the researchers from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington’s School of Medicine estimate that by 2100, 183 of 195 countries will have total fertility rates (TFR), which represent the average number of children a woman delivers over her lifetime, below replacement level of 2.1 births per woman. This means that in these countries populations will decline unless low fertility is compensated by immigration.
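The arithmetic behind sub-replacement fertility is easy to illustrate. In a deliberately simple toy model (not IHME's forecasting method, which also accounts for mortality and migration), each roughly 30-year generation scales the population by the ratio of the fertility rate to the replacement level:

```python
# Toy generational projection: with TFR below the replacement level of 2.1,
# each generation shrinks by roughly TFR / 2.1, absent immigration.
# (Illustrative only; real demographic models track age cohorts, mortality
# and migration, which this sketch ignores.)
REPLACEMENT_TFR = 2.1

def project_population(pop_millions, tfr, generations):
    for _ in range(generations):
        pop_millions *= tfr / REPLACEMENT_TFR
    return pop_millions

# A TFR of 1.5 sustained for three ~30-year generations shrinks a
# population of 100 million to about a third of its starting size:
print(round(project_population(100.0, 1.5, 3), 1))  # → 36.4
```

This is why the study emphasizes immigration: in the toy model, only an inflow of people can offset the per-generation shrink factor once TFR sits below 2.1.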
The new population forecasts contrast with projections of 'continuing global growth' by the United Nations Population Division, and highlight the huge challenges to economic growth of a shrinking workforce, the high burden on health and social support systems of an aging population, and the impact on global power linked to shifts in world population.
The new study also predicts huge shifts in the global age structure, with an estimated 2.37 billion individuals over 65 years globally in 2100, compared with 1.7 billion under 20 years, underscoring the need for liberal immigration policies in countries with significantly declining working age populations.
“Continued global population growth through the century is no longer the most likely trajectory for the world’s population,” says IHME Director Dr. Christopher Murray, who led the research. “This study provides governments of all countries an opportunity to start rethinking their policies on migration, workforces and economic development to address the challenges presented by demographic change.”
IHME Professor Stein Emil Vollset, first author of the paper, continues, “The societal, economic, and geopolitical power implications of our predictions are substantial. In particular, our findings suggest that the decline in the numbers of working-age adults alone will reduce GDP growth rates that could result in major shifts in global economic power by the century’s end. Responding to population decline is likely to become an overriding policy concern in many nations, but must not compromise efforts to enhance women’s reproductive health or progress on women’s rights.”
Dr Richard Horton, Editor-in-Chief, The Lancet, adds: “This important research charts a future we need to be planning for urgently. It offers a vision for radical shifts in geopolitical power, challenges myths about immigration, and underlines the importance of protecting and strengthening the sexual and reproductive rights of women. The 21st century will see a revolution in the story of our human civilisation. Africa and the Arab World will shape our future, while Europe and Asia will recede in their influence. By the end of the century, the world will be multipolar, with India, Nigeria, China, and the US the dominant powers. This will truly be a new world, one we should be preparing for today.”
Accelerating decline in fertility worldwide
The global TFR is predicted to steadily decline, from 2.37 in 2017 to 1.66 in 2100 — well below the minimum rate (2.1) considered necessary to maintain population numbers (replacement level) — with rates falling to around 1.2 in Italy and Spain, and as low as 1.17 in Poland.
Even slight changes in TFR translate into large differences in population size in countries below the replacement level — increasing TFR by as little as 0.1 births per woman is equivalent to around 500 million more individuals on the planet in 2100.
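The compounding effect of a small fertility change can be illustrated with a deliberately crude back-of-the-envelope sketch (this is not the IHME model, and all numbers beyond the 2017 world population and the study's 2100 TFR are illustrative): in a stylized population, each roughly 28-year generation scales numbers by TFR divided by the replacement level of 2.1.

```python
# Toy illustration only: scale a population by (TFR / replacement) once per
# generation. Ignores age structure, mortality trends, and migration, so it
# understates 2100 totals, but it shows the sensitivity to small TFR changes.

def project(pop_2017, tfr, years=83, generation=28, replacement=2.1):
    """Crudely compound a population by (tfr/replacement) per generation."""
    generations = years / generation
    return pop_2017 * (tfr / replacement) ** generations

base = project(7.6e9, 1.66)    # study's projected global TFR for 2100
bumped = project(7.6e9, 1.76)  # TFR higher by 0.1 births per woman
print(f"difference: {bumped - base:.2e} people")
```

Even in this toy model, a 0.1 difference in TFR changes the endpoint by several hundred million people, the same order of magnitude as the study's estimate.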
Much of the anticipated fertility decline is predicted in high-fertility countries, particularly those in sub-Saharan Africa where rates are expected to fall below the replacement level for the first time — from an average 4.6 births per woman in 2017 to just 1.7 by 2100. In Niger, where the fertility rate was the highest in the world in 2017 — with women giving birth to an average of seven children — the rate is projected to decline to around 1.8 by 2100.
Nevertheless, the population of sub-Saharan Africa is forecast to triple over the course of the century, from an estimated 1.03 billion in 2017 to 3.07 billion in 2100 — as death rates decline and an increasing number of women enter reproductive age. North Africa and the Middle East is the only other region predicted to have a larger population in 2100 (978 million) than in 2017 (600 million).
Many of the fastest-shrinking populations will be in Asia and central and eastern Europe. Populations are expected to more than halve in 23 countries and territories, including Japan (from around 128 million people in 2017 to 60 million in 2100), Thailand (71 to 35 million), Spain (46 to 23 million), Italy (61 to 31 million), Portugal (11 to 5 million), and South Korea (53 to 27 million). An additional 34 countries are expected to have population declines of 25 to 50%, including China (1.4 billion in 2017 to 732 million in 2100; see table).
Huge shifts in global age structure — with over 80s outnumbering under 5s two to one
As fertility falls and life expectancy increases worldwide, the number of children under 5 years old is forecast to decline by 41%, from 681 million in 2017 to 401 million in 2100, whilst the number of individuals older than 80 years is projected to increase sixfold, from 141 million to 866 million. In countries with a population decline of more than 25%, the ratio of adults over 80 years to each person aged 15 years or younger is projected to rise from 0.16 in 2017 to 1.50 in 2100.
Furthermore, the global ratio of non-working adults to workers was around 0.8 in 2017, but is projected to increase to 1.16 in 2100 if labour force participation by age and sex does not change.
“While population decline is potentially good news for reducing carbon emissions and stress on food systems, with more old people and fewer young people, economic challenges will arise as societies struggle to grow with fewer workers and taxpayers, and countries’ abilities to generate the wealth needed to fund social support and health care for the elderly are reduced,” says Vollset.
Declining working-age populations could see major shifts in size of economies
The study also examined the economic impact of shrinking working-age populations in all countries. While China is set to replace the USA in 2035 as the country with the largest total gross domestic product (GDP), rapid population decline from 2050 onward will curtail its economic growth. As a result, the USA is expected to reclaim the top spot by 2098, provided immigration continues to sustain the US workforce.
Although the number of working-age adults in India is projected to fall from 762 million in 2017 to around 578 million in 2100, India is expected to be one of the few, if not the only, major powers in Asia to protect its working-age population over the century. Its workforce is expected to surpass China's in the mid-2020s (China's worker numbers are estimated to decline from 950 million in 2017 to 357 million in 2100), and India is projected to rise in the GDP rankings from 7th to 3rd.
Sub-Saharan Africa is likely to become an increasingly powerful continent on the geopolitical stage as its population rises. Nigeria is projected to be the only country among the world’s 10 most populated nations to see its working-age population grow over the course of the century (from 86 million in 2017 to 458 million in 2100), supporting rapid economic growth and its rise in GDP rankings from 23rd place in 2017 to 9th place in 2100.
While the UK, Germany, and France are expected to remain in the top 10 for largest GDP worldwide at the turn of the century, Italy (from 9th in 2017 to 25th in 2100) and Spain (from 13th to 28th) are projected to fall down the rankings, reflecting much greater population decline.
Liberal immigration could help sustain population size and economic growth
The study also suggests that population decline could be offset by immigration, with countries that promote liberal immigration better able to maintain their population size and support economic growth, even in the face of declining fertility rates.
The model predicts that some countries with fertility lower than replacement level, such as the USA, Australia, and Canada, will probably maintain their working-age populations through net immigration (see appendix 2 section 4), although the authors note that there is considerable uncertainty about these future trends.
“For high-income countries with below-replacement fertility rates, the best solutions for sustaining current population levels, economic growth, and geopolitical security are open immigration policies and social policies supportive of families having their desired number of children,” Murray says. “However, a very real danger exists that, in the face of declining population, some countries might consider policies that restrict access to reproductive health services, with potentially devastating consequences. It is imperative that women’s freedom and rights are at the top of every government’s development agenda.”
The authors note some important limitations, including that while the study uses the best available data, predictions are constrained by the quantity and quality of past data. They also note that past trends are not always predictive of what will happen in the future, and that some factors not included in the model could change the pace of fertility, mortality, or migration. For example, the COVID-19 pandemic has affected local and national health systems throughout the world, and caused over half a million deaths. However, the authors believe the excess deaths caused by the pandemic are unlikely to significantly alter longer term forecasting trends of global population.
Writing in a linked Comment, Professor Ibrahim Abubakar, University College London (UCL), UK, and Chair of Lancet Migration (who was not involved in the study), says: “Migration can be a potential solution to the predicted shortage of working-age populations. While demographers continue to debate the long-term implications of migration as a remedy for declining TFR, for it to be successful, we need a fundamental rethink of global politics. Greater multilateralism and a new global leadership should enable both migrant sending and migrant-receiving countries to benefit, while protecting the rights of individuals. Nations would need to cooperate at levels that have eluded us to date to strategically support and fund the development of excess skilled human capital in countries that are a source of migrants. An equitable change in global migration policy will need the voice of rich and poor countries. The projected changes in the sizes of national economies and the consequent change in military power might force these discussions.”
He adds: “Ultimately, if Murray and colleagues’ predictions are even half accurate, migration will become a necessity for all nations and not an option. The positive impacts of migration on health and economies are known globally. The choice that we face is whether we improve health and wealth by allowing planned population movement or if we end up with an underclass of imported labour and unstable societies. The Anthropocene has created many challenges such as climate change and greater global migration. The distribution of working-age populations will be crucial to whether humanity prospers or withers.”
The study was in part funded by the Bill & Melinda Gates Foundation. It was conducted by researchers at the University of Washington, Seattle, USA.
Materials provided by The Lancet. Note: Content may be edited for style and length.
Stein Emil Vollset, Emily Goren, Chun-Wei Yuan, Jackie Cao, Amanda E Smith, Thomas Hsiao, Catherine Bisignano, Gulrez S Azhar, Emma Castro, Julian Chalek, Andrew J Dolgert, Tahvi Frank, Kai Fukutaki, Simon I Hay, Rafael Lozano, Ali H Mokdad, Vishnu Nandakumar, Maxwell Pierce, Martin Pletcher, Toshana Robalik, Krista M Steuben, Han Yong Wunrow, Bianca S Zlavog, Christopher J L Murray. Fertility, mortality, migration, and population scenarios for 195 countries and territories from 2017 to 2100: a forecasting analysis for the Global Burden of Disease Study. The Lancet, 2020; DOI: 10.1016/S0140-6736(20)30677-2
On May 20, disease modelers at Columbia University posted a preprint that concluded the US could have prevented 36,000 of the 65,300 deaths that the country had suffered as a result of COVID-19 by May 3 if states had instituted social distancing measures a week earlier. In early June, Imperial College London epidemiologist Neil Ferguson, one of the UK government’s key advisers in the early stages of the pandemic, came to a similar conclusion about the UK. In evidence he presented to a parliamentary committee inquiry, Ferguson said that if the country had introduced restrictions on movement and socializing a week sooner than it did, Britain’s official death toll of 40,000 could have been halved.
On a more positive note, Ferguson and other researchers at Imperial College London published a model in Nature around the same time estimating that more than 3 million deaths had been avoided across 11 European countries as a result of the policies that were put in place.
These and other studies from recent months aim to understand how well various social-distancing measures have curbed infections, and by extension saved lives. It’s a big challenge to unravel and reliably understand all the factors at play, but experts say the research could help inform future policies.
“It’s not just about looking retrospectively,” Jeffrey Shaman, a data scientist at Columbia University and coauthor of the preprint on US deaths, tells The Scientist. “All the places that have managed to get it under control to a certain extent are still at risk of having a rebound and a flare up. And if they don’t respond to it because they can’t motivate the political and public will to actually reinstitute control measures, then we’re going to repeat the same mistakes.”
Diving into the data
Shaman and his team used a computer model and data on how people moved around to work out how reduced contact between people could explain disease trends after the US introduced social distancing measures in mid-March. Then, the researchers looked at what would have happened if the same measures had been introduced a week earlier, and found that more than half of total infections and deaths up to May 3 would have been prevented. Starting the measures on March 1 would have prevented 83 percent of the nation’s deaths during that period, according to the model. Shaman says he is waiting to submit for publication in a peer-reviewed journal until he and his colleagues update the study with more-recent data.
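The intuition behind this counterfactual is that timing matters enormously during exponential growth. The sketch below illustrates it with a minimal discrete-time SEIR model; this is not the Columbia metapopulation model, and every parameter value is an invented placeholder.

```python
# Minimal SEIR sketch (illustrative parameters, NOT the Columbia model):
# transmission drops from beta0 to beta1 on the intervention day; compare
# total infections when the same intervention starts one week earlier.

def seir_total_infected(intervention_day, days=120, n=1e6,
                        beta0=0.5, beta1=0.12, sigma=1/5, gamma=1/7):
    s, e, i, r = n - 10, 0.0, 10.0, 0.0
    for day in range(days):
        beta = beta0 if day < intervention_day else beta1
        new_e = beta * s * i / n   # new exposures
        new_i = sigma * e          # exposed becoming infectious
        new_r = gamma * i          # recoveries
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
        r += new_r
    return n - s  # everyone ever infected

late = seir_total_infected(45)
early = seir_total_infected(38)  # same measures, one week earlier
print(f"share of infections averted: {1 - early / late:.0%}")
```

Because cases double every few days before the intervention, shifting the start date back a single week cuts the eventual total by more than half in this toy setup, mirroring the qualitative finding described above.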
“I thought they had reasonably credible data in terms of trying to argue that the lockdowns had prevented infections,” says Daniel Sutter, an economist at Troy University. “They were training or calibrating that model using some cell phone data and foot traffic data and correlating that with lockdowns.”
Sébastien Annan-Phan, an economist at the University of California, Berkeley, undertook a similar analysis, looking at the growth rate of case numbers before and after various lockdown measures were introduced in China, South Korea, Italy, Iran, France, and the US. Because these countries instituted different combinations of social distancing measures, the team was able to estimate how well each action slowed disease spread. The most effective measure, they found, was getting people not to travel to work, while school closures had relatively little effect. “Every country is different and they implement different policies, but we can still tease out a couple of things,” says Annan-Phan.
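The before-and-after growth-rate comparison described above can be sketched with made-up numbers: fit the slope of log(cases) against time in the pre-policy and post-policy windows, and take the difference in slopes as the estimated effect on the daily growth rate. This is a toy version, not the authors' actual econometric specification.

```python
# Toy growth-rate estimate: least-squares slope of log(cases) vs. day,
# computed separately before and after a policy date. Case numbers below
# are synthetic (20% daily growth before, 5% after).
import math

def growth_rate(daily_cases):
    """Least-squares slope of log(cases) against day index."""
    logs = [math.log(c) for c in daily_cases]
    n = len(logs)
    xbar = (n - 1) / 2
    ybar = sum(logs) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(logs))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

before = [100 * 1.20 ** t for t in range(10)]
after = [before[-1] * 1.05 ** t for t in range(1, 11)]
effect = growth_rate(before) - growth_rate(after)
print(f"estimated reduction in daily growth rate: {effect:.3f}")
```

On these exact exponential series the fitted slopes recover ln(1.20) and ln(1.05), so the estimated reduction is their difference; real case data are noisy, which is why the published analyses control for many confounders.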
In total, his group estimated that combined interventions prevented or delayed about 62 million confirmed cases in the six countries studied, or about 530 million total infections. The results were published in Nature in June alongside a study from a group at Imperial College London, which had compared COVID-19 cases reported in several European countries under lockdown with the worst-case scenario predicted for each of those countries by a computer model in which no such measures were taken. According to that analysis, which assumed that the effects of social distancing measures were the same from country to country, some 3.1 million deaths had been avoided.
It’s hard to argue against the broad conclusion that changing people’s behavior was beneficial, says Andrew Gelman, a statistician at Columbia University. “If people hadn’t changed their behavior, then it would have been disastrous.”
Lockdown policies versus personal decisions to isolate
Like all hypothetical scenarios, it’s impossible to know how events would have played out if different decisions were made. And attributing changes in people’s behavior to official lockdown policies during the pandemic is especially difficult, says Gelman. “Ultimately, we can’t say what would have happened without it, because the timing of lockdown measures correlates with when people would have gone into self-isolation anyway.” Indeed, according to a recent study of mobile phone data in the US, many people started to venture out less a good one to four weeks before they were officially asked to.
A report on data from Sweden, a country that did not introduce the same strict restrictions as others in Europe, seems to support that idea. It found that, compared with data from other countries, Sweden’s outcomes were no worse. “A lockdown would not have helped in terms of limiting COVID-19 infections or deaths in Sweden,” the study originally concluded. But Gernot Müller, an economist at the University of Tübingen who worked on that report, now says updated data show that the original conclusion was flawed. Many Swedes took voluntary actions in the first few weeks, he says, and this masked the benefits that a lockdown would have had. But after the first month, the death rate started to rise. “It turns out that we do now see a lockdown effect,” Müller says of his group’s new, still unpublished analyses. “So lockdowns do work and we can attach a number to that: some 40 percent or 50 percent fewer deaths.”
Some critics question the assumption that such deaths have been prevented, rather than simply delayed. While it can appear to be a semantic point, the distinction between preventing and delaying infection is an important one when policymakers assess the costs and benefits of lockdown measures, Sutter says. “I think it’s a little misleading to keep saying these lockdowns have prevented death. They’ve just prevented cases from occurring so far,” he says. “There’s still the underlying vulnerability out there. People are still susceptible to get the virus and get sick at a later date.”
Shaman notes, however, that it’s really a race against the clock. It’s about “buying yourself and your population critical time to not be infected while we try to get our act together to produce an effective vaccine or therapeutic.”
Summary: Researchers have observed the exploratory behavior of ants to inform the development of a more efficient mathematical sampling technique.
In a paper published by the Royal Society, a team of Bristol researchers observed the exploratory behaviour of ants to inform the development of a more efficient mathematical sampling technique.
Animals like ants have the challenge of exploring their environment to look for food and potential places to live. With a large group of individuals, like an ant colony, a large amount of time would be wasted if the ants repeatedly explored the same empty areas.
The interdisciplinary team from the University of Bristol’s Faculties of Engineering and Life Sciences predicted that the study species — the ‘rock ant’ — uses some form of chemical communication to avoid exploring the same space multiple times.
Lead author, Dr Edmund Hunt, said:
“This would be a reversal of the Hansel and Gretel story — instead of following each other’s trails, they would avoid them in order to explore collectively.
“To test this theory, we conducted an experiment where we let ants explore an empty arena one by one. In the first condition, we cleaned the arena between each ant so they could not leave behind any trace of their path. In the second condition, we did not clean between ants. The ants in the second condition (no cleaning) made a better exploration of the arena — they covered more space.”
In mathematics, a probability distribution describes how likely each of a set of possible outcomes is: for example, the chance that an ant will find food at a certain place. In many science and engineering problems, these distributions are highly complex and have no neat mathematical description. Instead, one must sample from the distribution to obtain a good approximation, while avoiding sampling too much from its unimportant (low-probability) parts.
The team wanted to find out if adopting an ant-inspired approach would hasten this sampling process.
“We predicted that we could simulate the approach adopted by the ants in the mathematical sampling problem, by leaving behind a ‘negative trail’ of where has already been sampled. We found that our ant-inspired sampling method was more efficient (faster) than a standard method which does not leave a memory of where has already been sampled,” said Dr Hunt.
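The "negative trail" idea can be sketched as follows (an illustration of the principle, not the authors' actual algorithm): a random walk on a grid keeps an externalized memory of visit counts and weights its moves away from heavily visited cells, then is compared with a memoryless walk on coverage of the space. The grid size, repulsion strength, and step count are all illustrative choices.

```python
# Sketch of memory-guided exploration: a walker on a toroidal grid prefers
# moves to cells it has visited less often (a "negative trail"), and covers
# more distinct cells than a memoryless walker in the same number of steps.
import random

def explore(steps, repulsion, size=20, seed=1):
    rng = random.Random(seed)
    visits = {}                 # externalized memory: the pheromone "trail"
    x = y = size // 2
    for _ in range(steps):
        moves = [((x + dx) % size, (y + dy) % size)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        # down-weight moves to cells that were already sampled
        weights = [1.0 / (1.0 + repulsion * visits.get(m, 0)) for m in moves]
        x, y = rng.choices(moves, weights=weights)[0]
        visits[(x, y)] = visits.get((x, y), 0) + 1
    return len(visits)          # number of distinct cells explored

plain = explore(2000, repulsion=0.0)  # no memory of past positions
trail = explore(2000, repulsion=5.0)  # avoids previously sampled cells
print(plain, trail)
```

The repelled walker corresponds to the uncleaned-arena condition in the experiment: by avoiding its own trail, it spreads its samples over more of the space.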
These findings contribute toward an interesting parallel between the exploration problem confronted by the ants, and the mathematical sampling problem of acquiring information. This parallel can inform our fundamental understanding of what the ants have evolved to do: acquire information more efficiently.
“Our ant-inspired sampling method may be useful in many domains, such as computational biology, for speeding up the analysis of complex problems. By describing the ants’ collective behaviour in informational terms, it also allows us to quantify how helpful different aspects of their behaviour are to their success: for example, how much better they perform when their pheromones are not cleaned away. This could allow us to make predictions about which behavioural mechanisms are most likely to be favoured by natural selection.”
Edmund R. Hunt, Nigel R. Franks, Roland J. Baddeley. The Bayesian superorganism: externalized memories facilitate distributed sampling. Journal of The Royal Society Interface, 2020; 17 (167): 20190848 DOI: 10.1098/rsif.2019.0848
Summary: Researchers describe a single function that accurately describes all existing available data on active COVID-19 cases and deaths — and predicts forthcoming peaks.
As of late May, COVID-19 has killed more than 325,000 people around the world. Even though the worst seems to be over for countries like China and South Korea, public health experts warn that cases and fatalities will continue to surge in many parts of the world. Understanding how the disease evolves can help these countries prepare for an expected uptick in cases.
This week in the journal Frontiers in Physics, researchers describe a single function that accurately describes all existing available data on active cases and deaths — and predicts forthcoming peaks. The tool uses q-statistics, a set of functions and probability distributions developed by Constantino Tsallis, a physicist and member of the Santa Fe Institute’s external faculty. Tsallis worked on the new model together with Ugur Tirnakli, a physicist at Ege University, in Turkey.
“The formula works in all the countries in which we have tested,” says Tsallis.
Neither physicist ever set out to model a global pandemic. But Tsallis says that when he saw the shape of published graphs representing China’s daily active cases, he recognized shapes he’d seen before — namely, in graphs he’d helped produce almost two decades ago to describe the behavior of the stock market.
“The shape was exactly the same,” he says. For the financial data, the function described probabilities of stock exchanges; for COVID-19, it described the daily number of active cases — and fatalities — as a function of time.
Modeling financial data and tracking a global pandemic may seem unrelated, but Tsallis says they have one important thing in common. “They’re both complex systems,” he says, “and in complex systems, this happens all the time.” Disparate systems from a variety of fields — biology, network theory, computer science, mathematics — often reveal patterns that follow the same basic shapes and evolution.
The financial graph appeared in a 2004 volume co-edited by Tsallis and the late Nobelist Murray Gell-Mann. Tsallis developed q-statistics, also known as “Tsallis statistics,” in the late 1980s as a generalization of Boltzmann-Gibbs statistics to complex systems.
In the new paper, Tsallis and Tirnakli used data from China, where the active case rate is thought to have peaked, to set the main parameters for the formula. Then, they applied it to other countries including France, Brazil, and the United Kingdom, and found that it matched the evolution of the active cases and fatality rates over time.
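The building block of such a formula is the Tsallis q-exponential. The sketch below defines it and evaluates a curve of the general form C * t^alpha * expq(-beta * t^gamma) that the authors describe; the parameter values here are illustrative placeholders, not the values fitted to any country's data.

```python
# Tsallis q-exponential and an illustrative epidemic-shaped curve built
# from it. All parameter values are invented for demonstration.
import math

def expq(x, q):
    """q-exponential: [1 + (1-q)x]^(1/(1-q)), reducing to exp(x) as q -> 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def cases(t, c=10.0, alpha=2.0, beta=0.05, gamma=1.0, q=1.3):
    # rise like t^alpha, then decay via the q-exponential tail
    return c * t ** alpha * expq(-beta * t ** gamma, q)

curve = [cases(t) for t in range(1, 121)]
peak_day = max(range(len(curve)), key=curve.__getitem__) + 1
print("peak on day", peak_day)
```

With these placeholder parameters the curve rises, peaks, and decays with a power-law tail, the kind of single-peak shape the model is fitted to; in practice the parameters would be estimated from a country's reported case series.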
The model, says Tsallis, could be used to create useful tools like an app that updates in real-time with new available data, and can adjust its predictions accordingly. In addition, he thinks that it could be fine-tuned to fit future outbreaks as well.
“The functional form seems to be universal,” he says, “Not just for this virus, but for the next one that might appear as well.”
In an interview with Folha, Mokdad says that the trend in cases and deaths in the country is upward, and that the situation could be even worse if the government and the population do not take the crisis seriously and adopt a two-week lockdown.
“Infections and deaths will grow and, most frightening of all, the health system will be completely overwhelmed.” If Brazil observes a full lockdown for 14 days, Mokdad explains, it will manage to control the spread of the virus and will be able to reopen economic activities strategically, and even more quickly.
A public health specialist, he says he is criticized for a model that varies considerably, but in the case of a pandemic he prefers that his projections adjust over time. “If Brazilians stay home for two weeks, my numbers will come down. Not because I did something wrong, but because Brazilians did something right.”
What is the state of the pandemic in Brazil? Unfortunately, what we see in Brazil is an upward trend in cases, which will lead to growing deaths in the country. There are several reasons for this. First, the country did not lock down early to stop the spread of the virus. The Brazilian government and population did not take it seriously and did not do the right things early on to block transmission.
Second, there is a great deal of inequality in Brazil, and Covid-19 widens it. It is necessary to protect not only health workers but also essential-service workers, poor people whose jobs force them to leave home. They are not protected, and they are dying. The third and most important concern is the overload of the health system. If the country does not act, there will be more cases in the winter and no time to prepare. It is dangerous and risky. Put all of this together, and Brazil still faces serious difficulties with Covid-19.
In two weeks, the IHME raised its projected deaths in Brazil from 88,000 to more than 125,000 by August. What happened? We added more states [from 11 to 19] to our projection, that is one thing. But we are seeing more outbreaks and cases in Brazil than we expected. The country is testing more and finding more cases, but even when we adjust for testing there is an upward trend.
In Brazil there is also a flawed assumption when we talk about movement. The [population mobility] data are based on Facebook and Google, that is, on smartphones, which means wealthier people. We noticed that movement did not stop in the favelas, for example, in places where poorer people need to go out to work. If people refuse to take this seriously, unfortunately we will see more cases and deaths.
What measures need to be taken? Closing schools and universities, preventing large gatherings and meetings of people, closing non-essential businesses, churches, temples, and religious venues. In essential places, such as markets and pharmacies, rules must be established limiting the number of people inside and ensuring that they keep their distance from one another.
The last and most important thing is to ask those who need to leave home, and we know some people must, to wear a mask and keep 2 meters away from others. For the health system, it means expanding treatment capacity and the ability to detect an outbreak early, through contact tracing and isolation of cases, which is a challenge in Brazil, where ten people often live in the same household.
If Brazil does not follow these measures, what is the worst-case scenario for the country? Infections and deaths will grow and, the most frightening part, the health system will be completely overwhelmed. That will hurt the economy more than a two-week shutdown would. If the population stays home and takes this seriously for two weeks, we will see the spread of the virus slow, and we will be able to reopen in phases. The economic restart must be carried out strategically, sector by sector.
Is it possible to avoid the peak of 1,500 daily deaths in July and the 125,000 deaths by August if the country stops now? Yes. Brazil is in a very difficult situation and may remain so for a long time, but there is still hope. If the government and the population stop for two weeks, we can halt the circulation of the virus and reopen commerce. If you look at American states such as New York, after a lockdown, deaths and cases fall. The lockdown saved many lives in the US. We projected 125,000 deaths in Brazil by August 4, but that does not mean it will happen; we can stop it. Every Brazilian needs to do their part.
President Jair Bolsonaro opposes social distancing measures, compares Covid-19 to a “little flu,” and promotes a drug with unproven efficacy against the disease. How might that stance affect Brazil’s situation? Here in the US, unfortunately, we also have a political situation of this kind. I am not a politician; I look at the numbers and give advice based on what I conclude from them. Based on the data, Brazil needs coordinated action; otherwise, we will suffer heavy losses.
But one thing must be clear: Covid-19 is not the flu. It causes more deaths than the flu, and the flu does not cause strokes or attack the lungs the way Covid-19 does. There is no drug for Covid-19, period. There is no vaccine. Covid-19 and the flu cannot be compared. Doing so sends the wrong message. Telling the population that people can go out and see who catches the disease is unacceptable; it is a failure of leadership.
How do you earn the confidence of governments and the public with projections that vary so much, and with so many people working on data about this topic? There are many people making projections, but, for the first time in the history of science, we all agree. The numbers may differ, but the most important message is the same: this is a lethal virus, and we have to take it seriously. My numbers change because people change. If Brazilians stay home for two weeks, my numbers will come down. Not because I did something wrong, but because Brazilians did something right. We have learned that the model changes as new data come in.
Have you been accused of being alarmist or of producing fake news when your numbers change? “Accused” is too strong, but some people say my numbers are higher or lower than they should be, and I do not even respond to that, because it is not a scientific debate, it is a political one. In the scientific debate, everyone is on board with the same message.
What is it like to do this work, with numbers so sensitive and so powerful? We are not sleeping much these days; it is a lot of work. It is very hard to say that 125,000 people in Brazil will die by August. That is not a number; those are families and friends. It is very painful.
BRASILIA (Reuters) – As Brazil’s daily COVID-19 death rate climbs to the highest in the world, a University of Washington study is warning its total death toll could climb five-fold to 125,000 by early August, adding to fears it has become a new hot spot in the pandemic.
The forecast from the University of Washington’s Institute for Health Metrics and Evaluation (IHME), released as Brazil’s daily death toll climbed past that of the United States on Monday, came with a call for lockdowns that Brazil’s president has resisted.
“Brazil must follow the lead of Wuhan, China, as well as Italy, Spain, and New York by enforcing mandates and measures to gain control of a fast-moving epidemic and reduce transmission of the coronavirus,” wrote IHME Director Dr. Christopher Murray.
Without such measures, the institute’s model shows Brazil’s daily death toll could keep climbing until mid-July, driving shortages of critical hospital resources in Brazil, he said in a statement accompanying the findings.
On Monday, Brazil’s coronavirus deaths reported in the last 24 hours were higher than fatalities in the United States for the first time, according to the health ministry. Brazil registered 807 deaths, compared with 620 in the United States.
The U.S. government on Monday moved enforcement of its restrictions on travel from Brazil forward to midnight on Tuesday, as the South American country reported the highest death toll in the world for that day.
Washington’s ban applies to foreigners traveling to the United States if they had been in Brazil in the last two weeks. Two days earlier, Brazil overtook Russia as the world’s No. 2 coronavirus hot spot in number of confirmed cases, after the United States.
Murray said the IHME forecast captures the effects of social distancing mandates, mobility trends and testing capacity, so projections could shift along with policy changes.
The model will be updated regularly as new data is released on cases, hospitalizations, deaths, testing and mobility.
Reporting by Anthony Boadle; Editing by Brad Haynes and Steve Orlofsky
Summary: At the beginning of a new wave of an epidemic, extreme care should be used when extrapolating data to determine whether lockdowns are necessary, experts say.
As the infectious virus causing the COVID-19 disease began its devastating spread around the globe, an international team of scientists was alarmed by the lack of uniform approaches by various countries’ epidemiologists to respond to it.
Germany, for example, didn’t institute a full lockdown, unlike France and the U.K., and the decision in the U.S. by New York to go into a lockdown came only after the pandemic had reached an advanced stage. Data modeling to predict the numbers of likely infections varied widely by region, from very large to very small numbers, and revealed a high degree of uncertainty.
Davide Faranda, a scientist at the French National Centre for Scientific Research (CNRS), and colleagues in the U.K., Mexico, Denmark, and Japan decided to explore the origins of these uncertainties. This work is deeply personal to Faranda, whose grandfather died of COVID-19; Faranda has dedicated the work to him.
In the journal Chaos, from AIP Publishing, the group describes why modeling and extrapolating the evolution of COVID-19 outbreaks in near real time is an enormous scientific challenge that requires a deep understanding of the nonlinearities underlying the dynamics of epidemics.
“Our physical model is based on assuming that the total population can be divided into four groups: those who are susceptible to catching the virus, those who have contracted the virus but don’t show any symptoms, those who are infected and, finally, those who recovered or died from the virus,” Faranda said.
To determine how people move from one group to another, it’s necessary to know the infection rate, incubation time and recovery time. Actual infection data can be used to extrapolate the behavior of the epidemic with statistical models.
“Because of the uncertainties in both the parameters involved in the models — infection rate, incubation period and recovery time — and the incompleteness of infections data within different countries, extrapolations could lead to an incredibly large range of uncertain results,” Faranda said. “For example, just assuming an underestimation of the last data in the infection counts of 20% can lead to a change in total infections estimations from few thousands to few millions of individuals.”
The group has also shown that this uncertainty is due to a lack of data quality and also to the intrinsic nature of the dynamics, because it is ultrasensitive to the parameters — especially during the initial growing phase. This means that everyone should be very careful extrapolating key quantities to decide whether to implement lockdown measures when a new wave of the virus begins.
“The total final infection counts as well as the duration of the epidemic are sensitive to the data you put in,” he said.
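The extrapolation hazard Faranda describes can be seen with a toy calculation. The sketch below is not the paper's model, and the case counts are invented for illustration: it fits an exponential growth rate to two early data points and projects sixty days ahead, with and without a 20% correction to the most recent count.

```python
# Hypothetical sketch of sensitivity to data uncertainty during the
# early exponential phase. All numbers here are illustrative, not from
# the Chaos paper.
import math

# Synthetic early-phase cumulative case counts, ten days apart.
day0, day10 = 100, 400

def project(last_count, horizon=60):
    """Fit a per-day exponential rate from two points, extrapolate forward."""
    rate = math.log(last_count / day0) / 10.0
    return last_count * math.exp(rate * horizon)

exact = project(day10)                 # trust the last data point as-is
undercounted = project(day10 * 1.2)    # assume it was 20% too low

print(f"projection from reported data : {exact:,.0f}")
print(f"projection with 20% correction: {undercounted:,.0f}")
```

A single 20% perturbation of the latest count more than triples the sixty-day projection, which is the ultrasensitivity the passage above is warning about.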
The team’s model handles uncertainty in a natural way, so they plan to show how modeling of the post-confinement phase can be sensitive to the measures taken.
“Preliminary results show that implementing lockdown measures when infections are in a full exponential growth phase poses serious limitations for their success,” said Faranda.
Davide Faranda, Isaac Pérez Castillo, Oliver Hulme, Aglaé Jezequel, Jeroen S. W. Lamb, Yuzuru Sato, Erica L. Thompson. Asymptotic estimates of SARS-CoV-2 infection counts and their sensitivity to stochastic perturbation. Chaos: An Interdisciplinary Journal of Nonlinear Science, 2020; 30 (5): 051107 DOI: 10.1063/5.0008834
Covid-19 isn’t going away soon. Two recent studies mapped out the possible shapes of its trajectory.
By Siobhan Roberts – May 8, 2020
By now we know — contrary to false predictions — that the novel coronavirus will be with us for a rather long time.
“Exactly how long remains to be seen,” said Marc Lipsitch, an infectious disease epidemiologist at Harvard’s T.H. Chan School of Public Health. “It’s going to be a matter of managing it over months to a couple of years. It’s not a matter of getting past the peak, as some people seem to believe.”
A single round of social distancing — closing schools and workplaces, limiting the sizes of gatherings, lockdowns of varying intensities and durations — will not be sufficient in the long term.
In the interest of managing our expectations and governing ourselves accordingly, it might be helpful, for our pandemic state of mind, to envision this predicament — existentially, at least — as a soliton wave: a wave that just keeps rolling and rolling, carrying on under its own power for a great distance.
The Scottish engineer and naval architect John Scott Russell first spotted a soliton in 1834 as it traveled along the Union Canal. He followed on horseback and, as he wrote in his “Report on Waves,” overtook it rolling along at about eight miles an hour, at thirty feet long and a foot or so in height. “Its height gradually diminished, and after a chase of one or two miles I lost it in the windings of the channel.”
The pandemic wave, similarly, will be with us for the foreseeable future before it diminishes. But, depending on one’s geographic location and the policies in place, it will exhibit variegated dimensions and dynamics traveling through time and space.
“There is an analogy between weather forecasting and disease modeling,” Dr. Lipsitch said. Both, he noted, are simple mathematical descriptions of how a system works: drawing upon physics and chemistry in the case of meteorology; and on behavior, virology and epidemiology in the case of infectious-disease modeling. Of course, he said, “we can’t change the weather.” But we can change the course of the pandemic — with our behavior, by balancing and coordinating psychological, sociological, economic and political factors.
Dr. Lipsitch is a co-author of two recent analyses — one from the Center for Infectious Disease Research and Policy at the University of Minnesota, the other from the Chan School published in Science — that describe a variety of shapes the pandemic wave might take in the coming months.
The Minnesota study describes three possibilities:
Scenario No. 1 depicts an initial wave of cases — the current one — followed by a consistently bumpy ride of “peaks and valleys” that will gradually diminish over a year or two.
Scenario No. 2 supposes that the current wave will be followed by a larger “fall peak,” or perhaps a winter peak, with subsequent smaller waves thereafter, similar to what transpired during the 1918-1919 flu pandemic.
Scenario No. 3 shows an intense spring peak followed by a “slow burn” with less-pronounced ups and downs.
The authors conclude that whichever reality materializes (assuming ongoing mitigation measures, as we await a vaccine), “we must be prepared for at least another 18 to 24 months of significant Covid-19 activity, with hot spots popping up periodically in diverse geographic areas.”
In the Science paper, the Harvard team — infectious-disease epidemiologist Yonatan Grad, his postdoctoral fellow Stephen Kissler, Dr. Lipsitch, his doctoral student Christine Tedijanto and their colleague Edward Goldstein — took a closer look at various scenarios by simulating the transmission dynamics using the latest Covid-19 data and data from related viruses.
The authors conveyed the results in a series of graphs — composed by Dr. Kissler and Ms. Tedijanto — that project a similarly wavy future characterized by peaks and valleys.
One figure from the paper, reinterpreted below, depicts possible scenarios (the details would differ geographically) and shows the red trajectory of Covid-19 infections in response to “intermittent social distancing” regimes represented by the blue bands.
Social distancing is turned “on” when the number of Covid-19 cases reaches a certain prevalence in the population — for instance, 35 cases per 10,000, although the thresholds would be set locally, monitored with widespread testing. It is turned “off” when cases drop to a lower threshold, perhaps 5 cases per 10,000. Because critical cases that require hospitalization lag behind the general prevalence, this strategy aims to prevent the health care system from being overwhelmed.
The green graph represents the corresponding, if very gradual, increase in population immunity.
“The ‘herd immunity threshold’ in the model is 55 percent of the population, or the level of immunity that would be needed for the disease to stop spreading in the population without other measures,” Dr. Kissler said.
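The on/off policy described above can be sketched in a few lines. This is not the Harvard team's simulation; it is a toy SIR model with invented transmission and recovery parameters, using the thresholds quoted in the text (distancing on at 35 cases per 10,000, off at 5 per 10,000).

```python
# Toy SIR model with intermittent social distancing. Parameter values
# are illustrative assumptions, not the values from the Science paper.
N = 10_000                              # population
on_threshold, off_threshold = 35, 5     # prevalence per 10,000
beta_open, beta_distanced = 0.30, 0.08  # transmission without/with distancing
gamma = 0.1                             # recovery rate (10-day infectious period)

s, i, r = N - 10.0, 10.0, 0.0
distancing = False
switches = 0
for day in range(720):                  # simulate two years
    if not distancing and i >= on_threshold:
        distancing, switches = True, switches + 1
    elif distancing and i <= off_threshold:
        distancing = False
    beta = beta_distanced if distancing else beta_open
    new_inf = beta * s * i / N
    new_rec = gamma * i
    s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec

print(f"distancing switched on {switches} times over two years")
print(f"population immunity reached {r / N:.0%}")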
Another iteration shows the effects of seasonality — a slower spread of the virus during warmer months. Theoretically, seasonal effects allow for larger intervals between periods of social distancing.
This year, however, the seasonal effects will likely be minimal, since a large proportion of the population will still be susceptible to the virus come summer. And there are other unknowns, since the underlying mechanisms of seasonality — such as temperature, humidity and school schedules — have been studied for some respiratory infections, like influenza, but not for coronaviruses. So, alas, we cannot depend on seasonality alone to stave off another outbreak over the coming summer months.
Yet another scenario takes into account not only seasonality but also a doubling of the critical-care capacity in hospitals. This, in turn, allows for social distancing to kick in at a higher threshold — say, at a prevalence of 70 cases per 10,000 — and for even longer breaks between social distancing periods:
What is clear overall is that a one-time social distancing effort will not be sufficient to control the epidemic in the long term, and that it will take a long time to reach herd immunity.
“This is because when we are successful in doing social distancing — so that we don’t overwhelm the health care system — fewer people get the infection, which is exactly the goal,” said Ms. Tedijanto. “But if infection leads to immunity, successful social distancing also means that more people remain susceptible to the disease. As a result, once we lift the social distancing measures, the virus will quite possibly spread again as easily as it did before the lockdowns.”
So, lacking a vaccine, our pandemic state of mind may persist well into 2021 or 2022 — which surprised even the experts.
“We anticipated a prolonged period of social distancing would be necessary, but didn’t initially realize that it could be this long,” Dr. Kissler said.
Claudio Maierovitch Pessanha Henriques – May 6, 2020
Since the beginning of the epidemic of disease caused by the novel coronavirus (Covid-19), the big question has been "when will it end?" The media and social networks frequently circulate a wide range of projections of the famous curve of the disease in various countries and across the world, some of them recent, suggesting that new cases will stop appearing early in the second half of this year.
Such models assume that there is a story, a natural curve of the disease, which begins, rises, reaches a peak and starts to fall. Let us examine the reasoning behind this. Many acute communicable diseases, when they reach a new population, spread rapidly, at a speed that depends on their so-called basic reproduction number, or R0 ("R zero", which estimates how many people each carrier of an infectious agent transmits it to).
When a large number of people have fallen ill, or been infected without showing symptoms, contacts between carriers and people who have not had the disease begin to become rare. In a scenario where survivors of the infection become immune to that agent, their share of the population grows and transmission becomes ever rarer. The curve, which had been rising, flattens and begins to fall, and may even reach zero, at which point the agent stops circulating.
In large populations, it is very rare for a disease to be completely eliminated this way, which is why its incidence rises again from time to time. When the number of people who were never infected, plus newborn babies and non-immune people arriving from elsewhere, becomes large enough, the curve climbs once more.
This, in simplified form, is how science understands the periodic occurrence of epidemics of acute infectious diseases. History offers numerous examples, such as smallpox, measles, influenza, rubella, polio and mumps, among many others. Depending on the characteristics of the disease and of the society, these are cycles marked by suffering, sequelae and death. In such cases it really is possible to estimate the duration of epidemics and, in some cases, even to predict the next ones.
Public health has a range of tools to intervene in many of these cases, suited to different transmission mechanisms, such as sanitation, hygiene measures, isolation, vector control, condom use, elimination of sources of contamination, and vaccines and treatments capable of eliminating the microorganisms. Vaccination, considered the most effective specific health measure, simulates what happens naturally, increasing the number of immune people in the population until the disease stops circulating, without anyone having to fall ill.
In the case of Covid-19, estimates suggest that for the disease to stop circulating intensely, about 70% of the population will need to be infected. This is called collective immunity (the unpleasant term "herd immunity" is also used). As for the current spread of the Sars-CoV-2 coronavirus, the World Health Organization (WHO) estimates that by mid-April only 2% to 3% of the world's population had been infected. Estimates for Brazil are slightly below that average.
In plain terms, for the disease to reach its peak naturally in Brazil and begin to decline, we would have to wait for 140 million people to become infected. The most conservative (lowest) fatality rate found in the literature on Covid-19 is 0.36%, roughly one-twentieth of the rate implied by the official counts of cases and deaths. This means that by the time Brazil reached the peak, we would count 500,000 deaths if the health system were not pushed past its limits, and far more if it were.
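The arithmetic in the passage above can be made explicit. The 70% threshold and the 0.36% infection-fatality rate are the estimates cited in the text; the population figure of roughly 200 million is the one implied by the article's 140 million estimate.

```python
# Spelling out the article's back-of-the-envelope calculation.
# Population is the approximate figure implied by the text, not an
# official census number.
population = 200_000_000
herd_immunity_share = 0.70        # share infected before natural decline
infection_fatality_rate = 0.0036  # most conservative IFR cited in the text

infections_at_peak = population * herd_immunity_share
expected_deaths = infections_at_peak * infection_fatality_rate

print(f"infections needed: {infections_at_peak / 1e6:.0f} million")
print(f"expected deaths  : {expected_deaths:,.0f}")
```

That yields 140 million infections and roughly 504,000 deaths, which is the "500,000" figure the article rounds to, and only if hospitals are never overwhelmed.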
Reaching the peak is synonymous with catastrophe. It is not an acceptable bet, above all when we see that hospital capacity is already exhausted in several cities, such as Manaus, Rio de Janeiro and Fortaleza, with others on the same path.
The only acceptable prospect is to avoid the peak, and the only way to do that is with rigorous physical distancing measures. The quota of contact between people should be reserved for essential activities, among them health care, security, and the supply chains for fuel, food, cleaning products, and medical materials and equipment, along with cleaning, maintenance and a few other sectors. Some creativity may allow this range to be widened a little, provided that public transport and public spaces remain empty enough for the minimum distance between people to be maintained.
Monitoring of case and death counts, which reveals transmission with a lag of two to three weeks, should be refined and used together with studies based on laboratory testing to calibrate the strictness of isolation measures.
If we manage to avoid the greater tragedy, we will live with a long period of restricted activity, more than a year, and we will have to learn to organize life and the economy in other ways, as well as going through periods of "lockdown" of about two weeks each whenever the curve points toward the peak again.
Today the situation is serious and is tending to become critical. Brazil is the country with the highest transmission rate of the disease. It is time to stay home and, if going out is unavoidable, to make a mask an inseparable part of our clothing and rigorously maintain all the recommended precautions.
The world has never faced a hunger emergency like this, experts say. It could double the number of people facing acute hunger to 265 million by the end of this year.
Published April 22, 2020; Updated April 23, 2020, 6:39 a.m. ET
NAIROBI, Kenya — In the largest slum in Kenya’s capital, people desperate to eat set off a stampede during a recent giveaway of flour and cooking oil, leaving scores injured and two people dead.
In India, thousands of workers are lining up twice a day for bread and fried vegetables to keep hunger at bay.
And across Colombia, poor households are hanging red clothing and flags from their windows and balconies as a sign that they are hungry.
“We don’t have any money, and now we need to survive,” said Pauline Karushi, who lost her job at a jewelry business in Nairobi, and lives in two rooms with her child and four other relatives. “That means not eating much.”
The coronavirus pandemic has brought hunger to millions of people around the world. National lockdowns and social distancing measures are drying up work and incomes, and are likely to disrupt agricultural production and supply routes — leaving millions to worry how they will get enough to eat.
The coronavirus has sometimes been called an equalizer because it has sickened both rich and poor, but when it comes to food, the commonality ends. It is poor people, including large segments of poorer nations, who are now going hungry and facing the prospect of starving.
“The coronavirus has been anything but a great equalizer,” said Asha Jaffar, a volunteer who brought food to families in the Nairobi slum of Kibera after the fatal stampede. “It’s been the great revealer, pulling the curtain back on the class divide and exposing how deeply unequal this country is.”
Already, 135 million people had been facing acute food shortages, but now with the pandemic, 130 million more could go hungry in 2020, said Arif Husain, chief economist at the World Food Program, a United Nations agency. Altogether, an estimated 265 million people could be pushed to the brink of starvation by year’s end.
“We’ve never seen anything like this before,” Mr. Husain said. “It wasn’t a pretty picture to begin with, but this makes it truly unprecedented and uncharted territory.”
The world has experienced severe hunger crises before, but those were regional and caused by one factor or another — extreme weather, economic downturns, wars or political instability.
This hunger crisis, experts say, is global and caused by a multitude of factors linked to the coronavirus pandemic and the ensuing interruption of the economic order: the sudden loss in income for countless millions who were already living hand-to-mouth; the collapse in oil prices; widespread shortages of hard currency from tourism drying up; overseas workers not having earnings to send home; and ongoing problems like climate change, violence, population dislocations and humanitarian disasters.
Already, from Honduras to South Africa to India, protests and looting have broken out amid frustrations from lockdowns and worries about hunger. With classes shut down, over 368 million children have lost the nutritious meals and snacks they normally receive in school.
There is no shortage of food globally, or mass starvation from the pandemic — yet. But logistical problems in planting, harvesting and transporting food will leave poor countries exposed in the coming months, especially those reliant on imports, said Johan Swinnen, director general of the International Food Policy Research Institute in Washington.
While the system of food distribution and retailing in rich nations is organized and automated, he said, systems in developing countries are “labor intensive,” making “these supply chains much more vulnerable to Covid-19 and social distancing regulations.”
Yet even if there is no major surge in food prices, the food security situation for poor people is likely to deteriorate significantly worldwide. This is especially true for economies like Sudan and Zimbabwe that were struggling before the outbreak, or those like Iran that have increasingly used oil revenues to finance critical goods like food and medicine.
In the sprawling Petare slum on the outskirts of the capital, Caracas, a nationwide lockdown has left Freddy Bastardo and five others in his household without jobs. Their government-supplied rations, which had arrived only once every two months before the crisis, have long run out.
“We are already thinking of selling things that we don’t use in the house to be able to eat,” said Mr. Bastardo, 25, a security guard. “I have neighbors who don’t have food, and I’m worried that if protests start, we wouldn’t be able to get out of here.”
In India, as wages have dried up, half a million people are estimated to have left cities to walk home, setting off the nation’s “largest mass migration since independence,” said Amitabh Behar, the chief executive of Oxfam India.
On a recent evening, hundreds of migrant workers, who have been stuck in New Delhi after a lockdown was imposed in March with little warning, sat under the shade of a bridge waiting for food to arrive. The Delhi government has set up soup kitchens, yet workers like Nihal Singh go hungry as the throngs at these centers have increased in recent days.
“Instead of coronavirus, the hunger will kill us,” said Mr. Singh, who was hoping to eat his first meal in a day. Migrants waiting in food lines have fought each other over a plate of rice and lentils. Mr. Singh said he was ashamed to beg for food but had no other option.
“The lockdown has trampled on our dignity,” he said.
Refugees and people living in conflict zones are likely to be hit the hardest.
The curfews and restrictions on movement are already devastating the meager incomes of displaced people in Uganda and Ethiopia, the delivery of seeds and farming tools in South Sudan and the distribution of food aid in the Central African Republic. Containment measures in Niger, which hosts almost 60,000 refugees fleeing conflict in Mali, have led to surges in the pricing of food, according to the International Rescue Committee.
The effects of the restrictions “may cause more suffering than the disease itself,” said Kurt Tjossem, regional vice president for East Africa at the International Rescue Committee.
Ahmad Bayoush, a construction worker who had been displaced to Idlib Province in northern Syria, said he and many others had signed up to receive food from aid groups, but that it had yet to arrive.
“I am expecting real hunger if it continues like this in the north,” he said.
The pandemic is also slowing efforts to deal with the historic locust plague that has been ravaging the East and Horn of Africa. The outbreak is the worst the region has seen in decades and comes on the heels of a year marked by extreme droughts and floods. But the arrival of billions of new swarms could further deepen food insecurity, said Cyril Ferrand, head of the Food and Agriculture Organization’s resilience team in eastern Africa.
Travel bans and airport closures, Mr. Ferrand said, are interrupting the supply of pesticides that could help limit the locust population and save pastureland and crops.
As many go hungry, there is concern in a number of countries that food shortages will lead to social discord. In Colombia, residents of the coastal state of La Guajira have begun blocking roads to call attention to their need for food. In South Africa, rioters have broken into neighborhood food kiosks and faced off with the police.
And even charitable food giveaways can expose people to the virus when throngs appear, as happened in Nairobi’s shantytown of Kibera earlier this month.
“People called each other and came rushing,” said Valentine Akinyi, who works at the district government office where the food was distributed. “People have lost jobs. It showed you how hungry they are.”
Yet communities across the world are also taking matters into their own hands. Some are raising money through crowdfunding platforms, while others have begun programs to buy meals for needy families.
On a recent afternoon, Ms. Jaffar and a group of volunteers made their way through Kibera, bringing items like sugar, flour, rice and sanitary pads to dozens of families. A native of the area herself, Ms. Jaffar said she started the food drive after hearing so many stories from families who said they and their children were going to sleep hungry.
The food drive has so far reached 500 families. But with all the calls for assistance she’s getting, she said, “that’s a drop in the ocean.”
Reporting was contributed by Anatoly Kurmanaev and Isayen Herrera from Caracas, Venezuela; Paulina Villegas from Mexico City; Julie Turkewitz from Bogotá, Colombia; Ben Hubbard and Hwaida Saad from Beirut, Lebanon; Sameer Yasir from New Delhi; and Hannah Beech from Bangkok.
Nassim Nicholas Taleb is “irritated,” he told Bloomberg Television on March 31st, whenever the coronavirus pandemic is referred to as a “black swan,” the term he coined for an unpredictable, rare, catastrophic event, in his best-selling 2007 book of that title. “The Black Swan” was meant to explain why, in a networked world, we need to change business practices and social norms—not, as he recently told me, to provide “a cliché for any bad thing that surprises us.” Besides, the pandemic was wholly predictable—he, like Bill Gates, Laurie Garrett, and others, had predicted it—a white swan if ever there was one. “We issued our warning that, effectively, you should kill it in the egg,” Taleb told Bloomberg. Governments “did not want to spend pennies in January; now they are going to spend trillions.”
The warning that he referred to appeared in a January 26th paper that he co-authored with Joseph Norman and Yaneer Bar-Yam, when the virus was still mainly confined to China. The paper cautions that, owing to “increased connectivity,” the spread will be “nonlinear”—two key contributors to Taleb’s anxiety. For statisticians, “nonlinearity” describes events very much like a pandemic: an output disproportionate to known inputs (the structure and growth of pathogens, say), owing to both unknown and unknowable inputs (their incubation periods in humans, or random mutations), or eccentric interaction among various inputs (wet markets and airplane travel), or exponential growth (from networked human contact), or all three.
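The disproportion at the heart of nonlinearity is easy to illustrate numerically. The sketch below is invented for illustration, not drawn from the Taleb, Norman and Bar-Yam paper: a simple branching process in which a modest 10% change in the input (how many people each case infects) compounds, over thirty generations of networked spread, into an output many times larger.

```python
# Illustrative branching process: a 10% change in transmissibility
# produces a wildly disproportionate change in outcome. Numbers are
# invented, not taken from the paper discussed above.
def cases_after(r, generations=30, seed=1):
    """Cases in the final generation of a simple branching process."""
    total = seed
    for _ in range(generations):
        total *= r
    return total

base = cases_after(2.0)    # each case infects two others
bumped = cases_after(2.2)  # a mere 10% increase in transmission

print(f"r=2.0 -> {base:,.0f} cases")
print(f"r=2.2 -> {bumped:,.0f} cases ({bumped / base:.0f}x more)")
```

A 10% change in the input yields roughly a seventeen-fold change in the output, which is what makes early-stage inputs like incubation periods and contact networks so consequential.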
“These are ruin problems,” the paper states, exposure to which “leads to a certain eventual extinction.” The authors call for “drastically pruning contact networks,” and other measures that we now associate with sheltering in place and social distancing. “Decision-makers must act swiftly,” the authors conclude, “and avoid the fallacy that to have an appropriate respect for uncertainty in the face of possible irreversible catastrophe amounts to ‘paranoia.’ ” (“Had we used masks then”—in late January—“we could have saved ourselves the stimulus,” Taleb told me.)
Yet, for anyone who knows his work, Taleb’s irritation may seem a little forced. His profession, he says, is “probability.” But his vocation is showing how the unpredictable is increasingly probable. If he was right about the spread of this pandemic it’s because he has been so alert to the dangers of connectivity and nonlinearity more generally, to pandemics and other chance calamities for which COVID-19 is a storm signal. “I keep getting asked for a list of the next four black swans,” Taleb told me, and that misses his point entirely. In a way, focussing on his January warning distracts us from his main aim, which is building political structures so that societies will be better able to cope with mounting, random events.
Indeed, if Taleb is chronically irritated, it is by those economists, officials, journalists, and executives—the “naïve empiricists”—who think that our tomorrows are likely to be pretty much like our yesterdays. He explained in a conversation that these are the people who, consulting bell curves, focus on their bulging centers, and disregard potentially fatal “fat tails”—events that seem “statistically remote” but “contribute most to outcomes,” by precipitating chain reactions, say. (Last week, Dr. Phil told Fox’s Laura Ingraham that we should open up the country again, noting, wrongly, that “three hundred and sixty thousand people die each year from swimming pools — but we don’t shut the country down for that.” In response, Taleb tweeted, “Drowning in swimming pools is extremely contagious and multiplicative.”) Naïve empiricists plant us, he argued in “The Black Swan,” in “Mediocristan.” We actually live in “Extremistan.”
Taleb, who is sixty-one, came by this impatience honestly. As a young man, he lived through Lebanon’s civil war, which was precipitated by Palestinian militias escaping a Jordanian crackdown, in 1971, and led to bloody clashes between Maronite Christians and Sunni Muslims, drawing in Shiites, Druze, and the Syrians as well. The conflict lasted fifteen years and left some ninety thousand people dead. “These events were unexplainable, but intelligent people thought they were capable of providing convincing explanations for them—after the fact,” Taleb writes in “The Black Swan.” “The more intelligent the person, the better sounding the explanation.” But how could anyone have anticipated “that people who seemed a model of tolerance could become the purest of barbarians overnight?” Given the prior cruelties of the twentieth century, the question may sound ingenuous, but Taleb experienced sudden violence firsthand. He grew fascinated, and outraged, by extrapolations from an illusory normal—the evil of banality. “I later saw the exact same illusion of understanding in business success and the financial markets,” he writes.
“Later” began in 1983, when, after university in Paris, and a Wharton M.B.A., Taleb became an options trader—“my core identity,” he says. Over the next twelve years, he conducted two hundred thousand trades, and examined seventy thousand risk-management reports. Along the way, he developed an investment strategy that entailed exposure to regular, small losses, while positioning him to benefit from irregular, massive gains—something like a venture capitalist. He explored, especially, scenarios for derivatives: asset bundles where fat tails—price volatilities, say—can either enrich or impoverish traders, and do so exponentially when they increase the scale of the movement.
These were the years, moreover, when, following Japan, large U.S. manufacturing companies were converting to “just-in-time” production, which involved integrating and synchronizing supply-chains, and forgoing stockpiles of necessary components in favor of acquiring them on an as-needed basis, often relying on single, authorized suppliers. The idea was that lowering inventory would reduce costs. But Taleb, extrapolating from trading risks, believed that “managing without buffers was irresponsible,” because “fat-tail events” can never be completely avoided. As the Harvard Business Review reported this month, Chinese suppliers shut down by the pandemic have stymied the production capabilities of a majority of the companies that depend on them.
The coming of global information networks deepened Taleb’s concern. He reserved a special impatience for economists who saw these networks as stabilizing—who thought that the average thought or action, derived from an ever-widening group, would produce an increasingly tolerable standard—and who believed that crowds had wisdom, and bigger crowds more wisdom. Thus networked, institutional buyers and sellers were supposed to produce more rational markets, a supposition that seemed to justify the deregulation of derivatives, in 2000, which helped accelerate the crash of 2008.
As Taleb told me, “The great danger has always been too much connectivity.” Proliferating global networks, both physical and virtual, inevitably incorporate more fat-tail risks into a more interdependent and “fragile” system: not only risks such as pathogens but also computer viruses, or the hacking of information networks, or reckless budgetary management by financial institutions or state governments, or spectacular acts of terror. Any negative event along these lines can create a rolling, widening collapse—a true black swan—in the same way that the failure of a single transformer can collapse an electricity grid.
COVID-19 has initiated ordinary citizens into the esoteric “mayhem” that Taleb’s writings portend. Who knows what will change for countries when the pandemic ends? What we do know, Taleb says, is what cannot remain the same. He is “too much a cosmopolitan” to want global networks undone, even if they could be. But he does want the institutional equivalent of “circuit breakers, fail-safe protocols, and backup systems,” many of which he summarizes in his fourth, and favorite, book, “Antifragile,” published in 2012. For countries, he envisions political and economic principles that amount to an analogue of his investment strategy: government officials and corporate executives accepting what may seem like too-small gains from their investment dollars, while protecting themselves from catastrophic loss.
Anyone who has read the Federalist Papers can see what he’s getting at. The “separation of powers” is hardly the most efficient form of government; getting something done entails a complex, time-consuming process of building consensus among distributed centers of authority. But James Madison understood that tyranny—however distant it was from the minds of likely Presidents in his own generation—is so calamitous to a republic, and so incipient in the human condition, that it must be structurally mitigated. For Taleb, an antifragile country would encourage the distribution of power among smaller, more local, experimental, and self-sufficient entities—in short, build a system that could survive random stresses, rather than break under any particular one. (His word for this beneficial distribution is “fractal.”)
We should discourage the concentration of power in big corporations, “including a severe restriction of lobbying,” Taleb told me. “When one per cent of the people have fifty per cent of the income, that is a fat tail.” Companies shouldn’t be able to make money from monopoly power, “from rent-seeking”—using that power not to build something but to extract an ever-larger part of the surplus. There should be an expansion of the powers of state and even county governments, where there is “bottom-up” control and accountability. This could incubate new businesses and foster new education methods that emphasize “action learning and apprenticeship” over purely academic certification. He thinks that “we should have a national Entrepreneurship Day.”
But Taleb doesn’t believe that the government should abandon citizens buffeted by events they can’t possibly anticipate or control. (He dedicated his book “Skin in the Game,” published in 2018, to Ron Paul and Ralph Nader.) “The state,” he told me, “should not smooth out your life, like a Lebanese mother, but should be there for intervention in negative times, like a rich Lebanese uncle.” Right now, for example, the government should, indeed, be sending out checks to unemployed and gig workers. (“You don’t bail out companies, you bail out individuals.”) He would also consider a guaranteed basic income, much as Andrew Yang, whom he admires, has advocated. Crucially, the government should be an insurer of health care, though Taleb prefers not a centrally run Medicare-for-all system but one such as Canada’s, which is controlled by the provinces. And, like responsible supply-chain managers, the federal government should create buffers against public-health disasters: “If it can spend trillions stockpiling nuclear weapons, it ought to spend tens of billions stockpiling ventilators and testing kits.”
At the same time, Taleb adamantly opposes the state taking on staggering debt. He thinks, rather, that the rich should be taxed as disproportionately as necessary, “though as locally as possible.” The key is “to build on the good days,” when the economy is growing, and reduce the debt, which he calls “intergenerational dispossession.” The government should then encourage an eclectic array of management norms: drawing up political borders, even down to the level of towns, which can, in an epidemiological emergency, be closed; having banks and corporations hold larger cash reserves, so that they can be more independent of market volatility; and making sure that manufacturing, transportation, information, and health-care systems have redundant storage and processing components. (“That’s why nature gave us two kidneys.”) Taleb is especially keen to inhibit “moral hazard,” such as that of bankers who get rich by betting, and losing, other people’s money. “In the Hammurabi Code, if a house falls in and kills you, the architect is put to death,” he told me. Correspondingly, any company or bank that gets a bailout should expect its executives to be fired, and its shareholders diluted. “If the state helps you, then taxpayers own you.”
Some of Taleb’s principles seem little more than thought experiments, or fit uneasily with others. How does one tax more locally, or close a town border? If taxpayers own corporate equities, does this mean that companies might be nationalized, broken up, or severely regulated? But asking Taleb to describe antifragility to its end is a little like asking Thomas Hobbes to nail down sovereignty. The more important challenge is to grasp the peril for which political solutions must be designed or improvised; society cannot endure with complacent conceptions of how things work. “It would seem most efficient to drive home at two hundred miles an hour,” he put it to me. “But odds are you’d never get there.”
WUHAN, China (Reuters) – Dressed in a hazmat suit, two masks and a face shield, Du Mingjun knocked on the mahogany door of a flat in a suburban district of Wuhan on a recent morning.
A man wearing a single mask opened the door a crack and, after Du introduced herself as a psychological counsellor, burst into tears.
“I really can’t take it anymore,” he said. Diagnosed with the novel coronavirus in early February, the man, who appeared to be in his 50s, had been treated at two hospitals before being transferred to a quarantine centre set up in a cluster of apartment blocks in an industrial part of Wuhan.
Why, he asked, did tests say he still had the virus more than two months after he first contracted it?
The answer to that question is a mystery baffling doctors on the frontline of China’s battle against COVID-19, even as it has successfully slowed the spread of the coronavirus across the country.
Chinese doctors in Wuhan, where the virus first emerged in December, say a growing number of cases in which people recover from the virus, but continue to test positive without showing symptoms, is one of their biggest challenges as the country moves into a new phase of its containment battle.
Those patients all tested negative for the virus at some point after recovering, but then tested positive again, some up to 70 days later, the doctors said. Many have tested positive again after 50 to 60 days.
The prospect of people remaining positive for the virus, and therefore potentially infectious, is of international concern, as many countries seek to end lockdowns and resume economic activity as the spread of the virus slows. Currently, the globally recommended isolation period after exposure is 14 days.
So far, there have been no confirmations of newly positive patients infecting others, according to Chinese health officials.
China has not published precise figures for how many patients fall into this category. But disclosures by Chinese hospitals to Reuters, as well as in other media reports, indicate there are at least dozens of such cases.
In South Korea, about 1,000 people have been testing positive for four weeks or more. In Italy, the first European country ravaged by the pandemic, health officials noticed that coronavirus patients could test positive for the virus for about a month.
As there is limited knowledge available on how infectious these patients are, doctors in Wuhan are keeping them isolated for longer.
Zhang Dingyu, president of Jinyintan Hospital, where the most serious coronavirus cases were treated, said health officials recognised the isolations may be excessive, especially if patients proved not to be infectious. But, for now, it was better to do so to protect the public, he said.
He described the issue as one of the most pressing facing the hospital and said counsellors like Du are being brought in to help ease the emotional strain.
“When patients have this pressure, it also weighs on society,” he said.
DOZENS OF CASES
The plight of Wuhan’s long-term patients underlines how much remains unknown about COVID-19 and why it appears to affect different people in numerous ways, Chinese doctors say. So far global infections have hit 2.5 million with over 171,000 deaths.
As of April 21, 93% of 82,788 people with the virus in China had recovered and been discharged, official figures show.
Yuan Yufeng, a vice president at Zhongnan Hospital in Wuhan, told Reuters he was aware of a case in which the patient had positive retests after first being diagnosed with the virus about 70 days earlier.
“We did not see anything like this during SARS,” he said, referring to the 2003 Severe Acute Respiratory Syndrome outbreak that infected 8,098 people globally, mostly in China.
Patients in China are discharged after two negative nucleic acid tests, taken at least 24 hours apart, and if they no longer show symptoms. Some doctors want this requirement to be raised to three tests or more.
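The discharge criterion described above is simple enough to state as code. This is only a sketch of the rule as the article reports it (two negative nucleic acid tests at least 24 hours apart, plus no remaining symptoms); the function and its parameter names are illustrative, not an official protocol:

```python
from datetime import datetime, timedelta

def eligible_for_discharge(tests, symptom_free,
                           required_negatives=2,
                           min_gap=timedelta(hours=24)):
    """`tests` is a chronological list of (datetime, result) pairs.
    Discharge requires being symptom-free and having the last
    `required_negatives` tests all negative, with consecutive tests
    spaced by at least `min_gap`. Raising `required_negatives` to 3
    models the stricter rule some doctors want."""
    if not symptom_free or len(tests) < required_negatives:
        return False
    recent = tests[-required_negatives:]
    if any(result != "negative" for _, result in recent):
        return False
    return all(later - earlier >= min_gap
               for (earlier, _), (later, _) in zip(recent, recent[1:]))

t0 = datetime(2020, 4, 1, 9, 0)
ok = [(t0, "negative"), (t0 + timedelta(hours=25), "negative")]
too_soon = [(t0, "negative"), (t0 + timedelta(hours=12), "negative")]
assert eligible_for_discharge(ok, symptom_free=True)
assert not eligible_for_discharge(too_soon, symptom_free=True)
assert not eligible_for_discharge(ok, symptom_free=False)
```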
When asked how this category of patients was being handled, China’s National Health Commission directed Reuters to comments made at a briefing on Tuesday.
Wang Guiqiang, director of the infectious disease department of Peking University First Hospital, said at the briefing that the majority of such patients were not showing symptoms and very few had seen their conditions worsen.
“The new coronavirus is a new type of virus,” said Guo Yanhong, a National Health Commission official. “For this disease, the unknowns are still greater than the knowns.”
REMNANTS AND REACTIVATION
Experts and doctors struggle to explain why the virus behaves so differently in these people.
Some suggest that patients retesting as positive after previously testing negative were somehow reinfected with the virus. This would undermine hopes that people catching COVID-19 would produce antibodies that would prevent them from getting sick again from the virus.
Zhao Yan, a doctor of emergency medicine at Wuhan’s Zhongnan Hospital, said he was sceptical about the possibility of reinfection based on cases at his facility, although he did not have hard evidence.
“They’re closely monitored in the hospital and are aware of the risks, so they stay in quarantine. So I’m sure they were not reinfected.”
Jeong Eun-kyeong, director of the Korea Centers for Disease Control and Prevention, has said the virus may have been “reactivated” in 91 South Korean patients who tested positive after having been thought to be cleared of it.
Other South Korean and Chinese experts have said that remnants of the virus could have stayed in patients’ systems but not be infectious or dangerous to the host or others.
Few details have been disclosed about these patients, such as if they have underlying health conditions.
Paul Hunter, a professor at the University of East Anglia’s Norwich School of Medicine, said an unusually slow shedding of other viruses such as norovirus or influenza had been previously seen in patients with weakened immune systems.
In 2015, South Korean authorities disclosed that they had a Middle East Respiratory Syndrome patient stricken with lymphoma who showed signs of the virus for 116 days. They said his impaired immune system kept his body from ridding itself of the virus. The lymphoma eventually caused his death.
Yuan said that even if patients develop antibodies, it did not guarantee they would become virus-free.
He said that some patients had high levels of antibodies, and still tested positive to nucleic acid tests.
“It means that the two sides are still fighting,” he said.
As could be seen in Wuhan, the virus can also inflict a heavy mental toll on those caught in a seemingly endless cycle of positive tests.
Du, who set up a therapy hotline when Wuhan’s outbreak first began, allowed Reuters in early April to join her on a visit to the suburban quarantine centre on the condition that none of the patients be identified.
One man rattled off the names of three Wuhan hospitals he had stayed at before being moved to a flat in the centre. He had taken over 10 tests since the third week of February, he said, on occasions testing negative but mostly positive.
“I feel fine and have no symptoms, but they check and it’s positive, check and it’s positive,” he said. “What is with this virus?”
Patients need to stay at the centre for at least 28 days and obtain two negative results before being allowed to leave. Patients are isolated in individual rooms they said were paid for by the government.
The most concerning case facing Du during the visit was the man behind the mahogany door; he had told medical workers the night before that he wanted to kill himself.
“I wasn’t thinking clearly,” he told Du, explaining how he had already taken numerous CT scans and nucleic acid tests, some of which tested negative, at different hospitals. He worried that he had been reinfected as he cycled through various hospitals.
His grandson missed him after being gone for so long, he said, and he worried his condition meant he would never be able to see him again.
He broke into another round of sobs. “Why is this happening to me?”
Reporting by Brenda Goh; Additional reporting by Jack Kim in Seoul, Elvira Pollina in Milan, Belen Carreno in Madrid, and Shanghai newsroom; Editing by Philip McClellan
As parts of the United States settle in for what may be the worst weeks of their local covid-19 outbreaks, a familiar refrain is sure to emerge.
Some people will complain that the death count attributed to the coronavirus is being exaggerated. Others, including researchers, have argued that covid-19 related deaths are actually being undercounted, as people die at home without being tested. Still others will point to the final death count and say that because it’s lower than X (whether that number be flu deaths, car accident deaths, or some other moving goalpost), then that means the efforts and sacrifices made for social distancing weren’t worth it—ignoring, of course, that social distancing was the reason the toll wasn’t much higher. Figuring out how deadly covid-19 truly is will take far more time to untangle than anyone would want, and no one’s likely to be fully satisfied with the answers we get.
As of April 10, there have been around 1.6 million reported cases of covid-19, the disease caused by the novel coronavirus worldwide. There have also been over 96,000 reported deaths, with over 16,000 deaths documented in the U.S. But these numbers are largely acknowledged as a very rough, possibly even misleading estimate of the problem, given the wide gaps in testing capacity across different countries and even within a country.
On the political right, many have taken to fostering conspiracy theories about these deaths. You don’t have to go far on social media to see people accusing doctors and health officials of fudging the numbers higher to make President Trump look bad or to (somehow) profit off the tragedy. Other conservative voices like the disgraced sex pest Bill O’Reilly are less paranoid but similarly dismissive, arguing that many of those who died “were on their last legs anyway.”
It’s true that older people and those with underlying health conditions are at greater risk of serious complications and death from covid-19. But the same can be said for almost every other leading cause of death, whether it’s cancer, heart attack, or diabetes. And just as living is hardly a simple affair, so too is dying. Sometimes you can point to a single factor that kills a person, but often it’s a mix of ailments, with a viral infection like covid-19 being the final shove.
The key point here is that epidemiologists and others who try to estimate how many people die from any given cause per year know the above very well. The flu, for instance, doesn’t usually kill in isolation either—it too disproportionately kills the elderly and otherwise already sick. Yet many of the same people who are now trying to downplay covid-19 deaths also argued that its early death toll wasn’t coming anywhere close to the typical seasonal flu’s annual tally (an argument meant to push back against the idea of doing anything too serious to mitigate the spread of the coronavirus).
That said, we’re much better at estimating how many deaths in the U.S. are flu-related because the influenza virus is a known entity. We have a decent sense of how many people are infected with the flu every year, how many people go to the doctor or are hospitalized, and how many people it helps kill, thanks to a well-established nationwide surveillance system. But that isn’t true for covid-19.
There’s steady evidence indicating that covid-19 cases nearly everywhere in the world are being undercounted. That’s partly because testing remains so haphazard and has inherent limitations. The most common type of covid-19 test right now, for instance, can only confirm an active infection, not whether you had a previous case (newer antibody tests can address that problem but have their own flaws). It’s also because the virus infects a still-unknown percentage of people without making them feel sick at all.
Many more people have had or will catch the coronavirus than any current tracking will ever indicate. These hidden cases are almost certainly less deadly on average than the known cases that wind up in hospitals, so it’s likely that the current documented fatality rate of covid-19 (over 5 percent worldwide) is an overestimate. But that doesn’t mean more people aren’t dying from covid-19 than are being reported.
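The arithmetic behind that overestimate is straightforward. Using the article's worldwide figures as of April 10, and an assumed (not measured) ratio of four undetected infections for every confirmed case:

```python
deaths = 96_000          # reported worldwide deaths (from the article)
confirmed = 1_600_000    # reported worldwide cases (from the article)

cfr = deaths / confirmed             # the "documented" fatality rate
assert round(cfr, 3) == 0.060        # 6.0%, i.e. "over 5 percent"

# Hypothetical: 4 hidden infections per confirmed case (an assumed
# ratio chosen only for illustration).
hidden_ratio = 4
ifr = deaths / (confirmed * (1 + hidden_ratio))
assert round(ifr, 3) == 0.012        # the same deaths spread over more
                                     # infections give a far lower rate
```

The denominator, not the numerator, is what testing gaps distort, which is why the documented rate can overstate lethality even while total deaths are undercounted.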
In areas of China and Italy hit hard by the coronavirus, news reports have suggested a wide gulf between the official number of covid-19-related deaths in a town and what residents are seeing for themselves. In the U.S., there are still regions where testing is limited and people who may have died from covid-19 in their homes are never tested, including New York City. And there’s the simple harsh reality that we’re probably still in the very beginning of this pandemic.
Even if outbreaks start to peter out in the U.S. and elsewhere, there’s the risk that loosening our restrictions on distancing will fuel new ones. And even if the summer heat in the U.S. makes it harder for the virus to spread here, as some experts hope, a second wave in the fall and winter could certainly happen, much as it did for the last pandemic (a strain of flu) in 2009.
All of these variables will affect the final death toll from covid-19, as will how countries continue to respond to the crisis. Ironically, the steps we take to prevent new cases and deaths may be the very thing that makes people doubt they were necessary.
In late March, the White House and U.S. public health officials announced that they projected 100,000 to 200,000 deaths in the country by the pandemic’s end, provided everything was done to slow its spread. On Thursday, Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, said that newer modeling data has suggested the U.S. death toll may end up closer to 60,000, so long as we keep mitigating the outbreak. Almost immediately, some people chose to take it as evidence that mitigation efforts aren’t necessary and that the initial warnings about the virus were overblown—ignoring, again, that the reason for the downward revision in projected deaths is the success of social distancing.
There are still a lot of things we don’t know about the coronavirus, and many of the things we think we know are going to keep changing. But here’s something to remember.
By the end of the 2009 H1N1 flu pandemic, the World Health Organization reported that about 19,000 people were confirmed to have died from the virus. By 2013, several studies estimated that the true death toll was at least 10 times higher, and higher still when you took into account other causes of death indirectly worsened by the flu, like heart attacks. Knowing how deadly covid-19 will be could very well take that long to nail down too.
Geneva, 1 April 2020 – The World Meteorological Organization (WMO) is concerned about the impact of the COVID-19 pandemic on the quantity and quality of weather observations and forecasts, as well as atmospheric and climate monitoring.
WMO’s Global Observing System serves as a backbone for all weather and climate services and products provided by the 193 WMO Member states and territories to their citizens. It provides observations on the state of the atmosphere and ocean surface from land-, marine- and space-based instruments. This data is used for the preparation of weather analyses, forecasts, advisories and warnings.
“National Meteorological and Hydrological Services continue to perform their essential 24/7 functions despite the severe challenges posed by the Coronavirus pandemic,” said WMO Secretary-General Petteri Taalas. “We salute their dedication to protecting lives and property but we are mindful of the increasing constraints on capacity and resources,” he said.
“The impacts of climate change and growing amount of weather-related disasters continue. The COVID-19 pandemic poses an additional challenge, and may exacerbate multi-hazard risks at a single country level. Therefore it is essential that governments pay attention to their national early warning and weather observing capacities despite the COVID-19 crisis,” said Mr Taalas.
Large parts of the observing system, for instance its satellite components and many ground-based observing networks, are either partly or fully automated. They are therefore expected to continue functioning without significant degradation for several weeks, in some cases even longer. But if the pandemic lasts more than a few weeks, then missing repair, maintenance and supply work, and missing redeployments will become of increasing concern.
Some parts of the observing system are already affected. Most notably the significant decrease in air traffic has had a clear impact. In-flight measurements of ambient temperature and wind speed and direction are a very important source of information for both weather prediction and climate monitoring.
Meteorological data from aircraft
Commercial airliners contribute to the Aircraft Meteorological Data Relay programme (AMDAR), which uses onboard sensors, computers and communications systems to collect, process, format and transmit meteorological observations to ground stations via satellite or radio links.
In some parts of the world, in particular over Europe, the decrease in the number of measurements over the last couple of weeks has been dramatic (see chart below provided by EUMETNET). The countries affiliated with EUMETNET, a collaboration between the 31 national weather services in Europe, are currently discussing ways to boost the short-term capabilities of other parts of their observing networks in order to partly mitigate this loss of aircraft observations.
The AMDAR observing system has traditionally produced over 700 000 high-quality observations per day of air temperature and wind speed and direction, together with the required positional and temporal information, and with an increasing number of humidity and turbulence measurements being made.
In most developed countries, surface-based weather observations are now almost fully automated.
However, in many developing countries, the transition to automated observations is still in progress, and the meteorological community still relies on observations taken manually by weather observers and transmitted into the international networks for use in global weather and climate models.
WMO has seen a significant decrease in the availability of this type of manual observations over the last two weeks. Some of this may well be attributable to the current coronavirus situation, but it is not yet clear whether other factors may play a role as well. WMO is currently investigating this.
“At the present time, the adverse impact of the loss of observations on the quality of weather forecast products is still expected to be relatively modest. However, as the decrease in availability of aircraft weather observations continues and expands, we may expect a gradual decrease in reliability of the forecasts,” said Lars Peter Riishojgaard, Director, Earth System Branch in WMO’s Infrastructure Department.
“The same is true if the decrease in surface-based weather observations continues, in particular if the COVID-19 outbreak starts to more widely impact the ability of observers to do their job in large parts of the developing world. WMO will continue to monitor the situation, and the organization is working with its Members to mitigate the impact as much as possible,” he said.
(Map provided by WMO; countries shown in darker colors provided fewer observations over the last week than averaged for the month of January 2020 (pre-COVID-19); countries shown in black are currently not sending any data at all).
Currently, 16 meteorological and 50 research satellites, over 10 000 manned and automatic surface weather stations, 1 000 upper-air stations, 7 000 ships, 100 moored and 1 000 drifting buoys, hundreds of weather radars and 3 000 specially equipped commercial aircraft measure key parameters of the atmosphere, land and ocean surface every day.
For further information contact: Clare Nullis, media officer. Email email@example.com, Cell +41 79 709 13 97
World Meteorological Organization (WMO) fears coronavirus will affect the quality of forecasts and the monitoring of the atmosphere
The World Meteorological Organization (WMO) is concerned about the impact of the COVID-19 pandemic on the quantity and quality of meteorological observations and forecasts, as well as on atmospheric and climate monitoring.
WMO’s Global Observing System serves as the backbone of all the climate services and products provided to their citizens by the organization’s 193 member states and territories. It provides observations on the state of the atmosphere and the ocean surface from land-, marine- and space-based instruments. These data are used to prepare meteorological analyses, weather forecasts and climate monitoring.
“National Meteorological and Hydrological Services continue to perform their essential functions 24 hours a day, seven days a week, despite the severe challenges posed by the coronavirus pandemic,” said WMO Secretary-General Petteri Taalas. “We salute their dedication to protecting lives and property, but we are mindful of the growing constraints on capacity and resources.”
Taalas added that the impacts of climate change and the growing number of weather-related disasters continue. “The COVID-19 pandemic poses a challenge that aggravates multi-hazard risks at the level of a single country. It is therefore essential that governments pay attention to their national early-warning and weather-observing capacities despite the COVID-19 crisis.”
A large part of the observing system, such as its satellite components and ground-based observing networks, is partly or fully automated. It is therefore expected to keep functioning without significant problems for several weeks, in some cases even longer. However, if the pandemic lasts more than a few weeks, missing repair, maintenance and supply work, and missing redeployments, will become a growing concern.
Some parts of the observing system are already being affected, most notably by the significant decrease in air traffic. In-flight measurements of ambient temperature and of wind speed and direction are a very important source of information for both weather forecasting and climate monitoring.
Meteorological data from aircraft
Commercial airliners contribute to the Aircraft Meteorological Data Relay (AMDAR) programme, which uses onboard sensors, computers and communications systems to collect, process, format and transmit meteorological observations to ground stations via satellite or radio.
In some parts of the world, particularly in Europe, the decrease in the number of measurements over the last two weeks has been dramatic. See the chart provided by EUMETNET.
Total AMDAR observations in March 2020 (Source: WMO)
The countries affiliated with EUMETNET, which brings together 31 national meteorological services in Europe, are currently discussing ways to boost the short-term capabilities of other parts of their observing networks in order to partly offset the loss of aircraft observations.
The AMDAR observing system has normally produced more than 700,000 high-quality observations per day of air temperature and of wind speed and direction. It has also provided the required positional and temporal information, with a growing number of humidity and turbulence measurements.
Surface-based observations
In most developed countries, surface weather observations are almost fully automated. However, in many developing countries, such as Brazil, the transition to automated observations is still under way, and the meteorological community still depends on observations taken manually by observers, who transmit them to the international networks for use in global weather and climate models.
WMO has recorded a significant decrease in the availability of manual observations over the last two weeks. Part of this may be related to the current coronavirus situation, but it is not yet clear whether other factors have also contributed. WMO is investigating other possible causes.
“At present, the adverse impact of the loss of observations on the quality of weather forecast products is still relatively small. However, as the decrease in the availability of aircraft weather observations continues and expands, we can expect a gradual decline in the reliability of forecasts,” said Lars Peter Riishojgaard, Director of the Earth System Branch in WMO’s Infrastructure Department.
According to Riishojgaard, the same holds if the decrease in surface-based weather observations continues, particularly if the COVID-19 outbreak begins to have a more significant impact on the ability of observers to work in developing countries. “WMO will continue to monitor the situation, and the organization is working with its Members to mitigate the impact as much as possible,” he said.
Map provided by WMO: countries shown in darker colors provided fewer observations over the last week than the January 2020 average (pre-COVID-19); countries shown in black are currently not sending any data.
Currently, there are 16 meteorological and 50 research satellites worldwide, plus more than 10,000 automatic and staffed surface weather stations, 1,000 upper-air stations, 7,000 ships, 100 moored and 1,000 drifting buoys, hundreds of weather radars and 3,000 specially equipped commercial aircraft, which measure key parameters of the atmosphere, land and ocean surface every day.
Translated and adapted by Paula Soares and Amanda Sampaio from content published on the website of the WMO – World Meteorological Organization.