Tag archive: Uncertainty

Pope Francis asks for prayers for robots and AI (TecMundo)

Nov 11, 2020, 18:30 · 1 min read


Jorge Marin

Pope Francis has asked the faithful around the world to pray, during the month of November, that progress in robotics and artificial intelligence (AI) may always serve humanity.

The message is part of a series of prayer intentions that the pontiff announces annually and shares each month on YouTube to help Catholics "deepen their daily prayer" by focusing on specific topics. In September, the pope asked for prayers for the "sharing of the planet's resources"; in August, for the "maritime world"; and now it is the turn of robots and AI.

In his message, Pope Francis called for special attention to AI, which, he said, is "at the center of the historic change we are experiencing." And it is not only a matter of the benefits robotics can bring to the world.

Technological progress and algorithms

Francis states that technological progress is not always a sign of well-being for humanity: if it contributes to widening inequality, it cannot be considered true progress. "Future advances must be oriented toward respect for the dignity of the person," the pope warns.

Concern that technology could deepen existing social divides led the Vatican to sign, earlier this year, together with Microsoft and IBM, the "Rome Call for AI Ethics," a document setting out principles to guide the deployment of AI: transparency, inclusion, impartiality, and reliability.

Even people who are not religious can recognize that, when it comes to deploying algorithms, the pope's concern makes perfect sense.

How will AI shape our lives post-Covid? (BBC)

Original article

BBC, 09 Nov 2020

Audrey Azoulay: Director-General, Unesco

Covid-19 is a test like no other. Never before have the lives of so many people around the world been affected at this scale or speed.

Over the past six months, thousands of AI innovations have sprung up in response to the challenges of life under lockdown. Governments are mobilising machine-learning in many ways, from contact-tracing apps to telemedicine and remote learning.

However, as the digital transformation accelerates exponentially, it is highlighting the challenges of AI. Ethical dilemmas are already a reality – including privacy risks and discriminatory bias.

It is up to us to decide what we want AI to look like: there is a legislative vacuum that needs to be filled now. Principles such as proportionality, inclusivity, human oversight and transparency can create a framework allowing us to anticipate these issues.

This is why Unesco is working to build consensus among 193 countries to lay the ethical foundations of AI. Building on these principles, countries will be able to develop national policies that ensure AI is designed, developed and deployed in compliance with fundamental human values.

As we face new, previously unimaginable challenges – like the pandemic – we must ensure that the tools we are developing work for us, and not against us.

Solar geoengineering should not be ruled out, scientists say (TecMundo)

Nov 3, 2020, 19:00 · 3 min read


Reinaldo Zaruvni

Once viewed with suspicion by the scientific community, methods of artificially intervening in the environment to curb the devastating effects of global warming are now being considered as last-resort options, since initiatives to cut greenhouse gas emissions depend directly on collective action and take decades to produce any benefit. We may not have that much time, according to some researchers in the field, who have been attracting investment and a great deal of attention.

Belonging to a field also referred to as solar geoengineering, most of these methods rely on the controlled release of particles into the atmosphere to block part of the energy reaching our planet and redirect it back into space, creating a cooling effect similar to that produced by volcanic eruptions.

Even though they do nothing about pollution, for example, scientists consider that, in the face of increasingly aggressive storms, fire tornadoes, floods, and other natural disasters, such measures would be worthwhile until more effective solutions are developed.

Michael Gerrard, director of the Sabin Center for Climate Change Law at Columbia Law School and editor of a book on the technology and its legal implications, summed up the situation in an interview with The New York Times: "We are facing an existential threat. That is why we need to analyze all the options."

"I like to compare geoengineering to chemotherapy for the planet: if everything else is failing, all that is left is to try it," he argued.

Natural disasters caused by global warming make such interventions urgent, researchers argue. Source: Unsplash

Double standards

Among the most notable initiatives is that of a nongovernmental organization called SilverLining, which has awarded US$ 3 million to several universities and other institutions to pursue answers to practical questions. One example is finding the ideal altitude for applying aerosols and how to inject the most suitable amount, while checking the effects on the global food production chain.

Chris Sacca, co-founder of Lowercarbon Capital, an investment group that is one of SilverLining's funders, declared in an alarmist tone: "Decarbonization is necessary, but it will take 20 years or more to happen. If we don't explore climate interventions such as solar reflection now, we will condemn countless lives, species, and ecosystems to the heat."

Another recipient of substantial sums was the National Oceanic and Atmospheric Administration, which received US$ 4 million from the US Congress precisely to develop technologies of this kind and to monitor the secret use of such solutions by other countries.

Douglas MacMartin, a researcher in mechanical and aerospace engineering at Cornell University, said that "humanity's power to cool things down is certain, but what is not clear is what comes next."

If, on the one hand, the planet can be cooled artificially, on the other, no one knows what will follow. Source: Unsplash

Is there a way?

To clarify the possible consequences of interventions of this magnitude, MacMartin will develop models of the specific climate effects of injecting aerosols into the atmosphere above different parts of the globe and at different altitudes. "Depending on where you put [the substance], you will have different effects on the monsoons in Asia and on Arctic sea ice," he noted.

The National Center for Atmospheric Research in Boulder, Colorado, also funded by SilverLining, believes it has the ideal system for the task, one considered the most sophisticated in the world. It will be used to run hundreds of simulations in which specialists will look for what they call the sweet spot, where the amount of artificial cooling that can reduce extreme weather events does not cause broader changes in regional precipitation patterns or similar impacts.

"Is there a way, at least in our model world, to see whether we can achieve one without triggering too much of the other?" asked Jean-François Lamarque, director of the institution's Climate and Global Dynamics Laboratory. There is no answer to that question yet, but sustainable approaches are also being examined by Australian researchers, who would spray seawater to make clouds more reflective, with early tests showing promising results.

That way, perhaps the losses of reef corals we have been witnessing will one day come to an end. As for the rest, well, only time will tell.

Rafael Muñoz: Brazil pays a high price for the lack of an integrated disaster risk management policy (Folha de S.Paulo)

www1.folha.uol.com.br – October 20, 2020

It is urgent that these issues be integrated into broad socioeconomic development policies
Flood in Itaoca, 2014. Source: Agência Brasil

It was in January 2011 that the myth that there are no disasters in Brazil collapsed. Torrential rains in the mountain region of Rio de Janeiro state triggered landslides and floods, leaving a trail of more than a thousand dead. The event showed the need to prioritize the disaster risk agenda, which had long been treated as secondary given the lack of knowledge about the real impacts of extreme natural events on Brazilian society and its economy.

Against this backdrop, the World Bank, in partnership with Sedec (the National Secretariat for Protection and Civil Defense) and UFSC (the Federal University of Santa Catarina), carried out a detailed analysis of past disaster events that demonstrates the real scale of the problem: between 1995 and 2019, Brazil lost on average about R$ 1.1 billion per month to disasters, meaning total losses for the period are estimated at around R$ 330 billion.

Of this total, 20% are direct losses (damage), the large majority of it (59%) in the infrastructure sector, with housing accounting for 37%. Indirect losses correspond to approximately 80% of the total impact of disasters in the country, and are most pronounced in agriculture (R$ 149.8 billion) and livestock (R$ 55.7 billion) in the private sector, and in water and transport (R$ 31.9 billion) in the public sector. The human toll is also significant: 4,065 deaths, 7.4 million people temporarily or permanently displaced from their homes because of damage, and more than 276 million people affected.
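
As a rough consistency check on these figures (an illustrative back-of-the-envelope calculation, not part of the World Bank analysis), the monthly and total estimates line up:

R$ 1.1 billion/month × 300 months (1995–2019) ≈ R$ 330 billion
direct losses (damage) ≈ 0.20 × R$ 330 billion ≈ R$ 66 billion
indirect losses ≈ 0.80 × R$ 330 billion ≈ R$ 264 billion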

Beyond the human and economic losses, public policies designed to promote socioeconomic progress can also have their effectiveness reduced, since disaster events demonstrably affect indicators of health, purchasing power, access to jobs and income, and education, among others. Vital investments in critical infrastructure, such as transport and housing, are also heavily affected by disasters.

Given this scenario, an unavoidable question arises: why does Brazil still not have an integrated disaster risk management policy and a National Protection and Civil Defense Plan? To secure the much-needed progress, Sedec's current leadership has made it a priority to regulate Law 12,608/2012, which establishes the National Protection and Civil Defense Policy, and to draw up the National Protection and Civil Defense Plan.

Such measures can form a legal framework and set of guidelines to drive structural improvements in public policy. In the housing sector, for example, protocols could be defined for incorporating risk-mapping products into decisions on new investments or into disaster risk mitigation for projects already delivered. In fiscal planning, budgets more consistent with the economic impacts of disasters could be set each year to better protect the national and subnational economies. Finally, investment in critical infrastructure (for example transport, water and sanitation, power generation and distribution), and its maintenance from the standpoint of exposure and vulnerability to natural hazards, can ensure continuity of operations and business in extreme situations, allowing essential services to keep reaching the population and reducing indirect impacts on the economy.

Given the increasing frequency and socioeconomic impact of extreme natural events, there is consensus among specialists that rapid urbanization has created conditions more conducive to disasters, through inappropriate land occupation in areas subject to natural hazards and without the civil works needed to manage natural processes. This process has left vulnerable communities across the country highly exposed, and, as we assess the impacts of the Covid-19 pandemic on our economy and communities, we cannot overlook how disasters have long (negatively) influenced public policy in our country.

Fortunately, advances in data and evidence collection now allow disaster events and their impacts to be brought under the light of technical knowledge and placed in the hands of legislators, public administrators, and decision makers, through risk maps, weather and climate forecasts, flood and landslide models, as well as discussion forums and financing projects.

In this context, there is a clear need to adapt disaster risk management models that have succeeded elsewhere in the world to Brazil's characteristics. Broadly speaking, the size of the national territory, the federal model of public administration, and a history of smaller-scale but cumulatively frequent disaster events, among other factors, mean that the roles of the federal government and of state and municipal governments in this agenda need to be defined.

It is therefore urgent to integrate disaster risk management into broad socioeconomic development policies, such as housing programs, urban planning and expansion, investment in critical infrastructure, agricultural incentives, and income transfers, among others.

In addition, there is a real opportunity to rethink recovery processes from a Build Back Better perspective, so as to ensure that past mistakes are not repeated, creating or perpetuating current levels of disaster risk.

This column was written in collaboration with Frederico Pedroso, Disaster Risk Management specialist at the World Bank; Joaquin Toro, lead Disaster Risk Management specialist at the World Bank; and Rafael Schadeck, civil engineer and Disaster Risk Management consultant at the World Bank.

Science and Policy Collide During the Pandemic (The Scientist)

COVID-19 has laid bare some of the pitfalls of the relationship between scientific experts and policymakers—but some researchers say there are ways to make it better.

Diana Kwon

Sep 1, 2020

Science has taken center stage during the COVID-19 pandemic. Early on, as SARS-CoV-2 started spreading around the globe, many researchers pivoted to focus on studying the virus. At the same time, some scientists and science advisors—experts responsible for providing scientific information to policymakers—gained celebrity status as they calmly and cautiously updated the public on the rapidly evolving situation and lent their expertise to help governments make critical decisions, such as those relating to lockdowns and other transmission-slowing measures.

“Academia, in the case of COVID, has done an amazing job of trying to get as much information relevant to COVID gathered and distributed into the policymaking process as possible,” says Chris Tyler, the director of research and policy in University College London’s Department of Science, Technology, Engineering and Public Policy (STEaPP). 

But the pace at which COVID-related science has been conducted and disseminated during the pandemic has also revealed the challenges associated with translating fast-accumulating evidence for an audience not well versed in the process of science. As research findings are speedily posted to preprint servers, preliminary results have made headlines in major news outlets, sometimes without the appropriate dose of scrutiny.

Some politicians, such as Brazil’s President Jair Bolsonaro, have been quick to jump on premature findings, publicly touting the benefits of treatments such as hydroxychloroquine with minimal or no supporting evidence. Others have pointed to the flip-flopping of the current state of knowledge as a sign of scientists’ untrustworthiness or incompetence—as was seen, for example, in the backlash against Anthony Fauci, one of the US government’s top science advisors. 

Some comments from world leaders have been even more concerning. “For me, the most shocking thing I saw,” Tyler says, “was Donald Trump suggesting the injection of disinfectant as a way of treating COVID—that was an eye-popping, mind-boggling moment.” 

Still, Tyler notes that there are many countries in which the relationship between the scientific community and policymakers during the course of the pandemic has been “pretty impressive.” As an example, he points to Germany, where the government has both enlisted and heeded the advice of scientists across a range of disciplines, including epidemiology, virology, economics, public health, and the humanities.

Researchers will likely be assessing the response to the pandemic for years to come. In the meantime, for scientists interested in getting involved in policymaking, there are lessons to be learned, as well as some preliminary insights from the pandemic that may help to improve interactions between scientists and policymakers and thereby pave the way to better evidence-based policy. 

Cultural divisions between scientists and policymakers

Even in the absence of a public-health emergency, there are several obstacles to the smooth implementation of scientific advice into policy. One is simply that scientists and policymakers are generally beholden to different incentive systems. “Classically, a scientist wants to understand something for the sake of understanding, because they have a passion toward that topic—so discovery is driven by the value of discovery,” says Kai Ruggeri, a professor of health policy and management at Columbia University. “Whereas the policymaker has a much more utilitarian approach. . . . They have to come up with interventions that produce the best outcomes for the most people.”

Scientists and policymakers are operating on considerably different timescales, too. “Normally, research programs take months and years, whereas policy decisions take weeks and months, sometimes days,” Tyler says. “This discrepancy makes it much more difficult to get scientifically generated knowledge into the policymaking process.” Tyler adds that the two groups deal with uncertainty in very different ways: academics are comfortable with it, as measuring uncertainty is part of the scientific process, whereas policymakers tend to view it as something that can cloud what a “right” answer might be. 

This cultural mismatch has been particularly pronounced during the COVID-19 pandemic. Even as scientists work at breakneck speeds, many crucial questions about COVID-19—such as how long immunity to the virus lasts, and how much of a role children play in the spread of infection—remain unresolved, and policy decisions have had to be addressed with limited evidence, with advice changing as new research emerges. 

“We have seen the messy side of science, [that] not all studies are equally well-done and that they build over time to contribute to the weight of knowledge,” says Karen Akerlof, a professor of environmental science and policy at George Mason University. “The short timeframes needed for COVID-19 decisions have run straight into the much longer timeframes needed for robust scientific conclusions.” 

Academia has done an amazing job of trying to get as much information  relevant to COVID gathered and distributed into the policymaking process as possible. —Chris Tyler, University College London

Widespread mask use, for example, was initially discouraged by many politicians and public health officials due to concerns about a shortage of supplies for healthcare workers and limited data on whether mask use by the general public would help reduce the spread of the virus. At the time, there were few mask-wearing laws outside of East Asia, where such practices were commonplace long before the COVID-19 pandemic began.  

Gradually, however, as studies began to provide evidence to support the use of face coverings as a means of stemming transmission, scientists and public health officials started to recommend their use. This shift led local, state, and federal officials around the world to implement mandatory mask-wearing rules in certain public spaces. Some politicians, however, used this about-face in advice as a reason to criticize health experts.  

“We’re dealing with evidence that is changing very rapidly,” says Meghan Azad, a professor of pediatrics at the University of Manitoba. “I think there’s a risk of people perceiving that rapid evolution as science [being] a bad process, which is worrisome.” On the other hand, the spotlight the pandemic has put on scientists provides opportunities to educate the general public and policymakers about the scientific process, Azad adds. It’s important to help them understand that “it’s good that things are changing, because it means we’re paying attention to the new evidence as it comes out.”

Bringing science and policy closer together

Despite these challenges, science and policy experts say that there are both short- and long-term ways to improve the relationship between the two communities and to help policymakers arrive at decisions that are more evidence-based.

Better tools, for one, could help close the gap. Earlier this year, Ruggeri brought together a group of people from a range of disciplines, including medicine, engineering, economics, and policy, to develop the Theoretical, Empirical, Applicable, Replicable, Impact (THEARI) rating system, a five-tiered framework for evaluating the robustness of scientific evidence in the context of policy decisions. The ratings range from “theoretical” (the lowest level, where a scientifically viable idea has been proposed but not tested) to “impact” (the highest level, in which a concept has been successfully tested, replicated, applied, and validated in the real world).

The team developed THEARI partly to establish a “common language” across scientific disciplines, which Ruggeri says would be particularly useful to policymakers evaluating evidence from a field they may know little about. Ruggeri hopes to see the THEARI framework—or something like it—adopted by policymakers and policy advisors, and even by journals and preprint servers. “I don’t necessarily think [THEARI] will be used right away,” he says. “It’d be great if it was, but we . . . [developed] it as kind of a starting point.” 
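
For readers who find a concrete representation helpful, here is a minimal sketch, in Python, of how the five THEARI tiers might be encoded as an ordered scale. This is purely illustrative: the article does not describe any implementation, the tier names follow the acronym, and only the descriptions of the lowest and highest tiers come from the article; the comments on the middle tiers are assumptions.

from enum import IntEnum

class THEARI(IntEnum):
    """Five-tier evidence rating, ordered from weakest to strongest."""
    THEORETICAL = 1  # per the article: a scientifically viable idea proposed but not yet tested
    EMPIRICAL = 2    # assumption: the idea has been tested empirically
    APPLICABLE = 3   # assumption: findings can be applied beyond the original study
    REPLICABLE = 4   # assumption: findings have been independently replicated
    IMPACT = 5       # per the article: tested, replicated, applied, and validated in the real world

# Ordering lets the strength of two pieces of evidence be compared at a glance:
assert THEARI.IMPACT > THEARI.THEORETICAL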

Other approaches to improve the communication between scientists and policymakers may require more resources and time. According to Akerlof, one method could include providing better incentives for both parties to engage with each other—by offering increased funding for academics who take part in this kind of activity, for instance—and boosting opportunities for such interactions to happen. 

Akerlof points to the American Association for the Advancement of Science’s Science & Technology Policy Fellowships, which place scientists and engineers in various branches of the US government for a year, as an example of a way in which important ties between the two communities could be forged. “Many of those scientists either stay in government or continue to work in science policy in other organizations,” Akerlof says. “By understanding the language and culture of both the scientific and policy communities, they are able to bridge between them.”  

In Canada, such a program was established in 2018, when the Canadian Science Policy Center and Mona Nemer, Canada’s Chief Science Advisor, held the country’s first “Science Meets Parliament” event. The 28 scientists in attendance, including Azad, spent two days learning about effective communication and the policymaking process, and interacting with senators and members of parliament. “It was eye opening for me because I didn’t know how parliamentarians really live and work,” Azad says. “We hope it’ll grow and involve more scientists and continue on an annual basis . . . and also happen at the provincial level.”

The short timeframes needed for COVID-19 decisions have run straight into the much longer timeframes needed for robust scientific conclusions. —Karen Akerlof, George Mason University

There may also be insights from scientist-policymaker exchanges in other domains that experts can apply to the current pandemic. Maria Carmen Lemos, a social scientist focused on climate policy at the University of Michigan, says that one way to make those interactions more productive is by closing something she calls the “usability gap.”

“The usability gap highlights the fact that one of the reasons that research fails to connect is because [scientists] only pay attention to the [science],” Lemos explains. “We are putting everything out there in papers, in policy briefs, in reports, but rarely do we actually systematically and intentionally try to understand who is on the other side” receiving this information, and what they will do with it.

The way to deal with this usability gap, according to Lemos, is for more scientists to consult the people who actually make, influence, and implement policy changes early on in the scientific process. Lemos and her team, for example, have engaged in this way with city officials, farmers, forest managers, tribal leaders, and others whose decision making would directly benefit from their work. “We help with organization and funding, and we also work with them very closely to produce climate information that is tailored for them, for the problems that they are trying to solve,” she adds. 

Azad applied this kind of approach in a study that involves assessing the effects of the pandemic on a cohort of children that her team has been following from infancy, starting in 2010. When she and her colleagues were putting together the proposal for the COVID-19 project this year, they reached out to public health decision makers across the Canadian provinces to find out what information would be most useful. “We have made sure to embed those decision makers in the project from the very beginning to ensure we’re asking the right questions, getting the most useful information, and getting it back to them in a very quick turnaround manner,” Azad says. 

There will also likely be lessons to take away from the pandemic in the years to come, notes Noam Obermeister, a PhD student studying science policy at the University of Cambridge. These include insights from scientific advisors about how providing guidance to policymakers during COVID-19 compared to pre-pandemic times, and how scientists’ prominent role during the pandemic has affected how they are viewed by the public; efforts to collect this sort of information are already underway. 

“I don’t think scientists anticipated that much power and visibility, or that [they] would be in [public] saying science is complicated and uncertain,” Obermeister says. “I think what that does to the authority of science in the public eye is still to be determined.”

Talking Science to Policymakers

For academics who have never engaged with policymakers, the thought of making contact may be daunting. Researchers with experience of these interactions share their tips for success.
1. Do your homework. Policymakers usually have many different people vying for their time and attention. When you get a meeting, make sure you make the most of it. “Find out which issues related to your research are a priority for the policymaker and which decisions are on the horizon,” says Karen Akerlof, a professor of environmental science and policy at George Mason University.
2. Get to the point, but don’t oversimplify. “I find policymakers tend to know a lot about the topics they work on, and when they don’t, they know what to ask about,” says Kai Ruggeri, a professor of health policy and management at Columbia University. “Finding a good balance in the communication goes a long way.”
3. Keep in mind that policymakers’ expertise differs from that of scientists. “Park your ego at the door and treat policymakers and their staff with respect,” Akerlof says. “Recognize that the skills, knowledge, and culture that translate to success in policy may seem very different than those in academia.” 
4. Be persistent. “Don’t be discouraged if you don’t get a response immediately, or if promising communications don’t pan out,” says Meghan Azad, a professor of pediatrics at the University of Manitoba. “Policymakers are busy and their attention shifts rapidly. Meetings get cancelled. It’s not personal. Keep trying.”
5. Remember that not all policymakers are politicians, and vice versa. Politicians are usually elected and are affiliated with a political party, and they may not always be directly involved in creating new policies. This is not the case for the vast majority of policymakers—most are career civil servants whose decisions impact the daily living of constituents, Ruggeri explains. 

A Supercomputer Analyzed Covid-19 — and an Interesting New Theory Has Emerged (Medium/Elemental)

A closer look at the Bradykinin hypothesis

Thomas Smith, Sept 1, 2020

Original article


Earlier this summer, the Summit supercomputer at Oak Ridge National Lab in Tennessee set about crunching data on more than 40,000 genes from 17,000 genetic samples in an effort to better understand Covid-19. Summit is the second-fastest computer in the world, but the process — which involved analyzing 2.5 billion genetic combinations — still took more than a week.

When Summit was done, researchers analyzed the results. It was, in the words of Dr. Daniel Jacobson, lead researcher and chief scientist for computational systems biology at Oak Ridge, a “eureka moment.” The computer had revealed a new theory about how Covid-19 impacts the body: the bradykinin hypothesis. The hypothesis provides a model that explains many aspects of Covid-19, including some of its most bizarre symptoms. It also suggests 10-plus potential treatments, many of which are already FDA approved. Jacobson’s group published their results in a paper in the journal eLife in early July.

According to the team’s findings, a Covid-19 infection generally begins when the virus enters the body through ACE2 receptors in the nose. (The receptors, which the virus is known to target, are abundant there.) The virus then proceeds through the body, entering cells in other places where ACE2 is also present: the intestines, kidneys, and heart. This likely accounts for at least some of the disease’s cardiac and GI symptoms.

But once Covid-19 has established itself in the body, things start to get really interesting. According to Jacobson’s group, the data Summit analyzed shows that Covid-19 isn’t content to simply infect cells that already express lots of ACE2 receptors. Instead, it actively hijacks the body’s own systems, tricking it into upregulating ACE2 receptors in places where they’re usually expressed at low or medium levels, including the lungs.

In this sense, Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house. Once inside, though, they don’t just take your stuff — they also throw open all your doors and windows so their accomplices can rush in and help pillage more efficiently.

The renin–angiotensin system (RAS) controls many aspects of the circulatory system, including the body’s levels of a chemical called bradykinin, which normally helps to regulate blood pressure. According to the team’s analysis, when the virus tweaks the RAS, it causes the body’s mechanisms for regulating bradykinin to go haywire. Bradykinin receptors are resensitized, and the body also stops effectively breaking down bradykinin. (ACE normally degrades bradykinin, but when the virus downregulates it, it can’t do this as effectively.)

The end result, the researchers say, is to release a bradykinin storm — a massive, runaway buildup of bradykinin in the body. According to the bradykinin hypothesis, it’s this storm that is ultimately responsible for many of Covid-19’s deadly effects. Jacobson’s team says in their paper that “the pathology of Covid-19 is likely the result of Bradykinin Storms rather than cytokine storms,” which had been previously identified in Covid-19 patients, but that “the two may be intricately linked.” Other papers had previously identified bradykinin storms as a possible cause of Covid-19’s pathologies.

Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house.

As bradykinin builds up in the body, it dramatically increases vascular permeability. In short, it makes your blood vessels leaky. This aligns with recent clinical data, which increasingly views Covid-19 primarily as a vascular disease, rather than a respiratory one. But Covid-19 still has a massive effect on the lungs. As blood vessels start to leak due to a bradykinin storm, the researchers say, the lungs can fill with fluid. Immune cells also leak out into the lungs, Jacobson’s team found, causing inflammation.

And Covid-19 has another especially insidious trick. Through another pathway, the team’s data shows, it increases production of hyaluronic acid (HLA) in the lungs. HLA is often used in soaps and lotions for its ability to absorb more than 1,000 times its weight in fluid. When it combines with fluid leaking into the lungs, the results are disastrous: It forms a hydrogel, which can fill the lungs in some patients. According to Jacobson, once this happens, “it’s like trying to breathe through Jell-O.”

This may explain why ventilators have proven less effective in treating advanced Covid-19 than doctors originally expected, based on experiences with other viruses. “It reaches a point where regardless of how much oxygen you pump in, it doesn’t matter, because the alveoli in the lungs are filled with this hydrogel,” Jacobson says. “The lungs become like a water balloon.” Patients can suffocate even while receiving full breathing support.

The bradykinin hypothesis also extends to many of Covid-19’s effects on the heart. About one in five hospitalized Covid-19 patients have damage to their hearts, even if they never had cardiac issues before. Some of this is likely due to the virus infecting the heart directly through its ACE2 receptors. But the RAS also controls aspects of cardiac contractions and blood pressure. According to the researchers, bradykinin storms could create arrhythmias and low blood pressure, which are often seen in Covid-19 patients.

The bradykinin hypothesis also accounts for Covid-19’s neurological effects, which are some of the most surprising and concerning elements of the disease. These symptoms (which include dizziness, seizures, delirium, and stroke) are present in as many as half of hospitalized Covid-19 patients. According to Jacobson and his team, MRI studies in France revealed that many Covid-19 patients have evidence of leaky blood vessels in their brains.

Bradykinin — especially at high doses — can also lead to a breakdown of the blood-brain barrier. Under normal circumstances, this barrier acts as a filter between your brain and the rest of your circulatory system. It lets in the nutrients and small molecules that the brain needs to function, while keeping out toxins and pathogens and keeping the brain’s internal environment tightly regulated.

If bradykinin storms cause the blood-brain barrier to break down, this could allow harmful cells and compounds into the brain, leading to inflammation, potential brain damage, and many of the neurological symptoms Covid-19 patients experience. Jacobson told me, “It is a reasonable hypothesis that many of the neurological symptoms in Covid-19 could be due to an excess of bradykinin. It has been reported that bradykinin would indeed be likely to increase the permeability of the blood-brain barrier. In addition, similar neurological symptoms have been observed in other diseases that result from an excess of bradykinin.”

Increased bradykinin levels could also account for other common Covid-19 symptoms. ACE inhibitors — a class of drugs used to treat high blood pressure — have a similar effect on the RAS system as Covid-19, increasing bradykinin levels. In fact, Jacobson and his team note in their paper that “the virus… acts pharmacologically as an ACE inhibitor” — almost directly mirroring the actions of these drugs.

By acting like a natural ACE inhibitor, Covid-19 may be causing the same effects that hypertensive patients sometimes get when they take blood pressure–lowering drugs. ACE inhibitors are known to cause a dry cough and fatigue, two textbook symptoms of Covid-19. And they can potentially increase blood potassium levels, which has also been observed in Covid-19 patients. The similarities between ACE inhibitor side effects and Covid-19 symptoms strengthen the bradykinin hypothesis, the researchers say.

ACE inhibitors are also known to cause a loss of taste and smell. Jacobson stresses, though, that this symptom is more likely due to the virus “affecting the cells surrounding olfactory nerve cells” than the direct effects of bradykinin.

Though still an emerging theory, the bradykinin hypothesis explains several other of Covid-19’s seemingly bizarre symptoms. Jacobson and his team speculate that leaky vasculature caused by bradykinin storms could be responsible for “Covid toes,” a condition involving swollen, bruised toes that some Covid-19 patients experience. Bradykinin can also mess with the thyroid gland, which could produce the thyroid symptoms recently observed in some patients.

The bradykinin hypothesis could also explain some of the broader demographic patterns of the disease’s spread. The researchers note that some aspects of the RAS system are sex-linked, with proteins for several receptors (such as one called TMSB4X) located on the X chromosome. This means that “women… would have twice the levels of this protein than men,” a result borne out by the researchers’ data. In their paper, Jacobson’s team concludes that this “could explain the lower incidence of Covid-19 induced mortality in women.” A genetic quirk of the RAS could be giving women extra protection against the disease.

The bradykinin hypothesis provides a model that “contributes to a better understanding of Covid-19” and “adds novelty to the existing literature,” according to scientists Frank van de Veerdonk, Jos WM van der Meer, and Roger Little, who peer-reviewed the team’s paper. It predicts nearly all the disease’s symptoms, even ones (like bruises on the toes) that at first appear random, and further suggests new treatments for the disease.

As Jacobson and team point out, several drugs target aspects of the RAS and are already FDA approved to treat other conditions. They could arguably be applied to treating Covid-19 as well. Several, like danazol, stanozolol, and ecallantide, reduce bradykinin production and could potentially stop a deadly bradykinin storm. Others, like icatibant, reduce bradykinin signaling and could blunt its effects once it’s already in the body.

Interestingly, Jacobson’s team also suggests vitamin D as a potentially useful Covid-19 drug. The vitamin is involved in the RAS system and could prove helpful by reducing levels of another compound, known as REN. Again, this could stop potentially deadly bradykinin storms from forming. The researchers note that vitamin D has already been shown to help those with Covid-19. The vitamin is readily available over the counter, and around 20% of the population is deficient. If indeed the vitamin proves effective at reducing the severity of bradykinin storms, it could be an easy, relatively safe way to reduce the severity of the virus.

Other compounds could treat symptoms associated with bradykinin storms. Hymecromone, for example, could reduce hyaluronic acid levels, potentially stopping deadly hydrogels from forming in the lungs. And timbetasin could mimic the mechanism that the researchers believe protects women from more severe Covid-19 infections. All of these potential treatments are speculative, of course, and would need to be studied in a rigorous, controlled environment before their effectiveness could be determined and they could be used more broadly.

Covid-19 stands out for both the scale of its global impact and the apparent randomness of its many symptoms. Physicians have struggled to understand the disease and come up with a unified theory for how it works. Though as of yet unproven, the bradykinin hypothesis provides such a theory. And like all good hypotheses, it also provides specific, testable predictions — in this case, actual drugs that could provide relief to real patients.

The researchers are quick to point out that “the testing of any of these pharmaceutical interventions should be done in well-designed clinical trials.” As to the next step in the process, Jacobson is clear: “We have to get this message out.” His team’s finding won’t cure Covid-19. But if the treatments it points to pan out in the clinic, interventions guided by the bradykinin hypothesis could greatly reduce patients’ suffering — and potentially save lives.

The Biblical Flood That Will Drown California (Wired)

Tom Philpott, 08.29.20 8:00 AM

The Great Flood of 1861–1862 was a preview of what scientists expect to see again, and soon.

This story originally appeared on Mother Jones and is part of the Climate Desk collaboration.

In November 1860, a young scientist from upstate New York named William Brewer disembarked in San Francisco after a long journey that took him from New York City through Panama and then north along the Pacific coast. “The weather is perfectly heavenly,” he enthused in a letter to his brother back east. The fast-growing metropolis was already revealing the charms we know today: “large streets, magnificent buildings” adorned by “many flowers we [northeasterners] see only in house cultivations: various kinds of geraniums growing of immense size, dew plant growing like a weed, acacia, fuchsia, etc. growing in the open air.”

Flowery prose aside, Brewer was on a serious mission. Barely a decade after being claimed as a US state, California was plunged in an economic crisis. The gold rush had gone bust, and thousands of restive settlers were left scurrying about, hot after the next ever-elusive mineral bonanza. The fledgling legislature had seen fit to hire a state geographer to gauge the mineral wealth underneath its vast and varied terrain, hoping to organize and rationalize the mad lunge for buried treasure. The potential for boosting agriculture as a hedge against mining wasn’t lost on the state’s leaders. They called on the state geographer to deliver a “full and scientific description of the state’s rocks, fossils, soils, and minerals, and its botanical and zoological productions, together with specimens of same.”

The task of completing the fieldwork fell to the 32-year-old Brewer, a Yale-trained botanist who had studied cutting-edge agricultural science in Europe. His letters home, chronicling his four-year journey up and down California, form one of the most vivid contemporary accounts of its early statehood.

They also provide a stark look at the greatest natural disaster known to have befallen the western United States since European contact in the 16th century: the Great Flood of 1861–1862. The cataclysm cut off telegraph communication with the East Coast, swamped the state’s new capital, and submerged the entire Central Valley under as much as 15 feet of water. Yet in modern-day California—a region that author Mike Davis once likened to a “Book of the Apocalypse theme park,” where this year’s wildfires have already burned 1.4 million acres, and dozens of fires are still raging—the nearly forgotten biblical-scale flood documented by Brewer’s letters has largely vanished from the public imagination, replaced largely by traumatic memories of more recent earthquakes.

When it was thought of at all, the flood was once considered a thousand-year anomaly, a freak occurrence. But emerging science demonstrates that floods of even greater magnitude occurred every 100 to 200 years in California’s precolonial history. Climate change will make them more frequent still. In other words, the Great Flood was a preview of what scientists expect to see again, and soon. And this time, given California’s emergence as agricultural and economic powerhouse, the effects will be all the more devastating.

Barely a year after Brewer’s sunny initial descent from a ship in San Francisco Bay, he was back in the city, on a break. In a November 1861 letter home, he complained of a “week of rain.” In his next letter, two months later, Brewer reported jaw-dropping news: Rain had fallen almost continuously since he had last written—and now the entire Central Valley was underwater. “Thousands of farms are entirely underwater—cattle starving and drowning.”

Picking up the letter nine days later, he wrote that a bad situation had deteriorated. All the roads in the middle of the state are “impassable, so all mails are cut off.” Telegraph service, which had only recently been connected to the East Coast through the Central Valley, stalled. “The tops of the poles are under water!” The young state’s capital city, Sacramento, about 100 miles northeast of San Francisco at the western edge of the valley and the intersection of two rivers, was submerged, forcing the legislature to evacuate—and delaying a payment Brewer needed to forge ahead with his expedition.

The surveyor gaped at the sheer volume of rain. In a normal year, Brewer reported, San Francisco received about 20 inches. In the 10 weeks leading up to January 18, 1862, the city got “thirty-two and three-quarters inches and it is still raining!”

Brewer went on to recount scenes from the Central Valley that would fit in a Hollywood disaster epic. “An old acquaintance, a buccaro [cowboy], came down from a ranch that was overflowed,” he wrote. “The floor of their one-story house was six weeks under water before the house went to pieces.” Steamboats “ran back over the ranches fourteen miles from the [Sacramento] river, carrying stock [cattle], etc., to the hills,” he reported. He marveled at the massive impromptu lake made up of “water ice cold and muddy,” in which “winds made high waves which beat the farm homes in pieces.” As a result, “every house and farm over this immense region is gone.”

Eventually, in March, Brewer made it to Sacramento, hoping (without success) to lay hands on the state funds he needed to continue his survey. He found a city still in ruins, weeks after the worst of the rains. “Such a desolate scene I hope never to see again,” he wrote: “Most of the city is still under water, and has been for three months … Every low place is full—cellars and yards are full, houses and walls wet, everything uncomfortable.” The “better class of houses” were in rough shape, Brewer observed, but “it is with the poorer classes that this is the worst.” He went on: “Many of the one-story houses are entirely uninhabitable; others, where the floors are above the water are, at best, most wretched places in which to live.” He summarized the scene:

Many houses have partially toppled over; some have been carried from their foundations, several streets (now avenues of water) are blocked up with houses that have floated in them, dead animals lie about here and there—a dreadful picture. I don’t think the city will ever rise from the shock, I don’t see how it can.

Brewer’s account is important for more than just historical interest. In the 160 years since the botanist set foot on the West Coast, California has transformed from an agricultural backwater to one of the jewels of the US food system. The state produces nearly all of the almonds, walnuts, and pistachios consumed domestically; 90 percent or more of the broccoli, carrots, garlic, celery, grapes, tangerines, plums, and artichokes; at least 75 percent of the cauliflower, apricots, lemons, strawberries, and raspberries; and more than 40 percent of the lettuce, cabbage, oranges, peaches, and peppers.

And as if that weren’t enough, California is also a national hub for milk production. Tucked in amid the almond groves and vegetable fields are vast dairy operations that confine cows together by the thousands and produce more than a fifth of the nation’s milk supply, more than any other state. It all amounts to a food-production juggernaut: California generates $46 billion worth of food per year, nearly double the haul of its closest competitor among US states, the corn-and-soybean behemoth Iowa.

You’ve probably heard that ever-more frequent and severe droughts threaten the bounty we’ve come to rely on from California. Water scarcity, it turns out, isn’t the only menace that stalks the California valleys that stock our supermarkets. The opposite—catastrophic flooding—also occupies a niche in what Mike Davis, the great chronicler of Southern California’s sociopolitical geography, has called the state’s “ecology of fear.” Indeed, his classic book of that title opens with an account of a 1995 deluge that saw “million-dollar homes tobogganed off their hill-slope perches” and small children and pets “sucked into the deadly vortices of the flood channels.”

Yet floods tend to be less feared than rival horsemen of the apocalypse in the state’s oft-stimulated imagination of disaster. The epochal 2011–2017 drought, with its missing-in-action snowpacks and draconian water restrictions, burned itself into the state’s consciousness. Californians are rightly terrified of fires like the ones that roared through the northern Sierra Nevada foothills and coastal canyons near Los Angeles in the fall of 2018, killing nearly 100 people and fouling air for miles around, or the current LNU Lightning Complex fire that has destroyed nearly 1,000 structures and killed five people in the region between Sacramento and San Francisco. Many people are frightfully aware that a warming climate will make such conflagrations increasingly frequent. And “earthquake kits” are common gear in closets and garages all along the San Andreas Fault, where the next Big One lurks. Floods, though they occur as often in Southern and Central California as they do anywhere in the United States, don’t generate quite the same buzz.

But a growing body of research shows there’s a flip side to the megadroughts Central Valley farmers face: megafloods. The region most vulnerable to such a water-drenched cataclysm in the near future is, ironically enough, California’s great arid, sinking food production basin, the beleaguered behemoth of the US food system: the Central Valley. Bordered on all sides by mountains, the Central Valley stretches 450 miles long, is on average 50 miles wide, and occupies a land mass of 18,000 square miles, or 11.5 million acres—roughly equivalent in size to Massachusetts and Vermont combined. Wedged between the Sierra Nevada to the east and the Coast Ranges to the west, it’s one of the globe’s greatest expanses of fertile soil and temperate weather. For most Americans, it’s easy to ignore the Central Valley, even though it’s as important to eaters as Hollywood is to moviegoers or Silicon Valley is to smartphone users. Occupying less than 1 percent of US farmland, the Central Valley churns out a quarter of the nation’s food supply.

At the time of the Great Flood, the Central Valley was still mainly cattle ranches, the farming boom a ways off. Late in 1861, the state suddenly emerged from a two-decade dry spell when monster storms began lashing the West Coast from Baja California to present-day Washington state. In central California, the deluge initially took the form of 10 to 15 feet of snow dumped onto the Sierra Nevada, according to research by the UC Berkeley paleoclimatologist B. Lynn Ingram and laid out in her 2015 book, The West Without Water, cowritten with Frances Malamud-Roam. Ingram has emerged as a kind of Cassandra of drought and flood risks in the western United States. Soon after the blizzards came days of warm, heavy rain, which in turn melted the enormous snowpack. The resulting slurry cascaded through the Central Valley’s network of untamed rivers.

As floodwater gathered in the valley, it formed a vast, muddy, wind-roiled lake, its size “rivaling that of Lake Superior,” covering the entire Central Valley floor, from the southern slopes of the Cascade Mountains near the Oregon border to the Tehachapis, south of Bakersfield, with depths in some places exceeding 15 feet.

At least some of the region’s remnant indigenous population saw the epic flood coming and took precautions to escape devastation, Ingram reports, quoting an item in the Nevada City Democrat on January 11, 1862:

We are informed that the Indians living in the vicinity of Marysville left their abodes a week or more ago for the foothills predicting an unprecedented overflow. They told the whites that the water would be higher than it has been for thirty years, and pointed high up on the trees and houses where it would come. The valley Indians have traditions that the water occasionally rises 15 or 20 feet higher than it has been at any time since the country was settled by whites, and as they live in the open air and watch closely all the weather indications, it is not improbable that they may have better means than the whites of anticipating a great storm.

All in all, thousands of people died, “one-third of the state’s property was destroyed, and one home in eight was destroyed completely or carried away by the floodwaters.” As for farming, the 1862 megaflood transformed valley agriculture, playing a decisive role in creating today’s Anglo-dominated, crop-oriented agricultural powerhouse: a 19th-century example of the “disaster capitalism” that Naomi Klein describes in her 2007 book, The Shock Doctrine.

Prior to the event, valley land was still largely owned by Mexican rancheros who held titles dating to Spanish rule. The 1848 Treaty of Guadalupe Hidalgo, which triggered California’s transfer from Mexican to US control, gave rancheros US citizenship and obligated the new government to honor their land titles. The treaty terms met with vigorous resentment from white settlers eager to shift from gold mining to growing food for the new state’s burgeoning cities. The rancheros thrived during the gold rush, finding a booming market for beef in mining towns. By 1856, their fortunes had shifted. A severe drought that year cut production, competition from emerging US settler ranchers meant lower prices, and punishing property taxes—imposed by land-poor settler politicians—caused a further squeeze. “As a result, rancheros began to lose their herds, their land, and their homes,” writes the historian Lawrence James Jelinek.

The devastation of the 1862 flood, its effects magnified by a brutal drought that started immediately afterward and lasted through 1864, “delivered the final blow,” Jelinek writes. Between 1860 and 1870, California’s cattle herd, concentrated in the valley, plunged from 3 million to 630,000. The rancheros were forced to sell their land to white settlers at pennies per acre, and by 1870 “many rancheros had become day laborers in the towns,” Jelinek reports. The valley’s emerging class of settler farmers quickly turned to wheat and horticultural production and set about harnessing and exploiting the region’s water resources, both those gushing forth from the Sierra Nevada and those beneath their feet.

Despite all the trauma it generated and the agricultural transformation it cemented in the Central Valley, the flood quickly faded from memory in California and the broader United States. To his shocked assessment of a still-flooded and supine Sacramento months after the storm, Brewer added a prophetic coda:

No people can so stand calamity as this people. They are used to it. Everyone is familiar with the history of fortunes quickly made and as quickly lost. It seems here more than elsewhere the natural order of things. I might say, indeed, that the recklessness of the state blunts the keener feelings and takes the edge from this calamity.

Indeed, the new state’s residents ended up shaking off the cataclysm. What lesson does the Great Flood of 1862 hold for today? The question is important. Back then, just around 500,000 people lived in the entire state, and the Central Valley was a sparsely populated badland. Today, the valley has a population of 6.5 million people and boasts the state’s three fastest-growing counties. Sacramento (population 501,344), Fresno (538,330), and Bakersfield (386,839) are all budding metropolises. The state’s long-awaited high-speed train, if it’s ever completed, will place Fresno residents within an hour of Silicon Valley, driving up its appeal as a bedroom community.

In addition to the potentially vast human toll, there’s also the fact that the Central Valley has emerged as a major linchpin of the US and global food system. Could it really be submerged under fifteen feet of water again—and what would that mean?

In less than two centuries as a US state, California has maintained its reputation as a sunny paradise while also enduring the nation’s most erratic climate: the occasional massive winter storm roaring in from the Pacific; years-long droughts. But recent investigations into the fossil record show that the period since statehood has been, by historical standards, relatively stable.

One avenue of this research is the study of the regular megadroughts, the most recent of which occurred just a century before Europeans made landfall on the North American west coast. As we are now learning, those decades-long arid stretches were just as regularly interrupted by enormous storms—many even grander than the one that began in December 1861. (Indeed, that event itself was directly preceded and followed by serious droughts.) In other words, the same patterns that make California vulnerable to droughts also make it ripe for floods.

Beginning in the 1980s, scientists including B. Lynn Ingram began examining streams and banks in the enormous delta network that together serve as the bathtub drain through which most Central Valley runoff has flowed for millennia, reaching the ocean at the San Francisco Bay. (Now-vanished Tulare Lake gathered runoff in the southern part of the valley.) They took deep-core samples from river bottoms, because big storms that overflow the delta’s banks transfer loads of soil and silt from the Sierra Nevada and deposit a portion of it in the Delta. They also looked at fluctuations in old plant material buried in the sediment layers. Plant species that thrive in freshwater suggest wet periods, as heavy runoff from the mountains crowds out seawater. Salt-tolerant species denote dry spells, as sparse mountain runoff allows seawater to work into the delta.

What they found was stunning. The Great Flood of 1862 was no one-off black-swan event. Summarizing the science, Ingram and USGS researcher Michael Dettinger deliver the dire news: A flood comparable to—and sometimes much more intense than—the 1861–1862 catastrophe occurred during each of the periods 1235–1360, 1395–1410, 1555–1615, 1750–1770, and 1810–1820; “that is, one megaflood every 100 to 200 years.” They also discovered that the 1862 flood didn’t appear in the sediment record in some sites that showed evidence of multiple massive events—suggesting that it was actually smaller than many of the floods that have inundated California over the centuries.

During its time as a US food-production powerhouse, California has been known for its periodic droughts and storms. But Ingram and Dettinger’s work pulls the lens back to view the broader timescale, revealing the region’s swings between megadroughts and megastorms—ones more than severe enough to challenge concentrated food production, much less dense population centers.

The dynamics of these storms themselves explain why the state is also prone to such swings. Meteorologists have known for decades that those tempests that descend upon California over the winter—and from which the state receives the great bulk of its annual precipitation—carry moisture from the South Pacific. In the late 1990s, scientists discovered that these “pineapple expresses,” as TV weather presenters call them, are a subset of a global weather phenomenon: long, wind-driven plumes of vapor about a mile above the sea that carry moisture from warm areas near the equator on a northeasterly path to colder, drier regions toward the poles. They carry so much moisture—often more than 25 times the flow of the Mississippi River, over thousands of miles—that they’ve been dubbed “atmospheric rivers.”

In a pioneering 1998 paper, researchers Yong Zhu and Reginald E. Newell found that nearly all the vapor transport between the subtropics (regions just south or north of the equator, depending on the hemisphere) toward the poles occurred in just five or six narrow bands. And California, it turns out, is the prime spot in the western side of the northern hemisphere for catching them at full force during the winter months.

As Ingram and Dettinger note, atmospheric rivers are the primary vector for California’s floods. That includes pre-Columbian cataclysms as well as the Great Flood of 1862, all the way to the various smaller ones that regularly run through the state. Between 1950 and 2010, Ingram and Dettinger write, atmospheric rivers “caused more than 80 percent of flooding in California rivers and 81 percent of the 128 most well-documented levee breaks in California’s Central Valley.”

Paradoxically, they are at least as much a lifeblood as a curse. Between eight and 11 atmospheric rivers hit California every year, the great majority of them doing no major damage, and they deliver between 30 and 50 percent of the state’s rain and snow. But the big ones are damaging indeed. Other researchers are reaching similar conclusions. In a study released in December 2019, a team from the US Army Corps of Engineers and the Scripps Institution of Oceanography found that atmospheric-river storms accounted for 84 percent of insured flood damages in the western United States between 1978 and 2017; the 13 biggest storms wrought more than half the damage.

So the state—and a substantial portion of our food system—exists on a razor’s edge between droughts and floods, its annual water resources decided by massive, increasingly fickle transfers of moisture from the South Pacific. As Dettinger puts it, the “largest storms in California’s precipitation regime not only typically end the state’s frequent droughts, but their fluctuations also cause those droughts in the first place.”

We know that before human civilization began spewing millions of tons of greenhouse gases into the atmosphere annually, California was due “one megaflood every 100 to 200 years”—and the last one hit more than a century and a half ago. What happens to this outlook when you heat up the atmosphere by 1 degree Celsius—and are on track to hit at least another half-degree Celsius increase by midcentury?

That was the question posed by Daniel Swain and a team of researchers at UCLA’s Department of Atmospheric and Oceanic Sciences in a series of studies, the first of which was published in 2018. They took California’s long pattern of droughts and floods and mapped it onto the climate models based on data specific to the region, looking out to century’s end.

What they found isn’t comforting. As the tropical Pacific Ocean and the atmosphere just above it warm, more seawater evaporates, feeding ever bigger atmospheric rivers gushing toward the California coast. As a result, the potential for storms on the scale of the ones that triggered the Great Flood has increased “more than threefold,” they found. So an event expected to happen on average every 200 years will now happen every 65 or so. It is “more likely than not we will see one by 2060,” and it could plausibly happen again before century’s end, they concluded.

As the risk of a catastrophic event increases, so will the frequency of what they call “precipitation whiplash”: extremely wet seasons interrupted by extremely dry ones, and vice versa. The winter of 2016–2017 provides a template. That year, a series of atmospheric-river storms filled reservoirs and at one point threatened a major flood in the northern Central Valley, abruptly ending the worst multiyear drought in the state’s recorded history.

Swings on that magnitude normally occur a handful of times each century, but in the model by Swain’s team, “it goes from something that happens maybe once in a generation to something that happens two or three times,” he told me in an interview. “Setting aside a repeat of 1862, these less intense events could still seriously test the limits of our water infrastructure.” Like other efforts to map climate change onto California’s weather, this one found that drought years characterized by low winter precipitation would likely increase—in this case, by a factor of as much as two, compared with mid-20th-century patterns. But extreme-wet winter seasons, accumulating at least as much precipitation as 2016–2017, will grow even more: they could be three times as common as they were before the atmosphere began its current warming trend.

While lots of very wet years—at least the ones that don’t reach 1861–1862 levels—might sound encouraging for food production in the Central Valley, there’s a catch, Swain said. His study looked purely at precipitation, independent of whether it fell as rain or snow. A growing body of research suggests that as the climate warms, California’s precipitation mix will shift significantly in favor of rain over snow. That’s dire news for our food system, because the Central Valley’s vast irrigation networks are geared to channeling the slow, predictable melt of the snowpack into usable water for farms. Water that falls as rain is much harder to capture and bend to the slow-release needs of agriculture.

In short, California’s climate, chaotic under normal conditions, is about to get weirder and wilder. Indeed, it’s already happening.

What if an 1862-level flood, which is overdue and “more likely than not” to occur in the coming decades, were to hit present-day California?

Starting in 2008, the USGS set out to answer just that question, launching a project called the ARkStorm (for “atmospheric river 1,000 storm”) Scenario. The effort was modeled on a previous USGS push to get a grip on another looming California cataclysm: a massive earthquake along the San Andreas Fault. In 2008, USGS produced the ShakeOut Earthquake Scenario, a “detailed depiction of a hypothetical magnitude 7.8 earthquake.” The study “served as the centerpiece of the largest earthquake drill in US history, involving over five thousand emergency responders and the participation of over 5.5 million citizens,” the USGS later reported.

That same year, the agency assembled a team of 117 scientists, engineers, public-policy experts, and insurance experts to model what kind of impact a monster storm event would have on modern California.

At the time, Lucy Jones served as the chief scientist for the USGS’s Multi Hazards Demonstration Project, which oversaw both projects. A seismologist by training, Jones spent her time studying the devastations of earthquakes and convincing policy makers to invest resources into preparing for them. The ARkStorm project took her aback, she told me. The first thing she and her team did was ask, What’s the biggest flood in California we know about? “I’m a fourth-generation Californian who studies disaster risk, and I had never heard of the Great Flood of 1862,” she said. “None of us had heard of it,” she added—not even the meteorologists knew about what’s “by far the biggest disaster ever in California and the whole Southwest” over the past two centuries.

At first, the meteorologists were constrained in modeling a realistic megastorm by a lack of data; solid rainfall-gauge measures go back only a century. But after hearing about the 1862 flood, the ARkStorm team dug into research from Ingram and others for information about megastorms before US statehood and European contact. They were shocked to learn that the previous 1,800 years had about six events that were more severe than 1862, along with several more that were roughly of the same magnitude. What they found was that a massive flood is every bit as likely to strike California, and as imminent, as a massive quake.

Even with this information, modeling a massive flood proved more challenging than projecting out a massive earthquake. “We seismologists do this all the time—we create synthetic seismographs,” she said. Want to see what a quake reaching 7.8 on the Richter scale would look like along the San Andreas Fault? Easy, she said. Meteorologists, by contrast, are fixated on accurate prediction of near-future events; “creating a synthetic event wasn’t something they had ever done.” They couldn’t just re-create the 1862 event, because most of the information we have about it is piecemeal, from eyewitness accounts and sediment samples.

To get their heads around how to construct a reasonable approximation of a megastorm, the team’s meteorologists went looking for well-documented 20th-century events that could serve as a model. They settled on two: a series of big storms in 1969 that hit Southern California hardest and a 1986 cluster that did the same to the northern part of the state. To create the ARkStorm scenario, they stitched the two together. Doing so gave the researchers a rich and regionally precise trove of data to sketch out a massive Big One storm scenario.

There was one problem: While the fictional ARkStorm is indeed a massive event, it’s still significantly smaller than the one that caused the Great Flood of 1862. “Our [hypothetical storm] only had total rain for 25 days, while there were 45 days in 1861 to ’62,” Jones said. They plunged ahead anyway, for two reasons. One was that they had robust data on the two 20th-century storm events, giving disaster modelers plenty to work with. The second was that they figured a smaller-than-1862 catastrophe would help build public buy-in, by making the project hard to dismiss as an unrealistic figment of scaremongering bureaucrats.

What they found stunned them—and should stun anyone who relies on California to produce food (not to mention anyone who lives in the state). The headline number: $725 billion in damage, nearly four times what the USGS’s seismology team arrived at for its massive-quake scenario ($200 billion). For comparison, the two most costly natural disasters in modern US history—Hurricane Katrina in 2005 and Harvey in 2017—racked up $166 billion and $130 billion, respectively. The ARkStorm would “flood thousands of square miles of urban and agricultural land, result in thousands of landslides, [and] disrupt lifelines throughout the state for days or weeks,” the study reckoned. Altogether, 25 percent of the state’s buildings would be damaged.

In their model, 25 days of relentless rains overwhelm the Central Valley’s flood-control infrastructure. Then large swaths of the northern part of the Central Valley go under as much as 20 feet of water. The southern part, the San Joaquin Valley, gets off lighter; but a miles-wide band of floodwater collects in the lowest-elevation regions, ballooning out to encompass the expanse that was once the Tulare Lake bottom and stretching to the valley’s southern extreme. Most metropolitan parts of the Bay Area escape severe damage, but swaths of Los Angeles and Orange Counties experience “extensive flooding.”

As Jones stressed to me in our conversation, the ARkStorm scenario is a cautious approximation; a megastorm that matches 1862 or its relatively recent antecedents could plausibly bury the entire Central Valley underwater, northern tip to southern. As the report puts it: “Six megastorms that were more severe than 1861–1862 have occurred in California during the last 1800 years, and there is no reason to believe similar storms won’t occur again.”

A 21st-century megastorm would fall on a region quite different from gold rush–era California. For one thing, it’s much more populous. While the ARkStorm reckoning did not estimate a death toll, it warned of a “substantial loss of life” because “flood depths in some areas could realistically be on the order of 10–20 feet.”

Then there’s the transformation of farming since then. The 1862 storm drowned an estimated 200,000 head of cattle, about a quarter of the state’s entire herd. Today, the Central Valley houses nearly 4 million beef and dairy cows. While cattle continue to be an important part of the region’s farming mix, they no longer dominate it. Today the valley is increasingly given over to intensive almond, pistachio, and grape plantations, representing billions of dollars of investments in crops that take years to establish, are expected to flourish for decades, and could be wiped out by a flood.

Apart from economic losses, “the evolution of a modern society creates new risks from natural disasters,” Jones told me. She cited electric power grids, which didn’t exist in mid-19th-century California. A hundred years ago, when electrification was taking off, extended power outages caused inconveniences. Now, loss of electricity can mean death for vulnerable populations (think hospitals, nursing homes, and prisons). Another example is the intensification of farming. When a few hundred thousand cattle roamed the sparsely populated Central Valley in 1861, their drowning posed relatively limited biohazard risks, although, according to one contemporary account, in post-flood Sacramento, there were a “good many drowned hogs and cattle lying around loose in the streets.”

Today, however, several million cows are packed into massive feedlots in the southern Central Valley, their waste often concentrated in open-air liquid manure lagoons, ready to be swept away and blended into a fecal slurry. Low-lying Tulare County houses nearly 500,000 dairy cows, with 258 operations holding on average 1,800 cattle each. Mature modern dairy cows are massive creatures, weighing around 1,500 pounds each and standing nearly 5 feet tall at the front shoulder. Imagine trying to quickly move such beasts by the thousands out of the path of a flood—and the consequences of failing to do so.

A massive flood could severely pollute soil and groundwater in the Central Valley, and not just from rotting livestock carcasses and millions of tons of concentrated manure. In a 2015 paper, a team of USGS researchers tried to sum up the myriad toxic substances that would be stirred up and spread around by massive storms and floods. The cities of 160 years ago could not boast municipal wastewater facilities, which filter pathogens and pollutants in human sewage, nor municipal dumps, which concentrate often-toxic garbage. In the region’s teeming 21st-century urban areas, those vital sanitation services would become major threats. The report projects that a toxic soup of “petroleum, mercury, asbestos, persistent organic pollutants, molds, and soil-borne or sewage-borne pathogens” would spread across much of the valley, as would concentrated animal manure, fertilizer, pesticides, and other industrial chemicals.

The valley’s southernmost county, Kern, is a case study in the region’s vulnerabilities. Kern’s farmers lead the entire nation in agricultural output by dollar value, annually producing $7 billion worth of foodstuffs like almonds, grapes, citrus, pistachios, and milk. The county houses more than 156,000 dairy cows in facilities averaging 3,200 head each. That frenzy of agricultural production means loads of chemicals on hand; every year, Kern farmers use around 30 million pounds of pesticides, second only to Fresno among California counties. (Altogether, five San Joaquin Valley counties use about half of the more than 200 million pounds of pesticides applied in California.)

Kern is also one of the nation’s most prodigious oil-producing counties. Its vast array of pump jacks, many of them located in farm fields, produce 70 percent of California’s entire oil output. It’s also home to two large oil refineries. If Kern County were a state, it would be the nation’s seventh-leading oil-producing one, churning out twice as much crude as Louisiana. In a massive storm, floodwaters could pick up a substantial amount of highly toxic petroleum and byproducts. Again, in the ARkStorm scenario, Kern County gets hit hard by rain but mostly escapes the worst flooding. The real “Other Big One” might not be so kind, Jones said.

In the end, the USGS team could not estimate the level of damage that a megaflood would inflict on the Central Valley’s soil and groundwater: too many variables, too many toxins and biohazards that could be sucked into the vortex. They concluded that “flood-related environmental contamination impacts are expected to be the most widespread and substantial in lowland areas of the Central Valley, the Sacramento–San Joaquin River Delta, the San Francisco Bay area, and portions of the greater Los Angeles metroplex.”

Jones said the initial reaction to the 2011 release of the ARkStorm report among California’s policymakers and emergency managers was skepticism: “Oh, no, that’s too big—it’s impossible,” they would say. “We got lots of traction with the earthquake scenario, and when we did the big flood, nobody wanted to listen to us,” she said.

But after years of patiently informing the state’s decisionmakers that such a disaster is just as likely as a megaquake—and likely much more devastating—the word is getting out. She said the ARkStorm message probably helped prepare emergency managers for the severe storms of February 2017. That month, the massive Oroville Dam in the Sierra Nevada foothills very nearly failed, threatening to send a 30-foot-tall wall of water gushing into the northern Central Valley. As the spillway teetered on the edge of collapse, officials ordered the evacuation of 188,000 people in the communities below. The entire California National Guard was put on notice to mobilize if needed—the first such order since the 1992 Rodney King riots in Los Angeles. Although the dam ultimately held up, the Oroville incident illustrates the challenges of moving hundreds of thousands of people out of harm’s way on short notice.

The evacuation order “unleashed a flood of its own, sending tens of thousands of cars simultaneously onto undersize roads, creating hours-long backups that left residents wondering if they would get to high ground before floodwaters overtook them,” the Sacramento Bee reported. Eight hours after the evacuation, highways were still jammed with slow-moving traffic. A California Highway Patrol spokesman summed up the scene for the Bee:

Unprepared citizens who were running out of gas and their vehicles were becoming disabled in the roadway. People were utilizing the shoulder, driving the wrong way. Traffic collisions were occurring. People fearing for their lives, not abiding by the traffic laws. All combined, it created big problems. It ended up pure, mass chaos.

Even so, Jones said the evacuation went as smoothly as could be expected and likely would have saved thousands of lives if the dam had burst. “But there are some things you can’t prepare for.” Obviously, getting area residents to safety was the first priority, but animal inhabitants were vulnerable, too. If the dam had burst, she said, “I doubt they would have been able to save cattle.”

As the state’s ever-strained emergency-service agencies prepare for the Other Big One, there’s evidence other agencies are struggling to grapple with the likelihood of a megaflood. In the wake of the 2017 near-disaster at Oroville, state agencies spent more than $1 billion repairing the damaged dam and bolstering it for future storms. Just as work was being completed in fall 2018, the Federal Energy Regulatory Commission assessed the situation and found that a “probable maximum flood”—on the scale of the ARkStorm—would likely overwhelm the dam. FERC called on the state to invest in a “more robust and resilient design” to prevent a future cataclysm. The state’s Department of Water Resources responded by launching a “needs assessment” of the dam’s safety that’s due to wrap up in 2020.

Of course, in a state beset by the increasing threat of wildfires in populated areas as well as earthquakes, funds for disaster preparation are tightly stretched. All in all, Jones said, “we’re still much more prepared for a quake than a flood.” Then again, it’s hard to conceive of how we could effectively prevent a 21st century repeat of the Great Flood or how we could fully prepare for the low-lying valley that runs along the center of California like a bathtub—now packed with people, livestock, manure, crops, petrochemicals, and pesticides—to be suddenly transformed into a storm-roiled inland sea.

The aliens among us. How viruses shape the world (The Economist)

They don’t just cause pandemics

Leaders – Aug 22nd 2020 edition

HUMANS THINK of themselves as the world’s apex predators. Hence the silence of sabre-tooth tigers, the absence of moas from New Zealand and the long list of endangered megafauna. But SARS-CoV-2 shows how people can also end up as prey. Viruses have caused a litany of modern pandemics, from covid-19, to HIV/AIDS, to the influenza outbreak in 1918-20, which killed many more people than the first world war. Before that, the colonisation of the Americas by Europeans was abetted—and perhaps made possible—by epidemics of smallpox, measles and influenza brought unwittingly by the invaders, which annihilated many of the original inhabitants.

The influence of viruses on life on Earth, though, goes far beyond the past and present tragedies of a single species, however pressing they seem. Though the study of viruses began as an investigation into what appeared to be a strange subset of pathogens, recent research puts them at the heart of an explanation of the strategies of genes, both selfish and otherwise.

Viruses are unimaginably varied and ubiquitous. And it is becoming clear just how much they have shaped the evolution of all organisms since the very beginnings of life. In this, they demonstrate the blind, pitiless power of natural selection at its most dramatic. And—for one group of brainy bipedal mammals that viruses helped create—they also present a heady mix of threat and opportunity.

As our essay in this week’s issue explains, viruses are best thought of as packages of genetic material that exploit another organism’s metabolism in order to reproduce. They are parasites of the purest kind: they borrow everything from the host except the genetic code that makes them what they are. They strip down life itself to the bare essentials of information and its replication. If the abundance of viruses is anything to go by, that is a very successful strategy indeed.

The world is teeming with them. One analysis of seawater found 200,000 different viral species, and it was not setting out to be comprehensive. Other research suggests that a single litre of seawater may contain more than 100bn virus particles, and a kilo of dried soil ten times that number. Altogether, according to calculations on the back of a very big envelope, the world might contain 10³¹ of the things (that is, one followed by 31 zeros), far outnumbering all other forms of life on the planet.

As far as anyone can tell, viruses—often of many different sorts—have adapted to attack every organism that exists. One reason they are powerhouses of evolution is that they oversee a relentless and prodigious slaughter, mutating as they do so. This is particularly clear in the oceans, where a fifth of single-celled plankton are killed by viruses every day. Ecologically, this promotes diversity by scything down abundant species, thus making room for rarer ones. The more common an organism, the more likely it is that a local plague of viruses specialised to attack it will develop, and so keep it in check.

This propensity to cause plagues is also a powerful evolutionary stimulus for prey to develop defences, and these defences sometimes have wider consequences. For example, one explanation for why a cell may deliberately destroy itself is if its sacrifice lowers the viral load on closely related cells nearby. That way, its genes, copied in neighbouring cells, are more likely to survive. It so happens that such altruistic suicide is a prerequisite for cells to come together and form complex organisms, such as pea plants, mushrooms and human beings.

The other reason viruses are engines of evolution is that they are transport mechanisms for genetic information. Some viral genomes end up integrated into the cells of their hosts, where they can be passed down to those organisms’ descendants. Between 8% and 25% of the human genome seems to have such viral origins. But the viruses themselves can in turn be hijacked, and their genes turned to new uses. For example, the ability of mammals to bear live young is a consequence of a viral gene being modified to permit the formation of placentas. And even human brains may owe their development in part to the movement within them of virus-like elements that create genetic differences between neurons within a single organism.

Evolution’s most enthralling insight is that breathtaking complexity can emerge from the sustained, implacable and nihilistic competition within and between organisms. The fact that the blind watchmaker has equipped you with the capacity to read and understand these words is in part a response to the actions of swarms of tiny, attacking replicators that have been going on, probably, since life first emerged on Earth around 4bn years ago. It is a startling example of that principle in action—and viruses have not finished yet.

Humanity’s unique, virus-chiselled consciousness opens up new avenues to deal with the viral threat and to exploit it. This starts with the miracle of vaccination, which defends against a pathogenic attack before it is launched. Thanks to vaccines, smallpox is no more, having taken some 300m lives in the 20th century. Polio will one day surely follow. New research prompted by the covid-19 pandemic will enhance the power to examine the viral realm and the best responses to it that bodies can muster—taking the defence against viruses to a new level.

Another avenue for progress lies in the tools for manipulating organisms that will come from an understanding of viruses and the defences against them. Early versions of genetic engineering relied on restriction enzymes—molecular scissors with which bacteria cut up viral genes and which biotechnologists employ to move genes around. The latest iteration of biotechnology, gene editing letter by letter, which is known as CRISPR, makes use of a more precise antiviral mechanism.

From the smallest beginnings

The natural world is not kind. A virus-free existence is an impossibility so deeply unachievable that its desirability is meaningless. In any case, the marvellous diversity of life rests on viruses which, as much as they are a source of death, are also a source of richness and of change. Marvellous, too, is the prospect of a world where viruses become a source of new understanding for humans—and kill fewer of them than ever before. ■

Correction: An earlier version of this article got its maths wrong. 10³¹ is one followed by 31 zeroes, not ten followed by 31 zeroes as we first wrote. Sorry.

Counting the Lives Saved by Lockdowns—and Lost to Slow Action (The Scientist)

the-scientist.com

David Adam, July 6, 2020

On May 20, disease modelers at Columbia University posted a preprint that concluded the US could have prevented 36,000 of the 65,300 deaths that the country had suffered as a result of COVID-19 by May 3 if states had instituted social distancing measures a week earlier. In early June, Imperial College London epidemiologist Neil Ferguson, one of the UK government’s key advisers in the early stages of the pandemic, came to a similar conclusion about the UK. In evidence he presented to a parliamentary committee inquiry, Ferguson said that if the country had introduced restrictions on movement and socializing a week sooner than it did, Britain’s official death toll of 40,000 could have been halved.

On a more positive note, Ferguson and other researchers at Imperial College London published a model in Nature around the same time estimating that more than 3 million deaths had been avoided across 11 European countries as a result of the policies that were put in place.

These and other studies from recent months aim to understand how well various social-distancing measures have curbed infections, and by extension saved lives. It’s a big challenge to unravel and reliably understand all the factors at play, but experts say the research could help inform future policies. 

“It’s not just about looking retrospectively,” Jeffrey Shaman, a data scientist at Columbia University and coauthor of the preprint on US deaths, tells The Scientist. “All the places that have managed to get it under control to a certain extent are still at risk of having a rebound and a flare up. And if they don’t respond to it because they can’t motivate the political and public will to actually reinstitute control measures, then we’re going to repeat the same mistakes.”

Diving into the data

Shaman and his team used a computer model and data on how people moved around to work out how reduced contact between people could explain disease trends after the US introduced social distancing measures in mid-March. Then, the researchers looked at what would have happened if the same measures had been introduced a week earlier, and found that more than half of total infections and deaths up to May 3 would have been prevented. Starting the measures on March 1 would have prevented 83 percent of the nation’s deaths during that period, according to the model. Shaman says he is waiting to submit for publication in a peer-reviewed journal until he and his colleagues update the study with more-recent data. 
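
A crude scaling argument, not the Columbia model itself, shows why a one-week head start has such leverage. If infections are growing exponentially at rate r when measures begin, starting them Δt days earlier scales the whole subsequent trajectory, and the deaths that follow from it, by roughly exp(−rΔt):

$$ \frac{D_{\text{earlier}}}{D_{\text{actual}}} \approx e^{-r\,\Delta t}, \qquad \frac{65{,}300 - 36{,}000}{65{,}300} \approx 0.45 \;\Rightarrow\; r \approx \frac{\ln(1/0.45)}{7\ \text{days}} \approx 0.11\ \text{day}^{-1}, $$

an effective doubling time of about six days. The published model additionally accounts for the lag from infection to death, mobility data, and local saturation, so this is only a heuristic, but it captures how small delays compound during exponential growth.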

“I thought they had reasonably credible data in terms of trying to argue that the lockdowns had prevented infections,” says Daniel Sutter, an economist at Troy University. “They were training or calibrating that model using some cell phone data and foot traffic data and correlating that with lockdowns.”

Sébastien Annan-Phan, an economist at the University of California, Berkeley, undertook a similar analysis, looking at the growth rate of case numbers before and after various lockdown measures were introduced in China, South Korea, Italy, Iran, France, and the US. Because these countries instituted different combinations of social distancing measures, the team was able to estimate how well each action slowed disease spread. The most effective measure, they found, was getting people not to travel to work, while school closures had relatively little effect. “Every country is different and they implement different policies, but we can still tease out a couple of things,” says Annan-Phan.  
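
The gist of that before-and-after comparison can be sketched in a few lines. The snippet below uses entirely synthetic data and a single invented policy date; the published analysis fits a panel regression across countries and overlapping policies, so this is only an illustration of the idea.

```python
# Minimal sketch of a before/after growth-rate comparison for one policy.
# Synthetic data only; the real study uses a panel regression across countries
# and multiple overlapping policies.
import numpy as np

rng = np.random.default_rng(0)

def synthetic_cumulative_cases(days=60, policy_day=25, g_before=0.20, g_after=0.05):
    """Cumulative confirmed cases growing at g_before per day, then g_after."""
    daily_growth = np.where(np.arange(days) < policy_day, g_before, g_after)
    log_cases = np.log(100) + np.cumsum(daily_growth + rng.normal(0, 0.02, days))
    return np.exp(log_cases)

policy_day = 25
cases = synthetic_cumulative_cases(policy_day=policy_day)

growth = np.diff(np.log(cases))   # growth[k] is the log-growth from day k to day k+1
pre, post = growth[:policy_day - 1], growth[policy_day - 1:]

print(f"mean daily growth before the policy: {pre.mean():.3f}")
print(f"mean daily growth after the policy:  {post.mean():.3f}")
print(f"estimated effect on the growth rate: {post.mean() - pre.mean():+.3f}")
```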

In total, his group estimated that combined interventions prevented or delayed about 62 million confirmed cases in the six countries studied, or about 530 million total infections. The results were published in Nature in June alongside a study from a group at Imperial College London, which had compared COVID-19 cases reported in several European countries under lockdown with the worst-case scenario predicted for each of those countries by a computer model in which no such measures were taken. According to that analysis, which assumed that the effects of social distancing measures were the same from country to country, some 3.1 million deaths had been avoided.

It’s hard to argue against the broad conclusion that changing people’s behavior was beneficial, says Andrew Gelman, a statistician at Columbia University. “If people hadn’t changed their behavior, then it would have been disastrous.” 

Lockdown policies versus personal decisions to isolate

Like all hypothetical scenarios, it’s impossible to know how events would have played out if different decisions had been made. And attributing changes in people’s behavior to official lockdown policies during the pandemic is especially difficult, says Gelman. “Ultimately, we can’t say what would have happened without it, because the timing of lockdown measures correlates with when people would have gone into self-isolation anyway.” Indeed, according to a recent study of mobile phone data in the US, many people began venturing out less one to four weeks before they were officially asked to.

A report on data from Sweden, a country that did not introduce the same strict restrictions as others in Europe, seems to support that idea. It found that, compared with data from other countries, Sweden’s outcomes were no worse. “A lockdown would not have helped in terms of limiting COVID-19 infections or deaths in Sweden,” the study originally concluded. But Gernot Müller, an economist at the University of Tubingen who worked on that report, now says updated data show that original conclusion was flawed. Many Swedes took voluntary actions in the first few weeks, he says, and this masked the benefits that a lockdown would have had. But after the first month, the death rate started to rise. “It turns out that we do now see a lockdown effect,” Müller says of his group’s new, still unpublished analyses. “So lockdowns do work and we can attach a number to that: some 40 percent or 50 percent fewer deaths.”

Some critics question the assumption that such deaths have been prevented, rather than simply delayed. While it can appear to be a semantic point, the distinction between preventing and delaying infection is an important one when policymakers assess the costs and benefits of lockdown measures, Sutter says. “I think it’s a little misleading to keep saying these lockdowns have prevented death. They’ve just prevented cases from occurring so far,” he says. “There’s still the underlying vulnerability out there. People are still susceptible to get the virus and get sick at a later date.”

Shaman notes, however, that it’s really a race against the clock. It’s about “buying yourself and your population critical time to not be infected while we try to get our act together to produce an effective vaccine or therapeutic.”

See “It’s So Hard to Know Who’s Dying of COVID-19—and When”

See “The Effects of Physical Isolation on the Pandemic Quantified”

New model predicts the peaks of the COVID-19 pandemic (Science Daily)

Date: May 29, 2020

Source: Santa Fe Institute

Summary: Researchers describe a single function that accurately describes all existing available data on active COVID-19 cases and deaths — and predicts forthcoming peaks.

As of late May, COVID-19 has killed more than 325,000 people around the world. Even though the worst seems to be over for countries like China and South Korea, public health experts warn that cases and fatalities will continue to surge in many parts of the world. Understanding how the disease evolves can help these countries prepare for an expected uptick in cases.

This week in the journal Frontiers in Physics, researchers describe a single function that accurately describes all existing available data on active cases and deaths — and predicts forthcoming peaks. The tool uses q-statistics, a set of functions and probability distributions developed by Constantino Tsallis, a physicist and member of the Santa Fe Institute’s external faculty. Tsallis worked on the new model together with Ugur Tirnakli, a physicist at Ege University, in Turkey.

“The formula works in all the countries in which we have tested,” says Tsallis.

Neither physicist ever set out to model a global pandemic. But Tsallis says that when he saw the shape of published graphs representing China’s daily active cases, he recognized shapes he’d seen before — namely, in graphs he’d helped produce almost two decades ago to describe the behavior of the stock market.

“The shape was exactly the same,” he says. For the financial data, the function described probabilities of stock exchanges; for COVID-19, it described the daily number of active cases — and fatalities — as a function of time.

Modeling financial data and tracking a global pandemic may seem unrelated, but Tsallis says they have one important thing in common. “They’re both complex systems,” he says, “and in complex systems, this happens all the time.” Disparate systems from a variety of fields — biology, network theory, computer science, mathematics — often reveal patterns that follow the same basic shapes and evolution.

The financial graph appeared in a 2004 volume co-edited by Tsallis and the late Nobelist Murray Gell-Mann. Tsallis developed q-statistics, also known as “Tsallis statistics,” in the late 1980s as a generalization of Boltzmann-Gibbs statistics to complex systems.

In the new paper, Tsallis and Tirnakli used data from China, where the active case rate is thought to have peaked, to set the main parameters for the formula. Then, they applied it to other countries including France, Brazil, and the United Kingdom, and found that it matched the evolution of the active cases and fatality rates over time.
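
For a sense of what such a fit involves, the sketch below defines the Tsallis q-exponential and fits a peaked curve of the assumed form C·t^α·exp_q(−βt) to a synthetic series of active cases. The functional form and all parameter values here are illustrative assumptions; the expression used in the published paper differs in its details.

```python
# Illustrative q-exponential peak fit on synthetic data. The functional form
# below is an assumption for demonstration, not the paper's exact model.
import numpy as np
from scipy.optimize import curve_fit

def exp_q(x, q):
    """Tsallis q-exponential; reduces to the ordinary exponential as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = np.clip(1.0 + (1.0 - q) * x, 1e-12, None)
    return base ** (1.0 / (1.0 - q))

def peak_model(t, c, alpha, beta, q):
    """Active cases: a power-law rise damped by a q-exponential decay."""
    return c * t ** alpha * exp_q(-beta * t, q)

# Synthetic "active cases" over 120 days, with multiplicative noise,
# then recover the parameters by least squares.
t = np.arange(1.0, 121.0)
true_params = (5.0, 2.0, 0.10, 1.3)
rng = np.random.default_rng(1)
observed = peak_model(t, *true_params) * rng.lognormal(0.0, 0.05, t.size)

popt, _ = curve_fit(peak_model, t, observed, p0=(1.0, 1.5, 0.05, 1.1), maxfev=20000)
print("fitted (C, alpha, beta, q):", np.round(popt, 3))
print("predicted peak on day:", int(t[np.argmax(peak_model(t, *popt))]))
```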

The model, says Tsallis, could be used to create useful tools like an app that updates in real-time with new available data, and can adjust its predictions accordingly. In addition, he thinks that it could be fine-tuned to fit future outbreaks as well.

“The functional form seems to be universal,” he says. “Not just for this virus, but for the next one that might appear as well.”

Story Source:

Materials provided by Santa Fe Institute. Note: Content may be edited for style and length.

Journal Reference:

  1. Constantino Tsallis, Ugur Tirnakli. Predicting COVID-19 Peaks Around the World. Frontiers in Physics, 2020; 8 DOI: 10.3389/fphy.2020.00217

Modeling COVID-19 data must be done with extreme care (Science Daily)

Date: May 19, 2020

Source: American Institute of Physics

Summary: At the beginning of a new wave of an epidemic, extreme care should be used when extrapolating data to determine whether lockdowns are necessary, experts say.

As the infectious virus causing the COVID-19 disease began its devastating spread around the globe, an international team of scientists was alarmed by the lack of uniform approaches by various countries’ epidemiologists to respond to it.

Germany, for example, didn’t institute a full lockdown, unlike France and the U.K., and the decision in the U.S. by New York to go into a lockdown came only after the pandemic had reached an advanced stage. Data modeling to predict the numbers of likely infections varied widely by region, from very large to very small numbers, and revealed a high degree of uncertainty.

Davide Faranda, a scientist at the French National Centre for Scientific Research (CNRS), and colleagues in the U.K., Mexico, Denmark, and Japan decided to explore the origins of these uncertainties. This work is deeply personal to Faranda, whose grandfather died of COVID-19; Faranda has dedicated the work to him.

In the journal Chaos, from AIP Publishing, the group describes why modeling and extrapolating the evolution of COVID-19 outbreaks in near real time is an enormous scientific challenge that requires a deep understanding of the nonlinearities underlying the dynamics of epidemics.

Forecasting the behavior of a complex system, such as the evolution of epidemics, requires both a physical model for its evolution and a dataset of infections to initialize the model. To create a model, the team used data provided by Johns Hopkins University’s Center for Systems Science and Engineering, which is available online at https://systems.jhu.edu/research/public-health/ncov/ or https://github.com/CSSEGISandData/COVID-19.

“Our physical model is based on assuming that the total population can be divided into four groups: those who are susceptible to catching the virus, those who have contracted the virus but don’t show any symptoms, those who are infected and, finally, those who recovered or died from the virus,” Faranda said.

To determine how people move from one group to another, it’s necessary to know the infection rate, incubation time and recovery time. Actual infection data can be used to extrapolate the behavior of the epidemic with statistical models.

“Because of the uncertainties in both the parameters involved in the models — infection rate, incubation period and recovery time — and the incompleteness of infections data within different countries, extrapolations could lead to an incredibly large range of uncertain results,” Faranda said. “For example, just assuming an underestimation of the last data in the infection counts of 20% can lead to a change in total infections estimations from few thousands to few millions of individuals.”

The group also showed that this uncertainty stems both from limited data quality and from the intrinsic nature of the dynamics, which are ultrasensitive to the parameters — especially during the initial growing phase. This means that everyone should be very careful when extrapolating key quantities to decide whether to implement lockdown measures as a new wave of the virus begins.
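
A stylized two-point calculation (not the paper's method) makes the amplification concrete. Fit a growth rate from counts at times t1 and t2 and extrapolate to a horizon T:

$$ r \;=\; \frac{\ln N(t_2) - \ln N(t_1)}{t_2 - t_1}, \qquad N_{\text{proj}}(T) \;=\; N(t_2)\, e^{\,r\,(T - t_2)}. $$

If the most recent count N(t2) was in fact 20 percent too low, correcting it raises both the level and the fitted rate, and the projection changes by

$$ \frac{N^{\text{corr}}_{\text{proj}}(T)}{N_{\text{proj}}(T)} \;=\; 1.2^{\,1 + \frac{T - t_2}{t_2 - t_1}}, $$

which for counts five days apart and a 60-day horizon is about a factor of 11. A single 20 percent data revision moves the projection by an order of magnitude before any of the model's nonlinearities even enter.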

“The total final infection counts as well as the duration of the epidemic are sensitive to the data you put in,” he said.

The team’s model handles uncertainty in a natural way, so they plan to show how modeling of the post-confinement phase can be sensitive to the measures taken.

“Preliminary results show that implementing lockdown measures when infections are in a full exponential growth phase poses serious limitations for their success,” said Faranda.


Story Source:

Materials provided by American Institute of Physics. Note: Content may be edited for style and length.


Journal Reference:

  1. Davide Faranda, Isaac Pérez Castillo, Oliver Hulme, Aglaé Jezequel, Jeroen S. W. Lamb, Yuzuru Sato, Erica L. Thompson. Asymptotic estimates of SARS-CoV-2 infection counts and their sensitivity to stochastic perturbation. Chaos: An Interdisciplinary Journal of Nonlinear Science, 2020; 30 (5): 051107 DOI: 10.1063/5.0008834

Opinion | Forty Years Later, Lessons for the Pandemic From Mount St. Helens (New York Times)

nytimes.com

By Lawrence Roberts – May 17, 2020

The tensions we now face between science, politics and economics also arose before the country’s most destructive volcanic eruption.

Mr. Roberts is a former editor at ProPublica and The Washington Post.

Mount St. Helens erupted on May 18, 1980.
United Press International

When I met David A. Johnston, it was on a spring evening, about a month before he would be erased from existence by a gigantic cloud of volcanic ash boiling over him at 300 miles per hour. He was coming through the door of a makeshift command center in Vancouver, Wash., the closest city to the graceful snow-capped dome of Mount St. Helens, a volcano that had been dormant for 123 years. This was April 1980, and Mr. Johnston, a 30-year-old geologist, was one of the first scientists summoned to monitor new warning signs from the mountain — shallow earthquakes and periodic bursts of ash and steam.

As a young reporter I had talked my way into the command center. At first Mr. Johnston was wary; he wasn’t supposed to meet the press anymore. His supervisors had played down the chance that the smoking mountain was about to explode, and they had already reprimanded him for suggesting otherwise. But on this night he’d just been setting measuring equipment deep in the surrounding forest, and his runner-thin frame vibrated with excitement, his face flushed under his blond beard, and Mr. Johnston couldn’t help riffing on the likelihood of a cataclysmic event.

“My feeling is when it goes, it’s going to go just like that,” he told me, snapping his fingers. “Bang!” At best, he said, we’d have a couple of hours of warning.

Mr. Johnston was mostly right. Early on a Sunday morning several weeks later, the mountain did blow, in the most destructive eruption in U.S. history. But there was no warning. At his instrument outpost, on a ridge more than five miles from the summit, Mr. Johnston had only seconds to radio in a last message: “Vancouver! Vancouver! This is it!”

A photograph of David Johnston, who was killed when Mount St. Helens erupted.
Chris Sweda/Daily Southtown, via Associated Press

Monday, May 18, marks the 40th anniversary of the 1980 Mount St. Helens eruption, and as we now face our own struggle to gauge the uncertain risks presented by nature, to predict how bad things will get and how much and how long to protect ourselves, it may be useful to revisit the tension back then between science, politics and economics.

The drama played out on a much smaller stage — one region of one state, instead of the whole planet — but many of the same elements were present: Scientists provided a range of educated guesses, and public officials split on how to respond. Business owners and residents chafed at the restrictions put in place, many flouted them, and a few even threatened armed rebellion. In the end, the government mostly accepted the analyses of Mr. Johnston and his fellow geologists. As a result, while the eruption killed 57 people and flattened hundreds of square miles of dense Pacific Northwest forestland, the lives of hundreds, perhaps thousands, were spared.

At the first warning signs, state and federal officials moved to distance people from the mountain. They sought to block nonessential visitors from nearby Spirit Lake, ringed with scout camps and tourist lodges. Other than loggers, few people hung around the peak year-round, but the population surged in late spring and summer, when thousands hiked, camped and moved into vacation homes. Many regulars dismissed the risk. Slipping past roadblocks became a popular activity. Locals sold maps to sightseers and amateur photographers that showed how to take old logging roads up the mountain. The owner of a nearby general store shared a common opinion of the threat: “It’s just plain bull. I lived here 26 years, and nothing like this happened before.”

Like the probability of a pandemic, though, it was well-established that one of the dozen or so volcanoes in the 800-mile Cascade Range might soon turn active. Averaging two eruptions a century, they were overdue. A 1978 report by the U.S. Geological Survey, where Mr. Johnston worked, identified Mount St. Helens as most likely to blow next. Yet forecasting how big the event could be was a matter of art as well as science. Geologists could model only previous explosions and list the possible outcomes. (“That position was difficult for many to accept, because they believed we could and should make predictions,” a U.S.G.S. report said later.)

Some scientists suggested a much larger evacuation, but uncertainty, a hallmark of their discipline, can be difficult for those making real-time public policy. The guidelines from federal and state representatives camped out in Vancouver, and from Washington’s governor, Dixy Lee Ray, often seemed in conflict. Moreover, the Weyerhaeuser Company, which owned tens of thousands of acres of timber, opposed logging restrictions, even as some crews got nervous about working near the rumbling dome.

By mid-April, a bulge grew on the north flank, a clue that highly pressurized magma was trapped and expanding. If it burst, a landslide might bury Spirit Lake. The governor, a conservative Democrat who was a biologist by training, finally agreed to stronger measures. She ordered an inner “red zone” where only scientists and law enforcement personnel could enter, and a “blue zone” open to loggers and property owners with day passes. If the zones didn’t extend as far as many geologists hoped, they were certainly an improvement.

Then the mountain got deceptively quiet. The curve of seismic activity flattened and turned downward. Many grew complacent, and restless. On Saturday, May 17, people with property inside the red zone massed in cars and pickup trucks at the roadblock on State Highway 504. Hearing rumors that some carried rifles, the governor relented, allowing them through, with a police escort, to check on their homes and leave again. The state patrol chief, Robert Landon, told them, “We hope the good Lord will keep that mountain from giving us any trouble.” The property owners vowed to return the next day.

The next day was Sunday. At 8:32 a.m., a powerful quake shook loose the snow-covered north face of Mount St. Helens, releasing the superheated magma, which roared out of the mountain in a lateral blast faster than a bullet train, over the spot where Mr. Johnston stood, mowing down 230 square miles of trees, hurling trunks into the air like twigs. It rained down a suffocating storm of thick gray ash, “a burning sky-river wind of searing lava droplet hail,” as the poet Gary Snyder described it. Mudflows clogged the river valleys, setting off deadly floods. A column of ash soared 15 miles high and bloomed into a mushroom cloud 35 miles wide. Over two weeks, ash would circle the globe. Among the 57 dead were three aspiring geologists besides Mr. Johnston, as well as loggers, sightseers and photographers.

About a week later, the Forest Service took reporters up in a helicopter. I had seen the mountain from the air before the eruption. Now the sprawling green wilderness that appeared endless and permanent had disappeared in a blink. We flew for an hour over nothing but moonscape. The scientists had done their best, but nature flexed a power far more deadly than even they had imagined.

Lawrence Roberts, a former editor at ProPublica and The Washington Post, is the author of the forthcoming “Mayday 1971: A White House at War, a Revolt in the Streets, and the Untold History of America’s Biggest Mass Arrest.”

This Is the Future of the Pandemic (New York Times)

Covid-19 isn’t going away soon. Two recent studies mapped out the possible shapes of its trajectory.

Circles at Gare du Nord train station in Paris marked safe social distances on Wednesday. Credit: Ian Langsdon/EPA, via Shutterstock

By Siobhan Roberts – May 8, 2020

By now we know — contrary to false predictions — that the novel coronavirus will be with us for a rather long time.

“Exactly how long remains to be seen,” said Marc Lipsitch, an infectious disease epidemiologist at Harvard’s T.H. Chan School of Public Health. “It’s going to be a matter of managing it over months to a couple of years. It’s not a matter of getting past the peak, as some people seem to believe.”

A single round of social distancing — closing schools and workplaces, limiting the sizes of gatherings, lockdowns of varying intensities and durations — will not be sufficient in the long term.

In the interest of managing our expectations and governing ourselves accordingly, it might be helpful, for our pandemic state of mind, to envision this predicament — existentially, at least — as a soliton wave: a wave that just keeps rolling and rolling, carrying on under its own power for a great distance.

The Scottish engineer and naval architect John Scott Russell first spotted a soliton in 1834 as it traveled along the Union Canal. He followed on horseback and, as he wrote in his “Report on Waves,” overtook it rolling along at about eight miles an hour, at thirty feet long and a foot or so in height. “Its height gradually diminished, and after a chase of one or two miles I lost it in the windings of the channel.”

The pandemic wave, similarly, will be with us for the foreseeable future before it diminishes. But, depending on one’s geographic location and the policies in place, it will exhibit variegated dimensions and dynamics traveling through time and space.

“There is an analogy between weather forecasting and disease modeling,” Dr. Lipsitch said. Both, he noted, are simple mathematical descriptions of how a system works: drawing upon physics and chemistry in the case of meteorology; and on behavior, virology and epidemiology in the case of infectious-disease modeling. Of course, he said, “we can’t change the weather.” But we can change the course of the pandemic — with our behavior, by balancing and coordinating psychological, sociological, economic and political factors.

Dr. Lipsitch is a co-author of two recent analyses — one from the Center for Infectious Disease Research and Policy at the University of Minnesota, the other from the Chan School published in Science — that describe a variety of shapes the pandemic wave might take in the coming months.

The Minnesota study describes three possibilities:

Scenario No. 1 depicts an initial wave of cases — the current one — followed by a consistently bumpy ride of “peaks and valleys” that will gradually diminish over a year or two.

Scenario No. 2 supposes that the current wave will be followed by a larger “fall peak,” or perhaps a winter peak, with subsequent smaller waves thereafter, similar to what transpired during the 1918-1919 flu pandemic.

Scenario No. 3 shows an intense spring peak followed by a “slow burn” with less-pronounced ups and downs.

The authors conclude that whichever reality materializes (assuming ongoing mitigation measures, as we await a vaccine), “we must be prepared for at least another 18 to 24 months of significant Covid-19 activity, with hot spots popping up periodically in diverse geographic areas.”

In the Science paper, the Harvard team — infectious-disease epidemiologist Yonatan Grad, his postdoctoral fellow Stephen Kissler, Dr. Lipsitch, his doctoral student Christine Tedijanto and their colleague Edward Goldstein — took a closer look at various scenarios by simulating the transmission dynamics using the latest Covid-19 data and data from related viruses.

The authors conveyed the results in a series of graphs — composed by Dr. Kissler and Ms. Tedijanto — that project a similarly wavy future characterized by peaks and valleys.

One figure from the paper, reinterpreted below, depicts possible scenarios (the details would differ geographically) and shows the red trajectory of Covid-19 infections in response to “intermittent social distancing” regimes represented by the blue bands.

Social distancing is turned “on” when the number of Covid-19 cases reaches a certain prevalence in the population — for instance, 35 cases per 10,000, although the thresholds would be set locally, monitored with widespread testing. It is turned “off” when cases drop to a lower threshold, perhaps 5 cases per 10,000. Because critical cases that require hospitalization lag behind the general prevalence, this strategy aims to prevent the health care system from being overwhelmed.

The green graph represents the corresponding, if very gradual, increase in population immunity.

“The ‘herd immunity threshold’ in the model is 55 percent of the population, or the level of immunity that would be needed for the disease to stop spreading in the population without other measures,” Dr. Kissler said.
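To make the on/off mechanism concrete, here is a minimal simulation sketch in Python. It is not the Harvard team's model; the two thresholds, 35 and 5 cases per 10,000, are the ones quoted above, while the other values (R0 of 2.2, a ten-day infectious period, a 60 percent cut in transmission while distancing) are illustrative assumptions.

```python
# Toy SIR model with intermittent social distancing (illustrative sketch only).
# Thresholds follow the article: distancing turns on at 35 prevalent cases per
# 10,000 and off at 5 per 10,000; the remaining parameters are assumptions.
N = 10_000                       # population, so prevalence I is already "per 10,000"
beta0, gamma = 0.22, 0.10        # transmission and recovery rates; R0 = beta0/gamma = 2.2
distancing_effect = 0.6          # assumed 60% cut in transmission while distancing
on_threshold, off_threshold = 35, 5

S, I, R = N - 1.0, 1.0, 0.0
distancing = False
for day in range(730):           # two simulated years
    if not distancing and I >= on_threshold:
        distancing = True        # prevalence too high: switch distancing on
    elif distancing and I <= off_threshold:
        distancing = False       # prevalence back down: switch it off
    beta = beta0 * (1 - distancing_effect) if distancing else beta0
    new_infections = beta * S * I / N
    new_recoveries = gamma * I
    S, I, R = S - new_infections, I + new_infections - new_recoveries, R + new_recoveries
    if day % 60 == 0:
        print(f"day {day:3d}  prevalence {I:6.1f}/10k  immune {100 * R / N:5.1f}%  distancing={distancing}")
```

With these assumptions the prevalence curve oscillates between the two thresholds while the immune share climbs only slowly toward the roughly 55 percent herd immunity level implied by an R0 of 2.2, which is the gradual green curve described above.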

Another iteration shows the effects of seasonality — a slower spread of the virus during warmer months. Theoretically, seasonal effects allow for larger intervals between periods of social distancing.

This year, however, the seasonal effects will likely be minimal, since a large proportion of the population will still be susceptible to the virus come summer. And there are other unknowns, since the underlying mechanisms of seasonality — such as temperature, humidity and school schedules — have been studied for some respiratory infections, like influenza, but not for coronaviruses. So, alas, we cannot depend on seasonality alone to stave off another outbreak over the coming summer months.

Yet another scenario takes into account not only seasonality but also a doubling of the critical-care capacity in hospitals. This, in turn, allows for social distancing to kick in at a higher threshold — say, at a prevalence of 70 cases per 10,000 — and for even longer breaks between social distancing periods:

What is clear overall is that a one-time social distancing effort will not be sufficient to control the epidemic in the long term, and that it will take a long time to reach herd immunity.

“This is because when we are successful in doing social distancing — so that we don’t overwhelm the health care system — fewer people get the infection, which is exactly the goal,” said Ms. Tedijanto. “But if infection leads to immunity, successful social distancing also means that more people remain susceptible to the disease. As a result, once we lift the social distancing measures, the virus will quite possibly spread again as easily as it did before the lockdowns.”

So, lacking a vaccine, our pandemic state of mind may persist well into 2021 or 2022 — which surprised even the experts.

“We anticipated a prolonged period of social distancing would be necessary, but didn’t initially realize that it could be this long,” Dr. Kissler said.

Claudio Maierovitch Pessanha Henriques: The myth of the peak (Folha de S.Paulo)

www1.folha.uol.com.br

Claudio Maierovitch Pessanha Henriques – May 6, 2020

Since the start of the epidemic caused by the new coronavirus (Covid-19), the big question has been "when will it end?" The media and social networks frequently circulate all manner of projections of the famous disease curve, for individual countries and for the world, some of them recent, suggesting that new cases will stop appearing early in the second half of this year.

Such models assume that there is a story, a natural curve of the disease, which begins, rises, reaches a peak and then starts to fall. Let us examine the logic of that reasoning. Many acute transmissible diseases, when they reach a new population, spread rapidly, at a speed that depends on their so-called basic reproduction number, or R0 ("R zero", which estimates how many people each carrier of an infectious agent passes it on to).

Once a large number of people have fallen ill or been infected, even without symptoms, contacts between carriers and people who have not had the disease become rare. In a scenario where survivors of the infection become immune to that agent, their share of the population grows and transmission becomes ever rarer. The curve that had been rising then flattens and begins to fall, and may even reach zero, at which point the agent stops circulating.

In large populations it is very rare for a disease to be completely eliminated in this way, which is why incidence rises again from time to time. When the number of people who were never infected, plus newborn babies and non-immune people arriving from elsewhere, becomes large enough, the curve climbs once more.

This, in simplified form, is how science understands the periodic occurrence of epidemics of acute infectious diseases. History offers numerous examples: smallpox, measles, influenza, rubella, polio and mumps, among many others. Depending on the characteristics of the disease and of the society, these are cycles marked by suffering, sequelae and deaths. In such cases it really is possible to estimate how long an epidemic will last and, sometimes, even to predict the next ones.

Public health has a range of tools to intervene in many of these cases, suited to different mechanisms of transmission: sanitation, hygiene measures, isolation, vector control, condom use, elimination of sources of contamination, vaccines and treatments capable of eliminating the microorganisms. Vaccination, the specific health measure considered most effective, simulates what happens naturally, increasing the number of immune people in the population until the disease stops circulating, without anyone having to fall ill for that to happen.

For Covid-19, estimates suggest that the disease will only stop circulating intensely once about 70% of the population has been infected. This is called collective immunity (the unpleasant term "herd immunity" is also used). As for the current spread of the Sars-CoV-2 coronavirus, the World Health Organization (WHO) calculates that by mid-April only 2% to 3% of the world's population will have been infected. Estimates for Brazil are slightly below that average.

In plain terms, for the disease to reach its peak naturally in Brazil and begin to fall, we would have to wait for 140 million people to become infected. The most conservative (lowest) fatality rate found in the Covid-19 literature is 0.36%, roughly one twentieth of the rate implied by the official counts of cases and deaths. This means that by the time Brazil reaches the peak we will have counted 500,000 deaths if the health system is not pushed beyond its limits; if it is, the number will be far higher.
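The column's arithmetic is easy to verify. A back-of-the-envelope sketch, assuming a Brazilian population of roughly 210 million (a figure not stated in the text, used here only for illustration):

```python
# Rough check of the column's figures; the population value is an assumption.
population = 210_000_000          # Brazil, approximately, in 2020
herd_immunity_share = 0.70        # share infected before the curve falls on its own
infection_fatality_rate = 0.0036  # most conservative rate cited (0.36%)

infected_at_peak = population * herd_immunity_share
deaths_at_peak = infected_at_peak * infection_fatality_rate
print(f"infections needed: about {infected_at_peak / 1e6:.0f} million")  # ~147 million
print(f"deaths implied:    about {deaths_at_peak / 1e3:.0f} thousand")   # ~530 thousand
```

The result lands near the round numbers in the text, roughly 140 million infections and half a million deaths. For reference, a 70 percent threshold corresponds, via the standard relation 1 - 1/R0, to an R0 of about 3.3.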

Reaching the peak is synonymous with catastrophe. It is not an acceptable bet, above all when we can already see that hospital capacity has been exhausted in several cities, such as Manaus, Rio de Janeiro and Fortaleza, with others following the same path.

The only acceptable prospect is to avoid the peak, and the only way to do so is through rigorous physical distancing measures. The quota of contacts between people must be reserved for essential activities, among them health care, security, the supply chains for fuel, food, cleaning products and health materials and equipment, plus cleaning, maintenance and a few other sectors. A dose of creativity may allow that range to be widened a little, provided that public transport and public spaces remain empty enough for the minimum distance between people to be maintained.

Monitoring of case and death counts, which reveals transmission with a lag of two to three weeks, will have to be improved and used together with studies based on laboratory testing to calibrate how strict the isolation measures need to be.

If we manage to avoid the greater tragedy, we will live with a long period of restricted activity, more than a year, and we will have to learn to organize life and the economy in other ways, as well as go through a few periods of lockdown, roughly two weeks each, whenever the curve points back toward the peak.

Today the situation is serious and tends toward critical. Brazil is the country with the highest transmission rate of the disease; it is time to stay home and, if going out is unavoidable, to make the mask an inseparable part of one's clothing and to maintain all the recommended precautions rigorously.

Not quite all there. The 90% economy that lockdowns will leave behind (The Economist)

It will not just be smaller, it will feel strange

Briefing | Apr 30th 2020 edition


IN THE 1970s Mori Masahiro, a professor at the Tokyo Institute of Technology, observed that there was something disturbing about robots which looked almost, but not quite, like people. Representations in this “uncanny valley” are close enough to lifelike for their shortfalls and divergences from the familiar to be particularly disconcerting. Today’s Chinese economy is exploring a similarly unnerving new terrain. And the rest of the world is following in its uncertain steps.

Whatever the drawbacks of these new lowlands, they are assuredly preferable to the abyss of lockdown. Measures taken to reverse the trajectory of the pandemic around the world have brought with them remarkable economic losses.

Not all sectors of the economy have done terribly. New subscriptions to Netflix increased at twice their usual rate in the first quarter of 2020, with most of that growth coming in March. In America, the sudden stop of revenue from Uber’s ride-sharing service in March and April has been partially cushioned by the 25% increase of sales from its food-delivery unit, according to 7Park Data, a data provider.

Yet the general pattern is grim. Data from Womply, a firm which processes transactions on behalf of 450,000 small businesses across America, show that businesses in all sectors have lost substantial revenue. Restaurants, bars and recreational businesses have been badly hit: revenues have declined some two-thirds since March 15th. Travel and tourism may suffer the worst losses. In the EU, where tourism accounts for some 4% of GDP, the number of people travelling by plane fell from 5m to 50,000; on April 19th less than 5% of hotel rooms in Italy and Spain were occupied.

According to calculations made on behalf of The Economist by Now-Casting Economics, a research firm that provides high-frequency economic forecasts to institutional investors, the world economy shrank by 1.3% year-on-year in the first quarter of 2020, driven by a 6.8% year-on-year decline in China’s GDP. The Federal Reserve Bank of New York draws on measures such as jobless claims to produce a weekly index of American economic output. It suggests that the country’s GDP is currently running about 12% lower than it was a year ago (see chart 1).

These figures fit with attempts by Goldman Sachs, a bank, to estimate the relationship between the severity of lockdowns and their effect on output. It finds, roughly, that an Italian-style lockdown is associated with a GDP decline of 25%. Measures to control the virus while either keeping the economy running reasonably smoothly, as in South Korea, or reopening it, as in China, are associated with a GDP reduction in the region of 10%. That chimes with data which suggest that if Americans chose to avoid person-to-person proximity of the length of an arm or less, occupations worth approximately 10% of national output would become unviable.

The “90% economy” thus created will be, by definition, smaller than that which came before. But its strangeness will be more than a matter of size. There will undoubtedly be relief, fellow feeling, and newly felt or expressed esteem for those who have worked to keep people safe. But there will also be residual fear, pervasive uncertainty, a lack of innovative fervour and deepened inequalities. The fraction of life that is missing will colour people’s experience and behaviour in ways that will not be offset by the happy fact that most of what matters is still available and ticking over. In a world where the office is open but the pub is not, qualitative differences in the way life feels will be at least as significant as the drop in output.

The plight of the pub demonstrates that the 90% economy will not be something that can be fixed by fiat. Allowing pubs—and other places of social pleasure—to open counts for little if people do not want to visit them. Many people will have to leave the home in order to work, but they may well feel less comfortable doing so to have a good time. A poll by YouGov on behalf of The Economist finds that over a third of Americans think it will be “several months” before it will be safe to reopen businesses as normal—which suggests that if businesses do reopen some, at least, may stay away.

Ain’t nothing but tired

Some indication that the spending effects of a lockdown will persist even after it is over comes from Sweden. Research by Niels Johannesen of Copenhagen University and colleagues finds that aggregate-spending patterns in Sweden and Denmark over the past months look similarly reduced, even though Denmark has had a pretty strict lockdown while official Swedish provisions have been exceptionally relaxed. This suggests that personal choice, rather than government policy, is the biggest factor behind the drop. And personal choices may be harder to reverse.

Discretionary spending by Chinese consumers—the sort that goes on things economists do not see as essentials—is 40% off its level a year ago. Haidilao, a hotpot chain, is seeing a bit more than three parties per table per day—an improvement, but still lower than the 4.8 registered last year, according to a report by Goldman Sachs published in mid-April. Breweries are selling 40% less beer. STR, a data-analytics firm, finds that just one-third of hotel beds in China were occupied during the week ending April 19th. Flights remain far from full (see chart 2).

This less social world is not necessarily bad news for every company. UBS, a bank, reports that a growing number of people in China say that the virus has increased their desire to buy a car—presumably in order to avoid the risk of infection on public transport. The number of passengers on Chinese underground trains is still about a third below last year’s level; surface traffic congestion is as bad now as it was then.

Wanting a car, though, will not mean being able to afford one. Drops in discretionary spending are not entirely driven by a residual desire for isolation. They also reflect the fact that some people have a lot less money in the post-lockdown world. Not all those who have lost jobs will quickly find new ones, not least because there is little demand for labour-intensive services such as leisure and hospitality. Even those in jobs will not feel secure, the Chinese experience suggests. Since late March the share of people worried about salary cuts has risen slightly, to 44%, making it their biggest concern for 2020, according to Morgan Stanley, a bank. Many are now recouping the loss of income that they suffered during the most acute phase of the crisis, or paying down debt. All this points to high saving rates in the future, reinforcing low consumption.

A 90% economy is, on one level, an astonishing achievement. Had the pandemic struck even two decades ago, only a tiny minority of people would have been able to work or satisfy their needs. Watching a performance of Beethoven on a computer, or eating a meal from a favourite restaurant at home, is not the same as the real thing—but it is not bad. The lifting of the most stringent lockdowns will also provide respite, both emotionally and physically, since the mere experience of being told what you can and cannot do is unpleasant. Yet in three main ways a 90% economy is a big step down from what came before the pandemic. It will be more fragile; it will be less innovative; and it will be more unfair.

Take fragility first. The return to a semblance of normality could be fleeting. Areas which had apparently controlled the spread of the virus, including Singapore and northern Japan, have imposed or reimposed tough restrictions in response to a rise in the growth rate of new infections. If countries which retain relatively tough social-distancing rules do better at staving off a viral comeback, other countries may feel a need to follow them (see Chaguan). With rules in flux, it will feel hard to plan weeks ahead, let alone months.

Can’t start a fire

The behaviour of the economy will be far less predictable. No one really knows for how long firms facing zero revenues, or households who are working reduced hours or not at all, will be able to survive financially. Businesses can keep going temporarily, either by burning cash or by tapping grants and credit lines set up by government—but these are unlimited neither in size nor duration. What is more, a merely illiquid firm can quickly become a truly insolvent one as its earnings stagnate while its debt commitments expand. A rise in corporate and personal bankruptcies, long after the apparently acute phase of the pandemic, seems likely, though governments are trying to forestall them. In the past fortnight bankruptcies in China started to rise relative to last year. On April 28th HSBC, one of the world’s largest banks, reported worse-than-expected results, in part because of higher credit losses.

Furthermore, the pandemic has upended norms and conventions about how economic agents behave. In Britain the share of commercial tenants who paid their rent on time fell from 90% to 60% in the first quarter of this year. A growing number of American renters are no longer paying their landlords. Other creditors are being put off, too. In America, close to 40% of business-to-business payments from firms in the spectator-sports and film industries were late in March, double the rate a year ago. Enforcing contracts has become more difficult with many courts closed and social interactions at a standstill. This is perhaps the most insidious means by which weak sectors of the economy will infect otherwise moderately healthy ones.

In an environment of uncertain property rights and unknowable income streams, potential investment projects are not just risky—they are impossible to price. A recent paper by Scott Baker of Northwestern University and colleagues suggests that economic uncertainty is at an all-time high. That may go some way to explaining the results of a weekly survey from Moody’s Analytics, a research firm, which finds that businesses’ investment intentions are substantially lower even than during the financial crisis of 2007-09. An index which measures American nonresidential construction activity 9-12 months ahead has also hit new lows.

The collapse in investment points to the second trait of the 90% economy: that it will be less innovative. The development of liberal capitalism over the past three centuries went hand in hand with a growth in the number of people exchanging ideas in public or quasi-public spaces. Access to the coffeehouse, the salon or the street protest was always a partial process, favouring some people over others. But a vibrant public sphere fosters creativity.

Innovation is not impossible in a world with less social contact. There is more than one company founded in a garage now worth $1trn. During lockdowns, companies have had to innovate quickly—just look at how many firms have turned their hand to making ventilators, if with mixed success. A handful of firms claim that working from home is so productive that their offices will stay closed for good.

Yet these productivity bonuses look likely to be heavily outweighed by drawbacks. Studies suggest the benefits of working from home only materialise if employees can frequently check in at an office in order to solve problems. Planning new projects is especially difficult. Anyone who has tried to bounce ideas around on Zoom or Skype knows that spontaneity is hard. People are often using bad equipment with poor connections. Nick Bloom of Stanford University, one of the few economists to have studied working from home closely, reckons that there will be a sharp decline in patent applications in 2021.

Cities have proven particularly fertile ground for innovations which drive long-run growth. If Geoffrey West, a physicist who studies complex systems, is right to suggest that doubling a city’s population leads to all concerned becoming on aggregate 15% richer, then the emptying-out of urban areas is bad news. MoveBuddha, a relocation website, says that searches for places in New York City’s suburbs are up almost 250% compared with this time last year. A paper from New York University suggests that richer, and thus presumably more educated, New Yorkers—people from whom a disproportionate share of ideas may flow—are particularly likely to have left during the epidemic.

Something happening somewhere

Wherever or however people end up working, the experience of living in a pandemic is not conducive to creative thought. How many people entered lockdown with a determination to immerse themselves in Proust or George Eliot, only to find themselves slumped in front of “Tiger King”? When mental capacity is taken up by worries about whether or not to touch that door handle or whether or not to believe the results of the latest study on the virus, focusing is difficult. Women are more likely to take care of home-schooling and entertainment of bored children (see article), meaning their careers suffer more than men’s. Already, research by Tatyana Deryugina, Olga Shurchkov and Jenna Stearns, three economists, finds that the productivity of female economists, as measured by production of research papers, has fallen relative to male ones since the pandemic began.

The growing gender divide in productivity points to the final big problem with the 90% economy: that it is unfair. Liberally regulated economies operating at full capacity tend to have unemployment rates of 4-5%, in part because there will always be people temporarily unemployed as they move from one job to another. The new normal will have higher joblessness. This is not just because GDP will be lower; the decline in output will be particularly concentrated in labour-intensive industries such as leisure and hospitality, reducing employment disproportionately. America’s current unemployment rate, real-time data suggest, is between 15% and 20%.

The lost jobs tended to pay badly, and were more likely to be performed by the young, women and immigrants. Research by Abi Adams-Prassl of Oxford University and colleagues finds that an American who normally earns less than $20,000 a year is twice as likely to have lost their job due to the pandemic as one earning $80,000-plus. Many of those unlucky people do not have the skills, nor the technology, that would enable them to work from home or to retrain for other jobs.

The longer the 90% economy endures, the more such inequalities will deepen. People who already enjoy strong professional networks—largely, those of middle age and higher—may actually quite enjoy the experience of working from home. Notwithstanding the problems of bad internet and irritating children, it may be quite pleasant to chair fewer meetings or performance reviews. Junior folk, even if they make it into an office, will miss out on the expertise and guidance of their seniors. Others with poor professional networks, such as the young or recently arrived immigrants, may find it difficult or impossible to strengthen them, hindering upward mobility, points out Tyler Cowen of George Mason University.

The world economy that went into retreat in March as covid-19 threatened lives was one that looked sound and strong. And the biomedical community is currently working overtime to produce a vaccine that will allow the world to be restored to its full capacity. But estimates suggest that this will take at least another 12 months—and, as with the prospects of the global economy, that figure is highly uncertain. If the adage that it takes two months to form a habit holds, the economy that re-emerges will be fundamentally different.

Crises are no excuse for lowering scientific standards, say ethicists (Science News)

Date: April 23, 2020

Source: Carnegie Mellon University

Summary: Ethicists are calling on the global research community to resist treating the urgency of the current COVID-19 outbreak as grounds for making exceptions to rigorous research standards in pursuit of treatments and vaccines.

Ethicists from Carnegie Mellon and McGill universities are calling on the global research community to resist treating the urgency of the current COVID-19 outbreak as grounds for making exceptions to rigorous research standards in pursuit of treatments and vaccines.

With hundreds of clinical studies registered on ClinicalTrials.gov, Alex John London, the Clara L. West Professor of Ethics and Philosophy and director of the Center for Ethics and Policy at Carnegie Mellon, and Jonathan Kimmelman, James McGill Professor and director of the Biomedical Ethics Unit at McGill University, caution that urgency should not be used as an excuse for lowering scientific standards. They argue that many of the deficiencies in the way medical research is conducted under normal circumstances seem to be amplified in this pandemic. Their paper, published online April 23 by the journal Science, provides recommendations for conducting clinical research during times of crises.

“Although crises present major logistical and practical challenges, the moral mission of research remains the same: to reduce uncertainty and enable care givers, health systems and policy makers to better address individual and public health,” London and Kimmelman said.

Many of the first studies out of the gate in this pandemic have been poorly designed, not well justified, or reported in a biased manner. The deluge of studies registered in their wake threaten to duplicate efforts, concentrate resources on strategies that have received outsized media attention and increase the potential of generating false positive results purely by chance.

“All crises present exceptional situations in terms of the challenges they pose to health and welfare. But the idea that crises present an exception to the challenges of evaluating the effects of drugs and vaccines is a mistake,” London and Kimmelman said. “Rather than generating permission to carry out low-quality investigations, the urgency and scarcity of pandemics heighten the responsibility of key actors in the research enterprise to coordinate their activities to uphold the standards necessary to advance this mission.”

The ethicists provide recommendations for multiple stakeholder groups involved in clinical trials:

  • Sponsors, research consortia and health agencies should prioritize research approaches that test multiple treatments side by side. The authors argue that “master protocols” enable multiple treatments to be tested under a common statistical framework.
  • Individual clinicians should avoid off-label use of unvalidated interventions that might interfere with trial recruitment and resist the urge to carry out small studies with no control groups. Instead, they should seek out opportunities to join larger, carefully orchestrated studies.
  • Regulatory agencies and public health authorities should play a leading role in identifying studies that meet rigorous standards and in fostering collaboration among a sufficient number of centers to ensure adequate recruitment and timely results. Rather than making public recommendations about interventions whose clinical merits remain to be established, health authorities can point stakeholders to recruitment milestones to elevate the profile and progress of high-quality studies.

“Rigorous research practices can’t eliminate all uncertainty from medicine,” London and Kimmelman said, “but they can represent the most efficient way to clarify the causal relationships clinicians hope to exploit in decisions with momentous consequences for patients and health systems.”

‘There is no absolute truth’: an infectious disease expert on Covid-19, misinformation and ‘bullshit’ (The Guardian)

theguardian.com

Carl Bergstrom’s two disparate areas of expertise merged as reports of a mysterious respiratory illness emerged in January

‘Just because the trend that you see is consistent with a story that someone’s selling, inferring causality is dangerous.’ Photograph: Matthew Horwood/Alamy Stock Photo

Julia Carrie Wong, Tue 28 Apr 2020 11.00 BST

Carl Bergstrom is uniquely suited to understanding the current moment. A professor of biology at the University of Washington, he has spent his career studying two seemingly disparate topics: emerging infectious diseases and networked misinformation. They merged into one the moment reports of a mysterious respiratory illness emerged from China in January.

The coronavirus touched off both a pandemic and an “infodemic” of hoaxes, conspiracy theories, honest misunderstandings and politicized scientific debates. Bergstrom has jumped into the fray, helping the public and the press navigate the world of epidemiological models, statistical uncertainty and the topic of his forthcoming book: bullshit.

The following interview has been edited for length and clarity.

You’ve been teaching a course and have co-written a book about the concept of bullshit. Explain what you mean by bullshit.

The formal definition that we use is “language, statistical figures, data, graphics and other forms of presentation that are intended to persuade by impressing and overwhelming a reader or listener with a blatant disregard for truth or logical coherence”.

The idea with bullshit is that it’s trying to appear authoritative and definitive in a way that’s not about communicating accurately and informing a reader, but rather by overwhelming them, persuading them, impressing them. If that’s done without any allegiance to truth, or accuracy, that becomes bullshit.

We’re all used to verbal bullshit. We’re all used to campaign promises and weasel words, and we’re pretty good at seeing through that because we’ve had a lot of practice. But as the world has become increasingly quantified and the currency of arguments has become statistics, facts and figures and models and such, we’re increasingly confronted, even in the popular press, with numerical and statistical arguments. And this area’s really ripe for bullshit, because people don’t feel qualified to question information that’s given to them in quantitative form.

Are there bullshit narratives about the coronavirus that you are concerned about right now?

What’s happened with this pandemic that we’re not accustomed to in the epidemiology community is that it’s been really heavily politicized. Even when scientists are very well-intentioned and not trying to support any side of the narrative, when they do work and release a paper it gets picked up by actors with political agendas.

Whether it’s talking about seroprevalence or estimating the chance that this is even going to come to the United States at all, each study gets picked up and placed into this little political box and sort of used as a cudgel to beat the other side with.

So even when the material isn’t being produced as bullshit, it’s being picked up and used in the service of that by overstating its claims, by cherry-picking the information that’s out there and so on. And I think that’s kind of the biggest problem that we’re facing.

One example [of intentional bullshit] might be this insistence for a while on graphing the number of cases on a per-capita basis, so that people could say the US response is so much better than the rest of the world because we have a slower rate of growth per capita. That was basically graphical malfeasance or bullshit. When a wildfire starts spreading, you’re interested in how it’s spreading now, not whether it’s spreading in a 100-acre wood or millions of square miles of national forest.
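A tiny numerical sketch, with entirely hypothetical figures, illustrates the point: per-capita normalization changes the level of a curve, not its growth rate, so during exponential spread it can make a large country look better than a small one even when the fires are spreading identically.

```python
# Two hypothetical outbreaks with identical case counts and identical 25% daily
# growth; only the population size differs. All numbers are made up.
cases_small = cases_large = 1_000.0
pop_small, pop_large = 10_000_000, 330_000_000
growth = 1.25

for day in range(5):
    per_capita_small = 100_000 * cases_small / pop_small
    per_capita_large = 100_000 * cases_large / pop_large
    print(f"day {day}: {per_capita_small:6.2f} vs {per_capita_large:6.2f} cases per 100k "
          f"(raw counts identical: {cases_small:,.0f})")
    cases_small *= growth
    cases_large *= growth
```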

Is there one big lesson that you think that the media should keep in mind as we communicate science to the public? What mistakes are we making?

I think the media has been adjusting really fast and doing really well. When I’m talking about how to avoid misinformation around this I’m constantly telling people to trust the professional fact-based media. Rather than looking for the latest rumor that’s spreading across Facebook or Twitter so that you can have information up to the hour, recognize that it’s much better to have solidly sourced, well-vetted information from yesterday.

Hyper-partisan media are making a huge mess of this, but that’s on purpose. They’ve got a reason to promote hydroxychloroquine or whatever it is and just run with that. They’re not even trying to be responsible.

But one of the biggest things that people [in the media] could do to improve would be to recognize that scientific studies, especially in a fast-moving situation like this, are provisional. That’s the nature of science. Anything can be corrected. There’s no absolute truth there. Each model, each finding is just adding to a weight of evidence in one direction or another.

A lot of the reporting is focusing on models, and most of us probably don’t have any basic training in how to read them or what kind of credence to put in them. What should we know?

The key thing, and this goes for scientists as well as non-scientists, is that people are not doing a very good job thinking about what the purpose of different models are, how the purposes of different models vary, and then what the scope of their value is. When these models get treated as if they’re oracles, then people both over-rely on them and treat them too seriously – and then turn around and slam them too hard for not being perfect at everything.

Are there mistakes that are made by people in the scientific community when it comes to communicating with the public?

We’re trying to communicate as a scientific community in a new way, where people are posting their data in real time. But we weren’t ready for the degree to which that stuff would be picked up and assigned meaning in this highly politically polarized environment. Work that might be fairly easy for researchers to contextualize in the field can be portrayed as something very, very different in the popular press.

The first Imperial College model in March was predicting 1.1 million to 2.2 million American deaths if the pandemic were not controlled. That’s a really scary, dramatic story, and I still think that it’s not unrealistic. That got promoted by one side of the partisan divide. Then Imperial came back and modeled a completely different scenario, where the disease was actually brought under control and suppressed in the US, and they released a subsequent model that said, ‘If we do this, something like 50,000 deaths will occur.’ That was picked up by the other side and used to try to discredit the Imperial College team entirely by saying, ‘A couple of weeks ago they said a million, now they’re saying 50,000; they can’t get anything right.’ And the answer, of course, is that they were modeling two different scenarios.

We’re also not doing enough of deliberately stressing the possible weaknesses of our interpretations. That varies enormously from researcher to researcher and team to team.

It requires a lot of discipline to argue really hard for something but also be scrupulously open about all of the weaknesses in your own argument.

But it’s more important than ever, right? A really good paper will lay out all the most persuasive evidence it can and then in the conclusion section or the discussion section say, ‘OK, here are all the reasons that this could be wrong and here are the weaknesses.’

When you have something that’s so directly policy relevant, and there’s a lot of lives at stake, we’re learning how to find the right balance.

It is a bit of a nightmare to put out data that is truthful, but also be aware that there are bad faith actors at the moment who might pounce on it and use it in a way you didn’t intend.

There’s a spectrum. You have outright bad faith actors – Russian propaganda picking up on things and bots spreading misinformation – and then you have someone like Georgia Governor Brian Kemp, who I wouldn’t call a bad faith actor. He’s a misinformed actor.

There’s so much that goes unsaid in science in terms of context and what findings mean that we don’t usually write in papers. If someone does a mathematical forecasting model, you’re usually not going to have a half-page discussion on the limitations of forecasting. We’re used to writing for an audience of 50 people in the world, if we’re lucky, who have backgrounds that are very similar to our own and have a huge set of shared assumptions and shared knowledge. And it works really well when you’re writing on something that only 50 people in the world care about and all of them have comparable training, but it is a real mess when it becomes pressing, and I don’t think any of us have figured out exactly what to do about that because we’re also trying to work quickly and it’s important to get this information out.

One area that has already become contentious and in some ways politicized is the serology surveys, which are supposed to show what percentage of the population has antibodies to the virus. What are some of the big picture contextual caveats and limitations that we should keep in mind as these surveys come out?

The seroprevalence in the US is a political issue, and so the first thing is to recognize that when anyone is reporting on that stuff, there’s a political context to it. It may even be that some of the research is being done with an implicitly political context, depending on who the funders are or what the orientations and biases of some of the researchers are.

On the scientific side, I think there’s really two things to think about. The first one is the issue of selection bias. You’re trying to draw a conclusion about one population by sampling from a subset of that population and you want to know how close to random your subset is with respect to the thing you’re trying to measure. The Santa Clara study recruited volunteers off of Facebook. The obvious source of sampling bias there is that people desperately want to get tested. The people that want it are, of course, people that think they’ve had it.

The other big piece is understanding the notion of positive predictive value and the way false positive and false negative error rates influence the estimate. And that depends on the incidence of infection in the population.

If you have a test that has a 3% error rate, and the incidence in the population is below 3%, then most of the positives that you get are going to be false positives. And so you’re not going to get a very tight estimate about how many people have it. This has been a real problem with the Santa Clara study. From my read of the paper, their data is actually consistent with nobody being infected. A New York City study, on the other hand, showed 21% seropositive, so even with a 3% error rate, the majority of those positives have to be true positives.
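Bergstrom's point about false positives is just Bayes' rule. A small sketch makes it concrete; the 3 percent false-positive rate comes from his example, while the 90 percent sensitivity is an assumption added for illustration, not the specification of any real assay.

```python
# Positive predictive value (share of positive results that are true positives)
# for an antibody test with a 3% false-positive rate and an assumed 90% sensitivity.
sensitivity = 0.90    # assumed true-positive rate (illustrative)
specificity = 0.97    # 3% false-positive rate, as in the example above

def positive_predictive_value(prevalence):
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A low-prevalence setting (around 1.5%) versus the 21% New York City figure.
for prevalence in (0.015, 0.21):
    print(f"prevalence {prevalence:5.1%} -> {positive_predictive_value(prevalence):.0%} of positives are real")
```

With these assumptions only about a third of positives are genuine at 1.5 percent prevalence, versus roughly nine in ten at 21 percent, which is the asymmetry between the Santa Clara and New York City surveys that he describes.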

Now that we’ve all had a crash course in models and serosurveys, what are the other areas of science where it makes sense for the public to start getting educated on the terms of the debate?

One that I think will come along sooner or later is interpreting studies of treatments. We’ve dealt with that a little bit with the hydroxychloroquine business but not in any serious way because the hydroxychloroquine work has been pretty weak and the results have not been so positive.

But there are ongoing tests of a large range of existing drugs. And these studies are actually pretty hard to do. There’s a lot of subtle technical issues: what are you doing for controls? Is there a control arm at all? If not, how do you interpret the data? If there is a control arm, how is it structured? How do you control for the characteristics of the population on whom you’re using the drug or their selection biases in terms of who’s getting the drug?

Unfortunately, given what we’ve already seen with hydroxychloroquine, it’s fairly likely that this will be politicized as well. There’ll be a parallel set of issues that are going to come around with vaccination, but that’s more like a year off.

If you had the ability to arm every person with one tool – a statistical tool or scientific concept – to help them understand and contextualize scientific information as we look to the future of this pandemic, what would it be?

I would like people to understand that there are interactions between the models we make, the science we do and the way that we behave. The models that we make influence the decisions that we take individually and as a society, which then feed back into the models and the models often don’t treat that part explicitly.

Once you put a model out there that then creates changes in behavior that pull you out of the domain that the model was trying to model in the first place. We have to be very attuned to that as we try to use the models for guiding policy.

That’s very interesting, and not what I expected you to say.

What did you expect?

That correlation does not imply causation.

That’s another very good one. Seasonality is a great example there. We’re trying a whole bunch of things at the same time. We’re throwing all kinds of possible solutions at this and lots of things are changing. It’s remarkable to me actually, that so many US states are seeing the epidemic curve decrease. And so there’s a bunch of possibilities there. It could be because people’s behavior is changing. There could be some seasonality there. And there are other possible explanations as well.

But what is really important is that just because the trend that you see is consistent with a story that someone’s selling, there may be many other stories that are also consistent, so inferring causality is dangerous.

With the covid-19 pandemic, civil registries record a 43% rise in deaths of undetermined cause (Estadão)

saude.estadao.com.br

Fabiana Cambricoli, April 27, 2020

SÃO PAULO – Brazil's civil registry offices (cartórios) have recorded a 43% increase in the number of deaths of undetermined cause reported in the country since the covid-19 pandemic reached Brazilian territory. The data, previewed by Estado, will be published this Monday, the 27th, in a new panel of the Civil Registry Transparency Portal, maintained by the National Association of Registrars of Natural Persons (Arpen-Brasil). According to specialists, the increase in deaths without a defined cause may be associated with coronavirus victims who died without being diagnosed with the disease.

The increase covers the period from February 26, the date the first coronavirus infection was recorded in Brazil, to April 17; since the registries have up to ten days to forward their records to the Civil Registry Information Center (CRC Nacional), the report chose a cutoff ten days before publication.

In 2020 the country had 1,329 deaths of undetermined cause in the period mentioned. In 2019, 925 such deaths were recorded by the registries over the same interval. According to specialists, the figure may be one more sign of underreporting of coronavirus deaths in the country. With the lack of tests and the heavy demand on the health system in some regions, patients may be dying without any medical assessment.

For Fátima Marinho, a professor at the Faculty of Medicine of the Federal University of Minas Gerais (UFMG) and a member of the group of specialists that helped Arpen-Brasil build the panel, one likely reason for the rise in deaths of undefined cause is people dying of covid-19 without access to the health system. "In a situation with a new disease, a pandemic, we expect an increase in deaths at home, without the person even managing to get medical care. That may be happening now," she explains.

Broken down by age group, the increase in deaths of undetermined cause is larger among the elderly, the main risk group for coronavirus complications. The number of deaths without a defined cause among people aged 60 and over rose from 568 in 2019 to 879 in 2020, an increase of 54.8%. Among people under 60 the change was 30.5%, rising from 321 to 419 over the same interval.

Fátima says another factor that may be driving the rise in deaths of undetermined cause is a likely increase in deaths from other causes that are not reaching hospitals, whether because beds are hard to obtain in the middle of the pandemic or because patients are afraid of going to health facilities and getting infected. "We will probably see an increase in deaths from heart attacks, strokes and other problems recorded at home, because people are postponing trips to the emergency room or having to compete for beds with covid-19 patients," she says.

A jump in deaths from Severe Acute Respiratory Syndrome (SRAG)

The transparency portal maintained by Arpen-Brasil now also publishes the number of deaths from Severe Acute Respiratory Syndrome (SRAG), which rose 680% between the period of February 26 to April 17 of 2019 and the same period of 2020. The figures cover cases of this respiratory condition in which the causative agent was not specified; it could be the coronavirus, but also influenza or another respiratory virus.

According to the portal, such deaths went from 156 to 1,217 over the period. The rise in deaths from unspecified SRAG recorded at the registries would be another sign of underreporting. It is even larger in states with many cases of the disease: 1,214% in Amazonas, 3,828% in Ceará and 916% in São Paulo, the state with the largest number of infected people.
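The percentage increases quoted in this article follow directly from the raw counts it cites; a quick check reproduces all of them:

```python
# Recomputing the article's percentage increases from the raw counts it quotes.
pairs = {
    "undetermined cause, all ages":    (925, 1_329),   # 2019 vs 2020
    "undetermined cause, 60 and over": (568, 879),
    "undetermined cause, under 60":    (321, 419),
    "unspecified SRAG deaths":         (156, 1_217),
}
for label, (before, after) in pairs.items():
    print(f"{label}: {before} -> {after} (+{100 * (after - before) / before:.1f}%)")
```

These come out at roughly 43.7, 54.8, 30.5 and 680 percent, matching the figures in the text.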

Other data previously released by Arpen-Brasil already suggested that the number of coronavirus deaths in Brazil may be higher than the Health Ministry's official count. As Estado revealed on April 13, registered deaths from respiratory failure and pneumonia in Brazil jumped in March, reversing the downward trend seen in January and February. There were 2,239 more deaths in March 2020 than in the same period of 2019.

The number of suspected or confirmed covid-19 deaths recorded at the registries has also been running higher than the Health Ministry's tally (which counts only confirmed coronavirus deaths). By Monday afternoon, for example, the registries had already recorded 4,839 victims with confirmed or suspected disease, while the Ministry counted 4,543.

For Luis Carlos Vendramin Júnior, vice-president of Arpen-Brasil, making the registry data available helps in understanding the epidemic's advance. "Since these data are updated daily, we decided that expanding transparency and also publishing data on deaths from SRAG and undetermined causes, beyond what we were already releasing, will help the public authorities, the press and the population at large analyze the numbers," he said.

We Still Don’t Know How the Coronavirus Is Killing Us (The Intelligencer)

nymag.com

David Wallace-Wells, Apr. 26, 2020

Omar Rodriguez organizes bodies in the Gerard J. Neufeld funeral home in Elmhurst on April 22. Photo: Spencer Platt/Getty Images

Over the last few weeks, the country has managed to stabilize the spread of the coronavirus sufficiently to begin debating when and in what ways to “reopen,” and to normalize, against all moral logic, the horrifying and ongoing death toll — thousands of Americans dying each day, in multiples of 9/11 every week now, with the virus seemingly “under control.” The death rate is no longer accelerating, but holding steady, which is apparently the point at which an onrushing terror can begin fading into background noise. Meanwhile, the disease itself appears to be shape-shifting before our eyes.

In an acute column published April 13, the New York Times’ Charlie Warzel listed 48 basic questions that remain unanswered about the coronavirus and what must be done to protect ourselves against it, from how deadly it is to how many people caught it and shrugged it off to how long immunity to the disease lasts after infection (if any time at all). “Despite the relentless, heroic work of doctors and scientists around the world,” he wrote, “there’s so much we don’t know.” The 48 questions he listed, he was careful to point out, did not represent a comprehensive list. And those are just the coronavirus’s “known unknowns.”

In the two weeks since, we’ve gotten some clarifying information on at least a handful of Warzel’s queries. In early trials, more patients taking the Trump-hyped hydroxychloroquine died than those who didn’t, and the FDA has now issued a statement warning coronavirus patients and their doctors against using the drug. The World Health Organization got so worried about the much-touted antiviral remdesivir, which received a jolt of publicity (and stock appreciation) a few weeks ago on rumors of positive results, that the organization leaked an unpublished, preliminary survey showing no benefit to COVID-19 patients. Globally, studies have consistently found exposure levels to the virus in most populations in the low single digits — meaning dozens of times more people have gotten the coronavirus than have been diagnosed with it, though still just a tiny fraction of the number needed to achieve herd immunity. In particular hot spots, the exposure has been significantly more widespread — one survey in New York City found that 21 percent of residents may have COVID-19 antibodies already, making the city not just the deadliest community in the deadliest country in the world during the deadliest pandemic since AIDS, but also the most infected (and, by corollary, the farthest along to herd immunity). A study in Chelsea, Massachusetts, found an even higher and therefore more encouraging figure: 32 percent of those tested were found to have antibodies, which would mean, at least in that area, the disease was only a fraction as severe as it might’ve seemed at first glance, and that the community as a whole could be as much as halfway along to herd immunity. In most of the rest of the country, the picture of exposure we now have is much more dire, with much more infection almost inevitably to come.

But there is one big question that didn’t even make it onto Warzel’s list that has only gotten more mysterious in the weeks since: How is COVID-19 actually killing us?

We are now almost six months into this pandemic, which began in November in Wuhan, with 50,000 Americans dead and 200,000 more around the world. If each of those deaths is a data point, together they represent a quite large body of evidence from which to form a clear picture of the pandemic threat. Early in the epidemic, the coronavirus was seen as a variant of a familiar family of disease, not a mysterious ailment, however infectious and concerning. But while uncertainties at the population level confuse and frustrate public-health officials, unsure when and in what form to shift gears out of lockdowns, the disease has proved just as mercurial at the clinical level, with doctors revising their understanding of COVID-19’s basic pattern and weaponry — indeed often revising that understanding in different directions at once. The clinical shape of the disease, long presumed to be a relatively predictable respiratory infection, is getting less clear by the week. Lately, it seems, by the day. As Carl Zimmer, probably the country’s most respected science journalist, asked virologists in a tweet last week, “is there any other virus out there that is this weird in terms of its range of symptoms?”

You probably have a sense of the range of common symptoms, and a sense that the range isn’t that weird: fever, dry cough, and shortness of breath have been, since the beginning of the outbreak, the familiar, oft-repeated group of tell-tale signs. But while the CDC does list fever as the top symptom of COVID-19, so confidently that for weeks patients were turned away from testing sites if they didn’t have an elevated temperature, according to the Journal of the American Medical Association, as many as 70 percent of patients sick enough to be admitted to New York State’s largest hospital system did not have a fever.

Over the past few months, Boston’s Brigham and Women’s Hospital has been compiling and revising, in real time, treatment guidelines for COVID-19 which have become a trusted clearinghouse of best-practices information for doctors throughout the country. According to those guidelines, as few as 44 percent of coronavirus patients presented with a fever (though, in their meta-analysis, the uncertainty is quite high, with a range of 44 to 94 percent). Cough is more common, according to Brigham and Women’s, with between 68 percent and 83 percent of patients presenting with some cough — though that means as many as three in ten sick enough to be hospitalized won’t be coughing. As for shortness of breath, the Brigham and Women’s estimate runs as low as 11 percent. The high end is only 40 percent, which would still mean that more patients hospitalized for COVID-19 do not have shortness of breath than do. At the low end of that range, shortness of breath would be roughly as common among COVID-19 patients as confusion (9 percent), headache (8 to 14 percent), and nausea and diarrhea (3 to 17 percent). That the ranges are so wide themselves tells you that the disease is presenting in very different ways in different hospitals and different populations of different patients — leading, for instance, some doctors and scientists to theorize the virus might be attacking the immune system like HIV does, with many others finding the disease is triggering something like the opposite response, an overwhelming overreaction of the immune system called a “cytokine storm.”

The most bedeviling confusion has arisen around the relationship of the disease to breathing, lung function, and oxygenation levels in the blood — typically, for a respiratory illness, a quite predictable relationship. But for weeks now, front-line doctors have been expressing confusion that so many coronavirus patients were registering lethally low blood-oxygenation levels while still appearing, by almost any vernacular measure, pretty okay. It’s one reason they’ve begun rethinking the initial clinical focus on ventilators, which are generally recommended when patients’ oxygenation falls below a certain level, but seemed, after a few weeks, of unclear benefit to COVID-19 patients, who may have done better, doctors began to suggest, on lesser or different forms of oxygen support. For a while, ventilators were seen so much as the essential tool in treating life-threatening coronavirus that shortages (and the president’s unwillingness to invoke the Defense Production Act to manufacture them quickly) became a scandal. But by one measure 88 percent of New York patients put on ventilators, for whom an outcome was known, had died. In China, the figure was 86 percent.

On April 20 in the New York Times, an ER doctor named Richard Levitan who had been volunteering at Bellevue proposed that the phenomenon of seemingly stable patients registering lethally low oxygen levels might be explained by “silent hypoxia” — the air sacs in the lung collapsing, not getting stiff or heavy with fluid, as is the case with the pneumonias doctors had been using as models in their treatment of COVID-19. But whether this explanation is universal, limited to the patients at Bellevue, or somewhere in between is not yet entirely clear. A couple of days later, in a preprint paper that others have questioned, scientists reported finding that the ability of the disease to mutate has been “vastly underestimated” — investigating the disease as it appeared in just 11 patients, they said they found 30 mutations. “The most aggressive strains could generate 270 times as much viral load as the weakest type,” the South China Morning Post reported. “These strains also killed the cells the fastest.”

That same day, the Washington Post reported on another theory gaining traction among American doctors treating the disease — that one key could be the way COVID-19 affects the blood of patients, producing much more clotting. “Autopsies have shown that some people’s lungs are filled with hundreds of microclots,” the Post reported. “Errant blood clots of a larger size can break off and travel to the brain or heart, causing a stroke or a heart attack.”

But the bigger-picture perspective the newspaper offered is perhaps more eye-opening and to the point:

One month ago, as the country went into lockdown to prepare for the first wave of coronavirus cases, many doctors felt confident that they knew what they were dealing with. Based on early reports, covid-19 appeared to be a standard variety respiratory virus, albeit a very contagious and lethal one with no vaccine and no treatment. But they’ve since become increasingly convinced that covid-19 attacks not only the lungs, but also the kidneys, heart, intestines, liver and brain.

That is a dizzying list. But it is not even comprehensive. In a fantastic survey published April 17 (“How does coronavirus kill? Clinicians trace a ferocious rampage through the body, from brain to toes,” by Meredith Wadman, Jennifer Couzin-Frankel, Jocelyn Kaiser, and Catherine Matacic), Science magazine took a thorough, detailed tour of the ever-evolving state of understanding of the disease. “Despite the more than 1,000 papers now spilling into journals and onto preprint servers every week,” Science concluded, “a clear picture is elusive, as the virus acts like no pathogen humanity has ever seen.”

In a single illuminating chart, Science lists the following organs as being vulnerable to COVID-19: brain, eyes, nose, lungs, heart, blood vessels, liver, kidneys, intestines. That is to say, nearly every organ.

And the disparate impacts were significant ones: Heart damage was discovered in 20 percent of patients hospitalized in Wuhan, where 44 percent of those in ICU exhibited arrhythmias; 38 percent of Dutch ICU patients had irregular blood clotting; 27 percent of Wuhan patients had kidney failure, with many more showing signs of kidney damage; half of Chinese patients showed signs of liver damage; and, depending on the study, between 20 percent and 50 percent of patients had diarrhea.

On April 15, the Washington Post reported that, in New York and Wuhan, between 14 and 30 percent of ICU patients had lost kidney function, requiring dialysis. New York hospitals were treating so much kidney failure “they need more personnel who can perform dialysis and have issued an urgent call for volunteers from other parts of the country. They also are running dangerously short of the sterile fluids used to deliver that therapy.” The result, the Post said, was rationed care: patients needing 24-hour support getting considerably less. On Saturday, the paper reported that “[y]oung and middle-aged people, barely sick with COVID-19, are dying from strokes.” Many of the patients described didn’t even know they were sick:

The patient’s chart appeared unremarkable at first glance. He took no medications and had no history of chronic conditions. He had been feeling fine, hanging out at home during the lockdown like the rest of the country, when suddenly, he had trouble talking and moving the right side of his body. Imaging showed a large blockage on the left side of his head. Oxley gasped when he got to the patient’s age and covid-19 status: 44, positive.

The man was among several recent stroke patients in their 30s to 40s who were all infected with the coronavirus. The median age for that type of severe stroke is 74.

But the patient’s age wasn’t the only abnormality of the case:

As Oxley, an interventional neurologist, began the procedure to remove the clot, he observed something he had never seen before. On the monitors, the brain typically shows up as a tangle of black squiggles — “like a can of spaghetti,” he said — that provide a map of blood vessels. A clot shows up as a blank spot. As he used a needlelike device to pull out the clot, he saw new clots forming in real-time around it.

“This is crazy,” he remembers telling his boss.

These strokes, several doctors who spoke to the Post theorized, could explain the high number of patients dying at home — four times the usual rate in New York, many or most of them, perhaps, dying quite suddenly. According to the Brigham and Women’s guidelines, only 53 percent of COVID-19 patients have died from respiratory failure alone.

It’s not unheard of, of course, for a disease to express itself in complicated or hard-to-parse ways, attacking or undermining the functioning of a variety of organs. And it’s common, as researchers and doctors scramble to map the shape of a new disease, for their understanding to evolve quite quickly. But the degree to which doctors and scientists are, still, feeling their way, as though blindfolded, toward a true picture of the disease cautions against any sense that things have stabilized, given that our knowledge of the disease hasn’t even stabilized. Perhaps more importantly, it’s a reminder that the coronavirus pandemic is not just a public-health crisis but a scientific one as well. And that as deep as it may feel we are into the coronavirus, with tens of thousands dead and literally billions in precautionary lockdown, we are still in the very early stages, when each new finding seems as likely to cloud or complicate our understanding of the coronavirus as it is to clarify it. Instead, confidence gives way to uncertainty.

In the space of a few months, we’ve gone from thinking there was no “asymptomatic transmission” to believing it accounts for perhaps half or more of all cases, from thinking the young were invulnerable to thinking they were just somewhat less vulnerable, from believing masks were unnecessary to requiring their use at all times outside the house, from panicking about ventilator shortages to deploying pregnancy massage pillows instead. Six months since patient zero, we still have no drugs proven to even help treat the disease. Almost certainly, we are past the “Rare Cancer Seen in 41 Homosexuals” stage of this pandemic. But how far past?

Opinion | When Will Life Be Normal Again? We Just Don’t Know (The New York Times)

nytimes.com

By Charlie Warzel, April 13, 2020

Many Americans have been living under lockdown for a month or more. We’re all getting antsy. The president is talking about a “light at the end of the tunnel.” People are looking for hope and reasons to plan a return to something — anything — approximating normalcy. Experts are starting to speculate on what lifting restrictions will look like. Despite the relentless, heroic work of doctors and scientists around the world, there’s so much we don’t know.

We don’t know how many people have been infected with Covid-19.

We don’t know the full range of symptoms.

We don’t always know why some infections develop into severe disease.

We don’t know the full range of risk factors.

We don’t know exactly how deadly the disease is.

We don’t have answers to more detailed questions about how the virus spreads, including: “How many virus particles does it even take to launch an infection? How far does the virus travel in outdoor spaces, or in indoor settings? Have these airborne movements affected the course of the pandemic?”

We don’t know for sure how this coronavirus first emerged.

We don’t know how much China has concealed the extent of the coronavirus outbreak in that country.

We don’t know what percentage of adults are asymptomatic. Or what percentage of children are asymptomatic.

We don’t know the strength and duration of immunity. Though people who recover from Covid-19 likely have some degree of immunity for some period of time, the specifics are unknown.

We don’t yet know why some who’ve been diagnosed as “fully recovered” from the virus have tested positive a second time after leaving quarantine.

We don’t know why some recovered patients have low levels of antibodies.

We don’t know the long-term health effects of a severe Covid-19 infection. What are the consequences to the lungs of those who survive intensive care?

We don’t yet know if any treatments are truly effective. While there are many therapies in trials, there are no clinically proven therapies aside from supportive care.

We don’t know for certain if the virus was in the United States before the first documented case.

We don’t know when supply chains will strengthen to provide health care workers with enough masks, gowns and face shields to protect them.

In America, we don’t know the full extent to which black people are disproportionately suffering. Fewer than a dozen states have published data on the race and ethnic patterns of Covid-19.

We don’t know if people will continue to adhere to social distancing guidelines once infections go down.

We don’t know when states will be able to test everyone who has symptoms.

We don’t know if the United States could ever deploy the number of tests — as many as 22 million per day — needed to implement mass testing and quarantining.

We don’t know if we can implement “test and trace” contact tracing at scale.

We don’t know whether smartphone location tracking could be implemented without destroying our privacy.

We don’t know if or when researchers will develop a successful vaccine.

We don’t know how many vaccines can be deployed and administered in the first months after a vaccine becomes available.

We don’t know how a vaccine will be administered — who will get it first?

We don’t know if a vaccine will be free or costly.

We don’t know if a vaccine will need to be updated every year.

We don’t know how, when we do open things up again, we will do it.

We don’t know if people will be afraid to gather in crowds.

We don’t know if people will be too eager to gather in crowds.

We don’t know what socially distanced professional sports will look like.

We don’t know what socially distanced workplaces will look like.

We don’t know what socially distanced bars and restaurants will look like.

We don’t know when schools will reopen.

We don’t know what a general election in a pandemic will look like.

We don’t know what effects lost school time will have on children.

We don’t know if the United States’s current and future government stimulus will stave off an economic collapse.

We don’t know whether the economy will bounce back in the form of a “v curve” …

Or whether it’ll be a long recession.

We don’t know when any of this will end for good.

There is, at present, no plan from the Trump White House on the way forward.


Recovered, almost: China’s early patients unable to shed coronavirus (Reuters)

uk.reuters.com

Brenda Goh, April 22, 2020

WUHAN, China (Reuters) – Dressed in a hazmat suit, two masks and a face shield, Du Mingjun knocked on the mahogany door of a flat in a suburban district of Wuhan on a recent morning.


A man wearing a single mask opened the door a crack and, after Du introduced herself as a psychological counsellor, burst into tears.

“I really can’t take it anymore,” he said. Diagnosed with the novel coronavirus in early February, the man, who appeared to be in his 50s, had been treated at two hospitals before being transferred to a quarantine centre set up in a cluster of apartment blocks in an industrial part of Wuhan.

Why, he asked, did tests say he still had the virus more than two months after he first contracted it?

The answer to that question is a mystery baffling doctors on the frontline of China’s battle against COVID-19, even as it has successfully slowed the spread of the coronavirus across the country.

Chinese doctors in Wuhan, where the virus first emerged in December, say a growing number of cases in which people recover from the virus, but continue to test positive without showing symptoms, is one of their biggest challenges as the country moves into a new phase of its containment battle.

Those patients all tested negative for the virus at some point after recovering, but then tested positive again, some up to 70 days later, the doctors said. Many have done so over 50-60 days.

The prospect of people remaining positive for the virus, and therefore potentially infectious, is of international concern, as many countries seek to end lockdowns and resume economic activity as the spread of the virus slows. Currently, the globally recommended isolation period after exposure is 14 days.

So far, there have been no confirmations of newly positive patients infecting others, according to Chinese health officials.

China has not published precise figures for how many patients fall into this category. But disclosures by Chinese hospitals to Reuters, as well as in other media reports, indicate there are at least dozens of such cases.

In South Korea, about 1,000 people have been testing positive for four weeks or more. In Italy, the first European country ravaged by the pandemic, health officials noticed that coronavirus patients could test positive for the virus for about a month.

As there is limited knowledge available on how infectious these patients are, doctors in Wuhan are keeping them isolated for longer.

Zhang Dingyu, president of Jinyintan Hospital, where the most serious coronavirus cases were treated, said health officials recognised the isolation periods may be excessive, especially if patients proved not to be infectious. But, for now, it was better to do so to protect the public, he said.

He described the issue as one of the most pressing facing the hospital and said counsellors like Du are being brought in to help ease the emotional strain.

“When patients have this pressure, it also weighs on society,” he said.

DOZENS OF CASES

The plight of Wuhan’s long-term patients underlines how much remains unknown about COVID-19 and why it appears to affect different people in numerous ways, Chinese doctors say. So far global infections have hit 2.5 million with over 171,000 deaths.

As of April 21, 93% of 82,788 people with the virus in China had recovered and been discharged, official figures show.

Yuan Yufeng, a vice president at Zhongnan Hospital in Wuhan, told Reuters he was aware of a case in which the patient had positive retests after first being diagnosed with the virus about 70 days earlier.

“We did not see anything like this during SARS,” he said, referring to the 2003 Severe Acute Respiratory Syndrome outbreak that infected 8,098 people globally, mostly in China.

Patients in China are discharged after two negative nucleic acid tests, taken at least 24 hours apart, and if they no longer show symptoms. Some doctors want this requirement to be raised to three tests or more.

China’s National Health Commission directed Reuters to comments made at a briefing Tuesday when asked for comment about how this category of patients was being handled.

Wang Guiqiang, director of the infectious disease department of Peking University First Hospital, said at the briefing that the majority of such patients were not showing symptoms and very few had seen their conditions worsen.

“The new coronavirus is a new type of virus,” said Guo Yanhong, a National Health Commission official. “For this disease, the unknowns are still greater than the knowns.”

REMNANTS AND REACTIVATION

Experts and doctors struggle to explain why the virus behaves so differently in these people.

Some suggest that patients retesting as positive after previously testing negative were somehow reinfected with the virus. This would undermine hopes that people catching COVID-19 would produce antibodies that would prevent them from getting sick again from the virus.

Zhao Yan, a doctor of emergency medicine at Wuhan’s Zhongnan Hospital, said he was sceptical about the possibility of reinfection based on cases at his facility, although he did not have hard evidence.

“They’re closely monitored in the hospital and are aware of the risks, so they stay in quarantine. So I’m sure they were not reinfected.”

Jeong Eun-kyeong, director of the Korea Centers for Disease Control and Prevention, has said the virus may have been “reactivated” in 91 South Korean patients who tested positive after having been thought to be cleared of it.  

Other South Korean and Chinese experts have said that remnants of the virus could have stayed in patients’ systems but not be infectious or dangerous to the host or others.

Few details have been disclosed about these patients, such as if they have underlying health conditions.

Paul Hunter, a professor at the University of East Anglia’s Norwich School of Medicine, said an unusually slow shedding of other viruses such as norovirus or influenza had been previously seen in patients with weakened immune systems.

In 2015, South Korean authorities disclosed that they had a Middle East Respiratory Syndrome patient stricken with lymphoma who showed signs of the virus for 116 days. They said his impaired immune system kept his body from ridding itself of the virus. The lymphoma eventually caused his death.


Yuan said that even if patients developed antibodies, it did not guarantee they would become virus-free.

He said that some patients had high levels of antibodies, and still tested positive to nucleic acid tests.

“It means that the two sides are still fighting,” he said.

MENTAL TOLL

As could be seen in Wuhan, the virus can also inflict a heavy mental toll on those caught in a seemingly endless cycle of positive tests.

Du, who set up a therapy hotline when Wuhan’s outbreak first began, allowed Reuters in early April to join her on a visit to the suburban quarantine centre on the condition that none of the patients be identified.

One man rattled off the names of three Wuhan hospitals he had stayed at before being moved to a flat in the centre.  He had taken over 10 tests since the third week of February, he said, on occasions testing negative but mostly positive.

“I feel fine and have no symptoms, but they check and it’s positive, check and it’s positive,” he said. “What is with this virus?”

Patients need to stay at the centre for at least 28 days and obtain two negative results before being allowed to leave. Patients are isolated in individual rooms they said were paid for by the government.

The most concerning case facing Du during the visit was the man behind the mahogany door; he had told medical workers the night before that he wanted to kill himself.

“I wasn’t thinking clearly,” he told Du, explaining how he had already taken numerous CT scans and nucleic acid tests, some of which tested negative, at different hospitals. He worried that he had been reinfected as he cycled through various hospitals.

His grandson missed him after being gone for so long, he said, and he worried his condition meant he would never be able to see him again.

He broke into another round of sobs. “Why is this happening to me?”

Reporting by Brenda Goh; Additional reporting by Jack Kim in Seoul, Elvira Pollina in Milan, Belen Carreno in Madrid, and Shanghai newsroom; Editing by Philip McClellan

Climate Change – Catastrophic or Linear Slow Progression? (Armstrong Economics)

Indeed, science was turned on its head after a discovery in 1772 near Vilui, Siberia, of an intact frozen woolly rhinoceros, which was followed by the more famous discovery of a frozen mammoth in 1787. You may be shocked, but these discoveries of frozen animals with grass still in their stomachs set in motion two schools of thought, since the evidence implied you could be eating lunch and suddenly find yourself frozen, only to be discovered by posterity.


The discovery of the woolly rhinoceros in 1772, and then frozen mammoths, sparked the imagination that things were not linear after all. These major discoveries truly contributed to the “Age of Enlightenment,” when there was a burst of knowledge erupting in every field of inquiry. Such finds of frozen mammoths in Siberia continue to this day. This has challenged theories on both sides of this debate to explain such catastrophic events. These frozen animals in Siberia suggest that strange, sudden events are possible, not unlike the casts of the dead who were buried alive by the volcanic eruption of 79 AD at Pompeii in ancient Roman Italy. Animals can be grazing and then freeze abruptly. That climate change came long before man invented the combustion engine.

Even the field of geology began to produce great debates over whether the earth simply burst into catastrophic convulsions and was indeed cyclical — not linear. This view of sequential destructive upheavals at irregular intervals or cycles emerged during the 1700s. This school of thought was perhaps best expressed by a forgotten contributor to the knowledge of mankind, George Hoggart Toulmin, in his rare 1785 book, “The Eternity of the World“:

“… convulsions and revolutions violent beyond our experience or conception, yet unequal to the destruction of the globe, or the whole of the human species, have both existed and will again exist … [terminating] … an astonishing succession of ages.”

Id., pp. 3, 110


In 1832, Professor A. Bernhardi argued that the North Polar ice cap had extended into the plains of Germany. To support this theory, he pointed to the existence of huge boulders that have become known as “erratics,” which he suggested were pushed by the advancing ice. This was a shocking theory, for it was certainly a nonlinear view of natural history. Bernhardi was thinking out of the box. However, in natural science people listen to and review theories, unlike in social science, where theories are ignored if they challenge what people want to believe. In 1834, Johann von Charpentier (1786-1855) argued that there were deep grooves cut into the Alpine rock, concluding, as did Karl Schimper, that they were caused by an advancing Ice Age.

This body of knowledge has been completely ignored by the global warming/climate change religious cult. They know nothing about nature or cycles, and they are completely ignorant of history, or even of the fact that it was the discovery of these ancient creatures, frozen with food still in their mouths, that set this debate in motion. They cannot explain these events, nor the vast amount of knowledge written by people who actually did research instead of trying to cloak an agenda in pretend science.

Glaciologists have their own word, jökulhlaup (from Icelandic), to describe the spectacular outbursts when water builds up behind a glacier and then breaks loose. An example was the 1922 jökulhlaup in Iceland. Some seven cubic kilometers of water, melted by a volcano under a glacier, rushed out in a few days. Still grander, almost unimaginable, events were the floods that swept across Washington state toward the end of the last ice age, when a vast lake dammed behind a glacier broke loose. Catastrophic geologic events are not generally part of the uniformitarian geologist’s thinking. Rather, the normal view tends to be linear, including events that are local or regional in size.

One example of a regional event would be the 15,000 square miles of the Channeled Scablands in eastern Washington. Initially, this spectacular erosion was thought to be the product of slow, gradual processes. In 1923, J Harlen Bretz presented a paper to the Geological Society of America suggesting the Scablands were eroded catastrophically. During the 1940s, after decades of arguing, geologists admitted that high ridges in the Scablands were the equivalent of the little ripples one sees in mud on a streambed, magnified ten thousand times. Finally, by the 1950s, glaciologists were accustomed to thinking about catastrophic regional floods. The Scablands are now accepted to have been catastrophically eroded by the “Spokane Flood.” This flood was the result of the breaching of an ice dam which had created glacial Lake Missoula. The United States Geological Survey now estimates the flood released 500 cubic miles of water, which drained in as little as 48 hours. That rush of water gouged out millions of tons of solid rock.
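To get a feel for what those figures imply, a quick back-of-envelope calculation (a sketch only, using nothing but the 500 cubic miles and 48 hours quoted above) gives an average discharge on the order of twelve million cubic meters per second; for comparison, the Amazon, the largest river on Earth, averages roughly 200,000 cubic meters per second.

```python
# Back-of-envelope check of the Spokane Flood figures quoted above:
# 500 cubic miles of water draining in roughly 48 hours.
CUBIC_MILE_IN_M3 = 4.168e9           # one cubic mile expressed in cubic meters

volume_m3 = 500 * CUBIC_MILE_IN_M3   # about 2.1e12 m^3
duration_s = 48 * 3600               # 48 hours in seconds

avg_discharge = volume_m3 / duration_s
print(f"Average discharge: {avg_discharge:,.0f} m^3/s")  # roughly 12,000,000 m^3/s
```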

When Mount St. Helens erupted in 1980, this too produced a catastrophic process whereby two hundred million cubic yards of material was deposited by volcanic flows at the base of the mountain in just a matter of hours. Then, less than two years later, there was another minor eruption, but this one created a mudflow, which carved channels through the recently deposited material. These channels, which are 1/40th the size of the Grand Canyon, exposed flat segments between the catastrophically deposited layers. This is what we see between the layers exposed in the walls of the Grand Canyon. What is clear is that these events were relatively minor compared to a global flood. For example, the eruption of Mount St. Helens contained only 0.27 cubic miles of material, compared to other eruptions, which have been as much as 950 cubic miles. That is roughly 3,500 times the size of the Mount St. Helens eruption!

With respect to the Grand Canyon, the specific geologic processes and timing of its formation have always sparked lively debates among geologists. The general scientific consensus, updated at a 2010 conference, maintains that the Colorado River carved the Grand Canyon beginning 5 million to 6 million years ago. This general thinking is still linear and by no means catastrophic. The Grand Canyon is believed to have been gradually eroded. However, there is an example of cyclical behavior in nature which demonstrates that water can very rapidly erode even solid rock. It took place in the Grand Canyon region on June 28th, 1983, when an overflow of Lake Powell required the use of the Glen Canyon Dam’s 40-foot diameter spillway tunnels for the first time. As the volume of water increased, the entire dam started to vibrate and large boulders spewed from one of the spillways. The spillway was immediately shut down, and an inspection revealed catastrophic erosion had cut through the three-foot-thick reinforced concrete walls and eroded a hole 40 feet wide, 32 feet deep, and 150 feet long in the sandstone beneath the dam. Nobody thought such rapid catastrophic erosion was even possible.

Some have speculated that the end of the Ice Age released a flood of water which had been contained by an ice dam. As with the Scablands, it is possible that a sudden catastrophic release of water originally carved the Grand Canyon. Both the formation of the Scablands and the way the Mount St. Helens eruption unfolded may support the catastrophic formation of such features rather than nice, slow, and linear processes.

Then there is the Biblical Account of the Great Flood and Noah. Noah is also considered to be a Prophet of Islam. Darren Aronofsky’s film Noah was based on the biblical story of Genesis. Some Christians were angry because the film strayed from biblical Scripture. The Muslim-majority countries banned the film Noah from screening in theaters because Noah was a prophet of God in the Koran. They considered it to be blasphemous to make a film about a prophet. Many countries banned the film entirely.

The story of Noah predates the Bible. A legend of the Great Flood is rooted in the ancient civilizations of Mesopotamia. The Sumerian Epic of Gilgamesh dates back nearly 5,000 years and is believed to be perhaps the oldest written tale on Earth. Here too, we find an account of the great sage Utnapishtim, who is warned of an imminent flood to be unleashed by wrathful gods. He builds a vast circular boat, reinforced with tar and pitch, and carries his relatives, grain, and animals. After enduring days of storms, Utnapishtim, like Noah in Genesis, releases a bird in search of dry land. Since there is evidence that there were survivors in different parts of the world, it is only logical that there should be more than just one such account.

Archaeologists generally agree that there was a historical deluge between 5,000 and 7,000 years ago which hit lands ranging from the Black Sea to what many call the cradle of civilization, which was the floodplain between the Tigris and Euphrates rivers. The translation of ancient cuneiform tablets in the 19th century confirmed the Mesopotamian Great Flood myth as an antecedent of the Noah story in the Bible.

The remaining problem was the question of just how “great” the Great Flood was. Was it regional or worldwide? The stories of the Great Flood in Western culture clearly date back before the Bible. The region implicated has long been considered to be the Black Sea. It has been suggested that the water broke through the land near Istanbul and flooded a fertile valley on the other side, much as we just saw with the Scablands. Robert Ballard, one of the world’s best-known underwater archaeologists, who found the Titanic, set out to test that theory by searching for an underwater civilization. He discovered that some four hundred feet below the surface there was an ancient shoreline, proving that a catastrophic event did happen in the Black Sea. By carbon dating shells found along the underwater shoreline, Ballard dated this catastrophic event to around 5,000 BC. This roughly matches the time when Noah’s flood could have occurred.

Given that it is impossible for enough water to submerge the entire Earth for 40 days and 40 nights and then simply vanish, we are probably looking at a Great Flood that was, at the very least, regional. However, there are tales of the Great Flood which spring from many other sources. Various ancient cultures have their own legends of a Great Flood and salvation. According to Vedic lore, a fish tells the mythic Indian king Manu of a Great Flood that will wipe out humanity. In turn, Manu also builds a ship to withstand the epic rains and is later led to a mountaintop by the same fish.

We also find an Aztec story that tells of a devout couple hiding in the hollow of a vast tree with two ears of corn as divine storms drown the wicked of the land. Creation myths from Egypt to Scandinavia also involve tidal floods of all sorts of substances purging and remaking the earth. The fact that we have Great Flood stories from India is not really a surprise, since there was contact between the Middle East and India throughout recorded history. The Aztec story lacks the ship, but it still includes the punishment of the wicked, and here there was certainly no direct contact, although there is evidence of cocaine use in Egypt implying some trade route, probably island hopping across the Pacific to the shores of India and on to Egypt. Obviously, we cannot rule out that the story of the Great Flood even made it to South America.

Then again, there is the story of Atlantis – the island that sank beneath the sea. The Atlantic Ocean covers approximately one-fifth of Earth’s surface and is second in size only to the Pacific Ocean. The ocean’s name, derived from Greek mythology, means the “Sea of Atlas.” The origin of names often offers interesting clues as well. For example, New Jersey is the English translation of the Latin Nova Caesarea, which appeared even on the colonial coins of the 18th century. Hence, the state of New Jersey is named after the Island of Jersey, which in turn was named in honor of Julius Caesar. So we actually have an American state named after a man who changed the world on par with Alexander the Great, after whom Alexandria, Virginia is named, home to the famous cemetery for veterans where John F. Kennedy is buried.

So here the Atlantic Ocean is named after Atlas and the story of Atlantis. The original story of Atlantis comes to us from two Socratic dialogues called Timaeus and Critias, both written about 360 BC by the Greek philosopher Plato. According to the dialogues, Socrates asked three men to meet him: Timaeus of Locri, Hermocrates of Syracuse, and Critias of Athens. Socrates asked the men to tell him stories about how ancient Athens interacted with other states. Critias was the first to tell the story. Critias explained how his grandfather had met with the Athenian lawgiver Solon, who had been to Egypt where priests told the Egyptian story about Atlantis. According to the Egyptians, Solon was told that there was a mighty power based on an island in the Atlantic Ocean. This empire was called Atlantis and it ruled over several other islands and parts of the continents of Africa and Europe.

Atlantis was arranged in concentric rings of alternating water and land. The soil was rich and the engineers were technically advanced. The architecture was said to be extravagant with baths, harbor installations, and barracks. The central plain outside the city was constructed with canals and an elaborate irrigation system. Atlantis was ruled by kings but also had a civil administration. Its military was well organized. Their religious rituals were similar to that of Athens with bull-baiting, sacrifice, and prayer.

Plato told us about the metals found in Atlantis, namely gold, silver, copper, tin and the mysterious Orichalcum. Plato said that the city walls were plated with Orichalcum (brass). This was a rare alloy metal back then, found both in Crete and in the Andes, in South America. An ancient shipwreck discovered off the coast of Sicily in 2015 contained 39 ingots of Orichalcum. Many claimed this proved the story of Atlantis. Orichalcum was believed to have been a gold/copper alloy that was cheaper than gold, but twice the value of copper. Of course, Orichalcum was really a copper-tin or copper-zinc brass. In Virgil’s Aeneid, the breastplate of Turnus is described as “stiff with gold and white orichalc”.

The monetary reform of Augustus in 23 BC reintroduced bronze coinage, which had vanished after 84 BC. Here we see the introduction of Orichalcum for the Roman sestertius and the dupondius. The Roman As was struck in near pure copper. Therefore, about 300 years after Plato, we do see Orichalcum being introduced as part of the monetary system of Rome. It is clear that Orichalcum was rare at the time Plato wrote this. Consequently, this is similar to the stories of America claiming there was so much gold that the streets were paved with it.

As the story is told, Atlantis was located in the Atlantic Ocean. There have been bronze-age anchors discovered at the Gates of Hercules (Strait of Gibraltar), and many people proclaimed this proved Atlantis was real. However, what these proponents fail to take into account is the Minoans. The Minoans were perhaps the first International Economy. They traded far and wide, even with Britain, seeking tin to make bronze – hence the Bronze Age. Theirs was a Bronze Age civilization that arose on the island of Crete and flourished from approximately the 27th century BC to the 15th century BC – roughly 1,200 years. Their trading range and colonization extended to Spain, Egypt, Israel (Canaan), Syria (Levantine), Greece, Rhodes, and of course to Turkey (Anatolia). Many other cultures referred to them as the people from the islands in the middle of the sea. However, the Minoans had no mineral deposits. They lacked gold as well as silver, or even the ability to mine copper on any large scale. They appear to have had copper mines in Anatolia (Turkey) in colonized cities. What has survived are examples of copper ingots that served as MONEY in trade. Keep in mind that gold at this point was rare, too rare to truly serve as MONEY. It is found largely as jewelry in tombs of royal dignitaries.

The Bronze Age emerged at different times globally, appearing in Greece and China around 3,000 BC, but it came late to Britain, reaching there about 1,900 BC. It is known that copper emerged as a valuable tool in Anatolia (Turkey) as early as 6,500 BC, where it began to replace stone in the creation of tools. It was the development of casting copper that also appears to have aided the urbanization of man in Mesopotamia. By 3,000 BC, copper was in wide use throughout the Middle East and started to move up into Europe. Copper in its pure state appears first, and tin was eventually added, creating actual bronze; a bronze sword would break a copper sword. It was this addition of tin that really propelled the transition from copper to bronze, and the tin was coming from England, where vast deposits existed in Cornwall. We know that the Minoans traveled into the Atlantic for trade. Anchors are not conclusive evidence of Atlantis.

As the legend unfolds, Atlantis waged an unprovoked imperialistic war on the remainder of Asia and Europe. When Atlantis attacked, Athens showed its excellence as the leader of the Greeks, the much smaller city-state being the only power to stand against Atlantis. Alone, Athens triumphed over the invading Atlantean forces, defeating the enemy, preventing the free from being enslaved, and freeing those who had been enslaved. This part may certainly be embellished and remains doubtful at best. However, following this battle, there were violent earthquakes and floods, and Atlantis sank into the sea, and all the Athenian warriors were swallowed up by the earth. This appears to be almost certainly a fiction based on some ancient political realities. Still, some have argued that the explosive disappearance of an island is a reference to the eruption of Minoan Santorini. The story of Atlantis does closely correlate with Plato’s notions in The Republic, examining the deteriorating cycle of life in a state.

 

There have been theories that Atlantis was the Azores, and still others argue it was actually South America. That would explain to some extent the cocaine mummies in Egypt. Yet despite all these theories, when there is an ancient story, despite embellishment, there is often a grain of truth hidden deep within. In this case, Atlantis may not have completely submerged; it could have partially submerged in an earthquake that at least some people survived. Survivors could have made it either to the Americas or to Africa/Europe. What is clear is that a sudden event could have sent a tsunami into the Mediterranean, which then broke the land mass at Istanbul and flooded the valley below, transforming that region into the Black Sea and becoming the story of Noah.

Evidence has also surfaced that the Earth was struck by a comet around 12,800 years ago. Scientific American has published that sediments from six sites across North America (Murray Springs, Ariz.; Bull Creek, Okla.; Gainey, Mich.; Topper, S.C.; Lake Hind, Manitoba; and Chobot, Alberta) have yielded tiny diamonds, which only occur in sediment exposed to extreme temperatures and pressures. The evidence implies that the Earth moved into an Ice Age, killing off large mammals and setting the course for Global Cooling for the next 1,300 years. This may indeed explain the catastrophic freezing of woolly mammoths in Siberia. Such an event could also have been responsible for the legend of Atlantis, with the survivors migrating and taking their stories with them.

There is also evidence surfacing from stone carvings at one of the oldest sites on record, located in Anatolia (Turkey). Using a computer programme to show where the constellations would have appeared above Turkey thousands of years ago, researchers were able to pinpoint the comet strike to 10,950 BC, the exact time of the onset of the Younger Dryas, a return to glacial conditions and Global Cooling which temporarily reversed the gradual climatic warming after the Last Glacial Maximum, which began to recede around 20,000 BC, according to ice core data from Greenland.

Now, there is a very big asteroid that passed by the Earth on September 16th, 2013. What is most disturbing is the fact that its cycle is 19 years, so it will return in 2032. Astronomers have not been able to swear it will not hit the Earth on that next pass in 2032. It was discovered by Ukrainian astronomers with just 10 days’ notice back in 2013. The 2013 pass came at a distance of only 4.2 million miles (6.7 million kilometers). If anything alters its orbit, then it will get closer and closer. It just so happens to line up on a cyclical basis, which suggests we should begin to look at how to deflect asteroids, and soon.

It definitely appears that catastrophic cooling may also be linked to the Earth being struck by a meteor, asteroid, or comet. We are clearly headed into a period of Global Cooling, and this will get worse as we head into 2032. The question becomes: is our model also reflecting that it is once again time for an Earth change caused by an asteroid encounter? Such events are not DOOMSDAY and the end of the world. They do seem to be regional. However, a comet striking in North America would have altered the climate enough to freeze the animals in Siberia.

If there is a tiny element of truth in the story of Atlantis, the one thing it certainly proves is clear – there are ALWAYS survivors. Based upon a review of the history of civilization as well as climate, what resonates profoundly is that events follow the cyclical model of catastrophic occurrences rather than the linear steady slow progression of evolution.

Goodbye to the Weather Service? An Argentine biologist predicts the weather by studying ants (and gets it right) (La Nación)

Jorge Finardi anticipates rain and storms from the behavior of insects

LA NACION

THURSDAY, JANUARY 26, 2017 • 17:44


Jorge Finardi predicts the weather through ants. He studies their movements, records them, compares them, and concludes, for example, that it will rain tomorrow afternoon. And he gets it right. This week, Finardi’s method anticipated Monday’s suffocating heat, Tuesday’s storm, and Wednesday’s drop in temperature. Not bad at all.

Finardi is a chemist and biologist, and he runs the Twitter account @GeorgeClimaPron, where he posts his weather forecasts. In an interview with LA NACION, he explains his system.

-How does your method of analysis work?

-First, I determine the degree of ant activity on a scale from 1 to 10. To build the scale I take into account the number of interactions between the ants, the number of ants involved, and the type and size of the load they carry, as well as the kind of ant doing the work.

-And how does that relate to the weather? Does more activity indicate rain?

-Partly, yes, but it depends on the load they are carrying. For example, when the ants carry little sticks and twigs, it is because they need to reinforce the nest, since rain or cold is approaching. When there is earth being moved, heavy rain is on the way. When they carry grain, cold is coming, because the grain ferments inside the nest and produces heat so that the fungi they eat can grow.

For high temperatures, on the other hand, they recondition the tunnels: the ants begin to open “chimneys”, little holes scattered around the nest, which can be several meters deep. When that happens, a heat wave is coming.
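Finardi’s scale is qualitative, but the ingredients he names (interactions, number of ants, and the type of load) lend themselves to a small illustration. The sketch below is purely hypothetical: the weights, thresholds, and load categories are invented for the sake of the example and are not taken from Finardi’s actual method.

```python
# Toy illustration of a 1-10 "ant activity" score, loosely inspired by the
# ingredients Finardi describes. All weights and categories are invented.
LOAD_WEIGHTS = {"sticks": 3, "earth": 4, "grain": 2, "none": 0}  # hypothetical

def activity_score(interactions: int, ants: int, load: str) -> int:
    """Combine toy inputs into a rough score clamped to the 1-10 range."""
    raw = 0.02 * interactions + 0.01 * ants + LOAD_WEIGHTS.get(load, 0)
    return max(1, min(10, round(raw)))

# Example: a busy colony moving earth lands at the top of the scale,
# which in Finardi's telling would point to heavy rain ahead.
print(activity_score(interactions=150, ants=300, load="earth"))  # -> 10
```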

-How did you become interested in the subject?

-Since I was three years old I have spent hours watching ants and all kinds of insects. My profession also helped me to dig deeper into these subjects, and to talk with older people who live in the countryside and never check the forecasts. They don’t need them. That is how I made progress, that and a bit of trial and error. At first I put ants in a terrarium so I could observe them more comfortably, but they behaved differently because of the isolation. Now I follow them with a camera.

-Besides ants, do you analyze other insects?

-Yes. Spiders, for example, have the ability to detect electrical activity when they appear and are very active. Dragonflies can anticipate a storm or wind. Cicadas announce heat. Roosters, when they crow in the middle of the night, announce fog. You also have to pay attention to ants when they are disoriented, because they can pick up seismic activity at great distances.

-Is this kind of analysis scientific?

-No. It must be stressed that the method is not scientific, it is not positivist, but it is qualitative, experimental, and observational. And it works. We humans have been here since the Quaternary period, but ants, for example, have been around since the age of the dinosaurs. They are highly adapted and very sensitive to changes in their environment. In this way nature speaks to us, it shows us symptoms. You have to know how to read them.

With an average of just 516 millimeters of rain a year over the last 5 years, Ceará has its worst drought since 1910 (G1)

09/09/2016 09:20 – Updated on 09/09/2016 11:57

The forecast for 2017 is still uncertain due to a “neutral Pacific Ocean”.
Water from the Orós reservoir is being transferred to the Castanhão.

From G1 CE, with information from TV Verdes Mares


A survey released this Thursday (8) by the Ceará Foundation for Meteorology and Water Resources (Funceme) shows that over the last five years, from 2012 to 2016, Ceará received an average of only 516 millimeters of rain per year. It is the lowest figure since 1910.

According to meteorologist Davi Ferran, the state will have to live with uncertainty for the next few months, since it is still too early to say whether 2017 will bring rain or not.

Year     Rainfall (mm)
2012     388
2013     552
2014     565
2015     524
2016     550
Average  516
Source: Funceme
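As a quick sanity check on the table (a sketch only, re-deriving the figure Funceme reports), the yearly totals above do average out to roughly 516 mm:

```python
# Re-derive the Funceme average from the yearly totals in the table above.
rainfall_mm = {2012: 388, 2013: 552, 2014: 565, 2015: 524, 2016: 550}

average = sum(rainfall_mm.values()) / len(rainfall_mm)
print(f"Average: {average:.1f} mm")  # 515.8 mm, reported as 516 mm
```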

“In next year’s rainy season, that is, March, April and May, which is the main rainy period, the most likely scenario is that the Pacific Ocean will have neither El Niño nor La Niña. We will have a neutral Pacific Ocean. In years with a neutral Pacific, the probability of rain in Ceará depends more strongly on the Atlantic. So the forecast will only be released in January,” he explains.

Meanwhile, according to the Water Resources Management Company (Cogerh), the reservoirs keep getting drier. At the moment, the average level of the 153 reservoirs monitored by Cogerh is only 9.4% of total capacity. The “giant” Castanhão, responsible for supplying the entire Metropolitan Region of Fortaleza, is practically out of water. Only seven years ago, it flooded the town of Jaguaribara with the enormous flow through its floodgates.

Today, Cogerh says the largest reservoir in Ceará is at only 6% of capacity. Close by, the Orós reservoir, also in the Jaguaribe region, overflowed in 2004 and 2008. At the time, it even became a tourist attraction in the center-south of the state.

Now, in 2016, the Orós appears in this drought scenario as a source of relief. Since July, its water has been transferred to the Castanhão. According to Cogerh, that water should reach homes in the Metropolitan Region of Fortaleza in September and guarantee supply at least during this period of water crisis.

“Our schedule runs until the end of January. In other words, until January we will be operating the two reservoirs in an integrated way. The Metropolitan Region is fully integrated with the Jaguaribe region through two large canals: the Trabalhador canal and the Eixão das Águas. So it is a case of one basin that today depends more heavily on another region, another river basin, but they are integrated. I would say this is the most emblematic case in the state,” explains Cogerh president João Lúcio Farias.
