Some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents. This is dangerous, as effects on the Indian monsoons will show.
Prakash Kashwan – 28/Dec/2018
Multilateral climate negotiations led by the UN have ended on disappointing notes of late. This has prompted climate scientists to weigh the pros and cons of climate geoengineering. Indian scientists, policymakers, and the public must also engage in these debates, especially given the potentially major implications of geoengineering for the monsoons in South Asia and Africa.
Since 2016, an academic working group (AWG) of 14 global governance experts (including the author) has deliberated on the wisdom and merits of geoengineering. In a report, we argue that we ought to develop ‘anticipatory governance mechanisms’.
While people often equate governance with top-down regulations, the AWG’s vision emphasises a combination of regulatory and voluntary strategies adopted by diverse state and non-state actors.
In the same vein, it’s also important to unpack the umbrella terminology of ‘geoengineering’. It comprises two sets of technologies with different governance implications: carbon geoengineering and solar geoengineering.
Carbon geoengineering, or carbon-dioxide removal, seeks to remove large quantities of the greenhouse gas from the atmosphere. The suite of options it presents includes bioenergy with carbon capture and storage (BECCS). This would require planting bioenergy crops over an area up to five times the size of India by 2100. Such large-scale and rapid land-use change would strain already precarious global food security and violate the land, forest and water rights of hundreds of millions of people.
The second cluster of geoengineering technologies, solar geoengineering, a.k.a. solar radiation management (SRM), seeks to cool the planet by reflecting a fraction of sunlight back into space. While this could help avoid some of the more severe effects of climate change, SRM doesn’t reduce the stock of carbon already present in the atmosphere. Scientists also caution that geoengineering may distract us from investing in emissions reduction, but experience shows that policymakers can ignore such cautions.
This means problems like air pollution and ocean acidification will continue unabated in the absence of profound climate mitigation actions. On the other hand, by altering atmospheric temperature, SRM could significantly disrupt the hydrological cycle and affect the monsoons.
An interest in minimising disruptions to the monsoons should, on its own, encourage India to help develop international geoengineering governance.
But before we can get into the nitty-gritty, there’s a question that must be answered: why should the global community think about governing climate engineering at this stage, when all that exists of SRM are computer simulations of its pros and cons?
Some reasons follow:
First, the suggestion that geoengineering technologies merely fill a void left open by a “lack of political will” doesn’t capture the full array of possibilities. The IPCC Special Report on the effects of global warming of 1.5°C includes a scenario in which the Paris Agreement’s goals are secured by 2050. This pathway banks on social, business and technological innovations, and doesn’t require resorting to radical climate responses or sacrificing improvements in basic living standards in the developing world.
On the other hand, some $8 trillion of investments have already been redirected away from fossil fuel operations, thanks to a global divestment movement led by environmental activists and student groups. (Such an outcome was thought to be politically infeasible only a few years ago.)
Second, recent research has shown that some geoengineering technologies, such as BECCS, could compete with the pursuit of more “ecologically sound, economical, and scalable” methods of enhancing natural climate sinks.
Third, despite a lot of progress in recent years, we don’t know enough to support a full assessment of the intended and unintended effects of geoengineering.
Decisions about which unresolved questions of geoengineering deserve public investment can’t be left only to scientists and policymakers. The community of climate engineering scientists tends to frame geoengineering in particular ways, at the expense of other equally valid framings.
This includes considering the global average surface temperature as the central climate impact indicator and ignoring vested interests linked to capital-intensive geoengineering infrastructure. This could bias future R&D trajectories in this area.
And these priorities, together with the assessments produced by eminent scientific bodies, have contributed to the rise of a de facto form of governance. In other words, some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents.
Such technocratic modes of governance don’t enjoy broad-based social or political legitimacy.
Individual research groups (e.g. Harvard University’s Solar Geoengineering Research Program) have opened themselves up to public scrutiny. They don’t support commercial work on solar geoengineering and have decided not to patent technologies being developed in their labs. While this is commendable, none of this can substitute for more politically legitimate arrangements.
The case of the Indian monsoons illustrates these challenges well. Various models of the Geoengineering Model Intercomparison Project have shown that deploying SRM would likely cause net summer monsoon precipitation to decline by 6.4% to 12.7%. (These predictions are based on average changes in atmospheric temperature, which means bigger or smaller variations could occur over different parts of India.)
So politically legitimate international governance is important to ensure global responses to climate change account for these and other domestic consequences.
As a first step, the AWG report recommends the UN secretary-general establish a high-level representative body to engage in international dialogue on various questions of governing SRM R&D, supported by a General Assembly resolution. Among other things, the mandate of this ‘World Commission’ could include debating whether, and to what end, SRM should be researched and developed and how it could fit within broader climate response strategies.
Then again, debates over solar geoengineering can’t be limited to global bodies and commissions. So the AWG also recommends the UN create a global forum for stakeholder dialogue to facilitate discussions on solar geoengineering. Such a forum could engage a variety of stakeholders, including local governments, communities, indigenous peoples and other climate-vulnerable groups, youth organisations and women’s groups. Only such a process is likely to effectively represent Indian peasants and farmers at the receiving end of a longstanding agrarian crisis.
These proposals for geoengineering governance build on various precedents. For example, from the 1990s, the World Commission on Dams demonstrated the feasibility and value of an extensive multi-level governance arrangement.
In 2018, policy experts finally recognised that global climate governance can’t ignore the general public’s concerns. It would be best not to have to relearn that lesson in the international governance of climate geoengineering.
Prakash Kashwan is an associate professor at the University of Connecticut, Storrs, and was a member of the AWG. The South Asia edition of his book Democracy in the Woods (2017) is due out later this month.
New antibody and antiviral treatments, and better vaccines, are on the way
The Economist – Nov 8th 2021
In the well-vaccinated wealthier countries of the world, year three of the pandemic will be better than year two, and covid-19 will have much less impact on health and everyday activities. Vaccines have weakened the link between cases and deaths in countries such as Britain and Israel. But in countries that are poorer, less well vaccinated or both, the deleterious effects of the virus will linger. A disparity of outcomes between rich and poor countries will emerge. The Gates Foundation, one of the world’s largest charities, predicts that average incomes will return to their pre-pandemic levels in 90% of advanced economies, compared with only a third of low- and middle-income economies.
Although the supply of vaccines surged in the last quarter of 2021, many countries will remain under-vaccinated for much of 2022, as a result of distribution difficulties and vaccine hesitancy. This will lead to higher rates of death and illness and weaker economic recoveries. The “last mile” problem of vaccine delivery will become painfully apparent as health workers carry vaccines into the planet’s poorest and most remote places. But complaints about unequal distribution will start to abate during 2022 as access to patients’ arms becomes a larger limiting factor than access to jabs. Indeed, if manufacturers do not scale back vaccine production there will be a glut by the second half of the year, predicts Airfinity, a provider of life-sciences data.
Booster jabs will be more widely used in 2022 as countries develop an understanding of when they are needed. New variants will also drive uptake, says Stanley Plotkin of the University of Pennsylvania, inventor of the rubella vaccine. Dr Plotkin says current vaccines and tweaked versions will be used as boosters, enhancing protection against variants.
The vaccination of children will also expand, in some countries to those as young as six months. Where vaccine hesitancy makes it hard for governments to reach their targets they will be inclined to make life difficult for the unvaccinated—by requiring vaccine passports to attend certain venues, and making vaccination compulsory for groups such as health-care workers.
Immunity and treatments may be widespread enough by mid-2022 to drive down case numbers and reduce the risk of new variants. At this point, the virus will become endemic in many countries. But although existing vaccines may be able to suppress the virus, new ones are needed to cut transmission.
Stephane Bancel, the boss of Moderna, a maker of vaccines based on mRNA technology, says his firm is working on a “multivalent” vaccine that will protect against more than one variant of covid-19. Beyond that he is looking at a “pan-respiratory” vaccine combining protection against multiple coronaviruses, respiratory viruses and strains of influenza.
Other innovations in covid-19 vaccines will include freeze-dried formulations of mRNA jabs, and vaccines that are given via skin patches or inhalation. Freeze-dried mRNA vaccines are easy to transport. As the supply of vaccines grows in 2022, those based on mRNA will be increasingly preferred, because they offer higher levels of protection. That will crimp the global market for less effective vaccines, such as the Chinese ones.
In rich countries there will also be greater focus on antibody treatments for people infected with covid-19. America, Britain and other countries will rely more on cocktails such as those from Regeneron or AstraZeneca.
Most promising of all are new antiviral drugs. Pfizer is already manufacturing “significant quantities” of its protease inhibitor. In America, the government has agreed to pay $1.2bn for courses of an antiviral drug being developed by Merck, known as molnupiravir. This has shown its efficacy in trials, and the company has licensed it for widespread, affordable production.
There are many other antivirals in the pipeline. Antiviral drugs that can be taken in pill form, after diagnosis, are likely to become blockbusters in 2022, helping make covid-19 an ever more treatable disease. That will lead, in turn, to new concerns about unequal access and about misuse fostering resistant strains.
The greatest risk to this more optimistic outlook is the emergence of a new variant capable of evading the protection provided by existing vaccines. The coronavirus remains a formidable foe.
Natasha Loder: Health-policy editor, The Economist
This article appeared in the Science and Technology section of the print edition of The World Ahead 2022 under the headline “From pandemic to endemic”
From discs in the sky to faces in toast, learn to weigh evidence sceptically without becoming a closed-minded naysayer
by Stephen Law
Stephen Law is a philosopher and author. He is director of philosophy at the Department of Continuing Education at the University of Oxford, and editor of Think, the Royal Institute of Philosophy journal. He researches primarily in the fields of philosophy of religion, philosophy of mind, Ludwig Wittgenstein, and essentialism. His books for a popular audience include The Philosophy Gym (2003), The Complete Philosophy Files (2000) and Believing Bullshit (2011). He lives in Oxford.
Many people believe in extraordinary hidden beings, including demons, angels, spirits and gods. Plenty also believe in supernatural powers, including psychic abilities, faith healing and communication with the dead. Conspiracy theories are also popular, including that the Holocaust never happened and that the terrorist attacks on the United States of 11 September 2001 were an inside job. And, of course, many trust in alternative medicines such as homeopathy, the effectiveness of which seems to run contrary to our scientific understanding of how the world actually works.
Such beliefs are widely considered to be at the ‘weird’ end of the spectrum. But, of course, just because a belief involves something weird doesn’t mean it’s not true. As science keeps reminding us, reality often is weird. Quantum mechanics and black holes are very weird indeed. So, while ghosts might be weird, that’s no reason to dismiss belief in them out of hand.
I focus here on a particular kind of ‘weird’ belief: not only are these beliefs that concern the enticingly odd, they’re also beliefs that the general public finds particularly difficult to assess.
Almost everyone agrees that, when it comes to black holes, scientists are the relevant experts, and scientific investigation is the right way to go about establishing whether or not they exist. However, when it comes to ghosts, psychic powers or conspiracy theories, we often hold wildly divergent views not only about how reasonable such beliefs are, but also about what might count as strong evidence for or against them, and who the relevant authorities are.
Take homeopathy, for example. Is it reasonable to focus only on what scientists have to say? Shouldn’t we give at least as much weight to the testimony of the many people who claim to have benefitted from homeopathic treatment? While most scientists are sceptical about psychic abilities, what of the thousands of reports from people who claim to have received insights from psychics who could only have known what they did if they really do have some sort of psychic gift? To what extent can we even trust the supposed scientific ‘experts’? Might not the scientific community itself be part of a conspiracy to hide the truth about Area 51 in Nevada, Earth’s flatness or the 9/11 terrorist attacks being an inside job?
Most of us really struggle when it comes to assessing such ‘weird’ beliefs – myself included. Of course, we have our hunches about what’s most likely to be true. But when it comes to pinning down precisely why such beliefs are or aren’t reasonable, even the most intelligent and well educated of us can quickly find ourselves out of our depth. For example, while most would pooh-pooh belief in fairies, Arthur Conan Doyle, the creator of the quintessentially rational detective Sherlock Holmes, actually believed in them and wrote a book presenting what he thought was compelling evidence for their existence.
When it comes to weird beliefs, it’s important we avoid being closed-minded naysayers with our fingers in our ears, but it’s also crucial that we avoid being credulous fools. We want, as far as possible, to be reasonable.
I’m a philosopher who has spent a great deal of time thinking about the reasonableness of such ‘weird’ beliefs. Here I present five key pieces of advice that I hope will help you figure out for yourself what is and isn’t reasonable.
Let’s begin with an illustration of the kind of case that can so spectacularly divide opinion. In 1976, six workers reported a UFO over the site of a nuclear plant being constructed near the town of Apex, North Carolina. A security guard then reported a ‘strange object’. The police officer Ross Denson drove over to investigate and saw what he described as something ‘half the size of the Moon’ hanging over the plant. The police also took a call from local air traffic control about an unidentified blip on their radar.
The next night, the UFO appeared again. The deputy sheriff described ‘a large lighted object’. An auxiliary officer reported five lighted objects that appeared to be burning and about 20 times the size of a passing plane. The county magistrate described a rectangular football-field-sized object that looked like it was on fire.
Finally, the press got interested. Reporters from the Star newspaper drove over to investigate. They too saw the UFO. But when they tried to drive nearer, they discovered that, weirdly, no matter how fast they drove, they couldn’t get any closer.
This report, drawn from Philip J Klass’s book UFOs: The Public Deceived (1983), is impressive: it involves multiple eyewitnesses, including police officers, journalists and even a magistrate. Their testimony is even backed up by hard evidence – that radar blip.
Surely, many would say, given all this evidence, it’s reasonable to believe there was at least something extraordinary floating over the site. Anyone who failed to believe at least that much would be excessively sceptical – one of those perpetual naysayers whose kneejerk reaction, no matter how strong the evidence, is always to pooh-pooh.
What’s most likely to be true: that there really was something extraordinary hanging over the power plant, or that the various eyewitnesses had somehow been deceived? Before we answer, here’s my first piece of advice.
Think it through
1. Expect unexplained false sightings and huge coincidences
Our UFO story isn’t over yet. When the Star’s two-man investigative team couldn’t get any closer to the mysterious object, they eventually pulled over. The photographer took out his long lens to take a look: ‘Yep … that’s the planet Venus all right.’ It was later confirmed beyond any reasonable doubt that what all the witnesses had seen was just a planet. But what about that radar blip? It was a coincidence, perhaps caused by a flock of birds or unusual weather.
What moral should we draw from this case? Not, of course, that because this UFO report turned out to have a mundane explanation, all such reports can be similarly dismissed. But notice that, had the reporters not discovered the truth, this story would likely have gone down in the annals of ufology as one of the great unexplained cases. The moral I draw is that UFO cases that have multiple eyewitnesses and even independent hard evidence (the radar blip) may well crop up occasionally anyway, even if there are no alien craft in our skies.
We tend significantly to underestimate how prone to illusion and deception we are when it comes to the wacky and weird. In particular, we have a strong tendency to overdetect agency – to think we are witnessing a person, an alien or some other sort of creature or being – where in truth there’s none.
Psychologists have developed theories to account for this tendency to overdetect agency, including that we have evolved what’s called a hyperactive agency detecting device. Had our ancestors missed an agent – a sabre-toothed tiger or a rival, say – that might well have reduced their chances of surviving and reproducing. Believing an agent is present when it’s not, on the other hand, is likely to be far less costly. Consequently, we’ve evolved to err on the side of overdetection – often seeing agency where there is none. For example, when we observe a movement or pattern we can’t understand, such as the retrograde motion of a planet in the night sky, we’re likely to think the movement is explained by some hidden agent working behind the scenes (that Mars is actually a god, say).
One example of our tendency to overdetect agency is pareidolia: our tendency to find patterns – and, in particular, faces – in random noise. Stare at passing clouds or into the embers of a fire, and it’s easy to interpret the randomly generated shapes we see as faces, often spooky ones, staring back.
And, of course, nature is occasionally going to throw up face-like patterns just by chance. One famous illustration was produced in 1976 by the Mars probe Viking Orbiter 1. As the probe passed over the Cydonia region, it photographed what appeared to be an enormous, reptilian-looking face 800 feet high and nearly 2 miles long. Some believe this ‘face on Mars’ was a relic of an ancient Martian civilisation, a bit like the Great Sphinx of Giza in Egypt. A book called The Monuments of Mars: A City on the Edge of Forever (1987) even speculated about this lost civilisation. However, later photos revealed the ‘face’ to be just a hill that looks face-like when lit a certain way. Take enough photos of Mars, and some will reveal face-like features just by chance.
The fact is, we should expect huge coincidences. Millions of pieces of bread are toasted each morning. One or two will exhibit face-like patterns just by chance, even without divine intervention. One such piece of toast that was said to show the face of the Virgin Mary (how do we know what she looked like?) was sold for $28,000. We think about so many people each day that eventually we’ll think about someone, the phone will ring, and it will be them. That’s to be expected, even if we’re not psychic. Yet many put down such coincidences to supernatural powers.
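To make that ‘to be expected’ concrete, here is a minimal simulation sketch in Python. All the numbers in it are my own illustrative assumptions, not figures from this guide: suppose each person briefly thinks of 20 acquaintances a day, and each acquaintance independently has a 1-in-200 chance of phoning that same day.

```python
import random

# Illustrative assumptions (not from this guide): 20 acquaintances thought of
# per day, each with an independent 1-in-200 chance of phoning that day.
THOUGHTS_PER_DAY = 20
P_CALL = 1 / 200
PEOPLE = 1_000
DAYS = 365

def spooky_day() -> bool:
    """True if at least one person you thought of today also happens to phone you."""
    return any(random.random() < P_CALL for _ in range(THOUGHTS_PER_DAY))

matches = sum(spooky_day() for _ in range(PEOPLE * DAYS))
print(f"{matches:,} 'they called just as I thought of them' days "
      f"out of {PEOPLE * DAYS:,} person-days")
# With these made-up numbers, roughly 1 person-day in 10 produces a match:
# tens of thousands of eerie coincidences per year in a town of 1,000 people,
# with no psychic powers involved.
```

Change the assumed numbers and the count changes, but the lesson doesn’t: given enough people and enough days, striking coincidences are guaranteed.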
2. Understand what strong evidence actually is
When is a claim strongly confirmed by a piece of evidence? The following principle appears correct (it captures part of what confirmation theorists call the Bayes factor; for more on Bayesian approaches to assessing evidence, see the link at the end):
Evidence confirms a claim to the extent that the evidence is more likely if the claim is true than if it’s false.
Here’s a simple illustration. Suppose I’m in the basement and can’t see outside. Jane walks in with a wet coat and umbrella and tells me it’s raining. That’s pretty strong evidence it’s raining. Why? Well, it is of course possible that Jane is playing a prank on me with her wet coat and brolly. But it’s far more likely she would appear with a wet coat and umbrella and tell me it’s raining if that’s true than if it’s false. In fact, given just this new evidence, it may well be reasonable for me to believe it’s raining.
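The principle can be put into numbers with Bayes’ rule. The sketch below uses probabilities I have made up purely for illustration (they are not in this guide): say rain had a 30% prior probability, Jane is very likely to turn up wet and report rain if it is raining, and very unlikely to stage that scene if it isn’t.

```python
def posterior(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Bayes' rule: probability of the claim after seeing the evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Illustrative numbers only: the wet coat and report are 45 times more likely
# if it's raining (0.90) than if it isn't (0.02), so the evidence is strong.
print(posterior(prior=0.30, p_evidence_if_true=0.90, p_evidence_if_false=0.02))  # ~0.95

# Evidence that is almost as likely either way (likelihood ratio near 1)
# barely moves the probability at all.
print(posterior(prior=0.30, p_evidence_if_true=0.50, p_evidence_if_false=0.45))  # ~0.32
```

The ratio of the two likelihoods is the Bayes factor mentioned above: when it is close to 1, as with the Mars ‘face’ discussed below, the evidence is weak no matter how striking it looks.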
Here’s another example. Sometimes whales and dolphins are found with atavistic limbs – leg-like structures – where legs would be found on land mammals. These discoveries strongly confirm the theory that whales and dolphins evolved from earlier limbed, land-dwelling species. Why? Because, while atavistic limbs aren’t probable given the truth of that theory, they’re still far more probable than they would be if whales and dolphins weren’t the descendants of such limbed creatures.
The Mars face, on the other hand, provides an example of weak or non-existent evidence. Yes, if there was an ancient Martian civilisation, then we might discover what appeared to be a huge face built on the surface of the planet. However, given pareidolia and the likelihood of face-like features being thrown up by chance, it’s about as likely that we would find such face-like features anyway, even if there were no alien civilisation. That’s why such features fail to provide strong evidence for such a civilisation.
So now consider our report of the UFO hanging over the nuclear power construction site. Are several such cases involving multiple witnesses and backed up by some hard evidence (eg, a radar blip) good evidence that there are alien craft in our skies? No. We should expect such hard-to-explain reports anyway, whether or not we’re visited by aliens. In which case, such reports are not strong evidence of alien visitors.
Being sceptical about such reports of alien craft, ghosts or fairies is not knee-jerk, fingers-in-our-ears naysaying. It’s just recognising that, though we might not be able to explain the reports, they’re likely to crop up occasionally anyway, whether or not alien visitors, ghosts or fairies actually exist. Consequently, they fail to provide strong evidence for such beings.
3. Extraordinary claims require extraordinary evidence

It was the scientist Carl Sagan who in 1980 said: ‘Extraordinary claims require extraordinary evidence.’ By an ‘extraordinary’ claim, Sagan appears to have meant an extraordinarily improbable claim, such as that Alice can fly by flapping her arms, or that she can move objects with her mind. On Sagan’s view, such claims require extraordinarily strong evidence before we should accept them – much stronger than the evidence required to support a far less improbable claim.
Suppose for example that Fred claims Alice visited him last night, sat on his sofa and drank a cup of tea. Ordinarily, we would just take Fred’s word for that. But suppose Fred adds that, during her visit, Alice flew around the room by flapping her arms. Of course, we’re not going to just take Fred’s word for that. It’s an extraordinary claim requiring extraordinary evidence.
If we’re starting from a very low base, probability-wise, then much more heavy lifting needs to be done by the evidence to raise the probability of the claim to a point where it might be reasonable to believe it. Clearly, Fred’s testimony about Alice flying around the room is not nearly strong enough.
Similarly, given the low prior probability of the claims that someone communicated with a dead relative, or has fairies living in their local wood, or has miraculously raised someone from the dead, or can move physical objects with their mind, we should similarly set the evidential bar much higher than we would for more mundane claims.
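To see how Sagan’s dictum plays out numerically, here is a short sketch in the same spirit, again with illustrative figures of my own: give the claim that Alice flew around the room a prior of one in a million, and treat Fred’s testimony generously as being 100 times more likely if the claim is true than if it is false.

```python
def posterior_from_odds(prior: float, bayes_factor: float) -> float:
    """Update a prior probability using a likelihood ratio (Bayes factor), via odds."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * bayes_factor
    return post_odds / (1 + post_odds)

# Mundane claim: Alice drank a cup of tea. Fred's word is easily enough.
print(posterior_from_odds(prior=0.5, bayes_factor=100))    # ~0.99

# Extraordinary claim: Alice flew by flapping her arms.
# The same testimony barely dents a one-in-a-million prior.
print(posterior_from_odds(prior=1e-6, bayes_factor=100))   # ~0.0001

# To reach even 50% belief, the evidence would need a Bayes factor of about
# 1,000,000: that is what 'extraordinary evidence' means here.
```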
4. Beware accumulated anecdotes
Once we’ve formed an opinion, it can be tempting to notice only evidence that supports it and to ignore the rest. Psychologists call this tendency confirmation bias.
For example, suppose Simon claims a psychic ability to know the future. He can provide 100 examples of his predictions coming true, including one or two dramatic examples. In fact, Simon once predicted that a certain celebrity would die within 12 months, and they did!
Do these 100 examples provide us with strong evidence that Simon really does have some sort of psychic ability? Not if Simon actually made many thousands of predictions and most didn’t come true. Still, if we count only Simon’s ‘hits’ and ignore his ‘misses’, it’s easy to create the impression that he has some sort of ‘gift’.
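As a rough illustration of why the misses matter, here is a tiny calculation with numbers I have invented for the purpose: suppose Simon fires off thousands of vague predictions, each with a small chance of coming true by luck alone.

```python
# Invented figures for illustration: 5,000 vague predictions, each with a
# 2% chance of coming true by sheer luck.
predictions = 5_000
p_lucky_hit = 0.02

expected_hits = predictions * p_lucky_hit
print(f"Expected 'hits' with zero psychic ability: {expected_hits:.0f}")  # 100

# Quoting the 100 hits while ignoring the ~4,900 misses is what makes the
# 'gift' look real: judge the hit rate, not the hit count.
```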
Confirmation bias can also create the false impression that a therapy is effective. A long list of anecdotes about patients whose condition improved after a faith healing session can seem impressive. People may say: ‘Look at all this evidence! Clearly this therapy has some benefits!’ But the truth is that such accumulated anecdotes are usually largely worthless as evidence.
It’s also worth remembering that such stories are in any case often dubious. For example, they can be generated by the power of suggestion: tell people that a treatment will improve their condition, and many will report that it has, even if the treatment actually offers no genuine medical benefit.
Impressive anecdotes can also be generated by means of a little creative interpretation. Many believe that the 16th-century seer Nostradamus predicted many important historical events, from the Great Fire of London to the assassination of John F Kennedy. However, because Nostradamus’s prophecies are so vague, nobody was able to use his writings to predict any of these events before they occurred. Rather, his texts were later creatively interpreted to fit what subsequently happened. But that sort of ‘fit’ can be achieved whether Nostradamus had extraordinary abilities or not. In which case, as we saw under point 2 above, the ‘fit’ is not strong evidence of such abilities.
5. Beware ‘But it fits!’
Often, when we’re presented with strong evidence that our belief is false, we can easily change our mind. Show me I’m mistaken in believing that the Matterhorn is near Chamonix, and I’ll just drop that belief.
However, abandoning a belief isn’t always so easy. That’s particularly the case for beliefs in which we have invested a great deal emotionally, socially and/or financially. When it comes to religious and political beliefs, for example, or beliefs about the character of our close relatives, we can find it extraordinarily difficult to change our minds. Psychologists refer to the discomfort we feel in such situations – when our beliefs or attitudes are in conflict – as cognitive dissonance.
Perhaps the most obvious strategy we can employ when a belief in which we have invested a great deal is threatened is to start explaining away the evidence.
Here’s an example. Dave believes dogs are spies from the planet Venus – that dogs are Venusian imposters on Earth sending secret reports back to Venus in preparation for their imminent invasion of our planet. Dave’s friends present him with a great deal of evidence that he’s mistaken. But, given a little ingenuity, Dave finds he can always explain away that evidence:
‘Dave, dogs can’t even speak – how can they communicate with Venus?’
‘They can speak, they just hide their linguistic ability from us.’
‘But Dave, dogs don’t have transmitters by which they could relay their messages to Venus – we’ve searched their baskets: nothing there!’
‘Their transmitters are hidden in their brain!’
‘But we’ve X-rayed this dog’s brain – no transmitter!’
‘The transmitters are made from organic material indistinguishable from ordinary brain stuff.’
‘But we can’t detect any signals coming from dogs’ heads.’
‘This is advanced alien technology – beyond our ability to detect it!’
‘Look Dave, Venus can’t support dog life – it’s incredibly hot and swathed in clouds of acid.’
‘The dogs live in deep underground bunkers to protect them. Why do you think they want to leave Venus?!’
You can see how this conversation might continue ad infinitum. No matter how much evidence is presented to Dave, it’s always possible for him to cook up another explanation. And so he can continue to insist his belief is logically consistent with the evidence.
But, of course, despite the possibility of his endlessly explaining away any and all counterevidence, Dave’s belief is absurd. It’s certainly not confirmed by the available evidence about dogs. In fact, it’s powerfully disconfirmed.
The moral is: showing that your theory can be made to ‘fit’ – be consistent with – the evidence is not the same thing as showing your theory is confirmed by the evidence. However, those who hold weird beliefs often muddle consistency and confirmation.
Take young-Earth creationists, for example. They believe in the literal truth of the Biblical account of creation: that the entire Universe is under 10,000 years old, with all species being created as described in the Book of Genesis.
Polls indicate that a third or more of US citizens believe that the Universe is less than 10,000 years old. Of course, there’s a mountain of evidence against the belief. However, its proponents are adept at explaining away that evidence.
Take the fossil record embedded in sedimentary layers revealing that today’s species evolved from earlier species over many millions of years. Many young-Earth creationists explain away this record as a result of the Biblical flood, which they suppose drowned and then buried living things in huge mud deposits. The particular ordering of the fossils is supposedly accounted for by different ecological zones being submerged one after the other, starting with simple marine life. Take a look at the Answers in Genesis website developed by the Bible literalist Ken Ham, and you’ll discover how a great deal of other evidence for evolution and a billions-of-years-old Universe is similarly explained away. Ham believes that, by explaining away the evidence against young-Earth creationism in this way, he can show that his theory ‘fits’ – and so is scientifically confirmed by – that evidence:
Increasing numbers of scientists are realising that when you take the Bible as your basis and build your models of science and history upon it, all the evidence from the living animals and plants, the fossils, and the cultures fits. This confirms that the Bible really is the Word of God and can be trusted totally. [my italics]
According to Ham, young-Earth creationists and evolutionists do the same thing: they look for ways to make the evidence fit the theory to which they have already committed themselves:
Evolutionists have their own framework … into which they try to fit the data. [my italics]
But, of course, scientists haven’t just found ways of showing how the theory of evolution can be made consistent with the evidence. As we saw above, that theory really is strongly confirmed by the evidence.
Any theory, no matter how absurd, can, with sufficient ingenuity, be made to ‘fit’ the evidence: even Dave’s theory that dogs are Venusian spies. That’s not to say it’s reasonable or well confirmed.
Of course, it’s not always unreasonable to explain away evidence. Given overwhelming evidence that water boils at 100 degrees Celsius at 1 atmosphere, a single experiment that appeared to contradict that claim might reasonably be explained away as a result of some unidentified experimental error. But as we increasingly come to rely on explaining away evidence in order to try to convince ourselves of the reasonableness of our belief, we begin to drift into delusion.
Key points – How to think about weird things
Expect unexplained false sightings and huge coincidences. Reports of mysterious and extraordinary hidden agents – such as angels, demons, spirits and gods – are to be expected, whether or not such beings exist. Huge coincidences – such as a piece of toast looking very face-like – are also more or less inevitable.
Understand what strong evidence is. If the alleged evidence for a belief is scarcely more likely if the belief is true than if it’s false, then it’s not strong evidence.
Extraordinary claims require extraordinary evidence. If a claim is extraordinarily improbable – eg, the claim that Alice flew round the room by flapping her arms – much stronger evidence is required for reasonable belief than is required for belief in a more mundane claim, such as that Alice drank a cup of tea.
Beware accumulated anecdotes. A large number of reports of, say, people recovering after taking an alternative medicine or visiting a faith healer is not strong evidence that such treatments actually work.
Beware ‘But it fits!’ Any theory, no matter how ludicrous (even the theory that dogs are spies from Venus), can, with sufficient ingenuity, always be made logically consistent with the evidence. That’s not to say it’s confirmed by the evidence.
Why it matters
Sometimes, belief in weird things is pretty harmless. What does it matter if Mary believes there are fairies at the bottom of her garden, or Joe thinks his dead aunty visits him occasionally? What does it matter if Sally is a closed-minded naysayer when it comes to belief in psychic powers? However, many of these beliefs have serious consequences.
Clearly, people can be exploited. Grieving parents contact spiritualists who offer to put them in contact with their dead children. Peddlers of alternative medicine and faith healing charge exorbitant fees for their ‘cures’ for terminal illnesses. If some alternative medicines really work, casually dismissing them out of hand and refusing to properly consider the evidence could also cost lives.
Lives have certainly been lost. Many have died who might have been saved because they believed they should reject conventional medicine and opted for ineffective alternatives.
Huge amounts of money are often also at stake when it comes to weird beliefs. Psychic reading and astrology are huge businesses with turnovers of billions of dollars per year. Often, it’s the most desperate who will turn to such businesses for advice. Are they, in reality, throwing their money away?
Many ‘weird’ beliefs also have huge social and political implications. The former US president Ronald Reagan and his wife Nancy were reported to have consulted an astrologer before making any major political decision. Conspiracy theories such as QAnon and the Sandy Hook hoax shape our current political landscape and feed extremist political thinking. Mainstream religions are often committed to miracles and gods.
In short, when it comes to belief in weird things, the stakes can be very high indeed. It matters that we don’t delude ourselves into thinking we’re being reasonable when we’re not.
Links & books
The Atlantic article ‘The Cognitive Biases Tricking Your Brain’ (2018) by Ben Yagoda provides a great introduction to thinking that can lead us astray, including confirmation bias.
The UK-based magazine The Skeptic provides some high-quality free articles on belief in weird things. Well worth a subscription.
The Skeptical Inquirer magazine in the US is also excellent, and provides some free content.
The RationalWiki portal provides many excellent articles on pseudoscience.
The British mathematician Norman Fenton, professor of risk information management at Queen Mary University of London, provides a brief online introduction to Bayesian approaches to assessing evidence.
My book Believing Bullshit: How Not to Get Sucked into an Intellectual Black Hole (2011) identifies eight tricks of the trade that can turn flaky ideas into psychological flytraps – and how to avoid them.
The textbook How to Think About Weird Things: Critical Thinking for a New Age (2019, 8th ed) by the philosophers Theodore Schick and Lewis Vaughn offers step-by-step advice on sorting through reasons, evaluating evidence and judging the veracity of a claim.
The book Critical Thinking (2017) by Tom Chatfield offers a toolkit for what he calls ‘being reasonable in an unreasonable world’.
João Paulo Charleaux – 13 Oct 2021 (updated 13/10/2021 at 00:26)
Laurent-Henri Vignaud, a historian of science at the Université de Bourgogne, speaks to ‘Nexo’ about the ideas, on the right and on the left, behind the anti-vaccine movement over the past 300 years
A protester with a torn mask at a march against the imposition of a health passport in the Netherlands
Resistance to vaccination is an old and persistent phenomenon that finds adherents on the left and on the right – always on the most extreme fringes of those camps – and is linked not to a lack of education but to an excess of information and the difficulty of knowing what to believe, according to the historian of science Laurent-Henri Vignaud of the Université de Bourgogne, in France.
The author of the book “Antivax: Resistance to Vaccines, from the 18th Century to the Present Day” picks apart, in this written interview given to Nexo on Wednesday (6), the arguments of those around the world who still resist being vaccinated against covid-19, and looks back at the anti-vaccine movement over the course of history.
Vignaud will give a virtual talk on the subject on 14 October, as part of the series of lectures on Covid organised by the Consulate of France in São Paulo in partnership with Unesco, the United Nations agency for education and culture, and with Blogs de Ciência da Unicamp. The talk will be streamed live, and the videos remain available on the French Consulate’s online channels.
What are the arguments of those who oppose vaccination? How have these arguments varied over the past 300 years?
Laurent-Henri Vignaud These arguments are very diverse, as are the “antivax” profiles. Many people have simple doubts about the quality of vaccines or about the conflicts of interest of those who promote them. Others develop extreme conspiracy theories, saying that vaccines are made to sicken, sterilise, kill or enslave. In between are those who “hesitate” for one reason or another.
Those who explicitly refuse one or more vaccines – the “antivax” strictly speaking – do so for religious, political, or alternative and naturalist reasons. There are certain rigorist currents, in all religions, that refuse vaccination in the name of a fatalist and providentialist principle, asserting the idea that man is not master of his own destiny.
Those who oppose vaccines for political reasons, meanwhile, attack mandatory-vaccination laws in the name of the free disposal of their own bodies and of individual liberties, in the “my body belongs to me” discourse.
Others, very numerous today, contest the effectiveness of vaccines and advocate other therapies ranging from health regimens to herbal remedies and homeopathy – which shows up in claims such as “natural immunity is superior to vaccine immunity” and “diseases make us stronger”. Most of these arguments have been present since the beginning of the vaccine controversy at the end of the 18th century, but they take on a different form in each era.
Historically, is the anti-vaccination movement right-wing or left-wing? Is that something that has changed over time, or has it stayed the same?
Laurent-Henri Vignaud Today both tendencies exist: there is an “ecological” anti-vaccine stance, which is fairly left-wing and bourgeois – a model widespread, for example, in California among employees of digital companies. And there is a “libertarian” or “confessional” anti-vaccine stance, which is right-wing and present above all in the Americas, in conservative religious circles and among supporters of populist leaders such as [former US president Donald] Trump or [Brazil’s president, Jair] Bolsonaro.
A demonstrator holds up an anti-vaccine sign at a protest on Avenida Paulista
Historically, inoculation, the technique that preceded vaccines in the 18th century, was promoted by philosophers such as Voltaire [the French Enlightenment thinker, 1694-1778] and opposed by men of the Church. So we can classify that opposition as an opposition from the right. In the 19th century, the harshness of compulsory-vaccination measures provoked revolts among poorer sectors of society, who could not escape the needle. Vaccinism then appeared as a form of social hygiene, and anti-vaccinism as a cause taken up by workers’, feminist and animal-protection movements – more markedly on the left, therefore.
The Revolta da Vacina of 1904 in Brazil was triggered by a forced-vaccination campaign pursued by the young Republic, which set off riots among the working class. In the 20th century, anti-vaccinism is represented on both the right and the left, but almost always at the extremes.
What explains why France, a developed, rich, scientifically advanced country with no shortage of reliable sources of information, today has such high resistance to vaccination, even among health professionals?
Laurent-Henri Vignaud This is a recent phenomenon. France is not exempt from the anti-vaccine tradition. In fact, that tradition was quite virulent in Pasteur’s day [the 19th century], to the point of delaying the introduction of compulsory smallpox vaccination, but it was not a widespread opinion until the early 2000s.
For example, our first “antivax” league appeared in 1954, after the BCG requirement came into force, whereas by then the British and the Americans had already had “antivax” leagues for nearly a century.
Customers show the health pass on their phones at a café in Paris
During the last smallpox epidemic in Brittany, in 1954-1955, by the time the prefect ordered a reinforcement of compulsory vaccination, more than 90% of the inhabitants concerned had already been vaccinated voluntarily.
That confidence was shaken during the debate over the hepatitis B vaccine in the mid-1990s, not least because politicians contradicted one another about its possible dangers. And in the influenza A crisis of 2009, the vaccination campaign failed. The French did not believe a pandemic was possible and did not understand why they should be vaccinated against a disease in which they saw no danger. Perhaps the shock of the covid pandemic will reverse that trend.
How do you explain the fact that rumours, mysticism and irrationality persist, even in an age in which science has developed so much and formal education has reached so many people? Is this embrace of conspiracy theories an inextinguishable human trait?
Laurent-Henri Vignaud Suspicion of technological risks – because a vaccine is a manufactured product – feeds not on a lack of information but on its overflow. It is because we are flooded with information, and cannot process a tenth of it, that we doubt.
Which of us can explain, even roughly, how something as widespread as a mobile phone works? Faced with this superabundance of techno-scientific puzzles and of knowledge we cannot assimilate, citizens 2.0 shop around and believe what they want to believe according to what they judge to be in their interest.
Most people trust words of authority and the little they manage to understand of everything that reaches them. Some are left unsatisfied by the answers they are given and begin to doubt everything, going so far as to imagine parallel, paranoid universes. It is therefore not on ignorance that these beliefs rest, but on a “burden of proof” that weighs ever more heavily on the shoulders of contemporary citizens.
In this “risk society”, contemporary citizens are increasingly urged to take responsibility for themselves and to judge for themselves what is true and what is false. In some, the critical spirit gets carried away and leads to a form of radical scepticism of which anti-vaccinism is a good example.
For Aguayo, minister Bento Albuquerque is “intransigent and hard-headed”. He says it must be hard for the government to admit bringing back daylight saving time because the debate has taken an ideological turn comparable to chloroquine and “early treatment”, when it ought to be more economic, scientific and strategic.
“They are at a critical moment. They cannot count on luck. They cannot count on the luck of a deluge, a tsunami of rain, falling on Brazil. It won’t. They shut themselves up so tightly in that little ideological world of theirs that now they are turning to the esoteric. That’s all they have left,” says Aguayo.
The ministry released a statement on Sunday (17) saying that its meeting with the Fundação Cacique Cobra Coral had not been requested by the ministry.
Placing our faith in forecasting and science could save lives and money
Oliver Uberti
October 14, 2021
2021 is shaping up to be a historically busy hurricane season. And while damage and destruction have been serious, there has been one saving grace — that the National Weather Service has been mostly correct in its predictions.
Thanks to remote sensing, Gulf Coast residents knew to prepare for the “life-threatening inundation,” “urban flooding” and “potentially catastrophic wind damage” that the Weather Service predicted for Hurricane Ida. Meteorologists nailed Ida’s strength, surge and location of landfall while anticipating that a warm eddy would make her intensify too quickly to evacuate New Orleans safely. Then, as her remnants swirled northeast, reports warned of tornadoes and torrential rain. Millions took heed, and lives were saved. While many people died, their deaths resulted from failures of infrastructure and policy, not forecasting.
The long history of weather forecasting and weather mapping shows that having access to good data can help us make better choices in our own lives. Trust in meteorology has made our communities, commutes and commerce safer — and the same is possible for climate science.
Two hundred years ago, the few who studied weather deemed any atmospheric phenomenon a “meteor.” The term, referencing Aristotle’s “Meteorologica,” essentially meant “strange thing in the sky.” There were wet things (hail), windy things (tornadoes), luminous things (auroras) and fiery things (comets). In fact, the naturalist Elias Loomis, who was among the first to spot Halley’s comet upon its return in 1835, thought storms behaved as cyclically as comets. So to understand “the laws of storms,” Loomis and the era’s other leading weatherheads began gathering observations. Master the elements, they reasoned, and you could safely sail the seas, settle the American West, plant crops with confidence and ward off disease.
In 1856, Joseph Henry, the Smithsonian Institution’s first director, hung a map of the United States in the lobby of its Washington headquarters. Every morning, he would affix small colored discs to show the nation’s weather: white for places with clear skies, blue for snow, black for rain and brown for cloud cover. An arrow on each disc allowed him to note wind direction, too. For the first time, visitors could see weather across the expanding country.
Although simple by today’s standards, the map belied the effort and expense needed to select the correct colors each day. Henry persuaded telegraph companies to transmit weather reports every morning at 10. Then he equipped each station with thermometers, barometers, weathervanes and rain gauges — no small task by horse and rail, as instruments often broke in transit.
For longer-term studies of the North American climate, Henry enlisted academics, farmers and volunteers from Maine to the Caribbean. Eager to contribute, “Smithsonian observers” took readings three times a day and posted them to Washington each month. At its peak in 1860, the Smithsonian Meteorological Project had more than 500 observers. Then the Civil War broke out.
Henry’s ranks thinned by 40 percent as men traded barometers for bayonets. Severed telegraph lines and the priority of war messages crippled his network. Then in January 1865, a fire in Henry’s office landed the fatal blow to the project. All of his efforts turned to salvaging what survived. With a vacuum of leadership in Washington, citizen scientists picked up the slack.
Although the Chicago Tribune lampooned Increase Lapham, the Milwaukee naturalist who had petitioned Congress for a storm-warning service, wondering “what practical value” such a service would provide “if it takes 10 years to calculate the progress of a storm,” Rep. Halbert E. Paine (Wis.), who had studied storms under Loomis, rushed a bill into Congress before the winter recess. In early 1870, a joint resolution establishing a storm-warning service under the U.S. Army Signal Office passed without debate. President Ulysses S. Grant signed it into law the following week.
Despite the mandate for an early-warning system, an aversion to predictions remained. Fiscal hawks could not justify an investment in erroneous forecasts, religious zealots could not stomach the hubris, and politicians wary of a skeptical public could not bear the fallout. In 1893, Agriculture Secretary J. Sterling Morton cut the salary of one of the country’s top weather scientists, Cleveland Abbe, by 25 percent, making an example out of him.
While Willis Moore, the chief of the Weather Bureau at the time of the deadly 1900 Galveston hurricane, didn’t face consequences for his dereliction of duty, the bureau’s hurricane-forecasting methods gradually improved as the network expanded and technologies like radio emerged. The advent of aviation increased insight into the upper atmosphere; military research led to civilian weather radar, first deployed at Washington National Airport in 1947. By the 1950s, computers were ushering in the future of numerical forecasting. Meanwhile, public skepticism thawed as more people and businesses saw it in their best interests to trust experts.
In September 1961, a local news team decided to broadcast live from the Weather Bureau office in Galveston, Tex., as Hurricane Carla angled across the Gulf of Mexico. Leading the coverage was a young reporter named Dan Rather. “There is the eye of the hurricane right there,” he told his audience as the radar sweep brought the invisible into view. At the time, no one had seen a radar weather map televised before.
Rather realized that for viewers to comprehend the storm’s size, location and imminent danger, people needed a sense of scale. So he had a meteorologist draw the Texas coast on a transparent sheet of plastic, which Rather laid over the radarscope. Years later, he recalled that when he said “one inch equals 50 miles,” you could hear people in the studio gasp. The sight of the approaching buzz saw persuaded 350,000 Texans to evacuate their homes in what was then the largest weather-related evacuation in U.S. history. Ultimately, Carla inflicted twice as much damage as the Galveston hurricane 60 years earlier. But with the aid of Rather’s impromptu visualization, fewer than 50 lives were lost.
In other words, weather forecasting wasn’t only about good science, but about good communication and visuals.
Data visualization helped the public better understand the weather shaping their lives, and this enabled them to take action. It also gives us the power to see deadly storms not as freak occurrences, but as part of something else: a pattern.
Two hundred years ago, a 10-day forecast would have seemed preposterous. Now we can predict if we’ll need an umbrella tomorrow or a snowplow next week. Imagine if we planned careers, bought homes, built infrastructure and passed policy based on 50-year forecasts as routinely as we plan our weeks by five-day ones.
Unlike our predecessors of the 19th or even 20th centuries, we have access to ample climate data and data visualization that give us the knowledge to take bold actions. What we do with that knowledge is a matter of political will. It may be too late to stop the coming storm, but we still have time to board our windows.
This month, storms forced the cancellation of more than 300 flights at Chicago’s O’Hare airport and at Dallas/Fort Worth airport in Texas. In July, eight flights were cancelled in Denver and another 300 were delayed because of the wildfires that swept the Pacific Northwest of the United States. Extreme heat affected takeoffs in Las Vegas and Colorado at the start of this summer [late June to late September in the northern hemisphere].
The disruptions fit a trend: weather-related flight cancellations and delays have become far more frequent in the United States and Europe over the past two decades, data from regulators show. Although it is difficult to tie any individual storm or heatwave to climate change, scientific studies have established that such events will become more frequent or more intense as the planet warms.
ICAO (the International Civil Aviation Organization), the UN-affiliated body that sets standards for the industry, found in a 2019 survey of its member countries that three-quarters of respondents said their air transport sectors were already experiencing some impact from climate change.
“It’s absolutely something that occupies our thoughts, in terms of whether we can keep maintaining our flight schedule, especially when we consider the growth we have planned for the future,” said David Kensick, vice-president of global operations at United Airlines. “With climate change, we’re seeing weather that’s increasingly hard to predict, so we’ll have to get better at dealing with the situations it creates.”
Airlines account for about 2% of global greenhouse-gas emissions, although some studies suggest that, if other substances emitted by aircraft are taken into account, their impact on the climate may be even greater.
The potential impact of climate change on the industry is wide-ranging. In the short term, severe weather creates operational headaches. Forced diversions and flight cancellations add to the costs of an industry that lost billions of dollars during the pandemic.
Over the longer term, airlines believe that shifting weather patterns will alter flight routes and fuel consumption. Flights between Europe and the United States are likely to take longer as the jet stream over the North Atlantic changes, for example.
“Aviation will be a victim of climate change, as well as being seen by many people as one of the villains,” said Paul Williams, professor of atmospheric science at the University of Reading in the United Kingdom.
The number of delays attributed to bad weather in European airspace rose from 2.5 million in 2003 to a peak of 6.5 million in 2019, according to Eurocontrol data, although part of that rise can be attributed to the industry’s growth. As a proportion of all causes of delay, weather problems rose from 23% to 27% over the same period.
The proportion of flights cancelled in the United States because of weather rose from roughly 35% of the total in 2004 to 54% in 2019, according to the FAA (the US Federal Aviation Administration).
Mark Searle, global director of safety at the International Air Transport Association (IATA), said airlines had been adapting to climate change over the years.
“There is a situation evolving, but it’s not as if we’re standing on the edge of a cliff,” he said. “In fact, we’re managing it very well.”
Para os aeroportos, isso pode significar preparação para níveis de mar mais elevados. O novo terminal de passageiros do aeroporto de Changi, em Cingapura, foi construído apenas 5,5 metros acima do nível médio do mar. A Avinor, que opera aeroportos ao longo da costa da Noruega, determinou que todas as pistas de aterrissagem novas sejam construídas pelo menos sete metros acima do nível do mar.
No caso das companhias de aviação, será necessário recorrer à tecnologia. A American Airlines e a United Airlines melhoraram sua capacidade de prever a proximidade de relâmpagos, permitindo que o trabalho nos pátios continue por mais tempo, antes de uma tempestade que se aproxima, sem colocar em risco o pessoal de terra.
Em diversos de seus aeroportos centrais, a United Airlines, sediada em Chicago, também criou sistemas de taxiagem automática que permitem que aviões sejam conduzidos aos terminais mesmo que tempestades impeçam que agentes de rampa os orientem até os portões.
O clima severo exige pessoal adicional. As operadoras são forçadas a pagar horas extras quando seu pessoal de embarque e dos call centers enfrenta demanda adicional gerada por passageiros tentando reorganizar suas viagens. As empresas terão de calcular se compensa mais pagar o adicional por horas extras, criar turnos adicionais de trabalho ou deixar que os passageiros arquem com as consequências dos problemas.
“Haverá custo adicional de qualquer forma se –e essa é uma questão em aberto– as companhias de aviação decidirem que querem lidar com isso”, disse Jon Jager, analista da Cirium, uma empresa de pesquisa sobre aviação.
Embora os passageiros tipicamente culpem as companhias de aviação pelos problemas que encontram, as regras dos Estados Unidos, Reino Unido e União Europeia não exigem que elas indenizem os passageiros por problemas causados pelo clima. “A Mãe Natureza serve como desculpa para livrar as companhias de aviação de problemas”, disse Jager.
Perturbações surgem não só com tempestades, mas com extremos de calor. Aviões enfrentam dificuldade para decolar em temperaturas muito elevadas, porque o ar quente é menos denso, o que significa que as asas criam menos empuxo aerodinâmico. Quanto mais quente a temperatura, mais leve um avião precisa estar para decolar, especialmente em aeroportos com pistas curtas e em áreas quentes.
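The physics behind that explanation can be sketched with the ideal gas law: at a fixed pressure, hotter air is less dense, and the lift available at a given takeoff speed scales with density. The minimal sketch below uses generic sea-level values and example runway temperatures; it illustrates the principle only, and is not data from any airline or airport.

```python
# Rough illustration of why heat hurts takeoff performance: at fixed pressure,
# air density falls with temperature (ideal gas law), and lift at a given
# speed scales with density. Generic sea-level values, chosen for illustration.

R_SPECIFIC_AIR = 287.05   # J/(kg*K), specific gas constant for dry air
PRESSURE_PA = 101_325     # standard sea-level pressure

def air_density(temp_c: float) -> float:
    """Dry-air density in kg/m^3 at standard sea-level pressure."""
    return PRESSURE_PA / (R_SPECIFIC_AIR * (temp_c + 273.15))

cool, hot = 20.0, 45.0    # example runway temperatures in degrees Celsius
loss = 1 - air_density(hot) / air_density(cool)
print(f"Density at {cool:.0f} C: {air_density(cool):.3f} kg/m^3")
print(f"Density at {hot:.0f} C:  {air_density(hot):.3f} kg/m^3")
print(f"Lift at the same takeoff speed is about {loss:.0%} lower on the hot day,")
print("so the aircraft must be lighter or the runway longer.")
```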
Williams, the atmospheric scientist, published a study finding that, for an Airbus A320 taking off from the Greek island of Chios, the payload had to be reduced by about 130 kilograms per year over three decades – roughly the weight of one passenger and their luggage.
IATA is negotiating with its members over the adoption of new climate-change targets this year. The industry's current targets, adopted in 2009, include halving 2005 emission levels by 2050 and making all growth carbon-neutral from 2020 onwards.
But in many parts of the industry, especially in Europe and the United States, there is a conviction that tougher targets, including a commitment to net-zero emissions, are needed.
"We believe we probably need to go further, and we are working on it," Alexandre de Juniac, who is stepping down as head of IATA, told the Financial Times a few months ago.
Williams said the aviation industry's approach to climate change appeared to be shifting.
"Historically, there were a lot of climate-change sceptics in the aviation industry, but I have noticed a change," he said. "The industry is much more honest now."
Opinion | The Climate Has a Gun (The Wall Street Journal)
Those who dismiss risk of climate change often appeal to uncertainty, but they have it backward.
Aug. 17, 2021 1:14 pm ET
In “Climate Change Brings a Flood of Hyperbole” (op-ed, Aug. 11), Steven Koonin put himself in the unenviable position of playing down climate change precisely while we are experiencing unprecedented heat waves, storms, fires, droughts, and floods that exceed model-based expectations.
Mr. Koonin claims that regional projections are “meant to scare people.” But the paper he cites for support addresses the “unfolding of what may become catastrophic changes to Earth’s climate” and argues that “being able to anticipate what would otherwise be surprises in extreme weather and climate variations” requires better models. In other words, our current models cannot rule out a catastrophic future.
Model uncertainty is two-edged. If we’d been lucky, we’d be discovering that we overestimated the danger. But all indicators suggest the opposite. Those who dismiss climate risk often appeal to uncertainty, but they have it backward. Climate uncertainty is like not knowing how many shots Dirty Harry fired from his .44-caliber Magnum. Now that it’s pointed at our head, it’s dawning on us that we’ve probably miscalculated. By the time we’re sure, it’s too late. We’ve got to ask ourselves one question: Do we feel lucky? Well, do we?
Adj. Prof. Mark Boslough – University of New Mexico
Opinion | Climate Change Brings a Flood of Hyperbole (The Wall Street Journal)
Despite constant warnings of catastrophe, things aren’t anywhere near as dire as the media say.
Steven E. Koonin – Aug. 10, 2021 6:33 pm ET
The Intergovernmental Panel on Climate Change has issued its latest report assessing the state of the climate and projecting its future. As usual, the media and politicians are exaggerating and distorting the evidence in the report. They lament an allegedly broken climate and proclaim, yet again, that we are facing the “last, best chance” to save the planet from a hellish future. In fact, things aren’t—and won’t be—anywhere near as dire.
The new report, titled AR6, is almost 4,000 pages, written by several hundred government-nominated scientists over the past four years. It should command our attention, especially because this report will be a crucial element of the coming United Nations Climate Change Conference in Glasgow. Leaders from 196 countries will come together there in November, likely to adopt more-aggressive nonbinding pledges to reduce greenhouse-gas emissions.
Previous climate-assessment reports have misrepresented scientific research in the “conclusions” presented to policy makers and the media. The summary of the most recent U.S. government climate report, for instance, said heat waves across the U.S. have become more frequent since 1960, but neglected to mention that the body of the report shows they are no more common today than they were in 1900. Knowledgeable independent scientists need to scrutinize the latest U.N. report because of the major societal and economic disruptions that would take place on the way to a “net zero” world, including the elimination of fossil-fueled electricity, transportation and heat, as well as complete transformation of agricultural methods.
It is already easy to see things in this report that you almost certainly won’t learn from the general media coverage. Most important, the model muddle continues. We are repeatedly told “the models say.” But the complicated computer models used to project future temperature, rainfall and so on remain deficient. Some models are far more sensitive to greenhouse gases than others. Many also disagree on the baseline temperature for the Earth’s surface.
The latest models also don’t reproduce the global climate of the past. The models fail to explain why rapid global warming occurred from 1910 to 1940, when human influences on the climate were less significant. The report also presents an extensive “atlas” of future regional climates based on the models. Sounds authoritative. But two experts, Tim Palmer and Bjorn Stevens, write in the Proceedings of the National Academy of Sciences that the lack of detail in current modeling approaches makes them “not fit” to describe regional climate. The atlas is mainly meant to scare people.
The social cost of carbon could guide us toward intelligent policies – if only we knew what it was.
In contrast to the existential angst currently in fashion around climate change, there’s a cold-eyed calculation that its advocates, mostly economists, like to call the most important number you’ve never heard of.
It’s the social cost of carbon. It reflects the global damage of emitting one ton of carbon dioxide into the sky, accounting for its impact in the form of warming temperatures and rising sea levels. Economists, who have squabbled over the right number for a decade, see it as a powerful policy tool that could bring rationality to climate decisions. It’s what we should be willing to pay to avoid emitting that one more ton of carbon.
For most of us, it’s a way to grasp how much our carbon emissions will affect the world’s health, agriculture, and economy for the next several hundred years. Maximilian Auffhammer, an economist at the University of California, Berkeley, describes it this way: it’s approximately the damage done by driving from San Francisco to Chicago, assuming that about a ton of carbon dioxide spits out of the tailpipe over those 2,000 miles.
Common estimates of the social cost of that ton are $40 to $50. The cost of the fuel for the journey in an average car is currently around $225. In other words, you’d pay roughly 20% more to take the social cost of the trip into account.
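As a rough back-of-the-envelope check of that comparison, here is a minimal sketch; the trip length, fuel cost and the $40–$50 range are the figures quoted above, and the one-ton emissions figure is the article's own approximation.

```python
# Back-of-the-envelope check of the San Francisco-to-Chicago example above.
# All inputs are the approximate figures quoted in the article.

trip_miles = 2_000
co2_tons = 1.0                  # "about a ton of carbon dioxide" for the trip
fuel_cost = 225.0               # dollars of fuel for an average car
social_cost_per_ton = (40, 50)  # common estimates, dollars per ton of CO2

for scc in social_cost_per_ton:
    trip_social_cost = co2_tons * scc
    markup = trip_social_cost / fuel_cost
    print(f"SCC ${scc}/ton -> social cost of the trip ${trip_social_cost:.0f} "
          f"(~{markup:.0%} on top of fuel)")
# Prints roughly 18% and 22%, i.e. "roughly 20% more", as the article says.
```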
The number is contentious, however. A US federal working group in 2016, convened by President Barack Obama, calculated it at around $40, while the Trump administration has recently put it at $1 to $7. Some academic researchers cite numbers as high as $400 or more.
Why so wide a range? It depends on how you value future damages. And there are uncertainties over how the climate will respond to emissions. But another reason is that we actually have very little insight into just how climate change will affect us over time. Yes, we know there’ll be fiercer storms and deadly wildfires, heat waves, droughts, and floods. We know the glaciers are melting rapidly and fragile ocean ecosystems are being destroyed. But what does that mean for the livelihood or life expectancy of someone in Ames, Iowa, or Bangalore, India, or Chelyabinsk, Russia?
For the first time, vast amounts of data on the economic and social effects of climate change are becoming available, and so is the computational power to make sense of it. Taking this opportunity to compute a precise social cost of carbon could help us decide how much to invest and which problems to tackle first.
“It is the single most important number in the global economy,” says Solomon Hsiang, a climate policy expert at Berkeley. “Getting it right is incredibly important. But right now, we have almost no idea what it is.”
That could soon change.
The cost of death
In the past, calculating the social cost of carbon typically meant estimating how climate change would slow worldwide economic growth. Computer models split the world into at most a dozen or so regions and then averaged the predicted effects of climate change to get the impact on global GDP over time. It was at best a crude number.
Over the last several years, economists, data scientists, and climate scientists have worked together to create far more detailed and localized maps of impacts by examining how temperatures, sea levels, and precipitation patterns have historically affected things like mortality, crop yields, violence, and labor productivity. This data can then be plugged into increasingly sophisticated climate models to see what happens as the planet continues to warm.
The wealth of high-resolution data makes a far more precise number possible—at least in theory. Hsiang is co-director of the Climate Impact Lab, a team of some 35 scientists from institutions including the University of Chicago, Berkeley, Rutgers, and the Rhodium Group, an economic research organization. Their goal is to come up with a number by looking at about 24,000 different regions and adding together the diverse effects that each will experience over the coming hundreds of years in health, human behavior, and economic activity.
It’s a huge technical and computational challenge, and it will take a few years to come up with a single number. But along the way, the efforts to better understand localized damages are creating a nuanced and disturbing picture of our future.
So far, the researchers have found that climate change will kill far more people than once thought. Michael Greenstone, a University of Chicago economist who co-directs the Climate Impact Lab with Hsiang, says that previous mortality estimates had looked at seven wealthy cities, most in relatively cool climates. His group looked at data gleaned from 56% of the world’s population. It found that the social cost of carbon due to increased mortality alone is $30, nearly as high as the Obama administration’s estimate for the social cost of all climate impacts. An additional 9.1 million people will die every year by 2100, the group estimates, if climate change is left unchecked (assuming a global population of 12.7 billion people).
Unfairly distributed
However, while the Climate Impact Lab’s analysis showed that 76% of the world’s population would suffer from higher mortality rates, it found that warming temperatures would actually save lives in a number of northern regions. That’s consistent with other recent research; the impacts of climate change will be remarkably uneven.
The variations are significant even within some countries. In 2017, Hsiang and his collaborators calculated climate impacts county by county in the United States. They found that every degree of warming would cut the country’s GDP by about 1.2%, but the worst-hit counties could see a drop of around 20%.
If climate change is left to run unchecked through the end of the century, the southern and southwestern US will be devastated by rising rates of mortality and crop failure. Labor productivity will slow, and energy costs (especially due to air-conditioning) will rise. In contrast, the northwestern and parts of the northeastern US will benefit.
“It is a massive restructuring of wealth,” says Hsiang. This is the most important finding of the last several years of climate economics, he adds. By examining ever smaller regions, you can see “the incredible winners and losers.” Many in the climate community have been reluctant to talk about such findings, he says. “But we have to look [the inequality] right in the eye.”
The social cost of carbon is typically calculated as a single global number. That makes sense, since the damage of a ton of carbon emitted in one place is spread throughout the world. But last year Katharine Ricke, a climate scientist at UC San Diego and the Scripps Institution of Oceanography, published the social costs of carbon for specific countries to help parse out regional differences.
India is the big loser. Not only does it have a fast-growing economy that will be slowed, but it’s already a hot country that will suffer greatly from getting even hotter. “India bears a huge share of the global social cost of carbon—more than 20%,” says Ricke. It also stands out for how little it has actually contributed to the world’s carbon emissions. “It’s a serious equity issue,” she says.
Estimating the global social cost of carbon also raises a vexing question: How do you put a value on future damages? We should invest now to help our children and grandchildren avoid suffering, but how much? This is hotly and often angrily debated among economists.
A standard tool in economics is the discount rate, used to calculate how much we should invest now for a payoff years from now. The higher the discount rate, the less you value the future benefit. William Nordhaus, who won the 2018 Nobel Prize in economics for pioneering the use of models to show the macroeconomic effects of climate change, has used a discount rate of around 4%. The relatively high rate suggests we should invest conservatively now. In sharp contrast, a landmark 2006 report by British economist Nicholas Stern used a discount rate of 1.4%, concluding that we should begin investing much more heavily to slow climate change.
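To see how strongly the discount rate drives these conclusions, here is a hedged numerical sketch comparing Nordhaus's roughly 4% rate with Stern's 1.4%; the $1 trillion damage figure and the 100-year horizon are made-up illustrative inputs, not numbers from either analysis.

```python
# Present value of a hypothetical $1 trillion climate damage occurring 100
# years from now, under the two discount rates discussed above. The damage
# figure and horizon are illustrative assumptions, not Nordhaus or Stern data.

def present_value(future_damage: float, rate: float, years: int) -> float:
    """Standard exponential discounting: PV = damage / (1 + rate)**years."""
    return future_damage / (1 + rate) ** years

damage = 1_000_000_000_000  # $1 trillion, assumed
years = 100

for rate in (0.04, 0.014):  # ~4% (Nordhaus) vs 1.4% (Stern)
    pv = present_value(damage, rate, years)
    print(f"discount rate {rate:.1%}: worth spending up to ${pv / 1e9:.0f} billion today")
# Roughly $20 billion at 4% versus $249 billion at 1.4% -- an order-of-magnitude gap.
```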
There’s an ethical dimension to these calculations. Wealthy countries whose prosperity has been built on fossil fuels have an obligation to help poorer countries. The climate winners can’t abandon the losers. Likewise, we owe future generations more than just financial considerations. What’s the value of a world free from the threat of catastrophic climate events—one with healthy and thriving natural ecosystems?
Outrage
Enter the Green New Deal (GND). It’s the sweeping proposal issued earlier this year by Representative Alexandria Ocasio-Cortez and other US progressives to address everything from climate change to inequality. It cites the dangers of temperature increases beyond the UN goal of 1.5 °C and makes a long list of recommendations. Energy experts immediately began to bicker over its details: Is achieving 100% renewables in the next 12 years really feasible? (Probably not.) Should it include nuclear power, which many climate activists now argue is essential for reducing emissions?
In reality, the GND has little to say about actual policies and there’s barely a hint of how it will attack its grand challenges, from providing a secure retirement for all to fostering family farms to ensuring access to nature. But that’s not the point. The GND is a cry of outrage against what it calls “the twin crises of climate change and worsening income inequality.” It’s a political attempt to make climate change part of the wider discussion about social justice. And, at least from the perspective of climate policy, it’s right in arguing that we can’t tackle global warming without considering broader social and economic issues.
The work of researchers like Ricke, Hsiang, and Greenstone supports that stance. Not only do their findings show that global warming can worsen inequality and other social ills; they provide evidence that aggressive action is worth it. Last year, researchers at Stanford calculated that limiting warming to 1.5 °C would save upwards of $20 trillion worldwide by the end of the century. Again, the impacts were mixed—the GDPs of some countries would be harmed by aggressive climate action. But the conclusion was overwhelming: more than 90% of the world’s population would benefit. Moreover, the cost of keeping temperature increases limited to 1.5 °C would be dwarfed by the long-term savings.
Nevertheless, the investments will take decades to pay for themselves. Renewables and new clean technologies may lead to a boom in manufacturing and a robust economy, but the Green New Deal is wrong to paper over the financial sacrifices we’ll need to make in the near term.
That is why climate remedies are such a hard sell. We need a global policy—but, as we’re always reminded, all politics is local. Adding 20% to the cost of that San Francisco–Chicago trip might not seem like much, but try to convince a truck driver in a poor county in Florida that raising the price of fuel is wise economic policy. A much smaller increase sparked the gilets jaunes riots in France last winter. That is the dilemma, both political and ethical, that we all face with climate change.
The latest landmark climate science report goes much further than previous ones in providing estimates of how bad things might get as the planet heats up, even if a lack of data may mean it underestimates the perils.
Scientists have used the seven years since the previous assessment report of the Intergovernmental Panel on Climate Change (IPCC) to narrow the uncertainties around major issues, such as how much the planet will warm if we double atmospheric levels of carbon dioxide and other greenhouse gases.
While temperatures have risen largely in lockstep with rising CO2, this IPCC report examines in much more detail the risks of so-called abrupt changes, when relatively stable systems abruptly and probably irreversibly shift to a new state.
Michael Mann, director of Pennsylvania State University's Earth System Science Center and one of the world's most prominent climate researchers, says the models are not capturing all the risks as the climate heats up.
Running AMOC
Perhaps the most prominent of these threats is a possible stalling of the Atlantic Meridional Overturning Circulation (AMOC). Closely associated with the Gulf Stream, it brings tropical water north from the Caribbean, keeping northern Europe much warmer than its latitude might otherwise suggest, and threatening massive disruptions if it slows or stops.
“Where the models have underestimated the impact is with projections of ice melt, the AMOC, and – I argue in my own work – the uptick on extreme weather events,” Professor Mann tells the Herald and The Age.
Stefan Rahmstorf, head of research at the Potsdam Institute for Climate Impact Research, agrees that climate models have not done a good job of reproducing the so-called cold blob that is forming where melting Greenland ice is cooling the subpolar Atlantic.
Breaking up: The US Coast Guard Icebreaker Healy on a research cruise in the Chukchi Sea of the Arctic Ocean. Credit: AP
If they are not picking that blob up, “should we trust those models on AMOC stability?” Professor Rahmstorf asks.
The IPCC’s language, too, doesn’t necessarily convey the nature of the threat, much of which will be detailed in the second AR6 report on the impacts of climate change, scheduled for release next February.
“Like just stating the AMOC collapse by 2100 is ‘very unlikely’ – that was in a previous report – it sounds reassuring,” Professor Rahmstorf said. “Now the IPCC says they have ‘medium confidence’ that it won’t happen by 2100, whatever that means.”
West Antarctica has enough ice to raise global sea levels by more than 3 metres if it melts. Credit: Ian Joughin
West Antarctic melt
Another potential tipping point is the possible disintegration of the West Antarctic ice sheet. Much of the sheet lies below sea level, and as the Southern Ocean warms it will melt, causing the ice to "flow" towards the sea in a process that is expected to be self-sustaining.
This so-called marine ice sheet instability is identified in the IPCC report as likely resulting in ice mass loss under all emissions scenarios. There is also “deep uncertainty in projections for above 3 degrees of warming”, the report states.
With enough water to lift sea levels by 3.3 metres, what happens to the ice sheet matters. And, as Andrew Mackintosh, an ice expert at Monash University, says, our understanding is limited: "We know more about the surface of Mars than the ice sheet bed under the ice."
Permafrost not so permanent
Much has been made about the so-called “methane bomb” sitting under the permafrost in the northern hemisphere. As the Arctic has warmed at more than twice the pace of the globe overall, with heatwaves of increasing intensity and duration, it is not surprising that the IPCC has listed the release of so-called biogenic emissions from permafrost thaw as among potential tipping points.
These emissions could total up to 240 gigatonnes of CO2-equivalent which, if released, would add an unwanted warming boost.
The IPCC lists as “high” the probability of such releases during this century, adding there is “high confidence” that the process is irreversible at century scales.
“In some cases abrupt changes can occur in Earth System Models but don’t on the timescales of the projections (for example, an AMOC collapse),” said Peter Cox, a Professor of Climate System Dynamics at the UK’s University of Exeter. “In other cases the processes involved are not yet routinely included in ESMs [such as] CO2 and methane release from deep permafrost.”
“In the latter cases IPCC statements are made on the basis of the few studies available, and are necessarily less definitive,” he said.
Other risks
From the Amazon rainforest to the boreal forests of Russia and Canada, there is a risk of fire and pests that could trigger dieback and transform those regions.
Australia’s bush faces an increased risk of bad fire weather days right across the continent, the IPCC notes. How droughts, heatwaves and heavy rain and other extreme events will play out at a local level is also not well understood.
Ocean acidification and marine heatwaves also mean the world's coral reefs will be much diminished at more than 1.5 degrees of warming. "You can kiss it goodbye as we know it," said Sarah Perkins-Kirkpatrick, a climate researcher at the University of NSW.
Global monsoons, which affect billions of people including those on the Indian subcontinent, are likely to increase their rainfall in most parts of the world, the IPCC said.
Andy Pitman, director of the ARC Centre of Excellence for Climate Extremes, said policymakers need to understand that much is riding on these tipping points not being triggered as even one or two of them would have long-lasting and significant effects. “How lucky do you feel?” Professor Pitman says.
The biggest uncertainty
Christian Jakob, a Monash University climate researcher, said that while important uncertainties remain, science is narrowing most of those risks down.
Much harder to gauge, though, is which emissions path humanity will take. The uncertainty over which of the five scenarios, ranging from low to high emissions, we will end up following is "much larger than the uncertainty we have in the science," Professor Jakob said.
The impact of the rise in Earth's average temperature is planetary in scale, with sea level rise and the alteration of entire ecosystems, among other changes.
Regional changes in the climate, with more frequent extreme events, are already being felt and will intensify in the coming years, with direct consequences for everyone's health.
In Brazil, some states will face more days of extreme heat, which can be harmful enough to cause the deaths of elderly people.
In others, intense rainfall will become more common, causing floods that increase the risk of disease, when they do not destroy neighbourhoods and entire cities.
Finally, droughts are also expected to become more intense, which may aggravate respiratory problems.
In addition, both intense rainfall and droughts damage crops, pushing up food prices.
A practical example of rising temperatures is found in Brazil's Southeast and South. Under the IPCC's most optimistic scenario, by 2040 the number of days per year with thermometers above 35°C will go from 26 (the 1995-2014 average) to 32. In an intermediate scenario, by the end of the century that number could reach 43, an increase of more than 65% over the recent figure.
In the Center-West, the increase in extreme heat is even more severe. In the IPCC's intermediate scenario, the average of 53 days per year above 35°C jumps to about 72 by 2040 and to 108 by the end of the century, slightly more than a quarter of the year under extreme temperatures.
The health consequences are serious. Extreme heat waves can cause hyperthermia, which affects internal organs and damages the heart, muscle cells and blood vessels. Such damage can be fatal.
A man pushes a fruit cart along a flooded street in Manaus, which in recent months faced the largest flood of the Negro River ever recorded – Michael Dantas/AFP
In June, a heat wave in the US states of Oregon and Washington cost the lives of hundreds of people. According to The New York Times, about 600 excess deaths were recorded in the period.
Beyond the heat, the climate crisis is expected to make droughts and rainless days more frequent in many regions. That is the case in the Amazon.
IPCC data indicate that Brazil's North region averaged 43 consecutive rainless days per year in 1995-2014, a figure that could rise to 51, with dry periods 10% drier, by 2040.
A similar situation is expected in the Center-West, which had 69 consecutive rainless days per year, a figure that could rise to 76, with dry periods 13% drier.
Drier periods in these regions are worrying because of fires. In the Amazon, for example, the dry season is associated with the intensification of deforestation and burning.
Fires in the Amazon region are linked to worsening air quality and the respiratory problems that follow. Fiocruz and the NGO WWF-Brasil estimate that Amazonian states with high rates of fires spent nearly R$ 1 billion over ten years on hospitalisations for respiratory diseases probably related to smoke from the fires.
Last year, the Pantanal went through its worst drought in 60 years, a dry spell that could still last for up to five more years, according to the National Secretariat for Protection and Civil Defense at the time. The situation caused the number of fires in the region to explode.
The IPCC also points to an increase in the frequency and intensity of extreme rainfall and floods in several regions of Brazil.
Beyond the obvious damage to urban infrastructure, floods cause health problems. Hepatitis A (transmitted via the fecal-oral route, that is, through contaminated food and water) and leptospirosis (transmitted through contact with rat urine) are familiar suspects, but there is also the risk of encounters with venomous animals, since snakes and scorpions may seek shelter inside homes.
Manaus became a recent example of this kind of situation. The city faced a historic flood, the largest since measurements began 119 years ago. The waters of the Negro River caused flooding that lasted more than a month in the main capital of the Amazon region. Six of the ten largest floods ever recorded on the river have occurred in the 21st century, that is, in the past two decades.
Streets in the port area of Manaus had to be closed off, and walkways had to be built over the flooded roads. Meanwhile, shopkeepers built barriers with sandbags and threw lime into the standing water to try to neutralise the smell of sewage.
Amid the flooding of the igarapés (urban streams), garbage accumulated, at times covering the entire surface of the water. Inside their homes, residents used wooden platforms (known as marombas) to raise furniture and appliances.
Floods are not exclusive to the Amazon. They also occur in the Southeast, in São Paulo and Rio de Janeiro, for example.
Shortly after the flood in Manaus, Europe also saw intense rain concentrated in a short period cause severe flooding, mainly in Germany. Besides the destruction of roads and buildings, there were more than a hundred deaths.
Around the same time, China had to cope with heavy rainfall and the loss of human lives in floods that filled the subway with water, trapping people inside. It was the worst rain in 60 years in Zhengzhou, capital of Henan province.
Globally, a recent study pointed to an increase in the population exposed to floods. From 2000 to 2015, between 255 million and 290 million people were directly affected by flooding.
According to Lincoln Alves, a researcher at INPE (Brazil's National Institute for Space Research) and a lead author of the IPCC Atlas, the tool is intended to make normally complex information easier to access. "The change in the climate is visible," the researcher says.
Using the Atlas, Alves says, communities, companies and even levels of government can take a more regional look at the effects of the climate crisis.
The tool makes it possible to see the Earth's climate history and to view projections for different variables under the different emissions – and warming, such as 1.5°C and 2°C – scenarios laid out by the IPCC.
MAIN CONCLUSIONS OF THE IPCC REPORT
Human-caused temperature increase from 1850-1900 to 2010-2019: 0.8°C to 1.21°C
The years 2016 to 2020 were the warmest five-year period from 1850 to 2020
From 2021 to 2040, a temperature increase of 1.5°C is at least likely to occur under any emissions scenario
Stabilising Earth's temperature could take 20 to 30 years if there are strong and sustained emissions cuts
The ocean is warming faster – including at depths below 2,000 m – than in any previous period since at least the last glacial transition. It is extremely likely that human activities are the main driver
The ocean will continue to warm throughout the 21st century and probably until 2300, even under low-emissions scenarios
The warming of the deep ocean and the melting of ice masses will raise sea level, a rise expected to persist for thousands of years
Over the next 2,000 years, global mean sea level is expected to rise 2 to 3 metres if the temperature increase is held to 1.5°C. If global warming is held to 2°C, it is expected to rise 2 to 6 metres. With 5°C of warming, the sea will rise 19 to 22 metres
As the Intergovernmental Panel on Climate Change (IPCC) released its Sixth Assessment Report, summarized nicely on these pages by Bob Henson, much of the associated media coverage carried a tone of inevitable doom.
These proclamations of unavoidable adverse outcomes center around the fact that in every scenario considered by IPCC, within the next decade average global temperatures will likely breach the aspirational goal set in the Paris climate agreement of limiting global warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit) above pre-industrial temperatures. The report also details a litany of extreme weather events like heatwaves, droughts, wildfires, floods, and hurricanes that will all worsen as long as global temperatures continue to rise.
While United Nations Secretary-General António Guterres rightly called the report a “code red for humanity,” tucked into it are details illustrating that if – BIG IF – top-emitting countries respond to the IPCC’s alarm bells with aggressive efforts to curb carbon pollution, the worst climate outcomes remain avoidable.
The IPCC’s future climate scenarios
In the Marvel film Avengers: Infinity War, the Dr. Strange character goes forward in time to view 14,000,605 alternate futures to see all the possible outcomes of the Avengers’ coming conflict. Lacking the fictional Time Stone used in this gambit, climate scientists instead ran hundreds of simulations of several different future carbon emissions scenarios using a variety of climate models. Like Dr. Strange, climate scientists’ goal is to determine the range of possible outcomes given different actions taken by the protagonists: in this case, various measures to decarbonize the global economy.
The scenarios considered by IPCC are called Shared Socioeconomic Pathways (SSPs). The best-case climate scenario, called SSP1, involves a global shift toward sustainable management of global resources and reduced inequity. The next scenario, SSP2, is more of a business-as-usual path with slow and uneven progress toward sustainable development goals and persisting income inequality and environmental degradation. SSP3 envisions insurgent nationalism around the world with countries focusing on their short-term domestic best interests, resulting in persistent and worsening inequality and environmental degradation. Two more scenarios, SSP4 and SSP5, consider even greater inequalities and fossil fuel extraction, but seem at odds with an international community that has agreed overwhelmingly to aim for the Paris climate targets.
The latest IPCC report’s model runs simulated two SSP1 scenarios that would achieve the Paris targets of limiting global warming to 1.5 and 2°C (2.7 and 3.6°F); one SSP2 scenario in which temperatures approach 3°C (5.4°F) in the year 2100; an SSP3 scenario with about 4°C (7.2°F) global warming by the end of the century; and one SSP5 ‘burn all the fossil fuels possible’ scenario resulting in close to 5°C (9°F), again by 2100.
Projected global average surface temperature change in each of the five SSP scenarios. (Source: IPCC Sixth Assessment Report)
The report’s SSP3-7.0 pathway (the latter number represents the eventual global energy imbalance caused by the increased greenhouse effect, in watts per square meter), is considered by many experts to be a realistic worst-case scenario, with global carbon emissions continuing to rise every year throughout the 21st century. Such an outcome would represent a complete failure of international climate negotiations and policies and would likely result in catastrophic consequences, including widespread species extinctions, food and water shortages, and disastrous extreme weather events.
Scenario SSP2-4.5 is more consistent with government climate policies that are currently in place. It envisions global carbon emissions increasing another 10% over the next decade before reaching a plateau that’s maintained until carbon pollution slowly begins to decline starting in the 2050s. Global carbon emissions approach but do not reach zero by the end of the century. Even in this unambitious scenario, the very worst climate change impacts might be averted, although the resulting climate impacts would be severe.
Most encouragingly, the report’s two SSP1 scenarios illustrate that the Paris targets remain within reach. To stay below the main Paris target of 2°C (3.6°F) warming, global carbon emissions in SSP1-2.6 plateau essentially immediately and begin to decline after 2025 at a modest rate of about 2% per year for the first decade, then accelerating to around 3% per year the next decade, and continuing along a path of consistent year-to-year carbon pollution cuts before reaching zero around 2075. The IPCC concluded that once global carbon emissions reach zero, temperatures will stop rising. Toward the end of the century, emissions in SSP1-2.6 move into negative territory as the IPCC envisions that efforts to remove carbon from the atmosphere via natural and technological methods (like sequestering carbon in agricultural soils and scrubbing it from the atmosphere through direct air capture) outpace overall fossil fuel emissions.
Meeting the aspirational Paris goal of limiting global warming to 1.5°C (2.7°F) in SSP1-1.9 would be extremely challenging, given that global temperatures are expected to breach this level within about a decade. This scenario similarly envisions that global carbon emissions peak immediately and that they decline much faster than in SSP1-2.6, at a rate of about 6% per year from 2025 to 2035 and 9% per year over the following decade, reaching net zero by around the year 2055 and becoming net negative afterwards.
Global carbon dioxide emissions (in billions of tons per year) from 2015 to 2100 in each of the five SSP scenarios. (Source: IPCC Sixth Assessment Report)
For perspective, global carbon emissions fell by about 6-7% in 2020 as a result of restrictions associated with the COVID-19 pandemic and are expected to rebound by a similar amount in 2021. As IPCC report contributor Zeke Hausfather noted, this scenario also relies on large-scale carbon sequestration technologies that currently do not exist, without which global emissions would have to reach zero a decade sooner.
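For a feel for what those percentage cuts add up to, the sketch below compounds the approximate SSP1-2.6 decline rates described above; the ~40 GtCO2-per-year starting point and the post-2045 schedule are assumptions for illustration, not the IPCC's pathway data.

```python
# Rough arithmetic for the SSP1-2.6-like pathway described above: ~2%/yr cuts
# for a decade from 2025, ~3%/yr the next decade, then steady cuts to zero by
# ~2075. The starting level (~40 GtCO2/yr) is an assumed round number.

start = 40.0                                # GtCO2 per year around 2025 (assumption)
after_2035 = start * (1 - 0.02) ** 10       # a decade of ~2%/yr cuts
after_2045 = after_2035 * (1 - 0.03) ** 10  # a decade of ~3%/yr cuts

# Constant absolute reduction needed from 2045 onward to hit zero around 2075:
linear_cut = after_2045 / (2075 - 2045)

print(f"~{after_2035:.1f} GtCO2/yr by 2035")
print(f"~{after_2045:.1f} GtCO2/yr by 2045")
print(f"then ~{linear_cut:.1f} GtCO2/yr of absolute cuts each year to reach zero by ~2075")
```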
More warming means more risk
The new IPCC report details that, depending on the region, climate change has already worsened extreme heat, drought, fires, floods, and hurricanes, and those will only become more damaging and destructive as temperatures continue to rise. The IPCC’s 2018 “1.5°C Report” had detailed the differences in climate consequences in a 2°C vs. 1.5°C world, as summarized at this site by Bruce Lieberman.
Consider that in the current climate of just over 1°C (2°F) warmer than pre-industrial temperatures, 40 countries this summer alone have experienced extreme flooding, including more than a year’s worth of rain falling within 24 hours in Zhengzhou, China. Many regions have also experienced extreme heat, including the deadly Pacific Northwest heatwave and dangerously hot conditions during the Olympics in Tokyo. Siberia, Greece, Italy, and the US west coast are experiencing explosive wildfires, including the “truly frightening fire behavior” of the Dixie fire, which broke the record as the largest single wildfire on record in California. The IPCC report warned of “compound events” like heat exacerbating drought, which in turn fuels more dangerous wildfires, as is happening in California.
Western North America (WNA) and the Mediterranean (MED) regions are those for which climate scientists have the greatest confidence that human-caused global warming is exacerbating drought by drying out the soil. (Source: IPCC Sixth Assessment Report)
The southwestern United States and Mediterranean are also among the regions for which climate scientists have the greatest confidence that climate change will continue to increase drought risk and severity. (Source: IPCC Sixth Assessment Report)
The IPCC report notes that the low-emissions SSP1 scenarios “would lead to substantially smaller changes” in these sorts of climate impact drivers than the higher-emissions scenarios. It also points out that, with the world currently at around 1°C of warming, the increase in extreme weather intensity relative to today will be twice as large if temperatures reach 2°C (1°C hotter than today) as it would be if warming is limited to 1.5°C (0.5°C hotter than today), and four times as large if global warming reaches 3°C (2°C hotter than today). For example, what was an extreme once-in-50-years heat wave in the late 1800s now occurs once per decade; that would rise to almost twice per decade at 1.5°C, and nearly three times per decade at 2°C of global warming.
The increasing frequency and intensity of what used to be 1-in-50-year extreme heat as global temperatures rise. (Source: IPCC Sixth Assessment Report)
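One way to read those "per decade" figures is to convert frequency multipliers into return periods, as in the sketch below; the multipliers (roughly 5, 9 and 14 times the late-1800s frequency) are approximate values implied by the article's wording rather than exact numbers from the report.

```python
# Convert "N times more frequent" into an approximate return period for what
# was a 1-in-50-year heat extreme in the late 1800s. The multipliers are rough
# values implied by the article's wording, not exact figures from the IPCC.

baseline_return_period = 50.0  # years, late-1800s climate

frequency_multipliers = {
    "~1.0 C (today)": 5,
    "1.5 C": 9,
    "2.0 C": 14,
}

for warming, mult in frequency_multipliers.items():
    new_return_period = baseline_return_period / mult
    per_decade = 10 / new_return_period
    print(f"{warming}: about once every {new_return_period:.0f} years "
          f"(~{per_decade:.1f} times per decade)")
```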
Climate’s fate has yet to be written
At the same time, there is no tipping point temperature at which it becomes “too late” to curb climate change and its damaging consequences. Every additional bit of global warming above current temperatures will result in increased risks of worsening extreme weather of the sorts currently being experienced around the world. Achieving the aspirational 1.5°C Paris target may be politically infeasible, but most countries (137 total) have either committed to or are in the process of setting a target for net zero emissions by 2050 (including the United States) or 2060 (including China).
That makes the SSP1 scenarios and limiting global warming to less than 2°C a distinct possibility, depending on how successful countries are at following through with decarbonization plans over the coming three decades. And with its proposed bipartisan infrastructure and budget reconciliation legislative plans – final enactment of each remains another big IF – the United States could soon implement some of the bold investments and policies necessary to set the world’s second-largest carbon polluter on a track consistent with the Paris targets.
Again and again, assessment after assessment, the IPCC has already made it clear. Climate change puts at risk every aspect of human life as we know it … We are already starting to experience those risks today; but we know what we need to do to avoid the worst future impacts. The difference between a fossil fuel versus a clean energy future is nothing less than the future of civilization as we know it.
Back to the Avengers: They had only one chance in 14 million to save the day, and they succeeded. Time is running short, but policymakers’ odds of meeting the Paris targets remain much better than that. There are no physical constraints playing the role of Thanos in our story; only political barriers stand between humanity and a prosperous clean energy future, although those can sometimes be the most difficult types of barriers to overcome.
The new IPCC report is “a code red for humanity”, says UN Secretary-General António Guterres.
Established in 1988 by the United Nations Environment Programme (UNEP) and the World Meteorological Organisation (WMO), the Intergovernmental Panel on Climate Change (IPCC) assesses climate change science. Its new report is a warning sign for policymakers all over the world.
In this picture taken on 26 October, 2014, Peia Kararaua, 16, swims in the flooded area of Aberao village in Kiribati. Kiribati is one of the countries worst hit by the sea level rise since high tides mean many villages are inundated, making them uninhabitable. Image credit: UNICEF/Sokhin
This was the first time the approval meeting for the report was conducted online. The 234 authors, from all over the world, clocked 186 hours working together to get the report released.
For the first time, the report offers an interactive atlas for people to see what has already happened and what may happen in the future to where they live.
“This report tells us that recent changes in the climate are widespread, rapid and intensifying, unprecedented in thousands of years,” said IPCC Vice-Chair Ko Barrett.
UNEP Executive Director Inger Andersen said that scientists have been issuing these messages for more than three decades, but the world hasn’t listened.
Here are the most important takeaways from the report:
Humans are to blame
Human activity is the cause of climate change, and this is an unequivocal fact. The warming since pre-industrial times has been generated by the burning of fossil fuels such as coal, oil, wood, and natural gas.
Global temperatures have already risen by 1.1 degrees Celsius since the 19th century. They have reached their highest in over 100,000 years, and only a fraction of that increase has come from natural forces.
Michael Mann told the Independent that the effects of climate change will be felt in all corners of the world and will worsen, especially since “the IPCC has connected the dots on climate change and the increase in severe extreme weather events… considerably more directly than previous assessments.”
We will overshoot the 1.5°C mark
Across the report’s scenarios – from the highly optimistic to the reckless – even if we do everything right and start reducing emissions now, we will still overshoot the 1.5°C mark by around 2030. In the most optimistic case, temperatures would then fall back to around 1.4°C.
Control emissions, Earth will do the rest
According to the report, if we start working to bring our emissions under control, we will be able to reduce warming, even if we overshoot the 1.5°C limit.
The changes we are living through are unprecedented; however, they are reversible to a certain extent, and it will take a long time for nature to heal. We can help by reducing our greenhouse gas (GHG) emissions. While we might see some benefits quickly, “it could take 20-30 years to see global temperatures stabilise,” says the IPCC.
Sea level rise
Global oceans have risen about 20 centimetres (eight inches) since 1900, and the rate of increase has nearly tripled in the last decade. Crumbling and melting ice sheets in Antarctica and, especially, Greenland have replaced glacier melt as the main drivers.
If global warming is capped at 2°C, sea level will rise about half a metre over the 21st century. It will continue rising to nearly two metres by 2300 — twice the amount predicted by the IPCC in 2019.
Because of uncertainty over ice sheets, scientists cannot rule out a total rise of two metres by 2100 in a worst-case emissions scenario.
CO2 is at an all-time high
CO2 levels were greater in 2019 than they had been in “at least two million years.” Methane and nitrous oxide levels, the second and third major contributors of warming respectively, were higher in 2019 than at any point in “at least 800,000 years,” reported the Independent.
Control methane
The report includes more data than ever before on methane (CH4), the second most important greenhouse gas after CO2, and warns that failure to curb emissions could undermine Paris Agreement goals.
Human-induced sources are roughly divided between leaks from natural gas production, coal mining and landfills on one side, and livestock and manure handling on the other.
CH4 lingers in the atmosphere only a fraction as long as CO2, but is far more efficient at trapping heat. CH4 levels are their highest in at least 800,000 years.
Natural allies are weakened
Since about 1960, forests, soil and oceans have absorbed 56 percent of all the CO2 humanity has released into the atmosphere — even as those emissions have increased by half. Without nature’s help, Earth would already be a much hotter and less hospitable place.
But these allies in our fight against global heating — known in this role as carbon sinks — are showing signs of saturation, and the percentage of human-induced carbon they soak up is likely to decline as the century unfolds.
Suck it out
The report suggests that warming could be brought back down via “negative emissions”: we could cool the planet by sucking carbon out of the atmosphere and sequestering it. While the idea has been widely discussed and small-scale studies have tried it, the technology is not yet mature. The panel said such removal could begin around the middle of this century but doesn’t explain how, and many scientists are skeptical about its feasibility.
Cities will bear the brunt
Experts warn that the impact of some elements of climate change, like heat, floods and sea-level rise in coastal areas, may be exacerbated in cities. Furthermore, IPCC experts warn that low-probability scenarios, like an ice sheet collapse or rapid changes in ocean circulation, cannot be ruled out.
The perversity of denialism lies in swearing that you are saying the opposite of what you are in fact saying. In this newspeak, denialism dresses itself up in the smart-casual garb of anti-alarmism. The argument made by Leandro Narloch in this Folha on Tuesday (10) is tedious, not to say stale. Stale because — as Michael Mann recounts in “The New Climate War” — it is nothing more than the same denialist rhetoric 2.0.
In essence, Narloch argues that there are climate-damaging activities that should be “celebrated and spread” because they make us “less vulnerable to nature.” Narloch is scientifically wrong. And he does so by subscribing to one of the most pernicious forms of denialism: he masks it, selling solutions that not only fail to mitigate the climate crisis or adapt societies to it, but have the opposite effect. Blow up the Amazon to save it, so the argument goes.
These and other denialist discourses had already been mapped in the Cambridge journal Global Sustainability in July 2020: they are not new. Rather than challenging 21st-century taboos, they sell untruths as if they were science. Narloch gets the concept of vulnerability wrong: from the wildfires in California to the floods in Germany, we are not protected against nature, because we are embedded in it. He also ignores the climate panel’s vast literature on vulnerability.
Narloch disregards the climate-science concept of “feedback loops”: the climate crisis pulls a series of triggers of incalculable dimension, a chain reaction never seen before. Destroying the climate will not protect us from the climate, because it is the absence of a drastic energy transition that has deepened the climate crisis. Investing in the opposite is inefficient.
If the climate panel’s report has turned on the red light, it is not with disinformation that journalism will contribute to the issue. Pluralism is a river in which ideas move within the banks of truth and science. Don’t complain when the river dries up, collapsing the banks that journalism should have protected.
In your opinion, what has happened over the past hundred years to the total number of deaths caused by hurricanes, floods, droughts, heat waves and other climate disasters? Please choose one of the following alternatives:
a) It increased by more than 800%
b) It increased by about 50%
c) It stayed constant
d) It decreased by about 50%
e) It decreased by more than 80%
Since the world’s population grew from 1.8 billion in 1921 to 8 billion in 2021, it would be reasonable to bet on answers B or C, since more people should mean more victims. Many readers probably chose the first option, given the frightening news from this week’s IPCC report.
The correct answer, however, is the last one. Deaths from natural disasters fell by 87% from the 1920s to the 2010s, according to data compiled by Our World in Data.
They went from 540,000 per year to 68,000. The death rate relative to population peaked at 63 deaths per 100,000 inhabitants in 1921 and 176 in 1931. Today it stands at 0.15.
These numbers lead to two interesting paradoxes about the relationship between humans and the climate. The first recalls Spencer’s Paradox – a reference to Herbert Spencer, for whom “the degree of public concern about a social problem or phenomenon varies inversely with its incidence.”
Just as the English became aware of poverty when it was starting to decline, during the Industrial Revolution, humanity is terrified of the climate’s misfortunes precisely after having learned to survive them.
The second paradox: at the same time as we have emitted a great deal (a very great deal) of carbon into the atmosphere and caused a serious greenhouse problem, we have also become less vulnerable to nature. In fact, protecting ourselves from the climate was one of the main reasons we polluted so much.
Take construction. Producing cement consists, roughly speaking, of burning limestone and releasing carbon dioxide.
If the cement industry were a country, it would be the third-largest emitter of greenhouse gases. But this polluting industry allowed people to leave wattle-and-daub or wooden houses and sleep sheltered in safer structures.
Hunger caused by drought, the leading cause of death from natural disasters in the 1920s, was addressed through the creation of chemical fertilizers, irrigation systems and the construction of dams and sanitation networks.
All of these activities caused global warming – but they are still great human achievements, which deserve to be celebrated and spread among the poor who still live at risk of dying in hurricanes, droughts or floods.
Will the historic decline in deaths from natural disasters be reversed in the coming years, making real the apocalyptic predictions of Greta Thunberg, for whom “billions of people will die if we do not take urgent action”?
Climate activist Michael Shellenberger, author of the brilliant “Apocalypse Never”, which will be published in Brazil this month by LVM, thinks not.
I intend to say more about Shellenberger’s book in future columns, but here is a preview of one of its arguments: environmental alarmism underestimates the human capacity to adapt and solve problems.
“The Netherlands, for example, became a wealthy nation even though a third of its land lies below sea level, including areas that are no less than seven metres below the sea,” he says.
The fight against global warming does not need activists obsessed with the apocalypse (who usually dismiss obvious solutions, such as nuclear power). It needs technology, innovators, people who give humanity more comfort and safety while interfering with nature less and less.
Amanda Shendruk, Tim McDonnell, David Yanofsky, Michael J. Coren
Published August 10, 2021
[Check the original publication here for the text of the report with most important parts highlighted.]
The most important takeaways from the new Intergovernmental Panel on Climate Change report are easily summarized: Global warming is happening, it’s caused by human greenhouse gas emissions, and the impacts are very bad (in some cases, catastrophic). Every fraction of a degree of warming we can prevent by curbing emissions substantially reduces this damage. It’s a message that hasn’t changed much since the first IPCC report in 1990.
But to reach these conclusions (and ratchet up confidence in their findings), hundreds of scientists from universities around the globe spent years combing through the peer-reviewed literature—at least 14,000 papers—on everything from cyclones to droughts.
The final Aug. 9 report is nearly 4,000 pages long. While much of it is written in inscrutable scientific jargon, if you want to understand the scientific case for man-made global warming, look no further. We’ve reviewed the data, summarized the main points, and created an interactive graphic showing a “heat map” of scientists’ confidence in their conclusions. The terms describing statistical confidence range from very high confidence (a 9 out of 10 chance) to very low confidence (a 1 in 10 chance). Just hover over the graphic [here] and click to see what they’ve written.
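As a side note on mechanics: the mapping from the IPCC's confidence language to an approximate chance, and then to a display color, is simple to express in code. The sketch below is our own illustration of that idea; the probabilities follow the rough guide above, and the colors are arbitrary placeholders rather than the palette of the original graphic.

```python
# Minimal sketch of the confidence "heat map" idea: map IPCC confidence language
# to an approximate chance and a display color. Chance values follow the rough
# guide in the text (very high ~9 in 10, very low ~1 in 10); colors are arbitrary.
CONFIDENCE_SCALE = {
    "very high confidence": (0.9, "#67000d"),
    "high confidence": (0.8, "#ef3b2c"),
    "medium confidence": (0.5, "#fc9272"),
    "low confidence": (0.2, "#fee0d2"),
    "very low confidence": (0.1, "#fff5f0"),
}

def tag_sentence(sentence):
    """Return (sentence without its confidence tag, approximate chance, color)."""
    for phrase, (chance, color) in CONFIDENCE_SCALE.items():
        marker = f"({phrase})"
        if marker in sentence:
            return sentence.replace(marker, "").strip(), chance, color
    return sentence, None, None  # untagged sentences carry no confidence color

print(tag_sentence("Global surface temperature has increased (high confidence)"))
```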
Here’s your guide to the IPCC’s latest assessment.
CH 1: Framing, context, methods
The first chapter comes out swinging with a bold political charge: It concludes with “high confidence” that the plans countries so far have put forward to reduce emissions are “insufficient” to keep warming well below 2°C, the goal enshrined in the 2015 Paris Agreement. While unsurprising on its own, it is surprising for a document that had to be signed off on by the same government representatives it condemns. It then lists advancements in climate science since the last IPCC report, as well as key evidence behind the conclusion that human-caused global warming is “unequivocal.”
Highlights
👀Scientists’ ability to observe the physical climate system has continued to improve and expand.
📈Since the last IPCC report, new techniques have provided greater confidence in attributing changes in extreme events to human-caused climate change.
🔬The latest generation of climate models is better at representing natural processes, and higher-resolution models that better capture smaller-scale processes and extreme events have become available.
CH 2: Changing state of the climate system
Chapter 2 looks backward in time to compare the current rate of climate changes to those that happened in the past. That comparison clearly reveals human fingerprints on the climate system. The last time global temperatures were comparable to today was 125,000 years ago, the concentration of atmospheric carbon dioxide is higher than at any time in the last 2 million years, and greenhouse gas emissions are rising faster than at any time in the last 800,000 years.
Highlights
🥵Observed changes in the atmosphere, oceans, cryosphere, and biosphere provide unequivocal evidence of a world that has warmed. Over the past several decades, key indicators of the climate system are increasingly at levels unseen in centuries to millennia, and are changing at rates unprecedented in at least the last 2,000 years.
🧊Annual mean Arctic sea ice coverage levels are the lowest since at least 1850. Late summer levels are the lowest in the past 1,000 years.
🌊Global mean sea level (GMSL) is rising, and the rate of GMSL rise since the 20th century is faster than over any preceding century in at least the last three millennia. Since 1901, GMSL has risen by 0.20 [0.15–0.25] meters, and the rate of rise is accelerating.
CH 3: Human influence on the climate system
Chapter 3 leads with the IPCC’s strongest-ever statement on the human impact on the climate: “It is unequivocal that human influence has warmed the global climate system since pre-industrial times” (the last IPCC report said human influence was “clear”). Specifically, the report blames humanity for nearly all of the 1.1°C increase in global temperatures observed since the Industrial Revolution (natural forces played a tiny role as well), as well as for the loss of sea ice and for rising temperatures and acidity in the ocean.
🌍Human-induced greenhouse gas forcing is the main driver of the observed changes in hot and cold extremes.
🌡️The likely range of warming in global-mean surface air temperature (GSAT) in 2010–2019 relative to 1850–1900 is 0.9°C–1.2°C. Of that, 0.8°C–1.3°C is attributable to human activity, while natural forces contributed between −0.1°C and 0.1°C.
😬Combining the attributable contributions from melting ice and the expansion of warmer water, it is very likely that human influence was the main driver of the observed global mean sea level rise since at least 1970.
CH 4: Future global climate: Scenario-based projections and near-term information
Chapter 4 holds two of the report’s most important conclusions: Climate change is happening faster than previously understood, and the likelihood that the global temperature increase can stay within the Paris Agreement goal of 1.5°C is extremely slim. The 2013 IPCC report projected that temperatures could exceed 1.5°C in the 2040s; here, that timeline has been advanced by a decade to the “early 2030s” in the median scenario. And even in the lowest-emission scenario, it is “more likely than not” to occur by 2040.
Highlights
🌡️By 2030, in all future warming scenarios, globally averaged surface air temperature in any individual year could exceed 1.5°C relative to 1850–1900.
🌊Under all scenarios, it is virtually certain that global mean sea level will continue to rise through the 21st century.
💨Even if enough carbon were removed from the atmosphere that global emissions become net negative, some climate change impacts, such as sea level rise, will not be reversed for at least several centuries.
CH 5: Global carbon and other biogeochemical cycles and feedbacks
Chapter 5 quantifies the level by which atmospheric CO2 and methane concentrations have increased since 1750 (47% and 156% respectively) and addresses the ability of oceans and other natural systems to soak those emissions up. The more emissions increase, the less they can be offset by natural sinks—and in a high-emissions scenario, the loss of forests from wildfires becomes so severe that land-based ecosystems become a net source of emissions, rather than a sink (this is already happening to a degree in the Amazon).
Highlights
🌲The CO2 emitted from human activities during the decade of 2010–2019 was distributed among three Earth systems: 46% accumulated in the atmosphere, 23% was taken up by the ocean, and 31% was stored by vegetation.
📉The fraction of emissions taken up by land and ocean is expected to decline as the CO2 concentration increases.
💨Global temperatures rise in a near-linear relationship to cumulative CO2 emissions. In other words, to halt global warming, net emissions must reach zero.
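The near-linear relationship in that last highlight is often summarized as the transient climate response to cumulative emissions (TCRE). The snippet below is a back-of-the-envelope illustration only; the TCRE value of roughly 0.45°C per 1,000 GtCO2 is a commonly cited central estimate that we assume here, not a figure quoted in this summary.

```python
# Illustrative only: warming implied by the near-linear (TCRE) relationship between
# cumulative CO2 emissions and global temperature. The TCRE value is an assumption.
TCRE_C_PER_1000_GTCO2 = 0.45  # deg C per 1,000 GtCO2 of cumulative emissions (assumed)

def warming_from_cumulative_emissions(cumulative_gtco2):
    """Approximate additional warming (deg C) from a given amount of cumulative emissions."""
    return TCRE_C_PER_1000_GTCO2 * cumulative_gtco2 / 1000.0

# Example: a further 500 GtCO2 of emissions implies roughly 0.2 deg C of additional
# warming, which is why warming only stops once net emissions reach zero.
print(round(warming_from_cumulative_emissions(500), 2))
```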
CH 6: Short-lived climate forcers
Chapter 6 is all about methane, particulate matter, aerosols, hydrofluorocarbons, and other non-CO2 gases that don’t linger very long in the atmosphere (just a few hours, in some cases) but exert a tremendous influence on the climate while they do. In some cases, that influence might be cooling, but their net impact has been to contribute to warming. Because they are short-lived, the future abundance and impact of these gases are highly variable in the different socioeconomic pathways considered in the report. These gases also have a huge impact on the respiratory health of people around the world.
Highlights
⛽The sectors most responsible for warming from short-lived climate forcers are those dominated by methane emissions: fossil fuel production and distribution, agriculture, and waste management.
🧊In the next two decades, it is very likely that emissions from short-lived climate forcers will cause a warming relative to 2019, in addition to the warming from long-lived greenhouse gases like CO2.
🌏Rapid decarbonization leads to air quality improvements, but on its own is not sufficient to achieve, in the near term, air quality guidelines set by the World Health Organization, especially in parts of Asia and in some other highly polluted regions.
CH 7: The Earth’s energy budget, climate feedbacks, and climate sensitivity
Climate sensitivity is a measure of how much the Earth responds to changes in greenhouse gas concentrations. For every doubling of atmospheric CO2, temperatures go up by about 3°C, this chapter concludes. That’s about the same level scientists have estimated for several decades, but over time the range of uncertainty around that estimate has narrowed. The energy budget is a calculation of how much energy is flowing into the Earth system from the sun. Put together, these metrics paint a picture of the human contribution to observed warming (a rough illustrative calculation appears after the highlights below).
🐻❄️The Arctic warms more quickly than the Antarctic due to differences in radiative feedbacks and ocean heat uptake between the poles.
🌊Because of existing greenhouse gas concentrations, energy will continue to accumulate in the Earth system until at least the end of the 21st century, even under strong emissions reduction scenarios.
☁️The net effect of changes in clouds in response to global warming is to amplify human-induced warming. Compared to the last IPCC report, major advances in the understanding of cloud processes have increased the level of confidence in the cloud feedback cycle.
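The rough illustrative calculation promised above: the chapter's roughly 3°C-per-doubling sensitivity implies a logarithmic relationship between CO2 concentration and eventual (equilibrium) warming. The sketch below uses that standard textbook approximation; the 280 ppm pre-industrial baseline and the example concentrations are our assumptions for illustration, and observed warming lags these equilibrium values.

```python
import math

# Equilibrium warming implied by a CO2 concentration under the standard logarithmic
# approximation: each doubling of CO2 eventually adds roughly `sensitivity_c` deg C.
# Baseline and example concentrations are assumptions for illustration only.
def equilibrium_warming(co2_ppm, baseline_ppm=280.0, sensitivity_c=3.0):
    return sensitivity_c * math.log2(co2_ppm / baseline_ppm)

print(round(equilibrium_warming(560), 1))  # a full doubling: ~3.0 deg C
print(round(equilibrium_warming(420), 1))  # ~today's concentration: ~1.8 deg C, eventually
```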
CH 8: Water cycle changes
This chapter catalogs what happens to water in a warming world. Although instances of drought are expected to become more common and more severe, wet parts of the world will get wetter as the warmer atmosphere is able to carry more water. Total net precipitation will increase, yet the thirstier atmosphere will make dry places drier. And within any one location, the difference in precipitation between the driest and wettest month will likely increase. But rainstorms are complex phenomena that typically happen at a scale smaller than the resolution of most climate models, so specific local predictions about monsoon patterns remain an area of relatively high uncertainty.
Highlights
🌎Increased evapotranspiration will decrease soil moisture over the Mediterranean, southwestern North America, southern Africa, southwestern South America, and southwestern Australia.
🌧️Summer monsoon precipitation is projected to increase for the South, Southeast and East Asian monsoon domains, while North American monsoon precipitation is projected to decrease. West African monsoon precipitation is projected to increase over the Central Sahel and decrease over the far western Sahel.
🌲Large-scale deforestation has likely decreased evapotranspiration and precipitation and increased runoff over the deforested regions. Urbanization has increased local precipitation and runoff intensity.
CH 9: Ocean, cryosphere, and sea level change
Most of the heat trapped by greenhouse gases is ultimately absorbed by the oceans. Warmer water expands, contributing significantly to sea level rise, and the slow, deep circulation of ocean water is a key reason why global temperatures don’t turn on a dime in relation to atmospheric CO2. Marine animals are feeling this heat, as scientists have documented that the frequency of marine heatwaves has doubled since the 1980s. Meanwhile, glaciers, polar sea ice, the Greenland ice sheet, and global permafrost are all rapidly melting. Overall sea levels have risen about 20 centimeters since 1900, and the rate of sea level rise is increasing.
Highlights
📈Global mean sea level rose faster in the 20th century than in any prior century over the last three millennia.
🌡️The heat content of the global ocean has increased since at least 1970 and will continue to increase over the 21st century. The associated warming will likely continue until at least 2300 even for low-emission scenarios because of the slow circulation of the deep ocean.
🧊The Arctic Ocean will likely become practically sea ice–free during the seasonal sea ice minimum for the first time before 2050 in all considered SSP scenarios.
CH 10: Linking global to regional climate change
Since 1950, scientists have clearly detected how greenhouse gas emissions from human activity are changing regional temperatures. Climate models can predict regional climate impacts. Where data are limited, statistical methods help identify local impacts (especially in challenging terrain such as mountains). Cities, in particular, will warm faster as a result of urbanization. Global warming extremes in urban areas will be even more pronounced, especially during heatwaves. Although global models largely agree, it is more difficult to consistently predict regional climate impacts across models.
Highlights
⛰️Some local-scale phenomena such as sea breezes and mountain wind systems cannot be well represented by the resolution of most climate models.
🌆The difference in observed warming trends between cities and their surroundings can partly be attributed to urbanization. Future urbanization will amplify the projected air temperature change in cities regardless of the characteristics of the background climate.
😕Statistical methods are improving to downscale global climate models to more accurately depict local or regional projections.
CH 11: Weather and climate extreme events in a changing climate
Better data collection and modeling mean scientists are more confident than ever in their understanding of the role of rising greenhouse gas concentrations in weather and climate extremes. We are virtually certain humans are behind observed temperature extremes.
Human activity is making extreme weather and temperatures more intense and frequent, especially extreme rainfall, droughts, and tropical cyclones. While even 1.5°C of warming will make events more severe, the intensity of extreme events is expected to at least double with 2°C of global warming compared with today’s conditions, and quadruple with 3°C of warming. As global warming accelerates, historically unprecedented climatic events are likely to occur.
Highlights
🌡️It is an established fact that human-induced greenhouse gas emissions have led to an increased frequency and/or intensity of some weather and climate extremes since pre-industrial time, in particular for temperature extremes.
🌎Even relatively small incremental increases in global warming cause statistically significant changes in extremes.
🌪️The occurrence of extreme events unprecedented in the observed record will increase with increasing global warming.
⛈️Relative to present-day conditions, changes in the intensity of extremes would be at least double at 2°C, and quadruple at 3°C of global warming.
CH 12: Climate change information for regional impact and for risk assessment
Climate models are getting better, more precise, and more accurate at predicting regional impacts. We know a lot more than we did in 2014 (the release of AR5). Our climate is already different compared to the early or mid-20th century, and we’re seeing big changes to mean temperatures, growing season, extreme heat, ocean acidification and deoxygenation, and Arctic sea ice loss. Expect more changes by mid-century: more rain in the northern hemisphere, less rain in a few regions (the Mediterranean and South Africa), as well as sea-level rise along all coasts. Overall, there is high confidence that mean and extreme temperatures will rise over land and sea. Major widespread damages are expected, but benefits are also possible in some places.
Highlights
🌏Every region of the world will experience concurrent changes in multiple climate impact drivers by mid-century.
🌱Climate change is already resulting in significant societal and environmental impacts and will induce major socio-economic damages in the future. In some cases, climate change can also lead to beneficial conditions which can be taken into account in adaptation strategies.
🌨️The impacts of climate change depend not only on physical changes in the climate itself, but also on whether humans take steps to limit their exposure and vulnerability.
What we did:
The visualization of confidence is only for the executive summary at the beginning of each chapter. If a sentence had a confidence associated with it, the confidence text was removed and a color applied instead. If a sentence did not have an associated confidence, that doesn’t mean scientists do not feel confident about the content; they may be using likelihood (or certainty) language in that instance instead. We chose to only visualize confidence, as it is used more often in the report. Highlights were drawn from the text of the report but edited and in some cases rephrased for clarity.
Everywhere from business to medicine to the climate, forecasting the future is a complex and absolutely critical job. So how do you do it—and what comes next?
Bobbie Johnson
February 26, 2020
Inez Fung
Professor of atmospheric science, University of California, Berkeley
Leah Fasten
Prediction for 2030: We’ll light up the world… safely
I’ve spoken to people who want climate model information, but they’re not really sure what they’re asking me for. So I say to them, “Suppose I tell you that some event will happen with a probability of 60% in 2030. Will that be good enough for you, or will you need 70%? Or would you need 90%? What level of information do you want out of climate model projections in order to be useful?”
I joined Jim Hansen’s group in 1979, and I was there for all the early climate projections. And the way we thought about it then, those things are all still totally there. What we’ve done since then is add richness and higher resolution, but the projections are really grounded in the same kind of data, physics, and observations.
Still, there are things we’re missing. We still don’t have a real theory of precipitation, for example. But there are two exciting things happening there. One is the availability of satellite observations: looking at the cloud is still not totally utilized. The other is that there used to be no way to get regional precipitation patterns through history—and now there is. Scientists found these caves in China and elsewhere, and they go in, look for a nice little chamber with stalagmites, and then they chop them up and send them back to the lab, where they do fantastic uranium-thorium dating and measure oxygen isotopes in calcium carbonate. From there they can interpret a record of historic rainfall. The data are incredible: we have got over half a million years of precipitation records all over Asia.
I don’t see us reducing fossil fuels by 2030. I don’t see us reducing CO2 or atmospheric methane. Some 1.2 billion people in the world right now have no access to electricity, so I’m looking forward to the growth in alternative energy going to parts of the world that have no electricity. That’s important because it’s education, health, everything associated with a Western standard of living. That’s where I’m putting my hopes.
Dvora Photography
Anne Lise Kjaer
Futurist, Kjaer Global, London
Prediction for 2030: Adults will learn to grasp new ideas
As a kid I wanted to become an archaeologist, and I did in a way. Archaeologists find artifacts from the past and try to connect the dots and tell a story about how the past might have been. We do the same thing as futurists; we use artifacts from the present and try to connect the dots into interesting narratives in the future.
When it comes to the future, you have two choices. You can sit back and think “It’s not happening to me” and build a great big wall to keep out all the bad news. Or you can build windmills and harness the winds of change.
A lot of companies come to us and think they want to hear about the future, but really it’s just an exercise for them—let’s just tick that box, do a report, and put it on our bookshelf.
So we have a little test for them. We do interviews, we ask them questions; then we use a model called a Trend Atlas that considers both the scientific dimensions of society and the social ones. We look at the trends in politics, economics, societal drivers, technology, environment, legislation—how does that fit with what we know currently? We look back maybe 10, 20 years: can we see a little bit of a trend and try to put that into the future?
What’s next? Obviously with technology we can educate much better than we could in the past. But it’s a huge opportunity to educate the parents of the next generation, not just the children. Kids are learning about sustainability goals, but what about the people who actually rule our world?
Courtesy Photo
Philip Tetlock
Coauthor of Superforecasting and professor, University of Pennsylvania
Prediction for 2030: We’ll get better at being uncertain
At the Good Judgment Project, we try to track the accuracy of commentators and experts in domains in which it’s usually thought impossible to track accuracy. You take a big debate and break it down into a series of testable short-term indicators. So you could take a debate over whether strong forms of artificial intelligence are going to cause major dislocations in white-collar labor markets by 2035, 2040, 2050. A lot of discussion already occurs at that level of abstraction—but from our point of view, it’s more useful to break it down and to say: If we were on a long-term trajectory toward an outcome like that, what sorts of things would we expect to observe in the short term? So we started this off in 2015, and in 2016 AlphaGo defeated people in Go. But then other things didn’t happen: driverless Ubers weren’t picking people up for fares in any major American city at the end of 2017. Watson didn’t defeat the world’s best oncologists in a medical diagnosis tournament. So I don’t think we’re on a fast track toward the singularity, put it that way.
Forecasts have the potential to be either self-fulfilling or self-negating—Y2K was arguably a self-negating forecast. But it’s possible to build that into a forecasting tournament by asking conditional forecasting questions: i.e., How likely is X conditional on our doing this or doing that?
What I’ve seen over the last 10 years, and it’s a trend that I expect will continue, is an increasing openness to the quantification of uncertainty. I think there’s a grudging, halting, but cumulative movement toward thinking about uncertainty in more granular and nuanced ways that permit keeping score.
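The scorekeeping Tetlock describes is usually done with proper scoring rules; the Good Judgment Project is best known for using the Brier score, the mean squared difference between forecast probabilities and what actually happened. A minimal sketch with made-up forecasts:

```python
# Brier score: mean squared difference between forecast probabilities and outcomes
# (1 = the event happened, 0 = it didn't). Lower is better; always saying 50% scores 0.25.
def brier_score(forecasts, outcomes):
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Made-up example: three probabilistic forecasts and the eventual outcomes.
forecasts = [0.8, 0.3, 0.6]
outcomes = [1, 0, 0]
print(round(brier_score(forecasts, outcomes), 3))  # 0.163
```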
Ryan Young
Keith Chen
Associate professor of economics, UCLA
Prediction for 2030: We’ll be more—and less—private
When I worked on Uber’s surge pricing algorithm, the problem it was built to solve was very coarse: we were trying to convince drivers to put in extra time when they were most needed. There were predictable times—like New Year’s—when we knew we were going to need a lot of people. The deeper problem was that this was a system with basically no control. It’s like trying to predict the weather. Yes, the amount of weather data that we collect today—temperature, wind speed, barometric pressure, humidity data—is 10,000 times greater than what we were collecting 20 years ago. But we still can’t predict the weather 10,000 times further out than we could back then. And social movements—even in a very specific setting, such as where riders want to go at any given point in time—are, if anything, even more chaotic than weather systems.
These days what I’m doing is a little bit more like forensic economics. We look to see what we can find and predict from people’s movement patterns. We’re just using simple cell-phone data like geolocation, but even just from movement patterns, we can infer salient information and build a psychological dimension of you. What terrifies me is I feel like I have much worse data than Facebook does. So what are they able to understand with their much better information?
I think the next big social tipping point is people actually starting to really care about their privacy. It’ll be like smoking in a restaurant: it will quickly go from causing outrage when people want to stop it to suddenly causing outrage if somebody does it. But at the same time, by 2030 almost every Chinese citizen will be completely genotyped. I don’t quite know how to reconcile the two.
Sarah Deragon
Annalee Newitz
Science fiction and nonfiction author, San Francisco
Prediction for 2030: We’re going to see a lot more humble technology
Every era has its own ideas about the future. Go back to the 1950s and you’ll see that people fantasized about flying cars. Now we imagine bicycles and green cities where cars are limited, or where cars are autonomous. We have really different priorities now, so that works its way into our understanding of the future.
Science fiction writers can’t actually make predictions. I think of science fiction as engaging with questions being raised in the present. But what we can do, even if we can’t say what’s definitely going to happen, is offer a range of scenarios informed by history.
There are a lot of myths about the future that people believe are going to come true right now. I think a lot of people—not just science fiction writers but people who are working on machine learning—believe that relatively soon we’re going to have a human-equivalent brain running on some kind of computing substrate. This is as much a reflection of our time as it is what might actually happen.
It seems unlikely that a human-equivalent brain in a computer is right around the corner. But we live in an era where a lot of us feel like we live inside computers already, for work and everything else. So of course we have fantasies about digitizing our brains and putting our consciousness inside a machine or a robot.
I’m not saying that those things could never happen. But they seem much more closely allied to our fantasies in the present than they do to a real technical breakthrough on the horizon.
We’re going to have to develop much better technologies around disaster relief and emergency response, because we’ll be seeing a lot more floods, fires, storms. So I think there is going to be a lot more work on really humble technologies that allow you to take your community off the grid, or purify your own water. And I don’t mean in a creepy survivalist way; I mean just in a this-is-how-we-are-living-now kind of way.
Noah Willman
Finale Doshi-Velez
Associate professor of computer science, Harvard
Prediction for 2030: Humans and machines will make decisions together
In my lab, we’re trying to answer questions like “How might this patient respond to this antidepressant?” or “How might this patient respond to this vasopressor?” So we get as much data as we can from the hospital. For a psychiatric patient, we might have everything about their heart disease, kidney disease, cancer; for a blood pressure management recommendation for the ICU, we have all their oxygen information, their lactate, and more.
Some of it might be relevant to making predictions about their illnesses, some not, and we don’t know which is which. That’s why we ask for the large data set with everything.
There’s been about a decade of work trying to get unsupervised machine-learning models to do a better job at making these predictions, and none worked really well. The breakthrough for us was when we found that all the previous approaches for doing this were wrong in the exact same way. Once we untangled all of this, we came up with a different method.
We also realized that even if our ability to predict what drug is going to work is not always that great, we can more reliably predict what drugs are not going to work, which is almost as valuable.
I’m excited about combining humans and AI to make predictions. Let’s say your AI is right only 70% of the time and your human is also only right 70% of the time. Combining the two is difficult, but if you can fuse their successes, then you should be able to do better than either system alone. How to do that is a really tough, exciting question.
All these predictive models were built and deployed and people didn’t think enough about potential biases. I’m hopeful that we’re going to have a future where these human-machine teams are making decisions that are better than either alone.
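Doshi-Velez’s point about fusing two imperfect predictors can be made concrete with a small simulation. Purely for illustration, assume a model and a human are each right 70% of the time on a binary question and that their errors are independent (both assumptions are ours, not the lab’s): the cases where they agree turn out to be right far more often than either is alone, which is where a combined system can gain.

```python
import random

# Illustrative simulation only: two independent predictors, each correct 70% of the
# time, answering a yes/no question. When they agree, how often is the answer right?
random.seed(0)
ACCURACY = 0.7
agreements = agreed_and_correct = 0
for _ in range(100_000):
    truth = random.random() < 0.5
    model = truth if random.random() < ACCURACY else not truth
    human = truth if random.random() < ACCURACY else not truth
    if model == human:
        agreements += 1
        agreed_and_correct += (model == truth)
print(round(agreed_and_correct / agreements, 3))  # ~0.845
```

The hard part, as she says, is what to do when they disagree; that is where extra signals such as model confidence or human expertise have to come in.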
Guillaume Simoneau
Abdoulaye Banire Diallo
Professor, director of the bioinformatics lab, University of Quebec at Montreal
Prediction for 2030: Machine-based forecasting will be regulated
When a farmer in Quebec decides whether to inseminate a cow or not, it might depend on the expectation of milk that will be produced every day for one year, two years, maybe three years after that. Farms have management systems that capture the data and the environment of the farm. I’m involved in projects that add a layer of genetic and genomic data to help forecasting—to help decision makers like the farmer to have a full picture when they’re thinking about replacing cows, improving management, resilience, and animal welfare.
With the emergence of machine learning and AI, what we’re showing is that we can help tackle problems in a way that hasn’t been done before. We are adapting it to the dairy sector, where we’ve shown that some decisions can be anticipated 18 months in advance just by forecasting based on the integration of this genomic data. I think in some areas such as plant health we have only achieved 10% or 20% of our capacity to improve certain models.
Until now AI and machine learning have been associated with domain expertise. It’s not a public-wide thing. But less than 10 years from now they will need to be regulated. I think there are a lot of challenges for scientists like me to try to make those techniques more explainable, more transparent, and more auditable.
Our model reveals the true course of the pandemic. Here is what to do next
May 15th 2021
THIS WEEK we publish our estimate of the true death toll from covid-19. It tells the real story of the pandemic. But it also contains an urgent warning. Unless vaccine supplies reach poorer countries, the tragic scenes now unfolding in India risk being repeated elsewhere. Millions more will die.
Using known data on 121 variables, from recorded deaths to demography, we have built a pattern of correlations that lets us fill in gaps where numbers are lacking. Our model suggests that covid-19 has already claimed 7.1m-12.7m lives. Our central estimate is that 10m people have died who would otherwise be living. This tally of “excess deaths” is over three times the official count, which nevertheless is the basis for most statistics on the disease, including fatality rates and cross-country comparisons.
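The gap-filling approach described above can be sketched generically: fit a model on places that do publish reliable all-cause mortality, then predict excess deaths elsewhere from indicators that are available everywhere. The snippet below is a toy illustration with a hypothetical input file and made-up column names; it is not The Economist’s code, feature set, or model.

```python
# Toy sketch of the gap-filling idea (not The Economist's actual model or data).
# Train on countries with measured excess deaths; predict for countries without.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

FEATURES = ["reported_covid_deaths_per_100k", "median_age",
            "tests_per_1k", "urban_share"]  # hypothetical indicator columns

df = pd.read_csv("country_indicators.csv")           # hypothetical input file
train = df[df["excess_deaths_per_100k"].notna()]     # countries with reliable mortality data
to_fill = df[df["excess_deaths_per_100k"].isna()]    # countries with gaps

model = GradientBoostingRegressor(random_state=0)
model.fit(train[FEATURES], train["excess_deaths_per_100k"])
df.loc[to_fill.index, "estimated_excess_deaths_per_100k"] = model.predict(to_fill[FEATURES])
```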
The most important insight from our work is that covid-19 has been harder on the poor than anyone knew. Official figures suggest that the pandemic has struck in waves, and that the United States and Europe have been hit hard. Although South America has been ravaged, the rest of the developing world seemed to get off lightly.
Our modelling tells another story. When you count all the bodies, you see that the pandemic has spread remorselessly from the rich, connected world to poorer, more isolated places. As it has done so, the global daily death rate has climbed steeply.
Death rates have been very high in some rich countries, but the overwhelming majority of the 6.7m or so deaths that nobody counted were in poor and middle-income ones. In Romania and Iran excess deaths are more than double the number officially put down to covid-19. In Egypt they are 13 times as big. In America the difference is 7.1%.
India, where about 20,000 are dying every day, is not an outlier. Our figures suggest that, in terms of deaths as a share of population, Peru’s pandemic has been 2.5 times worse than India’s. The disease is working its way through Nepal and Pakistan. Infectious variants spread faster and, because of the tyranny of exponential growth, overwhelm health-care systems and fill mortuaries even if the virus is no more lethal.
Ultimately the way to stop this is vaccination. As an example of collaboration and pioneering science, covid-19 vaccines rank with the Apollo space programme. Within just a year of the virus being discovered, people could be protected from severe disease and death. Hundreds of millions of them have benefited.
However, in the short run vaccines will fuel the divide between rich and poor. Soon, the only people to die from covid-19 in rich countries will be exceptionally frail or exceptionally unlucky, as well as those who have spurned the chance to be vaccinated. In poorer countries, by contrast, most people will have no choice. They will remain unprotected for many months or years.
The world cannot rest while people perish for want of a jab costing as little as $4 for a two-dose course. It is hard to think of a better use of resources than vaccination. Economists’ central estimate for the direct value of a course is $2,900—if you include factors like long covid and the effect of impaired education, the total is much bigger. The benefit from an extra 1bn doses supplied by July would be worth hundreds of billions of dollars. Less circulating virus means less mutation, and so a lower chance of a new variant that reinfects the vaccinated.
Supplies of vaccines are already growing. By the end of April, according to Airfinity, an analytics firm, vaccine-makers produced 1.7bn doses, 700m more than the end of March and ten times more than January. Before the pandemic, annual global vaccine capacity was roughly 3.5bn doses. The latest estimates are that total output in 2021 will be almost 11bn. Some in the industry predict a global surplus in 2022.
And yet the world is right to strive to get more doses in more arms sooner. Hence President Joe Biden has proposed waiving intellectual-property claims on covid-19 vaccines. Many experts argue that, because some manufacturing capacity is going begging, millions more doses might become available if patent-owners shared their secrets, including in countries that today are at the back of the queue. World-trade rules allow for a waiver. When to invoke them, if not in the throes of a pandemic?
We believe that Mr Biden is wrong. A waiver may signal that his administration cares about the world, but it is at best an empty gesture and at worst a cynical one.
A waiver will do nothing to fill the urgent shortfall of doses in 2021. The head of the World Trade Organisation, the forum where it will be thrashed out, warns there may be no vote until December. Technology transfer would take six months or so to complete even if it started today. With the new mRNA vaccines made by Pfizer and Moderna, it may take longer. Supposing the tech transfer was faster than that, experienced vaccine-makers would be unavailable for hire and makers could not obtain inputs from suppliers whose order books are already bursting. Pfizer’s vaccine requires 280 inputs from suppliers in 19 countries. No firm can recreate that in a hurry.
In any case, vaccine-makers do not appear to be hoarding their technology—otherwise output would not be increasing so fast. They have struck 214 technology-transfer agreements, an unprecedented number. They are not price-gouging: money is not the constraint on vaccination. Poor countries are not being priced out of the market: their vaccines are coming through COVAX, a global distribution scheme funded by donors.
In the longer term, the effect of a waiver is unpredictable. Perhaps it will indeed lead to technology being transferred to poor countries; more likely, though, it will cause harm by disrupting supply chains, wasting resources and, ultimately, deterring innovation. Whatever the case, if vaccines are nearing a surplus in 2022, the cavalry will arrive too late.
A needle in time
If Mr Biden really wants to make a difference, he can donate vaccine right now through COVAX. Rich countries over-ordered because they did not know which vaccines would work. Britain has ordered more than nine doses for each adult, Canada more than 13. These will be urgently needed elsewhere. It is wrong to put teenagers, who have a minuscule risk of dying from covid-19, before the elderly and health-care workers in poor countries. The rich world should not stockpile boosters to cover the population many times over on the off-chance that they may be needed. In the next six months, this could yield billions of doses of vaccine.
Countries can also improve supply chains. The Serum Institute, an Indian vaccine-maker, has struggled to get parts such as filters from America because exports were gummed up by the Defence Production Act (DPA), which puts suppliers on a war-footing. Mr Biden authorised a one-off release, but he should be focusing the DPA on supplying the world instead. And better use needs to be made of finished vaccine. In some poor countries, vaccine languishes unused because of hesitancy and chaotic organisation. It makes sense to prioritise getting one shot into every vulnerable arm, before setting about the second.
Our model is not predictive. However, it does suggest that some parts of the world are particularly vulnerable—one example is South-East Asia, home to over 650m people, which has so far been spared mass fatalities for no obvious reason. Covid-19 has not yet run its course. But vaccines have created the chance to save millions of lives. The world must not squander it. ■
This week, the C.D.C. acknowledged what scientists have been saying for months: The risk of catching the coronavirus from surfaces is low.
Credit: Celeste Noche for The New York Times
April 8, 2021
When the coronavirus began to spread in the United States last spring, many experts warned of the danger posed by surfaces. Researchers reported that the virus could survive for days on plastic or stainless steel, and the Centers for Disease Control and Prevention advised that if someone touched one of these contaminated surfaces — and then touched their eyes, nose or mouth — they could become infected.
Americans responded in kind, wiping down groceries, quarantining mail and clearing drugstore shelves of Clorox wipes. Facebook closed two of its offices for a “deep cleaning.” New York’s Metropolitan Transportation Authority began disinfecting subway cars every night.
“People can be infected with the virus that causes Covid-19 through contact with contaminated surfaces and objects,” Dr. Rochelle Walensky, the director of the C.D.C., said at a White House briefing on Monday. “However, evidence has demonstrated that the risk of transmission by this route of infection is actually low.”
The admission is long overdue, scientists say.
“Finally,” said Linsey Marr, an expert on airborne viruses at Virginia Tech. “We’ve known this for a long time and yet people are still focusing so much on surface cleaning.” She added, “There’s really no evidence that anyone has ever gotten Covid-19 by touching a contaminated surface.”
During the early days of the pandemic, many experts believed that the virus spread primarily through large respiratory droplets. These droplets are too heavy to travel long distances through the air but can fall onto objects and surfaces.
In this context, a focus on scrubbing down every surface seemed to make sense. “Surface cleaning is more familiar,” Dr. Marr said. “We know how to do it. You can see people doing it, you see the clean surface. And so I think it makes people feel safer.”
Credit: Hiroko Masuike/The New York Times
But over the last year, it has become increasingly clear that the virus spreads primarily through the air — in both large and small droplets, which can remain aloft longer — and that scouring door handles and subway seats does little to keep people safe.
“The scientific basis for all this concern about surfaces is very slim — slim to none,” said Emanuel Goldman, a microbiologist at Rutgers University, who wrote last summer that the risk of surface transmission had been overblown. “This is a virus you get by breathing. It’s not a virus you get by touching.”
The C.D.C. has previously acknowledged that surfaces are not the primary way that the virus spreads. But the agency’s statements this week went further.
“The most important part of this update is that they’re clearly communicating to the public the correct, low risk from surfaces, which is not a message that has been clearly communicated for the past year,” said Joseph Allen, a building safety expert at the Harvard T.H. Chan School of Public Health.
Catching the virus from surfaces remains theoretically possible, he noted. But it requires many things to go wrong: a lot of fresh, infectious viral particles to be deposited on a surface, and then for a relatively large quantity of them to be quickly transferred to someone’s hand and then to their face. “Presence on a surface does not equal risk,” Dr. Allen said.
In most cases, cleaning with simple soap and water — in addition to hand-washing and mask-wearing — is enough to keep the odds of surface transmission low, the C.D.C.’s updated cleaning guidelines say. In most everyday scenarios and environments, people do not need to use chemical disinfectants, the agency notes.
“What this does very usefully, I think, is tell us what we don’t need to do,” said Donald Milton, an aerosol scientist at the University of Maryland. “Doing a lot of spraying and misting of chemicals isn’t helpful.”
Still, the guidelines do suggest that if someone who has Covid-19 has been in a particular space within the last day, the area should be both cleaned and disinfected.
“Disinfection is only recommended in indoor settings — schools and homes — where there has been a suspected or confirmed case of Covid-19 within the last 24 hours,” Dr. Walensky said during the White House briefing. “Also, in most cases, fogging, fumigation and wide-area or electrostatic spraying is not recommended as a primary method of disinfection and has several safety risks to consider.”
And the new cleaning guidelines do not apply to health care facilities, which may require more intensive cleaning and disinfection.
Saskia Popescu, an infectious disease epidemiologist at George Mason University, said that she was happy to see the new guidance, which “reflects our evolving data on transmission throughout the pandemic.”
But she noted that it remained important to continue doing some regular cleaning — and maintaining good hand-washing practices — to reduce the risk of contracting not just the coronavirus but any other pathogens that might be lingering on a particular surface.
Dr. Allen said that the school and business officials he has spoken with this week expressed relief over the updated guidelines, which will allow them to pull back on some of their intensive cleaning regimens. “This frees up a lot of organizations to spend that money better,” he said.
Schools, businesses and other institutions that want to keep people safe should shift their attention from surfaces to air quality, he said, and invest in improved ventilation and filtration.
“This should be the end of deep cleaning,” Dr. Allen said, noting that the misplaced focus on surfaces has had real costs. “It has led to closed playgrounds, it has led to taking nets off basketball courts, it has led to quarantining books in the library. It has led to entire missed school days for deep cleaning. It has led to not being able to share a pencil. So that’s all that hygiene theater, and it’s a direct result of not properly classifying surface transmission as low risk.”
The U.S. National Academy of Sciences has published a new report (“Reflecting Sunlight“) on the topic of Geoengineering (that is, the deliberate manipulation of the global Earth environment in an effort to offset the effects of human carbon pollution-caused climate change). While I am, in full disclosure, a member of the Academy, I offer the following comments in an entirely independent capacity:
Let me start by congratulating the authors on their comprehensive assessment of the science. It is solid, as we would expect, since the expertise of the author team and reviewers covers the science underlying geoengineering, which is the true remit of the study. Chris Field, the lead author, is well qualified to lead the effort, and did a good job making sure that the intricacies of the science are covered, including the substantial uncertainties and caveats when it comes to the potential environmental impacts of some of the riskier geoengineering strategies (i.e., stratospheric sulphate aerosol injection to block out sunlight).
I like the fact that there is a discussion of the importance of labels and terminology and how they can shape public perception. For example, the oft-used term “solar radiation management” is not favored by the report authors, as it can be misleading (we don’t have our hand on a dial that controls solar output). On the other hand, I think the term they do choose to use, “solar geoengineering,” is still potentially problematic, because it still implies we’re directly modifying solar output—but that’s not the case. We’re talking about messing with Earth’s atmospheric chemistry, not dialing down the sun, even though many of the modeling experiments assume that’s what we’re doing. It’s a bit of a bait and switch. Even the title of the report, “Reflecting Sunlight,” falls victim to this biased framing.
In my recent book (“The New Climate War”), I quote one leading scientist on this:
“They don’t actually put aerosols in the atmosphere. They turn down the Sun to mimic geoengineering. You might think that is relatively unimportant . . . [but] controlling the Sun is effectively a perfect knob. We know almost precisely how a reduction in solar flux will project onto the energy balance of a planet. Aerosol-climate interactions are much more complex.”
I have a deeper and more substantive concern though, and it really is about the entire framing of the report. A report like this is as much about the policy message it conveys as it is about the scientific assessment, for it will be used immediately by policy advocates. And here I’m honestly troubled at the fodder it provides for mis-framing of the risks.
I recognize that the authors are dealing with a contentious and still much-debated topic, and it’s a challenge to represent the full range of views within the community, but the opening of the report itself, in my view, really puts a thumb on the scales. It falls victim to the moral hazard that I warn about in “The New Climate War” when it states, as justification for potentially considering implementing these geoengineering schemes:
But despite overwhelming evidence that the climate crisis is real and pressing, emissions of greenhouse gases continue to increase, with global emissions of fossil carbon dioxide rising 10.8 percent from 2010 through 2019. The total for 2020 is on track to decrease in response to decreased economic activity related to the COVID-19 pandemic. The pandemic is thus providing frustrating confirmation of the fact that the world has made little progress in separating economic activity from carbon dioxide emissions.
First of all, the discussion of carbon emissions reductions there is misleading. Emissions flattened in the years before the pandemic, and the International Energy Agency (IEA) specifically attributed that flattening to a decrease in global carbon emissions from the power generation sector. These reductions continue and contributed at least partly to the 7% decrease in global emissions last year. We will certainly need policy interventions favoring further decarbonization to maintain that level of decrease year after year, but if we can do that, we remain on a path to limiting warming below dangerous levels (a decent chance of staying below 1.5°C and a very good chance of staying below 2°C) without resorting to very risky geoengineering schemes. It is a matter of political willpower, not technology: we already have the technology necessary to decarbonize our economy.
The authors are basically arguing that because carbon reductions haven’t been great enough (thanks to successful opposition by polluters and their advocates) we should consider geoengineering. That framing (unintentionally, I realize) provides precisely the crutch that polluters are looking for.
As I explain in the book:
A fundamental problem with geoengineering is that it presents what is known as a moral hazard, namely, a scenario in which one party (e.g., the fossil fuel industry) promotes actions that are risky for another party (e.g., the rest of us), but seemingly advantageous to itself. Geoengineering provides a potential crutch for beneficiaries of our continued dependence on fossil fuels. Why threaten our economy with draconian regulations on carbon when we have a cheap alternative? The two main problems with that argument are that (1) climate change poses a far greater threat to our economy than decarbonization, and (2) geoengineering is hardly cheap—it comes with great potential harm.
So, in short, this report is somewhat of a mixed bag. The scientific assessment and discussion is solid, and there is a discussion of uncertainties and caveats in the detailed report. But the spin in the opening falls victim to moral hazard and will provide fodder for geoengineering advocates to use in leveraging policy decision-making.
Last updated: March 10, 2021
By Vincent H. Smith and Eric J. Belasco
Congress has reduced risk by underwriting crop prices and cash revenues
Bill Gates is now the largest owner of farmland in the U.S., having made substantial investments in at least 19 states throughout the country. He has apparently followed the advice of another wealthy investor, Warren Buffett, who in a February 24, 2014, letter to investors described farmland as an investment that has “no downside and potentially substantial upside.”
There is a simple explanation for this affection for agricultural assets. Since the early 1980s, Congress has consistently succumbed to pressures from farm interest groups to remove as much risk as possible from agricultural enterprises by using taxpayer funds to underwrite crop prices and cash revenues.
Over the years, three trends in farm subsidy programs have emerged.
The first and most visible is the expansion of the federally supported crop insurance program, which has grown from less than $200 million in 1981 to over $8 billion in 2021. In 1980, only a few crops were covered and the government’s goal was just to pay for administrative costs. Today taxpayers pay over two-thirds of the total cost of the insurance programs that protect farmers against drops in prices and yields for hundreds of commodities ranging from organic oranges to GMO soybeans.
The second trend is the continuation of longstanding programs to protect farmers against relatively low revenues because of price declines and lower-than-average crop yields. The subsidies, which on average cost taxpayers over $5 billion a year, are targeted to major Corn Belt crops such as soybeans and wheat. Also included are other commodities such as peanuts, cotton and rice, which are grown in congressionally powerful districts in Georgia, the Carolinas, Texas, Arkansas, Mississippi and California.
The third, more recent trend is a return over the past four years to a 1970s practice: annual ad hoc “one off” programs justified by political expediency with support from the White House and Congress. These expenditures were $5.1 billion in 2018, $14.7 billion in 2019, and over $32 billion in 2020, of which $29 billion came from COVID relief funds authorized in the CARES Act. An additional $13 billion for farm subsidies was later included in the December 2020 stimulus bill.
If you are wondering why so many different subsidy programs are used to compensate farmers multiple times for the same price drops and other revenue losses, you are not alone. Our research indicates that many owners of large farms collect taxpayer dollars from all three sources. For many of the farms ranked in the top 10% in terms of sales, recent annual payments exceeded a quarter of a million dollars.
Farms with average or modest sales received much less. Their subsidies ranged from close to zero for small farms to a few thousand dollars for average-sized operations.
So what does all this have to do with Bill Gates, Warren Buffett and their love of farmland as an investment? In a financial environment in which real interest rates have been near zero or negative for almost two decades, the annual average inflation-adjusted (real) rate of return in agriculture (over 80% of which consists of land) has been about 5% for the past 30 years, despite some ups and downs, as this chart shows. It is a very solid investment for an owner who can hold on to farmland for the long term.
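A quick compounding check makes the point (illustrative arithmetic only, using the roughly 5% real return cited above): at that rate, a dollar of farmland equity more than quadruples in real terms over 30 years.

```python
# Illustrative compounding: $1 of farmland equity at a 5% real annual return over 30 years.
real_return = 0.05
years = 30
print(round((1 + real_return) ** years, 2))  # ~4.32, i.e. more than a four-fold real gain
```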
The overwhelming majority of farm owners can manage that because they have substantial amounts of equity (the sector-wide debt-to-equity ratio has been less than 14% for many years) and receive significant revenue from other sources.
Thus for almost all farm owners, and especially the largest 10% whose net equity averages over $6 million, as Buffett observed, there is little or no risk and lots of potential gain in owning and investing in agricultural land.
Returns from agricultural land stem from two sources: asset appreciation — increases in land prices, which account for the majority of the gains — and net cash income from operating the land. As is well known, farmland prices are closely tied to expected future revenue. And these include generous subsidies, which have averaged 17% of annual net cash incomes over the past 50 years. In addition, Congress often provides substantial additional one-off payments in years when net cash income is likely to be lower than average, as in 2000 and 2001 when grain prices were relatively low and in 2019 and 2020.
It is possible for small-scale investors to buy shares in real-estate investment trusts (REITs) that own and manage agricultural land. However, as with all such investments, how a REIT is managed can be a substantive source of risk unrelated to the underlying value of the land assets, not all of which may be farm land.
Thanks to Congress, and to the average, less affluent American taxpayer, farmers and other agricultural landowners get a steady and substantial return on their investments through subsidies that consistently guarantee and increase those revenues.
While many agricultural support programs are meant to “save the family farm,” the largest beneficiaries of agricultural subsidies are the richest landowners with the largest farms who, like Bill Gates and Warren Buffett, are scarcely in need of taxpayer handouts.
This combination of satellite images provided by the National Hurricane Center shows 30 hurricanes that occurred during the 2020 Atlantic hurricane season.
We’re one step closer to officially moving up hurricane season. The National Hurricane Center announced Tuesday that it would formally start issuing its hurricane season tropical weather outlooks on May 15 this year, bumping it up from the traditional start of hurricane season on June 1. The move comes after a recent spate of early season storms have raked the Atlantic.
Atlantic hurricane season runs from June 1 to November 30. That’s when conditions are most conducive to storm formation owing to warm air and water temperatures. (The Pacific Ocean has its own hurricane season, which covers the same timeframe, but since its waters are colder, fewer hurricanes tend to form there than in the Atlantic.)
Storms have begun forming in the Atlantic earlier as ocean and air temperatures have increased due to climate change. Last year, Hurricane Arthur roared to life off the East Coast on May 16. That storm made 2020 the sixth hurricane season in a row to have a storm form earlier than the June 1 official start date. While the National Oceanic and Atmospheric Administration won’t be moving up the start of the season just yet, the earlier outlooks address this recent history.
“In the last decade, there have been 10 storms formed in the weeks before the traditional start of the season, which is a big jump,” said Sean Sublette, a meteorologist at Climate Central, who pointed out that the 1960s through 2010s saw between one and three storms each decade before the June 1 start date on average.
It might be tempting to ascribe this earlier season entirely to climate change warming the Atlantic. But technology also has a role to play, with more observations along the coast as well as satellites that can spot storms far out to sea.
“I would caution that we can’t just go, ‘hah, the planet’s warming, we’ve had to move the entire season!’” Sublette said. “I don’t think there’s solid ground for attribution of how much of one there is over the other. Weather folks can sit around and debate that for awhile.”
Earlier storms don’t necessarily mean more harmful ones, either. In fact, hurricanes earlier in the season tend to be weaker than the monsters that form in August and September when hurricane season is at its peak. But regardless of their strength, these earlier storms have generated discussion inside the NHC on whether to move up the official start date for the season, when the agency usually puts out two reports per day on hurricane activity. Tuesday’s step is not an official announcement of this decision, but an acknowledgement of the increased attention on early hurricanes.
“I would say that [Tuesday’s announcement] is the National Hurricane Center being proactive,” Sublette said. “Like hey, we know that the last few years it’s been a little busier in May than we’ve seen in the past five decades, and we know there is an awareness now, so we’re going to start issuing these reports early.”
While the jury is still out on whether climate change is pushing the season earlier, research has shown that the strongest hurricanes are becoming more common, and that climate change is likely playing a role. A study published last year found that the odds of a storm becoming a major hurricane—Category 3 or stronger—have increased 49% in the basin since satellite monitoring began in earnest four decades ago. And when storms make landfall, sea level rise allows them to do more damage. So regardless of whether climate change is pushing the Atlantic hurricane season earlier, the risks are increasing. Now, at least, we’ll have better warnings before early storms do hit.
When the polio vaccine was declared safe and effective, the news was met with jubilant celebration. Church bells rang across the nation, and factories blew their whistles. “Polio routed!” newspaper headlines exclaimed. “An historic victory,” “monumental,” “sensational,” newscasters declared. People erupted with joy across the United States. Some danced in the streets; others wept. Kids were sent home from school to celebrate.
One might have expected the initial approval of the coronavirus vaccines to spark similar jubilation—especially after a brutal pandemic year. But that didn’t happen. Instead, the steady drumbeat of good news about the vaccines has been met with a chorus of relentless pessimism.
The problem is not that the good news isn’t being reported, or that we should throw caution to the wind just yet. It’s that neither the reporting nor the public-health messaging has reflected the truly amazing reality of these vaccines. There is nothing wrong with realism and caution, but effective communication requires a sense of proportion—distinguishing between due alarm and alarmism; warranted, measured caution and doombait; worst-case scenarios and claims of impending catastrophe. We need to be able to celebrate profoundly positive news while noting the work that still lies ahead. However, instead of balanced optimism since the launch of the vaccines, the public has been offered a lot of misguided fretting over new virus variants, subjected to misleading debates about the inferiority of certain vaccines, and presented with long lists of things vaccinated people still cannot do, while media outlets wonder whether the pandemic will ever end.
This pessimism is sapping people of energy to get through the winter, and the rest of this pandemic. Anti-vaccination groups and those opposing the current public-health measures have been vigorously amplifying the pessimistic messages—especially the idea that getting vaccinated doesn’t mean being able to do more—telling their audiences that there is no point in compliance, or in eventual vaccination, because it will not lead to any positive changes. They are using the moment and the messaging to deepen mistrust of public-health authorities, accusing them of moving the goalposts and implying that we’re being conned. Either the vaccines aren’t as good as claimed, they suggest, or the real goal of pandemic-safety measures is to control the public, not the virus.
Five key fallacies and pitfalls have affected public-health messaging, as well as media coverage, and have played an outsize role in derailing an effective pandemic response. These problems were deepened by the ways that we—the public—developed to cope with a dreadful situation under great uncertainty. And now, even as vaccines offer brilliant hope, and even though, at least in the United States, we no longer have to deal with the problem of a misinformer in chief, some officials and media outlets are repeating many of the same mistakes in handling the vaccine rollout.
The pandemic has given us an unwelcome societal stress test, revealing the cracks and weaknesses in our institutions and our systems. Some of these are common to many contemporary problems, including political dysfunction and the way our public sphere operates. Others are more particular, though not exclusive, to the current challenge—including a gap between how academic research operates and how the public understands that research, and the ways in which the psychology of coping with the pandemic have distorted our response to it.
Recognizing all these dynamics is important, not only for seeing us through this pandemic—yes, it is going to end—but also to understand how our society functions, and how it fails. We need to start shoring up our defenses, not just against future pandemics but against all the myriad challenges we face—political, environmental, societal, and technological. None of these problems is impossible to remedy, but first we have to acknowledge them and start working to fix them—and we’re running out of time.
The past 12 months were incredibly challenging for almost everyone. Public-health officials were fighting a devastating pandemic and, at least in this country, an administration hell-bent on undermining them. The World Health Organization was not structured or funded for independence or agility, but still worked hard to contain the disease. Many researchers and experts noted the absence of timely and trustworthy guidelines from authorities, and tried to fill the void by communicating their findings directly to the public on social media. Reporters tried to keep the public informed under time and knowledge constraints, which were made more severe by the worsening media landscape. And the rest of us were trying to survive as best we could, looking for guidance where we could, and sharing information when we could, but always under difficult, murky conditions.
Despite all these good intentions, much of the public-health messaging has been profoundly counterproductive. In five specific ways, the assumptions made by public officials, the choices made by traditional media, the way our digital public sphere operates, and communication patterns between academic communities and the public proved flawed.
Risk Compensation
One of the most important problems undermining the pandemic response has been the mistrust and paternalism that some public-health agencies and experts have exhibited toward the public. A key reason for this stance seems to be that some experts feared that people would respond to something that increased their safety—such as masks, rapid tests, or vaccines—by behaving recklessly. They worried that a heightened sense of safety would lead members of the public to take risks that would not just undermine any gains, but reverse them.
The theory that things that improve our safety might provide a false sense of security and lead to reckless behavior is attractive—it’s contrarian and clever, and fits the “here’s something surprising we smart folks thought about” mold that appeals to, well, people who think of themselves as smart. Unsurprisingly, such fears have greeted efforts to persuade the public to adopt almost every advance in safety, including seat belts, helmets, and condoms.
But time and again, the numbers tell a different story: Even if safety improvements cause a few people to behave recklessly, the benefits overwhelm the ill effects. In any case, most people are already interested in staying safe from a dangerous pathogen. Further, even at the beginning of the pandemic, sociological theory predicted that wearing masks would be associated with increased adherence to other precautionary measures—people interested in staying safe are interested in staying safe—and empirical research quickly confirmed exactly that. Unfortunately, though, the theory of risk compensation—and its implicit assumptions—continues to haunt our approach, in part because there hasn’t been a reckoning with the initial missteps.
Rules in Place of Mechanisms and Intuitions
Much of the public messaging focused on offering a series of clear rules to ordinary people, instead of explaining in detail the mechanisms of viral transmission for this pathogen. A focus on explaining transmission mechanisms, and updating our understanding over time, would have helped empower people to make informed calculations about risk in different settings. Instead, both the CDC and the WHO chose to offer fixed guidelines that lent a false sense of precision.
In the United States, the public was initially told that “close contact” meant coming within six feet of an infected individual, for 15 minutes or more. This messaging led to ridiculous gaming of the rules; some establishments moved people around at the 14th minute to avoid passing the threshold. It also led to situations in which people working indoors with others, but just outside the cutoff of six feet, felt that they could take their mask off. None of this made any practical sense. What happened at minute 16? Was seven feet okay? Faux precision isn’t more informative; it’s misleading.
All of this was complicated by the fact that key public-health agencies like the CDC and the WHO were late to acknowledge the importance of some key infection mechanisms, such as aerosol transmission. Even when they did so, the shift happened without a proportional change in the guidelines or the messaging—it was easy for the general public to miss its significance.
Frustrated by the lack of public communication from health authorities, I wrote an article last July on what we then knew about the transmission of this pathogen—including how it could be spread via aerosols that can float and accumulate, especially in poorly ventilated indoor spaces. To this day, I’m contacted by people who describe workplaces that are following the formal guidelines, but in ways that defy reason: They’ve installed plexiglass, but barred workers from opening their windows; they’ve mandated masks, but only when workers are within six feet of one another, while permitting them to be taken off indoors during breaks.
Perhaps worst of all, our messaging and guidelines elided the difference between outdoor and indoor spaces, even though, given the importance of aerosol transmission, the same precautions should not apply to both. This is especially important because this pathogen is overdispersed: Much of the spread is driven by a few people infecting many others at once, while most people do not transmit the virus at all.
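To make “overdispersed” concrete, here is a minimal sketch using the standard negative-binomial model of secondary cases; the reproduction number and dispersion value in it are illustrative assumptions, not measured estimates for this virus.

```python
# Minimal sketch of overdispersion: secondary cases drawn from a negative
# binomial with mean R and dispersion k (small k = most transmission comes
# from a few people). R and k here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
R, k = 1.2, 0.2  # assumed average secondary cases and dispersion parameter

# With NumPy's (n, p) parameterization, n = k and p = k / (k + R) gives mean R.
secondary = rng.negative_binomial(k, k / (k + R), size=100_000)

print("cases who infect no one:", f"{(secondary == 0).mean():.0%}")
top10 = np.sort(secondary)[::-1][: len(secondary) // 10]
print("share of all transmission from the top 10% of cases:",
      f"{top10.sum() / secondary.sum():.0%}")
```

With values in this range, most simulated cases infect no one, while a small minority account for the bulk of onward spread.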
After I wrote an article explaining how overdispersion and super-spreading were driving the pandemic, I discovered that this mechanism had also been poorly explained. I was inundated by messages from people, including elected officials around the world, saying they had no idea that this was the case. None of it was secret—numerous academic papers and articles had been written about it—but it had not been integrated into our messaging or our guidelines despite its great importance.
Crucially, super-spreading risk isn’t evenly distributed across settings; poorly ventilated indoor spaces can facilitate the spread of the virus over longer distances, and in shorter periods of time, than the guidelines suggested, and help fuel the pandemic.
Outdoors? It’s the opposite.
There is a solid scientific reason for the fact that there are relatively few documented cases of transmission outdoors, even after a year of epidemiological work: The open air dilutes the virus very quickly, and the sun helps deactivate it, providing further protection. And super-spreading—the biggest driver of the pandemic—appears to be an exclusively indoor phenomenon. I’ve been tracking every report I can find for the past year, and have yet to find a confirmed super-spreading event that occurred solely outdoors. Such events might well have taken place, but if the risk were great enough to justify altering our lives, I would expect at least a few to have been documented by now.
And yet our guidelines do not reflect these differences, and our messaging has not helped people understand these facts so that they can make better choices. I published my first article pleading for parks to be kept open on April 7, 2020—but outdoor activities are still banned by some authorities today, a full year after this dreaded virus began to spread globally.
We’d have been much better off if we gave people a realistic intuition about this virus’s transmission mechanisms. Our public guidelines should have been more like Japan’s, which emphasize avoiding the three C’s—closed spaces, crowded places, and close contact—that are driving the pandemic.
Scolding and Shaming
Throughout the past year, traditional and social media have been caught up in a cycle of shaming—made worse by being so unscientific and misguided. How dare you go to the beach? newspapers have scolded us for months, despite lacking evidence that this posed any significant threat to public health. It wasn’t just talk: Many cities closed parks and outdoor recreational spaces, even as they kept open indoor dining and gyms. Just this month, UC Berkeley and the University of Massachusetts at Amherst both banned students from taking even solitary walks outdoors.
Even when authorities relax the rules a bit, they do not always follow through in a sensible manner. In the United Kingdom, after some locales finally started allowing children to play on playgrounds—something that was already way overdue—they quickly ruled that parents must not socialize while their kids have a normal moment. Why not? Who knows?
On social media, meanwhile, pictures of people outdoors without masks draw reprimands, insults, and confident predictions of super-spreading—and yet few note when super-spreading fails to follow.
While visible but low-risk activities attract the scolds, other actual risks—in workplaces and crowded households, exacerbated by the lack of testing or paid sick leave—are not as easily accessible to photographers. Stefan Baral, an associate epidemiology professor at the Johns Hopkins Bloomberg School of Public Health, says that it’s almost as if we’ve “designed a public-health response most suitable for higher-income” groups and the “Twitter generation”—stay home; have your groceries delivered; focus on the behaviors you can photograph and shame online—rather than provide the support and conditions necessary for more people to keep themselves safe.
And the viral videos shaming people for failing to take sensible precautions, such as wearing masks indoors, do not necessarily help. For one thing, fretting over the occasional person throwing a tantrum while going unmasked in a supermarket distorts the reality: Most of the public has been complying with mask wearing. Worse, shaming is often an ineffective way of getting people to change their behavior, and it entrenches polarization and discourages disclosure, making it harder to fight the virus. Instead, we should be emphasizing safer behavior and stressing how many people are doing their part, while encouraging others to do the same.
Harm Reduction
Amidst all the mistrust and the scolding, a crucial public-health concept fell by the wayside. Harm reduction is the recognition that if there is an unmet and yet crucial human need, we cannot simply wish it away; we need to advise people on how to do what they seek to do more safely. Risk can never be completely eliminated; life requires more than futile attempts to bring risk down to zero. Pretending we can will away complexities and trade-offs with absolutism is counterproductive. Consider abstinence-only education: Not letting teenagers know about ways to have safer sex results in more of them having sex with no protections.
As Julia Marcus, an epidemiologist and associate professor at Harvard Medical School, told me, “When officials assume that risks can be easily eliminated, they might neglect the other things that matter to people: staying fed and housed, being close to loved ones, or just enjoying their lives. Public health works best when it helps people find safer ways to get what they need and want.”
Another problem with absolutism is the “abstinence violation” effect, Joshua Barocas, an assistant professor at the Boston University School of Medicine and Infectious Diseases, told me. When we set perfection as the only option, it can cause people who fall short of that standard in one small, particular way to decide that they’ve already failed, and might as well give up entirely. Most people who have attempted a diet or a new exercise regimen are familiar with this psychological state. The better approach is encouraging risk reduction and layered mitigation—emphasizing that every little bit helps—while also recognizing that a risk-free life is neither possible nor desirable.
Socializing is not a luxury—kids need to play with one another, and adults need to interact. “Your kids can play together outdoors, and outdoor time is the best chance to catch up with your neighbors” is not just a sensible message; it’s a way to decrease transmission risks. Some kids will play and some adults will socialize no matter what the scolds say or public-health officials decree, and they’ll do it indoors, out of sight of the scolding.
And if they don’t? Then kids will be deprived of an essential activity, and adults will be deprived of human companionship. Socializing is perhaps the most important predictor of health and longevity, after not smoking and perhaps exercise and a healthy diet. We need to help people socialize more safely, not encourage them to stop socializing entirely.
The Balance Between Knowledge and Action
Last but not least, the pandemic response has been distorted by a poor balance between knowledge, risk, certainty, and action.
Sometimes, public-health authorities insisted that we did not know enough to act, when the preponderance of evidence already justified precautionary action. Wearing masks, for example, posed few downsides, and held the prospect of mitigating the exponential threat we faced. The wait for certainty hampered our response to airborne transmission, even though there was almost no evidence for—and increasing evidence against—the importance of fomites, or objects that can carry infection. And yet, we emphasized the risk of surface transmission while refusing to properly address the risk of airborne transmission, despite increasing evidence. The difference lay not in the level of evidence and scientific support for either theory—which, if anything, quickly tilted in favor of airborne transmission, and not fomites, being crucial—but in the fact that fomite transmission had been a key part of the medical canon, and airborne transmission had not.
Sometimes, experts and the public discussion failed to emphasize that we were balancing risks, as in the recurring cycles of debate over lockdowns or school openings. We should have done more to acknowledge that there were no good options, only trade-offs between different downsides. As a result, instead of recognizing the difficulty of the situation, too many people accused those on the other side of being callous and uncaring.
And sometimes, the way that academics communicate clashed with how the public constructs knowledge. In academia, publishing is the coin of the realm, and it is often done through rejecting the null hypothesis—meaning that many papers do not seek to prove something conclusively, but instead, to reject the possibility that a variable has no relationship with the effect they are measuring (beyond chance). If that sounds convoluted, it is—there are historical reasons for this methodology and big arguments within academia about its merits, but for the moment, this remains standard practice.
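For readers unfamiliar with that convention, here is a minimal sketch of what “rejecting the null hypothesis” looks like in practice; the counts are hypothetical.

```python
# Hypothetical example of null-hypothesis testing with a 2x2 table:
# 8 infections among 1,000 vaccinated people vs. 40 among 1,000 unvaccinated.
from scipy.stats import fisher_exact

table = [[8, 992],    # vaccinated: infected, not infected
         [40, 960]]   # unvaccinated: infected, not infected

# Null hypothesis: vaccination has no relationship with infection risk.
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")

# A tiny p-value lets researchers reject "no relationship" (beyond chance);
# it does not, by itself, say how large, durable, or generalizable the effect is.
print(f"odds ratio ≈ {odds_ratio:.2f}, p-value ≈ {p_value:.6f}")
```

The convention, in other words, is built to rule out “no effect,” not to deliver the definitive, quantified answers the public was listening for.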
At crucial points during the pandemic, though, this resulted in mistranslations and fueled misunderstandings, which were further muddled by differing stances toward prior scientific knowledge and theory. Yes, we faced a novel coronavirus, but we should have started by assuming that we could make some reasonable projections from prior knowledge, while looking out for anything that might prove different. That prior experience should have made us mindful of seasonality, the key role of overdispersion, and aerosol transmission. A keen eye for what was different from the past would have alerted us earlier to the importance of presymptomatic transmission.
Thus, on January 14, 2020, the WHO stated that there was “no clear evidence of human-to-human transmission.” It should have said, “There is increasing likelihood that human-to-human transmission is taking place, but we haven’t yet proven this, because we have no access to Wuhan, China.” (Cases were already popping up around the world at that point.) Acting as if there was human-to-human transmission during the early weeks of the pandemic would have been wise and preventive.
Later that spring, WHO officials stated that there was “currently no evidence that people who have recovered from COVID-19 and have antibodies are protected from a second infection,” producing many articles laden with panic and despair. Instead, it should have said: “We expect the immune system to function against this virus, and to provide some immunity for some period of time, but it is still hard to know specifics because it is so early.”
Similarly, since the vaccines were announced, too many statements have emphasized that we don’t yet know if vaccines prevent transmission. Instead, public-health authorities should have said that we have many reasons to expect, and increasing amounts of data to suggest, that vaccines will blunt infectiousness, but that we’re waiting for additional data to be more precise about it. That’s been unfortunate, because while many, many things have gone wrong during this pandemic, the vaccines are one thing that has gone very, very right.
As late as April 2020, Anthony Fauci was slammed as too optimistic for suggesting we might plausibly have vaccines in a year to 18 months. We had vaccines much, much sooner than that: The first two vaccine trials concluded a mere eight months after the WHO declared a pandemic in March 2020.
Moreover, they have delivered spectacular results. In June 2020, the FDA said a vaccine that was merely 50 percent efficacious in preventing symptomatic COVID-19 would receive emergency approval—that such a benefit would be sufficient to justify shipping it out immediately. Just a few months after that, the trials of the Moderna and Pfizer vaccines concluded by reporting not just a stunning 95 percent efficacy, but also a complete elimination of hospitalization or death among the vaccinated. Even severe disease was practically gone: The lone case classified as “severe” among 30,000 vaccinated individuals in the trials was so mild that the patient needed no medical care, and her case would not have been considered severe if her oxygen saturation had been a single percent higher.
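The efficacy figure itself is simple arithmetic: one minus the ratio of attack rates in the vaccine and placebo arms. Here is a minimal sketch; the counts are rounded approximations of the kind of case split those trials reported, not exact trial figures.

```python
# Sketch of how trial efficacy is computed: 1 minus the risk ratio.
# Counts below are rounded, illustrative approximations of a large Phase 3 trial.
def vaccine_efficacy(cases_vax: int, n_vax: int,
                     cases_placebo: int, n_placebo: int) -> float:
    """Efficacy = 1 - (attack rate among vaccinated / attack rate among placebo)."""
    return 1.0 - (cases_vax / n_vax) / (cases_placebo / n_placebo)

# roughly 8 symptomatic cases among ~21,700 vaccinated vs. ~162 among ~21,700 on placebo
print(f"{vaccine_efficacy(8, 21_700, 162, 21_700):.1%}")  # about 95%
```

Against that yardstick, a merely 50 percent efficacious vaccine would have allowed roughly half as many cases as placebo; these allowed about one-twentieth.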
These are exhilarating developments, because global, widespread, and rapid vaccination is our way out of this pandemic. Vaccines that drastically reduce hospitalizations and deaths, and that diminish even severe disease to a rare event, are the closest things we have had in this pandemic to a miracle—though of course they are the product of scientific research, creativity, and hard work. They are going to be the panacea and the endgame.
And yet, two months into an accelerating vaccination campaign in the United States, it would be hard to blame people if they missed the news that things are getting better.
Yes, there are new variants of the virus, which may eventually require booster shots, but at least so far, the existing vaccines are standing up to them well—very, very well. Manufacturers are already working on new vaccines or variant-focused booster versions, in case they prove necessary, and the authorizing agencies are ready for a quick turnaround if and when updates are needed. Reports from places that have vaccinated large numbers of individuals, and even trials in places where variants are widespread, are exceedingly encouraging, with dramatic reductions in cases and, crucially, hospitalizations and deaths among the vaccinated. Global equity and access to vaccines remain crucial concerns, but the supply is increasing.
Here in the United States, despite the rocky rollout and the need to smooth access and ensure equity, it’s become clear that toward the end of spring 2021, supply will be more than sufficient. It may sound hard to believe today, as many who are desperate for vaccinations await their turn, but in the near future, we may have to discuss what to do with excess doses.
So why isn’t this story more widely appreciated?
Part of the problem with the vaccines was the timing—the trials concluded immediately after the U.S. election, and their results got overshadowed in the weeks of political turmoil. The first, modest headline announcing the Pfizer-BioNTech results in The New York Times was a single column, “Vaccine Is Over 90% Effective, Pfizer’s Early Data Says,” below a banner headline spanning the page: “BIDEN CALLS FOR UNITED FRONT AS VIRUS RAGES.” That was both understandable—the nation was weary—and a loss for the public.
Just a few days later, Moderna reported a similar 94.5 percent efficacy. If anything, that provided even more cause for celebration, because it confirmed that the stunning numbers coming out of Pfizer weren’t a fluke. But, still amid the political turmoil, the Moderna report got a mere two columns on The New York Times’ front page with an equally modest headline: “Another Vaccine Appears to Work Against the Virus.”
So we didn’t get our initial vaccine jubilation.
But as soon as we began vaccinating people, articles started warning the newly vaccinated about all they could not do. “COVID-19 Vaccine Doesn’t Mean You Can Party Like It’s 1999,” one headline admonished. And the buzzkill has continued right up to the present. “You’re fully vaccinated against the coronavirus—now what? Don’t expect to shed your mask and get back to normal activities right away,” began a recent Associated Press story.
People might well want to party after being vaccinated. Those shots will expand what we can do, first in our private lives and among other vaccinated people, and then, gradually, in our public lives as well. But once again, the authorities and the media seem more worried about potentially reckless behavior among the vaccinated, and about telling them what not to do, than with providing nuanced guidance reflecting trade-offs, uncertainty, and a recognition that vaccination can change behavior. No guideline can cover every situation, but careful, accurate, and updated information can empower everyone.
Take the messaging and public conversation around transmission risks from vaccinated people. It is, of course, important to be alert to such considerations: Many vaccines are “leaky” in that they prevent disease or severe disease, but not infection and transmission. In fact, completely blocking all infection—what’s often called “sterilizing immunity”—is a difficult goal, and something even many highly effective vaccines don’t attain, but that doesn’t stop them from being extremely useful.
As Paul Sax, an infectious-disease doctor at Boston’s Brigham & Women’s Hospital, put it in early December, it would be enormously surprising “if these highly effective vaccines didn’t also make people less likely to transmit.” From multiple studies, we already knew that asymptomatic individuals—those who never developed COVID-19 despite being infected—were much less likely to transmit the virus. The vaccine trials were reporting 95 percent reductions in any form of symptomatic disease. In December, we learned that Moderna had swabbed some portion of trial participants to detect asymptomatic, silent infections, and found an almost two-thirds reduction even in such cases. The good news kept pouring in. Multiple studies found that, even in those few cases where breakthrough disease occurred in vaccinated people, their viral loads were lower—which correlates with lower rates of transmission. Data from vaccinated populations further confirmed what many experts expected all along: Of course these vaccines reduce transmission.
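One way to see why those partial effects matter: a cut in infections and a cut in onward spread per breakthrough infection compound multiplicatively. The inputs in this sketch are illustrative assumptions, not trial results.

```python
# Sketch of how two partial protections compound into a large overall cut
# in expected onward transmission from a vaccinated person. Inputs are assumptions.
def transmission_reduction(fewer_infections: float,
                           less_spread_if_infected: float) -> float:
    return 1 - (1 - fewer_infections) * (1 - less_spread_if_infected)

# e.g., two-thirds fewer infections, and half as much onward spread per breakthrough
print(f"{transmission_reduction(2 / 3, 0.5):.0%}")  # about 83% less onward transmission
```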
And yet, from the beginning, a good chunk of the public-facing messaging and news articles implied or claimed that vaccines won’t protect you against infecting other people or that we didn’t know if they would, when both were false. I found myself trying to convince people in my own social network that vaccines weren’t useless against transmission, and being bombarded on social media with claims that they were.
What went wrong? The same thing that’s going wrong right now with the reporting on whether vaccines will protect recipients against the new viral variants. Some outlets emphasize the worst or misinterpret the research. Some public-health officials are wary of encouraging the relaxation of any precautions. Some prominent experts on social media—even those with seemingly solid credentials—tend to respond to everything with alarm and sirens. So the message that got heard was that vaccines will not prevent transmission, or that they won’t work against new variants, or that we don’t know if they will. What the public needs to hear, though, is that based on existing data, we expect them to work fairly well—but we’ll learn more about precisely how effective they’ll be over time, and that tweaks may make them even better.
A year into the pandemic, we’re still repeating the same mistakes.
The top-down messaging is not the only problem. The scolding, the strictness, the inability to discuss trade-offs, and the accusations of not caring about people dying not only have an enthusiastic audience, but portions of the public engage in these behaviors themselves. Maybe that’s partly because proclaiming the importance of individual actions makes us feel as if we are in the driver’s seat, despite all the uncertainty.
Psychologists talk about the “locus of control”—the strength of belief in control over your own destiny. They distinguish between people with more of an internal-control orientation—who believe that they are the primary actors—and those with an external one, who believe that society, fate, and other factors beyond their control greatly influence what happens to us. This focus on individual control goes along with something called the “fundamental attribution error”—when bad things happen to other people, we’re more likely to believe that they are personally at fault, but when they happen to us, we are more likely to blame the situation and circumstances beyond our control.
An individualistic locus of control is forged in the U.S. mythos—that we are a nation of strivers and people who pull ourselves up by our bootstraps. An internal-control orientation isn’t necessarily negative; it can facilitate resilience, rather than fatalism, by shifting the focus to what we can do as individuals even as things fall apart around us. This orientation seems to be common among children who not only survive but sometimes thrive in terrible situations—they take charge and have a go at it, and with some luck, pull through. It is probably even more attractive to educated, well-off people who feel that they have succeeded through their own actions.
You can see the attraction of an individualized, internal locus of control in a pandemic, as a pathogen without a cure spreads globally, interrupts our lives, makes us sick, and could prove fatal.
There have been very few things we could do at an individual level to reduce our risk beyond wearing masks, distancing, and disinfecting. The desire to exercise personal control against an invisible, pervasive enemy is likely why we’ve continued to emphasize scrubbing and cleaning surfaces, in what’s appropriately called “hygiene theater,” long after it became clear that fomites were not a key driver of the pandemic. Obsessive cleaning gave us something to do, and we weren’t about to give it up, even if it turned out to be useless. No wonder there was so much focus on telling others to stay home—even though it’s not a choice available to those who cannot work remotely—and so much scolding of those who dared to socialize or enjoy a moment outdoors.
And perhaps it was too much to expect a nation unwilling to release its tight grip on the bottle of bleach to greet the arrival of vaccines—however spectacular—by imagining the day we might start to let go of our masks.
The focus on individual actions has had its upsides, but it has also led to a sizable portion of pandemic victims being erased from public conversation. If our own actions drive everything, then some other individuals must be to blame when things go wrong for them. And throughout this pandemic, the mantra many of us kept repeating—“Wear a mask, stay home; wear a mask, stay home”—hid many of the real victims.
Study after study, in country after country, confirms that this disease has disproportionately hit the poor and minority groups, along with the elderly, who are particularly vulnerable to severe disease. Even among the elderly, though, those who are wealthier and enjoy greater access to health care have fared better.
The poor and minority groups are dying in disproportionately large numbers for the same reasons that they suffer from many other diseases: a lifetime of disadvantages, lack of access to health care, inferior working conditions, unsafe housing, and limited financial resources.
Many lacked the option of staying home precisely because they were working hard to enable others to do what they could not, by packing boxes, delivering groceries, producing food. And even those who could stay home faced other problems born of inequality: Crowded housing is associated with higher rates of COVID-19 infection and worse outcomes, likely because many of the essential workers who live in such housing bring the virus home to elderly relatives.
Individual responsibility certainly had a large role to play in fighting the pandemic, but many victims had little choice in what happened to them. By disproportionately focusing on individual choices, not only did we hide the real problem, but we failed to do more to provide safe working and living conditions for everyone.
For example, there has been a lot of consternation about indoor dining, an activity I certainly wouldn’t recommend. But even takeout and delivery can impose a terrible cost: One study of California found that line cooks are the highest-risk occupation for dying of COVID-19. Unless we provide restaurants with funds so they can stay closed, or provide restaurant workers with high-filtration masks, better ventilation, paid sick leave, frequent rapid testing, and other protections so that they can safely work, getting food to go can simply shift the risk to the most vulnerable. Unsafe workplaces may be low on our agenda, but they do pose a real danger. Bill Hanage, associate professor of epidemiology at Harvard, pointed me to a paper he co-authored: Workplace-safety complaints to OSHA—which oversees occupational-safety regulations—during the pandemic were predictive of increases in deaths 16 days later.
New data highlight the terrible toll of inequality: Life expectancy has decreased dramatically over the past year, with Black people losing the most from this disease, followed by members of the Hispanic community. Minorities are also more likely to die of COVID-19 at a younger age. But when the new CDC director, Rochelle Walensky, noted this terrible statistic, she immediately followed up by urging people to “continue to use proven prevention steps to slow the spread—wear a well-fitting mask, stay 6 ft away from those you do not live with, avoid crowds and poorly ventilated places, and wash hands often.”
Those recommendations aren’t wrong, but they are incomplete. None of these individual acts do enough to protect those to whom such choices aren’t available—and the CDC has yet to issue sufficient guidelines for workplace ventilation or to make higher-filtration masks mandatory, or even available, for essential workers. Nor are these proscriptions paired frequently enough with prescriptions: Socialize outdoors, keep parks open, and let children play with one another outdoors.
Vaccines are the tool that will end the pandemic. The story of their rollout combines some of our strengths and our weaknesses, revealing the limitations of the way we think and evaluate evidence, provide guidelines, and absorb and react to an uncertain and difficult situation.
But also, after a weary year, maybe it’s hard for everyone—including scientists, journalists, and public-health officials—to imagine the end, to have hope. We adjust to new conditions fairly quickly, even terrible new conditions. During this pandemic, we’ve adjusted to things many of us never thought were possible. Billions of people have led dramatically smaller, circumscribed lives, and dealt with closed schools, the inability to see loved ones, the loss of jobs, the absence of communal activities, and the threat and reality of illness and death.
Hope nourishes us during the worst times, but it is also dangerous. It upsets the delicate balance of survival—where we stop hoping and focus on getting by—and opens us up to crushing disappointment if things don’t pan out. After a terrible year, many things are understandably making it harder for us to dare to hope. But, especially in the United States, everything looks better by the day. Tragically, at least 28 million Americans have been confirmed to have been infected, but the real number is certainly much higher. By one estimate, as many as 80 million have already been infected with COVID-19, and many of those people now have some level of immunity. Another 46 million people have already received at least one dose of a vaccine, and we’re vaccinating millions more each day as the supply constraints ease. The vaccines are poised to reduce or nearly eliminate the things we worry most about—severe disease, hospitalization, and death.
Not all our problems are solved. We need to get through the next few months, as we race to vaccinate against more transmissible variants. We need to do more to address equity in the United States—because it is the right thing to do, and because failing to vaccinate the highest-risk people will slow the population impact. We need to make sure that vaccines don’t remain inaccessible to poorer countries. We need to keep up our epidemiological surveillance so that if we do notice something that looks like it may threaten our progress, we can respond swiftly.
And the public behavior of the vaccinated cannot change overnight—even if they are at much lower risk, it’s not reasonable to expect a grocery store to try to verify who’s vaccinated, or to have two classes of people with different rules. For now, it’s courteous and prudent for everyone to obey the same guidelines in many public places. Still, vaccinated people can feel more confident in doing things they may have avoided, just in case—getting a haircut, taking a trip to see a loved one, browsing for nonessential purchases in a store.
But it is time to imagine a better future, not just because it’s drawing nearer but because that’s how we get through what remains and keep our guard up as necessary. It’s also realistic—reflecting the genuine increased safety for the vaccinated.
Public-health agencies should immediately start providing expanded information to vaccinated people so they can make informed decisions about private behavior. This is justified by the encouraging data, and a great way to get the word out on how wonderful these vaccines really are. The delay itself has great human costs, especially for those among the elderly who have been isolated for so long.
Public-health authorities should also be louder and more explicit about the next steps, giving us guidelines for when we can expect easing in rules for public behavior as well. We need the exit strategy spelled out—but with graduated, targeted measures rather than a one-size-fits-all message. We need to let people know that getting a vaccine will almost immediately change their lives for the better, and why, and also when and how increased vaccination will change more than their individual risks and opportunities, and see us out of this pandemic.
We should encourage people to dream about the end of this pandemic by talking about it more, and more concretely: the numbers, hows, and whys. Offering clear guidance on how this will end can help strengthen people’s resolve to endure whatever is necessary for the moment—even if they are still unvaccinated—by building warranted and realistic anticipation of the pandemic’s end.
Hope will get us through this. And one day soon, you’ll be able to hop off the subway on your way to a concert, pick up a newspaper, and find the triumphant headline: “COVID Routed!”
Zeynep Tufekci is a contributing writer at The Atlantic and an associate professor at the University of North Carolina. She studies the interaction between digital technology, artificial intelligence, and society.
Lincoln Park in Chicago. Scientists are hopeful, as vaccinations continue and despite the emergence of variants, that we’re past the worst of the pandemic. Credit: Lyndon French for The New York Times
Many scientists are expecting another rise in infections. But this time the surge will be blunted by vaccines and, hopefully, widespread caution. By summer, Americans may be looking at a return to normal life.
Published Feb. 25, 2021 | Updated Feb. 26, 2021, 12:07 a.m. ET
Across the United States, and the world, the coronavirus seems to be loosening its stranglehold. The deadly curve of cases, hospitalizations and deaths has yo-yoed before, but never has it plunged so steeply and so fast.
Is this it, then? Is this the beginning of the end? After a year of being pummeled by grim statistics and scolded for wanting human contact, many Americans feel a long-promised deliverance is at hand.
Americans will win against the virus and regain many aspects of their pre-pandemic lives, most scientists now believe. Of the 21 interviewed for this article, all were optimistic that the worst of the pandemic is past. This summer, they said, life may begin to seem normal again.
But — of course, there’s always a but — researchers are also worried that Americans, so close to the finish line, may once again underestimate the virus.
So far, the two vaccines authorized in the United States are spectacularly effective, and after a slow start, the vaccination rollout is picking up momentum. A third vaccine is likely to be authorized shortly, adding to the nation’s supply.
But it will be many weeks before vaccinations make a dent in the pandemic. And now the virus is shape-shifting faster than expected, evolving into variants that may partly sidestep the immune system.
The latest variant was discovered in New York City only this week, and another worrisome version is spreading at a rapid pace through California. Scientists say a contagious variant first discovered in Britain will become the dominant form of the virus in the United States by the end of March.
The road back to normalcy is potholed with unknowns: how well vaccines prevent further spread of the virus; whether emerging variants remain susceptible enough to the vaccines; and how quickly the world is immunized, so as to halt further evolution of the virus.
But the greatest ambiguity is human behavior. Can Americans desperate for normalcy keep wearing masks and distancing themselves from family and friends? How much longer can communities keep businesses, offices and schools closed?
Covid-19 deaths will most likely never rise quite as precipitously as in the past, and the worst may be behind us. But if Americans let down their guard too soon — many states are already lifting restrictions — and if the variants spread in the United States as they have elsewhere, another spike in cases may well arrive in the coming weeks.
Scientists call it the fourth wave. The new variants mean “we’re essentially facing a pandemic within a pandemic,” said Adam Kucharski, an epidemiologist at the London School of Hygiene and Tropical Medicine.
The declines are real, but they disguise worrying trends.
The United States has now recorded 500,000 deaths amid the pandemic, a terrible milestone. As of Wednesday morning, at least 28.3 million people have been infected.
But the rate of new infections has tumbled by 35 percent over the past two weeks, according to a database maintained by The New York Times. Hospitalizations are down 31 percent, and deaths have fallen by 16 percent.
Yet the numbers are still at the horrific highs of November, scientists noted. At least 3,210 people died of Covid-19 on Wednesday alone. And there is no guarantee that these rates will continue to decrease.
“Very, very high case numbers are not a good thing, even if the trend is downward,” said Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health in Boston. “Taking the first hint of a downward trend as a reason to reopen is how you get to even higher numbers.”
In late November, for example, Gov. Gina Raimondo of Rhode Island limited social gatherings and some commercial activities in the state. Eight days later, cases began to decline. The trend reversed eight days after the state’s pause lifted on Dec. 20.
The virus’s latest retreat in Rhode Island and most other states, experts said, results from a combination of factors: growing numbers of people with immunity to the virus, either from having been infected or from vaccination; changes in behavior in response to the surges of a few weeks ago; and a dash of seasonality — the effect of temperature and humidity on the survival of the virus.
The vaccines were first rolled out to residents of nursing homes and to the elderly, who are at highest risk of severe illness and death. That may explain some of the current decline in hospitalizations and deaths.
But young people drive the spread of the virus, and most of them have not yet been inoculated. And the bulk of the world’s vaccine supply has been bought up by wealthy nations, which have amassed one billion more doses than needed to immunize their populations.
Vaccination cannot explain why cases are dropping even in countries where not a single soul has been immunized, like Honduras, Kazakhstan or Libya. The biggest contributor to the sharp decline in infections is something more mundane, scientists say: behavioral change.
Leaders in the United States and elsewhere stepped up community restrictions after the holiday peaks. But individual choices have also been important, said Lindsay Wiley, an expert in public health law and ethics at American University in Washington.
“People voluntarily change their behavior as they see their local hospital get hit hard, as they hear about outbreaks in their area,” she said. “If that’s the reason that things are improving, then that’s something that can reverse pretty quickly, too.”
The downward curve of infections with the original coronavirus disguises an exponential rise in infections with B.1.1.7, the variant first identified in Britain, according to many researchers.
“We really are seeing two epidemic curves,” said Ashleigh Tuite, an infectious disease modeler at the University of Toronto.
The B.1.1.7 variant is thought to be more contagious and more deadly, and it is expected to become the predominant form of the virus in the United States by late March. The number of cases with the variant in the United States has risen from 76 in 12 states as of Jan. 13 to more than 1,800 in 45 states now. Actual infections may be much higher because of inadequate surveillance efforts in the United States.
Buoyed by the shrinking rates over all, however, governors are lifting restrictions across the United States and are under enormous pressure to reopen completely. Should that occur, B.1.1.7 and the other variants are likely to explode.
“Everybody is tired, and everybody wants things to open up again,” Dr. Tuite said. “Bending to political pressure right now, when things are really headed in the right direction, is going to end up costing us in the long term.”
Another wave may be coming, but it can be minimized.
Looking ahead to late March or April, the majority of scientists interviewed by The Times predicted a fourth wave of infections. But they stressed that it is not an inevitable surge, if government officials and individuals maintain precautions for a few more weeks.
A minority of experts were more sanguine, saying they expected powerful vaccines and an expanding rollout to stop the virus. And a few took the middle road.
“We’re at that crossroads, where it could go well or it could go badly,” said Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases.
The vaccines have proved to be more effective than anyone could have hoped, so far preventing serious illness and death in nearly all recipients. At present, about 1.4 million Americans are vaccinated each day. More than 45 million Americans have received at least one dose.
A team of researchers at Fred Hutchinson Cancer Research Center in Seattle tried to calculate the number of vaccinations required per day to avoid a fourth wave. In a model completed before the variants surfaced, the scientists estimated that vaccinating just one million Americans a day would limit the magnitude of the fourth wave.
“But the new variants completely changed that,” said Dr. Joshua T. Schiffer, an infectious disease specialist who led the study. “It’s just very challenging scientifically — the ground is shifting very, very quickly.”
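To give a sense of the kind of calculation such a model involves (this is emphatically not the Fred Hutchinson model), here is a toy SIR-style simulation with a fixed number of vaccinations per day; every parameter is an assumption chosen only to show how the daily rate changes the size of a later wave.

```python
# Toy SIR model with constant daily vaccination, to illustrate (not reproduce)
# how the vaccination rate affects a later wave. All parameters are assumptions.
def peak_infections(doses_per_day: float, days: int = 200) -> float:
    N = 330e6                                  # U.S.-sized population
    S = 0.70 * N                               # assume ~30% already immune
    I = 300e3                                  # currently infectious
    R = N - S - I
    beta, gamma = 0.25, 1 / 7                  # assumed transmission / recovery rates
    peak = I
    for _ in range(days):
        new_inf = beta * S * I / N
        new_rec = gamma * I
        vaccinated = min(doses_per_day, max(S - new_inf, 0.0))
        S -= new_inf + vaccinated
        I += new_inf - new_rec
        R += new_rec + vaccinated
        peak = max(peak, I)
    return peak

for rate in (0.5e6, 1e6, 2e6):
    print(f"{rate / 1e6:.1f}M doses/day -> peak infectious ≈ {peak_infections(rate):,.0f}")
```

Real models add age structure, variants, seasonality and behavior, which is why the estimate shifted once the variants arrived.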
Natalie Dean, a biostatistician at the University of Florida, described herself as “a little more optimistic” than many other researchers. “We would be silly to undersell the vaccines,” she said, noting that they are effective against the fast-spreading B.1.1.7 variant.
But Dr. Dean worried about the forms of the virus detected in South Africa and Brazil that seem less vulnerable to the vaccines made by Pfizer and Moderna. (On Wednesday, Johnson & Johnson reported that its vaccine was relatively effective against the variant found in South Africa.)
About 50 infections with those two variants have been identified in the United States, but that could change. Because of the variants, scientists do not know how many people who were infected and had recovered are now vulnerable to reinfection.
South Africa and Brazil have reported reinfections with the new variants among people who had recovered from infections with the original version of the virus.
“That makes it a lot harder to say, ‘If we were to get to this level of vaccinations, we’d probably be OK,’” said Sarah Cobey, an evolutionary biologist at the University of Chicago.
Yet the biggest unknown is human behavior, experts said. The sharp drop in cases now may lead to complacency about masks and distancing, and to a wholesale lifting of restrictions on indoor dining, sporting events and more. Or … not.
“The single biggest lesson I’ve learned during the pandemic is that epidemiological modeling struggles with prediction, because so much of it depends on human behavioral factors,” said Carl Bergstrom, a biologist at the University of Washington in Seattle.
Taking into account the counterbalancing rises in both vaccinations and variants, along with the high likelihood that people will stop taking precautions, a fourth wave is highly likely this spring, the majority of experts told The Times.
Kristian Andersen, a virologist at the Scripps Research Institute in San Diego, said he was confident that the number of cases will continue to decline, then plateau in about a month. After mid-March, the curve in new cases will swing upward again.
In early to mid-April, “we’re going to start seeing hospitalizations go up,” he said. “It’s just a question of how much.”
Summer will feel like summer again, sort of.
Now the good news.
Despite the uncertainties, the experts predict that the last surge will subside in the United States sometime in the early summer. If the Biden administration can keep its promise to immunize every American adult by the end of the summer, the variants should be no match for the vaccines.
Combine vaccination with natural immunity and the human tendency to head outdoors as weather warms, and “it may not be exactly herd immunity, but maybe it’s sufficient to prevent any large outbreaks,” said Youyang Gu, an independent data scientist, who created some of the most prescient models of the pandemic.
Infections will continue to drop. More important, hospitalizations and deaths will fall to negligible levels — enough, hopefully, to reopen the country.
“Sometimes people lose vision of the fact that vaccines prevent hospitalization and death, which is really actually what most people care about,” said Stefan Baral, an epidemiologist at the Johns Hopkins Bloomberg School of Public Health.
Even as the virus begins its swoon, people may still need to wear masks in public places and maintain social distance, because a significant percent of the population — including children — will not be immunized.
“Assuming that we keep a close eye on things in the summer and don’t go crazy, I think that we could look forward to a summer that is looking more normal, but hopefully in a way that is more carefully monitored than last summer,” said Emma Hodcroft, a molecular epidemiologist at the University of Bern in Switzerland.
Imagine: Groups of vaccinated people will be able to get together for barbecues and play dates, without fear of infecting one another. Beaches, parks and playgrounds will be full of mask-free people. Indoor dining will return, along with movie theaters, bowling alleys and shopping malls — although they may still require masks.
The virus will still be circulating, but the extent will depend in part on how well vaccines prevent not just illness and death, but also transmission. The data on whether vaccines stop the spread of the disease are encouraging, but immunization is unlikely to block transmission entirely.
“It’s not zero and it’s not 100 — exactly where that number is will be important,” said Shweta Bansal, an infectious disease modeler at Georgetown University. “It needs to be pretty darn high for us to be able to get away with vaccinating anything below 100 percent of the population, so that’s definitely something we’re watching.”
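Dr. Bansal’s point can be made with the textbook threshold: the coverage needed is roughly (1 − 1/R0) divided by the share of transmission the vaccine blocks. The R0 values and blocking fractions below are illustrative assumptions, not estimates for this virus.

```python
# Back-of-the-envelope coverage thresholds: (1 - 1/R0) / transmission_blocking.
# R0 values and blocking fractions are illustrative assumptions, not estimates.
def coverage_needed(r0: float, transmission_blocking: float) -> float:
    return (1 - 1 / r0) / transmission_blocking

for r0 in (2.5, 4.0):                      # a less vs. more transmissible virus
    for blocking in (0.65, 0.8, 0.95):     # share of onward spread the vaccine cuts
        c = coverage_needed(r0, blocking)
        note = f"{c:.0%}" if c <= 1 else "unreachable by vaccination alone"
        print(f"R0={r0}, blocks {blocking:.0%} of transmission -> coverage needed ≈ {note}")
```

The more transmissible the virus and the leakier the vaccine, the closer the required coverage creeps toward, or past, 100 percent, which is why that number matters so much.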
Over the long term — say, a year from now, when all the adults and children in the United States who want a vaccine have received them — will this virus finally be behind us?
Every expert interviewed by The Times said no. Even after the vast majority of the American population has been immunized, the virus will continue to pop up in clusters, taking advantage of pockets of vulnerability. Years from now, the coronavirus may be an annoyance, circulating at low levels, causing modest colds.
Many scientists said their greatest worry post-pandemic was that new variants may turn out to be significantly less susceptible to the vaccines. Billions of people worldwide will remain unprotected, and each infection gives the virus new opportunities to mutate.
“We won’t have useless vaccines. We might have slightly less good vaccines than we have at the moment,” said Andrew Read, an evolutionary microbiologist at Penn State University. “That’s not the end of the world, because we have really good vaccines right now.”
For now, every one of us can help by continuing to be careful for just a few more months, until the curve permanently flattens.
“Just hang in there a little bit longer,” Dr. Tuite said. “There’s a lot of optimism and hope, but I think we need to be prepared for the fact that the next several months are likely to continue to be difficult.”