The world is living through an astral hell of short- and long-term threats. In a brilliant recent lecture, Tharman Shanmugaratnam, Singapore's senior minister, listed five risks that, in his view, add up to a "long perfect storm" for the planet. In this article, I discuss the implications of that picture for Brazil, while also seeking to identify the opportunities available.
The backdrop is well known. Waking from the dream of the peaceful, integrated world of Fukuyama's end of history, we face growing tensions across multiple spheres. The most shocking of all, and the first item on Tharman's list, is the Ukrainian tragedy, which marks the breakdown of a global governance that guaranteed the sovereignty and territorial integrity of all nations.
To this return of the original Cold War, ideological (in modified form) and military in nature, is added Cold War 2.0 between the United States and China, also ideological but far more complex in its fronts of dispute.
The clash between the two giants marks a period without a hegemonic global leadership, which, as Charles P. Kindleberger aptly diagnosed, tends to be highly unstable. From an economic standpoint, the two cold wars inevitably demand a major rethinking of alliances and of global production and trade relations.
For Brazil, it will be necessary to return to Itamaraty's traditional foreign policy, focused on pursuing the national interest through good relations made possible by our historical attachment to universal principles and our natural multilateral vocation. First and foremost, and as soon as possible, we must unequivocally defend the territorial integrity of all nations. We must also work to maintain mutually beneficial relations with most countries.
In his second major theme, the author discusses the danger of prolonged stagflation. The epicenter of the problem lies in the United States, where an economy overheated by expansionary policies has been hit by the supply shocks of the pandemic and the cold wars. For Brazil, the greater risk comes from the real possibility that the U.S. central bank will have to raise interest rates well beyond what markets already anticipate. It would remind us of the saying "when the North sneezes, the South catches pneumonia."
An alternative scenario, no more comforting, would be an even steeper fall in stock markets, accompanied by a new collapse in housing prices, which today stand above, in real terms, the levels of the bubble that burst in 2008.
On our side, the picture is even more complicated than in the United States, since inflation reached double digits even in recession. It is not hard to imagine a perfect storm for Brazil, where external and domestic challenges reinforce each other. The next president will have to conduct economic policy with courage and competence, preferably supported by the quality of the responses to the other challenges, which I discuss below.
The existential threat of climate change is the third theme of the speech. Here Brazil will have the opportunity to pull off a truly alchemical turnaround: trading the position of environmental pariah, the result of policies that increased deforestation and organized crime, for one of global leadership on the issue, with extremely positive consequences at home and abroad.
The creation of a carbon market, as has been discussed in Congress and promised by the Executive, would be an essential step in that direction. It is crucial that the market be designed to allow the country's full insertion into the global carbon market, an option not currently available. I see broad potential for investment in the sector, in a competitive environment fully aligned with the public interest (I am investing in this area).
The elevated risk of new pandemics comes next. Science recommends utmost caution on this front. Here too I see ample room for an about-face. It will be necessary to strengthen the SUS from every angle; with resources of just 4% of GDP, it urgently needs to climb the priority scale of budgets at every level of government.
National priorities should also include more support for research. Sources of funding for such efforts are not lacking, as I have argued here. What is lacking is budget transparency and political will.
Last on the list, but no less important, are the inequalities of growth and well-being within and between countries, with the richest at an advantage in both cases. This situation has been worsening with the "perfect storms" and provides fertile ground for populism and authoritarianism. Brazil has much to do in this area.
With success on these fronts, Brazil would qualify to play a relevant role in rebuilding a global governance now in tatters. The advantages would be immense, as it would help itself in everything else. Without success, however, the losses for the population would be enormous. A better future will come only if and when our democracy is no longer threatened and somewhat dysfunctional.
Two Kayapo shamans from Mato Grosso will travel to Roraima on Sunday to perform a rain-dance ritual in an attempt to put out the fires.
The two shamans are expected to depart from Colider, in the north of the state, near Pará. The flight to Boa Vista takes about five hours.
According to Kayapo Indian Pitisiaru Metupire, the ritual is secret and is to be performed by the two shamans alone, deep in the forest. Before performing it, they will spend some time in the forest "studying" the situation.
"There is no way to explain what it is like, because a shaman's work is very different from a doctor's," says Metupire. According to him, the shamans will speak with the spirits of their ancestors, who in turn will ask the spirits of rain and thunder to bring rain.
The ritual is expected to take place in the Demini village, where there are mountains considered sources of energy.
"The rain will start and will not stop," said Metupire, who is from the same village as the shamans traveling to Boa Vista. According to him, the Kayapo decided to help the Yanomami Indians and offered to perform the ritual.
"They need help, and we are going there to try to solve the problem," Metupire said. After the ritual, the Indians will remain in Roraima to "study" the results.
Besides the shamans, the head of Funai's office in Colider, Megaron Txucarramãe, will also travel to Boa Vista.
Summary: Drawing on 70 years of historic wind and solar-power data, researchers built an AI model to predict the probability of a network-scale ‘drought,’ when daily production of renewables fell below a target threshold. Under a threshold set at the 30th percentile, when roughly a third of all days are low-production days, the researchers found that Texas could face a daily energy drought for up to four months straight. Batteries would be unable to compensate for a drought of this length, and if the system relied on solar energy alone, the drought could be expected to last twice as long — for eight months.
Renewable energy prices have fallen by more than 70 percent in the last decade, driving more Americans to abandon fossil fuels for greener, less-polluting energy sources. But as wind and solar power continue to make inroads, grid operators may have to plan for large swings in availability.
The warning comes from Upmanu Lall, a professor at Columbia Engineering and the Columbia Climate School who has recently turned his sights from sustainable water use to sustainable renewables in the push toward net-zero carbon emissions.
“Designers of renewable energy systems will need to pay attention to changing wind and solar patterns over weeks, months, and years, the way water managers do,” he said. “You won’t be able to manage variability like this with batteries. You’ll need more capacity.”
In a new modeling study in the journal Patterns, Lall and Columbia PhD student Yash Amonkar show that solar and wind potential vary widely over days and weeks, not to mention months to years. They focused on Texas, which leads the country in generating electricity from wind power and is the fifth-largest solar producer. Texas also boasts a self-contained grid that’s as big as many countries’, said Lall, making it an ideal laboratory for charting the promise and peril of renewable energy systems.
Drawing on 70 years of historic wind and solar-power data, the researchers built an AI model to predict the probability of a network-scale “drought,” when daily production of renewables fell below a target threshold. Under a threshold set at the 30th percentile, when roughly a third of all days are low-production days, the researchers found that Texas could face a daily energy drought for up to four months straight.
Batteries would be unable to compensate for a drought of this length, said Lall, and if the system relied on solar energy alone, the drought could be expected to last twice as long — for eight months. “These findings suggest that energy planners will have to consider alternate ways of storing or generating electricity, or dramatically increasing the capacity of their renewable systems,” he said.
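The drought definition used in the study can be illustrated with a short sketch: flag days whose production falls at or below the 30th-percentile threshold, then find the longest consecutive run of such days. This is a toy illustration only; the researchers' actual model is the k-nearest-neighbor space-time simulator cited below, and the data here are invented.

```python
def percentile(values, q):
    """Linear-interpolation percentile (the common default convention)."""
    s = sorted(values)
    pos = q / 100 * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + frac * (s[hi] - s[lo])

def longest_energy_drought(daily_production, q=30):
    """Length (in days) of the longest run of consecutive days whose
    production is at or below the q-th percentile of the series."""
    threshold = percentile(daily_production, q)
    longest = current = 0
    for value in daily_production:
        current = current + 1 if value <= threshold else 0
        longest = max(longest, current)
    return longest

# Invented toy series: a 5-day low-production stretch amid high output
series = [100, 95, 10, 10, 10, 10, 10, 90, 85, 110]
print(longest_energy_drought(series))  # → 5
```

Applied to 70 years of daily wind and solar data, the same run-length logic is what turns a percentile threshold into a drought-duration estimate.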
Anticipating Future ‘Energy’ Droughts — in Texas, and Across the Continental United States
The research began six years ago, when Lall and a former graduate student, David Farnham, examined wind and solar variability at eight U.S. airports, where weather records tend to be longer and more detailed. They wanted to see how much variation could be expected under a hypothetical 100% renewable-energy grid.
The results, which Farnham published in his PhD thesis, weren’t a surprise. Farnham and Lall found that solar and wind potential, like rainfall, is highly variable based on the time of year and the place where wind turbines and solar panels have been sited. Across eight cities, they found that renewable energy potential rose and fell from the long-term average by as much as a third in some seasons.
“We coined the term ‘energy’ droughts since a 10-year cycle with this much variation from the long-term average would be seen as a major drought,” said Lall. “That was the beginning of the energy drought work.”
In the current study, Lall chose to zoom in on Texas, a state well-endowed with both sun and wind. Lall and Amonkar found that persistent renewable energy droughts could last as long as a year even if solar and wind generators were spread across the entire state. The conclusion, Lall said, is that renewables face a storage problem that can only realistically be solved by adding additional capacity or sources of energy.
“In a fully renewable world, we would need to develop nuclear fuel or hydrogen fuel, or carbon recycling, or add much more capacity for generating renewables, if we want to avoid burning fossil fuels,” he said.
In times of low rainfall, water managers keep fresh water flowing through the spigot by tapping municipal reservoirs or underground aquifers. Solar and wind energy systems have no equivalent backup. The batteries used to store excess solar and wind power on exceptionally bright and gusty days hold a charge for only a few hours, and at most, a few days. Hydropower plants provide a potential buffer, said Lall, but not for long enough to carry the system through an extended dry spell of intermittent sun and wind.
“We won’t solve the problem by building a larger network,” he said. “Electric grid operators have a target of 99.99% reliability while water managers strive for 90 percent reliability. You can see what a challenging game this will be for the energy industry, and just how valuable seasonal and longer forecasts could be.”
In the next phase of research, Lall will work with Columbia Engineering professors Vijay Modi and Bolun Xu to see if they can predict both energy droughts and “floods,” when the system generates a surplus of renewables. Armed with these projections, they hope to predict the rise and fall of energy prices.
Yash Amonkar, David J. Farnham, Upmanu Lall. A k-nearest neighbor space-time simulator with applications to large-scale wind and solar power modeling. Patterns, 2022; 3 (3): 100454 DOI: 10.1016/j.patter.2022.100454
We can reduce global temperatures faster than we once thought — if we act now
One of the biggest obstacles to avoiding global climate breakdown is that so many people think there’s nothing we can do about it.
They point out that record-breaking heat waves, fires and storms are already devastating communities and economies throughout the world. And they’ve long been told that temperatures will keep rising for decades to come, no matter how many solar panels replace oil derricks or how many meat-eaters go vegetarian. No wonder they think we’re doomed.
But climate science actually doesn’t say this. To the contrary, the best climate science you’ve probably never heard of suggests that humanity can still limit the damage to a fraction of the worst projections if — and, we admit, this is a big if — governments, businesses and all of us take strong action starting now.
For many years, the scientific rule of thumb was that a sizable amount of temperature rise was locked into the Earth’s climate system. Scientists believed — and told policymakers and journalists, who in turn told the public — that even if humanity hypothetically halted all heat-trapping emissions overnight, carbon dioxide’s long lifetime in the atmosphere, combined with the sluggish thermal properties of the oceans, would nevertheless keep global temperatures rising for 30 to 40 more years. Since shifting to a zero-carbon global economy would take at least a decade or two, temperatures were bound to keep rising for at least another half-century.
But guided by subsequent research, scientists dramatically revised that lag time estimate down to as little as three to five years. That is an enormous difference that carries paradigm-shifting and broadly hopeful implications for how people, especially young people, think and feel about the climate emergency and how societies can respond to it.
This revised science means that if humanity slashes emissions to zero, global temperatures will stop rising almost immediately. To be clear, this is not a get-out-of-jail-free card. Global temperatures will not fall if emissions go to zero, so the planet’s ice will keep melting and sea levels will keep rising. But global temperatures will stop their relentless climb, buying humanity time to devise ways to deal with such unavoidable impacts. In short, we are not irrevocably doomed — or at least we don’t have to be, if we take bold, rapid action.
The science we’re referencing was included — but buried — in the United Nations Intergovernmental Panel on Climate Change’s most recent report, issued in August. Indeed, it was first featured in the IPCC’s landmark 2018 report, “Global Warming of 1.5 C.” That report’s key finding — that global emissions must fall by 45 percent by 2030 to avoid catastrophic climate disruption — generated headlines declaring that we had “12 years to save the planet.” That 12-year timeline, and the related concept of a “carbon budget” — the amount of carbon that can be burned while still limiting temperature rise to 1.5 degrees Celsius above preindustrial levels — were both rooted in this revised science. Meanwhile, the public and policy worlds have largely neglected the revised science that enabled these very estimates.
Nonscientists can reasonably ask: What made scientists change their minds? Why should we believe their new estimate of a three-to-five-year lag time if their previous estimate of 30 to 40 years is now known to be incorrect? And does this mean the world still must cut emissions in half by 2030 to avoid climate catastrophe?
The short answer to the last question is yes. Remember, temperatures only stop rising once global emissions fall to zero. Currently, emissions are not falling. Instead, humanity continues to pump approximately 36 billion tons of carbon dioxide a year into the atmosphere. The longer it takes to cut those 36 billion tons to zero, the more temperature rise humanity eventually will face. And as the IPCC’s 2018 report made hauntingly clear, pushing temperatures above 1.5 degrees C would cause unspeakable amounts of human suffering, economic loss and social breakdown — and perhaps trigger genuinely irreversible impacts.
Scientists changed their minds about how much warming is locked in because additional research gave them a much better understanding of how the climate system works. Their initial 30-to-40-year estimates were based on relatively simple computer models that treated the concentration of carbon dioxide in the atmosphere as a “control knob” that determines temperature levels. The long lag in the warming impact is due to the oceans, which continue to warm long after the control knob is turned up. More recent climate models account for the more dynamic nature of carbon emissions. Yes, CO2 pushes temperatures higher, but carbon “sinks,” including forests and in particular the oceans, absorb almost half of the CO2 that is emitted, causing atmospheric CO2 levels to drop, offsetting the delayed warming effect.
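The carbon accounting behind this revision reduces to back-of-the-envelope arithmetic. The 36-billion-ton figure appears earlier in this piece; the roughly 50 percent sink fraction is the article's approximation, so the numbers below are an illustration, not a carbon-cycle model.

```python
EMISSIONS_GT = 36.0    # annual global CO2 emissions, billions of tons (from this article)
SINK_FRACTION = 0.5    # share absorbed by forests and oceans (~half, per the article)

# While emissions continue, roughly half of each year's CO2 stays in the air:
net_addition_gt = EMISSIONS_GT * (1 - SINK_FRACTION)
print(net_addition_gt)  # 18.0 Gt added to the atmosphere per year

# At zero emissions, the sinks keep absorbing for a time, so atmospheric CO2
# edges down instead of holding steady. That drawdown is what offsets the
# oceans' delayed warming and stalls temperature rise within years.
```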
Knowing that 30 more years of rising temperatures are not necessarily locked in can be a game-changer for how people, governments and businesses respond to the climate crisis. Understanding that we can still save our civilization if we take strong, fast action can banish the psychological despair that paralyzes people and instead motivate them to get involved. Lifestyle changes can help, but that involvement must also include political engagement. Slashing emissions in half by 2030 demands the fastest possible transition away from today’s fossil-fueled economies in favor of wind, solar and other non-carbon alternatives. That can happen only if governments enact dramatically different policies. If citizens understand that things aren’t hopeless, they can better push elected officials to make such changes.
As important as minimizing temperature rise is to the United States, where last year’s record wildfires in California and the Pacific Northwest illustrated just how deadly climate change can be, it matters most in the highly climate-vulnerable communities throughout the global South. Countless people in Bangladesh, the Philippines, Madagascar, Africa’s Sahel nations, Brazil, Honduras and other low-income countries have already been suffering from climate disasters for decades because their communities tend to be more exposed to climate impacts and have less financial capacity to protect themselves. For millions of people in such countries, limiting temperature rise to 1.5 degrees C is not a scientific abstraction.
The IPCC’s next report, due for release Feb. 28, will address how societies can adapt to the temperature rise now underway and the fires, storms and rising seas it unleashes. If we want a livable future for today’s young people, temperature rise must be kept as close as possible to 1.5 C. The best climate science most people have never heard of says that goal remains within reach. The question is whether enough of us will act on that knowledge in time.
America’s coastline will see sea levels rise in the next 30 years by as much as they did in the entire 20th century, with major Eastern cities hit regularly with costly floods even on sunny days, a government report warns.
By 2050, seas lapping against the U.S. shore will be 10 to 12 inches (0.25 to 0.3 meters) higher, with parts of Louisiana and Texas projected to see waters a foot and a half (0.45 meters) higher, according to a 111-page report issued Tuesday by the National Oceanic and Atmospheric Administration and six other federal agencies.
“Make no mistake: Sea level rise is upon us,” said Nicole LeBoeuf, director of NOAA’s National Ocean Service.
The projected increase is especially alarming given that in the 20th century, seas along the Atlantic coast rose at the fastest clip in 2,000 years.
LeBoeuf warned that the cost will be high, pointing out that much of the American economy and 40% of the population are along the coast.
However, the worst of the long-term sea level rise from the melting of ice sheets in Antarctica and Greenland probably won’t kick in until after 2100, said ocean service oceanographer William Sweet, the report’s lead author.
Warmer water expands, and the melting ice sheets and glaciers add more water to the world’s oceans.
The report “is the equivalent of NOAA sending a red flag up” about accelerating the rise in sea levels, said University of Wisconsin-Madison geoscientist Andrea Dutton, a specialist in sea level rise who wasn’t part of the federal report. The coastal flooding the U.S. is seeing now “will get taken to a whole new level in just a couple of decades.”
“We can see this freight train coming from more than a mile away,” Dutton said in an email. “The question is whether we continue to let houses slide into the ocean.”
Sea level rises more in some places than others because of sinking land, currents and water from ice melt. The U.S. will get slightly more sea level rise than the global average. And the greatest rise in the U.S. will be on the Gulf and East Coasts, while the West Coast and Hawaii will be hit less than average, Sweet said.
For example, between now and 2060, expect almost 25 inches (0.63 meters) of sea level rise in Galveston, Texas, and just under 2 feet (0.6 meters) in St. Petersburg, Florida, but only 9 inches (0.23 meters) in Seattle and 14 inches (0.36 meters) in Los Angeles, the report said.
While higher seas cause much more damage when storms such as hurricanes hit the coast, they are becoming a problem even on sunny days.
Cities such as Miami Beach, Florida; Annapolis, Maryland; and Norfolk, Virginia, already get a few minor “nuisance” floods a year during high tides, but those will be replaced by several “moderate” floods a year by mid-century, ones that cause property damage, the researchers said.
“It’s going to be areas that haven’t been flooding that are starting to flood,” Sweet said in an interview. “Many of our major metropolitan areas on the East Coast are going to be increasingly at risk.”
The western Gulf of Mexico coast should see the highest sea level rise — 16 to 18 inches (0.4 to 0.45 meters) — by 2050, the report said. And that means more than 10 moderate property-damaging sunny-day floods and one “major” high tide flood event a year.
The eastern Gulf of Mexico should expect 14 to 16 inches (0.35 to 0.4 meters) of sea level rise by 2050 and three moderate sunny-day floods a year. By mid-century, the Southeast coast should get a foot to 14 inches (0.3 to 0.35 meters) of sea level rise and four sunny-day moderate floods a year, while the Northeast coast should get 10 inches to a foot (0.25 to 0.3 meters) of sea level rise and six moderate sunny-day floods a year.
Both the Hawaiian Islands and Southwestern coast should expect 6 to 8 inches (0.15 to 0.2 meters) of sea level rise by mid-century, with the Northwest coast seeing only 4 to 6 inches (0.1 to 0.15 meters). The Pacific coastline will get more than 10 minor nuisance sunny-day floods a year but only about one moderate one a year, with Hawaii getting even less than that.
And that’s just until 2050. The report is projecting an average of about 2 feet of sea level rise in the United States — more in the East, less in the West — by the end of the century.
In Spain, rainfall this winter stands at only a third of the average in recent years
Feb. 14, 2022
In north-western Spain, the sight of roofs emerging from the surface of the water in the Lindoso reservoir is not uncommon at the height of particularly dry summers, but since the lake was first created three decades ago, this winter is the first time the flooded village of Aceredo has been revealed in its entirety.
The decrepit old stone works of the village are an indication of the extent of the severe winter drought impacting Spain and Portugal, which is now devastating crops after more than two months with no rain.
While 10 per cent of Spain has officially been declared as being under “prolonged drought,” large areas outside this categorisation, particularly in the south, also face extreme shortages that could impact the irrigation of crops.
Overall, around 50 per cent of Spanish farms are believed to be at risk from the record low rainfall, which is hitting rain-fed crops including cereals, olives, nuts and vineyards; these could lose 6 to 8 per cent of their production, Spanish farming organisations have warned.
While the government is planning to spend around €570m (£477m) to improve irrigation systems, the lack of rainfall has been blamed on the worsening climate crisis.
Over the last three months of 2021, Spain recorded just 35 per cent of the average rainfall it had during the same period from 1981 to 2010. But there has been almost no rain since then.
Meanwhile in Portugal, 45 per cent of the country is currently experiencing “severe” or “extreme” drought conditions, Portuguese national weather agency IPMA said, with the climate crisis bringing hotter, drier conditions that make agriculture increasingly difficult.
IPMA climatologist Vanda Pires told AP the agency had recorded an increase in the frequency of droughts in Portugal over the past 20 to 30 years, with lower rainfall and higher temperatures.
“It’s part of the context of climate change,” she said.
Scientists estimate that Portugal will see a drop in average annual rainfall of 20 per cent to 40 per cent by the end of the century.
According to Spain’s national weather agency, AEMET, 2005 is the only other year this century to have seen a January with almost no rain.
If there is not significant rain within the next two weeks, emergency subsidies for farmers will be needed, Spanish authorities told AP.
Rubén del Campo, a spokesman for the Spanish weather service, said the below-average rainfall over the last six months was likely to continue for several more weeks, with hopes that spring will bring much-needed rainfall.
Spanish Agriculture Minister Luis Planas said last week the government would take emergency action if it did not rain in two weeks – likely to be financial support measures for farmers to alleviate the loss of crops and revenues.
United Nations (AP) — Drought in the Horn of Africa has killed more than 1.5 million livestock and drastically cut cereal production, “and we are most definitely now sitting on the brink of catastrophe,” a senior official for the U.N. Food and Agriculture Organization said Monday.
Rein Paulsen, FAO’s director of emergencies and resilience who returned from the region Friday, said a “very small window” exists for taking urgent action, and a key is whether the region’s long rains between March and May are good — and whether the agency gets the $130 million it needs until June.
The short rains in the region, which includes parts of Somalia, Ethiopia and Kenya, were supposed to come between October and December but “were extremely poor,” he said. “And this represents the third consecutive failed rainy season with lower average rains, all of which has a severe impact on vulnerable households.”
The result of the drought meant that overall cereal production for the last rainy season in southern Somalia was estimated to be 58% lower than the long-term average, Paulsen said. In agricultural areas in marginal coastal zones in southeastern parts of Kenya, “we’re looking at crop production estimated to be 70% below average,” he said.
In addition, most places for water that have usually been resilient to climate variability have dried up in Kenya, he said during a virtual news conference from Rome.
Paulsen said $130 million in funding is essential now to provide cash for people to buy food until production resumes, to keep livestock alive and to provide drought-resistant seeds for farmers to reap a harvest.
“We have a window to the middle of this year — to June, which is a very time sensitive, narrow window for urgent actions to scale up to prevent a worst-case scenario,” Paulsen said. “Agriculture needs a lot more attention. It’s central to the survival of drought affected communities.”
During his visit to the region, Paulsen said: “We saw both livestock and wildlife carcasses by the side of the road as we were driving. We saw animals dying together with their farmers, and the numbers I think are quite shocking.”
In Kenya alone, 1.4 million livestock died in the final part of last year as a result of drought, and in southern Ethiopia, about 240,000 livestock died as a result of drought, he said.
Paulsen said that “it was quite traumatic driving through communities and seeing farmers tending livestock as they were dying by the side of the roads.”
Livestock are not only crucial to people’s livelihoods, he said, but they provide milk for children, and FAO is focused on providing urgent fodder and water to keep them alive.
The U.N. World Food Program said Feb. 8 that drought has left an estimated 13 million people in the Horn of Africa facing severe hunger amid the driest conditions since 1981. It is seeking $327 million to look after the urgent needs of 4.5 million people over the next six months.
1. More scientists are investigating ways to help people adapt
Over the past half century, thousands of scientists around the world have dedicated their careers to documenting the link between climate change and human activity. A remarkable amount of this work has been done at Columbia’s Lamont-Doherty Earth Observatory, in Palisades, New York. Indeed, one of the founders of modern climate science, the late Columbia geochemist Wally Broecker ’53CC, ’58GSAS, popularized the term “global warming” and first alerted the broader scientific community to the emerging climate crisis in a landmark 1975 paper. He and other Columbia researchers then set about demonstrating that rising global temperatures could not be explained by the earth’s natural long-term climate cycles. For evidence, they relied heavily on Columbia’s world-class collections of tree-ring samples and deep-sea sediment cores, which together provide a unique window into the earth’s climate history.
Today, experts say, the field of climate science is in transition. Having settled the question of whether humans are causing climate change — the evidence is “unequivocal,” according to the UN’s Intergovernmental Panel on Climate Change (IPCC) — many scientists have been branching out into new areas, investigating the myriad ways that global warming is affecting our lives. Columbia scholars from fields as diverse as public health, agriculture, economics, law, political science, urban planning, finance, and engineering are now teaming up with climate scientists to learn how communities can adapt to the immense challenges they are likely to confront.
The University is taking bold steps to support such interdisciplinary thinking. Its new Columbia Climate School, established last year, is designed to serve as a hub for research and education on climate sustainability. Here a new generation of students will be trained to find creative solutions to the climate crisis. Its scholars are asking questions such as: How can communities best protect themselves from rising sea levels and intensifying storm surges, droughts, and heat waves? When extreme weather occurs, what segments of society are most vulnerable? And what types of public policies and ethical principles are needed to ensure fair and equitable adaptation strategies? At the same time, Columbia engineers, physicists, chemists, data scientists, and others are working with entrepreneurs to develop the new technologies that are urgently needed to scale up renewable-energy systems and curb emissions.
“The challenges that we’re facing with climate change are so huge, and so incredibly complex, that we need to bring people together from across the entire University to tackle them,” says Alex Halliday, the founding dean of the Columbia Climate School and the director of the Earth Institute. “Success will mean bringing the resources, knowledge, and capacity of Columbia to the rest of the world and guiding society toward a more sustainable future.”
For climate scientists who have been at the forefront of efforts to document the effects of fossil-fuel emissions on our planet, the shift toward helping people adapt to climate change presents new scientific challenges, as well as the opportunity to translate years of basic research into practical, real-world solutions.
“A lot of climate research has traditionally looked at how the earth’s climate system operates at a global scale and predicted how a given amount of greenhouse-gas emissions will affect global temperatures,” says Adam Sobel, a Columbia applied physicist, mathematician, and climate scientist. “The more urgent questions we face now involve how climate hazards vary across the planet, at local or regional scales, and how those variations translate into specific risks to human society. We also need to learn to communicate climate risks in ways that can facilitate actions to reduce them. This is where climate scientists need to focus more of our energy now, if we’re to maximize the social value of our work.”
2. Big data will enable us to predict extreme weather
Just a few years ago, scientists couldn’t say with any confidence how climate change was affecting storms, floods, droughts, and other extreme weather around the world. But now, armed with unprecedented amounts of real-time and historical weather data, powerful new supercomputers, and a rapidly evolving understanding of how different parts of our climate system interact, researchers are routinely spotting the fingerprints of global warming on our weather.
“Of course, no individual weather event can be attributed solely to climate change, because weather systems are highly dynamic and subject to natural variability,” says Sobel, who studies global warming’s impact on extreme weather. “But data analysis clearly shows that global warming is tilting the scales of nature in a way that is increasing both the frequency and intensity of certain types of events, including heat waves, droughts, and floods.”
According to the World Meteorological Organization, the total number of major weather-related disasters to hit the world annually has increased five-fold since the 1970s. In 2021, the US alone endured eighteen weather-related disasters that caused at least $1 billion in damages. Those included Hurricanes Ida and Nicholas; tropical storms Fred and Elsa; a series of thunderstorms that devastated broad swaths of the Midwest; floods that overwhelmed the coasts of Texas and Louisiana; and a patchwork of wildfires that destroyed parts of California, Oregon, Washington, Idaho, Montana, and Arizona. In 2020, twenty-two $1 billion events struck this country — the most ever.
“The pace and magnitude of the weather disasters we’ve seen over the past couple of years are just bonkers,” says Sobel, who studies the atmospheric dynamics behind hurricanes. (He notes that while hurricanes are growing stronger as a result of climate change, scientists are not yet sure if they are becoming more common.) “Everybody I know who studies this stuff is absolutely stunned by it. When non-scientists ask me what I think about the weather these days, I say, ‘If it makes you worried for the future, it should, because the long-term trend is terrifying.’”
The increasing ferocity of our weather, scientists say, is partly attributable to the fact that warmer air can hold more moisture. This means that more water is evaporating off oceans, lakes, and rivers and accumulating in the sky, resulting in heavier rainstorms. And since hot air also wicks moisture out of soil and vegetation, regions that tend to receive less rainfall, like the American West, North Africa, the Middle East, and Central Asia, are increasingly prone to drought and all its attendant risks. “Climate change is generally making wet areas wetter and dry regions drier,” Sobel says.
But global warming is also altering the earth’s climate system in more profound ways. Columbia glaciologist Marco Tedesco, among others, has found evidence that rising temperatures in the Arctic are weakening the North Atlantic jet stream, a band of westerly winds that influence much of the Northern Hemisphere’s weather. These winds are produced when cold air from the Arctic clashes with warm air coming up from the tropics. But because the Arctic is warming much faster than the rest of the world, the temperature differential between these air flows is diminishing and causing the jet stream to slow down and follow a more wobbly path. As a result, scientists have discovered, storm systems and pockets of hot or cold air that would ordinarily be pushed along quickly by the jet stream are now sometimes hovering over particular locations for days, amplifying their impact. Experts say that the jet stream’s new snail-like pace may explain why a heavy rainstorm parked itself over Zhengzhou, China, for three days last July, dumping an entire year’s worth of precipitation, and why a heat wave that same month brought 120-degree temperatures and killed an estimated 1,400 people in the northwestern US and western Canada.
Many Columbia scientists are pursuing research projects aimed at helping communities prepare for floods, droughts, heat waves, and other threats. Sobel and his colleagues, for example, have been using their knowledge of hurricane dynamics to develop an open-source computer-based risk-assessment model that could help policymakers in coastal cities from New Orleans to Mumbai assess their vulnerability to cyclones as sea levels rise and storms grow stronger. “The goal is to create analytic tools that will reveal how much wind and flood damage would likely occur under different future climate scenarios, as well as the human and economic toll,” says Sobel, whose team has sought input from public-health researchers, urban planners, disaster-management specialists, and civil engineers and is currently collaborating with insurance companies as well as the World Bank, the International Red Cross, and the UN Capital Development Fund. “Few coastal cities have high-quality information of this type, which is necessary for making rational adaptation decisions.”
Radley Horton ’07GSAS, another Columbia climatologist who studies weather extremes; Christian Braneon, a Columbia civil engineer and climate scientist; and Kim Knowlton ’05PH and Thomas Matte, Columbia public-health researchers, are members of the New York City Panel on Climate Change, a scientific advisory body that is helping local officials prepare for increased flooding, temperature spikes, and other climate hazards. New York City has acted decisively to mitigate and adapt to climate change, in part by drawing on the expertise of scientists from Columbia and other local institutions, and its city council recently passed a law requiring municipal agencies to develop a comprehensive long-term plan to protect all neighborhoods against climate threats. The legislation encourages the use of natural measures, like wetland restoration and expansion, to defend against rising sea levels. “There’s a growing emphasis on attending to issues of racial justice as the city develops its adaptation strategies,” says Horton. “In part, that means identifying communities that are most vulnerable to climate impacts because of where they’re located or because they lack resources. We want to make sure that everybody is a part of the resilience conversation and has input about what their neighborhoods need.”
Horton is also conducting basic research that he hopes will inform the development of more geographically targeted climate models. For example, in a series of recent papers on the atmospheric and geographic factors that influence heat waves, he and his team discovered that warm regions located near large bodies of water have become susceptible to heat waves of surprising intensity, accompanied by dangerous humidity. His team has previously shown that in some notoriously hot parts of the world — like northern India, Bangladesh, and the Persian Gulf — the cumulative physiological impact of heat and humidity can approach the upper limits of human tolerance. “We’re talking about conditions in which a perfectly healthy person could actually die of the heat, simply by being outside for several hours, even if they’re resting and drinking plenty of water,” says Horton, explaining that when it is extremely humid, the body loses its ability to sufficiently perspire, which is how it cools itself. Now his team suspects that similarly perilous conditions could in the foreseeable future affect people who live near the Mediterranean, the Black Sea, the Caspian Sea, or even the Great Lakes. “Conditions in these places probably won’t be quite as dangerous as what we’re seeing now in South Asia or the Middle East, but people who are old, sick, or working outside will certainly be at far greater risk than they are today,” Horton says. “And communities will be unprepared, which increases the danger.”
How much worse could the weather get? Over the long term, that will depend on us and how decisively we act to reduce our fossil-fuel emissions. But conditions are likely to continue to deteriorate over the next two to three decades no matter what we do, since the greenhouse gases that we have already added to the atmosphere will take years to dissipate. And the latest IPCC report states that every additional increment of warming will have a larger, more destabilizing impact. Of particular concern, the report cautions, is that in the coming years we are bound to experience many more “compound events,” such as when heat waves and droughts combine to fuel forest fires, or when coastal communities get hit by tropical storms and flooding rivers simultaneously.
“A lot of the extreme weather events that we’ve been experiencing lately are so different from anything we’ve seen that nobody saw them coming,” says Horton, who points out that climate models, which remain our best tool for projecting future climate risks, must constantly be updated with new data as real-world conditions change. “What’s happening now is that the conditions are evolving so rapidly that we’re having to work faster, with larger and more detailed data sets, to keep pace.”
3. The world’s food supply is under threat
“A warmer world could also be a hungry one, even in the rich countries,” writes the Columbia environmental economist Geoffrey Heal in his latest book, Endangered Economies: How the Neglect of Nature Threatens Our Prosperity. “A small temperature rise and a small increase in CO2 concentrations may be good for crops, but beyond a point that we will reach quickly, the productivity of our present crops will drop, possibly sharply.”
Indeed, a number of studies, including several by Columbia scientists, have found that staple crops like corn, rice, wheat, and soybeans are becoming more difficult to cultivate as the planet warms. Wolfram Schlenker, a Columbia economist who studies the impact of climate change on agriculture, has found that corn and soybean plants exposed to temperatures of 90°F or higher for just a few consecutive days will generate much less yield. Consequently, he has estimated that US output of corn and soybeans could decline by 30 to 80 percent this century, depending on how high average temperatures climb.
“This will reduce food availability and push up prices worldwide, since the US is the largest producer and exporter of these commodities,” Schlenker says.
There is also evidence that climate change is reducing the nutritional value of our food. Lewis Ziska, a Columbia professor of environmental health sciences and an expert on plant physiology, has found that as CO2 levels rise, rice plants are producing grains that contain less protein and fewer vitamins and minerals. “Plant biology is all about balance, and when crops suddenly have access to more CO2 but the same amount of soil nutrients, their chemical composition changes,” he says. “The plants look the same, and they may even grow a little bit faster, but they’re not as good for you. They’re carbon-rich and nutrient-poor.” Ziska says that the molecular changes in rice that he has observed are fairly subtle, but he expects that as CO2 levels continue to rise over the next two to three decades, the changes will become more pronounced and have a significant impact on human health. “Wheat, barley, potatoes, and carrots are also losing some of their nutritional value,” he says. “This is going to affect everybody — but especially people in developing countries who depend on grains like wheat and rice for most of their calories.”
Experts also worry that droughts, heat waves, and floods driven by climate change could destroy harvests across entire regions, causing widespread food shortages. A major UN report coauthored by Columbia climate scientist Cynthia Rosenzweig in 2019 described the growing threat of climate-induced hunger, identifying Africa, South America, and Asia as the areas of greatest susceptibility, in part because global warming is accelerating desertification there. Already, some eight hundred million people around the world are chronically undernourished, and that number could grow by 20 percent as a result of climate change in the coming decades, the report found.
In hopes of reversing this trend, Columbia scientists are now spearheading ambitious efforts to improve the food security of some of the world’s most vulnerable populations. For example, at the International Research Institute for Climate and Society (IRI), which is part of the Earth Institute, multidisciplinary teams of climatologists and social scientists are working in Ethiopia, Senegal, Colombia, Guatemala, Bangladesh, and Vietnam to minimize the types of crop losses that often occur when climate change brings more sporadic rainfall. The IRI experts, whose work is supported by Columbia World Projects, are training local meteorologists, agricultural officials, and farmers to use short-term climate-prediction systems to anticipate when an upcoming season’s growing conditions necessitate using drought-resistant or flood-resistant seeds. They can also suggest more favorable planting schedules. To date, they have helped boost crop yields in dozens of small agricultural communities.
“This is a versatile approach that we’re modeling in six nations, with the hope of rolling it out to many others,” says IRI director John Furlow. “Agriculture still dominates the economies of most developing countries, and in order to succeed despite increasingly erratic weather, farmers need to be able to integrate science into their decision-making.”
4. We need to prepare for massive waves of human migration
For thousands of years,the vast majority of the human population has lived in a surprisingly narrow environmental niche, on lands that are fairly close to the equator and offer warm temperatures, ample fresh water, and fertile soils.
But now, suddenly, the environment is changing. The sun’s rays burn hotter, and rainfall is erratic. Some areas are threatened by rising sea levels, and in others the land is turning to dust, forests to kindling. What will people do in the coming years? Will they tough it out and try to adapt, or will they migrate in search of more hospitable territory?
Alex de Sherbinin, a Columbia geographer, is among the first scientists attempting to answer this question empirically. In a series of groundbreaking studies conducted with colleagues at the World Bank, the Potsdam Institute for Climate Impact Research, New York University, Baruch College, and other institutions, he has concluded that enormous waves of human migration will likely occur this century unless governments act quickly to shift their economies away from fossil fuels and thereby slow the pace of global warming. His team’s latest report, published this fall and based on a comprehensive analysis of climatic, demographic, agricultural, and water-use data, predicts that up to 215 million people from Asia, Eastern Europe, Africa, and Latin America — mostly members of agricultural communities, but also some city dwellers on shorelines — will permanently abandon their homes as a result of droughts, crop failures, and sea-level rise by 2050.
“And that’s a conservative estimate,” says de Sherbinin, a senior research scientist at Columbia’s Center for International Earth Science Information Network. “We’re only looking at migration that will occur as the result of the gradual environmental changes occurring where people live, not massive one-time relocations that might be prompted by natural disasters like typhoons or wildfires.”
De Sherbinin and his colleagues do not predict how many climate migrants will ultimately cross international borders in search of greener pastures. Their work to date has focused on anticipating population movements within resource-poor countries in order to help governments develop strategies for preventing exoduses of their own citizens, such as by providing struggling farmers with irrigation systems or crop insurance. They also identify cities that are likely to receive large numbers of new residents from the surrounding countryside, so that local governments can prepare to accommodate them. Among the regions that will see large-scale population movements, the researchers predict, is East Africa, where millions of smallholder farmers will abandon drought-stricken lands and flock to cities like Kampala, Nairobi, and Lilongwe. Similarly, agricultural communities across Latin America, devastated by plummeting corn, bean, and coffee yields, will leave their fields and depart for urban centers. And in Southeast Asia, rice farmers and fishing families in increasingly flood-prone coastal zones like Vietnam’s Mekong Delta, home to twenty-one million people, will retreat inland.
But these migrations, if they do occur, do not necessarily need to be tragic or chaotic affairs, according to de Sherbinin. In fact, he says that with proper planning, and with input from those who are considering moving, it is even possible that large-scale relocations could be organized in ways that ultimately benefit everybody involved, offering families of subsistence farmers who would otherwise face climate-induced food shortages a new start in more fertile locations or in municipalities that offer more education, job training, health care, and other public services.
“Of course, wealthy nations should be doing more to stop climate change and to help people in developing countries adapt to environmental changes, so they have a better chance of thriving where they are,” he says. “But the international community also needs to help poorer countries prepare for these migrations. If and when large numbers of people do find that their lands are no longer habitable, there should be systems in place to help them relocate in ways that work for them, so that they’re not spontaneously fleeing droughts or floods as refugees but are choosing to safely move somewhere they want to go, to a place that’s ready to receive them.”
5. Rising temperatures are already making people sick
One of the deadliest results of climate change, and also one of the most insidious and overlooked, experts say, is the public-health threat posed by rising temperatures and extreme heat.
“Hot weather can trigger changes in the body that have both acute and chronic health consequences,” says Cecilia Sorensen, a Columbia emergency-room physician and public-health researcher. “It actually alters your blood chemistry in ways that make it prone to clotting, which can lead to heart attacks or strokes, and it promotes inflammation, which can contribute to a host of other problems.”
Exposure to severe heat, Sorensen says, has been shown to exacerbate cardiovascular disease, asthma, chronic obstructive pulmonary disease, arthritis, migraines, depression, and anxiety, among other conditions. “So if you live in a hot climate and lack access to air conditioning, or work outdoors, you’re more likely to get sick.”
By destabilizing the natural environment and our relationship to it, climate change is endangering human health in numerous ways. Researchers at Columbia’s Mailman School of Public Health, which launched its innovative Climate and Health Program in 2010, have shown that rising temperatures are making air pollution worse, in part because smog forms faster in warmer weather and because wildfires are spewing enormous amounts of particulate matter into the atmosphere. Global warming is also contributing to food and drinking-water shortages, especially in developing countries. And it is expected to fuel transmission of dengue fever, Lyme disease, West Nile virus, and other diseases by expanding the ranges of mosquitoes and ticks. But experts say that exposure to extreme heat is one of the least understood and fastest growing threats.
“Health-care professionals often fail to notice when heat stress is behind a patient’s chief complaint,” says Sorensen, who directs the Mailman School’s Global Consortium on Climate and Health Education, an initiative launched in 2017 to encourage other schools of public health and medicine to train practitioners to recognize when environmental factors are driving patients’ health problems. “If I’m seeing someone in the ER with neurological symptoms in the middle of a heat wave, for example, I need to quickly figure out whether they’re having a cerebral stroke or a heat stroke, which itself can be fatal if you don’t cool the body down quickly. And then I need to check to see if they’re taking any medications that can cause dehydration or interfere with the body’s ability to cool itself. But these steps aren’t always taken.”
Sorensen says there is evidence to suggest that climate change, in addition to aggravating existing medical conditions, is causing new types of heat-related illnesses to emerge. She points out that tens of thousands of agricultural workers in Central America have died of an enigmatic new kidney ailment that has been dubbed Mesoamerican nephropathy or chronic kidney disease of unknown origin (CKDu), which appears to be the result of persistent heat-induced inflammation. Since CKDu was first observed among sugarcane workers in El Salvador in the 1990s, Sorensen says, it has become endemic in those parts of Central America where heat waves have grown the most ferocious.
“It’s also been spotted among rice farmers in Sri Lanka and laborers in India and Egypt,” says Sorensen, who is collaborating with physicians in Guatemala to develop an occupational-health surveillance system to spot workers who are at risk of developing CKDu. “In total, we think that at least fifty thousand people have died of this condition worldwide.”
Heat waves are now also killing hundreds of Americans each year. Particularly at risk, experts say, are people who live in dense urban neighborhoods that lack trees, open space, reflective rooftops, and other infrastructure that can help dissipate the heat absorbed by asphalt, concrete, and brick. Research has shown that temperatures in such areas can get up to 15°F hotter than in surrounding neighborhoods on summer days. The fact that these so-called “urban heat islands” are inhabited largely by Black and Latino people is now seen as a glaring racial inequity that should be redressed by investing in public-infrastructure projects that would make the neighborhoods cooler and safer.
“It isn’t a coincidence that racially segregated neighborhoods in US cities are much hotter, on average, than adjacent neighborhoods,” says Joan Casey, a Columbia epidemiologist who studies how our natural and built environments influence human health. In fact, in one recent study, Casey and several colleagues showed that urban neighborhoods that lack green space are by and large the same as those that in the 1930s and 1940s were subject to the racist practice known as redlining, in which banks and municipalities designated minority neighborhoods as off-limits for private lending and public investment. “There’s a clear link between that history of institutionalized racism and the subpar public infrastructure we see in these neighborhoods today,” she says.
Extreme heat is hardly the only environmental health hazard faced by residents of historically segregated neighborhoods. Research by Columbia scientists and others has shown that people in these areas are often exposed to dirty air, partly as a result of the large numbers of trucks and buses routed through their streets, and to toxins emanating from industrial sites. But skyrocketing temperatures are exacerbating all of these other health risks, according to Sorensen.
“A big push now among climate scientists and public-health researchers is to gather more street-by-street climate data in major cities so that we know exactly where people are at the greatest risk of heat stress and can more effectively advocate for major infrastructure upgrades in those places,” she says. “In the meantime, there are relatively small things that cities can do now to save lives in the summer — like providing people free air conditioners, opening community cooling centers, and installing more water fountains.”
6. We’re curbing emissions but need to act faster
Since the beginning ofthe industrial revolution, humans have caused the planet to warm 1.1°C (or about 2°F), mainly by burning coal, oil, and gas for energy. Current policies put the world on pace to increase global temperatures by about 2.6°C over pre-industrial levels by the end of the century. But to avoid the most catastrophic consequences of climate change, we must try to limit the warming to 1.5°C, scientists say. This will require that we retool our energy systems, dramatically expanding the use of renewable resources and eliminating nearly all greenhouse-gas emissions by mid-century.
“We’ll have to build the equivalent of the world’s largest solar park every day for the next thirty years to get to net zero by 2050,” says Jason Bordoff, co-dean of the Columbia Climate School. A leading energy-policy expert, Bordoff served on the National Security Council of President Barack Obama ’83CC. “We’ll also have to ramp up global investments in clean energy R&D from about $2 trillion to $5 trillion per year,” he adds, citing research from the International Energy Agency. “The challenge is enormous.”
Over the past few years, momentum for a clean-energy transition has been accelerating. In the early 2000s, global emissions were increasing 3 percent each year. Now they are rising just 1 percent annually, on average, with some projections indicating that they will peak in the mid-2020s and then start to decline. This is the result of a variety of policies that countries have taken to wean themselves off fossil fuels. European nations, for example, have set strict limits on industrial emissions. South Africa, Chile, New Zealand, and Canada have taken significant steps to phase out coal-fired power plants. And the US and China have enacted fuel-efficiency standards and invested in the development of renewable solar, wind, and geothermal energy — which, along with hydropower, account for nearly 30 percent of all electricity production in the world.
“It’s remarkable how efficient renewables have become over the past decade,” says Bordoff, noting that the costs of solar and wind power have dropped by roughly 90 percent and 70 percent, respectively, in that time. “They’re now competing quite favorably against fossil fuels in many places, even without government subsidies.”
But in the race to create a carbon-neutral global economy, Bordoff says, the biggest hurdles are ahead of us. He points out that we currently have no affordable ways to decarbonize industries like shipping, trucking, air travel, and cement and steel production, which require immense amounts of energy that renewables cannot yet provide. “About half of all the emission reductions that we’ll need to achieve between now and 2050 must come from technologies that aren’t yet available at commercial scale,” says Bordoff.
In order to fulfill the potential of solar and wind energy, we must also improve the capacity of electrical grids to store power. “We need new types of batteries capable of storing energy for longer durations, so that it’s available even on days when it isn’t sunny or windy,” he says.
Perhaps the biggest challenge, Bordoff says, will be scaling up renewable technologies quickly enough to meet the growing demand for electricity in developing nations, which may otherwise choose to build more coal- and gas-fueled power plants. “There are large numbers of people around the world today who have almost no access to electricity, and who in the coming years are going to want to enjoy some of the basic conveniences that we often take for granted, like refrigeration, Internet access, and air conditioning,” he says. “Finding sustainable ways to meet their energy needs is a matter of equity and justice.”
Bordoff, who is co-leading the new Climate School alongside geochemist Alex Halliday, environmental geographer Ruth DeFries, and marine geologist Maureen Raymo ’89GSAS, is also the founding director of SIPA’s Center on Global Energy Policy, which supports research aimed at identifying evidence-based, actionable solutions to the world’s energy needs. With more than fifty affiliate scholars, the center has, since its creation in 2013, established itself as an intellectual powerhouse in the field of energy policy, publishing a steady stream of definitive reports on topics such as the future of coal; the potential for newer, safer forms of nuclear energy to help combat climate change; and the geopolitical ramifications of the shift away from fossil fuels. One of the center’s more influential publications, Energizing America, from 2020, provides a detailed roadmap for how the US can assert itself as an international leader in clean-energy systems by injecting more federal money into the development of technologies that could help decarbonize industries like construction, transportation, agriculture, and manufacturing. President Joe Biden’s $1 trillion Infrastructure Investment and Jobs Act, signed into law in November, incorporates many of the report’s recommendations, earmarking tens of billions of dollars for scientific research in these areas.
“When we sat down to work on that project, my colleagues and I asked ourselves: If an incoming administration wanted to go really big on climate, what would it do? How much money would you need, and where exactly would you put it?” Bordoff says. “I think that’s one of our successes.”
Which isn’t to say that Bordoff considers the climate initiatives currently being pursued by the Biden administration to be sufficient to combat global warming. The vast majority of the climate-mitigation measures contained in the administration’s first two major legislative packages — the infrastructure plan and the more ambitious Build Back Better social-spending bill, which was still being debated in Congress when this magazine went to press — are designed to reward businesses and consumers for making more sustainable choices, like switching to renewable energy sources and purchasing electric vehicles. A truly transformative climate initiative, Bordoff says, would also discourage excessive use of fossil fuels. “Ideally, you’d want to put a price on emissions, such as with a carbon tax or a gasoline tax, so that the biggest emitters are forced to internalize the social costs they’re imposing on everyone else,” he says.
Bordoff is a pragmatist, though, and ever mindful of the fact that public policy is only as durable as it is popular. “I think the American people are more divided on this than we sometimes appreciate,” he says. “Support for climate action is growing in the US, but we have to be cognizant of how policy affects everyday people. There would be concern, maybe even outrage, if electric or gas bills suddenly increased. And that would make it much, much harder to gain and keep support during this transition.”
Today, researchers from across the entire University are working together to pursue a multitude of strategies that may help alleviate the climate crisis. Some are developing nanomaterials for use in ultra-efficient solar cells. Others are inventing methods to suck CO2 out of the air and pump it underground, where it will eventually turn into chalk. Bordoff gets particularly excited when describing the work of engineers at the Columbia Electrochemical Energy Center who are designing powerful new batteries to store solar and wind power. “This is a team of more than a dozen people who are the top battery experts in the world,” he says. “Not only are they developing technologies to create long-duration batteries, but they’re looking for ways to produce them without having to rely on critical minerals like cobalt and lithium, which are in short supply.”
In his own work, Bordoff has recently been exploring the geopolitical ramifications of the energy transition, with an eye toward helping policymakers navigate the shifting international power dynamics that are likely to occur as attention tilts away from fossil fuels in favor of other natural resources.
But he believes the best ideas will come from the next generation of young people, who, like the students in the Climate School’s inaugural class this year, are demanding a better future. “When I see the growing sense of urgency around the world, especially among the younger demographics, it gives me hope,” he says. “The pressure for change is building. Our climate policies don’t go far enough yet, so something is eventually going to have to give — and I don’t think it’s going to be the will and determination of the young people. Sooner or later, they’re going to help push through the more stringent policies that we need. The question is whether it will be in time.”
A tradition held since 1887 in a small Pennsylvania town
On the morning of this Wednesday (February 2), Phil the groundhog saw his own shadow and returned to his burrow. According to the American Groundhog Day tradition, the animal's move means that cold weather will continue for another six weeks in the United States.
Had Phil not seen his own shadow, it would have meant that spring warmth was on the way.
The groundhog's forecast is a tradition dating back to 1887, held every year on February 2 in the small town of Punxsutawney, Pennsylvania. After a virtual edition in 2021 because of the pandemic, this year the event drew thousands of people.
The rodent, which is replaced and renamed each time a reigning animal dies, has been right 50% of the time over the past ten years. That is the same hit rate as a random prediction, according to NOAA's National Centers for Environmental Information.
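The article's benchmark is easy to verify: a predictor that guesses by coin flip, against outcomes that are themselves 50/50, converges to a 50% hit rate. The simulation below is a minimal sketch of that baseline; the 50/50 outcome distribution is a simplifying assumption for illustration, not a claim about actual Pennsylvania winters.

```python
import random

def coin_flip_hit_rate(n_predictions: int, trials: int = 100_000, seed: int = 0) -> float:
    """Average hit rate of a forecaster that picks 'long winter' or
    'early spring' at random, against outcomes assumed to be 50/50."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        for _ in range(n_predictions):
            # A random guess matches a random binary outcome half the time.
            hits += rng.random() < 0.5
    return hits / (trials * n_predictions)

# Ten predictions, as in the NOAA ten-year comparison.
rate = coin_flip_hit_rate(10)
```

Any forecaster whose long-run accuracy sits at this baseline carries no predictive information.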
Groundhog Day was depicted in the 1993 comedy "Feitiço do Tempo" ("Groundhog Day"), in which a TV reporter, played by Bill Murray, gets "stuck" on this date and is forced to relive it countless times in a row. "Groundhog Day" has since become a way of referring to the feeling that the days repeat themselves, a common sensation during the pandemic.
Researchers are exploring whether building massive berms or unfurling underwater curtains could hold back the warm waters degrading ice sheets.
January 14, 2022
In December, researchers reported that huge and growing cracks have formed in the eastern ice shelf of the Thwaites Glacier, a Florida-size mass of ice that stretches 75 miles across western Antarctica.
They warned that the floating tongue of the glacier—which acts as a brace to prop up the Thwaites—could snap off into the ocean in as little as five years. That could trigger a chain reaction as more and more towering cliffs of ice are exposed and then fracture and collapse.
A complete loss of the so-called doomsday glacier could raise ocean levels by two feet—or as much as 10 feet if the collapse drags down surrounding glaciers with it, according to scientists with the International Thwaites Glacier Collaboration. Either way, it would flood coastal cities around the world, threatening tens of millions of people.
All of which raises an urgent question: Is there anything we could do to stop it?
Even if the world immediately halted the greenhouse-gas emissions driving climate change and warming the waters beneath the ice shelf, that wouldn’t do anything to thicken and restabilize the Thwaites’s critical buttress, says John Moore, a glaciologist and professor at the Arctic Centre at the University of Lapland in Finland.
“So the only way of preventing the collapse … is to physically stabilize the ice sheets,” he says.
That will require what is variously described as active conservation, radical adaptation, or glacier geoengineering.
Moore and others have laid out several ways that people could intervene to preserve key glaciers. Some of the schemes involve building artificial braces through polar megaprojects, or installing other structures that would nudge nature to restore existing ones. The basic idea is that a handful of engineering efforts at the source of the problem could significantly reduce the property damage and flooding dangers that basically every coastal city and low-lying island nation will face, as well as the costs of the adaptation projects required to minimize them.
If it works, it could potentially preserve crucial ice sheets for a few more centuries, buying time to cut emissions and stabilize the climate, the researchers say.
But there would be massive logistical, engineering, legal, and financial challenges. And it’s not yet clear how effective the interventions would be, or whether they could be done before some of the largest glaciers are lost.
Redirecting warming waters
In articles and papers published in 2018, Moore, Michael Wolovick of Princeton, and others laid out the possibility of preserving critical glaciers, including the Thwaites, through massive earth-moving projects. These would involve shipping in or dredging up large amounts of material to build up berms or artificial islands around or beneath key glaciers. The structures would support glaciers and ice shelves, block the warm, dense water layers at the bottom of the ocean that are melting them from below, or both.
More recently, they and researchers affiliated with the University of British Columbia have explored a more technical concept: constructing what they’ve dubbed “seabed anchored curtains.” These would be buoyant flexible sheets, made from geotextile material, that could hold back and redirect warm water.
The hope is that this proposal would be cheaper than the earlier ones, and that these curtains would stand up to iceberg collisions and could be removed if there were negative side effects. The researchers have modeled the use of these structures around three glaciers in Greenland, as well as the Thwaites and nearby Pine Island glaciers.
If the curtains redirected enough warm water, the eastern ice shelf of the Thwaites could begin to thicken again and firmly reattach itself to the underwater formations that have supported it for millennia, Moore says.
“The idea is to return the system to its state around the early 20th century, when we know that warm water could not access the ice shelf as much as today,” he wrote in an email.
They’ve explored the costs and effects of strategically placing these structures in key channels where most of the warm water flows in, and of establishing a wider curtain farther out in the bay. The latter approach would cost on the order of $50 billion. That’s a big number, but it’s not even half what one proposed seawall around New York City would cost.
Researchers have floated other potential approaches as well, including placing reflective or insulating material over portions of glaciers; building fencing to retain snow that would otherwise blow into the ocean; and applying various techniques to dry up the bed beneath glaciers, eliminating water that acts as lubricant and thus slowing the glaciers’ movement.
Will it work?
Some scientists have criticized these ideas. Seven researchers submitted a response in Nature to Moore’s 2018 proposals, arguing that the concepts would be partial solutions at best, could in some cases inadvertently accelerate ice loss, and could pull attention and resources from efforts to eliminate the root of the problem: greenhouse-gas emissions.
The lead author, Twila Moon, a scientist at the National Snow and Ice Data Center at the University of Colorado, Boulder, says the efforts would be akin to plugging a couple of holes in a garden hose riddled with them.
And that’s if they worked at all. She argues that the field doesn’t understand ice dynamics and other relevant factors well enough to be confident that these things will work, and the logistical challenges strike her as extreme given the difficulty of getting a single research vessel to Antarctica.
“Addressing the source of the problem means turning off that hose, and that is something that we understand,” she says. “We understand climate change; we understand the sources, and we understand how to reduce emissions.”
There would also be significant governance and legal obstacles, as Charles Corbett and Edward Parson, legal scholars at University of California, Los Angeles, School of Law, noted in a forthcoming essay in Ecology Law Quarterly.
Notably, Antarctica is governed by a consortium of nations under the Antarctic Treaty System, and any one of the 29 voting members could veto such proposals. In addition, the Madrid Protocol strictly limits certain activities on and around Antarctica, including projects that would have major physical or environmental impacts.
Corbett and Parson stress that the obstacles aren’t insurmountable and that the issue could inspire needed updates to how these regions are governed amid the rising threat of climate change. But they also note: “It all raises the question of whether a country or coalition could drive the project forward with sufficient determination.”
Moore and others have noted in earlier work that a “handful of ice streams and large glaciers” are expected to produce nearly all the sea-level rise over the next few centuries, so a few successful interventions could have a significant impact.
But Moore readily acknowledges that such efforts will face vast challenges. Much more work needs to be done to closely evaluate how the flow of warm water will be affected, how well the curtains will hold up over time, what sorts of environmental side effects could occur, and how the public will respond. And installing the curtains under the frigid, turbulent conditions near Antarctica would likely require high-powered icebreakers and the sorts of submersible equipment used for deep-sea oil and gas platforms.
As a next step, Moore hopes to begin conversations with communities in Greenland to seek their input on such ideas well ahead of any field research proposals. But the basic idea would be to start with small-scale tests in regions where it will be relatively easy to work, like Greenland or Alaska. The hope is the lessons and experience gained there would make it possible to move on to harder projects in harsher areas.
The Thwaites would be at the top rung of this “ladder of difficulty.” And the researchers have been operating on the assumption that it could take three decades to build the public support, raise the needed financing, sort out the governance challenges, and build up the skills necessary to undertake such a project there.
There’s a clear problem with that timeline, however: the latest research suggests that the critical eastern buttress may not even be there by the end of this decade.
Source: Potsdam Institute for Climate Impact Research (PIK)
Summary: Economic growth goes down when the number of wet days and days with extreme rainfall go up, a team of scientists finds. The data analysis of more than 1,500 regions over the past 40 years shows a clear connection and suggests that intensified daily rainfall, driven by climate change from burning oil and coal, will harm the global economy.
Economic growth goes down when the number of wet days and days with extreme rainfall go up, a team of Potsdam scientists finds. Rich countries are most severely affected, and within them the manufacturing and service sectors, according to their study, now published as the cover story of the journal Nature.
“This is about prosperity, and ultimately about people’s jobs. Economies across the world are slowed down by more wet days and extreme daily rainfall — an important insight that adds to our growing understanding of the true costs of climate change,” says Leonie Wenz from the Potsdam Institute for Climate Impact Research (PIK) and the Mercator Research Institute on Global Commons and Climate Change (MCC) who led the study.
“Macro-economic assessments of climate impacts have so far focused mostly on temperature and considered — if at all — changes in rainfall only across longer time scales such as years or months, thus missing the complete picture,” explains Wenz. “While more annual rainfall is generally good for economies, especially agriculturally dependent ones, the question is also how the rain is distributed across the days of the year. Intensified daily rainfall turns out to be bad, especially for wealthy, industrialized countries like the US, Japan, or Germany.”
A first-of-its-kind global analysis of subnational rainfall effects
“We identify a number of distinct effects on economic production, yet the most important one really is from extreme daily rainfall,” says Maximilian Kotz, first author of the study and also at the Potsdam Institute. “This is because rainfall extremes are where we can already see the influence of climate change most clearly, and because they are intensifying almost everywhere across the world.”
The analysis statistically evaluates data on subnational economic output for 1,554 regions worldwide over the period 1979–2019, collected and made publicly available by MCC and PIK. The scientists combined these with high-resolution rainfall data. Ever-increasing detail in both climatic and economic data is particularly important in the context of rain, a highly local phenomenon, and is what revealed the new insights.
“It’s the daily rainfall that poses the threat”
By loading the Earth’s atmosphere with greenhouse gases from fossil power plants and cars, humanity is heating the planet. Warming air can hold more water vapour that eventually becomes rain. Although atmospheric dynamics make regional changes in annual averages more complicated, daily rainfall extremes are increasing globally due to this water vapour effect.
“Our study reveals that it’s precisely the fingerprint of global warming in daily rainfall which has hefty economic effects that have not yet been accounted for but are highly relevant,” says co-author Anders Levermann, Head of the Potsdam Institute’s Complexity Science domain, professor at Potsdam University and researcher at Columbia University’s Lamont Doherty Earth Observatory, New York. “Taking a closer look at short time scales instead of annual averages helps to understand what is going on: it’s the daily rainfall which poses the threat. It’s the climate shocks from weather extremes, rather than the gradual changes, that threaten our way of life. By destabilizing our climate we harm our economies. We have to make sure that our burning of fossil fuels does not destabilize our societies, too.”
Maximilian Kotz, Anders Levermann, Leonie Wenz. The effect of rainfall changes on economic production. Nature, 2022; 601 (7892): 223 DOI: 10.1038/s41586-021-04283-8
Cold era, lasting from early 15th to mid-19th centuries, triggered by unusually warm conditions
Date: December 15, 2021
Source: University of Massachusetts Amherst
Summary: New research provides a novel answer to one of the persistent questions in historical climatology, environmental history and the earth sciences: what caused the Little Ice Age? The answer, we now know, is a paradox: warming.
New research from the University of Massachusetts Amherst provides a novel answer to one of the persistent questions in historical climatology, environmental history and the earth sciences: what caused the Little Ice Age? The answer, we now know, is a paradox: warming.
The Little Ice Age was one of the coldest periods of the past 10,000 years, a period of cooling that was particularly pronounced in the North Atlantic region. This cold spell, whose precise timeline scholars debate, but which seems to have set in around 600 years ago, was responsible for crop failures, famines and pandemics throughout Europe, resulting in misery and death for millions. To date, the mechanisms that led to this harsh climate state have remained inconclusive. However, a new paper published recently in Science Advances gives an up-to-date picture of the events that brought about the Little Ice Age. Surprisingly, the cooling appears to have been triggered by an unusually warm episode.
When lead author Francois Lapointe, postdoctoral researcher and lecturer in geosciences at UMass Amherst, and Raymond Bradley, distinguished professor in geosciences at UMass Amherst, began carefully examining their 3,000-year reconstruction of North Atlantic sea surface temperatures, results of which were published in the Proceedings of the National Academy of Sciences in 2020, they noticed something surprising: a sudden change from very warm conditions in the late 1300s to unprecedented cold conditions in the early 1400s, only 20 years later.
Using many detailed marine records, Lapointe and Bradley discovered that there was an abnormally strong northward transfer of warm water in the late 1300s which peaked around 1380. As a result, the waters south of Greenland and the Nordic Seas became much warmer than usual. “No one has recognized this before,” notes Lapointe.
Normally, there is always a transfer of warm water from the tropics to the Arctic. It’s a well-known process called the Atlantic Meridional Overturning Circulation (AMOC), which is like a planetary conveyor belt. Typically, warm water from the tropics flows north along the coast of Northern Europe, and when it reaches higher latitudes and meets colder Arctic waters, it loses heat and becomes denser, causing the water to sink to the bottom of the ocean. This deep water then flows south along the coast of North America and continues on to circulate around the world.
But in the late 1300s, AMOC strengthened significantly, which meant that far more warm water than usual was moving north, which in turn caused rapid Arctic ice loss. Over the course of a few decades in the late 1300s and 1400s, vast amounts of ice were flushed out into the North Atlantic, which not only cooled the North Atlantic waters but also diluted their saltiness, ultimately causing AMOC to collapse. It is this collapse that then triggered a substantial cooling.
Fast-forward to our own time: between the 1960s and 1980s, we have also seen a rapid strengthening of AMOC, which has been linked with persistently high pressure in the atmosphere over Greenland. Lapointe and Bradley think the same atmospheric situation occurred just prior to the Little Ice Age — but what could have set off that persistent high-pressure event in the 1380s?
The answer, Lapointe discovered, is to be found in trees. Once the researchers compared their findings to a new record of solar activity revealed by radiocarbon isotopes preserved in tree rings, they discovered that unusually high solar activity was recorded in the late 1300s. Such solar activity tends to lead to high atmospheric pressure over Greenland.
At the same time, fewer volcanic eruptions were happening on earth, which means that there was less ash in the air. A “cleaner” atmosphere meant that the planet was more responsive to changes in solar output. “Hence the effect of high solar activity on the atmospheric circulation in the North-Atlantic was particularly strong,” said Lapointe.
Lapointe and Bradley have been wondering whether such an abrupt cooling event could happen again in our age of global climate change. They note that there is now much less Arctic sea ice due to global warming, so an event like that in the early 1400s, involving sea ice transport, is unlikely. “However, we do have to keep an eye on the build-up of freshwater in the Beaufort Sea (north of Alaska) which has increased by 40% in the past two decades. Its export to the subpolar North Atlantic could have a strong impact on oceanic circulation,” said Lapointe. “Also, persistent periods of high pressure over Greenland in summer have been much more frequent over the past decade and are linked with record-breaking ice melt. Climate models do not capture these events reliably and so we may be underestimating future ice loss from the ice sheet, with more freshwater entering the North Atlantic, potentially leading to a weakening or collapse of the AMOC.” The authors conclude that there is an urgent need to address these uncertainties.
This research was supported by funding from the National Science Foundation.
Francois Lapointe, Raymond S. Bradley. Little Ice Age abruptly triggered by intrusion of Atlantic waters into the Nordic Seas. Science Advances, 2021; 7 (51) DOI: 10.1126/sciadv.abi8230
IS IT NEARLY over? In 2021 people have been yearning for something like stability. Even those who accepted that they would never get their old lives back hoped for a new normal. Yet as 2022 draws near, it is time to face the world’s predictable unpredictability. The pattern for the rest of the 2020s is not the familiar routine of the pre-covid years, but the turmoil and bewilderment of the pandemic era. The new normal is already here.
Remember how the terrorist attacks of September 11th 2001 began to transform air travel in waves. In the years that followed each fresh plot exposed an unforeseen weakness that required a new rule. First came locked cockpit doors, more armed air marshals and bans on sharp objects. Later, suspicion fell on bottles of liquid, shoes and laptops. Flying did not return to normal, nor did it establish a new routine. Instead, everything was permanently up for revision.
The world is similarly unpredictable today and the pandemic is part of the reason. For almost two years people have lived with shifting regimes of mask-wearing, tests, lockdowns, travel bans, vaccination certificates and other paperwork. As outbreaks of new cases and variants ebb and flow, so these regimes can also be expected to come and go. That is the price of living with a disease that has not yet settled into its endemic state.
And covid-19 may not be the only such infection. Although a century elapsed between the ravages of Spanish flu and the coronavirus, the next planet-conquering pathogen could strike much sooner. Germs thrive in an age of global travel and crowded cities. The proximity of people and animals will lead to the incubation of new human diseases. Such zoonoses, which tend to emerge every few years, used to be a minority interest. For the next decade, at least, you can expect each new outbreak to trigger paroxysms of precaution.
Covid has also helped bring about today’s unpredictable world indirectly, by accelerating change that was incipient. The pandemic has shown how industries can be suddenly upended by technological shifts. Remote shopping, working from home and the Zoom boom were once the future. In the time of covid they rapidly became as much of a chore as picking up the groceries or the daily commute.
Big technological shifts are nothing new. But instead of taking centuries or decades to spread around the world, as did the printing press and telegraph, new technologies become routine in a matter of years. Just 15 years ago, modern smartphones did not exist. Today more than half of the people on the planet carry one. Any boss who thinks their industry is immune to such wild dynamism is unlikely to last long.
The pandemic may also have ended the era of low global inflation that began in the 1990s and was ingrained by economic weakness after the financial crisis of 2007-09. Having failed to achieve a quick recovery then, governments spent nearly $11trn trying to ensure that the harm caused by the virus was transient.
They broadly succeeded, but fiscal stimulus and bunged-up supply chains have raised global inflation above 5%. The apparent potency of deficit spending will change how recessions are fought. As they raise interest rates to deal with inflation, central banks may find themselves in conflict with indebted governments. Amid a burst of innovation around cryptocoins, central-bank digital currencies and fintech, many outcomes are possible. A return to the comfortable macroeconomic orthodoxies of the 1990s is one of the least likely.
The pandemic has also soured relations between the world’s two great powers. America blames China’s secretive Communist Party for failing to contain the virus that emerged from Wuhan at the end of 2019. Some claim that it came from a Chinese laboratory there—an idea China has allowed to fester through its self-defeating resistance to open investigations. For its part, China, which has recorded fewer than 6,000 deaths, no longer bothers to hide its disdain for America, with its huge death toll. In mid-December this officially passed 800,000 (The Economist estimates the full total to be almost 1m). The contempt China and America feel for each other will heighten tensions over Taiwan, the South China Sea, human rights in Xinjiang and the control of strategic technologies.
In the case of climate change, the pandemic has served as an emblem of interdependence. Despite the best efforts to contain them, virus particles cross frontiers almost as easily as molecules of methane and carbon dioxide. Scientists from around the world showed how vaccines and medicines can save hundreds of millions of lives. However, hesitancy and the failure to share doses frustrated their plans. Likewise, in a world that is grappling with global warming, countries that have everything to gain from working together continually fall short. Even under the most optimistic scenarios, the accumulation of long-lasting greenhouse gases in the atmosphere means that extreme and unprecedented weather of the kind seen during 2021 is here to stay.
The desire to return to a more stable, predictable world may help explain a 1990s revival. You can understand the appeal of going back to a decade in which superpower competition had abruptly ended, liberal democracy was triumphant, suits were oversized, work ended when people left the office, and the internet was not yet disrupting cosy, established industries or stoking the outrage machine that has supplanted public discourse.
Events, dear boy, events
That desire is too nostalgic. It is worth notching up some of the benefits that come with today’s predictable unpredictability. Many people like to work from home. Remote services can be cheaper and more accessible. The rapid dissemination of technology could bring unimagined advances in medicine and the mitigation of global warming.
Even so, beneath it lies the unsettling idea that once a system has crossed some threshold, every nudge tends to shift it further from the old equilibrium. Many of the institutions and attitudes that brought stability in the old world look ill-suited to the new. The pandemic is like a doorway. Once you pass through, there is no going back. ■
This article appeared in the Leaders section of the print edition under the headline “The new normal”
As alarm about climate change and calls for action intensify, solar geoengineering (SG) is seeing increased attention and controversy. Views on whether it should or will ever be used diverge, but the evidentiary basis for these views is thin. On such a high-stakes, knowledge-limited issue, one might expect strong support for research, but even research has met opposition. Opponents’ objections are grounded in valid concerns but impossible to fully address, as they are framed in ways that make rejecting research an axiom, not a conclusion based on evidence.
Supporters of SG research argue that it can inform future decisions and prepare for likely future calls for deployment. A US National Academies of Sciences, Engineering, and Medicine (NASEM) report earlier this year lent thoughtful support to this view. Opponents raise well-known concerns about SG such as its imperfect climate correction, its time-scale mismatch with greenhouse gases (GHGs), and the potential to over-rely on it or use it recklessly or unjustly. They oppose research based on the same concerns, arguing that usage can never be acceptable so research is superfluous; or that sociopolitical lock-in will drive research toward deployment even if unwarranted. Both support and opposition are often implicit, embedded in debates over additional governance of SG research beyond peer review, program management, and regulatory compliance.
At present, potential SG methods and claimed benefits and harms are hypothetical, not demonstrated. The strongest objections to research invoke potential consequences that are indirect, mediated by imprudent or unjust policy decisions. Because the paths from research to these bad outcomes involve political behavior, claims that these “could” happen cannot be fully refuted. Understanding and limiting these risks require the same research and governance-building activities that opponents reject as causing the risks.
To reject an activity based on harms that might follow is to apply extreme precaution. This can be warranted when there is risk of serious, unmitigable harm and the alternative is known to be acceptable. That is not the case here. Rejecting SG research means taking the alternative trajectory of uncertain but potentially severe climate impacts, reduced by whatever emissions cuts, GHG removals, and adaptation are achieved. But these other responses needed to meet prudent climate targets carry their own risks: of falling short and suffering more severe climate change, and of collateral environmental and socioeconomic harms from deployment at the required transformative, even revolutionary, scale.
Suppressing research on SG might reduce risks from its future use, but this is not assured: Rather than preventing use in some future crisis, blocking research might make such use less informed, cruder, and more dangerous. Even if these risks are reduced, this would shift increased risks onto climate change and crash pursuit of other responses. Total climate-related risk may well increase—and be more unjustly distributed, because the largest benefits of SG appear likely to flow to the most vulnerable people and communities.
Yet the concerns that motivate opposition to research are compelling. SG use would be an unprecedented step, affecting climate response, international governance, sustainability, and global justice. Major concerns—about reckless or rivalrous use, or over-reliance weakening emissions cuts—are essential to address, even if they cannot be avoided with certainty. A few directions show promise for doing so. Research should be in public programs, in jurisdictions with cultures of public benefit and research accountability. The NASEM call for a US federal program is sound. Other national programs should be established. Research governance should be somewhat stronger than for less controversial research, including scale limits on field experiments and periodic program reassessments. Exploration of governance needs for larger-scale interventions should begin well before these are considered. Research and governance should seek broad international cooperation—promptly, but not as a precondition to national programs. Broad citizen consultations are needed on overall climate response and the role of SG. These should link to national research and governance programs but not have veto power over specific activities.
Precaution is appropriate, even necessary. But precaution cannot selectively target risks from one climate response while ignoring its linkages to other responses and risks. Suppressing SG research is likely to make the harms and injustices that opponents fear more likely, not less.
What do the emergence of the internet, the attacks of September 11th 2001 and the 2008 economic crisis have in common?
They were extremely rare and surprising events that had a profound impact on history.
Events of this kind are usually called "black swans".
Some argue that the recent covid-19 pandemic can also be considered one, though not everyone agrees.
The "black swan theory" was developed by the Lebanese-American professor, writer and former trader Nassim Taleb in 2007.
It has three components, as Taleb himself explained in an article in the American newspaper The New York Times that same year:
– First, it is an outlier, lying outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility.
– Second, it carries an extreme impact.
– Third, despite its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it seem explainable and predictable.
Taleb's thesis is usually associated with economics, but it applies to any field.
And since the consequences tend to be catastrophic, it is important to accept that a "black swan" event is possible, and therefore necessary to have a plan for dealing with one.
In short, the "black swan" is a metaphor for something unpredictable and very strange, yet not impossible.
Why are they called that?
At the end of the 17th century, European ships embarked on the adventure of exploring Australia.
In 1697, while sailing the waters of an unknown river in the southwest of Western Australia, the Dutch captain Willem de Vlamingh spotted several black swans, possibly becoming the first European to observe them.
Vlamingh consequently named the river Zwaanenrivier (Swan River, in Dutch) because of the large number of black swans there.
It was an unexpected, novel event. Until that moment, science had only recorded white swans.
The first known reference to the term "black swan" in the sense of rarity comes from a line by the Roman poet Decimus Junius Juvenal (60-128).
Desperate to find a wife with all the "right qualities" of his day, he wrote in Latin that such a woman was rara avis in terris, nigroque simillima cygno ("a bird as rare upon the earth as a black swan"), as the Oxford dictionary notes.
That is because at the time, and for roughly 1,600 years afterwards, black swans did not exist as far as Europeans were concerned.
Predicting the 'black swans'
A group of scientists at Stanford University, in the United States, is working to predict the unpredictable.
That is, to anticipate "black swans": not the birds, but the strange events that happen in history.
Although their primary analysis was based on three different natural environments, the computational method they created can be applied to any field, including economics and politics.
"By analysing long-term data from three ecosystems, we were able to show that fluctuations occurring in different biological species are statistically the same across different ecosystems," said Samuel Bray, a research assistant in the laboratory of Bo Wang, professor of bioengineering at Stanford University.
"This suggests there are certain universal processes we can draw on to predict this kind of extreme behaviour," Bray added, as published on the university's website.
To develop the forecasting method, the researchers looked for biological systems that had experienced "black swan" events and examined the contexts in which they occurred.
They drew on ecosystems that had been closely monitored for many years.
The examples included an eight-year study of Baltic Sea plankton, with species levels measured twice a week; carbon measurements from a forest at Harvard University, in the US, collected every 30 minutes since 1991; and measurements of barnacles, algae and mussels on the coast of New Zealand, taken monthly for more than 20 years, details the study published in the scientific journal PLOS Computational Biology.
To these datasets the researchers applied the physical theory behind avalanches and earthquakes which, like "black swans", exhibit extreme, sudden, short-lived behaviour.
From this analysis, the researchers developed a method for predicting "black swan" events that is flexible across species and time spans, and that can also work with data that are far less detailed and messier.
They were subsequently able to forecast accurately extreme events that occurred in those systems.
Until now, "methods have relied on what we have seen to predict what might happen in the future, which is why they tend to miss 'black swan' events," says Wang.
But this new mechanism is different, according to the Stanford professor, "because it assumes we are seeing only part of the world".
"It extrapolates a little of what is missing, and that helps enormously with forecasting," he adds.
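The article does not spell out the study's actual statistical machinery. As a loose, purely illustrative sketch of the general idea of flagging fluctuations that dwarf a time series' typical scale, one might do something like the following (the synthetic data, the injected spike positions and the robust-threshold rule are all assumptions for illustration, not the Stanford method):

```python
import random
import statistics

random.seed(0)

# Synthetic "ecosystem" time series: ordinary noise plus three injected
# extreme spikes standing in for black-swan events.
series = [random.gauss(0.0, 1.0) for _ in range(5000)]
for i in (500, 1500, 3000):
    series[i] += 50.0

def extreme_events(x, k=7.0):
    """Flag points deviating from the series median by more than k robust
    scales (median absolute deviation): a crude stand-in for detecting
    extreme, avalanche-like fluctuations."""
    med = statistics.median(x)
    mad = statistics.median(abs(v - med) for v in x)
    return [i for i, v in enumerate(x) if abs(v - med) > k * mad]

events = extreme_events(series)
print(events)  # the injected spike positions stand out from the noise
```

Using the median and median absolute deviation, rather than the mean and standard deviation, keeps the "typical scale" estimate from being dragged upwards by the very outliers one is trying to detect.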
So could "black swans" be detected in other fields, such as finance or economics?
"We applied our method to stock-market fluctuations and it worked very well," Wang told BBC News Mundo, the BBC's Spanish-language news service, by e-mail.
The researchers analysed the Nasdaq, Dow Jones Industrial Average and S&P 500 indices.
"Although the market's main trend is long-term exponential growth, the fluctuations around that trend follow the same average trajectories and scales we saw in the ecological systems," he explains.
But "while the similarities between stock-market and ecological variations are interesting, our forecasting method is most useful in cases where data are scarce and fluctuations often go beyond the historical record (which is not the case for the stock market)," Wang cautions.
So we must stay alert to find out whether the next "black swan" will catch us by surprise... or perhaps not.
[Solar radiation management is listed first. Calling it “controversial” is bad journalism. It is extremely dangerous and there is not a lot of controversy about this aspect of the thing.]
Nov 8th 2021
The astonishingly rapid development and rollout of coronavirus vaccines has been a reminder of the power of science and technology to change the world. Although vaccines based on new mRNA technology seemed to have been created almost instantly, they actually drew upon decades of research going back to the 1970s. As the saying goes in the technology industry, it takes years to create an overnight success. So what else might be about to burst into prominence? Here are 22 emerging technologies worth watching in 2022.
It sounds childishly simple. If the world is getting too hot, why not offer it some shade? The dust and ash released into the upper atmosphere by volcanoes is known to have a cooling effect: Mount Pinatubo’s eruption in 1991 cooled the Earth by as much as 0.5°C for four years. Solar geoengineering, also known as solar radiation management, would do the same thing deliberately.
This is hugely controversial. Would it work? How would rainfall and weather patterns be affected? And wouldn’t it undermine efforts to curb greenhouse-gas emissions? Efforts to test the idea face fierce opposition from politicians and activists. In 2022, however, a group at Harvard University hopes to conduct a much-delayed experiment called SCoPEx. It involves launching a balloon into the stratosphere, with the aim of releasing 2kg of material (probably calcium carbonate), and then measuring how it dissipates, reacts and scatters solar energy.
Proponents argue that it is important to understand the technique, in case it is needed to buy the world more time to cut emissions. The Harvard group has established an independent advisory panel to consider the moral and political ramifications. Whether the test goes ahead or not, expect controversy.
Keeping buildings warm in winter accounts for about a quarter of global energy consumption. Most heating relies on burning coal, gas or oil. If the world is to meet its climate-change targets, that will have to change. The most promising alternative is to use heat pumps—essentially, refrigerators that run in reverse.
Instead of pumping heat out of a space to cool it down, a heat pump forces heat in from the outside, warming it up. Because they merely move existing heat around, they can be highly efficient: for every kilowatt of electricity consumed, heat pumps can deliver 3kW of heat, making them cheaper to run than electric radiators. And running a heat pump backwards cools a home rather than heating it.
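The efficiency claim above can be put in numbers. The ratio of heat delivered to electricity consumed is the coefficient of performance (COP); a resistive radiator is effectively COP = 1, while the article credits heat pumps with roughly COP = 3. A minimal sketch (the electricity price is an illustrative assumption, not a figure from the article):

```python
def electricity_cost(heat_kwh, cop, price_per_kwh):
    """Cost of delivering heat_kwh of heat with a device of the given COP
    (COP = heat out / electricity in)."""
    return heat_kwh / cop * price_per_kwh

# Delivering 1,000 kWh of heat at an assumed $0.30/kWh electricity price:
radiator = electricity_cost(1000, 1.0, 0.30)  # resistive heating, COP = 1
pump = electricity_cost(1000, 3.0, 0.30)      # the article's 3kW-per-kW figure
print(round(radiator), round(pump))  # 300 100
```

At the same electricity price, the heat pump delivers the same warmth for a third of the running cost, which is the whole economic case the article makes against electric radiators.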
Gradient, based in San Francisco, is one of several companies offering a heat pump that can provide both heating and cooling. Its low-profile, saddle-bag shaped products can be mounted in windows, like existing air conditioners, and will go on sale in 2022.
Electrifying road transport is one thing. Aircraft are another matter. Batteries can only power small aircraft for short flights. But might electricity from hydrogen fuel cells, which excrete only water, do the trick? Passenger planes due to be test-flown with hydrogen fuel cells in 2022 include a two-seater being built at Delft University of Technology in the Netherlands. ZeroAvia, based in California, plans to complete trials of a 20-seat aircraft, and aims to have its hydrogen-propulsion system ready for certification by the end of the year. Universal Hydrogen, also of California, hopes its 40-seat plane will take off in September 2022.
Direct air capture
Carbon dioxide in the atmosphere causes global warming. So why not suck it out using machines? Several startups are pursuing direct air capture (DAC), a technology that does just that. In 2022 Carbon Engineering, a Canadian firm, will start building the world’s biggest DAC facility in Texas, capable of capturing 1m tonnes of CO2 per year. ClimeWorks, a Swiss firm, opened a DAC plant in Iceland in 2021, which buries captured CO2 in mineral form at a rate of 4,000 tonnes a year. Global Thermostat, an American firm, has two pilot plants. DAC could be vital in the fight against climate change. The race is on to get costs down and scale the technology up.
A new type of agriculture is growing. Vertical farms grow plants on trays stacked in a closed, controlled environment. Efficient LED lighting has made the process cheaper, though energy costs remain a burden. Vertical farms can be located close to customers, reducing transport costs and emissions. Water use is minimised and bugs are kept out, so no pesticides are needed.
In Britain, the Jones Food Company will open the world’s largest vertical farm, covering 13,750 square metres, in 2022. AeroFarms, an American firm, will open its largest vertical farm, in Danville, Virginia. Other firms will be expanding, too. Nordic Harvest will enlarge its facility just outside Copenhagen and construct a new one in Stockholm. Plenty, based in California, will open a new indoor farm near Los Angeles. Vertical farms mostly grow high-value leafy greens and herbs, but some are venturing into tomatoes, peppers and berries. The challenge now is to make the economics stack up, too.
Container ships with sails
Ships produce 3% of greenhouse-gas emissions. Burning maritime bunker fuel, a dirty diesel sludge, also contributes to acid rain. None of this was a problem in the age of sail—which is why sails are making a comeback, in high-tech form, to cut costs and emissions.
In 2022 Michelin of France will equip a freighter with an inflatable sail that is expected to reduce fuel consumption by 20%. MOL, a Japanese shipping firm, plans to put a telescoping rigid sail on a ship in August 2022. Naos Design of Italy expects to equip eight ships with its pivoting and foldable hard “wing sails”. Other approaches include kites, “suction wings” that house fans, and giant, spinning cylinders called Flettner rotors. By the end of 2022 the number of big cargo ships with sails of some kind will have quadrupled to 40, according to the International Windship Association. If the European Union brings shipping into its carbon-trading scheme in 2022, as planned, that will give these unusual technologies a further push.
Most people do not do enough exercise. Many would like to, but lack motivation. Virtual reality (VR) headsets let people play games and burn calories in the process, as they punch or slice oncoming shapes, or squat and shimmy to dodge obstacles. VR workouts became more popular during the pandemic as lockdowns closed gyms and a powerful, low-cost headset, the Oculus Quest 2, was released. An improved model and new fitness features are coming in 2022. And Supernatural, a highly regarded VR workout app available only in North America, may be released in Europe. Could the killer app for virtual reality be physical fitness?
Vaccines for HIV and malaria
The impressive success of coronavirus vaccines based on messenger RNA (mRNA) heralds a golden era of vaccine development. Moderna is developing an HIV vaccine based on the same mRNA technology used in its highly effective coronavirus vaccine. It entered early-stage clinical trials in 2021 and preliminary results are expected in 2022. BioNTech, joint-developer of the Pfizer-BioNTech coronavirus vaccine, is working on an mRNA vaccine for malaria, with clinical trials expected to start in 2022. Non-mRNA vaccines for HIV and malaria, developed at the University of Oxford, are also showing promise.
3D-printed bone implants
For years, researchers have been developing techniques to create artificial organs using 3D printing of biological materials. The ultimate goal is to take a few cells from a patient and create fully functional organs for transplantation, thus doing away with long waiting-lists, testing for matches and the risk of rejection.
That goal is still some way off for fleshy organs. But bones are less tricky. Two startups, Particle3D and ADAM, hope to have 3D-printed bones available for human implantation in 2022. Both firms use calcium-based minerals to print their bones, which are made to measure based on patients’ CT scans. Particle3D’s trials in pigs and mice found that bone marrow and blood vessels grew into its implants within eight weeks. ADAM says its 3D-printed implants stimulate natural bone growth and gradually biodegrade, eventually being replaced by the patient’s bone tissue. If all goes well, researchers say 3D-printed blood vessels and heart valves are next.
Flying electric taxis
Long seen as something of a fantasy, flying taxis, or electric vertical take-off and landing (eVTOL) aircraft, as the fledgling industry calls them, are getting serious. Several firms around the world will step up test flights in 2022 with the aim of getting their aircraft certified for commercial use in the following year or two. Joby Aviation, based in California, plans to build more than a dozen of its five-seater vehicles, which have a 150-mile range. Volocopter of Germany aims to provide an air-taxi service at the 2024 Paris Olympics. Other contenders include eHang, Lilium and Vertical Aerospace. Keep an eye on the skies.
After a stand-out year for space tourism in 2021, as a succession of billionaire-backed efforts shot civilians into the skies, hopes are high for 2022. Sir Richard Branson’s Virgin Galactic just beat Jeff Bezos’s Blue Origin to the edge of space in July, with both billionaires riding in their own spacecraft on suborbital trips. In September Elon Musk’s company, SpaceX, sent four passengers on a multi-day orbital cruise around the Earth.
All three firms hope to fly more tourists in 2022, which promises to be the first year in which more people go to space as paying passengers than as government employees. But Virgin Galactic is modifying its vehicle to make it stronger and safer, and it is not expected to fly again until the second half of 2022, with commercial service starting in the fourth quarter. Blue Origin plans more flights but has not said when or how many. For its part, SpaceX has done a deal to send tourists to the International Space Station. Next up? The Moon.
They are taking longer than expected to get off the ground. But new rules, which came into effect in 2021, will help drone deliveries gain altitude in 2022. Manna, an Irish startup which has been delivering books, meals and medicine in County Galway, plans to expand its service in Ireland and into Britain. Wing, a sister company of Google, has been doing test deliveries in America, Australia and Finland and will expand its mall-to-home delivery service, launched in late 2021. Dronamics, a Bulgarian startup, will start using winged drones to shuttle cargo between 39 European airports. The question is: will the pace of drone deliveries pick up—or drop off?
Quieter supersonic aircraft
For half a century, scientists have wondered whether changes to the shape of a supersonic aircraft could reduce the intensity of its sonic boom. Only recently have computers become powerful enough to run the simulations needed to turn those noise-reduction theories into practice.
In 2022 NASA’s X-59 QueSST (short for “Quiet Supersonic Technology”) will make its first test flight. Crucially, that test will take place over land—specifically, Edwards Air Force Base in California. Concorde, the world’s first and only commercial supersonic airliner, was not allowed to travel faster than sound when flying over land. The X-59’s sonic boom is expected to be just one-eighth as loud as Concorde’s. At 75 perceived decibels, it will be equivalent to a distant thunderstorm—more of a sonic “thump”. If it works, NASA hopes that regulators could lift the ban on supersonic flights over land, ushering in a new era for commercial flight.
Architects often use 3D printing to create scale models of buildings. But the technology can be scaled up and used to build the real thing. Materials are squirted out of a nozzle as a foam that then hardens. Layer by layer, a house is printed—either on site, or as several pieces in a factory that are transported and assembled.
In 2022 Mighty Buildings, based in California, will complete a development of 15 eco-friendly 3D-printed homes at Rancho Mirage. And ICON, based in Texas, plans to start building a community of 100 3D-printed homes near Austin, which would be the largest development of its kind.
It’s become a craze in Silicon Valley. Not content with maximising their productivity and performance during their waking hours, geeks are now optimising their sleep, too, using an array of technologies. These include rings and headbands that record and track sleep quality, soothing sound machines, devices to heat and cool mattresses, and smart alarm clocks to wake you at the perfect moment. Google launched a sleep-tracking nightstand tablet in 2021, and Amazon is expected to follow suit in 2022. It sounds crazy. But poor sleep is linked with maladies from heart disease to obesity. And what Silicon Valley does today, everyone else often ends up doing tomorrow.
Diets don’t work. Evidence is growing that each person’s metabolism is unique, and food choices should be, too. Enter personalised nutrition: apps that tell you what to eat and when, using machine-learning algorithms, tests of your blood and gut microbiome, data on lifestyle factors such as exercise, and real-time tracking of blood-sugar levels using coin-sized devices attached to the skin. After successful launches in America, personalised-nutrition firms are eyeing other markets in 2022. Some will also seek regulatory approval as treatments for conditions such as diabetes and migraine.
Wearable health trackers
Remote medical consultations have become commonplace. That could transform the prospects for wearable health trackers such as the Fitbit or Apple Watch. They are currently used primarily as fitness trackers, measuring steps taken, running and swimming speeds, heart rates during workouts, and so forth. But the line between consumer and medical uses of such devices is now blurring, say analysts at Gartner, a consultancy.
Smart watches can already measure blood oxygenation, perform ECGs and detect atrial fibrillation. The next version of the Apple Watch, expected in 2022, may include new sensors capable of measuring levels of glucose and alcohol in the blood, along with blood pressure and body temperature. Rockley Photonics, the company supplying the sensor technology, calls its system a “clinic on the wrist”. Regulatory approval for such functions may take a while, but in the meantime doctors, not just users, will be paying more attention to data from wearables.
Coined in 1992 by Neal Stephenson in his novel “Snow Crash”, the word “metaverse” referred to a persistent virtual world, accessible via special goggles, where people could meet, flirt, play games, buy and sell things, and much more besides. In 2022 it refers to the fusion of video games, social networking and entertainment to create new, immersive experiences, like swimming inside your favourite song at an online concert. Games such as Minecraft, Roblox and Fortnite are all stepping-stones to an emerging new medium. Facebook has renamed itself Meta to capitalise on the opportunity—and distract from its other woes.
An idea that existed only on blackboards in the 1990s has grown into a multi-billion dollar contest between governments, tech giants and startups: harnessing the counter-intuitive properties of quantum physics to build a new kind of computer. For some kinds of mathematics a quantum computer could outperform any non-quantum machine that could ever be built, making quick work of calculations used in cryptography, chemistry and finance.
But when will such machines arrive? One measure of a quantum computer’s capability is its number of qubits. A Chinese team has built a computer with 66 qubits. IBM, an American firm, hopes to hit 433 qubits in 2022 and 1,000 by 2023. But existing machines have a fatal flaw: the delicate quantum states on which they depend last for just a fraction of a second. Fixing that will take years. But if existing machines can be made useful in the meantime, quantum computing could become a commercial reality much sooner than expected.
Unlike a human influencer, a virtual influencer will never be late to a photoshoot, get drunk at a party or get old. That is because virtual influencers are computer-generated characters who plug products on Instagram, Facebook and TikTok.
The best known is Miquela Sousa, or “Lil Miquela”, a fictitious Brazilian-American 19-year-old with 3m Instagram followers. With $15bn expected to be spent on influencer marketing in 2022, virtual influencers are proliferating. Aya Stellar—an interstellar traveller crafted by Cosmiq Universe, a marketing agency—will land on Earth in February. She has already released a song on YouTube.
In April 2021 the irrepressible entrepreneur Elon Musk excitedly tweeted that a macaque monkey was “literally playing a video game telepathically using a brain chip”. His company, Neuralink, had implanted two tiny sets of electrodes into the monkey’s brain. Signals from these electrodes, transmitted wirelessly and then decoded by a nearby computer, enabled the monkey to move the on-screen paddle in a game of Pong using thought alone.
In 2022 Neuralink hopes to test its device in humans, to enable people who are paralysed to operate a computer. Another firm, Synchron, has already received approval from American regulators to begin human trials of a similar device. Its “minimally invasive” neural prosthetic is inserted into the brain via blood vessels in the neck. As well as helping paralysed people, Synchron is also looking at other uses, such as diagnosing and treating nervous-system conditions including epilepsy, depression and hypertension.
Artificial meat and fish
Winston Churchill once mused about “the absurdity of growing a whole chicken to eat the breast or wing”. Nearly a century later, around 70 companies are “cultivating” meats in bioreactors. Cells taken from animals, without harming them, are nourished in soups rich in proteins, sugars, fats, vitamins and minerals. In 2020 Eat Just, an artificial-meat startup based in San Francisco, became the first company certified to sell its products, in Singapore.
It is expected to be joined by a handful of other firms in 2022. In the coming year an Israeli startup, SuperMeat, expects to win approval for commercial sales of cultivated chicken burgers, grown for $10 a pop—down from $2,500 in 2018, the company says. Finless Foods, based in California, hopes for approval to sell cultivated bluefin tuna, grown for $440 a kilogram—down from $660,000 in 2017. Bacon, turkey and other cultivated meats are in the pipeline. Eco-conscious meat-lovers will soon be able to have their steak—and eat it.
By the Science and technology correspondents of The Economist■
This article appeared in the What next? section of the print edition of The World Ahead 2022 under the headline “What next?”
A room full of agronomy students is attending a lecture on climate change in Brazil. They are at a college in the state of Mato Grosso, the country's biggest soyabean producer, listening to a professor from the University of São Paulo. But what they hear is the opposite of what the overwhelming majority of the world's scientific community believes. There, the message delivered is that man-made global warming does not exist.
"The goals [of those who talk about climate change] are to freeze developing countries. Brazil is the main target of these operations involving the environment and the climate. The idea of climate change and of these environmental issues is to hold back our development," said the speaker, meteorologist Ricardo Felicio, without scientific backing, in an interview given after the event, which took place in 2019.
In reality, according to the latest report of the Intergovernmental Panel on Climate Change (IPCC), from August this year, the role of human influence in warming the planet is "unequivocal". It is precisely to limit climate change by cutting greenhouse-gas emissions that leaders gathered over the past two weeks at COP26 in Glasgow, in the United Kingdom.
In Brazil, the biggest source of carbon-dioxide emissions is deforestation carried out to expand farming and cattle-ranching.
Yet, running counter to the science, agribusiness associations (from soyabean farmers and coffee growers to rural unions, agronomy colleges and even a fertiliser company) are funding lectures by so-called "climate denialists": people who do not believe in man-made climate change and who present it as a fraud. The talks are aimed at other farmers, rural producers and agronomy students.
This report counted at least 20 such lectures in these settings over the past three years, given by Felicio and by another professor. The one described at the start of this report took place in 2019 and was part of a university circuit of 11 lectures entitled "Global warming, myth or reality?", held at nine colleges and two unions in Mato Grosso. All of them were funded by Aprosoja Mato Grosso, the association of soyabean and maize producers in the state, Brazil's biggest soyabean producer.
While denying anthropogenic global warming, the lectures paid for and watched by agribusiness figures also absolve them of acknowledging their own role in climate change. According to the content presented by the professors, which runs against the scientific consensus, climate change would be merely the product of natural variation, with no human interference whatsoever.
In contrast to this "denialist" wing of agribusiness, the chairman of the board of the Brazilian Agribusiness Association, Marcello Brito, says the association is guided "by the best science" and that "throwing science away because it brings us not only advantages but also duties is counterproductive, to say the least, working against continuous improvement".
Felicio, the professor in USP's Geography department hired by Aprosoja Mato Grosso in 2019, is known for his controversial positions, most recently regarding the covid-19 pandemic. In a video published on his YouTube channel in August this year, he called the pandemic a "fraudemic" and said, without scientific basis, that vaccines cause greater harm than covid-19. In another, he claimed that masks are not effective against covid-19. He is also a notorious denier of man-made climate change. He became well known in 2012, when he was invited onto Globo's Programa do Jô and, without evidence, denied the greenhouse effect.
For three weeks, this report tried to reach Felicio by phone calls, text messages and e-mails, but received no reply. The vice-president of Aprosoja Mato Grosso, Lucas Beber, justified the invitation in an interview with BBC News Brasil.
"We brought Ricardo Felicio in to offer a counterpoint to what is repeated in the media today, which seems like an absolute truth. We did not want to impose it as a truth, but rather to bring it into a debate," he says. For him, man-made climate change is still an "uncertainty", even though there is already scientific consensus around it. Beber also said he could not remember how much the cycle of 11 lectures given by Felicio that year had cost.
Last year the meteorologist was also invited to speak at Tecno Safra Nortão 2020, a fair for rural producers, community leaders, technicians, researchers and students organised by the rural union of Matupá, a municipality in northern Mato Grosso.
According to the union's vice-president, Fernando Bertolin, at least a hundred people, including small and large farmers, ranchers and other townspeople, attended the lecture. He defends the invitation, saying that at the time Felicio was "very prominent in the media" and that his talk "was requested by the producers". "We listen to everyone. He has his theoretical grounding and we wanted to know why he was saying those things."
Bertolin says he cannot recall offhand how much Felicio's lecture cost, but states that none of those hired for the fair cost more than R$15,000.
In 2018, Felicio ran, unsuccessfully, for federal deputy for the PSL, the former party of President Jair Bolsonaro.
A year earlier, the president had tweeted a video of an interview in which Felicio denies the existence of man-made climate change. Bolsonaro wrote: "Worth checking out." Asked by BBC News Brasil about this recommendation by Bolsonaro, the presidency's press office did not respond.
The professor has not been championed only by the president. In 2019, Felicio was invited to speak in the Senate alongside another academic who does not accept that global warming is caused by humans: Luiz Carlos Molion, a meteorologist and retired professor at the Federal University of Alagoas (Ufal).
The invitation for the two professors to speak at a joint public hearing of the Senate's Foreign Relations and Environment committees on climate change came from Marcio Bittar, a senator from Acre (now of the PSL, but of the MDB at the time) and a former rancher who belongs to the ruralist caucus.
Alongside Felicio, Molion is considered one of the leading voices of climate denialism in Brazil, and he delivered the other talks counted by this report.
Over the past three years, Molion has given numerous talks promoted by organisations such as the Cooperativa Agrícola de Unaí, in Minas Gerais; the Associação Avícola de Pernambuco; the Associação de Engenheiros e Arquitetos de Itanhaém, with the official sponsorship of the Conselho Regional de Engenharia e Agronomia de São Paulo; Central Campo, a company specialising in the sale of agricultural inputs; the Feira Agrotecnológica do Tocantins, run by the Tocantins state government; the agribusiness fair of Cooabriel, a coffee cooperative operating in Espírito Santo and Bahia; and the rural union of Canarana, in Mato Grosso.
Molion was also invited to speak at universities: the Institute of Agricultural Sciences of the Federal University of Minas Gerais (UFMG) and the Federal University of Paraíba (UFPB). BBC News Brasil contacted all of these institutions for comment on their invitations to Molion; their responses appear below and at the end of this report.
Most of these talks are billed as covering the climate outlook for the coming year and "trends for the next 10 years". In them (most are available on YouTube and were reviewed by BBC News Brasil), Molion does make forecasts for the year ahead, which are useful to rural producers planning their next harvests, but he reserves the final part of each talk to argue that "global warming is a fraud", again a claim without scientific basis.
Near the end of his PowerPoint presentation he shows a slide with his closing words. It says the climate "varies from natural causes" and that "extreme events have always occurred". It also states: "Global warming is a myth. CO2 does not control the climate, it is not a villain (…) Cutting emissions: useless!"
At the talk promoted by the Tocantins state Secretariat of Agriculture, Livestock and Aquaculture in May 2020, for example, Molion claimed, contradicting the science, that "global warming is a hoax, a myth". "Reducing emissions as that 2015 Paris Agreement wants is useless; Brazil should pull out, because reducing emissions will bring no benefit to the planet or the climate, because CO2 does not control the climate," he said, going against the overwhelming majority of scientific research in recent years and the global effort to seal agreements to cut greenhouse-gas emissions.
The secretariat said it invited him, alongside other speakers, to "align the agricultural sector with the various existing schools of thought and help producers in their planning and in making sounder decisions for their rural businesses".
Later, in October 2020, at a virtual seminar promoted by Central Campo, the Minas Gerais company specialising in agricultural inputs, Molion made the same claims about CO2 and the Paris Agreement.
The company's director, Artur Barros, told BBC News Brasil by email that the company "has always known professor Molion's position, which is very pragmatic on climate issues", and that he is "the professional whose forecasts are the most accurate". "Central Campo, like a large share of the producers the company serves, is very much aligned with professor Molion's position."
Molion told BBC News Brasil: "I try to use my talks for agribusiness, and there are plenty of them, to devote the third segment to climate change and the hoax of CO2 as the controller of the global climate. I give a local diagnosis and a harvest forecast, and then I discuss the climate trend for the next ten to 15 years, which is one of cooling."
By his own account, Molion gives 50 talks a year, "the great majority, 80% to 85%, for agribusiness", charging R$ 4,000 for each one. Barros, of Central Campo, confirmed that this was the fee he paid for the professor's talk.
The meteorologist says he does not mind being called a "denialist", although, he stresses, he has never denied that the planet warmed during a specific period in the past. "I present what I believe is correct; it may be that in a few years I am proven wrong, and I will acknowledge it. I am no dabbler. I have a very critical view of the local and global climate thanks to my training."
One of the most recent seminars he took part in also featured members of the Bolsonaro government: the vice-president, Hamilton Mourão, and the infrastructure minister, Tarcísio Freitas. It was a virtual seminar on the Amazon, held in August this year and organised by the Instituto General Villas Bôas, the NGO of the former army commander.
Contradicting the scientific community's consensus on climate change, Molion argued that the global climate varies naturally, without human influence, and presented a slide stating that the greenhouse effect, "as described by the IPCC, is questionable". Before handing over to minister Freitas, he declared: "CO2 is not a villain; the more CO2 in the atmosphere, the better."
BBC News Brasil asked the vice-presidency why Mourão agreed to take part in a seminar alongside a professor who denies that human activity is contributing to global warming. His press office said only that Mourão attended at the invitation of the Instituto General Villas Bôas and that he "bases his ideas and opinions on scientific data".
The press office of the infrastructure minister, Tarcísio Freitas, said he took part in the seminar after an invitation from general Villas Bôas himself. The minister of agriculture, livestock and supply, Tereza Cristina, was initially announced as one of the ministers who would participate, but her office said she would not attend, without explaining why she withdrew.
Climate denialism in Brazil
The genealogy of climate denialism in Brazil begins in the 2000s, when the press "gave equal weight to arguments of entirely different weight", says the sociologist Jean Miguel, a research associate at Unifesp who studies the subject. Public debate in Brazil took off mainly after the American documentary An Inconvenient Truth (2006), about former US vice-president Al Gore's campaign on global warming.
Meanwhile, a small group of denialists in Brazilian academia, including Felicio and Molion, spoke out publicly on the subject. For Miguel, they are "true merchants of doubt, working to highlight the gaps that all science has and to amplify uncertainties".
"[And who listened to them in Brazil] was the part of agribusiness interested in rolling back forest regulation," Miguel says.
Today, "the talks massage the egos of rural producers and foster the mindset that these agribusiness groups are being wronged even as they contribute to national GDP", the researcher says.
That does not mean every rural producer is a denialist. "The fight today is between two camps: the agro-export sector, which is in closer contact with international buyers, and therefore under more reputational pressure, and which makes long-term investments, thinking about production over the next decade rather than the next harvest," says Raoni Rajão, professor of environmental management at the Federal University of Minas Gerais (UFMG).
"The other camp is the producers who are more politicised and strong supporters of Bolsonaro and his entire agenda. They essentially buy the narrative that the whole story of climate change is a device to block Brazil's development."
Although it did not begin under the Bolsonaro government, denialism "has found fertile ground to spread" during his administration, Miguel argues, citing steps such as the closure, at the start of Bolsonaro's term, of the secretariat responsible for climate-change policy (it was reopened amid criticism the following year) and the withdrawal from hosting COP-25, which was to have taken place in Brazil in November 2019. During his 2018 campaign, Bolsonaro also promised to end what he called the environmental "fines industry".
The former foreign minister Ernesto Araújo, who held the post from the start of the Bolsonaro government until March 2021, went so far as to question whether climate change is caused by human activity, contrary to the scientific consensus.
"They are deeply informed by climate denialism. Even if they do not call it a fraud, internally they create the conditions to sabotage science and national climate policy, practical forms of climate denialism," Miguel says.
Yet practical measures will be needed if Brazil is to meet the targets its government announced at COP-26: eliminating illegal deforestation by 2028, cutting greenhouse-gas emissions by 50% by 2030 and reaching carbon neutrality by 2050.
Deforestation, driven by the expansion of farming and ranching, is the largest source of CO2 emissions in Brazil.
Between August 2019 and July 2020 alone, 10,851 km² (roughly half the area of the state of Sergipe) were cleared in the Legal Amazon, according to the Prodes monitoring system of the National Institute for Space Research (INPE). That was an increase of 7.13% over the previous year.
That growth was clearly reflected in Brazil's greenhouse-gas emissions in 2020, which rose 9.5%, according to the Greenhouse Gas Emissions Estimate System (SEEG) of the Observatório do Clima, driven mainly by land-use and forest change, which includes deforestation, and by agriculture. The rise ran counter to the rest of the world, which, brought to a standstill by the covid-19 pandemic, cut emissions by 7%.
For Tasso Azevedo, SEEG's coordinator, the good news is that if Brazil manages to control deforestation, "emissions will fall very quickly". "If we control deforestation, no country in the world will cut emissions proportionally more than Brazil, so I see it as an opportunity. It would be an extraordinary result for Brazil and for the planet."
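The figures above can be sanity-checked with a few lines of arithmetic. This is only a sketch: the implied prior-year deforestation value is an inference from the reported numbers, not a published Prodes figure.

```python
# Back-of-the-envelope check of the deforestation and emissions figures
# quoted above (Prodes/INPE and SEEG data as reported in the article).

deforested_2020_km2 = 10_851      # Aug 2019 - Jul 2020, Legal Amazon
increase = 0.0713                 # 7.13% rise over the previous year

# Implied prior-year figure: x * (1 + 0.0713) = 10_851
deforested_2019_km2 = deforested_2020_km2 / (1 + increase)
print(round(deforested_2019_km2))   # prints 10129 (km2, inferred)

# Brazil's emissions moved against the global trend in 2020:
brazil_change = +0.095   # +9.5% (SEEG)
world_change = -0.07     # -7% (pandemic slowdown)
gap = brazil_change - world_change
print(f"{gap:.1%}")      # prints "16.5%", the divergence in points
```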
Despite belonging to the sector responsible for most of Brazil's greenhouse-gas emissions, some ruralists say they feel unfairly accused by environmentalists.
Felicio's talks in Mato Grosso in 2019 "came at a time when it was fashionable to say the farmer was destroying the world", says the rural producer Artemio Antonini, president of the rural union of Nova Xavantina, in Mato Grosso. Himself sceptical about climate change, Antonini helped organise Felicio's talk in the region.
In the view of Rajão, of UFMG, "agribusiness as a whole takes it personally and feels offended when deforestation comes up". "The reaction is to deny the deforestation and deny that climate change exists."
"Takes it personally" because, in fact, those who clear the land are generally not the farmers. A deforested area begins with a wave of speculators, who stake out the land, cut down the vegetation and then try to regularise the holding; next come the ranchers, and only then the farmers, Rajão explains. "That is why, when they say they are not involved in deforestation, it is true: most of them are not. But they benefit from a supply of cheap land that comes from a whole process of illegal deforestation that sometimes took place ten years earlier."
The illegality is highly concentrated. The study "The rotten apples of Brazil's agribusiness", by Rajão and other researchers, found that more than 90% of producers in the Amazon and the Cerrado engaged in no illegal deforestation after 2008, while just 2% of properties in those regions accounted for 62% of all potentially illegal deforestation. The paper was published in the journal Science last year.
The soya farmer Ilson Redivo was also in the audience at one of the talks professor Ricardo Felicio gave in 2019, in the municipality of Sinop, in northern Mato Grosso.
Redivo moved from Paraná to Sinop in 1988, initially working, like most migrants, in the timber industry. "It was a major logging hub, and that was what paid at the time," he says. Today he owns a 4,200-hectare maize and soya farm in the region and chairs the city's rural union.
He says he enjoyed Felicio's talk. Like the professor, the producer rejects the established science on global warming. He calls it an "economic narrative", not an environmental one, created to hold back Brazil's development.
"I've been here thirty years; plenty has been cleared and the climate is still the same, right? There has been no climate change," Redivo tells BBC News Brasil.
Echoing arguments already used by Bolsonaro, the farmer says Brazil is "an example to the world in environmental preservation". "The Brazilian producer is the one who preserves the most."
The argument is repeated by other rural producers. "Nobody mentions that the farmer is setting aside 80% and using only 20% of his land to produce," complains Antonini.
They are referring to the Legal Reserve, a mechanism created by the Brazilian Forest Code that obliges landowners in the Amazon to preserve 80% of the native forest (in the Cerrado the figure is 35%; in other biomes, 20%), something that benefits agribusiness itself through the environmental services the forest provides. Many farmers consider it unfair. In practice, though, not everyone complies.
Ritaumaria Pereira, a researcher at the Amazon Institute of People and the Environment (Imazon), interviewed 131 cattle ranchers in Pará in 2013 and 2014 and found that more than 95% of them reported preserving less than the required amount. They argued, she says, that the land was already bare when they arrived, that in the past they had been encouraged to clear it, or that they lacked the resources to regenerate 80%.
For Pereira, of Imazon, if Brazil is to meet the targets announced at COP-26, it will need to invest in enforcement in the Amazon, strengthening agencies such as Ibama and ICMBio.
Countering climate denialism will also be necessary. The message delivered to rural producers, she says, legitimises deforestation and "brings more people into this way of thinking so that, in the near future, they can validate everything they have already cleared".
For Rajão, of UFMG, it is a narrative "that is comforting in the short term but in the long term contributes to what has been called 'agro-suicide'".
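The Legal Reserve percentages cited above can be expressed as a small rule-of-thumb calculation. This is a minimal sketch of the rule as the article describes it; the biome labels and helper function are illustrative, not an official classification.

```python
# Legal Reserve shares as described in the article (Brazilian Forest
# Code): the fraction of native vegetation a landowner must preserve.
LEGAL_RESERVE = {
    "amazon_forest": 0.80,   # 80% preserved in the Amazon
    "cerrado": 0.35,         # 35% in the Cerrado
    "other": 0.20,           # 20% in other biomes
}

def usable_area(total_hectares: float, biome: str) -> float:
    """Hectares that may be used for production under the rule."""
    return total_hectares * (1 - LEGAL_RESERVE[biome])

# A 4,200-hectare property on forested Amazon land, for example, could
# legally put only 20% of its area into production:
print(round(usable_area(4200, "amazon_forest")))  # prints 840
```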
Responses from organisations that invited the professors to speak
Cooperativa Agrícola de Unaí (Coagril)
The Cooperativa Agrícola de Unaí Ltda (Coagril) says it "hired professor Molion in order to obtain information about the rainfall regime in its region of operation, with a view to the strategic planning of its business and that of its members".
Associação Avícola de Pernambuco
"AVIPE reaffirms its plural character, prizing diversity of ideas, in which the debate among all points of view must be continually exhausted in the perpetual search for a provisional conclusion on any subject. (…) As an association, it is not for us to believe or disbelieve whether human activity causes climate change; our role is not one of creed but of supporting scientific debate by those who devote their lives to research. It would not accord with our principles, conduct and values to select one slice of scientific opinion in order to support a given conclusion for opportunistic or individual ends. Financial matters are reserved for our members."
Associação de Engenheiros e Arquitetos de Itanhaém, with the official sponsorship of the Conselho Regional de Engenharia e Agronomia de São Paulo
The Associação de Engenheiros e Arquitetos de Itanhaém received BBC News Brasil's request by email and WhatsApp but did not respond.
Crea-SP replied that its "legal mission is the technical and cultural advancement of professionals in the technology field, under Law 5,194".
"Events to that end, held by the associations, are the responsibility of their organisers and do not necessarily represent the position of Crea-SP.
The Council further stresses that it believes in climate change caused by human action and, as a way of supporting measures to combat it, is a signatory of the UN's 17 Sustainable Development Goals."
Cooabriel coffee cooperative
It received BBC News Brasil's request by email but did not respond.
Rural union of Canarana
The union's president, Alex Wisch, replied by WhatsApp message: "We propose that you nominate a scientist of the same academic standing as Prof. Molion so that everyone can learn the scientific truth about this subject. We can contribute financially to such an event and even host it."
Institute of Agricultural Sciences of the Federal University of Minas Gerais (UFMG)
By telephone, the institute's deputy director, Helder Augusto, said: "At the university there is diversity of ideas and counterpoints. It is not a position of UFMG. It is his own point of view, one take among others. The person came, gave a talk and can say whatever he likes, because it is a public space. The university does not pay anyone for talks."
Federal University of Paraíba
"The event was held in the auditorium of UFPB's Technology Centre and organised within the Department of Mechanical Engineering, which took advantage of the fact that the speaker was already in João Pessoa (PB) and invited him to give a talk at UFPB; in this particular case, therefore, at no cost to UFPB.
The initiative of inviting the researcher to speak about his studies should not be confused with the vision, mission and values of UFPB, prominent among which is the University's public and autonomous character.
UFPB defends the role of academia and supports science and research, and knowledge generated through scientific methods, in order to find solutions to challenges in every field and generate benefits for society. Through science, theories are constantly tested with a view to their confirmation or their replacement by another theory that withstands scrutiny. It is not for the University to apply prior censorship to science."
Some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents. This is dangerous, as effects on the Indian monsoons will show.
Prakash Kashwan – 28/Dec/2018
Multilateral climate negotiations led by the UN have ended on disappointing notes of late. This has prompted climate scientists to weigh the pros and cons of climate geoengineering. Indian scientists, policymakers, and the public must also engage in these debates, especially given the potentially major implications of geoengineering for the monsoons in South Asia and Africa.
Since 2016, an academic working group (AWG) of 14 global governance experts (including the author) has deliberated on the wisdom and merits of geoengineering. In a report, we argue that we ought to develop ‘anticipatory governance mechanisms’.
While people often equate governance with top-down regulations, the AWG’s vision emphasises a combination of regulatory and voluntary strategies adopted by diverse state and non-state actors.
In the same vein, it’s also important to unpack the umbrella terminology of ‘geoengineering’. It comprises two sets of technologies with different governance implications: carbon geoengineering and solar geoengineering.
Carbon geoengineering, or carbon-dioxide removal, seeks to remove large quantities of the greenhouse gas from the atmosphere. The suite of options it presents include bioenergy with carbon capture and storage (BECCS). This would require planting bioenergy crops over an area up to five times the size of India by 2100. Obviously such large-scale and rapid land-use change will strain the already precarious global food security and violate the land, forest and water rights of hundreds of millions.
The second cluster of geoengineering technologies, solar geoengineering, a.k.a. solar radiation management (SRM), seeks to cool the planet by reflecting a fraction of sunlight back into space. While this could help avoid some of the more severe effects of climate change, SRM doesn't reduce the stock of carbon already present in the atmosphere. Scientists also caution that geoengineering may distract us from investing in emissions reduction. But experience shows that policymakers may well ignore such cautions.
This means problems like air pollution and ocean acidification will continue unabated in the absence of profound climate mitigation actions. On the other hand, by altering atmospheric temperature, SRM could significantly disrupt the hydrological cycle and affect the monsoons.
Just being interested in minimising disruptions to the monsoons should encourage India to help develop international geoengineering governance.
But before we can get into the nitty-gritty, there's a question that must be answered. Why should the global community think about governing climate engineering at this stage, when all that exists of SRM are computer simulations of its pros and cons?
Some reasons follow:
First, the suggestion that geoengineering technologies merely fill a void left open by a “lack of political will” doesn’t capture the full array of possibilities. The IPCC Special Report on the effects on a world warming by 1.5°C includes a scenario in which the Paris Agreement’s goals are secured by 2050. This pathway banks on social, business and technological innovations, and doesn’t require resorting to radical climate responses or sacrificing improvements in basic living standards in the developing world.
On the other hand, $8 trillion’s worth of investments have already been redirected away from fossil fuel operations. These successes owe thanks to a global divestment movement led by environmental activists and student groups. (Such an outcome was thought to be politically infeasible only a few years ago.)
Second, recent research has shown that some geoengineering technologies, such as BECCS, could compete against the pursuit of more "ecologically sound, economical, and scalable" methods of enhancing natural climate sinks.
Third, despite a lot of progress in recent years, we don’t know enough to support a full assessment of the intended and unintended effects of geoengineering.
Decisions about which unresolved questions of geoengineering deserve public investment can’t be left only to the scientists and policymakers. The community of climate engineering scientists tends to frame geoengineering in certain ways over other equally valid alternatives.
This includes considering the global average surface temperature as the central climate impact indicator and ignoring vested interests linked to capital-intensive geoengineering infrastructure. This could bias future R&D trajectories in this area.
And these priorities, together with the assessments produced by eminent scientific bodies, have contributed to the rise of a de facto form of governance. In other words, some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents.
Such technocratic modes of governance don’t enjoy broad-based social or political legitimacy.
Individual research groups (e.g. Harvard University’s Solar Geoengineering Research Program) have opened themselves up to public scrutiny. They don’t support commercial work on solar geoengineering and have decided not to patent technologies being developed in their labs. While this is commendable, none of this can substitute more politically legitimate arrangements.
The case of the Indian monsoons illustrates these challenges well. Various models of the Geoengineering Model Intercomparison Project have shown that deploying SRM would likely cause net summer monsoon precipitation to decline by between 6.4% and 12.7%. (These predictions are based on average changes in atmospheric temperature, which means bigger or smaller variations could occur over different parts of India.)
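To give a sense of scale, the modelled range can be applied to a hypothetical seasonal rainfall figure. This is purely illustrative: the 850 mm baseline is an assumption for the example, not a figure from the article or from GeoMIP.

```python
# Applying the modelled 6.4%-12.7% decline in net summer monsoon
# precipitation under SRM (GeoMIP range cited above) to an assumed
# baseline. The baseline value is hypothetical.

baseline_mm = 850.0               # assumed seasonal rainfall (example)
decline_range = (0.064, 0.127)    # 6.4% to 12.7% decline under SRM

low = baseline_mm * (1 - decline_range[1])   # largest modelled decline
high = baseline_mm * (1 - decline_range[0])  # smallest modelled decline
print(f"{low:.0f}-{high:.0f} mm")            # prints "742-796 mm"
```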
So politically legitimate international governance is important to ensure global responses to climate change account for these and other domestic consequences.
As a first step, the AWG report recommends the UN secretary-general establish a high-level representative body to engage in international dialogue on various questions of governing SRM R&D, supported by a General Assembly resolution. Among other things, the mandate of this ‘World Commission’ could include debating whether, and to what end, SRM should be researched and developed and how it could fit within broader climate response strategies.
Then again, debates over solar geoengineering can’t be limited to global bodies and commissions. So the AWG also recommends the UN create a global forum for stakeholder dialogue to facilitate discussions on solar geoengineering. Such a forum could engage a variety of stakeholders, including local governments, communities, indigenous peoples and other climate-vulnerable groups, youth organisations and women’s groups. Only such a process is likely to effectively represent Indian peasants and farmers at the receiving end of a longstanding agrarian crisis.
These proposals for geoengineering governance build on various precedents. For example, in the late 1990s, the World Commission on Dams demonstrated the feasibility and value of an extensive multi-level governance arrangement.
In 2018, policy experts finally recognised that global climate governance can't ignore the general public's concerns. It would be best to avoid rediscovering this wheel in the international governance of climate geoengineering.
Prakash Kashwan is an associate professor at the University of Connecticut, Storrs, and was a member of the AWG. The South Asia edition of his book Democracy in the Woods (2017) is due out later this month.
New antibody and antiviral treatments, and better vaccines, are on the way
The Economist – Nov 8th 2021
IN THE WELL-VACCINATED wealthier countries of the world, year three of the pandemic will be better than year two, and covid-19 will have much less impact on health and everyday activities. Vaccines have weakened the link between cases and deaths in countries such as Britain and Israel (see chart). But in countries that are poorer, less well vaccinated or both, the deleterious effects of the virus will linger. A disparity of outcomes between rich and poor countries will emerge. The Gates Foundation, one of the world’s largest charities, predicts that average incomes will return to their pre-pandemic levels in 90% of advanced economies, compared with only a third of low- and middle-income economies.
Although the supply of vaccines surged in the last quarter of 2021, many countries will remain under-vaccinated for much of 2022, as a result of distribution difficulties and vaccine hesitancy. This will lead to higher rates of death and illness and weaker economic recoveries. The “last mile” problem of vaccine delivery will become painfully apparent as health workers carry vaccines into the planet’s poorest and most remote places. But complaints about unequal distribution will start to abate during 2022 as access to patients’ arms becomes a larger limiting factor than access to jabs. Indeed, if manufacturers do not scale back vaccine production there will be a glut by the second half of the year, predicts Airfinity, a provider of life-sciences data.
Booster jabs will be more widely used in 2022 as countries develop an understanding of when they are needed. New variants will also drive uptake, says Stanley Plotkin of the University of Pennsylvania, inventor of the rubella vaccine. Dr Plotkin says current vaccines and tweaked versions will be used as boosters, enhancing protection against variants.
The vaccination of children will also expand, in some countries to those as young as six months. Where vaccine hesitancy makes it hard for governments to reach their targets they will be inclined to make life difficult for the unvaccinated—by requiring vaccine passports to attend certain venues, and making vaccination compulsory for groups such as health-care workers.
Immunity and treatments may be widespread enough by mid-2022 to drive down case numbers and reduce the risk of new variants. At this point, the virus will become endemic in many countries. But although existing vaccines may be able to suppress the virus, new ones are needed to cut transmission.
Stephane Bancel, the boss of Moderna, a maker of vaccines based on mRNA technology, says his firm is working on a “multivalent” vaccine that will protect against more than one variant of covid-19. Beyond that he is looking at a “pan-respiratory” vaccine combining protection against multiple coronaviruses, respiratory viruses and strains of influenza.
Other innovations in covid-19 vaccines will include freeze-dried formulations of mRNA jabs, and vaccines that are given via skin patches or inhalation. Freeze-dried mRNA vaccines are easy to transport. As the supply of vaccines grows in 2022, those based on mRNA will be increasingly preferred, because they offer higher levels of protection. That will crimp the global market for less effective vaccines, such as the Chinese ones.
In rich countries there will also be greater focus on antibody treatments for people infected with covid-19. America, Britain and other countries will rely more on cocktails such as those from Regeneron or AstraZeneca.
Most promising of all are new antiviral drugs. Pfizer is already manufacturing "significant quantities" of its protease inhibitor. In America, the government has agreed to a $1.2bn purchase of courses of an antiviral drug being developed by Merck, known as molnupiravir. This has shown its efficacy in trials, and the company has licensed it for widespread, affordable production.
There are many other antivirals in the pipeline. Antiviral drugs that can be taken in pill form, after diagnosis, are likely to become blockbusters in 2022, helping make covid-19 an ever more treatable disease. That will lead, in turn, to new concerns about unequal access and of misuse fostering resistant strains.
The greatest risk to this more optimistic outlook is the emergence of a new variant capable of evading the protection provided by existing vaccines. The coronavirus remains a formidable foe.
Natasha Loder: Health-policy editor, The Economist
This article appeared in the Science and Technology section of the print edition of The World Ahead 2022 under the headline “From pandemic to endemic”
From discs in the sky to faces in toast, learn to weigh evidence sceptically without becoming a closed-minded naysayer
by Stephen Law
Stephen Law is a philosopher and author. He is director of philosophy at the Department of Continuing Education at the University of Oxford, and editor of Think, the Royal Institute of Philosophy journal. He researches primarily in the fields of philosophy of religion, philosophy of mind, Ludwig Wittgenstein, and essentialism. His books for a popular audience include The Philosophy Gym (2003), The Complete Philosophy Files (2000) and Believing Bullshit (2011). He lives in Oxford.
Many people believe in extraordinary hidden beings, including demons, angels, spirits and gods. Plenty also believe in supernatural powers, including psychic abilities, faith healing and communication with the dead. Conspiracy theories are also popular, including that the Holocaust never happened and that the terrorist attacks on the United States of 11 September 2001 were an inside job. And, of course, many trust in alternative medicines such as homeopathy, the effectiveness of which seems to run contrary to our scientific understanding of how the world actually works.
Such beliefs are widely considered to be at the ‘weird’ end of the spectrum. But, of course, just because a belief involves something weird doesn’t mean it’s not true. As science keeps reminding us, reality often is weird. Quantum mechanics and black holes are very weird indeed. So, while ghosts might be weird, that’s no reason to dismiss belief in them out of hand.
I focus here on a particular kind of ‘weird’ belief: not only are these beliefs that concern the enticingly odd, they’re also beliefs that the general public finds particularly difficult to assess.
Almost everyone agrees that, when it comes to black holes, scientists are the relevant experts, and scientific investigation is the right way to go about establishing whether or not they exist. However, when it comes to ghosts, psychic powers or conspiracy theories, we often hold wildly divergent views not only about how reasonable such beliefs are, but also about what might count as strong evidence for or against them, and who the relevant authorities are.
Take homeopathy, for example. Is it reasonable to focus only on what scientists have to say? Shouldn’t we give at least as much weight to the testimony of the many people who claim to have benefitted from homeopathic treatment? While most scientists are sceptical about psychic abilities, what of the thousands of reports from people who claim to have received insights from psychics who could only have known what they did if they really do have some sort of psychic gift? To what extent can we even trust the supposed scientific ‘experts’? Might not the scientific community itself be part of a conspiracy to hide the truth about Area 51 in Nevada, Earth’s flatness or the 9/11 terrorist attacks being an inside job?
Most of us really struggle when it comes to assessing such ‘weird’ beliefs – myself included. Of course, we have our hunches about what’s most likely to be true. But when it comes to pinning down precisely why such beliefs are or aren’t reasonable, even the most intelligent and well educated of us can quickly find ourselves out of our depth. For example, while most would pooh-pooh belief in fairies, Arthur Conan Doyle, the creator of the quintessentially rational detective Sherlock Holmes, actually believed in them and wrote a book presenting what he thought was compelling evidence for their existence.
When it comes to weird beliefs, it’s important we avoid being closed-minded naysayers with our fingers in our ears, but it’s also crucial that we avoid being credulous fools. We want, as far as possible, to be reasonable.
I’m a philosopher who has spent a great deal of time thinking about the reasonableness of such ‘weird’ beliefs. Here I present five key pieces of advice that I hope will help you figure out for yourself what is and isn’t reasonable.
Let’s begin with an illustration of the kind of case that can so spectacularly divide opinion. In 1976, six workers reported a UFO over the site of a nuclear plant being constructed near the town of Apex, North Carolina. A security guard then reported a ‘strange object’. The police officer Ross Denson drove over to investigate and saw what he described as something ‘half the size of the Moon’ hanging over the plant. The police also took a call from local air traffic control about an unidentified blip on their radar.
The next night, the UFO appeared again. The deputy sheriff described ‘a large lighted object’. An auxiliary officer reported five lighted objects that appeared to be burning and about 20 times the size of a passing plane. The county magistrate described a rectangular football-field-sized object that looked like it was on fire.
Finally, the press got interested. Reporters from the Star newspaper drove over to investigate. They too saw the UFO. But when they tried to drive nearer, they discovered that, weirdly, no matter how fast they drove, they couldn’t get any closer.
This report, drawn from Philip J Klass’s book UFOs: The Public Deceived (1983), is impressive: it involves multiple eyewitnesses, including police officers, journalists and even a magistrate. Their testimony is even backed up by hard evidence – that radar blip.
Surely, many would say, given all this evidence, it’s reasonable to believe there was at least something extraordinary floating over the site. Anyone who failed to believe at least that much would be excessively sceptical – one of those perpetual naysayers whose kneejerk reaction, no matter how strong the evidence, is always to pooh-pooh.
What’s most likely to be true: that there really was something extraordinary hanging over the power plant, or that the various eyewitnesses had somehow been deceived? Before we answer, here’s my first piece of advice.
Think it through
1. Expect unexplained false sightings and huge coincidences
Our UFO story isn’t over yet. When the Star’s two-man investigative team couldn’t get any closer to the mysterious object, they eventually pulled over. The photographer took out his long lens to take a look: ‘Yep … that’s the planet Venus all right.’ It was later confirmed beyond any reasonable doubt that what all the witnesses had seen was just a planet. But what about that radar blip? It was a coincidence, perhaps caused by a flock of birds or unusual weather.
What moral should we draw from this case? Not, of course, that because this UFO report turned out to have a mundane explanation, all such reports can be similarly dismissed. But notice that, had the reporters not discovered the truth, this story would likely have gone down in the annals of ufology as one of the great unexplained cases. The moral I draw is that UFO cases that have multiple eyewitnesses and even independent hard evidence (the radar blip) may well crop up occasionally anyway, even if there are no alien craft in our skies.
We tend significantly to underestimate how prone to illusion and deception we are when it comes to the wacky and weird. In particular, we have a strong tendency to overdetect agency – to think we are witnessing a person, an alien or some other sort of creature or being – where in truth there’s none.
Psychologists have developed theories to account for this tendency to overdetect agency, including that we have evolved what’s called a hyperactive agency detecting device. Had our ancestors missed an agent – a sabre-toothed tiger or a rival, say – that might well have reduced their chances of surviving and reproducing. Believing an agent is present when it’s not, on the other hand, is likely to be far less costly. Consequently, we’ve evolved to err on the side of overdetection – often seeing agency where there is none. For example, when we observe a movement or pattern we can’t understand, such as the retrograde motion of a planet in the night sky, we’re likely to think the movement is explained by some hidden agent working behind the scenes (that Mars is actually a god, say).
One example of our tendency to overdetect agency is pareidolia: our tendency to find patterns – and, in particular, faces – in random noise. Stare at passing clouds or into the embers of a fire, and it’s easy to interpret the randomly generated shapes we see as faces, often spooky ones, staring back.
And, of course, nature is occasionally going to throw up face-like patterns just by chance. One famous illustration was produced in 1976 by the Mars probe Viking Orbiter 1. As the probe passed over the Cydonia region, it photographed what appeared to be an enormous, reptilian-looking face 800 feet high and nearly 2 miles long. Some believe this ‘face on Mars’ was a relic of an ancient Martian civilisation, a bit like the Great Sphinx of Giza in Egypt. A book called The Monuments of Mars: A City on the Edge of Forever (1987) even speculated about this lost civilisation. However, later photos revealed the ‘face’ to be just a hill that looks face-like when lit a certain way. Take enough photos of Mars, and some will reveal face-like features just by chance.
The fact is, we should expect huge coincidences. Millions of pieces of bread are toasted each morning. One or two will exhibit face-like patterns just by chance, even without divine intervention. One such piece of toast that was said to show the face of the Virgin Mary (how do we know what she looked like?) was sold for $28,000. We think about so many people each day that eventually we’ll think about someone, the phone will ring, and it will be them. That’s to be expected, even if we’re not psychic. Yet many put down such coincidences to supernatural powers.
2. Understand what strong evidence actually is
When is a claim strongly confirmed by a piece of evidence? The following principle appears correct (it captures part of what confirmation theorists call the Bayes factor; for more on Bayesian approaches to assessing evidence, see the link at the end):
Evidence confirms a claim to the extent that the evidence is more likely if the claim is true than if it’s false.
Here’s a simple illustration. Suppose I’m in the basement and can’t see outside. Jane walks in with a wet coat and umbrella and tells me it’s raining. That’s pretty strong evidence it’s raining. Why? Well, it is of course possible that Jane is playing a prank on me with her wet coat and brolly. But it’s far more likely she would appear with a wet coat and umbrella and tell me it’s raining if that’s true than if it’s false. In fact, given just this new evidence, it may well be reasonable for me to believe it’s raining.
Here’s another example. Sometimes whales and dolphins are found with atavistic limbs – leg-like structures – where legs would be found on land mammals. These discoveries strongly confirm the theory that whales and dolphins evolved from earlier limbed, land-dwelling species. Why? Because, while atavistic limbs aren’t probable given the truth of that theory, they’re still far more probable than they would be if whales and dolphins weren’t the descendants of such limbed creatures.
The Mars face, on the other hand, provides an example of weak or non-existent evidence. Yes, if there was an ancient Martian civilisation, then we might discover what appeared to be a huge face built on the surface of the planet. However, given pareidolia and the likelihood of face-like features being thrown up by chance, it’s about as likely that we would find such face-like features anyway, even if there were no alien civilisation. That’s why such features fail to provide strong evidence for such a civilisation.
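The principle above can be made concrete with Bayes’ rule. Here is a minimal sketch of the wet-coat example, where all the probability values are illustrative assumptions, not figures from the text:

```python
# A minimal sketch of the Bayes-factor idea from the rain example.
# All numbers are illustrative assumptions.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Update a prior belief via Bayes' rule, given how likely the
    evidence is if the claim is true vs. if it's false."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Jane walks in with a wet coat and umbrella and says it's raining.
# Suppose that evidence is very likely if it's raining (0.9) and very
# unlikely otherwise (0.01) -- a Bayes factor of 90.
print(posterior(prior=0.3, p_evidence_if_true=0.9, p_evidence_if_false=0.01))
```

Even starting from a modest prior of 0.3, a Bayes factor of 90 pushes the probability of rain above 0.97. Evidence that is roughly as likely either way (like the face-like hill on Mars) has a Bayes factor near 1 and barely moves the prior at all.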
So now consider our report of the UFO hanging over the nuclear power construction site. Are several such cases involving multiple witnesses and backed up by some hard evidence (eg, a radar blip) good evidence that there are alien craft in our skies? No. We should expect such hard-to-explain reports anyway, whether or not we’re visited by aliens. In which case, such reports are not strong evidence of alien visitors.
Being sceptical about such reports of alien craft, ghosts or fairies is not knee-jerk, fingers-in-our-ears naysaying. It’s just recognising that, though we might not be able to explain the reports, they’re likely to crop up occasionally anyway, whether or not alien visitors, ghosts or fairies actually exist. Consequently, they fail to provide strong evidence for such beings.
3. Extraordinary claims require extraordinary evidence
It was the scientist Carl Sagan who in 1980 said: ‘Extraordinary claims require extraordinary evidence.’ By an ‘extraordinary’ claim, Sagan appears to have meant an extraordinarily improbable claim, such as that Alice can fly by flapping her arms, or that she can move objects with her mind. On Sagan’s view, such claims require extraordinarily strong evidence before we should accept them – much stronger than the evidence required to support a far less improbable claim.
Suppose for example that Fred claims Alice visited him last night, sat on his sofa and drank a cup of tea. Ordinarily, we would just take Fred’s word for that. But suppose Fred adds that, during her visit, Alice flew around the room by flapping her arms. Of course, we’re not going to just take Fred’s word for that. It’s an extraordinary claim requiring extraordinary evidence.
If we’re starting from a very low base, probability-wise, then much more heavy lifting needs to be done by the evidence to raise the probability of the claim to a point where it might be reasonable to believe it. Clearly, Fred’s testimony about Alice flying around the room is not nearly strong enough.
Similarly, given the low prior probability of the claims that someone communicated with a dead relative, or has fairies living in their local wood, or has miraculously raised someone from the dead, or can move physical objects with their mind, we should similarly set the evidential bar much higher than we would for more mundane claims.
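Sagan’s maxim falls straight out of the same arithmetic: with a very low prior, even fairly strong evidence leaves the claim improbable. A sketch, with made-up illustrative numbers:

```python
# Why extraordinary claims need extraordinary evidence: the same
# Bayes factor that settles a mundane claim barely dents an
# extraordinary one. Numbers are illustrative assumptions.

def posterior(prior, bayes_factor):
    """Posterior probability from a prior and a Bayes factor (how much
    more likely the evidence is if the claim is true than if false)."""
    odds = (prior / (1 - prior)) * bayes_factor
    return odds / (1 + odds)

# Mundane claim: Alice drank tea. Fred's testimony (factor 10) settles it.
print(posterior(prior=0.5, bayes_factor=10))    # ~0.91

# Extraordinary claim: Alice flew by flapping her arms (prior ~1e-9).
# The same testimony leaves it wildly improbable.
print(posterior(prior=1e-9, bayes_factor=10))   # ~1e-8
```

To lift a one-in-a-billion claim to even a 50/50 proposition, the evidence would need a Bayes factor of around a billion – evidence as hard to explain away as the claim is improbable.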
4. Beware accumulated anecdotes
Once we’ve formed an opinion, it can be tempting to notice only evidence that supports it and to ignore the rest. Psychologists call this tendency confirmation bias.
For example, suppose Simon claims a psychic ability to know the future. He can provide 100 examples of his predictions coming true, including one or two dramatic examples. In fact, Simon once predicted that a certain celebrity would die within 12 months, and they did!
Do these 100 examples provide us with strong evidence that Simon really does have some sort of psychic ability? Not if Simon actually made many thousands of predictions and most didn’t come true. Still, if we count only Simon’s ‘hits’ and ignore his ‘misses’, it’s easy to create the impression that he has some sort of ‘gift’.
Confirmation bias can also create the false impression that a therapy is effective. A long list of anecdotes about patients whose condition improved after a faith healing session can seem impressive. People may say: ‘Look at all this evidence! Clearly this therapy has some benefits!’ But the truth is that such accumulated anecdotes are usually largely worthless as evidence.
It’s also worth remembering that such stories are in any case often dubious. For example, they can be generated by the power of suggestion: tell people that a treatment will improve their condition, and many will report that it has, even if the treatment actually offers no genuine medical benefit.
Impressive anecdotes can also be generated by means of a little creative interpretation. Many believe that the 16th-century seer Nostradamus predicted many important historical events, from the Great Fire of London to the assassination of John F Kennedy. However, because Nostradamus’s prophecies are so vague, nobody was able to use his writings to predict any of these events before they occurred. Rather, his texts were later creatively interpreted to fit what subsequently happened. But that sort of ‘fit’ can be achieved whether Nostradamus had extraordinary abilities or not. In which case, as we saw under point 2 above, the ‘fit’ is not strong evidence of such abilities.
5. Beware ‘But it fits!’
Often, when we’re presented with strong evidence that our belief is false, we can easily change our mind. Show me I’m mistaken in believing that the Matterhorn is near Chamonix, and I’ll just drop that belief.
However, abandoning a belief isn’t always so easy. That’s particularly the case for beliefs in which we have invested a great deal emotionally, socially and/or financially. When it comes to religious and political beliefs, for example, or beliefs about the character of our close relatives, we can find it extraordinarily difficult to change our minds. Psychologists refer to the discomfort we feel in such situations – when our beliefs or attitudes are in conflict – as cognitive dissonance.
Perhaps the most obvious strategy we can employ when a belief in which we have invested a great deal is threatened is to start explaining away the evidence.
Here’s an example. Dave believes dogs are spies from the planet Venus – that dogs are Venusian imposters on Earth sending secret reports back to Venus in preparation for their imminent invasion of our planet. Dave’s friends present him with a great deal of evidence that he’s mistaken. But, given a little ingenuity, Dave finds he can always explain away that evidence:
‘Dave, dogs can’t even speak – how can they communicate with Venus?’
‘They can speak, they just hide their linguistic ability from us.’
‘But Dave, dogs don’t have transmitters by which they could relay their messages to Venus – we’ve searched their baskets: nothing there!’
‘Their transmitters are hidden in their brain!’
‘But we’ve X-rayed this dog’s brain – no transmitter!’
‘The transmitters are made from organic material indistinguishable from ordinary brain stuff.’
‘But we can’t detect any signals coming from dogs’ heads.’
‘This is advanced alien technology – beyond our ability to detect it!’
‘Look Dave, Venus can’t support dog life – it’s incredibly hot and swathed in clouds of acid.’
‘The dogs live in deep underground bunkers to protect them. Why do you think they want to leave Venus?!’
You can see how this conversation might continue ad infinitum. No matter how much evidence is presented to Dave, it’s always possible for him to cook up another explanation. And so he can continue to insist his belief is logically consistent with the evidence.
But, of course, despite the possibility of his endlessly explaining away any and all counterevidence, Dave’s belief is absurd. It’s certainly not confirmed by the available evidence about dogs. In fact, it’s powerfully disconfirmed.
The moral is: showing that your theory can be made to ‘fit’ – be consistent with – the evidence is not the same thing as showing your theory is confirmed by the evidence. However, those who hold weird beliefs often muddle consistency and confirmation.
Take young-Earth creationists, for example. They believe in the literal truth of the Biblical account of creation: that the entire Universe is under 10,000 years old, with all species being created as described in the Book of Genesis.
Polls indicate that a third or more of US citizens believe that the Universe is less than 10,000 years old. Of course, there’s a mountain of evidence against the belief. However, its proponents are adept at explaining away that evidence.
Take the fossil record embedded in sedimentary layers revealing that today’s species evolved from earlier species over many millions of years. Many young-Earth creationists explain away this record as a result of the Biblical flood, which they suppose drowned and then buried living things in huge mud deposits. The particular ordering of the fossils is supposedly accounted for by different ecological zones being submerged one after the other, starting with simple marine life. Take a look at the Answers in Genesis website developed by the Bible literalist Ken Ham, and you’ll discover how a great deal of other evidence for evolution and a billions-of-years-old Universe is similarly explained away. Ham believes that, by explaining away the evidence against young-Earth creationism in this way, he can show that his theory ‘fits’ – and so is scientifically confirmed by – that evidence:
Increasing numbers of scientists are realising that when you take the Bible as your basis and build your models of science and history upon it, all the evidence from the living animals and plants, the fossils, and the cultures fits. This confirms that the Bible really is the Word of God and can be trusted totally. [my italics]
According to Ham, young-Earth creationists and evolutionists do the same thing: they look for ways to make the evidence fit the theory to which they have already committed themselves:
Evolutionists have their own framework … into which they try to fit the data. [my italics]
But, of course, scientists haven’t just found ways of showing how the theory of evolution can be made consistent with the evidence. As we saw above, that theory really is strongly confirmed by the evidence.
Any theory, no matter how absurd, can, with sufficient ingenuity be made to ‘fit’ the evidence: even Dave’s theory that dogs are Venusian spies. That’s not to say it’s reasonable or well confirmed.
Of course, it’s not always unreasonable to explain away evidence. Given overwhelming evidence that water boils at 100 degrees Celsius at 1 atmosphere, a single experiment that appeared to contradict that claim might reasonably be explained away as a result of some unidentified experimental error. But as we increasingly come to rely on explaining away evidence in order to try to convince ourselves of the reasonableness of our belief, we begin to drift into delusion.
Key points – How to think about weird things
Expect unexplained false sightings and huge coincidences. Reports of mysterious and extraordinary hidden agents – such as angels, demons, spirits and gods – are to be expected, whether or not such beings exist. Huge coincidences – such as a piece of toast looking very face-like – are also more or less inevitable.
Understand what strong evidence is. If the alleged evidence for a belief is scarcely more likely if the belief is true than if it’s false, then it’s not strong evidence.
Extraordinary claims require extraordinary evidence. If a claim is extraordinarily improbable – eg, the claim that Alice flew round the room by flapping her arms – much stronger evidence is required for reasonable belief than is required for belief in a more mundane claim, such as that Alice drank a cup of tea.
Beware accumulated anecdotes. A large number of reports of, say, people recovering after taking an alternative medicine or visiting a faith healer is not strong evidence that such treatments actually work.
Beware ‘But it fits!’ Any theory, no matter how ludicrous (even the theory that dogs are spies from Venus), can, with sufficient ingenuity, always be made logically consistent with the evidence. That’s not to say it’s confirmed by the evidence.
Why it matters
Sometimes, belief in weird things is pretty harmless. What does it matter if Mary believes there are fairies at the bottom of her garden, or Joe thinks his dead aunty visits him occasionally? What does it matter if Sally is a closed-minded naysayer when it comes to belief in psychic powers? However, many of these beliefs have serious consequences.
Clearly, people can be exploited. Grieving parents contact spiritualists who offer to put them in contact with their dead children. Peddlers of alternative medicine and faith healing charge exorbitant fees for their ‘cures’ for terminal illnesses. If some alternative medicines really work, casually dismissing them out of hand and refusing to properly consider the evidence could also cost lives.
Lives have certainly been lost. Many have died who might have been saved because they believed they should reject conventional medicine and opted for ineffective alternatives.
Huge amounts of money are often also at stake when it comes to weird beliefs. Psychic reading and astrology are huge businesses with turnovers of billions of dollars per year. Often, it’s the most desperate who will turn to such businesses for advice. Are they, in reality, throwing their money away?
Many ‘weird’ beliefs also have huge social and political implications. The former US president Ronald Reagan and his wife Nancy were reported to have consulted an astrologer before making any major political decision. Conspiracy theories such as QAnon and the Sandy Hook hoax shape our current political landscape and feed extremist political thinking. Mainstream religions are often committed to miracles and gods.
In short, when it comes to belief in weird things, the stakes can be very high indeed. It matters that we don’t delude ourselves into thinking we’re being reasonable when we’re not.
Links & books
The Atlantic article ‘The Cognitive Biases Tricking Your Brain’ (2018) by Ben Yagoda provides a great introduction to thinking that can lead us astray, including confirmation bias.
The UK-based magazine The Skeptic provides some high-quality free articles on belief in weird things. Well worth a subscription.
The Skeptical Inquirer magazine in the US is also excellent, and provides some free content.
The RationalWiki portal provides many excellent articles on pseudoscience.
The British mathematician Norman Fenton, professor of risk information management at Queen Mary University of London, provides a brief online introduction to Bayesian approaches to assessing evidence.
My book Believing Bullshit: How Not to Get Sucked into an Intellectual Black Hole (2011) identifies eight tricks of the trade that can turn flaky ideas into psychological flytraps – and how to avoid them.
The textbook How to Think About Weird Things: Critical Thinking for a New Age (2019, 8th ed) by the philosophers Theodore Schick and Lewis Vaughn offers step-by-step advice on sorting through reasons, evaluating evidence and judging the veracity of a claim.
The book Critical Thinking (2017) by Tom Chatfield offers a toolkit for what he calls ‘being reasonable in an unreasonable world’.
Placing our faith in forecasting and science could save lives and money
October 14, 2021
2021 is shaping up to be a historically busy hurricane season. And while damage and destruction have been serious, there has been one saving grace — that the National Weather Service has been mostly correct in its predictions.
Thanks to remote sensing, Gulf Coast residents knew to prepare for the “life-threatening inundation,” “urban flooding” and “potentially catastrophic wind damage” that the Weather Service predicted for Hurricane Ida. Meteorologists nailed Ida’s strength, surge and location of landfall while anticipating that a warm eddy would make her intensify too quickly to evacuate New Orleans safely. Then, as her remnants swirled northeast, reports warned of tornadoes and torrential rain. Millions took heed, and lives were saved. While many people died, their deaths resulted from failures of infrastructure and policy, not forecasting.
The long history of weather forecasting and weather mapping shows that having access to good data can help us make better choices in our own lives. Trust in meteorology has made our communities, commutes and commerce safer — and the same is possible for climate science.
Two hundred years ago, the few who studied weather deemed any atmospheric phenomenon a “meteor.” The term, referencing Aristotle’s “Meteorologica,” essentially meant “strange thing in the sky.” There were wet things (hail), windy things (tornadoes), luminous things (auroras) and fiery things (comets). In fact, the naturalist Elias Loomis, who was among the first to spot Halley’s comet upon its return in 1835, thought storms behaved as cyclically as comets. So to understand “the laws of storms,” Loomis and the era’s other leading weatherheads began gathering observations. Master the elements, they reasoned, and you could safely sail the seas, settle the American West, plant crops with confidence and ward off disease.
In 1856, Joseph Henry, the Smithsonian Institution’s first director, hung a map of the United States in the lobby of its Washington headquarters. Every morning, he would affix small colored discs to show the nation’s weather: white for places with clear skies, blue for snow, black for rain and brown for cloud cover. An arrow on each disc allowed him to note wind direction, too. For the first time, visitors could see weather across the expanding country.
Although simple by today’s standards, the map belied the effort and expense needed to select the correct colors each day. Henry persuaded telegraph companies to transmit weather reports every morning at 10. Then he equipped each station with thermometers, barometers, weathervanes and rain gauges — no small task by horse and rail, as instruments often broke in transit.
For longer-term studies of the North American climate, Henry enlisted academics, farmers and volunteers from Maine to the Caribbean. Eager to contribute, “Smithsonian observers” took readings three times a day and posted them to Washington each month. At its peak in 1860, the Smithsonian Meteorological Project had more than 500 observers. Then the Civil War broke out.
Henry’s ranks thinned by 40 percent as men traded barometers for bayonets. Severed telegraph lines and the priority of war messages crippled his network. Then in January 1865, a fire in Henry’s office landed the fatal blow to the project. All of his efforts turned to salvaging what survived. With a vacuum of leadership in Washington, citizen scientists picked up the slack.
Among those citizen scientists was Increase Lapham, a Milwaukee naturalist who lobbied for a storm-warning service for the Great Lakes. Although the Chicago Tribune lampooned Lapham, wondering “what practical value” a warning service would provide “if it takes 10 years to calculate the progress of a storm,” Rep. Halbert E. Paine (Wis.), who had studied storms under Loomis, rushed a bill into Congress before the winter recess. In early 1870, a joint resolution establishing a storm-warning service under the U.S. Army Signal Office passed without debate. President Ulysses S. Grant signed it into law the following week.
Despite the mandate for an early-warning system, an aversion to predictions remained. Fiscal hawks could not justify an investment in erroneous forecasts, religious zealots could not stomach the hubris, and politicians wary of a skeptical public could not bear the fallout. In 1893, Agriculture Secretary J. Sterling Morton cut the salary of one of the country’s top weather scientists, Cleveland Abbe, by 25 percent, making an example out of him.
While Willis Moore, the Weather Bureau chief whose office had suppressed Cuban forecasters’ warnings ahead of the deadly 1900 Galveston hurricane, didn’t face consequences for his dereliction of duty, the bureau’s hurricane-forecasting methods gradually improved as the network expanded and technologies like radio emerged. The advent of aviation increased insight into the upper atmosphere; military research led to civilian weather radar, first deployed at Washington National Airport in 1947. By the 1950s, computers were ushering in the future of numerical forecasting. Meanwhile, public skepticism thawed as more people and businesses saw it in their best interests to trust experts.
In September 1961, a local news team decided to broadcast live from the Weather Bureau office in Galveston, Tex., as Hurricane Carla angled across the Gulf of Mexico. Leading the coverage was a young reporter named Dan Rather. “There is the eye of the hurricane right there,” he told his audience as the radar sweep brought the invisible into view. At the time, no one had seen a radar weather map televised before.
Rather realized that for viewers to comprehend the storm’s size, location and imminent danger, people needed a sense of scale. So he had a meteorologist draw the Texas coast on a transparent sheet of plastic, which Rather laid over the radarscope. Years later, he recalled that when he said “one inch equals 50 miles,” you could hear people in the studio gasp. The sight of the approaching buzz saw persuaded 350,000 Texans to evacuate their homes in what was then the largest weather-related evacuation in U.S. history. Ultimately, Carla inflicted twice as much damage as the Galveston hurricane 60 years earlier. But with the aid of Rather’s impromptu visualization, fewer than 50 lives were lost.
In other words, weather forecasting wasn’t only about good science, but about good communication and visuals.
Data visualization helped the public better understand the weather shaping their lives, and this enabled them to take action. It also gives us the power to see deadly storms not as freak occurrences, but as part of something else: a pattern.
Two hundred years ago, a 10-day forecast would have seemed preposterous. Now we can predict if we’ll need an umbrella tomorrow or a snowplow next week. Imagine if we planned careers, bought homes, built infrastructure and passed policy based on 50-year forecasts as routinely as we plan our weeks by five-day ones.
Unlike our predecessors of the 19th or even 20th centuries, we have access to ample climate data and data visualization that give us the knowledge to take bold actions. What we do with that knowledge is a matter of political will. It may be too late to stop the coming storm, but we still have time to board our windows.
The US Intelligence Community (IC), a federation of 17 independent government agencies that carry out intelligence activities, has released a study on the state of the world in 2040.
And the future is grim: the study warns of political volatility and growing international competition, or even conflict.
The report, titled "Global Trends 2040 – A More Contested World," is an attempt to analyze the major trends ahead, describing a series of possible scenarios.
It is the seventh report of its kind, published every four years by the National Intelligence Council since 1997.
It is not relaxing reading for political leaders or international diplomats, or for anyone hoping to become one in the coming years.
First, the report focuses on the key factors that will drive change.
One of them is political volatility.
"In many countries, people are pessimistic about the future and growing increasingly distrustful of leaders and institutions that they see as unable or unwilling to deal with disruptive economic, technological, and demographic trends," the report warns.
The study argues that people are gravitating toward like-minded groups and making greater and more varied demands on governments at a time when those same governments are increasingly constrained in what they can do.
"This mismatch between governments' abilities and the public's expectations is likely to expand and lead to more political volatility, including growing polarization and populism within political systems, waves of activism and protest movements, and, in the most extreme cases, violence, internal conflict, or even state collapse," the report says.
Unmet expectations, fueled by social media and technology, could create risks for democracy.
"Looking ahead, many democracies are likely to be vulnerable to erosion and even collapse," the text warns, adding that these pressures will also affect authoritarian regimes.
The pandemic, a 'great global disruption'
The report states that the current pandemic is the "most significant, singular global disruption since World War II," one that has fueled divisions, accelerated existing changes, and challenged assumptions, including about how governments can respond.
The previous report, from 2017, anticipated the possibility of a "global pandemic in 2023" drastically reducing global travel in order to contain its spread.
The authors acknowledge, however, that they did not expect the emergence of covid-19, which they say has "shaken long-held assumptions about resilience and adaptation and created new uncertainties about the economy, governance, geopolitics, and technology."
Climate and demographic change will also have a major impact on the world's future, as will technology, which can be disruptive but can also bring opportunities for those who use it effectively and early.
Internationally, the analysts expect the intensity of competition for global influence to reach its highest level since the Cold War over the next two decades, amid the continued weakening of the old order, as institutions such as the United Nations struggle.
Non-governmental organizations, including religious groups and the so-called "superstar technology companies," may also be able to build networks that compete with, or even bypass, states.
The risk of conflict may increase, and it will become harder to prevent the use of new weapons.
Jihadist terrorism is likely to continue, but the report warns that far-right and far-left terrorists promoting causes such as racism, environmentalism, and anti-government extremism could resurge in Europe, Latin America, and North America.
Such groups could use artificial intelligence to become more dangerous, or use augmented reality to create "virtual terrorist training camps."
Competition between the US and China sits at the heart of many of the differences between the scenarios: whether one of them becomes more successful, or whether the two compete on equal footing or divide the world into separate spheres of influence.
A 2004 report also predicted a caliphate emerging from the Middle East, much like the one the self-proclaimed Islamic State tried to create over the past decade, although that same study, looking ahead to 2020, failed to capture the competition with China that now dominates US security concerns.
The overall goal is to analyze possible futures rather than to get predictions right.
Stronger democracies or a 'world adrift'?
There are some optimistic scenarios for 2040. One of them is called "renaissance of democracies."
It envisions the US and its allies harnessing technology and economic growth to tackle domestic and international challenges, while crackdowns by China and Russia (including in Hong Kong) stifle innovation and strengthen the appeal of democracy.
But others are bleaker.
The "world adrift" scenario imagines market economies never recovering from the Covid pandemic, becoming deeply divided internally, and living in an international system that is "directionless, chaotic, and volatile" as international rules and institutions are ignored by countries, companies, and other groups.
One scenario, however, manages to combine pessimism with optimism.
"Tragedy and mobilization" envisions a world in the midst of a global catastrophe in the early 2030s, driven by climate change, famine, and unrest, which in turn gives rise to a new global coalition, propelled in part by social movements, to solve these problems.
Of course, none of these scenarios may come to pass, or, more likely, some combination of them, or something entirely new, may emerge. The goal, the authors say, is to prepare for a range of possible futures, even if many of them seem far from optimistic.
Large, expensive efforts to map the brain started a decade ago but have largely fallen short. It’s a good reminder of just how complex this organ is.
August 25, 2021
In September 2011, a group of neuroscientists and nanoscientists gathered at a picturesque estate in the English countryside for a symposium meant to bring their two fields together.
At the meeting, Columbia University neurobiologist Rafael Yuste and Harvard geneticist George Church made a not-so-modest proposal: to map the activity of the entire human brain at the level of individual neurons and detail how those cells form circuits. That knowledge could be harnessed to treat brain disorders like Alzheimer’s, autism, schizophrenia, depression, and traumatic brain injury. And it would help answer one of the great questions of science: How does the brain bring about consciousness?
Yuste, Church, and their colleagues drafted a proposal that would later be published in the journal Neuron. Their ambition was extreme: “a large-scale, international public effort, the Brain Activity Map Project, aimed at reconstructing the full record of neural activity across complete neural circuits.” Like the Human Genome Project a decade earlier, they wrote, the brain project would lead to “entirely new industries and commercial ventures.”
New technologies would be needed to achieve that goal, and that’s where the nanoscientists came in. At the time, researchers could record activity from just a few hundred neurons at once—but with around 86 billion neurons in the human brain, it was akin to “watching a TV one pixel at a time,” Yuste recalled in 2017. The researchers proposed tools to measure “every spike from every neuron” in an attempt to understand how the firing of these neurons produced complex thoughts.
But it wasn’t the first audacious brain venture. In fact, a few years earlier, Henry Markram, a neuroscientist at the École Polytechnique Fédérale de Lausanne in Switzerland, had set an even loftier goal: to make a computer simulation of a living human brain. Markram wanted to build a fully digital, three-dimensional model at the resolution of the individual cell, tracing all of those cells’ many connections. “We can do it within 10 years,” he boasted during a 2009 TED talk.
In January 2013, a few months before the American project was announced, the EU awarded Markram $1.3 billion to build his brain model. The US and EU projects sparked similar large-scale research efforts in countries including Japan, Australia, Canada, China, South Korea, and Israel. A new era of neuroscience had begun.
An impossible dream?
A decade later, the US project is winding down, and the EU project faces its deadline to build a digital brain. So how did it go? Have we begun to unwrap the secrets of the human brain? Or have we spent a decade and billions of dollars chasing a vision that remains as elusive as ever?
From the beginning, both projects had critics.
EU scientists worried about the costs of the Markram scheme and thought it would squeeze out other neuroscience research. And even at the original 2011 meeting in which Yuste and Church presented their ambitious vision, many of their colleagues argued it simply wasn’t possible to map the complex firings of billions of human neurons. Others said it was feasible but would cost too much money and generate more data than researchers would know what to do with.
In a blistering article appearing in Scientific American in 2013, Partha Mitra, a neuroscientist at the Cold Spring Harbor Laboratory, warned against the “irrational exuberance” behind the Brain Activity Map and questioned whether its overall goal was meaningful.
Even if it were possible to record all spikes from all neurons at once, he argued, a brain doesn’t exist in isolation: in order to properly connect the dots, you’d need to simultaneously record external stimuli that the brain is exposed to, as well as the behavior of the organism. And he reasoned that we need to understand the brain at a macroscopic level before trying to decode what the firings of individual neurons mean.
Others had concerns about the impact of centralizing control over these fields. Cornelia Bargmann, a neuroscientist at Rockefeller University, worried that it would crowd out research spearheaded by individual investigators. (Bargmann was soon tapped to co-lead the BRAIN Initiative’s working group.)
While the US initiative sought input from scientists to guide its direction, the EU project was decidedly more top-down, with Markram at the helm. But as Noah Hutton documents in his 2020 film In Silico, Markram’s grand plans soon unraveled. As an undergraduate studying neuroscience, Hutton had been assigned to read Markram’s papers and was impressed by his proposal to simulate the human brain; when he started making documentary films, he decided to chronicle the effort. He soon realized, however, that the billion-dollar enterprise was characterized more by infighting and shifting goals than by breakthrough science.
In Silico shows Markram as a charismatic leader who needed to make bold claims about the future of neuroscience to attract the funding to carry out his particular vision. But the project was troubled from the outset by a major issue: there isn’t a single, agreed-upon theory of how the brain works, and not everyone in the field agreed that building a simulated brain was the best way to study it. It didn’t take long for those differences to arise in the EU project.
In 2014, hundreds of experts across Europe penned a letter citing concerns about oversight, funding mechanisms, and transparency in the Human Brain Project. The scientists felt Markram’s aim was premature and too narrow and would exclude funding for researchers who sought other ways to study the brain.
“What struck me was, if he was successful and turned it on and the simulated brain worked, what have you learned?” Terry Sejnowski, a computational neuroscientist at the Salk Institute who served on the advisory committee for the BRAIN Initiative, told me. “The simulation is just as complicated as the brain.”
The Human Brain Project’s board of directors voted to change its organization and leadership in early 2015, replacing a three-member executive committee led by Markram with a 22-member governing board. Christoph Ebell, a Swiss entrepreneur with a background in science diplomacy, was appointed executive director. “When I took over, the project was at a crisis point,” he says. “People were openly wondering if the project was going to go forward.”
But a few years later he was out too, after a “strategic disagreement” with the project’s host institution. The project is now focused on providing a new computational research infrastructure to help neuroscientists store, process, and analyze large amounts of data—unsystematic data collection has been an issue for the field—and develop 3D brain atlases and software for creating simulations.
The US BRAIN Initiative, meanwhile, underwent its own changes. Early on, in 2014, responding to the concerns of scientists and acknowledging the limits of what was possible, it evolved into something more pragmatic, focusing on developing technologies to probe the brain.
Those changes have finally started to produce results—even if they weren’t the ones that the founders of each of the large brain projects had originally envisaged.
And earlier this year Alipasha Vaziri, a neuroscientist funded by the BRAIN Initiative, and his team at Rockefeller University reported in a preprint paper that they’d simultaneously recorded the activity of more than a million neurons across the mouse cortex. It’s the largest recording of animal cortical activity yet made, if far from listening to all 86 billion neurons in the human brain as the original Brain Activity Map hoped.
The US effort has also shown some progress in its attempt to build new tools to study the brain. It has speeded the development of optogenetics, an approach that uses light to control neurons, and its funding has led to new high-density silicon electrodes capable of recording from hundreds of neurons simultaneously. And it has arguably accelerated the development of single-cell sequencing. In September, researchers using these advances will publish a detailed classification of cell types in the mouse and human motor cortexes—the biggest single output from the BRAIN Initiative to date.
While these are all important steps forward, though, they’re far from the initial grand ambitions.
We are now heading into the last phase of these projects—the EU effort will conclude in 2023, while the US initiative is expected to have funding through 2026. What happens in these next years will determine just how much impact they’ll have on the field of neuroscience.
When I asked Ebell what he sees as the biggest accomplishment of the Human Brain Project, he didn’t name any one scientific achievement. Instead, he pointed to EBRAINS, a platform launched in April of this year to help neuroscientists work with neurological data, perform modeling, and simulate brain function. It offers researchers a wide range of data and connects many of the most advanced European lab facilities, supercomputing centers, clinics, and technology hubs in one system.
“If you ask me ‘Are you happy with how it turned out?’ I would say yes,” Ebell said. “Has it led to the breakthroughs that some have expected in terms of gaining a completely new understanding of the brain? Perhaps not.”
Katrin Amunts, a neuroscientist at the University of Düsseldorf, who has been the Human Brain Project’s scientific research director since 2016, says that while Markram’s dream of simulating the human brain hasn’t been realized yet, it is getting closer. “We will use the last three years to make such simulations happen,” she says. But it won’t be a big, single model—instead, several simulation approaches will be needed to understand the brain in all its complexity.
Meanwhile, the BRAIN Initiative has provided more than 900 grants to researchers so far, totaling around $2 billion. The National Institutes of Health is projected to spend nearly $6 billion on the project by the time it concludes.
For the final phase of the BRAIN Initiative, scientists will attempt to understand how brain circuits work by diagramming connected neurons. But claims for what can be achieved are far more restrained than in the project’s early days. The researchers now realize that understanding the brain will be an ongoing task—it’s not something that can be finalized by a project’s deadline, even if that project meets its specific goals.
“With a brand-new tool or a fabulous new microscope, you know when you’ve got it. If you’re talking about understanding how a piece of the brain works or how the brain actually does a task, it’s much more difficult to know what success is,” says Eve Marder, a neuroscientist at Brandeis University. “And success for one person would be just the beginning of the story for another person.”
Yuste and his colleagues were right that new tools and techniques would be needed to study the brain in a more meaningful way. Now, scientists will have to figure out how to use them. But instead of answering the question of consciousness, developing these methods has, if anything, only opened up more questions about the brain—and shown just how complex it is.
“I have to be honest,” says Yuste. “We had higher hopes.”
Emily Mullin is a freelance journalist based in Pittsburgh who focuses on biotechnology.
In May, I argued here at MIT Technology Review Brasil that "Brazil has a chance to lead the race for the metaverse." In just three months a great deal has happened: the metaverse has become an increasingly common term in the media and, above all, a new strategy for tech giants. The term was mentioned by CEOs in several recent second-quarter earnings calls. Mark Zuckerberg of Facebook, Satya Nadella of Microsoft, David Baszucki of Roblox, and Shar Dubey of Match Group all said the metaverse would guide their companies' strategies.
From Silicon Valley to Shenzhen, technology companies are raising their bets on the sector. For the uninitiated, "the metaverse is the term used to describe a type of virtual world that attempts to replicate reality through digital devices. It is a shared, collective virtual space, formed by the sum of 'virtual reality,' 'augmented reality,' and the 'Internet,'" as the term's Wikipedia page puts it. The expression was coined by writer Neal Stephenson in his 1992 novel "Snow Crash." Later, Ernest Cline used the same concept to create the Oasis in his novel "Ready Player One," which became a Steven Spielberg film.
Mark Zuckerberg, founder and CEO of Facebook, seems to have become the metaverse's latest convert. He has given a series of recent interviews saying Facebook will bet its future on the metaverse. "We are going to transition from being seen primarily as a social media company to being a metaverse company," Zuckerberg said.
In July, Facebook said it was creating a product team to work on the metaverse as part of its AR and VR group, Facebook Reality Labs. A few days ago we got a demonstration of what is to come: Facebook invited a group of journalists to try its Horizon Workrooms. The app is the social network's first attempt to build a Virtual Reality experience specifically for people to work together.
According to journalist Alex Heath, who took part in the demo, up to 16 people in VR can share a workroom, while another 34 people can join via video call without wearing a headset. A companion desktop app lets you live-stream your computer screen onto your virtual desk. Thanks to hand tracking and front-facing cameras, a virtual representation of your physical keyboard sits below the screen for typing in a simple web app Facebook built for taking notes and managing calendars. In other words, you step into a virtual world to hold meetings with your colleagues.
Facebook is unlikely to lead the metaverse
Zuckerberg has been talking about virtual reality for years. Back in 2014, when Facebook bought Oculus for US$ 2 billion, he enthused that the acquisition would enable immersive virtual experiences in which you would feel "present in another place with other people." In a sense, the metaverse is a sequel to plans Facebook set in motion almost a decade ago.
Facebook is a giant player to be reckoned with, but my bet is that it will not win the race for the metaverse. Just as IBM did not become the leader in personal computers or the cloud, just as Google never managed to build a solid presence in social networks or instant messaging, and just as neither Microsoft nor, far less, Nokia became leaders in smartphones, Facebook, for all its enthusiasm, is unlikely to lead this race.
Essentially, this is because even with the will and the resources, market leaders usually lack the culture to operate in these new markets. And I am not saying Facebook will be an irrelevant player, far from it. The billions of dollars the company has already invested in developing the Oculus Quest, and all the hardware technology built for virtual reality (and, by extension, the metaverse), are impressive and have led to undeniable advances.
"The metaverse, a technologist's dream, is Facebook's nightmare. It would make the social network irrelevant," said Scott Galloway, a marketing professor. "Facebook's most valuable asset is its social graph, its dataset of users, links between users, and their shared content. In a future metaverse, we will all have metaverse identities, and anyone can open a virtual space to share photos of their kid's 10th birthday party or argue about vaccines," he concludes.
Who has potential in the metaverse?
From a Western standpoint, I would put my chips on Roblox and Epic Games as the new leaders of the metaverse at large. In enterprise applications, the advantage would go to Microsoft.
From a hardware/software perspective, Nvidia and Apple have the edge because they can already design their own chips (Facebook buys off-the-shelf chips from Qualcomm). A vast library of Artificial Intelligence chips and the software needed to run them are also essential pieces of the metaverse.
On the other side of the world, Tencent, Bytedance, and Sea are robust competitors, but the first two face growing Chinese regulation, and the third is focused on building a competitive e-commerce business in Asia.
Microsoft has a big advantage, and not only because of its gigantic community of developers building corporate solutions and its strong presence in the corporate world. Microsoft is also bringing cloud gaming to its Xbox consoles. Soon, Xbox Game Pass Ultimate subscribers on Xbox Series X/S and Xbox One consoles will be able to stream more than 100 games without downloading them. According to Microsoft, the service will run at 1080p and 60 frames per second. Xbox Cloud Gaming became available on mobile devices and PC in June 2021. Microsoft also announced this week that the next chapter of the popular Halo series, Halo Infinite, will launch on December 8, 2021.
The power of community
Microsoft has been developing mixed-reality hardware for corporate applications for years. Its HoloLens is among the most widely used on the market. Mixed reality, or hybrid reality, is the technology that combines features of virtual reality with augmented reality. It inserts virtual objects into the real world and lets users interact with them, producing new environments in which physical and virtual items coexist and interact in real time.
Last year, Nvidia launched its Omniverse platform "to connect 3D worlds in a shared virtual universe." CEO Jensen Huang used the company's biggest annual conference, in October, to publicly credit Stephenson's "Snow Crash" as the original inspiration for the concept of a virtual-reality successor to the Internet, declaring that "the metaverse is coming."
But what will define the winners of the metaverse is not just money, the will to lead this movement, or a company's intellectual property. It is the ability to engage communities, whether so that people can gather in the metaverse or so that they can develop the experiences of that digital environment, that will create the winners.
Games, Netflix, and where we spend our time
Games are an essential part of the metaverse, but the metaverse will not be limited to them. They are just the gateway, a first step in that direction. Reed Hastings, CEO of Netflix, has said that Netflix "competes with (and loses to) Fortnite more than HBO." Netflix even recently announced that starting in 2022 it will enter the gaming segment, offering games in its app.
As essayist Matthew Ball points out, the games market is huge and growing fast, but that is not the only reason for Netflix's move into games. "While it's common to hear that 'gaming is now four times the size of the global box office,' the box office is less than 1/15 of total video revenue globally. In other words, games will probably take in about US$ 180 billion in 2021, while video will exceed US$ 650 billion," Ball says. In the war for consumer attention, video games and the metaverse have enormous potential, and games revenue shows that this is still a fairly incipient market compared with video as a whole.
It is worth remembering that in 2021 alone, Netflix is expected to invest US$ 19 billion in producing original content. Even so, the company has been losing subscribers in the United States and Canada. The arrival of HBO Max, Paramount+, and several new competitors helps explain the decline, but games are also a factor to consider. At the end of the day, Netflix is in the business of selling entertainment, and staying close to the games industry is not a bad idea.
Our children, our future
But just like Facebook, while Netflix has plenty of money and the will (and need) to hold our attention, it lacks the developer community needed for a meaningful entry into the metaverse. Looking at Roblox makes it easier to understand how this element applies.
Roblox is much more than a game; it is a platform where anyone can create a game (or experience). Today, more than 8 million developers are building these experiences. There are more than 20 million experiences, ranging from adopting a pet in Adopt Me! to learning about history on a virtual tour of the Colosseum.
Since 2008, when the platform launched, users have spent more than 30.6 billion hours engaged in the game. In the second quarter, Roblox's revenue grew 127% over the second quarter of 2020, reaching US$ 454.1 million. Average daily active users (DAUs) reached 43.2 million, up 29% year over year.
Note the irony: while Facebook and Netflix have stagnated in user growth, Roblox keeps expanding its base even as the pandemic's easing reduces social isolation and allows many people to return to their routines.
But the big numbers from Roblox and Epic Games (owner of Fortnite, which is privately held and does not disclose figures the way Roblox does) are probably the least interesting aspect of the possibilities they offer.
The metaverse is the new third place
As I wrote here at MIT Tech Review when discussing the impact of games on e-commerce, the growth of electronic games is directly linked to their transformation into a "third place." The term was coined by sociologist Ray Oldenburg and refers to places where people spend time between home (the "first" place) and work (the "second" place). They are spaces where people exchange ideas, have fun, and build relationships. Churches, cafés, and parks are examples of third places. Having a third place to socialize outside home and work is crucial to well-being, because it brings a sense of connection and belonging. And video games are increasingly a third place. Historically, community activities and development happened offline, but thanks to advances in technology, video games have become social.
Not by chance, concerts and events inside Roblox and Fortnite are increasingly frequent (Travis Scott drew thousands of people, and director Christopher Nolan held a premiere). Brands have invested heavily to enter this universe. With an eye on the 43 million users who access Roblox daily, Netflix announced in July a new virtual hangout based on the series Stranger Things. More recently, Roblox announced the launch of Vans World, an interactive skateboarding metaverse from the Vans brand inside the game world. Inspired by the brand's locations, such as the House of Vans and other skateboarding destinations, Vans World is a continuous 3D space where fans can practice their tricks with others and try out the brand's gear.
"Roblox is the new social hangout, much like the local shopping mall in the 1980s, where teenagers gathered," says Christina Wootton, Roblox's vice president of brand partnerships. "The virtual Starcourt Mall is a similar setting reimagined inside Roblox that opens up unique possibilities for engaging and growing the show's global audience."
It is worth watching David Baszucki's presentation from February. In it, the Roblox CEO details the company's growth strategy and its potential to create experiences, including educational and commercial ones, with a growing community.
Brazil can be a protagonist in the metaverse
From time to time, the stars align in a way that can benefit a market, and Brazil may now be facing just such an opportunity. In China, the government is creating an increasingly hostile environment for companies and developers. In the United States, there is money and user scale, but engagement and labor are in short supply; it is hard to bet on the metaverse in a country with more jobs than candidates. In Europe there are developers, especially in Eastern Europe, but fragmentation is enormous.
Enrico Machado, a Brazilian who develops for Roblox, exemplifies the potential of the thousands of users who have grown up on the platform since childhood. He started playing Roblox at age 11. By 15, he was a developer. Today, in college, he studies information systems and works at a large Brazilian studio developing games exclusively for Roblox.
"Roblox is really popular. It runs on microtransactions. You can buy things in the games people create, and the developers make money from that. Today there are a lot of people making absurd money. It's like the soccer market: you have a few guys at the top of the pyramid. For every Neymar there are millions of people who would like to be Neymar; the relationship is similar. But anyone can earn reasonable money," says Machado.
He maintains that it is not very hard to earn decent money on the platform.
"There are a lot of consumers wanting to play. So if you understand the basics of community, design, games, and programming, you can start from zero and begin earning some money in a short time if you focus on it."
Machado works at a studio alongside dozens of other developers. "At the studio we hold meetings and so on to apply best practices across all the games. I'm learning a lot from them. I know how to program, I can make a nice little game, but I don't know anything about game design. I don't know how to make a hit game. You know best practices exist, but with a bigger group it gets easier. Knowing those practices is as important as knowing how to program," he says.
Millions of developers coming together
His is not an isolated case. Like Machado, thousands of young people in Brazil are working in huge studios developing for Roblox. And unlike other languages, the one Roblox uses is accessible and easy to learn. What's more, you don't need a super-powerful computer or an ultra-fast connection.
It is no accident that Brazil is already the fifth-largest games market in the world, with one of the planet's biggest user communities, a growing streaming market, and e-sports icons such as Nobru.
Wildlife, a Brazilian unicorn valued at more than US$ 1.3 billion, already has more than 800 employees in countries including Brazil, the United States, Argentina, and Ireland. Founded in 2011, the company has more than 60 mobile games.
The metaverse needs technology and software, but the decisive factor is an engaged community of developers and users. That is why Roblox and Fortnite are out in front. Brazil, for its part, has all the elements to be a global leader in this sector. But nothing guarantees it will happen. Montreal, in Canada, offers clues about how we could accelerate the process by creating incentives to attract and bring together companies, developers, and investment. But that will be the subject of my next column.
The metaverse is poised to become the next Internet, and many of today's giants will lose influence. But just as the Internet created a new industry, with new jobs and new billionaires, the metaverse will repeat that story, possibly on an even larger scale. It is ironic that Stephenson told Vanity Fair in 2017 that when he was writing "Snow Crash" and creating the metaverse, he was "just making shit up." Decades later, CEOs are taking that "invention" ever more seriously.
This article was written by Guilherme Ravache, a journalist, digital consultant, and columnist for MIT Technology Review Brasil.