A remarkable new study by a director at one of the largest accounting firms in the world has found that a famous, decades-old warning from MIT about the risk of industrial civilization collapsing appears to be accurate based on new empirical data.
As the world looks forward to a rebound in economic growth following the devastation wrought by the pandemic, the research raises urgent questions about the risks of attempting to simply return to the pre-pandemic ‘normal.’
In 1972, a team of MIT scientists got together to study the risks of civilizational collapse. Their system dynamics model published by the Club of Rome identified impending ‘limits to growth’ (LtG) that meant industrial civilization was on track to collapse sometime within the 21st century, due to overexploitation of planetary resources.
The controversial MIT analysis generated heated debate, and was widely derided at the time by pundits who misrepresented its findings and methods. But the analysis has now received stunning vindication from a study written by a senior director at professional services giant KPMG, one of the ‘Big Four’ accounting firms as measured by global revenue.
Limits to growth
The study was published in the Yale Journal of Industrial Ecology in November 2020 and is available on the KPMG website. It concludes that the current business-as-usual trajectory of global civilization is heading toward the terminal decline of economic growth within the coming decade—and at worst, could trigger societal collapse by around 2040.
The study represents the first time a top analyst working within a mainstream global corporate entity has taken the ‘limits to growth’ model seriously. Its author, Gaya Herrington, is Sustainability and Dynamic System Analysis Lead at KPMG in the United States. However, she decided to undertake the research as a personal project to understand how well the MIT model stood the test of time.
The study itself was not affiliated with or conducted on behalf of KPMG, and does not necessarily reflect the firm's views. Herrington performed the research as an extension of her Master's thesis at Harvard University, in her capacity as an advisor to the Club of Rome. However, she is quoted explaining her project on the KPMG website as follows:
“Given the unappealing prospect of collapse, I was curious to see which scenarios were aligning most closely with empirical data today. After all, the book that featured this world model was a bestseller in the 70s, and by now we’d have several decades of empirical data which would make a comparison meaningful. But to my surprise I could not find recent attempts for this. So I decided to do it myself.”
Titled ‘Update to limits to growth: Comparing the World3 model with empirical data’, the study attempts to assess how MIT’s ‘World3’ model stacks up against new empirical data. Previous studies that attempted to do this found that the model’s worst-case scenarios accurately reflected real-world developments. However, the last study of this nature was completed in 2014.
The risk of collapse
Herrington’s new analysis examines data across 10 key variables, namely population, fertility rates, mortality rates, industrial output, food production, services, non-renewable resources, persistent pollution, human welfare, and ecological footprint. She found that the latest data most closely aligns with two particular scenarios, ‘BAU2’ (business-as-usual) and ‘CT’ (comprehensive technology).
“BAU2 and CT scenarios show a halt in growth within a decade or so from now,” the study concludes. “Both scenarios thus indicate that continuing business as usual, that is, pursuing continuous growth, is not possible. Even when paired with unprecedented technological development and adoption, business as usual as modelled by LtG would inevitably lead to declines in industrial capital, agricultural output, and welfare levels within this century.”
Study author Gaya Herrington told Motherboard that in the MIT World3 models, collapse “does not mean that humanity will cease to exist,” but rather that “economic and industrial growth will stop, and then decline, which will hurt food production and standards of living… In terms of timing, the BAU2 scenario shows a steep decline to set in around 2040.”
The ‘Business-as-Usual’ scenario (Source: Herrington, 2021)
The end of growth?
In the comprehensive technology (CT) scenario, economic decline still sets in around this date with a range of possible negative consequences, but this does not lead to societal collapse.
The ‘Comprehensive Technology’ scenario (Source: Herrington, 2021)
Unfortunately, the scenario that fits the latest empirical data least well happens to be the most optimistic pathway, known as ‘SW’ (stabilized world), in which civilization follows a sustainable path and experiences the smallest declines in economic growth, based on a combination of technological innovation and widespread investment in public health and education.
The ‘Stabilized World’ Scenario (Source: Herrington, 2021)
Although both the business-as-usual and comprehensive technology scenarios point to the coming end of economic growth in around 10 years, only the BAU2 scenario “shows a clear collapse pattern, whereas CT suggests the possibility of future declines being relatively soft landings, at least for humanity in general.”
Both scenarios currently “seem to align quite closely” with observed data, Herrington concludes in her study, indicating that the future remains open.
A window of opportunity
While pursuing continued economic growth for its own sake will be futile, the study finds that technological progress and increased investments in public services could not only avoid the risk of collapse but lead to a new, stable and prosperous civilization operating safely within planetary boundaries. But we really have only the next decade to change course.
“At this point therefore, the data most aligns with the CT and BAU2 scenarios which indicate a slowdown and eventual halt in growth within the next decade or so, but World3 leaves open whether the subsequent decline will constitute a collapse,” the study concludes. Although the ‘stabilized world’ scenario “tracks least closely, a deliberate trajectory change brought about by society turning toward another goal than growth is still possible. The LtG work implies that this window of opportunity is closing fast.”
In a presentation at the World Economic Forum in 2020 delivered in her capacity as a KPMG director, Herrington argued for ‘agrowth’—an agnostic approach to growth which focuses on other economic goals and priorities.
“Changing our societal priorities hardly needs to be a capitulation to grim necessity,” she said. “Human activity can be regenerative and our productive capacities can be transformed. In fact, we are seeing examples of that happening right now. Expanding those efforts now creates a world full of opportunity that is also sustainable.”
She noted how the rapid development and deployment of vaccines at unprecedented rates in response to the COVID-19 pandemic demonstrates that we are capable of responding rapidly and constructively to global challenges if we choose to act. We need exactly such a determined approach to the environmental crisis.
“The necessary changes will not be easy and pose transition challenges but a sustainable and inclusive future is still possible,” said Herrington.
The best available data suggests that what we decide over the next 10 years will determine the long-term fate of human civilization. Although the odds are on a knife-edge, Herrington pointed to a “rapid rise” in environmental, social and governance (ESG) priorities as a basis for optimism, signalling the change in thinking taking place in both governments and businesses. She told me that perhaps the most important implication of her research is that it’s not too late to create a truly sustainable civilization that works for all.
Experts say it could spur conflict with a neighboring country.
This week, the Chinese government announced that it plans to drastically increase its use of technology that artificially changes the weather.
Cloud seeding technology, or systems that blast silver iodide particles into the sky to prompt condensation and cloud formation, has been around for decades, and China makes frequent use of it. But now, CNN reports that China wants to increase the total size of its weather modification test area to 5.5 million square kilometers by 2025 — a huge increase, and an area larger than that of the entire country of India, which could affect the environment on an epic scale and even potentially spur conflict with nearby countries.
Fog Of War
Most notably, China and India share a hotly disputed border that they’ve violently clashed over as recently as this year, CNN has previously reported. India’s agriculture relies on a monsoon season that’s already grown unpredictable due to climate change, prompting experts in the country to worry that China may use its ability to control rain and snowfall as a weapon.
“Lack of proper coordination of weather modification activity (could) lead to charges of ‘rain stealing’ between neighboring regions,” National Taiwan University researchers conclude in a 2017 paper published in Geoforum.
In the past, China has used its weather modification tech to seed clouds well in advance of major events like the 2008 Olympics and political meetings so the events themselves happen under clear skies, CNN reports.
But this planned expansion of the system means that other countries may be subject to its meteorological whims — seeding international conflict in addition to clouds.
Facing a hotter future, dwindling water sources and an exploding population, scientists in one Middle East country are making it rain.
United Arab Emirates meteorological officials released a video this week of cars driving through a downpour in Ras al Khaimah in the northern part of the country. The storm was the result of one of the UAE’s newest efforts to increase rainfall in a desert nation that gets about four inches a year on average.
Washington, D.C., in contrast, has averaged nearly 45 inches of rain annually for the past decade.
Scientists created rainstorms by launching drones, which then zapped clouds with electricity, the Independent reports. Jolting droplets in the clouds can cause them to clump together, researchers found. The larger raindrops that result then fall to the ground, instead of evaporating midair — which is often the fate of smaller droplets in the UAE, where temperatures are hot and the clouds are high.
“What we are trying to do is to make the droplets inside the clouds big enough so that when they fall out of the cloud, they survive down to the surface,” meteorologist and researcher Keri Nicoll told CNN in May as her team prepared to start testing the drones near Dubai.
Nicoll is part of a team of scientists with the University of Reading in England whose research led to this week’s man-made rainstorms. In 2017, the university’s scientists received $1.5 million for use over three years from the UAE Research Program for Rain Enhancement Science, which has invested in at least nine different research projects over the past five years.
To test their research, Nicoll and her team built four drones with wingspans of about 6½ feet. The drones, which are launched from a catapult, can fly for about 40 minutes, CNN reported. During flight, the drone’s sensors measure temperature, humidity and electrical charge within a cloud, which lets the researchers know when and where they need to zap.
Water is a big issue in the UAE. The country uses about 4 billion cubic meters of it each year but has access to about 4 percent of that in renewable water resources, according to the CIA. The number of people living in the UAE has skyrocketed in recent years, doubling to 8.3 million between 2005 and 2010, which helps explain why demand for water spiked by a third around that time, according to the government’s 2015 “State of Environment” report. The population kept surging over the next decade and is now 9.9 million.
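The imbalance described above is stark enough to sketch in a few lines of arithmetic. This is an illustrative back-of-envelope calculation using the rounded figures quoted in this article (CIA and UAE government estimates); the variable names are my own, not from any source:

```python
# Back-of-envelope UAE water balance, using the rounded figures quoted above.
annual_use_m3 = 4e9        # ~4 billion cubic meters used per year
renewable_share = 0.04     # ~4 percent of that use is available renewably
population = 9.9e6         # current population

renewable_m3 = annual_use_m3 * renewable_share   # ~0.16 billion m3 per year
shortfall_m3 = annual_use_m3 - renewable_m3      # ~3.84 billion m3 to be met otherwise
per_capita_m3 = annual_use_m3 / population       # ~404 m3 per person per year

print(round(renewable_m3 / 1e9, 2), round(shortfall_m3 / 1e9, 2), round(per_capita_m3))
```

The shortfall of nearly 3.9 billion cubic meters a year is the gap that desalination, and now rain enhancement, are being asked to fill.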
“The water table is sinking drastically in [the] UAE,” University of Reading professor and meteorologist Maarten Ambaum told BBC News, “and the purpose of this [project] is to try to help with rainfall.”
It usually rains just a few days out of the year in the UAE. During the summer, there’s almost no rainfall. Temperatures there recently topped 125 degrees.
In recent years, the UAE’s massive push into desalination technology — which transforms seawater into freshwater by removing the salt — has helped close the gap between the demand for water and supply. Most of the UAE’s drinkable water, and 42 percent of all water used in the country, comes from its roughly 70 desalination plants, according to the UAE government.
Still, part of the government’s “water security strategy” is to lower demand by 21 percent in the next 15 years.
Ideas to get more water for the UAE have not lacked imagination. In 2016, The Washington Post reported government officials were considering building a mountain to create rainfall. As moist air reaches a mountain, it is forced upward, cooling as it rises. The air can then condense and turn into liquid, which falls as rain.
Estimates for another mountain-building project in the Netherlands came in as high as $230 billion.
Other ideas for getting more water to the UAE have included building a pipeline from Pakistan and floating icebergs down from the Arctic.
Floods swept Germany, fires ravaged the American West and another heat wave loomed, driving home the reality that the world’s richest nations remain unprepared for the intensifying consequences of climate change.
July 17, 2021
Some of Europe’s richest countries lay in disarray this weekend, as raging rivers burst through their banks in Germany and Belgium, submerging towns, slamming parked cars against trees and leaving Europeans shellshocked at the intensity of the destruction.
Only days before in the Northwestern United States, a region famed for its cool, foggy weather, hundreds had died of heat. In Canada, wildfire had burned a village off the map. Moscow reeled from record temperatures. And this weekend the northern Rocky Mountains were bracing for yet another heat wave, as wildfires spread across 12 states in the American West.
The extreme weather disasters across Europe and North America have driven home two essential facts of science and history: The world as a whole is neither prepared to slow down climate change nor to live with it. The week’s events have ravaged some of the world’s wealthiest nations, whose affluence has been enabled by more than a century of burning coal, oil and gas — activities that pumped the greenhouse gases into the atmosphere that are warming the world.
“I say this as a German: The idea that you could possibly die from weather is completely alien,” said Friederike Otto, a physicist at Oxford University who studies the links between extreme weather and climate change. “There’s not even a realization that adaptation is something we have to do right now. We have to save people’s lives.”
The floods in Europe have killed at least 165 people, most of them in Germany, Europe’s most powerful economy. Across Germany, Belgium, and the Netherlands, hundreds have been reported as missing, which suggests the death toll could rise. Questions are now being raised about whether the authorities adequately warned the public about risks.
The bigger question is whether the mounting disasters in the developed world will have a bearing on what the world’s most influential countries and companies will do to reduce their own emissions of planet-warming gases. The disasters come a few months ahead of United Nations-led climate negotiations in Glasgow in November, effectively a moment of reckoning for whether the nations of the world will be able to agree on ways to rein in emissions enough to avert the worst effects of climate change.
Disasters magnified by global warming have left a long trail of death and loss across much of the developing world, after all, wiping out crops in Bangladesh, leveling villages in Honduras, and threatening the very existence of small island nations. Typhoon Haiyan devastated the Philippines in the run-up to climate talks in 2013, which prompted developing-country representatives to press for funding to deal with the loss and damage they face over time from climate-induced disasters that they weren’t responsible for. That push was rejected by richer countries, including the United States and European nations.
“Extreme weather events in developing countries often cause great death and destruction — but these are seen as our responsibility, not something made worse by more than a hundred years of greenhouse gases emitted by industrialized countries,” said Ulka Kelkar, climate director at the India office of the World Resources Institute. These intensifying disasters now striking richer countries, she said, show that developing countries seeking the world’s help to fight climate change “have not been crying wolf.”
Indeed, even since the 2015 Paris Agreement was negotiated with the goal of averting the worst effects of climate change, global emissions have kept increasing. China is the world’s biggest emitter today. Emissions have been steadily declining in both the United States and Europe, but not at the pace required to limit global temperature rise.
A reminder of the shared costs came from Mohamed Nasheed, the former president of the Maldives, an island nation at acute risk from sea level rise.
“While not all are affected equally, this tragic event is a reminder that, in the climate emergency, no one is safe, whether they live on a small island nation like mine or a developed Western European state,” Mr. Nasheed said in a statement on behalf of a group of countries that call themselves the Climate Vulnerable Forum.
The ferocity of these disasters is as notable as their timing, coming ahead of the global talks in Glasgow to try to reach agreement on fighting climate change. The world has a poor track record on cooperation so far, and, this month, new diplomatic tensions emerged.
Among major economies, the European Commission last week introduced the most ambitious road map for change. It proposed laws to ban the sale of gas and diesel cars by 2035, require most industries to pay for the emissions they produce, and most significantly, impose a tax on imports from countries with less stringent climate policies.
But those proposals are widely expected to meet vigorous objections both from within Europe and from other countries whose businesses could be threatened by the proposed carbon border tax, potentially further complicating the prospects for global cooperation in Glasgow.
The events of this summer come after decades of neglect of science. Climate models have warned of the ruinous impact of rising temperatures. An exhaustive scientific assessment in 2018 warned that a failure to keep the average global temperature from rising past 1.5 degrees Celsius, compared to the start of the industrial age, could usher in catastrophic results, from the inundation of coastal cities to crop failures in various parts of the world.
The report offered world leaders a practical, albeit narrow path out of chaos. It required the world as a whole to halve emissions by 2030. Since then, however, global emissions have continued rising, so much so that global average temperature has increased by more than 1 degree Celsius (about 2 degrees Fahrenheit) since 1880, narrowing the path to keep the increase below the 1.5 degree Celsius threshold.
As the average temperature has risen, it has heightened the frequency and intensity of extreme weather events in general. In recent years, scientific advances have pinpointed the degree to which climate change is responsible for specific events.
And even though it will take extensive scientific analysis to link climate change to last week’s cataclysmic floods in Europe, a warmer atmosphere holds more moisture and is already causing heavier rainfall in many storms around the world. There is little doubt that extreme weather events will continue to be more frequent and more intense as a consequence of global warming. A paper published Friday projected a significant increase in slow-moving but intense rainstorms across Europe by the end of this century because of climate change.
“We’ve got to adapt to the change we’ve already baked into the system and also avoid further change by reducing our emissions, by reducing our influence on the climate,” said Richard Betts, a climate scientist at the Met Office in Britain and a professor at the University of Exeter.
That message clearly hasn’t sunk in among policymakers, and perhaps the public as well, particularly in the developed world, which has maintained a sense of invulnerability.
The result is a lack of preparation, even in countries with resources. In the United States, flooding has killed more than 1,000 people since 2010 alone, according to federal data. In the Southwest, heat deaths have spiked in recent years.
Sometimes that is because governments have scrambled to respond to disasters they haven’t experienced before, like the heat wave in Western Canada last month, according to Jean Slick, head of the disaster and emergency management program at Royal Roads University in British Columbia. “You can have a plan, but you don’t know that it will work,” Ms. Slick said.
Other times, it’s because there aren’t political incentives to spend money on adaptation.
“By the time they build new flood infrastructure in their community, they’re probably not going to be in office anymore,” said Samantha Montano, a professor of emergency management at the Massachusetts Maritime Academy. “But they are going to have to justify millions, billions of dollars being spent.”
By Carolyn Gramling, July 9, 2021 at 6:00 am
Massive projects need much more planning and follow-through to succeed – and other tree protections need to happen too
Trees are symbols of hope, life and transformation. They’re also increasingly touted as a straightforward, relatively inexpensive, ready-for-prime-time solution to climate change.
When it comes to removing human-caused emissions of the greenhouse gas carbon dioxide from Earth’s atmosphere, trees are a big help. Through photosynthesis, trees pull the gas out of the air to help grow their leaves, branches and roots. Forest soils can also sequester vast reservoirs of carbon.
Earth holds, by one estimate, as many as 3 trillion trees. Enthusiasm is growing among governments, businesses and individuals for ambitious projects to plant billions, even a trillion more. Such massive tree-planting projects, advocates say, could do two important things: help offset current emissions and also draw out CO2 emissions that have lingered in the atmosphere for decades or longer.
Even in the politically divided United States, large-scale tree-planting projects have broad bipartisan support, according to a spring 2020 poll by the Pew Research Center. And over the last decade, a diverse garden of tree-centric proposals — from planting new seedlings to promoting natural regrowth of degraded forests to blending trees with crops and pasturelands — has sprouted across the international political landscape.
Trees “are having a bit of a moment right now,” says Joe Fargione, an ecologist with The Nature Conservancy who is based in Minneapolis. It helps that everybody likes trees. “There’s no anti-tree lobby. [Trees] have lots of benefits for people. Not only do they store carbon, they help provide clean air, prevent soil erosion, shade and shelter homes to reduce energy costs and give people a sense of well-being.”
Conservationists are understandably eager to harness this enthusiasm to combat climate change. “We’re tapping into the zeitgeist,” says Justin Adams, executive director of the Tropical Forest Alliance at the World Economic Forum, an international nongovernmental organization based in Geneva. In January 2020, the World Economic Forum launched the One Trillion Trees Initiative, a global movement to grow, restore and conserve trees around the planet. One trillion is also the target for other organizations that coordinate global forestation projects, such as Plant-for-the-Planet’s Trillion Tree Campaign and Trillion Trees, a partnership of the World Wildlife Fund, the Wildlife Conservation Society and other conservation groups.
A carbon-containing system
Forests store carbon aboveground and below. That carbon returns to the atmosphere by microbial activity in the soil, or when trees are cut down and die.
SOURCE: MINNESOTA BOARD OF WATER AND SOIL RESOURCES 2019; images: T. Tibbitts
Yet, as global eagerness for adding more trees grows, some scientists are urging caution. Before moving forward, they say, such massive tree projects must address a range of scientific, political, social and economic concerns. Poorly designed projects that don’t address these issues could do more harm than good, the researchers say, wasting money as well as political and public goodwill. The concerns are myriad: There’s too much focus on numbers of seedlings planted, and too little time spent on how to keep the trees alive in the long term, or on working with local communities. And there’s not enough emphasis on how different types of forests sequester very different amounts of carbon. There’s too much talk about trees, and not enough about other carbon-storing ecosystems.
“There’s a real feeling that … forests and trees are just the idea we can use to get political support” for many, perhaps more complicated, types of landscape restoration initiatives, says Joseph Veldman, an ecologist at Texas A&M University in College Station. But that can lead to all kinds of problems, he adds. “For me, the devil is in the details.”
The root of the problem
The pace of climate change is accelerating into the realm of emergency, scientists say. Over the last 200 years, human-caused emissions of greenhouse gases, including CO2 and methane, have raised the average temperature of the planet by about 1 degree Celsius (SN: 12/22/18 & 1/5/19, p. 18).
The world’s oceans and land-based ecosystems, such as forests, absorb about half of the carbon emissions from fossil fuel burning and other industrial activities. The rest goes into the atmosphere. So “the majority of the solution to climate change will need to come from reducing our emissions,” Fargione says. To meet climate targets set by the 2015 Paris Agreement, much deeper and more painful cuts in emissions than nations have pledged so far will be needed in the next 10 years.
But increasingly, scientists warn that reducing emissions alone won’t be enough to bring Earth’s thermostat back down. “We really do need an all-hands-on-deck approach,” Fargione says. Specifically, researchers are investigating ways to actively remove that carbon, known as negative emissions technologies. Many of these approaches, such as removing CO2 directly from the air and converting it into fuel, are still being developed.
But trees are a ready kind of negative emissions “technology,” and many researchers see them as the first line of defense. In its January 2020 report, “CarbonShot,” the World Resources Institute, a global nonprofit research organization, suggested that large and immediate investments in reforestation within the United States will be key for the country to have any hope of reaching carbon neutrality — in which ongoing carbon emissions are balanced by carbon withdrawals — by 2050. The report called for the U.S. government to invest $4 billion a year through 2030 to support tree restoration projects across the United States. Those efforts would be a bridge to a future of, hopefully, more technologies that can pull large amounts of carbon out of the atmosphere.
The numbers game
Earth’s forests absorb, on average, about 15.6 billion metric tons of CO2 annually, researchers reported in the March Nature Climate Change. But human activity can turn forests into sources of carbon: Thanks to land clearing, wildfires and the burning of wood products, forests also emit an estimated 8.1 billion tons of the gas back to the atmosphere.
That leaves a net amount of 7.6 billion tons of CO2 absorbed by forests per year — roughly a fifth of the 36 billion tons of CO2 emitted by humans in 2019. Deforestation and forest degradation are rapidly shifting the balance. Forests in Southeast Asia now emit more carbon than they absorb due to clearing for plantations and uncontrolled fires. The Amazon’s forests may flip from carbon sponge to carbon source by 2050, researchers say (SN Online: 1/10/20). The priority for slowing climate change, many agree, should be saving the trees we have.
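The budget behind those numbers is simple subtraction. Here is a minimal sketch, using roughly 15.6 billion tons of gross uptake (sometimes rounded to 16) and 8.1 billion tons of gross release, as reported by the study; the variable names are illustrative:

```python
# Forest carbon budget sketch, using rounded figures from the study cited above.
gross_uptake_gt = 15.6     # billion metric tons of CO2 absorbed by forests per year
gross_release_gt = 8.1     # billion metric tons released by clearing, fire, wood products
human_emissions_gt = 36.0  # billion metric tons of CO2 emitted by humans in 2019

net_sink_gt = gross_uptake_gt - gross_release_gt  # ~7.5, reported as ~7.6 before rounding
offset_share = net_sink_gt / human_emissions_gt   # ~0.21, roughly a fifth

print(round(net_sink_gt, 1), round(offset_share, 2))
```

The “roughly a fifth” framing follows directly: the net sink covers about 21 percent of 2019 human emissions.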
Forests in flux
While global forests were a net carbon sink of about 7.6 gigatons of carbon dioxide per year from 2001 to 2019, forests in areas such as Southeast Asia and parts of the Amazon began releasing more carbon than they store.
Net annual average contribution of carbon dioxide from Earth’s forests, 2001–2019
Just how many more trees might be mustered for the fight is unclear, however. In 2019, Thomas Crowther, an ecologist at ETH Zurich, and his team estimated in Science that around the globe, there are 900 million hectares of land — an area about the size of the United States — available for planting new forests and reviving old ones (SN: 8/17/19, p. 5). That land could hold over a trillion more trees, the team claimed, which could trap about 206 billion tons of carbon over a century.
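A quick per-tree check puts the Science estimate in perspective. The sketch below uses only the two figures quoted above; the 44/12 factor is the standard mass ratio for converting carbon to CO2, and the variable names are illustrative:

```python
# Per-tree implication of the 2019 Science estimate quoted above.
extra_trees = 1.0e12    # about a trillion additional trees
carbon_t = 206e9        # about 206 billion metric tons of carbon over a century

per_tree_carbon = carbon_t / extra_trees      # ~0.2 t of carbon per tree over ~100 years
per_tree_co2 = per_tree_carbon * 44.0 / 12.0  # ~0.76 t of CO2, via the CO2:C mass ratio

print(round(per_tree_carbon, 3), round(per_tree_co2, 2))
```

On the order of a fifth of a ton of carbon per tree over a century is a plausible figure for a maturing tree, which is part of why the headline number sounded credible; the criticisms of the study target its land and soil-carbon accounting rather than this per-tree arithmetic.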
That study, led by Jean-Francois Bastin, then a postdoc in Crowther’s lab, was sweeping, ambitious and hopeful. Its findings spread like wildfire through media, conservationist and political circles. “We were in New York during Climate Week, and everybody’s talking about this paper,” Adams recalls. “It had just popped into people’s consciousness, this unbelievable technology solution called the tree.”
To channel that enthusiasm, the One Trillion Trees Initiative incorporated the study’s findings into its mission statement, and countless other tree-planting efforts have cited the report.
But critics say the study is deeply flawed, and that its accounting — of potential trees, of potential carbon uptake — is not only sloppy, but dangerous. In 2019, Science published five separate responses outlining numerous concerns. For example, the study’s criteria for “available” land for tree planting were too broad, and the carbon accounting was inaccurate because it assumes that new tree canopy cover equals new carbon storage. Savannas and natural grasslands may have relatively few trees, critics noted, but these regions already hold plenty of carbon in their soils. When that carbon is accounted for, the carbon uptake benefit from planting trees drops to perhaps a fifth of the original estimate.
There’s also the question of how forests themselves can affect the climate. Adding trees to snow-covered regions, for example, could increase the absorption of solar radiation, possibly leading to warming.
“Their numbers are just so far from anything reasonable,” Veldman says. And focusing on the number of trees planted also sets up another problem, he adds — an incentive structure that is prone to corruption. “Once you set up the incentive system, behaviors change to basically play that game.”
Adams acknowledges these concerns. But, the One Trillion Trees Initiative isn’t really focused on “the specifics of the math,” he says, whether it’s the number of trees or the exact amount of carbon sequestered. The goal is to create a powerful climate movement to “motivate a community behind a big goal and a big vision,” he says. “It could give us a fighting chance to get restoration right.”
Other nonprofit conservation groups, like the World Resources Institute and The Nature Conservancy, are trying to walk a similar line in their advocacy. But some scientists are skeptical that governments and policy makers tasked with implementing massive forest restoration programs will take note of such nuances.
“I study how government bureaucracy works,” says Forrest Fleischman, who researches forest and environmental policy at the University of Minnesota in St. Paul. Policy makers, he says, are “going to see ‘forest restoration,’ and that means planting rows of trees. That’s what they know how to do.”
How much carbon a forest can draw from the atmosphere depends on how you define “forest.” There’s reforestation — restoring trees to regions where they used to be — and afforestation — planting new trees where they haven’t historically been. Reforestation can mean new planting, including crop trees; allowing forests to regrow naturally on lands previously cleared for agriculture or other purposes; or blending tree cover with croplands or grazing areas.
In the past, the carbon uptake potential of letting forests regrow naturally was underestimated by 32 percent, on average — and by as much as 53 percent in tropical forests, according to a 2020 study in Nature. Now, scientists are calling for more attention to this forestation strategy.
If it’s just a matter of what’s best for the climate, natural forest regrowth offers the biggest bang for the buck, says Simon Lewis, a forest ecologist at University College London. Single-tree commercial crop plantations, on the other hand, may meet the technical definition of a “forest” — a certain concentration of trees in a given area — but factor in land clearing to plant the crop and frequent harvesting of the trees, and such plantations can actually release more carbon than they sequester.
Comparing the carbon accounting between different restoration projects becomes particularly important in the framework of international climate targets and challenges. For example, the 2011 Bonn Challenge is a global project aimed at restoring 350 million hectares by 2030. As of 2020, 61 nations had pledged to restore a total of 210 million hectares of their lands. The potential carbon impact of the stated pledges, however, varies widely depending on the specific restoration plans.
Levels of protection
Amount of carbon sequestered by 2100 in four Bonn Challenge scenarios: The Bonn Challenge aims to reforest 350 million hectares globally. Allowing all of that land to regrow naturally would sequester 42 gigatons of carbon by 2100. The pledges of the 43 tropical and subtropical nations that had joined by 2019, a mix of plantations and natural regrowth, would sequester 16 gigatons. If some of the land is later converted to biofuel plantations, sequestration drops to 3 gigatons; with plantations only, carbon storage is 1 gigaton. SOURCE: S.L. Lewis et al/Nature 2019; graphs: T. Tibbitts
In a 2019 study in Nature, Lewis and his colleagues estimated that if all 350 million hectares were allowed to regrow natural forest, those lands would sequester about 42 billion metric tons (gigatons in chart above) of carbon by 2100. Conversely, if the land were to be filled with single-tree commercial crop plantations, carbon storage drops to about 1 billion metric tons. And right now, plantations make up a majority of the restoration plans submitted under the Bonn Challenge.
Striking the right balance between offering incentives to landowners to participate while also placing certain restrictions remains a tricky and long-standing challenge, not just for combating the climate emergency but also for trying to preserve biodiversity (SN: 8/1/20, p. 18). Since 1974, Chile, for example, has been encouraging private landowners to plant trees through subsidies. But landowners are allowed to use these subsidies to replace native forestlands with profitable plantations. As a result, Chile’s new plantings not only didn’t increase carbon storage, they also accelerated biodiversity losses, researchers reported in the September 2020 Nature Sustainability.
The reality is that plantations are a necessary part of initiatives like the Bonn Challenge, because they make landscape restoration economically viable for many nations, Lewis says. “Plantations can play a part, and so can agroforestry as well as areas of more natural forest,” he says. “It’s important to remember that landscapes provide a whole host of services and products to people who live there.”
But he and others advocate for increasing the proportion of forestation that is naturally regenerated. “I’d like to see more attention on that,” says Robin Chazdon, a forest ecologist affiliated with the University of the Sunshine Coast in Australia as well as with the World Resources Institute. Naturally regenerated forests could be allowed to grow in buffer regions between farms, creating connecting green corridors that could also help preserve biodiversity, she says. And “it’s certainly a lot less expensive to let nature do the work,” Chazdon says.
Indeed, massive tree-planting projects may also be stymied by pipeline and workforce issues. Take seeds: In the United States, nurseries produce about 1.3 billion seedlings per year, Fargione and colleagues calculated in a study reported February 4 in Frontiers in Forests and Global Change. To support a massive tree-planting initiative, U.S. nurseries would need to at least double that number.
A tree-planting report card
From China to Turkey, countries around the world have launched enthusiastic national tree-planting efforts. And many of them have become cautionary tales.
China kicked off a campaign in 1978 to push back the encroaching Gobi Desert, which has become the fastest-growing desert on Earth due to a combination of mass deforestation and overgrazing, exacerbated by high winds that drive erosion. China’s Three-North Shelter Forest Program, nicknamed the Great Green Wall, aims to plant a band of trees stretching 4,500 kilometers across the northern part of the country. The campaign has involved millions of seeds dropped from airplanes and millions more seedlings planted by hand. But a 2011 analysis suggested that up to 85 percent of the plantings had failed because the nonnative species chosen couldn’t survive in the arid environments they were plopped into.
More recently, Turkey launched its own reforestation effort. On November 11, 2019, National Forestation Day, volunteers across the country planted 11 million trees at more than 2,000 sites. In Turkey’s Çorum province, 303,150 saplings were planted in a single hour, setting a new world record.
Within three months, however, up to 90 percent of the new saplings inspected by Turkey’s agriculture and forestry trade union were dead, according to the union’s president, Şükrü Durmuş, speaking to the Guardian (Turkey’s minister of agriculture and forestry denied that this was true). The saplings, Durmuş said, died due to a combination of insufficient water and because they were planted at the wrong time of year, and not by experts.
Some smaller-scale efforts also appear to be failing, though less spectacularly. Tree planting has been ongoing for decades in the Kangra district of Himachal Pradesh in northern India, says Eric Coleman, a political scientist at Florida State University in Tallahassee, who’s been studying the outcomes. The aim is to increase the density of the local forests and provide additional forest benefits for communities nearby, such as wood for fuel and fodder for grazing animals. How much money was spent isn’t known, Coleman says, because there aren’t records of how much was paid for seeds. “But I imagine it was in the millions and millions of dollars.”
Coleman and his colleagues analyzed satellite images and interviewed members of the local communities. They found that the tree planting had very little impact one way or the other. Forest density didn’t change much, and the surveys suggested that few households were gaining benefits from the planted forests, such as gathering wood for fuel, grazing animals or collecting fodder.
But massive tree-planting efforts don’t have to fail. “It’s easy to point to examples of large-scale reforestation efforts that weren’t using the right tree stock, or adequately trained workforces, or didn’t have enough investment in … postplanting treatments and care,” Fargione says. “We … need to learn from those efforts.”
Speak for the trees
Forester Lalisa Duguma of World Agroforestry in Nairobi, Kenya, and colleagues explored some of the reasons for the very high failure rates of these projects in a working paper in 2020. “Every year there are billions of dollars invested [in tree planting], but forest cover is not increasing,” Duguma says. “Where are those resources going?”
In 2019, Duguma raised this question at the World Congress on Agroforestry in Montpellier, France. He asked the audience of scientists and conservationists: “How many of you have ever planted a tree seedling?” To those who raised their hands, he asked, “Have they grown?”
Some respondents acknowledged that they weren’t sure. “Very good! That’s what I wanted,” he told them. “We invest a lot in tree plantings, but we are not sure what happens after that.”
It comes down to a deceptively simple but “really fundamental” point, Duguma says. “The narrative has to change — from tree planting to tree growing.”
The good news is that this point has begun to percolate through the conservationist world, he says. To have any hope of success, restoration projects need to consider the best times of year to plant seeds, which seeds to plant and where, who will care for the seedlings as they grow into trees, how that growth will be monitored, and how to balance the economic and environmental needs of people in developing countries where the trees might be planted.
“That is where we need to capture the voice of the people,” Duguma says. “From the beginning.”
Even as the enthusiasm for tree planting takes root in the policy world, there’s a growing awareness among researchers and conservationists that local community engagement must be built into these plans; it’s indispensable to their success.
“It will be almost impossible to meet these targets we all care so much about unless small farmers and communities benefit more from trees,” as David Kaimowitz of the United Nations’ Food and Agriculture Organization wrote March 19 in a blog post for the London-based nonprofit International Institute for Environment and Development.
For one thing, farmers and villagers managing the land need incentives to care for the plantings and that includes having clear rights to the trees’ benefits, such as food or thatching or grazing. “People who have insecure land tenure don’t plant trees,” Fleischman says.
The old cliché — think globally, act locally — may offer the best path forward for conservationists and researchers trying to balance so many different needs and still address climate change.
“There are a host of sociologically and biologically informed approaches to conservation and restoration that … have virtually nothing to do with tree planting,” Veldman says. “An effective global restoration agenda needs to encompass the diversity of Earth’s ecosystems and the people who use them.”
A study coordinated by the agency Purpose recommends including low-income Brazilians in the debate over sustainability, and points to arguments and narratives for engaging those who already live with the consequences of climate chaos
Brazil is strategically important in debates over the planet’s future because it holds primary legal responsibility for major biomes, above all the Amazon rainforest. But Brazil plays a second leading role: the country has already been hit by a long-duration climate event, so Brazilians already know the consequences of this phenomenon.
Early 21st-century Brazil is a likely picture of the world in the century to come, when climate phenomena will force the uprooting and displacement of large populations to the outskirts of cities. In the conservative scenario presented in The Uninhabitable Earth (A terra inabitável, Cia das Letras, 2019), journalist David Wallace-Wells lays out scenarios in which climate change displaces between 600 million and 2 billion refugees by the end of the century.
Climate as a topic for intellectuals
If the Earth were a car, scientists would be the headlights showing that we are speeding toward an abyss. The surprise is that the driver and the passengers (political and business leaders, and society at large) are not reacting as they should, given a catastrophe that will affect everyone.
A recent survey coordinated by the agency Purpose and carried out by Behup, a research startup, suggests that poor Brazilians can be mobilized to act in defense of sustainability. They make up more than half of the country’s population and know firsthand what the world of the future will look like, because they already live with the effects of the climate catastrophe. But for this to work, we have to start from their own references and experiences of the subject.
A real, present-day and economic problem
Below are some insights into how less privileged populations perceive and talk about sustainability.
1) Real and present. In scientific debates about global warming, the consequences arrive at some point in the future; in working-class Brazil, warming is palpable and happening now. Flooding, caused by heavier or more erratic rainfall, is the most visible way climate chaos shows itself to these people. And it points to two concrete problems: the lack of stormwater drainage infrastructure and the lack of regular garbage collection. Dry weather creates another problem, aggravating respiratory illnesses that are relatively easy for medicine to treat but complicated for anyone who depends on public health care.
2) Garbage makes the problem tangible. When the urban poor talk about sustainability, the first association is with garbage: trash collected irregularly by public services in the peripheries and often dumped in the streets; the candy wrapper tossed on the ground; bags put out at the wrong time and torn open and scattered by cats, dogs and other animals. Garbage makes the subject concrete for people who see it pile up, who live in dirty places neglected by public officials, and who watch it accumulate in unlit spaces taken over by muggers or drug traffickers.
3) A metaphor for predatory consumerism. Garbage represents, or stands in for, a consumer society that discards what is still useful. Garbage may be something “naturalized” for the urban middle and upper classes, but less so for those who come from a logic of reuse: organic waste feeds the animals, a can becomes an oil lamp, a PET bottle has a thousand and one uses. For some respondents in the study, it is morally uncomfortable to throw away what could still be reused. Classifying something as “garbage” is a decision, a choice, that reveals a perception of waste and of shared responsibility for the place where one lives.
4) Being sustainable means saving money. We usually hear about protecting the environment as something altruistic: “safeguarding the future of the children, the forests” and so on. But that abstraction is not a priority for people living in vulnerable conditions, worried about what will happen tomorrow: where food, work and medicine will come from; how to protect themselves from crime; what to do about a closed school. For these Brazilians, sustainability is a good deed that brings economic advantage. Plastics and cans can become utensils and toys. LED bulbs and careful water use lower expenses. Tires, bricks and other demolition materials are cheaper for anyone who wants to build. And finally there is work: collecting recyclables is a source of income for those who have no other.
Sustainability is usually debated in intellectualized circles among middle- and upper-class Brazilians. Poor Brazilians are not invited into the conversation, owing to a prejudice that equates low schooling with an inability to think about and understand the world. But in a world with many more poor people than rich ones, this discussion will grow stronger if it engages the thousands of people, in Brazil and around the world, who already live with the consequences of climate chaos.
A case study
In early June, shortly after I wrote this article, I received on WhatsApp the video included below, made by the activist Duda Salabert, a city councilor in Belo Horizonte, about the installation of a mine by the company Tamisa in the Serra do Curral, near the Minas Gerais capital. The video argues that the mining will affect the springs that supply the city’s water and will raise dust, causing respiratory problems for the population of Belo Horizonte, particularly for a community/neighborhood called Taquaril, which sits three kilometers from where the project would be built if approved.
The video makes a convincing argument, with drone footage to convey the distances between the places it points to. I was moved, so I stopped what I was doing and forwarded the video to… environmentalist friends of mine, apologizing in advance on the assumption that they probably already knew about the situation or had received the video from others. But as I wrote those messages, I noticed how the video’s argument, in light of what I wrote above, is built to circulate among middle- and upper-class people, mostly highly educated and identified with progressive values.
In one passage of the video, councilor Duda points to the Taquaril community/neighborhood and says its residents were not consulted but will directly suffer the environmental impacts of the mining. For the councilor, this attitude amounts to a case of “environmental racism.” The argument is persuasive and will sound “natural” to EL PAÍS readers, but speaking this way:
It compares this poor neighborhood to the natural resource itself, suggesting the residents are passive, as if they lacked the capacity (for want of schooling and because of adverse economic circumstances) to take part in the debate.
In doing so, the video’s creators commit the very mistake they are denouncing: failing to involve the residents in the debate.
The alternative is to debate with the residents of Taquaril, to visit the neighborhood and talk with community leaders; to listen to how ordinary people like them perceive the mining project, even considering the possibility that the mine would open job opportunities for many of those families; and, building on that interested, attentive and sustained conversation, one that seeks to understand the problem from these people’s point of view, to engage them in dialogue on the subject, as this article proposes.
The environmental movement is realizing that it needs to talk with other audiences if it wants to be effective and produce the results that will mitigate climate chaos, rather than merely being right. The Serra do Curral case in Belo Horizonte shows how urgent this reflection is: if this change of attitude does not happen for a problem so close to a big city, how will we act on what happens in the country’s remote corners?
Juliano Spyer is a digital anthropologist, writer and educator. He holds a master’s degree and a doctorate from University College London and is the author of Povo de Deus: Quem são os evangélicos e por que eles importam (Geração Editorial), among other books. This text was originally published here.
MANAUS, Brazil (AP) — Rivers around the biggest city in Brazil’s Amazon rainforest have swelled to levels unseen in over a century of record-keeping, according to data published Tuesday by Manaus’ port authorities, straining a society that has grown weary of increasingly frequent flooding.
The Rio Negro was at its highest level since records began in 1902, with a depth of 29.98 meters (98 feet) at the port’s measuring station. The nearby Solimoes and Amazon rivers were also nearing all-time highs, flooding streets and houses in dozens of municipalities and affecting some 450,000 people in the region.
Higher-than-usual precipitation is associated with the La Nina phenomenon, when currents in the central and eastern Pacific Ocean affect global climate patterns. Environmental experts and organizations including the U.S. Environmental Protection Agency and the National Oceanic and Atmospheric Administration say there is strong evidence that human activity and global warming are altering the frequency and intensity of extreme weather events, including La Nina.
Seven of the 10 biggest floods in the Amazon basin have occurred in the past 13 years, data from Brazil’s state-owned Geological Survey shows.
“If we continue to destroy the Amazon the way we do, the climatic anomalies will become more and more accentuated,” said Virgílio Viana, director of the Sustainable Amazon Foundation, a nonprofit. “Greater floods on the one hand, greater droughts on the other.”
Large swaths of Brazil are currently drying up in a severe drought, with a possible shortfall in power generation from the nation’s hydroelectric plants and increased electricity prices, government authorities have warned.
But in Manaus, 66-year-old Julia Simas has water ankle-deep in her home. Simas has lived in the working-class neighborhood of Sao Jorge since 1974 and is used to seeing the river rise and fall with the seasons. Simas likes her neighborhood because it is safe and clean. But the quickening pace of the floods in the last decade has her worried.
“From 1974 until recently, many years passed and we wouldn’t see any water. It was a normal place,” she said.
When the river does overflow its banks and flood her street, she and other residents use boards and beams to build rudimentary scaffolding within their homes to raise their floors above the water.
“I think human beings have contributed a lot (to this situation),” she said. “Nature doesn’t forgive. She comes and doesn’t want to know whether you’re ready to face her or not.”
Flooding also has a significant impact on local industries such as farming and cattle ranching. Many family-run operations have seen their production vanish under water. Others have been unable to reach their shops, offices and market stalls or clients.
“With these floods, we’re out of work,” said Elias Gomes, a 38-year-old electrician in Cacau Pirera, on the other side of the Rio Negro, though he noted he’s been able to earn a bit by transporting neighbors in his small wooden boat.
Gomes is now looking to move to a more densely populated area where floods won’t threaten his livelihood.
Limited access to banking in remote parts of the Amazon can make things worse for residents, who are often unable to get loans or financial compensation for lost production, said Viana, of the Sustainable Amazon Foundation. “This is a clear case of climate injustice: Those who least contributed to global warming and climate change are the most affected.”
Meteorologists say Amazon water levels could continue to rise slightly until late June or July, when floods usually peak.
Of all the things attributable to climate change, the rotational poles moving differently is definitely one of the weirder ones. But a new study shows that’s exactly what’s happening. It builds on previous findings to show that disappearing ice is playing a major role, and shows that groundwater depletion is responsible for contributing to wobbles as well.
The findings, published last month in Geophysical Research Letters, draw on gravity-tracking satellites to measure what researchers call “polar drift.” While we think of gravity as a constant, it’s actually a moving target based on the shape of the planet. And while earthquakes and other geophysical activity can certainly play a role by pushing land around, it’s water that is responsible for the biggest shifts. The satellites used for the study, known as GRACE and GRACE-FO, were calibrated to measure Earth’s shifting mass.
Polar drift is something that happens naturally. The Earth’s axis is slowly shifting, but there’s been a marked acceleration in recent decades. The poles are now moving at nearly 17 times the rate they were in 1981, a fairly remarkable speed-up. What’s even more remarkable, though, is that poles actually began moving in a new direction quite suddenly in 2000, at a rapid clip.
Previous research used the same satellite data to observe the speed-up and change of direction and attributed them to ice loss in Greenland and West Antarctica as well as groundwater pumping. The new study extends the record back to the 1990s and explores some of the year-to-year wobbles in more detail. The findings point to changes in groundwater use in specific regions as the source of some of those differences.
“Using the GRACE data (for the period 2002-2015) we showed that such interannual signals (as these authors pointed out: kinks at 2005 and 2012) can be explained by the terrestrial water storage,” Surendra Adhikari, a scientist at NASA Jet Propulsion Laboratory who led the 2016 research, said in an email. “The new paper reinforces the statement by also showing that another kink in the polar motion data (at 1995) is also explained by total water storage variability, especially by the on-set of accelerated Greenland ice mass loss and depletion of water storage in the Middle East and the Indian subcontinent.
“In general, the paper (along with our previous works) reveals the strong connection between the climate variability and how the Earth wobbles,” he added, noting the new study was a “nicely done paper.”
In the scheme of things, climate change triggering polar movement isn’t too worrisome, given the other clear and present dangers like intense heat waves, ocean acidification, and the sixth mass extinction. Ditto for the role of groundwater depletion, which has the potential to impact billions of lives. But it’s a powerful reminder of just how much humans have reshaped the planet and why we should probably cut it out sooner than later if we don’t want our world to turn upside down.
Correction, 4/23/21, 6:30 p.m.: This post has been updated to reflect that the rotational poles are the ones in question moving and being studied.
Is it really just code for white people wishing to hold onto their way of life or to get “back to normal?”
The climate movement is ascendant, and it has become common to see climate change as a social justice issue. Climate change and its effects—pandemics, pollution, natural disasters—are not universally or uniformly felt: the people and communities suffering most are disproportionately Black, Indigenous and people of color. It is no surprise then that U.S. surveys show that these are the communities most concerned about climate change.
One year ago, I published a book called A Field Guide to Climate Anxiety. Since its publication, I have been struck by the fact that those responding to the concept of climate anxiety are overwhelmingly white. Indeed, these climate anxiety circles are even whiter than the environmental circles I’ve been in for decades. Today, a year into the pandemic, after the murder of George Floyd and the protests that followed, and the attack on the U.S. Capitol, I am deeply concerned about the racial implications of climate anxiety. If people of color are more concerned about climate change than white people, why is the interest in climate anxiety so white? Is climate anxiety a form of white fragility or even racial anxiety? Put another way, is climate anxiety just code for white people wishing to hold onto their way of life or get “back to normal,” to the comforts of their privilege?
The white response to climate change is literally suffocating to people of color. Climate anxiety can operate like white fragility, sucking up all the oxygen in the room and devoting resources toward appeasing the dominant group. As climate refugees are framed as a climate security threat, will the climate-anxious recognize their role in displacing people from around the globe? Will they be able to see their own fates tied to the fates of the dispossessed? Or will they hoard resources, limit the rights of the most affected and seek to save only their own, deluded that this xenophobic strategy will save them? How can we make sure that climate anxiety is harnessed for climate justice?
My book has connected me to a growing community focused on the emotional dimensions of climate change. As writer Britt Wray puts it, emotions like mourning, anger, dread and anxiety are “merely a sign of our attachment to the world.” Paradoxically, though, anxiety about environmental crisis can create apathy, inaction and burnout. Anxiety may be a rational response to the world that climate models predict, but it is unsustainable.
And climate panic can be as dangerous as it is galvanizing. Dealing with feelings of climate anxiety will require the existential tools I provided in A Field Guide to Climate Anxiety, but it will also require careful attention to extremism and climate zealotry. We can’t fight climate change with more racism. Climate anxiety must be directed toward addressing the ways that racism manifests as environmental trauma and vice versa—how environmentalism manifests as racialized violence. We need to channel grief toward collective liberation.
The prospect of an unlivable future has always shaped the emotional terrain for Black and brown people, whether that terrain is racism or climate change. Climate change compounds existing structures of injustice, and those structures exacerbate climate change. Exhaustion, anger, hope—the effects of oppression and resistance are not unique to this climate moment. What is unique is that people who had been insulated from oppression are now waking up to the prospect of their own unlivable future.
It is a surprisingly short step from “chronic fear of environmental doom,” as the American Psychological Association defines ecoanxiety, to xenophobia and fascism. Racism is not an accidental byproduct of environmentalism; it has been a constant reference point. As I wrote about in my first book, The Ecological Other, early environmentalists in the U.S. were anti-immigrant eugenicists whose ideas were later adopted by Nazis to implement their “blood and soil” ideology. In a recent, dramatic example, the gunman of the 2019 El Paso shooting was motivated by despair about the ecological fate of the planet: “My whole life I have been preparing for a future that currently doesn’t exist.” Intense emotions mobilize people, but not always for the good of all life on this planet.
Today’s progressives espouse climate change as the “greatest existential threat of our time,” a claim that ignores people who have been experiencing existential threats for much longer. Slavery, colonialism, ongoing police brutality—we can’t neglect history to save the future.
RESILIENCE AND RELATION AS RESISTANCE
I recently gave a college lecture about climate anxiety. One of the students e-mailed me to say she was so distressed that she’d be willing to submit to a green dictator if they would address climate change. Young people know the stakes, but they are not learning how to cope with the intensity of their dread. It would be tragic and dangerous if this generation of climate advocates becomes willing to sacrifice democracy and human rights in the name of climate change.
Oppressed and marginalized people have developed traditions of resilience out of necessity. Black, feminist and Indigenous leaders have painstakingly cultivated resilience over the long arc of the fight for justice. They know that protecting joy and hope is the ultimate resistance to domination. Persistence is nonnegotiable when your mental, physical and reproductive health are on the line.
Instead of asking “What can I do to stop feeling so anxious?”, “What can I do to save the planet?” and “What hope is there?”, people with privilege can be asking “Who am I?” and “How am I connected to all of this?” The answers reveal that we are deeply interconnected with the well-being of others on this planet, and that there are traditions of environmental stewardship that can be guides for where we need to go from here.
Author’s Note: I want to thank Jade Sasser, Britt Wray, Janet Fiskio, and Jennifer Atkinson for rich discussions about this topic, which inform this piece.
This is an opinion and analysis article.
Sarah Jaquette Ray, Ph.D., is professor and chair in the Environmental Studies Department at Humboldt State University.
In his new book How to Avoid a Climate Disaster, Bill Gates takes a technological approach to understanding the climate crisis. Gates starts with the 51 billion tons of greenhouse gases created per year. He breaks that pollution down into sectors based on their impact, moving from electricity, industry and agriculture to transportation and buildings. Throughout, Gates proves adept at paring down the complexities of the climate challenge, giving the reader useful heuristics for distinguishing the bigger technological problems (cement) from the smaller ones (aircraft).
Attending the 2015 Paris climate negotiations, Gates and dozens of wealthy individuals launched Breakthrough Energy, a venture capital fund with interlinked lobbying and research efforts. Gates and his fellow investors argued that both the federal government and the private sector are underinvesting in energy innovation. Breakthrough aims to fill that gap, investing in everything from next-generation nuclear technology to plant-based meat that tastes like beef. The fund's first $1 billion round had some early successes, such as Impossible Foods, a maker of plant-based burgers. The fund announced a second round of equal size in January.
A parallel effort, an international agreement called Mission Innovation, says it has persuaded its members (the European Union's executive branch along with 24 countries including China, the U.S., India and Brazil) to invest an additional $4.6 billion a year since 2015 in clean-energy research and development.
These various initiatives are the through line of Gates's latest book, written from a techno-optimist perspective. "Everything I've learned about climate and technology makes me optimistic... if we act fast enough, [we can] avoid a climate catastrophe," he writes in the opening pages.
As many have pointed out, much of the technology we need already exists; much can be done now. While Gates doesn't dispute this, his book focuses on the technological challenges he believes must still be overcome to achieve deeper decarbonization. He spends less time on the political obstacles, writing that he thinks "more like an engineer than a political scientist." Yet politics, in all its messiness, is the main impediment to progress on climate change. And engineers ought to understand how complex systems can have feedback loops that go awry.
Kim Stanley Robinson, by contrast, does think like a political scientist. His latest novel, The Ministry for the Future, opens just a few years from now, in 2025, when an immense heat wave strikes India, killing millions of people. The book's protagonist, Mary Murphy, runs a UN agency charged with representing the interests of future generations, in an attempt to unite the world's governments behind a climate solution. Throughout the book, intergenerational equity and various forms of distributive politics are in focus.
If you have seen the scenarios the Intergovernmental Panel on Climate Change (IPCC) develops for the future, Robinson's book will feel familiar. His story probes the policies needed to solve the climate crisis, and he has clearly done his homework. Although it is an exercise in imagination, there are moments when the novel reads more like a graduate seminar in social science than a work of escapist fiction. The climate refugees at the center of the story illustrate how the consequences of pollution fall hardest on the world's poorest people, even though the rich produce far more carbon.
Reading Gates after Robinson highlights the inextricable connection between inequality and climate change. Gates's efforts on climate are laudable. But when he tells us that the combined wealth of the people backing his investment fund is $170 billion, it is puzzling that they have dedicated only $2 billion to climate solutions, less than 2 percent of their assets. That fact alone is an argument for taxing wealth: the climate crisis demands government action. It cannot be left to the whims of billionaires.
As billionaires go, Gates is arguably one of the good ones. He tells stories of how he uses his fortune to help the poor and the planet. The irony of writing a book about climate change while flying in a private jet and owning a 6,132-square-meter mansion is not lost on the reader, nor on Gates, who calls himself an "imperfect messenger on climate change." Still, he is unquestionably an ally of the climate movement.
But in focusing on technological innovation, Gates downplays the fossil-fuel interests obstructing this progress. Oddly, climate denial goes unmentioned in the book. Washing his hands of political polarization, Gates never draws the connection to his fellow billionaires Charles and David Koch, who made their fortunes in petrochemicals and have played a prominent role in propagating climate denialism.
For example, Gates marvels that, for the vast majority of Americans, electric heating is actually cheaper than continuing to burn fossil fuels. That people fail to adopt these more economical and sustainable options is, to him, a puzzle. But it isn't. As the journalists Rebecca Leber and Sammy Roth have reported in Mother Jones and the Los Angeles Times, the gas industry is funding advocates and marketing campaigns to oppose electrification and keep people hooked on fossil fuels.
These opposing forces are more visible in Robinson's book than in Gates's. Gates would have benefited from drawing on the work that Naomi Oreskes, Eric Conway, Geoffrey Supran and others have done to document fossil-fuel companies' persistent efforts to sow public doubt about climate science.
One thing Gates and Robinson do have in common, however, is the view that geoengineering, monumental interventions to combat the symptoms rather than the causes of climate change, may prove inevitable. In The Ministry for the Future, solar geoengineering, the spraying of fine particles into the atmosphere to reflect more of the sun's heat back into space, is deployed in the aftermath of the deadly heat wave that opens the story. Later, scientists travel to the poles and devise elaborate methods to pump meltwater out from under glaciers to keep them from sliding into the sea. Despite some setbacks, they prevent several meters of sea-level rise. One can imagine Gates appearing in the novel as an early funder of these efforts; as he notes in his own book, he has been funding research into solar geoengineering for years.
The worst part
The title of Elizabeth Kolbert's new book, Under a White Sky, refers to this nascent technology, since deploying it at scale could change the color of the sky from blue to white. Kolbert notes that the first report on climate change landed on President Lyndon Johnson's desk in 1965. That report did not argue for reducing carbon emissions by moving away from fossil fuels. Instead, it advocated changing the climate through solar geoengineering, though the term had not yet been coined. It is worrying that some leap straight to these risky fixes rather than addressing the root causes of climate change.
Reading Under a White Sky, we are reminded of the ways such interventions can go wrong. For example, the scientist and writer Rachel Carson advocated importing non-native species as an alternative to using pesticides. The year after her book Silent Spring was published in 1962, the US Fish and Wildlife Service brought Asian carp to America for the first time to control aquatic weeds. The approach solved one problem but created another: the spread of this invasive species threatened native ones and caused environmental damage.
As Kolbert observes, her book is about "people trying to solve problems created by people trying to solve problems." Her account covers examples including ill-fated efforts to stop the spread of those carp, the pumping stations in New Orleans that accelerate the city's sinking, and attempts to selectively breed corals that can tolerate higher temperatures and ocean acidification. Kolbert has a sense of humor and a keen eye for unintended consequences. If you like your apocalypse with a dose of levity, she will make you laugh while Rome burns.
By contrast, although Gates is aware of the potential pitfalls of technological solutions, he still extols inventions such as plastics and fertilizer as vital. Tell that to the sea turtles swallowing plastic waste, or to the fertilizer-fueled algal blooms destroying the Gulf of Mexico's ecosystem.
With dangerous levels of carbon dioxide already in the atmosphere, geoengineering may indeed prove necessary, but we should not be naive about the risks. Gates's book has many good ideas and is worth reading. But for a fuller picture of the crisis we face, be sure to read Robinson and Kolbert as well.
The blossoming of Japan's famous white and pink cherry trees draws thousands into the country's streets and parks to watch a phenomenon that lasts only a few days and has been revered for more than a thousand years. But this year the early bloom has scientists worried, because it points to the impact of climate change.
According to records from Osaka Prefecture University, in 2021 the famous white and pink cherry trees reached full bloom in Kyoto on March 26, the earliest date in 12 centuries. The previous earliest blooms were recorded on March 27 in the years 1612, 1409 and 1236.
The institution was able to identify how early the phenomenon occurred because it keeps a comprehensive database of bloom records across the centuries. The records begin in the year 812 and include imperial court documents from Kyoto, Japan's ancient capital, and medieval diaries.
Yasuyuki Aono, a professor of environmental science at Osaka Prefecture University who compiles the database, told Reuters that the phenomenon usually occurs in April, but that as temperatures rise, flowering begins earlier.
"Cherry blossoms are very sensitive to temperature. Flowering and full bloom can come earlier or later depending on the temperature alone. Temperatures were low in the 1820s but have risen by about 3.5 degrees Celsius to today," he said.
According to him, this year's seasons in particular influenced the flowering dates. The winter was very cold, but spring arrived quickly and was exceptionally warm, so "the buds are completely awake after a sufficient rest."
In the capital, Tokyo, the cherry trees reached peak bloom on March 22, the second-earliest date ever recorded. "As global temperatures rise, the last spring frosts are occurring earlier and flowering is occurring earlier," Lewis Ziska of Columbia University told CNN.
The Japan Meteorological Agency also tracks 58 "benchmark" cherry trees across the country. This year, 40 have already reached peak bloom and 14 did so in record time. The trees normally flower for about two weeks each year. "We can say it is most likely because of the impact of global warming," said Shunji Anbe, an official in the agency's observations division.
Data released in January by the World Meteorological Organization show that global temperatures in 2020 were among the highest on record, rivaling 2016 as the warmest year of all time.
Cherry blossoms have deep historical and cultural roots in Japan, heralding spring and inspiring artists and poets through the centuries. Their fragility is seen as a symbol of life, death and rebirth.
Today, people gather under the cherry blossoms each spring for hanami (flower-viewing) parties, strolling through parks, picnicking beneath the branches and taking plenty of selfies. But this year the cherry blossom season came and went in the blink of an eye.
With the Covid-19 state of emergency lifted in all regions of Japan, many people crowded popular viewing spots over the weekend, although the numbers were smaller than in normal years.
We’re one step closer to officially moving up hurricane season. The National Hurricane Center announced Tuesday that it would formally start issuing its hurricane season tropical weather outlooks on May 15 this year, bumping it up from the traditional start of hurricane season on June 1. The move comes after a recent spate of early-season storms has raked the Atlantic.
Atlantic hurricane season runs from June 1 to November 30. That’s when conditions are most conducive to storm formation owing to warm air and water temperatures. (The Pacific Ocean has its own hurricane season, which covers the same timeframe, but since its waters are colder, fewer hurricanes tend to form there than in the Atlantic.)
Storms have begun forming in the Atlantic earlier as ocean and air temperatures have increased due to climate change. Last year, Hurricane Arthur roared to life off the East Coast on May 16. That storm made 2020 the sixth hurricane season in a row to have a storm form earlier than the June 1 official start date. While the National Oceanic and Atmospheric Administration won’t be moving up the start of the season just yet, the earlier outlooks address this recent history.
“In the last decade, 10 storms have formed in the weeks before the traditional start of the season, which is a big jump,” said Sean Sublette, a meteorologist at Climate Central, who pointed out that from the 1960s through the 2010s, an average of just one to three storms formed each decade before the June 1 start date.
It might be tempting to ascribe this earlier season entirely to climate change warming the Atlantic. But technology also has a role to play, with more observations along the coast as well as satellites that can spot storms far out to sea.
“I would caution that we can’t just go, ‘hah, the planet’s warming, we’ve had to move the entire season!’” Sublette said. “I don’t think there’s solid ground for attribution of how much of one there is over the other. Weather folks can sit around and debate that for a while.”
Earlier storms don’t necessarily mean more harmful ones, either. In fact, hurricanes earlier in the season tend to be weaker than the monsters that form in August and September when hurricane season is at its peak. But regardless of their strength, these earlier storms have generated discussion inside the NHC on whether to move up the official start date for the season, when the agency usually puts out two reports per day on hurricane activity. Tuesday’s step is not an official announcement of this decision, but an acknowledgement of the increased attention on early hurricanes.
“I would say that [Tuesday’s announcement] is the National Hurricane Center being proactive,” Sublette said. “Like hey, we know that the last few years it’s been a little busier in May than we’ve seen in the past five decades, and we know there is an awareness now, so we’re going to start issuing these reports early.”
While the jury is still out on whether climate change is pushing the season earlier, research has shown that the strongest hurricanes are becoming more common, and that climate change is likely playing a role. A study published last year found the odds of a storm becoming a major hurricane (Category 3 or stronger) have increased 49% in the basin since satellite monitoring began in earnest four decades ago. And when storms make landfall, sea level rise allows them to do more damage. So whether or not climate change is pushing Atlantic hurricane season earlier, the risks are increasing. Now, at least, we’ll have better warnings before early storms do hit.
Clifford Krauss, Manny Fernandez, Ivan Penn, Rick Rojas – Feb 21, 2021
Texas has refused to join interstate electrical grids and railed against energy regulation. Now it’s having to answer to millions of residents who were left without power in last week’s snowstorm.
HOUSTON — Across the plains of West Texas, the pump jacks that resemble giant bobbing hammers define not just the landscape but the state itself: Texas has been built on the oil-and-gas business for the last 120 years, ever since the discovery of oil on Spindletop Hill near Beaumont in 1901.
Texas, the nation’s leading energy-producing state, seemed like the last place on Earth that could run out of energy.
Then last week, it did.
The crisis could be traced to that other defining Texas trait: independence, both from big government and from the rest of the country. The dominance of the energy industry and the “Republic of Texas” ethos became a devastating liability when energy stopped flowing to millions of Texans who shivered and struggled through a snowstorm that paralyzed much of the state.
Part of the responsibility for the near-collapse of the state’s electrical grid can be traced to the decision in 1999 to embark on the nation’s most extensive experiment in electrical deregulation, handing control of the state’s entire electricity delivery system to a market-based patchwork of private generators, transmission companies and energy retailers.
The energy industry wanted it. The people wanted it. Both parties supported it. “Competition in the electric industry will benefit Texans by reducing monthly rates and offering consumers more choices about the power they use,” George W. Bush, then the governor, said as he signed the top-to-bottom deregulation legislation.
Mr. Bush’s prediction of lower-cost power generally came true, and the dream of a free-market electrical grid worked reasonably well most of the time, in large part because Texas had so much cheap natural gas as well as abundant wind to power renewable energy. But the newly deregulated system came with few safeguards and even fewer enforced rules.
With so many cost-conscious utilities competing for budget-shopping consumers, there was little financial incentive to invest in weather protection and maintenance. Wind turbines are not equipped with the de-icing equipment routinely installed in the colder climes of the Dakotas, and power lines have little insulation. The possibility of more frequent cold-weather events was never built into infrastructure plans in a state where climate change remains an exotic, disputed concept.
“Deregulation was something akin to abolishing the speed limit on an interstate highway,” said Ed Hirs, an energy fellow at the University of Houston. “That opens up shortcuts that cause disasters.”
The state’s entire energy infrastructure was walloped with glacial temperatures that even under the strongest of regulations might have frozen gas wells and downed power lines.
But what went wrong was far broader: Deregulation meant that critical rules of the road for power were set not by law, but rather by a dizzying array of energy competitors.
Utility regulation is intended to compensate for the natural monopolies that occur when a single electrical provider serves an area; it keeps prices down while protecting public safety and guaranteeing fair treatment to customers. Yet many states have flirted with deregulation as a way of giving consumers more choices and encouraging new providers, especially alternative energy producers.
California, one of the early deregulators in the 1990s, scaled back its initial foray after market manipulation led to skyrocketing prices and rolling blackouts.
States like Maryland allow customers to pick from a menu of producers. In some states, competing private companies offer varied packages like discounts for cheaper power at night. But no state has gone as far as Texas, which has not only turned over the keys to the free market but has also isolated itself from the national grid, limiting the state’s ability to import power when its own generators are foundering.
Consumers themselves got a direct shock last week when customers who had chosen variable-rate electricity contracts found themselves with power bills of $5,000 or more. While they had been expecting extra-low monthly rates, many may now face huge bills as a result of the surge in wholesale electricity prices during the cold wave. Gov. Greg Abbott said Sunday that the state’s Public Utility Commission had issued a moratorium on customer disconnections for nonpayment and would temporarily restrict providers from issuing invoices.
There is regulation in the Texas system, but it is hardly robust. One nonprofit agency, the Electric Reliability Council of Texas, or ERCOT, was formed to manage the wholesale market. It is supervised by the Public Utility Commission, which also oversees the transmission companies that offer customers an exhaustive array of contract choices laced with more fine print than a credit card agreement.
But both agencies are nearly unaccountable and toothless compared to regulators in other regions, where many utilities have stronger consumer protections and submit an annual planning report to ensure adequate electricity supply. Texas energy companies are given wide latitude in their planning for catastrophic events.
Into a snowstorm with no reserves
One example of how Texas has gone it alone is its refusal to enforce a “reserve margin” of extra power available above expected demand, unlike all other power systems around North America. With no mandate, there is little incentive to invest in precautions for events, such as a Southern snowstorm, that are rare. Any company that took such precautions would put itself at a competitive disadvantage.
A surplus supply of natural gas, the dominant power fuel in Texas, near power plants might have helped avoid the cascade of failures in which power went off, forcing natural gas production and transmission offline, which in turn led to further power shortages.
In the aftermath of the dayslong outages, ERCOT has been criticized by both Democratic and Republican residents, lawmakers and business executives, a rare display of unity in a fiercely partisan and Republican-dominated state. Mr. Abbott said he supported calls for the agency’s leadership to resign and made ERCOT reform a priority for the Legislature. The reckoning has been swift — this week, lawmakers will hold hearings in Austin to investigate the agency’s handling of the storm and the rolling outages.
For ERCOT operators, the storm’s arrival was swift and fierce, but they had anticipated it and knew it would strain their system. They asked power customers across the state to conserve, warning that outages were likely.
But late on Sunday, Feb. 14, it rapidly became clear that the storm was far worse than they had expected: Sleet and snow fell, and temperatures plunged. In the council’s command center outside Austin, a room dominated by screens flashing with maps, graphics and data tracking the flow of electricity to 26 million people in Texas, workers quickly found themselves fending off a crisis. As weather worsened into Monday morning, residents cranked up their heaters and demand surged.
Power plants began falling offline in rapid succession as they were overcome by the frigid weather or ran out of fuel to burn. Within hours, 40 percent of the power supply had been lost.
The entire grid — carrying 90 percent of the electric load in Texas — was barreling toward a collapse.
In the electricity business, supply and demand need to be in balance. Imbalances lead to catastrophic blackouts. Recovering from a total blackout would be an agonizing and tedious process, known as a “black start,” that could take weeks, or possibly months.
And in the early-morning hours last Monday, the Texas grid was “seconds and minutes” away from such a collapse, said Bill Magness, the president and chief executive of the Electric Reliability Council.
“If we had allowed a catastrophic blackout to happen, we wouldn’t be talking today about hopefully getting most customers their power back,” Mr. Magness said. “We’d be talking about how many months it might be before you get your power back.”
Earlier warnings of trouble
The outages and the cold weather touched off an avalanche of failures, but there had been warnings long before last week’s storm.
After a heavy snowstorm in February 2011 caused statewide rolling blackouts and left millions of Texans in the dark, federal authorities warned the state that its power infrastructure had inadequate “winterization” protection. But 10 years later, pipelines remained inadequately insulated and heaters that might have kept instruments from freezing were never installed.
During heat waves, when demand has soared during several recent summers, the system in Texas has also strained to keep up, raising questions about lack of reserve capacity on the unregulated grid.
And aside from the weather, there have been periodic signs that the system can run into trouble delivering sufficient energy, in some cases because of equipment failures, in others because of what critics called an attempt to drive up prices, according to Mr. Hirs of the University of Houston, as well as several energy consultants.
Another potential safeguard might have been far stronger connections to the two interstate power-sharing networks, East and West, that allow states to link their electrical grids and obtain power from thousands of miles away when needed to hold down costs and offset their own shortfalls.
But Texas, reluctant to submit to the federal regulation that is part of the regional power grids, made decisions as far back as the early 20th century to become the only state in the continental United States to operate its own grid — a plan that leaves it able to borrow only from a few close neighbors.
The border city of El Paso survived the freeze much better than Dallas or Houston because it was not part of the Texas grid but connected to the much larger grid covering many Western states.
But the problems that began with last Monday’s storm went beyond an isolated electrical grid. The entire ecosystem of how Texas generates, transmits and uses power stalled, as millions of Texans shivered in darkened, unheated homes.
Texans love to brag about natural gas, which state officials often call the cleanest-burning fossil fuel. No state produces more, and gas-fired power plants produce nearly half the state’s electricity.
“We are struggling to come to grips with the reality that gas came up short and let us down when we needed it most,” said Michael E. Webber, a professor of mechanical engineering at the University of Texas at Austin.
The cold was so severe that the enormous oil and natural gas fields of West Texas froze up, or could not get sufficient power to operate. Though a few plants had stored gas reserves, there was insufficient electricity to pump it.
The leaders of ERCOT defended the organization, its lack of mandated reserves and the state’s isolation from larger regional grids, and said the blame for the power crisis lies with the weather, not the overall deregulated system in Texas.
“The historic, just about unprecedented, storm was the heart of the problem,” Mr. Magness, the council’s chief executive, said, adding: “We’ve found that this market structure works. It demands reliability. I don’t think there’s a silver-bullet market structure that could have managed the extreme lows and generation outages that we were facing Sunday night.”
In Texas, energy regulation is as much a matter of philosophy as policy. Its independent power grid is a point of pride that has been an applause line in Texas political speeches for decades.
Deregulation is a hot topic among Texas energy experts, and there has been no shortage of predictions that the grid could fail under stress. But there has not been widespread public dissatisfaction with the system, although many are now wondering if they are being well served.
“I believe there is great value in Texas being on its own grid and I believe we can do so safely and securely and confidently going forward,” said State Representative Jeff Leach, a Republican from Plano who has called for an investigation into what went wrong. “But it’s going to take new investment and some new strategic decisions to make sure we’re protected from this ever happening again.”
Steven D. Wolens, a former Democratic lawmaker from Dallas and a principal architect of the 1999 deregulation legislation, said deregulation was meant to spur more generation, including from renewable energy sources, and to encourage the mothballing of older plants that were spewing pollution. “We were successful,” said Mr. Wolens, who left the Legislature in 2005.
But the 1999 legislation was intended as a first iteration that would evolve along with the needs of the state, he said. “They can focus on it now and they can fix it now,” he said. “The buck stops with the Texas Legislature and they are in a perfect position to determine the basis of the failure, to correct it and make sure it never happens again.”
Clifford Krauss reported from Houston, Manny Fernandez and Ivan Penn from Los Angeles, and Rick Rojas from Nashville. David Montgomery contributed reporting from Austin, Texas.
Christopher Flavelle, Brad Plumer, Hiroko Tabuchi – Feb 20, 2021
Continent-spanning storms triggered blackouts in Oklahoma and Mississippi, halted one-third of U.S. oil production and disrupted vaccinations in 20 states.
Even as Texas struggled to restore electricity and water over the past week, signs of the risks posed by increasingly extreme weather to America’s aging infrastructure were cropping up across the country.
The week’s continent-spanning winter storms triggered blackouts in Texas, Oklahoma, Mississippi and several other states. One-third of oil production in the nation was halted. Drinking-water systems in Ohio were knocked offline. Road networks nationwide were paralyzed and vaccination efforts in 20 states were disrupted.
The crisis carries a profound warning. As climate change brings more frequent and intense storms, floods, heat waves, wildfires and other extreme events, it is placing growing stress on the foundations of the country’s economy: Its network of roads and railways, drinking-water systems, power plants, electrical grids, industrial waste sites and even homes. Failures in just one sector can set off a domino effect of breakdowns in hard-to-predict ways.
Much of this infrastructure was built decades ago, under the expectation that the environment around it would remain stable, or at least fluctuate within predictable bounds. Now climate change is upending that assumption.
“We are colliding with a future of extremes,” said Alice Hill, who oversaw planning for climate risks on the National Security Council during the Obama administration. “We base all our choices about risk management on what’s occurred in the past, and that is no longer a safe guide.”
Sewer systems are overflowing more often as powerful rainstorms exceed their design capacity. Coastal homes and highways are collapsing as intensified runoff erodes cliffs. Coal ash, the toxic residue produced by coal-burning plants, is spilling into rivers as floods overwhelm barriers meant to hold it back. Homes once beyond the reach of wildfires are burning in blazes they were never designed to withstand.
Problems like these often reflect an inclination of governments to spend as little money as possible, said Shalini Vajjhala, a former Obama administration official who now advises cities on meeting climate threats. She said it’s hard to persuade taxpayers to spend extra money to guard against disasters that seem unlikely.
But climate change flips that logic, making inaction far costlier. “The argument I would make is, we can’t afford not to, because we’re absorbing the costs” later, Ms. Vajjhala said, after disasters strike. “We’re spending poorly.”
The Biden administration has talked extensively about climate change, particularly the need to reduce greenhouse gas emissions and create jobs in renewable energy. But it has spent less time discussing how to manage the growing effects of climate change, facing criticism from experts for not appointing more people who focus on climate resilience.
“I am extremely concerned by the lack of emergency-management expertise reflected in Biden’s climate team,” said Samantha Montano, an assistant professor at the Massachusetts Maritime Academy who focuses on disaster policy. “There’s an urgency here that still is not being reflected.”
A White House spokesman, Vedant Patel, said in a statement, “Building resilient and sustainable infrastructure that can withstand extreme weather and a changing climate will play an integral role in creating millions of good paying, union jobs” while cutting greenhouse gas emissions.
And while President Biden has called for a major push to refurbish and upgrade the nation’s infrastructure, getting a closely divided Congress to spend hundreds of billions, if not trillions of dollars, will be a major challenge.
Heightening the cost to society, disruptions can disproportionately affect lower-income households and other vulnerable groups, including older people or those with limited English.
“All these issues are converging,” said Robert D. Bullard, a professor at Texas Southern University who studies wealth and racial disparities related to the environment. “And there’s simply no place in this country that’s not going to have to deal with climate change.”
Many forms of water crisis
In September, when a sudden storm dumped a record-setting two inches of rain on Washington in less than 75 minutes, the result wasn’t just widespread flooding, but also raw sewage rushing into hundreds of homes.
Washington, like many other cities in the Northeast and Midwest, relies on what’s called a combined sewer overflow system: If a downpour overwhelms storm drains along the street, they are built to overflow into the pipes that carry raw sewage. But if there’s too much pressure, sewage can be pushed backward, into people’s homes — where the forces can send it erupting from toilets and shower drains.
This is what happened in Washington. The city’s system was built in the late 1800s. Now, climate change is straining an already outdated design.
DC Water, the local utility, is spending billions of dollars so that the system can hold more sewage. “We’re sort of in uncharted territory,” said Vincent Morris, a utility spokesman.
The challenge of managing and taming the nation’s water supplies — whether in streets and homes, or in vast rivers and watersheds — is growing increasingly complex as storms intensify. Last May, rain-swollen flooding breached two dams in Central Michigan, forcing thousands of residents to flee their homes and threatening a chemical complex and toxic waste cleanup site. Experts warned it was unlikely to be the last such failure.
Many of the country’s 90,000 dams were built decades ago and were already in dire need of repairs. Now climate change poses an additional threat, bringing heavier downpours to parts of the country and raising the odds that some dams could be overwhelmed by more water than they were designed to handle. One recent study found that most of California’s biggest dams were at increased risk of failure as global warming advances.
In recent years, dam-safety officials have begun grappling with the dangers. Colorado, for instance, now requires dam builders to take into account the risk of increased atmospheric moisture driven by climate change as they plan for worst-case flooding scenarios.
But nationwide, there remains a backlog of thousands of older dams that still need to be rehabilitated or upgraded. The price tag could ultimately stretch to more than $70 billion.
“Whenever we study dam failures, we often find there was a lot of complacency beforehand,” said Bill McCormick, president of the Association of State Dam Safety Officials. But given that failures can have catastrophic consequences, “we really can’t afford to be complacent.”
Built for a different future
If the Texas blackouts exposed one state’s poor planning, they also provide a warning for the nation: Climate change threatens virtually every aspect of electricity grids that aren’t always designed to handle increasingly severe weather. The vulnerabilities show up in power lines, natural-gas plants, nuclear reactors and myriad other systems.
Higher storm surges can knock out coastal power infrastructure. Deeper droughts can reduce water supplies for hydroelectric dams. Severe heat waves can reduce the efficiency of fossil-fuel generators, transmission lines and even solar panels at precisely the moment that demand soars because everyone cranks up their air-conditioners.
Climate hazards can also combine in new and unforeseen ways.
In California recently, Pacific Gas & Electric has had to shut off electricity to thousands of people during exceptionally dangerous fire seasons. The reason: Downed power lines can spark huge wildfires in dry vegetation. Then, during a record-hot August last year, several of the state’s natural gas plants malfunctioned in the heat, just as demand was spiking, contributing to blackouts.
“We have to get better at understanding these compound impacts,” said Michael Craig, an expert in energy systems at the University of Michigan who recently led a study looking at how rising summer temperatures in Texas could strain the grid in unexpected ways. “It’s an incredibly complex problem to plan for.”
Some utilities are taking notice. After Superstorm Sandy in 2012 knocked out power for 8.7 million customers, utilities in New York and New Jersey invested billions in flood walls, submersible equipment and other technology to reduce the risk of failures. Last month, New York’s Con Edison said it would incorporate climate projections into its planning.
As freezing temperatures struck Texas, a glitch at one of two reactors at a South Texas nuclear plant, which serves 2 million homes, triggered a shutdown. The cause: Sensing lines connected to the plant’s water pumps had frozen, said Victor Dricks, a spokesman for the federal Nuclear Regulatory Commission.
It’s also common for extreme heat to disrupt nuclear power. The issue is that the water used to cool reactors can become too warm to use, forcing shutdowns.
Flooding is another risk.
After a tsunami led to several meltdowns at Japan’s Fukushima Daiichi power plant in 2011, the U.S. Nuclear Regulatory Commission told the 60 or so working nuclear plants in the United States, many decades old, to evaluate their flood risk to account for climate change. Ninety percent showed at least one type of flood risk that exceeded what the plant was designed to handle.
The greatest risk came from heavy rain and snowfall exceeding the design parameters at 53 plants.
Scott Burnell, a Nuclear Regulatory Commission spokesman, said in a statement, “The NRC continues to conclude, based on the staff’s review of detailed analyses, that all U.S. nuclear power plants can appropriately deal with potential flooding events, including the effects of climate change, and remain safe.”
Several climate-related risks appeared to have converged to heighten the danger. Rising seas and higher storm surges have intensified coastal erosion, while more extreme bouts of precipitation have increased the landslide risk.
Add to that the effects of devastating wildfires, which can damage the vegetation holding hillside soil in place, and “things that wouldn’t have slid without the wildfires, start sliding,” said Jennifer M. Jacobs, a professor of civil and environmental engineering at the University of New Hampshire. “I think we’re going to see more of that.”
The United States depends on highways, railroads and bridges as economic arteries for commerce, travel and simply getting to work. But many of the country’s most important links face mounting climate threats. More than 60,000 miles of roads and bridges in coastal floodplains are already vulnerable to extreme storms and hurricanes, government estimates show. And inland flooding could also threaten at least 2,500 bridges across the country by 2050, a federal climate report warned in 2018.
Sometimes even small changes can trigger catastrophic failures. Engineers modeling the collapse of bridges over Escambia Bay in Florida during Hurricane Ivan in 2004 found that the extra three inches of sea-level rise since the bridge was built in 1968 very likely contributed to the collapse, because of the added height of the storm surge and force of the waves.
“A lot of our infrastructure systems have a tipping point. And when you hit the tipping point, that’s when a failure occurs,” Dr. Jacobs said. “And the tipping point could be an inch.”
Crucial rail networks are at risk, too. In 2017, Amtrak consultants found that along parts of the Northeast corridor, which runs from Boston to Washington and carries 12 million people a year, flooding and storm surge could erode the track bed, disable the signals and eventually put the tracks underwater.
And there is no easy fix. Elevating the tracks would require also raising bridges, electrical wires and lots of other infrastructure, and moving them would mean buying new land in a densely packed part of the country. So the report recommended flood barriers, costing $24 million per mile, that must be moved into place whenever floods threaten.
The blasts at the plant came after flooding knocked out the site’s electrical supply, shutting down refrigeration systems that kept volatile chemicals stable. Almost two dozen people, many of them emergency workers, were treated for exposure to the toxic fumes, and some 200 nearby residents were evacuated from their homes.
More than 2,500 facilities that handle toxic chemicals lie in federal flood-prone areas across the country, about 1,400 of them in areas at the highest risk of flooding, a New York Times analysis showed in 2018.
Leaks from toxic cleanup sites, left behind by past industry, pose another threat.
Almost two-thirds of some 1,500 Superfund cleanup sites across the country are in areas with an elevated risk of flooding, storm surge, wildfires or sea level rise, a government audit warned in 2019. Coal ash, a toxic substance produced by coal power plants that is often stored as sludge in special ponds, has been particularly exposed. After Hurricane Florence in 2018, for example, a dam breach at the site of a power plant in Wilmington, N.C., released the hazardous ash into a nearby river.
“We should be evaluating whether these facilities or sites actually have to be moved or re-secured,” said Lisa Evans, senior counsel at Earthjustice, an environmental law organization. Places that “may have been OK in 1990,” she said, “may be a disaster waiting to happen in 2021.”
As many in Texas wake up still without power on Thursday morning, millions are now also having to contend with water shutdowns, boil advisories, and empty grocery shelves as cities struggle with keeping infrastructure powered and supply chains are interrupted.
Even as some residents are getting their power restored, the problems continue to compound: the few grocery stores still open quickly sold out of food and supplies. As many without power watched their refrigerated food spoil, lines to get into stores wrapped around blocks and buildings, and store shelves sat completely empty with no indication of when new shipments would arrive. Food banks have had to cancel deliveries and schools have halted meal distribution to students, the Texas Tribune reports.
People experiencing homelessness, including a disproportionate number of Black residents, have especially suffered in the record cold temperatures across the state. There have been some reports of people being found dead in the streets because of a lack of shelter.
“Businesses are shut down. Streets are empty, other than a few guys sliding around in 4x4s and fire trucks rushing to rescue people who turn their ovens on to keep warm and poison themselves with carbon monoxide,” wrote Austin resident Jeff Goodell in Rolling Stone. “Yesterday, the line at our neighborhood grocery store was three blocks long. People wandering around with handguns on their hip adds to a sense of lawlessness (Texas is an open-carry state).”
The Texas agricultural commissioner has said that farmers and ranchers are having to throw away millions of dollars worth of goods because of a lack of power. “We’re looking at a food supply chain problem like we’ve never seen before, even with COVID-19,” he told one local news affiliate.
An energy analyst likened the power crisis to the fallout of Hurricane Katrina as it’s becoming increasingly clear that the situation in Texas is a statewide disaster.
As natural gas output declined dramatically in the state, Paul Sankey, who leads energy analyst firm Sankey Research, said on Bloomberg, “This situation to me is very reminiscent of Hurricane Katrina…. We have never seen a loss [of energy supply] at this scale” in mid-winter. This is “the biggest outage in the history [of] U.S. oil and gas,” Sankey said.
Experts say that the power outages have partially been caused by the deregulation of the state’s electric grid. The government, some say, favored deregulatory actions like not requiring electrical equipment upgrades or proper weatherization, instead relying on free market mechanisms that ultimately contributed to the current disaster.
Former Gov. Rick Perry faced criticism on Wednesday when he said that Texans would rather face the current disaster than have to be regulated by the federal government. And he’s not the only Republican currently catching heat — many have begun calling for the resignation of Gov. Greg Abbott for a failure of leadership. On Wednesday, as millions suffered without power and under boil-water advisories, the governor went on Fox to attack clean energy, which experts say was not a major contributor to the current crisis, and the Green New Deal.
After President Biden declared a state of emergency in Texas over the weekend, his administration announced on Wednesday that it would send generators and other supplies to the state.
WHEN IT RAINS, it pours, and when it snows, the lights turn off. Or so it goes in Texas. After a winter storm pummelled the Lone Star State with record snowfall and the lowest temperatures in more than 30 years, millions were left without electricity and heat. On February 16th 4.5m Texan households were cut off from power, as providers were overloaded with demand and tried to shuffle access to electricity so the whole grid did not go down.
Whole skylines, including Dallas’s, went dark to conserve power. Some Texans braved the snowy roads to check into the few hotels with remaining rooms, only for the hotels’ power to go off as they arrived. Others donned skiwear and remained inside, hoping the lights and heat would come back on. Across the state, what were supposed to be “rolling” blackouts lasted for days. It is still too soon to quantify the devastation. More than 20 people have died in motor accidents, from fires lit for warmth and from carbon-monoxide poisoning from using cars for heat. The storm has also halted deliveries of covid-19 vaccines and may prevent around 1m vaccinations from happening this week. Several retail electricity providers are likely to go bankrupt, after being hit with surging wholesale power prices.
Other states, including Tennessee, were also covered in snow, but Texas got the lion’s share and ground to a halt. Texans are rightly furious that residents of America’s energy capital cannot count on reliable power. Everyone is asking why.
The short answer is that the Electric Reliability Council of Texas (ERCOT), which operates the grid, did not properly forecast the demand for energy as a result of the storm. Some say that this was nearly impossible to predict, but there were warnings of the severity of the coming weather in the preceding week, and ERCOT’s projections were notably short. Brownouts last summer had already demonstrated the grid’s lack of excess capacity, says George O’Leary of Tudor, Pickering, Holt & Co. (TPH), an energy investment bank.
Many Republican politicians were quick to blame renewable energy sources, such as wind power, for the blackouts, but that is not fair. Some wind turbines did indeed freeze, but natural gas, which accounts for around half of the state’s electricity generation, was the primary source of the shortfall. Plants broke down, as did the gas supply chain and pipelines. The cold also caused a reactor at one of the state’s two nuclear plants to go offline. Transmission lines may have also iced up, says Wade Schauer of Wood Mackenzie, an energy-research firm. In short, Texas experienced a perfect storm.
Some of the blame falls on the unique design of the electricity market in Texas. Of America’s 48 contiguous states, it is the only one with its own stand-alone electricity grid—the Texas Interconnection. This means that when power generators fail, the state cannot import electricity from outside its borders.
The state’s deregulated power market is also fiercely competitive. ERCOT oversees the grid, while power generators produce electricity for the wholesale market. Some 300 retail electricity providers buy that power and then compete for consumers. Because such cold weather is rare, energy companies do not invest in “winterising” their equipment, as this would raise their prices for consumers. Perhaps most important, the state does not have a “capacity market”, which would ensure that there was extra power available for surging demand. This acts as a sort of insurance policy so the lights will not go out, but it also means customers pay higher bills.
For years the benefits of Texas’s deregulated market structure were clear. At 8.6 cents per kilowatt hour, the state’s average retail price for electricity is around one-fifth lower than the national average and about half the cost of California’s. In 1999 the state set targets for renewables, and today it accounts for around 30% of America’s wind energy.
This disaster is prompting people to question whether Texas’s system is as resilient and well-designed as people previously believed. Greg Abbott, the governor, has called for an investigation into ERCOT. This storm “has exposed some serious weaknesses in our free-market approach in Texas”, says Luke Metzger of Environment Texas, a non-profit, who had been without power for 54 hours when The Economist went to press.
Wholly redesigning the power grid in Texas seems unlikely. After the snow melts, the state will need to tackle two more straightforward questions. The first is whether it needs to increase reserve capacity. “If we impose a capacity market here and a bunch of new cap-ex is required to winterise equipment, who bears that cost? Ultimately it’s the customer,” says Bobby Tudor, chairman of TPH. The second is how Texas can ensure the reliability of equipment in extreme weather conditions. After a polar vortex in 2014 hit the east coast, PJM, a regional transmission organisation, started making higher payments based on reliability of service, says Michael Weinstein of Credit Suisse, a bank. In Texas there is no penalty for systems going down, except for public complaints and politicians’ finger-pointing.
Texas is hardly the only state to struggle with blackouts. California, which has a more tightly regulated power market, is regularly plunged into darkness during periods of high heat, winds and wildfires. Unlike Texas, much of northern California is dependent on a single utility, PG&E. The company has been repeatedly sued for dismal, dangerous management. But, as in Texas, critics have blamed intermittent renewable power for blackouts. In truth, California’s blackouts share many of the same causes as those in Texas: extreme weather, power generators that failed unexpectedly, poor planning by state regulators and an inability (in California, temporary) to import power from elsewhere. In California’s blackouts last year, solar output naturally declined in the evening. But gas plants also went offline and weak rainfall lowered the output of hydroelectric dams.
In California, as in Texas, it would help to have additional power generation, energy storage to meet peak demand and more resilient infrastructure, such as buried power lines and more long-distance, high-voltage transmission. Weather events that once might have been dismissed as unusual are becoming more common. Without more investment in electricity grids, blackouts will be, too.
Systems are designed to handle spikes in demand, but the wild and unpredictable weather linked to global warming will very likely push grids beyond their limits.
Published Feb. 16, 2021; updated Feb. 17, 2021, 6:59 a.m. ET
Huge winter storms plunged large parts of the central and southern United States into an energy crisis this week, with frigid blasts of Arctic weather crippling electric grids and leaving millions of Americans without power amid dangerously cold temperatures.
The grid failures were most severe in Texas, where more than four million people woke up Tuesday morning to rolling blackouts. Separate regional grids in the Southwest and Midwest also faced serious strain. As of Tuesday afternoon, at least 23 people nationwide had died in the storm or its aftermath.
Analysts have begun to identify key factors behind the grid failures in Texas. Record-breaking cold weather spurred residents to crank up their electric heaters and pushed power demand beyond the worst-case scenarios that grid operators had planned for. At the same time, a large fraction of the state’s gas-fired power plants were knocked offline amid icy conditions, with some plants suffering fuel shortages as natural gas demand spiked. Many of Texas’ wind turbines also froze and stopped working.
The crisis sounded an alarm for power systems throughout the country. Electric grids can be engineered to handle a wide range of severe conditions — as long as grid operators can reliably predict the dangers ahead. But as climate change accelerates, many electric grids will face extreme weather events that go far beyond the historical conditions those systems were designed for, putting them at risk of catastrophic failure.
Measures that could help make electric grids more robust — such as fortifying power plants against extreme weather, or installing more backup power sources — could prove expensive. But as Texas shows, blackouts can be extremely costly, too. And, experts said, unless grid planners start planning for increasingly wild and unpredictable climate conditions, grid failures will happen again and again.
“It’s essentially a question of how much insurance you want to buy,” said Jesse Jenkins, an energy systems engineer at Princeton University. “What makes this problem even harder is that we’re now in a world where, especially with climate change, the past is no longer a good guide to the future. We have to get much better at preparing for the unexpected.”
A System Pushed to the Limit
Texas’ main electric grid, which largely operates independently from the rest of the country, has been built with the state’s most common weather extremes in mind: soaring summer temperatures that cause millions of Texans to turn up their air-conditioners all at once.
While freezing weather is rarer, grid operators in Texas have also long known that electricity demand can spike in the winter, particularly after damaging cold snaps in 2011 and 2018. But this week’s winter storms, which buried the state in snow and ice, and led to record-cold temperatures, surpassed all expectations — and pushed the grid to its breaking point.
Texas’ grid operators had anticipated that, in the worst case, the state would use 67 gigawatts of electricity during the winter peak. But by Sunday evening, power demand had surged past that level. As temperatures dropped, many homes were relying on older, inefficient electric heaters that consume more power.
The problems compounded from there, with frigid weather on Monday disabling power plants with capacity totaling more than 30 gigawatts. The vast majority of those failures occurred at thermal power plants, like natural gas generators, as plummeting temperatures paralyzed plant equipment and soaring demand for natural gas left some plants struggling to obtain sufficient fuel. A number of the state’s power plants were also offline for scheduled maintenance in preparation for the summer peak.
The state’s fleet of wind farms also lost up to 4.5 gigawatts of capacity at times, as many turbines stopped working in cold and icy conditions, though this was a smaller part of the problem.
In essence, experts said, an electric grid optimized to deliver huge quantities of power on the hottest days of the year was caught unprepared when temperatures plummeted.
While analysts are still working to untangle all of the reasons behind Texas’ grid failures, some have also wondered whether the unique way the state manages its largely deregulated electricity system may have played a role. In the mid-1990s, for instance, Texas decided against paying energy producers to hold a fixed number of backup power plants in reserve, instead letting market forces dictate what happens on the grid.
On Tuesday, Gov. Greg Abbott called for an emergency reform of the Electric Reliability Council of Texas, the nonprofit corporation that oversees the flow of power in the state, saying its performance had been “anything but reliable” over the previous 48 hours.
‘A Difficult Balancing Act’
In theory, experts said, there are technical solutions that can avert such problems.
Wind turbines can be equipped with heaters and other devices so that they can operate in icy conditions — as is often done in the upper Midwest, where cold weather is more common. Gas plants can be built to store oil on-site and switch over to burning the fuel if needed, as is often done in the Northeast, where natural gas shortages are common. Grid regulators can design markets that pay extra to keep a larger fleet of backup power plants in reserve in case of emergencies, as is done in the Mid-Atlantic.
But these solutions all cost money, and grid operators are often wary of forcing consumers to pay extra for safeguards.
“Building in resilience often comes at a cost, and there’s a risk of both underpaying but also of overpaying,” said Daniel Cohan, an associate professor of civil and environmental engineering at Rice University. “It’s a difficult balancing act.”
In the months ahead, as Texas grid operators and policymakers investigate this week’s blackouts, they will likely explore how the grid might be bolstered to handle extremely cold weather. Some possible ideas include: building more connections between Texas and other states to balance electricity supplies, a move the state has long resisted; encouraging homeowners to install battery backup systems; or keeping additional power plants in reserve.
The search for answers will be complicated by climate change. Over all, the state is getting warmer as global temperatures rise, and cold-weather extremes are, on average, becoming less common over time.
But some climate scientists have also suggested that global warming could, paradoxically, bring more unusually fierce winter storms. Some research indicates that Arctic warming is weakening the jet stream, the high-level air current that circles the northern latitudes and usually holds back the frigid polar vortex. This can allow cold air to periodically escape to the South, resulting in episodes of bitter cold in places that rarely get nipped by frost.
But this remains an active area of debate among climate scientists, with some experts less certain that polar vortex disruptions are becoming more frequent, making it even trickier for electricity planners to anticipate the dangers ahead.
All over the country, utilities and grid operators are confronting similar questions, as climate change threatens to intensify heat waves, floods, water shortages and other calamities, all of which could create novel risks for the nation’s electricity systems. Adapting to those risks could carry a hefty price tag: One recent study found that the Southeast alone may need 35 percent more electric capacity by 2050 simply to deal with the known hazards of climate change.
And the task of building resilience is becoming increasingly urgent. Many policymakers are promoting electric cars and electric heating as a way of curbing greenhouse gas emissions. But as more of the nation’s economy depends on reliable flows of electricity, the cost of blackouts will become ever more dire.
“This is going to be a significant challenge,” said Emily Grubert, an infrastructure expert at Georgia Tech. “We need to decarbonize our power systems so that climate change doesn’t keep getting worse, but we also need to adapt to changing conditions at the same time. And the latter alone is going to be very costly. We can already see that the systems we have today aren’t handling this very well.”
John Schwartz, Dave Montgomery and Ivan Penn contributed reporting.
According to the research, global greenhouse gas emissions over the last century favored the growth of bat habitat, making southern China a region conducive to the emergence and spread of the Sars-CoV-2 virus.
The analysis was based on a map of the world’s vegetation in the 20th century, using data on temperature, precipitation and cloud cover. The researchers analyzed the distribution of bats in the early 1900s and, comparing it with their current distribution, concluded that different species shifted regions because of changes in the planet’s climate.
“Understanding how the global distribution of bat species has shifted in response to climate change may be an important step in reconstructing the origin of the Covid-19 outbreak,” Robert Beyer, a researcher in the Department of Zoology at the University of Cambridge in the United Kingdom and lead author of the study, said in a statement.
Major changes were observed in the vegetation of the Chinese province of Yunnan, as well as in Myanmar and Laos. Increases in temperature, sunlight and atmospheric carbon dioxide concentrations transformed habitat once made up of tropical shrubland into tropical savanna and temperate forest.
These new conditions created a favorable environment for 40 bat species to migrate into Yunnan province over the last century, concentrating more than 100 types of coronavirus in the area that the data indicate as the origin of the Sars-CoV-2 outbreak. The region is also home to pangolins, which are considered likely intermediate hosts in the pandemic.
“Conforme as mudanças climáticas alteraram os habitats, espécies deixaram algumas áreas e foram para outras — levando os vírus com elas. Isso não apenas alterou as regiões onde os vírus estão presentes, mas provavelmente permitiu novas interações entre animais e vírus, fazendo com que vírus mais perigosos fossem transmitidos ou desenvolvidos”, explicou Beyer.
O estudo ainda identificou que as mudanças climáticas resultaram no aumento do número de espécies de morcegos em outras regiões, como na África Central, na América do Sul e na América Central. “A pandemia de Covid-19 causou grande prejuízo social e econômico. Os governos devem aproveitar a oportunidade para reduzir os riscos que doenças infecciosas apresentam à saúde e agir para mitigar as mudanças climáticas”, alertou o professor Andrea Manica, do Departamento de Zoologia da Universidade de Cambridge.
Os pesquisadores também ressaltam que é preciso limitar a expansão de áreas urbanas, fazendas e áreas de caça em habitats naturais para que seja reduzido o contato entre humanos e animais transmissores doenças.
Reductions in aerosol emissions had slight warming impact, study finds
Date: February 2, 2021
Source: National Center for Atmospheric Research/University Corporation for Atmospheric Research
Summary: The lockdowns and reduced societal activity related to the COVID-19 pandemic affected emissions of pollutants in ways that slightly warmed the planet for several months last year, according to new research. The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight.
The lockdowns and reduced societal activity related to the COVID-19 pandemic affected emissions of pollutants in ways that slightly warmed the planet for several months last year, according to new research led by the National Center for Atmospheric Research (NCAR).
The counterintuitive finding highlights the influence of airborne particles, or aerosols, that block incoming sunlight. When emissions of aerosols dropped last spring, more of the Sun’s warmth reached the planet, especially in heavily industrialized nations, such as the United States and Russia, that normally pump high amounts of aerosols into the atmosphere.
“There was a big decline in emissions from the most polluting industries, and that had immediate, short-term effects on temperatures,” said NCAR scientist Andrew Gettelman, the study’s lead author. “Pollution cools the planet, so it makes sense that pollution reductions would warm the planet.”
Temperatures over parts of Earth’s land surface last spring were about 0.2-0.5 degrees Fahrenheit (0.1-0.3 degrees Celsius) warmer than would have been expected with prevailing weather conditions, the study found. The effect was most pronounced in regions that normally are associated with substantial emissions of aerosols, with the warming reaching about 0.7 degrees F (0.37 C) over much of the United States and Russia.
The new study highlights the complex and often conflicting influences of different types of emissions from power plants, motor vehicles, industrial facilities, and other sources. While aerosols tend to brighten clouds and reflect heat from the Sun back into space, carbon dioxide and other greenhouse gases have the opposite effect, trapping heat near the planet’s surface and elevating temperatures.
Despite the short-term warming effects, Gettelman emphasized that the long-term impact of the pandemic may be to slightly slow climate change because of reduced emissions of carbon dioxide, which lingers in the atmosphere for decades and has a more gradual influence on climate. In contrast, aerosols — the focus of the new study — have a more immediate impact that fades away within a few years.
The study was published in Geophysical Research Letters. It was funded in part by the National Science Foundation, NCAR’s sponsor. In addition to NCAR scientists, the study was co-authored by scientists at Oxford University, Imperial College, and the University of Leeds.
Teasing out the impacts
Although scientists have long been able to quantify the warming impacts of carbon dioxide, the climatic influence of various types of aerosols — including sulfates, nitrates, black carbon, and dust — has been more difficult to pin down. One of the major challenges for projecting the extent of future climate change is estimating the extent to which society will continue to emit aerosols in the future and the influence of the different types of aerosols on clouds and temperature.
To conduct the research, Gettelman and his co-authors used two of the world’s leading climate models: the NCAR-based Community Earth System Model and a model known as ECHAM-HAMMOZ, which was developed by a consortium of European nations. They ran simulations on both models, adjusting emissions of aerosols and incorporating actual meteorological conditions in 2020, such as winds.
This approach enabled them to identify the impact of reduced emissions on temperature changes that were too small to tease out in actual observations, where they could be obscured by the variability in atmospheric conditions.
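The logic of that paired-simulation comparison can be sketched with a toy ensemble in Python. The numbers, noise levels, and the assumed 0.2 °C aerosol signal below are illustrative, not NCAR's model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: monthly land-temperature anomalies (degrees C) for one region.
# "baseline" runs use normal aerosol emissions; "covid" runs use reduced
# emissions. Both share the same 2020 meteorology, so internal variability
# is common to each pair of runs and cancels when we difference them.
n_members, n_months = 10, 12
weather_noise = rng.normal(0.0, 0.5, size=(n_members, n_months))  # shared
aerosol_warming = 0.2  # assumed signal, much smaller than the noise

baseline = weather_noise
covid = weather_noise + aerosol_warming + rng.normal(0.0, 0.05, (n_members, n_months))

# Differencing paired members removes the shared variability and exposes
# the small aerosol-driven warming, which raw observations would hide.
signal = (covid - baseline).mean()
print(f"estimated warming: {signal:.2f} C")
```

Because the large weather term is identical in both runs, the residual after differencing is dominated by the imposed aerosol effect, which is the essence of why paired simulations can detect signals "too small to tease out in actual observations."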
The results showed that the warming effect was strongest in the mid and upper latitudes of the Northern Hemisphere. The effect was mixed in the tropics and comparatively minor in much of the Southern Hemisphere, where aerosol emissions are not as pervasive.
Gettelman said the study will help scientists better understand the influence of various types of aerosols in different atmospheric conditions, helping to inform efforts to minimize climate change. Although the research illustrates how aerosols counter the warming influence of greenhouse gases, he emphasized that emitting more of them into the lower atmosphere is not a viable strategy for slowing climate change.
“Aerosol emissions have major health ramifications,” he said. “Saying we should pollute is not practical.”
The planet is hotter now than it has been for at least 12,000 years, a period spanning the entire development of human civilisation, according to research.
Analysis of ocean surface temperatures shows human-driven climate change has put the world in "uncharted territory", the scientists say. The planet may even be at its warmest for 125,000 years, although data from that far back are less certain.
The research, published in the journal Nature, reached these conclusions by solving a longstanding puzzle known as the “Holocene temperature conundrum”. Climate models have indicated continuous warming since the last ice age ended 12,000 years ago and the Holocene period began. But temperature estimates derived from fossil shells showed a peak of warming 6,000 years ago and then a cooling, until the industrial revolution sent carbon emissions soaring.
This conflict undermined confidence in the climate models and the shell data. But it was found that the shell data reflected only hotter summers and missed colder winters, and so was giving misleadingly high annual temperatures.
“We demonstrate that global average annual temperature has been rising over the last 12,000 years, contrary to previous results,” said Samantha Bova, at Rutgers University–New Brunswick in the US, who led the research. “This means that the modern, human-caused global warming period is accelerating a long-term increase in global temperatures, making today completely uncharted territory. It changes the baseline and emphasises just how critical it is to take our situation seriously.”
The world may be hotter now than any time since about 125,000 years ago, which was the last warm period between ice ages. However, scientists cannot be certain as there is less data relating to that time.
One study, published in 2017, suggested that global temperatures were last as high as today 115,000 years ago, but that was based on less data.
The new research examined temperature measurements derived from the chemistry of tiny shells and algal compounds found in cores of ocean sediments, and solved the conundrum by taking account of two factors.
First, the shells and organic materials had been assumed to represent the entire year but in fact were most likely to have formed during summer when the organisms bloomed. Second, there are well-known predictable natural cycles in the heating of the Earth caused by eccentricities in the orbit of the planet. Changes in these cycles can lead to summers becoming hotter and winters colder while average annual temperatures change only a little.
Combining these insights showed that the apparent cooling after the warm peak 6,000 years ago, revealed by shell data, was misleading. The shells were in fact only recording a decline in summer temperatures, but the average annual temperatures were still rising slowly, as indicated by the models.
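A toy calculation makes the mechanism concrete. The magnitudes and the Gaussian "orbital" term below are illustrative, not the paper's data; the point is that a summer-biased proxy can show a peak and decline while the annual mean rises throughout:

```python
import numpy as np

# Toy illustration of the "Holocene conundrum" mechanism: the annual-mean
# temperature rises slowly over 12,000 years, while orbital cycles make
# summers warmest mid-record. A proxy recording only summers then shows
# a spurious mid-Holocene peak followed by apparent cooling.
years = np.linspace(-12000, 0, 121)            # years before present
annual_mean = 0.00005 * (years + 12000)        # slow ~0.6 C rise overall
summer_boost = 1.0 * np.exp(-((years + 6000) / 3000) ** 2)  # orbital term

summer_proxy = annual_mean + summer_boost      # what summer-biased shells see

# The proxy peaks near 6,000 years BP even though the annual mean
# never declines at any point in the record.
peak_year = years[np.argmax(summer_proxy)]
print(f"proxy peak at {peak_year:.0f} years BP")
print("annual mean monotonically rising:", bool(np.all(np.diff(annual_mean) >= 0)))
```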
“Now they actually match incredibly well and it gives us a lot of confidence that our climate models are doing a really good job,” said Bova.
The study looked only at ocean temperature records, but Bova said: “The temperature of the sea surface has a really controlling impact on the climate of the Earth. If we know that, it is the best indicator of what global climate is doing.”
She led a research voyage off the coast of Chile in 2020 to take more ocean sediment cores and add to the available data.
Jennifer Hertzberg, of Texas A&M University in the US, said: “By solving a conundrum that has puzzled climate scientists for years, Bova and colleagues’ study is a major step forward. Understanding past climate change is crucial for putting modern global warming in context.”
Lijing Cheng, at the International Centre for Climate and Environment Sciences in Beijing, China, recently led a study that showed that in 2020 the world’s oceans reached their hottest level yet in instrumental records dating back to the 1940s. More than 90% of global heating is taken up by the seas.
Cheng said the new research was useful and intriguing. It provided a method to correct temperature data from shells and could also enable scientists to work out how much heat the ocean absorbed before the industrial revolution, a factor little understood.
On Monday, a weighty draft report on how to halt and reverse human-caused global warming will hit the inboxes of government experts. This is the final review before the Intergovernmental Panel on Climate Change (IPCC) issues its official summary of the science.
While part of the brief was to identify barriers to climate action, critics say there is little space given to the obstructive role of fossil fuel lobbying – and that’s a problem.
Robert Brulle, an American sociologist who has long studied institutions that promote climate denial, likened it to “trying to tell the story of Star Wars, but omitting Darth Vader”.
Tweeting in November, Brulle explained he declined an invitation to contribute to the working group three (WG3) report. “It became clear to me that institutionalized efforts to obstruct climate action was a peripheral concern. So I didn’t consider it worth engaging in this effort. It really deserves its own chapter & mention in the summary.”
In an email exchange with Climate Home News, Brulle expressed a hope the final version would nonetheless reflect his feedback. The significance of obstruction efforts should be reflected in the summary for policymakers and not “buried in an obscure part of the report,” he wrote.
His tweet sparked a lively conversation among scientists, with several supporting his concerns and others defending the IPCC, which aims to give policymakers an overview of the scientific consensus.
David Keith, a Harvard researcher into solar geoengineering, agreed the IPCC "tells a bloodless story, an abstract, numb version of the sharp political conflict that will shape climate action".
Social ecology and ecological economics professor Julia Steinberger, a lead author on WG3, said “there is a lot of self-censorship” within the IPCC. Where authors identify enemies of climate action, like fossil fuel companies, that content is “immediately flagged as political or normative or policy-prescriptive”.
The next set of reports is likely to be “a bit better” at covering the issue than previous efforts, Steinberger added, “but mainly because the world and outside publications have overwhelmingly moved past this, and the IPCC is catching up: not because the IPCC is leading.”
Politics professor Matthew Paterson was a lead author on WG3 for the previous round of assessment reports, published in 2014. He told Climate Home that Brulle is “broadly right” lobbying hasn’t been given enough attention although there is a “decent chunk” in the latest draft on corporations fighting for their interests and slowing down climate action.
Paterson said this was partly because the expertise of authors didn’t cover fossil fuel company lobbying and partly because governments would oppose giving the subject greater prominence. “Not just Saudi Arabia,” he said. “They object to everything. But the Americans [and others too]”.
While the IPCC reports are produced by scientists, government representatives negotiate the initial scope and have some influence over how the evidence is summarised before approving them for publication. “There was definitely always a certain adaptation – or an internalised sense of what governments are and aren’t going to accept – in the report,” said Paterson.
The last WG3 report in 2014 was nearly 1,500 pages long. Lobbying was not mentioned in its 32-page ‘summary for policymakers’ but lobbying against carbon taxes is mentioned a few times in the full report.
On page 1,184, the report says some companies “promoted climate scepticism by providing financial resources to like-minded think-tanks and politicians”. The report immediately balances this by saying “other fossil fuel companies adopted a more supportive position on climate science”.
One of the co-chairs of WG3, Jim Skea, rejected the criticisms as “completely unfair”. He told Climate Home News: “The IPCC produces reports very slowly because the whole cycle lasts seven years… we can’t respond on a 24/7 news cycle basis to ideas that come up.”
Skea noted there was a chapter on policies and institutions in the 2014 report which covered lobbying from industry and from green campaigners and their influence on climate policy. “The volume of climate change mitigation literature that comes out every year is huge and I would say that the number of references to articles which talk about lobbying of all kinds – including industrial lobbying and whether people had known about the science – it is in there and about the right proportions”, he said.
"We're not an advocacy organisation, we're a scientific organisation, it's not our job to take up arms and take one side or another," he said. "That's the strength of the IPCC. If it oversteps its role, it will weaken its influence" and "undermine the scientific statements it makes".
A broader, long-running criticism of the IPCC is that it downplays subjects like political science, development studies, sociology and anthropology and over-relies on economists and the people who put together ‘integrated assessment models’ (IAMs), which attempt to answer big questions like how the world can keep to 1.5C of global warming.
Paterson said the IPCC is "largely dominated by large-scale modellers or economists and the representation of other sorts of social scientists' expertise is very thin". A report he co-authored on the social make-up of that IPCC working group found that nearly half the authors were engineers or economists but just 15% were from social sciences other than economics. This dominance was sharper among the more powerful authors. Of the 35 Coordinating Lead Authors, 20 were economists or engineers; there was one each from political science, geography, and law, and none from the humanities.
Wim Carton, a lecturer in the political economy of climate change mitigation at Lund University, said that the IPCC (and scientific research in general) has been caught up in “adulation” of IAMs and this has led to “narrow techno-economic conceptualisations of future mitigation pathways”.
Skea said that there has been lots of material on political science and international relations and even “quite a bit” on moral philosophy. He told Climate Home: “It’s not the case that IPCC is only economics and modelling. Frankly, a lot of that catches attention because these macro numbers are eye-catching. There’s a big difference in the emphasis in [media] coverage of IPCC reports and the balance of materials when you go into the reports themselves.”
According to Skea’s calculations, the big models make up only 6% of the report contents, about a quarter of the summary and the majority of the press coverage. “But there’s an awful lot of bread-and-butter material in IPCC reports which is just about how you get on with it,” he added. “It’s not sexy material but it’s just as important because that’s what needs to be done to mitigate climate change.”
While saying their dominance had been amplified by the media, Skea defended the usefulness of IAMs. “Our audience are governments. Their big question is how you connect all this human activity with actual impacts on the climate. It’s very difficult to make that leap without actually modelling it. You can’t do it with lots of little micro-studies. You need models and you need scenarios to think your way through that connection.”
The IPCC has also been accused of placing too much faith in negative emissions technologies and geo-engineering. Carton calls these technologies ‘carbon unicorns’ because he says they “do not exist at any meaningful scale” and probably never will.
In a recent book chapter, Carton argues: “If one is to believe recent IPCC reports, then gone are the days when the world could resolve the climate crisis merely by reducing emissions. Avoiding global warming in excess of 2°C/1.5°C now also involves a rather more interventionist enterprise: to remove vast amounts of carbon dioxide from the atmosphere, amounts that only increase the longer emissions refuse to fall.”
When asked about carbon capture technologies, Skea said that in terms of deployment, “they haven’t moved on very much” since the last big IPCC report in 2014. He added that carbon capture and storage and bio-energy are “all things that have been done commercially somewhere in the world.”
“What has never been done”, he said, “is to connect the different parts of the system together and run them over all. That’s led many people looking at the literature to conclude that the main barriers to the adoption of some technologies are the lack of policy incentives and the lack of working out good business models to put what would be complex supply chains together – rather than anything that’s standing in the way technically.”
The next set of three IPCC assessment reports was originally due to be published in 2021, but work was delayed by the coronavirus pandemic. Governments and experts will have from 18 January to 14 March to read and comment on the draft for WG3. Dates for a final government review have yet to be set.
Until recently, the field of plant breeding looked a lot like it did in centuries past. A breeder might examine, for example, which tomato plants were most resistant to drought and then cross the most promising plants to produce the most drought-resistant offspring. This process would be repeated, plant generation after generation, until, over the course of roughly seven years, the breeder arrived at what seemed the optimal variety.
Now, with the global population expected to swell to nearly 10 billion by 2050 (1) and climate change shifting growing conditions (2), crop breeder and geneticist Steven Tanksley doesn’t think plant breeders have that kind of time. “We have to double the productivity per acre of our major crops if we’re going to stay on par with the world’s needs,” says Tanksley, a professor emeritus at Cornell University in Ithaca, NY.
To speed up the process, Tanksley and others are turning to artificial intelligence (AI). Using computer science techniques, breeders can rapidly assess which plants grow the fastest in a particular climate, which genes help plants thrive there, and which plants, when crossed, produce an optimum combination of genes for a given location, opting for traits that boost yield and stave off the effects of a changing climate. Large seed companies in particular have been using components of AI for more than a decade. With computing power rapidly advancing, the techniques are now poised to accelerate breeding on a broader scale.
AI is not, however, a panacea. Crop breeders still grapple with tradeoffs such as higher yield versus marketable appearance. And even the most sophisticated AI cannot guarantee the success of a new variety. But as AI becomes integrated into agriculture, some crop researchers envisage an agricultural revolution with computer science at the helm.
An Art and a Science
During the “green revolution” of the 1960s, researchers developed new chemical pesticides and fertilizers along with high-yielding crop varieties that dramatically increased agricultural output (3). But the reliance on chemicals came with the heavy cost of environmental degradation (4). “If we’re going to do this sustainably,” says Tanksley, “genetics is going to carry the bulk of the load.”
Plant breeders lean not only on genetics but also on mathematics. As the genomics revolution unfolded in the early 2000s, plant breeders found themselves inundated with genomic data that traditional statistical techniques couldn’t wrangle (5). Plant breeding “wasn’t geared toward dealing with large amounts of data and making precise decisions,” says Tanksley.
In 1997, Tanksley began chairing a committee at Cornell that aimed to incorporate data-driven research into the life sciences. There, he encountered an engineering approach called operations research that translates data into decisions. In 2006, Tanksley cofounded the Ithaca, NY-based company Nature Source Improved Plants on the principle that this engineering tool could make breeding decisions more efficient. “What we’ve been doing almost 15 years now,” says Tanksley, “is redoing how breeding is approached.”
A Manufacturing Process
Such approaches try to tackle complex scenarios. Suppose, for example, a wheat breeder has 200 genetically distinct lines. The breeder must decide which lines to breed together to optimize yield, disease resistance, protein content, and other traits. The breeder may know which genes confer which traits, but it’s difficult to decipher which lines to cross in what order to achieve the optimum gene combination. The number of possible combinations, says Tanksley, “is more than the stars in the universe.”
An operations research approach enables a researcher to solve this puzzle by defining the primary objective and then using optimization algorithms to predict the quickest path to that objective given the relevant constraints. Auto manufacturers, for example, optimize production given the expense of employees, the cost of auto parts, and fluctuating global currencies. Tanksley’s team optimizes yield while selecting for traits such as resistance to a changing climate. “We’ve seen more erratic climate from year to year, which means you have to have crops that are more robust to different kinds of changes,” he says.
For each plant line included in a pool of possible crosses, Tanksley inputs DNA sequence data, phenotypic data on traits like drought tolerance, disease resistance, and yield, as well as environmental data for the region where the plant line was originally developed. The algorithm projects which genes are associated with which traits under which environmental conditions and then determines the optimal combination of genes for a specific breeding goal, such as drought tolerance in a particular growing region, while accounting for genes that help boost yield. The algorithm also determines which plant lines to cross together in which order to achieve the optimal combination of genes in the fewest generations.
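The flavor of that search can be sketched with a deliberately simplified model, in which each line is a binary vector of favorable alleles and a cross is scored by the union of its parents' alleles. Real operations-research formulations optimize across multiple generations under genetic constraints such as linkage; this toy only illustrates the combinatorial structure of the problem:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Toy cross-selection problem: each line is a binary vector of favorable
# alleles (1 = present). A cross is scored by the best genotype an
# offspring could in principle carry: the union of the parents' favorable
# alleles, ignoring linkage and recombination odds.
n_lines, n_genes = 20, 12
lines = rng.integers(0, 2, size=(n_lines, n_genes))

def cross_potential(a, b):
    # count of favorable alleles reachable by combining both parents
    return int(np.maximum(a, b).sum())

# Exhaustive search over all pairs; real programs face multi-generation
# pedigrees where the search space explodes, hence optimization algorithms.
best_pair = max(combinations(range(n_lines), 2),
                key=lambda ij: cross_potential(lines[ij[0]], lines[ij[1]]))
best_score = cross_potential(lines[best_pair[0]], lines[best_pair[1]])
print("best cross:", best_pair, "potential alleles:", best_score)
```

Even this one-generation version scans 190 pairs; over several generations of 200 lines the space grows combinatorially, which is why Tanksley likens it to "more than the stars in the universe."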
Nature Source Improved Plants conducts, for example, a papaya program in southeastern Mexico where the once predictable monsoon season has become erratic. “We are selecting for varieties that can produce under those unknown circumstances,” says Tanksley. But the new papaya must also stand up to ringspot, a virus that nearly wiped papaya from Hawaii altogether before another Cornell breeder developed a resistant transgenic variety (6). Tanksley’s papaya isn’t as disease resistant. But by plugging “rapid growth rate” into their operations research approach, the team bred papaya trees that produce copious fruit within a year, before the virus accumulates in the plant.
“Plant breeders need operations research to help them make better decisions,” says William Beavis, a plant geneticist and computational biologist at Iowa State in Ames, who also develops operations research strategies for plant breeding. To feed the world in rapidly changing environments, researchers need to shorten the process of developing a new cultivar to three years, Beavis adds.
The big seed companies have investigated the use of operations research since around 2010, with Syngenta, headquartered in Basel, Switzerland, leading the pack, says Beavis, who spent over a decade as a statistical geneticist at Pioneer Hi-Bred in Johnston, IA, a large seed company now owned by Corteva, which is headquartered in Wilmington, DE. "All of the soybean varieties that have come on the market within the last couple of years from Syngenta came out of a system that had been redesigned using operations research approaches," he says. But large seed companies primarily focus on grains key to animal feed, such as corn, wheat, and soy. To meet growing food demands, Beavis believes that the smaller seed companies that develop the vegetable crops people actually eat must also embrace operations research. "That's where operations research is going to have the biggest impact," he says, "local breeding companies that are producing for regional environments, not for broad adaptation."
In collaboration with Iowa State colleague and engineer Lizhi Wang and others, Beavis is developing operations research-based algorithms to, for example, help seed companies choose whether to breed one variety that can survive in a range of different future growing conditions or a number of varieties, each tailored to specific environments. Two large seed companies, Corteva and Syngenta, and Kromite, a Lambertville, NJ-based consulting company, are partners on the project. The results will be made publicly available so that all seed companies can learn from their approach.
Drones and Adaptations
Useful farming AI requires good data, and plenty of it. To collect sufficient inputs, some researchers take to the skies. Crop researcher Achim Walter of the Institute of Agricultural Sciences at ETH Zürich in Switzerland and his team are developing techniques to capture aerial crop images. Every other day for several years, they have deployed image-capturing sensors over a wheat field containing hundreds of genetic lines. They fly their sensors on drones or on cables suspended above the crops or incorporate them into handheld devices that a researcher can use from an elevated platform (7).
Meanwhile, they’re developing imaging software that quantifies growth rate captured by these images (8). Using these data, they build models that predict how quickly different genetic lines grow under different weather conditions. If they find, for example, that a subset of wheat lines grew well despite a dry spell, then they can zero in on the genes those lines have in common and incorporate them into new drought-resistant varieties.
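One simple way to turn a canopy-cover time series into a growth rate, assuming roughly exponential early-season growth, is a log-linear fit. The data below are synthetic and the ETH imaging pipeline is considerably more sophisticated; this shows only the final modeling step:

```python
import numpy as np

# Sketch of the modeling step: given per-line canopy-cover measurements
# extracted from aerial images every other day, estimate a relative
# growth rate as the slope of log(cover) against time.
days = np.arange(0, 30, 2)  # imaging every other day for a month

def growth_rate(cover):
    # slope of log(cover) vs time = relative growth rate per day
    slope, _intercept = np.polyfit(days, np.log(cover), 1)
    return slope

# Two hypothetical wheat lines with different intrinsic growth rates.
fast = 5 * np.exp(0.12 * days)   # grows ~12% per day
slow = 5 * np.exp(0.07 * days)   # grows ~7% per day
print(f"fast line: {growth_rate(fast):.3f} /day")
print(f"slow line: {growth_rate(slow):.3f} /day")
```

Ranking lines by such fitted rates under different weather windows is the kind of signal that lets breeders zero in on genotypes that kept growing through, say, a dry spell.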
Research geneticist Edward Buckler at the US Department of Agriculture and his team are using machine learning to identify climate adaptations in 1,000 species in a large grouping of grasses spread across the globe. The grasses include food and bioenergy crops such as maize, sorghum, and sugar cane. Buckler says that when people rank the most photosynthetically efficient and water-efficient species, this group comes out on top. Still, he and collaborators, including plant scientist Elizabeth Kellogg of the Donald Danforth Plant Science Center in St. Louis, MO, and computational biologist Adam Siepel of Cold Spring Harbor Laboratory in NY, want to uncover genes that could make crops in this group even more efficient for food production in current and future environments. The team is first studying a select number of model species to determine which genes are expressed under a range of different environmental conditions. They're still probing just how far this predictive power can go.
Such approaches could be scaled up massively. To probe the genetic underpinnings of climate adaptation for crop species worldwide, Daniel Jacobson, the chief researcher for computational systems biology at Oak Ridge National Laboratory in TN, has amassed "climatype" data for every square kilometer of land on Earth. Using the Summit supercomputer, his team then compared each square kilometer to every other square kilometer to identify similar environments (9). The result can be viewed as a network of GPS points connected by lines that show the degree of environmental similarity between points.
In collaboration with the US Department of Energy’s Center for Bioenergy Innovation, the team combines this climatype data with GPS coordinates associated with individual crop genotypes to project which genes and genetic interactions are associated with specific climate conditions. Right now, they’re focused on bioenergy and feedstocks, but they’re poised to explore a wide range of food crops as well. The results will be published so that other researchers can conduct similar analyses.
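The all-pairs comparison behind such a climatype network can be sketched at toy scale. The variables, distance metric, and similarity threshold below are assumptions for illustration, not the Summit computation:

```python
import numpy as np

# Toy climatype network: each location is a vector of standardized climate
# variables (e.g. mean temperature, precipitation, cloud cover); pairwise
# similarity between all locations defines a network linking
# environmentally similar sites.
rng = np.random.default_rng(2)
n_sites = 6
climate = rng.normal(size=(n_sites, 4))  # 4 standardized climate variables

# All-pairs Euclidean distance via broadcasting; at planetary resolution
# this is the step that demands a supercomputer.
diff = climate[:, None, :] - climate[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))

# Link site pairs whose environments are more similar than the median.
threshold = np.median(dist[dist > 0])
edges = [(i, j) for i in range(n_sites) for j in range(i + 1, n_sites)
         if dist[i, j] < threshold]
print("similar-site pairs:", edges)
```

Attaching crop genotypes to the nodes of such a network is what lets the project ask which genes recur wherever a given kind of environment does.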
The Next Agricultural Revolution
Despite these advances, the transition to AI can be unnerving. Operations research can project an ideal combination of genes, but those genes may interact in unpredictable ways. Tanksley’s company hedges its bets by engineering 10 varieties for a given project in hopes that at least one will succeed.
On the other hand, such a directed approach could miss happy accidents, says Molly Jahn, a geneticist and plant breeder at the University of Wisconsin–Madison. “For me, breeding is much more like art. I need to see the variation and I don’t prejudge it,” she says. “I know what I’m after, but nature throws me curveballs all the time, and I probably can’t count the varieties that came from curveballs.”
There are also inherent tradeoffs that no algorithm can overcome. Consumers may prefer tomatoes with a leafy crown that stays green longer. But the price a breeder pays for that green calyx is one percent of the yield, says Tanksley.
Image recognition technology comes with its own host of challenges, says Walter. “To optimize algorithms to an extent that makes it possible to detect a certain trait, you have to train the algorithm thousands of times.” In practice, that means snapping thousands of crop images in a range of light conditions. Then there’s the ground-truthing. To know whether the models work, Walter and others must measure the trait they’re after by hand. Keen to know whether the model accurately captures the number of kernels on an ear of corn? You’d have to count the kernels yourself.
Despite these hurdles, Walter believes that computer science has brought us to the brink of a new agricultural revolution. In a 2017 PNAS Opinion piece, Walter and colleagues described emerging “smart farming” technologies—from autonomous weeding vehicles to moisture sensors in the soil (10). The authors worried, though, that only big industrial farms can afford these solutions. To make agriculture more sustainable, smaller farms in developing countries must have access as well.
Fortunately, “smart breeding” advances may have wider reach. Once image recognition technology becomes more developed for crops, which Walter expects will happen within the next 10 years, deploying it may be relatively inexpensive. Breeders could operate their own drones and obtain more precise ratings of traits such as time to flowering or number of fruits in less time, says Walter. “The computing power that you need once you have established the algorithms is not very high.”
The genomic data so vital to AI-led breeding programs is also becoming more accessible. “We’re really at this point where genomics is cheap enough that you can apply these technologies to hundreds of species, maybe thousands,” says Buckler.
Plant breeding has “entered the engineered phase,” adds Tanksley. And with little time to spare. “The environment is changing,” he says. “You have to have a faster breeding process to respond to that.”
3. P. L. Pingali, Green revolution: Impacts, limits, and the path ahead. Proc. Natl. Acad. Sci. U.S.A. 109, 12302–12308 (2012).
4. D. Tilman, The greening of the green revolution. Nature 396, 211–212 (1998).
5. G. P. Ramstein, S. E. Jensen, E. S. Buckler, Breaking the curse of dimensionality to identify causal variants in Breeding 4. Theor. Appl. Genet. 132, 559–567 (2019).
6. D. Gonsalves, Control of papaya ringspot virus in papaya: A case study. Annu. Rev. Phytopathol. 36, 415–437 (1998).
7. N. Kirchgessner et al., The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 44, 154–168 (2016).
8. K. Yu, N. Kirchgessner, C. Grieder, A. Walter, A. Hund, An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping. Plant Methods 13, 15 (2017).
9. J. Streich et al., Can exascale computing and explainable artificial intelligence applied to plant biology deliver on the United Nations sustainable development goals? Curr. Opin. Biotechnol. 61, 217–225 (2020).
10. A. Walter, R. Finger, R. Huber, N. Buchmann, Opinion: Smart farming is key to developing sustainable agriculture. Proc. Natl. Acad. Sci. U.S.A. 114, 6148–6150 (2017).
Larry Elliott, Mon 9 Nov 2020 19.18 GMT. Last modified on Tue 10 Nov 2020 04.37 GMT
Large companies and financial institutions in the UK will have to come clean about their exposure to climate risks within five years under the terms of a tougher regime announced by the chancellor, Rishi Sunak.
In an attempt to demonstrate the government’s commitment to tackling global heating, Sunak said the UK would go further than an international taskforce had recommended and make disclosure by large businesses mandatory.
The chancellor also announced plans for Britain’s first green gilt – a bond that will be floated in the financial markets during 2021 with the money raised paying for investment in carbon-reducing projects and the creation of jobs across the country.
In a Commons statement, Sunak said departure from the EU meant the financial services sector – which employs more than a million people – was entering a new chapter.
“This new chapter means putting the full weight of private sector innovation, expertise and capital behind the critical global effort to tackle climate change and protect the environment.
“We’re announcing the UK’s intention to mandate climate disclosures by large companies and financial institutions across our economy, by 2025, going further than recommended by the Task Force on Climate-related Financial Disclosures, and the first G20 country to do so.”
The Treasury said the new disclosure rules and regulations would cover a significant portion of the economy, including listed commercial companies, UK-registered large private companies, banks, building societies, insurance companies, UK-authorised asset managers, life insurers, pension schemes regulated by the Financial Conduct Authority and occupational pension schemes.
The government plans to make Britain a net-zero-carbon country by 2050, and the former governor of the Bank of England, Mark Carney, told a London conference that the Covid-19 pandemic illustrated the dangers of being ill-prepared and of underestimating risks.
Climate change was “a crisis that involves the whole world and from which no one will be able to self-isolate”, Carney said on Monday.
His successor at Threadneedle Street, Andrew Bailey, said the decision to issue a green bond underlined the UK’s commitment to combating climate change – as did Sunak’s announcement that disclosures related to climate change risk would be mandatory by 2025.
Sunak, Carney and Bailey were all speakers at the Green Horizon summit, which took place in London on what would have been the first day of the UN climate change conference in Glasgow had Covid-19 not forced the postponement of the event.
Bailey said: “Our goal is to build a UK financial system resilient to the risks from climate change and supportive of the transition to a net-zero economy. In the aftermath of the financial crisis we took far-reaching action to make the financial system more resilient against crises – Covid is the first real test of those changes.”
Doug Parr, Greenpeace UK’s policy director, said: “Tackling climate change means the corporate sector is not just green round the edges but green right to its core. The chancellor’s plans to make disclosure mandatory for companies is right if the rules are compulsory and thorough.
“The real win would be to make all financial institutions put in place plans to meet the Paris climate agreement by the end of next year, steadily choking off the supply of cash to planet-wrecking activities. Disclosure is a route to making that happen, but not an end in itself.”
Roger Barker, the director of policy and corporate governance at the Institute of Directors, said: “What gets measured gets changed. The problem is there’s a hundred and one different ways of measuring climate impact out there right now. It’s a confusing landscape for companies and investors alike, so bringing in common standards is absolutely the right thing to do.”
Fran Boait, the executive director of the campaign group Positive Money, said: “We desperately need more green public investment if we are to have a fair, green transition, so it’s positive that the government has signalled that it is finally taking this more seriously, by issuing green gilts for the first time.”
Once viewed with suspicion by the scientific community, methods of artificially intervening in the environment to curb the devastating effects of global warming are now being considered as measures of last resort, since initiatives to cut greenhouse-gas emissions depend directly on collective action and take decades to produce any benefit. According to some researchers in the field, who have been attracting investment and a great deal of attention, we may not have that much time.
Much of this work belongs to a field also known as solar geoengineering. Most of its methods rely on the controlled release of particles into the atmosphere to block part of the energy our planet receives and redirect it back into space, producing a cooling effect similar to that caused by volcanic eruptions.
Even though such measures would do nothing about pollution, for example, scientists reason that, in the face of ever more violent storms, fire tornadoes, floods and other natural disasters, they would be worthwhile until more effective solutions are developed.
Michael Gerrard, director of the Sabin Center for Climate Change Law at Columbia Law School and editor of a book on the technology and its legal implications, summed up the situation in an interview with The New York Times: “We are facing an existential threat, so we need to look at all the options.”
“I like to compare geoengineering to chemotherapy for the planet: if everything else is failing, all that is left is to try it,” he argued.
Natural disasters brought on by global warming make such interventions urgent, researchers say. Source: Unsplash
Double standards
Among the most notable efforts is that of a non-governmental organization called SilverLining, which has awarded US$3 million to several universities and other institutions to pursue answers to practical questions, such as finding the ideal altitude for applying aerosols, working out how to deliver the right amount, and checking the effects on the world’s food production chain.
Chris Sacca, co-founder of Lowercarbon Capital, an investment group that is one of SilverLining’s funders, struck an alarmist tone: “Decarbonization is necessary, but it will take 20 years or more to happen. If we don’t explore climate interventions such as solar reflection now, we will be condemning countless lives, species and ecosystems to the heat.”
Another recipient of substantial sums was the National Oceanic and Atmospheric Administration, which received US$4 million from the US Congress precisely to develop technologies of this kind, as well as to monitor any covert use of such solutions by other countries.
Douglas MacMartin, a researcher in mechanical and aerospace engineering at Cornell University, said that “humanity’s power to cool things down is certain; what is not clear is what comes next.”
If, on one hand, the planet can be cooled artificially, on the other, no one knows what will follow. Source: Unsplash
Is there a way?
To clarify the possible consequences of interventions of this magnitude, MacMartin will develop models of the specific climate effects of injecting aerosols into the atmosphere above different parts of the globe and at different altitudes. “Depending on where you put [the substance], you will have different effects on the monsoons in Asia and on Arctic sea ice,” he pointed out.
The National Center for Atmospheric Research in Boulder, Colorado, also funded by SilverLining, believes it has the ideal system for the job, considered the most sophisticated in the world. It will be used to run hundreds of simulations in which specialists will search for what they call the sweet spot: the amount of artificial cooling that can reduce extreme weather events without causing broader changes in regional precipitation patterns or similar impacts.
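This kind of sweet-spot search can be caricatured as a one-dimensional parameter sweep. Everything below, the cooling and disruption functions and all the numbers, is a toy assumption for illustration, not output from any real climate model:

```python
# Toy sweet-spot search: sweep aerosol injection rates and score
# each by a hypothetical cooling benefit minus a penalty for
# regional precipitation disruption.
def cooling(rate):
    # Illustrative: benefit grows linearly with injection rate.
    return 0.5 * rate

def precip_disruption(rate):
    # Illustrative: side effects grow nonlinearly with rate.
    return 0.1 * rate ** 2

def score(rate):
    return cooling(rate) - precip_disruption(rate)

rates = [i * 0.25 for i in range(21)]  # 0.0 .. 5.0
best = max(rates, key=score)
print(best)  # → 2.5
```

Real searches run over many coupled dimensions (altitude, latitude, season, particle size), which is why hundreds of full simulations are needed rather than a simple sweep like this one.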
“Is there a way, at least in our model world, to see whether we can achieve one without triggering too much of the other?” asked Jean-François Lamarque, director of the institution’s Climate and Global Dynamics Laboratory. There is no answer to that question yet, but sustainable approaches are being explored by Australian researchers, who would spray seawater to make clouds more reflective and have reported promising test results.
If so, perhaps the losses of reef corals we have been witnessing will one day come to an end. As for the rest, well, only time will tell.