THE OTHER SIDE: Civil Defense says it sent SMS alerts to 34,000 registered cell phones in the northern coast region
Isabela Palhares
February 22, 2023
Cemaden (National Center for Natural Disaster Monitoring and Alerts) says it warned the São Paulo state government about the high risk of disaster on the São Paulo coast roughly 48 hours in advance.
According to Cemaden, which is a federal agency, the state Civil Defense was warned about the heavy rains expected in the region and the high risk of disasters in an online meeting on the morning of Friday (17). Vila do Sahy, the site with the highest death toll, was cited as an area at high risk of landslides.
In a statement, the Civil Defense says it issued preventive alerts to the population from the moment it was informed of the forecast of heavy rain.
"We alerted and warned the Civil Defense on Friday, almost 48 hours before the disaster happened. We followed the established protocol, alerting the state Civil Defense so that it could organize with the municipalities," said Osvaldo Moraes, president of Cemaden.
Cemaden is linked to the Ministry of Science, Technology and Innovation. The center is responsible for monitoring meteorological and geological indicators and, when necessary, alerting prevention agencies.
Moraes says that as early as Thursday (16), a weather bulletin already indicated heavy rains in the region. That bulletin was passed on to the state Civil Defense.
After that first alert, Cemaden met with a representative of the state Civil Defense on Friday morning. "We issue daily bulletins, and Thursday's already indicated the risk. But Friday's raised the alert level for that region."
The Civil Defense said it sent 14 text message (SMS) alerts to more than 34,000 registered cell phones in the northern coast region. The agency also said it began coordinating actions with the municipal civil defense agencies on Thursday, when it received the forecast of heavy rains in the region.
"The first notices released by the state Civil Defense, which were still preventive, were published around 3 pm on Thursday on the social media accounts of the Civil Defense and the state government, with information about the rainfall volume expected for the period as well as safety measures that could be adopted by people in at-risk areas," the statement says.
The agency also said that at 12:52 am on Friday, while monitoring radar and satellite imagery, it sent the first SMS alert.
On the Civil Defense's social media accounts, the first alert about heavy rain on Saturday was posted at 12:22 pm. The message, however, says nothing about the risk of landslides.
Over the course of the night, other alerts were posted by the agency, and none of them mentions the risk of landslides. Only at 7:49 pm did a message recommend that people leave the area if necessary.
For experts, the scale of the disaster and the high number of victims show that a strategy of sending SMS messages to residents is not, by itself, effective. Beyond there being no way to know whether people saw the alerts, there was no plan or guidance on what to do in that situation.
"You create a warning system, and people may even receive the message, but they don't know what to do with that information. There is no guidance on where they should go, when to leave home, what to take," says Eduardo Mario Mendiondo, scientific coordinator of Ceped (Center for Disaster Studies and Research) at USP.
For him, the strategies must also consider creating escape routes in at-risk areas and instructing residents. "People need to know what risk they are running and how to protect themselves. It is unfair to say afterward that they didn't want to leave home; they had no proper guidance on what to do."
According to him, in several cities across the country, such as Petrópolis and Salvador, the alert is given by siren.
"You guarantee that everyone will hear it at any time of day. It is the oldest instrument, but it works. A siren sends a clear message of imminent risk," he says.
For Fernando Rocha Nogueira, coordinator of LabGRIS (Risk Management Laboratory) at UFABC, Brazilian authorities stand by inertly as disasters unfold across the country. According to him, Brazil has good monitoring systems but does not develop strategies to protect the population.
"We have a serious communication problem in this country. There was mapping showing that it would rain heavily, that there was a high risk, and it was not given due attention. Thousands of people went down to the coast, ignoring the forecast. We have no risk awareness; we live in denial of climate information," he says.
How the warnings unfolded
Thursday (16): A Cemaden bulletin warns of heavy, voluminous rain on the São Paulo coast during Carnival
Friday (17): In a virtual meeting, Cemaden warns members of the state Civil Defense about the forecast of heavy rain and the risk of landslides. Vila do Sahy was among the areas identified as highest risk
Saturday (18)
12:22 pm: The state Civil Defense posts on social media that rain was spreading through the Ubatuba and Caraguatatuba region. "There is wind and lightning. It is reaching neighboring municipalities. Be careful over the coming hours," the message says
6:33 pm: A new Civil Defense post warns of persistent rain in the region
7:49 pm: Another Civil Defense post says the "rain is spreading" along the northern coast and asks people to "be careful over the coming hours"
11:13 pm: The Civil Defense warns that the rain persists in the region and recommends: "Do not drive through floodwaters. Watch for leaning walls and for cracks. Leave the area if necessary"
3:15 am: The agency again warns of heavy, persistent rain on the northern coast and says: "Do not drive through floodwaters. Watch for leaning walls and for cracks. Leave the area if necessary."
Experts point to a lack of investment and an outdated model; the storm was aggravated by an extratropical cyclone, meteorologist says
Carlos Petrocilo
February 22, 2023
A lack of investment in new technologies, combined with accelerating climate change, makes weather forecasting in Brazil less precise, according to experts interviewed by Folha.
Weather forecasting services are essential for public agencies, such as the Civil Defense, to prepare in advance and try to mitigate the effects of a storm.
As a consequence of the storm, 48 people died, 47 of them in São Sebastião and one in Ubatuba, according to figures from this Wednesday (22).
According to professor Eduardo Mario Mendiondo, scientific coordinator of Ceped (Center for Disaster Education and Research) at USP, current forecasting models use atmospheric parameters calibrated against historical conditions and need to be updated.
"The climate is changing, with extremes of greater magnitude and greater frequency. The models need to be updated constantly, on a global scale and for specific regions with peculiar microclimates and dynamics, as is the case of the Serra do Mar and the Baixada Santista," says Mendiondo.
The professor draws attention to the lack of public investment. According to him, the government needs to reinforce staffing and invest in new tools for Cemaden (National Center for Natural Disaster Monitoring and Alerts), Inpe (National Institute for Space Research) and Inmet (National Institute of Meteorology).
"We need to increase the capacity of the supercomputers currently in the country twentyfold, hire up to 20 times the current number of supercomputer maintenance and operations staff, and hire up to ten times the current number of operating technicians," says the USP professor.
To meet those needs, Mendiondo estimates that investments of R$ 25 billion per year are required. "That is what it takes to convert this new scientific evidence into better forecasts, following examples such as Japan, Europe and the United States."
Meteorologist Mamedes Luiz Melo says the rainfall volume was aggravated by an extratropical cyclone associated with a cold front that passed through the South of the country and through São Paulo. "The technology had been warning us, but we are dealing with something mobile in the atmosphere," says Melo.
The Civil Defense says in a statement that its special bulletins and weather risk notices are issued based on numerical weather prediction simulations. "These thresholds are based on the region's rainfall history, in which accumulated rain represents a risk of problems such as landslides, building collapses, flooding, flash floods and incidents involving lightning and wind," the Civil Defense said.
The projections from Inmet, which issues landslide risk alerts to public agencies, predicted lower rainfall volumes than a model used by the meteorology company MetSul.
The company's model, called WRF, indicated that some areas could see rainfall above 600 mm at certain points of the terrain, which ended up being confirmed. The federal institute's most severe forecasts spoke of rainfall in the range of 400 mm.
Inmet's forecast for rain on the northern coast used six different numerical models. The institute also uses WRF, but at a coarser resolution than MetSul's. In other words, the company was able to run its calculations on finer details of the terrain than the public agency could.
"WRF has proven to be a very important tool in identifying extreme rainfall events," says meteorologist Estael Sias of MetSul. "It is important to note that the WRF model is merely a working tool, a product, and not the forecast itself, and that the final prognosis released to the public and to clients takes into account other models as well as the meteorologist's experience with extreme events."
According to meteorologist Franco Nadal Villela, of Inmet's team in São Paulo, resolution is not the most decisive factor in rainfall forecasting. He says the models used by the institute did manage to predict that the storm in São Sebastião would be very severe, even though they did not reach the 600 mm figure.
"There are lower-resolution models that locally predicted less precipitation," says Villela. "The modeled forecasts were capturing this event well, and variations in quantified precipitation [rainfall volume per hour] are just one of the variables we weigh when issuing alerts."
Folha sent questions by email to Inpe, which coordinates the Center for Weather Forecasting and Climate Studies (Cptec), but received no response before this article was published.
José Marengo, climatologist and coordinator at Cemaden, advocates changes. He explains that the weather prediction model divides the region into areas of up to 200 square kilometers. As a result, it is not possible to predict the approximate amount of rain across an entire region.
"Brazil is not technologically prepared. It is as if Brazil were divided into big boxes of 200 square kilometers, which is why there are distortions within the same region. There may be areas where it rains less and others that exceeded 600 millimeters; the modeling is not perfect," says Marengo.
He also warns about the lack of new technology. "Inpe's supercomputer, Tupã, which solves the mathematical equations at high speed, dates from 2010 and is considered obsolete," says the climatologist.
Professor Pedro Côrtes, of USP's Institute of Energy and Environment, agrees that the field needs more resources, but notes that the government agencies' forecasts were sufficient to show that a severe storm was approaching.
"Waiting for investment cannot postpone solving the problem; the forecasts already work."
Folha reported the supercomputer's inauguration on December 28, 2010. At the time, Tupã cost R$ 31 million, and machines of its class were used in countries such as the United States, China, Germany and Russia. To operate it, Inpe had to build a new electrical substation with a capacity of 1,000 kilowatts; the institute previously had only 280 kilowatts available.
Even today, experts point to Tupã as the best equipment Brazil has for predicting not only floods but also heat waves, cold spells and periods of drought.
A different ‘Big One’ is approaching. Climate change is hastening its arrival.
Aug. 12, 2022
California, where earthquakes, droughts and wildfires have shaped life for generations, also faces the growing threat of another kind of calamity, one whose fury would be felt across the entire state.
This one will come from the sky.
According to new research, it will very likely take shape one winter in the Pacific, near Hawaii. No one knows exactly when, but from the vast expanse of tropical air around the Equator, atmospheric currents will pluck out a long tendril of water vapor and funnel it toward the West Coast.
This vapor plume will be enormous, hundreds of miles wide and more than 1,200 miles long, and seething with ferocious winds. It will be carrying so much water that if you converted it all to liquid, its flow would be about 26 times what the Mississippi River discharges into the Gulf of Mexico at any given moment.
When this torpedo of moisture reaches California, it will crash into the mountains and be forced upward. This will cool its payload of vapor and kick off weeks and waves of rain and snow.
The coming superstorm — really, a rapid procession of what scientists call atmospheric rivers — will be the ultimate test of the dams, levees and bypasses California has built to impound nature’s might.
But in a state where scarcity of water has long been the central fact of existence, global warming is not only worsening droughts and wildfires. Because warmer air can hold more moisture, atmospheric rivers can carry bigger cargoes of precipitation. The infrastructure design standards, hazard maps and disaster response plans that protected California from flooding in the past might soon be out of date.
As humans burn fossil fuels and heat up the planet, we have already increased the chances each year that California will experience a monthlong, statewide megastorm of this severity to roughly 1 in 50, according to a new study published Friday. (The hypothetical storm visualized here is based on computer modeling from this study.)
In the coming decades, if global average temperatures climb by another 1.8 degrees Fahrenheit, or 1 degree Celsius — and current trends suggest they might — then the likelihood of such storms will go up further, to nearly 1 in 30.
At the same time, the risk of megastorms that are rarer but even stronger, with much fiercer downpours, will rise as well.
These are alarming possibilities. But geological evidence suggests the West has been struck by cataclysmic floods several times over the past millennium, and the new study provides the most advanced look yet at how this threat is evolving in the age of human-caused global warming.
The researchers specifically considered hypothetical storms that are extreme but realistic, and which would probably strain California’s flood preparations. According to their findings, powerful storms that once would not have been expected to occur in an average human lifetime are fast becoming ones with significant risks of happening during the span of a home mortgage.
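The mortgage comparison follows from simple arithmetic: a 1-in-50 annual chance compounds to nearly a coin flip over 30 years. A minimal sketch of that calculation, using the annual probabilities reported in the study and my own simplifying assumption that each year is independent:

```python
def cumulative_risk(annual_prob: float, years: int) -> float:
    """Chance of at least one event over a span of years,
    assuming each year is an independent draw."""
    return 1 - (1 - annual_prob) ** years

# Present climate: roughly a 1-in-50 chance each year,
# over a 30-year mortgage
print(f"{cumulative_risk(1 / 50, 30):.0%}")  # about 45%

# With another 1 degree Celsius of warming: nearly 1 in 30 each year
print(f"{cumulative_risk(1 / 30, 30):.0%}")  # about 64%
```

Under those assumptions, a storm that once looked like a once-in-a-lifetime event becomes more likely than not within a single long-term loan.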
“We got kind of lucky to avoid it in the 20th century,” said Daniel L. Swain, a climate scientist at the University of California, Los Angeles, who prepared the new study with Xingying Huang of the National Center for Atmospheric Research in Boulder, Colo. “I would be very surprised to avoid it occurring in the 21st.”
Unlike a giant earthquake, the other “Big One” threatening California, an atmospheric river superstorm will not sneak up on the state. Forecasters can now spot incoming atmospheric rivers five days to a week in advance, though they don’t always know exactly where they’ll hit or how intense they’ll be.
Using Dr. Huang and Dr. Swain’s findings, California hopes to be ready even earlier. Aided by supercomputers, state officials plan to map out how all that precipitation will work its way through rivers and over land. They will hunt for gaps in evacuation plans and emergency services.
The last time government agencies studied a hypothetical California megaflood, more than a decade ago, they estimated it could cause $725 billion in property damage and economic disruption. That was three times the projected fallout from a severe San Andreas Fault earthquake, and five times the economic damage from Hurricane Katrina, which left much of New Orleans underwater for weeks in 2005.
Dr. Swain and Dr. Huang have handed California a new script for what could be one of its most challenging months in history. Now begin the dress rehearsals.
“Mother Nature has no obligation to wait for us,” said Michael Anderson, California’s state climatologist.
In fact, nature has not been wasting any time testing California’s defenses. And when it comes to risks to the water system, carbon dioxide in the atmosphere is hardly the state’s only foe.
THE ULTIMATE CURVEBALL
On Feb. 12, 2017, almost 190,000 people living north of Sacramento received an urgent order: Get out. Now. Part of the tallest dam in America was verging on collapse.
That day, Ronald Stork was in another part of the state, where he was worrying about precisely this kind of disaster — at a different dam.
Standing with binoculars near California’s New Exchequer Dam, he dreaded what might happen if large amounts of water were ever sent through the dam’s spillways. Mr. Stork, a policy expert with the conservation group Friends of the River, had seen on a previous visit to Exchequer that the nearby earth was fractured and could be easily eroded. If enough water rushed through, it might cause major erosion and destabilize the spillways.
He only learned later that his fears were playing out in real time, 150 miles north. At the Oroville Dam, a 770-foot-tall facility built in the 1960s, water from atmospheric rivers was washing away the soil and rock beneath the dam’s emergency spillway, which is essentially a hillside next to the main chute that acts like an overflow drain in a bathtub. The top of the emergency spillway looked like it might buckle, which would send a wall of water cascading toward the cities below.
Mr. Stork had no idea this was happening until he got home to Sacramento and found his neighbor in a panic. The neighbor’s mother lived downriver from Oroville. She didn’t drive anymore. How was he going to get her out?
Mr. Stork had filed motions and written letters to officials, starting in 2001, about vulnerabilities at Oroville. People were now in danger because nobody had listened. “It was nearly soul crushing,” he said.
“With flood hazard, it’s never the fastball that hits you,” said Nicholas Pinter, an earth scientist at the University of California, Davis. “It’s the curveball that comes from a direction you don’t anticipate. And Oroville was one of those.”
Ronald Stork in his office at Friends of the River in Sacramento.
The spillway of the New Exchequer Dam.
Such perils had lurked at Oroville for so long because California’s Department of Water Resources had been “overconfident and complacent” about its infrastructure, tending to react to problems rather than pre-empt them, independent investigators later wrote in a report. It is not clear this culture is changing, even as the 21st-century climate threatens to test the state’s aging dams in new ways. One recent study estimated that climate change had boosted precipitation from the 2017 storms at Oroville by up to 15 percent.
A year and a half after the crisis, crews were busy rebuilding Oroville’s emergency spillway when the federal hydropower regulator wrote to the state with some unsettling news: The reconstructed emergency spillway will not be big enough to safely handle the “probable maximum flood,” or the largest amount of water that might ever fall there.
Sources: Global Historical Climatology Network; Huang and Swain (2022). Measurements taken from the Oroville weather station and the nearest modeled data point.
This is the standard most major hydroelectric projects in the United States have to meet. The idea is that spillways should basically never fail because of excessive rain.
Today, scientists say they believe climate change might be increasing “probable maximum” precipitation levels at many dams. When the Oroville evacuation was ordered in 2017, nowhere near that much water had been flowing through the dam’s emergency spillway.
Yet California officials have downplayed these concerns about the capacity of Oroville’s emergency spillway, which were raised by the Federal Energy Regulatory Commission. Such extreme flows are a “remote” possibility, they argued in a letter last year. Therefore, further upgrades at Oroville aren’t urgently needed.
In a curt reply last month, the commission said this position was “not acceptable.” It gave the state until mid-September to submit a plan for addressing the issue.
The Department of Water Resources told The Times it would continue studying the matter. The Federal Energy Regulatory Commission declined to comment.
“People could die,” Mr. Stork said. “And it bothers the hell out of me.”
WETTER WET YEARS
Donald G. Sullivan was lying in bed one night, early in his career as a scientist, when he realized his data might hold a startling secret.
For his master’s research at the University of California, Berkeley, he had sampled the sediment beneath a remote lake in the Sacramento Valley and was hoping to study the history of vegetation in the area. But a lot of the pollen in his sediment cores didn’t seem to be from nearby. How had it gotten there?
When he X-rayed the cores, he found layers where the sediment was denser. Maybe, he surmised, these layers were filled with sand and silt that had washed in during floods.
It was only late that night that he tried to estimate the ages of the layers. They lined up neatly with other records of West Coast megafloods.
“That’s when it clicked,” said Dr. Sullivan, who is now at the University of Denver.
His findings, from 1982, showed that major floods hadn’t been exceptionally rare occurrences over the past eight centuries. They took place every 100 to 200 years. And in the decades since, advancements in modeling have helped scientists evaluate how quickly the risks are rising because of climate change.
For their new study, which was published in the journal Science Advances, Dr. Huang and Dr. Swain replayed portions of the 20th and 21st centuries using 40 simulations of the global climate. Extreme weather events, by definition, don’t occur very often. So by using computer models to create realistic alternate histories of the past, present and future climate, scientists can study a longer record of events than the real world offers.
Dr. Swain and Dr. Huang looked at all the monthlong California storms that took place during two time segments in the simulations, one in the recent past and the other in a future with high global warming, and chose one of the most intense events from each period. They then used a weather model to produce detailed play-by-plays of where and when the storms dump their water.
Those details matter. There are “so many different factors” that make an atmospheric river deadly or benign, Dr. Huang said.
Xingying Huang of the National Center for Atmospheric Research in Boulder, Colo. (Rachel Woolf for The New York Times)
The New Don Pedro Dam spillway.
Wes Monier, a hydrologist, with a 1997 photo of water rushing through the New Don Pedro Reservoir spillway.
In the high Sierras, for example, atmospheric rivers today largely bring snow. But higher temperatures are shifting the balance toward rain. Some of this rain can fall on snowpack that accumulated earlier, melting it and sending even more water toward towns and cities below.
Climate change might be affecting atmospheric rivers in other ways, too, said F. Martin Ralph of the Scripps Institution of Oceanography at the University of California, San Diego. How strong their winds are, for instance. Or how long they last: Some storms stall, barraging an area for days on end, while others blow through quickly.
Scientists are also working to improve atmospheric river forecasts, which is no easy task as the West experiences increasingly sharp shifts from very dry conditions to very wet and back again. In October, strong storms broke records in Sacramento and other places. Yet this January through March was the driest in the Sierra Nevada in more than a century.
“My scientific gut says there’s change happening,” Dr. Ralph said. “And we just haven’t quite pinned down how to detect it adequately.”
Better forecasting is already helping California run some of its reservoirs more efficiently, a crucial step toward coping with wetter wet years and drier dry ones.
On the last day of 2016, Wes Monier was looking at forecasts on his iPad and getting a sinking feeling.
Mr. Monier is chief hydrologist for the Turlock Irrigation District, which operates the New Don Pedro Reservoir near Modesto. The Tuolumne River, where the Don Pedro sits, was coming out of its driest four years in a millennium. Now, some terrifying rainfall projections were rolling in.
First, 23.2 inches over the next 16 days. A day later: 28.8 inches. Then 37.1 inches, roughly what the area normally received in a full year.
If Mr. Monier started releasing Don Pedro’s water too quickly, homes and farms downstream would flood. Release too much and he would be accused of squandering water that would be precious come summer.
But the forecasts helped him time his flood releases precisely enough that, after weeks of rain, the water in the dam ended up just shy of capacity. Barely a drop was wasted, although some orchards were flooded, and growers took a financial hit.
The next storm might be even bigger, though. And even the best data and forecasts might not allow Mr. Monier to stop it from causing destruction. “There’s a point there where I can’t do anything,” he said.
KATRINA 2.0
How do you protect a place as vast as California from a storm as colossal as that? Two ways, said David Peterson, a veteran engineer. Change where the water goes, or change where the people are. Ideally, both. But neither is easy.
Firebaugh is a quiet, mostly Hispanic city of 8,100 people, one of many small communities that power the Central Valley’s prodigious agricultural economy. Many residents work at nearby facilities that process almonds, pistachios, garlic and tomatoes.
Firebaugh also sits right on the San Joaquin River.
For a sleepless stretch of early 2017, Ben Gallegos, Firebaugh’s city manager, did little but watch the river rise and debate whether to evacuate half the town. Water from winter storms had already turned the town’s cherished rodeo grounds into a swamp. Now it was threatening homes, schools, churches and the wastewater treatment plant. If that flooded, people would be unable to flush their toilets. Raw sewage would flow down the San Joaquin.
Luckily, the river stopped rising. Still, the experience led Mr. Gallegos to apply for tens of millions in funding for new and improved levees around Firebaugh.
Levees change where the water goes, giving rivers more room to swell before they inundate the land. Levee failures in New Orleans were what turned Katrina into an epochal catastrophe, and after that storm, California toughened levee standards in urbanized areas of the Sacramento and San Joaquin Valleys, two major river basins of the Central Valley.
The idea is to keep people out of places where the levees don’t protect against 200-year storms, or those with a 0.5 percent chance of occurring in any year. To account for rising seas and the shifting climate, California requires that levees be recertified as providing this level of defense at least every 20 years.
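The "200-year" label is simply the reciprocal of the annual probability, and over a 20-year recertification cycle even that small annual chance accumulates. A quick sketch of the conversion, using the figures from the standard described above and my own simplifying assumption of independent years:

```python
def annual_prob(return_period_years: float) -> float:
    """Annual exceedance probability implied by a return period."""
    return 1 / return_period_years

def risk_over(annual_p: float, years: int) -> float:
    """Chance of at least one exceedance over a span of years,
    assuming independent years."""
    return 1 - (1 - annual_p) ** years

p = annual_prob(200)                # a "200-year" storm
print(f"{p:.1%}")                   # 0.5% chance in any given year
print(f"{risk_over(p, 20):.0%}")    # about 10% within one 20-year recertification cycle
```

The point of the recertification requirement is that neither number is fixed: as seas rise and storms intensify, yesterday's 200-year flood may carry a higher annual probability than its label suggests.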
Firebaugh, Calif., on the San Joaquin River, is home to 8,100 people and helps power the Central Valley’s agricultural economy.
Ben Gallegos, the Firebaugh city manager.
A 6-year-old’s birthday celebration in Firebaugh.
The problem is that once levees are strengthened, the areas behind them often become particularly attractive for development: fancier homes, bigger buildings, more people. The likelihood of a disaster is reduced, but the consequences, should one strike, are increased.
Federal agencies try to stop this by not funding infrastructure projects that induce growth in flood zones. But “it’s almost impossible to generate the local funds to raise that levee if you don’t facilitate some sort of growth behind the levee,” Mr. Peterson said. “You need that economic activity to pay for the project,” he said. “It puts you in a Catch-22.”
A project to provide 200-year protection to the Mossdale Tract, a large area south of Stockton, one of the San Joaquin Valley’s major cities, has been on pause for years because the Army Corps of Engineers fears it would spur growth, said Chris Elias, executive director of the San Joaquin Area Flood Control Agency, which is leading the project. City planners have agreed to freeze development across thousands of acres, but the Corps still hasn’t given its final blessing.
The Corps and state and local agencies will begin studying how best to protect the area this fall, said Tyler M. Stalker, a spokesman for the Corps’s Sacramento District.
The plodding pace of work in the San Joaquin Valley has set people on edge. At a recent public hearing in Stockton on flood risk, Mr. Elias stood up and highlighted some troubling math.
The Department of Water Resources says up to $30 billion in investment is needed over the next 30 years to keep the Central Valley safe. Yet over the past 15 years, the state managed to spend only $3.5 billion.
“We have to find ways to get ahead of the curve,” Mr. Elias said. “We don’t want to have a Katrina 2.0 play out right here in the heart of Stockton.”
As Mr. Elias waits for projects to be approved and budgets to come through, heat and moisture will continue to churn over the Pacific. Government agencies, battling the forces of inertia, indifference and delay, will make plans and update policies. And Stockton and the Central Valley, which runs through the heart of California, will count down the days and years until the inevitable storm.
The Sacramento-San Joaquin Delta near Stockton, Calif.
Sources
The megastorm simulation is based on the “ARkHist” storm modeled by Huang and Swain, Science Advances (2022), a hypothetical statewide, 30-day atmospheric river storm sequence over California with an approximately 2 percent likelihood of occurring each year in the present climate. Data was generated using the Weather Research and Forecasting model and global climate simulations from the Community Earth System Model Large Ensemble.
The chart of precipitation at Oroville compares cumulative rainfall at the Oroville weather station before the 2017 crisis with cumulative rainfall at the closest data point in ARkHist.
The rainfall visualization compares observed hourly rainfall in December 2016 from the Los Angeles Downtown weather station with rainfall at the closest data point in a hypothetical future megastorm, the ARkFuture scenario in Huang and Swain (2022). This storm would be a rare but plausible event in the second half of the 21st century if nations continue on a path of high greenhouse-gas emissions.
Additional credits
The 3D rainfall visualization and augmented reality effect by Nia Adurogbola, Jeffrey Gray, Evan Grothjan, Lydia Jessup, Max Lauter, Daniel Mangosing, Noah Pisner, James Surdam and Raymond Zhong.
Photo editing by Matt McCann.
Produced by Sarah Graham, Claire O’Neill, Jesse Pesta and Nadja Popovich.
Giant rainstorms have ravaged California before. Times journalists combined data, graphics and old-fashioned reporting to explore what the next big one might look like.
Credit: Erin Schaff/The New York Times
Aug. 25, 2022
Times Insider explains who we are and what we do and delivers behind-the-scenes insights into how our journalism comes together.
Not long ago, when I heard that California officials were embarking on an ambitious, multiyear effort to study one of the worst natural disasters in the state’s history, I knew there would be a lot of interesting material to cover. There was just one wrinkle: The disaster hadn’t happened yet — it still hasn’t.
The California water authorities wanted to examine a much bigger and more powerful version of the rainstorms the state often gets in winter. The milder ones replenish water supplies. But the strong ones cause devastating flooding and debris flows. And the really strong ones, like those that have hit the Pacific Coast several times over the past millennium, can erase whole landscapes, turning valleys and plains into lakes.
As global warming increases the likelihood and the intensity of severe storms, the state’s Department of Water Resources wanted to know: What would a really big (yet plausible) storm look like today? How well would we handle it?
As a climate reporter for The New York Times, I had a pretty good idea of how to tell the first part of the story. The department was starting its study by commissioning two climate scientists to construct a detailed play-by-play of how a monthlong storm might unload its precipitation throughout the state. (And what a lot of precipitation it would be: nearly 16 inches, on average, across California, according to the scientists’ simulations, and much more in mountainous areas.)
All that detail would help operators of dams and other infrastructure pinpoint how much water they might get at specific times and places. It would also allow the graphics wizards at The Times to bring the storm to stunning visual life in our article, which we published this month.
But to make the article more than an academic recounting of a computer-modeling exercise, I knew I had to find ways to ground this future storm strongly in the present. And as I started reporting, I realized this was what a lot of people in the flood-management world were trying to do, too. Unlike traffic congestion, air pollution or even drought, flood risk isn’t in people’s faces most of the time. Forecasters and engineers have to keep reminding them that it’s there.
I realized this wasn’t a story about predicting the future at all. Like a lot of climate stories, it was about how humans and institutions function, or fail to function, when faced with catastrophic possibilities whose arrival date is uncertain.
The near-catastrophe Californians remember most vividly is the 2017 crisis at the Oroville Dam, north of Sacramento. The dam’s emergency spillway nearly collapsed after heavy rainstorms, prompting the evacuation of 188,000 people. The state authorities spent the next few years reinspecting dams and re-evaluating safety needs. Yet I found signs that all this attention might already be starting to fade, even when it came to Oroville itself.
For every example of proactive thinking on flood risks, I found instances where budgets, political exigencies or other complications had gotten in the way. I visited flood-prone communities in the Sacramento-San Joaquin Delta with Kathleen Schaefer, an engineer formerly with the Federal Emergency Management Agency. She helped prepare the last major study of a hypothetical California megastorm, over a decade ago, and she recalled the frosty reception her work and that of her colleagues had received in some official circles.
She described the attitude she encountered this way: “If you can’t do anything about it, if it’s such a big problem, then you don’t want to stick your head out and raise it, because then you’re supposed to do something about it. So it’s better just to be like, ‘Oh, I hope it doesn’t happen on my watch.’”
I also sought out Californians who had suffered the effects of flooding firsthand. One reason the state is so vulnerable is that so many people and their homes and assets are in inundation-prone places. The reasons they stay, despite the dangers, are complex and often deeply personal.
Rudy Mussi has lived through two devastating levee failures near his land, in a part of the Delta called the Jones Tract. Neither experience made him want to go farm somewhere else. He recently invested millions in almond trees.
“Even though there’s risk,” Mr. Mussi told me, “there’s people willing to take that risk.”
Bob Ott grows cherries, almonds and walnuts in the fertile soil along the Tuolumne River. As we drove through his orchards on a rickety golf cart, he showed me where the water had rushed in during the 2017 storms.
Mr. Ott said he knew his land was bound to flood again, whether from a repeat of rains past or from a future megastorm. Still, he would never consider leaving, he said. His family has been farming there for the better part of a century. “This is part of us,” he said.
A firefighter battled the Sugar Fire in Doyle, Calif., this month. Credit: Noah Berger/Associated Press
Floods swept Germany, fires ravaged the American West and another heat wave loomed, driving home the reality that the world’s richest nations remain unprepared for the intensifying consequences of climate change.
July 17, 2021
Some of Europe’s richest countries lay in disarray this weekend, as raging rivers burst through their banks in Germany and Belgium, submerging towns, slamming parked cars against trees and leaving Europeans shellshocked at the intensity of the destruction.
Only days before in the Northwestern United States, a region famed for its cool, foggy weather, hundreds had died of heat. In Canada, wildfire had burned a village off the map. Moscow reeled from record temperatures. And this weekend the northern Rocky Mountains were bracing for yet another heat wave, as wildfires spread across 12 states in the American West.
The extreme weather disasters across Europe and North America have driven home two essential facts of science and history: The world as a whole is neither prepared to slow down climate change nor to live with it. The week’s events have now ravaged some of the world’s wealthiest nations, whose affluence has been enabled by more than a century of burning coal, oil and gas — activities that pumped the greenhouse gases into the atmosphere that are warming the world.
“I say this as a German: The idea that you could possibly die from weather is completely alien,” said Friederike Otto, a physicist at Oxford University who studies the links between extreme weather and climate change. “There’s not even a realization that adaptation is something we have to do right now. We have to save people’s lives.”
The floods in Europe have killed at least 165 people, most of them in Germany, Europe’s most powerful economy. Across Germany, Belgium, and the Netherlands, hundreds have been reported as missing, which suggests the death toll could rise. Questions are now being raised about whether the authorities adequately warned the public about risks.
Credit: Sebastien Bozon/Agence France-Presse — Getty Images
Credit: David Swanson/Reuters
The bigger question is whether the mounting disasters in the developed world will have a bearing on what the world’s most influential countries and companies will do to reduce their own emissions of planet-warming gases. They come a few months ahead of United Nations-led climate negotiations in Glasgow in November, effectively a moment of reckoning for whether the nations of the world will be able to agree on ways to rein in emissions enough to avert the worst effects of climate change.
Disasters magnified by global warming have left a long trail of death and loss across much of the developing world, after all, wiping out crops in Bangladesh, leveling villages in Honduras, and threatening the very existence of small island nations. Typhoon Haiyan devastated the Philippines in the run-up to climate talks in 2013, which prompted developing-country representatives to press for funding to deal with the loss and damage they face over time from climate-induced disasters that they weren’t responsible for. That was rejected by richer countries, including the United States and Europe.
“Extreme weather events in developing countries often cause great death and destruction — but these are seen as our responsibility, not something made worse by more than a hundred years of greenhouse gases emitted by industrialized countries,” said Ulka Kelkar, climate director at the India office of the World Resources Institute. These intensifying disasters now striking richer countries, she said, show that developing countries seeking the world’s help to fight climate change “have not been crying wolf.”
Indeed, even after the 2015 Paris Agreement was negotiated with the goal of averting the worst effects of climate change, global emissions have kept increasing. China is the world’s biggest emitter today. Emissions have been steadily declining in both the United States and Europe, but not at the pace required to limit global temperature rise.
A reminder of the shared costs came from Mohamed Nasheed, the former president of the Maldives, an island nation at acute risk from sea level rise.
“While not all are affected equally, this tragic event is a reminder that, in the climate emergency, no one is safe, whether they live on a small island nation like mine or a developed Western European state,” Mr. Nasheed said in a statement on behalf of a group of countries that call themselves the Climate Vulnerable Forum.
Credit: Alexander Nemenov/Agence France-Presse — Getty Images
Credit: John Hendricks/Oregon Office of State Fire Marshal, via Associated Press
The ferocity of these disasters is as notable as their timing, coming ahead of the global talks in Glasgow to try to reach agreement on fighting climate change. The world has a poor track record on cooperation so far, and, this month, new diplomatic tensions emerged.
Among major economies, the European Commission last week introduced the most ambitious road map for change. It proposed laws to ban the sale of gas and diesel cars by 2035, require most industries to pay for the emissions they produce, and most significantly, impose a tax on imports from countries with less stringent climate policies.
But those proposals are widely expected to meet vigorous objections both from within Europe and from other countries whose businesses could be threatened by the proposed carbon border tax, potentially further complicating the prospects for global cooperation in Glasgow.
The events of this summer come after decades of neglect of science. Climate models have warned of the ruinous impact of rising temperatures. An exhaustive scientific assessment in 2018 warned that a failure to keep the average global temperature from rising past 1.5 degrees Celsius, compared to the start of the industrial age, could usher in catastrophic results, from the inundation of coastal cities to crop failures in various parts of the world.
The report offered world leaders a practical, albeit narrow path out of chaos. It required the world as a whole to halve emissions by 2030. Since then, however, global emissions have continued rising, so much so that global average temperature has increased by more than 1 degree Celsius (about 2 degrees Fahrenheit) since 1880, narrowing the path to keep the increase below the 1.5 degree Celsius threshold.
As the average temperature has risen, it has heightened the frequency and intensity of extreme weather events in general. In recent years, scientific advances have pinpointed the degree to which climate change is responsible for specific events.
Credit: Maksim Slutsky/Associated Press
Credit: Darryl Dyck/The Canadian Press, via Associated Press
And even though it will take extensive scientific analysis to link climate change to last week’s cataclysmic floods in Europe, a warmer atmosphere holds more moisture and is already causing heavier rainfall in many storms around the world. There is little doubt that extreme weather events will continue to be more frequent and more intense as a consequence of global warming. A paper published Friday projected a significant increase in slow-moving but intense rainstorms across Europe by the end of this century because of climate change.
“We’ve got to adapt to the change we’ve already baked into the system and also avoid further change by reducing our emissions, by reducing our influence on the climate,” said Richard Betts, a climate scientist at the Met Office in Britain and a professor at the University of Exeter.
That message clearly hasn’t sunk in among policymakers, and perhaps the public as well, particularly in the developed world, which has maintained a sense of invulnerability.
The result is a lack of preparation, even in countries with resources. In the United States, flooding has killed more than 1,000 people since 2010 alone, according to federal data. In the Southwest, heat deaths have spiked in recent years.
Sometimes that is because governments have scrambled to respond to disasters they haven’t experienced before, like the heat wave in Western Canada last month, according to Jean Slick, head of the disaster and emergency management program at Royal Roads University in British Columbia. “You can have a plan, but you don’t know that it will work,” Ms. Slick said.
Other times, it’s because there aren’t political incentives to spend money on adaptation.
“By the time they build new flood infrastructure in their community, they’re probably not going to be in office anymore,” said Samantha Montano, a professor of emergency management at the Massachusetts Maritime Academy. “But they are going to have to justify millions, billions of dollars being spent.”
The data, collected by Folha from Inmet (the National Institute of Meteorology), together with researchers’ assessments, indicate that the city will face growing public health challenges, with more deaths related to heart disease, for example, which become more common during heat waves. It will also suffer mounting infrastructure problems, with more flooding in some periods and water shortages in others.
Rainfall is one of the clearest examples of São Paulo’s changing climate over the period. Up to 1980, the city had experienced only one event with more than 100 mm of rain in a single day. In the 2010s, there were six.
The state capital faced a level close to that in early February, when 114 mm was enough to flood stretches of the Marginais expressways, strand residents and suspend classes and public services.
On the other hand, rainless spells are growing longer. The 1960s opened with dry stretches of up to 15 days in some years. In the most recent decade, the city reached 51 consecutive dry days, in 2012.
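The two rainfall metrics described here, days with more than 100 mm of rain and the longest run of consecutive dry days, can be computed from a daily series in a few lines of Python. The values below are illustrative only, not the Inmet record:

```python
# Hypothetical daily rainfall series (mm); illustrative only,
# not the actual Inmet data for São Paulo.
daily_mm = [0, 0, 12.4, 0, 0, 0, 104.0, 3.2, 0, 0, 0, 0, 115.6, 0]

# Days with more than 100 mm in a single day (the extreme-rain
# threshold used in the analysis above).
extreme_days = sum(1 for mm in daily_mm if mm > 100)

# Longest run of consecutive days without rain.
longest_dry = run = 0
for mm in daily_mm:
    run = run + 1 if mm == 0 else 0
    longest_dry = max(longest_dry, run)

print(extreme_days)   # 2 extreme days in this sample
print(longest_dry)    # 4 consecutive dry days
```

The same two passes, applied decade by decade to the 1961–2019 record, would reproduce the comparison the article draws.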
After a sequence of droughts, the city endured the water crisis of 2014, when reservoirs fell to 10% of capacity, leading to rationing.
The Inmet data, which span 1961 to 2019 and are collected in the city’s northern zone, also show a shift in temperature patterns.
There are different ways to assess this variation. Considering the year-over-year differences accumulated across those 58 years, the average temperature is now about 2°C higher than at the start of the period (rising from around 20°C to around 22°C).
Looking at minimum temperatures, the warming is even greater (almost 3°C more, climbing from around 8°C to around 11°C).
Seen another way, minimum temperatures in the 2010s are 2.3°C higher than in the 1960s, comparing medians (the measure that identifies the temperature splitting the analyzed set into two equal halves).
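The median comparison can be sketched in a few lines of Python. The temperature values here are hypothetical, chosen only to mirror the reported 2.3°C gap between decades; they are not the Inmet series:

```python
from statistics import median

# Hypothetical daily minimum temperatures (°C) for two decades.
# Illustrative values only, not the actual Inmet observations.
mins_1960s = [8.0, 9.5, 10.0, 11.2, 9.8, 10.5, 9.0]
mins_2010s = [12.1, 11.5, 12.3, 10.8, 12.5, 11.9, 12.8]

# The median is the value that splits the sorted sample in half.
m60 = median(mins_1960s)
m10 = median(mins_2010s)
print(f"1960s median: {m60:.1f} C")   # 9.8 C
print(f"2010s median: {m10:.1f} C")   # 12.1 C
print(f"difference:   {m10 - m60:.1f} C")  # 2.3 C
```

Unlike the mean, the median is insensitive to a handful of unusually cold or warm days, which is why it is a common choice for decade-to-decade comparisons.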
Because the shifts in rainfall patterns and temperatures have been steady across the decades, climatologists say the current situation is likely to become the city’s new normal in the coming years. And projections point to an even greater presence of extreme events in the decades ahead.
“The situation demands significant improvement in disaster-reduction measures in the metropolitan region,” climatologist José Marengo and other Brazilian researchers wrote in an academic paper published earlier this year in the journal of the New York Academy of Sciences.
The study focused on rainfall patterns in the region; this report drew on that methodology for its analysis, adding more recent data.
The scientists note that the changes may be tied to natural climate variability, but may also be a product of global warming and of the region’s urbanization.
“Rising temperatures are a natural process that can be accelerated by human action, through urbanization, fossil fuel burning and deforestation,” Marengo, a scientist at Cemaden (the national center for natural disaster monitoring), told this reporter. “What has not been established is what percentage is natural and what percentage is human.”
Even though the cause of the changes in the city’s climate is not yet fully settled, research already exists on how higher temperatures and the new rainfall pattern affect the population’s health.
The elderly appear to be more sensitive to rising heat. One reason is that at that age the body has more difficulty adapting to changes in temperature. It is also slower to perceive rising heat, and therefore slower to hydrate.
Research shows that rising temperatures are linked to more deaths from cardiovascular and respiratory diseases.
In research conducted at IAG-USP (the University of São Paulo’s institute of atmospheric sciences), meteorologist Rafael Batista assessed the impact of high temperatures on deaths among the elderly.
The study found more deaths than expected in February 2014 in the São Paulo metropolitan region, during a strong heat wave (26 consecutive days with highs above 30°C).
Another impact of rising heat is higher water consumption, notes Leandro Giatti, a professor at USP’s School of Public Health.
And the situation could worsen, because the new rainfall pattern, with ever-stronger downpours alternating with longer dry spells, is not ideal for storing water in reservoirs.
In intense rains, water moves through the soil too quickly to be absorbed into the aquifers, and it carries dirt and sediment into the reservoirs.
She found an increase in hospital admissions for these diseases during the rainiest periods in Rio Branco, in the state of Acre, between 2008 and 2013.
All of these problems are expected to intensify, according to the scientists.
The study by meteorologist Rafael Batista, of IAG-USP, estimated what temperatures in the metropolitan region should look like through 2099, based on the trend of recent decades.
By that calculation, days of high-temperature risk (daily averages above 25°C) will come to occupy 40% of the year within the next six decades; today they account for just 8% of the year.
“Winter may come to resemble what we now know as summer,” said climatologist Fábio Gonçalves, of USP’s IAG (institute of atmospheric sciences). The institute also monitors the climate, from a station in the city’s southern zone, and its observations are similar to what Folha found.
Governments still stumble in curbing the problem
The higher temperatures and greater frequency of extreme events take on graver contours when one considers that the city never stops growing, neither in population (an average of 100,800 new inhabitants per year over the past decade) nor in urban footprint (which now covers 878.6 km², or 57% of the city’s territory).
São Paulo’s city government lists interventions such as the construction of stormwater retention basins (“piscinões”), drainage improvements and the creation of parks as responses. On the other hand, a Folha report earlier this month showed that the city has at least 17 major drainage projects behind schedule.
In 2009, under Mayor Gilberto Kassab, the city instituted its Municipal Climate Change Policy, which establishes actions to mitigate the effects of environmental change.
São Paulo also aims to cut carbon dioxide emissions by 45% over the next ten years relative to 2010 levels, and promises to neutralize emissions of greenhouse gases by 2050.
“The preambles of all the master plans, from the Basic Urban Plan of 1968 to the Strategic Master Plan of 2014, have chapters devoted to rainfall and the environment,” says Valter Caldana, a professor of architecture and urbanism at Mackenzie University, who argues that respect for environmental variables was a foundation of good architecture even before climate change entered the conversation.
The way cities are produced must change, the urbanist says. He cites practical measures: maintaining street furniture, increasing drainage capacity, ending mandatory building setbacks (which waste space), and getting companies to open green areas for public use.
“São Paulo used to have pockets of heat. Today the whole city has become one pocket of heat. We have to stop acting only in emergencies and act on a daily basis,” he says.
Vitor Aly, the engineer who serves as the city’s secretary of infrastructure and works, says the current administration has taken a proactive rather than reactive view of the problems stemming from climate change, in contrast to the past, when, he says, officials merely attacked the effects of floods.
“Flooding happens all over the world now. Look at Australia, England, Japan. It is a problem of modern society. We occupied the territory, and now we need to occupy ourselves with the problem,” Aly says.
He lists structural solutions the city government has been developing: the construction of retention basins (eight have been delivered, with five more planned for 2020); a study on raising bridges and overpasses, which act as dams when the rivers swell; and a mapping of the city’s 104 watersheds and flood zones, intended to give residents and builders precise warning of the risks in each region.
One of the commitments in the current administration’s goals plan is to reduce the city’s floodable areas by 12.6% (2.77 km²).
He believes that the cleaning of drainage lines and storm drains and the removal of debris from streams kept rainwater flowing during the latest rain episode, for example. According to him, the city’s drainage carried all the water to the Pinheiros and Tietê rivers; “it was those arteries that could not handle the whole volume,” Modonezi says. Maintenance of the two rivers is the state government’s responsibility.
“In other regions of the city we had small, localized flooding, sheets of water that ended up draining away after the rain passed,” he adds.
The goals plan devotes several line items to the problem: restoring 240,000 linear meters of curbs and gutters; cleaning 2.8 million square meters of stream banks; and removing 176,406 tons of debris from retention basins, among others.
In 2019, Mayor Bruno Covas (PSDB) announced a commitment to draw up a climate action plan to zero out emissions of greenhouse gases within 30 years. The mayor’s proposal is aligned with the goals of the Paris Agreement, repeatedly attacked by President Jair Bolsonaro (no party) in recent years.
Ricardo Viegas, deputy secretary for green areas and the environment, says the plan will be presented in June, but that several actions to control rising temperatures and the greenhouse effect are already under way. He says a major effort has been made on transportation in the city.
The so-called “climate law,” signed by then-Mayor João Doria (PSDB) in 2018, established, for example, that emissions of carbon dioxide and particulate matter from the municipal bus fleet must be zeroed out by 2038.
The response to heat islands and rising temperatures comes through expanding green areas. To that end, Viegas says the city will create ten parks by the end of the year and revitalize another 58. The city currently has 107 parks.
Other long-term proposals from the Covas administration include banning commercial establishments from providing plastic utensils, implementing water reuse in 100% of newly delivered public facilities, and expanding curbside recycling collection to every address in the capital.
Essay by Álvaro Rodrigues dos Santos, geologist and consultant in engineering geology, geotechnics and the environment
With each new rainy season, the deaths and disasters associated with landslides and floods return to the headlines. These are tragedies insistently foretold, yet they recur every year, given the lack of commitment with which public administration, at all three levels of government, has handled the issue.
Everyone knows full well that these phenomena stem directly from the misguided ways our cities expand: sealing their territories with impermeable surfaces; channeling and straightening their rivers; occupying land, such as steep slopes and stream banks, that should never be occupied, given its already very high natural susceptibility to risk; and also occupying moderately sloped land, where urban occupation would be acceptable, with wholly inadequate construction and planning techniques that end up turning even these areas into a veritable breeding ground of risk situations.
And with all this reality laid bare every year by technical experts and echoed by the media, the stark truth is that our authorities have not even taken the minimal, obvious step of ceasing to err, that is, of ceasing to commit the very mistakes that lie at the causal root of these geological, geotechnical and hydrological tragedies. As a consequence, instead of a reduction in the number of risk areas, what we see is their continuous multiplication.
The result is a frightening outlook: tragedies in risk areas tend to grow in frequency and lethality in exact proportion to the growth of our cities.
Against this backdrop, it must be understood that, from a technical standpoint, there is no gap whatsoever in the basic knowledge of geology, geotechnics and hydrology needed to solve these problems properly. Flood and landslide phenomena in the country’s most varied geological settings are already well studied and well understood. The instruments that would allow proper planning of urban land use and occupation are mastered, such as the essential Geotechnical Chart, a municipal map indicating which sites must never be occupied and which areas may be occupied provided the appropriate techniques are used. Paradoxical as it may seem, Brazil is an international leader in this technical field.
It is worth noting only that the country lacks an architectural and urban-planning culture specifically suited to occupying steeper terrain. This is evident both in the spontaneous methods used by the low-income population in self-building their homes and in larger private or public projects that enjoy the technical support of architects and urban planners and that, despite this basic and serious design error, are approved for construction by the municipal agencies responsible.
In both cases, that is, in popular improvisation and in more elaborate projects, the technical culture of the flat lot unfortunately prevails. Through cuts and fills produced by earthmoving on the slopes, builders obsessively seek to create flat platforms on which the development will be erected. This is a fatal design error. It is the technical tic invariably present in the massive production of risk areas in Brazilian cities that grow, in one way or another, over more rugged terrain.
It bears repeating, however, that the greatest obstacle to solving these problems properly remains the unwillingness and lack of commitment of public administrations to finally decide to order their cities’ urban expansion correctly. In this task, it is essential to recognize that the poorest populations will stop choosing risk areas for their homes only when the government, through bold housing programs, offers them decent, safe housing alternatives in the same cost range they can now find only by occupying risk areas. That is the bare truth of the matter. Either this basic equation is solved, or the creation of new risk situations will always far outpace the effort to defuse those already in place.
In short, public authorities must stop irresponsibly treating the question of risk areas as a problem for the Civil Defense and the Fire Department, however heroic those forces may be, and come to understand it as belonging to the field of housing policy and urban planning. Only from that perspective will public administration take active command of the situation, instead of merely trailing behind tragedies, a position that leaves it nothing but the sly expedient of, as always, blaming the rains for the misfortunes.
Álvaro Rodrigues dos Santos (santosalvaro@uol.com.br)
Former Director of Planning and Management at IPT – Instituto de Pesquisas Tecnológicas (Institute for Technological Research)
Author of the books “Geologia de Engenharia: Conceitos, Método e Prática,” “A Grande Barreira da Serra do Mar,” “Diálogos Geológicos,” “Cubatão,” “Enchentes e Deslizamentos: Causas e Soluções” and “Manual Básico para elaboração e uso da Carta Geotécnica.”
James Hansen’s new study explodes conventional goals of climate diplomacy and warns of 10 feet of sea level rise before 2100. The good news is, we can fix it.
James Hansen, the former NASA scientist whose congressional testimony put global warming on the world’s agenda a quarter-century ago, is now warning that humanity could confront “sea level rise of several meters” before the end of the century unless greenhouse gas emissions are slashed much faster than currently contemplated. This roughly 10 feet of sea level rise—well beyond previous estimates—would render coastal cities such as New York, London, and Shanghai uninhabitable. “Parts of [our coastal cities] would still be sticking above the water,” Hansen says, “but you couldn’t live there.”
Columbia University
This apocalyptic scenario illustrates why the goal of limiting temperature rise to 2 degrees Celsius is not the safe “guardrail” most politicians and media coverage imply it is, argue Hansen and 16 colleagues in a blockbuster study they are publishing this week in the peer-reviewed journal Atmospheric Chemistry and Physics. On the contrary, a 2 C future would be “highly dangerous.”
If Hansen is right—and he has been right about the big issues in climate science sooner, and for longer, than just about anyone—the implications are vast and profound.
Physically, Hansen’s findings mean that Earth’s ice is melting and its seas are rising much faster than expected. Other scientists have offered less extreme findings; the United Nations Intergovernmental Panel on Climate Change (IPCC) has projected closer to 3 feet of sea level rise by the end of the century, an amount experts say will be difficult enough to cope with. (Three feet of sea level rise would put runways of all three New York City-area airports underwater unless protective barriers were erected. The same holds for airports in the San Francisco Bay Area.)
Worldwide, approximately $3 trillion worth of infrastructure vital to civilization, such as water treatment plants, power stations, and highways, is located within 3 feet of sea level, according to the Stern Review, a comprehensive analysis published by the British government.
Hansen’s track record commands respect. From the time the soft-spoken Iowan told the U.S. Senate in 1988 that man-made global warming was no longer a theory but had in fact begun and threatened unparalleled disaster, he has consistently been ahead of the scientific curve.
Hansen has long suspected that computer models underestimated how sensitive Earth’s ice sheets were to rising temperatures. Indeed, the IPCC excluded ice sheet melt altogether from its calculations of sea level rise. For their study, Hansen and his colleagues combined ancient paleo-climate data with new satellite readings and an improved model of the climate system to demonstrate that ice sheets can melt at a “non-linear” rate: rather than an incremental melting as Earth’s poles inexorably warm, ice sheets might melt at exponential rates, shedding dangerous amounts of mass in a matter of decades, not millennia. In fact, current observations indicate that some ice sheets already are melting this rapidly.
“Prior to this paper I suspected that to be the case,” Hansen told The Daily Beast. “Now we have evidence to make that statement based on much more than suspicion.”
Politically, Hansen’s new projections amount to a huge headache for diplomats, activists, and anyone else hoping that a much-anticipated global climate summit the United Nations is convening in Paris in December will put the world on a safe path. President Barack Obama and other world leaders must now reckon with the possibility that the 2 degrees goal they affirmed at the Copenhagen summit in 2009 is actually a recipe for catastrophe. In effect, Hansen’s study explodes what has long been the goal of conventional climate diplomacy.
More troubling, honoring even the conventional 2 degrees C target has so far proven extremely challenging on political and economic grounds. Current emission trajectories put the world on track towards a staggering 4 degrees of warming before the end of the century, an amount almost certainly beyond civilization’s coping capacity. In preparation for the Paris summit, governments have begun announcing commitments to reduce emissions, but to date these commitments fall well short of satisfying the 2 degrees goal. Now, factor in the possibility that even 2 degrees is too much, and many negotiators may be tempted to throw up their hands in despair.
They shouldn’t. New climate science brings good news as well as bad. Humanity can limit temperature rise to 1.5 degrees C if it so chooses, according to a little-noticed study by experts at the Potsdam Institute for Climate Impact Research (now perhaps the world’s foremost climate research center) and the International Institute for Applied Systems Analysis, published in Nature Climate Change in May.
“Actions for returning global warming to below 1.5 degrees Celsius by 2100 are in many ways similar to those limiting warming to below 2 degrees Celsius,” said Joeri Rogelj, a lead author of the study. “However … emission reductions need to scale up swiftly in the next decades.” And there’s a significant catch: Even this relatively optimistic study concludes that it’s too late to prevent global temperature rising by 2 degrees C. But this overshoot of the 2 C target can be made temporary, the study argues; the total increase can be brought back down to 1.5 C later in the century.
Besides the faster emissions reductions Rogelj referenced, two additional tools are essential, the study outlines. Energy efficiency—shifting to less wasteful lighting, appliances, vehicles, building materials and the like—is already the cheapest, fastest way to reduce emissions. Improved efficiency has made great progress in recent years but will have to accelerate, especially in emerging economies such as China and India.
Also necessary will be breakthroughs in so-called “carbon negative” technologies. Call it the photosynthesis option: because plants inhale carbon dioxide and store it in their roots, stems, and leaves, one can remove carbon from the atmosphere by growing trees, planting cover crops, burying charred plant materials underground, and other kindred methods. In effect, carbon negative technologies can turn back the clock on global warming, making the aforementioned descent from the 2 C overshoot to the 1.5 C goal later in this century theoretically possible. Carbon-negative technologies thus far remain unproven at the scale needed, however; more research and deployment is required, according to the study.
Together, the Nature Climate Change study and Hansen’s new paper give credence to the many developing nations and climate justice advocates who have called for more ambitious action. The authors of the Nature Climate Change study point out that the 1.5 degrees goal “is supported by more than 100 countries worldwide, including those most vulnerable to climate change.” In May, the governments of 20 of those countries, including the Philippines, Costa Rica, Kenya, and Bangladesh, declared the 2 degrees target “inadequate” and called for governments to “reconsider” it in Paris.
Hansen too is confident that the world “could actually come in well under 2 degrees, if we make the price of fossil fuels honest.”
That means making the market price of gasoline and other products derived from fossil fuels reflect the enormous costs that burning those fuels currently externalizes onto society as a whole. Economists from left to right have advocated achieving this by putting a rising fee or tax on fossil fuels. This would give businesses, governments, and other consumers an incentive to shift to non-carbon fuels such as solar, wind, nuclear, and, best of all, increased energy efficiency. (The cheapest and cleanest fuel is the fuel you don’t burn in the first place.)
But putting a fee on fossil fuels will raise their price to consumers, threatening individual budgets and broader economic prospects, as opponents will surely point out. Nevertheless, higher prices for carbon-based fuels need not have injurious economic effects if the fees driving those higher prices are returned to the public to spend as it wishes. It’s been done that way for years with great success in Alaska, where all residents receive an annual dividend check funded by the state’s oil revenues.
“Tax Pollution, Pay People” is the bumper sticker summary coined by activists at the Citizens Climate Lobby. Legislation to this effect has been introduced in both houses of the U.S. Congress.
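The arithmetic behind “Tax Pollution, Pay People” can be sketched in a few lines. All of the numbers below are illustrative round figures chosen for the sketch, not values from any actual bill or proposal; the point is only that when revenue is returned equally per person, below-average emitters come out ahead.

```python
# Hypothetical fee-and-dividend sketch: a fee is collected per ton of CO2
# and the revenue is returned equally per person.

def dividend_per_person(fee_per_ton, total_tons, population):
    """Equal per-capita share of the total fee revenue."""
    return fee_per_ton * total_tons / population

def net_benefit(fee_per_ton, personal_tons, dividend):
    """Dividend received minus the fee paid through higher prices."""
    return dividend - fee_per_ton * personal_tons

fee = 50.0          # dollars per ton of CO2 (illustrative)
population = 330e6  # rough U.S. population
total = 5.0e9       # rough annual U.S. CO2 emissions, in tons

avg = total / population
div = dividend_per_person(fee, total, population)

print(f"dividend: ${div:.0f} per person per year")
print(f"average emitter nets ${net_benefit(fee, avg, div):.0f}")
print(f"half-average emitter nets ${net_benefit(fee, avg / 2, div):.0f}")
```

By construction, the average emitter breaks even and anyone below average gains, which is why proponents argue the fee need not squeeze household budgets overall.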
Meanwhile, there are also a host of other reasons to believe it’s not too late to preserve a livable climate for young people and future generations.
The transition away from fossil fuels has begun and is gaining speed and legitimacy. In 2014, global greenhouse gas emissions remained flat even as the world economy grew—a first. There has been a spectacular boom in wind and solar energy, including in developing countries, as their prices plummet. These technologies now qualify as a “disruptive” economic force that promises further breakthroughs, said Achim Steiner, executive director of the UN Environment Programme.
Coal, the most carbon-intensive conventional fossil fuel, is in a death spiral, partly thanks to another piece of encouraging news: the historic climate agreement the U.S. and China reached last November, which envisions both nations slashing coal consumption (as China is already doing). Hammering another nail into coal’s coffin, the leaders of Great Britain’s three main political parties pledged to phase out coal, no matter who won the general election last May.
“If you look at the long-term [for coal], it’s not getting any better,” said Standard & Poor’s Aneesh Prabhu when S&P downgraded coal company bonds to junk status. “It’s a secular decline,” not a mere cyclical downturn.
Last but not least, a vibrant mass movement has arisen to fight climate change, most visibly manifested when hundreds of thousands of people thronged the streets of New York City last September, demanding action from global leaders gathered at the UN. The rally was impressive enough that it led oil and gas giant ExxonMobil to increase its internal estimate of how likely the U.S. government is to take strong action. “That many people marching is clearly going to put pressure on government to do something,” an ExxonMobil spokesman told Bloomberg Businessweek.
The climate challenge has long amounted to a race between the imperatives of science and the contingencies of politics. With Hansen’s paper, the science has gotten harsher, even as the Nature Climate Change study affirms that humanity can still choose life, if it will. The question now is how the politics will respond—now, at Paris in December, and beyond.
* * *

In the future, there could be major flooding along every coast. So says a new study that warns the world’s seas are rising.
Ever-warming oceans that are melting polar ice could raise sea levels 15 feet in the next 50 to 100 years, NASA’s former climate chief now says. That’s five times higher than previous predictions.
“This is the biggest threat the planet faces,” said James Hansen, the co-author of the new journal article raising that alarm scenario.
“If we get sea level rise of several meters, all coastal cities become dysfunctional,” he said. “The implications of this are just incalculable.”
If ocean levels rise just 10 feet, areas like Miami, Boston, Seattle and New York City would face flooding.
The melting ice would cool ocean surfaces at the poles even more, even as the overall climate continues to warm. The temperature difference would fuel even more volatile weather.
“As the atmosphere gets warmer and there’s more water vapor, that’s going to drive stronger thunderstorms, stronger hurricanes, stronger tornadoes, because they all get their energy from the water vapor,” said Hansen.
Nearly a decade ago, Hansen told “60 Minutes” we had 10 years to get global warming under control, or we would reach a “tipping point.”
“It will be a situation that is out of our control,” he said. “We’re essentially at the edge of that. That’s why this year is a critical year.”
Critical because of a United Nations meeting in Paris that is designed to reach legally binding agreements on carbon emissions, the greenhouse gases that drive global warming.
* * *
Sea Levels Could Rise Much Faster than Thought (Climate Denial Crock of the Week)
James Hansen has often been out ahead of his scientific colleagues.
With his 1988 congressional testimony, the then-NASA scientist is credited with putting the global warming issue on the map by saying that a warming trend had already begun. “It is time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here,” Hansen famously testified.
Now Hansen — who retired in 2013 from his NASA post, and is currently an adjunct professor at Columbia University’s Earth Institute — is publishing what he says may be his most important paper. Along with 16 other researchers — including leading experts on the Greenland and Antarctic ice sheets — he has authored a lengthy study outlining a scenario of potentially rapid sea level rise combined with more intense storm systems.
It’s an alarming picture of where the planet could be headed — and hard to ignore, given its author. But it may also meet with considerable skepticism in the broader scientific community, given that its scenarios of sea level rise occur more rapidly than those ratified by the United Nations’ Intergovernmental Panel on Climate Change in its latest assessment of the state of climate science, published in 2013.
In the new study, Hansen and his colleagues suggest that the “doubling time” for ice loss from West Antarctica — the time period over which the amount of loss could double — could be as short as 10 years. In other words, a non-linear process could be at work, triggering major sea level rise in a time frame of 50 to 200 years. By contrast, Hansen and colleagues note, the IPCC assumed more of a linear process, suggesting only around 1 meter of sea level rise, at most, by 2100.
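The gap between the linear and non-linear cases can be made concrete with a rough sketch. The 10-year doubling time comes from the study’s discussion; the starting loss rate and time horizons below are hypothetical round numbers, chosen only to show how quickly cumulative loss diverges from the linear case.

```python
# Illustrative comparison (not from the paper): cumulative ice-mass loss
# under a constant annual rate vs. a rate that doubles every 10 years.

def cumulative_loss_linear(rate0, years):
    """Total loss if the annual loss rate stays constant at rate0."""
    return rate0 * years

def cumulative_loss_doubling(rate0, years, doubling_time=10):
    """Total loss if the annual rate doubles every `doubling_time` years."""
    total = 0.0
    for year in range(years):
        total += rate0 * 2 ** (year / doubling_time)
    return total

rate0 = 1.0  # hypothetical unit of ice mass lost in the first year
for horizon in (10, 30, 50):
    lin = cumulative_loss_linear(rate0, horizon)
    dbl = cumulative_loss_doubling(rate0, horizon)
    print(f"{horizon} yrs: linear {lin:.0f} units, doubling {dbl:.0f} units")
```

Over 50 years the doubling case loses several times more mass than the linear case, which is why a short doubling time turns a slow background trend into multi-meter sea level rise within a century.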
Here, a clip from our extended interview with Eric Rignot in December of 2014. Rignot is one of the co-authors of the new study.
* * *
Earth’s Most Famous Climate Scientist Issues Bombshell Sea Level Warning (Slate)
In what may prove to be a turning point for political action on climate change, a breathtaking new study casts extreme doubt about the near-term stability of global sea levels.
The study—written by James Hansen, NASA’s former lead climate scientist, and 16 co-authors, many of whom are considered among the top in their fields—concludes that glaciers in Greenland and Antarctica will melt 10 times faster than previous consensus estimates, resulting in sea level rise of at least 10 feet in as little as 50 years. The study, which has not yet been peer-reviewed, brings new importance to a feedback loop in the ocean near Antarctica that results in cooler freshwater from melting glaciers forcing warmer, saltier water underneath the ice sheets, speeding up the melting rate. Hansen, who is known for being alarmist and also right, acknowledges that his study implies change far beyond previous consensus estimates. In a conference call with reporters, he said he hoped the new findings would be “substantially more persuasive than anything previously published.” I certainly find them to be.
To come to their findings, the authors used a mixture of paleoclimate records, computer models, and observations of current rates of sea level rise, but “the real world is moving somewhat faster than the model,” Hansen says.
Hansen’s study does not attempt to predict the precise timing of the feedback loop, only that it is “likely” to occur this century. The implications are mindboggling: In the study’s likely scenario, New York City—and every other coastal city on the planet—may only have a few more decades of habitability left. That dire prediction, in Hansen’s view, requires “emergency cooperation among nations.”
We conclude that continued high emissions will make multi-meter sea level rise practically unavoidable and likely to occur this century. Social disruption and economic consequences of such large sea level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.
The science of ice melt rates is advancing so fast, scientists have generally been reluctant to put a number to what is essentially an unpredictable, nonlinear response of ice sheets to a steadily warming ocean. With Hansen’s new study, that changes in a dramatic way. One of the study’s co-authors is Eric Rignot, whose own study last year found that glacial melt from West Antarctica now appears to be “unstoppable.” Chris Mooney, writing for Mother Jones, called that study a “holy shit” moment for the climate.
One necessary note of caution: Hansen’s study comes via a nontraditional publishing decision by its authors. The study will be published in Atmospheric Chemistry and Physics, an open-access “discussion” journal, and will not have formal peer review prior to its appearance online later this week. [Update, July 23: The paper is now available.] The complete discussion draft circulated to journalists was 66 pages long, and included more than 300 references. The peer review will take place in real time, with responses to the work by other scientists also published online. Hansen said this publishing timeline was necessary to make the work public as soon as possible before global negotiators meet in Paris later this year. Still, the lack of traditional peer review and the fact that this study’s results go far beyond what’s been previously published will likely bring increased scrutiny. On Twitter, Ruth Mottram, a climate scientist whose work focuses on Greenland and the Arctic, was skeptical of such enormous rates of near-term sea level rise, though she defended Hansen’s decision to publish in a nontraditional way.
In 2013, Hansen left his post at NASA to become a climate activist because, in his words, “as a government employee, you can’t testify against the government.” In a wide-ranging December 2013 study, conducted to support Our Children’s Trust, a group advancing legal challenges to lax greenhouse gas emissions policies on behalf of minors, Hansen called for a “human tipping point”—essentially, a social revolution—as one of the most effective ways of combating climate change, though he still favors a bilateral carbon tax agreed upon by the United States and China as the best near-term climate policy. In the new study, Hansen writes, “there is no morally defensible excuse to delay phase-out of fossil fuel emissions as rapidly as possible.”
Asked whether Hansen has plans to personally present the new research to world leaders, he said: “Yes, but I can’t talk about that today.” What’s still uncertain is whether, like with so many previous dire warnings, world leaders will be willing to listen.
* * *
Ice Melt, Sea Level Rise and Superstorms (Climate Sciences, Awareness and Solutions / Earth Institute, Columbia University)
23 July 2015
James Hansen
The paper “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2°C global warming is highly dangerous” has been published in Atmospheric Chemistry and Physics Discussion and is freely available here.
The paper draws on a large body of work by the research community, as indicated by its 300 references. No doubt we missed some important relevant contributions, which we may be able to rectify in the final version of the paper. I thank all the researchers who provided data or information, many of whom I may have failed to include in the acknowledgments, as the work on the paper occurred over a period of several years.
I am especially grateful to the Durst family for a generous grant that allowed me to work full time this year on finishing the paper, as well as the other supporters of our program Climate Science, Awareness and Solutions at the Columbia University Earth Institute.
In the conceivable event that you do not read the full paper plus supplement, I include the Acknowledgments here:
Acknowledgments. Completion of this study was made possible by a generous gift from The Durst Family to the Climate Science, Awareness and Solutions program at the Columbia University Earth Institute. That program was initiated in 2013 primarily via support from the Grantham Foundation for Protection of the Environment, Jim and Krisann Miller, and Gerry Lenfest and sustained via their continuing support. Other substantial support has been provided by the Flora Family Foundation, Dennis Pence, the Skoll Global Threats Fund, Alexander Totic and Hugh Perrine. We thank Anders Carlson, Elsa Cortijo, Nil Irvali, Kurt Lambeck, Scott Lehman, and Ulysses Ninnemann for their kind provision of data and related information. Support for climate simulations was provided by the NASA High-End Computing (HEC) Program through the NASA Center for Climate Simulation (NCCS) at Goddard Space Flight Center.
* * *

A new study shows that, from 1500 until 2000, about a third of the floods in the southwestern Netherlands were deliberately caused by humans during wartime. Some of these inundations resulted in significant changes to the landscape, and were as damaging as floods caused by heavy rainfall or storm surges. The work, by Dutch researcher Adriaan de Kraker, is published in Hydrology and Earth System Sciences, a journal of the European Geosciences Union (EGU).
During the Eighty Years’ War, as the Spanish army fought to recapture territory in what is now northern Belgium and southwestern Netherlands in the late sixteenth century, the Dutch rebels led by William of Orange decided to use the low-lying, flood-prone landscape to their advantage. In an attempt to liberate Bruges, Ghent and Antwerp from Spanish dominance and defend their territory, the rebels destroyed seawalls at strategic places from 1584 to 1586 to cause deliberate, large-scale floods.
“The plan got completely out of hand,” says De Kraker, an assistant professor at the VU University Amsterdam in the Netherlands. “It came at the expense of the countryside of northern Flanders, now Zeeland Flanders, some two-thirds of which was flooded.”
Floods can result in loss of life and damage homes and businesses, and when the water remains inland for a long time, it can change the landscape through erosion and deposition, forming new tidal channels and creeks. The area flooded during the Eighty Years’ War became part of a strategic line of defence and remained inundated for more than 100 years in some places, with profound consequences for the landscape. After the waters receded, a thick layer of clay covered all remnants of buildings and roads in the area. As sea water was used, soil salinity increased, affecting agricultural yields.
“Strategic flooding is a highly risky tactic. It can only be successful if there’s a well-thought-out backup plan and a plan for fast repairs,” warns De Kraker. However, that was not the case here, he says: “I desperately looked for evidence of backup plans for the repair of the dykes and who was going to pay for the costs incurred. I could find hardly any records of such plans.”
De Kraker has been studying historical floods – occurring from the year 1500 to 2000 – in southwestern Netherlands since the 1980s to find out their causes and outcomes. Mostly below sea level, and dominated by three river estuaries populated with islands and a system of dykes and dams that protect the fertile land from the sea, this region is particularly susceptible to floods.
In his research, De Kraker used documents relating to land ownership and land use, accounts of maintenance of sea defences, and correspondence between stakeholders, such as rebels, Spanish officials, and mayors of besieged towns. He also used aerial photographs of the area, historical maps and maps of soil and landscape changes.
As reported in the new Hydrology and Earth System Sciences article, he noticed the main floods in the area in the past 500 years could be grouped into those caused by storm surges (21 events) and those happening during wartimes (11 events). The former had natural causes and the latter were created by humans, but De Kraker says human action played a major role in both.
The most damaging flood occurred in the winter of 1953, when strong winds blew for two days causing a long-lasting storm surge, which resulted in extremely high water levels. Over 1800 people died, 100 000 were evacuated and damages reached the equivalent of 700 million euros. While the cause of this flood was natural, De Kraker says human factors contributed to the extent of the damage. He reports that officials were slow at responding to the event, failing to take mitigation measures such as raising the dykes fast enough. Weak building construction and inadequate rescue procedures contributed to the material damage and human toll.
The study also shows floods in the Netherlands were used as a weapon as recently as the 1940s. “Strategic flooding during the Second World War undertaken by the Germans remained purely defensive, while the Allied flooding of the former island of Walcheren in the southwest of the country sped up the Allied offensive,” says De Kraker.
* * *

Carlos de Almeida, 50, a resident of Igaratá (SP), holds a photo of the old town, flooded since March 1969. Tiago Queiroz/Estadão

The drought affecting the Jaguari River has exposed the ruins of a town that had been submerged since March 1969, when construction began on the reservoirs used to generate power for the Vale do Paraíba region and to feed the Cantareira System.

At the bottom of a reservoir now 30 meters below its normal level, between Joanópolis and São José dos Campos in the interior of São Paulo state, the main church, the square, and the main street of Igaratá Velha have resurfaced and become a tourist attraction.

The 2,000 residents of the old settlement of Igaratá Velha, founded around 1865 at a confluence of the Jaguari and Peixe rivers, were relocated to a new town of the same name a century later. Established in December 1969, 3 kilometers from the old town, on land that the former Centrais Elétricas de São Paulo (Cesp) donated to the residents, the new Igaratá today has about 9,000 inhabitants.

The reappearance of the ruins of the Nossa Senhora do Patrocínio church has moved those who lived in the old settlement. A group has placed a new cross where the church once stood. “The older folks come and spend Sunday praying around the cross. They don’t want the water to cover the church again,” says farmer Edilson Cardoso, 32.

With a painting of Igaratá Velha under his arm, fisherman José Carlos de Almeida, 50, charges R$ 5 to take tourists by canoe to the ruins of the old schoolhouse in the middle of the reservoir. “If the reservoir drops the remaining 10 meters, the whole town will reappear.”

The Igaratá city government has also improved the dirt road leading to the ruins to make visiting easier. “It’s a shame there is no money to preserve the artifacts that have been found. People have already carried a lot away,” says Igaratá’s secretary of public works, Emerson Rodrigues, 35.

Floods

At the time of the relocation carried out by Cesp, most residents agreed with the move. “There was so much flooding. In the rainy season everyone had to leave their homes. Only the oldest residents didn’t want to move,” recalls José Rodrigues, 72.

In the new Igaratá, the emotion over the reappearance of the old church seems to have numbed concern about the drought. Even among the young, curiosity runs high. Many want to find out where their grandmother’s house stood, or their late aunt’s, or the mayor’s.

Roof tiles from the 1940s, staircases, washtubs, and the remains of the square’s benches can be seen on the dry ground. In the middle of the reservoir stand the stakes of houses demolished at the time of the flooding.

“Every week something new appears. Many elderly people come here and get emotional, they really cry,” says Fabio Saltonato, 28. “I want to find the house that belonged to my father. From what I’ve seen in photos, if the water drops another 2 meters it will appear. Maybe after Carnival.”

But the drought has crushed tourism, Igaratá’s main economic activity. Along the edge of the reservoir, dozens of country homes and vacation properties are up for sale. Spots that once operated as marinas sit empty. “With this transfer of water out of the reservoir, the town is going to ‘die’ economically. That is our fear,” says the secretary of public works.
* * *

Jan. 16, 2014 — South-central Idaho and the surface of Mars have an interesting geological feature in common: amphitheater-headed canyons. These U-shaped canyons with tall vertical headwalls are found near the Snake River in Idaho as well as on the surface of Mars, according to photographs taken by satellites. Various explanations for how these canyons formed have been offered — some for Mars, some for Idaho, some for both — but in a paper published the week of December 16 in the online issue of Proceedings of the National Academy of Sciences, Caltech professor of geology Michael P. Lamb, Benjamin Mackey, formerly a postdoctoral fellow at Caltech, and W. M. Keck Foundation Professor of Geochemistry Kenneth A. Farley offer a plausible account that all these canyons were created by enormous floods.
Canyons in Malad Gorge State Park, Idaho, are carved into a relatively flat plain composed of a type of volcanic rock known as basalt. The basalt originated from a hotspot, located in what is now Yellowstone Park, which has been active for the last few million years. Two canyons in Malad Gorge, Woody’s Cove and Stubby Canyon, are characterized by tall vertical headwalls, roughly 150 feet high, that curve around to form an amphitheater. Other amphitheater-headed canyons can be found nearby, outside the Gorge — Box Canyon, Blue Lakes Canyon, and Devil’s Corral — and also elsewhere on Earth, such as in Iceland.
To figure out how they formed, Lamb and Mackey conducted field surveys and collected rock samples from Woody’s Cove, Stubby Canyon, and a third canyon in Malad Gorge, known as Pointed Canyon. As its name indicates, Pointed Canyon ends not in an amphitheater but in a point, as it progressively narrows in the upstream direction toward the plateau at an average 7 percent grade. Through Pointed Canyon flows the Wood River, a tributary of the larger Snake River, which in turn empties into the Columbia River on its way to the Pacific Ocean.
Geologists have a good understanding of how the rocks in Woody’s Cove and Stubby Canyon achieved their characteristic appearance. The lava flows that hardened into basalt were initially laid down in layers, some more than six feet thick. As the lava cooled, it contracted and cracked, just as mud does when it dries. This produced vertical cracks across the entire layer of lava-turned-basalt. As each additional sheet of lava covered the same land, it too cooled and cracked vertically, leaving a wall that, when exposed, looks like stacks of tall blocks, slightly offset from one another with each additional layer. This type of structure is called columnar basalt.
While the formation of columnar basalt is well understood, it is not clear how, at Woody’s Cove and Stubby Canyon, the vertical walls became exposed or how they took on their curved shapes. The conventional explanation is that the canyons were formed via a process called “groundwater sapping,” in which springs at the bottom of the canyon gradually carve tunnels at the base of the rock wall until this undercutting destabilizes the structure so much that blocks or columns of basalt fall off from above, creating the amphitheater below.
This explanation has not been corroborated by the Caltech team’s observations, for two reasons. First, there is no evidence of undercutting, even though there are existing springs at the base of Woody’s Cove and Stubby Canyon. Second, undercutting should leave large boulders in place at the foot of the canyon, at least until they are dissolved or carried away by groundwater. “These blocks are too big to move by spring flow, and there’s not enough time for the groundwater to have dissolved them away,” Lamb explains, “which means that large floods are needed to move them out. To make a canyon, you have to erode the canyon headwall, and you also have to evacuate the material that collapses in.”
That leaves waterfall erosion during a large flood event as the only remaining candidate for the canyon formation that occurred in Malad Gorge, the Caltech team concludes.
No water flows over the top of Woody’s Cove and Stubby Canyon today. But even a single incident of overland water flow occurring during an unusually large flood event could pluck away and topple boulders from the columnar basalt, taking advantage of the vertical fracturing already present in the volcanic rock. A flood of this magnitude could also carry boulders downstream, leaving behind the amphitheater canyons we see today without massive boulder piles at their bottoms and with no existing watercourses.
Additional evidence that water flowed over the plateaus near Woody’s Cove and Stubby Canyon at some point in the past is the presence of scour marks on surface rocks above the canyons. These scour marks record the kind of abrasion that occurs when a sediment-laden water discharge moves overland.
Taken together, the evidence from Malad Gorge, Lamb says, suggests that “amphitheater shapes might be diagnostic of very large-scale floods, which would imply much larger water discharges and much shorter flow durations than predicted by the previous groundwater theory.” Lamb points out that although groundwater sapping “is often assumed to explain the origin of amphitheater-headed canyons, there is no place on Earth where it has been demonstrated to work in columnar basalt.”
Closing the case on the canyons at Malad Gorge required one further bit of information: the ages of the rock samples. This was accomplished at Caltech’s Noble Gas Lab, run by Kenneth A. Farley, W. M. Keck Foundation Professor of Geochemistry and chair of the Division of Geological and Planetary Sciences.
The key to dating surface rocks on Earth is cosmic rays — very high-energy particles from space that regularly strike Earth. “Cosmic rays interact with the atmosphere and eventually with rocks at the surface, producing alternate versions of noble gas elements, or isotopes, called cosmogenic nuclides,” Lamb explains. “If we know the cosmic-ray flux, and we measure the accumulation of nuclides in a certain mineral, then we can calculate the time that rock has been sitting at Earth’s surface.”
At the Noble Gas Lab, Farley and Mackey determined that rock samples from the heads of Woody’s Cove and Stubby Canyon had been exposed for the same length of time, approximately 46,000 years. If Lamb and his colleagues are correct, this is when the flood event occurred that plucked the boulders off the canyon walls, leaving the amphitheaters behind.
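For a stable cosmogenic nuclide, the dating arithmetic Lamb describes reduces to a linear relation: the nuclide concentration grows in proportion to exposure time, so dividing a measured concentration by the local production rate gives the age. The sketch below illustrates this; the nuclide, concentration, and production-rate values are illustrative assumptions, not the numbers used in the actual study.

```python
def exposure_age(concentration_atoms_per_g, production_rate_atoms_per_g_yr):
    """Exposure age in years for a stable cosmogenic nuclide.

    Assumes a constant production rate and no erosion or burial,
    so concentration = production_rate * time.
    """
    return concentration_atoms_per_g / production_rate_atoms_per_g_yr


# Hypothetical values: an assumed local production rate of 120 atoms/g/yr
# and a measured concentration of 5.52e6 atoms/g yield the ~46,000-year
# exposure age reported for the canyon heads.
age = exposure_age(5.52e6, 120.0)
```

In practice the production rate itself must be corrected for latitude, elevation, and shielding, which is where most of the real work in cosmogenic dating lies.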
Further evidence supporting the team’s theory can be found in Pointed Canyon. Rock samples collected along the walls of the first kilometer of the canyon show progressively more exposure in the downstream direction, suggesting that the canyon is still being carved by Wood River. Using the dates of exposure revealed in the rock samples, Lamb reconstructed the probable location of Pointed Canyon at the time of the formation of Woody’s Cove and Stubby Canyon. At that location, where the rock has been exposed approximately 46,000 years, the surrounding canyon walls form the characteristic U-shape of an amphitheater-headed canyon and then abruptly narrow into the point that forms the remainder of Pointed Canyon. “The same megaflood event that created Woody’s Cove and Stubby Canyon seems to have created Pointed Canyon,” Lamb concludes. “The only difference is that the other canyons had no continuing river action, while Pointed Canyon was cut relatively slowly over the last 46,000 years by the Wood River, which is not powerful enough to topple and pluck basalt blocks from the surrounding plateau, resulting in a narrow channel rather than tall vertical headwalls.”
Solving the puzzle of how amphitheater-headed canyons are created has implications reaching far beyond south-central Idaho because similar features — though some much larger — are also present on the surface of Mars. “A very popular interpretation for the amphitheater-headed canyons on Mars is that groundwater seeps out of cracks at the base of the canyon headwalls and that no water ever went over the top,” Lamb says. Judging from the evidence in Idaho, however, it seems more likely that on Mars, as on Earth, amphitheater-headed canyons were created by enormous flood events, suggesting that Mars was once a very watery planet.
Journal Reference:
M. P. Lamb, B. H. Mackey, K. A. Farley. Amphitheater-headed canyons formed by megaflooding at Malad Gorge, Idaho. Proceedings of the National Academy of Sciences, 2013; 111 (1): 57 DOI: 10.1073/pnas.1312251111
Dec. 3, 2012 — Floods have once again wreaked havoc across the country, and climate scientists and meteorologists suggest that the problem is only going to get worse, with wetter winters and rivers bursting their banks becoming the norm. A team based at Newcastle University and their colleagues in China have developed a computer model that can work out how a flood flow will develop and where flooding will be worst, based on an understanding of fluid dynamics and the underlying topography of a region.
Writing in the journal Progress in Computational Fluid Dynamics, Newcastle civil engineer Qiuhua Liang and his colleagues Chi Zhang of Dalian University of Technology and Junxian Yin of the China Institute of Water Resources and Hydropower Research in Beijing explain how they have developed an adaptive computer model that could provide accurate and efficient predictions about the flow of water as a flood occurs. Such a model might provide environmental agencies and authorities with a more precise early-warning system for residents and businesses in a region at risk of flood. It could also be used by insurance companies to determine the relative risk of different areas within a given region and so make their underwriting of the risk economically viable.
The model is based on a numerical solution to the hydrodynamic equations of fluid flow. This allows the researchers to plot the likely movement of water during a dam break or flash flood over different kinds of terrain and around obstacles, even when flood waves are spreading rapidly. The researchers have successfully tested their model on real-world flood data.
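The paper's own adaptive scheme is not reproduced in the article, but the general idea of a first-order numerical solution to the governing shallow-water equations can be sketched with a basic Lax-Friedrichs finite-volume update. Everything below — the grid, time step, flux choice, and idealised dam-break setup — is an illustrative assumption, not the authors' method.

```python
G = 9.81  # gravitational acceleration, m/s^2

def flux(h, hu):
    """Physical flux of the 1D shallow-water equations for one cell.

    h is water depth, hu is depth-integrated momentum.
    """
    u = hu / h if h > 1e-12 else 0.0
    return hu, hu * u + 0.5 * G * h * h

def step(h, hu, dx, dt):
    """One first-order Lax-Friedrichs update on a periodic 1D grid."""
    n = len(h)
    new_h, new_hu = h[:], hu[:]
    for i in range(n):
        l, r = (i - 1) % n, (i + 1) % n
        fl, fr = flux(h[l], hu[l]), flux(h[r], hu[r])
        new_h[i] = 0.5 * (h[l] + h[r]) - dt / (2 * dx) * (fr[0] - fl[0])
        new_hu[i] = 0.5 * (hu[l] + hu[r]) - dt / (2 * dx) * (fr[1] - fl[1])
    return new_h, new_hu

def dam_break(n=100, steps=50, dx=1.0, dt=0.05):
    """Idealised dam break: deep water on the left, shallow on the right."""
    h = [2.0 if i < n // 2 else 1.0 for i in range(n)]
    hu = [0.0] * n
    for _ in range(steps):
        h, hu = step(h, hu, dx, dt)
    return h
```

A production flood model adds real topography, wetting and drying, and (as in the paper) adaptive grids, but the conservation-law update above is the core that any such solver iterates.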
The team points out that flood disasters have become a major threat to human lives and assets. “Flood management is therefore an important task for different levels of governments and authorities in many countries”, the researchers explain. “The availability of accurate and efficient flood modelling tools is vital to assist engineers and managers charged with flood risk assessment, prevention and alleviation.”
Journal Reference:
Chi Zhang, Qiuhua Liang, Junxian Yin. A first-order adaptive solution to rapidly spreading flood waves. Progress in Computational Fluid Dynamics, An International Journal, 2013; 13 (1): 1 DOI: 10.1504/PCFD.2013.050645
Nov. 25, 2012 — New research shows concerns about governmental failure to act effectively and fairly in the aftermath of extreme weather events can affect the degree to which residents are willing to protect themselves.
Published in the journal Nature Climate Change, the findings of a team led by scientists at the University of Exeter could prove key to establishing how society should evolve to cope with more turbulent weather and more frequent mega storms.
The team examined attitudes in Cumbria in north west England and Galway in western Ireland, which were both hit by heavy flooding in November 2009. Record rainfall was recorded in both countries, resulting in a number of deaths, properties being severely damaged and economic disruption.
Professor Neil Adger of Geography at the University of Exeter, who led the research, said: “The flooding of 2009 was devastating to both communities. Our study is the first to track the impacts of floods across two countries and how communities and individuals demand change after such events. When people in both studies felt that government had fallen short of their expectations, we found that the resulting perception of helplessness leads to an unwillingness to take personal action to prevent flooding in future.”
Scientists at the University of Exeter worked with colleagues at the National University of Ireland Maynooth and the Tyndall Centre for Climate Change Research at the University of East Anglia, which also provided funding for the study.
Researchers surveyed 356 residents in both areas eight months after the flooding. They measured perceptions of governments’ performances in dealing with the aftermath, as well as perceptions of fairness in that response and the willingness of individuals to take action.
Dr Irene Lorenzoni of the Tyndall Centre comments: “Residents in Galway were significantly more likely to believe that their property would be flooded again than those in Cumbria. Yet it was Cumbrians who believed they had more personal responsibility to adapt to reduce future incidents.
“Whether people felt responses were fair also diverged. In our survey in Cumbria three quarters of respondents agreed that everyone in their community had received prompt help following the flooding, while in Galway it was less than half.”
Dr Conor Murphy of the National University of Ireland, Maynooth said: “The strong perception in Galway that authorities failed to deliver on the expectations of flooded communities in late 2009 is a wakeup call. Given the high exposure of development in flood prone areas it is clear that both England and Ireland need to make major investments in building flood resilience with changing rainfall patterns induced by climate change. Political demand for those investments will only grow.”
Professor Adger says: “Our research shows that climate change is likely to lead to a series of crises which will cause major disruption as instant short-term solutions are sought. We need to consider the implicit contract between citizens and government agencies when planning for floods, to enable fairer and smoother processes of adaptation.”
Journal Reference:
W. Neil Adger, Tara Quinn, Irene Lorenzoni, Conor Murphy, John Sweeney. Changing social contracts in climate-change adaptation. Nature Climate Change, 2012; DOI: 10.1038/nclimate1751