The ostensibly large number of recent extreme weather events has triggered intensive discussions, both in- and outside the scientific community, on whether they are related to global warming. Here, we review the evidence and argue that for some types of extreme — notably heatwaves, but also precipitation extremes — there is now strong evidence linking specific events or an increase in their numbers to the human influence on climate. For other types of extreme, such as storms, the available evidence is less conclusive, but based on observed trends and basic physical concepts it is nevertheless plausible to expect an increase.
I sent the article around to some researchers working on these questions. Here are their reactions, along with another valuable assessment posted by Michael Tobis at Planet 3.0:
– Exaggerated language, and many unsubstantiated assertions. For instance, in what manner did the last decade experience an “unprecedented” number of extreme weather events? Note that the increase in heat waves was largely balanced by a decrease in cold waves.
– Overly simplistic view of the relation between damage, human suffering, and the extremes. Much more balanced arguments can be found in R. Pielke Jr.’s work that consider changes in society, communities, coastal development, etc. Also, a more useful perspective is found in the recent EOS article by Mike Wallace, titled “Weather and Climate Extreme Events: Teachable Moments.”
– Very few of the [cases of extreme weather listed in the paper] have undergone a scientific investigation of contributing factors, let alone human impacts. I believe that a read of the Lewis and Clark journals would reveal an impressive list of extreme weather as well. So what is one to make of this list for the 2001-2011 period provided in this Perspective by Coumou and Rahmstorf? The fact is that extremes happen, have happened, and will continue to happen. For some, their character, preferred phase, and intensity may be changing (aside from temperature extremes, the detection and attribution evidence to date is weak).
– I suspect that if one engaged in grand mitigation today (as useful as that would be for many other purposes), many of the extremes listed in [the paper] would happen anyway, and will likely happen again.
– The piece lacks any perspective on the human and technological elements contributing to greater observational capacity to sense extremes (radar, satellite), and it does not consider the reality of heightened public interest in extremes, given recent public discourse.
– The matter of attribution, as raised in the second-to-last paragraph, is a much broader science than merely determining the change in probability due to greenhouse-gas forcing, which is an inherently difficult and uncertain undertaking. The piece ignores the broader context in which all manner of contributing factors is assessed to understand the magnitude of events and their temporal and regional specificity (e.g., why did the heat wave happen over Texas rather than Washington, why did it occur in 2011 and not in 2009 or next year, and why did it break the previous records by a factor of 2?). After all, the irony of extreme events is that the larger the magnitude, the smaller the fractional contribution of human climate change.
– Consistent with the policy-directed tone of this piece, hyperbole is used throughout. The piece often conflates apparent “effects” of apparent changes in extremes in the last decade with causes not expected to arise until the latter part of the 21st century.
My reactions to the article are very much along the same lines as Marty Hoerling’s. By exaggerating the influence of climate change on today’s weather and climate-related extreme events, a part of our community is painting itself into a rhetorical corner.
My opinion piece, “Weather and Climate-Related Extreme Events: Teachable Moments,” to which Hoerling refers, serves as a counterpoint to Coumou and Rahmstorf’s article. Before submitting it to Eos, as an experiment, I submitted it to Nature Climate Change, where their article was published. I cannot say that I was surprised when the editors informed me that they would not be sending it out for review because “we are not persuaded that your article represents a sufficiently substantial contribution to the ‘climate change debate’ [my quotation marks] to justify publication in the journal”. Perhaps to ease the pain of rejection, the editor added, “more Commentaries are actively commissioned and […] we only rarely publish unsolicited contributions to the section”.
Although it may sound a bit like sour grapes, here’s the way that I’ve rationalized Nature’s editorial decision. I’ve become convinced that many of the editors of the high impact journals are inclined to cast opinion pieces as salvos in the ongoing war between climate change believers and skeptics. Articles like mine that take issue with the way in which the war is being waged are not particularly welcome. By soliciting opinion pieces and by selecting, from among the growing list of contributed articles, the very few that will be sent out for peer review, the editors promote their vision of what constitutes “groundbreaking” and “policy relevant” science. What if it is not the right vision?
By granting the editors of Nature and other high impact journals ever increasing power in deciding which of our articles should be singled out for emphasis in the news media, we risk losing control of the peer review process upon which our public image depends. The way to maintain control is to make a point of sending our most newsworthy scientific articles and opinion pieces to the journals of our own professional societies, in which the peer review process is editor-facilitated, rather than editor-directed. Dot.Earth could render our community a valuable service by ensuring that newsworthy articles published in our journals receive the public attention that they deserve.
Kerry Emanuel, longtime climate scientist at the Massachusetts Institute of Technology (focused on the impact of greenhouse-driven heating on hurricanes):
I read the piece differently from the way Mike and Martin read it. It was published as a “perspective” and I did not read it as a scientific paper or letter. It tries to draw attention to the point that weather extremes a) affect society more than changes in the mean do, and b) require a different statistical approach to detect trends. This is certainly old hat to climate scientists, but there is so much literature on the mean temperature response that I believe there is room to draw attention to the problem of extremes. Thus I think the perspective piece is useful. The one criticism I would level, echoing to some extent what Martin and Mike have said, is that it is a bit heavy on weather anecdotes (this record broken here; that record there), which draws attention away from the central issue of the statistics of extremes.
It is vital that as a community we focus more attention on detecting changes in the tails of the distributions of weather events. To the extent that this perspective piece may draw scientists from other disciplines into this endeavor, it will have proven useful.
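Emanuel’s point about the tails can be made concrete with a toy Gaussian model: a modest shift in the mean produces a disproportionate change in the frequency of extremes. The 1-sigma shift and 2-sigma threshold below are illustrative assumptions, not values fitted to any dataset.

```python
import math

def exceedance(threshold, mean=0.0, sd=1.0):
    """P(X > threshold) for a normal distribution, via the complementary error function."""
    return 0.5 * math.erfc((threshold - mean) / (sd * math.sqrt(2)))

threshold = 2.0                              # a "2-sigma" heat extreme
p_before = exceedance(threshold)             # stationary climate
p_after = exceedance(threshold, mean=1.0)    # mean shifted warmer by 1 sigma
print(round(p_before, 4), round(p_after, 4), round(p_after / p_before, 1))
```

In this sketch, a one-standard-deviation shift in the mean makes a once-rare 2-sigma extreme roughly seven times more frequent, which is why change shows up in the tails long before it is obvious in the averages.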
One last point: I completely agree with Mike that you could do science a service by getting journalists to pay more attention to our own professional journals and not focus so exclusively on the high-profile journals, which often tend toward the sensational at the expense of solid advances.
Michael Tobis, a scientist, programmer and climate blogger from the University of Texas, posted a nice essay on the Coumou-Rahmstorf article and related issues. The piece, “Disequilibrium is Not Your Friend,” examines the consequences of disturbing a system in a state of complex equilibrium, whether it is an intricate Alexander Calder mobile sculpture or the climate. Here’s an excerpt:
It’s a general principle of complex equilibria that the more they are disturbed, the more complex the processes involved in restoring their equilibrium. The mobile sculpture is not unusual in this regard….
What makes the sculpture less predictable under forcing? Both the size and duration of the impact matter. If you moved the piece ten yards very gently, its behavior might be nothing out of the ordinary, while if you moved it an inch suddenly, a lot of complexity would emerge. (If you moved the piece ten yards suddenly, you would expect permanent alterations, with a whole new set of modes created and many of the old ones destroyed. Let’s hope we do not take the analogous experiment that far.)
While this in no way constitutes a mathematical proof for any given system, the underlying behavior is common and intuitively understandable. If a complex system acts otherwise, it would be something extraordinary that deserves explanation. As applied to the climate system, consider it a plausibility argument: the more rapidly and extensively the system is disturbed, the more we would expect that unexpected behaviors will emerge, and the further from expectations they will be. [Please read the rest.]
April 11, 9:47 a.m. | Updated Stefan Rahmstorf offers his response here:
There is a broad spectrum of views on extreme events in the community – you’ve sampled some of those. It is precisely this range of opinions that made us think it worthwhile to take a good dispassionate look at the evidence and stimulate some discussion. We noticed this range also in the reviews of our Perspective. One reviewer asked us to make stronger statements on the link between climate change and extremes, another asked just the opposite, and the third found we got it about right. I think overall we struck a good balance, and I’ve never received such overwhelmingly positive feedback from colleagues after publishing a paper – lots of emails still coming in. Looks like we struck a chord.
Hoerling’s claim that we make “many unsubstantiated assertions” is itself one. First he claims we said that the last decade experienced an unprecedented number of extreme weather events – which we do not say anywhere in our paper. And then he claims that “the increase in heat waves was largely balanced by a decrease in cold waves,” which is a popular climate sceptics’ argument but demonstrably false. Already the IPCC TAR in 2001 illustrated that this is not the case; see the famous TAR graph and compare the size of the pink/red and blue areas in panels (a) or (c). We explained this again in our 2011 PNAS paper, and we demonstrate it again in the present Perspective: in a stationary climate you’d get approximately the same number of hot and cold records. We cite the global data analysis of Benestad (2004) in Fig. 2, which shows that record heat waves have already increased more than threefold as compared to a stationary climate. Now even if record cold waves had declined to zero in number (which they have not), it is obvious that this could not balance a more than threefold increase in heat waves.
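The record-statistics argument can be sketched in a few lines. In a stationary climate the expected number of record highs in n independent years is the harmonic number 1 + 1/2 + … + 1/n; adding a warming trend inflates that count. This is only a minimal illustration of the principle, not the Benestad (2004) analysis; the noise level and trend slope are invented.

```python
import math
import random

def count_records(series):
    """Count how many entries set a new running maximum."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

random.seed(42)
n_years, n_trials = 100, 2000
trend_per_year = 0.03   # assumed warming trend, in units of the noise std dev

stationary = trending = 0.0
for _ in range(n_trials):
    noise = [random.gauss(0, 1) for _ in range(n_years)]
    stationary += count_records(noise)
    trending += count_records([x + trend_per_year * i for i, x in enumerate(noise)])
stationary /= n_trials
trending /= n_trials

# In a stationary climate the expected record count is the harmonic number H_n
expected = sum(1.0 / k for k in range(1, n_years + 1))
print(round(stationary, 2), round(expected, 2), round(trending, 2))
```

The simulated stationary count matches the harmonic-number prediction, while even a gentle upward drift produces noticeably more records, so an excess of hot over cold records is exactly what a warming climate should show.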
Interestingly, Hoerling immediately raises the climate policy issue (stating that mitigation efforts would not prevent extremes) and even denounces our Perspective as “policy-directed”, even though we do not even mention policy. It is simply not the topic of our article; we exclusively discuss scientific questions, and we point out at the outset that societal impacts and possible policy strategies are discussed in the SREX.
We cite James Hansen’s 1988 statement on global warming at the end. Back then he got a lot of criticism for it, but in hindsight it turned out he was right. We hope that in hindsight we will find out that we were wrong, and global warming is not leading to more unprecedented extremes. But the evidence is pointing the other way, I’m afraid.
The pull of the “front-page thought” and the eagerness of climate campaigners to jog the public have sometimes created a tendency to tie mounting losses from weather-related disasters to human-driven global warming.
But finding a statistically robust link between such disasters and the building human climate influence remains a daunting task. A new analysis of nearly two dozen papers assessing trends in disaster losses in light of climate change finds no convincing link. The author concludes that, so far, the rise in disaster losses is mainly a function of more investments getting in harm’s way as communities in places vulnerable to natural hazards grow.
The paper — “Have disaster losses increased due to anthropogenic climate change?” — is in press in the Bulletin of the American Meteorological Society. It was written by Laurens M. Bouwer, a researcher at Vrije University in Amsterdam focused on climate and water resources (and a lead author of a chapter in the 2001 assessment from the Intergovernmental Panel on Climate Change). You can read more about the paper at the blog of Roger Pielke, Jr., which drew my attention to this work.
Here’s the summary and a link to the full paper:
The increasing impact of natural disasters over recent decades has been well documented, especially the direct economic losses and losses that were insured. Claims are made by some that climate change has caused more losses, but others assert that increasing exposure due to population and economic growth has been a much more important driver. Ambiguity exists today, as the causal link between climate change and disaster losses has not been addressed in a systematic manner by major scientific assessments. Here I present a review and analysis of recent quantitative studies on past increases in weather disaster losses and the role of anthropogenic climate change. Analyses show that although economic losses from weather related hazards have increased, anthropogenic climate change so far did not have a significant impact on losses from natural disasters. The observed loss increase is caused primarily by increasing exposure and value of capital at risk. This finding is of direct importance for studies on impacts from extreme weather and for disaster policy. (Read the rest.)
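Bouwer’s core argument is that nominal losses must be normalized for exposure before trends can be compared. A toy normalization, dividing nominal losses by an index of wealth in harm’s way, shows how an apparent upward loss trend can disappear; all of the numbers below are invented for illustration.

```python
# Hypothetical decade data: nominal disaster losses rise, but so does exposure.
decades  = [1970, 1980, 1990, 2000]
losses   = [10.0, 18.0, 33.0, 60.0]   # nominal losses, $bn (invented)
exposure = [1.0, 1.8, 3.3, 6.0]       # wealth in harm's way, indexed (invented)

# Divide out the exposure growth to isolate any climate-driven residual trend.
normalized = [l / e for l, e in zip(losses, exposure)]
print(normalized)
```

Here the sixfold rise in nominal losses is entirely explained by the sixfold rise in exposure: the normalized series is flat, which is the pattern Bouwer reports across the studies he reviews.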
None of this negates the importance of moving to limit emissions of long-lived greenhouse gases; the analysis just reinforces the reality that while that effort proceeds, there’s plenty of other work to do, as well, if humanity desires a relatively smooth journey in this century (as was recently stressed by Robert Verchick here).
Through decades of work, James E. Hansen of NASA has earned his plaudits as a climate scientist. But his intensifying personal push for aggressive cuts in emissions of greenhouse gases has come with a framing of climate science that is being criticized by some respected researchers for stepping beyond what peer-reviewed studies have concluded.
“Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.”
He doesn’t define “several decades,” but a reasonable assumption is that he refers to a period from today through mid-century. I am unaware of any projection for “semi-permanent” drought in this time frame over the expansive region of the Central Great Plains. He implies the drought will be due to a lack of rain (except for brief, ineffective downpours). I am unaware of indications, from model projections, for a material decline in mean rainfall. Indeed, that region has seen a general increase in rainfall over the long term during most seasons (certainly no material decline). Also, for the warm season, when evaporative loss is especially effective, the climate of the central Great Plains has not become materially warmer (perhaps it has even cooled) since 1900. In other words, climate conditions in the growing season of the Central Great Plains are today not materially different from those existing 100 years ago. This observational fact belies the expectations from climate simulations and, in truth, our science lacks a good explanation for this discrepancy.
The Hansen piece is policy more than it is science, to be sure, and one can read it for the former. But facts should, and do, matter to some. The vision of a Midwest Dust Bowl is a scary one, and the author appears intent on instilling fear rather than reason.
The article makes these additional assertions:
“The global warming signal is now louder than the noise of random weather…”
This is patently false. Take temperature over the U.S. as an example. The variability of daily temperature over the U.S. is much larger than the anthropogenic warming signal at the time scales of local weather. Depending on season and location, the disparity is at least a factor of 5 to 10.
I think that a more scientifically justifiable statement, at least for the U.S. and extratropical land areas, is that daily weather noise continues to drown out the siren call of climate change on local, weather scales.
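Hoerling’s factor-of-5-to-10 point is a signal-to-noise statement: a warming signal of order 1 °C sits inside daily variability whose standard deviation is several °C, so single days reveal little, while averaging N days shrinks the noise roughly like 1/√N. The 1 °C signal and 5 °C daily standard deviation below are illustrative assumptions, not measured values.

```python
import math

signal = 1.0     # assumed warming signal, deg C
daily_sd = 5.0   # assumed std dev of daily temperature, deg C

# On any single day the noise dwarfs the signal:
print(daily_sd / signal)   # disparity factor, 5 under these assumptions

# Averaging N independent days shrinks the noise like 1/sqrt(N);
# find the smallest N where the standard error falls below the signal.
n = 1
while daily_sd / math.sqrt(n) >= signal:
    n += 1
print(n)
```

Under these assumptions it takes averaging over roughly a month of independent days before the warming signal even matches the residual noise, which is why attribution operates on climate statistics rather than individual weather events.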
Hansen goes on to assert that:
“Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.”
Published scientific studies on the Russian heat wave indicate this claim to be false. Our own study on the Texas heat wave and drought, submitted this week to the Journal of Climate, likewise shows that that event was not caused by human-induced climate change. These are not de novo events, but upon scientific scrutiny, one finds both the Russian and Texas extreme events to be part of the physics of what has driven variability in those regions over the past century. This is not to say that climate change didn’t contribute to those cases, but their intensity owes to natural, not human, causes.
The closing comment by Hansen is then all the more ironic, though not surprising given that he often writes from passion rather than reason:
“The science of the situation is clear — it’s time for the politics to follow. ”
“Those who continue to talk in certain terms of how local weather extremes are the result of human climate change are failing to heed all the available evidence.”
Kerry Emanuel:
I see overstatements on all sides. Extreme weather begets extreme views. On the Russian heat wave, Marty is citing a single paper that claims it had nothing to do with climate change, but there are other papers that purport to demonstrate that events of that magnitude are now three times more likely than before the industrial era.
This is a collision between the fledgling application of the science of extremes and the inexperience we all have in conveying what we do know about this to the public. A complicating factor is the human psychological need to ascribe every unusual event to a cause. Our Puritan forebears ascribed them to sin, while in the ’80s it was fashionable to blame unusual weather on El Niño. Global warming is the latest whipping boy. But even conveying our level of ignorance is hard: Marty’s quotation of Harold Brooks makes it sound as though he is saying that the recent uptick in severe weather had nothing to do with climate change. The truth is that we do not know whether it did or did not; absence of evidence is not evidence of absence.
Regular readers of my work will not be surprised that I align with Emanuel.
At roughly the same time, Hoerling sent an amplification on his arguments and Miller sent a critique of Hoerling’s initial post. You can read both below. Keep in mind that neither writer has seen the other’s piece. (I asked Hansen for his thoughts on the complaints of Hoerling and Kerry Emanuel, another climate scientist who weighed in on Dot Earth. His response is at the end of this post.)
Here’s Hoerling’s expanded critique of Hansen [if you’re having trouble reading it, click here for a downloadable version]:
I have several papers well along in the publication process that make clear your characterizations are far off the mark. The editors prefer, indeed are insistent, that I not discuss these in blogs. Some scientists may be able to spend their time blogging and e-mailing without a significant impact on their scientific productivity — I’m not one of them — but I do make an effort to make my papers understandable to a wide audience.
Brazilians who wish to contribute to the discussions on sustainable development, the theme of the Rio+20 conference that the United Nations (UN) will hold in Rio de Janeiro in June, can send texts, photos or videos to the site http://www.ofuturoquenosqueremos.org.br.
The initiative, presented yesterday (the 14th), is part of a global conversation campaign launched worldwide by the UN, with versions in Arabic, Chinese, Spanish, English, French and Russian, the official languages of the United Nations.
According to Giancarlo Summa, director of the United Nations Information Centre for Brazil (Unic Rio) and deputy spokesman for Rio+20, the site was created to mobilize Brazilians to express their views on what the future would be like in a more sustainable world, presenting both problems and suggestions.
“The discussion on sustainable development will only be a success if public opinion in each country gets involved and puts pressure on governments and companies, with contributions spanning the economic, environmental and social tripod. Our proposal here in Brazil is to involve civil society in this discussion, so that it can say what we want twenty years from now,” he explained.
Summa stressed that part of the posted content will be shown on LED screens at Riocentro, where heads of state and government will gather during the conference. “We are also thinking of other, more innovative ways of getting the proposals from this global conversation directly to the Brazilian government and other governments, with a lot of internet and little paper,” he said.
He explained that the site is already online and will accept contributions until the end of the year. “Brazil is a highly connected country, where the internet is part of the lives of millions of people. Using the network, we think we will influence the conversations on sustainable development,” he noted.
To encourage the population to contribute, a multimedia campaign aimed exclusively at the Brazilian public, entitled ‘Eu Sou Nós’ (‘I Am Us’), was produced. Featuring testimonials from famous people and ordinary Brazilians, the spots will run on television, radio, newspapers, magazines and the internet. In addition, a series of advertisements explaining how to join the mobilization will be displayed in public places.
Another initiative, also launched yesterday (the 14th) by the UN, is the Agenda Total (AT), an online conversation platform that will bring together all of the Rio+20 agendas, including the official UN events and the parallel ones promoted by the city and state governments, as well as the programs of the People’s Summit and civil society.
According to Silvana de Matos, coordinator of the AT, the tool will be the UN’s main channel of interaction with Brazilian society during the conference. “There are thousands of agendas and we needed to integrate them. At the same time, this tool will be the documentation center for the entire event. People linked to the institutions [taking part in Rio+20] will receive a login and password and will be able to publish the dates and times of their events, as well as make videos and high-resolution images available,” she explained.
Silvana added that the project will help journalists organize their coverage of the events and will also help the general public, who will learn what is happening in the city during Rio+20. “The general public will see what has been published, the events that will take place, the venues and how to get there. They will also be able to watch talks and even ask questions via chat,” she emphasized.
The online debate ‘Rio+20, o Futuro que Queremos’ (‘Rio+20, the Future We Want’), launched by the UN, will serve to promote the event in Brazil and make it better known. In Rio de Janeiro, for example, even as the city prepares to host the conference, many cariocas on the streets still do not know what will be discussed there.
Tatiana Cerqueira, a 17-year-old student, knows only that she will not have classes during the event. “I know absolutely nothing about it. I only know there won’t be classes, because the teachers have mentioned it, but what the event is, I don’t know,” she said. Marciele de Souza, a 49-year-old accountant, also said he had no idea what it is about. “I know nothing about Rio+20. I’ve heard of it, but I don’t know what it is or when it will happen,” he said.
Cirlane de Jesus Santos, a 32-year-old office assistant, said she has “a little knowledge of the subject,” but does not know how to get involved or take part. “I know it’s a project that happened twenty years ago and will happen again this year, and that a lot of people come from many places. But I don’t know how to participate or what they are going to discuss,” she said.
Rio+20 takes place from June 20 to 22 in Rio de Janeiro and is expected to bring together thousands of people, including politicians, members of non-governmental organizations (NGOs), representatives of civil society and business leaders, in addition to heads of state and government. According to the UN, 183 of its 193 member countries have already confirmed their attendance.
Between 1933 and 2010, total annual rainfall in the metropolitan region increased by 425 mm, according to data from USP
The land of drizzle has become the megalopolis of storms. In about 80 years, the amount of rain that falls each year on the São Paulo Metropolitan Region, where one in every ten Brazilians lives in an area equivalent to almost 1% of the national territory, has increased by 425 millimeters (mm), half of what falls annually in much of Brazil’s semi-arid region. It jumped from an annual average of almost 1,200 mm in the 1930s to around 1,600 mm in the 2000s. Assuming a linear rise, it is as if every year it had rained 5.5 mm more than in the previous 12 months. Rainfall has not only intensified; its pattern of occurrence has also changed. It is not simply raining a little more every day, an effect that would be barely noticeable in practice and incapable of causing the region’s constant flooding. The number of days with heavy or moderate rain has grown, even producing storms in winter, normally the dry season. By contrast, the number of days with light rain, below 5 mm, has fallen.
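The “linear sum” mentioned above is simple arithmetic: a 425 mm rise spread over the 1933-2010 record works out to roughly 5.5 mm of additional rain per year. A quick check, using only the figures quoted in the text:

```python
rise_mm = 425        # total increase in annual rainfall, from the USP data
years = 2010 - 1933  # span of the record: 77 years
per_year = rise_mm / years
print(round(per_year, 1))   # additional rain per year, in mm
```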
A pendular regime of extremes has come to dominate the water cycle in the metropolitan region: when it rains, it generally rains a lot; but between the very wet days there can be long dry spells. Greater São Paulo seems headed for the worst of both worlds, alternating intense periods of too much and too little rain over the course of the year. “Urbanization and the so-called heat-island effect, along with air pollution, seem to play an important role in the change in São Paulo’s rainfall pattern, especially in the seasons that are normally already wetter, such as spring and summer,” says Maria Assunção da Silva Dias of the Instituto de Astronomia, Geofísica e Ciências Atmosféricas of the Universidade de São Paulo (IAG-USP), author of an as-yet-unpublished study on the subject. “In the drier months, the influence of global climate change accounts for 85% of the dynamics involved in the increase in extreme rainfall.” Though less clearly, the same upward trend in the number of days with intense rain was detected in the Rio de Janeiro metropolitan region.
São Paulo’s new rainfall pattern is not like a passing cold front. It is here to stay, according to modeling by the Centro de Ciência do Sistema Terrestre of the Instituto Nacional de Pesquisas Espaciais (CCST-Inpe). The projections suggest that the current situation is a kind of prologue to the future plot. They indicate that by the end of this century there should be an increase in the number of days with rainfall above 10, 20, 30 and 50 mm, that is, in practically every significant rainfall range. There will be only a decrease in the number of days with very light rain, and possibly an increase in the number of dry days. “The seasonality of the rains should also change,” says José Marengo, head of the CCST and coordinator of an as-yet-unpublished study on rainfall projections for the metropolitan region. “The number of storms outside the normally wetter season should grow, the kind of situation that catches the population by surprise.” The simulations take into account only the possible effects on the metropolitan region’s rainfall regime of so-called global climate change, above all the rise in greenhouse-gas concentrations, which warm the air. The weight that urbanization and air pollution may have on Greater São Paulo’s rainfall is not considered in the projections.
Scarce green in the metropolis of concrete and asphalt: if 25% of Greater São Paulo’s territory were covered by trees, the average temperature would drop by up to 2.5°C
One of the great difficulties in carrying out large studies capable of revealing past climate fluctuations and serving as a benchmark for future projections is the absence of long, reliable historical series with daily rainfall records. Without them, it is impossible to perform a robust statistical analysis and get a clear picture of how much it rained and how rainfall was distributed over the years and the seasons (spring, summer, autumn and winter). Specialists are unanimous in pointing out this deficiency in Brazil. The highest-quality rainfall series for any point in the national territory is the one provided by the IAG meteorological station in the Parque do Estado, in the Água Funda neighborhood in the southern zone of the city of São Paulo. Records began in 1933, when the station was inaugurated, and continue to this day.
Another factor gives the data from the IAG meteorological station a unique character. The records were obtained inside a large green area of the city of São Paulo whose profile has not changed radically over almost eight decades, a rarity in a megalopolis with few parks and gardens. In other words, although the city underwent intense urbanization and soil sealing over the last century, natural conditions around the Parque do Estado station did not change radically. It therefore makes sense to compare present data with past data, since the local environment is more or less the same. “In the northern zone of São Paulo, at the Mirante de Santana, there is a meteorological station with measurements since the 1950s,” says Pedro Leite da Silva Dias, a researcher at IAG-USP and director of the Laboratório Nacional de Computação Científica (LNCC) in Rio de Janeiro, also an author of the study on the evolution of rainfall in the metropolitan region. “But a few decades ago there was only forest there, and today there is a building next to the station.”
Devido à riqueza de dados fornecidos pela estação do IAG no Parque do Estado, Assunção e seus colaboradores puderam enxergar detalhes e tendências mais sutis no regime das chuvas ao longo das últimas oito décadas. Entre 1935 e 1944 choveu, em média, mais do que 40 mm em cerca de 30 dias, com grande concentração de pluviosidade nos meses de verão e, em menor escala, na primavera e no outono. Durante o período não houve registros de episódios de pluviosidade dessa intensidade nos meses de inverno. A situação começou a mudar a partir de meados dos anos 1940. Desde então, em todas as décadas ocorreu, em média, ao menos uma chuva desse porte no inverno. Entre 2000 e 2009, o número total de jornadas com tempestades acima de 40 mm esteve na casa de 70 eventos. Uma tendência similar se repete quando se analisa década a década a ocorrência de chuvas diárias acima de 60 e de 80 mm.
Broadly speaking, two main factors may be related to the change in the rainfall regime in the metropolitan region: global climate change, a large-scale phenomenon, and the heat-island effect, which is localized and typical of megacities. The two act in concert. One amplifies the effects of the other, and in general it is hard to draw a dividing line between them. According to Marengo, most climate models indicate that rainfall will increase from the La Plata basin to Southeastern Brazil in the coming decades. Within that broader frame arises the specific question of the climate in large cities, in particular the heat-island effect, which, by making heavily urbanized areas hotter, also acts as a magnet for rain.
A moister sea breeze
The surface temperature of the Atlantic Ocean along the São Paulo coast rose by about one degree between 1950 and 2010, from 21.5°C to 22.5°C. It may not sound like much, but one consequence of this warming is a higher rate of evaporation from the ocean, fuel that loads the sea breeze with even more moisture. This process has repercussions for the climate above the Serra do Mar, on the plateau where the metropolitan region sits.
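The link between a warmer sea surface and a moister breeze follows the Clausius-Clapeyron relation: the atmosphere's saturation vapor pressure rises roughly 6-7% per degree of warming. A quick sketch using the Magnus approximation (a standard empirical formula, not taken from the article):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# The article's SST change: 21.5 °C in 1950 vs 22.5 °C in 2010.
e_before = saturation_vapor_pressure_hpa(21.5)
e_after = saturation_vapor_pressure_hpa(22.5)
increase_pct = 100 * (e_after - e_before) / e_before
print(round(increase_pct, 1))  # ~6% more moisture-holding capacity per 1 °C
```

This is only the thermodynamic ceiling on moisture, not a prediction of rainfall itself, but it illustrates why a one-degree warmer ocean can feed noticeably wetter air onto the plateau.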
Why does much of the rain in Greater São Paulo fall between mid- and late afternoon, after 3 or 4 p.m.? That is when the warm, moist sea breeze coming up from the Baixada Santista finishes climbing the mountains and reaches the megalopolis. “The southeastern zone is usually the first part of the capital to feel the effects of the breeze,” comments Maria Assunção. The internal structure of cities, with many tall buildings, alters wind direction and can even force the sea breeze to rise at certain points in the metropolitan region, locally favoring the formation of rain clouds. Urban pollution, especially aerosols, can either favor or inhibit storms over cities, depending on its concentration.
Studies carried out in the United States in the 1990s suggest that part of the increase in rainfall in some metropolitan regions, such as Saint Louis, is due to their growing urbanization. In that area of the state of Missouri, home to about 2.9 million people, rainfall increased by 5% to 25% over recent decades. A study published last year, conducted in large Indian cities, concluded that changes in the rainfall regime of those urban concentrations derive more from natural climate fluctuations than from local phenomena.
Mitigation strategies
In the case of the São Paulo Metropolitan Region, the USP study found a strong correlation between its urbanization process and the changes in the rainfall regime. Episodes of extreme rainfall, above 40 mm, have intensified as the population of São Paulo and its neighboring cities has grown and the territories of those municipalities have merged into practically a single, continuous built-up sprawl, with little greenery, a lot of asphalt, and plenty of sources of pollution and heat. From 1940 to 2010, the population of the metropolitan region grew tenfold, from 2 million to 20 million inhabitants. The urban footprint grew twelvefold between 1930 and 2002, from 200 to 2,400 square kilometers. São Paulo's mean annual temperature rose 3°C between 1933 and 2009, according to the records of the IAG station in the Parque do Estado, and total rainfall increased by a third. “We used to study this process theoretically,” says Pedro Leite da Silva Dias. “Now we have more data, including from digital sources.”
Mitigating the heat-island effect may be one way to reduce episodes of extreme rainfall in urban centers. Physicist Edmilson Dias de Freitas, of IAG-USP, has been testing some measures in computer simulations to gauge their impact on the climate of the São Paulo Metropolitan Region. Painting the surfaces of houses and buildings white would not be effective. “Pollution and weather quickly darken white paint in São Paulo,” says Freitas. “There is no way to maintain it.” The most effective measure would be to increase the city's vegetation cover. According to the simulations, if 25% of the metropolitan region were covered by trees, the mean temperature could be lowered by 1.5°C to 2.5°C. A milder climate would reduce the heat-island effect and perhaps attract less rain to the region. Today green areas account for less than 10% of Greater São Paulo.
By extension, if the largest Brazilian metropolis had more parks and fewer sealed surfaces, the most perverse effect of storms would also be minimized: intense rains would produce fewer floods and waterlogged streets. Exposed soil absorbs more of the water that falls on it. “São Paulo violates a basic principle of drainage: rainwater should infiltrate the soil where it falls,” says civil engineer Denise Duarte, a professor at USP's School of Architecture and Urbanism who collaborates with colleagues at the IAG. “Here, with much of the city sealed over, the water simply runs off.” The rain that falls in one place is transferred to another, usually to the low-lying points of the urban sprawl.
In the wettest zones, generally marked by mountain ridges, annual rainfall can reach 2,400 mm, an amount comparable to the Amazon rainforest. That is the case in the portion of Greater São Paulo crossed by the Serra do Mar, which takes in the southern stretch of the state capital and parts of cities such as São Bernardo do Campo and Rio Grande da Serra, as well as parts of Santana do Parnaíba and Cajamar, in the west of the metropolitan region. In the less humid areas, such as much of Mogi das Cruzes, annual rainfall can be around 1,300 mm. Between these two extremes lie several intermediate levels of rainfall. The current figure of roughly 1,600 mm of rain per year recorded at the IAG station serves as a generic reference for the rainfall regime of the metropolitan region. In an area that today spans 8,000 square kilometers and encompasses the territories of 39 municipalities, the amount of rain actually measured year by year at each meteorological station can vary considerably. A CCST study maps the geographic distribution of rainfall across Greater São Paulo from historical series of daily rainfall totals provided by 94 meteorological stations of the São Paulo State Department of Water and Energy (DAEE) and the National Water Agency (ANA). Data from a 25-year period, 1973 to 1997, were used in the study.
“This difference in rainfall levels holds throughout the year and in every season,” says Guillermo Obregón, of the CCST, lead author of the study on the geographic distribution of rain in the metropolitan region. “In the wettest locations, orographic, or relief-driven, rainfall predominates.” In this mechanism, masses of warm, moist air rise when they hit topographic barriers, condense, and generate frequent precipitation. Whether because of its buildings and asphalt or because of its mountainous areas, Greater São Paulo seems to stand in the path of the rain.
Scientific articles
1. SILVA DIAS, M.A.F. et al. Changes in extreme daily rainfall for São Paulo, Brazil. Climatic Change. In press. 2012.
2. MARENGO, J. A. et al. The climate in future: projections of changes in rainfall extremes for the Metropolitan Area of São Paulo (Masp). Climate Research. In press. 2012.
Rainfall in the Rio de Janeiro Metropolitan Region, the country's second largest with 12.5 million inhabitants, appears to show trends similar to São Paulo's. Although the Rio state capital lacks a rainfall series as long and reliable as the IAG-USP's, two stations of the National Institute of Meteorology (Inmet) in Rio de Janeiro provide data of reasonable quality covering at least four decades of rain.
According to records obtained between 1967 and 2007 by the station at Alto da Boa Vista, the amount of water dumped on that neighborhood in the city's northern zone on days of heavy storms rose, on average, by 11.7 mm per year. The station sits in the Tijuca National Park, one of the largest urban forests on the planet. “There was a trend of increasing total rainfall in the metropolitan region, and the forested areas, such as Alto da Boa Vista, became wetter,” says meteorologist Claudine Dereczynski, of the Federal University of Rio de Janeiro (UFRJ), lead author of the as-yet-unpublished study.
The other Inmet station is in Santa Cruz, a neighborhood with fewer green areas in the western zone. There, the signs of intensifying rainfall were modest, according to data collected between 1964 and 2009, and were not considered statistically significant. “In Rio, the climate data of recent decades signal a local temperature increase more clearly than they do an increase in rainfall,” says Claudine. Simulations by researchers at Inpe and UFRJ project, for the coming decades, an increase in the intensity and frequency of both heavy-rain days and dry spells. Rainfall tends to become more unevenly distributed across the year and strongly concentrated in a few days.
The Projects
1. Narrowing the Uncertainties on Aerosol and Climate Changes in São Paulo State – Nuance-SPS – no. 08/58104-8
2. Assessment of impacts and vulnerability to climate change in Brazil and strategies for adaptation option – no. 08/58161-1
Funding line
1 and 2. FAPESP Research Program on Global Climate Change – Thematic Project
Coordinators
1. Maria de Fátima Andrade – IAG-USP
2. José Marengo – Inpe
GLOBAL warming isn’t a prediction. It is happening. That is why I was so troubled to read a recent interview with President Obama in Rolling Stone in which he said that Canada would exploit the oil in its vast tar sands reserves “regardless of what we do.”
If Canada proceeds, and we do nothing, it will be game over for the climate.
Canada’s tar sands, deposits of sand saturated with bitumen, contain twice the amount of carbon dioxide emitted by global oil use in our entire history. If we were to fully exploit this new oil source, and continue to burn our conventional oil, gas and coal supplies, concentrations of carbon dioxide in the atmosphere eventually would reach levels higher than in the Pliocene epoch, more than 2.5 million years ago, when sea level was at least 50 feet higher than it is now. That level of heat-trapping gases would assure that the disintegration of the ice sheets would accelerate out of control. Sea levels would rise and destroy coastal cities. Global temperatures would become intolerable. Twenty to 50 percent of the planet’s species would be driven to extinction. Civilization would be at risk.
That is the long-term outlook. But near-term, things will be bad enough. Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.
If this sounds apocalyptic, it is. This is why we need to reduce emissions dramatically. President Obama has the power not only to deny tar sands oil additional access to Gulf Coast refining, which Canada desires in part for export markets, but also to encourage economic incentives to leave tar sands and other dirty fuels in the ground.
The global warming signal is now louder than the noise of random weather, as I predicted would happen by now in the journal Science in 1981. Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.
We have known since the 1800s that carbon dioxide traps heat in the atmosphere. The right amount keeps the climate conducive to human life. But add too much, as we are doing now, and temperatures will inevitably rise too high. This is not the result of natural variability, as some argue. The earth is currently in the part of its long-term orbit cycle where temperatures would normally be cooling. But they are rising — and it’s because we are forcing them higher with fossil fuel emissions.
The concentration of carbon dioxide in the atmosphere has risen from 280 parts per million to 393 p.p.m. over the last 150 years. The tar sands contain enough carbon — 240 gigatons — to add 120 p.p.m. Tar shale, a close cousin of tar sands found mainly in the United States, contains at least an additional 300 gigatons of carbon. If we turn to these dirtiest of fuels, instead of finding ways to phase out our addiction to fossil fuels, there is no hope of keeping carbon concentrations below 500 p.p.m. — a level that would, as earth’s history shows, leave our children a climate system that is out of their control.
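As a rough sanity check on the figures above, a sketch using the commonly cited conversion factor of about 2.12 gigatonnes of carbon per ppm of atmospheric CO2 (the factor is a standard value, not one stated in the op-ed, whose "240 Gt → 120 p.p.m." implies a similar ratio of roughly 2 GtC/ppm):

```python
# Back-of-envelope check: gigatonnes of carbon -> ppm of atmospheric CO2,
# assuming all of it stayed airborne (an upper bound; in reality oceans and
# vegetation absorb a substantial fraction).
GTC_PER_PPM = 2.12  # commonly used conversion factor, GtC per ppm CO2

def ppm_from_gtc(gigatonnes_carbon):
    return gigatonnes_carbon / GTC_PER_PPM

tar_sands_ppm = ppm_from_gtc(240)   # op-ed's tar sands carbon estimate
tar_shale_ppm = ppm_from_gtc(300)   # op-ed's tar shale carbon estimate
print(round(tar_sands_ppm), round(tar_shale_ppm))  # 113 142
```

The ~113 ppm obtained for the tar sands is close to the op-ed's 120 ppm; the difference comes down to which conversion factor is assumed.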
We need to start reducing emissions significantly, not create new ways to increase them. We should impose a gradually rising carbon fee, collected from fossil fuel companies, then distribute 100 percent of the collections to all Americans on a per-capita basis every month. The government would not get a penny. This market-based approach would stimulate innovation, jobs and economic growth, avoid enlarging government or having it pick winners or losers. Most Americans, except the heaviest energy users, would get more back than they paid in increased prices. Not only that, the reduction in oil use resulting from the carbon price would be nearly six times as great as the oil supply from the proposed pipeline from Canada, rendering the pipeline superfluous, according to economic models driven by a slowly rising carbon price.
But instead of placing a rising fee on carbon emissions to make fossil fuels pay their true costs, leveling the energy playing field, the world’s governments are forcing the public to subsidize fossil fuels with hundreds of billions of dollars per year. This encourages a frantic stampede to extract every fossil fuel through mountaintop removal, longwall mining, hydraulic fracturing, tar sands and tar shale extraction, and deep ocean and Arctic drilling.
President Obama speaks of a “planet in peril,” but he does not provide the leadership needed to change the world’s course. Our leaders must speak candidly to the public — which yearns for open, honest discussion — explaining that our continued technological leadership and economic well-being demand a reasoned change of our energy course. History has shown that the American public can rise to the challenge, but leadership is essential.
The science of the situation is clear — it’s time for the politics to follow. This is a plan that can unify conservatives and liberals, environmentalists and business. Every major national science academy in the world has reported that global warming is real, caused mostly by humans, and requires urgent action. The cost of acting goes far higher the longer we wait — we can’t wait any longer to avoid the worst and be judged immoral by coming generations.
James Hansen directs the NASA Goddard Institute for Space Studies and is the author of “Storms of My Grandchildren.”
ScienceDaily (Feb. 22, 2011) — A new paper by George Mason University researchers shows that ‘Climategate’ — the unauthorized release in late 2009 of stolen e-mails between climate scientists in the U.S. and United Kingdom — undermined belief in global warming and possibly also trust in climate scientists among TV meteorologists in the United States, at least temporarily.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication and Center for Social Science Research asked these meteorologists early in 2010, when news stories about the climate e-mails were breaking, several questions about their awareness of the issue, attention to the story and impact of the story on their beliefs about climate change. A large majority (82 percent) of the respondents indicated they had heard of Climategate, and nearly all followed the story at least “a little.”
Among the respondents who indicated that they had followed the story, 42 percent said it made them somewhat or much more skeptical that global warming is occurring. These results stand in stark contrast to the findings of several independent investigations of the e-mails, conducted later, which concluded that no scientific misconduct had occurred and that nothing in the e-mails should cast doubt on the evidence that global warming is occurring.
The results, which were published in the journal Bulletin of the American Meteorological Society, also showed that the doubts were most pronounced among politically conservative weathercasters and those who either do not believe in global warming or are not yet sure. Neither age nor professional credentials was a factor, but men — independent of political ideology and belief in global warming — were more likely than their female counterparts to say that Climategate made them doubt that global warming was happening.
“Our study shows that TV weathercasters — like most people — are motivated consumers of information in that their beliefs influence what information they choose to see, how they evaluate information, and the conclusions they draw from it,” says Ed Maibach, one of the researchers. “Although subsequent investigations showed that the climate scientists had done nothing wrong, the allegation of wrongdoing undermined many weathercasters’ confidence in the conclusions of climate science, at least temporarily.”
The poll of weathercasters was conducted as part of a larger study funded by the National Science Foundation on American television meteorologists. Maibach and others are now working with a team of TV meteorologists to test what audience members learn when weathercasters make efforts to educate their viewers about the relationship between the changing global climate and local weather conditions.
Ultimately, the team hopes to answer key research questions about how to help television meteorologists nationwide become an effective source of informal science education about climate change.
“Most members of the public consider television weather reporters to be a trusted source of information about global warming — only scientists are viewed as more trustworthy,” says Maibach. “Our research here is based on the premise that weathercasters, if given the opportunity and resources, can become an important source of climate change education for a broad cross section of Americans.”
ScienceDaily (Mar. 29, 2010) — In a time when only a handful of TV news stations employ a dedicated science reporter, TV weathercasters may seem like the logical people to fill that role, and in many cases they do.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication shows that two-thirds of weathercasters are interested in reporting on climate change, and many say they are already filling a role as an informal science educator.
“Our surveys of the public have shown that many Americans are looking to their local TV weathercaster for information about global warming,” says Edward Maibach, director of the Center for Climate Change Communication. “The findings of this latest survey show that TV weathercasters play — or can play — an important role as informal climate change educators.”
According to the survey, climate change is already one of the most common science topics TV weathercasters discuss — most commonly at speaking events, but also at the beginning or end of their on-air segments, on blogs and web sites, on the radio and in newspaper columns.
Weathercasters also indicated that they are interested in personalizing the story for their local viewers — reporting on local stories such as potential flooding/drought, extreme heat events, air quality and crops. About one-quarter of respondents said they have already seen evidence of climate change in their local weather patterns.
“Only about 10 percent of TV stations have a dedicated specialist to cover these topics,” says University of Texas journalism professor Kristopher Wilson, a collaborator on the survey. “By default, and in many cases by choice, science stories become the domain of the only scientifically trained person in the newsroom — weathercasters.”
Many of the weathercasters said that having access to resources such as climate scientists to interview and high-quality graphics and animations to use on-air would increase their ability to educate the public about climate change.
However, despite their interest in reporting more on this issue, the majority of weathercasters (61 percent) feel there is a lot of disagreement among scientists about the issue of global warming. Though 54 percent indicated that global warming is happening, 25 percent indicated it isn’t, and 21 percent say they don’t know yet.
“A recent survey showed that more than 96 percent of leading climate scientists are convinced that global warming is real and that human activity is a significant cause of the warming,” says Maibach. “Climate scientists may need to make their case directly to America’s weathercasters, because these two groups appear to have a very different understanding about the scientific consensus on climate change.”
This survey is one part of a National Science Foundation-funded research project on meteorologists. Using this data, Maibach and his research team will next conduct a field test of 30-second, broadcast-quality educational segments that TV weathercasters can use in their daily broadcasts to educate viewers about the link between predicted (or current) extreme weather events in that media market and the changing global climate.
Ultimately, the team hopes to answer key research questions supporting efforts to activate TV meteorologists nationwide as an important source of informal science education about climate change.
A comment on the interview by Alexandre A. Costa, one of Brazil's most respected meteorologists:
Climate Change Denial and the Organized Right (May 10, 2012 – posted on Facebook)
You may have watched, or heard about, the recently aired interview on Jô Soares's talk show with Mr. Ricardo Felício who, despite being a professor in USP's Geography department, attacked the community of climate scientists, sketched out a series of conspiracy theories, and uttered absurdities with no scientific sense whatsoever, such as the claims that “there is no sea-level rise,” “the greenhouse effect does not exist,” “the ozone layer does not exist,” and “the Amazon forest would regenerate within 20 years of being cleared,” reaching his peak with a senseless explanation for the high temperature of Venus, based on a totally absurd reading of the gas laws.
So what would lead someone who is, in principle, part of the academic community to such an absurd stance? At first I took it for media climbing. Since the man's CV shows no remotely relevant scientific output, I assumed that bashing the mainstream was just a way to draw attention, attract publicity, gain fame, and so on. How naive of me.
Interviewer: “Do you know of any institution that supports your thinking? How does it work? And what does it do?” Ricardo Felício: “I recommend that you look up, here in Brazil, the MSIa – Movimento de Solidariedade Ibero-Americana.”
But who are the MSIa? A far-right group specializing in conspiracy theories and in attacks on Greenpeace (“a political instrument of the international oligarchies”), on the Landless Workers' Movement – MST (“an instrument of war against the Brazilian State”), on the Foro de São Paulo (“it brings together revolutionary groups that aim to destabilize the Armed Forces”), on the Pastoral da Terra, and so on. I went to the organization's website myself, and their latest effort is a campaign against the Truth Commission and in favor of the military (“Who benefits from a military crisis?”)! Anyone who wants to get acquainted with this group's positions need only check http://www.msia.org.br/
A little more searching and I found Ricardo Felício being quoted (“The UN has found a way to implement its global government, and the world will be run by pseudoscientific panels”) where? On the site http://www.midiasemmascara.org/ of the far-right writer Olavo de Carvalho…
It seems symptomatic that, on the eve of the deadline for vetoing the ruralist Forest Code, someone with this kind of connection (the MSIa is associated with the UDR) would come out saying that the Amazon can be cleared because it regenerates in twenty years… Interestingly, the accusation of an “environmentalist,” “communist,” or “international governance” agenda, or whatever delusion climate change deniers put forward when they try to politicize and ideologize the issue, only shows where that politicization and ideologization actually comes from, and of what stripe.
As I like to say, CO2 molecules have no ideology, and they absorb infrared radiation regardless of the existence not only of political positions but even of the humans who hold them. Increasing their concentration in Earth's atmosphere could have no effect other than warming the global climate system. Denying an obvious scientific truth therefore makes sense only to those whose interests are at stake. And it becomes clear: this gentleman, academically a fraud, is in fact a right-wing militant. To paraphrase those who so admire him, he needs to appear in the media not wearing the mask of “USP professor” or “climatologist,” but showing his true face.
Alexandre A. Costa, Ph.D.
Full Professor
Master's Program in Applied Physical Sciences
Universidade Estadual do Ceará
Climate Change Denial and the Organized Right – Part II: Further Revelations (May 13, 2012 – posted on Facebook)
It is not hard to keep connecting the dots after Mr. Ricardo Felício's appearance on Jô Soares's show. Why would anyone be willing to expose himself to ridicule in that way? How could anyone, in the position of a doctor of Geography, a USP professor, and a “climatologist,” assassinate not only recent scientific knowledge but basic laws of physics and fundamental knowledge of chemistry, ecology, and so on? What would lead someone to insult so crudely the Brazilian and international academic community, and especially us climate scientists?
What I intend to show is that to reach that point, one needs motivations. And these, my friends, are not mere vanity or a desire for stardom. It is an agenda.
For those who wish to keep tracing with me the motivation behind that interview, I ask you to visit, even if it gives them some traffic, the video repository of the homegrown pop star of climate change denial at http://www.youtube.com/user/TvFakeClimate. There, the links point to the familiar site http://www.msia.org.br/ of the “Movimento de Solidariedade Íbero-Americana,” whose pompous name conceals LaRouchist neo-fascism, specializing in conspiracy theories and manipulation and a visceral enemy, as its site makes clear, of the MST, the feminist movement, the human-rights movement, the Truth Commission, etc.; to the no less right-wing http://www.midiaamais.com.br/, whose articles I could not manage to read to the end, but which consist of right-wing attacks on Obama, ridicule of the movement of the Pinheirinho residents in São José dos Campos, opposition to the Supreme Court's decision declaring affirmative-action quotas constitutional, and, of course, climate change denial and attacks on the IPCC, etc.; to an anti-environmentalist site called http://ecotretas.blogspot.com/, which in turn carries neo-fascist links such as “vermelho não” (http://vermelhosnao.blogspot.com.br/search/label/verdismo), which incidentally is running the “Não Veta, Dilma” campaign, as well as links specializing in conspiracy theories such as http://paraummundolivre.blogspot.com.br/ and even exotic rightists, advocates of restoring the monarchy in Portugal (http://quartarepublica.wordpress.com/) or neo-Salazarists (http://nacionalismo-de-futuro.blogspot.com.br/).
As I have said on several occasions, it is not one's political-ideological choice that determines whether one is right or wrong about the climate question. I have colleagues in my research community who sympathize with the most varied political-ideological stripes (which by itself would make it hard for us to band together in a “conspiracy”… what was it again… ah! to “achieve UN world governance via climate panels,” the kind of hysteria typical of the most unhinged US right). The climate question is objective. The mechanisms that control the climate are known, including the role of greenhouse gases. The measurements, the model results (attacked dishonestly by the interviewee), and the paleoclimate records all converge. And among all the possible hypotheses for the warming of the climate system, the anthropogenic contribution via greenhouse-gas emissions was the only one left standing after every test. Recognizing this does not depend on ideology; one need only open one's eyes. The kind of public policy to be adopted to deal with the impacts, to adapt to the changes, and to mitigate them: there, yes, political choices acquire a degree of freedom.
The problem is that for a certain political-ideological fringe, in this case the far right, there really is an incompatibility with any environmental agenda that might mean public control over private capital. There is also a need to win support by stroking the public's hidden wishes (such as the wish that nothing needs to be done about climate change) and by appealing to nationalism (typical of the Mussolinis, the Hitlers, the Francos, the Salazars, and of so many right-wing dictatorships in Latin America), even if that occasionally means adopting a falsely anti-imperialist discourse. With such “higher” goals, which include sabotaging the campaign for a presidential veto of the monster that is the Forest Code approved by the lower house, why bother with commitment to scientific truth? Why bother with ethics and respectful treatment of one's fellow academics?
It is striking how those who accuse us of “fraud,” “conspiracy,” and so on are in fact precisely the ones who practice them. As I have put it in other texts on the subject, the pseudo-arguments presented by the deniers must be scientifically debunked (and I have done so elsewhere), but as my colleague Michael Mann rightly reminds us, they are like the hydra: they always have more lies up their sleeve to throw around, and they have no concern whatsoever with presenting a coherent whole in opposition to the scientific community's views. What interests them is sowing confusion, gaining political ground, delaying action to protect climate stability, and buying time for those who fund them (even if some deniers are not directly tied to the oil industry and others, the industry's connection to the coordinated worldwide anti-climate-science campaign has become evident). Pseudo-science and intellectual imposture are the hydra's heads. The monster's heart is the political-ideological agenda. But the sword of truth is long enough to strike it dead!
Alexandre A. Costa, Ph.D.
Full Professor
Master's Program in Applied Physical Sciences
Universidade Estadual do Ceará
In Defense of Climate Science (May 10, 2012 – posted on Facebook)
I have been deeply worried about the recent attacks on climate science, among other reasons because they have coalesced into a strange amalgam uniting the Tea Party, the petrochemical industry, and people who seem to believe in a great imperialist conspiracy to keep the periphery of capitalism from “developing” by preventing it from burning its fossil-fuel reserves, which, pardon the expression, is in itself an utterly narrow-minded view of “development.”
But this is not an ideological question, for if it were, I would stand far from Al Gore. It is a scientific question, because CO2 molecules have no ideology. What they do have, like certain other molecules (CH4 and water vapor itself), is a property that the majority gases in our atmosphere lack: a mode of oscillation whose frequency coincides with a region of the electromagnetic spectrum known as the infrared. Heat retention is a consequence of the presence of these gases (however minor their share) in Earth's atmosphere. Were it not for them, Earth would have a mean temperature of -18 degrees instead of a moderate 15, to say nothing of their role in keeping it within mild limits. Earth is not Mercury, which, having no atmosphere, freely returns the energy absorbed from the Sun on its day side, leading to temperature contrasts of 430 degrees by day and -160 degrees at night. Fortunately, neither is it Venus, whose cloud cover lets less solar energy reach its surface than reaches Earth's, but whose greenhouse effect, caused by an atmosphere composed almost exclusively of CO2, raises its temperature to a nearly constant 480 degrees.
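The -18 degree figure cited above is the standard "effective temperature" of an Earth without greenhouse gases. As a rough sanity check (not part of the original essay; it assumes the conventional round values of about 1361 W/m² for the solar constant and 0.3 for the planetary albedo), it follows directly from the Stefan-Boltzmann law:

```python
# Effective (no-greenhouse) temperature of Earth from the Stefan-Boltzmann law.
# Assumed round values: solar constant ~1361 W/m^2, Bond albedo ~0.3.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature(solar_constant=1361.0, albedo=0.30):
    """Equilibrium temperature of a planet with no greenhouse effect, in kelvin."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # incoming flux averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

t = effective_temperature()
print(f"{t:.1f} K")  # about 255 K, i.e. roughly -18 degrees Celsius
```

The ~33-degree gap between this value and the observed mean of about 15 degrees Celsius is precisely the natural greenhouse effect the author describes.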
To ignore these simple scientific ideas, that CO2 is a greenhouse gas (known and measured since the 19th century by Tyndall, Arrhenius, and others), with a mechanism well explained by the physics of its molecular structure; to ignore the well-known global effect CO2 has on a neighboring planet, well established in astronomy since the late Sagan, makes no sense, especially in academia, where some of the most vocal deniers are found. To them I would like to recall something basic about the scientific method. On the one hand, science has no dogma and no definitive truths. Its truths are always, by construction, partial and provisional (fortunately, or it would become something as dull and tedious as, say, a religion). On the other hand, scientific knowledge is cumulative, and in that sense one cannot walk backwards! Only when a theory fails is a new one justified, and the new one cannot be a mere negation of the old: it must be able to reproduce all of its merits (as with classical mechanics and relativity, which reduces to the former at low speeds).
It is not a matter of belief. "Monotony" aside, this is well-established, well-known science, as much so as universal gravitation (also "just" a theory) or the evolution of species.
INJUSTICE, DISRESPECT, AND UNDERESTIMATION
Climate scientists have been under attack on the basis of factoids that at no point resemble the reality of our field. No science today is as public and open. Anyone who wants to can easily obtain, in most cases directly over the internet, observed climate data that clearly demonstrate global warming (www.cru.uea.ac.uk/cru/data/ among others), modeling data being generated right now that will certainly feed into the 5th IPCC report (http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html), or paleoclimate proxy data used to analyze past climates (www.ncdc.noaa.gov). Anyone can obtain the IPCC reports at www.ipcc.ch and follow the references, peer-reviewed and published in their overwhelming majority, especially in the case of the Working Group dealing with the physical basis of the climate system, in high-impact journals, whether general (Science, Nature) or specialized. I doubt that in our universities, full of laboratories with private partnerships, whether in materials engineering or in biochemistry, there is any segment as open, with the selflessness to sit at the table, share data, survey the state of the art of its science, and collectively produce a synthesis report. I doubt it! I challenge anyone!
We scientists who take part in these panels are not "representatives of governments." Nothing is created or invented in these panels beyond a synthesis of the science that is produced independently and published in the peer-reviewed literature. Members of the academic community can, moreover, easily inform themselves about how these panels work by talking to colleagues in the Brazilian scientific community who have taken part in the initiatives of the IPCC and the PBMC, before voicing an opinion, so that they do not end up, in practice, defaming what they do not know. Some people, without the slightest critical attitude toward the IPCC's detractors, simply repeat their verbiage, when they could instead be skeptical of the "skeptics."
But they are not. At no point do they question the real motivations of the two or three (fortunately, they are that rare) who adopt the lamentable stance of anti-science denial, whether because they are openly corrupt servants of the petrochemical industry or simply because they possess a vanity that will not fit the secondary role they would play if they were, like us, expending enormous energy, generally in near anonymity, to lay brick upon brick in the edifice of climate science. One must learn to distinguish honest, genuine skepticism, which is healthy in science and consonant with sincere doubt and a critical attitude, from religious denial, based on faith and on the blind need to defend a given point of view regardless of whether it has any basis in reality, and above all from the plain and simple knavery promoted by some of the deniers. The possible "success" of these ideas with the public is, to my mind, territory for social psychology, but the best analogy I have is the popularity of religious ideas: comforting lies generally preferred over unpleasant truths.
True skepticism led the Berkeley physicists as far as they went (http://www.berkeleyearth.org/index.php). Initially questioning the results obtained by our community, they armed themselves with an enormous worldwide temperature database, broader than those available to the British Hadley Centre and NASA. They tested other methodologies and even went so far as to exclude the weather stations used by our research centers. The initial stance of Richard Muller, the initiative's founder, was so skeptical of our results that he even raised funds from the notorious Koch Foundation, openly hostile to climate science. And what did Muller and his partners find? The same result we already knew. The Earth is warming, and this warming accelerated considerably in the last decades of the 20th century. The warming approaches one degree and is therefore far above all the natural fluctuations recorded since instrumental records began. Indeed, they confirmed something else we knew: that the data from the University of East Anglia (the very data behind the farce mounted under the high-sounding name of "climategate," whose scientists were persecuted and had their reputations ignominiously attacked, with repercussions for their careers and personal lives) contain an error... on the low side! The warming suggested by the CRU/UEA data is a tenth of a degree below that of the other data sources, and, of course, none of us accuses them of dishonesty for it.
Another imposture, and unfortunately, despite the harshness of the term, I think this is a case where it applies, is the underestimation of our community's intelligence, combined with ignorance of the materials it produces. The 4th IPCC report already contains a chapter devoted exclusively to paleoclimatology, that is, to the climate of the past. I personally have devoted great effort to analyzing proxies of past climate and to modeling past climatic conditions. There has been, since the first IPCC report, a permanent concern with discerning the natural signal and separating the anthropogenic signal from it. To that end, the role of variations in solar activity, volcanic emissions, and so on is assessed. We have already evaluated the possible natural influences and ruled them out as the cause of the observed warming.
In this sense, there is no room for sophistry and equivocation. Regarding the paleoclimate records, which can retell the history of temperature and of greenhouse-gas concentrations from 800,000 years ago to the present, we all know that in the past a small warming of the planet preceded the rise in greenhouse-gas concentrations. This happened before the end of every glacial era. But it is obtuse reasoning to deduce from this that CO2 plays no role or, in the deniers' words, "is consequence, not cause." There are many feedback processes in the climate system, and this is one of the best examples. The subtle variations in insolation and in its distribution over Earth's surface associated with the orbital cycles are, as everyone knows, far too small to explain the large temperature differences between the glacial periods ("ice ages") and the interglacials (the briefer warm periods between them). But a subtle warming, after a few centuries, proved sufficient to raise natural emissions of CO2 and methane, which produce a greenhouse effect and amplify the process. This feedback was only restrained, under conditions free of human action, when the orbital conditions changed again, leading to a subtle cooling, which induced the uptake of CO2 by the Earth system, which in turn amplified the cooling, and so on.
But the fact that people die of cancer and heart attacks does not mean a murderer cannot be held responsible! Because people die naturally of strokes, does anyone think it possible to say that "a gunshot cannot kill anyone"? Or that no one should be tried for murder anymore? In the past, a small warming was needed to trigger natural emissions and a rise in CO2 concentration, after which the warming accelerated. Today there is an independent source of CO2, foreign to the natural cycles, and that source is the burning of fossil fuels! I should stress, moreover, that even isotopic analysis (the composition differs between fossil fuels and other sources) is clear: the origin of the excess CO2 in Earth's atmosphere is indeed, for the most part, oil, coal, and natural gas! A minimum of genuine scientific depth makes clear that today's rise in atmospheric CO2 concentrations is eminently anthropogenic, and that this is what is driving the observed climate changes. It is no longer possible to hide the sun, or rather to hide the greenhouse gases, behind a sieve! The paleoclimate records show that the current warming is unprecedented in the last 2,500 years. They show that the current CO2 concentration is 110 ppm above what was observed before the industrial era and nearly 100 ppm above anything seen in the last 800,000 years. They show that this number is larger than the difference between the CO2 concentrations of the interglacials and those of the "ice ages," and that this does indeed make a great difference to the climate.
WHAT THE REAL ERRORS ARE
Some people call themselves skeptical, critical, and distrustful of the majority of our community of climate scientists, yet fail to notice the fundamental error they commit: an absolute lack of skepticism, criticism, and distrust toward those who attack us. The posture of those who fight climate science on petrochemical-industry funding, or in association with the most reactionary partisan and media sectors, is self-explanatory. What interests them is the covering up of reality. But not only that. They range from people who receive oil-industry money directly to loudmouths who long ago ceased any real scientific activity in the field and who, unable to stay in the spotlight by working seriously to advance our science, grappling with the genuine uncertainties, helping collect data, improving methods and models, and so on, attack the rest of the community merely to keep the spotlight on themselves. Strange and flamboyant, like peacock feathers. Prosaic, like the evolutionary mechanisms that brought such feathers into being. Hence one must also combat the point of view of those who give this attack a false "leftist" veneer, for they resort to conspiracy theories, a pathological distortion of critical reasoning. Fighting the wrong target with the wrong weapon is worse than disarming before the fight.
Is the IPCC perfect? Of course not. It has made mistakes. But do you want to know what they actually are? One thing must be clear to everyone: the IPCC's assessments tend to be conservative. The temperature projections made for the years after 2000 are essentially on target, but do you know what has happened to the projections of sea-level rise and Arctic melting? They are underestimates. That's right. The true picture is graver than the 4th IPCC report indicates. And again, not for political reasons, but because of the limitations, at the time, of cryosphere models, incapable of accounting for important processes that drive melting. Drawing on the papers that have been published in the meantime, the 5th report will probably be able to correct these limitations and show a picture closer to the real gravity of the problem when it is published in 2013-2014.
WHAT IS THE REAL IDEOLOGICAL QUESTION?
It makes no sense to "believe" or not in gravity, in evolution, or in the greenhouse effect. It is not an "ideological choice" (even though in the US there is a strong correlation between ideology and science among the most reactionary Republican voters, who listen to the detractors of climate science and who also want Darwin out of the schools).
The real ideological question is that climate change is a process of extreme inequality, from its roots to its impacts. Those who have benefited most from greenhouse-gas emissions were, and continue to be, the dominant classes of the central capitalist countries. Together with the mega-conglomerates of finance capital, the petrochemical industry, the mining sector (including coal mining), the energy sector, and so on have concentrated wealth while using the atmosphere as their great garbage can. More than the current carbon "footprint" (itself extremely unequal if we compare Americans, Europeans, and Australians on one side with Africans on the other), the "historical footprint" (that is, what has already been emitted, accumulated from each country's emissions) is even more disparate, making Europe, followed by the US, the great historical emitters.
Cruelly, in return, the impacts of climate change will fall on the poorest countries, on the small nations, above all on the poor of poor countries, on the most vulnerable. Loss of territory in island nations; water and food security problems in semi-arid regions (so vast in the cradle of our species, the African continent); the effects of severe events (which, on very clear physical grounds, should become more frequent on a warmer planet); damage to coastal marine ecosystems and forests, hitting fishing and gathering activities; the collapse of traditional crops... and where does all of this land? On the floors below! Those upstairs talk about "adaptation" and have far more instruments with which to adapt to the changes. For us, in this case, the interest lies in being conservative about the climate and in halting this clumsy, disorderly "experiment" of altering the chemical composition of Earth's atmosphere and the planetary energy balance! For the majority of the 7 billion inhabitants of this sphere, climate stability matters!
Some of the richest, in fact, see global warming as an "opportunity"... Of course: the "opportunity" to expand agribusiness into the future arable lands of northern Canada and Siberia, and to drill for oil in the ocean that will open up as the Arctic increasingly melts.
So we must recognize that a genuine imposture is roaming about, and science needs to be defended. A rock is a rock; a tree is a tree; a CO2 molecule is a CO2 molecule, regardless of ideology. But those of us on the floors below will only be able to arm ourselves to transform society if we are well informed, and for that, the absurdities uttered by the detractors of climate science must be fought.
Alexandre Costa holds a bachelor's and a master's degree in Physics from the Universidade Federal do Ceará and a Ph.D. in Atmospheric Sciences from Colorado State University, with postdoctoral work at Yale University, and has published in several scientific journals, including Science, the Journal of the Atmospheric Sciences, and Atmospheric Research. He is a CNPq research productivity fellow and a member of the Brazilian Panel on Climate Change (PBMC).
ScienceDaily (Oct. 16, 2009) — Worried about climate change and want to learn more? You probably aren’t watching television then. A new study by George Mason University Communication Professor Xiaoquan Zhao suggests that watching television has no significant impact on viewers’ knowledge about the issue of climate change. Reading newspapers and using the web, however, seem to contribute to people’s knowledge about this issue.
The study, “Media Use and Global Warming Perceptions: A Snapshot of the Reinforcing Spirals,” looked at the relationship between media use and people’s perceptions of global warming. The study asked participants how often they watch TV, surf the Web, and read newspapers. They were also asked about their concern and knowledge of global warming and specifically its impact on the polar regions.
“Unlike many other social issues with which the public may have first-hand experience, global warming is an issue that many come to learn about through the media,” says Zhao. “The primary source of mediated information about global warming is the news.”
The results showed that people who read newspapers and use the Internet more often are more likely to be concerned about global warming and believe they are better educated about the subject. Watching more television, however, did not seem to help.
He also found that individuals concerned about global warming are more likely to seek out information on this issue from a variety of media and nonmedia sources. Other forms of media, such as the Oscar-winning documentary "An Inconvenient Truth" and the blockbuster thriller "The Day After Tomorrow," have played important roles in advancing the public's interest in this domain.
Politics also seemed to have an influence on people’s perceptions about the science of global warming. Republicans are more likely to believe that scientists are still debating the existence and human causes of global warming, whereas Democrats are more likely to believe that a scientific consensus has already been achieved on these matters.
“Some media forms have clear influence on people’s perceived knowledge of global warming, and most of it seems positive,” says Zhao. “Future research should focus on how to harness this powerful educational function.”
ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.
A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.
This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.
In the national survey conducted in June 2010, two-thirds of respondents said they either believed there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), believed that most scientists think it is not happening (5 percent), or did not know enough to say (16 percent). These respondents were less likely to support climate change policies and more likely to view climate change as a low priority.
By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.
“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.
Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”
ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.
The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.
However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.
Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.
“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”
The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percentage points at 95 percent confidence.
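As a quick arithmetic check (not from the article itself, and only an approximation for an online panel, since the formula assumes a simple random sample), the quoted margin follows from the sample size via the standard conservative estimate with p = 0.5:

```python
import math

# Margin of sampling error for a proportion at 95% confidence,
# using the conservative worst case p = 0.5.
def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a sample proportion."""
    return z * math.sqrt(p * (1.0 - p) / n)

moe = margin_of_error(2030)  # sample size from the Knowledge Networks survey
print(f"{moe:.1%}")  # about 2.2%, consistent with the quoted figure
```

Quadrupling the sample size would halve this margin, which is why surveys of this kind rarely go far beyond a couple of thousand respondents.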
ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.
“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”
The study showed high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.
The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as in the movies An Inconvenient Truth and Ice Age: The Meltdown and in the mainstream media’s escalating emphasis on the trend.
The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.
Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.
“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.
But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.
“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”
Measuring knowledge about global warming is a tricky business, Kellstedt adds.
“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.
“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”
Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.
ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.
Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)
“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”
Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.
A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. In the survey, global warming ranks eighth in importance.
“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.
Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.
“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”
Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.
ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.
The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.
In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.
The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.
Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”
The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.
The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.
In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.
After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,1 which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”2 Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title, Frankenstein, Or, the Modern Prometheus.3
Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.
Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”
Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.4
Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?
The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.
What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.5 This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.
1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”
Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.
Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.
If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!
The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”
The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”
Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.
The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.
And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in the 1940s.
Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!
Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back to progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”
Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.
Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shall not transgress?” Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?6
In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only a supporting force.
Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.
Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.
To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.
2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.
But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!
Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — this was the contrary, the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.
France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.
Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.
Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.7
The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”
But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, the other, unintended consequences are part and parcel of any action.
3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster has appeared in European and especially French politics after many scandals due to the misplaced belief by state authority in the certainties provided by Science.8
When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring to action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted blood catastrophe of the 1980s ensued: before agreement was produced, hundreds of patients were transfused with blood contaminated by the AIDS virus.9
The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?
Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!
But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.
4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”10
But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.
But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.
If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you the hypocrite who confesses one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.
The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.
Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?”
1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.
2. Shelley, Mary W., 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.
3. Ibid.
4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.
5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.
6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.
7. Descola, Philippe. 2005. Par-delà nature et culture. Paris: Gallimard.
8. Sadeleer, Nicolas de, 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publ. Ltd.
9. Hermitte, Marie-Angèle. 1996. Le Sang et le Droit: Essai sur la transfusion sanguine. Paris: Le Seuil.
10. Descartes, René. 1637. Discourse on Method in Discourse on Method and Related Writings. Translated by Desmond M. Clarke. 1999. Part 6, 44. New York: Penguin.
It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.
What am I talking about? The third rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.
Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
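To make "numerical solution of the equations that describe the atmosphere" concrete, here is a deliberately tiny sketch of the core idea: discretize a fluid equation on a grid and march it forward in time. This toy uses 1-D linear advection with a first-order upwind scheme; operational models solve far richer 3-D equation sets with much more sophisticated numerics, so treat this purely as an illustration of the principle.

```python
import numpy as np

# Toy NWP: step a discretized fluid equation forward in time on a grid.
# Here, 1-D linear advection  du/dt + c * du/dx = 0  with a first-order
# upwind scheme and periodic boundaries.
nx, dx = 400, 1000.0          # 400 grid points, 1 km spacing
c, dt = 10.0, 50.0            # 10 m/s "wind", 50 s time step (Courant number 0.5)
x = np.arange(nx) * dx
u = np.exp(-((x - 20000.0) / 5000.0) ** 2)   # initial "weather feature" at x = 20 km

for _ in range(200):          # integrate 200 steps = 10,000 s
    u = u - (c * dt / dx) * (u - np.roll(u, 1))   # upwind difference

# The feature should have been carried roughly c * t = 100 km downstream,
# so its peak should now sit near x = 120 km.
print(x[np.argmax(u)] / 1000.0)
```

Notice that the first-order scheme also flattens the bump as it travels (numerical diffusion); avoiding such errors is one reason real models use higher-order, more expensive numerics, which is part of why computer power matters so much in this story.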
Supercomputers are used for numerical weather prediction.
U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium Range Weather Forecasting (ECMWF). And we have also fallen behind in ensembles (using many model runs to give probabilistic predictions) and high-resolution operational forecasting. Decades ago we were the world leader: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong, I will talk about some of the issues here, and our nation needs to fix it.
But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.
In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).
The Europeans banded together decades ago to form the European Center for Medium-Range Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.
The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).
There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.
The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in our forecasts being noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. This figure shows the quality of the 500 hPa forecast (about halfway up in the troposphere, approximately 18,000 ft) for the day 5 forecast. The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom graph shows the difference between the U.S. and the other nations’ model skill.
You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.
Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009-2012 for the 120 h forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
Let’s look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie, and myself comparing various models at 24, 48, and 72 hr for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.
I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at the short range, but has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.
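The numbers in these plots come from standard verification metrics. As a sketch with synthetic values (not real model output), here is how the two measures mentioned above, a correlation-style skill score at 500 hPa and RMSE, are typically computed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for a 500 hPa geopotential height field (meters):
climatology = np.full(1000, 5500.0)                    # long-term mean
analysis = climatology + rng.normal(0.0, 100.0, 1000)  # "truth": verifying analysis
forecast = analysis + rng.normal(0.0, 40.0, 1000)      # forecast with ~40 m of error

def rmse(f, o):
    """Root mean square error, the measure in the CMC comparison plot."""
    return np.sqrt(np.mean((f - o) ** 2))

def anomaly_correlation(f, o, clim):
    """Anomaly correlation coefficient: a 'closer to 1 is better'
    skill measure, computed on departures from climatology."""
    fa, oa = f - clim, o - clim
    return np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2))

print(rmse(forecast, analysis))                              # roughly 40 m
print(anomaly_correlation(forecast, analysis, climatology))  # roughly 0.93
```

A one-day predictability gap of the kind described above shows up directly in such curves: one center's day-8 score matching another's day-7 score.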
Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussion, which is available online, you will frequently see that forecasters depend not on the U.S. model but on the ECMWF. And during the January western WA snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate-related firm that was moving up to Seattle. I asked him what model they were using: the U.S. GFS? He laughed. Of course not; they were using the ECMWF.
A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models — which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs for the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.
The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:
1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous: the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning, relying on heavy-metal IBM machines at very high cost.
2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate the observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We use an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet.
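For readers unfamiliar with the jargon: 3DVAR blends the previous forecast (the "background") with new observations by minimizing a cost function, while 4DVAR does the same but fits the model trajectory over a whole time window, which is why it demands so much more computing. Here is a toy two-variable sketch of the 3DVAR analysis step, with made-up illustrative values, not anything from the operational systems:

```python
import numpy as np

# Toy 3DVAR: find the analysis x minimizing
#   J(x) = (x - xb)^T B^-1 (x - xb) + (y - H x)^T R^-1 (y - H x)
# For a linear observation operator H, the minimizer has the closed form
#   xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)
xb = np.array([10.0, 20.0])              # background (prior forecast) state
B = np.array([[2.0, 0.5],
              [0.5, 1.0]])               # background-error covariance
y = np.array([12.0])                     # a single observation
H = np.array([[1.0, 0.0]])               # observation operator: we observe x[0] only
R = np.array([[1.0]])                    # observation-error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
xa = xb + K @ (y - H @ xb)                     # analysis

print(xa)   # roughly [11.33, 20.33]
```

Note how observing only x[0] also nudges x[1] through the off-diagonal covariance term; spreading observational information intelligently in this way is exactly what a good assimilation system buys, and what the more expensive 4DVAR and ensemble approaches do better.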
3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP’s Environmental Modeling Center (EMC) is well known for its isolation and “not invented here” attitude. While the European Center has lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when its budget is under pressure, university research is the first thing it reduces. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news: a new building should be available soon.)
4. The NWS approach to weather-related research has been ineffective and divided. Government weather research resides NOT in the NWS, but elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the folks doing research in support of their mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has only marginal benefit for the NWS.
5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.
This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.
I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.
Let me note that the above is about the modeling aspects of the NWS, NOT the many people in the local forecast offices. This part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and suitable organization necessary to push forward effectively.
This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end…in weather prediction, education, and everything else…or we will see our nation sink into mediocrity.
The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.
* * *
Saturday, April 7, 2012
Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)
In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers, such as the European Centre for Medium Range Weather Forecasting (ECMWF), the UKMET office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and to the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research, and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we are not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.
In this blog, I will describe in some detail one major roadblock to giving the U.S. state-of-the-art weather prediction: inadequate computer resources. This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Department of Commerce, but it has not been, and I am convinced it will not be without outside pressure. It is time for the user community and our congressional representatives to intervene. To quote Samuel L. Jackson, enough is enough. (…)
In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction's (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high-resolution and ensemble forecasts, and seasonal climate forecasts. The UKMET Office also does regional NWP (England is not a big country!) and regional air quality. NCEP does all of this plus much, much more (high-resolution rapid-update modeling, hurricane modeling, etc.). And NCEP has to deal with prediction over a continental-size country.
If you expect that the U.S. has a lot more computer power to balance all these responsibilities and tasks, you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4992 processors (IBM Power6 processors). One computer does the operational work; the other is for backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.
NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8192 much faster processors that gets 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).
The UKMET Office, serving a far, far smaller country, has two newer IBM machines, each with 7680 processors delivering 175 teraflops per machine.
Here is a figure, produced at NCEP, that compares the relative computer power of NCEP's machine with the European Centre's. The shading indicates computational activity, and the x-axis for each represents a 24-h period. The relative heights allow you to compare computer resources. Not only does ECMWF have much more computer power, but they are more efficient in using it…packing useful computations into every available minute.
Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications were ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET Office systems…but that is not known at this point.
The Canadians? They have TWO machines like the European Centre’s!
So what kind of system does NCEP require to serve the nation in a reasonable way?
To start, we need to double the resolution of our global model to bring it into line with ECMWF (they are now at 15 km globally). Such resolution allows the global model to resolve regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power. We need to use better physics (descriptions of things like cloud processes and radiation). Double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model). Double once more. So we need 32 times more computer power for the high-resolution global runs to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in…that is what ECMWF does). There are some potential ways NCEP can work more efficiently as well. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest periods (180 hr plus) could be done twice a day. So let's begin with a computer 32 times faster than the current one.
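The doubling arithmetic above can be written out explicitly. The factor of 8 for resolution follows from doubling the grid points in each horizontal direction plus halving the timestep (a numerical-stability requirement); the factors of 2 for physics and data assimilation are the rough estimates given in the text, not measured benchmarks.

```python
# A sketch of the compute-scaling arithmetic described above.
# The physics and assimilation factors are the blog's own estimates.

def resolution_factor(refinement=2):
    # Halving the grid spacing doubles points in both horizontal
    # directions, and the timestep must shrink in proportion (CFL
    # stability), so cost grows by refinement**3 = 8 for refinement=2.
    return refinement ** 3

better_physics = 2        # estimated cost of improved parameterizations
better_assimilation = 2   # estimated cost of improved data assimilation

total = resolution_factor() * better_physics * better_assimilation
print(total)  # 32
```

That product is the "32 times more computer power" figure cited for the high-resolution global runs.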
Many workshops and meteorological meetings (such as one on improvements in model physics that was held at NCEP last summer; I was the chair) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution. The current national ensemble system has a horizontal resolution of about 32 km, and the NWS plans to get to about 20 km in a few years; both are inadequate. Here is an example of the ensemble output (mean of the ensemble members) for the NWS and UW (4-km) ensemble systems: the difference is huge. The NWS system does not even get close to modeling the impacts of the mountains, and it is similarly unable to simulate large convective systems.
Current NWS (NCEP) "high resolution" ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear. Probabilistic prediction based on ensemble forecasts and reforecasting (running models back for years to get statistics of performance) is the future of weather prediction. The days of giving a single number for, say, temperature at day 5 are over. We need to let people know about uncertainty and probabilities. The NWS needs a massive increase in computer power to do this. It lacks this computer power now and does not seem destined to get it soon.
A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing. He and colleagues have put together a compelling case for more NWS computer resources for NWP. Read it here.
Back-of-the-envelope calculations indicate that a good first step, 4-km national ensembles, would require about 20,000 processors to run in a timely manner, but it would revolutionize weather prediction in the U.S., including forecasting of convection and of conditions in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.
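As a hypothetical sanity check on that back-of-the-envelope figure (the numbers here are assumptions for illustration, not NCEP's): refining an ensemble from 32 km to 4 km multiplies the horizontal grid points by (32/4)^2 and forces the timestep down by 32/4, so the work per ensemble member grows by (32/4)^3 = 512 times. Holding wall-clock time fixed, processor counts must grow roughly in step; the ~20,000-processor estimate presumably also folds in member count, chip speed, and scheduling assumptions not shown here.

```python
# Hypothetical cost scaling for refining an ensemble's grid spacing.
# Assumes 2 horizontal dimensions plus a proportional timestep cut.

def work_factor(coarse_km, fine_km):
    ratio = coarse_km / fine_km
    return ratio ** 3  # (ratio**2 grid points) * (ratio timesteps)

print(work_factor(32, 4))  # 512.0
```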
And then there is running super-high-resolution numerical weather prediction to get fine-scale details right. Here in the NW my group runs a 1.3-km horizontal resolution forecast twice a day out to 48 h. Such capability is needed for the entire country. It does not exist now due to inadequate computer resources.
The bottom line is that the NWS numerical modeling effort needs a huge increase of computer power to serve the needs of the country–and the potential impacts would be transformative. We could go from having a third-place effort, which is slipping back into the pack, to a world leader. Furthermore, the added computer power will finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems. The fact that this is barely done today is really amazing and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.
But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that the NWS's EMC should begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind that my department has about 1000 processors in our computational clusters, so this is not as large a leap as you might think.
For a country with several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.
The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100K-core machine for 11 million dollars (and this is without any discount!). OK, this is the U.S. government, and they like expensive, heavy-metal machines…let's go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors, and the cost will be around 25-35 million dollars. NCEP will want two machines, so let's budget 60 million dollars. We spend this much money on a single jet fighter, but we can't invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, simulating thermonuclear explosions, and simulating climate change.
Yes, a lot of money, but I suspect the cost of the machine would be paid back within a few months through improved forecasts. Last year we had quite a few (over ten) billion-dollar storms…imagine the benefits of forecasting even a few of them better. Or the benefits to the wind energy and utility industries, or to U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that normal weather events cost the U.S. economy nearly half a trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.
As someone with an insider's view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local governments, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction, which can clearly save lives and help the economy. Enough is enough.
Today’s guest blog post is by cultural anthropologist and AAA member, Chad Huddleston. He is an Assistant Professor at St. Louis University in the Sociology, Anthropology and Criminal Justice department.
Recently, a host of new shows, such as Doomsday Preppers on NatGeo and Doomsday Bunkers on the Discovery Channel, has focused on people with a wide array of concerns about possible events that may threaten their lives. Both of these shows focus on what are called 'preppers.' While people who engaged in these behaviors in the past might have been called 'survivalists,' many 'preppers' have distanced themselves from that term, due to its cultural baggage: the stereotype of anti-government, gun-loving, racist extremists most often associated with the fundamentalist (politically and religiously) right side of the spectrum.
I’ve been doing fieldwork with preppers for the past two years, focusing on a group called Zombie Squad. It is ‘the nation’s premier non-stationary cadaver suppression task force,’ as well as a grassroots, 501(c)(3) charity organization. Zombie Squad’s story is that while the zombie-removal business is generally slow, there is no reason to be unprepared. So, while it waits for the “zombpocalypse,” it focuses its time on disaster-preparedness education for its membership and the community.
The group’s position is that being prepared for zombies means that you are prepared for anything, especially those events that are much more likely than a zombie uprising – tornadoes, an interruption in services, ice storms, flooding, fires, and earthquakes.
For many in this group, Hurricane Katrina was the event that solidified their resolve to prep. They saw what we all saw – a natural disaster in which services were not available for most, leading to violence, death and chaos. Their argument is that the more prepared the public is before a disaster occurs, the fewer resources they will require from first responders and those agencies that come after them.
In fact, instead of being a victim of natural disaster, you can be an active responder yourself, if you are prepared. Prepare they do. Members are active in gaining knowledge of all sorts – first aid, communications, tactical training, self-defense, first responder disaster training, as well as many outdoor survival skills, like making fire, building shelters, hunting and filtering water.
This education happens individually, through the online forum they maintain (which has just under 30,000 active members from all over the world), at monthly local meetings all over the country, and at annual national gatherings in southern Missouri, where they socialize, learn survival skills and practice sharpshooting.
Sound like those survivalists of the past? Emphatically no. Zombie Squad’s message is one of public education and awareness, very successful charity drives for a wide array of organizations, and inclusion of all ethnicities, genders, religions and politics. Yet the group is adamant about leaving politics and religion out of discussions of the group and of prepping. You will not find exclusive language on their forum or in their media. That is not to say that the individuals in the group do not have opinions on one side or the other of these issues, but it is a fact that those issues are not to be discussed within the community of Zombie Squad.
Considering that the ‘future doom’ scenarios pushed on the shows mentioned above usually involve protecting yourself first from the disaster and then from the other people who have survived it, Zombie Squad is a refreshing twist on the ‘prepper’ discourse. After all, if a natural disaster were to befall your region, whom would you rather have knocking at your door: ‘raiders’ or your neighborhood Zombie Squad member?
And the answer is no: they don’t really believe in zombies.
FORGOTTEN STORIES OF BUENOS AIRES: A MAN CLAIMED TO HAVE INVENTED THE RAIN MACHINE
It happened on January 2, 1939, when an engineer named Juan Baigorri assured the director of the Meteorological Service that he would make it rain over the city. And it rained.
Héctor Gambini. FROM THE CLARIN NEWSROOM.
Monday, June 17, 2002
“In response to the censure of my procedure, I am giving Buenos Aires, through Crítica, the gift of a rainfall on January 2, 1939.” The statement ran in the newspaper at the end of 1938 and was a public challenge to the director of the National Meteorological Service, for whom its author was nothing but a fraud: a provocateur of an engineer who claimed to have invented a machine for making rain.
When January 1 arrived, the challenge was so present in porteños’ minds that they clinked glasses at midnight with their eyes fixed on the clear sky. The day was so hot and humid that even sitting under the grape arbor to watch the scrawny clouds drifting over Buenos Aires was exhausting entertainment. But night came, and nothing.
On the morning of the 2nd, the city went back to work. And nothing. Not a trace of rain. But there was not enough wind to stir a rose petal. And the sickly little white clouds of the previous afternoon were gaining body and color. First leaden gray. Then shading toward black. Darker and darker. Until a whisper of a breeze appeared out of nowhere, carrying a breath of suspended moisture. Droplets too light even to reach the ground. Then finer droplets behind them that did touch the asphalt. Then drops as fat as gnocchi, drawing patterns in the forming puddles. Soon afterward, an electrical storm and a violent downpour. A cataract falling from the sky while Crítica stopped the presses to come out at noon with the lead headline of its fifth edition, in catastrophe-size type: “As Baigorri predicted, it rained today,” beneath a kicker reporting what had just happened in Buenos Aires: “Baigorri got three million people to turn their eyes to the sky.”
This Baigorri had been born in Entre Ríos at the end of the previous century. The son of a military officer who was a friend of General Roca, he came to Buenos Aires for secondary school at the Colegio Nacional. After graduating, he traveled to Italy to study geophysics and earned his engineering degree at the University of Milan.
In those years, the early 1930s, he began to travel the world under contract to various oil companies. He worked in several countries of Europe, Asia and Africa, and also in the United States, from which he returned under contract to YPF.
He settled in Caballito with his wife and son. Along with the family’s luggage, he had an apparatus with extendable antennas brought from the airport, which he kept jealously guarded in a closet. “I am more or less adapted to Buenos Aires, but there is a lot of humidity,” he complained.
One morning he made up his mind. He took some instruments and used them to measure the humidity around the porteño neighborhoods. He stopped in front of a house at Araujo and Falcón, in Villa Luro. The needles told him it was the highest spot of all the ground he had covered. He bought that house, which had an attic perfect for a laboratory.
There the workings of the strange machine were “developed”: a device that, according to Baigorri, made the sky break into rain whenever he switched it on. By his account, it operated through an electromagnetic mechanism that concentrated clouds in the apparatus’s area of influence.
It was 1938, and the newspapers were full of the recent suicides of Leopoldo Lugones and Alfonsina Storni, and of the fraud in the parliamentary elections that had President Roberto Ortiz on the verge of resigning. River was inaugurating the Monumental.
Baigorri wanted to prove he could control the rain and sought the sponsorship of the Ferrocarril Central Argentino. The English manager heard the proposal and smiled maliciously. “And you could do it anywhere?” he asked, stumbling over his Spanish. Baigorri said yes, and the Englishman shot back, sarcastic: “Fine, make it rain in Santiago del Estero.”
The engineer set off for there with his strange machine and an agricultural expert as companion, sent along to monitor him. A few days later they returned, and the expert certified that, on a ranch in a place called Estación Pinto, Baigorri set to work and eight hours later it rained.
His fame began to grow and traveled with him, by train, back to Buenos Aires. Two journalists from The Times of London even came to interview him. In the other corner, the engineer Calmarini, director of the Meteorological Service, declared that it was all a shameless fabrication or, at most, a matter of chance.
Taking advantage of the controversy, with the subject the talk of the street, Crítica went to interview Baigorri. Out of that came the challenge for January 2. When the Meteorological Service kept silent, the engineer raised the stakes: he sent the national official an umbrella as a gift, with a card attached: “For use on January 2.” That was the day porteños stayed up watching the sky, waiting for the rain.
Baigorri began traveling through the interior, “making it rain” with his machine in different towns, with mixed success.
In 1951 he served as an unpaid adviser to the Ministry of Technical Affairs. The following year he dusted off his old invention and traveled to La Pampa. He arrived, switched on the battery, and it began to rain, though by then people doubted his merits: “It was going to rain anyway,” they said.
Baigorri retreated into a long silence. By then a widower, he spent hours in the attic in Villa Luro. Leonor, the woman who lives in that house today, told Clarín: “Every time it rained, people would gather around the house and stare up at the attic.” In that same attic, Baigorri refused to receive an emissary who claimed to come on behalf of an American businessman wanting to buy the formula. “My invention is Argentine and will be for the exclusive benefit of Argentines,” he replied.
Old and alone, he sold the house and moved in with a French friend, who lent him a room in an apartment. He died in the autumn of 1972, exactly 30 years ago. He was 81 and had arrived at the hospital alone, with bronchial problems.
No one ever heard of the strange machine with the antennas again. Nor whether Baigorri left a secret successor to switch it on as a tribute during his own funeral: while he was being buried, in Chacarita cemetery, it began to rain.
The brother of Lance Madison (C) was shot dead on September 4, 2005, at the Danziger Bridge in New Orleans [Reuters]
Five ex-police officers given prison terms for roles in shootings and cover-up in days after Hurricane Katrina in 2005.
Last Modified: 05 Apr 2012 01:03
Five former New Orleans police officers have been sentenced to prison terms ranging from six to 65 years for their roles in deadly shootings of unarmed residents in the chaotic days after Hurricane Katrina.
The presiding judge lashed out at prosecutors for two hours on Wednesday over their handling of the case, in which police shot six people at a bridge on September 4, 2005, killing two, less than a week after Katrina made landfall.
To make the shootings appear justified, officers conspired to plant a gun, fabricate witnesses and falsify reports. The case became the centerpiece of the US Justice Department’s push to clean up the troubled New Orleans Police Department.
Kenneth Bowen, Robert Gisevius, Anthony Villavaso and Robert Faulcon were convicted of federal firearms charges that carried mandatory minimum prison sentences of at least 35 years. Retired officer Arthur Kaufman, who was assigned to investigate the shootings, was convicted of helping orchestrate the cover-up.
Faulcon, who was convicted on charges in both fatal shootings, faces the stiffest sentence of 65 years. Bowen and Gisevius each face 40 years, while Villavaso was sentenced to 38. Kaufman received the lightest sentence at six years.
Community ‘disservice’
Afterward, US District Judge Kurt Engelhardt accused prosecutors of cutting overly lenient plea deals with five other officers who cooperated with the civil rights investigation. The former officers pleaded guilty to helping cover up the shooting and are already serving prison terms ranging from three to eight years.
“These through-the-looking-glass plea deals that tied the hands of this court … are an affront to the court and a disservice to the community,” Engelhardt said.
The judge also questioned the credibility of the officers who pleaded guilty and testified against those who went to trial.
In particular, the judge criticized prosecutors for seeking a 20-year prison sentence for Kaufman, yet Michael Lohman, who was the highest-ranking officer at the scene of the shooting, received four years under his deal for pleading guilty to participating in the cover-up.
‘Unbearable’ pain
Engelhardt heard several hours of arguments and testimony earlier on Wednesday from prosecutors, defense attorneys, relatives of shooting victims and the officers. Ronald Madison and 17-year-old James Brissette died in the shootings.
“This has been a long and painful six-and-a-half years,” said Lance Madison, whose 40-year-old, mentally disabled brother, Ronald, was killed at the bridge. “The people of New Orleans and my family are ready for justice.”
Madison individually addressed each defendant, including Faulcon, who shot his brother: “When I look at you, my pain becomes unbearable. You took the life of an angel and basically ripped my heart out.”
Madison also said he was horrified by Kaufman’s actions in the cover-up: “You tried to frame me, a man you knew was innocent, and send me to prison for the rest of my life.”
Lance Madison was arrested on attempted murder charges after police falsely accused him of shooting at the officers on the bridge. He was jailed for three weeks before a judge freed him.
None of the officers addressed the court before they were sentenced.
Chaotic aftermath
Katrina struck on August 29, 2005, leading to the collapse of levees and flooding an estimated 80 per cent of the city. New Orleans was plunged into chaos as residents who hadn’t evacuated were driven from their homes to whatever high ground they could find.
Officers who worked in the city at the time but were not charged in the bridge case told Engelhardt on Wednesday of the lawlessness that followed the flood, and said that they had feared for their lives.
On the morning of September 4, one group of residents was crossing the Danziger Bridge in the city’s Gentilly area in search of food and supplies when police arrived.
The officers had received calls that shots were being fired. Gunfire reports were common after Katrina.
Faulcon was convicted of fatally shooting Madison, but the jury decided the killing didn’t amount to murder. He, Gisevius, Bowen and Villavaso were convicted in Brissette’s killing, but jurors didn’t hold any of them individually responsible for causing his death.
All five officers were convicted of participating in a cover-up.