Category archive: meteorology

>On Birth Certificates, Climate Risk and an Inconvenient Mind (N.Y. Times, Dot Earth Blog)

April 28, 2011, 9:23 AM

As Donald Trump tries to milk a last bit of publicity out of the failed “birther” challenge to President Obama, it’s worth reading a fresh take by an Australian psychologist on the deep roots of denial in people with fundamentalist passions of whatever stripe. Here’s an excerpt:

[I]deology trumps facts.
And it doesn’t matter what the ideology is, whether socialism, any brand of fundamentalist religion, or free-market extremism. The psychological literature shows quite consistently that a threat to one’s worldview is more than likely met by a dismissal of facts, however strong the evidence. Indeed, the stronger the evidence, the greater the threat — and hence the greater the denial.
In its own bizarre way, then, the rising noise level of climate denial provides further evidence that global warming resulting from human CO2 emissions is indeed a fact, however inconvenient it may be. Read the rest.
The piece, published today on the Australian news blog The Drum, is by Stephan Lewandowsky of the School of Psychology at the University of Western Australia.
Of course, just being aware that ideology can deeply skew how people filter facts and respond to risks raises the question of how to make progress in the face of the wide societal divisions this pattern creates.
It’s easy to forget that there’s been plenty of climate denial to go around. It took a decade for those seeking a rising price on carbon dioxide emissions as a means to transform American and global energy norms to realize that a price sufficient to drive the change was a political impossibility.
As a new paper in the Proceedings of the National Academy of Sciences found, even when greenhouse-gas emissions caps were put in place, trade with unregulated countries simply shifted the brunt of the emissions elsewhere.
When he was Britain’s prime minister, Tony Blair put it this way in 2005: “The blunt truth about the politics of climate change is that no country will want to sacrifice its economy in order to meet this challenge.”
My choice, of course, is to attack the two-pronged energy challenge the world faces with a sustained energy quest, nudged and nurtured from the top but mainly fostered from the ground up.
And I’m aware I still suffer from a hint of “scientism,” even “rational optimism,” in expecting that this argument can catch on, but so be it.
10:11 a.m. | Updated For much more on the behavioral factors that shape the human struggle over climate policy, I encourage you to explore “Living in Denial: Climate Change, Emotions, and Everyday Life,” a new book by Kari Marie Norgaard, a sociologist who has just moved from Whitman College to the University of Oregon.
Robert Brulle of Drexel University brought the book to my attention several months ago, and I invited him to do a Dot Earth “Book Report,” to kick off a discussion of Norgaard’s insights, which emerge from years of research she conducted on climate attitudes in a rural community in western Norway. (I’d first heard of Norgaard’s research while reporting my 2007 article on behavior and climate risk.)
(I also encourage you to read the review in the journal Nature Climate Change by Mike Hulme, a professor of climate at the University of East Anglia and the author of “Why We Disagree about Climate Change.”)
Here’s Brulle’s reaction to Norgaard’s book:
As a sociologist and longtime student of human responses to environmental problems, I’ve seen reams of analysis come and go on why we get some things right and some very wrong. A new book by Kari Norgaard has done the best job yet of cutting to the core on our seeming inability to grasp and meaningfully respond to human-driven climate change.
As the science of climate change has grown stronger and its findings more dire, media coverage, public opinion, and government action on the issue have declined. At the same time, climate denial positions have become increasingly accepted, despite a lack of scientific evidence. Even among the public that accepts the science of global climate change, the dire circumstances we now face are consistently downplayed, and the logical implication of the scientific analysis — the necessity of swift and aggressive measures to combat climate change — is not followed through either intellectually or politically.
Instead, at best, a series of half measures have been proposed, which, though they may be comforting, are essentially symbolic measures that allow the status quo to continue unchanged, and thus will not adequately address the issue of global climate change. Attempts to address climate change have therefore encountered significant cultural, political, and economic barriers that have not been overcome. While there have been several attempts to explain the lack of meaningful action regarding climate change, these models have not developed into an integrated and empirically supported approach. Additionally, many of these models are based in an individualistic perspective, and thus engage in a form of psychological reductionism. Finally, none of these models can coherently explain the interrelated phenomena regarding climate change that are occurring at the individual, small group, institutional, and societal levels.
To move beyond the limitations of these approaches, Dr. Norgaard develops a sociological model that views the response to global climate change as a social process. One of the fundamental insights of sociology is that individuals are part of a larger structure of cultural and social interactions. Through socialization processes, we construct certain ways of life and understandings of the world that guide our everyday interactions. Individuals become the carriers of the orientations and practices that constitute our social order. A disjuncture between our taken-for-granted way of living and new demands, such as the behaviors necessitated by climate change, is experienced at the individual level as an identity threat, at the institutional level as a challenge to social cohesion, and at the societal level as a legitimation threat. When this occurs, powerful processes operate at the psychological, institutional, and societal levels to maintain the current orientations and ensure social stability. Taken together, these social processes create cultural and social stability. They also create, from the standpoint of climate change, a form of social inertia that inhibits rapid social change.
From this sociological perspective, Dr. Norgaard takes on the apparent paradox of climate change and public awareness; as our knowledge about the nature and seriousness of climate change has increased, our political and social engagement with the issue has declined. Why? Dr. Norgaard’s answer (crudely put) is that our personality structures and social norms are so thoroughly enmeshed with a growth economy based on fossil fuels that any consideration of the need to change our way of life to deal with climate change evokes powerful emotions of anxiety and desires to avoid this issue. This avoidance behavior is socially reinforced by collective group norms, as well as the messages we receive from the mass media and the political elite. She develops this thesis through the use of an impressive array of sociological theory, including the sociology of the emotions, cultural sociology, and political economy. Additionally, she utilizes specific theoretical approaches regarding the social denial of catastrophic risk. Here she skillfully repurposes the literature on nuclear war and collective denial to the issue of climate change. This is a unique and insightful use of this literature. Thus her theoretical contribution is substantial and original. She then illustrates this process through a thick qualitative analysis based on participant observation in Norway. In her analysis of conversations, she illustrates how collective denial of climate change takes place through conversations. This provided powerful ground truth evidence of her theoretical framework.
This is an extremely important intellectual contribution. Research on climate change and culture has been primarily focused on individual attitudinal change. This work brings a sociological perspective to our understanding of individual and collective responses to climate change information, and opens up a new research area. It also has important practical implications. Most climate change communication efforts are based on conveying information to individuals. The assumption is that individuals will take in this information and then act rationally in their own interests. Dr. Norgaard’s analysis charts a different course. As she demonstrates, it is not a lack of information that inhibits action on climate change. Rather, the knowledge brings about unpleasant emotions and anxiety. Individuals and communities seek to restore a sense of equilibrium and stability, and thus engage in a form of denial in which, although the basic facts of climate change are acknowledged, the logical conclusions and actions that follow from the information are minimized and not acted upon. This perspective calls for a much different approach to climate change communications, and defines a new agenda for this field.

[Note: people interested in this line of argument should follow the work done by researchers at the Center for Research on Environmental Decisions (CRED) at Columbia University.]


>Climategate: What Really Happened? (Mother Jones)


>Science needs to advance further to underpin climate policy (AIT, JC)

JC e-mail 4236, April 12, 2011

Investment in climate-change research in recent years enabled Brazil to be among the first countries to set targets for reducing greenhouse-gas (GHG) emissions.

Now, Brazilian science needs to advance further to support public policies for adapting society and economic sectors to climate change.

Emissions reduction – The assessment was made by Carlos Nobre, Secretary of Research and Development Policies and Programs at the Ministry of Science and Technology (MCT), at the opening of the 4th Regional Conference on Global Change: The Brazilian Plan for a Sustainable Future, held at the Memorial da América Latina in São Paulo (SP).

According to Nobre, these government investments in research made it possible for Brazil to become the first developing country to set GHG emissions-reduction targets of 36% to 39% by 2020, as established by the National Climate Change Plan signed into law at the end of 2009.
“This was the area where we advanced the most, with the establishment of sectoral emissions-reduction targets. The most significant, obviously, is the 80% reduction in the Amazon deforestation rate, where Brazil has achieved remarkable progress over the past six years. But an even greater challenge will be to cut deforestation by at least 40% in the Cerrado, which is currently Brazil’s largest agricultural frontier,” he said.

Climate-change adaptation measures – Even if GHG emissions are reduced quickly, the planet’s temperature will continue to rise over the coming centuries. For that reason, the next step is to develop adaptation measures that make society and economic sectors more resilient to climate change, the scientist noted.

One of Brazil’s recent initiatives in this direction is the creation of the National Natural Disaster Monitoring and Alert System, coordinated by Nobre. The system will include state and regional disaster monitoring and alert centers, in addition to a national center expected to open by the end of the year, in time for the next summer rains.

“This is a concrete climate-adaptation measure that we owed the country and that will finally get off the drawing board and become a reality,” he said. According to Nobre, adaptation to climate change is also one of the goals of the second Action Plan for Science, Technology and Innovation (Pacti), now being drafted by the federal government.

The plan will set the major goals the country aims to achieve in science, technology and innovation from 2012 to 2015. These include giving Brazil autonomy in generating future climate scenarios, especially regional-scale projections of climate extremes, to support regional and sectoral climate-adaptation plans such as those for agriculture.

“Adapting agriculture to climate change is fundamental for food security, not only for Brazil but for the world. Brazil is already the second-largest exporter of agricultural commodities and, in less than 10 years, will possibly become the largest,” Nobre noted.

A protagonist without leadership – In the view of scientists who took part in the event’s opening session, Brazil has assumed a leading role in discussions on reducing greenhouse-gas emissions. But for Eduardo José Viola, professor of international relations at the University of Brasília (UnB), the country’s capacity to lead climate negotiations is limited.

“Brazil could take a more active position in climate negotiations thanks to the synchronization of actions between the MCT and the Ministry of the Environment. But the country is a mid-sized climate power. The great climate powers that can solve the problem are the United States, the European Union and China, which together account for 60% of global emissions,” Viola said.

For Guy Pierre Brasseur of the National Center for Atmospheric Research (NCAR), in the United States, the decision to reduce global GHG emissions is not a scientific problem but a political choice. One way to get national leaders to make that commitment would be through popular pressure. “The results of the climate negotiations have been a catastrophe, and progress has been very limited. We have to think about how to improve the communication of the science on climate-change impacts, because countries will only decide to reduce their emissions under pressure from their own citizens,” he said.
(Agência Inovação Tecnológica)

>The future of weather forecasting in Brazil at risk (JC, Globo)

JC e-mail 4232, April 5, 2011.

A shortage of meteorologists jeopardizes forecasting and disaster prevention.

The lack of qualified professionals to produce weather forecasts and work on the operational side of meteorology worries specialists in the field. At the 4th Regional Conference on Global Change, which opened yesterday in São Paulo, José Antonio Marengo, a researcher at the National Institute for Space Research (Inpe), said Brazil has many professionals devoted to academic research but lacks specialists to interpret the data and maps.

According to him, for now this does not hurt weather forecasting in the country, but within five years the shortfall could cause serious disruption.

– We need newer, younger people. We have few professionals dedicated to producing forecasts and maps. Some are retiring, and others prefer to work on the academic side – he said.

In his view, the operational side of meteorology attracts few professionals because they consider the work uninteresting and unstimulating:

– It’s the tedious part, like the difference between a doctor who works in an emergency room and one who works at the Albert Einstein hospital.

Meanwhile, Paulo Artaxo Netto, a physicist at the University of São Paulo and a member of the Brazilian Panel on Climate Change, worries that the country is still not prepared to make better forecasts of floods and droughts to minimize the impact of natural disasters:

– The climate models produced in Brazil do not necessarily meet the needs of civil defense agencies. It is no use saying that 200 millimeters of rain will fall in the state of São Paulo. Will that happen in Araraquara or in Santos? We need high spatial resolution and forecasts hours in advance.

The specialists are confident about the launch and effectiveness of the National Natural Disaster Monitoring and Alert System, developed by the Ministry of Science and Technology and expected to begin operating by the end of the year. But climatologist Carlos Nobre, the MCT’s Secretary of Research and Development Policies and Programs, admitted that this remains a challenge.

– We have to reduce the number of disaster victims through science – he said.
(O Globo)

>Seminar marks World Meteorological Day

JC e-mail 4222, March 22, 2011.

Talks address climate change, technological advances in meteorology, and disaster risk management.

The National Institute of Meteorology (Inmet) will hold the seminar “Climate for You” tomorrow (the 23rd) at its headquarters in Brasília — the theme chosen at the 61st Session of the Executive Council of the World Meteorological Organization (WMO) for the celebrations of World Meteorological Day. Five talks will address highly relevant current topics such as climate change, technological advances in meteorology, and natural-disaster risk management.

The date commemorates the creation of the WMO, in 1950, as a specialized agency of the United Nations (UN) dealing with three elements fundamental to humankind: climate, weather, and water.

In a message to the Organization’s 189 member countries for World Meteorological Day, WMO Secretary-General Michel Jarraud said the WMO’s climate-related activities are now seen as fundamental to human safety and well-being and to securing economic benefits for all nations.

Antonio Divino Moura, director of Inmet and third vice-president of the WMO, says that “the themes Climate and Man are deeply interlinked. There is a reciprocal relationship between them: one affects the other. The way of life of people in the tropics and of people in the mid and polar latitudes is directly related to the climate. Conversely, humans can alter the climate through actions that change elements of nature, such as the burning of fossil fuels; the climate, in turn, reacts. Today, with a certain technological mastery on a planetary scale, humankind alters its own climate and suffers the consequences.”
(Ascom Inmet)

>Palocci to try to improve the climate (JC, O Globo)

Minister takes command of climate-change policy to put an end to disagreements.

JC e-mail 4218, March 16, 2011.

Antonio Palocci, head of the Casa Civil (the president’s chief of staff), has taken command of the agenda gaining the most prominence in the government’s environmental portfolio: climate-change policy, which, since the 2009 Copenhagen summit, has earned Brazil international prestige. The change of course unsettled the two main ministries handling the issue: Environment (MMA) and Science and Technology (MCT). Yesterday the Environment Ministry suffered its first casualty, with the departure of the national secretary for Climate Change, Branca Americano, to be replaced by the researcher Eduardo Assad of Embrapa.

According to Environment Minister Izabella Teixeira, who spoke yesterday to business leaders and state-government officials, the idea is to end the constant disagreements the climate agenda caused among the ministries and to achieve alignment. Conflicting views within the government have already led to fights and embarrassments.

– With Rio+20, we have to handle environmental issues differently from the way we had been handling them. We need a different governance structure. Climate change is the flagship of this discussion. We are working out the best format with the MCT and the Casa Civil under a new governance model for the climate agenda, at the request of the Casa Civil and Minister Palocci. The Casa Civil as the conductor, and the MCT and the MMA as the other two legs. We want to do away with the silos, so that there is convergence with the national agenda – Izabella said.

Department to bring topics together

At the MMA, the secretariat will be turned into a super-department, expected to be called the Climate Secretariat, which will take on new topics such as anti-deforestation policies, biodiversity conservation, and forest and water-resource management.

Under Environment Minister Carlos Minc, in 2009, the MMA clashed with the Foreign Ministry (Itamaraty) and the MCT. Minc pushed for Brazil to adopt emissions-reduction targets, while the other two ministries defended a more conservative position. In the areas to be absorbed by the new secretariat, staff complain that Izabella did not consult those most affected about the new direction their work should take.

At Science and Technology, staff working on climate change are resisting the restructuring brought about by the appointment of the researcher Carlos Nobre to the Secretariat of Research and Development Policies and Programs. For the conference to be held in Bangkok in April, one of the preparatory meetings for the annual UN summit, the MCT still has no team to send. One of the changes Nobre has already announced affects the ministry’s pet project in this area: the Clean Development Mechanism (CDM), which accredits emissions-reducing projects to receive credits that can be traded on the carbon market.

The shift responds to one of the main criticisms from developers of emissions-reducing projects: that the process for approving these proposals is excessively slow.

– We will take a fresh look at the CDM. We will loosen the rules and make it more agile. Brazil is in a position to lead the transition to a low-carbon economy, alongside Scandinavia and Germany – the secretary said.

For Adalberto Veríssimo, a researcher at the Amazon Institute of People and the Environment (Imazon), the government’s decision to put the Casa Civil in charge of coordinating climate policy is good news. He argues that global warming is a problem that cuts across different policy areas and therefore should sit at the top of the executive hierarchy.

– Putting Palocci in charge of coordinating climate change is the right move. This is a cross-cutting issue that matters to several areas, such as Mines and Energy, Transport, and Agriculture. It is a task that will require balance. Global warming is one of the pillars of this decade’s debate. Placing it in the Casa Civil is a sign that Brazil wants to keep advancing in this area – he said.

On the ministry reshuffles, Veríssimo said Nobre’s arrival “oxygenates” the debate within the MCT, which, in his view, had been staffed with backward-looking cadres.

– I sensed the MCT and the MMA speaking the same language. It was the first time I saw that happen – Veríssimo said.

Marina criticizes licensing plans

Former senator Marina Silva (PV) yesterday criticized the federal government’s idea of loosening environmental licensing to speed up infrastructure projects. She spoke before learning that Palocci would handle the climate agenda.

– I view with concern this talk of changing the environmental licensing process. I think any change of that nature, in the direction of loosening it, will only worsen the problems we are experiencing. Licensing plays an important role in reducing and minimizing the environmental impact of a project – said the defeated presidential candidate, after taking part in an inaugural lecture for the graduate program of the National Institute for Space Research (Inpe) in São José dos Campos.

The government’s proposals will be implemented through decrees regulating the licensing of highways, ports, power transmission lines, waterways, and pre-salt oil exploration projects.
(O Globo)

>A new hurricane in Brazil (JC, O Globo)

Inmet, the Navy, and Inpe disagree over storm Arani, which is battering the coast.

JC e-mail 4218, March 16, 2011.

A weather phenomenon that has brought intense rain from northern Rio de Janeiro state to southern Bahia has divided the country’s main meteorological agencies. The National Institute of Meteorology (Inmet) calls Arani, as it was named, a hurricane. In a special alert, the institute highlighted winds of up to 120 km/h over the Atlantic Ocean.

The diagnosis, however, is not shared by the Brazilian Navy, which classifies the same phenomenon as a subtropical storm – one severity level lower – nor by the National Institute for Space Research (Inpe), which says it is a tropical depression – another step down the danger scale.

Phenomenon moves away from the Brazilian coast

Arani (“furious weather” in Tupi) formed from the combination of warm water and warm air in an area of strong instability near the coast of Espírito Santo. The system produced a cyclonic wind circulation as well as heavy rainfall in that state. The danger was contained because the formation is over the open sea and, over the next two days, should head southeast, moving even farther from the Brazilian coast.

According to Inmet, Arani gained strength as it moved away from the coast, taking on the characteristics of a hybrid hurricane. It is a different kind of formation from those that typically devastate the Caribbean and the North Atlantic: instead of an independent system feeding on warm sea water, it is associated with a cyclone that originated from a cold front.

The hurricane is 110 kilometers off the Brazilian coast and poses a threat only to vessels and aircraft in the region of Cabo de São Tomé, on the Rio coast, which lies on its path to the open ocean. In the coming days, Arani should reach international waters, where monitoring will fall to South Africa.

Inmet classified the phenomenon with help from American hurricane-monitoring agencies. According to meteorologist Morgana Almeida of the institute’s team, there is no risk that the phenomenon’s current trajectory will reverse and bring damage to the continent. The institute alerted Navy authorities, who took steps to keep traffic out of the area hit by the strong winds.

But the Navy’s own Meteorological Service classifies Arani differently. It recorded gusts of at most 80 km/h. There is heavy rainfall over the open sea, but the waves it is producing, of 3 to 4 meters, are the same size as those generated by a cold front.

– Formations like this are not common, but they can occur in summer – notes meteorologist Caroline Vidal Ferreira da Guia of Inpe. – Arani is strong enough to cause trouble for the population, but, according to our measurements, it does not qualify as a hurricane.
(O Globo)

>Living in Denial: Climate Change, Emotions, and Everyday Life (MIT Press)

Book release (April 2011, MIT Press):

Kari Marie Norgaard

Global warming is the most significant environmental issue of our time, yet public response in Western nations has been meager. Why have so few taken any action? In Living in Denial, sociologist Kari Norgaard searches for answers to this question, drawing on interviews and ethnographic data from her study of “Bygdaby,” the fictional name of an actual rural community in western Norway, during the unusually warm winter of 2001-2002.

In 2001-2002 the first snowfall came to Bygdaby two months later than usual; ice fishing was impossible; and the ski industry had to invest substantially in artificial snow-making. Stories in local and national newspapers linked the warm winter explicitly to global warming. Yet residents did not write letters to the editor, pressure politicians, or cut down on use of fossil fuels. Norgaard attributes this lack of response to the phenomenon of socially organized denial, by which information about climate science is known in the abstract but disconnected from political, social, and private life, and sees this as emblematic of how citizens of industrialized countries are responding to global warming.

Norgaard finds that for the highly educated and politically savvy residents of Bygdaby, global warming was both common knowledge and unimaginable. Norgaard traces this denial through multiple levels, from emotions to cultural norms to political economy. Her report from Bygdaby, supplemented by comparisons throughout the book to the United States, tells a larger story behind our paralysis in the face of today’s alarming predictions from climate scientists.

About the Author

Kari Marie Norgaard is Assistant Professor of Sociology at the University of Oregon.

>Ancient Catastrophic Drought Leads to Question: How Severe Can Climate Change Become? (NSF)

Press Release 11-039

Extreme megadrought in Afro-Asian region likely had consequences for Paleolithic cultures

A boat on Lake Tanganyika today; the lake’s ancient surface water level fell dramatically.
Credit: Curt Stager.

February 24, 2011
How severe can climate change become in a warming world?

Worse than anything we’ve seen in written history, according to results of a study appearing this week in the journal Science.

An international team of scientists led by Curt Stager of Paul Smith’s College, New York, has compiled four dozen paleoclimate records from sediment cores in Lake Tanganyika and other locations in Africa.

The records show that one of the most widespread and intense droughts of the last 50,000 years or more struck Africa and Southern Asia 17,000 to 16,000 years ago.

Between 18,000 and 15,000 years ago, large amounts of ice and meltwater entered the North Atlantic Ocean, causing regional cooling but also major drought in the tropics, says Paul Filmer, program director in the National Science Foundation’s (NSF) Division of Earth Sciences, which funded the research along with NSF’s Division of Atmospheric and Geospace Sciences and its Division of Ocean Sciences.

“The height of this time period coincided with one of the most extreme megadroughts of the last 50,000 years in the Afro-Asian monsoon region with potentially serious consequences for the Paleolithic humans that lived there at the time,” says Filmer.

The “H1 megadrought,” as it’s known, was one of the most severe climate trials ever faced by anatomically modern humans.

Africa’s Lake Victoria, now the world’s largest tropical lake, dried out, as did Lake Tana in Ethiopia, and Lake Van in Turkey.

The Nile, Congo and other major rivers shriveled, and Asian summer monsoons weakened or failed from China to the Mediterranean, meaning the monsoon season carried little or no rainwater.

What caused the megadrought remains a mystery, but its timing suggests a link to Heinrich Event 1 (or “H1”), a massive surge of icebergs and meltwater into the North Atlantic at the close of the last ice age.

Previous studies had implicated southward drift of the tropical rain belt as a localized cause, but the broad geographic coverage in this study paints a more nuanced picture.

“If southward drift were the only cause,” says Stager, lead author of the Science paper, “we’d have found evidence of wetting farther south. But the megadrought hit equatorial and southeastern Africa as well, so the rain belt didn’t just move–it also weakened.”

Climate models have yet to simulate the full scope of the event.

The lack of a complete explanation opens the question of whether an extreme megadrought could strike again as the world warms and de-ices further.

“There’s much less ice left to collapse into the North Atlantic now,” Stager says, “so I’d be surprised if it could all happen again–at least on such a huge scale.”

Given what such a catastrophic megadrought could do to today’s most densely populated regions of the globe, Stager hopes he’s right.

Stager also holds an adjunct position at the Climate Change Institute, University of Maine, Orono.

Co-authors of the paper are David Ryves of Loughborough University in the United Kingdom; Brian Chase of the Institut des Sciences de l’Evolution de Montpellier in France and the Department of Archaeology, University of Bergen, Norway; and Francesco Pausata of the Geophysical Institute, University of Bergen, Norway.


>Can a group of scientists in California end the war on climate change? (Guardian)

The Berkeley Earth project say they are about to reveal the definitive truth about global warming

Ian Sample
Sunday 27 February 2011 20.29 GMT

Richard Muller of the Berkeley Earth project is convinced his approach will lead to a better assessment of how much the world is warming. Photograph: Dan Tuffs for the Guardian

In 1964, Richard Muller, a 20-year-old graduate student with neat-cropped hair, walked into Sproul Hall at the University of California, Berkeley, and joined a mass protest of unprecedented scale. The activists, a few thousand strong, demanded that the university lift a ban on free speech and ease restrictions on academic freedom, while outside on the steps a young folk-singer called Joan Baez led supporters in a chorus of We Shall Overcome. The sit-in ended two days later when police stormed the building in the early hours and arrested hundreds of students. Muller was thrown into Oakland jail. The heavy-handedness sparked further unrest and, a month later, the university administration backed down. The protest was a pivotal moment for the civil liberties movement and marked Berkeley as a haven of free thinking and fierce independence.

Today, Muller is still on the Berkeley campus, probably the only member of the free speech movement arrested that night to end up with a faculty position there – as a professor of physics. His list of publications is testament to the free rein of tenure: he worked on the first light from the big bang, proposed a new theory of ice ages, and found evidence for an upturn in impact craters on the moon. His expertise is highly sought after. For more than 30 years, he was a member of the independent Jason group that advises the US government on defence; his college lecture series, Physics for Future Presidents, was voted best class on campus, went stratospheric on YouTube and, in 2009, was turned into a bestseller.

For the past year, Muller has kept a low profile, working quietly on a new project with a team of academics hand-picked for their skills. They meet on campus regularly, to check progress, thrash out problems and hunt for oversights that might undermine their work. And for good reason. When Muller and his team go public with their findings in a few weeks, they will be muscling in on the ugliest and most hard-fought debate of modern times.

Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet’s temperature.

Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller’s dream seem so ambitious, or perhaps, so naive.

“We are bringing the spirit of science back to a subject that has become too argumentative and too contentious,” Muller says, over a cup of tea. “We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find.” Why does Muller feel compelled to shake up the world of climate change? “We are doing this because it is the most important project in the world today. Nothing else comes close,” he says.

Muller is moving into crowded territory with sharp elbows. There are already three heavyweight groups that could be considered the official keepers of the world’s climate data. Each publishes its own figures that feed into the UN’s Intergovernmental Panel on Climate Change. Nasa’s Goddard Institute for Space Studies in New York City produces a rolling estimate of the world’s warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth’s mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.

You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.

Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia’s Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller’s nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. “With CRU’s credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics,” says Muller.

This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. “Scientists will jump to the defence of alarmists because they don’t recognise that the alarmists are exaggerating,” Muller says.

The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. “You can think of statisticians as the keepers of the scientific method,” Brillinger told me. “Can scientists and doctors reasonably draw the conclusions they are setting down? That’s what we’re here for.”

For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious “dark energy” that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.

Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde’s achievement to Hercules’s enormous task of cleaning the Augean stables.

The wealth of data Rohde has collected so far – and some dates back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.

Publishing an extensive set of temperature records is the first goal of Muller’s project. The second is to turn this vast haul of data into an assessment of global warming. Here, the Berkeley team is going its own way again. The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller’s team will take temperature records from individual stations and weight them according to how reliable they are.
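The difference between the two approaches can be sketched in a few lines. This is purely illustrative, not Berkeley Earth’s actual code: the station values, grid cells and reliability weights below are invented to show how grid-cell averaging and reliability weighting can give different answers for the same month of data.

```python
# Toy comparison of the two averaging strategies described above.
# All numbers are invented for illustration.
from collections import defaultdict

# (grid_cell, temperature_C, reliability_weight) for one month
stations = [
    ("england", 9.1, 1.0),
    ("england", 9.5, 0.4),  # e.g. a poorly sited station gets less weight
    ("wales",   8.2, 1.0),
]

def grid_average(stations):
    """Big-three style: average stations within each grid cell, then average the cells."""
    cells = defaultdict(list)
    for cell, temp, _ in stations:
        cells[cell].append(temp)
    return sum(sum(v) / len(v) for v in cells.values()) / len(cells)

def weighted_average(stations):
    """Berkeley style (as described): weight each station by an estimate of its reliability."""
    total_weight = sum(w for _, _, w in stations)
    return sum(t * w for _, t, w in stations) / total_weight

print(round(grid_average(stations), 2))
print(round(weighted_average(stations), 2))
```

With these toy numbers the two methods disagree slightly, because the weighted average discounts the suspect station instead of letting it dominate its grid cell.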

This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.

Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in Celsius instead of Fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station from wind and sun more in one year than the next. Each of these interferes with a station’s temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.

This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn’t rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.
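One simple form such an automatic correction can take, sketched below under stated assumptions: when a documented station move introduces a step change in a record, an algorithm can estimate the offset from the means on either side of the break and remove it. This is an assumption about the general technique, not the Berkeley team’s actual algorithm, and the data are invented.

```python
# Minimal sketch of automated step-change correction for a station record.
# Not Berkeley Earth's algorithm; a generic illustration of the idea.

def correct_step(series, break_index):
    """Align the segment after a documented break with the segment before it."""
    before = series[:break_index]
    after = series[break_index:]
    # Estimated offset introduced at the break (e.g. by a station move)
    offset = sum(after) / len(after) - sum(before) / len(before)
    return before + [x - offset for x in after]

# Invented record: a station move at index 4 adds a spurious jump of about 1.5C
raw = [10.0, 10.2, 9.9, 10.1, 11.6, 11.4, 11.5, 11.7]
fixed = correct_step(raw, 4)
```

Real homogenisation has to find undocumented breaks too, usually by comparing a station against its neighbours, which is where the fiendish complexity comes in.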

Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. “I’ve told the team I don’t know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else,” says Muller. “Science has its weaknesses and it doesn’t have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have.”

He will find out soon enough if his hopes to forge a true consensus on climate change are misplaced. It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn’t followed the project either, but said “anything that [Muller] does will be well done”. Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn’t comment.

Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley’s global warming assessment and those from the other groups. “We have enough trouble communicating with the public already,” Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.

Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.

Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says.

For the time being, Muller’s project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them “without advocacy or agenda”. Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy’s Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a “kingpin of climate science denial”. On this point, Muller says the project has taken money from right and left alike.

No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. “As new kids on the block, I think they will be given a favourable view by people, but I don’t think it will fundamentally change people’s minds,” says Thorne. Brillinger has reservations too. “There are people you are never going to change. They have their beliefs and they’re not going to back away from them.”

Walking across the Berkeley campus, Muller stops outside Sproul Hall, where he was arrested more than 40 years ago. Today, the adjoining plaza is a designated protest spot, where student activists gather to wave banners, set up tables and make speeches on any cause they choose. Does Muller think his latest project will make any difference? “Maybe we’ll find out that what the other groups do is absolutely right, but we’re doing this in a new way. If the only thing we do is allow a consensus to be reached as to what is going on with global warming, a true consensus, not one based on politics, then it will be an enormously valuable achievement.”

>Can Geoengineering Save the World from Global Warming? (Scientific American)

Ask the Experts | Energy & Sustainability
Scientific American

Is manipulating Earth’s environment to combat climate change a good idea–and where, exactly, did the idea come from?

By David Biello | February 25, 2011

STARFISH PRIME: This nighttime atmospheric nuclear weapons test generated an aurora (pictured) in Earth’s magnetic field, along with an electromagnetic pulse that blew out streetlights in Honolulu. It is seen as an early instance of geoengineering by science historian James Fleming. Image: Courtesy of US Govt. Defense Threat Reduction Agency

As efforts to combat climate change falter despite ever-rising concentrations of heat-trapping gases in the atmosphere, some scientists and other experts have begun to consider the possibility of using so-called geoengineering to fix the problem. Such “deliberate, large-scale manipulation of the planetary environment,” as the Royal Society of London puts it, is fraught with peril, of course.

For example, one of the first scientists to predict global warming as a result of increasing concentrations of greenhouse gases in the atmosphere—Swedish chemist Svante Arrhenius—thought this might be a good way to ameliorate the winters of his native land and increase its growing season. Whereas that may come true for the human inhabitants of Scandinavia, polar plants and animals are suffering as sea ice dwindles and temperatures warm even faster than climatologists predicted.

Scientific American corresponded with science historian James Fleming of Colby College in Maine, author of Fixing the Sky: The Checkered History of Weather and Climate Control, about the history of geoengineering—ranging from filling the air with the artificial aftermath of a volcanic eruption to seeding the oceans with iron in order to promote plankton growth—and whether it might save humanity from the ill effects of climate change.

[An edited transcript of the interview follows.]

What is geoengineering in your view?
Geoengineering is planetary-scale intervention [in]—or tinkering with—planetary processes. Period.

As I write in my book, Fixing the Sky: The Checkered History of Weather and Climate Control, “the term ‘geoengineering’ remains largely undefined,” but is loosely, “the intentional large-scale manipulation of the global environment; planetary tinkering; a subset of terraforming or planetary engineering.”

As of June 2010 the term has a draft entry in the Oxford English Dictionary—the modification of the global environment or the climate in order to counter or ameliorate climate change. A 2009 report issued by the Royal Society of London defines geoengineering as “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change.”

But there are significant problems with both definitions. First of all, an engineering practice defined by its scale (geo) need not be constrained by its stated purpose (environmental improvement), by any of its currently proposed techniques (stratospheric aerosols, space mirrors, etcetera) or by one of perhaps many stated goals (to ameliorate or counteract climate change). Nuclear engineers, for example, are capable of building both power plants and bombs; mechanical engineers can design components for both ambulances and tanks. So to constrain the essence of something by its stated purpose, techniques or goals is misleading at best.

Geo-scale engineering projects were conducted by both the U.S. and the Soviet Union between 1958 and 1962 that had nothing to do with countering or ameliorating climate change. Starting with the [U.S.’s] 1958 Argus A-bomb explosions in space and ending with the 1962 Starfish Prime H-bomb test, the militaries of both nations sought to modify the global environment for military purposes.

Project Argus was a top-secret military test aimed at detonating atomic bombs in space to generate an artificial radiation belt, disrupt the near-space environment, and possibly intercept enemy missiles. It, and the later tests conducted by both the U.S. and the Soviet Union, peaked with H-bomb detonations in space in 1962 that created an artificial [electro]magnetic [radiation] belt that persisted for 10 years. This is geoengineering.

This idea of detonating bombs in near-space was proposed in 1957 by Nicholas Christofilos, a physicist at Lawrence Berkeley National Laboratory. His hypothesis, which was pursued by the [U.S.] Department of Defense’s Advanced Research Projects Agency [subsequently known as DARPA] and tested in Project Argus and other nuclear shots, held that the debris from a nuclear explosion, mainly highly energetic electrons, would be contained within lines of force in Earth’s magnetic field and would travel almost instantly as a giant current spanning up to half a hemisphere. Thus, if a detonation occurred above a point in the South Atlantic, immense currents would flow along the magnetic lines to a point far to the north, such as Greenland, where they would severely disrupt radio communications. A shot in the Indian Ocean might, then, generate a huge electromagnetic pulse over Moscow. In addition to providing a planetary “energy ray,” Christofilos thought nuclear shots in space might also disrupt military communications, destroy satellites and the electronic guidance systems of enemy [intercontinental ballistic missiles], and possibly kill any military cosmonauts participating in an attack launched from space. He proposed thousands of them to make a space shield.

So nuclear explosions in space by the U.S. and the Soviet Union constituted some of the earliest attempts at geoengineering, or intentional human intervention in planetary-scale processes.

The neologism “geoengineer” refers to one who contrives, designs or invents at the largest planetary scale possible for either military or civilian purposes. Today, geoengineering, as an unpracticed art, may be considered “geoscientific speculation”. Geoengineering is a subset of terraformation, which also does not exist outside of the fantasies of some engineers.

I have recently written to the Oxford English Dictionary asking them to correct their draft definition.

Can geoengineering save the world from climate change?
In short, I think it may be infinitely more dangerous than climate change, largely due to the suspicion and social disruption it would trigger by changing humanity’s relationship to nature.

To take just one example from my book, on page 194: “Sarnoff Predicts Weather Control” read the headline on the front page of The New York Times on October 1, 1946. The previous evening, at his testimonial dinner at the Waldorf Astoria, RCA president Brig. Gen. David Sarnoff had speculated on worthy peaceful projects for the postwar era. Among them were “transformations of deserts into gardens through diversion of ocean currents,” a technique that could also be reversed in time of war to turn fertile lands into deserts, and ordering “rain or sunshine by pressing radio buttons,” an accomplishment that, Sarnoff declared, would require a “World Weather Bureau” in charge of global forecasting and control (much like the “Weather Distributing Administration” proposed in 1938). A commentator in The New Yorker intuited the problems with such control: “Who” in this civil service outfit, he asked, “would decide whether a day was to be sunny, rainy, overcast…or enriched by a stimulating blizzard?” It would be “some befuddled functionary,” probably bedeviled by special interests such as the raincoat and galoshes manufacturers, the beachwear and sunburn lotion industries, and resort owners and farmers. Or if a storm was to be diverted—”Detour it where? Out to sea, to hit some ship with no influence in Washington?”

How old is the idea of geoengineering? What other names has it had?
I can trace geoengineering’s direct modern legacy to 1945, and have prepared a table of such proposals and efforts for the [Government Accountability Office]. Nuclear weapons, digital computers and satellites seem to be the modern technologies of choice. Geoengineering has also been called terraformation and, more restrictively, climate engineering, climate intervention or climate modification. Many have proposed abandoning the term geoengineering in favor of solar radiation management and carbon (or carbon dioxide) capture and storage. Of course, the idea of control of nature is ancient—for example, Phaeton or Archimedes.

Phaeton, the son of Helios, received permission from his father [the Greek sun god] to drive the sun chariot, but failed to control it, putting the Earth in danger of burning up. He was killed by a thunderbolt from Zeus to prevent further disaster. Recently, a prominent meteorologist has written about climate control and urged us to “take up Phaeton’s reins,” which is not a good idea.

Archimedes is known as an engineer who said: “Give me a lever long enough and a place to stand, and I will move the Earth.” Some geoengineers think that this is now possible and that science and technology have given us an Archimedean set of levers with which to move the planet. But I ask: “Where will it roll if you tip it?”

How are weather control and climate control related?
Weather and climate are intimately related: Weather is the state of the atmosphere at a given place and time, while climate is the aggregate of weather conditions over time. A vast body of scientific literature addresses these interactions. In addition, historians are revisiting the ancient but elusive term klima, seeking to recover its multiple social connotations. Weather, climate and the climate of opinion matter in complex ways that invite—some might say require or demand—the attention of both scientists and historians. Yet some may wonder how weather and climate are interrelated rather than distinct. Both, for example, are at the center of the debate over greenhouse warming and hurricane intensity. A few may claim that rainmaking, for example, has nothing to do with climate engineering, but any intervention in the Earth’s radiation or heat budget (such as managing solar radiation) would affect the general circulation and thus the location of upper-level patterns, including the jet stream and storm tracks. Thus, the weather itself would be changed by such manipulation. Conversely, intervening in severe storms by changing their intensity or their tracks or modifying weather on a scale as large as a region, a continent or the Pacific Basin would obviously affect cloudiness, temperature and precipitation patterns with major consequences for monsoonal flows, and ultimately the general circulation. If repeated systematically, such interventions would influence the overall heat budget and the climate.

Both weather and climate control have long and checkered histories: My book explains [meteorologist] James Espy’s proposal in the 1830s to set fire to the crest of the Appalachian Mountains every Sunday evening to generate heated updrafts that would stimulate rain and clear the air for cities of the east coast. It also examines efforts to fire cannons at the clouds in the arid Southwest in the hope of generating rain by concussion.

In the 1920s airplanes loaded with electrified sand were piloted by military aviators who “attacked” the clouds in futile attempts to both make rain and clear fog. Many others have proposed either a world weather control agency or creating a global thermostat, either by burning vast quantities of fossil fuels if an ice age threatened or sucking the CO2 out of the air if the world overheated.

After 1945 three technologies—nuclear weapons, digital computers and satellites—dominated discussions about ultimate weather and climate control, but with very little acknowledgement that unintended consequences and social disruption may be more damaging than any presumed benefit.

What would be the ideal role for geoengineering in addressing climate change?
That it generates interest in and awareness of the impossibility of heavy-handed intervention in the climate system, since there could be no predictable outcome of such intervention, physically, politically or socially.

Why do scientists continue to pursue this then, after 200 or so years of failure?
Science fantasy is informed by science fiction and driven by hubris. One of the dictionary definitions of hubris cites Edward Teller (the godfather of modern geoengineering).

Teller’s hubris knew no bounds. He was the [self-proclaimed] father of the H-bomb and promoted all things atomic, even talking about using nuclear weapons to create canals and harbors. He was also an advocate of urban sprawl to survive nuclear attack, the Star Wars [missile] defense system, and a planetary sunscreen to reduce global warming. He wanted to control nature and improve it using technology.

Throughout history rainmakers and climate engineers have typically fallen into two categories: commercial charlatans using technical language and proprietary techniques to cash in on a gullible public, and sincere but deluded scientific practitioners exhibiting a modicum of chemical and physical knowledge, a bare minimum of atmospheric insight, and an abundance of hubris. We should base our decision-making not on what we think we can do “now” and in the near future. Rather, our knowledge is shaped by what we have and have not done in the past. Such are the grounds for making informed decisions and avoiding the pitfalls of rushing forward, claiming we know how to “fix the sky.”

>What we have and haven’t learned from ‘Climategate’

BY David Roberts
28 FEB 2011 1:29 PM

I wrote about the “Climategate” controversy (over emails stolen from the University of East Anglia’s Climatic Research Unit) once, which is about what it warranted.

My silent protest had no effect whatsoever, of course, and the story followed a depressingly familiar trajectory: hyped relentlessly by right-wing media, bullied into the mainstream press as he-said she-said, and later, long after the damage is done, revealed as utterly bereft of substance. It’s a familiar script for climate faux controversies, though this one played out on a slightly grander scale.

Investigations galore

Consider that there have now been five, count ‘em five, inquiries into the matter. Penn State established an independent inquiry into the accusations against scientist Michael Mann and found “no credible evidence” [PDF] of improper research conduct. A British government investigation run by the House of Commons’ Science and Technology Committee found that while the CRU scientists could have been more transparent and responsive to freedom-of-information requests, there was no evidence of scientific misconduct. The U.K.’s Royal Society (its equivalent of the National Academies) ran an investigation that found “no evidence of any deliberate scientific malpractice.” The University of East Anglia appointed respected civil servant Sir Muir Russell to run an exhaustive, six-month independent inquiry; he concluded that “the honesty and rigour of CRU as scientists are not in doubt … We have not found any evidence of behaviour that might undermine the conclusions of the IPCC assessments.”

All those results are suggestive, but let’s face it, they’re mostly … British. Sen. James Inhofe (R-Okla.) wanted an American investigation of all the American scientists involved in these purported dirty deeds. So he asked the Department of Commerce’s inspector general to get to the bottom of it. On Feb. 18, the results of that investigation were released. “In our review of the CRU emails,” the IG’s office said in its letter to Inhofe [PDF], “we did not find any evidence that NOAA inappropriately manipulated data … or failed to adhere to appropriate peer review procedures.” (Oddly, you’ll find no mention of this central result in Inhofe’s tortured public response.)

Whatever legitimate issues there may be about the responsiveness or transparency of this particular group of scientists, there was nothing in this controversy — nothing — that cast even the slightest doubt on the basic findings of climate science. Yet it became a kind of stain on the public image of climate scientists. How did that happen?

Smooth criminals

You don’t hear about it much in the news coverage, but recall, the story began with a crime. Hackers broke into the East Anglia email system and stole emails and documents, an illegal invasion of privacy. Yet according to The Wall Street Journal’s Kim Strassel, the emails “found their way to the internet.” In ABC science correspondent Ned Potter’s telling, the emails “became public.” The New York Times’ Andy Revkin says they were “extracted from computers.”

None of those phrasings is wrong, per se, but all pass rather lightly over the fact that some actual person or persons put them on the internet, made them public, extracted them from the computers. Someone hacked in, collected emails, sifted through and selected those that could be most damning, organized them, and timed the release for maximum impact, just before the Copenhagen climate talks. Said person or persons remain uncaught, uncharged, and unprosecuted. There have since been attempted break-ins at other climate research institutions.

If step one was crime, step two was character assassination. When the emails were released, they were combed over by skeptic blogs and right-wing media, who collected sentences, phrases, even individual terms that, when stripped of all context, create the worst possible impression. Altogether the whole thing was as carefully staged as any modern-day political attack ad.

Yet when the “scandal” broke, rather than being about criminal theft and character assassination, it was instantly “Climategate.” It was instantly about climate scientists, not the illegal and dishonest tactics of their attackers. The scientists, not the ideologues and ratf*ckers, had to defend themselves.

Burden of proof

It’s a numbingly familiar pattern in media coverage. The conservative movement that’s been attacking climate science for 20 years has a storied history of demonstrable fabrications, distortions, personal attacks, and nothingburger faux-scandals — not only on climate science, but going back to asbestos, ozone, leaded gasoline, tobacco, you name it. They don’t follow the rigorous standards of professional science; they follow no intellectual or ethical standards whatsoever. Yet no matter how long their record of viciousness and farce, every time the skeptic blogosphere coughs up a new “ZOMG!” it’s as though we start from zero again, like no one has a memory longer than five minutes.

Here’s the basic question: At this point, given their respective accomplishments and standards, wouldn’t it make sense to give scientists the strong benefit of the doubt when they are attacked by ideologues with a history of dishonesty and error? Shouldn’t the threshold for what counts as a “scandal” have been nudged a bit higher?

Agnotological inquiry

The lesson we’ve learned from climategate is simple. It’s the same lesson taught by death panels, socialist government takeover, Sharia law, and Obama’s birth certificate. To understand it we must turn to agnotology, the study of culturally induced ignorance or doubt. (Hat tip to an excellent recent post on this by John Quiggin.)

Beck, Palin, and the rest of Fox News and talk radio operate on the pretense that they are giving consumers access to a hidden “universe of reality,” to use Limbaugh’s term. It’s a reality being actively obscured by the “lamestream media,” academics, scientists, and government officials. Affirming the tenets of that secret reality has become an act of tribal reinforcement, the equivalent of a secret handshake.

The modern right has created a closed epistemic loop containing millions of people. Within that loop, the implausibility or extremity of a claim itself counts as evidence. The more liberal elites reject it, the more it entrenches itself. Standards of evidence have nothing to do with it.

The notion that there is a global conspiracy by professional scientists to falsify results in order to get more research money is, to borrow Quiggin’s words about birtherism, “a shibboleth, that is, an affirmation that marks the speaker as a member of their community or tribe.” Once you have accepted that shibboleth, anything offered to you as evidence of its truth, no matter how ludicrous, will serve as affirmation. (Even a few context-free lines cherry-picked from thousands of private emails.)

Living with the loop

There’s one thing we haven’t learned from climategate (or death panels or birtherism). U.S. politics now contains a large, well-funded, tightly networked, and highly amplified tribe that defines itself through rejection of “lamestream” truth claims and standards of evidence. How should our political culture relate to that tribe?

We haven’t figured it out. Politicians and the political press have tried to accommodate the shibboleths of the right as legitimate positions for debate. The press in particular has practically sworn off plain judgments of accuracy or fact. But all that’s done is confuse and mislead the broader public, while the tribe pushes ever further into extremity. The tribe does not want to be accommodated. It is fueled by elite rejection.

At this point mainstream institutions like the press are in a bind: either accept the tribe’s assertions as legitimate or be deemed “biased.” Until there is a way out of that trap, there will be more and more Climategates.

Fact-Free Science (N.Y. Times)


Published: February 25, 2011

Photo: Camille Seaman.

President Obama has made scientific innovation the cornerstone of his plans for “winning the future,” requesting in his recent budget proposal large financing increases for scientific research and education and, in particular, sustained attention to developing alternative energy sources and technologies. “This is our generation’s Sputnik moment,” he declared in his State of the Union address last month.

It would be easier to believe in this great moment of scientific reawakening, of course, if more than half of the Republicans in the House and three-quarters of Republican senators did not now say that the threat of global warming, as a man-made and highly threatening phenomenon, is at best an exaggeration and at worst an utter “hoax,” as James Inhofe of Oklahoma, the ranking Republican on the Senate Environment and Public Works Committee, once put it. These grim numbers, compiled by the Center for American Progress, describe a troubling new reality: the rise of the Tea Party and its anti-intellectual, anti-establishment, anti-elite worldview has brought both a mainstreaming and a radicalization of antiscientific thought.

The politicization of science isn’t particularly new; the Bush administration was famous for pressuring government agencies to bring their vision of reality in line with White House imperatives. In response to this, and with a renewed culture war over the very nature of scientific reality clearly brewing, the Obama administration tried to initiate a pre-emptive strike earlier this winter, issuing a set of “scientific integrity” guidelines aimed at keeping the work of government scientists free from ideological pollution. But since taking over the House of Representatives, the Republicans have packed science-related committees with lawmakers who reject such basic findings as the reality of global warming and the threats of climate change. Fred Upton, the head of the House Energy and Commerce Committee, has said outright that he does not believe that global warming is man-made. John Shimkus of Illinois, who also sits on the committee — as well as on the Subcommittee on Energy and Environment — has said that the government doesn’t need to make a priority of regulating greenhouse-gas emissions, because as he put it late last year, “God said the earth would not be destroyed by a flood.”

Source: Gallup

Whoever emerges as the Republican presidential candidate in 2012 will very likely have to embrace climate-change denial. Mitt Romney, Tim Pawlenty and Mike Huckabee, all of whom once expressed some support for action on global warming, have notably distanced themselves from these views. Saying no to mainstream climate science, notes Daniel J. Weiss, a senior fellow and director of climate strategy for the Center for American Progress, is now a required practice for Republicans eager to play to an emboldened conservative base. “Opposing the belief that global warming is human-caused has become systematic, like opposition to abortion,” he says. “It’s seen as another way for government to control people’s lives. It’s become a cultural issue.”

That taking on the scientific establishment has become a favored activity of the right is quite a turnabout. After all, questioning accepted fact, revealing the myths and politics behind established certainties, is a tactic straight out of the left-wing playbook. In the 1960s and 1970s, the pushback against scientific authority brought us the patients’ rights movement and was a key component of women’s rights activism. That questioning of authority veered in a more radical direction in the academy in the late 1980s and early 1990s, when left-wing scholars doing “science studies” increasingly began taking on the very idea of scientific truth.

This was the era of the culture wars, the years when the conservative University of Chicago philosopher Allan Bloom warned in his book “The Closing of the American Mind” of the dangers of liberal know-nothing relativism. But somehow, in the passage from Bush I to Bush II and beyond, the politics changed. By the mid-1990s, even some progressives said that the assault on truth, particularly scientific truth, had gone too far, a point made most famously in 1996 by the progressive New York University physicist Alan Sokal, who managed to trick the left-wing academic journal Social Text into printing a tongue-in-cheek article, written in an overblown parody of dense academic jargon, that argued that physical reality, as we know it, may not exist.

Illustration: Nomoco

Following the Sokal hoax, many on the academic left experienced some real embarrassment. But the genie was out of the bottle. And as the political zeitgeist shifted, attacking science became a sport of the radical right. “Some standard left arguments, combined with the left-populist distrust of ‘experts’ and ‘professionals’ and assorted high-and-mighty muckety-mucks who think they’re the boss of us, were fashioned by the right into a powerful device for delegitimating scientific research,” Michael Bérubé, a literature professor at Pennsylvania State University, said of this evolution recently in the journal Democracy. He quoted the disillusioned French theorist Bruno Latour, a pioneer of science studies who was horrified by the climate-change-denying machinations of the right: “Entire Ph.D. programs are still running to make sure that good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth . . . while dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives.”

Some conservatives argue that the Republican war on science is bad politics and that catering to the “climate-denier sect” in the party is a dangerous strategy, as David Jenkins, a member of Republicans for Environmental Protection, wrote recently on the FrumForum blog. Public opinion, after all, has not kept pace with Republican rhetoric on the topic of climate change. A USA Today/Gallup poll conducted in January found that 83 percent of Americans want Congress to pass legislation promoting alternative energy, and a recent poll by the Opinion Research Corporation found that almost two-thirds want the Environmental Protection Agency to be more aggressive.

For those who have staked out extreme positions, backtracking may not be easy: “It is very difficult to get a man to understand something when his tribal identity depends on his not understanding it,” Bérubé notes. Maybe it’s time for some new identity politics.

Judith Warner is the author, most recently, of “We’ve Got Issues: Children and Parents in the Age of Medication.”

>‘Rapid Response Team’ Pairs Scientists and Media (The Yale Forum on Climate Change & The Media)

By Lisa Palmer | February 16, 2011
The Yale Forum on Climate Change & The Media

Think of it as the climate scientists/journalists version of “eHarmony.” A volunteer website launched by scientists serves as a matchmaking venue for media outlets and government officials looking for input on climate science topics.

It’s a Friday morning and Scott Mandia is scanning the Climate Science Rapid Response Team e-mail inbox he shares with two other climate science match-makers.

Today, on Mandia’s watch, a message from a journalist arrives at 5:30 a.m. It’s the first of two or three media requests he’ll likely get this day. Mandia’s task now? Ask for a response from one of 135 scientists in his network most qualified to answer the question.

Mandia, a professor of physical sciences at Suffolk County Community College, in New York, and his fellow Rapid Response founders, John Abraham, associate professor of thermodynamics at St. Thomas University, and Ray Weymann, a California-based retired astronomer and member of the National Academy of Sciences, take shifts. Each is a volunteer custodian of e-mail requests that flow in from their climate change match-making website connecting climate scientists with lawmakers and media outlets.

Launched in November 2010, the website tries to narrow the information gap between scientific understanding of climate change and what the public knows. Scientists involved with the group are screened and selected on an invitation-only basis. The experts come from a range of climate change science specialties, everything from climate modeling researchers and ecologists to economists and policy experts. Most are university faculty members or employees of government laboratories. It’s not a collection that most climate “contrarians” might be comfortable with.

The all-volunteer group promises to respond quickly to media requests to make sure science is portrayed accurately in the day’s news. They say turnaround time for requests is as fast as two hours for media operating on a short deadline.

“The scientists became members of our group because they understand that, as scientists, they have a responsibility to engage the public by engaging the media,” Mandia said in a phone interview. Mandia said he and his colleagues operate the service with no funding, and the website design was donated by Richard Hawkins, director of the Public Interest Research Centre in the United Kingdom.

Early on a Confusing Mix-up with AGU Media Project

Coincidentally, the Climate Science Rapid Response Team website debuted at the same time as the relaunch of the American Geophysical Union’s Climate Q and A service, which has similarities with the Rapid Response Team but strictly limits questions to matters of science. (See Yale Forum related story.) Some confusion ensued when the Los Angeles Times erroneously reported a link between the AGU’s group and the Rapid Response volunteers, and AGU staff quickly initiated a damage-control effort in fear that some on Capitol Hill would find, based on the newspaper’s coverage, their effort overly politicized.

“When that (Los Angeles Times) story came out, it sounded like scientists were fighting back against politicians. We are not advocates about policy, but it made us look like we were the 98 pound weaklings getting sand kicked in their face,” said Mandia. But the bad press proved a boon to increase the numbers involved in the Rapid Response force.

“Scientists then realized they were being criticized unfairly and wanted to get involved,” said Mandia. The number of scientists involved with the Rapid Response Team quadrupled.

The AGU’s Q and A service was first formed to support media requests during the United Nations Climate Change Conference in Copenhagen in December 2009. It started up again prior to the U.N. talks in Cancun. The service is open to anyone with a Ph.D. willing to provide scientific expertise on a subject.

“AGU is not a partisan organization. We are here to make our science available so there is good information available to the media,” AGU Executive Director Chris McEntee said in a telephone interview.

About 700 scientists are registered with AGU’s service, which has provided answers to 68 media outlets. “We think it is important that policymakers, media, and the public get unbiased, nonpartisan information when making a decision,” said McEntee. “The service fits with our mission to promote scientific discovery for the benefit of humanity.”

Scientists Step Up

Mandia said scientists involved with his effort are usually tapped once or twice a month for media inquiries. No single person carries the burden of too many repeat requests because the group has selected a range of scientists, vetted for their expertise in various disciplines. The Rapid Response Team also has promised confidentiality of its scientists, who can remain anonymous if they wish. But Mandia said that, despite the offer, “none of them has ever requested anonymity.”

Andrew Dessler, an atmospheric scientist at Texas A & M University, is affiliated with both information services, but is more involved with the Climate Science Rapid Response Team. He was prompted into action because “dealing with climate change misinformation is difficult to do on your own,” Dessler wrote in an e-mail. “Effectively responding to the denial machine absolutely requires coordinated action by the climate science community. In this way, I think the CCRRT [sic] is a model of how scientists can effectively spend their limited resources on outreach.”

Dessler gives the Rapid Response service high marks, especially for institutionalizing the response process from scientists and distributing the communications workload. “You have to realize the asymmetry here. For [some] so-called skeptics, spreading misinformation is their full-time job. Scientists, on the other hand, already have a full-time job: research and teaching. Thus, we need to have mechanisms to level the playing field, and the CCRRT [sic] is one such mechanism,” said Dessler, adding that he encourages scientists to get involved in public outreach. “Because we are mainly funded by tax dollars, I think we have a responsibility to repay this by spreading the results of our research as far and wide as possible.”

A Goal of Precise Pairing

As of early February, more than 100 media organizations — newspaper, magazine, online media, television, and radio — and government officials have used the service to find climate scientists who could comment on a story. Mainstream media users have included The New York Times, The Guardian (UK), CNN International, and American Public Media’s “Marketplace,” among many others. Mandia said many of the media questions in December had to do with severe weather in the United States and in Northern Europe.

The Rapid Response website includes testimonials from such reporters as Ben Webster, of The Times in London: “I asked a difficult question about ice cores and was impressed by the efforts the team made to find the right people to respond. The response was balanced, stating clearly what was known but also the uncertainties.”

Eli Kintisch, a reporter for Science and author of Hack the Planet (Wiley, 2010), called on the service when he was looking for a scientist to serve as a color commentator on a live blog he was producing for Science during a House hearing. Facing time constraints, Kintisch relied on the matchmakers for the legwork of finding someone to fill this role.

“I have my own batch of sources on climate that I have used to comment on stories, and I have used ProfNet in the past occasionally. But I was looking for someone who had some experience with public engagement and would be available for two to four hours,” Kintisch said in a telephone interview. “The hearing was a review of the basics of climate science, and there were some prominent contrarians testifying, so I thought it would be useful to have someone available who knew the basics of climate science.”

While not all climate scientists feel comfortable engaging with the media, they are finding ways to get more involved in communications. Mandia said, “Some scientists are nervous about speaking to the press and worry they will be misquoted, but getting out of the ‘Ivory Tower’ is becoming very important.”

Lisa Palmer is a Maryland-based freelance writer and a regular contributor to The Yale Forum.

>The Role of Trust in Climate Change Adaptation and Resilience: Can ICTs help?

February 27, 2011
By Angelica Valeria Ospina

Amidst the magnitude and uncertainty that characterizes the climate change field, trust is a topic that is often overlooked, despite being one of the cornerstones of resilience building and adaptive capacity.

Trust is an essential element of effective communication, networking and self-organisation, and thus indispensable to efforts to withstand and recover from climate change-related events, be they acute shocks or slow-onset trends. It is an equally important basis for vulnerable communities to adapt, and potentially to change, in the face of the largely unknown impacts of climatic occurrences.

Bound up with the beliefs, expectations and perceptions of reliability that hold between people and the institutions within which they operate or interact, trust often acts as an underlying cause of action or inaction, making it an important factor in decision-making processes.

With the rapid diffusion of Information and Communication Technologies (ICTs) such as mobile phones and the Internet, the unprecedented speed at which information is produced and shared poses a new set of possibilities (and challenges) for communication management and trust building, both essential to the development of resilience and adaptation to the changing climate.

Adaptation experiences suggest that vulnerable communities are more prone to act upon information that they can ‘trust’, a complex notion linked to factors such as the source of the information (and the local perception of it), the language used to convey the message, the role and credibility of ‘infomediaries’ or local facilitators who help disseminate the information, and the use of local appropriation mechanisms and community involvement, among others.

Climate change Adaptation Strategies and National Programmes of Action are increasingly called upon to foster trust-building processes by engaging local actors and gaining a better understanding of local needs and priorities. Trust building in the climate change field thus involves finding new collaborative spaces where the interests of all stakeholders can be heard, and where both scientific and traditional knowledge can be shared and built upon towards more effective adaptive practices and, potentially, transformation.

The widespread diffusion of ICTs (such as mobile phones, Internet access and even community radio) in developing country environments could be opening up new opportunities to use these tools in support of trust-building processes, a necessary step towards change and transformation.

So, how can ICTs help to build trust within climate change resilience and adaptation processes?

Research at the intersection of ICTs, climate change and development suggests several ways in which ICT tools can support trust:

  • Multi-level Communication: ICTs can facilitate communication and trust-building between and across actors at the micro (e.g. community members), meso (e.g. NGOs) and macro levels (e.g. policy makers), fostering participation in the design of adaptation -and mitigation- strategies, as well as accountability and monitoring during their implementation.
  • Network Strengthening: The role of social networks is key within processes of adaptation to climate change and resilience building, and trust is at the core of how networks function. The use of ICTs such as mobile phones can help strengthen communication and the bonds of trust within and among networks, which can in turn contribute to the effectiveness of community networks’ support and to access to resources.
  • Self-organisation: The ability to self-organize is a key attribute of resilient systems, and involves processes of collaboration that require trust among stakeholders and institutions. By facilitating access to information and resources through both point-to-multipoint and point-to-point exchange, ICTs can be important contributors to self-organisation and to the coordination of both preventive and reactive joint efforts in face of climatic events. They can help climate change actors to verify or double-check facts if the information source is not entirely trusted, diversifying their potential responses to the occurrence of climatic events. Additionally, ICTs can play a role towards trust by enabling the assessment of options and trade-offs involved in decision-making.
  • Appropriation and Infomediaries: The role of actors that ‘translate’ or ‘mediate’ the technical and scientific information to suit the needs of the local context, is vital for the appropriation of information. Tools such as the Internet, GIS or mobile phones can support and strengthen the role of agricultural extension workers, deepening the relationships of trust that they have established with local producers affected by climate change manifestations by offering them a broader set of options and information, for example, on crop diversification or plague management, including more immediate response to their queries.
  • Transparency and Fluency: Online platforms that give citizens new channels to voice their views and concerns, and that allow interaction with decision makers, exemplify ICTs’ potential for transparency and information fluency, an important factor in local perceptions of, expectations of, and trust in local, regional and national institutions.

While we are quick to recognize the importance of communication at the onset of extreme events, we often fail to acknowledge the pivotal role of trust in adaptation and resilience, as well as the potential of innovative tools such as ICTs to help foster trust and strengthen networks and collaboration.

But discussing the risks associated with the use of ICTs is as important as discussing their potential for trust building in adaptive processes.

Ensuring the quality, accuracy and relevance of the information is key to avoiding maladaptive practices and poor decision-making, which could deepen existing vulnerabilities and inequalities. Issues of power and differential access to information also need to be addressed when considering the potential of these tools for trust building, network strengthening and participatory processes, including those related to climate change.

Ultimately, ICTs could play an important supporting role in building and strengthening trust within vulnerable communities affected by climate change impacts, as well as in National Adaptation Plans and Programmes of Action seeking to build long-term climate change resilience on a multi-stakeholder, participatory basis.

>ICTs and the Climate Change ‘Unknowns’: Tackling Uncertainty

January 4, 2011
By Angelica Valeria Ospina

Determining the repercussions of the changing climate is a field of great unknowns. While the impacts of climatic variations and seasonal changes on the most vulnerable populations are expected to increase, and to be most manifest in the more vulnerable ecosystems and natural habitats, the exact magnitude and impact of climate change effects remain, for the most part, open questions.

Such uncertainty is a key contributor to climate change vulnerability, particularly among developing country populations that lack the resources, including access to information and knowledge, to properly prepare for and cope with its impacts.

But, how can vulnerable contexts prepare for the ‘unknowns’ posed by climate change? And should the quest for ‘certainty’ be the focus of our attention?

The rapid diffusion of Information and Communication Technologies (ICTs) within developing country environments, the hardest hit by climate change-related manifestations, is starting to shed new light on these issues.

A recent article by Reuters identified 10 climate change adaptation technologies that will become crucial for coping with and adapting to the effects of the changing climate over the next century.

The bullet points below link these 10 areas with the potential of ICTs within the climate change field, highlighting some of the ways in which they can help vulnerable populations better prepare for and cope with the effects of climatic uncertainty.

  • Innovations around Infectious Diseases: Extreme weather events and changing climatic patterns associated with climate change have been linked to the spread of vector-borne (e.g. malaria and dengue) and water-borne diseases. Within this context, ICTs such as mobile phones, community radio and the Internet have the potential to enable information sharing, awareness raising and capacity building on key health threats, enabling effective prevention and response.
  • Flood Safeguards: Climatic changes such as increased and erratic patterns of precipitation negatively affect the capacity of flood and drainage systems, the built environment, energy and transportation, among others. ICT applications such as Geographic Information Systems (GIS) can facilitate the monitoring and provision of environmental information to the relevant stakeholders, including in decision-making processes for the adaptation of human habitats.
  • Weather Forecasting Technologies: ICTs play a key role in the implementation of innovative weather forecasting technologies, including the integration of community monitoring. The use of mobile phones and SMS for reporting on locally relevant indicators (e.g. the likelihood of floods) can contribute to greater accuracy and more precise flood warnings for communities. Based on this information, authorities could design and put into action more appropriate strategies, and farmers could better prepare for evacuations, protect their livestock and better plan local irrigation systems, among others.
  • Insurance Tools: Access to new and more diversified sources of information and knowledge through tools such as the Internet or the mobile phone can facilitate the access to insurance mechanisms, and to information about national programs/assistance available to support vulnerable populations.
  • More Resilient Crops: In the face of higher temperatures, more variable crop seasons and decreasing productivity, ICTs have the potential to enhance food security by strengthening agricultural production systems through information about pest and disease control, planting dates, seed varieties, irrigation applications, and early warning systems, as well as improving market access, among others.
  • Supercomputing: According to the International Telecommunication Union (ITU), the use of ICT-equipped sensors (telemetry), aerial photography, satellite imagery, grid technology, global positioning by satellite (GPS) (e.g. for tracking the slow, long-term movement of glaciers) and computer modeling of the earth’s atmosphere, among others, plays a key role in climate change monitoring. New technologies continue to be developed, holding great potential for the real-time, more accurate information that is key to strengthening decision-making processes.
  • Water Purification, Water Recycling and Efficient Irrigation Systems: ICTs can contribute to the improvement of water resource management techniques, the monitoring of water resources, capacity building and awareness raising. Broadly diffused applications such as mobile phones can serve as tools to disseminate information on low-cost methods for desalination, using gray water and harvesting rainwater for everyday uses, as well as for capacity building on new irrigation mechanisms, among others.
  • Sensors: In addition to the role that sensors play in monitoring climate change by helping to capture more accurate data, research indicates that they also constitute promising technologies for improving energy efficiency. Sensors can be used in several environmental applications, such as control of temperature, heating and lighting.
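The community-monitoring idea in the first bullet can be made concrete with a small sketch. The SMS message format, village names and alert threshold below are all hypothetical, invented purely for illustration; no real deployment or system is implied:

```python
from collections import defaultdict
from statistics import median

# Hypothetical threshold: median reported river level (cm) above which
# a village-level flood warning would be issued.
ALERT_LEVEL_CM = 250

def parse_report(sms_text):
    """Parse an SMS of the (invented) form 'FLOOD <village> <level_cm>'; None if malformed."""
    parts = sms_text.strip().split()
    if len(parts) != 3 or parts[0].upper() != "FLOOD":
        return None
    try:
        return parts[1].lower(), int(parts[2])
    except ValueError:
        return None

def villages_to_warn(messages):
    """Aggregate community reports; return villages whose median level meets the threshold."""
    levels = defaultdict(list)
    for msg in messages:
        report = parse_report(msg)
        if report:
            village, level = report
            levels[village].append(level)
    return sorted(v for v, ls in levels.items() if median(ls) >= ALERT_LEVEL_CM)

reports = ["FLOOD riverside 310", "FLOOD riverside 280", "FLOOD hilltop 120", "bad message"]
print(villages_to_warn(reports))  # → ['riverside']
```

In practice such a system would sit behind an SMS gateway and feed aggregated warnings to local authorities; the point here is only that very simple aggregation of community reports can already support a threshold-based alert.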

This short list of areas of potential does not suggest that ICTs can eliminate climatic uncertainty, but it does suggest that they can help vulnerable populations strengthen their capacity to withstand and recover from shocks and changing climatic trends.

By helping to build resilience and strengthen adaptive capacity, ICTs can tackle climate change uncertainty not only by providing access to information and knowledge, but also by fostering networking, personal empowerment and participation, and by facilitating self-organisation, access to diverse resources, and learning, all of which ultimately contribute to better preparedness and response, including the possibility of transformation in the face of the unknown.

The need to reduce uncertainty should not substitute for efforts to foster creativity and flexibility, which lie at the core of resilient responses to the ongoing challenges posed by climate change.


*Further examples of the linkages between ICTs, climate change and vulnerability dimensions can be found at:

>Yale Project on Knowledge of Climate Change Across Global Warming’s Six Americas

From Anthony Leiserowitz, Yale Project on Climate Change Communication

“Today we are pleased to announce the release of a new report entitled ‘Knowledge of Climate Change Across Global Warming’s Six Americas.’ The report draws on a national study we conducted last year of what Americans understand about how the climate system works, and about the causes, impacts, and potential solutions to global warming, and is available here.

Overall, we found that knowledge about climate change varies widely across the Six Americas – 49 percent of the Alarmed received a passing grade (A, B, or C), compared to 33 percent of the Concerned, 16 percent of the Cautious, 17 percent of the Doubtful, 4 percent of the Dismissive, and 5 percent of the Disengaged. In general, the Alarmed and the Concerned better understand how the climate system works and the causes, consequences, and solutions to climate change than the Disengaged, the Doubtful and the Dismissive. For example:

· 87% of the Alarmed and 76% of the Concerned understand that global warming is caused mostly by human activities compared to 37% of the Disengaged, 6% of the Doubtful and 3% of the Dismissive;
· 86% of the Alarmed and 71% of the Concerned understand that emissions from cars and trucks contribute substantially to global warming compared to 18% of the Disengaged, 16% of the Doubtful and 10% of the Dismissive;
· 89% of the Alarmed and 64% of the Concerned understand that a transition to renewable energy sources is an important solution compared to 12% of the Disengaged, 13% of the Doubtful and 7% of the Dismissive.

However, this study also found that occasionally the Doubtful and Dismissive have as good or a better understanding than the Alarmed or Concerned. For example:

· 79% of the Dismissive and 74% of the Doubtful correctly understand that the greenhouse effect refers to gases in the atmosphere that trap heat, compared to 66% of the Alarmed and 64% of the Concerned;
· The Dismissive are less likely to incorrectly say that “the greenhouse effect” refers to the Earth’s protective ozone layer than all other groups, including the Alarmed (13% vs. 24% respectively);
· 50% of the Dismissive and 57% of the Doubtful understand that carbon dioxide traps heat from the Earth’s surface, compared to 59% of the Alarmed, and 45% of the Concerned.

This study also identified numerous gaps between expert and public knowledge about climate change. For example, only:

· 13% of the Alarmed know how much carbon dioxide there is in the atmosphere today (approximately 390 parts per million) compared to 5% of the Concerned, 9% of the Cautious, 4% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 52% of the Alarmed have heard of coral bleaching, vs. 24% of the Concerned, 23% of the Cautious, 5% of the Disengaged, 21% of the Doubtful and 24% of the Dismissive;
· 46% of the Alarmed have heard of ocean acidification, vs. 22% of the Concerned, 25% of the Cautious, 6% of the Disengaged, 23% of the Doubtful and 16% of the Dismissive.

This study also found important misconceptions leading many to misunderstand the causes and therefore the solutions to climate change. For example, many Americans confuse climate change and the hole in the ozone layer. Such misconceptions were particularly apparent for the Alarmed and Concerned segments:

· 63% of the Alarmed and 49% of the Concerned believe that the hole in the ozone layer is a significant contributor to global warming compared to 32% of the Cautious, 12% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 49% of the Alarmed and 36% of the Concerned believe that aerosol spray cans are a significant contributor to global warming compared to 20% of the Cautious, 9% of the Disengaged, 7% of the Doubtful and 5% of the Dismissive;
· 39% of the Alarmed and 23% of the Concerned believe that banning aerosol spray cans would reduce global warming compared to 13% of the Cautious, 3% of the Disengaged, 4% of the Doubtful and 1% of the Dismissive.

Concerned, Cautious and Disengaged Americans also recognize their own limited understanding of the issue. Fewer than 1 in 10 say they are “very well informed” about climate change, and 75 percent or more say they would like to know more. The Alarmed also say they need more information (76%), while the Dismissive say they do not need any more information about global warming (73%).

Overall, these and other results within this report demonstrate that most Americans both need and desire more information about climate change. While information alone is not sufficient to engage the public in the issue, it is often a necessary precursor of effective action.”

>Are Climate Change and Social Conflict Linked? (JC)

JC e-mail 4202, February 17, 2011

Article by environmentalist Sérgio Abranches, of Ecopolítica, for Plural

Could extreme weather events have played an important role in the popular uprisings in the Middle East and North Africa? Is climate change already affecting social relations?

The question may sound like one of those forced angles for raising the alarm about climate change. But it is not. It is a relevant concern, and the connection is already being studied by scientists from the most diverse fields: climatologists, ecologists, sociologists, economists. The question is more complex than it first appears. It asks about two far-from-trivial relationships: between extreme weather events and climate change, and between climate anomalies and social conflict.

Scientists always resist attributing specific extreme weather events to climate change. They argue, rightly, that there is no scientific basis for linking any particular event to the global, long-term phenomenon of climate change. But the climatologist Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research in the United States, recently defended a different view of this issue, known in climate science as “the attribution problem.” In an exclusive interview with the editor of the blog Climate Progress, the physicist Joseph Romm, Trenberth said that:

Scientists always start from the statement that no single event can be attributed to climate change. But climate change has a systematic influence on all of today’s weather events, he argues, because there is more water vapor circulating in the atmosphere than there was, say, thirty years ago: an extra 4 percent. It strengthens storms, feeds them more moisture, and it is unfortunate that the public does not see this as a manifestation of climate change. The outlook is that this kind of thing will only become more frequent and worse in the future.

According to most scientists, the amount of greenhouse gases in the atmosphere is already accelerating the warming of the Earth. The resulting climate change should therefore be seen as an ongoing process that tends to worsen over time. In other words, it is long-term, but things do not all happen at once in the future. They happen progressively, with increasing frequency and intensity.

And what does this have to do with the events in the Middle East and North Africa?

We had an atypical period with a large number of extreme weather events in 2010 and early this year. Droughts, floods, heat and cold waves, intense storms, blizzards, wildfires. These events hurt agricultural output in every part of the world: the most emblematic cases were in Kazakhstan, Russia, Canada, Australia, the United States, China and Brazil. The result was a sharp rise in international agricultural commodity prices and food-price inflation. A climate inflation.

The blog Climate Progress compiled a series of references by scientists and the press to these relationships. Among them is a study by the economists Rabah Arezki, of the IMF, and Markus Brückner, of the University of Adelaide in Australia. They examined the effect of swings in international food prices on democratic institutions and internal conflict in more than 120 countries between 1970 and 2007. Their analysis shows a clear relationship for low-income countries: democratic institutions deteriorate, and the incidence of street conflicts, anti-government demonstrations and mass movements rises.

Why in low-income countries? In high-income countries the relationship is not significant. Because the lower a country’s income, the larger the share of food in the household budget, and therefore the more sensitive the population is to sharp increases in the price of food.

Historical studies show that there is a relationship between climate change and social collapse. Crop failures and the resulting rise in food prices are frequent causes of popular uprisings and revolutions in the history of modern and contemporary society. Egypt’s own history records cases of conflict associated with the price of grain (unfortunately I do not have a digital copy of this article). In India, too, there were many episodes. The most notable was perhaps the “grain revolt” of 1918, triggered by shortages and rising grain prices after monsoons with exceptionally weak rains.

In several of these historical episodes the relationship was direct: rising food prices caused the revolt. In the current case, the causes are different. To understand what is happening in Egypt, for example, one must distinguish between what causes the deep discontent and what triggers the revolt. What caused the discontent was tyranny itself. An autocratic government, a dictator in power for 30 years, a corrupt administration. Repression, censorship, arbitrary arrests, torture. On the social side, widespread poverty, immense inequality of income and wealth, and no prospect of social mobility for the young. In recent years there were several protests, all harshly repressed, but none on the scale of the mass revolt that began on the 25th.

What triggers a mass uprising? A conjuncture, that is, a convergence of previously unrelated factors that come together and become “the last straw,” producing the turning point, the tipping point, that turns a protest like countless others before it into an explosion of general discontent, into an uncontrollable, spontaneous revolt of the masses.

In Egypt, important economic, political and accelerating factors created that conjuncture. The economic factor was the rise in food prices, which hit the poorest families hard. Rising prices for oil, housing and education squeezed middle-class budgets. This price shock struck a weakened economy with very high youth unemployment. Unemployment aggravates a situation of low social mobility, wiping out young people’s prospects for the future. In some cases, qualified young people slide down the social ladder, forced to work in low-skill sectors. The despair of the young is easily transmitted to their parents and families.

The political factor was the news that Hosni Mubarak’s son, Gamal Mubarak, would be his successor, probably already as the candidate in the rigged elections scheduled for September. The possibility of a Mubarak dynasty provoked enormous rejection in a country with a dynastic past.

Egypt’s socio-economic picture is not very different from what one sees in the other countries. In Tunisia, in Sudan, even in Saudi Arabia, there is tyranny, widespread poverty, inequality of income and wealth, youth unemployment and rising food prices. I recently heard an interview on CNN with one of the Saudi princes, saying that the situation in his country is different, but that there is indeed dissatisfaction with rising food and housing prices. The government raised wages so that people could absorb the additional cost. The evidence shows that subsidies and wage increases meant to offset food inflation have only a temporary effect and end up feeding back into prices.

In Egypt, the rise in food prices was very sharp, as seen in the chart at –

Food prices rose 40 percent; housing and education, more than 10 percent. The poor are sensitive to inflation in food and housing. The middle class, to inflation in education, housing and fuel.

What accelerated the revolt and allowed it to become a mass movement so quickly? Social media and networks, and the demonstration effect of the uprising in Tunisia, which spread through those digital channels. Obviously, social media and networks do not make revolutions. They are a revolution in how we communicate and exchange information, and in that they have been revolutionary. But in the sociology of social conflict their role is that of accelerator and transmitter, enabling, for example, the initial contagion, which later spreads through physical contact in the streets and squares, and the propagation of events that end up increasing the propensity to act.

They can also prolong the contagion. Sociology has already worked out how contagion processes end, such as arrastões (mass street robberies), for example: when there is no one left to infect, the chain breaks. Social networks and media (in Egypt’s case, mainly SMS) bring more people into the movement and feed the contagion.

It is no accident that these revolts occupy city streets and squares. The urban environment is far more conducive to mass contagion. The growth of the population with access to mobile telephony provides the main instrument of contagion. See the charts for Egypt and Tunisia.

But the internet played an important role in keeping the world informed about what was happening in Egypt, probably preventing a bloodbath, and in communication among Egyptians. That is why the government shut down access to the Web.

Nothing in this process is simple. We are talking about the convergence of complex processes in the climate system, in the social system and in global society. This convergence will only intensify in the coming years and decades. We will live through more climatic and social turbulence, in the midst of an unprecedented scientific and technological revolution.

To listen to the author’s commentary on CBN radio, go to

>Increased flood risk linked to global warming (Nature)


Published online 16 February 2011 | Nature 470, 316 (2011) | doi:10.1038/470316a

Likelihood of extreme rainfall may have been doubled by rising greenhouse-gas levels.

The effects of severe weather — such as these floods in Albania — take a huge human and financial toll. REUTERS/A. CELI

Climate change may be hitting home. 

Rises in global average temperature are remote from most people’s experience, but two studies in this week’s Nature1,2 conclude that climate warming is already causing extreme weather events that affect the lives of millions. The research directly links rising greenhouse-gas levels with the growing intensity of rain and snow in the Northern Hemisphere, and the increased risk of flooding in the United Kingdom.

Insurers will take note, as will those developing policies for adapting to climate change. “This has immense importance not just as a further justification for emissions reduction, but also for adaptation planning,” says Michael Oppenheimer, a climate-policy researcher at Princeton University in New Jersey, who was not involved in the studies.

There is no doubt that humans are altering the climate, but the implications for regional weather are less clear. No computer simulation can conclusively attribute a given snowstorm or flood to global warming. But with a combination of climate models, weather observations and a good dose of probability theory, scientists may be able to determine how climate warming changes the odds. An earlier study3, for example, found that global warming has at least doubled the likelihood of extreme events such as the 2003 European heatwave.
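One common way to put numbers on “changing the odds” is the fraction of attributable risk, FAR = 1 − P0/P1, where P0 is the probability of exceeding an extreme threshold in a climate without human influence and P1 the probability with it. The sketch below uses purely synthetic Gaussian “ensembles”; the numbers are invented for illustration and are not data from any of the studies discussed:

```python
import random

random.seed(42)

def exceedance_prob(ensemble, threshold):
    """Fraction of ensemble members exceeding the chosen extreme threshold."""
    return sum(x > threshold for x in ensemble) / len(ensemble)

# Synthetic 'natural-only' and 'greenhouse-forced' ensembles of, say,
# seasonal rainfall anomalies (arbitrary units) -- illustrative only.
natural = [random.gauss(0.0, 1.0) for _ in range(100_000)]
forced  = [random.gauss(0.4, 1.0) for _ in range(100_000)]  # warmed climate: shifted mean

threshold = 2.0  # an 'extreme' season by the natural climate's standards
p0 = exceedance_prob(natural, threshold)
p1 = exceedance_prob(forced, threshold)

far = 1 - p0 / p1  # fraction of attributable risk
print(f"P0={p0:.3f}  P1={p1:.3f}  risk ratio={p1/p0:.2f}  FAR={far:.2f}")
```

A FAR of 0.5 corresponds exactly to the “at least doubled the likelihood” phrasing: half of the current risk of the event is attributable to the forcing.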

More-localized weather extremes have been harder to attribute to climate change until now. “Climate models have improved a lot since ten years ago, when we basically couldn’t say anything about rainfall,” says Gabriele Hegerl, a climate researcher at the University of Edinburgh, UK. In the first of the latest studies1, Hegerl and her colleagues compared data from weather stations in the Northern Hemisphere with precipitation simulations from eight climate models (see page 378). “We can now say with some confidence that the increased rainfall intensity in the latter half of the twentieth century cannot be explained by our estimates of internal climate variability,” she says.

The second study2 links climate change to a specific event: damaging floods in 2000 in England and Wales. By running thousands of high-resolution seasonal forecast simulations with or without the effect of greenhouse gases, Myles Allen of the University of Oxford, UK, and his colleagues found that anthropogenic climate change may have almost doubled the risk of the extremely wet weather that caused the floods (see page 382). The rise in extreme precipitation in some Northern Hemisphere areas has been recognized for more than a decade, but this is the first time that the anthropogenic contribution has been nailed down, says Oppenheimer. The findings mean that Northern Hemisphere countries need to prepare for more of these events in the future. “What has been considered a 1-in-100-years event in a stationary climate may actually occur twice as often in the future,” says Allen.
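Allen’s “1-in-100-years event occurring twice as often” is straightforward odds arithmetic: multiplying the annual exceedance probability by a risk ratio divides the return period by the same factor. A minimal illustration:

```python
def new_return_period(return_period_years, risk_ratio):
    """Return period after the annual exceedance probability is scaled by risk_ratio."""
    annual_prob = 1.0 / return_period_years  # e.g. 1-in-100 years -> 1% per year
    return 1.0 / (annual_prob * risk_ratio)

# Doubling the odds turns a 1-in-100-years event into a 1-in-50-years event.
print(new_return_period(100, 2.0))  # → 50.0
```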

But he cautions that climate change may not always raise the risk of weather-related damage. In Britain, for example, snow-melt floods may become less likely as the climate warms. And Allen’s study leaves a 10% chance that global warming has not affected — or has even decreased — the country’s flood risk.

Similar attribution studies are under way for flood and drought risk in Europe, meltwater availability in the western United States and drought in southern Africa, typical of the research needed to develop effective climate-adaptation policies. “Governments plan to spend some US$100 billion on climate adaptation by 2020, although presently no one has an idea of what is an impact of climate change and what is just bad weather,” says Allen.

Establishing the links between climate change and weather could also shape climate treaties, he says. “If rich countries are to financially compensate the losers of climate change, as some poorer countries would expect, you’d like to have an objective scientific basis for it.”

The insurance industry has long worried about increased losses resulting from more extreme weather (see ‘Fatal floods’), but conclusively pinning the blame on climate change will take more research, says Robert Muir-Wood, chief research officer with RMS, a company headquartered in Newark, California, that constructs risk models for the insurance industry. “This is a key part of our research agenda and insurance companies do accept the premise” that there could be a link, he says. “If there’s evidence that risk is changing, then this is something we need to incorporate in our models.” 
See News and Views p.344


  1. Min, S.-K. et al. Nature 470, 378–381 (2011).
  2. Pall, P. et al. Nature 470, 382–385 (2011).
  3. Stott, P. A. et al. Nature 432, 610–614 (2004).

>Brazilian climate model will show the climate from Brazil’s perspective (Fapesp, JC)

JC e-mail 4200, February 15, 2011

In the global climate models presented in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Meanwhile, phenomena such as biomass burning, which can intensify the greenhouse effect and change the characteristics of rainfall and clouds in a given region, are not represented at all, because they were not considered relevant by the countries that built the numerical models in use.

That is why, and to support global research on climate change and assess the impact of human activities on it, Brazilian scientists are developing the Modelo Brasileiro do Sistema Climático Global (MBSCG), the Brazilian Model of the Global Climate System.

The effort brings together scientists from the Instituto Nacional de Ciência e Tecnologia sobre Mudanças Climáticas (INCT-MC), the FAPESP Research Program on Global Climate Change, and the Rede Clima (Brazilian Research Network on Global Climate Change).

Brazilian model

Scheduled for completion in 2013, the Brazilian climate model should allow climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

“Most of these international models do not meet our needs. We have many climate problems associated with anthropogenic actions, such as burning and deforestation, that are not represented and that will now be included in the model we are developing in Brazil,” explains Gilvan Sampaio de Oliveira, a researcher at Inpe and one of the coordinators of the MBSCG.

According to him, the Brazilian model will incorporate hydrological, biological and physico-chemical processes and interactions relevant to the regional and global climate system.

It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

“With this model we will have the capacity and the autonomy to generate reliable future scenarios, so that the country can prepare itself to face extreme climate events,” said Sampaio.

Climate impacts on agriculture

The first version of the Brazilian model, with indications of what may happen to Brazil’s climate over the next 50 years, should be ready by the end of 2011. To that end, the researchers are installing, and will begin running in February, a preliminary version of the model on the Tupã supercomputer at the Centro de Previsão do Tempo e Estudos Climáticos (Cptec) in Cachoeira Paulista (SP), with computational modules that analyze the climate phenomena occurring in the atmosphere, the ocean and the land surface.

These computational modules will be gradually integrated with other components of the model, which will assess the impacts of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry on the climate.

Another component, in turn, will map the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.

“In the future, we may be able to estimate the productivity of sugarcane and soybeans, for example, under rising concentrations of greenhouse gases in the atmosphere,” said Sampaio.

Contribution to the IPCC

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the next IPCC report, AR5, due in 2014. But the model that the Intergovernmental Panel will use for the AR5 simulations, HadGEM2, will have Brazilian input.

Through a collaboration between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers added to the international model computational modules that assess the impact on the global climate of smoke plumes from biomass burning and of forest fires, which until then had not been taken into account in climate projections.

The model was therefore renamed HadGEM2-ES/Inpe. “We will run simulations taking into account these components we introduced into the model,” said Sampaio.

Land use and meteorology

In 2013, when the final version of the Brazilian Model of the Global Climate System is completed, the system will gain a land-use module and a high-spatial-resolution meteorological module. In the same year, the first simulations of high-resolution regional models will also be run, to build a climate model for South America with a resolution of 1 to 10 km. “Until now, it took us months and even years to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale,” said Sampaio.

(From the Inovação Tecnológica website, with information from Agência Fapesp)

>In Denial – Climate on the Couch (BBC)

Thu 10 Feb 2011
BBC Radio 4

Something strange is happening to the climate – the climate of opinion. On the one hand, scientists are forecasting terrible changes to the planet, and to us. On the other, most of us don’t seem that bothered, even though the government keeps telling us we ought to be. Even climate scientists and environmental campaigners find it hard to stop themselves taking holidays in long haul destinations.

So why the gap between what the science says, and what we feel and do? In this programme Jolyon Jenkins investigates the psychology of climate change. Have environmentalists and the government been putting out messages that are actually counterproductive? Might trying to scare people into action actually be causing them to consume more? Are images of polar bears actually damaging to the environmentalists’ case because they alienate people who don’t think of themselves as environmentalists – and make climate change seem like a problem that’s a long way off and doesn’t have much relevance to normal life? Does the message that there are “simple and painless” steps we can take to reduce our carbon footprint (like unplugging your phone charger) unintentionally cause people to think that the problem can’t be that serious if the answers are so trivial?

Jolyon talks to people who are trying to move beyond the counterproductive messages. On the one hand there are projects like Natural Change, run by WWF Scotland, which try to reconnect people with nature using the therapeutic techniques of “ecopsychology” – intense workshops that take place in the wilderness of the west of Scotland, and which seem to convert the uncommitted into serious greens. On the other, there are schemes that try to take the issue out of the green ghetto and engage normal people with climate change. Jolyon visits a project in Stirling which has set itself the ambitious challenge of talking face to face with 35,000 people, through existing social groups like rugby clubs, knitting circles and art groups. It wants to sign up these groups to carbon cutting plans, and make carbon reduction a social norm rather than something that only eco-warriors bother with.

And he attends a “swishing party” in London, which tries to replicate the buzz women get from clothes shopping, but in a carbon neutral way. Can the green movement find substitutes for consumerism that are as fun and status-rich, that will deliver carbon reduction but without making people feel they have signed up to a life of grim austerity? And even if the British and Europeans shift their attitudes, can the Americans ever be reconciled to the climate change message? Producer Jolyon Jenkins.

>Can We Trust Climate Models? Increasingly, the Answer is ‘Yes’



Yale Environment 360

Forecasting what the Earth’s climate might look like a century from now has long presented a huge challenge to climate scientists. But better understanding of the climate system, improved observations of the current climate, and rapidly improving computing power are slowly leading to more reliable methods.

By Michael D. Lemonick

A chart appears on page 45 of the 2007 Synthesis Report of the Intergovernmental Panel on Climate Change (IPCC), laying out projections for what global temperature and sea level should look like by the end of this century. Both are projected to rise, which will come as no surprise to anyone who’s been paying even the slightest attention to the headlines over the past decade or so. In both cases, however, the projections span a wide range of possibilities. The temperature, for example, is likely to rise anywhere from 1.8 C to 6.4 C (3.2 F to 11.5 F), while sea level could increase by as little as 7 inches or by as much as 23 — or anywhere in between.

It all sounds appallingly vague, and the fact that it’s all based on computer models probably doesn’t reassure the general public all that much. For many people, “model” is just another way of saying “not the real world.” In fairness, the wide range of possibilities in part reflects uncertainty about human behavior: The chart lays out different possible scenarios based on how much CO2 and other greenhouse gases humans might emit over the coming century. Whether the world adopts strict emissions controls or decides to ignore the climate problem entirely will make a huge difference to how much warming is likely to happen.

But even when you factor out the vagaries of politics and economics, and assume future emissions are known perfectly, the projections from climate models still cover a range of temperatures, sea levels, and other manifestations of climate change. And while there’s just one climate, there’s more than one way to simulate it. The IPCC’s numbers come from averaging nearly two dozen individual models produced by institutions including the National Center for Atmospheric Research (NCAR), the Geophysical Fluid Dynamics Laboratory (GFDL), the U.K.’s Met Office, and more. All of these models have features in common, but they’re constructed differently — and all of them leave some potentially important climate processes out entirely. So the question remains: How much can we really trust climate models to tell us about the future?
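The multi-model averaging described here is, at its core, a simple summary of an ensemble of projections. The model names and warming values below are invented for illustration only; they are not the IPCC’s actual numbers:

```python
from statistics import mean, stdev

# Hypothetical end-of-century warming projections (°C) from different models
# under one fixed emissions scenario -- invented numbers, for illustration only.
projections = {
    "model_a": 2.1,
    "model_b": 3.4,
    "model_c": 2.8,
    "model_d": 4.0,
}

values = list(projections.values())
print(f"ensemble mean: {mean(values):.2f} °C, spread (1 sigma): {stdev(values):.2f} °C")
```

The spread across models is one ingredient of the ranges quoted in the IPCC chart; the other, as the text notes, is the spread across emissions scenarios.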

The answer, says Keith Dixon, a modeler at GFDL, is that it all depends on the questions you’re asking. “If you want to know ‘is climate change something that should be on my radar screen?’” he says, “then you end up with some very solid results. The climate is warming, and we can say why. Looking to the 21st century, all reasonable projections of what humans will be doing suggest that not only will the climate continue to warm, you have a good chance of it accelerating. Those are global-scale issues, and they’re very solid.”

The reason they’re solid is that, right from the emergence of the first crude versions back in the 1960s, models have been at their heart a series of equations that describe airflow, radiation and energy balance as the Sun warms the Earth and the Earth sends some of that warmth back out into space. “It literally comes down to mathematics,” says Peter Gleckler, a research scientist with the Program for Climate Model Diagnosis and Intercomparison at Livermore National Laboratory, and the basic equations are identical from one model to another. “Global climate models,” he says, echoing Dixon, “are designed to deal with large-scale flow of the atmosphere, and they do very well with that.”

The problem is that warming causes all sorts of changes — in the amount of ice in the Arctic, in the kind of vegetation on land, in ocean currents, in permafrost and cloud cover and more — that in turn can either cause more warming, or cool things off. To model the climate accurately, you have to account for all of these factors. Unfortunately, says James Hurrell, who led NCAR’s most recent effort to upgrade its own climate model, you can’t. “Sometimes you don’t include processes simply because you don’t understand them well enough,” he says. “Sometimes it’s because they haven’t even been discovered yet.”

A good example of the former, says Dixon, is the global carbon cycle — the complex interchange of carbon between oceans, atmosphere, and biosphere. Since atmospheric carbon dioxide is driving climate change, it’s obviously important, but until about 15 years ago, it was too poorly understood to be included in the models. “Now,” says Dixon, “we’re including it — we’re simulating life, not just physics.” Equations representing ocean dynamics and sea ice also have been added to climate models as scientists have understood these crucial processes better.

Other important phenomena, such as changes in clouds, are still too complex to model accurately. “We can’t simulate individual cumulus clouds,” says Dixon, because they’re much smaller than the 200-kilometer grid boxes that make up climate models’ representation of the world. The same applies to aerosols — tiny particles, including natural dust and manmade soot — that float around in the atmosphere and can cool or warm the planet, depending on their size and composition.

But there’s no one right way to model these small-scale phenomena. “We don’t have the observations and don’t have the theory,” says Gleckler. The best modelers can do on this point is to simulate the net effect of all the clouds or aerosols in a grid box, a process known as “parameterization.” Different modeling centers go about it in different ways, which, unsurprisingly, leads to varying results. “It’s not a science for which everything is known, by definition,” says Gleckler. “Many groups around the world are pursuing their own research pathways to develop improved models.” If the past is any guide, modelers will be able to abandon parameterizations one by one, replacing them with mathematical representations of real physical processes.
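
To make the idea of parameterization concrete, here is a deliberately oversimplified sketch. Real schemes are far more elaborate; the humidity threshold and the square-root shape below are illustrative assumptions, not values from any actual model.

```python
# A hypothetical, toy parameterization: diagnose the cloud cover of a whole
# grid box from its mean relative humidity, since the individual clouds are
# far too small to resolve. The critical threshold (0.6) and the square-root
# growth curve are made-up tuning choices, purely for illustration.
def cloud_fraction(relative_humidity, rh_critical=0.6):
    """Fraction of a grid box covered by cloud, between 0 and 1."""
    if relative_humidity <= rh_critical:
        return 0.0
    # Cover grows from 0 at the critical humidity to 1 at saturation.
    frac = (relative_humidity - rh_critical) / (1.0 - rh_critical)
    return min(1.0, frac) ** 0.5

print(cloud_fraction(0.8))  # partial cover, somewhere between clear and overcast
```

Different modeling groups would pick different thresholds and different curves, which is exactly why parameterized quantities are where models diverge most.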

Sometimes, modelers don’t understand a process well enough to include it at all, even if they know it could be important. One example is a caveat that appears on that 2007 IPCC chart. The projected range of sea-level rise, it warns, explicitly excludes “future rapid dynamical changes in ice flow.” In other words, if land-based ice in Greenland and Antarctica starts moving more quickly toward the sea than it has in the past — something glaciologists knew was possible, but hadn’t yet been documented — these estimates would be incorrect. And sure enough, satellites have now detected such movements. “The last generation of NCAR models,” says Hurrell, “had no ice sheet dynamics at all. The model we just released last summer does, but the representation is relatively crude. In a year or two, we’ll have a more sophisticated update.”

Sophistication only counts, however, if the models end up doing a reasonable job of representing the real world. It’s not especially useful to wait until 2100 to find out, so modelers do the next best thing: They perform “hindcasts,” which are the inverse of forecasts. “We start the models from the middle of the 1800s,” says Dixon, “and let them run through the present.” If a model reproduces the overall characteristics of the real-world climate record reasonably well, that’s a good sign.

What the models don’t try to do is to match the timing of short-term climate variations we’ve experienced. A model might produce a Dust Bowl like that of the 1930s, but in the model it might happen in the 1950s. It should produce the ups and downs of El Niño and La Niña currents in the Pacific with about the right frequency and intensity, but not necessarily at the same times as they happen in the real Pacific. Models should show slowdowns and accelerations in the overall warming trend, the result of natural fluctuations, at about the rate they happen in the real climate. But they won’t necessarily show the specific flattening of global warming we’ve observed during the past decade — a temporary slowdown that had skeptics declaring the end of climate change.
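
The distinction between matching the climate and matching the weather can be sketched numerically. Below, two synthetic temperature series share a long-term trend but carry independent year-to-year noise, standing in for an observed record and a model hindcast; the trend and noise values are invented for illustration.

```python
# Toy hindcast evaluation: we don't ask the "model" to match year-by-year
# wiggles, only the long-term statistics. Both series are synthetic
# (a linear trend plus random "weather"); real evaluation compares model
# output against the instrumental record.
import random
import statistics

random.seed(0)
years = range(1880, 2011)
observed = [0.007 * (y - 1880) + random.gauss(0, 0.1) for y in years]
modeled  = [0.007 * (y - 1880) + random.gauss(0, 0.1) for y in years]

def trend_per_decade(series):
    """Ordinary least-squares slope of the series, in degrees per decade."""
    n = len(series)
    xbar, ybar = (n - 1) / 2, statistics.fmean(series)
    slope = sum((x - xbar) * (y - ybar) for x, y in enumerate(series)) \
            / sum((x - xbar) ** 2 for x in range(n))
    return slope * 10

# The trends agree closely even though the individual years do not.
print(trend_per_decade(observed), trend_per_decade(modeled))
```

The two series disagree about almost every individual year, yet both recover essentially the same warming trend, which is the statistic a hindcast is judged on.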

It’s also important to realize that climate represents what modelers call a boundary condition. Blizzards in the Sahara are outside the boundaries of our current climate, and so are stands of palm trees in Greenland next year. But within those boundaries, things can bounce around a great deal from year to year or decade to decade. What modelers aim to produce is a virtual climate that resembles the real one in a statistical sense, with El Niños, say, appearing about as often as they do in reality, or hundred-year storms coming once every hundred years or so.

This is one essential difference between weather forecasting and climate projection. Both use computer models, and in some cases, even the very same models. But weather forecasts start out with the observed state of the atmosphere and oceans at this very moment, then project it forward. It’s not useful for our day-to-day lives to know that September has this average high or that average low; we want to know what the actual temperature will be tomorrow, and the day after, and next week. Because the atmosphere is chaotic, anything less than perfect knowledge of today’s conditions (which is impossible, given that observations are always imperfect) will make the forecast useless after about two weeks.
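
That two-week limit is a consequence of chaos, and it can be seen in miniature with Edward Lorenz's famous three-variable toy system (a classic idealization, not a real forecast model): two runs that start a billionth apart end up bearing no resemblance to each other.

```python
# Chaos in miniature: step the Lorenz 1963 system forward from two states
# that differ by one part in a billion, and watch the difference grow until
# the two "forecasts" diverge completely. Simple Euler integration; a toy,
# not a weather model.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 20.0)
b = (1.0 + 1e-9, 1.0, 20.0)  # an imperceptible "observation error"
for _ in range(2500):        # integrate for 25 model time units
    a, b = lorenz_step(a), lorenz_step(b)

print(abs(a[0] - b[0]))  # no longer tiny: the initial error has taken over
```

With perfect initial data the two runs would stay identical forever; the point is that initial data are never perfect, and chaos amplifies whatever error is there.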

Since climate projections go out not days or weeks, but decades, modelers don’t even try to make specific forecasts. Instead, they look for changes in averages — in boundary conditions. They want to know if Septembers in 2050 will be generally warmer than Septembers in 2010, or whether extreme weather events — droughts, torrential rains, floods — will become more or less frequent. Indeed, that’s the definition of climate: the average conditions in a particular place.

“Because models are put together by different scientists using different codes, each one has its strengths and weaknesses,” says Dixon. “Sometimes one [modeling] group ends up with too much or too little sea ice but does very well with El Niño and precipitation in the continental U.S., for example,” while another nails the ice but falls down on sea-level rise. When you average many models together, however, the errors tend to cancel.
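
The error-cancellation argument is easy to demonstrate with made-up numbers: give two dozen imaginary models random biases around a known "truth," then compare a typical individual model's error with the error of the multi-model mean. Everything here is synthetic; only the cancellation effect carries over to real ensembles.

```python
# Toy multi-model mean: each of 24 imaginary models gets a random bias
# around the "true" value, and we compare the average single-model error
# with the error of the ensemble average. The numbers are invented; the
# point is that independent errors tend to cancel when averaged.
import random
import statistics

random.seed(42)
truth = 14.0  # hypothetical "true" global mean temperature, degrees C
models = [truth + random.gauss(0, 0.5) for _ in range(24)]

single_model_error = statistics.fmean(abs(m - truth) for m in models)
ensemble_error = abs(statistics.fmean(models) - truth)
print(single_model_error, ensemble_error)
```

Averaging only helps, of course, to the extent that the models' errors really are independent; biases shared by every model survive the averaging intact.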

Even when models reproduce the past reasonably well, however, it doesn’t guarantee that they’re equally reliable at projecting the future. That’s in part because some changes in climate are non-linear, which is to say that a small nudge can produce an unexpectedly large result. Again, ice sheets are a good example: If you look at melting alone, it’s pretty straightforward to calculate how much extra water will enter the sea for every degree of temperature rise. But because meltwater can percolate down to lubricate the undersides of glaciers, and because warmer oceans can lift the ends of glaciers up off the sea floor and remove a natural brake, the ice itself can end up getting dumped into the sea, unmelted. A relatively small temperature rise can thus lead to an unexpectedly large increase in sea level. That particular non-linearity was already suspected, if not fully understood, but there could be others lurking in the climate system.

Beyond that, says Dixon, if three-fourths of the models project that the Sahel (the area just south of the Sahara) will get wetter, for example, and a fourth says it will dry out, “there’s a tendency to go with the majority. But we can’t rule out without a whole lot of investigation whether the minority is doing something right. Maybe they have a better representation of rainfall patterns.” Even so, he says, if you have the vast majority coming up with similar results, and you go back to the underlying theory, and it makes physical sense, that tends to give you more confidence they’re right. The best confidence-builder of all, of course, is when a trend projected by models shows up in observations — warmer springs and earlier snowmelt in the Western U.S., for example, which not only makes physical sense in a warming world, but which is clearly happening.


And the models are constantly being improved. Climate scientists are already using modified versions to try to predict the actual timing of El Niños and La Niñas over the next few years. They’re just beginning to wrestle with periods of 10, 20 and even 30 years in the future, the so-called decadal time span where both changing boundary conditions and natural variations within the boundaries have an influence on climate. “We’ve had a modest amount of skill with El Niños,” says Hurrell, “where 15-20 years ago we weren’t so skillful. That’s where we are with decadal predictions right now. It’s going to improve significantly.”

After two decades of evaluating climate models, Gleckler doesn’t want to downplay the shortcomings that remain in existing models. “But we have better observations as of late,” he says, “more people starting to focus on these things, and better funding. I think we have better prospects for making some real progress from now on.” 


>Climate from Brazil's Perspective (Fapesp)




By Elton Alisson

Agência FAPESP – In the global climate models featured in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Phenomena such as biomass burning, which can intensify the greenhouse effect and change the characteristics of rainfall and clouds in a given region, are simply left out, because the countries that built the numerical models in use did not consider them relevant.

To have a model capable of generating climate change scenarios from a Brazilian perspective, researchers from several institutions, members of the FAPESP Research Program on Global Climate Change, of the Brazilian Research Network on Global Climate Change (Rede Clima) and of the National Institute of Science and Technology for Climate Change (INCT-MC), are developing the Brazilian Global Climate System Model (MBSCG).

Scheduled for completion in 2013, the MBSCG should allow Brazilian climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

"Most of these international models do not meet our needs. We have many climate problems associated with anthropogenic actions, such as fires and deforestation, which are not represented and which will now be included in the model we are developing in Brazil," said Gilvan Sampaio de Oliveira, a researcher at the Earth System Science Center (CCST) of the National Institute for Space Research (Inpe) and one of the coordinators of the construction of the MBSCG.

According to him, the Brazilian model will incorporate hydrological, biological and physical-chemical processes and interactions relevant to the regional and global climate system. It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of the regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

"With this model we will have the capacity and the autonomy to generate reliable future scenarios, so that the country can prepare itself to face extreme climate events," Sampaio told Agência FAPESP.

The first version of the Brazilian model, with indications of what may happen to Brazil's climate over the next 50 years, should be ready by the end of 2011.

To that end, the researchers are installing a preliminary version of the model on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (CPTEC) in Cachoeira Paulista, São Paulo, and will begin running it in February. This version includes computational modules that analyze the climate phenomena occurring in the atmosphere, in the ocean and at the land surface.

The computational modules will gradually be integrated with other components of the model, which will assess the impacts of vegetation, of the terrestrial carbon cycle, of sea ice and of atmospheric chemistry on the climate. Another component, in turn, will indicate the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.

"In the future we may try to estimate the productivity of sugarcane and soybeans, for example, in the face of rising concentrations of greenhouse gases in the atmosphere," said Sampaio.

IPCC Class

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the next IPCC report, AR5, due out in 2014. But the model the Intergovernmental Panel will use for the AR5 simulations, HadGEM2, will carry a Brazilian contribution.

Through a collaboration between the Hadley Center, in the United Kingdom, and Inpe, Brazilian researchers added to the international model computational modules that assess the impact on the global climate of the smoke plumes produced by biomass burning and of forest fire, factors that until then had not been taken into account in climate projections.

The model was accordingly renamed HadGEM2-ES/Inpe. "We will run simulations taking into account the components we introduced into this model," said Sampaio.

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a land-use module and a high-spatial-resolution meteorological module. In the same year, the first simulations of high-resolution regional models will also be run, in order to build a climate model for South America with a resolution of 1 to 10 km.

"Until now it took us months, even years, to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale," said Sampaio.

Read the article published by the magazine Pesquisa FAPESP about the Brazilian climate model.

>A Sky of Probabilities (O Povo)

People from Ceará carry a cultural memory, often unconscious, of the water scarcity that afflicted families in the past. In an interview with Vida&Arte Cultura, UFRJ professor Renzo Taddei discusses how climate is understood and how that understanding plays out in citizens' lives

05.02.2011 | 17:00

In Quixadá, the rain prophets make their predictions every year (DÁRIO GABRIEL, on 9/1/2010)

Faced with the cycles of nature, prophets foretell the future. Scientists publish the mathematical and physical measurements that set the probability of clouds or sunshine. Mediated by the press, weather forecasts affect citizens and the way they perceive the climate and the city. An adjunct professor at the School of Communication of the Universidade Federal do Rio de Janeiro (UFRJ), Renzo Taddei has an intimate relationship with Ceará's semi-arid interior.

Having researched for nearly a decade the popular practices of climate prediction, in particular the rain prophets of Quixadá, Renzo comes to Fortaleza now and then to give lectures and to take part in meetings and research on the subject. Last Wednesday the professor received our reporter in one of Funceme's rooms. In the interview that follows, he draws attention to global alarms such as climate warming. "It has already become clear that the alarmist tone of imminent catastrophe rarely produces any positive effect. What it produces is a general feeling of powerlessness, as if there were nothing to be done." (Elisa Parente)

O POVO – How can we understand climate beyond the atmospheric effect?
Renzo Taddei – The climate question in Ceará is very interesting because, last year, according to Seplag (the State Secretariat of Planning and Management), Ceará's economy grew 8%. Which is an incredible number. In 2010 we had the worst year of rainfall in the last 30 years. It was the worst drought, and a year of good growth for the state. Clearly, agriculture contributes little to Ceará's economy, and people feel less and less terribly vulnerable to the climate. For several reasons, including the social programs of the Lula administration and the infrastructure improvements of Ceará's recent governments. In other words, life is a bit easier.

OP – How does this affect the organization of the city?
Renzo – There was a statistic showing that more than half of the population over 40 or 50 years old was born outside the capital. So the presence of the rural imaginary is very strong. That is one of the elements of the psychological weight of drought. Even people who have nothing to do with agriculture rejoice at the sight of rain. The saying "bonito pra chover" ("beautiful, looks like rain") carries both a continuity and a rupture, especially among the younger generations born in Fortaleza. I interviewed an agronomist who told me he had the habit of turning off the water while soaping up in the shower. And his son asked him why he turned off the water if the shower wasn't over. He realized that the younger generation has no memory at all of the water scarcity of its own state. The last major crisis in Fortaleza was in 1993, with the construction of the Canal do Trabalhador. Fortaleza is the only Brazilian capital embedded in the semi-arid region, and even so people lack that awareness. You walk around Fortaleza and see that every real-estate development has to have a water park, not just a pool. It is the architectural and recreational use of water. It compares only to Las Vegas, another place of irresponsible abundance set in a desert. So people have no experience of water scarcity, but they have a cultural heritage of hypervaluing it. Fortaleza is proud of its prosperity and spends like the newly rich. It has to do with culture. Meanwhile, the Canal da Integração is bringing in a world of water, and the diversion of the São Francisco River will bring even more, so Fortaleza will have no problem for the next 30 years.

OP – But that can also have the opposite effect.
Renzo – It is dangerous because, in the long run, you cannot assume that this level of consumption is sustainable. That is the great debate around the river diversion, the construction of the Castanhão reservoir and the Canal da Integração. The infrastructure that stores water keeps growing instead of the population being educated to consume less. Not that both cannot be done at once, but the fact is that recent governments have preferred to increase the water supply.

OP – One line of your research focuses on the anthropology of uncertainty and of the future. How is this connected to the study of climate?
Renzo – This is perhaps the most challenging part of understanding the role climate plays in our daily lives. Meteorology quickly realized that the atmosphere is something very complex. And the rainy season in Ceará is far more complicated still. The farmer wants to know today whether it will rain in May. No radar can show that, but meteorology uses physics and mathematics to create models that simulate on a computer how nature works. That has limitations. That is why we speak in terms of probability, because sometimes a small thing can change everything. So meteorology has to live with this complicated relationship with society.

OP – Why is it so hard to deal with uncertainty?
Renzo – There is plenty of research showing that we have tremendous difficulty retaining it. Probabilistic information demands a very large cognitive effort. It really is complex. It is as if we were mentally programmed not to operate with probability, and to pretend that everything is either right or wrong. We have this tendency to polarize things. And that influences society's relationship with the climate. Look at farmers' strategy, for example. The family farmer is constantly paying attention to climate forecasts, yet uses none of them. He waits for the soil to become moist at a certain depth before planting the seed. But the first rains of the season are weak, the sprout dies, and he has to start all over again.

OP – Uncertainty is part of nature.
Renzo – I have been following the rain prophets since 2002. Very often some of them call themselves observers of nature, not prophets. Being a prophet carries a very strong religious symbolic charge, and it is a heavy burden for them to bear. So they make their predictions and, in the end, say that only God really knows. The interesting thing is that, in terms of content, they say exactly what Funceme says in terms of probability. There is an uncertainty involved. So people accept it from them, but do not grant science the right to live with its uncertainties.

OP – How does meteorology figure in this story?
Renzo – Part of this confusion has to do with the history of meteorology in Ceará. It began with a lot of swagger: flying airplanes, seeding the clouds with silver salt to make it rain sooner. But you come to realize the spray does not produce rain, it only hastens it. So that technology was always very controversial. To the mindset of the sertão, it amounted to saying that the city man believed he had the power to produce rain. So much so that Patativa do Assaré wrote the poem "Ao dotô do avião" ("To the airplane doctor"), in which he sets out several important elements. Man adapts to nature's cycle, not the other way around. In Ceará, climate has always been tied to religion. So it was disrespectful and absurd to think that a man could produce rain.

OP – Have people absorbed the gravity of global warming?
Renzo – I don't know whether we will ever understand global warming. Nature works in cycles. Day and night, the seasons of the year. Because these cycles are short, we understand them well. But there are cycles that are very long. It is often said that here in the semi-arid region there are cycles in which two or three decades are drier, followed by others that are rainier. There may be a much longer cycle that we have no idea about. If the future proves us wrong, fine, we did what had to be done.

OP – So there is a positive view of the future?
Renzo – Science is made of uncertainties; it only advances because it teaches what it does not know. But there is what is called the precautionary principle. It says you need to weigh how much you lose if you are right and do nothing against how much you lose if you are wrong and do a great deal. So imagine there is no global warming at all, but we take all the necessary measures. What do we lose? There is a loss in terms of economic growth. Now the other option is that global warming exists, it is happening, it has to do with industrial production, but we assume it is not happening and do nothing. What do we lose in the future? Many people say that doing nothing can carry a very high cost. The chance that we are right is great, and even if we are wrong, we can recover. There is also another side on which we may not be able to recover. Perhaps we will in fact go through a long sequence of extreme events. Climate warming is not about the world getting hotter every day. The point is that extreme events, such as heavy rain and hurricanes, tend to become more frequent. Sea level is already rising; some nations are already beginning to relocate. And we come back to the story of Fortaleza being the Las Vegas of the semi-arid. You cannot talk about cutting carbon emissions without reducing industrial activity. And we cannot talk about global warming without reducing consumption. And how do you get the population of Aldeota to stop consuming so much? In my most pessimistic moments, I think humanity only manages to reprogram itself mentally on a continental scale through a near-death experience. Which means an immense catastrophe. And then everyone stops and rethinks. But I am a teacher, and I have to believe that education has its value.