
Last Chance (Página 22)

November 4, 2014 – 10:19 a.m.

by Diego Viana, Página 22

Photo: http://ambientalsustentavel.org/

Humanity has the money, the technology and the knowledge to change the course that is leading to a catastrophic rise in the planet's temperature. The ball is now with political and economic actors, who have little more than a year to close a decisive global agreement.

The news has not been good in recent months. Although stagnant since 2008, the world economy has not managed to reduce carbon emissions at the necessary pace. To keep global warming at 2 degrees [1] through 2100, we would have to emit 6.2% less year after year. In 2013, the reduction was only 1.2%, according to a report by the consultancy PwC. At the current annual rate, we are headed straight for 4 degrees of warming. Between the two scenarios – 2 and 4 degrees – the difference is an abyss: either a planet harder to live on, with frequent disasters, food and water shortages and displaced populations, or runaway climate change, completely inhospitable to civilization (see chart 1).
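The gulf between those two annual rates compounds dramatically over decades. A small illustrative calculation (the compounding rule here is my own arithmetic, not taken from the PwC report) shows what fraction of today's emissions would remain by 2100 at each pace:

```python
# Compare the cumulative effect of a 6.2%/yr cut in emissions (the rate
# the article says is needed to stay within 2 degrees) against the 1.2%/yr
# actually achieved in 2013, compounded from 2014 to 2100.

def remaining_fraction(annual_cut_pct: float, years: int) -> float:
    """Fraction of today's emissions left after compounding the annual cut."""
    return (1 - annual_cut_pct / 100) ** years

years_to_2100 = 2100 - 2014  # 86 years

needed = remaining_fraction(6.2, years_to_2100)
actual_pace = remaining_fraction(1.2, years_to_2100)

print(f"At 6.2%/yr: {needed:.1%} of today's emissions left by 2100")
print(f"At 1.2%/yr: {actual_pace:.1%} of today's emissions left by 2100")
```

At the required rate, emissions fall to well under 1% of today's level by 2100; at the 2013 rate, more than a third remains.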

The oceans, responsible for absorbing a good part of global warming, are heating up faster than predicted, according to the Intergovernmental Panel on Climate Change (IPCC). That makes the deadline for finding climate solutions even tighter. In September, the NGO WWF announced that, over the past 40 years, as a result of human action, the world's population of vertebrate animals has fallen by half, while the gap between the planet's supply of natural resources and the demands of the economic system keeps growing (see chart 2).

Chart 1

Chart 2

The succession of negative data adds a dose of urgency to the effort to control the accumulation of carbon in the atmosphere and thereby reduce a good part of our ecological footprint.

After years in which multilateral negotiations ran up against the inability to find common ground for an agreement among politicians from different countries, the conviction is growing, in societies around the world, that a solution can no longer be postponed for political reasons.

The 21st UN Climate Conference (COP 21), to be held in Paris in December 2015, becomes ever more decisive for the future of civilization as the date approaches. Negotiators and activists hope to reach an effective climate agreement by then. COP 20 takes place this December in Lima, Peru, but activists believe something concrete will only be produced the following year.

"If we do everything that can be done, there is a 75% chance of keeping global warming within 2 degrees through 2100," says ecologist Tom Athanasiou, executive director of the American NGO EcoEquity, citing IPCC studies.

Athanasiou splits the question in two: technical-scientific and political-economic. "We have the money, the technology and the science to make an emergency reduction fast enough to hold the 2-degree line. It is a very rapid global decline in emissions, and one that would still be very dangerous, because it would involve double the warming we have had so far [0.8 degree]," he says. "But with business and politics as usual, I doubt we can avoid 3 or even 4 degrees."

Market Mechanisms

According to sociologist Sérgio Abranches, who edits the site Ecopolítica, the fact that climate models carry large margins of error leads to disagreement among scientists over whether temperature can stay below 2 degrees of warming. The result spills over into politics, because "politics works with certainties. If someone expresses any doubt about a point, politicians postpone the decision, and that is what has been happening."

Many of the proposals for reducing emissions around the world involve market mechanisms, enabled by the Kyoto Protocol and based on cap-and-trade, which imposes an emissions ceiling and creates credits that can be traded. But, with no power of sanction and without the support of key countries such as the US and China, the protocol is considered a failure.
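The cap-and-trade logic can be sketched in a few lines. In this toy ledger (all firm names, emission figures and the allowance price are invented for illustration), a regulator caps total emissions and allocates tradable allowances; firms that emit more than their allocation must buy the shortfall from firms that emit less:

```python
# Toy cap-and-trade ledger with hypothetical firms and numbers.
firms = {          # actual emissions, in Mt CO2
    "SteelCo": 120,
    "PowerCo": 90,
    "WindCo": 10,
}
cap = 180          # total emissions allowed under the cap, Mt CO2
price = 8.0        # assumed market price per Mt allowance

# Equal-share allocation for simplicity: each firm gets cap / 3 = 60 Mt.
allocation = cap / len(firms)

for firm, emitted in firms.items():
    balance = allocation - emitted   # positive = surplus credits to sell
    cost = -balance * price          # positive = the firm pays
    side = "buys" if balance < 0 else "sells"
    print(f"{firm}: {side} {abs(balance):.0f} Mt of credits "
          f"(net cost {cost:+.0f})")
```

The cap binds the total; the trade merely decides who pays whom, which is why, without sanctions for exceeding the cap, the mechanism has no teeth.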

"Market mechanisms have already shown that (on their own) they are not enough," says Abranches, citing the example of European carbon credits, which were unable to reduce emissions on the continent. "It is not possible to make the market work on incentives alone. They need to be combined with penalties that make the incentives more attractive to emitting companies."

For the sociologist, the only effective economic instrument is the carbon tax, adopted by several countries and recently approved in Chile, which also includes a surcharge applied to imports from countries that do not have the tax. Last year the World Trade Organization (WTO) issued a ruling approving the carbon tax and declining to treat it as an unfair trade practice. "The tax prices emissions in a penalizing way, and companies look for ways to adapt. That is the only way to make the market take the initiative to reduce its emissions," he argues (more on the effectiveness of carbon pricing mechanisms in a separate report).

Athanasiou recalls that the climate catastrophes of recent years occurred in the context of warming still around 0.8 degree. An exhaustive list can be found on the website of the Centre for Research on the Epidemiology of Disasters (CRED) at the Catholic University of Louvain, Belgium. "Two degrees of warming will cause immense destruction and suffering, but it does not mean the end of human civilization," says the activist, who foresees a scenario of mass migrations, famine, extinctions and constant war if we reach 3 or 4 degrees. Athanasiou spoke to PÁGINA22 while preparing to travel to Bonn, Germany, for a preparatory conference for COP 20 in Lima.

CRED reports that in the 1940s there were 120 hydrometeorological disasters (which may have had human origins) against 52 geological ones (natural events). From 2000 to 2005, there were 233 geological disasters against 2,135 hydrometeorological ones. The numbers suggest that human beings are increasingly the cause of environmental disasters.

At EcoEquity, the NGO he runs with other climate specialists, the concept of Greenhouse Development Rights was developed. It is a calculation intended to guide a future system of global taxes, grounded in the conviction that no agreement will be reached without attacking the problem of inequality. Hence the division between responsibility – how much a country, company or individual pollutes – and the capacity to face the problem – how much it is able to contribute to reducing emissions.
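The responsibility-and-capacity split can be sketched as a toy calculation. The country figures, the equal weighting and the simple averaging rule below are invented for illustration only; the actual framework uses detailed income and emissions data with development thresholds:

```python
# Toy responsibility-and-capacity index (hypothetical data).
# "Responsibility" ~ a country's share of cumulative emissions;
# "capacity"       ~ its share of income available to pay for mitigation.
# A country's share of the global obligation is a weighted average of both.

countries = {
    # name: (responsibility share, capacity share) -- invented figures
    "A": (0.40, 0.50),
    "B": (0.35, 0.30),
    "C": (0.25, 0.20),
}
w = 0.5  # equal weight on responsibility and capacity (an assumption)

for name, (resp, cap) in countries.items():
    obligation = w * resp + (1 - w) * cap
    print(f"Country {name}: {obligation:.1%} of the global obligation")
```

Because both inputs are shares that sum to one, the obligations also sum to one, so the calculation divides a fixed global burden rather than setting its size.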

"The climate crisis is a global commons crisis. But the ability to pay for the transition is geographically and economically separated from where the transition must happen. We need to move finance and technology across the planet, and a lot of it, to reach the extremely high decarbonization rates needed to stabilize the climate system," he explains.

Climate March

The biggest attempt to mobilize societies around the world to pressure governments and climate negotiators took place on September 21, with the global People's Climate March [2]. In New York, two days before the meeting of world leaders hosted by the city, 400,000 people took to the streets, accompanied at a distance by demonstrations in hundreds of cities around the world, including Rio de Janeiro and São Paulo (more in a separate report).

The march was organized by members of the NGO 350.org, dedicated to raising public awareness of the dangers of climate change. The number that gives the organization its name, "350", corresponds to the concentration limit, in ppm (parts per million), of greenhouse gases below which it is still possible to control global warming. Last year, however, the 400 ppm mark was passed.

Some of the march's organizers had hoped New York would draw up to 1 million demonstrators, on the scale of similar protests in the 1970s against nuclear weapons or in support of the first environmental laws. The 400,000 were an impressive number, but below what was hoped for. According to Sérgio Abranches, the main reason is popular disenchantment with political action: people have come to believe there is no point in mobilizing to pressure politicians who do not respond to pressure.

Athanasiou considers the turnout satisfactory, but says it is not what matters most. Little by little, he says, activist groups around the world are converging on a common agenda. "It's next year, in Paris, that we will need to gather 1 million people," he asserts. "Europe has plenty of greens! Let's gather them in Paris!"

350.org is also the promoter of the "Divesting from Fossil Fuel" initiative, launched in 2012. The strategy consists of convincing investment funds, universities, philanthropists and other entities to withdraw their investments from oil companies.

The NGO's members consider the initiative already a success, because it has generated discussion in the media and won commitments from universities and philanthropic funds around the world. One recent commitment has a particularly ironic flavor: the descendants of oil magnate John D. Rockefeller, founder of Standard Oil, announced that they will gradually withdraw their investments from oil companies (more in a separate report).

Something to celebrate

Not all the news this year was bad. Largely thanks to Germany's initiative to expand the share of wind farms and solar panels in its energy mix, the cost of renewable energy sources is increasingly competitive. It is still not certain, however, that the transition to a cleaner energy mix will happen at the necessary speed. "The change in the world's energy mix is helping to unblock some urgent technological problems," says Abranches. "One point that will take us to a new level in wind energy is storage, which is still unsolved."

The political scientist also cites the development of second-generation biofuels, whose production does not compete with food crops. "Clearly, this is a long and gradual transition. We do not yet have a source that can replace oil with the same energy efficiency and variety of uses in the short term," he says (read the report on Brazil's pre-salt oil).

[1] Two degrees is the minimum increase the planet will undergo in the most optimistic scenario drawn up by the scientific community. Even so, it will impose a profound change in ways of life on Earth.

[2] More about the marches around the world here.
* Originally published on the Página 22 website.

(Página 22)


>On Birth Certificates, Climate Risk and an Inconvenient Mind (N.Y. Times, Dot Earth Blog)

April 28, 2011, 9:23 AM
By ANDREW C. REVKIN

As Donald Trump tries to milk a last bit of publicity out of the failed “birther” challenge to President Obama, it’s worth reading a fresh take by an Australian psychologist on the deep roots of denial in people with fundamentalist passions of whatever stripe. Here’s an excerpt:

[I]deology trumps facts.
And it doesn’t matter what the ideology is, whether socialism, any brand of fundamentalist religion, or free-market extremism. The psychological literature shows quite consistently that a threat to one’s worldview is more than likely met by a dismissal of facts, however strong the evidence. Indeed, the stronger the evidence, the greater the threat — and hence the greater the denial.
In its own bizarre way, then, the rising noise level of climate denial provides further evidence that global warming resulting from human CO2 emissions is indeed a fact, however inconvenient it may be. Read the rest.
The piece, published today on the Australian news blog The Drum, is by Stephan Lewandowsky of the School of Psychology at the University of Western Australia.
Of course, just being aware that ideology can deeply skew how people filter facts and respond to risks begs the question of how to make progress in the face of the wide societal divisions this pattern creates.
It’s easy to forget that there’s been plenty of climate denial to go around. It took a decade for those seeking a rising price on carbon dioxide emissions as a means to transform American and global energy norms to realize that a price sufficient to drive the change was a political impossibility.
As a new paper in the Proceedings of the National Academy of Sciences found, even when greenhouse-gas emissions caps were put in place, trade with unregulated countries simply shifted the brunt of the emissions elsewhere.
When he was Britain’s prime minister, Tony Blair put it this way in 2005: “The blunt truth about the politics of climate change is that no country will want to sacrifice its economy in order to meet this challenge.”
My choice, of course, is to attack the two-pronged energy challenge the world faces with a sustained energy quest, nudged and nurtured from the top but mainly fostered from the ground up.
And I’m aware I still suffer from a hint of “scientism,” even “rational optimism,” in expecting that this argument can catch on, but so be it.
10:11 a.m. | Updated For much more on the behavioral factors that shape the human struggle over climate policy, I encourage you to explore “Living in Denial: Climate Change, Emotions, and Everyday Life,” a new book by Kari Marie Norgaard, a sociologist who has just moved from Whitman College to the University of Oregon.
Robert Brulle of Drexel University brought the book to my attention several months ago, and I invited him to do a Dot Earth “Book Report,” to kick off a discussion of Norgaard’s insights, which emerge from years of research she conducted on climate attitudes in a rural community in western Norway. (I’d first heard of Norgaard’s research while reporting my 2007 article on behavior and climate risk.)
(I also encourage you to read the review in the journal Nature Climate Change by Mike Hulme, a professor of climate at the University of East Anglia and the author of “Why We Disagree about Climate Change.”)
Here’s Brulle’s reaction to Norgaard’s book:
As a sociologist and longtime student of human responses to environmental problems, I’ve seen reams of analysis come and go on why we get some things right and some very wrong. A new book by Kari Norgaard has done the best job yet of cutting to the core on our seeming inability to grasp and meaningfully respond to human-driven climate change.
As the science of climate change has become stronger and more dire, media coverage, public opinion, and government action regarding this issue have declined. At the same time, climate denial positions have become increasingly accepted, despite a lack of scientific evidence. Even among the public that accepts the science of global climate change, the dire circumstances we now face in this regard are consistently downplayed, and the logical implications that follow from the scientific analysis — the necessity of enacting swift and aggressive measures to combat climate change — are not followed through either intellectually or politically.
Instead, at best, a series of half measures have been proposed, which though they may be comforting, are essentially symbolic measures that allow the status quo to continue unchanged, and thus will not adequately address the issue of global climate change. Thus attempts to address climate change have encountered significant cultural, political, and economic barriers that have not been overcome. While there have been several attempts to explain the lack of meaningful action regarding climate change, these models have not developed into an integrated and empirically supported approach. Additionally, many of these models are based in an individualistic perspective, and thus engage in a form of psychological reductionism. Finally, none of these models are able to coherently explain the inter-related phenomena regarding climate change that is occurring at the individual, small group, institutional, and societal levels.
To move beyond the limitations of these approaches, Dr. Norgaard develops a sociological model that views the response to global climate change as a social process. One of the fundamental insights of sociology is that individuals are part of a larger structure of cultural and social interactions. Thus through the socialization processes, we construct certain ways of life and understandings of the world that guide our everyday interactions. Individuals become the carriers of the orientations and practices that constitute our social order. A disjuncture between our taken-for-granted way of living, such as the new behaviors necessitated by climate change, are experienced at the individual level as identity threats, at the institutional level as challenges to social cohesion, and at the societal level as legitimation threats. When this occurs, there are powerful processes that work at the psychological, institutional, and overall society level to maintain the current orientations and ensure social stability. Taken together, these social processes create cultural and social stability. They also create, from the view of climate change, a form of social inertia that inhibits rapid social change.
From this sociological perspective, Dr. Norgaard takes on the apparent paradox of climate change and public awareness; as our knowledge about the nature and seriousness of climate change has increased, our political and social engagement with the issue has declined. Why? Dr. Norgaard’s answer (crudely put) is that our personality structures and social norms are so thoroughly enmeshed with a growth economy based on fossil fuels that any consideration of the need to change our way of life to deal with climate change evokes powerful emotions of anxiety and desires to avoid this issue. This avoidance behavior is socially reinforced by collective group norms, as well as the messages we receive from the mass media and the political elite. She develops this thesis through the use of an impressive array of sociological theory, including the sociology of the emotions, cultural sociology, and political economy. Additionally, she utilizes specific theoretical approaches regarding the social denial of catastrophic risk. Here she skillfully repurposes the literature on nuclear war and collective denial to the issue of climate change. This is a unique and insightful use of this literature. Thus her theoretical contribution is substantial and original. She then illustrates this process through a thick qualitative analysis based on participant observation in Norway. In her analysis of conversations, she illustrates how collective denial of climate change takes place through conversations. This provided powerful ground truth evidence of her theoretical framework.
This is an extremely important intellectual contribution. Research on climate change and culture has been primarily focused on individual attitudinal change. This work brings a sociological perspective to our understanding of individual and collective responses to climate change information, and opens up a new research area. It also has important practical implications. Most climate change communication efforts are based on conveying information to individuals. The assumption is that individuals will take in this information and then act rationally in their own interests. Dr. Norgaard’s analysis charts a different course. As she demonstrates, it is not a lack of information that inhibits action on climate change. Rather, the knowledge brings about unpleasant emotions and anxiety. Individuals and communities seek to restore a sense of equilibrium and stability, and thus engage in a form of denial in which, although the basic facts of climate change are acknowledged, the logical conclusions and actions that follow from the information are minimized and not acted upon. This perspective calls for a much different approach to climate change communications, and defines a new agenda for this field.

[Note: people interested in this line of argument should follow the work done by researchers at the Center for Research on Environmental Decisions (CRED), at Columbia University, @ http://cred.columbia.edu.] 

>Climategate: What Really Happened? (Mother Jones)


>The Science of Why We Don’t Believe Science (Mother Jones)


Illustration: Jonathon Rosen
How our brains fool us on climate, creationism, and the vaccine-autism link.

— By Chris Mooney
Mon Apr. 18, 2011 3:00 AM PDT

“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

“We apply fight-or-flight reflexes not only to predators, but to data itself.”

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers. Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.

“Scientific evidence is highly susceptible to misinterpretation. Giving ideologues scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.”

Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment, pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes, and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research, individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study, hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

“Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.”

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchal individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan.

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those? 

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org, a team at Ohio State presented subjects with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the myth that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey, for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz, director of the Yale Project on Climate Change Communication. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and numerous Hollywood celebrities (most notably Jenny McCarthy and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus, notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper—which originated the current vaccine scare—was retracted and he subsequently lost his license to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study, he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.


>Palocci to try to improve the climate (JC, O Globo)

>
Minister takes command of climate change policy to put an end to disagreements.

JC e-mail 4218, March 16, 2011.

The head of the Casa Civil, Antonio Palocci, has taken command of the agenda gaining the most prominence in the government’s environmental area: climate change policy, which, since the 2009 Copenhagen summit, has earned Brazil international prestige. The change of direction within the government caused unease in the two main ministries that handle the issue: Environment (MMA) and Science and Technology (MCT). Yesterday came the first casualty at the Environment Ministry, with the departure of the national secretary for Climate Change, Branca Americano, to be replaced by the researcher Eduardo Assad, of Embrapa.

According to Environment Minister Izabella Teixeira, who spoke yesterday to businesspeople and state government officials, the idea is to end the constant disagreements that the climate agenda caused among the ministries and to achieve alignment. Conflicting views across the ministries have already caused fights and embarrassments.

– With Rio+20, we have to treat environmental issues differently from the way we have been treating them. We have to have a different governance structure. Climate change is the flagship of this discussion. We are working out the best format with the MCT and the Casa Civil under a new model of governance for the climate agenda, at the request of the Casa Civil and Minister Palocci. The Casa Civil as the conductor, and the MCT and the MMA as the other two legs. We want to do away with the islands, so that there is convergence with the national agenda – said Izabella.

Department to bring together new themes

At the MMA, the secretariat will be transformed into a super department, expected to be called the Climate Secretariat, which will bring together new themes such as policies to combat deforestation, biodiversity conservation, and the management of forests and water resources.

Under Environment Minister Carlos Minc, in 2009, the MMA had problems with the Itamaraty and the MCT. Minc pushed for Brazil to adopt emissions reduction targets, while the other two ministries defended a more conservative position. In the areas to be incorporated by the new secretariat, staff complain that Izabella did not consult those most affected about the new directions their work should take.

At Science and Technology, staff who work on climate change are resisting the restructuring proposed with the appointment of the researcher Carlos Nobre to the Secretariat for Research and Development Policies and Programs. For the conference to be held in Bangkok in April, one of the preparatory meetings for the annual UN summit, the MCT still has no team assembled to send. One of the changes already announced by Nobre affects the ministry’s crown jewel in this sector: the Clean Development Mechanism (CDM), which accredits emissions-reducing projects to receive credits that can be traded on the carbon market.

The shift responds to one of the main criticisms made by developers of emissions-reducing projects: that the process for accepting these proposals is excessively bureaucratic.

– We will take a fresh look at the CDM. We will make the rules more flexible and the mechanism more agile. Brazil is in a position to lead, along with Scandinavia and Germany, the transition to a low-carbon economy – said the secretary.

For Adalberto Veríssimo, a researcher at the Amazon Institute of People and the Environment (Imazon), the government’s decision to put the Casa Civil in charge of coordinating its climate policy is good news. He argues that global warming is a problem that cuts across different thematic areas and should therefore sit at the top of the Executive’s hierarchy.

– Palocci’s coordination of climate change is the right call. This is a cross-cutting subject that matters to several areas, such as Mines and Energy, Transport, and Agriculture. It is a task that will require balance. Global warming is one of the pillars of this decade’s debate. Placing it in the Casa Civil is a sign that Brazil wants to keep advancing in this area – he said.

On the changes in the ministries, Veríssimo said that Nobre’s arrival “oxygenates” the debate within the MCT, which, according to him, had been staffed with backward-looking cadres.

– I sensed the MCT and the MMA speaking the same language. It was the first time I saw that happen – said Veríssimo.

Marina criticizes licensing plans

Former senator Marina Silva (PV) yesterday criticized the federal government’s idea of loosening environmental licensing requirements to speed up infrastructure projects. She spoke before learning that Palocci would handle the climate agenda.

– I view with concern this talk of changing the environmental licensing process. I think any change of that nature, in the direction of loosening it, will only aggravate the problems we are living through. Licensing plays an important role in reducing and minimizing the environmental impact of a project – said the defeated presidential candidate, after taking part in an inaugural lecture for the graduate program of the National Institute for Space Research (Inpe), in São José dos Campos.

The government’s proposals will be implemented through decrees regulating the licensing of highways, ports, power transmission lines, waterways, and pre-salt oil exploration projects.
(O Globo)

>Living in Denial: Climate Change, Emotions, and Everyday Life (MIT Press)

>
Book release (April 2011, MIT Press):

Kari Marie Norgaard

Global warming is the most significant environmental issue of our time, yet public response in Western nations has been meager. Why have so few taken any action? In Living in Denial, sociologist Kari Norgaard searches for answers to this question, drawing on interviews and ethnographic data from her study of “Bygdaby,” the fictional name of an actual rural community in western Norway, during the unusually warm winter of 2001-2002.

In 2001-2002 the first snowfall came to Bygdaby two months later than usual; ice fishing was impossible; and the ski industry had to invest substantially in artificial snow-making. Stories in local and national newspapers linked the warm winter explicitly to global warming. Yet residents did not write letters to the editor, pressure politicians, or cut down on use of fossil fuels. Norgaard attributes this lack of response to the phenomenon of socially organized denial, by which information about climate science is known in the abstract but disconnected from political, social, and private life, and sees this as emblematic of how citizens of industrialized countries are responding to global warming.

Norgaard finds that for the highly educated and politically savvy residents of Bygdaby, global warming was both common knowledge and unimaginable. Norgaard traces this denial through multiple levels, from emotions to cultural norms to political economy. Her report from Bygdaby, supplemented by comparisons throughout the book to the United States, tells a larger story behind our paralysis in the face of today’s alarming predictions from climate scientists.

About the Author

Kari Marie Norgaard is Assistant Professor of Sociology at the University of Oregon.

>Ancient Catastrophic Drought Leads to Question: How Severe Can Climate Change Become? (NSF)

>
Press Release 11-039

Extreme megadrought in Afro-Asian region likely had consequences for Paleolithic cultures

A boat on Lake Tanganyika today; the lake’s ancient surface water level fell dramatically.
Credit: Curt Stager.

February 24, 2011
How severe can climate change become in a warming world?

Worse than anything we’ve seen in written history, according to results of a study appearing this week in the journal Science.

An international team of scientists led by Curt Stager of Paul Smith’s College, New York, has compiled four dozen paleoclimate records from sediment cores in Lake Tanganyika and other locations in Africa.

The records show that one of the most widespread and intense droughts of the last 50,000 years or more struck Africa and Southern Asia 17,000 to 16,000 years ago.

Between 18,000 and 15,000 years ago, large amounts of ice and meltwater entered the North Atlantic Ocean, causing regional cooling but also major drought in the tropics, says Paul Filmer, program director in the National Science Foundation’s (NSF) Division of Earth Sciences, which funded the research along with NSF’s Division of Atmospheric and Geospace Sciences and its Division of Ocean Sciences.

“The height of this time period coincided with one of the most extreme megadroughts of the last 50,000 years in the Afro-Asian monsoon region with potentially serious consequences for the Paleolithic humans that lived there at the time,” says Filmer.

The “H1 megadrought,” as it’s known, was one of the most severe climate trials ever faced by anatomically modern humans.

Africa’s Lake Victoria, now the world’s largest tropical lake, dried out, as did Lake Tana in Ethiopia, and Lake Van in Turkey.

The Nile, Congo and other major rivers shriveled, and Asian summer monsoons weakened or failed from China to the Mediterranean, meaning the monsoon season carried little or no rainwater.

What caused the megadrought remains a mystery, but its timing suggests a link to Heinrich Event 1 (or “H1”), a massive surge of icebergs and meltwater into the North Atlantic at the close of the last ice age.

Previous studies had implicated southward drift of the tropical rain belt as a localized cause, but the broad geographic coverage in this study paints a more nuanced picture.

“If southward drift were the only cause,” says Stager, lead author of the Science paper, “we’d have found evidence of wetting farther south. But the megadrought hit equatorial and southeastern Africa as well, so the rain belt didn’t just move; it also weakened.”

Climate models have yet to simulate the full scope of the event.

The lack of a complete explanation opens the question of whether an extreme megadrought could strike again as the world warms and de-ices further.

“There’s much less ice left to collapse into the North Atlantic now,” Stager says, “so I’d be surprised if it could all happen again, at least on such a huge scale.”

Given what such a catastrophic megadrought could do to today’s most densely populated regions of the globe, Stager hopes he’s right.

Stager also holds an adjunct position at the Climate Change Institute, University of Maine, Orono.

Co-authors of the paper are David Ryves of Loughborough University in the United Kingdom; Brian Chase of the Institut des Sciences de l’Evolution de Montpellier in France and the Department of Archaeology, University of Bergen, Norway; and Francesco Pausata of the Geophysical Institute, University of Bergen, Norway.

-NSF-

>Can a group of scientists in California end the war on climate change? (Guardian)

>
The Berkeley Earth project say they are about to reveal the definitive truth about global warming

Ian Sample
guardian.co.uk
Sunday 27 February 2011 20.29 GMT

Richard Muller of the Berkeley Earth project is convinced his approach will lead to a better assessment of how much the world is warming. Photograph: Dan Tuffs for the Guardian

In 1964, Richard Muller, a 20-year-old graduate student with neat-cropped hair, walked into Sproul Hall at the University of California, Berkeley, and joined a mass protest of unprecedented scale. The activists, a few thousand strong, demanded that the university lift a ban on free speech and ease restrictions on academic freedom, while outside on the steps a young folk-singer called Joan Baez led supporters in a chorus of We Shall Overcome. The sit-in ended two days later when police stormed the building in the early hours and arrested hundreds of students. Muller was thrown into Oakland jail. The heavy-handedness sparked further unrest and, a month later, the university administration backed down. The protest was a pivotal moment for the civil liberties movement and marked Berkeley as a haven of free thinking and fierce independence.

Today, Muller is still on the Berkeley campus, probably the only member of the free speech movement arrested that night to end up with a faculty position there – as a professor of physics. His list of publications is testament to the free rein of tenure: he worked on the first light from the big bang, proposed a new theory of ice ages, and found evidence for an upturn in impact craters on the moon. His expertise is highly sought after. For more than 30 years, he was a member of the independent Jason group that advises the US government on defence; his college lecture series, Physics for Future Presidents, was voted best class on campus, went stratospheric on YouTube and, in 2009, was turned into a bestseller.

For the past year, Muller has kept a low profile, working quietly on a new project with a team of academics hand-picked for their skills. They meet on campus regularly, to check progress, thrash out problems and hunt for oversights that might undermine their work. And for good reason. When Muller and his team go public with their findings in a few weeks, they will be muscling in on the ugliest and most hard-fought debate of modern times.

Muller calls his latest obsession the Berkeley Earth project. The aim is so simple that the complexity and magnitude of the undertaking is easy to miss. Starting from scratch, with new computer tools and more data than has ever been used, they will arrive at an independent assessment of global warming. The team will also make every piece of data it uses – 1.6bn data points – freely available on a website. It will post its workings alongside, including full information on how more than 100 years of data from thousands of instruments around the world are stitched together to give a historic record of the planet’s temperature.

Muller is fed up with the politicised row that all too often engulfs climate science. By laying all its data and workings out in the open, where they can be checked and challenged by anyone, the Berkeley team hopes to achieve something remarkable: a broader consensus on global warming. In no other field would Muller’s dream seem so ambitious, or perhaps, so naive.

“We are bringing the spirit of science back to a subject that has become too argumentative and too contentious,” Muller says, over a cup of tea. “We are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find.” Why does Muller feel compelled to shake up the world of climate change? “We are doing this because it is the most important project in the world today. Nothing else comes close,” he says.

Muller is moving into crowded territory with sharp elbows. There are already three heavyweight groups that could be considered the official keepers of the world’s climate data. Each publishes its own figures that feed into the UN’s Intergovernmental Panel on Climate Change. Nasa’s Goddard Institute for Space Studies in New York City produces a rolling estimate of the world’s warming. A separate assessment comes from another US agency, the National Oceanic and Atmospheric Administration (Noaa). The third group is based in the UK and led by the Met Office. They all take readings from instruments around the world to come up with a rolling record of the Earth’s mean surface temperature. The numbers differ because each group uses its own dataset and does its own analysis, but they show a similar trend. Since pre-industrial times, all point to a warming of around 0.75C.

You might think three groups was enough, but Muller rolls out a list of shortcomings, some real, some perceived, that he suspects might undermine public confidence in global warming records. For a start, he says, warming trends are not based on all the available temperature records. The data that is used is filtered and might not be as representative as it could be. He also cites a poor history of transparency in climate science, though others argue many climate records and the tools to analyse them have been public for years.

Then there is the fiasco of 2009 that saw roughly 1,000 emails from a server at the University of East Anglia’s Climatic Research Unit (CRU) find their way on to the internet. The fuss over the messages, inevitably dubbed Climategate, gave Muller’s nascent project added impetus. Climate sceptics had already attacked James Hansen, head of the Nasa group, for making political statements on climate change while maintaining his role as an objective scientist. The Climategate emails fuelled their protests. “With CRU’s credibility undergoing a severe test, it was all the more important to have a new team jump in, do the analysis fresh and address all of the legitimate issues raised by sceptics,” says Muller.

This latest point is where Muller faces his most delicate challenge. To concede that climate sceptics raise fair criticisms means acknowledging that scientists and government agencies have got things wrong, or at least could do better. But the debate around global warming is so highly charged that open discussion, which science requires, can be difficult to hold in public. At worst, criticising poor climate science can be taken as an attack on science itself, a knee-jerk reaction that has unhealthy consequences. “Scientists will jump to the defence of alarmists because they don’t recognise that the alarmists are exaggerating,” Muller says.

The Berkeley Earth project came together more than a year ago, when Muller rang David Brillinger, a statistics professor at Berkeley and the man Nasa called when it wanted someone to check its risk estimates of space debris smashing into the International Space Station. He wanted Brillinger to oversee every stage of the project. Brillinger accepted straight away. Since the first meeting he has advised the scientists on how best to analyse their data and what pitfalls to avoid. “You can think of statisticians as the keepers of the scientific method,” Brillinger told me. “Can scientists and doctors reasonably draw the conclusions they are setting down? That’s what we’re here for.”

For the rest of the team, Muller says he picked scientists known for original thinking. One is Saul Perlmutter, the Berkeley physicist who found evidence that the universe is expanding at an ever faster rate, courtesy of mysterious “dark energy” that pushes against gravity. Another is Art Rosenfeld, the last student of the legendary Manhattan Project physicist Enrico Fermi, and something of a legend himself in energy research. Then there is Robert Jacobsen, a Berkeley physicist who is an expert on giant datasets; and Judith Curry, a climatologist at Georgia Institute of Technology, who has raised concerns over tribalism and hubris in climate science.

Robert Rohde, a young physicist who left Berkeley with a PhD last year, does most of the hard work. He has written software that trawls public databases, themselves the product of years of painstaking work, for global temperature records. These are compiled, de-duplicated and merged into one huge historical temperature record. The data, by all accounts, are a mess. There are 16 separate datasets in 14 different formats and they overlap, but not completely. Muller likens Rohde’s achievement to Hercules’s enormous task of cleaning the Augean stables.
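The compile-and-merge step described above can be sketched in miniature. This is an illustrative toy, not Berkeley Earth's actual pipeline: the station IDs, the record format, and the keep-first-copy rule for duplicates are all assumptions made for the example.

```python
# Merge overlapping temperature records from several source datasets into one
# de-duplicated history. Each record is a (station_id, year_month, temp_c)
# tuple; when two sources report the same station and month, keep one copy.
# Toy sketch only -- the real datasets come in many incompatible formats.

def merge_records(*datasets):
    merged = {}
    for dataset in datasets:
        for station_id, year_month, temp_c in dataset:
            key = (station_id, year_month)
            # First source to report a (station, month) pair wins;
            # duplicates from overlapping datasets are dropped.
            merged.setdefault(key, temp_c)
    return [(sid, ym, t) for (sid, ym), t in sorted(merged.items())]

# Two hypothetical source datasets that partially overlap.
source_a = [("ST001", "1900-01", -2.1), ("ST001", "1900-02", -1.5)]
source_b = [("ST001", "1900-01", -2.1), ("ST002", "1900-01", 4.0)]

history = merge_records(source_a, source_b)
print(len(history))  # 3 unique (station, month) records, not 4
```

The real difficulty, of course, is not the merge itself but deciding when two near-identical records from different archives really are the same measurement.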

The wealth of data Rohde has collected so far – some of which dates back to the 1700s – makes for what Muller believes is the most complete historical record of land temperatures ever compiled. It will, of itself, Muller claims, be a priceless resource for anyone who wishes to study climate change. So far, Rohde has gathered records from 39,340 individual stations worldwide.

Publishing an extensive set of temperature records is the first goal of Muller’s project. The second is to turn this vast haul of data into an assessment on global warming. Here, the Berkeley team is going its own way again. The big three groups – Nasa, Noaa and the Met Office – work out global warming trends by placing an imaginary grid over the planet and averaging temperature records in each square. So for a given month, all the records in England and Wales might be averaged out to give one number. Muller’s team will take temperature records from individual stations and weight them according to how reliable they are.
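The contrast between the two approaches can be shown in a few lines. The reliability weights below are invented for illustration; how Berkeley actually estimates station reliability is precisely the hard, contested part of their method.

```python
# Two ways to turn station readings into one regional number for a month.
# Each reading is (temp_c, reliability_weight); the weights are made up.

def grid_average(readings):
    # Equal weighting: roughly how a grid-cell average works.
    temps = [t for t, _ in readings]
    return sum(temps) / len(temps)

def weighted_average(readings):
    # Reliability weighting: roughly the Berkeley idea, with invented weights.
    total = sum(t * w for t, w in readings)
    weight_sum = sum(w for _, w in readings)
    return total / weight_sum

# One well-sited station (weight 1.0) and one poorly sited one (weight 0.2)
# that reads warm. The weighted estimate leans toward the reliable station.
readings = [(10.0, 1.0), (14.0, 0.2)]
print(round(grid_average(readings), 2))      # 12.0
print(round(weighted_average(readings), 2))  # 10.67
```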

This is where the Berkeley group faces its toughest task by far and it will be judged on how well it deals with it. There are errors running through global warming data that arise from the simple fact that the global network of temperature stations was never designed or maintained to monitor climate change. The network grew in a piecemeal fashion, starting with temperature stations installed here and there, usually to record local weather.

Among the trickiest errors to deal with are so-called systematic biases, which skew temperature measurements in fiendishly complex ways. Stations get moved around, replaced with newer models, or swapped for instruments that record in Celsius instead of Fahrenheit. The times at which measurements are taken vary, from say 6am to 9pm. The accuracy of individual stations drifts over time, and even changes in the surroundings, such as growing trees, can shield a station more from wind and sun from one year to the next. Each of these interferes with a station’s temperature measurements, perhaps making it read too cold, or too hot. And these errors combine and build up.
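One of the biases mentioned, a station quietly switching from Fahrenheit to Celsius, appears in the record as a spurious downward step. A toy illustration of undoing such a switch follows; real homogenization algorithms are far more sophisticated, and here we simply assume the switch date is known from station metadata.

```python
# Toy example of one systematic bias: a station that switched from recording
# in Fahrenheit to Celsius midway through its record, creating a fake "step".
# Real homogenization methods are far more elaborate than this sketch.

def fahrenheit_to_celsius(f):
    return (f - 32.0) * 5.0 / 9.0

# Monthly means: the first three values were logged in F, the last three in C.
raw = [50.0, 53.6, 51.8, 10.5, 11.0, 10.2]

# Assume station metadata tells us the switch happened at index 3: convert
# the Fahrenheit segment so the whole series is in Celsius.
SWITCH = 3
homogenized = [fahrenheit_to_celsius(v) for v in raw[:SWITCH]] + raw[SWITCH:]
print([round(v, 1) for v in homogenized])  # [10.0, 12.0, 11.0, 10.5, 11.0, 10.2]
```

In practice the switch date is often undocumented, which is why automatic breakpoint detection, the kind of algorithmic correction Muller favours, is needed at all.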

This is the real mess that will take a Herculean effort to clean up. The Berkeley Earth team is using algorithms that automatically correct for some of the errors, a strategy Muller favours because it doesn’t rely on human interference. When the team publishes its results, this is where the scrutiny will be most intense.

Despite the scale of the task, and the fact that world-class scientific organisations have been wrestling with it for decades, Muller is convinced his approach will lead to a better assessment of how much the world is warming. “I’ve told the team I don’t know if global warming is more or less than we hear, but I do believe we can get a more precise number, and we can do it in a way that will cool the arguments over climate change, if nothing else,” says Muller. “Science has its weaknesses and it doesn’t have a stranglehold on the truth, but it has a way of approaching technical issues that is a closer approximation of truth than any other method we have.”

He will find out soon enough if his hopes to forge a true consensus on climate change are misplaced. It might not be a good sign that one prominent climate sceptic contacted by the Guardian, Canadian economist Ross McKitrick, had never heard of the project. Another, Stephen McIntyre, whom Muller has defended on some issues, hasn’t followed the project either, but said “anything that [Muller] does will be well done”. Phil Jones at the University of East Anglia was unclear on the details of the Berkeley project and didn’t comment.

Elsewhere, Muller has qualified support from some of the biggest names in the business. At Nasa, Hansen welcomed the project, but warned against over-emphasising what he expects to be the minor differences between Berkeley’s global warming assessment and those from the other groups. “We have enough trouble communicating with the public already,” Hansen says. At the Met Office, Peter Stott, head of climate monitoring and attribution, was in favour of the project if it was open and peer-reviewed.

Peter Thorne, who left the Met Office’s Hadley Centre last year to join the Co-operative Institute for Climate and Satellites in North Carolina, is enthusiastic about the Berkeley project but raises an eyebrow at some of Muller’s claims. The Berkeley group will not be the first to put its data and tools online, he says. Teams at Nasa and Noaa have been doing this for many years. And while Muller may have more data, they add little real value, Thorne says. Most are records from stations installed from the 1950s onwards, and then only in a few regions, such as North America. “Do you really need 20 stations in one region to get a monthly temperature figure? The answer is no. Supersaturating your coverage doesn’t give you much more bang for your buck,” he says. They will, however, help researchers spot short-term regional variations in climate change, something that is likely to be valuable as climate change takes hold.

Despite his reservations, Thorne says climate science stands to benefit from Muller’s project. “We need groups like Berkeley stepping up to the plate and taking this challenge on, because it’s the only way we’re going to move forwards. I wish there were 10 other groups doing this,” he says.

For the time being, Muller’s project is organised under the auspices of Novim, a Santa Barbara-based non-profit organisation that uses science to find answers to the most pressing issues facing society and to publish them “without advocacy or agenda”. Funding has come from a variety of places, including the Fund for Innovative Climate and Energy Research (funded by Bill Gates), and the Department of Energy’s Lawrence Berkeley Lab. One donor has had some climate bloggers up in arms: the man behind the Charles G Koch Charitable Foundation owns, with his brother David, Koch Industries, a company Greenpeace called a “kingpin of climate science denial”. On this point, Muller says the project has taken money from right and left alike.

No one who spoke to the Guardian about the Berkeley Earth project believed it would shake the faith of the minority who have set their minds against global warming. “As new kids on the block, I think they will be given a favourable view by people, but I don’t think it will fundamentally change people’s minds,” says Thorne. Brillinger has reservations too. “There are people you are never going to change. They have their beliefs and they’re not going to back away from them.”

Walking across the Berkeley campus, Muller stops outside Sproul Hall, where he was arrested more than 40 years ago. Today, the adjoining plaza is a designated protest spot, where student activists gather to wave banners, set up tables and make speeches on any cause they choose. Does Muller think his latest project will make any difference? “Maybe we’ll find out that what the other groups do is absolutely right, but we’re doing this in a new way. If the only thing we do is allow a consensus to be reached as to what is going on with global warming, a true consensus, not one based on politics, then it will be an enormously valuable achievement.”

Can Geoengineering Save the World from Global Warming? (Scientific American)

Ask the Experts | Energy & Sustainability
Scientific American

Is manipulating Earth’s environment to combat climate change a good idea–and where, exactly, did the idea come from?

By David Biello | February 25, 2011

STARFISH PRIME: This nighttime atmospheric nuclear weapons test generated an aurora (pictured) in Earth’s magnetic field, along with an electromagnetic pulse that blew out streetlights in Honolulu. It is seen as an early instance of geoengineering by science historian James Fleming. Image: Courtesy of US Govt. Defense Threat Reduction Agency

As efforts to combat climate change falter despite ever-rising concentrations of heat-trapping gases in the atmosphere, some scientists and other experts have begun to consider the possibility of using so-called geoengineering to fix the problem. Such “deliberate, large-scale manipulation of the planetary environment,” as the Royal Society of London puts it, is fraught with peril, of course.

For example, one of the first scientists to predict global warming as a result of increasing concentrations of greenhouse gases in the atmosphere—Swedish chemist Svante Arrhenius—thought this might be a good way to ameliorate the winters of his native land and increase its growing season. Whereas that may come true for the human inhabitants of Scandinavia, polar plants and animals are suffering as sea ice dwindles and temperatures warm even faster than climatologists predicted.

Scientific American corresponded with science historian James Fleming of Colby College in Maine, author of Fixing the Sky: The Checkered History of Weather and Climate Control, about the history of geoengineering—ranging from filling the air with the artificial aftermath of a volcanic eruption to seeding the oceans with iron in order to promote plankton growth—and whether it might save humanity from the ill effects of climate change.

[An edited transcript of the interview follows.]

What is geoengineering in your view?
Geoengineering is planetary-scale intervention [in]—or tinkering with—planetary processes. Period.

As I write in my book, Fixing the Sky: The Checkered History of Weather and Climate Control, “the term ‘geoengineering’ remains largely undefined,” but is loosely, “the intentional large-scale manipulation of the global environment; planetary tinkering; a subset of terraforming or planetary engineering.”

As of June 2010 the term has a draft entry in the Oxford English Dictionary—the modification of the global environment or the climate in order to counter or ameliorate climate change. A 2009 report issued by the Royal Society of London defines geoengineering as “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change.”

But there are significant problems with both definitions. First of all, an engineering practice defined by its scale (geo) need not be constrained by its stated purpose (environmental improvement), by any of its currently proposed techniques (stratospheric aerosols, space mirrors, etcetera) or by one of perhaps many stated goals (to ameliorate or counteract climate change). Nuclear engineers, for example, are capable of building both power plants and bombs; mechanical engineers can design components for both ambulances and tanks. So to constrain the essence of something by its stated purpose, techniques or goals is misleading at best.

Geo-scale engineering projects were conducted by both the U.S. and the Soviet Union between 1958 and 1962 that had nothing to do with countering or ameliorating climate change. Starting with the [U.S.’s] 1958 Argus A-bomb explosions in space and ending with the 1962 Starfish Prime H-bomb test, the militaries of both nations sought to modify the global environment for military purposes.

Project Argus was a top-secret military test aimed at detonating atomic bombs in space to generate an artificial radiation belt, disrupt the near-space environment, and possibly intercept enemy missiles. It, and the later tests conducted by both the U.S. and the Soviet Union, peaked with H-bomb detonations in space in 1962 that created an artificial [electro]magnetic [radiation] belt that persisted for 10 years. This is geoengineering.

This idea of detonating bombs in near-space was proposed in 1957 by Nicholas Christofilos, a physicist at Lawrence Berkeley National Laboratory. His hypothesis, which was pursued by the [U.S.] Department of Defense’s Advanced Research Projects Agency [subsequently known as DARPA] and tested in Project Argus and other nuclear shots, held that the debris from a nuclear explosion, mainly highly energetic electrons, would be contained within lines of force in Earth’s magnetic field and would travel almost instantly as a giant current spanning up to half a hemisphere. Thus, if a detonation occurred above a point in the South Atlantic, immense currents would flow along the magnetic lines to a point far to the north, such as Greenland, where they would severely disrupt radio communications. A shot in the Indian Ocean might, then, generate a huge electromagnetic pulse over Moscow. In addition to providing a planetary “energy ray,” Christofilos thought nuclear shots in space might also disrupt military communications, destroy satellites and the electronic guidance systems of enemy [intercontinental ballistic missiles], and possibly kill any military cosmonauts participating in an attack launched from space. He proposed thousands of them to make a space shield.

So nuclear explosions in space by the U.S. and the Soviet Union constituted some of the earliest attempts at geoengineering, or intentional human intervention in planetary-scale processes.

The neologism “geoengineer” refers to one who contrives, designs or invents at the largest planetary scale possible for either military or civilian purposes. Today, geoengineering, as an unpracticed art, may be considered “geoscientific speculation”. Geoengineering is a subset of terraformation, which also does not exist outside of the fantasies of some engineers.

I have recently written to the Oxford English Dictionary asking them to correct their draft definition.

Can geoengineering save the world from climate change?
In short, I think it may be infinitely more dangerous than climate change, largely due to the suspicion and social disruption it would trigger by changing humanity’s relationship to nature.

To take just one example from my book, on page 194: “Sarnoff Predicts Weather Control” read the headline on the front page of The New York Times on October 1, 1946. The previous evening, at his testimonial dinner at the Waldorf Astoria, RCA president Brig. Gen. David Sarnoff had speculated on worthy peaceful projects for the postwar era. Among them were “transformations of deserts into gardens through diversion of ocean currents,” a technique that could also be reversed in time of war to turn fertile lands into deserts, and ordering “rain or sunshine by pressing radio buttons,” an accomplishment that, Sarnoff declared, would require a “World Weather Bureau” in charge of global forecasting and control (much like the “Weather Distributing Administration” proposed in 1938). A commentator in The New Yorker intuited the problems with such control: “Who” in this civil service outfit, he asked, “would decide whether a day was to be sunny, rainy, overcast…or enriched by a stimulating blizzard?” It would be “some befuddled functionary,” probably bedeviled by special interests such as the raincoat and galoshes manufacturers, the beachwear and sunburn lotion industries, and resort owners and farmers. Or if a storm was to be diverted—”Detour it where? Out to sea, to hit some ship with no influence in Washington?”

How old is the idea of geoengineering? What other names has it had?
I can trace geoengineering’s direct modern legacy to 1945, and have prepared a table of such proposals and efforts for the [Government Accountability Office]. Nuclear weapons, digital computers and satellites seem to be the modern technologies of choice. Geoengineering has also been called terraformation and, more restrictively, climate engineering, climate intervention or climate modification. Many have proposed abandoning the term geoengineering in favor of solar radiation management and carbon (or carbon dioxide) capture and storage. Of course, the idea of control of nature is ancient—for example, Phaeton or Archimedes.

Phaeton, the son of Helios, received permission from his father [the Greek sun god] to drive the sun chariot, but failed to control it, putting the Earth in danger of burning up. He was killed by a thunderbolt from Zeus to prevent further disaster. Recently, a prominent meteorologist has written about climate control and urged us to “take up Phaeton’s reins,” which is not a good idea.

Archimedes is known as an engineer who said: “Give me a lever long enough and a place to stand, and I will move the Earth.” Some geoengineers think that this is now possible and that science and technology have given us an Archimedean set of levers with which to move the planet. But I ask: “Where will it roll if you tip it?”

How are weather control and climate control related?
Weather and climate are intimately related: Weather is the state of the atmosphere at a given place and time, while climate is the aggregate of weather conditions over time. A vast body of scientific literature addresses these interactions. In addition, historians are revisiting the ancient but elusive term klima, seeking to recover its multiple social connotations. Weather, climate and the climate of opinion matter in complex ways that invite—some might say require or demand—the attention of both scientists and historians. Yet some may wonder how weather and climate are interrelated rather than distinct. Both, for example, are at the center of the debate over greenhouse warming and hurricane intensity. A few may claim that rainmaking, for example, has nothing to do with climate engineering, but any intervention in the Earth’s radiation or heat budget (such as managing solar radiation) would affect the general circulation and thus the location of upper-level patterns, including the jet stream and storm tracks. Thus, the weather itself would be changed by such manipulation. Conversely, intervening in severe storms by changing their intensity or their tracks or modifying weather on a scale as large as a region, a continent or the Pacific Basin would obviously affect cloudiness, temperature and precipitation patterns with major consequences for monsoonal flows, and ultimately the general circulation. If repeated systematically, such interventions would influence the overall heat budget and the climate.

Both weather and climate control have long and checkered histories: My book explains [meteorologist] James Espy’s proposal in the 1830s to set fire to the crest of the Appalachian Mountains every Sunday evening to generate heated updrafts that would stimulate rain and clear the air for cities of the east coast. It also examines efforts to fire cannons at the clouds in the arid Southwest in the hope of generating rain by concussion.

In the 1920s airplanes loaded with electrified sand were piloted by military aviators who “attacked” the clouds in futile attempts to both make rain and clear fog. Many others have proposed either a world weather control agency or creating a global thermostat, either by burning vast quantities of fossil fuels if an ice age threatened or sucking the CO2 out of the air if the world overheated.

After 1945 three technologies—nuclear weapons, digital computers and satellites—dominated discussions about ultimate weather and climate control, but with very little acknowledgement that unintended consequences and social disruption may be more damaging than any presumed benefit.

What would be the ideal role for geoengineering in addressing climate change?
That it generates interest in and awareness of the impossibility of heavy-handed intervention in the climate system, since there could be no predictable outcome of such intervention, physically, politically or socially.

Why do scientists continue to pursue this then, after 200 or so years of failure?
Science fantasy is informed by science fiction and driven by hubris. One of the dictionary definitions of hubris cites Edward Teller (the godfather of modern geoengineering).

Teller’s hubris knew no bounds. He was the [self-proclaimed] father of the H-bomb and promoted all things atomic, even talking about using nuclear weapons to create canals and harbors. He was also an advocate of urban sprawl to survive nuclear attack, the Star Wars [missile] defense system, and a planetary sunscreen to reduce global warming. He wanted to control nature and improve it using technology.

Throughout history rainmakers and climate engineers have typically fallen into two categories: commercial charlatans using technical language and proprietary techniques to cash in on a gullible public, and sincere but deluded scientific practitioners exhibiting a modicum of chemical and physical knowledge, a bare minimum of atmospheric insight, and an abundance of hubris. We should base our decision-making not on what we think we can do “now” and in the near future. Rather, our knowledge is shaped by what we have and have not done in the past. Such are the grounds for making informed decisions and avoiding the pitfalls of rushing forward, claiming we know how to “fix the sky.”

What we have and haven’t learned from ‘Climategate’ (Grist)

DON’T KNOW MUCH AGNOTOLOGY

Grist.org
BY David Roberts
28 FEB 2011 1:29 PM

I wrote about the “Climategate” controversy (over emails stolen from the University of East Anglia’s Climatic Research Unit) once, which is about what it warranted.

My silent protest had no effect whatsoever, of course, and the story followed a depressingly familiar trajectory: hyped relentlessly by right-wing media, bullied into the mainstream press as he-said she-said, and later, long after the damage was done, revealed as utterly bereft of substance. It’s a familiar script for climate faux controversies, though this one played out on a slightly grander scale.

Investigations galore

Consider that there have now been five, count ‘em five, inquiries into the matter. Penn State established an independent inquiry into the accusations against scientist Michael Mann and found “no credible evidence” [PDF] of improper research conduct. A British government investigation run by the House of Commons’ Science and Technology Committee found that while the CRU scientists could have been more transparent and responsive to freedom-of-information requests, there was no evidence of scientific misconduct. The U.K.’s Royal Society (its equivalent of the National Academies) ran an investigation that found “no evidence of any deliberate scientific malpractice.” The University of East Anglia appointed respected civil servant Sir Muir Russell to run an exhaustive, six-month independent inquiry; he concluded that “the honesty and rigour of CRU as scientists are not in doubt … We have not found any evidence of behaviour that might undermine the conclusions of the IPCC assessments.”

All those results are suggestive, but let’s face it, they’re mostly … British. Sen. James Inhofe (R-Okla.) wanted an American investigation of all the American scientists involved in these purported dirty deeds. So he asked the Department of Commerce’s inspector general to get to the bottom of it. On Feb. 18, the results of that investigation were released. “In our review of the CRU emails,” the IG’s office said in its letter to Inhofe [PDF], “we did not find any evidence that NOAA inappropriately manipulated data … or failed to adhere to appropriate peer review procedures.” (Oddly, you’ll find no mention of this central result in Inhofe’s tortured public response.)

Whatever legitimate issues there may be about the responsiveness or transparency of this particular group of scientists, there was nothing in this controversy — nothing — that cast even the slightest doubt on the basic findings of climate science. Yet it became a kind of stain on the public image of climate scientists. How did that happen?

Smooth criminals

You don’t hear about it much in the news coverage, but recall, the story began with a crime. Hackers broke into the East Anglia email system and stole emails and documents, an illegal invasion of privacy. Yet according to The Wall Street Journal’s Kim Strassel, the emails “found their way to the internet.” In ABC science correspondent Ned Potter’s telling, the emails “became public.” The New York Times’ Andy Revkin says they were “extracted from computers.”

None of those phrasings are wrong, per se, but all pass rather lightly over the fact that some actual person or persons put them on the internet, made them public, extracted them from the computers. Someone hacked in, collected emails, sifted through and selected those that could be most damning, organized them, and timed the release for maximum impact, just before the Copenhagen climate talks. Said person or persons remain uncaught, uncharged, and unprosecuted. There have since been attempted break-ins at other climate research institutions.

If step one was crime, step two was character assassination. When the emails were released, they were combed over by skeptic blogs and right-wing media, who collected sentences, phrases, even individual terms that, when stripped of all context, create the worst possible impression. Altogether the whole thing was as carefully staged as any modern-day political attack ad.

Yet when the “scandal” broke, rather than being about criminal theft and character assassination, it was instantly “Climategate.” It was instantly about climate scientists, not the illegal and dishonest tactics of their attackers. The scientists, not the ideologues and ratf*ckers, had to defend themselves.

Burden of proof

It’s a numbingly familiar pattern in media coverage. The conservative movement that’s been attacking climate science for 20 years has a storied history of demonstrable fabrications, distortions, personal attacks, and nothingburger faux-scandals — not only on climate science, but going back to asbestos, ozone, leaded gasoline, tobacco, you name it. They don’t follow the rigorous standards of professional science; they follow no intellectual or ethical standards whatsoever. Yet no matter how long their record of viciousness and farce, every time the skeptic blogosphere coughs up a new “ZOMG!” it’s as though we start from zero again, like no one has a memory longer than five minutes.

Here’s the basic question: At this point, given their respective accomplishments and standards, wouldn’t it make sense to give scientists the strong benefit of the doubt when they are attacked by ideologues with a history of dishonesty and error? Shouldn’t the threshold for what counts as a “scandal” have been nudged a bit higher?

Agnotological inquiry

The lesson we’ve learned from climategate is simple. It’s the same lesson taught by death panels, socialist government takeover, Sharia law, and Obama’s birth certificate. To understand it we must turn to agnotology, the study of culturally induced ignorance or doubt. (Hat tip to an excellent recent post on this by John Quiggin.)

Beck, Palin, and the rest of Fox News and talk radio operate on the pretense that they are giving consumers access to a hidden “universe of reality,” to use Limbaugh’s term. It’s a reality being actively obscured by the “lamestream media,” academics, scientists, and government officials. Affirming the tenets of that secret reality has become an act of tribal reinforcement, the equivalent of a secret handshake.

The modern right has created a closed epistemic loop containing millions of people. Within that loop, the implausibility or extremity of a claim itself counts as evidence. The more liberal elites reject it, the more it entrenches itself. Standards of evidence have nothing to do with it.

The notion that there is a global conspiracy by professional scientists to falsify results in order to get more research money is, to borrow Quiggin’s words about birtherism, “a shibboleth, that is, an affirmation that marks the speaker as a member of their community or tribe.” Once you have accepted that shibboleth, anything offered to you as evidence of its truth, no matter how ludicrous, will serve as affirmation. (Even a few context-free lines cherry-picked from thousands of private emails.)

Living with the loop

There’s one thing we haven’t learned from climategate (or death panels or birtherism). U.S. politics now contains a large, well-funded, tightly networked, and highly amplified tribe that defines itself through rejection of “lamestream” truth claims and standards of evidence. How should our political culture relate to that tribe?

We haven’t figured it out. Politicians and the political press have tried to accommodate the shibboleths of the right as legitimate positions for debate. The press in particular has practically sworn off plain judgments of accuracy or fact. But all that’s done is confuse and mislead the broader public, while the tribe pushes ever further into extremity. The tribe does not want to be accommodated. It is fueled by elite rejection.

At this point mainstream institutions like the press are in a bind: either accept the tribe’s assertions as legitimate or be deemed “biased.” Until there is a way out of that trap, there will be more and more Climategates.

Fact-Free Science (N.Y. Times)

THE WAY WE LIVE NOW

By JUDITH WARNER
Published: February 25, 2011

Photo: Camille Seaman.

President Obama has made scientific innovation the cornerstone of his plans for “winning the future,” requesting in his recent budget proposal large financing increases for scientific research and education and, in particular, sustained attention to developing alternative energy sources and technologies. “This is our generation’s Sputnik moment,” he declared in his State of the Union address last month.

It would be easier to believe in this great moment of scientific reawakening, of course, if more than half of the Republicans in the House and three-quarters of Republican senators did not now say that the threat of global warming, as a man-made and highly threatening phenomenon, is at best an exaggeration and at worst an utter “hoax,” as James Inhofe of Oklahoma, the ranking Republican on the Senate Environment and Public Works Committee, once put it. These grim numbers, compiled by the Center for American Progress, describe a troubling new reality: the rise of the Tea Party and its anti-intellectual, anti-establishment, anti-elite worldview has brought both a mainstreaming and a radicalization of antiscientific thought.

The politicization of science isn’t particularly new; the Bush administration was famous for pressuring government agencies to bring their vision of reality in line with White House imperatives. In response to this, and with a renewed culture war over the very nature of scientific reality clearly brewing, the Obama administration tried to initiate a pre-emptive strike earlier this winter, issuing a set of “scientific integrity” guidelines aimed at keeping the work of government scientists free from ideological pollution. But since taking over the House of Representatives, the Republicans have packed science-related committees with lawmakers who reject such basic findings as the reality of global warming and the threats of climate change. Fred Upton, the head of the House Energy and Commerce Committee, has said outright that he does not believe that global warming is man-made. John Shimkus of Illinois, who also sits on the committee — as well as on the Subcommittee on Energy and Environment — has said that the government doesn’t need to make a priority of regulating greenhouse-gas emissions, because as he put it late last year, “God said the earth would not be destroyed by a flood.”

Source: Gallup

Whoever emerges as the Republican presidential candidate in 2012 will very likely have to embrace climate-change denial. Mitt Romney, Tim Pawlenty and Mike Huckabee, all of whom once expressed some support for action on global warming, have notably distanced themselves from these views. Saying no to mainstream climate science, notes Daniel J. Weiss, a senior fellow and director of climate strategy for the Center for American Progress, is now a required practice for Republicans eager to play to an emboldened conservative base. “Opposing the belief that global warming is human-caused has become systematic, like opposition to abortion,” he says. “It’s seen as another way for government to control people’s lives. It’s become a cultural issue.”

That taking on the scientific establishment has become a favored activity of the right is quite a turnabout. After all, questioning accepted fact, revealing the myths and politics behind established certainties, is a tactic straight out of the left-wing playbook. In the 1960s and 1970s, the pushback against scientific authority brought us the patients’ rights movement and was a key component of women’s rights activism. That questioning of authority veered in a more radical direction in the academy in the late 1980s and early 1990s, when left-wing scholars doing “science studies” increasingly began taking on the very idea of scientific truth.

This was the era of the culture wars, the years when the conservative University of Chicago philosopher Allan Bloom warned in his book “The Closing of the American Mind” of the dangers of liberal know-nothing relativism. But somehow, in the passage from Bush I to Bush II and beyond, the politics changed. By the mid-1990s, even some progressives said that the assault on truth, particularly scientific truth, had gone too far, a point made most famously in 1996 by the progressive New York University physicist Alan Sokal, who managed to trick the left-wing academic journal Social Text into printing a tongue-in-cheek article, written in an overblown parody of dense academic jargon, that argued that physical reality, as we know it, may not exist.


Illustration: Nomoco

Following the Sokal hoax, many on the academic left experienced some real embarrassment. But the genie was out of the bottle. And as the political zeitgeist shifted, attacking science became a sport of the radical right. “Some standard left arguments, combined with the left-populist distrust of ‘experts’ and ‘professionals’ and assorted high-and-mighty muckety-mucks who think they’re the boss of us, were fashioned by the right into a powerful device for delegitimating scientific research,” Michael Bérubé, a literature professor at Pennsylvania State University, said of this evolution recently in the journal Democracy. He quoted the disillusioned French theorist Bruno Latour, a pioneer of science studies who was horrified by the climate-change-denying machinations of the right: “Entire Ph.D. programs are still running to make sure that good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth . . . while dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives.”

Some conservatives argue that the Republican war on science is bad politics and that catering to the “climate-denier sect” in the party is a dangerous strategy, as David Jenkins, a member of Republicans for Environmental Protection wrote recently on the FrumForum blog. Public opinion, after all, has not kept pace with Republican rhetoric on the topic of climate change. A USA Today/Gallup poll conducted in January found that 83 percent of Americans want Congress to pass legislation promoting alternative energy, and a recent poll by the Opinion Research Corporation found that almost two-thirds want the Environmental Protection Agency to be more aggressive.

For those who have staked out extreme positions, backtracking may not be easy: “It is very difficult to get a man to understand something when his tribal identity depends on his not understanding it,” Bérubé notes. Maybe it’s time for some new identity politics.

Judith Warner is the author, most recently, of “We’ve Got Issues: Children and Parents in the Age of Medication.”

>‘Rapid Response Team’ Pairs Scientists and Media (The Yale Forum on Climate Change & The Media)

By Lisa Palmer | February 16, 2011
The Yale Forum on Climate Change & The Media


Think of it as the climate scientists/journalists version of “eHarmony.” A volunteer website launched by scientists serves as a matchmaking venue for media outlets and government officials looking for input on climate science topics.

It’s a Friday morning and Scott Mandia is scanning the Climate Science Rapid Response Team e-mail inbox he shares with two other climate science match-makers.

Today, on Mandia’s watch, a message from a journalist arrives at 5:30 a.m. It’s the first of two or three media requests he’ll likely get this day. Mandia’s task now? Ask for a response from one of 135 scientists in his network most qualified to answer the question. You might think of it as the climate scientists/journalists version of “eHarmony.”

Mandia, a professor of physical sciences at Suffolk County Community College, in New York, and his fellow Rapid Response founders, John Abraham, associate professor of thermodynamics at St. Thomas University, and Ray Weymann, a California-based retired astronomer and member of the National Academy of Sciences, take shifts. Each is a volunteer custodian of e-mail requests that flow in from their climate change match-making website connecting climate scientists with lawmakers and media outlets.

Launched in November 2010, the website tries to narrow the information gap between scientific understanding of climate change and what the public knows. Scientists involved with the group are screened and selected on an invitation-only basis. The experts come from a range of climate change science specialties, everything from climate modeling researchers and ecologists to economists and policy experts. Most are university faculty members or employees of government laboratories. It’s not a collection that most climate “contrarians” might be comfortable with.
The all-volunteer group promises to respond quickly to media requests to make sure science is portrayed accurately in the day’s news. They say turnaround time for requests is as fast as two hours for media operating on a short deadline.

“The scientists became members of our group because they understand that, as scientists, they have a responsibility to engage the public by engaging the media,” Mandia said in a phone interview. Mandia said he and his colleagues operate the service with no funding, and the website design was donated by Richard Hawkins, director of the Public Interest Research Centre in the United Kingdom.

Early On, a Confusing Mix-up with AGU Media Project

Coincidentally, the Climate Science Rapid Response Team website debuted at the same time as the relaunch of the American Geophysical Union’s Climate Q and A service, which has similarities with the Rapid Response Team but strictly limits questions to matters of science. (See Yale Forum related story.) Some confusion ensued when the Los Angeles Times erroneously reported a link between the AGU’s group and the Rapid Response volunteers, and AGU staff quickly initiated a damage-control effort, fearing that some on Capitol Hill would find, based on the newspaper’s coverage, their effort overly politicized.

“When that (Los Angeles Times) story came out, it sounded like scientists were fighting back against politicians. We are not advocates about policy, but it made us look like we were the 98 pound weaklings getting sand kicked in their face,” said Mandia. But the bad press proved a boon, swelling the ranks of the Rapid Response force.

“Scientists then realized they were being criticized unfairly and wanted to get involved,” said Mandia. The number of scientists involved with the Rapid Response Team quadrupled.

The AGU’s Q and A Service was first formed to support media requests during the United Nations Climate Change Conference in Copenhagen in December 2009. It started up again prior to the U.N. talks in Cancun. The Q and A service is open to anyone with a Ph.D. willing to provide scientific expertise on a subject.

“AGU is not a partisan organization. We are here to make our science available so there is good information available to the media,” AGU Executive Director Chris McEntee said in a telephone interview.

About 700 scientists are registered with AGU’s service, which has provided answers to 68 media outlets. “We think it is important that policymakers, media, and the public get unbiased, nonpartisan information when making a decision,” said McEntee. “The service fits with our mission to promote scientific discovery for the benefit of humanity.”

Scientists Step Up

Mandia said scientists involved with his effort are usually tapped once or twice a month for media inquiries. No single person carries the burden of too many repeat requests because the group has selected a range of scientists, vetted for their expertise in various disciplines. The Rapid Response Team also has promised confidentiality of its scientists, who can remain anonymous if they wish. But Mandia said that, despite the offer, “none of them has ever requested anonymity.”

Andrew Dessler, an atmospheric scientist at Texas A & M University, is affiliated with both information services, but is more involved with the Climate Science Rapid Response Team. He was prompted into action because “dealing with climate change misinformation is difficult to do on your own,” Dessler wrote in an e-mail. “Effectively responding to the denial machine absolutely requires coordinated action by the climate science community. In this way, I think the CCRRT [sic] is a model of how scientists can effectively spend their limited resources on outreach.”

Dessler gives the Rapid Response service high marks, especially for institutionalizing the response process from scientists and distributing the communications workload. “You have to realize the asymmetry here. For [some] so-called skeptics, spreading misinformation is their full-time job. Scientists, on the other hand, already have a full-time job: research and teaching. Thus, we need to have mechanisms to level the playing field, and the CCRRT [sic] is one such mechanism,” said Dessler, adding that he encourages scientists to get involved in public outreach. “Because we are mainly funded by tax dollars, I think we have a responsibility to repay this by spreading the results of our research as far and wide as possible.”

A Goal of Precise Pairing

As of early February, more than 100 media organizations — newspaper, magazine, online media, television, and radio — and government officials have used the service to find climate scientists who could comment on a story. Mainstream media users have included The New York Times, The Guardian (UK), CNN International, and American Public Media’s “Marketplace,” among many others. Mandia said many of the media questions in December had to do with severe weather in the United States and in Northern Europe.

The Rapid Response website includes testimonials from such reporters as Ben Webster, of The Times in London: “I asked a difficult question about ice cores and was impressed by the efforts the team made to find the right people to respond. The response was balanced, stating clearly what was known but also the uncertainties.”

Eli Kintisch, a reporter for Science and author of Hack the Planet (Wiley, 2010), called on the service when he was looking for a scientist to serve as a color commentator on a live blog he was producing for Science during a House hearing. Facing time constraints, Kintisch relied on the matchmakers for the legwork of finding someone to fill the role.

“I have my own batch of sources on climate that I have used to comment on stories, and I have used ProfNet in the past occasionally. But I was looking for someone who had some experience with public engagement and would be available for two to four hours,” Kintisch said in a telephone interview. “The hearing was a review of the basics of climate science, and there were some prominent contrarians testifying, so I thought it would be useful to have someone available who knew the basics of climate science.”

While not all climate scientists feel comfortable engaging with the media, they are finding ways to get more involved in communications. Mandia said, “Some scientists are nervous about speaking to the press and worry they will be misquoted, but getting out of the ‘Ivory Tower’ is becoming very important.”

Lisa Palmer is a Maryland-based freelance writer and a regular contributor to The Yale Forum. (E-mail: lisa@yaleclimatemediaforum.org)

>The Role of Trust in Climate Change Adaptation and Resilience: Can ICTs help?

February 27, 2011
By Angelica Valeria Ospina
From http://niccd.wordpress.com.

Amid the magnitude and uncertainty that characterize the climate change field, trust is a topic that is often overlooked, despite being one of the cornerstones of resilience building and adaptive capacity.

Trust is an essential element of effective communication, networking and self-organisation, and is thus indispensable to efforts to withstand and recover from climate change-related events, whether acute shocks or slow-onset trends. It is an equally important basis for vulnerable communities to adapt, and potentially transform, in the face of the largely unknown impacts of climatic events.

Rooted in the beliefs, expectations and perceptions that bind people to the institutions within which they operate or interact, trust often acts as an underlying driver of action or inaction, making it an important factor in decision-making processes.

With the rapid diffusion of Information and Communication Technologies (ICTs) such as mobile phones and the Internet, the unprecedented speed at which information is produced and shared poses a new set of possibilities, and challenges, for communication management and trust building, both essential to developing resilience and adapting to the changing climate.

Adaptation experiences suggest that vulnerable communities are more prone to act upon information that they can ‘trust’, a complex concept linked to factors such as the source of the information (and the local perception of it), the language used to convey the message, the role and credibility of ‘infomediaries’, the local facilitators who help disseminate the information, and the use of local appropriation mechanisms and community involvement, among others.

Climate change Adaptation Strategies and National Programmes of Action are increasingly called upon to foster trust-building processes by engaging local actors and gaining a better understanding of local needs and priorities. Trust building in the climate change field thus involves finding new collaborative spaces where the interests of all stakeholders can be heard, and where both scientific and traditional knowledge can be shared and built upon, leading to more effective adaptive practices and, potentially, transformation.

The widespread diffusion of ICTs, such as mobile phones, Internet access and even community radio, within developing-country environments could be opening up new opportunities to use these tools in support of trust-building processes, a necessary step towards change and transformation.

So, how can ICTs help to build trust within climate change resilience and adaptation processes?

Research at the intersection of ICTs, climate change and development suggests several ways in which ICT tools can support trust:

  • Multi-level Communication: ICTs can facilitate communication and trust-building between and across actors at the micro (e.g. community members), meso (e.g. NGOs) and macro levels (e.g. policy makers), fostering participation in the design of adaptation and mitigation strategies, as well as accountability and monitoring during their implementation.
  • Network Strengthening: Social networks are key to processes of adaptation to climate change and resilience building, and trust is at the core of how networks function. The use of ICTs such as mobile phones can help strengthen communication and the bonds of trust within and among networks, which can in turn contribute to the effectiveness of community networks’ support and to access to resources.
  • Self-organisation: The ability to self-organise is a key attribute of resilient systems, and involves processes of collaboration that require trust among stakeholders and institutions. By facilitating access to information and resources through both point-to-multipoint and point-to-point exchange, ICTs can be important contributors to self-organisation and to the coordination of both preventive and reactive joint efforts in the face of climatic events. They can help climate change actors verify or double-check facts when an information source is not entirely trusted, diversifying their potential responses to climatic events. ICTs can also support trust by enabling the assessment of options and trade-offs involved in decision-making.
  • Appropriation and Infomediaries: The role of actors who ‘translate’ or ‘mediate’ technical and scientific information to suit the local context is vital for the appropriation of information. Tools such as the Internet, GIS or mobile phones can support and strengthen the role of agricultural extension workers, deepening the relationships of trust they have established with local producers affected by climate change by offering them a broader set of options and information, for example on crop diversification or pest management, as well as more immediate responses to their queries.
  • Transparency and Fluency: Online platforms that give citizens new channels to voice their views and concerns, and that allow interaction with decision makers, illustrate the potential of ICTs for transparency and information flow, an important factor in local perceptions of, expectations of, and trust in local, regional and national institutions.

While at the onset of extreme events we are quick to recognize the importance of communication, we often fail to acknowledge the pivotal role of trust in adaptation and resilience, as well as the potential of innovative tools such as ICTs to help foster trust and strengthen networks and collaboration.

But just as important as discussing the potential of ICTs for trust building in adaptive processes is discussing the risks associated with their use.

Ensuring the quality, accuracy and relevance of information is key to avoiding maladaptive practices and poor decision-making, which could deepen existing vulnerabilities and inequalities. Issues of power and differential access to information also need to be addressed when considering the potential of these tools for trust building, network strengthening and participatory processes, including those related to climate change.

Ultimately, ICTs could play an important supporting role in building and strengthening trust within vulnerable communities affected by climate change impacts, as well as in National Adaptation Plans and Programmes of Action that seek to build long-term climate change resilience on a multi-stakeholder, participatory base.

>ICTs and the Climate Change ‘Unknowns’: Tackling Uncertainty

January 4, 2011
By Angelica Valeria Ospina
From http://niccd.wordpress.com/

Determining the repercussions of the changing climate is a field of great unknowns. While the impacts of climatic variations and seasonal changes on the most vulnerable populations are expected to increase, and to become most evident in the most vulnerable ecosystems and natural habitats, the exact magnitude and impact of climate change effects remain, for the most part, open questions.

Such uncertainty is a key contributor to climate change vulnerability, particularly among developing country populations that lack the resources, including access to information and knowledge, to properly prepare for and cope with its impacts.

But, how can vulnerable contexts prepare for the ‘unknowns’ posed by climate change? And should the quest for ‘certainty’ be the focus of our attention?

The rapid diffusion of Information and Communication Technologies (ICTs) within developing country environments, the hardest hit by climate change-related manifestations, is starting to shed new light on these issues.

A recent article by Reuters identified 10 climate change adaptation technologies that will be crucial for coping with and adapting to the effects of the changing climate over the next century.

The bullet points below link these 10 aspects to the potential of ICTs in the climate change field, highlighting some of the ways in which they can help vulnerable populations better prepare for and cope with the effects of climatic uncertainty.

  • Innovations around Infectious Diseases: Extreme weather events and changing climatic patterns associated with climate change have been linked to the spread of vector-borne (e.g. malaria and dengue) and water-borne diseases. Within this context, ICTs such as mobile phones, community radio and the Internet have the potential to enable information sharing, awareness raising and capacity building on key health threats, enabling effective prevention and response.
  • Flood Safeguards: Climatic changes such as increased and erratic patterns of precipitation negatively affect the capacity of flood and drainage systems, built environment, energy and transportation, among others. ICT applications such as Geographic Information Systems (GIS) can facilitate the monitoring and provision of relevant environmental information to relevant stakeholders, including decision-making processes for the adaptation of human habitats.
  • Weather Forecasting Technologies: ICTs play a key role in the implementation of innovative weather forecasting technologies, including the integration of community monitoring. The use of mobile phones and SMS for reporting on locally-relevant indicators (e.g. likelihood of floods) can contribute to greater accuracy and more precise flood warnings to communities. Based on this information, authorities could design and put in action more appropriate strategies, and farmers could better prepare for evacuations, protect their livestock and better plan local irrigation systems, among others.
  • Insurance Tools: Access to new and more diversified sources of information and knowledge through tools such as the Internet or the mobile phone can facilitate the access to insurance mechanisms, and to information about national programs/assistance available to support vulnerable populations.
  • More Resilient Crops: In the face of higher temperatures, more variable crop seasons and decreasing productivity, ICTs have the potential to enhance food security by strengthening agricultural production systems through information about pest and disease control, planting dates, seed varieties, irrigation applications, and early warning systems, as well as improving market access, among others.
  • Supercomputing: According to the International Telecommunication Union (ITU), the use of ICT-equipped sensors (telemetry), aerial photography, satellite imagery, grid technology, global positioning by satellite (GPS) (e.g. for tracking slow, long-term movement of glaciers) and computer modeling of the earth’s atmosphere, among others, play a key role in climate change monitoring. New technologies continue to be developed, holding great potential for real-time, more accurate information key to strengthen decision-making processes.
  • Water Purification, Water Recycling and Efficient Irrigation Systems: ICTs can contribute to the improvement of water resource management techniques, monitoring of water resources, capacity building and awareness raising. Broadly diffused applications such as mobile phones can serve as tools to disseminate information on low-cost methods for desalination, using gray water and harvesting rainwater for everyday uses, as well as for capacity building on new irrigation mechanisms, among others.
  • Sensors: In addition to the role that sensors play in monitoring climate change by helping to capture more accurate data, research indicates that they also constitute promising technologies for improving energy efficiency. Sensors can be used in several environmental applications, such as control of temperature, heating and lighting.

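The community-monitoring idea in the weather-forecasting bullet above can be made concrete with a small sketch: incoming SMS flood reports are parsed and tallied per village, and a warning is raised once enough distinct reporters confirm within a time window. Everything here (the `FLOOD <village>` message format, the threshold, the window) is a hypothetical illustration, not a description of any deployed system.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sketch of SMS-based community flood monitoring.
# Thresholds, message format and numbers are illustrative assumptions.

REPORT_THRESHOLD = 3         # distinct reporters needed to trigger a warning
WINDOW = timedelta(hours=2)  # reports older than this are ignored

def parse_report(sms_text, sender, received_at):
    """Parse a message like 'FLOOD <village>' into a report dict, else None."""
    parts = sms_text.strip().split(maxsplit=1)
    if len(parts) != 2 or parts[0].upper() != "FLOOD":
        return None  # not a flood report
    return {"village": parts[1].title(), "sender": sender, "time": received_at}

def active_warnings(reports, now):
    """Return villages reported by >= REPORT_THRESHOLD distinct senders within WINDOW."""
    senders_by_village = defaultdict(set)
    for r in reports:
        if r and now - r["time"] <= WINDOW:
            senders_by_village[r["village"]].add(r["sender"])
    return sorted(v for v, s in senders_by_village.items()
                  if len(s) >= REPORT_THRESHOLD)

now = datetime(2011, 1, 4, 9, 0)
messages = [
    ("FLOOD riverside", "+111", now - timedelta(minutes=50)),
    ("FLOOD riverside", "+222", now - timedelta(minutes=30)),
    ("FLOOD riverside", "+333", now - timedelta(minutes=10)),
    ("FLOOD hilltop",   "+444", now - timedelta(minutes=5)),
]
reports = [parse_report(t, s, at) for t, s, at in messages]
print(active_warnings(reports, now))  # prints ['Riverside']
```

Requiring several distinct senders is what makes the aggregation a trust mechanism: no single reporter can trigger a warning alone, and in practice the threshold and window would need calibration against false or duplicate reports.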
This short identification of areas of potential does not suggest that ICTs can eliminate climatic uncertainty, but it does point to their potential to help vulnerable populations strengthen their capacity to withstand and recover from shocks and changing climatic trends.

By contributing to resilience and strengthening adaptive capacity, ICTs can help tackle climate change uncertainty not only by providing access to information and knowledge, but also by fostering networking, personal empowerment and participation, and by facilitating self-organisation, access to diverse resources, and learning, all of which ultimately contribute to better preparedness and response, including the possibility of transformation in the face of the unknown.

The need to reduce uncertainty should not supplant efforts to foster creativity and flexibility, which lie at the core of resilient responses to the ongoing challenges posed by climate change.

—————————————————–

*Further examples on the linkages between ICTs, climate change and vulnerability dimensions can be found at: http://www.niccd.org/ScopingStudy.pdf

>Yale Project on Knowledge of Climate Change Across Global Warming’s Six Americas

From Anthony Leiserowitz, Yale Project on Climate Change Communication

“Today we are pleased to announce the release of a new report entitled ‘Knowledge of Climate Change Across Global Warming’s Six Americas.’ The report draws on a national study we conducted last year on what Americans understand about how the climate system works, and about the causes, impacts, and potential solutions to global warming; it is available here.

Overall, we found that knowledge about climate change varies widely across the Six Americas – 49 percent of the Alarmed received a passing grade (A, B, or C), compared to 33 percent of the Concerned, 16 percent of the Cautious, 17 percent of the Doubtful, 4 percent of the Dismissive, and 5 percent of the Disengaged. In general, the Alarmed and the Concerned better understand how the climate system works and the causes, consequences, and solutions to climate change than the Disengaged, the Doubtful and the Dismissive. For example:

· 87% of the Alarmed and 76% of the Concerned understand that global warming is caused mostly by human activities compared to 37% of the Disengaged, 6% of the Doubtful and 3% of the Dismissive;
· 86% of the Alarmed and 71% of the Concerned understand that emissions from cars and trucks contribute substantially to global warming compared to 18% of the Disengaged, 16% of the Doubtful and 10% of the Dismissive;
· 89% of the Alarmed and 64% of the Concerned understand that a transition to renewable energy sources is an important solution compared to 12% of the Disengaged, 13% of the Doubtful and 7% of the Dismissive.

However, this study also found that occasionally the Doubtful and Dismissive have as good or a better understanding than the Alarmed or Concerned. For example:

· 79% of the Dismissive and 74% of the Doubtful correctly understand that the greenhouse effect refers to gases in the atmosphere that trap heat, compared to 66% of the Alarmed and 64% of the Concerned;
· The Dismissive are less likely to incorrectly say that “the greenhouse effect” refers to the Earth’s protective ozone layer than all other groups, including the Alarmed (13% vs. 24% respectively);
· 50% of the Dismissive and 57% of the Doubtful understand that carbon dioxide traps heat from the Earth’s surface, compared to 59% of the Alarmed, and 45% of the Concerned.

This study also identified numerous gaps between expert and public knowledge about climate change. For example, only:

· 13% of the Alarmed know how much carbon dioxide there is in the atmosphere today (approximately 390 parts per million) compared to 5% of the Concerned, 9% of the Cautious, 4% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 52% of the Alarmed have heard of coral bleaching, vs. 24% of the Concerned, 23% of the Cautious, 5% of the Disengaged, 21% of the Doubtful and 24% of the Dismissive;
· 46% of the Alarmed have heard of ocean acidification, vs. 22% of the Concerned, 25% of the Cautious, 6% of the Disengaged, 23% of the Doubtful and 16% of the Dismissive.

This study also found important misconceptions leading many to misunderstand the causes and therefore the solutions to climate change. For example, many Americans confuse climate change and the hole in the ozone layer. Such misconceptions were particularly apparent for the Alarmed and Concerned segments:

· 63% of the Alarmed and 49% of the Concerned believe that the hole in the ozone layer is a significant contributor to global warming compared to 32% of the Cautious, 12% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 49% of the Alarmed and 36% of the Concerned believe that aerosol spray cans are a significant contributor to global warming compared to 20% of the Cautious, 9% of the Disengaged, 7% of the Doubtful and 5% of the Dismissive;
· 39% of the Alarmed and 23% of the Concerned believe that banning aerosol spray cans would reduce global warming compared to 13% of the Cautious, 3% of the Disengaged, 4% of the Doubtful and 1% of the Dismissive.

Concerned, Cautious and Disengaged Americans also recognize their own limited understanding of the issue. Fewer than 1 in 10 say they are “very well informed” about climate change, and 75 percent or more say they would like to know more. The Alarmed also say they need more information (76%), while the Dismissive say they do not need any more information about global warming (73%).

Overall, these and other results within this report demonstrate that most Americans both need and desire more information about climate change. While information alone is not sufficient to engage the public in the issue, it is often a necessary precursor of effective action.”

>Are Climate Change and Social Conflict Linked? (JC)

JC e-mail 4202, February 17, 2011

Article by environmentalist Sérgio Abranches, of Ecopolítica, for Plural

Could extreme weather events have played an important role in the popular uprisings in the Middle East and North Africa? Is climate change already affecting social relations?

The question may sound like one of those contrived ways of raising the alarm about climate change. But it is not. It is a relevant concern, and the connection is already being studied by scientists from the most diverse fields: climatologists, ecologists, sociologists, economists. The question is more complex than it appears at first sight. It asks about two far-from-trivial relationships: between extreme weather events and climate change, and between climate anomalies and social conflict.

Scientists are always reluctant to attribute the emergence of specific extreme weather events to climate change. They argue, rightly, that there is no scientific basis for linking a particular event to the global, long-term phenomenon of climate change. But climatologist Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research in the United States, recently defended a different view of this problem, known in climate science as “the attribution problem.” In an exclusive interview with the editor of the blog Climate Progress, the physicist Joseph Romm, Trenberth said that:

Os cientistas sempre começam com a afirmação de que não se pode atribuir um evento isolado à mudança climática. Mas ela tem uma influência sistemática sobre todos esses eventos climáticos atuais, segundo ele, por causa do fato de que há mais vapor d’água circulando na atmosfera do que se tinha, digamos, trinta anos atrás. É uma quantidade extra de 4% de vapor d’água. Ele aumenta a força das tempestades, dá mais umidade para essas tempestades e é ruim que o público não veja isto como uma manifestação da mudança climática. A perspectiva é que esse tipo de coisa só aumentará e piorará no futuro.

A quantidade de gases estufa na atmosfera, segundo a maioria dos cientistas, já tem um efeito de aceleração do aquecimento da Terra. Portanto, a mudança climática decorrente deve ser vista como um processo em curso com tendência de agravamento ao longo do tempo. Ou seja, é de longo prazo, mas as coisas não acontecem todas no futuro de uma vez só. Vão acontecendo progressivamente, com aumento de frequência e intensidade.

And what does this have to do with events in the Middle East and North Africa?

We had an atypical period with a large number of extreme weather events in 2010 and early this year: droughts, floods, heat and cold waves, intense storms, blizzards, wildfires. These events hurt agricultural output in every part of the world; the most telling cases were in Kazakhstan, Russia, Canada, Australia, the United States, China and Brazil. The result was a sharp rise in international agricultural commodity prices and food-price inflation. Climate inflation.

The blog Climate Progress compiled a series of references from scientists and the press on these relationships. Among them is a study by economists Rabah Arezki, of the IMF, and Markus Brückner, of the University of Adelaide in Australia. They examined the effect of swings in international food prices on democratic institutions and internal conflict in more than 120 countries between 1970 and 2007. The analysis shows a clear relationship in low-income countries: democratic institutions deteriorate, and the incidence of street conflict, anti-government demonstrations and mass movements rises.

Why in low-income countries? In high-income countries the relationship is not significant. Because the lower a country's income, the larger the share of food in the household budget, and therefore the more sensitive the population is to sharp increases in the price of food.

Historical studies show a relationship between climate change and social collapse. Crop failures and the resulting rise in food prices are frequent causes of popular uprisings and revolutions in modern and contemporary history. Egypt's own history records conflicts tied to the price of grain (unfortunately I do not have a digital copy of this article). India has also had many such episodes; the most notable was perhaps the "grain revolt" of 1918, triggered by shortages and grain-price increases after exceptionally weak monsoon rains.

In several of these historical episodes the relationship was direct: rising food prices caused the revolt. In the current case, the causes lie elsewhere. To understand what is happening in Egypt, for example, one must distinguish between what causes deep discontent and what triggers the revolt. What caused the discontent was tyranny itself: an autocratic government, a dictator in power for 30 years, a corrupt administration. Repression, censorship, arbitrary arrests, torture. On the social side, widespread poverty, immense inequality of income and wealth, and no prospect of social mobility for the young. In recent years there were several protests, all harshly repressed, but none on the scale of the mass revolt that began on the 25th.

What triggers a mass uprising? A conjuncture, that is, a convergence of previously unrelated factors that come together and become "the last straw," producing the turning point, the tipping point, that transforms a protest like countless others into an explosion of general discontent, into an uncontrollable, spontaneous revolt of the masses.

In Egypt, important economic, political and accelerating factors created that conjuncture. The economic factor was the rise in food prices, which hit the poorest families hard. Rising prices for fuel, housing and education hit the budgets of the middle class. This price shock struck a weakened economy with very high youth unemployment. Unemployment aggravates a situation of low social mobility, wiping out young people's prospects for the future. In some cases, qualified young people slide down the social ladder, forced to work in low-skill jobs. Their despair is easily transmitted to their parents and families.

The political factor was the news that Hosni Mubarak's son, Gamal Mubarak, would be his successor, probably already as the candidate in the rigged elections scheduled for September. The prospect of a Mubarak dynasty provoked enormous rejection in a country with a dynastic past.

The socioeconomic picture in Egypt is not very different from that of the other countries. In Tunisia, in Sudan, even in Saudi Arabia, there is tyranny, widespread poverty, inequality of income and wealth, youth unemployment and rising food prices. I recently heard an interview on CNN with one of the Saudi princes, saying that the situation in his country is different, but that there is indeed dissatisfaction with rising food and housing prices. The government raised salaries so that people could absorb the additional cost. The evidence shows that subsidies and wage increases meant to offset food inflation have only a temporary effect and end up feeding back into prices.

In Egypt, the increase in food prices was very steep, as the chart at http://www.ecopolitica.com.br/wp-content/uploads/2011/02/Inflation-in-Egypt.jpg shows.

Food prices rose 40 percent, and housing and education prices more than 10 percent. The poor are sensitive to inflation in food and housing; the middle class, to inflation in education, housing and fuel.

What accelerated the revolt and allowed it to become a mass movement so quickly? Social media and networks, and the demonstration effect of the uprising in Tunisia, which spread through those digital channels. Obviously, social media and networks do not make revolutions. They are a revolution in how we communicate and exchange information, and in that they have been revolutionary. But in the sociology of social conflict their role is that of accelerator and transmitter, enabling, for example, the initial contagion, which then continues through physical contact in the streets and squares, and the propagation of events that end up increasing the propensity to act.

They can also prolong the contagion. Sociology has long since worked out how contagion processes, such as flash-mob lootings, end: when there is no one left to infect and the chain breaks. Social networks and media, in Egypt's case mainly SMS, bring more people into the movement and refuel the contagion.

It is no accident that these revolts occupy city streets and squares. The urban environment is far more conducive to mass contagion. The growth of the population with access to mobile phones provides the main instrument of contagion. See the charts for Egypt (http://www.ecopolitica.com.br/wp-content/uploads/2011/02/Egypt-Mobile-subs.jpg) and Tunisia (http://www.ecopolitica.com.br/wp-content/uploads/2011/02/Tunisia-Mobile-subs.jpg).

But the internet also played an important role in keeping the world informed about what was happening in Egypt, probably averting a bloodbath, and in communication among Egyptians. That is why the government shut down access to the Web.

Nothing in this process is simple. We are talking about the convergence of complex processes in the climate system, the social system and global society. That convergence will only grow in the coming years and decades. We will live through more climatic and social turbulence, in the midst of an unprecedented scientific and technological revolution.

To hear the author's commentary on radio CBN, go to http://www.ecopolitica.com.br/2011/02/02/mudanca-climatica-e-conflito-social-estao-associados

>Increased flood risk linked to global warming (Nature)

>

Published online 16 February 2011 | Nature 470, 316 (2011) | doi:10.1038/470316a


Likelihood of extreme rainfall may have been doubled by rising greenhouse-gas levels.

The effects of severe weather — such as these floods in Albania — take a huge human and financial toll. REUTERS/A. CELI


Climate change may be hitting home. 


Rises in global average temperature are remote from most people’s experience, but two studies in this week’s Nature [1, 2] conclude that climate warming is already causing extreme weather events that affect the lives of millions. The research directly links rising greenhouse-gas levels with the growing intensity of rain and snow in the Northern Hemisphere, and the increased risk of flooding in the United Kingdom.

Insurers will take note, as will those developing policies for adapting to climate change. “This has immense importance not just as a further justification for emissions reduction, but also for adaptation planning,” says Michael Oppenheimer, a climate-policy researcher at Princeton University in New Jersey, who was not involved in the studies.

There is no doubt that humans are altering the climate, but the implications for regional weather are less clear. No computer simulation can conclusively attribute a given snowstorm or flood to global warming. But with a combination of climate models, weather observations and a good dose of probability theory, scientists may be able to determine how climate warming changes the odds. An earlier study [3], for example, found that global warming has at least doubled the likelihood of extreme events such as the 2003 European heatwave.
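The "changing the odds" arithmetic behind such attribution studies can be sketched in a few lines. The standard quantities are the risk ratio and the fraction of attributable risk (FAR); the probabilities below are purely illustrative, not taken from the studies cited.

```python
def risk_ratio(p_with, p_without):
    """How many times more likely the event is with the forcing."""
    return p_with / p_without

def fraction_attributable_risk(p_with, p_without):
    """FAR = 1 - p_without / p_with: the share of the event's
    risk attributable to the forcing."""
    return 1.0 - p_without / p_with

# Illustrative numbers for an event whose likelihood has doubled:
p_natural = 0.01  # annual probability without greenhouse forcing
p_forced = 0.02   # annual probability with greenhouse forcing

print(risk_ratio(p_forced, p_natural))                  # → 2.0
print(fraction_attributable_risk(p_forced, p_natural))  # → 0.5
```

A doubling of likelihood thus corresponds to a FAR of 0.5: half the current risk of the event is attributable to the forcing.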

More-localized weather extremes have been harder to attribute to climate change until now. “Climate models have improved a lot since ten years ago, when we basically couldn’t say anything about rainfall,” says Gabriele Hegerl, a climate researcher at the University of Edinburgh, UK. In the first of the latest studies [1], Hegerl and her colleagues compared data from weather stations in the Northern Hemisphere with precipitation simulations from eight climate models (see page 378). “We can now say with some confidence that the increased rainfall intensity in the latter half of the twentieth century cannot be explained by our estimates of internal climate variability,” she says.

The second study [2] links climate change to a specific event: damaging floods in 2000 in England and Wales. By running thousands of high-resolution seasonal forecast simulations with or without the effect of greenhouse gases, Myles Allen of the University of Oxford, UK, and his colleagues found that anthropogenic climate change may have almost doubled the risk of the extremely wet weather that caused the floods (see page 382). The rise in extreme precipitation in some Northern Hemisphere areas has been recognized for more than a decade, but this is the first time that the anthropogenic contribution has been nailed down, says Oppenheimer. The findings mean that Northern Hemisphere countries need to prepare for more of these events in the future. “What has been considered a 1-in-100-years event in a stationary climate may actually occur twice as often in the future,” says Allen.

But he cautions that climate change may not always raise the risk of weather-related damage. In Britain, for example, snow-melt floods may become less likely as the climate warms. And Allen’s study leaves a 10% chance that global warming has not affected — or has even decreased — the country’s flood risk.

Similar attribution studies are under way for flood and drought risk in Europe, meltwater availability in the western United States and drought in southern Africa, typical of the research needed to develop effective climate-adaptation policies. “Governments plan to spend some US$100 billion on climate adaptation by 2020, although presently no one has an idea of what is an impact of climate change and what is just bad weather,” says Allen.

Establishing the links between climate change and weather could also shape climate treaties, he says. “If rich countries are to financially compensate the losers of climate change, as some poorer countries would expect, you’d like to have an objective scientific basis for it.”

The insurance industry has long worried about increased losses resulting from more extreme weather (see ‘Fatal floods’), but conclusively pinning the blame on climate change will take more research, says Robert Muir-Wood, chief research officer with RMS, a company headquartered in Newark, California, that constructs risk models for the insurance industry. “This is a key part of our research agenda and insurance companies do accept the premise” that there could be a link, he says. “If there’s evidence that risk is changing, then this is something we need to incorporate in our models.” 
See News and Views p.344


References

  1. Min, S.-K. et al. Nature 470, 378–381 (2011).
  2. Pall, P. et al. Nature 470, 382–385 (2011).
  3. Stott, P. A. et al. Nature 432, 610–614 (2004).

>Catastrophe in Rio's mountain region is already the country's worst climate disaster (Estadão)

>
[Perhaps the worst landslide event, but nowhere near the worst climate disaster: the drought of 1877-1879 killed about 500 THOUSAND people in the Northeast, according to most authors.]



Death toll reaches 785, matching the 1967 Rio flood; in a UN ranking, it is also the world's 8th largest landslide

January 22, 2011 | 12:00 am
Bruno Tavares – O Estado de S.Paulo
The tragedy in Rio's mountain region yesterday equaled the worst climate disaster in the country's history. By 10 pm yesterday, authorities counted 785 dead, the same number of victims as the Rio flood of 1967, according to a UN ranking. The number is likely to rise, as the Rio state prosecutor's office estimates that 400 people are still missing in the six municipalities devastated by the rains of January 12.
Marcos de Paula/AE
Photos of the missing in Teresópolis
The disaster also enters the UN records as the 8th worst landslide in world history. The largest event of this kind, according to the Centre for Research on the Epidemiology of Disasters, occurred in 1949, in the former Soviet Union, with 12,000 deaths. The second largest was in Peru, in December 1941, and left 5,000 victims.
The mountain-region landslide had already surpassed the death toll recorded in Caraguatatuba in 1967, when 436 people died. Because of its devastating characteristics, that event in São Paulo's Serra do Mar more than four decades ago had been considered emblematic by geologists.
Despite the huge volume of water that came down the hillsides and the overflowing of several rivers, Brazilian and UN specialists classify the event as a landslide. In their assessment, most of the destruction and deaths were caused by avalanches of earth and debris, technically known as mudflows.
The phenomenon is rare, because it requires a combination of factors. In Rio's mountain region, all of them were present. The hills are steep, which favors landslides. In addition, a large volume of rain must fall in a short period. That is what happened there. According to data from the State Environmental Institute (Inea), the weather stations at the core of the storm recorded 249 and 297 millimeters of rain in 24 hours, starting at 8 pm on January 11. In the assessment of Inea's president, Marilene Ramos, a storm of this intensity has a probability of occurring once every 350 years.
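A "1-in-350-years" storm is a statement about annual probability, not a timetable. Assuming a stationary climate and independent years (the conventional simplification behind return periods), the chance of seeing at least one such storm in a given window is a one-line calculation:

```python
def prob_at_least_one(return_period_years, window_years):
    """P(at least one event in the window), assuming independent
    years, each with annual probability 1 / return_period_years."""
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** window_years

# A 1-in-350-years storm over a 50-year horizon:
print(round(prob_at_least_one(350, 50), 3))  # → 0.133
```

Even a very rare event, by this arithmetic, is far from negligible over the lifetime of a town's buildings and infrastructure.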
Burial. For reasons "not only humanitarian, but also of public health," the judge of Teresópolis' 2nd Family Court, José Ricardo Ferreira de Aguiar, ordered the burial of the bodies of 25 flood victims that had been kept in a truck and in refrigerated trailers. Carlinda Berlim Cemetery, the main one of the city's five, has held 232 burials since last week. By yesterday, 312 bodies had passed through the medical examiner's office (IML). The judge believes there are "at least four times more people buried under the debris" than have been found.
Most of the bodies buried yesterday, 22 adults and three children, were identified by fingerprint examiners from the IML, the National Force and the Félix Pacheco Institute. But since the bodies were not claimed by relatives, the burials were ordered by the judge. DNA was collected from the bodies buried without identification, making it possible to cross-check data from relatives who come looking for information.
From now on, under the judge's ruling, unidentified bodies will be released for burial after biological material is collected. "Within two hours the body will leave, with dignity, to be buried."
As for the missing, the prosecutor's office says information registered by relatives and friends is being cross-checked against hospital and IML records. Yesterday, photos were posted in front of an information center in Teresópolis. / WITH REPORTING BY MARCELO AULER

>Brazilian climate model will show the climate from Brazil's point of view (Fapesp, JC)

>
JC e-mail 4200, 15 February 2011

In the global climate models featured in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Phenomena such as wildfires, which can intensify the greenhouse effect and alter the rainfall and cloud patterns of a given region, are not represented at all, because they are not considered relevant by the countries that built the numerical models in use.

That is why, and to support worldwide research on climate change and assess the impact of human activities on it, Brazilian scientists are developing the Brazilian Global Climate System Model (MBSCG, in the Portuguese acronym).

The effort brings together scientists from the National Institute of Science and Technology for Climate Change (INCT-MC), the FAPESP Research Program on Global Climate Change and the Brazilian Research Network on Global Climate Change (Rede Clima).

Brazilian model

Expected to be completed in 2013, the Brazilian climate model should allow climatologists to study climate change with a model that represents processes that matter to Brazil but are treated as secondary in foreign climate models.

"Most of these international models do not meet our needs. We have many climate problems linked to human action, such as wildfires and deforestation, that are not represented and will now be included in the model we are developing in Brazil," explains Gilvan Sampaio de Oliveira, a researcher at Inpe and one of the MBSCG's coordinators.

According to him, the Brazilian model will incorporate hydrological, biological and physico-chemical processes and interactions relevant to the regional and global climate system.

It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of the regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

"With this model we will have the capacity and autonomy to generate reliable future scenarios, so that the country can prepare for extreme climate events," said Sampaio.

Climate impacts on agriculture

The first version of the Brazilian model, with indications of what may happen to Brazil's climate over the next 50 years, should be ready by the end of 2011. To that end, the researchers are installing a preliminary version of the model, with computational modules that analyze climate phenomena in the atmosphere, the ocean and on the land surface, on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (Cptec) in Cachoeira Paulista, São Paulo state, and will begin running it in February.

These modules will gradually be coupled to other components of the model, which will assess the climate impacts of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry.

Another component, in turn, will show the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.
"In the future we may try to estimate the productivity of sugarcane and soybeans, for example, in the face of rising greenhouse-gas concentrations in the atmosphere," said Sampaio.

Contribution to the IPCC

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the next IPCC report, AR5, due out in 2014. But the model the Panel will use for the AR5 simulations, HadGEM2, will include a Brazilian contribution.

Through a cooperation between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers have added to the international model computational modules that assess the global climate impact of smoke plumes from biomass burning and of forest fires, which until then had not been taken into account in climate projections.

The model was therefore renamed HadGEM2-ES/Inpe. "We will run simulations that include the components we introduced into this model," Sampaio said.

Land use and meteorology

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a land-use module and a high-resolution meteorological module. In the same year, the first high-resolution regional simulations will also be run to build a climate model for South America with a resolution of 1 to 10 km. "Until now it took us months, even years, to generate regional scenarios. With the new supercomputing system, regional climate modeling will move to a different scale," Sampaio said.

(Site da Inovação Tecnológica, with information from Agência Fapesp)

>In Denial – Climate on the Couch (BBC)

>
Thu 10 Feb 2011
BBC Radio 4

http://www.bbc.co.uk/programmes/b00y92mn

Something strange is happening to the climate – the climate of opinion. On the one hand, scientists are forecasting terrible changes to the planet, and to us. On the other, most of us don’t seem that bothered, even though the government keeps telling us we ought to be. Even climate scientists and environmental campaigners find it hard to stop themselves taking holidays in long haul destinations.

So why the gap between what the science says, and what we feel and do? In this programme Jolyon Jenkins investigates the psychology of climate change. Have environmentalists and the government been putting out messages that are actually counterproductive? Might trying to scare people into action actually be causing them to consume more? Are images of polar bears actually damaging to the environmentalists’ case because they alienate people who don’t think of themselves as environmentalists – and make climate change seem like a problem that’s a long way off and doesn’t have much relevance to normal life? Does the message that there are “simple and painless” steps we can take to reduce our carbon footprint (like unplugging your phone charger) unintentionally cause people to think that the problem can’t be that serious if the answers are so trivial?

Jolyon talks to people who are trying to move beyond the counterproductive messages. On the one hand there are projects like Natural Change, run by WWF Scotland, which try to reconnect people with nature using the therapeutic techniques of “ecopsychology” – intense workshops that take place in the wilderness of the west of Scotland, and which seem to convert the uncommitted into serious greens. On the other, there are schemes that try to take the issue out of the green ghetto and engage normal people with climate change. Jolyon visits a project in Stirling which has set itself the ambitious challenge of talking face to face with 35,000 people, through existing social groups like rugby clubs, knitting circles and art groups. It wants to sign up these groups to carbon cutting plans, and make carbon reduction a social norm rather than something that only eco-warriors bother with.

And he attends a “swishing party” in London, which tries to replicate the buzz women get from clothes shopping, but in a carbon neutral way. Can the green movement find substitutes for consumerism that are as fun and status-rich, that will deliver carbon reduction but without making people feel they have signed up to a life of grim austerity? And even if the British and Europeans shift their attitudes, can the Americans ever be reconciled to the climate change message? Producer Jolyon Jenkins.

>Can We Trust Climate Models? Increasingly, the Answer is ‘Yes’ (Yale Environment 360)

>

18 JAN 2011: ANALYSIS

Yale Environment 360

Forecasting what the Earth’s climate might look like a century from now has long presented a huge challenge to climate scientists. But better understanding of the climate system, improved observations of the current climate, and rapidly improving computing power are slowly leading to more reliable methods.

By Michael D. Lemonick

A chart appears on page 45 of the 2007 Synthesis Report of the Intergovernmental Panel on Climate Change (IPCC), laying out projections for what global temperature and sea level should look like by the end of this century. Both are projected to rise, which will come as no surprise to anyone who’s been paying even the slightest attention to the headlines over the past decade or so. In both cases, however, the projections span a wide range of possibilities. The temperature, for example, is likely to rise anywhere from 1.8 C to 6.4 C (3.2 F to 11.5 F), while sea level could increase by as little as 7 inches or by as much as 23 — or anywhere in between.

It all sounds appallingly vague, and the fact that it’s all based on computer models probably doesn’t reassure the general public all that much. For many people, “model” is just another way of saying “not the real world.” In fairness, the wide range of possibilities in part reflects uncertainty about human behavior: The chart lays out different possible scenarios based on how much CO2 and other greenhouse gases humans might emit over the coming century. Whether the world adopts strict emissions controls or decides to ignore the climate problem entirely will make a huge difference to how much warming is likely to happen.

But even when you factor out the vagaries of politics and economics, and assume future emissions are known perfectly, the projections from climate models still cover a range of temperatures, sea levels, and other manifestations of climate change. And while there’s just one climate, there’s more than one way to simulate it. The IPCC’s numbers come from averaging nearly two dozen individual models produced by institutions including the National Center for Atmospheric Research (NCAR), the Geophysical Fluid Dynamics Laboratory (GFDL), the U.K.’s Met Office, and more. All of these models have features in common, but they’re constructed differently — and all of them leave some potentially important climate processes out entirely. So the question remains: How much can we really trust climate models to tell us about the future?

The answer, says Keith Dixon, a modeler at GFDL, is that it all depends on questions you’re asking. “If you want to know ‘is climate change something that should be on my radar screen?’” he says, “then you end up with some very solid results. The climate is warming, and we can say why. Looking to the 21st century, all reasonable projections of what humans will be doing suggest that not only will the climate continue to warm, you have a good chance of it accelerating. Those are global-scale issues, and they’re very solid.”

The reason they’re solid is that, right from the emergence of the first crude versions back in the 1960s, models have been at their heart a series of equations that describe airflow, radiation and energy balance as the Sun warms the Earth and the Earth sends some of that warmth back out into space. “It literally comes down to mathematics,” says Peter Gleckler, a research scientist with the Program for Climate Model Diagnosis and Intercomparison at Livermore National Laboratory, and the basic equations are identical from one model to another. “Global climate models,” he says, echoing Dixon, “are designed to deal with large-scale flow of the atmosphere, and they do very well with that.”

The problem is that warming causes all sorts of changes — in the amount of ice in the Arctic, in the kind of vegetation on land, in ocean currents, in permafrost and cloud cover and more — that in turn can either cause more warming, or cool things off. To model the climate accurately, you have to account for all of these factors. Unfortunately, says James Hurrell, who led the NCAR’s most recent effort to upgrade its own climate model, you can’t. “Sometimes you don’t include processes simply because you don’t understand them well enough,” he says. “Sometimes it’s because they haven’t even been discovered yet.”

A good example of the former, says Dixon, is the global carbon cycle — the complex interchange of carbon between oceans, atmosphere, and biosphere. Since atmospheric carbon dioxide is driving climate change, it’s obviously important, but until about 15 years ago, it was too poorly understood to be included in the models. “Now,” says Dixon, “we’re including it — we’re simulating life, not just physics.” Equations representing ocean dynamics and sea ice also have been added to climate models as scientists have understood these crucial processes better.

Other important phenomena, such as changes in clouds, are still too complex to model accurately. “We can’t simulate individual cumulus clouds,” says Dixon, because they’re much smaller than the 200-kilometer grid boxes that make up climate models’ representation of the world. The same applies to aerosols — tiny particles, including natural dust and manmade soot — that float around in the atmosphere and can cool or warm the planet, depending on their size and composition.

But there’s no one right way to model these small-scale phenomena. “We don’t have the observations and don’t have the theory,” says Gleckler. The best they can do on this point is to simulate the net effect of all the clouds or aerosols in a grid box, a process known as “parameterization.” Different modeling centers go about it in different ways, which, unsurprisingly, leads to varying results. “It’s not a science for which everything is known, by definition,” says Gleckler. “Many groups around the world are pursuing their own research pathways to develop improved models.” If the past is any guide, modelers will be able to abandon parameterizations one by one, replacing them with mathematical representations of real physical processes.
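As a toy illustration of what a parameterization looks like, the sketch below uses a textbook-style diagnostic cloud scheme: the cloud fraction assigned to a grid box is a simple function of the grid-mean relative humidity above a critical threshold. The functional form and constants are illustrative only, not those of any particular model.

```python
def cloud_fraction(grid_mean_rh, rh_crit=0.8):
    """Toy diagnostic cloud scheme: grid-box cloud fraction rises
    from 0 to 1 as grid-mean relative humidity exceeds a critical
    threshold. Form and constants are illustrative only."""
    if grid_mean_rh <= rh_crit:
        return 0.0
    return min(1.0, ((grid_mean_rh - rh_crit) / (1.0 - rh_crit)) ** 2)

# Drier boxes stay clear; moister boxes fill with cloud:
for rh in (0.7, 0.85, 0.95, 1.0):
    print(rh, round(cloud_fraction(rh), 2))
```

The point is not the particular curve but the substitution: one bulk number per grid box stands in for cloud physics the model cannot resolve, which is exactly where modeling centers diverge.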

Sometimes, modelers don’t understand a process well enough to include it at all, even if they know it could be important. One example is a caveat that appears on that 2007 IPCC chart. The projected range of sea-level rise, it warns, explicitly excludes “future rapid dynamical changes in ice flow.” In other words, if land-based ice in Greenland and Antarctica starts moving more quickly toward the sea than it has in the past — something glaciologists knew was possible, but hadn’t yet been documented — these estimates would be incorrect. And sure enough, satellites have now detected such movements. “The last generation of NCAR models,” says Hurrell, “had no ice sheet dynamics at all. The model we just released last summer does, but the representation is relatively crude. In a year or two, we’ll have a more sophisticated update.”

Sophistication only counts, however, if the models end up doing a reasonable job of representing the real world. It’s not especially useful to wait until 2100 to find out, so modelers do the next best thing: They perform “hindcasts,” which are the inverse of forecasts. “We start the models from the middle of the 1800s,” says Dixon, “and let them run through the present.” If a model reproduces the overall characteristics of the real-world climate record reasonably well, that’s a good sign.
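The logic of a hindcast can be sketched with a toy model. The following is a minimal, purely illustrative zero-dimensional energy-balance "hindcast": atmospheric CO2 is approximated by a crude exponential fit, converted to radiative forcing with the standard logarithmic formula, and scaled by an assumed sensitivity. Every constant here is a rough assumption for illustration, not a number from any real modeling center.

```python
import math

# Toy zero-dimensional energy-balance "hindcast", run from 1850 to 2010.
# All constants are illustrative assumptions.
CLIMATE_SENSITIVITY = 0.55  # K per (W/m^2), assumed transient response
FORCING_PER_DOUBLING = 3.7  # W/m^2 of radiative forcing for doubled CO2
C0 = 285.0                  # assumed pre-industrial CO2 concentration, ppm

def co2_ppm(year):
    """Crude exponential fit to the historical CO2 record (assumption)."""
    return C0 * math.exp(0.002 * max(0, year - 1850))

def forcing(ppm):
    """Logarithmic CO2 forcing: one doubling adds FORCING_PER_DOUBLING."""
    return FORCING_PER_DOUBLING * math.log(ppm / C0, 2)

def hindcast(start=1850, end=2010):
    """Modeled warming (K) relative to the start year, every 10 years."""
    return {year: CLIMATE_SENSITIVITY * forcing(co2_ppm(year))
            for year in range(start, end + 1, 10)}

warming = hindcast()
# The sanity check mirrors what modelers do: the run should reproduce the
# broad shape of the observed record (roughly 1 K of warming to date).
print(f"modeled 1850->2010 warming: {warming[2010]:.2f} K")
```

A real hindcast does the same thing with a full three-dimensional model and observed forcings, then compares the run statistically against the instrumental record.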

What the models don’t try to do is to match the timing of short-term climate variations we’ve experienced. A model might produce a Dust Bowl like that of the 1930s, but in the model it might happen in the 1950s. It should produce the ups and downs of El Niño and La Niña currents in the Pacific with about the right frequency and intensity, but not necessarily at the same times as they happen in the real Pacific. Models should show slowdowns and accelerations in the overall warming trend, the result of natural fluctuations, at about the rate they happen in the real climate. But they won’t necessarily show the specific flattening of global warming we’ve observed during the past decade — a temporary slowdown that had skeptics declaring the end of climate change.

It’s also important to realize that climate represents what modelers call a boundary condition. Blizzards in the Sahara are outside the boundaries of our current climate, and so are stands of palm trees in Greenland next year. But within those boundaries, things can bounce around a great deal from year to year or decade to decade. What modelers aim to produce is a virtual climate that resembles the real one in a statistical sense, with El Niños, say, appearing about as often as they do in reality, or hundred-year storms coming once every hundred years or so.

This is one essential difference between weather forecasting and climate projection. Both use computer models, and in some cases, even the very same models. But weather forecasts start out with the observed state of the atmosphere and oceans at this very moment, then project it forward. It’s not useful for our day-to-day lives to know that September has this average high or that average low; we want to know what the actual temperature will be tomorrow, and the day after, and next week. Because the atmosphere is chaotic, anything less than perfect knowledge of today’s conditions (which is impossible, given that observations are always imperfect) will make the forecast useless after about two weeks.
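The two-week limit comes from sensitivity to initial conditions, which is easy to demonstrate with Lorenz's classic three-variable convection system. The sketch below is a toy illustration, not a weather model: two runs that differ by one part in a billion in their starting point end up completely decorrelated.

```python
# The butterfly effect in miniature: the Lorenz (1963) system.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (illustrative only)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)  # an "imperfect observation" of the same state

errors = []
for step in range(5000):    # 50 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    errors.append(abs(a[0] - b[0]))

# The initial error is tiny; by the end of the run the two trajectories
# bear no resemblance to each other.
print(f"initial difference: {errors[0]:.2e}")
print(f"max difference over last 1000 steps: {max(errors[-1000:]):.2f}")
```

The error grows roughly exponentially until it saturates at the size of the attractor itself, which is exactly why weather forecasts, unlike climate projections, cannot be pushed past a couple of weeks.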

Since climate projections go out not days or weeks, but decades, modelers don’t even try to make specific forecasts. Instead, they look for changes in averages — in boundary conditions. They want to know if Septembers in 2050 will be generally warmer than Septembers in 2010, or whether extreme weather events — droughts, torrential rains, floods — will become more or less frequent. Indeed, that’s the definition of climate: the average conditions in a particular place.
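The distinction can be made concrete with synthetic data: individual "days" are dominated by noise, while 30-year averages reveal the shift in boundary conditions. The trend and scatter below are invented for illustration only.

```python
import random

# Climate vs. weather in code: daily "weather" is noisy and unpredictable,
# while "climate" is the average over decades. All numbers are synthetic.
random.seed(0)

def september_high(year):
    """One synthetic September high: a slow warming trend (climate)
    buried under large day-to-day scatter (weather)."""
    trend = 0.05 * (year - 2010)        # assumed warming, K per year
    return 25.0 + trend + random.gauss(0, 2.0)

def climate_mean(start, n=30):
    """A 30-year average, the conventional definition of climate."""
    return sum(september_high(start + i) for i in range(n)) / n

# No single year is predictable, but the averages shift.
print(f"climate around 2010: {climate_mean(1995):.1f} C")
print(f"climate around 2050: {climate_mean(2035):.1f} C")
```

Any single draw from `september_high` is useless as a forecast for a particular day, yet the difference between the two 30-year means emerges clearly from the noise.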

“Because models are put together by different scientists using different codes, each one has its strengths and weaknesses,” says Dixon. “Sometimes one [modeling] group ends up with too much or too little sea ice but does very well with El Niño and precipitation in the continental U.S., for example,” while another nails the ice but falls down on sea-level rise. When you average many models together, however, the errors tend to cancel.
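Why averaging helps can be seen in a purely statistical sketch, under the idealized assumption that each model's bias is independent with zero mean. Real model biases are partly shared, so the cancellation is less perfect than here; all numbers are invented.

```python
import random

# A statistical caricature of multi-model averaging: each "model" reports
# the true signal plus its own systematic bias and run-to-run noise.
random.seed(42)
TRUE_WARMING = 2.0  # hypothetical true temperature change, K

def one_model():
    bias = random.gauss(0, 0.5)    # a model's systematic error (assumed independent)
    noise = random.gauss(0, 0.3)   # internal variability of a single run
    return TRUE_WARMING + bias + noise

models = [one_model() for _ in range(100)]
ensemble_mean = sum(models) / len(models)

single_error = sum(abs(m - TRUE_WARMING) for m in models) / len(models)
ensemble_error = abs(ensemble_mean - TRUE_WARMING)
print(f"typical single-model error: {single_error:.2f} K")
print(f"ensemble-mean error:        {ensemble_error:.2f} K")
```

With independent zero-mean biases, the error of the mean shrinks roughly with the square root of the number of models; correlated biases, which real ensembles have, limit how far this averaging can go.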

Even when models reproduce the past reasonably well, however, it doesn’t guarantee that they’re equally reliable at projecting the future. That’s in part because some changes in climate are non-linear, which is to say that a small nudge can produce an unexpectedly large result. Again, ice sheets are a good example: If you look at melting alone, it’s pretty straightforward to calculate how much extra water will enter the sea for every degree of temperature rise. But because meltwater can percolate down to lubricate the undersides of glaciers, and because warmer oceans can lift the ends of glaciers up off the sea floor and remove a natural brake, the ice itself can end up getting dumped into the sea, unmelted. A relatively small temperature rise can thus lead to an unexpectedly large increase in sea level. That particular non-linearity was already suspected, if not fully understood, but there could be others lurking in the climate system.
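A threshold response of this kind can be caricatured in a few lines. The numbers are invented and serve only to show how a non-linearity makes extrapolation from the linear regime misleading.

```python
# Illustrative non-linearity: below a threshold, sea-level contribution
# rises linearly with warming (surface melt); above it, dynamic ice
# discharge kicks in and the response jumps. All values are invented.
def sea_level_rise_mm(delta_t):
    melt = 30.0 * delta_t              # linear surface-melt term, mm per K
    if delta_t > 2.0:                  # assumed threshold for dynamic ice flow
        melt += 200.0 * (delta_t - 2.0) ** 2
    return melt

for dt in (1.0, 2.0, 2.5, 3.0):
    print(f"{dt:.1f} K -> {sea_level_rise_mm(dt):6.1f} mm")
```

Extrapolating from the first two points would badly underestimate the last one, which is the practical danger of unknown non-linearities lurking in the climate system.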

Beyond that, says Dixon, if three-fourths of the models project that the Sahel (the area just south of the Sahara) will get wetter, for example, and a fourth says it will dry out, “there’s a tendency to go with the majority. But we can’t rule out without a whole lot of investigation whether the minority is doing something right. Maybe they have a better representation of rainfall patterns.” Even so, he says, if you have the vast majority coming up with similar results, and you go back to the underlying theory, and it makes physical sense, that tends to give you more confidence they’re right. The best confidence-builder of all, of course, is when a trend projected by models shows up in observations — warmer springs and earlier snowmelt in the Western U.S., for example, which not only makes physical sense in a warming world, but which is clearly happening.


And the models are constantly being improved. Climate scientists are already using modified versions to try to predict the actual timing of El Niños and La Niñas over the next few years. They’re just beginning to wrestle with periods of 10, 20 and even 30 years in the future, the so-called decadal time span where both changing boundary conditions and natural variations within the boundaries have an influence on climate. “We’ve had a modest amount of skill with El Niños,” says Hurrell, “where 15-20 years ago we weren’t so skillful. That’s where we are with decadal predictions right now. It’s going to improve significantly.”

After two decades of evaluating climate models, Gleckler doesn’t want to downplay the shortcomings that remain in existing models. “But we have better observations as of late,” he says, “more people starting to focus on these things, and better funding. I think we have better prospects for making some real progress from now on.” 

POSTED ON 18 JAN 2011 IN BUSINESS & INNOVATION CLIMATE ENERGY SCIENCE & TECHNOLOGY NORTH AMERICA

>Climate through Brazil's eyes (Fapesp)


Specials

1/2/2011

By Elton Alisson


Agência FAPESP – In the global climate models presented in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Phenomena such as agricultural burning, which can intensify the greenhouse effect and alter the characteristics of rainfall and clouds in a given region, are simply not represented, because they were not considered relevant by the countries that built the numerical models used.

To have a model capable of generating climate change scenarios from a Brazilian perspective, researchers from several institutions, members of the FAPESP Research Program on Global Climate Change, the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), are developing the Brazilian Global Climate System Model (MBSCG).

Expected to be completed in 2013, the MBSCG should allow Brazilian climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

"A good part of these international models does not meet our needs. We have many climate problems associated with anthropogenic actions, such as burning and deforestation, that are not represented and that will now be included in the model we are developing in Brazil," said Gilvan Sampaio de Oliveira, a researcher at the Earth System Science Center (CCST) of the National Institute for Space Research (Inpe) and one of the coordinators of the MBSCG's construction.

According to him, the Brazilian model will incorporate hydrological, biological and physico-chemical processes and interactions relevant to the regional and global climate system. It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of the regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

"With this model, we will have the capacity and autonomy to generate reliable future scenarios, so that the country can prepare itself to face extreme climate events," Sampaio told Agência FAPESP.

The first version of the Brazilian model, with indications of what may happen to Brazil's climate over the next 50 years, should be ready by the end of 2011.

To that end, in February the researchers will install and begin running a preliminary version of the model on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (CPTEC) in Cachoeira Paulista (SP), with computational modules that analyze the climate phenomena occurring in the atmosphere, in the ocean and on the land surface.

These modules will gradually be coupled to other components of the model, which will assess the impacts of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry on the climate. Another component, in turn, will indicate the influence of climate change on crops such as sugarcane, soybean, corn and coffee.

"In the future, we will be able to try to estimate the productivity of sugarcane and soybean, for example, under rising concentrations of greenhouse gases in the atmosphere," said Sampaio.


IPCC class

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the next IPCC report, AR5, due in 2014. But the model the Intergovernmental Panel will use for the AR5 simulations, HadGEM2, will have Brazilian participation.

Through a cooperation between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers have added to the international model computational modules that assess the impact on the global climate of the smoke plumes produced by agricultural burning and forest fires, which until then had not been taken into account in climate projections.

With this, the model came to be called HadGEM2-ES/Inpe. "We will run simulations considering these components we introduced into the model," said Sampaio.

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a land-use module and a high-spatial-resolution meteorological module. In the same year, the first simulations with high-resolution regional models will also be carried out, to build a climate model for South America at a resolution of 1 to 10 km.

"Until now, it took us months or even years to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale," said Sampaio.

Read the article published by the magazine Pesquisa FAPESP about the Brazilian climate model.

>Low political returns (Fapesp)

>Specials

2/2/2011

By Fábio de Castro

More education does not automatically translate into more democracy, according to a study carried out at USP. Between 1989 and 2006, the gap in political participation between the more and the less educated narrowed (ABr)

Agência FAPESP – In the common-sense view, education and political engagement go hand in hand. For the Brazilian elite – according to opinion polls – raising the population's schooling has the power to produce citizens who participate more in the country's political life and who place greater value on democracy. But a new study shows that this view does not match reality.

Rogério Schlegel's doctoral research, defended in the Department of Political Science of the School of Philosophy, Languages and Human Sciences (FFLCH) of the University of São Paulo (USP), used statistical analysis to interpret data from opinion polls conducted between 1989 and 2006. The work concluded that Brazilian education is yielding diminishing political returns.

"The study showed that more educated citizens no longer become as participatory and democratic as they did two decades ago. Higher schooling still differentiates citizens, but that difference has shrunk a great deal in 20 years – that is, the political returns of education have been diminishing in Brazil. In some measures of participation and support for democracy, the difference between the more and the less educated is practically nonexistent," Schlegel told Agência FAPESP.

Schlegel's research was supervised by Professor José Álvaro Moisés, of FFLCH-USP, and is part of the Thematic Project "Citizens' Distrust of Democratic Institutions," coordinated by Moisés and funded by FAPESP.

According to Schlegel, an opinion survey coordinated in 2000 by Professor Elisa Reis, of the Department of Sociology of the Federal University of Rio de Janeiro (UFRJ), had already shown that, in the view of the Brazilian elite, low schooling is the greatest obstacle to democracy in the country.

"Behind this idea lies the assumption that education affects political behavior only through cognitive training – that is, just provide more education and people will have more resources to follow politics, debating, reading newspapers and making demands. But in reality there are alternative paths into political life. The study's results indicate that there is no linear relationship between gaining more access to education and gaining more tools to participate in democracy," he said.

Several mechanisms capable of explaining the diminishing political returns of schooling were analyzed. The most plausible hypothesis is that the phenomenon was caused by the decline in the quality of Brazilian education.

"By failing in the cognitive training of the individual and in the transmission of knowledge, the Brazilian educational system would be failing to provide the tools that help citizens act in the political sphere. The result is that increased access to schooling, or a greater volume of schooling – in time spent at school or years of study completed – is not accompanied by the expected gains in political behavior," he said.

The study was based on four opinion surveys conducted by the group linked to the Thematic Project, the first carried out shortly after redemocratization, in 1989, and the most recent – funded by FAPESP – in 2006.

From these data, Schlegel used statistical analysis to control for the various sociodemographic variables available and observe, in isolation, the effect of schooling on citizens' behavior over time.

"In economic sociology it is common to use, for example, the concept of the 'economic returns of education' to assess the extent to which more schooling can translate into higher income, or higher tax revenue. Building on that concept, the study works with the idea of the 'political returns of education'," he explained.

Political indifference

Political returns were assessed through different measures, such as participation, support for democratic principles and trust in institutions. The results showed that the distance between the more and the less educated fell markedly with respect to expressed interest in politics, consumption of political news and the habit of talking about the subject.

"In some measures, the level of schooling makes practically no difference. In the case of participation in parties, unions and neighborhood associations, for example, involvement is equally low among the less and the more educated," said Schlegel.

The greatest loss of political returns over the 17 years of the period analyzed occurred at the secondary-school level – the level of schooling that saw the largest expansion in enrollment over the past two decades.

"On average, someone who finished secondary school today is indistinguishable from a citizen with incomplete primary education when it comes to preferring democracy as a form of government or rejecting the concentration of power in the hands of a centralizing leader," he said.

In 1993, according to the study, a university graduate was 3.6 times more likely to be very interested in politics than someone with incomplete primary education. By 2006, that ratio had fallen to 1.6. "The difference between a university graduate and someone with no school diploma was enormous in terms of interest in politics. The difference still exists, but it is much smaller," Schlegel noted.

In 1989, a person who had completed secondary school was 66% more likely to prefer democracy to any other regime, compared with someone without a primary-school diploma. "By 2006 there was no longer any statistical difference between the two groups. On this measure, a gap in political behavior between the different levels of schooling that existed in the past has disappeared," he said.

Trust in institutions has a special relationship with schooling. In 1993, the more educated trusted political parties more than they did in 2006. But the data do not allow us to conclude whether trust actually increased or decreased.

"That was a moment when the parties were being rebuilt and there was a widespread notion that they were the path to building democracy. By 2006 that notion had been undone by parties-for-hire, and this may have triggered greater distrust among the more educated," he explained.

For Schlegel, by identifying that schooling has been yielding diminishing political returns, the study's results advise against betting on education as a panacea capable of promoting superior citizenship and overcoming Brazil's democratic deficits.

"Education matters, but on its own it does not solve the problem. For the beneficial effects of schooling on democratic life to be fully realized, quality education for all is needed," he said.

>A sky of probabilities (O Povo)

People from Ceará carry a cultural memory, often unconscious, of the water scarcity that afflicted families in the past. In an interview with Vida&Arte Cultura, UFRJ professor Renzo Taddei discusses how climate is understood and how that understanding is reflected in the citizen

05.02.2011 | 17:00

In Quixadá, the rain prophets make their predictions every year (DÁRIO GABRIEL, 9/1/2010)

Facing nature's cycles, prophets foretell the future. Scientists publish the mathematical and physical measurements that dictate the probability of clouds or sunshine. Mediated by the press, weather forecasts affect citizens and the way they perceive the climate and the city. An adjunct professor at the School of Communication of the Federal University of Rio de Janeiro (UFRJ), Renzo Taddei has an intimate relationship with Ceará's semi-arid region.

Having researched popular forms of climate prediction for nearly a decade, especially the work of the rain prophets of Quixadá, Renzo comes to Fortaleza now and then to give lectures and take part in meetings and research on the subject. Last Wednesday, the professor received our reporter in one of Funceme's rooms. In the interview you read below, the researcher draws attention to global warnings, such as climate warming. "It has already become clear that the alarmist tone of imminent catastrophe rarely produces any positive effect. What it produces is a general feeling of powerlessness, as if there were nothing one could do." (Elisa Parente)

O POVO – How can climate be understood beyond its atmospheric effects?
Renzo Taddei – The climate question in Ceará is very interesting because, last year, according to Seplag (the Secretariat of Planning and Management), Ceará's economy grew 8%, which is an incredible number. In 2010, we had the worst rainfall year of the last 30 years. It was the worst drought and a year of strong growth for the state. Clearly, agriculture contributes little to Ceará's economy, and people feel less and less terribly vulnerable to the climate. For several reasons, including the social programs of the Lula government and the infrastructure improvements of Ceará's recent administrations. In other words, life is a little easier.

OP – How does this affect the organization of the city?
Renzo – There was a statistic that more than half of the population over 40 or 50 had been born outside the capital. So the presence of a rural imaginary is very strong. That is one element of the psychological weight of drought. Even people who have nothing to do with agriculture rejoice at the sight of rain. The old saying that the sky is "bonito pra chover" – looking fine for rain – carries both a continuity and a rupture, especially for the younger generations born in Fortaleza. I interviewed an agronomist who told me he had the habit of turning off the water while soaping up in the shower. His son asked him why he turned the water off if the shower wasn't over. He realized that the younger generation has no memory at all of the water scarcity of its own state. The last major crisis in Fortaleza was in 1993, with the construction of the Canal do Trabalhador. Fortaleza is the only state capital set in the semi-arid region, yet people are not aware of it. You walk around Fortaleza and see that every real-estate development has to have a water park, not even just a pool. It is the architectural and recreational use of water. It only compares to Las Vegas, another place of irresponsible abundance set in a desert. So people have no experience of water shortage, but they have a cultural heritage that hypervalues it. Fortaleza is proud of its prosperity and spends like the newly rich. It is a matter of culture. Meanwhile, the Canal da Integração is bringing in a world of water, and the diversion of the São Francisco River will bring even more, so that Fortaleza will have no problem for the next 30 years.

OP – But this can also have the opposite effect.
Renzo – It is dangerous because, in the long run, you cannot assume that this level of consumption is sustainable. That is the great debate over the river diversion, the construction of the Castanhão dam and the Canal da Integração. The infrastructure that stores water keeps being expanded instead of educating the population to consume less. Not that both cannot be done at the same time, but the fact is that recent administrations have preferred to increase the water supply.

OP – One line of your research focuses on the anthropology of uncertainty and of the future. How is this connected to the study of climate?
Renzo – This is perhaps the most challenging part of understanding the role climate plays in our everyday life. Meteorology quickly realized that the atmosphere is something very complex. The rainy season in Ceará is far more complicated still, because the farmer wants to know today whether there will be rain in May. No radar can show that, but meteorology uses physics and mathematics to create computer models that simulate how nature works. This has limitations. That is why we speak in terms of probability, because sometimes a small thing can change everything. So meteorology has to live with this complicated relationship with society.

OP – Why is it so hard to deal with uncertainty?
Renzo – Countless studies show that we have tremendous difficulty retaining it. Probabilistic information demands a very large cognitive effort. It is genuinely complex. It is as if we were mentally programmed not to operate with probability and to pretend that everything is either right or wrong. We have this tendency to polarize things. And this influences society's relationship with the climate. Look, for example, at farmers' strategies. The family farmer pays attention to climate forecasts all the time, yet uses none of them. He waits for the soil to become moist at a certain depth before planting the seed. But the first rains of the season are weak, the sprout dies, and he has to start all over again.

OP – Uncertainty is part of nature.
Renzo – I have followed the rain prophets since 2002. Very often some of them call themselves observers of nature rather than prophets. Being a prophet carries a very strong religious symbolic charge, and it is a heavy burden for them to bear. So they make their predictions and, in the end, say that the one who really knows is God. The interesting thing is that, in terms of content, they say exactly what Funceme says in terms of probability. There is an uncertainty involved. People accept it from them, but do not grant science the right to live with its uncertainties.

OP – How does meteorology figure in this story?
Renzo – Part of this confusion has to do with the history of meteorology in Ceará. It began with a lot of bravado: flying planes and spraying the clouds with silver salts to make it rain sooner. But you realize the spray does not produce rain, it only hastens it. So this technology was always highly controversial. To the mentality of the sertão, it amounted to saying that the city man believed he had the power to produce rain. So much so that Patativa do Assaré wrote the poem "Ao dotô do avião" ("To the airplane doctor"), in which he sets out several important elements. Man adapts to nature's cycle, not the other way around. In Ceará, the climate has always been tied to religion. So it was disrespectful and absurd to think that a man could produce rain.

OP – Have people absorbed the seriousness of global warming?
Renzo – I don't know whether we will ever understand global warming. Nature works in cycles. Day and night, the seasons of the year. Because these cycles are short, we understand them well. But there are cycles that are very long. It is often said that here in the semi-arid region there are cycles in which two or three decades are drier, then others rainier. There may be a much longer cycle that we have no idea about. If the future proves us wrong, fine: we did what had to be done.

OP – So is there a positive vision for the future?
Renzo – Science is made of uncertainties; it only moves forward because it teaches what it does not know. But there is what is called the precautionary principle. It says you need to weigh how much you lose if you are right and do nothing against how much you lose if you are wrong and do a great deal. So imagine there is no global warming at all, but we take all the necessary measures. What do we lose? There is a loss in terms of economic growth. Now the other option: global warming exists, it is happening, it has to do with industrial production, but we assume it is not happening and do nothing. What do we lose in the future? Many people say that doing nothing can have a very high cost. The chance that we are right is large, and even if we are wrong, there is a way to recover. There is also another scenario in which we may not be able to recover. We may indeed go through a long sequence of extreme events. Climate warming is not about the world getting hotter every single day; the point is that extreme events, such as heavy rains and hurricanes, tend to become more frequent. Sea levels are already rising, and some nations are already beginning to relocate. And we come back to the story of Fortaleza as the Las Vegas of the semi-arid. You cannot talk about cutting carbon emissions without reducing industrial activity, and you cannot talk about global warming without reducing consumption. And how do you get the population of Aldeota to stop consuming so much? In my most pessimistic moments, I think humanity can only reprogram itself mentally, on a continental scale, through a near-death experience, which would mean an immense catastrophe. Then everyone stops and rethinks. But I am a teacher, and I have to believe that education has its value.
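The expected-loss comparison Taddei describes can be written out explicitly. The probabilities and costs below are invented placeholders; the point is the structure of the argument, not the numbers.

```python
# A minimal expected-loss reading of the precautionary principle.
# All probabilities and costs are invented for illustration.
P_WARMING = 0.9          # assumed probability that warming is real
COST_ACTION = 1.0        # lost growth from acting (arbitrary units)
COST_CATASTROPHE = 50.0  # damages if warming is real and we do nothing

def expected_loss(act: bool) -> float:
    if act:
        # We pay the mitigation cost either way; acting averts the damages.
        return COST_ACTION
    # Doing nothing is free only if warming turns out not to be real.
    return P_WARMING * COST_CATASTROPHE

print(f"expected loss if we act:        {expected_loss(True):.1f}")
print(f"expected loss if we do nothing: {expected_loss(False):.1f}")
# Acting is preferred whenever P_WARMING * COST_CATASTROPHE > COST_ACTION,
# i.e. even for modest probabilities when the catastrophe cost is large.
```

The asymmetry is the whole argument: when one branch carries a bounded cost and the other a potentially unrecoverable one, the calculation favors acting well before the uncertainty is resolved.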
Renzo – A ciência é feita por incertezas, ela só caminha porque ensina o que não sabe. Mas existe o que chamam de princípio precaucionário. Que diz que você precisa medir o quanto você perde se tiver certo e não fazer nada e o quanto perde se estiver errado e fizer muita coisa. Então imagina que não existe aquecimento global nenhum, só que tomamos as atitudes necessárias. O que a gente perde? Existe uma perda em termos de crescimento econômico. Agora a outra opção é que existe um aquecimento global, ele está acontecendo, tem a ver com produção industrial, mas a gente assume que não está acontecendo e não faz nada. O que perdemos no futuro? Várias pessoas dizem que não fazer nada pode ter um custo muito alto. A chance de estarmos certos é grande e mesmo que estejamos errados, tem como recuperar. Existe ainda um outro lado em que talvez não tenhamos como recuperar. Talvez a gente de fato passe por uma sequência grande de eventos extremos. O lance do aquecimento climático não tem a ver com o mundo ficar mais quente todo dia. O ponto é que eventos extremos, como chuva, furacão, tendam a ser mais frequentes. O nível do mar já está subindo, algumas nações já começam a se transferir. E voltamos à história de Fortaleza ser a Las Vegas do semi-árido. Não dá pra falar em cortar a emissão de carbono sem reduzir atividade industrial. E não podemos falar de aquecimento global sem redução de consumo. E como faz para a população da Aldeota parar de consumir tanto? Nos meus momentos mais pessimistas, eu penso que a humanidade só consegue se re-programar mentalmente em escala continental numa experiência de quase morte. O que significa uma imensa catástrofe. E aí todo mundo para e se repensa. Mas eu sou professor e tenho que acreditar que a educação tem o seu valor.