Category archive: public opinion

>Brazilians worry about global warming but change little

Latent change

By Ricardo Voltolini*, of Revista Idéia Socioambiental
28/04/2010 – 11h04

A Datafolha poll released on April 21 shows that just over nine in ten Brazilians believe the phenomenon of global warming is real. Three quarters of respondents think human activity is the main driver of climate change.

The numbers differ sharply from those recorded in studies of Americans and Britons. In the US, half of all citizens do not believe humans are responsible for global warming; in England, the figure is 25%. In those countries, more than here, the recent offensive by climate denialists – who have harshly contested the research of the United Nations climate science panel – has swelled the ranks of the skeptics.

In the United States especially, ideas that contest or downplay the human impact on climate change tend to be well received, whether because they offer a safe-conduct to go on emitting greenhouse gases or because they ease the guilt over a lifestyle considered wasteful for the planet yet very convenient. The country is, as everyone knows, the largest emitter of CO2. And last December its president, Barack Obama, helped unravel the climate agreement precisely by refusing to accept more ambitious emission-reduction targets. For the US – and also for China, its great competitor in the global market – cutting emissions means giving up growth, a prospect that sends shivers through the average American and his political representatives in the Senate.

Other numbers in the Datafolha study deserve attention. According to the data, the share of Brazilians who consider themselves well informed on the subject jumped from 20% (in 2009) to 34%. That is good, of course, and perhaps a first step. But feeling well informed does not mean being ready to make the individual changes needed to reduce one's impact on the planet.

On that point, simply to prompt reflection, I recall a survey by Market Analysis conducted in 2007 in 18 countries. That study, the first of its kind in Brazil, showed that Brazilians were among the most worried people in the world about climate change. Even so, 46% felt an individual could do very little in the face of so grave a problem.

Weighing the variables of competence and capacity to change the picture, the study identified four groups. The largest (40%) comprised people well informed about global warming, aligned with the work of NGOs and critical of companies, but not necessarily doing anything to change their daily lives. Only one in six members of this group, however, proved genuinely aware and mobilized.

The second group gathered the 38% of Brazilians who were well informed about the problem, willing to change their lifestyle and receptive to the idea that economic growth can be reconciled with respect for the environment. They believed that, individually, they could give a clearer answer than society as a whole. The third group (12%) trusted society more than its own capacity to change the situation. And the fourth (10%) believed neither in the individual's potential nor in society's. Both of these last two groups were marked by an uninformed, uncritical stance.

If those findings still hold – and I honestly think they do – Brazil's challenges are great. The most important is to mobilize individuals, helping them see that small steps to shrink their ecological footprint, added to other conscious-consumption choices in daily life, can make a difference in the fight to cool the planet. As was said right after the failure of Copenhagen, global warming is too important a subject to wait for solutions only from heads of state more committed to their domestic politics than to the healthy future of the great house we all inhabit.

*Ricardo Voltolini is publisher of the magazine Idéia Socioambiental and director of the consultancy Idéia Sustentável: Estratégia e Inteligência em Sustentabilidade.


(Envolverde/Idéia Socioambiental)

>Climate change: a "witch hunt" aimed at scientists in Virginia (USA)

An unwelcome ‘climate’ for scientists?

By Paul Guinnessy, Physics Today on May 11, 2010 6:34 PM

Virginia Attorney General Ken Cuccinelli, in a blatantly political move to help strengthen his support among the right wing for his bid to become the next governor, is causing uproar in the science community by investigating climate scientist and former University of Virginia professor Michael Mann.

Cuccinelli is accusing Mann of defrauding Virginia taxpayers by receiving research grants to study global temperatures. Mann, who is now based at the Pennsylvania State University, hasn’t worked in Virginia since 2005.

The subpoena, which currently isn’t attached to any lawsuit, requires the University of Virginia to provide Cuccinelli with thousands of documents and e-mails dating from 1999 to 2005 regarding Mann’s research. The accusation is tied to the “hockey stick” graph by Mann and coworkers that was included in a 2001 United Nations Intergovernmental Panel on Climate Change report. The graph reconstructs annual global average temperatures by merging a wide variety of data sources, work that was discussed in some of the private e-mails made public when the University of East Anglia’s Climate Research Unit e-mail server was hacked.

Not answering the question

When Cuccinelli appeared on the Kojo Nnamdi Show on WAMU radio on Friday, he claimed the investigation was not into Mann’s academic work, but instead was “directed at the expenditure of dollars. Whether he does a good job, bad job or I don’t like the outcome—and I think everybody already knows his position on some of this is one that I question. But that is not what that’s about.”

However, the letter demanding materials gives a different impression. It asks, along with Mann’s correspondence with 39 other climate scientists, for “any and all computer algorithms, programs, source code, or the like created or edited by … Mann.”

This was emphasized when Cuccinelli spoke to the Washington Post, stating “in light of the Climategate e-mails, there does seem to at least be an argument to be made that a course was undertaken by some of the individuals involved, including potentially Michael Mann, where they were steering a course to reach a conclusion. Our act, frankly, just requires honesty.”

There hasn’t been an investigation by Virginia’s attorney general’s office into the funding of research grants of this nature before. Moreover, only one of the five grants under suspicion was funded by Virginia taxpayers through the university; the others were federal grants from the National Oceanic and Atmospheric Administration and the National Science Foundation.

No backbone?

The University of Virginia was originally going to succumb to Cuccinelli’s request. In a statement released to the press last Thursday the university said it was “required by law to comply.”

Shortly afterward, the University of Virginia Faculty Senate Executive Council issued its own statement, which ends:

We maintain that peer review by the scientific community is the appropriate means by which to identify error in the generation, presentation and interpretation of scientific data. The Attorney General’s use of his power to issue a CID under the provisions of Virginia’s FATA is an inappropriate way to engage with the process of scientific inquiry. His action and the potential threat of legal prosecution of scientific endeavor that has satisfied peer-review standards send a chilling message to scientists engaged in basic research involving Earth’s climate and indeed to scholars in any discipline. Such actions directly threaten academic freedom and, thus, our ability to generate the knowledge upon which informed public policy relies.

This was shortly followed by a joint letter to the university from the American Civil Liberties Union and the American Association of University Professors asking the University of Virginia to follow procedures to appeal the subpoena.

The letters seem to have had some effect: The Washington Post reported that the university is now “considering” its options before the Friday deadline to appeal is up.

State Senator Donald McEachin issued a statement, in which he stated he will submit a bill so that in the future the attorney general cannot issue a subpoena without also issuing a lawsuit.

“This is not only ludicrous and frivolous, wasting more taxpayer dollars and trampling on academic freedom, but the Attorney General has deprived Mr. Mann of his constitutional rights,” said McEachin.

Part of a bigger trend

On Friday, although it was put together before Cuccinelli issued his subpoena, Science published a letter by 255 members of the National Academy of Sciences, decrying “political assaults” against climate scientists and “McCarthy-like threats of criminal prosecution” and spelling out again the basic facts of what we know about the changing climate.

The letter was triggered by veiled threats from Senator James Inhofe, a well-known climate-change denier, to criminally investigate scientists over their research, and the political response to the CRU e-mails.

According to Peter Gleick, president of the Pacific Institute, a research center in Oakland, California—who spoke with New York Times reporter Sindya N. Bhanoo—before the NAS members gave the letter to Science, the group had first submitted it to the Wall Street Journal, the New York Times, and the Washington Post, all of whom declined to run it.

>Marcelo Leite: Murky waters (FSP)

“Prejudice, stridency, fallacies, fabrications and statistics, in fact, turn the whole public debate into an Amazon basin of turbidity. That is not unique to the indigenous question. Take the Belo Monte hydroelectric dam. Or the explosive issue of how much land is available for agribusiness”

Marcelo Leite
Folha de S.Paulo, 09/05/2010 – reprinted in Jornal da Ciência (JC e-mail 4006)

By one of those telling coincidences the times produce, the two sentences that open the cover story of this edition of the Mais! section – “In Brazil everyone is an Indian, except those who aren’t” and “An Indian is whoever stands their ground as one” – are at the center of a row between their author, the anthropologist Eduardo Viveiros de Castro, and the magazine “Veja”.

The opening was written before the quarrel, but that hardly matters. If it, and the whole piece on indigenous education, are read as a taking of sides, so much the better.

In any case, it is instructive to read the magazine story that started it all, as well as the replies and rejoinders that followed. It gives a glimpse of the depth of the anti-indigenous prejudice and journalistic stridency that muddy this strand of public discussion in Brazil.

Prejudice, stridency, fallacies, fabrications and statistics, in fact, turn the whole public debate into an Amazon basin of turbidity. That is not unique to the indigenous question. Take the Belo Monte hydroelectric dam. Or the explosive issue of how much land is available for agribusiness, the epicenter of the ill-starred “Veja” story.

“Ecological preservation areas, indigenous reserves and supposed former quilombos today cover 77.6% of Brazil’s land area,” its authors claim, without citing a source. “If the count also includes agrarian-reform settlements, cities, ports, roads and other infrastructure works, the total reaches 90.6% of the national territory.”

The omitted source is probably the study “Territorial Reach of Environmental and Indigenist Legislation”, commissioned from Embrapa Satellite Monitoring by the Office of the President of the Republic and embraced by the Brazilian Confederation of Agriculture and Livestock (CNA, read: Senator Kátia Abreu, DEM-TO). Its coordinator was the then head of that Embrapa unit, Evaristo Eduardo de Miranda. The estimate ended up under fire from several specialists, including researchers at the National Institute for Space Research (Inpe).

This week, thanks to reporters Afra Balazina and Andrea Vialli, yet another survey came to light that contradicts the alarming projection. The new study was carried out by Gerd Sparovek, of the Luiz de Queiroz College of Agriculture (Esalq-USP), in collaboration with Chalmers University of Technology (Sweden).

By Miranda’s reckoning, if every environmental, land-tenure and indigenist law were enforced to the letter, Brazil would fall 334,000 km2 short – 4% of its territory – of meeting all their demands. That deficit amounts to nearly one Mato Grosso do Sul.

By Sparovek’s, even under full compliance with the Forest Code now being bombarded by the ruralist caucus, 1 million km2 would still be left over, plus 600,000 km2 of low-productivity pasture used for extensive ranching (one head of cattle per hectare). That makes a surplus of 4.5 Mato Grosso do Suls.

The abyssal gap between the figures should be enough to give pause to anyone who believes in scientific neutrality, or claims it. Differing premises, readings of the law and data sources no doubt explain the divide.

But who examines them in depth, engaging with the merits and drawing conclusions useful for informing the public and for decision-making? Brazil lacks people and institutions with the authority to skim off the foam and debris, clarifying the waters so that the bottom can be seen. Of bloggers and buccaneers we already have plenty.

>Weathermen, and other climate change skeptics (The New Yorker)

Comment
Up in the Air

by Elizabeth Kolbert – April 12, 2010

Joe Bastardi, who goes by the title “expert senior forecaster” at AccuWeather, has a modest proposal. Virtually every major scientific body in the world has concluded that the planet is warming, and that greenhouse-gas emissions are the main cause. Bastardi, who holds a bachelor’s degree in meteorology, disagrees. His theory, which mixes volcanism, sunspots, and a sea-temperature trend known as the Pacific Decadal Oscillation, is that the earth is actually cooling. Why don’t we just wait twenty or thirty years, he proposes, and see who’s right? This is “the greatest lab experiment ever,” he said recently on Bill O’Reilly’s Fox News show.

Bastardi’s position is ridiculous (which is no doubt why he’s often asked to air it on Fox News). Yet there it was on the front page of the Times last week. Among weathermen, it turns out, views like Bastardi’s are typical. A survey released by researchers at George Mason University found that more than a quarter of television weathercasters agree with the statement “Global warming is a scam,” and nearly two-thirds believe that, if warming is occurring, it is caused “mostly by natural changes.” (The survey also found that more than eighty per cent of weathercasters don’t trust “mainstream news media sources,” though they are presumably included in this category.)

Why, with global warming, is it always one step forward, two, maybe three steps back? A year ago, it looked as if the so-called climate debate might finally be over, and the business of actually addressing the problem about to begin. In April, the Obama Administration designated CO2 a dangerous pollutant, thus taking the first critical step toward regulating carbon emissions. The following month, the Administration announced new fuel-efficiency standards for cars. (These rules were finalized last week.) In June, the House of Representatives passed a bill, named for its co-sponsors, Edward Markey and Henry Waxman, that called for reducing emissions seventeen per cent by 2020. Speaking in September at the United Nations, the President said that a “new era” had dawned. “We understand the gravity of the climate threat,” he declared. “We are determined to act.”

Then, much like the Arctic ice cap, that “new era” started to fall to pieces. The U.N. climate summit in Copenhagen in December broke up without agreement even on a possible outline for a future treaty. A Senate version of the Markey-Waxman bill failed to materialize and, it’s now clear, won’t be materializing anytime this year. (Indeed, the one thing that seems certain not to be in a Senate energy bill is the economy-wide emissions reduction required by the House bill.) Last week, despite the Senate’s inaction, President Obama announced that he was opening huge swaths of the Atlantic and Alaskan coasts to oil drilling. The White House billed the move as part of a “comprehensive energy strategy,” a characterization that, as many commentators pointed out, made no sense, since comprehensiveness is precisely what the President’s strategy lacks. As Josh Nelson put it on the blog EnviroKnow, “Obama is either an exceptionally bad negotiator, or he actually believes in some truly awful policy ideas. Neither of these possibilities bodes well.”

As lawmakers dither, public support for action melts away. In a Gallup poll taken last month, forty-eight per cent of respondents said that they believe the threat of global warming to be “generally exaggerated.” This figure was up from thirty-five per cent just two years ago. According to the same poll, only fifty-two per cent of Americans believe that “most scientists believe that global warming is occurring,” down from sixty-five per cent in 2008.

The most immediate explanation for this disturbing trend is the mess that’s come to be known as Climategate. Here the situation is the reverse of what’s going on in the troposphere: Climategate really is a hyped-up media phenomenon. Late last year, hackers broke into the computer system at the Climatic Research Unit of Britain’s University of East Anglia and posted online hundreds of private e-mails from scientists. In the e-mails, C.R.U. researchers often express irritation with their critics—the death of one detractor is described as “cheering news”—and discuss ways to dodge a slew of what they consider to be nuisance Freedom of Information requests. The e-mails were widely portrayed in the press and in the blogosphere as evidence of a conspiracy to misrepresent the data. But, as a parliamentary committee appointed to investigate the matter concluded last week, this charge is so off base that it is difficult even to respond to: “Insofar as the committee was able to consider accusations of dishonesty against CRU, the committee considers that there is no case to answer.”

The e-mail brouhaha was followed by—and immediately confused with—another overblown controversy, about a mistake in the second volume of the U.N. Intergovernmental Panel on Climate Change’s Fourth Assessment Report, from 2007. On page 493 of the nine-hundred-and-seventy-six-page document, it is asserted, incorrectly, that the Himalayan glaciers could disappear by 2035. (The report cites as a source for this erroneous information a report by the World Wildlife Fund.) The screw-up, which was soon acknowledged by the I.P.C.C. and the W.W.F., was somehow transformed by commentators into a reason to doubt everything in the three-volume assessment, including, by implication, the basic laws of thermodynamics. The “new scandal (already awarded the unimaginative name of ‘Glaciergate’) raises further challenges for a scientific theory that is steadily losing credibility,” James Heiser wrote on the Web site of the right-wing magazine New American.

No one has ever offered a plausible account of why thousands of scientists at hundreds of universities in dozens of countries would bother to engineer a climate hoax. Nor has anyone been able to explain why Mother Nature would keep playing along; despite what it might have felt like in the Northeast these past few months, globally it was one of the warmest winters on record.

The message from scientists at this point couldn’t be clearer: the world’s emissions trajectory is extremely dangerous. Goofball weathermen, Climategate, conspiracy theories—these are all a distraction from what’s really happening. Which, apparently, is what we’re looking for. ♦

Read more: http://www.newyorker.com/talk/comment/2010/04/12/100412taco_talk_kolbert

>Geoengineering

Special reports
Paths for the climate

31/3/2010

By Fábio de Castro

Agência FAPESP – Brazilian and British scientists met by videoconference on Tuesday (3/30) to discuss possible cooperation between institutions in the two countries on joint studies and research programs in geoengineering, the field covering various methods of large-scale intervention in the planet's climate system intended to moderate global warming.

The "Café Scientifique: Brazilian-British Meeting on Geoengineering", organized by the British Council, the Royal Society and FAPESP, was held at the British Council offices in São Paulo and in London, England.

The starting point for the discussion was the report Geoengineering the Climate: Science, Governance and Uncertainty, presented by Professor John Shepherd of the Royal Society. Luiz Gylvan Meira Filho, a researcher at the Institute of Advanced Studies of the University of São Paulo (USP), then gave a brief overview of geoengineering in Brazil.

FAPESP was represented by Carlos Afonso Nobre, executive coordinator of the FAPESP Research Program on Global Climate Change and a researcher at the Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (Inpe).

According to Nobre, the meeting served as a first contact between scientists of the two countries. "The meeting was exploratory in character, since the very concept of geoengineering has not yet been precisely defined. The main goal was to gauge both sides' interest in starting joint research in this area and to lay out the potential contributions each side could make," Nobre told Agência FAPESP.

Geoengineering, Nobre explained, is a set of possible interventions divided into two quite distinct approaches: solar radiation management and carbon dioxide removal. During the meeting, the Brazilians made clear that their interest lies only in the second.

Solar radiation management, according to the British report, includes techniques for reflecting sunlight so as to reduce global warming, such as placing mirrors in space, using stratospheric aerosols – injecting sulfates, for example – enhancing cloud albedo, and raising the albedo of the Earth's surface by fitting buildings with white roofs.

Carbon dioxide removal, on the other hand, includes methods for capturing carbon from the atmosphere – "artificial trees" – producing char by biomass pyrolysis, sequestering carbon through bioenergy, fertilizing the ocean, and storing carbon in soils or in the oceans.

The main difference between the two approaches is speed: solar radiation management takes effect within a year or two, whereas carbon dioxide removal methods take several decades to work.

No plan B

The report assessed all the techniques for effectiveness, time to take effect, safety and cost. Their social, political and ethical impacts, the British scientists noted, still need to be studied.

Nobre points out that Brazil would be interested in contributing studies on the carbon dioxide removal side, in keeping with the advanced state of research already under way in the country in areas such as bioenergy and carbon capture methods.

"I am very skeptical about solar radiation management. These techniques can be deployed quickly, but when the devices are switched off – as they inevitably will be, since it is not sustainable to keep them running for several millennia – the climate will quickly return to its previous state. It would still be necessary to rapidly cut the cause of climate change, which is greenhouse-gas emissions," Nobre said.

Solar radiation management techniques, he added, are generally viewed as a "plan B" for when a climate disaster of major consequence is imminent. That is, they would be activated as an emergency measure when climate systems were approaching the saturation thresholds beyond which changes become irreversible – the so-called tipping points.

"The problem is that several tipping points have already been reached, and there is no plan B anymore. The melting of Arctic ice, for example, has reached its saturation point according to 80% of glaciologists. In a few decades there will be no more ice there in summer. We cannot foster the illusion that a plan B can simply be switched on. No governance systems exist that could decide the moment to launch these alternatives," he said.

Carbon dioxide removal, by contrast, should be studied widely, Nobre believes. "This approach follows the logic of restoring atmospheric quality. The principle is to bring gas concentrations back to the state of equilibrium the planet maintained for at least 1 million or 2 million years."

Even so, such climate-engineering solutions must be approached with care. "Nature is very complex, and engineering solutions are not easy, especially on a global scale. I think it is worth studying the various carbon dioxide removal techniques and determining which have potential – always remembering that these are slow processes that will take decades or centuries. Nothing removes the need to cut emissions," Nobre said.

>Understanding Scientific Terms About Climate Change

Certainty vs. Uncertainty

Union of Concerned Scientists – http://www.ucsusa.org
Last Revised: 03/17/10

Uncertainty is ubiquitous in our daily lives. We are uncertain about where to go to college, when and if to get married, who will play in the World Series, and so on.

To most of us, uncertainty means not knowing. To scientists, however, uncertainty is how well something is known. And, therein lies an important difference, especially when trying to understand what is known about climate change.

In science, there’s no such thing as absolute certainty. But, research reduces uncertainty. In many cases, theories have been tested and analyzed and examined so thoroughly that their chance of being wrong is infinitesimal. Other times, uncertainties linger despite lengthy research. In those cases, scientists make it their job to explain how well something is known. When gaps in knowledge exist, scientists qualify the evidence to ensure others don’t form conclusions that go beyond what is known.

Even though it may seem counterintuitive, scientists like to point out the level of uncertainty. Why? Because they want to be as transparent as possible and it shows how well certain phenomena are understood. Scientists have even developed their own phrasing regarding uncertainty, such as “very high confidence” (9 out of 10 chances of being correct) about a certain fact and “very likely” (90 chances out of 100) to describe the chance of an outcome.

Decision makers in our society use scientific input all the time. But they could make a critically wrong choice if the unknowns aren’t taken into account. For instance, city planners could build a levee too low or not evacuate enough coastal communities along an expected landfall zone of a hurricane if uncertainty is understated. For these reasons, uncertainty plays a key role in informing public policy.

However, this culture of transparency has caused problems for climate change science. Climate change deniers equate any stated uncertainty with not knowing anything. The truth is, science knows much about climate change. We have learned, for example, that the burning of fossil fuels and the clearing or burning of land creates carbon dioxide (CO2), which is released into the atmosphere. There is no uncertainty about this. We have learned that carbon dioxide and other greenhouse gases build up in the atmosphere and trap heat through the greenhouse effect.

Again, there is no uncertainty about this. Earth is warming, and scientists are very certain that humans are the main reason for the world’s temperature increase in the past 50 years.

Scientists know with very high confidence, or even greater certainty, that:

  • Human-induced warming influences physical and biological systems throughout the world
  • Sea levels are rising
  • Glaciers and permafrost are shrinking
  • Oceans are becoming more acidic
  • Ranges of plants and animals are shifting

Scientists are uncertain, however, about how much global warming will occur in the future (between 2.1 and 11 degrees Fahrenheit by 2100). They are also uncertain how soon the sea ice habitat where the ringed seal lives will disappear. Curiously, much of this uncertainty has to do with—are you ready?—humans. The choices we make in the next decade or so to reduce emissions of heat-trapping gases could prevent catastrophic climate change.

So, what’s the bottom line? Science has learned much about climate change. Science tells us what is more or less likely to be true. The latest climate science underscores that there’s an urgent need to reduce heat-trapping emissions. And that is certain.

Table: Language to describe confidence about facts and the likelihood of an outcome.  SOURCE: IPCC WGI (2007).

Terminology for describing confidence about facts
  • Very high confidence – at least 9 out of 10 chance of being correct
  • High confidence – about 8 out of 10 chance
  • Medium confidence – about 5 out of 10 chance
  • Low confidence – about 2 out of 10 chance
  • Very low confidence – less than 1 out of 10 chance

Terminology for describing likelihood of an outcome
  • Virtually certain – more than 99 chances out of 100
  • Extremely likely – more than 95 chances out of 100
  • Very likely – more than 90 chances out of 100
  • Likely – more than 65 chances out of 100
  • More likely than not – more than 50 chances out of 100
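
Read as numbers, the two scales are easy to mix up, so a small sanity check helps. The sketch below simply transcribes the thresholds from the table above into code; the dictionary and function names are my own, not IPCC terminology:

```python
# IPCC (AR4) calibrated language, transcribed from the table above.
# Each value is the probability threshold the phrase conveys.
CONFIDENCE = {            # "confidence about facts"
    "very high confidence": 0.9,
    "high confidence": 0.8,
    "medium confidence": 0.5,
    "low confidence": 0.2,
    "very low confidence": 0.1,   # "less than 1 in 10" is an upper bound
}

LIKELIHOOD = {            # "likelihood of an outcome" (lower bounds)
    "virtually certain": 0.99,
    "extremely likely": 0.95,
    "very likely": 0.90,
    "likely": 0.65,
    "more likely than not": 0.50,
}

def strongest_phrase(p, scale=LIKELIHOOD):
    """Return the strongest phrase whose threshold p still exceeds."""
    candidates = [(v, k) for k, v in scale.items() if p > v]
    return max(candidates)[1] if candidates else None

print(strongest_phrase(0.93))  # a 93% chance qualifies as "very likely"
```

So a projection given a 93% probability is reported as "very likely" rather than "extremely likely": the phrasing always rounds down to the threshold actually cleared.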


>The clouds of unknowing (The Economist)

The science of climate change

There are lots of uncertainties in climate science. But that does not mean it is fundamentally wrong

Mar 18th 2010 | From The Economist print edition

FOR anyone who thinks that climate science must be unimpeachable to be useful, the past few months have been a depressing time. A large stash of e-mails from and to investigators at the Climatic Research Unit of the University of East Anglia provided more than enough evidence for concern about the way some climate science is done. That the picture they painted, when seen in the round—or as much of the round as the incomplete selection available allows—was not as alarming as the most damning quotes taken out of context is little comfort. They offered plenty of grounds for both shame and blame.

At about the same time, glaciologists pointed out that a statement concerning Himalayan glaciers in the most recent report of the Intergovernmental Panel on Climate Change (IPCC) was wrong. This led to the discovery of other poorly worded or poorly sourced claims made by the IPCC, which seeks to create a scientific consensus for the world’s politicians, and to more general worries about the panel’s partiality, transparency and leadership. Taken together, and buttressed by previous criticisms, these two revelations have raised levels of scepticism about the consensus on climate change to new heights.

Increased antsiness about action on climate change can also be traced to the recession, the unedifying spectacle of last December’s climate-change summit in Copenhagen, the political realities of the American Senate and an abnormally cold winter in much of the northern hemisphere. The new doubts about the science, though, are clearly also a part of that story. Should they be?

In any complex scientific picture of the world there will be gaps, misperceptions and mistakes. Whether your impression is dominated by the whole or the holes will depend on your attitude to the project at hand. You might say that some see a jigsaw where others see a house of cards. Jigsaw types have in mind an overall picture and are open to bits being taken out, moved around or abandoned should they not fit. Those who see houses of cards think that if any piece is removed, the whole lot falls down. When it comes to climate, academic scientists are jigsaw types, dissenters from their view house-of-cards-ists.

The defenders of the consensus tend to stress the general consilience of their efforts—the way that data, theory and modelling back each other up. Doubters see this as a thoroughgoing version of “confirmation bias”, the tendency people have to select the evidence that agrees with their original outlook. But although there is undoubtedly some degree of that (the errors in the IPCC, such as they are, all make the problem look worse, not better) there is still genuine power to the way different arguments and datasets in climate science tend to reinforce each other.

The doubters tend to focus on specific bits of empirical evidence, not on the whole picture. This is worthwhile—facts do need to be well grounded—but it can make the doubts seem more fundamental than they are. People often assume that data are simple, graspable and trustworthy, whereas theory is complex, recondite and slippery, and so give the former priority. In the case of climate change, as in much of science, the reverse is at least as fair a picture. Data are vexatious; theory is quite straightforward. Constructing a set of data that tells you about the temperature of the Earth over time is much harder than putting together the basic theoretical story of how the temperature should be changing, given what else is known about the universe in general.

Absorb and reflect

The most relevant part of that universal what-else is the requirement laid down by thermodynamics that, for a planet at a constant temperature, the amount of energy absorbed as sunlight and the amount emitted back to space in the longer wavelengths of the infra-red must be the same. In the case of the Earth, the amount of sunlight absorbed is 239 watts per square metre. According to the laws of thermodynamics, a simple body emitting energy at that rate should have a temperature of about –18ºC. You do not need a comprehensive set of surface-temperature data to notice that this is not the average temperature at which humanity goes about its business. The discrepancy is due to greenhouse gases in the atmosphere, which absorb and re-emit infra-red radiation, and thus keep the lower atmosphere, and the surface, warm (see the diagram below). The radiation that gets out to the cosmos comes mostly from above the bulk of the greenhouse gases, where the air temperature is indeed around –18ºC.

Adding to those greenhouse gases in the atmosphere makes it harder still for the energy to get out. As a result, the surface and the lower atmosphere warm up. This changes the average temperature, the way energy moves from the planet’s surface to the atmosphere above it and the way that energy flows from equator to poles, thus changing the patterns of the weather.

No one doubts that carbon dioxide is a greenhouse gas, good at absorbing infra-red radiation. It is also well established that human activity is putting more of it into the atmosphere than natural processes can currently remove. Measurements made since the 1950s show the level of carbon dioxide rising year on year, from 316 parts per million (ppm) in 1959 to 387ppm in 2009. Less direct records show that the rise began about 1750, and that the level was stable at around 280ppm for about 10,000 years before that. This fits with human history: in the middle of the 18th century people started to burn fossil fuels in order to power industrial machinery. Analysis of carbon isotopes, among other things, shows that the carbon dioxide from industry accounts for most of the build-up in the atmosphere.

The serious disagreements start when discussion turns to the level of warming associated with that rise in carbon dioxide. For various reasons, scientists would not expect temperatures simply to rise in step with the carbon dioxide (and other greenhouse gases). The climate is a noisy thing, with ups and downs of its own that can make trends hard to detect. What’s more, the oceans can absorb a great deal of heat—and there is evidence that they have done so—and in storing heat away, they add inertia to the system. This means that the atmosphere will warm more slowly than a given level of greenhouse gas would lead you to expect.

There are three records of land-surface temperature put together from thermometer readings in common use by climatologists, one of which is compiled at the Climatic Research Unit of e-mail infamy. They all show warming, and, within academia, their reliability is widely accepted. Various industrious bloggers are not so convinced. They think that adjustments made to the raw data introduce a warming bias. They also think the effects of urbanisation have confused the data because towns, which are sources of heat, have grown up near weather stations. Anthony Watts, a retired weather forecaster who blogs on climate, has set up a site, surfacestations.org, where volunteers can help record the actual sites of weather instruments used to provide climate data, showing whether they are situated close to asphalt or affected by sources of bias.

Those who compile the data are aware of this urban heat-island effect, and try in various ways to compensate for it. Their efforts may be insufficient, but various lines of evidence suggest that any errors it is inserting are not too bad. The heat-island effect is likely to be strongest on still nights, for example, yet trends from data recorded on still nights are not that different from those from windy ones. And the temperature of waters at the surface of the seas shows similar trends to that on land over the past century, as does the record of air temperature over the oceans as measured at night (see chart 1).

A recent analysis by Matthew Menne and his colleagues at America’s National Oceanic and Atmospheric Administration, published in the Journal of Geophysical Research, argued that trends calculated from climate stations that surfacestation.org found to be poorly sited and from those it found well sited were more or less indistinguishable. Mr Watts has problems with that analysis, and promises a thorough study of the project’s findings later.

There is undoubtedly room for improvement in the surface-temperature record—not least because, at the moment, it provides only monthly mean temperatures, and there are other things people would like to know about. (When worrying about future heatwaves, for example, hot days and nights, not hot months, are the figures of most interest.) In February Britain’s Met (ie, meteorological) Office called for the creation of a new set of temperature databases compiled in rigorously transparent ways and open to analysis and interpretation by all and sundry. Such an initiative would serve science well, help restore the credibility of land-surface records, and demonstrate an openness on the part of climate science which has not always been evident in the past.

Simplify and amplify

For many, the facts that an increase in carbon dioxide should produce warming, and that warming is observed in a number of different indicators and measurements, add up to a primafacie case for accepting that greenhouse gases are warming the Earth and that the higher levels of greenhouse gases that business as usual would bring over the course of this century would warm it a lot further.

The warming caused by a given increase in carbon dioxide can be calculated on the basis of laboratory measurements which show how much infra-red radiation at which specific wavelengths carbon dioxide molecules absorb. This sort of work shows that if you double the carbon dioxide level you get about 1ºC of warming. So the shift from the pre-industrial 280ppm to 560ppm, a level which on current trends might be reached around 2070, makes the world a degree warmer. If the level were to double again, to 1,100ppm, which seems unlikely, you would get another degree.

The amount of warming expected for a doubling of carbon dioxide has become known as the “climate sensitivity”—and a climate sensitivity of one degree would be small enough to end most climate-related worries. But carbon dioxide’s direct effect is not the only thing to worry about. Several types of feedback can amplify its effect. The most important involve water vapour, which is now quite well understood, and clouds, which are not. It is on these areas that academic doubters tend to focus.

As carbon dioxide warms the air it also moistens it, and because water vapour is a powerful greenhouse gas, that will provide further warming. Other things people do—such as clearing land for farms, and irrigating them—also change water vapour levels, and these can be significant on a regional level. But the effects are not as large.

Climate doubters raise various questions about water vapour, some trivial, some serious. A trivial one is to argue that because water vapour is such a powerful greenhouse gas, carbon dioxide is unimportant. But this ignores the fact that the level of water vapour depends on temperature. A higher level of carbon dioxide, by contrast, governs temperature, and can endure for centuries.

A more serious doubting point has to do with the manner of the moistening. In the 1990s Richard Lindzen, a professor of meteorology at the Massachusetts Institute of Technology, pointed out that there were ways in which moistening might not greatly enhance warming. The subsequent two decades have seen much observational and theoretical work aimed at this problem. New satellites can now track water vapour in the atmosphere far better than before (see chart 2). As a result preliminary estimates based on simplifications have been shown to be reasonably robust, with water-vapour feedbacks increasing the warming to be expected from a doubling of carbon dioxide from 1ºC without water vapour to about 1.7ºC. Dr Lindzen agrees that for parts of the atmosphere without clouds this is probably about right.

This moistening offers a helpful way to see what sort of climate change is going on. When water vapour condenses into cloud droplets it gives up energy and warms the surrounding air. This means that in a world where greenhouse warming is wetting the atmosphere, the lower parts of the atmosphere should warm at a greater rate than the surface, most notably in the tropics. At the same time, in an effect that does not depend on water vapour, an increase in carbon dioxide will cause the upper stratosphere to cool. This pattern of warming down below and cooling up on top is expected from greenhouse warming, but would not be expected if something other than the greenhouse effect was warming the world: a hotter sun would heat the stratosphere more, not less.

During the 1990s this was a point on which doubters laid considerable weight, because satellite measurements did not show the warming in the lower atmosphere that theory would predict. Over the past ten years, though, this picture has changed. To begin with, only one team was turning data from the relevant instruments that have flown on weather satellites since the 1970s into a temperature record resolved by altitude. Now others have joined them, and identified errors in the way that the calculations (which are complex and depend on a number of finicky details) were carried out. Though different teams still get different amounts and rates of warming in the lower atmosphere, there is no longer any denying that warming is seen. Stratospheric cooling is complicated by the effects of ozone depletion, but those do not seem large enough to account for the degree of cooling that has been seen there, further strengthening the case for warming by the greenhouse effect and not some other form of climate perturbation.

On top of the effect of water vapour, though, the clouds that form from it provide a further and greater source of uncertainty. On the one hand, the droplets of water of which these are made also have a strong greenhouse effect. On the other, water vapour is transparent, whereas clouds reflect light. In particular, they reflect sunlight back into space, stopping it from being absorbed by the Earth. Clouds can thus have a marked cooling effect and also a marked warming effect. Which will grow more in a greenhouse world?

Model maze

It is at this point that detailed computer models of the climate need to be called into play. These models slice the atmosphere and oceans into stacks of three-dimensional cells. The state of the air (temperature, pressure, etc) within each cell is continuously updated on the basis of what its state used to be, what is going on in adjacent cells and the greenhousing and other properties of its contents.

These models are phenomenally complex. They are also gross oversimplifications. The size of the cells stops them from explicitly capturing processes that take place at scales smaller than a hundred kilometres or so, which includes the processes that create clouds.

Despite their limitations, climate models do capture various aspects of the real world’s climate: seasons, trade winds, monsoons and the like. They also put clouds in the places where they are seen. When used to explore the effect of an increase in atmospheric greenhouse gases on the climate these models, which have been developed by different teams, all predict more warming than greenhouse gases and water-vapour feedback can supply unaided. The models assessed for the IPCC’s fourth report had sensitivities ranging from 2.1ºC to 4.4ºC. The IPCC estimated that if clouds were not included, the range would be more like 1.7ºC to 2.1ºC. So in all the models clouds amplify warming, and in some the amplification is large.

However, there are so far no compelling data on how clouds are affecting warming in fact, as opposed to in models. Ray Pierrehumbert, a climate scientist at the University of Chicago who generally has a strong way with sceptics, is happy to agree that there might be processes by which clouds rein in, rather than exaggerate, greenhouse-warming effects, but adds that, so far, few have been suggested in any way that makes sense.

Dr Lindzen and a colleague suggested a plausible mechanism in 2001. They proposed that tropical clouds in an atmosphere with more greenhouse gas might dry out neighbouring parts of the sky, making them more transparent to outgoing infra-red. The evidence Dr Lindzen brought to bear in support of this was criticised in ways convincing enough to discourage other scientists from taking the idea further. A subsequent paper by Dr Lindzen on observations that would be compatible with his ideas about low sensitivity has also suffered significant criticisms, and he accepts many of them. But having taken them on board has not, he thinks, invalidated his line of research.

Arguments based on past climates also suggest that sensitivity is unlikely to be low. Much of the cooling during the ice ages was maintained by the presence of a large northern hemisphere ice cap reflecting away a lot of sunlight, but carbon dioxide levels were lower, too. To account for all of the cooling, especially in the southern hemisphere, is most easily done with a sensitivity of temperature to carbon dioxide higher than Dr Lindzen would have it.

Before the ice age, the Earth had a little more carbon dioxide and was a good bit warmer than today—which suggests a fairly high sensitivity. More recently, the dip in global temperatures after the eruption of Mt Pinatubo in the Philippines in 1991, which inserted a layer of sunlight-diffusing sulphur particles into the stratosphere, also bolsters the case for a sensitivity near the centre of the model range—although sensitivity to a transient event and the warming that follows a slow doubling of carbon dioxide are not exactly the same sort of thing.

Logs and blogs

Moving into data from the past, though, brings the argument to one of the areas that blog-based doubters have chosen as a preferred battleground: the temperature record of the past millennium, as construed from natural records that are both sensitive to temperature and capable of precise dating. Tree rings are the obvious, and most controversial, example. Their best known use has been in a reconstruction of temperatures over the past millennium published in Nature in 1998 and widely known as the hockey stick, because it was mostly flat but had a blade sticking up at the 20th-century end. Stephen McIntyre, a retired Canadian mining consultant, was struck by the very clear message of this graph and delved into the science behind it, a process that left him and followers of his blog, Climate Audit, intensely sceptical about its value.

In 2006 a review by America’s National Research Council endorsed points Mr McIntyre and his colleagues made on some methods used to make the hockey stick, and on doubts over a specific set of tree rings. Despite this it sided with the hockey stick’s overall conclusion, which did little to stem the criticism. The fact that tree-ring records do not capture recent warming adds to the scepticism about the value of such records.

For many of Mr McIntyre’s fans (though it is not, he says, his central concern) the important thing about this work is that the hockey stick seemed to abolish the “medieval warm period”. This is a time when temperatures are held to have been as high as or higher than today’s—a warmth associated with the Norse settlement of Greenland and vineyards in England. Many climate scientists suspect this phenomenon was given undue prominence by climatologists of earlier generations with an unduly Eurocentric view of the world. There is evidence for cooling at the time in parts of the Pacific.

Doubters for the most part are big fans of the medieval warm period, and see in the climate scientists’ arguments an attempt to rewrite history so as to maximise the drama of today’s warming and minimise the possibility that natural variation might explain the 20th-century record. The possibility of more climatic variability, though, does not, in itself, mean that greenhouse warming is not happening too. And if the medieval warmth were due to some external factor, such as a slightly brighter sun, that would suggest that the climate was indeed quite sensitive.

Looking at the more recent record, logged as it has been by thermometers, you might hope it could shed light on which of the climate models is closest to being right, and thus what the sensitivity actually is. Unfortunately, other confounding factors make this difficult. Greenhouse gases are not the only climatically active ingredients that industry, farming and land clearance add to the atmosphere. There are also aerosols—particles of pollution floating in the wind. Some aerosols cool the atmosphere. Other, sootier, ones warm it. The aggregate effect, globally, is thought to be a cooling, possibly a quite strong one. But the overall history of aerosols, which are mostly short-lived, is nothing like as well known as that of greenhouse gases, and it is unlikely that any of the models are properly capturing their chemistry or their effects on clouds.

Taking aerosols into account, climate models do a pretty good job of emulating the climate trends of the 20th century. This seems odd, since the models have different sensitivities. In practice, it appears that the way the aerosols are dealt with in the models and the sensitivity of those models tend to go hand in hand; sensitive models also have strong cooling aerosol effects.

Reto Knutti of ETH Zurich, an expert on climate sensitivity, sees this as evidence that, consciously or unconsciously, aerosols are used as counterweights to sensitivity to ensure that the trends look right. This is not evidence of dishonesty, and it is not necessarily a bad thing. Since the models need to be able to capture the 20th century, putting them together in such a way that they end up doing so makes sense. But it does mean that looking at how well various models match the 20th century does not give a good indication of the climate’s actual sensitivity to greenhouse gas.

Adding the uncertainties about sensitivity to uncertainties about how much greenhouse gas will be emitted, the IPCC expects the temperature to have increased by 1.1ºC to 6.4ºC over the course of the 21st century. That low figure would sit fairly well with the sort of picture that doubters think science is ignoring or covering up. In this account, the climate has natural fluctuations larger in scale and longer in duration (such as that of the medieval warm period) than climate science normally allows, and the Earth’s recent warming is caused mostly by such a fluctuation, the effects of which have been exaggerated by a contaminated surface-temperature record. Greenhouse warming has been comparatively minor, this argument would continue, because the Earth’s sensitivity to increased levels of carbon dioxide is lower than that seen in models, which have an inbuilt bias towards high sensitivities. As a result subsequent warming, even if emissions continue full bore, will be muted too.

It seems unlikely that the errors, misprisions and sloppiness in a number of different types of climate science might all favour such a minimised effect. That said, the doubters tend to assume that climate scientists are not acting in good faith, and so are happy to believe exactly that. Climategate and the IPCC’s problems have reinforced this position.

Using the IPCC’s assessment of probabilities, the sensitivity to a doubling of carbon dioxide of less than 1.5ºC in such a scenario has perhaps one chance in ten of being correct. But if the IPCC were underestimating things by a factor of five or so, that would still leave only a 50:50 chance of such a desirable outcome. The fact that the uncertainties allow you to construct a relatively benign future does not allow you to ignore futures in which climate change is large, and in some of which it is very dangerous indeed. The doubters are right that uncertainties are rife in climate science. They are wrong when they present that as a reason for inaction.

Comments to this article here.

>What is the best way to provide people with information about climate change?

>
Nov 7th, 2009
Climate Central

There are many ways that people can benefit from having information about climate change, including being able to make informed policy and management decisions. This is one reason why people are talking about creating a national climate service. So, what functions would a national climate service provide?

A good place to start is with an organization that has a similar name and purpose—the National Weather Service, a government agency that was established in the late 1800s. The importance of the Weather Service is almost too obvious to mention. Without accurate reports about the current weather and predictions of future weather, planes would fly into thunderstorms unawares, ships would plow directly into hurricanes and typhoons, and people wouldn’t know about blizzards barreling down on them. Also, planning for pretty much any outdoor activity would become a lot more difficult. Without good weather forecasts, the losses in economic terms and in human lives would be huge.

Climate change unfolds on a slower scale—over decades rather than in hours. But now that we know it is happening, the need for forecasting how climate change will impact us has become clear as well. Knowing how much sea level is likely to rise, and how quickly, is crucial to knowing how to protect coastal areas from increased damage. Knowing how hurricane frequency and strength might change could affect building codes and evacuation strategies. Knowing how the intensity and frequency of droughts and heat waves might change would help city and regional planners manage water resources and mitigate threats to local economies.

The knowledge that these changes will come mostly from an increase in atmospheric levels of greenhouse gases could inform decisions about how to produce and use energy, and whether to develop alternative energy and other green technologies. If the world decides that limiting climate change is a priority, then this green technology could be an economic boon to the countries that perfect it.

Realizing that businesses, local governments, and individuals need the most reliable forecasts possible of how, when, and where the climate is likely to change, and what the impacts might be, universities, government agencies, and private companies have come together over the past year or so to figure out how such an entity might operate—how it would organize information and how it would deliver that information in the most useful way.

>U.S. Scientists Urge Action on Climate Change

>
On March 11, 2000 U.S. scientists and economists signed on to a statement imploring the Senate to move swiftly and comprehensively on the issue of climate change. The signatories are all experts in relevant fields of study on climate change. The statement is the first time leading U.S. scientists and economists have come together to issue a joint message of concern on climate change. The list of signatories included eight Nobel laureates, 32 members of the National Academy of Sciences, 10 members from the National Academy of Engineering, and more than 100 members of the Intergovernmental Panel on Climate Change, who shared a 2007 Nobel Peace Prize.

“If anything, the climate problem is actually worse than reported earlier,” wrote Leon Lederman, Director Emeritus of the Fermi National Accelerator Laboratory in Batavia, Illinois, and a Nobel Prize-winning physicist, in an individual statement in the letter to the Senate. “Physicists tend to be super critical of strong conclusions, but the data on global warming now indicate the conclusions are not nearly strong enough.”

Read the statement here.

>Marcelo Gleiser: Criação imperfeita (Folha Mais!)

>
A noção de que a natureza pode ser decifrada pelo reducionismo precisa ser abolida.

14 março 2010

Desde tempos imemoriais, ao se deparar com a imensa complexidade da natureza, o homem buscou nela padrões repetitivos, algum tipo de ordem. Isso faz muito sentido. Afinal, ao olharmos para os céus, vemos que existem padrões organizados, movimentos periódicos que se repetem, definindo ciclos naturais aos quais estamos profundamente ligados: o nascer e o pôr do Sol, as fases da Lua, as estações do ano, as órbitas planetárias.

Com Pitágoras, 2.500 anos atrás, a busca por uma ordem natural das coisas foi transformada numa busca por uma ordem matemática: os padrões que vemos na natureza refletem a matemática da criação. Cabe ao filósofo desvendar esses padrões, revelando assim os segredos do mundo.

Ademais, como o mundo é obra de um arquiteto universal (não exatamente o Deus judaico-cristão, mas uma divindade criadora mesmo assim), desvendar os segredos do mundo equivale a desvendar a “mente de Deus”. Escrevi recentemente sobre como essa metáfora permanece viva ainda hoje e é usada por físicos como Stephen Hawking e muitos outros.

Essa busca por uma ordem matemática da natureza rendeu -e continua a render- muitos frutos. Nada mais justo do que buscar uma ordem oculta que explica a complexidade do mundo. Essa abordagem é o cerne do reducionismo, um método de estudo baseado na ideia de que a compreensão do todo pode ser alcançada através do estudo das suas várias partes.

Os resultados dessa ordem são expressos através de leis, que chamamos de leis da natureza. As leis são a expressão máxima da ordem natural. Na realidade, as coisas não são tão simples. Apesar da sua óbvia utilidade, o reducionismo tem suas limitações. Existem certas questões, ou melhor, certos sistemas, que não podem ser compreendidos a partir de suas partes. O clima é um deles; o funcionamento da mente humana é outro.

Os processos bioquímicos que definem os seres vivos não podem ser compreendidos a partir de leis simples, ou usando que moléculas são formadas de átomos. Essencialmente, em sistemas complexos, o todo não pode ser reduzido às suas partes.

Comportamentos imprevisíveis emergem das inúmeras interações entre os elementos do sistema. Por exemplo, a função de moléculas com muitos átomos, como as proteínas, depende de como elas se “dobram”, isto é, de sua configuração espacial. O funcionamento do cérebro não pode ser deduzido a partir do funcionamento de 100 bilhões de neurônios.

Sistemas complexos precisam de leis diferentes, que descrevem comportamentos resultantes da cooperação de muitas partes. A noção de que a natureza é perfeita e pode ser decifrada pela aplicação sistemática do método reducionista precisa ser abolida. Muito mais de acordo com as descobertas da ciência moderna é que devemos adotar uma abordagem múltipla, e que junto ao reducionismo precisamos utilizar outros métodos para lidar com sistemas mais complexos. Claro, tudo ainda dentro dos parâmetros das ciências naturais, mas aceitando que a natureza é imperfeita e que a ordem que tanto procuramos é, na verdade, uma expressão da ordem que buscamos em nós mesmos.

É bom lembrar que a ciência cria modelos que descrevem a realidade; esses modelos não são a realidade, só nossas representações dela. As “verdades” que tanto admiramos são aproximações do que de fato ocorre.

As simetrias jamais são exatas. O surpreendente na natureza não é a sua perfeição, mas o fato de a matéria, após bilhões de anos, ter evoluído a ponto de criar entidades capazes de se questionarem sobre a sua existência.

MARCELO GLEISER é professor de física teórica no Dartmouth College, em Hanover (EUA) e autor do livro “A Criação Imperfeita”

>Living on Earth: Climate Confusion and the "Climategate"

>
Air Date: March 5, 2010
http://www.loe.org

Link to the audio file.

“Climategate” has damaged the credentials of the Intergovernmental Panel on Climate Change, and decades of science on global warming. But as scientists push back against efforts to dismiss the threat of global warming, some media watchers say journalists aren’t balancing their coverage of climate change with the scientifically-sound other side of the story – that the impacts of a warming world could be worse than the IPCC predicts. Host Jeff Young talks with media experts and scientists about the fallout of the hacked email scandal, and how to repair damage. (12:00)

>Climate scientists to fight back at skeptics (The Washington Times)

>
By Stephen Dinan
The Washington Times – Friday, March 5, 2010

Undaunted by a rash of scandals over the science underpinning climate change, top climate researchers are plotting to respond with what one scientist involved said needs to be “an outlandishly aggressively partisan approach” to gut the credibility of skeptics.

In private e-mails obtained by The Washington Times, climate scientists at the National Academy of Sciences say they are tired of “being treated like political pawns” and need to fight back in kind. Their strategy includes forming a nonprofit group to organize researchers and use their donations to challenge critics by running a back-page ad in the New York Times.

“Most of our colleagues don’t seem to grasp that we’re not in a gentlepersons’ debate, we’re in a street fight against well-funded, merciless enemies who play by entirely different rules,” Paul R. Ehrlich, a Stanford University researcher, said in one of the e-mails.

Some scientists question the tactic and say they should focus instead on perfecting their science, but the researchers who are organizing the effort say the political battle is eroding confidence in their work.

“This was an outpouring of angry frustration on the part of normally very staid scientists who said, ‘God, can’t we have a civil dialogue here and discuss the truth without spinning everything,'” said Stephen H. Schneider, a Stanford professor and senior fellow at the Woods Institute for the Environment who was part of the e-mail discussion but wants the scientists to take a slightly different approach.

The scientists have been under siege since late last year when e-mails leaked from a British climate research institute seemed to show top researchers talking about skewing data to push predetermined outcomes. Meanwhile, the Intergovernmental Panel on Climate Change, the authoritative body on the matter, has suffered defections of members after it had to retract claims that Himalayan glaciers will melt over the next 25 years.

Last month, President Obama announced that he would create a U.S. agency to arbitrate research on climate change.

Sen. James M. Inhofe, Oklahoma Republican and a chief skeptic of global-warming claims, is considering asking the Justice Department to investigate whether climate scientists who receive taxpayer-funded grants falsified data. He lists 17 people he said have been key players in the controversy.

That news has enraged scientists. Mr. Schneider said Mr. Inhofe is showing “McCarthyesque” behavior in the mold of the Cold War-era senator who was accused of stifling political debate through accusations of communism.

In a phone interview, Mr. Schneider, who is one of the key players Mr. Inhofe cites, said he disagrees with trying to engage in an ad battle. He said the scientists will never be able to compete with energy companies.

“They’re not going to win short-term battles playing the game against big-monied interests because they can’t beat them,” he said.

He said the “social contract” between scientists and policymakers is broken and must be reforged, and he urged colleagues to try to recruit members of Congress to take up their case. He also said the press and nongovernmental organizations must be prodded.

“What I am trying to do is head off something that will be truly ugly,” he said. “I don’t want to see a repeat of McCarthyesque behavior and I’m already personally very dismayed by the horrible state of this topic, in which the political debate has almost no resemblance to the scientific debate.”

Not all climate scientists agree with forcing a political fight.

“Sounds like this group wants to step up the warfare, continue to circle the wagons, continue to appeal to their own authority, etc.,” said Judith A. Curry, a climate scientist at the Georgia Institute of Technology. “Surprising, since these strategies haven’t worked well for them at all so far.”

She said scientists should downplay their catastrophic predictions, which she said are premature, and instead shore up and defend their research. She said scientists and institutions that have been pushing for policy changes “need to push the disconnect button for now,” because it will be difficult to take action until public confidence in the science is restored.

“Hinging all of these policies on global climate change with its substantial element of uncertainty is unnecessary and is bad politics, not to mention having created a toxic environment for climate research,” she said.

Ms. Curry also said that more engagement between scientists and the public would help – something that the NAS researchers also proposed.

Paul G. Falkowski, a professor at Rutgers University who started the effort, said in the e-mails that he is seeking a $1,000 donation from as many as 50 scientists to pay for an ad to run in the New York Times. He said in one e-mail that commitments were already arriving.

The e-mail discussion began late last week and continued into this week.

Mr. Falkowski didn’t respond to an e-mail seeking comment, and an effort to reach Mr. Ehrlich was unsuccessful.

But one of those scientists forwarded The Times’ request to the National Academy of Sciences, whose e-mail system the scientists used as their forum to plan their effort.

An NAS spokesman sought to make clear that the organization itself is not involved in the effort.

“These scientists are elected members of the National Academy of Sciences, but the discussants themselves realized their efforts would require private support since the National Academy of Sciences never considered placing such an ad or creating a nonprofit group concerning these issues,” said William Kearney, chief spokesman for NAS.

The e-mails emerged months after another set of e-mails from a leading British climate research group seemed to show scientists shading data to try to bolster their claims, and are likely to feed the impression among skeptics that researchers are pursuing political goals as much as they are disseminating science.

George Woodwell, founder of the Woods Hole Research Center, said in one e-mail that researchers have been ceding too much ground. He blasted Pennsylvania State University for pursuing an academic investigation against professor Michael E. Mann, who wrote many of the e-mails leaked from the British climate research facility.

An initial investigation cleared Mr. Mann of falsifying data but referred one charge, that he “deviated from accepted practices within the academic community,” to a committee for a more complete review.

In his e-mail, Mr. Woodwell acknowledged that he is advocating taking “an outlandishly aggressively partisan approach” but said scientists have had their “classical reasonableness” turned against them.

“We are dealing with an opposition that is not going to yield to facts or appeals from people who hold themselves in high regard and think their assertions and data are obvious truths,” he wrote.

>Scientists Taking Steps to Defend Work on Climate (N. Y. Times)

By JOHN M. BRODER
New York Times, March 2, 2010

WASHINGTON — For months, climate scientists have taken a vicious beating in the media and on the Internet, accused of hiding data, covering up errors and suppressing alternate views. Their response until now has been largely to assert the legitimacy of the vast body of climate science and to mock their critics as cranks and know-nothings.

Photo: Brendan Smialowski for The New York Times.
Ralph J. Cicerone of the National Academy of Sciences says scientists must try to be heard.

But the volume of criticism and the depth of doubt have only grown, and many scientists now realize they are facing a crisis of public confidence and have to fight back. Tentatively and grudgingly, they are beginning to engage their critics, admit mistakes, open up their data and reshape the way they conduct their work.

The unauthorized release last fall of hundreds of e-mail messages from a major climate research center in England, and more recent revelations of a handful of errors in a supposedly authoritative United Nations report on climate change, have created what a number of top scientists say is a major breach of faith in their research. They say the uproar threatens to undermine decades of work and has badly damaged public trust in the scientific enterprise.

The e-mail episode, dubbed “climategate” by critics, revealed arrogance and what one top climate researcher called “tribalism” among some scientists. The correspondence appears to show efforts to limit publication of contrary opinion and to evade Freedom of Information Act requests. The content of the messages opened some well-known scientists to charges of concealing temperature data from rival researchers and manipulating results to conform to precooked conclusions.

“I have obviously written some very awful e-mails,” Phil Jones, the British climate scientist at the center of the controversy, confessed to a special committee of Parliament on Monday. But he sharply disputed charges that he had hidden data or faked results.

Some of the most serious allegations against Dr. Jones, director of the climate research unit at the University of East Anglia, and other researchers have been debunked, while several investigations are still under way to determine whether others hold up.

But serious damage has already been done. A survey conducted in late December by Yale University and George Mason University found that the number of Americans who believed that climate change was a hoax or scientific conspiracy had more than doubled since 2008, to 16 percent of the population from 7 percent. An additional 13 percent of Americans said they thought that even if the planet was warming, it was a result solely of natural factors and was not a significant concern.

Climate scientists have been shaken by the criticism and are beginning to look for ways to recover their reputation. They are learning a little humility and trying to make sure they avoid crossing a line into policy advocacy.

“It’s clear that the climate science community was just not prepared for the scale and ferocity of the attacks and they simply have not responded swiftly and appropriately,” said Peter C. Frumhoff, an ecologist and chief scientist at the Union of Concerned Scientists. “We need to acknowledge the errors and help turn attention from what’s happening in the blogosphere to what’s happening in the atmosphere.”

A number of institutions are beginning efforts to improve the quality of their science and to make their work more transparent. The official British climate agency is undertaking a complete review of its temperature data and will make its records and analysis fully public for the first time, allowing outside scrutiny of methods and conclusions. The United Nations panel on climate change will accept external oversight of its research practices, also for the first time.

Two universities are investigating the work of top climate scientists to determine whether they have violated academic standards and undermined faith in science. The National Academy of Sciences is preparing to publish a nontechnical paper outlining what is known — and not known — about changes to the global climate. And a vigorous debate is under way among climate scientists on how to make their work more transparent and regain public confidence.

Some critics think these are merely cosmetic efforts that do not address the real problem, however.

“I’ll let you in on a very dark, ugly secret — I don’t want trust in climate science to be restored,” Willis Eschenbach, an engineer and climate contrarian who posts frequently on climate skeptic blogs, wrote in response to one climate scientist’s proposal to share more research. “I don’t want you learning better ways to propagandize for shoddy science. I don’t want you to figure out how to inspire trust by camouflaging your unethical practices in new and innovative ways.”

“The solution,” he concluded, “is for you to stop trying to pass off garbage as science.”

Ralph J. Cicerone, president of the National Academy of Sciences, the most prestigious scientific body in the United States, said that there was a danger that the distrust of climate science could mushroom into doubts about scientific inquiry more broadly. He said that scientists must do a better job of policing themselves and trying to be heard over the loudest voices on cable news, talk radio and the Internet.

“This is a pursuit that scientists have not had much experience in,” said Dr. Cicerone, a specialist in atmospheric chemistry.

The battle is asymmetric, in the sense that scientists feel compelled to support their findings with careful observation and replicable analysis, while their critics are free to make sweeping statements condemning their work as fraudulent.

“We have to do a better job of explaining that there is always more to learn, always uncertainties to be addressed,” said John P. Holdren, an environmental scientist and the White House science adviser. “But we also need to remind people that the occasions where a large consensus is overturned by a scientific heretic are very, very rare.”

No scientific body is under more hostile scrutiny than the United Nations Intergovernmental Panel on Climate Change, which compiles the climate research of hundreds of scientists around the globe into periodic reports intended to be the definitive statement of the science and a guide for policy makers. Critics, citing several relatively minor errors in its most recent report and charges of conflict of interest against its leader, Rajendra K. Pachauri, are calling for the I.P.C.C. to be disbanded or radically reformed.

On Saturday, after weeks of refusing to engage critics, the I.P.C.C. announced that it was asking for the creation of an independent panel to review its research procedures to try to eliminate bias and errors from future reports. But even while allowing for some external oversight, Dr. Pachauri insisted that the panel stood behind its previous work.

“Scientists must continually earn the public’s trust or we risk descending into a new Dark Age where ideology trumps reason,” Dr. Pachauri said in an e-mail message.

But some scientists said that responding to climate change skeptics was a fool’s errand.

“Climate scientists are paid to do climate science,” said Gavin A. Schmidt, a senior climatologist with the National Aeronautics and Space Administration’s Goddard Institute of Space Studies. “Their job is not persuading the public.”

He said that the recent flurry of hostility to climate science had been driven as much by the cold winter as by any real or perceived scientific sins.

“There have always been people accusing us of being fraudulent criminals, of the I.P.C.C. being corrupt,” Dr. Schmidt said. “What is new is this paranoia combined with a spell of cold weather in the United States and the ‘climategate’ release. It’s a perfect storm that has allowed the nutters to control the agenda.”

The answer is simple, he said.

“Good science,” he said, “is the best revenge.”

>Signs of Damage to Public Trust in Climate Findings (N. Y. Times/Dot Earth blog)

By ANDREW C. REVKIN
February 5, 2010, 4:27 pm

CBS News has run a report summarizing fallout from the illegal distribution of climate scientists’ email messages and files and problems with the 2007 report from the Intergovernmental Panel on Climate Change. The conclusion is that missteps and mistakes are creating broader credibility problems for climate science.

Senator James M. Inhofe was quick to add the report to the YouTube channel of the minority on the Environment and Public Works committee:

Ralph J. Cicerone, the president of the National Academy of Sciences, has an editorial in this week’s edition of the journal Science (subscription only) noting the same issue. Over all, he wrote, “My reading of the vast scientific literature on climate change is that our understanding is undiminished by this incident; but it has raised concern about the standards of science and has damaged public trust in what scientists do.”

Dr. Cicerone, an atmospheric scientist, added that polls and input he has received from various sources indicate that “public opinion has moved toward the view that scientists often try to suppress alternative hypotheses and ideas and that scientists will withhold data and try to manipulate some aspects of peer review to prevent dissent. This view reflects the fragile nature of trust between science and society, demonstrating that the perceived misbehavior of even a few scientists can diminish the credibility of science as a whole.” (A BBC report on its latest survey on climate views supports Dr. Cicerone’s impression.)

What should scientists do? Dr. Cicerone acknowledged both the importance of improving transparency and the challenges in doing so:

“It is essential that the scientific community work urgently to make standards for analyzing, reporting, providing access to, and stewardship of research data operational, while also establishing when requests for data amount to harassment or are otherwise unreasonable. A major challenge is that acceptable and optimal standards will vary among scientific disciplines because of proprietary, privacy, national security and cost limitations. Failure to make research data and related information accessible not only impedes science, it also breeds conflicts.”

As recently as last week, senior members of the intergovernmental climate panel had told me that some colleagues did not see the need for changes in practices and were convinced that the recent flareup over errors in the 2007 report was a fleeting inconvenience. I wonder if they still feel that way.

UPDATE: Here’s some additional reading on the I.P.C.C.’s travails and possible next steps for the climate panel:

IPCC Flooded by Criticism, by Quirin Schiermeier in Nature News.

Anatomy of I.P.C.C.’s Mistake on Himalayan Glaciers and Year 2035, by Bidisha Banerjee and George Collins in the Yale Forum on Climate Change and the Media.

* * *

After Emergence of Climate Files, an Uncertain Forecast

By ANDREW C. REVKIN
December 1, 2009, 10:56 am

Roger A. Pielke Jr. is a political scientist at the University of Colorado who has long focused on climate and disasters and the interface of climate science and policy. He has been among those seeking some clarity on temperature data compiled by the Climatic Research Unit of the University of East Anglia, which is now at the center of a storm over thousands of e-mail messages and documents either liberated or stolen from its servers (depending on who is describing the episode). [UPDATED 11:45 a.m. with a couple more useful voices “below the fold.”]

On Monday, I asked him, in essence, if the shape of the 20th-century temperature curve were to shift much as a result of some of the issues that have come up in the disclosed e-mail messages and files, would that erode confidence in the keystone climate question (the high confidence expressed by the Intergovernmental Panel on Climate Change in 2007 that most warming since 1950 is driven by human activities)?

This is Dr. Pielke’s answer. (I added boldface to the take-home points.):

Here is my take, in a logical ordering, from the perspective of an informed observer:

The circumstances:

1. There are many adjustments made to the raw data to account for biases and other factors.
2. Some part of the overall warming trend is as a result of these adjustments.
3. There are legitimately different ways to do the adjusting. Consider that in the e-mails, [Phil] Jones writes that he thinks [James] Hansen’s approach to urban effects is no good. There are also debates over how to handle ocean temperatures from buckets versus intake valves on ships and so on. And some of the procedures for adjusting are currently contested in the scientific literature.
4. Presumably once the data is readily available how these legitimate scientific choices are made about the adjusting would be open to scrutiny and debate.
5. People will then be much more able to cherry pick adjustment procedures to maximize or minimize the historical trends, but also to clearly see how others make decisions about adjustments.
6. Mostly this matters for pre-1979, as the R.S.S. and U.A.H. satellite records provide some degree of independent checking.

Now the implications:

A. If it turns out that the choices made by CRU, GISS, NOAA fall on the “maximize historical trends” end of the scale, that will not help their perceived credibility for obvious reasons. On the other hand, if their choices lead to the middle of the range or even low end, then this will enhance their credibility.
B. The surface temps matter because they are a key basis for estimates of climate sensitivity in the models used to make projections. So people will fight over small differences, even if everyone accepts a significant warming trend. (This is a key point for understanding why people will fight over small differences.)
C. When there are legitimate debates over procedures in science (i.e., competing certainties from different scientists), then this will help the rest of us to understand that there are irreducible uncertainties across climate science.
D. In the end, I would hypothesize that the result of the freeing of data and code will necessarily lead to a more robust understanding of scientific uncertainties, which may have the perverse effect of making the future less clear, i.e., because it will result in larger error bars around observed temperature trends which will carry through into the projections.
E. This would have the greatest implications for those who have staked a position on knowing the climate future with certainty — so on both sides, those arguing doom and those arguing, “Don’t worry be happy.”

So, in the end, Dr. Pielke appears to say, closer scrutiny of the surface-temperature data could undermine definitive statements of all kinds — that human-driven warming is an unfolding catastrophe or something concocted. More uncertainty wouldn’t produce a climate comfort zone, given that poorly understood phenomena can sometimes cause big problems. But it would surely make humanity’s energy and climate choices that much tougher.

[UPDATE, 11:45 a.m.] Andrew Freedman at the Capital Weather Gang blog has interviewed Gerald North, the climate scientist who headed the National Academies panel that examined the tree-ring data and “hockey stick” graphs. Some excerpts:

On whether the emails and files undermine Dr. North’s confidence in human-driven climate change:

This hypothesis (Anthropogenic GW) fits in the climate science paradigm that 1) Data can be collected and assembled in ways that are sensible. 2) These data can be used to test and or recalibrate climate simulation models. 3) These same models can be used to predict future and past climates. It is understood that this is a complicated goal to reach with any precision. The models are not yet perfect, but there is no reason to think the approach is wrong.

On Stephen McIntyre of Climateaudit.org:

I do think he has had an overall positive effect. He has made us re-examine the basis for our assertions. In my opinion this sorts itself out in the due course of the scientific process, but perhaps he has made a community of science not used to scrutiny take a second look from time to time. But I am not sure he has ever uncovered anything that has turned out to be significant.

Also, please note below that Michael Schlesinger at the University of Illinois sent in a response to sharp criticisms of his Dot Earth contribution from Roger Pielke, Sr., at the University of Colorado, Boulder. (Apologies for Colorado State affiliation earlier; he’s moved.)

>AP: Bin Laden blasts US for climate change

By SALAH NASRAWI
The Associated Press
Friday, January 29, 2010; 7:46 AM

CAIRO — Al-Qaida leader Osama bin Laden has called for the world to boycott American goods and the U.S. dollar, blaming the United States and other industrialized countries for global warming, according to a new audiotape released Friday.

In the tape, broadcast in part on Al-Jazeera television, bin Laden warned of the dangers of climate change and said that the way to stop it is to bring “the wheels of the American economy” to a halt.

He blamed Western industrialized nations for hunger, desertification and floods across the globe, and called for “drastic solutions” to global warming, and “not solutions that partially reduce the effect of climate change.”

Bin Laden has mentioned climate change and global warming in past messages, but the latest tape was his first dedicated to the topic. The speech, which included almost no religious rhetoric, could be an attempt by the terror leader to give his message an appeal beyond Islamic militants.

The al-Qaida leader also targeted the U.S. economy in the recording, calling for a boycott of American products and an end to the dollar’s domination as a world currency.

“We should stop dealings with the dollar and get rid of it as soon as possible,” he said. “I know that this has great consequences and grave ramifications, but it is the only means to liberate humanity from slavery and dependence on America.”

He argued that such steps would also hamper Washington’s war efforts in Afghanistan and Iraq.

The new message, whose authenticity could not immediately be confirmed, comes after a bin Laden tape released last week in which he endorsed a failed attempt to blow up an American airliner on Christmas Day.

>José de Souza Martins: Palmares for all of us, or an anniversary as a historical fact (O Estado de S.P.)

The date will only have meaning as a day of consciousness for all of us, of our Brazilian identity

José de Souza Martins – O Estado de S.Paulo
Sunday, November 22, 2009

Turning the anniversary of the 1695 death of Zumbi, general and war commander of the Quilombo dos Palmares, into a holiday, the Day of Black Consciousness, is a measure that may become either a great foolishness or a historical fact. It will be a great and miseducating foolishness if it is captured and instrumentalized by Brazilian neo-racism to feed the destructive ideology of confrontation that plagues us, and to affirm the supposed legitimacy of a worldview that divides us and distances us from ourselves. It will be a great historical fact if it is accepted by all Brazilians as a challenge that can prompt us to revisit our collective memory, so that we may free ourselves from the ghosts of a history that is not ours, and so that we may find ourselves in the recognition of the deeds that resulted in the construction of the pluralist country that we are and should be glad to be. Zumbi's heroic deeds belong on that agenda. They are deeds that give meaning to the longing for freedom and emancipation of a multiracial and democratic Brazil.

Our schoolbook and hearsay culture belongs to a tradition that conspires every day against this alternative and this search. One expression of this is the way the figure of the mulatto Domingos Fernandes Calabar is presented to the consciousness of Brazilians, stigmatized as a traitor because he went over to the Dutch side in the 17th century, when Brazil was not yet Brazil. What is missing there, at the very least, is a critical consciousness of history, one that reveals the real social and political dilemmas faced by the proto-Brazilians of that time.

Calabar, in fact, made a choice, as so many others did in that era of profound transformations in the world, an era open to religious, economic and political options. Even his detractor, Friar Manuel Calado, author of O Valeroso Lucideno, had his own sympathy for the Dutch. What seems to have irritated Calado, and kept that irritation alive in the historical records, is that Calabar, like others, converted to Protestantism and rejected the socially narrow immobility that was already announced in Portuguese domination and would be confirmed in our redundant and conformist history.

The same mentality that cursed Calabar turned Zumbi into folklore, denying him the place we have the right to give him in our historical memory, and with him the unsubmissive people of Palmares, who died in great numbers, annihilated as beings stripped of their humanity. In Palmares, the struggle of the black (and the Indian) was waged in the name of all of us, for the recognition of the human condition of people who were treated as work animals, as market goods, as objects of prior commercial division in the very contracts commissioning the repressive expeditions that would destroy the quilombo.

Beyond barroom chronicle, the history of Palmares is the fascinating epic history of a people, and not one made up only of blacks, as Édison Carneiro, the great historian and scholar of black cultures and author of O Quilombo dos Palmares, recounts with documentary support. It is a history far removed from racial fabulations, and even from reductive interpretations, groundlessly based on simplifications inapplicable to the case, fished anti-dialectically from Marx and Engels' Communist Manifesto, such as those that define Palmares as a pioneering chapter in the history of class struggle. Zumbi was no Spartacus of the backlands, nor was class society yet constituted among us, or even in Portugal; it was only beginning to emerge in countries such as Italy, Holland and England. None of this means that the documented history tarnishes what could and should be the epic imaginary that brings it to our consciousness and to our own day.

There were several large and resilient quilombos in different parts of Brazil until the end of slavery. Palmares was undoubtedly the largest; it lasted through almost the entire 17th century and suffered repeated attacks in its final half century, less because it represented a real political danger to Portuguese domination and much more because its capture and subjection would reward its mercenary oppressors with slaves and land. Nor did those oppressors fight for anything that could be called Brazil. Such was the case of Domingos Jorge Velho, the most violent and ambitious of them, who did not even speak Portuguese, since he lived among the Tapuia. He needed an interpreter to converse with a bishop who visited him.

Palmares has been presented as a libertarian republic, predating the French Revolution by nearly two centuries, which it never was. There was slavery in Palmares too: that of those kidnapped and taken by force to the mocambos. Only those who voluntarily fled their masters and sought refuge in the quilombo were free. The slaves of the Palmares blacks could obtain manumission, as Édison Carneiro teaches, if they brought a captive black there.

The thousands of blacks who took refuge there created a state, on the model of native African states, dominated by a despot, King Ganga Zumba, and by an aristocracy partly of blood, of which Zumbi, the monarch's nephew, was a member. When the king made peace with the whites and the colonial government of Pernambuco, in 1678, he was poisoned by the blacks. Zumbi, opposed to vassalage, carried the struggle to the limit, until the quilombo was invaded and destroyed, in 1695.

Contrary to legend, he did not kill himself, and of the 20 fighters who accompanied him in the final battle, only one survived. The Portuguese reached him when a captured mulatto, his second-in-command, under torture and in exchange for his life, revealed the stronghold where Zumbi was hiding. Once Zumbi was dead, André Furtado de Mendonça, who commanded the troop, cut off his head and sent it to the government in Recife, where it was displayed, impaled on a stake, so that the blacks would be convinced that he had died.

Zumbi's anniversary will have meaning as a day of consciousness for all of us, of our Brazilian identity, if it is the harbinger of a reordering of the terms of our collective memory, inscribing in it history as the history of the Brazilian people, and not as the history of the deeds of public officials or of minorities.

>How to turn information into action? Info-activism.

10 tactics for turning information into action shows how rights advocates around the world have used the internet and digital technologies to create positive change.

http://www.informationactivism.org/

Info Activism from Tactical Technology Collective on Vimeo.

>Distrustful preference

Agência FAPESP, August 17, 2009

By Thiago Romero

Agência FAPESP – Twenty years after being re-established in Brazil, democracy is the political regime preferred by more than two thirds of the Brazilian population. A paradox, however, points to high levels of citizens' distrust of political bodies, a perception associated with the "functioning deficits of democratic institutions in the country."

That is one of the conclusions of a research study by José Álvaro Moisés, a professor in the Department of Political Science at the University of São Paulo (USP), based on data from four national opinion surveys he coordinated in 1989, 1990, 1993 and 2006.

As part of a Thematic Project supported by FAPESP, the study analyzed the meanings attributed to the concept of democracy by about 9,000 Brazilian citizens, who answered the question "For you, what is democracy?", included in questionnaires administered over those 17 years. The Thematic Project, "Citizens' distrust of democratic institutions," began in 2005 and is ongoing.

"Unlike in other periods of history, Brazilians' preference for democracy is now held by a majority, and their adherence to the democratic regime is validated by the rejection, by more than two thirds of the public, of antidemocratic alternatives such as the return of the military to power or the establishment of a one-party system," Moisés said.

"But most Brazilians still distrust democratic institutions, in particular political parties, the National Congress, the legal system and the judiciary. The highest levels of trust go to a few public and private institutions based on hierarchical structures, such as the church and the armed forces," he noted.

In the study, respondents' spontaneous answers about the meaning of democracy were recoded by the researcher according to three dimensions of the concept: "liberties", "institutional procedures" and "social provision". The first covered mentions of political liberties, individual rights, freedom of organization and expression, freedom of participation and freedom of movement.

"Institutional procedures" covered variables such as the right to vote, free elections, majority rule, political representation, access to justice and oversight of governments, while the "social dimension", in turn, grouped together social equality, access to health care, education, housing, employment, fair wages and economic development.
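The recoding step described above can be pictured as a simple classification of open-ended answers into the three dimensions. The sketch below is purely illustrative: the keyword lists and function names are assumptions made for this example, not the actual coding instrument, and the real study recoded the answers manually.

```python
# Hypothetical sketch of recoding spontaneous survey answers into the study's
# three dimensions. Keyword lists are illustrative, not the real codebook.

DIMENSIONS = {
    "liberties": ["freedom", "free speech", "rights", "expression", "participation"],
    "institutional procedures": ["vote", "election", "majority", "representation", "justice"],
    "social dimension": ["equality", "health", "education", "housing", "employment", "wages"],
}

def recode(answer: str) -> str:
    """Assign an answer to the first dimension whose keywords it mentions."""
    text = answer.lower()
    for dimension, keywords in DIMENSIONS.items():
        if any(keyword in text for keyword in keywords):
            return dimension
    return "inconsistent/no answer"  # residual category, as in the survey

# Tally a small batch of invented responses
answers = [
    "Democracy is freedom of expression",
    "The right to vote in free elections",
    "Access to health and education",
    "I don't know",
]
tally: dict[str, int] = {}
for a in answers:
    d = recode(a)
    tally[d] = tally.get(d, 0) + 1
print(tally)
```

In the real study, each respondent's coded answer would then be aggregated per survey year to produce the percentages reported below.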

The results showed that Brazilians associate democracy mainly with the notions of "liberty" and "institutional procedures".

"Contrary to the assumptions of the skeptics and of part of the literature, most of the Brazilians surveyed were able to define democracy adequately, in terms involving two of the most important dimensions that characterize the democratic process: the principle of liberty and institutional procedures and structures. The social dimension, by contrast, had little impact on the results," Moisés observed.

Liberties and performance

Over time, according to the researcher, democracy came to be associated with its political meaning, both from a perspective defined by liberties and from a practical one determined by the performance of institutions.

"Taken in isolation, these two notions may say little, but together they define a reasonably sophisticated view of the democratic process," adds Moisés, who is also scientific director of the Núcleo de Pesquisa de Políticas Públicas (NUPPs) at USP.

In his view, given the weight that social and economic inequalities carry for most of the Brazilian population, it is surprising that the alternative mentioned least often by respondents was precisely the one referring to the social dimension: the share who defined democracy in those terms was below 6% in 1989 and under 10% in 2006.

"In 2006, only eight in every hundred Brazilians defined democracy in terms of social ends, which calls into question the hypothesis that ordinary people prefer democracy because they identify the regime solely with the meeting of basic needs. On the contrary, the analyses show that individuals preferentially define democracy in terms of principles, content and procedures," he explained.

In the academic literature, the most common meaning of democracy refers "to the procedures and institutions of the democratic system, especially the mechanisms for choosing governments through the vote".

There are, however, other perspectives, according to the study, that broaden the scope of the concept to include both the dimensions concerning the content of democracy and the outcomes expected in the economy and in society. "Several authors define democracy, for example, in terms of competition, participation and peaceful contestation of power," he said.

Another relevant finding is that the share of Brazilians unable to define democracy fell over time: from about 46% in 1989 to under 30% in 2006. "The number of respondents who answered inconsistently dropped from nearly five in every hundred people in 1989 to fewer than three in 2006," he said.

In other words, in 2006, the last year of the period analyzed, after the democratic regime had completed more than two decades of existence in the country, 70% of the Brazilian respondents were able to give consistent answers about the meaning of democracy. "That is a striking proportion, comparable only to figures found in countries with consolidated democracies and in Eastern European countries," Moisés stressed.

The study will be published in a book being organized by the Department of Sociology of the Universidade de Brasília (UnB) and by the Federal Senate, which in 2008 jointly organized the 2º Seminário Internacional Estudos sobre o Legislativo: 20 anos da Constituição, an event in which Moisés took part.