Tag archive: Disaster

“REDD+ is sending out an SOS,” warns Conservação Internacional (CarbonoBrasil)

Economy

October 1, 2013 – 11:51 a.m.

by Fabiano Ávila, CarbonoBrasil


The NGO says the mechanism is threatened by a severe imbalance between supply and demand: while more than 22 million credits could be generated annually, only 6.8 million would find buyers.

The delay in creating instruments that encourage, or oblige, countries and companies to buy forest carbon credits, together with the lack of political will to include REDD+ (click here to learn more about the REDD+ concept) in established markets such as the EU ETS, is producing a glut of credits in the voluntary market, driving prices down and dampening interest in developing forest conservation projects.

That is the central message Conservação Internacional (CI) seeks to convey with its report “REDD+ Market: Sending Out an SOS.”

According to the NGO, considering only the Verified Carbon Standard (VCS) certification, up to 22 million credits could be generated annually, yet current demand in the voluntary market would not exceed 6.8 million. Since 2010, demand for this type of credit has reportedly fallen 65%.

This imbalance between supply and demand drove the average price of REDD+ credits from US$ 12 in 2011 to US$ 6 last year.

CI notes that REDD+ has already helped protect more than 14 million hectares of forest. It has also brought gains to more than 70,000 people in local communities, avoided the emission of four million tonnes of CO2 equivalent since 2009, and protected 139 species threatened with extinction.

“The lack of financial rewards for these success stories sends a strong and worrying signal to every country working to reduce deforestation. Their actions have been met not with support but with indifference and uncertainty. That signal does not generate the motivation needed to drive the complex policy reforms REDD+ so badly needs,” the report states.

Because they can only be traded in the voluntary market, REDD+ credits depend heavily on donors and on international instruments that do not yet have the reach to stimulate new projects sustainably.

The report cites the World Bank's Forest Carbon Partnership Facility (FCPF), Germany's REDD+ early-action fund, and the future Green Climate Fund as examples of mechanisms that have sought to boost demand for credits but remain very limited in available financing, geographic scope, and speed of implementation.

Proposals

One obvious solution cited by CI is to guarantee a fair price for REDD+ credits.

This could be achieved in several ways: greater interest in REDD+ from climate funds, expansion of voluntary offset programs in the private sector, and the creation of commitments for donor countries.

“These actions would help catalyze new investments and stabilize the situation of existing projects over the coming years, reducing the vulnerability of communities to falling REDD+ prices,” the report states.

Another point that needs attention is recognition of the multiple benefits of REDD+ projects.

According to CI, forest conservation initiatives improve the lives of indigenous peoples, protect biodiversity, and safeguard ecosystem services.

Government programs with goals similar to those achieved by REDD+ should therefore consider financing this type of project. The mechanism would then be seen not merely as a tool to “offset emissions” but also as a model of smart development.

The report highlights that many projects are now being developed by indigenous peoples themselves, such as the Suruí Forest Carbon Project in the Sete de Setembro Indigenous Territory of the Paiter Suruí people, spanning the states of Rondônia and Mato Grosso.

Indeed, last month the Suruí project sold its first REDD+ credits: 120,000 units bought by Natura.

CI concludes that the importance of keeping REDD+ operating at a high level of quality must not be underestimated, not only to tackle deforestation and greenhouse gas emissions but also to avoid the negative impacts that poorly designed projects can produce.

“To achieve the expected results, it is clear that REDD+ must improve in scale, but also in areas such as legislation (…) This must be done to prevent perverse incentives from being created,” the report explains.

Here the NGO is referring to the risks often associated with REDD+, such as exploitation of indigenous peoples and land conflicts.

“Establishing institutional structures is necessary to implement the local governance that will facilitate the development of national and international REDD+ mechanisms,” CI concludes.

* Originally published on the CarbonoBrasil website.

What is REDD and how did it come about? (ipam.org.br)

Tropical forests today cover 15% of the Earth's land surface (FAO, 2006 apud GCP, 2008) and hold about 25% of all the carbon in the terrestrial biosphere (BONAN, 2008 apud GCP, 2008). Moreover, 90% of the roughly 1.2 billion people living below the poverty line depend on forest resources for their survival (GCP, 2008).

According to the UN Food and Agriculture Organization (FAO, 2006), approximately 13 million hectares of tropical forest are cleared every year (an area equivalent to Peru).

Beyond reducing greenhouse gas emissions, preserving forests has the potential to generate substantial co-benefits, such as positive impacts on biodiversity and on the conservation of water resources. Standing forest also helps stabilize rainfall patterns and, consequently, the climate (Angelsen, 2008).

The IPCC report published in 2007 (IPCC, 2007) estimated emissions from deforestation in the 1990s at approximately 20% of the total, making “land-use change” the second-largest contributor to global warming (GCP, 2008).

Concept and development

The concept of REDD (Reducing Emissions from Deforestation and forest Degradation) essentially starts from the idea of including in greenhouse gas accounting the emissions avoided by reducing deforestation and forest degradation. It grew out of a partnership between Brazilian and American researchers, which produced a proposal known as “Compensated Reduction of Emissions” (Santilli et al., 2000), presented by IPAM and partners at COP-9 in Milan, Italy (2003). Under this concept, developing countries with tropical forests that managed to reduce their national emissions from deforestation would receive international financial compensation corresponding to the avoided emissions. Compensated reduction became the basis of the REDD debate in the years that followed.

Next, at COP-11 in Montreal, Canada (2005), the so-called “Coalition of Rainforest Nations,” led by Papua New Guinea and Costa Rica, presented a similar proposal aimed at discussing ways to create economic incentives for reducing deforestation in developing countries with tropical forests (Pinto et al., 2009).

The argument is that tropical countries help stabilize the climate through their forests, so the costs of keeping them standing should be shared by all. This initiative put REDD officially on the international negotiating agenda.

A year later, at COP-12 in Nairobi, Kenya (2006), the Brazilian government publicly announced a proposal to address deforestation, also very similar to the earlier ones, except that it relied on voluntary donations rather than a carbon credit market mechanism.

COP-13, held in Bali, Indonesia, in 2007, culminated in Decision 1/CP.13, known as the “Bali Road Map,” which set out to discuss how to insert REDD into a mechanism to be structured to start in 2012, the year the first commitment period of the Kyoto Protocol ends.

It is essential to note that this mechanism was initially conceived for developing countries with tropical forests, allowing them to participate effectively in global efforts to reduce greenhouse gas emissions.

It is also worth noting that the discussion of avoided deforestation evolved from a mechanism focused solely on avoided deforestation (COP-11, 2005) into one broadened to include forest degradation (COP-13, 2007).

And REDD+?

Today the concept has been broadened and is known as REDD+. It refers to building a mechanism, or a policy, that should provide positive incentives to developing countries that take one or more of the following actions to mitigate climate change:

1. Reducing emissions from deforestation and forest degradation;

2. Increasing forest carbon stocks;

3. Sustainable forest management;

4. Forest conservation (Pinto et al., 2009).

—————

References:

ANGELSEN, Arild (ed.). Moving Ahead with REDD: Issues, Options and Implications. CIFOR. Poznan, Poland. 2008.

GLOBAL CANOPY PROGRAM. The Little REDD Book: A guide to governmental and non-governmental proposals for Reducing Emissions from Deforestation and Degradation. 2008. Available at: http://www.thelittleREDDbook.org

INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE (IPCC). Climate Change Synthesis Report. Summary for Policymakers. Switzerland. 2007.

PINTO, Erika; MOUTINHO, Paulo; RODRIGUES, Liana; OYO FRANÇA, Flavia Gabriela; MOREIRA, Paula Franco; DIETZSCH, Laura. Cartilha: Perguntas e Respostas Sobre Aquecimento Global. 4th edition. Instituto de Pesquisa Ambiental da Amazônia. Belém. 2009.

SANTILLI, Márcio; MOUTINHO, Paulo; SCHWARTZMAN, Stephan; NEPSTAD, Daniel; CURRAN, Lisa; NOBRE, Carlos. Tropical deforestation and the Kyoto Protocol: an editorial essay. Instituto de Pesquisa Ambiental da Amazônia. 2000.

Content contributed by Ricardo Rettmann (ricardo.rettmann@ipam.org.br)

————–

Access the publication REDD no Brasil: um enfoque amazônico (REDD in Brazil: an Amazonian focus)

The book presents and discusses Brazil's favorable conditions for implementing a national REDD+ regime and proposes two institutional models for benefit sharing: one based on distribution by state, the other by land-tenure category. REDD+ is discussed here as an important element in the transition of the Amazon's development model to one of low carbon emissions, with income distribution and social justice. The most important change in this 3rd edition was the adoption of the avoided-deforestation calculation methodology proposed by the Technical Committee of the Amazon Fund, together with parameters set by Decree 7390/2010, which regulates the National Policy on Climate Change. This change in the calculations does not alter the book's central message, but it can be seen in some key figures showing the total value of avoided deforestation in Brazil.

Download the publication

IPCC maintains that global warming is unequivocal (IPS)

Environment

September 30, 2013 – 8:01 a.m.

by Fabíola Ortiz, IPS


Rising sea levels could flood several areas of Recife, in Northeast Brazil. Photo: Alejandro Arigón/IPS

Rio de Janeiro, Brazil, September 30, 2013 – Amid rumors that global warming has stalled over the past 15 years, the new report of the Intergovernmental Panel on Climate Change (IPCC) indicates that each of the last three decades has been successively warmer than any other since 1850. The Summary for Policymakers of the Working Group I report – The Physical Science Basis – was released on September 27 in Stockholm, Sweden.

The full, unedited text will be released today and constitutes the first of the four volumes of the IPCC's Fifth Assessment Report. The warming is “unequivocal,” the IPCC affirms. “The atmosphere and ocean are warming, the amounts of snow and ice are diminishing, sea level is rising, and greenhouse gas concentrations are increasing,” the study notes.

For Brazilian climate scientist Carlos Nobre, one of the lead authors of the Fourth Assessment Report, the new report gives “no reason for optimism. Each of the last three decades has been successively warmer than any other since 1850. In the Northern Hemisphere, 1983-2012 was likely the warmest 30-year period of the last 1,400 years,” the new summary says. The data “for combined land and ocean surface temperatures, calculated as a linear trend, show a warming of 0.85 degrees over the period 1880-2012,” Nobre added.

On the supposed pause in warming, the IPCC states that “the warming rate of the last 15 years (1998-2012) – 0.05 degrees per decade, beginning with a strong El Niño (the warm phase of the Southern Oscillation) – is lower than the rate calculated for 1951-2012, which was 0.12 degrees per decade.”

It argues, however, that “due to natural variability, trends based on short records are very sensitive to the start and end dates and do not in general reflect long-term climate trends.” In short, the document says, it is “virtually certain (99% to 100% certainty) that the troposphere has warmed since the mid-20th century.”
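The sensitivity of short trends to their start date can be seen with a few lines of arithmetic. The sketch below uses synthetic data, not the observed IPCC record: an assumed steady warming of 0.12 °C per decade plus a hypothetical one-off El Niño spike in 1998 is enough to make a 1998-2012 trend look much flatter than the 1951-2012 trend.

```python
import numpy as np

# Synthetic series: steady 0.12 C/decade warming plus an assumed
# +0.2 C El Nino anomaly in 1998 (illustrative values only).
years = np.arange(1951, 2013)
temps = 0.012 * (years - 1951)   # 0.012 C/yr = 0.12 C/decade
temps[years == 1998] += 0.2      # hypothetical spike at the window's start

def decadal_trend(start, end):
    """Least-squares linear trend, in C per decade, over [start, end]."""
    m = (years >= start) & (years <= end)
    return np.polyfit(years[m], temps[m], 1)[0] * 10

long_term = decadal_trend(1951, 2012)   # stays close to 0.12 C/decade
short_term = decadal_trend(1998, 2012)  # depressed by starting on the spike
```

Starting the window on the anomalously warm year pulls the fitted slope well below the underlying rate, which is exactly the endpoint sensitivity the IPCC describes.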

Nobre told IPS that “the summary looks in more detail at what is changing and reduces the uncertainties with improved scientific knowledge.” It also confirms that climate change originates mainly in human actions, noted Nobre, secretary of Research and Development at the Ministry of Science and Technology.

Humanity will have to decide to leave behind much of its fossil fuels – responsible for emitting the gases that warm the atmosphere – and move to renewable forms of energy, Nobre warned. It is technically possible; what is missing is a conscious choice by all countries, he stressed. “This transition has a cost, but it keeps getting smaller than what was estimated 15 years ago. The problem is not the technology but the political decision,” he added.

For Carlos Rittl, coordinator of the climate change and energy program at WWF-Brasil (World Wide Fund for Nature), “although warming has shown an apparent stabilization in average temperature, the hottest years on record occurred in the last decade. That does not leave us in a comfortable position.”

The IPCC report, a review of the available scientific research, drew on the work of 259 authors from 39 countries and received 54,677 comments. Its release came amid a renewed media wave of doubts and rumors about the existence of global warming.

Based on different greenhouse gas emission trajectories, the summary projects a temperature rise by the end of this century exceeding 1.5 degrees relative to 1850-1900 in all scenarios except the one with the lowest gas concentrations. It also states that the increase is likely to exceed two degrees by 2100 in the two highest-emission scenarios. The previous report, from 2007, projected an increase of two degrees in the best case and up to six in the worst.

After the failure of the intergovernmental conference in Copenhagen in December 2009, when countries were unable to reach an agreement to reduce climate pollution, criticism of the IPCC intensified, particularly over an erroneous estimate of Himalayan glacier melt. That information “was used irresponsibly by those who deny global warming,” Rittl warned.

Six years on, there is more and better scientific evidence to estimate, for example, how much melting ice contributes to sea level rise. By the end of 2100, mean sea level rise will range between 24 and 63 centimeters under the best and worst atmospheric pollution scenarios. Rainfall “will increase in the wettest regions and decrease where rainfall is scarce,” explained Rittl, who holds a doctorate in ecology.

In Brazil, the examples are the arid Northeast and the wetter South and Southeast. Rainfall will increase by 1% to 3% in southern zones, depending on the pace of warming, while arid areas will see more severe drought patterns. All the trends confirmed by the report are “alarming,” Rittl said. “We humans are responsible for these changes, which will worsen a present in which hundreds of millions of people already suffer shortages of water, food, and adequate conditions for survival,” he stressed.

The first volume of the IPCC's Fifth Assessment Report is released two months before the 19th Conference of the Parties to the UN Framework Convention on Climate Change takes place in Warsaw, Poland. That meeting should see a global effort to secure the transition to a low-carbon economy, Nobre said. “This report is a reality check,” he stressed.

In his view, Brazil is one of the “few good examples,” having cut its greenhouse gas emissions by 38.4% between 2005 and 2010 thanks to the drop in Amazon deforestation. “Brazil adopted voluntary commitments, but globally there is no ambitious agreement,” Nobre explained. The longer it takes to “adopt concrete actions, the harder and more unlikely a sustainable trajectory of accommodation and adaptation to climate change becomes,” he emphasized.

Rittl believes governments should treat climate change as a national challenge for development, social inclusion, and poverty reduction. “The risks and opportunities must be handled with great responsibility,” he concluded.

Envolverde/IPS

Sociological explanations for climate change denial (resilience.org)

by Olga Bonfiglio, originally published by Energy Bulletin  | MAR 17, 2012

Talk about climate change seems to be a taboo subject in America today.

Promises made three years ago by both major political parties to do something have gone by the wayside, while today’s Republican presidential candidates reject evidence that humans are responsible for the warming of the earth.

The mainstream media routinely report on extreme weather, like this winter’s high temperatures and last summer’s droughts, but reporters and commentators typically veer away from connecting it to climate change.

Last October, the New York Times reported that the share of Americans who believed the Earth was warming had dropped from 79 percent in 2006 to 59 percent. However, a month later polling experts blamed this decline on a collapse in media coverage and deeply flawed poll questions.

It turns out that the United States is one of the few countries in the world still quibbling over climate change, and its influence is stymieing progress at environmental summits like Durban (2011), Cancun (2010), and Copenhagen (2009).

What’s going on?

As you may expect, it’s about money, politics, culture and media bias.

Ron Kramer, a sociologist at Western Michigan University, has been studying how sociological and cultural factors are preventing Americans from talking about or acting on climate change.  He drew on the research of sociologist Stanley Cohen, professor emeritus at the London School of Economics, who says that denial “refers to the maintenance of social worlds in which an undesirable situation (event, condition, phenomenon) is unrecognized, ignored or made to seem normal.”

He cites three categories of denial:

  • Literal denial is “the assertion that something did not happen or is not true.”
  • In interpretive denial, the basic facts are not denied; however, “…they are given a different meaning from what seems apparent to others.” People recognize that something is happening but insist it is good for us.
  • Implicatory denial “covers the multitude of vocabularies, justifications, rationalizations, evasions that we use to deal with our awareness of so many images of unmitigated suffering.” Here, “knowledge itself is not an issue. The genuine challenge is doing the ‘right’ thing with this knowledge.”

Through literal and interpretive denial, climate change deniers declare that the earth is not warming even though 98 percent of climate scientists have written thousands of peer-reviewed papers and reports concluding that climate change is real and caused by human activity.

Actually, deniers are organized by conservative think tanks funded by the fossil fuel industry that attempt to create doubt about climate science and block actions that would reduce greenhouse gas emissions and create clean energy alternatives.

To do this they use conspiracy theories and “fake” experts with no background in climate science.  They insist on absolute certainty, cherry-pick the data and ignore the larger body of evidence or misrepresent data and promote logical fallacies like “the climate has changed in the past, therefore current change is natural.”

“Creating doubt blocks any action,” said Kramer.  “This is the same tactic the tobacco industry used to deny that smoking was harmful to people’s health. And, some of the same people are now doing this with climate change.”

The “conservative climate change denial counter-movement,” as Kramer calls it, was led by the George W. Bush administration and congressional Republicans who obstruct and manipulate the political system to promote their view.  They are complemented by right-wing media outlets such as Fox News and Rush Limbaugh.  The mainstream media’s “balancing norm” practice then allows the denialists to be placed on par with the climate scientists.

The rationale behind this counter-movement is that corporations make money if Americans continue their materialistic attitudes and practices, so they lobby politicians to protect their interests and fund think tanks that accuse “eco-scientists” of destroying capitalism, as revealed by the New York Times.

Corporations provide research monies to university scientists, who rely on them as support from state legislatures and benefactors dwindles.  They advance narratives that people from other parts of the world are worse off than Americans, so we don’t have to change.  They make claims to virtue on TV ads about all they are doing for the environment (a.k.a. “greenwashing”).  And, they argue that doing something about global warming will cost too much and cause them to lose their competitive edge.

Research shows that conservative white males are more likely to espouse climate change denial than other groups for two reasons.  They tend to filter out any information that is different from their already-held worldview because it threatens the identity, status and esteem they receive by being part of their group, he said.  Sociologists call this “Identity Protective Cognition.”

Secondly, conservative white males have a stronger need to justify the status quo and resist attempts to change it.  Sociologists call this “System Justification.”

For example, successful conservative white males stridently defend the capitalist system because it has worked well for them. For anyone to imply the system is not functioning is an unfathomable impossibility akin to blasphemy. [Climatewire via Scientific American]

“Identity Protective Cognition” should also inform environmental activists that the information deficit model of activism is not always a good approach, warned Kramer.  Just providing more information may not change anyone’s views given their commitment to a particular cultural worldview.

In implicatory denial people recognize that something untoward is happening but they fail to act because they are emotionally uncomfortable or troubled about it.

For example, there are the people who are aware of climate change and have some information about it, but take no action, make no behavioral changes and remain apathetic.

This response occurs when people confront confusing and conflicting information from political leaders and the media.  Consequently, they have yet another reason for denial—or they believe the problem can be overcome with technology and they can go on with their lives.

“At some level people understand that climate change can alter human civilization, but they feel a sense of helplessness and powerlessness at the prospect,” said Kramer.  “Others feel guilty that they may have caused the problem.”

Several cultural factors also thwart any decisive action on climate change, said Kramer.

Americans have a tendency toward “anti-intellectualism,” so “nerdy” climate scientists are easily suspect.

Our strong sense of “individualism” helps us strive toward our individual goals, but it likewise keeps us from joining together to do something about climate change.  They ask:  “What good does it do to recycle or drive less when we have such a huge, complex problem as climate change?”

“American exceptionalism” celebrates the American way of life, which has given us a vast bounty of wealth and material goods.  We want to continue this life and, in fact, deserve it.  Nothing bad will happen to us.

Finally, “political alienation” keeps us from trusting our political system to tackle the problem.

“What we ultimately need is international agreement about what to do about climate change,” said Kramer.  “Nothing will happen, however, until the United States commits to doing something.”

What to do about climate change?

Kramer believes we should regard climate change as a matter of social justice and not just science because the people most affected by it are not the ones who created it.

North American and European lifestyles, which are based on easy and cheap access to fossil fuels for energy, agriculture and consumer products have inadvertently caused much suffering, poverty and environmental degradation to the people of the Global South.

To illustrate, analysts at Maplecroft have produced a map that measures the risk of climate change impacts and the social and financial ability of communities and governments to cope with them.  The most vulnerable nations are Haiti, Bangladesh and Zimbabwe.

Second, Kramer emphasized the moral obligation we have to future generations and other species.  Simply put, we must reduce our use of the fossil fuels that are largely responsible for emitting greenhouse gases into the atmosphere and causing extreme conditions like hurricanes, floods, droughts, tsunamis, earthquakes, heat waves, warm winters and melting polar ice caps.

Again, people’s lives, livelihoods and communities are affected, and it shouldn’t escape notice that human encroachment on animal habitats is contributing to massive species losses, what some call the Sixth Great Extinction.  Criminologists are grappling with the language characterizing this wanton disregard with words like “ecocide” and “biocide.”

Kramer suggests that we shift our lifestyles from a culture based on materialistic consumption to a culture based on the caretaking of the Earth, as advocated by Scott Russell Sanders in A Conservationist Manifesto (2009).  Sanders asks questions such as (see a video with Sanders):

  • What would a truly sustainable economy look like?
  • What responsibilities do we bear for the well-being of future generations?
  • What responsibilities do we bear toward Earth’s millions of other species?
  • In a time of ecological calamity and widespread human suffering, how should we imagine a good life?

Third, Kramer calls for a more “prophetic imagination” as put forward by Walter Brueggemann, a theologian and professor emeritus at Columbia Theological Seminary, where we take a reasoned approach and face the realities of climate change, confront the truth, “penetrate the numbness and despair,” and avoid drowning in the sense of loss and grief that is paralyzing us from action.

Such an approach can give voice to a “hope-filled responsibility” where people are empowered to act rather than left listless and inattentive.

“It’s not about someone being responsible, but all of us,” said Kramer, “because we are all affected by climate change.”

One major way we can do that is by reducing greenhouse gas emissions—and we have a way to measure our progress.

Bill McKibben, author of The End of Nature (1989), one of the first popular books about global warming, contends that rising concentrations of greenhouse gases are threatening our world.  Today, we are at 400 parts per million (ppm) and heading toward 550-650 ppm, compared with pre-industrial levels of 275 ppm.  McKibben advocates a goal of 350 ppm and has encouraged people in 188 countries to reduce carbon emissions in their communities.  (See the video explaining this charge.)

However, can people really be motivated to act given the complexity of the problem?

Kramer harkens to Howard Zinn’s autobiographical book, You Can’t Be Neutral on a Moving Train (1994).  In it Zinn says that we have to “look to history” to see that people working at the grassroots level were able to make change despite tremendous and entrenched obstacles.  It was people who ended slavery and Apartheid, liberated India, dismantled the Soviet Union and initiated the Arab Spring of 2011.  (See a video with Zinn)

“Climate change is a political issue,” Kramer insisted.  “We know what to do.  We know that we need to mitigate the carbon emissions from fossil fuels.  What we lack is the political will and the mechanisms to move forward.”

Kramer insisted that climate change is not a party or ideological issue but rather a humanity issue.

“Planet Earth will survive,” he concluded, “but will human civilization?”

Disaster Relief Donations Track Number of People Killed, Not Number of Survivors (Science Daily)

Sep. 23, 2013 — People pay more attention to the number of people killed in a natural disaster than to the number of survivors when deciding how much money to donate to disaster relief efforts, according to new research published in Psychological Science, a journal of the Association for Psychological Science. The donation bias can be reversed, however, with a simple change in terminology.

“While fatalities have a severe impact on the afflicted community or country, disaster aid should be allocated to people affected by the disaster — those who are injured, homeless, or hungry,” says lead researcher Ioannis Evangelidis of Rotterdam School of Management, Erasmus University (RSM) in the Netherlands. “Our research shows that donors tend not to consider who really receives the aid.”

This discrepancy leads to a “humanitarian disaster,” say Evangelidis and colleague Bram Van den Bergh, where money is given disproportionately toward the natural disasters with the most deaths, instead of the ones with the most people in desperate need of help.

The researchers began by examining humanitarian relief data for natural disasters occurring between 2000 and 2010. As they expected, they found that the number of fatalities predicted the probability of donation, as well as the amount donated, by private donors in various disasters. Their model estimated that about $9,300 was donated per person killed in a given disaster. The number of people affected in the disasters, on the other hand, appeared to have no influence on the amount donated to relief efforts.
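The shape of such a model can be sketched as a simple least-squares fit. The numbers below are invented for illustration, not the study's dataset: donations are generated from fatalities alone (seeded at roughly $9,300 per death), and the fit recovers a large per-fatality coefficient and a near-zero coefficient on the "affected" count.

```python
import numpy as np

# Synthetic disaster data (illustrative only; not the 2000-2010 relief data).
rng = np.random.default_rng(0)
n = 200
fatalities = rng.uniform(100, 10_000, n)
affected = rng.uniform(1_000, 1_000_000, n)
# Donations driven purely by fatalities, plus noise; "affected" plays no role.
donations = 9_300 * fatalities + rng.normal(0, 1e6, n)

# Ordinary least squares: donations ~ fatalities + affected + intercept.
X = np.column_stack([fatalities, affected, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, donations, rcond=None)
per_fatality, per_affected = coef[0], coef[1]
```

The fitted `per_fatality` lands near the assumed $9,300 while `per_affected` is indistinguishable from zero, mirroring the pattern the researchers report in the real relief data.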

Evangelidis and Van den Bergh believe that donors are more likely to pay attention to a death toll when deciding how much to give because the term “affected” is ambiguous. In many cases, though, fatalities don’t correlate with the number of actual people in need.

To find a way to combat this donation bias, the researchers brought participants into the laboratory and presented them with several scenarios, involving various types of disasters and different numbers of people killed and affected.

Overall, participants allocated more money when a disaster resulted in a high death toll — even when the number of people affected was low — mirroring the data from the real natural disasters.

The bias was reversed, however, when participants had to compare two earthquakes — one that killed 4,500 and affected 7,500 versus one that claimed 7,500 and affected 4,500 — before allocating funds.

The act of comparing the two disasters seems to have forced the participants to think critically about which group actually needed the aid more. Notably, the effect carried over when the participants were asked to allocate funds for a third disaster.

But the easiest, and most realistic, way to reduce the donation bias may involve a simple change in terminology. When the researchers swapped the term “affected” with the much less ambiguous term “homeless,” participants believed that money should be allocated according to the number of homeless people following a disaster.

“Above all, attention should be diverted from the number of fatalities to the number of survivors in need,” Evangelidis and Van den Bergh conclude. “We are optimistic that these insights will enhance aid to victims of future disasters.”

Journal Reference:

  1. Evangelidis, I., & Van den Bergh, B. The Number of Fatalities Drives Disaster Aid: Increasing Sensitivity to People in Need. Psychological Science, 2013. DOI: 10.1177/0956797613490748

Fifth IPCC report: the reactions

Document released this Friday (Sept. 27) states that the planet's temperature may rise by almost 5 °C over this century, which could raise ocean levels by up to 82 centimeters (Arctic Ocean photo: NASA)

Features

Fifth IPCC report shows intensification of climate change

27/09/2013

By Karina Toledo, in London

Agência FAPESP – If greenhouse gas emissions continue to grow at current rates over the coming years, the planet's temperature could rise by as much as 4.8 degrees Celsius this century – which could produce a sea-level rise of up to 82 centimeters and cause major damage to most of the world's coastal regions.

The warning comes from scientists of the United Nations (UN) Intergovernmental Panel on Climate Change (IPCC), who released the first part of their fifth assessment report (AR5) on September 27 in Stockholm, Sweden. Based on a review of thousands of studies conducted over the past five years, the document presents the scientific basis of global climate change.

According to Paulo Artaxo, a professor at the Institute of Physics of the University of São Paulo (USP) and one of the six Brazilians who took part in preparing the report, four different scenarios of greenhouse gas concentrations that could come about by the year 2100 were simulated – the so-called Representative Concentration Pathways (RCPs).

"To project the temperature increase you need two basic ingredients: a climate model and an emissions scenario. The fourth report (released in 2007) also simulated four scenarios, but it took into account only the quantity of greenhouse gases emitted. In this fifth report we used a more complete approach, which takes into account the impacts of those emissions – that is, how much the radiation balance of the Earth system will change," explained Artaxo, who is in London for FAPESP Week London, where he took part in a panel on climate change.

The radiation balance is the difference between the amount of solar energy that enters our planet and the amount that leaves it, indicating how much energy is stored in the Earth system as a function of greenhouse gas concentrations, emitted aerosol particles and other climate agents.

The most optimistic scenario projects that the Earth system will store an additional 2.6 watts per square meter (W/m²). In that case, the rise in the Earth's temperature could range between 0.3 °C and 1.7 °C from 2010 to 2100, and sea level could rise between 26 and 55 centimeters over this century.

"For that scenario to come about, greenhouse gas concentrations would have to be stabilized within the next 10 years, together with active removal of the gases from the atmosphere. Even so, the models indicate an additional warming of almost 2 °C – on top of the 0.9 °C our planet has already warmed since 1750," Artaxo noted.

The second scenario (RCP4.5) projects storage of 4.5 W/m². In that case, the rise in the Earth's temperature would be between 1.1 °C and 2.6 °C and sea level would rise between 32 and 63 centimeters. In the third scenario, of 6.0 W/m², the temperature increase ranges from 1.4 °C to 3.1 °C and sea level would rise between 33 and 63 centimeters.

The worst scenario, in which emissions continue to grow at a rapid pace, projects an additional storage of 8.5 W/m². In that situation, according to the IPCC, the Earth's surface could warm between 2.6 °C and 4.8 °C over this century, raising ocean levels by between 45 and 82 centimeters.
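For reference, the four scenarios just described can be collected into a small lookup. The values are transcribed from this article (warming from 2010 to 2100 and sea-level rise over this century); this is a minimal sketch, not IPCC source data:

```python
# RCP scenarios as summarized in the article: radiative forcing (W/m^2)
# mapped to (warming range in °C, sea-level rise range in cm).
RCP_SCENARIOS = {
    "RCP2.6": ((0.3, 1.7), (26, 55)),  # most optimistic
    "RCP4.5": ((1.1, 2.6), (32, 63)),
    "RCP6.0": ((1.4, 3.1), (33, 63)),
    "RCP8.5": ((2.6, 4.8), (45, 82)),  # emissions keep accelerating
}

def describe(name: str) -> str:
    """One-line summary of a scenario's projected ranges."""
    (t_lo, t_hi), (s_lo, s_hi) = RCP_SCENARIOS[name]
    return f"{name}: +{t_lo}-{t_hi} °C, sea level +{s_lo}-{s_hi} cm"

for name in RCP_SCENARIOS:
    print(describe(name))
```

The scenario labels themselves encode the forcing: RCP8.5 means an extra 8.5 W/m² retained by the Earth system, which is why its ranges are the widest.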

"Ocean levels have already risen by an average of 20 centimeters between 1900 and 2012. If they rise another 60 centimeters, then combined with the tides the result will be severe erosion in coastal areas all over the world. Rivers such as the Amazon, for example, will suffer a strong backflow of salt water, which affects the entire local ecosystem," said Artaxo.

According to the IPCC's AR5 report, in all scenarios it is very likely (90% probability) that the rate of sea-level rise during the 21st century will exceed that observed between 1971 and 2010. Thermal expansion resulting from rising temperatures and the melting of glaciers would be the main causes.

Ocean warming, the report says, will continue for centuries even if greenhouse gas emissions fall or remain constant. The Arctic region will warm most strongly, according to the IPCC.

According to Artaxo, the warming of ocean waters has other relevant consequences as well, ones that were not properly accounted for in earlier climate models. As the ocean warms, it loses the capacity to absorb carbon dioxide (CO2) from the atmosphere. If current emission levels are maintained, the growth of atmospheric concentrations of the gas could therefore accelerate.

"In the previous report, the chapters on the role of the oceans in climate change lacked experimental data. But in recent years climate science has advanced enormously. In this fifth report, thanks to satellite measurements and to observations from buoy networks – such as those of the Pirata Project, which FAPESP funds in the South Atlantic – confidence about the oceans' impact on the climate has improved greatly," Artaxo said.

Ocean acidification

In all the scenarios in the fifth IPCC report, CO2 concentrations will be higher in 2100 than they are today, as a result of the cumulative growth of emissions over the 20th and 21st centuries. Part of the CO2 emitted by human activity will continue to be absorbed by the oceans, so it is "virtually certain" (99% probability) that the acidification of the seas will increase. In the best scenario – RCP2.6 – the drop in pH will be between 0.06 and 0.07. In the worst case – RCP8.5 – between 0.30 and 0.32.

"Seawater is alkaline, with a pH of around 8.12. But when it absorbs CO2, acidic compounds form. These acids dissolve the shells of some marine microorganisms, which are generally made of calcium carbonate. Most marine biota will undergo profound changes, which also affects the entire food chain," Artaxo said.
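Because pH is a base-10 logarithmic scale, those seemingly small drops translate into large relative increases in hydrogen-ion concentration. The sketch below applies the standard definition (pH = -log10 of [H+]) to the scenario figures quoted above:

```python
# A drop of d pH units multiplies the hydrogen-ion concentration by 10**d,
# since pH = -log10([H+]).
def acidity_increase_factor(ph_drop: float) -> float:
    """Factor by which [H+] rises for a given drop in pH."""
    return 10 ** ph_drop

# Best case (RCP2.6): pH falls by about 0.06-0.07 units
print(round(acidity_increase_factor(0.065), 2))  # 1.16 -> ~16% more acidic
# Worst case (RCP8.5): pH falls by about 0.30-0.32 units
print(round(acidity_increase_factor(0.31), 2))   # 2.04 -> roughly twice as acidic
```

In other words, the worst-case drop of ~0.3 pH units corresponds to a near-doubling of ocean acidity, which is why the report treats it as a profound change for calcifying organisms.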

Looking at the changes that have already occurred, the IPCC scientists state that the last three decades were the warmest of any since 1850, and the first decade of the 21st century was the warmest of all. The period between 1983 and 2012 was "very likely" (90% probability) the warmest of the last 800 years. There is also about a 60% probability that it was the warmest of the last 1,400 years.

However, the IPCC acknowledges that the planet's rate of warming has slowed over the last 15 years – from 0.12 °C per decade (for the period between 1951 and 2012) to 0.05 °C per decade (considering only 1998 to 2012).

According to Artaxo, the phenomenon is due to two main factors: greater heat absorption in deep waters (below 700 meters) and a higher frequency of La Niña events, which alter the rate at which heat is transferred from the atmosphere to the oceans. "The process is clear and well documented in prestigious scientific journals. Even so, the planet continues to warm significantly," he said.

There is 90% certainty that the number of cold days and nights has decreased while warm days and nights have increased on a global scale, and about 60% certainty that heat waves have also increased. The report cites strong evidence of ice loss, especially in the Arctic region. There is 90% certainty that the rate of sea-ice decline was between 3.5% and 4.1% per decade between 1979 and 2012.

Atmospheric CO2 concentrations have already risen by more than 20% since 1958, when systematic measurements began, and by about 40% since 1750. According to the IPCC, the increase is the result of human activity, mainly the burning of fossil fuels and deforestation, with a small contribution from the cement industry.

The scientists have "very high confidence" (nine chances in ten) that the average rates of increase of CO2, methane and nitrous oxide over the last century are the highest of the past 22,000 years. Changes in solar irradiance and volcanic activity, by contrast, contributed only a small fraction of the climate change. It is "extremely likely" (95% certainty) that human influence on the climate caused more than half of the temperature increase observed between 1951 and 2010.

"The effects of climate change are already being felt; this is not something for the future. The increase in heat waves, in the frequency of hurricanes, in floods and severe storms, and in abrupt swings between hot and cold days is probably related to the fact that the climate system is being altered," Artaxo said.

Persistent impact

In the IPCC's assessment, many aspects of climate change will persist for many centuries even if greenhouse gas emissions cease. It is "very likely" (90% certainty) that more than 20% of the CO2 emitted will remain in the atmosphere for more than a thousand years after emissions stop, the report states.

"What we are altering is not the climate of the next decade or even of the end of this century. There are several published simulations showing high CO2 concentrations persisting until the year 3000, because the processes that remove CO2 from the atmosphere are very slow," Artaxo said.

For the USP professor, the impacts are significant and severe, but not catastrophic. "It is certain that many coastal regions will suffer heavy erosion and millions of people will have to be moved from where they live today. But it is clearly not the end of the world. The question is: how will we adapt, who will manage the governance of this global system, and where will the resources come from so that developing countries can build sea barriers like the ones already being expanded in the Netherlands? The sooner this is planned, the smaller the socioeconomic impacts will be," he said.

The impacts of, and forms of adaptation to, the new climate reality will be the subject of the second part of the fifth IPCC report, scheduled for release in January 2014. That document involved the collaboration of seven Brazilian scientists. Another 13 Brazilians took part in preparing the third part of AR5, which discusses ways of mitigating climate change and is due out in March.

Overall, the number of scientists from developing countries, particularly Brazil, has grown within the IPCC. "Brazil is currently one of the leading countries in climate change research. Moreover, the IPCC realized that if the focus stayed only on developed countries, important information about what is happening in the tropics could be left out. And that is where the Amazon, a key ecosystem for the planet, is located," Artaxo said.

On September 9, the Brazilian Panel on Climate Change (PBMC) released the executive summary of its first National Assessment Report (RAN1). The document, prepared along the same lines as the IPCC report, indicates that by 2100 the temperature increase in Brazil will be between 1 °C and 6 °C compared with that recorded at the end of the 20th century. As a consequence, rainfall should decrease significantly across much of the country's central, North and Northeast regions. In the South and Southeast, on the other hand, precipitation will increase.

"Humanity has never faced a problem whose importance comes close to that of climate change, which will affect absolutely every living being on the planet. We have no global governance system to implement emission-reduction and verification measures. That is why it will still take at least a few decades for the problem to begin to be solved," Artaxo said.

For the researcher, the most urgent measure is reducing greenhouse gas emissions – a commitment that must be taken on by all nations. "The awareness that we all share the same boat is very strong today, but there are still no global governance mechanisms to steer that boat in the right direction. That will have to be built by our generation," he concluded.

*   *   *

JC e-mail 4822, September 27, 2013.

IPCC launches new climate change report in Sweden (O Globo)

The document compiles projections of global warming through 2100, pointing to a temperature increase of more than 2 degrees. "Human influence on the climate is clear," the report states.

The UN Intergovernmental Panel on Climate Change (IPCC) projects that global warming by the end of the 21st century will "likely exceed" 2 degrees (with at least a 66% chance of this happening), surpassing the limit experts consider safe. By 2100, sea level is expected to rise dangerously, by 45 to 82 centimeters in the worst-case scenario (or 26 to 55 centimeters in the best), and Arctic ice could shrink by as much as 94% during the local summer.

In addition, the IPCC report – published on Friday morning in Stockholm, Sweden, with the most up-to-date scientific foundations on climate change – finds there is more than 95% certainty that humans caused more than half of the average temperature rise between 1951 and 2010. Over that period, ocean levels rose 19 centimeters.

The document, the work of 259 scientists and government representatives from 195 countries, including Brazil, stresses that part of the human-caused CO2 emissions will continue to be absorbed by the oceans. It is therefore virtually certain (99% probability) that ocean acidification will increase, profoundly affecting marine life.

– The change in the planet's surface temperature should exceed 1.5 degrees and will probably be above 2 degrees – said Thomas Stocker, co-chair of the working group. – It is very likely that heat waves will occur more often and last longer. As the Earth warms, we expect currently wet regions to receive more rain and arid ones less, although there will be exceptions.

The experts made four projections based on different greenhouse gas emission scenarios. All show a temperature increase. The mildest ranges lie between 0.3 °C and 1.7 °C; for those, emissions would have to be cut sharply. In the most pessimistic scenario, warming would be between 2.6 °C and 4.8 °C.

– In my opinion the report is very good, full of information, all of it very well grounded – commented Brazilian specialist Suzana Kahn, who is part of the group of IPCC researchers in Stockholm. – Ultimately, the big gain is the confirmation of what has been said for some time, with much more information on the role of the oceans, clouds and aerosols. This is very important for the scientific world, because it points to areas that need further investigation.

Suzana says the difficulty of producing projections and analyses for the Southern Hemisphere caused considerable controversy. The big problem is the lack of data:

– I think this helped confirm that we were right to create the Brazilian Panel on Climate Change to fill this gap.

According to Carlos Rittl, coordinator of WWF-Brazil's Climate Change and Energy Program, the IPCC report leaves a clear message: global warming is indisputable, and action against its worst effects must begin now.

– The IPCC report reaffirms some certainties and goes further. It points to the loss of ice mass, the rise in ocean levels, increased rainfall where it already rains heavily, and reduced moisture in the most arid regions. In Brazil, the semi-arid region may become even more arid. The South and Southeast, on the other hand, may get more rain than today – Rittl listed. – It is important to stress that the report speaks in averages. In some regions of the country there could be an increase of six degrees, with peaks of more than eight. We are not prepared for that. And our emissions put us on a very high-risk trajectory, closer to the worst scenarios.

For Rittl, science is sending a clear message: emissions must be reduced. That means bringing climate change issues into political decision-making:

– Today global warming is still a marginal topic. Investments in infrastructure and energy, the Plano Safra – all the big investments on the scale of trillions of reais have practically no link to a low-carbon logic. We are not being very responsible.

Diplomacy and science
John Kerry, the US Secretary of State, stresses that the IPCC report must not be left forgotten in a cabinet, nor read as a political piece produced by politicians: "it is science," he sums up. According to him, the United States is "deeply committed to combating climate change," reducing greenhouse gas emissions and investing in efficient forms of energy.

– This is one more wake-up call: those who deny the science or look for excuses to avoid action are playing with fire – Kerry said. – The IPCC report's bottom line is this: climate change is real, it is happening now, human beings are the cause of this transformation, and only human action can save the world from its worst impacts.

Released roughly every six years, the IPCC reports have come under criticism from specialists. The process of having the documents reviewed by government representatives ends up producing something halfway between diplomacy and science, explains Emilio La Rovere, a researcher at Coppe/UFRJ.

– The result is a middle ground between diplomacy and science – La Rovere said.
One of the most controversial points is the so-called hiatus: periods of about 15 years in which the planet's average temperature does not rise. This week, for example, Spain's State Meteorological Agency announced that the last quarter on the Iberian Peninsula was the least warm since 2008. The 2007 IPCC report, however, did not mention these pauses in the warming, handing an argument to the skeptics.

Limit on global warming
IPCC information is important for devising strategies to combat climate change. At the Climate Convention in Denmark (COP-15, held in 2009), the goal of limiting global warming to 2 degrees was established. For that to happen by 2050, explains Emilio La Rovere of Coppe/UFRJ, emissions would have to be cut by 80% compared with 1990:

– The mathematical models simulating demographic trends, the world economy, and energy supply and demand show that it becomes all but impossible to reach this target.

This is the IPCC's Fifth Assessment Report, which is being released in four parts between September 2013 and November 2014. This Friday, the Working Group I document (on the scientific aspects of climate change) was published. From March 25 to 29, 2014, it will be the turn of Working Group II (analyzing impacts, adaptation and vulnerability), which will meet in Yokohama, Japan. Working Group III (specializing in the mitigation of climate change impacts) is scheduled for April 7 to 11 in Berlin, Germany. Finally, a synthesis report will be produced, with work taking place from October 27 to 31 in Copenhagen, Denmark.

(Cláudio Motta/O Globo)

http://oglobo.globo.com/ciencia/ipcc-lanca-na-suecia-novo-relatorio-sobre-as-mudancas-climaticas-10173671#ixzz2g6D6tmU6

Related coverage from O Globo:

Report highlights how human action aggravates warming
http://oglobo.globo.com/ciencia/relatorio-destaca-como-acao-humana-agrava-aquecimento-1-10171896#ixzz2g6MfZIsl

See more at:

Folha de S.Paulo
Report raises the tone of its warming alert
http://www1.folha.uol.com.br/fsp/saudeciencia/131005-relatorio-sobe-tom-de-alerta-sobre-aquecimento.shtml

Agência Estado
UN blames human activities for global warming
http://www.territorioeldorado.limao.com.br/noticias/not297255.shtm

Zero Hora
Earth's temperature will rise between 0.3 °C and 4.8 °C this century, IPCC says
http://zerohora.clicrbs.com.br/rs/geral/planeta-ciencia/noticia/2013/09/temperatura-da-terra-subira-entre-0-3c-e-4-8c-neste-seculo-aponta-ipcc-4283083.html

Correio Braziliense
Planet's temperature will rise between 0.3 and 4.8 °C in the 21st century, says IPCC
http://www.correiobraziliense.com.br/app/noticia/ciencia-e-saude/2013/09/27/interna_ciencia_saude,390388/temperatura-do-planeta-subira-entre-0-3-e-4-8-c-no-seculo-21-diz-ipcc.shtml

Behind the deforestation of the Amazon (Fapesp)

More than 50% of the biome's greenhouse gas emissions are driven by demand from the rest of the country and from abroad for goods produced in the region, according to a study conducted at USP (NASA)

Features

20/09/2013

By Elton Alisson

Agência FAPESP – Brazil's domestic consumption and its exports of soybeans, beef and other primary products from the Amazon are responsible for more than half of the deforestation rates and, consequently, of the greenhouse gas (GHG) emissions recorded in the biome.

The assessment comes from a study carried out by researchers at the School of Economics, Business Administration and Accounting (FEA) of the University of São Paulo (USP), as part of a Thematic Project under the FAPESP Research Program on Global Climate Change (PFPMCG).

The results were presented on September 12 at the 1st National Conference on Global Climate Change (Conclima), held in São Paulo by FAPESP in partnership with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC).

"More than half of the Amazon's GHG emissions occur because of consumer demand from outside the region, whether for domestic supply or for export," said Joaquim José Martins Guilhoto, a professor at FEA and one of the researchers on the project.

According to data presented by the researcher, drawn from the second National Inventory of Greenhouse Gas Emissions – published at the end of 2010 and covering the period from 1990 to 2005 – Brazil emitted more than 2.1 gigatonnes of CO2 equivalent in 2005. The Amazon accounts for more than 50% of the country's GHG emissions.

To identify and understand the economic drivers of deforestation, and hence of GHG emissions, in the Amazon that year, the researchers mapped direct emissions by productive activity, separating the Amazon region from the rest of Brazil, and calculated each one's share of CO2-equivalent emissions, as well as the share attributable to exports.

The calculations revealed that direct exports from the Amazon account for 16.98% of the region's GHG emissions. Exports from the rest of the country account for a further 6.29% of Amazon emissions, since some products from the region are processed and exported by other Brazilian states.

Domestic consumption, in turn, accounts for 46.13% of Amazon emissions – 30.01% from consumption in the rest of the country and 16.12% from consumption within the Amazon region itself, the study shows.

"The sum of these percentages shows that more than 50% of the Amazon's GHG emissions occur because of goods that are produced in the region but consumed outside it," Guilhoto said. "This finding indicates that external factors are the most important in explaining the Amazon's GHG emissions."
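The arithmetic behind Guilhoto's "more than 50%" can be checked directly from the shares quoted above:

```python
# Shares of Amazon GHG emissions by source of final demand (from the study).
shares = {
    "direct exports from the Amazon": 16.98,
    "exports routed through other states": 6.29,
    "consumption in the rest of Brazil": 30.01,
    "consumption within the Amazon": 16.12,
}

# Demand originating outside the region is everything except local consumption.
external = sum(v for k, v in shares.items()
               if k != "consumption within the Amazon")
print(round(external, 2))              # 53.28 -> the "more than 50%" from outside
print(round(sum(shares.values()), 2))  # 69.4  -> total explained by these flows
```

The 53.28% combines direct exports (16.98%), indirect exports via other states (6.29%), and consumption in the rest of Brazil (30.01%).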

According to the study, cattle ranching, soybean farming and other agricultural activities are the productive sectors that contribute most to the Amazon's GHG emissions. Beyond those, other economic sectors, such as furniture making, are heavily dependent on inputs produced in the region.

"The data obtained in the study show that, in general, although the Amazon depends far more on inputs produced by the rest of Brazil, the limited dependence the rest of Brazil has on the biome is concentrated in inputs strongly linked to GHG emissions in the region," Guilhoto summarized.

Reduction in deforestation

Another study by FEA researchers under the same Thematic Project found that between 2002 and 2009 there was a major expansion of Brazil's farmed area and, at the same time, a drastic reduction in Amazon deforestation rates.

Sugarcane, soybeans and corn accounted for 95% of the net expansion of harvested area between 2002 and 2009, while the cattle herd grew by 26 million head. Over the same period, deforestation in the Amazon fell by 79%.

To investigate the main drivers of deforestation in the country, given that much of the agricultural expansion occurs outside the Amazon, the researchers carried out a study using integrated spatial analyses of Brazilian territory, covering the country's six biomes.

For this they used deforestation data from the Prodes Project, of the Ministry of the Environment (MMA) and the Brazilian Institute of the Environment and Renewable Natural Resources (Ibama), as well as georeferenced images from NASA's Landsat satellites.

The study revealed that, from 2002 to 2009, 12.062 million hectares were cleared in the Amazon, 10.015 million hectares in the Cerrado, 1.846 million hectares in the Caatinga, 447,000 hectares in the Pantanal, 375,000 hectares in the Atlantic Forest and 257,000 hectares in the Pampa.

"The sum of these figures means that in seven years Brazil cleared an area equivalent to the state of São Paulo plus the Triângulo Mineiro, or one Great Britain," calculated Rafael Feltran-Barbieri, a researcher at FEA and one of the study's authors.
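A quick check of Feltran-Barbieri's comparison, using the biome figures above (the ~248,000 km² reference area for the state of São Paulo is our assumption, not a figure from the article):

```python
# Area cleared per biome, 2002-2009, in millions of hectares (from the study).
cleared_mha = {
    "Amazon": 12.062, "Cerrado": 10.015, "Caatinga": 1.846,
    "Pantanal": 0.447, "Atlantic Forest": 0.375, "Pampa": 0.257,
}

total_mha = sum(cleared_mha.values())
total_km2 = total_mha * 10_000  # 1 million hectares = 10,000 km^2
print(round(total_mha, 3))  # 25.002 million hectares
print(round(total_km2))     # 250020 km^2 -- comparable to São Paulo state
                            # (~248,000 km^2, assumed reference figure)
```

The total of roughly 250,000 km² is what makes the São Paulo and Great Britain comparisons in the quote plausible orders of magnitude.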

According to the researcher, one of the study's main conclusions is that the other biomes are acting as a kind of "buffer" for Amazon deforestation.

"When we consider Brazil's agricultural expansion as a whole, we see that much of the reduction in Amazon deforestation rates is due to the other biomes bearing the consequences [recording increased deforestation]," he said.

Another conclusion is that the drivers of deforestation in Brazil interact spatially, since the expansion of the various agricultural activities – such as sugarcane and soybean cultivation or cattle ranching – occurs concurrently and competes for territory.

In the case of sugarcane, one finding was that from 2002 to 2009 the crop came to occupy areas cleared by other agricultural activities, even though it does not itself tend to drive clearing.

"We are finding an almost complementary pattern among the expansions [in the different biomes], which makes the deforestation effects highly correlated," said Feltran-Barbieri.

"This finding leads to the conclusion that if Brazil intends to take a genuinely responsible position on climate change – and agriculture depends on that – it needs strategic land-use planning for its territory, because sectoral planning has not been able to capture these synergistic effects," he said.

UN climate panel raises the alarm over global warming (Folha de S.Paulo)

27/09/2013 – 05h07

RAFAEL GARCIA, SPECIAL ENVOY TO STOCKHOLM

Updated at 09h47.

Anyone expecting the IPCC (Intergovernmental Panel on Climate Change) to issue a weaker report in light of the recent slowdown in global warming instead watched the panel's scientists raise their tone of alarm, highlighting an accelerating rise in sea level and in Arctic melting.

Besides maintaining that climate change is "unequivocal," the document presents projections for the future with an improved degree of certainty. "The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea level has risen, and the concentrations of greenhouse gases have increased," says the Summary for Policymakers, the section of the document released this morning.

By the end of this century, the average temperature of the Earth's surface will likely exceed an increase of 1.5 °C relative to the 1850–1900 average. This holds for all the scenarios considered except the most optimistic one, in which greenhouse gas emissions are cut drastically, peaking ten years from now and falling to zero 50 years later.

In an intermediate scenario, in which emissions double by 2060 and then fall back, before 2100, to a level still above today's, the temperature increase will likely exceed 2 °C, the threshold scientists consider dangerous.

Demonstrators shout in protest in Stockholm, where the UN climate panel's new report was released

In the worst scenario, one in which humanity triples greenhouse gas emissions by 2100, the average temperature would likely rise by up to 4.8 °C.

The word "likely," in the panel's language, means a probability above 66%, the best confidence attainable for future projections based on mathematical climate-simulation models.

When assessing the effect of greenhouse gas emissions on past climate, the report is far more emphatic. It states that it is "extremely likely" (95% certainty) that the warming observed since the mid-20th century is the result of human influence on the climate. The previous report, from 2007, said the same was "very likely" (90%).
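The IPCC's calibrated likelihood vocabulary maps fixed terms to probability thresholds. A minimal lookup, restricted to the terms quoted in these articles, makes the AR4-to-AR5 upgrade concrete:

```python
# Calibrated IPCC likelihood terms and their probability thresholds,
# limited to the ones used in the articles above.
LIKELIHOOD = {
    "virtually certain": 0.99,
    "extremely likely": 0.95,
    "very likely": 0.90,
    "likely": 0.66,
}

def stronger(term_a: str, term_b: str) -> str:
    """Return whichever calibrated term asserts the higher probability."""
    return max(term_a, term_b, key=LIKELIHOOD.__getitem__)

# AR5 upgraded the human-attribution statement from AR4's "very likely":
print(stronger("extremely likely", "very likely"))  # extremely likely
```

Each step up the scale is a deliberate strengthening of the claim, which is why the shift from 90% to 95% on human attribution was widely reported.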

OCEANS

The report also states with "high confidence" that ocean warming dominates the increase in energy stored in the climate system. More than 90% of the energy accumulated by the Earth since 1971 has been stored in the seas, the report says.

"As the oceans warm and glaciers and ice sheets shrink, sea level will continue to rise, but at a faster rate than we have experienced over the past 40 years," said Qin Dahe, co-chair of IPCC Working Group I.

Sea level rose 19 cm between 1900 and 2012. Even in the best scenario, with a sharp drop in emissions, that figure could nearly triple, because the oceans have already absorbed so much heat. The most pessimistic estimate within the worst scenario foresees a rise of 82 cm, which would affect a population on the order of tens of millions of people.

One reason for the revised figures is that the models now handle the melting of Greenland and West Antarctica better. Previously, the numbers were essentially a projection of the thermal expansion of ocean water, which remains the largest single cause of sea-level rise.

HIATO

A questão da desaceleração do aquecimento global nos últimos 15, o “hiato” da mudança climática, tomou grande parte do tempo de discussão dos delegados do IPCC que se reuniram nesta semana. O texto final do relatório lidou com o fenômeno afirmando que a tempereatura média global de superfície “exibe substancial variabilidade interanual e decadal” apesar de um “robusto aquecimento multi-decadal” verificado.

“Como exemplo, a taxa de aquecimento ao longo dos últimos 15 anos (1998-2012); 0,05°C por década), que começa com um forte El Niño, é menor do que a taxa calculada desde de 1951 (1951-2012; 0,12°C por década)”, afirma o relatório.
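The two rates quoted are simple linear trends fitted over different windows, which is why the choice of 1998, an El Niño year, as a starting point matters. A minimal sketch of the calculation, using a synthetic anomaly series rather than the real observational data:

```python
# Sketch: how a warming rate in °C per decade is obtained -- an ordinary
# least-squares trend fitted to annual temperature anomalies. The series
# below is synthetic (a noise-free 0.12 °C/decade ramp), for illustration
# only; it is not the observational record the report uses.
def warming_rate(years, anomalies):
    """Least-squares slope of the anomaly series, converted to °C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return (cov / var) * 10.0  # per year -> per decade

years = list(range(1951, 2013))                  # the 1951-2012 window
anomalies = [0.012 * (y - 1951) for y in years]  # synthetic anomalies, in °C

print(round(warming_rate(years, anomalies), 2))  # 0.12
```

Re-running the same fit over a short window that begins on an unusually warm year lowers the slope without any change in the underlying long-term series.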

Although recent studies indicate that heat from the atmosphere is being drawn into deep ocean waters, the report does not mention the phenomenon. “More evidence of this needs to be gathered,” says Thomas Stocker, the other co-chair of IPCC Working Group 1.


Demonstrators from the group Avaaz use a giant seesaw to illustrate the 95% certainty that humans are to blame for climate change; on one side sits the majority of scientists and, on the other, a man with a briefcase full of money

A banner at the Stockholm protest reads “the debate is over”, referring to the new climate panel report, which raises the degree of certainty about human responsibility for climate change

A protest in Stockholm, Sweden, calls for immediate action to curb the effects of climate change

*   *   *
24/09/2013 – 02h53

Sea level rise projection worsens

RAFAEL GARCIA

SPECIAL CORRESPONDENT IN STOCKHOLM

Judging by the draft version of the fifth report of the IPCC (the UN climate panel), due out next Friday, the main update in climate change projections will concern not temperature rise but sea level.

When the fourth report, which projected a rise of 18 cm to 59 cm, came out in 2007, many scientists considered it conservative. The draft of the new text now speaks of a range of 28 cm to 89 cm of rise by 2100.

A change on the scale of tens of centimeters in the projection is not small. The margin of error for the most pessimistic scenario approaches a full meter, which would affect areas inhabited by some tens of thousands of people.


The main phenomenon behind sea level rise is the fact that water expands in volume as it warms, and the oceans absorb much of the heat trapped in the atmosphere.

“Thermal expansion is the largest contribution to future sea level rise, accounting for 30% to 55% of the total, with the second-largest contribution coming from glaciers,” states the draft of the document's Summary for Policymakers.
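The percentage range quoted becomes a range in centimeters once a total projection is assumed. A small sketch of that arithmetic, using a hypothetical total of 60 cm (an assumption for illustration, not a figure from the draft):

```python
# Sketch: turning the draft's percentage range for thermal expansion into
# centimeters for an assumed total rise. The 60 cm total is illustrative
# only; it is not a figure from the report.
total_rise_cm = 60.0
thermal_share = (0.30, 0.55)  # thermal expansion: 30% to 55% of the total

low = total_rise_cm * thermal_share[0]
high = total_rise_cm * thermal_share[1]
print(f"thermal expansion: {low:.0f} to {high:.0f} cm")  # thermal expansion: 18 to 33 cm
```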

“There is high confidence that increased surface melting in Greenland will exceed the increase in snowfall, leading to a positive contribution [raising sea level].”

UNCERTAINTIES

If the new IPCC report is more forceful in saying that warming's impact on sea level will be greater, it still has limitations when it tries to specify just how much greater.

One of the problems behind the climate panel's projections is that the balance between melting and ice formation in Antarctica remains hard to predict.

Although most observations and computer models point to melting in the western part of the frozen continent, increased precipitation in East Antarctica leaves the picture uncertain.

Divulgação/Nasa/France Presse
A NASA satellite image shows a crack (center) in a glacier

“There is medium confidence that snowfall in Antarctica will increase, while surface melt will remain small, resulting in a negative contribution [reducing sea level],” the text says.

One advance in the new report is its attempt to deal better with regional uncertainties. For example, although Greenland holds the land ice mass that will contribute most to sea level rise, the sea should not rise there.

Because the region's ice mass will shrink, it loses the gravitational pull that draws water toward its coast. The same should happen around the Antarctic Peninsula.

“The gravitational effect of Antarctic melting, combined with the effect of current dynamics and below-average temperatures, should leave sea level below average at the tip of South America,” says Aimée Slangen, of Utrecht University, in the Netherlands.

“Moving toward the Equator, the gravitational effect reverses, leaving sea level above average.”

Slangen published a study last year on regional differences in the rise of the waterline, but says it is still difficult to produce a precise map. “We need to understand the role of the ice shelves and the glaciers, the thermal effect, and how the crust will move,” she says.

The latest draft of the report states that sea level will rise in 95% of the world's ocean areas, and that 70% of coastal areas will see a rise deviating less than 20% from the average.

For scientists, however, the mapping needs to improve. “For a city or a country, the world average doesn't matter; you need to know what is happening right at your doorstep,” says Slangen.

*   *   *
26/09/2013 – 20h04

Report raises the alarm over global warming

RAFAEL GARCIA, SPECIAL CORRESPONDENT IN STOCKHOLM

The IPCC (the UN climate panel) will release on Friday morning (the 27th) a report that raises the level of alarm over global warming. Although the slowdown in the rise of temperatures over the past 15 years took up much of the discussion, the new version of the document's “Summary for Policymakers” is expected to indicate that climate change has been sharper and faster than anticipated, not the reverse.

Final discussions on the document's content were still under way behind closed doors overnight in Stockholm, but some crucial questions have already been settled. The final version of the text by IPCC Working Group 1, the one responsible for the physical science of climate, is expected to strike a stronger tone than the version sent to governments for comment in June.

Some nations, including Brazil, had asked that the document drop the mention of the 15-year “hiatus”, but it will be kept, accompanied by an explanation.

“It is written that this discontinuity stems from an atypical phenomenon in Pacific waters and that such a short period cannot be used for long-term projections,” one of the Brazilian scientists present at the discussion told Folha.

Christian Åslund/Greenpeace
Greenpeace protest outside the IPCC meeting venue in Stockholm

The debate over this issue took so long that, at the start of the third of the four working days, the panel had covered only 30% of the agenda. “I understand the scientists' frustration at spending so much time on this, but they had no option of avoiding the discussion, which had become public,” said an observer from an NGO attending the meeting.

“What we see is that not-so-high levels of warming are causing bigger impacts than we expected, as in the melting of Arctic and Greenland ice or the rise in sea level.”

Compared with the previous IPCC report, published in 2007, many of the new findings concern the degree of confidence in known phenomena. The certainty that global warming is caused by humans, for example, rose from 90% to 95%. There was even a proposal, ultimately rejected, to raise the figure to 99%, a Dutch delegate reports.

Some attempts to water down the initial draft also failed. In describing the main sources of greenhouse gases, Saudi delegates wanted the text to treat deforestation and agriculture as equal in severity to the burning of fossil fuels, but they were defeated. Although it carries no major changes in content, the final version of the text will come out with “stronger words”, said another observer.

The release of the definitive version of the Working Group 1 “Summary for Policymakers” is scheduled for 5:00 a.m. this Friday. The full version of the report, including the “Technical Summary” aimed at scientists, comes out on Monday.

*   *   *
26/09/2013 – 13h20

See ten questions to understand the UN climate discussions

FROM BBC BRASIL

The Intergovernmental Panel on Climate Change (IPCC) releases a new report on Friday in Stockholm, Sweden, in which it aims to establish, with the highest degree of certainty yet obtained, the role of human activities in climate change.

To help readers understand the subject, the BBC has prepared a list of ten questions and answers:

What is climate change?

The planet's climate has changed constantly over geological time. The global average temperature today is about 15°C, but geological evidence suggests it has been much higher or much lower at other times in the past.

However, the current period of warming is occurring faster than on many past occasions. Scientists are concerned that natural fluctuation, or variability, is giving way to rapid human-induced warming, with serious consequences for the stability of the planet's climate.

What is the ‘greenhouse effect’?

The greenhouse effect refers to the way the Earth's atmosphere “traps” part of the Sun's energy. Energy radiated back from the Earth's surface toward space is absorbed by atmospheric gases and re-emitted in all directions.

The energy radiating back toward the planet warms both the lower atmosphere and the Earth's surface. Without this effect, the Earth would be 30°C colder, leaving conditions on the planet hostile to life.

Scientists believe we are adding to the natural greenhouse effect with gases emitted by industry and agriculture, absorbing more energy and raising the temperature.

The most important of these gases in the natural greenhouse effect is water vapor, but its concentrations show little change. Other greenhouse gases include carbon dioxide, methane and nitrous oxide, which are released by the burning of fossil fuels. Deforestation contributes to their increase by eliminating forests that absorb carbon.

Since the start of the industrial revolution, in 1750, carbon dioxide (CO2) levels have risen more than 30%, and methane levels have grown more than 140%. The concentration of CO2 in the atmosphere is now higher than at any time in the past 800,000 years.

What is the evidence for the warming?

Temperature records, which begin in the late 19th century, show that the average temperature of the Earth's surface has risen about 0.8°C over the past hundred years. About 0.6°C of that warming occurred in the last three decades.

Satellite data show an average rise in sea level of about 3 millimeters per year in recent decades. A large proportion of the change in sea level is due to the thermal expansion of the warming oceans. But the melting of mountain glaciers and of the polar ice sheets also contributes.

Most glaciers in the world's temperate regions and on the Antarctic Peninsula are shrinking. Since 1979, satellite records have shown a dramatic decline in Arctic ice extent, at a rate of 4% per decade. In 2012, ice extent reached the lowest level on record, about 50% below the 1979-2000 average.

The Greenland ice sheet has seen record melting in recent years. If the entire sheet, 2.8 million cubic kilometers of ice, were to melt, sea levels would rise by 6 meters.

Satellite data show that the West Antarctic ice sheet is also losing mass, and a recent study indicated that East Antarctica, which had shown no clear warming or cooling trend, may also have begun losing mass in recent years. But scientists do not expect dramatic changes. In some places, ice mass may actually grow, with rising temperatures driving more snowstorms.

The effects of climate change can also be seen in vegetation and in land animals. They include earlier flowering and fruiting in plants and shifts in the ranges occupied by land animals.

Is there a pause in the warming?

Some experts argue that there has been no significant global warming since 1998, despite the continued rise in CO2 emissions. Scientists have tried to explain this in several ways.

These include variations in the Sun's energy output, a decline in atmospheric water vapor and greater heat absorption by the oceans. But so far there is no general consensus on the precise mechanism behind the pause.

Skeptics highlight the pause as an example of the fallibility of predictions based on computer climate models. Climate scientists, for their part, note that the hiatus appears in only one component of the climate system (the global average surface temperature) and that other indicators, such as melting ice and changes in fauna and flora, show that the Earth is still warming.

How much will temperatures rise in the future?

In its 2007 report, the IPCC projected a rise in global temperature of between 1.8°C and 4°C by 2100.

Even if greenhouse gas emissions fall dramatically, scientists say the effects will continue, because parts of the climate system, particularly the large bodies of water and ice, can take hundreds of years to respond to changes in temperature. It also takes decades for greenhouse gases to be removed from the atmosphere.

What will the impacts be?

The scale of the potential impact is uncertain. The changes could lead to shortages of drinking water, bring major shifts in the conditions for food production and increase the number of deaths from floods, storms, heat waves and droughts.

Scientists predict more rainfall overall, but say the risk of drought in inland areas should rise during hotter summers. More flooding is expected from storms and rising sea levels. There should, however, be wide regional variation in this pattern.

The poorest countries, which are least equipped to cope with rapid change, are expected to suffer most.

Extinctions of plants and animals are predicted, as habitats change faster than species can adapt. The World Health Organization (WHO) has warned that the health of millions of people could be threatened by rising cases of malaria, waterborne diseases and malnutrition.

Greater absorption of CO2 by the oceans could make them more acidic. This ongoing acidification could create major problems for coral reefs, since the chemical changes prevent corals from forming the calcified skeleton that is essential to their survival.

What don't we know?

Computer models are used to study the dynamics of the Earth's climate and to project future temperature changes. But these climate models differ on “climate sensitivity”: the amount of warming or cooling that occurs in response to a specific factor, such as a rise or fall in CO2 concentration.

The models also differ in how they express “climate feedback”.

Global warming is expected to trigger some changes that are likely to create further warming, such as the release of large quantities of greenhouse gases as permafrost (permanently frozen ground) thaws. This is known as a positive climate feedback (in the sense of adding heat).

But there are also negative feedbacks, which offset the warming. For example, the oceans and the land absorb CO2 as part of the carbon cycle.

The question is what the net result of adding up these variables will be.

Will the floods reach me?

Leaked details of the report to be presented this week indicate that in the IPCC's worst-case scenario, the one with the highest level of carbon dioxide emissions, sea levels could rise by as much as 97 centimeters by 2100.

Some scientists criticize the models the IPCC uses to calculate that rise. Using what is called a semi-empirical model, projections for sea level rise can reach 2 meters. Under those conditions, an additional 187 million people worldwide would suffer from flooding.

But the IPCC is expected to say there is no consensus on the semi-empirical approach and to keep its figure just under 1 meter.

What will happen to the polar bears?

The state of the North and South poles has been a growing concern for science, as the effects of global warming become more intense in those regions.

In 2007, the IPCC said that temperatures in the Arctic had risen almost twice as much as the global average over the previous hundred years. The report noted that the region can vary widely, with a warm period observed between 1925 and 1945.

In the drafts of this week's report, the scientists say there is stronger evidence that the ice sheets and glaciers are losing mass and that sea ice cover is shrinking in the Arctic.

Regarding Greenland, which on its own has the capacity to raise global sea levels by 6 meters, the panel says it is 90% certain that the rate of ice loss increased sixfold from the 1992-2001 period to the 2002-2011 period.

While average Arctic ice extent has fallen about 4% per decade since 1979, Antarctic sea ice has grown by up to 1.8% per decade over the same period.

For the future, the predictions are quite dramatic. In the IPCC's worst-case scenario, an ice-free Arctic summer is likely by the middle of this century.

And the outlook for polar bears and other species that live in this environment is not good, Professor Shang-Ping Xie, of the oceanography institute of the University of California, San Diego, told the BBC.

“There will be pockets of sea ice in some marginal seas. We expect polar bears to be able to survive the summer in those remaining pockets of ice,” he said.

How credible is the IPCC?

The global scale of scientific involvement with the IPCC gives a sense of the weight the panel carries.

Divided into three working groups that examine the physical science, the impacts, and the options for limiting climate change, the panel involves thousands of scientists from around the world.

The report to be presented in Stockholm has 209 coordinating authors and 50 review editors from 39 different countries.

The document is based on about 9,000 scientific studies and 50,000 expert comments.

But amid this enormous body of data, things can go wrong.

In the last report, published in 2007, there were a handful of errors that gained wide attention, among them the claim that Himalayan glaciers would disappear by 2035. There was also an error in the projected percentage of Dutch territory lying below sea level.

The IPCC admitted the errors and explained that in a 3,000-page report a few small mistakes are always possible. The Himalayan claim stemmed from the inclusion of an interview that had been published by the magazine New Scientist.

In 2009, a review of how the IPCC handles information suggested that the panel be clearer in the future about the sources of information it uses.

The panel's reputation was also tarnished by association with the scandal sparked by the 2009 leak of emails exchanged between scientists working with the IPCC.

The messages appeared to show some degree of collusion among researchers to make climate data fit the theory of human-induced climate change more neatly.

However, at least three inquiries found no evidence to support that conclusion.

But the net effect of these events on the panel was to make it more cautious.

Although the new report is likely to emphasize greater certainty among scientists that human activities are warming the climate, when it comes to scale, levels and impacts the word “uncertainty” is likely to appear quite often.

*   *   *
23/09/2013 – 03h00

Analysis: With the IPCC on the defensive, new report still carries fuel for controversy

MARCELO LEITE

FROM SÃO PAULO

Six years and seven months ago, when it released the document “Climate Change 2007: The Physical Science Basis”, the IPCC's reputation was at its peak. That same year would bring the Nobel Peace Prize, in recognition of the feat of forcing the issue of climate change onto the world political agenda.

The authority of the group, created by the UN 18 years earlier, was such that the American climatologist Susan Solomon, coordinator of the 21-page text, took a risky step: at the press conference in Paris, she emphatically repeated the assessment that global warming was “unequivocal”.


It was those scientists' rhetorical and political answer to the traction then being gained by the group of so-called “skeptics” (or denialists). After that, the IPCC would never be the same.

In November 2009, a month before the Copenhagen Climate Conference, hackers broke into computers at a British university and stole electronic messages between IPCC climatologists that gave the impression data had been manipulated.

The authors were cleared, but a breach had opened in the panel's prestige. It widened two months later, when it came to light that the report contained a gross error about glacier melt in the Himalayas.

Since then, the IPCC has stayed on the defensive. It created stricter criteria for the kind of scientific data that could underpin its predictions. The result of that retrenchment will show through in the “Fifth Assessment Report” (AR5), whose “Summary for Policymakers” is now entering its final drafting stage.

CONTROVERSIES

It seems unlikely that the summary will repeat value judgments like the one contained in the adjective “unequivocal”. Even so, the document carries fuel for controversy.

The first thing to watch in the report, besides the apparent reduction in the warming rate, is how it handles the state of the ice cap over the Arctic Ocean. The IPCC is expected to reaffirm as very high the probability that the cap will keep shrinking each Northern Hemisphere summer.

The problem is that, at the end of this northern summer, the ice at the North Pole covers an area 50% larger (5.1 million km²) than a year ago. In 2012, it had fallen to 3.4 million km².


As the chart shows, however, in each of the past five years the area stayed far below the 1981-2010 average. It is this time scale of change (decades) that IPCC projections deal with, not year-to-year variation.

The second point to watch is the rise in sea level. Just as with the atmosphere, the greenhouse effect also heats the ocean. That expands its volume, which also grows with the melting of glaciers on land (such as Greenland's).

In the “Fourth Assessment Report”, from 2007, the IPCC ended up accused of being too conservative in projecting that the sea would rise between 18 cm and 59 cm by 2100, since some studies were already pointing to changes of more than 1 meter.

Everything suggests that the upper limit of the panel's new projection will come close to that (draft versions point to as much as 97 cm). That would be enough to flood the homes of tens of millions of people around the world.

The number, on its own, will strike many people as alarming. Its release may or may not come accompanied by sober caveats, such as noting the 86-year window (2014 to 2100) available for adaptation.

If they do not navigate these traps skillfully, the IPCC's scientists and spokespeople risk squandering the opportunity to rebuild its credibility as a scientific body and seeing it dismissed, once again, as a militant for the alarmist cause.

*   *   *
23/09/2013 – 19h19

IPCC promises an unbiased report with more rigorous review

RAFAEL GARCIA

SPECIAL CORRESPONDENT IN STOCKHOLM

The head of the team of climatologists that wrote the IPCC's latest update on the physical science of climate, Working Group 1, promised yesterday to deliver a report with an “unprecedented and unbiased view of the state of the climate system”.

Opening the first working session to finalize the first of the three parts of AR5, the UN climate panel's fifth assessment report, Thomas Stocker said he had taken into account the more than 50,000 comments the document received from government delegations, which have already had access to the report's “summary for policymakers”. “I know of no other document that has undergone scrutiny like this.”

After one last round of discussions, the final version of the report comes out this Friday, and the draft indicates that the probabilistic certainty that human activities are to blame for global warming should rise from 90% to 95%, relative to what the IPCC concluded in 2007.

Qin Dahe, co-chair of IPCC Working Group 1, stressed, however, that although uncertainties remain in some aspects of the report, the human role in the climate is not being revised. “The scientific evidence for anthropogenic climate change has strengthened year after year, leaving few uncertainties about the consequences of inaction.”

A leaked version of the Working Group 1 technical report, a more complete document than the summary for policymakers, indicates there is room for scientists to explain the recent pause in global warming: the period of the past 15 years in which the rise in temperatures slowed. A version of the document obtained by the Reuters news agency states that “15-year hiatus periods are common” in the climate record. The mention of the hiatus is one of the points the Brazilian government has criticized in the report.

*   *   *

25/09/2013 – 02h57

For the climate panel, clouds should amplify warming

RAFAEL GARCIA

SPECIAL CORRESPONDENT IN STOCKHOLM

The role of clouds in climate change is still a subject riddled with uncertainty, but the next report from the IPCC (the UN climate panel) takes a step toward a verdict.

According to a draft version of the document, what the science shows so far is that the change global warming causes in the dynamics of cloud formation helps amplify the heat.

The draft of AR5, the IPCC's fifth report, due out on Friday, acknowledges that clouds have paradoxical effects on the climate, with some making the Earth retain more heat and others making the planet reflect solar radiation.

The text states, however, that “the sign of the net radiative feedback is likely positive”. In other words, scientists are at least 66% certain that climate change creates a vicious cycle, in which clouds warm a planet that creates more warming-inducing clouds.

It is still not a verdict with a crushing degree of certainty like the one attributed to greenhouse gases (the report blames human fossil fuel burning for global warming with 95% certainty). But it is a major step forward from the last report, AR4, in 2007, which declared itself unable to reach a consensus on the question.


“The fact that AR4 put its finger on the wound and said that aerosols and clouds were the most critical part led thousands of scientists to turn their research toward that area,” says Paulo Artaxo, a climatologist at USP and co-author of the chapter devoted to the topic in the new report.

The biggest problem was, and still is, knowing whether low clouds have a positive or negative effect on warming. Because they sit under higher pressure, they are more compact and help reflect sunlight. Ordinary water vapor and higher, thinner clouds, by contrast, retain heat.

The effect of low clouds was one of the aspects most exploited by groups that deny human-caused global warming.

Even though the degree of certainty of the new answer falls short of the maximum, the claim that clouds will save the planet has not held up.

The problem now is projecting the future. That requires mathematical forecasting models, which are not yet up to the task.

One of the main reasons for this limitation, according to Artaxo, is that the models' resolution is still far too coarse to deal with clouds.

A cell of the planet in a model is like a pixel 50 km wide, but clouds are smaller than that, sometimes measuring 100 m or less. Scientists therefore have to simplify how clouds are represented in their mathematical climate models.
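The resolution gap Artaxo describes can be put in numbers with the figures from the text; a minimal, purely illustrative sketch:

```python
# Sketch of the scale mismatch described above: one model grid cell versus
# the clouds inside it. The figures come from the text and are illustrative.
cell_m = 50_000   # width of one grid cell (the model's "pixel"), in meters
cloud_m = 100     # width of a small cloud, in meters

per_axis = cell_m // cloud_m   # cloud-sized patches along one side of a cell
per_area = per_axis ** 2       # patches across the cell's whole area

print(per_axis)   # 500
print(per_area)   # 250000
```

A single cell would have to summarize hundreds of thousands of cloud-sized patches, which is why models fall back on simplified statistical representations of cloudiness rather than resolving individual clouds.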

COSMIC RAYS

The new report also dismisses a theory that gained traction among IPCC critics: that cloud formation might be affected by a seasonal variation in the cosmic ray flux. Championed by the Danish physicist Henrik Svensmark, the hypothesis would offer an alternative explanation for the warming, one unrelated to fossil fuels.

According to Artaxo, the text made every possible effort to leave nothing out. “You may even think one thing or another is nonsense, but it doesn't matter. You have to examine the experimental evidence to check whether something that sounds like the craziest idea in the world is true or not.”

*   *   *

23/09/2013 – 02h59

Brazil attacks the global warming ‘hiatus’

RAFAEL GARCIA

SPECIAL CORRESPONDENT IN STOCKHOLM

Over the past 15 years, the Earth's average temperature has stopped rising as fast as it did before, but that does not mean climate change is slowing down. That is the position of the Brazilian government, which will push to remove the mention of the so-called global warming “hiatus” from the most important international document on climate.

The meeting that settles the final content of the fifth report of the IPCC (Intergovernmental Panel on Climate Change) begins today in Stockholm, Sweden, when the institution's Working Group 1, the one charged with assessing the physical science of climate, convenes.

The panel's technical report is already finished, but it will be published only after the final editing of the “Summary for Policymakers”, the only annex of the report on which governments may weigh in before the release, scheduled for next Friday (the 27th).


In June, a final draft of that part of the report was sent to delegations for comment. The text, which leaked to the press, included a sentence referring to the hiatus: “The rate of warming over the past 15 years (1998-2012; 0.05°C per decade) is smaller than the trend since 1951 (1951-2012; 0.12°C per decade).”


Explorada por negacionistas da mudança climática, essa flexão no gráfico de temperatura é descrita por alguns como mero ruído estatístico numa tendência clara de aquecimento no longo prazo.

Autores do sumário não deveriam ter incluído um recorte arbitrário de 15 anos num documento que avalia um trajeto só visível em escala maior, defende um dos representantes do governo brasileiro perante o IPCC.

“Alguém escolheu para isso um determinado ano [1998] que foi um ano de El Niño e era mais quente que o padrão”, diz Gustavo Luedemann, coordenador de mudanças globais de clima do Ministério da Ciência. “Isso é obviamente um erro metodológico e deixa o documento um menos contundente.”

Luedemann, however, considers the mention of the hiatus just an isolated slip and says he believes the report is incisive in blaming humanity for climate change.

Even so, scientists are trying to understand why the temperature curve has flattened. According to Ed Hawkins, a climatologist at the University of Reading (England) who has been studying the phenomenon, there are several reasons, including a drop in solar activity and recent volcanic eruptions, which have a cooling effect on the Earth.

“The third factor is the natural variability of the climate, with important changes in the tropical Pacific,” he says. “The extra energy entering the climate system is probably being stored in the deep ocean.”

*   *   *

26/09/2013 – 02h54

Arctic Ocean will have ice-free summers by 2050, says UN report

RAFAEL GARCIA

SPECIAL ENVOY TO STOCKHOLM

When the Swedish biologist Tom Arnbom landed this summer on the coast of the Laptev Sea, in Russia, to collect walrus DNA, he noticed that the huge mammals were unusually afraid of polar bears, an animal that does not normally attack them.

After a few days on site, the scientist realized that, without sea ice to hunt from, the bears could not reach seals and their other favorite prey. The alternative was to try to snatch walrus calves, at high risk of being caught by the adults.


That is one of the impacts the decline of Arctic sea ice has caused in recent years. The accelerating pace of melting in the region is one of the main points to be revised in the next report of the IPCC (the UN’s climate panel, made up of thousands of scientists).

Tomorrow the first part of the text, devoted to the physics of climate, will be released.

“It is very likely that Arctic sea-ice cover will continue to shrink and thin,” says the document’s preliminary text. “Under scenario RCP8.5 [the report’s most pessimistic hypothesis], a nearly ice-free Arctic Ocean will likely be seen before mid-century.”

The panel states that it can say with “high confidence” that the Arctic will warm faster than other regions, and it revises its prognosis for the worse. Less than a decade ago, however, the largest report on the subject, the “Arctic Climate Impact Assessment”, estimated that the dreaded “ice-free September” would only arrive at the end of the century.

Sea ice helps reflect solar radiation. With the melting and a larger area of dark water absorbing heat, the shrinking white surface will feed back into climate change, says the IPCC.

THE MARCH OF THE WALRUSES
Arnbom, now a researcher working for the environmental NGO WWF, worked in the region for many years for the government of Sweden, a country that has 15% of its territory within the Arctic Circle.
“What I saw in the Arctic 40 years ago no longer exists. It is striking how fast it has gone. I was there every year. Some years were colder than others, and the animal populations varied, but in 2007 I realized the impacts were plainly visible.”

That year, all the offshore ice in the Laptev region melted, and walruses that used to spread out on floes of sea ice migrated to a single beach, forming a gathering of more than 50,000 individuals.


Walruses crowded together in the Arctic; melting ice reduces the area available to the animals, and calves can be crushed to death by adults packed into tight spaces

The site, far from feeding grounds rich in mollusks, was too small to hold so many animals, and many died crushed. The phenomenon repeated in 2011, with a gathering estimated at 100,000 walruses.

RUSSIA vs THE ENVIRONMENT
Bears and walruses, however, are not the only animals in the region to have changed behavior. According to WWF, the region is home to 40 traditional communities, which face scarcity in subsistence hunting and fishing.

And another group of humans, seeing new shipping routes open, now eyes the region’s economic potential for maritime transport, industrial fishing and oil – the benefit accruing precisely to the industry whose product is blamed as a cause of climate change.

The Russian giant Gazprom, which set up the region’s first offshore oil platform, had its installations in the Pechora Sea boarded by Greenpeace activists last week. Thirty environmentalists, including one Brazilian, are under arrest.

The NGO WWF has also stepped up its activity in the region. The walrus DNA collected by Arnbom will be used for research at Danish universities, but it is still held up at Russian customs.

Drought in the semi-arid region set to worsen in the coming years – and other articles related to Conclima (Fapesp)

Researchers warn of the need for urgent action to adapt to and mitigate the climate-change impacts projected for the region (photo: Fred Jordão/Acervo ASACom)

12/09/2013

By Elton Alisson

Agência FAPESP – The prolonged drought currently afflicting Brazil’s semi-arid region is set to worsen further in the coming years because of global climate change. It is therefore necessary to take urgent action to adapt to and mitigate those impacts, and to rethink which economic activities can be carried out in the region.

The assessment was made by researchers who took part in the discussions on regional development and natural disasters held on September 10 during the 1st National Conference on Global Climate Change (Conclima).

Organized by FAPESP and promoted in partnership with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), the event runs until Friday (September 13) at Espaço Apas, in São Paulo.

According to data from the National Center for Risk and Disaster Management (Cenad), in the last two years alone 1,466 alerts were registered from municipalities in the semi-arid region that declared a state of emergency or public calamity because of drought and dry spells – the most recurrent natural disasters in Brazil, according to the agency.

The First National Assessment Report of the Brazilian Panel on Climate Change (PBMC) – whose executive summary was released on Conclima’s opening day – estimates that such extreme events will increase mainly in the Amazon, Cerrado and Caatinga biomes and that the changes will intensify from mid-century through the end of the 21st century. The semi-arid region will thus suffer even more in the future from the water scarcity it faces today, the researchers warned.

“If we can already see today that the situation is serious, the models of future climate-change scenarios for Brazil indicate the problem will be even worse. So all the adaptation and mitigation actions planned for the coming years actually have to be carried out now,” said Marcos Airton de Sousa Freitas, a water-resources specialist at the National Water Agency (ANA).

According to the researcher, the semi-arid region – which spans Bahia, Sergipe, Alagoas, Pernambuco, Rio Grande do Norte, Paraíba, Ceará, Piauí and northern Minas Gerais – is now living through the second year of a drought that began in 2011 and could drag on indefinitely.

A study by the agency, based on streamflow data from the region’s river basins, found that droughts in the semi-arid region last 4.5 years on average. States such as Ceará, however, have endured droughts lasting nearly nine years, followed by long periods of below-average rainfall.

According to Freitas, the region’s main reservoirs – those holding more than 10 million cubic meters of water and able to supply the main municipalities for up to three years – are currently at around 40% of capacity on average. And the tendency through the end of this year is for them to keep emptying.

“If there is no substantial inflow of water into these large reservoirs in 2013, we could see the drought now observed in the semi-arid region, which is mostly rural, turn into an ‘urban’ drought – one that would hit the population of cities supplied by pipelines from these reservoir systems,” Freitas warned.

Adaptation actions

One adaptation action implemented in the semi-arid region in recent years which, according to the researchers, has markedly reduced the vulnerability of access to water, especially for the scattered rural population, is the One Million Cisterns Program (P1MC).

Launched in 2003 by the Brazilian Semi-Arid Articulation (ASA) – a network of more than a thousand non-governmental organizations (NGOs) working on management and policy for living with the semi-arid region – the program implements a system in the region’s rural communities whereby rainwater is captured by gutters installed on house roofs and stored in covered, half-buried cisterns. The cisterns are built from precast cement plates made by the community itself and can store up to 16,000 liters of water.

The program has helped harness rainwater in places where up to 600 millimeters fall per year – comparable to rainfall volumes in Europe – but where the water evaporates and is quickly lost without a mechanism to hold it back, the researchers noted.
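A back-of-the-envelope calculation shows why a household roof is enough catchment for such a cistern: 1 mm of rain falling on 1 m² yields 1 liter. The runoff coefficient below is an assumed illustrative figure, not a number from the program.

```python
# Rough sketch: roof area needed to fill a 16,000-litre cistern where
# ~600 mm of rain falls per year. The 0.8 runoff coefficient (fraction
# of rainfall actually captured) is an assumption for illustration.

def roof_area_needed(cistern_litres, annual_rainfall_mm, runoff_coeff=0.8):
    """Minimum roof catchment area (m²) to collect cistern_litres in a year."""
    litres_per_m2 = annual_rainfall_mm * runoff_coeff  # litres captured per m² of roof
    return cistern_litres / litres_per_m2

area = roof_area_needed(16_000, 600)
print(f"~{area:.0f} m² of roof")  # ~33 m², roughly a modest rural house roof
```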

“Even with the extreme drought in the region over the past two years, we have seen that water for consumption by the scattered rural population has been guaranteed by the program, which has already installed about 500,000 cisterns and is a policy of adaptation to extreme climate events. Together with social programs such as Bolsa Família, the One Million Cisterns program has helped soften the negative impacts of the region’s prolonged droughts,” said Saulo Rodrigues Filho, a professor at the University of Brasília (UnB).

Since water is likely to become an ever scarcer natural resource in the semi-arid region in the coming years, Rodrigues argued for rethinking which economic activities are best suited to the region.

“Agriculture may not be the most sustainable activity for the semi-arid region, and there is evidence that the region’s productive activities need to diversify rather than depend solely on family farming, which already faces labor losses, since rising education levels are leading the region’s young people to move from the countryside to the city,” Rodrigues said.

“Through more sustainable energy policies, such as solar and wind power, and by fostering activities such as crafts and tourism, it is possible to help increase these populations’ resilience to acute droughts,” he said.

Other necessary measures, Freitas noted, are reallocating water among the economic sectors that use it and selecting crops more resistant to the water scarcity the region faces.

“There are crops in the semi-arid region, such as grass for cattle feed, that depend on sprinkler irrigation. It makes no sense to keep crops that demand so much water in a region that will suffer heavily from the impacts of climate change,” Freitas said.

Diversion of the São Francisco River

The researcher also argued that the São Francisco River diversion project has become far more necessary now – given that water scarcity is set to be an ever greater problem in the semi-arid region over the coming decades – and is essential to complement the actions under way in the region to reduce the risk of water shortages.

A target of criticism and slated for completion in 2015, the project calls for São Francisco water to reach the basins of the Jaguaribe River, which supplies Ceará, and the Piranhas-Açu River, which supplies Rio Grande do Norte and Paraíba.

According to a study by ANA, funded by the World Bank with the participation of researchers from the Federal University of Ceará, among other institutions, the water availability of these two basins is expected to fall markedly in the coming years, further aggravating the semi-arid region’s water deficit.

“The São Francisco diversion has become far more necessary and should be accelerated, because it would help minimize the semi-arid region’s water deficit now, a problem expected to worsen with the projected drop in water availability in the Jaguaribe and Piranhas-Açu basins,” Freitas told Agência FAPESP.

The PBMC’s First National Assessment Report, however, indicates that the São Francisco’s flow is expected to fall by up to 30% by the end of the century, which would put the diversion project under threat.

Freitas countered that 70% of the São Francisco’s water volume comes from basins in the Southeast region, for which climate models project increased flow over the coming decades. Moreover, he said, the total volume to be transferred to the Jaguaribe and Piranhas-Açu basins corresponds to only 2% of the São Francisco basin’s mean flow.

“It is a completely different situation from the Cantareira System, for example, in which practically 90% of the water of the Piracicaba, Jundiaí and Capivari rivers is diverted to supply the São Paulo metropolitan region,” he compared.

“One can argue about the costs of the São Francisco diversion. But in terms of water needs, the project will reinforce the operation of the semi-arid region’s existing reservoir systems,” he said.

According to the researcher, water is unevenly distributed across Brazilian territory. While 48% of the total rainfall over the Amazon drains through the Amazon Basin, Freitas said, in the semi-arid region only about 7% of the water that falls during the three-to-four-month rainy season reaches the Jaguaribe and Piranhas-Açu basins. Much of that volume is then lost to evaporation. “That is why we need to store the remaining water for the months when none will be available,” he explained.

The presentations given by researchers at the conference, which ends on the 13th, will be available at: www.fapesp.br/conclima

 

Changes in Brazil’s climate through 2100

10/09/2013

By Elton Alisson*

More heat, less rain in the country’s North and Northeast and more rain in the South and Southeast are some of the projections of the National Assessment Report of the Brazilian Panel on Climate Change (photo: Eduardo Cesar/FAPESP)

Agência FAPESP – Brazil’s climate in the coming decades is set to be hotter, with a gradual, region-dependent rise in mean temperature of between 1 °C and 6 °C by 2100 across the whole country, compared with temperatures recorded at the end of the 20th century.

Over the same period, rainfall should also decline significantly across much of the central, North and Northeast regions. In the South and Southeast, by contrast, precipitation will increase.

These are conclusions of the First National Assessment Report (RAN1) of the Brazilian Panel on Climate Change (PBMC), whose executive summary was released on Monday (September 9) during the 1st National Conference on Global Climate Change (Conclima). Organized by FAPESP and promoted with the Brazilian Research Network on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), the event runs until Friday (September 13) at Espaço Apas, in São Paulo.

According to the report, since climate change and its impacts on populations and economic sectors over the coming years will not be uniform across the country, Brazil needs to take regional differences into account when designing adaptation and mitigation actions and agricultural, energy-generation and water-supply policies for these different regions.

Divided into three parts, Report 1 – now in the final stage of preparation – presents regionalized projections of the climate changes expected in Brazil’s six biomes through 2100 and indicates their estimated impacts and possible ways of mitigating them.

The projections are based on reviews of studies conducted between 2007 and early 2013 by 345 researchers from many fields, members of the PBMC, and on scientific results from global and regional climate modeling.

“The Report is being prepared along the same lines as the reports published by the Intergovernmental Panel on Climate Change [IPCC], which does not conduct research but assesses studies already published,” said José Marengo, a researcher at the National Institute for Space Research (Inpe) and coordinator of the meeting.

“After much work and interaction, we arrived at the main results of the three working groups [Scientific basis of climate change; Impacts, vulnerabilities and adaptation; and Mitigation of climate change],” he noted.

Main conclusions

One of the report’s conclusions is that extreme events of prolonged drought, especially in the Amazon, Cerrado and Caatinga biomes, are set to increase, and these changes should intensify from mid-century to the end of the 21st century.

Temperatures in the Amazon are projected to rise progressively by 1 °C to 1.5 °C through 2040 – with a 25% to 30% drop in rainfall –, by 3 °C to 3.5 °C between 2041 and 2070 – with rainfall down 40% to 45% –, and by 5 °C to 6 °C between 2071 and 2100.

While the climate shifts associated with global change may compromise the biome in the long run, present-day deforestation driven by intense land use poses a more immediate threat to the Amazon, the report’s authors point out.

The researchers stress that observational and numerical-modeling studies suggest that, should deforestation in the region reach 40% in the future, there would be a drastic change in the hydrological cycle, with rainfall down 40% from July through November – which would lengthen the dry season and warm the biome’s surface by up to 4 °C.

Regional changes driven by deforestation would thus add to those from global change, creating conditions conducive to the savannization of the Amazon – a problem that tends to be most critical in the eastern part of the region, the researchers note.

“The projections will allow a better analysis of this Amazon savannization problem which, we now realize, may occur at particular points of the forest rather than across the biome as a whole, as some studies had predicted,” noted Tércio Ambrizzi, one of the coordinating authors of the executive summary of the working group on the scientific basis of climate change.

Caatinga temperatures are also projected to rise by 0.5 °C to 1 °C, with rainfall in the biome falling 10% to 20%, through 2040. Between 2041 and 2070 the region’s climate should become 1.5 °C to 2.5 °C warmer and rainfall should decline 25% to 35%. By the end of the century the biome is projected to warm progressively by 3.5 °C to 4.5 °C, with rainfall down 40% to 50%. Such changes could trigger the biome’s desertification.

In turn, Cerrado temperatures are projected to rise 5 °C to 5.5 °C, with rainfall down 35% to 45%, by 2100. In the Pantanal, warming should reach 3.5 °C to 4.5 °C by the end of the century, with a sharp drop in rainfall of 35% to 45%.

For the Atlantic Forest, because the biome stretches from the country’s South through the Southeast and up to the Northeast, the projections point to two distinct regimes of climate change.

In the Northeastern portion, a relatively modest temperature rise – 0.5 °C to 1 °C – and a decline in precipitation (rainfall) of around 10% are expected through 2040. Between 2041 and 2070, the region’s climate should warm by 2 °C to 3 °C, with rainfall down 20% to 25%. For the end of the century – 2071 to 2100 – conditions of intense warming are estimated, with temperatures up 3 °C to 4 °C and rainfall down 30% to 35%.

In the South and Southeast portions, the projections indicate a relatively modest warming of 0.5 °C to 1 °C through 2040, with rainfall up 5% to 10%. Between 2041 and 2070, the trends of a gradual 1.5 °C to 2 °C warming and a 15% to 20% increase in rainfall should hold.

Those trends should intensify further at the end of the century, when the climate is projected to be 2.5 °C to 3 °C warmer and 25% to 30% rainier.

Finally, for the Pampa, the projections indicate that through 2040 the region’s climate will be 5% to 10% rainier and up to 1 °C warmer. Between 2041 and 2070, the biome should warm by 1 °C to 1.5 °C, with rainfall intensifying 15% to 20%. Projections for 2071 to 2100 are harsher still, with warming of 2.5 °C to 3 °C and rainfall 35% to 40% above normal.
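The per-biome, per-period figures above can be collected into a small lookup structure; this is just one illustrative way to organize the numbers as reported here (only the biomes with figures for all three periods are included for brevity).

```python
# Projections transcribed from the RAN1 summary as reported in this article.
# Ranges are (low, high): warming in °C and rainfall change in percent.
PROJECTIONS = {
    "Amazon": {
        "by 2040":   {"temp_c": (1.0, 1.5), "rain_pct": (-30, -25)},
        "2041-2070": {"temp_c": (3.0, 3.5), "rain_pct": (-45, -40)},
        "2071-2100": {"temp_c": (5.0, 6.0), "rain_pct": None},  # rainfall figure not given
    },
    "Caatinga": {
        "by 2040":   {"temp_c": (0.5, 1.0), "rain_pct": (-20, -10)},
        "2041-2070": {"temp_c": (1.5, 2.5), "rain_pct": (-35, -25)},
        "2071-2100": {"temp_c": (3.5, 4.5), "rain_pct": (-50, -40)},
    },
    "Pampa": {
        "by 2040":   {"temp_c": (0.0, 1.0), "rain_pct": (5, 10)},   # "up to 1 °C"
        "2041-2070": {"temp_c": (1.0, 1.5), "rain_pct": (15, 20)},
        "2071-2100": {"temp_c": (2.5, 3.0), "rain_pct": (35, 40)},
    },
}

def warming_by_2100(biome):
    """Upper-bound warming projected for a biome at the end of the century."""
    return PROJECTIONS[biome]["2071-2100"]["temp_c"][1]

print(warming_by_2100("Amazon"))  # 6.0
```

A structure like this makes the North-versus-South contrast easy to query: rainfall ranges are negative for the Amazon and Caatinga and positive for the Pampa in every period.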

“What we see, broadly, is that in Brazil’s North and Northeast the trend is toward rising temperature and decreasing rainfall over the century,” Ambrizzi summarized.

“In the regions further south that trend flips: there is a tendency toward both rising temperature – though less intense – and rising precipitation,” he compared.

Impacts and adaptation

The climate-driven shifts in precipitation patterns across the country’s regions will have direct impacts on agriculture, on power generation and distribution and on regional water resources, since water should become scarcer in the North and Northeast and more abundant in the South and Southeast, the researchers warn.

It will therefore be necessary to develop region-specific adaptation and mitigation actions and to review investment decisions, such as the construction of hydroelectric dams in the eastern Amazon, where river flows could fall by as much as 20%, the researchers cautioned.

“These varied impacts show that any planned strategy for power generation in the eastern Amazon is under threat, because there is a series of fragilities,” said Eduardo Assad, a researcher at the Brazilian Agricultural Research Corporation (Embrapa).

“There will be water to count on. But until when, and where to find it in these regions, are open questions,” said the researcher, one of the coordinators of the report’s Working Group 2, on Impacts, vulnerabilities and adaptation.

According to Assad, climate-change adaptation in Brazil is very costly because of the fragilities the country presents both in natural terms – with widely varying landscapes – and in socioeconomic terms.

“Most of the Brazilian population – especially the part living along the country’s coasts – is vulnerable to climate-change impacts. Solving that will not be at all easy,” he estimated.

Among the country’s economic sectors, according to Assad, agriculture is one of the few already moving to adapt to the impacts of climate change.

“We have been working on adaptation for more than eight years. It is possible to develop cultivars tolerant of high temperatures or of water deficits [in soils],” Assad said.

The researcher also stressed that the population groups with the worst income, education and housing conditions will suffer the impacts of climate change most intensely. “We will have to make quick decisions to keep tragedies from happening.”

Mitigation

Mercedes Bustamante, a professor at the University of Brasília (UnB) and one of the coordinators of Working Group 3, on Mitigation of Climate Change, presented a synthesis of studies and research on the topic, identifying knowledge gaps and future directions in a global-warming scenario.

Bustamante noted that the drop in deforestation between 2005 and 2010 – from 2.03 billion tonnes of CO2 equivalent to 1.25 billion tonnes – has already had positive effects in reducing greenhouse gas (GHG) emissions from land use.
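As a quick sanity check on the figures quoted above, the drop from 2.03 to 1.25 billion tonnes of CO2 equivalent works out to a reduction of roughly 38%:

```python
# Relative reduction in land-use emissions between 2005 and 2010,
# using the figures quoted in the article (billion tonnes CO2e).
before, after = 2.03, 1.25
reduction_pct = (before - after) / before * 100
print(f"{reduction_pct:.1f}% reduction")  # about 38%
```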

“Emissions from energy generation and agriculture, however, rose in both absolute and relative terms, signaling a shift in the profile of Brazil’s emissions,” she said.

If current policies hold, emissions from the energy and transport sectors are projected to rise 97% by 2030. Greater energy efficiency, more technological innovation and policies encouraging the use of renewable energy will be needed to reverse that picture.

In transport, the recommendations range from transforming the modal mix – heavily based on road transport – to the adoption of new fuel technologies. “We need to shift from individual to collective transport, investing, for example, in waterway systems and in electric and hybrid vehicles,” Bustamante stressed.

The new GHG emissions profile shows a growing share of methane – from livestock – and of nitrous oxide – linked to fertilizer use. “Despite these results, agriculture has advanced in developing mitigation and adaptation strategies,” she noted.

For industry, responsible for 4% of GHG emissions, the list of mitigation recommendations includes recycling, the use of renewable biomass and energy cogeneration, among others.

Climate-change mitigation strategies also require a review of urban planning to ensure the sustainability of buildings as well – controlling, for example, timber consumption and ensuring greater energy efficiency in construction.

Information for society

The researchers who drafted the report highlighted, among the document’s virtues, that it gathers previously scattered data from scientific studies carried out in Brazil over recent years and gives society and decision-makers credible technical-scientific information capable of supporting the development of adaptation and mitigation strategies for the possible impacts of climate change.

“We scientists face the challenge of conveying the seriousness and gravity of the moment and the opportunities that global climate change holds for society. We know that inaction is the least intelligent action society can take,” said Paulo Nobre, coordinator of Rede Clima.

In turn, Celso Lafer, president of FAPESP, stressed at the event’s opening that the Foundation has a special interest in climate-change research, expressed in the FAPESP Research Program on Global Climate Change (PFPMCG), which the institution maintains.

“One of FAPESP’s basic concerns is to research and ascertain the impact of global climate change on what affects the specificities of Brazil and of the State of São Paulo,” he said.

Also at the opening were Bruno Covas, São Paulo State Secretary for the Environment; Carlos Nobre, Secretary for Research and Development Policies and Programs (Seped) at the Ministry of Science, Technology and Innovation (MCTI); and Paulo Artaxo, a member of the PFPMCG coordination.

Carlos Nobre stressed that the report will be the main source of information guiding the National Climate Change Plan, which is currently under revision.

“It is very important that this study’s results guide the work in Brasília and across Brazil at a critical moment of reorienting national policy, which must move toward making the economy, society and the environment more resilient to the inevitable climate changes ahead,” he said.

According to him, Brazil has already signaled its commitment to mitigation, embodied in the National Climate Change Policy, which calls for emissions reductions of 10% and 15% in 2010 and 2020, respectively, relative to 2005.

“São Paulo launched an ambitious program in 2009 to cut emissions by 20%, since land-use change is not such an important issue in the State; what matters there is technological progress in energy generation and in production processes. Brazil is the only developing country with voluntary emissions-reduction targets.”

He noted, however, that “adaptation has been left unattended”. “It is not only about mitigating; we also need to adapt to climate change. The three research networks – Rede Clima, the INCT and FAPESP – are advancing on adaptation, which is the guide to sustainable development.”

* With contributions from Claudia Izique and Noêmia Lopes

 

A model for understanding climate variations in Brazil

10/09/2013

By Noêmia Lopes

The Brazilian Earth System Model is considered fundamental for climate research (photo: Nasa)

Agência FAPESP – Deforestation in the Amazon, fires and the processes of interaction between the Atlantic Ocean and the atmosphere are some of the climate questions particular to Brazil that the Brazilian Earth System Model (BESM) takes into account and intends to capture in a way that even the world’s best models cannot.

The model was presented in detail on Monday (September 9) at the opening of the 1st National Conference on Global Climate Change, held in São Paulo.

Determining how much the climate has already changed, how much its characteristics will still change and where that will happen is the basis for planning public policies of adaptation to the impacts of global climate change, said Paulo Nobre, general coordinator of the BESM and of the Brazilian Research Network on Global Climate Change (Rede Clima), at the event’s opening.

“The BESM is a structuring axis of climate-change research in Brazil and provides input for the FAPESP Research Program on Global Climate Change (PFPMCG), for Rede Clima and for the National Institute of Science and Technology for Climate Change (INCT-MC),” Nobre said.

Intended to support predictions of the consequences of global warming and the resulting increase in the frequency of extreme events, the BESM runs on the Rede Clima/PFPMCG supercomputer Tupã, bringing together atmospheric, oceanic, land-surface and, soon, chemical fluxes. Acquired by the Ministry of Science, Technology and Innovation (MCTI) with FAPESP support, Tupã is installed at the Cachoeira Paulista unit of the National Institute for Space Research (Inpe).

Although not yet in its final, complete version, the model can already, for example, reproduce the occurrence of recent El Niños and estimate the phenomenon’s return, explain rainfall in the South Atlantic Convergence Zone and even estimate the variation of ice across the globe satisfactorily.

“The most recent scenarios, from 2013, are available on the supercomputer. We are taking steps to open access to the general public by the end of the year,” Nobre said.

Thirty researchers are directly involved in developing the BESM and another 40 are involved indirectly. According to Nobre, that number should double in five years and double again in ten. “For that, we need more and more young PhDs to join this challenge.”

Nobre also stressed the grave consequences that the increasing frequency of extreme events – such as droughts, floods and tornadoes – has on people’s lives: “Our cities and our fields were not designed to live with this new climate, which makes Brazil’s entry into climate-change modeling essential,” he said.

Os quatro componentes

Iracema Cavalcanti, pesquisadora do Inpe, apresentou, durante o primeiro dia da conferência, os resultados já obtidos a partir do componente atmosférico do BESM.

Compared with observed data, the model achieves good simulations of seasonal variability (climatological changes across the seasons), moisture fluxes, interannual variability (differences between pressure anomalies), and precipitation variability, among others. “There are still deficiencies, as in most global models. But the general characteristics are captured,” said Cavalcanti.

As for the oceans, they cannot be separated from what happens in the atmospheric model, which is why coupled modeling is strategic for the BESM to work efficiently. That is the assessment of Leo Siqueira, also of Inpe, who presented the challenges of this component.

“We already have good representations, but we want to improve the simulations of ocean temperatures, especially in the tropics, and of sea ice. We also want to open a healthy discussion about which land-ice model will be adopted within the BESM,” said Siqueira.

The modeling of the third component, the land surface, was presented by Marcos Heil Costa of the Universidade Federal de Viçosa (UFV). “The main function of a model that integrates surface processes is to provide fluxes of energy and water vapor between the vegetation and the atmosphere. That is the basics. But improvements to the system over recent years have allowed other processes to be incorporated,” he said.

Some of these processes include: fluxes of radiation, energy, and mass; terrestrial carbon and nitrogen cycles; recovery of abandoned areas; fires in natural vegetation; agricultural crops; river discharge and seasonally flooded areas; human land use (deforestation); specific representation of ecosystems; and soil fertility, among others.

Chemistry is the fourth and most recent of the components. “Without a chemical model, the others need constant adjustment,” said Sérgio Correa, a researcher at the Universidade Estadual do Rio de Janeiro (UERJ). Correa leads studies on atmospheric chemistry that will refine the BESM’s data once this component is coupled to the model, one of the next planned phases.

Collaborations with researchers at foreign institutions, along with training programs, are other strategies under way to improve the representation of Brazil and South America, with global reach as well.

More information about the 1st National Conference on Global Climate Change: www.fapesp.br/conclima

The Myth of ‘Environmental Catastrophism’ (Monthly Review)

Between October 2010 and April 2012, over 250,000 people, including 133,000 children under five, died of hunger caused by drought in Somalia. Millions more survived only because they received food aid. Scientists at the UK Met Centre have shown that human-induced climate change made this catastrophe much worse than it would otherwise have been.1

This is only the beginning: the United Nations’ 2013 Human Development Report says that without coordinated global action to avert environmental disasters, especially global warming, the number of people living in extreme poverty could increase by up to 3 billion by 2050.2 Untold numbers of children will die, killed by climate change.

If a runaway train is bearing down on children, simple human solidarity dictates that anyone who sees it should shout a warning, that anyone who can should try to stop it. It is difficult to imagine how anyone could disagree with that elementary moral imperative.

And yet some do. Increasingly, activists who warn that the world faces unprecedented environmental danger are accused of catastrophism—of raising alarms that do more harm than good. That accusation, a standard feature of right-wing attacks on the environmental movement, has recently been advanced by some left-wing critics as well. While they are undoubtedly sincere, their critique of so-called environmental catastrophism does not stand up to scrutiny.

From the Right…

The word “catastrophism” originated in nineteenth-century geology, in the debate between those who believed all geological change had been gradual and those who believed there had been episodes of rapid change. Today, the word is most often used by right-wing climate change deniers for whom it is a synonym for “alarmism.”

  • The Heartland Institute: “Climate Catastrophism Picking Up Again in the U.S. and Across the World.”3
  • A right-wing German blog: “The Climate Catastrophism Cult.”4
  • The Australian journal Quadrant: “The Chilling Costs of Climate Catastrophism.”5

Examples could be multiplied. As environmental historian Franz Mauelshagen writes, “In climate denialist circles, the word ‘climate catastrophe’ has become synonymous with ‘climate lie,’ taking the anthropogenic greenhouse effect for a scam.”6

Those who hold such views like to call themselves “climate change skeptics,” but a more accurate term is “climate science deniers.” While there are uncertainties about the speed of change and its exact effects, there is no question that global warming is driven by greenhouse-gas emissions caused by human activity, and that if business as usual continues, temperatures will reach levels higher than any seen since before human beings evolved. Those who disagree are not skeptical, they are denying the best scientific evidence and analysis available.

The right labels the scientific consensus “catastrophism” to belittle environmentalism, and to stifle consideration of measures to delay or prevent the crisis. The real problem, they imply, is not the onrushing train, but the people who are yelling “get off the track!” Leaving the track would disrupt business as usual, and that is to be avoided at all costs.

…And From the Left

Until very recently, “catastrophism” as a political expression was pretty much the exclusive property of conservatives. When it did occur in left-wing writing, it referred to economic debates, not ecology. But in 2007 two quite different left-wing voices almost simultaneously adopted “catastrophism” as a pejorative term for radical ideas about climate change they disagreed with.

The most prominent was the late Alexander Cockburn, who in 2007 was writing regularly for The Nation and coediting the newsletter CounterPunch. To the shock of many of his admirers, he declared that “There is still zero empirical evidence that anthropogenic production of CO2 is making any measurable contribution to the world’s present warming trend,” and that “the human carbon footprint is of zero consequence.”7 Concern about climate change was, he wrote, the result of a conspiracy “between the Greenhouser fearmongers and the nuclear industry, now largely owned by oil companies.”8

Like critics on the right, Cockburn charged that the left was using climate change to sneak through reforms it could not otherwise win: “The left has bought into environmental catastrophism because it thinks that if it can persuade the world that there is indeed a catastrophe, then somehow the emergency response will lead to positive developments in terms of social and environmental justice.”9

While Cockburn’s assault on “environmental catastrophism” was shocking, his arguments did not add anything new to the climate debate. They were the same criticisms we had long heard from right-wing deniers, albeit with leftish vocabulary.

That was not the case with Leo Panitch and Colin Leys. These distinguished Marxist scholars are by no means deniers. They began their preface to the 2007 Socialist Register by noting that “environmental problems might be so severe as to potentially threaten the continuation of anything that might be considered tolerable human life,” and insisting that “the speed of development of globalized capitalism, epitomized by the dramatic acceleration of climate change, makes it imperative for socialists to deal seriously with these issues now.”

But then they wrote: “Nonetheless, it is important to try to avoid an anxiety-driven ecological catastrophism, parallel to the kind of crisis-driven economic catastrophism that announces the inevitable demise of capitalism.”10 They went on to argue that capitalism’s “dynamism and innovativeness” might enable it to use “green commerce” to escape environmental traps.

The problem with the Panitch–Leys formulation is that the threat of ecological catastrophe is not “parallel” to the view that capitalism will destroy itself. The desire to avoid the kind of mechanical determinism that has often characterized Marxist politics, where every crisis was proclaimed to be the final battle, led these thoughtful writers to confuse two very different kinds of catastrophe.

The idea that capitalism will inevitably face an insurmountable economic crisis and collapse is based on a misunderstanding of Marxist economic theory. While economic crises are endemic to capitalism, the system can always continue—only class struggle, only a social revolution, can overthrow capitalism and end the crisis cycle.

Large-scale environmental damage is caused by our destructive economic system, but its effect is the potentially irreversible disruption of essential natural systems. The most dramatic example is global warming: recent research shows that the earth is now warmer than at any time in the past 6,000 years, and temperatures are rising much faster than at any time since the last Ice Age. Arctic ice and the Greenland ice sheet are disappearing faster than predicted, raising the specter of flooding in coastal areas where more than a billion people live. Extreme weather events, such as giant storms, heat waves, and droughts are becoming ever more frequent. So many species are going extinct that many scientists call it a mass extinction event, comparable to the time 66 million years ago when 75 percent of all species, including the dinosaurs, were wiped out.

As the editors of Monthly Review wrote in reply to Socialist Register, if these trends continue, “we will be faced with a different world—one in which life on the planet will be massively degraded on a scale not seen for tens of millions of years.”11 To call this “anxiety-driven ecological catastrophism, parallel to…economic catastrophism” is to equate an abstract error in economic theory with some of the strongest conclusions of modern science.

A New ‘Catastrophism’ Critique

Now a new essay, provocatively titled “The Politics of Failure Have Failed,” offers a different and more sweeping left-wing critique of “environmental catastrophism.” Author Eddie Yuen is associated with the Pacifica radio program Against the Grain, and is on the editorial board of the journal Capitalism Nature Socialism.

His paper is part of a broader effort to define and critique a body of political thought called Catastrophism, in a book by that title.12 In the book’s introduction, Sasha Lilley offers this definition:

Catastrophism presumes that society is headed for a collapse, whether economic, ecological, social, or spiritual. This collapse is frequently, but not always, regarded as a great cleansing, out of which a new society will be born. Catastrophists tend to believe that an ever-intensified rhetoric of disaster will awaken the masses from their long slumber—if the mechanical failure of the system does not make such struggles superfluous. On the left, catastrophism veers between the expectation that the worse things become, the better they will be for radical fortunes, and the prediction that capitalism will collapse under its own weight. For parts of the right, worsening conditions are welcomed, with the hope they will trigger divine intervention or allow the settling of scores for any modicum of social advance over the last century.

A political category that includes both the right and the left—and that encompasses people whose concerns might be economic, ecological, social, or spiritual—is, to say the least, unreasonably broad. It is difficult to see any analytical value in a definition that lumps together anarchists, fascists, Christian fundamentalists, right-wing conspiracy nuts, pre–1914 socialists, peak-oil theorists, obscure Trotskyist groups, and even Mao Zedong.

The definition of catastrophism becomes even more problematic in Yuen’s essay.

One Of These Things Is Not Like The Others…

Years ago, the children’s television program Sesame Street would display four items—three circles and a square, three horses and a chair, and so on—while someone sang, “One of these things is not like the others, One of these things doesn’t belong.”

I thought of that when I read Yuen’s essay.

While the book’s scope is broad, most of it focuses, as Yuen writes, on “instrumental, spurious, and sometimes maniacal versions of catastrophism—including rightwing racial paranoia, religious millenarianism, liberal panics over fascism, leftist fetishization of capitalist collapse, capitalist invocation of the ‘shock doctrine’ and pop culture cliché.”

But as Yuen admits in his first paragraph, environmentalism is a very different matter, because we are in “what is unquestionably a genuine catastrophic moment in human and planetary history…. Of all of the forms of catastrophic discourse on offer, the collapse of ecological systems is unique in that it is definitively verified by a consensus within the scientific community…. It is absolutely urgent to address this by effectively and rapidly changing the direction of human society.”

If the science is clear, if widespread ecological collapse unquestionably faces us unless action is taken, why is this topic included in a book devoted to criticizing false ideas? Does it make sense to use the same term for people who believe in an imaginary train crash and for people who are trying to stop a real crash from happening?

The answer, although he does not say so, is that Yuen is using a different definition than the one Lilley gave in her introduction. Her version used the word for the belief that some form of catastrophe will have positive results—that capitalism will collapse from internal contradictions, that God will punish all sinners, that peak oil or industrial collapse will save the planet. Yuen uses the same word for the idea that environmentalists should alert people to the threat of catastrophic environmental change and try to mobilize them to prevent or minimize it.

Thus, when he refers to “a shrill note of catastrophism” in the work of James Hansen, perhaps the world’s leading climate scientist, he is not challenging the accuracy of Hansen’s analysis, but only the “narrative strategy” of clearly stating the probable results of continuing business as usual.

Yuen insists that “the veracity of apocalyptic claims about ecological collapse are separate from their effects on social, political, and economic life.” Although “the best evidence points to cascading environmental disaster,” in his view it is self-defeating to tell people that. He makes two arguments, which we can label “practical” and “principled.”

His practical argument is that by talking about “apocalyptic scenarios” environmentalists have made people more apathetic, less likely to fight for progressive change. His principled argument is that exposing and campaigning to stop tendencies towards environmental collapse has “damaging and rightward-leaning effects”—it undermines the left, promotes reactionary policies and strengthens the ruling class.

In my opinion, he is wrong on both counts.

The Truth Shall Make You Apathetic?

In Yuen’s view, the most important question facing people who are concerned about environmental destruction is: “what narrative strategies are most likely to generate effective and radical social movements?”

He is vague about what “narrative strategies” might work, but he is very firm about what does not. He argues that environmentalists have focused on explaining the environmental crisis and warning of its consequences in the belief that this will lead people to rise up and demand change, but this is a fallacy. In reality, “once convinced of apocalyptic scenarios, many Americans become more apathetic.”

Given such a sweeping assertion, it is surprising to find that the only evidence Yuen offers is a news release describing one academic paper, based on a U.S. telephone survey conducted in 2008, that purported to show that “more informed respondents both feel less personally responsible for global warming, and also show less concern for global warming.”13

Note first that being “more informed” is not the same as being “convinced of apocalyptic scenarios” or being bombarded with “increasingly urgent appeals about fixed ecological tipping points.” On the face of it, this study does not appear to contribute to our understanding of the effects of “catastrophism.”

What’s more, reading the original paper reveals that the people described as “more informed” were self-reporting. If they said they were informed, that was accepted, and no one asked if they were listening to climate scientists or to conservative talk radio. That makes the paper’s conclusion meaningless.

Later in his essay, Yuen correctly criticizes some environmentalists and scientists who “speak of ‘everyone’ as a unified subject.” But here he accepts as credible a study that purports to show how all Americans respond to information about climate change, regardless of class, gender, race, or political leanings.

The problem with such undifferentiated claims is shown in a 2011 study that examined the impact of Americans’ political opinions on their feelings about climate change. It found that liberals and Democrats who report being well-informed are more worried about climate change, while conservatives and Republicans who report being well-informed are less worried.14 Obviously the two groups mean very different things by “well-informed.”

Even if we ignore that, the study Yuen cites is a one-time snapshot—it does not tell us what radicals really need to know, which is how things are changing. For that, a more useful survey is one that scientists at Yale University and George Mason University have conducted seven times since 2008 to show shifts in U.S. public opinion.15 Based on answers to questions about their opinions, respondents are categorized according to their attitude towards global warming. The surveys show:

  • The number of people identified as “Disengaged” or “Cautious”—those we might call apathetic or uncertain—has varied very little, accounting for between 31 percent and 35 percent of the respondents every time.
  • The categories “Dismissive” or “Doubtful”—those who lean towards denial—increased between 2008 and 2010. Since then, those groups have shrunk back almost to the 2008 level.
  • In parallel, the combined “Concerned” and “Alarmed” groups shrank between 2008 and 2010, but have since largely recovered. In September 2012—before Hurricane Sandy!—there were more than twice as many Americans in these two categories as in Dismissive/Doubtful.

Another study, published in the journal Climatic Change, used seventy-four independent surveys conducted between 2002 and 2011 to create a Climate Change Threat Index (CCTI)—a measure of public concern about climate change—and showed how it changed in response to public events. It found that public concern about climate change reached an all-time high in 2006–2007, when the Al Gore documentary An Inconvenient Truth was seen in theaters by millions of people and won an Academy Award.

The authors conclude: “Our results…show that advocacy efforts produce substantial changes in public perceptions related to climate change. Specifically, the film An Inconvenient Truth and the publicity surrounding its release produced a significant positive jump in the CCTI.”16

This directly contradicts Yuen’s view that more information about climate change causes Americans to become more apathetic. There is no evidence of a long-term increase in apathy or decrease in concern—and when scientific information about climate change reached millions of people, the result was not apathy but a substantial increase in support for action to reduce greenhouse gas emissions.

‘The Two Greatest Myths’

Yuen says environmentalists have deluged Americans with catastrophic warnings, and this strategy has produced apathy, not action. Writing of establishment politicians who make exactly the same claim, noted climate change analyst Joseph Romm says, “The two greatest myths about global warming communications are 1) constant repetition of doomsday messages has been a major, ongoing strategy and 2) that strategy doesn’t work and indeed is actually counterproductive!” Contrary to liberal mythology, the North American public has not been exposed to anything even resembling the first claim. Romm writes,

The broad American public is exposed to virtually no doomsday messages, let alone constant ones, on climate change in popular culture (TV and the movies and even online)…. The major energy companies bombard the airwaves with millions and millions of dollars of repetitious pro-fossil-fuel ads. The environmentalists spend far, far less money…. Environmentalists when they do appear in popular culture, especially TV, are routinely mocked…. It is total BS that somehow the American public has been scared and overwhelmed by repeated doomsday messaging into some sort of climate fatigue.17

The website Daily Climate, which tracks U.S. news stories about climate change, says coverage peaked in 2009, during the Copenhagen talks—but then it “fell off the map,” dropping 30 percent in 2010 and another 20 percent in 2011. In 2012, despite widespread droughts and Hurricane Sandy, news coverage fell another 2 percent. The decline in editorial interest was even more dramatic—in 2012 newspapers published fewer than half as many editorials about climate change as they did in 2009.18

It should be noted that these shifts occurred in the framework of very limited news coverage of climate issues. As a leading media analyst notes, “relative to other issues like health, medicine, business, crime and government, media attention to climate change remains a mere blip.”19 Similarly, a British study describes coverage of climate change in newspapers there as “lamentably thin”—a problem exacerbated by the fact that much of the coverage consists of “worryingly persistent climate denial stories.” The author concludes drily: “The limited coverage is unlikely to have convinced readers that climate change is a serious problem warranting immediate, decisive and potentially costly action.”20

Given Yuen’s concern that Americans do not recognize the seriousness of environmental crises, it is surprising how little he says about the massive fossil-fuel-funded disinformation campaigns that have confused and distorted media reporting. I can find just four sentences on the subject in his 9,000-word text, and not one that suggests denialist campaigns might have helped undermine efforts to build a climate change movement.

On the contrary, he downplays the influence of “the well-funded climate denial lobby,” by claiming that “far more corporate and elite energy has gone toward generating anxiety about global warming,” and that “mainstream climate science is much better funded.” He provides no evidence for either statement.

Of course, the fossil-fuel lobby is not the only force working to undermine public concern about climate change. It is also important to recognize the impact of Obama’s predictable unwillingness to confront the dominant forces in U.S. capitalism, and of the craven failure of mainstream environmentalist groups and NGOs to expose and challenge the Democrats’ anti-environmental policies.

With fossil-fuel denialists on one side, and Obama’s pale-green cheerleaders on the other, activists who want to get out the truth have barely been heard. In that context, it makes little sense to blame environmentalists for sabotaging environmentalism.

The Truth Will Help the Right?

Halfway through his essay, Yuen abruptly changes direction, leaving the practical argument behind and raising his principled concern. He now argues that what he calls catastrophism leads people to support reactionary policies and promotes “the most authoritarian solutions at the state level.” Focusing attention on what he agrees is a “cascading environmental disaster” is dangerous because it “disables the left but benefits the right and capital.” He says, “Increased awareness of environmental crisis will not likely translate into a more ecological lifestyle, let alone an activist orientation against the root causes of environmental degradation. In fact, right-wing and nationalist environmental politics have much more to gain from an embrace of catastrophism.”

Yuen says that many environmentalists, including scientists, “reflexively overlook class divisions,” and so do not realize that “some business and political elites feel that they can avoid the worst consequences of the environmental crisis, and may even be able to benefit from it.” Yuen apparently thinks those elites are right—while the insurance industry is understandably worried about big claims, he says, “the opportunities for other sectors of capitalism are colossal in scope.”

He devotes much of the rest of his essay to describing the efforts of pro-capitalist forces, conservative and liberal, to use concern about potential environmental disasters to promote their own interests, ranging from emissions trading schemes to military expansion to Malthusian attacks on the world’s poorest people. “The solution offered by global elites to the catastrophe is a further program of austerity, belt-tightening, and sacrifice, the brunt of which will be borne by the world’s poor.”

Some of this is overstated. His claim that “Malthusianism is at the core of most environmental discourse” reflects either a very limited view of environmentalism or an excessively broad definition of Malthusianism. And he seems to endorse David Noble’s bizarre theory that public concern about global warming has been engineered by a corporate conspiracy to promote carbon trading schemes.21 Nevertheless he is correct that the ruling class will do its best to profit from concern about climate change, while simultaneously offloading the costs onto the world’s poorest people.

The question is, who is he arguing with? This book says it aims to “spur debate among radicals,” but none of this is new or controversial for radicals. The insight that the interests of the ruling class are usually opposed to the interests of the rest of us has been central to left-wing thought since before Marx was born. Capitalists always try to turn crises to their advantage no matter who gets hurt, and they always try to offload the costs of their crises onto the poor and oppressed.

What needs to be proved is not that pro-capitalist forces are trying to steer the environmental movement into profitable channels, and not that many sincere environmentalists have backward ideas about the social and economic causes of ecological crises. Radicals who are active in green movements know those things perfectly well. What needs to be proved is Yuen’s view that warning about environmental disasters and campaigning to prevent them has “damaging and rightward-leaning effects” that are so severe that radicals cannot overcome them.

But no proof is offered.

What is particularly disturbing about his argument is that he devotes pages to describing the efforts of reactionaries to misdirect concern about climate change—and none to the efforts of radical environmentalists to counter those forces. Earlier in his essay, he mentioned that “environmental and climate justice perspectives are steadily gaining traction in internal environmental debates,” but those thirteen words are all he has to say on the subject.

He says nothing about the historic 2010 Cochabamba Conference, where 30,000 environmental activists from 140 countries warned that if greenhouse gas emissions are not stopped, “the damages caused to our Mother Earth will be completely irreversible”—a statement Yuen would doubtless label “catastrophist.” Far from succumbing to apathy or reactionary policies, the participants explicitly rejected market solutions, identified capitalism as the cause of the crisis, and outlined a radical program to transform the global economy.

He is equally silent about the campaign against the fraudulent “green economy” plan adopted at last year’s Rio+20 conference. One of the principal organizers of that opposition is La Via Campesina, the world’s largest organization of peasants and farmers, which warns that the world’s governments are “propagating the same capitalist model that caused climate chaos and other deep social and environmental crises.”

His essay contains not a word about Idle No More, or Occupy, or the Indigenous-led fight against Canada’s tar sands, or the anti-fracking and anti-coal movements. By omitting them, Yuen leaves the false impression that the climate movement is helpless to resist reactionary forces.

Contrary to Yuen’s title, the effort to build a movement to save the planet has not failed. Indeed, Catastrophism was published just four months before the largest U.S. climate change demonstration ever!

The question before radicals is not what “narrative strategy” to adopt, but rather, how will we relate to the growing environmental movement? How will we support its goals while strengthening the forces that see the need for more radical solutions?

What Must Be Done?

Yuen opposes attempts to build a movement around rallies, marches, and other mass protests to get out the truth and to demand action against environmental destruction. He says that strategy worked in the 1960s, when Americans were well-off and naïve, but cannot be replicated in today’s “culture of atomized cynicism.”

Like many who know that decade only from history books or as distant memories, Yuen foreshortens the experience: he knows about the mass protests and dissent late in the decade, but ignores the many years of educational work and slow movement building in a deeply reactionary and racist time. It is not predetermined that the campaign against climate change will take as long as those struggles, or take similar forms, but the real experience of the 1960s should at least be a warning against premature declarations of failure.

Yuen is much less explicit about what he thinks would be an effective strategy, but he cites as positive examples the efforts of some to promote “a bottom-up and egalitarian transition” by:

ever-increasing numbers of people who are voluntarily engaging in intentional communities, sustainability projects, permaculture and urban farming, communing and militant resistance to consumerism…we must consider the alternative posed by the highly imaginative Italian left of the twentieth century. The explosively popular Slow Food movement was originally built on the premise that a good life can be had not through compulsive excess but through greater conviviality and a shared commonwealth.

Compare that to this list of essential tasks, prepared recently by Pablo Solón, a leading figure in the global climate justice movement:

To reduce greenhouse gas emissions to a level that avoids catastrophe, we need to:

* Leave more than two-thirds of the fossil fuel reserves under the soil;

* Stop the exploitation of tar sands, shale gas and coal;

* Support small, local, peasant and indigenous community farming while we dismantle big agribusiness that deforests and heats the planet;

* Promote local production and consumption of products, reducing the free trade of goods that send millions of tons of CO2 while they travel around the world;

* Stop extractive industries from further destroying nature and contaminating our atmosphere and our land;

* Increase significantly public transport to reduce the unsustainable “car way of life”;

* Reduce the emissions of warfare by promoting genuine peace and dismantling the military and war industry and infrastructure.22

The projects that Yuen describes are worthwhile, but unless the participants are also committed to building mass environmental campaigns, they will not be helping to achieve the vital objectives that Solón identifies. Posing local communes and slow food as alternatives to building a movement against global climate change is effectively a proposal to abandon the fight against capitalist ecocide in favor of creating greenish enclaves, while the world burns.

Bright-siding versus Movement Building

Whatever its merits in other contexts, it is not helpful or appropriate to use the word catastrophism as a synonym for telling the truth about the environmental dangers we face. Using the same language as right-wing climate science deniers gives the impression that the dangers are non-existent or exaggerated. Putting accurate environmental warnings in the same category as apocalyptic Christian fundamentalism and century-old misreadings of Marxist economic theory leads to underestimation of the threats we face and directs efforts away from mobilizing an effective counterforce.

Yuen’s argument against publicizing the scientific consensus on climate change echoes the myth that liberal politicians and journalists use to justify their failure to challenge the crimes of the fossil-fuel industry. People are tired of all that doom and gloom, they say. It is time for positive messages! Or, to use Yuen’s vocabulary, environmentalists need to end “apocalyptic rhetoric” and find better “narrative strategies.”

This is fundamentally an elitist position: the people cannot handle the truth, so a knowledgeable minority must sugarcoat it, to make the necessary changes palatable.

David Spratt of the Australian organization Climate Code Red calls that approach “bright-siding,” a reference to the bitterly satirical Monty Python song, “Always Look on the Bright Side of Life.”

The problem is, Spratt writes: “If you avoid including an honest assessment of climate science and impacts in your narrative, it’s pretty difficult to give people a grasp about where the climate system is heading and what needs to be done to create the conditions for living in climate safety, rather than increasing and eventually catastrophic harm.”23 Joe Romm makes the same point: “You’d think it would be pretty obvious that the public is not going to be concerned about an issue unless one explains why they should be concerned.”24

Of course, this does not mean that we only need to explain the science. We need to propose concrete goals, as Pablo Solón has done. We need to show how the scientific consensus about climate change relates to local and national concerns such as pipelines, tar sands, fracking, and extreme weather. We need to work with everyone who is willing to confront any aspect of the crisis, from people who still have illusions about capitalism to convinced revolutionaries. Activists in the wealthy countries must be unstinting in their political and practical solidarity with the primary victims of climate change, indigenous peoples, and impoverished masses everywhere.

We need to do all of that and more.

But the first step is to tell the truth—about the danger we face, about its causes, and about the measures that must be taken to turn back the threat. In a time of universal deceit, telling the truth is a revolutionary act.

Notes

  1.  Fraser C. Lott, Nikolaos Christidis, and Peter A. Stott, “Can the 2011 East African Drought Be Attributed to Human-Induced Climate Change?,” Geophysical Research Letters 40, no. 6 ( March 2013): 1177–81.
  2.  UNDP, “’Rise of South’ Transforming Global Power Balance, Says 2013 Human Development Report,” March 14, 2013, http://undp.org.
  3.  Tom Harris, “Climate Catastrophism Picking Up Again in the U.S. and Across the World,” Somewhat Reasonable, October 10, 2012, http://blog.heartland.org.
  4.  Pierre Gosselin, “The Climate Catastrophism Cult,” NoTricksZone, February 12, 2011, http://notrickszone.com.
  5.  Ray Evans, “The Chilling Costs of Climate Catastrophism,” Quadrant Online, June 2008, http://quadrant.org.au.
  6.  Franz Mauelshagen, “Climate Catastrophism: The History of the Future of Climate Change,” in Andrea Janku, Gerrit Schenk, and Franz Mauelshagen, Historical Disasters in Context: Science, Religion, and Politics (New York: Routledge, 2012), 276.
  7.  Alexander Cockburn, “Is Global Warming a Sin?,” CounterPunch, April 28–30, 2007, http://counterpunch.org.
  8.  Alexander Cockburn, “Who are the Merchants of Fear?,” CounterPunch, May 12–14, 2007, http://counterpunch.org.
  9.  Alexander Cockburn, “I Am An Intellectual Blasphemer,” Spiked Review of Books, January 9, 2008, http://spiked-online.com.
  10.  Leo Panitch and Colin Leys, “Preface,” Socialist Register 2007: Coming to Terms With Nature (London: Merlin Press/Monthly Review Press, 2006), ix–x.
  11.  “Notes from the Editors,” Monthly Review 58, no. 10 (March 2007), http://monthlyreview.org.
  12.  Sasha Lilley, David McNally, Eddie Yuen, and James Davis, Catastrophism: The Apocalyptic Politics of Collapse and Rebirth (Oakland: PM Press, 2012).
  13.  Yuen’s footnote cites an article which is identical to a news release issued the previous day by Texas A&M University; see “Increased Knowledge About Global Warming Leads to Apathy, Study Shows,” Science Daily, March 28, 2008, http://eurekalert.org. The original paper, which Yuen does not cite, is: P.M. Kellstedt, S. Zahran, and A. Vedlitz, “Personal Efficacy, the Information Environment, and Attitudes Towards Global Warming and Climate Change in the United States,” Risk Analysis 28, no. 1 (2008): 113–26.
  14.  Aaron M. McCright and Riley E. Dunlap, “The Politicization of Climate Change and Polarization in the American Public’s Views of Global Warming, 2001–2010,” The Sociological Quarterly 52 (2011): 155–94.
  15.  A. Leiserowitz et al., Global Warming’s Six Americas, September 2012 (New Haven, CT: Yale Project on Climate Change Communication, 2013), http://environment.yale.edu.
  16.  Robert J. Brulle, Jason Carmichael, and J. Craig Jenkins, “Shifting Public Opinion on Climate Change: An Empirical Assessment of Factors Influencing Concern Over Climate Change in the U.S., 2002–2010,” Climatic Change 114, no. 2 (September 2012): 169–88.
  17.  Joe Romm, “Apocalypse Not: The Oscars, The Media and the Myth of ‘Constant Repetition of Doomsday Messages’ on Climate,” Climate Progress, February 24, 2013, http://thinkprogress.org.
  18.  Douglas Fischer, “2010 in Review: The Year Climate Coverage ‘Fell off the Map,’” Daily Climate, January 3, 2011, http://dailyclimate.org; “Climate Coverage Down Again in 2011,” Daily Climate, January 3, 2012, http://dailyclimate.org; “Climate Coverage, Dominated by Weird Weather, Falls Further in 2012,” Daily Climate, January 2, 2013, http://dailyclimate.org.
  19.  Maxwell T. Boykoff, Who Speaks for the Climate?: Making Sense of Media Reporting on Climate Change (Cambridge: Cambridge University Press, 2011), 24.
  20.  Neil T. Gavin, “Addressing Climate Change: A Media Perspective,” Environmental Politics 18, no. 5 (September 2009): 765–80.
  21.  Two responses to David Noble are: Derrick O’Keefe, “Denying Time and Place in the Global Warming Debate,” Climate & Capitalism, June 7, 2007, http://climateandcapitalism.com; Justin Podur, “Global Warming Suspicions and Confusions,” ZNet, May 11, 2007, http://zcommunications.org.
  22.  Pablo Solón, “A Contribution to the Climate Space 2013: How to Overcome the Climate Crisis?,” Climate Space, March 14, 2013, http://climatespace2013.wordpress.com.
  23.  David Spratt, Always Look on the Bright Side of Life: Bright-siding Climate Advocacy and Its Consequences, April 2012, http://climatecodered.org.
  24.  Joe Romm, “Apocalypse Not.”

Ian Angus is editor of the online journal Climate & Capitalism. He is co-author of Too Many People? Population, Immigration, and the Environmental Crisis (Haymarket, 2011), and editor of The Global Fight for Climate Justice (Fernwood, 2010).
He would like to thank Simon Butler, Martin Empson, John Bellamy Foster, John Riddell, Javier Sethness, and Chris Williams for comments and suggestions.

Rising Seas (Nat Geo)


As the planet warms, the sea rises. Coastlines flood. What will we protect? What will we abandon? How will we face the danger of rising seas?

By Tim Folger

Photographs by George Steinmetz

September 2013

By the time Hurricane Sandy veered toward the Northeast coast of the United States last October 29, it had mauled several countries in the Caribbean and left dozens dead. Faced with the largest storm ever spawned over the Atlantic, New York and other cities ordered mandatory evacuations of low-lying areas. Not everyone complied. Those who chose to ride out Sandy got a preview of the future, in which a warmer world will lead to inexorably rising seas.

Brandon d’Leo, a 43-year-old sculptor and surfer, lives on the Rockaway Peninsula, a narrow, densely populated, 11-mile-long sandy strip that juts from the western end of Long Island. Like many of his neighbors, d’Leo had remained at home through Hurricane Irene the year before. “When they told us the tidal surge from this storm would be worse, I wasn’t afraid,” he says. That would soon change.

D’Leo rents a second-floor apartment in a three-story house across the street from the beach on the peninsula’s southern shore. At about 3:30 in the afternoon he went outside. Waves were crashing against the five-and-a-half-mile-long boardwalk. “Water had already begun to breach the boardwalk,” he says. “I thought, Wow, we still have four and a half hours until high tide. In ten minutes the water probably came ten feet closer to the street.”

Back in his apartment, d’Leo and a neighbor, Davina Grincevicius, watched the sea as wind-driven rain pelted the sliding glass door of his living room. His landlord, fearing the house might flood, had shut off the electricity. As darkness fell, Grincevicius saw something alarming. “I think the boardwalk just moved,” she said. Within minutes another surge of water lifted the boardwalk again. It began to snap apart.

Three large sections of the boardwalk smashed against two pine trees in front of d’Leo’s apartment. The street had become a four-foot-deep river, as wave after wave poured water onto the peninsula. Cars began to float in the churning water, their wailing alarms adding to the cacophony of wind, rushing water, and cracking wood. A bobbing red Mini Cooper, its headlights flashing, became wedged against one of the pine trees in the front yard. To the west the sky lit up with what looked like fireworks—electrical transformers were exploding in Breezy Point, a neighborhood near the tip of the peninsula. More than one hundred homes there burned to the ground that night.

The trees in the front yard saved d’Leo’s house, and maybe the lives of everyone inside—d’Leo, Grincevicius, and two elderly women who lived in an apartment downstairs. “There was no option to get out,” d’Leo says. “I have six surfboards in my apartment, and I was thinking, if anything comes through the wall, I’ll try to get everyone on those boards and try to get up the block. But if we’d had to get in that water, it wouldn’t have been good.”

After a fitful night’s sleep d’Leo went outside shortly before sunrise. The water had receded, but thigh-deep pools still filled parts of some streets. “Everything was covered with sand,” he says. “It looked like another planet.”

A profoundly altered planet is what our fossil-fuel-driven civilization is creating, a planet where Sandy-scale flooding will become more common and more destructive for the world’s coastal cities. By releasing carbon dioxide and other heat-trapping gases into the atmosphere, we have warmed the Earth by more than a full degree Fahrenheit over the past century and raised sea level by about eight inches. Even if we stopped burning all fossil fuels tomorrow, the existing greenhouse gases would continue to warm the Earth for centuries. We have irreversibly committed future generations to a hotter world and rising seas.

In May the concentration of carbon dioxide in the atmosphere reached 400 parts per million, the highest level in at least three million years. Sea levels then may have been as much as 65 feet above today’s; the Northern Hemisphere was largely ice free year-round. It would take centuries for the oceans to reach such catastrophic heights again, and much depends on whether we manage to limit future greenhouse gas emissions. In the short term scientists are still uncertain about how fast and how high seas will rise. Estimates have repeatedly been too conservative.

Global warming affects sea level in two ways. About a third of its current rise comes from thermal expansion—from the fact that water grows in volume as it warms. The rest comes from the melting of ice on land. So far it’s been mostly mountain glaciers, but the big concern for the future is the giant ice sheets in Greenland and Antarctica. Six years ago the Intergovernmental Panel on Climate Change (IPCC) issued a report predicting a maximum of 23 inches of sea-level rise by the end of this century. But that report intentionally omitted the possibility that the ice sheets might flow more rapidly into the sea, on the grounds that the physics of that process was poorly understood.

As the IPCC prepares to issue a new report this fall, in which the sea-level forecast is expected to be slightly higher, gaps in ice-sheet science remain. But climate scientists now estimate that Greenland and Antarctica combined have lost on average about 50 cubic miles of ice each year since 1992—roughly 200 billion metric tons of ice annually. Many think sea level will be at least three feet higher than today by 2100. Even that figure might be too low.

“In the last several years we’ve observed accelerated melting of the ice sheets in Greenland and West Antarctica,” says Radley Horton, a research scientist at Columbia University’s Earth Institute in New York City. “The concern is that if the acceleration continues, by the time we get to the end of the 21st century, we could see sea-level rise of as much as six feet globally instead of two to three feet.” Last year an expert panel convened by the National Oceanic and Atmospheric Administration adopted 6.6 feet (two meters) as its highest of four scenarios for 2100. The U.S. Army Corps of Engineers recommends that planners consider a high scenario of five feet.

One of the biggest wild cards in all sea-level-rise scenarios is the massive Thwaites Glacier in West Antarctica. Four years ago NASA sponsored a series of flights over the region that used ice-penetrating radar to map the seafloor topography. The flights revealed that a 2,000-foot-high undersea ridge holds the Thwaites Glacier in place, slowing its slide into the sea. A rising sea could allow more water to seep between ridge and glacier and eventually unmoor it. But no one knows when or if that will happen. “That’s one place I’m really nervous about,” says Richard Alley, a glaciologist at Penn State University and an author of the last IPCC report. “It involves the physics of ice fracture that we really don’t understand.” If the Thwaites Glacier breaks free from its rocky berth, that would liberate enough ice to raise sea level by three meters—nearly ten feet. “The odds are in our favor that it won’t put three meters in the ocean in the next century,” says Alley. “But we can’t absolutely guarantee that. There’s at least some chance that something very nasty will happen.”

Even in the absence of something very nasty, coastal cities face a twofold threat: Inexorably rising oceans will gradually inundate low-lying areas, and higher seas will extend the ruinous reach of storm surges. The threat will never go away; it will only worsen. By the end of the century a hundred-year storm surge like Sandy’s might occur every decade or less. Using a conservative prediction of a half meter (20 inches) of sea-level rise, the Organisation for Economic Co-operation and Development estimates that by 2070, 150 million people in the world’s large port cities will be at risk from coastal flooding, along with $35 trillion worth of property—an amount that will equal 9 percent of the global GDP. How will they cope?

“During the last ice age there was a mile or two of ice above us right here,” says Malcolm Bowman, as we pull into his driveway in Stony Brook, New York, on Long Island’s north shore. “When the ice retreated, it left a heap of sand, which is Long Island. All these rounded stones you see—look there,” he says, pointing to some large boulders scattered among the trees near his home. “They’re glacial boulders.”

Bowman, a physical oceanographer at the State University of New York at Stony Brook, has been trying for years to persuade anyone who will listen that New York City needs a harbor-spanning storm-surge barrier. Compared with some other leading ports, New York is essentially defenseless in the face of hurricanes and floods. London, Rotterdam, St. Petersburg, New Orleans, and Shanghai have all built levees and storm barriers in the past few decades. New York paid a high price for its vulnerability last October. Sandy left 43 dead in the city, of whom 35 drowned; it cost the city some $19 billion. And it was all unnecessary, says Bowman.

“If a system of properly designed storm-surge barriers had been built—and strengthened with sand dunes at both ends along the low-lying coastal areas—there would have been no flooding damage from Sandy,” he says.

Bowman envisions two barriers: one at Throgs Neck, to keep surges from Long Island Sound out of the East River, and a second one spanning the harbor south of the city. Gates would accommodate ships and tides, closing only during storms, much like existing structures in the Netherlands and elsewhere. The southern barrier alone, stretching five miles between Sandy Hook, New Jersey, and the Rockaway Peninsula, might cost $10 billion to $15 billion, Bowman estimates. He pictures a six-lane toll highway on top that would provide a bypass route around the city and a light-rail line connecting the Newark and John F. Kennedy Airports.

“It could be an asset to the region,” says Bowman. “Eventually the city will have to face up to this, because the problem is going to get worse. It might take five years of study and another ten years to get the political will to do it. By then there might have been another disaster. We need to start planning immediately. Otherwise we’re mortgaging the future and leaving the next generation to cope as best it can.”

Another way to safeguard New York might be to revive a bit of its past. In the 16th-floor loft of her landscape architectural firm in lower Manhattan, Kate Orff pulls out a map of New York Harbor in the 19th century. The present-day harbor shimmers outside her window, calm and unthreatening on an unseasonably mild morning three months to the day after Sandy hit.

“Here’s an archipelago that protected Red Hook,” Orff says, pointing on the map to a small cluster of islands off the Brooklyn shore. “There was another chain of shoals that connected Sandy Hook to Coney Island.”

The islands and shallows vanished long ago, demolished by harbor-dredging and landfill projects that added new real estate to a burgeoning city. Orff would re-create some of them, particularly the Sandy Hook–Coney Island chain, and connect them with sluice gates that would close during a storm, forming an eco-engineered barrier that would cross the same waters as Bowman’s more conventional one. Behind it, throughout the harbor, would be dozens of artificial reefs built from stone, rope, and wood pilings and seeded with oysters and other shellfish. The reefs would continue to grow as sea levels rose, helping to buffer storm waves—and the shellfish, being filter feeders, would also help clean the harbor. “Twenty-five percent of New York Harbor used to be oyster beds,” Orff says.

Orff estimates her “oystertecture” vision could be brought to life at relatively low cost. “It would be chump change compared with a conventional barrier. And it wouldn’t be money wasted: Even if another Sandy never happens, you’d have a cleaner, restored harbor in a more ecologically vibrant context and a healthier New York.”

In June, Mayor Michael Bloomberg outlined a $19.5 billion plan to defend New York City against rising seas. “Sandy was a temporary setback that can ultimately propel us forward,” he said. The mayor’s proposal calls for the construction of levees, local storm-surge barriers, sand dunes, oyster reefs, and more than 200 other measures. It goes far beyond anything planned by any other American city. But the mayor dismissed the idea of a harbor barrier. “A giant barrier across our harbor is neither practical nor affordable,” Bloomberg said. The plan notes that since a barrier would remain open most of the time, it would not protect the city from the inch-by-inch creep of sea-level rise.

Meanwhile, development in the city’s flood zones continues. Klaus Jacob, a geophysicist at Columbia University, says the entire New York metropolitan region urgently needs a master plan to ensure that future construction will at least not exacerbate the hazards from rising seas.

“The problem is we’re still building the city of the past,” says Jacob. “The people of the 1880s couldn’t build a city for the year 2000—of course not. And we cannot build a year-2100 city now. But we should not build a city now that we know will not function in 2100. There are opportunities to renew our infrastructure. It’s not all bad news. We just have to grasp those opportunities.”

Will New York grasp them after Bloomberg leaves office at the end of this year? And can a single storm change not just a city’s but a nation’s policy? It has happened before. The Netherlands had its own stormy reckoning 60 years ago, and it transformed the country.

The storm roared in from the North Sea on the night of January 31, 1953. Ria Geluk was six years old at the time and living where she lives today, on the island of Schouwen Duiveland in the southern province of Zeeland. She remembers a neighbor knocking on the door of her parents’ farmhouse in the middle of the night to tell them that the dike had failed. Later that day the whole family, along with several neighbors who had spent the night, climbed to the roof, where they huddled in blankets and heavy coats in the wind and rain. Geluk’s grandparents lived just across the road, but water swept into the village with such force that they were trapped in their home. They died when it collapsed.

“Our house kept standing,” says Geluk. “The next afternoon the tide came again. My father could see around us what was happening; he could see houses disappearing. You knew when a house disappeared, the people were killed. In the afternoon a fishing boat came to rescue us.”

In 1997 Geluk helped found the Watersnoodmuseum—the “flood museum”—on Schouwen Duiveland. The museum is housed in four concrete caissons that engineers used to plug dikes in 1953. The disaster killed 1,836 in all, nearly half in Zeeland, including a baby born on the night of the storm.

Afterward the Dutch launched an ambitious program of dike and barrier construction called the Delta Works, which lasted more than four decades and cost more than six billion dollars. One crucial project was the five-mile-long Oosterscheldekering, or Eastern Scheldt barrier, completed 27 years ago to defend Zeeland from the sea. Geluk points to it as we stand on a bank of the Scheldt estuary near the museum, its enormous pylons just visible on the horizon. The final component of the Delta Works, a movable barrier protecting Rotterdam Harbor and some 1.5 million people, was finished in 1997.

Like other primary sea barriers in the Netherlands, it’s built to withstand a 1-in-10,000-year storm—the strictest standard in the world. (The United States uses a 1-in-100 standard.) The Dutch government is now considering whether to upgrade the protection levels to bring them in line with sea-level-rise projections.

Such measures are a matter of national security for a country where 26 percent of the land lies below sea level. With more than 10,000 miles of dikes, the Netherlands is fortified to such an extent that hardly anyone thinks about the threat from the sea, largely because much of the protection is so well integrated into the landscape that it’s nearly invisible.

On a bitingly cold February afternoon I spend a couple of hours walking around Rotterdam with Arnoud Molenaar, the manager of the city’s Climate Proof program, which aims to make Rotterdam resistant to the sea levels expected by 2025. About 20 minutes into our walk we climb a sloping street next to a museum designed by the architect Rem Koolhaas. The presence of a hill in this flat city should have alerted me, but I’m surprised when Molenaar tells me that we’re walking up the side of a dike. He gestures to some nearby pedestrians. “Most of the people around us don’t realize this is a dike either,” he says. The Westzeedijk shields the inner city from the Meuse River a few blocks to the south, but the broad, busy boulevard on top of it looks like any other Dutch thoroughfare, with flocks of cyclists wheeling along in dedicated lanes.

As we walk, Molenaar points out assorted subtle flood-control structures: an underground parking garage designed to hold 10,000 cubic meters—more than 2.5 million gallons—of rainwater; a street flanked by two levels of sidewalks, with the lower one designed to store water, leaving the upper walkway dry. Late in the afternoon we arrive at Rotterdam’s Floating Pavilion, a group of three connected, transparent domes on a platform in a harbor off the Meuse. The domes, about three stories tall, are made of a plastic that’s a hundred times as light as glass.

Inside we have sweeping views of Rotterdam’s skyline; hail clatters overhead as low clouds scud in from the North Sea. Though the domes are used for meetings and exhibitions, their main purpose is to demonstrate the wide potential of floating urban architecture. By 2040 the city anticipates that as many as 1,200 homes will float in the harbor. “We think these structures will be important not just for Rotterdam but for many cities around the world,” says Bart Roeffen, the architect who designed the pavilion. The homes of 2040 will not necessarily be domes; Roeffen chose that shape for its structural integrity and its futuristic appeal. “To build on water is not new, but to develop floating communities on a large scale and in a harbor with tides—that is new,” says Molenaar. “Instead of fighting against water, we want to live with it.”

While visiting the Netherlands, I heard one joke repeatedly: “God may have built the world, but the Dutch built Holland.” The country has been reclaiming land from the sea for nearly a thousand years—much of Zeeland was built that way. Sea-level rise does not yet panic the Dutch.

“We cannot retreat! Where could we go? Germany?” Jan Mulder has to shout over the wind—we’re walking along a beach called Kijkduin as volleys of sleet exfoliate our faces. Mulder is a coastal morphologist with Deltares, a private coastal management firm. This morning he and Douwe Sikkema, a project manager with the province of South Holland, have brought me to see the latest in adaptive beach protection. It’s called the zandmotor—the sand engine.

The seafloor offshore, they explain, is thick with hundreds of feet of sand deposited by rivers and retreating glaciers. North Sea waves and currents once distributed that sand along the coast. But as sea level has risen since the Ice Age, the waves no longer reach deep enough to stir up sand, and the currents have less sand to spread around. Instead the sea erodes the coast here.

The typical solution would be to dredge sand offshore and dump it directly on the eroding beaches—and then repeat the process year after year as the sand washes away. Mulder and his colleagues recommended that the provincial government try a different strategy: a single gargantuan dredging operation to create the sandy peninsula we’re walking on—a hook-shaped stretch of beach the size of 250 football fields. If the scheme works, over the next 20 years the wind, waves, and tides will spread its sand 15 miles up and down the coast. The combination of wind, waves, tides, and sand is the zandmotor.

The project started only two years ago, but it seems to be working. Mulder shows me small dunes that have started to grow on a beach where there was once open water. “It’s very flexible,” he says. “If we see that sea-level rise increases, we can increase the amount of sand.” Sikkema adds, “And it’s much easier to adjust the amount of sand than to rebuild an entire system of dikes.”

Later Mulder tells me about a memorial inscription affixed to the Eastern Scheldt barrier in Zeeland: “It says, ‘Hier gaan over het tij, de maan, de wind, en wij—Here the tide is ruled by the moon, the wind, and us.’ ” It reflects the confidence of a generation that took for granted, as we no longer can, a reasonably stable world. “We have to understand that we are not ruling the world,” says Mulder. “We need to adapt.”

With the threats of climate change and sea-level rise looming over us all, cities around the world, from New York to Ho Chi Minh City, have turned to the Netherlands for guidance. One Dutch firm, Arcadis, has prepared a conceptual design for a storm-surge barrier in the Verrazano Narrows to protect New York City. The same company helped design a $1.1 billion, two-mile-long barrier that protected New Orleans from a 13.6-foot storm surge last summer, when Hurricane Isaac hit. The Lower Ninth Ward, which suffered so greatly during Hurricane Katrina, was unscathed.

“Isaac was a tremendous victory for New Orleans,” Piet Dircke, an Arcadis executive, tells me one night over dinner in Rotterdam. “All the barriers were closed; all the levees held; all the pumps worked. You didn’t hear about it? No, because nothing happened.”

New Orleans may be safe for a few decades, but the long-term prospects for it and other low-lying cities look dire. Among the most vulnerable is Miami. “I cannot envision southeastern Florida having many people at the end of this century,” says Hal Wanless, chairman of the department of geological sciences at the University of Miami. We’re sitting in his basement office, looking at maps of Florida on his computer. At each click of the mouse, the years pass, the ocean rises, and the peninsula shrinks. Freshwater wetlands and mangrove swamps collapse—a death spiral that has already started on the southern tip of the peninsula. With seas four feet higher than they are today—a distinct possibility by 2100—about two-thirds of southeastern Florida is inundated. The Florida Keys have almost vanished. Miami is an island.

When I ask Wanless if barriers might save Miami, at least in the short term, he leaves his office for a moment. When he returns, he’s holding a foot-long cylindrical limestone core. It looks like a tube of gray, petrified Swiss cheese. “Try to plug this up,” he says. Miami and most of Florida sit atop a foundation of highly porous limestone. The limestone consists of the remains of countless marine creatures deposited more than 65 million years ago, when a warm, shallow sea covered what is now Florida—a past that may resemble the future here.

A barrier would be pointless, Wanless says, because water would just flow through the limestone beneath it. “No doubt there will be some dramatic engineering feats attempted,” he says. “But the limestone is so porous that even massive pumping systems won’t be able to keep the water out.”

Sea-level rise has already begun to threaten Florida’s freshwater supply. About a quarter of the state’s 19 million residents depend on wells sunk into the enormous Biscayne aquifer. Salt water is now seeping into it from dozens of canals that were built to drain the Everglades. For decades the state has tried to control the saltwater influx by building dams and pumping stations on the drainage canals. These “salinity-control structures” maintain a wall of fresh water behind them to block the underground intrusion of salt water. To offset the greater density of salt water, the freshwater level in the control structures is generally kept about two feet higher than the encroaching sea.
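The two-foot freshwater head described above can be read as a simple hydrostatic balance. In the standard sharp-interface idealization known as the Ghyben–Herzberg relation (a textbook approximation, not something the article itself invokes), a freshwater table standing a height h above sea level presses the underlying salt water down to a depth z below sea level:

```latex
% Hydrostatic balance at the fresh/salt interface:
% weight of the freshwater column = weight of the seawater column
\rho_f \, g \, (h + z) = \rho_s \, g \, z
\quad\Longrightarrow\quad
z = \frac{\rho_f}{\rho_s - \rho_f}\, h \;\approx\; 40\,h ,
```

taking typical densities ρ_f ≈ 1,000 kg/m³ for fresh water and ρ_s ≈ 1,025 kg/m³ for seawater. On this idealization, each foot of freshwater head above sea level supports roughly 40 feet of fresh water below it, which is why even a modest loss of head to a rising sea can let salt water advance dramatically through a porous aquifer like the Biscayne.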

But the control structures also serve a second function: During the state’s frequent rainstorms their gates must open to discharge the flood of fresh water to the sea. “We have about 30 salinity-control structures in South Florida,” says Jayantha Obeysekera, the chief hydrological modeler at the South Florida Water Management District. “At times now the water level in the sea is higher than the freshwater level in the canal.” That both accelerates saltwater intrusion and prevents the discharge of flood waters. “The concern is that this will get worse with time as the sea-level rise accelerates,” Obeysekera says.

Using fresh water to block the salt water will eventually become impractical, because the amount of fresh water needed would submerge ever larger areas behind the control structures, in effect flooding the state from the inside. “With 50 centimeters [about 20 inches] of sea-level rise, 80 percent of the salinity-control structures in Florida will no longer be functional,” says Wanless. “We’ll either have to drown communities to keep the freshwater head above sea level or have saltwater intrusion.” When sea level rises two feet, he says, Florida’s aquifers may be poisoned beyond recovery. Even now, during unusually high tides, seawater spouts from sewers in Miami Beach, Fort Lauderdale, and other cities, flooding streets.

In a state exposed to hurricanes as well as rising seas, people like John Van Leer, an oceanographer at the University of Miami, worry that one day they will no longer be able to insure—or sell—their houses. “If buyers can’t insure it, they can’t get a mortgage on it. And if they can’t get a mortgage, you can only sell to cash buyers,” Van Leer says. “What I’m looking for is a climate-change denier with a lot of money.”

Unless we change course dramatically in the coming years, our carbon emissions will create a world utterly different in its very geography from the one in which our species evolved. “With business as usual, the concentration of carbon dioxide in the atmosphere will reach around a thousand parts per million by the end of the century,” says Gavin Foster, a geochemist at the University of Southampton in England. Such concentrations, he says, haven’t been seen on Earth since the early Eocene epoch, 50 million years ago, when the planet was completely ice free. According to the U.S. Geological Survey, sea level on an iceless Earth would be as much as 216 feet higher than it is today. It might take thousands of years and more than a thousand parts per million to create such a world—but if we burn all the fossil fuels, we will get there.

No matter how much we reduce our greenhouse gas emissions, Foster says, we’re already locked in to at least several feet of sea-level rise, and perhaps several dozen feet, as the planet slowly adjusts to the amount of carbon that’s in the atmosphere already. A recent Dutch study predicted that the Netherlands could engineer solutions at a manageable cost to a rise of as much as five meters, or 16 feet. Poorer countries will struggle to adapt to much smaller rises. At different times in different places, engineering solutions will no longer suffice. Then the retreat from the coast will begin. In some places there will be no higher ground to retreat to.

By the next century, if not sooner, large numbers of people will have to abandon coastal areas in Florida and other parts of the world. Some researchers fear a flood tide of climate-change refugees. “From the Bahamas to Bangladesh and a major amount of Florida, we’ll all have to move, and we may have to move at the same time,” says Wanless. “We’re going to see civil unrest, war. You just wonder how—or if—civilization will function. How thin are the threads that hold it all together? We can’t comprehend this. We think Miami has always been here and will always be here. How do you get people to realize that Miami—or London—will not always be there?”

What will New York look like in 200 years? Klaus Jacob, the Columbia geophysicist, sees downtown Manhattan as a kind of Venice, subject to periodic flooding, perhaps with canals and yellow water cabs. Much of the city’s population, he says, will gather on high ground in the other boroughs. “High ground will become expensive, waterfront will become cheap,” he says. But among New Yorkers, as among the rest of us, the idea that the sea is going to rise—a lot—hasn’t really sunk in yet. Of the thousands of people in New York State whose homes were badly damaged or destroyed by Sandy’s surge, only 10 to 15 percent are expected to accept the state’s offer to buy them out at their homes’ pre-storm value. The rest plan to rebuild.

Master of Disaster (Discover)

Earthquakes and hurricanes will always wreak havoc, but risk management expert Robert Bea says the greatest tragedies result from hubris and greed.

By Linda Marsa | Thursday, May 23, 2013


Robert Bea has an unusual specialty: He studies disasters. As one of the world’s leading experts in catastrophic risk management, the former Shell Oil Co. executive sifts through the wreckage to unravel the chain of events that triggers accidents. The blunt-spoken civil engineer has spent more than a half-century investigating high-profile engineering failures, from the space shuttle Columbia’s horrific end to the explosion of the Deepwater Horizon oil-drilling rig in the Gulf.

A professor emeritus of civil engineering at the University of California, Berkeley, Bea has seen his disaster autopsy methods — such as looking at the organizational breakdowns that lead to calamities — widely adopted. Although policymakers and corporate honchos seek his counsel, sometimes they don’t like what he has to say — witness the flak he took from BP during the Deepwater Horizon probe.

Now in his mid-70s, Bea speaks in a raspier voice, but his critical faculties are undimmed. On a crisp fall day, he talked with DISCOVER in his comfortable one-story house in Moraga, a leafy suburb east of Berkeley, about what causes catastrophes.

You have said that engineering failures aren’t the chief culprits behind disasters, pointing instead to human and organizational failures — inadequate safety protocols, corporate hierarchies, conflicting egos or just plain laziness. Was there an “aha moment” when this became apparent?

It came when I was involved in the investigation into the Piper Alpha disaster, the 1988 explosion that destroyed an Occidental Petroleum oil-drilling platform in the North Sea, killing 167 men. The external investigation team that Occidental hired to determine what caused Piper Alpha found a corporate culture that had gone bad, that had lost its way.

I was part of that team all the way through the Lord Cullen Commission hearings in London, and I had to listen to one of my friends explain to the Cullen Commission why he and his colleagues had turned off the smoke alarms on the platform because the operating crew was doing a routine maintenance procedure late in the evening. Unfortunately, for over a month, certain alarms had been disabled to prevent unnecessary shutdowns on the rig — in some cases as a response to practical jokes. But turning off the alarms was one of the reasons they got caught by surprise.

Ironically, two years before, I was brought in to advise Occidental on risk management for Piper Alpha because they were having gas releases, pipes were leaking. Of course, you didn’t have to be very smart to say, “Yeah, we’ve got a problem — it’s called rusty pipes. And we’ve got problems with people not doing what they should be doing, and people who don’t understand what’s happening.”

One evening, during the first year of the investigation, I saw spread out on the reception table of the Occidental offices a copy of the London Times newspaper with a great, big, bold headline that said, “Occidental puts profit before safety.”

It had a picture of one of the bandaged, beat-up, horribly scarred survivors from the disaster who was telling this to the newspaper. What this survivor was observing is true. If you don’t have profitability, you don’t have the resources to invest in achieving adequate protection. What the tension is, is having the discipline and the foresight to make those investments before you’re in trouble.

When I came back to Berkeley after the investigation was completed, I realized that for the past 50-some-odd years of my career, I’d been working on 10 percent of the problems. I’d been working on normal engineering things, and 90 percent of the problems are humans and/or organizations.

We often have ample warnings before catastrophes hit, but we tend to ignore them until it’s too late. Why?

The problem is attention span, particularly in this country because we are a pretty young country. Our knowledge of history is very limited. We are extremely blessed. Lots of good things attract our attention. It’s a noisy environment, really noisy. It’s unusual to find people who are comfortable sitting in a room by themselves thinking.

You could say the eruption of Mount St. Helens was certainly painful, but it actually affected relatively few people and then disappeared into that strong noise environment. At that point people say, “Well, it’s never happened to me. I can’t even remember my parents talking about it, and I’ve got these new things to play with, and they require attention,” like Facebook and Twitter. And suddenly, we have flitted from something that is difficult and painful to think about back to something that is enjoyable.


Robert Bea helped investigate the 1988 Piper Alpha disaster, where an exploding oil platform killed 167 in the North Sea. Press Association/AP

You seem to be suggesting that people have trouble dealing with issues over the long term. Are there other examples?

Well, global climate change is a perfect one, or rising sea levels. It’s happening slowly. People love living by the beach, so they build a beautiful home on a concrete slab, on top of the sand a few feet above sea level, and [ignore the fact] that the sea level is [rising]. So thinking about these slowly evolving long-term things, it is painful. It says, “Well, I might have to move my home. I really enjoy the beach,” and we don’t like to give those things up.

Is this inability to think long term also true of organizations — corporations or government agencies?

Yes. The equation for disaster is A + B = C. A is natural hazards, things like hurricanes, gases and liquids under pressure that are extremely volatile. They’re volcanoes. They’re tsunamis. They’re natural, and there’s nothing unusual about them. B is organizational hazards: people and their hubris, their arrogance, their greed. The real killer is our indolence.

So human error is the kindling that escalates a natural hazard — a hurricane, a tsunami, chemicals under pressure — into C, a catastrophic disaster. Can you give me some examples?

Hurricane Ike. Galveston, Texas, got completely wiped out in 1900. Thousands of people got killed. So the U.S. Army Corps of Engineers built a seawall on Galveston Island, and that sucker has gone through every major hurricane since 1900.

But people think that if a storm hasn’t happened since they lived there, somehow it can’t happen to them. This is where B comes in — the hubris and shortsightedness. Because a hurricane hadn’t flattened the city in decades, civic leaders decided to let people build at sea level again. And when Hurricane Ike came through in 2008, it was just like Berlin at the end of the Second World War. Everything was gone.

Before Superstorm Sandy, I wrote that the subways were going to flood, but no one did anything. Mayor Bloomberg even hired some of my engineer friends from the Netherlands to come to New York City and advise him about building gates to cut off incoming hurricane surges.

But here we’re back to B — hubris and shortsightedness. People think because they’ve never seen a storm like what happened in New Jersey or they’ve never seen the tunnels flooded in New York City that it can’t happen, or that they need to think about building a levee.


Disaster specialist Robert Bea studied the devastating aftermath of Hurricane Katrina, which left New Orleans flooded in 2005. NOAA

When I lived in New Orleans, we lost everything in Hurricane Betsy [in 1965]: our house, wedding photographs, marriage license, birth certificates. Yet 40 years later, after Katrina, I go back to the same place. There’s a new home built on the foundation, and the owners are dragging wet, oily mattresses out the front door.

Luckily I had no one with me that morning, but I broke down and cried. It wasn’t tears of sadness. It was tears of frustration at such a miserable, despicable mess. While we can’t prevent disaster, we can do things that are more sensible to mitigate risks, like maybe not building homes in floodplains.

But the cities are already there. Are you going to move entire cities?

In some cases, yes. We did it in the Mississippi River Valley after the 1993 floods. We actually moved entire towns to higher ground, like Valmeyer, Ill., and Rhineland, Mo., because we suddenly recognized they’d rebuilt them five times in the same damn place. Doing it six times doesn’t quite make sense. But there is not a “one size fits all” answer.

In other cases, there are intermediate solutions that can work, such as occupying only what you can defend properly and in a sustainable manner. An example is the “new New Orleans,” where parts of the city outside of the defended perimeter of the levee system can be expected to flood severely and frequently. Individuals there are building structures on higher ground and making them stronger, and preparing to take care of themselves in future storms.

But even after Superstorm Sandy’s devastation, you can’t just completely rebuild Lower Manhattan.

No, you can’t. But you can follow the example of the massive Thames Barrier that has been built in London [10 steel gates that prevent the city from being flooded by tidal surges].

But that cost over 500 million pounds — about $850 million — when it was completed in 1982. Doing the same in Manhattan would cost up to $17 billion and an additional $10 billion to $12 billion to shore up areas next to the barriers, an astonishing amount of money.

Well, do you want to fix it now or fix it later, when it will cost 100 times as much? Damages from Superstorm Sandy in New York and New Jersey are estimated at $60 billion to $70 billion. The key question I always ask is, “Can we work this problem out in a responsible way, or do we wait until it fails — in other words, will we fix it now versus pick up the pieces later?”

We looked into this “pay me now or pay me later,” and in many cases, it’s more than 100 times the cost to fix afterward. We economically documented several major accidents where the factor is bigger than 1,000, like Katrina.

People regularly ignore risks, but isn’t it the extreme scenario — the thing that has the 1 percent chance of happening, the so-called long tail at each end of the bell curve — that causes all the trouble? What about events beyond the realm of normal expectations, like Superstorm Sandy? 

Sandy is better classified as a “predictable surprise.” There were some groups in the greater New York area that had a clear understanding of the potential for significant flooding due to an intense storm. But there were other groups who had no idea and made no significant efforts to learn how vulnerable the city was to storm surges from the Atlantic.


A hurricane flattened Galveston, Texas, in 1900. Hurricane Ike in 2008 took out the city again. Library of Congress

What do you do when it becomes more costly to prepare for every disaster than just to take the risk? How safe is safe enough?

In many cases, you can’t prepare because no one’s willing to spend money to do this. We’ve rebuilt the levees around New Orleans, for example, but they’re right back on the same slabs. And it’s going to cost about 10 percent of the construction costs per year to maintain them.

Now where are you going to get $1.5 billion a year? So they can’t maintain it, and the next time it is challenged — and there is no doubt in my mind it will be challenged severely — it’s going to fail again, and the consequences are going to be worse because we’ve allowed more population and more infrastructure.

In aviation, which has unequaled safety records, they do predict and plan for the worst case because they can’t afford accidents. But Sully Sullenberger’s “Miracle on the Hudson,” when the pilot landed the plane on the Hudson River after flying into a flock of geese, does seem like a miracle.

It was a miracle, but it was not an accident. It was, in fact, rehearsed. That’s how the FAA was able to clear air traffic control, how he and his co-pilot, together with the flight crew, were able to get almost everything right, how the French Airbus had those backflow valves already designed in the fuselage.

We don’t normally land airplanes in water. They’re supposed to be on land, but they designed it to land in water as well, and the crew had rehearsed and planned what they would do in the event of a water landing. They are thinking about the impossible.

What warnings are going unheeded now? The antiquated levee system protecting California’s Sacramento Delta, which is the source of fresh water for 28 million of the state’s residents, comes to mind. I know you’ve been studying this.

An earthquake, a megaflood. Any of these natural disasters could rupture the delta levees and take a lot of the infrastructure — power lines, communication networks, gas pipelines, hydroelectric power systems — with it. Millions could be without power or fresh water for months. It isn’t going to be pretty, and it will knock out of commission the ninth-largest economy in the world.

And it could happen at any time.

Yeah. Tick, tick, tick.

Bea goes back in time to explain how we might have avoided notable catastrophes of the past.

Deepwater Horizon, 2010

The Deepwater Horizon oil-drilling rig exploded in the Gulf of Mexico, killing 11 crewmen and igniting a fire that could not be extinguished. Two days later the rig sank, leaving the well gushing on the seabed. It ultimately leaked 210 million gallons of oil in the largest offshore spill in U.S. history.

» What was ignored? BP, the industry and federal regulators understood the potential for an uncontrolled blowout in deepwater drilling, but they failed to heed warnings about the structural weaknesses in the cement casings that protect the well pipes. Plus, the blowout preventer hadn’t been working properly for several days.

» What would have improved the outcome? Delivering the required degree of safety consistently and ensuring that protocols were followed. Understanding that wells pumping 162,000 barrels of oil a day are under much higher pressure, and therefore are much more dangerous, than wells delivering 500 barrels a day. Building stronger protective structures to withstand these intense pressures could have prevented the disaster.

Midwest Floods, 2008

Months of rainstorms led to heavy flooding in seven Midwestern states, including Illinois, Minnesota, Indiana and Missouri, resulting in 24 deaths and more than $6 billion in damage.

» What was ignored? The lessons from floods in 1993. The increasing fragility of the aging levee system, and more people building in low-lying areas susceptible to flooding.

» What would have improved the outcome? Giving the water the room it needed to flood, and not building in those areas. Building and maintaining high-quality flood protection systems in the areas that could have been protected.


Columbia Disaster, 2003 

The space shuttle disintegrated upon re-entry into the Earth’s atmosphere due to missing heat shield tiles, killing all seven astronauts on board.

» What was ignored? The mantra “better, faster, cheaper” drove the management decision to “bring the bird back,” even though the U.S. Air Force had photographs showing the leading edge of the Columbia was missing heat shield tiles, which were damaged during the launch. Problems with heat shield tiles during earlier missions hadn’t been adequately addressed, and NASA ignored engineers’ requests to ground the mission until these problems were solved.

» What would have improved the outcome? Fixing heat shield tile problems uncovered during earlier missions. Developing a backup plan for fixing the shuttle in space if it was damaged, and ensuring the crew’s safety. They were lucky with previous missions that had missing tiles, but they took a chance, and their luck ran out.


Exxon Valdez Crash, 1989

The Exxon Valdez oil tanker ran aground in Prince William Sound, spilling an estimated 11 million gallons of crude oil along Alaskan shores. Numerous factors complicated cleanup, including the size of the spill, the remote location and a lack of readily available equipment and effective chemical dispersants to dissolve the thick oil.

» What was ignored? The dangers of Bligh Reef, where the ship hit the rocks. Taking shortcuts to save time to get the oil to Southern California refineries. Not traveling in regulated shipping lanes.

» What would have improved the outcome? Better communication between vessel captains and traffic control centers to avoid treacherous conditions. True pollution control to improve visibility, and better preparation for cleanup.

The Science of Why We Don’t Believe Science (Mother Jones)

How our brains fool us on climate, creationism, and the end of the world.

By | Mon Apr. 18, 2011 3:00 AM PDT


“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger [1] (PDF), in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study [2] in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?


At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning [5]” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president [6] (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience [7] (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia [8] of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.


We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber [9] of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt [10]: We may think we’re being scientists, but we’re actually being lawyers [11] (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.


Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment [12] (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes [13], and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan [14] and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research [15] (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study [16] (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and, with it, their sense of the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family [16]) (PDF) could lead to outcomes deleterious to society. Egalitarian communitarians, by contrast, tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan [17].

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles [18] (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad [19] and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study [20] (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

One study showed that not even Bush’s own words could change the minds of Bush voters who believed there was an Iraq-Al Qaeda link.

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org [21], a team at Ohio State presented subjects [22] (PDF) with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick [23]. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the fallacy [6] (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study [24] (PDF) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast [25]” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

A predictor of whether you accept the science of global warming? Whether you’re a Republican or a Democrat.

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey [26], for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz [27], director of the Yale Project on Climate Change Communication [28]. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

Is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism.

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr. [29]) and numerous Hollywood celebrities (most notably Jenny McCarthy [30] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin [31], author of the new book The Panic Virus [32], notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined [33] by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper [34]—which originated the current vaccine scare—was retracted and he subsequently lost his license [35] (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

We all have blinders in some situations. The question then becomes: What can be done to counteract human nature?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study [36], he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.


Links:
[1] https://motherjones.com/files/lfestinger.pdf
[2] http://www.powells.com/biblio/61-9781617202803-1
[3] http://motherjones.com/environment/2011/04/history-of-climategate
[4] http://motherjones.com/environment/2011/04/field-guide-climate-change-skeptics
[5] http://www.ncbi.nlm.nih.gov/pubmed/2270237
[6] http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf
[7] https://motherjones.com/files/descartes.pdf
[8] http://www-personal.umich.edu/~lupia/
[9] http://www.stonybrook.edu/polsci/ctaber/
[10] http://people.virginia.edu/~jdh6n/
[11] https://motherjones.com/files/emotional_dog_and_rational_tail.pdf
[12] http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf
[13] http://psp.sagepub.com/content/23/6/636.abstract
[14] http://www.law.yale.edu/faculty/DKahan.htm
[15] https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf
[16] http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers
[17] http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html
[18] http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf
[19] http://www.sociology.northwestern.edu/faculty/prasad/home.html
[20] http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf
[21] http://www.factcheck.org/
[22] http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf
[23] http://communication.stanford.edu/faculty/krosnick/
[24] http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf
[25] http://en.wikipedia.org/wiki/Narrowcasting
[26] http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming
[27] http://environment.yale.edu/profile/leiserowitz/
[28] http://environment.yale.edu/climate/
[29] http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html
[30] http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html
[31] http://sethmnookin.com/
[32] http://www.powells.com/biblio/1-9781439158647-0
[33] http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print
[34] http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext
[35] http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf
[36] http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact

Climate change poses grave threat to security, says UK envoy (The Guardian)

Rear Admiral Neil Morisetti, special representative to foreign secretary, says governments can’t afford to wait for 100% certainty

The Guardian, Sunday 30 June 2013 18.19 BST

Flooding in Thailand in 2011

Flooding in Thailand in 2011. Photograph: Narong Sangnak/EPA

Climate change poses as grave a threat to the UK’s security and economic resilience as terrorism and cyber-attacks, according to a senior military commander who was appointed as William Hague’s climate envoy this year.

In his first interview since taking up the post, Rear Admiral Neil Morisetti said climate change was “one of the greatest risks we face in the 21st century”, particularly because it presented a global threat. “By virtue of our interdependencies around the world, it will affect all of us,” he said.

He argued that climate change was a potent threat multiplier at choke points in the global trade network, such as the Straits of Hormuz, through which much of the world’s traded oil and gas is shipped.

Morisetti left a 37-year naval career to become the foreign secretary’s special representative for climate change, and represents the growing influence of hard-headed military thinking in the global warming debate.

The link between climate change and global security risks is on the agenda of the UK’s presidency of the G8, including a meeting to be chaired by Morisetti in July that will include assessment of hotspots where climate stress is driving migration.

Morisetti’s central message was simple and stark: “The areas of greatest global stress and greatest impacts of climate change are broadly coincidental.”

He said governments could not afford to wait until they had all the information they might like. “If you wait for 100% certainty on the battlefield, you’ll be in a pretty sticky state,” he said.

The increased threat posed by climate change arises because droughts, storms and floods are exacerbating water, food, population and security tensions in conflict-prone regions.

“Just because it is happening 2,000 miles away does not mean it is not going to affect the UK in a globalised world, whether it is because food prices go up, or because increased instability in an area – perhaps around the Middle East or elsewhere – causes instability in fuel prices,” Morisetti said.

“In fact it is already doing so,” he added, noting that Toyota’s UK car plants had been forced to switch to a three-day week after extreme floods in Thailand cut the supply chain. Computer firms in California and Poland were left short of microchips by the same floods.

Morisetti is far from the only military figure emphasising the climate threat to security. America’s top officer tackling the threat from North Korea and China has said the biggest long-term security issue in the region is climate change.

In a recent interview, Admiral Samuel J Locklear III, who led the US naval action in Libya that helped topple Muammar Gaddafi, said a significant event related to the warming planet was “the most likely thing that is going to happen that will cripple the security environment, probably more likely than the other scenarios we all often talk about”.

There is a reason why the military are so clear-headed about the climate threat, according to Professor John Schellnhuber, a scientist who briefed the UN security council on the issue in February and formerly advised the German chancellor, Angela Merkel.

“The military do not deal with ideology. They cannot afford to: they are responsible for the lives of people and billions of pounds of investment in equipment,” he said. “When the climate change deniers took their stance after the Copenhagen summit in 2009, it is very interesting that the military people were never shaken from the idea that we are about to enter a very difficult period.”

He added: “This danger of the creation of violent conflicts is the strongest argument why we should keep climate change under control, because the international system is not stable, and the slightest thing, like the food riots in the Middle East, could make the whole system explode.”

The military has been quietly making known its concern about the climate threat to security for some time. General Wesley Clark, who commanded the Nato bombing of Yugoslavia during the Kosovo war, said in 2005: “Stopping global warming is not just about saving the environment, it’s about securing America for our children and our children’s children, as well.”

In the same year Chuck Hagel, now Obama’s defence secretary, said: “I don’t think you can separate environmental policy from economic policy or energy policy.”

Morisetti said there was also a direct link between climate change and the military because of the latter’s huge reliance on fossil fuels. “In Afghanistan, where we have had to import all our energy into the country along a single route that has been disrupted, the US military have calculated that for every 24 convoys there has been a casualty. There is a cost associated in bringing in that energy in both blood and treasure.

“So to drive up efficiency and to use alternative fuels, wind, solar, makes eminent sense to the military,” he said, noting that the use of solar blankets in Afghanistan meant fewer fuel resupply missions. “The principles of delivering your outputs more effectively, reducing your risks and reducing your costs reads across far more widely than just the military: most businesses would be looking for that too.”

Morisetti’s former employer, the Ministry of Defence, agrees that the climate threat is a serious one. The last edition of the Global Strategic Trends analysis published by the MoD’s Development, Concepts and Doctrine Centre concludes: “Climate change will amplify existing social, political and resource stresses, shifting the tipping point at which conflict ignites … Out to 2040, there are few convincing reasons to suggest that the world will become more peaceful.”

Schellnhuber was also clear about the consequences of failing to curb global warming. “The last 11,000 years – the Holocene – was characterised by the extreme stability of global climate. It is the only period when human civilisation could have developed at all,” he said. “But I don’t think a global, interconnected world can be managed in peace if climate change means we are leaving the Holocene. Let’s pray we will have a Lincoln or a Gorbachev to lead us.”

Subcommittee Reviews Legislation to Improve Weather Forecasting (Subcommittee on Environment, House of Representatives, USA)

MAY 23, 2013

Washington, D.C. – The Subcommittee on Environment today held a hearing to examine ways to improve weather forecasting at the National Oceanic and Atmospheric Administration (NOAA). Witnesses provided testimony on draft legislation that would prioritize weather-related research at NOAA, in accordance with its critical mission to protect lives and property through enhanced weather forecasting. The hearing was timely given the recent severe tornadoes in the Midwest and superstorms like Hurricane Sandy.

Environment Subcommittee Chairman Chris Stewart (R-Utah): “We need a world-class system of weather prediction in the United States – one, as the National Academy of Sciences recently put it, that is ‘second to none.’ We can thank the hard-working men and women at NOAA and their partners throughout the weather enterprise for the great strides that have been made in forecasting in recent decades. But we can do better. And it’s not enough to blame failures on programming or sequestration or lack of other resources. As the events in Moore, Oklahoma have demonstrated, we have to do better. But the good news is that we can.”

Experts within the weather community have raised concern that the U.S. models for weather prediction have fallen behind those of Europe and other parts of the world in predicting weather events. The Weather Forecasting Improvement Act, draft legislation discussed at today’s hearing, would build upon the down payment made by Congress following Hurricane Sandy and restore the U.S. as a leader in this field through expanded computing capacity and data assimilation techniques.

Rep. Stewart: “The people of Moore, Oklahoma received a tornado warning 16 minutes before the twister struck their town. Tornado forecasting is difficult but lead times for storms have become gradually better. The draft legislation would prioritize investments in technology being developed at NOAA’s National Severe Storms Laboratory in Oklahoma, which ‘has the potential to provide revolutionary improvements in… tornado… warning lead times and accuracy, reducing false alarms’ and could move us toward the goal of being able to ‘warn on forecast.’”

The following witnesses testified today:

Mr. Barry Myers, Chief Executive Officer, AccuWeather, Inc.

Mr. Jon Kirchner, President, GeoOptics, Inc.

Geoengineering: Can We Save the Planet by Messing with Nature? (Democracy Now!)

Video: http://www.democracynow.org/2013/5/20/geoengineering_can_we_save_the_planet

Clive Hamilton, professor of public ethics at Charles Sturt University in Canberra, Australia. He is the author of the new book, Earthmasters: The Dawn of the Age of Climate Engineering.

“It’s happening now… The village is sinking” (The Guardian)

Residents of Newtok, Alaska, know they must evacuate, but who will pay the $130m cost of moving them?

Children jump over ground affected by erosion in Newtok. Natural erosion has accelerated due to climate change, with large areas of land lost to the Ninglick River each year. Photograph: Brian Adams

Suzanne Goldenberg in Newtok, Alaska, with video by Richard Sprenger

One afternoon in the waning days of winter, the most powerful man in Newtok, Alaska, hopped on a plane and flew 1,000 miles to plead for the survival of his village. Stanley Tom, Newtok’s administrator, had a clear purpose for his trip: find the money to move the village on the shores of the Bering Sea out of the way of an approaching disaster caused by climate change.

Village administrator Stanley Tom stands at Mertarvik, the site of relocated Newtok. Photograph: Brian Adams Photography

Newtok was rapidly losing ground to erosion. The land beneath the village was falling into the river. Tom needed money for bulldozers to begin preparing a new site for the village on higher ground. He needed funds for an airstrip. He came back from his meetings in Juneau, the Alaskan state capital, with expressions of sympathy – but nothing in the way of the cash he desperately needed. “It’s really complicated,” he said. “There are a lot of obstacles.”

Those obstacles – financial, legal and a supremely frustrating bureaucratic process – had slowed down the move for so long that some in Newtok, which is about 400 miles south of the Bering Strait that separates the US from Russia, feared they would be stuck as the village went down around them, houses swallowed up by the river.

“It’s really alarming,” said Tom, slumped in an armchair a few hours after his return to the village. “I have a hard time sleeping, and I’m getting up early in the morning. I am worried about it every day.”

The uncertainty was tearing the village apart. It also began to turn the village against Tom.

Over the winter, a large group of villagers decided that their administrator was not up to the job. By the time he returned from this particular trip, the dissidents had voted to replace the village council and to sack Tom – a vote that he ignored.

“The way I see it, we need someone who knows how to do the work,” said Katherine Charles, one of Tom’s most vocal critics. “I feel like we are being neglected. We are still standing here and we don’t know when we are going to move. For years now we have been frustrated. I have to ask myself: why are we even still here?”

It’s been more than a decade since Tom took charge of running Newtok, and leading the village out of climate disaster to higher ground.

The ground beneath Newtok is disappearing. Natural erosion has accelerated due to climate change, with large areas of land lost to the Ninglick river each year. A study by the Army Corps of Engineers found the highest point in the village would be below water level by 2017. The proximity of the threat to Newtok means that its villagers are likely to be America’s first climate refugees.

Officials in Anchorage say Tom has worked tirelessly to move the village out of the way of a rampaging river. Among the relatively small circle of bureaucrats and lawyers who concern themselves with the problems of small and remote indigenous Alaskan villages, the Newtok administrator has a stellar reputation. He has won leadership awards from Native American groups in the rest of the country.

Tom said he hoped to make a big push this summer, acquiring heavy equipment that locals could use to begin moving some of the existing houses over to the new village site at Mertarvik nine miles to the south.

“It’s really happening right now. The village is sinking and flooding and eroding,” he said. He said he was planning to move his own belongings to the new village site this summer – and that villagers should start doing the same.

But Tom, despite his lobbying missions to Juneau and strong reputation with government officials, has failed to inject federal and state officials with that same sense of urgency.

Melting permafrost, sea-level rise, erosion – these are some of the worst consequences of climate change for Alaska. But none of those elements in Newtok’s slow destruction are recognised as disasters under existing legislation.

That means there is no designated pot of money set aside for those affected communities – unlike cities or towns destroyed by floods or tornadoes.

We weren’t thinking of climate change when federal disaster relief legislation was passed.

Robin Bronen, a human rights lawyer in Anchorage. ‘This is completely a human rights issue’ Photograph: Richard Sprenger

“We weren’t thinking of climate change when federal disaster relief legislation was passed,” said Robin Bronen, a human rights lawyer in Anchorage who has made a dozen visits to Newtok. “Our legal system is not set up. The institutions that we have created to respond to disasters are not up to the task of responding to climate change.”

In Bronen’s view, Congress needed to rewrite existing disaster legislation to take account of climate change. Communities needed to be able to access those disaster funds — if not to rebuild in place, which is not feasible in Newtok’s case, then to move.

The authorities also had responsibility under the treaty agreements with indigenous Alaskan tribes to guarantee the safety and wellbeing of indigenous communities, she argued.

“This is completely a human rights issue,” Bronen said. “When you are talking about a people who have done the least to contribute to our climate crisis facing such dramatic consequences as a result of climate change, we have a moral and legal responsibility to respond and provide the funding needed so that these communities are not in danger.”

Until then, however, it was up to Tom to find new ways to prise funds out of an unresponsive bureaucracy. It turned out that he had a knack for it.

Government officials praised Tom for finding other sources of funds, such as development grants, and putting them to use for building the new village site. But it has been a laborious process for the remote village to find its way through the different funding agencies and a maze of competing regulations.

As Tom found out, each agency had its own set of rules. The state government would not build a school for fewer than 10 children. The federal government would not build an airstrip at a village without a post office. But the rules, from Newtok’s vantage point, appeared to have at least one point in common. They seemed to conspire against the village ever getting its move off the ground.

In 2011, Alaska’s government published a timetable for Newtok’s move, setting out dates for building an emergency centre, housing, an airstrip – all items on Tom’s list. Two years later, the plan is already behind schedule and the official who oversaw that original timetable said there was little chance of getting back on track.

“Newtok is something that is probably going to play out over several decades unless it reaches a dire point where something has to be done immediately to keep the people safe,” said Larry Hartig, who heads Alaska’s Department of Environmental Conservation.

Officially, the government of Alaska remains committed to helping Newtok and all the other indigenous Alaskan villages that are threatened by climate change.

Almost all of Alaska’s indigenous villages – more than 180 – are experiencing the effects of climate change, including severe flooding and erosion. Some may be able to hold back rivers and sea, but others will have to move. About half a dozen villages, including Newtok, face extreme risks.

A mosaic of sea ice shifts across the Bering Sea, west of Alaska on 5 February, 2008. On either side of the Bering Strait (top centre) the land is blanketed with snow. Anchorage is located in the middle right-hand side of the image, at the top of the Cook Inlet. The village of Newtok is located north of Nunivak Island (middle), close to the coast on the lowland plain of the Yukon-Kuskokwim Delta, one of the largest river deltas in the world, which mostly consists of tundra. Photograph: NASA/Aqua/MODIS

“I am not going to tell any community that they are not going to survive. If the residents want to survive, we will help them,” said Mead Treadwell, the state’s lieutenant governor.

But the cost of relocating just one village — Newtok — could run as high as $130m, according to an estimate by the Army Corps of Engineers. That’s more than $350,000 per villager. Multiply that by half a dozen, or several more times, and the cost of protecting indigenous Alaskan villages from climate change soon soars into the billions.

So far, Newtok has received a total of about $12m in state funds over the past four years, according to George Owletuck, a consultant hired by Tom to help with the move. Much of that has already gone, to build a barge landing, a few new homes, and an emergency evacuation centre – in case the village does not manage to move in time.

Officially, federal and state government agencies have spent some $27m getting Mertarvik ready, although a considerable share of that figure, some $6m, did not go directly to the relocation, said Sally Russell Cox, the state official overseeing the move. And there is still no major infrastructure completed at Mertarvik.

Would the government of Alaska commit to picking up the rest of the tab for Newtok and the other villages?

Alaska’s oil revenues have fallen off over the years. In 2012, the state slipped into second place for oil production behind North Dakota. Treadwell admitted the state government would not cover the entire cost of fortifying or moving all of the villages threatened by climate change.

“On the question of is there money to help them with one cheque? That is something there clearly is not,” he said.

Treadwell suggested some of the at-risk villages could raise funds by setting themselves up as hubs for oil companies hoping to drill in Arctic waters.

However, a number of oil companies have put their Arctic drilling plans on hold for 2013 and 2014. Treadwell admitted there was as yet no comprehensive climate change plan for Newtok and other villages. “I think it’s going to be piece by piece with each community and many different pots of money,” he said.

In the case of Newtok, Owletuck, the consultant, had big ideas for financing the move: growing fruit and vegetables hydroponically in greenhouses, or testing the possibilities of producing biofuels from algae.

He let it be known the village may even have found a mysterious benefactor. Owletuck said he’d had an approach from private individuals, whom he declined to name, wanting to donate $22m to the move.

None of those propositions have materialised, however. And after more than a decade of uncertainty about the future under climate change, the basic infrastructure of Newtok is coming apart.

The impact on Newtok

The rising water, driven by erosion and climate change, has dramatically altered Newtok Village and by 2027 is expected to cover nearly a third of the village. Historic shorelines digitized from USGS topographic maps and aerial photos. Source: Army Corps of Engineers

Snow covers up a lot of Newtok’s flaws: the open sewage pits, the broken boardwalk over mudflats, some of the abandoned snowmobile wrecks.

Newtok has for years been considered a “distressed village”, with an average income of $16,000, well below the rest of the state. Fewer than half of the adults in the village have paid work. But even by those dismal measures, conditions have sharply deteriorated in the years since the village began planning its move.

Aside from the clinic and the school, most buildings are in a state of advanced dilapidation. The floor in the community hall sags like an old mattress. The community laundry is out of order.

In the cramped offices of the traditional council, where Tom works, the furniture dates from the 1970s or 1980s: mid-brown vinyl chairs whose casing has split open, revealing the dirty foam inside. It’s not unheard of to find families with 10 or 12 children living in houses of less than 800 sq ft – and none of those homes have flush toilets or running water.

Early mornings find the men of the household trudging out of their homes with five-gallon buckets of waste, which get dumped at various spots on the edges of the village, including a small stream.

The diesel-powered generator was nearing the end of its life span. The water treatment plant was shut down last October after people began getting sick. Tom said there was contamination from leaking jet fuel at the airport.

For now, villagers are drawing water from the school, which had a separate system. But the school principal said he would have to cut that off in May to preserve the system for the schoolchildren.

Tom said there was nothing he could do. Government agencies would not fund improvements at the current village site, because of the plan to move. “There is no money to improve our community,” he said. “We are suspended from federal and state agencies and there is no way of improving our lives over here. The agencies do not want to work on both villages at once.”

By last October, frustration with the stalled move and conditions in the village exploded. Villagers accused their own council of failing to hold regular elections, and raised a petition to throw out the leaders and replace Tom.

Some accused him of presiding over a dictatorship in the village. Others speculated that he and the paid consultant, Owletuck, were plotting to rob the relocation funds.

One of the dissidents, a relative newcomer to the village, posted ferocious criticism of Tom on Facebook calling for rebellion.

The dissidents organised elections, voted out the old council and installed their own leaders. Tom ignored the result. “Let them cry all they want,” he said. “I don’t care. They are not going to help my community. I am way ahead of these guys.”

The upheavals in Newtok are sadly familiar to those who have worked with indigenous Alaskan villages confronting climate change. “I don’t think you would find one community that says they are happy with the pace that’s gone on,” said Patricia Cochran, director of the Alaska Native Science Commission.

“To be honest with you, I think the state and the feds have done a terrible job, not only in assessing the conditions that communities are living within but in responding to them,” she said. “Because these communities are listed as threatened and may potentially be relocated, they are not able to get any funds now for infrastructure that is being damaged right now.”

That leaves communities stuck in a limbo that can carry on for years or even decades.

That’s what has become of Newtok. The effects are devastating, said Charles. Beyond all her anger, she admitted, was an all-enveloping fear. “Sometimes I get scared. I’m scared for my own family. How will I take care of them if the relocation doesn’t start right away?”

She had been waiting for years to see the beginnings of any new settlement in rural Alaska rising up on the rocky hill of Mertarvik: the airport, the barge landing, the school, the houses. None of it was there yet, and Charles said she was coming close to despair.

“It’s been going on for I don’t know how long, and I am beginning to lose hope.”

For Insurers, No Doubts on Climate Change (N.Y.Times)

Master Sgt. Mark Olsen/U.S. Air Force, via Associated Press. Damage in Mantoloking, N.J., after Hurricane Sandy. Natural disasters caused $35 billion in private property losses last year.

By EDUARDO PORTER

Published: May 14, 2013

If there were one American industry that would be particularly worried about climate change it would have to be insurance, right?

From Hurricane Sandy’s devastating blow to the Northeast to the protracted drought that hit the Midwest Corn Belt, natural catastrophes across the United States pounded insurers last year, generating $35 billion in privately insured property losses, $11 billion more than the average over the last decade.

And the industry expects the situation will get worse. “Numerous studies assume a rise in summer drought periods in North America in the future and an increasing probability of severe cyclones relatively far north along the U.S. East Coast in the long term,” said Peter Höppe, who heads Geo Risks Research at the reinsurance giant Munich Re. “The rise in sea level caused by climate change will further increase the risk of storm surge.”

Most insurers, including the reinsurance companies that bear much of the ultimate risk in the industry, have little time for the arguments heard in some right-wing circles that climate change isn’t happening, and are quite comfortable with the scientific consensus that burning fossil fuels is the main culprit of global warming.

“Insurance is heavily dependent on scientific thought,” Frank Nutter, president of the Reinsurance Association of America, told me last week. “It is not as amenable to politicized scientific thought.”

Yet when I asked Mr. Nutter what the American insurance industry was doing to combat global warming, his answer was surprising: nothing much. “The industry has really not been engaged in advocacy related to carbon taxes or proposals addressing carbon,” he said. While some big European reinsurers like Munich Re and Swiss Re support efforts to reduce CO2 emissions, “in the United States the household names really have not engaged at all.” Instead, the focus of insurers’ advocacy efforts is zoning rules and disaster mitigation.

Last week, scientists announced that the concentration of heat-trapping carbon dioxide in the atmosphere had reached 400 parts per million — its highest level in at least three million years, before humans appeared on the scene. Back then, mastodons roamed the earth, the polar ice caps were smaller and the sea level was as much as 60 to 80 feet higher.

The milestone puts the earth nearer a point of no return, many scientists think, when vast, disruptive climate change is baked into our future. Pieter P. Tans, who runs the monitoring program at the National Oceanic and Atmospheric Administration, told my colleague Justin Gillis: “It symbolizes that so far we have failed miserably in tackling this problem.” And it raises a perplexing question: why hasn’t corporate America done more to sway its allies in the Republican Party to try to avert a disaster that would clearly be devastating to its own interests?

Mr. Nutter argues that the insurance industry’s reluctance is born of hesitation to become embroiled in controversies over energy policy. But perhaps its executives simply don’t feel so vulnerable. Like farmers, who are largely protected from the ravages of climate change by government-financed crop insurance, insurers also have less to fear than it might at first appear.

The federal government covers flood insurance, among the riskiest kind in this time of crazy weather. And insurers can raise premiums or even drop coverage to adjust to higher risks. Indeed, despite Sandy and drought, property and casualty insurance in the United States was more profitable in 2012 than in 2011, according to the Property Casualty Insurers Association of America.

But the industry’s analysis of the risks it faces is evolving. One sign of that is how some top American insurers responded to a billboard taken out by the conservative Heartland Institute, a prominent climate change denier that has received support from the insurance industry.

The billboard had a picture of Theodore Kaczynski, the Unabomber, who asked: “I still believe in global warming. Do you?”

Concerned about global warming and angry to be equated with a murderous psychopath, insurance companies like Allied World, Renaissance Re, State Farm and XL Group dropped their support for Heartland.

Even more telling, Eli Lehrer, a Heartland vice president who at the time led an insurance-financed project, left the group and helped start the R Street Institute, a standard conservative organization in all respects but one: it believes in climate change and supports a carbon tax to combat it. And it is financed largely with insurance industry money.

Mr. Lehrer points out that a carbon tax fits conservative orthodoxy. It is a broad and flat tax, whose revenue can be used to do away with the corporate income tax — a favorite target of the right. It provides a market-friendly signal, forcing polluters to bear the cost imposed on the rest of us and encouraging them to pollute less. And it is much preferable to a parade of new regulations from the Environmental Protection Agency.

“We are having a debate on the right about a carbon tax for the first time in a long time,” Mr. Lehrer said.

Bob Inglis, formerly a Republican congressman from South Carolina who lost his seat in the 2010 primary to a Tea Party-supported challenger, is another member of this budding coalition. Before he left Congress, he proposed a revenue-neutral bill to create a carbon tax and cut payroll taxes.

Changing the political economy of a carbon tax remains an uphill slog, especially in a stagnant economy. But Mr. Inglis notices a thaw. “The best way to do this is in the context of a grand bargain on tax reform,” he said. “It could happen in 2015 or 2016, but probably not before.”

He lists a dozen Republicans in the House and eight in the Senate who would be open to legislation to help avert climate change. He notes that Exelon, the gas and electricity giant, is sympathetic to his efforts — perhaps not least because a carbon tax would give an edge to gas over its dirtier rival, coal. Exxon, too, has said a carbon tax would be the most effective way to reduce emissions. So why hasn’t the insurance industry come on board?

Robert Muir-Wood is the chief research officer of Risk Management Solutions, one of two main companies the insurance industry relies on to crunch data and model future risks. He argues that insurers haven’t changed their tune because — with the exception of 2004 and 2005, when a string of hurricanes from Ivan to Katrina caused damage worth more than $200 billion — they haven’t yet experienced hefty, sustained losses attributable to climate change.

“Insurers were ready to sign up to all sorts of actions against climate change,” Mr. Muir-Wood told me from his office in London. Then the weather calmed down.

Still, Mr. Muir-Wood notes that the insurance industry faces a different sort of risk: political action. “That is the biggest threat,” he said. When insurers canceled policies and raised premiums in Florida in 2006, politicians jumped on them. “Insurers in Florida,” he said, “became Public Enemy No. 1.”

And that’s the best hope for those concerned about climate change: that global warming isn’t just devastating for society, but also bad for business.

World Bank turns to hydropower to square development with climate change (Washington Post)

Michael Reynolds/European Pressphoto Agency – World Bank President Jim Yong Kim attends the Fragility Forum this month in Washington. The forum discussed ways for fragile nations to improve their economies, their infrastructure and the well-being of their citizens.

Published: May 8, 2013

The World Bank is making a major push to develop large-scale hydropower projects around the globe, something it had all but abandoned a decade ago but now sees as crucial to resolving the tension between economic development and the drive to tame carbon use.

Major hydropower projects in Congo, Zambia, Nepal and elsewhere — all of a scale dubbed “transformational” to the regions involved — are a focus of the bank’s fundraising drive among wealthy nations. Bank lending for hydropower has scaled up steadily in recent years, and officials expect the trend to continue amid a worldwide boom in water-fueled electricity.

Such projects were shunned in the 1990s, in part because they can be disruptive to communities and ecosystems. But the World Bank is opening the taps for dams, transmission lines and related infrastructure as its president, Jim Yong Kim, tries to resolve a quandary at the bank’s core: how to eliminate poverty while adding as little as possible to carbon emissions.

“Large hydro is a very big part of the solution for Africa and South Asia and Southeast Asia. . . . I fundamentally believe we have to be involved,” said Rachel Kyte, the bank’s vice president for sustainable development and an influential voice among Kim’s top staff members. The earlier move out of hydro “was the wrong message. . . . That was then. This is now. We are back.”

It is a controversial stand. The bank backed out of large-scale hydropower because of the steep trade-offs involved. Big dams produce lots of cheap, clean electricity, but they often uproot villages in dam-flooded areas and destroy the livelihoods of the people the institution is supposed to help. A 2009 World Bank review of hydropower noted the “overwhelming environmental and social risks” that had to be addressed but also concluded that Africa and Asia’s vast and largely undeveloped hydropower potential was key to providing dependable electricity to the hundreds of millions of people who remain without it.

“What’s the one issue that’s holding back development in the poorest countries? It’s energy. There’s just no question,” Kim said in an interview.

Advocacy groups remain skeptical, arguing that large projects, such as Congo’s long-debated network of dams around Inga Falls, may be of more benefit to mining companies or industries in neighboring countries than poor communities.

“It is the old idea of a silver bullet that can modernize whole economies,” said Peter Bosshard, policy director of International Rivers, a group that has organized opposition to the bank’s evolving hydro policy and argued for smaller projects designed around communities rather than mega-dams meant to export power throughout a region.

“Turning back to hydro is being anything but a progressive climate bank,” said Justin Guay, a Sierra Club spokesman on climate and energy issues. “There needs to be a clear shift from large, centralized projects.”

The major nations that support the World Bank, however, have been pushing it to identify such projects — complex undertakings that might happen only if an international organization is involved in sorting out the financing, overseeing the performance and navigating the politics.

The move toward big hydro comes amid Kim’s stark warning that global warming will leave the next generation with an “unrecognizable planet.” That dire prediction, however, has left him struggling to determine how best to respond and frustrated by some of the bank’s inherent limitations.

In his speeches, Kim talks passionately about the bank’s ability to “catalyze” and “leverage” the world to action by mobilizing money and ideas, and he says he is hunting for ideas “equal to the challenge” of curbing carbon use. He has criticized the “small bore” thinking that he says has hobbled progress on the issue.

However, the bank remains in the business of financing traditional fossil-fuel plants, including those that use the dirtiest form of coal, as well as cleaner but carbon-based natural gas infrastructure.

Among the projects likely to cross Kim’s desk in coming months, for example, is a 600-megawatt power plant in Kosovo that would be fired by lignite coal, the bottom of the barrel when it comes to carbon emissions.

The plant has strong backing from the United States, the World Bank’s major shareholder. It also meshes with one of the bank’s other long-standing imperatives: Give countries what they ask for. The institution has 188 members to keep happy and can go only so far in trying to impose its judgment over that of local officials. Kim, who in his younger days demonstrated against World Bank-enforced “orthodoxy” in economic policy, now may be hard-pressed to enforce an energy orthodoxy of his own.

Kosovo’s domestic supplies of lignite are ample enough to free the country from imported fuel. Kim said there is little question that Kosovo needs more electricity, and the new plant will allow an older, more polluting facility to be shut down.

“I would just love to never sign a coal project,” Kim said. “We understand it is much, much dirtier, but . . . we have 188 members. . . . We have to be fair in balancing the needs of poor countries . . . with this other bigger goal of tackling climate change.”

The bank is working on other ideas. Kim said he is considering how it might get involved in creating a more effective world market for carbon, allowing countries that invest in renewable energy or “climate friendly” agriculture to be paid for their carbon savings by industries that need to use fossil fuels. Existing carbon markets have been plagued with volatile pricing — Europe’s cost of carbon has basically collapsed — or rules that prevent carbon trading with developing countries.

“We’ve got to figure out a way to establish a stable price of carbon,” Kim said. “Everybody knows that.”

He has also staked hope for climate progress on developments in agriculture.

Hydropower projects, however, seem notably inside what Kim says is the bank’s sweet spot — complex, high-impact, green and requiring the sort of joint public and private financing Kim says the bank can attract.

The massive hydropower potential of the Congo River, estimated at about 40,000 megawatts, is such a target. Its development is on a list of top world infrastructure priorities prepared by the World Bank and other development agencies for the Group of 20 major economic powers.

Two smaller dams on the river have been plagued by poor performance and are being rehabilitated with World Bank assistance. A third being planned would represent a quantum jump — a 4,800-megawatt, $12 billion giant that would move an entire region off carbon-based electricity.

The African Development Bank has begun negotiations over the financing, and the World Bank is ready to step in with tens of millions of dollars in technical-planning help.

“In an ideal world, we start building in 2016. By 2020, we switch on the lights,” said Hela Cheikhrouhou, energy and environment director for the African Development Bank.

It is the sort of project that the World Bank had stayed away from for many years — not least because of instability in the country. But as the country tries to move beyond its civil war and the region intensifies its quest for the power to fuel economic growth, the bank seems ready to move. Kim will visit Congo this month for a discussion about development in fragile and war-torn states.

Kyte, the World Bank vice president, said the Inga project will be high on the agenda.

“People have been looking at the Inga dam for as long as I have been in the development business,” she said. “The question is: Did the stars align? Did you have a government in place? Did people want to do it? Are there investors interested? Do you have the ability to do the technical work? The stars are aligned now. Let’s go.”

Chilean Judge Upholds Manslaughter Charges Linked to ’10 Tsunami (N.Y.Times)

By PASCALE BONNEFOY

Published: May 16, 2013

SANTIAGO, Chile — A judge dismissed an appeal to suspend involuntary manslaughter charges against four government officials accused of failing to issue a tsunami alert after the 8.8-magnitude earthquake that struck Chile in 2010.

“This court believes that not enough was done to avoid the catastrophic results” of the quake, Judge Ponciano Sallés said in his ruling. “Any reasonable analysis would conclude that the risk was greater by not evacuating the population than by doing so,” he said, adding that “information was concealed.”

The investigation into the deaths of 156 people and the disappearance of 25 more during the tsunami seeks to establish responsibility for the confusing and contradictory chain of decisions made by government officials and emergency agencies shortly after the earthquake. The actions resulted in mistaken public assurances that there was no risk of tsunami, despite reports that one had already devastated the Juan Fernández Archipelago in the Pacific, west of the Chilean coast.

The former director of the National Emergency Agency, Carmen Fernández, is accused of providing false information and not issuing a tsunami alert. The former under secretary of the interior, Patricio Rosende, has been charged with “imprudent conduct” in neglecting to warn the population. Both argued that it was up to the navy’s oceanographic service to issue the alert.

According to survivors, many families returned to their homes on the coast after hearing the president at the time, Michelle Bachelet, say on the radio that there was no danger of a tsunami. Raúl Meza, a lawyer for one victim’s family, has formally requested that prosecutors interrogate the former president as a suspect. Ms. Bachelet, who is campaigning for the presidential elections in November, has testified twice, but as a witness.

Three other officials have also been charged but did not appeal. The accusations, filed last year against the seven, include operating with inexperienced personnel, lacking knowledge on the use of technology, leaving shifts vacant at regional emergency agencies and ignoring field reports.

“If the accused had been fulfilling their duties, lives would have been saved,” said the lead prosecutor, Solange Huerta, after the ruling.

A version of this article appeared in print on May 17, 2013, on page A10 of the New York edition with the headline: Chilean Judge Rejects Appeal Of Charges In ’10 Tsunami.

São Paulo City Council approves text-message rain alerts (Folha de S.Paulo)

16/05/2013 – 5:01 p.m.

GIBA BERGAMIM JR., SÃO PAULO

Updated at 5:54 p.m.

The São Paulo City Council has approved a bill requiring the city government to send text-message alerts to residents’ mobile phones warning of approaching rain and imminent flooding.

Authored by councilman Ricardo Young (PPS), the bill now needs to be signed by Mayor Fernando Haddad (PT) to take effect.

Today, the only ways to get such information are by following the news on radio and TV or through the website of the city’s CGE (Centro de Gerenciamento de Emergências), which monitors rainfall across the city.

The councilman says he was inspired by similar initiatives in the United States and Europe, which provide warnings about snowstorms, for example.

In 2011, the city government launched a similar project, but only for residents of the Pantanal favela, in São Paulo’s east zone, which suffered flooding for nearly two full months during the summer.

Under Young’s bill, the city will have to sign agreements with mobile phone carriers, and the alerts must reach residents at least two hours in advance.

According to the councilman, the bill would allow companies, government offices and schools to end the workday early so that residents could get home before the rain.

BILLS

The council also approved a bill by councilmen Antonio Goulart (PSD) and Roberto Trípoli (PV) allowing pets to be buried in the same plot as their owners in municipal cemeteries. The bill will go to a second vote.

A year before the World Cup, the councilmen also granted FIFA president Joseph Blatter the title of honorary citizen of São Paulo.

Scientist Superheroes: The US Government’s Crisis Science Team (Quest)

Posted on May 13, 2013 by a Guest Contributor for science.kqed.org


If your town were suddenly struck by an earthquake or hurricane, you could count on the arrival of police, firefighters, and medical technicians to aid in the emergency response. As of this past January, the US government has added a new team of responders to this list—scientists.

The Strategic Sciences Group was formed under Secretary of the Interior Ken Salazar in order to help the department “act quickly, decisively and effectively when hurricanes, droughts, oil spills, wildfires or other crises strike.” The group was initially tested as a pilot program during the Deepwater Horizon oil spill and is now a permanent part of the Department of the Interior.

But don’t expect to see people in lab coats rushing into burning buildings or diving into flooding rivers.

“The Strategic Sciences Group’s mission,” says group co-leader Gary Machlis, “is to very quickly assemble a team of scientists to develop scenarios of what the cascading consequences of a crisis might be.” These scenarios are projections of all the different ways in which the disaster might play out. The projections are then delivered to the President and other national leaders to help inform real-time, emergency-response decisions.

During Deepwater Horizon, for example, calculations regarding oil flow rates helped decision-makers respond to obvious problems such as ecosystem contamination, as well as some subtler consequences: long-term displacement of oyster harvesters, disproportionate economic impacts upon cultural communities, and diminished hurricane resilience due to wetland stress.

Deepwater Horizon in flames, April 21, 2010. (AP Photo/U.S. Coast Guard)

With such a breadth of potential consequences to examine, it’s clear that the Strategic Sciences Group’s task is not simple. What is surprising, however, is that many of the group’s difficulties stem from the nature of the scientific process itself.

“Scientists are very accustomed to being deliberate in their work,” says Marcia McNutt, who worked with the Strategic Sciences Group during her tenure as director of the United States Geological Survey. “The idea that they might get critical info on a Monday night and need to have their best guess of what that info means by six a.m. Tuesday is just not the normal way science operates.”

And, Machlis adds, the rapidly determined results must also be communicated persuasively.

“It isn’t enough to do good science. You might have an extraordinary, complex scenario figured out that’s important for leaders to know, but if you can’t tell that story clearly and effectively, it’s of less value.”

These requirements—the ability to work with urgency, cope with uncertainty, and communicate well with non-scientists—separate crisis science from traditional scientific research.

When creating the Strategic Sciences Group, Machlis responded to these unique demands by borrowing ideas from an unusual agency: The Office of Strategic Services, or OSS, which was established during World War II to coordinate espionage activities behind enemy lines.

Machlis says he has learned four important lessons from the historic military-intelligence organization:

1. Who You Hire Matters

OSS founder Maj. Gen. William J. Donovan with members of the OSS Operational Groups. (Photo courtesy of the OSS Society)

“The ideal candidates for the OSS were PhDs who could win a bar fight,” says Machlis.

While martial arts training is not an actual requirement, the Strategic Sciences Group does seek scientists with a certain mental and physical tenacity. “They’ve got to be expert in their own discipline, able to transcend their own discipline and work well with other scientists, and they have to be able to work extremely hard under very intense conditions.”

2. Expertise Not Representation

“Our goal is to get the very best people in the field working on these science teams. It is less about do we need one person from this agency and one person from this agency to make sure it’s representative.”

From government scientists to graduate students, anyone with the necessary skills can be recruited and put onto Machlis’ lists, or “rosters”.

3. Be Flexible

US Department of the Interior Strategic Sciences Working Group, New Orleans, LA, July 2010.

US Department of the Interior Strategic Sciences Working Group, New Orleans, LA, July 2010. Photo Credit: Jason Newman

Machlis ensures that the rosters are highly interdisciplinary. The 30 scientists who have already been called to action include “an anthropologist with expertise in disaster response from Louisiana, a public health medical officer from Washington, DC, a coastal geomorphologist from California, an ecologist working with a major natural history museum, and a Forest Service social scientist with expertise in urban ecology.”

Such diversity allows the Strategic Sciences Group to be highly adaptable.

“Each crisis that might happen, whether it’s an oil spill, whether it’s an earthquake, whether it’s a dam failure, there’s always going to be many elements of it that are unique. We need to be flexible enough to choose a team that has reliable expert scientists appropriate to that crisis.”

4. Avoid Bureaucracy

“We stay focused on the mission rather than developing a lot of complicated, time-consuming bureaucratic processes.”

In order to avoid creating a large government agency, Machlis only activates the rostered scientists when a disaster occurs. At all other times, the Strategic Sciences Group is made up of just three people.

When there is no active crisis, this three-person team spends its time evaluating the consequences of potential ones.

By considering situations such as “a forest fire in the Sierras during Yosemite’s tourist season, or a pandemic, or an arctic oil spill,” the group hopes to pre-emptively increase response preparedness. Ultimately, Machlis aims to have the capacity to address simultaneous, bi-coastal disasters: for example, an earthquake in California and a hurricane in New York on the same day.

When will the group be ready for such a situation?

“I would hope that we’re prepared for that within the year.”

Mark that down as December 31, 2013: the date when you can expect not only police and firefighters, but also scientists, to play a role in addressing the next major natural or man-made disaster.

Climate Change Will Cause Widespread Global-Scale Loss of Common Plants and Animals, Researchers Predict (Science Daily)

May 12, 2013 — More than half of common plants and one third of the animals could see a dramatic decline this century due to climate change, according to research from the University of East Anglia.

Plants, reptiles and particularly amphibians are expected to be at highest risk. (Image: frog; credit: © Anna Omelchenko / Fotolia)

Research published today in the journal Nature Climate Change looked at 50,000 globally widespread and common species and found that more than one half of the plants and one third of the animals will lose more than half of their climatic range by 2080 if nothing is done to reduce the amount of global warming and slow it down.

This means that geographic ranges of common plants and animals will shrink globally and biodiversity will decline almost everywhere.

Plants, reptiles and particularly amphibians are expected to be at highest risk. Sub-Saharan Africa, Central America, Amazonia and Australia would lose the most species of plants and animals. And a major loss of plant species is projected for North Africa, Central Asia and South-eastern Europe.

But acting quickly to mitigate climate change could reduce losses by 60 per cent and buy an additional 40 years for species to adapt. This is because this mitigation would slow and then stop global temperatures from rising by more than two degrees Celsius relative to pre-industrial times (1765). Without this mitigation, global temperatures could rise by 4 degrees Celsius by 2100.

The study was led by Dr Rachel Warren from UEA’s School of Environmental Sciences and the Tyndall Centre for Climate Change Research. Collaborators include Dr Jeremy VanDerWal at James Cook University in Australia and Dr Jeff Price, also at UEA’s School of Environmental Sciences and the Tyndall Centre. The research was funded by the Natural Environment Research Council (NERC).

Dr Warren said: “While there has been much research on the effect of climate change on rare and endangered species, little has been known about how an increase in global temperature will affect more common species.

“This broader issue of potential range loss in widespread species is a serious concern as even small declines in these species can significantly disrupt ecosystems.

“Our research predicts that climate change will greatly reduce the diversity of even very common species found in most parts of the world. This loss of global-scale biodiversity would significantly impoverish the biosphere and the ecosystem services it provides.

“We looked at the effect of rising global temperatures, but other symptoms of climate change such as extreme weather events, pests, and diseases mean that our estimates are probably conservative. Animals in particular may decline more as our predictions will be compounded by a loss of food from plants.

“There will also be a knock-on effect for humans because these species are important for things like water and air purification, flood control, nutrient cycling, and eco-tourism.

“The good news is that our research provides crucial new evidence of how swift action to reduce CO2 and other greenhouse gases can prevent the biodiversity loss by reducing the amount of global warming to 2 degrees Celsius rather than 4 degrees. This would also buy time — up to four decades — for plants and animals to adapt to the remaining 2 degrees of climate change.”

The research team quantified the benefits of acting now to mitigate climate change and found that up to 60 per cent of the projected climatic range loss for biodiversity can be avoided.

Dr Warren said: “Prompt and stringent action to reduce greenhouse gas emissions globally would reduce these biodiversity losses by 60 per cent if global emissions peak in 2016, or by 40 per cent if emissions peak in 2030, showing that early action is very beneficial. This will both reduce the amount of climate change and also slow climate change down, making it easier for species and humans to adapt.”

Information on the current distributions of the species used in this research came from the datasets shared online by hundreds of volunteers, scientists and natural history collections through the Global Biodiversity Information Facility (GBIF).

Co-author Dr Jeff Price, also from UEA’s School of Environmental Sciences, said: “Without free and open access to massive amounts of data such as those made available online through GBIF, no individual researcher is able to contact every country, every museum, every scientist holding the data and pull it all together. So this research would not be possible without GBIF and its global community of researchers and volunteers who make their data freely available.”

Journal Reference:

  1. R. Warren, J. VanDerWal, J. Price, J. A. Welbergen, I. Atkinson, et al. Quantifying the benefit of early climate change mitigation in avoiding biodiversity loss. Nature Climate Change, 2013. DOI: 10.1038/nclimate1887

Global Networks Must Be Redesigned, Experts Urge (Science Daily)

May 1, 2013 — Our global networks have generated many benefits and new opportunities. However, they have also established highways for failure propagation, which can ultimately result in human-made disasters. For example, today’s quick spreading of emerging epidemics is largely a result of global air traffic, with serious impacts on global health, social welfare, and economic systems.

(Credit: © Angie Lingnau / Fotolia)

Dirk Helbing’s publication illustrates how cascade effects and complex dynamics amplify the vulnerability of networked systems. For example, just a few long-distance connections can largely undermine our ability to mitigate the threats posed by global pandemics. Initially beneficial trends, such as globalization, increasing network densities, higher complexity, and an acceleration of institutional decision processes, may ultimately push human-made or human-influenced systems towards systemic instability, Helbing finds. Systemic instability refers to a system that will get out of control sooner or later, even if everybody involved is well skilled, highly motivated and behaving properly. Crowd disasters are shocking examples illustrating that many deaths may occur even when everybody tries hard not to hurt anyone.

Our Intuition of Systemic Risks Is Misleading

Networking system components that are well-behaved in separation may create counter-intuitive emergent system behaviors, which are not well-behaved at all. For example, cooperative behavior might unexpectedly break down as the connectivity of interaction partners grows. “Applying this to the global network of banks, this might actually have caused the financial meltdown in 2008,” believes Helbing.

Globally networked risks are difficult to identify, map and understand, since there are often no evident, unique cause-effect relationships. Failure rates may change depending on the random path taken by the system, with the consequence of increasing risks as cascade failures progress, thereby decreasing the capacity of the system to recover. “In certain cases, cascade effects might reach any size, and the damage might be practically unbounded,” says Helbing. “This is quite disturbing and hard to imagine.” All of these features make strongly coupled, complex systems difficult to predict and control, such that our attempts to manage them go astray.
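The role long-range connections play in these cascades can be made concrete with a toy simulation (a hypothetical sketch of the general idea, not a model from Helbing's paper): a failure spreads across a ring network, and adding just a few random long-range links, the kind globalization creates, shortens the time a local failure needs to reach the entire system.

```python
import random
from collections import deque

def make_ring_with_shortcuts(n, shortcuts, seed=0):
    """Ring lattice (each node linked to its two neighbors) plus a few
    random long-range links of the kind global networks create."""
    random.seed(seed)
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for _ in range(shortcuts):
        a, b = random.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    return adj

def spread_rounds(adj, start):
    """Simple contagion: each round, every neighbor of a failed node fails too.
    Returns the number of rounds until the whole network has failed (a BFS)."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

# On a pure ring of 200 nodes the failure needs 100 rounds to reach the far
# side; a handful of shortcuts typically lets it cross far sooner.
print(spread_rounds(make_ring_with_shortcuts(200, 0), 0))
print(spread_rounds(make_ring_with_shortcuts(200, 10), 0))
```

Even this crude model shows why quarantining a region, or isolating a failing bank, becomes much harder once a few long-distance links exist.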

“Take the financial system,” says Helbing. “The financial crisis hit regulators by surprise.” But back in 2003, the legendary investor Warren Buffett warned of mega-catastrophic risks created by large-scale investments in financial derivatives. It took five years until the “investment time bomb” exploded, causing losses of trillions of dollars to our economy. “The financial architecture is not properly designed,” concludes Helbing. “The system lacks breaking points, as we have them in our electrical system.” This allows local problems to spread globally, thereby reaching catastrophic dimensions.

A Global Ticking Time Bomb?

Have we unintentionally created a global time bomb? If so, what kinds of global catastrophic scenarios might humans face in complex societies? A collapse of the world economy or of our information and communication systems? Global pandemics? Unsustainable growth or environmental change? A global food or energy crisis? A cultural clash or global-scale conflict? Or will we face a combination of these contagious phenomena — a scenario that the World Economic Forum calls the “perfect storm”?

“While analyzing such global risks,” says Helbing, “one must bear in mind that the propagation speed of destructive cascade effects might be slow, but nevertheless hard to stop. It is time to recognize that crowd disasters, conflicts, revolutions, wars, and financial crises are the undesired result of operating socio-economic systems in the wrong parameter range, where systems are unstable.” In the past, these social problems seemed to be puzzling, unrelated, and almost “God-given” phenomena one had to live with. Nowadays, thanks to new complexity science models and large-scale data sets (“Big Data”), one can analyze and understand the underlying mechanisms, which let complex systems get out of control.

Disasters should not be considered “bad luck.” They are a result of inappropriate interactions and institutional settings, caused by humans. Even worse, they are often the consequence of a flawed understanding of counter-intuitive system behaviors. “For example, it is surprising that we didn’t have sufficient precautions against a financial crisis and well-elaborated contingency plans,” states Helbing. “Perhaps, this is because there should not be any bubbles and crashes according to the predominant theoretical paradigm of efficient markets.” Conventional thinking can cause fateful decisions and the repetition of previous mistakes. “In other words: While we want to do the right thing, we often do wrong things,” concludes Helbing. This obviously calls for a paradigm shift in our thinking. “For example, we may try to promote innovation, but suffer economic decline, because innovation requires diversity more than homogenization.”

Global Networks Must Be Re-Designed

Helbing’s publication explores why today’s risk analysis falls short. “Predictability and controllability are design issues,” stresses Helbing. “And uncertainty, which means the impossibility to determine the likelihood and expected size of damage, is often man-made.” Many systems could be better managed with real-time data. These would allow one to avoid delayed response and to enhance the transparency, understanding, and adaptive control of systems. However, even all the data in the world cannot compensate for ill-designed systems such as the current financial system. Such systems will sooner or later get out of control, causing catastrophic human-made failure. Therefore, a re-design of such systems is urgently needed.

Helbing’s Nature paper on “Globally Networked Risks” also calls attention to strategies that make systems more resilient, i.e. able to recover from shocks. For example, setting up backup systems (e.g. a parallel financial system), limiting the system size and connectivity, building in breaking points to stop cascade effects, or reducing complexity may be used to improve resilience. In the case of financial systems, there is still much work to be done to fully incorporate these principles.

Contemporary information and communication technologies (ICT) are also far from being failure-proof. They are based on principles that are 30 or more years old and not designed for today’s use. The explosion of cyber risks is a logical consequence. This includes threats to individuals (such as privacy intrusion, identity theft, or manipulation through personalized information), to companies (such as cybercrime), and to societies (such as cyberwar or totalitarian control). To counter this, Helbing recommends an entirely new ICT architecture inspired by principles of decentralized self-organization as observed in immune systems, ecology, and social systems.

Coming Era of Social Innovation

A better understanding of the success principles of societies is urgently needed. “For example, when systems become too complex, they cannot be effectively managed top-down” explains Helbing. “Guided self-organization is a promising alternative to manage complex dynamical systems bottom-up, in a decentralized way.” The underlying idea is to exploit, rather than fight, the inherent tendency of complex systems to self-organize and thereby create a robust, ordered state. For this, it is important to have the right kinds of interactions, adaptive feedback mechanisms, and institutional settings, i.e. to establish proper “rules of the game.” The paper offers the example of an intriguing “self-control” principle, where traffic lights are controlled bottom-up by the vehicle flows rather than top-down by a traffic center.
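The traffic-light example can be sketched in a few lines (an illustrative toy, not the actual algorithm from the paper; the function name, arrival lists, and parameters are all invented): instead of a fixed top-down schedule, the signal switches only when queued demand on the red approach builds past a threshold, so the vehicle flows themselves control the light.

```python
def self_controlled_light(arrivals_ns, arrivals_ew, service_rate=2, switch_threshold=5):
    """Toy bottom-up signal control: the light stays green for its current
    direction until the queue on the red direction both exceeds a threshold
    and outgrows the green queue; the traffic flows trigger the switch rather
    than a fixed timetable."""
    q = {"NS": 0, "EW": 0}
    green = "NS"
    served = 0
    for cars_ns, cars_ew in zip(arrivals_ns, arrivals_ew):
        q["NS"] += cars_ns
        q["EW"] += cars_ew
        red = "EW" if green == "NS" else "NS"
        if q[red] > switch_threshold and q[red] > q[green]:
            green = red                      # demand on the red approach forces a switch
        moved = min(service_rate, q[green])  # cars that clear during this time step
        q[green] -= moved
        served += moved
    return served, q

served, leftover = self_controlled_light([3, 0, 0, 4, 0], [0, 2, 2, 0, 3])
print(served, leftover)
```

The switching rule is local and reactive, which is the point of guided self-organization: no central controller needs a model of the whole network.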

Creating and Protecting Social Capital

“One man’s disaster is another man’s opportunity. Therefore, many problems can only be successfully addressed with transparency, accountability, awareness, and collective responsibility,” underlines Helbing. Moreover, social capital such as cooperativeness or trust is important for economic value generation, social well-being and societal resilience, but it may be damaged or exploited. “Humans must learn how to quantify and protect social capital. A warning example is the loss of trillions of dollars in the stock markets during the financial crisis.” This crisis was largely caused by a loss of trust. “It is important to stress that risk insurances today do not consider damage to social capital,” Helbing continues. However, it is known that large-scale disasters have a disproportionate public impact, in part because they destroy social capital. As we neglect social capital in risk assessments, we are taking excessive risks.

Journal Reference:

  1. Dirk Helbing. Globally networked risks and how to respond. Nature, 2013; 497 (7447): 51. DOI: 10.1038/nature12047

Mainstream green is still too white (Color Lines)

By Brentin Mock; Cross-posted from ColorLines

Last year was the hottest on record for the continental United States, and it wasn’t an outlier. The last 12 years have been the warmest years since 1880, the year the National Oceanic and Atmospheric Administration began tracking this information. And climate scientists predict that the devastating blizzards, droughts, hurricanes, and wildfires we’ve been experiencing lately will worsen due to climate change.

In many ways these punishing weather events feel like Mother Nature seeking revenge for our failure to reduce greenhouse gas emissions, the primary cause of global warming. Despite abundant evidence, the U.S. government has yet to pass a law that would force a reduction in these emissions.

During his first term, President Obama did make climate change a priority, both in his campaign and in office. The American Clean Energy and Security Act that Congress produced passed through the House in June 2009 by a narrow margin. Yet the bill never reached a vote in the Senate, and it died quietly.

Environmentalists have been flummoxed ever since. One prominent cause-of-death theory says that large mainstream (and predominantly white) environmental groups failed to mobilize grassroots support and ignored those who bear a disproportionate burden of climate change, namely poor people of color.

With Obama in for a second term and reaffirmed in his environmental commitments, climate legislation has another chance at life. Now, observers are wondering if mainstream environmentalists learned the right lessons from the first climate bill failure and how they’ll work with people of color this time around.

Anatomy of a conflict

To hear some environmental leaders tell it, their defeat wasn’t due to a lack of investment in black and brown people living in poor and working-class communities, but to an over-investment in Obama. For example, Dan Lashof, climate and clean air director for the Natural Resources Defense Council (NRDC), has blamed the president for having the audacity to push healthcare reform, and has pointed the finger at green groups for being too patient with Obama.

Asked what environmental advocates who led the first climate bill effort could have done differently in 2009, Bill McKibben, founder of the online grassroots organizing campaign 350.org, says their game plan was too insular. “There was no chance last time because all the action was in the closed rooms, not in the streets,” he tells Colorlines.com.

Yet that “action” took place behind closed doors for a reason: Major mainstream green groups including the Environmental Defense Fund and The Nature Conservancy teamed up with oil companies and some of the biggest polluters and emitters in the nation to form the United States Climate Action Partnership (USCAP). This ad hoc alliance was the driving force behind the failed 2009 bill and there were no environmental justice, civil rights, or people-of-color groups at the USCAP table.

Obama can’t be blamed for the blind spots of major groups. As recent Washington Post and Politico articles have pointed out, their leadership and membership simply don’t reflect the race or socioeconomic class of the people most vulnerable to climate change’s wrath.

Sarah Hansen, former executive director of the Environmental Grantmakers Association, argued recently that the mainstream has been stingy with funding and resources and inept at engaging environmental justice communities. In a National Committee for Responsive Philanthropy (NCRP) study, “Cultivating the Grassroots: A Winning Approach for Environmental and Climate Funders,” Hansen reported that philanthropies awarded most of their environmental dollars to large, predominantly white groups but received little return in terms of law and policy. Meanwhile, wrote Hansen, too few dollars have been invested in community- and environmental justice-based organizations.

According to the NCRP report, environmental organizations with $5 million-plus budgets made up only 2 percent of green groups in general but in 2009 received half of all grants in the field. The NCRP also found that 15 percent of all green dollars benefited marginalized populations between 2007 and 2009. Only 11 percent went to social justice causes.

In January, Harvard professor Theda Skocpol released a study of the first climate bill campaign’s failure and faulted green groups involved for choosing direct congressional lobbying over grassroots organizing. Some of the major organizations did spend money on field organizers, wrote Skocpol, but only to push public messaging like billboards and advertisements.

“The messaging campaigns would not make it their business to actually shape legislation — or even talk about details with ordinary citizens or grassroots groups,” Skocpol wrote in the report. The public “is seen as a kind of background chorus that, hopefully, will sing on key.”

Take one for the team?

That the environmental movement thought billboards and ads could replace educating and organizing actual people was its biggest flaw, a position shared by Hansen and Skocpol. By comparison, health reform advocates combined lobbying with grassroots organizing while the climate bill made the rounds, and they got a law passed.

“If you want to gain the trust of the emerging non-white majority, it’s not just a messaging thing,” explains Ryan Young, legal counsel for the California-based Greenlining Institute, a policy research nonprofit focused on economic, environmental, and racial justice. “It’s a values thing. You must understand the values of these communities and craft policy around that.”

Why does this matter?

Consider how the website of the National Wildlife Federation (NWF) recently featured an article on city bird sanctuaries from the group’s print magazine titled “Urban Renewal.”

Having people of color on staff might have helped NWF understand that for some, “urban renewal” signifies a historical legacy of black and Latino neighborhoods being effectively erased by development projects such as sports stadiums. Cultural snafus like this have led to white environmental groups being clowned in influential outlets including The Daily Show.

In an interview about the unintended message of “Urban Renewal,” Jim Lyon, NWF’s vice president for conservation policy, told Colorlines.com that the group doesn’t “always get everything right” and that he’d “take it back to his staff.” (Ironically, one of the harshest critiques of urban renewal came from Jane Jacobs, a white conservationist.) On the topic of staff diversity, Lyon said the organization isn’t where it wants to be, but that it has made “good progress.” He would not release staff demographics, but said NWF achieves diversity through partnerships with other groups and programs like Eco-Schools USA, which he says “engages more than 1 million children of color” daily.

Beverly Wright, who heads the New Orleans-based Deep South Center for Environmental Justice, says racial oversights of traditionally white groups are the main reason black and Latino environmentalists have formed their own organizations. The culturally divided camps sometimes use the same words, but they’re often speaking different languages.

Take “cap-and-trade,” a scheme that would commodify greenhouse gas emissions for market-trading as a way to reduce those emissions. The first climate bill centered on cap-and-trade because most major environmental groups supported it. But cap-and-trade was anathema to environmental justice because it did nothing to curb local co-pollutants such as smog and soot, direct threats to communities of color. That’s not to mention that cap-and-trade was the brainchild of C. Boyden Gray, a conservative member of the Federalist Society and leader of FreedomWorks, today a major Tea Party funder.
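The mechanics behind this dispute can be illustrated with toy numbers (the firms, figures, and all-or-nothing abatement rule are hypothetical simplifications, not any real market design): under cap-and-trade, the firm with cheap abatement cuts its emissions while the firm with expensive abatement simply buys allowances, so the total stays within the cap even though the remaining emissions, and their local co-pollutants, concentrate in one place.

```python
def cap_and_trade(firms, cap, price):
    """Toy cap-and-trade: each firm abates when cutting a tonne costs less
    than buying an allowance, otherwise it buys allowances and keeps emitting.
    The aggregate can meet the cap while saying nothing about WHERE the
    remaining emissions (and co-pollutants like smog and soot) occur."""
    results = {}
    for name, (baseline, abatement_cost) in firms.items():
        if abatement_cost < price:
            emitted = 0            # cheaper to abate every tonne than to buy permits
        else:
            emitted = baseline     # cheaper to buy permits and keep emitting
        results[name] = emitted
    total = sum(results.values())
    return results, total, total <= cap

firms = {
    "plant_A": (100, 4),   # baseline tonnes, $/tonne abatement cost
    "plant_B": (100, 12),  # e.g. an older plant in a low-income neighborhood
}
results, total, under_cap = cap_and_trade(firms, cap=100, price=8)
print(results, total, under_cap)
```

In this sketch the cap is met, yet every remaining tonne is emitted at plant_B, which is precisely the distributional blind spot environmental justice groups objected to.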

Wright says major green groups tried to coax environmental justice organizations into supporting cap-and-trade by claiming it was for the “greater good.”

“But that meant white people get all the greater goods and we get the rest,” says Wright. “Until they want to have real discussions around racism, they won’t have our support. That’s what happened last time with the climate bill. It did not move, because they did not have diversity in their voices.”

“Diversity” doesn’t just mean hiring more people of color. As the 30-year-old Center for Health, Environment and Justice stated in March, the diversity conversation “really needs to be about resources and assistance to the front line communities rather than head counting.”

What’s next?

So in the new round of climate bill talks, will large environmental groups meaningfully engage community-based environmental justice groups?

The prognosis is mixed. Look at MomentUs, a mammoth collaborative started in January to ramp up support for new climate legislation. While MomentUs claims to be a game-changer, the strategy behind it seems very similar to that of USCAP, the one that failed to deliver a climate-change law the first time around. On its website, MomentUs describes its board of directors as “cultural, environmental, business, and marketing leaders who offer the diversity of viewpoints and keen insight vital to advancing MomentUs’s mission.” At press time, all of the directors are white. So is the staff, except for one office administrator.

Looking at MomentUs partners, it appears that the same traditionally white environmental organizations who teamed up for USCAP are now working with corporations including ALEC funder Duke Energy, predatory subprime mortgage king Wells Fargo, perennial labor union target Sodexho, and Disney. At press time there are no environmental justice or civil rights groups involved.

On the other side of the spectrum, The Sierra Club — one of the nation’s largest and whitest green groups — has had an expansive role in environmental justice and advocacy, particularly in the Gulf Coast. In January it joined the NAACP and labor unions in launching the Democracy Initiative, which will tackle voting rights, environmental justice, and other civil rights concerns.

To be sure, it’s way too early to draw conclusions about MomentUs or the Democracy Initiative, but the latter appears to be a step in the right direction in terms of highlighting the intersection between poor environmental outcomes and racism.

McKibben, the 350.org founder, has helped cultivate a multicultural fight against the Keystone XL pipeline project, but he admits that the overall environmental movement has “tons of work to do” on racial equity and inclusion.

“The sooner [mainstream environmentalists] absorb the message and are led by members of the environmental justice movement, the better,” he says.

In that case, the question is a matter of timing and power, of who decides when and which environmental justice activists get to lead.

Stay tuned.

Brentin Mock is a New Orleans-based journalist who serves as ColorLines’s reporting fellow on voting rights.