Annual archive: 2011

Attracting the public's attention is the great challenge for satisfied science journalists (Fapesp)

Pesquisa FAPESP
Issue 188 – October 2011
S&T Policy > Scientific culture
Elusive readers

Mariluce Moura

Two Brazilian studies on science communication, presented for the first time at the 2011 World Conference of Science Journalists in Doha, Qatar, at the end of June, paint, when overlaid, a curiously disjointed picture of this field in the country: on one hand, science journalists report a high degree of satisfaction with their professional work; on the other, a high proportion of a representative sample of São Paulo's population (76%) says they never read science news in newspapers, magazines, or on the internet. Now the most surprising part: among respondents surveyed in the state of São Paulo in this second study, 52.5% declared "great admiration" for journalists and 49.2% for scientists, even though few read the news the former produce about the work of the latter. These and other findings raise many questions for scholars of the country's scientific culture. One, just to begin: is a science journalist's professional satisfaction independent of whether their output reaches its targets, that is, the readers, viewers, listeners, or, more broadly, the public?

The World Conference, moved at the last minute from Cairo to Doha because of the political unrest in Egypt that began in January, brought together 726 journalists from 81 countries who, over four days, debated everything from the core concept of science journalism, through the many ways of practicing it and its difficulties, to the varied problems of organizing these professionals in Asia, Africa, Europe, North America, and Latin America, in the most democratic countries and the most authoritarian ones. One idea that ran through all these debates was that doing science journalism is not translating scientific information for the public; rather, it is finding effective ways to narrate, in journalistic language, whatever within scientific output can be identified as news of interest to society. The next World Conference will be held in Finland in 2013.

Presented by one of FAPESP's representatives at the conference, the study that brought to light the worrying measure of disinterest in science news is called "Public perception of science and technology in the state of São Paulo" (see the pdf) and forms the 12th chapter of Indicadores de ciência, tecnologia e inovação em São Paulo – 2010, released by FAPESP last August. Prepared by the team of the Laboratory for Advanced Studies in Journalism at the State University of Campinas (Labjor-Unicamp) under the coordination of its director, the linguist Carlos Vogt, the research was empirically based on a 44-question survey administered in 2007 to 1,076 people in the city of São Paulo and another 749 in the interior and on the coast of the state. In all, there were 1,825 respondents in 35 municipalities, spread across the state's 15 administrative regions (RAs).

It is worth noting that this was the second direct survey of a population sample about its perception of science carried out by Labjor, and both were part of an Ibero-American effort to build indicators capable of reflecting scientific culture in that region. The first survey, conducted between 2002 and 2003, included samples from the cities of Campinas, Buenos Aires, and Montevideo, as well as Salamanca and Valladolid, in Spain, and its results were presented in the Indicadores de C,T&I em São Paulo – 2004, also published by FAPESP. In 2007, the survey, with a more refined methodology and an expanded sample, reached seven countries: besides Brazil, there were Colombia, Argentina, Chile, Venezuela, Panama, and Spain. The common core of the questionnaire consisted of 39 questions, and each region was free to add questions of its own choosing.

The other Brazilian study presented in Doha is called "Science journalism in Latin America: getting to know the region's science journalists better" and, strictly speaking, is still under way. The preliminary results presented were based on responses, submitted by June 21, to a 44-question survey developed by the London School of Economics and Political Science (LSE). By that point, more than 250 journalists had answered the questionnaire, among them roughly 80 Brazilians, according to its coordinator, the journalist Luisa Massarani, director of the Ibero-American Network for Monitoring and Training in Science Journalism, the institution responsible for the study in partnership with the LSE. The survey also has the support of science journalism associations and other institutions tied to science communication in Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, Ecuador, Mexico, Panama, and Venezuela.

The aim of this study, as its title indicates, is to find out how many science journalists there are, who they are, and what view of science is held by those systematically covering the field in Latin America. "We have no idea about this; we don't even know how many science journalists exist in Brazil or whether they are representative within the profession," says Luisa Massarani, who is also director of the Museu da Vida at the Oswaldo Cruz Foundation (Fiocruz) and Latin America coordinator of the Science and Development Network (SciDev.Net). Until some time ago, she recalls, "the Brazilian Association of Science Journalism (ABJC), based on its membership records, put that number at around 500, but this actually included scientists and other professionals interested in science communication." Incidentally, next month the ABJC will begin re-registering its members, along with a call for new ones, which may contribute to this census of science journalists in Brazil.

Belief in science – With 46 graphs and 55 appended tables that can be cross-referenced according to each scholar's specific interest, the perception-of-science study funded by FAPESP and coordinated by Vogt supports countless conclusions and new hypotheses about how society absorbs science through the media, or how the various social and economic classes in the state of São Paulo respond to exposure to science news. For the coordinator himself, one of the most striking findings was the inverse relationship the survey makes it possible to establish between belief in science and information about science. "The axiom would be: the more information, the less belief in science," he says. Thus, consulting the graph of self-declared consumption of scientific information versus attitude toward the risks and benefits of science (graph 12.11), one finds that 57% of respondents who declared high consumption believe science and technology can offer many risks and many benefits simultaneously, and 6.3% believe they can bring many risks and few benefits. Among those who declared zero consumption of scientific information, 42.9% see many risks and many benefits at the same time, and 25.5% see many risks and few benefits. "In other words, among the best informed, the proportion of those who see both risks and benefits in science is quite high," notes Vogt, president of FAPESP from 2002 to 2007 and now coordinator of the Virtual University of the State of São Paulo (Univesp), suggesting this would be a realistic view.
It should be noted that pessimism is much greater among those who declared zero consumption of scientific information: 8.1% of them said science brings no risk and no benefit, while this share was 5.8% among those declaring low consumption, 2.3% in the medium-low consumption bracket, 0.7% in the medium-high bracket, and zero among high consumers of scientific information.
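The kind of reading done above, percentages of one answer computed within groups defined by another answer, is a plain cross-tabulation. The mini-sample below is entirely hypothetical and only illustrates the mechanics, not the survey's actual data.

```python
from collections import Counter

# Hypothetical mini-sample mimicking the survey's cross-tabulation of
# self-declared science-news consumption vs. perceived risks/benefits.
responses = [
    ("high", "many risks, many benefits"),
    ("high", "many risks, many benefits"),
    ("high", "many risks, few benefits"),
    ("none", "many risks, many benefits"),
    ("none", "many risks, few benefits"),
    ("none", "many risks, few benefits"),
]

def crosstab_pct(rows):
    # Percentage of each attitude *within* each consumption group,
    # which is how the report's graph 12.11 is read.
    group_totals = Counter(consumption for consumption, _ in rows)
    cells = Counter(rows)
    return {
        (grp, att): 100 * n / group_totals[grp]
        for (grp, att), n in cells.items()
    }

for cell, pct in sorted(crosstab_pct(responses).items()):
    print(cell, f"{pct:.1f}%")
```

Each percentage is conditioned on the group's own total, so the two columns of the published graph can be compared even though the groups differ in size.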


In the part of the study on general interest in S&T, it is striking that respondents place the topic squarely in the middle, in fifth place, after sports and ahead of film, art, and culture, among ten subjects usually covered by the media (graph 12.1). But while 30.5% of respondents declare themselves very interested in sports and 34.9% interested, for science and technology the figures are 16.3% very interested and 47.1% interested; that is, the intensity of interest is lower. It is also worth noting how the different degrees of interest in S&T bring the city of São Paulo close to Madrid and set it far apart from Bogotá (graph 12.2). Respectively, 15.4% of respondents in São Paulo and 16.7% in Madrid declared themselves very interested in S&T; for the interested category, the percentages were 49.6% and 52.7%; for slightly interested, 25.5% and 24.8%; and for not at all interested, 9.4% and 5.9%. In Bogotá, by contrast, no fewer than 47.5% declared themselves very interested. Why, no one knows. The interested total 33.2%, the slightly interested 15.3%, and the not at all interested 4%.

There is little difference in the level of interest by age. Young and older people are distributed evenly across the various degrees considered (graph 12.6a). As for schooling, however, exactly the opposite occurs: among those very interested in science and technology, 21.9% have undergraduate or graduate degrees, 53.9% have a secondary education, 21.5% a primary education, 1.7% early-childhood education, and 1% no schooling at all. In the not-at-all-interested category, there are 1.2% with undergraduate or graduate degrees, 26.3% with a secondary education, 47.4% with a primary education, 8.8% with early-childhood education, and 16.4% with no schooling of any kind (graph 12.5).

Alongside all the inferences the tabulated and interpreted survey results allow, Vogt points out that even if most of the population does not read science news, it is nonetheless more or less passively exposed to the information about science that circulates. "Every time Jornal Nacional or Globo Repórter talks about, say, a functional food, practically the whole of society is discussing it in the following days," he says. He believes media studies and surveys of how often science appears in the press could provide benchmarks for studies to complement what has been built so far on public perception of science.

Satisfied professionals – Luisa Massarani observes that although audience studies have advanced in many fields, especially for telenovelas in Brazil, in science journalism there are still no studies able to show what happens, in terms of perception, when a person hears and sees a science story on Jornal Nacional. "Do people understand it well? Does the information arouse distrust? We don't know." In any case, what it means to do science journalism, in terms of both production and reception, remains in her view a major open question.

So far, the study she coordinates has found that women are the majority among science journalists in Latin America, 61% versus 39% men, and that this is a young person's specialty: almost 30% of the sample falls in the 31-to-40 age bracket and 23% are between 21 and 30. Consistent with the latter figure, 39% of respondents have worked in science journalism for less than 5 years and 23% for between 6 and 10 years. And, the striking figure: 62% are satisfied with their work in science journalism and a further 9% very satisfied. This may be related to the fact that 60% hold formal full-time jobs in the field.

On the other hand, although Latin America's science journalists do not have many official sources giving them feedback on their work, 40% of them are sure their role is to inform the public, 26% think their function is to translate complex material, 13% to educate, and 9% to mobilize the public. As for assessing the result of the work, 50% believe the science journalism produced in Brazil is average, 21% good, and only 2% rate it very good.

The best indication of how much science journalists enjoy what they do lies in their answer to whether they would recommend the career to others. No fewer than half answered yes, certainly, while 40% answered probably yes. In any case, there is still a way to go in defining the role that falls to journalists among the actors who say what science is and does. "Who are these actors?" Vogt asks. "Scientists thought it was them. Governments believed it was them. But today we say it is society. But in what way?"

Weathering Fights – Science: What’s It Up To? (The Daily Show with Jon Stewart)

http://media.mtvnservices.com/mgid:cms:video:thedailyshow.com:400760

Science claims it’s working to cure disease, save the planet and solve the greatest human mysteries, but Aasif Mandvi finds out what it’s really up to. (05:47) – Comedy Central

Global Warming May Worsen Effects of El Niño, La Niña Events (Climate Central)

Published: October 12th, 2011

By Michael D. Lemonick

Does this mean Texas is toast?

As just about everyone knows, El Niño is a periodic unusual warming of the surface water in the eastern and central tropical Pacific Ocean. Actually, that’s pretty much a lie. Most people don’t know the definition of El Niño or its mirror image, La Niña, and truthfully, most people don’t much care.

What you do care about if you’re a Texan suffering through the worst one-year drought on record, or a New Yorker who had to dig out from massive snowstorms last winter (tied in part to La Niña), or a Californian who has ever had to deal with the torrential rains that trigger catastrophic mudslides (linked to El Niño), is that these natural climate cycles can elevate the odds of natural disasters where you live.

At the moment, we're entering the second year of the La Niña part of the cycle. La Niña is one key reason why the Southwest was so dry last winter and through the spring and summer, and since La Niña is projected to continue through the coming winter, Texas and nearby states aren't likely to get much relief.

Precipitation outlook for winter 2011-12, showing the likelihood of below average precipitation in Texas and other drought-stricken states.

But Niñas and Niños (the broader cycle, for you weather/climate geeks, is known as the “El Niño-Southern Oscillation,” or “ENSO”) don’t just operate in isolation. They’re part of the broader climate system, which means that climate change could theoretically change how they operate — make them develop more frequently, for example, or less frequently, or be more or less pronounced. Climate change could also intensify the effects of El Niño and La Niña events.

Climate scientists have been wrestling with the first question for a while now, and they still don’t really have a definitive answer. Some climate models have suggested that global warming has already begun to cause subtle changes in ENSO cycles, and that the changes will become more pronounced later this century. But a new study, published in the Journal of Climate, doesn’t find much evidence for that.

But on the second question, the new study is a lot more definitive. “Due to a warmer and moister atmosphere,” said co-author Baylor Fox-Kemper, of the University of Colorado in a press release, “the impacts of El Niño are changing even though El Niño itself doesn’t change.”

That’s because global warming has begun to change the playing field on which El Niño and La Niña operate, just as it’s changing the background conditions that give rise to our everyday weather. The Texas drought is a prime example. Its most likely cause is reduced rainfall from La Niña-related weather patterns. But however dry Texas and Oklahoma might have been otherwise, the killer heat wave that plagued the region this past summer — the sort of heat wave global warming is already making more commonplace — baked much of the remaining moisture out of both the soil and vegetation. No wonder large parts of the Lone Star State have gone up in smoke.

A map of sea surface temperature anomalies, showing a swath of cooler than average waters in the central and eastern tropical Pacific Ocean – a telltale sign of La Niña conditions.

When the next El Niño occurs in a year or two, it will probably bring heavy rains to places like Southern California, whose unstable hillsides tend to slide when soggy. Except now, thanks to global warming, the typical El Niño-related storms that roll in off the Pacific may well be turbocharged, since a warmer atmosphere can hold more water. This is the reason, say many climate scientists, that downpours have become heavier in recent decades across broad geographical areas.

La Niña, plus the added moisture in the air from global warming, have also been partially implicated in the massive snowstorms that struck the Northeast and Mid-Atlantic states during the last two winters. Those could get worse as well, suggests the new analysis. “What we see,” says Fox-Kemper, “is that certain atmospheric patterns, such as the blocking high pressure south of Alaska typical of La Niña winters, strengthen…so, the cooling of North America expected in a La Niña winter would be stronger in future climates.” So to pre-answer the question that will inevitably be asked next winter: no, more snow does NOT contradict the idea that the planet is warming. Quite the contrary.

Finally, for those who really do want to know what El Niño and La Niña actually are, as opposed to what they do, you can go to NOAA’s El Niño page. But be warned: there will be a quiz, and the word “thermocline” will appear.


Vital Details of Global Warming Are Eluding Forecasters (Science)

Science 14 October 2011:
Vol. 334 no. 6053 pp. 173-174
DOI: 10.1126/science.334.6053.173

PREDICTING CLIMATE CHANGE

Richard A. Kerr

Decision-makers need to know how to prepare for inevitable climate change, but climate researchers are still struggling to sharpen their fuzzy picture of what the future holds.

Seattle Public Utilities officials had a question for meteorologist Clifford Mass. They were planning to install a quarter-billion dollars’ worth of storm-drain pipes that would serve the city for up to 75 years. “Their question was, what diameter should the pipe be? How will the intensity of extreme precipitation change?” Mass says. If global warming means that the past century’s rain records are no guide to how heavy future rains will be, he was asked, what could climate modeling say about adapting to future climate change? “I told them I couldn’t give them an answer,” says the University of Washington (UW), Seattle, researcher.

Climate researchers are quite comfortable with their projections for the world under a strengthening greenhouse, at least on the broadest scales. Relying heavily on climate modeling, they find that on average the globe will continue warming, more at high northern latitudes than elsewhere. Precipitation will tend to increase at high latitudes and decrease at low latitudes.

But ask researchers what’s in store for the Seattle area, the Pacific Northwest, or even the western half of the United States, and they’ll often demur. As Mass notes, “there’s tremendous uncertainty here,” and he’s not just talking about the Pacific Northwest. Switching from global models to models focusing on a single region creates a more detailed forecast, but it also “piles uncertainty on top of uncertainty,” says meteorologist David Battisti of UW Seattle.

First of all, there are the uncertainties inherent in the regional model itself. Then there are the global model’s uncertainties at the regional scale, which it feeds into the regional model. As the saying goes, if the global model gives you garbage, regional modeling will only give you more detailed garbage. And still more uncertainties are created as data are transferred from the global to the regional model.
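The "more detailed garbage" point can be illustrated with a toy Monte Carlo: feed a noisy global estimate into a regional refinement that has errors of its own, and the spread of the final answer widens. All the numbers below are invented for illustration and stand in for no real model.

```python
import random
import statistics

random.seed(42)

def global_projection():
    # Hypothetical: the global model projects +2.0 °C warming,
    # with its own spread across runs.
    return random.gauss(2.0, 0.5)

def regional_refinement(global_dT):
    # Hypothetical: the regional model adds local detail, plus its own
    # multiplicative and additive errors, on top of whatever the global
    # model feeds it. Garbage in, more detailed garbage out.
    return global_dT * random.gauss(1.0, 0.2) + random.gauss(0.0, 0.3)

global_runs = [global_projection() for _ in range(100_000)]
regional_runs = [regional_refinement(dT) for dT in global_runs]

print(f"global spread:   {statistics.stdev(global_runs):.2f} °C")
print(f"regional spread: {statistics.stdev(regional_runs):.2f} °C")
```

The regional spread always comes out larger than the global one, because the refinement's errors compound with the errors it inherits rather than cancelling them.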

Although uncertainties abound, “uncertainty tends to be downplayed in a lot of [regional] modeling for adaptation,” says global modeler Christopher Bretherton of UW Seattle. But help is on the way. Regional modelers are well into their first extensive comparison of global-regional model combinations to sort out the uncertainties, although that won’t help Seattle’s storm-drain builders.

Most humble origins

Policymakers have long asked for regional forecasts to help them adapt to climate change, some of which is now unavoidable. Even immediate, rather drastic action to curb emissions of greenhouse gases would likely not limit global warming to 2°C, generally considered the threshold above which "dangerous" effects set in. And nothing at all can be done to avert the warming expected in the next several decades; it is already locked into the climate system.

Sharp but true? Feeding a global climate model’s prediction for midcentury (top) into a regional model gives more details (bottom), but modelers aren’t sure how accurate the details are. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

So scientists have been doing what they can for decision-makers. Early on, it wasn’t much. A U.S. government assessment released in 2000, Climate Change Impacts on the United States, relied on the most rudimentary regional forecasting technique (Science, 23 June 2000, p. 2113). Expert committee members divided the country into eight regions and then considered what two of their best global climate models had to say about each region over the next century. The two models were somewhat consistent in the far southwest, where the report’s authors found it was likely that warmer and drier conditions would eliminate alpine ecosystems and shorten the ski season.

But elsewhere, there was far less consistency. Over the eastern two-thirds of the contiguous 48 states, for example, the two models couldn’t agree on how much moisture soils would hold in the summer. Kansas corn would either suffer severe droughts more frequently, as one model had it, or enjoy even more moisture than it currently does, as the other indicated. But at least the uncertainties were plain for all to see.

The uncertainties of regional projections nearly faded from view in the next U.S. effort, Global Climate Change Impacts in the United States. The 2009 study drew on not two but 15 global models melded into single projections. In a technique called statistical downscaling, its authors assumed that local changes would be proportional to changes on the larger scales. And they adjusted regional projections of future climate according to how well model simulations of past climate matched actual climate.

Statistical downscaling yielded a broad warming across the lower 48 states with less warming across the southeast and up the West Coast. Precipitation was mostly down, especially in the southwest. But discussion of uncertainties in the modeling fell largely to a footnote (number 110), in which the authors cite a half-dozen papers to support their assertion that statistical downscaling techniques are “well-documented” and thoroughly corroborated.
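The proportionality-plus-bias-correction idea behind statistical downscaling can be reduced to a toy formula. The function and numbers below are hypothetical; real downscaling works with gridded observations and fitted statistical relationships, not single scalars.

```python
# Toy statistical downscaling: assume the local change scales with the
# large-scale change, then correct for how far off the model ran over
# the observed historical period. All values are hypothetical.

def downscale(large_scale_change, local_sensitivity,
              model_past_mean, observed_past_mean):
    # Proportionality assumption: local change tracks the
    # large-scale change by a fixed factor.
    raw_local = local_sensitivity * large_scale_change
    # Bias correction: subtract the model's error over the past.
    bias = model_past_mean - observed_past_mean
    return raw_local - bias

# A grid cell where the model ran 0.4 °C too warm historically,
# and local warming tracks the continental signal at 1.2x.
print(round(downscale(large_scale_change=2.0, local_sensitivity=1.2,
                      model_past_mean=15.4, observed_past_mean=15.0), 2))
```

The two steps mirror the 2009 assessment's description: local changes proportional to larger-scale changes, adjusted by how well the hindcast matched the observed climate.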

The other sort of downscaling, known as dynamical downscaling or regional modeling, has yet to be fully incorporated into a U.S. national assessment. But an example of state-of-the-art regional modeling appeared 30 June in Environmental Research Letters. To investigate what will happen in the U.S. wine industry, regional modeler Noah Diffenbaugh of Purdue University in West Lafayette, Indiana, and his colleagues embedded a detailed model that spanned the lower 48 states in a climate model that spanned the globe. The global model’s relatively fuzzy simulation of evolving climate from 1950 to 2039—calculated at points about 150 kilometers apart—then fed into the embedded regional model, which calculated a sharper picture of climate change at points only 25 kilometers apart.

Closely analyzing the regional model’s temperature projections on the West Coast, the group found that the projected warming would decrease the area suitable for production of premium wine grapes by 30% to 50% in parts of central and northern California. The loss in Washington state’s Columbia Valley would be more than 30%. But adaptation to the warming, such as the introduction of heat-tolerant varieties of grapes, could sharply reduce the losses in California and turn the Washington loss into a 150% gain.

Not so fast

A rapidly growing community of regional modelers is turning out increasingly detailed projections of future climate, but many researchers, mostly outside the downscaling community, have serious reservations. “Many regional modelers don’t do an adequate job of quantifying issues of uncertainty,” says Bretherton, who is chairing a National Academy of Sciences study committee on a national strategy for advancing climate modeling. “We’re not confident predicting the very things people are most interested in being predicted,” such as changes in precipitation.

Regional models produce strikingly detailed maps of changed climate, but they might be far off base. “The problem is that precision is often mistaken for accuracy,” Bretherton says. Battisti just doesn’t see the point of downscaling. “I would never use one of these products,” he says.

The problems start with the global models, as critics see it. Regional models must fill in the detail in the fuzzy picture of climate provided by global models, notes atmospheric scientist Edward Sarachik, professor emeritus at UW Seattle. But if the fuzzy picture of the region is wrong, the details will be wrong as well. And global models aren’t very good at painting regional pictures, he says. A glaring example, according to Sarachik, is the way global models place the cooler waters of the tropical Pacific farther west than they are in reality. Such ocean temperature differences drive weather and climate shifts in specific regions halfway around the world, but with the cold water in the wrong place, the global models drive climate change in the wrong regions.

Gregory Tripoli’s complaint about the global models is that they can’t create the medium-size weather systems that they should be sending into any embedded regional model. Tripoli, a meteorologist and modeler at the University of Wisconsin, Madison, cites the case of summertime weather disturbances that churn down off the Rocky Mountains and account for 80% of the Midwest’s summer rainfall. If a regional model forecasting for Wisconsin doesn’t extend to the Rockies, Wisconsin won’t get the major weather events that add up to be climate. And some atmospheric disturbances travel from as far away as Thailand to wreak havoc in the Midwest, he says, so they could never be included in the regional model.

A tougher nut. Predicting the details of precipitation using a regional model (bottom) fed by a global model (top) is even more uncertain than projecting regional temperature change. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

Even the things the global models get right have a hard time getting into regional models, critics say. “There are a lot of problems matching regional and global models,” Tripoli says. In one problem area, global and regional models usually have different ways of accounting for atmospheric processes such as individual cloud development that neither model can simulate directly, creating further clashes. Even the different philosophies involved in building global models and regional models can lead to mismatches that create phantom atmospheric circulations, Tripoli says. “It’s not straightforward you’re going to get anything realistic,” he says.

Redeeming regional modeling

“You could say all the global and regional models are wrong; some people do say that,” notes regional modeler Filippo Giorgi of the Abdus Salam International Centre for Theoretical Physics in Trieste, Italy. “My personal opinion is we do know something now. A few reports ago, it was really very, very difficult to say anything about regional climate change.”

But Giorgi says that in recent years he has been seeing increasingly consistent regional projections coming from combinations of many different models and from successive generations of models. “This means the projections are more and more reliable,” he says. “I would be confident saying the Mediterranean area will see a general decrease in precipitation in the next decades. I’ve seen this in several generations of models, and we understand the processes underlying this phenomenon. This is fairly reliable information, qualitatively. Saying whether the decrease will be 10% or 50% is a different issue.”

The skill of regional climate forecasting also varies from region to region and with what is being forecast. “Temperature is much, much easier” than precipitation, Giorgi notes. Precipitation depends on processes like atmospheric convection that operate on scales too small for any model to render in detail. Trouble simulating convection also means that higher-latitude climate is easier to project than that of the tropics, where convection dominates.

Regional modeling does have a clear advantage in areas with complex terrain such as mountainous regions, notes UW’s Mass, who does regional forecasting of both weather and climate. In the Pacific Northwest, the mountains running parallel to the coast direct onshore winds upward, predictably wringing rain and snow from the air without much difficult-to-simulate convection.

The downscaling of climate projections should be getting a boost as the Coordinated Regional Climate Downscaling Experiment (CORDEX) gets up to speed. Begun in 2009, CORDEX “is really the first time we’ll get a handle on all these uncertainties,” Giorgi says. Various groups will take on each of the world’s continent-size regions. Multiple global models will be matched with multiple regional models and run multiple times to tease out the uncertainties in each. “It’s a landmark for the regional climate modeling community,” Giorgi says.


Science 23 June 2000:
Vol. 288 no. 5474 p. 2113
DOI: 10.1126/science.288.5474.2113

GREENHOUSE WARMING

Dueling Models: Future U.S. Climate Uncertain

Richard A. Kerr

When Congress started funding a global climate change research program in 1990, it wanted to know what all this talk about greenhouse warming would mean for United States voters. Ten years later, a U.S. national assessment, drawing on the best available climate model predictions, concludes that the United States will indeed warm, affecting everything from the western snowpacks that supply California with water to New England’s fall foliage. But on a more detailed level, the assessment often draws a blank. Whether the cornfields of Kansas will be gripped by frequent, severe droughts, as one climate model has it, or blessed with more moisture than they now enjoy, as another predicts, the report can’t say. As much as policy-makers would like to know exactly what’s in store for Americans, the rudimentary state of regional climate science will not soon allow it, and the results of this 3-year effort brought the point home.

“This is the first time we’ve tried to take the physical [climate] system and see what effect it might have on ecosystems and socioeconomic systems,” says Thomas Karl, director of the National Oceanic and Atmospheric Administration’s (NOAA’s) National Climatic Data Center in Asheville, North Carolina, and a co-chair of the committee of experts that pulled together the assessment report “Climate Change Impacts on the United States” (available at http://www.nacc.usgcrp.gov/). “We don’t say we know there’s going to be catastrophic drought in Kansas,” he says. “What we do say is, ‘Here’s the range of our uncertainties.’ This document should get people to think.” If anything is certain, Karl says, it’s that “the past isn’t going to be a very good guide to future climate.”

By chance, the assessment had a handy way to convey the range of uncertainty that regional modeling serves up. The report, which divides the country into eight regions, is based on a pair of state-of-the-art climate models—one from the Canadian Climate Center and one from the U.K. Hadley Center for Climate Research and Prediction—that couple a simulated atmosphere and ocean. The two models solved the problems of simplifying a complex world in different ways, leading to very different predicted U.S. climates. “In terms of temperature, the Canadian model is at the upper end of the warming by 2100” predicted by a range of models, says modeler Eric Barron of Pennsylvania State University, University Park, and a member of the assessment team. “The Hadley model is toward the lower end. The Canadian model is on the dry side, and the Hadley model is on the wet side. We’re capturing a substantial portion of the range of simulations. We tried hard to convey that uncertainty.”

On a broad scale, the report can conclude: “Overall productivity of American agriculture will likely remain high, and is projected to increase throughout the 21st century,” although there will be winners and losers from place to place, and adapting agricultural practice to climate change will be key. Where the models are somewhat consistent, as in the far southwest, the report ventures what could be construed as predictions: “It is likely that some ecosystems, such as alpine ecosystems, will disappear entirely from the region,” or “Higher temperatures are likely to mean … a shorter season for winter activities, such as skiing.” Where the models clash, as on summer soil moisture over the eastern two-thirds of the lower 48 states, it explains the alternatives and suggests ways to adapt, such as switching crops.

The range of possible climate impacts laid out by the models “fairly reflects where we are in the science,” says Karl. But he notes that the effort did lack one important input: Congress mandated the assessment without funding it. “You get what you pay for,” says climatologist Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado. “A lot of it was done hastily.” Karl concedes that everyone involved would have liked to have had more funding delivered more reliably.

Even given more time and money, however, the assessment may not have come up with much better small-scale predictions, given the inherent limitations of the science. Even the best models today can say little that’s reliable about climate change at the regional level, never mind at the scale of a congressional district. Their picture of future climate is fuzzy—they might lump together San Francisco and Los Angeles because the models have such coarse geographic resolution—and the realism of such meteorological phenomena as clouds and precipitation is compromised by the inevitable simplifications of simulating the world in a computer.

“For the most part, these sorts of models give a warming,” says modeler Filippo Giorgi, “but they tend to give very different predictions, especially at the regional level, and there’s no way to say one should be believed over another.” Giorgi and his colleague Raquel Francisco of the Abdus Salam International Center for Theoretical Physics in Trieste, Italy, recently evaluated the uncertainties in five coupled climate models—including the two used in the national assessment—within 23 regions, the continental United States comprising roughly three regions. Giorgi concludes that as the scale of prediction shrinks, reliability drops until for small regions “the model data are not believable at all.”

Add in uncertainties external to the models, such as population and economic growth rates, says modeler Jerry D. Mahlman, director of NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, and the details of future climate recede toward unintelligibility. Some people in Congress and the policy community had “almost silly expectations there would be enormously useful, small-scale specifics, if you just got the right model. But the right model doesn’t exist,” says Mahlman.

Still, even though the national assessment does not offer the list of region-by-region impacts that Congress might have hoped for, it does show “where we are adaptable and where we are vulnerable,” says global change researcher Stephen Schneider of Stanford University. In 10 years, modelers say, they’ll do better.

The Post-Normal Seduction of Climate Science (Forbes)

William Pentland10/14/2011 @ 12:22AM |2,770 views

In early 2002, former U.S. Defense Secretary Donald Rumsfeld explained why the lack of evidence linking Saddam Hussein with terrorist groups did not mean there was no connection during a televised press conference.

“[T]here are known ‘knowns’ – there are things we know we know,” said Rumsfeld. “We also know there are known ‘unknowns’ – that is to say we know there are some things we do not know. But there are also unknown ‘unknowns’ – the ones we don’t know we don’t know . . . it is the latter category that tend to be the difficult ones.”

Rumsfeld turned out to be wrong about Hussein, but what if he had been talking about global warming?  Well, he probably would have been on to something there.  Unknowns of any ilk are a real pickle in climate science.

Indeed, uncertainty in climate science has induced a state of severe political paralysis. The trouble is that nobody really knows why. A rash of recent surveys and studies have exonerated most of the usual suspects – scientific illiteracy, industry distortions, skewed media coverage.

Now, the climate-science community is scrambling to crack the code on the “uncertainty” conundrum. Exhibit A: the October 2011 issue of the journal Climatic Change, the closest thing in climate science to gospel truth, which is devoted entirely to the subject of uncertainty.

While I have yet to digest all of the dozen or so essays, I suspect they are only the opening salvo in what will soon become a robust debate about the significance of uncertainty in climate-change science. The first item up on the chopping block is called post-normal science (PNS).

PNS is a model of the scientific process pioneered by Jerome Ravetz and Silvio Funtowicz, which describes the peculiar challenges science encounters where “facts are uncertain, values in dispute, stakes high and decisions urgent.” Unlike “normal” science in the sense described by the philosopher of science Thomas Kuhn, post-normal science commonly crosses disciplinary lines and involves new methods, instruments and experimental systems.

Judith Curry, a professor at Georgia Tech, weighs the wisdom of taking the plunge on PNS in an excellent piece called “Reasoning about climate uncertainty.” Drawing on the work of the Dutch wunderkind Jeroen van der Sluijs, Curry calls on the Intergovernmental Panel on Climate Change to stop marginalizing uncertainty and get real about bias in the consensus-building process. Curry writes:

The consensus approach being used by the IPCC has failed to produce a thorough portrayal of the complexities of the problem and the associated uncertainties in our understanding . . . Better characterization of uncertainty and ignorance and a more realistic portrayal of confidence levels could go a long way towards reducing the “noise” and animosity portrayed in the media that fuels the public distrust of climate science and acts to stymie the policy process.

PNS is especially seductive in the context of uncertainty. Not surprisingly, Curry suggests that instituting PNS-like strategies at the IPCC “could go a long way towards reducing the ‘noise’ and animosity” surrounding climate-change science.

While I personally believe PNS is persuasive, the PNS model provokes something closer to revulsion in many people. Last year, members of the U.S. House of Representatives who filed a petition challenging the U.S. Environmental Protection Agency’s Greenhouse Gas Endangerment Finding seemed less sanguine about post-normal science:

. . . the conclusions of organizing bodies, especially the IPCC, cannot be said to reflect scientific “consensus” in any meaningful sense of that word. Instead, they reflect a political movement that has commandeered science to the service of its agenda. This is “post-normal science”: the long-dreaded arrival of deconstructionism to the natural sciences, according to which scientific quality is determined not by its fidelity to truth, but by its fidelity to the political agenda.

It seems unlikely that taking the PNS plunge would appreciably improve the U.S. public’s perception of the credibility, legitimacy and salience of climate-change assessments. This probably says more about Americans than it does about the analytic force of the PNS model.

Let’s face it. Americans do not agree on a whole hell of a lot. And they never have. Many U.S. institutions were deliberately designed to tolerate the coexistence of free states and slave-owning states. Ironically, Americans appear to agree more on climate-change science than other high-profile scientific controversies like the safety of genetically-modified organisms.


While it pains me to admit this, I am increasingly convinced that the IPCC’s role in assessing the science of climate change needs to be scaled back. The IPCC was an overly optimistic experiment in international governance designed for a world that never materialized. The U.N. General Assembly established the IPCC in the months immediately preceding the fall of the Berlin Wall. Only a few years later, the IPCC’s first assessment report and the creation of the U.N. Framework Convention on Climate Change coincided with the collapse of the Soviet Union and the end of the Cold War.

A new world order seemed to be dawning in those days, which is probably why it seemed like a good idea to ask scientists to tell us what constitutes “dangerous climate change.”   Two decades and two world trade towers later, the world is a decidedly less hospitable place for institutions like the IPCC.

The proof is in the pudding – or, in this case, the atmosphere.

Climate Change Tumbles Down Europe’s Political Agenda as Economic Worries Take the Stage (N.Y. Times)

By JEREMY LOVELL of ClimateWire. Published: October 13, 2011

LONDON — Climate change has all but fallen off the political agenda across Europe as the resurging economic crisis empties national coffers and shakes economic confidence, and the public and the press turn their attention to more immediate issues of rising fuels bills and joblessness, analysts say.

Sputtering economies, a shift of attention to looming elections and the prospect of little or no movement in the December climate talks in Durban, South Africa, have combined to take the political momentum out of an issue that was a major cause in Europe.

“It is way down the agenda and will not feature in elections,” said Edward Cameron, director of the World Resources Institute think tank’s international climate initiative, on the sidelines of a meeting on climate change at London’s Chatham House think tank. “At a time of joblessness and fiscal crises, it is very difficult to advance the climate change issue.”

That is as true for next year’s presidential elections in the United States as it will be in France, despite the fact that there has been a series of environmental disasters, from the Texas drought this year to Russia’s heat wave and consequent steep rise in wheat prices last year.

According to acclaimed NASA scientist James Hansen, who has been warning of impending climatic doom for decades, the lack of focus on these events is in no small part due to the fact that scientists are poor communicators while the climate change skeptics have mounted a smoothly run campaign to capitalize on any mistakes and admissions of uncertainty.

“There is a strong campaign by those people who want to continue the fossil fuel business as usual. Climate contrarians … have managed in the public’s eye to muddy the waters enough that there is uncertainty: why should we do anything yet,” he said on a visit to London’s Royal Society for a meeting on lessons to be learned from past climate change battles.

“They have been winning the argument in the last several years, even though the science has become clearer,” he added.

Nuclear power issue distracts Berlin

In Germany, where a generous feed-in tariff scheme has produced some 28 gigawatts of wind power capacity and more than 18 GW of solar photovoltaic capacity, Chancellor Angela Merkel’s coalition government was forced into an abrupt U-turn on a controversial move to extend the lives of the country’s fleet of nuclear power plants. There was a political revolt after the March 11 nuclear disaster at Fukushima in Japan.

The oldest seven of Germany’s nuclear plants were closed immediately after Fukushima and will now never reopen, while the remainder will close by 2022.

This has had the perverse effect, in a country proud of its renewable energy efforts, of increasing the use of coal-fired power plants and increasing the likelihood of new coal- or gas-fired plants being built. The price tag will include higher carbon emissions at exactly the time that Germany, along with the rest of the European Union, is pledged to cut emissions.

While political observers believe the climate change issue will come back to the fore at some point in Germany — a country where the Greens have played a pivotal political role — the nuclear power issue is so politically charged that it is off the agenda for now.

Even in the United Kingdom, which has a huge wind energy program and where the Conservative-Liberal Democrat coalition came to power 15 months ago pledging to be the “greenest government ever,” there are major signs of backsliding. A long-awaited energy bill has been shelved, and renewable energy support costs and carbon emission reduction targets are either under review or about to be.

At the Conservative Party’s annual conference earlier this month, climate change was consigned to a brief debate on the opening Sunday, when delegates were mostly just arriving and finding their way around or still traveling to get there.

Damned by faint praise in London

Prime Minister David Cameron did not mention the issue in his speech to the conference — a performance that usually sets the broad agenda for the following year — and Chancellor of the Exchequer George Osborne caused outrage among environmentalists, but satisfied the party’s right wing, by pledging that the United Kingdom would not go any faster than its E.U. neighbors on emission cuts.

This is despite the fact that the United Kingdom has a legal target to cut its carbon emissions by at least 80 percent below 1990 levels by 2050, with cuts of 35 percent by 2022 and 50 percent by 2025, whereas the European Union’s goal is 20 percent by 2020.

It was widely reported that the 2022 target was only agreed to after a major battle in the Cabinet between supporters of Conservative Osborne and those of Liberal Democrat Energy and Climate Change Minister Chris Huhne. It has since been announced that the carbon targets will be reviewed in 2014.

Even in London, where charismatic Conservative Mayor Boris Johnson came to power in 2008 in part on a green ticket, the issue has largely been parked and replaced by transport in the run-up to next year’s mayoral elections. The city’s aging transport system is feared likely to come under massive strain during the 2012 Olympic Games.

Then there is the strange case of a strategic plan on adapting London to climate change, the draft of which was launched with great fanfare and declarations of urgency in February 2010. It was on the brink of publication in September 2010, but after that, it appeared to have vanished without trace.

At the same time, most members of City Hall’s climate change team, set up under the previous Labour administration, have been moved to other jobs.

‘Too difficult — and not a vote winner’

“Political leaders get it, but the treasuries don’t. The men with the money don’t want to be first movers,” said Nick Mabey, co-founder of environmental think tank E3G. “But the political froth has gone. It has become too difficult — and not a vote winner.”

Compounding that problem, at least in the United Kingdom, has been a series of reports underscoring the likely high cost to households of green energy policies at a time when the prices of domestic electricity and gas are already rising sharply.

A recent opinion poll found that the climate change issue has been replaced by concerns over rising fuel bills and energy security.

But Mabey is not too concerned. While the subject may be off the immediate political agenda, behind the scenes, the more enlightened corporate leaders and investment fund managers have been making their own calculations. They are moving their money into the low-carbon economic transformation that in some cases is already profitable and in many eyes essential and inevitable.

The main danger, they say, is that if climate change as a driver of action is allowed to languish too long and become too invisible while energy becomes the main motivator, it will become far harder to resurrect climate change.

For Mabey and WRI’s Cameron, while the deep and seemingly returning global economic crisis has proved a serious distraction internationally as well as domestically, all is not lost.

For a number of reasons, including the rise of a new and major climate player — China — and a series of new scientific reports on climate change due over the next two or three years, 2015 will be the next pivotal moment for the world to take collective action, they say.

“Climate change doesn’t keep people awake at night. Our task for the next few years is to move it back up the political agenda again,” said WRI’s Cameron.

Copyright 2011 E&E Publishing. All Rights Reserved.

Group Urges Research Into Aggressive Efforts to Fight Climate Change (N.Y. Times)

By CORNELIA DEAN, Published: October 4, 2011

With political action on curbing greenhouse gases stalled, a bipartisan panel of scientists, former government officials and national security experts is recommending that the government begin researching a radical fix: directly manipulating the Earth’s climate to lower the temperature.

Members said they hoped that such extreme engineering techniques, which include scattering particles in the air to mimic the cooling effect of volcanoes or stationing orbiting mirrors in space to reflect sunlight, would never be needed. But in its report, to be released on Tuesday, the panel said it is time to begin researching and testing such ideas in case “the climate system reaches a ‘tipping point’ and swift remedial action is required.”

The 18-member panel was convened by the Bipartisan Policy Center, a research organization based in Washington founded by four senators — Democrats and Republicans — to offer policy advice to the government. In interviews, some of the panel members said they hoped that the mere discussion of such drastic steps would jolt the public and policy makers into meaningful action in reducing greenhouse gas emissions, which they called the highest priority.

The idea of engineering the planet is “fundamentally shocking,” David Keith, an energy expert at Harvard and the University of Calgary and a member of the panel, said. “It should be shocking.”

In fact, it is an idea that many environmental groups have rejected as misguided and potentially dangerous.

Jane Long, an associate director of the Lawrence Livermore National Laboratory and the panel’s co-chairwoman, said that by spewing greenhouse gases into the atmosphere, human activity was already engaged in climate modification. “We are doing it accidentally, but the Earth doesn’t know that,” she said, adding, “Going forward in ignorance is not an option.”

The panel, the Task Force on Climate Remediation Research, suggests that the White House Office of Science and Technology Policy begin coordinating research and estimates that a valuable effort could begin with a few million dollars in financing over the next few years.

One reason that the United States should embrace such research, the report suggests, is the threat of unilateral action by another country. Members say research is already under way in Britain, Germany and possibly other countries, as well as in the private sector.

“A conversation about this is going to go on with us or without us,” said David Goldston, a panel member who directs government affairs at the Natural Resources Defense Council and is a former chief of staff of the House Committee on Science. “We have to understand what is at stake.”

In interviews, panelists said again and again that the continuing focus of policy makers and experts should be on reducing emissions of carbon dioxide and other greenhouse gases. But several acknowledged that significant action remained a political nonstarter. Last month, for example, the Obama administration told the federal Environmental Protection Agency to hold off on tightening ozone standards, citing complications related to the weak economy.

According to the United Nations Intergovernmental Panel on Climate Change, greenhouse gas emissions have contributed to raising the global average surface temperatures by about 1.3 degrees Fahrenheit in the past 100 years. It is impossible to predict how much impact the report will have. But given the panelists’ varied political and professional backgrounds, they seem likely to achieve one major goal: starting a broader conversation on the issue. Some climate experts have been working on it for years, but they have largely kept their discussions to themselves, saying they feared giving the impression that there might be quick fixes for climate change.

“Climate adaptation went through the same period of concern,” Mr. Goldston said, referring to the onetime reluctance of some researchers to discuss ways in which people, plants and animals might adjust to climate change. Now, he said, similar reluctance to discuss geoengineering is giving way, at least in part because “it’s possible we may have to do this no matter what.”

Although the techniques, which fall into two broad groups, are more widely known as geoengineering, the panel prefers “climate remediation.”

The first is carbon dioxide removal, in which the gas is absorbed by plants, trapped and stored underground or otherwise removed from the atmosphere. The methods are “generally uncontroversial and don’t introduce new global risks,” said Ken Caldeira, a climate expert at Stanford University and a panel member. “It’s mostly a question of how much do these things cost.”

Controversy arises more with the second group of techniques, solar radiation management, which involves increasing the amount of solar energy that bounces back into space before it can be absorbed by the Earth. They include seeding the atmosphere with reflective particles, launching giant mirrors above the earth or spewing ocean water into the air to form clouds.

These techniques are thought to pose a risk of upsetting earth’s natural rhythms. With them, Dr. Caldeira said, “the real question is what are the unknown unknowns: Are you creating more risk than you are alleviating?”

At the influential blog Climate Progress, Joe Romm, a fellow at the Center for American Progress, has made a similar point, likening geo-engineering to a dangerous course of chemotherapy and radiation to treat a condition curable through diet and exercise — or, in this case, emissions reduction.

The panel rejected any immediate application of climate remediation techniques, saying too little is known about them. In 2009, the Royal Society in Britain said much the same, assessing geoengineering technologies as “technically feasible” but adding that their potential costs, effectiveness and risks were unknown.

Similarly, in a 2010 review of federal research that might be relevant to climate remediation, the federal Government Accountability Office noted that “major uncertainties remain on the efficacy and potential consequences” of the approach. Its report also recommended that the White House Office of Science and Technology Policy “establish a clear strategy for geoengineering research.”

John P. Holdren, who heads that office, declined interview requests. He issued a statement reiterating the Obama administration’s focus on “taking steps to sensibly reduce pollution that is contributing to climate change.”

Yet in an interview with The Associated Press in 2009, Dr. Holdren said the possible risks and benefits of geoengineering should be studied very carefully because “we might get desperate enough to want to use it.”

In a draft plan made public on Friday, the U.S. Global Change Research Program, a coordinating effort administered by his office, outlined its own climate change research agenda, including studies of the impacts of rapid climate change.

The plan said that climate-related projections would be crucial to future studies of the “feasibility, effectiveness and unintended consequences of strategies for deliberate, large-scale manipulations of Earth’s environment,” including carbon dioxide removal and solar radiation management.

Many countries fault the United States for government inaction on climate change, especially given its longtime role as a chief contributor to the problem.

Frank Loy, a panelist and former chief climate negotiator for the United States, suggested that people around the world would see past those issues if the United States embraced geoengineering studies, provided that it was “very clear about what kind of research is undertaken and what the safeguards are.”

This article has been revised to reflect the following correction:

Correction: October 4, 2011

An earlier version of this article mistakenly referred to Frank Loy as the nation’s chief climate negotiator; he is a former chief climate negotiator. It also misstated the name of a federal agency that reported on the potential effectiveness of climate remediation. It is the Government Accountability Office, not the General Accountability Office.

NSF seeks cyber infrastructure to make sense of scientific data (Federal Computer Week)

By Camille Tuutti, Oct 04, 2011

The National Science Foundation has tapped a research team at the University of North Carolina-Chapel Hill to develop a national data infrastructure that would help future scientists and researchers manage the data deluge, share information and fuel innovation in the scientific community.

The UNC group will lead the DataNet Federation Consortium, which includes seven universities. The infrastructure that the consortium will try to create would support collaborative multidisciplinary research and “democratize access to information among researchers and citizen scientists alike,” said Rob Pennington, program director in NSF’s Office of Cyberinfrastructure.

“It means researchers on the cutting edge have access to new, more extensive, multidisciplinary datasets that will enable breakthroughs and the creation of new fields of science and engineering,” he added.

The effort would be a “significant step in the right direction” in solving some of the key problems researchers run into, said Stan Ahalt, director at the Renaissance Computing Institute at UNC-Chapel Hill, which federates the consortium’s data repositories to enable cross-disciplinary research. One of the issues researchers today grapple with is how to best manage data in a way that maximizes its utility to the scientific community, he said. Storing massive quantities of data and the lack of well-designed methods that allow researchers to use unstructured and structured data simultaneously are additional obstacles for researchers, Ahalt added.

The national data infrastructure may not solve everything immediately, he said, “but it will give us a platform to start working meticulously on more long-term, robust solutions.”

DFC will use iRODS, the integrated Rule Oriented Data System, to implement a data management infrastructure. Multiple federal agencies are already using the technology: the NASA Center for Climate Simulation, for example, imported a Moderate Resolution Imaging Spectroradiometer satellite image dataset onto the environment so academic researchers would have access, said Reagan Moore, principal investigator for the Data Intensive Cyber Environments research group at UNC-Chapel Hill that leads the consortium.

It’s very typical for a scientific community to develop a set of practices around a particular methodology of collecting data, Ahalt explained. For example, hydrologists know where their sensors are and what those locations mean from a geographical perspective. Those hydrologists put their data in a certain format that may not be obvious to someone who is, for example, doing atmospheric studies, he said.

“The long-term goal of this effort is to improve the ability to do research,” Moore said. “If I’m a researcher in any given area, I’d like to be able to access data from other people working in the same area, collaborate with them, and then build a new collection that represents the new research results that are found. To do that, I need access to the old research results, to the observational data, to simulations or analyze what happens using computers, etc. These environments then greatly minimize the effort required to manage and distribute a collection and make it available to research.”
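The interoperability problem Ahalt describes can be pictured with a minimal sketch. Everything below is illustrative and hypothetical — the field names, the shared schema, and the unit conversion are assumptions for the example, not part of iRODS or the DataNet Federation Consortium’s actual design:

```python
# Hypothetical sketch: mapping one community's sensor record (here, a
# hydrology stream gauge with its own field names and units) into a
# discipline-neutral shared schema that, e.g., an atmospheric researcher
# could interpret without knowing the source community's conventions.

def normalize_record(record: dict) -> dict:
    """Map a community-specific gauge record to an assumed shared schema."""
    return {
        "latitude": record["gauge_lat"],          # community-specific names
        "longitude": record["gauge_lon"],
        "timestamp_utc": record["obs_time"],
        "variable": "stream_discharge",
        "value": record["flow_cfs"] * 0.0283168,  # cubic feet/s -> m^3/s
        "units": "m3/s",
    }

hydro_record = {
    "gauge_lat": 35.91, "gauge_lon": -79.05,
    "obs_time": "2011-10-04T12:00:00Z", "flow_cfs": 100.0,
}
shared = normalize_record(hydro_record)
print(shared["variable"], shared["value"], shared["units"])
```

The point of the sketch is only that a federated collection needs an agreed-upon target schema; each community then supplies its own small translation layer like `normalize_record`.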

For science research as a whole, Ahalt said the infrastructure could mean a lot more than just managing the data deluge or sharing information within the different research communities.

“Data is the currency of the knowledge economy,” he said. “Right now, a lot of what we do collectively and globally from an economic standpoint is highly dependent on our ability to manipulate and analyze data. Data is also the currency of science; it’s our ability to have a national infrastructure that will allow us to share those scientific assets.”

The bottom line: “We’ll be more efficient at producing new science, new innovation and new innovation knowledge,” he said.

About the Author

Camille Tuutti is a staff writer covering the federal workforce.

Little Ice Age Shrank Europeans, Sparked Wars (NetGeo)

Study aims to scientifically link climate change to societal upheaval.

London’s River Thames, frozen over in 1677. Painting by Abraham Hondius via Heritage Images/Corbis

Brian Handwerk, for National Geographic News

Published October 3, 2011

Pockmarked with wars, inflation, famines and shrinking humans, the 1600s in Europe came to be called the General Crisis.

But whereas historians have blamed those tumultuous decades on growing pains between feudalism and capitalism, a new study points to another culprit: the coldest stretch of the climate change period known as the Little Ice Age.

(Also see “Sun Oddly Quiet—Hints at Next ‘Little Ice Age’?”)

The Little Ice Age curbed agricultural production and eventually led to the European crisis, according to the authors of the study—said to be the first to scientifically verify cause-and-effect between climate change and large-scale human crises.

Prior to the industrial revolution, all European countries were by and large agrarian, and as study co-author David Zhang pointed out, “In agricultural societies, the economy is controlled by climate,” since it dictates growing conditions.

A team led by Zhang, of the University of Hong Kong, pored over data from Europe and other Northern Hemisphere regions between A.D. 1500 and 1800.

The team compared climate data, such as temperatures, with other variables, including population sizes, growth rates, wars and other social disturbances, agricultural production figures and famines, grain prices, and wages.

The authors say some effects, such as food shortages and health problems, showed up almost immediately between 1560 and 1660—the Little Ice Age’s harshest period—during which growing seasons shortened and cultivated land shrank.

As arable land contracted, so too did Europeans themselves, the study notes. Average height followed the temperature line, dipping nearly an inch (two centimeters) during the late 1500s, as malnourishment spread, and rising again only as temperatures climbed after 1650, the authors found.


Other effects—such as famines, the Thirty Years’ War (1618-48), or the 1644 Manchu conquest of China—took decades to manifest. “Temperature is not a direct cause of war and social disturbance,” Zhang said. “The direct cause of war and social disturbance is the grain price. That is why we say climate change is the ultimate cause.”
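Zhang's chain of causation, in which cold summers depress harvests and raise grain prices, which in turn breed unrest years later, is the kind of relationship a lagged correlation can expose. A minimal sketch with synthetic annual data (illustrative only; this is not the study's data or its actual statistical method):

```python
# Toy illustration of a temperature series leading a grain-price series.
# All numbers here are invented; the lag and dip are stand-ins for the
# 1560-1660 cold stretch described in the article.
import random

random.seed(1)
years = 300  # stand-in for A.D. 1500-1800

# Synthetic cold spell: temperature anomaly dips mid-series...
temp = [-(0.5 if 60 <= y <= 160 else 0.0) + random.gauss(0, 0.1)
        for y in range(years)]

# ...and grain prices respond about five years later (colder -> pricier).
LAG = 5
price = [1.0 - temp[max(0, y - LAG)] + random.gauss(0, 0.1)
         for y in range(years)]

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Shifting prices back by LAG years should reveal a strong negative
# correlation: cold years line up with expensive grain.
lagged = corr(temp[:-LAG], price[LAG:])
print(f"lag-{LAG} temperature/price correlation: {lagged:.2f}")
```

With the lag applied, the synthetic series correlate strongly and negatively; at lag zero the relationship is weaker, which is the signature of a delayed effect like the ones Zhang describes.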

The new study is both history lesson and warning, the researchers added.

As our climate changes due to global warming, Zhang said, “developing countries will suffer more, because large populations in these countries [directly] rely on agricultural production.”


Indians invade Belo Monte construction site and block Trans-Amazonian Highway (FSP)

27/10/2011 – 14h21

AGUIRRE TALENTO
FROM BELÉM

The construction site of the Belo Monte hydroelectric dam, in the municipality of Vitória do Xingu (western Pará, 945 km from Belém), was invaded on Thursday morning (27th) in a protest by indigenous people, fishermen and residents of the region.

The protesters also blocked the Trans-Amazonian Highway at kilometer 52, where the entrance to the dam's construction site is located.

The protest, which began at 5 a.m., was organized during a seminar held this week in Altamira (also in the west, 900 km from Belém) to discuss the impacts of building hydroelectric dams in the region.

Security guards let the protesters in without resistance, and the company's employees did not show up for work. As a result, construction is at a standstill.

"We believe the company heard about our demonstration and did not want a confrontation," said Eden Magalhães, executive secretary of Cimi (Indigenist Missionary Council), one of the organizations taking part in the protest.

The Federal Highway Police confirmed the protest but could not yet estimate how many people were present. According to Magalhães, there are about 600 people at the site, including Indians, fishermen, riverside dwellers and even students.

The protesters are demanding that a representative of the federal government come to the site and are calling for construction to be halted.

Yesterday, the Federal Court's ruling on the licensing of the Belo Monte dam was postponed once again. The vote stands tied, with one vote in favor of building the dam and one against. The tie-breaking vote is still pending, and there is no indication of when the case will return to the docket.

 

The new Maracanã is already born old (O Globo)

André Trigueiro

O Globo, 27/10/11

The design of the new Maracanã confirms the exclusion of an item absolutely essential for any engineering project of this kind to be called "modern and sustainable". Despite the wide array of football stadiums around the world that harness solar energy, the hugely expensive reconstruction of the Maracanã – nearly 1 billion reais – ignored that possibility.

It is strange that this happened in a country where the sun shines an average of 280 days a year. Stranger still that it happened in the city that hosted Rio-92, that will host Rio+20, and that lies in the same band of solar exposure as Sydney, Australia, which distinguished itself by staging the first Green Games in history, entirely powered by solar energy.

I covered the Sydney Games in 2000 as a journalist, and I remember the huge structures of photovoltaic panels that captured solar energy to light the competitions in the Olympic stadium, in the Superdome and in every sports facility. The Olympic Village, with its 665 houses, became the largest solar-powered neighborhood on the planet. The International Olympic Committee's spokesman, the Australian Michael Bland, justified the investment in solar energy this way: "We want to make solar energy popular in every country. It is ridiculous that in Australia not every house uses a solar collector. We have the roofs, we have the sun, and we waste them. It is a stupid way to live."

How stupid of us, then, to waste the immense area of the new Maracanã's canopy – nearly 29,000 square meters – which could carry a striking array of photovoltaic panels capable of generating electricity for up to 3,000 homes. The cost would range from ten to twenty million reais, depending on the technology employed. Someone might say: "Too expensive! Not worth it." But is the usual way of buying energy worth it?

We live in a country where, according to IBGE, electricity rates have risen more than twice the official inflation rate over the past 15 years. The solar option – though more expensive up front – has the advantage of paying back the investment within a few years.
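The amortization claim can be sanity-checked with back-of-envelope arithmetic. A minimal sketch using the column's own figures (an install cost of R$10–20 million and electricity for up to 3,000 homes); the per-home consumption and the tariff are my assumptions, not figures from the article:

```python
# Rough payback estimate for the hypothetical Maracanã solar canopy.
# Install cost and homes powered come from the column; the consumption
# and tariff values below are assumptions for illustration.

ASSUMED_KWH_PER_HOME_MONTH = 160   # assumed average household use (kWh/month)
ASSUMED_TARIFF_BRL_PER_KWH = 0.50  # assumed 2011-era residential tariff (R$/kWh)

def payback_years(install_cost_brl, homes_powered,
                  kwh_per_home_month=ASSUMED_KWH_PER_HOME_MONTH,
                  tariff=ASSUMED_TARIFF_BRL_PER_KWH):
    """Years to recover the install cost from avoided electricity purchases."""
    annual_kwh = homes_powered * kwh_per_home_month * 12
    annual_savings_brl = annual_kwh * tariff
    return install_cost_brl / annual_savings_brl

# The column's cost range: R$10M (cheapest technology) to R$20M.
for cost in (10e6, 20e6):
    print(f"R${cost / 1e6:.0f}M install -> {payback_years(cost, 3000):.1f} years")
```

Under these assumptions the investment pays for itself in roughly three to seven years, which is consistent with the column's claim of amortization "within a few years"; different tariffs or consumption figures would shift the range.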

Someone might object that the new, lighter canopy could not bear traditional photovoltaic panels. Then a compatible structure should have been designed. What is at stake is the chance to make the stadium useful even on days when no football is played. The Maracanã could be a power plant – however modest its output – which, beyond the direct benefit of generating electricity, would also spur further research and investment in solar energy in Brazil.

And who said a project like this could only be paid for with public money? If there were political will to promote technological innovation in the energy sector, using the new Maracanã as a poster child, it would be perfectly possible to sound out major companies with solar know-how willing to install the photovoltaic equipment at no cost to the government. What would such a company gain in return? The right to exploit the image of the Maracanã as a "solar stadium", made possible by its technology.

Does anyone doubt that aerial images of the stadium, both at the 2014 World Cup and at the 2016 Olympics, will reach billions of viewers worldwide? That is free publicity, hugely positive exposure – everything a good negotiator would need no more than a few minutes to use in convincing an investor to reach into his pocket and fund the idea.

With public or private money, the right thing was to do it. Installing a few solar collectors to heat the water for the athletes' showers in the locker rooms is not enough. If those responsible for the Maracanã project scored an own goal by spurning the sun, the Pituaçu stadium in Salvador and the Mineirão in Belo Horizonte will have solar power as an ally in generating electricity. Wake up, Rio! A Maracanã without solar energy is like Rio without beaches. Sadly, cariocas will go on using the sun only for tanning. A symbol of sustainability for its natural beauty and for hosting major UN environmental conferences, Rio de Janeiro is left with a Maracanã that falls short of what it deserves.

 

 

Acre: In defence of life and the integrity of the peoples and their territories against REDD and the commodification of nature

Letter from the State of Acre

In defence of life and the integrity of the peoples and their territories against REDD and the commodification of nature

We gathered in Rio Branco, in the State of Acre, on 3-7 October 2011 for the workshop “Serviços Ambientais, REDD e Fundos Verdes do BNDES: Salvação da Amazônia ou Armadilha do Capitalismo Verde?” (Environmental Services, REDD and BNDES Green Funds: The Amazon’s Salvation or a Green Capitalism Trap?)

The participants included socio-environmental organizations, family agriculture associations, Extractive Reserve (RESEX) and Extractive Settlement organizations, human rights organizations (national and international), social pastoral organizations, professors, students, and members of civil society committed to the struggle of “the underdogs”.

We saw the emergence of a consensus around the belief that, since 1999 and the election of the Popular Front of Acre (FPA) government, initiatives have been adopted to establish a “new model” of development in the state. Since then, this model has been praised as a prime example of harmony between economic development and the preservation of forests, their natural resources and the way of life of their inhabitants. With strong support from the media, trade unions, NGOs that promote green capitalism in the Amazon region, multilateral banks, local oligarchies and international organizations, it is presented as a “successful model” to be emulated by other regions of Brazil and the world.

Over these past few days we have had the opportunity to learn first hand, in the field, about some of the initiatives in Acre that are considered as exemplary. We saw for ourselves the social and environmental impacts of the “sustainable development” underway in the state. We visited the Chico Mendes Agro-Extractive Settlement Project, the NATEX condom factory, and the Fazendas Ranchão I and II Sustainable Forest Management Project in Seringal São Bernardo (the São Bernardo rubber plantation). These field visits presented us with a reality that is rather far removed from the image portrayed nationally and internationally.

In Seringal São Bernardo, we were able to observe the priority placed on the interests of timber companies, to the detriment of the interests of local communities and nature conservation. Even the questionable rules of the forest management plans are not respected, and according to the local inhabitants, these violations are committed in collusion with the responsible state authorities. In the case of the Chico Mendes Agro-Extractive Settlement Project in Xapuri, we saw that the local population remains subjugated to monopoly control: they currently sell their timber to the company Laminados Triunfo at a rate of R$90 per cubic metre, when this same amount of wood can be sold for as much as R$1200 in the city. This is why we support the demands of various communities for the suspension of these famous forest management projects. We call for the investigation of all of the irregularities revealed, and we demand punishment for those guilty of the criminal destruction of natural resources.

During the course of the workshop we also analyzed the issues of environmental services, REDD and the BNDES (Brazilian Development Bank) Green Funds. We gained a greater understanding of the role of banks (World Bank, IMF, IDB and BNDES), of NGOs that promote green capitalism (e.g. WWF, TNC and CI) and other institutions such as the ITTO, FSC and USAID, and also sectors of civil society and the state and federal governments who have allied with international capital for the commodification of the natural heritage of the Amazon region.

It was stressed that, in addition to being anti-constitutional, Law Nº 2.308 of 22 October 2010, which regulates the State System of Incentives for Environmental Services, was created without the due debate with sectors of society directly impacted by the law, that is, the men and women of the countryside and forests. Slavishly repeating the arguments of the powerful countries, local state authorities present it as an effective means of contributing to climate equilibrium, protecting the forests and improving the quality of life of those who live in the forests. It should be noted, however, that this legislation generates “environmental assets” in order to negotiate natural resources on the “environmental services” market, such as the carbon market. It represents a reinforcement of the current phase of capitalism, whose defenders, in order to ensure its widespread expansion, utilize an environmental discourse to commodify life, privatize nature and plunder the inhabitants of the countryside and the cities. Under this law, the beauty of nature, pollination by insects, regulation of rainfall, culture, spiritual values, traditional knowledge, water, plants and even popular imagery are converted into merchandise. The current proposal to reform the Forest Code complements this new strategy of capital accumulation by authorizing the negotiation of forests on the financial market, through the issuing of “green bonds”, or so-called “Environmental Reserve Quota Certificates” (CCRAs). In this way, everything is placed in the sphere of the market, to be administered by banks and private corporations.

Although it is presented as a solution for global warming and climate change, the REDD proposal allows the powerful capitalist countries to maintain their current levels of production, consumption and, therefore, pollution. They will continue to consume energy generated by sources that produce more and more carbon emissions. Historically responsible for the creation of the problem, they now propose a “solution” that primarily serves their own interests. While making it possible to purchase the “right to pollute”, mechanisms like REDD strip “traditional” communities (riverine, indigenous and Afro-Brazilian communities, rubber tappers, women coconut gatherers, etc.) of their autonomy in the management of their territories.

As a result, roles are turned upside down. Capitalism, the most predatory civilization in the history of humankind, would not pose a danger; on the contrary, it would be the “solution”. The “destroyers” would now be those who fight to defend nature. And so those who have historically ensured the preservation of nature are now viewed as predators, and are therefore criminalized. It comes as no surprise then that the state has recently become more open in its repression, persecution and even the expulsion of local populations from their territories – all to ensure the free expansion of the natural resources market.

With undisguised state support, through this and other projects, capital is now promoting and combining two forms of re-territorialization in the Amazon region. On one hand, it is evicting peoples and communities from their territories (as in the case of mega projects like hydroelectric dams), stripping them of their means of survival. On the other hand, it is stripping those who remain on their territories of their relative autonomy, as in the case of environmental conservation areas. These populations may be allowed to remain on their land, but they are no longer able to use it in accordance with their ways of life. Their survival will no longer be guaranteed by subsistence farming – which has been transformed into a “threat” to the earth’s climate stability – but rather by a “bolsa verde” or “green allowance”, which in addition to being insufficient is paid in order to maintain the oil civilization.

Because we are fully aware of the risks posed by projects like these, we oppose the REDD agreement between California, Chiapas and Acre, which has already caused serious problems for indigenous and traditional communities such as those in the Amador Hernández region of Chiapas, Mexico. This is why we share our solidarity with the poor communities of California and Chiapas, who have already suffered from its consequences. We also share our solidarity with the indigenous peoples of the Isiboro Sécure National Park and Indigenous Territory (TIPNIS) in Bolivia, who are facing the threat of the violation of their territory by a highway linking Cochabamba and Beni, financed by the BNDES.

We are in a state which, in the 1970s and 1980s, was the stage for historical struggles against the predatory expansion of capital and in defence of territories inhabited by indigenous peoples and peasant communities of the forests. These struggles inspired many others in Brazil and around the world. In the late 1990s, however, Acre was converted into a laboratory for the IDB’s and World Bank’s experiments in the commodification and privatization of nature, and is now a state “intoxicated” by environmental discourse and victimized by the practice of “green capitalism”. Among the mechanisms used to legitimize this state of affairs, one of the most striking is the manipulation of the figure of Chico Mendes. To judge by what they present us with, we would have to consider him the patron saint of green capitalism. The name of this rubber tapper and environmental activist is used to defend oil exploitation, monoculture sugar cane plantations, large-scale logging activity and the sale of the air we breathe.

In view of this situation, we would have to ask if there is anything that could not be made to fit within this “sustainable development” model. Perhaps at no other time have cattle ranchers and logging companies met with a more favourable scenario. This is why we believe it is necessary and urgent to fight it, because under the guise of something new and virtuous, it merely reproduces the old and perverse strategies of the domination and exploitation of humans and nature.

Finally, we want to express here our support for the following demands: agrarian reform, official demarcation of indigenous lands, investments in agroecology and the solidarity economy, autonomous territorial management, health and education for all, and democratization of the media. In defence of the Amazon, of life, of the integrity of the peoples and their territories, and against REDD and the commodification of nature. Our struggle continues.

Rio Branco, Acre, 7 October 2011

Signed:

Assentamento de Produção Agro-Extrativista Limoeiro-Floresta Pública do Antimary (APAEPL)

Amazonlink

Cáritas – Manaus

Centro de Defesa dos Direitos Humanos e Educação Popular do Acre (CDDHEP/AC)

Centro de Estudos e Pesquisas para o Desenvolvimento do Extremo Sul da Bahia (CEPEDES)

Comissão Pastoral da Terra – CPT Acre

Conselho Indigenista Missionário – CIMI Regional Amazônia Ocidental

Conselho de Missão entre Índios – COMIN Assessoria Acre e Sul do Amazonas

Coordenação da União dos Povos Indígenas de Rondônia, Sul do Amazonas e Noroeste do Mato Grosso – CUNPIR

FERN

Fórum da Amazônia Ocidental (FAOC)

Global Justice Ecology Project

Grupo de Estudo sobre Fronteira e Identidade – Universidade Federal do Acre

Instituto Madeira Vivo (IMV-Rondônia)

Instituto Mais Democracia

Movimento Anticapitalista Amazônico – MACA

Movimento de Mulheres Camponesas (MMC – Roraima)

Nós Existimos – Roraima

Núcleo Amigos da Terra Brasil

Núcleo de Pesquisa Estado, Sociedade e Desenvolvimento na Amazônia Ocidental -Universidade Federal do Acre.

Oposição Sindical do STTR de Brasiléia

Rede Alerta Contra o Deserto Verde

Rede Brasil sobre Instituições Financeiras Multilaterais

Sindicato dos Trabalhadores Rurais de Bujarí (STTR – Bujarí)

Sindicato dos Trabalhadores Rurais de Xapuri (STTR- Xapuri)

Terra de Direitos

União de Mulheres Indígenas da Amazonia Brasileira

World Rainforest Movement (WRM)


Forgetting Is Part of Remembering (Science Daily)

ScienceDaily (Oct. 18, 2011) — It’s time for forgetting to get some respect, says Ben Storm, author of a new article on memory in Current Directions in Psychological Science, a journal of the Association for Psychological Science. “We need to rethink how we’re talking about forgetting and realize that under some conditions it actually does play an important role in the function of memory,” says Storm, who is a professor at the University of Illinois at Chicago.

“Memory is difficult. Thinking is difficult,” Storm says. Memories and associations accumulate rapidly. “These things could completely overrun our life and make it impossible to learn and retrieve new things if they were left alone, and could just overpower the rest of memory,” he says.

But, fortunately, that isn’t what happens. “We’re able to get around these strong competing inappropriate memories to remember the ones we want to recall.” Storm and other psychological scientists are trying to understand how our minds select the right things to recall — if someone’s talking about beaches near Omaha, Nebraska, for example, you will naturally suppress any knowledge you’ve collected about Omaha Beach in Normandy.

In one kind of experiment, participants are given a list of words that have some sort of relation to each other. They might be asked to memorize a list of birds, for example. In the next part of the test, they have to do a task that requires remembering half the birds. “That’s going to make you forget the other half of the birds in that list,” Storm says. That might seem bad — it’s forgetting. “But what the research shows is that this forgetting is actually a good thing.”

People who are good at forgetting information they don’t need are also good at problem solving and at remembering something when they’re being distracted with other information. This shows that forgetting plays an important role in problem solving and memory, Storm says.

There are plenty of times when forgetting makes sense in daily life. “Say you get a new cell phone and you have to get a new phone number, do you really want to remember your old phone number every time someone asks what your number is?” Storm asks. Or where you parked your car this morning — it’s important information today, but you’d better forget it when it comes time to go get your car for tomorrow afternoon’s commute. “We need to be able to update our memory so we can remember and think about the things that are currently relevant.”

Questioning Privacy Protections in Research (New York Times)

Dr. John Cutler, center, during the Tuskegee syphilis experiment. Abuses in that study led to ethics rules for researchers.

By PATRICIA COHEN
Published: October 23, 2011

Hoping to protect privacy in an age when a fingernail clipping can reveal a person’s identity, federal officials are planning to overhaul the rules that regulate research involving human subjects. But critics outside the biomedical arena warn that the proposed revisions may unintentionally create a more serious problem: sealing off vast collections of publicly available information from inspection, including census data, market research, oral histories and labor statistics.

Organizations that represent tens of thousands of scholars in the humanities and social sciences are scrambling to register their concerns before the Wednesday deadline for public comment on the proposals.

The rules were initially created in the 1970s after shocking revelations that poor African-American men infected with syphilis in Tuskegee, Ala., were left untreated by the United States Public Health Service so that doctors could study the course of the disease. Now every institution that receives money from any one of 18 federal agencies must create an ethics panel, called an institutional review board, or I.R.B.

More than 5,875 boards have to sign off on research involving human participants to ensure that subjects are fully informed, that their physical and emotional health is protected, and that their privacy is respected. Although only projects with federal financing are covered by what is known as the Common Rule, many institutions routinely subject all research with a human factor to review.

The changes in the ethical guidelines — the first comprehensive revisions in more than 30 years — were prompted by a surge of health-related research and technological advances.

Researchers in the humanities and social sciences are pleased that the reforms would address repeated complaints that medically oriented regulations have choked off research in their fields with irrelevant and cumbersome requirements. But they were dismayed to discover that the desire to protect individuals’ privacy in the genomics age resulted in rules that they say could also restrict access to basic data, like public-opinion polls.

Jerry Menikoff, director of the federal Office for Human Research Protections, which oversees the Common Rule, cautions that any alarm is premature, saying that federal officials do not intend to pose tougher restrictions on information that is already public. “If the technical rules end up doing that, we’ll try to come up with a result that’s appropriate,” he said.

Critics welcomed the assurance but remained skeptical. Zachary Schrag, a historian at George Mason University who wrote a book about the review process, said, “For decades, scholars in the social sciences and humanities have suffered because of rules that were well intended but poorly considered and drafted and whose unintended consequences restricted research.”

The American Historical Association, with 15,000 members, and the Oral History Association, with 900 members, warn that under the proposed revisions, for example, new revelations that Public Health Service doctors deliberately infected Guatemalan prisoners, soldiers and mental patients with syphilis in the 1940s might never have come to light. The abuses were uncovered by a historian who by chance came across notes in the archives of the University of Pittsburgh. That kind of undirected research could be forbidden under guidelines designed to prevent “data collected for one purpose” from being “used for a new purpose to which the subjects never consented,” said Linda Shopes, who helped draft the historians’ statement.

The suggested changes, she said, “really threaten access to information in a democratic society.”

Numerous organizations including the Consortium of Social Science Associations, which represents dozens of colleges, universities and research centers, expressed particular concern that the new standards might be modeled on federal privacy rules relating to health insurance and restrict use of the broadest of identifying information, like a person’s ZIP code, county or city.

The 11,000-member American Anthropological Association declared in a statement that any process that is based on the health insurance act’s privacy protections “would be disastrous for social and humanities research.” The 45,000-member American Association of University Professors warned that such restrictions “threaten mayhem” and “render impossible a great deal of social-science research, ranging from ethnographic community studies to demographic analysis that relies on census tracts to traffic models based on ZIP code to political polls that report by precinct.”

Dr. Menikoff said references to the statutes governing health insurance information were meant to serve as a starting point, not a blueprint. “Nothing is ruled out,” he said, though he wondered how the review system could be severed from the issue of privacy protection, as the consortium has discussed, “if the major risk for most of these studies is that you’re going to disclose information inadvertently.” If there is confidential information on a laptop, he said, requiring a password may be a reasonable requirement.

Ms. Shopes, Mr. Schrag and other critics emphasized that despite their worries they were happy with the broader effort to fix some longstanding problems with institutional review boards that held, say, an undergraduate interviewing Grandma for an oral history project to the same guidelines as a doctor doing experimental research on cancer patients.

“The system has been sliding into chaos in recent years,” said Alice Kessler-Harris, president of the 9,000-member Organization of American Historians. “No one can even agree on what is supposed to be covered in the humanities and social sciences.”

Vague rules designed to give the thousands of review boards flexibility when dealing with nonmedical subjects have instead resulted in higgledy-piggledy enforcement and layers of red tape even when no one is at risk, she said.

For example, Columbia University, where Ms. Kessler-Harris teaches, exempts oral history projects from review, while boards at the University of Illinois in Urbana-Champaign and the University of California, San Diego, have raised lengthy objections to similar interview projects proposed by undergraduate and master’s students, according to professors there.

Brown University has been sued by an associate professor of education who said the institutional review board overstepped its powers by barring her from using three years’ worth of research on how the parents of Chinese-American children made use of educational testing.

Ms. Shopes said board members at one university had suggested at one point that even using recorded interviews deposited at the Ronald Reagan Presidential Foundation and Library would have needed Reagan’s specific approval when he was alive.

Many nonmedical researchers praised the idea that scholars in fields like history, literature, journalism, languages and classics who use traditional methods of research should not have to submit to board review. They would like the office of human protections to go further and lift restrictions on research that may cause participants embarrassment or emotional distress. “Our job is to hold people accountable,” Ms. Kessler-Harris said.

Dr. Menikoff said, “We want to hear all these comments.” But he maintained that when the final language is published, critics may find themselves saying, “Wow, this is reasonable stuff.”

 

This article has been revised to reflect the following correction:

Correction: October 26, 2011

An article on Monday about federal officials’ plans to overhaul privacy rules that regulate research involving human subjects, and concerns raised by scholars, paraphrased incorrectly from comments by Linda Shopes, who helped draft a statement by historians about possible changes. She said that board members at a university (which she did not name) — not board members at the University of Chicago — suggested at one point that using recorded interviews deposited at the Ronald Reagan Presidential Foundation and Library would have needed Reagan’s specific approval when he was alive.

More anthropologists on Wall Street please (The Economist)

Education policy

Oct 24th 2011, 20:58 by M.S.

APPARENTLY Rick Scott, the governor of Florida, called two weeks ago for reducing funding for liberal-arts disciplines at state universities and shifting the money to science, technology, engineering and math, which he abbreviates to STEM. (Amusingly, if you Google “Rick Scott STEM” you end up getting multiple references to Mr Scott’s apparently non-operative campaign pledge to ban stem-cell research in Florida. Between the two issues, you’ve got a sort of operatic treatment of the modern Republican love-hate relationship with science.) Mr Scott seems to have repeatedly singled out the discipline of anthropology for derision. On one occasion, he apparently told a right-wing radio host: “You know, we don’t need a lot more anthropologists in the state. It’s a great degree if people want to get it, but we don’t need them here. I want to spend our dollars giving people science, technology, engineering, math degrees…so when they get out of school, they can get a job.” On another occasion, he’s quoted as telling a business group in Tallahassee: “Do you want to use your tax dollars to educate more people who can’t get jobs in anthropology? I don’t.”

Few would defend deliberately educating more people who can’t get jobs in anthropology, as such. (Of course, giving people math degrees rather than anthropology degrees will render them even less able to get jobs in anthropology.) Many, however, would defend educating more people in anthropology, regardless of what they wind up getting jobs in. In Slate on Friday, Michael Crow, president of Arizona State University, gave the traditional and entirely accurate pitch:

[R]esolving the complex challenges that confront our nation and the world requires more than expertise in science and technology. We must also educate individuals capable of meaningful civic participation, creative expression, and communicating insights across borders. The potential for graduates in any field to achieve professional success and to contribute significantly to our economy depends on an education that entails more than calculus.

Curricula expressly tailored in response to the demands of the workforce must be balanced with opportunities for students to develop their capacity for critical thinking, analytical reasoning, creativity, and leadership—all of which we learn from the full spectrum of disciplines associated with a liberal arts education. Taken together with the rigorous training provided in the STEM fields, the opportunities for exploration and learning that Gov. Scott is intent on marginalizing are those that have defined our national approach to higher education.

This is a solid response. What it lacks are rhetorical oomph and concrete examples. So here’s a concrete example with a little oomph. Some of the best analysis of the 2007-2008 financial crisis, and of the ongoing follies on Wall Street these days, has been produced by the Financial Times’ Gillian Tett. Ms Tett began warning that collateralised debt obligations and credit-default swaps were likely to lead to a major financial implosion in 2005 or so. The people who devise such complex derivatives are generally trained in physics or math. Ms Tett has a PhD in anthropology. Here’s a 2008 profile of Ms Tett by the Guardian’s Laurie Barton.

Tett began looking at the subject of credit five years ago. “Everyone was looking at the City and talking about M&A [mergers and acquisitions] and equity markets, and all the traditional high-glamour, high-status parts of the City. I got into this corner of the market because I passionately believed there was a revolution happening that had been almost entirely ignored. And I got really excited about trying to actually illustrate what was happening.”

Not that anyone particularly wanted to listen. “You could see everyone’s eyes glazing over … But my team, not just me, we very much warned of the dangers. Though I don’t think we expected the full scale of the disaster that’s unfolded.”

There is something exceedingly calm and thorough about Tett. She talks with the patient enthusiasm of a Tomorrow’s World presenter—a throwback, perhaps, to her days studying social anthropology, in which she has a PhD from Cambridge. “I happen to think anthropology is a brilliant background for looking at finance,” she reasons. “Firstly, you’re trained to look at how societies or cultures operate holistically, so you look at how all the bits move together. And most people in the City don’t do that. They are so specialised, so busy, that they just look at their own little silos. And one of the reasons we got into the mess we are in is because they were all so busy looking at their own little bit that they totally failed to understand how it interacted with the rest of society.

“But the other thing is, if you come from an anthropology background, you also try and put finance in a cultural context. Bankers like to imagine that money and the profit motive is as universal as gravity. They think it’s basically a given and they think it’s completely apersonal. And it’s not. What they do in finance is all about culture and interaction.”

Another person with an anthropology degree who’s been doing terrific work in recent years in a somewhat-related field is the Dutch journalist Joris Luyendijk, who produced a fantastic short book last year analysing the tribal culture of the Dutch parliament and the media circles that cover it. He’s currently working on a study of the City as well. Anyway, the general point is that while studying human behaviour through complex derivatives has its uses, there’s something to be said for the more rigorous and less egocentric analytical tools that anthropology brings into play, and it might be worth Mr Scott’s time to take a course or two. It’s never too late to learn.

The scientific finding that settles the climate-change debate (Washington Post)

By Eugene Robinson, Published: October 24

For the clueless or cynical diehards who deny global warming, it’s getting awfully cold out there.

The latest icy blast of reality comes from an eminent scientist whom the climate-change skeptics once lauded as one of their own. Richard Muller, a respected physicist at the University of California, Berkeley, used to dismiss alarmist climate research as being “polluted by political and activist frenzy.” Frustrated at what he considered shoddy science, Muller launched his own comprehensive study to set the record straight. Instead, the record set him straight.

“Global warming is real,” Muller wrote last week in The Wall Street Journal.

Rick Perry, Herman Cain, Michele Bachmann and the rest of the neo-Luddites who are turning the GOP into the anti-science party should pay attention.

“When we began our study, we felt that skeptics had raised legitimate issues, and we didn’t know what we’d find,” Muller wrote. “Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been careful in their work, despite their inability to convince some skeptics of that.”

In other words, the deniers’ claims about the alleged sloppiness or fraudulence of climate science are wrong. Muller’s team, the Berkeley Earth Surface Temperature project, rigorously explored the specific objections raised by skeptics — and found them groundless.

Muller and his fellow researchers examined an enormous data set of observed temperatures from monitoring stations around the world and concluded that the average land temperature has risen 1 degree Celsius — or about 1.8 degrees Fahrenheit — since the mid-1950s.

This agrees with the increase estimated by the United Nations-sponsored Intergovernmental Panel on Climate Change. Muller’s figures also conform with the estimates of those British and American researchers whose catty e-mails were the basis for the alleged “Climategate” scandal, which was never a scandal in the first place.

The Berkeley group’s research even confirms the infamous “hockey stick” graph — showing a sharp recent temperature rise — that Muller once snarkily called “the poster child of the global warming community.” Muller’s new graph isn’t just similar, it’s identical.

Muller found that skeptics are wrong when they claim that a “heat island” effect from urbanization is skewing average temperature readings; monitoring instruments in rural areas show rapid warming, too. He found that skeptics are wrong to base their arguments on the fact that records from some sites seem to indicate a cooling trend, since records from at least twice as many sites clearly indicate warming. And he found that skeptics are wrong to accuse climate scientists of cherry-picking the data, since the readings that are often omitted — because they are judged unreliable — show the same warming trend.

Muller and his colleagues examined five times as many temperature readings as did other researchers — a total of 1.6 billion records — and now have put that merged database online. The results have not yet been subjected to peer review, so technically they are still preliminary. But Muller’s plain-spoken admonition that “you should not be a skeptic, at least not any longer” has reduced many deniers to incoherent grumbling or stunned silence.

Not so, I predict, with the blowhards such as Perry, Cain and Bachmann, who, out of ignorance or perceived self-interest, are willing to play politics with the Earth’s future. They may concede that warming is taking place, but they call it a natural phenomenon and deny that human activity is the cause.

It is true that Muller made no attempt to ascertain “how much of the warming is due to humans.” Still, the Berkeley group’s work should help lead all but the dimmest policymakers to the overwhelmingly probable answer.

We know that the rise in temperatures over the past five decades is abrupt and very large. We know it is consistent with models developed by other climate researchers that posit greenhouse gas emissions — the burning of fossil fuels by humans — as the cause. And now we know, thanks to Muller, that those other scientists have been both careful and honorable in their work.

Nobody’s fudging the numbers. Nobody’s manipulating data to win research grants, as Perry claims, or making an undue fuss over a “naturally occurring” warm-up, as Bachmann alleges. Contrary to what Cain says, the science is real.

It is the know-nothing politicians — not scientists — who are committing an unforgivable fraud.

Basin Committees to present motion against Forest Code reform (Ascom da ANA)

JC e-mail 4372, October 26, 2011.

Meeting in São Luís (MA) for the 13th National Meeting of River Basin Committees, committee representatives from across Brazil will present on Friday (28th) a statement against the reduction of Environmental Protection Areas.

Representatives of River Basin Committees from several regions of the country are preparing a motion against the reduction of environmentally protected areas along riverbanks, in protest against the text of the Forest Code reform approved by the Chamber of Deputies in May, which allows the use of permanent preservation areas (APPs). The bill is now before the Senate and is expected to reach the floor by the end of the year.

The motion will be presented on Friday (28th), the last day of the 13th National Meeting of River Basin Committees (Encob), which opens today in São Luís (MA).

Brazil currently has about 180 committees, ten of them on federal rivers, with representatives from different segments of society spread across several basins. In all, more than 50,000 people are engaged in defending water resources. These committees function as water parliaments: they bring together local water users, non-governmental organizations, civil society, and representatives of government at all three levels (municipal, state, and federal), who meet in plenary sessions.

The National Water Agency (ANA) provides technical support to the federal committees, while local management bodies support the state ones, as determined by Law 9,433 of 1997, known as the Water Law, which established the National Water Resources Policy (PNRH) and created the National Water Resources Management System (Singreh). Every year, basin committee representatives meet to take stock of water resources management and the work of these local bodies, and to discuss the challenges of implementing the PNRH. This year, however, the Forest Code reform dominated the opening ceremony of the 13th Encob, held last night (25th) in São Luís.

“Encob is the largest national water meeting on the planet; it therefore brings together the views of many segments of society, from users to researchers, managers, and civil society,” said ANA’s director-president, Vicente Andreu. “It is essential that a strong signal be sent to Congress. Time is short and we need to get a very firm position to the senators,” he added. In April, ANA released a Technical Note explaining why the agency defends maintaining forest cover around rivers at the proportion currently established by the Forest Code, that is, at least 30 meters. The bill proposes reducing the minimum protection strip to 15 meters. Riparian forests are essential to protect rivers and ensure water quality.

Federal deputy Sarney Filho (PV-MA) promised to take the Encob’s analyses to the Chamber of Deputies’ Rio+20 Subcommittee. “We all know that our rivers are threatened by sewage discharge, by the clearing of riparian forests, and now by the Forest Code reform,” he said.

For Lupércio Ziroldo Antônio, president of the Basin Organizations Network (Rebob) and general coordinator of the National Forum of River Basin Committees, “in the eyes of the world Brazil is considered a water power, holding 13% of the planet’s water and some of the world’s largest aquifers; it therefore needs to set an example, especially in the coming months, when two major international meetings on the environment and water resources will take place: the World Water Forum, in March 2012 in Marseille, France, and Rio+20, in June 2012.”

Several of the topics being debated at Encob this week may come up at Rio+20. ANA’s proposals for the Rio meeting include creating a fund for payments for environmental services to protect water springs, along the lines of ANA’s Water Producer Program; creating a global payment program for sewage treatment, based on ANA’s Prodes (River Basin Cleanup Program); and creating a global water governance body within the United Nations.

The Encob program includes water resources management courses for members of basin committees and local water management bodies, meetings of interstate committees, a meeting of the Brazil section of the World Water Council, a workshop on adapting water resources management to climate change, and panel discussions on urban water springs and the role of committees in universalizing sanitation, among other topics.

Delays in demarcations drive occupations (Carta Capital)

Joana Moncau and Spensy Pimentel, October 25, 2011, 2:56 p.m.

Many groups have run out of patience, because even areas declared indigenous decades ago remain occupied by settlers. Photos: Joana Moncau and Spensy Pimentel

It was at the invitation of the indigenous leaders themselves that we arrived at the site of the black-tarp shacks of the nearly 70 Guarani-Kaiowá families. At the end of May, they left their homes on the Panambi reservation to set up the Guyra Kambi’y camp, just a few hundred meters from another, Yta’y Ka’aguyrusu, formed in September of last year amid conflicts with the settlers who came to the region at the invitation of the federal government in the 1940s and 1950s.

A few weeks before our visit, a 56-year-old Indian living at the site was found hanged on the plot where he used to gather firewood. “We don’t really understand what happened; he was even helping to prepare a prayer house. All this delay sometimes leaves people sad,” one of the indigenous residents comments.

The delay in demarcating indigenous lands in Mato Grosso do Sul has driven the formation of more and more camps. Many groups have run out of patience, because even areas declared indigenous decades ago are occupied by settlers, as in Panambi, where, of the 2,000 hectares demarcated in the 1970s, the Indians effectively occupy only 300. In Dourados alone, home to the reservation in the most critical situation (reportedly up to 15,000 Indians on 3,500 hectares), two camps have sprung up this year.

Indians camp in MS while awaiting demarcation

A survey by the Indigenist Missionary Council, updated this month, found 31 Guarani-Kaiowá camps in the southern region of Mato Grosso do Sul. They are not always in conflict situations like those of Ypo’i, Pyelito, and Kurusu Amba (see previous story), but vulnerability is a constant: some groups have lived in poverty by the roadsides for decades, with precarious access to the most basic rights, such as health care, education, and civil documentation.

From the moment groups leave the overcrowded reservations to occupy farms and claim their right to their lands, they expose themselves even more. The only assistance they then receive is federal, from the National Indian Foundation (Funai) and the Special Secretariat for Indigenous Health (Sesai). Social benefits such as basic food baskets provided by the state are automatically cut off. “The state and the municipalities provide absolutely no assistance to these groups,” says Maria Aparecida Mendes de Oliveira, Funai’s regional coordinator in Dourados.

In areas inside farms, even Sesai and Funai can often act only with a court order. In Ypo’i, for example, according to Funai, health and social assistance teams may enter only once every 15 days. “The problem is that people don’t choose a time to get sick,” Maria Aparecida complains. Even Funai’s food basket distribution program has problems, since it depends on donations from the National Supply Company (Conab). In some months, the food simply does not arrive.

The state government’s rhetoric stresses the need for public policies for the Indians, as opposed to their demands for land. In 2009, Governor André Puccinelli went so far as to state: “They don’t want as much land as Funai wants to give them. The Indians want less land and more social programs.” Yet even on the reservations already demarcated, services are terrible. In health care, complaints of embezzlement and inefficiency are constant. Houses recently built with federal funds are delivered full of defects and with shoddy finishing. In the schools, the policy of differentiated education, which calls for teaching in the Guarani language, among other elements, is now under threat: mayors of several cities have fired indigenous teachers without the slightest consultation with the communities, often hiring whites for their posts.

Indigenous camp in MS

The crisis in the villages is also intensified by the lack of economic alternatives amid the scarcity of land. Even precarious work cutting sugarcane for the sugar and ethanol mills is now under threat. Cane planting is being progressively mechanized, which means more unemployment and hunger among the Indians if the land problem is not resolved soon.

To confront recent attacks such as those in Pyelito and Ypo’i, the Guarani-Kaiowá political movement, known as Aty Guasu (great assembly), is asking the Human Rights Secretariat of the Presidency of the Republic to step up its presence in conflict areas. Since 2006, through the Council for the Defense of the Rights of the Human Person (CDDPH), a state body linked to the SDH, the crisis facing the Guarani-Kaiowá has been recognized by the government as one of the country’s most serious human rights challenges. Several indigenous leaders are already enrolled in the Program for the Protection of Human Rights Defenders.

One sign that the demand for greater security is being met was last week’s renewal of the Ministry of Justice ordinance authorizing the presence of the National Public Security Force to support the Federal Police in operations in Guarani-Kaiowá villages. The Indians’ hope is that the so-called Operation Tekoha, currently focused on the hyper-violent reservations of Dourados, Amambai, and Caarapó, will be extended to areas along the border and help curb attacks on Indians in conflict regions such as Paranhos and Tacuru.

Public security actions are necessary palliatives, because the land dispute is likely to drag on for several years. The major discussion at the moment is whether, in case of demarcation, payment should be made not only for improvements on lands deemed indigenous but also for the land itself, something the Constitution prohibits. Since settlement in the state had broad support from both the federal and state governments, a good number of ranchers hold titles to the land, which makes the situation particularly delicate.

A leader shows marks of violence in a demarcation area

To get around the slow progress of a Proposed Constitutional Amendment (PEC) in Congress, state deputy Laerte Tetila (PT) introduced in the MS Legislative Assembly a bill to create a State Fund for the Acquisition of Indigenous Lands. From the indigenous movement to the ranchers, the various actors involved in the conflict are now analyzing the proposal.

The debate over the land question in MS also reached the National Council of Justice this year. In May, a special commission was formed to discuss the judicial impasse surrounding the demarcations: a 2009 survey found 87 lawsuits involving the conflict over indigenous lands in the state. With the six identification reports on Guarani-Kaiowá areas begun in 2008 expected to be released by early next year, the hope is that negotiation at the CNJ will keep the process from being completely paralyzed by court battles.

The crisis involving the Guarani-Kaiowá is the most serious in MS, but not the only one involving disputes over indigenous lands. The Terena, the state’s second-largest indigenous people, with just over 20,000 members, have also been demanding the demarcation of their lands, currently reduced to a few reservations defined in the early twentieth century. At a recent assembly, they announced that they will resume occupying lands claimed as indigenous before the end of the year. As can be seen, the problems in the state are likely to worsen if the federal government does not act quickly.

According to the Constitution, the demarcation of indigenous lands throughout Brazil should have been completed 18 years ago, in 1993. The Lula administration ratified only three Guarani-Kaiowá territories, and two of those processes remain suspended by the Supreme Federal Court (STF) to this day; the only one of the new territories effectively occupied by the indigenous people, Panambizinho, in Dourados, covers little more than 1,200 hectares. As minister of Justice, Tarso Genro had been guaranteeing the continuation of the process begun in 2008 in MS, despite pressure from the ruralist caucus and the PMDB. It is now time for his successor, José Eduardo Cardozo, to show what he is made of.

To tackle the crisis in MS, federal government will launch special committee

Next month the federal government is to officially re-create a special coordination of the public policies aimed at the Guarani-Kaiowá Indians of southern Mato Grosso do Sul. The so-called Management Committee for Integrated Indigenist Policies for the Southern Cone of MS will be installed at a meeting attended by representatives of more than ten ministries in Dourados, the region's main city, on November 28 and 29.

Bullets from shots fired at Indians in a conflict area

The announcement was made last Thursday (20th) by Paulo Maldos, national secretary for Social Articulation of the General Secretariat of the Presidency of the Republic, after a visit to the indigenous camp of Ypo’i, where three people have been killed since 2009 and the community currently lives in what the secretary described as a “humanitarian crisis” (see previous story).

“The situation in the Southern Cone of Mato Grosso do Sul has gone beyond all imaginable limits,” Maldos said in an interview with CartaCapital. “The federal government will no longer tolerate this climate of violence in the region. We know that the only solution is land demarcation, but it is a long road, and we cannot wait. We are going to strengthen the protection network being formed among the indigenous communities that are victims of violence. The goal is to guarantee the communities’ lives and integrity.”

In 2006, after the press reported deaths from malnutrition among Guarani-Kaiowá children, the federal government had already created a similar management committee. Once the case faded from public debate, the initiative lost momentum. Now Maldos promises that this coordination of federal actions for the indigenous people will be for real: “There will be top priority in every sense. We will act in the most varied areas: health, education, support for production, security, culture, communication, and whatever else is needed.” “We want to signal to the region that we intend to do justice to the historic rights of the Guarani-Kaiowá starting now. We will not wait for the demarcations.”

Maldos also said that all Guarani-Kaiowá communities will be covered by the committee’s policies, regardless of where they are located: “The State will reach every community, whether on demarcated lands or not, on roadsides or even inside ranches.”

Indians during a protest for demarcation. Photo: Cimi

In recent years, some of the main international human rights reports have singled out the situation of the Guarani-Kaiowá as one of the gravest among the indigenous peoples of the Americas. Last year the Aty Guasu (“great gathering” in Guarani), the assembly that brings together representatives of the dozens of communities of these people, received the Human Rights Prize from the Presidency of the Republic. “Everything that is done will be done together with them. The Aty Guasu is our partner,” says Maldos.

The secretary said the choice to go to MS and make these announcements in Ypo’i was deliberate. “We went to visit the most brutalized community of all. On top of everything the Guarani-Kaiowá in general suffer, there they are subjected to a veritable confinement,” he relates. “I had been following reports of violence against the Guarani-Kaiowá for many years, but going there left me even more outraged by everything I saw.”

At the end of the month, two years will have passed since an emblematic crime: the murder of two Guarani teachers in Ypo’i, Rolindo and Genivaldo Vera. “No crime will go unpunished. We are going to identify these criminals,” Maldos pledged. Among the acts of violence committed, the secretary recalls, there have even been threats against the very anthropologists taking part in the land-demarcation processes.

20,000 slaves in the country (Correio Braziliense)

JC e-mail 4372, October 26, 2011.

The International Labour Organization (ILO) yesterday (25th) released a profile of rural slave labor in Brazil, indicating that 81% of the people living in conditions analogous to slavery are black, young, and have little schooling.

The study was based on interviews with freed workers, recruiters, and employers on farms in Pará, Mato Grosso, Bahia, and Goiás between 2006 and 2007.

Besides the predominance of black workers, the document points out that about 93% of these people began working before the age of 16, which constitutes child labor, and that almost 75% of them are illiterate. The study found that most of the employers and of the recruiters, the so-called “gatos,” are white.

For Luiz Machado, coordinator of the ILO’s area for combating slave labor, the figure reflects how vulnerable the poorest segment of the population, mostly black, is to slave labor. “This is a vestige of colonial exploitation,” he stated. The coordinator also points to the lack of schooling in childhood as a driver of the problem: “Child labor takes away future possibilities and eases the path into slave labor. People without schooling have no opportunities.”

The Labor Prosecution Office (MPT) estimates that about 20,000 people are currently subjected to forced or degrading labor in Brazil. Since 1995, more than 40,000 workers have been freed in the country, which has made an international commitment to eradicate the practice by 2015. The MPT’s national coordinator for Combating Slave Labor, Débora Tito, reports that policies on the issue have focused on what she calls the “pedagogy of the pocketbook.”

The idea is to tackle the problem through heavy fines and by adding employers’ names to blacklists so that they can no longer obtain bank financing. “We have to make this practice economically unviable, so that ranchers stop cutting costs at the expense of workers’ dignity,” said the prosecutor. According to her, the penalty for employing labor analogous to slavery is two to eight years in prison, but there have been few convictions in the country.

Convention – The trade union confederations representing public servants at all three levels of government are struggling to define the bill that will address issues such as the right to strike, collective bargaining, and releasing union leaders from clocking in so they can devote themselves to their categories’ affairs, all items of Convention 151 of the International Labour Organization (ILO), which is due to be regulated by the end of the year. At a public hearing in the Chamber of Deputies yesterday, the tug-of-war revolved around the union tax, a paycheck deduction of one day’s salary per year, as already happens with private-sector workers.

Bleak Prospects for Avoiding Dangerous Global Warming (Science)

by Richard A. Kerr on 23 October 2011, 1:00 PM

The bad news just got worse: A new study finds that reining in greenhouse gas emissions in time to avert serious changes to Earth’s climate will be at best extremely difficult. Current goals for reducing emissions fall far short of what would be needed to keep warming below dangerous levels, the study suggests. To succeed, we would most likely have to reverse the rise in emissions immediately and follow through with steep reductions through the century. Starting later would be far more expensive and require unproven technology.

Published online today in Nature Climate Change, the new study merges model estimates of how much greenhouse gas society might put into the atmosphere by the end of the century with calculations of how climate might respond to those human emissions. Climate scientist Joeri Rogelj of ETH Zurich and his colleagues combed the published literature for model simulations that keep global warming below 2°C at the lowest cost. They found 193 examples. Modelers running such optimal-cost simulations tried to include every factor that might influence the amount of greenhouse gases society will produce, including the rate of technological progress in burning fuels efficiently, the amount of fossil fuels available, and the development of renewable fuels. The researchers then fed the full range of emissions from the scenarios into a simple climate model to estimate the odds of avoiding a dangerous warming.

The results suggest challenging times ahead for decision makers hoping to curb the greenhouse. Strategies that are both plausible and likely to succeed call for emissions to peak this decade and start dropping right away. They should be well into decline by 2020 and far less than half of current emissions by 2050. Only three of the 193 scenarios examined would be very likely to keep the warming below the danger level, and all of those require heavy use of energy systems that actually remove greenhouse gases from the atmosphere. That would require, for example, both creating biofuels and storing the carbon dioxide from their combustion in the ground.

“The alarming thing is very few scenarios give the kind of future we want,” says climate scientist Neil Edwards of The Open University in Milton Keynes, U.K. Both he and Rogelj emphasize the uncertainties inherent in the modeling, especially on the social and technological side, but the message seems clear to Edwards: “What we need is at the cutting edge. We need to be as innovative as we can be in every way.” And even then, success is far from guaranteed.

A skeptical physicist ends up confirming climate data (Washington Post)

Posted by Brad Plumer at 04:18 PM ET, 10/20/2011
Back in 2010, Richard Muller, a Berkeley physicist and self-proclaimed climate skeptic, decided to launch the Berkeley Earth Surface Temperature (BEST) project to review the temperature data that underpinned global-warming claims. Remember, this was not long after the Climategate affair had erupted, at a time when skeptics were griping that climatologists had based their claims on faulty temperature data. (Photo: Jonathan Hayward/AP)

Muller’s stated aims were simple. He and his team would scour and re-analyze the climate data, putting all their calculations and methods online. Skeptics cheered the effort. “I’m prepared to accept whatever result they produce, even if it proves my premise wrong,” wrote Anthony Watts, a blogger who has criticized the quality of the weather stations in the United States that provide temperature data. The Charles G. Koch Foundation even gave Muller’s project $150,000, and the Koch brothers, recall, are hardly fans of mainstream climate science.

So what are the end results? Muller’s team appears to have confirmed the basic tenets of climate science. Back in March, Muller told the House Science and Technology Committee that, contrary to what he expected, the existing temperature data was “excellent.” He went on: “We see a global warming trend that is very similar to that previously reported by the other groups.” And, today, the BEST team has released a flurry of new papers that confirm that the planet is getting hotter. As the team’s two-page summary flatly concludes, “Global warming is real.”

Here’s a chart comparing their findings with existing data:

The BEST team tried to take a number of skeptic claims seriously, to see if they panned out. Take, for instance, their paper on the “urban heat island effect.” Watts has long argued that many weather stations collecting temperature data could be biased by being located in cities. Since cities are naturally warmer than rural areas (because building materials retain more heat), the uptick in recorded temperatures might be exaggerated, an illusion spawned by increased urbanization. So Muller’s team decided to compare overall temperature trends with only those from weather stations based in rural areas. And, as it turns out, the trends match up well. “Urban warming does not unduly bias estimates of recent global temperature change,” Muller’s group concluded.

That shouldn’t be so jaw-dropping. Previous analyses — like this one from the National Oceanic and Atmospheric Administration — have responded to Watts’ concerns by showing that a few flawed stations don’t warp the overall trend. But maybe Muller’s team can finally put this controversy to rest, right? Well, not yet. As Watts responds over at his site, the BEST papers still haven’t been peer-reviewed (an important caveat, to be sure). And Watts isn’t pleased with how much pre-publication hype the studies are getting. But so far, what we have is a prominent skeptic casting a critical eye at the data and finding, much to his own surprise, that the data holds up.