Monthly archive: October 2011

More anthropologists on Wall Street please (The Economist)

Education policy

Oct 24th 2011, 20:58 by M.S.

APPARENTLY Rick Scott, the governor of Florida, called two weeks ago for reducing funding for liberal-arts disciplines at state universities and shifting the money to science, technology, engineering and math, which he abbreviates to STEM. (Amusingly, if you Google “Rick Scott STEM” you end up getting multiple references to Mr Scott’s apparently non-operative campaign pledge to ban stem-cell research in Florida. Between the two issues, you’ve got a sort of operatic treatment of the modern Republican love-hate relationship with science.) Mr Scott seems to have repeatedly singled out the discipline of anthropology for derision. On one occasion, he apparently told a right-wing radio host: “You know, we don’t need a lot more anthropologists in the state. It’s a great degree if people want to get it, but we don’t need them here. I want to spend our dollars giving people science, technology, engineering, math degrees…so when they get out of school, they can get a job.” On another occasion, he’s quoted as telling a business group in Tallahassee: “Do you want to use your tax dollars to educate more people who can’t get jobs in anthropology? I don’t.”

Few would defend deliberately educating more people who can’t get jobs in anthropology, as such. (Of course, giving people math degrees rather than anthropology degrees will render them even less able to get jobs in anthropology.) Many, however, would defend educating more people in anthropology, regardless of what they wind up getting jobs in. In Slate on Friday, Michael Crow, president of Arizona State University, gave the traditional and entirely accurate pitch:

[R]esolving the complex challenges that confront our nation and the world requires more than expertise in science and technology. We must also educate individuals capable of meaningful civic participation, creative expression, and communicating insights across borders. The potential for graduates in any field to achieve professional success and to contribute significantly to our economy depends on an education that entails more than calculus.

Curricula expressly tailored in response to the demands of the workforce must be balanced with opportunities for students to develop their capacity for critical thinking, analytical reasoning, creativity, and leadership—all of which we learn from the full spectrum of disciplines associated with a liberal arts education. Taken together with the rigorous training provided in the STEM fields, the opportunities for exploration and learning that Gov. Scott is intent on marginalizing are those that have defined our national approach to higher education.

This is a solid response. What it lacks are rhetorical oomph and concrete examples. So here’s a concrete example with a little oomph. Some of the best analysis of the 2007-2008 financial crisis, and of the ongoing follies on Wall Street these days, has been produced by the Financial Times’s Gillian Tett. Around 2005, Ms Tett began warning that collateralised debt obligations and credit-default swaps were likely to lead to a major financial implosion. The people who devise such complex derivatives are generally trained in physics or math. Ms Tett has a PhD in anthropology. Here’s a 2008 profile of Ms Tett by the Guardian’s Laurie Barton.

Tett began looking at the subject of credit five years ago. “Everyone was looking at the City and talking about M&A [mergers and acquisitions] and equity markets, and all the traditional high-glamour, high-status parts of the City. I got into this corner of the market because I passionately believed there was a revolution happening that had been almost entirely ignored. And I got really excited about trying to actually illustrate what was happening.”

Not that anyone particularly wanted to listen. “You could see everyone’s eyes glazing over … But my team, not just me, we very much warned of the dangers. Though I don’t think we expected the full scale of the disaster that’s unfolded.”

There is something exceedingly calm and thorough about Tett. She talks with the patient enthusiasm of a Tomorrow’s World presenter—a throwback, perhaps, to her days studying social anthropology, in which she has a PhD from Cambridge. “I happen to think anthropology is a brilliant background for looking at finance,” she reasons. “Firstly, you’re trained to look at how societies or cultures operate holistically, so you look at how all the bits move together. And most people in the City don’t do that. They are so specialised, so busy, that they just look at their own little silos. And one of the reasons we got into the mess we are in is because they were all so busy looking at their own little bit that they totally failed to understand how it interacted with the rest of society.

“But the other thing is, if you come from an anthropology background, you also try and put finance in a cultural context. Bankers like to imagine that money and the profit motive is as universal as gravity. They think it’s basically a given and they think it’s completely apersonal. And it’s not. What they do in finance is all about culture and interaction.”

Another person with an anthropology degree who’s been doing terrific work in recent years in a somewhat-related field is the Dutch journalist Joris Luyendijk, who produced a fantastic short book last year analysing the tribal culture of the Dutch parliament and the media circles that cover it. He’s currently working on a study of the City as well. Anyway, the general point is that while studying human behaviour through complex derivatives has its uses, there’s something to be said for the more rigorous and less egocentric analytical tools that anthropology brings into play, and it might be worth Mr Scott’s time to take a course or two. It’s never too late to learn.

The scientific finding that settles the climate-change debate (Washington Post)

By Eugene Robinson, Published: October 24

For the clueless or cynical diehards who deny global warming, it’s getting awfully cold out there.

The latest icy blast of reality comes from an eminent scientist whom the climate-change skeptics once lauded as one of their own. Richard Muller, a respected physicist at the University of California, Berkeley, used to dismiss alarmist climate research as being “polluted by political and activist frenzy.” Frustrated at what he considered shoddy science, Muller launched his own comprehensive study to set the record straight. Instead, the record set him straight.

“Global warming is real,” Muller wrote last week in The Wall Street Journal.

Rick Perry, Herman Cain, Michele Bachmann and the rest of the neo-Luddites who are turning the GOP into the anti-science party should pay attention.

“When we began our study, we felt that skeptics had raised legitimate issues, and we didn’t know what we’d find,” Muller wrote. “Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been careful in their work, despite their inability to convince some skeptics of that.”

In other words, the deniers’ claims about the alleged sloppiness or fraudulence of climate science are wrong. Muller’s team, the Berkeley Earth Surface Temperature project, rigorously explored the specific objections raised by skeptics — and found them groundless.

Muller and his fellow researchers examined an enormous data set of observed temperatures from monitoring stations around the world and concluded that the average land temperature has risen 1 degree Celsius — or about 1.8 degrees Fahrenheit — since the mid-1950s.
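The computation at the heart of such a study is conceptually simple: average many station records into a single land series, then fit a linear trend to it. Here is a minimal sketch in Python using made-up station data, not the BEST dataset; the station count, noise level and underlying trend are illustrative assumptions, not Muller’s numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate annual mean temperature anomalies (degrees C) for some
# hypothetical stations, 1955-2010: a shared warming trend of about
# 1 degree C over the period, plus independent noise at each station.
years = np.arange(1955, 2011)
true_trend = 1.0 / (years[-1] - years[0])  # degrees C per year
n_stations = 50
noise = rng.normal(0.0, 0.3, size=(n_stations, years.size))
stations = true_trend * (years - years[0]) + noise

# Average across stations to get a global land series,
# then fit a least-squares linear trend to it.
global_series = stations.mean(axis=0)
slope, intercept = np.polyfit(years, global_series, 1)
total_warming = slope * (years[-1] - years[0])
print(f"Estimated warming 1955-2010: {total_warming:.2f} C")
```

Averaging many records suppresses the noise of any single station, which is why an analysis of this kind can tolerate (and test) readings that individual skeptics considered unreliable.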

This agrees with the increase estimated by the United Nations-sponsored Intergovernmental Panel on Climate Change. Muller’s figures also conform with the estimates of those British and American researchers whose catty e-mails were the basis for the alleged “Climategate” scandal, which was never a scandal in the first place.

The Berkeley group’s research even confirms the infamous “hockey stick” graph — showing a sharp recent temperature rise — that Muller once snarkily called “the poster child of the global warming community.” Muller’s new graph isn’t just similar, it’s identical.

Muller found that skeptics are wrong when they claim that a “heat island” effect from urbanization is skewing average temperature readings; monitoring instruments in rural areas show rapid warming, too. He found that skeptics are wrong to base their arguments on the fact that records from some sites seem to indicate a cooling trend, since records from at least twice as many sites clearly indicate warming. And he found that skeptics are wrong to accuse climate scientists of cherry-picking the data, since the readings that are often omitted — because they are judged unreliable — show the same warming trend.

Muller and his colleagues examined five times as many temperature readings as did other researchers — a total of 1.6 billion records — and now have put that merged database online. The results have not yet been subjected to peer review, so technically they are still preliminary. But Muller’s plain-spoken admonition that “you should not be a skeptic, at least not any longer” has reduced many deniers to incoherent grumbling or stunned silence.

Not so, I predict, with the blowhards such as Perry, Cain and Bachmann, who, out of ignorance or perceived self-interest, are willing to play politics with the Earth’s future. They may concede that warming is taking place, but they call it a natural phenomenon and deny that human activity is the cause.

It is true that Muller made no attempt to ascertain “how much of the warming is due to humans.” Still, the Berkeley group’s work should help lead all but the dimmest policymakers to the overwhelmingly probable answer.

We know that the rise in temperatures over the past five decades is abrupt and very large. We know it is consistent with models developed by other climate researchers that posit greenhouse gas emissions — the burning of fossil fuels by humans — as the cause. And now we know, thanks to Muller, that those other scientists have been both careful and honorable in their work.

Nobody’s fudging the numbers. Nobody’s manipulating data to win research grants, as Perry claims, or making an undue fuss over a “naturally occurring” warm-up, as Bachmann alleges. Contrary to what Cain says, the science is real.

It is the know-nothing politicians — not scientists — who are committing an unforgivable fraud.

River-basin committees to present motion against Forest Code reform (Ascom da ANA)

JC e-mail 4372, October 26th, 2011.

Gathered in São Luís (MA) for the 13th National Meeting of River Basin Committees, representatives of committees from all over Brazil will present on Friday (28th) a statement against the reduction of environmental protection areas.

Representatives of river-basin committees from various regions of the country are preparing a motion against the reduction of the protected areas along riverbanks, in protest against the text of the Forest Code reform approved by the Chamber of Deputies in May, which allows the use of permanent preservation areas (APPs). The text is now before the Senate and is expected to go to the floor by the end of the year.

The motion will be presented on Friday (28th), the last day of the 13th National Meeting of River Basin Committees (Encob), which begins today in São Luís (MA).

Brazil currently has about 180 committees, ten of them on federal rivers, with representation from different segments of society, spread across various basins. In all, more than 50,000 people are engaged in defending water resources. These committees function as water parliaments: they are made up of local water users, non-governmental organizations, civil society and representatives of government at all three levels (municipal, state and federal), who meet in plenary sessions.

The National Water Agency (ANA) provides technical support to the federal committees, and local management bodies support the state ones, as determined by Law 9.433 of 1997, known as the Water Law, which established the National Water Resources Policy (PNRH) and created the National Water Resources Management System (Singreh). Every year, basin-committee representatives meet to take stock of water-resources management and the performance of these local arrangements, and to debate the challenges of implementing the PNRH. This year, however, the Forest Code reform dominated the opening ceremony of the 13th Encob last night (25th) in São Luís.

“Encob is the largest national water meeting on the planet; it brings together the views of many segments of society, from users to researchers, managers and civil society,” said ANA’s director-president, Vicente Andreu. “It is essential to send a strong signal to Congress. Time is short and we need to get a very firm position to the senators,” he added. In April, ANA released a technical note explaining why the agency favours maintaining forest cover around rivers at the current proportion established by the Forest Code, that is, at least 30 metres. The bill proposes reducing the minimum protected strip to 15 metres. Riparian forests are essential to protect rivers and guarantee water quality.

Federal deputy Sarney Filho (PV-MA) promised to take Encob’s analyses to the Chamber of Deputies’ Rio+20 subcommittee. “We all know that our rivers are threatened by sewage discharge, by the clearing of riparian forests and now by the Forest Code reform,” he said.

According to Lupércio Ziroldo Antônio, president of the Network of Basin Organizations (Rebob) and general coordinator of the National Forum of River Basin Committees, “in the eyes of the world Brazil is considered a water power, holding 13% of the planet’s water and some of the world’s largest aquifers, and it therefore needs to set an example, especially in the coming months, when two important international meetings on the environment and water resources will take place: the World Water Forum, in March 2012 in Marseille, France; and Rio+20, in June 2012”.

Several of the topics being debated at Encob this week may be taken up at Rio+20. ANA’s proposals for the Rio meeting include the creation of a fund for payments for environmental services to protect springs, along the lines of ANA’s Produtor de Água (Water Producer) programme; the creation of a global payment programme for sewage treatment, based on ANA’s Prodes (River Basin Clean-up Programme); and the creation of a global water-governance body within the United Nations.

The Encob programme includes courses on water-resources management for members of basin committees and local water-management bodies, meetings of interstate committees, a meeting of the Brazil section of the World Water Council, a workshop on adapting water-resources management to climate change, and panel discussions on urban springs and the role of committees in universalizing sanitation, among other debates.

Delays in demarcations drive occupations (Carta Capital)

Joana Moncau and Spensy Pimentel, October 25th, 2011, 2:56 pm

Many groups have run out of patience, because even areas declared indigenous decades ago remain occupied by settlers. Photos: Joana Moncau and Spensy Pimentel

It was at the invitation of the indigenous leaders themselves that we arrived at the site of the black-tarpaulin tents of the nearly 70 Guarani-Kaiowá families. At the end of May, they left their houses on the Panambi reserve to set up the Guyra Kambi’y camp, just a few hundred metres from another one, Yta’y Ka’aguyrusu, formed in September last year amid conflicts with the settlers who came to the region at the invitation of the federal government in the 1940s and 1950s.

A few weeks before our visit, a 56-year-old Indian living at the site was found hanged on the plot where he used to gather firewood. “We don’t really understand what happened; he was even helping to build a prayer house. All this waiting sometimes leaves people sad,” one of the Indians comments.

Delays in the demarcation of indigenous lands in Mato Grosso do Sul have driven the formation of more and more camps. Many groups have run out of patience, because even areas declared indigenous decades ago remain occupied by settlers: that is what is happening in Panambi, where, of the 2,000 hectares demarcated in the 1970s, the Indians effectively occupy only 300. In Dourados alone, home to the reserve in the most critical situation (reportedly as many as 15,000 Indians on 3,500 hectares), two camps have sprung up this year.

Indians camp in MS while awaiting demarcation

A survey by the Indigenist Missionary Council (Cimi), updated this month, found 31 Guarani-Kaiowá camps in the southern region of Mato Grosso do Sul. They are not always in conflict situations like those of Ypo’i, Pyelito and Kurusu Amba (see previous story), but vulnerability is a constant: some groups have lived in poverty by the roadside for decades, with precarious access to the most basic rights, such as health care, education and civil documentation.

From the moment the groups leave the overcrowded reserves to occupy farms and claim their right to their lands, they become even more exposed. The only assistance they then receive is federal, from the National Indian Foundation (Funai) and the Special Secretariat for Indigenous Health (Sesai). Social benefits such as the food parcels provided by the state government are automatically cut off. “The state and the municipalities give absolutely no assistance to these groups,” says Maria Aparecida Mendes de Oliveira, Funai’s regional coordinator in Dourados.

In areas inside farms, even Sesai and Funai can often act only under a court order. In Ypo’i, for example, according to Funai, the health and social-assistance teams may enter only once every 15 days. “The problem is that people don’t choose a time to fall ill,” Maria Aparecida complains. Even Funai’s food-parcel programme has problems, as it depends on donations from the National Supply Company (Conab). In some months the food simply does not arrive.

The state government’s rhetoric preaches the need for public policies for the Indians, as opposed to their demands for land. In 2009, governor André Puccinelli went so far as to say: “They don’t want as much land as Funai wants to give them. The Indians want less land and more social programmes.” Yet even on the reserves already demarcated, services are dreadful. In health care, reports of embezzlement and inefficiency are constant. Houses recently built with federal funds are delivered full of defects and poorly finished. In the schools, the policy of differentiated education, which includes teaching in the Guarani language among other elements, is now under threat: mayors of several towns have dismissed indigenous teachers without any consultation with the communities, often hiring whites for their posts.

Indigenous camp in MS

The crisis in the villages is also intensified by the lack of economic alternatives as land grows scarce. Even the precarious work of cutting cane for the sugar and ethanol mills is now under threat. Cane planting is being progressively mechanized, which will mean more unemployment and hunger among the Indians if the land problem is not resolved soon.

To confront recent attacks such as those at Pyelito and Ypo’i, the Guarani-Kaiowá political movement known as the Aty Guasu (great assembly) is asking the Human Rights Secretariat of the Presidency of the Republic to intensify its presence in the conflict areas. Since 2006, through the Council for the Defence of the Rights of the Human Person (CDDPH), a state body linked to the SDH, the crisis facing the Guarani-Kaiowá has been recognized by the government as one of the country’s most serious human-rights challenges. Several indigenous leaders are already part of the Programme for the Protection of Human Rights Defenders.

One sign that the demand for more security is being heard was last week’s renewal of the Justice Ministry ordinance authorizing the National Public Security Force to support the Federal Police in actions in the Guarani-Kaiowá villages. The Indians’ hope is that the so-called Operation Tekoha, currently focused on the hyper-violent reserves of Dourados, Amambai and Caarapó, will be extended to the areas along the border and help curb attacks on Indians in conflict regions such as Paranhos and Tacuru.

Public-security actions are necessary palliatives, because the land dispute is likely to drag on for years. The big discussion at the moment is whether, in the event of demarcation, payment could be made not only for improvements on land deemed indigenous but also for the land itself, something the Constitution prohibits. Since colonization in the state enjoyed broad support from both the federal and state governments, a good number of the ranchers hold titles to the land, which makes the situation particularly delicate.

A leader shows marks of violence in a demarcation area

To get around the slow progress of a Proposed Constitutional Amendment (PEC) in Congress, state deputy Laerte Tetila (PT) has presented a bill in the MS Legislative Assembly to create a State Fund for the Acquisition of Indigenous Lands. From the indigenous movement to the ranchers, the various actors in the conflict are now analysing the proposal.

The debate over the land question in MS also reached the National Council of Justice (CNJ) this year. In May, a special commission was formed to discuss the judicial impasse surrounding the demarcations; a 2009 survey found 87 lawsuits involving the conflict over indigenous lands in the state. With the six identification reports on Guarani-Kaiowá areas begun in 2008 expected to be published by early next year, the hope is that negotiation at the CNJ will keep the process from being completely paralysed by courtroom battles.

The crisis involving the Guarani-Kaiowá is the gravest in MS, but not the only one involving disputes over indigenous lands. The Terena, the state’s second-largest indigenous people, with just over 20,000 members, have also been demanding the demarcation of their lands, currently reduced to a few reserves defined at the beginning of the 20th century. At a recent assembly, they announced that they will again occupy lands claimed as indigenous before the end of the year. Clearly, the problems in the state are set to worsen unless the federal government acts quickly.

Under the Constitution, the demarcation of indigenous lands throughout Brazil should have been completed 18 years ago, in 1993. The Lula government ratified only three Guarani-Kaiowá territories, and two of those processes remain suspended by the Supreme Court (STF) to this day. The only one of the new territories effectively occupied by the Indians, Panambizinho, in Dourados, has just over 1,200 hectares. As justice minister, Tarso Genro had been guaranteeing the continuation of the process begun in 2008 in MS, despite pressure from the ruralists and the PMDB. It is time for his successor, José Eduardo Cardozo, to show what he is made of.

To confront the crisis in MS, the federal government will launch a special committee

Next month the federal government is to officially re-create a special coordination of public policies for the Guarani-Kaiowá Indians of southern Mato Grosso do Sul. The so-called Management Committee for Integrated Indigenist Policies for the Southern Cone of MS will be installed at a meeting with representatives of more than ten ministries in Dourados, the region’s main city, on November 28th and 29th.

Bullets fired at Indians in a conflict area

The announcement was made last Thursday (20th) by Paulo Maldos, national secretary for Social Articulation at the General Secretariat of the Presidency of the Republic, after a visit to the indigenous camp of Ypo’i, where three people have been killed since 2009 and where the community is living through what the secretary described as a “humanitarian crisis” (see previous story).

“The situation in the Southern Cone of Mato Grosso do Sul has gone beyond every imaginable limit,” Maldos told CartaCapital. “The federal government will no longer tolerate this climate of violence in the region. We know that the only way forward is land demarcation, but that is a long road, and we cannot wait. We are going to strengthen the protection network being formed among the indigenous communities that are victims of violence. The aim is to guarantee the communities’ lives and integrity.”

In 2006, after the press reported deaths from malnutrition among Guarani-Kaiowá children, the federal government had already created a similar management committee. Once the case faded from public debate, the initiative lost momentum. Now Maldos promises that this coordination of federal actions for the Indians will be for real: “There will be top priority in every sense. We will act in the most varied areas: health, education, support for production, security, culture, communication and whatever else is needed.” “We want to signal to the region that from now on we seek to do justice to the historic rights of the Guarani-Kaiowá. We will not wait for the demarcations.”

Maldos also said that all Guarani-Kaiowá communities will be covered by the committee’s policies, regardless of where they are located: “The state will reach every community, whether on demarcated land or not, by the roadside or even inside farms”.

Indians during a protest for demarcation. Photo: Cimi

In recent years, some of the leading international human-rights reports have singled out the situation of the Guarani-Kaiowá as among the gravest facing indigenous peoples in the Americas. Last year the Aty Guasu (great assembly, in Guarani), which brings together representatives of these Indians’ dozens of communities, received the Human Rights Prize from the Presidency of the Republic. “Everything that is done will be done together with them. The Aty Guasu is our partner,” says Maldos.

The secretary said the choice to go to MS and make the announcement in Ypo’i was deliberate. “We went to visit the most brutalized of the brutalized communities. On top of everything the Guarani-Kaiowá in general suffer, there they are subjected to a veritable confinement,” he says. “I had been following reports of violence against the Guarani-Kaiowá for many years, but going there left me even more outraged by everything I saw.”

At the end of the month it will be two years since an emblematic crime: the murder of two Guarani teachers in Ypo’i, Rolindo and Genivaldo Vera. “No crime will go unpunished. We will identify these criminals,” Maldos pledged. Among the acts of violence committed, the secretary notes, there have even been threats against the anthropologists taking part in the land-demarcation processes.

20,000 slaves in the country (Correio Braziliense)

JC e-mail 4372, October 26th, 2011.

The International Labour Organization (ILO) yesterday (25th) released a profile of rural slave labour in Brazil, indicating that 81% of the people living in conditions analogous to slavery are black, young and have little schooling.

The study was based on interviews with freed workers, recruiters and employers on farms in Pará, Mato Grosso, Bahia and Goiás between 2006 and 2007.

Besides the predominance of black workers, the document notes that about 93% of these people began working before the age of 16, which constitutes child labour, and that nearly 75% of them are illiterate. The study found that most of the employers and of the recruiters, the so-called “gatos”, are white.

For Luiz Machado, coordinator of the ILO’s anti-slave-labour programme, the figure reflects the vulnerability of the poorest population, mostly black, to slave labour. “This is a vestige of colonial exploitation,” he said. The coordinator also singles out the lack of schooling in childhood as a driver of the problem. “Child labour removes future possibilities and eases the path into slave labour. People without schooling have no opportunities.”

The Labour Public Prosecutor’s Office (MPT) estimates that about 20,000 people are subjected to forced or degrading labour in Brazil today. Since 1995, more than 40,000 workers have been freed in the country, which has made an international commitment to eradicate the practice by 2015. Débora Tito, the MPT’s national coordinator for combating slave labour, says policies on the issue have concentrated on what she calls the “pedagogy of the pocket”.

The idea is to tackle the problem through heavy fines and by putting employers’ names on blacklists so they can no longer obtain bank financing. “We have to make this practice economically unviable, so that the ranchers stop saving money at the expense of workers’ dignity,” said the prosecutor. According to her, the penalty for employing labour analogous to slavery is two to eight years in prison, but there have been few convictions in the country.

Convention: The union federations representing public servants at all three levels of government are wrestling to define the bill that will deal with issues such as the right to strike, collective bargaining and releasing union leaders from clocking in so they can dedicate themselves to their categories’ affairs, all items of ILO Convention 151, which is due to be regulated by the end of the year. At a public hearing in the Chamber yesterday, the tug-of-war centred on the union tax, a paycheck deduction of one day’s salary per year, as already happens with private-sector workers.

Bleak Prospects for Avoiding Dangerous Global Warming (Science)

by Richard A. Kerr on 23 October 2011, 1:00 PM

The bad news just got worse: A new study finds that reining in greenhouse gas emissions in time to avert serious changes to Earth’s climate will be at best extremely difficult. Current goals for reducing emissions fall far short of what would be needed to keep warming below dangerous levels, the study suggests. To succeed, we would most likely have to reverse the rise in emissions immediately and follow through with steep reductions through the century. Starting later would be far more expensive and require unproven technology.

Published online today in Nature Climate Change, the new study merges model estimates of how much greenhouse gas society might put into the atmosphere by the end of the century with calculations of how climate might respond to those human emissions. Climate scientist Joeri Rogelj of ETH Zurich and his colleagues combed the published literature for model simulations that keep global warming below 2°C at the lowest cost. They found 193 examples. Modelers running such optimal-cost simulations tried to include every factor that might influence the amount of greenhouse gases society will produce —including the rate of technological progress in burning fuels efficiently, the amount of fossil fuels available, and the development of renewable fuels. The researchers then fed the full range of emissions from the scenarios into a simple climate model to estimate the odds of avoiding a dangerous warming.
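The last step of that pipeline, turning an emissions pathway into odds of staying below 2°C, can be illustrated with a toy Monte Carlo: sample an uncertain climate response and count how often a given pathway stays under the threshold. The sketch below uses the roughly linear relation between cumulative carbon emissions and warming (the “TCRE”); the distribution and numbers are illustrative assumptions, not the model the authors used.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sample an uncertain transient climate response to cumulative
# emissions (TCRE), in degrees C per 1000 GtC. The lognormal
# parameters here are illustrative, not taken from the paper.
tcre = rng.lognormal(mean=np.log(1.6), sigma=0.3, size=100_000)

def odds_below_2C(cumulative_emissions_GtC: float) -> float:
    """Fraction of sampled climates in which warming stays under 2 C."""
    warming = tcre * cumulative_emissions_GtC / 1000.0
    return float((warming < 2.0).mean())

# Odds for three hypothetical century-scale emissions pathways.
for emissions in (600.0, 1000.0, 1500.0):
    print(f"{emissions:6.0f} GtC -> P(<2C) = {odds_below_2C(emissions):.2f}")
```

The odds fall steeply as cumulative emissions rise, which is the arithmetic behind the paper’s conclusion that plausible success requires emissions to peak and begin falling almost immediately.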

The results suggest challenging times ahead for decision makers hoping to curb the greenhouse. Strategies that are both plausible and likely to succeed call for emissions to peak this decade and start dropping right away. They should be well into decline by 2020 and far less than half of current emissions by 2050. Only three of the 193 scenarios examined would be very likely to keep the warming below the danger level, and all of those require heavy use of energy systems that actually remove greenhouse gases from the atmosphere. That would require, for example, both creating biofuels and storing the carbon dioxide from their combustion in the ground.

“The alarming thing is very few scenarios give the kind of future we want,” says climate scientist Neil Edwards of The Open University in Milton Keynes, U.K. Both he and Rogelj emphasize the uncertainties inherent in the modeling, especially on the social and technological side, but the message seems clear to Edwards: “What we need is at the cutting edge. We need to be as innovative as we can be in every way.” And even then, success is far from guaranteed.

A skeptical physicist ends up confirming climate data (Washington Post)

Posted by Brad Plumer at 04:18 PM ET, 10/20/2011
Back in 2010, Richard Muller, a Berkeley physicist and self-proclaimed climate skeptic, decided to launch the Berkeley Earth Surface Temperature (BEST) project to review the temperature data that underpinned global-warming claims. Remember, this was not long after the Climategate affair had erupted, at a time when skeptics were griping that climatologists had based their claims on faulty temperature data.

Muller’s stated aims were simple. He and his team would scour and re-analyze the climate data, putting all their calculations and methods online. Skeptics cheered the effort. “I’m prepared to accept whatever result they produce, even if it proves my premise wrong,” wrote Anthony Watts, a blogger who has criticized the quality of the weather stations in the United States that provide temperature data. The Charles G. Koch Foundation even gave Muller’s project $150,000 — and the Koch brothers, recall, are hardly fans of mainstream climate science.

So what are the end results? Muller’s team appears to have confirmed the basic tenets of climate science. Back in March, Muller told the House Science and Technology Committee that, contrary to what he expected, the existing temperature data was “excellent.” He went on: “We see a global warming trend that is very similar to that previously reported by the other groups.” And, today, the BEST team has released a flurry of new papers confirming that the planet is getting hotter. As the team’s two-page summary flatly concludes, “Global warming is real.”

Here’s a chart comparing their findings with existing data:

The BEST team tried to take a number of skeptic claims seriously, to see if they panned out. Take, for instance, their paper on the “urban heat island effect.” Watts has long argued that many weather stations collecting temperature data could be biased by their location in cities. Since cities are naturally warmer than rural areas (because building materials retain more heat), the uptick in recorded temperatures might be exaggerated, an illusion spawned by increased urbanization. So Muller’s team decided to compare overall temperature trends with those from weather stations in rural areas alone. And, as it turns out, the trends match up well. “Urban warming does not unduly bias estimates of recent global temperature change,” Muller’s group concluded.
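The rural-versus-all comparison can be sketched in a few lines. In the toy series below (invented numbers, not BEST data), urban stations run warmer by a constant offset; that shifts the level of the record but not its slope, so the rural-only trend and the all-station trend coincide.

```python
def decadal_trend(years, temps):
    """Ordinary least-squares slope of temperature on year, in °C per decade."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return (num / den) * 10.0

# Toy station records: every station sees the same 0.1 °C-per-decade warming
# signal; urban stations are a constant 1.5 °C warmer throughout.
years = list(range(1950, 2011))
signal = [0.01 * (y - 1950) for y in years]   # shared warming signal
rural = signal                                 # rural stations: signal only
urban = [s + 1.5 for s in signal]              # urban: warmer, same slope
all_stations = [(r + u) / 2 for r, u in zip(rural, urban)]
```

Here both `decadal_trend(years, rural)` and `decadal_trend(years, all_stations)` come out at 0.1 °C per decade: a constant urban offset cannot masquerade as a trend. The real concern is subtler, since urbanization grows over time, which is why BEST checked the full network against rural-only estimates.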

That shouldn’t be so jaw-dropping. Previous analyses — like this one from the National Oceanic and Atmospheric Administration — have responded to Watts’ concerns by showing that a few flawed stations don’t warp the overall trend. But maybe Muller’s team can finally put this controversy to rest, right? Well, not yet. As Watts responds over at his site, the BEST papers still haven’t been peer-reviewed (an important caveat, to be sure). And Watts isn’t pleased with how much pre-publication hype the studies are getting. But so far, what we have is a prominent skeptic casting a critical eye at the data and finding, much to his own surprise, that the data holds up.

Brazil is a model country for payments for environmental services, but needs to step up its efforts (Valor Econômico)

JC e-mail 4370, October 24, 2011.

Still without national regulation, payment for environmental services (PES) is expanding in Brazil, but at a slow pace.

The study “Payment for Environmental Services in the Atlantic Forest,” by the German international cooperation agency GIZ, identified nearly 80 PES programs in the region: 40 water projects, 33 carbon projects and 5 in biodiversity. “Initiatives here are proliferating rapidly, but they are still isolated projects that need to gain scale,” says Susan Seehusen, technical adviser on environmental economics at GIZ. Reduced to 22% of its original area, the Atlantic Forest provides environmental services both to the traditional and rural communities around it and to the global community.

The water projects, the most widespread, draw on public budgets and on funds from river-basin committees, led by municipal governments and water companies. The Produtor de Água (Water Producer) program of the National Water Agency (ANA) pays rural producers and drives the sector’s development. Counting those payments plus forest restoration and conservation work, the projects cost from R$200,000 to R$2.5 million a year. Today, programs still in their initial phase involve some 350 producers and benefit 22.2 million people.

Linked to CO2-offset projects, carbon PES schemes are concentrated in the Pontal do Paranapanema region, where São Paulo, Paraná and Minas Gerais meet, on holdings of 10 to 50 hectares. Owners of areas larger than 100 hectares, for their part, join the programs in order to attract investors.

Biodiversity protection is the least supported activity. “In this area there is little willingness to pay. People take advantage of the service but do not pay for it; they are the so-called free riders,” Seehusen notes.

The expansion of PES programs runs up against governance problems, high investment costs and the lack of regulation; bill 792/2007, now before Congress, would establish a national policy and create a national PES program and fund. Restoring one hectare of land costs from R$10,000 to R$20,000. Initiatives that also aim to reduce poverty and improve income distribution, such as the ecological ICMS (ICMS-Ecológico) in regions of Paraná, therefore score extra points.

Even so, Brazil is one of the most advanced countries in PES and serves as a model for others, according to Peter May, graduate professor of development, agriculture and society at the Federal Rural University of Rio de Janeiro. A member of the International Society for Ecological Economics (ISEE), May is conducting a global comparative study of Reducing Emissions from Deforestation and Forest Degradation (REDD) and publishes strategic data for policymakers. “Compared with other countries, we have a mature agricultural market free of illegality and better-educated landowners. States such as Espírito Santo, São Paulo, Amazonas and Acre have created their own laws,” he says. Costa Rica, he adds, is the classic model, with well-developed legislation and PES, while Colombia, Peru, Mexico and Ecuador have their own policies but model them on Brazil’s.

For May, beyond a regulatory framework, PES policy should be incorporated into the Forest Code. He also warns that Brazil lacks more concrete experience and practical results, which in turn require research and monitoring that can cost more than the projects themselves.

American study confirms warming of the Earth’s surface (BBC)

Richard Black

From BBC News

Weather station near an airport. The group says weather stations provide accurate data on warming

A new analysis by a group of scientists in the United States has concluded that the Earth’s surface is getting warmer.

Since 1950, the average land temperature has risen by one degree Celsius, according to the findings of the Berkeley Earth Project group.

The Berkeley Earth Project used new methods and new data, but its findings track the same climate trend seen by NASA and by Britain’s Met Office, for example.

“Our biggest surprise was that the new results agree with the warming values published previously by other teams in the United States and Britain,” said Professor Richard Muller, who set up the Berkeley Earth Project at the University of California, bringing together ten renowned scientists.

“This confirms that those studies were done carefully and that the potential biases identified by global-warming skeptics do not seriously affect their conclusions,” he added.

The group also reports that, although the extra warmth near cities, the so-called urban heat island effect, is real and well established, it is not responsible for the warming recorded by most weather stations around the world.

The group examined the claims of “skeptic” bloggers who say that weather-station data do not show a genuine global warming trend.

They argue that many weather stations have recorded warming because they are located near cities, and as the cities grow, so does the heat.

The scientists, however, identified about 40,000 weather stations around the world whose records have been kept in digital form.

The researchers then developed a new way of analyzing the data to detect the trend in global land temperatures since 1800.

The result was a graph very similar to those produced by the world’s three leading groups, whose work has been criticized by the skeptics.

Two of these three records are kept in the United States, by the National Oceanic and Atmospheric Administration (NOAA) and by NASA. The third is a collaboration between Britain’s Met Office and the Climatic Research Unit at the University of East Anglia (UEA).

Professor Phil Jones of the UEA’s Climatic Research Unit greeted the group’s work with caution, saying he looks forward to reading “the final report” once it is published.

“These initial findings are very encouraging and echo our own results and our conclusion that the impact of urban heat islands on the global average temperature is minimal,” he said.

Traffic and smog on a street in China (Reuters). Skeptics say proximity to cities distorts station data

Phil Jones was one of the British scientists accused of manipulating data to exaggerate the human influence on global warming. The scientists were cleared in 2010.

The affair began in 2009 with the leak of e-mails in which Jones appeared to suggest that certain global-warming data be left out of presentations due to be given at the UN climate-change conference.

The episode gave ammunition to skeptics about the role of human beings in climate change. But the University of East Anglia’s inquiry concluded that there was no doubt about the scientists’ rigor and honesty.

Not yet published

Bob Ward, policy and communications director at the Graham Institute of Climate Change and the Environment in London, said the warming is clear.

“The so-called skeptics should set aside their claim that the rise in average global temperature can be attributed to the impact of growing cities,” he said.

The Berkeley Earth Project team decided to release its data first on its own website, rather than in a specialist journal.

The researchers are asking readers to comment and give their views before the manuscripts are prepared for formal scientific publication.

Richard Muller, who created the research group, said this free circulation of information marks a return to the way science needs to be done, rather than simply publishing the study in scientific journals.

Revealed – the capitalist network that runs the world (New Scientist)

19 October 2011 by Andy Coghlan and Debora MacKenzie

The 1318 transnational corporations that form the core of the economy. Superconnected companies are red, very connected companies are yellow. The size of the dot represents revenue (Image: PLoS One)

AS PROTESTS against financial power sweep the world this week, science may have confirmed the protesters’ worst fears. An analysis of the relationships between 43,000 transnational corporations has identified a relatively small group of companies, mainly banks, with disproportionate power over the global economy.

The study’s assumptions have attracted some criticism, but complex systems analysts contacted by New Scientist say it is a unique effort to untangle control in the global economy. Pushing the analysis further, they say, could help to identify ways of making global capitalism more stable.

The idea that a few bankers control a large chunk of the global economy might not seem like news to New York’s Occupy Wall Street movement and protesters elsewhere (see photo). But the study, by a trio of complex systems theorists at the Swiss Federal Institute of Technology in Zurich, is the first to go beyond ideology to empirically identify such a network of power. It combines the mathematics long used to model natural systems with comprehensive corporate data to map ownership among the world’s transnational corporations (TNCs).

“Reality is so complex, we must move away from dogma, whether it’s conspiracy theories or free-market,” says James Glattfelder. “Our analysis is reality-based.”

Previous studies have found that a few TNCs own large chunks of the world’s economy, but they included only a limited number of companies and omitted indirect ownerships, so could not say how this affected the global economy – whether it made it more or less stable, for instance.

The Zurich team can. From Orbis 2007, a database listing 37 million companies and investors worldwide, they pulled out all 43,060 TNCs and the share ownerships linking them. Then they constructed a model of which companies controlled others through shareholding networks, coupled with each company’s operating revenues, to map the structure of economic power.

The work, to be published in PLoS One, revealed a core of 1318 companies with interlocking ownerships (see image). Each of the 1318 had ties to two or more other companies, and on average they were connected to 20. What’s more, although they represented 20 per cent of global operating revenues, the 1318 appeared to collectively own through their shares the majority of the world’s large blue chip and manufacturing firms – the “real” economy – representing a further 60 per cent of global revenues.

When the team further untangled the web of ownership, it found much of it tracked back to a “super-entity” of 147 even more tightly knit companies – all of their ownership was held by other members of the super-entity – that controlled 40 per cent of the total wealth in the network. “In effect, less than 1 per cent of the companies were able to control 40 per cent of the entire network,” says Glattfelder. Most were financial institutions. The top 20 included Barclays Bank, JPMorgan Chase & Co, and The Goldman Sachs Group.
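In graph terms, a “super-entity” whose members all hold ownership in one another is a strongly connected component of the ownership network. The sketch below finds such a core with Kosaraju’s algorithm on a five-company toy graph; the company names and holdings are invented, and a real analysis like the Zurich team’s would also weight each edge by the size of the stake.

```python
def strongly_connected_components(graph):
    """Kosaraju's algorithm. graph: dict node -> set of nodes it holds shares in."""
    # Pass 1: order nodes by DFS finish time on the forward graph.
    visited, order = set(), []
    def dfs1(u):
        visited.add(u)
        for v in graph.get(u, ()):
            if v not in visited:
                dfs1(v)
        order.append(u)
    for u in graph:
        if u not in visited:
            dfs1(u)
    # Pass 2: DFS on the reversed graph, in reverse finish order.
    rev = {}
    for u, vs in graph.items():
        for v in vs:
            rev.setdefault(v, set()).add(u)
    seen, comps = set(), []
    for u in reversed(order):
        if u in seen:
            continue
        comp, stack = set(), [u]
        while stack:
            x = stack.pop()
            if x in seen:
                continue
            seen.add(x)
            comp.add(x)
            stack.extend(rev.get(x, ()))
        comps.append(comp)
    return comps

# Hypothetical mini-economy: three financial firms holding shares in one another,
# plus two firms they own that hold nothing in return.
ownership = {
    "BankA": {"BankB", "FirmX"},
    "BankB": {"BankC"},
    "BankC": {"BankA", "FirmY"},
    "FirmX": set(),
    "FirmY": set(),
}
core = max(strongly_connected_components(ownership), key=len)
```

Here the three banks form the tightly knit core, while FirmX and FirmY are owned by it but own nothing back, mirroring in miniature the 147-firm knot inside the 1318-company network.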

John Driffill of the University of London, a macroeconomics expert, says the value of the analysis is not just to see if a small number of people controls the global economy, but rather its insights into economic stability.

Concentration of power is not good or bad in itself, says the Zurich team, but the core’s tight interconnections could be. As the world learned in 2008, such networks are unstable. “If one [company] suffers distress,” says Glattfelder, “this propagates.”

“It’s disconcerting to see how connected things really are,” agrees George Sugihara of the Scripps Institution of Oceanography in La Jolla, California, a complex systems expert who has advised Deutsche Bank.

Yaneer Bar-Yam, head of the New England Complex Systems Institute (NECSI), warns that the analysis assumes ownership equates to control, which is not always true. Most company shares are held by fund managers who may or may not control what the companies they part-own actually do. The impact of this on the system’s behaviour, he says, requires more analysis.

Crucially, by identifying the architecture of global economic power, the analysis could help make it more stable. By finding the vulnerable aspects of the system, economists can suggest measures to prevent future collapses spreading through the entire economy. Glattfelder says we may need global anti-trust rules, which now exist only at national level, to limit over-connection among TNCs. Bar-Yam says the analysis suggests one possible solution: firms should be taxed for excess interconnectivity to discourage this risk.

One thing won’t chime with some of the protesters’ claims: the super-entity is unlikely to be the intentional result of a conspiracy to rule the world. “Such structures are common in nature,” says Sugihara.

Newcomers to any network connect preferentially to highly connected members. TNCs buy shares in each other for business reasons, not for world domination. If connectedness clusters, so does wealth, says Dan Braha of NECSI: in similar models, money flows towards the most highly connected members. The Zurich study, says Sugihara, “is strong evidence that simple rules governing TNCs give rise spontaneously to highly connected groups”. Or as Braha puts it: “The Occupy Wall Street claim that 1 per cent of people have most of the wealth reflects a logical phase of the self-organising economy.”

So, the super-entity may not result from conspiracy. The real question, says the Zurich team, is whether it can exert concerted political power. Driffill feels 147 is too many to sustain collusion. Braha suspects they will compete in the market but act together on common interests. Resisting changes to the network structure may be one such common interest.

The top 50 of the 147 superconnected companies

1. Barclays plc
2. Capital Group Companies Inc
3. FMR Corporation
4. AXA
5. State Street Corporation
6. JP Morgan Chase & Co
7. Legal & General Group plc
8. Vanguard Group Inc
10. Merrill Lynch & Co Inc
11. Wellington Management Co LLP
12. Deutsche Bank AG
13. Franklin Resources Inc
14. Credit Suisse Group
15. Walton Enterprises LLC
16. Bank of New York Mellon Corp
17. Natixis
18. Goldman Sachs Group Inc
19. T Rowe Price Group Inc
20. Legg Mason Inc
21. Morgan Stanley
22. Mitsubishi UFJ Financial Group Inc
23. Northern Trust Corporation
24. Société Générale
25. Bank of America Corporation
26. Lloyds TSB Group plc
27. Invesco plc
28. Allianz SE
29. TIAA
30. Old Mutual Public Limited Company
31. Aviva plc
32. Schroders plc
33. Dodge & Cox
34. Lehman Brothers Holdings Inc*
35. Sun Life Financial Inc
36. Standard Life plc
37. CNCE
38. Nomura Holdings Inc
39. The Depository Trust Company
40. Massachusetts Mutual Life Insurance
41. ING Groep NV
42. Brandes Investment Partners LP
43. Unicredito Italiano SPA
44. Deposit Insurance Corporation of Japan
45. Vereniging Aegon
46. BNP Paribas
47. Affiliated Managers Group Inc
48. Resona Holdings Inc
49. Capital Group International Inc
50. China Petrochemical Group Company

* Lehman still existed in the 2007 dataset used

Graphic: The 1318 transnational corporations that form the core of the economy

(Data: PLoS One)  

Brain Scans Support Findings That IQ Can Rise or Fall Significantly During Adolescence (Science Daily)

ScienceDaily (Oct. 20, 2011) — IQ, the standard measure of intelligence, can increase or fall significantly during our teenage years, according to research funded by the Wellcome Trust, and these changes are associated with changes to the structure of our brains. The findings may have implications for testing and streaming of children during their school years.

Across our lifetime, our intellectual ability is considered to be stable, with intelligence quotient (IQ) scores taken at one point in time used to predict educational achievement and employment prospects later in life. However, in a study published October 20 in the journal Nature, researchers at the Wellcome Trust Centre for Neuroimaging at UCL (University College London) and the Centre for Educational Neuroscience show for the first time that, in fact, our IQ is not constant.

The researchers, led by Professor Cathy Price, tested 33 healthy adolescents in 2004 when they were between the ages of 12 and 16 years. They then repeated the tests four years later when the same subjects were between 15 and 20 years old. On both occasions, the researchers took structural brain scans of the subjects using magnetic resonance imaging (MRI).

Professor Price and colleagues found significant changes in the IQ scores measured in 2008 compared to the 2004 scores. Some subjects had improved their performance relative to people of a similar age by as much as 20 points on the standardised IQ scale; in other cases, however, performance had fallen by a similar amount.

To test whether these changes were meaningful, the researchers analysed the MRI scans to see whether there was a correlation with changes in the structure of the subjects’ brains.

“We found a considerable amount of change in how our subjects performed on the IQ tests in 2008 compared to four years earlier,” explains Sue Ramsden, first author of the study. “Some subjects performed markedly better but some performed considerably worse. We found a clear correlation between this change in performance and changes in the structure of their brains and so can say with some certainty that these changes in IQ are real.”
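The check behind that claim is, at its simplest, a correlation between each subject’s change in IQ and the change in grey-matter density at the corresponding brain region. A minimal sketch with invented numbers (the study’s actual voxel-based morphometry analysis is far more involved):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: each subject's change in verbal IQ between the two test
# sessions, paired with the change in grey-matter density (arbitrary units).
delta_iq = [-18, -9, -3, 0, 4, 11, 20]
delta_grey = [-0.06, -0.02, -0.01, 0.00, 0.02, 0.03, 0.07]
r = pearson_r(delta_iq, delta_grey)
```

On these made-up pairs r comes out close to +1; the study’s point is that the real IQ changes lined up with real structural change rather than scattering randomly, which is what let the team call the changes meaningful.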

The researchers measured each subject’s verbal IQ, which includes measurements of language, arithmetic, general knowledge and memory, and their non-verbal IQ, such as identifying the missing elements of a picture or solving visual puzzles. They found a clear correlation with particular regions of the brain.

An increase in verbal IQ score correlated with an increase in the density of grey matter — the nerve cells where the processing takes place — in an area of the left motor cortex of the brain that is activated when articulating speech. Similarly, an increase in non-verbal IQ score correlated with an increase in the density of grey matter in the anterior cerebellum, which is associated with movements of the hand. However, an increase in verbal IQ did not necessarily go hand-in-hand with an increase in non-verbal IQ.

According to Professor Price, a Wellcome Trust Senior Research Fellow, it is not clear why IQ should have changed so much and why some people’s performance improved while others’ declined. It is possible that the differences are due to some of the subjects being early or late developers, but it is equally possible that education had a role in changing IQ, and this has implications for how schoolchildren are assessed.

“We have a tendency to assess children and determine their course of education relatively early in life, but here we have shown that their intelligence is likely to be still developing,” says Professor Price. “We have to be careful not to write off poorer performers at an early stage when in fact their IQ may improve significantly given a few more years.

“It’s analogous to fitness. A teenager who is athletically fit at 14 could be less fit at 18 if they stopped exercising. Conversely, an unfit teenager can become much fitter with exercise.”

Other studies from the Wellcome Trust Centre for Neuroimaging and other research groups have provided strong evidence that the structure of the brain remains ‘plastic’ even throughout adult life. For example, Professor Price showed recently that guerrillas in Colombia who had learned to read as adults had a higher density of grey matter in several areas of the left hemisphere of the brain than those who had not learned to read. Professor Eleanor Maguire, also from the Wellcome Trust Centre, showed that part of a brain structure called the hippocampus, which plays an important part in memory and navigation, has greater volume in licensed London taxi drivers.

“The question is, if our brain structure can change throughout our adult lives, can our IQ also change?” adds Professor Price. “My guess is yes. There is plenty of evidence to suggest that our brains can adapt and their structure changes, even in adulthood.”

“This interesting study highlights how ‘plastic’ the human brain is,” said Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust. “It will be interesting to see whether structural changes as we grow and develop extend beyond IQ to other cognitive functions. This study challenges us to think about these observations and how they may be applied to gain insight into what might happen when individuals succumb to mental health disorders.”

After Pregnancy Loss, Internet Forums Help Women Understand They Are Not Alone (Science Daily)

ScienceDaily (Oct. 20, 2011) — Nearly one in six pregnancies end in miscarriage or stillbirth, but parents’ losses are frequently minimized or not acknowledged by friends, family or the community.

“Women who have not gone through a stillbirth don’t want to hear about my birth, or what my daughter looked like, or anything about my experience,” said one woman, responding in a University of Michigan Health System-led study that explored how Internet communities and message boards increasingly provide a place for women to share feelings about these life-altering experiences.

The anonymous survey of more than 1,000 women on 18 message boards opens a new window into who is using the forums and why. The findings will be published in Women’s Health Issues.

The researchers were surprised to find that only half of the women surveyed were in their first year of loss after a pregnancy. Many were still coping with the emotional impacts five, 10 and even 20 years later.

“To my family and most friends, the twins have been gone for nearly a year and are entirely a subject for the past,” another woman wrote.

A second unexpected finding was that only 2 percent of survey respondents were African American, despite nearly 60 percent of African Americans having internet access and despite black women having twice the risk of stillbirth as white women.

“This is the largest study to look at who uses Internet message boards after a pregnancy loss and it demonstrates a significant disparity between the women who experience loss and those who responded to the survey,” says lead study author Katherine J. Gold, M.D., M.S.W., M.S., assistant professor of family medicine at the U-M Medical School. “This suggests an important gap in support for African American parents that should be explored further.”

By far, the most common reason women gave for participating in the message boards was that it helped them to feel that their experience wasn’t unique.

One woman explained that the most important aspect of the forums was knowing “that I am not the only one this has happened to and that I am not alone in this horrible nightmare.” Another common theme was that the online environments provided a safe and validating space for the women to express themselves. Others appreciated the ease and convenience of the Internet and their ability to spend more time composing their thoughts than they would be able to in a face-to-face conversation.

Most participants agreed that boards should have a moderator or facilitator, and that health care professionals should participate. Of the 908 women who answered the question, 82 percent said they had learned new medical information from one of the forums.

“The fact that so many women learned new medical information from the message boards shows what an important resource they can be in this regard,” says study senior author Christie Palladino, M.D., M.Sc., an obstetrician/gynecologist with Georgia Health Sciences University’s Education Discovery Institute.

Gold and her colleagues are currently pursuing similar research with bereaved parents who attend in-person support groups and plan to compare and contrast the results.

Too Much Undeserved Self-Praise Can Lead to Depression (Science Daily)

ScienceDaily (Oct. 20, 2011) — People who try to boost their self-esteem by telling themselves they’ve done a great job when they haven’t could end up feeling dejected instead, according to new research published by the American Psychological Association.

High and low performers felt fine when they assessed themselves accurately, probably because the high performers recognized their strengths and low performers acknowledged their weaknesses and could try to improve their future performance, according to a study in the October issue of the APA journal Emotion®.

“These findings challenge the popular notion that self-enhancement and providing positive performance feedback to low performers is beneficial to emotional health. Instead, our results underscore the emotional benefits of accurate self-assessments and performance feedback,” said lead author Young-Hoon Kim, PhD, of the University of Pennsylvania.

The study involved experiments with four different groups of young people from the United States and Hong Kong. The three U.S. groups totaled 295 college undergraduates, 186 of them women, with a mean age of 19; the Hong Kong group consisted of 2,780 high school students, 939 of them girls, from four different schools and across grades 7-12.

In the first two experiments, one of the U.S. groups and the Hong Kong students took academic tests and were asked to rate and compare their own performances with other students at their schools. Following their assessments, all the participants completed another widely used questionnaire to assess symptoms of depression.

In the third and fourth experiments, researchers evaluated the other two sets of U.S. undergraduates with feedback exercises that made high performers think their performance was low and low performers think their performance was high. Control groups in both experiments received their scores with no feedback.

Across all the studies, results showed that those who rated their own performance as much higher than it actually was were significantly more likely to feel dejected. “Distress following excessive self-praise is likely to occur when a person’s inadequacy is exposed, and because inaccurate self-assessments can prevent self-improvement,” said co-author Chi-Yue Chiu, of Nanyang Technological University in Singapore.

The results also revealed cross-cultural differences that support past findings that Asians are more humble than Americans. The U.S. undergraduates had a higher mean response when rating their performance than the Hong Kong students, at 63 percent compared to 49 percent, the researchers found. Still, they found that excessive self-enhancement was related to depression for both cultures.

A New Discipline Emerges: The Psychology of Science (Science Daily)

ScienceDaily (Oct. 20, 2011) — You’ve heard of the history of science, the philosophy of science, maybe even the sociology of science. But how about the psychology of science? In a new article in Current Directions in Psychological Science, a journal published by the Association for Psychological Science, San Jose State University psychologist Gregory J. Feist argues that a field has been quietly taking shape over the past decade, and it holds great promise for both psychology and science.

“Science is a cognitive act by definition: It involves personality, creativity, developmental processes,” says Feist — everything about individual psychology. So what is the psychology of science? “Simply put,” he writes, it is “the scientific study of scientific thought and behavior.” The psychology of science isn’t just about scientists, though. It’s about how children make organized sense of the world, what constitutes scientific talent and interest — or growing disinterest — and even people’s embrace of pseudoscience.

Reviewing about two dozen articles, Feist mentions work in many psychological subspecialties. Neuroscientists have observed the neural correlates of scientific reasoning, discovering, for instance, that people pay more attention to data that fit their own theories. Developmental psychologists have found that infants can craft theories of the way the world works. They’ve also looked at the ages at which small children begin to distinguish theories from evidence.

In its focus on such processes as problem-solving, memory, and creativity, cognitive psychology may be the most mature of the specialties in its relationship to the doing of science. Feist’s own work in this area offers some intriguing findings. In meta-analyses of personality studies of scientific interest and creativity, he has teased out a contradiction: People who are highly interested in science are higher than others in “conscientiousness” (that is, such traits as caution and fastidiousness) and lower in “openness” to experience. Meanwhile, scientific creativity is associated with low conscientiousness and high openness.

Feist believes that a new psychology of science is good for science, which has become more and more important to society, culture, and the economy. Educators need to understand the ways children and adolescents acquire the requisites of scientific inquiry, he says, “and we want to encourage kids who have that talent to go that way.”

But the new sub-discipline is also good for psychology. “Like other disciplines, psychology is fracturing into smaller and smaller areas that are isolated from each other,” he says. “The psychology of science is one of the few recent disciplines that bucks that trend. We’re saying: ‘Let’s look at the whole person in all the basic psychological areas — cognition, development, neuroscience — and integrate it in one phenomenon.’ That’s an approach which is unusual these days.”

Future-Directed Therapy Helps Depression Patients Cultivate Optimistic Outlook (Science Daily)

ScienceDaily (Oct. 20, 2011) — Patients with major depression do better by learning to create a more positive outlook about the future, rather than by focusing on negative thoughts about their past experiences, researchers at Cedars-Sinai say after developing a new treatment that helps patients do this.

While patients with Major Depressive Disorder traditionally undergo cognitive behavioral therapy, which seeks to alter their irrational, negative thoughts about past experiences, patients treated with the newly developed Future-Directed Therapy™ demonstrated significant improvement in depression and anxiety, as well as in overall reported quality of life, the researchers found.

Results were published recently in the peer-reviewed journal CNS Neuroscience & Therapeutics.

“Recent imaging studies show that depressed patients have reduced functioning in the regions of the brain responsible for optimism,” said Jennice Vilhauer, PhD, study author and clinical director of Adult Outpatient Programs for the Cedars-Sinai Department of Psychiatry and Behavioral Neurosciences. “Also, people with depression tend to have fewer skills to help them develop a better future. They have less ability to set goals, problem solve or plan for future events.”

According to the U.S. Centers for Disease Control and Prevention, an estimated one in 10 American adults meet the diagnostic criteria for depression.

Anand Pandya, MD, interim chair of Cedars-Sinai’s Department of Psychiatry and Behavioral Neurosciences, said, “Future-Directed Therapy is designed to reduce depression by teaching people the skills they need to think more positively about the future and take the action required to create positive future experiences. This is the first study that demonstrates that this intervention, intended to increase positive expectations about the future, can reduce symptoms of Major Depressive Disorder.”

When people talk only about the negative aspects of their lives, it causes them to focus more attention on what makes them unhappy, Vilhauer said. “Talking about what makes you unhappy in life doesn’t generate the necessary thinking patterns or action needed to promote a state of thriving and create a more positive future,” Vilhauer said. “Future-Directed Therapy helps people shift their attention to constructing visions of what they want more of in the future, and it helps them develop the skills that they will need to eventually get there.”

In the study conducted at Cedars-Sinai, 16 adult patients diagnosed with Major Depressive Disorder attended future-directed group therapy sessions led by a licensed psychologist twice a week for 10 weeks. Each week, patients read a chapter from a Future-Directed Therapy manual and completed worksheets aimed at improving certain skills, such as goal-setting. Another group of 17 patients diagnosed with depression underwent standard cognitive group therapy. The study team measured the severity of depression and anxiety symptoms, and quality of life before and after treatment, using the Quick Inventory of Depressive Symptoms, the Beck Anxiety Inventory, and the Quality-of-Life Enjoyment and Satisfaction Questionnaire short form.

Results include:

Patients in the Future-Directed Therapy group experienced, on average, a 5.4-point reduction in depressive symptoms on the Quick Inventory of Depressive Symptoms scale, compared with a 2-point reduction in the cognitive therapy group.

Patients in the Future-Directed Therapy group reported, on average, a 5.4-point reduction in anxiety symptoms on the Beck Anxiety Inventory, compared with a 1.7-point reduction in the cognitive therapy group.

Patients in the Future-Directed Therapy group reported, on average, an 8.4-point improvement in self-reported quality of life on the Quality of Life Enjoyment and Satisfaction scale, compared with a 1.2-point improvement in the cognitive therapy group.

Number of Facebook Friends Linked to Size of Brain Regions, Study Suggests (Science Daily)

ScienceDaily (Oct. 20, 2011) — Scientists funded by the Wellcome Trust have found a direct link between the number of ‘Facebook friends’ a person has and the size of particular brain regions. Researchers at University College London (UCL) also showed that the more Facebook friends a person has, the more ‘real-world’ friends they are likely to have.

The researchers are keen to stress that they have found a correlation and not a cause, however: in other words, it is not possible from the data to say whether having more Facebook friends makes the regions of the brain larger or whether some people are ‘hardwired’ to have more friends.

The social networking site Facebook has more than 800 million active users worldwide. Nearly 30 million of these are believed to be in the UK.

The site allows people to keep in touch online with a network of friends. The sizes of individual networks vary considerably, and some users have only a handful of online friends while others have over a thousand; however, whether this variability is reflected in the size of real-world social networks has not been clear.

Professor Geraint Rees, a Wellcome Trust Senior Clinical Research Fellow at UCL, said: “Online social networks are massively influential, yet we understand very little about the impact they have on our brains. This has led to a lot of unsupported speculation that the internet is somehow bad for us.

“Our study will help us begin to understand how our interactions with the world are mediated through social networks. This should allow us to start asking intelligent questions about the relationship between the internet and the brain — scientific questions, not political ones.”

Professor Rees and colleagues at the UCL Institute of Cognitive Neuroscience and the Wellcome Trust Centre for Neuroimaging studied brain scans of 125 university students — all active Facebook users — and compared them against the size of the students’ network of friends, both online and in the real world. Their findings, which they replicated in a further group of 40 students, are published October 20 in the journal Proceedings of the Royal Society B.

Professor Rees and colleagues found a strong connection between the number of Facebook friends an individual had and the amount of grey matter (the brain tissue where the processing is done) in several regions of the brain. One of these regions was the amygdala, a region associated with processing memory and emotional responses. A study published recently showed that the volume of grey matter in this area is larger in people with a larger network of real-world friends — the new study shows that the same is true for people with a larger network of online friends.

The size of three other regions — the right superior temporal sulcus, the left middle temporal gyrus and the right entorhinal cortex — also correlated with online social networks but did not appear to correlate with real-world networks.

The superior temporal sulcus has a role in our ability to perceive a moving object as biological, and structural defects in this region have been identified in some children with autism. The entorhinal cortex, meanwhile, has been linked to memory and navigation — including navigating through online social networks. Finally, the middle temporal gyrus has been shown to activate in response to the gaze of others and so is implicated in perception of social cues.

Dr Ryota Kanai, first author of the study, added: “We have found some interesting brain regions that seem to link to the number of friends we have — both ‘real’ and ‘virtual’. The exciting question now is whether these structures change over time — this will help us answer the question of whether the internet is changing our brains.”

As well as examining brain structure, the researchers also examined whether there was a link between the size of a person’s online network of friends and their real-world network. Previous studies have looked at this, but only in relatively small sample sizes.

The UCL researchers asked their volunteers questions such as ‘How many people would send a text message to you marking a celebratory event (e.g. birthday, new job, etc.)?’, ‘What is the total number of friends in your phonebook?’ and ‘How many friends have you kept from school and university that you could have a friendly conversation with now?’ The responses suggest that the size of their online networks also related to the size of their real world networks.

“Our findings support the idea that most Facebook users use the site to support their existing social relationships, maintaining or reinforcing these friendships, rather than just creating networks of entirely new, virtual friends,” adds Professor Rees.

Commenting on the study, Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, said: “We cannot escape the ubiquity of the internet and its impact on our lives, yet we understand little of its impact on the brain, which we know is plastic and can change over time. This new study illustrates how well-designed investigations can help us begin to understand whether or not our brains are evolving as they adapt to the challenges posed by social media.”

The Political Effects of Existential Fear (Science Daily)

ScienceDaily (Oct. 20, 2011) — Why did the approval ratings of President George W. Bush — who was perceived as indecisive before September 11, 2001 — soar over 90 percent after the terrorist attacks? Because Americans were acutely aware of their own deaths. That is one lesson from the psychological literature on “mortality salience” reviewed in a new article called “The Politics of Mortal Terror.”

The paper, by psychologists Florette Cohen of the City University of New York’s College of Staten Island and Sheldon Solomon of Skidmore College, appears in October’s Current Directions in Psychological Science, a journal published by the Association for Psychological Science.

The fear people felt after 9/11 was real, but it also made them ripe for psychological manipulation, experts say. “We all know that fear tactics have been used by politicians for years to sway votes,” says Cohen. Now psychological research offers insight into the chillingly named “terror management.”

The authors cite studies showing that awareness of mortality tends to make people feel more positive toward heroic, charismatic figures and more punitive toward wrongdoers. In one study, Cohen and her colleagues asked participants to think of death and then gave them statements from three fictional political figures. One was charismatic: he appealed to the specialness of the participant and of the group to which they belonged. One was a technocrat, offering practical solutions to problems. The third stressed the value of participation in democracy. After thinking about death, support for the charismatic leader shot up eightfold.

Even subliminal suggestions of mortality have similar effects. Subjects who saw the numbers 911 or the letters WTC had higher opinions of a Bush statement about the necessity of invading Iraq. This was true of both liberals and conservatives.

Awareness of danger and death can bias even peaceful people toward war or aggression. Iranian students in a control condition preferred the statement of a person preaching understanding and the value of human life over a jihadist call to suicide bombing. But primed to think about death, they grew more positive toward the bomber. Some even said that they might consider becoming a martyr.

As time goes by and the memory of danger and death grows fainter, however, “mortality salience” tends to polarize people politically, leading them to cling to their own beliefs and demonize others who hold opposing beliefs — seeing in them the cause of their own endangerment.

The psychological research should make voters wary of emotional political appeals and even of their own emotions in response, Cohen says. “We encourage all citizens to vote with their heads rather than their hearts. Become an educated voter. Look at the candidate’s positions and platforms. Look at who you are voting for and what they stand for.”

Radiation leaks at nuclear facility in Rio (Correio Braziliense)

JC e-mail 4367, October 19, 2011.

Three leaks occurred inside the Nuclear Fuel Factory, a federal-government facility in Resende (RJ): two involved chemical substances, the third highly radioactive enriched uranium. The company confirms the incidents and admits equipment “failures,” but rules out harm to employees and the environment.

Engineers and workplace-safety technicians detected three leaks inside the Nuclear Fuel Factory (FCN) in Resende (RJ), two involving chemical substances and one involving enriched uranium (UO2), a highly radioactive material. The engineers and technicians reported the leaks to their superiors by internal e-mail; Correio obtained copies of those messages.

The uranium powder leaked from a machine called a homogenizer and fell onto the floor of the room, in an incident recorded on July 14, 2009. In January 2010, the factory’s warning alarm went off because of a leak of the liquefied gas used in the furnace that burns off the excess gases produced when uranium pellets are made. And in July of this year, an engineer suspected an ammonia leak and reported it to managers.

The three incidents posed no risk to workers, the environment or the operation of the plant, according to both the factory’s management – the facility belongs to the federal government – and the president of the National Nuclear Energy Commission (Cnen), the body responsible for overseeing radioactive activities in Brazil. “The uranium stayed in a confined, hermetically sealed room; it did not reach the environment,” says Samuel Fayad Filho, the FCN’s director of Nuclear Fuel Production. He acknowledges equipment “failures” and says that “all the procedures were followed” in response to the problems detected. “There is no radioactive-material leak in Resende,” he insists.

Correio consulted specialists to find out what the information circulating internally at the FCN means. For nuclear engineer Aquilino Senra, “it is evident that there was a failure.” “The UO2 powder was not supposed to come out of that press,” says Senra, deputy director of the Alberto Luiz Coimbra Institute for Graduate Studies and Research in Engineering (Coppe) at the Federal University of Rio de Janeiro. “The leak of UO2 from the press, and the presence of the substance on the floor, is a clear abnormality.”

On the liquefied-gas leak, Aquilino says that “leaked gas is never a good thing. Detectors exist for that, but the question is why the gas leaked.” Correio also spoke, on condition of anonymity, to a technician with ties to the Presidency of the Republic: “It does not look like a serious problem to me, because the Presidency was not notified,” he says.

Functions – The FCN is a complex of plants responsible for assembling fuel elements, producing uranium powder and pellets, and carrying out a small share of uranium enrichment. The ore is mined in Caetité (BA). Enrichment is done almost entirely abroad, but part of it already takes place at the FCN. Besides that small slice of enrichment, the plant produces the pellets used to generate nuclear power at the Angra 1 and Angra 2 plants in Angra dos Reis (RJ).

Today the FCN enriches 10% of the uranium needed for Angra 1 and 5% of that needed for Angra 2, according to Samuel Fayad. The FCN is part of the state-owned Indústrias Nucleares do Brasil (INB), which answers to the Ministry of Science, Technology and Innovation (MCT).

The uranium-powder leak was reported by a workplace-safety technician to senior management, and Cnen confirmed the alert to Correio. “The incident is irrelevant in safety terms. The powder in question was identified in a controlled area, inside an environment with containment for radioactive material, affecting neither the unit’s workers nor the environment,” the agency maintains, through its press office.

Crisis – The nuclear power sector is living through a conflict and a crisis within the federal government. The president of the National Nuclear Energy Commission (Cnen), Angelo Padilha, took office on July 7, after the minister of Science and Technology, Aloizio Mercadante, dismissed Odair Dias Gonçalves. Odair lost the job after revelations that the Angra 2 plant had operated for 10 years without a definitive license and that Brazil had begun importing uranium because of stalled licenses. So far the Nuclear Energy Regulatory Agency is no more than a proposal, owing to conflicts within the sector. The agency would strip Cnen – the main shareholder of Indústrias Nucleares do Brasil – of its regulatory and oversight functions.

Jaguars help preserve the Caatinga (Valor Econômico)

JC e-mail 4366, October 18, 2011.

Mapping how many jaguars live in the Caatinga, how they live and where they roam will make it possible to assess the effect of the São Francisco river diversion on the region.

It is right there, where the jaguar drinks, that the snare is set. In September, at the height of the Caatinga dry season, ten traps were laid in the region of Sento Sé, a municipality in northern Bahia on the shores of the Sobradinho reservoir. Five researchers, 30 days, rationed water, no electricity, computers or telephones, and an investment of R$22,000. At the end of the expedition, no jaguar had been fitted with a GPS collar. But the scientists’ frustration quickly gives way to planning the next campaign. At this pace they pursue the creation of a kind of “jaguar sustainability index,” linked to one of the biggest and most controversial projects of the Growth Acceleration Program (PAC): the diversion of the São Francisco river.

So much interest in this outsized cat – the largest feline in the Americas, the third largest in the world after the tiger and the lion, and owner of the most powerful bite among its relatives – goes beyond biology. Jaguars live only where there is water, and they are important predators that regulate ecosystems, keeping populations of capybaras, deer and rodents from exploding. At the top of the food chain, the jaguar is an umbrella species. “By protecting the jaguar, you protect all the others,” says veterinarian Ronaldo Gonçalves Morato, 44, one of the few jaguar specialists in this country of jaguars.

At the root of the study is a proposal to create a wildlife corridor in the heart of the semi-arid region – a protected area designed with the region’s economic potential in mind. One element is the Boqueirão da Onça National Park, under study for ten years. It would cover 800,000 hectares and would be the largest conservation unit outside the Amazon. But while the government decides whether or not to create the park, land prices and land-grabbing keep rising. The Ministry of Mines and Energy is also interested, seeing good wind-power potential in the region.

The government is seeking consensus to guarantee some protection to Brazil’s third – and most battered – biome. Less than 2% of the Caatinga is protected, more than 45% of its vegetation has been cleared, and the region is undergoing desertification. “We have this view that the Caatinga is poor, full stop. But there are fantastic landscapes and poorly used natural resources,” says Morato. “Exploring the Caatinga with a good tourism program would be very interesting.”

The biological diversity is rich even with scarce water. There are hundreds of species of birds, reptiles and amphibians; the landscapes are beautiful and varied; there are rock paintings, and fruits that yield exotic sweets. In the dry season the vegetation sheds its leaves to spend less energy. “People call this scenery the white forest. All it takes is rain and, three days later, everything is green. It’s marvelous,” Morato enthuses.

The park, still only on paper, would take up 45% of the municipality of Sento Sé, a region sparsely populated by people and perhaps well populated by jaguars. The jaguar, which ranged across the Caatinga in the days of Lampião, is today restricted to 25% of the biome. Researchers believe there are five large jaguar populations in the semi-arid region, at one or two animals per 100 km² – in Cáceres, in the Pantanal, the density is much higher, averaging seven jaguars per 100 km². Estimates put the Caatinga population at 300 to 400 animals.

Mapping with some precision how many jaguars there are in the northeastern backlands, how they live and where they roam means having an environmental indicator for judging, later, how much the São Francisco diversion has affected the region. If the jaguars are still there after the works, that is a good sign. The project is part of the São Francisco Basin Revitalization Program, coordinated by the Ministry of the Environment in partnership with the Ministry of Regional Integration, and it has also raised funds through the BM&F Bovespa. Big cats, jaguars above all, are glamorous animals.

The majesty of this emblem of Brazilian fauna, printed on the R$50 banknote, is inversely proportional to what is known about the animal. “We don’t even know how long a jaguar lives,” says Morato. “Every question we answer raises a new one.” He began his career with an internship at the Sorocaba zoo, in São Paulo – long enough to realize that what he really wanted was to study animals in the wild.

Morato has worked with jaguars for 20 years, the past six of them as coordinator of the National Center for Research and Conservation of Carnivorous Mammals (Cenap), an institute that studies a list of 26 species, from maned wolves to giant otters. Cats alone account for eight of them, among jaguars, pumas, ocelots and smaller wild cats. A giant image of a jaguar is visible from the road on the glass facade of Cenap’s headquarters in Atibaia. The center was created 17 years ago and is an arm of the Chico Mendes Institute for Biodiversity Conservation (ICMBio).

Investment in the project Ecology and Conservation of the Jaguar in the Middle São Francisco – or simply Jaguars of the Caatinga – amounts to R$800,000 over four years. The first capture campaign to fit collars, in 2010, was also unsuccessful. Catching an animal like this is not easy. Steel snares are set near the areas the cats tend to frequent. “We identify the points where the jaguars pass and leave the snares. But sometimes they walk right beside a snare and we only see the tracks the next day. It’s hard.”

Hard is an understatement. For a month of camping in September, the team carried in 300 liters of water per person. There are no roads, cars cannot get through, and the rocks shred tires. Equipment and water are hauled in on foot. Baths are taken with a cup.

When luck strikes and a jaguar is caught in the snare, the researchers fire an anesthetic dart and begin measuring the animal: weight, body size, paw size, an examination of the teeth. Then comes the telemetry collar, which weighs 800 grams and carries a GPS unit in a small box at the front. Each animal has its own frequency. The researchers program how often they will receive the animal’s position – every two hours, for example. Once a week, the data are e-mailed to the scientist by the company that operates the satellite. The collar can also be programmed to drop off the animal’s neck after a set period so it can be retrieved. “It stays on the jaguar for, say, 400 days, and then falls off,” Morato explains.
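The collar programming described above boils down to a simple schedule: one GPS fix at a fixed interval, with an automatic release after a set number of days. A minimal sketch with invented names and values (the actual collar firmware is proprietary and not described in the article):

```python
# Hypothetical sketch of the collar schedule described in the text:
# a GPS fix every `fix_interval_hours`, with automatic drop-off after
# `dropoff_days`. Names and values are illustrative, not the real firmware.

def fix_schedule(fix_interval_hours, dropoff_days):
    """Yield hour marks (counted from deployment) at which fixes are taken,
    stopping when the collar is programmed to drop off."""
    dropoff_hour = dropoff_days * 24
    t = 0
    while t < dropoff_hour:
        yield t
        t += fix_interval_hours

# Example matching the article's figures: a fix every 2 hours,
# drop-off after 400 days.
fixes = list(fix_schedule(2, 400))
total_fixes = len(fixes)  # 12 fixes/day * 400 days = 4800
```

In practice the collar stores or relays each fix and the satellite operator batches the data weekly, as the article notes; the sketch only illustrates the fix-interval and drop-off parameters the researchers choose.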

“Technology has greatly helped our work,” he says. The advances come at a price; none of this is cheap. Cenap uses collars from the Swiss firm Televilt, at US$3,800 each; the annual satellite contract costs another US$1,200 per collar. There are currently 40 such devices on jaguars in Brazil. By gathering information on the animal’s behavior – how and where it moves, which habitats it seeks out, how it feeds – the scientists map out the size of the jaguar’s home range. “I start to see which environments I can propose for preservation,” Morato explains.

“Jaguars of the Caatinga” raised funds on BVS&A, the Bovespa portal that lists social and environmental projects. “Anyone interested can go there, pick whatever they find worthwhile, and donate,” says Sonia Favaretto, the exchange’s sustainability director. The initiative brought in R$150,000 over two years. Cenap worked in partnership with the NGO Pró-Carnívoros, which helps make the research projects viable.

The jaguar’s role as ecological regulator is not its only one. “With the loss of species, we lose environments and become more exposed to catastrophes,” Morato notes. Fewer predators means more prey and more pressure on the vegetation. “In the long run, that reduces carbon stocks,” he adds. Morato argues that it is worth reflecting on the economic value of jaguars and on their appeal as a tourist draw.

Economics has met the enemy, and it is economics (Globe and Mail)

Adam Smith is considered the founding father of modern economics.

Ira Basen

From Saturday’s Globe and Mail
Published Saturday, Oct. 15, 2011 6:00AM EDT
Last updated Tuesday, Oct. 18, 2011 8:41AM EDT

After Thomas Sargent learned on Monday morning that he and colleague Christopher Sims had been awarded the Nobel Prize in Economics for 2011, the 68-year-old New York University professor struck an aw-shucks tone with an interviewer from the official Nobel website: “We’re just bookish types that look at numbers and try to figure out what’s going on.”

But no one who’d followed Prof. Sargent’s long, distinguished career would have been fooled by his attempt at modesty. He’d won for his part in developing one of economists’ main models of cause and effect: How can we expect people to respond to changes in prices, for example, or interest rates? According to the laureates’ theories, they’ll do whatever’s most beneficial to them, and they’ll do it every time. They don’t need governments to instruct them; they figure it out for themselves. Economists call this the “rational expectations” model. And it’s not just an abstraction: Bankers and policy-makers apply these formulae in the real world, so bad models lead to bad policy.

Which is perhaps why, by the end of that interview on Monday, Prof. Sargent was adopting a more realistic tone: “We experiment with our models,” he explained, “before we wreck the world.”

Rational-expectations theory and its corollary, the efficient-market hypothesis, have been central to mainstream economics for more than 40 years. And while they may not have “wrecked the world,” some critics argue these models have blinded economists to reality: Certain the universe was unfolding as it should, they failed both to anticipate the financial crisis of 2008 and to chart an effective path to recovery.

The economic crisis has produced a crisis in the study of economics – a growing realization that if the field is going to offer meaningful solutions, greater attention must be paid to what is happening in university lecture halls and seminar rooms.

While the protesters occupying Wall Street are not carrying signs denouncing rational-expectations and efficient-market modelling, perhaps they should be.

They wouldn’t be the first young dissenters to call economics to account. In June of 2000, a small group of elite graduate students at some of France’s most prestigious universities declared war on the economic establishment. This was an unlikely group of student radicals, whose degrees could be expected to lead them to lucrative careers in finance, business or government if they didn’t rock the boat. Instead, they protested – not about tuition or workloads, but that too much of what they studied bore no relation to what was happening outside the classroom walls.

They launched an online petition demanding greater realism in economics teaching, less reliance on mathematics “as an end in itself” and more space for approaches beyond the dominant neoclassical model, including input from other disciplines, such as psychology, history and sociology. Their conclusion was that economics had become an “autistic science,” lost in “imaginary worlds.” They called their movement Autisme-economie.

The students’ timing is notable: It was the spring of 2000, when the world was still basking in the glow of “the Great Moderation,” when for most of a decade Western economies had been enjoying a prolonged period of moderate but fairly steady growth.

Some economists were daring to think the unthinkable – that their understanding of how advanced capitalist economies worked had become so sophisticated that they might finally have succeeded in smoothing out the destructive gyrations of capitalism’s boom-and-bust cycle. (“The central problem of depression prevention has been solved,” declared another Nobel laureate, Robert Lucas of the University of Chicago, in 2003 – five years before the greatest economic collapse in more than half a century.)

The students’ petition sparked a lively debate. The French minister of education established a committee on economic education. Economics students across Europe and North America began meeting and circulating petitions of their own, even as defenders of the status quo denounced the movement as a Trotskyite conspiracy. By September, the first issue of the Post-Autistic Economic Newsletter was published in Britain.

As The Independent summarized the students’ message: “If there is a daily prayer for the global economy, it should be, ‘Deliver us from abstraction.’”

It seems that entreaty went unheard through most of the discipline before the economic crisis, not to mention in the offices of hedge funds and the Stockholm Nobel selection committee. But is it ringing louder now? And how did economics become so abstract in the first place?

The great classical economists of the late 18th and early 19th centuries had no problem connecting to the real world – the Industrial Revolution had unleashed profound social and economic changes, and they were trying to make sense of what they were seeing. Yet Adam Smith, who is considered the founding father of modern economics, would have had trouble understanding the meaning of the word “economist.”

What is today known as economics arose out of two larger intellectual traditions that have since been largely abandoned. One is political economy, which is based on the simple idea that economic outcomes are often determined largely by political factors (as well as vice versa). But when political-economy courses first started appearing in Canadian universities in the 1870s, it was still viewed as a small offshoot of a far more important topic: moral philosophy.

In The Wealth of Nations (1776), Adam Smith famously argued that the pursuit of enlightened self-interest by individuals and companies could benefit society as a whole. His notion of the market’s “invisible hand” laid the groundwork for much of modern neoclassical and neo-liberal, laissez-faire economics. But unlike today’s free marketers, Smith didn’t believe that the morality of the market was appropriate for society at large. Honesty, discipline, thrift and co-operation, not consumption and unbridled self-interest, were the keys to happiness and social cohesion. Smith’s vision was a capitalist economy in a society governed by non-capitalist morality.

But by the end of the 19th century, the new field of economics no longer concerned itself with moral philosophy, and less and less with political economy. What was coming to dominate was a conviction that markets could be trusted to produce the most efficient allocation of scarce resources, that individuals would always seek to maximize their utility in an economically rational way, and that all of this would ultimately lead to some kind of overall equilibrium of prices, wages, supply and demand.

Political economy was less vital because government intervention disrupted the path to equilibrium and should therefore be avoided except in exceptional circumstances. And as for morality, economics would concern itself with the behaviour of rational, self-interested, utility-maximizing Homo economicus. What he did outside the confines of the marketplace would be someone else’s field of study.

As those notions took hold, a new idea emerged that would have surprised and probably horrified Adam Smith – that economics, divorced from the study of morality and politics, could be considered a science. By the beginning of the 20th century, economists were looking for theorems and models that could help to explain the universe. One historian described them as suffering from “physics envy.” Although they were dealing with the behaviour of humans, not atoms and particles, they came to believe they could accurately predict the trajectory of human decision-making in the marketplace.

In their desire to have their field be recognized as a science, economists increasingly decided to speak the language of science. From Smith’s innovations through John Maynard Keynes’s work in the 1930s, economics was argued in words. Now, it would go by the numbers.

The turning point came in 1947, when Paul Samuelson’s classic book Foundations of Economic Analysis for the first time presented economics as a branch of applied mathematics. Without “the invigorating kiss of mathematical method,” Samuelson maintained, economists had been practising “mental gymnastics of a particularly depraved type,” like “highly trained athletes who never run a race.” After Samuelson, no economist could ever afford to make that mistake.

And that may have been the greatest mistake of all: In a post-crisis, 2009 essay in The New York Times Magazine, Princeton economist and Nobel laureate Paul Krugman wrote, “The central cause of the profession’s failure was the desire for an all-encompassing, intellectually elegant approach that gave economists a chance to show off their mathematical prowess.”

Of course, nothing says science like a Nobel Prize. Prizes in chemistry, physics and medicine were first awarded in 1901, long before anyone would have thought that economics could or should be included. But by the late 1960s, the central bank of Sweden was determined to change that, and when the Nobel family objected, the bank agreed to put up the money itself, making it the only one of the prizes to be funded by taxpayers.

Officially, then, it is known as the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel – but that title is rarely used. On Monday morning, Prof. Sargent and Princeton University Prof. Sims were widely reported to have won the Nobel Prize in Economics.

The confusion is understandable, and deliberate, according to Philip Mirowski, an economic historian at the University of Notre Dame. “It’s part of the PR trick,” Prof. Mirowski argues. Awarding the economics prize immediately after the prizes for physics, chemistry and medicine helps to place economics on the same level as those other natural sciences.

The prize also has helped to transform one particular ideology into economic orthodoxy. Prof. Mirowski, who is co-writing a book on the history of the economics prize, notes that throughout the 1970s and 1980s, economists whose work supported neoclassical, pro-market, laissez-faire ideas won a disproportionate number of those honours, as well as support from the increasing numbers of well-funded think tanks and foundations that cleaved to the same lines. People who rejected those ideas, or were skeptical of the natural sciences model, were quickly marginalized, and their road to academic advancement often blocked.

The result was a homogenization of economic thought that Prof. Mirowski believes “has been pretty deleterious for economics on the whole.”

The road to hell is paved with good intentions, rational expectations and efficient markets

Many critics of neoclassical economics argue that it has a powerful pro-market bias that’s provided an intellectual justification for politicians ideologically disposed to reduce government involvement in the economy.

The rational-expectations model, for example, assumes that consumers and producers all inform themselves with all available data, understand how the world around them operates and will therefore respond to the same stimulus in essentially the same way. That allows economists to mathematically forecast how these “representative” consumers and producers would behave.

During a recession, say, a well-meaning government might want to enhance benefits for the unemployed. Prof. Sargent, for one, would caution against that, because a “rational” unemployed worker might then calculate that it’s better to reject a lower-paying job. He’s blamed much of the chronically high unemployment in some European countries on the presence of an army of voluntarily unemployed workers, and spoken out against the Obama administration’s recent efforts to extend unemployment benefits.

Indeed, under the rational-expectations model, most market interventions by governments and central banks wind up looking counterproductive.

Meanwhile, the efficient-markets hypothesis, developed by University of Chicago economist Eugene Fama in the 1970s, has dominated thinking about financial markets. It posits that the prices of stocks and other financial assets are always “efficient” because they accurately reflect all the available information about economic fundamentals.

By this reasoning, there can be no speculative price bubbles or busts in the stock or housing markets, and speculators with evil intentions cannot successfully manipulate markets. Conveniently, since markets are self-stabilizing, there’s no need for government regulation of them.

Critics point out that both these theories tend to ignore what John Maynard Keynes called the “animal spirits” – playing down human irrationality, inefficiency, venality and ignorance. Those are qualities that are hard to plug into a mathematical equation that purports to model human behaviour.

These models also have failed to take into account the profound changes wrought by globalization, and the growing importance of banks, hedge funds and other financial institutions. Yet they have successfully provided a “scientific” cover for an anti-regulatory political agenda that is popular on Wall Street and in some Washington political circles.

Inside jobs: Pay no attention to that banker behind the curtain

The Great Depression of the 1930s led many economists of the day to question some of their discipline’s most fundamental assumptions and produced a decades-long heyday for Keynesian economics. So far, the Great Recession has led to less of a fundamental shift.

Notre Dame’s Prof. Mirowski believes that more rethinking is necessary. “Everyone thought the banks would have to change their behaviour, but they got bailed out and nothing changed. The economics profession has also been bailed out because it is so highly interlinked with the financial profession, so of course they don’t change. Why would they change?”

Indeed, economics may be the dismal science, but there is nothing dismal about the payoffs for those at the top of the heap serving as advisers and consultants and sitting on various boards. Unlike some disciplines, economics has no guidelines governing conflict of interest and disclosure.

In 2010, the Academy Award-winning documentary Inside Job exposed several disturbing examples of academic economists calling for deregulation while working for financial-services companies. And in a study of 19 prominent financial economists, published last year by the Political Economy Research Institute at the University of Massachusetts Amherst, 13 were found to own stock or sit on the boards of private financial institutions, but in only four cases were those affiliations revealed when they testified or wrote op-eds concerning financial regulation.

This year, the American Economic Association agreed to set up a committee to investigate whether economists should develop ethical guidelines similar to those already in place for sociologists, psychologists, statisticians and anthropologists.

But there appears to be little enthusiasm for the idea among mainstream economists. Prof. Lucas of the University of Chicago, in an interview with The New York Times, objected: “What disciplines economics, like any science, is whether your work can be replicated. It either stands up or it doesn’t. Your motivations and whatnot are secondary.”

Several billion pennies for their thoughts

The critics, however, are more numerous and considerably better financed than the French students a decade ago. In October, 2009, billionaire financier George Soros said that “the current paradigm has failed.” He resolved to help save economics from itself. He pledged $50-million toward the establishment of the New York-based Institute for New Economic Thinking (INET), with a mandate to promote changes in economic theory and practice through conferences, grants and campaigns for graduate and undergraduate education reforms.

Perry Mehrling, a professor of economics at New York’s Columbia University, is the chair of the curriculum task force at INET. He says his graduate students at Columbia are growing increasingly frustrated by the tendency to define the discipline by its tools instead of its subject matter – like the students in Paris a decade ago, they find little relationship between the mathematical models in class and the world outside the door.

Prof. Mehrling believes that economics education has become far too insular. Never mind cross-disciplinary study – even courses in economic history and the history of economic thought have all but disappeared, so students spend almost no time reading Smith, Keynes or other past masters.

“It’s not just that we’re not listening to sociologists,” Prof. Mehrling laments. “We’re not even listening to economists.”

He says he has no problem with teaching efficient-markets and rational-expectations theories, but as hypothesis, not catechism. “I object to the idea that these are articles of faith and if you don’t accept them, you are not a member of the tribe. These things need to be questioned and we need a broader conversation.”

The challenge, as Columbia University economist Joseph Stiglitz said at the opening conference of INET, is that “we need better theories of persistent deviations from rationality.”

Some of those theories are coming from the rapidly growing field of behavioural economics, which borrows insights about human motivation from cognitive psychology: A paper titled The Hubris Hypothesis of Corporate Takeovers, for example, examines how the egos of ambitious chief executive officers can lead them to pursue takeovers, even when all available evidence suggests that the move could be a disaster.

It is not yet clear how such new approaches can evolve into workable models, but they hint at what a post-autistic economics might look like.

Prof. Mehrling is cautiously optimistic. “There’s a recognition that things we thought were true aren’t necessarily true,” he argues, “and the world is more complicated and interesting than we thought – so all bets are off, and that’s exciting intellectually.”

Change comes slowly in academia. The few jobs that are available don’t generally go to people who challenge orthodoxy. But over the next decade, as the post-crash crop of economics students makes its impact felt in government, business and schools, the lessons learned may well seep into the mainstream.

Theories based on assumptions of rationality, efficiency and equilibrium in the marketplace are likely to be treated with a great deal more skepticism. Homo economicus is a lot more anxious, irrational, unpredictable and complex than most economists believed. And, as Adam Smith recognized, he has a moral and ethical dimension that should not be ignored.

Today, the Post-Autistic Economic Network continues to publish its newsletter, now known as the Real-World Economic Review. It remains a thorn in the side of mainstream economics. In an editorial in January, 2010, the editors called for major economics organizations to censure those economists who “through their teachings, pronouncements and policy recommendations facilitated the global financial collapse” and pointed to the “continuing moral crisis within the economics profession.”

It is unlikely that Prof. Sargent will acknowledge any of this when he travels to Stockholm to accept his (sort of) Nobel Prize in December. Nor is he likely to speak about what role, if any, his models really might have played in “wrecking the world.”

But he did make one concession in his interview with the Nobel website this week: “Many of the practical problems are ahead of where the models are,” he admitted. “That’s life.”

Ira Basen is a radio producer, journalist and educator based in Toronto.


2011 American Anthropological Association Meeting, Montreal

Friday, November 18, 2011: 08:00-11:45

Organizers: Renzo Taddei (Federal University of Rio de Janeiro) and Karen E Pennesi (University of Western Ontario)
Chairs: Karen E Pennesi (University of Western Ontario)
Discussants: Ben Orlove (Columbia University) and Renzo Taddei (Federal University of Rio de Janeiro)

Future Forecasting and the End of Relativism: A Challenge for Anthropology
Will Rollason (Brunel University)

Forecasting Credit: Living Proleptically In the Brazilian Amazon
Jeremy M Campbell (Roger Williams University)

The Secret Life of Forecasts: Examining the Production and Use of Tornado Warnings As Social Processes
Heather Lazrus (National Center for Atmospheric Research), Amy Nichols (University of Oklahoma) and Stephanie Hoekstra (University of Oklahoma)

Functions and Interpretations of Ambiguous Language In Predictions
Karen E Pennesi (University of Western Ontario)

On the Simulation of Deforestation Scenarios In Making REDD Carbon Market
Shaozeng Zhang (University of California, Irvine)

Ben Orlove (Columbia University)



Forecasting As History: Japan’s Modern Earthquakes
Kerry Smith (Brown University)

Articulating Future Health Effects and Climate Change: Collaborative Modeling Systems and Future Epistemologies
Brandon J Costelloe-Kuehn (Rensselaer Polytechnic Institute)

Looking After Today: Resource Depletion and Hydrocarbon Culture In Industrial Trinidad
Jacob Campbell (University of Arizona)

Social Impact Assessment and the Anthropology of the Future In Canada’s Oilsands
Clinton N Westman (University of Saskatchewan)

Clouds In the Forecast: The Future, Climate Science, and Humanitarian Aid
Soo-Young Kim (Columbia University)

Renzo Taddei (Federal University of Rio de Janeiro)


Rick Perry officials spark revolt after doctoring environment report (The Guardian)

Scientists ask for names to be removed after mentions of climate change and sea-level rise taken out by Texas officials

Suzanne Goldenberg, US environment correspondent, Friday 14 October 2011 13.05 BST

Rick Perry’s administration deleted references to climate change and sea-level rise from the report. Photograph: Evan Vucci/AP

Officials in Rick Perry’s home state of Texas have set off a scientists’ revolt after purging mentions of climate change and sea-level rise from what was supposed to be a landmark environmental report. The scientists said they were disowning the report on the state of Galveston Bay because of political interference and censorship from Perry appointees at the state’s environmental agency.

By academic standards, the protest amounts to the beginnings of a rebellion: every single scientist associated with the 200-page report has demanded their names be struck from the document. “None of us can be party to scientific censorship so we would all have our names removed,” said Jim Lester, a co-author of the report and vice-president of the Houston Advanced Research Centre.

“To me it is simply a question of maintaining scientific credibility. This is simply antithetical to what a scientist does,” Lester said. “We can’t be censored.” Scientists see Texas as at high risk because of climate change, from the increased exposure to hurricanes and extreme weather on its long coastline to this summer’s season of wildfires and drought.

However, Perry, in his run for the Republican nomination, has elevated denial of science, from climate change to evolution, to an art form. He opposes any regulation of industry, and has repeatedly challenged the authority of the Environmental Protection Agency.

Texas is the only state to refuse to sign on to the federal government’s new regulations on greenhouse gas emissions. “I like to tell people we live in a state of denial in the state of Texas,” said John Anderson, an oceanographer at Rice University, and author of the chapter targeted by the government censors.

That state of denial percolated down to the leadership of the Texas Commission on Environmental Quality. The agency chief, who was appointed by Perry, is known to doubt the science of climate change. “The current chair of the commission, Bryan Shaw, commonly talks about how human-induced climate change is a hoax,” said Anderson.

But scientists said they had hoped to avoid a clash by simply avoiding direct reference to human causes of climate change and by sticking to materials from peer-reviewed journals. However, that plan began to unravel when officials from the agency made numerous unauthorised changes to Anderson’s chapter, deleting references to climate change, sea-level rise and wetlands destruction.

“It is basically saying that the state of Texas doesn’t accept science results published in Science magazine,” Anderson said. “That’s going pretty far.”

Officials even deleted a reference to the sea level at Galveston Bay rising five times faster than the long-term average – 3mm a year compared to 0.5mm a year – which Anderson noted was a scientific fact. “They just simply went through and summarily struck out any reference to climate change, any reference to sea level rise, any reference to human influence – it was edited or eliminated,” said Anderson. “That’s not scientific review, that’s just straightforward censorship.”

Mother Jones has tracked the changes. The agency has defended its actions. “It would be irresponsible to take whatever is sent to us and publish it,” Andrea Morrow, a spokeswoman, said in an emailed statement. “Information was included in a report that we disagree with.”

She said Anderson’s report had been “inconsistent with current agency policy”, and that he had refused to change it. She refused to answer any questions. Campaigners said the censorship by the Texas state authorities was a throwback to the George Bush era when White House officials also interfered with scientific reports on climate change.

In the last few years, however, such politicisation of science has spread to the states. In the most notorious case, Virginia’s attorney general Ken Cuccinelli, who is a professed doubter of climate science, has spent a year investigating grants made to a prominent climate scientist, Michael Mann, when he was at a state university in Virginia.

Several courts have rejected Cuccinelli’s demands for a subpoena for the emails. In Utah, meanwhile, Mike Noel, a Republican member of the Utah state legislature, called on the state university to sack a physicist who had criticised climate science doubters.

The university rejected Noel’s demand, but the physicist, Robert Davies, said such actions had had a chilling effect on the state of climate science. “We do have very accomplished scientists in this state who are quite fearful of retribution from lawmakers, and who consequently refuse to speak up on this very important topic. And the loser is the public,” Davies said in an email.

“By employing these intimidation tactics, these policymakers are, in fact, successful in censoring the message coming from the very institutions whose expertise we need.”

Seeing Value in Ignorance, College Expects Its Physicists to Teach Poetry (N.Y. Times)


ANNAPOLIS, Md. — Sarah Benson last encountered college mathematics 20 years ago in an undergraduate algebra class. Her sole experience teaching math came in the second grade, when the first graders needed help with their minuses.

Sarah Benson has a Ph.D. in art history and a master’s in comparative literature, but this year she is teaching geometry. Shannon Jensen for The New York Times
And yet Ms. Benson, with a Ph.D. in art history and a master’s degree in comparative literature, stood at the chalkboard drawing parallelograms, constructing angles and otherwise dismembering Euclid’s Proposition 32 the way a biology professor might treat a water frog. Her students cared little about her inexperience. As for her employers, they did not mind, either: they had asked her to teach formal geometry expressly because it was a subject about which she knew very little.

It was just another day here at St. John’s College, whose distinctiveness goes far beyond its curriculum of great works: Aeschylus and Aristotle, Bacon and Bach. As much of academia fractures into ever more specific disciplines, this tiny college still expects — in fact, requires — its professors to teach almost every subject, leveraging ignorance as much as expertise.

“There’s a little bit of impostor syndrome,” said Ms. Benson, who will teach Lavoisier’s “Elements of Chemistry” next semester. “But here, it’s O.K. that I don’t know something. I can figure it out, and my job is to help the students do the same thing. It’s very collaborative.”

Students in Ms. Benson’s class discussing Euclid. Shannon Jensen for The New York Times

Or as St. John’s president, Chris Nelson (class of 1970), put it with a smile only slightly sadistic: “Every member of the faculty who comes here gets thrown in the deep end. I think the faculty members, if they were cubbyholed into a specialization, they’d think that they know more than they do. That usually is an impediment to learning. Learning is born of ignorance.”

Students who attend St. John’s — it has a sister campus in Santa Fe, N.M., with the same curriculum and philosophies — know that their college experience will be like no other. There are no majors; every student takes the same 16 yearlong courses, which generally feature about 15 students discussing Sophocles or Homer, and the professor acting more as catalyst than connoisseur.

What they may not know is that their professor — or tutor in the St. John’s vernacular — might have no background in the subject. This is often the case for the courses that freshmen take. For example, Hannah Hintze, who has degrees in philosophy and woodwind performance, and whose dissertation concerned Plato’s “Republic,” is currently leading classes on observational biology and Greek.

“Some might not find that acceptable, but we explore things together,” said Ryan Fleming, a freshman in Ms. Benson’s Euclid class. “We don’t have someone saying, ‘I have all the answers.’ They’re open-minded and go along with us to see what answers there can be.”

Like all new tutors, Ms. Benson, 42, went through a one-week orientation in August to reacquaint herself with Euclid, and to learn the St. John’s way of teaching. She attends weekly conferences with more seasoned tutors.

Her plywood-floor classroom in McDowell Hall is almost as dim and sparse as the ones Francis Scott Key (valedictorian of the class of 1796) studied in before the college’s original building burned down in 1909. Eight underpowered ceiling lights barely illuminated three walls of chalkboards. While even kindergarten classrooms now feature interactive white boards and Wi-Fi connected iPads, not one laptop or cellphone was visible; the only evidence of contemporary life was the occasional plastic foam coffee cup.

The discussion centered not on examples and exercises, but on the disciplined narrative of Euclid’s assertions, the aesthetic economy of mathematical argument. When talk turned to Proposition 34 of Book One, which states that a parallelogram’s diagonal divides it into equal areas, not one digit was used or even mentioned. Instead, the students debated whether Propositions 4 and 26 were necessary for Euclid’s proof.

When a student punctuated a blackboard analysis with, “The self-evident truth that these triangles will be equal,” the subliminal reference to the Declaration of Independence hinted at the eventual braiding of the disciplines by both students and tutors here. So, too, did a subsequent discussion of how “halves of equals are equals themselves,” evoking the United States Supreme Court’s logic in endorsing segregation 2,200 years after Euclid died.

Earlier in the day, in a junior-level class taught by a longtime tutor about a portion of Newton’s seminal physics text “Principia,” science and philosophy became as intertwined as a candy cane’s swirls. Students discussed Newton’s shrinking parabolic areas as if they were voting districts, and the limits of curves as social ideals.

One student remarked, “In Euclid before, he talked a lot about what is equal and what isn’t. It seems here that equality is more of a continuum — we can get as close as we want, but never actually get there.” A harmony of Tocqueville was being laid over Newton’s melody.

The tutor, Michael Dink, graduated from St. John’s in 1975 and earned his master’s degree and Ph.D. in philosophy from the Catholic University of America. Like most professors here, he long ago traded the traditional three-course academic career — writing journal articles, attending conferences and teaching a specific subject — for the intellectual buffet at St. John’s. His first year included teaching Ptolemy’s “Almagest,” a treatise on planetary movements, and atomic theory. He since has taught 15 of the school’s 16 courses, the exception being sophomore music.

“You have to not try to control things,” Mr. Dink said, “and not think that what’s learned has to come from you.”

This ancient teaching method could be making a comeback well beyond St. John’s two campuses. Some education reformers assert that teachers as early as elementary school should lecture less at the blackboard while students silently take notes — the sage-on-the-stage model, as some call it — and foster more discussion and collaboration among smaller groups. It is a strategy that is particularly popular among schools that use technology to allow students to learn at their own pace.

Still, not even the most rabid reformer has suggested that biology be taught by social theorists, or Marx by mathematicians. That philosophy will continue to belong to a school whose president has joyfully declared, “We don’t have departmental politics — we don’t have departments!”

Anthony T. Grafton, a professor of history at Princeton and president of the American Historical Association, said he appreciated the approach.

“There’s no question that people are becoming more specialized — it’s natural for scholars to cover a narrow field in great depth rather than many at the same time,” he said. “I admire how St. John’s does it. It sounds both fun and scary.”