Monthly archive: September 2012

NASA Scientists Use Unmanned Aircraft to Spy on Hurricanes (Wired)

By Jason Paur

September 21, 2012

Image: NASA

Hurricane researchers are gathering unprecedented data this month by using two NASA Global Hawk unmanned aircraft. The airplanes were originally developed for the military, but have been modified to aid in atmospheric research.

One of the Global Hawks was flown to its new base at NASA’s Wallops Island Flight Facility on Virginia’s Atlantic coast earlier this month and has already flown several missions over developing tropical storms, giving atmospheric scientists the ability to watch and measure storms for up to three times as long as they could with manned aircraft, including NASA’s modified U-2 spy plane. The second Global Hawk is set to depart the Dryden Flight Research Center in California and join its hangar mate in the next week or so.

The airplanes were first used to observe storm development on a limited basis in 2010. The earlier missions were flown out of Dryden, which cut short the amount of time the planes could spend over Atlantic storms. Now based on the East Coast, the Global Hawks can spend up to six hours off the coast of Africa as storms develop, or 20 hours or more as the storms approach North America.

This long loiter capability is what has scientists excited to gain new insight into the life of a hurricane, says Scott Braun, principal investigator on NASA’s Hurricane and Severe Storm Sentinel (HS3) project.

“We’ve kind of gone from reconnaissance, which is short-term investigations,” said Braun, “to more of surveillance, where you can stay with a storm and move with it for a while.”

The path flown by Global Hawk AV-6 over Tropical Storm Nadine on Sept. 12. (NASA)

On Wednesday, the Global Hawk known as AV-6 departed Wallops Island and flew for nearly 25 hours observing Tropical Storm Nadine over the central Atlantic. It was the third time AV-6 had flown over the storm in the past 10 days. The airplane carries an “environmental payload” designed to gather big-picture data by flying around the entire storm (above). In addition to a high-resolution infrared sounder and cloud-physics LIDAR for measuring the structure and depth of clouds, AV-6 can make direct measurements by dropping radiosondes that measure temperature, humidity, wind speed and air pressure as they descend by parachute from as high as 60,000 feet.

The other airplane, AV-1, has a different suite of remote-sensing instruments on board, focused on the development of the inner core of the storms, measuring variables including wind profiles, rain rates and the liquid water content of clouds.

Researchers have learned a lot about predicting the path of hurricanes over the past several decades, but predicting the intensification of storms, especially early in their development, has been far less successful. One of the aspects of storm and hurricane development Braun and the HS3 team hope to learn more about is the role that dry, dusty air masses blowing off the Sahara play in the intensification process. The question is debated in the research community, and until now scientists have had only a limited ability to watch the interaction for long periods of time.

“In some ways the Saharan air layer is essential,” Braun said. “The question is, once you develop these waves, to what extent can this dry and dusty air get into these disturbances to disrupt the ability of thunderstorms to get rotation organized on smaller scales to spin up a hurricane.”

Braun says most of these storms form from air masses that come off Africa, but with manned aircraft researchers may get only a few hours at a time to watch them and gather data. Satellites are also used extensively, but they offer only a few snapshots a day and cannot make direct measurements. The Global Hawks provide the capability to watch the storms develop for up to a day at a time, and with two of them the researchers will eventually be able to watch storms continuously.

“We could potentially follow them all the way across the Atlantic, so you’re looking at the full life cycle of these storms.”

The Global Hawks will remain in Virginia through early October until hurricane season is over. The airplanes will be back on the East Coast for more long flights next year as well as in 2014.

NIH Decision Signals the Beginning of the End for Medical Research on Chimps (Wired)

By Brandon Keim

September 21, 2012

Henry, one of the chimps at Chimp Haven. Image: Steven Snodgrass/Flickr

With the retirement of 110 government-owned chimpanzees, the end of medical research on man’s closest living relative may be near.

Today, the National Institutes of Health announced that all of its chimps now living at the New Iberia Research Center would be permanently removed from the research population.

Long criticized by animal advocates for mistreating animals and illegally breeding chimps, New Iberia operates the largest research chimp colony in the United States and is a bastion of a practice abandoned in every other country.

“This is a major message from the NIH: that this era is coming to an end,” said John Pippin of the Physicians Committee for Responsible Medicine, an animal advocacy group. “This is huge.”

In December of last year, an expert panel convened by the Institute of Medicine, the nation’s medical science advisers, declared that medical research on chimpanzees was ethically problematic and, in most cases, scientifically unnecessary. The NIH announced a moratorium on new chimp research funding and agreed to review the status of its own animals. After years of fighting for an end to medical research on chimps, whose ability to think, feel and suffer is not far removed from our own, animal advocates greeted that news with cautious relief. The NIH’s intentions sounded good, but what they’d actually do remained to be seen.

With the decision to retire 110 chimps at New Iberia, the NIH leaves little doubt of its plans. “This is a significant step in winding down NIH’s investment in chimpanzee research based on the way science has evolved and our great sensitivity to the special nature of these remarkable animals, our closest relatives,” said NIH director Francis Collins to the Washington Post.

‘They do not have scientific or ethical justification to continue.’

Excluding the retired chimpanzees, the NIH still owns an estimated 475 chimps eligible for research. Another 500 or so are owned by pharmaceutical companies. The NIH’s decisions influence their fate as well, said Pippin.

“With this indication that the NIH is going to get out of chimp research, that’s going to drop the bottom out of the whole chimpanzee research enterprise,” Pippin said. “How are you going to justify your research in light of what the IOM and NIH have said? Even those not directly affected by this prohibition are going to give up. They do not have scientific or ethical justification to continue.”

Kathleen Conlee, animal research director with the Humane Society of the United States, was more measured in her response.

“They’re taking a step in the right direction by deeming these chimps ineligible for research,” she said. “But we’d rather see them go to sanctuary.” She noted that while 10 of the New Iberia retirees will be sent to the Chimp Haven sanctuary, the rest will go to the Texas Biomedical Research Institute’s Southwest National Primate Research Center.

Though the newly retired chimps won’t be used again in medical research, that type of research still occurs at Southwest. Indeed, it was an attempt to send retired chimps back into research at Southwest that sparked the controversy that led to the IOM report and NIH review.

“Places like Southwest were built to be research labs. We’d urge the chimps to be sent somewhere where the mission is the well-being of chimps,” Conlee said. According to Conlee, housing animals at Chimp Haven costs the government $40 per day, compared to $60 per day at research laboratories.

Conlee said that some companies, including Abbott Labs and Idenix, have agreed to follow the IOM guidelines for chimp research or abandon it altogether. Others, including GlaxoSmithKline, have already given up.

Rather than relying on corporate goodwill, however, both Conlee and Pippin urged people to support the Great Ape Protection and Cost Savings Act. Now under congressional consideration, the bill would end medical research on chimps.

UN wants to ensure global temperature does not rise more than 2°C (Globo Natureza)

JC e-mail 4582, September 13, 2012

The United Nations (UN) climate negotiations must keep pushing for more ambitious action to ensure that global warming does not exceed 2 degrees, a European Union negotiator said this week, a month after the United States was accused of backsliding on the target.

Nearly 200 countries agreed in 2010 to limit the rise in temperatures to below 2 degrees Celsius above the pre-industrial era in order to avoid the dangerous impacts of climate change, such as floods, droughts and rising sea levels.

To slow the pace of global warming, the UN climate talks in South Africa agreed to develop a legally binding climate agreement by 2015, which could enter into force by 2020 at the latest.

Experts warn, however, that the chance of limiting the rise in global temperature to less than 2 degrees is shrinking as greenhouse gas emissions from the burning of fossil fuels continue to grow.

“It is very clear that we must press in the negotiations the point that the 2-degree target is not enough. The reason we are not doing enough is the political situation in some parts of the world,” said Peter Betts, Britain’s director for international climate change and a senior EU negotiator, to a climate change group in the British Parliament.

Last week, scientists and diplomats met in Bangkok for a session of the UN Framework Convention on Climate Change (UNFCCC), the last before the annual meeting to be held between November and December in Doha, Qatar.

Flexibility in the targets – Last month, the US was criticized for saying it supported a more flexible approach to a new climate agreement – one that would not necessarily keep the 2-degree limit – but later added that flexibility would give the world a better chance of reaching a new deal.

Several countries, including some of the most vulnerable to climate change, say the 2-degree limit is not enough and that a 1.5-degree limit would be safer. Emissions of the main greenhouse gas, carbon dioxide, rose 3.1% in 2011, a record high. China was the world’s largest emitter, followed by the US.

Negotiations to create a new global climate agreement along the lines of Kyoto have already begun. At the last climate conference, a series of measures was approved establishing targets for developed and developing countries.

The document, called the “Durban Platform for Enhanced Action,” lists a series of measures to be implemented, but in practice there are no effective, urgent measures to curb rising pollution levels across the planet over the next eight years.

Obligations for all in the future – It provides for the creation of a global climate agreement encompassing all UNFCCC member countries, which will replace the Kyoto Protocol. Countries are to design “a protocol, another legal instrument or an agreed outcome with legal force” to combat climate change.

This means that emissions-reduction targets will be set for all nations, including the United States and China, which would not accept any kind of negotiation unless the other party were also included in the reduction obligations.

The outline of this new plan will begin to take shape in the next rounds of UN negotiations, including COP 18, to be held in Qatar in 2012. The document states that a working group will be created and should conclude the new plan in 2015.

Pollution-control measures are only due to be implemented by countries from 2020, the deadline set in the Durban Platform, and should take into account the recommendations of the report of the Intergovernmental Panel on Climate Change (IPCC), to be released between 2014 and 2015.

In 2007, the body released a document pointing to an average global temperature increase of between 1.8°C and 4.0°C by 2100, with the possibility of a rise of up to 6.4°C if population and the economy keep growing rapidly and intensive consumption of fossil fuels is maintained.

The best estimate, however, is an average increase of 3°C, assuming carbon dioxide levels stabilize at 45% above the current rate. The report also states, with more than 90% confidence, that most of the temperature increase observed over the past 50 years was caused by human activities.

The hollowing-out of today’s ecological debate, which fails to question the economic and development model (EcoDebate)

Published on September 6, 2012 by

“The question becomes ‘what should I do to help?’ (…) when the main question should be ‘whom and what should I fight against?’”

Vladimir Safatle is part of a new wave of left-wing intellectuals who are not intimidated by the diversity of questions posed by the contemporary world. In this interview, the professor in the Philosophy Department of the University of São Paulo (USP) shows that the crisis of representative democracy may be the key to better understanding facts that at first glance seem unrelated, revealing mechanisms that link Icelanders to Brazilian fishermen, and ecologists to young people who are once again claiming the streets as a space for doing politics. One of the authors of ‘Occupy’ (Boitempo, 2012), Safatle argues that we are living in a moment in which the critique of democracy, far from paving the way for totalitarianism, rekindles democracy’s capacity for reinvention from the standpoint of popular sovereignty. With the release of ‘A esquerda que não teme dizer seu nome’ (Três Estrelas, 2012), the philosopher argues for the urgency of abandoning the “comfortable and depressive fatalism” that, since the fall of the Berlin Wall, has fed the false impression that no radical rupture is on the political agenda.

In your book, you argue that the left has failed to show what is non-negotiable. Abandoning pragmatism and overcoming the impasses of ‘governability,’ among other things, would be paths toward that. On the other hand, a doubt hangs over the parties, unions and similar structures themselves: will they be capable of transforming themselves? The young people occupying the world’s streets do not seem to identify with this kind of organization of political life. Why is that?

What happened to the left-wing parties?

Left-wing parties have gone through two phases. The first, strongly marked by the polarity between social democratic and communist parties, sustained the development of the welfare states in Europe in the 1950s and 1960s. The second moment of the left-wing parties is the result of the libertarian ideas of May 1968, which generated a myriad of libertarian parties, the most important of them being the green party. The green parties managed to place a fundamental ecological agenda in the political debate, but this movement has also run its course. Perhaps its last flicker is happening in Germany with the Pirate Party. What is missing is a third wave of parties capable of processing the dead-end situation of the 2008 crisis, which will persist for a long time to come.

What would characterize these parties?

We lack a generation of parties aware of the problems tied to economic inequality, something these second-generation parties do not have. Incidentally, the German Green Party was responsible for the law that deregulated and loosened the labor market, passed in the days of Gerhard Schröder [German chancellor from 1998 to 2005]. We lack a generation of parties with the courage to radicalize the processes of institutionalizing popular sovereignty. Parties that do not function as parties. That may sound strange, but deep down it is very important. Parties that do not have that centralized, strategically oriented structure in which discussions are subordinated to the party’s electoral strategies of the day. Why don’t young people want to join parties today? Because they do not want their critical capacity instrumentalized by electoral calculations. No one wants to keep making a political alliance with so-and-so to guarantee the election of someone else. That merchant’s way of reasoning, which has managed to monopolize politics at every level – including on the left – is what a good part of today’s young people vehemently refuse to follow, and rightly so.

What takes its place?

It is essential to find a model of electoral participation in which that kind of position is not sold off. No one here is making the profession of faith that prevailed in the 1990s of changing the world without taking power. That did not work and will not work; Egypt is an example. The group that really mobilized the revolutionary process is called the April 6 Movement. They decided not to enter the electoral game and are increasingly isolated. The idea of a force that comes from the streets and pressures the regime from outside has its limits. So this is not an abstract critique of the electoral process, but the realization that it is necessary to know how to enter that process in a way different from what we have seen so far. Perhaps the creation of flexible alliances for an election that later dissolve, like the Left Front in France, things of that kind. It is hard to know what will emerge, but one thing is certain: what we have today no longer suffices. There is a very strong fixation on representative democracy. Since the 1970s, political science has indulged in a kind of delight in endlessly discussing what the democratic game should look like, the structure of parties, of powers, and blah, blah, blah. That kind of perspective radically blocks the idea that one of the central questions of democracy is to critique democracy itself. When democracy loses its capacity for reinvention, it dies. That is what is happening now.

What contributed to the recomposition of the streets as public space, and why was it abandoned for so long?

For social critique and mobilization you need disenchantment. Several levels of disenchantment were necessary for people to return to the streets. When I was in my early twenties, the talk was that we would never again see great popular mobilizations. There could be one-off mobilizations around specific issues, but never a mobilization that would call into question the model of functioning and management of social life within advanced capitalist societies. Today we see that those who made those predictions were not only wrong but had unavowable ideological interests. The people who took to the streets in 2011 wanted to discuss the functioning model of the economic and social structure of our societies. When that happened, many, especially in the press, delighted in saying they had no proposals, which is false. Those who went to the streets were claiming the right to put the problems on the table. Often, the worst way to think about a problem is to ‘solve’ it too quickly. There were also those who did not take to the streets and, faced with the financial crisis, showed up with ready-made solutions. Those ‘solutions’ only made the problems worse.

With regard to the environmental agenda, there are many ‘solutions’ that in fact deliberately hollow out the political potential of ecological questions. We see the individualization of responsibility for pollution in the discourse about plastic bags, about how long people should spend in the shower, and so on, as well as an effort to keep the population out of the discussion by dressing it up as eminently technical. How do you see this?

It is an attempt to strip the ecological question of its political force by turning it into a moral question. The discussion revolves around the acts of individuals, which need to be modified. You need to spend less time in the shower, buy organic products and things like that. It is a very astute way of operating a shift that is fatal to the ecological problem, because the question becomes “what should I do to help?” – and, in principle, it seems nice for everyone to do something to help – when the main question should be “whom and what should I fight against?”. Without that, the tendency is to completely empty out the ecological discussion; the economic and development model goes unquestioned. And the strong political potential of this discussion lies precisely in questioning the development model of advanced capitalist societies, calling into question the model of organization and management of cities, transport, waste, energy… As a result of this shift from the political to the moral dimension, none of this is put in question; however much everyone defends “the forests” with hand on heart, the question that ecology raised stays outside the debate.

The rhetoric of technical discourse, in which people cannot access the facts without the mediation of experts, is an obstacle to rebuilding the political field on the basis of this direct democracy, closely tied to the real interests of populations, isn’t it?

I can give an example of this kind of problem. Iceland was one of the first countries to enter the 2008 financial crisis. Icelandic banks sold investment funds in the Netherlands and England, and when those banks collapsed, the Dutch and British governments demanded that the Icelandic government cover the banks’ debt. Faced with this, the Icelandic parliament decided to vote on a bailout law for the failed banks, and the law passed. But the president of Iceland, who was a more enlightened fellow, recalled that the country’s Constitution provided for calling a popular referendum in cases like that. In short, he recalled that the central principle of democracy is: whoever pays the orchestra chooses the music. The ones who would pay that debt would not be the parliament but the population, whose resources and wages would be expropriated through a series of taxes earmarked to pay the banks’ debt. The Icelandic population decided it did not want that. After the result of the referendum, the most fantastic thing happened, which is the essence of today’s parliamentary democracy: the parliament voted and approved the same bank bailout law once again. So, once again, the president triggered the popular referendum mechanism and, for the second time, Icelanders said no. What does this mean? Some may ask “how does such a ‘technical’ question end up in a popular referendum?”, accuse the president of demagoguery, etc., which is absolutely surreal. It is not acceptable that parliamentarians whose campaigns are paid for by banks should decide what happens to the population’s money with regard to paying, or not paying, those banks’ debt. There was no shortage of economists predicting that Iceland would go under. Yet of all the countries that entered the crisis, Iceland is one of those in the best situation today. The attempt to strip the decision of its political force was simply an ideological construction to legitimize the “technicians,” who, at bottom, are not technicians at all, because they are representatives of the financial power that has managed to take over all the institutions of the advanced democracies. That is the limit of democracy today. The financial system is the great enemy of democracy.

There is a kind of environmental agenda based on bringing common goods into the market that has been denounced as the solution found by the financial system to get out of the crisis, at the same time as, also drawing on the rhetoric of crisis, Angela Merkel leads austerity policies in the eurozone that delegitimize the sovereign will of peoples, as in the Greek case. How does ‘the left that is not afraid to say its name’ position itself in this process?

Problems linked to ecology have a strong potential that is not only mobilizing but also transformative. However, today we have two ecologies. One has transformative potential, but the other is conservative. Capitalism sees in ecology one of the elements of its renewal. Today, any liberal, any Wall Street analyst will embrace ecological discourse. Some authors say that after the housing bubble we now have the green bubble. I once wrote a short piece about Oliver Stone’s film Wall Street [2010], which impressed me with the sharpness of its metaphor. A young market analyst bets on the financial potential of renewable energy. He was a visionary because, in a way, he preached a reconciliation between the most rentier sector of the economy and some demands present on the ecological agenda. That can only be done by completely selling off the dimension in which ecological reflection appears as a fundamental element in the affirmation of popular sovereignty. There is a bizarre but very concrete tendency toward an articulation between a certain sector of ecological struggles and financial capital. Indeed, from an electoral point of view, a lot of complicated things happen. European green parties prefer to ally with centrist parties rather than with left-wing parties. In Germany, for example, the Green Party prefers an alliance with the CDU [the Christian Democratic party of Chancellor Angela Merkel] to an alliance with Die Linke, which is a harder-left party. In France it was the same thing. All of this seems very worrying to me. It is necessary to free the ecological agenda from this tendency to justify a renewed liberalism and put it back in the place where it has always belonged, that is, as a fundamental element of the left’s reflection on the deleterious character of the development processes of advanced capitalism.

How can the new left-wing thinking articulate a different philosophical outlook on the question of the productivist use of nature, characteristic of neo-developmentalism here in Brazil?

I recognize that this productivism toward nature was also very present in certain sectors of the left which, for a long time, understood nature as a source of resources and nothing more. Just remember that in the communist countries environmental policy was catastrophic. This even has a theoretical basis; it comes from a reading of Marxist thought in which nature was a reified discourse, with no ontological reality of its own. Ultimately, nature was the fruit of human labor, so human intervention in nature was justified in advance, without major contradictions. But I believe that from the standpoint of the left today there is a tacit awareness of the centrality of the ecological agenda. More than a few philosophers in the twentieth century warned us about the negative impact of reducing our relationship with nature to its eminently technical dimension. However much technical development seems to assure us of the domination of nature, the very fact of understanding the human relationship with nature under the sign of domination is already a serious problem. So the idea that, yes, we live in a country with greater development needs because there are urgent demands of social inclusion does not invalidate the fact that we are in the middle of a process of reflection on what social wealth means. Does social wealth mean having a given set of consumer goods, individual transport, an extractivist relationship with natural energy? Or does it mean being able to create a model of relationship with nature that fundamentally guarantees quality of life? That is a fine question that only the ecological debate has been able to pose.

As with urban movements such as Occupy, does the ecological agenda outline a horizon where another model of society is possible, increasingly critiquing the power of the financial system in order to block it?

The ecological agenda strikes the model in its clearest economic sphere when it states that we do not want a situation in which all economic agents are subordinated to the interests of half a dozen multinationals that control not only the production structure but also the development of technology. When we talk about family farming, what does that mean? That, as an economic model, a brutal concentration of land, technology and inputs cannot be allowed. Insisting on family farming is, among other things, insisting on a radical dispersion of ownership not only of land but of goods and techniques. Because if that does not happen, you get not only very brutal demographic consequences, such as the swelling of urban peripheries, but also a kind of situation in which the creativity inherent in the dispersion of techniques is lost. Thousands of producers will not produce the same things, nor under the same conditions.

For example?

For example, when these ecological questions are linked to the problem of food sovereignty. The fact that you have an agricultural policy that keeps wiping out food diversity is not only a matter of safeguarding traditions – I would be the last person to make an abstract defense of the particularity of traditions. Among other things, we must recognize that tradition has a dimension of experience that will be very important for us when we are able to understand how food knowledge was constituted and what it guarantees. There is a very strong monopolistic tendency; in recent decades we have seen something that lies at the base of the Marxist tradition, the idea that a moment will come when the very notion of competition begins to disappear. This process of concentration takes the relationship with nature by storm, in the most brutal way possible. All these peasant movements, such as Via Campesina, insist that there is not only an economic but a social risk in allowing agricultural activities to be concentrated in the hands of multinationals. Societies will pay dearly if they fail to block this process.

Picking up on that Via Campesina example, there are more and more reports of traditional populations cornered by this development model, and yet these quite concrete and verifiable accounts are delegitimized…

There is an attempt to disqualify these resistances as a kind of archaism. It is as if they were saying, “you need to understand that you have an absolutely romantic view of the world.” It is a discourse that condemns “the critique of the Enlightenment,” in the end. It is telling, this attempt to turn these struggles into a kind of supreme proof of the conservatism of certain populations, which are in fact the most vulnerable ones, since they know that when these companies arrive they will simply be pushed aside. When Petrobras arrives to explore for oil in the basins, the lives of the fishermen are the last thing it will think about. “Can you imagine worrying about fish when the country wants to become a great oil power?” That is the perspective they want to sell, but a fundamental question for the left is knowing how to defend the most vulnerable sectors of society. There is a rhetorical model that tries to make us believe that all resistance is, at bottom, a refusal of progress. I think it is important to restate clearly what ‘progress’ means in this context. Progress is supposed to meet certain fundamental demands of well-being. Scientific progress is not simply a process of dominating nature, but also a process of optimizing human well-being. Yet this so-called ‘progress’ promises populations a better quality of life and ends up producing the opposite. For that inversion not to occur, a drastic reconstitution of our models of relationship with nature is necessary. And the interesting thing is that, in this process, another consciousness of social organization is born.

* Interview conducted by Maíra Mathias for Poli magazine no. 24, July-August 2012

** Interview shared by the Escola Politécnica de Saúde Joaquim Venâncio (EPSJV/Fiocruz), published by EcoDebate, September 6, 2012

[ EcoDebate content is “copyleft” and may be copied, reproduced and/or distributed, provided that credit is given to the author, to EcoDebate and, where applicable, to the primary source of the information ]

Organized supporter-group violence gains ever more attention in academia (Jornal da Ciência)

JC e-mail 4577, September 5, 2012.

Clarissa Vasconcellos – Jornal da Ciência

With the 2014 World Cup in Brazil approaching, violence by organized supporter groups comes into focus and attracts the interest of researchers.

In recent weeks, the news has once again filled with reports of clashes between fans who have been showing little sporting spirit inside and outside the stadiums. The recent death of a fan and fights between organized supporter groups have prompted debate and sanctions. Scholars who have been analyzing the phenomenon for decades, in Brazil and abroad, bring new data on the problem.

That is the case of Heloísa Reis, researcher and professor at Unicamp’s School of Physical Education (FEF), who has been investigating the subject for 17 years. One of her motivations was a personal experience: a former football player, she felt first-hand the hostility of male fans, something very common in the 1980s. “My indignation at that symbolic violence gave me a strong interest in understanding how fans behave and what leads them to curse the players and fight among themselves,” Heloísa tells Jornal da Ciência.

Internationally recognized – she has just returned from London, where she was invited to work with the police during the Olympics – Heloísa says the subject is a highly relevant social issue: besides having caused a high number of deaths in recent years, it is a situation that often mobilizes the whole city, especially in the wide radius around the stadiums. “Residents are tense, the entire public security apparatus gets involved and public transport is affected,” she notes.

According to figures released on August 30 by the Special Stadium Policing Group (Gepe), more than 370 fans have been arrested in Rio de Janeiro over the past year – 83 in the past week alone. The charges were inciting disorder, rioting and bodily harm on the way to or from the stadiums. In addition, Rio de Janeiro’s Civil Police announced the creation of the Major Events Support Unit (Nage) to combat criminal actions by members of football clubs’ organized supporter groups.

Similarities and differences – Besides studying violence in Brazilian football, Heloísa has researched the problem in other countries, such as Spain and England. What they have in common, she says, is that violent fans in every country (including the hooligans, considered the most violent of this group of individuals) are generally young men.

“The issue becomes more visible in the United Kingdom in the 1980s, but this kind of violence already existed in Brazil and Spain. Since then, we can also see an increase in the men’s age, as ‘adolescence’ is increasingly extended, even beyond the age of 30, for example,” she reports. Heloísa sees as common traits between foreign and Brazilian hooligans the fact that they are young men who get satisfaction from fighting and taking risks, using confrontation and clashes (with both rival fans and the police) for pleasure. “They also seek masculine self-affirmation; not by chance they keep the rival team’s clothing as a trophy of conquest,” she adds.

The countries do, however, show some significant differences among their violent fans. One of the main ones is that in several more developed societies access to firearms is difficult, which drastically reduces the number of deaths. “Brazil and Argentina are the countries with the most football deaths in South America because of access to firearms. Their arrival, mainly in the 1980s, greatly increased the number of deaths here,” she says, recalling that in the entire history of French football, for example, there have been three deaths from fan violence, while in Brazil the current figure is 69.

She also highlights a difference between the “code of conduct” of European and Latin American fans. “There, the intention is not to kill the enemy but to make him suffer,” she says. Another interesting finding by Heloísa in Brazil, more specifically in São Paulo’s organized supporter groups, is that 85% of the young people aged 15 to 24 surveyed live with their families, “which contradicts a media discourse that families need to return to the stadiums,” she says, since structured homes are no guarantee against this kind of conflict.

Alcohol and violence – Heloísa explains that, because football has become “an extremely valuable product in the world economy,” “there is great interest among more developed countries in having a prevention policy that guarantees safe, quality leisure.” That includes controlling alcoholic beverages in stadiums, according to the researcher.

“In every country that implemented a policy to prevent football-related violence, alcohol was banned at some point, since all of these countries found that most of the people detained were under the influence of alcohol,” she says. Heloísa regrets that, for the 2014 World Cup, “the economic interest of the breweries” prevailed in the so-called World Cup Law. “I find it reckless; I am very worried,” she warns.

The researcher is also troubled by the fact that some authorities think it is an exaggeration to worry about violent fans during the World Cup, on the grounds that fans of big clubs will not be involved. She recalls the clashes that took place during the 2006 World Cup in Germany, which even led to a documentary by the British broadcaster BBC.

On August 30, the Brazilian government released the strategic security plan to be applied at the 2014 World Cup, in which it lists violent fans, both domestic and foreign, as one of the risk factors. The plan, published in the Diário Oficial (http://www.in.gov.br/visualiza/index.jsp?data=30/08/2012&jornal=1&pagina=45&totalArquivos=120), was drawn up by the Extraordinary Secretariat for the Security of Major Events and includes all the actions to be implemented to guarantee the competition’s security.

The text details, for example, those responsible for the event’s security, the measures already planned, the objectives pursued, the resources invested, the preparations, cooperation with other countries and interaction with the private firms hired by Fifa for security. The installation of closed-circuit television and the identification of fans are seen as some of the successful measures used in European competitions.

“Here there is a major risk, especially in matches between England and Argentina or Brazil and Argentina,” Heloísa points out. She believes that, should these countries play each other, the ideal would be to move the match to the North or Northeast of Brazil, making access more difficult. Another preventive measure, which she observed on site at the London Olympics, would be joint work among all the national security agencies. “All the organizations that would have to provide some kind of aid or assistance were concentrated in the same room. The commands came from the same place; everything was very well orchestrated,” she recalls.

“Hooligans will certainly come to Brazil. Fewer may come because of how far away Brazil is, but they will come. They take advantage of match day, whether their club’s or their country’s, to fight. Their competition runs parallel to the game on the pitch,” she concludes.

Bits of Mystery DNA, Far From ‘Junk,’ Play Crucial Role (N.Y.Times)

By GINA KOLATA

Published: September 5, 2012

Among the many mysteries of human biology is why complex diseases like diabetes, high blood pressure and psychiatric disorders are so difficult to predict and, often, to treat. An equally perplexing puzzle is why one individual gets a disease like cancer or depression, while an identical twin remains perfectly healthy.

Béatrice de Géa for The New York Times. “It is like opening a wiring closet and seeing a hairball of wires,” Mark Gerstein of Yale University said of the DNA intricacies.

Now scientists have discovered a vital clue to unraveling these riddles. The human genome is packed with at least four million gene switches that reside in bits of DNA that once were dismissed as “junk” but that turn out to play critical roles in controlling how cells, organs and other tissues behave. The discovery, considered a major medical and scientific breakthrough, has enormous implications for human health because many complex diseases appear to be caused by tiny changes in hundreds of gene switches.

The findings, which are the fruit of an immense federal project involving 440 scientists from 32 laboratories around the world, will have immediate applications for understanding how alterations in the non-gene parts of DNA contribute to human diseases, which may in turn lead to new drugs. They can also help explain how the environment can affect disease risk. In the case of identical twins, small changes in environmental exposure can slightly alter gene switches, with the result that one twin gets a disease and the other does not.

As scientists delved into the “junk” — parts of the DNA that are not actual genes containing instructions for proteins — they discovered a complex system that controls genes. At least 80 percent of this DNA is active and needed. The result of the work is an annotated road map of much of this DNA, noting what it is doing and how. It includes the system of switches that, acting like dimmer switches for lights, control which genes are used in a cell and when they are used, and determine, for instance, whether a cell becomes a liver cell or a neuron.

“It’s Google Maps,” said Eric Lander, president of the Broad Institute, a joint research endeavor of Harvard and the Massachusetts Institute of Technology. In contrast, the project’s predecessor, the Human Genome Project, which determined the entire sequence of human DNA, “was like getting a picture of Earth from space,” he said. “It doesn’t tell you where the roads are, it doesn’t tell you what traffic is like at what time of the day, it doesn’t tell you where the good restaurants are, or the hospitals or the cities or the rivers.”

The new result “is a stunning resource,” said Dr. Lander, who was not involved in the research that produced it but was a leader in the Human Genome Project. “My head explodes at the amount of data.”

The discoveries were published on Wednesday in six papers in the journal Nature and in 24 papers in Genome Research and Genome Biology. In addition, The Journal of Biological Chemistry is publishing six review articles, and Science is publishing yet another article.

Human DNA is “a lot more active than we expected, and there are a lot more things happening than we expected,” said Ewan Birney of the European Molecular Biology Laboratory-European Bioinformatics Institute, a lead researcher on the project.

In one of the Nature papers, researchers link the gene switches to a range of human diseases — multiple sclerosis, lupus, rheumatoid arthritis, Crohn’s disease, celiac disease — and even to traits like height. In large studies over the past decade, scientists found that minor changes in human DNA sequences increase the risk that a person will get those diseases. But those changes were in the junk, now often referred to as the dark matter — they were not changes in genes — and their significance was not clear. The new analysis reveals that a great many of those changes alter gene switches and are highly significant.

“Most of the changes that affect disease don’t lie in the genes themselves; they lie in the switches,” said Michael Snyder, a Stanford University researcher for the project, called Encode, for Encyclopedia of DNA Elements.

And that, said Dr. Bradley Bernstein, an Encode researcher at Massachusetts General Hospital, “is a really big deal.” He added, “I don’t think anyone predicted that would be the case.”

The discoveries also can reveal which genetic changes are important in cancer, and why. As they began determining the DNA sequences of cancer cells, researchers realized that most of the thousands of DNA changes in cancer cells were not in genes; they were in the dark matter. The challenge is to figure out which of those changes are driving the cancer’s growth.

“These papers are very significant,” said Dr. Mark A. Rubin, a prostate cancer genomics researcher at Weill Cornell Medical College. Dr. Rubin, who was not part of the Encode project, added, “They will definitely have an impact on our medical research on cancer.”

In prostate cancer, for example, his group found mutations in important genes that are not readily attacked by drugs. But Encode, by showing which regions of the dark matter control those genes, gives another way to attack them: target those controlling switches.

Dr. Rubin, who also used the Google Maps analogy, explained: “Now you can follow the roads and see the traffic circulation. That’s exactly the same way we will use these data in cancer research.” Encode provides a road map with traffic patterns for alternate ways to go after cancer genes, he said.

Dr. Bernstein said, “This is a resource, like the human genome, that will drive science forward.”

The system, though, is stunningly complex, with many redundancies. Just the idea of so many switches was almost incomprehensible, Dr. Bernstein said.

There also is a sort of DNA wiring system that is almost inconceivably intricate.

“It is like opening a wiring closet and seeing a hairball of wires,” said Mark Gerstein, an Encode researcher from Yale. “We tried to unravel this hairball and make it interpretable.”

There is another sort of hairball as well: the complex three-dimensional structure of DNA. Human DNA is such a long strand — about 10 feet of DNA stuffed into a microscopic nucleus of a cell — that it fits only because it is tightly wound and coiled around itself. When they looked at the three-dimensional structure — the hairball — Encode researchers discovered that small segments of dark-matter DNA are often quite close to genes they control. In the past, when they analyzed only the uncoiled length of DNA, those controlling regions appeared to be far from the genes they affect.

The project began in 2003, as researchers began to appreciate how little they knew about human DNA. In recent years, some began to find switches in the 99 percent of human DNA that is not genes, but they could not fully characterize or explain what a vast majority of it was doing.

The thought before the start of the project, said Thomas Gingeras, an Encode researcher from Cold Spring Harbor Laboratory, was that only 5 to 10 percent of the DNA in a human being was actually being used.

The big surprise was not only that almost all of the DNA is used but also that a large proportion of it is gene switches. Before Encode, said Dr. John Stamatoyannopoulos, a University of Washington scientist who was part of the project, “if you had said half of the genome and probably more has instructions for turning genes on and off, I don’t think people would have believed you.”

By the time the National Human Genome Research Institute, part of the National Institutes of Health, embarked on Encode, major advances in DNA sequencing and computational biology had made it conceivable to try to understand the dark matter of human DNA. Even so, the analysis was daunting — the researchers generated 15 trillion bytes of raw data. Analyzing the data required the equivalent of more than 300 years of computer time.

Just organizing the researchers and coordinating the work was a huge undertaking. Dr. Gerstein, one of the project’s leaders, has produced a diagram of the authors with their connections to one another. It looks nearly as complicated as the wiring diagram for the human DNA switches. Now that part of the work is done; the hundreds of authors have written their papers.

“There is literally a flotilla of papers,” Dr. Gerstein said. But, he added, more work has yet to be done — there are still parts of the genome that have not been figured out.

That, though, is for the next stage of Encode.

*   *   *

Published: September 5, 2012

Rethinking ‘Junk’ DNA

A large group of scientists has found that so-called junk DNA, which makes up most of the human genome, does much more than previously thought.

GENES: Each human cell contains about 10 feet of DNA, coiled into a dense tangle. But only a very small percentage of DNA encodes genes, which control inherited traits like eye color, blood type and so on.

JUNK DNA: Stretches of DNA around and between genes seemed to do nothing, and were called junk DNA. But now researchers think that the junk DNA contains a large number of tiny genetic switches, controlling how genes function within the cell.

REGULATION: The many genetic regulators seem to be arranged in a complex and redundant hierarchy. Scientists are only beginning to map and understand this network, which regulates how cells, organs and tissues behave.

DISEASE: Errors or mutations in genetic switches can disrupt the network and lead to a range of diseases. The new findings will spur further research and may lead to new drugs and treatments.

 

Evolution could explain the placebo effect (New Scientist)

06 September 2012 by Colin Barras

Magazine issue 2881

ON THE face of it, the placebo effect makes no sense. Someone suffering from a low-level infection will recover just as nicely whether they take an active drug or a simple sugar pill. This suggests people are able to heal themselves unaided – so why wait for a sugar pill to prompt recovery?

New evidence from a computer model offers a possible evolutionary explanation, and suggests that the immune system has an on-off switch controlled by the mind.

It all starts with the observation that something similar to the placebo effect occurs in many animals, says Peter Trimmer, a biologist at the University of Bristol, UK. For instance, Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.

Likewise, those people who think they are taking a drug but are really receiving a placebo can have a response which is twice that of those who receive no pills (Annals of Family Medicine, doi.org/cckm8b). In Siberian hamsters and people, intervention creates a mental cue that kick-starts the immune response.

There is a simple explanation, says Trimmer: the immune system is costly to run – so costly that a strong and sustained response could dangerously drain an animal’s energy reserves. In other words, as long as the infection is not lethal, it pays to wait for a sign that fighting it will not endanger the animal in other ways.

Nicholas Humphrey, a retired psychologist formerly at the London School of Economics, first proposed this idea a decade ago, but only now has evidence to support it emerged from a computer model designed by Trimmer and his colleagues.

According to Humphrey’s picture, the Siberian hamster subconsciously acts on a cue that it is summer because food supplies to sustain an immune response are plentiful at that time of year. We subconsciously respond to treatment – even a sham one – because it comes with assurances that it will weaken the infection, allowing our immune response to succeed rapidly without straining the body’s resources.

Trimmer’s simulation is built on this assumption – that animals need to spend vital resources on fighting low-level infections. The model revealed that, in challenging environments, animals lived longer and sired more offspring if they endured infections without mounting an immune response. In more favourable environments, it was best for animals to mount an immune response and return to health as quickly as possible (Evolution and Human Behavior, doi.org/h8p). The results show a clear evolutionary benefit to switching the immune system on and off depending on environmental conditions.
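The trade-off Trimmer describes can be illustrated with a toy simulation. The sketch below is not the published model; it simply encodes the assumptions stated above: mounting an immune response clears an infection at a large one-off energy cost, while tolerating it imposes a small chronic drain plus some risk of the infection turning serious. It then compares the two strategies in an energy-poor and an energy-rich environment. All parameter values are illustrative assumptions.

import random

# Toy illustration (not the published model) of the cost trade-off described
# above. All numbers are assumed for illustration only.
IMMUNE_COST = 6.0        # energy burned by a full immune response
CHRONIC_DRAIN = 0.5      # per-season cost of carrying an untreated infection
ESCALATION_RISK = 0.05   # per-season chance an untreated infection proves fatal
INFECTION_RATE = 0.25    # chance of catching a low-level infection each season
REPRO_COST, RESERVE = 2.0, 3.0   # energy per offspring; buffer kept in reserve

def lifetime_offspring(richness, mount_response, seasons=50):
    """Offspring produced by one animal following one strategy."""
    energy, infected, offspring = 5.0, False, 0
    for _ in range(seasons):
        energy += richness                        # seasonal energy income
        if not infected and random.random() < INFECTION_RATE:
            infected = True
        if infected:
            if mount_response:
                energy -= IMMUNE_COST             # pay once, clear the infection
                infected = False
            else:
                energy -= CHRONIC_DRAIN           # tolerate it, pay a little each season
                if random.random() < ESCALATION_RISK:
                    return offspring              # untreated infection proved fatal
        if energy <= 0:
            return offspring                      # reserves exhausted: death
        while energy >= REPRO_COST + RESERVE:     # surplus energy becomes offspring
            energy -= REPRO_COST
            offspring += 1
    return offspring

def mean_offspring(richness, respond, n=20000):
    return sum(lifetime_offspring(richness, respond) for _ in range(n)) / n

if __name__ == "__main__":
    for label, richness in (("harsh", 1.0), ("favourable", 4.0)):
        print(f"{label:>10}: respond = {mean_offspring(richness, True):.1f}, "
              f"tolerate = {mean_offspring(richness, False):.1f}")

Run under these assumptions, tolerating the infection tends to leave more offspring in the harsh setting, while mounting the response tends to win in the rich one, mirroring the switch-like behaviour the model is meant to capture.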

“I’m pleased to see that my theory stands up to computational modelling,” says Humphrey. If the idea is right, he adds, it means we have misunderstood the nature of placebos. Farming and other innovations in the past 10,000 years mean that many people have a stable food supply and can safely mount a full immune response at any time – but our subconscious switch has not yet adapted to this. A placebo tricks the mind into thinking it is an ideal time to switch on an immune response, says Humphrey.

Paul Enck at the University of Tübingen in Germany says it is an intriguing idea, but points out that there are many different placebo responses, depending on the disease. It is unlikely that a single mechanism explains them all, he says.

Ecosystems Cope With Stress More Effectively the Greater the Biodiversity (Science Daily)

ScienceDaily (Sep. 5, 2012) — Ecosystems with a high degree of biodiversity can cope with more stress, such as higher temperatures or increasing salt concentrations, than those with less biodiversity. They can also maintain their services for longer, as botanists and ecologists from the universities of Zurich and Göttingen have discovered. Their study provides the first evidence of the relationship between stress intensity and ecosystem functioning.

Higher average temperatures and increasing salt concentrations are stress factors that many ecosystems face today in the wake of climate change. However, do all ecosystems react to stress in the same way and what impact does stress have on ecosystem services, such as biomass production? Botanists and ecologists from the universities of Zurich and Göttingen demonstrate that a high level of biodiversity aids stress resistance.

Higher number of species leads to greater stress resistance

The scientists studied a total of 64 species of single-celled microalgae from the SAG Culture Collection of Algae in Göttingen. These are at the bottom of the food chain and absorb environmentally harmful CO2 via photosynthesis. “The more species of microalgae there are in a system, the more robust the system is under moderate stress compared to those with fewer species,” says first author Bastian Steudel, explaining one of the results. Systems with a higher number of species can thus keep their biomass production stable for longer than those with less biodiversity.

In all, the researchers studied six different intensities of two stress gradients. In the case of very high intensities, the positive effects of biodiversity decreased or ceased altogether. However, increasing stress in systems with few species had a considerably more negative impact than in those with high biodiversity levels. “The study shows that a high degree of biodiversity under stress is especially important to maintain biomass production,” says Steudel’s PhD supervisor Michael Kessler, summing up the significance of the research project.

Journal Reference:

  1. Bastian Steudel, Andy Hector, Thomas Friedl, Christian Löfke, Maike Lorenz, Moritz Wesche, Michael Kessler. Biodiversity effects on ecosystem functioning change along environmental stress gradients. Ecology Letters, 2012; DOI: 10.1111/j.1461-0248.2012.01863.x

Salamanders Display Survival Techniques in Period of Extreme Drought (Science Daily)

ScienceDaily (Sep. 5, 2012) — The stress of drought is acutely felt by aquatic animals such as salamanders. The extreme drought in the southeastern United States in 2007-2008 provided an opportunity to study how salamanders react and survive during such dry conditions. It also gave us clues as to how salamanders and other aquatic organisms may react to global warming.

The journal Herpetologica reports on a 5-year study of the Northern Dusky Salamander, common to eastern North America. From 2005 to 2009, including two severe drought years, the presence of salamanders was recorded at 17 first-order streams in the Piedmont region of North Carolina. Data on the amphibians’ presence were established by capturing, marking, and recapturing salamanders over the course of the study.

Researchers found that the adult salamanders had a high rate of survival over the course of the study, even during the drought years. The abundance of larval salamanders, however, decreased by an average of 30 percent during the drought. This differential mortality suggests a between-generation survival strategy, with the high survival rate of adults mitigating the effect of drought on the numbers of larvae.

During the extreme drought, water levels reached a 110-year low. Many streams were dry for periods of 2 to 3 months at a time, reduced to pools rather than flowing water. These conditions brought about another survival strategy, temporary migration of adult salamanders — at twice the rate of non-drought years. They moved from stream beds to underground or high-humidity refuges. Crayfish burrows and rocks provided shelter from the hot and dry conditions.

Because climate change is expected to bring warming trends and more drought, this study offers implications for the survival of stream-dwelling salamanders. An increase in the mortality of larvae, or early metamorphosis, could mean declines in salamander fitness and size.

Journal Reference:

  1. Steven J. Price, Robert A. Browne, and Michael E. Dorcas. Resistance and Resilience of a Stream Salamander to Supraseasonal Drought. Herpetologica, September 2012; 68 (3): 312-323. DOI: 10.1655/HERPETOLOGICA-D-11-00084.1

Women are more vulnerable to the impacts of global climate change (Fapesp)

Socioeconomic and cultural factors amplify women's vulnerability to disasters caused by extreme climate events, says a Mexican researcher who is a member of the IPCC (photo: Eduardo Cesar/FAPESP)

06/09/2012

By Elton Alisson

Agência FAPESP – Women and girls currently account for 72% of all people living in extreme poverty worldwide. Because of this, combined with a series of other socioeconomic and cultural factors, they are today the main victims of disasters caused by extreme climate events such as floods and hurricanes.

The figures were presented by Mexican physician and anthropologist Úrsula Oswald Spring during the workshop “Managing the risks of climate extremes and disasters in South America – What can we learn from the IPCC Special Report on extremes?”, held in August by FAPESP in São Paulo.

A professor at the National Autonomous University of Mexico and a member of the IPCC, she explains in an interview with Agência FAPESP why this is so and what actions are needed to reduce the vulnerability of women and girls to the impacts of climate change.

Agência FAPESP – Which human groups are most vulnerable to the impacts of global climate change?
Úrsula Oswald Spring – First, women and girls. Second, indigenous groups taking refuge in communities whose languages and cultures differ from their own. And third, everyone living in cities in extreme poverty, in areas of high risk and violence, without government support, undocumented, unemployed and exposed to severe weather. Coincidentally, these three human groups are also the most discriminated against. There is a problem of structural discrimination and a catastrophic combination of socioeconomic, environmental and cultural factors that amplify these three groups' vulnerability to the impacts of climate change.

Agência FAPESP – What makes women and girls more vulnerable to the impacts of climate change?
Úrsula Oswald Spring – Worldwide, they account for 72% of the extremely poor, and without financial resources it is very hard to cope with the impacts of extreme climate events. In addition, women have been raised to care for others, so we take on the role of “mother of everyone”. This process, which I frame in terms of the theory of social representations, also makes us more vulnerable, because our role is to protect others first and only then worry about ourselves. Behind all of this there also persists, and has for thousands of years, an exclusionary political system reinforced by every religious belief system, known as the patriarchal system, which prescribes the authority of one being, the man, and results in a great deal of violence, exclusion and discrimination against women. Capitalism, in turn, took advantage of the patriarchal system and built a vertical, exclusionary, authoritarian and violent system that has allowed 1,200 men to command half of the entire planet today, while women have little power of decision or veto over issues that affect them directly.

Agência FAPESP – Given this reality, what needs to be done to reduce the vulnerability of women and girls to the impacts of extreme climate events?
Úrsula Oswald Spring – It is not worth destroying, for example, women's capacity to want to be the mother of everyone. But it is necessary to train them so that this process of caring for others becomes more efficient and is not carried out at the cost of their own lives, so that it can benefit a whole group of people, including themselves and their daughters. And that implies better conditions for them to have greater decision-making power.

Agência FAPESP – How could this be accomplished?
Úrsula Oswald Spring – Above all, by giving women greater access to education. According to the World Bank, every Islamic country that invests in the education of its women immediately increases its GDP by 1%. Another measure is to give more visibility to women's work, which is often undervalued. In the United States, women's work accounts for 38% of GDP. This economic participation by women needs to be made visible. Beyond that, laws are needed that guarantee greater equity and participation by women in all decision-making processes. We would have to use quota systems for women to reverse discrimination, which would be a step toward ensuring greater equity. Sadly, the catastrophes and disasters caused by extreme climate events will end up helping the process of giving women more power.

Agência FAPESP – In what way?
Úrsula Oswald Spring – In Mexico, for example, peasant farm production is in the hands of men. But it is passing into the hands of women, because the men have migrated to the United States in search of work. In their new role as heads of household, women are having to make decisions on all sorts of issues. We need to help them in this process of “empowerment” by giving them access to sustainable technologies that allow them, for example, to protect themselves from the risks of disasters caused by extreme climate events.

Agência FAPESP – Besides “empowerment”, which is a long-term process, what more urgent actions should be taken to prepare women to face extreme climate events?
Úrsula Oswald Spring – Women need to be enabled and trained so that, in a moment of imminent danger, for example, they have the right to leave the house. Many communities forbid a woman from leaving home unless she is accompanied by a man. That is discrimination and a form of control that must be overcome with gender-equity laws. In addition, women need to be trained to swim, to run, to climb a tree, and allowed to wear clothing better suited to those activities. I watched the London Olympic Games and was struck by the outfits of the Saudi Arabian swimmers and runners. Although they were dressed differently from athletes of other countries, at least they wore trousers that allowed them to run without violating religious codes. That is the kind of measure we could spread more widely. We could take advantage of the Olympic Games to promote this kind of action in all Islamic countries, and offer swimming and running courses for women.

Agência FAPESP – Among the three groups you identify as most vulnerable to the impacts of climate change, which shows the greatest resilience?
Úrsula Oswald Spring – Only indigenous peoples have the capacity, acquired over thousands of years, to manage very difficult situations without international, national or state help, relying only on themselves. They have adapted to climate change and for thousands of years have cultivated, in the same way, crops such as potatoes that are resistant to drought, cold and heat, and they have developed very efficient and inexpensive systems for irrigating and fertilizing the land. We need to draw on this traditional knowledge and link it to modern technologies in order to adapt to climate change. But we are losing this traditional knowledge, because the last generation of indigenous people who still hold it, who are young, has already gone through school, speaks languages other than their mother tongue and is losing its indigenous culture. If we do nothing, we will lose, worldwide, traditional knowledge that would allow local solutions for coping with climate change to be developed.

Agência FAPESP – What initiatives exist today to bring traditional and scientific knowledge closer together?
Úrsula Oswald Spring – In Mexico, for example, the Universidade Campesina do Sul (Peasant University of the South) was created. It brings together local groups, which today are made up mostly of women (20 years ago they were mostly men), and based on those groups' needs we spread an educational process built on Paulo Freire's method, in which people learn from their own reality.

Agência FAPESP – What is taught at the Universidade Campesina do Sul?
Úrsula Oswald Spring – One of the topics we work on is organic agriculture, teaching women to keep family vegetable gardens in order to secure food for themselves and their families. Another is water management. There is a great deal of non-potable water, such as water used for washing hands, that is very easy to treat and can be used, together with organic waste from dry toilets, as a soil improver to help restore the soil's natural fertility. Another topic we have devoted ourselves to is alternative medicine. Modern medicine is very expensive and most people cannot afford to use the health system. Because of that, we are creating ways to integrate traditional Mexican medicine, which uses herbs and traditional healing methods such as steam treatments, with modern medicine. It is a set of actions aimed at making the most of both scientific and traditional knowledge and at seeking solutions for collectively facing problems of all kinds, including climate change. Because it is not large works that protect people from a catastrophe caused by an extreme climate event such as a flood, but small ones, provided they are very efficient.

Agência FAPESP – In your opinion, how will it be possible to confront the risks of climate change on a global scale at a time when many countries are going through severe economic crises and have more urgent problems to solve?
Úrsula Oswald Spring – There is great uncertainty surrounding climate change because, beyond the economic crises, a large share of the world's population has never witnessed a disaster caused by an extreme climate event. But precisely because some people have not yet been through such a situation, we need to think about ways to prepare for extreme climate events, which will occur more frequently in the coming years. One way to do this is to decentralize the management of climate change risks, taking into account the particular conditions of each region. The climate problem in the Amazon, for example, is not the same as the one in the high Andes. The kinds of management those regions need are very different. That is why countries need to decentralize their actions. Countries' management of climate change risks will depend on good local management. The first 10 minutes of a risk situation such as a flood or landslide are crucial, and no international aid can come to the rescue in that time. That is why it is necessary to invest heavily in prevention and training at the local level to face the risks of an extreme climate event.

When Do We Lie? When We’re Short On Time and Long On Reasons (Science Daily)

ScienceDaily (Sep. 5, 2012) — Almost all of us have been tempted to lie at some point, whether about our GPA, our annual income, or our age. But what makes us actually do it?

In a study forthcoming in Psychological Science, a journal of the Association for Psychological Science, psychological scientists Shaul Shalvi of the University of Amsterdam and Ori Eldar and Yoella Bereby-Meyer of Ben-Gurion University of the Negev investigated what factors influence dishonest behavior.

Previous research shows that a person’s first instinct is to serve his or her own self-interest. And research also shows that people are more likely to lie when they can justify such lies to themselves. With these findings in mind, Shalvi and colleagues hypothesized that, when under time pressure, having to make a decision that could yield financial reward would make people more likely to lie. They also hypothesized that, when people are not under time pressure, they are unlikely to lie if there is no opportunity to rationalize their behavior.

“According to our theory, people first act upon their self-serving instincts, and only with time do they consider what socially acceptable behavior is,” says Shalvi. “When people act quickly, they may attempt to do all they can to secure a profit — including bending ethical rules and lying. Having more time to deliberate leads people to restrict the amount of lying and refrain from cheating.”

The researchers first tested participants’ tendency to lie when doing so could be easily justified: Approximately 70 adult participants rolled a die three times such that the result was hidden from the experimenter’s view. The participants were told to report the first roll, and they earned more money for a higher reported roll.

Seeing the outcomes of the second and third rolls provided the participants with the opportunity to justify reporting the highest number that they rolled, even if it was not the first — after all, they had rolled that number, just not the first time they rolled the die. Some of the participants were under time pressure, and were instructed to report their answer within 20 seconds. The others were not under time pressure, and had an unlimited amount of time to provide a response.

The experimenters were not able to see the actual die rolls of the participants, to ensure all rolls were private. Instead, in order to determine whether or not the participants had lied about the numbers they rolled, Shalvi and colleagues compared their responses to those that would be expected from fair rolls. They found that both groups of participants lied, but those who were given less time to report their numbers were more likely to lie than those who weren’t under a time constraint.
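
To make the group-level comparison concrete, here is a minimal Python sketch of the logic: reported rolls are never observed directly, so dishonesty is inferred by comparing the reported distribution with what fair rolls predict. The cheat rate, the function names and the "report the highest roll" behavior are illustrative assumptions, not the authors' code or data.

```python
import random
from collections import Counter

def simulate_reports(n_participants, cheat_rate=0.3, rolls_per_person=3):
    """Hypothetical participants: an honest person reports the first roll,
    a 'cheater' reports the highest of the three rolls (the self-justified
    lie described in the study)."""
    reports = []
    for _ in range(n_participants):
        rolls = [random.randint(1, 6) for _ in range(rolls_per_person)]
        reports.append(max(rolls) if random.random() < cheat_rate else rolls[0])
    return reports

reports = simulate_reports(70)
observed_mean = sum(reports) / len(reports)
print("observed counts per face:", dict(sorted(Counter(reports).items())))
print(f"observed mean {observed_mean:.2f} vs. 3.50 expected from fair rolls")
# A group mean well above 3.5, or an excess of 5s and 6s relative to the
# uniform 1/6 expectation, signals lying at the group level without
# identifying any individual participant.
```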

The second experiment followed a similar procedure, except that the participants were not given information that could help them justify their lies: instead of rolling their die three times, they only rolled it once and then reported the outcome. In this experiment, the researchers found that participants who were under time pressure lied, while those without a time constraint did not.

Together, the two experiments suggest that, in general, people are more likely to lie when time is short. When time isn’t a concern, people may only lie when they have justifications for doing so.

“One implication of the current findings is that to increase the likelihood of honest behavior in business or personal settings, it is important not to push a person into a corner but rather to give him or her time,” explains Shalvi. “People usually know it is wrong to lie; they just need time to do the right thing.”

Embrapa sends corn and rice seeds to the Svalbard seed vault in Norway (O Globo)

JC e-mail 4577, September 5, 2012.

The Nordic vault is the most secure in the world, built to withstand climate catastrophes and a nuclear explosion.

This week Embrapa is sending 264 representative samples of corn seeds and 541 of rice seeds to the Svalbard Global Seed Vault in Norway, as part of the agreement signed with the country's Royal Ministry of Agriculture and Food in 2008. The Norwegian gene bank will receive the core collections of rice and corn, that is, a limited set of accessions derived from a plant collection and chosen to represent the genetic variability of the entire collection. Traditionally, core collections are set up with around 10% of the accessions of the whole original collection and capture approximately 70% of its genetic diversity.

The choice of these crops follows one of the Svalbard vault's recommendations regarding relevance to food security and sustainable agriculture. Although they are not crops native to Brazil, they have been grown in the country for centuries and are hardy and well adapted to national conditions. The next crop to be sent to the Norwegian vault will be beans, which should happen by the end of 2012.

Sending samples to Svalbard is an additional safeguard, since the Nordic vault is the most secure in the world, built to withstand climate catastrophes and even a nuclear explosion. The vault has capacity for 4.5 million seed samples. The facility comprises three maximum-security chambers at the end of a 125-meter tunnel inside a mountain on a small island of the Svalbard archipelago, at latitude 78° N, near the North Pole.

The seeds are stored at 20°C below zero in hermetically sealed packets, kept in boxes stored on shelves. The depot is surrounded by the glacial Arctic climate, which keeps temperatures low even if the electric power supply fails. The low temperature and humidity ensure low metabolic activity, keeping the seeds viable for a millennium or more.

Kinsey Reporter: Free App Allows Public to Anonymously Report, Share Information On Sexual Behavior (Science Daily)

ScienceDaily (Sep. 5, 2012) — Indiana University has released Kinsey Reporter, a global mobile survey platform for collecting and reporting anonymous data about sexual and other intimate behaviors. The pilot project allows citizen observers around the world to use free applications now available for Apple and Android mobile platforms to not only report on sexual behavior and experiences, but also to share, explore and visualize the accumulated data.

The Kinsey Reporter platform is available free from Apple iOS and Google Play (for Android) online stores. Reports made by anonymous citizen scientists will be used for research and shared with the public at the Kinsey Reporter website. (Credit: Image courtesy of Indiana University)

“People are natural observers. It’s part of being social, and using mobile apps is an excellent way to involve citizen scientists,” said Julia Heiman, director of The Kinsey Institute for Research in Sex, Gender and Reproduction. “We expect to get new insights into sexuality and relationships today. What do people notice, what are they involved in, and what can they relate to us about their lives and their communities?”

The project will collect anonymous data and then aggregate and share it openly. Kinsey Reporter is a joint project between The Kinsey Institute and the Center for Complex Networks and Systems Research, or CNetS, which is part of the IU School of Informatics and Computing and the Pervasive Technology Institute. Both Kinsey and CNetS are based on the IU Bloomington campus.

CNetS director Filippo Menczer called development of the citizen reporting platform an opportunity to gather information on important issues that may have been difficult to examine in the past.

“This new platform will allow us to explore issues that have been challenging to study until now, such as the prevalence of unreported sexual violence in different parts of the world, or the correlation between various sexual practices like condom use, for example, and the cultural, political, religious or health contexts in particular geographical areas. These were some of our initial motivations for the project,” he said.

Users simply download the free app and begin contributing observed information on topics such as sexual activity, public displays of affection, flirting, unwanted experiences and birth control use. Even though no information identifying the users submitting reports is collected or stored, the time and general location of each report are collected and entered into the database. Users also have the option of selecting their own geographic preference for the report by choosing city/town, state/region or country.

Surveys will change over time, and users can view aggregated reports by geographic region via interactive maps, timelines or charts. All of these reporting venues can be manipulated with filters that remove or add data based on specific survey topics and questions selected by the user.

Both Heiman and Menczer said The Kinsey Institute’s longstanding seminal studies of sexual behaviors created a perfect synergy with research going on at CNetS related to mining big data crowd-sourced from mobile social media. The sensitive domain — sexual relations — added an intriguing challenge in finding a way to share useful data with the community while protecting the privacy and anonymity of the reporting volunteers, they added.

Reports are transmitted to Kinsey Reporter using a secure, encrypted protocol, and the only data collected are a timestamp, the approximate geo-location selected by the user, and the tags the user chooses in response to various survey questions. The protections and anonymity provided to those responding to surveys allowed IU’s Institutional Review Board to classify the research as “exempt from review,” which allows the data to be used for research and shared without requiring informed consent from users of the apps.
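
To make the privacy model concrete, here is a small Python sketch of what such an anonymized report record could contain, based only on the fields described above; the class and field names are assumptions for illustration, not the app's actual schema or transmission protocol.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnonymousReport:
    """Hypothetical report record: only a timestamp, a user-selected
    approximate location, and survey tags are kept. Deliberately absent:
    user id, device id, exact GPS fix, or anything else identifying."""
    timestamp: datetime
    location: str                                   # city/town, state/region or country
    tags: list[str] = field(default_factory=list)   # answers to survey questions

report = AnonymousReport(
    timestamp=datetime.now(timezone.utc),
    location="Bloomington, Indiana, USA",
    tags=["public_display_of_affection", "observed"],
)
print(report)
```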

The Kinsey Reporter platform is now in public beta release. Apps are available for free download at both the Apple iOS and Android app stores. Accompanying the app release are a Kinsey Reporter website, a Twitter feed and a Facebook page. The four resources also provide links to information about sexuality, such as blogs and podcasts from the Kinsey Confidential website. YouTube videos on “What Is the Kinsey Reporter App” and “Making the Kinsey Reporter App” are also available for viewing.

The Kinsey Institute receives support from the Office of the Vice Provost for Research at IU Bloomington, which is dedicated to supporting ongoing faculty research and creative activity and developing new multidisciplinary initiatives to enhance opportunities for federal, state and private research funding.

First Holistic View of How Human Genome Actually Works: ENCODE Study Produces Massive Data Set (Science Daily)

ScienceDaily (Sep. 5, 2012) — The Human Genome Project produced an almost complete order of the 3 billion pairs of chemical letters in the DNA that embodies the human genetic code — but little about the way this blueprint works. Now, after a multi-year concerted effort by more than 440 researchers in 32 labs around the world, a more dynamic picture gives the first holistic view of how the human genome actually does its job.

William Noble, professor of genome sciences and computer science, in the data center at the William H. Foege Building. Noble, an expert on machine learning, and his team designed artificial intelligence programs to analyze ENCODE data. These computer programs can learn from experience, recognize patterns, and organize information into categories understandable to scientists. The center houses systems for a wide variety of genetic research. The computer center has the capacity to store and analyze a tremendous amount of data, the equivalent of a 670-page autobiography of each person on earth, uncompressed. The computing resources analyze over 4 petabytes of genomic data a year. (Credit: Clare McLean, Courtesy of University of Washington)

During the new study, researchers linked more than 80 percent of the human genome sequence to a specific biological function and mapped more than 4 million regulatory regions where proteins specifically interact with the DNA. These findings represent a significant advance in understanding the precise and complex controls over the expression of genetic information within a cell. The findings bring into much sharper focus the continually active genome in which proteins routinely turn genes on and off using sites that are sometimes at great distances from the genes themselves. They also identify where chemical modifications of DNA influence gene expression and where various functional forms of RNA, a form of nucleic acid related to DNA, help regulate the whole system.

“During the early debates about the Human Genome Project, researchers had predicted that only a few percent of the human genome sequence encoded proteins, the workhorses of the cell, and that the rest was junk. We now know that this conclusion was wrong,” said Eric D. Green, M.D., Ph.D., director of the National Human Genome Research Institute (NHGRI), a part of the National Institutes of Health. “ENCODE has revealed that most of the human genome is involved in the complex molecular choreography required for converting genetic information into living cells and organisms.”

NHGRI organized the research project producing these results; it is called the Encyclopedia of DNA Elements, or ENCODE. Launched in 2003, ENCODE’s goal of identifying all of the genome’s functional elements seemed just as daunting as sequencing that first human genome. ENCODE was launched as a pilot project to develop the methods and strategies needed to produce results and did so by focusing on only 1 percent of the human genome. By 2007, NHGRI concluded that the technology had sufficiently evolved for a full-scale project, in which the institute invested approximately $123 million over five years. In addition, NHGRI devoted about $40 million to the ENCODE pilot project, plus approximately $125 million to ENCODE-related technology development and model organism research since 2003.

The scale of the effort has been remarkable. Hundreds of researchers across the United States, United Kingdom, Spain, Singapore and Japan performed more than 1,600 sets of experiments on 147 types of tissue with technologies standardized across the consortium. The experiments relied on innovative uses of next-generation DNA sequencing technologies, which had only become available around five years ago, due in large part to advances enabled by NHGRI’s DNA sequencing technology development program. In total, ENCODE generated more than 15 trillion bytes of raw data and consumed the equivalent of more than 300 years of computer time to analyze.
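
A quick back-of-the-envelope calculation puts those figures in perspective; the core count below is purely hypothetical and not a description of ENCODE's actual computing setup.

```python
# Illustrative arithmetic only, using the figures quoted in the text.
raw_data_bytes = 15e12     # "more than 15 trillion bytes" of raw data
cpu_time_years = 300       # "300 years of computer time"
assumed_cores = 3000       # hypothetical cluster size

print(f"raw data: about {raw_data_bytes / 1e12:.0f} terabytes")
wall_clock_days = cpu_time_years * 365 / assumed_cores
print(f"300 CPU-years spread over {assumed_cores} cores is roughly "
      f"{wall_clock_days:.0f} days of wall-clock time")
```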

“We’ve come a long way,” said Ewan Birney, Ph.D., of the European Bioinformatics Institute, in the United Kingdom, and lead analysis coordinator for the ENCODE project. “By carefully piecing together a simply staggering variety of data, we’ve shown that the human genome is simply alive with switches, turning our genes on and off and controlling when and where proteins are produced. ENCODE has taken our knowledge of the genome to the next level, and all of that knowledge is being shared openly.”

The ENCODE Consortium placed the resulting data sets as soon as they were verified for accuracy, prior to publication, in several databases that can be freely accessed by anyone on the Internet. These data sets can be accessed through the ENCODE project portal (www.encodeproject.org) as well as at the University of California, Santa Cruz genome browser, http://genome.ucsc.edu/ENCODE/, the National Center for Biotechnology Information, http://www.ncbi.nlm.nih.gov/geo/info/ENCODE.html, and the European Bioinformatics Institute, http://useast.ensembl.org/Homo_sapiens/encode.html?redirect=mirror;source=www.ensembl.org.

“The ENCODE catalog is like Google Maps for the human genome,” said Elise Feingold, Ph.D., an NHGRI program director who helped start the ENCODE Project. “Simply by selecting the magnification in Google Maps, you can see countries, states, cities, streets, even individual intersections, and by selecting different features, you can get directions, see street names and photos, and get information about traffic and even weather. The ENCODE maps allow researchers to inspect the chromosomes, genes, functional elements and individual nucleotides in the human genome in much the same way.”

The coordinated publication set includes one main integrative paper and five related papers in the journal Nature; 18 papers in Genome Research; and six papers in Genome Biology. The ENCODE data are so complex that the three journals have developed a pioneering way to present the information in an integrated form that they call threads.

“Because ENCODE has generated so much data, we, together with the ENCODE Consortium, have introduced a new way to enable researchers to navigate through the data,” said Magdalena Skipper, Ph.D., senior editor at Nature, which produced the freely available publishing platform on the Internet.

Since the same topics were addressed in different ways in different papers, the new website, www.nature.com/encode, will allow anyone to follow a topic through all of the papers in the ENCODE publication set by clicking on the relevant thread at the Nature ENCODE explorer page. For example, thread number one compiles figures, tables, and text relevant to genetic variation and disease from several papers and displays them all on one page. ENCODE scientists believe this will illuminate many biological themes emerging from the analyses.

In addition to the threaded papers, six review articles are being published in the Journal of Biological Chemistry and two related papers in Science and one in Cell.

The ENCODE data are rapidly becoming a fundamental resource for researchers to help understand human biology and disease. More than 100 papers using ENCODE data have been published by investigators who were not part of the ENCODE Project, but who have used the data in disease research. For example, many regions of the human genome that do not contain protein-coding genes have been associated with disease. Instead, the disease-linked genetic changes appear to occur in vast tracts of sequence between genes where ENCODE has identified many regulatory sites. Further study will be needed to understand how specific variants in these genomic areas contribute to disease.

“We were surprised that disease-linked genetic variants are not in protein-coding regions,” said Mike Pazin, Ph.D., an NHGRI program director working on ENCODE. “We expect to find that many genetic changes causing a disorder are within regulatory regions, or switches, that affect how much protein is produced or when the protein is produced, rather than affecting the structure of the protein itself. The medical condition will occur because the gene is aberrantly turned on or turned off or abnormal amounts of the protein are made. Far from being junk DNA, this regulatory DNA clearly makes important contributions to human health and disease.”

Identifying regulatory regions will also help researchers explain why different types of cells have different properties. For example, why do muscle cells generate force while liver cells break down food? Scientists know that muscle cells turn on some genes that only work in muscle, but it has not previously been possible to examine the regulatory elements that control that process. ENCODE has laid a foundation for these kinds of studies by examining more than 140 of the hundreds of cell types found in the human body and identifying many of the cell type-specific control elements.

Despite the enormity of the dataset described in this historic collection of publications, it does not comprehensively describe all of the functional genomic elements in all of the different types of cells in the human body. NHGRI plans to invest in additional ENCODE-related research for at least another four years. During the next phase, ENCODE will increase the depth of the catalog with respect to the types of functional elements and cell types studied. It will also develop new tools for more sophisticated analyses of the data.

Journal References:

  1. Magdalena Skipper, Ritu Dhand, Philip Campbell. Presenting ENCODE. Nature, 2012; 489 (7414): 45. DOI: 10.1038/489045a
  2. Joseph R. Ecker, Wendy A. Bickmore, Inês Barroso, Jonathan K. Pritchard, Yoav Gilad, Eran Segal. Genomics: ENCODE explained. Nature, 2012; 489 (7414): 52. DOI: 10.1038/489052a
  3. The ENCODE Project Consortium. An integrated encyclopedia of DNA elements in the human genome. Nature, 2012; 489 (7414): 57. DOI: 10.1038/nature11247

Deadly Witch Hunts Targeted by Grassroots Women’s Groups (Science Daily)

ScienceDaily (Sep. 4, 2012) — Witch hunts are common and sometimes deadly in the tea plantations of Jalpaiguri, India. But a surprising source — small groups of women who meet through a government loan program — has achieved some success in preventing the longstanding practice, a Michigan State University sociologist found.

Basanti, shown here with children in her family, survived a witch hunt in India’s tea plantations. (Credit: Photo by Soma Chaudhuri)

Soma Chaudhuri spent seven months studying witch hunts in her native India and discovered that the economic self-help groups have made it part of their agenda to defend their fellow plantation workers against the hunts.

“It’s a grassroots movement and it’s helping provide a voice to women who wouldn’t otherwise have one,” said Chaudhuri, assistant professor of sociology and criminal justice. “I can see the potential for this developing into a social movement, but it’s not going to happen in a day because an entire culture needs to be changed.”

Witch hunts, she explained, are fueled by the tribal workers’ belief in the existence of witches and the desperate need of this poor, illiterate population to make sense of rampant diseases in villages with no doctors or medical facilities. There are some 84 million tribal people in India, representing about 8 percent of the country’s population.

In 2003, at a tea plantation in Jalpaiguri, five women were tied up, tortured and killed after being falsely accused of witchcraft in the death of a male villager who had suffered from a stomach illness.

Chaudhuri interviewed the villagers at length and found that such attacks are often impulsive and that the “witch” is often killed immediately. Widespread alcoholism is also a factor, she found.

But the study also documents examples of the women’s groups stopping potential attacks. In one case, a woman was accused of causing disease in livestock and an attack was planned. Members of the self-help groups gathered in a vigil around the woman’s home and surrounded the accuser’s home as well, stating their case to the accuser’s wife. Eventually the wife intervened and her husband recanted and “begged for forgiveness.”

Through the loan program, each woman is issued a low-interest, collateral-free “microcredit” loan of about 750 rupees ($18) to start her own business such as basket weaving, tailoring or selling chicken eggs. Participants meet in groups of about eight to 10 to support one another.

Chaudhuri said the loan program is run by nongovernmental activists who have been successful in encouraging the groups to look beyond the economic aspects and mobilize against domestic abuse, alcoholism and the practice of witch hunts.

Through the bonds of trust and friendship, group members have established the necessary social capital to collectively resist the deep-seated tradition of witch hunts, Chaudhuri said.

“Why would they go against something so risky, something that breaks tradition?” she said. “They do it because they believe in the ideals of the microcredit group — in women’s development, family development and gender equality.”

The study, which Chaudhuri co-authored with Anuradha Chakravarty of the University of South Carolina, appears in Mobilization, an international research journal.

Journal Reference:

  1. Anuradha Chakravarty, Soma Chaudhuri. Strategic Framing Work(s): How Microcredit Loans Facilitate Anti-Witch-Hunt Movements. Mobilization, 2012; 17 (2)

Violent Video Games Not So Bad When Players Cooperate (Science Daily)

ScienceDaily (Sep. 4, 2012) — New research suggests that violent video games may not make players more aggressive — if they play cooperatively with other people.

In two studies, researchers found that college students who teamed up to play violent video games later showed more cooperative behavior, and sometimes fewer signs of aggression, than students who played the games competitively.

The results suggest that it is too simplistic to say violent video games are always bad for players, said David Ewoldsen, co-author of the studies and professor of communication at Ohio State University.

“Clearly, research has established there are links between playing violent video games and aggression, but that’s an incomplete picture,” Ewoldsen said.

“Most of the studies finding links between violent games and aggression were done with people playing alone. The social aspect of today’s video games can change things quite a bit.”

The new research suggests playing a violent game with a teammate changes how people react to the violence.

“You’re still being very aggressive, you’re still killing people in the game — but when you cooperate, that overrides any of the negative effects of the extreme aggression,” said co-author John Velez, a graduate student in communication at Ohio State.

One study was recently published online in the journal Communication Research, and will appear in a future print edition. The second related study was published recently in the journal Cyberpsychology, Behavior and Social Networking.

The CBSN study involved 119 college students who were placed into four groups to play the violent video game Halo II with a partner. The groups differed in whether they competed or cooperated in playing the game.

First, all participants filled out a survey about their video game history and a measure of their aggressiveness.

Those in direct competition played in multiplayer mode and were told that their task was to kill their opponent more times than they were killed.

Those in indirect competition played in single-player mode, but were told their task was to beat their opponent by getting further in the game.

In the cooperative condition, participants were told to get as far as they could through the game by working with their partner in Halo II’s cooperative campaign mode. In this case, the pair worked together to defeat computer-controlled enemies.

The final group simply filled out the measures and played the game at the end of the study. Their game playing was not recorded.

After playing the violent video game, the same pairs of participants who played with or against each other took part in a real-life game where they had the opportunity to cooperate or compete with each other.

In this game, they played multiple rounds where they were given dimes which they could keep or share with their partner. The researchers were looking to see if they engaged in “tit for tat” behavior, in which the players mirrored the behaviors of their partner. In other words, if your partner acts cooperatively towards you, you do the same for him. Tit for tat behavior is seen by researchers as a precursor to cooperation.
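
The tit-for-tat pattern the researchers looked for can be sketched in a few lines of Python; this is an illustration of the concept, not their scoring procedure.

```python
def tit_for_tat(partner_history):
    """Share on the first round, then simply mirror the partner's last move."""
    return "share" if not partner_history else partner_history[-1]

def play_rounds(opponent_moves):
    """Play tit-for-tat against a fixed sequence of opponent moves."""
    my_moves, their_moves = [], []
    for move in opponent_moves:
        my_moves.append(tit_for_tat(their_moves))
        their_moves.append(move)
    return list(zip(my_moves, their_moves))

# Against a partner who shares twice, keeps twice, then shares again,
# tit-for-tat echoes each choice one round later.
print(play_rounds(["share", "share", "keep", "keep", "share"]))
```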

The results showed that participants who played the video game cooperatively were more likely than those who competed to show cooperative tendencies in this later real-life game.

“These findings suggest video game research needs to consider not only the content of the game but also how video game players are playing the game,” Velez said.

The second study, published in Communication Research, extended the findings by showing that cooperating in playing a violent video game can even unite people from rival groups — in this case, fans of Ohio State and those of their bitter rival, the University of Michigan.

This study involved 80 Ohio State students who, when they came to the lab for the experiment, were paired with a person who they thought was another student participant. In fact, it was one of the experimenters who was wearing an Ohio State t-shirt — or one from the rival University of Michigan.

One of the researchers made sure to point out the t-shirt to the student participant.

The student and confederate then played the highly realistic and violent first-person-shooter video game Unreal Tournament III together — either as teammates or as rivals.

After playing the video game, the participants played the same real-life game used in the previous study with their supposed partner, who was really one of the researchers.

They also completed tasks that measured how aggressive they felt, and their aggressive tendencies.

The results showed the power of cooperatively playing violent video games in reducing aggressive thoughts — and even overcoming group differences.

As in the first study, players who cooperated in playing the video game later showed more cooperation than did those who competed against each other.

It even worked when Ohio State participants thought they were playing with a rival from the University of Michigan.

“The cooperative play just wiped out any effect of who you were playing with,” Velez said. “Ohio State students happily cooperated with Michigan fans.”

Also, those participants who played cooperatively showed less aggressive tendencies afterwards than those who played competitively, at least at first. In fact, those who played competitively with a rival actually showed less aggression than those who played with a supporter of their own team.

“If you’re playing with a rival, and that rival is cooperating with you, that violates your expectations — you’re surprised by their cooperation and that makes you even more willing to cooperate,” Ewoldsen said.

Eventually, even those who competed with each other in the video games started cooperating with each other in the real-life games afterwards.

“The point is that the way you act in the real world very quickly overrides anything that is going on in the video games,” Ewoldsen said. “Video games aren’t controlling who we are.”

These results should encourage researchers to study not only how the content of violent video games affects players, but also how the style of play has an impact.

“What is more important: cooperating with another human being, or killing a digital creature?” Ewoldsen said.

“We think that cooperating with another human overrides the effects of playing a violent video game.”

Other authors of the CBSN paper were Cassie Eno of Waldorf College; Bradley Okdie of Ohio State’s Newark campus; Rosanna Guadagno of the University of Alabama; and James DeCoster of the University of Virginia.

Other authors of the Communication Research paper were Chad Mahood and Emily Moyer-Guse, both of Ohio State.

Journal References:

  1. J. A. Velez, C. Mahood, D. R. Ewoldsen, E. Moyer-Guse. Ingroup Versus Outgroup Conflict in the Context of Violent Video Game Play: The Effect of Cooperation on Increased Helping and Decreased Aggression. Communication Research, 2012; DOI: 10.1177/0093650212456202
  2. David R. Ewoldsen, Cassie A. Eno, Bradley M. Okdie, John A. Velez, Rosanna E. Guadagno, Jamie DeCoster. Effect of Playing Violent Video Games Cooperatively or Competitively on Subsequent Cooperative Behavior. Cyberpsychology, Behavior, and Social Networking, 2012; 15 (5): 277. DOI: 10.1089/cyber.2011.0308

How Language Change Sneaks in (Science Daily)

ScienceDaily (Sep. 4, 2012) — Languages are continually changing, not just words but also grammar. A recent study examines how such changes happen and what the changes can tell us about how speakers’ grammars work.

The study, “The course of actualization,” to be published in the September 2012 issue of the scholarly journal Language, is authored by Hendrik De Smet of the University of Leuven /Research Foundation Flanders.

Historical linguists, who document and study language change, have long noticed that language changes have a sneaky quality, starting small and unobtrusive and then gradually conquering more ground, a process termed ‘actualization’. De Smet’s study investigates how actualization proceeds by tracking and comparing different language changes, using large collections of digitized historical texts. This way, it is shown that any actualization process consists of a series of smaller changes with each new change building on and following from the previous ones, each time making only a minimal adjustment. A crucial role in this is played by similarity.

Consider the development of so-called downtoners — grammatical elements that minimize the force of the word they accompany. Nineteenth-century English saw the emergence of a new downtoner, all but, meaning ‘almost’. All but started out being used only with adjectives, as in her escape was all but miraculous. But later it also began to turn up with verbs, as in until his clothes all but dropped from him. In grammatical terms, that is a fairly big leap, but when looked at closely the leap is found to go in smaller steps. Before all but spread to verbs, it appeared with past participles, which very much resemble both adjectives and verbs, as in her breath was all but gone. So, changes can sneak into a language and spread from context to context by exploiting the similarities between contexts.

The role of similarity in language change makes a number of predictions. For one thing, actualization processes will differ from item to item because in each case there will be different similarities to exploit. English is currently seeing some nouns developing into adjectives, such as fun or key. This again goes by small adjustments, but along different pathways. For fun, speakers started from expressions like that was really fun, which they would adjust to that was very fun, and from there they would go on to a very fun time and by now some have even gone on to expressions like the funnest time ever. For key, change started from expressions like a key player, which could be adjusted to an absolutely key player, and from there to a player who is absolutely key. When the changes are over, the eventual outcome will be the same — fun and key will have all the characteristics of any other English adjective — but the way that is coming about is different.

Another prediction is that actualization processes will differ from language to language, because grammatical contexts that are similar in one language may not be in another. Comparing the development of another English downtoner, far from (as in far from perfect), to its Dutch equivalent, verre van, it is found that, even though they started out quite similar, the two downtoners went on to develop differently due to differences in the overall structure of English and Dutch. Importantly, this is one way in which even small changes may reinforce and gradually increase existing differences between languages.

Finally, this research can say something about how language works in general. Similarity is so important to how changes unfold precisely because it is important to how speakers subconsciously use language all the time. Presumably, whenever a speaker thinks up a new sentence and decides it is acceptable, they do so by evaluating its resemblance to previous sentences. In this respect, actualization processes are giving us a unique window on how similarity works in organizing and reorganizing speakers’ internal grammars, showing just how sensitive speakers are to all sorts of similarities. Strikingly, then, the same similarity judgments that speakers make to form acceptable and intelligible sentences allow their grammars to gradually change over time.

Journal Reference:

  1. Hendrik De Smet. The Course of Actualization. Language, 2012 (in press)

Reciprocity an Important Component of Prosocial Behavior: Scorekeeping of Past Favors Isn’t, However, a Factor (Science Daily)

ScienceDaily (Sep. 3, 2012) — While exchanging favors with others, humans tend to think in terms of tit-for-tat, an assumption easily extended to other animals. As a result, reciprocity is often viewed as a cognitive feat requiring memory, perhaps even calculation. But what if the process is simpler, not only in other animals but in humans as well?

Researchers at the Yerkes National Primate Research Center, Emory University, have determined monkeys may gain the advantages of reciprocal exchange of favors without necessarily keeping precise track of past favors. Malini Suchak, a graduate student at Emory University, and Frans de Waal, PhD, director of the Living Links Center at Yerkes and C.H. Candler Professor of Psychology at Emory, led the study. Their findings will appear in an Early Online Edition of the Proceedings of the National Academy of Sciences this week.

“Prosocial is defined as a motivation to assist others regardless of benefits for self,” explained Suchak. “We used a prosocial choice test to study whether direct reciprocity could promote generosity among brown capuchin monkeys. We found that one monkey was willing to do another favors when the first monkey was the only one choosing, and we found the monkeys became even more prosocial if they could alternate and help each other. We did not find any evidence that the monkeys paid close attention to each other’s past choices, so they were prosocial regardless of what their partner had just done,” she continued.

Suchak and de Waal suggest the synchronization of the same actions in alternation creates a more positive attitude the same way humans who row a boat together or work toward a shared goal develop a more positive attitude about each other.

Another interesting finding, according to the researchers, is that the capuchin monkeys were prosocial whether they were paired with a familiar partner from their own group (in-group) or a partner from a different social group (out-group).

According to de Waal, “This research has several implications for better understanding human behavior. First, we observed an increase in prosocial behavior as a result of reciprocity, but the monkeys did not develop a contingency between their own and their partners’ behaviors. Like humans, the capuchins may have understood the benefits of reciprocity and used this understanding to maximize their own benefits. Second, that the capuchins responded similarly to in-group and out-group partners has implications for the commonly held view that humans are unique in their ability to cooperate with strangers,” de Waal explained.

According to the researchers, capuchin monkeys (Cebus apella) are ideal subjects for this type of study given the numerous observations of cooperative and prosocial behavior in the field, their sensitivity to other monkeys’ efforts in coordination experiments, and their robust, spontaneous prosocial behavior in the prosocial choice test compared with, for example, chimpanzees, which seem more sensitive to methodological variables.

In this study, the researchers tested 12 brown capuchin monkeys in pairs on a prosocial choice task. The monkeys had the choice between a selfish token that benefited only them and a prosocial token that benefited themselves and a partner. By comparing each monkey’s behavior with a familiar partner from the monkey’s own group and a partner from a different social group, the researchers examined the influence of each monkey’s relationship outside the experimental context on prosocial behavior. There was no difference between in-group and out-group pairs in any of the test conditions. To test the role of reciprocity, the researchers allowed the monkeys to take turns making choices and found this greatly increased prosocial behavior, but the researchers did not observe any tit-for-tat behavior. The researchers also tested whether the monkeys could overcome their aversion for inequity by creating a situation in which both individuals could provide each other with superior rewards, making reciprocity an even more attractive strategy. The monkeys did, but again without keeping track of each other’s choices. Finally, through a series of control conditions, the researchers established the monkeys were responding to their partners’ behaviors, rather than the rewards delivered by their partners, and that the monkeys understood the values of the tokens and were flexibly responding to changing conditions throughout the test sessions.

This research opens several avenues for future research, including further examining the emergence of reciprocity among humans without the cognition required for tit-for-tat and the tendency to cooperate with out-group partners.

Journal Reference:

  1. Malini Suchak and Frans B. M. de Waal. Monkeys benefit from reciprocity without the cognitive burden. Proceedings of the National Academy of Sciences, 2012; DOI: 10.1073/pnas.1213173109

Tigers Take the Night Shift to Coexist With People (Science Daily)

ScienceDaily (Sep. 3, 2012) — Tigers aren’t known for being accommodating, but a new study in the Proceedings of the National Academy of Sciences indicates that the carnivores in Nepal are taking the night shift to better coexist with humans.

A tiger camera trapped in Chitwan National Forest in Nepal. (Credit: Center for Systems Integration and Sustainability, Michigan State University)

The revelation that tigers and people are sharing exactly the same space — the same roads and trails — of Chitwan National Park flies in the face of long-held convictions in conservation circles. It also underscores how successful conservation efforts need science that takes both nature and humans into account.

“As our planet becomes more crowded, we need to find creative solutions that consider both human and natural systems,” said Jianguo “Jack” Liu, the director of the Center for Systems Integration and Sustainability at Michigan State University. “Sustainability can be achieved if we have a good understanding of the complicated connections between both worlds. We’ve found something very interesting is happening in Nepal that holds promise for both humans and nature to thrive.”

Conventional conservation wisdom is that tigers need plenty of people-free space, which often leads to people being relocated or their access to resources compromised to make way for tigers.

Neil Carter, MSU doctoral student and one of the paper’s co-authors, spent two seasons setting motion-detecting camera traps. His analysis of the images shows that people and tigers are walking the same paths, albeit at different times.

Tigers typically move around at all times of the day and night, monitoring their territory, mating and hunting. But in the study area, the tigers had become creatures of the night. People in Nepal generally avoid the forests at night. Essentially, quitting time for people signals starting time for Chitwan’s tigers.
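
One simple way to quantify this kind of temporal partitioning from camera-trap data is to compare hourly activity profiles; the sketch below uses made-up detection times and a basic overlap coefficient, not the study's data or its actual analysis.

```python
from collections import Counter

def hourly_activity(detection_hours):
    """Turn a list of detection hours (0-23) into a normalized 24-bin profile."""
    counts = Counter(detection_hours)
    total = sum(counts.values())
    return [counts.get(h, 0) / total for h in range(24)]

def overlap(profile_a, profile_b):
    """Overlap coefficient: shared area under the two activity profiles (0 to 1)."""
    return sum(min(a, b) for a, b in zip(profile_a, profile_b))

# Hypothetical detections: people active mid-day, tigers mostly after dark.
people = hourly_activity([7, 8, 9, 10, 12, 13, 14, 16, 17, 18])
tigers = hourly_activity([20, 21, 22, 23, 0, 1, 2, 3, 4, 11])
print(f"activity overlap: {overlap(people, tigers):.2f}")  # near 0 = temporal separation
```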

“It’s a very fundamental conflict over resources,” Carter said. “Tigers need resources, people need the same resources. If we operate under the traditional wisdom that tigers only can survive with space dedicated solely for them, there would always be conflict. If your priority is people, tigers lose out. If your priority is tigers, people lose out.”

In Chitwan, tigers seem to be adapting to make it work, he added.

“There appears to be a middle ground where you might actually be able to protect the species at high densities and give people access to forest goods they need to live,” Carter said. “If that’s the case, then this can happen in other places, and the future of tigers is much brighter than it would be otherwise.”

Additional co-authors of the paper include Binoj Shrestha of the Institute for Social and Environmental Research in Nepal, Jhamak Karki of Nepal’s Department of National Parks and Wildlife Conservation, and Narendra Man Babu Pradhan of the World Wildlife Fund in Nepal.

The research was funded by the National Science Foundation, NASA, the U.S. Fish and Wildlife Service Rhinoceros and Tiger Conservation Fund and MSU AgBioResearch. It was part of the Partnership for International Research and Education.

Realistically What Might the Future Climate Look Like? (Skeptical Science)

Posted on 31 August 2012 by dana1981

Robert Watson, former Chair of the Intergovernmental Panel on Climate Change (IPCC), recently made headlines by declaring that it is unlikely we will be able to limit global warming to the 2°C ‘danger limit’.  This past April, the International Energy Agency similarly warned that we are rapidly running out of time to avoid blowing past 2°C global warming compared to late 19th Century temperatures.  The reason for their pessimism is illustrated in the ‘ski slopes’ graphic, which depicts how steep emissions cuts will have to be in order to give ourselves a good chance to stay below the 2°C target, given different peak emissions dates (Figure 1).

ski slopes

Figure 1: Three scenarios, each of which would limit the total global emission of carbon dioxide from fossil-fuel burning and industrial processes to 750 billion tonnes over the period 2010–2050.  Source: German Advisory Council on Global Change, WBGU (2009)

Clearly our CO2 emissions have not yet peaked – in fact they increased by 1 billion tonnes between 2010 and 2011 despite a continued global economic recession; therefore, the green curve is no longer an option.  There has also been little progress toward an international climate accord to replace the Kyoto Protocol, which suggests that the blue curve does not represent a likely scenario either – in order to achieve peak emissions in 2015 we would have to take serious steps to reduce emissions today, which we are not doing.  The red curve seems the most likely, but the required cuts are so steep that it is unlikely we will be able to achieve them, which means we are indeed likely to surpass the 2°C target.
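To make the arithmetic behind the ‘ski slopes’ concrete, the sketch below (a simplification for illustration, not the WBGU scenarios themselves) estimates how steep a linear post-peak decline would need to be to keep cumulative emissions within a fixed budget. The budget, the assumed annual emissions rate and the candidate peak years are illustrative placeholders rather than values taken from the report.

```python
# Minimal sketch (not the WBGU model): for a fixed cumulative CO2 budget,
# estimate how fast emissions must fall after a chosen peak year if they stay
# roughly flat until the peak and then decline linearly to zero.
# All numbers are illustrative placeholders.

def required_decline(budget_gt=750.0, start_year=2010, peak_year=2020,
                     annual_emissions_gt=31.0):
    """Return (post-peak decline rate in GtCO2/yr, year emissions reach zero)."""
    used_before_peak = (peak_year - start_year) * annual_emissions_gt
    remaining = budget_gt - used_before_peak
    if remaining <= 0:
        raise ValueError("budget exhausted before the chosen peak year")
    # Post-peak emissions form a triangle: remaining = 0.5 * annual_rate * span
    decline_span = 2.0 * remaining / annual_emissions_gt
    rate = annual_emissions_gt / decline_span
    return rate, peak_year + decline_span

for peak in (2011, 2015, 2020):
    rate, zero_year = required_decline(peak_year=peak)
    print(f"peak in {peak}: cut ~{rate:.2f} GtCO2 per year, zero around {zero_year:.0f}")
```

The later the peak, the steeper the required annual cut — which is the point the three curves in Figure 1 make graphically.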

Thus it is worth exploring the question, what would a world with >2°C global surface warming look like?

Global Warming Impacts

The 2007 IPCC Fourth Assessment Report (AR4) summarizes the magnitudes of impact of various degrees of warming here, and graphically in Figure 2, relative to ~1990 temperatures (~0.6°C above late 19th Century temperatures).

fig spm.2

Figure 2: Illustrative examples of global impacts projected for climate changes (and sea level and atmospheric carbon dioxide where relevant) associated with different amounts of increase in global average surface temperature in the 21st century. The black lines link impacts, dotted arrows indicate impacts continuing with increasing temperature. Entries are placed so that the left-hand side of the text indicates the approximate onset of a given impact. Quantitative entries for water stress and flooding represent the additional impacts of climate change relative to the conditions projected across the range of Special Report on Emissions Scenarios (SRES) scenarios. Adaptation to climate change is not included in these estimations. Confidence levels for all statements are high.  IPCC AR4 WGII Figure SPM.2.  Click the image for a larger version.

Some adverse impacts are expected even before we reach the 2°C limit, for example hundreds of millions of people being subjected to increased water stress, increasing drought at mid-latitudes (as we recently discussed here), increased coral bleaching, increased coastal damage from floods and storms, and increased morbidity and mortality from more frequent and intense heat waves (see here), floods, and droughts.  However, by and large these are impacts which we should be able to adapt to, at a cost, but without disastrous consequences.

Once we surpass the 2°C target, the impacts listed above are exacerbated, and some new impacts will occur.  Most corals will bleach, and widespread coral mortality is expected ~3°C above late 19th Century temperatures.  Up to 30% of global species will be at risk for extinction, and the figure could exceed 40% if we surpass 4°C, as we continue on the path toward the Earth’s sixth mass extinction.  Coastal flooding will impact millions more people at ~2.5°C, and a number of adverse health effects are expected to continue rising along with temperatures.

Reasons for Concern

Smith et al. (2009) (on which the late great Stephen Schneider was a co-author) updated the IPCC impact assessment, arriving at similar conclusions.  For example,

“There is medium confidence that ~20–30% of known plant and animal species are likely to be at increased risk of extinction if increases in global average temperature exceed 1.5 °C to 2.5 °C over 1980–1999”

“increases in drought, heat waves, and floods are projected in many regions and would have adverse impacts, including increased water stress, wildfire frequency, and flood risks (starting at less than 1 °C of additional warming above 1990 levels) and adverse health effects (slightly above 1 °C)”

“climate change over the next century is likely to adversely affect hundreds of millions of people through increased coastal flooding after a further 2 °C warming from 1990 levels; reductions in water supplies (0.4 to 1.7 billion people affected with less than a 1 °C warming from 1990 levels); and increased health impacts (that are already being observed)”

Smith et al. updated the 2001 IPCC report ‘burning embers’ diagram to reflect their findings (Figure 3).  On this figure, white regions indicate neutral or low impacts or risks, yellow indicates negative impacts for some systems or more significant risks, and red indicates substantial negative impacts or risks that are more widespread and/or severe.  They have grouped the various climate change consequences into ‘reasons for concern’ (RFCs), summarized below.

smith embers

Figure 3:  Risks from climate change, by reason for concern (RFC). Climate change consequences are plotted against increases in global mean temperature (°C) after 1990. Each column corresponds to a specific RFC and represents additional outcomes associated with increasing global mean temperature. The color scheme represents progressively increasing levels of risk and should not be interpreted as representing ‘‘dangerous anthropogenic interference,’’ which is a value judgment. The historical period 1900 to 2000 warmed by 0.6 °C and led to some impacts. It should be noted that this figure addresses only how risks change as global mean temperature increases, not how risks might change at different rates of warming. Furthermore, it does not address when impacts might be realized, nor does it account for the effects of different development pathways on vulnerability.

  • Risk to Unique and Threatened Systems addresses the potential for increased damage to or irreversible loss of unique and threatened systems, such as coral reefs, tropical glaciers, endangered species, unique ecosystems, biodiversity hotspots, small island states, and indigenous communities.
  • Risk of Extreme Weather Events tracks increases in extreme events with substantial consequences for societies and natural systems. Examples include increase in the frequency, intensity, or consequences of heat waves, floods, droughts, wildfires, or tropical cyclones.
  • Distribution of Impacts concerns disparities of impacts.  Some regions, countries, and populations face greater harm from climate change, whereas other regions, countries, or populations would be much less harmed—and some may benefit; the magnitude of harm can also vary within regions and across sectors and populations.
  • Aggregate Damages covers comprehensive measures of impacts. Impacts distributed across the globe can be aggregated into a single metric, such as monetary damages, lives affected, or lives lost. Aggregation techniques vary in their treatment of equity of outcomes, as well as treatment of impacts that are not easily quantified.
  • Risks of Large-Scale Discontinuities represents the likelihood that certain phenomena (sometimes called tipping points) would occur, any of which may be accompanied by very large impacts. These phenomena include the deglaciation (partial or complete) of the West Antarctic or Greenland ice sheets and major changes in some components of the Earth’s climate system, such as a substantial reduction or collapse of the North Atlantic Meridional Overturning Circulation.

All of these reasons for concern enter the red (substantial negative impact, high risk) region by 4°C.  Aggregate impacts are in the red region by 3°C, and some types of concerns are in the red region by 1°C.

For more details we also recommend Mark Lynas’ book Six Degrees, which goes through the climate impacts from each subsequent degree of warming, based on a very thorough review of the scientific literature.  A brief review of the book by Eric Steig and summary of some key impacts is available here.  National Geographic also did a series of videos on the Six Degrees theme, which no longer seem to be available on their websites, but which can still be found on YouTube.

This is Why Reducing Emissions is Critical

We’re not yet committed to surpassing 2°C global warming, but as Watson noted, we are quickly running out of time to realistically give ourselves a chance to stay below that ‘danger limit’.  However, 2°C is not a do-or-die threshold.  Every bit of CO2 emissions we can reduce means that much avoided future warming, which means that much avoided climate change impacts.  As Lonnie Thompson noted, the more global warming we manage to mitigate, the less adaptation and suffering we will be forced to cope with in the future.

Realistically, based on the current political climate (which we will explore in another post next week), limiting global warming to 2°C is probably the best we can do.  However, there is a big difference between 2°C and 3°C, between 3°C and 4°C, and anything greater than 4°C can probably accurately be described as catastrophic, since various tipping points are expected to be triggered at this level.  Right now, we are on track for the catastrophic consequences (widespread coral mortality, mass extinctions, hundreds of millions of people adversely impacted by droughts, floods, heat waves, etc.).  But we’re not stuck on that track just yet, and we need to move ourselves as far off of it as possible by reducing our greenhouse gas emissions as soon and as much as possible.

There are of course many people who believe that the planet will not warm as much, or that the impacts of the associated climate change will not be as bad, as the body of scientific evidence suggests.  That is certainly a possibility, and we very much hope that their optimistic view is correct.  However, what we have presented here is the best summary of scientific evidence available, and it paints a very bleak picture if we fail to rapidly reduce our greenhouse gas emissions.

If we continue forward on our current path, catastrophe is not just a possible outcome, it is the most probable outcome.  And an intelligent risk management approach would involve taking steps to prevent a catastrophic scenario if it were a mere possibility, let alone the most probable outcome.  This is especially true since the most important component of the solution – carbon pricing – can be implemented at a relatively low cost, and a far lower cost than trying to adapt to the climate change consequences we have discussed here (Figure 4).

Figure 4:  Approximate costs of climate action (green) and inaction (red) in 2100 and 2200. Sources: German Institute for Economic Research and Watkiss et al. 2005

Climate contrarians will often mock ‘CAGW’ (catastrophic anthropogenic global warming), but the sad reality is that CAGW is looking more and more likely every day.  But it’s critical that we don’t give up, that we keep doing everything we can do to reduce our emissions as much as possible in order to avoid as many catastrophic consequences as possible, for the sake of future generations and all species on Earth.  The future climate will probably be much more challenging for life on Earth than today’s, but we still can and must limit the damage.

Design Help for Drug Cocktails for HIV Patients: Mathematical Model Helps Design Efficient Multi-Drug Therapies (Science Daily)

ScienceDaily (Sep. 2, 2012) — For years, doctors treating those with HIV have recognized a relationship between how faithfully patients take the drugs they prescribe, and how likely the virus is to develop drug resistance. More recently, research has shown that the relationship between adherence to a drug regimen and resistance is different for each of the drugs that make up the “cocktail” used to control the disease.

HIV is shown attaching to and infecting a T4 cell. The virus then inserts its own genetic material into the T4 cell’s host DNA. The infected host cell then manufactures copies of the HIV. (Credit: iStockphoto/Medical Art Inc.)

New research conducted by Harvard scientists could help explain why those differences exist, and may help doctors quickly and cheaply design new combinations of drugs that are less likely to result in resistance.

As described in a September 2 paper in Nature Medicine, a team of researchers led by Martin Nowak, Professor of Mathematics and of Biology and Director of the Program for Evolutionary Dynamics, has developed a technique medical researchers can use to model the effects of various treatments, and predict whether they will cause the virus to develop resistance.

“What we demonstrate in this paper is a prototype for predicting, through modeling, whether a patient at a given adherence level is likely to develop resistance to treatment,” Alison Hill, a PhD student in Biophysics and co-first author of the paper, said. “Compared to the time and expense of a clinical trial, this method offers a relatively easy way to make these predictions. And, as we show in the paper, our results match with what doctors are seeing in clinical settings.”

The hope, said Nowak, is that the new technique will take some of the guesswork out of what is now largely a trial-and-error process.

“This is a mathematical tool that will help design clinical trials,” he said. “Right now, researchers are using trial and error to develop these combination therapies. Our approach uses the mathematical understanding of evolution to make the process more akin to engineering.”

Creating a model that can make such predictions accurately, however, requires huge amounts of data.

To get that data, Hill and Daniel Scholes Rosenbloom, a PhD student in Organismic and Evolutionary Biology and the paper’s other first author, turned to Johns Hopkins University Medical School, where Professor of Medicine and of Molecular Biology and Genetics Robert F. Siliciano was working with PhD student Alireza Rabi (also co-first author) to study how the HIV virus reacted to varying drug dosages.

Such data proved critical to the model that Hill, Rabi and Rosenbloom eventually designed, because the level of the drug in patients — even those that adhere to their treatment perfectly — naturally varies. When drug levels are low — as they are between doses, or if a dose is missed — the virus is better able to replicate and grow. Higher drug levels, by contrast, may keep the virus in check, but they also increase the risk of mutant strains of the virus emerging, leading to drug resistance.

Armed with the data from Johns Hopkins, Hill, Rabi and Rosenbloom created a computer model that could predict whether and how much the virus, or a drug-resistant strain, was growing based on how strictly patients stuck to their drug regimen.

“Our model is essentially a simulation of what goes on during treatment,” Rosenbloom said. “We created a number of simulated patients, each of whom had different characteristics, and then we said, ‘Let’s imagine these patients have 60 percent adherence to their treatment — they take 60 percent of the pills they’re supposed to.’ Our model can tell us what their drug concentration is over time, and based on that, we can say whether the virus is growing or shrinking, and whether they’re likely to develop resistance.”
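A rough illustration of the kind of simulation Rosenbloom describes is sketched below. This is a toy version, not the published model: drug concentration decays between doses, each day’s pill is taken with probability equal to the adherence level, and the virus is assumed to grow whenever the concentration falls below a hypothetical inhibitory threshold. All parameter names and values are assumptions chosen for illustration.

```python
import random

# Toy sketch of the adherence idea described above (not the published
# Nature Medicine model): drug level decays between doses, doses are taken
# with probability `adherence`, and the virus grows only when the level
# drops below a hypothetical inhibitory threshold. Parameters are illustrative.

def simulate_patient(adherence=0.6, days=180, half_life_days=1.0,
                     dose_level=4.0, inhibitory_level=1.0, seed=0):
    rng = random.Random(seed)
    decay = 0.5 ** (1.0 / half_life_days)   # per-day exponential decay factor
    drug, log_virus = 0.0, 0.0              # log10 of relative viral load
    for _ in range(days):
        if rng.random() < adherence:        # did the patient take today's pill?
            drug += dose_level
        drug *= decay
        # virus shrinks while the drug is above threshold, grows when below
        log_virus += -0.05 if drug >= inhibitory_level else 0.05
    return log_virus

for adh in (0.9, 0.6, 0.3):
    change = simulate_patient(adherence=adh)
    print(f"adherence {adh:.0%}: change in log10 viral load ≈ {change:+.2f}")
```

In the actual work, the role of the single threshold used here is played by measured dose-response behavior for each drug, drawn from laboratory data like that described in the companion piece below.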

The model’s predictions, Rosenbloom explained, can then serve as a guide to researchers as they work to design new drug cocktails to combat HIV.

While their model does hold out hope for simplifying the process of designing drug “cocktails,” Hill and Rosenbloom said they plan to continue to refine the model to take additional factors — such as multiple mutant-resistant strains of the virus and varying drug concentrations in other parts of the body — into account.

“The prototype we have so far looks at concentrations of drugs in blood plasma,” Rosenbloom explained. “But a number of drugs don’t penetrate other parts of the body, like the brain or the gut, with the same efficiency, so it’s important to model these other areas where the concentrations of drugs might not be as high.”

Ultimately, though, both say their model can offer new hope to patients by helping doctors design better, cheaper and more efficient treatments.

“Over the past 10 years, the number of HIV-infected people receiving drug treatment has increased immensely,” Hill said. “Figuring out what the best ways are to treat people in terms of cost effectiveness, adherence and the chance of developing resistance is going to become even more important.”

Journal Reference:

  1. Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892

*   *   *

Anti-HIV Drug Simulation Offers ‘Realistic’ Tool to Predict Drug Resistance and Viral Mutation

ScienceDaily (Sep. 2, 2012) — Pooling data from thousands of tests of the antiviral activity of more than 20 commonly used anti-HIV drugs, AIDS experts at Johns Hopkins and Harvard universities have developed what they say is the first accurate computer simulation to explain drug effects. Already, the model clarifies how and why some treatment regimens fail in some patients who lack evidence of drug resistance. Researchers say their model is based on specific drugs, precise doses prescribed, and on “real-world variation” in how well patients follow prescribing instructions.

Johns Hopkins co-senior study investigator and infectious disease specialist Robert Siliciano, M.D., Ph.D., says the mathematical model can also be used to predict how well a patient is likely to do on a specific regimen, based on their prescription adherence. In addition, the model factors in each drug’s ability to suppress viral replication and the likelihood that such suppression will spur development of drug-resistant, mutant HIV strains.

“With the help of our simulation, we can now tell with a fair degree of certainty what level of viral suppression is being achieved — how hard it is for the virus to grow and replicate — for a particular drug combination, at a specific dosage and drug concentration in the blood, even when a dose is missed,” says Siliciano, a professor at the Johns Hopkins University School of Medicine and a Howard Hughes Medical Institute investigator. This information, he predicts, will remove “a lot of the current trial and error, or guesswork, involved in testing new drug combination therapies.”

Siliciano says the study findings, to be reported in the journal Nature Medicine online Sept. 2, should help scientists streamline development and clinical trials of future combination therapies, by ruling out combinations unlikely to work.

One application of the model could be further development of drug combinations that can be contained in a single pill taken once a day. That could lower the chance of resistance, even if adherence is not perfect. Such future drug regimens, he says, will ideally strike a balance between optimizing viral suppression and minimizing risk of drug resistance.

Researchers next plan to expand their modeling beyond blood levels of virus to other parts of the body, such as the brain, where antiretroviral drug concentrations can be different from those measured in the blood. They also plan to expand their analysis to include multiple-drug-resistant strains of HIV.

Besides Siliciano, Johns Hopkins joint medical-doctoral student Alireza Rabi was a co-investigator in this study. Other study investigators included doctoral candidates Daniel Rosenbloom, M.S.; Alison Hill, M.S.; and co-senior study investigator Martin Nowak, Ph.D. — all at Harvard University.

Funding support for this study, which took two years to complete, was provided by the National Institutes of Health, with corresponding grant numbers R01-MH54907, R01-AI081600, R01-GM078986; the Bill and Melinda Gates Foundation; the Cancer Research Institute; the National Science Foundation; the Howard Hughes Medical Institute; Natural Sciences and Engineering Research Council of Canada; the John Templeton Foundation; and J. Epstein.

Currently, an estimated 8 million of the more than 34 million people in the world living with HIV are taking antiretroviral therapy to keep their disease in check. An estimated 1,178,000 in the United States are infected, including 23,000 in the state of Maryland.

Journal Reference:

  1. Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892

Mathematics or Memory? Study Charts Collision Course in Brain (Science Daily)

ScienceDaily (Sep. 3, 2012) — You already know it’s hard to balance your checkbook while simultaneously reflecting on your past. Now, investigators at the Stanford University School of Medicine — having done the equivalent of wire-tapping a hard-to-reach region of the brain — can tell us how this impasse arises.

The area in red is the posterior medial cortex, the portion of the brain that is most active when people recall details of their own pasts. (Credit: Courtesy of Josef Parvizi)

The researchers showed that groups of nerve cells in a structure called the posterior medial cortex, or PMC, are strongly activated during a recall task such as trying to remember whether you had coffee yesterday, but just as strongly suppressed when you’re engaged in solving a math problem.

The PMC, situated roughly where the brain’s two hemispheres meet, is of great interest to neuroscientists because of its central role in introspective activities.

“This brain region is famously well-connected with many other regions that are important for higher cognitive functions,” said Josef Parvizi, MD, PhD, associate professor of neurology and neurological sciences and director of Stanford’s Human Intracranial Cognitive Electrophysiology Program. “But it’s very hard to reach. It’s so deep in the brain that the most commonly used electrophysiological methods can’t access it.”

In a study published online Sept. 3 in Proceedings of the National Academy of Sciences, Parvizi and his Stanford colleagues found a way to directly and sensitively record the output from this ordinarily anatomically inaccessible site in human subjects. By doing so, the researchers learned that particular clusters of nerve cells in the PMC that are most active when you are recalling details of your own past are strongly suppressed when you are performing mathematical calculations. Parvizi is the study’s senior author. The first and second authors, respectively, are postdoctoral scholars Brett Foster, PhD, and Mohammed Dastjerdi, PhD.

Much of our understanding of what roles different parts of the brain play has been obtained by techniques such as functional magnetic resonance imaging, which measures the amount of blood flowing through various brain regions as a proxy for activity in those regions. But changes in blood flow are relatively slow, making fMRI a poor medium for listening in on the high-frequency electrical bursts (approximately 200 times per second) that best reflect nerve-cell firing. Moreover, fMRI typically requires pooling images from several subjects into one composite image. Each person’s brain physiognomy is somewhat different, so the blending blurs the observable anatomical coordinates of a region of interest.

Nonetheless, fMRI imaging has shown that the PMC is quite active in introspective processes such as autobiographical memory processing (“I ate breakfast this morning”) or daydreaming, and less so in external sensory processing (“How far away is that pedestrian?”). “Whenever you pay attention to the outside world, its activity decreases,” said Parvizi.

To learn what specific parts of this region are doing during, say, recall versus arithmetic requires more-individualized anatomical resolution than an fMRI provides. Otherwise, Parvizi said, “if some nerve-cell populations become less active and others more active, it all washes out, and you see no net change.” So you miss what’s really going on.

For this study, the Stanford scientists employed a highly sensitive technique to demonstrate that introspective and externally focused cognitive tasks directly interfere with one another, because they impose opposite requirements on the same brain circuitry.

The researchers took advantage of a procedure performed on patients who were being evaluated for brain surgery at the Stanford Epilepsy Monitoring Unit, associated with Stanford University Medical Center. These patients were unresponsive to drug therapy and, as a result, suffered continuing seizures. The procedure involves temporarily removing small sections of a patient’s skull, placing a thin plastic film containing electrodes onto the surface of the brain near the suspected point of origin of that patient’s seizure (the location is unique to each patient), and then monitoring electrical activity in that region for five to seven days — all of it spent in a hospital bed. Once the epilepsy team identifies the point of origin of any seizures that occurred during that time, surgeons can precisely excise a small piece of tissue at that position, effectively breaking the vicious cycle of brain-wave amplification that is a seizure.

Implanting these electrode packets doesn’t mean piercing the brain or individual cells within it. “Each electrode picks up activity from about a half-million nerve cells,” Parvizi said. “It’s more like dotting the ceiling of a big room, filled with a lot of people talking, with multiple microphones. We’re listening to the buzz in the room, not individual conversations. Each microphone picks up the buzz from a different bunch of partiers. Some groups are more excited and talking more loudly than others.”

The experimenters found eight patients whose seizures were believed to be originating somewhere near the brain’s midline and who, therefore, had had electrode packets placed in the crevasse dividing the hemispheres. (The brain’s two hemispheres are spaced far enough apart to slip an electrode packet between them without incurring damage.)

The researchers got permission from these eight patients to bring in laptop computers and put the volunteers through a battery of simple tasks requiring modest intellectual effort. “It can be boring to lie in bed waiting seven days for a seizure to come,” said Foster. “Our studies helped them pass the time.” The sessions lasted about an hour.

On the laptop would appear a series of true/false statements falling into one of four categories. Three categories were self-referential, albeit with varying degrees of specificity. Most specific was so-called “autobiographical episodic memory,” an example of which might be: “I drank coffee yesterday.” The next category of statements was more generic: “I eat a lot of fruit.” The most abstract category, “self-judgment,” comprised sentences along the lines of: “I am honest.”

A fourth category differed from the first three in that it consisted of arithmetical equations such as: 67 + 6 = 75. Evaluating such a statement’s truth required no introspection but, instead, an outward, more sensory orientation.

For each item, patients were instructed to press “1” if a statement was true, “2” if it was false.

Significant portions of the PMC that were “tapped” by electrodes became activated during self-episodic memory processing, confirming the PMC’s strong role in recall of one’s past experiences. Interestingly, true/false statements involving less specifically narrative recall — such as, “I eat a lot of fruit” — induced relatively little activity. “Self-judgment” statements — such as, “I am attractive” — elicited none at all. Moreover, whether a volunteer judged a statement to be true or false made no difference with respect to the intensity, location or duration of electrical activity in activated PMC circuits.

This suggests, both Parvizi and Foster said, that the PMC is not the brain’s “center of self-consciousness” as some have proposed, but is more specifically engaged in constructing autobiographical narrative scenes, as occurs in recall or imagination.

Foster, Dastjerdi and Parvizi also found that the PMC circuitry activated by a recall task took close to a half-second to fire up, ruling out the possibility that this circuitry’s true role was in reading or making sense of the sentence on the screen. (These two activities are typically completed within the first one-fifth of a second or so.) Once activated, these circuits remained active for a full second.

Yet all the electrodes that lit up during the self-episodic condition were conspicuously deactivated during arithmetic calculation. In fact, the circuits being monitored by these electrodes were not merely passively silent, but actively suppressed, said Parvizi. “The more a circuit is activated during autobiographical recall, the more it is suppressed during math. It’s essentially impossible to do both at once.”
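The comparison the researchers describe — high-frequency activity on each electrode during memory trials versus math trials — can be sketched roughly as band-pass power estimation. The code below is a generic illustration on synthetic data, not the authors’ analysis pipeline; the sampling rate, frequency band and trial structure are assumptions made for the example.

```python
import numpy as np
from scipy import signal

# Sketch of the kind of analysis described above (not the authors' pipeline):
# estimate high-frequency power (~70-180 Hz, often called "high gamma") on one
# electrode and compare memory trials against math trials. The synthetic data
# and all parameters are purely illustrative.

fs = 1000                                    # assumed sampling rate in Hz
t = np.arange(0, 1.0, 1 / fs)                # one-second trials

def high_freq_power(trace, fs=1000, band=(70, 180)):
    b, a = signal.butter(4, band, btype="bandpass", fs=fs)
    filtered = signal.filtfilt(b, a, trace)
    envelope = np.abs(signal.hilbert(filtered))  # instantaneous amplitude
    return np.mean(envelope ** 2)                # mean band power for the trial

rng = np.random.default_rng(0)
# Synthetic stand-ins: "memory" trials carry extra high-frequency activity.
memory_trials = [rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 120 * t)
                 for _ in range(20)]
math_trials = [rng.normal(size=t.size) for _ in range(20)]

mem_power = np.mean([high_freq_power(x) for x in memory_trials])
math_power = np.mean([high_freq_power(x) for x in math_trials])
print(f"mean high-frequency power  memory: {mem_power:.3f}   math: {math_power:.3f}")
```

A real analysis would also align these power estimates to the moment each statement appeared on the screen, in order to recover the timing effects described above, such as activation beginning roughly half a second after stimulus onset.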

The study was funded by the National Institutes of Health, with partial sponsorship from the Stanford Institute for NeuroInnovation and Translational Neuroscience.