Tag archive: Technological mediation

NASA Scientists Use Unmanned Aircraft to Spy on Hurricanes (Wired)

By Jason Paur

September 21, 2012

Image: NASA

Hurricane researchers are gathering unprecedented data this month by using two NASA Global Hawk unmanned aircraft. The airplanes were originally developed for the military, but have been modified to aid in atmospheric research.

One of the Global Hawks was flown to its new base at NASA’s Wallops Island Flight Facility on Virginia’s Atlantic coast earlier this month and has already flown several missions over developing tropical storms, giving atmospheric scientists the ability to watch and measure storms for up to three times as long as they could with manned aircraft, including NASA’s modified U-2 spy plane. The second Global Hawk is set to depart the Dryden Flight Research Center in California and join its hangar mate in the next week or so.

The airplanes were first used to observe storm development on a limited basis in 2010. The earlier missions were flown out of Dryden, which cut short the amount of time the planes could spend over Atlantic storms. Now based on the East Coast, the Global Hawks can spend up to six hours off the coast of Africa as storms develop, or 20 hours or more as the storms approach North America.

This long loiter capability is what has scientists excited to gain new insight into the life of a hurricane, says Scott Braun, principal investigator on NASA’s Hurricane and Severe Storm Sentinel (HS3) project.

“We’ve kind of gone from reconnaissance, which is short-term investigations” said Braun, “to more of surveillance where you can stay with a storm and move with it for a while.”

The path flown by Global Hawk AV-6 over tropical storm Nadine on Sept. 12. (NASA)

On Wednesday, the Global Hawk known as AV-6 departed Wallops Island and flew for nearly 25 hours observing tropical storm Nadine over the central Atlantic. It was the third time AV-6 had flown over the storm in the past 10 days. The airplane carries an “environmental payload” designed to gather big-picture data by flying around the entire storm (above). In addition to a high-resolution infrared sounder and a cloud-physics lidar for measuring the structure and depth of clouds, AV-6 can make direct measurements by dropping radiosondes that measure temperature, humidity, wind speed and air pressure as they descend by parachute from as high as 60,000 feet.

The other airplane, AV-1, has a different suite of remote sensing instruments on board, focused on the development of the inner core of the storms, measuring variables including wind profiles, rain rates and liquid water content of the clouds.

Researchers have learned a lot about predicting the path of hurricanes over the past several decades. But predicting the intensification of storms, especially early in their development, has proved much harder. One of the aspects of storm and hurricane development Braun and the HS3 team are hoping to learn more about is the role that dry, dusty air masses blowing off the Sahara play in the intensification process. It’s something debated in the research community, and until now scientists have had limited capabilities to watch the interaction for long periods of time.

“In some ways the Saharan air layer is essential,” Braun said. “The question is, once you develop these waves, to what extent can this dry and dusty air get into these disturbances to disrupt the ability of thunderstorms to get rotation organized on smaller scales to spin up a hurricane.”

Braun says most of the storms form from the air masses that come off Africa, but in manned aircraft they may only get a few hours at a time to watch and gather data. Satellites are also used extensively, but they offer a few snapshots a day and cannot make direct measurements. The Global Hawks provide the capability to watch the storms develop for up to a day at a time. And with two of them the researchers will eventually be able to watch storms continuously.

“We could potentially follow them all the way across the Atlantic, so you’re looking at the full life cycle of these storms.”

The Global Hawks will remain in Virginia through early October until hurricane season is over. The airplanes will be back on the East Coast for more long flights next year as well as in 2014.

UN wants to ensure global temperature does not rise by more than 2°C (Globo Natureza)

JC e-mail 4582, September 13, 2012

The United Nations (UN) climate negotiations must keep pushing for more ambitious action to ensure that global warming does not exceed 2 degrees, a European Union negotiator said this week, a month after the United States was accused of backsliding on the target.

Nearly 200 countries agreed in 2010 to limit the rise in temperatures to below 2 degrees Celsius above the pre-industrial era, in order to avoid dangerous impacts of climate change such as floods, droughts and rising sea levels.

To slow the pace of global warming, the UN climate talks in South Africa agreed to develop a legally binding climate agreement by 2015, which would enter into force by 2020 at the latest.

However, experts warn that the chance of limiting the rise in global temperature to less than 2 degrees is shrinking as emissions of greenhouse gases from the burning of fossil fuels continue to grow.

“It is very clear that we must keep pressing in the negotiations the point that the 2-degree target is not sufficient. The reason we are not doing enough is the political situation in some parts of the world,” said Peter Betts, Britain’s director of international climate change and a senior EU negotiator, to a climate change group in the British Parliament.

Last week, scientists and diplomats met in Bangkok for a session of the UN Framework Convention on Climate Change (UNFCCC), the last before the annual conference to be held between November and December in Doha, Qatar.

Flexibility on targets – Last month, the United States was criticized for saying it supported a more flexible approach to a new climate agreement, one that would not necessarily keep the 2-degree limit, but later added that flexibility would give the world a better chance of reaching a new deal.

Several countries, including some of the most vulnerable to climate change, say the 2-degree limit is not enough and that a 1.5-degree limit would be safer. Emissions of the main greenhouse gas, carbon dioxide, rose 3.1% in 2011, a record high. China was the world’s largest emitter, followed by the United States.

Negotiations to create a new global climate agreement along the lines of Kyoto have already begun. The last climate conference approved a series of measures establishing targets for developed and developing countries.

The document, called the “Durban Platform for Enhanced Action,” sets out a series of measures to be implemented, but in practice it contains no effective, urgent measures to curb rising pollution levels across the planet over the next eight years.

Obligations for everyone in the future – It provides for the creation of a global climate agreement that will encompass all UNFCCC member countries and will replace the Kyoto Protocol. The countries will draw up “a protocol, another legal instrument or an agreed outcome with legal force” to combat climate change.

This means that emission reduction targets will be set for all nations, including the United States and China, which had refused any kind of negotiation in which one of the two was left out of the reduction obligations.

The outline of this new plan will start to be drawn up at the next UN negotiations, including COP 18, to be held in 2012 in Qatar. The document states that a working group will be created and should complete the new plan in 2015.

The pollution-control measures are only due to be implemented by countries from 2020, the deadline set in the Durban Platform, and should take into account the recommendations of the report of the Intergovernmental Panel on Climate Change (IPCC), to be released between 2014 and 2015.

In 2007, the panel released a report pointing to a global average temperature increase of between 1.8°C and 4.0°C by 2100, with the possibility of a rise of up to 6.4°C if population and the economy keep growing rapidly and intense consumption of fossil fuels is maintained.

However, the most reliable estimate speaks of an average increase of 3°C, assuming that carbon dioxide levels stabilize at 45% above the current rate. It also states, with more than 90% confidence, that most of the temperature increase observed over the past 50 years was caused by human activities.

The hollowing out of today’s ecological debate, which does not question the economic and development model (EcoDebate)

Published on September 6, 2012 by

“The question becomes ‘what should I do to help?’ (…) while the main question should be ‘who and what should I fight against?’”

Vladimir Safatle is part of a new wave of left-wing intellectuals who are not intimidated by the diversity of questions raised by the contemporary world. In this interview, the professor in the Department of Philosophy at the University of São Paulo (USP) shows that the crisis of representative democracy may be the key to better understanding facts that at first glance seem unrelated, revealing mechanisms that link Icelanders to Brazilian fishermen, and ecologists to young people who are once again claiming the streets as a space for political action. One of the authors of ‘Occupy’ (Boitempo, 2012), Safatle argues that we live in a moment in which the critique of democracy, far from paving the way for totalitarianism, rekindles the capacity for democratic reinvention from the perspective of popular sovereignty. With the release of ‘A esquerda que não teme dizer seu nome’ [‘The Left That Is Not Afraid to Say Its Name’] (Três Estrelas, 2012), the philosopher insists on the urgency of abandoning the “comfortable and depressive fatalism” that, since the fall of the Berlin Wall, has fed the false impression that no radical rupture is on the agenda of the political field.

In your book, you argue that the left needs to show what is non-negotiable. Abandoning pragmatism and overcoming the impasses of ‘governability’, among other things, would be paths toward that. On the other hand, a doubt hangs over the parties, unions and similar structures themselves: will they be capable of transforming themselves? The young people occupying the streets around the world do not seem to identify with this kind of organization of political life. Why is that?

What has happened to the parties of the left?

Left-wing parties have gone through two phases. The first, strongly marked by the polarity between social-democratic and communist parties, sustained the development of the welfare states in Europe in the 1950s and 1960s. The second moment of left-wing parties grew out of the libertarian ideas of May 1968, which generated a myriad of libertarian parties, the most important of them being the green party. The green parties managed to place a fundamental ecological agenda on the political debate, but this movement has also exhausted itself. Perhaps its last flicker is happening in Germany with the Pirate Party. What is missing is a third wave of parties capable of processing the dead-end situation of the 2008 crisis, which will still drag on for a long time.

How would these parties be characterized?

What is missing is a generation of parties aware of problems linked to economic inequality, something these second-generation parties lack. Incidentally, the German Green Party was responsible for the law that deregulated and loosened the labor market, passed in the era of Gerhard Schröder [German chancellor from 1998 to 2005]. What is missing is a generation of parties with the courage to radicalize the processes of institutionalizing popular sovereignty. Parties that do not function like parties. That may sound strange, but at bottom it is very important. Parties without that centralized, strategically oriented structure in which discussions are subordinated to the party’s electoral strategies of the day. Why don’t young people want to join parties today? Because they do not want their critical capacity to be instrumentalized by electoral calculations. Nobody wants to keep making a political alliance with one person to guarantee the election of another. That merchant’s way of reasoning, which has managed to monopolize politics at every level, including on the left, is what a good part of today’s young people flatly refuse to follow, and with every reason.

What takes its place?

It is essential to find a model of electoral participation in which that kind of position is not bargained away. Nobody here is repeating the profession of faith that prevailed in the 1990s, of changing the world without taking power. That did not work and will not work; Egypt is an example. The group that really mobilized the revolutionary process is called the April 6 Movement. They decided not to enter the electoral game and are increasingly isolated. This idea of a force that comes from the streets and presses the regime from the outside has its limits. So it is not a matter of an abstract critique of the electoral process, but of recognizing that it is necessary to know how to enter that process in a way different from what we have seen so far. Perhaps the creation of flexible alliances for an election, which later dissolve, like the Left Front in France, things of that kind. It is hard to know what will appear, but one thing is certain: what we have today no longer suffices. There is a very strong fixation on representative democracy. Since the 1970s, political science has indulged in a kind of delight in discussing how the democratic game should work, the structure of parties, of powers, and so on. That kind of perspective radically blocks the idea that one of the central questions of democracy is to critique democracy itself. When democracy loses its capacity for reinvention, it dies. That is what is happening now.

What contributed to the recomposition of the public space of the streets, and why was it abandoned for so long?

For there to be social critique and mobilization, disenchantment is necessary. Several levels of disenchantment were needed for people to return to the streets. When I was in my early twenties, the prevailing discourse was that we would never again see large popular mobilizations. There could be one-off mobilizations around specific issues, but never a mobilization that would call into question the model of functioning and management of social life within advanced capitalist societies. Today we see that those who made those predictions were not only wrong but had unavowable ideological interests. The people who took to the streets in 2011 wanted to discuss the model of functioning of the economic and social structure of our societies. The moment that happened, many people, especially in the press, took delight in saying the protesters had no proposals, which is false. Those who went to the streets were claiming the right to put the problems into question. Often, the worst way to think about a problem is to “solve” it too quickly. There were also those who did not take to the streets and, faced with the financial crisis, came up with ready-made solutions. Those ‘solutions’ only made the problems worse.

With regard to the environmental agenda, there are many ‘solutions’ that in fact deliberately drain the political potential of ecological questions. We see the individualization of responsibility for pollution in the discourse about plastic bags, about how long people should spend in the shower, and so on, as well as an effort to push the population away from the discussion by dressing it up as an eminently technical matter. How do you see this?

It is an attempt to strip the ecological question of its political force by turning it into a moral question. The discussion revolves around the acts of individuals, which need to be changed. You need to spend less time in the shower, buy organic products and things of that kind. It is a very astute way of operating a shift that is fatal for the ecological problem, because the question becomes “what should I do to help?”, and, at first sight, it seems nice for everyone to do something to help, while the main question should be “who and what should I fight against?”. Without that, the tendency is to completely hollow out the dimension of the ecological discussion: the economic and development model goes unquestioned. And the strong political potential of this discussion lies precisely in that questioning of the development model of advanced capitalist societies, calling into question the model of organization and management of cities, transport, waste, energy… As a result of this displacement from the political to the moral dimension, none of this is put into question; however much everyone defends “the forests” with a hand on their heart, the question that ecology raised is left out of the debate.

The rhetoric of technical discourse, in which people cannot gain access to the facts without the mediation of specialists, is an obstacle to rebuilding the political field on the basis of that direct democracy, closely tied to the real interests of populations, is it not?

I can give an example of this kind of problem. Iceland was one of the first countries to be hit by the 2008 financial crisis. Icelandic banks had sold investment funds in the Netherlands and England, and when those banks collapsed, the Dutch and British governments demanded that the Icelandic government cover the banks’ debt. Faced with this, the Icelandic parliament decided to vote on a law to bail out the failed banks, and the law passed. But the president of Iceland, who was a more enlightened sort, recalled that the country’s Constitution provided for calling a popular referendum in cases like that. In short, he recalled that the central principle of democracy is: whoever pays the orchestra chooses the music. It would not be parliament paying that debt, but the population, whose resources and wages would be expropriated through a series of taxes earmarked to pay off the banks’ debt. The Icelandic population decided it did not want that. After the result of the referendum came the most fantastic part, which is the essence of today’s parliamentary democracy: parliament voted and approved the very same bank bailout law once again. So the president once again triggered the mechanism of the popular referendum and, for the second time, the Icelanders said no. What does that mean? Some may ask, “how does a ‘technical’ question like that end up in a popular referendum?”, accuse the president of demagoguery, and so on, which is absolutely surreal. It cannot be that parliamentarians whose campaigns are paid for by banks decide what happens to the population’s money with regard to paying or not paying those banks’ debt. There was no shortage of economists predicting that Iceland would collapse. Yet of all the countries that entered the crisis, Iceland is one of those in the best situation today. The attempt to strip the decision of its political force was simply an ideological construction to legitimize the “technicians”, who in fact are not technicians at all, because they are representatives of the financial power that has managed to take over all the institutions of the advanced democracies. That is the limit of today’s democracy. The financial system is the great enemy of democracy.

There is a kind of environmental agenda based on bringing common goods into the market, which has been denounced as the solution found by the financial system to escape the crisis, while at the same time, also leaning on the rhetoric of crisis, Angela Merkel leads austerity policies in the euro zone that delegitimize the sovereign will of peoples, as in the Greek case. Where does ‘the left that is not afraid to say its name’ stand in this process?

Problems linked to ecology have a strong potential not only for mobilization but also for transformation. However, today we have two ecologies. One has a transformative potential, but the other is conservative. Capitalism sees in ecology one of the elements of its own renewal. Today any liberal, any Wall Street analyst, will embrace ecological discourse. Some authors say that after the real estate bubble we now have the green bubble. I once wrote a short text about Oliver Stone’s film Wall Street [2010], which struck me for the sharpness of its metaphor. A young market analyst bets on the financial potential of renewable energy. He was a visionary because, in a certain way, he preached a reconciliation between the most rentier sector of the economy and some demands present on the ecological agenda. That can only be done by completely bargaining away the dimension in which ecological reflection appears as a fundamental element of the affirmation of popular sovereignty. There is a bizarre but very concrete tendency toward an articulation between a certain sector of ecological struggles and financial capital. Even from an electoral point of view, a lot of complicated things happen. The European green parties prefer to ally with centrist parties rather than with parties of the left. In Germany, for example, the Green Party prefers an alliance with the CDU [the Christian Democratic party of Chancellor Angela Merkel] to an alliance with Die Linke, which is a harder left party. In France it was the same thing. All of this seems very worrying to me. It is necessary to free the ecological agenda from this tendency to serve as justification for a renewed liberalism, and to put it back where it has always belonged, namely as a fundamental element of the left’s reflection on the deleterious character of the development processes of advanced capitalism.

How can the new left-wing thinking articulate a different philosophical outlook on the productivist use of nature that characterizes neo-developmentalism here in Brazil?

I recognize that this productivism toward nature was also very present in certain sectors of the left which, for a long time, understood nature as a source of resources and nothing more. Just remember that environmental policy in the communist countries was catastrophic. This even has a theoretical basis; it comes from a reading of Marxist thought in which nature was a reified discourse, with no ontological reality of its own. Ultimately, nature was the fruit of human labor, so human intervention in nature was justified in advance, without any great contradiction. But I believe that from the point of view of today’s left there is a tacit awareness of the centrality of the ecological agenda. More than a few philosophers in the twentieth century warned us about the negative impact of reducing our relationship with nature to its eminently technical dimension. However much technical development may seem to assure us of the domination of nature, the very fact of understanding the human relationship with nature under the sign of domination is already a serious problem. So the idea that, yes, we live in a country with greater development needs because there are urgent demands of social inclusion does not invalidate the fact that we are inside a process of reflection on what social wealth means. Does social wealth mean having a given set of consumer goods, having individual transport, having an extractivist relationship with natural energy sources? Or does it mean being able to create a model of relationship with nature that fundamentally guarantees quality of life? That is a fine question that only the ecological debate has been able to pose.

Just as in urban movements such as Occupy, does the ecological agenda outline a horizon in which another model of society is possible, increasingly directing its critique at the power of the financial system in order to block it?

The ecological agenda strikes the model in its clearest economic sphere when it affirms that we do not want a situation in which all economic agents are subordinated to the interests of a half-dozen multinationals that control not only the structure of production but also the development of technology. When people talk about family farming, what does that mean? That, as an economic model, it does not allow a brutal concentration of land, technology and inputs. To insist on family farming is, among other things, to insist on the radical dispersal of ownership not only of land but of goods and techniques. Because if that does not happen, you get not only very brutal demographic consequences, such as the swelling of urban peripheries, but also a situation in which the creativity inherent in the dispersal of techniques is lost. Thousands of producers will not produce the same things, nor under the same conditions.

For example?

For example, when these ecological questions are linked to the problem of food sovereignty. The fact that an agricultural policy goes on completely eliminating food diversity is not just a question of safeguarding traditions; I would be the last person to make an abstract defense of the particularity of traditions here. Among other things, we must recognize that tradition has a dimension of experience that will be very important to us when we are able to understand how food knowledge was constituted and what it guarantees. There is a very strong monopolistic tendency; in recent decades we have been seeing something that lies at the base of the Marxist tradition, the idea that a moment will come when the very notion of competition begins to disappear. This process of concentration takes the relationship with nature by storm, in the most brutal way possible. All the peasant movements, such as Via Campesina, insist that there is not only an economic but also a social risk in allowing agricultural activities to be concentrated in the hands of multinationals. Societies will pay dearly if they fail to block this process.

Picking up on this example of Via Campesina, there are more and more accounts of traditional populations cornered by this development model, and yet these quite concrete and verifiable accounts are delegitimized…

There is an attempt to disqualify these resistances as a kind of archaism. It is as if they were saying, “you need to understand that you have an absolutely romantic view of the world.” At the end of the day, it is a discourse that condemns them as a ‘critique of the Enlightenment’. The attempt to turn these struggles into a kind of supreme proof of the conservatism of certain populations says a lot, because these are in fact the most vulnerable populations; they know that when these companies arrive, they will simply be swept aside. When Petrobrás arrives to explore for oil in the basins, the lives of the fishermen are the last thing it will think about. “Can you imagine worrying about fish when the country wants to become a great oil power?” That is the perspective they want to sell, but a fundamental question for the left is knowing how to defend the most vulnerable sectors of society. There is a rhetorical model that seeks to make us believe that every resistance is, at bottom, a refusal of progress. I think it is important to restate clearly what ‘progress’ means in this context. Progress is supposed to meet certain fundamental demands of well-being. Scientific progress is not simply a process of dominating nature, but also a process of optimizing human well-being. Yet this so-called ‘progress’ promises populations a better quality of life and ends up producing the opposite. For that inversion not to occur, a drastic reconstitution of our models of relationship with nature is necessary. And the interesting thing is that, in that process, another consciousness of social organization is born.

* Interview conducted by Maíra Mathias for Poli magazine no. 24, July/August 2012

** Interview shared by the Escola Politécnica de Saúde Joaquim Venâncio (EPSJV/Fiocruz), published by EcoDebate, September 6, 2012

[ EcoDebate content is “Copyleft” and may be copied, reproduced and/or distributed, provided credit is given to the author, to EcoDebate and, where applicable, to the primary source of the information ]

Bits of Mystery DNA, Far From ‘Junk,’ Play Crucial Role (N.Y.Times)

By GINA KOLATA

Published: September 5, 2012

Among the many mysteries of human biology is why complex diseases like diabetes, high blood pressure and psychiatric disorders are so difficult to predict and, often, to treat. An equally perplexing puzzle is why one individual gets a disease like cancer or depression, while an identical twin remains perfectly healthy.

Béatrice de Géa for The New York Times. “It is like opening a wiring closet and seeing a hairball of wires,” Mark Gerstein of Yale University said of the DNA intricacies.

Now scientists have discovered a vital clue to unraveling these riddles. The human genome is packed with at least four million gene switches that reside in bits of DNA that once were dismissed as “junk” but that turn out to play critical roles in controlling how cells, organs and other tissues behave. The discovery, considered a major medical and scientific breakthrough, has enormous implications for human health because many complex diseases appear to be caused by tiny changes in hundreds of gene switches.

The findings, which are the fruit of an immense federal project involving 440 scientists from 32 laboratories around the world, will have immediate applications for understanding how alterations in the non-gene parts of DNA contribute to human diseases, which may in turn lead to new drugs. They can also help explain how the environment can affect disease risk. In the case of identical twins, small changes in environmental exposure can slightly alter gene switches, with the result that one twin gets a disease and the other does not.

As scientists delved into the “junk” — parts of the DNA that are not actual genes containing instructions for proteins — they discovered a complex system that controls genes. At least 80 percent of this DNA is active and needed. The result of the work is an annotated road map of much of this DNA, noting what it is doing and how. It includes the system of switches that, acting like dimmer switches for lights, control which genes are used in a cell and when they are used, and determine, for instance, whether a cell becomes a liver cell or a neuron.

“It’s Google Maps,” said Eric Lander, president of the Broad Institute, a joint research endeavor of Harvard and the Massachusetts Institute of Technology. In contrast, the project’s predecessor, the Human Genome Project, which determined the entire sequence of human DNA, “was like getting a picture of Earth from space,” he said. “It doesn’t tell you where the roads are, it doesn’t tell you what traffic is like at what time of the day, it doesn’t tell you where the good restaurants are, or the hospitals or the cities or the rivers.”

The new result “is a stunning resource,” said Dr. Lander, who was not involved in the research that produced it but was a leader in the Human Genome Project. “My head explodes at the amount of data.”

The discoveries were published on Wednesday in six papers in the journal Nature and in 24 papers in Genome Research and Genome Biology. In addition, The Journal of Biological Chemistry is publishing six review articles, and Science is publishing yet another article.

Human DNA is “a lot more active than we expected, and there are a lot more things happening than we expected,” said Ewan Birney of the European Molecular Biology Laboratory-European Bioinformatics Institute, a lead researcher on the project.

In one of the Nature papers, researchers link the gene switches to a range of human diseases — multiple sclerosis, lupus, rheumatoid arthritis, Crohn’s disease, celiac disease — and even to traits like height. In large studies over the past decade, scientists found that minor changes in human DNA sequences increase the risk that a person will get those diseases. But those changes were in the junk, now often referred to as the dark matter — they were not changes in genes — and their significance was not clear. The new analysis reveals that a great many of those changes alter gene switches and are highly significant.

“Most of the changes that affect disease don’t lie in the genes themselves; they lie in the switches,” said Michael Snyder, a Stanford University researcher for the project, called Encode, for Encyclopedia of DNA Elements.

And that, said Dr. Bradley Bernstein, an Encode researcher at Massachusetts General Hospital, “is a really big deal.” He added, “I don’t think anyone predicted that would be the case.”

The discoveries also can reveal which genetic changes are important in cancer, and why. As they began determining the DNA sequences of cancer cells, researchers realized that most of the thousands of DNA changes in cancer cells were not in genes; they were in the dark matter. The challenge is to figure out which of those changes are driving the cancer’s growth.

“These papers are very significant,” said Dr. Mark A. Rubin, a prostate cancer genomics researcher at Weill Cornell Medical College. Dr. Rubin, who was not part of the Encode project, added, “They will definitely have an impact on our medical research on cancer.”

In prostate cancer, for example, his group found mutations in important genes that are not readily attacked by drugs. But Encode, by showing which regions of the dark matter control those genes, gives another way to attack them: target those controlling switches.

Dr. Rubin, who also used the Google Maps analogy, explained: “Now you can follow the roads and see the traffic circulation. That’s exactly the same way we will use these data in cancer research.” Encode provides a road map with traffic patterns for alternate ways to go after cancer genes, he said.

Dr. Bernstein said, “This is a resource, like the human genome, that will drive science forward.”

The system, though, is stunningly complex, with many redundancies. Just the idea of so many switches was almost incomprehensible, Dr. Bernstein said.

There also is a sort of DNA wiring system that is almost inconceivably intricate.

“It is like opening a wiring closet and seeing a hairball of wires,” said Mark Gerstein, an Encode researcher from Yale. “We tried to unravel this hairball and make it interpretable.”

There is another sort of hairball as well: the complex three-dimensional structure of DNA. Human DNA is such a long strand — about 10 feet of DNA stuffed into a microscopic nucleus of a cell — that it fits only because it is tightly wound and coiled around itself. When they looked at the three-dimensional structure — the hairball — Encode researchers discovered that small segments of dark-matter DNA are often quite close to genes they control. In the past, when they analyzed only the uncoiled length of DNA, those controlling regions appeared to be far from the genes they affect.

The project began in 2003, as researchers began to appreciate how little they knew about human DNA. In recent years, some began to find switches in the 99 percent of human DNA that is not genes, but they could not fully characterize or explain what a vast majority of it was doing.

The thought before the start of the project, said Thomas Gingeras, an Encode researcher from Cold Spring Harbor Laboratory, was that only 5 to 10 percent of the DNA in a human being was actually being used.

The big surprise was not only that almost all of the DNA is used but also that a large proportion of it is gene switches. Before Encode, said Dr. John Stamatoyannopoulos, a University of Washington scientist who was part of the project, “if you had said half of the genome and probably more has instructions for turning genes on and off, I don’t think people would have believed you.”

By the time the National Human Genome Research Institute, part of the National Institutes of Health, embarked on Encode, major advances in DNA sequencing and computational biology had made it conceivable to try to understand the dark matter of human DNA. Even so, the analysis was daunting — the researchers generated 15 trillion bytes of raw data. Analyzing the data required the equivalent of more than 300 years of computer time.
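
To get a rough feel for that scale, the back-of-the-envelope arithmetic below (ours, not the article's) converts the quoted totals into wall-clock time for a few assumed cluster sizes; the core counts are illustrative assumptions only, since the article does not say how the computation was actually distributed.

```python
# Rough scale of the ENCODE analysis described above (illustrative arithmetic only).
raw_bytes = 15e12                 # "more than 15 trillion bytes" of raw data
cpu_years = 300                   # "more than 300 years" of computer time
cpu_hours = cpu_years * 365 * 24

# Hypothetical cluster sizes; the article gives no breakdown of the real hardware.
for cores in (100, 1_000, 10_000):
    days = cpu_hours / cores / 24
    print(f"{raw_bytes / 1e12:.0f} TB analyzed on {cores:>6} cores: "
          f"about {days:,.0f} days of wall-clock time")
```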

Just organizing the researchers and coordinating the work was a huge undertaking. Dr. Gerstein, one of the project’s leaders, has produced a diagram of the authors with their connections to one another. It looks nearly as complicated as the wiring diagram for the human DNA switches. Now that part of the work is done, and the hundreds of authors have written their papers.

“There is literally a flotilla of papers,” Dr. Gerstein said. But, he added, more work has yet to be done — there are still parts of the genome that have not been figured out.

That, though, is for the next stage of Encode.

*   *   *

Published: September 5, 2012

Rethinking ‘Junk’ DNA

A large group of scientists has found that so-called junk DNA, which makes up most of the human genome, does much more than previously thought.

GENES: Each human cell contains about 10 feet of DNA, coiled into a dense tangle. But only a very small percentage of DNA encodes genes, which control inherited traits like eye color, blood type and so on.

JUNK DNA: Stretches of DNA around and between genes seemed to do nothing, and were called junk DNA. But now researchers think that the junk DNA contains a large number of tiny genetic switches, controlling how genes function within the cell.

REGULATION: The many genetic regulators seem to be arranged in a complex and redundant hierarchy. Scientists are only beginning to map and understand this network, which regulates how cells, organs and tissues behave.

DISEASE: Errors or mutations in genetic switches can disrupt the network and lead to a range of diseases. The new findings will spur further research and may lead to new drugs and treatments.

 

Evolution could explain the placebo effect (New Scientist)

06 September 2012 by Colin Barras

Magazine issue 2881

ON THE face of it, the placebo effect makes no sense. Someone suffering from a low-level infection will recover just as nicely whether they take an active drug or a simple sugar pill. This suggests people are able to heal themselves unaided – so why wait for a sugar pill to prompt recovery?

New evidence from a computer model offers a possible evolutionary explanation, and suggests that the immune system has an on-off switch controlled by the mind.

It all starts with the observation that something similar to the placebo effect occurs in many animals, says Peter Trimmer, a biologist at the University of Bristol, UK. For instance, Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.

Likewise, those people who think they are taking a drug but are really receiving a placebo can have a response which is twice that of those who receive no pills (Annals of Family Medicine, doi.org/cckm8b). In Siberian hamsters and people, intervention creates a mental cue that kick-starts the immune response.

There is a simple explanation, says Trimmer: the immune system is costly to run – so costly that a strong and sustained response could dangerously drain an animal’s energy reserves. In other words, as long as the infection is not lethal, it pays to wait for a sign that fighting it will not endanger the animal in other ways.

Nicholas Humphrey, a retired psychologist formerly at the London School of Economics, first proposed this idea a decade ago, but only now has evidence to support it emerged from a computer model designed by Trimmer and his colleagues.

According to Humphrey’s picture, the Siberian hamster subconsciously acts on a cue that it is summer because food supplies to sustain an immune response are plentiful at that time of year. We subconsciously respond to treatment – even a sham one – because it comes with assurances that it will weaken the infection, allowing our immune response to succeed rapidly without straining the body’s resources.

Trimmer’s simulation is built on this assumption – that animals need to spend vital resources on fighting low-level infections. The model revealed that, in challenging environments, animals lived longer and sired more offspring if they endured infections without mounting an immune response. In more favourable environments, it was best for animals to mount an immune response and return to health as quickly as possible (Evolution and Human Behavior, doi.org/h8p). The results show a clear evolutionary benefit to switching the immune system on and off depending on environmental conditions.
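
The paper's actual model is not reproduced in the article, but the trade-off it describes can be caricatured in a few lines. The sketch below is our own toy version under stated assumptions, not Trimmer's published simulation: an animal carries a mild season-long infection, mounting an immune response drains energy but may clear it, reserves that hit zero mean death, and end-of-season reserves stand in for fitness. The specific numbers are invented purely to make the qualitative pattern visible.

```python
import random

def final_reserves(mount_response, food_per_day, days=150, start_energy=5.0, rng=None):
    """Toy model: return end-of-season energy reserves (a crude fitness proxy),
    or 0.0 if the animal starves along the way."""
    rng = rng or random.Random()
    energy, infected = start_energy, True
    for _ in range(days):
        energy += food_per_day - 1.0     # daily intake minus baseline metabolism
        if infected:
            energy -= 0.3                # chronic drag of the untreated infection
            if mount_response:
                energy -= 1.5            # the response itself is expensive...
                if rng.random() < 0.08:  # ...but may clear the infection each day
                    infected = False
        if energy <= 0:
            return 0.0                   # starved: no fitness this season
    return energy

def average(mount_response, food_per_day, trials=5000):
    rng = random.Random(1)
    return sum(final_reserves(mount_response, food_per_day, rng=rng)
               for _ in range(trials)) / trials

for label, food in (("harsh", 1.4), ("favourable", 3.0)):
    print(f"{label:>10} environment: "
          f"tolerate={average(False, food):6.1f}  respond={average(True, food):6.1f}")
```

With these invented parameters, tolerating the infection comes out ahead in the harsh environment (responding risks starvation before the infection clears), while mounting a response comes out ahead in the favourable one, which is the qualitative result the article attributes to the model.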

“I’m pleased to see that my theory stands up to computational modelling,” says Humphrey. If the idea is right, he adds, it means we have misunderstood the nature of placebos. Farming and other innovations in the past 10,000 years mean that many people have a stable food supply and can safely mount a full immune response at any time – but our subconscious switch has not yet adapted to this. A placebo tricks the mind into thinking it is an ideal time to switch on an immune response, says Humphrey.

Paul Enck at the University of Tübingen in Germany says it is an intriguing idea, but points out that there are many different placebo responses, depending on the disease. It is unlikely that a single mechanism explains them all, he says.

Embrapa sends corn and rice seeds to the Svalbard vault in Norway (O Globo)

JC e-mail 4577, September 5, 2012.

The Nordic vault is the most secure in the world, built to withstand climate catastrophes and a nuclear explosion.

This week Embrapa is sending 264 representative samples of corn seed and 541 of rice seed to the Svalbard Global Seed Vault in Norway, as part of the agreement signed with the country's Royal Ministry of Agriculture and Food in 2008. What will be sent to the Norwegian gene bank are the core collections of rice and corn, that is, a limited group of accessions derived from a plant collection, chosen to represent the genetic variability of the entire collection. Traditionally, core collections are established at around 10% of the accessions of the whole original collection and capture approximately 70% of its genetic diversity.
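
The article does not describe how Embrapa's core collections were chosen, but a common way to build one is a greedy search: keep adding the accession that contributes the most genetic markers not yet represented until the size budget (roughly 10% of the collection) is reached. The sketch below is a generic illustration of that idea with made-up marker data; it is not Embrapa's actual procedure, and the ~70% coverage figure is a property of real collections, not of this toy example.

```python
def greedy_core_collection(accessions, budget):
    """Pick `budget` accessions that together cover as many distinct
    marker alleles as possible (a simple stand-in for genetic diversity)."""
    core, covered = [], set()
    remaining = dict(accessions)
    while remaining and len(core) < budget:
        # choose the accession that adds the most alleles not yet covered
        best = max(remaining, key=lambda name: len(remaining[name] - covered))
        covered |= remaining.pop(best)
        core.append(best)
    return core, covered

# Tiny made-up example: each accession is a set of observed marker alleles.
accessions = {
    "acc01": {"A1", "B2", "C1"}, "acc02": {"A1", "B1", "C2"},
    "acc03": {"A2", "B2", "C3"}, "acc04": {"A3", "B1", "C1"},
    "acc05": {"A1", "B3", "C2"}, "acc06": {"A2", "B1", "C1"},
}
core, covered = greedy_core_collection(accessions, budget=2)
all_alleles = set().union(*accessions.values())
print(core, f"covers {len(covered)} of {len(all_alleles)} alleles")
```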

The choice of these crops follows one of the Svalbard vault's recommendations regarding relevance to food security and sustainable agriculture. Although they are not crops native to Brazil, they have been cultivated in the country for centuries and show hardiness and adaptability to national conditions. The next crop to be sent to the Norwegian vault will be beans, which should happen by the end of 2012.

Sending samples to Svalbard is an additional safeguard, since the Nordic vault is the most secure in the world, built to withstand climate catastrophes and even a nuclear explosion. The vault has capacity for 4.5 million seed samples. The complex comprises three maximum-security chambers located at the end of a 125-meter tunnel inside a mountain on a small island of the Svalbard archipelago, at latitude 78° N, near the North Pole.

The seeds are stored at 20°C below zero in hermetically sealed packages, kept in boxes stored on shelves. The facility is surrounded by the glacial Arctic climate, which keeps temperatures low even if the power supply fails. The low temperature and humidity ensure low metabolic activity, keeping the seeds viable for a millennium or more.

Kinsey Reporter: Free App Allows Public to Anonymously Report, Share Information On Sexual Behavior (Science Daily)

ScienceDaily (Sep. 5, 2012) — Indiana University has released Kinsey Reporter, a global mobile survey platform for collecting and reporting anonymous data about sexual and other intimate behaviors. The pilot project allows citizen observers around the world to use free applications now available for Apple and Android mobile platforms to not only report on sexual behavior and experiences, but also to share, explore and visualize the accumulated data.

The Kinsey Reporter platform is available free from Apple iOS and Google Play (for Android) online stores. Reports made by anonymous citizen scientists will be used for research and shared with the public at the Kinsey Reporter website. (Credit: Image courtesy of Indiana University)

“People are natural observers. It’s part of being social, and using mobile apps is an excellent way to involve citizen scientists,” said Julia Heiman, director of The Kinsey Institute for Research in Sex, Gender and Reproduction. “We expect to get new insights into sexuality and relationships today. What do people notice, what are they involved in, and what can they relate to us about their lives and their communities?”

The project will collect anonymous data and then aggregate and share it openly. Kinsey Reporter is a joint project between The Kinsey Institute and the Center for Complex Networks and Systems Research, or CNetS, which is part of the IU School of Informatics and Computing and the Pervasive Technology Institute. Both Kinsey and CNetS are based on the IU Bloomington campus.

CNetS director Filippo Menczer called development of the citizen reporting platform an opportunity to gather information on important issues that may have been difficult to examine in the past.

“This new platform will allow us to explore issues that have been challenging to study until now, such as the prevalence of unreported sexual violence in different parts of the world, or the correlation between various sexual practices like condom use, for example, and the cultural, political, religious or health contexts in particular geographical areas. These were some of our initial motivations for the project,” he said.

Users simply download the free app and begin contributing observed information on topics such as sexual activity, public displays of affection, flirting, unwanted experiences and birth control use. Even though no information identifying users submitting reports is collected or stored, the time and general location of the report are collected and entered into the database. Users also have the option of selecting their own geographic preference for the report by choosing city/town, state/region or country.

Surveys will change over time, and users can view aggregated reports by geographic region via interactive maps, timelines or charts. All of these reporting venues can be manipulated with filters that remove or add data based on specific survey topics and questions selected by the user.

Both Heiman and Menczer said The Kinsey Institute’s longstanding seminal studies of sexual behaviors created a perfect synergy with research going on at CNetS related to mining big data crowd-sourced from mobile social media. The sensitive domain — sexual relations — added an intriguing challenge in finding a way to share useful data with the community while protecting the privacy and anonymity of the reporting volunteers, they added.

Reports are transmitted to Kinsey Reporter using a secure, encrypted protocol, and the only data collected are a timestamp, the approximate geo-location selected by the user, and the tags the user chooses in response to various survey questions. The protections and anonymity provided to those responding to surveys allowed IU’s Institutional Review Board to classify the research as “exempt from review,” which allows the data to be used for research and shared without requiring informed consent from users of the apps.
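
The article does not publish the app's actual wire format, but the fields it describes (a timestamp, a user-chosen coarse location, and the selected survey tags, with nothing identifying the reporter) are easy to picture as a small record sent over an encrypted connection. The sketch below is purely illustrative; the field names and the endpoint URL are our assumptions, not Kinsey Reporter's real API.

```python
import json
import time
import urllib.request

def build_report(tags, location, granularity="city"):
    """Assemble only the data the article says is collected: a timestamp,
    a coarse user-chosen location, and the chosen survey tags."""
    return {
        "timestamp": int(time.time()),   # when the report was made
        "location": location,            # e.g. "Bloomington, IN" or just "US"
        "granularity": granularity,      # "city", "state", or "country"
        "tags": sorted(tags),            # answers to the survey questions
        # deliberately no user id, device id, or precise GPS coordinates
    }

def submit(report, endpoint="https://example.org/kinsey-reporter/submit"):
    """Send the report over HTTPS so it is encrypted in transit.
    The endpoint here is a placeholder, not the project's real URL."""
    data = json.dumps(report).encode("utf-8")
    request = urllib.request.Request(
        endpoint, data=data, headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(request)

report = build_report({"public_display_of_affection", "flirting"},
                      location="Bloomington, IN")
print(json.dumps(report, indent=2))
```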

The Kinsey Reporter platform is now in public beta release. Apps are available for free download at both the Apple iOS and Android app stores. Accompanying the app release are a Kinsey Reporter website, a Twitter feed and a Facebook page. The four resources also provide links to information about sexuality, such as blogs and podcasts from the Kinsey Confidential website. YouTube videos on “What Is the Kinsey Reporter App” and “Making the Kinsey Reporter App” are also available for viewing.

The Kinsey Institute receives support from the Office of the Vice Provost for Research at IU Bloomington, which is dedicated to supporting ongoing faculty research and creative activity and developing new multidisciplinary initiatives to enhance opportunities for federal, state and private research funding.

First Holistic View of How Human Genome Actually Works: ENCODE Study Produces Massive Data Set (Science Daily)

ScienceDaily (Sep. 5, 2012) — The Human Genome Project produced an almost complete order of the 3 billion pairs of chemical letters in the DNA that embodies the human genetic code — but little about the way this blueprint works. Now, after a multi-year concerted effort by more than 440 researchers in 32 labs around the world, a more dynamic picture gives the first holistic view of how the human genome actually does its job.

William Noble, professor of genome sciences and computer science, in the data center at the William H. Foege Building. Noble, an expert on machine learning, and his team designed artificial intelligence programs to analyze ENCODE data. These computer programs can learn from experience, recognize patterns, and organize information into categories understandable to scientists. The center houses systems for a wide variety of genetic research. The computer center has the capacity to store and analyze a tremendous amount of data, the equivalent of a 670-page autobiography of each person on earth, uncompressed. The computing resources analyze over 4 petabytes of genomic data a year. (Credit: Clare McLean, Courtesy of University of Washington)

During the new study, researchers linked more than 80 percent of the human genome sequence to a specific biological function and mapped more than 4 million regulatory regions where proteins specifically interact with the DNA. These findings represent a significant advance in understanding the precise and complex controls over the expression of genetic information within a cell. The findings bring into much sharper focus the continually active genome in which proteins routinely turn genes on and off using sites that are sometimes at great distances from the genes themselves. They also identify where chemical modifications of DNA influence gene expression and where various functional forms of RNA, a form of nucleic acid related to DNA, help regulate the whole system.

“During the early debates about the Human Genome Project, researchers had predicted that only a few percent of the human genome sequence encoded proteins, the workhorses of the cell, and that the rest was junk. We now know that this conclusion was wrong,” said Eric D. Green, M.D., Ph.D., director of the National Human Genome Research Institute (NHGRI), a part of the National Institutes of Health. “ENCODE has revealed that most of the human genome is involved in the complex molecular choreography required for converting genetic information into living cells and organisms.”

NHGRI organized the research project producing these results; it is called the Encyclopedia of DNA Elements, or ENCODE. Launched in 2003, ENCODE’s goal of identifying all of the genome’s functional elements seemed just as daunting as sequencing that first human genome. ENCODE was launched as a pilot project to develop the methods and strategies needed to produce results and did so by focusing on only 1 percent of the human genome. By 2007, NHGRI concluded that the technology had sufficiently evolved for a full-scale project, in which the institute invested approximately $123 million over five years. In addition, NHGRI devoted about $40 million to the ENCODE pilot project, plus approximately $125 million to ENCODE-related technology development and model organism research since 2003.

The scale of the effort has been remarkable. Hundreds of researchers across the United States, United Kingdom, Spain, Singapore and Japan performed more than 1,600 sets of experiments on 147 types of tissue with technologies standardized across the consortium. The experiments relied on innovative uses of next-generation DNA sequencing technologies, which had only become available around five years ago, due in large part to advances enabled by NHGRI’s DNA sequencing technology development program. In total, ENCODE generated more than 15 trillion bytes of raw data and consumed the equivalent of more than 300 years of computer time to analyze.

“We’ve come a long way,” said Ewan Birney, Ph.D., of the European Bioinformatics Institute, in the United Kingdom, and lead analysis coordinator for the ENCODE project. “By carefully piecing together a simply staggering variety of data, we’ve shown that the human genome is simply alive with switches, turning our genes on and off and controlling when and where proteins are produced. ENCODE has taken our knowledge of the genome to the next level, and all of that knowledge is being shared openly.”

The ENCODE Consortium placed the resulting data sets as soon as they were verified for accuracy, prior to publication, in several databases that can be freely accessed by anyone on the Internet. These data sets can be accessed through the ENCODE project portal (www.encodeproject.org) as well as at the University of California, Santa Cruz genome browser, http://genome.ucsc.edu/ENCODE/; the National Center for Biotechnology Information, http://www.ncbi.nlm.nih.gov/geo/info/ENCODE.html; and the European Bioinformatics Institute, http://useast.ensembl.org/Homo_sapiens/encode.html?redirect=mirror;source=www.ensembl.org.

“The ENCODE catalog is like Google Maps for the human genome,” said Elise Feingold, Ph.D., an NHGRI program director who helped start the ENCODE Project. “Simply by selecting the magnification in Google Maps, you can see countries, states, cities, streets, even individual intersections, and by selecting different features, you can get directions, see street names and photos, and get information about traffic and even weather. The ENCODE maps allow researchers to inspect the chromosomes, genes, functional elements and individual nucleotides in the human genome in much the same way.”

The coordinated publication set includes one main integrative paper and five related papers in the journal Nature; 18 papers in Genome Research; and six papers in Genome Biology. The ENCODE data are so complex that the three journals have developed a pioneering way to present the information in an integrated form that they call threads.

“Because ENCODE has generated so much data, we, together with the ENCODE Consortium, have introduced a new way to enable researchers to navigate through the data,” said Magdalena Skipper, Ph.D., senior editor at Nature, which produced the freely available publishing platform on the Internet.

Since the same topics were addressed in different ways in different papers, the new website, www.nature.com/encode, will allow anyone to follow a topic through all of the papers in the ENCODE publication set by clicking on the relevant thread at the Nature ENCODE explorer page. For example, thread number one compiles figures, tables, and text relevant to genetic variation and disease from several papers and displays them all on one page. ENCODE scientists believe this will illuminate many biological themes emerging from the analyses.

In addition to the threaded papers, six review articles are being published in the Journal of Biological Chemistry and two related papers in Science and one in Cell.

The ENCODE data are rapidly becoming a fundamental resource for researchers to help understand human biology and disease. More than 100 papers using ENCODE data have been published by investigators who were not part of the ENCODE Project, but who have used the data in disease research. For example, many regions of the human genome that do not contain protein-coding genes have been associated with disease. Instead, the disease-linked genetic changes appear to occur in vast tracts of sequence between genes where ENCODE has identified many regulatory sites. Further study will be needed to understand how specific variants in these genomic areas contribute to disease.

“We were surprised that disease-linked genetic variants are not in protein-coding regions,” said Mike Pazin, Ph.D., an NHGRI program director working on ENCODE. “We expect to find that many genetic changes causing a disorder are within regulatory regions, or switches, that affect how much protein is produced or when the protein is produced, rather than affecting the structure of the protein itself. The medical condition will occur because the gene is aberrantly turned on or turned off or abnormal amounts of the protein are made. Far from being junk DNA, this regulatory DNA clearly makes important contributions to human health and disease.”

Identifying regulatory regions will also help researchers explain why different types of cells have different properties. For example, why do muscle cells generate force while liver cells break down food? Scientists know that muscle cells turn on some genes that only work in muscle, but it has not been previously possible to examine the regulatory elements that control that process. ENCODE has laid a foundation for these kinds of studies by examining more than 140 of the hundreds of cell types found in the human body and identifying many of the cell type-specific control elements.

Despite the enormity of the dataset described in this historic collection of publications, it does not comprehensively describe all of the functional genomic elements in all of the different types of cells in the human body. NHGRI plans to invest in additional ENCODE-related research for at least another four years. During the next phase, ENCODE will increase the depth of the catalog with respect to the types of functional elements and cell types studied. It will also develop new tools for more sophisticated analyses of the data.

Journal References:

  1. Magdalena Skipper, Ritu Dhand, Philip Campbell. Presenting ENCODE. Nature, 2012; 489 (7414): 45. DOI: 10.1038/489045a
  2. Joseph R. Ecker, Wendy A. Bickmore, Inês Barroso, Jonathan K. Pritchard, Yoav Gilad, Eran Segal. Genomics: ENCODE explained. Nature, 2012; 489 (7414): 52. DOI: 10.1038/489052a
  3. The ENCODE Project Consortium. An integrated encyclopedia of DNA elements in the human genome. Nature, 2012; 489 (7414): 57. DOI: 10.1038/nature11247

Violent Video Games Not So Bad When Players Cooperate (Science Daily)

ScienceDaily (Sep. 4, 2012) — New research suggests that violent video games may not make players more aggressive — if they play cooperatively with other people.

In two studies, researchers found that college students who teamed up to play violent video games later showed more cooperative behavior, and sometimes fewer signs of aggression, than students who played the games competitively.

The results suggest that it is too simplistic to say violent video games are always bad for players, said David Ewoldsen, co-author of the studies and professor of communication at Ohio State University.

“Clearly, research has established there are links between playing violent video games and aggression, but that’s an incomplete picture,” Ewoldsen said.

“Most of the studies finding links between violent games and aggression were done with people playing alone. The social aspect of today’s video games can change things quite a bit.”

The new research suggests playing a violent game with a teammate changes how people react to the violence.

“You’re still being very aggressive, you’re still killing people in the game — but when you cooperate, that overrides any of the negative effects of the extreme aggression,” said co-author John Velez, a graduate student in communication at Ohio State.

One study was recently published online in the journal Communication Research, and will appear in a future print edition. The second related study was published recently in the journal Cyberpsychology, Behavior, and Social Networking.

The CBSN study involved 119 college students who were placed into four groups to play the violent video game Halo II with a partner. The groups differed in whether they competed or cooperated in playing the game.

First, all participants filled out a survey about their video game history and a measure of their aggressiveness.

Those in direct competition played in multiplayer mode and were told that their task was to kill their opponent more times than they were killed.

Those in indirect competition played in single-player mode, but were told their task was to beat their opponent by getting further in the game.

In the cooperative condition, participants were told to get as far as they could through the game by working with their partner in Halo II’s cooperative campaign mode. In this case, the pair worked together to defeat computer-controlled enemies.

The final group simply filled out the measures and played the game at the end of the study. Their game playing was not recorded.

After playing the violent video game, the same pairs of participants who played with or against each other took part in a real-life game where they had the opportunity to cooperate or compete with each other.

In this game, they played multiple rounds in which they were given dimes that they could keep or share with their partner. The researchers were looking to see if they engaged in “tit for tat” behavior, in which the players mirrored the behaviors of their partner: if your partner acts cooperatively toward you, you do the same in return. Tit for tat behavior is seen by researchers as a precursor to cooperation.
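To make the “tit for tat” measure concrete, here is a minimal sketch under invented assumptions: a simplified two-player dime game in which each round a player either shares or keeps, with a tit-for-tat player that simply copies the partner's previous move. It illustrates the concept, not the coding scheme the researchers actually used.

```python
# Minimal, hypothetical sketch of a tit-for-tat player in a repeated dime-sharing game.
# Illustration only: the study's actual scoring procedure is not described here.

def tit_for_tat(partner_history):
    """Share on the first round, then copy the partner's previous move."""
    return "share" if not partner_history else partner_history[-1]

def always_keep(partner_history):
    """A non-cooperative baseline that keeps every dime."""
    return "keep"

def play_rounds(strategy_a, strategy_b, rounds=10):
    """Play two strategies against each other; each sees only the other's past moves."""
    moves_a, moves_b = [], []
    for _ in range(rounds):
        a = strategy_a(moves_b)
        b = strategy_b(moves_a)
        moves_a.append(a)
        moves_b.append(b)
    return moves_a, moves_b

if __name__ == "__main__":
    a, b = play_rounds(tit_for_tat, always_keep)
    # After the first round, the tit-for-tat player mirrors the partner's non-cooperation.
    print(list(zip(a, b)))
```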

The results showed that participants who played the video game cooperatively were more likely than those who competed to show cooperative tendencies in this later real-life game.

“These findings suggest video game research needs to consider not only the content of the game but also how video game players are playing the game,” Velez said.

The second study, published in Communication Research, extended the findings by showing that cooperating in playing a violent video game can even unite people from rival groups — in this case, fans of Ohio State and those of their bitter rival, the University of Michigan.

This study involved 80 Ohio State students who, when they came to the lab for the experiment, were paired with a person who they thought was another student participant. In fact, it was one of the experimenters who was wearing an Ohio State t-shirt — or one from the rival University of Michigan.

One of the researchers made sure to point out the t-shirt to the student participant.

The student and confederate then played the highly realistic and violent first-person-shooter video game Unreal Tournament III together — either as teammates or as rivals.

After playing the video game, the participants played the same real-life game used in the previous study with their supposed partner, who was really one of the researchers.

They also completed tasks that measured how aggressive they felt, and their aggressive tendencies.

The results showed the power of cooperatively playing violent video games in reducing aggressive thoughts — and even overcoming group differences.

As in the first study, players who cooperated in playing the video game later showed more cooperation than did those who competed against each other.

It even worked when Ohio State participants thought they were playing with a rival from the University of Michigan.

“The cooperative play just wiped out any effect of who you were playing with,” Velez said. “Ohio State students happily cooperated with Michigan fans.”

Also, those participants who played cooperatively showed less aggressive tendencies afterwards than those who played competitively, at least at first. In fact, those who played competitively with a rival actually showed less aggression than those who played with a supporter of their own team.

“If you’re playing with a rival, and that rival is cooperating with you, that violates your expectations — you’re surprised by their cooperation and that makes you even more willing to cooperate,” Ewoldsen said.

Eventually, even those who competed with each other in the video games started cooperating with each other in the real-life games afterwards.

“The point is that the way you act in the real world very quickly overrides anything that is going on in the video games,” Ewoldsen said. “Video games aren’t controlling who we are.”

These results should encourage researchers to study not only how the content of violent video games affects players, but also how the style of play has an impact.

“What is more important: cooperating with another human being, or killing a digital creature?” Ewoldsen said.

“We think that cooperating with another human overrides the effects of playing a violent video game.”

Other authors of the CBSN paper were Cassie Eno of Waldorf College; Bradley Okdie of Ohio State’s Newark campus; Rosanna Guadagno of the University of Alabama; and James DeCoster of the University of Virginia.

Other authors of the Communication Research paper were Chad Mahood and Emily Moyer-Guse, both of Ohio State.

Journal References:

  1. J. A. Velez, C. Mahood, D. R. Ewoldsen, E. Moyer-Guse. Ingroup Versus Outgroup Conflict in the Context of Violent Video Game Play: The Effect of Cooperation on Increased Helping and Decreased Aggression. Communication Research, 2012; DOI: 10.1177/0093650212456202
  2. David R. Ewoldsen, Cassie A. Eno, Bradley M. Okdie, John A. Velez, Rosanna E. Guadagno, Jamie DeCoster. Effect of Playing Violent Video Games Cooperatively or Competitively on Subsequent Cooperative Behavior. Cyberpsychology, Behavior, and Social Networking, 2012; 15 (5): 277. DOI: 10.1089/cyber.2011.0308

Realistically What Might the Future Climate Look Like? (Skeptical Science)

Posted on 31 August 2012 by dana1981

Robert Watson, former Chair of the Intergovernmental Panel on Climate Change (IPCC), recently made headlines by declaring that it is unlikely we will be able to limit global warming to the 2°C ‘danger limit’. This past April, the International Energy Agency similarly warned that we are rapidly running out of time to avoid blowing past 2°C global warming compared to late 19th Century temperatures. The reason for their pessimism is illustrated in the ‘ski slopes’ graphic, which depicts how steep emissions cuts will have to be in order to give ourselves a good chance to stay below the 2°C target, given different peak emissions dates (Figure 1).


Figure 1: Three scenarios, each of which would limit the total global emission of carbon dioxide from fossil-fuel burning and industrial processes to 750 billion tonnes over the period 2010–2050.  Source: German Advisory Council on Global Change, WBGU (2009)

Clearly our CO2 emissions have not yet peaked – in fact they increased by 1 billion tonnes between 2010 and 2011 despite a continued global economic recession; therefore, the green curve is no longer an option. There has also been little progress toward an international climate accord to replace the Kyoto Protocol, which suggests that the blue curve does not represent a likely scenario either – in order to achieve peak emissions in 2015 we would have to take serious steps to reduce emissions today, which we are not doing. The red curve seems the most likely, but the required cuts are so steep that it is unlikely we will be able to achieve them, which means we are indeed likely to surpass the 2°C target.
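To see in rough numbers why a later emissions peak forces steeper cuts, the sketch below applies the 750-billion-tonne budget from Figure 1 to a deliberately simplified pathway: emissions are assumed flat at about 33 billion tonnes of CO2 per year until the peak, then decline in a straight line to zero. The WBGU scenarios use more realistic pathways, so these figures are only illustrative.

```python
# Illustrative carbon-budget arithmetic; a simplification, not the WBGU scenario model.
# Assumes flat annual emissions until the peak year, then a straight-line decline to zero.

BUDGET_GT = 750.0            # cumulative CO2 allowed for 2010-2050, billions of tonnes (Figure 1)
ANNUAL_EMISSIONS_GT = 33.0   # rough annual fossil-fuel CO2 emissions around 2010 (assumption)

def required_decline(peak_year, start_year=2010):
    """Return (years to reach zero, % of peak emissions cut per year) for a given peak year."""
    used_before_peak = ANNUAL_EMISSIONS_GT * (peak_year - start_year)
    remaining = BUDGET_GT - used_before_peak
    if remaining <= 0:
        return None  # the budget is exhausted before emissions even peak
    # Cumulative emissions of a linear decline form a triangle: 0.5 * duration * peak = remaining
    duration = 2.0 * remaining / ANNUAL_EMISSIONS_GT
    return duration, 100.0 / duration

for peak in (2011, 2015, 2020):
    result = required_decline(peak)
    if result:
        years, pct_per_year = result
        print(f"Peak in {peak}: zero emissions in ~{years:.0f} years, cutting ~{pct_per_year:.1f}% of peak per year")
```

Even with this crude model, delaying the peak from 2011 to 2020 roughly doubles the required annual rate of cuts; the exponential pathways in Figure 1 make the penalty for delay steeper still.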

Thus it is worth exploring the question, what would a world with >2°C global surface warming look like?

Global Warming Impacts

The 2007 IPCC Fourth Assessment Report (AR4) summarizes the impacts of various degrees of warming, shown graphically in Figure 2, relative to ~1990 temperatures (~0.6°C above late 19th Century temperatures).


Figure 2: Illustrative examples of global impacts projected for climate changes (and sea level and atmospheric carbon dioxide where relevant) associated with different amounts of increase in global average surface temperature in the 21st century. The black lines link impacts, dotted arrows indicate impacts continuing with increasing temperature. Entries are placed so that the left-hand side of the text indicates the approximate onset of a given impact. Quantitative entries for water stress and flooding represent the additional impacts of climate change relative to the conditions projected across the range of Special Report on Emissions Scenarios (SRES) scenarios. Adaptation to climate change is not included in these estimations. Confidence levels for all statements are high. IPCC AR4 WGII Figure SPM.2.

Some adverse impacts are expected even before we reach the 2°C limit: for example, hundreds of millions of people being subjected to increased water stress, increasing drought at mid-latitudes, increased coral bleaching, increased coastal damage from floods and storms, and increased morbidity and mortality from more frequent and intense heat waves, floods, and droughts. However, these are by and large impacts that we should be able to adapt to, at a cost, but without disastrous consequences.

Once we surpass the 2°C target, the impacts listed above are exacerbated, and some new impacts will occur.  Most corals will bleach, and widespread coral mortality is expected ~3°C above late 19th Century temperatures.  Up to 30% of global species will be at risk for extinction, and the figure could exceed 40% if we surpass 4°C, as we continue on the path toward the Earth’s sixth mass extinction.  Coastal flooding will impact millions more people at ~2.5°C, and a number of adverse health effects are expected to continue rising along with temperatures.

Reasons for Concern

Smith et al. (2009) (on which the late great Stephen Schneider was a co-author) updated the IPCC impact assessment, arriving at similar conclusions.  For example,

“There is medium confidence that ~20–30% of known plant and animal species are likely to be at increased risk of extinction if increases in global average temperature exceed 1.5 °C to 2.5 °C over 1980–1999”

“increases in drought, heat waves, and floods are projected in many regions and would have adverse impacts, including increased water stress, wildfire frequency, and flood risks (starting at less than 1 °C of additional warming above 1990 levels) and adverse health effects (slightly above 1 °C)”

“climate change over the next century is likely to adversely affect hundreds of millions of people through increased coastal flooding after a further 2 °C warming from 1990 levels; reductions in water supplies (0.4 to 1.7 billion people affected with less than a 1 °C warming from 1990 levels); and increased health impacts (that are already being observed)”

Smith et al. updated the 2001 IPCC report ‘burning embers’ diagram to reflect their findings (Figure 3).  On this figure, white regions indicate neutral or low impacts or risks, yellow indicates negative impacts for some systems or more significant risks, and red indicates substantial negative impacts or risks that are more widespread and/or severe.  They have grouped the various climate change consequences into ‘reasons for concern’ (RFCs), summarized below.


Figure 3:  Risks from climate change, by reason for concern (RFC). Climate change consequences are plotted against increases in global mean temperature (°C) after 1990. Each column corresponds to a specific RFC and represents additional outcomes associated with increasing global mean temperature. The color scheme represents progressively increasing levels of risk and should not be interpreted as representing ‘‘dangerous anthropogenic interference,’’ which is a value judgment. The historical period 1900 to 2000 warmed by 0.6 °C and led to some impacts. It should be noted that this figure addresses only how risks change as global mean temperature increases, not how risks might change at different rates of warming. Furthermore, it does not address when impacts might be realized, nor does it account for the effects of different development pathways on vulnerability.

  • Risk to Unique and Threatened Systems addresses the potential for increased damage to or irreversible loss of unique and threatened systems, such as coral reefs, tropical glaciers, endangered species, unique ecosystems, biodiversity hotspots, small island states, and indigenous communities.
  • Risk of Extreme Weather Events tracks increases in extreme events with substantial consequences for societies and natural systems. Examples include increase in the frequency, intensity, or consequences of heat waves, floods, droughts, wildfires, or tropical cyclones.
  • Distribution of Impacts concerns disparities of impacts.  Some regions, countries, and populations face greater harm from climate change, whereas other regions, countries, or populations would be much less harmed—and some may benefit; the magnitude of harm can also vary within regions and across sectors and populations.
  • Aggregate Damages covers comprehensive measures of impacts. Impacts distributed across the globe can be aggregated into a single metric, such as monetary damages, lives affected, or lives lost. Aggregation techniques vary in their treatment of equity of outcomes, as well as treatment of impacts that are not easily quantified.
  • Risks of Large-Scale Discontinuities represents the likelihood that certain phenomena (sometimes called tipping points) would occur, any of which may be accompanied by very large impacts. These phenomena include the deglaciation (partial or complete) of the West Antarctic or Greenland ice sheets and major changes in some components of the Earth’s climate system, such as a substantial reduction or collapse of the North Atlantic Meridional Overturning Circulation.

All of these reasons for concern enter the red (substantial negative impact, high risk) region by 4°C.  Aggregate impacts are in the red region by 3°C, and some types of concerns are in the red region by 1°C.

For more details we also recommend Mark Lynas’ book Six Degrees, which goes through the climate impacts from each subsequent degree of warming, based on a very thorough review of the scientific literature.  A brief review of the book by Eric Steig and summary of some key impacts is available here.  National Geographic also did a series of videos on the Six Degrees theme, which no longer seem to be available on their websites, but which can still be found on YouTube.

This is Why Reducing Emissions is Critical

We’re not yet committed to surpassing 2°C global warming, but as Watson noted, we are quickly running out of time to realistically give ourselves a chance to stay below that ‘danger limit’.  However, 2°C is not a do-or-die threshold.  Every bit of CO2 emissions we can reduce means that much avoided future warming, which means that much avoided climate change impacts.  As Lonnie Thompson noted, the more global warming we manage to mitigate, the less adaptation and suffering we will be forced to cope with in the future.

Realistically, based on the current political climate (which we will explore in another post next week), limiting global warming to 2°C is probably the best we can do.  However, there is a big difference between 2°C and 3°C, between 3°C and 4°C, and anything greater than 4°C can probably accurately be described as catastrophic, since various tipping points are expected to be triggered at this level.  Right now, we are on track for the catastrophic consequences (widespread coral mortality, mass extinctions, hundreds of millions of people adversely impacted by droughts, floods, heat waves, etc.).  But we’re not stuck on that track just yet, and we need to move ourselves as far off of it as possible by reducing our greenhouse gas emissions as soon and as much as possible.

There are of course many people who believe that the planet will not warm as much, or that the impacts of the associated climate change will not be as bad as the body of scientific evidence suggests. That is certainly a possibility, and we very much hope that their optimistic view is correct. However, what we have presented here is the best summary of scientific evidence available, and it paints a very bleak picture if we fail to rapidly reduce our greenhouse gas emissions.

If we continue forward on our current path, catastrophe is not just a possible outcome, it is the most probable outcome.  And an intelligent risk management approach would involve taking steps to prevent a catastrophic scenario if it were a mere possibility, let alone the most probable outcome.  This is especially true since the most important component of the solution – carbon pricing – can be implemented at a relatively low cost, and a far lower cost than trying to adapt to the climate change consequences we have discussed here (Figure 4).

Figure 4: Approximate costs of climate action (green) and inaction (red) in 2100 and 2200. Sources: German Institute for Economic Research and Watkiss et al. 2005

Climate contrarians will often mock ‘CAGW’ (catastrophic anthropogenic global warming), but the sad reality is that CAGW is looking more and more likely every day. It’s critical, though, that we don’t give up, that we keep doing everything we can to reduce our emissions as much as possible in order to avoid as many catastrophic consequences as possible, for the sake of future generations and all species on Earth. The future climate will probably be much more challenging for life on Earth than today’s, but we still can and must limit the damage.

Design Help for Drug Cocktails for HIV Patients: Mathematical Model Helps Design Efficient Multi-Drug Therapies (Science Daily)

ScienceDaily (Sep. 2, 2012) — For years, doctors treating those with HIV have recognized a relationship between how faithfully patients take the drugs they prescribe, and how likely the virus is to develop drug resistance. More recently, research has shown that the relationship between adherence to a drug regimen and resistance is different for each of the drugs that make up the “cocktail” used to control the disease.

HIV is shown attaching to and infecting a T4 cell. The virus then inserts its own genetic material into the T4 cell’s host DNA. The infected host cell then manufactures copies of the HIV. (Credit: iStockphoto/Medical Art Inc.)

New research conducted by Harvard scientists could help explain why those differences exist, and may help doctors quickly and cheaply design new combinations of drugs that are less likely to result in resistance.

As described in a September 2 paper in Nature Medicine, a team of researchers led by Martin Nowak, Professor of Mathematics and of Biology and Director of the Program for Evolutionary Dynamics, has developed a technique that medical researchers can use to model the effects of various treatments and predict whether they will cause the virus to develop resistance.

“What we demonstrate in this paper is a prototype for predicting, through modeling, whether a patient at a given adherence level is likely to develop resistance to treatment,” Alison Hill, a PhD student in Biophysics and co-first author of the paper, said. “Compared to the time and expense of a clinical trial, this method offers a relatively easy way to make these predictions. And, as we show in the paper, our results match with what doctors are seeing in clinical settings.”

The hope, said Nowak, is that the new technique will take some of the guesswork out of what is now largely a trial-and-error process.

“This is a mathematical tool that will help design clinical trials,” he said. “Right now, researchers are using trial and error to develop these combination therapies. Our approach uses the mathematical understanding of evolution to make the process more akin to engineering.”

Creating a model that can make such predictions accurately, however, requires huge amounts of data.

To get that data, Hill and Daniel Scholes Rosenbloom, a PhD student in Organismic and Evolutionary Biology and the paper’s other first author, turned to Johns Hopkins University Medical School, where Professor of Medicine and of Molecular Biology and Genetics Robert F. Siliciano was working with PhD student Alireza Rabi (also co-first author) to study how the HIV virus reacted to varying drug dosages.

Such data proved critical to the model that Hill, Rabi and Rosenbloom eventually designed, because the level of the drug in patients — even those that adhere to their treatment perfectly — naturally varies. When drug levels are low — as they are between doses, or if a dose is missed — the virus is better able to replicate and grow. Higher drug levels, by contrast, may keep the virus in check, but they also increase the risk of mutant strains of the virus emerging, leading to drug resistance.

Armed with the data from Johns Hopkins, Hill, Rabi and Rosenbloom created a computer model that could predict whether and how much the virus, or a drug-resistant strain, was growing based on how strictly patients stuck to their drug regimen.

“Our model is essentially a simulation of what goes on during treatment,” Rosenbloom said. “We created a number of simulated patients, each of whom had different characteristics, and then we said, ‘Let’s imagine these patients have 60 percent adherence to their treatment — they take 60 percent of the pills they’re supposed to.’ Our model can tell us what their drug concentration is over time, and based on that, we can say whether the virus is growing or shrinking, and whether they’re likely to develop resistance.”
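As a rough illustration of this kind of simulation, here is a minimal sketch with a hypothetical drug whose concentration decays exponentially between doses and an arbitrary threshold below which the virus is assumed able to grow. Every parameter value is invented for illustration; this is not the model published in Nature Medicine.

```python
import math
import random

# Toy simulation of drug adherence and viral suppression.
# All parameter values are invented; this is NOT the published Nature Medicine model.

HALF_LIFE_HOURS = 24.0       # assumed drug half-life
DOSE_CONCENTRATION = 10.0    # concentration added by one daily pill (arbitrary units)
SUPPRESSION_THRESHOLD = 4.0  # below this concentration the virus is assumed able to grow
DAYS = 60

def fraction_of_days_with_growth(adherence, seed=0):
    """Simulate daily dosing and return the fraction of days on which the virus could grow."""
    rng = random.Random(seed)
    daily_decay = math.exp(-math.log(2) * 24.0 / HALF_LIFE_HOURS)
    concentration, growth_days = 0.0, 0
    for _ in range(DAYS):
        if rng.random() < adherence:      # the simulated patient takes today's pill
            concentration += DOSE_CONCENTRATION
        if concentration < SUPPRESSION_THRESHOLD:
            growth_days += 1
        concentration *= daily_decay      # drug level decays until the next dose
    return growth_days / DAYS

for adherence in (0.3, 0.6, 0.9, 1.0):
    print(f"adherence {adherence:.0%}: virus can grow on ~{fraction_of_days_with_growth(adherence):.0%} of days")
```

Even this toy version shows the qualitative point made in the article: lower adherence produces longer stretches of sub-threshold drug concentration during which the virus, or a resistant strain, can replicate.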

The model’s predictions, Rosenbloom explained, can then serve as a guide to researchers as they work to design new drug cocktails to combat HIV.

While their model does hold out hope for simplifying the process of designing drug “cocktails,” Hill and Rosenbloom said they plan to continue to refine the model to take additional factors — such as multiple drug-resistant mutant strains of the virus and varying drug concentrations in other parts of the body — into account.

“The prototype we have so far looks at concentrations of drugs in blood plasma,” Rosenbloom explained. “But a number of drugs don’t penetrate other parts of the body, like the brain or the gut, with the same efficiency, so it’s important to model these other areas where the concentrations of drugs might not be as high.”

Ultimately, though, both say their model can offer new hope to patients by helping doctors design better, cheaper and more efficient treatments.

“Over the past 10 years, the number of HIV-infected people receiving drug treatment has increased immensely,” Hill said. “Figuring out what the best ways are to treat people in terms of cost effectiveness, adherence and the chance of developing resistance is going to become even more important.”

Journal Reference:

  1. Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892

*   *   *

Anti-HIV Drug Simulation Offers ‘Realistic’ Tool to Predict Drug Resistance and Viral Mutation

ScienceDaily (Sep. 2, 2012) — Pooling data from thousands of tests of the antiviral activity of more than 20 commonly used anti-HIV drugs, AIDS experts at Johns Hopkins and Harvard universities have developed what they say is the first accurate computer simulation to explain drug effects. Already, the model clarifies how and why some treatment regimens fail in some patients who lack evidence of drug resistance. Researchers say their model is based on specific drugs, precise doses prescribed, and on “real-world variation” in how well patients follow prescribing instructions.

Johns Hopkins co-senior study investigator and infectious disease specialist Robert Siliciano, M.D., Ph.D., says the mathematical model can also be used to predict how well a patient is likely to do on a specific regimen, based on their prescription adherence. In addition, the model factors in each drug’s ability to suppress viral replication and the likelihood that such suppression will spur development of drug-resistant, mutant HIV strains.

“With the help of our simulation, we can now tell with a fair degree of certainty what level of viral suppression is being achieved — how hard it is for the virus to grow and replicate — for a particular drug combination, at a specific dosage and drug concentration in the blood, even when a dose is missed,” says Siliciano, a professor at the Johns Hopkins University School of Medicine and a Howard Hughes Medical Institute investigator. This information, he predicts, will remove “a lot of the current trial and error, or guesswork, involved in testing new drug combination therapies.”

Siliciano says the study findings, to be reported in the journal Nature Medicine online Sept. 2, should help scientists streamline development and clinical trials of future combination therapies, by ruling out combinations unlikely to work.

One application of the model could be further development of drug combinations that can be contained in a single pill taken once a day. That could lower the chance of resistance, even if adherence is not perfect. Such future drug regimens, he says, will ideally strike a balance between optimizing viral suppression and minimizing risk of drug resistance.

Researchers next plan to expand their modeling beyond blood levels of virus to other parts of the body, such as the brain, where antiretroviral drug concentrations can be different from those measured in the blood. They also plan to expand their analysis to include multiple-drug-resistant strains of HIV.

Besides Siliciano, Johns Hopkins joint medical-doctoral student Alireza Rabi was a co-investigator in this study. Other study investigators included doctoral candidates Daniel Rosenbloom, M.S.; Alison Hill, M.S.; and co-senior study investigator Martin Nowak, Ph.D. — all at Harvard University.

Funding support for this study, which took two years to complete, was provided by the National Institutes of Health, with corresponding grant numbers R01-MH54907, R01-AI081600, R01-GM078986; the Bill and Melinda Gates Foundation; the Cancer Research Institute; the National Science Foundation; the Howard Hughes Medical Institute; Natural Sciences and Engineering Research Council of Canada; the John Templeton Foundation; and J. Epstein.

Currently, an estimated 8 million of the more than 34 million people in the world living with HIV are taking antiretroviral therapy to keep their disease in check. An estimated 1,178,000 in the United States are infected, including 23,000 in the state of Maryland.

Journal Reference:

  1. Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892

Research Reveals Contrasting Consequences of a Warmer Earth (Science Daily)

ScienceDaily (Sep. 3, 2012) — A new study, by scientists from the Universities of York, Glasgow and Leeds, involving analysis of fossil and geological records going back 540 million years, suggests that biodiversity on Earth generally increases as the planet warms.

New research involving analysis of fossil and geological records going back 540 million years suggests that biodiversity on Earth generally increases as the planet warms. (Credit: © mozZz / Fotolia)

But the research says that the increase in biodiversity depends on the evolution of new species over millions of years, and is normally accompanied by extinctions of existing species. The researchers suggest that present trends of increasing temperature are unlikely to boost global biodiversity in the short term because of the long timescales necessary for new forms to evolve. Instead, the speed of current change is expected to cause diversity loss.

The study, which is published in Proceedings of the National Academy of Sciences (PNAS), says that while warm periods in the geological past experienced increased extinctions, they also promoted the origination of new species, increasing overall biodiversity.

The new research is a refinement of an earlier study that analysed biodiversity over the same time interval, but with a less sophisticated data set, and concluded that a warming climate led to drops in overall diversity. Using the improved data that are now available, the researchers re-examined patterns of marine invertebrate biodiversity over the last 540 million years.

Lead author, Dr Peter Mayhew, of the Department of Biology at York, said: “The improved data give us a more secure picture of the impact of warmer temperatures on marine biodiversity and they show that, as before, there is more extinction and origination in warm geological periods. But, overall, warm climates seem to boost biodiversity in the very long run, rather than reducing it.”

Dr Alistair McGowan, of the School of Geographical and Earth Sciences at the University of Glasgow said: “The previous findings always seemed paradoxical. Ecological studies show that species richness consistently increases towards the Equator, where it is warm, yet the relationship between biodiversity and temperature through time appeared to be the opposite. Our new results reverse these conclusions and bring them into line with the ecological pattern.”

Professor Tim Benton, of the Faculty of Biological Sciences at the University of Leeds, added: “Science progresses by constantly re-examining conclusions in the light of better data. Our results seem to show that temperature improves biodiversity through time as well as across space. However, they do not suggest that current global warming is good for existing species. Increases in global diversity take millions of years, and in the meantime we expect extinctions to occur.”

Study Indicates a Link Between Digital Inclusion and Happiness (Exame)

JC e-mail 4576, September 4, 2012

A few years ago, a widely publicized study showed a direct relationship between mobile phone penetration and growth in a country's Gross Domestic Product (GDP). Now, research conducted by FGV indicates a correlation between access to means of communication (Internet, fixed-line and mobile telephony) and happiness.

The study combined global data from Gallup and, in Brazil's case, from IBGE, and found that, on average, every 10 percentage points of Internet, fixed-line and/or mobile penetration is associated with a 2.2-percentage-point increase in a country's happiness.

"However, one cannot say that digital inclusion brings happiness, or vice versa," notes Marcelo Neri, the FGV researcher who conducted the study and presented the data during the panel "O crescimento do Brasil e as TICs" at the 56th Painel Telebrasil, last Thursday (the 30th) in Brasília. Neri also cautioned that in Brazil, specifically, the correlation was not as clear-cut. In the coming days, Neri is due to take office as president of Ipea.

For this survey, Neri and his team created an indicator called ITIC (Indicador de Telefonia, Internet e Celular), which blends the penetration rates of the three services. In a list of 158 countries, Brazil ranks 72nd, with an ITIC of 51.25%. The world average is 49.1%. The ranking is led by Sweden (95.8%), Singapore (95.5%) and Iceland (95.5%). At the bottom are the Central African Republic (5.5%), Burundi (5.75%) and Ethiopia (8.25%). If mobile phone penetration is removed, the ITIC of the African countries drops drastically; in the case of the Central African Republic, it falls to 0.7%.
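The article does not spell out exactly how ITIC blends the three penetration rates, so the sketch below assumes a plain average of the three (an assumption, not FGV's published formula) and applies the reported 2.2-points-per-10-points correlation naively to two made-up countries.

```python
# Illustrative sketch only: assumes ITIC is a plain average of the three penetration rates,
# which is a guess, not FGV's published formula.

def itic(internet_pct, fixed_line_pct, mobile_pct):
    """Blend the three penetration rates into a single 0-100 indicator."""
    return (internet_pct + fixed_line_pct + mobile_pct) / 3.0

def happiness_gap(itic_a, itic_b):
    """Happiness difference implied by the reported correlation: 2.2 points per 10 ITIC points."""
    return 2.2 * (itic_a - itic_b) / 10.0

# Hypothetical penetration figures for two made-up countries (not the study's data):
country_a = itic(internet_pct=45.0, fixed_line_pct=22.0, mobile_pct=87.0)
country_b = itic(internet_pct=10.0, fixed_line_pct=3.0, mobile_pct=40.0)

print(f"ITIC A: {country_a:.2f}%, ITIC B: {country_b:.2f}%")
print(f"Implied happiness gap: ~{happiness_gap(country_a, country_b):.1f} percentage points (correlation, not causation)")
```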

In his talk, Neri presented maps of Brazil showing how the penetration of the three services evolved over the past decade, with mobile telephony standing out as the most striking case. "The mobile platform can promote digital inclusion, since today two thirds of the poor in Brazil have a mobile phone," the researcher said. For this figure, "poor" means living on a monthly household income below R$ 150 per capita.

As for PCs, Hélio Rotenberg, president of the Positivo group, said that Brazil's revolution took place over the past seven years, a period in which the country rose to third place among the world's largest personal computer markets. The reason, he explained, was not only cheaper equipment thanks to tax-relief policies but also access to financing.

"The big turning point came with the Casas Bahia PC, sold in 10 or more installments," he said. Rotenberg noted, however, that technology is not the salvation of education in Brazil. "We have structural problems to solve. As long as teachers are not prepared, no technology will help," he remarked.

Inequality – Cláudia Viegas, director at LCA Consultores, warned of the risk that technological progress in Brazil could worsen social inequality if it happens in a disorderly way. "Of course the number of connections will grow. The question is how that will happen across the whole country. A perverse effect could occur, deepening social inequality instead of reducing it," she said.

She was referring to the risk that part of the Brazilian population will fail to keep up with technological progress and become even more excluded from the country's economic development. Célio Bozola, president of Prodesp, which runs the Poupa Tempo project, confirmed that not even in the state of São Paulo is access to technology evenly distributed.

For Viegas, of LCA, it is essential to have a public policy guiding the digital inclusion process in Brazil. On the possibility of state aid for this purpose, the president of Telebrasil and of Telefônica/Vivo, Antônio Carlos Valente, commented: "I have nothing against state support. But there are several forms of state support. One of them is making public funds available."

(Source: Exame)

Wild Weather A New Normal And Insurance Companies Must Act (Forbes)

GREEN TECH

Mindy Lubber, Contributor – President, Ceres and Director, Investor Network on Climate Risk (INCR)

8/30/2012 @ 9:01AM

Damage after Hurricane Irene

Severe weather has been clobbering insurance companies, and the headlines just keep coming. “Drought to cost insurers billions in losses,” said the Financial Times a few days ago. “Many U.S. hurricanes would cause $10b or more in losses in 2012 dollars,” the Boston Globe said about the latest hurricane forecasts. “June’s severe weather losses near $2 billion in U.S.,” said the Insurance Journal earlier this year.

This year’s extreme events follow the world’s costliest year ever for natural catastrophe losses, including $32 billion in 2011 insured losses in the United States due to extreme weather events. This is no short-term uptick: insured losses due to extreme weather have been trending upward for 30 years, as the climate has changed and populations in coastal areas and other vulnerable places have grown.

The U.S. insurance industry continues to be “surprised” by extreme weather losses. But the truth is that weather extremes are no longer surprising. Back-to-back summers of devastating droughts, record heat waves and raging wildfires are clear evidence of this. Last year’s crazy weather triggered near record underwriting losses and numerous credit rating downgrades among U.S. property and casualty insurers.

And in the face of a changing climate, such events can be expected to increase in number, and severity.  It’s time for insurance companies to recognize this new normal, and incorporate it into their business planning—for the sake of their shareholders, their industry’s survival, and the stability of the U.S. economy.

Ceres, a business sustainability leadership organization, has been researching the effects of climate change and severe weather on the insurance sector. In a report to be released next month, titled Stormy Future for U.S. Property and Casualty Insurers, we will detail our recommendations for insurance companies, investors and regulators to help strengthen the insurance sector so it can better weather the challenges ahead.

For insurance companies, using catastrophe models that can better anticipate the probable effects of climate change on extreme weather events is key. And especially in vulnerable markets, insurers’ guidance on insurability should inform the decisions that communities make on land-use planning, infrastructure, and building codes.

Insurers can also encourage the transition to a low-carbon economy—one built to forestall the worst effects of climate change—by offering products and services that encourage clean and efficient energy, encouraging customers to adopt climate-change mitigation plans, and encouraging policymakers to act to reduce carbon pollution.

This would not be the first time insurance companies have helped change American society. By making insurance contingent on smoke detectors, insurers cut down on deaths and losses from building fires. By backing seat belt laws and including seat belt violations in rate calculations, they helped save lives on the road.

By engaging fully on climate change and energy policy—inside and outside of the boardroom – insurance companies can lead the way once again. It would be the right thing to do, both for their business, and for our future.

People Merge Supernatural and Scientific Beliefs When Reasoning With the Unknown, Study Shows (Science Daily)

ScienceDaily (Aug. 30, 2012) — Reliance on supernatural explanations for major life events, such as death and illness, often increases rather than declines with age, according to a new psychology study from The University of Texas at Austin.

Reliance on supernatural explanations for major life events, such as death and illness, often increases rather than declines with age, according to a new psychology study. (Credit: © Nikki Zalewski / Fotolia)

The study, published in the June issue of Child Development, offers new insight into developmental learning.

“As children assimilate cultural concepts into their intuitive belief systems — from God to atoms to evolution — they engage in coexistence thinking,” said Cristine Legare, assistant professor of psychology and lead author of the study. “When they merge supernatural and scientific explanations, they integrate them in a variety of predictable and universal ways.”

Legare and her colleagues reviewed more than 30 studies on how people (ages 5-75) from various countries reason with three major existential questions: the origin of life, illness and death. They also conducted a study with 366 respondents in South Africa, where biomedical and traditional healing practices are both widely available.

As part of the study, Legare presented the respondents with a variety of stories about people who had AIDS. They were then asked to endorse or reject several biological and supernatural explanations for why the characters in the stories contracted the virus.

According to the findings, participants of all age groups agreed with biological explanations for at least one event. Yet supernatural explanations such as witchcraft were also frequently supported among children (ages 5 and up) and universally among adults.

Among the adult participants, only 26 percent believed the illness could be caused by either biology or witchcraft. And 38 percent merged biological and supernatural explanations into one theory. For example: “Witchcraft, which is mixed with evil spirits, and unprotected sex caused AIDS.” However, 57 percent combined both witchcraft and biological explanations. For example: “A witch can put an HIV-infected person in your path.”

Legare said the findings contradict the common assumption that supernatural beliefs dissipate with age and knowledge.

“The findings show supernatural explanations for topics of core concern to humans are pervasive across cultures,” Legare said. “If anything, in both industrialized and developing countries, supernatural explanations are frequently endorsed more often among adults than younger children.”

The results provide evidence that reasoning about supernatural phenomena is a fundamental and enduring aspect of human thinking, Legare said.

“The standard assumption that scientific and religious explanations compete should be re-evaluated in light of substantial psychological evidence,” Legare said. “The data, which spans diverse cultural contexts across the lifespan, shows supernatural reasoning is not necessarily replaced with scientific explanations following gains in knowledge, education or technology.”

Journal Reference:

  1. Cristine H. Legare, E. Margaret Evans, Karl S. Rosengren, Paul L. Harris. The Coexistence of Natural and Supernatural Explanations Across Cultures and Development. Child Development, 2012; 83 (3): 779. DOI: 10.1111/j.1467-8624.2012.01743.x

Shading Earth: Delivering Solar Geoengineering Materials to Combat Global Warming May Be Feasible and Affordable (Science Daily)

ScienceDaily (Aug. 29, 2012) — A cost analysis of the technologies needed to transport materials into the stratosphere to reduce the amount of sunlight hitting Earth and therefore reduce the effects of global climate change has shown that they are both feasible and affordable.

A cost analysis of the technologies needed to transport materials into the stratosphere to reduce the amount of sunlight hitting Earth and therefore reduce the effects of global climate change has shown that they are both feasible and affordable. (Credit: © mozZz / Fotolia)

Published August 31, 2012, in IOP Publishing’s journal Environmental Research Letters, the study has shown that the basic technology currently exists and could be assembled and implemented in a number of different forms for less than USD $5 billion a year.

Put into context, the cost of reducing carbon dioxide emissions is currently estimated to be between 0.2 and 2.5 per cent of GDP in the year 2030, which is equivalent to roughly USD $200 to $2000 billion.

Solar radiation management (SRM) seeks to induce effects similar to those observed after volcanic eruptions; however, the authors state that it is not a preferred strategy, and that such a claim could only be made after thorough investigation of the implications, risks and costs involved.

The authors caution that reducing incident sunlight does nothing at all to reduce greenhouse gas concentrations in the atmosphere, nor the resulting increase in the acid content of the oceans. They note that other research has shown that the effects of solar radiation management are not uniform, and would cause different temperature and precipitation changes in different countries.

Co-author of the study, Professor Jay Apt, said: “As economists are beginning to explore the role of several types of geoengineering, it is important that a cost analysis of SRM is carried out. The basic feasibility of SRM with current technology is still being disputed and some political scientists and policy makers are concerned about unilateral action.”

In the study, the researchers, from Aurora Flight Sciences, Harvard University and Carnegie Mellon University, performed an engineering cost analysis on six systems capable of delivering 1-5 million metric tonnes of material to altitudes of 18-30 km: existing aircraft, a new airplane designed to perform at altitudes up to 30 km, a new hybrid airship, rockets, guns and suspended pipes carrying gas or slurry to inject the particles into the atmosphere.

Based on existing research into solar radiation management, the researchers performed their cost analyses for systems that could deliver around one million tonnes of aerosols each year at an altitude between 18 and 25 km and between a latitude range of 30°N and 30°S.

The study concluded that using aircraft is easily within the current capabilities of aerospace engineering, manufacturing and operations. The development of new, specialized aircraft appeared to be the cheapest option, with costs of around $1 to $2 billion a year; existing aircraft would be more expensive as they are not optimized for high altitudes and would need considerable and expensive modifications to do so.
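For a rough sense of scale, the sketch below converts the reported annual costs and payloads into cost per kilogram of material delivered. The pairings of particular costs with particular tonnages are a simplification for illustration, not the paper's engineering analysis.

```python
# Back-of-the-envelope cost-per-kilogram figures based on the ranges quoted in the article.
# The specific pairings of annual cost and payload are illustrative simplifications,
# not the paper's detailed engineering analysis.

scenarios = {
    "new specialized aircraft, 1 Mt/yr": (1.5e9, 1e9),  # ~$1-2 billion per year; 1 million tonnes = 1e9 kg
    "new specialized aircraft, 5 Mt/yr": (2.0e9, 5e9),  # upper end of the 1-5 Mt payload range
    "program ceiling, 1 Mt/yr":          (5.0e9, 1e9),  # the <$5 billion per year figure for the program
}

for name, (annual_cost_usd, annual_kg) in scenarios.items():
    print(f"{name}: ~${annual_cost_usd / annual_kg:.2f} per kg delivered")
```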

Guns and rockets appeared to be capable of delivering materials at high altitudes but the costs associated with these are much higher than those of airplanes and airships due to their lack of reusability.

Although completely theoretical at this point, a large gas pipe rising to 20 km and suspended by helium-filled floating platforms would offer the lowest recurring cost per kilogram of particles delivered. However, the costs of researching the required materials, developing the pipe and testing it for safety would be high, and the whole system carries large uncertainty.

Professor Apt continued: “We hope our study will help other scientists looking at more novel methods for dispersing particles and help them to explore methods with increased efficiency and reduced environmental risk.”

The researchers make it clear that they have not sought to address the science of aerosols in the stratosphere, nor issues of risk, effectiveness or governance that will add to the costs of solar radiation management geoengineering.

Journal Reference:

  1. Justin McClellan, David W Keith, Jay Apt. Cost analysis of stratospheric albedo modification delivery systems. Environmental Research Letters, 2012; 7 (3): 034019. DOI: 10.1088/1748-9326/7/3/034019

Prophet of the Far-from-Obvious (FSP)

RUY CASTRO

São Paulo, Monday, July 9, 2012

RIO DE JANEIRO – Nelson Rodrigues was diabolical. For all his myopia and aversion to glasses, his power to see into the distance was disconcerting. As in 1970, when, for months, he alone sustained the certainty that the Brazilian national team would "win the Mexico World Cup walking" (his colleagues, the "prophets of defeat," bet on the pace of the European teams). Well, Brazil won it walking.

Another premonition came in late May 1962, when he guaranteed that, if Pelé got hurt at the World Cup in Chile, the young Amarildo would replace him like a man "possessed." A month later, Pelé was injured in the second match and his World Cup was over. Amarildo played the remaining games, scored the goals Pelé would have scored, and was the "Possessed" that Nelson had foreseen.

Another of his "eternal truths" (proclaimed on "Resenha Facit," the show Nelson starred in on TV Rio with João Saldanha, Armando Nogueira and others) was that videotape was "dumb." At the time it seemed absurd: how could you dispute something that could be watched and rewatched? Thirty years later, a foul by Júnior Baiano inside Brazil's penalty area went unnoticed on the videotapes and only showed up, days later, from an unusual camera angle. The referee, who had awarded the penalty, was right. Videotape really was dumb.

In the late 1960s, Nelson produced another line that was impossible to verify and seemed like pure provocation: "We will yet be the largest ex-Catholic country in the world." He foresaw a decline of the Catholic faith in Brazil because sectors of the church were trading the promise of eternal life for the armed struggle against the dictatorship.

In the decades that followed, as popes came and went, that political line would be abandoned. But the spell had been broken. In 1970, Catholics were 90% of Brazilians. Today, according to IBGE, they are 64%. And by 2030, they will be under 50%. Nelson got the cause wrong, but his verdict still stands.

Twitter Data Crunching: The New Crystal Ball (Science Daily)

ScienceDaily (Aug. 29, 2012) — Fabio Ciulla from Northeastern University, Boston, USA, and his colleagues demonstrated that the elimination of contestants in TV talent shows based on public voting, such as American Idol, can be anticipated. They unveiled the predictive power of microblogging Twitter signals, used as a proxy for the general preference of an audience, in a study recently published in EPJ Data Science.

The authors considered the voting system of these shows as a basic test to assess the predictive power of Twitter signals. They relied on the overlap between Twitter users and show audience to collect extremely detailed data on social behaviour on a massive scale. This approach provided an unprecedented opportunity to apply network science to social media: social phenomena can thus be studied in a completely unobtrusive way. Twitter has previously been used to forecast the spread of epidemics, stock market behaviour and election outcomes, with varying degrees of success.

In this study, the authors demonstrated that Twitter activity during the show's airing and the voting period that followed correlated with the contestants' ranking. As a result, it helped predict the outcome of the votes. This approach offers a simplified setting in which to analyse complex societal phenomena such as political elections. Unlike previous voting systems, Twitter offers a quantitative indicator that can act as a proxy for what is occurring around the world in real time, thereby anticipating the outcome of future events based on opinions.
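A minimal sketch of the general idea, not the paper's actual methodology: count mentions of each contestant within the airing-plus-voting window, treat the tweet-volume ranking as the prediction, and compare it with the (here invented) vote ranking using a rank correlation.

```python
from collections import Counter

# Toy illustration of predicting a vote ranking from tweet volume.
# Contestant names, tweet counts and the "actual" outcome are invented;
# this is not the pipeline used in the EPJ Data Science paper.

tweets_in_window = [
    "contestant_a", "contestant_a", "contestant_b", "contestant_a",
    "contestant_c", "contestant_b", "contestant_a", "contestant_c",
]
actual_vote_rank = ["contestant_a", "contestant_c", "contestant_b"]  # hypothetical voting outcome

volume = Counter(tweets_in_window)
predicted_rank = [name for name, _ in volume.most_common()]

def spearman(rank_x, rank_y):
    """Spearman rank correlation between two orderings of the same items."""
    n = len(rank_x)
    d_squared = sum((rank_x.index(item) - rank_y.index(item)) ** 2 for item in rank_x)
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

print("Predicted ranking from tweet volume:", predicted_rank)
print("Rank correlation with the vote:", round(spearman(predicted_rank, actual_vote_rank), 2))
```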

Ciulla and colleagues also showed that the fraction of tweets that included geolocation information made it possible to map each contestant's fan base internationally. They identified a strong influence of the geographical origin of the votes, suggesting the show's outcome would have been different had voting not been limited to US voters.

Journal Reference:

  1. Fabio Ciulla, Delia Mocanu, Andrea Baronchelli, Bruno Goncalves, Nicola Perra, Alessandro Vespignani. Beating the news using Social Media: the case study of American Idol. EPJ Data Science, 2012; 1 (1): 8. DOI: 10.1140/epjds8

Calls for doomsday remain unheeded (Washington Post)

By George Will

11:15 PM, Aug 20, 2012

WASHINGTON — Sometimes the news is that something was not newsworthy. The United Nations’ Rio+20 conference — 50,000 participants from 188 nations — occurred in June, without consequences. A generation has passed since the 1992 Earth Summit in Rio, which begat other conferences and protocols (e.g., Kyoto). And, by now, apocalypse fatigue — boredom from being repeatedly told the end is nigh.

This began two generations ago, in 1972, when we were warned (by computer models developed at MIT) that we were doomed. We were supposed to be pretty much extinct by now, or at least miserable. We are neither. So, what went wrong?

That year begat “The Limits to Growth,” a book from the Club of Rome, which called itself “a project on the predicament of mankind.” It sold 12 million copies, staggered The New York Times (“one of the most important documents of our age”) and argued that economic growth was doomed by intractable scarcities. Bjorn Lomborg, the Danish academic and “skeptical environmentalist,” writing in Foreign Affairs, says it “helped send the world down a path of worrying obsessively about misguided remedies for minor problems while ignoring much greater concerns,” such as poverty, which only economic growth can ameliorate.

MIT’s models foresaw the collapse of civilization because of “nonrenewable resource depletion” and population growth. “In an age more innocent of and reverential toward computers,” Lomborg writes, “the reams of cool printouts gave the book’s argument an air of scientific authority and inevitability” that “seemed to banish any possibility of disagreement.” Then — as now, regarding climate change — respect for science was said to require reverential suspension of skepticism about scientific hypotheses. Time magazine’s story about “The Limits to Growth” exemplified the media’s frisson of hysteria:

“The furnaces of Pittsburgh are cold; the assembly lines of Detroit are still. In Los Angeles, a few gaunt survivors of a plague desperately till freeway center strips … Fantastic? No, only grim inevitability if society continues its present dedication to growth and ‘progress.’”

The modelers examined 19 commodities and said 12 would be gone long before now — aluminum, copper, gold, lead, mercury, molybdenum, natural gas, oil, silver, tin, tungsten and zinc. Lomborg says:

Technological innovations have replaced mercury in batteries, dental fillings and thermometers, mercury consumption is down 98 percent and its price was down 90 percent by 2000. Since 1970, when gold reserves were estimated at 10,980 tons, 81,410 tons have been mined and estimated reserves are 51,000 tons. Since 1970, when known reserves of copper were 280 million tons, about 400 million tons have been produced globally and reserves are estimated at almost 700 million tons. Aluminum consumption has increased 16-fold since 1950, the world has consumed four times the 1950 known reserves, and known reserves could sustain current consumption for 177 years. Potential U.S. gas resources have doubled in the last six years. And so on.

The modelers missed something — human ingenuity in discovering, extracting and innovating. Which did not just appear after 1972.

Aluminum, Lomborg writes, is one of earth’s most common metals. But until the 1886 invention of the Hall-Heroult process, it was so difficult and expensive to extract that “Napoleon III had bars of aluminum exhibited alongside the French crown jewels, and he gave his honored guests aluminum forks and spoons while lesser visitors had to make do with gold utensils.”

Forty years after “The Limits to Growth” imparted momentum to environmentalism, that impulse now is often reduced to children indoctrinated to “reduce, reuse, and recycle.” Lomborg calls recycling “a feel-good gesture that provides little environmental benefit at a significant cost.” He says “we pay tribute to the pagan god of token environmentalism by spending countless hours sorting, storing and collecting used paper, which, when combined with government subsidies, yields slightly lower-quality paper in order to secure a resource” — forests — “that was never threatened in the first place.”

In 1980, economist Julian Simon made a wager in the form of a complex futures contract. He bet Paul Ehrlich (whose 1968 book “The Population Bomb” predicted “hundreds of millions of people” would starve to death in the 1970s as population growth swamped agricultural production) that by 1990 the price of any five commodities Ehrlich and his advisers picked would be lower than in 1980. Ehrlich’s group picked five metals. All were cheaper in 1990.

The bet cost Ehrlich $576.07. But that year he was awarded a $345,000 MacArthur Foundation “genius” grant and half of the $240,000 Crafoord Prize for ecological virtue. One of Ehrlich’s advisers, John Holdren, is President Barack Obama’s science adviser.

George F. Will writes about foreign and domestic politics and policy for the Washington Post Writers Group. Email: georgewill@washpost.com.

Media Violence Consumption Increases the Relative Risk of Aggression, Analysis Shows (Science Daily)

ScienceDaily (Aug. 27, 2012) — As president of the International Society for Research on Aggression (ISRA) and with the consent of the organization’s elected council, Craig Anderson appointed an international Media Violence Commission last December to prepare a public statement on the known effects of media violence exposure, based on the current state of scientific knowledge.

The Iowa State University Distinguished Professor of psychology appointed 12 ISRA researchers to the commission, including Douglas Gentile, an ISU associate professor of psychology.

The Media Violence Commission’s report concludes that the research clearly shows that media violence consumption increases the relative risk of aggression, defined as intentional harm to another person that could be verbal, relational, or physical. The report is published in the September/October issue of the journal Aggressive Behavior.

“Basically, the commission looked at, ‘What does the research literature say?'” Anderson said. “In addition, we asked them to make some recommendations, if they chose to do so, about public policy. It really was kind of an open-ended charge.”

Members took a fair and balanced look at the research

A well-known researcher on the effects of media on children, Gentile says commission members took a fair and balanced look at all of the existing research to see if they could achieve consensus, and then summarized what they found.

In their report, the commission wrote that aside from being sources of imitation, violent images — such as scenes in movies, games or pictures in comic books — act as triggers for activating aggressive thoughts and feelings already stored in memory. If these aggressive thoughts and feelings are activated over and over again because of repeated exposure to media violence, they become chronically accessible, and thus more likely to influence behavior.

“One may also become more vigilant for hostility and aggression in the world, and therefore, begin to feel some ambiguous actions by others (such as being bumped in a crowded room) are deliberate acts of provocation,” the commission wrote in the report.

The commission recommends that parents know what media their children and adolescents are using. Rating systems often provide too little detail about media content to be helpful, and in any case, are not substitutes for parents’ watching, playing, or listening to the media their children use.

“Parents can also set limits on screen use (The American Academy of Pediatrics recommends no screen time for children under 2 and no more than one to two hours total screen time per day for children/youth 3-18), and should discuss media content with their children to promote critical thinking when viewing,” the researchers wrote. “Schools may help parents by teaching students from an early age to be critical consumers of the media and that, just like food, the ‘you are what you eat’ principle applies to healthy media consumption.”

The commission recommends improving media ratings

While most public policy has focused on restricting children’s access to violent media, the commission found that approach to have significant political and legal challenges in many countries. For that reason, it recommends putting efforts into improving media ratings, classifications, and public education about the effects of media on children.

“Improving media ratings really has two pieces. One is that the media ratings themselves need to be done by an independent entity — meaning, not by an industry-influenced or controlled system,” said Anderson, himself a leading researcher of the effects of violent media on children. “They need to be ratings that have some scientific validity to them.

“But the other piece is education, and if parents aren’t educated — not just about what the ratings system does, but also about why it’s important for them to take control of their child’s media diet — then it doesn’t matter how good the ratings system is, because they’re going to ignore it anyway,” he added.

Anderson hopes the final report will have value to child advocacy groups.

“Having such a clear statement by an unbiased, international scientific group should be very helpful to a number of child advocacy groups — such as parenting groups — in their efforts to improve the lives of children,” he said.

Journal Reference:

  1. Media Violence Commission, International Society for Research on Aggression (ISRA). Report of the Media Violence Commission. Aggressive Behavior, Volume 38, Issue 5, September/October 2012, Pages: 335–341. DOI: 10.1002/ab.21443

The Role of Genes in Political Behavior (Science Daily)

ScienceDaily (Aug. 27, 2012) — Politics and genetics have traditionally been considered non-overlapping fields, but over the past decade it has become clear that genes can influence political behavior, according to a review published online August 27th in Trends in Genetics. This paradigm shift has led to novel insights into why people vary in their political preferences and could have important implications for public policy.

“We’re seeing an awakening in the social sciences, and the wall that divided politics and genetics is really starting to fall apart,” says review author Peter Hatemi of the University of Sydney. “This is a big advance, because the two fields could inform each other to answer some very complex questions about individual differences in political views.”

In the past, social scientists had assumed that political preferences were shaped by social learning and environmental factors, but recent studies suggest that genes also strongly influence political traits. Twin studies show that genes have some influence on why people differ on political issues such as the death penalty, unemployment and abortion. Because this field of research is relatively new, only a handful of genes have been implicated in political ideology and partisanship, voter turnout, and political violence.

Future research, including gene-expression and sequencing studies, may lead to deeper insights into genetic influences on political views and have a greater impact on public policy. “Making the public aware of how their mind works and affects their political behavior is critically important,” Hatemi says. “This has real implications for the reduction of discrimination, foreign policy, public health, attitude change and many other political issues.”

Journal Reference:

  1. Peter K. Hatemi and Rose McDermott. The Genetics of Politics: Discovery, Challenges and Progress. Trends in Genetics, August 27, 2012. DOI: 10.1016/j.tig.2012.07.004

Sweden recognises new file-sharing religion Kopimism (BBC)

5 January 2012 Last updated at 13:49 GMT

File-sharing is a religious ceremony according to the church leader

A “church” whose central tenet is the right to file-share has been formally recognised by the Swedish government.

The Church of Kopimism claims that “kopyacting” – sharing information through copying – is akin to a religious service.

The “spiritual leader” of the church said recognition was a “large step”.

But others were less enthusiastic and said the church would do little to halt the global crackdown on piracy.

Holy information

The Swedish government agency Kammarkollegiet finally registered the Church of Kopimism as a religious organisation shortly before Christmas, the group said.

“We had to apply three times,” said Gustav Nipe, chairman of the organisation.

The church, which holds CTRL+C and CTRL+V (shortcuts for copy and paste) as sacred symbols, does not directly promote illegal file sharing, focusing instead on the open distribution of knowledge to all.

It was founded by 19-year-old philosophy student and leader Isak Gerson. He hopes that file-sharing will now be given religious protection.

“For the Church of Kopimism, information is holy and copying is a sacrament. Information holds a value, in itself and in what it contains and the value multiplies through copying. Therefore copying is central for the organisation and its members,” he said in a statement.

“Being recognised by the state of Sweden is a large step for all of Kopimi. Hopefully this is one step towards the day when we can live out our faith without fear of persecution,” he added.

The church’s website has been unavailable since it broke the news of its religious status. A message urged those interested in joining to “come back in a couple of days when the storm has settled”.

Despite the new-found interest in the organisation, experts said religious status for file-sharing would have little effect on the global crackdown on piracy.

“It is quite divorced from reality and is reflective of Swedish social norms rather than the Swedish legislative system,” said music analyst Mark Mulligan.

“It doesn’t mean that illegal file-sharing will become legal, any more than if ‘Jedi’ was recognised as a religion everyone would be walking around with light sabres.

“In some ways these guys are looking outdated. File-sharing as a means to pirate content is becoming yesterday’s technology,” he added.

Piracy crackdown

The establishment of the church comes amid a backdrop of governmental zero-tolerance towards piracy.

The crackdown on piracy has moved focus away from individual pirates and more towards the ecosystem that supports piracy.

In the US, the Stop Online Piracy Act (Sopa) aims to stop online ad networks and payment processors from doing business with foreign websites accused of enabling or facilitating copyright infringement.

It could also stop search engines from linking to the allegedly infringing sites. Domain name registrars could be forced to take down the websites, and internet service providers forced to block access to the sites accused of infringing.

The government is pushing ahead with the controversial legislation despite continued opposition.

 

*   *   *

Kopimism: the world’s newest religion explained (New Scientist)

14:35 06 January 2012 by Alison George

Download this in memory of me (Image: Lars Johansson)

Isak Gerson is spiritual leader of the world’s newest religion, Kopimism, devoted to file-sharing. On 5 January the Church of Kopimism was formally recognised as a religion by the Swedish government.

Tell me about this new file-sharing religion, Kopimism.
We were founded about 15 months ago and we believe that information is holy and that the act of copying is holy.

Why make a religion out of file-sharing? Why not just be an ordinary club without defining yourselves as being a religious community?
Because we see ourselves as a religious group, a church seems like a good way of organising ourselves.

Was it hard to become an official religion?
We have had this faith for several years and one day we thought, why not try and get it registered? It was quite difficult. The authorities were quite dogmatic with their formalities. It took us three tries and more than a year to get recognised.

What criteria do you have to meet to become an official religion?
The law states that to be a religion you have to be an organisation that practises moments of prayer or meditation in your rituals.

What are the Kopimist prayers and meditations?
We have a part of our religious practices where we worship the value of information by copying it.

You call this “kopyacting”. Do you actually meet up in a building, like a church, to undertake these rituals?
We do meet up, but it doesn’t have to be a physical room. It could be a server or a web page too.

I understand that certain symbols have special significance in Kopimism.
Yes. There is the “kopimi” logo, which is a K written inside a pyramid, a symbol used online to show you want to be copied. But there are also symbols that represent and encourage copying, for example, “CTRL+V” and “CTRL+C”.

Why is information, and sharing it, so important to you?
Information is the building block of everything around me and everything I believe in. Copying it is a way of multiplying the value of information.

What’s your stance on illegal file-sharing?
I think that the copyright laws are very problematic, and at least need to be rewritten, but I would suggest getting rid of most of them.

So all file-sharing should be legal?
Absolutely.

Are you just trying to make a point, or is this religion for real?
We’ve had this faith for several years.

What has the reaction been from established churches?
I haven’t spoken to many of them, but those I have spoken to have been curious, and seen it as an interesting discussion.

Can you get excommunicated from the Church of Kopimism?
We have never thought about it. But if you don’t believe in our values then I guess there is no point in being a member, and if you do believe in our values you can’t really be excommunicated.

How many church members are there?
Around 3000.

How do you become a Kopimist?
Our site is down for the moment, because there has been too much traffic, but when it is up, you just have to read about our values and agree with them, then you can register on the web page.

Is there a deity associated with Kopimism?
No, there isn’t.

Is Julian Assange a high priest of Kopimism?
No. We have had no communication with him.

Does Kopimism have anything to say about the afterlife?
Not really. As a religion we are not so focussed on humans.

It could be a digital afterlife.
Information doesn’t really have a life, but I guess it can be forgotten, but as long as it is copied it won’t be.

Profile

Isak Gerson is a 20-year-old philosophy student at Uppsala University, Sweden. Together with Gustav Nipe – a member of Sweden’s Pirate party – and others, he has founded the Church of Kopimism.

 

*    *    *

Kopimism, Sweden’s Pirate Religion, Begins to Plunder America (U.S.News)

‘Kopimism’ gives internet piracy a place to worship

April 20, 2012

The symbol of Kopimism, a religion dedicated to information sharing.

A Swedish religion whose dogma centers on the belief that people should be free to copy and distribute all information—regardless of any copyright or trademarks—has made its way to the United States.

Followers of so-called “Kopimism” believe copying, sharing, and improving on knowledge, music, and other types of information is only human—the Romans remixed Greek mythology, after all, they say. In January, Kopimism—a play on the words “copy me”—was formally recognized by a Swedish government agency, raising its profile worldwide.

“Culture is something that makes people feel much better and makes people appreciate their world in a different way. Knowledge is also something we should copy regardless of the law,” says Isak Gerson, the 20-year-old founder of Kopimism. “It makes us better when we share knowledge and culture with each other.”

More than 3,500 people “like” Kopimism on Facebook, and thousands more practice its sacred ritual of file sharing. According to its manifesto, private, closed-source software code and anti-piracy software are “comparable to slavery.” Kopimist “Ops,” or spiritual leaders, are encouraged to give counsel to people who want to pirate files, are banned from recording and should encrypt all virtual religious service meetings “because of society’s vicious legislative and litigious persecution of Kopimists.”

Official in-person meetings must happen in places free of anti-Kopimist monitoring and in spaces with the Kopimist symbol—a pyramid with the letter K inside. To be initiated, new parishioners must share the Kopimist symbol and say the sacred words “copied and seeded.”

The gospel of the church has begun to spread, with Kopimist branches in 18 countries.

An American branch of the religion was recently registered with Illinois and is in the process of gaining federal recognition, according to Christopher Carmean, a 25-year-old student at the University of Chicago and head of the U.S. branch.

“Data is what we are made of, data is what defines our life, and data is how we express ourselves,” says Carmean. “Forms of copying, remixing, and sharing enhance the quality of life for all who have access to them. Attempts to hinder sharing are antithetical to our data-driven existence.”

About 450 people have registered with his church, and about 30 of them are actively practicing the religion, whose symbols include Ctrl+C and Ctrl+V—the keyboard shortcuts for copy and paste.

It’s no surprise the religion was born in Sweden—it has some of the laxest copyright laws in the world. The Swedish Pirate Party has two seats in the European Parliament, and The Pirate Bay, a Swedish website that’s one of the world’s largest portals to illegal files, has avoided being shut down for years.

Gerson is happy to allow people who want to open their own branches of Kopimism to copy its symbols and religious documents.

“There’s been a couple people that asked me [to start congregations], but I tell them they shouldn’t ask. You don’t need permission,” he says. “It’s a project, and I want projects to be copied, so I’m happy when people copy without asking.”

Most Kopimists say they realized they were practicing the religion before they found it.

“There are many people who are like me, who always held the Kopimist ideals, but hadn’t yet heard of the official church,” says Lauren Pespisa, a web developer in Cambridge, Mass., who gave a speech about the religion in March to a group of anti-copyright activists called the Massachusetts Pirate Party. “I think some people are like me and have embraced it officially and publicly, but some people believe in it and don’t really want to mix religion and politics.”

That relates to a major criticism of the religion: according to Gerson, lawsuits brought against Kopimists are a form of religious persecution. But Pespisa says that crying persecution in court probably “wouldn’t hold up in reality.”

In a blog post in late March, Carmean wrote that people should not “bring a legal argument to a religion fight.”

“Expecting any religion to provide a logic-based mandate for every single action that one might take is absurd and offensive,” he wrote. “It insults the basic moral fiber of Kopimists and all of humanity to outright demand a total moral code of conduct from anyone purporting to have a new perspective on issues of our time.”

Although many Kopimists are practicing a “sacred” ritual whenever they download or share a movie, CD, or book, they also regularly meet in online chat rooms to discuss the religion. Many of them are also internet activists, working to make file sharing legal, regardless of copyright. Even if they’re unsuccessful, Gerson is happy to help the information flow in any way he can.

“I think we need to change the laws, but I don’t think we need to focus only on them. I think laws can, in many cases, be ignored,” he says. “We want to encourage people to share regardless of what the laws say.”

Information Overload in the Era of ‘Big Data’ (Science Daily)

ScienceDaily (Aug. 20, 2012) — Botany is plagued by the same problem as the rest of science and society: our ability to generate data quickly and cheaply is surpassing our ability to access and analyze it. In this age of big data, scientists facing too much information rely on computers to search large data sets for patterns that are beyond the capability of humans to recognize — but computers can only interpret data based on the strict set of rules in their programming.

New tools called ontologies provide the rules computers need to transform information into knowledge, by attaching meaning to data, thereby making those data retrievable by computers and more understandable to human beings. Ontology, from the Greek word for the study of being or existence, traditionally falls within the purview of philosophy, but the term is now used by computer and information scientists to describe a strategy for representing knowledge in a consistent fashion. An ontology in this contemporary sense is a description of the types of entities within a given domain and the relationships among them.

A new article in this month’s American Journal of Botany by Ramona Walls (New York Botanical Garden) and colleagues describes how scientists build ontologies such as the Plant Ontology (PO) and how these tools can transform plant science by facilitating new ways of gathering and exploring data.

When data from many divergent sources, such as data about some specific plant organ, are associated or “tagged” with particular terms from a single ontology or set of interrelated ontologies, the data become easier to find, and computers can use the logical relationships in the ontologies to correctly combine the information from the different databases. Moreover, computers can also use ontologies to aggregate data associated with the different subclasses or parts of entities.

For example, suppose a researcher is searching online for all examples of gene expression in a leaf. Any botanist performing this search would include experiments that described gene expression in petioles and midribs or in a frond. However, a search engine would not know that it needs to include these terms in its search — unless it was told that a frond is a type of leaf, and that every petiole and every midrib are parts of some leaf. It is this information that ontologies provide.
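
To make this concrete, here is a minimal toy sketch of how a search engine could use is_a and part_of relations to expand a query for “leaf.” The term names and data structure are invented for illustration; this is not the Plant Ontology’s actual format or API:

    # Toy ontology for illustration only (not the Plant Ontology's real structure):
    # each term maps to the relations linking it to broader terms.
    ONTOLOGY = {
        "frond":   [("is_a", "leaf")],       # a frond is a type of leaf
        "petiole": [("part_of", "leaf")],    # every petiole is part of some leaf
        "midrib":  [("part_of", "leaf")],
        "leaf":    [("is_a", "plant organ")],
    }

    def terms_under(target):
        """Return the target plus every term that is a subclass or part of it,
        following is_a and part_of links transitively."""
        hits = {target}
        changed = True
        while changed:
            changed = False
            for term, links in ONTOLOGY.items():
                if term not in hits and any(parent in hits for _, parent in links):
                    hits.add(term)
                    changed = True
        return hits

    # A query for gene expression "in a leaf" should also return records
    # annotated with frond, petiole, or midrib.
    print(sorted(terms_under("leaf")))   # ['frond', 'leaf', 'midrib', 'petiole']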

The article in the American Journal of Botany by Walls and colleagues describes what ontologies are, why they are relevant to plant science, and some of the basic principles of ontology development. It includes an overview of the ontologies that are relevant to botany, with a more detailed description of the PO and the challenges of building an ontology that covers all green plants. The article also describes four key areas of plant science that could benefit from the use of ontologies: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Although most of the examples in this article are drawn from plant science, the principles could apply to any group of organisms, and the article should be of interest to zoologists as well.

As genomic and phenomic data become available for more species, many different research groups are embarking on the annotation of their data and images with ontology terms. At the same time, cross-species queries are becoming more common, causing more researchers in plant science to turn to ontologies. Ontology developers are working with the scientists who generate data to make sure ontologies accurately reflect current science, and with database developers and publishers to find ways to make it easier for scientists to associate their data with ontologies.

Journal Reference:

  1. R. L. Walls, B. Athreya, L. Cooper, J. Elser, M. A. Gandolfo, P. Jaiswal, C. J. Mungall, J. Preece, S. Rensing, B. Smith, D. W. Stevenson. Ontologies as integrative tools for plant science. American Journal of Botany, 2012; 99 (8): 1263. DOI: 10.3732/ajb.1200222

Cloud Brightening to Control Global Warming? Geoengineers Propose an Experiment (Science Daily)

A conceptualized image of an unmanned, wind-powered, remotely controlled ship that could be used to implement cloud brightening. (Credit: John McNeill)

ScienceDaily (Aug. 20, 2012) — Even though it sounds like science fiction, researchers are taking a second look at a controversial idea that uses futuristic ships to shoot salt water high into the sky over the oceans, creating clouds that reflect sunlight and thus counter global warming.

University of Washington atmospheric physicist Rob Wood describes a possible way to run an experiment to test the concept on a small scale in a comprehensive paper published this month in the journal Philosophical Transactions of the Royal Society.

The point of the paper — which includes updates on the latest study into what kind of ship would be best to spray the salt water into the sky, how large the water droplets should be and the potential climatological impacts — is to encourage more scientists to consider the idea of marine cloud brightening and even poke holes in it. In the paper, he and a colleague detail an experiment to test the concept.

“What we’re trying to do is make the case that this is a beneficial experiment to do,” Wood said. With enough interest in cloud brightening from the scientific community, funding for an experiment may become possible, he said.

The theory behind so-called marine cloud brightening is that adding particles, in this case sea salt, to the sky over the ocean would form large, long-lived clouds. Clouds appear when water forms around particles. Since there is a limited amount of water in the air, adding more particles creates more, but smaller, droplets.

“It turns out that a greater number of smaller drops has a greater surface area, so it means the clouds reflect a greater amount of light back into space,” Wood said. That creates a cooling effect on Earth.
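
A back-of-the-envelope calculation (an illustration added here, not taken from the paper) makes the geometry explicit: for a fixed amount of liquid water split into N equal droplets, each droplet radius scales as N^(-1/3), so the summed cross-sectional area scales as N^(1/3). More droplets therefore present more reflecting surface.

    # Illustrative back-of-the-envelope calculation (not from the paper):
    # split a fixed volume of liquid water into more, smaller droplets and
    # see how the total cross-sectional area (the reflecting surface) grows.
    import math

    def droplet_stats(liquid_water_m3, n_droplets):
        """Return (droplet radius in m, total cross-sectional area in m^2)
        for a fixed water volume divided into n equal spherical droplets."""
        per_drop = liquid_water_m3 / n_droplets             # volume of one droplet
        r = (3.0 * per_drop / (4.0 * math.pi)) ** (1 / 3)   # from V = 4/3 * pi * r^3
        area = n_droplets * math.pi * r ** 2                # summed cross-sections
        return r, area

    water = 1e-6  # one cubic centimetre of liquid water, expressed in m^3
    for n in (1_000, 1_000_000):
        r, area = droplet_stats(water, n)
        print(f"{n:>9} droplets: r = {r * 1e6:7.1f} micrometres, "
              f"total cross-section = {area * 1e4:6.1f} cm^2")

    # 1000x more droplets from the same water -> 1000**(1/3) = 10x the
    # reflecting cross-section, hence a brighter cloud.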

Marine cloud brightening is part of a broader concept known as geoengineering, which encompasses efforts to use technology to manipulate the environment. Brightening, like other geoengineering proposals, is controversial for its ethical and political ramifications and the uncertainty around its impact. But those aren’t reasons not to study it, Wood said.

“I would rather that responsible scientists test the idea than groups that might have a vested interest in proving its success,” he said. The danger with private organizations experimenting with geoengineering is that “there is an assumption that it’s got to work,” he said.

Wood and his colleagues propose trying a small-scale experiment to test feasibility and begin to study effects. The test should start by deploying sprayers on a ship or barge to ensure that they can inject enough particles of the targeted size to the appropriate elevation, Wood and a colleague wrote in the report. An airplane equipped with sensors would study the physical and chemical characteristics of the particles and how they disperse.

The next step would be to use additional airplanes to study how the cloud develops and how long it remains. The final phase of the experiment would send out five to 10 ships spread out across a 100 kilometer, or 62 mile, stretch. The resulting clouds would be large enough so that scientists could use satellites to examine them and their ability to reflect light.

Wood said there is very little chance of long-term effects from such an experiment. Based on studies of pollutants, which emit particles that cause a similar reaction in clouds, scientists know that the impact of adding particles to clouds lasts only a few days.

Still, such an experiment would be unusual in the world of climate science, where scientists observe rather than actually try to change the atmosphere.

Wood notes that running the experiment would advance knowledge around how particles like pollutants impact the climate, although the main reason to do it would be to test the geoengineering idea.

A phenomenon that inspired marine cloud brightening is ship trails: clouds that form behind the paths of ships crossing the ocean, similar to the trails that airplanes leave across the sky. Ship trails form around particles released from burning fuel.

But in some cases ship trails make clouds darker. “We don’t really know why that is,” Wood said.

Despite increasing interest from scientists like Wood, there is still strong resistance to cloud brightening.

“It’s a quick-fix idea when really what we need to do is move toward a low-carbon emission economy, which is turning out to be a long process,” Wood said. “I think we ought to know about the possibilities, just in case.”

The authors of the paper are treading cautiously.

“We stress that there would be no justification for deployment of [marine cloud brightening] unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favor of such action,” they wrote in the paper’s summary.

There are 25 authors on the paper, including scientists from University of Leeds, University of Edinburgh and the Pacific Northwest National Laboratory. The lead author is John Latham of the National Center for Atmospheric Research and the University of Manchester, who pioneered the idea of marine cloud brightening.

Wood’s research was supported by the UW College of the Environment Institute.

Journal Reference:

  1. J. Latham, K. Bower, T. Choularton, H. Coe, P. Connolly, G. Cooper, T. Craft, J. Foster, A. Gadian, L. Galbraith, H. Iacovides, D. Johnston, B. Launder, B. Leslie, J. Meyer, A. Neukermans, B. Ormond, B. Parkes, P. Rasch, J. Rush, S. Salter, T. Stevenson, H. Wang, Q. Wang, R. Wood. Marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2012; 370 (1974): 4217. DOI: 10.1098/rsta.2012.0086