Monthly archive: November 2011

Claude Lévi-Strauss: Founder of Post-Structuralism – Eduardo Viveiros de Castro

Lecture by the anthropologist Eduardo Viveiros de Castro at the IEB (USP) on October 9, 2008, during an event marking the centenary of Claude Lévi-Strauss.


Longer life expectancy brings new problems for people with intellectual disabilities (FAPESP)

Pesquisa FAPESP
Issue 189 – November 2011

Science > Aging
The price of longevity

Carlos Fioravanti

People with intellectual disabilities, who 40 years ago died in adolescence, can now live past 60. Because they are living longer, other organic problems are emerging. Meeting for two days in August at the Associação de Pais e Amigos dos Excepcionais (Apae) of São Paulo, physicians and researchers from the Federal University of São Paulo (Unifesp) and the University of São Paulo (USP), along with psychologists, therapists, lawyers, social workers and other health professionals, singled out one of the gravest emerging problems: the possibility of premature aging.

In a preliminary survey conducted in 2009 at six institutions in the city of São Paulo, out of a group of 373 people with intellectual disability (ID; the expression “mental deficiency” is no longer recommended) over the age of 30, 192 showed at least three signs of probable premature aging, according to a questionnaire that assessed possible losses of memory, of autonomy in everyday tasks, of interest in activities, or of vision and hearing. To gauge the scale of the problem, a broader and more detailed survey is being prepared, covering about 500 people with ID aged 30 to 59 in Greater São Paulo.

The ongoing studies are essential “to see what can be done in terms of medical care and public policy,” says Regina Leondarides, coordinator of the study group on premature aging in people with intellectual disabilities, which brings together 10 care institutions. “We have many health policies aimed at children, but policies for aging are only beginning to be built,” comments Esper Cavalheiro, professor at Unifesp and president of the scientific council of the Instituto Apae de São Paulo. “We are behind, given how fast the Brazilian population is aging.”

A study from Spain published in 2008 indicated that people with ID age prematurely, and that those with Down syndrome do so more intensely. To reach these conclusions, the researchers followed the health of 238 people with ID over the age of 40 for five years. The phenomenon does not appear to be inevitable: premature aging in people with mild and moderate ID results from the lack of health-promotion programs and from reduced access to medical and social services. The people with ID showed a greater tendency toward obesity (only 25% had a weight considered normal), high blood pressure (25% of the total) and metabolic disorders such as diabetes and hypothyroidism (10% of the total).

“Premature aging, if confirmed, may have genetic or environmental causes, independently of the intellectual disability,” comments Dalci Santos, manager of the Instituto Apae de São Paulo. A mathematician by training, with a doctorate underway at Unifesp, she adds: “We will not get very far until we better clarify the origin of intellectual disabilities.” The causes can be genetic, as in Down syndrome, or environmental (non-genetic), including infections, low oxygen supply to the fetal brain, alcoholism, radiation, lead poisoning during pregnancy, or prematurity; often several factors act in combination.

Environmental or genetic causes
In an article in the first issue of Revista de Deficiência Intelectual DI, a publication of the Instituto Apae launched in October, João Monteiro de Pina-Neto, a medical geneticist at USP’s Ribeirão Preto School of Medicine, presents the results of a study on the causes of intellectual disability in 200 people served by the Apaes of Altinópolis and Serrana, two municipalities in the Ribeirão Preto region. The study is part of a larger survey, covering about a thousand people with ID served by four Apaes, which Pina-Neto and his team expect to complete by mid-2012. The results so far indicate a predominance of environmental causes (42.5% of the total), followed by genetic (29%) and undetermined (20%) causes.

A similar study of 10,000 people in South Carolina, in the United States, found the same percentage of genetic causes, but only 18% environmental causes and 56% unknown causes. Some contrasts stand out. While intellectual disability caused by lack of cerebral oxygenation accounts for 5% of all causes of ID in the United States, in São Paulo it accounts for 16.5%; prematurity, 5% in the United States, was 14.5% in the São Paulo study; and the effect of infections, 5% there, is nearly double here, at 9%.

The conclusion that emerges from this comparison is that the number of babies born with ID could be reduced through a few preventive measures. “Improving prenatal care and the quality of delivery is a priority,” stresses Pina-Neto. “We still see cases of disability caused by syphilis, rubella or toxoplasmosis contracted during pregnancy, and by postnatal meningitis,” he laments. Another controllable problem, he says, is alcoholism. “Between 20% and 30% of women in the Ribeirão Preto region drink alcohol to excess and, as a result, for every 100 pregnancies one child is born with ID caused by fetal alcohol syndrome,” he says. “We still do not adequately prevent the causes of intellectual disability.”

Genetic causes can also be managed, since the risk of a child being born with Down syndrome rises sharply with parental age. “Women are having children after age 35, and are therefore more prone to having children with Down syndrome, and men are marrying several times and having children in each marriage,” says Pina-Neto. He adds that infertile men who seek out fertility clinics should be better informed about the possibility of carrying genetic alterations that could be passed on to their children if they become fertile.

People with ID have reasoning capacity well below average and limitations in learning, caring for themselves or communicating with others, but today they are far more socially integrated, autonomous and productive, with more opportunities to express their creativity, than a few decades ago. They attend regular schools alongside other children and adults, take part in sports competitions and win more positions in the job market. Children and adults with ID no longer go to the Apae de São Paulo every day to learn; they come a few times a week for specialized educational services or medical appointments. The aging-support service currently assists 132 people aged 30 to 67.

Many doubts remain about how to handle the new problems. Children and adults with disabilities need routines and schedules to feel calm and comfortable. At the same time, unvarying habits may favor the onset of Alzheimer’s disease, a neurological condition that worsens with aging. Hence an impasse: keeping the routine unchanged could feed the propensity for Alzheimer’s, but breaking the routine can be distressing.

A propensity for Alzheimer’s
The brains of people with Down syndrome can exhibit one of the typical signs of Alzheimer’s: the buildup of amyloid plaques, which hinder the proper functioning of neurons. A team at the University of California, Los Angeles, found amyloid plaques in greater quantity in the brains of people with Down syndrome than in people already diagnosed with Alzheimer’s and in unaffected people.

“The biological signs of Alzheimer’s can appear before the clinical signs,” observes Orestes Forlenza, professor at USP’s School of Medicine. “Having amyloid does not mean future dementia. What is the best future intervention? We do not know. Perhaps nutrition or physical activity would be safer than medication.” Ira Lott and his team at the University of California, Irvine, ran a two-year double-blind study with 53 people with Down syndrome to see whether supplementing the diet with antioxidant compounds could improve mental functioning or stabilize the loss of cognitive capacity. The results, published in August in the American Journal of Medical Genetics, indicated that it could not.

Esper Cavalheiro posed three questions that remain unanswered. How do the changes typical of aging, such as cardiovascular disease, diabetes and cancer, present in people with ID? How do conditions common in these people, such as dementias and osteoporosis, behave with aging? Do the drugs used to treat hypertension, diabetes and other diseases typical of aging work in people with ID the same way they do in other individuals?

Another open question: do the strategies for controlling cardiovascular risk factors recommended for the general population, such as encouraging physical activity, have the same impact on the health of people with and without intellectual disability? “We assume so, but we do not know for certain,” says Ricardo Nitrini, of USP.

According to Cavalheiro, people with ID aged 65 or older corresponded to 4% of the total population in the 2000 Census; today they account for 5.5%. “We cannot be content with statistics and diagnoses alone,” he warns. “We have to face this problem quickly. The more people talking and thinking about these problems, the better.”

Castles in the Desert: Satellites Reveal Lost Cities of Libya (Science Daily)

ScienceDaily (Nov. 7, 2011) — Satellite imagery has uncovered new evidence of a lost civilisation of the Sahara in Libya’s south-western desert wastes that will help rewrite the history of the country. The fall of Gaddafi has opened the way for archaeologists to explore the country’s pre-Islamic heritage, so long ignored under his regime.

Satellite image of area of desert with archaeological interpretation of features: fortifications are outlined in black, areas of dwellings are in red and oasis gardens are in green. (Credit: Copyright 2011 Google, image copyright 2011 DigitalGlobe)

Using satellites and air-photographs to identify the remains in one of the most inhospitable parts of the desert, a British team has discovered more than 100 fortified farms and villages with castle-like structures and several towns, most dating to between AD 1 and 500.

These “lost cities” were built by a little-known ancient civilisation called the Garamantes, whose lifestyle and culture were far more advanced and historically significant than the ancient sources suggested.

The team from the University of Leicester has identified the mud brick remains of the castle-like complexes, with walls still standing up to four metres high, along with traces of dwellings, cairn cemeteries, associated field systems, wells and sophisticated irrigation systems. Follow-up ground survey earlier this year confirmed the pre-Islamic date and remarkable preservation.

“It is like someone coming to England and suddenly discovering all the medieval castles. These settlements had been unremarked and unrecorded under the Gaddafi regime,” says the project leader David Mattingly FBA, Professor of Roman Archaeology at the University of Leicester.

“Satellite imagery has given us the ability to cover a large region. The evidence suggests that the climate has not changed over the years and we can see that this inhospitable landscape with zero rainfall was once very densely built up and cultivated. These are quite exceptional ancient landscapes, both in terms of the range of features and the quality of preservation,” says Dr Martin Sterry, also of the University of Leicester, who has been responsible for much of the image analysis and site interpretation.

The findings challenge a view dating back to Roman accounts that the Garamantes consisted of barbaric nomads and troublemakers on the edge of the Roman Empire.

“In fact, they were highly civilised, living in large-scale fortified settlements, predominantly as oasis farmers. It was an organised state with towns and villages, a written language and state-of-the-art technologies. The Garamantes were pioneers in establishing oases and opening up Trans-Saharan trade,” Professor Mattingly said.

The professor and his team were forced to evacuate Libya in February when the anti-Gaddafi revolt started, but hope to be able to return to the field as soon as security is fully restored. The Libyan antiquities department, badly under-resourced under Gaddafi, is closely involved in the project. Funding for the research has come from the European Research Council, which awarded Professor Mattingly an ERC Advanced Grant of nearly 2.5 million euros, as well as the Leverhulme Trust, the Society for Libyan Studies and the GeoEye Foundation.

“It is a new start for Libya’s antiquities service and a chance for the Libyan people to engage with their own long-suppressed history,” says Professor Mattingly.

“These represent the first towns in Libya that weren’t the colonial imposition of Mediterranean people such as the Greeks and Romans. The Garamantes should be central to what Libyan school children learn about their history and heritage.”

Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma (the Brazilian edition of The Fourth Paradigm) discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

November 7, 2011

By Fábio de Castro

Agência FAPESP – If a few years ago the lack of data limited the advance of science, today the problem has been inverted. New data-capture technologies, across the most varied fields and scales, have generated such an immense volume of information that the excess has become a bottleneck for scientific progress.

In this context, computer scientists have been joining forces with specialists from different fields to develop new concepts and theories capable of handling the flood of data in contemporary science. The result is called eScience.

That is the subject of the book O Quarto Paradigma – Descobertas científicas na era da eScience (The Fourth Paradigm: scientific discoveries in the era of eScience), released on November 3 by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle, all of Microsoft Research, the book was launched at FAPESP headquarters, at an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

During the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the University of São Paulo (USP), gave the talk “eScience in Brazil.” “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the topic of the talk by Daniel Fay, director of Earth, Energy and Environment at Microsoft Research (MSR).

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is deeply connected to this idea, because many of our projects and programs have this need for greater capacity to manage large data sets. Our great challenge lies in the science behind this capacity to handle large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a great need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. On the other hand, scientists do not usually perceive the computer as a major new instrument that is revolutionizing science. FAPESP is interested in actions to make the scientific community aware that there are great challenges in the field of eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment,” “Health and wellbeing,” “Scientific infrastructure” and “Scholarly communication.”

“The book is about the emergence of a new paradigm for scientific discovery. Thousands of years ago, the prevailing paradigm was experimental science, founded on the description of natural phenomena. A few hundred years ago came the paradigm of theoretical science, symbolized by Newton’s laws. A few decades ago computational science emerged, simulating complex phenomena. Now we arrive at the fourth paradigm, that of data-driven science,” said Fay.

With the advent of the new paradigm, he said, the very nature of scientific discovery has changed completely. Complex models have come into play, spanning wide spatial and temporal scales, which demand ever more multidisciplinary interaction.

“The data, in incredible quantities, come from different sources, and they too require a multidisciplinary approach and, often, real-time processing. Scientific communities are also more distributed. All of this has transformed the way discoveries are made,” said Fay.

Ecology, one of the fields most affected by large volumes of data, is an example of how the advance of science will increasingly depend on collaboration between academic researchers and computing specialists.

“We live in a storm of remote sensing, cheap ground sensors and data access over the internet. But extracting the variables that science requires from this mass of heterogeneous data remains a problem. It takes specialized knowledge of algorithms, file formats and data cleaning, for example, which is not always accessible to people in ecology,” he explained.

The same happens in fields such as medicine and biology, which benefit from new technologies for recording brain activity or sequencing DNA, and in astronomy and physics, as modern telescopes capture terabytes of information daily and the Large Hadron Collider (LHC) generates petabytes of data each year.

Virtual Institute

According to Cesar Jr., the eScience community in Brazil is growing. The country has 2,167 undergraduate programs in information systems, computer engineering or computer science. In 2009, 45,000 students graduated in these fields, and between 2007 and 2009 graduate education in the area comprised 32 programs, a thousand advisors, 2,705 master’s students and 410 doctoral students.

“Science has shifted from the paradigm of data acquisition to that of data analysis. We have different technologies producing terabytes in many fields of knowledge and, today, we can say these fields focus on analyzing a deluge of data,” said the researcher, a member of FAPESP’s Computer Science and Engineering area coordination panel.

In 2006, the Brazilian Computer Society (SBC) organized a meeting to identify the key problems and main challenges for the field. This led to several proposals for the National Council for Scientific and Technological Development (CNPq) to create a program specifically for this type of problem.

“In 2009 we held a series of workshops at FAPESP, bringing together scientists from fields such as agriculture, climate change, medicine, transcriptomics, games, e-government and social networks to discuss the issue. The initiative resulted in excellent collaborations among groups of scientists with similar problems and gave rise to several initiatives,” said Cesar Jr.

The calls for proposals of the Microsoft Research-FAPESP Institute for IT Research, he said, have been an important part of the set of initiatives to promote eScience, as has the organization of the Escola São Paulo de Ciência Avançada em Processamento e Visualização de Imagens Computacionais (São Paulo School of Advanced Science on Computational Image Processing and Visualization). In addition, FAPESP has supported several research projects related to the theme.

“The eScience community in São Paulo has been working with professionals from many fields and publishing in journals across them. That is an indication of the quality the community has acquired to face the great challenge of the coming years,” said Cesar Jr., who wrote the preface to the book’s Brazilian edition.

  • O Quarto Paradigma
    Editors: Tony Hey, Stewart Tansley and Kristin Tolle
    Published: 2011
    Price: R$ 60
    Pages: 263
    More information: www.ofitexto.com.br

Scientists Find Evidence of Ancient Megadrought in Southwestern U.S. (Science Daily)

ScienceDaily (Nov. 6, 2011) — A new study at the University of Arizona’s Laboratory of Tree-Ring Research has revealed a previously unknown multi-decade drought period in the second century A.D. The findings give evidence that extended periods of aridity have occurred at intervals throughout our past.

A cross section of wood shows the annual growth rings trees add with each growing season. Dark bands of latewood form the boundary between each ring and the next. Counting backwards from the bark reveals a tree’s age. (Credit: Photo by Daniel Griffin/Laboratory of Tree-Ring Research)

Almost 900 years ago, in the mid-12th century, the southwestern U.S. was in the middle of a multi-decade megadrought. It was the most recent extended period of severe drought known for this region. But it was not the first.

The second century A.D. saw an extended dry period of more than 100 years characterized by a multi-decade drought lasting nearly 50 years, says a new study from scientists at the University of Arizona.

UA geoscientists Cody Routson, Connie Woodhouse and Jonathan Overpeck conducted a study of the southern San Juan Mountains in south-central Colorado. The region serves as a primary drainage site for the Rio Grande and San Juan rivers.

“These mountains are very important for both the San Juan River and the Rio Grande River,” said Routson, a doctoral candidate in the environmental studies laboratory of the UA’s department of geosciences and the primary author of the study, which is forthcoming in Geophysical Research Letters.

The San Juan River is a tributary of the Colorado River, meaning any climate changes that affect the San Juan drainage also likely would affect the Colorado River and its watershed. Said Routson: “We wanted to develop as long a record as possible for that region.”

Dendrochronology is the precise science of using annual growth rings of trees to understand past climate. Because trees add a normally well-defined growth ring around their trunk each year, counting the rings backwards from a tree’s bark allows scientists to determine not only the age of the tree, but which years were good for growth and which years were more difficult.

“If it’s a wet year, they grow a wide ring, and if it’s a dry year, they grow a narrow ring,” said Routson. “If you average that pattern across trees in a region you can develop a chronology that shows what years were drier or wetter for that particular region.”

Darker wood, referred to as latewood because it develops in the latter part of the year at the end of the growing season, forms a usually distinct boundary between one ring and the next. The latewood is darker because growth at the end of the growing season has slowed and the cells are more compact.

To develop their chronology, the researchers looked for indications of climate in the past in the growth rings of the oldest trees in the southern San Juan region. “We drove around and looked for old trees,” said Routson.

Few living things are older than a bristlecone pine tree: among the oldest and longest-living species on the planet, these pine trees normally are found clinging to bare rocky landscapes of alpine or near-alpine mountain slopes. The trees, the oldest of which are more than 4,000 years old, are capable of withstanding extreme drought conditions.

“We did a lot of hiking and found a couple of sites of bristlecone pines, and one in particular that we homed in on,” said Routson.

To sample the trees without damaging them, the dendrochronologists used a tool like a metal screw that bores a tiny hole in the trunk of the tree and allows them to extract a sample, called a core. “We take a piece of wood about the size and shape of a pencil from the tree,” explained Routson.

“We also sampled dead wood that was lying about the land. We took our samples back to the lab where we used a visual, graphic technique to match where the annual growth patterns of the living trees overlap with the patterns in the dead wood. Once we have the pattern matched we measure the rings and average these values to generate a site chronology.”
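
The chronology-building step Routson describes is simple enough to sketch in code. What follows is a minimal Python illustration, not the Laboratory of Tree-Ring Research’s actual software: the moving-average detrending, the correlation-based matching and all function names are simplifying assumptions standing in for the lab’s more careful standardization and visual cross-dating techniques.

    import numpy as np

    def detrend(widths, window=31):
        # Divide each ring width by a centered moving average to strip out
        # the tree's age-related growth trend, leaving a unitless index:
        # about 1.0 = average growth, below 1 = narrow (dry) year.
        trend = np.convolve(widths, np.ones(window) / window, mode="same")
        return widths / np.where(trend > 0, trend, np.nan)

    def build_chronology(series):
        # `series` maps tree ID -> (first calendar year, widths oldest-first).
        # Averaging detrended indices year by year across trees yields the
        # site chronology: which years were wetter or drier for the region.
        years = {}
        for first_year, widths in series.values():
            for offset, idx in enumerate(detrend(np.asarray(widths, float))):
                years.setdefault(first_year + offset, []).append(idx)
        return {y: float(np.nanmean(v)) for y, v in sorted(years.items())}

    def crossdate(chronology, floating, min_overlap=50):
        # Date a "floating" series from dead wood by sliding it along the
        # dated chronology and keeping the start year whose overlapping
        # growth pattern correlates best with the living-tree record.
        years = sorted(chronology)
        signal = np.array([chronology[y] for y in years])
        float_idx = detrend(np.asarray(floating, float))
        best_year, best_r = None, -np.inf
        for start in range(len(signal) - min_overlap + 1):
            n = min(len(float_idx), len(signal) - start)
            if n < min_overlap:
                continue
            r = np.corrcoef(float_idx[:n], signal[start:start + n])[0, 1]
            if r > best_r:
                best_year, best_r = years[start], r
        return best_year, best_r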

“In our chronology for the south San Juan mountains we created a record that extends back 2,200 years,” said Routson. “It was pretty profound that we were able to get back that far.”

The chronology extends many years earlier than the medieval period, during which two major drought events in that region already were known from previous chronologies.

“The medieval period extends roughly from 800 to 1300 A.D.,” said Routson. “During that period there was a lot of evidence from previous studies for increased aridity, in particular two major droughts: one in the middle of the 12th century, and one at the end of the 13th century.”

“Very few records are long enough to assess the global conditions associated with these two periods of Southwestern aridity,” said Routson. “And the available records have uncertainties.”

But the chronology from the San Juan bristlecone pines showed something completely new:

“There was another period of increased aridity even earlier,” said Routson. “This new record shows that in addition to known droughts from the medieval period, there is also evidence for an earlier megadrought during the second century A.D.”

“What we can see from our record is that it was a period of basically 50 consecutive years of below-average growth,” said Routson. “And that’s within a much broader period that extends from around 124 A.D. to 210 A.D. — about a 100-year-long period of dry conditions.”

“We’re showing that there are multiple extreme drought events that happened during our past in this region,” said Routson. “These megadroughts lasted for decades, which is much longer than our current drought. And the climatic events behind these previous dry periods are really similar to what we’re experiencing today.”

The prolonged drought in the 12th century and the newly discovered event in the second century A.D. may both have been influenced by warmer-than-average Northern Hemisphere temperatures, Routson said: “The limited records indicate there may have been similar La Niña-like background conditions in the tropical Pacific Ocean, which are known to influence modern drought, during the two periods.”

Although natural climate variation has led to extended dry periods in the southwestern U.S. in the past, there is reason to believe that human-driven climate change will increase the frequency of extreme droughts in the future, said Routson. In other words, we should expect similar multi-decade droughts in a future predicted to be even warmer than the past.

Routson’s research is funded by fellowships from the National Science Foundation, the Science Foundation Arizona and the Climate Assessment of the Southwest. His advisors, Woodhouse of the School of Geography and Development and Overpeck of the department of geosciences and co-director of the UA’s Institute of the Environment, are co-authors of the study.

Copyright: A Conceptual Battle in a Digital Age (Science Daily)

ScienceDaily (Nov. 3, 2011) — What is it about copyright that doesn’t work in the digital society? Why do millions of people think it’s OK to break the law when it comes to file sharing in particular? Sociology of law researcher Stefan Larsson from Lund University believes that legal metaphors and old-fashioned mindsets contribute to the confusion and widening gaps between legislation and the prevailing norms.

Our language is made up of metaphors, even in our legal texts. Stefan Larsson has studied what consequences this has when digital phenomena, such as file sharing and downloading, are limited by descriptions intended for an analogue world. “When legal arguments equate file sharing with theft of physical objects, it sometimes becomes problematic,” says Stefan Larsson, who doesn’t think it is possible to equate an illegal download with theft of a physical object, as has been done in the case against The Pirate Bay.

Using the compensation model employed in the case against The Pirate Bay, the total value of such a site could be calculated at over SEK 600 billion. This is almost as much as Sweden’s national budget, says Stefan Larsson. The prosecutor in the Pirate Bay case chose to pursue a smaller number of downloads and the sum of the fines therefore never reached these proportions.

In Stefan Larsson’s view, the word ‘copies’ is a hidden legal metaphor that causes problematic ideas in the digital society. For example, copyright does not take into account that a download does not result in the owner losing his or her own copy. Neither is it possible to equate number of downloads with lost income for the copyright holder, since it is likely that people download a lot more than they would purchase in a shop.

Other metaphors that are used for downloading are infringement, theft and piracy. “The problem is that these metaphors make us equate copyright with ownership of physical property,” says Stefan Larsson.

Moreover, there are underlying mindsets which guide the whole of copyright, according to Stefan Larsson. One such mindset is the idea that creation is a process undertaken by sole geniuses and not so much in a cultural context. In Stefan Larsson’s view, this has the unfortunate consequence of making stronger copyright protection with longer duration and a higher degree of legal enforcement appear reasonable. The problem is that it is based on a misconception of how a lot of things are created, says Stefan Larsson: “Borrowing and drawing inspiration from other artists is essential to a lot of creative activity. This is the case both online and offline.”

Stefan Larsson has also studied the consequences when public perception of the law, or social norms, is not in line with what the law says. One consequence is that the State needs to exercise more control and issue more severe penalties in order to ensure that the law is followed. The European trend in copyright law is heading in this direction. Among other things, it is being made easier to track what individuals do on the Internet. This means that the integrity of the many is being eroded to benefit the interests of a few, according to Stefan Larsson: “When all’s said and done, it is about what we want the Internet to be. The fight for this is taking place, at least partially, through metaphorical expressions for underlying conceptions, but also through practical action on the role of anonymity online.”

Stefan Larsson’s thesis is entitled Metaphors and Norms – Understanding Copyright Law in a Digital Society.

The Human Cause of Climate Change: Where Does the Burden of Proof Lie? (Science Daily)

ScienceDaily (Nov. 3, 2011) — The debate may largely be drawn along political lines, but the human role in climate change remains one of the most controversial questions in 21st century science. Writing in WIREs Climate Change Dr Kevin Trenberth, from the National Center for Atmospheric Research, argues that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role.

Polar bear on melting ice. Experts argue that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role. (Credit: iStockphoto/Kristian Septimius Krogh)

In response to Trenberth’s argument a second review, by Dr Judith Curry, focuses on the concept of a ‘null hypothesis’: the default position taken when research is carried out. Currently the null hypothesis for climate change attribution research is that humans have no influence.

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”

To show precedent for his position Trenberth cites the 2007 report by the Intergovernmental Panel on Climate Change which states that global warming is “unequivocal,” and is “very likely” due to human activities.

Trenberth also took aim at climate attribution studies that assume no human component, suggesting that such assumptions skew results toward finding no human influence and produce misleading statements about the causes of climate change that can grossly underestimate the role of humans in climate events.

“Scientists must challenge misconceptions in the difference between weather and climate while attribution studies must include a human component,” concluded Trenberth. “The question should no longer be is there a human component, but what is it?”

In a second paper Dr Judith Curry, from the Georgia Institute of Technology, questions this position, but argues that the discussion on the null hypothesis serves to highlight fuzziness surrounding the many hypotheses related to dangerous climate change.

“Regarding attribution studies, rather than trying to reject either hypothesis regardless of which is the null, there should be a debate over the significance of anthropogenic warming relative to forced and unforced natural climate variability,” said Curry.

Curry also suggested that the desire to reverse the null hypothesis may have the goal of seeking to marginalise the climate sceptic movement, a vocal group who have challenged the scientific orthodoxy on climate change.

“The proponents of reversing the null hypothesis should be careful of what they wish for,” concluded Curry. “One consequence may be that the scientific focus, and therefore funding, would also reverse to attempting to disprove dangerous anthropogenic climate change, which has been a position of many sceptics.”

“I doubt Trenberth’s suggestion will find much support in the scientific community,” said Professor Myles Allen from Oxford University, “but Curry’s counter proposal to abandon hypothesis tests is worse. We still have plenty of interesting hypotheses to test: did human influence on climate increase the risk of this event at all? Did it increase it by more than a factor of two?”

Minister attends inauguration of weather radar in Ceará (Press office of the government of Ceará)

JC e-mail 4378, November 4, 2011.

Aloizio Mercadante and the governor of Ceará, Cid Gomes, inaugurated the S-Band weather radar in Quixeramobim (CE). The equipment will help forecast droughts and floods.

Forecasts of droughts and floods, climate change and all weather-related events will now be reported by the Ceará Foundation for Meteorology and Water Resources (Funceme) with greater precision, since the agency now has a new instrument for capturing this information.

The new S-Band weather radar was inaugurated on Thursday (3) by governor Cid Gomes and the minister of Science and Technology, Aloizio Mercadante. Located on the Morro de Santa Maria, in Quixeramobim, in the Sertão Central region, the equipment will operate as part of the Ceará Radar Network (RCR), integrated with the X-Band Doppler radar installed in Fortaleza. “It looks like a simple piece of equipment, but behind it lies unimaginable usefulness. Technology can be an ally in improving the population’s quality of life, which is our commitment,” Cid Gomes said during the inauguration.

The governor explained that the new equipment can report very specific weather conditions, such as “that five millimeters of rain fell in the municipality of Nova Olinda, in Cariri.” “As information like this is combined with other data, it will help diagnose, for example, a period of drought or flooding. We are a state with nearly 300,000 small farmers, and they need concrete information to look after their harvests. For that, the radar will be very useful,” Cid Gomes stressed.

The S-Band radar can estimate precipitation within a 200-kilometer radius. It can also monitor weather systems operating within a range of up to 400 kilometers. Given its capacity and location, it will be able to gather information not only on Ceará but on several other northeastern states. “This is an instrument for agricultural planning that will also benefit many states in the Northeast, such as Paraíba, Pernambuco, Piauí and Rio Grande do Norte,” noted Aloizio Mercadante. The minister also stressed its importance in preventing natural disasters, such as long droughts or rainfall well above average. “We need to understand why these events happen and guard against the effects that climate change can cause,” Mercadante explained.

The installation of the S-Band weather radar required an investment of R$ 14 million, R$ 10 million of which came from the federal government, through the Ministry of Science, Technology and Innovation (MCTI), and R$ 4 million from the state government of Ceará. Of the total, R$ 12 million went to purchasing the equipment and the remainder (R$ 2 million) to improving access to the site (road construction) and the power supply.

The state secretary of Science and Technology, René Barreira, recalled that the radar’s installation began with a budget amendment by Ciro Gomes when he was a federal deputy, which, combined with the support of former president Lula, made the project possible. “With this important piece of equipment we will have agricultural zoning and more effective, technical monitoring of high-risk events,” the secretary said.

The Mental Time Travel Of Animals (NPR)

11:39 am

November 3, 2011

by BARBARA J KING

Arif Ali/AFP/Getty Images. Don’t underestimate the crow.

Without a trace of agitation, the male chimpanzee piles up stones in small caches within his enclosure. He does this in the morning, before zoo visitors arrive. Hours later, in an aroused state, the ape hurls the stones at people gathering to watch him.

A detailed report by Mathias Osvath concluded that the ape had planned ahead strategically for the future. It is exactly this feat of mental time travel that psychologist Michael C. Corballis, in his book The Recursive Mind: The Origins of Human Language, Thought, and Civilization, claims is beyond the reach of nonhuman animals. Last week, my review of Corballis’s book appeared in the Times Literary Supplement.

Corballis suggests that mental time travel is one of two human ways of thinking that propelled our species into a unique cognitive status. (The other, theory of mind, I won’t deal with here.)

During mental time travel, we insert into our present consciousness an experience that we’ve had in the past or that we imagine for ourselves in the future. Corballis calls this ability mental recursion, and he’s right that we humans do it effortlessly. When we daydream at work about last weekend’s happy times with family and friends, or anticipate tonight’s quiet evening with a book, we engage in mental time travel.

Our highly elaborated ability to insert the past or future recursively into our thinking may play a role in the evolution of human civilization, as Corballis claims. But Corballis’s argument is weakened because he dismisses other animals’ mental capacities far too readily.

It’s not only one chimpanzee in a Swedish zoo who makes me think so.

When our pets grieve, as I wrote about in this space recently, they hold in their mind some memory of the past that causes them to miss a companion.

New research on the pattern of food storage by Eurasian jays indicates that these birds think ahead about what specific foods they will want in the future.

When apes (chimpanzees) and corvids (crows and ravens) make tools to obtain food, they too think ahead to a goal, even as they fashion a tool to solve the problem before them.

In the NATURE documentary film A Murder of Crows, a New Caledonian crow solves a three-part tool-using problem totally new to him (or to any other crow). As one researcher put it, the bird thinks “three chess moves into the future” as he finds one tool that allows him to get another tool that he uses finally to procure food.

Have a look at this crow’s stunning problem-solving here. The experimental footage begins at 16:30, but starting at 13:00 offers good context. And the entire film is a delight.

Fraud Case Seen as a Red Flag for Psychology Research (N.Y. Times)

By BENEDICT CAREY

Published: November 2, 2011

A well-known psychologist in the Netherlands whose work has been published widely in professional journals falsified data and made up entire experiments, an investigating committee has found. Experts say the case exposes deep flaws in the way science is done in a field, psychology, that has only recently earned a fragile respectability.

Joris Buijs/Pve

The psychologist Diederik Stapel in an undated photograph. “I have failed as a scientist and researcher,” he said in a statement after a committee found problems in dozens of his papers.

The psychologist, Diederik Stapel, of Tilburg University, committed academic fraud in “several dozen” published papers, many accepted in respected journals and reported in the news media, according to a report released on Monday by the three Dutch institutions where he has worked: the University of Groningen, the University of Amsterdam, and Tilburg. The journal Science, which published one of Dr. Stapel’s papers in April, posted an “editorial expression of concern” about the research online on Tuesday.

The scandal, involving about a decade of work, is the latest in a string of embarrassments in a field that critics and statisticians say badly needs to overhaul how it treats research results. In recent years, psychologists have reported a raft of findings on race biases, brain imaging and even extrasensory perception that have not stood up to scrutiny. Outright fraud may be rare, these experts say, but they contend that Dr. Stapel took advantage of a system that allows researchers to operate in near secrecy and massage data to find what they want to find, without much fear of being challenged.

“The big problem is that the culture is such that researchers spin their work in a way that tells a prettier story than what they really found,” said Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “It’s almost like everyone is on steroids, and to compete you have to take steroids as well.”

In a prolific career, Dr. Stapel published papers on the effect of power on hypocrisy, on racial stereotyping and on how advertisements affect how people view themselves. Many of his findings appeared in newspapers around the world, including The New York Times, which reported in December on his study about advertising and identity.

In a statement posted Monday on Tilburg University’s Web site, Dr. Stapel apologized to his colleagues. “I have failed as a scientist and researcher,” it read, in part. “I feel ashamed for it and have great regret.”

More than a dozen doctoral theses that he oversaw are also questionable, the investigators concluded, after interviewing former students, co-authors and colleagues. Dr. Stapel has published about 150 papers, many of which, like the advertising study, seem devised to make a splash in the media. The study published in Science this year claimed that white people became more likely to “stereotype and discriminate” against black people when they were in a messy environment, versus an organized one. Another study, published in 2009, claimed that people judged job applicants as more competent if they had a male voice. The investigating committee did not post a list of papers that it had found fraudulent.

Dr. Stapel was able to operate for so long, the committee said, in large measure because he was “lord of the data,” the only person who saw the experimental evidence that had been gathered (or fabricated). This is a widespread problem in psychology, said Jelte M. Wicherts, a psychologist at the University of Amsterdam. In a recent survey, two-thirds of Dutch research psychologists said they did not make their raw data available for other researchers to see. “This is in violation of ethical rules established in the field,” Dr. Wicherts said.

In a survey of more than 2,000 American psychologists scheduled to be published this year, Leslie John of Harvard Business School and two colleagues found that 70 percent had acknowledged, anonymously, to cutting some corners in reporting data. About a third said they had reported an unexpected finding as predicted from the start, and about 1 percent admitted to falsifying data.

Also common is a self-serving statistical sloppiness. In an analysis published this year, Dr. Wicherts and Marjan Bakker, also at the University of Amsterdam, searched a random sample of 281 psychology papers for statistical errors. They found that about half of the papers in high-end journals contained some statistical error, and that about 15 percent of all papers had at least one error that changed a reported finding — almost always in opposition to the authors’ hypothesis.

The American Psychological Association, the field’s largest and most influential publisher of results, “is very concerned about scientific ethics and having only reliable and valid research findings within the literature,” said Kim I. Mills, a spokeswoman. “We will move to retract any invalid research as such articles are clearly identified.”

Researchers in psychology are certainly aware of the issue. In recent years, some have mocked studies showing correlations between activity on brain images and personality measures as “voodoo” science, and a controversy over statistics erupted in January after The Journal of Personality and Social Psychology accepted a paper purporting to show evidence of extrasensory perception. In cases like these, the authors being challenged are often reluctant to share their raw data. But an analysis of 49 studies appearing Wednesday in the journal PLoS One, by Dr. Wicherts, Dr. Bakker and Dylan Molenaar, found that the more reluctant that scientists were to share their data, the more likely that evidence contradicted their reported findings.

“We know the general tendency of humans to draw the conclusions they want to draw — there’s a different threshold,” said Joseph P. Simmons, a psychologist at the University of Pennsylvania’s Wharton School. “With findings we want to see, we ask, ‘Can I believe this?’ With those we don’t, we ask, ‘Must I believe this?’ ”

But reviewers working for psychology journals rarely take this into account in any rigorous way. Neither do they typically ask to see the original data. While many psychologists shade and spin, Dr. Stapel went ahead and drew any conclusion he wanted.

“We have the technology to share data and publish our initial hypotheses, and now’s the time,” Dr. Schooler said. “It would clean up the field’s act in a very big way.”

People Rationalize Situations They’re Stuck With, but Rebel When They Think There’s an Out (Science Daily)

ScienceDaily (Nov. 1, 2011) — People who feel like they’re stuck with a rule or restriction are more likely to be content with it than people who think that the rule isn’t definite. The authors of a new study, which will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science, say this conclusion may help explain everything from unrequited love to the uprisings of the Arab Spring.

Psychological studies have found two contradictory results about how people respond to rules. Some research has found that, when there are new restrictions, you rationalize them; your brain comes up with a way to believe the restriction is a good idea. But other research has found that people react negatively against new restrictions, wanting the restricted thing more than ever.

Kristin Laurin of the University of Waterloo thought the difference might be absoluteness — how much the restriction is set in stone. “If it’s a restriction that I can’t really do anything about, then there’s really no point in hitting my head against the wall and trying to fight against it,” she says. “I’m better off if I just give up. But if there’s a chance I can beat it, then it makes sense for my brain to make me want the restricted thing even more, to motivate me to fight.” Laurin wrote the new paper with Aaron Kay and Gavan Fitzsimons of Duke University.

In an experiment in the new study, participants read that lowering speed limits in cities would make people safer. Some read that government leaders had decided to reduce speed limits. Of those people, some were told that this legislation would definitely come into effect, and others read that it would probably happen, but that there was still a small chance government officials could vote it down.

People who thought the speed limit was definitely being lowered supported the change more than control subjects, but people who thought there was still a chance it wouldn’t happen supported it less than these control subjects. Laurin says this confirms what she suspected about absoluteness; if a restriction is definite, people find a way to live with it.

This could help explain how uprisings spread across the Arab world earlier this year. When people were living under dictatorships with power that appeared to be absolute, Laurin says, they may have been comfortable with it. But once Tunisia’s president fled, citizens of neighboring countries realized that their governments weren’t as absolute as they seemed — and they could have dropped whatever rationalizations they were using to make it possible to live under an authoritarian regime. Even more, the now non-absolute restriction their governments represented could have exacerbated their reaction, fueling their anger and motivating them to take action.

And how does this relate to unrequited love? It confirms people’s intuitive sense that leading someone on can just make them fall for you more deeply, Laurin says. “If this person is telling me no, but I perceive that as not totally absolute, if I still think I have a shot, that’s just going to strengthen my desire and my feeling, that’s going to make me think I need to fight to win the person over,” she says. “If instead I believe no, I definitely don’t have a shot with this person, then I might rationalize it and decide that I don’t like them that much anyway.”

Mathematically Detecting Stock Market Bubbles Before They Burst (Science Daily)

ScienceDaily (Oct. 31, 2011) — From the dotcom bust in the late nineties to the housing crash in the run-up to the 2008 crisis, financial bubbles have been a topic of major concern. Identifying bubbles is important in order to prevent collapses that can severely impact nations and economies.

A paper published this month in the SIAM Journal on Financial Mathematics addresses just this issue. Opening fittingly with a quote from New York Federal Reserve President William Dudley emphasizing the importance of developing tools to identify and address bubbles in real time, authors Robert Jarrow, Younes Kchia, and Philip Protter propose a mathematical model to detect financial bubbles.

A financial bubble occurs when prices for assets, such as stocks, rise far above their actual value. Such an economic cycle is usually characterized by rapid expansion followed by a contraction, or sharp decline in prices.

“It has been hard not to notice that financial bubbles play an important role in our economy, and speculation as to whether a given risky asset is undergoing bubble pricing has approached the level of an armchair sport. But bubbles can have real and often negative consequences,” explains Protter, who has spent many years studying and analyzing financial markets.

“The ability to tell when an asset is or is not in a bubble could have important ramifications in the regulation of the capital reserves of banks as well as for individual investors and retirement funds holding assets for the long term. For banks, if their capital reserve holdings include large investments with unrealistic values due to bubbles, a shock to the bank could occur when the bubbles burst, potentially causing a run on the bank, as infamously happened with Lehman Brothers, and is currently happening with Dexia, a major European bank,” he goes on to explain, citing the significance of such inflated prices.

Using sophisticated mathematical methods, Protter and his co-authors answer the question of whether the price increase of a particular asset represents a bubble in real time. “[In this paper] we show that by using tick data and some statistical techniques, one is able to tell with a large degree of certainty, whether or not a given financial asset (or group of assets) is undergoing bubble pricing,” says Protter.

This question is answered by estimating an asset’s price volatility, which is stochastic or randomly determined. The authors define an asset’s price process in terms of a standard stochastic differential equation, which is driven by Brownian motion. Brownian motion, based on a natural process involving the erratic, random movement of small particles suspended in gas or liquid, has been widely used in mathematical finance. The concept is used specifically to model processes whose future changes are unrelated to their past changes.

The key characteristic in determining a bubble is the volatility of an asset’s price, which, in the case of bubbles, is very high. The authors estimate the volatility by applying state-of-the-art estimators to real-time tick price data for a given stock. They then obtain the best possible extension of this data for large values using a technique called Reproducing Kernel Hilbert Spaces (RKHS), which is a widely used method for statistical learning.

“First, one uses tick price data to estimate the volatility of the asset in question for various levels of the asset’s price,” Protter explains. “Then, a special technique (RKHS with an optimization addition) is employed to extrapolate this estimated volatility function to large values for the asset’s price, where this information is not (and cannot be) available from tick data. Using this extrapolation, one can check the rate of increase of the volatility function as the asset price gets arbitrarily large. Whether or not there is a bubble depends on how fast this increase occurs (its asymptotic rate of increase).”

If it does not increase fast enough, there is no bubble within the model’s framework.
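
To make the test concrete, here is a minimal Python sketch. It assumes the criterion this line of research relies on: for a price process dS = σ(S)dW, the price is a strict local martingale, i.e. a bubble, exactly when the integral of x/σ(x)² out to infinity is finite; if σ(x) grows like c·x^α for large x, that integral converges precisely when α > 1. The crude binned estimator and log-log power-law fit below stand in for the paper’s nonparametric volatility estimators and RKHS extrapolation, and every function name is illustrative.

    import numpy as np

    def estimate_sigma(prices, dt, n_bins=25, min_obs=10):
        # Bin squared price increments by the price level at which they
        # occur; the mean of (dS)^2 / dt within a bin estimates sigma(x)^2
        # at that bin's typical price x. Assumes evenly spaced ticks.
        prices = np.asarray(prices, float)
        ds2 = np.diff(prices) ** 2 / dt
        levels = prices[:-1]
        edges = np.quantile(levels, np.linspace(0, 1, n_bins + 1))
        xs, sigmas = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (levels >= lo) & (levels < hi)
            if mask.sum() >= min_obs:
                xs.append(levels[mask].mean())
                sigmas.append(np.sqrt(ds2[mask].mean()))
        return np.array(xs), np.array(sigmas)

    def looks_like_bubble(prices, dt):
        # Fit sigma(x) ~ c * x**alpha on a log-log scale. The integrand
        # x / sigma(x)^2 then behaves like x**(1 - 2*alpha), whose integral
        # converges iff alpha > 1: superlinear volatility growth.
        x, s = estimate_sigma(prices, dt)
        alpha, _ = np.polyfit(np.log(x), np.log(s), 1)
        return alpha > 1.0, alpha

    # Sanity check on simulated paths: geometric Brownian motion has
    # sigma(x) = 0.3x (alpha = 1, no bubble), while the inverse Bessel(3)
    # process, a textbook strict local martingale, has sigma(x) = x**2.
    rng = np.random.default_rng(0)
    dt, n = 1e-4, 200_000
    dw = rng.normal(0.0, np.sqrt(dt), size=(n, 3))
    b = np.cumsum(dw, axis=0) + np.array([1.0, 0.0, 0.0])
    inv_bessel = 1.0 / np.linalg.norm(b, axis=1)
    gbm = 10.0 * np.exp(0.3 * np.cumsum(dw[:, 0])
                        - 0.045 * dt * np.arange(1, n + 1))
    print(looks_like_bubble(gbm, dt))         # expect (False, alpha near 1)
    print(looks_like_bubble(inv_bessel, dt))  # expect (True, alpha near 2)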

The authors test their methodology by applying the model to several stocks from the dot-com bubble of the nineties. They find fairly successful rates in their predictions, with higher accuracies in cases where market volatilities can be modeled more efficiently. This helps establish the strengths and weaknesses of the method.

The authors have also used the model to test more recent price increases to detect bubbles. “We have found, for example, that the IPO [initial public offering] of LinkedIn underwent bubble pricing at its debut, and that the recent rise in gold prices was not a bubble, according to our models,” Protter says.

It is encouraging to see that mathematical analysis can play a role in the diagnosis and detection of bubbles, which have significantly impacted economic upheavals in the past few decades.

Robert Jarrow is a professor at the Johnson Graduate School of Management at Cornell University in Ithaca, NY, and managing director of the Kamakura Corporation. Younes Kchia is a graduate student at Ecole Polytechnique in Paris, and Philip Protter is a professor in the Statistics Department at Columbia University in New York.

Professor Protter’s work was supported in part by NSF grant DMS-0906995.

Doctors Can Learn Empathy Through a Computer-Based Tutorial (Science Daily)

ScienceDaily (Oct. 31, 2011) — Cancer doctors want to offer a sympathetic ear, but sometimes miss the cues from patients. To help physicians better address their patients’ fears and worries, a Duke University researcher has developed a new interactive training tool.

The computer tutorial includes feedback on the doctors’ own audio-recorded visits with patients, and provides an alternative to more expensive courses.

In a study appearing Nov. 1, 2011, in the Annals of Internal Medicine, the research team found that the course resulted in more empathic responses from oncologists, and patients reported greater trust in their doctors — a key component of care that enhances quality of life.

“Earlier studies have shown that oncologists respond to patient distress with empathy only about a quarter of the time,” said James A. Tulsky, MD, director of the Duke Center for Palliative Care and lead author of the study.

“Often, when patients bring up their worries, doctors change the subject or focus on the medical treatment, rather than the emotional concern. Unfortunately, this behavior sends the message, ‘This is not what we’re here to talk about.'”

Tulsky said cancer doctors have many reasons for avoiding emotionally fraught conversations. Some worry that the exchanges will cause rather than ease stress, or that they don’t have time to address non-medical concerns.

Neither is true, Tulsky said, noting his research shows that asking the right questions during patient visits can actually save time and enhance patient satisfaction.

“Oncologists are among the most devoted physicians — passionately committed to their patients. Unfortunately, their patients don’t always know this unless the doctors articulate their empathy explicitly,” Tulsky said. “It’s a skill set. It’s not that the doctors are uncaring, it’s just that communication needs to be taught and learned.”

The current gold standard for teaching empathy skills is a multiday course that involves short lectures and role-playing with actors hired to simulate clinical situations. Such courses are time-consuming and expensive, costing upwards of $3,000 per physician.

Tulsky’s team at Duke developed a computer program that models what happens in these courses. The doctors receive feedback on their own recorded encounters and are able to complete the intervention in their offices or homes in a little more than an hour, at a cost of about $100.

To test its effectiveness, Tulsky and colleagues enrolled 48 doctors at Duke, the Veterans Affairs Medical Center in Durham, NC, and the University of Pittsburgh Medical Center. The research team audio-recorded four to eight visits between the doctors and their patients with advanced cancer.

All the doctors then attended an hour-long lecture on communication skills. Half were randomly assigned to receive a CD-ROM tutorial; the other half received no further intervention.

The CD taught the doctors basic communication skills, including how to recognize and respond to opportunities in conversations when patients share a negative emotion, and how to share information about prognosis. Doctors also heard examples from their own clinic encounters, with feedback on how they could improve. They were asked to commit to making changes in their practice and were reminded of these commitments before their next clinic visits.

Afterward, all the doctors were again recorded during patient visits, and the encounters were assessed by both patients and trained listeners who evaluated the conversations for how well the doctors responded to empathic statements.

Oncologists who had not taken the CD course made no improvement in the way they responded to patients when confronted with concerns or fears. Doctors in the trained group, however, responded empathically twice as often as those who received no training. In addition, they were better at eliciting patient concerns, using tactics to promote conversations rather than shut them down.

“Patient trust in physicians increased significantly,” Tulsky said, adding that patients report feeling better when they believe their doctors are on their side. “This is exciting, because it’s an easy, relatively inexpensive way to train physicians to respond to patients’ most basic needs.”

Although the CD course is not yet widely available, efforts are underway to develop it for broader distribution.

In addition to Tulsky, study authors include: Robert M. Arnold; Stewart C. Alexander; Maren K. Olsen; Amy S. Jeffreys; Keri L. Rodriguez; Celette Sugg Skinner; David Farrell; Amy P. Abernethy; and Kathryn I. Pollak.

Funding for the study came from the National Cancer Institute. Study authors reported no conflicts.

Putting the Body Back Into the Mind of Schizophrenia (Science Daily)

ScienceDaily (Oct. 31, 2011) — A study using a procedure called the rubber hand illusion has found striking new evidence that people with schizophrenia have a weakened sense of body ownership, and it has produced the first case of a spontaneous out-of-body experience in the laboratory.

These findings suggest that movement therapy, which trains people to be focused and centered on their own bodies, including some forms of yoga and dance, could be helpful for many of the 2.2 million people in the United States who suffer from this mental disorder.

The study, which appears in the Oct. 31 issue of the scientific journal PLoS ONE, measured the strength of body ownership of 24 schizophrenia patients and 21 matched control subjects by testing their susceptibility to the “rubber hand illusion,” or RHI. This tactile illusion, first described in 1998, is induced by simultaneously stroking a visible rubber hand and the subject’s hidden hand.

“After a while, patients with schizophrenia begin to ‘feel’ the rubber hand and disown their own hand. They also experience their real hand as closer to the rubber hand,” said Sohee Park, the Gertrude Conaway Vanderbilt Chair of Psychology and Psychiatry, who conducted the study with doctoral candidate Katharine Thakkar and research analysts Heathman Nichols and Lindsey McIntosh.

“Healthy people get this illusion too, but weakly,” Park said. “Some don’t get it at all, and there is a wide range of individual differences in how people experience this illusion that is related to a personality trait called schizotypy, associated with psychosis-proneness.”

Body ownership is one of two aspects of a person’s sense of self-awareness. (The other is self-agency, the sense that a person is initiating his or her own actions.) According to the researchers, the finding that schizophrenia patients are more susceptible to the rubber hand illusion suggests that they have a more flexible body representation and a weakened sense of self compared to healthy people.

“What’s so interesting about Professor Park’s study is that they have found that the sense of bodily ownership does not diminish among patients with schizophrenia, but it can be extended to other objects more easily,” observed David Gray, Mellon assistant professor of philosophy at Vanderbilt and an expert in the philosophy of mind, who did not participate in the study but is familiar with it. “Much of the literature concerning agency and ownership in schizophrenia focuses on the sense of lost agency over one’s own movements. But in these cases, the sense of ownership is neither diminished nor extended.”

Before beginning the procedure, the researchers gave participants a questionnaire rating their degree of schizotypy, the extent to which they experience perceptual effects related to the illusion. Individuals who scored higher on the scale proved more susceptible to the illusion.

The researchers gauged the relative strength of the RHI by asking participants, before and after stimulation, to estimate the position of the index finger of their hidden hand on rulers placed on top of the box concealing it. The stronger the effect, the more a subject’s estimate shifted in the direction of the rubber hand. Even the estimates of those who did not subjectively experience the effect shifted slightly.
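As a toy numerical illustration of this “proprioceptive drift” measure (the numbers and variable names are invented, not the study’s data):

# Proprioceptive drift: a positive value means the estimate of the hidden
# hand's position shifted toward the rubber hand after stimulation.
# All numbers below are made up for illustration.
import numpy as np

pre  = np.array([1.0, -0.5, 0.8, 0.2])   # pre-stimulation estimates (cm from true position)
post = np.array([4.5,  3.0, 5.2, 2.1])   # post-stimulation estimates (cm, toward rubber hand)

drift = post - pre
print(f"mean proprioceptive drift: {drift.mean():.1f} cm toward the rubber hand")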

The rubber hand illusion also has a physiological signature. Scientists don’t know why, but the temperature of the hidden hand drops by a few tenths of a degree when a person experiences the illusion. “It’s almost as if the hand is disowned and rejected, no longer part of the self,” Park commented.

The researchers were surprised when one of the patients undergoing the procedure had a full out-of-body experience, reporting that he was floating above his own body for about 15 minutes. According to Park, it is extremely rare to observe spontaneous out-of-body experiences in the laboratory. When they invited the patient back for a second session, he once again had an out-of-body experience during the rubber hand procedure, showing that the experience was repeatable.

“Anomalous experiences of the self were considered to be core features of schizophrenia decades ago but in recent years much of the emphasis has been on cognitive functions such as working memory,” said Park.

According to the psychologist, out-of-body experiences and body ownership are associated with a particular area in the brain called the temporoparietal junction. Lesions in this area and stimulation by strong magnetic fields can elicit out-of-body experiences. The new study suggests that disorders in this part of the brain may also contribute to the symptoms of schizophrenia.

The relationship between schizophrenia and body ownership may help explain the results of a German study published in 2008, which found that a 12-week exercise program reduced the symptoms and improved the behavior of a small group of patients with chronic schizophrenia compared with a control group that did not exercise. The study also found that the exercise slightly increased the size of the patients’ hippocampus; a smaller-than-normal hippocampus is a well-established finding in schizophrenia.

“Exercise is inexpensive and obviously has a broad range of beneficial effects, so if it can also reduce the severity of schizophrenia, it is all to the good,” said Park. These findings suggest that focused physical exercise that involves precise body control, such as yoga and dancing, could be a beneficial form of treatment for the disorder.

The study was partly funded by a grant from the National Institutes of Health and the Gertrude Conaway Vanderbilt Endowed Chair.

That’s Gross! Study Uncovers Physiological Nature of Disgust in Politics (Science Daily)

ScienceDaily (Oct. 25, 2011) — Most likely, you would be disgusted if confronted with a picture of a man eating a mouthful of writhing worms. Or a particularly bloody wound. Or a horribly emaciated but still living body. But just how much disgust you feel may lend important insight into your personal political proclivities.

In a new study, political scientists closely measured people’s physiological reactions as they looked at a series of pleasant and unpleasant images. Participants who identified themselves as conservative — and, in particular, those who said they were against gay marriage — had strong physiological reactions when shown the gross pictures.

The study, the latest to examine the connection between political differences and humans’ built-in physiological traits, was co-authored by University of Nebraska-Lincoln political science professors Kevin Smith and John Hibbing and appears this month in the online journal PLoS ONE, published by the Public Library of Science.

“This is one more piece of evidence that we, quite literally, have gut feelings about politics,” Smith said. “Our political attitudes and behaviors are reflected in our biology.”

Researchers worked with 27 women and 23 men who were chosen from a larger pool of participants who also underwent thorough political questioning. The subjects were shown a series of disgusting and also relatively pleasant images while electrodes on their skin measured subtle skin conductance changes, which indicated an emotional response.

As predicted, conservatives responded to the pictures with much more intense disgust than liberals did. The reaction was most strongly linked to attitudes opposing same-sex marriage.

The results add to a growing area of research that suggests biology plays a larger role in influencing political orientation than many might think. Recent UNL work has produced findings in this area, including a 2008 study that found people who are highly responsive to threatening images were likely to support defense spending, capital punishment, patriotism and the Iraq War.

“The proper interpretation of the findings (in the current study) is not that biology causes politics or that politics causes biology,” the authors write, “but that certain political orientations at some unspecified point become housed in our biology, with meaningful political consequences.”

Acceptance of the role of involuntary physiological responses is not easy for many, however: “Most are proud of their political orientations, believe them to be rational responses to the world around them, and are reluctant to concede that subconscious predispositions play any role in shaping them,” they wrote. Still, the authors suggest that wider recognition of the relevance of involuntary physiology to politics could ease frustration over the perceived illogical inflexibility of political opponents and reduce political hostility.

“After all, if political differences are traceable in part to the fact that people vary in the way they physically experience the world, certitude that any particular worldview is ‘correct’ may abate, lessening the hubris that fuels political conflict.”

In addition to UNL’s Smith and Hibbing, the study was co-authored by Douglas Oxley of Texas A&M University; Matthew Hibbing of the University of California, Merced; and John Alford of Rice University.

Government officially presents eight proposals for Rio+20 (Jornal da Ciência)

JC e-mail 4376, November 1, 2011.

On Tuesday (the 1st) the government presents the official version of the document containing eight proposals for the United Nations Conference on Sustainable Development, known as Rio+20, to be held in Rio de Janeiro from May 28 to June 6, 2012. The document was presented today by the Minister of the Environment, Izabella Teixeira, and by the Itamaraty (the foreign ministry) at a press conference in Brasília.

The first proposal is the creation of a global socio-environmental protection program, whose goal is to guarantee income to overcome extreme poverty worldwide and to promote structural measures ensuring environmental quality, food security, adequate housing and access to clean water for all.

The idea behind the program, according to the document, is to make “the entire multilateral structure operate” to facilitate access to technology, financial resources, infrastructure and training, so that everyone has at least a minimum quantity and quality of food, water and a healthy environment.

Under the Brazilian proposal, the program would center on an income-guarantee strategy suited to the conditions of each country, at a moment of international crisis in which vast global resources are being mobilized to rescue the financial system. “The program would be a bet on the social component, which has been important in Brazil’s response to the crisis,” the document notes. “This is a platform for global dialogue that could be a crucial step toward sustainable development, with the potential to reinforce the virtuous role of multilateralism,” it adds.

In the second proposal, the government suggests implementing “sustainable development goals” through an inclusive green-economy program, in place “of complex negotiations seeking to establish binding, restrictive targets.” Among other things, these goals could address the eradication of extreme poverty; food and nutritional security; access to decent jobs (socially fair and environmentally sound); access to adequate energy sources; micro-entrepreneurship and microcredit; innovation for sustainability; access to adequate water resources; and bringing the ecological footprint in line with the planet’s capacity for regeneration.

Sustainable public procurement – In the third proposal, Brazil suggests a global pact for sustainable production and consumption: a set of initiatives to promote changes in production and consumption patterns across a range of sectors. Priority could thus go to initiatives offering political support for sustainable public procurement, which accounts for a significant share of the international economy, around 15% of world GDP; for consumption and energy-efficiency labeling; and for funding studies and research for sustainable development (with the aim of training high-level human resources and supporting scientific, technological and innovation projects).

The fourth proposal suggests establishing a repository of initiatives to energize national mechanisms and international cooperation, including the use of resources from multilateral bodies. The fifth proposes creating an international protocol for the sustainability of the financial sector.

In the sixth proposal, the government suggests new indicators for measuring development. Today the most important are the Human Development Index (HDI) and Gross Domestic Product (GDP), which, as measures of sustainable development, “are clearly limited,” since they fail to integrate the wide range of social and environmental dimensions with economic values, which, according to the document, leads to mistaken perceptions of countries’ degree of development and progress.

In the seventh proposal, the government suggests implementing a “pact for an inclusive green economy.” The idea is to encourage the publication of sustainability reports and indices by state-owned companies, development banks, sponsors of private pension entities, publicly traded companies and large firms. In other words, beyond economic and financial aspects, these institutions would be required to include in their disclosures, following internationally accepted and comparable standards, information on their social, environmental and corporate-governance performance.

The eighth proposal, in turn, concerns the “institutional framework for sustainable development.” It covers several topics, including the adoption of an institutional coordination mechanism for sustainable development; reform of the United Nations Economic and Social Council (ECOSOC), transforming it into a United Nations Sustainable Development Council; improvement of international environmental governance; the launch of a negotiating process for a global convention on access to information, public participation in decision-making and access to justice in environmental matters; and water governance.

(Viviane Monteiro – Jornal da Ciência)

The future of science lies in collaboration (Valor Econômico)

JC e-mail 4376, November 1, 2011.

Text by Michael Nielsen published in The Wall Street Journal and reprinted by Valor Econômico.

In January 2009, a University of Cambridge mathematician named Tim Gowers decided to use his blog to run an unusual social experiment. He picked a hard mathematical problem and tried to solve it in the open, using the blog to post his ideas and his progress. He invited anyone to contribute, in the hope that many minds together would be more powerful than one. He called the experiment the Polymath Project.

Fifteen minutes after Gowers opened his blog for discussion, a Hungarian-Canadian mathematician posted a comment. Fifteen minutes later, an American high-school mathematics teacher joined the conversation. Three minutes after that, the mathematician Terence Tao, of the University of California, Los Angeles, commented as well. The discussion caught fire, and in just six weeks the problem was solved.

Although other challenges followed, and the network’s collaborators did not always find every solution, they succeeded in creating a new approach to problem-solving. Their work is one example of the experiments in collaborative science now being carried out to study everything from galaxies to dinosaurs.

These projects use the internet as a cognitive tool to amplify collective intelligence. Such tools are a way of connecting the right people to the right problems at the right time, activating knowledge that would otherwise remain latent.

Networked collaboration has the potential to dramatically accelerate the pace of discovery across science as a whole. We are likely to see a more fundamental change in scientific research over the next few decades than occurred in the past three centuries.

But there are major obstacles to reaching that goal. Although it might seem natural for scientists to adopt these new tools of discovery, they have in fact shown a surprising reluctance. Initiatives like the Polymath Project remain the exception, not the rule.

Consider the simple idea of sharing scientific data online. The best example is the Human Genome Project, whose data anyone can download. When you read in the news that a certain gene has been linked to a disease, it is almost certainly a discovery made possible by the project’s open-data policy.

Despite the enormous value of releasing data openly, most laboratories make no systematic effort to share their information with other scientists. As one biologist told me, he had been “sitting on the genome” of an entire new species for more than a year. An entire species! Imagine the crucial discoveries other scientists might have made if that genome had been uploaded to an open database.

Why don’t scientists like to share?

If you are a scientist looking for a job or research funding, the biggest factor in your success will be the number of scientific papers you have published. If your record is brilliant, you will do well. If it is not, you will struggle. So you devote your working days to producing articles for academic journals.

Even if you personally believe that science as a whole would be far better off if you organized and shared your data on the internet, that is time taken away from the “real” work of writing papers. Sharing data is not something your colleagues will give you credit for, except in a few fields.

There are other areas where scientists still lag in their use of online tools. One example is the “wikis” created by brave pioneers in subjects such as quantum computing, string theory and genetics (a wiki allows collaborative sharing and editing of a set of interlinked pages; Wikipedia is the best-known example).

Specialized wikis can serve as up-to-date reference works on a field’s latest research, like textbooks that evolve at high speed. They can include descriptions of important unsolved scientific problems and can serve as tools for finding solutions.

But most of these wikis have failed. They face the same problem as data sharing: even if scientists believe in the value of collaboration, they know that writing a single mediocre paper will do far more for their careers. The incentives are all wrong.

For networked science to reach its potential, scientists must embrace and reward the open sharing of all scientific knowledge, not just what appears in traditional academic journals. Networked science needs to be open.

Michael Nielsen is one of the pioneers of quantum computing and the author of “Reinventing Discovery: The New Era of Networked Science,” from which this essay is adapted.